Paranormal Beliefs, Superstitions, and Conspiratorial Thinking

How we think, what we spend our time thinking about, and the way we view and understand the world are important. If we fail to develop accurate beliefs about the world, then we will make decisions based on causal structures that do not exist. Our actions, thoughts, and behaviors will inhibit knowledge for ourselves and others, and our species will be worse off because of it.
This idea is at the heart of Quassim Cassam’s book Vices of the Mind. Throughout human history we have held many beliefs that cannot plausibly be true, or that we came to learn were incorrect over time. Cassam would argue (alongside others such as Steven Pinker, Yuval Noah Harari, and Joseph Henrich) that adopting more accurate beliefs and promoting knowledge would help us systematically make better decisions and improve the lives of our fellow humans. Learning where we were wrong and using science, technology, and information to improve our decision-making has helped our world become less violent, given us more opportunity, provided better nutrition, and allowed us to cooperate more on a global level.
This is why Cassam addresses paranormal beliefs, superstitions, and conspiratorial thinking in his book. While examining conspiracy theories in depth, he writes, “studies have also found that belief in conspiracy theories is associated with superstitious and paranormal beliefs, and it has been suggested that these beliefs are associated because they are underpinned by similar thinking styles” (citing Swami et al. 2011). Cassam argues that conspiracy theories are different from the other two modes of thinking because they can sometimes be accurate in their descriptions of the world. Sometimes a politician truly is running a corruption scheme, sometimes a group of companies is conspiring to keep prices high, and sometimes a criminal organization is hiding nefarious activities in plain sight. Conspiratorial thinking can, in some instances, reveal real causal connections in the world.
However, conspiratorial thinking is often bizarre and implausible. When it pushes us off the deep end, it shares important characteristics with superstitious and paranormal thinking. All three posit causal connections that cannot possibly exist between phenomena, whether real or imagined. They create explanations that are inaccurate and that prevent us from identifying real information about the world. Superstitions posit causal connections between random, unconnected events; paranormal thinking posits causal connections between non-existent entities and real-world events. Conspiratorial thinking falls in line with both when it is not describing reality.
Over the last few years we have seen how conspiratorial thinking can be vicious, how it can inhibit knowledge, and how it can have real life-and-death consequences when it goes wrong. Superstitious thinking generally does not have consequences as severe, but it still prevents us from making the best possible decisions and still drives us to adopt incorrect worldviews, sometimes entrenching unfair biases and prejudices. Paranormal thinking has been a foundation of many world religions and of fables used to teach lessons and encourage particular forms of behavior. However, if it does not describe the world as it really is, then its value is minimal, and we should seriously consider the harms that can come from it, such as anxiety, suicide, or hours of lost sleep. These ideas are important because we need to make the best possible decisions based on the most accurate information available if we want to continue to advance human societies, live sustainably, and foster cooperation and community among all humans on a global scale. Thinking accurately takes practice, so pushing back against unwarranted conspiracy theories, superstitions, and paranormal beliefs builds the epistemic muscles that improve our thinking overall.
Thinking Conspiratorially

Over the last few years a number of wild conspiracy theories have become popular. Former President Donald Trump embraced a conspiracy theory that the 2020 presidential election was rigged (it was not), supported the QAnon conspiracy theory, and did little to push back against conspiracy theories surrounding COVID-19. His actions, behaviors, and beliefs demonstrate that thinking conspiratorially can be an epistemic vice. His willingness to believe wild falsehoods obstructed knowledge for himself and his most ardent supporters.
However, thinking conspiratorially is not always an epistemic vice. One reason why conspiracy theories become so gripping and why people sometimes fall into them is because real conspiracies do occur. Nixon’s Watergate Scandal, Trump’s withholding of financial and military aid unless Ukraine announced an investigation into Joe Biden and his son, and fraud schemes uncovered by inspectors general and government auditors demonstrate that nefarious conspiracies sometimes are real. While thinking conspiratorially can become an epistemic vice, the same is true for anti-conspiratorial thinking.
In the book Vices of the Mind, Quassim Cassam quotes Dr. Charles Pigden of the University of Otago in New Zealand: “there is nothing inherently vicious about believing or being disposed to believe conspiracy theories.” Cassam argues that conspiratorial thinking is not an epistemic vice on its own, but is instead a context-dependent vice or virtue. He continues, “there are environments in which either way of thinking can be epistemically virtuous or vicious, and a way to capture this context-relativity is to describe these thinking styles as conditionally virtuous or vicious.”
The examples I used earlier show how conspiratorial thinking can be either virtuous or vicious. In the case of our former President, his conspiratorial thinking spread misinformation, suppressed true and accurate information, and created a set of false beliefs that some of his supporters believed so strongly that they stormed the United States Capitol in an attempt to stop Congress from certifying the election. The context of his conspiracy theories obstructed knowledge and caused substantial harm to life and property. However, a government auditor who notices inconsistencies in paperwork and accounting practices may be rewarded for thinking conspiratorially, at least to a point. Believing that something nefarious could possibly be going on will encourage the auditor to review financial statements and testimony from personnel with more scrutiny, potentially helping them uncover real fraud. Of course, they could still go too far and push the issue beyond reasonable bounds by thinking conspiratorially, but this type of thinking is conditionally virtuous when it discovers true fraud and improves knowledge about fraud schemes.
Given the dramatic consequences of conspiracy thinking over the last few years, it is easy to dismiss thinking conspiratorially as an epistemic vice. However, we should remember that it is only conditionally an epistemic vice, and that sometimes conspiracies do turn out to be true (or at least partially true). We don’t have to give every conspiracy our respect and attention, but when a conspiracy does appear to be grounded in reality and supported by real evidence, then we should not be too quick to dismiss it.
Causal Links Between Unconnected Events

As a kid I grew up attending basketball camps at UCLA. I played in the old gym that used to host UCLA games in front of a few thousand fans, played on the current court in the main arena, and slept in the dorms. With that history, I have always been a fan of the UCLA men’s basketball team, rooting for them alongside the Nevada Wolf Pack, the school I actually attended. With the UCLA team making a deep run in the NCAA March Madness tournament, I have been reminded of all the superstitious thinking that surrounds sports, thinking I used to indulge in myself.
Sports seem to bring out superstitious thinking in even the most rational of people. I try very hard to think about causal structures and to avoid seeing non-existent causal links between unconnected events, but nevertheless, it is hard not to let superstitious thinking creep in. When you are watching a game it is hard not to feel like you have to sit in the right spot, watch from a certain room, or avoid certain behaviors in order to keep your team in the lead. However, it is absolute nonsense to think that your actions on your couch, miles away from the venue where the game is taking place, could have any causal link to the way a sports team performs.
In the book Vices of the Mind, Quassim Cassam spends time examining what is happening within our mind when we engage in superstitious thinking. He explains that superstitious thinking qualifies as an epistemic vice because it gets in the way of knowledge. It prevents us from forming accurate beliefs about the world. “Superstitious thinking,” Cassam writes, “isn’t a generally reliable method for forming true beliefs about the future; it won’t generally lead to true beliefs because it posits causal links between unconnected events. … beliefs based on superstitious thinking aren’t reasonable.”
In the book, Cassam gives the example of superstitions about walking under ladders. Someone who believes that bad luck will befall them if they walk under a ladder will probably avoid doing so, and as a result they will be less likely to have paint drip on them, to have something fall on their head, or to knock over the ladder and anyone or anything on top of it. Their superstition leads to better outcomes for them, but not because it helped them form true beliefs about the dangers of walking under ladders. The individual ends up with the correct answer, but follows the wrong causal chain to get there.
Thinking about rational and plausible causal chains is a way to escape superstitious thinking. You can rationally examine the risks, harms, and benefits of certain behaviors and actions, tracing real connections between events, to see when a superstition is nonsense and when it borrows from genuine causal chains that improve life. Trying not to step on cracks will not prevent you from starting a causal chain that leads to your mother’s broken back, but it will help ensure you have more stable and steady footing when you walk. Wearing the same basketball jersey for each game has no causal connection with the team’s performance, and wearing it or not wearing it will have no impact on how your favorite team plays. We should strive to have accurate beliefs about the world, work to see causal connections clearly, and limit superstitious thinking even when it concerns trivial things like sports.
Superstitious Thinking

Would you consider superstitious thinking to be a vice? According to Quassim Cassam in Vices of the Mind, superstitious thinking is indeed an epistemic vice. That is to say, Cassam believes that superstitious thinking is a reprehensible, blameworthy pattern of thought that systematically obstructs knowledge. By systematically obstructing knowledge, superstitious thinking causes people to adopt beliefs about the world that don’t match reality, leaving them vulnerable to poor decision-making that can have real-world consequences in their lives.
Cassam writes, “a gambler who sees a succession of coin tosses coming down heads thinks that the next toss will be tails because a tails is now due. This is an example of superstitious or magical thinking, thinking that posits a causal link between unconnected events, namely, previous coin tosses and the next toss.” This quote shows how superstitious thinking systematically obstructs knowledge. It causes us to see causal connections when none exist, distorting our perception and theory of reality.
A gambler making bets sees a causal connection between previous rolls of a die or spins of a roulette wheel and the next roll or spin. In reality, each time you flip a coin, roll a die, or spin a wheel, the previous result has no bearing on the current probability. A coin toss is a 50-50 affair, and it does not change because the previous flip was heads.
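To make that independence concrete, here is a minimal simulation sketch in Python; the streak length and number of flips are arbitrary choices for illustration.

```python
import random

random.seed(42)

# Simulate a long run of fair coin flips.
flips = [random.choice("HT") for _ in range(1_000_000)]

# Collect every flip that immediately follows three heads in a row.
streak = ["H"] * 3
after_streak = [flips[i] for i in range(3, len(flips)) if flips[i - 3:i] == streak]

# If the gambler's intuition were right, tails would be "due" and this
# frequency would fall well below 0.5. It doesn't; each flip is independent.
print(f"P(heads | three heads in a row) ~ {after_streak.count('H') / len(after_streak):.3f}")
```

No matter how long a streak the simulation conditions on, the frequency stays near 0.5, which is exactly what the gambler’s intuition denies.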
This type of thinking is prevalent in more than just gamblers. Sports enthusiasts regularly see causal links that cannot possibly exist. The same kind of thinking also shows up in people who have lucky clothing, follow special rituals in daily life, or avoid certain phrases or behaviors. In many instances, the causal links we imagine are absurd but don’t incur real costs in our lives. Avoiding stepping on cracks in the sidewalk doesn’t cost you anything, and growing a beard because your favorite sports team is on a roll might even provide some social benefits and save you the time you would have spent shaving. However, giving in to superstitious thinking, as noted before, distorts your view of reality.
The causal chains misperceived through superstitious thinking create false understandings of how the world works. While it is harmless to believe that you need to sit in the same exact spot for your sports team to play well, it is not harmless to believe that hiring a woman for a certain job is bad luck, and it is not harmless to bet your life savings on a gamble because of superstitious thinking. What may be even worse is that superstitious thinking in one area can spill into other areas, creating a habit of seeing causal chains that don’t exist. Over time, superstitious thinking will lead to worse outcomes and poor decision-making that will have real costs in our lives.
Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, finding the best way to visit the grocery store and post office in a single trip, and knowing how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take while still leaving a cushion in case something runs long. In planning a shopping trip we are confronted with a version of the traveling salesman problem, which is tied to P versus NP, one of the most vexing open questions in mathematics. And the price of bread was once the object of focus for teams of Soviet economists who could not pinpoint the price that would create the right supply to match the population’s demand.
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don’t waste the whole night doing math to find the perfect time to set an alarm, don’t lose the entire day calculating the best route for all our errands, and don’t burn brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing that we ought to leave the house ready for work by a certain time to reduce the risk of being late. We stick to main roads and travel familiar routes, eliminating the thousands of turn-by-turn alternatives we could choose from. We rely on open markets to determine the price of bread without setting a universal standard.
Rules of thumb are necessary in a complex world, but that doesn’t mean they are without their own downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman in Thinking Fast and Slow, “We are hard-wired to use simple rules of thumb (‘heuristics’) to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors.” Useful but inadequate rules of thumb can create predictable, reliable errors and mistakes. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to stay open to improvements or alternatives that would make our decision-making better.
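The errand-route example makes this tradeoff easy to see in code. Below is a sketch comparing an exhaustive search over every route with a nearest-neighbor rule of thumb; the stops and their coordinates are invented for illustration.

```python
from itertools import permutations
from math import dist

# Hypothetical errand stops as (x, y) coordinates; home is the start and end point.
home = (0, 0)
stops = {
    "grocery": (2, 3),
    "post office": (5, 1),
    "bank": (1, 6),
    "pharmacy": (4, 5),
}

def route_length(order):
    """Total distance of home -> each stop in the given order -> home."""
    points = [home] + [stops[name] for name in order] + [home]
    return sum(dist(a, b) for a, b in zip(points, points[1:]))

# Exact answer: try every ordering. This grows factorially
# (4 stops = 24 routes, 10 stops = 3,628,800 routes).
best_order = min(permutations(stops), key=route_length)

# Rule of thumb: always drive to the nearest unvisited stop.
remaining, current, greedy_order = set(stops), home, []
while remaining:
    nearest = min(remaining, key=lambda name: dist(current, stops[name]))
    greedy_order.append(nearest)
    current = stops[nearest]
    remaining.remove(nearest)

print(f"brute-force route:      {route_length(best_order):.2f}")
print(f"nearest-neighbor route: {route_length(greedy_order):.2f}")  # fast, but can be longer
```

The rule of thumb answers almost instantly no matter how many stops there are, while the exact search grows factorially; the cost is that the shortcut sometimes returns a longer route, a predictable error that can be checked and corrected.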
What is important to recognize is that systematic and predictable errors from rules of thumb can be corrected. If we know where errors and mistakes are likely to arise, then we can take steps to mitigate and reduce them. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of rules of thumb that we know are likely to produce errors, biases, and inaccurate judgements and assumptions. Companies, governments, and markets do this all the time, though not always in a neat step-by-step process (sometimes it is one step forward and two steps backward), leading to progress over time. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive downfalls of heuristics.
Pluralistic Ignorance

TV shows and movies frequently have scenes where one character has been putting up with something they dislike in order to please another character, only to find out that the other character also dislikes the thing. I can think of instances where characters have been drinking particular beverages they dislike, playing games they don’t enjoy, or wearing clothing they hate, just because they think another character enjoys that particular thing and they want to share in that experience. It is a little corny, but I really enjoy the moment when a character recognizes they have been putting themselves in agony for the benefit of the other person, only to realize the other person has been in agony as well!
This particular comedic device plays on pluralistic ignorance. We don’t ever truly know what is in another person’s head, and even if we live with someone for most of our life, we can’t ever know them with complete certainty. When it comes to really knowing everyone around us and everyone in our community or society, we can only ever know most people at a minimal surface level. We follow cues from others that we want to be like, that we think are popular, and that we want to be accepted by. But when everyone is doing this, how can any of us be sure that we all actually want to be the way we present ourselves? We are all imagining what other people think, and trying to live up to those standards, not realizing that we may all hate the thing that we think everyone else considers cool.
The whole situation reminds me of AP US History from my junior year in high school. My friend Phil sat toward the back of the classroom, and the year he and I had the class was the very last year for our teacher before he planned to retire. He was on autopilot most of the year, a good teacher, but not exactly worried about whether his students paid attention in class or cheated on tests. For one test, Phil was copying off the girl next to him, only to realize halfway through class that she was cheating off him! When Phil told the story later, we all had to ask where any answers were coming from if they were both cheating off each other’s test.
Pluralistic ignorance feels like Phil and his AP US History test. However, pluralistic ignorance can be much more important than my little anecdote. Yesterday’s post was about collective conservatism, a form of groupthink where important decision-makers stick to tradition and familiar strategies and answers even as the world changes and demands new and innovative responses. Pluralistic ignorance can limit our responses to change, locking in tradition because we think that is what people want, even though people may be tired of old habits and patterns and ready for something new.
In Nudge, Cass Sunstein and Richard Thaler write, “An important problem here is pluralistic ignorance – that is, ignorance, on the part of all or most, about what other people think. We may follow a practice or tradition not because we like it, or even think it defensible, but merely because we think that most other people like it.”
A real world example I can think of would be driving cars. Many people in the country absolutely love cars and see them as symbols of freedom, innovation, and American ingenuity. Thinking that people would be willing to give up their cars or change anything about them seems delusional, and public policy, advertising campaigns, and car designs reflect the idea that people want more, bigger, and faster cars. But is this actually true for most Americans?
Our cars emit toxic fumes, tens of thousands of people die annually in crashes, and the lights and sounds of cars can keep those who live along busy streets or next to car-enthused neighbors awake at night. People have to pay for auto insurance, vehicles break down and require constant costly maintenance, and in the US there is a constant pressure to have a newer and nicer car to signal how well off one is. My sense is that people generally dislike cars, especially anything dealing with purchasing or repairing a car, but that they put up with them because they think other people like cars and value and respect their car choice. I believe that if there were enough reliable, fast, and convenient alternative transportation options, people would start to ditch cars. I think lots of people buy fancy, powerful, and loud cars because they think other people like them, not necessarily because they actually like the car themselves. If we could come together in an honest way, I think we could all scale back, opting for smaller, quieter, less polluting vehicles or public transportation. There are certainly a lot of problems with public transportation, but I think our obsession with and connection to cars is in part pluralistic ignorance about how much other people actually like and value cars. We are trapped in a vehicular arms race when we would all rather not have to worry about cars in the first place.
Can We Avoid Cognitive Errors?

Daniel Kahneman is not very hopeful when it comes to our ability to avoid cognitive errors. Toward the end of his book Thinking Fast and Slow, a book all about cognitive errors, predictable biases, and situations in which we can recognize such biases and thinking errors, Kahneman isn’t so sure there is much we can actually do in our lives to improve our thinking.
Regarding his own thinking, Kahneman writes, “little can be achieved without considerable effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”
Kahneman’s book is fantastic in part because of his humility. It would be easy to take a book on illusions, cognitive errors, biases, and predictable fallacies and use it to show how much smarter you are than everyone else who makes such thinking mistakes. However, Kahneman uses his own real-life examples throughout the book to show how common and easy it is to fall into ways of thinking that don’t actually reflect reality. What is unfortunate, though, is how hard it is to take what you learn from the book and apply it to your own life. If the author himself can hardly improve his own thinking, then those of us who read the book likely won’t make big changes in our thinking either.
“The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors,” Kahneman continues. While we might not be able to improve our thinking simply by knowing about cognitive errors and being aware of predictable biases, we can at least recognize them in others. This can help us be more thoughtful when we critique or gossip about others (something we all do even if we claim we don’t).
Beyond improving the way we gossip or judge others, Kahneman’s research and his book are incredibly valuable for anyone in a design-focused role. If you are creating a layout for a webpage, a seating arrangement at a restaurant, or the standard operating procedures for a company, you have an opportunity to design a process and flow that takes cognitive errors and predictable biases into account. Because it is easier to observe others making mistakes than to observe those mistakes in ourselves, we can watch for situations where people are led astray and help get them back on course. We can develop systems and structures that take our biases and cognitive errors into account and minimize the damage they may do. We can set the world up to guide us in a reasonable way through our cognitive errors and biases, but only if we know what to look for.
Focusing Illusion

I wrote earlier about an experiment that Daniel Kahneman discusses in his book Thinking Fast and Slow in which college students were asked to evaluate their life and to count the number of dates they had been on in the last month. When the question about dates came after the question about happiness, there was no correlation between the two answers. However, when the question about dating came before the question about happiness, those who had few dates tended to rank their overall happiness lower. Later in the book, Kahneman expands on ideas related to this finding and describes the focusing illusion.
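To make “no correlation between the two answers” concrete, here is a minimal sketch of the computation in Python; the survey numbers are invented to mimic the two patterns and are not data from the actual study.

```python
from statistics import correlation  # Pearson's r, available in Python 3.10+

# Invented responses from eight hypothetical students, for illustration only:
dates = [0, 1, 3, 0, 5, 2, 4, 1]             # dates in the last month
happiness_first = [9, 4, 8, 3, 5, 6, 6, 7]   # happiness asked first: unrelated to dates
happiness_primed = [3, 4, 6, 3, 8, 5, 7, 4]  # dating asked first: tracks dates closely

print(f"dating question second: r = {correlation(dates, happiness_first):+.2f}")
print(f"dating question first:  r = {correlation(dates, happiness_primed):+.2f}")
```

With these toy numbers the first pairing produces an r of zero and the second a strong positive r: forcing the mind to think about dating first makes it bleed into the happiness judgment.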
Kahneman sums up the focusing illusion by writing, “Nothing in life is as important as you think it is when you are thinking about it.”
Our brains are limited. They can only hold so much information at one time. What you see is all there is, meaning that the things you directly observe become the reality that your mind works within. We use heuristics, make assumptions, and our thoughts are subject to biases. As a result, the things we pay attention to and think about become the center of our lives. They become more important in our minds than they really should be.
The dating and happiness questions help us see the machinery of the mind and help us understand how the brain works. The inner machinery of the mind really does overweight things that we happen to be thinking about. Having more or fewer dates is an important worry for college students, but making students think about their dating life before or after a question about overall happiness shouldn’t really influence the degree to which students rate their overall happiness. However, if the mind is forced to think about dating, it becomes a more important factor in the mind and begins to blend into other considerations.
I have seen this happen in my own life. Objectively, I have had a great life. I was raised by a great family in a safe neighborhood in the United States. But at times I was certainly one of those college students whose subjective rating of life was unreasonably influenced by things that shouldn’t have mattered very much. Whether it was not having enough dates, watching the university’s basketball team lose, or dealing with an angry customer at the restaurant where I worked, I can look back and recognize times when I had a negative outlook on life that stemmed from small negative events I focused on too deeply. I still do this today, but being aware of the focusing illusion and understanding that what you see is all there is has helped me avoid focusing too deeply and giving too much importance to events or opinions that shouldn’t dominate my outlook on life.
A Lack of Internal Consistency

Something I have been trying to keep in mind lately is that our internal beliefs are not as consistent as we might imagine. This is important right now because our recent presidential election has highlighted the divide between many Americans. In most of the circles I am a part of, people cannot imagine how anyone could vote for Donald Trump. Since they see President Trump as contemptible, it is hard for them to separate his negative qualities from the people who may vote for him. All negative aspects of Trump, and of the ideas people see him as representing, are heaped onto his voters. The problem, however, is that none of us have enough internal consistency among our thoughts, ideas, opinions, and beliefs to justify characterizing as much as half the country as bigoted, uncaring, selfish, or really any other adjective (except maybe self-interested).
I have written a lot recently about the narratives we tell ourselves. It is problematic that the more simplistic a narrative is, the more believable and accurate it feels to us. The world is incredibly complicated, and a simplistic story that seems to make sense of it all is almost certainly wrong. Given this, it is worth looking at our ideas and views and trying to identify areas where we have inconsistencies in our thoughts. This helps us tease apart our narratives and recognize where simplistic thinking is leading us to unfounded conclusions.
In Thinking Fast and Slow, Daniel Kahneman shows us how this inconsistency between our thoughts, beliefs, and behaviors can arise, using moral ambiguity as an example. He writes, “the beliefs that you endorse when you reflect about morality do not necessarily govern your emotional reactions, and the moral intuitions that come to your mind in different situations are not internally consistent.”
It is easy to adopt a moral position against some immoral behavior or attitude, but when we find ourselves in a situation where we are violating that moral position, we find ways to explain our internal inconsistency without directly violating our initial moral stance. We rationalize why our moral beliefs don’t apply to us in a given situation, and we create a story in our minds where there is no inconsistency at all.
Once we know that we do this with our own beliefs toward moral behavior, we should recognize that we do this with every area of life. It is completely possible for us to think entirely contradictory things, but to explain away those contradictions in ways that make sense to us, even if it leaves us with incoherent beliefs. And if we do this ourselves, then we should recognize that other people do this as well. So when we see people voting for a candidate and can’t imagine how they could vote for such a candidate, we should assume that they are making internally inconsistent justifications for voting for that candidate. They are creating a narrative in their head where they are making the best possible decision. They may have truly detestable thoughts and opinions, but we should remember that in their minds they are justified and making rational choices.
Rather than simply hating people and heaping every negative quality we can onto them, we should pause and ask what factors might be leading them to justify contemptible behavior. We should look for internal inconsistencies and try to help people recognize these areas and move forward more coherently. We should see in the negativity of others something we have the same capacity for, and we should try to find more constructive ways to engage with them and help them shift the narratives that justify their inconsistent thinking.
Can You Remember Your Prior Beliefs?

“A general limitation of the human mind,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”
What Kahneman is referring to with this quote is the difficulty we have in understanding how our thinking evolves and changes over time. Our thinking slowly adapts and revises itself, sometimes quite dramatically, but often very gradually. We rarely notice these changes unless a change is salient, which I would argue happens when it is tied in one way or another to an important aspect of our identity. For most changes in our mental approach, we generally don’t remember our prior beliefs and views, and we likely don’t remember a point at which our beliefs changed.
In the book Kahneman uses an example of two football teams with the same record playing each other. One team crushes the other, but before we knew the outcome, we didn’t have a strong sense of how the game would go. After watching a resounding victory, it is hard to remember that we once were so uncertain about the future outcome.
This tendency of the mind wouldn’t be much of a problem if it was restricted to our thinking about sports – unless we had a serious betting problem. However, this applies to our thinking on many more important topics such as family member marriages, career choices, political voting patterns, and consumer brand loyalty. At this moment, many Democrat voters in our nation probably don’t remember exactly what their opinions were on topics like free trade, immigration, or infectious disease policy prior to the 2016 election. If they do remember their stances on any of those issues, they probably don’t remember all the legal and moral arguments they expressed at that time. Their minds and opinions on the matter have probably shifted in response to President Trump’s policy positions, but it is probably hard for many to say exactly how or why their views have changed.
In a less charged example, imagine that you are back in high school, and for years you have really been into a certain brand of shoes. But, one day, you are bullied for liking that brand, or perhaps someone you really dislike is now sporting that same brand, and you want to do everything in your power to distance yourself from any association with the bullying or the person you don’t like. Ditching the shoes and forgetting that you ever liked that brand is an easy switch for our minds to make, and you never have to remember that you too wore those shoes.
The high school example is silly, but for me it helps put our brain’s failure to remember previous opinions and beliefs in context. Our brains evolved in a social context, and for our ancestors, navigating tribal social structures and hierarchies was complex and sometimes a matter of life and death (not just social media death for a few years of high school like today). Being able to ditch beliefs that no longer fit our needs was probably helpful for our ancestors, especially if it helped them fully commit to a new tribal leader’s strange quirks and new spiritual beliefs. Today, this behavior can cause us to form strange high school (or office) social cliques and can foment toxic political debates, but it may have served a more constructive role for our ancestors forming early human civilizations.