The Elephant in the Brain with Psychics and Mediums

In the book The Elephant in the Brain, Robin Hanson and Kevin Simler argue that our own self-interest drives a huge amount of our behavior. On the surface this doesn’t sound like a huge shock, but if you truly look at how deeply our self-interest is tied to everything we do, you start to see that we like to pretend we don’t act purely out of self-interest. Instead, we lie to ourselves and others and create high-minded reasons for our beliefs, behaviors, and actions. But our self-interest is never far behind. It is always there as the elephant in the room (or brain), influencing all that we do even as we constantly try to ignore it.
This is likely what happens when people visit psychics and mediums with the hope of learning about their future or reconnecting with the spirit of a lost loved one. Mary Roach describes what is going on with psychics, mediums, and their clients in her book Spook, and I think her explanation is a strong argument for the ideas presented by Hanson and Simler in The Elephant in the Brain. She writes:
“It seems to me that in many cases psychics and mediums prosper not because they’re intentionally fraudulent, but because their subjects are uncritical. The people who visit mediums and psychics are often strongly motivated or constitutionally inclined to believe that what is being said is relevant and meaningful with regard to them or a loved one.”
Both psychics/mediums and their subjects are motivated by self-interests that they don’t want to fully own up to. Both deceive themselves in order to genuinely appear to believe in the experience. If you can fool yourself, it becomes much easier to fool others, and that requires ignoring the elephant (your self-interest) in your brain.
Clients want to believe they are really interacting with the spirit of a lost loved one and not being fooled or defrauded. Critical thinking, and the deliberate acknowledgment that they are susceptible to being fooled, are ignored and forgotten. Instead, the individual’s self-interest acts behind the scenes as they help create the reality they want to inhabit with the help of the psychic or medium.
The psychics and mediums also don’t want to be viewed as fraudsters and quacks. They hide the fact that they have economic and social motivations to appear to have special powers and to signal their authenticity. If a client is uncritical, it helps the entire process and allows both parties to ignore the self-interest acting below the surface. Ultimately, as Roach argues, the process depends both on practitioners who are willing to believe their subjects are having authentic experiences and on subjects who believe their psychics and mediums are genuinely communicating with the dead. Without either party, and without self-deception on both sides, the whole process would fall apart.
Mary Roach on Reincarnation in India

In the book Spook, Mary Roach writes, “People don’t seem to approach life with the same terrified, risk-aversive tenacity that we do. I’m beginning to understand why, religious doctrine aside, the concept of reincarnation might be so popular here. Rural India seems like a place where life is taken away too easily – accidents, childhood diseases, poverty, murder. If you’ll be back for another go, why get too worked up about the leaving?” Roach is joking, of course, but this quote comes at the end of a lengthy description of dangers and risks that she experienced in India that we would find appalling in the United States. Her travels to India brought her face to face with cyclists moving through heavy traffic and breathing diesel smog. She was afraid of large trucks overflowing with potatoes and cauliflower that threatened to spill over onto the vehicle she was riding in. And she was also afraid for the lives of more than one woman riding precariously on the back of a fast-moving Vespa.
While the quote is funny, it does get at some interesting ways of thinking about life, death, and how we go about our days. I’m not sure how much of the difference in risk tolerance between the United States and India comes down to beliefs in reincarnation, but I can see how ideas of reincarnation would be comforting in a dangerous society. I don’t know if reincarnation would be enough to create a moral hazard scenario where people were intentionally negligent about safety because they expected to come back in another life, but I’m sure there is some impact that could be studied.
The quote from Roach also seems to suggest that Americans value our lives differently than individuals in India. She highlights how risk averse Americans tend to be, referring to how much we go out of our way to ensure everything we interact with is safe, and how we try to limit risk in everything from roller coasters to strollers. I think what is likely going on is a difference in culture that stretches back years, shaped by technological limitations and differences in population density. I am currently listening to an audiobook in which the author interviewed friends from her childhood in rural Ohio in the 1960s and 70s. Her dad was a doctor, and she notes how many individuals, including children, died in accidents involving farming equipment. Today we have embedded technology in everything we do, making the world safer, so risk stands out more now than it did in the 1960s and 70s, when we didn’t have the technology to make everything as safe as we can today. Perhaps the difference that Roach noted, and jokingly attributed to belief in reincarnation, is simply due to limitations in technology and a need to earn money.
More Information Can Make the World More Confusing

“In my experience,” writes Mary Roach in Spook, “the most staunchly held views are based on ignorance or accepted dogma, not carefully considered accumulations of facts. The more you expose the intricacies and realities of the situation, the less clear-cut things become.”
This quote from Mary Roach is something I have experienced in my own life over and over. I have met many people with very strong views about subjects, and they very often oversimplify an issue and reduce arguments against their position to a straw man. Rather than carefully considering whether their opinions and perspectives are valid, they dismiss arguments against their favored position without real thought. And to be fair, this is something I have even caught myself doing.
I generally seem to be one of those people who can talk about challenging subjects with just about anyone. I think the reason I am able to talk to people about difficult topics is that I always try to understand how they reached the perspective they hold. I also try hard to understand why I hold my own opinions, and I try not to reduce either my own or another person’s opinion to a simple right-or-wrong morality judgment. I think we come to our opinions through many convoluted paths, and straw-manning an argument does an injustice to the opinions and views of others.
At the same time, I have noticed that those who hold the most oversimplified beliefs do so in a dogmatic manner, as Roach suggested. They may be able to consider facts and engage in deeper reflection, but they ultimately fall back on simple dogma rather than live with the complex cognitive dissonance required to accept that you can believe one thing in general but cannot always rely on that one thing to explain the particulars. Personally, I have found that I can have conversations with these people, but I feel frustrated when they then turn around and post things on social media that are reductive and ignore the complex perspectives we previously talked through.
Like Roach, I find that those with more detailed and nuanced views, built out of an accumulation of facts, are generally less emotionally invested in a given topic. Perhaps it is a lack of passion for a topic that allows them to look at the facts in such detail, rather than adopting a favored view and immediately dismissing anything that doesn’t align with it.
Ultimately, I think much of this behavior can be understood by reading Kevin Simler and Robin Hanson’s book The Elephant in the Brain. We are all smart and capable of self-deception in order to more strongly believe what we want to believe. Oversimplified dogmas simply help us do that better. I think we are often signaling our loyalty to a group, or signaling some characteristic we think is important, when we make reductive and dogmatic statements. We recognize what identity we wish to hold and what is in our self-interest, and we act our part, adopt the right beliefs, and signal to others that we are part of the right in-group. In this way, the dogma is a feature and not a bug.
Beliefs Are Not Voluntary

One of the ideas that Quassim Cassam examines in his book Vices of the Mind is the idea of responsibility. Cassam recognizes two forms of responsibility in his book and examines those forms of responsibility through the lens of epistemic vices. The first form of responsibility is acquisition responsibility, or our responsibility for acquiring beliefs or developing ways of thinking, and the second form of responsibility is revision responsibility, or our responsibility for changing beliefs and ways of thinking that are shown to be harmful.
Within this context Cassam provides interesting insight about our beliefs. He writes, “If I raise my arm voluntarily, without being forced to raise it, then I am in this sense responsible for raising it. Notoriously, we lack voluntary control over our own beliefs. Belief is not voluntary.”
Cassam explains that if it is raining outside, we cannot help but believe that it is raining. We don’t have control over many of our beliefs; they are in some ways inescapable, determined by factors beyond our control. Beliefs are almost forced on us by external factors. I think this is true for many of our beliefs, ranging from spiritual beliefs to objective beliefs about the world. As Cassam argues, we are not acquisition responsible for believing that we are individuals, that something is a certain color, or that our favorite sports team is going to have another dreadful season.
But we are revision responsible for our beliefs.
Cassam continues, “We do, however, have a different type of control over our own beliefs, namely, evaluative control, and this is sufficient for us to count as revision responsible for our beliefs.”
Cassam introduces ideas from Pamela Hieronymi to explain our evaluative control over our beliefs. Hieronymi argues that we can revise our beliefs when new information arises that challenges them. She uses the example of our beliefs about how long a commute will take, and how those beliefs shift if we hear about heavy traffic. We might not be responsible for the initial beliefs that we develop, but we are responsible for changing those beliefs if they turn out to be incorrect. We can evaluate our beliefs, reflect on their accuracy, and make adjustments based on those evaluations.
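Hieronymi’s point is informal, but one way to make evaluative control concrete is Bayesian updating. The sketch below is my own framing, not Cassam’s or Hieronymi’s, and every number in it is invented for illustration: a prior belief about the commute gets revised when a traffic report arrives.

```python
# A minimal sketch of belief revision as Bayesian updating. The
# hypotheses and probabilities are invented for illustration;
# Cassam and Hieronymi do not formalize the commute example this way.

# Prior belief about how long the commute will take.
prior = {"20 min": 0.7, "40 min": 0.3}

# Assumed chance of hearing a "heavy traffic" report under each hypothesis.
likelihood = {"20 min": 0.1, "40 min": 0.8}

# New evidence arrives: the radio reports heavy traffic.
unnormalized = {h: prior[h] * likelihood[h] for h in prior}
total = sum(unnormalized.values())
posterior = {h: p / total for h, p in unnormalized.items()}

print(posterior)  # {'20 min': ~0.23, '40 min': ~0.77}
```

The evaluation, not the original acquisition, is what we control: when the evidence changes, the responsible move is to redo the calculation.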
It is important for us to make this distinction because it helps us better think about how we assign blame for inaccurate beliefs. We cannot blame people for developing inaccurate beliefs, but we can blame them for failing to change those beliefs. We should not spend time criticizing people for developing racist beliefs, harmful spiritual beliefs, or wildly inaccurate beliefs about health, well-being, and social structures. What we should do is blame people for failing to recognize that their beliefs are wrong, and we should help people build evaluative capacities to better reflect on their own beliefs. This changes our stance from labeling people as racists, bigots, or jerks and instead puts the responsibility on us to foster a society of accurate self-reflection that pushes back against inaccurate beliefs. Labeling people blames them for acquiring vices, which is unreasonable; fostering a culture that values accurate information will ease the transition to more accurate beliefs.

Justified Beliefs

A lot of us have beliefs that are formed out of biases and prejudices. Often those beliefs still end up being true in the end, but they are nevertheless unjustified. A skill of the human mind is to ignore contradictory evidence and focus on the limited evidence that supports what we want to believe and backs up our prior assumptions. Whether it is a belief about a sports team, a racial or ethnic group, or a restaurant, we often adopt unjustified beliefs that we support with anecdotal thinking. When these unjustified beliefs turn out to be correct, we use them as a defense of our biased thinking, and risk becoming entrenched in inaccurate assumptions about how the world works.
In Vices of the Mind, Quassim Cassam writes about this directly. He argues that people need to be more careful when considering whether a way of thinking is helpful or harmful, and whether a true result in the end justifies biased assumptions. Cassam writes, “leading to true belief is not the same as being conducive to knowledge. Even in cases where an epistemic vice leads someone to believe something true that doesn’t mean that they have the right to be confident that things are as they take them to be or that their belief is justified.”
To take a relatively harmless example, imagine two sports fans who bet on a college basketball game. One fan might be biased in favor of big-name schools, while the other might be less biased and willing to look at sports analytics when deciding which team is likely to win a game. The biased individual may bet against a smaller school, and may win that bet, but it is hard to say that he would systematically win bets against small schools in favor of more recognizable schools. In any individual instance his bet might pay off, but over the long term we would expect the more objective individual, who is open to sports analytics and other evidence, to win more bets. The biased individual who wins a lucky bet does not have justified beliefs even when his bias pays off.
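A small simulation makes the long-run point concrete. This is a hypothetical sketch, not anything from Cassam’s book: the per-game win probabilities are invented, the biased fan always backs the big-name school, and the calibrated fan backs whichever side the numbers favor.

```python
import random

random.seed(0)
N_GAMES = 100_000

biased_wins = 0
calibrated_wins = 0

for _ in range(N_GAMES):
    # Assumed true chance that the big-name school wins this particular
    # game (the uniform range is made up for illustration).
    p_big = random.uniform(0.2, 0.8)
    big_school_won = random.random() < p_big

    # The biased fan always bets on the big-name school.
    biased_wins += big_school_won

    # The calibrated fan bets on whichever side is actually favored.
    calibrated_wins += big_school_won if p_big > 0.5 else not big_school_won

print(f"biased fan win rate:     {biased_wins / N_GAMES:.3f}")      # ~0.50
print(f"calibrated fan win rate: {calibrated_wins / N_GAMES:.3f}")  # ~0.65
```

The biased fan still wins about half of his bets, so any single win feels like vindication, but only the calibrated strategy wins systematically.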
This type of thinking can be more harmful than bets among college basketball fans. The human mind has a remarkable ability to remember the evidence that supports the beliefs we want to be true and to ignore evidence that undermines our desired beliefs. The biased sports fan probably remembers when he was right about a small school being over-hyped, but probably doesn’t remember the times when big-name schools lost to smaller schools. This can happen with people who are biased against police officers, minority groups, or people who drive certain types of cars. The reference class doesn’t matter to our brain, but the individual anecdotes that support our prior beliefs are remembered.
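Selective memory can be sketched the same way. The numbers here are again invented for illustration: games that fit the bias are remembered nine times out of ten, games that contradict it only one time in five, so the remembered record looks far more lopsided than the actual one.

```python
import random

random.seed(1)
N_GAMES = 10_000
P_UPSET = 0.35        # assumed rate at which the small school wins
P_RECALL_FIT = 0.9    # chance of remembering a game that fits the bias
P_RECALL_CLASH = 0.2  # chance of remembering a game that contradicts it

remembered_fits = remembered_clashes = 0
for _ in range(N_GAMES):
    upset = random.random() < P_UPSET
    if not upset and random.random() < P_RECALL_FIT:
        remembered_fits += 1      # "see, the big school won again"
    elif upset and random.random() < P_RECALL_CLASH:
        remembered_clashes += 1

recalled = remembered_fits + remembered_clashes
print(f"actual big-school win rate:     {1 - P_UPSET:.2f}")
print(f"remembered big-school win rate: {remembered_fits / recalled:.2f}")
# The remembered rate (~0.89) far exceeds the actual rate (0.65).
```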
Holding justified beliefs requires that we inform our beliefs based on real-world evidence with statistical value. Basing our beliefs on individual anecdotes will not consistently lead us to having accurate beliefs, and if we do hit upon a true belief from time to time, we won’t be justified in the beliefs, assumptions, and conclusions that we draw. It is important to recognize when our thinking is anecdotal, and to consider whether our beliefs are justified.
Thinking Conspiratorially

Over the last few years a number of wild conspiracy theories have become popular. Former President Donald Trump embraced a conspiracy theory that the 2020 Presidential Election was rigged (it was not), supported the Qanon conspiracy theory, and did little to push back against conspiracy theories surrounding COVID-19. His actions, behaviors, and beliefs demonstrate that thinking conspiratorially can be an epistemic vice. His willingness to believe wild falsehoods obstructed knowledge for himself and his most ardent supporters.
However, thinking conspiratorially is not always an epistemic vice. One reason conspiracy theories become so gripping, and why people sometimes fall into them, is that real conspiracies do occur. Nixon’s Watergate scandal, Trump’s withholding of financial and military aid unless Ukraine announced an investigation into Joe Biden and his son, and fraud schemes uncovered by inspectors general and government auditors demonstrate that nefarious conspiracies sometimes are real. While thinking conspiratorially can become an epistemic vice, the same is true of anti-conspiratorial thinking.
In the book Vices of the Mind, Quassim Cassam quotes Dr. Charles Pigden of the University of Otago in New Zealand: “there is nothing inherently vicious about believing or being disposed to believe conspiracy theories.” Cassam argues that conspiratorial thinking is not an epistemic vice on its own, but is instead a context-dependent vice or virtue. He continues, “there are environments in which either way of thinking can be epistemically virtuous or vicious, and a way to capture this context-relativity is to describe these thinking styles as conditionally virtuous or vicious.”
The examples I used earlier show how conspiratorial thinking can be either virtuous or vicious. In the case of our former President, his conspiratorial thinking spread misinformation, suppressed true and accurate information, and created a set of false beliefs that some of his supporters believed so strongly that they stormed the United States Capitol in an attempt to stop Congress from certifying the election. The context of his conspiracy theories obstructed knowledge and caused substantial harm to life and property. However, a government auditor who notices inconsistencies in paperwork and accounting practices may be rewarded for thinking conspiratorially, at least to a point. Believing that something nefarious could possibly be going on will encourage the auditor to review financial statements and testimony from personnel with more scrutiny, potentially helping them uncover real fraud. Of course, they could still go too far and push the issue beyond reasonable bounds by thinking conspiratorially, but this type of thinking is conditionally virtuous when it discovers true fraud and improves knowledge about fraud schemes.
Given the dramatic consequences of conspiracy thinking over the last few years, it is easy to dismiss thinking conspiratorially as an epistemic vice. However, we should remember that it is only conditionally an epistemic vice, and that sometimes conspiracies do turn out to be true (or at least partially true). We don’t have to give every conspiracy our respect and attention, but when a conspiracy does appear to be grounded in reality and supported by real evidence, then we should not be too quick to dismiss it.
Causal Links Between Unconnected Events

As a kid I grew up attending basketball camps at UCLA. I played in the old gym that used to host UCLA games in front of a few thousand fans, played on the current court in the main stadium, and slept in the dorms. With my history of basketball at UCLA, I have always been a fan of the men’s basketball team, rooting for them alongside the Nevada Wolf Pack, where I actually went to school. With the UCLA team making a deep run in the NCAA March Madness tournament, I have been reminded of all the superstitious thinking that surrounds sports and that I used to indulge in.
Sports seem to bring out superstitious thinking in even the most rational of people. I try very hard to think about causal structures and to avoid seeing non-existent causal links between unconnected events, but nevertheless, it is hard to not let superstitious thinking creep in. When you are watching a game it is hard not to feel like you have to sit in the right spot, have to watch from a certain room, or have to avoid certain behaviors in order to keep your team in the lead. However, it is absolute nonsense to think that your actions on your couch, miles away from the sporting venue where the game is taking place, could have any causal link to the way that a sports team performs.
In the book Vices of the Mind, Quassim Cassam spends time examining what is happening within our mind when we engage in superstitious thinking. He explains that superstitious thinking qualifies as an epistemic vice because it gets in the way of knowledge. It prevents us from forming accurate beliefs about the world. “Superstitious thinking,” Cassam writes, “isn’t a generally reliable method for forming true beliefs about the future; it won’t generally lead to true beliefs because it posits causal links between unconnected events. … beliefs based on superstitious thinking aren’t reasonable.”
Cassam gives the example of superstitions about walking under ladders in the book. Someone with a superstition believing that bad luck will befall them if they walk under a ladder will probably avoid walking under ladders, and as a result they won’t be as likely to have paint drip on them, to have something fall on their head, or to knock over the ladder and anyone or anything on top of it. Their superstition will lead to better outcomes for them, but not because the superstition helped them create true beliefs about the dangers of walking under ladders. The individual ends up with the correct answer, but interprets the wrong causal chain to get there.
Thinking about rational and plausible causal chains is a way to escape superstitious thinking. You can examine the risks, harms, and benefits of certain behaviors and actions, looking for real connections between events, to see when a superstition is nonsense and when it borrows from real-life causal chains that actually improve life. Trying not to step on cracks will not prevent you from starting a causal chain that leads to your mother’s broken back, but it will help ensure you have more stable and steady footing when you walk. Wearing the same basketball jersey for each game has no causal connection with the team’s performance, and wearing it or not wearing it will not have an impact on how your favorite team performs. We should strive to have accurate beliefs about the world, we should work to see causal connections clearly, and we should limit superstitious thinking even when it concerns trivial things like sports.
Superstitious Thinking

Would you consider superstitious thinking to be a vice? According to Quassim Cassam in Vices of the Mind, superstitious thinking is indeed an epistemic vice. That is to say, Cassam believes that superstitious thinking is a reprehensible, blameworthy way of thinking that systematically obstructs knowledge. By systematically obstructing knowledge, superstitious thinking causes people to adopt beliefs about the world that don’t match reality, leaving them vulnerable to poor decision-making that can have real-world consequences in their lives.
Cassam writes, “a gambler who sees a succession of coin tosses coming down heads thinks that the next toss will be tails because a tails is now due. This is an example of superstitious or magical thinking, thinking that posits a causal link between unconnected events, namely, previous coin tosses and the next toss.” This quote shows how superstitious thinking systematically obstructs knowledge. It causes us to see causal connections when none exist, distorting our perception and theory of reality.
A gambler making bets sees a causal connection between previous rolls of a die or spins of a roulette wheel and the next roll or spin. In reality, each time you flip a coin, roll a die, or spin a wheel, the previous result has no bearing on the current probability. A coin toss is a 50-50 affair that does not change because the previous flip was heads.
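A quick simulation shows this independence directly. It is a generic illustration rather than anything from Cassam’s text: even looking only at flips that come right after five heads in a row, the next flip is still heads about half the time.

```python
import random

random.seed(2)

# One million fair coin flips; True means heads.
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect only the flips that immediately follow five heads in a row.
after_streak = [flips[i] for i in range(5, len(flips)) if all(flips[i - 5:i])]

print(f"flips following five straight heads: {len(after_streak)}")
print(f"heads rate on those flips:           {sum(after_streak) / len(after_streak):.3f}")
# Prints ~0.500: the streak has no bearing on the next toss.
```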
This type of thinking is prevalent in more than just gamblers. Sports enthusiasts regularly see causal links that cannot possibly exist. The same kind of thinking also shows up in people who have lucky clothing, special rituals in aspects of daily life, or who avoid certain phrases or behaviors. In many instances, the causal links we identify are absurd but don’t incur real costs in our lives. Avoiding stepping on cracks in the sidewalk doesn’t cost you anything and growing a beard because your favorite sports team is on a roll might even provide some social benefits and save you time from not shaving. However, giving in to superstitious thinking, as noted before, distorts your view of reality.
The causal chains misperceived through superstitious thinking create false understandings of how the world works. While it is harmless to believe that you need to sit in the same exact spot for your sports team to play well, it is not harmless to believe that hiring a woman to do a certain job is bad luck, and it is not harmless to bet your life savings on a gamble because of superstitious thinking. What may be even worse is that superstitious thinking in one area can spill into other areas, creating a habit of seeing causal chains that don’t exist. Over time, superstitious thinking will lead to worse outcomes and poor decision-making that will have real costs in our lives.
The Human Need for Certainty

Throughout the book Risk Savvy, Gerd Gigerenzer discusses the challenges that people face with thinking statistically, assessing different probable outcomes, and understanding risk. Gigerenzer also discusses how important it is that people become risk literate, and how the future of humanity will require that people better understand risk and uncertainty. What this future requires, he explains, is fighting against aspects of human psychology that are common to all of us and form part of our core nature. One aspect in particular that Gigerenzer highlights as a problem for humans moving forward is our need for certainty.
“Humans appear to have a need for certainty, a motivation to hold onto something rather than to question it,” he writes. Whether it is our religion, our plans for retirement, or the brand of shoes we prefer, we have a need for certainty. We don’t want to question whether our religious, political, or social beliefs are correct. It is more comforting to adopt beliefs and be certain that we are correct. We don’t want to continuously re-evaluate our savings plans and open ourselves to the possibility that we are not doing enough to save for retirement. And we like to believe that we purchased the best running shoes, that we bought the most sustainable shoes for the planet, and that our shoe choices are the most popular. In all of these areas, ambiguity makes our decisions harder, whereas a feeling of certainty gives us confidence. In many ways, our need for certainty is simply a practicality. There are unlimited possibilities and decisions for us to make every day. Adopting certainty eliminates many possibilities and choices, simplifying our lives and allowing us to move through the world without having to question every action of every second of every day.
But in the modern world, humans have to be more comfortable living with ambiguity and have to be able to give up certainty in some areas. “For the mature adult,” Gigerenzer writes, “a high need for certainty can be a dangerous thing.” We live with risk and need to be able to adjust as we face new risks and uncertainties in our lives. We like to hold onto our beliefs and are not comfortable questioning our decisions, but doing so can be necessary in order to move forward and live in harmony in a changing world with new technologies, different demographics, and new uncertainties. A need for certainty can lead people to become dogmatic, to embrace apologetics that discount science demonstrating errors in their thinking, and to ignore the realities of a changing world. One way or another, we have to find ways to be flexible and adjust our choices and plans according to risk, otherwise we are likely to make poor choices and be crushed when the world does not align itself with our beliefs and wishes.
A Lack of Internal Consistency

Something I have been trying to keep in mind lately is that our internal beliefs are not as consistent as we might imagine. This is important right now because our recent presidential election has highlighted the divide between many Americans. In most of the circles I am a part of, people cannot imagine how anyone could vote for Donald Trump. Since they see President Trump as contemptible, it is hard for them to separate his negative qualities from the people who may vote for him. All negative aspects of Trump, and of the ideas people see him as representing, are heaped onto his voters. The problem, however, is that none of us have enough internal consistency between our thoughts, ideas, opinions, and beliefs to justify characterizing as much as half the country as bigoted, uncaring, selfish, or really any other adjective (except maybe self-interested).
I have written a lot recently about the narratives we tell ourselves. It is problematic that the more simplistic a narrative is, the more believable and accurate it feels to us. The world is incredibly complicated, and a simplistic story that seems to make sense of it all is almost certainly wrong. Given this, it is worth looking at our ideas and views and trying to identify areas where we have inconsistencies in our thoughts. This helps us tease apart our narratives and recognize where simplistic thinking is leading us to unfounded conclusions.
In Thinking, Fast and Slow, Daniel Kahneman shows us how this inconsistency between our thoughts, beliefs, and behaviors can arise, using moral ambiguity as an example. He writes, “the beliefs that you endorse when you reflect about morality do not necessarily govern your emotional reactions, and the moral intuitions that come to your mind in different situations are not internally consistent.”
It is easy to adopt a moral position against some immoral behavior or attitude, but when we find ourselves in a situation where we are violating that moral position, we find ways to explain our internal inconsistency without directly violating our initial moral stance. We rationalize why our moral beliefs don’t apply to us in a given situation, and we create a story in our minds where there is no inconsistency at all.
Once we know that we do this with our own beliefs toward moral behavior, we should recognize that we do this with every area of life. It is completely possible for us to think entirely contradictory things, but to explain away those contradictions in ways that make sense to us, even if it leaves us with incoherent beliefs. And if we do this ourselves, then we should recognize that other people do this as well. So when we see people voting for a candidate and can’t imagine how they could vote for such a candidate, we should assume that they are making internally inconsistent justifications for voting for that candidate. They are creating a narrative in their head where they are making the best possible decision. They may have truly detestable thoughts and opinions, but we should remember that in their minds they are justified and making rational choices.
Rather than simply hating people and heaping every negative quality we can onto them, we should pause and ask what factors might be leading them to justify contemptible behavior. We should look for internal inconsistencies and try to help people recognize these areas and move forward more thoughtfully. We should see in the negativity of others a capacity we share ourselves, and we should try to find more constructive ways to engage with them and help shift the narratives that justify their inconsistent thinking.