More on Human Language and Gossip


In my last post I wrote about how human language evolved to help us do more than just describe our environments. Language is certainly useful for asking someone how many cups of flour go into a cookie recipe, where the nearest gas station is, and whether there are any cops on the freeway (or, for our ancestors, which nuts are edible, where to find them, and whether a lion is hiding near the place with the edible nuts). But humans use language for much more than describing our environment. In particular, we use it for signaling, for gossiping, and for saying things without actually saying them out loud.
We might use language to profess something that is clearly, objectively false (that the emperor has nice clothes on) in order to signal our loyalty. We may gossip behind someone’s back to learn from a third party whether that individual is trustworthy, as Yuval Noah Harari argues in his book Sapiens. And we might ask someone if they would like to come over to our house to watch Netflix and chill, even if no actual watching of Netflix is part of the plan. As Robin Hanson and Kevin Simler explain in The Elephant in the Brain, we are asking a question in a way that builds plausible deniability into our own intent while also giving the other person plausible deniability in their response.
These are all very complicated uses of language, and they developed as our brains evolved greater complexity. The reason evolution favored brains that could support such complicated uses of language is that humans are social beings. In Sapiens, Harari writes, “The amount of information that one must obtain and store in order to track the ever-changing relationships of even a few dozen individuals is staggering. (In a band of fifty individuals, there are 1,225 one-on-one relationships and countless more complex social combinations.)” In order to signal to a group, gossip about others, or say things that will be commonly understood but plausibly denied, our brains needed a lot of power. History suggests that tribes typically ranged from about 50 people on the low end to 250 on the high end, meaning we had a great many social interactions and considerations to manage. Our brains evolved to make us better social creatures, and language was one of the tools that both supported and drove that evolution.
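Harari’s figure of 1,225 is just the number of unordered pairs in a group of fifty, n(n-1)/2. A quick sketch of how fast this grows across the band sizes mentioned above:

```python
from math import comb

# Number of one-on-one relationships in a band of n people:
# every unordered pair of individuals, i.e. n choose 2 = n*(n-1)/2.
for n in (50, 150, 250):
    print(f"band of {n}: {comb(n, 2)} one-on-one relationships")

# A band of 50 has 1,225 pairs, matching Harari's figure; a band
# of 250 has 31,125 -- and that is before counting any larger
# social combinations (triads, coalitions, and so on).
```

The pairwise count alone grows quadratically with group size, which is why tracking even a modest tribe demands so much cognitive horsepower.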

The Elephant in the Brain with Psychics and Mediums

In the book The Elephant in the Brain, Robin Hanson and Kevin Simler argue that our own self-interest drives a huge amount of our behavior. On the surface this doesn’t sound like a shock, but once you look at how deeply self-interest is tied to everything we do, you start to see how hard we work to pretend otherwise. We lie to ourselves and others and invent high-minded reasons for our beliefs, behaviors, and actions. But our self-interest is never far behind. It is always there as the elephant in the room (or brain), influencing all that we do even as we constantly try to ignore it.
This is likely what happens when people visit psychics and mediums hoping to learn about their future or reconnect with the spirit of a lost loved one. Mary Roach describes what is going on with psychics, mediums, and their clients in her book Spook: Science Tackles the Afterlife, and I think her explanation is a strong argument for the ideas presented by Hanson and Simler in The Elephant in the Brain. She writes:
“It seems to me that in many cases psychics and mediums prosper not because they’re intentionally fraudulent, but because their subjects are uncritical. The people who visit mediums and psychics are often strongly motivated or constitutionally inclined to believe that what is being said is relevant and meaningful with regard to them or a loved one.”
Both psychics/mediums and their subjects are motivated by self-interests that they don’t want to fully own up to. They both deceive themselves in order to appear to genuinely believe the experience. If you can fool yourself then it becomes much easier to fool others, and that requires that you ignore the elephant (your self-interest) in your brain.
Clients want to believe they are really interacting with the spirit of a lost loved one, not being fooled or defrauded. Critical thinking, and any deliberate acknowledgment of their own susceptibility to being fooled, is set aside. Instead, self-interest works behind the scenes as they help create the reality they want to inhabit with the help of the psychic or medium.
The psychics and mediums, for their part, don’t want to be viewed as fraudsters and quacks. They hide the fact that they have economic and social motivations to appear to have special powers and to signal their authenticity. An uncritical client helps the entire process along and allows both parties to ignore the self-interest acting below the surface. Ultimately, as Roach argues, the process depends both on practitioners who are willing to believe their subjects are having authentic experiences and on subjects who believe their psychics and mediums are genuinely communicating with the dead. Without either party, and without self-deception on both sides, the whole process would fall apart.
More Information Can Make the World More Confusing


“In my experience,” writes Mary Roach in Spook, “the most staunchly held views are based on ignorance or accepted dogma, not carefully considered accumulations of facts. The more you expose the intricacies and realities of the situation, the less clear-cut things become.”
This quote from Mary Roach is something I have experienced in my own life over and over. I have met many people with very strong views about subjects, and they very often oversimplify an issue and reduce arguments against their position to a straw man. Rather than carefully considering whether their opinions and perspectives are valid, they dismiss arguments against their favored position without real thought. And to be fair, this is something I have even caught myself doing.
I generally seem to be one of those people who can talk about challenging subjects with just about anyone. I think the reason I am able to do so is that I always try to understand how a person reached the perspective they hold. I also try hard to understand why I hold my own opinions, and I try not to reduce either my own or another person’s opinion to a simple right-or-wrong moral judgment. We come to our opinions through many convoluted paths, and straw-manning an argument does an injustice to the opinions and views of others.
At the same time, I have noticed that those who hold the most oversimplified beliefs do so in a dogmatic manner, as Roach suggested. They may be able to consider facts and go through deeper considerations, but they ultimately fall back on simple dogma, rather than live with the complex cognitive dissonance required to accept that you believe one thing in general, but cannot always rely on that one thing to explain the particulars. Personally, I have found that I can have conversations with these people, but that I feel frustrated when they then turn around and post things on social media that are reductive and ignore the complex perspectives we previously talked through.
Like Roach, I find that those with more detailed and nuanced views, built out of an accumulation of facts, are generally less emotionally invested in a given topic. Perhaps it is a lack of passion for a topic that allows them to look at the facts in such detail, rather than adopting a favored view and immediately dismissing anything that doesn’t align with it.
Ultimately, I think much of this behavior can be understood by reading Kevin Simler and Robin Hanson’s book The Elephant in the Brain. We are all capable of self-deception in the service of believing what we want to believe, and oversimplified dogmas simply help us do that better. When we make reductive and dogmatic statements, we are often signaling our loyalty to a group or signaling some characteristic that we think is important. We recognize what identity we wish to hold and what is in our self-interest, and we act our part, adopt the right beliefs, and signal to others that we are part of the right in-group. In this way, the dogma is a feature and not a bug.
Personally and Politically Disturbed by the Homeless


On the first page of the preface of The Homeless, Christopher Jencks writes about the responses that many Americans had to the rise of homelessness in American cities in the 1970s. He writes, “The spread of homelessness disturbed affluent Americans for both personal and political reasons. At a personal level, the faces of the homeless often suggest depths of despair that we would rather not imagine, much less confront in the flesh. … At a political level, the spread of homelessness suggests that something has gone fundamentally wrong with America’s economic or social institutions.”
I think the two books that most accurately describe the way I understand our political and social worlds are Thinking Fast and Slow by Daniel Kahneman and The Elephant in the Brain by Kevin Simler and Robin Hanson. Kahneman suggests that our brains are far more susceptible to cognitive errors than we would like to believe. Much of our decision-making isn’t really decision-making so much as excuse-making, finding ways to claim agency over decisions that were more or less automatic. Additionally, Kahneman shows that we very frequently, and very predictably, make certain cognitive errors that lead us to inaccurate conclusions about the world. Simler and Hanson show that we often deliberately mislead ourselves, choosing to buy into our minds’ cognitive errors. By lying to ourselves and viewing ourselves and our beliefs through a false objectivity, we can better lie to others, enhancing the way we signal to the world and making ourselves appear more authentic. [Note: some recent evidence has put some findings from Kahneman in doubt, but I think his general argument around cognitive errors still holds.]
Jencks’ book appeared long before Thinking Fast and Slow and The Elephant in the Brain, but I think his observation hints at the findings that Kahneman, Simler, and Hanson would write about in the coming decades. People wanted to hold onto beliefs they possibly knew or suspected to be false. They were disturbed by a reality that did not match the imagined reality they wanted to believe in. They embraced cognitive errors and adopted beliefs and conclusions based on those errors. They deceived themselves about reality to better appear to believe the myths they embraced, and in the end they developed a political system in which they could signal their virtue by strongly adhering to the initial cognitive errors that sparked the whole process.
Jencks’ quote shows why homelessness is such a tough issue for many of us to face. When we see large numbers of people failing and ending up homeless, it suggests that something more than individual shortcomings is at work. It suggests that somewhere within society and our social structures are points of failure. It suggests that our institutions, from which we may benefit as individuals, are not serving everyone. This goes against the beliefs that reinforce our self-interest, and it is hard to accept. It is much easier to fall back on cognitive illusions and errors and to blame those who have failed. We truly believe that homelessness is the problem of individuals because we are deceiving ourselves, and because it serves our self-interest to do so.

When we see homelessness, we see a reality we want to ignore and pretend does not exist, because we fear it and fear that we may be responsible for it in some way. We fear that homelessness will necessitate changes in the social structures and institutions that have helped us get to where we are, and that those changes may make things harder for us or somehow diminish our social status. This is why we are so disturbed by homelessness, why we prefer not to think about it, and why we develop policies based on the assumption that people who end up homeless are deeply flawed individuals responsible for their own situation. It is also likely why we have not done enough to help the homeless, why homelessness is becoming a bigger issue in American cities, and why we have been so bad at addressing its real causes. There is some truth to the argument that homelessness results from individual failings, which is why it is such a strong argument, but we should accept that there is some flawed causal thinking at play and that it is often in our self-interest to dismiss the homeless as individual failures.
Self-Deceptive Rationalization


I don’t like doing online personality quizzes. Part of the reason why I dislike them is because I believe that three of the cognitive errors and biases identified by Daniel Kahneman in his book Thinking Fast and Slow are at play when we take online quizzes.
 
 
First, we are influenced by the availability heuristic. Our perception of how common or how accurate something is can be greatly influenced by whether we have an easy or hard time remembering the thing. This can influence how we answer questions about things we normally prefer or normally like to do. We might be answering based on how quickly we remember something, not on how we actually feel about something.
 
 
Second, we might substitute the questions being asked with easier-to-answer questions. This is really what is happening with the availability heuristic: a difficult self-reflection question goes unanswered, and we answer a simpler question in its place, reporting how easily something came to mind rather than answering the original question. But substitution can happen outside of the availability heuristic as well. The result is that the quiz is not really measuring what its questions purport to measure.
 
 
Third, Kahneman argues that we can think of ourselves as having two selves: one that acts and feels in the present moment and one that reflects back on and remembers previous experiences. The remembering self has different perceptions than the experiencing self, as Kahneman terms the two. The remembering self doesn’t have an accurate memory for how much we liked or disliked certain experiences. Think about a vacation. You may be feeling burnt out with work and life, and all you want to do, what you would enjoy most in the world, is to sit on a familiar beach doing absolutely nothing. But your remembering self won’t take any exciting and novel memories from a week on a beach doing nothing. Your remembering self would much rather have you go on an exciting yet stressful vacation to a new foreign country. This tension makes the reliability of online personality quizzes questionable: your remembering self answers the questions, not your experiencing self, and the two don’t always have the same opinions.
 
 
What this means is that the kind of reflection that goes into online personality quizzes, or really any reflective activity, can be self-deceptive. Quassim Cassam writes about these dangers in his book Vices of the Mind: “there is always the danger that what critical reflection produces is not self-knowledge, but self-deceptive rationalization.” Our biases and cognitive errors can lead us to incorrect answers about ourselves during self-reflection. The process can feel honest and insightful, but it can often be nothing more than a rationalization of behaviors and actions that we want to believe are true of ourselves. The only way through, Cassam explains, is to cultivate genuine epistemic virtues, see the world more clearly, and recognize our epistemic vices so that we can become better thinkers.

How We Choose to Measure Risk


Risk is a tricky thing to think about, and how we choose to measure and communicate risk can make it even more challenging to comprehend. Our brains like to categorize things, and categorization is easiest when the categories are binary or represent three or fewer distinct possibilities. Once you start adding options and possible outcomes, decisions quickly become overwhelmingly complex, and our minds have trouble sorting through the possibilities. In his book Thinking Fast and Slow, Daniel Kahneman discusses the challenges of thinking about risk and highlights another layer of complexity: which measurements we use to communicate and judge risk.

 

Humans are pretty good at estimating coin flips – that is to say, our brains do okay with binary 50-50 outcomes (although, as Kahneman shows in his book, even these can trip us up from time to time). Once we have to start thinking about complex statistics, like how many people will die from cancer caused by smoking if they smoke a certain number of packs of cigarettes per month for a certain number of years, our brains start to have trouble keeping up. However, there is an additional decision that must be layered on top of such statistics before we can begin to understand them: how we are going to report them. Will we choose to report deaths per thousand smokers? Will we report them by the number of packs smoked over a number of years? Or will we report deaths among all smokers, regardless of whether they smoked one pack per month or one pack before lunch every day?

 

Kahneman writes, “the evaluation of the risk depends on the choice of a measure – with the obvious possibility that the choice may have been guided by a preference for one outcome or another.”

 

Political considerations cannot be escaped, even when we are trying to make objective and scientific statements about risk. If we want to convey that something is dangerous, we might choose to report overall death numbers across the country. Those numbers might sound large, even though they may represent a very small fraction of incidents. Today this may be done with COVID-19 deaths, voter fraud instances, or wildfire burn acreage. Our brains already have a hard time comprehending risk in each of these areas, and the added complexity of how that risk is calculated, measured, and reported can make it virtually impossible for any of us to comprehend. Clear and accurate risk reporting is vital for helping us understand important risks in our lives and in society, but the entire process can be derailed if we choose measures that don’t accurately reflect risk or that muddy the waters of exactly what the risk is.
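Kahneman’s point about the choice of measure can be made concrete with a toy sketch. All of the numbers below are hypothetical, but they show how the same underlying facts can be framed three different ways, each leaving a very different impression:

```python
# Hypothetical numbers: 6,600 deaths this year in a population of
# 330 million, up from a hypothetical 3,300 the year before.
population = 330_000_000
deaths = 6_600
baseline_deaths = 3_300

# Measure 1: absolute count -- sounds large.
print(f"{deaths:,} deaths nationwide")

# Measure 2: rate per 100,000 people -- sounds small.
rate = deaths / population * 100_000
print(f"{rate:.1f} deaths per 100,000 people")

# Measure 3: relative change from baseline -- sounds alarming again.
pct_change = (deaths - baseline_deaths) / baseline_deaths * 100
print(f"deaths up {pct_change:.0f}% over the prior year")
```

The underlying facts are identical in all three lines; only the measure changes. That gap is exactly the room Kahneman describes for a preference about the outcome to guide the choice of reporting.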

What’s Happening in Our Brains Behind the Conscious Self

Toward the end of the introductory chapter of their book The Elephant in the Brain, Kevin Simler and Robin Hanson explain what they have observed about the human mind and what they will explore in the coming chapters. They write, “What will emerge from this investigation is a portrait of the human species as strategically self-deceived, not only as individuals but also as a society. Our brains are experts at flirting, negotiating social status, and playing politics, while “we” – the self-conscious parts of the brain – manage to keep our thoughts pure and chaste. “We” don’t always know what our brains are up to, but we often pretend to know, and therein lies the trouble.”

 

The last few days I have written about a few instances where we deceive ourselves and hide our true motives from ourselves. We do this so that in our political and social world we can appear to have high-minded motives and reasons for doing the things we do. Simler and Hanson show that this does not just happen on an individual level, but happens at group and society levels as well. We all contribute to the failure to acknowledge what it is that drives our decisions and why we do what we do.

 

This process takes place behind the conscious self that experiences the world. In the past, I have borrowed from Ezra Klein who has used a metaphor on his podcast about a press secretary. The press secretary for a large company doesn’t sit in on every strategic decision meeting, isn’t a part of every meeting to decide what the future of the company will be, and isn’t part of the team that makes decisions about whether the company will donate money, will begin to hire more minorities, or will launch a new product. But the press secretary does have to explain to the general public why the company is making these decisions, and has to do it in a way that makes the company look as high-minded as possible. The company is supporting the local 5K for autism because they care about the children in the community. The company has decided to hire more minorities because they know the power of having a diverse workforce and believe in equality. The company was forced to close the factory because of unfair trade practices in other countries.

 

On an individual level, our conscious self acts like the press secretary I described, and this pattern spreads throughout the levels of society. As individuals we say and think one thing while doing another, and so do our political bodies, our family units, our businesses, and the community groups we belong to. Hidden motives that we signal toward likely account for a large portion of why we do what we do. This creates awkward situations, especially for those who don’t navigate unspoken social dynamics well, and can leave us in places where our policies don’t align with the things we say we want. We should not hate humans for having these qualities, but we should try to recognize them, especially in our own lives, and try to actually live the way we tell people we live.

Outsiders Within Our Own Minds

How good are you at introspection? How much do you know about yourself, and how much do other people know about you that you don’t? I try to ask myself these questions and reflect on how much I don’t actually recognize or know about myself, but it is really difficult. I much prefer to view my actions and decisions through the most generous interpretation, though I know that somewhere out there someone views me through the least generous interpretation, in much the same way that I look at a person smoking, or a person with a political bumper sticker I dislike, and instantly want to ascribe a whole set of negative qualities to them. Beyond simply looking at ourselves honestly, looking at ourselves from the least favorable position is extremely uncomfortable, and it reveals things about who we are that we would rather ignore.

 

Luckily for us, our brains protect us from this discomfort by simply failing to be aware of our true actions and motives at any given point in time. The brain ascribes high-minded reasons to our behaviors and hands us a script with the best interpretation of what we do. We feel like we know what is going on inside our own minds, but the reality is that they are mysteries to us.

 

Kevin Simler and Robin Hanson write about this in their book The Elephant in the Brain. “In other words, even we don’t have particularly privileged access to the information and decision-making that goes on inside our minds. We think we’re pretty good at introspection, but that’s largely an illusion. In a way we’re almost like outsiders within our own minds.”

 

We tell ourselves that what we do makes sense, is logical, and is the best possible outcome for everyone in a given situation. The reality is that we are likely more focused on our own gain than anything else. We don’t want to admit this to anyone (even ourselves), so we sugarcoat it and hide our motivations behind altruistic reasons. Piercing through this with self-reflection is challenging because we are so good at deflecting and deceiving our own thought processes. It is comforting to believe we are on the right side of a moral issue (especially if we benefit from the side we are on), and it is uncomfortable, with no clear path forward, to look inward and discover that we may have been wrong the whole time (especially if it seems that our social group has been wrong the whole time). Increasingly I find it imperative to consider that my brain doesn’t see things clearly and is likely wrong and short-sighted about many issues. Remembering this helps me avoid becoming too certain of my beliefs and keeps me open to the ways other people see the world. It helps me recognize when people are acting out of their own self-interest, and it helps me pull back in situations where my ego wants to run wild and declare that I am clearly right and that those other people are obviously violating good moral ethics.

Deceiving Ourselves

Kevin Simler and Robin Hanson write about the evolutionary psychology of the brain in their book The Elephant in the Brain to explain why we have hidden motives and why those motives can be so hard to identify. The authors write (brackets mine, italics in original), “The human brain, according to this view [evolutionary psychology], was designed to deceive itself–in [Robert] Trivers’ words, ‘the better to deceive others.’” The authors look at how self-deception can be positive from an evolutionary perspective, and how that shapes the way we think about ourselves and our place in the world.

 

Fudging the rules from time to time and making ourselves look better than we really are can be good survival strategies, or at least they potentially were for our ancestors. Humans evolved in small, political, social tribes with rules and norms to adhere to and enforce to varying degrees. Slight amounts of cheating, if they go unnoticed, can be beneficial for survival. This creates an evolutionary pressure to pass along selfish genes that favor individual survival: uphold the rules when it is convenient, but push them aside when cheating benefits us. Simler and Hanson argue that this pressure is so strong that we evolved not to even notice when we do this.

 

We also seem to justify our actions (a process known as motivated reasoning) in a way that says we didn’t really do anything bad – we were just making the best decision we could given the circumstances, or we were upholding fairness and justice in the absence of a greater authority to administer it. The more we can convince ourselves that we are right and on the correct side of a moral argument, the more we can convince others that our actions were just. If we are blatantly lying about our motivations, and we know we are lying, it will be harder to convince others and build support for our actions.

 

If, however, we convince ourselves that our actions were right and our motives pure, we will have an easier time convincing others of our correctness and of our value to them and to society. When we give to charity, at least part of the donation is probably driven by a desire to be seen as the kind of person who gives to charity, or as a person with enough money to give some away. Both of these motivations, however, would be frowned upon. So instead we convince ourselves that we give because it is the right thing to do, or because the cause is incredibly important. Those may both be true, but if we completely convince ourselves that we are donating for the high-minded reasons, we will seem more authentic and be better able to convince other people that we made our donations for high-minded rather than selfish reasons. We are wired not to see the world as it is, but through a filter that magnifies our greatness and minimizes our faults, deceiving ourselves so we can do a better job of presenting the best version of ourselves to the world.

Game Theory Interactions with Self-Deception

“Self-deception is useful only when you’re playing against an opponent who can take your mental state into account,” write Kevin Simler and Robin Hanson in The Elephant in the Brain. “Sabotaging yourself works only when you’re playing against an opponent with a theory-of-mind.”
When we think about other people and their actions, we don’t just look at the hard facts of what happened. We spend a lot of time reading small cues and context to understand why someone did something. We project ourselves into the situation, we imagine other people in the same situation, and sometimes we even imagine how a visitor from space, with no human social awareness, would see it. We strive to understand what mental processes and thoughts may have been taking place in the person’s head at the time of an action or decision. From sports to politics to office gossip, we attempt to guess the mental states of others; we hold a theory of what is taking place in their minds.
This is a key part of game theory. We have to be able to deduce that others are thinking, that they are interpreting, reacting to, and making decisions about a given situation, and that they will change their behavior in response to the way we think and behave. In this world, social decisions and consequences, along with individual actions, become very complex very fast. What often matters is not so much a given outcome but the intent behind it. Was this person just trying to make themselves richer, or did they have the more altruistic motive of helping everyone? Did this person really want to develop a new type of road to improve traffic, or, again, were they just out for themselves? Is my partner in crime going to rat me out, or will he keep his mouth shut? These are the questions we ask when we assume other people have minds that work like ours.
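The “rat me out” question above is the classic prisoner’s dilemma from game theory, and a minimal sketch of it makes the theory-of-mind point concrete: my best move depends entirely on a guess about the other player’s mental state. The payoff numbers below are the standard illustrative ones, not anything from the book:

```python
# Standard prisoner's dilemma payoffs (illustrative numbers):
# each entry maps (my move, partner's move) to (my payoff, partner's
# payoff); higher is better.
PAYOFFS = {
    ("stay_quiet", "stay_quiet"): (3, 3),   # mutual cooperation
    ("stay_quiet", "rat_out"):    (0, 5),   # I am betrayed
    ("rat_out",    "stay_quiet"): (5, 0),   # I betray
    ("rat_out",    "rat_out"):    (1, 1),   # mutual defection
}

def best_response(partner_move):
    """My payoff-maximizing move, given a guess about my partner."""
    return max(("stay_quiet", "rat_out"),
               key=lambda my_move: PAYOFFS[(my_move, partner_move)][0])

# Whichever move I believe my partner will make, ratting out pays
# more for me -- which is why modeling the other player's mind, and
# managing what they can read off of mine, matters so much.
for guess in ("stay_quiet", "rat_out"):
    print(guess, "->", best_response(guess))
```

In this one-shot game, defection is the dominant strategy no matter what I believe about my partner; it is in richer, repeated social games, where reputation and perceived intent feed back into future payoffs, that reading minds and managing appearances become decisive.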
This brings in self-deception. If we are always trying to sort out other people’s motives, and they are doing the same to us, then we had better have a really good poker face when we are lying – or when we are just not quite telling the full truth. “We, humans, must self-deceive,” the authors write in their book. “Those who refuse to play such mind games will be at a disadvantage relative to others who play along.” Many of us have probably been in a situation where we tried to be truthful and honest but were afraid that someone could interfere with our plans by seeming honest while really lying. They may have made a great impression and gotten the reward we hoped for, ultimately preventing us from doing something good while they scammed the situation. This is why we are under pressure to self-deceive, to overpromise, to inflate ourselves, and to fudge the details. After all, if we know we can do something best, we had better make sure we get the chance and that it isn’t stolen by someone who might be lying and less capable. Competing with other smart social creatures encourages self-deception, letting us feel good about ourselves and appear more genuine while we distort the facts to get ahead.