Outsiders Within Our Own Minds

How good are you at introspection? How much do you know about yourself, and how much do other people know about you that you don’t? I try to ask myself these kinds of questions and reflect on how much I don’t actually recognize or know about myself, but it is really difficult. I much prefer to view my actions and decisions with the most generous interpretation, but I know that somewhere out there is someone who views me with the least generous interpretation, in much the same way that I look at a person smoking, or a person with a political bumper sticker I dislike, and instantly want to ascribe a whole set of negative qualities to them. Beyond simply looking at ourselves honestly, looking at ourselves from the least favorable position is extremely uncomfortable, and it reveals things about who we are that we would rather ignore.

 

Luckily for us, our brains protect us from this discomfort by simply failing to be aware of our true actions and motives at any given point in time. The brain ascribes high-minded reasons to our behaviors and hands us a script with the best interpretation of what we do. We feel like we know what is going on inside our own heads, but the reality is that our own minds are mysteries to ourselves.

 

Kevin Simler and Robin Hanson write about this in their book The Elephant in the Brain. “In other words, even we don’t have particularly privileged access to the information and decision-making that goes on inside our minds. We think we’re pretty good at introspection, but that’s largely an illusion. In a way we’re almost like outsiders within our own minds.”

 

We tell ourselves that what we do makes sense, is logical, and is the best possible outcome for everyone in a given situation. The reality is that we are likely more focused on our own gain than on anything else. We don’t want to admit this to anyone (even ourselves), so we sugarcoat it and hide our motivations behind altruistic reasons. Piercing through this with self-reflection is challenging because we are so good at deflecting and deceiving our own thought processes. It is comforting to believe that we are on the right side of a moral issue (especially if we benefit from the side we are on), and it is uncomfortable, with no clear path forward, to look inward and discover that we may have been wrong the whole time (especially if it seems that our social group has been wrong the whole time). Increasingly, I find it imperative to consider that my brain doesn’t see things clearly and is likely wrong and shortsighted about many issues. Remembering this helps me avoid becoming too certain of my beliefs and keeps me open to the ways other people see the world. It helps me recognize when people are acting out of their own self-interest, and it helps me pull back in situations where my ego wants to run wild and declare that I am clearly right and those other people are obviously in the moral wrong.

Deceiving Ourselves

Kevin Simler and Robin Hanson draw on the evolutionary psychology of the brain in their book The Elephant in the Brain to explain why we have hidden motives and why those hidden motives can be so hard to identify. The authors write (brackets mine, italics in original), “The human brain, according to this view [evolutionary psychology], was designed to deceive itself–in [Robert] Trivers’ words, ‘the better to deceive others.'” The authors look at how self-deception can be advantageous from an evolutionary perspective, and how that shapes the way we think about ourselves and our place in the world.

 

Fudging the rules from time to time and making ourselves look better than we really are can be good survival strategies, or at least they potentially were for our ancestors. Humans evolved in small, political, social tribes with rules and norms that were adhered to and enforced to varying degrees. Slight amounts of cheating, if they go unnoticed, can be beneficial for survival. This creates an evolutionary pressure to pass along selfish genes that favor individual survival: uphold the rules when it is convenient, but push them aside when it benefits us. Simler and Hanson argue that this pressure is so strong that we evolved not to even notice when we do this.

 

We also seem to justify our actions (a process known as motivated reasoning) in a way that says we didn’t really do anything bad; we were just making the best decision we could given the circumstances, or we were upholding fairness and justice in the absence of a greater authority to administer it. The more we can convince ourselves that we are right and on the correct side of a moral argument, the more we can convince others that our actions were just. If we are blatantly lying about our motivations, and we know we are lying, it will be harder to convince others and build support for our actions.

 

If, however, we convince ourselves that our actions were right and our motives pure, we will have an easier time convincing others of our correctness and of our value to them and to society. When we give to charity, at least part of our donation is probably driven by a desire to be seen as the kind of person who gives to charity, or as a person with enough money to give some away. Those two motivations, however, would be frowned upon. Instead, we convince ourselves that we give to charity because it is the right thing to do, or because we think the cause is incredibly important. Both may be true, but if we completely convince ourselves that we are donating for the high-minded reasons, we will seem more authentic and be better able to convince other people that we made our donations for high-minded rather than selfish reasons. We are wired not to see the world as it is, but to see it through a filter that magnifies our greatness and minimizes our faults, deceiving ourselves so we can do a better job of presenting the best version of ourselves to the world.

Our Brains Don’t Hold Information as Well as We Think

Anyone who has ever misplaced their keys or their wallet knows that the brain can be a bit faulty. If you have ever been convinced you saw a snake only to find out it was a plastic bag, or if you remember dropping a pan full of sweet potatoes as a child during Thanksgiving only to get into an argument with your brother about which one of you actually dropped the pan, then you know your brain can misinterpret signals and misremember events. For some reason, our hyper-powerful pattern-recognition brains seem to be fine with letting us down from time to time.

 

In The Elephant in the Brain, Kevin Simler and Robin Hanson write, “There’s a wide base of evidence showing that human brains are poor stewards of the information they receive from the outside world. But this seems entirely self-defeating, like shooting oneself in the foot. If our minds contain maps of our worlds, what good comes from having an inaccurate version of these maps?” 

 

The question is, why do we have such powerful brains that can do such amazing things, yet still make basic mistakes all the time? The answer that Hanson and Simler propose throughout the book is that holding super accurate information in the brain, remembering everything perfectly, and clearly observing everything around us is actually detrimental to our success as a social species. Our view of the world only needs to be so accurate for us to function as biological creatures. We only need senses that satisfice: good enough to evade predators, avoid poisonous mushrooms, and get enough food. What really drives the evolution of the brain is social success, and sometimes a bit of deception gives us a big advantage.

 

It is clear that the brain is not perfect at observing the world. We don’t see infrared wavelengths of light, we can’t sense the earth’s magnetic pull, and we can’t hear as many sounds as dogs can. Our experience of the world is limited. On top of those limitations, our brains are not that interested in keeping an accurate picture of the information they actually can observe. We must keep this in mind as we go through our lives. What seems clear and obvious to us may be a distorted picture of the world that someone else can see is incomplete. A good way to move forward is to abandon the idea that we have (or must have) a perfect view of the world. Acknowledge that we have preferences and opinions that shape how we interpret the world, and even if we are not open to changing those opinions, at least be open to the idea that our brains are not designed to have perfect views and that we might be shortsighted in some areas. We will need to bond with others and form meaningful social groups, but we should not accept that we have to delude ourselves and accept alternate facts in order to fit in.

Our Mind Seems Counterproductive

I listen to a lot of science podcasts, and I really love the discoveries, the new ways of thinking, and the better understanding of the world that we gain from science. Science is a process that strives to be rational and to build on previous knowledge to better understand an objective reality. What is also interesting about science is that it operates against the way our brains want to work. As much as I love science and want to be scientific in my thinking and approach to the world, I understand that a great deal of what shapes human beings and the world we build is not rational and seems counterproductive when viewed through a rational lens.

 

Part of the reason our minds are so irrational might be explained by Kevin Simler and Robin Hanson in their book The Elephant in the Brain. The authors describe one reason why our brains evolved to be as complex and irrational as they are: we evolved to be political and deceptive creatures, not rational and objective creatures with a comprehensive view of reality. “Here’s the puzzle,” write Simler and Hanson: “we don’t just deceive others; we also deceive ourselves. Our minds habitually distort or ignore critical information in ways that seem, on the face of it, counterproductive. Our mental processes act in bad faith, perverting or degrading our picture of the world.”

 

According to Simler and Hanson, we act so irrationally and hold such an incorrect view of the world because doing so helped our ancestors be more deceptive and survive. If you wish to tell a white lie to someone, or if you really want to appear sincere in your thoughts and actions, it is much easier if you believe the things you are saying. If you know you are lying and acting in bad faith, you have to be a really good actor or poker player to convince everyone else. We actually benefit when our brains fail to recognize exactly what is driving us and systematically overlook inconvenient truths.

 

For example, I use Strava, a social media platform geared toward runners and cyclists. The app lets users upload GPS data from their runs and bike rides, compare routes, and see who went the fastest along a particular street or up a particular trail. At some level I know that I use the app because it allows me to show off to other people just how good of a runner I am. But if you asked me at any given point why I upload all my workouts to Strava, I would tell you a story about wanting to keep up with friends, wanting to discover new places to go running, and wanting the data I can use to analyze my performance. The first story doesn’t look so great for me, but the second one makes me sound social and intelligent. I am inclined to tell myself that the second story is why I use the app and to deny, even to myself, that I use it because I want to prove that I am a better runner than someone else, or to show off to my casual running friends who might log in and see that I went on a long run.

 

Our brains are not the scientifically rational things I wish they were, but in many ways that is important for us as we try to build coalitions and social groups to get things done. We connect in ways that are beyond rationality, and sometimes we need the generous (though often false) view of ourselves as good actors to help us get through the day. We can strive for more rationality in our thoughts and actions, but we should accept that we will only get so far, and we shouldn’t hate ourselves or anyone else for not always having the nice and pure motives that we present.

More on the Role of Weapons for Evolution

Weapons, especially projectile weapons, reduce the gap between the strongest and weakest members of a group and change what it means to become a powerful and dominant leader within a social group. When weaker individuals can band together in coalitions and use weapons to topple a physically dominant alpha, new skills become more valuable than physical dominance alone.

 

“Once weapons enter the picture,” write Kevin Simler and Robin Hanson in The Elephant in the Brain, “physical strength is no longer the most crucial factor in determining a hominid’s success within a group. It’s still important, mind you, but not singularly important. In particular, political skill – being able to identify, join, and possibly lead the most effective coalition – takes over as the determining factor.”

 

Political skills are not so important if your species rarely interacts in groups. If you live mostly in isolation and only occasionally meet another member of your species to mate or to fight over food, being politically skilled doesn’t matter much. Hanson and Simler argue that weapons, and the change in power dynamics they brought, are what set the human brain on a path toward ever greater development. Political skill requires mental acuity, deception, the ability to signal loyalty, and the ability to relate to and connect with others. The better your brain is at the complex work these skills require, the more likely you are to survive long enough to reproduce. This created an environment for our brains to begin to enlarge, since individuals with bigger brains and more intelligence were generally favored over those who were a little less cognitively capable and therefore less politically and socially skilled.

 

I think it is interesting and important to consider the factors that shaped human evolution. Understanding how our brain came to be the way it is helps us understand why we act the way we do, why we see certain types of biases in thinking, and how we can overcome mistakes in our ways of thought. By acknowledging that our brains developed to be devious rather than to give us a perfect view of reality, we can think more carefully about how to design institutions and settings that help us think in the most productive ways possible.

The Political Role of Weapons for Our Early Ancestors

Weapons are an interesting consideration in early human evolution and in how we ended up where we are, with large brains and strong social groups. Kevin Simler and Robin Hanson address the importance of weapons in their book The Elephant in the Brain. Weapons change the value of physical strength and the nature of conflict at the individual and group levels. They alter the threats our early ancestors faced and the defenses they could mount.

 

“Weapons are a game changer for two reasons,” write Hanson and Simler. “First, they level the playing field between weak and strong members of a group. … Another way weapons alter the balance of power applies to projectile weapons like stones or spears. Such distance weapons make it much easier for a coalition to gang up on a single individual.”

 

Physical force has been a dominating aspect of human relationships (and probably of our early ancestors’ relationships), but we don’t live in societies where only the most physically dominant individuals rule. Weapons are a big part of why this is the case. Once we could hurl projectiles, even just heavy or sharp rocks, at opponents, our social groupings had to change. Coalitions could push back against a dominant individual who did not care about the well-being of the group or of others. The role of politics and cooperation could naturally be expected to rise in a system where physical dominance was not the sole determinant of leadership and power.

 

What weapons did, Hanson and Simler argue (and I will discuss more tomorrow), was create a system that favored brain development. Social intelligence and intellectual capacity became more valuable when coalitions could rule with weapons, and that created a space for the brain to evolve to become larger and more complex. If pure physical dominance were the best predictor of power and of passing along our genes, we would not have expected our early ancestors to begin evolving in a way that favored the development of a large and highly energy-dependent brain. By bringing physical prowess down a level, weapons, it seems, helped further the evolutionary growth of the human brain.

The Social Brain Hypothesis

The California redwoods are amazing trees. They stand taller than any other tree, scraping at the sky as they compete with each other for sunlight. The trees can be densely packed, all competing for the same light, all pulling massive amounts of water from the ground up to enormous heights. What is interesting, however, is that the redwoods are geographically isolated, contained within a fairly narrow region rather than stretching out across huge swaths of the continent. They don’t so much compete against other species and spread as compete for sunlight, water, and resources among themselves.

 

In The Elephant in the Brain, Kevin Simler and Robin Hanson introduce the redwoods as a way to talk about the Social Brain Hypothesis in humans: the idea that our brilliant brains developed so that we could compete against each other, not because our brains helped us outrun lions or get more food than our primate cousins. The authors write,

 

“The earliest Homo Sapiens lived in small, tight-knit bands of 20 to 50 individuals. These bands were our “groves” or “forests,” in which we competed not for sunlight, but for resources more befitting a primate: food, sex, territory, social status. And we had to earn these things, in part, by outwitting and outshining our rivals.
This is what’s known in the literature as the social brain hypothesis, or sometimes the Machiavellian intelligence hypothesis. It’s the idea that our ancestors got smart primarily in order to compete against each other in a variety of social and political scenarios.”

 

I find this super interesting because in many ways we are still fighting among each other as if we were part of a small band of 20 to 50 individuals. We live in a world where, in the United States at least, food is relatively bountiful (for many but certainly not all). We live in a world of online dating, where finding a mate is more open to more people. Our “territory” today can be more private than ever, and niche online communities can give us a sense of social status that we could not have obtained in the past unless we conformed to the small groups of our high school, family, or work.

 

We seem to be at a point where we could let go of the pressures that the social brain hypothesis says shaped our early ancestors, but I don’t see people shedding those pressures very often. We can look at what has driven our species to behave the way we do and see that we don’t need to compete in the same way; we can recognize the great possibilities available to us and move in our own direction. Yet so often we choose to just show off and do more to impress others, as if we still lived in those small tribal bands. Rather than branching out, we often retreat back to a group of 20 to 50 and compete internally in a way that wastes resources on our own selfish motives. I think we should talk more openly about the social brain hypothesis and the ideas that Hanson and Simler present, so that we can have a real discussion about how to move forward without pushing everyone to compete for things that we should be able to provide openly with new systems and organizations.

Our Devious Minds

“We now realize,” write Kevin Simler and Robin Hanson in their book The Elephant in the Brain, “that our brains aren’t just hapless and quirky – they’re devious. They intentionally hide information from us, helping us fabricate plausible pro-social motives to act as cover stories for our less savory agendas. As Trivers puts it: ‘At every single stage [of processing information] – from its biased arrival, to its biased encoding, to organizing it around false logic, to misremembering and then misrepresenting it to others – the mind continually acts to distort information flow in favor of the usual goal of appearing better than one really is.’”

 

Recently I have been fascinated by the idea that our minds don’t do a good job of perceiving reality. The quote above shows many of the points where our minds build a false sense of reality for us and where our perceptions and understanding can go astray. It is tempting to believe that we observe and recognize an objective picture of the world, but there are simply too many points where our mental conception of the world can deviate from objective reality (if such an objective reality even exists).

 

What I have taken away from discussions and books focused on the way we think and the mistakes our brains can make is that we cannot always trust our minds. We won’t always remember things correctly, and we won’t always see things as clearly as we believe. What we believe to be best and correct about the world may not be accurate. In that sense, we should constantly doubt our beliefs and the beliefs of others. We should develop processes and systems for identifying reasonable information, and we should question information that aligns with our prior beliefs as much as information that contradicts them. We should identify the key principles that matter most to us and focus on those, rather than on specific instances that we try to understand by filling in answers from generalizations.

What’s Happening in Our Brains Behind the Conscious Self?

Toward the end of the introductory chapter of their book The Elephant in the Brain, Kevin Simler and Robin Hanson explain what they have observed about the human mind and what they will be exploring in the coming chapters. They write, “What will emerge from this investigation is a portrait of the human species as strategically self-deceived, not only as individuals but also as a society. Our brains are experts at flirting, negotiating social status, and playing politics, while ‘we’ – the self-conscious parts of the brain – manage to keep our thoughts pure and chaste. ‘We’ don’t always know what our brains are up to, but we often pretend to know, and therein lies the trouble.”

 

Over the last few days I have written about a few instances where we deceive ourselves and hide our true motives from ourselves. We do this so that, in our political and social world, we can appear to have high-minded motives and reasons for doing the things we do. Simler and Hanson show that this does not just happen on an individual level; it happens at the group and societal levels as well. We all contribute to the failure to acknowledge what actually drives our decisions and why we do what we do.

 

This process takes place behind the conscious self that experiences the world. In the past, I have borrowed a metaphor that Ezra Klein has used on his podcast: the press secretary. The press secretary for a large company doesn’t sit in on every strategic decision meeting, isn’t part of every meeting about the company’s future, and isn’t on the team that decides whether the company will donate money, begin to hire more minorities, or launch a new product. But the press secretary does have to explain to the general public why the company is making these decisions, and has to do it in a way that makes the company look as high-minded as possible. The company is supporting the local 5K for autism because it cares about the children in the community. The company has decided to hire more minorities because it knows the power of a diverse workforce and believes in equality. The company was forced to close the factory because of unfair trade practices in other countries. None of these stated reasons are self-interested, but the decisions behind them may be more self-interested than altruistic or even necessary.

 

On an individual level, our conscious self acts like the press secretary I described, and this pattern repeats throughout the levels of society. As individuals we say and think one thing while doing another, and so do our political bodies, our family units, our businesses, and the community groups we belong to. There are often hidden motives that we signal toward without expressing directly, and they likely account for a large portion of why we do what we do. This creates awkward situations, especially for people who don’t navigate unspoken social dynamics well, and it can leave us in places where our policies don’t align with the things we say we want. We should not hate humans for having these qualities, but we should try to recognize them, especially in our own lives, and try to control these situations so that we actually live the way we tell people we live.