How good are you at introspection? How much do you know about yourself, and how much do other people know about you that you don’t actually know? I try to ask myself these types of questions and reflect on how much I don’t actually recognize or know about myself, but it is really difficult. I much prefer to look at what I do with the most generous interpretation of myself and my actions and decisions, but I know that somewhere out there is someone who looks at me with the least generous interpretation of me and my actions, in much the same way I look at a person smoking or a person with a political bumper sticker that I dislike and instantly want to ascribe a whole set of negative qualities to. Beyond simply the idea of looking at ourselves honestly, looking at ourselves from the least favorable position is extremely uncomfortable, and it reveals things about who we are that we would rather ignore.
Luckily for us, our brains protect us from this discomfort by simply failing to be aware of our true actions and motives at any given point in time. The brain ascribes high-minded reasons to our behaviors and hands us a script with the best interpretation of what we do. We feel like we know what is going on within our own brains, but the reality is that our own minds are mysteries to ourselves.
Kevin Simler and Robin Hanson write about this in their book The Elephant in the Brain. “In other words, even we don’t have particularly privileged access to the information and decision-making that goes on inside our minds. We think we’re pretty good at introspection, but that’s largely an illusion. In a way we’re almost like outsiders within our own minds.”
We tell ourselves that what we do makes sense, is logical, and is the best possible outcome for everyone in a given situation. The reality is that we are likely more focused on our own gain than anything else. We don’t want to admit this to anyone (even ourselves), so we sugarcoat it and hide our motivations behind altruistic reasons. Piercing through this with self-reflection is challenging because we are so good at deflecting and deceiving our own thought processes. It is comforting to believe that we are on the right side of a moral issue (especially if we benefit from the side we are on), and it is uncomfortable, with no clear path forward, to look inward and discover that we may have been wrong the whole time (especially if it seems that our social group has been wrong the whole time). Increasingly in my life I find it imperative to consider that my brain doesn’t see things clearly and that it is likely wrong and shortsighted about many issues. Remembering this helps me avoid becoming too certain of my beliefs and keeps me open to the ways that other people see the world. It helps me recognize when people are acting out of their own self-interest, and it helps me pull back in situations where my ego wants to run wild and declare that I am clearly right and those other people are obviously violating good moral ethics.
Kevin Simler and Robin Hanson write about evolutionary psychology of the brain in their book The Elephant in the Brain to explain why it is that we have hidden motives and why those hidden motives can be so hard to identify. The authors write (brackets mine, italics in original), “The human brain, according to this view [evolutionary psychology], was designed to deceive itself–in [Robert] Trivers’ words, ‘the better to deceive others.'” The authors look at how self-deception can be positive from an evolutionary perspective, and how that shapes the way we think about ourselves and our place in the world.
Fudging on the rules from time to time and making ourselves look better than we really are can be good survival strategies, or at least they potentially were for our ancestors. Humans evolved in small, political, social tribes with rules and norms to adhere to and enforce to varying degrees. Slight amounts of cheating, if they go unnoticed, can be beneficial for survival. This creates an evolutionary pressure to pass along selfish genes that favor individual survival: uphold the rules when it is convenient, but push them aside when it benefits us. Simler and Hanson argue that this pressure is so strong that we evolved to not even notice when we do this.
We also seem to justify our actions (a process known as motivated reasoning) in a way which says that we didn’t really do anything bad, we were just making the best decision we could given the circumstances or we were upholding fairness and justice in the absence of a greater authority to administer justice. The more we can convince ourselves that we are right and that we are on the correct side of a moral argument, the more we can convince others that our actions were just. If we are blatantly lying about our motivations, and we know we are lying, it will be harder to convince others and build support around our actions.
If, however, we convince ourselves that our actions were right and our motives pure, we will have an easier time convincing others of our correctness and of our value to them and to society. When we give to charity, at least part of our donation is probably driven by a desire to be seen as the person who gives to charity, or as a person with enough money to give some away. These two motivations, however, would be frowned upon. Instead, we convince ourselves that we give to charity because it is the right thing to do, or because we think the cause is incredibly important. Both may be true, but if we completely convince ourselves that we are donating for the high-minded reasons, we will appear more authentic and be better able to convince other people that we made donations for high-minded and not selfish reasons. We are wired not to see the world as it is, but to see it through a filter that magnifies our greatness and minimizes our faults, deceiving ourselves so we can do a better job of presenting the best version of ourselves to the world.
I have written in the past about the idea and model that our brains act as press secretaries, taking the information that comes into the mind and presenting it in a way that makes everything happening in the mind look as good as it possibly can. This idea comes back in Robin Hanson and Kevin Simler’s book The Elephant in the Brain, where the authors expand on it. They write,
“Above all, it’s the job of our brain’s Press Secretary to avoid acknowledging our darker motives – to tiptoe around the elephant in the brain. Just as a president’s press secretary should never acknowledge that the president is pursuing a policy in order to get reelected or to appease his financial backers, our brain’s Press Secretary will be reluctant to admit that we’re doing things for purely personal gain, especially when that gain may come at the expense of others. To the extent that we have such motives, the Press Secretary would be wise to remain strategically ignorant of them.”
I really like the way that the authors describe the role of the conscious part of our brains as acting as a press secretary. By keeping us consciously unaware of our motivations for action, we can be strategically ignorant of why we do what we do. Strategic ignorance is common when we pretend that the things we do don’t have external consequences for others, when we don’t want to face the reality of science, or when we just want to avoid doing some unappealing task. In most cases we probably recognize that we are not fooling anyone when we claim we don’t know what’s really happening, but at least it gives us a slight cushion to be comfortable while hoping that the negative consequences don’t come back to bite us.
Hanson and Simler continue the metaphor, “What’s more – and this is where things might start to get uncomfortable – there’s a very real sense in which we are the Press Secretaries within our minds. In other words, the parts of the mind that we identify with, the parts we think of as our conscious selves…” It is easy to ignore the parts of ourselves that don’t align with the story we want to tell and present to the world about what great people we are. It turns out it is so easy because we are not consciously aware of those parts of ourselves. We are just the press secretary who is handed the script about all the great things happening within us. We purposefully avoid those parts of us that look bad, because we don’t want to acknowledge they are there and have to explain ourselves in spite of those negative aspects of who we are. By simply ignoring those parts of us and sticking to the happy script, we can look great and feel great about the wonderful things we do, even if those wonderful things don’t measure up to the sanitized version we present to the world. There is a lot taking place behind the scenes, but lucky for us, we are just the front-facing conscious press secretary who doesn’t see any of it.
“Our minds are built to sabotage information in order to come out ahead in social games.” In The Elephant in the Brain, Kevin Simler and Robin Hanson write about the ways in which we act out of our own self-interest without acknowledging it. We are more selfish, more deceptive, and less altruistic than we would like to admit, even to ourselves. To keep us feeling good about what we do, and to make it easier to put on a benevolent face, our brains seem to deliberately distort information to make us look like we are honest, open, and acting with the best of intentions for everyone.
“When big parts of our minds are unaware of how we try to violate social norms, it’s more difficult for others to detect and prosecute those violations. This also makes it harder for us to calculate optimal behaviors, but overall, the trade-off is worth it.” As someone who thinks critically about Stoicism and believes that self-reflection and awareness are keys to success and happiness, this is hard to take in. It suggests that self-awareness is a bigger burden for social success than blissful unawareness. Being deluded about our actions and behaviors, Simler and Hanson suggest, helps us be better political animals and helps us climb the social hierarchy to attain a better mate, more status, and more allies. Self-awareness, their idea suggests, makes us more aware of the lies we tell ourselves about who we are, what we do, and why we do it, and makes it harder for us to lie and get ahead.
“Of all the things we might be self-deceived about, the most important are our own motives.” Ultimately, however, I think we will be better off if we can understand why we, and everyone else, believe the things we do and behave the way we do. Turning inward and recognizing how often we hide our motives and deceive ourselves and others about our actions can help us overcome bias. We can start to be more intentional about our decisions and think more critically about what we want to work toward. We don’t have to hate humanity because we lie and hide parts of ourselves from even ourselves, but we can better move through the world if we actually know what is going on. Before we become angry over a news story, before we shell out thousands of dollars for new toys, and before we make overt displays of charity, we can ask ourselves if we are doing something for legitimate reasons, or just to deceive others and appear to be someone who cares deeply about an issue or item. Slowly, we can counteract the negative externalities associated with the brain’s faulty perceptions, and we can at least make our corner of the world a little better.
“The unreal disguise of consciousness serves only to emphasize to me the existence of the undisguised unconsciousness.” Fernando Pessoa wrote this in The Book of Disquiet as he reflected on the way that people thought about and moved through the world around him. What Pessoa noted is that we act and behave as though we are consciously making decisions and guiding our lives, while in reality we are often driven by unconscious forces that we are not aware of. For Pessoa, life was an everyday struggle. He did not live in desolate poverty or have anything particularly terrible happening in his life, but he was cognizant of the stories people told about their lives and existence, and he could not bring himself to believe any particular story.
It seems like most people, most of the time, are not actually that considerate of the world around them. If we were all more considerate, capitalism would not be elevated in the United States to a quasi-religious status. We would be able to take action on climate change. We would probably spend less time watching what celebrities were doing, and more time participating in a sport rather than watching and talking about other people playing one.
“Occasional hints that they might be deluding themselves–that and only that is what most men experience.”
I have been thinking about consciousness and our experiences quite a bit lately. In Considerations, Colin Wright encouraged us to think more deeply about the world, and to see things beyond our initial reaction. Rob Reid has talked to guests on his podcast After On about the reality that our brains don’t sense the world as fully as they potentially could. There are senses we just don’t have that we observe in other living creatures. And in The Elephant in the Brain, Kevin Simler and Robin Hanson discuss ways in which being deluded about reality can be an evolutionary advantage for us.
Most people that I meet don’t seem to be interested in thinking beyond their initial reaction to the world. Most people don’t really seem to consider the fact that they have a narrow band of senses through which they can experience the world. And most people don’t seem to be interested in the idea that we evolved to have an inaccurate picture of the universe because it helped us be socially deceptive. But I think it is powerful and important that we recognize how much is going on beyond the recognition of our conscious self. We should strive to have a full existence that helps encourage flourishing for others as well as for ourselves. We should strive to see reality for what it is, and cut through the stories we tell ourselves or that others tell for us. The more considerate we are, the more we can open others to the same reality, and hopefully start to counteract the unconscious immediacy of our reactions to the world which is encouraged by social media and indeed by our brains’ very nature.
Loyalty in social tribes is important. If you are consistently loyal to a strong, smart, and well connected individual in a small group, you can receive a lot of direct benefits. Being disloyal, failing to conform, and only occasionally supporting the person in the social group with the highest social status will not get you the same level of benefits. In our world today we still do this, though it is probably less of a major driver of whether we pass on our genes and have enough food to eat. In the world of our tribal ancestors, however, this likely played a huge role in who was able to pass their genes along, who got to eat from the communal dinner, and who was left out in the cold when there was not enough shelter.
Our relationships involve a certain amount of loyalty, and loyalty cannot be ascertained or demonstrated by just asking someone, “to what degree are you loyal to me?” Loyalty must be demonstrated and shown in subtle indirect ways. When a wife asks, “do these jeans make me look fat?” she may really be asking how loyal and loving her husband is, as opposed to actually asking about her appearance in a pair of jeans (as a guy, I would like to note that I may be 100% dead wrong on this particular example – forgive me if I am totally missing the mark here).
In The Elephant in the Brain Kevin Simler and Robin Hanson write, “we often measure loyalty in our relationships by the degree to which a belief is irrational or unwarranted by the evidence.” So a group or tribe may adopt a completely irrational belief as a type of test, to see who is the most loyal and the least willing to question the leader or cut against the tribe. “It only demonstrates loyalty to believe something that we wouldn’t have reason to believe unless we were loyal.”
I think a lot of religion includes these types of tests. I also think we see this in sports relationships, our relationships to some consumer products, and clearly in our political parties. We need coalitions to do great things or we will only make it so far. People won’t want to join our coalitions unless we can demonstrate loyalty and group belonging. Believing something clearly inaccurate is a good way to show loyalty in an indirect sort of way and to signal to others that we are on their side and have their back.
One thing I have written about a lot on the blog is our tribal nature and how we now live in a world that demands global solutions and thinking that we seem unable to achieve due to our tribal evolutionary past. We face great challenges today and have opportunities to put in place policies that lift everyone, but we often spend much of our time fighting among ourselves over meaningless tribal points. We become raving sports fanatics for no sensible reason, we get in fights about which college major produces the better students, and we form clubs around the type of truck we drive and literally get into fist fights because we drive Fords and those people over there drive Dodges. For some reason, we feel compelled to be loyal to these groups, even when there is no tangible benefit (in the sense of really prospering or attracting new mates) to being so loyal to a meaningless group.
Our loyalty within these groups is a phenomenon that I find extremely interesting, and at times deeply troubling. One of the reasons why it can be so detrimental and scary for our society is well explained in The Elephant in the Brain by Kevin Simler and Robin Hanson, “When a group’s fundamental tenets are at stake, those who demonstrate the most steadfast commitment – who continue to chant the loudest or clench their eyes the tightest in the face of conflicting evidence – earn the most trust from their fellow group members.”
Groups favor and actively reward loyalty even when loyalty is undeserved. We praise those who stick with the party when the leader is clearly in the wrong. We say that true sports fans still show up and root for the team even when the team flat out sucks and isn’t even fun to watch. We encourage our fellow group members to stick with our side even when overwhelming evidence shows that our side is in the wrong.
This leaves me asking: how do we ever move forward based on shared understandings of reasonable facts? How do we improve our society if all we do is advocate for things that benefit our social group, even if those things are bad for everyone else? How do we create common understanding if we don’t acknowledge our group loyalties and our efforts to advantage our group over others?
I think a key is to begin working to show how meaningless many of the groups we belong to can be. On an individual level we need to develop skills to recognize when we are being defensive about something and showing group loyalty over an idea or to a group that just doesn’t matter. When we can start to step back and admit we were wrong and that changing our opinion doesn’t matter, we can start to move forward. The great challenge is doing this on a societal level, especially if bad actors don’t have the incentive to behave the same way.
“Self deception is useful only when you’re playing against an opponent who can take your mental state into account,” write Kevin Simler and Robin Hanson in The Elephant in the Brain. “Sabotaging yourself works only when you’re playing against an opponent with a theory-of-mind.”
When we think about other people and their actions, we don’t just look at the hard facts of what happened. We spend a lot of time trying to read small cues and context to understand why someone did something. We project ourselves into the situation, we imagine other people in their situation, and sometimes we even imagine a person from space with no human social awareness in the situation. We strive to understand what types of mental processes and thoughts may have been taking place in the person’s head at the time of an action or decision. From sports, to politics, to office gossip, we attempt to guess the mental state of others, we hold a theory of what is taking place in their mind.
This is a key part of game theory. We have to be able to deduce that others are thinking something, that they are interpreting, reacting to, and making decisions about a given situation, and that they will change their behavior in response to the way that we think and behave. In this world, social decisions and consequences along with individual actions become very complex very fast. What often matters is not so much a given outcome, but the intent behind the outcome. Was this person just trying to make themselves richer, or did they have more altruistic motives of helping everyone? Did this person really want to develop a new type of road to help improve traffic, or again, were they just out for themselves? Is my co-conspirator going to rat me out, or will he keep his mouth shut? These are the types of questions we think about when we assume other people have minds that work like ours.
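The co-conspirator question is the classic prisoner's dilemma from game theory. As a rough sketch (the payoff numbers and names here are my own illustration, not from the book), here is how holding a theory of another mind turns into a concrete decision rule:

```python
# A minimal prisoner's dilemma sketch. Payoffs are illustrative:
# each entry maps (my_action, partner_action) -> (my_years, partner_years)
# in prison, where fewer years is better.
PAYOFFS = {
    ("stay_quiet", "stay_quiet"): (1, 1),
    ("stay_quiet", "rat_out"):    (10, 0),
    ("rat_out",   "stay_quiet"):  (0, 10),
    ("rat_out",   "rat_out"):     (5, 5),
}

def best_response(belief_about_partner):
    """Pick my action given a theory of what my partner will do."""
    return min(("stay_quiet", "rat_out"),
               key=lambda my_action: PAYOFFS[(my_action, belief_about_partner)][0])

# Whichever choice I believe my partner will make, ratting him out
# minimizes my own sentence - that is what makes the dilemma a dilemma.
print(best_response("stay_quiet"))  # rat_out
print(best_response("rat_out"))     # rat_out
```

The point of the sketch is that my move depends entirely on a belief about another mind, which is exactly the theory-of-mind reasoning Simler and Hanson describe.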
This brings in self-deception. If we are always looking at others trying to sort out their motives, and if they are doing the same to us, then we had better have a really good poker face when we are lying–or when we are just not quite telling the full truth. “we, humans, must self-deceive. Those who refuse to play such mind games will be at a disadvantage relative to others who play along,” the authors write in their book. Many of us have probably been in a situation where we tried to be truthful and honest, but were afraid that someone who was not truthful could interfere with our plans by seeming to be honest but really lying. They may have made great impressions and possibly gotten the reward we hoped for, ultimately preventing us from doing something good while they scammed the situation. This is why we are under pressure to self-deceive, to overpromise, to inflate ourselves, and to fudge the details. After all, if we know we can do something the best, we had better make sure we have the chance and don’t have it stolen by someone else who might be lying and less capable. Competing with other smart social creatures encourages self-deception, so that we can feel good about ourselves and appear more genuine while we distort the facts to get ahead.
Anyone who has ever misplaced their keys or their wallet knows that the brain can be a bit faulty. If you have ever been convinced you saw a snake only to find out it was a plastic bag, or if you remembered dropping a pan full of sweet potatoes as a child during Thanksgiving only to get into an argument with your brother about which one of you actually dropped the pan, then you know your brain can misinterpret signals and misremember events. For some reason, our hyper-powerful pattern-recognition brains seem to be fine with letting us down from time to time.
In The Elephant in the Brain, Kevin Simler and Robin Hanson write, “There’s a wide base of evidence showing that human brains are poor stewards of the information they receive from the outside world. But this seems entirely self-defeating, like shooting oneself in the foot. If our minds contain maps of our worlds, what good comes from having an inaccurate version of these maps?”
The question is, why do we have such powerful brains that can do such amazing things, but that still make basic mistakes all the time? The answer that Hanson and Simler propose throughout the book is that having super-accurate information in the brain, remembering everything perfectly, and clearly observing everything around us is actually detrimental to our success as a social species. Our view of the world only needs to be so accurate for us to successfully function as biological creatures. We only need senses that satisfice for us to evade predators, avoid poisonous mushrooms, and get enough food. What really drives the evolution of the brain is being successful socially, and sometimes a bit of deception gives us a big advantage.
It is clear that the brain is not perfect at observing the world. We don’t see infrared wavelengths of light, we can’t sense the Earth’s magnetic pull, and we can’t hear as many sounds as dogs can. Our experience of the world is limited. On top of those limitations, our brains are not that interested in having an accurate picture of the information they actually can observe. We must keep this in mind as we go through our lives. What seems clear and obvious to us may be a distorted picture of the world that someone else can see is incomplete. A good way to move forward is to abandon the idea that we have (or must have) a perfect view and opinion of the world. We can acknowledge that we have preferences and opinions that shape how we interpret the world, and even if we are not open to changing those opinions, we can at least be open to the idea that our brains are not designed to have perfect views, and that we might be shortsighted in some areas. We will need to bond with others and form meaningful social groups, but we should not accept that we have to delude our view of the world and accept alternate facts in order to fit in.