Our Devious Minds

“We now realize,” write Kevin Simler and Robin Hanson in their book The Elephant in the Brain, “that our brains aren’t just hapless and quirky–they’re devious. They intentionally hide information from us, helping us fabricate plausible pro-social motives to act as cover stories for our less savory agendas. As Trivers puts it: ‘At every single stage [of processing information]–from its biased arrival, to its biased encoding, to organizing it around false logic, to misremembering and then misrepresenting it to others–the mind continually acts to distort information flow in favor of the usual goal of appearing better than one really is.’”


Recently I have been fascinated by the idea that our minds don’t do a good job of perceiving reality. The quote above shows many of the points where our minds build a false sense of reality for us and where our perceptions and understanding can go astray. It is tempting to believe that we observe and recognize an objective picture of the world, but there are simply too many points where our mental conceptualization of the world can deviate from an objective reality (if such an objective reality even exists).


What I have taken away from discussions and books focused on the way we think and the mistakes our brains can make is that we cannot always trust our minds. We won’t always remember things correctly, and we won’t always see things as clearly as we believe. What we hold to be best and correct about the world may not be accurate. In that sense, we should constantly doubt our beliefs and the beliefs of others. We should develop processes and systems for identifying reasonable information, and we should question information that aligns with our prior beliefs as rigorously as information that contradicts them. We should identify the key principles that matter most to us and focus on those, rather than on particular instances that we try to understand by filling in answers from generalizations.


A fear that I have is that as we come to doubt the information around us and the perceptions of our own minds, we will begin to doubt the institutional structures that help us manage the flow of information. We should be continually thinking of ways to strengthen institutions that can help us navigate a complex world. At the moment, one thing I think we are seeing across the globe is that as we doubt information, we also doubt institutions that have been valuable in helping human societies advance. We need to find ways to make institutional knowledge more trustworthy and clear, and to build institutions with incentives to provide the most reasonable and accurate information possible, so that we can overcome the biases and misconceptions of the mind.

Sabotage Information

“Our minds are built to sabotage information in order to come out ahead in social games.” In The Elephant in the Brain, Kevin Simler and Robin Hanson write about the ways in which we act out of our own self-interest without acknowledging it. We are more selfish, more deceptive, and less altruistic than we would like to admit, even to ourselves. To keep us feeling good about what we do, and to make it easier to put on a benevolent face, our brains seem to deliberately distort information to make us look like we are honest, open, and acting with the best of intentions for everyone.


“When big parts of our minds are unaware of how we try to violate social norms, it’s more difficult for others to detect and prosecute those violations. This also makes it harder for us to calculate optimal behaviors, but overall, the trade-off is worth it.” As someone who thinks critically about Stoicism and believes that self-reflection and awareness are keys to success and happiness, this is hard to take in. It suggests that self-awareness is a greater burden to social success than blissful unawareness is. Being deluded about our actions and behaviors, Simler and Hanson suggest, makes us better political animals and helps us climb the social hierarchy to attain a better mate, more status, and more allies. Self-awareness, on their account, makes us more conscious of the lies we tell ourselves about who we are, what we do, and why we do it, and thus makes it harder for us to lie and get ahead.


“Of all the things we might be self-deceived about, the most important are our own motives.” Ultimately, however, I think we will be better off if we can understand why we, and everyone else, believe the things we do and behave the way we do. Turning inward and recognizing how often we hide our motives and deceive ourselves and others about our actions can help us overcome bias. We can start to be more intentional about our decisions and think more critically about what we want to work toward. We don’t have to hate humanity because we lie and hide parts of ourselves from even ourselves, but we can better move through the world if we actually know what is going on. Before we become angry over a news story, before we shell out thousands of dollars for new toys, and before we make overt displays of charity, we can ask ourselves if we are doing something for legitimate reasons, or just to deceive others and appear to be someone who cares deeply about an issue or item. Slowly, we can counteract the negative externalities associated with the brain’s faulty perceptions, and we can at least make our corner of the world a little better.
