Get to Know as Many Different Approaches as Possible

On the most recent episode of the Don’t Panic Geocast, Shannon Dulin said something along the lines of, “all models are wrong.” Our minds are not perfect replications of reality. They operate on models that explain and, to some extent, predict reality, but what takes place within our mental models is not what actually happens in reality.
 
 
Think of a map. A super simple map doesn’t match the bend in each road perfectly. It doesn’t give you a sense of elevation. It is a model of the area you are interested in traversing with the aid of the map. At the other extreme, the most accurate possible model of the area would be a complete and perfect replication of it, but of course that would be of no use in helping us better understand and navigate the area. Our mental models are just like these maps. They simplify, cut out some of the clutter, and reduce some of the unique aspects of reality to give us a more manageable picture and sense of direction. Our mental model is not a perfect replication of reality. Our model is wrong because the only way for it to be right would be for it to be a perfect replication, which would be too complex for our minds.
 
 
Given that models are wrong, but that we need them because we need to simplify in order to think, it is important that we constantly explore to better understand what does and does not need to be in our mental models. Along the same lines, Yuval Noah Harari writes in his book Sapiens, “what is important is to get to know as many different approaches as possible and to ask the right questions.”
 
 
We can focus on any given area, from earthquakes to human happiness to minimum wage laws, and adopt rigid conclusions based on our mental models for understanding the world. But those conclusions are almost certainly wrong because our mental models are almost certainly insufficient and wrong. Getting locked in on a singular mental model or idea will lead us to rigid conclusions that don’t accurately match reality. To get beyond this we need to be able to gather various perspectives and points of view, not just on a single issue or idea, but on every topic. We have to be willing to rethink any mental model that operates in our mind. We need to hone, refine, and adjust mental models with a spirit of exploration and research. Only by trying many different models and combinations will we start to know what is important and what can be stripped out of our model. We have to do this because we will always rely on mental models to understand the world, and having a wrong model means we misunderstand reality and make poor judgments and decisions that impact the real world in a negative way.
The Focusing Illusion Continued

I find the focusing illusion as described by Daniel Kahneman in his book Thinking, Fast and Slow to be fascinating because it reveals how strange our actual thinking is. I am constantly baffled by the way that our brains continuously and predictably make mistakes. The way we think about, interpret, and understand the world is not based on an objective reality, but is instead based on what our brain happens to be focused on at any given time. As Kahneman writes, what you see is all there is, and the focusing illusion is a product of our brain’s limited ability to take in information combined with the brain’s tendency to substitute simpler questions for difficult and complex ones.

 

In the book, Kahneman asks us to think about the overall happiness of someone who recently moved from Ohio to California and also asks us to think about the amount of time that paraplegics spend in a bad mood. In both situations, we make a substitution. We know that people’s overall happiness and general moods are composed of a huge number of factors, but when we think about the two situations, we focus on a couple of simple ideas.

 

We assume the person from Ohio is happier in California because the weather in California is always perfect while Ohio experiences cold winters. The economic prospects in California might be better than in Ohio, and there are more movie stars and surfing opportunities. Without knowing anything about the person, we probably assume the California move made them happier overall (especially given the additional context and priming based on the weather and job prospects that Kahneman presents in the example in his book).

 

For our assumptions about the paraplegic, we likely go the other way with our thoughts. We think about how we would feel if we were in an accident and lost the use of our legs or arms. We assume their life must be miserable, and that they spend much of their day in a bad mood. We don’t make a complex consideration of the individual’s life or ask for more information about them; we just make an assumption based on limited information by substituting in the question, “How would I feel if I became paralyzed?” Of course, people who are paralyzed or lose the function of part of their body are still capable of a full range of human emotions, and might still find happiness in many areas of their lives.

 

Kahneman writes, “The focusing illusion can cause people to be wrong about their present state of well-being as well as about the happiness of others, and about their own happiness in the future.”

 

We often say that it is important that we know ourselves and that we be true to ourselves if we want to live healthy and successful lives. But research throughout Thinking, Fast and Slow shows us how hard that can be. After reading Kahneman’s book, learning about nudges from Cass Sunstein and Richard Thaler, and learning how poorly we process risk and chance from Gerd Gigerenzer, I constantly doubt how much I can really know about myself, about others, or really about anything. I am frustrated when people act on intuition, sure of themselves and their ideas in complex areas such as economics, healthcare, or education. I am dismayed by advertisements, religions, and political parties that encourage us to act tribally and to trust our instincts and intuitions. It is fascinating that we can be so wrong about something as personal as our own happiness. It is fascinating that we can be so biased in our thinking and judgment, and that we can draw conclusions and make assumptions about ourselves and others with limited information and not even notice how flawed our thought processes are. I love thinking and learning about the biases and cognitive errors of our minds, and it makes me pause when I am sure of myself and when I think that I am clearly right and others are wrong. After all, if what you see is all there is, then your opinions, ideas, and beliefs are almost certainly inadequate to actually describe the reality you inhabit.
Confident, But Wrong

We like confident people. We like people who can tell us something direct and simple to understand while being confident in the statements they make. It makes our job as receivers easier. We can trust someone with confidence because surely they have thought through what they say, and surely their lack of ambivalence or hesitation means they have solid evidence and a logical coherence to the ideas they are expressing.

 

The problem, however, is that confidence and accuracy are not actually linked. We can be very confident in something that isn’t accurate, true, or correct. Even worse, it can be hard for us to recognize when our own confidence is misplaced. As Daniel Kahneman writes in his book Thinking, Fast and Slow, “We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are.”

 

We need to surround ourselves with thoughtful people with expertise in important areas where we will be making crucial decisions. We need to collect input from more than one person before we express complete confidence in another person, idea, or prediction. In the real world, this isn’t often possible, but it is something we should at least be aware of.

 

Trusting confident people is a way of answering an easier question in place of a more difficult one. The difficult question might be whether to invest in this mutual fund or that one, or whether to adopt a totally different investment strategy that doesn’t involve mutual funds at all. Instead of asking ourselves how we should invest our savings and doing the difficult research to understand the best strategy for ourselves, we switch to a different question and ask, “Do I trust the financial advisor giving me the investment advice?” This is an easier question for us to answer. If the advisor sounds smart, has awards on their desk or wall, and exudes confidence, then they are going to appear more trustworthy, and we will believe what they say. They can present us with a lot of confidence but be totally wrong, and we will likely go with their recommendations anyway.

 

As Kahneman explains, however, outside observers can help us overcome these confidence traps in ourselves and in how we perceive others. If we have a reliable person with knowledge of a certain area, they can help us think through our arguments to determine if we should be as confident as we are. They can help us evaluate the claims of others, to determine whether their confidence is also well deserved or needs more scrutiny. What is important to remember is that we use confidence as a heuristic, and sometimes we can be confident, but wrong with our thoughts and opinions on a given subject.

Consider Other People’s Opinions Seriously

A principle that Dale Carnegie expresses in his book How to Win Friends and Influence People is, “Show respect for the other person’s opinions. Never say, ‘You’re wrong.’”

 

Telling someone directly that they are wrong doesn’t do much for us. What it does is put the other person in a defensive position by threatening their status and identity. Directly criticizing them and labeling them as wrong, even if it is obvious, doesn’t actually get the other person to recognize their error and change their opinion.

 

To say that someone missed a point, that they committed a logical error, or that their conclusion should have fallen elsewhere is a way to get around direct criticism. Better yet is trying to understand where the person is coming from and why they think the way they do. By doing that, we can actually connect with them and help them examine their thinking and potentially make a change.

 

Carnegie writes, “Remember that other people may be totally wrong. But they don’t think so. Don’t condemn them. Any fool can do that. Try to understand them. Only wise, tolerant, exceptional people even try to do that. There is a reason why the other man thinks and acts as he does. Ferret out that reason – and you have the key to his actions, perhaps to his personality.”

 

When we stand back and tell people they are wrong, we implicitly broadcast how right we are. We don’t consider that other people have different points of view, different experiences, and different backgrounds that shape their views and beliefs. If we can work to better understand these factors and how people ended up where they are with their beliefs, then we have a better possibility of having a real conversation with them. Failing to do so only leads to polarization and an inability to communicate. Remember also that you are probably wrong about many points, and that you have the same capacity as the other person to be wrong in one way or another.

We Might Be Wrong

“If you can be sure of being right only 55 percent of the time,” writes Dale Carnegie in the book How to Win Friends and Influence People, “you can go down to Wall Street and make a million dollars a day. If you can’t be sure of being right even 55 percent of the time, why should you tell other people they are wrong?”
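
To see the arithmetic behind Carnegie’s quip, here is a minimal Python sketch (the repeated even-money bets and the 10% stake are my own simplifying assumptions, not Carnegie’s): a 55 percent hit rate translates into a positive expected return on every wager, and that edge compounds.

```python
# A minimal sketch of the arithmetic behind the Carnegie quote above.
# Assumption (mine, not Carnegie's): repeated even-money bets, winning
# 55% of the time and losing 45% of the time.
p_right = 0.55

# Expected profit per $1 bet: win $1 with probability p, lose $1 otherwise.
edge = p_right * 1.0 - (1 - p_right) * 1.0
print(f"Edge per $1 bet: ${edge:.2f}")  # $0.10, a 10% expected edge

# Even a small edge compounds quickly when a fixed fraction of the
# bankroll is wagered on each bet.
bankroll = 1_000.0
for _ in range(100):                 # 100 bets, staking 10% of bankroll each
    bankroll *= 1 + 0.10 * edge      # expected growth factor of 1.01 per bet
print(f"Expected bankroll after 100 bets: ${bankroll:,.2f}")  # ~$2,704.81
```

Nowhere near a million dollars a day, but the point stands: a reliable 55 percent is an enormous advantage, which is exactly why we so rarely have it.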

 

We always feel so sure of our judgments and conclusions. From the health and safety of GMO foods, to the impacts of a new tax, to who is going to win the Super Bowl, we are often very confident people. The world seems to always want our opinions, and we are usually very excited to offer them with a staggering amount of confidence. This has led to a lot of funny social media posts about people being incorrect about history, science, and sports, but more seriously, it can create thinking errors that lead nations to invade countries for poor reasons, lead to mechanical failures of spacecraft and oil platforms, and cause us to lose huge sums of money when the game doesn’t turn out the way we knew it would.

 

I think a good practice is to look for areas where we feel a high degree of confidence, and then to try to ascribe a confidence level to our thoughts. We can try to tie our confidence levels back to real-world events to help ground our predictions: the chance of being dealt a blackjack in a given hand is 4.83%, Steph Curry’s 3-point shooting percentage is 43.5%, and the chance of a coin flip coming up heads is of course 50%. Can you anchor your confidence (or the chance you are wrong) to one of these percentages?
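
If you want to try this exercise yourself, here is a minimal Python sketch (the three probabilities come from the paragraph above; the 90% example confidence level and the helper names are my own illustrative assumptions):

```python
from math import comb

# Probability of being dealt a natural blackjack from one 52-card deck:
# one of 4 aces paired with one of 16 ten-value cards (10, J, Q, K),
# out of all C(52, 2) possible two-card hands.
p_blackjack = (4 * 16) / comb(52, 2)  # ~0.0483, i.e. the 4.83% above

# Real-world anchors from the paragraph above.
anchors = {
    "being dealt a blackjack": p_blackjack,
    "Steph Curry making a 3-pointer": 0.435,
    "a coin landing heads": 0.50,
}

# Example confidence level (my assumption): "I'm 90% sure I'm right."
my_confidence = 0.90
chance_wrong = 1 - my_confidence

# Anchor the chance of being wrong to the closest real-world event.
closest = min(anchors, key=lambda event: abs(anchors[event] - chance_wrong))
print(f"A {chance_wrong:.1%} chance of being wrong is closest to "
      f"{closest} ({anchors[closest]:.1%}).")
```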

 

I haven’t studied this (so I could be wrong – I’d wager that the chance I’m wrong and this is not helpful sits right around Steph Curry’s 3-point percentage), but I would expect that doing this type of exercise would help us recognize how overconfident we often are. It might even help us get to the next step: admitting that we might be wrong and considering different possibilities. Carnegie continues:

 

“You will never get into trouble by admitting that you may be wrong. That will stop all argument and inspire your opponent to be just as fair and open and broad-minded as you are. It will make him want to admit that he, too, may be wrong.”

 

The important thing to remember is that the world is incredibly complex, and our minds are only so good at absorbing lots of new data and articulating a comprehensive understanding of the information we synthesize. We should be more willing to consider ways in which our beliefs may be inaccurate, and more willing to listen to reasonable people (especially those who have demonstrated expertise or effective thinking skills) when they suggest an idea that does not conform to our prior beliefs. Try not to be closed-minded and overly confident in your own beliefs, and you will be better at avoiding thinking errors and at making good long-term decisions.

Changing Your Views on a Group of People

An unfortunate reality in our world is that we don’t have a lot of incentives to change our beliefs. What we think and feel about a specific thing is heavily influenced by more than just our own experiences and rational thoughts about it. Our social groups, self-interests, and group identities can shape our beliefs and make it almost impossible for our beliefs to have any flexibility. In this setting, changing our beliefs may require that we break with a group identity, view the world in a way that is inconsistent with the rest of the people around us, and acknowledge that our narrow self-interest is not what is in the best interest of a larger society.

 

Colin Wright wrote about this in his book Becoming Who We Need To Be and related the idea directly to the ways we think about groups of people. He writes, “If we’ve spent our lives hating, or at least feeling superior to, a particular group of people, but then are exposed to convincing information about that group that makes us hate them less, that’s a very awkward moment. Taking this new information seriously would mean having to choose between continuing on as we are now, with our existing biases, our existing way of interacting with these people, our existing group of friends who probably have the same set of biases that we now feel compelled to question, or changing all that.” Wright shows that changing one’s views, even when there is good reason to, can be awkward in one’s personal life. Beyond simply saying, “I was wrong,” changing one’s beliefs means that you then have to tell others (who you may have been very close with) that they are still wrong, and that can be hard for many people.

 

I don’t have a solution here for how to improve the likelihood of changing people’s minds. Instead, what I am doing is pointing out how many factors are involved in changing our minds. We should recognize that we may hold many of our beliefs for reasons we don’t want to acknowledge, like peer pressure or self-interest. Given that many of our beliefs may be influenced by factors beyond our own rationality, and given the difficulty we may have in changing our beliefs if they are indeed wrong, we should try to be more flexible in general with how we see the world and how we think about our worldviews. Being skeptical of our own knowledge doesn’t feel as good as telling ourselves that we have it all figured out, but it is probably a better place for us to be. We might not be able to change other people’s views (especially on ideas that are highly visible and salient), but at least we can be more honest with ourselves about the beliefs we hold, and hopefully more willing to change them because we never clung too tightly to them in the first place. This in turn may help other people be more vulnerable in their own beliefs and slightly more open to change.

Cognitive Dissonance

I recently changed my mind about vaping. I have asthma, and cigarette smoke gives me terrible breathing problems, so I have never smoked either traditional cigarettes or any type of vaping product. I have hated traditional cigarettes my whole life, and as vaping has become the new hit, I have hated it as well. Since vaping popped onto the scene, I have considered it to be basically as evil as traditional cigarettes and didn’t make much of a distinction in my head between the two.

 

A recent podcast interview with Dr. David Abrahms on the Healthcare Policy Podcast changed my mind. Vaping products may be far less deadly than traditional burnt cigarettes. The addictive potential of nicotine is still there, and there are certainly plenty of things in vaping products that we should not be putting into our lungs, but vaping products may have far fewer carcinogens than traditional cigarettes and appear to be far less deadly. For the first time in history, we have a product that could completely displace traditional cigarettes and tobacco and, most importantly, save millions of lives. I still don’t like vaping and won’t ever do it myself, but the new information has forced me to change the way I think about and respond to vaping.

 

As humans, we really are not very good at changing our minds. We are not very good at being receptive to information that conflicts with what we already believe or with what we want to believe. We become really good at rationalizing the beliefs we already hold or that we want to hold, and we discount any information that doesn’t fit the worldview we would like to have. Any argument or debate becomes basically meaningless because our beliefs often become part of who we are, unchangeable pieces of our identity.

 

Colin Wright addresses this in his book Becoming Who We Need to Be, “First, we seldom experience cognitive dissonance, which is the feeling of discomfort associated with being exposed to information that contradicts our existing beliefs. This dissonance is a vital component of changing our mind and adjusting our views, and without it, without feeling that we might be wrong about something and therefore it’s probably important to check our math and learn more about the subject we’ve been armchair-philosophizing about on Facebook, we stand little chance of ever tempering our extreme, unjustifiable views.”

 

My example of changing my views on vaping is a small instance of experiencing cognitive dissonance and being able to adjust opinions in the face of data, even when it is data that doesn’t align with what I want to see in the world (which is no one ever smoking anything). My example is less profound than changing beliefs about economic systems, about political parties, or about favorite superheroes. I’m not sure we will ever really change those beliefs, but I think it is important to be aware of the small times when we do change our beliefs so that we can better monitor the beliefs we hold and be more aware of the times when we may experience cognitive dissonance. Rather than hiding behind a rationalization of our beliefs and pretending that everything within our belief structure is perfectly coherent, we can accept that there are some parts we don’t have figured out or don’t have perfect scientific evidence to support. For some questions, like what religious belief you hold or what would be the perfect superpower if you could only pick one, you will never have the perfect answer that solves all of life’s mysteries. It is OK to accept that people have been debating these questions forever and not to expect that you will suddenly find the perfect answer that no one else could. Cognitive dissonance may be uncomfortable, but it is a necessary part of our lives today, and we should embrace it rather than try to hide from or ignore it.