Thinking Statistically

In Thinking Fast and Slow, Daniel Kahneman personifies two modes of thought as System 1 and System 2. System 1 is fast. It takes in information, processes it rapidly, and doesn’t always make us cognizant of the information it takes in. It reacts to the world around us on an intuitive level; it isn’t good at math, but it is great at positioning us to catch a football.

System 2 is slow. It is deliberate, calculating, and uses a lot of energy to maintain. Because it requires so much energy, we don’t actually activate it very often, not unless we really need to. What is worse, unless we have a lot of time to pause specifically for information intake, System 2 can only operate on the information that System 1 takes in, meaning it processes incomplete information.

System 1 and System 2 are important to keep in mind when we start to think statistically, something our minds are not good at. When we think back to the 2016 US Presidential election, we can see how hard statistical thinking is. Clinton was favored to win, but there was a statistical chance that Trump would win, which is what happened. The chance was small, but that didn’t mean the models were all wrong when he did win; it just meant that the most likely forecasted outcome didn’t materialize. We had trouble thinking statistically about win percentages going into the election, and trouble understanding an unlikely outcome after it happened.
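
Numbers can make this concrete. Suppose a forecast gives the underdog roughly a 30 percent chance of winning, a figure in the neighborhood of what at least one prominent 2016 model assigned Trump (the exact probability here is an illustrative assumption, not a claim about any particular forecast). A quick simulation sketch shows how often such an “unlikely” outcome should occur:

```python
import random

# Illustrative assumption: the underdog wins with probability 0.3.
UNDERDOG_WIN_PROB = 0.3
TRIALS = 100_000

# Simulate many elections and count how often the underdog wins.
wins = sum(random.random() < UNDERDOG_WIN_PROB for _ in range(TRIALS))
print(f"Underdog won {wins / TRIALS:.1%} of {TRIALS:,} simulated elections")
# Prints roughly 30%. An upset this likely is about as surprising as
# flipping two heads in a row with a fair coin (25%), hardly impossible.
```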

“Why is it so difficult for us to think statistically?” Kahneman asks in his book. “We easily think associatively, we think metaphorically, we think causally, but statistics requires thinking about many things at once, which is something that System 1 is not designed to do.”

System 1 operates quickly and cheaply. It takes less energy and effort to run on System 1, but because it is subject to bias and because it makes judgments on incomplete information, it is not reliable for important decisions and calculations based on nuance. We have to engage System 2 to be great at thinking statistically, but statistical thinking still trips up System 2 because it is hard to think about multiple competing outcomes at the same time and weight them appropriately. In Risk Savvy, Gerd Gigerenzer shows that statistical thinking can be substantially improved and that we really can think statistically, but that we need some help from visual aids and tools so that our minds can grasp statistical concepts better. We have to help System 1 so that it can set up System 2 for success if we want to be good at thinking statistically.

From the framework that Kahneman lays out, with a quick-reacting System 1 running in power-save mode with limited processing power and a System 2 operating on the incomplete information aggregated by System 1, statistical thinking is nearly impossible. System 1 can’t bring in enough information for System 2 to analyze appropriately. As a result, we fall back on biases or substitute an easier question for the challenging statistical question. Gigerenzer argues that we can think statistically, but that we need the appropriate framing and cues for System 1 so that System 2 can manage the number crunching and legwork that is needed. In the end, statistical thinking doesn’t happen quickly and requires an ability to hold competing and conflicting information in the mind at the same time, which makes it hard for us to think statistically rather than anecdotally or metaphorically.
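
Gigerenzer’s best-known aid is the natural frequency format: restating conditional probabilities as counts of concrete people rather than percentages. As a minimal sketch, with illustrative screening numbers in the spirit of his examples (the prevalence, sensitivity, and false-positive rate below are assumptions, not figures from either book), a question that stumps System 1 when posed in percentages becomes easy to count through:

```python
# Natural frequencies: turn "1% prevalence, 90% sensitivity, 9% false
# positives" into counts of people. All numbers are illustrative assumptions.
population = 1000
sick = population // 100                 # 1% prevalence: 10 people are sick
healthy = population - sick              # 990 people are healthy

true_positives = round(sick * 0.90)      # 9 sick people test positive
false_positives = round(healthy * 0.09)  # 89 healthy people test positive
total_positives = true_positives + false_positives

print(f"Of {population} people, {total_positives} test positive,")
print(f"but only {true_positives} are sick: "
      f"{true_positives / total_positives:.0%} chance of disease if positive")
# Roughly 9%, far below the 90% figure that intuition latches onto.
```
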
Answering the Easy Question

One of my favorite pieces from Daniel Kahneman’s book Thinking Fast and Slow was the research Kahneman presented on mental substitution. Our brains work very quickly, and we don’t always recognize the times when our thinking has moved in a direction we didn’t intend. Our thinking seems to flow logically and naturally from one thought to the next, and we don’t notice the times when our brains make logical errors or jumps that are less than rational. Mental substitution is a great example of this, and one that I know my brain does, but that I often have trouble seeing even when I know to look for it.

Kahneman writes, “When the question is difficult and a skilled solution is not available, intuition still has a shot: an answer may come to mind quickly – but it is not an answer to the original question.” 

The example that Kahneman uses is of a business executive making a decision on whether to invest in Ford. To make a smart decision, the executive has to know what trends in the auto industry look like and whether Ford is well positioned to adapt to changing economic, climate, and consumer realities. They need to know what Ford’s competition is doing and think about how Ford has performed relative to other automobile companies and how the automotive sector has performed relative to other industries. The decision requires thinking about a lot of factors, and the executive’s time is limited, along with the amount of information they can hold in their head, especially given the other responsibilities at home and in the office that the executive has to worry about.

To simplify the decision, the executive might choose to answer a simpler question, as Kahneman explains, “Do I like Ford cars?” If the executive grew up driving a Ford truck, if they really liked the 1965 Mustang, or if the only car crash they were ever involved in was when a person driving a Ford rear-ended them, their decision might be influenced by an intuitive sense of Ford cars and the people who drive Fords. Also, if they have personally met someone on Ford’s executive team, they may be swayed by whether or not they liked the person they met. Instead of asking a large question about Ford the company, they might substitute an easier question about a single Ford executive team member.

“This is the essence of intuitive heuristics,” writes Kahneman, “when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.” 

Often, we already have a certain feeling in mind, and we switch the question being asked so that we can answer in line with our intuition. I grew up driving a Ford, so I might be inclined to favor investing in Ford. I might answer the question of investing in Ford before I am even asked the question, and then, instead of objectively setting out to review a lot of information, I might just cherry pick the information that supports my original inclination. I’m substituting the question at hand, and might even provide myself with plenty of information to support my choice, but it is likely biased and misguided information.

It is important to recognize when these prejudices and biases are influencing our decisions. By being aware of how we feel when asked a question, we can think critically to ask if we are being honest with the question that was asked of us. Are we truly answering the right question, or have we substituted for a question that is easier for us to answer?

In the question of cars and investments, the cost might not truly be a big deal (at least if you have a well diversified portfolio in other respects), but if we are talking about public policy that could be influenced by racial prejudice or by how deserving we think another group of people is, then our biases could be very dangerous. If we think that a certain group of people is inherently greedy, lazy, or unintelligent, then we might replace the question, “Will this policy lead to the desired social outcome?” with the question, “Do I like the group that stands to benefit the most from this policy?” The results of answering the wrong question could be disastrous, and could harm our society for years.
Expert Intuition

Much of Daniel Kahneman’s book Thinking Fast and Slow is about the breakdowns in our thinking processes, especially regarding the mental shortcuts we use to make decisions. The reality of the world is that there is too much information, too many stimuli, and too many things that we could focus on and consider at any given time for us to take in everything and make a comprehensive decision. Instead, we rely on shortcuts, use our intuition, and make estimates that help us with our decision-making. Usually we do just fine with this whole process, and that is why we rely so much on these shortcuts, but sometimes cognitive errors and biases can drive us off a cliff.

However, Kahneman stresses that all is not lost. Our intuition can be very reliable if we develop true expertise in the area where we are putting our intuition to the test. As Kahneman writes, “Valid intuitions develop when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it.”

We can make predictions, we can learn to recognize commonalities between situations, and even on a subconscious level we can absorb and recall information to use in decisions. The key to using our intuition successfully is walking a careful line between mastery and arrogance. It requires self-awareness to know what we know, to understand an area well enough that we can trust our intuition, and to know what we don’t know, so that we don’t make judgments beyond our area of expertise.

While much of Kahneman’s research (the majority of which I’m going to be writing about) is focused on problematic heuristics, predictable cognitive errors, and hidden mental biases, it is important to know when we can trust our intuition and where our thinking doesn’t lead us astray. There are times when developing expertise through practice and experience can help us make better decisions. Even if we are not the expert, we can recognize and learn from those who do have expertise, paying attention when their intuitions forecast something important. Getting a sense of how well the mind can work, and how well humans can think and forecast when they have the right information and knowledge, is powerful if we want to think positively about what our future might hold. At the same time, we also have to understand how thinking fast can get us in trouble, and where our expert intuitions may fail us.

More on Hiding Our Motives

Deception is a big part of being a human being. If we try, we can all think of times when we have been deceived. Someone led us to believe one thing, and then we found out that something different was really going on the whole time. If we are honest with ourselves, we can also see that we clearly try to deceive others all the time. We make ourselves seem like we are one thing, but in many ways, we are not exactly what we present ourselves as being. Sometimes we truly are genuine, but often, we are signaling a particular behavior or trait to a group so that we can be accepted, praised, and get some sort of future benefit. In order to do this really well, we create stories and ideas about why we do the things we do, deceiving even ourselves in the process. As Kevin Simler and Robin Hanson write in their book The Elephant in the Brain, “We hide some of our motives…in order to mislead others.”

This is not a pretty picture of humans, and expressing this idea is an admission that we sometimes are not as great as we like to make everyone believe. This is not an idea that is popular or that everyone will be quick to admit, but I believe that Simler and Hanson are right in saying that it is a huge driving influence on the world around us. I also don’t think that accepting this about ourselves leaves us in as sad, cynical, and dejected a place as one might think. Humans and our social groups are complicated, and sometimes being a little deceptive, doing things with ulterior motives at their base, and behaving in a way that signals group alliance or value can be a net positive. We can recognize that we do these things, that we are deceptive, and that we deceive others by lying about our motives, and still make a good impact in the world. The altruist who donates money to the Against Malaria Foundation may tell himself and everyone he knows that he donates because he wants to save people’s lives, when truly he just gets a warm glow within himself, and that is perfectly fine as long as the externality from his status-seeking behavior is overwhelmingly positive (looking in the mirror on this one).

If we don’t accept this reality about ourselves and others, then we will spend a lot of time working on the wrong problem and a lot of time being confused as to why our mental models of the world don’t seem to work out. In my own life, recognizing status-seeking behavior, self-deception, and motivated thinking helps me to be less judgmental toward other people. I recognize that I have the same capacity for these negative and deceptive behaviors within myself, and I choose (as much as I can) to redirect these types of behaviors in directions that have the greatest positive social impact rather than in the direction that has the greatest personal benefit for me and my feelings. Ultimately, I encourage us to be honest about the fact that we are sometimes rather dishonest, and to build an awareness that is easy on ourselves and others for behaving as humans naturally do, but that still nudges us toward creating positive externalities from these ways of being where possible.