Causal Versus Statistical Thinking

Humans are naturally causal thinkers. We observe things happening in the world and begin to apply causal reasoning to them, asking what could have led to the observation we made. We attribute intention and desire to people and things, and we work out a narrative that explains why things happened the way they did.

The problem, however, is that we are prone to many mistakes when we think this way, especially when we face situations that require statistical thinking. In his book Thinking Fast and Slow, Daniel Kahneman writes the following:

“The prominence of causal intuitions is a recurrent theme in this book because people are prone to apply causal thinking inappropriately, to situations that require statistical reasoning. Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. Unfortunately, System 1 does not have the capability for this mode of reasoning; System 2 can learn to think statistically, but few people receive the necessary training.”

System 1 is our fast brain. It works quickly to identify associations and patterns, but it doesn’t take in a comprehensive set of information and isn’t able to do much serious number crunching. System 2 is our slow brain, able to do the tough calculations, but limited to working on the set of data that System 1 is able to accumulate. Also, System 2 is only active for short periods of time, and only when we consciously make use of it.

This is why we struggle with statistical thinking. We have to take in a wide range of possibilities, categories, and combinations. We have to make predictions and understand that in one set of circumstances we will see one outcome, but in another set of circumstances we may see a different outcome. Statistical thinking doesn’t pin down a concrete answer the way our causal thinking likes. As a result, we reach conclusions based on incomplete considerations, we ignore important pieces of information, and we assume we are correct because our answer feels correct and satisfies some criteria. Thinking causally can be powerful and useful, but only if we understand the statistical dimensions at hand and can fully think through the implications of the causal structures we are defining.
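
To make Kahneman’s contrast concrete, here is a toy sketch (a hypothetical example with invented numbers) of what “deriving conclusions about individual cases from properties of categories” looks like in practice: a base-rate calculation of the kind System 1 skips and System 2 has to be trained to perform.

```python
# Toy base-rate problem: a test comes back positive -- how likely is it
# that the condition is actually present? All numbers are invented
# purely for illustration.

base_rate = 0.01        # 1% of the category actually has the condition
sensitivity = 0.90      # P(positive test | condition present)
false_positive = 0.05   # P(positive test | condition absent)

# Total probability of a positive test, across both groups.
p_positive = base_rate * sensitivity + (1 - base_rate) * false_positive

# Bayes' rule: the individual-case conclusion follows from the
# properties of the category, not from a causal story about the person.
p_condition = (base_rate * sensitivity) / p_positive

print(f"P(condition | positive test) = {p_condition:.1%}")  # ~15.4%
```

The causal intuition (“the test is 90% accurate, so a positive result means a roughly 90% chance”) feels satisfying, but it ignores the category’s base rate; the statistical answer is closer to 15%.
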
Mood, Creativity, & Cognitive Errors

In Thinking Fast and Slow, Daniel Kahneman comments on research studying the relationship between mood and cognitive performance. He writes, “when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.”

We think differently when we are in different moods. When we are relaxed and happy, our minds are more creative and our intuitions tend to be more accurate. Kahneman suggests that when we are happy and don’t sense threats, the rational, logical part of our brain lets up, allowing our mind to flow more freely. When we are not worried about our safety, our mind doesn’t have to examine and interrogate everything in our environment as thoroughly, hence the tendency toward logical errors. A sense of threat activates our deep thinking, making us more logical, but it also diminishes the capacity of our intuitive thinking, making us less creative and less willing to take risks with our ideas and thoughts.

Kahneman’s discussion of mood, creativity, and cognitive errors reminds me of the research Daniel Pink shares in his book When. Pink finds that we tend to be more creative later in the day, once our affect has recovered from the afternoon trough, when we all need a nap. Once our mood has improved toward the end of the day, Pink suggests, we are more creative. Our minds are able to return to important cognitive work, but they are still easily distracted, allowing for more creative thinking. This seems to tie in with the research from Kahneman. We become more relaxed and are willing to let ideas flow across the logical boundaries that had previously separated ideas and categories of thought in our minds.

It is important that we think about our mood and the tasks we have at hand. If we need to do creative work, we should save it for the afternoon, when our moods improve and we have more capacity for drawing on previously disconnected thoughts and ideas in new ways. We shouldn’t try to cram work that requires logical coherence into times when we are happy and bubbly; our minds simply won’t be operating in the right way to handle the task. When we do work is as important as the mood we bring to it, and both the timing and the mood may seriously impact the output.
Answering the Easy Question

One of my favorite pieces of research from Daniel Kahneman’s book Thinking Fast and Slow concerns mental substitution. Our brains work very quickly, and we don’t always recognize the times when our thinking has moved in a direction we didn’t intend. Our thinking seems to flow logically and naturally from one thought to the next, and we don’t notice the times when our brains make logical errors or jumps that are less than rational. Mental substitution is a great example of this, and one that I know my brain does, but that I often have trouble seeing even when I know to look for it.

Kahneman writes, “When the question is difficult and a skilled solution is not available, intuition still has a shot: an answer may come to mind quickly – but it is not an answer to the original question.”

The example Kahneman uses is of a business executive deciding whether to invest in Ford. To make a smart decision, the executive has to know what trends in the auto industry look like and whether Ford is well positioned to adapt to changing economic, climate, and consumer realities. They need to know what Ford’s competition is doing, how Ford has performed relative to other automobile companies, and how the automotive sector has performed relative to other industries. The decision requires weighing a lot of factors, and the executive’s time is limited, along with the amount of information they can hold in their head, especially given the other responsibilities at home and in the office that they have to worry about.

To simplify the decision, the executive might choose to answer a simpler question, as Kahneman explains: “Do I like Ford cars?” If the executive grew up driving a Ford truck, if they really liked the 1965 Mustang, or if the only car crash they were ever involved in happened when a person driving a Ford rear-ended them, their decision might be influenced by an intuitive sense of Ford cars and the people who drive them. Also, if the investor has personally met someone on the executive team, they may be swayed by whether or not they liked the person they met. Instead of asking a large question about Ford the company, they might substitute an easier question about a single member of Ford’s executive team.

“This is the essence of intuitive heuristics,” writes Kahneman, “when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.”

Often, we already have a certain feeling in mind, and we switch the question being asked so that we can answer in line with our intuition. I grew up driving a Ford, so I might be inclined to favor investing in Ford. I might answer the question of investing in Ford before I am even asked, and then, instead of objectively setting out to review a lot of information, I might just cherry-pick the information that supports my original inclination. I’m substituting the question at hand, and I might even provide myself with plenty of information to support my choice, but it is likely biased and misguided information.

It is important to recognize when these prejudices and biases are influencing our decisions. By being aware of how we feel when asked a question, we can think critically about whether we are being honest with the question that was asked of us. Are we truly answering the right question, or have we substituted a question that is easier for us to answer?

In the question of cars and investments, the cost might not truly be a big deal (at least if you have a portfolio that is well diversified in other respects), but if we are talking about public policy that could be influenced by racial prejudice or by how deserving we think another group of people is, then our biases could be very dangerous. If we think that a certain group of people is inherently greedy, lazy, or unintelligent, then we might substitute the question, “Will this policy lead to the desired social outcome?” with the question, “Do I like the group that stands to benefit the most from this policy?” The results of answering the wrong question could be disastrous and could harm our society for years.
Expert Intuition

Much of Daniel Kahneman’s book Thinking Fast and Slow is about the breakdowns in our thinking processes, especially regarding the mental shortcuts we use to make decisions. The reality of the world is that there is too much information, too many stimuli, and too many things we could focus on at any given time for us to take in everything and make a comprehensive decision. Instead, we rely on shortcuts, use our intuition, and make estimates that help us with our decision-making. Usually we do just fine with this whole process, and that is why we rely so much on these shortcuts, but sometimes cognitive errors and biases can drive us off a cliff.

However, Kahneman stresses that all is not lost. Our intuition can be very reliable if we develop true expertise in the area where we are putting our intuition to the test. As Kahneman writes, “Valid intuitions develop when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it.”

We can make predictions, we can learn to recognize commonalities between situations, and even on a subconscious level we can absorb and recall information to use in decisions. The key to using our intuition successfully is walking a careful line between mastery and arrogance. It requires the self-awareness to know what we know, to understand an area well enough that we can trust our intuition, and to know what we don’t know, so that we don’t make judgments beyond our area of expertise.

While much of Kahneman’s research (the majority of which I’m going to be writing about) is focused on problematic heuristics, predictable cognitive errors, and hidden mental biases, it is important to know when we can trust our intuition and where our thinking doesn’t lead us astray. There are times when developing expertise through practice and experience can help us make better decisions. Even if we are not the expert, we can recognize and learn from those who do have expertise, paying attention when their intuitions forecast something important. Getting a sense for how well the mind can work, and how well humans can think and forecast when they have the right information and knowledge, is powerful if we want to think positively about what our future might hold. At the same time, we have to understand how thinking fast can get us in trouble and where our expert intuitions may fail us.

An Illusion of Security, Stability, and Control

The online world is a very interesting place. While we frequently say that we have concerns about privacy, about how our data is being used, and about what information is publicly available about us, very few people delete their social media accounts or take real action when a data breach occurs. We have been moving more and more of our lives online, and we have been more accepting of internet-connected devices that can be hacked or used to tacitly spy on us than we would expect, given how much time we spend expressing concern for our privacy.

A quick line from Tyler Cowen’s book The Complacent Class may explain the contradiction. “A lot of our contentment or even enthrallment with online practices may be based on an illusion of security, stability, and control.”

I just read Daniel Kahneman’s book Thinking Fast and Slow, and in it he writes about a common mental shortcut: substitution. When we are asked difficult questions, we often substitute a simpler question that we can answer, and we rarely realize that we have done so. Cowen’s insight suggests that we rely on this substitution when we evaluate online practices.

Instead of thinking deeply and critically about our privacy, safety, and the security of our personal or financial information in a given context, we substitute. We ask ourselves, does this website intuitively feel legitimate and well put together? If the answer is yes, we are more likely to enter our personal information, allow our online movements to be tracked, enter our preferences, and save our credit card number.

If matching technology works well, if our order is fulfilled, and if we are provided with more content that we can continue to enjoy, we will again substitute. Instead of asking whether our data is safe or whether the value we receive exceeds the risk of having our information available, we will ask if we are satisfied with what was provided to us and if we liked the look and feel of what we received. We can pretend to answer the hard questions with illusory answers to easier questions.

In the end, we land in a place where the companies and organizations operating on the internet have little incentive to improve their systems, to innovate in ways that create disruptive changes, or to pursue big leaps forward. We are already content, and we are not actually asking the hard questions that might push innovation forward. This contentment breeds stagnation and prevents us from seeing the risks that exist behind the curtain. We live in our illusion that we control our information online, that we know how the internet works, and that things are stable and will continue to work even if the outside world is chaotic. This could be a recipe for a long-term disaster that we won’t see coming, because we believe we are safely in control when we are not.

Selective Attention

I listened to an episode of the After On podcast this last week, and the guest, Dr. Don Hoffman, suggested that our brains did not evolve to help us understand reality; they evolved to help us survive. Survival often did not require that our ancestors have the most accurate view of reality, only the perceptions necessary to avoid lions, work as a tribe, and pick healthy berries. What we see when we look around us is only a small fraction of the world; our eyes can perceive only a rather narrow range of electromagnetic radiation (light). Given that our brains did not evolve to give us the clearest picture of reality, and given our inability to perceive all of it, we must remember that there are reasons to be skeptical of the thoughts our brains produce.

In his book Becoming Who We Need to Be, author Colin Wright discusses the outcomes of our brains’ cognitive shortcomings. He writes, “This tendency to pay more attention to the seemingly unlikely events that happen to and around us is called ‘selective attention.’ Our brains have a bias toward patterns, and ignore so-called uninteresting data…” Wright suggests that this is part of the reason our brains are so bad at statistical thinking, as I described yesterday. Statistics is hard because we selectively pick out certain things as important and carry a distorted memory of the world based on what we happened to see and notice. Wright continues, describing what this means for us: “Which in turn result in our finding meaning in what is almost certainly meaningless…familiarity and feeling of significance is merely the consequence of our brains wigging out over the perceived connection, due to its pattern-finding predilections. Because that’s what it does.”

When we recognize that we did not evolve to develop a perfect view of what is happening around us, and that our brains only selectively record a small chunk of reality, we can begin to think about how to approach the world. We know our brains look for patterns and act quickly, but the patterns the brain picks out might not be fully correct or meaningful. We don’t have to eat Pringles every time our team is in the playoffs, because we are aware that our brain is making a false connection between our eating specific chips and our favorite team winning, based on a pattern that doesn’t really exist. What I am ultimately getting at is that our brains can invent realities that seem reasonable but are based on cognitive errors and selective attention, and that don’t actually align with the physical reality of the universe. We make sense out of meaningless things around us and start to attach symbolic importance to things that should not have any importance in our lives.

This distorted reality may not be a problem at the individual level, in how any one of us moves through life. No one is going to care too much if you believe you need to drink a specific coffee every morning or sit in a specific spot, but as this mode of thinking scales up to a societal level, we must recognize that beliefs resulting from cognitive bias and error can lead to a world that doesn’t operate equitably for all members of society. Public policy must be grounded in the best empirical science and data that we can collect (even if our interpretation of the data will always be imperfect) so that we can distribute our finite resources in a reasonable way, and we must cut through our false narratives to avoid stigmatizing groups and discriminating against people who see the world differently from us.
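
Wright’s point about our pattern-finding predilections is easy to demonstrate. Below is a small simulation (a hypothetical sketch; the flip count and the random seed are arbitrary choices of mine) showing that pure noise reliably produces streaks that feel meaningful, the same machinery behind the Pringles-and-playoffs connection above.

```python
import random

# Flip a fair coin 200 times and measure the longest streak of
# identical outcomes. No cause, no meaning -- just randomness.
random.seed(42)  # arbitrary seed, for reproducibility only
flips = [random.choice("HT") for _ in range(200)]

longest = current = 1
for prev, curr in zip(flips, flips[1:]):
    current = current + 1 if curr == prev else 1  # extend or reset streak
    longest = max(longest, current)

print(f"Longest streak in 200 fair flips: {longest} in a row")
# A streak of 7 or 8 is typical. Live through a run like that one flip
# at a time, and it feels like it must mean something.
```
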

Attribution Bias

Our brains are pretty impressive pattern recognition machines. We take in a lot of information about the world around us, remember stories, pull information together to form new thoughts and insights, move through the world based on the information we take in, and we are able to predict the results of actions before they have occurred. Our brain evolved to help us navigate a complex, dangerous, and uncertain world.

Today, however, while our world is arguably more complex and uncertain than ever, it might not be as dangerous on a day-to-day basis. I’m pretty sure I won’t encounter any animals who may try to eat me when I sit at the park to read during my lunch break, I won’t need to distinguish between two types of berries to make sure I don’t eat the poisonous kind, and if the thunderstorms forecast for this evening drop golf-ball-sized hail, I won’t have to worry too much about where to find safety and shelter. Nevertheless, my evolved brain is still going to approach the world as if it were the dangerous place it was when my ancestors were evolving their thought capacities, and that will throw some monkey wrenches into my life and lead me to see patterns that don’t really exist.

Colin Wright has a great quote about this in his book Becoming Who We Need to Be. He writes, “You ascribe meaning to that person’s actions through the lens of what’s called ‘attribution bias.’ If you’re annoyed by their slow driving, that inferred meaning will probably not be generous to the other driver: they’re a bad person, they’re in the way, and they’re doing this because they’re stupid or incapable. That these assumptions about the situation are possibly incorrect – maybe they’re driving slowly because they’re in deep thought about elephant tool usage – is irrelevant. Ascribing meaning to acts unto itself is impressive, even if we often fail to arrive at a correct, or fully correct, understanding of the situation.”

We often find ourselves in situations that are random and try to ascribe a greater meaning to the situation or event. At least in the United States, it is incredibly common to hear people say that everything happens for a reason, creating a narrative for themselves in which a moment of inconvenience is part of a larger story filled with lessons, staircases, detours, successes, and failures that are all supposed to culminate in a grand arc that will one day make sense. The fact that this way of thinking is so prevalent suggests to me that our pattern-recognition-focused brains are still in full swing, even though we no longer need them to be as active in as many situations of our lives. We don’t need every moment of our lives to happen for a reason, and if we allow for randomness and let go of the running narrative of our lives, we don’t have to work through challenging apologetics to understand something negative.

Attribution bias, as described by Wright, shows us how wrong our brain can be about the world. It shows us that our brains have certain tendencies that elevate ourselves in thought over the rest of the world when it doesn’t conform to our desires, interests, wishes, and preferences. It reveals that we are using parts of our brains that evolved to help our ancestors in ways that we now understand to be irrational. If we can see that the slow driver in front of us with a political sticker that makes our blood boil is not all the terrible things we instantly think they are (that instead they are a 75-year-old grandfather driving in a new town, trying to get to the hospital where his child is sick), then we can recognize that not everything in life has a meaning, or at least not the meaning our narrow pattern-recognizing brain wants to ascribe. Remembering this mental bias, and making an effort to recognize this type of thinking and move toward more generous interpretations, will help us move through the world with less friction, anger, and disappointment, because we won’t build false patterns that let us down when they fail to produce the outcomes we expected.