Seeing Causality

In Thinking Fast and Slow, Daniel Kahneman describes how a Belgian psychologist changed the way we understand our thinking about causality. The traditional view held that we make observations about the world and come to understand causality through repeated exposure to events. As Kahneman writes, “[Albert] Michotte [1945] had a different idea: he argued that we see causality just as directly as we see color.”

Michotte’s argument is that causality is an integral part of the human psyche. We think about and understand the world through a causal lens. From infancy, we interpret the world causally, seeing causal links and connections in the things that happen around us. It is not through repeated experience and exposure that we learn to view an event as having a cause, or as being the cause of another event; it is something we have within us from the beginning.

“We are evidently ready from birth to have impressions of causality, which do not depend on reasoning about patterns of causation.”

I try to remember this idea of our intuitive and automatic causal understanding of the world when I think about science and how I should relate to it. We go to a lot of effort to make sure we are as clear as possible in our scientific thinking. We use randomized controlled trials (RCTs) to test our hypotheses, but sometimes an intensely rigorous study isn’t necessary before we change our behavior; ordinary causal thinking is enough. There are times when we can trust our causal intuition without relying on an RCT for evidence. I don’t know exactly where to draw the line between causal inferences we can accept and those that need an RCT, but through honest self-awareness and reflection, we should be able to identify times when our causal interpretations are valid and reasonably well insulated from our own self-interest.

On the Fun Paper Friday segment of two episodes, the Don’t Panic Geocast has discussed two academic journal articles on the effectiveness of parachutes for preventing death when falling from an aircraft. The two papers, both published in the British Medical Journal, are satirical, but they demonstrate an important point. We don’t need to conduct an RCT to determine whether using a parachute when jumping from a plane will be more effective at helping us survive the fall than jumping without one. It is an extreme example, but it shows that our minds can see and understand causality without always needing an experiment to confirm a causal link. In a more consequential example, we can trust our brains when they observe that smoking cigarettes has negative health consequences, including an increased likelihood of developing lung cancer. An RCT pinning down the exact nature and frequency of cancer development in smokers would certainly help build our scientific knowledge, but the scientific consensus around smoking and cancer should have been accepted much more readily than it was. An RCT in this case would take years and would potentially be unethical or impossible. Tobacco companies obfuscated the science by taking advantage of that fact, and we failed to accept the causal link that our brains could see but could not prove as definitively as an RCT would allow. We should have trusted our causal-thinking brains and accepted the intuitive answer.

We can’t always trust the causal conclusions our minds reach, but there are times when we should acknowledge that our brains think causally and accept that the causal links we intuit are accurate.

Guided by Impressions of System 1

In Thinking Fast and Slow, Daniel Kahneman shares research showing how easily people can be influenced by factors that seem completely irrelevant to the mental task they are asked to carry out. People remember rhyming proverbs better than non-rhyming ones. People trust a cited research source with an easy-to-say name over one with a difficult, foreign-sounding name. People are also influenced by the quality of the paper and the colors used in advertising materials. No one would admit that rhymes, easy-to-say names, or paper quality are why they made a certain decision, but the statistics show that these things can strongly influence how we decide.

Kahneman describes the research this way: “The psychologists who do these experiments do not believe that people are stupid or infinitely gullible. What psychologists do believe is that all of us live much of our life guided by the impressions of System 1 – and we often do not know the source of these impressions.”

Making tough and important decisions requires a lot of energy, and we often have to make them, with all the mental effort they demand, in a relatively short time. We don’t always have a good pen-and-paper template to follow for decision-making, and sometimes we have to come to a conclusion in the presence of others, upping the stakes and increasing the pressure as we try to think through our options. As a result, the brain turns to heuristics that rely on System 1. It uses intuition and quick impressions, and it substitutes an easier question (do I like this person?) for the harder one it was actually asked (is this the right choice?).

We might not know why we intuitively favored one option over another. When we ask our brain to think back on the decision we made, we are engaging System 2 to think deeply, and it is likely to overlook seemingly inconsequential factors, such as the color of the paper describing the option we picked. It won’t remember that the first salesperson didn’t make much eye contact while the second one did, but it will substitute some other marker of competence to give us a reason for trusting the second salesperson more.

What is important to remember is that System 1 guides much of our lives. We don’t always realize it, but System 1 passes information along to System 2 that isn’t always relevant to the decision System 2 has to make. Intuitions and quick impressions can be biased, shaped by unimportant factors, and even if we don’t consciously recognize those factors, they get passed along and calculated into our final choice.

Overconfidence

How much should you trust your intuitions? The answer depends on your level of expertise in the area where you have them. If you cook with a certain pan on your stove every day, you can probably trust your intuition about where the temperature should be set, how long your dish will take, and where the hottest spots on the pan will be. If you are generally unfamiliar with cars, you probably shouldn’t trust your intuition about whether a certain used car is the right one to purchase. In other words, trust your instincts in things you are deeply familiar with and in areas where you are an expert. In areas where you are not an expert and have only a handful of experiences, you should consider yourself overconfident if you think you have strong intuitions about the situation.

Daniel Kahneman demonstrates this with a math problem in his book Thinking Fast and Slow. Most of us don’t solve written math problems in our heads on a daily basis, so we shouldn’t trust the first intuitive answer that comes to mind when we see one. That is the case with the problem Kahneman uses: it is deliberately designed to have an easy, intuitive answer that is incorrect, and it helps us see how our overconfidence can feel justified but still lead us astray.
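
For reference, the problem in question is almost certainly Kahneman’s well-known bat-and-ball puzzle, and it is worth working through. A bat and a ball cost $1.10 together, and the bat costs $1.00 more than the ball. How much does the ball cost? The intuitive answer, 10 cents, fails a quick check: the bat would then cost $1.10, and the pair would total $1.20. Setting it up properly, ball + (ball + $1.00) = $1.10, so 2 × ball = $0.10, and the ball costs 5 cents. The check takes only seconds, but an overconfident mind never runs it.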

Kahneman writes of “an observation that will be a recurrent theme of this book: many people are overconfident, prone to place too much faith in their intuitions. They apparently find cognitive effort at least mildly unpleasant and avoid it as much as possible.” Intuitions are easy. They come to mind quickly, and following them doesn’t take much conscious effort or thought. The problem, however, is that they can be wildly wrong. Sometimes they help us reach an answer quickly, and for an expert they can even be lifesaving, but in many cases they are problematic. If we never think through our intuitions, we won’t realize how often we act on them, or how our overconfidence can lead to poor outcomes.

This doesn’t mean we have to pull out a notepad and calculator every time we make a decision. It means we should pause momentarily to ask ourselves whether our immediate intuition is justified. If we are driving down a freeway we take every day and our intuition says to change lanes, we can pause for a beat and remember that we drive this route daily and genuinely know which lanes slow down, so we probably will be better off moving over. If instead we have an intuition about a complex public policy, we can take a minute to consider whether we truly know anything about that policy area and whether we should be more critical of our intuitions; jumping to conclusions there based solely on intuition can be dangerous. It doesn’t take much effort or time to ask whether our intuition can be trusted or whether we are overconfident, but the answer can have a big impact on how we relate to the world and on whether we trust the voice in our own head or the voice of experts.

Thinking Fast and Evolution

I have written in the past about how I probably put too much emphasis on evolutionary biology, especially where our brains are concerned, reaching all the way back to when our human ancestors lived in small tribes as hunter-gatherers. Perhaps it is because I look for it more than others do, but I feel as though characteristics and traits that served us well during that time still influence much of how we behave and respond to the world today. Sometimes the effects are insignificant, but sometimes I believe they matter, and sometimes they drive negative outcomes or behaviors that are maladapted to today’s world.

As I have begun writing about Daniel Kahneman’s research as presented in his book Thinking Fast and Slow, I have generally given System 1, what Kahneman describes as the quick, automatic, and reactive part of our brain, a bad rap. But the reality is that it serves an important purpose, and it likely played an especially important role over the course of human evolution in getting us to where we are today. Knowing that I tend to weigh our evolutionary past heavily (perhaps too heavily), it is not surprising that I view System 1 as an important piece of how we got here, even if it is easy to pick on in our current world.

In his book, Kahneman writes, “Any task that requires you to keep several ideas in mind at the same time has the same hurried character. Unless you have the good fortune of a capacious working memory, you may be forced to work uncomfortably hard. The most effortful forms of slow thinking are those that require you to think fast.”

Anyone who has had to remember a couple of phone numbers without being able to write them down or save them immediately, or to remember more words than Person, Woman, Man, Camera, TV, knows how rushed we feel when we are suddenly given something important to hold in our working memory. We do what we can, as quickly as possible, to get the information out of our heads and stored somewhere other than working memory, hurrying to complete the task and ease our cognitive load. Why would our brains work this way? Why do we become so rushed when we have something meaningful to hold in mind?

The answer, as I see it, might go back to our hunter-gatherer ancestors. They mostly needed System 1. They had to react quickly to a noise that could be a dangerous predator, and they had to move fast and on instinct to successfully take down dinner. Fewer things required deep focus, and the things that did were not dense academic journal articles, spreadsheets, PowerPoints, or a guy with a clipboard asking you to count backward from 200 by 13. You didn’t have to worry about pop-ups or advertisements while skinning an animal, grinding seeds, or doing some other kind of work with your hands in a non-digital world. You had no phone numbers to remember, and you were not heading into a business meeting with four people you had just met, whose names you needed to memorize as quickly and fluidly as possible.

Slow thinking developed for people who had time for slow thinking; fast thinking developed when survival was on the line. Today, slow thinking is probably more likely to help us survive than fast thinking, presuming our daily commutes are not too dangerous and our food is safely prepared. Slow thinking is the greater advantage now, but we live in a world where it is still difficult, because we have packed more distractions into our environments. We have literally moved ourselves out of the environments for which evolution optimized our brains, and this has created the challenges and conflicts we face between System 1 and System 2 in our daily lives and in the work we do.

Expert Intuition

Much of Daniel Kahneman’s book Thinking Fast and Slow is about the breakdowns in our thinking processes, especially the mental shortcuts we use to make decisions. The reality is that there is too much information, too many stimuli, too many things we could focus on at any given time for us to take in everything and make a comprehensive decision. Instead, we rely on shortcuts, use our intuition, and make estimates that help with our decision-making. Usually we do just fine with this process, which is why we rely so much on these shortcuts, but sometimes cognitive errors and biases can drive us off a cliff.

However, Kahneman stresses that all is not lost. Our intuition can be very reliable if we develop true expertise in the area where we are putting our intuition to the test. As Kahneman writes, “Valid intuitions develop when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it.”

We can make predictions, we can learn to recognize commonalities between situations, and we can absorb and recall information for our decisions even at a subconscious level. The key to using our intuition successfully is walking a careful line between mastery and arrogance. It requires self-awareness: knowing what we know, understanding an area well enough to trust our intuition, and knowing what we don’t know, so that we don’t make judgments beyond our area of expertise.

While much of Kahneman’s research (the majority of what I’ll be writing about) focuses on problematic heuristics, predictable cognitive errors, and hidden mental biases, it is important to know when we can trust our intuition and where our thinking doesn’t lead us astray. There are times when developing expertise through practice and experience can help us make better decisions. Even if we are not the expert, we can recognize and learn from those who have expertise, paying attention when their intuitions forecast something important. Getting a sense of how well the mind can work, and how well humans can think and forecast when they have the right information and knowledge, is powerful if we want to think positively about what our future might hold. At the same time, we also have to understand how thinking fast can get us in trouble and where even expert intuitions may fail us.