Regression to the Mean Versus Causal Thinking

Regression to the mean is the idea that there is an average outcome we can expect, and that over time individual outliers will revert back toward that average. On its own, it is a boring phenomenon. Think about it in the context of driving to work and counting your red lights, and you can see why. If you normally hit 5 red lights and one day you manage to get to work with just a single red light, you probably expect that the following day you won’t have as much luck with the lights, and will hit more reds than on your lucky one-red-light commute. Conversely, if you have a day where you manage to hit every possible red light, you would probably expect better traffic luck the next day, with a count somewhere closer to your average. This is regression to the mean. Having only one red light, or hitting every red light, one day doesn’t cause the next day’s traffic light stoppage to be any different, but you know the next day will probably bring a more average count of reds versus greens. No causal explanation is involved, just random traffic light luck.

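This is easy to see in a quick simulation. Below is a minimal sketch in Python, with made-up numbers (ten lights per commute, each independently red half the time, so about five reds on a typical day), that checks what happens on the day after an unusually lucky or unlucky commute. The next day’s count sits near the overall average either way, with no cause in sight.

import random

# Hypothetical commute: 10 lights, each independently red with
# probability 0.5, simulated over 10,000 days.
random.seed(42)
days = [sum(random.random() < 0.5 for _ in range(10)) for _ in range(10_000)]

# Average red-light count on the day AFTER an extreme day.
after_lucky = [days[i + 1] for i in range(len(days) - 1) if days[i] <= 1]
after_unlucky = [days[i + 1] for i in range(len(days) - 1) if days[i] >= 9]

print(sum(days) / len(days))                    # ~5.0 overall average
print(sum(after_lucky) / len(after_lucky))      # ~5.0, not another lucky day
print(sum(after_unlucky) / len(after_unlucky))  # ~5.0, not another unlucky day
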
But for some reason this idea is both fascinating and hard to grasp in other areas, especially when we think we have some control over the outcome. In Thinking Fast and Slow, Daniel Kahneman helps explain why, in some settings, it is so difficult for us to accept regression to the mean, an otherwise boring concept. He writes,

“Our mind is strongly biased toward causal explanations and does not deal well with mere statistics. When our attention is called to an event, associative memory will look for its cause – more precisely, activation will automatically spread to any cause that is already stored in memory. Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.”

Unless you truly believe that there is a god of traffic lights who rules over your morning commute, you probably don’t assign any causal mechanism to your luck with red lights. But when you are considering how well a professional golfer played on the second day of a tournament compared to the first, or whether highly intelligent women marry equally intelligent men, some causal idea is likely to come to mind. The golfer was more or less complacent on the second day; the highly intelligent women have to settle for less intelligent men because the highly intelligent men don’t want an intellectual equal. These examples, which Kahneman uses in the book, present plausible causal mechanisms, but as Kahneman shows, the simpler, more boring answer is regression to the mean. A golfer who performs spectacularly on day one is likely to be less lucky on day two. A highly intelligent woman is likely to marry a man with intelligence closer to average simply by statistical chance.

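The golfer case can be simulated the same way. In the sketch below (again Python, every number invented for illustration), each score is a stable skill component plus independent daily luck. The players with the best day-one scores were both good and lucky, so on day two, with fresh luck, their average score drifts back toward their true skill. No story about complacency is required.

import random

# Hypothetical field of 1,000 golfers: stable skill plus fresh daily luck.
random.seed(0)
skill = [random.gauss(72, 2) for _ in range(1000)]   # true average strokes
day1 = [s + random.gauss(0, 3) for s in skill]       # skill + day-one luck
day2 = [s + random.gauss(0, 3) for s in skill]       # same skill, new luck

# The ten best (lowest) day-one scores, and those same players on day two.
best = sorted(range(1000), key=lambda i: day1[i])[:10]
print(sum(day1[i] for i in best) / 10)  # roughly 62-64: skill AND good luck
print(sum(day2[i] for i in best) / 10)  # nearer 69-70: skill, average luck
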
When regression to the mean violates our causal expectations, it becomes an interesting and important concept. It reveals that our minds don’t simply observe an objective reality; they perceive causal structures that fit preexisting narratives. Our causal conclusions can be quite inaccurate, especially when they are influenced by unwarranted biases and prejudices. If we keep regression to the mean in mind, we might lose some of our exciting narratives, but our thinking will be more sound and our judgments clearer.

The Availability Heuristic

Which presidential candidate is doing more advertising this year? Which college football team has been the most dominant over the last five years? Who has had the most songs on the Hot 100 over the last five years? You can probably come up with an intuitive answer to at least one of these questions, even if you don’t follow politics, college football, or pop music very closely. But when you come up with that intuitive answer, you aren’t really answering the question; you are relying on substitution and the availability heuristic.

In Thinking Fast and Slow, Daniel Kahneman writes, “We defined the availability heuristic as the process of judging frequency by the ease with which instances come to mind.” So if you recently saw a few ads from the Trump campaign, your mind would probably intuit that his campaign is doing more advertising. If you remember that LSU won the college football national championship last year, you might have answered LSU, but if you see lots of people wearing Alabama hats on a regular basis, you might answer Alabama. And if you recently heard a Taylor Swift song, your intuitive guess might be that she has had the most Hot 100 hits.

Kahneman continues, “The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.” When we are asked to guess how often an event happens, or what percent of a category fits a certain characteristic, our brains flip back through memory for examples that match what we are looking for. The easier it is to remember an example, the more weight we give it.

I don’t really know who is doing more advertising, but I do know that I have seen a lot of Trump ads on YouTube, so it intuitively felt like he was doing more advertising, even though I might have just picked one channel where his ads were more salient. Overall, he may be doing less advertising than the Biden campaign. Similarly, I didn’t initially remember that LSU won the national championship last year, but I did see someone wearing an Alabama sweatshirt recently, so that team came to mind quickly when I thought of dominant football programs. I also don’t have a clue who has had the most Hot 100 hits in the last five years, but people in my orbit on Twitter frequently post about Taylor Swift, so her name came to mind easily. I wasn’t doing any deep thinking; I was just scratching the surface of my memory for an easy answer.

Throughout Thinking Fast and Slow, Kahneman reveals instances where our thinking appears to be deep and nuanced but is really quick, intuitive, and prone to errors. In most instances we don’t do any deep calculation or thinking; we just roll with the intuitive answer. But our intuition is often faulty, incomplete, and based on a substitution for the real question we are being asked. The stakes may be low when this means we inaccurately estimate divorce rates for celebrities (an example from the book), but they can be high in other decision-making areas. If we are looking to buy a home and are concerned about flood risk, we will overweight the risk of a flood at a property if there have recently been a lot of news stories about flooding from a hurricane in the Gulf of Mexico. This could influence where we choose to live and whether we pay for expensive insurance. Little assumptions and misperceptions can nudge us in critical directions, positive or negative, and change whether we invest for our futures, fudge our taxes, or buy a new car. Recognizing that our brains make mistakes based on thinking strategies like the availability heuristic can help us in some large decision-making areas, so it is important to understand how our brains work and where they can go wrong.

Causal Versus Statistical Thinking

Humans are naturally causal thinkers. We observe things happening in the world and begin to apply causal explanations to them, asking what could have led to what we observed. We attribute intention and desire to people and things, and work out a narrative that explains why things happened the way they did.

The problem, however, is that we are prone to lots of mistakes when we think this way, especially in situations that require statistical thinking. In his book Thinking Fast and Slow, Daniel Kahneman writes the following:

“The prominence of causal intuitions is a recurrent theme in this book because people are prone to apply causal thinking inappropriately, to situations that require statistical reasoning. Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. Unfortunately, System 1 does not have the capability for this mode of reasoning; System 2 can learn to think statistically, but few people receive the necessary training.”

System 1 is our fast brain. It works quickly to identify associations and patterns, but it doesn’t take in a comprehensive set of information and isn’t able to do much serious number crunching. System 2 is our slow brain, able to do the tough calculations, but limited to working on the data that System 1 accumulates. System 2 is also only active for short periods of time, and only when we consciously make use of it.

This is why statistical thinking is such a struggle for us. It requires taking in a wide range of possibilities, categories, and combinations. We have to make predictions and understand that under some circumstances we will see one outcome, while under other circumstances we may see a different one. Statistical thinking doesn’t pin down a concrete answer the way our causal thinking likes. As a result, we reach conclusions based on incomplete considerations, ignore important pieces of information, and assume we are correct because our answer feels correct and satisfies some criteria. Thinking causally can be powerful and useful, but only if we understand the statistical dimensions at hand and can fully think through the implications of the causal structures we are defining.

Seeing Causality

In Thinking Fast and Slow, Daniel Kahneman describes how a Belgian psychologist changed the way we understand our thinking with regard to causality. The traditional view held that we make observations about the world and come to understand causality through repeated exposure to events. As Kahneman writes, “[Albert] Michotte [1945] had a different idea: he argued that we see causality just as directly as we see color.”

Michotte’s argument is that causality is an integral part of the human psyche. We think about and understand the world through a causal lens. From the time we are infants, we interpret the world causally, seeing and understanding causal links and connections in the things that happen around us. It is not through repeated experience and exposure that we learn to view an event as having a cause or as being the cause of another event. It is something we have within us from the beginning.

“We are evidently ready from birth to have impressions of causality, which do not depend on reasoning about patterns of causation.”

I try to remember this idea of our intuitive and automatic causal understanding of the world when I think about science and how I should relate to it. We go through a lot of effort to make our scientific thinking as clear as possible. We use randomized controlled trials (RCTs) to test the accuracy of our hypotheses, but sometimes an intensely rigorous scientific study isn’t necessary for us to change our behavior based on simple scientific exploration via normal causal thinking. There are times when we can trust our causal intuition without having to rely on an RCT for evidence. I don’t know where to draw the line between causal inferences we can accept and those that need an RCT, but through honest self-awareness and reflection, we should be able to identify times when our causal interpretations are valid and reasonably well insulated from our own self-interests.

The Don’t Panic Geocast has discussed, in the Fun Paper Friday segment of two episodes, a pair of academic journal articles on the effectiveness of parachutes for preventing death when falling from an aircraft. The two papers, both published in the British Medical Journal, are satirical, but they demonstrate an important point. We don’t need to conduct an RCT to determine whether using a parachute when jumping from a plane is more effective at helping us survive the fall than jumping without one. It is an extreme example, but it demonstrates that our minds can see and understand causality without always needing an experiment to confirm a causal link. In a more consequential example, we can trust our brains when they observe that smoking cigarettes has negative health consequences, including an increased likelihood of developing lung cancer. An RCT to determine the exact nature and frequency of cancer development in smokers would certainly be helpful in building our scientific knowledge, but the scientific consensus around smoking and cancer should have been accepted much more readily than it was. An RCT in this example would take years and would potentially be unethical or impossible. Tobacco companies obfuscated the science by taking advantage of the fact that an RCT couldn’t be performed, and we failed to accept the causal link that our brains could see but could not prove as definitively as an RCT would allow. Nevertheless, we should have trusted our causal-thinking brains and accepted the intuitive answer.

We can’t always trust the causal conclusions our minds reach, but there are times when we should acknowledge that our brains think causally and accept that the causal links we intuit are accurate.

Conscious and Unconscious Priming Effects

“Another major advance in our understanding of memory was the discovery that priming is not restricted to concepts and words,” writes Daniel Kahneman in his book Thinking Fast and Slow, “You cannot know this from conscious experience, of course, but you must accept the alien idea that your actions and your emotions can be primed by events of which you are not even aware.”

Yesterday I wrote about linguistic priming: how words can trigger thoughts in our minds and set us up to think certain things. I wrote about how ideas spread, as in the movie Inception, from one thought to another based on similarities and categories. I wrote about how important implicit associations can be, and how we have used them to measure racial bias and the harm such biases can do in society. Today’s post continues that thread, exploring the areas of our lives where priming may be taking place without our knowledge.

In his book, before the quote I shared at the start of this post, Kahneman describes our thoughts as behaving like ripples on a pond. A train of thought can be primed in one direction, and ripples from that priming can spread out across our mind. So when I mentioned Inception earlier, I may have primed our minds to think about trains, since they feature so prominently in the movie, and if that is the case, it is no surprise that I used “train of thought” just a few sentences later. From this point forward, other metaphors and examples I use may be primed by the movie or by associations with trains. The priming is obvious if I directly describe my thinking as staying on track or going off the rails, but it may be far less clear how my thinking relates to trains in the sentences to come. As Kahneman’s quote suggests, my mind might be unconsciously primed in certain directions, all from a casual mention of Inception.

Across my writing I have always been fascinated by the idea that we are not in as much control of our minds as we believe. Thoughts think themselves; we don’t necessarily think our own thoughts. Our minds can be influenced by the time of day, by caffeine levels, by whether someone smiled at us on our commute to work, or by whether our sock is rubbing our foot in a strange way. Something doesn’t have to rise to conscious attention to register with our brain and influence where our mind goes. What thoughts pop into our head, and what ripples of ideas are primed across our mind, are beyond our control and influenced by things we sometimes barely notice. Priming, according to Kahneman, can be direct, deliberate, and conscious, or it can be unconscious and oblique. How we think is more random and unpredictable than it feels, and sometimes more random than we would like to believe. This should change how we think of ourselves, how we think of others, and what information and knowledge we privilege and encourage. It should make us less certain that we are always behaving as we should, and less certain that we are as smart and savvy in all situations as we like to believe.

Embodied Cognition

I really enjoy science podcasts and science writing, and I try to think rationally and scientifically when I observe and consider the world. In science, when we approach the world to better understand the connections within it, we try to isolate the variables acting on our observations or experiments. We try to separate ourselves from the world so that we can make an objective, independent observation of reality, free from our own interference and influence. Nevertheless, it is important to remember that we are part of the world and that we influence it. No matter how independent and rational we want to be, we are still part of the world and interact with it, even if we are just thinking and observing.

In Thinking Fast and Slow, Daniel Kahneman demonstrates how our thoughts and observations can have unintended physical manifestations in the world. He presents the reader with two words that normally don’t go together (I won’t spoil his experiment here). What he shows with this word association experiment is that simple thoughts, just hearing or reading a word, can influence how we experience and behave in the physical world. Anyone who has started sweating during a poker game, and anyone who has shuddered just from reading the words “nails on a chalkboard,” knows this is true. We are physical systems, and simple thoughts, memories, and words are enough to trigger physical responses in our bodies. While we like to think of ourselves as independent and separate from the world, we never really are.

Kahneman explains this by writing, “As cognitive scientists have emphasized in recent years, cognition is embodied; you think with your body, not only with your brain.” Our brains take in electrical information from stimuli in the world. Chemicals bind to receptors in our noses or on our tongues, and nerves transmit electrical signals telling the brain which chemicals are present. Light interacts with receptors in our eyes, and nerves again carry the signal directly into the brain. Thinking is a direct result of physical sensory input, and while we can’t physically touch a thought, our bodies do react to the thinking and experiencing taking place.

No matter how much we want to believe that we can be objective and separate from the physical reality around us, we cannot be 100% isolated. We experience the world physically, and while we can try to think of the world independently, our senses and experiences are directly connected to it. Our responses in turn are also physical, even if we don’t perceive them. We have to accept, no matter how scientific and objective we want to be, that we are part of the system we are evaluating. There is no independent God’s-eye view; our cognition is embodied, and we are within the system we observe.

Thinking Fast and Evolution

I have written in the past about how I probably put too much emphasis on evolutionary biology when thinking about our brains, reaching all the way back to when our human ancestors lived in small tribes as hunter-gatherers. Perhaps it is because I look for it more than others do, but I feel as though characteristics and traits that served us well during that time still influence much of how we behave and respond to the world today. Sometimes the effects are insignificant, but sometimes I believe they matter, and sometimes I believe they drive negative outcomes or behaviors that are maladapted to today’s world.

As I have begun writing about Daniel Kahneman’s research as presented in his book Thinking Fast and Slow, I have generally given System 1, what Kahneman describes as the quick, automatic, reactive part of our brain, a bad rap. But the reality is that it serves an important purpose, and it likely played an especially important role over the course of human evolution, getting us to where we are today. Knowing that I tend to weigh our evolutionary past heavily (perhaps too heavily), it is not surprising that I view System 1 as an important piece of how we got here, even if it is easy to pick on in our current world.

In his book, Kahneman writes, “Any task that requires you to keep several ideas in mind at the same time has the same hurried character. Unless you have the good fortune of a capacious working memory, you may be forced to work uncomfortably hard. The most effortful forms of slow thinking are those that require you to think fast.”

Anyone who has had to remember a couple of phone numbers without being able to write them down or save them immediately, and anyone who has had to remember more words than Person, Woman, Man, Camera, TV, knows how rushed we feel when we are suddenly given something important to hold in working memory. We do what we can, as quickly as possible, to get the information out of our head and stored someplace other than working memory. We feel rushed to complete the task and ease our cognitive load. Why would our brains work this way? Why do we become so rushed when we have something meaningful to hold in mind?

The answer, as I view it, might go back to our hunter-gatherer ancestors. They mostly needed System 1. They had to react quickly to a noise that could be a dangerous predator. They had to move fast and on instinct to successfully take down dinner. Not many things required deep focus, and the things that did were not dense academic journal articles, spreadsheets, PowerPoints, or a guy with a clipboard asking you to count backward from 200 by 13. You didn’t have to worry about pop-ups or advertisements while skinning an animal, grinding seeds, or doing some other kind of work with your hands in a non-digital world. You didn’t have phone numbers to remember, and you weren’t heading into a business meeting with four people you just met whose names you needed to memorize as quickly and fluidly as possible.

Slow thinking developed for people who had time for slow thinking. Fast thinking developed when survival was on the line. Today, slow thinking might be more likely to keep us alive than fast thinking, presuming we don’t have dangerous drives to work each day and are eating safely prepared foods. Slow thinking is the greater advantage now, but we also live in a world where slow thinking is difficult because we have packed so many distractions into our environments. We have literally moved ourselves out of the environments for which evolution optimized our brains, and this has created the challenges and conflicts we face with System 1 and System 2 in our daily lives and work.

Limited Effort for Focus and Deep Work

A little while back I wrote a blog post centered around a quote from Cal Newport, “You have a finite amount of willpower that becomes depleted as you use it.”

The idea is that our brains get tired, and as they tire, they become worse at practicing self-control. When you are exhausted, when you have had to concentrate hard on schoolwork, a business presentation, or the paperwork to ensure your child’s medical care is covered, your mind’s ability to focus becomes diminished. You have trouble staying away from that piece of cake in the fridge, trouble staying off Facebook, and trouble being patient with a child or spouse when they try to talk to you.

In his book Thinking Fast and Slow, Daniel Kahneman writes something very similar to the quote from Newport: “self-control and deliberate thought apparently draw on the same limited budget of effort.”

Our brains have only so much capacity for heavy-duty thinking. It is as if there is a set account for deep thinking, and as we think critically we slowly make withdrawals from the account until our brains are in the red. Using our brain for serious thought and calculation requires focus and self-control. But our willpower is depleted as we use it, so the longer we focus, the worse our brains become at keeping us focused.

Kahneman suggests that this is part of why we spend most of our lives operating on System 1, the automatic, quick, lightweight thinking process. System 2 is the deliberate process we engage to do math, to study a map and make sure we know where we are driving, and to listen seriously to a spouse or child and provide them with support. System 2 takes a lot of energy and has a limited budget. System 1 runs in low-power mode, and that is why it is our default. It makes mistakes, is subject to biases, and doesn’t always answer the right question, but at least it saves us energy and reserves the effort of attention for the most important tasks.

Kahneman and Newport would likely both agree that we should spend our System 2 budget deliberately. We should maximize the time we spend in deep work and set ourselves up to do our best System 2 thinking when we need it. We can save System 1 for unimportant moments and tasks, and work with our brains so that we don’t force too much System 2 work into the times when our effort budget has been depleted.

Detecting Simple Relationships

System 1, in Daniel Kahneman’s picture of the mind, is the part of our brain that is always on. It is the automatic part that detects simple relationships in the world, makes quick assumptions and associations, and reacts to the world before we are even consciously aware of anything. It is contrasted with System 2, which is more methodical, can hold complex and competing information, and can draw rational conclusions from detailed information through energy-intensive thought processes.

According to Kahneman, we engage System 2 only when we really need to. Most of the time System 1 does just fine and saves us a lot of energy. We don’t have to think critically about what to do when a stoplight changes from green to yellow to red; System 1 develops an automatic response so that we let off the gas and come to a stop without consciously thinking through every action involved in slowing down at an intersection. However, System 1 has some very serious limitations.

“System 1 detects simple relations (they are all alike, the son is much taller than the father) and excels at integrating information about one thing, but it does not deal with multiple distinct topics at once, nor is it adept at using purely statistical information.”

When relationships start to get complicated, like, say, the link between human activities and long-term climate change, System 1 lets us down. It also fails us when we see someone who looks like a Hells Angel on a father-daughter date at an ice cream shop, someone who looks like an NFL linebacker at a book club, or a little old lady driving a big truck. System 1 makes assumptions about the world based on simple relationships, and it is easily surprised. It can’t work through unique or edge cases, and it can’t hold complicated statistical information about the multiple actors and factors that influence the outcome of events.

System 1 is our default, and we need to remember where its strengths and weaknesses lie. It can help us make quick decisions while driving or catch an apple falling off a counter, but it can’t help us determine whether a defendant in a criminal case is guilty. There are times when our intuitive assumptions and reactions are spot on, but there are plenty of times when they lead us astray, especially in cases that go beyond simple relationships or violate our expectations.

Recognize Situations Where Mistakes Are Common

“Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent,” writes Daniel Kahneman in Thinking Fast and Slow. System 1 is how Kahneman describes the intuitive, quick-reacting part of our brain that continually scans the environment and filters the information going to System 2, the more thoughtful, deliberate, calculating, and rational part. Biases in human thought often originate with System 1. When System 1 misreads a situation, makes a judgment on a limited set of information, or inaccurately perceives something about the world, System 2 is left working on a poor data set and is likely to reach faulty conclusions.

Kahneman’s book focuses on common cognitive errors and biases, not in the hope that we can radically change our brains and no longer fall victim to prejudices, optical illusions, and cognitive fallacies, but in the hope that we can become more aware of how our thinking goes off the rails and marginally improve our thought processes and conclusions. Kahneman writes, “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”

If we are aware that we make snap judgments the instant we see a person, before either of us has spoken a single word, we can learn to adjust our behavior and prevent an instantaneous bias from coloring the entire interaction. If we know we are making a crucial decision about how to invest for retirement, we can pause and use examples from Kahneman’s book to remember that we tend to answer simpler questions than the ones we are asked, to favor things that are familiar, and to trust people based on factors that don’t truly track trustworthiness. Kahneman doesn’t claim his discussions of cognitive fallacies will make us experts in investing, but he does think his research can help us understand the biases we bring to an investment decision and improve the way we make some important choices. Understanding how our biases may be shaping a decision can help us improve it.

Self- and situational awareness are crucial for accurately understanding the world and making good decisions based on sound predictions. It is important to know whether you can trust an educated guess from yourself or others, and to recognize when your confidence is unwarranted. It is important to know when your opinions carry weight, and when your direct observations might be incomplete and misleading. In most of daily life the stakes are low and the cost of cognitive errors is small, but in some situations, like serving on a jury, driving on the freeway, or choosing whether to hire someone, our (and other people’s) livelihoods could be on the line. We should honestly recognize the biases and limitations of the mind so we can recognize situations where mistakes are common, and hopefully make fewer mistakes when it matters most.