Causal Illusions

In The Book of Why, Judea Pearl writes, “our brains are not wired to do probability problems, but they are wired to do causal problems. And this causal wiring produces systematic probabilistic mistakes, like optical illusions.” This creates problems for us when data correlate even though no causal link connects the outcomes. According to Pearl, our causal thinking “neglects to account for the process by which observations are selected.” We don’t always realize that we are taking a sample, that our sample could be biased, and that structural factors independent of the phenomenon we are trying to observe could greatly impact the observations we actually make.
Pearl continues, “We live our lives as if the common cause principle were true. Whenever we see patterns, we look for a causal explanation. In fact, we hunger for an explanation, in terms of stable mechanisms that lie outside the data.” When we see a correlation, our brains instantly start looking for a causal mechanism that can explain the correlation and the data we see. We rarely look at the data itself and ask whether some process in the data collection led to the outcomes we observed. Instead, we assume the data is correct and that it reflects an outside, real-world phenomenon. This is the source of many of the causal illusions that Pearl describes in the book. Our minds are wired for causal thinking, and we will invent causality when we see patterns, even if there truly isn’t a causal structure linking the patterns we see.
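To see how a selection process alone can manufacture a pattern, here is a minimal sketch in plain Python (my own illustration with simulated data, not an example from Pearl’s book): two quantities with no causal connection to each other appear correlated once we only look at the observations that pass a selection filter.
```python
import random

random.seed(0)

# Two independent quantities: there is no causal link between x and y.
population = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(100_000)]

def correlation(pairs):
    """Pearson correlation for a list of (x, y) pairs."""
    n = len(pairs)
    mx = sum(x for x, _ in pairs) / n
    my = sum(y for _, y in pairs) / n
    cov = sum((x - mx) * (y - my) for x, y in pairs) / n
    sx = (sum((x - mx) ** 2 for x, _ in pairs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for _, y in pairs) / n) ** 0.5
    return cov / (sx * sy)

print(correlation(population))   # close to 0: no relationship in the full population

# "Observe" only the cases that clear a bar: a selection process that has
# nothing to do with any causal mechanism connecting x and y.
selected = [(x, y) for x, y in population if x + y > 1.5]
print(correlation(selected))     # clearly negative: a pattern created by selection alone
```
The correlation in the selected sample is a property of how the observations were chosen, not of any mechanism linking the two quantities.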
It is in this spirit that we attribute negative personality traits to people who cut us off on the freeway. We assume they don’t like us, that they are terrible people, or that they are rushing to the hospital with a sick child, so that our being cut off has a satisfying causal explanation. When a particular type of car stands out and we start seeing that car everywhere, we mistake our increased attention for a real change and assume that there really are more of those cars on the road now. We assume that people find them more reliable or more appealing and purposely bought them, a causal mechanism that explains why we now see them everywhere. In both of these cases we are creating causal pathways in our minds that are little more than causal illusions, but we want to find a cause for everything, and we don’t always realize that we are doing so. It is important that we be aware of these causal illusions when making important decisions, that we think about how the data came to mind, and that we ask whether a causal illusion or cognitive error is at play.
Paternalistic Nudges

In their book Nudge, Cass Sunstein and Richard Thaler argue in favor of libertarian paternalism. Their argument is that our world is complex and interconnected, and it is impossible for people to truly make decisions on their own. Not only is it impossible for people to simply make their own decisions, it is impossible for other people to avoid influencing the decisions of others. Whether we decide to influence a decision in a particular way, or whether we decide to try to avoid any influence on another’s decision, we still shape how decisions are presented, understood, and contextualized. Given this reality, the best alternative is to try to help people make consistently better decisions than they would without aid and assistance.


The authors describe libertarian paternalism by writing:


“The approach we recommend does count as paternalistic, because private and public choice architects are not merely trying to track or to implement people’s anticipated choices. Rather, they are self-consciously attempting to move people in directions that will make their lives better. They nudge.”


The nudge is the key aspect of libertarian paternalism. Forcing people into a single choice, forcing them to accept your advice and perspective, and aggressively trying to change people’s behaviors and opinions don’t fit within the libertarian paternalism framework advocated by Sunstein and Thaler. Instead, a more subtle form of guidance toward good decisions is employed. People retain the full range of choices if they want them; their options may be framed and gently constrained, but almost nothing is completely off the table.


“A nudge,” Sunstein and Thaler write, “as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives.”


Daniel Kahneman, in his book Thinking Fast and Slow, demonstrated that people make predictable errors and have predictable biases. If we can understand these thinking errors and biases, then we can identify situations in which they are likely to lead people to make suboptimal decisions. To go a step further, as Sunstein and Thaler would suggest, if we are choice architects, we should design and structure choices in a way that leads people away from predictable cognitive biases and errors. We should design choices in a way that takes those thinking mistakes into consideration and improves the way people understand their choices and options.


As a real-world example, if we are structuring a retirement savings plan, we can be relatively sure that people will anchor around the default contribution rate built into the plan. If we want to encourage greater retirement savings (knowing that economic data indicate people rarely save enough), we can set the default to 8% or higher, knowing that people may reduce the default rate but likely won’t eliminate contributions entirely. Setting a high default is a nudge toward better retirement saving. We could choose not to have a default rate at all, but then people likely wouldn’t be sure what rate to select and might choose a low rate below inflation, or simply not enter a rate at all, completely failing to contribute anything to the plan. It is clear that there is a better outcome that we, as choice architects, could help people attain if we understand how their minds work and can apply a subtle nudge.
Can We Avoid Cognitive Errors?

Daniel Kahneman is not very hopeful when it comes to our ability to avoid cognitive errors. Toward the end of Thinking Fast and Slow, a book all about cognitive errors, predictable biases, and the situations in which we can recognize them, Kahneman admits he isn’t sure there is much we can actually do in our lives to improve our thinking.


Regarding his own thinking, Kahneman writes, “little can be achieved without considerable effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”


Kahneman’s book is fantastic in part because of his humility. It would be easy to take a book on illusions, cognitive errors, biases, and predictable fallacies and use it to show how much smarter you are than everyone else who makes such thinking mistakes. However, Kahneman uses his own real-life examples throughout the book to show how common and easy it is to fall into ways of thinking that don’t actually reflect reality. What is unfortunate, though, is how hard it is to actually take what you learn from the book and apply it to your own life. If the author himself can hardly improve his own thinking, then those of us who read the book likely won’t make big changes in our thinking either.


“The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors,” Kahneman continues. While we might not be able to improve our thinking simply by knowing about cognitive errors and being aware of predictable biases, we can at least recognize them in others. This can help us be more thoughtful when we critique or gossip about others (something we all do even if we claim we don’t).


Beyond improving the way we gossip or judge others, Kahneman’s research and his book are incredibly valuable for anyone in a design-focused role. If you are creating a layout for a webpage, a seating arrangement at a restaurant, or the standard operating procedures for a company, you have an opportunity to design and develop a process and flow that takes cognitive errors and predictable biases into account. Because it is easier to observe others making mistakes than to observe those mistakes in ourselves, we can watch for situations where people are led astray and help get them back on course. We can develop systems and structures that take our biases and cognitive errors into account and minimize the damage they may do. We can set the world up to help guide us in a reasonable way through our cognitive errors and biases, but only if we know what to look for.
The Remembering Self and Time

Time, as we now know it, has only been with human beings for a small slice of human history. The story of time zones is fascinating, and it really began once railroads connected the United States. Before we had a standardized system for keeping time, human lives were ruled by the cycle of the sun and the seasons, not by the hands of a watch. This is important because it suggests that the time bounds we put on our lives, the hours of our schedules and workdays, and the way we think about the duration of meetings, movies, a good night’s sleep, and flights are not something our species truly evolved to operate within.


In Thinking Fast and Slow, Daniel Kahneman shows one of the consequences of human history being out of sync with modern time. “The mind,” he writes, “is good with stories, but it does not appear to be well designed for the processing of time.”


I would argue that this makes sense and should be expected. Before we worked set schedules defined by the clock, before we could synchronize the start of a football game with TV broadcasts across the world, and before we all needed to be at the same place at precisely the right time to catch a departing train, time wasn’t very important. It was easy to tie time to sunrise, sunset, or midday compared to a 3:15 departure or a 7:05 kick-off. The passage of time also didn’t matter that much. The difference between being 64 and 65 years old wasn’t a big deal for humans who didn’t receive retirement benefits and social security payments. We did not evolve to live in a world where every minute of every day is tightly controlled by time and where the passage of time is tied so specifically to events in our lives.


For me, and I think for Daniel Kahneman, this may explain some of the cognitive errors we make when we remember events from our past. Time wasn’t as important a factor for ancient humans as storytelling was. Kahneman continues,


“The remembering self, as I have described it, also tells stories and makes choices, and neither the stories nor the choices properly represent time. In storytelling mode, an episode is represented by a few critical moments, especially the beginning, the peak, and the end. Duration is neglected.”


When we think back on our lives, on moments that meant a lot to us, on times we want to relive, or on experiences we want to avoid in the future, we remember the salient details. We don’t necessarily remember how long everything lasted. My high school basketball days are not remembered by the hours spent running UCLAs, by the number of Saturdays I had to be up early for 8 a.m. practices, or by the hours spent in drills. My memories are made up of a few standout plays, games, and memorable team moments. The same is true for my college undergrad memories, the half-marathons I have raced, and my memories from previous homes I have lived in.


When we think about our lives we are not good at thinking about the passage of time, about how long we spent working on something, how long we had to endure difficulties, or how long the best parts of our lives lasted. We live with snapshots that can represent entire years or decades. Our remembering self drops the less meaningful parts of experiences from our memories, and holds onto the start, the end, and the best or worst moments from an experience. It distorts our understanding of our own history, and creates memories devoid of a sense of time or duration.


I think about this a lot because our minds and our memories are the things that drive how we behave and how we understand the present moment. However, duration neglect helps us see that the reality of our lives is shaped by unreality. We are influenced by cognitive errors and biases, by poor memories, and by distortions of time and experience. It is important to recognize how faulty our thinking can be, so we can develop systems, structures, and ways of thinking that don’t assume we are always correct, but instead help guide us toward better and more realistic ways of understanding the world.
The Focusing Illusion Continued

I find the focusing illusion as described by Daniel Kahneman in his book Thinking Fast and Slow to be fascinating because it reveals how strange our actual thinking is. I am constantly baffled by the way that our brains continuously and predictably make mistakes. The way we think about, interpret, and understand the world is not based on an objective reality, but is instead based on what our brain happens to be focused on at any given time. As Kahneman writes, what you see is all there is, and the focusing illusion is a product of our brain’s limited ability to take in information combined with its tendency to substitute simpler questions for difficult and complex ones.


In the book, Kahneman asks us to think about the overall happiness of someone who recently moved from Ohio to California, and he also asks us to think about the amount of time that paraplegics spend in a bad mood. In both situations, we make a substitution. We know that people’s overall happiness and general moods are composed of a huge number of factors, but when we think about the two situations, we focus on a couple of simple ideas.


We assume the person from Ohio is happier in California because the weather in California is always perfect while Ohio experiences cold winters. The economic prospects in California might be better than in Ohio, and there are more movie stars and surfing opportunities. Without knowing anything about the person, we probably assume the California move made them happier overall (especially given the additional context and priming based on the weather and job prospects that Kahneman presents in the example in his book).


For our assumptions about the paraplegic, we likely go the other way with our thoughts. We think about how we would feel if we were in an accident and lost the use of our legs or arms. We assume their life must be miserable and that they spend much of their day in a bad mood. We don’t make a complex consideration of the individual’s life or ask for more information about them; we just make an assumption based on limited information by substituting in the question, “How would I feel if I became paralyzed?” Of course, people who are paralyzed or lose the function of part of their body are still capable of a full range of human emotions, and might still find happiness in many areas of their lives.


Kahneman writes, “The focusing illusion can cause people to be wrong about their present state of well-being as well as about the happiness of others, and about their own happiness in the future.”


We often say that it is important to know ourselves and to be true to ourselves if we want to live healthy and successful lives. But research throughout Thinking Fast and Slow shows us how hard that can be. After reading Kahneman’s book, learning about Nudges from Cass Sunstein and Richard Thaler, and learning how poorly we process risk and chance from Gerd Gigerenzer, I constantly doubt how much I can really know about myself, about others, or really about anything. I am frustrated when people act on intuition, sure of themselves and their ideas in complex areas such as economics, healthcare, or education. I am dismayed by advertisements, religions, and political parties that encourage us to act tribally and to trust our instincts and intuitions. It is fascinating that we can be so wrong about something as personal as our own happiness. It is fascinating that we can be so biased in our thinking and judgment, and that we can draw conclusions and make assumptions about ourselves and others with limited information and not even notice how poor our thought processes are. I love thinking about and learning about the biases and cognitive errors of our minds, and it makes me pause when I am sure of myself and when I think that I am clearly right and others are wrong. After all, if what you see is all there is, then your opinions, ideas, and beliefs are almost certainly inadequate to actually describe the reality you inhabit.
Sunk-Cost Fallacy

Every time I pick the wrong line at the grocery store I am reminded of the sunk-cost fallacy. There are times I will be stuck in line, see another line moving more quickly, and debate internally if I should jump to the other line or just wait it out in the line I’m already in. Once I remember the sunk-cost fallacy, however, the internal debate shifts and I let go of any feeling that I need to remain in the current line.


My grocery store example is a comical take on the sunk-cost fallacy, but in real life, this cognitive error can have huge consequences. Daniel Kahneman describes it this way, “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.”


We are going to make decisions and choices about where to invest our time, attention, and money that will turn out to be mistakes. At a certain point we have to realize when something is not working and walk away. Doing so, however, requires that we admit failure, that we cut our losses, and that we search for new opportunities. Admitting that we were wrong, giving up on losses, and searching for new avenues is difficult, and it is not uncommon for us to keep moving forward despite our failures, as if we just need to try harder and push more in order to find the success we desire. This is the basis of the sunk-cost fallacy. When we have invested a lot of time, energy, and resources into something, it is hard to walk away, even if we would be better off by doing so.


Pursuing a career path that clearly isn’t panning out and refusing to try a different avenue is an example of the sunk-cost fallacy. A movie studio that tries to reinvent a character or story over and over despite continued failure is another example. Sitting through the terrible movie the studio produced, rather than leaving the theater early, is also an example of the sunk-cost fallacy. In all of these instances, an investment has been made, and costly efforts to make the investment pay off are undertaken, generally at a greater loss than we would incur by making a change and walking away.


When you find yourself saying, “I have already spent so much money on XYZ, or I have already put so much effort into making XYZ work, and I don’t want to just let that all go to waste,” you are stuck in the middle of the sunk-cost fallacy. At this point, it is time to step back, look at other ways you could spend your money and time, and honestly evaluate what your priorities should be. Doing so, and remembering Kahneman’s quote, will help you begin to make the shift to a better use of your time, energy, and resources. It may be embarrassing and disappointing to admit that something is going in the wrong direction, but ultimately, you will end up in a better and more productive spot.
Denominator Neglect

“The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects,” writes Daniel Kahneman in Thinking Fast and Slow.


One thing we have seen in 2020 is how difficult it is to communicate and understand risk. Thinking about risk requires thinking statistically, and thinking statistically doesn’t come naturally to our brains. We are good at thinking in terms of anecdotes, and our brains like to identify patterns and potential causal connections between specific events. When our brains have to predict chance and deal with uncertainty, they easily get confused. Our brains shift to easier problems rather than working through complex mathematical problems, substituting the answer to the easy problem without realizing it. Whether it is our risk of getting COVID or the probability we assigned to election outcomes before November 3rd, many of us have been thinking poorly about probability and chance this year.


Kahneman’s quote above highlights one example of how our thinking can go wrong when we have to think statistically. Our brains can be easily influenced by irrelevant numbers, and that can throw off our decision-making when it comes to dealing with uncertainty. To demonstrate denominator neglect, Kahneman presents two situations in his book. There are two large urns full of white and red marbles. If you pull a red marble from an urn, you are a winner. The first urn has 10 marbles in it, with 9 white and 1 red. The second urn has 100 marbles in it, with 92 white and 8 red marbles. Statistically, we should try our luck with the urn with 10 marbles, because 1 out of 10, or 10%, of the marbles in that urn are red. In the second urn, only 8% of the marbles are red.


When asked which urn they would want to select from, many people select the second urn, falling into what Kahneman describes as denominator neglect. The chance of winning is lower with the second urn, but there are more winning marbles in it, making it seem like the better option if you don’t slow down and engage your System 2 thinking processes. If you pause and think statistically, you can see that the first urn provides better odds, but if you are moving quickly your brain can be distracted by the larger number of winning marbles and lead you to make a worse choice.
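
As a quick check on the arithmetic, here is a small simulation sketch in Python (my own illustration, not something from Kahneman’s book) that draws a marble at random from each urn many times and confirms that the 10-marble urn wins more often, even though it holds fewer red marbles.
```python
import random

random.seed(1)

def win_rate(red, total, trials=100_000):
    """Estimate the chance of drawing a red (winning) marble from an urn."""
    wins = sum(1 for _ in range(trials) if random.randrange(total) < red)
    return wins / trials

print(win_rate(1, 10))    # urn 1: 1 red out of 10 marbles  -> roughly 0.10
print(win_rate(8, 100))   # urn 2: 8 red out of 100 marbles -> roughly 0.08
```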


What is important to recognize is that we can be influenced by numbers that shouldn’t mean anything to us. The number of winning marbles shouldn’t matter, only the percent chance of winning should matter, but our brains get thrown off. The same thing happens when we see sales prices, think about the risk of a family gathering of 10 people during a global pandemic, or think about polling errors. I like to check The Nevada Independent’s COVID-19 tracking website, and I have noticed denominator neglect in how I think about the numbers they report. For an extended stretch, Nevada’s total number of cases was decreasing, but our case positivity rate was staying the same. Statistically, nothing was really changing regarding the state of the pandemic in Nevada, but fewer tests were being completed and reported each day, so the overall number of positive cases was decreasing. If you scroll down the Nevada Independent website, you will get to a graph of the case positivity rate and see that things were staying the same. When looking at the decreasing number of positive tests reported, my brain was neglecting the denominator, the number of tests completed. The way I understood the pandemic was biased by the big headline number, and wasn’t really based on how many people out of those tested did indeed have the virus. Thinking statistically provides a more accurate view of reality, but it is hard to do, and it is tempting to look at just a single headline number.
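
To make the denominator explicit, here is a tiny sketch with made-up numbers (purely hypothetical, not the Nevada Independent’s actual figures): the headline count of positive tests can fall by half while the positivity rate, the number that actually tracks the state of the pandemic, stays exactly the same.
```python
# Hypothetical reporting weeks, for illustration only.
weeks = {
    "week 1": {"tests": 10_000, "positives": 1_000},
    "week 2": {"tests": 5_000, "positives": 500},  # half the tests, half the positives
}

for label, counts in weeks.items():
    rate = counts["positives"] / counts["tests"]
    print(f"{label}: {counts['positives']:,} positives, positivity rate {rate:.0%}")

# Both weeks print a 10% positivity rate: the falling headline number of positives
# reflects the shrinking denominator (tests completed), not a change in the virus.
```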
Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes New Prediction.” These headlines are always nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation; how could anyone fail to see this as far back as 2004?


The problem, of course, is that the inevitable present moment and the pathway that seems so obvious in retrospect were never clear at all. There was no way to predict the major housing bubble and financial collapse of 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction, are pulling at our emotions and playing with hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean with their predictions, and assume that their next grand claim is wrong.


Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”


Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future and lead us to trust individuals who don’t deserve trust and to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as fodder for misleading headlines.
Narratives and Halos

Yesterday I wrote about narrative fallacies and how our brains’ desires to create coherent stories can lead to cognitive errors. One error, which I wrote about previously, is the halo effect, and in some ways it is a direct consequence of narrative thinking. Our brains don’t do well with conflicting information that doesn’t fit a coherent narrative, and the halo effect helps smooth over this problem in our minds.


In Thinking Fast and Slow, Daniel Kahneman writes, “The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations; good people do only good things and bad people are all bad.” When we already like someone or consider them a good person, the halo effect will simplify other judgments that we might have to make about them. If the person we admire is wearing a particular kind of coat, then we will assume that it is also a coat we should admire. If a person we dislike is engaging in some type of business, then we will assume that business is also bad. Contradictions occur when we see someone we admire wearing clothing we don’t find acceptable or when a person we know to have moral flaws engages in altruistic charity work.


Instead of accepting a contradiction in our narrative and creating a more complex story where some people are good in some situations but bad in others, we alter our judgments in other ways to maintain a coherent narrative. The person we like wearing strange clothes is a trendsetter, and that must be the new up-and-coming style we should try to emulate. The bad person engaged in charity isn’t really doing good things for good reasons; rather, they are being selfish and trying to show off through their charity.


When we reflect on our thinking and try to be more considerate of the narratives we create, we can see that we fall into traps like the halo effect. What is harder, however, is overcoming the halo effect and other cognitive errors that simplify our narratives once we have noticed them. It is hard to continually live with conflicting opinions and ideas about people, cities, sports teams, car companies, and shoe brands. It is much easier to adopt a few favorites and believe them to be good in all ways, rather than to accept that something might be great in some ways but harmful or disappointing in others.
Narrative Fallacies

With perhaps the exception of professional accountants and actuaries, we think in narratives. How we understand important aspects of our lives, such as who we are, the opportunities we have had in life, the decisions we have made, and how our society works, is shaped by the narratives we create in our minds. We use stories to make sense of our relationships with other people and of where our future is heading, and to motivate ourselves to keep going. Narratives are powerful, but so are the narrative fallacies that can arise from the way we think.


Daniel Kahneman, in Thinking Fast and Slow, demonstrates the ways in which our brains take shortcuts, rely on heuristics, and create narratives to understand a complex world. He shows how these thinking strategies can fail us in predictable ways due to biases, illusions, and judgments made on incomplete information. Narrative fallacies can arise from all three of the cognitive errors I just listed. To get more in depth with narrative fallacies, Kahneman writes,


“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.”


We don’t really know how to judge probabilities, possibilities, and the consequences of things that didn’t happen. We are biased to see agency in people and things when luck was more of a factor than any direct action or individual decision. We are motivated and compelled by stories of the world that simplify the complexity of reality, taking a small slice of the world and turning that into a model to describe how we should live, behave, and relate to others.


Unfortunately, in my opinion, narrative fallacies cannot be avoided. I studied public policy, and one of the frameworks for understanding political decision-making that I think needs far more direct attention is the Narrative Policy Framework, which incorporates the idea of Social Constructions of Target Populations from Anne Schneider and Helen Ingram. We understand the outcomes of an event based on how we think about the people or groups impacted by those outcomes. A long prison sentence for a person who committed a violent crime is fair and appropriate. A tax break for parents who work full time is also fair and appropriate. In both instances, we think about the person receiving the punishment or reward of a decision, and we judge whether they are deserving of that punishment or reward. We create a narrative to explain why we think the outcomes are fair.


We cannot exist in a large society of millions of people without shared narratives to help us explain and understand our society collectively. We cannot help but create a story about a certain person or group of people and build a narrative to explain why we think that person or group deserves a certain outcome. No matter what, however, the outcomes will not be rational; they will be biased and contain contradictions. We will judge groups positively or negatively based on stories that may or may not be accurate and complete, and people will face real rewards or punishments based on how we construct our narratives and what biases are built into our stories. We can’t escape this reality because it is how our brains work and how we create a cohesive society, but we can at least step back and admit that this is how our brains work, admit that our narratives are subject to biases and based on incomplete information, and decide how we want to move forward with new narratives that will help to unify our societies rather than pit them against each other in damaging competition.