
Sunk-Cost Fallacy

Every time I pick the wrong line at the grocery store I am reminded of the sunk-cost fallacy. There are times I will be stuck in line, see another line moving more quickly, and debate internally whether I should jump to the other line or just wait it out in the line I’m already in. Once I remember the sunk-cost fallacy, however, the internal debate shifts and I let go of any feeling that I need to remain in the current line.

 

My grocery store example is a comical take on the sunk-cost fallacy, but in real life, this cognitive error can have huge consequences. Daniel Kahneman describes it this way, “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.”

 

We are going to make decisions and choices about where to invest our time, attention, and money that will turn out to be mistakes. At a certain point we have to realize when something is not working and walk away. Doing so, however, requires that we admit failure, that we cut our losses, and that we search for new opportunities. Admitting that we were wrong, giving up on losses, and searching for new avenues is difficult, and it is not uncommon for us to keep moving forward despite our failures, as if we just need to try harder and push more in order to find the success we desire. This is the basis of the sunk-cost fallacy. When we have invested a lot of time, energy, and resources into something it is hard to walk away, even if we would be better off by doing so.

 

Pursuing a career path that clearly isn’t panning out and refusing to try a different avenue is an example of the sunk-cost fallacy. Movie studios that try to reinvent a character or story over and over, with continued failure, are another example. Sitting through the terrible movie the studio produced, rather than leaving the theater early, is also an example of the sunk-cost fallacy. In all of these instances, an investment has been made, and costly efforts to make the investment pay off are undertaken, generally at a greater loss than would be incurred by making a change and walking away.

 

When you find yourself saying, “I have already spent so much money on XYZ, or I have already put so much effort into making XYZ work, and I don’t want to just let that all go to waste,” you are stuck in the middle of the sunk-cost fallacy. At this point, it is time to step back, look at other ways you could spend your money and time, and honestly evaluate what your priorities should be. Doing so, and remembering Kahneman’s quote, will help you begin to make the shift to a better use of your time, energy, and resources. It may be embarrassing and disappointing to admit that something is going in the wrong direction, but ultimately, you will end up in a better and more productive spot.

Denominator Neglect

“The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects,” writes Daniel Kahneman in Thinking Fast and Slow.

 

One thing we have seen in 2020 is how difficult it is to communicate and understand risk. Thinking about risk requires thinking statistically, and thinking statistically doesn’t come naturally to our brains. We are good at thinking in terms of anecdotes, and our brains like to identify patterns and potential causal connections between specific events. When our brains have to predict chance and deal with uncertainty, they easily get confused. Rather than work through a complex mathematical problem, our brains shift to an easier problem and substitute its answer without realizing it. Whether it is our risk of getting COVID or the probability we assigned to election outcomes before November 3rd, many of us have been thinking poorly about probability and chance this year.

 

Kahneman’s quote above highlights one example of how our thinking can go wrong when we have to think statistically. Our brains can be easily influenced by random numbers, and that can throw off our decision-making when it comes to dealing with uncertainty. To demonstrate denominator neglect, Kahneman presents two situations in his book. There are two large urns full of white and red marbles. If you pull a red marble from an urn, you are a winner. The first urn has 10 marbles in it, with 9 white and 1 red. The second urn has 100 marbles in it, with 92 white and 8 red. Statistically, we should try our luck with the urn with 10 marbles, because 1 out of 10, or 10%, of the marbles in the urn are red. In the second urn, only 8% of the marbles are red.
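
To see the arithmetic plainly, here is a minimal sketch in Python (my own illustration, not something from the book) that computes each urn’s chance of a winning draw and checks it with a quick simulation:

```python
import random

def win_probability(red, total):
    """Chance of drawing a red (winning) marble from an urn."""
    return red / total

# Urn 1: 1 red marble out of 10. Urn 2: 8 red marbles out of 100.
print(win_probability(1, 10))   # 0.10 -> a 10% chance of winning
print(win_probability(8, 100))  # 0.08 -> an 8% chance of winning

def simulate(red, total, draws=100_000):
    """Estimate the win rate by repeatedly drawing one marble at random."""
    urn = ["red"] * red + ["white"] * (total - red)
    wins = sum(random.choice(urn) == "red" for _ in range(draws))
    return wins / draws

print(simulate(1, 10))   # roughly 0.10
print(simulate(8, 100))  # roughly 0.08
```

The eight winning marbles in the second urn are what grab our attention; dividing by the denominator is the step our fast thinking skips.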

 

When asked which urn they would want to select from, many people choose the second urn, exhibiting what Kahneman describes as denominator neglect. The chance of winning is lower with the second urn, but there are more winning marbles in it, making it seem like the better option if you don’t slow down and engage your System 2 thinking processes. If you pause and think statistically, you can see that the first urn provides better odds, but if you are moving quickly your brain can be distracted by the larger number of winning marbles and led to a worse choice.

 

What is important to recognize is that we can be influenced by numbers that shouldn’t mean anything to us. The number of winning marbles shouldn’t matter, only the percent chance of winning should, but our brains get thrown off. The same thing happens when we see sale prices, think about the risk of a family gathering of 10 people during a global pandemic, or think about polling errors. I like to check The Nevada Independent‘s COVID-19 tracking website, and I have noticed denominator neglect in how I think about the numbers they report. For an extended stretch, Nevada’s total number of cases was decreasing, but our case positivity rate was staying the same. Statistically, nothing was really changing regarding the state of the pandemic in Nevada, but fewer tests were being completed and reported each day, so the overall number of positive cases was decreasing. If you scroll down the Nevada Independent website, you will get to a graph of the case positivity rate and see that things were staying the same. When looking at the decreasing number of positive tests reported, my brain was neglecting the denominator, the number of tests completed. The way I understood the pandemic was biased by the big headline number, and wasn’t really based on how many people out of those tested did indeed have the virus. Thinking statistically provides a more accurate view of reality, but it can be hard to do, and it is tempting to look at just a single headline number.
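
As a toy illustration of that last point (the numbers below are made up for the example, not Nevada’s actual figures), a falling case count can coexist with a flat positivity rate whenever the number of tests falls at the same pace:

```python
# Hypothetical daily reports: positives fall only because testing falls.
days = [
    {"tests": 10_000, "positives": 1_000},
    {"tests": 8_000, "positives": 800},
    {"tests": 6_000, "positives": 600},
]

for day in days:
    rate = day["positives"] / day["tests"]
    print(f"{day['positives']:>5} positives from {day['tests']:>6} tests "
          f"-> positivity rate {rate:.0%}")

# The headline number (positives) drops 40%, but the rate stays at 10%,
# so the underlying state of the outbreak has not changed.
```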

Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes a New Prediction.” These headlines are always nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation; how could anyone fail to see this as far back as 2004?

 

The problem, of course, is that the present moment that now seems inevitable, and the pathway that seems so obvious in retrospect, were never clear at all. There was no way to predict a major housing bubble and financial collapse in 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction, are pulling at our emotions and playing with hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean with their predictions, and assume that their next grand claim is wrong.

 

Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”

 

Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future and lead us to trust individuals who don’t deserve trust, and to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as fodder for misleading headlines.

Narratives and Halos

Yesterday I wrote about narrative fallacies and how our brains’ desires to create coherent stories can lead to cognitive errors. One error, which I wrote about previously, is the halo effect, and in some ways it is a direct consequence of narrative thinking. Our brains don’t do well with conflicting information that doesn’t fit a coherent narrative, and the halo effect helps smooth over this problem in our minds.

 

In Thinking Fast and Slow, Daniel Kahneman writes, “The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations; good people do only good things and bad people are all bad.” When we already like someone or consider them a good person, the halo effect will simplify other judgments that we might have to make about them. If the person we admire is wearing a particular kind of coat, then we will assume that it is also a coat we should admire. If a person we dislike is engaging in some type of business, then we will assume that business is also bad. Contradictions occur when we see someone we admire wearing clothing we don’t find acceptable, or when a person we know to have moral flaws engages in altruistic charity work.

 

Instead of accepting a contradiction in our narrative, and creating a more complex story in which some people are good in some situations but bad in others, we alter our judgments in other ways to maintain a coherent narrative. The person we like wearing strange clothes is a trendsetter, and that must be the new up-and-coming style we should try to emulate. The bad person engaged in charity isn’t really doing good things for good reasons; rather, they are being selfish and trying to show off through their charity.

 

When we reflect on our thinking and try to be more considerate of the narratives we create, we can see that we fall into traps like the halo effect. What is harder to do, however, is overcome the halo effect and other cognitive errors that simplify our narratives once we have noticed them. It is hard to continually live with conflicting opinions and ideas about people, cities, sports teams, car companies, and shoe brands. It is much easier to adopt a few favorites and believe them to be good in all ways, rather than to accept that something might be great in some ways, but harmful or disappointing in others.

Narrative Fallacies

With perhaps the exception of professional accountants and actuaries, we think in narratives. How we understand important aspects of our lives, such as who we are, the opportunities we have had in life, the decisions we have made, and how our society works is shaped by the narratives we create in our minds. We use stories to make sense of our relationships with other people, of where our future is heading, and to motivate ourselves to keep going. Narratives are powerful, but so are the narrative fallacies that can arise from the way we think.

 

Daniel Kahneman, in Thinking Fast and Slow, demonstrates the ways in which our brains take shortcuts, rely on heuristics, and create narratives to understand a complex world. He shows how these thinking strategies can fail us in predictable ways due to biases, illusions, and judgments made on incomplete information. Narrative fallacies can arise from all three of the cognitive errors I just listed. To get more in depth with narrative fallacies, Kahneman writes,

 

“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.”

 

We don’t really know how to judge probabilities, possibilities, and the consequences of things that didn’t happen. We are biased to see agency in people and things when luck was more of a factor than any direct action or individual decision. We are motivated and compelled by stories of the world that simplify the complexity of reality, taking a small slice of the world and turning that into a model to describe how we should live, behave, and relate to others.

 

Unfortunately, in my opinion, narrative fallacies cannot be avoided. I studied public policy, and one of the frameworks for understanding political decision-making that I think needs far more direct attention is the Narrative Policy Framework, which incorporates the idea of Social Constructions of Target Populations from Anne Schneider and Helen Ingram. We understand the outcomes of an event based on how we think about the person or group impacted by those outcomes. A long prison sentence for a person who committed a violent crime is fair and appropriate. A tax break for parents who work full time is also fair and appropriate. In both instances, we think about the person receiving the punishment or reward of a decision, and we judge whether they are deserving of it. We create a narrative to explain why we think the outcomes are fair.

 

We cannot exist in a large society of millions of people without shared narratives to help us explain and understand our society collectively. We cannot help but create a story about a certain person or group of people, and build a narrative to explain why we think that person or group deserves a certain outcome. No matter what, however, the outcomes will not be rational; they will be biased and contain contradictions. We will judge groups positively or negatively based on stories that may or may not be accurate and complete, and people will face real rewards or punishments due to how we construct our narratives and what biases are built into our stories. We can’t escape this reality because it is how our brains work and how we create a cohesive society. But we can at least step back and admit this is how our brains work, admit that our narratives are subject to biases and are based on incomplete information, and decide how we want to move forward with new narratives that will help to unify our societies rather than pit them against each other in damaging competition.

How We Choose to Measure Risk

Risk is a tricky thing to think about, and how we choose to measure and communicate risk can make it even more challenging to comprehend. Our brains like to categorize things, and categorization is easiest when the categories are binary or represent three or fewer distinct possibilities. Once you start adding options and different possible outcomes, decisions quickly become overwhelmingly complex, and our minds have trouble sorting through the possibilities. In his book Thinking Fast and Slow, Daniel Kahneman discusses the challenges of thinking about risk, and highlights another level of complexity: what measurements we are going to use to communicate and judge risk.

 

Humans are pretty good at estimating coin flips – that is to say, our brains do OK with binary 50-50 outcomes (although, as Kahneman shows in his book, this can still trip us up from time to time). Once we have to start thinking about complex statistics, like how many people will die from cancer caused by smoking if they smoke X packs of cigarettes per month for X years, our brains start to have trouble keeping up. However, there is an additional decision that needs to be layered on top of statistics such as cigarette-related death statistics before we can begin to understand them. That decision is how we are going to report the death statistics. Will we choose to report deaths per thousand smokers? Will we choose to report deaths by the number of packs smoked over a number of years? Will we just choose to report deaths among all smokers, regardless of whether they smoked one pack per month or one pack before lunch every day?
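
As a rough sketch of how much the choice of measure shapes the impression (every figure below is invented purely for illustration), the same underlying numbers can be reported in at least three ways:

```python
# Invented figures, for illustration only.
smokers = 30_000_000         # hypothetical number of smokers
deaths_per_year = 480_000    # hypothetical smoking-related deaths per year

# Measure 1: total deaths -- a large, alarming headline number.
print(f"{deaths_per_year:,} deaths per year")

# Measure 2: deaths per 1,000 smokers -- the same risk, smaller-sounding.
print(f"{deaths_per_year / smokers * 1000:.0f} deaths per 1,000 smokers per year")

# Measure 3: an individual's annual probability -- smaller-sounding still.
print(f"{deaths_per_year / smokers:.1%} annual chance of death for a given smoker")
```

Each line describes the same situation, but the first reads like a crisis while the last reads like a rounding error, which is exactly the opening for a motivated choice of measure.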

 

Kahneman writes, “the evaluation of the risk depends on the choice of a measure – with the obvious possibility that the choice may have been guided by a preference for one outcome or another.”

 

Political decisions cannot be escaped, even when we are trying to make objective and scientific statements about risk. If we want to convey that something is dangerous, we might choose to report overall death numbers across the country. Those death numbers might sound large, even though they may represent a very small fraction of incidents. In our lives today, this may be done with COVID-19 deaths, voter fraud instances, or wildfire burn acreage. Our brains will have a hard time comprehending risk in each of these areas, and adding the complexity of how that risk is calculated, measured, and reported can make it virtually impossible for any of us to comprehend the risk. Clear and accurate risk reporting is vital for helping us understand important risks in our lives and in society, but the entire process can be derailed if we choose measures that don’t accurately reflect risk or that muddy the waters of exactly what the risk is.

The Science of Availability

Which presidential candidate is doing more advertising this year? Which college football team has been the most dominant over the last five years? Who has had the most songs on the Hot 100 over the last five years? You can probably come up with an intuitive answer to (at least one of) these questions even if you don’t follow politics, college football, or pop music very closely. But what you are doing when you come up with an intuitive answer isn’t really answering the question, but instead relying on substitution and the availability heuristic.

 

In Thinking Fast and Slow, Daniel Kahneman writes, “We defined the availability heuristic as the process of judging frequency by the ease with which instances come to mind.” So if you recently saw a few ads from the Trump Campaign, then your mind would probably intuit that his campaign is doing more advertising. If you remember that LSU won the college football national championship last year, then you might have answered LSU; but if you see lots of people wearing Alabama hats on a regular basis, you might answer Alabama instead. And if you recently heard a Taylor Swift song, then your intuitive guess might be that she has had the most Hot 100 hits.

 

Kahneman continues, “The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.” When we are asked to guess how often an event happens or what percent of a category fits a certain characteristic, our brains flip back through short-term memory for examples that match what we are looking for. The easier it is to remember an example the more weight we give to it.

 

I don’t really know who is doing more advertising, but I do know that I have seen a lot of Trump ads on YouTube, so it intuitively felt like he was doing more advertising, even though I might have just picked one channel where his ads were more salient. Overall, he may be doing less than the Biden campaign. Similarly, I didn’t initially remember that LSU won the national championship last year, but I did see someone wearing an Alabama sweatshirt recently, and that team came to mind quickly when thinking of dominant football programs. I also don’t have a clue who has had the most top 100 hits in the last 5 years, but people in my orbit on Twitter frequently post things relating to Taylor Swift, so her name came to mind easily when guessing for the top 100 hits. I wasn’t doing any deep thinking; I was just scratching the surface of my memory for an easy answer.

 

Throughout Thinking Fast and Slow, Kahneman reveals instances where our thinking appears to be deep and nuanced, but is really quick, intuitive, and prone to errors. In most instances we don’t do any deep calculation or thinking, and just roll with the intuitive answer. But our intuition is often faulty, incomplete, and based on a substitution for the real question we are being asked. This might not have high stakes when it means we are inaccurately estimating divorce rates for celebrities (an example from the book), but it can have high stakes in other decision-making areas. If we are looking to buy a home and are concerned about flood risk, we will incorrectly weight the risk of a flood at a property if there have been a lot of news stories about flooding from a hurricane in the Gulf of Mexico. This could influence where we choose to live and whether we pay for expensive insurance or not. Little assumptions and misperceptions can nudge us in critical directions, either positive or negative, and change whether we invest for our futures, fudge our taxes, or buy a new car. Recognizing that our brains make mistakes based on thinking strategies like the availability heuristic can help us in some large decision-making areas, so it is important to understand how our brains work, and where they can go wrong.

The Environment of the Moment

“The main moral of priming research is that our thoughts and our behavior are influenced, much more than we know or want, by the environment of the moment. Many people find the priming results unbelievable, because they do not correspond to subjective experience. Many others find the results upsetting, because they threaten the subjective sense of agency and autonomy.”

 

Daniel Kahneman includes the above quote in his book Thinking Fast and Slow when recapping his chapter about anchoring effects. The quote highlights the surprising and conflicting reality of research on priming and anchoring effects. The research shows that our minds are not always honest with us, or at least are not capable of consciously recognizing everything taking place within them. Seemingly meaningless cues in our environment can influence a great deal of what takes place within our brains. We can become more defensive, more likely to donate to charity, or more prone to think certain thoughts because of symbols, ideas, and concepts present in our environment.

 

We all accept that when we are hungry, when our allergies are overwhelming, or when we are frustrated from being cut off on the freeway, our behaviors will be changed. We know these situations will make us less patient, more likely to glare at someone who didn’t mean to offend us, and more likely to grab a donut for breakfast because we are not in the mood for flavor-lacking oatmeal. But somehow, even though we know external events are influencing our internal thinking and decision-making, this still seems to be in our conscious control in one way or another. A hearty breakfast, a few allergy pills, and a few deep breaths to calm us down are all we need to get back to normal and be in control of our minds and behavior.

 

It is harder to accept that our minds, moods, generosity, behavior towards others, and stated beliefs could be impacted just as easily by factors that we don’t even notice. We see some type of split between being short with someone because we are hungry, and being short with someone because an advertisement on our way to work primed us to be more selfish. We don’t believe that we will donate more to charity when the charity asks for a $500 donation rather than a $50 donation. In each of these situations our conscious and rational brain produces an explanation for our behavior that is based on observations the conscious mind can make. We are not aware of the primes and anchors impacting our behavior, so consciously we don’t believe they have any impact on us at all.

 

Nevertheless, research shows that our minds are not as independent and controllable as we subjectively believe. Kahneman’s quote shows that traditional understandings of free will fall down when faced with research on priming and anchoring effects. We don’t like to admit that random and seemingly innocuous cues in the environment of the moment shape us, because doing so threatens the narratives and stories we want to believe about who we are, why we do the things we do, and how our society is built. It is scary, possibly upsetting, and violates basic understandings of who we are, but it is accurate and important to accept if we want to behave and perform better in our lives.

Cause and Chance

Recently I have written a lot about our mind’s tendency toward causal thinking, and how this tendency can sometimes get our minds in trouble. We make associations and predictions based on limited information and we are often influenced by biases that we are not aware of. Sometimes, our brains need to shift out of our causal framework and think in a more statistical manner, but we rarely seem to do this well.

 

In Thinking Fast and Slow, Daniel Kahneman writes, “The associative machinery seeks causes. The difficulty we have with statistical regularities is that they call for a different approach. Instead of focusing on how the event at hand came to be, the statistical view relates it to what could have happened instead. Nothing in particular caused it to be what it is – chance selected it from among its alternatives.”

 

This is hard for us to accept. We want there to be a reason why one candidate won a toss-up election and the other lost. We want there to be a reason why the tornado hit one neighborhood and not the adjacent neighborhood. Our mind wants to find patterns; it wants to create associations between events, people, places, and things. It isn’t happy when there is a large amount of data, unknown variables, and some degree of randomness that can influence exactly what we observe.

 

Statistics, however, isn’t concerned with our need for intelligible causal structures. Statistics is fine with a coin flip coming up heads 9 times in a row, and the 10th flip still having a 50-50 shot of being heads.
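
A quick simulation (my own sketch, not an example from the book) shows what that independence means: collect only the sequences that start with nine heads, and the tenth flip still comes up heads about half the time.

```python
import random

random.seed(0)
tenth_flips = []

# Flip ten fair coins at a time; whenever the first nine are all heads,
# record whether the tenth flip was also heads.
while len(tenth_flips) < 500:
    run = [random.random() < 0.5 for _ in range(10)]
    if all(run[:9]):
        tenth_flips.append(run[9])

print(sum(tenth_flips) / len(tenth_flips))  # roughly 0.5
```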

 

Our minds don’t have the ability to hold multiple competing narratives at one time. In national conversations, we seem to want to split things into two camps (maybe this is just an artifact of the United States having a winner-take-all political system), where we have two sides to an argument and two ways of thinking and viewing the world. I tend to think in triads, and my writing often reflects that, with me presenting a series of three examples of a phenomenon. When we need to hold 7, 15, or 100 different potential outcomes in our mind, we are easily overwhelmed. Accepting strange combinations that don’t fit with a simple this-or-that causal structure is hard for our minds, and in many cases being so nuanced is not very rewarding. We can generalize and make substitutions in these complex settings and usually do just fine. We can trick ourselves into believing that we think statistically, even if we are really only justifying the causal structures and hypotheses that we want to be true.

 

However, sometimes, as in some elections, in understanding cancer risk, and in making cost-benefit analyses of traffic accidents for freeway construction, thinking statistically is important. We have to understand that there is a range of outcomes, and only so many predictions we can make. We can develop aids to help us think through these statistical decisions, but we have to recognize that our brains will struggle. By understanding our causal tendencies and desires, and recognizing the difficulty of accepting statistical information, we can set up structures that enable us to make better decisions.

Rarely Stumped

Daniel Kahneman starts one of the chapters in his book Thinking Fast and Slow by writing, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.”

 

When I read this quote I am reminded of Gus, the father, in My Big Fat Greek Wedding. He is always ready to show how every word comes from a Greek root, even a Japanese word like kimono. He is sure of his intellect, sure that his heritage is perfect and is the foundation of all that is good in the world. He trusts his instincts and intuitions to a hilarious extent, even when he is clearly wrong and even when his decisions are gift-wrapped and planted in his mind in an almost Inception style.

 

His character is part caricature, but it is revealing of what Kahneman explains with the quote above. Our minds are good at finding intuitive answers that make sense of the world around us, even if we really don’t have any idea what is going on. We laugh at Gus and don’t consider ourselves to be guilty of behaving like him, but the only difference between most of us and Gus is that Gus is an exaggeration of the intuitive dogma and sense of self-worth and assurance that we all live with.

 

We scroll through social media, and trust that our initial judgment of a headline or post is the right frame for how to think about the issue. We are certain that our home remedy for tackling bug bites, cleaning windows, or curing a headache is based on sound science, even if it does nothing more than produce a placebo effect. We find a way to fit every aspect of our lives into a comprehensive framework where our decisions appear rational and justified, with us being the hero (or innocent victim if needed) of the story.

 

We should remember that we have a propensity to believe that we are always correct, that we are never stumped. We should pause, ask more questions, think about what is important to know before making a decision, and then deeply interrogate our thoughts to decide if we really have obtained meaningful information to inform our opinions, or if we are just acting on instinct, heuristics, self-interest, or groupthink. We cannot continue believing we are right and pushing baseless beliefs onto others when we have no real knowledge of an issue. We shouldn’t assume things are true just because they happen to align with the story we want to believe about ourselves and the world. When it comes to crucial issues and our interactions and relationships with others, we need to think more critically, and recognize when we are assuming we are right. If we can pause at those times and think more deeply, gather more information, and ask more questions of ourselves, we can have more accurate and honest interactions and relationships. Hopefully this will help us have more meaningful lives that better connect us and better develop the community we all need in order to thrive.