A False Sense of Insecurity


The human mind is subject to a lot of cognitive errors and illusions. One cognitive error that we often fall into is a misperception of the frequency of events. If you have ever purchased a new car, you have likely experienced this. Prior to buying a new car, your eye probably wasn’t on the lookout for vehicles of the same make, model, year, and color. But suddenly, once you own a blue Ford Expedition, an orange Mini Cooper, or a silver Camaro, you will feel as though you are seeing more of those cars on the road. A cognitive illusion will make you feel as though suddenly everyone else has purchased the same car as you and that your particular year, make, model, and color of vehicle is growing in popularity (this has even happened to me with rental cars).
 
 
The reality is that other people didn’t all suddenly buy the same car as you. You are not that big of a trendsetter. All that happened is that your focus while driving shifted. You previously never paid attention to similar vehicles when you passed them on the road. You had no reason to think twice about a green Subaru, but now that you drive a green Subaru, every other green Subaru stands out. You remember seeing the car, or it at least becomes salient to your mind, whereas previously you would not have actually thought about the other car. You would have seen it, but you wouldn’t have logged the occurrence in your mind.
 
 
Steven Pinker shows in his book The Better Angels of Our Nature that this same phenomenon happens when we think about violence. News headlines easily mislead us and create a false sense of insecurity. We don’t actually have a real sense or a good understanding of the trends of violence and crime in a given area, but we do have a good sense of what kinds of stories have been on the news lately. As Pinker writes, “if we don’t keep an eye on the numbers, the programming policy ‘If it bleeds, it leads’ will feed the cognitive shortcut ‘The more memorable, the more frequent,’ and we will end up with what has been called a false sense of insecurity.”
 
 
The cognitive shortcut that Pinker mentions is something Daniel Kahneman writes about in his book Thinking Fast and Slow. When we are asked a difficult question, like how common gold BMWs are or how crime trends today compare with crime trends of five years ago, we take a cognitive shortcut to come up with an answer. Instead of diving into statistics and historical records, which is hard work, we substitute an easier question and answer that question instead. The question we answer is, “can I think of memorable instances of this thing?”
 
 
When we ask ourselves that question, our perception and what we happen to have thought about or noticed recently matters a lot. If we never think about Dodge trucks, we won’t think they are very common on the roads. But if we happen to own a Dodge truck, then we are more likely to pay attention to other Dodge trucks on the road, meaning that we will answer the substitute question about their frequency with an overestimation of their actual commonness. The same happens with news reports of violence. Instead of answering the question about trends in violence, we answer the question, “can I remember instances of violence in my city, state, country, or in the world?” If we watch a lot of news, then we are going to hear about every school shooting in the country. We are going to hear about all the robberies and assaults in our city, and we are going to hear about violent acts from across the globe. We are going to remember these events and consequently feel that the world is a dangerous and violent place, even if actual trends in violence and crime are decreasing. This cognitive error, based on a cognitive shortcut, creates a false sense of insecurity about the true nature of violence in our world.
Sunk Costs and Public Commitment


The Sunk Cost fallacy is an example of an error in human judgment that we should all try to keep in mind. Thinking about sunk costs and how we respond to them can help us make better decisions in the future. It is one small avenue of cognitive psychology research that we can act on and see immediate benefits in our lives.
 
 
Sunk costs pop up all over the place. Are you hesitant to change lines at the grocery store because you have already been waiting in one line for a while and might as well stick it out? Did you start a landscaping project that isn’t turning out the way you want, but you are afraid to give up and try something else because you have already put so much time, money, and effort into the current project? Would you be mad at a politician who wanted to pull out of a deadly war and give up because doing so would mean that soldiers died in vain?
 
 
All of these examples are instances where sunk cost fallacies can lead us to make worse decisions. In his book The Better Angels of Our Nature, Steven Pinker writes, “though psychologists don’t fully understand why people are suckers for sunk costs, a common explanation is that it signals a public commitment.” In the examples above, changing course signals something weak about us. We are not patient and not willing to stick out our original decision to wait in one grocery store line relative to another. We are not committed to our vision of a perfect lawn and are willing to give up and put in cheap rock instead of seeing our sprinkler system repair all the way through. And we are not truly patriotic and don’t truly value the lives of soldiers lost in war if we are willing to give up a fight. In each of these areas, we may feel pressured to persist with our original decision which has become more costly than we expected. Even as costs continue to mount, we feel a need to stay the course. We fail to recognize that sunk costs are in the past, that we can’t do anything to recoup them, and that we can make more efficient decisions moving forward if we can avoid feeling bad about sunk costs.
 
 
Tied to the pressure we feel is a misperception of incremental costs. Somehow additional time spent in line, additional effort spent on the lawn, and additional lives lost in battle matter less given everything that has already passed. “An increment is judged relative to the previous amount,” writes Pinker. One more life lost in a war doesn’t feel as tragic once many lives have already been lost. Another hundred dollars on sprinkler materials doesn’t feel as costly when we have already put hundreds into our landscaping project (even if $100 in rock would go further and be simpler). And another minute in line at the grocery store is compared to the time already spent waiting, distorting how we think about that time.
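Pinker’s observation that an increment is judged relative to the previous amount can be made concrete with a toy calculation. The ratio below is my own illustrative stand-in (a rough Weber-style fraction), not a formula from The Better Angels of Our Nature; the point is only that the same one-minute cost shrinks subjectively as the sunk amount grows.

```python
def perceived_increment(increment, already_spent):
    """Toy model: the subjective size of an added cost, judged
    relative to the total including what has already been sunk."""
    return increment / (already_spent + increment)

# One more minute in line feels large when we've waited one minute...
early = perceived_increment(1, already_spent=1)    # 1/2 = 0.5
# ...but small when we've already waited nineteen.
late = perceived_increment(1, already_spent=19)    # 1/20 = 0.05
print(early, late)
```

The objective cost of the extra minute is identical in both cases; only the denominator of past investment has changed, which is exactly the distortion the sunk-cost fallacy exploits.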
 
 
If we can reconsider sunk costs, we can start to make better decisions. We can get over the pride we feel waiting out the terrible line at the grocery store. We can reframe our landscaping and make the simpler decision and begin enjoying our time again. And we can save lives by not continuing fruitless wars because we don’t want those who already died to have died in vain. Changing our relationship to sunk costs and how we consider incremental costs can have an immediate benefit in our lives, one of the few relatively easy lessons we can learn from cognitive psychology research.

The Elephant in the Brain with Psychics and Mediums

In the book The Elephant in the Brain, Robin Hanson and Kevin Simler argue that our own self-interest drives a huge amount of our behavior. On the surface this doesn’t sound like a huge shock, but if you truly look at how deeply our self-interest is tied to everything we do, you start to see that we like to pretend that we don’t act purely out of our own self-interest. Instead, we lie to ourselves and others and create high-minded reasons for our beliefs, behaviors, and actions. But our self-interest is never far behind. It is always there as the elephant in the room (or brain) influencing all that we do even if we constantly try to ignore it.
This is likely what happens when people visit psychics and mediums with the hopes of learning about their future or reconnecting with the spirit of a lost loved one. Mary Roach describes what is going on with psychics, mediums, and their clients in her book Spook, and I think her explanation is a strong argument for the ideas presented by Hanson and Simler in The Elephant in the Brain. She writes:
“It seems to me that in many cases psychics and mediums prosper not because they’re intentionally fraudulent, but because their subjects are uncritical. The people who visit mediums and psychics are often strongly motivated or constitutionally inclined to believe that what is being said is relevant and meaningful with regard to them or a loved one.”
Both psychics/mediums and their subjects are motivated by self-interests that they don’t want to fully own up to. They both deceive themselves in order to appear to genuinely believe the experience. If you can fool yourself then it becomes much easier to fool others, and that requires that you ignore the elephant (your self-interest) in your brain.
Clients want to believe they are really interacting with the spirit of a lost loved one and not being fooled or defrauded. Critical thinking, and any deliberate acknowledgment that they are susceptible to being fooled, is set aside. Instead, the individual’s self-interest acts behind the scenes as they help create the reality they want to inhabit with the help of the psychic or medium.
The psychics and mediums also don’t want to be viewed as fraudsters and quacks. They hide the fact that they have economic and social motivations to appear to have special powers and to signal their authenticity. If a client is uncritical, it helps the entire process and allows both parties to ignore the self-interest acting below the surface. Ultimately, as Roach argues, the process depends both on practitioners who believe their subjects are having authentic experiences and on subjects who believe their psychics and mediums are genuinely communicating with the dead. Without either, and without self-deception on both sides, the whole process would fall apart.
Personally and Politically Disturbed by the Homeless


On the first page of the preface of The Homeless, Christopher Jencks writes about the responses that many Americans had to the rise of homelessness in American cities in the 1970s. He writes, “The spread of homelessness disturbed affluent Americans for both personal and political reasons. At a personal level, the faces of the homeless often suggest depths of despair that we would rather not imagine, much less confront in the flesh. … At a political level, the spread of homelessness suggests that something has gone fundamentally wrong with America’s economic or social institutions.”
I think the two books which most accurately describe the way that I understand our political and social worlds are Thinking Fast and Slow by Daniel Kahneman and The Elephant in the Brain by Kevin Simler and Robin Hanson. Kahneman suggests that our brains are far more susceptible to cognitive errors than we would like to believe. Much of our decision-making isn’t really so much decision-making as it is excuse-making, finding ways to give us agency over decisions that were more or less automatic. Additionally, Kahneman shows that we very frequently, and very predictably, make certain cognitive errors that lead us to inaccurate conclusions about the world. Simler and Hanson show that we often deliberately mislead ourselves, choosing to intentionally buy into our minds’ cognitive errors. By deliberately lying to ourselves and choosing to view ourselves and our beliefs through a false objectivity, we can better lie to others, enhancing the way we signal to the world and making ourselves appear more authentic. [Note: some recent evidence has put some findings from Kahneman in doubt, but I think his general argument around cognitive errors still holds.]
Jencks published his book long before Thinking Fast and Slow and The Elephant in the Brain, but I think his observation hints at the findings that Kahneman, Simler, and Hanson would all write about in the coming decades. People wanted to hold onto beliefs they possibly knew or suspected to be false. They were disturbed by a reality that did not match the imagined reality in which they wanted to believe. They embraced cognitive errors and adopted beliefs and conclusions based on those cognitive errors. They deceived themselves about reality to better appear to believe the myths they embraced, and in the end they developed a political system where they could signal their virtue by strongly adhering to the initial cognitive errors that sparked the whole process.
Jencks’ quote shows why homelessness is such a tough issue for many of us to face. When we see large numbers of people failing and ending up homeless, it suggests that there is something more than individual shortcomings at work. It suggests that somewhere within society and our social structures are points of failure. It suggests that our institutions, from which we may benefit as individuals, are not serving everyone. This goes against the beliefs that reinforce our self-interest, and it is hard to accept. It is much easier to simply fall back on cognitive illusions and errors and to blame those who have failed. We truly believe that homelessness is the problem of individuals because we are deceiving ourselves, and because it serves our self-interest to do so. When we see the homeless, we see a reality we want to ignore and pretend does not exist, because we fear it and fear that we may be responsible for it in some way. We fear that homelessness will necessitate a change in the social structures and institutions that have helped us get to where we are, and that those changes may make things harder for us or somehow diminish our social status. This is why we are so disturbed by homelessness, why we prefer not to think about it, and why we develop policies based on the assumption that people who end up homeless are deeply flawed individuals responsible for their own situation. It is also likely why we have not done enough to help the homeless, why homelessness is becoming a bigger issue in American cities, and why we have been so bad at addressing its real causes. There is some truth to the argument that homelessness is the result of flawed individuals, which is why it is such a strong argument, but we should accept that there are some flawed causal thoughts at play and that it is often in our self-interest to dismiss the homeless as individual failures.
Causal Illusions - The Book of Why


In The Book of Why Judea Pearl writes, “our brains are not wired to do probability problems, but they are wired to do causal problems. And this causal wiring produces systematic probabilistic mistakes, like optical illusions.” This can create problems for us when no causal link exists and when data correlate without any causal connections between outcomes. According to Pearl, our causal thinking, “neglects to account for the process by which observations are selected.” We don’t always realize that we are taking a sample, that our sample could be biased, and that structural factors independent of the phenomenon we are trying to observe could greatly impact the observations we actually make.
Pearl continues, “We live our lives as if the common cause principle were true. Whenever we see patterns, we look for a causal explanation. In fact, we hunger for an explanation, in terms of stable mechanisms that lie outside the data.” When we see a correlation our brains instantly start looking for a causal mechanism that can explain the correlation and the data we see. We don’t often look at the data itself to ask if there was some type of process in the data collection that led to the outcomes we observed. Instead, we assume the data is correct and that the data reflects an outside, real-world phenomenon. This is the cause of many of the causal illusions that Pearl describes in the book. Our minds are wired for causal thinking, and we will invent causality when we see patterns, even if there truly isn’t a causal structure linking the patterns we see.
It is in this spirit that we attribute negative personality traits to people who cut us off on the freeway. We assume they don’t like us, that they are terrible people, or that they are rushing to the hospital with a sick child, so that our being cut off has a satisfying causal explanation. When a particular type of car stands out and we start seeing that car everywhere, we misattribute our increased attention and assume that there really are more of those cars on the road now. We assume that people find them more reliable or more appealing and that people purposely bought those cars, a causal mechanism to explain why we now see them everywhere. In both of these cases we are creating causal pathways in our mind that in reality are little more than causal illusions, but we want to find a cause for everything, and we don’t always realize that we are doing so. It is important that we be aware of these causal illusions when making important decisions, that we think about how the data came to mind, and that we consider whether a causal illusion or cognitive error is at play.
Paternalistic Nudges - Joe Abittan


In their book Nudge, Cass Sunstein and Richard Thaler argue in favor of libertarian paternalism. Their argument is that our world is complex and interconnected, and it is impossible for people to truly make decisions on their own. Not only is it impossible for people to simply make their own decisions, it is impossible for other people to avoid influencing the decisions of others. Whether we decide to influence a decision in a particular way, or whether we decide to try to avoid any influence on another’s decision, we still shape how decisions are presented, understood, and contextualized. Given this reality, the best alternative is to try to help people make consistently better decisions than they would without aid and assistance.

 

The authors describe libertarian paternalism by writing:

 

“The approach we recommend does count as paternalistic, because private and public choice architects are not merely trying to track or to implement people’s anticipated choices. Rather, they are self-consciously attempting to move people in directions that will make their lives better. They nudge.”

 

The nudge is the key aspect of libertarian paternalism. Forcing people into a single choice, forcing them to accept your advice and perspective, and aggressively trying to change people’s behaviors and opinions doesn’t fit within the libertarian paternalism framework advocated by Sunstein and Thaler. Instead, a more subtle form of guidance toward good decisions is employed. People retain maximal choices if desired, and their opinions, decisions, and behaviors are somewhat constrained but almost nothing is completely off the table.

 

“A nudge,” Sunstein and Thaler write, “as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives.”

 

Daniel Kahneman, in his book Thinking Fast and Slow, demonstrated that people make predictable errors and have predictable biases. If we can understand these thinking errors and biases, then we can identify situations in which they are likely to lead people to make suboptimal decisions. To go a step further, as Sunstein and Thaler would suggest, if we are a choice architect, we should design and structure choices in a way that leads people away from predictable cognitive biases and errors. We should design choices in a way that takes those thinking mistakes into consideration and improves the way people understand their choices and options.

 

As a real world example, if we are structuring a retirement savings plan, we can be relatively sure that people will anchor around a default contribution rate built into the plan. If we want to encourage greater retirement savings (knowing that economic data indicate people rarely save enough), we can set the default to 8% or higher, knowing that people may reduce the default rate, but likely won’t eliminate contributions entirely. Setting a high default is a nudge toward better retirement saving. We could choose not to have a default rate at all, but it is likely that people wouldn’t be sure what rate to select and might choose a low rate below inflation, or simply not enter a rate at all, completely failing to contribute anything to the plan. It is clear that there is a better outcome that we, as choice architects, could help people attain if we understand how their minds work and can apply a subtle nudge.
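The anchoring logic behind a default-rate nudge can be sketched in a few lines of code. The model below is purely illustrative: it assumes most savers make a small, noisy adjustment away from whatever default they are shown, that a fraction never touch the form at all, and that rates fall in an arbitrary 0–15% band. None of these numbers or function names come from Nudge; they are assumptions made up for the sketch.

```python
import random

def chosen_rate(default, adjust_sd=2.0, opt_out_prob=0.1, rng=None):
    """Hypothetical anchoring model: a saver starts from the plan's
    default contribution rate and makes a noisy adjustment away from
    it. A fraction never touch the form and keep the default as-is."""
    rng = rng or random
    if rng.random() < opt_out_prob:
        return default  # inertia: the default sticks unchanged
    rate = default + rng.gauss(0, adjust_sd)
    return max(0.0, min(15.0, rate))  # clamp to a plausible 0-15% band

def average_rate(default, n=10_000, seed=42):
    """Mean contribution rate across simulated savers for one default."""
    rng = random.Random(seed)
    return sum(chosen_rate(default, rng=rng) for _ in range(n)) / n

low = average_rate(default=0)   # no meaningful default: savings cluster near zero
high = average_rate(default=8)  # 8% default: the anchor pulls savings up
print(f"mean rate with 0% default: {low:.1f}%")
print(f"mean rate with 8% default: {high:.1f}%")
```

The printed means depend only on the made-up parameters, but the qualitative result is the one the nudge relies on: the whole distribution of chosen rates shifts with the anchor, without any option being forbidden.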
Can We Avoid Cognitive Errors?


Daniel Kahneman is not very hopeful when it comes to our ability to avoid cognitive errors. Toward the end of his book Thinking Fast and Slow, a book all about cognitive errors, predictable biases, and situations in which we can recognize such biases and thinking errors, Kahneman isn’t so sure there is much we can actually do in our lives to improve our thinking.

 

Regarding his own thinking, Kahneman writes, “little can be achieved without considerable effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”

 

Kahneman’s book is fantastic in part because of his humility. It would be easy to take a book on illusions, cognitive errors, biases, and predictable fallacies and use it to show how much smarter you are than everyone else who makes such thinking mistakes. However, Kahneman uses his own real-life examples throughout the book to show how common and easy it is to fall into ways of thinking that don’t actually reflect reality. What is unfortunate, though, is how hard it is to actually take what you learn from the book and apply it to your own life. If the author himself can hardly improve his own thinking, then those of us who read the book likely won’t make big changes in our thinking either.

 

“The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors,” Kahneman continues. While we might not be able to improve our thinking simply by knowing about cognitive errors and being aware of predictable biases, we can at least recognize them in others. This can help us be more thoughtful when we critique or gossip about others (something we all do even if we claim we don’t).

 

Beyond improving the way we gossip or judge others, Kahneman’s research and his book are incredibly valuable for anyone who is in a design focused role. If you are creating a layout for a webpage, a seating arrangement at a restaurant, or the standard operating procedures for a company, you have an opportunity to design and develop a process and flow that takes cognitive errors and predictable biases into account. Because it is easier to observe others making mistakes than to observe those mistakes in ourselves, we can watch for situations where people are led astray, and help get them back on course. We can develop systems and structures that take our biases and cognitive errors into account, and minimize the damage they may do. We can set the world up to help guide us in a reasonable way through our cognitive errors and biases, but only if we know what to look for.
The Remembering Self and Time - Joe Abittan


Time, as we have known it, has only been with human beings for a small slice of human history. The story of time zones is fascinating, and really began once railroads connected the United States. Before we had a standardized system for operating within time, human lives were ruled by the cycle of the sun and the seasons, not by the hands of a watch. This is important because it suggests that the time bounds we put on our lives, the hours of our schedules and work days, and the way we think about the time duration of meetings, movies, a good night’s sleep, and flights is not something our species truly evolved to operate within.

 

In Thinking Fast and Slow, Daniel Kahneman shows one of the consequences of human history being out of sync with modern time. “The mind,” he writes, “is good with stories, but it does not appear to be well designed for the processing of time.”

 

I would argue that this makes sense and should be expected. Before we worked set schedules defined by the clock, before we could synchronize the start of a football game with TV broadcasts across the world, and before we all needed to be at the same place at precisely the right time to catch a departing train, time wasn’t very important. It was easy to tie time with sunrise, sunset, or mid-day compared to a 3:15 departure or a 7:05 kick-off. The passage of time also didn’t matter that much. The difference between being 64 and 65 years old wasn’t a big deal for humans that didn’t receive retirement benefits and social security payments. We did not evolve to live in a world where every minute of every day was tightly controlled by time and where the passage of time was tied so specifically to events in our lives.

 

For me, and I think for Daniel Kahneman, this may explain why we see some of the cognitive errors we make when we remember events from our past. Time wasn’t as important a factor for ancient humans as storytelling was. Kahneman continues,

 

“The remembering self, as I have described it, also tells stories and makes choices, and neither the stories nor the choices properly represent time. In storytelling mode, an episode is represented by a few critical moments, especially the beginning, the peak, and the end. Duration is neglected.”

 

When we think back on our lives, on moments that meant a lot to us, on times we want to relive, or on experiences we want to avoid in the future, we remember the salient details. We don’t necessarily remember how long everything lasted. My high school basketball days are not remembered by the hours spent running UCLAs, by the number of Saturdays I had to be up early for 8 a.m. practices, or by the hours spent in drills. My memories are made up of a few standout plays, games, and memorable team moments. The same is true for my college undergrad memories, the half-marathons I have raced, and my memories from previous homes I have lived in.

 

When we think about our lives we are not good at thinking about the passage of time, about how long we spent working on something, how long we had to endure difficulties, or how long the best parts of our lives lasted. We live with snapshots that can represent entire years or decades. Our remembering self drops the less meaningful parts of experiences from our memories, and holds onto the start, the end, and the best or worst moments from an experience. It distorts our understanding of our own history, and creates memories devoid of a sense of time or duration.

 

I think about this a lot because our minds and our memories are the things that drive how we behave and how we understand the present moment. However, duration neglect helps us see that the reality of our lives is shaped by unreality. We are influenced by cognitive errors and biases, by poor memories, and by distortions of time and experience. It is important to recognize how faulty our thinking can be, so we can develop systems, structures, and ways of thinking that don’t assume we are always correct, but help guide us toward better and more realistic ways of understanding the world.
The Focusing Illusion Continued


I find the focusing illusion as described by Daniel Kahneman in his book Thinking Fast and Slow to be fascinating because it reveals how strange our actual thinking is. I am constantly baffled by the way that our brains continuously and predictably make mistakes. The way we think about, interpret, and understand the world is not based on an objective reality, but is instead based on what our brain happens to be focused on at any given time. As Kahneman writes, what you see is all there is, and the focusing illusion is a product of our brain’s limited ability to take in information combined with the brain’s tendency to substitute simpler questions for difficult and complex ones.

 

In the book, Kahneman asks us to think about the overall happiness of someone who recently moved from Ohio to California and also asks us to think about the amount of time that paraplegics spend in a bad mood. In both situations, we make a substitution. We know that people’s overall happiness and general moods are made up of a huge number of factors, but when we think about the two situations, we focus in on a couple of simple ideas.

 

We assume the person from Ohio is happier in California because the weather in California is always perfect while Ohio experiences cold winters. The economic prospects in California might be better than Ohio, and there are more movie stars and surfing opportunities. Without knowing anything about the person, we probably assume the California move made them happier overall (especially given the additional context and priming based on the weather and job prospects that Kahneman presents in the example in his book).

 

For our assumptions about the paraplegic, we likely go the other way with our thoughts. We think about how we would feel if we were in an accident and lost the use of our legs or arms. We assume their life must be miserable, and that they spend much of their day in a bad mood. We don’t make a complex consideration of the individual’s life or ask for more information about them, we just make an assumption based on limited information by substituting in the question, “How would I feel if I became paralyzed?” Of course, people who are paralyzed or lose the function of part of their body are still capable of a full range of human emotions, and might still find happiness in their lives in many areas.

 

Kahneman writes, “The focusing illusion can cause people to be wrong about their present state of well-being as well as about the happiness of others, and about their own happiness in the future.”

 

We often say that it is important that we know ourselves and that we be true to ourselves if we want to live healthy and successful lives. But research throughout Thinking Fast and Slow shows us how hard it can be. After reading Kahneman’s book, learning about Nudges from Cass Sunstein and Richard Thaler, and learning how poorly we process risk and chance from Gerd Gigerenzer, I constantly doubt how much I can really know about myself, about others, or really about anything. I am frustrated when people act on intuition, sure of themselves and their ideas in complex areas such as economics, healthcare, or education. I am dismayed by advertisements, religions, and political parties that encourage us to act tribally and to trust our instincts and intuitions. It is fascinating that we can be so wrong about something as personal as our own happiness. It is fascinating that we can be so biased in our thinking and judgment, and that we can make conclusions and assumptions about ourselves and others with limited information and not even notice how flawed our thought processes are. I love thinking about and learning about the biases and cognitive errors of our mind, and it makes me pause when I am sure of myself and when I think that I am clearly right and others are wrong. After all, if what you see is all there is, then your opinions, ideas, and beliefs are almost certainly inadequate to actually describe the reality you inhabit.
Sunk-Cost Fallacy - Joe Abittan


Every time I pick the wrong line at the grocery store I am reminded of the sunk-cost fallacy. There are times I will be stuck in line, see another line moving more quickly, and debate internally if I should jump to the other line or just wait it out in the line I’m already in. Once I remember the sunk-cost fallacy, however, the internal debate shifts and I let go of any feeling that I need to remain in the current line.

 

My grocery store example is a comical take on the sunk-cost fallacy, but in real life, this cognitive error can have huge consequences. Daniel Kahneman describes it this way, “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.”

 

We are going to make decisions and choices for where to invest our time, attention, and money that will turn out to be mistakes. At a certain point we have to realize when something is not working and walk away. Doing so, however, requires that we admit failure, that we cut our losses, and that we search for new opportunities. Admitting that we were wrong, giving up on losses, and searching for new avenues is difficult, and it is not uncommon for us to keep moving forward despite our failures, as if we just need to try harder and push more in order to find the success we desire. This is the basis of the sunk-cost fallacy. When we have invested a lot of time, energy, and resources into something it is hard to walk away, even if we would be better off by doing so.

 

Pursuing a career path that clearly isn’t panning out and refusing to try a different avenue is an example of the sunk-cost fallacy. A movie studio that tries to reinvent a character or story over and over despite continued failure is another example. Sitting through the terrible movie the studio produced, rather than leaving the theater early, is also an example of the sunk-cost fallacy. In all of these instances, an investment has been made, and costly efforts to make the investment pay off are undertaken, generally at a greater loss than would be incurred if we had made a change and walked away.

 

When you find yourself saying, “I have already spent so much money on XYZ, or I have already put so much effort into making XYZ work, and I don’t want to just let that all go to waste,” you are stuck in the middle of the sunk-cost fallacy. At this point, it is time to step back, look at other ways you could spend your money and time, and honestly evaluate what your priorities should be. Doing so, and remembering Kahneman’s quote, will help you begin to make the shift to a better use of your time, energy, and resources. It may be embarrassing and disappointing to admit that something is going in the wrong direction, but ultimately, you will end up in a better and more productive spot.