Sunk Costs & Public Commitment

The sunk cost fallacy is an error in human judgment that we should all try to keep in mind. Thinking about sunk costs and how we respond to them can help us make better decisions in the future. It is one small avenue of cognitive psychology research that we can act on and see immediate benefits from in our lives.
 
 
Sunk costs pop up all over the place. Are you hesitant to change lines at the grocery store because you have already been waiting in one line for a while and might as well stick it out? Did you start a landscaping project that isn’t turning out the way you want, but you are afraid to give up and try something else because you have already put so much time, money, and effort into the current project? Would you be mad at a politician who wanted to pull out of a deadly war and give up because doing so would mean that soldiers died in vain?
 
 
All of these examples are instances where the sunk cost fallacy can lead us to make worse decisions. In his book The Better Angels of Our Nature, Steven Pinker writes, “though psychologists don’t fully understand why people are suckers for sunk costs, a common explanation is that it signals a public commitment.” In the examples above, changing course signals something weak about us. Switching grocery store lines suggests we are impatient and unwilling to stick with our original choice. Giving up on the sprinkler repair and putting in cheap rock suggests we are not committed to our vision of a perfect lawn. And ending a fight suggests we are not truly patriotic and do not truly value the lives of the soldiers already lost in the war. In each of these areas, we may feel pressured to persist with an original decision that has become more costly than we expected. Even as costs continue to mount, we feel a need to stay the course. We fail to recognize that sunk costs are in the past, that we can’t do anything to recoup them, and that we can make more efficient decisions moving forward if we can stop feeling bad about them.
 
 
Tied to the pressure we feel is a misperception of incremental costs. Somehow additional time spent in line, additional effort spent on the lawn, and additional lives lost in battle matter less given everything that has already passed. “An increment is judged relative to the previous amount,” writes Pinker. One more life lost in a war doesn’t feel as tragic once many lives have already been lost. Another hundred dollars on sprinkler materials doesn’t feel as costly when we have already put hundreds into our landscaping project (even if $100 in rock would go further and be simpler). And another minute in line at the grocery store is compared to the time already spent waiting, distorting how we think about that time.
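To make the idea concrete, here is a minimal sketch (my own illustration with made-up landscaping numbers, nothing from Pinker) of a decision rule that sidesteps the fallacy: only future costs and future benefits enter the comparison, so money already spent cannot drive the choice.

```python
# Illustrative decision rule: compare options by future payoff only.
# Any money already spent on sprinklers is sunk either way, so it
# deliberately appears nowhere in the comparison.

def best_option(options):
    """Return the option with the highest future benefit minus future cost."""
    return max(options, key=lambda name: options[name][1] - options[name][0])

# Hypothetical numbers: (future_cost, future_benefit) for each choice.
options = {
    "finish sprinkler repair": (300, 400),  # $300 more for a $400-value lawn
    "switch to cheap rock":    (100, 350),  # $100 for a $350-value yard
}

print(best_option(options))  # -> "switch to cheap rock"
```

The hundreds already spent make the rock feel like an admission of defeat, but they cancel out of any forward-looking calculation.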
 
 
If we can reconsider sunk costs, we can start to make better decisions. We can get over the pride we feel waiting out the terrible line at the grocery store. We can reframe our landscaping project, make the simpler decision, and begin enjoying our time again. And we can save lives by not continuing fruitless wars simply because we don’t want those who already died to have died in vain. Changing our relationship to sunk costs and how we consider incremental costs can have an immediate benefit in our lives, one of the few relatively easy lessons we can learn from cognitive psychology research.

Causal Illusions

In The Book of Why Judea Pearl writes, “our brains are not wired to do probability problems, but they are wired to do causal problems. And this causal wiring produces systematic probabilistic mistakes, like optical illusions.” This creates problems for us when data correlate even though no causal link connects the outcomes. According to Pearl, our causal thinking “neglects to account for the process by which observations are selected.” We don’t always realize that we are taking a sample, that our sample could be biased, and that structural factors independent of the phenomenon we are trying to observe could greatly impact the observations we actually make.
Pearl continues, “We live our lives as if the common cause principle were true. Whenever we see patterns, we look for a causal explanation. In fact, we hunger for an explanation, in terms of stable mechanisms that lie outside the data.” When we see a correlation, our brains instantly start looking for a causal mechanism that can explain it. We don’t often look at the data itself and ask whether some process in the data collection led to the outcomes we observed. Instead, we assume the data is correct and that it reflects an outside, real-world phenomenon. This is the source of many of the causal illusions that Pearl describes in the book. Our minds are wired for causal thinking, and we will invent causality when we see patterns, even if there truly isn’t a causal structure linking them.
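Pearl’s point about the selection of observations can be simulated directly. The toy example below (my own illustration, not code from the book) draws two completely independent traits, then keeps only the observations whose combined score clears a bar, the way an admissions committee or a biased sample might. A negative correlation appears out of nowhere.

```python
# Two independent traits become correlated once we select on their sum.
import random

random.seed(0)
population = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(100_000)]

def correlation(pairs):
    """Pearson correlation of a list of (x, y) pairs."""
    n = len(pairs)
    mean_x = sum(x for x, _ in pairs) / n
    mean_y = sum(y for _, y in pairs) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in pairs) / n
    var_x = sum((x - mean_x) ** 2 for x, _ in pairs) / n
    var_y = sum((y - mean_y) ** 2 for _, y in pairs) / n
    return cov / (var_x * var_y) ** 0.5

# The selection step: we only ever observe cases where x + y clears a bar.
observed = [(x, y) for x, y in population if x + y > 1.5]

print(round(correlation(population), 3))  # ~0.0: no link in the full population
print(round(correlation(observed), 3))    # clearly negative: a causal illusion
```

Nothing about either trait changed; only the process by which observations were selected did.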
It is in this spirit that we attribute negative personality traits to people who cut us off on the freeway. We assume they don’t like us, that they are terrible people, or that they are rushing to the hospital with a sick child, so that our being cut off has a satisfying causal explanation. When a particular type of car stands out and we start seeing that car everywhere, we misattribute our increased attention to the type of car and assume that there really are more of those cars on the road now. We assume people suddenly find them more reliable or more appealing, and we treat those imagined purchases as a causal mechanism explaining why we now see the cars everywhere. In both of these cases we are creating causal pathways in our minds that are little more than causal illusions, but we want to find a cause for everything and we don’t always realize that we are doing so. When making important decisions, we should be aware of these causal illusions, think about how the data came to mind, and ask whether a causal illusion or cognitive error might be at play.

Co-opting Mental Machinery

The human mind is great at pattern recognition, but it is not the only brain that can recognize a pattern. Pigeons can recognize patterns for food distribution with button presses, mice can remember mazes and navigate through complex patterns to a reward, and other animals can recognize patterns in hunting, mating, and other activities. What humans do differently is use pattern recognition to determine causal structures by imagining and testing alternative hypotheses. This is a crucial step beyond the pattern recognition of other animals.
In The Book of Why Judea Pearl writes, “It is not too much of a stretch to think that 40,000 years ago, humans co-opted the machinery in their brain that already existed for pattern recognition and started to use it for causal reasoning.” This idea is interesting because it explains our pattern recognition linkage with other animals and helps us think about how brain structures and ways of thinking may have evolved.
In isolation, a brain process is interesting, but not as interesting as when considered alongside similar brain processes. When we look at pattern recognition and its similarities to causal reasoning, we see a jumping-off point. We can see how brain processes that helped us in one area opened up new possibilities through development. This helps us think more deeply about the mental abilities that we have.
The ways we think and how our brains work are not static. Cultural factors, environmental factors, and existing brain processes can all shape how our brains work and evolve, both individually and as a species. As Pearl notes, it is likely that many of our brain processes co-opted other mental machinery for new purposes. Very little of what we see in human psychology can be well understood in isolation. Asking why and how evolution could have played a role is crucial to understanding who we are now and how we got to this point. Causality is not something that just existed naturally in the brain. It was built by taking other processes and co-opting them for new purposes, and those new purposes have allowed us to do magnificent things like build rockets, play football, and develop clean water systems.

The Human Need for Certainty

Throughout the book Risk Savvy, Gerd Gigerenzer discusses the challenges people face in thinking statistically, assessing different probable outcomes, and understanding risk. Gigerenzer also discusses how important it is that people become risk literate, and how the future of humanity will require that people better understand risk and uncertainty. What this future requires, he explains, is fighting against aspects of human psychology that are common to all of us and form part of our core nature. One aspect in particular that Gigerenzer highlights as a problem for humans moving forward is our need for certainty.

 

“Humans appear to have a need for certainty, a motivation to hold onto something rather than to question it,” he writes. Whether it is our religion, our plans for retirement, or the brand of shoes we prefer, we have a need for certainty. We don’t want to question whether our religious, political, or social beliefs are correct. It is more comforting to adopt beliefs and be certain that we are correct. We don’t want to continuously re-evaluate our savings plans and open ourselves to the possibility that we are not doing enough to save for retirement. And we like to believe that we purchased the best running shoes, that we bought the most sustainable shoes for the planet, and that our shoe choices are the most popular. In all of these areas, ambiguity makes our decisions harder whereas a feeling of certainty gives us confidence and allows us to move through the world. In many ways, our need for certainty is simply a practicality. There are unlimited possibilities and decisions for us to make every day. Adopting certainty eliminates many possibilities and choices, simplifying our lives and allowing us to move through the world without having to question every action of every second of every day.

 

But in the modern world, humans have to be more comfortable living with ambiguity and have to be able to give up certainty in some areas. “For the mature adult,” Gigerenzer writes, “a high need for certainty can be a dangerous thing.” We live with risk and need to be able to adjust as we face new risks and uncertainties in our lives. We like to hold onto our beliefs and are not comfortable questioning our decisions, but doing so can be necessary in order to move forward and live in harmony in a changing world with new technologies, different demographics, and new uncertainties. A need for certainty can lead people to become dogmatic, to embrace apologetics and discount science that demonstrates errors in their thinking, and to ignore the realities of a changing world. One way or another, we have to find ways to be flexible and adjust our choices and plans according to risk, otherwise we are likely to make poor choices and be crushed when the world does not align itself with our beliefs and wishes.

Endowment Effects

In his book Thinking Fast and Slow, Daniel Kahneman discusses an experiment he helped run to explore the endowment effect. The endowment effect is a cognitive bias that helps explain our attachment to things and our unwillingness to part with objects, even when we are offered something greater than the objective value of the object itself. We endow the object with greater significance than is really warranted, and in his book, Kahneman shows that this has been studied with Super Bowl tickets, wine, and coffee mugs.

 

Kahneman helped run experiments at a few different universities where college students were randomly given coffee mugs with the university logo. The mugs were worth about $6 each and were randomly distributed to about half of a classroom. Students were allowed to buy or sell the mugs, and the researchers saw a divergence in the value assigned to the mugs by the students who randomly obtained a mug and those who didn’t. Potential sellers were willing to part with the mug for about $7, a price above the actual value of the mug. Buyers, however, were generally only willing to purchase a mug for about $3, or half the value of the mug.

 

Kahneman suggests that the endowment effect explains the unequal values assigned to the mug by those who received one and those who didn’t. It is unlikely that those who received the mugs particularly wanted a university mug or valued one more than the students who didn’t receive a mug. Those students should have been willing to sell the mug for $3, money that could be used to purchase something they actually wanted rather than a random mug. To explain why they didn’t sell their mugs, Kahneman suggests that the mugs became endowed with additional value by those who received them.

 

A further study showed similar effects. When all students in a class randomly received either a chocolate bar or a mug, researchers found that fewer students were willing to make a trade than the researchers predicted. Again, it is unlikely that a random distribution of mugs and candy perfectly matched the mug-versus-candy preferences of the students. There should have been plenty of students who could have used a sugar boost more than an extra mug (and vice versa), but little trading actually took place. It appears that once someone randomly receives a gift, even one of very small value, they are not likely to give it up. The gift becomes endowed with some meaning beyond its pure utility and value.
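The logic of the researchers’ prediction is easy to simulate. In the sketch below (assumed valuation numbers of my own, not Kahneman’s data), items are assigned at random, so buyers’ and sellers’ underlying valuations come from the same distribution and roughly half of the pairings should produce a trade. Adding an ownership premium to the sellers’ asking prices collapses the trading volume, which is the pattern the researchers actually observed.

```python
# How an endowment premium chokes off trades between random buyers and sellers.
import random

random.seed(1)

def trade_rate(endowment_premium, n_pairs=100_000):
    """Fraction of buyer/seller pairs that agree on a trade."""
    trades = 0
    for _ in range(n_pairs):
        seller_value = random.uniform(1, 6)  # owner's underlying valuation
        buyer_value = random.uniform(1, 6)   # same distribution: random assignment
        ask = seller_value + endowment_premium
        if buyer_value >= ask:               # a trade happens only if bid >= ask
            trades += 1
    return trades / n_pairs

print(trade_rate(endowment_premium=0))  # ~0.5: half the mugs should change hands
print(trade_rate(endowment_premium=4))  # ~0.02: ownership premium kills trading
```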

 

Kahneman describes part of what takes place in our minds when the endowment effect is at work, “the shoes the merchant sells you and the money you spend from your budget for shoes are held for exchange. They are intended to be traded for other goods. Other goods, such as wine and Super Bowl tickets, are held for use to be consumed or otherwise enjoyed. Your leisure time and the standard of living that your income supports are also not intended for sale or exchange.”

 

The random mug or candy bar was not seen as an objective item to be traded or bartered in exchange for something we actually want. It was viewed as a windfall over the status quo, and thus its inherent value to the individual was greater than the actual value of the object. Kahneman suggests that this is why so few students traded candy for mugs, and why mug sellers asked far more than what mug buyers wanted to pay in his experiments. The endowment effect is another example of how the emotional valence and narrative surrounding an otherwise objectively unimportant object can shape our behaviors in ways that seem irrational. Next spring when you are trying to de-clutter your house, remember this post and the endowment effect. Remember that you are imbuing objects with value simply because you happen to own them, and remember that you would only pay half price for them if they were offered to you for purchase now. Hopefully that helps you minimize the number of mugs you own and declutter some of your cabinets.

Regression to the Mean Versus Causal Thinking

Regression to the mean, the idea that there is an average outcome to be expected and that over time individual outliers will revert back toward that average, is a boring phenomenon on its own. If you think about it in the context of driving to work and counting your red lights, you can see why it is a rather boring idea. If you normally hit 5 red lights, and one day you manage to get to work with just a single red light, you probably expect that the following day you won’t have as much luck with the lights and will probably have more red lights than your lucky one-red-light commute. Conversely, if you have a day where you manage to hit every possible red light, you would probably expect to have better traffic luck the next day and be somewhere closer to your average. This is regression to the mean. Having only one red light one day, or hitting every red, doesn’t cause the next day’s traffic lights to behave any differently, but you know you will probably see a more average count of reds versus greens – no causal explanation involved, just random traffic light luck.
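A few lines of simulation make the point (my own toy numbers: eight lights, each independently red 60% of the time, so about five reds is typical). Conditioning on an extreme day changes nothing about the next day, which drifts right back to the average.

```python
# Regression to the mean in a simulated commute: tomorrow ignores today.
import random

random.seed(2)
NUM_LIGHTS, P_RED = 8, 0.6  # ~4.8 red lights on an average commute

def reds_today():
    """Count how many of the lights are red on one commute."""
    return sum(random.random() < P_RED for _ in range(NUM_LIGHTS))

pairs = [(reds_today(), reds_today()) for _ in range(100_000)]  # (today, tomorrow)

after_lucky = [tomorrow for today, tomorrow in pairs if today <= 1]
after_unlucky = [tomorrow for today, tomorrow in pairs if today == NUM_LIGHTS]

print(sum(after_lucky) / len(after_lucky))      # ~4.8: luck doesn't carry over
print(sum(after_unlucky) / len(after_unlucky))  # ~4.8: neither does bad luck
```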

 

But for some reason this idea is both fascinating and hard to grasp in other areas, especially if we think that we have some control of the outcome. In Thinking Fast and Slow, Daniel Kahneman helps explain why it is so difficult in some settings for us to accept regression to the mean, what is otherwise a rather boring concept. He writes,

 

“Our mind is strongly biased toward causal explanations and does not deal well with mere statistics. When our attention is called to an event, associative memory will look for its cause – more precisely, activation will automatically spread to any cause that is already stored in memory. Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.”

 

Unless you truly believe that there is a god of traffic lights who rules over your morning commute, you probably don’t assign any causal mechanism to your luck with red lights. But when you are considering how well a professional golfer played on the second day of a tournament compared to the first day, or whether intelligent women marry equally intelligent men, you are likely to have some causal idea come to mind. The golfer was more or less complacent on the second day – the highly intelligent women have to settle for less intelligent men because the highly intelligent men don’t want an intellectual equal. These are examples Kahneman uses in the book, and both present plausible causal mechanisms, but as Kahneman shows, the simpler though more boring answer is regression to the mean. A golfer who performs spectacularly on day one is likely to be less lucky on day two. A highly intelligent woman is likely to marry a man with intelligence closer to average just by statistical chance.

 

When regression to the mean violates our causal expectations it becomes an interesting and important concept. It reveals that our minds don’t simply observe an objective reality; they impose causal structures that fit preexisting narratives. Our causal conclusions can be quite inaccurate, especially if they are influenced by biases and prejudices that are unwarranted. If we keep regression to the mean in mind, we might lose some of our exciting narratives, but our thinking will be more sound and our judgments more clear.

Anchoring Effects

Anchoring effects were one of the psychological phenomena I found most interesting in Daniel Kahneman’s book Thinking Fast and Slow. In many situations in our lives, random numbers seem to be able to influence other numbers that we consciously think about, even when there is no reasonable connection between the random number we see and the numbers we consciously use for another purpose. As Kahneman writes about anchoring effects, “It occurs when people consider a particular value for an unknown quantity before estimating that quantity.”

 

Several examples of anchoring effects are given in the book. In one instance, judges were asked to assess how much a night club should be fined for playing loud music long after its town’s quiet hours. Real-life judges who have to make these legal decisions were presented with the name of the club and information about the violation of the noise ordinance. The fictitious club was named after the fictitious street it was located along. In some instances the club name was something along the lines of 1500 First Street, and in other instances it was something like 10 First Street. Judges consistently assessed a higher fine against the club named 1500 First Street than against the club named 10 First Street. Seemingly random and unimportant information, the numbers in the street address of the club’s name, had a real impact on the amount that judges on average thought the club should be fined.

 

In other examples of anchoring effects, Kahneman shows that we come up with different guesses of how old Gandhi was when he died depending on whether we are first asked if he was older than 35 or younger than 114. In another experiment, a random wheel spin influenced the guesses people offered for the number of African nations in the UN. In all these examples, when we have to think of a number that we don’t know, or one we can apply subjective judgment to, other random numbers can influence what comes to mind.
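The textbook account of this process is anchoring and insufficient adjustment: we start from the number in front of us and adjust toward our own belief, but not far enough. The toy model below is my own sketch of that account (the adjustment weight is made up, not a figure from Kahneman), using the Gandhi question as the example.

```python
# Anchoring and (insufficient) adjustment as a one-line model.
def anchored_estimate(anchor, private_belief, adjustment=0.6):
    """Move from the anchor toward one's own belief, but only part of the way."""
    return anchor + adjustment * (private_belief - anchor)

# The same private hunch about Gandhi's age at death (he died at 78),
# pulled in opposite directions by the two anchors from the question:
print(anchored_estimate(anchor=35, private_belief=78))   # ~60.8, dragged low
print(anchored_estimate(anchor=114, private_belief=78))  # ~92.4, dragged high
```

With adjustment=1.0 the anchor would vanish entirely; the experiments suggest real adjustment stops well short of that.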

 

This can have real consequences in our lives when we are looking to buy something or make a donation or investment. Retailers may present us with high anchors in an effort to prime us to be willing to accept a higher price than we would otherwise pay for an item. If you walk into a sunglass shop and see two prominently displayed sunglasses with very high prices, you might not be as surprised by the high prices listed on other sunglasses, and might even consider slightly lower prices on other sunglasses as a good deal.

 

It is probably safe to say that sales prices in stores, credit card interest rates, and investment management fees are carefully crafted with anchoring effects in mind. Retailers want you to believe that a high price on an item is a fair price, and could be higher if they were not willing to offer you a deal. Credit card companies and investment brokers want you to believe the interest rates and management fees they charge are small, and might try to prime you with large numbers relative to the rates they quote you. We probably can’t completely overcome the power of anchoring effects, but if we know what to look for, we might be better at taking a step back and analyzing the rates and costs a little more objectively. If nothing else, we can pause and doubt ourselves a little more when we are sure we have found a great deal on a new purchase or investment.

Guided by Impressions of System 1

In Thinking Fast and Slow Daniel Kahneman shares research showing how easily people can be influenced by factors that seem completely irrelevant to the mental task they are asked to carry out. People remember rhyming proverbs better than non-rhyming proverbs. People trust a cited research source with an easy-to-pronounce name over one with a difficult, foreign-sounding name. People are also influenced by the quality of paper and the colors used in advertising materials. No one would admit that rhymes, easy-to-say names, or paper quality is why they made a certain decision, but the statistics show that these things can strongly influence how we decide.

 

Kahneman describes the research this way, “The psychologists who do these experiments do not believe that people are stupid or infinitely gullible. What psychologists do believe is that all of us live much of our life guided by the impressions of System 1 – and we often do not know the source of these impressions.”

 

Making tough and important decisions requires a lot of energy. In many instances, we have to make tough decisions that require a lot of mental effort in a relatively short time. We don’t always have a great pen-and-paper template to follow for decision-making, and sometimes we have to come to a conclusion in the presence of others, upping the stakes and increasing the pressure as we try to think through our options. As a result, the brain turns to heuristics reliant on System 1: it uses intuition and quick impressions, and it substitutes an easier question for the hard one actually being asked.

 

We might not know why we intuitively favored one option over another. When we think back on a decision we made, we are engaging System 2 to think deeply, and it is likely to overlook seemingly inconsequential factors such as the color of the paper for the option we picked. It won’t remember that the first salesperson didn’t make much eye contact with us while the second one did, but it will substitute some other marker of competence to give us a reason for trusting salesperson number two more.

 

What is important to remember is that System 1 guides a lot of our lives. We don’t always realize it, but System 1 is passing along information to System 2 that isn’t always relevant for the decision that System 2 has to make. Intuitions and quick impressions can be biased and formed by unimportant factors, but even if we don’t consciously recognize them, they get passed along and factored into our final choice.

Thoughts on Biases

“Anything that makes it easier for the associative machine to run smoothly will also bias beliefs,” writes Daniel Kahneman in his book Thinking Fast and Slow. Biases are an unavoidable part of our thinking. They can lead to terrible prejudices, habits, and meaningless preferences, but they can also help save us a lot of time, reduce the cognitive demand on our brains, and help us move smoothly through the world. There are too many decision points in our lives and too much information for us to absorb at any one moment for us to not develop shortcuts and heuristics to help our brain think quicker. Quick rules for associative thinking are part of the process of helping us actually exist in the world, and they necessarily create biases.

 

A bad sushi roll might bias us against sushi for the rest of our life. A jump-scare movie experience as a child might bias us toward romcoms and away from horror movies. And being bullied by a beefy kid in elementary school might bias us against muscular dudes and sports. In each instance, a negative experience is associated in our brains with some category of thing (food, entertainment, people) and our memory is helping us move toward things we are more likely to like (or at least less likely to bring us harm). The consequences can be low stakes, like not going to horror movies, but can also be high stakes, like not hiring someone because their physical appearance reminds you of a kid who bullied you as a child.

 

What is important to note here is that biases are natural and to some extent unavoidable. They develop from our experiences and the associations we make as we move through life and try to understand the world. They can be defining parts of our personality (I only drink black coffee), they can be incidental pieces of us that we barely notice (my doughnut choice order is buttermilk bar, maple covered anything, chocolate, plain glaze), and they can also be far more dangerous (I have an impulse to think terrible things about anyone with a bumper sticker for a certain political figure – and I have to consciously fight the impulse). Ultimately, we develop biases because they help us make easier decisions that will match our preferences and minimize our chances of being upset. They are mental shortcuts, saving us from having to make tough decisions and helping us reach conclusions about entire groups of things more quickly.

 

The goal for our society shouldn’t be to completely eliminate all instances of bias in our lives. That would require too much thought and effort for each of us, and we don’t really have the mental capacity to make so many decisions. It is OK if we are biased toward Starbucks rather than having to make a decision about what coffee shop to go to each morning, or which new coffee shop to try in a town we have never visited.

 

What we should do is work hard to recognize the biases that can really impact our lives and have negative consequences. We have to acknowledge that we have negative impulses toward certain kinds of people, and we have to think deeply about those biases and be aware of how we treat people. Don’t pretend that you move through the world free from problematic biases. Instead, work to see those biases, push against your initial negative reactions, and think about ways you could have more positive interactions with others and find empathy and shared humanity with them. Allow biases to remain when they are helpful or insignificant (be biased toward vegetarian take-out, for example), but think critically about biases that could have real impacts on your life and the lives of others.

Detecting Simple Relationships

System 1, in Daniel Kahneman’s picture of the mind, is the part of our brain that is always on. It is the automatic part of our brain that detects simple relationships in the world, makes quick assumptions and associations, and reacts to the world before we are even consciously aware of anything. It is contrasted against System 2, which is more methodical, can hold complex and competing information, and can draw rational conclusions from detailed information through energy intensive thought processes.

 

According to Kahneman, we only engage System 2 when we really need to. Most of the time, System 1 does just fine and saves us a lot of energy. We don’t have to think critically about what to do when a stoplight changes from green to yellow to red. System 1 can develop an automatic response so that we let off the gas and come to a stop without consciously thinking through every action involved in slowing down at an intersection. However, System 1 has some very serious limitations.

 

“System 1 detects simple relations (they are all alike, the son is much taller than the father) and excels at integrating information about one thing, but it does not deal with multiple distinct topics at once, nor is it adept at using purely statistical information.”

 

When relationships start to get complicated, like the link between human activities and long-term climate change, System 1 lets us down. It also fails us when we see someone who looks like they belong to the Hell’s Angels on a father-daughter date at an ice cream shop, when we see someone who looks like an NFL linebacker in a book club, or when we see a little old lady driving a big truck. System 1 makes assumptions about the world based on simple relationships and is easily surprised. It can’t reason about unique or edge cases, and it can’t hold complicated statistical information about the multiple actors and factors that influence the outcome of events.
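A worked base-rate example shows the kind of purely statistical information System 1 fumbles. The numbers below are invented for illustration (they are not from Kahneman): even if members of a feared group almost always wear a certain look, the group can be so rare that the look is almost never evidence of membership.

```python
# Bayes' rule with a small base rate: the stereotype fires far more often
# than the statistics justify. All numbers are made up for illustration.
p_member = 0.001                 # 1 in 1,000 people belongs to the group
p_look_given_member = 0.90       # most members have "the look"
p_look_given_other = 0.05        # but so do 5% of everyone else

p_look = (p_member * p_look_given_member
          + (1 - p_member) * p_look_given_other)
p_member_given_look = p_member * p_look_given_member / p_look

print(round(p_member_given_look, 3))  # ~0.018: under a 2% chance
```

System 1 jumps straight from the look to the category; working the base rate into the judgment takes the deliberate effort of System 2.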

 

System 1 is our default, and we need to remember where its strengths and weaknesses lie. It can help us make quick decisions while driving or catching an apple falling off a counter, but it can’t help us determine whether a defendant in a criminal case is guilty. There are times when our intuitive assumptions and reactions are spot on, but there are many times when they can lead us astray, especially in situations that do not involve simple relationships and that violate our expectations.