Taboo Tradeoffs

A taboo tradeoff occurs when we face the dilemma of exchanging something we are not supposed to give up for money, food, or other resources. Our time, attention, energy, and sometimes even our happiness are perfectly legitimate to trade, but things like health and safety generally are not. We are expected to exchange our time, attention, and physical labor for money, but we are not expected to exchange our personal health for money. When I first read about taboo tradeoffs in Daniel Kahneman’s book Thinking Fast and Slow in 2019, we had not yet entered a period defined by a global pandemic, in which people began to challenge the taboo against trading health and safety for entertainment, for trials of COVID-19 cures, and to signal their political allegiance.


In the book, Kahneman suggests that holding to hard rules against taboo tradeoffs actually makes us all worse off in the end. “The taboo tradeoff against accepting any increase in risk is not an efficient way to use the safety budget,” he writes. Kahneman’s point is that we can spend huge amounts of resources to ensure that there is absolutely no risk to ourselves, our children, or to others, but that we would be better off allocating those resources in a different way. I think Kahneman is correct, but I think that his message has the potential to be read very differently in 2020, and deserves more careful and nuanced discussion.


“The intense aversion to trading increased risk for some other advantage plays out on a grand scale in the laws and regulations governing risk,” Kahneman writes. The important point is that complete security and safety come at the cost of other advantages. The advantage of driving to a football game is that we get to enjoy watching live sports; the risk is that we could be in a serious traffic accident. The advantage of using bug spray is that we kill the creepy crawlies in the dark corners of the garage; the risk is that we (or a child or pet) could accidentally ingest the poison. The safest choices would be to watch the game on TV and to use a broom and boot to kill the bugs, but if we avoid the risk, then we give up the advantages of seeing live sports and using efficient pest control products.


Kahneman notes that when we make these decisions, we often make them based on a fear of regret more than out of genuine concern for our own health and safety or for the health and safety of others. If you accepted some risk to your child’s safety, and they died, you would feel immense regret and shame, and so you avoid the taboo tradeoff to prevent your own shame. When this plays out across society in millions of large and small examples, we end up in a collectively risk-averse paralysis, and society gives up huge advantages because there is a possibility of risk for some individuals.


To address the current global state of affairs, I think Kahneman would recognize the risk of COVID-19 and would not encourage us to trade our health and safety (and the health and safety of others) for the enjoyment of a birthday party, holiday meal, or other gathering without wearing masks and taking other precautions. Throughout the book Kahneman highlights the difficulties and challenges of thinking through risk. He addresses the many biases that shape how we behave and how we understand the world. He demonstrates the difficulties we have in thinking statistically and understanding complex probabilities. The takeaway from Kahneman in regard to the taboo tradeoff is that there is a level at which our efforts at safety cost more than the advantages we could attain by giving up some of that safety. It isn’t necessarily on each of us individually to decide exactly what level of risk society should accept. It is up to the experts who can engage their System 2 brain and evaluate their biases to help the rest of us better understand and conceptualize risk. We might be able to do some things understanding that there is a level of risk we take when engaging in society in 2020, but adequate precautions can still mitigate that risk and help us maintain a reasonable balance of safety tradeoffs while enjoying our lives.
Daniel Kahneman on Regret

Regret is an interesting emotion and worth deep consideration. It is a System 2 emotion, that is, an emotion we feel when we pause, reflect on our life or actions, and consider the decisions we have or have not made in the past. System 1, the active, fast, and general default mode of our brain, doesn’t feel regret. It lives in the moment and takes action based on our current inputs. It can receive feedback from System 2’s regret and make adjustments with new decisions and actions, but it is too busy with the present moment and environment to be the one building the emotion of regret.


Regret also stems from our ability to imagine different realities. Daniel Kahneman describes it as an emotion associated with loss and mistakes that allows us to self-correct and perceive different opportunities and realities that we might want to live within. It can modify how we act and behave before we have even been faced with a decision. Kahneman writes, “decision makers know that they are prone to regret, and the anticipation of that painful emotion plays a part in many decisions.”


If I pause to think about regret, I typically think about a person on their deathbed, regretful for all the things they never did in their life. A fear of being this person has pushed me to try to do more, be more involved, and have varied and interesting experiences. The trite quote is that people on their deathbed are more regretful for the things they didn’t do than the things in life they did do. In this view, people recognize regret, and it turns into a fear of missing out that spurs people to action before it is too late, before they regret not taking action.


However, this idea may not represent the most powerful feelings of regret we can experience. Kahneman writes, “People expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction.” As an example, Kahneman presents two fictional characters, both of whom have investments with companies A and B. One considers making a greater investment in company A but does not, losing out on $1,200 of potential gains. The other removes some of her investment from company A and ends up with $1,200 less than if she had done nothing. The consensus among people who read Kahneman’s examples indicates that the person who actively pulled money out of company A feels more regret than the person who never added extra investment funds. Doing nothing and missing a potential gain is less regretful than taking an action that creates a perceived loss.


Loss aversion is powerful, and we are more likely to act to avoid losses, and the regret they bring, than to take chances at potential gains. The gains we don’t receive won’t cause as much regret as the losses we do incur. Regret is not just the fear of missing out or of having done too little that I described earlier. It is a powerful emotion that kicks in when we reflect on our life and see that our actions directly led to losses and mistakes. We may change our behaviors and decisions to avoid similar losses in the future, and the regret those losses would bring, but that can drive us into irrational choices in the present moment, made in the hope of not losing out in the future.
Sunk-Cost Fallacy - Joe Abittan

Every time I pick the wrong line at the grocery store I am reminded of the sunk-cost fallacy. There are times I will be stuck in line, see another line moving more quickly, and debate internally if I should jump to the other line or just wait it out in the line I’m already in. Once I remember the sunk-cost fallacy, however, the internal debate shifts and I let go of any feeling that I need to remain in the current line.


My grocery store example is a comical take on the sunk-cost fallacy, but in real life, this cognitive error can have huge consequences. Daniel Kahneman describes it this way, “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.”


We are going to make decisions and choices about where to invest our time, attention, and money that will turn out to be mistakes. At a certain point we have to realize when something is not working and walk away. Doing so, however, requires that we admit failure, cut our losses, and search for new opportunities. That is difficult, and it is not uncommon for us to keep moving forward despite our failures, as if we just need to try harder and push more to find the success we desire. This is the basis of the sunk-cost fallacy. When we have invested a lot of time, energy, and resources into something, it is hard to walk away, even if we would be better off by doing so.


Pursuing a career path that clearly isn’t panning out and refusing to try a different avenue is an example of the sunk-cost fallacy. Movie studios that try to reinvent a character or story over and over despite continued failure are another example. Sitting through the terrible movie the studio produced, rather than leaving the theater early, is also an example. In all of these instances, an investment has been made, and costly efforts to make the investment pay off are undertaken, generally at a greater loss than we would incur if we made a change and walked away.


When you find yourself saying, “I have already spent so much money on XYZ, or I have already put so much effort into making XYZ work, and I don’t want to just let that all go to waste,” you are stuck in the middle of the sunk-cost fallacy. At this point, it is time to step back, look at other ways you could spend your money and time, and honestly evaluate what your priorities should be. Doing so, and remembering Kahneman’s quote, will help you begin to make the shift to a better use of your time, energy, and resources. It may be embarrassing and disappointing to admit that something is going in the wrong direction, but ultimately, you will end up in a better and more productive spot.
Money Isn't About Economic Security (For Most of Us)

Tyler Cowen started his February 28th, 2018 podcast interview with his colleague from George Mason University, Robin Hanson, with the following:


“Robin, if politics is not about policy, medicine is not about health, laughter is not about jokes, and food is not about nutrition, what are podcasts not about?”


Hanson goes on to explain that conversations are not really about imparting useful information and finding out useful things; conversation is more likely about showing off and signaling. When you share new information with someone, you are showing them that you are a valuable ally who knows useful things that might one day be helpful. When you share a particular piece of knowledge, you are signaling that you are the kind of person who would know such things.


I think that Hanson’s views toward signaling are correct and deserve more attention and consideration. A lot of what we do has more to do with signaling than with the reason we would give an observer for what we are doing. Hanson is not alone in recognizing this reality.


In Thinking Fast and Slow, Daniel Kahneman writes the following about money:


“Except for the very poor, for whom income coincides with survival, the main motivators of money-seeking are not necessarily economic. For the billionaire looking for the extra billion, and indeed for the participant in an experimental economics project looking for the extra dollar, money is a proxy for points on a scale of self-regard and achievement. These rewards and punishments, promises and threats, are all in our head.”


Money is not really about economic well-being (for most of us). It’s not really about the things we can purchase or the vacations we can take. Money is really about social status. Having more of it elevates our social status, as does using it for impressive and expensive purposes. There is no objective ranking out there for our social status, but we act as if our social status is tangible and will reveal something important about our lives and who we are. Pursuing money gives us a chance to pursue social status in an oblique way, making it look as though we are doing something for high-minded reasons, when in reality we are trying to climb a social ladder, using money as our measuring stick of success.


Realistically, we are not going to be able to do much of anything about our signaling behaviors, especially if Hanson is correct in estimating that well over 90% of what we do is signaling. However, we can start to acknowledge signaling and choose where and how we send signals about ourselves. We can choose not to rely on money to signal who we are and can seek out healthier avenues for signaling, taking the environmental and social externalities of our signals into consideration.
Competing Biases

I am trying to remind myself that everyone, myself included, operates on a complex set of ideas, narratives, and beliefs that are sometimes coherent, but often conflicting. When I view my own beliefs, I am tempted to think of myself as rational and realistic. When I think of others who I disagree with, I am prone to viewing them in a simplistic frame that makes their arguments irrational and wrong. The reality is that all of our beliefs are less coherent and more complex than we typically think.


Daniel Kahneman’s book Thinking Fast and Slow has many examples of how complex and contradictory much of our thinking is, even if we don’t recognize it. One example is competing biases that manifest within us as individuals and can be seen in the organizations and larger groups that we form. We can be exaggeratedly optimistic and paralyzingly risk averse at the same time, and sometimes this tendency can actually be a good thing for us. “Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.”


On a first read, I would expect the outcome of what Kahneman describes to be gridlock. The optimist (or optimistic part of our brain) wants to push forward with a big new idea and plan. Meanwhile, loss aversion halts any decision-making and prevents new ideas from taking root. The reality, as I think Kahneman would explain, is not a conscious and deliberate gridlock but an unnoticed drift toward certain decisions. The optimism wins out in an enthusiastic way when we see a safe bet or when a company sees an opportunity to capture rents. The loss aversion wins out when the bet isn’t safe enough, and when we want to hoard what we already have. We don’t even realize when we are making these decisions; they just feel like obvious and clear directions, but the reality is that we are constantly being jostled between exaggerated optimism and loss aversion.


Kahneman shows that these two biases are not mutually exclusive even though they may conflict. We can act on both at the same time; we are not exclusively risk-seeking optimists or exclusively risk averse. When the situation calls for it, we apply the appropriate frame at an intuitive level. Kahneman’s quote above shows that this can be advantageous for us, but throughout the book he also shows how biases in certain directions and situations can be costly for us over time as well.


We like simple and coherent narratives. We like thinking that we are one thing or another, that other people are either good or bad and right or wrong. The reality, however, is that we contain multitudes within us, act on competing and conflicting biases, and have more nuance and incongruency in our lives than we realize. This isn’t necessarily a bad thing. We can all still survive and prosper despite the complexity and incoherent beliefs that we hold. Nevertheless, I think it is important that we acknowledge the reality we live within, rather than simply believing the simple stories that we like to tell ourselves.
Narrow Framers

I like to write about the irrational nature of human thinking because it reminds me that I don’t have all the answers figured out, and that I often make decisions that feel like the best choice in the moment but likely aren’t the best choice if I stepped back to consider them more carefully. Daniel Kahneman writes about instances where we fail to be rational actors in his book Thinking Fast and Slow, and his examples have helped me better understand my own thinking process and the contexts in which I make decisions that feel logically consistent but likely fall short of being truly rational.


In one example, Kahneman describes humans as narrow framers, failing to take multiple outcomes into consideration and focusing instead on a limited set of salient possibilities. He writes, “Imagine a longer list of 5 simple (binary) decisions to be considered simultaneously. The broad (comprehensive) frame consists of a single choice with 32 options. Narrow framing will yield a sequence of 5 simple choices. The sequence of 5 choices will be one of the 32 options of the broad frame. Will it be the best? Perhaps, but not very likely. A rational agent will of course engage in broad framing, but Humans are by nature narrow framers.”
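
Kahneman’s arithmetic here is easy to check. Below is a minimal sketch of the 32-option broad frame, with assumed numbers (the +$200/−$100 coin-flip gamble and the 2.5× loss weight are my illustrative choices, not figures from this passage): a narrow framer who judges each gamble alone rejects all five, while a broad framer who evaluates all 2^5 = 32 accept/reject combinations jointly accepts them all.

```python
from itertools import product

# Assumed setup: five independent coin flips, each paying +$200 on heads
# or -$100 on tails, judged by a loss-averse evaluator who feels losses
# 2.5 times as strongly as equivalent gains.
LOSS_WEIGHT = 2.5

def felt_value(net_dollars):
    """Loss-averse valuation of a net dollar outcome."""
    return net_dollars if net_dollars >= 0 else LOSS_WEIGHT * net_dollars

def portfolio_utility(k):
    """Expected felt value of accepting k of the five gambles."""
    outcomes = list(product([200, -100], repeat=k))
    return sum(felt_value(sum(o)) for o in outcomes) / len(outcomes)

# Narrow framing: each gamble judged alone feels like a loser,
# so the narrow framer rejects all five, one decision at a time.
print(portfolio_utility(1))   # -25.0 -> reject

# Broad framing: a single choice among all 2**5 = 32 accept/reject
# combinations, evaluated jointly over their full outcome distribution.
combos = list(product([0, 1], repeat=5))
best = max(combos, key=lambda c: portfolio_utility(sum(c)))
print(best, round(portfolio_utility(sum(best)), 2))   # accept all five
```

The narrow framer ends up at the "accept none" corner of the 32 options, which is far from the best one; the broad framer sees that the bundle’s occasional losses are outweighed by its likely gains.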


In the book Kahneman writes that we think sequentially and address problems only as they arise. We don’t have the mental capacity to hold numerous outcomes (even simple binary outcomes) in our mind simultaneously and make predictions on how the world will respond to each outcome. If we map out decisions and create tables, charts, and diagrams then we have a chance of making rational decisions with complex information, but if we don’t outsource the information to a computer or to pen and paper, then we are going to make narrow short-term choices. We will consider a simple set of outcomes and discount other combinations of outcomes that we don’t expect. In general, we will progress one outcome at a time, reacting to the world and making choices individually as the situation changes, rather than making long-term decisions before a problem has arisen.


Deep thinking about complex systems is hard and our brains default toward lower energy and lower effort decision-making. We only engage our logical and calculating System 2 part of the brain when it is needed, and even then, we only engage it for one problem at a time with a limited set of information that we can easily observe about the world. This means that our thinking tends to focus on the present, without full consideration of the future consequences of our actions and decisions. It also means that our thinking is limited and doesn’t contain the full set of our data that might be necessary for making accurate and rational choices or predictions. When necessary, we build processes, systems, and structures to help our minds be more rational, but that requires getting information out of our heads, and outsourcing the effort to technologies beyond the brain, otherwise our System 2 will be bogged down and overwhelmed by the complexity of the information in the world around us.
Risk Averse and Risk Seeking - Joe Abittan

I would generally categorize myself as somewhat risk averse, but studies from Daniel Kahneman in Thinking Fast and Slow might suggest that I’m not really any different than anyone else. I might just be responding to the set of circumstances that I typically experience, similar to anyone else, and I might just be more aware of times when I am risk averse rather than times when I am more risk seeking. In particular, I might be risk averse in certain situations and categorize those situations correctly, but risk seeking in other situations without recognizing it.


Kahneman uses examples throughout his book to demonstrate that common cognitive errors and psychological tendencies are shared by even the most savvy readers who would pick up a book like Thinking Fast and Slow. Kahneman even uses anecdotes from his own life and his own thoughts to demonstrate how deep knowledge of cognitive biases and errors doesn’t make one immune. After demonstrating how our minds can lead us to be risk averse in some settings and risk seeking in others, Kahneman cautions us against a typical pattern that many of us will fall into: “It is costly to be risk averse for gains and risk seeking for losses.”


On its own, this quote doesn’t seem to reveal anything too interesting, but in the context of Kahneman’s experiments and examples, it reveals a lot about how we behave, whether we are risk seeking or risk averse. When we are offered either a flat sum or a gamble with the potential to win more than the flat sum, we often won’t take the gamble. The guaranteed money is more appealing than the prospect of a larger prize with a small chance of gaining nothing. When it comes to gains, we are often risk averse, preferring the sure thing to the possibility of getting more with the risk of getting nothing or facing a cost.


However, we become risk seeking when we stand to lose something. As long as there is a small outside chance that we won’t lose anything, we will avoid a certain loss, risk a larger loss, and take a gamble. In Kahneman’s example he demonstrates how people will quickly turn down a sure loss of $750 for a 25% chance of losing nothing, even when there is a 75% chance of losing $1000.
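
The arithmetic behind Kahneman’s example is worth working out. A quick sketch (the trial count and seed are arbitrary choices of mine) shows that the gamble carries exactly the same expected loss as the sure thing:

```python
import random

# The two options in Kahneman's example.
sure_loss = -750
expected_gamble = 0.75 * (-1000) + 0.25 * 0
print(expected_gamble)   # -750.0: the same expected loss as the sure thing

# Simulating the gamble over many trials: it averages out to the same
# $750 loss; the only thing it adds is variance along the way.
random.seed(42)          # fixed seed so the run is reproducible
trials = 100_000
losses = [-1000 if random.random() < 0.75 else 0 for _ in range(trials)]
print(sum(losses) / trials)   # close to -750
```

The gamble offers no mathematical edge over the sure loss; it only trades a certain $750 for swings between $0 and $1,000, which is why habitually chasing such gambles is a costly pattern.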


When you do the math, the gamble’s expected loss (75% of $1,000, or $750) is identical to the sure loss, so on average the gamble gains us nothing; it only adds the risk of larger swings. However, our minds don’t perceive things this way. When we stand to win something, we tend to become conservative and risk averse, but if we stand to lose something, we suddenly become more risk seeking. Combining these two tendencies can be dangerous. It means we stand to gain much less than we might if we flipped our biases around, and it also means we are likely to face greater losses with greater frequency than if we had been less risk seeking with regard to losses.


If we think about this in the context of our lives more generally, we can see that categorizing ourselves and most of our friends as either risk averse or risk seeking doesn’t necessarily make sense. When you are young, you really don’t have much that you are worried about losing. It makes sense that you might be more risk seeking, more willing to take on behaviors and ideas that are risky but might have a big upside. You might procrastinate with important homework, retirement savings, and household chores because you only stand to lose time (the main thing you have when you are really young), and you can gamble on the consequences.

As you get older, once you are established in a career, own a home, have a 401K, and move through life in general, you stand to lose more. Gains throughout your life become less significant due to Tyler Cowen‘s favorite idea, diminishing marginal returns. It becomes harder to give up guaranteed gains because the marginal increase in a potential gain through a gamble is less appealing. You become risk averse as you get older and in more situations as you grow to have more things to worry about losing.

Categorizing people as generally risk averse or generally risk seeking is therefore not very meaningful. You need to look at the circumstances of their lives, where they find themselves in terms of social status, what material possessions they have, and what their family structure is like, and you will start to understand why they make generally more risk-averse or more risk-seeking decisions. There is probably some variability across people, but I would expect the structures and systems around us shape our behavior more than any genetic or inherent factors.
Denominator Neglect - Joe Abittan

“The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects,” writes Daniel Kahneman in Thinking Fast and Slow.


One thing we have seen in 2020 is how difficult it is to communicate and understand risk. Thinking about risk requires thinking statistically, and thinking statistically doesn’t come naturally for our brains. We are good at thinking in terms of anecdotes and our brains like to identify patterns and potential causal connections between specific events. When our brains have to predict chance and deal with uncertainty, they easily get confused. Our brains shift and solve easier problems rather than complex mathematical problems, substituting the answer to the easy problem without realizing it. Whether it is our risk of getting COVID or the probability we assigned to election outcomes before November 3rd, many of us have been thinking poorly about probability and chance this year.


Kahneman’s quote above highlights one example of how our thinking can go wrong when we have to think statistically. Our brains can be easily influenced by irrelevant numbers, and that can throw off our decision-making when it comes to dealing with uncertainty. To demonstrate denominator neglect, Kahneman presents two situations in his book. There are two large urns full of white and red marbles, and if you pull a red marble from an urn, you are a winner. The first urn holds 10 marbles, 9 white and 1 red. The second urn holds 100 marbles, 92 white and 8 red. Statistically, we should try our luck with the 10-marble urn, because 1 out of 10, or 10%, of its marbles are red; in the second urn, only 8% of the marbles are red.
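
The urn comparison reduces to a one-line probability calculation, which a tiny sketch makes concrete:

```python
# Kahneman's two urns as data: only the ratio of red to total matters.
urn_small = {"red": 1, "white": 9}     # 10 marbles total
urn_large = {"red": 8, "white": 92}    # 100 marbles total

def win_probability(urn):
    """Chance of drawing a red (winning) marble from the urn."""
    return urn["red"] / (urn["red"] + urn["white"])

print(win_probability(urn_small))   # 0.1  -> the better bet
print(win_probability(urn_large))   # 0.08
```

Only the ratios matter; the absolute count of eight winning marbles is the irrelevant number that a fast, intuitive judgment latches onto.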


When asked which urn they would want to select from, many people choose the second urn, exhibiting what Kahneman describes as denominator neglect. The chance of winning is lower with the second urn, but there are more winning marbles in it, making it seem like the better option if you don’t slow down and engage your System 2 thinking processes. If you pause and think statistically, you can see that the first urn offers better odds, but if you are moving quickly, your brain can be distracted by the larger number of winning marbles and lead you to a worse choice.


What is important to recognize is that we can be influenced by numbers that shouldn’t mean anything to us. The number of winning marbles shouldn’t matter, only the percent chance of winning should matter, but our brains get thrown off. The same thing happens when we see sale prices, think about the risk of a family gathering of 10 people during a global pandemic, or think about polling errors. I like to check The Nevada Independent‘s COVID-19 tracking website, and I have noticed denominator neglect in how I think about the numbers they report. For a continued stretch, Nevada’s total number of cases was decreasing, but our case positivity rate was staying the same. Statistically, nothing was really changing regarding the state of the pandemic in Nevada; fewer tests were being completed and reported each day, so the overall number of positive cases was decreasing. If you scroll down the Nevada Independent website, you will get to a graph of the case positivity rate and see that things were staying the same. When looking at the decreasing number of positive tests reported, my brain was neglecting the denominator, the number of tests completed. My understanding of the pandemic was biased by the big headline number, not by how many of those tested actually had the virus. Thinking statistically provides a more accurate view of reality, but it is hard to do, and it is tempting to look just at a single headline number.
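
The Nevada pattern is easy to reproduce with invented numbers (these figures are made up for illustration, not taken from the state’s data): daily positives fall while the positivity rate never moves, because the denominator, tests completed, falls in step.

```python
# Hypothetical daily reports: the headline number (positives) drops,
# but positives / tests stays constant, so the underlying picture
# of the outbreak is unchanged.
days = [
    {"tests": 10_000, "positives": 1_200},
    {"tests": 8_000, "positives": 960},
    {"tests": 5_000, "positives": 600},
]
for day in days:
    rate = day["positives"] / day["tests"]
    print(day["positives"], f"{rate:.0%}")   # positives fall; rate stays 12%
```

Watching only the first column invites denominator neglect; the second column is the statistic that actually describes the outbreak.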
Imagining Success Versus Anticipating Failure

I am guilty of not spending enough time planning what to do when things don’t work out the way I want. I have written in the past about the importance of planning for failure and adversity, but like many others, I find it hard to sit down and think seriously about how my plans and projects may fail. Planning for resilience is incredibly important, but so many of us never get around to it. Daniel Kahneman, in Thinking Fast and Slow, helps us understand why we fail to plan for failure.


He writes, “The successful execution of a plan is specific and easy to imagine when one tries to forecast the outcome of a project. In contrast, the alternative of failure is diffuse, because there are innumerable ways for things to go wrong.”


Recently, I have written a lot about the fact that our minds understand the world not by accumulating facts, understanding data, and analyzing nuanced information, but by constructing coherent narratives. The less we know and the more simplistic the information we work with, the more coherent our narratives of the world can be. When we have less uncertainty, our narrative flows more easily, feels more believable, and is more comforting to our mind. When we descend into the particular, examine complexity, and weigh competing and conflicting information, we have to balance substantial cognitive dissonance with our prior beliefs, our desired outcomes, and our expectations. This is hard and uncomfortable work, and as Kahneman points out, a problem when we try to anticipate failures and roadblocks.


It is easy to project forward how our perfect plan will be executed. It is much harder to identify how different potential failure points can interact and bring the whole thing crashing down. For large and complex systems and processes, there can be so many obstacles that this exercise can feel entirely overwhelming and disconnected from reality. Nevertheless, it is important that we get outside our comfortable narrative of success and at least examine a few of the most likely mistakes and obstacles we can anticipate. Any time we spend planning ways to get the ship back on course if something goes wrong will pay off in the future when things do go wrong. It’s not easy, because the work is mentally challenging and nebulous, but if we can get ourselves to focus on the likelihood of failure rather than the certainty of success, we will have a better chance of getting where we want to be and overcoming the obstacles we will face along the way.
Why Terrorism Works

In the wake of terrorist attacks, deadly shootings, or bizarre accidents, I often find myself trying to talk down the threat and act as if my daily life shouldn’t change. I live in Reno, NV; my city has experienced school shootings, and my state experienced the deadliest mass shooting in modern United States history, but I personally have never been close to any of these extreme yet rare events. Nevertheless, despite efforts to talk down any risk, I do notice the fear that I feel following such events.


This fear is part of why terrorism works. Despite trying to rationally and logically talk myself through the aftermath of a terrorist incident and remind myself that I am in more danger on the freeway than I am near a school or at a concert, there is still some apprehension under the surface, no matter how cool I make myself look on the outside. In Thinking Fast and Slow, Daniel Kahneman examines why we behave this way following such attacks. Terrorism, he writes, “induces an availability cascade. An extremely vivid image of death and damage, constantly reinforced by media attention and frequent conversations becomes highly accessible, especially if it is associated with a specific situation.”


Availability is more powerful in our minds than statistics. If we know that a given event is incredibly rare but hold strong mental images of it, we will overweight the likelihood of that event occurring again. The more easily an idea or possibility comes to mind, the more likely it will feel that it could happen. On the other hand, if we have trouble recalling instances where rare outcomes did not happen, we will discount the possibility that they could occur. Terrorism succeeds because it shifts deadly events from feeling impossible to being easily accessible in the mind, making them feel as though they could happen again at any time. If our brains were coldly rational, terrorism wouldn’t work as well as it does. As it is, however, our brains respond to powerful mental images and memories, and the fluency with which those images and memories come to mind shapes what we expect and what we think is likely or possible.