Frame Bound vs Reality Bound

My wife works with families of children with disabilities, and one of the things I have learned from her is how to ask children to do something. When speaking with an adult, we often use softeners when requesting that the other person do something, but this doesn’t work with children. So while we may say to a colleague, a spouse, or a friend, “can you please XYZ,” or “let’s call it a night of bowling after this frame, OK?” these sentences don’t work with children. A child won’t quite grasp how a softener like “OK” is used, and they won’t understand that although you have framed an instruction or request as a question, you are not actually asking a question or offering a choice. If you frame an instruction as a choice, the child can reply with “no,” and then you as a parent are stuck fighting them.

 

What happens in this situation is that children reject the frame bounding that parents present them with. To get around it, parents need to be either more direct or more creative in how they tell their children to do things. You can create a new frame for your child that they can’t escape by saying, “It is time to get ready for dinner. You can either put away your toys, or you can go set the table.” You frame a choice for the child, and they get to choose which action they are going to take, but in reality both are things you want them to do (my wife says this also works with husbands, but I think the evidence is mixed).

 

In Thinking Fast and Slow, Daniel Kahneman writes, “Unless there is an obvious reason to do otherwise, most of us passively accept decision problems as they are framed and therefore rarely have an opportunity to discover the extent to which our preferences are frame-bound rather than reality-bound.”

 

The examples I gave of talking to children versus talking to adults help demonstrate how we passively accept the framing for our decisions. We don’t often pause to reconsider whether we should really purchase an item on sale; in our minds, the discount we are getting outweighs the fact that we still face a cost when purchasing the item. Our thinking works this way in office settings, in politics, and on the weekends when we can’t decide whether or not to roll out of bed. The frame that is applied to our decisions becomes our reality, even if there are more possibilities out there than we realize.

 

A child rejecting the framing that a parent provides, or conversely a parent creating new frames to shape a child’s decisions and behaviors, demonstrates how easily we can fall into frame-bound thinking and how jarring it can be when reality intrudes on the frames we try to live within. Most of the time we accept the frames presented to us, but there can be huge costs if we simply go along with the frames that advertisers, politicians, and other people want us to adopt.

Framing Costs and Losses

“Losses evoke stronger negative feelings than costs. Choices are not reality-bound because System 1 is not reality-bound,” writes Daniel Kahneman in Thinking Fast and Slow.

 

We do not like losses. The idea of a loss, of having the status quo changed in a negative way without it being our deliberate choice, is hard for us to accept or justify. Costs, on the other hand, we can accept much more readily, even if the only difference between a cost and a loss is the way we choose to describe it.

 

Kahneman shares an example in his book where he and Amos Tversky did just that, restructuring a gamble so that the participant either faced a possible $5 loss or paid a $5 cost up front with a chance of gaining nothing. The potential outcomes of the two gambles are exactly the same, but people interpret the gambles differently based on how the cost/loss is displayed. People are more likely to take a bet when it is posed as a cost rather than as a possible loss. System 1, the quick-thinking part of the brain, scans the two gambles and has an immediate emotional reaction to the idea of a loss, and that influences the ultimate decision and feeling regarding the two gambles. System 1 is not rationally calculating the two options to see that they are equivalent; it is just acting on the intuition that it experiences.
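
To make the equivalence concrete, here is a minimal sketch in Python. The specific dollar figures ($95 and $100) are illustrative assumptions in the spirit of Kahneman and Tversky’s example rather than a quote of their exact gamble; the point is only that the two framings resolve to identical net outcomes.

```python
# A minimal sketch: the same gamble written once as a possible LOSS and once
# as a COST paid up front (dollar figures are illustrative assumptions).

# Framing 1: "10% chance to win $95, 90% chance to LOSE $5."
loss_frame = {0.10: +95, 0.90: -5}        # probability -> net dollars in hand

# Framing 2: "Pay a $5 COST for a ticket with a 10% chance to win $100,
# otherwise win nothing."
cost_frame = {0.10: 100 - 5, 0.90: 0 - 5}

def net_outcomes(gamble):
    """Return sorted (probability, net dollar outcome) pairs."""
    return sorted(gamble.items())

print(net_outcomes(loss_frame))   # [(0.1, 95), (0.9, -5)]
print(net_outcomes(cost_frame))   # [(0.1, 95), (0.9, -5)]
print(net_outcomes(loss_frame) == net_outcomes(cost_frame))   # True
```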

 

“People will more readily forgo a discount than pay a surcharge. The two may be economically equivalent, but they are not emotionally equivalent.”

 

Kahneman goes on to describe research from Richard Thaler, who had studied credit card companies’ lobbying efforts to prevent gas stations from charging different rates for cash versus credit. When you pay with a card, there is a transaction processing fee that the vendor pays to the credit card company, so gas stations charge more for credit card purchases to cover their share of the cost of all the credit transactions that take place. Credit card companies didn’t want gas stations to add a credit card surcharge, which would make it feel more expensive to buy gas with a card than with cash. Ultimately they couldn’t stop gas stations from charging different rates, but they did succeed in changing the framing around the different prices. Cash prices are listed as discounts, shifting the reference point to the credit price. As Kahneman writes, people will skip the extra effort that would garner the cash discount and pay with their cards. However, if people were directly told that there was a credit surcharge, that they had to pay more for the convenience of using their card, it is possible that more individuals would make the extra effort to pay with cash. How we frame a cost or a loss matters, especially because it can shift the baseline for consideration, making us see things as either costs or losses depending on the context, and potentially altering our behavior.
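
As a worked illustration of “economically equivalent but not emotionally equivalent,” here is a small Python sketch. The per-gallon prices are made-up numbers, not figures from Thaler’s research; the same ten-cent gap can be labeled a cash discount or a credit surcharge, and the money changing hands is identical either way. Only the reference point moves.

```python
# A minimal sketch with hypothetical prices (in cents): the same ten-cent gap
# framed either as a cash "discount" or as a credit-card "surcharge".

cash_price = 290     # hypothetical cash price per gallon, in cents
credit_price = 300   # hypothetical credit price per gallon, in cents

# Framing A (what the card companies lobbied for): credit is the baseline,
# and paying cash earns a discount.
cash_discount = credit_price - cash_price      # 10 cents off the credit price

# Framing B (what they wanted to avoid): cash is the baseline,
# and paying by card incurs a surcharge.
credit_surcharge = credit_price - cash_price   # 10 cents added to the cash price

# Under either framing, a cash buyer pays 290 and a card buyer pays 300.
assert credit_price - cash_discount == cash_price
assert cash_price + credit_surcharge == credit_price
print("Economically identical; only the reference point (the baseline price) moves.")
```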

A Lack of Internal Consistency

Something I have been trying to keep in mind lately is that our internal beliefs are not as consistent as we might imagine. This is important right now because our recent presidential election has highlighted the divide between many Americans. In most of the circles I am a part of, people cannot imagine how anyone could vote for Donald Trump. Since they see President Trump as contemptible, it is hard for them to separate his negative qualities from the people who may vote for him. All negative aspects of Trump, and of the ideas that people see him as representing, are heaped onto his voters. The problem, however, is that none of us has enough internal consistency among our thoughts, ideas, opinions, and beliefs to justify characterizing as much as half the country as bigoted, uncaring, selfish, or really any other adjective (except maybe self-interested).

 

I have written a lot recently about the narratives we tell ourselves. It is problematic that the more simplistic a narrative is, the more believable and accurate it feels to us. The world is incredibly complicated, and a simplistic story that seems to make sense of it all is almost certainly wrong. Given this, it is worth looking at our ideas and views and trying to identify areas where we have inconsistencies in our thoughts. This helps us tease apart our narratives and recognize where simplistic thinking is leading us to unfounded conclusions.

 

In Thinking Fast and Slow, Daniel Kahneman shows us how this inconsistency between our thoughts, beliefs, and behaviors can arise, using moral ambiguity as an example. He writes, “the beliefs that you endorse when you reflect about morality do not necessarily govern your emotional reactions, and the moral intuitions that come to your mind in different situations are not internally consistent.”

 

It is easy to adopt a moral position against some immoral behavior or attitude, but when we find ourselves in a situation where we are violating that moral position, we find ways to explain our internal inconsistency without directly violating our initial moral stance. We rationalize why our moral beliefs don’t apply to us in a given situation, and we create a story in our minds where there is no inconsistency at all.

 

Once we know that we do this with our own beliefs toward moral behavior, we should recognize that we do this with every area of life. It is completely possible for us to think entirely contradictory things, but to explain away those contradictions in ways that make sense to us, even if it leaves us with incoherent beliefs. And if we do this ourselves, then we should recognize that other people do this as well. So when we see people voting for a candidate and can’t imagine how they could vote for such a candidate, we should assume that they are making internally inconsistent justifications for voting for that candidate. They are creating a narrative in their head where they are making the best possible decision. They may have truly detestable thoughts and opinions, but we should remember that in their minds they are justified and making rational choices.

 

Rather than simply hating people and heaping every negative quality we can onto them, we should pause and ask what factors might be leading them to justify contemptible behavior. We should look for internal inconsistencies and try to help people recognize these areas and move forward more coherently. We should see in the negativity of others something we have the same capacity for, and we should try to find more constructive ways to engage with them and help them shift the narratives that justify their inconsistent thinking.

Stoicism in Thinking Fast and Slow

“We spend much of our day anticipating, and trying to avoid, the emotional pains we inflict on ourselves,” writes Daniel Kahneman in his book Thinking Fast and Slow. “How seriously should we take these intangible outcomes, the self-administered punishments (and occasional rewards) that we experience as we score our lives?”

 

Kahneman’s point is that emotions such as regret greatly influence the decisions we make. We are so afraid of loss that we go out of our way to minimize risk, to the point where we may be limiting ourselves so much that we experience costs that are actually greater than the potential loss we wanted to avoid. Kahneman is pointing to something that stoic thinkers, dating back to Marcus Aurelius and Seneca, addressed – our ability to be captured by our emotions and effectively held hostage by fears of the future and pain from the past.

 

In Letters from a Stoic, Seneca writes, “Why, indeed, is it necessary to summon trouble – which must be endured soon enough when it has once arrived, or to anticipate trouble and ruin the present through fear of the future? It is foolish to be unhappy now because you may be unhappy at some future time.” I think Kahneman would agree with Seneca’s mindset. In his book, Kahneman writes that we should accept some level of risk and some level of regret in our lives. We know we will face regret if we experience some type of failure. We can prepare for regret and accept it without having to ruin our lives by taking every possible precaution to try to avoid the potential for failure, pain, and loss. It is inevitable that we are going to lose loved ones and have unfortunate accidents. We can’t prepare for and shield ourselves from every danger, unless we want to completely withdraw from all that makes us human.

 

Ryan Holiday wrote about the importance of feeling and accepting our emotions in his book The Obstacle is the Way. He wrote, “Real strength lies in the control or, as Nassim Taleb put it, the domestication of one’s emotions, not in pretending they don’t exist.” Kahneman would also agree with Holiday and Taleb. Econs, the term Kahneman and other economists use to refer to theoretical humans who act purely rationally, are not pulled by emotions and cognitive biases. However, Econs are not human. We experience emotions when investments don’t pan out, when bets go the wrong way, and when we face multiple choices and are unsure whether we truly made the best decision. We have to live with our emotions and the weight of failure or poor investments. Somehow, we have to work with these emotions and learn to continue even though we know things can go wrong. Holiday would suggest that we must be present, acknowledge that things won’t always go well, and learn to recognize and express emotions in a healthy way when they don’t.

 

Kahneman continues, “Perhaps the most useful is to be explicit about the anticipation of regret. If you can remember when things go badly that you considered the possibility of regret carefully before deciding, you are likely to experience less of it.” In this way, our emotions can be tools to help us make more thoughtful decisions, rather than anchors we are tethered to and hopelessly unable to escape. A thoughtful consideration of emotions, a return to the present moment, and acceptance of the different emotions we may feel after a decision all help us live with some level of risk, some level of uncertainty, and some level of loss. These are ideas that stoic thinkers wrote about frequently, and they show up for Kahneman when he considers how we should live with our mental biases and cognitive errors.

A Factor for Paralysis in Regulation & Legislation

A common complaint today in the United States is that nothing gets done. We are frustrated by political leaders who can’t pass important legislation. We dislike how slow local governments are to update infrastructure, adopt new technologies, and make improvements in the places we live. Gridlock has become the norm, and the actions that governments take seem to be too little too late.

 

But is this criticism really fair? Is the problem slow governments, ineffectual legislators, and inept public officials? In Thinking Fast and Slow, Daniel Kahneman highlights a basic aspect of human psychology that might be one of the major contributing factors to the paralysis we see in governance today, and it has nothing to do with the quality of officials and legislators; instead, it is all about the structures and systems of incentives that elected officials and policy actors respond to. The precautionary principle, a side effect of our general tendency toward loss aversion and our general stance against taboo tradeoffs, drives our paralysis, and it is a logical response to the structure of many of our governing institutions.

 

Governments are necessary parts of human society, helping us establish rules for how we will live, interact, and make decisions collectively. Governments make investments, determine safety and efficacy standards, and help allocate resources across populations. In each of these functions of governance there is a possibility for error, a possibility for failure, and risk involved in the decisions. This is where the precautionary principle comes in. Kahneman writes,

 

“In the regulatory context, the precautionary principle imposes the entire burden of proving safety on anyone who undertakes actions that might harm people or the environment. Multiple international bodies have specified that the absence of scientific evidence of potential damage is not sufficient justification for taking risks. … the precautionary principle is costly, and when interpreted strictly it can be paralyzing.”

 

When risk is involved in decision-making processes, elected officials and public leaders are held responsible for any bad outcomes that come to pass. There will always be a chance that a government investment fails, and no public official wants that failure to reflect poorly on their decision-making. There is always the risk that allocated resources could be misused, and it is often the official who approved the resource allocation (as well as the bad actor themselves) who faces consequences. When there is a deliberate decision to trade off some level of safety, or to accept an increase in risk in exchange for improved economic performance, faster traffic flows, or reduced government spending, public leaders and elected officials are the ones who look bad when something goes wrong.

 

The way our governance operates today encourages the precautionary principle. Risk is incredibly dangerous for public leaders, so the safer and more costly approach feels like the right choice in each individual decision. Over time, however, the costs add up, the paralysis becomes suffocating, and the public becomes dissatisfied and cynical. The answer might not be to completely cut out the regulation and safety apparatus of the government (that didn’t work well for President Trump, who eliminated the NSC directorate for global health security and biodefense). The answer will be new structures for governance, new ways to allow government to take risks, and new ways to understand the risks that we all take in our lives. None of these are easy or simple transitions, but they are likely what we need in order to survive in a more complex and turbulent world.

Taboo Tradeoffs

A taboo tradeoff occurs when we are faced with the dilemma of exchanging something that we are not supposed to give up for money, food, or other resources. Our time, attention, energy, and sometimes even our happiness are perfectly legitimate to trade, but things like health and safety generally are not. We are expected to exchange our time, attention, and physical labor for money, but we are not expected to exchange our personal health for money. When I first read about taboo tradeoffs in Daniel Kahneman’s book Thinking Fast and Slow, the year was 2019, and we had not yet entered a period defined by a global pandemic, in which people began to challenge the taboo against trading health and safety for entertainment, for trials of COVID-19 cures, and for signaling their political allegiance.

 

In the book, Kahneman suggests that holding to hard rules against taboo tradeoffs actually makes us all worse off in the end. “The taboo tradeoff against accepting any increase in risk is not an efficient way to use the safety budget,” he writes. Kahneman’s point is that we can spend huge amounts of resources to ensure that there is absolutely no risk to ourselves, our children, or to others, but that we would be better off allocating those resources in a different way. I think Kahneman is correct, but I think that his message has the potential to be read very differently in 2020, and deserves more careful and nuanced discussion.

 

“The intense aversion to trading increased risk for some other advantage plays out on a grand scale in the laws and regulations governing risk.” The important point to note is that complete security and safety come at the cost of other advantages. The advantage of driving to a football game is that we get to enjoy watching live sports; the risk is that we could be in a serious traffic accident. The advantage of using bug spray is that we kill the creepy crawlies in the dark corners of the garage; the risk is that we (or a child or pet) could accidentally ingest the poison. The safest things to do would be to watch the game on TV and to use a broom and boot to kill the bugs, but if we avoid the risk then we give up the advantages of seeing live sports and using efficient pest control products.

 

Kahneman notes that when we make these decisions, we often make them based on a fear of regret more than out of altruistic concern for our own health and safety or for the health and safety of others. If you traded away some level of your child’s safety and they died, you would feel immense regret and shame, so you avoid the taboo tradeoff to prevent your own shame. When this plays out across society in millions of large and small examples, we end up in a collectively risk-averse paralysis, and society gives up huge advantages because there is a possibility of risk for some individuals.

 

To address the current global state of affairs, I think Kahneman would recognize the risk of COVID-19 and would not encourage us to trade our health and safety (and the health and safety of others) for the enjoyment of a birthday party, holiday meal, or other type of gathering without wearing masks and taking other precautions.  Throughout the book Kahneman highlights the difficulties and challenges of thinking through risk. He addresses the many biases that play into how we behave and how we understand the world. He demonstrates the difficulties we have in thinking statistically and understanding complex probabilities. The takeaway from Kahneman in regard to the taboo tradeoff is that there is a level at which our efforts of safety are outpaced by the advantages we could attain by giving up some of our safety. It isn’t necessarily on each of us individually to try to decide exactly what level of risk society should accept. It is up to the experts who can engage their System 2 brain and evaluate their biases to help the rest of us better understand and conceptualize risk. We might be able to do some things understanding that there is a level of risk we take when engaging in society in 2020, but adequate precautions can still mitigate that risk, and still help us maintain a reasonable balance of safety tradeoffs while enjoying our lives.

Daniel Kahneman on Regret

Regret is an interesting emotion and worth deep consideration. It is a System 2 emotion, that is, an emotion we feel when we pause, reflect on our life or actions, and consider the decisions we have or have not made in the past. System 1, the active, fast, general default mode of our brain, doesn’t feel regret. It lives in the moment and takes action based on our current inputs. It can receive feedback from System 2’s regret and make adjustments with new decisions and actions, but it is too busy with the present moment and environment to be the one building the emotion of regret.

 

Regret also stems from our ability to imagine different realities. Daniel Kahneman describes it as an emotion associated with loss and mistakes that allows us to self-correct and to perceive different opportunities and realities we might want to live within. It can modify how we act and behave before we have even been faced with a decision. Kahneman writes, “decision makers know that they are prone to regret, and the anticipation of that painful emotion plays a part in many decisions.”

 

If I pause to think about regret, I typically think about a person on their deathbed, regretful for all the things they never did in their life. A fear of being this person has pushed me to try to do more, be more involved, and have varied and interesting experiences. The trite quote is that people on their deathbed are more regretful for the things they didn’t do than the things in life they did do. In this view, people recognize regret, and it turns into a fear of missing out that spurs people to action before it is too late, before they regret not taking action.

 

However, this idea may not represent the most powerful feelings of regret that we may experience. Kahneman writes, “People expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction.” As an example, Kahneman presents two fictional characters, both with investments in companies A and B. One considers making a greater investment in company A, but does not, and misses out on $1,200 of potential gains. The other moves some of her investment out of company A and ends up with $1,200 less than she would have had if she had done nothing. The consensus among people who read Kahneman’s examples indicates that the person who actively pulled money out of company A feels more regret than the person who never added extra investment funds to the company. Doing nothing and missing a potential gain is less regretful than taking an action that creates a perceived loss.
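
The two situations can be written out numerically to show just how symmetric they are. In this small Python sketch, the $1,200 gap comes from the example above, while the $10,000 starting portfolio value is a made-up assumption for illustration.

```python
# A minimal sketch of the action vs. inaction comparison. The $1,200 gap comes
# from Kahneman's example; the $10,000 starting value is a made-up assumption.

starting_value = 10_000
missed_gain = 1_200

# Investor 1 (inaction): considered adding to company A but did nothing.
investor_1_final = starting_value                          # 10,000
investor_1_counterfactual = starting_value + missed_gain   # 11,200 had they acted

# Investor 2 (action): moved money out of company A.
investor_2_final = starting_value                          # 10,000
investor_2_counterfactual = starting_value + missed_gain   # 11,200 had she done nothing

shortfall_1 = investor_1_counterfactual - investor_1_final
shortfall_2 = investor_2_counterfactual - investor_2_final
print(shortfall_1, shortfall_2, shortfall_1 == shortfall_2)   # 1200 1200 True
# Identical shortfalls, yet people predict far more regret when the shortfall
# was produced by action than by inaction.
```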

 

Loss aversion is powerful, and we are more likely to take actions that avoid losses, and with them feelings of regret, than to take chances at potential gains. The gains we don’t receive won’t cause as much regret as the losses we do experience. Regret is not just the fear of missing out or the fear of having done too little that I described earlier. It is a powerful emotion that kicks in when we reflect on our life and see that our actions directly led to losses and mistakes. We may begin to change our behaviors and decisions to avoid similar losses in the future, and to avoid the regret that those losses would bring, but that can drive us into making irrational choices in the present moment, with the hope of not losing out in the future.

Sunk-Cost Fallacy

Every time I pick the wrong line at the grocery store I am reminded of the sunk-cost fallacy. There are times I will be stuck in line, see another line moving more quickly, and debate internally if I should jump to the other line or just wait it out in the line I’m already in. Once I remember the sunk-cost fallacy, however, the internal debate shifts and I let go of any feeling that I need to remain in the current line.

 

My grocery store example is a comical take on the sunk-cost fallacy, but in real life, this cognitive error can have huge consequences. Daniel Kahneman describes it this way, “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.”

 

We are going to make decisions and choices about where to invest our time, attention, and money that will turn out to be mistakes. At a certain point we have to realize when something is not working and walk away. Doing so, however, requires that we admit failure, that we cut our losses, and that we search for new opportunities. Admitting that we were wrong, giving up on losses, and searching for new avenues is difficult, and it is not uncommon for us to keep moving forward despite our failures, as if we just need to try harder and push more in order to find the success we desire. This is the basis of the sunk-cost fallacy. When we have invested a lot of time, energy, and resources into something, it is hard to walk away, even if we would be better off by doing so.

 

Pursuing a career path that clearly isn’t panning out while refusing to try a different avenue is an example of the sunk-cost fallacy. A movie studio that tries to reinvent a character or story over and over despite continued failure is another example. Sitting through the terrible movie the studio produced, rather than leaving the theater early, is also an example. In all of these instances, an investment has been made, and costly efforts to make the investment pay off are undertaken, generally at a greater loss than would be incurred if we made a change and walked away.
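
A small Python sketch may help show the logic of walking away. The dollar figures and the two options are hypothetical, not taken from Kahneman’s book; the point is that the amount already spent is lost under either plan, so it can never change which choice is better going forward.

```python
# A minimal sketch (hypothetical figures) of why sunk costs should not drive
# the decision: the money already spent is gone under either plan.

already_spent = 50_000   # sunk either way; no decision today recovers it

def net_from_today(additional_spend, expected_return):
    """Forward-looking net outcome; deliberately ignores already_spent."""
    return expected_return - additional_spend

keep_going = net_from_today(additional_spend=20_000, expected_return=15_000)
walk_away = net_from_today(additional_spend=20_000, expected_return=30_000)

print(keep_going, walk_away)   # -5000 10000
# already_spent appears in neither number: including it would lower both
# totals equally and could never change which option comes out ahead.
```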

 

When you find yourself saying, “I have already spent so much money on XYZ, or I have already put so much effort into making XYZ work, and I don’t want to just let that all go to waste,” you are stuck in the middle of the sunk-cost fallacy. At this point, it is time to step back, look at other ways you could spend your money and time, and honestly evaluate what your priorities should be. Doing so, and remembering Kahneman’s quote, will help you begin to make the shift to a better use of your time, energy, and resources. It may be embarrassing and disappointing to admit that something is going in the wrong direction, but ultimately, you will end up in a better and more productive spot.