
Framing and Nudges

“Framing works because people tend to be somewhat mindless, passive decision makers,” write Cass Sunstein and Richard Thaler in their book Nudge. “Their Reflective System does not do the work that would be required to check and see whether reframing the question would produce a different answer.”

 

Framing is an important rhetorical tool. We can frame things as gains or losses, reference numbers as percentages or as whole numbers, and compare phenomena to small classes or to larger populations. Framing can include elements of good or evil, morality or sin, responsibility toward one’s family or individual greed. Depending on what we want people to do or how we want them to behave, we can adjust the way we frame a situation or decision to influence people in certain ways. Framing is not a 100% effective way to make people do what we want, but it can be a helpful way to nudge people toward certain decisions.

 

Sunstein and Thaler present an example of using framing to nudge people to conserve energy. They write,

 

“Energy conservation is now receiving a lot of attention, so consider the following information campaigns: (a) If you use energy conservation methods, you will save $350 per year; (b) If you do not use energy conservation methods, you will lose $350 per year. It turns out that information campaign (b), framed in terms of losses, is far more effective than information campaign (a). If the government wants to encourage energy conservation, option (b) is a stronger nudge.”

 

It is not the case that everyone who sees a message touting the money saved by conserving energy will do nothing while everyone who sees a message about the money they would lose will take action. Some people will be motivated by the prospect of saving $350 per year, and some people won’t be moved by the prospect of losing $350. On average, however, more people who see the loss-framed message will decide to take action. People tend to feel losses more strongly than equivalent gains, so framing energy conservation as preventing a loss will motivate more people than framing it as producing a gain.
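
To make the asymmetry concrete, here is a minimal sketch of the value function from Kahneman and Tversky’s prospect theory, using the commonly cited parameter estimates (a loss aversion coefficient of about 2.25 and a curvature of about 0.88). The $350 figure simply reuses the number from the campaign example; the parameters and code are illustrative, not a claim about how Sunstein and Thaler ran their numbers.

```python
# Sketch of the prospect theory value function (Tversky & Kahneman, 1992).
# The parameters are the commonly cited estimates; treat them as illustrative.
ALPHA = 0.88   # diminishing sensitivity for gains
BETA = 0.88    # diminishing sensitivity for losses
LAMBDA = 2.25  # loss aversion coefficient

def subjective_value(x: float) -> float:
    """Psychological value of a gain (x > 0) or loss (x < 0) relative to a reference point."""
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

gain = subjective_value(350)    # "you will save $350 per year"
loss = subjective_value(-350)   # "you will lose $350 per year"

print(f"Felt value of gaining $350: {gain:.1f}")
print(f"Felt value of losing $350:  {loss:.1f}")
print(f"The loss looms about {abs(loss) / gain:.2f}x larger than the equivalent gain.")
```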

 

This small shift in framing recasts buying energy-efficient light bulbs or resealing windows from costly investments into practical strategies for avoiding further losses. Framing in this example is a simple nudge that isn’t a form of mind control; it plays into existing human biases and encourages people to make decisions that are better for them individually and for society collectively. I would argue that framing is a necessary and unavoidable choice. Messages are necessarily context-dependent, and trying to strip out any particular framing can make a message useless; at that point you might as well not have a message at all. Given that framing is unavoidable and that some outcomes are preferable to others, choice architects should think about framing and employ frames in ways that encourage the best possible decisions for the most people.

The Value of Objects

My dad collected Hot Wheels toy cars. He had thousands of cars, all in their packaging, with limited editions and rare valuable cars all collected and organized together. It was a hobby, and an example of how much value an individual can attach to objects that don’t mean anything to other people.

 

On my wife’s side of the family, near-hoarding behavior is not uncommon. With my dad’s collection and my wife’s family’s habit of saving everything just in case, we have both seen the excesses of placing too much value in objects. We try hard to think critically about the things we have in our lives, and try to avoid having too many things and giving them too much value. But still, it is hard to part with things, even when they are gifts we never really wanted and even when we know we don’t need them or could easily replace them.

 

In the book Nudge, Cass Sunstein and Richard Thaler write the following about the value of objects. “People do not assign specific values to objects. When they have to give something up, they are hurt more than they are pleased if they acquire the very same thing.”

 

People are not actually that good at thinking about value. We will go out of our way for free stuff, we will hoard things to avoid feelings of loss, and we will collect items that don’t have the same economic value as the emotional value we attach to them once they are in our possession. This impulse helps drive our economy, but it can also drive us as individuals into madness.

 

I think it is important to understand the quote from Sunstein and Thaler. When we recognize that we are not very good at thinking about value objectively, we can see the irrational ways we become attached to mere objects. We can start to shift the way we think about the material things in our lives and really consider whether they provide us value. We might find that our Hot Wheels collection really does provide value, while the decorative item someone gave us years ago does not. We can look at things that sit around taking up space, requiring cleaning, and cluttering our lives, and feel freer to part with them. Understanding our irrational tendencies around objects and value can help us rethink what we keep and what we fill our lives with, and can help us get past the loss aversion we feel when we think about selling something or tossing it out.

Framing Costs and Losses

“Losses evoke stronger negative feelings than costs. Choices are not reality-bound because System 1 is not reality-bound,” writes Daniel Kahneman in Thinking Fast and Slow.

 

We do not like losses. The idea of a loss, of having the status quo changed in a negative way without it being our deliberate choice, is hard for us to accept or justify. Costs, on the other hand, we can accept much more readily, even if the only difference between a cost and a loss is the way we chose to describe it.

 

Kahneman shares an example in his book where he and Amos Tversky did just that, changing the framing of a gamble so that participants either faced the possibility of a $5 loss or paid a $5 cost with the possibility of gaining nothing. The potential outcomes of the two gambles are exactly the same, but people interpret them differently based on how the cost or loss is displayed. People are more likely to take a bet when it is posed as a cost rather than as a possible loss. System 1, the quick-thinking part of the brain, scans the two gambles and has an immediate emotional reaction to the idea of a loss, and that reaction influences the ultimate decision and feeling about each gamble. System 1 is not rationally calculating the two options to see that they are equivalent; it is simply acting on the intuition it experiences.
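
A quick way to see that the two framings are economically identical is to compute their expected values. The specific numbers below are assumed for illustration (a small $5 stake framed either as a possible loss or as an upfront cost); they are not a verbatim reproduction of the original experiment.

```python
# Two framings of the same gamble, with assumed numbers for illustration.
# Framing A ("loss"):  10% chance to win $95, 90% chance to lose $5.
# Framing B ("cost"):  pay $5 to play; 10% chance to win $100, 90% chance to win nothing.

def expected_value(outcomes):
    """Probability-weighted average over (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

framing_a = [(0.10, 95), (0.90, -5)]          # the $5 shows up as a possible loss
framing_b = [(0.10, 100 - 5), (0.90, 0 - 5)]  # the $5 shows up as a cost paid up front

print(expected_value(framing_a))  # 5.0
print(expected_value(framing_b))  # 5.0 -- identical on paper, yet the cost frame feels more acceptable
```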

 

“People will more readily forgo a discount than pay a surcharge. The two may be economically equivalent, but they are not emotionally equivalent.”

 

Kahneman goes on to describe research from Richard Thaler, who had studied credit card industry lobbying over whether gas stations could charge different rates for cash versus credit. When you pay with a card, there is a transaction processing fee that the vendor pays to the credit card company. Gas stations charge more for credit card purchases because they have to pay a fee on every credit transaction that takes place. Credit card companies didn’t want gas stations to add a credit card surcharge, which would effectively make it more expensive to buy gas with a card than with cash. Ultimately the companies couldn’t stop gas stations from charging different rates, but they did succeed in changing the framing around the different prices. Cash prices are listed as discounts, shifting the reference price to the credit price. As Kahneman writes, people will skip the extra effort that would garner the cash discount and pay with their cards. However, if people were directly told that there was a credit surcharge, that they had to pay more for the convenience of using their card, it is possible that more of them would make the extra effort to pay with cash. How we frame a cost or a loss matters, especially because it can shift the baseline for comparison, making us see things as either costs or losses depending on the context, and potentially altering our behavior.

Daniel Kahneman on Regret

Regret is an interesting emotion and worth deep consideration. It is a System 2 emotion, that is, an emotion we feel when we pause, reflect on our life or actions, and consider the decisions we have or have not made in the past. System 1, the active, fast, default mode of our brain, doesn’t feel regret. It lives in the moment and takes action based on our current inputs. It can receive feedback from System 2’s regret and make adjustments with new decisions and actions, but it is too busy with the present moment and environment to be the one building the emotion of regret.

 

Regret also stems from our ability to imagine different realities. Daniel Kahneman describes it as an emotion associated with loss and mistakes that allows us to self-correct and perceive different opportunities and realities that we might want to live within. It can modify how we act and behave before we have even been faced with a decision. Kahneman writes, “decision makers know that they are prone to regret, and the anticipation of that painful emotion plays a part in many decisions.”

 

If I pause to think about regret, I typically think about a person on their deathbed, regretful for all the things they never did in their life. A fear of being this person has pushed me to try to do more, be more involved, and have varied and interesting experiences. The trite quote is that people on their deathbed are more regretful for the things they didn’t do than the things in life they did do. In this view, people recognize regret, and it turns into a fear of missing out that spurs people to action before it is too late, before they regret not taking action.

 

However, this idea may not represent the most powerful feelings of regret that we experience. Kahneman writes, “People expect to have stronger emotional reactions (including regret) to an outcome that is produced by action than to the same outcome when it is produced by inaction.” As an example, Kahneman presents two fictional characters. Both have investments with companies A and B. One individual considers making a greater investment in company A, but does not, and loses out on $1,200 of potential gains. The other removes some of her investment from company A and ends up with $1,200 less than what she could have received if she had done nothing. The consensus among people who read Kahneman’s examples indicates that the person who actively pulled money out of company A feels more regret than the person who never added extra investment funds to the company. Doing nothing and missing a potential gain produces less regret than taking an action that creates a perceived loss.

 

Loss aversion is powerful, and we are more likely to take actions that avoid losses, and the regret they bring, than to take chances on potential gains. The gains we don’t receive won’t cause as much regret as the losses we do incur. Regret is not just the fear of missing out or the fear of having done too little that I described earlier. It is a powerful emotion that kicks in when we reflect on our life and see that our own actions directly led to losses and mistakes. We may change our behaviors and decisions to avoid similar losses in the future, and the regret those losses would bring, but that can drive us into making irrational choices in the present moment, in the hope of not losing out later.

Competing Biases

I am trying to remind myself that everyone, myself included, operates on a complex set of ideas, narratives, and beliefs that are sometimes coherent, but often conflicting. When I view my own beliefs, I am tempted to think of myself as rational and realistic. When I think of others who I disagree with, I am prone to viewing them in a simplistic frame that makes their arguments irrational and wrong. The reality is that all of our beliefs are less coherent and more complex than we typically think.

 

Daniel Kahneman’s book Thinking Fast and Slow has many examples of how complex and contradictory much of our thinking is, even if we don’t recognize it. One example is competing biases that manifest within us as individuals and can be seen in the organizations and larger groups that we form. We can be exaggeratedly optimistic and paralyzingly risk averse at the same time, and sometimes this tendency can actually be a good thing for us. “Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.”

 

On a first read, I would expect the outcome of what Kahneman describes to be gridlock. The optimist (or the optimistic part of our brain) wants to push forward with a big new idea and plan. Meanwhile, loss aversion halts decision making and prevents new ideas from taking root. The reality, as I think Kahneman would explain, is less a conscious and deliberate gridlock than an unnoticed drift toward certain decisions. Optimism wins out in an enthusiastic way when we see a safe bet or when a company sees an opportunity to capture rents. Loss aversion wins out when the bet isn’t safe enough and when we want to hoard what we already have. We don’t even realize when we are making these decisions; they just feel like obvious and clear directions. But the reality is that we are constantly being jostled between exaggerated optimism and loss aversion.

 

Kahneman shows that these two biases are not mutually exclusive even though they may conflict. We can act on both biases at the same time; we are not exclusively risk-seeking optimists or exclusively risk averse. When the situation calls for it, we apply the appropriate frame at an intuitive level. Kahneman’s quote above shows that this can be advantageous for us, but throughout the book he also shows how biases in certain directions and situations can be costly for us over time as well.

 

We like simple and coherent narratives. We like thinking that we are one thing or another, that other people are either good or bad, right or wrong. The reality, however, is that we contain multitudes, act on competing and conflicting biases, and have more nuance and incongruity in our lives than we realize. This isn’t necessarily a bad thing. We can still survive and prosper despite the complexity and the incoherent beliefs we hold. Nevertheless, I think it is important that we acknowledge the reality we live within, rather than simply believing the tidy stories we like to tell ourselves.

Desperate Gambles

Daniel Kahneman worked with Amos Tversky to develop many of the concepts that today make up Prospect Theory. Many people are familiar with Game Theory; Prospect Theory is a related psychological and economic theory of how people behave when faced with uncertainty. In his book Thinking Fast and Slow, Kahneman shares one of the early surprises he and Tversky uncovered while developing the theory.

 

Prospect Theory gets its name from the way people behave when faced with different prospects, that is, different potential outcomes with different likelihoods attached. This is similar to Game Theory, but instead of making decisions while another actor’s choices shape your final outcome, in Prospect Theory you are generally choosing between a sure thing and a gamble. From the theory comes the fourfold pattern, which Kahneman uses to explain why large legal settlements are common, why people play lotteries, and why we buy insurance. The surprise was the fourth block in the fourfold pattern, which describes why some people are willing to take desperate gambles that have only a small likelihood of paying off.
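
For reference, the four quadrants Kahneman describes can be laid out in a small lookup table. This is only a compact summary of the pattern as the book presents it, not new analysis.

```python
# The fourfold pattern of prospect theory, summarized as a lookup table.
# Keys: (domain, probability of the large outcome). Values: typical risk attitude
# and the everyday behavior Kahneman uses to illustrate it.
fourfold_pattern = {
    ("gain", "high probability"): ("risk averse",  "accept a settlement below the expected value"),
    ("gain", "low probability"):  ("risk seeking", "buy lottery tickets"),
    ("loss", "high probability"): ("risk seeking", "reject a reasonable settlement; desperate gambles"),
    ("loss", "low probability"):  ("risk averse",  "buy insurance"),
}

for (domain, probability), (attitude, example) in fourfold_pattern.items():
    print(f"{probability:<17} {domain:<5} -> {attitude:<12} e.g., {example}")
```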

 

Kahneman writes, “when you consider a choice between a sure loss and a gamble with a high probability of a larger loss, diminishing sensitivity makes the sure loss more aversive, and the certainty effect reduces the aversiveness of the gamble.”

 

If you are suing a large corporation for damages and have a strong case, you are likely to accept a settlement below the expected value of your claim. So, if you are suing the company for $1 million with, say, a 95% chance of winning, and the company offers you a $900,000 settlement, you are likely to feel pressure to take the settlement to make sure you walk away with something. A guaranteed $900,000 is likely to be preferable to the small chance that you lose the lawsuit and walk away with nothing. This is one square in the fourfold pattern that fit with Kahneman and Tversky’s prior expectations: risk aversion when a large gain is highly likely.
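
The arithmetic makes the certainty premium visible. A rough sketch using the illustrative numbers above:

```python
# Plaintiff's choice: a sure settlement vs. a strong but uncertain claim
# (illustrative numbers from the example above).
p_win = 0.95
award_if_win = 1_000_000
settlement = 900_000

expected_value_of_trial = p_win * award_if_win            # 950,000
certainty_premium = expected_value_of_trial - settlement  # 50,000

print(expected_value_of_trial)  # going to trial is worth more on average...
print(certainty_premium)        # ...but we give this up to lock in the sure gain
```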

 

What surprised the pair was how individuals behave when the tables are turned. Say you face the prospect of a large loss of $100,000, with only a fringe legal possibility of getting off without facing any losses. If you are offered a settlement where your costs will be $90,000 instead of risking the full $100,000, you are likely to feel pressure to turn down the settlement as long as some possibility of getting away without any losses still exists. When we look at the expected values, accepting the settlement is both the risk averse option and the better deal on paper, but few of us will be content taking it.
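
Running the same arithmetic for the defendant shows why the reluctance to settle is so striking. The 5% chance of escaping the loss is an assumption for illustration; the example above only calls it a fringe possibility.

```python
# Defendant's choice: a sure loss vs. a gamble on escaping entirely
# (the 5% escape probability is assumed for illustration).
p_escape = 0.05
p_lose = 1 - p_escape
full_loss = 100_000
settlement_cost = 90_000

expected_loss_from_gamble = p_lose * full_loss  # 95,000

print(expected_loss_from_gamble)  # worse than the 90,000 settlement on average,
print(settlement_cost)            # yet the sure loss is the option that repels us
```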

 

We see this with politicians who take an “I’ll risk burning it all down to stay in power” approach, with poker players who get in too deep and misread an opponent or a hand, and with hockey teams who pull the goalie knowing that a sure loss is coming if they don’t take a big risk and get another offensive player on the ice while leaving their net open. When the sure loss is severe enough, then even large gambles with a minimal chance of success are worth the risk.

 

Kahneman and Tversky were surprised by this because it seems to violate our normal pattern of acting based on expected value. We don’t consciously calculate the expected value of an event, but we usually do act in rough accordance with it. However, in these desperate situations, we actually choose the option with the worse expected value. We become less sensitive to the very likely large loss and are unwilling to accept the sure loss, violating expectations of risk aversion.

Maintaining the Rules of Fairness with Signaling and Altruistic Punishment

Society is held together by many unspoken rules of fairness, and maintaining those rules is messy but rewarding work. We don’t just advocate for fairness in our own lives; we will go out of our way to call out unfairness when we see it hampering the lives of others. We will protest, march in the streets, and post outraged messages on social media to call out the unfairness we see in the world, even if we are not directly affected by it, or even stand to gain from the unfair status quo.

 

Daniel Kahneman, in Thinking Fast and Slow, shares some research studying our efforts to maintain the rules of fairness and why we are so drawn to it. He writes, “Remarkably, altruistic punishment is accompanied by increased activity in the pleasure centers of the brain. It appears that maintaining the social order and the rules of fairness in this fashion is its own reward.”

 

This idea reminds me of Robin Hanson’s book The Elephant in the Brain, in which Hanson suggests that a staggering amount of human behavior is little more than signaling. Much of what we do is not about the high-minded rationales we attach to our actions. Much of what we do is about something else, and our stated rationales are little more than pretexts and excuses. Altruistic punishment, or going out of our way to inflict some sort of punishment (verbal reprimands, loss of a job, or imprisonment), is not necessarily about the person who was treated unfairly or the person who was being unfair to others. It is quite plausibly more about our own pleasure, about the maintenance or establishment of a social order that we presumably will benefit from, and about signaling to the rest of society that we are someone who believes in the rules and will adhere to strict moral principles.

 

Troublingly, Kahneman continues, “Altruistic punishment could well be the glue that holds societies together. However, our brains are not designed to reward generosity as reliably as they punish meanness. Here again, we find a marked asymmetry between losses and gains.”

 

The second part of Kahneman’s quote refers to biases in our thinking, connecting our meanness or niceness toward others with our tendency toward loss aversion. Losses have a bigger mental impact on us than gains. We might not be consciously aware of this, but our actions – our willingness to inflict losses on others and our reluctance to endow gains on others – seem to reflect this bias. We create social order by constantly threatening others with the loss of social standing, while offering only minimal hope of gaining or improving social standing. Going back to the Hansonian framework from earlier, this makes sense. A gain in social status for another person is to some extent a loss to ourselves. Maintaining the social order involves maintaining or improving our relative social position. Tearing someone down signals to our allies that we are a valuable team member fighting on the right side, but lifting someone else up only diminishes our relative standing (unless they are a leader with whom we want to signal an alliance). Kahneman’s quote, when viewed through Robin Hanson’s perspective, is quite troubling for how our social order is built and maintained.

The Dominance of Loss Aversion

Loss aversion is a dominant force in many of our individual lives and in many of our societies. At this moment, I think it is one of the greatest barriers to change and growth that our entire world needs to overcome in order to move forward to address climate change, to create more equitable and cohesive societies, and to drive new innovations. Loss aversion has made us complacent, and we are feeling the cost of stagnation in our politics and in our general discontent, but at the same time we are paralyzed and unable to do anything about it. As Tyler Cowen wrote in The Complacent Class, “Americans are in fact working much harder than before to postpone change, or to avoid it altogether, and that is true whether we’re talking about corporate competition, changing residences or jobs, or building things. In an age when it is easier than ever before to dig in, the psychological resistance to change has become progressively stronger.”

 

My argument in this post is that much of the complacency and stagnation that Cowen has written about stems from loss aversion. In Thinking Fast and Slow, Daniel Kahneman writes, “Loss aversion is a powerful conservative force that favors minimal changes from the status quo in the lives of both institutions and individuals.” Additional research in the book shows that the pain and fear of loss is generally at least two times greater for most people than the pleasure and excitement of gain. Before we make a bet, the payoff has to be at least twice what we could stand to lose. If we are offered $10 or a gamble for more money, we prefer the sure $10 over the gamble, until the payoff of the gamble far outweighs the possible loss of the guaranteed $10.
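
A simple way to express that threshold is the coin-flip test Kahneman uses: for a 50/50 gamble that risks some loss, most people demand a potential gain of roughly twice that amount before they will accept. A minimal sketch, treating the coefficient of 2 as an assumption in line with the “at least two times greater” figure above:

```python
# Coin-flip acceptance test under loss aversion.
# LOSS_AVERSION ~ 2 reflects the "at least two times greater" figure cited above.
LOSS_AVERSION = 2.0

def accepts_coin_flip(potential_gain: float, potential_loss: float) -> bool:
    """A 50/50 gamble feels acceptable only when the gain outweighs the loss
    after the loss is inflated by the loss aversion coefficient."""
    return 0.5 * potential_gain > 0.5 * LOSS_AVERSION * potential_loss

print(accepts_coin_flip(150, 100))  # False: a $150 upside is not enough to risk losing $100
print(accepts_coin_flip(250, 100))  # True: the payoff is more than twice the possible loss
```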

 

I believe this is at the heart of the trite saying that people become more “conservative” as they get older. The reality is that as people get older they acquire more wealth, are more likely to own a home, and secure their social standing. People are not “conservative” in some high-minded ideological sense of “conservatism”; they are self-interested and risk averse. They don’t want to risk losing their wealth, losing value on their home, or losing social status. To me, this more plausibly explains conservatism and complacency than do political ideology explanations or cultural decadence.

 

To me, Kahneman’s quote is supported by Cowen’s thoughts. Institutions are built and run by people. People within institutions, especially once the institutions have become well established, become risk averse. They don’t want to lose their job, their position as the office veteran who knows how to do everything, or their knowledge and authority in their field. As the potential for loss increases, people become increasingly likely to push back against change and risk, ensuring that we cannot lose what we have, but also forgoing changes that could greatly benefit all of us in the long run. Loss aversion has come to dominate how we organize our societies, and how we relate to one another, at individual, social, and political levels in the United States.

Loss Aversion & Golf

In his book Thinking Fast and Slow, Daniel Kahneman presents research from University of Pennsylvania economists Devin Pope and Maurice Schweitzer to demonstrate the power of loss aversion. Pope and Schweitzer look specifically at golf, and at how professional golfers perform when putting, to show that loss aversion factors into the golfers’ performance, a conclusion that to me feels both obvious and surprising at the same time.

 

Professional golfers don’t seem like the kind of people who should be subject to loss aversion on the course. Their performance doesn’t seem like it should be affected by whether a putt will earn them a birdie or save them from a bogey. However, Pope and Schweitzer challenge this thinking. Kahneman writes:

 

“Pope and Schweitzer reasoned from loss aversion that players would try a little harder when putting for par (to avoid a bogey) than when putting for a birdie. They analyzed more than 2.5 million putts in exquisite detail to test that prediction.
They were right. Whether the putt was easy or hard, at every distance from the hole, players were more successful when putting for par than for birdie.”

 

Golfers don’t want to shoot over par. They perform better when they face the possibility of going over par than when they have a chance to go under par. Exceeding par is viewed as a loss to avoid, while making birdie is an achievement to gain. Somewhere along the line, golfers internalize this, and their physical performance is altered, decreasing the likelihood of a successful birdie putt but increasing the likelihood of a successful par putt. Loss aversion is a powerful force, even in places we would not expect, like professional putting performance. If even professional golfers, who practice continually and are paid for their performance on the course, cannot escape loss aversion, then we should recognize that it can play a huge role in our own lives, and we should invest in systems and structures that help us avoid the costly mistakes that biases related to loss aversion can produce.

Avoiding Gambles

“Most people dislike risk (the chance of receiving the lowest possible outcome), and if they are offered a choice between a gamble and an amount equal to its expected value they will pick the sure thing,” writes Daniel Kahneman in Thinking Fast and Slow. I don’t want to get too far into expected value, but I think of it as a discount on the best outcome of a gamble, blended with the possibility of getting nothing. Rather than the expected value of a $100 bet being $100, the expected value comes in somewhere below that, maybe around $50, $75, or $85, depending on whether the odds of winning are so-so or pretty good. You will either win $100 or $0, not $50, $75, or $85, but the element of risk causes us to value the bet at less than the full amount up for grabs.
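
Strictly speaking, the expected value is just the probability-weighted average of the possible outcomes. A minimal sketch with assumed odds shows where numbers like $50, $75, or $85 come from:

```python
# Expected value of an all-or-nothing $100 bet at a few assumed win probabilities.
def expected_value(p_win: float, prize: float = 100.0) -> float:
    """Probability-weighted average of winning the prize or winning nothing."""
    return p_win * prize + (1 - p_win) * 0.0

for p in (0.50, 0.75, 0.85):
    print(f"P(win) = {p:.2f} -> expected value = ${expected_value(p):.0f}")
```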

 

What Kahneman describes in his book is an interesting phenomenon: people mentally (or perhaps subjectively is the better way to put it) calculate an expected value when faced with a betting opportunity. If the expected value they calculate for themselves is not much higher than a guaranteed option, they will pick the guaranteed option. The quote I used to open the post explains the phenomenon, which you have probably seen if you have watched enough game show TV. As Kahneman continues, “In fact a risk-averse decision maker will choose a sure thing that is less than the expected value, in effect paying a premium to avoid the uncertainty.”
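
Kahneman’s point about paying a premium to avoid uncertainty can be phrased as a certainty equivalent: the guaranteed amount a person treats as just as good as the gamble. A small sketch under an assumed square-root utility (an illustrative choice, not anything from the book):

```python
import math

# Risk aversion via a concave (square root) utility, an assumed illustrative choice.
# The certainty equivalent is the sure amount whose utility equals the gamble's expected utility.
p_win, prize = 0.5, 100.0

expected_value = p_win * prize                # $50 on average
expected_utility = p_win * math.sqrt(prize)   # 0.5 * 10 = 5
certainty_equivalent = expected_utility ** 2  # $25 feels just as good as the gamble
risk_premium = expected_value - certainty_equivalent

print(certainty_equivalent, risk_premium)     # 25.0 25.0 -- the premium paid to avoid uncertainty
```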

 

On game shows, people will frequently walk away from the chance at a big payoff with a modest sum of cash if they are risk averse or if the odds seem stacked against them. What is interesting is that we can study when people make the bet versus when they walk away, and observe patterns in our decision making. We can predict the situations that drive people toward avoiding gambles and the situations that encourage them. It turns out that the reward has to be about two times the possible loss before people will take a gamble. If the certain outcome is pretty close to the expected outcome, people will pick the certain outcome. If there is no certain outcome, people usually need a reward that is at least twice what they might lose before they will be comfortable with a bet. We might like to take chances and gamble from time to time, but we tend to be pretty risk averse, and we tend to prefer guaranteed outcomes, even at a slight cost relative to the expected value of a bet, over the risk of losing it all.