Facilitating Behaviors Through Nudges


Plans held only in our own heads don’t seem to mean that much. I have had tons of plans to get things done around the house, to stop snacking on baked goods, and to read more, but I often find the time ticking by while I read news stories that don’t mean much to me or check Twitter. Having plans just in my head, plans I convince myself I will accomplish, isn’t an effective strategy for making the changes I want. However, there are strategies we can use to facilitate the behaviors we actually desire.

 

My last post was about the mere measurement effect. Just by measuring what people plan to do, simply by asking them if they plan to vote, plan to buy a new car this year, or intend to lose weight, people become more likely to actually follow through on a stated behavior. But there is a way to push the mere measurement effect even further: asking people how they are going to enact their plans. When you ask people how they plan to vote, where they plan to buy a new car, and what steps they plan to take to lose weight, they become even more likely to follow through on their intentions.

 

Asking people the how and when of a behavior they plan to adopt or an action they plan to take is a powerful and simple nudge. It is also something we can harness for ourselves. If we really want to make a change, we can’t just tell ourselves that tomorrow we will behave differently. Doing so will likely lead to letdown when the cookie temptations kick in around 2:30 in the afternoon, when we fail to get up at our early alarm, or when we are tired in the afternoon and put on a TV show. But if we have asked how we plan to make a change, then we can look ahead to the obstacles in our way: plan for a healthy snack when cravings kick in, set things up to make it easier to get out of bed, and hide the remote so we don’t turn on the TV without thinking.

 

Nudges don’t have to be external; they can be internal. We can use them to set a default course of action for ourselves or to push ourselves out of a default that we want to change. The quote that inspired this post is from Cass Sunstein and Richard Thaler’s book Nudge, where the authors write:

 

“The nudge provided by asking people what they intend to do can be accentuated by asking them when and how they plan to do it. This insight falls into the category of what the great psychologist Kurt Lewin called channel factors, a term he used for small influences that could either facilitate or inhibit certain behaviors.”
The Mere Measurement Effect - Joe Abittan


I listen to a lot of politics and policy podcasts, and one thing I have learned over the last few years is that simply asking and encouraging people to vote isn’t very effective. What is effective is asking people how they plan to vote. If you ask someone where their polling place is, how they plan to get there, when they plan to complete their mail-in ballot, and whether they will sit down with a spouse to vote, they actually become more likely to vote.

 

This seems like a strange phenomenon, but it appears that getting people to talk through the voting process helps cement their plans in their minds. The process seems to be related to the mere-measurement effect, which Richard Thaler and Cass Sunstein write about in their book Nudge. Writing about individuals who participate in surveys and their behavior after being surveyed, they write:

 

“Those who engage in surveys want to catalogue behavior, not influence it. But social scientists have discovered an odd fact: when they measure people’s intentions, they affect people’s conduct. The mere-measurement effect refers to the finding that when people are asked about what they intend to do, they become more likely to act in accordance with their answers.”

 

The mere-measurement effect, just like the questions about how and where a person actually plans to cast their ballot, is a nudge. The brain is forced to think about what a person is doing, and that establishes actual plans, behaviors, and goals within the mind. It is very subtle, but it shifts the thought patterns enough to actually influence behavior. Once we voice our intention to another person, we are more likely to actually follow through compared to when we keep our inner plans secret. This can be useful for a supermarket trying to sell a certain product, a politician trying to encourage supporters to vote, or for charitable organizations looking to get more donations. The mere-measurement effect is small, but can be useful for nudging people in certain directions.
Nudges for Unrealistic Optimism


Our society makes fun of the unrealistic optimist all the time, but the reality is that most of us are unreasonably optimistic in many aspects of our lives. We might not all believe that we are going to receive a financial windfall this month, that our favorite sports team will go from losing almost all their games last year to winning the championship this year, or that everyone in our family will suddenly be happy, but we still manage to be more optimistic about most things than is reasonable.

 

Most people believe they are better-than-average drivers, even though, by definition, only half the people in a population can be above the median. Most of us probably think we will get a promotion or raise sooner rather than later, and most of us probably think we will live to be 100 and won’t get cancer, go bald, or be in a serious car crash (after all, we are all above-average drivers, right?).
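As an aside, the “better than average” belief is only a strict impossibility for the median; with a skewed distribution, most people really can be above (or below) the mean. A quick sketch in Python makes the point (the numbers here are invented purely for illustration):

```python
import statistics

# Hypothetical "driving errors per year" for ten drivers: most make a few
# errors, one makes many (a right-skewed distribution).
errors = [1, 1, 2, 2, 2, 3, 3, 3, 4, 29]

mean = statistics.mean(errors)       # pulled upward by the outlier
median = statistics.median(errors)   # middle of the sorted values

# Nine of the ten drivers make fewer errors than the mean...
better_than_mean = sum(1 for e in errors if e < mean)
# ...but, by definition, only half can beat the median.
better_than_median = sum(1 for e in errors if e < median)

print(f"mean={mean}, median={median}")
print(f"{better_than_mean} of {len(errors)} drivers beat the mean")
print(f"{better_than_median} of {len(errors)} drivers beat the median")
```

So a population where most drivers are genuinely better than the average (mean) driver is possible; the overconfidence finding is striking because far more than half of people rate themselves above the middle of the pack.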

 

Our overconfidence is often necessary for daily life. If you are in sales, you need to be unrealistically optimistic that you are going to land a big sale, or you won’t keep picking up the phone for cold calls. We would all prefer the surgeon who leans overconfident to the surgeon who doubts their ability and asks whether we have finalized our will before we go into the operating room. Even for going to the store, doing a favor for a neighbor, or paying for sports tickets, overconfidence is a feature, not a bug, of our thinking. But still, there are times when overconfidence can be a problem.

 

The year 2020 is an excellent example. If we all think, “I’m not going to catch COVID,” then we are less likely to take precautions and more likely to actually catch the disease. This is where helpful nudges can come into play.

 

In Nudge, Cass Sunstein and Richard Thaler write, “If people are running risks because of unrealistic optimism, they might be able to benefit from a nudge. In fact, we have already mentioned one possibility: if people are reminded of a bad event, they may not continue to be so optimistic.”

 

Reminding people of others who have caught COVID might encourage them to take appropriate safety precautions. Reminding a person trying to trade stocks of previous poor decisions might encourage them to make better investment choices than trying their hand at day trading. A quick pop-up from a website blocker might encourage someone not to risk checking social media while they are supposed to be working, saving them from the one time their supervisor walks by while they are scrolling through someone’s profile. Overconfidence may be necessary for us, but it can lead to risky behavior with serious downsides. If slight nudges can help push people away from the catastrophic consequences of unrealistic optimism, then they should be employed.
Should We Assume Rationality?


The world is a complex place, and people have to make a lot of decisions within that complexity. Whether we are deliberate about it or not, we create and manage systems and structures for navigating the complexity and framing the decisions we make. However, each of us operates from a different perspective. We make decisions that seem reasonable and rational from our individual point of view, but that from the outside may seem irrational. The question is, should we assume rationality in ourselves and others? Should we think that we and other people are behaving irrationally when our choices seem to go against our own interests, or should we assume that people have a good reason to do what they do?

 

This is a current debate and challenge in the world of economics and a long-standing debate in the world of politics. In his book Thinking Fast and Slow, Daniel Kahneman seems to take the stance that people act rationally, at least from their own point of view. He writes, “when we observe people acting in ways that seem odd, we should first examine the possibility that they have a good reason to do what they do.”

 

Rational decision-making involves understanding a lot of risk. It involves processing many data points, having full knowledge of our choices and the potential outcomes we might face, and thinking through the short- and long-term consequences of our actions. It would seem, after reading his book, that Kahneman might argue truly rational thinking is beyond what our brains are ordinarily capable of managing. But to him, this doesn’t mean that people cannot still make rational choices and do what is in their best interests. When we see behaviors that seem odd, it is possible that the choices other people have made are still rational, but just require a different perspective.

 

The way people get to rationality, Thinking Fast and Slow suggests, is through heuristics that create shortcuts to decision-making and eliminate data that is more or less just noise. Markets can be thought of as heuristics in this way, allowing people to aggregate decisions and make choices with an invisible hand directing them toward rationality. So when we see people acting in ways that seem obviously irrational or opposed to their self-interest, we should ask whether they are making choices within an entirely different marketplace. What seems like odd behavior from the outside might be savvy signaling to a group we are not part of, might be a short-term indulgence that will stand out to the remembering self in the long run, and might make sense if we can change the perspective through which we judge another person.

 

Kahneman shows that we can predict biases and patterns of thought in ourselves and others, but still, we don’t know exactly what heuristics and thinking structures are involved in other people’s decision-making. A charitable way to look at people is to assume their decisions are rational from where they stand and in line with the goals they hold, even if the choices they make do not appear to be rational to us from the outside.

 

Personally, I am on the side that doubts human rationality. While it is useful, empathetic, and humanizing to assume rationality, I think it can be a mistake, especially if we go too far in accepting the perspective of others as justification for their acts. There are simply too many variables and too much information for us to make truly rational decisions or to fully understand the choices of others. My thinking is influenced by Kevin Simler and Robin Hanson, who argue in The Elephant in the Brain that we act on pure self-interest to a greater extent than we would ever admit, and that we hide our self-interested behaviors and decisions from everyone, including ourselves.

 

At the same time, I do believe that we can set up systems, structures, and institutions that help us make more rational decisions. Sunstein and Thaler, in Nudge, clearly show that markets can work and that people can be rational, but often need proper incentives and easy choice structures to encourage better choices. Gigerenzer, in Risk Savvy, ends up in a similar place, showing that we can get ahead of the brain’s heuristics and biases to produce rational thought. Creating the right frames, offering the right visual aids, and helping the brain focus on the relevant information can lead to rational thought. Nevertheless, as Kahneman shows, our thinking can still be hijacked and derailed, leading to choices that feel rational from the inside but appear to violate our best interests when our decisions are stacked and combined over time. Ultimately, the greatest power in assuming rationality in others is that it helps us understand multiple perspectives, and it might help us understand which nudges could help people change their behaviors and decisions to be more rational.
Framing Costs and Losses - Joe Abittan


“Losses evoke stronger negative feelings than costs. Choices are not reality-bound because System 1 is not reality-bound,” writes Daniel Kahneman in Thinking Fast and Slow.

 

We do not like losses. The idea of a loss, of having the status quo changed in a negative way without it being our deliberate choice, is hard for us to accept or justify. Costs, on the other hand, we can accept much more readily, even if the only difference between a cost and a loss is the way we choose to describe it.

 

Kahneman shares an example in his book where he and Amos Tversky did just that, changing the structure of a gamble so that the contestant either faced a possible $5 loss or paid a $5 cost up front with the possibility of gaining nothing back. The potential outcomes of the two gambles are exactly the same, but people interpret the gambles differently based on how the cost/loss is displayed. People are more likely to take a bet when it is posed as a cost rather than as a possible loss. System 1, the quick-thinking part of the brain, scans the two gambles and has an immediate emotional reaction to the idea of a loss, and that influences the ultimate decision and feeling about the two gambles. System 1 does not rationally calculate the two options to see that they are equivalent; it just acts on the intuition it experiences.
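The equivalence between the two framings is easy to verify with a toy version of the gamble. The payoffs below are illustrative, not the exact figures from Kahneman and Tversky’s study: a coin flip framed as a possible $5 loss versus a $5 up-front cost with a chance to win $10 back.

```python
from fractions import Fraction

p_heads = Fraction(1, 2)  # a fair coin

# Loss framing: flip a coin; heads you win $5, tails you LOSE $5.
loss_framing = {"heads": 5, "tails": -5}

# Cost framing: PAY $5 to play; heads you receive $10, tails nothing.
cost_framing = {"heads": -5 + 10, "tails": -5 + 0}

# State by state, the net outcomes are identical...
assert loss_framing == cost_framing

# ...and so are the expected values (zero for a fair coin).
def expected_value(gamble):
    return p_heads * gamble["heads"] + (1 - p_heads) * gamble["tails"]

assert expected_value(loss_framing) == expected_value(cost_framing) == 0
```

The arithmetic is identical either way; only the description changes, which is exactly why System 1’s different reaction to the two versions counts as a framing effect rather than a rational preference.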

 

“People will more readily forgo a discount than pay a surcharge. The two may be economically equivalent, but they are not emotionally equivalent.”

 

Kahneman goes on to describe research from Richard Thaler, who had studied credit card companies’ lobbying efforts to prevent gas stations from charging different rates for cash versus credit. When you pay with a card, the vendor pays a transaction processing fee to the credit card company. Gas stations charge more for credit card purchases because they have to pay a portion of every credit transaction on the back end. Credit card companies didn’t want gas stations to charge a credit card surcharge, effectively making it more expensive to buy gas with a card than with cash. Ultimately the companies couldn’t stop gas stations from charging different rates, but they did succeed in changing the framing around the different prices. Cash prices are listed as discounts, shifting the reference point to the credit price. As Kahneman writes, people will skip the extra effort that would earn the cash discount and pay with their cards. However, if people were told directly that there was a credit surcharge, that they had to pay more for the convenience of using their card, more of them might make the extra effort to pay with cash. How we frame a cost or a loss matters, especially because it can shift the baseline for consideration, making us see things as either costs or losses depending on the context, and potentially altering our behavior.
Depleting Self-Control


A theme that runs through a lot of my writing, influenced by Stoic thinkers such as Marcus Aurelius and modern academics and productivity experts like Cal Newport, is that we don’t have as much control over our lives as we generally believe. Aurelius’s writings show us how much happens beyond our control and how important it is to be measured and moderate in our reactions to the world. Newport’s work shows how easily our brains become distracted and how limited they are at sustaining long-term focus. Fitting in with both lines of thought is research from Daniel Kahneman, particularly an idea he presents in his book Thinking Fast and Slow about our depleting self-control. His work as a whole shows us just how much of our world we misunderstand and how important the structures, systems, and institutions in our lives can be.

 

Regarding our ability for self-control, Kahneman writes, “an effort of will or self-control is tiring; if you have had to force yourself to do something, you are less willing or less able to exert self-control when the next challenge comes around. The phenomenon has been named ego depletion.”

 

Self-control is overrated. We think of ourselves and others as having far more self-control than is really possible. We are quick to judge others for failing to exercise self-control, and we can beat ourselves up mentally when we don’t seem to be able to muster the self-control needed to achieve our goals, stick to a diet, or hold to a resolution. But the work of Roy Baumeister that Kahneman’s quote describes shows us that self-control is limited, and that we can run out of self-control when we are overly taxed. Self-control is not an unlimited characteristic that reveals a deep truth about our personality.

 

It is easy to think up situations where you might have to restrain yourself from behaving rudely, indulging in vices, or shying away from hard work. What is harder to see immediately is how your initial act of self-control will influence the situations that follow. If you spend all day trying hard not to open Twitter while working, then you might give in to a post-work cookie. If you sat through an uncomfortable family dinner and restrained yourself from yelling at your relatives, then you might find it hard to hold back from speeding down the freeway on the drive home. We don’t like to think of ourselves as being so easily influenced by things that happened in the past, but we are unable to truly separate ourselves from what happens around us. As we exert effort via self-control in one situation, we lose some of our ability to exert self-control in the next.

 

It is important that we keep Kahneman and Baumeister’s research in mind and think about how we set up our environment so that we are not fighting a self-control battle all day long. There are tools that will stop you from opening certain websites while you are supposed to be working; you might have to decide that you just won’t buy any cookies, so they are not in the house at 2 in the afternoon when your sweet tooth acts up; and you may need to just Uber to and from those tense family dinners. If we put it all on ourselves to have self-control, then we will probably fail, but if we set up our environment properly, and give up some of the idea of self-control, then we will probably be more successful in the long run.
Gossip Machines


Humans are gossip machines. We like to talk about and think about other people, especially the negative traits and qualities of others. At the same time, we are self-deception machines. We downplay our own faults, spend little time thinking about our mistakes, and deny any negative quality about ourselves. Even when we are the only audience for our thoughts, we hide our own flaws and instead nitpick all the things we dislike about other people.

 

As Daniel Kahneman writes in his book Thinking Fast and Slow, “it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own.”

 

But gossip isn’t necessarily as bad as we usually make it out to be. It is certainly not good to constantly speak badly of other people, to find faults in others, and to ignore our own shortcomings. It can make us vain, destroy our relationships with friends and family, and give us a bad reputation among the people in our lives. And yet we all engage in gossip; it pops up on social media today, in movies from the ’80s and ’90s, and even in the journals of our nation’s founding fathers. Gossip seems to have always been with us, and while we are quick to highlight its evils, it also seems to be an important part of human society.

 

Kahneman continues, “The expectation of intelligent gossip is a powerful motive for serious self-criticism, more powerful than New Year resolutions to improve one’s decision making at work and at home.”

 

We do not live in a vacuum. We are not isolated from society and other humans, and as a result we understand ourselves and think about ourselves in relation to other people. We partake in gossip, and we know that other people gossip about us. This creates an important constraint on our actions and behaviors, shaping the way we live our lives. Knowing that other people will judge us prevents many negative behaviors, such as reckless driving, living in unsanitary conditions, or being deliberately mean to other people. While gossip certainly has a lot of problems, it does in some ways push how we behave in society in positive directions.

 

We might not want to think about our own flaws, but knowing that humans are gossip machines forces us to at least consider, some of the time, how we will appear to other people. This can drive us to act in accordance with social norms, and it can be the bedrock of a society that cooperates and coordinates its efforts and goals. Without gossip, we might have a harder time bringing ourselves together to live in harmony.

The Time for Being Moral

Morality is one of the spaces that, I think, demonstrates how little our actions and behaviors actually align with the way we think about ourselves and the level of control we have over our lives. We believe that we are the masters of our own ship and that we are in control of what we do in the way a CEO is in control of a company. We blame people when they make mistakes, hide our own shortcomings, and are pretty tough on ourselves when we don’t do the things we said we would do.

 

We hold ourselves and others to high moral standards and approach morality as if there were one fixed standard set in stone, but we don’t actually live that out. In reality, small factors that we ignore or completely fail to recognize play a huge role in our actual behaviors and shape how we think about morality and whether we behave in a way consistent with the moral values we claim to hold. One example is time.

 

In his book When, author Dan Pink looks at the ways humans behave and interact with the world at different times of the day. Most people start their day with a generally positive affect that peaks somewhere around four to six hours after waking. Their mood then plummets into a trough in the middle of the day and afternoon, when their affect is more negative, their patience is shorter, and their attention is dulled. But, luckily for us all, people generally rebound in the late afternoon and early evening, and their mood and affect improve. Night owls show the same pattern, but generally reversed: starting out in more of a rebound phase, running through a trough in the middle of the day, and peaking in the evening.

 

Studies seem to show that this cycle holds for our attentiveness, our mood, and also our morality. Pink writes, “synchrony even affects our ethical behavior. In 2014 two scholars identified what they dubbed the morning morality effect, which showed that people are less likely to lie and cheat on tasks in the morning than they are later in the day.” Pink continues to explain that subsequent research seems to indicate that we are more moral during our peak. Most people are morning people and are most moral in the mornings. Night owls seem to be more moral in the evening when they hit their peak.

 

It seems strange that we would have certain times when we behave more morally. In the standard story we tell ourselves, we are rational agents who are not influenced by cheesy commercials, insignificant details, or the random time at which something takes place. We are the masters of our own destiny, and we are in control of our own behavior and thoughts. This story, however, doesn’t seem to be an accurate reflection of our lives. If simply changing the time of day at which we make a moral decision changes the outcome of that decision, then we should ask whether we really are in control of our thoughts and actions. It seems that we are greatly influenced by things that really shouldn’t matter when it comes to crucial decisions about morality and our behavior.

Take a Close Look at What Feels Right

A topic I am fascinated by and plan to dig into in the future is motivated reasoning. We are great at finding all the reasons and examples for why the things we do are overwhelmingly good and justified, while finding all the flaws in the people and things we dislike. Our brains seem to be wired to tell us that what benefits us is inherently good for the world, while things that harm us are inherently evil. As Kevin Simler and Robin Hanson write in The Elephant in the Brain, “What feels, to each of you, overwhelmingly right and undeniably true is often suspiciously self-serving, and if nothing else, it can be useful to take a step back and reflect on your brain’s willingness to distort things for your benefit.” This is the essence of motivated reasoning, and we often don’t even realize we are doing it.

 

We each have a particular view of the world that feels foolproof. We have our own experiences and knowledge, and the way we see the world grows out of them. It will always feel right to us because it is directly dependent on the inputs we observe, recognize, and cognitively arrange. But we should be able to recognize that the worldview we hold will always be an incomplete and imperfect model. We can’t have every experience in the world, and we can’t know all the information about the universe. We will always have flaws in our opinions because we can’t have a perfect, all-encompassing perspective. There will always be gaps, and there will always be inaccuracies.

 

When we train ourselves to remember that we don’t have all the information and all the background experiences necessary to fully understand the world, we can start to approach our own thoughts and opinions with more skepticism. It is easy to be skeptical of out-of-date baby boomer advice, and it is easy to discount the political views of someone in the other party, but it is much harder to discount something that feels overwhelmingly accurate to us yet might be wrong or only partially true, especially if we stand to benefit in one way or another.

 

At the end of the day we likely will have to make some type of decision related to our incomplete and inaccurate worldview. Even if we step back and observe what is going through our mind and where we might have blind-spots, we may find that we reach the same conclusion. That is perfectly fine, as long as we understand where we may be wrong and work to improve our understanding in that area. Or, we might acknowledge that we don’t know it all and be willing to accept some type of compromise that might slightly diminish our self-interest but still hold true to the underlying value at the heart of our decision. This is likely the only way our fractured societies can move forward. We must admit we can’t know it all and we must be willing to admit that sometimes we act out of self-interest in favor of our own personal values rather than acting based on immutable truths. From there we can start to find areas where it makes sense for us to give up a small piece and be willing to experiment with something new. A disposition toward this type of thinking can help us actually develop and make real progress toward a better world.

Religion As a Community Social Structure

There are not many things that pull people together quite like religious beliefs. Sports pull us together when our kids are on the same team, when we are all in a stadium, or when two of us are wearing the right hat on an airplane, but those don’t make for strong ties that are lasting and uniting. Religion offers an entire worldview and set of corresponding behaviors that do create lasting ties between people who otherwise wouldn’t have much in common and wouldn’t likely interact for any significant time. Robin Hanson and Kevin Simler look at religion in their book The Elephant in the Brain to understand the ways that religious signaling, behaviors, and beliefs operate in ways that often go unnoticed.

 

They quote a few authors in a short section that stood out to me:
“Religion,” says Jonathan Haidt, “is a team sport.”
“God,” says Emile Durkheim, “is society writ large.”

 

Simler and Hanson go on to explain what this community and larger social aspect of religion means given that we tend to think of religion more as a private belief system:

 

“In this view, religion isn’t a matter of private beliefs, but rather of shared beliefs and, more importantly, communal practices. These interlocking pieces work together, creating strong social incentives for individuals to act (selfishly) in ways that benefit the entire religious community. And the net result is a highly cohesive and cooperative social group. A religion, therefore, isn’t just a set of propositional beliefs about God and the afterlife; it’s an entire social system.”

 

Religions typically encourage pro-social behaviors that get people thinking more about a cohesive group than about selfish motives. By pursuing these pro-social behaviors, people can gain status and prestige in society. For selfish reasons, then (at least to some extent), people pursue the religious dictates of their society in their own personal lives. As they do, positive externalities may arise, creating a society that is more cohesive and supportive all around. This might not always happen, but a shared system for understanding the world, our place in it, and the stories about who we are and why we exist helps create the social fabric and social capital that further encourage cooperation and social cohesion. In a weird way, our selfish motives encourage religion, even if we don’t acknowledge it and assume that religion is entirely about personal beliefs.