Narrative Fallacies

With the possible exception of professional accountants and actuaries, we think in narratives. How we understand important aspects of our lives, such as who we are, the opportunities we have had, the decisions we have made, and how our society works, is shaped by the narratives we create in our minds. We use stories to make sense of our relationships with other people, to understand where our future is heading, and to motivate ourselves to keep going. Narratives are powerful, but so are the narrative fallacies that can arise from the way we think.

Daniel Kahneman, in Thinking Fast and Slow, demonstrates the ways in which our brains take short-cuts, rely on heuristics, and create narratives to understand a complex world. He shows how these thinking strategies can fail us in predictable ways due to biases, illusions, and judgments made on incomplete information. Narrative fallacies can arise from all three of the cognitive errors I just listed. Digging deeper into narrative fallacies, Kahneman writes,

“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.”

We don’t really know how to judge probabilities, possibilities, and the consequences of things that didn’t happen. We are biased to see agency in people and things when luck was more of a factor than any direct action or individual decision. We are motivated and compelled by stories of the world that simplify the complexity of reality, taking a small slice of the world and turning that into a model to describe how we should live, behave, and relate to others.

Unfortunately, in my opinion, narrative fallacies cannot be avoided. I studied public policy, and one of the frameworks for understanding political decision-making that I think deserves far more direct attention is the Narrative Policy Framework, which incorporates the idea of Social Constructions of Target Populations from Anne Schneider and Helen Ingram. We understand the outcome of an event based on how we think about the person or group affected by it. A long prison sentence for a person who committed a violent crime is fair and appropriate. A tax break for parents who work full time is also fair and appropriate. In both instances, we think about the person receiving the punishment or reward, and we judge whether they deserve it. We create a narrative to explain why we think the outcomes are fair.

We cannot exist in a large society of millions of people without shared narratives to help us explain and understand our society collectively. We cannot help but create a story about a certain person or group of people, and build a narrative to explain why we think that person or group deserves a certain outcome. No matter what, however, the outcomes will not be rational; they will be biased and contain contradictions. We will judge groups positively or negatively based on stories that may or may not be accurate and complete, and people will face real rewards or punishments depending on how we construct our narratives and what biases are built into our stories. We can’t escape this reality because it is how our brains work and how we create a cohesive society. But we can at least step back and admit that this is how our brains work, admit that our narratives are subject to biases and based on incomplete information, and decide how we want to move forward with new narratives that unify our societies rather than pit them against each other in damaging competition.

Intuitive Predictions and Intensity Matching

“Intuitive predictions are almost completely insensitive to the actual predictive quality of the evidence,” writes Daniel Kahneman in Thinking Fast and Slow. A lot of our thinking takes place in the part of our brain that is good at making quick connections, detecting patterns, and forming fast judgments. The deeper, more thoughtful part of our brain only engages with the world when it really needs to, when we have to do some critical thinking to sort out a math problem, write a blog post, or figure out how to grind down some grains to make bread. The result is that many of our thinking processes happen at a quick and intuitive level that is subject to biases and assumptions based on incomplete information. When we do finally turn our critical thinking brain to a problem, it is only operating with a limited set of information from the quick part of our brain, which scanned the environment and grabbed whatever stood out.

When we make a prediction without sitting down to do the math or weigh the relevant factors with pen and paper, the prediction will seem logical but will miss critical information. We will make connections between ideas and experiences that might not reflect the actual world. We will simplify the prediction by answering an easier question and substituting that answer for the more difficult question we are actually trying to answer.

This year, as in 2016, we will see this in action. In 2016, for me and many of the people I know, it seemed as though very few people supported Donald Trump for president. I saw very few bumper stickers or yard signs for Trump, all the social media posts I saw highlighted his worst moments, and the news coverage I consumed described why he was unfit to be president. Naturally enough, I believed he would lose in a landslide. Of course, that did not happen. Intuitively I was sure that Clinton would win, and Kahneman’s research helps explain why I should have been more skeptical of my natural intuition.

Part of the problem was that my intuitive prediction was an exercise in intensity matching, and as Kahneman writes, “Intensity matching yields predictions that are as extreme as the evidence on which they are based.” All the information I saw highlighted how terrible Trump was. I didn’t see many people supporting Trump, and I didn’t see news stories justifying his candidacy. I didn’t see people in my immediate environment who strongly supported him, so my intuition was biased. It didn’t help that I did nothing to seek out people who did support him or information outlets that posted articles or stories in support of him.

Kahneman’s writing aligns with my real-world experience. His studies of the brain and of our predictive machinery reveal biases and errors in our thinking. Our intuition is based on a limited set of information that the quick part of our brain can put together. When we do engage our deep thinking brain, it can still only operate on that limited information, so even if we do think critically, we are likely to make mistakes because we can’t see the full picture, and biases in the information we absorb will predictably shape the direction of our miscalculations. What might feel natural and obvious to us could be a result of faulty intensity matching and random chance in the environment around us.

Fluency of Ideas

Our experiences and narratives are extremely important to consider when we make judgments about the world; however, we rarely think deeply about why we hold the beliefs we do. We rarely pause to consider whether our opinions are biased, whether our limited set of experiences shapes the narratives that play in our minds, and how this influences our entire outlook on life. Instead, we rely on the fluency of ideas to judge our thoughts and opinions as accurate.

In Thinking Fast and Slow, Daniel Kahneman writes about ideas from legal scholar Cass Sunstein and economist Timur Kuran to explain their views on fluency: “the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.” It is easy to characterize an entire group of people as hardworking, or lazy, or greedy, or funny based entirely on a single interaction with a single person from that group. We don’t pause to ask whether our interaction with one person is really a good reflection of all people who fit the same group as that person; instead, we allow the fluency of our past experiences to shape our opinions of all people in that group.

Our ideas, and the fluency with which those ideas come to mind, don’t have to come from our own personal experience. If a claim is repeated often enough, we will have trouble distinguishing it from truth, even if it is absurd and has no connection to reality. The idea will come to mind more fluently, and consequently it will start to feel true. We don’t need direct experience with something if a great marketing campaign has lodged an opinion or slogan in our minds that we can quickly recall.

If we are in an important decision-making role, we need to recognize this fluency bias. The fluency of ideas will drive us toward a set of conclusions that might not be in our best interests. A clever marketing campaign, a trite saying repeated by prominent public leaders, or a few extreme yet random personal experiences can bias our judgment. We have to find a way to step back, recognize the narrative at hand, and find reliable data to help us make better decisions; otherwise, we might end up judging ideas and making choices based on faulty reasoning.

As an addendum to this post (originally written on 10/04/2020), this morning I began The Better Angels of Our Nature: Why Violence Has Declined, by Steven Pinker. Early in the introduction, Pinker states that violence in almost all forms is decreasing, despite the fact that for many of us, it feels as though violence is as front and center in our world as ever before. Pinker argues that our subjective experience of out-of-control violence is in some ways due to the fluency bias that Kahneman describes from Sunstein and Kuran. Pinker writes,

“No matter how small the percentage of violent deaths may be, in absolute numbers there will always be enough of them to fill the evening news, so people’s impressions of violence will be disconnected from the actual proportions.” 

The fluency effect causes an observation to feel correct, even if it is not reflective of actual trends or rates in reality.

Biased in Predictable Ways

“A judgment that is based on substitution will inevitably be biased in predictable ways,” writes Daniel Kahneman in his book Thinking Fast and Slow. Kahneman uses an optical illusion to show how our minds can be tricked in specific ways, leading us to an incorrect conclusion. The key takeaway is that we can understand and predict our biases and how those biases will lead to specific patterns of thinking. The human mind is complex and varied, but the errors it makes can be studied, understood, and predicted.

We don’t like to admit that our minds are biased, and even if we are willing to admit a bias in our thinking, we are often even less willing to accept a negative conclusion about ourselves or our behavior resulting from that bias. However, as Kahneman’s work shows, our biases are predictable and follow patterns. We know that we hold biases, and we know that certain biases can arise or be induced in certain settings. If we are going to accept these biases, then we must accept what they tell us about our brains and about their consequences, regardless of whether those consequences are trivial or have major implications for our lives and societies.

In a lot of ways, I think this describes the conflicts we are seeing in American society today. There are many situations where we are willing to admit that biases occur, but admitting and accepting a bias implicates larger social phenomena. Admitting a bias can make it hard to deny that larger social and societal changes may be necessary, and the costs of change can be too high for some to accept. This puts us in situations where many deny that bias exists, or live in contradiction, accepting a bias while rejecting any remedy for its consequences. A bias can be accepted, but the conclusion and recognition that biases are predictable and understandable can be rejected, despite the mental contradictions that arise.

As we have come to better understand how we behave and react to each other, we have studied more forms of bias in more settings. We know that we are quick to form in-groups and out-groups. We know that we see some people as more threatening than others, and that we are likely to have very small reactions that we might not consciously be aware of, but that can nevertheless be perceived by others. Accepting and understanding these biases with an intention to change is difficult. It requires not just that one person adapt their behavior, but that many people change some aspect of their lives, often giving up material goods, resources, or status. There is so much anger and division in the United States today because many people are ready to accept these biases, to accept the science that Kahneman presents, and to make changes, while many others are not. Accepting the science of how the brain works and the biases it can produce challenges our sense of self, reveals things about us that we would rather leave in the shadows, and might call for changes that many of us don’t want to make, especially when a fiction that denies such biases helps propel our status.

A Capacity for Surprise

For much of our waking life we operate on System 1, or we at least allow System 1 to be in control of many of our actions, thoughts, and reactions to the world around us. We don’t normally have to think very hard about our commute to work, we can practically walk through the house in the early morning on our way to the coffee machine with our eyes closed, and we can nod to the Walmart greeter and completely forget them half a second after we have passed. Most of the time, the pattern of associated ideas from System 1 is great at getting us through the world, but occasionally, something happens that doesn’t fit the model. Occasionally, something reveals our capacity for surprise.

Seeing someone juggling in the middle of a shopping aisle in Walmart would be a surprise (although less of a surprise in a Walmart than in some other places). Stepping on a stray Lego is an unwelcome early morning pre-coffee surprise, as is an unexpected road closure on our commute. These are examples of large surprises in our daily routine, but we can also have very small surprises, like when someone tells us we will be meeting with Aaron to discuss our personal financial plan, and in walks Erin, surprising us by being a woman in a position we may have subconsciously associated with men.

“A capacity for surprise is an essential aspect of our mental life,” writes Daniel Kahneman in his book Thinking Fast and Slow, “and surprise itself is the most sensitive indication of how we understand our world and what we expect from it.”

Because so much of our lives is in the hands of System 1, we are frequently surprised. If we consciously think about the world and the vast array of possibilities at any moment, we might not be too surprised at any given outcome. We also would be paralyzed by trying to make predictions of a million different outcomes for the next five minutes. System 1 eases our cognitive load and sets us up for routine expectations based on the model of the world it has adapted from experience. Surprise occurs when something violates our model, and one of the best ways to understand what that model looks like is to look at the situations that surprise us.

Bias is revealed through surprise, when an associated pattern is interrupted by something that we were not expecting. The examples can be harmless, such as not expecting a friend to answer the phone sick, with a raspy and sleepy voice. But often our surprise can reveal more consequential biases, such as when we are surprised to see a person of a racial minority in a position of authority. It might not seem like much, but our surprise can convey a lot of information about what we expect and how we understand the world, and other people might pick up on that even if we didn’t intend to convey a certain expectation about another person’s place in the world. We are constantly making predictions about what we will experience in the world, and our capacity for surprise reveals the biases that exist within our predictions, saying something meaningful about what our model of the world looks like.

A Condescending Impulse

In my last few posts I have written about Johann Hari’s research into Harry Anslinger, the nation’s first Commissioner of the Federal Bureau of Narcotics, and what Hari learned about Anslinger and the start of the nation’s war on drugs. Anslinger held deeply racist views, which he channeled into propaganda and drug policy in the United States. Hari was appalled by what he read: the common newspaper headlines about Anslinger’s raids from the time, and the quotes from the Commissioner himself. Writing about his research, Hari states,

“At times, as I read through Harry’s ever-stranger arguments, I wondered: How could a man like this have persuaded so many people? But the answers were lying there, waiting for me, in the piles of letters he received from members of the public, from senators and from presidents. They wanted to be persuaded. They wanted easy answers to complex fears. It’s tempting to feel superior – to condescend to these people – but I suspect this impulse is there in all of us. The public wanted to be told that these deep, complex problems – race, inequality, geopolitics – came down to a few powders and pills, and if these powders and pills could be wiped from the world, these problems would disappear.”

We live in a complex world, and we all lead busy lives that demand a lot of mental energy and attention just to keep the lights on. We hopefully figure out how to be successful and productive in our own lives, but we only ever get a single perspective on the world: our own. We want to believe that we are good people and that success in our society is entirely within the control of the individual (especially if we have become successful ourselves). When we face great uncertainty and complexity that doesn’t seem to line up with the experiences of our lives or the heuristics we have developed for how we live, we seek simple answers that confirm what we want to believe. That is what Hari’s quote shows.

Anslinger was building a coalition of like-minded individuals with racial prejudices who wanted to be proven right. They feared drugs, and they found drug users and addicts to be easy and defenseless targets. Drugs became a simple answer to the complex question of why some people became dregs of society while others became wealthy successes.

Hari’s quote points out that we should recognize this, but not demonize people for it. We should acknowledge that this instinct is within all of us, and we should not fall into the condescending impulse of turning around and vilifying those who vilify others. We must approach even our enemies, and those among us who are wrong and hold dangerous beliefs, with empathy. We must understand that the faults we find in them are faults that we too may have. The only way to connect and make real changes is to recognize and acknowledge these fears, and to work to demonstrate how simple answers to complex problems cannot possibly encompass all that is wrong in our societies, so that we can move forward with better ideas and policies in the future.

Immediate Evaluations

I will be honest with this one. I think President Donald Trump is a despicable human being, a lazy thinker, and too incompetent (not to mention unaware of his incompetence) to serve as President of the United States. As a result of my dislike of the President, I feel that I cannot trust anything he says. This is troubling because I am likely to immediately dismiss his evaluations and policies, assuming that they are wrong and potentially corrupt. I’m not going to blame myself 100% here (the President has done many things to make me and others suspicious of what he says), but I think it is important for me to recognize and acknowledge that I immediately dismiss anything he says and assume that anything he thinks is wrong.

The President is such a polarizing individual that he, and my reactions to him, serve as useful examples of how quickly we can make judgments about what other people say. We pick up on direct cues from others and interpret indirect identity cues to begin to make judgments about what others say, before they have even said anything.

In his book How to Win Friends and Influence People, Dale Carnegie quotes from the book On Becoming a Person by Carl Rogers, “Our first reaction to most of the statements (which we hear from other people) is an evaluation or judgment, rather than an understanding of it.”

When a friend we get along with and share similar interests and identities with starts to say something about a sports team we don’t have strong opinions about, we will probably agree with them instinctively. At the same time, when our uncle posts on Facebook about how terrible the political party we vote for is, we will likely scroll right by or block his post without giving it a second thought. There may not really be a reason to instantly agree with our friend about how good LeBron James is or to debate our uncle about his political philosophy, but we should nevertheless be aware of how quickly we make judgments about what other people think, say, and post on social media.

If we occupy a key decision-making role in a company, if we have to make decisions about our child’s education, or if we are thinking about our long-term retirement plans, it would be helpful to consider how quickly judgments happen. If we really like our financial adviser, we might instinctively agree with what he says, even if his advice isn’t as well researched and accurate as it should be. If we have had a combative relationship with our college-aged child, we might not be happy to hear that they switched out of a pre-med major, even if we know in our hearts that becoming a doctor might not be a good route for our son or daughter. If we understand how quickly our minds make decisions for us, we can push back and hopefully make better, more informed decisions. We can at least be aware of times when we make a snap judgment, try to seek other sources of information, and consider that we might be wrong and that the advice or decision of another is actually sound.

Motivated Reasoning – Arguments to Continue Believing As We Already Do

Recently I have been thinking a lot about the way we think. To each of us, it feels as though our thinking and our thought process are logical, that our assumptions about the world are sound and built on good evidence, and that, while we might have a few complex technical facts wrong, our judgments are not influenced by bias or prejudice. We feel that we take a wide range of information into consideration when making decisions, and we do not feel as though our decisions and opinions are influenced by meaningless information and chance.

However, science tells us that our brains often make mistakes, and that many of those mistakes are systematic. We also know people in our own lives who display wonderful thinking errors, such as close-mindedness, gullibility, and arrogance. We should be more ready to accept that our thinking isn’t any different from the minds of people in scientific studies that show the brain’s capacity to veer off course, and that we aren’t really any different from the person we complain about for being biased or unfair in their thinking about something or someone we care about.

What can make this process hard is the mind itself. Our brains are experts at creating logical narratives, including about themselves. We are great at explaining why we did what we did, why we believe what we believe, and why our reasoning is correct. Scientists call this motivated reasoning.

Dale Carnegie has a great explanation of it in his book How to Win Friends and Influence People, “We like to continue to believe what we have been accustomed to accept as true, and the resentment aroused when doubt is cast upon any of our assumptions leads us to seek every manner of excuse for clinging to it. The result is that most of our so-called reasoning consists in finding arguments for going on believing as we already do.” 

Very often, when confronted with new information that doesn’t align with what we already believe, doesn’t align with our own self-interest, or challenges our identity in one way or another, we don’t update our thinking but instead explain away or ignore the new information. Even for very small things (Carnegie uses the pronunciation of Epictetus as an example) we may ignore convention and evidence and back our beliefs with outdated and out-of-context examples that seem to support us.

In my own life I try to remember this, and whether it is my rationalization of why it is OK that I went for a workout rather than doing dishes, or my self-talk about how great a new business idea is, or me rationalizing buying that sushi at the store when I was hungry while grocery shopping, I try to ask myself if my thoughts and decisions are influenced by motivated reasoning. This doesn’t always change my behavior, but it does help me recognize that I might be trying to fool myself. It helps me see that I am no better than anyone else when it comes to making up reasons to support all the things that I want. When I see this in other people, I am able to pull forward examples from my own life of me doing the same thing, and I can approach others with more generosity and hopefully find a more constructive way of addressing their behavior and thought process. At an individual level this won’t change the world, but on the margins we should try to reduce our motivated reasoning, as hard as it may be, and slowly encourage those around us to do the same.

Judicial Sentencing and Daylight Saving Time

Our justice system in the United States is not the greatest system that we have developed. In recent years a lot of attention has been paid to disparities in sentencing and ways in which the system doesn’t seem to operate fairly. For instance, possession of the same amount of crack cocaine and powder cocaine carried different mandated sentences, even though it is the same drug in different forms. The sentencing differences represented a bias in how we treated the drug, given who was more likely to be a crack user versus a powder cocaine user.

In general, we believe that our system is fair and unbiased. We like to believe that our judges, jurors, and justice system officials are blind, only seeing the facts of the case and making rational decisions that are consistent from case to case. It is important that we believe our system works this way and that we take steps to ensure it does, but there is evidence that it does not and that basic factors of our humanity prevent the system from being perfectly fair.

An interesting example of the challenges of creating a perfectly balanced judicial system is presented in Daniel Pink’s book When. Pink’s book is an exploration of time and the power of timing in our lives. He presents evidence that the human mind’s decision-making ability deteriorates throughout the course of the day, becoming less nuanced, less analytical, and more easily distracted the longer we have been awake and the longer we have been focused on a task. Judges are no exception.

Pink references a study that shows that simple timing changes can impact the decisions that judges make, even when the timing seems as though it should be irrelevant. Pink writes, “Another study of U.S. federal courts found that on the Mondays after the switch to Daylight Saving Time, when people on average lose roughly forty minutes of sleep, judges rendered prison sentences that were about 5 percent longer than the ones they handed down on typical Mondays.”

A slight loss of sleep and a slight change in time resulted in inconsistent sentencing within our courts. The decisions our judges make are nuanced and challenging, and they have to make multiple life-impacting decisions each day. Unfortunately, the system within which they operate is not designed to provide consistency across schedules. Factors such as Daylight Saving Time, long stretches between lunch and breaks, and long daily schedules wear out our judges and lead to less nuanced thinking and less fair sentences. We should think about how our systems impact the decisions we make (within the judicial system, the corporate board room, and on the factory floor) and try to redesign those systems around time to help people make better and more consistent decisions.

Racial Bias Manifests When We Are Tired

Whether we want to admit it or not, we all make cognitive errors that result in biases, incorrect assessments, and bad decisions. Daniel Pink examines the timing of our errors and biases in his book When: The Scientific Secrets of Perfect Timing. It is one thing to simply say that biases exist, and another to try to understand what leads to biases and when such biases are most likely to manifest. It turns out that the time of day has a big impact on when we are likely to see biases in our thinking and actions.

Regarding a research study where participants were asked to judge a criminal defendant, Pink writes, “All of the jurors read the same set of facts. But for half of them, the defendant’s name was Robert Garner, and for the other half, it was Roberto Garcia. When people made their decisions in the morning, there was no difference in guilty verdicts between the two defendants. However, when they rendered their verdicts later in the day, they were much more likely to believe that Garcia was guilty and Garner was innocent.”

Pink argues that when we are tired, when we have had to make many decisions throughout the day, and when we have become exhausted from high cognitive loads, our decision-making slows down and we are less able to think rationally. We use shortcuts in our decisions, which can lead to cognitive errors. The case above shows how racial biases or prejudices may slip in when our brains are depleted.

None of us like to think of ourselves as impulsive or biased. And perhaps in the morning, after our first cup of coffee and before the stress of the day has gotten to us, we really are the aspirational versions of ourselves who we see as fair, honest, and patient. But the afternoon version of ourselves, the one who yells at other drivers in 5 p.m. traffic, is much less patient, more biased, and less capable of rational thought.

The idea of implicit biases, or prejudices that we don’t recognize that we hold, is controversial. None of us want to believe that we could make such terrible mistakes in thinking and treat two people so differently simply because a name sounds foreign. The study Pink mentions is a good way to approach this topic, to show that we are at the whim of our tired brains, and to demonstrate that we can, in a sense, have two selves. Our rational and patient post-coffee self is able to make better decisions than our afternoon I-just-want-to-get-home-from-work self. We are not the evil that manifests through our biases; rather, our biases are a manifestation of poor decision-making situations and mental fatigue. This is a lighter way to demonstrate the power and hidden dangers of our cognitive biases, and the importance of having people make crucial decisions at appropriate times. It is important to be honest about these biases so that we can examine the structures, systems, and institutions that shape our lives and create a society that works better for all of us, regardless of the time of day.