
On Prejudice

In Vices of the Mind Quassim Cassam writes, “A prejudice isn’t just an attitude towards something, someone, or some group, but an attitude formed and sustained without any proper inquiry into the merits or demerits of the object of prejudice.”
Prejudices are pernicious, and in his book Cassam describes them as epistemic vices. They color our perceptions and opinions about people, places, and things before we have any reasonable basis for holding such beliefs. They persist when we make no effort to investigate them, and they actively deter our discovery of new knowledge that would dismantle a prejudice. They are, in a sense, self-sustaining.
Prejudices obstruct knowledge by creating fear and negative associations with the people, places, and things we are prejudiced against. When we are in such a state, we feel no need, desire, or obligation to improve our point of view and possibly obtain knowledge that would change our mind. We actively avoid such information and discourage others from adopting points of view that would run against our existing prejudices.
I think that Cassam’s way of explaining prejudices is extremely valuable. When there is something we dislike, distrust, or are biased against, we should ask ourselves whether our opinions are based on any reality or simply on unmerited existing feelings. Have we formed our opinions without any real inquiry into the merits or demerits of the person, place, or thing that we scorn?
It is important that we ask these questions honestly and with a real willingness to explore topics openly. It would be very easy for us to set out to confirm our existing biases, to seek out only examples that support our prejudice. But doing so would only further entrench our unfair priors and give us excuses for being so prejudiced. It would not count as proper inquiry into the merits or demerits of the objects of our prejudice.
We must recognize when we hold such negative opinions without cause. Anecdotal thinking, closed-mindedness, and biases can drive us to prejudice. These epistemic vices obstruct our knowledge, may lead us to share and spread misinformation, and can have harmful impacts on our lives and the lives of others. There is no true basis for such beliefs other than our lack of reliable information and, potentially, our intentional choice to avoid conflicting information in order to further entrench our prejudices.

A Bias Toward Complexity

When making predictions or decisions in the real world, where there are many variables, high levels of uncertainty, and numerous alternative options to choose from, using a simple rule of thumb can be better than developing complex models for predictions. The intuitive sense is that the more complex our model, the more accurately it will reflect the real complexity of the world, and the better job it will do of making a prediction. If we can see that there are multiple variables, then shouldn’t our model capture the different alternatives for each of those variables? Wouldn’t a simple rule of thumb necessarily flatten many of the alternatives for those variables, failing to take into consideration the different possibilities that exist? Shouldn’t a more complex model be better than a simple heuristic?

 

The answer to these questions is no. We are biased toward complexity for numerous reasons: it feels important to build a model that tries to account for every possible alternative for each variable, we believe that having more information is always good, and we want to impress people by showing how thoughtful and considerate we are. Creating a model that accounts for all the different possibilities out there fits those preexisting biases. The problem, however, is that as we make our model more complex, it becomes more unstable.

 

In Risk Savvy, Gerd Gigerenzer explains what happens with variance and our models by writing, “Unlike 1/N, complex methods use past observations to predict the future. These predictions will depend on the specific sample of observations it uses and may therefore be unstable. This instability (the variability of these predictions around their mean) is called variance. Thus, the more complex the method, the more factors need to be estimated, and the higher the amount of error due to variance.”  (Emphasis added by me – 1/N is an example of a simple heuristic that Gigerenzer explains in the book.)

 

Our bias toward complexity can make our models and predictions worse when there is high uncertainty, many alternatives, and relatively little data. If we find ourselves in the opposite situation, where there is low uncertainty, few alternatives, and a plethora of data, then we can use very complex models to make accurate predictions. But when we are in the real world, making stock market or March Madness predictions, we should rely on a simple rule of thumb. The more complex our model, the more opportunities for us to misestimate a given variable. Rather than one error being offset by the numerous other point estimates within our model, the added complexity ends up creating more variance and a greater likelihood that our prediction will be further from reality than if we had flattened the variables with a simple heuristic.
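To make the variance point concrete, here is a minimal sketch in Python. It is my own toy example, not something from Risk Savvy: the polynomial models, the noise level, and every number in it are illustrative assumptions. It repeatedly fits a simple model and a complex model to small, noisy samples from the same process and compares how widely each model’s predictions scatter around their own mean.

```python
# Toy illustration of variance: the more parameters a method must estimate from a
# small sample, the more its predictions swing from sample to sample.
# All values below are made up purely for illustration.
import numpy as np

rng = np.random.default_rng(42)
true_f = lambda x: 2.0 * x + 1.0     # the underlying process (unknown to the models)
x_train = np.linspace(-1, 1, 20)     # only a handful of past observations
x_query = 0.9                        # the point we want to predict

predictions = {"simple rule (degree 1)": [], "complex model (degree 9)": []}
for _ in range(2000):
    # Each trial sees a different small, noisy sample of "past observations."
    y_train = true_f(x_train) + rng.normal(0.0, 0.5, size=x_train.size)
    for name, degree in [("simple rule (degree 1)", 1), ("complex model (degree 9)", 9)]:
        coefs = np.polyfit(x_train, y_train, degree)  # higher degree = more factors to estimate
        predictions[name].append(np.polyval(coefs, x_query))

for name, preds in predictions.items():
    preds = np.array(preds)
    print(f"{name}: mean prediction = {preds.mean():.2f}, "
          f"variance around that mean = {preds.var():.3f}")
# The complex model's predictions scatter far more widely around their mean;
# that instability is the variance Gigerenzer describes.
```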

The Human Need for Certainty

Throughout the book Risk Savvy, Gerd Gigerenzer discusses the challenges that people face with thinking statistically, assessing different probable outcomes, and understanding risk. Gigerenzer also discusses how important it is that people become risk literate, and how the future of humanity will require that people better understand risk and uncertainty. What this future requires, he explains, is fighting against aspects of human psychology that are common to all of us and form part of our core nature. One aspect in particular that Gigerenzer highlights as a problem for humans moving forward is our need for certainty.

 

“Humans appear to have a need for certainty, a motivation to hold onto something rather than to question it,” he writes. Whether it is our religion, our plans for retirement, or the brand of shoes we prefer, we have a need for certainty. We don’t want to question whether our religious, political, or social beliefs are correct. It is more comforting for us to adopt beliefs and be certain that we are correct. We don’t want to continuously re-evaluate our savings plans and open ourselves to the possibility that we are not doing enough to save for retirement. And we like to believe that we purchased the best running shoes, that we bought the most sustainable shoes for the planet, and that our shoe choices are the most popular. In all of these areas, ambiguity makes our decisions harder whereas a feeling of certainty gives us confidence and allows us to move through the world. In many ways, our need for certainty is simply a practicality. There are unlimited possibilities and decisions for us to make every day. Adopting certainty eliminates many possibilities and choices, simplifying our life and allowing us to move through the world without having to question every action of every second of every day.

 

But in the modern world, humans have to be more comfortable living with ambiguity and have to be able to give up certainty in some areas. “For the mature adult,” Gigerenzer writes, “a high need for certainty can be a dangerous thing.”  We live with risk and need to be able to adjust as we face new risks and uncertainties in our lives. We like to hold onto our beliefs and we are not comfortable questioning our decisions, but it can be necessary for us to do so in order to move forward and live in harmony in a changing world with new technologies, different demographics, and new uncertainties. A need for certainty can lead people to become dogmatic, to embrace apologetics when discounting science that demonstrates errors in thinking, and to ignore the realities of a changing world. One way or another, we have to find ways to be flexible and adjust our choices and plans according to risk, otherwise we are likely to make poor choices and be crushed when the world does not align itself with our beliefs and wishes.

Asymmetric Paternalism

While writing about the book Nudge by Richard Thaler and Cass Sunstein, I have primarily focused on an idea that the authors call Libertarian Paternalism. The idea is to structure choices and use nudges (slight incentives and structural approaches) to guide people toward making the best possible decision as judged by themselves. Maintaining free choice and the option to investigate or choose alternatives is an important piece of the concept, as is the belief that we will influence people’s decisions no matter what, so we should use that influence in a responsible way to help foster good decision-making.

 

But the authors also ask if it is reasonable to go a step beyond Libertarian Paternalism. Is it reasonable for choice architects, governments, and employers to go further than gentle nudges in decision situations? Are there situations where decision-making is too important to be left to the people, where paternalistic decision-making is actually best? Sunstein and Thaler present an introduction to Asymmetric Paternalism as one possible step beyond Libertarian Paternalism.

 

“A good approach to thinking about these problems has been proposed by a collection of behavioral economists and lawyers under the rubric of Asymmetric Paternalism. Their guiding principle is that we should design policies that help the least sophisticated people in society while imposing the smallest possible costs on the most sophisticated.”

 

This approach is appealing in many ways, but it also walks a fine line between maximizing good decision-making on the one hand and elitism and the marginalization of entire segments of society on the other. I hate having to make lots of decisions regarding appropriate tax filings, I don’t want to have to make decisions about lots of household appliances, and I don’t really want to spend too much time figuring out exactly what maintenance schedule is best for my cars. However, I do want to get into the weeds of my healthcare plan, I want to micromanage my exercise routine, and I want to select all the raw ingredients that go into the dinners and lunches that I cook. For some decisions, I want to outsource my decision-making and would often be happy to have someone else decide so that I don’t have to. But in other areas, I feel very sophisticated in my decision-making approach, and I want maximum choice and freedom. Asymmetric Paternalism seems like a good system for those of us who care deeply about some issues, are experts in some areas, and want to maintain full decision-making power in the areas we care about, while handing off decision-making in other areas to someone else.

 

Of course, prejudices, biases, and people’s self-interest can ruin this approach. What would happen if we allowed ourselves to deem entire groups of people as unworthy of making decisions for themselves by default? Could they ever recover and be able to exercise their freedom to choose in important areas like housing, retirement, and investments? Would we be able to operate for long periods of time under a system of Asymmetric Paternalism without the system devolving due to our biases and prejudices? These are real fears, and while we might like to selectively trade off decision-making when it is convenient for us, we also have to fear that someone else will be making decisions for us that serve interests other than our own.

 

The point, according to Sunstein and Thaler, would be to maintain the freedom of decision-making for everyone, but to structure choices in a way where those with less interest and less ability to make the best decisions are guided more strongly toward what is likely the best option for them. However, we can also see how this system of Asymmetric Paternalism could get out of control. How do we decide where to draw the line between strong guidance and outright choosing for people? Would people voluntarily give up their ability to choose and, over time, hand over too many decisions without any way to get their decision-making authority back? Transparency in the process may help, but it might not be enough to make sure the system works.

Paternalistic Nudges

In their book Nudge, Cass Sunstein and Richard Thaler argue in favor of libertarian paternalism. Their argument is that our world is complex and interconnected, and it is impossible for people to truly make decisions on their own. Not only is it impossible for people to simply make their own decisions, it is impossible for other people to avoid influencing the decisions of others. Whether we decide to influence a decision in a particular way, or whether we decide to try to avoid any influence on another’s decision, we still shape how decisions are presented, understood, and contextualized. Given this reality, the best alternative is to try to help people make consistently better decisions than they would without aid and assistance.

 

The authors describe libertarian paternalism by writing:

 

“The approach we recommend does count as paternalistic, because private and public choice architects are not merely trying to track or to implement people’s anticipated choices. Rather, they are self-consciously attempting to move people in directions that will make their lives better. They nudge.”

 

The nudge is the key aspect of libertarian paternalism. Forcing people into a single choice, forcing them to accept your advice and perspective, and aggressively trying to change people’s behaviors and opinions do not fit within the libertarian paternalism framework advocated by Sunstein and Thaler. Instead, a more subtle form of guidance toward good decisions is employed. People retain maximal choice if they want it; their opinions, decisions, and behaviors may be somewhat constrained, but almost nothing is completely off the table.

 

“A nudge,” Sunstein and Thaler write, “as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives.”

 

Daniel Kahneman, in his book Thinking Fast and Slow, demonstrated that people make predictable errors and have predictable biases. If we can understand these thinking errors and biases, then we can identify situations in which they are likely to lead people to make suboptimal decisions. To go a step further, as Sunstein and Thaler would suggest, if we are choice architects, we should design and structure choices in a way that leads people away from predictable cognitive biases and errors. We should design choices in a way that takes those thinking mistakes into consideration and improves the way people understand their choices and options.

 

As a real world example, if we are structuring a retirement savings plan, we can be relatively sure that people will anchor around the default contribution rate built into the plan. If we want to encourage greater retirement savings (knowing that economic data indicate people rarely save enough), we can set the default to 8% or higher, knowing that people may reduce the default rate but likely won’t eliminate contributions entirely. Setting a high default is a nudge toward better retirement saving. We could choose not to have a default rate at all, but then it is likely that people wouldn’t be sure what rate to select and might choose a low rate below inflation, or simply not enter a rate at all and fail to contribute anything to the plan. It is clear that there is a better outcome that we, as choice architects, could help people attain if we understand how their minds work and can apply a subtle nudge.
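To illustrate the logic of that default, here is a hypothetical toy model in Python. It is my own sketch, not anything from Nudge, and every behavioral number in it (the share of people who keep, reduce, or skip the default) is an invented assumption chosen only to show how a default can shift the distribution of contribution rates.

```python
# Hypothetical toy model of a default-rate nudge. The probabilities and rates below
# are invented assumptions for illustration, not empirical findings.
import random

random.seed(1)

def contribution_with_default(default_rate=0.08):
    """Assume most people anchor on the default; some dial it down, few pick a very low rate."""
    r = random.random()
    if r < 0.6:
        return default_rate          # stick with the 8% anchor
    elif r < 0.9:
        return default_rate * 0.5    # reduce the default, but keep contributing
    else:
        return 0.02                  # choose a low rate

def contribution_without_default():
    """Assume that with no anchor, many people choose a low rate or never enter one at all."""
    r = random.random()
    if r < 0.4:
        return 0.0                   # never get around to entering a rate
    elif r < 0.8:
        return 0.02                  # pick a low rate
    else:
        return 0.08                  # a minority saves at a healthy rate anyway

n = 10_000
avg_with = sum(contribution_with_default() for _ in range(n)) / n
avg_without = sum(contribution_without_default() for _ in range(n)) / n
print(f"average contribution with an 8% default: {avg_with:.1%}")
print(f"average contribution with no default:    {avg_without:.1%}")
```

Under these made-up assumptions the default more than doubles the average contribution rate, which is the kind of gap choice architects are trying to close.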

Competing Biases

I am trying to remind myself that everyone, myself included, operates on a complex set of ideas, narratives, and beliefs that are sometimes coherent, but often conflicting. When I view my own beliefs, I am tempted to think of myself as rational and realistic. When I think of others who I disagree with, I am prone to viewing them in a simplistic frame that makes their arguments irrational and wrong. The reality is that all of our beliefs are less coherent and more complex than we typically think.

 

Daniel Kahneman’s book Thinking Fast and Slow has many examples of how complex and contradictory much of our thinking is, even if we don’t recognize it. One example is competing biases that manifest within us as individuals and can be seen in the organizations and larger groups that we form. We can be exaggeratedly optimistic and paralyzingly risk averse at the same time, and sometimes this tendency can actually be a good thing for us. “Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.”

 

On a first read, I would expect the outcome of what Kahneman describes to be gridlock. The optimist (or the optimistic part of our brain) wants to push forward with a big new idea and plan. Meanwhile, loss aversion halts any decision making and prevents new ideas from taking root. The reality, as I think Kahneman would explain, is less a conscious and deliberate gridlock than an unnoticed drift toward certain decisions. Optimism wins out in an enthusiastic way when we see a safe bet or when a company sees an opportunity to capture rents. Loss aversion wins out when the bet isn’t safe enough and when we want to hoard what we already have. We don’t even realize when we are making these decisions; they just feel like obvious and clear directions. The reality is that we are constantly being jostled between exaggerated optimism and loss aversion.

 

Kahneman shows that these two biases are not mutually exclusive even though they may conflict. We can act on both biases at the same time; we are not exclusively risk-seeking optimists or exclusively risk averse. When the situation calls for it, we apply the appropriate frame at an intuitive level. Kahneman’s quote above shows that this can be advantageous for us, but throughout the book he also shows how biases in certain directions and situations can be costly for us over time as well.

 

We like simple and coherent narratives. We like thinking that we are one thing or another, that other people are either good or bad and right or wrong. The reality, however, is that we contain multitudes, act on competing and conflicting biases, and have more nuance and incongruity in our lives than we realize. This isn’t necessarily a bad thing. We can all still survive and prosper despite the complexity and the incoherent beliefs that we hold. Nevertheless, I think it is important that we acknowledge the reality we live within, rather than simply believing the tidy stories we like to tell ourselves.

Hindsight Bias and Accountability

“Increased accountability is a mixed blessing,” writes Daniel Kahneman in his book Thinking Fast and Slow. This is an idea I have come across before in books like Political Realism by Jonathan Rauch and The New Localism by Bruce Katz and Jeremy Nowak. Our go-to answer to challenges and problems tends to be increased transparency and greater oversight. However, in some complex fields, simply opening processes and decision-making procedures to more scrutiny and review can create new problems that might be even worse. This is a particular challenge when we consider the way hindsight bias influences the thoughts and opinions of the reviewing bodies.

 

Kahneman continues, “because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and – to an extreme – reluctance to take risks.”

 

Excess scrutiny and oversight can lead to rigid and mechanical decision-making processes. This might not be a problem when we are engineering a bridge and need to make technical decisions based on known mathematical calculations (I’ve never engineered a bridge, so I may be wrong here), but it can be a problem for doctors and policy makers. Doctors have to rely on their experience, their knowledge, and their intuitions to determine the best possible medical treatment. Checklists are fantastic ideas, but when things go wrong in an operating room, doctors and nurses have to make quick decisions balancing risk and uncertainty. If the oversight they will face is intense, then there is a chance that doctors stick to a rigid set of steps that might not really fit the current emergency. In his book, Kahneman writes about how this leads doctors to order unnecessary tests and procedures, more to cover themselves from liability than to truly help the patient, wasting time and money within the healthcare system.

 

For public decision-making, hindsight bias can be a disaster for public growth and development. The federal government makes loans and backs many projects. Like any venture capital firm or large bank making multiple investments, some of its projects will fail. It is impossible to know at the outset which of ten solar energy projects will be a massive success and which company is going to go bust. But thanks to hindsight bias and the intense oversight that public agencies and legislatures are subject to, an investment in a solar project that goes bust is likely to haunt the agency head or legislators who backed the project, even if the other nine projects were huge successes.

 

Oversight is important, but when oversight is subject to hindsight bias, accountability shifts into high gear, blaming decision-makers for failing to have the superhuman ability to predict the future. This creates risk-averse institutions that stagnate, waste resources, and are slow to act, potentially creating new problems and new vulnerabilities to hindsight bias in the future. Rauch, Katz, and Nowak, in the posts I linked to above, all favor reducing transparency in the public setting for this reason, but Kahneman might not agree with them, arguing that closing deliberations off from public view won’t hide the outcomes from the public and won’t stop hindsight bias from being an issue.

Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes New Prediction.” These headlines are always nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation; how could anyone have failed to see this as far back as 2004?

 

The problem, of course, is that the inevitable present moment and the pathway that seems so obvious in retrospect were never clear at all. There was no way to predict a major housing bubble and financial collapse in 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction, are pulling at our emotions and playing with hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean with their predictions and assume that their next grand claim is wrong.
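The claim that a lucky past call tells us nothing about the next prediction can be illustrated with a small, hypothetical simulation. This is my own toy example, not Kahneman’s; the number of forecasters and the 10% prediction rate are arbitrary assumptions. Every forecaster predicts purely at random, so whoever “called” the first crash did so by luck, and their hit rate on the next one is no better than anyone else’s.

```python
# Toy simulation: forecasters who predict at random. A lucky year-1 "crash call"
# carries no information about year-2 accuracy. All numbers are arbitrary assumptions.
import random

random.seed(7)
n_forecasters = 10_000
p_predict_crash = 0.1   # assumed chance any forecaster predicts a crash in a given year

# Year 1: a crash happens. The celebrated "geniuses" are those who happened to predict it.
geniuses = [random.random() < p_predict_crash for _ in range(n_forecasters)]

# Year 2: another crash happens, and everyone predicts at random again.
def year_two_hit_rate(selector):
    members = [i for i in range(n_forecasters) if selector[i]]
    hits = sum(1 for _ in members if random.random() < p_predict_crash)
    return hits / len(members)

others = [not g for g in geniuses]
print(f"year-2 hit rate of those who called the year-1 crash: {year_two_hit_rate(geniuses):.1%}")
print(f"year-2 hit rate of everyone else:                     {year_two_hit_rate(others):.1%}")
# Both hover around 10%: the earlier successful call was selection and luck, not skill.
```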

 

Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”

 

Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future, lead us to trust individuals who don’t deserve trust, and lead us to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as fodder for misleading headlines.

Narratives and Halos

Yesterday I wrote about narrative fallacies and how our brains’ desires to create coherent stories can lead to cognitive errors. One error, which I wrote about previously, is the halo effect, and in some ways it is a direct consequence of narrative thinking. Our brains don’t do well with conflicting information that doesn’t fit a coherent narrative, and the halo effect helps smooth over this problem in our minds.

 

In Thinking Fast and Slow, Daniel Kahneman writes, “The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations; good people do only good things and bad people are all bad.” When we already like someone or consider them a good person, the halo effect will simplify other judgments that we might have to make about them. If the person we admire is wearing a particular kind of coat, then we will assume that it is also a coat we should admire. If a person we dislike is engaging in some type of business, then we will assume that business is also bad. Contradictions occur when we see someone we admire wearing clothing we don’t find acceptable or when a person we know to have moral flaws engages in altruistic charity work.

 

Instead of accepting a contradiction in our narrative and creating a more complex story where some people are good in some situations but bad in others, we alter our judgments in other ways to maintain a coherent narrative. The person we like wearing strange clothes is a trendsetter, and that must be the new up-and-coming style we should try to emulate. The bad person engaged in charity isn’t really doing good things for good reasons; rather, they are being selfish and trying to show off through their charity.

 

When we reflect on our thinking and try to be more aware of the narratives we create, we can see that we fall into traps like the halo effect. What is harder to do, however, is overcome the halo effect and other cognitive errors that simplify our narratives once we have noticed them. It is hard to continually live with conflicting opinions and ideas about people, cities, sports teams, car companies, and shoe brands. It is much easier to adopt a few favorites and believe them to be good in all ways, rather than to accept that something might be great in some ways, but harmful or disappointing in others.

Narrative Fallacies

With perhaps the exception of professional accountants and actuaries, we think in narratives. How we understand important aspects of our lives, such as who we are, the opportunities we have had, the decisions we have made, and how our society works, is shaped by the narratives we create in our minds. We use stories to make sense of our relationships with other people and of where our future is heading, and to motivate ourselves to keep going. Narratives are powerful, but so are the narrative fallacies that can arise from the way we think.

 

Daniel Kahneman, in Thinking Fast and Slow, demonstrates the ways in which our brains take shortcuts, rely on heuristics, and create narratives to understand a complex world. He shows how these thinking strategies can fail us in predictable ways due to biases, illusions, and judgments made on incomplete information. Narrative fallacies can arise from all three of the cognitive errors I just listed. To get more in depth with narrative fallacies, Kahneman writes,

 

“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.”

 

We don’t really know how to judge probabilities, possibilities, and the consequences of things that didn’t happen. We are biased to see agency in people and things when luck was more of a factor than any direct action or individual decision. We are motivated and compelled by stories of the world that simplify the complexity of reality, taking a small slice of the world and turning that into a model to describe how we should live, behave, and relate to others.

 

Unfortunately, in my opinion, narrative fallacies cannot be avoided. I studied public policy, and one of the frameworks for understanding political decision-making that I think needs far more direct attention is the Narrative Policy Framework, which incorporates the idea of Social Constructions of Target Populations from Anne Schneider and Helen Ingram. We understand the outcome of an event based on how we think about the person or group impacted by its consequences. A long prison sentence for a person who committed a violent crime is fair and appropriate. A tax break for parents who work full time is also fair and appropriate. In both instances, we think about the person receiving the punishment or reward of a decision, and we judge whether they are deserving of that punishment or reward. We create a narrative to explain why we think the outcomes are fair.

 

We cannot exist in a large society of millions of people without shared narratives to help us explain and understand our society collectively. We cannot help but create a story about a certain person or group of people and build a narrative to explain why we think that person or group deserves a certain outcome. No matter what, however, the outcomes will not be rational; they will be biased and contain contradictions. We will judge groups positively or negatively based on stories that may or may not be accurate and complete, and people will face real rewards or punishments based on how we construct our narratives and what biases are built into our stories. We can’t escape this reality because it is how our brains work and how we create a cohesive society, but we can at least step back and admit this is how our brains work, admit that our narratives are subject to biases and are based on incomplete information, and decide how we want to move forward with new narratives that will help to unify our societies rather than pit them against each other in damaging competition.