A Condescending Impulse

In my last few posts I have written about Johann Hari’s research into Harry Anslinger, the nation’s first Commissioner of the Federal Bureau of Narcotics, and what Hari learned about Anslinger and the start of the nation’s war on drugs. Anslinger held deeply racist views which he channeled into propaganda and drug policy in the United States. Hari was appalled by what he read: the common newspaper headlines about Anslinger’s raids from the time, and the quotes from the Commissioner himself. Writing about his research, Hari states,

 

“At times, as I read through Harry’s ever-stranger arguments, I wondered: How could a man like this have persuaded so many people? But the answers were lying there, waiting for me, in the piles of letters he received from members of the public, from senators and from presidents. They wanted to be persuaded. They wanted easy answers to complex fears. It’s tempting to feel superior – to condescend to these people – but I suspect this impulse is there in all of us. The public wanted to be told that these deep, complex problems – race, inequality, geopolitics – came down to a few powders and pills, and if these powders and pills could be wiped from the world, these problems would disappear.”

 

We live in a complex world and we all lead busy lives that demand a lot of mental energy and attention just to keep the lights on. We hopefully figure out how to be successful and productive in our own lives, but we only ever get a single perspective on the world, our own. We want to believe that we are good people and that success in our society is entirely within the control of the individual (especially if we have become successful ourselves). When we face great uncertainty and complexity which doesn’t seem to line up with experiences of our lives or the heuristics we have developed for how we live, we seek simple answers that confirm what we want to believe. That is what Hari’s quote shows.

 

Anslinger was building a coalition of like-minded individuals with racial prejudices who wanted to be proven right. They feared drugs, and found drug users and addicts to be easy and defenseless targets. Drugs became a simple answer to the complex problems of why some people became dregs of society while others became wealthy successes.

 

Hari’s quote points out that we should recognize this, but not demonize people for it. We should acknowledge that this instinct is within all of us, and we should not fall into this condescending impulse and turn around and vilify those who are vilifying others. We must approach even our enemies and those among us who are wrong and hold dangerous beliefs with empathy. We must understand that the faults we find in them are faults that we too may have. The only way to connect and make real changes is to recognize and acknowledge these fears, and work to demonstrate how these simple answers to complex problems cannot possibly encompass all that is wrong in our societies so that we can move forward with better ideas and policies in the future.

Immediate Evaluations

I will be honest with this one. I think President Donald Trump is a despicable human being, a lazy thinker, and too incompetent (not to mention unaware of his incompetence) to serve as President of the United States. As a result of my dislike of the President, I feel that I cannot trust anything he says. This is troubling because I am likely to immediately dismiss his evaluations and policies, assuming that they are wrong and potentially corrupt. I’m not going to blame myself 100% here (the President has done many things to make me and others suspicious of what he says), but I think it is important for me to recognize and acknowledge that I immediately dismiss anything he says and immediately assume that anything he thinks is wrong.

 

The President is such a polarizing individual that he, and my reactions to him, serve as useful examples of how quickly we can make judgments about what other people say. We pick up on direct cues from others and interpret indirect identity cues to begin to make judgments about what others say, before they have even said anything.

 

In his book How to Win Friends and Influence People, Dale Carnegie quotes from the book On Becoming a Person by Carl Rogers, “Our first reaction to most of the statements (which we hear from other people) is an evaluation or judgment, rather than an understanding of it.”

 

When a friend that we get along with and share similar interests and identities with starts to say something about a sports team that we don’t have strong opinions about, we will probably agree with them in an instinctive manner. At the same time, when our uncle posts on Facebook about how terrible the political party we vote for is, we will likely scroll right by or block his post without actually giving it a second thought. There may not really be a reason to instantly agree with our friend about how good LeBron James is or to debate our uncle about his political philosophy, but we should nevertheless be aware of how quickly we make judgments about what other people think, say, and post on social media.

 

If we occupy a key decision-making role in a company, if we have to make decisions about our child’s education, or if we are thinking about our long-term retirement plans, it would be helpful for us to consider how quickly judgments happen. If we really like our financial adviser, we might instinctively agree with what he says, even if his advice isn’t as well researched and accurate as it should be. If we have had a combative relationship with our college-aged child, we might not be happy to hear that they switched out of a pre-med major, even if we know in our hearts that becoming a doctor might not be a good route for our son or daughter. If we understand how quickly our minds make decisions for us, we can push back and hopefully make better, more informed decisions. We can at least be aware of times when we make a snap judgment, try to seek other sources of information, and consider that we might be wrong and that the advice or decision of another is actually sound.

Motivated Reasoning – Arguments to Continue Believing As We Already Do

Recently I have been thinking a lot about the way we think. To each of us, it feels as though our thinking and our thought process is logical, that our assumptions about the world are sound and built on good evidence, and that we might have a few complex technical facts wrong, but our judgments are not influenced by bias or prejudice. We feel that we take into consideration wide ranges of data when making decisions, and we do not feel as though our decisions and opinions are influenced by meaningless information and chance.

 

However, science tells us that our brains often make mistakes, and that many of those mistakes are systematic. We also know people in our own lives who display wonderful thinking errors, such as closed-mindedness, gullibility, and arrogance. We should be more ready to accept that our thinking isn’t any different from the minds of people in scientific studies that show the brain’s capacity to veer off course, and that we aren’t really any different from the person we complain about for being biased or unfair in their thinking about something or someone we care about.

 

What can make this process hard is the mind itself. Our brains are experts at creating logical narratives, including about themselves. We are great at explaining why we did what we did, why we believe what we believe, and why our reasoning is correct. Scientists call this motivated reasoning.

 

Dale Carnegie has a great explanation of it in his book How to Win Friends and Influence People, “We like to continue to believe what we have been accustomed to accept as true, and the resentment aroused when doubt is cast upon any of our assumptions leads us to seek every manner of excuse for clinging to it. The result is that most of our so-called reasoning consists in finding arguments for going on believing as we already do.” 

 

Very often, when confronted with new information that doesn’t align with what we already believe, doesn’t align with our own self-interest, or that challenges our identity in one way or another, we don’t update our thinking but instead explain away or ignore the new information. Even for very small things (Carnegie uses the pronunciation of Epictetus as an example) we may ignore convention and evidence and back our beliefs with outdated and out-of-context examples that seem to support us.

 

In my own life I try to remember this, and whether it is my rationalization of why it is OK that I went for a workout rather than doing dishes, or my self-talk about how great a new business idea is, or me rationalizing buying that sushi at the store when I was hungry while grocery shopping, I try to ask myself if my thoughts and decisions are influenced by motivated reasoning. This doesn’t always change my behavior, but it does help me recognize that I might be trying to fool myself. It helps me see that I am no better than anyone else when it comes to making up reasons to support all the things that I want. When I see this in other people, I am able to pull forward examples from my own life of me doing the same thing, and I can approach others with more generosity and hopefully find a more constructive way of addressing their behavior and thought process. At an individual level this won’t change the world, but on the margins we should try to reduce our motivated reasoning, as hard as it may be, and slowly encourage those around us to do the same.

Judicial Sentencing and Daylight Saving Time

Our justice system in the United States is not the greatest system that we have developed. In recent years a lot of attention has been paid to disparities in sentencing and ways in which the system doesn’t seem to operate fairly. For instance, possession of the same amount of crack cocaine and powder cocaine carried different mandatory sentences, even though they are the same drug in different forms. The sentencing differences represented a bias in how we treated the drug, given who was more likely to be a crack user versus a powder cocaine user.

 

In general, we believe that our system is fair and unbiased. We like to believe that our judges, jurors, and justice system officials are blind, only seeing the facts of the case and making rational decisions that are consistent from case to case. It is important that we believe our system works this way and that we take steps to ensure it does, but there is evidence that it does not and that basic factors of our humanity prevent the system from being perfectly fair.

 

An interesting example of the challenges of creating a perfectly balanced judicial system is presented in Daniel Pink’s book When. Pink’s book is an exploration of time and the power of timing in our lives. He presents evidence that the human mind’s decision-making ability deteriorates throughout the course of the day, becoming less nuanced, less analytical, and more easily distracted the longer we have been awake and the longer we have been focused on a task. Judges are no exception.

 

Pink references a study that shows that simple timing changes can impact the decisions that judges make, even when the timing seems as though it should be irrelevant. Pink writes, “Another study of U.S. federal courts found that on the Mondays after the switch to Daylight Saving Time, when people on average lose roughly forty minutes of sleep, judges rendered prison sentences that were about 5 percent longer than the ones they handed down on typical Mondays.”

 

A slight loss of sleep and a slight change in time resulted in inconsistent sentencing within our courts. The decisions our judges make are nuanced and challenging, and our judges have to make multiple life-impacting decisions each day. Unfortunately, the system within which they operate is not designed to provide more consistency across scheduling. Factors such as Daylight Saving Time, long stretches between breaks and lunch, and long daily schedules wear out our judges and lead to less nuanced thinking and less fair sentences. We should think about how our systems impact the decisions we make (within the judicial system, the corporate board room, and on the factory floor) and try to redesign systems around time to help people make better and more consistent decisions.

Racial Bias Manifests When We Are Tired

Whether we want to admit it or not, we all make cognitive errors that result in biases, incorrect assessments, and bad decisions. Daniel Pink examines the timing of our errors and biases in his book When: The Scientific Secrets to Perfect Timing. It is one thing to simply say that biases exist, and another to try to understand what leads to biases and when such biases are most likely to manifest. It turns out that the time of day has a big impact on when we are likely to see biases in our thinking and actions.

 

Regarding a research study where participants were asked to judge a criminal defendant, Pink writes, “All of the jurors read the same set of facts. But for half of them, the defendant’s name was Robert Garner, and for the other half, it was Roberto Garcia. When people made their decisions in the morning, there was no difference in guilty verdicts between the two defendants. However, when they rendered their verdicts later in the day, they were much more likely to believe that Garcia was guilty and Garner was innocent.”

 

Pink argues that when we are tired, when we have had to make many decisions throughout the day, and when we have become exhausted from high cognitive loads, we slow down with our decision-making process and are less able to think rationally. We use short-cuts in our decisions which can lead to cognitive errors. The case above shows how racial biases or prejudices may slip in when our brains are depleted.

 

None of us like to think of ourselves as impulsive or biased. And perhaps in the morning, after our first cup of coffee and before the stress of the day has gotten to us, we really are the aspirational versions of ourselves who we see as fair, honest, and patient. But the afternoon version of ourselves, the one who yells at other drivers in 5 p.m. traffic, is much less patient, more biased, and less capable of rational thought.

 

The idea of implicit biases, or prejudices that we don’t recognize that we hold, is controversial. None of us want to believe that we could make such terrible mistakes in thinking and treat two people so differently simply because a name sounds foreign. The study Pink mentions is a good way to approach this topic and show that we are at the whim of our tired brains, and to demonstrate that we can, in a sense, have two selves. Our rational and patient post-coffee self is able to make better decisions than our afternoon I-just-want-to-get-home-from-work self. We are not the evil that manifests through our biases; rather, our biases are a manifestation of poor decision-making situations and mental fatigue. This is a lighter way to demonstrate the power and hidden dangers of our cognitive biases, and the importance of having people make crucial decisions at appropriate times. It is important to be honest about these biases so that we can examine the structures, systems, and institutions that shape our lives and create a society that works better for all of us, regardless of the time of day.

Donating to Faces

In the United States there is a lot of wealth and a lot of resources that are directed toward charity. One problem, however, is that the people who are the most in need of charity are generally in developing countries and economies on the other side of the globe. Those countries and individuals, where our donations from the United States could go the furthest, don’t manage to capture as much of the donation market as we might think they would, given the scale of need and the potential impact of our donations. Kevin Simler and Robin Hanson in The Elephant in the Brain call this the Relatability Problem of charitable donations.

 

They write, “we’re much more likely to help someone we can identify – a specific individual with a name, a face, and a story. First investigated by Thomas Schelling in 1968, this phenomenon has come to be known as the identifiable victim effect. The corresponding downside, of course, is that we’re less likely to help victims who aren’t identifiable.”

 

We might hear a news story about millions of people in a distant country being displaced by a major natural disaster. We might see lines of people trying to flee a destroyed town or countryside, but the further from us they are in terms of both distance and culture, the less likely we are to feel a burning desire to help them. I think that part of this comes from the rational side of our brains. We want to be sure that if we expend effort, energy, or resources, we can see the final product and know that something good happened. If we can see a single person in need who received a meal, a place to sleep, or a home repair as a result of our charity, then we will be more likely to make some type of donation to help, especially if we can see something of ourselves in their situation. When we just see statistics about how many people are in need and how many dollars helped however many people, we are less sure that our efforts really made a difference and actually applied to the problem at hand. This feels like it makes rational sense, but as I have detailed previously, our charity is usually not very rational to begin with, and our brains end up steering our charity toward less rational ends even as we pursue this seemingly rational aim.

 

Peter Singer gives an example of this in his book The Most Good You Can Do. If we see a campaign for the Make-A-Wish Foundation to help one specific child with a terminal illness have an amazing day, we will likely feel incredible empathy for the child, and we will see an opportunity to be part of making something spectacular happen for a child with an unfortunate and unavoidably short life. We see exactly who we are helping, and we can read or watch a story about why we should help this child (and others like them) and how our donation will help them directly. At the same time, however, the CDC reports that in 2016 445,000 people died from malaria, a preventable mosquito-borne parasitic infection.

 

We could make a $250 donation to the Make-A-Wish Foundation, and our money would go toward providing a fantastic day for the one child whose story we can see on TV or read about. We could alternatively make a $250 donation to the Against Malaria Foundation and provide about 50 anti-malarial bed-nets to children. Somewhere inside us, the statistics about bed-nets don’t weigh as heavily as helping the one child whose story we saw on TV, even though we are helping 50 children and potentially saving their lives by ensuring they have something to prevent malarial infections. It’s hard to say how much our donation does for the Make-A-Wish Foundation, but we know pretty well what our donation toward bed-nets does.
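The arithmetic behind the bed-net comparison is simple, and worth making explicit. A minimal sketch, assuming roughly $5 per net (an illustrative figure implied by the $250-for-50-nets comparison above; actual prices vary by charity and year):

```python
# Rough cost-effectiveness arithmetic for the bed-net comparison.
# The $5-per-net figure is an assumption for illustration, implied by
# the "about 50 nets for $250" comparison; real prices vary.

donation = 250        # dollars donated
cost_per_net = 5      # assumed cost of one anti-malarial bed-net
nets = donation // cost_per_net

print(nets)           # nets a single $250 donation could fund
```

The point of the sketch is only that identical dollars buy very different, and very countable, amounts of help depending on where they go; the identifiable victim effect keeps that count from feeling as compelling as one child's story.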

 

Global charities helping people in the places where our resources could go the furthest are hampered by our empathetic drive to help those we relate to. We first want to help those who look like us, have similar backgrounds, and speak our language. Only after that are we willing to try to help the unknown people behind the statistics that fail to move us to action. We don’t donate to do the most good; we donate because we feel compelled to help people who look like part of our tribe.

Egocentric Bias

I was reading a political science paper in an academic journal last night and came across a sentence that really stood out to me. The paper focused on the staffers who work for members of Congress and whether they held accurate views of the constituents represented by the member of Congress they worked for. The paper finds that congressional staffers routinely misinterpret the views of their constituents, particularly overestimating just how conservative their constituents tend to be.

 

One reason given for the misinterpretation of constituent views was the opinions and ideology of the staffers themselves. In particular, egocentric bias may be pushing the staffers to see the views of their constituents in a warped light. The authors write, “Egocentric bias is a consistent finding in psychology that suggests individuals use their own beliefs as a heuristic for estimating the beliefs and opinions of others.” In other words, we believe that people are like us and think the way we do.

 

In political science and in a democracy the implications of egocentric bias are huge. Our representatives could totally misinterpret what we think is good or bad, could totally fail to see what issues are important to us, and could support (or oppose) legislation thinking they were doing what we, their constituents, wanted. But really our representatives might end up acting against the wishes of a majority of the people they represent.

 

In our own lives, egocentric bias can also play a huge role. It may not seem like a big deal if we play music from a speaker while hiking, if we don’t wipe down the machine at the gym, or if we wear that shirt with a funny yet provocative saying on it. After all, we are not bothered by these things, and if we assume most people are like us then no one will really care too much. Unfortunately, other people (possibly a plurality or majority of others) may see our behaviors as reprehensible and deeply upsetting. We made an assumption that things we like are things that others like, and that things that bother us are things that bother others. We adapted our behavior around our own interests and just assumed everyone else would understand and go with the flow. We bought into egocentric bias and acted in ways that could really upset or offend other people.

 

Egocentric bias is something we should work to recognize and move beyond. When we assume everyone is like us, we become less considerate, and that will show in how we behave. If instead we recognize that people are not all like us, we can start to see our world and our actions through new perspectives. This can open up new possibilities for our lives and help us to behave in ways that are more helpful toward others rather than in ways that are more likely to upset other people. What we will find is that we are able to have better connections with people around us and develop better relationships with people because we are more considerate and better able to view the world as they may see it, rather than just assuming that everyone sees the world how we do.

Aware of Your Feelings of Superiority

In my life I want to remain open to the world around me, try new things, and stretch myself in areas where I recognize I don’t have much experience. In order to successfully live an open and exploratory life, I will have to accept that I am not as great as my ego wants me to believe I am, and I will have to accept that I don’t already know everything I need to know about how to live a good life. If I begin approaching the world as though I already have it figured out and as if my way of life is superior to the way that other people live, then instead of branching out, I will likely turn inward, away from a changing world.

 

“No group ever decided to pull inward and cut off contact with the outside world because they believed their own group was inferior,” Colin Wright wrote in his book Becoming Who We Need To Be. It is hard to avoid judging other people, and it is even easier to judge other groups than other individuals. “Moral superiority is probably some degree of confidence in their social group and their support of their social group. That is, people are especially willing to express moral superiority when they’re expressing the superiority, not of themselves individually, but of the group of people they’re within together,” Robin Hanson stated in an interview with Tyler Cowen for his podcast Conversations with Tyler.

 

Allowing ourselves to see ourselves and our groups as morally superior limits our world and puts us in a place where we are less likely to connect with people who are not like us. I see this a lot in the relationship between runners and people who do CrossFit. It seems almost universal that runners criticize CrossFit athletes. I have thought about this a lot, and I think that what is happening is that runners are trying to express their (moral) athletic superiority over CrossFit athletes as a way to justify why they don’t do CrossFit themselves. To acknowledge that CrossFit is a good workout and accept that a CrossFit athlete is just as athletic, talented, hard-working, or smart as a runner places the runner in a position where they have to defend their sport and their choice to run when a potentially more well-rounded and fun type of exercise exists.

 

The runners-versus-CrossFit example is just a small example of how in-group versus out-group thinking manifests in real life. This type of thinking, of believing that we and our group are superior to other groups, can have serious consequences. It can lead to our group becoming more closed-minded. It can lead to us individually being less open to people who live differently. It can lead to enclaves and divisions within society that see conflict and threat instead of opportunity and learning. By becoming aware of these feelings of superiority, recognizing how frequently they lack any solid rational basis, and trying something new, we can prevent ourselves and our groups from becoming isolated. This will give us a chance to learn new things, gain insightful experiences, and provide more value to the world.

Attribution Bias

Our brains are pretty impressive pattern recognition machines. We take in a lot of information about the world around us, remember stories, pull information together to form new thoughts and insights, move through the world based on the information we take in, and we are able to predict the results of actions before they have occurred. Our brain evolved to help us navigate a complex, dangerous, and uncertain world.

 

Today, however, while our world is arguably more complex and uncertain than ever, it might not be as dangerous on a general day-to-day basis. I’m pretty sure I won’t encounter any animals that might try to eat me when I sit at the park to read during my lunch break, I won’t need to distinguish between two types of berries to make sure I don’t eat the poisonous kind, and if the thunderstorms scheduled for this evening drop golf-ball-sized hail, I won’t have to worry too much about where I will find safety and shelter. Nevertheless, my evolved brain is still going to approach the world as if it were the dangerous place it was when my ancestors were evolving their thought capacities, and that will throw some monkey wrenches into my life and lead me to see patterns that don’t really exist.

 

Colin Wright has a great quote about this in his book Becoming Who We Need to Be. He writes, “You ascribe meaning to that person’s actions through the lens of what’s called “attribution bias.” If you’re annoyed by their slow driving, that inferred meaning will probably not be generous to the other driver: they’re a bad person, they’re in the way, and they’re doing this because they’re stupid or incapable. That these assumptions about the situation are possibly incorrect – maybe they’re driving slowly because they’re in deep thought about elephant tool usage – is irrelevant. Ascribing meaning to acts unto itself is impressive, even if we often fail to arrive at a correct, or fully correct understanding of the situation.”

 

We often find ourselves in situations that are random and try to ascribe a greater meaning to the situation or event we are in. At least in the United States, it is incredibly common to hear people say that everything happens for a reason, creating a story for themselves in which this moment of inconvenience is part of a larger story filled with lessons, staircases, detours, success, and failure that are all supposed to culminate in a larger narrative that will one day all make sense. The fact that this way of thinking is so prevalent suggests to me that the power of our pattern recognition focused brains is still in full swing even though we no longer need it to be as active in as many situations of our life. We don’t need every moment of our life to happen for a reason, and if we allow for randomness and eliminate the running narrative of our life, we don’t have to work through challenging apologetics to understand something negative.

 

Attribution bias as described by Wright shows us how wrong our brains can be about the world. It shows us that our brains have certain tendencies that elevate our own thoughts over the rest of the world when it doesn’t conform to our desires, interests, wishes, and preferences. It reveals that we are using parts of our brains that evolved to help our ancestors in ways that we now understand to be irrational. If we can see that the slow person driving in front of us with a political sticker that makes our blood boil is not all the terrible things we instantly think they are (that instead they are a 75-year-old grandfather driving in a new town, trying to get to the hospital where his child is sick), then we can recognize that not everything in life has a meaning, or at least not the meaning that our narrow pattern-recognizing brain wants to ascribe. Remembering this mental bias, making an effort to recognize this type of thinking, and moving in a more generous direction of thought will help us move through the world with less friction, anger, and disappointment, because we won’t develop false patterns that let us down when they fail to materialize in the outcomes we expected.

Social Constructionism in Physics and … Everything!

I just finished a semester at the University of Nevada focusing on Public Policy as part of a Master’s in Public Administration. Throughout the semester we focused on rational models of public policy and decision-making, but we constantly returned to the ways in which those models break down and cannot completely inform and shape the public policy making process. We select our goals via political processes and develop rational means for reaching those political ends. There is no way to take a policy or its administration out of the hands and minds of humans to have an objective and rational process free of the differences which arise when we all have different perspectives on an issue.

 

Surprisingly, this is also what we see when we look at physics, and it is one of the big stumbling blocks preventing us from linking Einstein’s theory of relativity with quantum mechanics. Throughout her book Trespassing on Einstein’s Lawn, Amanda Gefter introduces us to the biggest concepts and challenges within the world of physics and how she and her dad attempted to make sense of those concepts on their own. A major influence on the world of physics, and consequently on the adventure that Gefter took, was John Wheeler, who seemed to bring an idea of social construction to the rational and scientific world of physics. Wheeler described the idea of the self-observing universe: we are matter, observing other matter, creating our reality as we observe it. This is exactly the idea of social construction in politics and governance that I touched on in the opening note. Gefter quotes a note in one of Wheeler’s notebooks, “Add ‘Participant’ to ‘Undecidable Propositions’ to Arrive at Physics.”

 

Social Constructionism is a theory from the social sciences. It is used to describe the ways in which a society or group comes to understand the problems it faces: who is at fault for the problem, who receives a benefit from our problem solution, who has the right to complain about a problem, and in what order we should attempt to solve our problems. These are all serious questions to which there is no perfect answer. We cannot identify a perfectly rational answer that will satisfy everyone. Our individual preferences will always be at play, and our interactions in the decision-making process will shape the outcomes we decide we want and the solutions we decide to implement to reach those outcomes. In a sense, these large political questions are like the undecidable propositions described by Wheeler. Politics is the outcome we arrive at when you add participants to undecidable propositions in society, and physics is what you arrive at when you add participants with limited knowledge and limited perspectives to the observation and understanding of major questions about the workings of the universe.

 

We use questions of social science to inform the way we think about our interactions with other people and how we form societies. Social Constructionism reminds us that what seems clear and obvious to us, may seem different to someone else with different experiences, different backgrounds, different needs, and different expectations. Keeping this theory in mind helps us better connect with other people and helps us see the world in new ways. Similarly, physics informs how we understand the universe to be ordered and how matter and energy interact within the universe. Recognizing that our perspective matters, when it comes to science and physics, helps us to consider our own biases and prior conceptions which may influence exactly how we choose to study and experiment with the universe. Keeping social constructionism in mind also helps us understand why we choose to study certain aspects of science and why we present our findings in the ways that we do. We may never be able to get to a purely rational place in either science or politics (though science is certainly much closer), but understanding and knowing where social construction plays a part will help us be more observant and honest about what we say, study, believe, and discover.