Probability Judgments

Julia Marcus, an epidemiologist at Harvard Medical School, was on a recent episode of The Ezra Klein Show to discuss thinking about personal risk during the COVID-19 pandemic. Klein and Marcus talked about the ways in which the United States government has failed to provide people with structures for thinking about risk, and how this has pushed risk decisions onto individuals. They talked about how this pressures each of us to determine which activities are worthwhile, what is too risky for us, and how we can know whether the probability of infection is higher in one setting than in another.

On the podcast they acknowledged what Daniel Kahneman writes about in his book Thinking Fast and Slow – humans are not very good at making probability judgments. Risk is all about probability. It is fraught with uncertainty, with small likelihoods of very bad outcomes, and with conflicting opinions and desires. Our minds, especially in our normal operating mode of quick associations and judgments, don’t have the capacity to think statistically in the way that is necessary to make good probability judgments.

When we try to think statistically, we often turn to substitutions, as Kahneman explains in his book. “We asked ourselves how people manage to make judgments of probability without knowing precisely what probability is. We concluded that people must somehow simplify that impossible task and we set out to find how they do it. Our answer was that when called upon to judge probability, people actually judge something else and believe they have judged probability.”

This is very important when we think about our actions, and the actions of others, during this pandemic. We know it is risky to have family dinners with our loved ones, and we ask ourselves whether it is too risky to get together with our parents, whether we should visit siblings who are at risk due to health conditions, and whether we should be in the same room as a family member who is a practicing medical professional. But in the end, we answer a different question. We ask how much we miss our parents, whether we think it is important to be close to our family, and whether we really, really want some of mom’s famous pecan pie.

As Klein and Marcus say during the podcast, it is a lot easier to be angry at people at a beach than to make probability judgments about a small family dinner. When governments, public health officials, and employers fail to establish systems to help us navigate risk, the responsibility falls back onto individuals, so that we can have someone to blame, some sense of control, and an outlet for the frustrations that arise when our minds can’t process probability. We distort probability judgments and ask more symbolic questions about social cohesion, family love, and isolation. The answer to our challenges would be better and more responsive institutions and structures to manage risk and mediate probability judgments. The individual human mind can only substitute easier questions for complex probability judgments, and it needs visual aids, better structures, and guidance to think through risk and probability in an accurate and reasonable manner.

Substitution Heuristics

I think heuristics are underrated. We should discuss heuristics as a society far more than we do. We barely acknowledge them, but if we look closely, they are at the heart of many of our decisions, beliefs, and assumptions. They save us a lot of work and help us move through the world fairly smoothly, yet they are rarely discussed directly or even recognized.

In Thinking Fast and Slow, Daniel Kahneman highlights heuristics in the sense of substitution and explains the roles of the two questions involved:

“The target question is the assessment you intended to produce.
The heuristic question is the simpler question that you answered instead.”

I have already written about our brains substituting easier questions for harder questions, but the idea of heuristics gives the process a deeper dimension. Kahneman defines a heuristic by writing, “The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.”

In my own life, and I imagine I am a relatively average case, I have relied on heuristics to make a huge number of decisions. I don’t know the best possible investment strategies for my future retirement, but as a heuristic, I know that working with an investment advisor to manage mutual funds and IRAs can be an adequate (even if not perfect) way to save for the future. I don’t know the healthiest possible foods to eat or which food combinations will maximize my nutrient intake, but as a heuristic I can fill a colorful plate with varied veggies and not too many sweets to get enough of the vitamins and nutrients that I need.

We have to make a lot of difficult decisions in our lives. Most of us don’t have the time or ability to compile all the information we need on a given subject to make a fully informed decision, and even if we try, we don’t have a reasonable way to sort through contrasting and competing information to determine what is true and what the best course of action would be. Instead, we make substitutions and use heuristics to figure out what we should do. Rather than recognizing that we are using heuristics, however, we ascribe a higher level of confidence and certainty to our decisions than is warranted. What we do, how we live, and what we believe become part of our identity, and we fail to recognize that we are adopting a heuristic to achieve some version of what we believe to be a good life. When pressed to think about it, our minds create justifications for our decisions that don’t acknowledge the heuristics in play.

In a world where we were quicker to recognize heuristics, we might be able to live with a little distance between ourselves, our decisions, and our beliefs. We could acknowledge that heuristics are driving us, and be more open to change and more flexible with others. Acknowledging that we don’t have all the answers (that we don’t even have all the necessary information) and that we are operating on substitution heuristics for complex questions might help us be less polarized and better connected within our society.

Rarely Stumped

Daniel Kahneman starts one of the chapters in his book Thinking Fast and Slow by writing, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.”
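
(For the record, the deliberate System 2 work the problem demands takes only a couple of steps: 17 × 24 = (17 × 20) + (17 × 4) = 340 + 68 = 408.)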

When I read this quote I am reminded of Gus, the father in My Big Fat Greek Wedding. He is always ready to show how every word comes from a Greek root, even a Japanese word like kimono. He is sure of his intellect, sure that his heritage is perfect and is the foundation of all that is good in the world. He trusts his instincts and intuitions to a hilarious extent, even when he is clearly wrong and even when his decisions are gift-wrapped and planted in his mind in almost Inception style.

His character is part caricature, but it reveals what Kahneman explains with the quote above. Our minds are good at finding intuitive answers that make sense of the world around us, even if we really don’t have any idea what is going on. We laugh at Gus and don’t consider ourselves guilty of behaving like him, but the only difference between most of us and Gus is that he is an exaggeration of the intuitive dogmatism and self-assurance that we all live with.

We scroll through social media and trust that our initial judgment of a headline or post is the right frame for thinking about the issue. We are certain that our home remedy for tackling bug bites, cleaning windows, or curing a headache is based on sound science, even if it does nothing more than produce a placebo effect. We find a way to fit every aspect of our lives into a comprehensive framework where our decisions appear rational and justified, with ourselves as the hero (or, if needed, the innocent victim) of the story.

We should remember that we have a propensity to believe we are always correct, that we are never stumped. We should pause, ask more questions, think about what is important to know before making a decision, and then deeply interrogate our thoughts to decide whether we have really obtained meaningful information to inform our opinions, or whether we are just acting on instinct, heuristics, self-interest, or groupthink. We cannot keep believing we are right and pushing baseless beliefs onto others when we have no real knowledge of an issue. We shouldn’t assume things are true just because they happen to align with the story we want to believe about ourselves and the world. When it comes to crucial issues and our interactions and relationships with others, we need to think more critically and recognize when we are assuming we are right. If we can pause at those times to think more deeply, gather more information, and ask more questions of ourselves, we can have more accurate and honest interactions and relationships. Hopefully this will help us lead more meaningful lives that better connect and develop the community we all need in order to thrive.

Judging Faces

One of the successes of System 1, the name Daniel Kahneman uses in his book Thinking Fast and Slow for the quick, intuitive part of the brain, is recognizing emotions in people’s faces. We don’t need much time studying someone’s face to recognize that they are happy, scared, or angry. We don’t even need to see someone’s face for a full second to get an accurate sense of their emotional state and to adjust our behavior accordingly.

The human mind is great at intuiting emotions from people’s faces. I can’t remember where, but I came across something suggesting that the reason the whites of our eyes are so prominent is to help us see where each other’s eyes are looking and to help us read each other’s emotions. Our ability to quickly and intuitively read each other’s faces helps us build social cohesion and connections. Adept as we are, however, the process can still go wrong.

Kahneman explains that biases and baseless assumptions can be built into System 1’s assessment of faces. We are quick to notice faces that share features similar to our own. We are also quick to judge people as nice, competent, or strong based on features of their faces. This is demonstrated in Thinking Fast and Slow with experiments conducted by Alex Todorov, who showed potential voters the faces of candidates, sometimes for only fractions of a second, and found that the faces influenced votes. Kahneman writes, “As expected, the effect of facial competence on voting is about three times larger for information-poor and TV-prone voters than for others who are better informed and watch less television.”

I’m not here to hate on information-poor and TV-prone voters, but to help us see that we can easily be influenced by people’s faces and by traits we have associated with facial characteristics, even if we don’t consciously know those associations exist. For all of us, there will be situations where we are information-poor and ignorant of the issues or of factors important to our decision (the equivalent of being TV-prone in electoral voting). We might trust what a mechanic or investment banker says because they have a square jaw and high cheekbones. We might trust the advice of a nurse simply because her facial features make her seem caring and sympathetic. Perhaps in both situations the person is qualified and competent to be giving us advice, but even if they were not, we might trust them based on little more than appearance. System 1, which is so good at telling us about people’s emotions, can jump ahead and make judgments about many characteristics of a person based on nothing but a face, and while it may be correct sometimes, it can also be wrong. System 2 will probably construct a coherent narrative to justify the quick decision made by System 1, but that narrative likely won’t have much to do with the person’s experience and qualifications. We may end up in situations where, deep down, we are judging someone based on little more than what they look like, and what System 1 thought of their face.

What You See Is All There Is

In Thinking Fast and Slow, Daniel Kahneman gives us the somewhat unwieldy acronym WYSIATI – what you see is all there is. The acronym describes a phenomenon that stems from how our brains work. System 1, the name that Kahneman gives to the part of our brain which is automatic, quick, and associative, can only take in so much information. It makes quick inferences about the world around it, and establishes a simple picture of the world for System 2, the thoughtful calculating part of our brain, to work with.

What you see is all there is means that we are limited by the observations and information that System 1 can take in. It doesn’t matter how good System 2 is at processing and making deep insights about the world if System 1 is passing along poor information. Garbage in, garbage out, as the computer science majors like to say.

Daniel Kahneman explains in his book what this means for our day-to-day lives. He writes, “As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little.”

System 2 doesn’t recognize that System 1 hands it incomplete and limited information. It chugs along believing that the information handed off by System 1 is everything it needs to know. It doesn’t ask for more information; it simply accepts that it has been handed a complete data set and begins to work. System 2 creates a solid narrative out of whatever information System 1 gives it, and only momentarily pauses if it notices an inconsistency in the story it is stitching together about the world. If it can make a coherent narrative, then it is happy and sees no need to look for additional information. What you see is all there is; nothing appears to be missing.

But we know that we only take in a limited slice of the world. We can’t sense the Earth’s magnetic pull, we can’t see in ultraviolet or infrared, and we have no way of knowing what is really happening in another person’s mind. When we read a long paper or finish a college course, we remember some things, but not everything. Our minds can only hold so much information, and System 2 is limited to what can be observed and retained. This should be a huge problem for our brains; we should recognize enormous blind spots and be paralyzed into inaction by the lack of information. But this isn’t what happens. We don’t even notice the blind spots. Instead we make a story from the information we collect, building a complete world that makes sense of that information, no matter how limited it is. What you see is all there is: we make the world work, but we do so with only a portion of what is really out there, and we don’t even notice.

Narrative Confidence

We like to believe that having more information will make us more confident in our decisions and opinions. The opposite, however, may be true. I have written in the past about a jam study, where participants who selected a jam from a sample of just a few options were happier with their choice than participants who selected from several dozen options. More information and more choices seem like they should make us happier and more confident in our decisions, but those with fewer options were the more satisfied group.

We like simple stories. They are easy for our brain to construct a narrative around and easy for us to have confidence in. The stories we tell ourselves and the conclusions we reach are often simplistic, often built on incomplete information, and often lack the nuance that is necessary to truly reflect reality. Our brains don’t want to work too hard, and don’t want to hold conflicting information that forces an unpleasant compromise. We don’t want to constantly wonder if we made the right choice, if we should do something different, if we need to try another option. We just want to make a decision and have someone tell us it was a good decision, regardless of the actual outcome or impact on our lives, the lives of others, or the planet.

Daniel Kahneman writes about this in his book Thinking Fast and Slow. He describes a study (not the jam study) where participants were presented with either one side or both sides of an argument. They had to choose which side they agreed with and rate their confidence. “Participants who saw one-sided evidence were more confident of their judgments than those who saw both sides,” writes Kahneman. “This is just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.”

Learning a lot and truly understanding any given issue is challenging because it means we must build a complex picture of the world. We can’t rely on simple arguments and outlooks on life once we get into the weeds of an issue or topic. We will see that admirable people have tragic flaws. We will see that policies which benefit us may exploit others. We will find that things we wish to be true about who we are and the world we live in are only semi-true. Ignorance is bliss in the sense that knowing only a little bit about the world allows you to paint a picture that makes sense to you, but the picture won’t be accurate, and it won’t acknowledge the negative externalities the story may create. Simplistic narratives may help us come together as sports fans, consumers, or a nation, but we should all worry about what happens when we have to accept the inaccuracies of our stories. How do we weave a complex narrative that will bring people across the world together in a meaningful and peaceful way without driving inequality and negative externalities? That is the challenge of the age, and unfortunately, the better we become at accurately depicting the world we inhabit, the less confident any of us will be about our conclusions and decisions for how to move forward.

System 1 Success

“The measure of success for System 1 is the coherence of the story it manages to create.”

Daniel Kahneman writes that in his book Thinking Fast and Slow while discussing the quick conclusions of System 1, the part of our mental processing that is fast, intuitive, and operates on simple associations and heuristics.

System 1 stitches together a picture of the world and environment around us with incomplete information. It makes assumptions and quick estimates about what we are seeing and compiles a coherent story for us. And what is important for System 1 is that the story be coherent, not that the story be accurate.

System 2, the part of our brain that is more rational, calculating, and slower, is required for making detailed assessments of the information that System 1 takes in. But normally we don’t activate System 2 unless we really need to. If System 1’s connections and associations seem coherent, we don’t give them any more attention or scrutiny from System 2.

It is important that we understand this about our minds. We can go about acting intuitively and believing that our simple narrative is correct, but we risk believing our own thoughts simply because they feel true, coherent, and in line with our past experiences. Our thoughts will necessarily be inadequate to fully encompass the reality around us. Other people have different backgrounds, different histories, and different narratives knitted together in their own minds. It’s important that we find a way to engage System 2 when the stakes are high, to make more thoughtful considerations than System 1 can generate. Simply because a narrative feels intuitively correct doesn’t mean it accurately reflects the world around us, or that it creates a picture of the world compatible with the narrative frameworks other people create.

First Impressions Matter

In Thinking Fast and Slow, Daniel Kahneman describes a research study that shows the power of the halo effect. The halo effect is the phenomenon where positive traits in a person outshine their negative traits or characteristics, or cause us to project additional positive traits onto them. For example, think of your favorite celebrity. You know they are good looking and talented at whatever they do, and you most likely also ascribe to them a number of positive traits that you don’t really have evidence for. You probably believe they share your political views, pay their taxes, and don’t litter. If you discovered that one of these things wasn’t true, your brain would want to discredit that information, or you might face some cognitive dissonance as you square the negative characteristic with the fact that the person looks good and is talented.

The study Kahneman references shows the power of the halo effect by giving people six descriptions of a fictitious person. Some people were shown three positive characteristics followed by three negative traits. Another group of people were shown a different fictitious person with the same six traits, but listed in reverse, negative traits first followed by the positive. Even though both lists contained exactly the same information, people who saw the positive traits first judged the person more favorably. Kahneman writes, “The sequence in which we observe characteristics of a person is often determined by chance. Sequence matters, however, because the halo effect increases the weight of first impressions, sometimes to the point that subsequent information is mostly wasted.”

The study shows that first impressions matter a lot, even when we are not actually meeting someone in person. When the first thing we learn about a person is something positive, it can be easy to overlook negative traits we discover later, and the same is true in reverse. This idea is part of what drove Malcolm Gladwell to write his new book Talking to Strangers. I have not read Gladwell’s book, but I have listened to him talk about it on several podcasts. He discusses the death of Sandra Bland, and the interaction she had with law enforcement that led to her arrest and subsequent suicide. First impressions matter, and the first impression she made on the police officer who pulled her over was negative, shaping the entire interaction between Bland and the officer and ultimately leading to her arrest. Gladwell would also argue, I believe, that first impressions can form before you have even met someone, simply by absorbing racial or other stereotypes.

Gladwell also discusses Bernie Madoff in his book, a savvy con man who relied on the halo effect to swindle millions. Madoff charmed people and seemed successful, so the people who trusted him with their investments had trouble seeing through the lies. They wanted to believe the positive traits they first observed in him, and any hints of fraud were easily missed or ignored.

The best we can hope for is awareness of the halo effect, and a memory of how much our very first impressions can matter. How we put ourselves forward shapes the interactions we have with others. But we can remember to give people a break, and to give second chances when our first impressions of them are not great. Remember to look beyond the first observed trait to see the whole picture of the people in your life, and try to set up situations so that you don’t judge people immediately on their appearance and can look further to know and understand them a little better.

Positive Test Strategies

A real danger for us, and one I don’t know how to move beyond, is the positive test strategy. It is the search for evidence that confirms what we want to believe or what we think is true. When we already have an intuition about something, we look for examples that support that intuition. Looking for examples that don’t support our idea, or for situations where it seems to fall short, is uncomfortable, and not something we are very good at. Positive test strategies are a form of motivated rationality, where we find ways to justify what we want to believe and ways to align our beliefs with whatever happens to be best for us.

In Thinking Fast and Slow, Daniel Kahneman writes the following: “A deliberate search for confirming evidence, known as positive test strategy, is also how System 2 tests a hypothesis. Contrary to the rules of philosophers of science, who advise testing hypotheses by trying to refute them, people (and scientists, quite often) seek data that are likely to be compatible with the beliefs they currently hold.”

In science, the best way to conduct a study is to try to refute the null hypothesis rather than to try to support the actual hypothesis. You take a condition in the world, make an informed guess about why you observe what you do, and then formulate a null hypothesis before you begin any testing. Your null hypothesis says that nothing is actually happening here after all. So you might think that teenage drivers are more likely to crash at roundabouts than at conventional intersections, or that crickets prefer a certain type of grass. Your null hypotheses are that teenagers do not crash at roundabouts more than at conventional intersections, and that crickets don’t display a preference for one type of grass over another.

In your experimental study, instead of seeking out confirmation that teenagers crash more at roundabouts or that crickets prefer a certain grass, you test whether there is any difference at all in where teenagers crash and in which grass crickets eat. In other words, you seek to disprove the null hypothesis (that there is no difference) rather than try to prove that something specific is happening. It is a subtle difference, but it is important. It’s also important to note that good science doesn’t seek to disprove the null hypothesis in a specific direction. Good science tries to avoid positive test strategies by showing that the nothing-to-see-here hypothesis is wrong and that there is something to see, but it could be in any direction. If scientists want to provide more evidence that the effect runs in a given direction, they look for stronger evidence with less chance of random sampling error.
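
To make the distinction concrete, here is a minimal sketch of the roundabout example in Python. The crash counts are invented for illustration, and SciPy’s standard two-sample t-test stands in for whatever analysis a real study would use; the point is the shape of the test, which assumes no difference and looks for evidence against that assumption in either direction.

from scipy import stats

# Hypothetical crash counts per 10,000 teenage-driver trips; these
# numbers are invented for illustration, not real traffic data.
roundabouts = [12, 15, 9, 14, 11, 13, 16, 10]
intersections = [11, 10, 12, 9, 13, 8, 11, 10]

# Two-sided t-test of the null hypothesis that the mean crash rates
# are equal. "Two-sided" matters: we look for a difference in either
# direction instead of hunting for the outcome we expect to find.
t_stat, p_value = stats.ttest_ind(roundabouts, intersections)

if p_value < 0.05:
    print(f"Reject the null hypothesis (p = {p_value:.3f}).")
else:
    print(f"Fail to reject the null hypothesis (p = {p_value:.3f}).")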

In our minds, however, we don’t often do this. We see a pattern of behavior or outcomes and start searching for explanations for what we see. We come up with a hypothesis, think of more things that fit it, and find ways to explain how everything aligns with it. In My Big Fat Greek Wedding, this is what the character Gus does when he tries to show that all words in the world are originally Greek.

Normally, we identify something that would be in our personal interest or would support our group identity in a way that raises our social status. From there, we begin to adopt hypotheses about how the world should operate that support what is in our personal interest. We then look for tests that would support our hypotheses, and we avoid situations where they could be disproven. Finding things that support what we already want to believe is comforting and relatively easy compared to identifying a null hypothesis, testing it, and then examining the results without a predetermined outcome that we want to see.

Causal Versus Statistical Thinking

Humans are naturally causal thinkers. We observe things happening in the world and begin to apply causal reasoning to them, asking what could have led to the observation we made. We attribute intention and desire to people and things, and work out a narrative that explains why things happened the way they did.

The problem, however, is that we are prone to lots of mistakes when we think in this way, especially when we start looking at situations that require statistical thinking. In his book Thinking Fast and Slow, Daniel Kahneman writes the following:

“The prominence of causal intuitions is a recurrent theme in this book because people are prone to apply causal thinking inappropriately, to situations that require statistical reasoning. Statistical thinking derives conclusions about individual cases from properties of categories and ensembles. Unfortunately, System 1 does not have the capability for this mode of reasoning; System 2 can learn to think statistically, but few people receive the necessary training.”

System 1 is our fast brain. It works quickly to identify associations and patterns, but it doesn’t take in a comprehensive set of information and isn’t able to do much serious number crunching. System 2 is our slow brain, able to do the tough calculations, but limited to working with the data that System 1 is able to accumulate. And System 2 is only active for short periods of time, and only when we consciously make use of it.

This leads to our struggles with statistical thinking. We have to take in a wide range of possibilities, categories, and combinations. We have to make predictions and understand that in one set of circumstances we will see one outcome, but in another set of circumstances we may see a different one. Statistical thinking doesn’t pin down a concrete answer the way our causal thinking likes. As a result, we reach conclusions based on incomplete considerations, ignore important pieces of information, and assume we are correct because our answer feels correct and satisfies some criteria. Thinking causally can be powerful and useful, but only if we understand the statistical dimensions at hand and can fully think through the implications of the causal structures we are defining.
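
Kahneman’s “categories and ensembles” are, at bottom, base rates, and weighing them is exactly the kind of reasoning System 2 must be trained to do. Here is a minimal sketch in Python, with invented numbers for a hypothetical medical screening test. Causal intuition says a positive result means you have the condition; the statistics of the category say something much weaker.

# A minimal sketch of statistical thinking, using invented numbers:
# judging an individual case (one positive test result) from the
# properties of the category it belongs to (base rates), via Bayes' rule.

base_rate = 0.01       # hypothetical: 1% of people have the condition
sensitivity = 0.90     # hypothetical: P(positive test | condition)
false_positive = 0.05  # hypothetical: P(positive test | no condition)

# Overall chance that a randomly tested person tests positive.
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Chance that a person who tests positive actually has the condition.
posterior = (sensitivity * base_rate) / p_positive

print(f"P(condition | positive test) = {posterior:.2f}")  # prints 0.15

Even with a fairly accurate test, the low base rate drags the answer down to about fifteen percent, which is precisely the kind of result that feels wrong to causal intuition.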