## Probability is Multifaceted

For five years my wife and I lived in a house at the base of the lee side of a small mountain range in Northern Nevada. When a storm came through the area, it had to make it over a couple of small mountain ranges and valleys before reaching our house, and as a result we experienced less precipitation than most people in the Reno/Sparks area. Now my wife and I live in a house higher up on a different mountain that sits more directly in the path of storms coming from the west. We receive snow at our house while my parents and family lower in the valley barely get any wind. At both houses we have learned to adjust our expectations for precipitation relative to the probabilities reported by weather stations, which reference the airport at the valley floor. Our experiences with rain and snow at our two places are a useful demonstration that probability (in this case the probability of precipitation) is multifaceted – that multiple factors play a role in the probability of a given event at a given place and time.

In his book Risk Savvy, Gerd Gigerenzer writes, “Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief.” Gigerenzer explains that frequency is about counting. To me, this is the most clearly understandable aspect of probability, and what we usually refer to when we discuss probability. On how many days does it usually rain in Reno each year? How frequently does a high school team from Northern Nevada win a state championship and how frequently does a team from Southern Nevada win a state championship? These types of questions simply require counting to give us a general probability of an event happening.
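Frequency, in this sense, is just a tally divided by a count. A minimal sketch of that idea in Python, using invented rain tallies (not real Reno weather data):

```python
# Frequency view of probability: count the events, divide by the observations.
# The yearly rain tallies below are made up for illustration, not real data.
rainy_days_per_year = [50, 48, 55, 42, 60]  # hypothetical five-year record

total_days = 365 * len(rainy_days_per_year)
total_rainy_days = sum(rainy_days_per_year)

p_rain = total_rainy_days / total_days
print(f"Empirical chance of rain on a given day: {p_rain:.1%}")
```

The same counting logic answers the championship question: tally how many titles went to northern and southern schools and divide by the total number of championships.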

But probability is not just about counting and tallying events. Physical design plays a role as well. Our house on the lee side of a small mountain range was shielded from precipitation, so while it may have rained in the valley half a mile away, we didn’t get any precipitation. Conversely, our current home is in a position to get more precipitation than the rest of the region. In high school sports, fewer kids live in Reno/Sparks compared to the Las Vegas region, so in terms of physical design, state championships are likely to be more common for high schools in Southern Nevada. Additionally, there may be differences in the density of students at each school, meaning the North could have more schools per student than the South, also influencing the probability of a northern or southern school winning. Probability, Gigerenzer explains, can be impacted by the physical design of systems, potentially making the statistics and chances more complicated to understand.

Finally, degrees of belief play a role in how we comprehend probability. Gigerenzer states that degrees of belief include experience and personal impression, which are very subjective. Trusting two eyewitnesses, Gigerenzer explains, rather than two people who heard about an event from someone else, can increase the probability we assign to an unlikely story. Degrees of belief can also be seen in my experiences with rain at our two houses. I learned to discount the probability of rain at our first house and to increase my expectation of rain at our new house. If the meteorologist said there was a low chance of rain when we lived on the sheltered side of a hill, then I didn’t worry much about storm forecasts. At our new house, however, if there is a chance of precipitation and a storm coming from the west, I will certainly remove anything from the yard that I don’t want to get wet, because I believe the chance that our specific neighborhood will see rain is higher than what the meteorologist predicted.

Probability, and how we understand it and consequently make decisions with it, is complex, and Gigerenzer’s explanation of the multiple facets of probability helps us better understand that complexity. Simply tallying outcomes and projecting them into the future often isn’t enough for us to truly have a good sense of the probability of a given outcome. We have to think about physical design, and we have to think about the personal experiences and subjective opinions that form the probabilities that people develop and express. Understanding probability requires that we hold a lot of information in our heads at one time, something humans are not great at doing, but that we can do better when we have better strategies for understanding complexity.

## Dread Risks

Over the course of 2020 we watched COVID-19 shift from a dread risk to a less alarming risk. To some extent, COVID-19 became a mundane risk that we adjusted to and learned to live with. Our initial reactions to COVID-19, and our later discontent but general acceptance, reveal interesting ways in which the mind works. Sudden and unexplained deaths and risks are terrifying, while continual risk is to some extent ignored, even if we face greater risk from the dangers we ignore.

In Risk Savvy Gerd Gigerenzer describes dread risks and our psychological reactions by writing, “low-probability events in which many people are suddenly killed trigger an unconscious psychological principle: If many people die at one point in time, react with fear and avoid that situation.” Dread risks are instances like terrorist attacks, sudden bridge collapses, and commercial food contamination events. A risk that we did not consider is thrust into our minds, and we react strongly by avoiding something we previously thought to be safe.

An unfortunate reality of dread risks is that they distract us and pull our energy and attention away from ongoing and more mundane risks. This has been a challenge as we try to keep people focused on limiting COVID-19 and not simply accepting deaths from the disease the way we accept deaths from car crashes, gun violence, and secondhand smoke exposure. Gigerenzer continues, “But when as many or more die distributed over time, such as in car and motorbike accidents, we are less likely to be afraid.” Dread risks trigger fears and responses that distributed risks don’t.

This psychological bias drove the United States into wars in Iraq and Afghanistan in the early 2000s, and we are still paying the price for those wars. The shift of COVID-19 in our collective consciousness from a dread risk to a distributed risk led to mass political rallies, unwise indoor gatherings, and other social and economic events where people contracted the disease and died even though they should have known to be more cautious. Reacting appropriately to a dread risk is difficult, and giving distributed risks the attention and resources they deserve is also difficult. The end result is poor public policy, poor individual decision-making, and potentially the loss of life as we fail to use resources in a way that saves the most lives.

## Decision Weights

On the heels of the 2020 election, I cannot decide if this post is timely, or untimely. On the one hand, this post is about how we should think about unlikely events, and I will argue, based on a quote from Daniel Kahneman’s book Thinking Fast and Slow, that we overweight unlikely outcomes and should better align our expectations with realistic probabilities. On the other hand, however, the 2020 election was closer than many people expected, we almost saw some very unlikely outcomes materialize, and one can argue that a few unlikely outcomes really did come to pass. Ultimately, this post falls in a difficult space, arguing that we should discount unlikely outcomes more than we actually do, while acknowledging that sometimes very unlikely outcomes really do happen.

In Thinking Fast and Slow Kahneman writes, “The decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle.” This quote references studies which showed that people are not good at conceptualizing chance outcomes at the far tails of a distribution. When the chance of something occurring gets below 10%, and especially when it pushes into the sub-5% range, we have trouble connecting that with real-world expectations. Our behaviors seem to change when things move from 50-50 to 75-25 or even to 80-20, but we have trouble adjusting any further once the probabilities stretch beyond that point.

Kahneman continues, “Improbable outcomes are overweighed – this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology.”

When something has only a 5% or lower chance of happening, we actually behave as though the chance of that occurrence is closer to, say, 25%. We know the likelihood is very low, but we behave as if it is quite a bit higher than a single-digit percentage. Meanwhile, the very certain, almost completely sure outcome of 95%+ is discounted beyond what it really should be. Very rare outcomes certainly do happen sometimes, but our minds have trouble conceptualizing them, and rather than keeping a perspective grounded in the actual probabilities and rational decision weights, we overweight the improbable and underweight the nearly certain.
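Kahneman and Tversky modeled this distortion with a probability weighting function. A common one-parameter form from their prospect theory work is w(p) = p^γ / (p^γ + (1 − p)^γ)^(1/γ); with γ below 1, small probabilities get inflated and near-certain ones get deflated. A sketch, with γ = 0.61 used here purely as an illustrative value:

```python
# Probability weighting function in the style of Tversky & Kahneman's
# prospect theory. With gamma < 1 the curve bends: small probabilities
# feel bigger than they are, near-certain probabilities feel smaller.
def decision_weight(p: float, gamma: float = 0.61) -> float:
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in (0.01, 0.05, 0.50, 0.95, 0.99):
    print(f"stated probability {p:.2f} -> felt weight {decision_weight(p):.2f}")
```

With these numbers, a stated 5% chance gets a felt weight of roughly 13%, while a stated 95% chance drops to roughly 79% – the possibility and certainty effects in miniature.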

Our challenges with correctly weighting extremely certain or extremely unlikely events may have an evolutionary history. For our early ancestors, being completely sure of anything may have resulted in a few very unlikely deaths. Those who were a tad more cautious may have been less likely to run across the log that actually gave way into the whitewater rapids below. And our ancestors who reacted to the improbable as though it were a little more certain may have been better at avoiding the lion the one time the twig snapping outside the campground really was a lion. The ancestor who sat by the fire and said, “twigs snap every night, the chances that it actually is a lion this time have gotta be under 5%,” may not have lived long enough to pass his genes on to future generations. In most situations our early ancestors faced, being a little more cautious was probably advantageous. Today, however, being overly cautious and struggling with improbable or nearly certain decision weights can be costly: we over-purchase insurance, spend huge amounts to avoid the rare chance of losing a huge amount, and over-trust democratic institutions in the face of a coup attempt.

## Probability Judgments

Julia Marcus, an epidemiologist at Harvard Medical School, was on a recent episode of the Ezra Klein show to discuss thinking about personal risk during the COVID-19 Pandemic. Klein and Marcus talked about the ways in which the United States Government has failed to help provide people with structures for thinking about risk, and how this has pushed risk decisions onto individuals. They talked about how this creates pressures on each of us to determine what activities are worthwhile, what is too risky for us, and how we can know if there is a high probability of infection in one setting relative to another.

On the podcast they acknowledged what Daniel Kahneman writes about in his book Thinking Fast and Slow – humans are not very good at making probability judgments. Risk is all about probability. It is fraught with uncertainty, with small likelihoods of very bad outcomes, and with conflicting opinions and desires. Our minds, especially our normal operating mode of quick associations and judgments, don’t have the capacity to think statistically in the way that is necessary to make good probability judgments.

When we try to think statistically, we often turn to substitutions, as Kahneman explains in his book. “We asked ourselves how people manage to make judgments of probability without knowing precisely what probability is. We concluded that people must somehow simplify that impossible task and we set out to find how they do it. Our answer was that when called upon to judge probability, people actually judge something else and believe they have judged probability.”

This is very important when we think about our actions, and the actions of others, during this pandemic. We know it is risky to have family dinners with our loved ones, and we ask ourselves whether it is too risky to get together with our parents, whether siblings who are at risk due to health conditions should join, and whether we should be in the same room as a family member who is a practicing medical professional. But in the end, we answer a different question. We ask how much we miss our parents, whether we think it is important to be close to our family, and whether we really, really want some of mom’s famous pecan pie.

As Klein and Marcus say during the podcast, it is a lot easier to be angry at people at a beach than to make probability judgments about a small family dinner. When governments, public health officials, and employers fail to establish systems to help us navigate the risk, we place the responsibility back onto individuals, so that we can have someone to blame, some sense of control, and an outlet for the frustrations that arise when our mind can’t process probability. We distort probability judgments and ask more symbolic questions about social cohesion, family love, and isolation. The answer to our challenges would be better and more responsive institutions and structures to manage risk and mediate probability judgments. The individual human mind can only substitute easier questions for complex probability judgments, and it needs visual aids, better structures, and guidance to help think through risk and probability in an accurate and reasonable manner.