# Denominator Neglect

“The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects,” writes Daniel Kahneman in *Thinking, Fast and Slow*.

One thing we have seen in 2020 is how difficult it is to communicate and understand risk. Thinking about risk requires thinking statistically, and thinking statistically doesn’t come naturally to our brains. We are good at thinking in anecdotes, and our brains like to spot patterns and potential causal connections between specific events. When they have to predict chance and deal with uncertainty, they easily get confused. Rather than work through a complex statistical problem, our brains quietly substitute an easier problem and answer that one instead, without our realizing it. Whether it is our risk of getting COVID or the probability we assigned to election outcomes before November 3rd, many of us have been thinking poorly about probability and chance this year.

Kahneman’s quote above highlights one example of how our thinking can go wrong when we have to think statistically. Our brains can be swayed by numbers that carry no real information, and that can throw off our decision-making when it comes to dealing with uncertainty. To demonstrate denominator neglect, Kahneman presents two situations in his book. There are two large urns full of white and red marbles, and if you pull a red marble from an urn, you are a winner. The first urn holds 10 marbles: 9 white and 1 red. The second urn holds 100 marbles: 92 white and 8 red. Statistically, we should try our luck with the first urn, because 1 out of 10, or 10%, of its marbles are red, while in the second urn only 8% of the marbles are red.

When asked which urn they would rather draw from, many people choose the second one, exhibiting what Kahneman calls denominator neglect. The chance of winning is lower with the second urn, but it contains more winning marbles, which makes it seem like the better option if you don’t slow down and engage your System 2 thinking processes. If you pause and think statistically, you can see that the first urn offers better odds, but if you are moving quickly, your brain can be distracted by the larger number of winning marbles and led to make a worse choice.
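The urn comparison is easy to verify with a quick simulation. This is just an illustrative sketch (the function name and trial count are my own choices, not Kahneman’s), using the urn contents described above:

```python
import random

def red_draw_rate(red, total, trials=100_000):
    """Simulate drawing one marble at a time from an urn with `red` red
    marbles out of `total`; return the fraction of draws that come up red."""
    urn = ["red"] * red + ["white"] * (total - red)
    return sum(random.choice(urn) == "red" for _ in range(trials)) / trials

random.seed(0)
small = red_draw_rate(red=1, total=10)    # true probability: 1/10 = 10%
large = red_draw_rate(red=8, total=100)   # true probability: 8/100 = 8%
print(f"small urn win rate: {small:.3f}")
print(f"large urn win rate: {large:.3f}")
```

The small urn’s estimated win rate lands near 0.10 and the large urn’s near 0.08, matching the percentages: the urn with fewer winning marbles is still the better bet.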

What is important to recognize is that we can be influenced by numbers that shouldn’t mean anything to us. The number of winning marbles shouldn’t matter; only the percent chance of winning should. But our brains get thrown off. The same thing happens when we see sale prices, weigh the risk of a 10-person family gathering during a global pandemic, or think about polling errors.

I like to check The Nevada Independent’s COVID-19 tracking website, and I have noticed denominator neglect in how I think about the numbers it reports. For an extended stretch, Nevada’s total number of cases was decreasing, but our case positivity rate was staying the same. Statistically, nothing was really changing about the state of the pandemic in Nevada; fewer tests were being completed and reported each day, so the overall number of positive cases was falling. If you scroll down the Nevada Independent website, you will reach a graph of the case positivity rate and see that things were holding steady. When I looked at the shrinking number of positive tests reported, my brain was neglecting the denominator: the number of tests completed. My understanding of the pandemic was biased by the big headline number rather than based on how many of the people tested actually had the virus. Thinking statistically provides a more accurate view of reality, but it can be hard to do, and it is tempting to look only at a single headline number.
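The arithmetic behind the positivity rate makes the illusion concrete. The numbers below are made up for illustration (they are not Nevada’s actual figures): testing volume shrinks, so the headline count of positives shrinks with it, while the rate never moves.

```python
# Hypothetical daily counts (not real Nevada data): tests fall each day,
# so positives fall too, even though the positivity rate is constant.
days = [
    {"tests": 10_000, "positives": 1_200},
    {"tests": 7_000,  "positives": 840},
    {"tests": 5_000,  "positives": 600},
]

for day in days:
    rate = day["positives"] / day["tests"]  # the denominator the headline hides
    print(f"{day['positives']:>5} positives / {day['tests']:>6} tests "
          f"-> positivity rate {rate:.1%}")
```

The headline number drops from 1,200 to 600, a seemingly dramatic improvement, yet every day’s positivity rate works out to 12.0%: statistically, nothing changed.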