Decision Weights

On the heels of the 2020 election, I cannot decide if this post is timely or untimely. On the one hand, it is about how we should think about unlikely events: I will argue, drawing on a quote from Daniel Kahneman’s book Thinking, Fast and Slow, that we overweight unlikely outcomes and should better align our expectations with realistic probabilities. On the other hand, the 2020 election was closer than many people expected, we nearly saw some very unlikely outcomes materialize, and one could argue that a few unlikely outcomes really did come to pass. Ultimately, this post occupies a difficult space, arguing that we should discount unlikely outcomes more than we actually do while acknowledging that sometimes very unlikely outcomes really do happen.

 

In Thinking, Fast and Slow, Kahneman writes, “The decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle.” This quote references studies showing that people are not good at conceptualizing chance outcomes at the far tails of a distribution. When the chance of something occurring drops below 10%, and especially when it pushes into the sub-5% range, we have trouble connecting it with real-world expectations. Our behavior changes when the odds move from 50-50 to 75-25 or even 80-20, but we have trouble adjusting any further once the probabilities stretch beyond that point.

 

Kahneman continues, “Improbable outcomes are overweighed – this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology.”

 

When something has only a 5% or lower chance of happening, we behave as though the probability is closer to, say, 25%. We know the likelihood is very low, but we act as if it were a bit higher than a single-digit percentage. Meanwhile, the nearly certain outcome of 95% or more is discounted beyond what it really should be. Very rare outcomes do sometimes happen, but in our minds we have trouble conceptualizing them, and rather than keeping a perspective grounded in the actual probabilities, by using rational decision weights, we overweight the improbable and underweight the near-certain.
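Kahneman and Tversky later formalized this distortion as a probability weighting function. A minimal sketch of the Tversky–Kahneman (1992) form, using their published curvature estimate of roughly 0.61 for gains (the exact parameter is an empirical fit, and the function’s shape, not its precise values, is the point), reproduces the pattern described above:

```python
def decision_weight(p: float, gamma: float = 0.61) -> float:
    """Map an objective probability p to a subjective decision weight.

    Tversky-Kahneman (1992) weighting function for gains; gamma < 1
    overweights small probabilities and underweights large ones.
    """
    return p**gamma / (p**gamma + (1 - p) ** gamma) ** (1 / gamma)

# The curve inflates the tails and deflates near-certainty:
for p in (0.01, 0.05, 0.50, 0.95, 0.99):
    print(f"p = {p:.2f}  ->  weight ~ {decision_weight(p):.3f}")
```

A 5% chance comes out weighted near 13%, while a 95% chance is discounted to roughly 79%, which is exactly the possibility effect and the certainty discounting the quote describes.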

 

Our trouble correctly weighting extremely certain or extremely unlikely events may have an evolutionary history. For our early ancestors, being completely sure of anything may have occasionally proven fatal. Those who were a tad more cautious may have been less likely to run across the log that actually gave way into the whitewater rapids below. And the ancestors who reacted to the improbable as though it were a little more certain may have been better at avoiding the lion the one time the twig snapping outside the campground really was a lion. The ancestor who sat by the fire and said, “twigs snap every night; the chance it actually is a lion this time has gotta be under 5%,” may not have lived long enough to pass his genes on to future generations. In most situations our early ancestors faced, a little extra caution was probably advantageous. Today, however, being overly cautious and struggling with improbable or nearly certain decision weights can be costly: we over-purchase insurance, spend huge amounts to avoid the rare chance of losing a huge amount, and place too much trust in democratic institutions in the face of a coup attempt.