Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes New Prediction.” These headlines are nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges with misinformation. How could anyone have failed to see this as far back as 2004?


The problem, of course, is that the inevitable present moment, and the pathway that seems so obvious in retrospect, were never clear at all. There was no way to predict a major housing bubble and financial collapse in 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction, pull at our emotions and play with hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean in their predictions, and assume that their next grand claim is wrong.


Rather than exploiting hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious about it and about our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”


Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future, lead us to trust individuals who don’t deserve trust, and cause us to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as bait for misleading headlines.

Can You Remember Your Prior Beliefs?

“A general limitation of the human mind,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”


What Kahneman is referring to with this quote is the difficulty we have in understanding how our thinking evolves and changes over time. Our thinking adapts and revises itself, sometimes quite dramatically, but often very slowly, and our experience of our own changing mind doesn’t reflect these changes unless a particularly salient change is tied, in one way or another, to an important aspect of our identity. For most changes in our mental approach, we generally don’t remember our prior beliefs and views, and we likely don’t remember a point at which our beliefs changed.


In the book, Kahneman uses the example of two football teams with the same record playing each other. One team crushes the other, but before we knew the outcome we didn’t have a strong sense of how the game would go. After watching a resounding victory, it is hard to remember that we were once so uncertain about the outcome.


This tendency of the mind wouldn’t be much of a problem if it were restricted to our thinking about sports – unless we had a serious betting problem. However, it applies to our thinking on many more important topics, such as family members’ marriages, career choices, political voting patterns, and consumer brand loyalty. At this moment, many Democratic voters in our nation probably don’t remember exactly what their opinions were on topics like free trade, immigration, or infectious disease policy prior to the 2016 election. If they do remember their stances on any of those issues, they probably don’t remember all the legal and moral arguments they expressed at the time. Their minds and opinions have probably shifted in response to President Trump’s policy positions, but it is probably hard for many to say exactly how or why their views have changed.


In a less charged example, imagine that you are back in high school, and for years you have really been into a certain brand of shoes. But, one day, you are bullied for liking that brand, or perhaps someone you really dislike is now sporting that same brand, and you want to do everything in your power to distance yourself from any association with the bullying or the person you don’t like. Ditching the shoes and forgetting that you ever liked that brand is an easy switch for our minds to make, and you never have to remember that you too wore those shoes.


The high school example is silly, but for me it helps put our brain’s failure to remember previous opinions and beliefs in context. Our brains evolved in a social context, and for our ancestors, navigating tribal social structures and hierarchies was complicated and sometimes a matter of life and death (not just social media death for a few years of high school like today). Being able to ditch beliefs that no longer fit our needs was probably helpful for our ancestors, especially if it helped them fully commit to a new tribal leader’s strange quirks and new spiritual beliefs. Today, this behavior can cause us to form strange high school (or office) social cliques and can foment toxic political debates, but it may have served a more constructive role for our ancestors as they formed early human civilizations.


Understanding the Past

I am always fascinated by an idea that continually demonstrates its validity in my own life: the more we learn about something, the more we realize how little we actually know about it. I am currently reading Yuval Noah Harari’s book Sapiens: A Brief History of Humankind, and I am continually struck by how often Harari brings in events from mankind’s history that I had never heard about. The more I learn about the past, or about any given subject, the more I realize how little knowledge I have ever had, and how limited, narrow, and sometimes flat-out inaccurate my understanding has been.


This is particularly important when it comes to how we think about the past. I believe very strongly that our reality and the worlds we inhabit are mostly social constructions. The trees, houses, and roads are all real, but how we understand those physical objects, the spaces we operate in, and the uses we make of the material things around us is shaped to an incredible degree by social constructions and the relationships we build between ourselves and the world. In order to understand these constructions, and in order to shape them for a future we want to live in (and are physiologically capable of living in), we need to understand the past and make predictions about the future with new social constructs that enable continued human flourishing.


To some extent, this feels easy and natural to us. We all have a story, and we learn and adopt family stories, national stories, and global stories about the grand arc of humanity. But while our stories seem to be shared, and while we seem to know where we are heading, we all operate based on individual understandings of the past and of where that means we are (or should be) heading. As Daniel Kahneman writes in his book Thinking Fast and Slow, “we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.”


As I laid out at the beginning of this post, there is always more complexity and nuance to anything we might study and be familiar with than we often realize. We can feel that we know something well only when we are ignorant of its nuance and complexity. When we start to really untangle something, whether it be nuclear physics, the history of the American Confederacy, or how our fruits and veggies get to the supermarket, we realize that we don’t know or understand anything as well as we might intuitively believe.


When we lack a deep and complex understanding of the past, because we just don’t know about something or because we never received an accurate and detailed account of it, we are likely to misinterpret and misunderstand how we got to our current point. With a limited historical perspective and understanding, we will incorrectly assess where our best future lies. It is important that we recognize how limited our knowledge is, and remember that these limits shape the extent to which we can make valid predictions about the future.

Scared Before You Even Know It

In Thinking Fast and Slow, Daniel Kahneman demonstrates how quick our minds are, and how fast they react to potential dangers and threats, by showing us two very simple pictures of eyes. The pictures are black squares with a little bit of white space that our brains immediately perceive as eyes, and beyond that immediate perception, our brains also perceive an emotional expression within the eyes. They are similar to the simple eyes I sketched out here:

In my sketch, the eyes on the left are aggressive and threatening, and our brains will pick up on the threat they pose and we will have physiological responses before we can consciously think through the fact that those eyes are just a few lines drawn on paper. The same thing happens with the eyes on the right, which our brains recognize as anxious or worried. Our body will have a quick fear reaction, and our brain will be on guard in case there is something we need to be anxious or worried about as well.


Regarding a study in which subjects in a brain scanner were shown a threatening picture for less than 2/100 of a second, Kahneman writes, “Images of the brain showed an intense response of the amygdala to a threatening picture that the viewer did not recognize. The information about the threat probably traveled via a superfast neural channel that feeds directly into a part of the brain that processes emotions, bypassing the visual cortex that supports the conscious experience of seeing.” The study was designed so that the subjects were not consciously aware of having seen an image of threatening eyes, but their brains perceived it nevertheless, and their bodies reacted accordingly.


The takeaway from this kind of research is that our environments matter and that our brains respond to more than what we are consciously aware of. Subtle cues and factors around us can shape the way we behave and how we feel about where we are and what is happening. We might not know why we feel threatened, and we might not even realize that we feel threatened, but our heart rate may be elevated, we might tense up, and we might become short and defensive in certain situations. When we think back on why we behaved a certain way, why we felt the way we did, and why we reacted the way we did, our brains won’t be able to recognize the subtle cues that never rose to the level of consciousness. We won’t be able to explain why we felt threatened; all we will be able to recall is the physiological response we had to the situation. We are influenced by far more than our conscious brain is aware of, and we should remember that while our conscious brain doesn’t provide us with a perfect picture of reality, our subconscious reacts to more of the world than we notice.

Ignore Our Ignorance

There is a quote attributed to Harry Truman along the lines of, “Give me a one-handed economist.” It references the frustration that any key decision-maker feels when faced with challenging and sometimes conflicting information and choices. On the one hand is a decision with a predicted set of outcomes, but on the other hand is another decision or a separate, undesirable set of consequences. The quote shows how challenging it is to understand and navigate the world when you have a complex and nuanced understanding of what is happening.


Living in ignorance actually makes choices and decisions easier – there is no other hand of separate choices, negative consequences, or different points of view. Ignoring our ignorance is preferable when we live within our own narrative constructions, where what we see is all there is and reality is what we make it to be.


Daniel Kahneman writes about this in his book Thinking Fast and Slow, explaining how these narrative fallacies lead to so many of our predictable cognitive errors. He writes, “Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”


When I think about Kahneman’s quote, I think about myself upon graduating with a Master’s in Public Administration and Policy, and my older sister upon her high school graduation. My sister has had strong political views for a very long time, views that she readily adopted as a high school student. Her self-assured profession of her political views, set against the equally self-assured political views of my parents, is part of what sparked my interest in studying political science and public policy. I wanted to understand how people became so sure of political views that I didn’t fully understand, but which I could see contained multitudes of perspectives, benefits, and costs.


At the completion of my degree I felt that I had a strong understanding of the political processes of the United States. I could understand how public policy was shaped and formed, and I could describe how people came to hold various points of view and why some people might favor different policies. But what I did not gain was a sense that one particular political approach was necessarily correct or inherently better than any other. So much of our political process depends on who stands to benefit, what is in our individual self-interest, and what our true goals happen to be. At the completion of a study of politics, I felt that I knew more than many, but I did not exactly feel that my political opinions were stronger than my sister’s were when she graduated from high school. Her opinions were formed in ignorance (I don’t mean this in a mean way!), and her limited perspective allowed her to be more confident in her opinions than I could be with my detailed and nuanced understanding of political systems and processes.


Our views of the world and how we understand our reality are shaped by the information we absorb and the experiences we have. What you see is all there is, and the narrative you live within will make more sense the more ignorant you are of the complexities of the world around you. Your narrative will be simpler and more coherent, since there won’t be other hands to contrast against your opinions, desires, and convictions.

Narratives and Halos

Yesterday I wrote about narrative fallacies and how our brains’ desire to create coherent stories can lead to cognitive errors. One error, which I have written about previously, is the halo effect, and in some ways it is a direct consequence of narrative thinking. Our brains don’t do well with conflicting information that doesn’t fit a coherent narrative, and the halo effect helps smooth over this problem in our minds.


In Thinking Fast and Slow, Daniel Kahneman writes, “The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations; good people do only good things and bad people are all bad.” When we already like someone or consider them a good person, the halo effect simplifies the other judgments we might have to make about them. If the person we admire is wearing a particular kind of coat, we will assume it is a coat we should admire too. If a person we dislike is engaged in some type of business, we will assume that business is also bad. Contradictions occur when we see someone we admire wearing clothing we don’t find acceptable, or when a person we know to have moral flaws engages in altruistic charity work.


Instead of accepting a contradiction in our narrative and creating a more complex story where some people are good in some situations but bad in others, we alter our judgments in other ways to maintain a coherent narrative. The person we like wearing strange clothes is a trendsetter, and that must be the new up-and-coming style we should try to emulate. The bad person engaged in charity isn’t really doing good things for good reasons; rather, they are being selfish and trying to show off through their charity.


When we reflect on our thinking and try to be more considerate of the narratives we create, we can see that we fall into traps like the halo effect. What is harder, however, is overcoming the halo effect and the other cognitive errors that simplify our narratives once we have noticed them. It is hard to continually hold conflicting opinions and ideas about people, cities, sports teams, car companies, and shoe brands. It is much easier to adopt a few favorites and believe them to be good in all ways than to accept that something might be great in some ways but harmful or disappointing in others.

Narrative Fallacies

With perhaps the exception of professional accountants and actuaries, we think in narratives. How we understand important aspects of our lives, such as who we are, the opportunities we have had in life, the decisions we have made, and how our society works is shaped by the narratives we create in our minds. We use stories to make sense of our relationships with other people, of where our future is heading, and to motivate ourselves to keep going. Narratives are powerful, but so are the narrative fallacies that can arise from the way we think.


Daniel Kahneman, in Thinking Fast and Slow, demonstrates the ways in which our brains take shortcuts, rely on heuristics, and create narratives to understand a complex world. He shows how these thinking strategies can fail us in predictable ways due to biases, illusions, and judgments made on incomplete information. Narrative fallacies can arise from all three of these cognitive errors. To get more in depth with narrative fallacies, Kahneman writes,


“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.”


We don’t really know how to judge probabilities, possibilities, and the consequences of things that didn’t happen. We are biased to see agency in people and things when luck was more of a factor than any direct action or individual decision. We are motivated and compelled by stories of the world that simplify the complexity of reality, taking a small slice of the world and turning that into a model to describe how we should live, behave, and relate to others.


Unfortunately, in my opinion, narrative fallacies cannot be avoided. I studied public policy, and one framework for understanding political decision-making that I think deserves far more direct attention is the Narrative Policy Framework, which incorporates the idea of Social Constructions of Target Populations from Anne Schneider and Helen Ingram. We understand the outcome of an event based on how we think about the person or group impacted by its consequences. A long prison sentence for a person who committed a violent crime is fair and appropriate. A tax break for parents who work full time is also fair and appropriate. In both instances, we think about the person receiving the punishment or reward, and we judge whether they deserve it. We create a narrative to explain why we think the outcomes are fair.


We cannot exist in a large society of millions of people without shared narratives that help us explain and understand our society collectively. We cannot help but create a story about a certain person or group of people, and build a narrative to explain why we think that person or group deserves a certain outcome. No matter what, however, the outcomes will not be rational; they will be biased and contain contradictions. We will judge groups positively or negatively based on stories that may or may not be accurate and complete, and people will face real rewards or punishments depending on how we construct our narratives and what biases are built into our stories. We can’t escape this reality, because it is how our brains work and how we create a cohesive society. But we can at least step back and admit that this is how our brains work, admit that our narratives are subject to biases and based on incomplete information, and decide how we want to move forward with new narratives that help unify our societies rather than pit them against each other in damaging competition.

Intuitive Predictions and Intensity Matching

“Intuitive predictions are almost completely insensitive to the actual predictive quality of the evidence,” writes Daniel Kahneman in Thinking Fast and Slow. A lot of our thinking takes place in the part of our brain that is good at making quick connections, detecting patterns, and forming fast judgments. The deeper, more thoughtful part of our brain engages with the world only when it really needs to, when we must do some critical thinking to sort out a math problem, write a blog post, or figure out how to grind down grains to make bread. The result is that much of our thinking happens at a quick, intuitive level that is subject to biases and assumptions based on incomplete information. When we finally do turn our critical-thinking brain to a problem, it operates on only a limited set of information from the quick part of our brain, which scanned the environment and grabbed whatever stood out.


When we make a prediction without sitting down to do some math, or without weighing the factors that influence our prediction with pen and paper, our predictions will seem logical but will miss critical information. We will make connections between ideas and experiences that might not reflect the actual world. We will simplify the prediction by answering an easy question and substituting its answer for the harder question our prediction is really trying to answer.


This year, as in 2016, we will see this in action. In 2016, for me and many of the people I know, it seemed as though very few people supported Donald Trump for president. I saw very few bumper stickers or yard signs for Trump, all the social media posts I saw highlighted his worst moments, and the news coverage I consumed described why he was unfit to be president. Naturally enough, I believed he would lose in a landslide. Of course, that did not happen. Intuitively I was sure that Clinton would win, and Kahneman’s research helps explain why I should have been more skeptical of my natural intuition.


Part of the problem was that my intuitive prediction was an exercise in intensity matching, and as Kahneman writes, “Intensity matching yields predictions that are as extreme as the evidence on which they are based.” All the information I saw highlighted how terrible Trump was. I didn’t see a lot of people supporting Trump, and I didn’t see news stories justifying his candidacy. I didn’t see people in my immediate environment who strongly supported him, so my intuition was biased. It didn’t help that I did nothing to seek out people who did support him, or information outlets that posted articles or stories in his favor.


Kahneman’s writing aligns with my real-world experience. His studies of the brain and of our predictive machinery reveal biases and errors in our thinking. Our intuition is based on the limited set of information that the quick part of our brain can put together, and when we do engage our deep-thinking brain, it can still only operate on that limited information. Even when we think critically, we are likely to make mistakes, because we can’t see the full picture, and biases in the information we absorb will predictably shape the direction of our miscalculations. What feels natural and obvious to us may be the result of faulty intensity matching and random chance in the environment around us.
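
Kahneman’s corrective move in the book is to start from a baseline rate and shift toward the intuitive, intensity-matched prediction only in proportion to how well the evidence actually correlates with the outcome. Here is a minimal sketch of that adjustment in Python; the function name and the numbers are hypothetical illustrations, not figures from the book:

```python
def tame_prediction(baseline, intuitive, correlation):
    """Shrink an intensity-matched prediction back toward the baseline.

    Move from the average outcome toward the intuitive prediction only
    in proportion to how predictive the evidence actually is.
    """
    return baseline + correlation * (intuitive - baseline)

# Hypothetical numbers: the average candidate wins 50% of the vote, my
# intensity-matched gut says 80%, but the evidence I saw (yard signs,
# social media posts) correlates only weakly, say 0.3, with results.
print(tame_prediction(baseline=50, intuitive=80, correlation=0.3))  # 59.0
```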

Regression to the Mean Versus Causal Thinking

Regression to the mean, the idea that there is an average outcome we can expect and that over time individual outliers will revert back toward that average, is a boring phenomenon on its own. If you think about it in the context of driving to work and counting your red lights, you can see why. If you normally hit 5 red lights, and one day you manage to get to work with just a single red light, you would probably expect that the following day you won’t have as much luck with the lights and will hit more red lights than on your lucky one-red-light commute. Conversely, if you have a day where you manage to hit every possible red light, you would probably expect better traffic luck the next day, with a count somewhere closer to your average. This is regression to the mean. Having only one red light, or hitting every red light, doesn’t cause the next day’s lights to behave any differently; you simply expect a more average count of reds versus greens – no causal explanation involved, just random traffic light luck.


But for some reason this idea is both fascinating and hard to grasp in other areas, especially when we think we have some control over the outcome. In Thinking Fast and Slow, Daniel Kahneman helps explain why it is so difficult in some settings to accept regression to the mean, an otherwise rather boring concept. He writes,


“Our mind is strongly biased toward causal explanations and does not deal well with mere statistics. When our attention is called to an event, associative memory will look for its cause – more precisely, activation will automatically spread to any cause that is already stored in memory. Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.”


Unless you truly believe there is a god of traffic lights who rules over your morning commute, you probably don’t assign any causal mechanism to your luck with red lights. But when you consider how well a professional golfer played on the second day of a tournament compared to the first, or whether highly intelligent women marry equally intelligent men, you are likely to reach for some causal story. The golfer was more or less complacent on the second day; the highly intelligent women have to settle for less intelligent men because the highly intelligent men don’t want an intellectual equal. These are examples Kahneman uses in the book, and they present plausible causal mechanisms, but as Kahneman shows, the simpler, though more boring, answer is regression to the mean. A golfer who performs spectacularly on day one is likely to be less lucky on day two. A highly intelligent woman is likely to marry a man whose intelligence is closer to average simply by statistical chance.
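
The golfer example is easy to simulate. Below is a minimal sketch, under the assumption that each day’s score is a fixed skill plus fresh, independent luck; the population size, group size, and distributions are my own illustrative choices:

```python
import random
from statistics import mean

random.seed(42)

# Each golfer's daily score is a fixed skill plus fresh, independent
# luck; nothing about day one influences day two.
n = 10_000
skill = [random.gauss(0, 1) for _ in range(n)]
day1 = [s + random.gauss(0, 1) for s in skill]
day2 = [s + random.gauss(0, 1) for s in skill]

# Take the 500 best day-one performances and look at how those same
# golfers score on day two: on average, roughly halfway back toward
# the mean, with no causal story required.
top = sorted(range(n), key=lambda i: day1[i], reverse=True)[:500]
print(f"day one, top group:  {mean(day1[i] for i in top):.2f}")
print(f"day two, same group: {mean(day2[i] for i in top):.2f}")
```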


When regression to the mean violates our causal expectations, it becomes an interesting and important concept. It reveals that our minds don’t simply observe an objective reality; they observe causal structures that fit preexisting narratives. Our causal conclusions can be quite inaccurate, especially when they are influenced by unwarranted biases and prejudices. If we keep regression to the mean in mind, we might lose some of our exciting narratives, but our thinking will be more sound and our judgments more clear.

Praise, Punishment, & Regression to the Mean

Regression to the mean is seriously underrated. In sports, stock market funds, and biological trends like generational height differences, regression to the mean is a powerful yet misunderstood phenomenon. A rookie athlete may have a standout first year, only to perform less spectacularly the following year. An index fund may outperform all others one year, only to see other funds catch up the next. And a tall man may have a shorter son. In each instance, regression to the mean is at play, but because we underrate it, we assume there is some causal factor making our athlete play worse (it went to his head!), making our fund earn less (they didn’t rebalance the portfolio correctly!), and making our son shorter (his father must have married a short woman).


In Thinking Fast and Slow, Daniel Kahneman looks at the consequences that arise when we fail to understand regression to the mean and try to create causal connections between events where we shouldn’t. Kahneman describes an exercise he ran with Air Force cadets, asking them to flip a coin backwards over their heads and try to hit a spot on the floor. Those who had a good first toss typically did worse on their second, and those who did poorly on their first toss usually did better the next time. There was no skill involved; the outcome was mostly luck and random chance, so if someone was close one time, you might expect their next toss to land a little further out, just by random chance. This is regression to the mean in an easy-to-understand example.


But what happens when we don’t recognize regression to the mean in a random, simplified experiment? Kahneman used the cadets to demonstrate how random deviations from mean performance during flight maneuvers translate into praise or punishment. Those who performed well were often praised, only to regress to the mean on their next flight and perform worse. Those who performed poorly also regressed to the mean, but in an upward direction, improving on the next flight. A cadet whose initial performance was poor received punishment (perhaps just a verbal reprimand) between the initial poor effort and the follow-up improvement (the regression). Kahneman describes the takeaway from the experiment this way:


“The feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.”


Praise a cadet who performed well, and they will then perform worse. Criticize a cadet who performed poorly, and they will do better. Our minds overfit these patterns and start to see a causal link between praise and subsequent poor performance, and between castigation and subsequent improvement. All that is really happening is that we are misunderstanding regression to the mean and creating a causal model where we should not.
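
This perverse feedback loop is easy to reproduce. Here is a minimal simulation, assuming each flight’s quality is an independent random draw and that instructors praise or reprimand only the extreme performances; the thresholds are arbitrary illustrations:

```python
import random
from statistics import mean

random.seed(0)

# Each landing's quality is an independent draw: the feedback has no
# causal effect whatsoever on the next attempt.
flights = [random.gauss(0, 1) for _ in range(100_000)]

after_praise, after_reprimand = [], []
for prev, nxt in zip(flights, flights[1:]):
    if prev > 1.0:        # unusually good landing: instructor praises
        after_praise.append(nxt)
    elif prev < -1.0:     # unusually bad landing: instructor reprimands
        after_reprimand.append(nxt)

# Both groups land near the overall average (~0) on the next flight,
# so the praise looks like it hurt and the reprimand looks like it
# helped, even though neither did anything.
print(f"after praise:    {mean(after_praise):+.2f}")
print(f"after reprimand: {mean(after_reprimand):+.2f}")
```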


If we better understood regression to the mean, we wouldn’t be so shocked when a standout rookie appears to have a sophomore slump. We wouldn’t jump on the bandwagon when an index fund had an exceptional year, and we wouldn’t be surprised by phenotypic regression to the mean from one generation to the next. Our brains are phenomenal pattern-recognizing machines, but sometimes they see the wrong pattern, and sometimes that gives us perverse incentives for how we behave and interact with each other. The solution is to step back from individual cases and look at averages over time. By gathering more data and looking for longer-lasting trends, we can better distinguish regression to the mean from real changes in performance.