
The Hindsight Fallacy & the How & Why of History

Looking forward and making predictions about the future is incredibly difficult, and very few people can do it reliably well. But when we look to the past, almost all of us can describe what did happen. It is easy to look back, see how a series of events unfolded, draw connections between certain conditions and eventual outcomes, and feel confident that we understood why things unfolded as they did. This confidence is misleading, yet it is something we can reliably expect from people.
 
 
The hindsight fallacy describes our overconfidence in explaining what happened in the past and in identifying the causal factors that produced the outcomes we observed. When the college football playoff ends this year, sports commentators will have a compelling narrative about why the winning team was able to pull through. When the stock market jumps or dips in the next year, analysts will look backward and connect the dots that caused the rise or fall of the market. Their explanations will be confident and narratively coherent, making the analysts and commentators sound like well-reasoned individuals.
 
 
However, “every point in history is a crossroads,” writes Yuval Noah Harari. Strange and unpredictable things can happen at any point in history, and the causal factors at work are hard to determine. It is worth remembering that the best social science studies return a correlation coefficient (r) of about 0.4 at most; r reflects how well a model fits reality, and an r of 0.4 corresponds to an R² of only 0.16, meaning the model explains roughly 16 percent of the variance in outcomes. In other words, the best social science studies we can conduct barely reflect the reality of the world. It is unlikely that any commentator, even a seasoned football announcer or stock market analyst, really understands causality well enough to be confident in what caused what, even in hindsight. Major shifts could happen because someone was in a bad mood. Unexpected windfalls could create new and somewhat random outcomes. Humans can think causally, and this helps us better understand the world, but we can also be overconfident in our causal reasoning.
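To make that concrete, here is a minimal Python sketch of what an r of 0.4 actually buys you. The predictor, outcome, and sample size are all invented for illustration; the only claim is the relationship between r and variance explained:

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical data: a predictor correlated with an outcome at roughly
# r = 0.4, the upper end of what strong social science models achieve.
n = 10_000
r_target = 0.4
predictor = rng.standard_normal(n)
noise = rng.standard_normal(n)
outcome = r_target * predictor + np.sqrt(1 - r_target**2) * noise

r = np.corrcoef(predictor, outcome)[0, 1]
print(f"correlation r      ~ {r:.2f}")
print(f"variance explained ~ {r**2:.2f}")  # R^2 ~ 0.16
# Roughly 84% of the variation in outcomes is left unexplained,
# which is why confident causal stories about the past are suspect.
```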
 
 
Harari continues, “the better you know a particular historical period, the harder it becomes to explain why things happened one way and not another. Those who have only a superficial knowledge of a certain period tend to focus only on the possibility that was eventually realized.” Harari's point is that we can become very good at describing how things happened in the past, but not at explaining why. We can trace each step and each development that unfolded ahead of a terrorist attack, an army's victory, or, in the example Harari uses in his book, the adoption of Christianity in the Roman Empire. But we can't always explain the exact causal pathway of each step. If we could, then we could identify the specific historical crossroads where history took one path and not another, and make reasonable predictions about how the world would have looked had history followed the alternative. But we really can't do this. We can look back and identify factors that seemed important in a historical development, but we can't always explain why those factors mattered in one situation and not another. There is too much randomness, too much chance, and too much complexity for us to be confident in the causal pathways we see. We won't stop thinking causally, of course, but we should at least be more open to a wide range of possibilities, and less confident in our assessments of history.
 
 
One of my favorite examples of hindsight bias in action is in Good to Great by Jim Collins. In the book, published in 2001, Collins identifies 11 companies that had jumped from being good companies to great companies. One of the companies identified, Circuit City, was out of business before Collins published his subsequent book. Another, Wells Fargo, is now one of the most hated companies in the United States. A third, Fannie Mae, was at the center of the 2008 financial crisis, and a fourth, Gillette, was purchased by P&G and is no longer an independent entity. A quick search suggests that the companies in the Good to Great portfolio have underperformed the market since the book's publication. It is likely that the success of the 11 companies included a substantial amount of randomness, which Collins and his team failed to incorporate into their analysis. Hindsight bias was at play both in the selection of the 11 companies and in the explanation for why they grew so substantially over the period Collins examined.

Hindsight Bias and Accountability

“Increased accountability is a mixed blessing,” writes Daniel Kahneman in his book Thinking, Fast and Slow. This is an idea I have come across before in books like Political Realism by Jonathan Rauch and The New Localism by Bruce Katz and Jeremy Nowak. Our go-to answer to challenges and problems tends to be increased transparency and greater oversight. However, in some complex fields, simply opening processes and decision-making procedures to more scrutiny and review can create new problems that are even worse. This is a particular challenge when we consider the way hindsight bias influences the thoughts and opinions of reviewing bodies.

 

Kahneman continues, “because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and – to an extreme – reluctance to take risks.”

 

Excess scrutiny and oversight can lead to rigid and mechanical decision-making processes. This might not be a problem when we are engineering a bridge and need to make technical decisions based on known mathematical calculations (I've never engineered a bridge, so I may be wrong here), but it can be a problem for doctors and policymakers. Doctors have to rely on their experience, their knowledge, and their intuitions to determine the best possible medical treatment. Checklists are fantastic tools, but when things go wrong in an operating room, doctors and nurses have to make quick decisions that balance risk and uncertainty. If the oversight they will face is intense, there is a chance that doctors stick to a rigid set of steps that might not fit the emergency at hand. In his book, Kahneman writes about how this leads doctors to order unnecessary tests and procedures, more to cover themselves from liability than to truly help the patient, wasting time and money within the healthcare system.

 

For public decision-making, hindsight bias can be a disaster for public growth and development. The federal government makes loans and backs many projects. Like any venture capital firm or large bank making multiple investments, some of its projects will fail. It is impossible to know at the outset which of ten solar energy projects will be a massive success and which will go bust. But thanks to hindsight bias and the intense oversight that public agencies and legislatures are subject to, an investment in a solar project that goes bust is likely to haunt the agency head or legislators who backed it, even if the other nine projects were huge successes.
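A quick back-of-the-envelope sketch makes the point. The dollar figures and return multiples below are invented for illustration; the only claim is the arithmetic:

```python
# A hypothetical portfolio of ten loans, each backing a solar project.
# Assumed numbers: nine projects return 1.5x, one fails completely.
investment_per_project = 10_000_000   # $10M each, assumed
returns = [1.5] * 9 + [0.0]           # multiplier on each investment

total_invested = investment_per_project * len(returns)
total_returned = sum(investment_per_project * r for r in returns)

print(f"invested:       ${total_invested:,.0f}")   # $100,000,000
print(f"returned:       ${total_returned:,.0f}")   # $135,000,000
print(f"portfolio gain: {total_returned / total_invested - 1:.0%}")  # +35%
# The portfolio is clearly profitable, yet hindsight-driven oversight
# will fixate on the single $10M loss.
```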

 

Oversight is important, but when oversight is subject to hindsight bias, accountability shifts into high gear, blaming decision-makers for failing to have the superhuman ability to predict the future. This creates risk-averse institutions that stagnate, waste resources, and are slow to act, potentially creating new problems and new vulnerabilities to hindsight bias in the future. Rauch, Katz, and Nowak, in the books mentioned above, all favor reducing transparency in the public setting for this reason, but Kahneman might not agree with them, arguing that shielding deliberations from public view won't hide the outcomes from the public, and won't stop hindsight bias from being an issue.

Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes New Prediction.” These headlines are nothing but clickbait, and reading Daniel Kahneman's book Thinking, Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges with misinformation; how could anyone have failed to see this as far back as 2004?

 

The problem, of course, is that the inevitable present moment, and the pathway that seems so obvious in retrospect, was never clear at all. There was no way in 2006 to predict the major housing bubble and financial collapse of 2008. Headlines introducing some genius who saw what the rest of us couldn't see before the Great Recession, and claiming that this person has made another prediction, pull at our emotions and exploit hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean in their predictions, and assume that their next grand claim is wrong.
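A small simulation illustrates why a past lucky call tells us so little. The pundit count and the 10 percent hit rate below are made-up assumptions; the point is only that, given enough random forecasters, someone always “predicted” the crash, and their next calls fall back to the base rate:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# Hypothetical scenario: 1,000 pundits each make market calls purely at
# random, with a 10% chance of a correct "crash call" in any given year.
n_pundits, hit_rate = 1_000, 0.10

first_call = rng.random(n_pundits) < hit_rate    # who "called" the crash
second_call = rng.random(n_pundits) < hit_rate   # their next prediction

lucky = first_call.sum()
right_again = (first_call & second_call).sum()
print(f"pundits who called the first crash: {lucky}")      # ~100
print(f"of those, right again next time:    {right_again}")  # ~10
# Chance alone produces a crop of "geniuses" after every crash, and
# their success rate on the next call is no better than anyone else's.
```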

 

Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”

 

Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what to expect in the future, leading us to trust individuals who don't deserve it and to mistrust those who are making the best possible decisions under a set of serious constraints. Hindsight bias deserves greater recognition, and more respect than serving as a hook for misleading headlines.