The Elephant in the Brain with Psychics and Mediums

In the book The Elephant in the Brain, Robin Hanson and Kevin Simler argue that our own self-interest drives a huge amount of our behavior. On the surface this doesn’t sound like a shock, but if you look at how deeply our self-interest is tied to everything we do, you start to see that we like to pretend we don’t act purely out of self-interest. Instead, we lie to ourselves and others and create high-minded reasons for our beliefs, behaviors, and actions. But our self-interest is never far behind. It is always there as the elephant in the room (or brain), influencing all that we do even as we try to ignore it.
This is likely what happens when people visit psychics and mediums in the hope of learning about their future or reconnecting with the spirit of a lost loved one. Mary Roach describes what is going on with psychics, mediums, and their clients in her book Spook, and I think her explanation is a strong argument for the ideas presented by Hanson and Simler in The Elephant in the Brain. She writes:
“It seems to me that in many cases psychics and mediums prosper not because they’re intentionally fraudulent, but because their subjects are uncritical. The people who visit mediums and psychics are often strongly motivated or constitutionally inclined to believe that what is being said is relevant and meaningful with regard to them or a loved one.”
Both psychics/mediums and their subjects are motivated by self-interests that they don’t want to fully own up to. They both deceive themselves in order to appear to genuinely believe in the experience. If you can fool yourself, it becomes much easier to fool others, and that requires ignoring the elephant (your self-interest) in your brain.
Clients want to believe they are really interacting with the spirit of a lost loved one and not being fooled or defrauded. Critical thinking, and any deliberate acknowledgment that they might be fooled, is set aside and forgotten. Instead, the individual’s self-interest acts behind the scenes as they help create the reality they want to inhabit with the help of the psychic or medium.
The psychics and mediums also don’t want to be viewed as fraudsters and quacks. They hide the fact that they have economic and social motivations to appear to have special powers and to signal their authenticity. If a client is uncritical, it helps the entire process and allows both parties to ignore the self-interest acting below the surface. Ultimately, as Roach argues, the process depends both on practitioners who are willing to believe their subjects are having authentic experiences and on subjects who believe their psychics and mediums are genuinely communicating with the dead. Without either, and without self-deception on both sides, the whole process would fall apart.
Personally and Politically Disturbed by the Homeless

On the first page of the preface of The Homeless, Christopher Jencks writes about the responses that many Americans had to the rise of homelessness in American cities in the 1970s. He writes, “The spread of homelessness disturbed affluent Americans for both personal and political reasons. At a personal level, the faces of the homeless often suggest depths of despair that we would rather not imagine, much less confront in the flesh. … At a political level, the spread of homelessness suggests that something has gone fundamentally wrong with America’s economic or social institutions.”
I think the two books which most accurately describe the way that I understand our political and social worlds are Thinking Fast and Slow by Daniel Kahneman and The Elephant in the Brain by Kevin Simler and Robin Hanson. Kahneman suggests that our brains are far more susceptible to cognitive errors than we would like to believe. Much of our decision-making isn’t really decision-making so much as excuse-making, finding ways to give ourselves agency over decisions that were more or less automatic. Additionally, Kahneman shows that we very frequently, and very predictably, make certain cognitive errors that lead us to inaccurate conclusions about the world. Simler and Hanson show that we often deliberately mislead ourselves, choosing to intentionally buy into our minds’ cognitive errors. By deliberately lying to ourselves and choosing to view ourselves and our beliefs through a false objectivity, we can better lie to others, enhancing the way we signal to the world and making ourselves appear more authentic. [Note: some recent evidence has put some findings from Kahneman in doubt, but I think his general argument around cognitive errors still holds.]
Jencks published his book long before Thinking Fast and Slow and The Elephant in the Brain, but I think his observation hints at the findings that Kahneman, Simler, and Hanson would write about in the coming decades. People wanted to hold onto beliefs they possibly knew or suspected to be false. They were disturbed by a reality that did not match the imagined reality they wanted to believe in. They embraced cognitive errors and adopted beliefs and conclusions based on those errors. They deceived themselves about reality to better appear to believe the myths they embraced, and in the end they developed a political system in which they could signal their virtue by strongly adhering to the initial cognitive errors that sparked the whole process.
Jencks’ quote shows why homelessness is such a tough issue for many of us to face. When we see large numbers of people failing and ending up homeless, it suggests that there is something more than individual shortcomings at work. It suggests that somewhere within society and our social structures are points of failure. It suggests that our institutions, from which we may benefit as individuals, are not serving everyone. This runs against beliefs that reinforce our self-interest, and it is hard to accept. It is much easier to fall back on cognitive illusions and errors and to blame those who have failed. We truly believe that homelessness is the problem of individuals because we are deceiving ourselves, and because it serves our self-interest to do so.
When we see homelessness, we see a reality we want to ignore and pretend does not exist, because we fear it and fear that we may be responsible for it in some way. We fear that homelessness will necessitate changes in the social structures and institutions that have helped us get to where we are, and that those changes may make things harder for us or somehow diminish our social status. This is why we are so disturbed by homelessness, why we prefer not to think about it, and why we develop policies based on the assumption that people who end up homeless are deeply flawed individuals responsible for their own situation. It is also likely why we have not done enough to help the homeless, why it is becoming a bigger issue in American cities, and why we have been so bad at addressing the real causes of homelessness in America. There is some truth to the argument that homelessness is the result of flawed individuals, which is why it is such a strong argument, but we should accept that there are some flawed causal thoughts at play and that it is often in our self-interest to dismiss the homeless as individual failures.
Causal Illusions

In The Book of Why Judea Pearl writes, “our brains are not wired to do probability problems, but they are wired to do causal problems. And this causal wiring produces systematic probabilistic mistakes, like optical illusions.” This can create problems for us when data correlate even though no causal connection exists between the outcomes. According to Pearl, our causal thinking “neglects to account for the process by which observations are selected.” We don’t always realize that we are taking a sample, that our sample could be biased, and that structural factors independent of the phenomenon we are trying to observe could greatly impact the observations we actually make.
Pearl continues, “We live our lives as if the common cause principle were true. Whenever we see patterns, we look for a causal explanation. In fact, we hunger for an explanation, in terms of stable mechanisms that lie outside the data.” When we see a correlation, our brains instantly start looking for a causal mechanism that can explain the correlation and the data we see. We don’t often look at the data itself to ask if some part of the data collection process led to the outcomes we observed. Instead, we assume the data is correct and that it reflects an outside, real-world phenomenon. This is the source of many of the causal illusions that Pearl describes in the book. Our minds are wired for causal thinking, and we will invent causality when we see patterns, even if there truly isn’t a causal structure linking them.
It is in this spirit that we attribute negative personality traits to people who cut us off on the freeway. We assume they don’t like us, that they are terrible people, or that they are rushing to the hospital with a sick child, so that our being cut off has a satisfying causal explanation. When a particular type of car stands out and we start seeing that car everywhere, we misattribute our increased attention and assume that there really are more of those cars on the road now. We assume that people find them more reliable or more appealing and purposely bought them, a causal mechanism that explains why we now see them everywhere. In both cases we are creating causal pathways in our minds that are little more than causal illusions, but we want to find a cause for everything and we don’t always realize that we are doing so. It is important to be aware of these illusions when making important decisions, to think about how the data came to mind, and to ask whether a causal illusion or cognitive error might be at play.
Paternalistic Nudges

In their book Nudge, Cass Sunstein and Richard Thaler argue in favor of libertarian paternalism. Their argument is that our world is complex and interconnected, and it is impossible for people to truly make decisions on their own. Not only is it impossible for people to simply make their own decisions, it is impossible for other people to avoid influencing the decisions of others. Whether we decide to influence a decision in a particular way, or whether we decide to try to avoid any influence on another’s decision, we still shape how decisions are presented, understood, and contextualized. Given this reality, the best alternative is to try to help people make consistently better decisions than they would without aid and assistance.
The authors describe libertarian paternalism by writing:
“The approach we recommend does count as paternalistic, because private and public choice architects are not merely trying to track or to implement people’s anticipated choices. Rather, they are self-consciously attempting to move people in directions that will make their lives better. They nudge.”
The nudge is the key aspect of libertarian paternalism. Forcing people into a single choice, forcing them to accept your advice and perspective, and aggressively trying to change people’s behaviors and opinions do not fit within the libertarian paternalism framework advocated by Sunstein and Thaler. Instead, a more subtle form of guidance toward good decisions is employed. People retain the full range of choices if they want it; their opinions, decisions, and behaviors may be gently constrained, but almost nothing is completely off the table.
“A nudge,” Sunstein and Thaler write, “as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives.”
Daniel Kahneman, in his book Thinking Fast and Slow, demonstrated that people make predictable errors and have predictable biases. If we can understand these thinking errors and biases, then we can identify situations in which they are likely to lead people to make suboptimal decisions. To go a step further, as Sunstein and Thaler would suggest, if we are choice architects, we should design and structure choices in a way that leads people away from predictable cognitive biases and errors. We should design choices in a way that takes those thinking mistakes into consideration and improves the way people understand their choices and options.
As a real-world example, if we are structuring a retirement savings plan, we can be relatively sure that people will anchor around the default contribution rate built into the plan. If we want to encourage greater retirement savings (knowing that economic data indicate people rarely save enough), we can set the default to 8% or higher, knowing that people may reduce the default rate but likely won’t eliminate contributions entirely. Setting a high default is a nudge toward better retirement saving. We could choose not to have a default rate at all, but then people likely wouldn’t be sure what rate to select and might choose a very low rate, or simply not enter a rate at all, failing to contribute anything to the plan. It is clear that there is a better outcome that we, as choice architects, could help people attain if we understand how their minds work and can apply a subtle nudge.
Can We Avoid Cognitive Errors?

Daniel Kahneman is not very hopeful when it comes to our ability to avoid cognitive errors. Toward the end of his book Thinking Fast and Slow, a book all about cognitive errors, predictable biases, and situations in which we can recognize such biases and thinking errors, Kahneman isn’t so sure there is much we can actually do in our lives to improve our thinking.
Regarding his own thinking, Kahneman writes, “little can be achieved without considerable effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”
Kahneman’s book is fantastic in part because of his humility. It would be easy to take a book on illusions, cognitive errors, biases, and predictable fallacies and use it to show how much smarter you are than everyone else who makes such thinking mistakes. However, Kahneman uses his own real-life examples throughout the book to show how common and easy it is to fall into ways of thinking that don’t actually reflect reality. What is unfortunate, though, is how hard it is to take what you learn from the book and apply it to your own life. If the author himself can hardly improve his own thinking, then those of us who read the book likely won’t make big changes in our thinking either.
“The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors,” Kahneman continues. While we might not be able to improve our thinking simply by knowing about cognitive errors and being aware of predictable biases, we can at least recognize them in others. This can help us be more thoughtful when we critique or gossip about others (something we all do even if we claim we don’t).
Beyond improving the way we gossip or judge others, Kahneman’s research and his book are incredibly valuable for anyone in a design-focused role. If you are creating a layout for a webpage, a seating arrangement at a restaurant, or the standard operating procedures for a company, you have an opportunity to design and develop a process and flow that takes cognitive errors and predictable biases into account. Because it is easier to observe others making mistakes than to observe those mistakes in ourselves, we can watch for situations where people are led astray, and help get them back on course. We can develop systems and structures that take our biases and cognitive errors into account and minimize the damage they may do. We can set the world up to guide us in a reasonable way through our cognitive errors and biases, but only if we know what to look for.
The Remembering Self and Time

Time, as we have known it, has only been with human beings for a small slice of human history. The story of time zones is fascinating, and it really began once railroads connected the United States. Before we had a standardized system for operating within time, human lives were ruled by the cycle of the sun and the seasons, not by the hands of a watch. This is important because it suggests that the time bounds we put on our lives, the hours of our schedules and work days, and the way we think about the duration of meetings, movies, a good night’s sleep, and flights are not something our species truly evolved to operate within.
In Thinking Fast and Slow, Daniel Kahneman shows one of the consequences of human history being out of sync with modern time. “The mind,” he writes, “is good with stories, but it does not appear to be well designed for the processing of time.”
I would argue that this makes sense and should be expected. Before we worked set schedules defined by the clock, before we could synchronize the start of a football game with TV broadcasts across the world, and before we all needed to be at the same place at precisely the right time to catch a departing train, time wasn’t very important. It was easy to tie time to sunrise, sunset, or midday compared to a 3:15 departure or a 7:05 kickoff. The passage of time also didn’t matter that much. The difference between being 64 and 65 years old wasn’t a big deal for humans who didn’t receive retirement benefits and social security payments. We did not evolve to live in a world where every minute of every day is tightly controlled by time and where the passage of time is tied so specifically to events in our lives.
For me, and I think for Daniel Kahneman, this may explain why we see some of the cognitive errors we make when we remember events from our past. Time wasn’t as important a factor for ancient humans as storytelling was. Kahneman continues,
“The remembering self, as I have described it, also tells stories and makes choices, and neither the stories nor the choices properly represent time. In storytelling mode, an episode is represented by a few critical moments, especially the beginning, the peak, and the end. Duration is neglected.”
When we think back on our lives, on moments that meant a lot to us, on times we want to relive, or on experiences we want to avoid in the future, we remember the salient details. We don’t necessarily remember how long everything lasted. My high school basketball days are not remembered by the hours spent running UCLAs, by the number of Saturdays I had to be up early for 8 a.m. practices, or by the hours spent in drills. My memories are made up of a few standout plays, games, and memorable team moments. The same is true for my college undergrad memories, the half-marathons I have raced, and my memories from previous homes I have lived in.
When we think about our lives we are not good at thinking about the passage of time, about how long we spent working on something, how long we had to endure difficulties, or how long the best parts of our lives lasted. We live with snapshots that can represent entire years or decades. Our remembering self drops the less meaningful parts of experiences from our memories, and holds onto the start, the end, and the best or worst moments from an experience. It distorts our understanding of our own history, and creates memories devoid of a sense of time or duration.
I think about this a lot because our minds and our memories drive how we behave and how we understand the present moment. However, duration neglect helps us see that the reality of our lives is shaped by unreality. We are influenced by cognitive errors and biases, by poor memories, and by distortions of time and experience. It is important to recognize how faulty our thinking can be, so we can develop systems, structures, and ways of thinking that don’t assume we are always correct, but instead help guide us toward better and more realistic ways of understanding the world.
The Focusing Illusion Continued

I find the focusing illusion as described by Daniel Kahneman in his book Thinking Fast and Slow fascinating because it reveals how strange our actual thinking is. I am constantly baffled by the way our brains continuously and predictably make mistakes. The way we think about, interpret, and understand the world is not based on an objective reality, but on what our brain happens to be focused on at any given time. As Kahneman writes, what you see is all there is, and the focusing illusion is a product of our brain’s limited ability to take in information combined with its tendency to substitute simpler questions for difficult and complex ones.
In the book, Kahneman asks us to think about the overall happiness of someone who recently moved from Ohio to California, and also asks us to think about the amount of time that paraplegics spend in a bad mood. In both situations, we make a substitution. We know that people’s overall happiness and general moods are made up of a huge number of factors, but when we think about the two situations, we focus on a couple of simple ideas.
We assume the person from Ohio is happier in California because the weather in California is always perfect while Ohio experiences cold winters. The economic prospects in California might be better than in Ohio, and there are more movie stars and surfing opportunities. Without knowing anything about the person, we probably assume the California move made them happier overall (especially given the additional context and priming around weather and job prospects that Kahneman presents in the example in his book).
For our assumptions about the paraplegic, we likely go the other way. We think about how we would feel if we were in an accident and lost the use of our legs or arms. We assume their life must be miserable and that they spend much of their day in a bad mood. We don’t make a complex consideration of the individual’s life or ask for more information about them; we just make an assumption based on limited information by substituting in the question, “How would I feel if I became paralyzed?” Of course, people who are paralyzed or lose the function of part of their body are still capable of a full range of human emotions, and might still find happiness in many areas of their lives.
Kahneman writes, “The focusing illusion can cause people to be wrong about their present state of well-being as well as about the happiness of others, and about their own happiness in the future.”
We often say that it is important to know ourselves and to be true to ourselves if we want to live healthy and successful lives. But research throughout Thinking Fast and Slow shows us how hard that can be. After reading Kahneman’s book, learning about nudges from Cass Sunstein and Richard Thaler, and learning how poorly we process risk and chance from Gerd Gigerenzer, I constantly doubt how much I can really know about myself, about others, or really about anything. I am frustrated when people act on intuition, sure of themselves and their ideas in complex areas such as economics, healthcare, or education. I am dismayed by advertisements, religions, and political parties that encourage us to act tribally and to trust our instincts and intuitions. It is fascinating that we can be so wrong about something as personal as our own happiness. It is fascinating that we can be so biased in our thinking and judgment, and that we can reach conclusions and make assumptions about ourselves and others with limited information without even noticing how poor our thought processes are. I love thinking and learning about the biases and cognitive errors of our minds, and it makes me pause when I am sure of myself and when I think that I am clearly right and others are wrong. After all, if what you see is all there is, then your opinions, ideas, and beliefs are almost certainly inadequate to actually describe the reality you inhabit.
Sunk-Cost Fallacy

Every time I pick the wrong line at the grocery store I am reminded of the sunk-cost fallacy. There are times I will be stuck in line, see another line moving more quickly, and debate internally if I should jump to the other line or just wait it out in the line I’m already in. Once I remember the sunk-cost fallacy, however, the internal debate shifts and I let go of any feeling that I need to remain in the current line.
My grocery store example is a comical take on the sunk-cost fallacy, but in real life, this cognitive error can have huge consequences. Daniel Kahneman describes it this way, “The decision to invest additional resources in a losing account, when better investments are available, is known as the sunk-cost fallacy, a costly mistake that is observed in decisions large and small.”
We are going to make decisions and choices about where to invest our time, attention, and money that will turn out to be mistakes. At a certain point we have to realize when something is not working and walk away. Doing so, however, requires that we admit failure, cut our losses, and search for new opportunities. Admitting that we were wrong, giving up on losses, and searching for new avenues is difficult, and it is not uncommon for us to keep moving forward despite our failures, as if we just need to try harder and push more in order to find the success we desire. This is the basis of the sunk-cost fallacy. When we have invested a lot of time, energy, and resources into something, it is hard to walk away, even if we would be better off by doing so.
Pursuing a career path that clearly isn’t panning out and refusing to try a different avenue is an example of the sunk-cost fallacy. Movie studios that try to reinvent a character or story over and over despite continued failure are another example. Sitting through the terrible movie the studio produced, rather than leaving the theater early, is also an example. In all of these instances, an investment has been made, and costly efforts to make the investment pay off are undertaken, generally at a greater loss than we would incur if we made a change and walked away.
When you find yourself saying, “I have already spent so much money on XYZ, or I have already put so much effort into making XYZ work, and I don’t want to just let that all go to waste,” you are stuck in the middle of the sunk-cost fallacy. At this point, it is time to step back, look at other ways you could spend your money and time, and honestly evaluate what your priorities should be. Doing so, and remembering Kahneman’s quote, will help you begin to make the shift to a better use of your time, energy, and resources. It may be embarrassing and disappointing to admit that something is going in the wrong direction, but ultimately, you will end up in a better and more productive spot.
Denominator Neglect

“The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects,” writes Daniel Kahneman in Thinking Fast and Slow.
One thing we have seen in 2020 is how difficult it is to communicate and understand risk. Thinking about risk requires thinking statistically, and thinking statistically doesn’t come naturally to our brains. We are good at thinking in terms of anecdotes, and our brains like to identify patterns and potential causal connections between specific events. When our brains have to predict chance and deal with uncertainty, they easily get confused. Our brains shift to solving easier problems rather than complex mathematical ones, substituting the answer to the easy problem without realizing it. Whether it is our risk of getting COVID or the probability we assigned to election outcomes before November 3rd, many of us have been thinking poorly about probability and chance this year.
Kahneman’s quote above highlights one example of how our thinking can go wrong when we have to think statistically. Our brains can be easily influenced by numbers that shouldn’t matter, and that can throw off our decision-making when it comes to dealing with uncertainty. To demonstrate denominator neglect, Kahneman presents a choice between two large urns full of white and red marbles. If you pull a red marble from an urn, you are a winner. The first urn has 10 marbles in it, 9 white and 1 red. The second urn has 100 marbles in it, 92 white and 8 red. Statistically, we should try our luck with the 10-marble urn, because 1 out of 10, or 10%, of its marbles are red. In the second urn, only 8% of the marbles are red.
When asked which urn they would want to select from, many people select the second urn, leading to what Kahneman describes as denominator neglect. The chance of winning is lower with the second urn, but there are more winning marbles in the jar, making it seem like the better option if you don’t slow down and engage your System 2 thinking processes. If you pause and think statistically, you can see that the first urn provides better odds, but if you are moving quickly, your brain can be distracted by the larger number of winning marbles and lead you to a worse choice.
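To put numbers on the comparison, here is a minimal sketch (my own illustration, not from Kahneman’s book) that computes the win probability for each urn described above; only the marble counts come from the example, everything else is illustrative.

```python
# Compare the two urns from the denominator neglect example above.
# The marble counts come from the text; the helper function is illustrative.

def win_probability(red: int, total: int) -> float:
    """Chance of drawing a red (winning) marble from an urn."""
    return red / total

urn_small = win_probability(red=1, total=10)    # 1 red out of 10 marbles
urn_large = win_probability(red=8, total=100)   # 8 red out of 100 marbles

print(f"Small urn: {urn_small:.0%} chance of winning")   # 10%
print(f"Large urn: {urn_large:.0%} chance of winning")   # 8%
print("Better bet:", "small urn" if urn_small > urn_large else "large urn")
```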
What is important to recognize is that we can be influenced by numbers that shouldn’t mean anything to us. The number of winning marbles shouldn’t matter, only the percent chance of winning should matter, but our brains get thrown off. The same thing happens when we see sale prices, think about the risk of a family gathering of 10 people during a global pandemic, or think about polling errors.
I like to check The Nevada Independent‘s COVID-19 tracking website, and I have noticed denominator neglect in how I think about the numbers they report. For a stretch, the number of positive cases Nevada reported each day was decreasing, but our positivity rate was staying the same. Statistically, nothing was really changing about the state of the pandemic in Nevada; fewer tests were being completed and reported each day, so the number of positive results was decreasing along with them. If you scroll down the Nevada Independent website, you will get to a graph of the positivity rate and see that things were staying the same. When looking at the decreasing number of positive tests reported, my brain was neglecting the denominator: the number of tests completed. The way I understood the pandemic was biased by the big headline number, and wasn’t really based on how many people out of those tested did indeed have the virus. Thinking statistically provides a more accurate view of reality, but it can be hard to do, and it is tempting to look at just a single headline number.
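To make the denominator neglect concrete, here is a small sketch with made-up numbers (not the actual Nevada figures) showing how the headline count of positives can fall by half while the positivity rate stays exactly the same.

```python
# Hypothetical figures chosen only to illustrate denominator neglect --
# these are NOT the actual Nevada statistics.
days = [
    {"label": "earlier day", "tests": 10_000, "positives": 1_200},
    {"label": "later day",   "tests": 5_000,  "positives": 600},
]

for day in days:
    rate = day["positives"] / day["tests"]
    print(f'{day["label"]}: {day["positives"]} positives out of '
          f'{day["tests"]} tests -> positivity {rate:.1%}')

# The headline number of positives falls from 1,200 to 600, but the
# positivity rate stays at 12.0% -- only the denominator (tests completed)
# changed, not the underlying spread of the virus.
```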
Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted the Stock Market Crash Makes New Prediction.” These headlines are nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation; how could anyone have failed to see this as far back as 2004?
The problem, of course, is that the inevitable present moment and the pathway that seems so obvious in retrospect were never clear at all. There was no way to predict a major housing bubble and financial collapse in 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction, are pulling at our emotions and playing with hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean with their predictions, and assume that their next grand claim is wrong.
Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”
Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future, lead us to trust individuals who don’t deserve trust, and lead us to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as fodder for misleading headlines.