Using Misinformation & Disinformation for Political Purposes

“A relentless barrage of misleading pronouncements about a given subject,” writes Quassim Cassam in Vices of the Mind, “can deprive one of one’s prior knowledge of that subject by muddying the waters and making one mistrust one’s own judgement.”
This sentence seems to perfectly describe the four-year presidency of Donald Trump. The former President of the United States said a lot of things that could not possibly be true, and didn’t seem to care whether his statements were accurate. There were times when he was clearly trying to mislead the nation, and times when he simply didn’t seem to know what he was talking about and made up claims that sounded good in the moment. Whether or not he was deliberately trying to mislead the public, his statements often had the same effect: they created confusion, a buzz around a particular topic, and a dizzying array of rebuttals, supporting arguments, and complicated fact-checks.
The President’s epistemic insouciance created confusion and bitter arguments that he could spin for his own political gain. He would lie about meaningless topics and then criticize people for focusing on narrow and unimportant falsehoods. He would say random and sometimes contradictory things, creating so much confusion around a topic that people had trouble understanding what the argument was even about and began to doubt factual information and reporting. The result was a blurring of the line between reputable, fact-based reporting and hyperbolic, opinionated reporting.
A clear lesson from Trump’s presidency is that we need to do a better job of holding elected officials to a higher standard for their statements. Unfortunately, it often goes against our self-interest or group interest to hold the elected officials we favor to high standards. If we generally like a politician who happens to be epistemically insouciant, it is hard to vote against them, even if we know what they say is wrong or deliberately misleading. As many of Trump’s supporters demonstrated, it can be more comfortable to perform complex mental gymnastics to excuse obviously inept and dangerous behaviors than to admit that our favored politician is lazy and incompetent.
Knowledge and accurate beliefs are important. We have entered a period in human history where we depend on complex systems. Whether it is infrastructure, supply chains, or human impacts on the climate, our actions and behaviors are part of large interconnected systems. None of us can understand these systems on our own, so we depend on experts who can help us make sense of how we relate to larger wholes. We need to invest in and develop systems and structures that encourage and facilitate knowledge. Using misinformation and disinformation for political purposes inhibits knowledge and makes us more vulnerable to system collapse when we cannot effectively and efficiently coordinate our actions as complex systems change or break. Going forward, we have to find a way to prevent the epistemically insouciant from muddying the waters and clouding our knowledge.
Epistemically Malevolent & Epistemically Insouciant

Over the last few years I feel as though I have seen an increase in the number of news outlets and reporters saying that we now live in a post-truth society. The argument is that truth and accuracy no longer matter to many people, and that we live in a world where people simply believe what they want to believe, regardless of the evidence. This argument is supported by documented instances of fake news, by a former US president who didn’t seem to care what the truth was, and by politicians and everyday people professing clearly inaccurate beliefs as a type of loyalty test. This puts us in a position where it becomes difficult to communicate important information and to create a coherent narrative, based on accurate details, about the events of our lives.
Two concepts that Quassim Cassam discusses in his book Vices of the Mind can help us think about what it means to be in a post-truth society. Cassam writes, “one can be epistemically malevolent without being epistemically insouciant.” To me, it seems that a post-truth society depends on both malevolence and insouciance to exist, and I find it helpful to see the distinction between these two postures toward knowledge.
To be epistemically malevolent means to intentionally and deliberately attempt to hinder and limit knowledge. Cassam uses the example of tobacco companies deliberately misleading the public about the dangers of smoking. Company executives intentionally made efforts to hide accurate scientific information and to mislead the public. In recent years we have seen epistemic malevolence in the form of fake news, misinformation, and disinformation intended to harm political opponents and discourage voter turnout for opposing political parties.
Epistemic insouciance doesn’t necessarily have a malicious intent behind it. Instead, it is characterized by an indifference to the accuracy of information. You don’t need an intentional desire to spread false information in order to be epistemically insouciant. However, this careless attitude toward accuracy is in some ways necessary for false information to take hold. Individuals who care whether their knowledge and statements are correct are less likely to be pulled in by the epistemically malevolent, and less likely to spread their messages. However, someone who favors what the epistemically malevolent have to say and is unwilling to be critical of the message is more likely to engage with such false messaging and to echo and spread malevolent lies. Even if an individual doesn’t intend to mislead, insouciance plays into malevolence.
This helps us see that our post-truth society will need to be addressed on two fronts. First, we need to understand why people are epistemically insouciant and find ways to encourage them to be more concerned with the accuracy of their statements and beliefs. External nudges, social pressures, and other feedback should be developed to promote factual statements and to discourage epistemic insouciance. This is crucial to getting people to both recognize and denounce epistemic malevolence. Once people care about the accuracy of their beliefs and statements, we can increase the costs of deliberately spreading false information. As things stand now, epistemic insouciance encourages epistemic malevolence. Combating epistemic malevolence will require that we address epistemic insouciance first and then turn our attention to stopping the spread of deliberate falsehoods and fake news.
Epistemic Insouciance

Dictionary.com defines insouciant as free from concern, worry, or anxiety; carefree; nonchalant. To be epistemically insouciant, then, is to be carefree or nonchalant regarding knowledge. Epistemic insouciance can be characterized as a lack of concern for accurate information, true beliefs, and verifiable knowledge. Whether you know something or not, and whether what you think you know is correct, is of little concern.
In Vices of the Mind, Quassim Cassam writes the following about epistemic insouciance:
“Epistemic insouciance means not really caring about any of this [whether claims are grounded in reality or the evidence] and being excessively casual and nonchalant about the challenge of finding answers to complex questions, partly as a result of a tendency to view such questions as much less complex than they really are.”
Cassam goes on to define epistemic insouciance as an attitude vice, distinct from the other epistemic vices in the book that he characterizes as thinking-style vices or character-trait vices. To show how it operates as an attitude vice, Cassam uses reporting from the Brexit campaign to demonstrate how a lack of concern for evidence and for the impact of complex questions reflected an epistemically insouciant attitude. According to Cassam, reports indicated that Boris Johnson, the current British Prime Minister, did not care much about the actual outcome of the vote on remaining in or leaving the European Union. Johnson eventually wrote an article supporting the decision to leave, but he reportedly also had an article drafted supporting the decision to remain, in case that side won the referendum. His interest was in backing the winning position, not in the hard work of trying to determine which side he should support and what the actual social, financial, and long-term impacts of the choices would be. He didn’t care about the evidence and information surrounding the decision; he cared about looking like he was on the right side.
Epistemic insouciance is not limited to politicians. We can all be guilty of epistemic insouciance, and in some ways we cannot move through the world without it. At the moment, I need to make a decision regarding a transmission repair for a vehicle of mine. I have many important concerns beyond this vehicle’s transmission, and many responsibilities that I think deserve my focus more than the transmission issue. I am not interested in really evaluating any evidence to support the decision I eventually make about repairing the transmission or just getting rid of the vehicle. If I were not epistemically insouciant on this issue, I would research the costs more thoroughly, try to understand how much use I could get out of the vehicle if I repaired it, and consider alternatives such as what it could be sold for and what I would spend on a better vehicle. However, this is a lot of work for an item that is not a major concern for me right now. I can save the mental energy and attention for more important issues.
Our minds are limited. We cannot be experts in all areas and all decisions that we have to make. Some degree of epistemic insouciance is sometimes necessary, even if it can be financially costly. However, it is important that we recognize when we are being epistemically insouciant and that we try to understand the risks associated with this attitude in our decisions. We should ensure that we are not epistemically insouciant on the most important decisions in our lives, and we should try to clear out the mental clutter and habits that may make us epistemically insouciant on those important issues.

When to Stop Thinking

My last post was about closed-mindedness and focused on how closed-minded people fail to make the inquiries needed to gain the information necessary to make good decisions and accurately understand the world. What the post didn’t ask is when we should stop thinking and make a decision, versus when we should continue our investigations to gain more knowledge. A serious problem, and one we avoid when we are closed-minded, is often referred to as paralysis by analysis. It occurs when you lack confidence in decision-making and continually seek more information before making a decision, potentially delaying your choice or any action indefinitely.
Writing about this idea in Vices of the Mind, Quassim Cassam notes, “our investigations can be open-ended and there is often, though not always, scope for further investigation.” Sometimes we are asking questions and doing research on continually evolving topics. Sometimes we are working at a cutting edge where changes in politics, markets, social trends, and scientific breakthroughs can influence what we do from day to day. There is never a final answer, and we have to continually seek new information in order to adapt. However, this doesn’t mean that we can’t make important decisions that require thoughtful deliberation.
“A good investigator,” Cassam writes, “has a sense of when enough is enough and diminishing returns are setting in. But the decision to call a halt at that point isn’t properly described as closed-minded. What allows us to get on with our lives isn’t closed-mindedness but the ability to judge when no further research into the question at hand is necessary.”
Closed-minded people make decisions while ignoring pertinent information. Open-minded people make decisions while ignoring extraneous information. Over time, if we practice long enough, each of us should improve our judgement and become better at recognizing the diminishing returns of continued research. We might continue to learn a bit more as we keep studying, but the value of each new piece of information will be smaller and smaller, and at some point it won’t truly impact our decisions. A novice might have trouble identifying this point, but an expert should be better at it. A closed-minded person doesn’t look for this optimal point, but an open-minded person does, continually updating their priors and judgements about when they have enough information to make a decision, rather than rigidly locking in with a specific set of information. This is how we avoid analysis paralysis and how we improve our decision-making over time to get on with our lives, as Cassam writes.
Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, the best way to get to the grocery store and the post office in a single trip, and how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take while still leaving us a cushion in case something runs long. In planning a shopping trip we are confronted with a version of the traveling salesman problem, which sits at the heart of the famous P versus NP question, one of the most vexing open problems in mathematics. And the price of bread was once the focus of teams of Soviet economists who could not pinpoint the price for a loaf that would create the right supply to match the population’s demand.
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don’t spend the whole night doing math to find the perfect time to set an alarm, don’t lose the entire day calculating the best route to run all our errands, and don’t burn enormous brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing that we ought to leave the house ready for work by a certain time to reduce the risk of being late. We stick to main roads and travel familiar routes to get where we need to go, eliminating the thousands of right- or left-turn alternatives we could choose from. We rely on open markets to determine the price of bread without setting a universal standard.
Rules of thumb are necessary in a complex world, but that doesn’t mean they are without their own downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman’s Thinking Fast and Slow, “We are hard-wired to use simple rules of thumb (‘heuristics’) to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors.” Useful but inadequate rules of thumb can create predictable and reliable errors and mistakes. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to be open to improvements or alternatives that would make our decision-making better.
What is important to recognize is that systematic and predictable errors from rules of thumb can be corrected. If we know where errors and mistakes are systematically likely to arise, then we can take steps to mitigate and reduce those errors. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of rules of thumb that we know are likely to produce errors, biases, and inaccurate judgements and assumptions. Companies, governments, and markets do this all the time, though not always in a tidy step-by-step process (sometimes it is one step forward and two steps back), leading to progress over time. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive pitfalls of heuristics.
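The errand route from above is a simple place to see this. As a toy sketch (my own illustration, not an example from Cassam or Kahneman), a greedy "drive to the closest remaining stop" rule is fast and usually reasonable, but on some layouts it reliably produces a longer route than the best ordering:

```python
import itertools
import math

# Toy illustration (not from Cassam or Kahneman): home plus three errand
# stops laid out along a single street, as (x, y) coordinates.
HOME = (0.0, 0.0)
STOPS = [(1.0, 0.0), (-2.0, 0.0), (5.0, 0.0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def route_length(order):
    # Total distance: home -> each stop in the given order -> home.
    points = [HOME, *order, HOME]
    return sum(dist(p, q) for p, q in zip(points, points[1:]))

def nearest_neighbor(stops):
    # Rule of thumb: always drive to the closest remaining stop.
    remaining, current, order = list(stops), HOME, []
    while remaining:
        nxt = min(remaining, key=lambda s: dist(current, s))
        remaining.remove(nxt)
        order.append(nxt)
        current = nxt
    return order

def best_route(stops):
    # Exhaustive search: only feasible for a handful of stops.
    return min(itertools.permutations(stops), key=route_length)

print(f"Rule of thumb: {route_length(nearest_neighbor(STOPS)):.0f} units")  # 16
print(f"Best ordering: {route_length(best_route(STOPS)):.0f} units")        # 14
```

The greedy rule gets lured to the closest stop first and then has to backtrack past home, a predictable miss of exactly the kind the quote describes: the heuristic is usually good enough, but it fails in a systematic, foreseeable way.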

Closed-Mindedness

One of the epistemic vices that Quassim Cassam describes in his book Vices of the Mind is closed-mindedness. An epistemic vice, Cassam explains, is a pattern of thought or behavior that obstructs knowledge. Epistemic vices systematically get in the way of learning, communicating, or holding on to important and accurate information.
Regarding closed-mindedness, Cassam writes, “in the case of closed-mindedness, one of the motivations is the need for closure, that is, the individual’s desire for a firm answer to a question, any firm answer as compared to confusion and/or ambiguity [Italics indicate quote from A.W. Kruglanski]. This doesn’t seem an inherently bad motive and even has potential benefits. The point at which it becomes problematic is the point at which it gets in the way of knowledge.”
This quote about closed-mindedness reveals a couple of interesting aspects of the way we think and the patterns of thought that we adopt. It shows that we can become closed-minded without intending to be closed-minded people. I’m sure very few people think it is a good thing to close ourselves off from new information or diverse perspectives about how our lives should be. Instead, we seek knowledge and we prefer feeling as though we are correct and as though we understand the world we live in. Closed-mindedness is in some ways a by-product of living in a complex world where we have to make decisions under uncertainty. It is uncomfortable to constantly question every decision we make, and it can become paralyzing if we scrutinize each decision too closely. Simply making a decision and deciding we are correct without revisiting the question is easier, but also characteristically closed-minded.
The second interesting point is that epistemic vices such as closed-mindedness are not always inherently evil. As I wrote in the previous paragraph, closed-mindedness (or at least a shade of it), can help us navigate an uncertain world. It can help us make an initial decision and move on from that decision in situations where we otherwise may feel paralyzed. In many instances, like purchasing socks, there is no real harm that comes from being closed-minded. You might pay more than necessary purchasing fancy socks, but the harm is pretty minimal.
However, closed-mindedness systematically hinders knowledge by making people unreceptive to new information that challenges existing or desired beliefs. It makes people worse at communicating information because the information they rely on may be incomplete or irrelevant. Knowledge is limited by closed-mindedness, and over time this creates the potential for substantial consequences in people’s lives. Selecting a poor health insurance plan, starting a war, or spreading harmful chemical pesticides are real-world consequences that have occurred as a result of closed-mindedness. Substantial sums of money, people’s lives, and people’s health and well-being can hang in the balance when closed-mindedness prevents people from making good decisions, regardless of the motives that made someone closed-minded and regardless of whether being closed-minded helped resolve analysis paralysis. Many of the epistemic vices, and the characteristics of epistemic vices, that Cassam describes manifest in our lives in ways similar to closed-mindedness. Reducing such vices, like avoiding closed-mindedness, can help us prevent the serious harms that accompany the systematic obstruction of knowledge.
Epistemic Vices

Quassim Cassam’s book Vices of the Mind is all about epistemic vices. Epistemic vices are intentional and unintentional habits, behaviors, personality traits, and patterns of thought that hinder knowledge, information sharing, and accurate and adequate understandings of the world around us. Sometimes we intentionally deceive ourselves, sometimes we simply fail to recognize that we don’t have enough data to confidently state our beliefs, and sometimes we are intentionally deceived by others without recognizing it. When we fall into thinking habits and styles that limit our ability to think critically and rationally, we are indulging in epistemic vices, and the results can often be dangerous to ourselves and people impacted by our decisions.
“Knowledge is something that we can acquire, retain, and transmit. Put more simply, it is something that we can gain, keep, and share. So one way to see how epistemic vices get in the way of knowledge is to see how they obstruct the acquisition, retention, and transmission of knowledge,” Cassam writes.
A challenge I have is living comfortably with the knowledge that my understanding of everything is incomplete, that the world is more complex than I can fully grasp, and that even at my best I will still not know everything that another person does. This realization is paralyzing for me, and I constantly feel inadequate because of it. However, Cassam’s quote provides a perspective of hope.
Knowledge is something we can always gain, retain, and transmit. We can improve all of those areas, gaining more knowledge, improving our retention and retrieval of knowledge, and doing better to transmit our knowledge. By recognizing and eliminating epistemic vices we can increase the knowledge that we have, use, and share, ultimately boosting our productivity and value to human society. Seeing knowledge as an iceberg that we can only access a tiny fraction of is paralyzing, but recognizing that knowledge is something we can improve our access to and use of is empowering. Cassam’s book is helpful in shining a light on epistemic vices so we can identify them, understand how they obstruct knowledge, and overcome our vices to improve our relationship with knowledge.
On The Opportunity To Profit From Uninformed Patients

The American medical system is in a difficult and dangerous place right now. Healthcare services have become incredibly expensive, and the entire system has become so complex that few people fully understand it and even fewer can successfully navigate it to get appropriate care they can reasonably afford. My experience is that many people don’t see value in much of the care they receive or in many of the actors connected with their care. They know they need insurance to afford their care, but they can’t really see what value their insurance provides – it often appears to be more of a frustration than something to appreciate. The same can be true for primary care, anesthesiologists, and the variety of healthcare benefits that employers may offer to their employees. There seem to be lots of people ready to profit from healthcare, but not a lot of people ready to provide real value to the people who need it.
These sentiments are all generalizations, and of course many people really do see value in at least some of their healthcare and are grateful for the care they receive. However, the complexity, the lack of transparency, and the ever climbing costs of care have people questioning the entire system, especially at a moral and ethical level. I think a great deal of support for Medicare for All, or universal healthcare coverage, comes from people thinking that profit within medicine may be unethical and from a lack of trust that stems from an inability to see anything other than a profit motive in many healthcare actors and services.
Gerd Gigerenzer writes about this idea in his book Risk Savvy. In the book he doesn’t examine healthcare specifically, but he uses healthcare to show the importance of being risk literate in today’s complex world. Medical screening in particular is a good space to demonstrate the harms that can come from misinformed patients and doctors. A failure to understand and communicate risk can harm patients, and it can create perverse incentives for healthcare systems by providing them the opportunity to profit from uninformed patients. Gigerenzer quotes Dr. Otis Brawley, who had been Director of the Georgia Cancer Center at Emory in Atlanta.
In the quote, Dr. Brawley discusses how Emory could have screened 1,000 men at a mall for prostate cancer and billed $4.9 million for the tests. Additionally, the hospital would have profited from future services when those men returned as established patients for other, unrelated healthcare concerns. In Dr. Brawley’s experience, the hospital could tell him how much it could profit from the tests, but could not tell him whether screening those 1,000 men early for prostate cancer would actually save any lives. Dr. Brawley knew that screening that many men would lead to false positives, and to unnecessary stress and further diagnostic care for those false positives – again, care that Emory would profit from. The screenings would also identify men with prostate cancer that was unlikely to ever affect their health, but that would nevertheless lead to treatment that could leave them impotent or incontinent. The hospital would profit, but its patients would be worse off than if they had never been screened. Dr. Brawley’s experience was that the hospital could identify avenues for profit, but could not identify avenues to provide real value in the healthcare services it offered.
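Gigerenzer’s point about risk literacy can be made concrete with a back-of-the-envelope calculation. The numbers below are hypothetical, chosen only for illustration (they are not Brawley’s or Gigerenzer’s figures), but they show why a screening program can generate far more positive tests, and far more billable follow-up, than lives saved:

```python
# Hypothetical screening numbers, for illustration only; not figures
# from Risk Savvy or from Dr. Brawley.
screened = 1000            # men screened at the mall
prevalence = 0.03          # assume 3% have clinically significant cancer
sensitivity = 0.90         # assume the test catches 90% of true cases
false_positive_rate = 0.10 # assume 10% of healthy men test positive anyway

true_cases = screened * prevalence
healthy = screened - true_cases

true_positives = true_cases * sensitivity
false_positives = healthy * false_positive_rate
positive_tests = true_positives + false_positives

print(f"Positive tests requiring follow-up: {positive_tests:.0f}")
print(f"Of those, men who actually have cancer: {true_positives:.0f} "
      f"({true_positives / positive_tests:.0%})")
```

Under these made-up assumptions, roughly four out of five positive results are false alarms, each one generating anxiety and billable follow-up care. Expressing risk in plain counts like this, rather than in conditional probabilities, is the kind of communication Gigerenzer argues doctors and patients need.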
Gigerenzer found this deeply troubling. A failure to understand and communicate the risks of prostate cancer (which is more complex than I can write about here) presents an opportunity for healthcare providers to profit by pushing unnecessary screening and treatment onto patients. Gigerenzer also notes that profiting from uninformed patients is not limited to cancer screening. Doctors who are not risk literate cannot adequately explain the risks and benefits of treatment, and their patients cannot make the best decisions for themselves. This needs to change if hospitals want to keep the trust of their patients and avoid becoming hated entities that fail to demonstrate value. Otherwise they will go the way of health insurance companies, with frustrated patients wanting to eliminate them altogether.
Wrapping up the quote from Dr. Brawley, Gigerenzer writes, “profiting from uninformed patients is unethical. Medicine should not be a money game.” I believe that Gigerenzer and Dr. Brawley are right, and I think that all healthcare actors need to clearly demonstrate their value; otherwise any profits they earn will make them look like money-first enterprises rather than patient-first enterprises, frustrating the public and leading to distrust of the medical field. In the end, this is harmful for everyone involved. Demonstrating real value in healthcare is crucial, and profiting from uninformed patients will diminish the value provided and hurt trust, making the entire healthcare system in our country even worse.

Missing Feedback

I generally think we are overconfident in our opinions. We should all be more skeptical that we are right, that we have made the best possible decisions, and that we truly understand how the world operates. Our worldviews can only be informed by our experiences and by the information we take in about events, phenomena, and stories in the world. We will always be limited because we can’t take in all the information the world has to offer. Additionally, beyond simply not being able to hold all the information possible, we are unable to get the appropriate feedback we need in all situations for comprehensive learning. Some feedback is hazy and some feedback is impossible to receive at all. This means that we cannot be sure that we have made the best choices in our lives, even if things are going well and we are making our best efforts to study the world.
In Nudge, Cass Sunstein and Richard Thaler write, “When feedback does not work, we may benefit from a nudge.” When we can’t get immediate feedback on our choices and decisions, or when the feedback we get is unclear, we can’t adjust appropriately for future decisions. We can’t learn, we can’t improve, and we can’t make the best choices when we return to a similar decision. However, we can observe where situations of poor feedback exist, and we can design those decision spaces to provide subtle nudges that help people make better decisions in the absence of feedback. Visual aids showing how much money people need for retirement, and how much they can expect to have based on current savings rates, are a helpful nudge in a situation where we otherwise don’t get feedback on how well we are saving. There are devices that glow red or green based on a home’s current energy usage and efficiency, providing a subtle nudge not to run appliances at peak demand times and giving people feedback on energy usage they normally wouldn’t receive. Nudges such as these can provide feedback, or can provide helpful information in its absence.
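A retirement projection of that kind is easy to sketch. The figures below are invented for illustration (the contribution, return, and goal are my assumptions, not anything from Nudge), but they show the sort of feedback a saver never gets from day-to-day life:

```python
# Illustrative retirement projection; the contribution, return, and goal
# below are made-up assumptions, not figures from Nudge.
annual_contribution = 6_000   # dollars saved per year
annual_return = 0.05          # assumed average yearly investment return
years_to_retirement = 30
savings_goal = 600_000

balance = 0.0
for _ in range(years_to_retirement):
    balance = balance * (1 + annual_return) + annual_contribution

status = "on track" if balance >= savings_goal else "behind"
print(f"Projected savings after {years_to_retirement} years: ${balance:,.0f}")
print(f"Goal: ${savings_goal:,.0f} -> {status}")
```

A simple "on track / behind" readout like this substitutes for the feedback that saving decisions never naturally provide.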
Sunstein and Thaler also write, “many of life’s choices are like practicing putting without being able to see where the balls end up, and for one simple reason: the situation is not structured to provide good feedback. For example, we usually get feedback only on the options we select, not the ones we reject.” Missing feedback is an important consideration because the lack of feedback shapes how we understand the world and how we make decisions. The fact that we cannot get feedback on options we never chose should be nearly paralyzing. We can’t say how the world works if we never experiment and try something different. We can settle into a decent rhythm and routine, but we may be missing out on better lifestyles, happier lives, or better societies if we made different choices. However, we can never receive feedback on these non-choices. I don’t know that this means we should constantly experiment instead of settling in with the feedback we can receive, but I do think it means we should discount our own confidence and accept that we don’t know all there is to know. I also think it means we should look to increase nudges, use more visual aids, and structure our choices and decisions in ways that maximize useful feedback to improve learning for future decision-making.
Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted the Stock Market Crash Makes New Prediction.” These headlines are nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation; how could anyone have failed to see this as far back as 2004?
The problem, of course, is that the inevitable present moment, and the pathway that seems so obvious in retrospect, was never clear at all. There was no reliable way to predict a major housing bubble and financial collapse in 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and claiming that this person has made another prediction, pull at our emotions and play on hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean in their predictions, and assume that their next grand claim is wrong.
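A toy simulation makes the point (the numbers are made up; this is my own sketch, not something from Kahneman). If thousands of analysts make essentially random calls, a few will “predict” a crash by luck, and their accuracy afterward looks no better than anyone else’s:

```python
import random

# Toy simulation with made-up numbers; not an example from Kahneman.
random.seed(0)

ANALYSTS = 10_000
CALL_RATE = 0.05  # any analyst shouts "crash!" in a given year 5% of the time

# Year one: the market really does crash. Purely by luck, some analysts
# happen to have called it and get celebrated in headlines.
celebrated = sum(random.random() < CALL_RATE for _ in range(ANALYSTS))
others = ANALYSTS - celebrated

# Year two: the market does not crash. A prediction is correct if the
# analyst did NOT call a crash. Compare the two groups.
def year_two_accuracy(group_size):
    correct = sum(random.random() >= CALL_RATE for _ in range(group_size))
    return correct / group_size

print(f"Analysts celebrated for 'predicting' the crash: {celebrated}")
print(f"Their accuracy the following year:  {year_two_accuracy(celebrated):.0%}")
print(f"Everyone else's accuracy that year: {year_two_accuracy(others):.0%}")
```

Both groups land around the same mark, because the original lucky call carried no information about skill; the celebrated analysts’ performance simply regresses back toward everyone else’s.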
Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”
Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what to expect in the future, leading us to trust individuals who don’t deserve it and to mistrust those who are making the best possible decisions under serious constraints. Hindsight bias deserves greater recognition and more respect than being used as fodder for misleading headlines.