A Vaccine for Lies and Falsehoods

Vaccines are on everyone’s mind this year as we hope to move forward from the coronavirus pandemic, and I cannot help but think about today’s quote from Quassim Cassam’s book Vices of the Mind through a vaccine lens. While writing about ways to build and maintain epistemic virtues Cassam writes, “only the inculcation and cultivation of the ability to distinguish truth from lies can prevent our knowledge from being undermined by malevolent individuals and organizations that peddle falsehoods for their own political or economic ends.” In other words, there is no vaccine for lies and falsehoods, only the hard work of building the skills to recognize truth, narrative, and outright lies.
I am also reminded of a saying that Steven Pinker included in his book Enlightenment Now, “any jackass can knock down a barn, but it takes a carpenter to build one.” This quote comes to mind when I think about Cassam’s quote because building knowledge is hard, but spreading falsehoods is easy. Epistemic vices are easy, but epistemic virtues are hard.
Anyone can be closed-minded, anyone can use lies to try to better their own position, and anyone can be tricked by wishful thinking. It takes effort and concentration to be open-minded yet not gullible, to identify and counter lies, and to create and transmit knowledge for use by other people. The vast knowledge bases that humanity has built have taken years to develop, to weed out inaccuracies, and to painstakingly home in on ever more precise and accurate understandings of the universe. All of this knowledge and information has taken incredible amounts of hard work by people dedicated to building it.
But any jackass can knock it all down. Anyone can come along and attack science, attack knowledge, spread misinformation and deliberately use disinformation to confuse and mislead people. Being an epistemic carpenter and building knowledge is hard, but being a conman and acting epistemically malevolent is easy.
The task for all of us is to think critically about our knowledge, about the systems and structures that have facilitated our knowledge growth and development as a species over time, and to do what we can to be more epistemically virtuous. Only by working hard to identify truth, to improve systems for creating accurate information, and to enhance knowledge highways that help people learn and transmit knowledge effectively can we continue to move forward. At any point we can choose to throw sand in the gears of knowledge, bringing the whole system down, or we can find ways to make it harder to gum up the knowledge machinery we have built. We must do the latter if we want to continue to grow, develop, and live peacefully rather than at the mercy of the epistemically malevolent. After all, there is no vaccine to cure us of lies and falsehoods.
Epistemically Malevolent & Epistemically Insouciant

Over the last few years I feel as though I have seen an increase in the number of news outlets and reporters saying that we now live in a post-truth society. The argument is that truth and accuracy no longer matter to many people, and that we live in a world where people simply want to believe what they want to believe, regardless of the evidence. This argument is supported by documented instances of fake news, by a former US president who didn’t seem to care what the truth was, and by politicians and everyday people professing beliefs that are clearly inaccurate as a type of loyalty test. This puts us in a position where it becomes difficult to communicate important information and create a coherent narrative based on accurate details surrounding the events of our lives.
Two concepts that Quassim Cassam discusses in his book Vices of the Mind can help us think about what it means to be in a post-truth society. Cassam writes, “one can be epistemically malevolent without being epistemically insouciant.” To me, it seems that a post-truth society depends on both malevolence and insouciance to exist. I find it helpful to see that these are two distinct postures toward knowledge.
To be epistemically malevolent means to intentionally and deliberately attempt to hinder and limit knowledge. Cassam uses the example of tobacco companies deliberately misleading the public on the dangers of smoking. Company executives intentionally made efforts to hide accurate scientific information and to mislead the public. In recent years we have seen epistemic malevolence in the form of fake-news, misinformation, and disinformation intended to harm political opponents and discourage voter turnout for opposing political parties.
Epistemic insouciance doesn’t necessarily have a malicious intent behind it. Instead, it is characterized by an indifference to the accuracy of information. You don’t need an intentional desire to spread false information in order to be epistemically insouciant. However, this careless attitude toward the accuracy of information is in some ways necessary for false information to take hold. Individuals who care whether their knowledge and statements are correct are less likely to be pulled in by the epistemically malevolent, and less likely to spread their messages. However, someone who favors what the epistemically malevolent have to say and is unwilling to be critical of the message is more likely to engage with such false messaging and to echo and spread malevolent lies. Even if an individual doesn’t want to be intentionally misleading, insouciance plays into malevolence.
This helps us see that our post-truth society will need to be addressed on two fronts. First, we need to understand why people are epistemically insouciant and find ways to encourage people to be more concerned with the accuracy and factuality of their statements and beliefs. External nudges, social pressures, and other feedback should be developed to promote factual statements and to hinder epistemic insouciance. This is crucial to getting people to both recognize and denounce epistemic malevolence. Once people care about the accuracy of their beliefs and statements, we can increase the costs of deliberately spreading false information. As things stand now, epistemic insouciance encourages epistemic malevolence. Combating epistemic malevolence will require that we address epistemic insouciance and then turn our attention to stopping the spread of deliberate falsehoods and fake news.
Lies Versus Epistemic Insouciance

My last post was about epistemic insouciance: being indifferent to whether your beliefs, statements, and ideas are accurate or inaccurate. Epistemic insouciance, Quassim Cassam argues in Vices of the Mind, is an attitude. It is a disposition toward accurate or false information that is generally case specific.
In the book, Cassam distinguishes between lies and epistemic insouciance. He writes, “lying is something that a person does rather than an attitude, and the intention to conceal the truth implies that the liar is not indifferent to the truth or falsity of his utterances. Epistemic insouciance is an attitude rather than something that a person does, and it does imply an indifference to the truth or falsity of one’s utterances.”
The distinction is helpful when we think about people who deliberately lie and manipulate information for their own gain and people who are bullshitters. Liars, as the quote suggests, know and care about what information is true and what is false. They are motivated by factors beyond the accuracy of the information, and do their best within their lies to present false information as factual.
Bullshitters, however, don’t care whether their information is accurate. The tools that work to uncover inaccurate information and counter a liar don’t work against a bullshitter because of their epistemic insouciance. Liars contort evidence and create excuses for misstatements and lies. Bullshitters simply flood the space with claims and statements of varying accuracy. If confronted, they argue that it doesn’t matter whether they lied: their information may have been wrong, but because they didn’t care whether it was wrong, they were not actually lying. This creates circular arguments and distracts from the epistemic value of information and the real costs of epistemic insouciance. Seeing the difference between liars and epistemically insouciant bullshitters is helpful if we want to know how to address those who intentionally obstruct knowledge.

Our Mind Seems Counterproductive

I listen to a lot of science podcasts, and I really love the discoveries, the new ways of thinking, and the better understandings of the world that we gain from science. Science is a process that strives to be rational and to build on previous knowledge to better understand an objective reality. What is also interesting about science is that it operates against the way our brains want to work. As much as I love science and as much as I want to be scientific in my thinking and approaches to the world, I understand that a great deal of what shapes human beings and the world we build is not rational and seems counterproductive when viewed through a rational lens.
Part of the explanation for our minds being so irrational comes from Kevin Simler and Robin Hanson in their book The Elephant in the Brain. The authors describe one reason why our brains evolved to be as complex and irrational as they are: we evolved to be political and deceptive creatures, not rational and objective creatures with a comprehensive view of reality. “Here’s the puzzle:” write Simler and Hanson, “we don’t just deceive others; we also deceive ourselves. Our minds habitually distort or ignore critical information in ways that seem, on the face of it, counterproductive. Our mental processes act in bad faith, perverting or degrading our picture of the world.”
According to Simler and Hanson, we act so irrationally and hold such an incorrect view of the world because doing so helped our ancestors be more deceptive and survive. If you wish to tell a white lie to someone, or if you really want to appear sincere in your thoughts and actions, it is much easier if you believe the things you are lying about. If you know you are lying and acting in bad faith, you have to be a really good actor or poker player to convince everyone else. We actually benefit when our brains fail to recognize exactly what is driving us and help us systematically overlook inconvenient truths.
For example, I use Strava, a social media platform geared toward runners and cyclists. The app allows us to upload the GPS data from our runs and bike rides, to compare our routes, and to see who went the fastest along a particular street or up a particular trail. At a base level I know that I am using the app because it allows me to show off to other people just how good of a runner I am. But if you asked me at any given point why I upload all my workouts to Strava, I would tell you a story about wanting to keep up with friends, wanting to discover new places to go running, and about the data that I can get to analyze my performance. The first story doesn’t look so great for me, but the second one makes me sound social and intelligent. I am inclined to tell myself that is why I use the app and to deny, even to myself, that I use it because I want to prove that I am a better runner than someone else or to show off to my casual running friends who might log in and see that I went on a long run.
Our brains are not the scientifically rational things I wish they were, but in many ways that is important for us as we try to build coalitions and social groups to get things done. We connect in ways that are beyond rationality, and sometimes we need the generous (though often false) view of ourselves as good actors to help us get through the day. We can strive for more rationality in our thoughts and actions, but we should accept that we will only get so far, and we shouldn’t hate ourselves or anyone else for not always having the nice and pure motives that we present.