Improve Your Posture - Joe Abittan - Vices Of The Mind - Cassam

In the book Vices of the Mind, Quassim Cassam compares our thinking to our physical posture. Parents, physical therapists, and human resources departments all know the importance of good physical posture. Strengthening your core, lifting from your legs and not your back, and having your computer monitor at an appropriate height are all important if you are going to avoid physical injuries and costly medical care to relieve your pain. But have you ever thought about your epistemic posture?
Your epistemic posture can be thought of in a similar manner to your physical posture. Are you paying attention to the right things, are you practicing good focus, and are you working on being open-minded? Having good epistemic posture will mean that you are thinking in a way that is the most conducive to knowledge generation. Just as poor physical posture can result in injuries, poor epistemic posture can result in knowledge injuries (at least if you want to consider a lack of knowledge and information an injury).
Cassam writes, “The importance of one’s physical posture in doing physical work is widely recognized. The importance of one’s epistemic posture in doing epistemic work is not. Poor physical posture causes all manner of physical problems, and a poor epistemic posture causes all manner of intellectual problems. So the best advice to the epistemically insouciant and intellectually arrogant is: improve your posture.”
Improving our epistemic posture is not easy. It's not something we just wake up and decide we can do on our own, just as we can't improve our walking form, the way we lift boxes, or easily adjust our workspace to be the most ergonomic all on our own. We need coaches, teachers, and therapists to help us see where we are going through dangerous, harmful, or imbalanced motions, and we need them to help correct us. These are skills that should be taught from a young age (both physically and epistemically) to help us understand how to adopt good habits and maintain a healthy posture throughout life.
Thinking in ways that build and enhance our knowledge is important. It is important that we learn to be open-minded, that we learn how not to be arrogant, and that we learn that our opinions and perspectives are limited. The more we practice good epistemic posture the better we can be at recognizing when we have enough information to make important decisions and when we are making decisions without sufficient information. It can help us avoid spreading misinformation and disinformation, and can help us avoid harmful conspiracy theories or motivated reasoning. Good epistemic posture will help us have strong and resilient minds, just as good physical posture will help us have strong and resilient bodies.

Who is Harmed by Epistemic Malevolence?

One of the reasons we should care about epistemic vices is that they harm all of society. Epistemic vices are vices that hinder knowledge, and since we live in complex and interconnected societies, we rely on shared and easily accessible knowledge in order for any of us to survive. When knowledge is hindered, the chance that complex systems can break down and harm people increases.
 
This idea is important and helpful when we think about our own potential epistemic vices. Our attitudes, behaviors, and actions that hinder knowledge may not harm us, but may harm someone else or broader segments of society. In his book Vices of the Mind, Quassim Cassam demonstrates this by examining epistemic malevolence. He writes, “the person who is deprived of knowledge by the vice of epistemic malevolence is not the person with the vice.”
 
If someone is intentionally misleading you by giving you false information or making you question legitimate information for their own gain, then they are not harmed. They likely know that the information they are presenting and sharing is inaccurate, but stand to gain from you having inaccurate information. They may stand to profit, which motivates their epistemic malevolence, while you are harmed.
 
In some epistemic vices, the individual with the vice is the one who is harmed. Wishful thinkers and gullible individuals are the ones who are harmed by their epistemic vices. However, other epistemic vices, as the malevolence example demonstrates, harm other people. Knowledge is something that is shared and built communally. Few of us develop real knowledge completely on our own, and the power of knowledge is magnified when shared with others. Often, when we get in the way of this process, it is not just ourselves that are harmed, but all of society, increasing the responsibility that we all have to minimize epistemic vices.

Using Misinformation & Disinformation for Political Purposes

“A relentless barrage of misleading pronouncements about a given subject,” writes Quassim Cassam in Vices of the Mind, “can deprive one of one’s prior knowledge of that subject by muddying the waters and making one mistrust one’s own judgement.”
This sentence seems to perfectly describe the four-year presidency of Donald Trump. The former President of the United States said a lot of things that could not possibly be true, and didn't seem to care whether his statements were accurate or inaccurate. There were times when he was clearly trying to mislead the nation, and times when he simply didn't seem to know what he was talking about and made up claims that sounded good in the moment. Regardless of whether he was deliberately trying to mislead the public, his statements often had the same effect. They created confusion, a buzz around a particular topic, and a dizzying array of rebuttals, supporting arguments, and complicated fact-checks.
The President’s epistemic insouciance created confusion and bitter arguments that the President could spin for his own political gain. He would lie about meaningless topics and then criticize people for focusing on narrow and unimportant falsehoods. He would say random and sometimes contradictory things which would create so much confusion around a topic that people had trouble understanding what the argument was about and began to doubt factual information and reporting. The result was a blurring of the lines between reputable and fact-based reporting and hyperbolic opinionated reporting.
A clear lesson from Trump's presidency is that we need to do a better job of holding elected officials to a higher standard with their statements. Unfortunately, it often goes against our self-interest or group interest to hold the elected officials we favor to high standards. If we generally like a politician who happens to be epistemically insouciant, it is hard to vote against them, even if we know what they say is wrong or deliberately misleading. As many of Trump's supporters demonstrated, it can be more comfortable to perform complex mental gymnastics to excuse obviously inept and dangerous behaviors than to admit that our favored politician is lazy and incompetent.
Knowledge and accurate beliefs are important. We have entered a period in humanity where we depend on complex systems. Whether it is infrastructure, supply chains, or human impacts on climate, our actions and behaviors are part of large interconnected systems. None of us can understand these systems individually, and we depend on experts who can help us make sense of how we relate to larger wholes. We need to be investing in and developing systems and structures that encourage and facilitate knowledge. Using misinformation and disinformation for political purposes inhibits knowledge, and makes us more vulnerable to system collapses when we cannot effectively and efficiently coordinate our actions and behaviors as complex systems change or break. Going forward, we have to find a way to prevent the epistemically insouciant from muddying the waters and clouding our knowledge.

Epistemically Malevolent & Epistemically Insouciant

Over the last few years, I have seen more and more news outlets and reporters saying that we now live in a post-truth society. The argument is that truth and accuracy no longer matter to many people, and that we live in a world where people simply want to believe what they want to believe, regardless of the evidence. This argument is supported by documented instances of fake news, by a former US president who didn't seem to care what the truth was, and by politicians and everyday people professing beliefs that are clearly inaccurate as a type of loyalty test. This puts us in a position where it becomes difficult to communicate important information and create a coherent narrative based on accurate details surrounding the events of our lives.
Two concepts that Quassim Cassam discusses in his book Vices of the Mind can help us think about what it means to be in a post-truth society. Cassam writes, “one can be epistemically malevolent without being epistemically insouciant.” To me, it seems that a post-truth society depends on both malevolence and insouciance to exist. I find it helpful to see that there is a distinction between these two postures toward knowledge.
To be epistemically malevolent means to intentionally and deliberately attempt to hinder and limit knowledge. Cassam uses the example of tobacco companies deliberately misleading the public on the dangers of smoking. Company executives intentionally made efforts to hide accurate scientific information and to mislead the public. In recent years we have seen epistemic malevolence in the form of fake-news, misinformation, and disinformation intended to harm political opponents and discourage voter turnout for opposing political parties.
Epistemic insouciance doesn't necessarily have a malicious intent behind it. Instead, it is characterized by an indifference to the accuracy of information. You don't need an intentional desire to spread false information in order to be epistemically insouciant. However, this careless attitude toward the accuracy of information is in some ways necessary for false information to take hold. Individuals who care whether their knowledge and statements are correct are less likely to be pulled in by the epistemically malevolent, and less likely to spread their messages. However, someone who favors what the epistemically malevolent have to say and is unwilling to be critical of the message is more likely to engage with such false messaging and to echo and spread malevolent lies. Even if an individual doesn't want to be intentionally misleading, insouciance plays into malevolence.
This helps us see that our post-truth society will need to be addressed on two fronts. First, we need to understand why people are epistemically insouciant and find ways to encourage people to be more concerned with the accuracy and factuality of their statements and beliefs. External nudges, social pressures, and other feedback should be developed to promote factual statements and to hinder epistemic insouciance. This is crucial to getting people to both recognize and denounce epistemic malevolence. Once people care about the accuracy of their beliefs and statements, we can increase the costs of deliberately spreading false information. As things stand now, epistemic insouciance encourages epistemic malevolence. Combating epistemic malevolence will require that we address epistemic insouciance and then turn our attention to stopping the spread of deliberate falsehoods and fake news.

Prejudice as an Epistemic Vice

“Prejudice counts as an epistemic attitude insofar as it is an affective posture toward another person’s epistemic credentials,” writes Quassim Cassam in his book Vices of the Mind. Prejudices inhibit knowledge, deserve reproof, and are attitudes that individuals can be blameworthy for holding. Therefore, prejudices qualify as epistemic vices.
Cassam continues, “what makes a prejudice a prejudice is that it is an attitude formed and sustained without any proper inquiry into the merits or demerits of its object.” Prejudices are not based on fact and reality. They are based on incomplete, subjective opinions and evaluations of people, places, and things. Generally, a few standout qualities that we either like or dislike are used as justification for our opinions of entire classes and groups, regardless of whether those perceived qualities are indeed real or generalizable to the larger class. Greater consideration might show us that our beliefs are incorrect, that our assumptions are mistaken, and that our perspectives are not generalizable, but prejudices are maintained by an active unwillingness (or an insouciance) to obtain better information.
It is important to note that Cassam’s quote shows that prejudices are not always negative views of people, places, or things. We can be prejudiced to think that something is good or exemplary – think about fancy cars, expensive brands, or your favorite celebrities. What matters with prejudice is not whether we favor or scorn something, but the fact that we adopt inaccurate beliefs via an attitude that hinders knowledge. We could learn more about people, places, and things to better understand their merits and demerits, increasing our knowledge and the knowledge of anyone we share our lessons with. However, prejudiced individuals have an attitude that actively avoids such information, limiting knowledge and preventing the transmission of useful information to others. This limitation of knowledge and sustenance of incorrect beliefs is where prejudices become specifically epistemic vices. Understanding this helps us recognize our prejudices (both positive and negative) and see how we can eliminate them.

Lies Versus Epistemic Insouciance

My last post was about epistemic insouciance: being indifferent to whether your beliefs, statements, and ideas are accurate or inaccurate. Epistemic insouciance, Quassim Cassam argues in Vices of the Mind, is an attitude. It is a disposition toward accurate or false information that is generally case specific.
In the book, Cassam distinguishes between lies and epistemic insouciance. He writes, “lying is something that a person does rather than an attitude, and the intention to conceal the truth implies that the liar is not indifferent to the truth or falsity of his utterances. Epistemic insouciance is an attitude rather than something that a person does, and it does imply an indifference to the truth or falsity of one’s utterances.”
The distinction is helpful when we think about people who deliberately lie and manipulate information for their own gain and people who are bullshitters. Liars, as the quote suggests, know and care about what information is true and what is false. They are motivated by factors beyond the accuracy of the information, and do their best within their lies to present false information as factual.
Bullshitters, however, don’t care whether their information is accurate. The tools that work to uncover inaccurate information and counter a liar don’t work against a bullshitter because of their epistemic insouciance. Liars contort evidence and create excuses for misstatements and lies. Bullshitters simply flood the space with claims and statements of varying accuracy. If confronted, a bullshitter can concede that the information was wrong, insist that they never cared whether it was right, and conclude that they were therefore not actually lying. This creates circular arguments and distracts from the epistemic value of information and the real costs of epistemic insouciance. Seeing the difference between liars and epistemically insouciant bullshitters is helpful if we want to know how to address those who intentionally obstruct knowledge.

Epistemic Insouciance

Dictionary.com defines insouciant as free from concern, worry, or anxiety; carefree; nonchalant. To be epistemically insouciant, then, is to be carefree or nonchalant regarding knowledge. Epistemic insouciance can be characterized as a lack of concern regarding accurate information, true beliefs, and verifiable knowledge. Whether you know something or not, and whether what you think you know is correct, is of little concern.
In Vices of the Mind, Quassim Cassam writes the following about epistemic insouciance:
“Epistemic insouciance means not really caring about any of this [whether claims are grounded in reality or the evidence] and being excessively casual and nonchalant about the challenge of finding answers to complex questions, partly as a result of a tendency to view such questions as much less complex than they really are.”
Cassam goes on to define epistemic insouciance as an attitude vice, distinct from the other epistemic vices in the book that he characterizes as thinking style vices or character trait vices. To show what the attitude vice looks like in practice, Cassam uses reporting from the Brexit campaign, where a lack of concern over evidence and the impact of complex questions reflected an epistemically insouciant attitude. According to Cassam, reports indicated that Boris Johnson, the current British Prime Minister, did not care much about the actual outcomes of the vote on remaining in or leaving the European Union. Johnson eventually wrote an article supporting the decision to leave, but he reportedly had another article written supporting the decision to remain, in case that option won the referendum. His interest was in supporting the winning position, not in the hard work of trying to determine which side he should support and what the actual social, financial, and future impacts of the choices would be. He didn’t care about the evidence and information surrounding the decision, only that he looked like he was on the right side.
Epistemic insouciance is not limited to politicians. We can all be guilty of epistemic insouciance, and in some ways we cannot move through the world without it. At the moment, I need to make a decision regarding a transmission repair for a vehicle of mine. But I have many other responsibilities and concerns that require my focus, and I think they are more important than the transmission issue. I am not interested in really evaluating any evidence to support the decision I eventually make to repair the transmission or just get rid of the vehicle. If I were not epistemically insouciant on this issue, I would research the costs more thoroughly, try to understand how much usage I could get out of the vehicle if I repaired it, and consider alternatives such as what it could be sold for and what I would spend on a better vehicle. However, this is a lot of work for an item that is not a major concern for me right now. I can save the mental energy and attention for more important issues.
Our minds are limited. We cannot be experts in all areas and all decisions that we have to make. Some degree of epistemic insouciance is sometimes necessary, even if it can be financially costly. However, it is important that we recognize when we are being epistemically insouciant and that we try to understand the risks associated with this attitude in our decisions. We should ensure that we are not epistemically insouciant on the most important decisions in our lives, and we should try to clear out the mental clutter and habits that may make us epistemically insouciant on those important issues.

Justified Beliefs

A lot of us have beliefs that are formed out of biases and prejudices. Often those beliefs still end up being true in the end, but they are nevertheless unjustified. A skill of the human mind is to ignore contradictory evidence and focus in on the limited evidence which supports what we want to believe and backs up our prior assumptions. Whether it is a belief about a sports team, a racial or ethnic group, or a restaurant, we often adopt unjustified beliefs that we support with anecdotal thinking. When these unjustified beliefs turn out to be correct, we use them as a defense of our biased thinking, and risk becoming entrenched in inaccurate assumptions about how the world works.
In Vices of the Mind, Quassim Cassam writes about this directly. He argues that people need to think more carefully about whether a way of thinking is helpful or harmful, and whether a true result in the end justifies biased assumptions. Cassam writes, “leading to true belief is not the same as being conducive to knowledge. Even in cases where an epistemic vice leads someone to believe something true that doesn’t mean that they have the right to be confident that things are as they take them to be or that their belief is justified.”
To take a relatively harmless example, imagine two sports fans who bet on a college basketball game. One fan might be biased in favor of big-name schools, while another might be less biased and willing to look at sports analytics when making decisions about which team is likely to win a game. The biased individual may bet against a smaller school, and may win that bet, but it is hard to say that they would systematically win bets against small schools in favor of more recognizable schools. In any individual instance their bet might pay off, but over the long term we would probably expect the more objective individual without biases who is more open-minded with sports analytics or other survey methods to win more bets. The biased individual who wins a lucky bet does not have justified beliefs even when his bias pays off.
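The long-run difference between the biased bettor and the evidence-driven bettor can be sketched with a small simulation. This is a toy illustration, not real sports data: the win probabilities, the number of games, and the `simulate` helper are all made-up assumptions, chosen only to show that a bettor who follows the evidence tends to win more bets over many games even though the biased bettor wins some individual bets.

```python
import random

random.seed(42)

def simulate(n_games=10_000):
    """Compare a biased bettor with an evidence-driven bettor (toy model)."""
    biased_wins = informed_wins = 0
    for _ in range(n_games):
        # True probability that the big-name school wins this game;
        # sometimes the smaller school is actually the stronger team.
        p_big = random.uniform(0.2, 0.8)
        big_wins = random.random() < p_big

        # Biased bettor always backs the big-name school.
        if big_wins:
            biased_wins += 1

        # Informed bettor backs whichever team the evidence favors.
        informed_pick_big = p_big > 0.5
        if informed_pick_big == big_wins:
            informed_wins += 1
    return biased_wins / n_games, informed_wins / n_games

biased_rate, informed_rate = simulate()
print(f"biased bettor win rate:   {biased_rate:.3f}")
print(f"informed bettor win rate: {informed_rate:.3f}")
```

Under these assumptions the biased bettor hovers around a coin flip, while the informed bettor wins noticeably more often, which is the sense in which only the latter's confidence is justified.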
This type of thinking can be more harmful than bets among college basketball fans. The human mind has a remarkable ability to remember the evidence that supports those beliefs we want to be true and to ignore evidence that undermines our desired beliefs. The biased sports fan probably remembers when he was right about a small school being over-hyped, but probably doesn’t remember the times when big-name schools lost to smaller schools. This can happen with people who are biased against police officers, minority groups, or people who drive certain types of cars. The reference class doesn’t matter to our brain, but the individual anecdotes that support our prior beliefs are remembered.
Holding justified beliefs requires that we inform our beliefs based on real-world evidence with statistical value. Basing our beliefs on individual anecdotes will not consistently lead us to having accurate beliefs, and if we do hit upon a true belief from time to time, we won’t be justified in the beliefs, assumptions, and conclusions that we draw. It is important to recognize when our thinking is anecdotal, and to consider whether our beliefs are justified.
Anecdotal Versus Systematic Thinking

Anecdotal Versus Systematic Thinking

Anecdotes are incredibly convincing, especially when they focus on an extreme case. However, anecdotes are not always representative of larger populations. Some anecdotes are very context dependent, focus on specific and odd situations, and deal with narrow circumstances. However, because they are often vivid, highly visible, and emotionally resonant, they can be highly memorable and influential.
Systematic thinking often lacks many of these qualities. Often, the general reference class is hard to see or make sense of. It is much easier to remember a commute that featured a police officer or a traffic accident than the vast majority of commutes that were uneventful. Sometimes the data directly contradicts the anecdotal stories and thoughts we have, but that data often lacks the visibility to reveal the contradictions. This happens frequently with news stories or TV shows that highlight dangerous crime or teen pregnancy. Despite a rise in crime during 2020, crime rates have fallen in recent decades, and despite TV shows about teen pregnancies, those rates have also been falling.
In Vices of the Mind, Quassim Cassam examines anecdotal versus systematic thinking to demonstrate that anecdotal thinking can be an epistemic vice that obstructs our view of reality. He writes, “With a bit of imagination it is possible to show that every supposed epistemic vice can lead to true belief in certain circumstances. What is less obvious is that epistemic vices are reliable pathways to true belief or that they are systematically conducive to true belief.”
The contrast between anecdotal and systematic (or structural) thinking is a useful context for Cassam’s quote. An anecdote describes a situation or story with an N of 1. That is to say, an anecdote is a single case study. Within any population of people, drug reactions, rocket launches, or any other phenomenon, there are going to be outliers. There will be some results that are strange and unique, deviating from the norm or average. These individual cases are interesting and can be useful to study, but it is important that we recognize them as outliers and not generalize them to the larger population. Systematic and structural thinking helps us see the larger population and develop more accurate beliefs about what we should normally expect to happen.
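The N-of-1 problem can be made concrete with a quick sketch. The population below is entirely invented for illustration: most values cluster around 10, but a small fraction are extreme outliers near 100. A single vivid case can sit far from the truth, while a systematic random sample lands close to the population average.

```python
import random
import statistics

random.seed(0)

# Invented population: mostly ordinary values, with 1% extreme outliers
population = [random.gauss(10, 2) for _ in range(9_900)]
population += [random.gauss(100, 5) for _ in range(100)]

true_mean = statistics.fmean(population)

# "Anecdotal" view: one vivid, extreme case (an outlier)
anecdote = max(population)

# "Systematic" view: the mean of a decent-sized random sample
systematic_estimate = statistics.fmean(random.sample(population, 500))

print(f"true mean:           {true_mean:.1f}")
print(f"single vivid case:   {anecdote:.1f}")
print(f"systematic estimate: {systematic_estimate:.1f}")
```

The vivid case is the one most likely to be remembered and retold, but the sample mean is the one that actually tracks the population.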
Anecdotal thinking may occasionally lead to true beliefs about larger classes, but as Cassam notes, it will not do so reliably. We cannot build our beliefs around single anecdotes, or we will risk making decisions based on unusual outliers. Trying to address crime, reduce teen pregnancy, determine the efficacy of a medication, or verify the safety of a spaceship requires that we understand the larger systemic and structural picture. We cannot study one instance of crime and assume we know how to reduce crime across an entire country, and none of us would want to ride in a spaceship that had only been tested once.
It is important that we recognize anecdotal thinking, and other epistemic vices, so we can improve our thinking and have better understandings of reality. Doing so will help improve our decision-making, will improve the way we relate to the world, and will help us as a society better determine where we should place resources to help create a world we want to live in. Anecdotal thinking, and indulging in other epistemic vices, might give us a correct answer from time to time, but it is likely to lead to worse outcomes and decisions over time as we routinely misjudge reality. This in turn will create tensions and distrust among a society that cannot agree on the actual trends and needs of the population.

Thinking Conspiratorially Versus Evidence-Based Thinking

My last two posts have focused on conspiratorial thinking and whether it is an epistemic vice. Quassim Cassam in Vices of the Mind argues that we can only consider thinking conspiratorially to be a vice based on context. He means that conspiratorial thinking is a vice dependent on whether there is reliable and accurate evidence to support a conspiratorial claim. Thinking conspiratorially is not an epistemic vice when we are correct and have solid evidence and rational justifications for thinking conspiratorially. Anti-conspiratorial thinking can be an epistemic vice if we ignore good evidence of a conspiracy to continue believing that everything is in order.
Many conspiracies are not based on reliable facts and information. They create causal links between disconnected events and fail to explain reality. Anti-conspiratorial thinking also creates a false picture of reality, but does so by ignoring causal links that actually do exist. As epistemic vices, both ways of thinking can be described consequentially and by examining the patterns of thought that contribute to the conspiratorial or anti-conspiratorial thinking.
However, that is not to say that conspiratorial thinking is a vice in non-conspiracy environments and that anti-conspiratorial thinking is a vice in high-conspiracy environments. Regarding this line of thought, Cassam writes, “Seductive as this line of thinking might seem, it isn’t correct. The obvious point to make is that conspiracy thinking can be vicious in a conspiracy-rich environment, just as anti-conspiracy thinking can be vicious in contexts in which conspiracies are rare.” The key, according to Cassam, is evidence-based thinking and whether we have justified beliefs and opinions, even if they turn out to be wrong in the end.
Cassam generally supports the principle of parsimony, the idea that the simplest explanation for a scenario is often the best and the one that you should assume to be correct. Based on the evidence available, we should look for the simplest and most direct path to explain reality. However, as Cassam continues, “the principle of parsimony is a blunt instrument when it comes to assessing the merits of a hypothesis in complex cases.” This means that we will still end up with epistemic vices related to conspiratorial thinking if we only look for the simplest explanation.
What Cassam’s quotes about conspiratorial thinking and parsimony get at is the importance of good evidence-based thinking. When we are trying to understand reality, we should be thinking about what evidence should exist for our claims, what evidence would be needed to support our claims, and what kinds of evidence would refute our claims. Evidence-based thinking helps us avoid the pitfalls of conspiratorial or anti-conspiratorial thinking, regardless of whether we live in conspiracy-rich or conspiracy-poor environments. Accurately identifying or denying a conspiracy without any evidence, based only on assumed simple relationships, is ultimately not much better than simply making up beliefs based on magic. What we need to do is learn to adopt evidence-based thinking and to better understand the causal structures that exist in the world. That is the only true way to avoid the epistemic vices related to conspiratorial thinking.
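One simple way to formalize this kind of evidence-based thinking is Bayesian updating: start with a prior degree of belief in a claim, then revise it according to how likely the observed evidence would be if the claim were true versus false. The specific numbers below are arbitrary assumptions, used only to show the mechanics, and this is offered as an illustration rather than as Cassam's own framework.

```python
def bayes_update(prior: float, p_e_given_h: float, p_e_given_not_h: float) -> float:
    """Posterior probability of hypothesis H after observing evidence E."""
    p_e = p_e_given_h * prior + p_e_given_not_h * (1 - prior)
    return p_e_given_h * prior / p_e

# Start skeptical: a 5% prior that some conspiracy claim is true
prior = 0.05

# Strong evidence: very likely if the claim is true, rare otherwise
posterior = bayes_update(prior, p_e_given_h=0.9, p_e_given_not_h=0.1)
print(f"after strong supporting evidence: {posterior:.3f}")

# Evidence that should exist but is absent lowers belief instead
posterior_absent = bayes_update(prior, p_e_given_h=0.1, p_e_given_not_h=0.9)
print(f"after expected evidence is missing: {posterior_absent:.3f}")
```

The point of the sketch is that belief should move in both directions with the evidence: strong evidence raises credence without jumping to certainty, and the absence of expected evidence lowers it, which is exactly what both the conspiratorial and anti-conspiratorial vices fail to do.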