Epistemic Insouciance

Dictionary.com defines insouciant as "free from concern, worry, or anxiety; carefree; nonchalant." To be epistemically insouciant, then, is to be carefree or nonchalant regarding knowledge. Epistemic insouciance can be characterized as a lack of concern for accurate information, true beliefs, and verifiable knowledge. Whether you know something or not, and whether what you think you know is correct, is of little concern.
In Vices of the Mind, Quassim Cassam writes the following about epistemic insouciance:
“Epistemic insouciance means not really caring about any of this [whether claims are grounded in reality or the evidence] and being excessively casual and nonchalant about the challenge of finding answers to complex questions, partly as a result of a tendency to view such questions as much less complex than they really are.”
Cassam goes on to classify epistemic insouciance as an attitude vice, distinguishing it from the other epistemic vices in the book, which he characterizes as thinking-style vices or character-trait vices. To show how it operates as an attitude, Cassam uses reporting from the Brexit campaign, where a lack of concern over evidence and the stakes of complex questions reflected an epistemically insouciant attitude. According to Cassam, reports indicated that Boris Johnson, the current British Prime Minister, did not care much about the actual outcomes of the vote on remaining in or leaving the European Union. Johnson eventually wrote an article supporting the decision to leave, but he reportedly had a second article drafted supporting the decision to remain had that side won the referendum. His interest was in backing the winning position, not in the hard work of determining which side he should support and what the actual social, financial, and long-term impacts of the choices would be. He didn’t care about the evidence and information surrounding the decision; he cared about looking like he was on the right side.
Epistemic insouciance is not limited to politicians. We can all be guilty of it, and in some ways we cannot move through the world without it. Right now, I need to make a decision regarding a transmission repair for a vehicle of mine. I have many responsibilities and concerns that require my focus and that I consider more important than the transmission issue, so I am not interested in carefully evaluating evidence to support whatever decision I eventually make to repair the transmission or get rid of the vehicle. If I were not epistemically insouciant on this issue, I would research the costs more thoroughly, try to understand how much use I could get out of the vehicle if I repaired it, and consider alternatives such as what it could be sold for and what I would spend on a better vehicle. However, that is a lot of work for an item that is not a major concern for me at the moment. I can save the mental energy and attention for more important issues.
Our minds are limited. We cannot be experts in all areas and all decisions that we have to make. Some degree of epistemic insouciance is sometimes necessary, even if it can be financially costly. However, it is important that we recognize when we are being epistemically insouciant and that we try to understand the risks associated with this attitude in our decisions. We should ensure that we are not epistemically insouciant on the most important decisions in our lives, and we should try to clear out the mental clutter and habits that may make us epistemically insouciant on those important issues.

Justified Beliefs

Many of us hold beliefs that are formed out of biases and prejudices. Often those beliefs still end up being true, but they are nevertheless unjustified. The human mind is skilled at ignoring contradictory evidence and focusing on the limited evidence that supports what we want to believe and backs up our prior assumptions. Whether it is a belief about a sports team, a racial or ethnic group, or a restaurant, we often adopt unjustified beliefs that we support with anecdotal thinking. When these unjustified beliefs turn out to be correct, we use them as a defense of our biased thinking and risk becoming entrenched in inaccurate assumptions about how the world works.
In Vices of the Mind, Quassim Cassam writes about this directly. He argues that people need to be more careful when considering whether a way of thinking is helpful or harmful, and whether a true result in the end justifies biased assumptions. Cassam writes, “leading to true belief is not the same as being conducive to knowledge. Even in cases where an epistemic vice leads someone to believe something true that doesn’t mean that they have the right to be confident that things are as they take them to be or that their belief is justified.”
To take a relatively harmless example, imagine two sports fans who bet on a college basketball game. One fan might be biased in favor of big-name schools, while the other is willing to consult sports analytics when deciding which team is likely to win. The biased individual may bet against a smaller school, and may win that bet, but it is hard to say that he would systematically win bets against small schools in favor of more recognizable schools. In any individual instance his bet might pay off, but over the long term we would expect the more objective individual, open to sports analytics and other systematic methods, to win more bets. The biased individual who wins a lucky bet does not have justified beliefs even when his bias pays off.
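The long-run point can be made concrete with a small simulation (my own sketch, not from Cassam's book; the win probabilities are invented for illustration). In each game, the big-name school has some true chance p of winning. The biased fan always backs the big-name school; the objective fan backs whichever side the numbers favor:

```python
import random

random.seed(0)

games = 100_000
biased_wins = 0     # fan who always bets on the big-name school
objective_wins = 0  # fan who bets on whichever side is truly favored

for _ in range(games):
    p = random.random()                  # true chance the big-name school wins
    big_school_won = random.random() < p
    if big_school_won:
        biased_wins += 1                 # biased fan wins only when the big name wins
    if (p > 0.5) == big_school_won:
        objective_wins += 1              # objective fan bet the favored side

print(biased_wins / games)     # ≈ 0.50: right about half the time, by luck
print(objective_wins / games)  # ≈ 0.75: systematically better
```

The biased fan is not always wrong, which is exactly what makes the bias feel justified; only the long run exposes it.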
This type of thinking can be more harmful than bets among college basketball fans. The human mind has a remarkable ability to remember the evidence that supports the beliefs we want to be true and to ignore evidence that undermines them. The biased sports fan probably remembers when he was right about a small school being over-hyped, but probably doesn’t remember the times when big-name schools lost to smaller schools. The same can happen with people who are biased against police officers, minority groups, or people who drive certain types of cars. The reference class doesn’t matter to our brain, but the individual anecdotes that support our prior beliefs are remembered.
Holding justified beliefs requires that we inform our beliefs based on real-world evidence with statistical value. Basing our beliefs on individual anecdotes will not consistently lead us to having accurate beliefs, and if we do hit upon a true belief from time to time, we won’t be justified in the beliefs, assumptions, and conclusions that we draw. It is important to recognize when our thinking is anecdotal, and to consider whether our beliefs are justified.
Anecdotal Versus Systematic Thinking

Anecdotes are incredibly convincing, especially when they focus on an extreme case. However, anecdotes are not always representative of larger populations. Many anecdotes are highly context dependent, focus on specific and odd situations, and deal with narrow circumstances. Yet because they are often vivid, highly visible, and emotionally resonant, they can be highly memorable and influential.
Systematic thinking often lacks these qualities. The general reference class is frequently hard to see or make sense of. It is much easier to remember a commute that featured a police stop or a traffic accident than the vast majority of commutes that were uneventful. Sometimes the data directly contradicts our anecdotal stories and thoughts, but that data often lacks the visibility to reveal the contradiction. This happens frequently with news stories or TV shows that highlight dangerous crime or teen pregnancy. Despite a rise in crime during 2020, crime rates have fallen in recent decades, and despite TV shows about teen pregnancy, those rates have also been falling.
In Vices of the Mind, Quassim Cassam examines anecdotal versus systematic thinking to demonstrate that anecdotal thinking can be an epistemic vice that obstructs our view of reality. He writes, “With a bit of imagination it is possible to show that every supposed epistemic vice can lead to true belief in certain circumstances. What is less obvious is that epistemic vices are reliable pathways to true belief or that they are systematically conducive to true belief.”
Anecdotal versus systematic thinking or structural thinking is a useful context for thinking about Cassam’s quote. An anecdote describes a situation or story with an N of 1. That is to say, an anecdote is a single case study. Within any population of people, drug reactions, rocket launches, or any other phenomenon, there are going to be outliers. There will be some results that are strange and unique, deviating from the norm or average. These individual cases are interesting and can be useful to study, but it is important that we recognize them as outliers and not generalize these individual cases to the larger population. Systematic and structural thinking helps us see the larger population and develop more accurate beliefs about what we should normally expect to happen.
Anecdotal thinking may occasionally lead to true beliefs about larger classes, but as Cassam notes, it will not do so reliably. We cannot build our beliefs around single anecdotes, or we will risk making decisions based on unusual outliers. Trying to address crime, reduce teen pregnancy, determine the efficacy of a medication, or verify the safety of a spaceship requires that we understand the larger systemic and structural picture. We cannot study one instance of crime and assume we know how to reduce crime across an entire country, and none of us would want to ride in a spaceship that had only been tested once.
It is important that we recognize anecdotal thinking, and other epistemic vices, so we can improve our thinking and have better understandings of reality. Doing so will help improve our decision-making, will improve the way we relate to the world, and will help us as a society better determine where we should place resources to help create a world we want to live in. Anecdotal thinking, and indulging in other epistemic vices, might give us a correct answer from time to time, but it is likely to lead to worse outcomes and decisions over time as we routinely misjudge reality. This in turn will create tensions and distrust among a society that cannot agree on the actual trends and needs of the population.
Thinking Conspiratorially Versus Evidence-Based Thinking - Joe Abittan

My last two posts have focused on conspiratorial thinking and whether it is an epistemic vice. Quassim Cassam, in Vices of the Mind, argues that we can only consider thinking conspiratorially to be a vice based on context. He means that conspiratorial thinking is a vice depending on whether there is reliable and accurate evidence to support a conspiratorial claim. Thinking conspiratorially is not an epistemic vice when we are correct and have solid evidence and rational justification for thinking conspiratorially. Conversely, anti-conspiratorial thinking can be an epistemic vice if we ignore good evidence of a conspiracy in order to keep believing that everything is in order.
Many conspiracies are not based on reliable facts and information. They create causal links between disconnected events and fail to explain reality. Anti-conspiratorial thinking also creates a false picture of reality, but does so by ignoring causal links that actually do exist. As epistemic vices, both ways of thinking can be described consequentially and by examining the patterns of thought that contribute to the conspiratorial or anti-conspiratorial thinking.
However, that is not to say that conspiratorial thinking is a vice in non-conspiracy environments and that anti-conspiratorial thinking is a vice in high-conspiracy environments. Regarding this line of thought, Cassam writes, “Seductive as this line of thinking might seem, it isn’t correct. The obvious point to make is that conspiracy thinking can be vicious in a conspiracy-rich environment, just as anti-conspiracy thinking can be vicious in contexts in which conspiracies are rare.” The key, according to Cassam, is evidence-based thinking and whether we have justified beliefs and opinions, even if they turn out to be wrong in the end.
Cassam generally supports the principle of parsimony, the idea that the simplest explanation for a scenario is often the best and the one that you should assume to be correct. Based on the evidence available, we should look for the simplest and most direct path to explain reality. However, as Cassam continues, “the principle of parsimony is a blunt instrument when it comes to assessing the merits of a hypothesis in complex cases.” This means that we will still end up with epistemic vices related to conspiratorial thinking if we only look for the simplest explanation.
What Cassam’s remarks about conspiratorial thinking and parsimony get at is the importance of good evidence-based thinking. When we are trying to understand reality, we should be asking what evidence should exist for our claims, what evidence would be needed to support them, and what kinds of evidence would refute them. Evidence-based thinking helps us avoid the pitfalls of conspiratorial or anti-conspiratorial thinking, regardless of whether we live in conspiracy-rich or conspiracy-poor environments. Accurately identifying or denying a conspiracy without any evidence, on the basis of assumed simple relationships, is ultimately not much better than making up beliefs based on magic. What we need to do is adopt evidence-based thinking and better understand the causal structures that exist in the world. That is the only reliable way to avoid the epistemic vices related to conspiratorial thinking.
Paranormal Beliefs, Superstitions, and Conspiratorial Thinking

How we think, what we spend our time thinking about, and the way we view and understand the world are important. If we fail to develop accurate beliefs about the world, then we will make decisions based on causal structures that do not exist. Our actions, thoughts, and behaviors will inhibit knowledge for ourselves and others, and our species will be worse off for it.
This idea is at the heart of Quassim Cassam’s book Vices of the Mind. Throughout our human history we have held many beliefs that cannot plausibly be true, or which we came to learn were incorrect over time. Cassam would argue (alongside others such as Steven Pinker, Yuval Noah Harari, and Joseph Henrich) that adopting more accurate and correct beliefs and promoting knowledge would help us systematically make better decisions to improve the life of our fellow humans. Learning where we were wrong and using science, technology, and information to improve our decision-making has helped our world become less violent, given us more opportunity, provided better nutrition, and allowed us to be more cooperative on a global level.
This is why Cassam addresses paranormal beliefs, superstitions, and conspiratorial thinking in his book. While examining conspiracy theories in depth, he writes, “studies have also found that belief in conspiracy theories is associated with superstitious and paranormal beliefs, and it has been suggested that these beliefs are associated because they are underpinned by similar thinking styles” [the italicized passage is cited to Swami et al. 2011]. Cassam argues that conspiracy theories are different from the other two modes of thinking because they can sometimes be accurate in their descriptions of the world. Sometimes a politician truly is running a corruption scheme, sometimes a group of companies is conspiring to keep prices high, and sometimes a criminal organization is hiding nefarious activities in plain sight. Conspiratorial thinking can, in some instances, reveal real causal connections in the world.
However, conspiratorial thinking is often bizarre and implausible. When our conspiratorial thinking pushes us off the deep end, it shares important characteristics with superstitious and paranormal thinking. All three posit causal connections that cannot possibly exist between phenomena, real or imagined. They create explanations that are inaccurate and prevent us from identifying real information about the world. Superstitions posit causal connections between random, unconnected events, and paranormal thinking posits causal connections between non-existent entities and real-world events. Conspiratorial thinking falls in line with both when it fails to describe reality.
Over the last few years we have seen how conspiratorial thinking can be vicious, how it can inhibit knowledge, and how it can have real life-and-death consequences when it goes wrong. Superstitious thinking doesn’t generally have consequences as severe, but it still prevents us from making the best possible decisions and still drives us toward incorrect worldviews, sometimes entrenching unfair biases and prejudices. Paranormal thinking has been a foundation of many world religions and of fables used to teach lessons and encourage particular forms of behavior. However, if it does not describe the world as it really is, then the value of paranormal thinking is minimal, and we should seriously consider the harms that can come from it, such as anxiety, suicide, or hours of lost sleep. These ideas matter because we need to make the best possible decisions based on the most accurate information available if we want to continue to advance human societies, to live sustainably, and to foster cooperation and community among all humans on a global scale. Thinking accurately takes practice, so pushing back against unwarranted conspiracy theories, superstitions, and paranormal beliefs helps us build the epistemic muscles that improve our thinking overall.
Thinking Conspiratorially

Over the last few years a number of wild conspiracy theories have become popular. Former President Donald Trump embraced a conspiracy theory that the 2020 Presidential Election was rigged (it was not), supported the Qanon conspiracy theory, and did little to push back against conspiracy theories surrounding COVID-19. His actions, behaviors, and beliefs demonstrate that thinking conspiratorially can be an epistemic vice. His willingness to believe wild falsehoods obstructed knowledge for himself and his most ardent supporters.
However, thinking conspiratorially is not always an epistemic vice. One reason why conspiracy theories become so gripping and why people sometimes fall into them is because real conspiracies do occur. Nixon’s Watergate Scandal, Trump’s withholding of financial and military aid unless Ukraine announced an investigation into Joe Biden and his son, and fraud schemes uncovered by inspectors general and government auditors demonstrate that nefarious conspiracies sometimes are real. While thinking conspiratorially can become an epistemic vice, the same is true for anti-conspiratorial thinking.
In the book Vices of the Mind, Quassim Cassam quotes Dr. Charles Pigden from the University of Otago in New Zealand by writing, “there is nothing inherently vicious about believing or being disposed to believe conspiracy theories.” Cassam argues that conspiratorial thinking is not an epistemic vice on its own, but is instead a context dependent vice or virtue. He continues, “there are environments in which either way of thinking can be epistemically virtuous or vicious, and a way to capture this context-relativity is to describe these thinking styles as conditionally virtuous or vicious.”
The examples I used earlier show how conspiratorial thinking can be either virtuous or vicious. In the case of our former President, his conspiratorial thinking spread misinformation, suppressed true and accurate information, and created a set of false beliefs that some of his supporters believed so strongly that they stormed the United States Capitol in an attempt to stop Congress from certifying the election. The context of his conspiracy theories obstructed knowledge and caused substantial harm to life and property. However, a government auditor who notices inconsistencies in paperwork and accounting practices may be rewarded for thinking conspiratorially, at least to a point. Believing that something nefarious could possibly be going on will encourage the auditor to review financial statements and testimony from personnel with more scrutiny, potentially helping them uncover real fraud. Of course, they could still go too far and push the issue beyond reasonable bounds by thinking conspiratorially, but this type of thinking is conditionally virtuous when it discovers true fraud and improves knowledge about fraud schemes.
Given the dramatic consequences of conspiracy thinking over the last few years, it is easy to dismiss thinking conspiratorially as an epistemic vice. However, we should remember that it is only conditionally an epistemic vice, and that sometimes conspiracies do turn out to be true (or at least partially true). We don’t have to give every conspiracy our respect and attention, but when a conspiracy does appear to be grounded in reality and supported by real evidence, then we should not be too quick to dismiss it.
Causal Links Between Unconnected Events

As a kid I grew up attending basketball camps at UCLA. I played in the old gym that used to host UCLA games in front of a few thousand fans, played on the current court in the main arena, and slept in the dorms. With that history, I have always been a fan of the UCLA men’s basketball team, rooting for them alongside the Nevada Wolf Pack, where I actually went to school. With the UCLA team making a deep run in the NCAA March Madness tournament, I have been reminded of all the superstitious thinking that surrounds sports and that I used to indulge in.
Sports seem to bring out superstitious thinking in even the most rational of people. I try very hard to think about causal structures and to avoid seeing non-existent causal links between unconnected events, but it is nevertheless hard not to let superstitious thinking creep in. When you are watching a game, it is hard not to feel that you have to sit in the right spot, watch from a certain room, or avoid certain behaviors in order to keep your team in the lead. However, it is absolute nonsense to think that your actions on your couch, miles away from the venue where the game is taking place, could have any causal link to the way a sports team performs.
In the book Vices of the Mind, Quassim Cassam spends time examining what is happening within our mind when we engage in superstitious thinking. He explains that superstitious thinking qualifies as an epistemic vice because it gets in the way of knowledge. It prevents us from forming accurate beliefs about the world. “Superstitious thinking,” Cassam writes, “isn’t a generally reliable method for forming true beliefs about the future; it won’t generally lead to true beliefs because it posits causal links between unconnected events. … beliefs based on superstitious thinking aren’t reasonable.”
Cassam gives the example of superstitions about walking under ladders in the book. Someone with a superstition believing that bad luck will befall them if they walk under a ladder will probably avoid walking under ladders, and as a result they won’t be as likely to have paint drip on them, to have something fall on their head, or to knock over the ladder and anyone or anything on top of it. Their superstition will lead to better outcomes for them, but not because the superstition helped them create true beliefs about the dangers of walking under ladders. The individual ends up with the correct answer, but interprets the wrong causal chain to get there.
Thinking about rational and plausible causal chains is a way to escape superstitious thinking. You can rationally examine the risks, harms, and benefits of certain behaviors and actions to see when a superstition is nonsense and when it borrows from real-life causal chains. Trying not to step on cracks will not prevent a causal chain that leads to your mother’s broken back, but it will help ensure you have more stable and steady footing when you walk. Wearing the same basketball jersey for every game has no causal connection with the team’s performance, and wearing it or not will not affect how your favorite team plays. We should strive to have accurate beliefs about the world, work to see causal connections clearly, and limit superstitious thinking even when it concerns trivial things like sports.
Superstitious Thinking

Would you consider superstitious thinking to be a vice? According to Quassim Cassam in Vices of the Mind, superstitious thinking is indeed an epistemic vice. That is to say, Cassam holds that superstitious thinking is a reprehensible, blameworthy way of thinking that systematically obstructs knowledge. By systematically obstructing knowledge, superstitious thinking causes people to adopt beliefs about the world that don’t match reality, leaving them vulnerable to poor decision-making that can have real-world consequences in their lives.
Cassam writes, “a gambler who sees a succession of coin tosses coming down heads thinks that the next toss will be tails because a tails is now due. This is an example of superstitious or magical thinking, thinking that posits a causal link between unconnected events, namely, previous coin tosses and the next toss.” This quote shows how superstitious thinking systematically obstructs knowledge. It causes us to see causal connections when none exist, distorting our perception and theory of reality.
A gambler making bets sees a causal connection between previous rolls of a die or spins of a roulette wheel and the next roll or spin. In reality, each time you flip a coin, roll a die, or spin a wheel, the previous result has no bearing on the current probability. A coin toss is a 50-50 affair that does not change because the previous flip was heads.
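A quick simulation (my own sketch in Python, not from the book) makes the independence point concrete: after a run of three heads, the next toss of a fair coin is still heads about half the time.

```python
import random

random.seed(42)

# One million tosses of a fair coin (True = heads).
flips = [random.random() < 0.5 for _ in range(1_000_000)]

# Collect the toss that follows every run of three heads.
after_streak = [
    flips[i + 3]
    for i in range(len(flips) - 3)
    if flips[i] and flips[i + 1] and flips[i + 2]
]

# The streak carries no information about the next toss, so the
# "due" tails never shows up more often than chance.
rate = sum(after_streak) / len(after_streak)
print(rate)  # ≈ 0.5
```

Change the streak length or the seed and the rate stays near 0.5; the gambler's intuition has no lever to pull.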
This type of thinking is prevalent in more than just gamblers. Sports enthusiasts regularly see causal links that cannot possibly exist. The same kind of thinking also shows up in people who have lucky clothing, special rituals in aspects of daily life, or who avoid certain phrases or behaviors. In many instances, the causal links we identify are absurd but don’t incur real costs in our lives. Avoiding stepping on cracks in the sidewalk doesn’t cost you anything and growing a beard because your favorite sports team is on a roll might even provide some social benefits and save you time from not shaving. However, giving in to superstitious thinking, as noted before, distorts your view of reality.
The causal chains misperceived through superstitious thinking create false understandings of how the world works. While it is harmless to believe that you need to sit in the same exact spot for your sports team to play well, it is not harmless to believe that hiring a woman for a certain job is bad luck, and it is not harmless to bet your life savings on a gamble because of superstitious thinking. What may be even worse is that superstitious thinking in one area can spill into others, creating a habit of seeing causal chains that don’t exist. Over time, superstitious thinking will lead to worse outcomes and poor decision-making that have real costs in our lives.
Epistemic Character Vices

Quassim Cassam explores epistemic vices in his book Vices of the Mind to understand how certain vices can obstruct knowledge and why they matter. Such vices tend to be thinking vices, that is vices that relate to the way we think about and understand the world. They may impact how we view and perceive the world, how we communicate information, or whether we are able to retain and recall information when needed. For example, being closed-minded can inhibit us from taking in an accurate view of the world, being arrogant may prevent us from effectively communicating knowledge about the world, and being careless or easily distracted may limit our ability to remember and recall information.
However, some epistemic vices can also be understood as character vices. Cassam writes, “character vices actually have a dual use: they can be used to characterize a person or they can be used to describe their thinking, either in general or in a particular case.” Character vices are not just behaviors, but ways of being that are typical for a person, that embody some essential aspect of an individual. Instead of just describing an action or behavior to understand its consequence, we can use character vices to understand an entire string of behaviors and actions of an individual, to understand their larger life outcomes.
Wishful thinking is a good example of a character vice that can have dual use. If you watched more college basketball than normal over the course of the pandemic, then by the time the NCAA Tournament started, you may have engaged in wishful thinking, placing a larger bet than you should have on the outcome of some of the games. But being overconfident in a couple of bets and believing that the best possible outcome would truly come to pass is different than being characteristically a wishful thinker. Someone who we describe as a wishful thinker is likely to always see the upsides and believe that things will work out as desired. As a result, they may not be prepared when things go wrong, and may not be able to overcome or avoid obstacles.
Wishful thinking as an epistemic character vice can describe your individual action or it can describe you as a person. Either way, it is helpful to see that epistemic vices can operate on multiple levels. Studying epistemic vices so closely helps us understand our thinking, our behaviors, even our personalities. They help us connect specific behaviors or traits to real-world outcomes, hopefully allowing us to see the harms that can come from actions, behaviors, and traits that obstruct knowledge.
Case Explanations Versus Structural Explanations

In Vices of the Mind Quassim Cassam asks whether we can understand the behaviors of an individual based on individual characteristics or if we have to rely on larger structural and systemic explanations for their behavior. The question is important for Cassam because his book focuses on epistemic vices, which are vices that get in the way of knowledge. If such vices change people’s thoughts and behaviors in predictable ways, then they are something we should think about and work to change in ourselves and others. If, however, they don’t make a difference in people’s behaviors because larger structural explanations exist, then they are not worthy of our attention.
Given that Cassam wrote an entire book about epistemic vices, it is not surprising that he believes that they are useful in explaining behavior. He writes, “Epistemic vices are obstacles to knowledge that can in appropriate cases explain how people think and what they do. Sometimes, though, structural or systemic explanations are better.” This sentence feels a little weak, as though Cassam is admitting that epistemic vices can take a back seat to structural factors. However, the sentence is a useful summation of how we should think about individual level factors and larger structural and systemic factors.
Our lives are shaped to a great degree by large structural and systemic forces that are beyond our control. Family structures drive specific types of behaviors. Markets produce predictable outcomes. The rules of a sport determine what actions can and cannot be taken. However, within these larger structures and systems there is room for individual variation. Cassam’s argument is that we can understand some of the individual variation within larger structures by understanding epistemic vices.
Case explanations can include individual choices, characteristics, and epistemic virtues and vices to help us understand behavior. These explanations can be built on top of structural and systemic explanations which shape the range of possibilities and narrow some of the individual variations. We cannot entirely define someone by their individual choices and differences, but we can view them within a system and ask how their choices within a system differed from others, whether their differences were positive or negative, and why.