Transformational Insights - Joe Abittan - Vices of the Mind - Quassim Cassam

In my last post I wrote about self-deceptive rationalization. The idea was that even when we try to critically reflect on our lives and learn lessons from our experiences, we can err and end up entrenching problematic and inaccurate beliefs about ourselves and the world. I suggested that one potential way to be bumped out of inaccurate self-reflection is to gain a transformational insight from an external event. Wishful thinking might come to an end when you don’t get the promotion you were sure was coming your way. Gullibility might end after you have been swindled by a conman. The arrogant can learn their lesson after a painful divorce. However, Quassim Cassam, in his book Vices of the Mind, suggests that even transformational insights triggered by external events might not be enough to change our internal reflection.
In the book Cassam writes, “this leaves it open, however, whether self-knowledge by transformational insight is as vulnerable to the impact of epistemic vices as self-knowledge by active critical reflection. … Transformational insights are always a matter of interpretation.” Even external factors that have the potential to force us to recognize our epistemic vices may fail to do so. Wishful thinkers may continue on as wishful thinkers, believing they simply hit one blip in the road. The gullible may learn their lesson once, but need to learn it again and again in different contexts. And the arrogant may not recognize how their arrogance played into a divorce, instead choosing to view themselves as unfortunate victims. Because transformational insights, those shocks from the outside that make us consider our epistemic vices, are always a matter of interpretation, they cannot be a reliable way to ensure we eliminate those vices.
Again, this seems to leave us in a place where we cannot overcome our epistemic vices without developing epistemic virtues. But this puts us back in a circular problem: if our epistemic vices prevent us from developing and cultivating epistemic virtues, and if we need epistemic virtues to overcome our epistemic vices, then how do we ever improve our thinking?
The answer for most of us is probably pretty boring and disappointing. Incrementally, as we gain new perspectives and more experience, we can hopefully come to distinguish between epistemic virtues and epistemic vices. Epistemic vices will systematically obstruct knowledge, leading to poorer decision-making and worse outcomes. As we seek more positive outcomes and better understanding of the world, we will slowly start to recognize epistemic vices and to see them in ourselves. Incrementally, we will become more virtuous.
This is not an exciting answer for anyone looking to make a dramatic change in their life, to achieve a New Year’s Resolution, or to introduce new policy to save the world. It is, however, practical, and it should take some pressure off of us. We can work each day to be a little more self-aware, a little more epistemically virtuous, and a little better at cultivating knowledge. We can grow over time, without putting pressure on ourselves to be epistemically perfect all at once. After all, trying to do so might trigger self-deceptive rationalization, and our transformational insights are subject to interpretation, which could be wrong.
Beliefs Are Not Voluntary

One of the ideas that Quassim Cassam examines in his book Vices of the Mind is the idea of responsibility. Cassam recognizes two forms of responsibility and examines both through the lens of epistemic vices. The first is acquisition responsibility, our responsibility for acquiring beliefs or developing ways of thinking; the second is revision responsibility, our responsibility for changing beliefs and ways of thinking that are shown to be harmful.
 
 
Within this context Cassam provides interesting insight about our beliefs. He writes, “If I raise my arm voluntarily, without being forced to raise it, then I am in this sense responsible for raising it. Notoriously, we lack voluntary control over our own beliefs. Belief is not voluntary.”
 
 
Cassam explains that if it is raining outside, we cannot help but believe that it is raining. We don’t have control over many of our beliefs; they are in some ways inescapable, determined by factors beyond our control. Beliefs are almost forced on us by external factors. I think this is true for many of our beliefs, ranging from spiritual beliefs to objective beliefs about the world. As Cassam argues, we are not acquisition responsible for believing that we are individuals, that something is a certain color, or that our favorite sports team is going to have another dreadful season.
 
 
But we are revision responsible for our beliefs.
 
 
Cassam continues, “We do, however, have a different type of control over our own beliefs, namely, evaluative control, and this is sufficient for us to count as revision responsible for our beliefs.”
 
 
Cassam introduces ideas from Pamela Hieronymi to explain our evaluative control over our beliefs. Hieronymi argues that we can revise our beliefs when new information arises that challenges them. She uses the example of our belief about how long a commute will take, and how that belief shifts when we hear about heavy traffic. We might not be responsible for the initial beliefs we develop, but we are responsible for changing those beliefs if they turn out to be incorrect. We can evaluate our beliefs, reflect on their accuracy, and make adjustments based on those evaluations.
 
 
It is important for us to make this distinction because it helps us to better think about how we assign blame for inaccurate beliefs. We cannot blame people for developing inaccurate beliefs, but we can blame them for failing to change those beliefs. We should not spend time criticizing people for developing racist beliefs, harmful spiritual beliefs, or wildly inaccurate beliefs about health, well-being, and social structures. What we should do is blame people for failing to recognize that their beliefs are wrong, and we should help people build evaluative capacities to better reflect on their own beliefs. This changes our stance from labeling people as racists, bigots, or jerks, and instead puts the responsibility on us to foster a society of accurate self-reflection that pushes back against inaccurate beliefs. Labeling people blames them for acquiring vices, which is unreasonable; fostering a culture that values accurate information will ease the transition to more accurate beliefs.

A Vaccine for Lies and Falsehoods

Vaccines are on everyone’s mind this year as we hope to move forward from the Coronavirus Pandemic, and I cannot help but think about today’s quote from Quassim Cassam’s book Vices of the Mind through a vaccine lens. While writing about ways to build and maintain epistemic virtues Cassam writes, “only the inculcation and cultivation of the ability to distinguish truth from lies can prevent our knowledge from being undermined by malevolent individuals and organizations that peddle falsehoods for their own political or economic ends.” In other words, there is no vaccine for lies and falsehoods, only the hard work of building the skills to recognize truth, narrative, and outright lies.
I am also reminded of a saying that Steven Pinker included in his book Enlightenment Now, “any jackass can knock down a barn, but it takes a carpenter to build one.” This quote comes to mind when I think about Cassam’s quote because building knowledge is hard, but spreading falsehoods is easy. Epistemic vices are easy, but epistemic virtues are hard.
Anyone can be closed-minded, anyone can use lies to try to better their own position, and anyone can be tricked by wishful thinking. It takes effort and concentration to be open-minded yet not gullible, to identify and counter lies, and to create and transmit knowledge for use by other people. The vast knowledge bases that humanity has built have taken years to develop, to weed out inaccuracies, and to painstakingly home in on ever more precise and accurate understandings of the universe. All this knowledge and information has taken incredible amounts of hard work by people dedicated to building it.
But any jackass can knock it all down. Anyone can come along and attack science, attack knowledge, spread misinformation and deliberately use disinformation to confuse and mislead people. Being an epistemic carpenter and building knowledge is hard, but being a conman and acting epistemically malevolent is easy.
The task for all of us is to think critically about our knowledge, about the systems and structures that have facilitated our knowledge growth and development as a species over time, and to do what we can to be more epistemically virtuous. Only by working hard to identify truth, to improve systems for creating accurate information, and to enhance the knowledge highways that help people learn and transmit knowledge effectively can we continue to move forward. At any point we can choose to throw sand in the gears of knowledge, bringing the whole system down, or we can find ways to make it harder to gum up the knowledge machinery we have built. We must do the latter if we want to continue to grow, develop, and live peacefully, rather than at the mercy of the epistemically malevolent. After all, there is no vaccine to cure us of lies and falsehoods.
Improve Your Posture - Joe Abittan - Vices Of The Mind - Cassam

In the book Vices of the Mind, Quassim Cassam compares our thinking to our physical posture. Parents, physical therapists, and human resources departments all know the importance of good physical posture. Strengthening your core, lifting with your legs and not your back, and keeping your computer monitor at an appropriate height are important if you are going to avoid physical injuries and the costly medical care needed to relieve your pain. But have you ever thought about your epistemic posture?
Your epistemic posture can be thought of in a similar manner as your physical posture. Are you paying attention to the right things, are you practicing good focus, and are you working on being open-minded? Having good epistemic posture will mean that you are thinking in a way that is the most conducive to knowledge generation. Just as poor physical posture can result in injuries, poor epistemic posture can result in knowledge injuries (at least if you want to consider a lack of knowledge and information an injury).
Cassam writes, “The importance of one’s physical posture in doing physical work is widely recognized. The importance of one’s epistemic posture in doing epistemic work is not. Poor physical posture causes all manner of physical problems, and a poor epistemic posture causes all manner of intellectual problems. So the best advice to the epistemically insouciant and intellectually arrogant is: improve your posture.”
Improving our epistemic posture is not easy. It’s not something we can just wake up and decide to do on our own, just as we can’t improve our walking form, the way we lift boxes, or the ergonomics of our workspace all on our own. We need coaches, teachers, and therapists to help us see where we are going through dangerous, harmful, or imbalanced motions, and we need them to help correct us. These are skills that should be taught from a young age (both physically and epistemically) to help us adopt good habits and maintain a healthy posture throughout life.
Thinking in ways that build and enhance our knowledge is important. It is important that we learn to be open-minded, that we learn how not to be arrogant, and that we learn that our opinions and perspectives are limited. The more we practice good epistemic posture the better we can be at recognizing when we have enough information to make important decisions and when we are making decisions without sufficient information. It can help us avoid spreading misinformation and disinformation, and can help us avoid harmful conspiracy theories or motivated reasoning. Good epistemic posture will help us have strong and resilient minds, just as good physical posture will help us have strong and resilient bodies.
Who is Harmed by Epistemic Malevolence?

One of the reasons we should care about epistemic vices is that they harm all of society. Epistemic vices are vices that hinder knowledge, and since we live in complex and interconnected societies, we rely on shared and easily accessible knowledge in order for any of us to survive. When knowledge is hindered, the chance that complex systems can break down and harm people increases.
 
This idea is important and helpful when we think about our own potential epistemic vices. Our attitudes, behaviors, and actions that hinder knowledge may not harm us, but may harm someone else or broader segments of society. In his book Vices of the Mind, Quassim Cassam demonstrates this by examining epistemic malevolence. He writes, “the person who is deprived of knowledge by the vice of epistemic malevolence is not the person with the vice.”
 
If someone is intentionally misleading you by giving you false information or making you question legitimate information for their own gain, then they are not harmed. They likely know that the information they are presenting and sharing is inaccurate, but stand to gain from you having inaccurate information. They may stand to profit, which motivates their epistemic malevolence, while you are harmed.
 
In some epistemic vices, the individual with the vice is the one who is harmed. Wishful thinkers and gullible individuals are the ones who are harmed by their epistemic vices. However, other epistemic vices, as the malevolence example demonstrates, harm other people. Knowledge is something that is shared and built communally. Few of us develop real knowledge completely on our own, and the power of knowledge is magnified when shared with others. Often, when we get in the way of this process, it is not just ourselves that are harmed, but all of society, increasing the responsibility that we all have to minimize epistemic vices.

Using Misinformation and Disinformation for Political Purposes

“A relentless barrage of misleading pronouncements about a given subject,” writes Quassim Cassam in Vices of the Mind, “can deprive one of one’s prior knowledge of that subject by muddying the waters and making one mistrust one’s own judgement.”
This sentence seems to perfectly describe the four-year presidency of Donald Trump. The former President of the United States said a lot of things that could not possibly be true, and didn’t seem to care whether his statements were accurate or inaccurate. There were times when he was clearly trying to mislead the nation, and times when he simply didn’t seem to know what he was talking about and made up claims that sounded good in the moment. Regardless of whether he was trying to deliberately mislead the public, his statements often had the same effect. They created confusion, a buzz around a particular topic, and a dizzying array of rebuttals, supporting arguments, and complicated fact-checks.
The President’s epistemic insouciance created confusion and bitter arguments that the President could spin for his own political gain. He would lie about meaningless topics and then criticize people for focusing on narrow and unimportant falsehoods. He would say random and sometimes contradictory things which would create so much confusion around a topic that people had trouble understanding what the argument was about and began to doubt factual information and reporting. The result was a blurring of the lines between reputable and fact-based reporting and hyperbolic opinionated reporting.
A clear lesson from Trump’s presidency is that we need to do a better job of holding elected officials to a higher standard with their statements. Unfortunately, it often goes against our self-interest or group interest to hold the elected officials we favor to high standards. If we generally like a politician who happens to be epistemically insouciant, it is hard to vote against them, even if we know what they say is wrong or deliberately misleading. As many of Trump’s supporters demonstrated, it can be more comfortable to do complex mental gymnastics to make excuses for obviously inept and dangerous behaviors than to admit that our favored politician is lazy and incompetent.
Knowledge and accurate beliefs are important. We have entered a period in human history where we depend on complex systems. Whether it is infrastructure, supply chains, or human impacts on climate, our actions and behaviors are part of large interconnected systems. None of us can understand these systems individually, and we depend on experts who can help us make sense of how we relate to larger wholes. We need to be investing in and developing systems and structures that encourage and facilitate knowledge. Using misinformation and disinformation for political purposes inhibits knowledge, and makes us more vulnerable to system collapses when we cannot effectively and efficiently coordinate our actions and behaviors as complex systems change or break. Going forward, we have to find a way to prevent the epistemically insouciant from muddying the waters and clouding our knowledge.
Epistemically Malevolent & Epistemically Insouciant

Over the last few years I feel as though I have seen an increase in the number of news outlets and reporters saying that we now live in a post-truth society. The argument is that truth and accuracy no longer matter to many people, and that we live in a world where people simply want to believe what they want to believe, regardless of the evidence. This argument is supported by documented instances of fake news, by a former US president who didn’t seem to care what the truth was, and by politicians and everyday people professing clearly inaccurate beliefs as a type of loyalty test. This puts us in a position where it becomes difficult to communicate important information and to create a coherent narrative based on accurate details about the events of our lives.
Two concepts that Quassim Cassam discusses in his book Vices of the Mind can help us think about what it means to be in a post-truth society. Cassam writes, “one can be epistemically malevolent without being epistemically insouciant.” To me, it seems that a post-truth society depends on both malevolency and insouciance to exist. I find it helpful to see that there is a distinction in these two different postures toward knowledge.
To be epistemically malevolent means to intentionally and deliberately attempt to hinder and limit knowledge. Cassam uses the example of tobacco companies deliberately misleading the public on the dangers of smoking. Company executives intentionally made efforts to hide accurate scientific information and to mislead the public. In recent years we have seen epistemic malevolence in the form of fake-news, misinformation, and disinformation intended to harm political opponents and discourage voter turnout for opposing political parties.
Epistemic insouciance doesn’t necessarily have a malicious intent behind it. Instead, it is characterized by an indifference to the accuracy of information. You don’t need an intentional desire to spread false information in order to be epistemically insouciant. However, this careless attitude toward the accuracy of information is in some ways necessary for false information to take hold. Individuals who care whether their knowledge and statements are correct are less likely to be pulled in by the epistemically malevolent, and less likely to spread their messages. However, someone who favors what the epistemically malevolent have to say and is unwilling to be critical of the message is more likely to engage with such false messaging and to echo and spread malevolent lies. Even if an individual doesn’t want to be intentionally misleading, insouciance plays into malevolence.
This helps us see that our post-truth society will need to be addressed on two fronts. First, we need to understand why people are epistemically insouciant and find ways to encourage people to be more concerned with the accuracy and factuality of their statements and beliefs. External nudges, social pressures, and other feedback should be developed to promote factual statements and to hinder epistemic insouciance. This is crucial to getting people to both recognize and denounce epistemic malevolency. Once people care about the accuracy of their beliefs and statements, we can increase the costs of deliberately spreading false information. As things exist now, epistemic insouciance encourages epistemic malevolency. Combating epistemic malevolency will require that we address epistemic insouciance and then turn our attention to stopping the spread of deliberate falsehoods and fake news.
Prejudice as an Epistemic Vice - Joe Abittan - Vices of the Mind

“Prejudice counts as an epistemic attitude insofar as it is an affective posture toward another person’s epistemic credentials,” writes Quassim Cassam in his book Vices of the Mind. Prejudices inhibit knowledge, deserve reproof, and are attitudes for which individuals can be blameworthy of holding. Therefore, prejudices qualify as epistemic vices.
Cassam continues, “what makes a prejudice a prejudice is that it is an attitude formed and sustained without any proper inquiry into the merits or demerits of its object.” Prejudices are not based on fact and reality. They are based on incomplete subjective opinions and evaluations of people, places, and things. Generally, a few standout qualities that we either like or dislike are used as justification for our opinions of entire classes and groups, regardless of whether those perceived qualities are indeed real or generalizable to the larger class. Greater consideration might show us that our beliefs are incorrect, that our assumptions are mistaken, and that our perspectives are not generalizable, but prejudices are maintained by an active unwillingness (or an insouciance) to obtain better information.
It is important to note that Cassam’s quote shows that prejudices are not always negative views of people, places, or things. We can be prejudiced to think that something is good or exemplary – think about fancy cars, expensive brands, or your favorite celebrities. What matters with prejudice is not whether we favor or scorn something, but the fact that we adopt inaccurate beliefs via an attitude that hinders knowledge. We could learn more about people, places, and things to better understand their merits and demerits, increasing our knowledge and the knowledge of anyone we share our lessons with. However, prejudiced individuals have an attitude that actively avoids such information, limiting knowledge and preventing the transmission of useful information to others. This limitation of knowledge, and the sustaining of incorrect knowledge, is where prejudices become specifically epistemic vices. Understanding this helps us recognize our prejudices (both positive and negative) and see how we can eliminate them.
Epistemic Insouciance

Dictionary.com defines insouciant as free from concern, worry, or anxiety; carefree; nonchalant. To be epistemically insouciant, then, is to be carefree or nonchalant regarding knowledge. Epistemic insouciance can be characterized as a lack of concern for accurate information, true beliefs, and verifiable knowledge. Whether you know something or not, and whether what you think you know is correct or not, is of little concern.
In Vices of the Mind, Quassim Cassam writes the following about epistemic insouciance:
“Epistemic insouciance means not really caring about any of this [whether claims are grounded in reality or the evidence] and being excessively casual and nonchalant about the challenge of finding answers to complex questions, partly as a result of a tendency to view such questions as much less complex than they really are.”
Cassam goes on to define epistemic insouciance as an attitude vice, distinct from the epistemic vices in the book that he characterizes as thinking-style vices or character-trait vices. To demonstrate how it becomes an attitude vice, Cassam uses reporting from the Brexit campaign to show how a lack of concern over evidence and over the impact of complex questions reflected an epistemically insouciant attitude. According to Cassam, reports indicated that Boris Johnson, the current British Prime Minister, did not care much about the actual outcome of the vote on remaining in or leaving the European Union. Johnson eventually wrote an article supporting the decision to leave, but he reportedly also had an article written supporting the decision to remain, in case that side won the referendum. His interest was in supporting the winning position, not in the hard work of trying to determine which side he should support and what the actual social, financial, and future impacts of the choices would be. He didn’t care about the evidence and information surrounding the decision, only that he looked like he was on the right side.
Epistemic insouciance is not limited to politicians. We can all be guilty of it, and in some ways we cannot move through the world without it. At the moment, I need to make a decision regarding a transmission repair for a vehicle of mine. I have many responsibilities and concerns that I think are more important than the transmission issue, and I am not interested in really evaluating any evidence to support the decision I eventually make to repair the transmission or just get rid of the vehicle. If I were not epistemically insouciant on this issue, I would research the costs more thoroughly, try to understand how much use I could get out of the vehicle if I repaired it, and consider alternatives, such as what it could be sold for and what I would spend on a better vehicle. However, this is a lot of work for an item that is not a major concern for me right now. I can save the mental energy and attention for more important issues.
Our minds are limited. We cannot be experts in all areas and all decisions that we have to make. Some degree of epistemic insouciance is sometimes necessary, even if it can be financially costly. However, it is important that we recognize when we are being epistemically insouciant and that we try to understand the risks associated with this attitude in our decisions. We should ensure that we are not epistemically insouciant on the most important decisions in our lives, and we should try to clear out the mental clutter and habits that may make us epistemically insouciant on those important issues.

Justified Beliefs

A lot of us have beliefs that are formed out of biases and prejudices. Often those beliefs still end up being true in the end, but they are nevertheless unjustified. A skill of the human mind is to ignore contradictory evidence and focus on the limited evidence which supports what we want to believe and backs up our prior assumptions. Whether it is a belief about a sports team, a racial or ethnic group, or a restaurant, we often adopt unjustified beliefs that we support with anecdotal thinking. When these unjustified beliefs turn out to be correct, we use them as a defense of our biased thinking, and risk becoming entrenched in inaccurate assumptions about how the world works.
In Vices of the Mind Quassim Cassam writes about this directly. He argues that people need to be more careful when considering whether a way of thinking is helpful or harmful, and whether a true result in the end justifies biased assumptions. Cassam writes, “leading to true belief is not the same as being conducive to knowledge. Even in cases where an epistemic vice leads someone to believe something true that doesn’t mean that they have the right to be confident that things are as they take them to be or that their belief is justified.”
To take a relatively harmless example, imagine two sports fans who bet on a college basketball game. One fan might be biased in favor of big-name schools, while the other might be willing to look at sports analytics when deciding which team is likely to win. The biased individual may bet against a smaller school and may win that bet, but it is hard to say that he would systematically win bets against small schools in favor of more recognizable schools. In any individual instance his bet might pay off, but over the long term we would expect the more objective, open-minded individual who relies on sports analytics or other evidence to win more bets. The biased individual who wins a lucky bet does not have justified beliefs, even when his bias pays off.
This type of thinking can be more harmful than bets among college basketball fans. The human mind has a remarkable ability to remember the evidence that supports the beliefs we want to be true and to ignore evidence that undermines our desired beliefs. The biased sports fan probably remembers when he was right about a small school being over-hyped, but probably doesn’t remember the times when big-name schools lost to smaller schools. The same thing can happen with people who are biased against police officers, minority groups, or people who drive certain types of cars. The reference class doesn’t matter to our brain, but the individual anecdotes that support our prior beliefs are remembered.
Holding justified beliefs requires that we inform our beliefs based on real-world evidence with statistical value. Basing our beliefs on individual anecdotes will not consistently lead us to having accurate beliefs, and if we do hit upon a true belief from time to time, we won’t be justified in the beliefs, assumptions, and conclusions that we draw. It is important to recognize when our thinking is anecdotal, and to consider whether our beliefs are justified.