Vices and Personalities

In Vices of the Mind, Quassim Cassam argues that epistemic vices are different from personality traits. He argues that we can change our behaviors and escape epistemic vices in a way that we cannot escape ingrained aspects of our personality and who we are. This means that we can improve the way we think and become more rational and knowledgeable individuals.
“Wishful thinking is what a person does rather than what a person is like,” writes Cassam, capturing the difference between a vice and a personality trait. We can be generally happy and optimistic people, or generally negative and pessimistic, and though I have not studied it, my understanding is that our genes can influence our general outlook and disposition to some extent. Nevertheless, we can still engage in epistemic vices like wishful thinking whether we are normally an optimist or a pessimist. Distinguishing epistemic vices, which are more within our control, from personality traits helps us see how we can adjust our thinking to improve our knowledge.
To me, the distinction is similar to the difference between the Spanish verbs estar and ser. Estar describes states of things that change: you would use it to say I am happy today, the house is in good condition, or the vase is broken. Ser captures essential elements of something: you would use it to describe yourself as tall, to say that the house is large, or to describe a vase as blue.
We can generally be positive people, generally excited to talk to strangers, or we can prefer familiar routines rather than unknown situations. But regardless of these essential characteristics, there can be patterns of thinking that we engage in, like wishful thinking. Wishful thinking is a pattern of thought that assumes the best outcomes, discredits information that contradicts our hopes, and ignores the pursuit of additional information that might change our mind. It is a behavior that obstructs knowledge, and is also a behavior we can escape through practice and recognition.
The other epistemic vices that Cassam highlights are similar to wishful thinking. They are behaviors and patterns of thought that we generally have more control over than whether we have a sunny disposition toward life. Because they are behaviors that obstruct knowledge, they are behaviors we can and should strive to avoid in order to facilitate knowledge, improve our decision-making, and ultimately strengthen the choices we make in life.
Systematically Obstructing Knowledge

The defining feature of epistemic vices, according to Quassim Cassam, is that they get in the way of knowledge. They inhibit the transmission of knowledge from one person to another, they prevent someone from acquiring knowledge, or they make it harder to retain and recall knowledge when needed. Importantly, epistemic vices don’t always obstruct knowledge, but they tend to do so systematically.
“There would be no justification for classifying closed-mindedness or arrogance as epistemic vices if they didn’t systematically get in the way of knowledge,” writes Cassam in Vices of the Mind. Cassam lays out his argument for striving against mental vices through a lens of consequentialism. Focusing on the outcomes of ways of thinking, Cassam argues that we should avoid mental vices because they lead to bad outcomes and limit knowledge in most cases.
Cassam notes that epistemic vices can turn out well for an individual in some cases. While not specifically mentioned by Cassam, we can use former President Donald Trump as an example. Cassam writes, “The point of distinguishing between systematically and invariably is to make room for the possibility that epistemic vices can have unexpected effects in particular cases.” Trump used a massive personal fortune, unabashed bravado, and a suite of mental vices to bully his way into the presidency. Mental vices such as arrogance, closed-mindedness, and prejudice became features of his presidency, not defects. However, while his epistemic vices helped propel him to the presidency, they clearly and systematically created chaos and problems once he was in office. In his arrogance he attempted to pressure the president of Ukraine into investigating a political rival, leading to his first impeachment. His closed-mindedness and wishful thinking contributed to his second impeachment as he spread baseless lies about the election.
For most of us, in most situations, these same mental vices will likely lead to failure and error rather than success. Arrogance is likely to prevent us from learning about areas where we could improve ourselves to perform better in an upcoming job interview. Closed-mindedness is likely to prevent us from gaining knowledge about saving money with solar panels or about a new ethnic restaurant that we would really enjoy. Prejudice is likely to prevent us from learning about new hobbies, pastimes, or opportunities for investment. These vices won’t necessarily always lead to failure or limit important knowledge, as Trump demonstrated, but they are more likely to obstruct important knowledge than if we had pushed against them.
Epistemic Vices - Joe Abittan

Quassim Cassam’s book Vices of the Mind is all about epistemic vices. Epistemic vices are intentional and unintentional habits, behaviors, personality traits, and patterns of thought that hinder knowledge, information sharing, and accurate and adequate understandings of the world around us. Sometimes we intentionally deceive ourselves, sometimes we simply fail to recognize that we don’t have enough data to confidently state our beliefs, and sometimes we are intentionally deceived by others without recognizing it. When we fall into thinking habits and styles that limit our ability to think critically and rationally, we are indulging in epistemic vices, and the results can often be dangerous to ourselves and people impacted by our decisions.
“Knowledge is something that we can acquire, retain, and transmit. Put more simply, it is something that we can gain, keep, and share. So one way to see how epistemic vices get in the way of knowledge is to see how they obstruct the acquisition, retention, and transmission of knowledge,” Cassam writes.
A challenge that I have is living comfortably knowing that I have incomplete knowledge on everything, that the world is more complex than I can manage to realize, and that even when doing my best I will still not know everything that another person does. This realization is paralyzing for me, and I constantly feel inadequate because of it. However, Cassam’s quote provides a perspective of hope.
Knowledge is something we can always gain, retain, and transmit. We can improve all of those areas, gaining more knowledge, improving our retention and retrieval of knowledge, and doing better to transmit our knowledge. By recognizing and eliminating epistemic vices we can increase the knowledge that we have, use, and share, ultimately boosting our productivity and value to human society. Seeing knowledge as an iceberg that we can only access a tiny fraction of is paralyzing, but recognizing that knowledge is something we can improve our access to and use of is empowering. Cassam’s book is helpful in shining a light on epistemic vices so we can identify them, understand how they obstruct knowledge, and overcome our vices to improve our relationship with knowledge.
Believing We Are Well Informed

In his book Risk Savvy, Gerd Gigerenzer demonstrates that people often overestimate their knowledge about the benefits of cancer screenings. “A national telephone survey of U.S. adults,” he writes, “reported that the majority were extremely confident in their decision about prostate, colorectal, and breast screening, believed they were well informed, but could not correctly answer a single knowledge question.” I think this quote reveals something important about the way our minds work. We often believe we are well informed, but that belief and our confidence in our knowledge are often an illusion.
This is something I have been trying to work on. My initial reaction any time I hear any fact or any discussion about any topic is to position myself as a knowledgeable semi-expert in the topic. I have noticed that I do this with ideas and topics that I have really only heard once or twice on a commercial, or that I have seen in a headline, or that I once overheard someone talking about. I immediately feel like an expert even though my knowledge is often less than surface deep.
I think that what is happening in these situations is that I am substituting an easier question for the real question of how much I actually know. Instead of asking whether I understand the topic, I answer the question, can I recall a time when I thought about this thing? Mental substitution like this is common, but hard to detect in the moment. I suspect that the easier a topic comes to mind, even if it is a topic I know nothing about beyond the name, the more likely I am to feel like an expert.
Gigerenzer’s quote shows that people will believe themselves to be well informed even if they cannot answer a basic knowledge question about the topic. Rather than substituting the question can I recall a time when I thought about this thing, patients may also be substituting another question. Instead of analyzing their confidence in their own decision regarding cancer screening, people may be substituting the question do I trust my doctor? Trust in a physician, even without any knowledge about the procedure, may be enough for people to feel extremely confident in their decisions. They don’t have to know a lot about their health or how a procedure is going to impact it, they just need to be confident that their physician does.
These types of substitutions are important for us to recognize. We should try to identify when we are falling victim to the availability bias and when we are substituting different questions that are easier for us to answer. In a well functioning and accurate healthcare setting these biases and cognitive errors may not harm us too much, but in a world of uncertainty, we stand to lose a lot when we fail to recognize how little we actually know. Being honest about our knowledge and thinking patterns can help us develop better systems and structures in our lives to improve and guide our decision-making.
The Results of Social Learning

The results of social learning are not always positive. We learn a lot from our friends, our culture, and the people around us, often without being aware of it. We are greatly influenced by what we see others doing and believing, and this includes the things we learn and come to believe as true facts about the world. This is easily demonstrated by comparing the opinions of people who get their news from traditional outlets with those of people who get their news from fringe sources with political biases. But it is also true in spaces you would not expect.

To describe the pitfalls of social learning, Gerd Gigerenzer writes in Risk Savvy, “All in all, social learning leads to a paradoxical result. In France, Germany, Italy, the United Kingdom, and the United States, doctors’ beliefs about diet and health – such as taking vitamin supplements or exercising – more closely resemble those of the general public in their country than of doctors in other countries.”

When it comes to general knowledge and an ability to distinguish accurate information from fads, trends, or beliefs without evidence, we like to imagine that we are smart and capable of identifying the truth. We like to believe that our beliefs are based on reality, that we have carefully considered the facts, and that we hold our beliefs for good reason. We won’t admit that we believe the things we do because others hold those same beliefs, but as the doctor example above indicates, that is often the case. The Dartmouth Atlas Project shows differences across the USA in treatments for certain conditions and in rates of diagnosis for different conditions. Some of that may be genetic and reflect real health differences across the country, but some of the differences reflect different beliefs about treatment approaches among doctors trained and practicing in different regions.

Social learning results are good when they bring people together in support of democratic norms or help people understand that sitting on a couch all day and eating pizza for dinner every night are unhealthy behaviors. However, social learning results can be negative, as when doctors cluster around wasteful medical practices. The results of social learning can also be simply random and strange, such as when people fall into fad diets or exercise programs that have no discernible health benefits or harms. What we should take away from Gigerenzer’s quote is that our knowledge is not always as rock solid and evidence-based as we would like to believe. We should be honest with ourselves and investigate whether our beliefs are based on real evidence or on the people in our social groups who happen to hold the same beliefs. Perhaps our beliefs will still be justifiable after strict scrutiny, but perhaps some beliefs can be let go when we see they are based on little more than the opinions and feelings of the people around us.
Informed Bets

My last post was about the limitations of the human mind and why we should be willing to doubt our conclusions and beliefs. This post contrasts with my last post, arguing that we can trust the informed bets that our brains make. Our brains and bodies cannot fully capture all of the information necessary to perfectly replicate reality in our minds, but they do a good job putting information together in a way that helps us successfully navigate the world and our lives. Informed guesses, that is, assumptions and intuitions based on experience and expertise rather than random and amateurish judgments, are actually very useful and often good approximations.

“Intelligence…” Gerd Gigerenzer writes in his book Risk Savvy, “is the art of making informed guesses.” Our brains make a lot of predictions and rely on heuristics, assumptions, and guesses to get by. It turns out that our brains do this well, as Gigerenzer argues in his book. We don’t need to pull out graph paper and a scientific calculator to catch a football. We don’t need to record every thought and action we have had over the last month to know if we are happy with our New Year’s resolutions and can keep them going. When we see someone standing in a long customer service line at the grocery store we don’t need to approach them with a 100 point questionnaire to know whether they are bored or upset. Informed bets and reasonable guesses are sufficient for us to have a decent and functional understanding of the world.
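The football example corresponds to what Gigerenzer calls the gaze heuristic: fixate on the ball and adjust your running speed so the angle of gaze stays roughly constant, with no trajectory math required. Here is a minimal sketch of that idea; the numbers, the adjustment rule, and the `gain` parameter are all hypothetical, purely for illustration.

```python
import math

def gaze_heuristic_step(player_x, ball_x, ball_height, target_angle, gain=0.5):
    """One adjustment step of a toy gaze heuristic.

    The player moves so that the gaze angle to the ball drifts back toward
    a constant target angle, rather than computing the ball's trajectory.
    """
    distance = ball_x - player_x
    current_angle = math.atan2(ball_height, distance)
    error = current_angle - target_angle
    # Gaze angle too steep -> the ball will land behind us -> back up
    # (negative step); too shallow -> run forward (positive step).
    return player_x - gain * error * distance

# The ball looks steeper than the target angle, so the player backs up a bit.
new_x = gaze_heuristic_step(player_x=0.0, ball_x=20.0, ball_height=15.0,
                            target_angle=math.radians(30))
```

Each step nudges the gaze angle back toward the target, which is the whole trick: a single simple feedback rule stands in for solving the physics of the flight.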

Gigerenzer continues, “Intelligence means going beyond the information given and making informed bets on what’s outside.” This quote is introduced after an optical illusion, where a grayscale checkerboard is shown with a figure casting a shadow across the board. Two squares on the board are the same shade of gray, yet our minds see the squares as different colors. Our minds are going beyond the information given, the literal wavelength of light reaching the back of our eyes, and making informed bets on the relative colors of the squares on the board if there was not a figure to cast a shadow. In the case of the visual illusion, our brain’s guess about reality is actually more helpful for us than the literal reality of the same colors of the squares in the image.

Bounded rationality is a serious concern. We cannot absorb all the information that exists in the world which may help us make better decisions. However, humans are intelligent. We can use the information we receive and make informed bets about the best choices and decisions available. We might not be perfect, but by making informed bets and educated guesses we can successfully come to understand the world and create systems and structures that help us improve our understanding over time.
Intelligence - Joe Abittan

“Intelligence is not an abstract number such as an IQ, but similar to a carpenter’s tacit knowledge about using appropriate tools,” writes Gerd Gigerenzer in his book Risk Savvy. “This is why the modern science of intelligence studies the adaptive toolbox that individuals, organizations, and cultures have at their disposal; that is, the evolved and learned rules that guide our deliberate and intuitive decisions.”

I like Gigerenzer’s way of explaining intelligence. It is not simply a number or a ratio; it is our knowledge and ability to understand our world. There are complex relationships between living creatures, physical matter, and information. Intelligence is an understanding of those relationships and an ability to navigate the complexity, uncertainty, and connections between everything in the world. Explicit rules, like mathematical formulas, help us understand some relationships, while statistical percentages help us understand others. Recognizing commonalities between different categories of things and identifying patterns help us understand these relationships and serve as the basis for our intelligence.

What is important to note is that our intelligence is built with concrete tools for some situations, like 2+2=4, and less concrete rules of thumb for other situations, like the golden rule: do to others what you would like others to do to you. Gigerenzer shows that our intelligence requires that we know more than one mathematical formula and have more than one rule of thumb to help us approach and address complex relationships in the world. “Granted, one rule of thumb cannot possibly solve all problems; for that reason, our minds have learned a toolbox of rules. … these rules of thumb need to be used in an adaptive way.”

Whether it is interpreting statistical chance, judging the emotions of others, or making plans now that delay gratification until a later time, our rules of thumb don’t have to be precise, but they do need to be flexible and adaptive given our current circumstances. 2+2 will always equal 4, but a smile from a family member might be a display of happiness or a nervous impulse and a silent plea for help in an awkward situation. It is our adaptive toolbox and our intelligence that allow us to figure out what a smile means. Similarly, adaptive rules of thumb and intelligence help us reduce complex interactions and questions to more manageable choices, reducing uncertainty about how much we need to save for retirement to a rule of thumb that tells us to save a small but significant amount of each paycheck. Intelligence is not just about facts and complex math. It is about adaptable rules of thumb that help us make sense of complexity and uncertainty, and the more adaptive these rules of thumb are, the more our intelligence can help us in the complex world of today and into the uncertain future.
Missing Feedback

I generally think we are overconfident in our opinions. We should all be more skeptical that we are right, that we have made the best possible decisions, and that we truly understand how the world operates. Our worldviews can only be informed by our experiences and by the information we take in about events, phenomena, and stories in the world. We will always be limited because we can’t take in all the information the world has to offer. Additionally, beyond simply not being able to hold all the information possible, we are unable to get the appropriate feedback we need in all situations for comprehensive learning. Some feedback is hazy and some feedback is impossible to receive at all. This means that we cannot be sure that we have made the best choices in our lives, even if things are going well and we are making our best efforts to study the world.

In Nudge, Cass Sunstein and Richard Thaler write, “When feedback does not work, we may benefit from a nudge.” When we can’t get immediate feedback on our choices and decisions, or when the feedback we get is unclear, we can’t adjust appropriately for future decisions. We can’t learn, we can’t improve, and we can’t make the best choices when we return to a decision situation. However, we can observe where situations of poor feedback exist, and we can design those decision spaces to provide subtle nudges that help people make better decisions in the absence of feedback. Visual aids showing how much money people need for retirement, and how much they can expect to have based on current savings rates, are a helpful nudge in a situation where we don’t get feedback on how well we are saving. There are devices that glow red or green based on a home’s current energy usage and efficiency, providing a subtle nudge to remind people not to use appliances at peak demand times and giving them feedback on energy usage they normally wouldn’t receive. Nudges such as these can provide feedback, or can provide helpful information in the absence of feedback.
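To make the retirement example concrete, here is a minimal sketch of the kind of projection such a visual aid might run behind the scenes; the balances, contribution, growth rate, and red/green threshold are all hypothetical figures chosen only for illustration, not any real planning formula.

```python
def project_savings(current_balance, annual_contribution, years, annual_return=0.05):
    """Project retirement savings with simple annual compounding."""
    balance = current_balance
    for _ in range(years):
        balance = balance * (1 + annual_return) + annual_contribution
    return balance

def savings_nudge(projected, target):
    """Return a coarse red/green signal, like the energy glow device."""
    if projected >= target:
        return "green: on track"
    return "red: projected shortfall of ${:,.0f}".format(target - projected)

# Hypothetical saver: $20,000 saved, $6,000/year, 30 years to go, $500,000 goal.
projected = project_savings(20_000, 6_000, 30, annual_return=0.05)
signal = savings_nudge(projected, target=500_000)
```

The point of the sketch is the design choice, not the arithmetic: a single glanceable red/green signal substitutes for feedback that otherwise would not arrive for decades.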

Sunstein and Thaler also write, “many of life’s choices are like practicing putting without being able to see where the balls end up, and for one simple reason: the situation is not structured to provide good feedback. For example, we usually get feedback only on the options we select, not the ones we reject.” Missing feedback is an important consideration because the lack of feedback influences how we understand the world and how we make decisions. The fact that we cannot get feedback on options we never chose should be nearly paralyzing. We can’t say how the world works if we never experiment and try something different. We can settle into a decent rhythm and routine, but we may be missing out on better lifestyles, happier lives, or better societies if we made different choices. However, we can never receive feedback on these non-choices. I don’t know that this means we should necessarily try to constantly experiment at the cost of settling in with the feedback we can receive, but I do think it means we should discount our own confidence and accept that we don’t know all there is. I also think it means we should look to increase nudges, use more visual aids, and structure our choices and decisions in ways that help maximize useful feedback to improve learning for future decision-making.
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how logical coherence of personal narratives is easier when we lack key information and have a limited set of experiences to draw from. The more we know, the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and easily logical our narrative about the world should be, and the less confidence we should have about anything.

If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, say an investment strategy, then you should discount the outcome based on their certainty. They may still be right in the end, but their certainty shouldn’t be what leads you to support the outcome they tell you is a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”

We tend to be very trusting. Our society and economy run on the trust we place in complete strangers. Our inclination toward trust is what makes us so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions without any concrete connection to reality.
A Large Collection of Miniskills

I really like the way Daniel Kahneman describes expertise in his book Thinking Fast and Slow. His description is especially meaningful today, in a world where so many of us work in offices and perform knowledge work. Expertise is important, but it is a bit nebulous when you compare knowledge work expertise to craftsmanship expertise. Nevertheless, a good concept of what expertise is can be helpful when thinking about personal growth and success.

Kahneman writes, “The acquisition of expertise in complex tasks such as high-level chess, professional basketball, or firefighting is intricate and slow because expertise in a domain is not a single skill but rather a large collection of miniskills.” Thinking about expertise as a large collection of miniskills makes it more understandable and meaningful, even in the context of knowledge work. For sports, many crafts, and even physical labor, expertise as a collection of miniskills is so obvious it is almost invisible. But in knowledge work, the miniskills go unnoticed for the opposite reason: they are not obvious or tangible, so we rarely recognize them as skills at all.

The image that comes to mind when I think of expertise as a series of miniskills is iron forging or glasswork. It is clear that one must have many different skills, ranging from noticing subtle changes in materials as heat is applied, to physically shaping the material once it reaches a certain temperature. One also needs imaginative skills to see the shape and design one wants, and to connect the right twists, bends, and physical manipulations to match the mental image. Forging a knife or making a glass marble requires many skills in related but different spheres to produce one final product. It is obvious that one needs a lot of miniskills to be successful, but unless we enroll in a beginner’s class, we don’t necessarily think about all the miniskills that go into the craftsmanship.

In the knowledge work economy, our final work products are also an accumulation of miniskills, even though it feels as though we just produce one thing or do one thing with no real “skill” involved. However, our work requires communication skills, writing skills (a particular variation of communication skills), scheduling and coordinating skills, and oftentimes skills that require us to be able to create visually stimulating and engaging materials. Whether it is creating a slide show, coordinating an important meeting, or drafting standard operating procedures, we are not simply doing one thing, but are engaging an entire set of miniskills. True expertise in knowledge work is still derived from a set of miniskills, but the skills themselves don’t seem like real skills, and are easily ignored or overlooked. Focusing on the miniskills needed for knowledge work expertise can help us understand where we can improve, what our image of success really entails, and how to approach important projects. It is the mastery and connection of various miniskills that enables us to be experts in what we do, even in our ubiquitous office environments.