Shooting Accuracy & Movie Expectations

The other day I started a blog post arguing that movies about war give us a false impression of what it is really like to fight in one. The post was based on a quote from Mary Roach’s book Grunt, but it drifted too far from the original context of the quote, so I scrapped the post and rewrote it. Today’s quote from Grunt lets me revisit the idea more directly. In the book Roach writes, “The average police officer taking a qualifying test on a shooting range scores 85 to 92 percent, [Bruce] Siddle told me, but in actual firefights hits the target only 18 percent of the time.”
In movies, the good guys never miss the target during practice. In actual battles their accuracy drops, but it remains far higher than 18 percent. Even their misses usually look like they were on target; the bad guy just gets lucky thanks to a passing car, an exceptional dodge, or some near-magical shield. For the good guys, missed shots are less misses than lucky blocks for the bad guy. The bad guys, of course, can’t hit anything and might as well not carry weapons at all.
This matters because it creates a false sense of what it is like to be in an active shooter situation. In our minds we all picture ourselves as the hero who never misses and who can’t be hit by the bad guy’s bullets. In reality, trained police officers hit their targets in firefights only 18% of the time. Research shows that states with Stand Your Ground laws, which provide legal immunity to individuals who use lethal force to defend themselves when attacked or within their own homes, have higher rates of men who die from gunshot wounds. The men who die are not the intruders or attackers, but the men who chose to stand their ground. These men surely thought they had a better than 18% chance of hitting their target and believed they would be the hero who couldn’t be hit by the bad guy’s bullets.
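To put some numbers on that gap, here is a rough back-of-the-envelope sketch in Python. It treats each shot as an independent event with the hit rates Roach cites; that independence assumption is mine, purely for illustration, and real firefights are obviously messier.

```python
# Rough sketch: chance of landing at least one hit in n shots,
# treating each shot as independent with a fixed hit rate.
# The hit rates come from the Roach/Siddle quote; the independence
# assumption is only for illustration.

def p_at_least_one_hit(hit_rate: float, shots: int) -> float:
    """Probability of at least one hit across a number of shots."""
    return 1 - (1 - hit_rate) ** shots

RANGE_RATE = 0.90      # roughly the qualifying-range score (85 to 92 percent)
FIREFIGHT_RATE = 0.18  # the reported hit rate in actual firefights

for shots in (1, 3, 6):
    print(
        f"{shots} shot(s): range ~{p_at_least_one_hit(RANGE_RATE, shots):.0%}, "
        f"firefight ~{p_at_least_one_hit(FIREFIGHT_RATE, shots):.0%}"
    )
```

Under those assumptions, even six shots fired in a firefight give only about a 70 percent chance of a single hit, nowhere near the near-certainty the range scores, or the movies, suggest.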
Public policy is often shaped by narrative more than fact, and popular movies influence that narrative even when we know they are impossible fictions. When we tell a narrative that assumes we can stand our ground and hit our target in a firefight, or assume we need concealed carry weapons to protect ourselves in an active shooter situation, we are basing that narrative on a fiction of how effective we would be with a firearm. If 18% is the hit rate of trained police, reality suggests that untrained individuals will hit their target even less often. In a world not influenced by movies, we would assume that concealed carry and stand your ground laws were pointless, because we would have a terrible chance of defending ourselves and stopping an active shooter. This is why it is important to recognize how far movies are from reality, to spend more time accurately understanding how humans respond in high stress situations like active shooter events, and to develop policies that account for the fact that even trained police officers rarely hit their targets when they fire their guns in those situations. We can change the way the public responds to such events, and possibly even the way police respond.
Lies Versus Epistemic Insouciance

My last post was about epistemic insouciance: being indifferent to whether your beliefs, statements, and ideas are accurate or inaccurate. Epistemic insouciance, Quassim Cassam argues in Vices of the Mind, is an attitude, a disposition toward accurate or false information that is generally case specific.
In the book, Cassam distinguishes between lies and epistemic insouciance. He writes, “lying is something that a person does rather than an attitude, and the intention to conceal the truth implies that the liar is not indifferent to the truth or falsity of his utterances. Epistemic insouciance is an attitude rather than something that a person does, and it does imply an indifference to the truth or falsity of one’s utterances.”
The distinction is helpful when we think about people who deliberately lie and manipulate information for their own gain and people who are bullshitters. Liars, as the quote suggests, know and care about what information is true and what is false. They are motivated by factors beyond the accuracy of the information, and do their best within their lies to present false information as factual.
Bullshitters, however, don’t care whether their information is accurate. The tools that uncover inaccurate information and counter a liar don’t work against a bullshitter, precisely because of their epistemic insouciance. Liars contort evidence and create excuses for misstatements and lies. Bullshitters simply flood the space with claims and statements of varying accuracy. If confronted, they insist it doesn’t matter whether they lied: their information was wrong, they didn’t care that it was wrong, and therefore they were not actually lying. This creates circular arguments and distracts from the epistemic value of information and the real costs of epistemic insouciance. Seeing the difference between liars and epistemically insouciant bullshitters is helpful if we want to know how to address those who intentionally obstruct knowledge.

Anecdotal Versus Systematic Thinking

Anecdotes are incredibly convincing, especially when they focus on an extreme case, but they are not always representative of larger populations. Some anecdotes are very context dependent, focus on specific and odd situations, and deal with narrow circumstances. Yet because they are often vivid, highly visible, and emotionally resonant, they can be highly memorable and influential.
Systematic thinking often lacks these qualities. The general reference class is frequently hard to see or make sense of. It is much easier to remember a commute that featured a police officer or a traffic accident than the vast majority of commutes that were uneventful. Sometimes the data directly contradicts our anecdotal stories and impressions, but that data often lacks the visibility to reveal the contradiction. This happens frequently with news stories or TV shows that highlight dangerous crime or teen pregnancy. Despite a rise in crime during 2020, crime rates have fallen over recent decades, and despite TV shows about teen pregnancies, those rates have also been falling.
In Vices of the Mind, Quassim Cassam examines anecdotal versus systematic thinking to demonstrate that anecdotal thinking can be an epistemic vice that obstructs our view of reality. He writes, “With a bit of imagination it is possible to show that every supposed epistemic vice can lead to true belief in certain circumstances. What is less obvious is that epistemic vices are reliable pathways to true belief or that they are systematically conducive to true belief.”
Anecdotal versus systematic (or structural) thinking is a useful lens for Cassam’s quote. An anecdote describes a situation or story with an N of 1; it is a single case study. Within any population of people, drug reactions, rocket launches, or any other phenomenon, there are going to be outliers: results that are strange and unique, deviating from the norm or the average. These individual cases are interesting and can be useful to study, but it is important that we recognize them as outliers and not generalize them to the larger population. Systematic and structural thinking helps us see the larger population and develop more accurate beliefs about what we should normally expect to happen.
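As a toy illustration of the N-of-1 problem, the short sketch below uses entirely made-up numbers: it simulates a population in which most values are ordinary and a handful are extreme, then draws a few single “anecdotes” to compare against the population average.

```python
import random

random.seed(42)

# Hypothetical population: mostly ordinary values, plus a few extreme outliers.
population = [random.gauss(10, 2) for _ in range(10_000)]
population += [random.gauss(60, 5) for _ in range(50)]  # rare, vivid cases

population_mean = sum(population) / len(population)
print(f"Population mean: {population_mean:.1f}")

# Each anecdote is a single draw; nothing about the draw itself tells you
# whether it is typical or an outlier.
for i in range(1, 6):
    anecdote = random.choice(population)
    print(f"Anecdote {i}: {anecdote:.1f}")
```

Most single draws land near the average, but the rare extreme draw is exactly the one that makes a vivid story, and the story alone gives no hint of how unrepresentative it is.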
Anecdotal thinking may occasionally lead to true beliefs about larger classes, but as Cassam notes, it will not do so reliably. We cannot build our beliefs around single anecdotes, or we will risk making decisions based on unusual outliers. Trying to address crime, reduce teen pregnancy, determine the efficacy of a medication, or verify the safety of a spaceship requires that we understand the larger systemic and structural picture. We cannot study one instance of crime and assume we know how to reduce crime across an entire country, and none of us would want to ride in a spaceship that had only been tested once.
It is important that we recognize anecdotal thinking, and other epistemic vices, so we can improve our thinking and have better understandings of reality. Doing so will help improve our decision-making, will improve the way we relate to the world, and will help us as a society better determine where we should place resources to help create a world we want to live in. Anecdotal thinking, and indulging in other epistemic vices, might give us a correct answer from time to time, but it is likely to lead to worse outcomes and decisions over time as we routinely misjudge reality. This in turn will create tensions and distrust among a society that cannot agree on the actual trends and needs of the population.
Paranormal Beliefs, Superstitions, and Conspiratorial Thinking

How we think, what we spend our time thinking about, and the way we view and understand the world all matter. If we fail to develop accurate beliefs about the world, then we will make decisions based on causal structures that do not exist. Our actions, thoughts, and behaviors will inhibit knowledge for ourselves and others, and our species will be worse off for it.
This idea is at the heart of Quassim Cassam’s book Vices of the Mind. Throughout our human history we have held many beliefs that cannot plausibly be true, or which we came to learn were incorrect over time. Cassam would argue (alongside others such as Steven Pinker, Yuval Noah Harari, and Joseph Henrich) that adopting more accurate and correct beliefs and promoting knowledge would help us systematically make better decisions to improve the life of our fellow humans. Learning where we were wrong and using science, technology, and information to improve our decision-making has helped our world become less violent, given us more opportunity, provided better nutrition, and allowed us to be more cooperative on a global level.
This is why Cassam addresses paranormal beliefs, superstitions, and conspiratorial thinking in his book. While examining conspiracy theories in depth, he writes, “studies have also found that belief in conspiracy theories is associated with superstitious and paranormal beliefs, and it has been suggested that these beliefs are associated because they are underpinned by similar thinking styles” [the italicized text cites Swami et al. 2011]. Cassam argues that conspiracy theories differ from the other two modes of thinking because they can sometimes be accurate descriptions of the world. Sometimes a politician really is running a corruption scheme, sometimes a group of companies is conspiring to keep prices high, and sometimes a criminal organization is hiding nefarious activities in plain sight. Conspiratorial thinking can, in some instances, reveal real causal connections in the world.
However, conspiratorial thinking is often bizarre and implausible. When it pushes us off the deep end, it shares important characteristics with superstitious and paranormal thinking. All three posit causal connections that cannot possibly exist between real or imagined phenomena, creating explanations that are inaccurate and that prevent us from identifying real information about the world. Superstitions posit causal connections between random and unconnected events, and paranormal thinking posits causal connections between non-existent entities and real world events. Conspiratorial thinking falls in line with both when it is not describing reality.
Over the last few years we have seen how conspiratorial thinking can be vicious, how it can inhibit knowledge, and how it can have real life and death consequences when it goes wrong. Superstitious thinking generally doesn’t seem to have consequences as severe, but it still prevents us from making the best possible decisions and still drives us to adopt incorrect worldviews, sometimes entrenching unfair biases and prejudices. Paranormal thinking has been a foundation of many world religions and of fables used to teach lessons and encourage particular forms of behavior. But if it does not describe the world as it really is, its value is limited, and we should seriously consider the harms that can come from it, such as anxiety, suicide, or hours of lost sleep. These ideas matter because we need to make the best possible decisions based on the most accurate information available if we want to continue advancing human societies, living sustainably, and fostering cooperation and community between all humans on a global scale. Thinking accurately takes practice, so pushing against unwarranted conspiracy theories, superstitions, and paranormal beliefs helps us build the epistemic muscles that improve our thinking overall.
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I have previously written about confidence stemming from the logical coherence of the story we are able to tell ourselves, and about how that coherence comes more easily when we lack key information and have only a limited set of experiences to draw from. The more we know and the more experiences we have, the harder it becomes to construct a narrative that balances conflicting and competing information. Following this logic, the more detailed and complete our information, the less neatly coherent our narrative about the world should be, and the less confidence we should have about anything.


If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, say an investment strategy, then you should probably discount the outcome based on their certainty. They may still be right in the end, but their certainty shouldn’t be what leads you to support the outcome they present as a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”


We tend to be very trusting. Our society and economy run on the trust we place in complete strangers. That inclination toward trust is what makes us so easily fooled by confidence. It is easy to assume that someone who is very confident about something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, a better conception of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in assumptions that have no concrete connection to reality.
Fluency of Ideas

Our experiences and narratives are extremely important to how we judge the world, yet we rarely think deeply about the reasons we hold the beliefs we do. We rarely pause to consider whether our opinions are biased, how our limited set of experiences shapes the narratives that play in our minds, and how this influences our entire outlook on life. Instead, we rely on the fluency of ideas to judge our thoughts and opinions as accurate.


In Thinking Fast and Slow, Daniel Kahneman writes about ideas from the jurist Cass Sunstein and the economist Timur Kuran, explaining their view of fluency: “the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.” It is easy to characterize an entire group of people as hardworking, or lazy, or greedy, or funny based entirely on a single interaction with a single person from that group. We don’t pause to ask whether our interaction with one person really reflects everyone who belongs to the same group; instead we allow the fluency of our past experiences to shape our opinions of all people in that group.


And our ideas, and the fluency with which they come to mind, don’t have to come from our own personal experience. If a claim is repeated often enough, we will have trouble distinguishing it from truth, even if it is absurd and has no connection to reality. The idea will come to mind more fluently, and consequently it will start to feel true. We don’t need direct experience with something if a great marketing campaign has lodged an opinion or slogan in our minds that we can quickly recall.


If we are in an important decision-making role, we need to recognize this fluency bias. The fluency of ideas will drive us toward conclusions that might not be in our best interests. A clever marketing campaign, a trite saying repeated by salient public leaders, or a few extreme yet random personal experiences can bias our judgment. We have to find a way to step back, recognize the narrative at hand, and find reliable data to help us make better decisions; otherwise we may end up judging ideas and making choices based on faulty reasoning.
As an addendum to this post (originally written on 10/04/2020), this morning I began The Better Angels of Our Nature: Why Violence Has Declined, by Steven Pinker. Early in the introduction, Pinker states that violence in almost all forms is decreasing, despite the fact that for many of us it feels as though violence is as front and center in our world as ever before. Pinker argues that our subjective experience of out-of-control violence is in some ways due to the fluency bias that Kahneman describes from Sunstein and Kuran. Pinker writes,


“No matter how small the percentage of violent deaths may be, in absolute numbers there will always be enough of them to fill the evening news, so people’s impressions of violence will be disconnected from the actual proportions.” 


The fluency effect causes an observation to feel correct, even if it is not reflective of actual trends or rates in reality.
Rarely Stumped

Daniel Kahneman starts one of the chapters in his book Thinking Fast and Slow by writing, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.”
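(For the record, arriving at the answer is exactly the kind of slow, deliberate work these rare stumped moments demand: 17 × 24 = (17 × 20) + (17 × 4) = 340 + 68 = 408.)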


When I read this quote I am reminded of Gus, the father, in My Big Fat Greek Wedding. He is always ready to show how every word comes from a Greek root, even a Japanese word like kimono. He is sure of his intellect, sure that his heritage is perfect and is the foundation of all that is good in the world. He trusts his instincts and intuitions to a hilarious extent, even when he is clearly wrong and even when his decisions are gift-wrapped and planted in his mind in an almost Inception style.


His character is part caricature, but it reveals what Kahneman explains in the quote above. Our minds are good at finding intuitive answers that make sense of the world around us, even when we really have no idea what is going on. We laugh at Gus and don’t consider ourselves guilty of behaving like him, but the only difference between most of us and Gus is that he is an exaggeration of the intuitive dogmatism, sense of self-worth, and self-assurance that we all live with.


We scroll through social media, and trust that our initial judgment of a headline or post is the right frame for how to think about the issue. We are certain that our home remedy for tackling bug bites, cleaning windows, or curing a headache is based on sound science, even if it does nothing more than produce a placebo effect. We find a way to fit every aspect of our lives into a comprehensive framework where our decisions appear rational and justified, with us being the hero (or innocent victim if needed) of the story.


We should remember that we have a propensity to believe we are always correct, that we are never stumped. We should pause, ask more questions, think about what is important to know before making a decision, and then deeply interrogate our thoughts to decide whether we have really obtained meaningful information to inform our opinions, or whether we are just acting on instinct, heuristics, self-interest, or groupthink. We cannot keep believing we are right and pushing baseless beliefs onto others when we have no real knowledge of an issue. We shouldn’t assume things are true just because they happen to align with the story we want to believe about ourselves and the world. When it comes to crucial issues and our interactions and relationships with others, we need to think more critically and recognize when we are merely assuming we are right. If we can pause at those times, think more deeply, gather more information, and ask more questions of ourselves, we can have more accurate and honest interactions and relationships. Hopefully this will help us lead more meaningful lives that better connect and develop the community we all need in order to thrive.
Narrative Confidence

We like to believe that having more information will make us more confident in our decisions and opinions. The opposite, however, may be true. I have written in the past about a jam study, in which participants who selected jam from a sample of just a few options were happier with their choice than participants who selected from several dozen options. More information and more choices seem like they should make us happier and more confident in our decisions, but it was the people choosing from the small sample who were the most satisfied.


We like simple stories. They are easy for our brain to construct a narrative around and easy for us to have confidence in. The stories we tell ourselves and the conclusions we reach are often simplistic, often built on incomplete information, and often lack the nuance that is necessary to truly reflect reality. Our brains don’t want to work too hard, and don’t want to hold conflicting information that forces an unpleasant compromise. We don’t want to constantly wonder if we made the right choice, if we should do something different, if we need to try another option. We just want to make a decision and have someone tell us it was a good decision, regardless of the actual outcome or impact on our lives, the lives of others, or the planet.


Daniel Kahneman writes about this in his book Thinking Fast and Slow. He describes a study (not the jam study) in which participants were presented with either one side or both sides of an argument. They had to choose which side they agreed with and rate their confidence. “Participants who saw one-sided evidence were more confident of their judgments than those who saw both sides,” writes Kahneman. “This is just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.”


Learning a lot and truly understanding any given issue is challenging because it means we must build a complex picture of the world. We can’t rely on simple arguments and outlooks on life once we get into the weeds of an issue or topic. We will see that admirable people have tragic flaws. We will see that policies which benefit us may exploit others. We will find that things we wish were true about who we are and the world we live in are only semi-true. Ignorance is bliss in the sense that knowing only a little about the world lets you paint a picture that makes sense to you, but the picture won’t be accurate and it won’t acknowledge the negative externalities the story may create. Simplistic narratives may help us come together as sports fans, or as consumers, or as a nation, but we should all worry about what happens when we have to accept the inaccuracies of our stories. How do we weave a complex narrative that will bring people across the world together in a meaningful and peaceful way without driving inequality and negative externalities? That is the challenge of the age, and unfortunately, the better we try to be at accurately depicting the world we inhabit, the less confident any of us will be about the conclusions and decisions for how we should move forward.
System 1 Success

“The measure of success for System 1 is the coherence of the story it manages to create.”


Daniel Kahneman writes that in his book Thinking Fast and Slow while discussing the quick conclusions of System 1, the part of our mental processing that is fast, intuitive, and operates on simple associations and heuristics.


System 1 stitches together a picture of the world and environment around us with incomplete information. It makes assumptions and quick estimates about what we are seeing and compiles a coherent story for us. And what is important for System 1 is that the story be coherent, not that the story be accurate.


System 2, the part of our thinking that is slower, more rational, and more calculating, is required for making detailed assessments of the information that System 1 takes in. But normally we don’t activate System 2 unless we really need to. If System 1 seems to be making coherent connections and associations, we don’t subject them to further attention and scrutiny from System 2.


It is important that we understand this about our minds. We can go about acting intuitively and believing that our simple narrative is correct, but then we risk believing our own thoughts simply because they feel true, coherent, and in line with our past experiences. Our thoughts will necessarily be inadequate, however, to fully encompass the reality around us. Other people will have different backgrounds, different histories, and different narratives knitted together in their own minds. It is important that we find a way to engage System 2 when the stakes are high, so we can reach more thoughtful conclusions than System 1 can generate. Simply because a narrative feels intuitively correct doesn’t mean it accurately reflects the world around us, or that it will fit within the narrative frameworks other people create.
Mood, Creativity, & Cognitive Errors

In Thinking Fast and Slow, Daniel Kahneman comments on research studying people’s mood and cognitive performance. He writes the following about how we think when we are in a good mood, “when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.”


We think differently when we are in different moods. When we are relaxed and happy, our minds are more creative and our intuitions tend to be more accurate. Kahneman suggests that when we are happy and don’t sense threats, the rational and logical part of our brain lets up, allowing our minds to flow more freely. When we are not worried about our safety, our minds don’t have to examine and interrogate everything in our environment as thoroughly, hence the tendency toward logical errors. A sense of threat activates our deep thinking, making us more logical but also diminishing our intuitive thinking, leaving us less creative and less willing to take risks with our ideas and thoughts.


Kahneman’s discussion of mood, creativity, and cognitive errors reminds me of the research Daniel Pink shares in his book When. Pink finds that we tend to be more creative in the afternoon, once our affect has recovered from the afternoon trough when we all need a nap. With our mood improved toward the end of the day, our minds can return to important cognitive work but are still easily distracted, which allows for more creative thinking. This ties in with Kahneman’s research: we become more relaxed and are willing to let ideas flow across the logical boundaries that previously separated ideas and categories of thought in our minds.


It is important to think about our mood and the tasks we have at hand. If we need to do creative work, we should save it for the afternoon, when our mood improves and we have more capacity for drawing on previously disconnected thoughts and ideas in new ways. We shouldn’t try to cram work that requires logical coherence into times when we are happy and bubbly; our minds simply won’t be operating in the right way to handle the task. When we do our work matters as much as the mood we bring to it, and both may seriously affect the output.