Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how the logical coherence of personal narratives is easier to maintain when we lack key information and have a limited set of experiences to draw from. The more we know and the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and neatly logical our narrative about the world will be, and the less confidence we should have about anything.


If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, say an investment strategy, then you should probably discount the outcome in proportion to their certainty. They may still be right in the end, but their certainty shouldn’t be the factor that leads you to support the outcome they present as a sure thing. As Daniel Kahneman writes in Thinking, Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”


We tend to be very trusting. Our society and economy run on the trust that we place in complete strangers. That inclination toward trust is what causes us to be so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions even when they have no concrete connection to reality.
Fluency of Ideas

Our experiences and narratives are extremely important to consider when we make judgments about the world; however, we rarely think deeply about the reasons why we hold the beliefs we do. We rarely pause to consider whether our opinions are biased, whether our limited set of experiences shapes the narratives that play in our minds, and how this influences our entire outlook on life. Instead, we rely on the fluency of ideas to judge our thoughts and opinions as accurate.


In Thinking, Fast and Slow, Daniel Kahneman writes about ideas from Cass Sunstein and Timur Kuran, explaining their view of fluency: “the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.” It is easy to characterize an entire group of people as hardworking, or lazy, or greedy, or funny based entirely on a single interaction with a single person from that group. We don’t pause to ask whether our interaction with one person is really a good reflection of everyone who belongs to the same group; instead, we allow the fluency of our past experiences to shape our opinions of all people in that group.


And our ideas, and the fluency with which those ideas come to mind, don’t have to come from our own personal experience. If a claim is repeated often enough, we will have trouble distinguishing it from truth, even if it is absurd and doesn’t have any connection to reality. The idea will come to mind more fluently, and consequently it will start to feel true. We don’t have to have direct experience with something if a great marketing campaign has lodged an opinion or slogan in our minds that we can quickly recall.


If we are in an important decision-making role, we have to recognize this fluency bias. The fluency of ideas will drive us toward a set of conclusions that might not be in our best interests. A clever marketing campaign, a trite saying repeated by prominent public leaders, or a few extreme yet random personal experiences can bias our judgment. We have to find a way to step back, recognize the narrative at hand, and find reliable data to help us make better decisions; otherwise we might end up judging ideas and making decisions based on faulty reasoning.

As an addendum to this post (originally written on 10/04/2020), this morning I began The Better Angels of Our Nature: Why Violence Has Declined, by Steven Pinker. Early in the introduction, Pinker states that violence in almost all forms is decreasing, despite the fact that for many of us, it feels as though violence is as front and center in our world as ever before. Pinker argues that our subjective experience of out-of-control violence is in some ways due to the fluency bias that Kahneman describes from Sunstein and Kuran. Pinker writes,


“No matter how small the percentage of violent deaths may be, in absolute numbers there will always be enough of them to fill the evening news, so people’s impressions of violence will be disconnected from the actual proportions.” 


The fluency effect causes an observation to feel correct, even if it is not reflective of actual trends or rates in reality.
Rarely Stumped

Daniel Kahneman starts one of the chapters in his book Thinking, Fast and Slow by writing, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.”


When I read this quote, I am reminded of Gus, the father in My Big Fat Greek Wedding. He is always ready to show how every word comes from a Greek root, even a Japanese word like kimono. He is sure of his intellect, sure that his heritage is perfect and is the foundation of all that is good in the world. He trusts his instincts and intuitions to a hilarious extent, even when he is clearly wrong and even when his decisions are gift-wrapped and planted in his mind in almost Inception-like fashion.


His character is part caricature, but it is revealing of what Kahneman explains with the quote above. Our minds are good at finding intuitive answers that make sense of the world around us, even if we really don’t have any idea what is going on. We laugh at Gus and don’t consider ourselves guilty of behaving like him, but the only difference between most of us and Gus is that Gus is an exaggeration of the intuitive dogma and sense of self-worth and assurance that we all live with.


We scroll through social media, and trust that our initial judgment of a headline or post is the right frame for how to think about the issue. We are certain that our home remedy for tackling bug bites, cleaning windows, or curing a headache is based on sound science, even if it does nothing more than produce a placebo effect. We find a way to fit every aspect of our lives into a comprehensive framework where our decisions appear rational and justified, with us being the hero (or innocent victim if needed) of the story.


We should remember that we have a propensity to believe that we are always correct, that we are never stumped. We should pause, ask more questions, think about what is important to know before making a decision, and then deeply interrogate our thoughts to decide whether we really have obtained meaningful information to inform our opinions, or whether we are just acting on instinct, heuristics, self-interest, or groupthink. We cannot continue believing we are right and pushing baseless beliefs onto others when we have no real knowledge of an issue. We shouldn’t assume things are true just because they happen to align with the story we want to believe about ourselves and the world. When it comes to crucial issues and our interactions and relationships with others, we need to think more critically and recognize when we are merely assuming we are right. If we can pause at those times and think more deeply, gather more information, and ask more questions of ourselves, we can have more accurate and honest interactions and relationships. Hopefully this will help us lead more meaningful lives that better connect us to, and better develop, the community we all need in order to thrive.
Narrative Confidence

We like to believe that having more information will make us more confident in our decisions and opinions. The opposite, however, may be true. I have written in the past about a jam study, in which participants who selected jam from a sample of just a few jams were happier with their choice than participants who selected from several dozen options. More information and more choices seem like they should make us happier and more confident in our decisions, but those who chose from the small sample were the ones most satisfied with their selection.


We like simple stories. They are easy for our brain to construct a narrative around and easy for us to have confidence in. The stories we tell ourselves and the conclusions we reach are often simplistic, often built on incomplete information, and often lack the nuance that is necessary to truly reflect reality. Our brains don’t want to work too hard, and don’t want to hold conflicting information that forces an unpleasant compromise. We don’t want to constantly wonder if we made the right choice, if we should do something different, if we need to try another option. We just want to make a decision and have someone tell us it was a good decision, regardless of the actual outcome or impact on our lives, the lives of others, or the planet.


Daniel Kahneman writes about this in his book Thinking, Fast and Slow. He describes a study (not the jam study) where participants were presented with either one side or two sides of an argument. They had to choose which side they agreed with and rate their confidence. “Participants who saw one-sided evidence were more confident of their judgments than those who saw both sides,” writes Kahneman. “This is just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.”


Learning a lot and truly understanding any given issue is challenging because it means we must build a complex picture of the world. We can’t rely on simple arguments and outlooks on life once we start to get into the weeds of an issue or topic. We will see that admirable people have tragic flaws. We will see that policies which benefit us may exploit others. We will find that things we wish to be true about who we are and the world we live in are only semi-true. Ignorance is bliss in the sense that knowing only a little bit about the world allows you to paint a picture that makes sense to you, but that picture won’t be accurate, and it won’t acknowledge the negative externalities the story may create. Simplistic narratives may help us come together as sports fans, or as consumers, or as a nation, but we should all be worried about what happens when we have to accept the inaccuracies of our stories. How do we weave a complex narrative that will bring people across the world together in a meaningful and peaceful way without driving inequality and negative externalities? That is the challenge of the age, and unfortunately, the better we try to be at accurately depicting the world we inhabit, the less confident any of us will be in our conclusions and decisions about how to move forward.
System 1 Success

“The measure of success for System 1 is the coherence of the story it manages to create.”


Daniel Kahneman writes that in his book Thinking, Fast and Slow when discussing the quick conclusions of System 1, the part of our mental processing that is fast, intuitive, and operates on simple associations and heuristics.


System 1 stitches together a picture of the world and environment around us with incomplete information. It makes assumptions and quick estimates about what we are seeing and compiles a coherent story for us. And what is important for System 1 is that the story be coherent, not that the story be accurate.


System 2, the slower, more rational and calculating part of our thinking, is what is required for making detailed assessments of the information that System 1 takes in. But normally we don’t activate System 2 unless we really need to. If System 1’s connections and associations seem coherent, we don’t subject them to additional attention and scrutiny from System 2.


It is important that we understand this about our minds. We can go about acting intuitively and believing that our simple narrative is correct, but we risk believing our own thoughts simply because they feel true, coherent, and in line with our past experiences. Our thoughts will necessarily be inadequate, however, to fully encompass the reality around us. Other people will have different backgrounds, different histories, and different narratives knitted together in their own minds. It’s important that we find a way to engage System 2 when the stakes are high, so that we can reach more thoughtful judgments than System 1 can generate. Simply because a narrative feels intuitively correct doesn’t mean that it accurately reflects the world around us, or that it creates a picture of the world that will fit within the narrative frameworks other people create.
Mood, Creativity, & Cognitive Errors

In Thinking, Fast and Slow, Daniel Kahneman comments on research studying people’s mood and cognitive performance. He writes the following about how we think when we are in a good mood: “when in a good mood, people become more intuitive and more creative but also less vigilant and more prone to logical errors.”


We think differently when we are in different moods. When we are relaxed and happy, our minds are more creative and our intuitions tend to be more accurate. Kahneman suggests that when we are happy and don’t sense threats, the rational and logical part of our brain lets up, allowing our minds to flow more freely. When we are not worried about our safety, our minds don’t have to examine and interrogate everything in our environment as thoroughly, hence the tendency toward logical errors. A sense of threat activates our deep thinking, making us more logical, but it also diminishes the capacity of our intuitive thinking, making us less creative and less willing to take risks with our ideas and thoughts.


The research from Kahneman about mood, creativity, and cognitive errors reminds me of the research Daniel Pink shares in his book When. Pink finds that we tend to be more creative later in the day, once our affect has recovered from the afternoon trough when we all need a nap. As our mood improves toward the end of the day, Pink suggests, our minds are able to return to important cognitive work but are still easily distracted, which allows for more creative thinking. This seems to tie in with the research from Kahneman. We become more relaxed, and we are willing to let ideas flow across the logical boundaries that had previously separated ideas and categories of thought in our minds.


It is important that we think about our mood and the tasks we have at hand. If we need to do creative work, we should save it for the afternoon, when our moods improve and we have more capacity for drawing on previously disconnected thoughts and ideas in new ways. We shouldn’t try to cram work that requires logical coherence into times when we are happy and bubbly; our minds simply won’t be operating in the right way to handle the task. When we do work is as important as the mood we bring to it, and both the timing and the mood may seriously impact the output.
Performance and Mood

We are in the middle of a global health pandemic, but it comes at a time when companies are starting to radically re-think the work environments they set up for their employees. I worked for a time at a tech company based in the San Francisco Bay Area and saw firsthand the changing thinking about how companies relate to their employees. Every day wasn’t a party, but companies like the one I worked for were beginning to recognize how important a healthy, happy, and agreeable workforce is to productivity and good outcomes as a whole. The pandemic has forced companies to think even more deeply about these things, and some blend of remote and office work schedules will likely remain for a huge number of employees. Hopefully, we will walk away from the pandemic with workplaces that better align with the demands placed on people today, and hopefully we will be happier in our work environments.


Research that Daniel Kahneman presents in his book Thinking, Fast and Slow suggests that adapting workplaces to better accommodate employees and help them be happier with their work could have huge positive impacts for our futures. Regarding tests of intuitive accuracy, Kahneman shares the following about people’s performance and their mood:


“Putting participants in a good mood before the test by having them think happy thoughts more than doubled accuracy. An even more striking result is that unhappy subjects were completely incapable of performing the intuitive task accurately; their guesses were no better than random.”


Our mood impacts our thoughts and our thinking processes. When we are happy, we are better at making intuitive connections and associations. If we need to be productive, accurate, and intuitive, then we better have an environment that supports a relatively high level of happiness.


If our work environment does the opposite, if we are overwhelmed by stress and must deal with toxic culture issues, then it is likely that we will be less accurate with our tasks. We won’t perform as well, and those who depend on our work will receive sub-par products.


There is likely a self-perpetuating effect in both scenarios. A happy person is likely to perform better and be praised for their good outcomes, improving their happiness and reinforcing their good work. But someone who is unhappy will likely perform poorly and is more likely to be reprimanded, leading to more unhappiness and continued unsatisfactory performance. For these reasons it is important that companies take steps to help put their employees in a good mood while working. This requires more than motivational posters; it requires real relationships and inclusion in important decisions about the workspace. In the long run, boosting mood among employees can have a huge impact, especially if the good results reinforce more positive feelings and continued high-quality output. The changes in work schedules and locations forced on employers by the pandemic can provide an opportunity for employers to think about the demands they place on employees, and about what they can do to ensure their employees have healthy, safe workplaces that encourage positive moods and productivity.
Familiarity vs Truth

People who wish to spread disinformation don’t have to try very hard to get people to believe that what they are saying is true, or that their BS at least has some element of truth to it. All it takes is frequent repetition. “A reliable way to make people believe in falsehoods,” writes Daniel Kahneman in his book Thinking, Fast and Slow, “is frequent repetition, because familiarity is not easily distinguished from truth.”


Having accurate and correct representations of the world feels important to me. I really love science. I listen to lots of science-based podcasts, love sciency discussions with family members and friends, and enjoy reading science books. By accurately understanding how the world operates, by seeking to better understand the truth of the universe, and by developing better models and systems to represent the way nature works, I believe we can find a better future. I try not to fall unthinkingly into techno-utopian thinking, but I do think that having accurate beliefs and understandings is important for improving the lives of people across the planet.


Unfortunately, for many people, I don’t think accurate and correct understandings of the world have such a high priority in their lives. I fear that religion and science may be incompatible or at odds with each other, and that there may be a willingness to accept inaccurate science or beliefs in order to support religious doctrine. I also fear that people in extractive industries may discount science, preferring to hold an inaccurate belief that supports their ability to profit through their extractive practices. Additionally, the findings, conclusions, and recommendations from science may just be scary for many ordinary people, and accepting what science says might be inconvenient or might require changes in lifestyle that people don’t want to make. When we are in these situations, it isn’t hard to imagine why we might turn away from scientific consensus in favor of something comfortable but wrong.


And this is where accurate representations of the universe face an uphill battle. Inaccuracies don’t need to be convincing, don’t really need to sound plausible, and don’t need to come from credible authorities. They just need to be repeated on a regular basis. When we hear something over and over, we start to become familiar with the argument, and we start to have trouble telling truth and falsehood apart. This happened in 2016, when the number one word associated with Hillary Clinton was emails. It happened with global warming, when enough people suggested that human-related CO2 emissions were not connected to the climate change we see. And it happens every day in trite sayings and ideas, from trickle-down economics to the claim that popping your knuckles causes arthritis.


I don’t think that disproving inaccuracies is the best route to solving the problem of familiarity vs truth. I think the only thing we can hope to do is amplify those ideas, conclusions, experiments, and findings which accurately reflect the true nature of reality. We have to focus on what is true, not on all the misleading nonsense that gets repeated. We must repeat accurate statements about the universe so that they are what become familiar, rather than the mistaken ideas that become hard to distinguish from the truth.
Misdiagnosis

Healthcare spending has been increasing, but it is easy to see that we have a finite set of healthcare resources available to everyone. We only have so many hospitals, there are only so many doctors available, and our healthcare plans are all tied together, so if one person uses a large amount of healthcare, everyone paying into the health plan will see their costs rise. This is one of the reasons why it is so important to make sure we are getting the best care possible with our healthcare dollars, and why it is so important that we ensure everyone gets the right treatment at the right time.


As Dave Chase writes in The Opioid Crisis Wake-Up Call, “A senior executive at a Fortune 10 company wisely told me that misdiagnosis is the biggest healthcare error; everything that follows both harms the patient and costs you.” 


If we don’t get the diagnosis piece right for patients, then they get the wrong care. They take medications that don’t help them, undergo procedures that don’t address the correct issue, and eventually return for more evaluation and diagnostic testing. The patient can be harmed by drug side-effects, by surgeries that were never needed, and by exposure to radiation from diagnostic imaging.


Getting the diagnosis wrong also wastes a huge amount of our finite healthcare resources. Each new appointment to try to get the diagnosis right, to do more testing and screening, or to try a new procedure leads to increased costs for the individual and everyone else. Doctors’ offices have to fit in more appointments, patients have to fill more prescriptions to try new medications, and operating rooms are booked for the wrong procedures. Patients face delays and have to pay more for their care.


It is important that we focus on getting the correct diagnosis at the beginning. I’m not a physician, and I haven’t spent years connected to the healthcare system, so I can’t tell you exactly where the breakdown is in finding the right diagnosis, but the costs to patient health and to healthcare resources make it clear that we should invest in diagnostic capabilities. We don’t need to spot every little thing in a patient’s body, but we do need good enough diagnostics and enough knowledge and understanding to get the right diagnosis the first time, for the good of our bank accounts and, more importantly, for the good of our collective health.