Believing We Are Well Informed

In his book Risk Savvy, Gerd Gigerenzer demonstrated that people often overestimate how much they know about the benefits of cancer screening. “A national telephone survey of U.S. adults,” he writes, “reported that the majority were extremely confident in their decision about prostate, colorectal, and breast screening, believed they were well informed, but could not correctly answer a single knowledge question.” I think this quote reveals something important about the way our minds work. We often believe we are well informed, but that belief, and the confidence that comes with it, is often an illusion.
This is something I have been trying to work on. My initial reaction whenever I hear a fact or a discussion about almost any topic is to position myself as a knowledgeable semi-expert on it. I notice that I do this with ideas and topics that I have really only heard once or twice in a commercial, seen in a headline, or overheard someone talking about. I immediately feel like an expert even though my knowledge is less than surface deep.
I think what is happening in these situations is that I am substituting an easier question for the real one. Instead of asking how much do I actually know about this topic?, I am answering the question can I recall a time when I thought about this thing? Mental substitution is common but hard to detect in the moment. I suspect that the more easily a topic comes to mind, even if it is a topic I know nothing about beyond its name, the more likely I am to feel like an expert.
Gigerenzer’s quote shows that people will believe themselves to be well informed even when they cannot answer a single basic knowledge question about a topic. Rather than substituting the question can I recall a time when I thought about this thing, patients may be substituting yet another question. Instead of evaluating their confidence in their own decision about cancer screening, people may be answering the question do I trust my doctor? Trust in a physician, even without any knowledge about the procedure, may be enough for people to feel extremely confident in their decisions. They don’t have to know much about their health or how a procedure will affect it; they just need to be confident that their physician does.
These types of substitutions are important to recognize. We should try to identify when we are falling victim to the availability bias and when we are substituting easier questions for harder ones. In a well-functioning and accurate healthcare setting these biases and cognitive errors may not harm us much, but in a world of uncertainty, we stand to lose a lot when we fail to recognize how little we actually know. Being honest about our knowledge and thinking patterns can help us develop better systems and structures in our lives to improve and guide our decision-making.
Missing Feedback

I generally think we are overconfident in our opinions. We should all be more skeptical that we are right, that we have made the best possible decisions, and that we truly understand how the world operates. Our worldviews can only be informed by our own experiences and by the information we take in about events, phenomena, and stories in the world. We will always be limited because we can’t take in all the information the world has to offer. And beyond being unable to hold all possible information, we can’t always get the feedback we need for comprehensive learning. Some feedback is hazy, and some feedback is impossible to receive at all. This means we cannot be sure we have made the best choices in our lives, even if things are going well and we are making our best efforts to study the world.


In Nudge, Cass Sunstein and Richard Thaler write, “When feedback does not work, we may benefit from a nudge.” When we can’t get immediate feedback on our choices and decisions, or when the feedback we get is unclear, we can’t adjust appropriately for future decisions. We can’t learn, we can’t improve, and we can’t make better choices when we return to the same decision situation. However, we can observe where situations of poor feedback exist, and we can design those decision spaces to provide subtle nudges that help people choose well in the absence of feedback. Visual aids showing how much money people need for retirement and how much they can expect to have based on current savings rates are a helpful nudge in a situation where we don’t get feedback on how well we are saving. There are devices that glow red or green based on a home’s current energy usage and efficiency, subtly reminding people not to run appliances at peak demand times and giving them feedback on energy usage they normally wouldn’t receive. Nudges such as these can provide feedback, or helpful information in its absence.
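To make the retirement example concrete, here is a minimal sketch of the kind of projection such a visual aid might display. Every number in it (salary, savings rate, return, goal) is a hypothetical assumption chosen for illustration, not financial guidance.

```python
# A toy projection of the feedback a retirement "visual aid" nudge could
# provide. All inputs below are hypothetical assumptions, not advice.

def projected_savings(salary: float, savings_rate: float,
                      annual_return: float, years: int) -> float:
    """Future value of saving a fixed share of salary each year."""
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + salary * savings_rate
    return balance

goal = 1_000_000  # hypothetical retirement target
have = projected_savings(salary=60_000, savings_rate=0.10,
                         annual_return=0.05, years=30)

# The nudge: immediate feedback on where current habits lead.
print(f"Projected savings at retirement: ${have:,.0f} ({have / goal:.0%} of goal)")
```

Seeing a number like “40% of goal” substitutes for the feedback that saving behavior otherwise never generates.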


Sunstein and Thaler also write, “many of life’s choices are like practicing putting without being able to see where the balls end up, and for one simple reason: the situation is not structured to provide good feedback. For example, we usually get feedback only on the options we select, not the ones we reject.” Missing feedback is an important consideration because its absence shapes how we understand the world and how we make decisions. The fact that we cannot get feedback on options we never chose should be nearly paralyzing. We can’t say how the world works if we never experiment and try something different. We can settle into a decent rhythm and routine, but we may be missing out on better lifestyles, happier lives, or better societies that different choices would have produced. We can never receive feedback on these non-choices. I don’t know that this means we should constantly experiment at the cost of settling in with the feedback we can receive, but I do think it means we should discount our own confidence and accept that we don’t know all there is to know. It also means we should look to increase nudges, use more visual aids, and structure our choices and decisions in ways that maximize useful feedback for future decision-making.
Overcoming Group Overconfidence

Overcoming group overconfidence is hard, but in Thinking, Fast and Slow, Daniel Kahneman offers one partial remedy: a premortem. As opposed to a postmortem, an analysis of why a project failed, a premortem looks at why a project might fail before it has started.


Group communication is difficult. When the leader of a group is enthusiastic about an idea, it is hard to disagree with them. If you are a junior member of a team, it can be uncomfortable, and potentially even damaging to your career, to doubt the ideas a senior leader is excited about. If you have concerns, you are unlikely to bring them up, especially in a group meeting surrounded by other seemingly enthusiastic team members.


Beyond the silencing of members who have concerns but don’t want to speak up, another problem contributes to overconfidence among teams: groupthink. Particularly in groups that lack diversity, groupthink can crush the planning stage of a project. When everyone has similar backgrounds, similar experiences, and similar styles of thinking, it is unlikely that anyone in the group will hold a viewpoint significantly different from the prevailing wisdom of the rest. What seems like a good idea or the correct decision to one person probably feels correct to everyone else; there is no one in the room with any doubts or alternative perspectives.


Premortems help teams get beyond groupthink and the fear of speaking up against a powerful and enthusiastic leader. The idea is to brainstorm all the possible ways a project might fail. It includes an element of creativity: everyone imagines that the project is finished, and that it either succeeded while running well over budget and far behind schedule after a turbulent series of events, or failed completely and never reached its intended end point. People then describe the issues that came up and why the project did not reach the rosy outcome everyone initially pictured. Imagining that these failures have actually taken place gets people to step beyond groupthink and encourages them to highlight roadblocks that particularly enthusiastic members overlook.


Because a premortem is hypothetical, it gives people a chance to speak up about failure points and weaknesses in plans and ideas without appearing to criticize the person the idea came from. It creates a safe space for imagining the barriers and obstacles that must be overcome to achieve success, and it reduces groupthink by encouraging a creative flow of ideas about failure points. As Kahneman writes, “The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier.”


Overcoming group overconfidence is possible, but it needs the right systems and structures to happen. Groupthink and fear are likely to prevent people from bringing up real doubts and threats, but a premortem allows those concerns to be aired and seriously considered. It helps get people to look beyond the picture of success they intuitively connect with, and it helps prevent enthusiastic supporters from getting carried away with their overconfidence.
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I have previously written about how confidence stems from the logical coherence of the story we are able to tell ourselves, and about how logical coherence in personal narratives is easier to achieve when we lack key information and have a limited set of experiences to draw from. The more we know and the more experiences we have, the harder it becomes to construct a narrative that balances conflicting and competing information. Laddering up from this point, we should see that the more detailed and complete our information, the less coherent and tidy our narrative about the world should be, and the less confident we should be about anything.


If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, say an investment strategy, you should probably discount the outcome based on their certainty. They may still be right in the end, but their certainty shouldn’t be what leads you to support the outcome they present as a sure thing. As Daniel Kahneman writes in Thinking, Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”


We tend to be very trusting. Our society and economy run on the trust we place in complete strangers. That inclination toward trust is what makes us so easily fooled by confidence. It is easy to assume that someone with a lot of confidence is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions without any concrete connection to reality.
Should You Be So Confident?

Are you pretty confident that your diet is a healthy option for you? Are you confident in the outcome of your upcoming job interview? And how confident are you that you will have enough saved for retirement? Whatever your level of confidence, you might want to reconsider whether you should be as confident as you are, or whether you are just telling yourself a narrative that you like and that makes you feel comfortable with the decisions you have made.


In Thinking, Fast and Slow, Daniel Kahneman writes the following about confidence:


“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”


We feel confident in our choices, decisions, and predictions about the future when we can construct a coherent narrative. When we have limited information and experience, it is easy to fit what we know together into a simplified, logical story. The more conflicting and complex the information and knowledge we obtain, and the more diverse the experiences and viewpoints we adopt, the harder it is to construct a simple narrative, and the harder it is for our story about the world to hold together in a way that makes us confident about anything.


A high level of confidence doesn’t represent reality, and it may actually reflect a lack of understanding of reality and all of its complexities. We are confident that our diet is good when we cut out ice cream and cookies, but we don’t really know that we are getting sufficient nutrients for our bodies and our lifestyles. We don’t really know how we perform in a job interview, but if we left feeling that we really connected and remembered to say the things we prepared, then we might be confident that we will land the job. And if we have a good retirement savings program through our job and also contribute to an IRA, we might feel that we are doing enough for retirement and be confident that we will be able to retire at 65, but few of us really do the calculations to ensure we are contributing what we need, and none of us can predict what housing or stock markets will look like as we get closer to retirement. Confidence is necessary for us to function in the world without being paralyzed by fear and never-ending cycles of analysis, but we shouldn’t mistake confidence in ourselves or in other people for actual certainty and knowledge.
Ignore Our Ignorance

There is a quote attributed to Harry Truman along the lines of, “give me a one-handed economist.” It references the frustration any key decision-maker feels when faced with challenging and sometimes conflicting information and choices. On the one hand is a decision with a predicted set of outcomes, but on the other hand is another decision or a separate, undesirable set of consequences. The quote shows how challenging it is to understand and navigate the world when you have a complex and nuanced understanding of what is happening.


Living in ignorance actually makes choices and decisions easier – there is no other hand of alternative choices, negative consequences, or different points of view. Ignoring our ignorance feels preferable when we live within our own narrative constructions, where what we see is all there is and reality is what we make it.


Daniel Kahneman writes about this in Thinking, Fast and Slow, where he explains how these narrative fallacies lead to so many of our predictable cognitive errors: “Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”


When I think about Kahneman’s quote, I think about myself upon graduating with a Master’s in Public Administration and Policy, and about my older sister upon her high school graduation. My sister has held strong political views for a very long time, views she readily adopted as a high school student. Her self-assured profession of those views, contrasted against the equally self-assured political views of my parents, is part of what sparked my interest in studying political science and public policy. I wanted to understand how people became so sure of political views that I didn’t fully understand, but which I could see contained multitudes of perspectives, benefits, and costs.


At the completion of my degree I felt that I had a strong understanding of the political processes of the United States. I could understand how public policy was shaped and formed, and I could describe how people came to hold various points of view and why some people might favor different policies. But what I did not gain was a sense that any particular political approach was necessarily correct or inherently better than the others. So much of our political process depends on who stands to benefit, what is in our individual self-interest, and what our true goals happen to be. At the end of my study of politics, I felt that I knew more than many, but I did not feel that my political opinions were any stronger than my sister’s were when she graduated high school. Her opinions were formed in ignorance (I don’t mean that unkindly!), and her limited perspective allowed her to be more confident in her opinions than I could be with my detailed and nuanced understanding of political systems and processes.


Our views of the world and our understanding of reality are shaped by the information we absorb and the experiences we have. What you see is all there is, and the narrative you live within will make more sense the more ignorant you are of the complexities of the world around you. Your narrative will be simpler and more coherent, since there won’t be any other hands to weigh against your opinions, desires, and convictions.
Narrative Confidence

We like to believe that having more information will make us more confident in our decisions and opinions. The opposite, however, may be true. I have written in the past about a jam study, in which participants who selected jam from a sample of just a few options were happier with their choice than participants who selected from several dozen options. It seems like more information and more choices should make us happier and more confident in our decisions, but those who chose from the small sample were the more satisfied group.


We like simple stories. They are easy for our brain to construct a narrative around and easy for us to have confidence in. The stories we tell ourselves and the conclusions we reach are often simplistic, often built on incomplete information, and often lack the nuance that is necessary to truly reflect reality. Our brains don’t want to work too hard, and don’t want to hold conflicting information that forces an unpleasant compromise. We don’t want to constantly wonder if we made the right choice, if we should do something different, if we need to try another option. We just want to make a decision and have someone tell us it was a good decision, regardless of the actual outcome or impact on our lives, the lives of others, or the planet.


Daniel Kahneman writes about this in Thinking, Fast and Slow. He describes a study (not the jam study) in which participants were presented with either one side or both sides of an argument. They had to choose which side they agreed with and rate their confidence. “Participants who saw one-sided evidence were more confident of their judgments than those who saw both sides,” writes Kahneman. “This is just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.”


Learning a lot and truly understanding any given issue is challenging because it means we must build a complex picture of the world. We can’t rely on simple arguments and outlooks once we get into the weeds of an issue. We will see that admirable people have tragic flaws. We will see that policies which benefit us may exploit others. We will find that things we wish were true about who we are and the world we live in are only semi-true. Ignorance is bliss in the sense that knowing only a little about the world allows you to paint a picture that makes sense to you, but the picture won’t be accurate, and it won’t acknowledge the negative externalities the story may create. Simplistic narratives may help us come together as sports fans, as consumers, or as a nation, but we should all worry about what happens when we have to confront the inaccuracies of our stories. How do we weave a complex narrative that brings people across the world together in a meaningful and peaceful way without driving inequality and negative externalities? That is the challenge of the age, and unfortunately, the better we become at accurately depicting the world we inhabit, the less confident any of us will be about the conclusions and decisions for how we should move forward.
Confident, But Wrong

We like confident people. We like people who can tell us something direct and simple to understand while being confident in the statements they make. It makes our job as a receiver easier. We can trust someone with confidence because surely they have thought out what they say, and surely their lack of ambivalence or hesitation means they have solid evidence and a logical coherence to the ideas they are expressing.


The problem, however, is that confidence and accuracy are not actually linked. We can be very confident in something that isn’t accurate, true, or correct. Worse still, it can be hard for us to recognize on our own when our confidence is misplaced. As Daniel Kahneman writes in his book Thinking, Fast and Slow, “We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are.”


We need to surround ourselves with thoughtful people with expertise in important areas where we will be making crucial decisions. We need to collect input from more than one person before we express complete confidence in another person, idea, or prediction. In the real world, this isn’t often possible, but it is something we should at least be aware of.


Trusting confident people is a way of answering an easier question in place of a more difficult one. The difficult question might be: should we invest in this mutual fund or that one, or should we adopt an entirely different strategy that doesn’t involve mutual funds? Instead of asking how we should invest our savings and doing the difficult research to find the best strategy for ourselves, we switch to a different question and ask, “do I trust the financial advisor giving me this investment advice?” This is an easier question to answer. If the advisor sounds smart, has awards on their desk or wall, and exudes confidence, they will appear more trustworthy, and we will believe what they say. They can present their advice with great confidence, be totally wrong, and we will likely follow their recommendations anyway.


As Kahneman explains, however, outside observers can help us overcome these confidence traps, both in ourselves and in how we perceive others. A reliable person with knowledge of a given area can help us think through our arguments to determine whether we should be as confident as we are. They can also help us evaluate the claims of others, to determine whether that confidence is well deserved or needs more scrutiny. What is important to remember is that we use confidence as a heuristic, and sometimes we can be confident but wrong in our thoughts and opinions on a given subject.

We Might Be Wrong

“If you can be sure of being right only 55 percent of the time,” writes Dale Carnegie in the book How to Win Friends and Influence People, “you can go down to Wall Street and make a million dollars a day. If you can’t be sure of being right even 55 percent of the time, why should you tell other people they are wrong?”


We always feel so sure of our judgments and conclusions. From the health and safety of GMO foods, to the impacts of a new tax, to who is going to win the Super Bowl, we are often very confident people. The world seems to always want our opinions, and we are usually very excited to offer them with a staggering amount of confidence. This has led to a lot of funny social media posts about people being incorrect about history, science, and sports; more seriously, it can create thinking errors that lead nations to invade countries for poor reasons, contribute to mechanical failures of spacecraft and oil platforms, and cause us to lose huge sums of money when the game doesn’t turn out the way we knew it would.


I think a good practice is to look for areas where we feel a high degree of confidence, and then try to ascribe a confidence level to our thoughts. We can tie our confidence levels back to real-world events to help ground our predictions: the chance of being dealt a blackjack in a given hand is 4.83%; Steph Curry’s 3-point shooting percentage is 43.5%; and the chance of getting heads in a coin flip is, of course, 50%. Can you anchor your confidence (or the chance you are wrong) to one of these percentages?
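To show how this anchoring exercise could work in practice, here is a small sketch. The blackjack figure falls out of simple combinatorics (4 aces times 16 ten-value cards, over all two-card hands), the other anchors are the percentages quoted above, and the helper function and its name are my own invention.

```python
from math import comb

# Chance of being dealt a natural blackjack from a fresh 52-card deck:
# one of 4 aces paired with one of 16 ten-value cards (10, J, Q, K).
p_blackjack = (4 * 16) / comb(52, 2)  # 64 / 1326, about 4.83%

# Reference probabilities from the post to anchor a confidence level against.
ANCHORS = {
    "being dealt a blackjack": p_blackjack,
    "Steph Curry hitting a three": 0.435,
    "a coin landing heads": 0.50,
}

def nearest_anchor(p_wrong: float) -> str:
    """Map an estimated chance of being wrong to the closest reference event."""
    name, p = min(ANCHORS.items(), key=lambda kv: abs(kv[1] - p_wrong))
    return f"Being wrong here is roughly as likely as {name} ({p:.1%})."

# E.g., if I think there is a 5% chance my prediction is wrong:
print(nearest_anchor(0.05))
```

Translating a fuzzy feeling of certainty into one of these concrete events makes it much harder to claim, implicitly, that we are 99% sure of everything.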


I haven’t studied this (so I could be wrong – I’d wager the chance that I’m wrong and this exercise is not helpful sits somewhere around Steph Curry’s 3-point percentage), but I would expect that doing this type of exercise would help us recognize how overconfident we often are. It might even help us get to the next step: admitting that we might be wrong and considering different possibilities. Carnegie continues:


“You will never get into trouble by admitting that you may be wrong. That will stop all argument and inspire your opponent to be just as fair and open and broad-minded as you are. It will make him want to admit that he, too, may be wrong.”


The important thing to remember is that the world is incredibly complex, and our minds are only so good at absorbing new data and articulating a comprehensive understanding of the information we synthesize. We should be more willing to consider ways in which our beliefs may be inaccurate, and more willing to listen to reasonable people (especially those who have demonstrated expertise or effective thinking skills) when they suggest an idea that does not conform to our prior beliefs. Try not to be closed-minded and overly confident in your own beliefs, and you will be better at avoiding thinking errors and making good long-term decisions.

Becoming Less Wrong

Continuing his focus on confidence in the book Act Accordingly, Colin Wright states, “A confident person doesn’t fear having been wrong: she’s just happy to be more right now than she was before.” This quote expresses one of Wright’s core principles from his books Act Accordingly and Considerations. He continually focuses on adopting as many new perspectives as possible and learning from new situations and discussions. The quote suggests that adaptive people become more confident because they are not locked into belief systems at the expense of learning and growing.


I really enjoy focusing on perspectives because we each have a unique view of the world based on the information we take in, our backgrounds, ambitions, expectations, and other often hidden factors. With so many forces shaping and changing who we are, it is not surprising that we can all interpret an event, idea, or feeling differently. Wright argues that we should seek out as many varying perspectives as possible so that we can understand others and begin to see things from multiple angles. When we focus on finding varied perspectives, we avoid believing that there is one correct answer, and we become less judgmental of others.


Wright’s quote speaks to me about the discussions we may have on a daily basis around hefty topics such as politics or religion. In these two areas in particular, people tend to become very entrenched and unchanging in their ideas. This limits them to a single perspective, for which they seek out confirmation and agreement rather than differing perspectives and challenges. A person without confidence will hide behind their idea and find excuses for why other perspectives are wrong. More confident people will allow their ideas to change because they understand that as they learn and take in more information, their perspective will shift, and they will begin to see things with greater clarity. Adopting a single mindset and ideology and never allowing it to change means shutting out other perspectives and limiting your growth. Opening up your ideology will allow you to connect with others and see the world in a better light.