Believing We Are Well Informed

In his book Risk Savvy, Gerd Gigerenzer demonstrated that people often overestimate how much they know about the benefits of cancer screening. “A national telephone survey of U.S. adults,” he writes, “reported that the majority were extremely confident in their decision about prostate, colorectal, and breast screening, believed they were well informed, but could not correctly answer a single knowledge question.” I think this quote reveals something important about the way our minds work. We often believe we are well informed, but that belief, and the confidence that comes with it, is often an illusion.
This is something I have been trying to work on. My initial reaction whenever I hear a fact or a discussion about almost any topic is to position myself as a knowledgeable semi-expert on it. I have noticed that I do this with ideas and topics I have really only heard once or twice in a commercial, seen in a headline, or overheard someone talking about. I immediately feel like an expert even though my knowledge is often less than surface deep.
I think that what is happening in these situations is that I am substituting a different question for the question of whether I actually have expertise or knowledge. Instead of asking how much I really know, I am answering the question, “Can I recall a time when I thought about this thing?” Mental substitution is common, but hard to actually detect. I suspect that the more easily a topic comes to mind, even if it is a topic I know nothing about beyond its name, the more likely I am to feel like an expert.
Gigerenzer’s quote shows that people will believe themselves to be well informed even if they cannot answer a basic knowledge question about the topic. Rather than substituting the question “Can I recall a time when I thought about this thing?”, patients may be substituting another question entirely. Instead of analyzing their confidence in their own decision regarding cancer screening, people may be answering the question, “Do I trust my doctor?” Trust in a physician, even without any knowledge about the procedure, may be enough for people to feel extremely confident in their decisions. They don’t have to know a lot about their health or how a procedure is going to impact it; they just need to be confident that their physician does.
These types of substitutions are important for us to recognize. We should try to identify when we are falling victim to the availability bias and when we are substituting questions that are easier for us to answer. In a well-functioning and accurate healthcare setting these biases and cognitive errors may not harm us too much, but in a world of uncertainty, we stand to lose a lot when we fail to recognize how little we actually know. Being honest about our knowledge and thinking patterns can help us develop better systems and structures in our lives to improve and guide our decision-making.
Substitution Heuristics

I think heuristics are underrated. As a society, we should discuss them far more than we do. We barely acknowledge heuristics, but if we look closely, they are at the heart of many of our decisions, beliefs, and assumptions. They save us a lot of work and help us move through the world fairly smoothly, yet they are rarely discussed directly or even recognized.

 

In Thinking, Fast and Slow, Daniel Kahneman highlights heuristics in the sense of substitution, distinguishing between the two questions involved:

 

“The target question is the assessment you intended to produce.
The heuristic question is the simpler question that you answered instead.”

 

I have already written about our brains substituting easier questions for harder questions, but the idea of heuristics gives the process a deeper dimension. Kahneman defines a heuristic by writing, “The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions.”

 

In my own life, and I imagine I am a relatively average case, I have relied on heuristics to make a huge number of decisions. I don’t know the best possible investment strategy for my retirement, but as a heuristic, I know that working with an investment advisor to manage mutual funds and IRAs can be an adequate (even if not perfect) way to save for the future. I don’t know the healthiest possible foods to eat or which combinations will maximize my nutrient intake, but as a heuristic, I can make sure I have a colorful plate with varied veggies and not too many sweets so that I get enough of the vitamins and nutrients I need.

 

We have to make a lot of difficult decisions in our lives. Most of us don’t have the time or the ability to compile all the information we need on a given subject to make a fully informed decision, and even if we try, most of us don’t have a reasonable way to sort through contrasting and competing information to determine what is true and what the best course of action would be. Instead, we make substitutions and use heuristics to figure out what we should do. Instead of recognizing that we are using heuristics, however, we ascribe a higher level of confidence and certainty to our decisions than is warranted. What we do, how we live, and what we believe become part of our identity, and we fail to recognize that we are adopting a heuristic to achieve some version of what we believe to be a good life. When pressed to think about it, our mind creates a justification for our decision that doesn’t acknowledge the heuristics in play.

 

In a world where we were quicker to recognize heuristics, we might be able to live with a little distance between ourselves, our decisions, and our beliefs. We could acknowledge that heuristics are driving us, and be more open to change and more willing to be flexible with others. Acknowledging that we don’t have all the answers (that we don’t even have all the necessary information) and that we are operating on substitution heuristics for complex questions might help us be less polarized and better connected within our society.
Answering the Easy Question

One of my favorite pieces from Daniel Kahneman’s book Thinking, Fast and Slow was the research Kahneman presented on mental substitution. Our brains work very quickly, and we don’t always recognize the times when our thinking has moved in a direction we didn’t intend. Our thinking seems to flow logically and naturally from one thought to the next, and we don’t notice the times when our brains make logical errors or jumps that are less than rational. Mental substitution is a great example of this, and one that I know my brain engages in, but that I often have trouble seeing even when I know to look for it.

 

Kahneman writes, “When the question is difficult and a skilled solution is not available, intuition still has a shot: an answer may come to mind quickly – but it is not an answer to the original question.” 

 

The example that Kahneman uses is of a business executive making a decision on whether to invest in Ford. To make a smart decision, the executive has to know what trends in the auto industry look like and whether Ford is well positioned to adapt to changing economic, climate, and consumer realities. They need to know what Ford’s competition is doing and think about how Ford has performed relative to other automobile companies and how the automotive sector has performed relative to other industries. The decision requires thinking about a lot of factors, and the executive’s time is limited, along with the amount of information they can hold in their head, especially given the other responsibilities at home and in the office that the executive has to worry about.

 

To simplify the decision, the executive might choose to answer a simpler question, which Kahneman frames as, “Do I like Ford cars?” If the executive grew up driving a Ford truck, if they really liked the 1965 Mustang, or if the only car crash they were ever involved in happened when a person driving a Ford rear-ended them, their decision might be influenced by an intuitive sense of Ford cars and the people who drive them. Likewise, if the executive has personally met someone on Ford’s leadership team, they may be swayed by whether or not they liked the person they met. Instead of asking a large question about Ford the company, they might substitute an easier question about a single member of Ford’s executive team.

 

“This is the essence of intuitive heuristics,” writes Kahneman, “when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.” 

 

Often, we already have a certain feeling in mind, and we switch the question being asked so that we can answer in line with our intuition. I grew up driving a Ford, so I might be inclined to favor investing in Ford. I might answer the question of investing in Ford before I am even asked, and then, instead of objectively setting out to review a lot of information, I might just cherry-pick the information that supports my original inclination. I’m substituting an easier question for the one at hand, and I might even provide myself with plenty of information to support my choice, but that information is likely biased and misguided.

 

It is important to recognize when these prejudices and biases are influencing our decisions. By being aware of how we feel when asked a question, we can think critically about whether we are answering it honestly. Are we truly answering the question that was asked, or have we substituted one that is easier for us to answer?

 

In the question of cars and investments, the cost of answering the wrong question might not be a big deal (at least if you have a well-diversified portfolio in other respects), but if we are talking about public policy that could be influenced by racial prejudice or by how deserving we think another group of people is, then our biases could be very dangerous. If we think that a certain group of people is inherently greedy, lazy, or unintelligent, then we might substitute the question, “Will this policy lead to the desired social outcome?” with the question, “Do I like the group that stands to benefit the most from this policy?” The results of answering the wrong question could be disastrous and could harm our society for years.

An Illusion of Security, Stability, and Control

The online world is a very interesting place. While we frequently say that we have concerns about privacy, about how our data is being used, and about what information is publicly available, very few people delete their social media accounts or take real action when a data breach occurs. We have been moving more and more of our lives online, and we have been more accepting of internet-connected devices that can be hacked or quietly used to spy on us than we would expect, given how much time we spend expressing concern for our privacy.

 

A quick line from Tyler Cowen’s book The Complacent Class may explain the contradiction. “A lot of our contentment or even enthrallment with online practices may be based on an illusion of security, stability, and control.”

 

I just read Daniel Kahneman’s book Thinking, Fast and Slow, and in it he writes about a common cognitive error: substitution. When we are asked difficult questions, we often substitute a simpler question that we can answer, yet we rarely realize that we are doing so. Cowen’s insight suggests that we rely on this kind of substitution when we evaluate online practices.

 

Instead of thinking deeply and critically about our privacy, safety, and the security of our personal or financial information in a given context, we substitute. We ask ourselves, “Does this website intuitively feel legitimate and well put together?” If the answer is yes, we are more likely to enter our personal information, allow our online movements to be tracked, enter our preferences, and save our credit card number.

 

If the site’s matching and recommendation technology works well, if our order is fulfilled, and if we are provided with more content that we can continue to enjoy, we will substitute again. Instead of asking whether our data is safe or whether the value we receive exceeds the risk of having our information available, we will ask whether we are satisfied with what was provided and whether we liked the look and feel of what we received. We can pretend to answer the hard questions with illusory answers to easier questions.

 

In the end, we land in a place where the companies and organizations operating on the internet have little incentive to improve their systems, to innovate in ways that create disruptive changes, or to pursue big leaps forward. We are already content, and we are not actually asking the hard questions that might push innovation forward. This contentment breeds stagnation and prevents us from seeing the risks that exist behind the curtain. We live in an illusion that we control our information online, that we know how the internet works, and that things are stable and will continue to work even if the outside world is chaotic. This could be a recipe for a long-term disaster that we won’t see coming, because we believe we are safely in control when we are not.