Recognize Situations Where Mistakes Are Common

“Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent,” writes Daniel Kahneman in Thinking, Fast and Slow. System 1 is how Kahneman describes the intuitive, quick-reacting part of our brain that continually scans the environment and filters the information going to System 2, the more thoughtful, deliberate, calculating, and rational part of our brain. Biases in human thought often originate with System 1. When System 1 misreads a situation, makes a judgment on a limited set of information, or inaccurately perceives something about the world, System 2 ends up working with a poor data set and is likely to reach faulty conclusions.

Kahneman’s book focuses on common cognitive errors and biases, not in the hope that we can radically change our brains and no longer fall victim to prejudices, optical illusions, and cognitive fallacies, but in the hope that we can increase our awareness of how the brain and our thinking go off the rails, helping us marginally improve our thought processes and final conclusions. Kahneman writes, “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”

If we are aware that we will make snap judgments the instant we see a person, before either of us has spoken a single word, we can learn to adjust our behavior to prevent an instantaneous bias from coloring the entire interaction. If we know we are making a crucial decision about how to invest our finances for retirement, we can pause and recall examples from Kahneman’s book: we have a tendency to answer simpler questions, to favor things that are familiar, and to trust other people based on factors that don’t truly align with trustworthiness. Kahneman doesn’t think his book and his discussions of cognitive fallacies will make us experts in investing, but he does think his research can help us recognize the errors we might make in an investment situation and improve the way we make some important decisions. Understanding how our biases may be shaping a decision can help us improve it.

Self-awareness and situational awareness are crucial for accurately understanding the world and making good decisions based on sound predictions. It is important to know whether you can trust an educated guess from yourself or others, and to recognize when your confidence is unwarranted. It is important to know when your opinions carry weight, and when your direct observations might be incomplete and misleading. In most instances of our daily lives the stakes are low, and the costs of cognitive biases and errors are low, but in some situations, like serving on a jury, driving on the freeway, or choosing whether to hire someone, our (and other people’s) livelihoods could be on the line. We should honestly recognize the biases and limitations of the mind so we can recognize situations where mistakes are common, and hopefully make fewer mistakes when it matters most.

Blind to Our Blindness

I remember the first time I watched the gorilla attentiveness study as a freshman in college, and to this day it is one of my favorite examples of the ways in which our brains can let us down. Writing about the study in his book Thinking, Fast and Slow, Daniel Kahneman states, “The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.” Kahneman uses the study to show that we can’t always trust what we see, or what we experience in the world more broadly. Our minds are limited in what they take in, especially when we are engaged with one task and our mind is filtering out the other noise and extra information in our environment.

Kahneman uses the study to support two major ideas that he presents in his book. The first is that our brains can only operate on the information they take in. Most of the time, our general perception of the world is guided by System 1, the term Kahneman uses to describe the automatic, fast, and intuitive part of our brain. It is not literally a separate structure in the brain, but it does seem to be a system with specific functions that generally runs in the background as we go about our lives. That system filters out unimportant information in the world around us, like the feeling of our clothes on our skin, low-level traffic noise outside our office, or a bee buzzing at the edge of our peripheral vision outside a window. That data is ignored as unimportant, allowing us to instead engage System 2 on something more worthy of our attention.

Kahneman uses System 2 to describe the attentive, energy-demanding, logical part of our brain. The modules in the brain which allow us to write blog posts, count basketball passes, and thread string through a needle comprise what Kahneman describes as System 2. However, System 2 can only focus on a limited number of things at one time. That is why we can’t write blog posts on a subway, and why we miss the gorilla. We have to ignore the noise in order to focus on the important things. What is worse, System 2 is often dependent on information from System 1, and System 1 is subject to biases and blind spots and has a bad habit of using inferences to complete the full picture from a limited set of information. System 1’s biases feed directly into the intense focus and logical thinking of System 2, which in turn causes us to reach faulty conclusions. And because the inferences from System 1 are usually pretty good, and do an adequate job completing the picture, our faulty conclusions appear sound to us.

Kahneman writes that we are blind to the obvious, meaning that we often miss important, even crucial, information simply because we don’t look for it, don’t recognize it for what it is, or fill in the gaps with intuition. Quite often we are not even aware of the things we are blind to; we are literally blind to our blind spots, making it harder to see how we could be wrong, where our cognitive biases and errors may lie, and what could be done to make our thinking more accurate.

I try to remember this in my own life and to ask myself where I think I could be wrong. I try to be aware of instances where I am deliberately creating blind spots in my life, and I try at least marginally to push against such tendencies. It is important that we remember our biases and errors in thinking, and consider how our thinking is often built on blind spots and faulty conclusions. Doing so will help us be more generous when thinking of others, and will help us become better thinkers ourselves. It will help us pause when we reach a conclusion about an argument, think more broadly when we become upset, and shift away from System 1 biases to have more accurate and complete pictures of the world.

We Think of Ourselves as Rational

In his book Thinking, Fast and Slow, Daniel Kahneman lays out two ways of thinking about our thought processes, which he calls System 1 and System 2. System 1 is fast, automatic, often subconscious, and usually pretty accurate in terms of making quick judgments, assumptions, and estimations of the world. System 2 is where our heavy-duty thinking takes place. It is where we crunch through math problems, where the rational, problem-solving part of the brain is in action, and it’s the system that uses a lot of energy to help us remember important information and understand the world.

Despite the fact that we normally operate on System 1, that is not the part of our brain we think of as ourselves. Kahneman writes, “When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do.” We believe ourselves to be rational agents, responding reasonably to the world around us. We see ourselves as free from bias, as logically coherent, and as considerate and understanding. Naturally, it is System 2 that we believe we spend most of our time with; however, this is not exactly the case.

A lot of our actions are influenced by factors that play out more at the System 1 level than the System 2 level. If you are extra tired, if you are hungry, or if you feel insulted by someone close to you, then you probably won’t be thinking as rationally and reasonably as you would expect. You are likely to operate on System 1, sometimes making faulty assumptions about the world around you based on incomplete data. If you are hungry or tired enough, you will effectively be operating on autopilot, letting System 1 take over as you move about the cabin.

Even though we often operate on System 1, we feel as though we operate on System 2 because the part of us that thinks back to how we behaved, the part of us required for serious reflection, is part of System 2. It is critical, thoughtful, and takes its time generating logically coherent answers. System 1 is quick and automatic, so we don’t even notice when it is in control. When we think about who we are, why we did something, and what kind of person we aspire to be, it is System 2 that is flying the plane, and it is System 2 that we become aware of, fooling ourselves into believing that System 2 is all we are, that System 2 is what is really in our head. We think of ourselves as rational, but that is only because our irrational System 1 can’t pause to reflect back on itself. We only see the rational part of ourselves, and it is comforting to believe that is really who we are.

Thinking Statistically

In Thinking, Fast and Slow, Daniel Kahneman personifies two modes of thought as System 1 and System 2. System 1 is fast. It takes in information, processes it rapidly, and doesn’t always make us cognizant of the information it has taken in. It reacts to the world around us on an intuitive level; it isn’t good at math, but it is great at positioning us to catch a football.

System 2 is slow. It is deliberate, calculating, and uses a lot of energy to maintain. Because it requires so much energy, we don’t actually activate it very often, not unless we really need to. What is worse, unless we have a lot of time to pause specifically for information intake, System 2 can only operate on the information that System 1 takes in, meaning it often processes incomplete information.

System 1 and System 2 are important to keep in mind when we start to think statistically, something our minds are not good at. When we think back to the 2016 US Presidential election, we can see how hard statistical thinking is. Clinton was favored to win, but there was a statistical chance that Trump would win, which is what happened. The chance was small, but that didn’t mean the models were all wrong when he won; it just meant that the most likely forecasted outcome didn’t materialize. We had trouble thinking statistically about win probabilities going into the election, and trouble understanding an unlikely outcome after it happened.
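
To see why an unlikely outcome doesn’t discredit a forecast, a quick simulation can help. This is a minimal sketch assuming a hypothetical 30 percent win probability for the underdog, a stand-in figure rather than any particular forecaster’s number:

```python
import random

# Hypothetical forecast: the underdog wins with probability 0.3.
# (An illustrative number, not any real model's estimate.)
p_underdog = 0.3
trials = 100_000

# Simulate many elections under the forecast's assumptions.
underdog_wins = sum(random.random() < p_underdog for _ in range(trials))

print(f"Underdog won {underdog_wins / trials:.1%} of simulated elections")
```

Run it and the underdog wins roughly 30 percent of the time. Seeing that outcome once in reality is entirely consistent with the forecast; it is our statistical intuition, not the model, that struggles.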

“Why is it so difficult for us to think statistically?” Kahneman asks in his book. “We easily think associatively, we think metaphorically, we think causally, but statistics requires thinking about many things at once, which is something that System 1 is not designed to do.”

System 1 operates quickly and cheaply. It takes less energy and effort to run on System 1, but because it is subject to bias and because it makes judgments on incomplete information, it is not reliable for important decisions and calculations that turn on nuance. We have to engage System 2 to be great at thinking statistically, but statistical thinking still trips up System 2, because it is hard to think about multiple competing outcomes at the same time and weight them appropriately. In Risk Savvy, Gerd Gigerenzer shows that statistical thinking can be substantially improved, and that we really can think statistically, but that we need some help from visual aids and tools so that our minds can grasp statistical concepts better. We have to help System 1 so that it can set up System 2 for success if we want to be good at thinking statistically.
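
One tool Gigerenzer is known for is restating probabilities as natural frequencies, counts of individual cases rather than percentages. Here is a minimal sketch using invented screening numbers (not figures from any study) to show why counts are easier for the mind to grasp:

```python
# Natural frequencies: restate probabilities as counts of people.
# All numbers below are invented for illustration.
population = 1000
base_rate = 0.01       # 10 in 1,000 people have the condition
sensitivity = 0.90     # 9 of those 10 test positive
false_positive = 0.09  # about 89 of the 990 healthy people also test positive

sick = population * base_rate
true_positives = sick * sensitivity
false_positives = (population - sick) * false_positive

# Of everyone who tests positive, how many actually have the condition?
share = true_positives / (true_positives + false_positives)
print(f"{true_positives:.0f} of {true_positives + false_positives:.0f} "
      f"positives are real: {share:.0%}")
```

Framed as counts, 9 real positives out of roughly 98 total positives, the surprising answer (about 9 percent) is easy to see, whereas the same question posed in percentages trips most people up.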

In the framework that Kahneman lays out, with a quick-reacting System 1 running in power-save mode with limited processing power, and a System 2 operating on the incomplete information System 1 aggregates, statistical thinking is nearly impossible. System 1 can’t bring in enough information for System 2 to analyze appropriately. As a result, we fall back on biases, or substitute an easier question for the challenging statistical one. Gigerenzer argues that we can think statistically, but that we need the appropriate framing and cues for System 1, so that System 2 can do the number crunching and legwork that is needed. In the end, statistical thinking doesn’t happen quickly, and it requires an ability to hold competing and conflicting information in the mind at the same time, which makes it hard for us to think statistically rather than anecdotally or metaphorically.

Answering the Easy Question

One of my favorite pieces from Daniel Kahneman’s book Thinking, Fast and Slow was the research Kahneman presented on mental substitution. Our brains work very quickly, and we don’t always recognize the times when our thinking has moved in a direction we didn’t intend. Our thinking seems to flow logically and naturally from one thought to the next, and we don’t notice the times when our brains make logical errors or jumps that are less than rational. Mental substitution is a great example of this, and one that I know my own brain commits, but that I often have trouble seeing even when I know to look for it.

Kahneman writes, “When the question is difficult and a skilled solution is not available, intuition still has a shot: an answer may come to mind quickly – but it is not an answer to the original question.” 

The example that Kahneman uses is of a business executive making a decision on whether to invest in Ford. To make a smart decision, the executive has to know what trends in the auto industry look like and whether Ford is well positioned to adapt to changing economic, climate, and consumer realities. They need to know what Ford’s competition is doing and think about how Ford has performed relative to other automobile companies and how the automotive sector has performed relative to other industries. The decision requires thinking about a lot of factors, and the executive’s time is limited, along with the amount of information they can hold in their head, especially given the other responsibilities at home and in the office that the executive has to worry about.

To simplify the decision, the executive might choose to answer a simpler question, as Kahneman explains: “Do I like Ford cars?” If the executive grew up driving a Ford truck, if they really liked the 1965 Mustang, or if the only car crash they were ever involved in was when a person driving a Ford rear-ended them, their decision might be influenced by an intuitive sense of Ford cars and the people who drive them. And if the executive has personally met someone on Ford’s executive team, they may be swayed by whether or not they liked the person. Instead of asking a large question about Ford the company, they might substitute an easier question about a single Ford executive.

“This is the essence of intuitive heuristics,” writes Kahneman, “when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.” 

Often, we already have a certain feeling in mind, and we switch the question being asked so that we can answer in line with our intuition. I grew up driving a Ford, so I might be inclined to favor investing in Ford. I might answer the question of investing in Ford before I am even asked, and then, instead of objectively setting out to review a lot of information, I might just cherry-pick the information that supports my original inclination. I’m substituting an easier question for the one at hand, and I might even provide myself with plenty of information to support my choice, but it is likely biased and misguided information.

It is important to recognize when these prejudices and biases are influencing our decisions. By being aware of how we feel when asked a question, we can think critically about whether we are being honest with the question that was asked of us. Are we truly answering the right question, or have we substituted a question that is easier for us to answer?

In the question of cars and investments, the cost might not truly be a big deal (at least if you have a well-diversified portfolio in other respects), but if we are talking about public policy that could be influenced by racial prejudice or by how deserving we think another group of people is, then our biases could be very dangerous. If we think that a certain group of people is inherently greedy, lazy, or unintelligent, then we might substitute the question, “Will this policy lead to the desired social outcome?” with the question, “Do I like the group that stands to benefit the most from this policy?” The results of answering the wrong question could be disastrous, and could harm our society for years.

Expert Intuition

Much of Daniel Kahneman’s book Thinking, Fast and Slow is about the breakdowns in our thinking processes, especially regarding the mental shortcuts we use to make decisions. The reality of the world is that there is too much information, too many stimuli, too many things we could focus on and consider at any given time for us to take in everything and make a comprehensive decision. Instead, we rely on shortcuts, use our intuition, and make estimates that help us with our decision-making. Usually we do just fine with this whole process, and that is why we rely so much on these shortcuts, but sometimes cognitive errors and biases can drive us off a cliff.

However, Kahneman stresses that all is not lost. Our intuition can be very reliable if we develop true expertise in the area where we are putting our intuition to the test. As Kahneman writes, “Valid intuitions develop when experts have learned to recognize familiar elements in a new situation and to act in a manner that is appropriate to it.”

We can make predictions, we can learn to recognize commonalities between situations, and even on a subconscious level we can absorb and recall information to use in decisions. The key to using our intuition successfully is walking a careful line between mastery and arrogance. It requires self-awareness to know what we know, to understand an area well enough that we can trust our intuition, and to know what we don’t know, so that we don’t make judgments beyond our area of expertise.

While much of Kahneman’s research (the majority of which I’m going to be writing about) is focused on problematic heuristics, predictable cognitive errors, and hidden mental biases, it is important to know when we can trust our intuition and where our thinking doesn’t lead us astray. There are times when developing expertise through practice and experience can help us make better decisions. Even if we are not the expert, we can recognize and learn from those who do have expertise, paying attention when their intuitions forecast something important. Getting a sense for how well the mind can work, and how well humans can think and forecast when they have the right information and knowledge, is powerful if we want to think positively about what our future might hold. At the same time, we also have to understand how thinking fast can get us in trouble, and where our expert intuitions may fail us.
Luck & Stories of Success

Luck & Stories of Success

There are some factors within individual control that influence success. Hard work is clearly important, good decision-making is important, and an ability to cooperate and work well with others is also important. But none of these factors is sufficient for success on its own, or at least many prominent thinkers and researchers seem to agree that they are not sufficient. One very successful researcher who would agree that these character, personality, and individual traits are not enough is Daniel Kahneman, the Nobel Prize-winning professor from Princeton.

Kahneman’s research, the Nobel Prize-winning portion of which was conducted with Amos Tversky, was incredibly successful and influential within psychology and economics. But remembering the lessons of his own research, Kahneman writes the following about his academic journey and the studies he shares in his book:

“A recurrent theme of this book is that luck plays a large role in every story of success; it is almost always easy to identify a small change in the story that would have turned a remarkable achievement into a mediocre outcome. Our story was no exception.”

In anything we do, a certain amount of luck is necessary for any level of success, and much of that luck is beyond our control. Some songs really take off and become major hits, even if the song is objectively not as catchy or as good as other songs (is there any other way to explain Gangnam Style?). Sometimes a mention by a celebrity or already famous author can ignite the popularity of another writer, and sometimes a good referral can help jump-start the popularity of a restaurant. We can work hard, put our best product forward, and make smart choices, but the level of success we achieve can sometimes be as random as the right person telling another right person about what we are doing.

Timing, connections, and fortunate births are all luck factors that we don’t control but that can hugely influence our stories. Being born without a disability or costly medical condition can allow you to save for a rainy day. Being born in a country with functioning roads and postal services can allow you to embark on a new business venture. And happening to have a neighbor who knows somebody who can help your kid get into a good college can allow your child and family to move up in ways that might have otherwise been impossible.

There are certain things we can do to prepare ourselves to take advantage of good luck, but we need to recognize how important luck is. We have to acknowledge it and remember that our stories are full of luck, and that not everyone’s story contains luck as good as ours. We can’t assume that our success was all due to our good personal traits and habits (a cognitive error that Kahneman discusses in his book). To fully understand the world, we have to look at it objectively, and that requires that we think critically and honestly about the good luck we have had.

Confident, But Wrong

We like confident people. We like people who can tell us something direct and simple to understand while being confident in the statements they make. It makes our job as a receiver easier. We can trust someone with confidence because surely they have thought out what they say, and surely their lack of ambivalence or hesitation means they have solid evidence and a logical coherence to the ideas they are expressing.

The problem, however, is that confidence and accuracy are not actually linked. We can be very confident in something that isn’t accurate, true, or correct. What is even worse, it can be hard for us to recognize when our own confidence is misplaced. As Daniel Kahneman writes in his book Thinking, Fast and Slow, “We are often confident even when we are wrong, and an objective observer is more likely to detect our errors than we are.”

We need to surround ourselves with thoughtful people with expertise in important areas where we will be making crucial decisions. We need to collect input from more than one person before we express complete confidence in another person, idea, or prediction. In the real world, this isn’t often possible, but it is something we should at least be aware of.

Trusting confident people is a way of answering an easier question in place of a more difficult question. The question might be whether we should invest in this mutual fund or that one, or whether we should adopt a totally different investment strategy that doesn’t involve mutual funds. Instead of asking ourselves how we should invest our savings and doing the difficult research to understand the best strategy for ourselves, we switch to a different question and ask, “Do I trust the financial advisor giving me the investment advice?” This is an easier question for us to answer. If the advisor sounds smart, has awards on their desk or wall, and exudes confidence, then they are going to appear more trustworthy, and we will believe what they say. They can present their advice with a lot of confidence but be totally wrong, and we will likely go with their recommendations anyway.

As Kahneman explains, however, outside observers can help us overcome these confidence traps, both in ourselves and in how we perceive others. A reliable person with knowledge of a certain area can help us think through our arguments to determine whether we should be as confident as we are. They can help us evaluate the claims of others, to determine whether their confidence is well deserved or needs more scrutiny. What is important to remember is that we use confidence as a heuristic, and sometimes we can be confident but wrong in our thoughts and opinions on a given subject.

Kahneman’s Hope

Daniel Kahneman opens his book on cognitive biases, thinking errors, and observed processes within the field of cognitive psychology in an interesting place. Thinking, Fast and Slow begins with Kahneman praising gossip and explaining his hope for his readers. He does not hope that they will avoid gossip and stop talking about others behind their backs. He hopes readers will gossip better, and understand the thought processes, the mental limitations of the human mind, and the mental errors that go into all of our gossip.

Kahneman writes, “The hope for informed gossip is that there are distinctive patterns in the errors people make. Systematic errors are known as biases, and they recur predictably in particular circumstances.”

What he hopes is that understanding how the brain works will help us all have better conversations at the water cooler, or at lunch with a colleague, or in the evening when we get home and want to vent to a spouse or parent or pet. Gossip can be a powerful tool in developing and shaping the norms of society, and if we are going to give gossip so much power, we should at least do our best to ensure that our gossip is well informed and accurate, and that we understand how our minds reach their gossipy conclusions.

Certainly Kahneman’s real hope is that presenting his ideas in a form more people can access than academic journals will lead to fewer negative externalities in the world from biases, prejudices, and simple cognitive errors. For most people, however, Kahneman thinks the water cooler gossip forum is where his ideas and research will really impact conversations.

The point is that the human mind doesn’t exactly work the way we tend to think it does. It feels as though one thought flows rationally from another, as though we are observant, considerate, and willing to come to conclusions based on fact and observed reality. Through the research in his book, Kahneman shows us that our brains are predictable in the errors they make. They are not as rational as we believe, and our thoughts don’t flow coherently from one idea to the next. The observations we make are always incomplete relative to the full information of the reality around us, and our choices and actions are motivated far more by what we want to believe is true than by what is actually true. Knowing all of this, Kahneman hopes, will make us more cognizant and reflective in our gossip, hopefully helping the world become a slightly more accurate and enjoyable place to be.

Gossip Machines

Humans are gossip machines. We like to talk about and think about other people, especially the negative traits and qualities of others. At the same time, we are self-deception machines. We downplay our own faults, spend little time thinking about our mistakes, and deny the negative qualities within ourselves. Even when we are the only audience for our thoughts, we hide our own flaws and instead nitpick all the things we dislike about other people.

As Daniel Kahneman writes in his book Thinking, Fast and Slow, “it is much easier, as well as far more enjoyable, to identify and label the mistakes of others than to recognize our own.”

But gossip isn’t necessarily as bad as we usually make it out to be. It is definitely not a good thing to constantly speak ill of other people, to find faults in others, and to ignore our own shortcomings. Doing so can make us vain, destroy our relationships with friends and family, and give us a bad reputation among the people in our lives. And yet we all engage in gossip. It pops up on social media today, in movies from the ’80s and ’90s, and even in the journals of our nation’s founding fathers. Gossip seems to have always been with us, and while we are quick to highlight its evils, it also seems to be an important part of human society.

Kahneman continues, “The expectation of intelligent gossip is a powerful motive for serious self-criticism, more powerful than New Year resolutions to improve one’s decision making at work and at home.”

We do not live in a vacuum. We are not isolated from society and other humans, and as a result we understand ourselves and think about ourselves in relation to other people. We partake in gossip, and we know that other people gossip about us. This creates an important constraint on our actions and behaviors, shaping the way we live our lives. Knowing that other people will judge us prevents many negative behaviors, such as reckless driving, living in unsanitary conditions, or being deliberately mean to other people. While gossip certainly has a lot of problems, it does shape how we behave in society with other people, sometimes in positive directions.

We might not want to think about our own flaws, but knowing that humans are gossip machines forces us to at least consider, some of the time, how we will appear to other people. This can drive us to act in accordance with social norms, and it can be the bedrock of a society that can cooperate and coordinate efforts and goals. Without gossip, we might have a harder time bringing ourselves together to live in harmony.