Recognize Situations Where Mistakes Are Common

“Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent,” writes Daniel Kahneman in Thinking, Fast and Slow. System 1 is Kahneman’s name for the intuitive, quick-reacting part of our brain that continually scans the environment and filters the information passed along to System 2, the more thoughtful, deliberate, calculating, and rational part of our brain. Biases in human thought often originate with System 1. When System 1 misreads a situation, makes a judgment from a limited set of information, or inaccurately perceives something about the world, System 2 ends up working with a poor data set and is likely to reach faulty conclusions.


Kahneman’s book focuses on common cognitive errors and biases, not in the hope that we can radically change our brains and no longer fall victim to prejudices, optical illusions, and cognitive fallacies, but in the hope that we can become more aware of how our thinking goes off the rails and thereby marginally improve our thought processes and final conclusions. Kahneman writes, “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”


If we are aware that we make snap judgments the instant we see a person, before either of us has spoken a single word, we can learn to adjust our behavior so that an instantaneous bias doesn’t color the entire interaction. If we know we are making a crucial decision about how to invest our finances for retirement, we can pause and recall examples from Kahneman’s book: we tend to answer a simpler question than the one actually asked, we tend to favor what is familiar, and we tend to trust other people based on factors that don’t truly align with trustworthiness. Kahneman doesn’t claim that his book and his discussions of cognitive fallacies will make us experts in investing, but he does think his research can help us see the biases we bring to an investment decision and improve the way we make some important decisions. Understanding how our biases may be shaping a decision can help us improve it.


Self-awareness and situational awareness are crucial for accurately understanding the world and making good decisions based on sound predictions. It is important to know whether you can trust an educated guess from yourself or others, and to recognize when your confidence is unwarranted. It is important to know when your opinions carry weight, and when your direct observations might be incomplete and misleading. In most of daily life the stakes are low and the cost of a cognitive error is small, but in some situations, like serving on a jury, driving on the freeway, or deciding whether to hire someone, our (and other people’s) livelihoods could be on the line. We should honestly acknowledge the biases and limitations of the mind so we can recognize the situations where mistakes are common, and hopefully make fewer mistakes when it matters most.
