I remember the first time I watched the
Gorilla Attentiveness Study as a freshman in college, and to this day it remains one of my favorite examples of the ways in which our brains can let us down. Writing about the study in his book
Thinking, Fast and Slow, Daniel Kahneman states,
“The gorilla study illustrates two important facts about our minds: we can be blind to the obvious, and we are also blind to our blindness.” Kahneman uses the study to show that we can’t always trust what we see, or what we experience in the world more broadly. Our minds are limited in what they take in, especially when we are engaged with one task and filtering out the noise and extra information in our environment.
Kahneman uses the study to support two major ideas that he presents in his book. The first is that our brains can only operate on the information they take in. Most of the time, our general perception of the world is guided by System 1, the term Kahneman uses to describe the automatic, fast, and intuitive part of our thinking. It is not literally a separate structure in the brain, but it does seem to be a system with specific functions that generally runs in the background as we go about our lives. That system filters out unimportant information in the world around us, like the feeling of our clothes on our skin, low-level traffic noise outside our office, or a bee buzzing at the edges of our peripheral vision outside a window. That data is ignored as unimportant, allowing us to instead engage System 2 on something more worthy of our attention.
Kahneman uses System 2 to describe the attentive, energy-demanding, logical part of our brain. The modules in the brain that allow us to write blog posts, count basketball passes, and thread string through a needle comprise what Kahneman describes as System 2. However, System 2 can only focus on a limited number of things at one time. That is why we can’t write blog posts on a subway and why we miss the gorilla. We have to ignore the noise in order to focus on the important things. What is worse, however, is that System 2 often depends on information from System 1, and System 1 is subject to biases and blind spots, with a bad habit of using inferences to complete the full picture from a limited set of information. System 1’s biases feed directly into the intense focus and logical thinking of System 2, which in turn causes us to reach faulty conclusions. And because the inferences from System 1 are usually pretty good, and do an adequate job completing the picture, our faulty conclusions appear sound to us.
Kahneman writes that we are blind to the obvious, meaning that we often miss important, sometimes glaringly obvious, information simply because we don’t look for it, don’t recognize it for what it is, or fill in gaps with intuition. Quite often we are not even aware of the things we are blind to; we are literally blind to our blind spots, making it harder to see how we could be wrong, where our cognitive biases and errors may be, and what could be done to make our thinking more accurate.
I try to remember this in my own life and to ask myself where I think I could be wrong. I try to be aware of instances where I am deliberately creating blind spots in my life, and I try, at least marginally, to push against such tendencies. It is important that we remember our biases and errors, and consider how our thinking is often built on blind spots and faulty conclusions. Doing so will help us be more generous when thinking of others, and will help us become better thinkers ourselves. It will help us pause when we reach a conclusion about an argument, think more broadly when we become upset, and shift away from System 1 biases toward more accurate and complete pictures of the world.