Scared Before You Even Know It

In Thinking, Fast and Slow, Daniel Kahneman demonstrates how quickly our minds react to potential dangers and threats by showing us two very simple pictures of eyes. The pictures are black squares with a little white space that our brains immediately perceive as eyes, and beyond that immediate perception, our brains also read an emotional expression in the eyes. They are similar to the simple eyes I sketched out here:

In my sketch, the eyes on the left are aggressive and threatening. Our brains pick up on the threat they pose, and we have physiological responses before we can consciously think through the fact that those eyes are just a few lines drawn on paper. The same thing happens with the eyes on the right, which our brains recognize as anxious or worried. Our bodies have a quick fear reaction, and our brains go on guard in case there is something we need to be anxious or worried about as well.

Regarding a study in which subjects in a brain scanner were shown a threatening picture for less than 2/100 of a second, Kahneman writes, “Images of the brain showed an intense response of the amygdala to a threatening picture that the viewer did not recognize. The information about the threat probably traveled via a superfast neural channel that feeds directly into a part of the brain that processes emotions, bypassing the visual cortex that supports the conscious experience of seeing.” The study was designed so that the subjects were not consciously aware of having seen an image of threatening eyes, but their brains perceived it nevertheless, and their bodies reacted accordingly.

The takeaway from this kind of research is that our environments matter and that our brains respond to more than what we are consciously aware of. Subtle cues and factors around us can shape the way we behave and feel about where we are and what is happening. We might not know why we feel threatened, and we might not even realize that we feel threatened, but our heart rate may be elevated, we might tense up, and we might become short and defensive in certain situations. When we think back on why we behaved a certain way, why we felt the way we did, and why we reacted the way we did, we won’t be able to recognize these subtle cues that never rose to the level of consciousness. We won’t be able to explain why we felt threatened; all we will be able to recall is the physiological response we had to the situation. We are influenced by far more than our conscious brain is aware of, and we should remember that while our conscious brain doesn’t provide us with a perfect picture of reality, our subconscious reacts to more of the world than we notice.

Ignore Our Ignorance

There is a quote attributed to Harry Truman along the lines of, “Give me a one-handed economist.” The quote captures the frustration of any key decision-maker faced with challenging and sometimes conflicting information and choices. On the one hand is a decision with a predicted set of outcomes, but on the other hand is another decision or a separate, undesirable set of consequences. The quote shows how challenging it is to understand and navigate the world when you have a complex and nuanced understanding of what is happening.

Living in ignorance actually makes choices and decisions easier – there is no other hand of separate choices, negative consequences, or different points of view. Ignoring our ignorance feels preferable when we live within our own narrative constructions, where what we see is all there is and reality is what we make it to be.

In his book Thinking, Fast and Slow, Daniel Kahneman writes about how these narrative fallacies lead to so many of our predictable cognitive errors: “Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

When I think about Kahneman’s quote, I think about myself upon graduating with a Master’s in Public Administration and Policy, and my older sister upon her high school graduation. My sister has had strong political views for a very long time, views that she readily adopted as a high school student. Her self-assured profession of her political views, contrasted against the equally self-assured political views of my parents, is part of what sparked my interest in studying political science and public policy. I wanted to understand how people became so sure of political views that I didn’t fully understand, but which I could see contained multitudes of perspectives, benefits, and costs.

At the completion of my degree I felt that I had a strong understanding of the political processes in the United States. I could understand how public policy was shaped and formed, and I could describe how people came to hold various points of view and why some people might favor different policies. But what I did not gain was a sense that one particular political approach was necessarily correct or inherently better than any other. So much of our political process depends on who stands to benefit, what is in our individual self-interest, and what our true goals happen to be. At the completion of a study of politics, I felt that I knew more than many, but I did not exactly feel that my political opinions were stronger than my sister’s were when she graduated high school. Her opinions were formed in ignorance (not saying this in a mean way!), and her limited perspective allowed her to be more confident in her opinions than I could be with my detailed and nuanced understanding of political systems and processes.

Our views of the world and how we understand our reality are shaped by the information we absorb and the experiences we have. What you see is all there is, and the narrative you live within will make more sense when you are more ignorant of the complexities of the world around you. Your narrative will be simpler and more coherent, since there won’t be other hands to contrast against your opinions, desires, and convictions.

Narratives and Halos

Yesterday I wrote about narrative fallacies and how our brains’ desire to create coherent stories can lead to cognitive errors. One error, which I wrote about previously, is the halo effect, and in some ways it is a direct consequence of narrative thinking. Our brains don’t do well with conflicting information that doesn’t fit a coherent narrative, and the halo effect helps smooth over this problem in our minds.

In Thinking, Fast and Slow, Daniel Kahneman writes, “The halo effect helps keep explanatory narratives simple and coherent by exaggerating the consistency of evaluations: good people do only good things and bad people are all bad.” When we already like someone or consider them a good person, the halo effect simplifies the other judgments we might have to make about them. If a person we admire is wearing a particular kind of coat, we will assume that it is also a coat we should admire. If a person we dislike is engaged in some type of business, we will assume that business is also bad. Contradictions occur when we see someone we admire wearing clothing we don’t find acceptable, or when a person we know to have moral flaws engages in altruistic charity work.

Instead of accepting a contradiction in our narrative and creating a more complex story where some people are good in some situations but bad in others, we alter our judgments in other ways to maintain a coherent narrative. The person we like wearing strange clothes is a trendsetter, and that must be the new up-and-coming style we should try to emulate. The bad person engaged in charity isn’t really doing good things for good reasons; rather, they are being selfish and trying to show off through their charity.

When we reflect on our thinking and try to be more considerate of the narratives we create, we can see that we fall into traps like the halo effect. What is harder, however, is overcoming the halo effect and other cognitive errors that simplify our narratives once we have noticed them. It is hard to continually hold conflicting opinions and ideas about people, cities, sports teams, car companies, and shoe brands. It is much easier to adopt a few favorites and believe them to be good in all ways than to accept that something might be great in some ways but harmful or disappointing in others.

Narrative Fallacies

With perhaps the exception of professional accountants and actuaries, we think in narratives. How we understand important aspects of our lives, such as who we are, the opportunities we have had in life, the decisions we have made, and how our society works is shaped by the narratives we create in our minds. We use stories to make sense of our relationships with other people, of where our future is heading, and to motivate ourselves to keep going. Narratives are powerful, but so are the narrative fallacies that can arise from the way we think.

Daniel Kahneman, in Thinking, Fast and Slow, demonstrates the ways in which our brains take shortcuts, rely on heuristics, and create narratives to understand a complex world. He shows how these thinking strategies can fail us in predictable ways due to biases, illusions, and judgments made on incomplete information. Narrative fallacies can arise from all three of the cognitive errors I just listed. Going deeper into narrative fallacies, Kahneman writes,

“Narrative fallacies arise inevitably from our continuous attempt to make sense of the world. The explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen.”

We don’t really know how to judge probabilities, possibilities, and the consequences of things that didn’t happen. We are biased to see agency in people and things when luck was more of a factor than any direct action or individual decision. We are motivated and compelled by stories of the world that simplify the complexity of reality, taking a small slice of the world and turning that into a model to describe how we should live, behave, and relate to others.

Unfortunately, in my opinion, narrative fallacies cannot be avoided. I studied public policy, and one of the frameworks for understanding political decision-making that I think deserves far more direct attention is the Narrative Policy Framework, which incorporates the idea of Social Constructions of Target Populations from Anne Schneider and Helen Ingram. We understand the outcome of an event based on how we think about the person or group impacted by its consequences. A long prison sentence for a person who committed a violent crime is fair and appropriate. A tax break for parents who work full time is also fair and appropriate. In both instances, we think about the person receiving the punishment or reward of a decision, we judge whether they are deserving of it, and we create a narrative to explain why we think the outcome is fair.

We cannot exist in a large society of millions of people without shared narratives to help us explain and understand our society collectively. We cannot help but create a story about a certain person or group of people, and build a narrative to explain why we think that person or group deserves a certain outcome. No matter what, however, the outcomes will not be rational; they will be biased and contain contradictions. We will judge groups positively or negatively based on stories that may or may not be accurate and complete, and people will face real rewards or punishments based on how we construct our narratives and what biases are built into our stories. We can’t escape this reality, because it is how our brains work and how we create a cohesive society. But we can at least step back and admit this is how our brains work, admit that our narratives are subject to biases and based on incomplete information, and decide how we want to move forward with new narratives that unify our societies rather than pit them against each other in damaging competition.

Intuitive Predictions and Intensity Matching

“Intuitive predictions are almost completely insensitive to the actual predictive quality of the evidence,” writes Daniel Kahneman in Thinking, Fast and Slow. A lot of our thinking takes place in the part of our brain that is good at making quick connections, detecting patterns, and forming fast judgments. The deeper and more thoughtful part of our brain only engages with the world when it really needs to, when we really need to do some critical thinking to sort out a math problem, write a blog post, or figure out how to grind down some grains to make bread. The result is that much of our thinking happens at a quick and intuitive level that is subject to biases and assumptions based on incomplete information. When we finally do turn our critical-thinking brain to a problem, it is operating with only the limited set of information that the quick part of our brain grabbed as it scanned the environment for whatever stood out.

When we make a prediction without sitting down and doing some math, or weighing the factors that influence our prediction with pen and paper, our prediction will seem logical but will miss critical information. We will make connections between ideas and experiences that may not reflect the actual world. We will simplify the prediction by answering an easier question and substituting its answer for the harder question our prediction is actually trying to answer.
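
Kahneman’s remedy for this problem, as I understand it from the book, is to deliberately regress our intuitive prediction back toward the average: start from the baseline outcome, and move toward the intuitive, intensity-matched estimate only in proportion to how strongly the evidence actually correlates with the outcome. Here is a minimal sketch in Python, with invented numbers for illustration:

```python
def corrected_prediction(baseline, intuitive, correlation):
    """Regress an intuitive prediction toward the mean.

    baseline    -- the average outcome for the population
    intuitive   -- the intensity-matched prediction suggested by the evidence
    correlation -- estimated correlation between evidence and outcome (0 to 1)
    """
    return baseline + correlation * (intuitive - baseline)

# Example (hypothetical numbers): intensity matching says a strikingly
# precocious child will earn a 3.8 GPA, but if early precocity only
# correlates ~0.3 with adult GPA, the defensible prediction sits much
# closer to the 3.0 average.
print(corrected_prediction(baseline=3.0, intuitive=3.8, correlation=0.3))  # 3.24
```

When the correlation is 0 the evidence is worthless and we should predict the average; when it is 1 the evidence is perfect and intensity matching is justified. Everything in between calls for regression toward the mean.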

This year, as in 2016, we will see this in action. In 2016, for me and many of the people I know, it seemed as though very few people supported Donald Trump for president. I saw very few bumper stickers or yard signs for Trump, all the social media posts I saw highlighted his worst moments, and the news coverage I consumed described why he was unfit to be president. Naturally enough, I believed he would lose in a landslide. Of course, that did not happen. Intuitively I was sure that Clinton would win, and Kahneman’s research helps explain why I should have been more skeptical of my natural intuition.

Part of the problem was that my intuitive prediction was an exercise in intensity matching, and as Kahneman writes, “Intensity matching yields predictions that are as extreme as the evidence on which they are based.” All the information I saw highlighted how terrible Trump was. I didn’t see many people supporting Trump, and I didn’t see news stories justifying his candidacy. I didn’t see people in my immediate environment who strongly supported him, so my intuition was biased. It didn’t help that I did nothing to seek out people who did support him, or information outlets that published articles or stories in his favor.

Kahneman’s writing aligns with my real-world experience. His studies of the brain and of our predictive machinery reveal biases and errors in our thinking. Our intuition is based on the limited set of information that the quick part of our brain can put together. When we do engage our deep-thinking brain, it can still only operate on that limited information, so even if we think critically, we are likely to make mistakes, because we can’t see the full picture, and biases in the information we absorb will predictably shape the direction of our miscalculations. What feels natural and obvious to us may be a result of faulty intensity matching and random chance in the environment around us.

Regression to the Mean Versus Causal Thinking

Regression to the mean – the idea that there is an average outcome to be expected, and that over time individual outliers will revert back toward that average – is a boring phenomenon on its own. If you think about it in the context of driving to work and counting your red lights, you can see why it is a rather boring idea. If you normally hit 5 red lights, and one day you manage to get to work with just a single red light, you probably expect that the following day you won’t have as much luck and will hit more red lights than on your lucky one-red-light commute. Conversely, if you have a day where you manage to hit every possible red light, you would probably expect better traffic luck the next day, somewhere closer to your average. This is regression to the mean. Having only one red light, or hitting every red light, one day doesn’t cause the next day’s traffic to be any different; you simply know you will probably have a more average count of reds versus greens – no causal explanation involved, just random traffic light luck.
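
To make the traffic light example concrete, here is a small simulation (my own sketch, not from the book). Each day’s red light count is pure chance, so the day after an unusually lucky commute is, on average, just an average day:

```python
import random

random.seed(42)
LIGHTS = 10   # intersections on the commute
P_RED = 0.5   # chance each light is red, independent each day

# Simulate 100,000 commutes; each day's red light count is pure luck.
days = [sum(random.random() < P_RED for _ in range(LIGHTS))
        for _ in range(100_000)]

# What happens the day after an unusually lucky commute (1 red or fewer)?
next_after_lucky = [days[i + 1] for i in range(len(days) - 1) if days[i] <= 1]

print(sum(days) / len(days))                          # ~5.0, the long-run mean
print(sum(next_after_lucky) / len(next_after_lucky))  # also ~5.0: regression
```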

But for some reason this idea is both fascinating and hard to grasp in other areas, especially if we think we have some control over the outcome. In Thinking, Fast and Slow, Daniel Kahneman helps explain why it is so difficult in some settings for us to accept regression to the mean, an otherwise rather boring concept. He writes,

“Our mind is strongly biased toward causal explanations and does not deal well with mere statistics. When our attention is called to an event, associative memory will look for its cause – more precisely, activation will automatically spread to any cause that is already stored in memory. Causal explanations will be evoked when regression is detected, but they will be wrong because the truth is that regression to the mean has an explanation but does not have a cause.”

Unless you truly believe that there is a god of traffic lights who rules over your morning commute, you probably don’t assign any causal mechanism to your luck with red lights. But when you are considering how well a professional golfer played on the second day of a tournament compared to the first, or whether highly intelligent women marry equally intelligent men, you are likely to have some causal idea come to mind. The golfer was more or less complacent on the second day – the highly intelligent women have to settle for less intelligent men because the highly intelligent men don’t want an intellectual equal. These are examples Kahneman uses in the book, and they present plausible causal mechanisms, but as Kahneman shows, the simpler, though more boring, answer is regression to the mean. A golfer who performs spectacularly on day one is likely to be less lucky on day two. A highly intelligent woman is likely to marry a man with intelligence closer to average simply by statistical chance.

When regression to the mean violates our causal expectations, it becomes an interesting and important concept. It reveals that our minds don’t simply observe an objective reality; they observe causal structures that fit with preexisting narratives. Our causal conclusions can be quite inaccurate, especially if they are influenced by unwarranted biases and prejudices. If we keep regression to the mean in mind, we might lose some of our exciting narratives, but our thinking will be more sound and our judgments more clear.

Praise, Punishment, & Regression to the Mean

Regression to the mean is seriously underrated. In sports, stock market funds, and biological trends like generational height differences, regression to the mean is a powerful yet misunderstood phenomenon. A rookie athlete may have a standout first year, only to perform less spectacularly the following year. An index fund may outperform all others one year, only to see other funds catch up the next. And a tall man may have a shorter son. In each instance, regression to the mean is at play, but since we underrate it, we assume there is some causal factor making our athlete play worse (it went to his head!), making our fund earn less (they didn’t rebalance the portfolio correctly!), and making our son shorter (his father must have married a short woman).

In Thinking, Fast and Slow, Daniel Kahneman looks at the consequences that arise when we fail to understand regression to the mean and attempt to create causal connections between events when we shouldn’t. Kahneman describes an exercise he conducted with Air Force flight instructors, asking each of them to toss two coins at a target behind their back, without looking. Those who had a good first shot typically did worse on their second shot. Those who did poorly on their first shot usually did better the next time. There wasn’t any skill involved; the outcome was mostly luck and random chance, so if someone was close one time, you might expect their next shot to be a little further out, just by random chance. This is regression to the mean in an easy-to-understand example.

But what happens when we don’t recognize regression to the mean outside of a random and simplified demonstration? Kahneman used the coin toss to show the instructors how random deviations from the mean during cadets’ flight maneuvers translate into praise or punishment. Cadets who performed a maneuver well were often praised, only to regress to the mean on their next flight and perform worse. Cadets who performed poorly also regressed to the mean, but in an upward direction: they received punishment (perhaps just a verbal reprimand) after their poor effort, followed by improvement that was really just regression. Kahneman describes the takeaway this way:

“The feedback to which life exposes us is perverse. Because we tend to be nice to other people when they please us and nasty when they do not, we are statistically punished for being nice and rewarded for being nasty.”

Praise a cadet who performed well, and they will then perform worse. Criticize a cadet who performed poorly, and they will do better. Our minds overfit patterns and start to see a causal link between praise and subsequent poor performance, and between castigation and subsequent improvement. All that is really happening is that we are misunderstanding regression to the mean and creating a causal model where we should not.
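
A quick simulation makes the illusion vivid (an assumed setup of my own, not Kahneman’s actual data). If performance on each attempt is pure luck, praising the best and reprimanding the worst still looks causally effective, because both groups regress toward the mean:

```python
import random

random.seed(0)
N = 100_000
first = [random.gauss(0, 1) for _ in range(N)]   # attempt 1: pure chance
second = [random.gauss(0, 1) for _ in range(N)]  # attempt 2: independent chance

praised = [second[i] for i in range(N) if first[i] > 1.0]    # top performers
punished = [second[i] for i in range(N) if first[i] < -1.0]  # bottom performers

print(sum(praised) / len(praised))    # ~0.0: the praised "got worse"
print(sum(punished) / len(punished))  # ~0.0: the punished "improved"
```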

If we better understood regression to the mean, we wouldn’t be so shocked when a standout rookie sports star appears to have a sophomore slump. We wouldn’t jump on the bandwagon when an index fund had an exceptional year, and we wouldn’t be surprised by phenotypical regression to the mean from one generation to the next. Our brains are phenomenal pattern-recognizing machines, but sometimes they see the wrong pattern, and sometimes that gives us perverse incentives for how we behave and interact with each other. The solution is to step back from individual cases and try to look at an average over time. By gathering more data and looking for longer-lasting trends, we can better distinguish regression to the mean from real changes in performance over time.

Valid Stereotypes?

Arguments about stereotypes are common in the world today. In the United States we have worked very hard to push back against stereotypes by bringing them into view so that we can address them directly and dispel incorrect and harmful prejudices. In the circles I am usually a part of, eliminating stereotypes is universally applauded, and people who reveal an inner stereotype, even a harmless one, are often castigated for applying a characteristic or trait to an entire group of people and failing to recognize the diversity and randomness within it.

What I almost never hear, at least in the circles I am a part of, is that stereotypes can have validity and can improve some judgments. However, Daniel Kahneman, in Thinking, Fast and Slow, suggests that maybe we should acknowledge some valid and helpful stereotypes. He writes,

“The social norm against stereotyping, including the opposition to profiling, has been highly beneficial in creating a more civilized and more equal society. It is useful to remember, however, that neglecting valid stereotypes inevitably results in suboptimal judgments. Resistance to stereotyping is a laudable moral position, but the simplistic idea that the resistance is costless is wrong. The costs are worth paying to achieve a better society, but denying that the costs exist, while satisfying to the soul and politically correct, is not scientifically defensible.”

I have a couple of thoughts in response to Kahneman’s quote. The first is about how rejecting stereotypes, even ones that help with judgment, makes society more cohesive; the second is about how we can use valid stereotypes to actually make the world more inclusive.

First, Kahneman states that society has become more equal and more civilized through stereotype rejection. The benefits of rejecting stereotypes come from rejecting invalid stereotypes – prejudices that cast other people and groups as inferior and inadequate. When we throw out stereotypes, we eliminate a lot of barriers built from prejudice, even if it makes some roles and interactions with people who are not like us a little more challenging. The cost, as Kahneman notes, of abandoning stereotypes is a little more friction in some of our interactions with others, but through deliberate effort this can be reduced and overcome.

The second note is that embracing some valid stereotypes can help us build a better world. My initial thought in this regard is the brightly colored strips of sandpaper-like material at the edges of stairs. Many public buildings add such a strip, often bright yellow or a contrasting color, to the edge of stairs in public walkways. We might stereotype senior citizens or people with vision disorders and assume they need extra help walking up stairs, and we might be correct in these stereotypes. Stereotypes can be valid if they accurately reflect the reality of the people we are making assumptions or pre-judgments about and enable us to build a better world. The end result, if we embrace the stereotype instead of dismissing or ignoring it, is that we build staircases that are safer and actually better for everyone. Able-bodied young people also benefit from stairs that are responsive to stereotypical concerns about the elderly. Perhaps this isn’t what Kahneman is referring to in his thoughts on valid stereotypes – perhaps this is just good design of the built world – but I think it can be considered a way of using stereotypes in a positive direction.

In most instances, our stereotypes have been negative factors that cast out people who are not like us and create more social animosity. Certainly those stereotypes should be discarded. However, Kahneman would argue that some stereotypes can be valid, and we can use them to construct more inclusive and overall better worlds for ourselves and others. There is a cost to ignoring all stereotypes, even if ignoring the vast majority of them actually is helpful for our societies.

Base Rates

When we think about individual outcomes we usually think about independent causal structures. A car accident happened because a person was switching their Spotify playlist and accidentally ran a red light. A person stole from a grocery store because they had poor moral character that came from a poor cultural upbringing. A build-up of electrical potential from the friction of two air masses rushing past each other caused a lightning strike.

When we think about larger systems and structures we usually think about more interconnected and somewhat random outcomes, outcomes we don’t necessarily observe on a case-by-case basis but instead think about in terms of likelihoods and the conditions which create the possibilities for a set of events. Increasing technological capacity in smartphones combined with lagging technological capacity in vehicles created a tension for drivers who wanted to stream music while driving, increasing the chances of a driver-error accident. A stronger US dollar made it more profitable for companies to employ workers in other countries, leading to a decline in manufacturing jobs in US cities and to people stealing food as they lost their paychecks. Earth’s tilt relative to the sun led to differences in the amount of solar energy that northern continental landmasses experienced, creating temperature and atmospheric gradients which produce lightning-generating storms and increase the chances of lightning in a given region.

What I am trying to demonstrate in the two paragraphs above is a tension between thinking statistically and thinking causally. It is easy to think causally on a case-by-case basis, and harder to move up the ladder to think about statistical likelihoods and larger outcomes across entire complex systems. Daniel Kahneman presents these two types of thought in his book Thinking, Fast and Slow, writing:

“Statistical base rates are facts about a population to which a case belongs, but they are not relevant to the individual case. Causal base rates change your view of how the individual case came to be.”

It is more satisfying for us to assign agency to a single individual than to consider that individual’s actions as being part of a large and complex system that will statistically produce a certain number of outcomes that we observe. We like easy causes, and dislike thinking about statistical likelihoods of different events.

“Statistical base rates are generally underweighted, and sometimes neglected altogether, when specific information about the case at hand is available.
Causal base rates are treated as information about the individual case and are easily combined with other case-specific information.”

The base rates that Kahneman describes can be thought of as the category or class to which we assign something. We can use different forms of base rates to support different views and opinions. Shifting the base rate from a statistical base rate to a causal base rate may change the way we think about whether a person is deserving of punishment, or aid, or indifference. It may change how we structure society, design roads, and conduct cost-benefit analyses for changing programs or technologies. Looking at the world through a limited causal base rate will give us a certain set of outcomes that might not generalize toward the rest of the world, and might cause us to make erroneous judgments about the best ways to organize ourselves to achieve the outcomes we want for society.
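
The cab problem Kahneman discusses in the book shows how much weight the statistical base rate deserves. If 85% of a city’s cabs are Green, 15% are Blue, and a witness who is correct 80% of the time identifies the cab in a hit-and-run as Blue, intuition jumps to roughly 80% Blue. Bayes’ rule, sketched below, says the base rate drags the answer down to about 41%:

```python
p_blue = 0.15                # statistical base rate: share of Blue cabs
p_green = 0.85
p_say_blue_if_blue = 0.80    # witness accuracy
p_say_blue_if_green = 0.20   # witness error rate

# P(Blue | witness says Blue) via Bayes' rule
posterior = (p_blue * p_say_blue_if_blue) / (
    p_blue * p_say_blue_if_blue + p_green * p_say_blue_if_green)
print(round(posterior, 2))  # 0.41
```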

Availability Cascades

This morning, while reading Sapiens by Yuval Noah Harari, I came across an idea that was new to me. Harari writes, “Chaotic systems come in two shapes. Level one chaos is chaos that does not react to predictions about it. … Level two chaos is chaos that reacts to predictions about it.”  The idea is that chaotic systems, like societies and cultures, are distinct from chaotic systems like the weather. We can model the weather, and it won’t change based on what we forecast. When we model elections, on the other hand, there is a chance that people, and ultimately the outcome of the election, will be influenced by the predictions we make.  The chaos is responsive to the way we think about that chaos. A hurricane doesn’t care where we think it is going to make landfall, but voters in a state may care quite a bit and potentially change their behavior if they think their state could change the outcome of an election.

This ties in with the note from Daniel Kahneman’s book Thinking, Fast and Slow that I had selected to write about today. Kahneman writes about availability cascades in his book, and they are a piece of the feedback mechanism described by Harari in level two chaos systems. Kahneman writes:

“An availability cascade is a self-sustaining chain of events, which may start from media reports of a relatively minor event and lead up to public panic and large-scale government action. On some occasions, a media story about a risk catches the attention of a segment of the public, which becomes aroused and worried.”

We can think about any action or event that people and governments might take as requiring a certain action potential in order to take place. A certain amount of energy, interest, and attention is required for social action to take place. The action potential can be small, such as a red light being enough of an impetus to cause multiple people to stop their cars at an intersection, or monumental, such as a major health crisis being necessary to spur emergency financial actions from the Federal Government. Availability cascades create a set of triggers which can enhance the energy, interest, and attention provided to certain events and bolster the likelihood of a public response.
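
As a rough illustration of that action potential idea, here is a toy feedback loop (my own sketch, not a model from either book): coverage raises attention, heightened attention makes the story more available and draws more coverage, and public action occurs only if attention crosses a threshold before the story fades:

```python
def cascade(initial_coverage, amplification, threshold, steps=20):
    """Return whether a self-reinforcing attention loop triggers action."""
    attention = initial_coverage
    for step in range(steps):
        attention *= amplification  # each round of coverage feeds the next
        if attention >= threshold:
            return f"public action at step {step}"
    return "story fades without action"

print(cascade(initial_coverage=1.0, amplification=1.4, threshold=50))  # cascades
print(cascade(initial_coverage=1.0, amplification=0.9, threshold=50))  # fades
```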

2020 has been a series of extreme availability cascades. With a global pandemic, more people are watching the news more closely than before. This allows for the increased salience of incidents of police brutality, and increases the energy in the public response to those incidents. As a result, more attention has been paid to racial injustice, and large companies have begun to respond in new ways to issues of race and equality, again heightening the energy and interest of the public in demanding action on both racial justice and police policy. There are other ways that events could have played out, but availability cascades created feedback mechanisms within a level two chaotic system, opening certain avenues for public and societal action.

It is easy to look back and make assessments on what happened, but in the chaos of the moment it is hard to understand what is going on. Availability cascades help describe what we see, and help us think about what might be possible in the future.