Detecting Rules

Our brains are built to think causally and look for patterns. We benefit when we can recognize that some foods make us feel sick, when certain types of clouds mean rain is on the way, or when our spouse gives us a certain look that means they are getting frustrated with something. Being able to identify patterns helps us survive and form better social groups, but it can also lead to problems. Sometimes we detect patterns and rules when there aren’t any, and we can adopt strange superstitions, traditions, or behaviors that don’t help us and might have a cost.

Daniel Kahneman demonstrates this in his book Thinking, Fast and Slow by showing a series of letters and asking which sequence of letters would be more likely in a random draw. If we had a bag of letter tiles for each letter of the English alphabet, and we selected tiles at random, we wouldn’t expect to get a word we recognized. However, sometimes through random chance we do get a complete word. If you played thousands of Scrabble games, eventually you might draw 7 tiles that make a complete word on your first turn. The reality is that drawing the letters MGPOVIT is just as statistically likely as drawing the letters MORNING.
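
The arithmetic behind this claim is easy to check. Here is a minimal sketch, assuming draws with replacement from 26 equally likely letters (a simplification of a real Scrabble bag, which has uneven tile counts):

```python
import random
import string

# Any SPECIFIC ordered 7-letter sequence has the same probability,
# whether it spells a word or not.
p_specific = (1 / 26) ** 7  # chance of one exact sequence, e.g. MORNING

random.seed(0)  # seeded only so the sketch is reproducible
draw = "".join(random.choices(string.ascii_uppercase, k=7))

# MORNING and MGPOVIT are each one sequence out of 26**7 possibilities,
# so their probabilities are identical.
print(f"{p_specific:.3e}")  # 1.245e-10
print(draw)                 # one random draw; almost surely not a word
```

A recognizable word feels special only because there are vastly more non-word sequences than word sequences; each individual sequence is equally improbable.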

For our brains, however, seeing a full word feels less likely than a jumble of letters. As Kahneman explains, “We do not expect to see regularity produced by a random process, and when we detect what appears to be a rule, we quickly reject the idea that the process is truly random.”

We can go out of our way trying to prove that something is behaving according to a rule when it is truly random. We can be sure that a pattern is taking place, even when no pattern is occurring. This happens in basketball with the Hot Hand phenomenon, and in science when researchers search for a significant finding that doesn’t really exist in the data from an experiment. Most of the time, this doesn’t cause a big problem for us. It’s not really a big deal if you believe that you need to eat Pringles for your favorite sports team to win a playoff game. It only adds a small cost if you tackle some aspect of a house project inefficiently because you are sure you have better luck with your long, roundabout approach than with a more direct one.

However, once we start to see patterns that don’t exist in social life with other people, there can be serious consequences. The United States saw this in the early days of marijuana prohibition, as prejudice and racial fear spread through the public via inaccurate stories of the drug’s dangers. Ancient people who sacrificed humans to bring about rain were fooled by false pattern recognition. We see our brains looking for rules when we examine how every act of the president influences political polls for an upcoming election. Our powerful pattern- and rule-detecting brains can help us in a lot of ways, but they can also waste our time, make us look foolish, and create huge externalities for society.

Cause and Chance

Recently I have written a lot about our mind’s tendency toward causal thinking, and how this tendency can sometimes get our minds in trouble. We make associations and predictions based on limited information and we are often influenced by biases that we are not aware of. Sometimes, our brains need to shift out of our causal framework and think in a more statistical manner, but we rarely seem to do this well.

In Thinking, Fast and Slow, Daniel Kahneman writes, “The associative machinery seeks causes. The difficulty we have with statistical regularities is that they call for a different approach. Instead of focusing on how the event at hand came to be, the statistical view relates it to what could have happened instead. Nothing in particular caused it to be what it is – chance selected it from among its alternatives.”

This is hard for us to accept. We want there to be a reason why one candidate won a toss-up election and the other lost. We want there to be a reason why the tornado hit one neighborhood and not the adjacent neighborhood. Our mind wants to find patterns; it wants to create associations between events, people, places, and things. It isn’t happy when there is a large amount of data, unknown variables, and some degree of randomness influencing exactly what we observe.

Statistics, however, isn’t concerned with our need for intelligible causal structures. Statistics is fine with a coin flip coming up heads 9 times in a row, and the 10th flip still having a 50-50 shot of being heads.
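
That independence is easy to demonstrate by simulation. A minimal sketch, assuming a fair coin and independent flips:

```python
import random

# Look only at simulated sequences that open with 9 heads in a row,
# and check how often the 10th flip comes up heads anyway.
random.seed(42)  # seeded only for reproducibility
tenth_flips = []
for _ in range(1_000_000):
    if all(random.random() < 0.5 for _ in range(9)):  # 9 heads in a row
        tenth_flips.append(random.random() < 0.5)     # record flip 10

rate = sum(tenth_flips) / len(tenth_flips)
print(round(rate, 2))  # close to 0.50 -- the streak changes nothing
```

Streaks of 9 heads occur in roughly 1 in 512 sequences, so around 2,000 of the million simulated trials qualify; among those, heads on flip 10 still shows up about half the time.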

Our minds don’t have the ability to hold multiple competing narratives at one time. In national conversations, we seem to want to split things into two camps (maybe this is just an artifact of the United States having a winner-take-all political system) where we have two sides to an argument and two ways of thinking and viewing the world. I tend to think in triads, and my writing often reflects that, with me presenting a series of three examples of a phenomenon. When we need to hold 7, 15, or 100 different potential outcomes in our mind, we are easily overwhelmed. Accepting strange combinations that don’t fit with a simple this-or-that causal structure is hard for our minds, and in many cases being so nuanced is not very rewarding. We can generalize and make substitutions in these complex settings and usually do just fine. We can trick ourselves into believing that we think statistically, even if we are really only justifying the causal structures and hypotheses that we want to be true.

However, sometimes, as in close elections, in understanding cancer risk, and in making cost-benefit analyses of traffic accidents for freeway construction, thinking statistically is important. We have to understand that there is a range of outcomes, and only so many predictions we can make. We can develop aids to help us think through these statistical decisions, but we have to recognize that our brains will struggle. By understanding our causal tendencies and desires, and by recognizing the difficulty of accepting statistical information, we can set up structures that enable us to make better decisions.

Statistical Artifacts

When we have good graphs and statistical aids, thinking statistically can feel straightforward and intuitive. Clear charts can help us tell a story, can help us visualize trends and relationships, and can help us better conceptualize risk and probability. However, understanding data is hard, especially if the way that data is collected creates statistical artifacts.

Yesterday’s post was about extreme outcomes, and how it is the smallest counties in the United States where we see both the highest and the lowest per capita instances of cancer. Small populations allow for large fluctuations in per capita cancer diagnoses, and thus extreme outcomes in cancer rates. We could graph the per capita rates, model them on a map of the United States, or present the data in unique ways, but all we would really be doing is creating a visual aid influenced by statistical artifacts from the samples we used. As Daniel Kahneman explains in his book Thinking, Fast and Slow, “the differences between dense and rural counties do not really count as facts: they are what scientists call artifacts, observations that are produced entirely by some aspect of the method of research – in this case, by differences in sample size.”

Counties in the United States vary dramatically. Some counties are geographically huge, while others are quite small – Nevada is a large state with over 110,000 square miles of land but only 17 counties, compared to West Virginia with under 25,000 square miles of land and 55 counties. Across the US, some counties are exclusively within metropolitan areas, some are completely within suburbs, some are entirely rural with only a few hundred people, and some manage to incorporate major metros, expansive suburbs, and vast rural stretches (shoutout to Clark County, NV). Counties are convenient for collecting data, but they can cause problems when analyzing population trends across the country. The variations in size and other factors create the possibility for the extreme outcomes we see in things like cancer rates across counties. When smoothed out over larger populations, the disparities in cancer rates disappear.

Most of us are not collecting lots of important data for analysis each day. Most of us probably don’t have to worry too much on a day-to-day basis about some important statistical sampling problem. But we should at least be aware of how complex information is, and how difficult it can be to display and share information in an accurate manner. We should turn to people like Tim Harford for help interpreting and understanding complex statistics when we can, and we should try to look for factors that might interfere with a convenient conclusion before we simply believe what we would like to believe about a set of data. Statistical artifacts can play a huge role in shaping the way we understand a particular phenomenon, and we shouldn’t jump to extreme conclusions based on poor data.

Extreme Outcomes

Large sample sizes are important. At this moment, the world is racing as quickly as possible toward a vaccine to allow us to move forward from the COVID-19 Pandemic. People across the globe are anxious for a way to resume normal life and to reduce the risk of death from the new virus and disease. One thing standing in the way of the super quick solution that everyone wants is basic statistics. For any vaccine or treatment, we need a large sample size to be certain of the effects of anything we offer to people as a cure or for prevention of COVID-19. We want to make sure we don’t make decisions based on extreme outcomes, and that what we produce is safe and effective.

Statistics and probability are frequent parts of our lives, and many of us probably feel as though we have a basic and sufficient grasp of both. The reality, however, is that we are often terrible with thinking statistically. We are much better at thinking in narrative, and often we substitute a narrative interpretation for a statistical interpretation of the world without even recognizing it. It is easy to change our behavior based on anecdote and narrative, but not always so easy to change our behavior based on statistics. This is why we have the saying often attributed to Stalin: One death is a tragedy, a million deaths is a statistic.

The danger with anecdotal and narrative interpretations of the world is that they are drawn from small sample sizes. Daniel Kahneman explains the danger of small sample sizes in his book Thinking, Fast and Slow: “extreme outcomes (both high and low) are more likely to be found in small than in large samples. This explanation is not causal.”

In his book, Kahneman explains that when you look at counties in the United States with the highest rates of cancer, you find that some of the smallest counties in the nation have the highest rates of cancer. However, if you look at which counties have the lowest rates of cancer, you will also find that it is the smallest counties in the nation that have the lowest rates. While you could drive across the nation looking for explanations to the high and low cancer rates in rural and small counties, you likely wouldn’t find a compelling causal explanation. You might be able to string a narrative together, and if you try really hard you might start to see a causal chain, but your interpretation is likely to be biased and based on flimsy evidence. The fact that our small counties are the ones that have the highest and lowest rates of cancer is an artifact of small sample sizes. When you have small sample sizes, as Kahneman explains, you are likely to see more extreme outcomes. A few random chance events can dramatically change the rate of cancer per thousand residents when you only have a few thousand residents in small counties. In larger, more populated counties, you find a reversion to the mean, and a few extreme chance outcomes are less likely to influence the overall statistics.
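
Kahneman’s county example can be illustrated with a toy simulation. In this sketch the populations and the 50-per-100,000 base rate are invented numbers, and every simulated county shares exactly the same true rate, so any spread in the results is pure sampling noise:

```python
import random

TRUE_RATE = 50 / 100_000  # same true cancer rate for every county
random.seed(1)  # seeded only for reproducibility

def simulated_rate(population):
    # Count how many residents are diagnosed purely by chance,
    # then convert to a per-100,000 rate.
    cases = sum(random.random() < TRUE_RATE for _ in range(population))
    return cases / population * 100_000

small = [simulated_rate(2_000) for _ in range(200)]   # tiny counties
large = [simulated_rate(50_000) for _ in range(200)]  # bigger counties

# The small counties produce both the highest and the lowest observed
# rates, even though nothing causal separates the two groups.
print(min(small), max(small))  # wide spread, extremes at both ends
print(min(large), max(large))  # clustered near the true rate of 50
```

With only 2,000 residents, a single chance case moves a county’s rate by 50 per 100,000, which is exactly the extreme-outcome artifact described above.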

To prevent our decision-making from being overly influenced by extreme outcomes we have to move past our narrative and anecdotal thinking. To ensure that a vaccine for the coronavirus or a cure for COVID-19 is safe and effective, we must allow the statistics to play out. We have to have large sample sizes, so that we are not influenced by extreme outcomes, either positive or negative, that we see when a few patients are treated successfully. We need the data to ensure that the outcomes we see are statistically sound, and not an artifact of chance within a small sample.

Affect Heuristics

I studied public policy at the University of Nevada, Reno, and one of the things I had to accept early on in my studies was that humans are not as rational as we like to believe. We tell ourselves that we are making objective and unbiased judgments about the world to reach the conclusions we find. We tell ourselves that we are listening to smart people who truly understand the issues, policies, and technicalities of policy and science, but studies of voting, of policy preference, and of individual knowledge show that this is not the case.

We are nearing November, and in the United States we will be voting for president and other elected officials. Few of us will spend much time investigating the candidates on the ballot in a thorough and rigorous way. Few of us will seek out in-depth and nuanced information about the policies our political leaders support or about referendum questions on the ballot. But many of us, perhaps the vast majority of us, will have strong views on policies ranging from tech company monopolies to tariffs to public health measures. We will reach unshakable conclusions and find a few snippets of facts to support our views. But this doesn’t mean that we will truly understand any of the issues in a deep and complex manner.

Daniel Kahneman, in his book Thinking, Fast and Slow, helps us understand what is happening with our voting, and reveals what I didn’t want to believe but was confronted with over and over through academic studies. He writes, “The dominance of conclusions over arguments is most pronounced where emotions are involved. The psychologist Paul Slovic has proposed an affect heuristic in which people let their likes and dislikes determine their beliefs about the world.”

Very few of us have a deep understanding of economics, international relations, or public health, but we are good at recognizing what is in our immediate self-interest and who represents the identities that are core to who we are. We know that having someone who reflects our identities and praises those identities will help improve the social standing of our group, and ultimately improve our own social status. By recognizing who our leader is and what is in our individual self-interest to support, we can learn which policy beliefs we should adopt. We look to our leaders, learn what they believe and support, and follow their lead. We memorize a few basic facts and use them as justification for the beliefs we hold, rather than admit that our beliefs simply follow our emotional desire to align with a leader who we believe will boost our social standing.

It is this affect heuristic that drives much of our political decision making. It helps explain how we can support some policies which don’t seem to immediately benefit us, by looking at the larger group we want to be a part of and trying to increase the social standing of that group, even at a personal cost. The affect heuristic shows that we want a conclusion to be true, because we would benefit from it, and we use motivated reasoning to adopt beliefs that conveniently support our self-interest. There doesn’t need to be any truth to the beliefs; they just need to satisfy our emotional valence and give us a shortcut to making decisions on complex topics.

Evaluating Happiness

If you ask college students how many dates they have had in the last month and then ask them how happy they are overall, you will find that those who had more dates will rate themselves as generally more happy than those who had fewer dates. However, if you ask college students how happy they are overall, and then after they evaluate their happiness ask them how many dates they have had, you won’t see a big difference in overall happiness based on the number of dates that students had in the last month.

Daniel Kahneman looks at the results of studies like this in his book Thinking, Fast and Slow and draws the following conclusion: “The explanation is straightforward, and it is a good example of substitution,” he writes. “Happiness these days is not a natural or an easy assessment. A good answer requires a fair amount of thinking. However, the students who had just been asked about their dating did not need to think hard because they already had in their mind an answer to a related question: how happy were they with their love life?”

This example is interesting because we are often placed in situations where we have to make a quick assessment of a large and complex state of being. When we buy a new car or house we rarely have a chance to live with the car or house for six months to determine if we really like it and if it is actually a good fit for us. We have a test drive or two, a couple of walk-throughs, and then we are asked to assess whether we would like to own the thing and whether it would be a good fit for our lives. We face the same challenges with voting for president, choosing a college or major, hiring a new employee or taking a new job, or buying a mattress. Evaluating happiness and predicting happiness is complex and difficult, and often without noticing it, we switch the question to something that is easier for us to answer. We narrow down our overall assessment to a few factors that are easier to evaluate and hold in our head. More dates last month means I’m more happy.

“The present state of mind looms very large when people evaluate their happiness,” writes Kahneman.

We often judge the president based on the economy in the last months or weeks leading up to an election. We may choose to buy a home or car based on how friendly our agent or salesperson was and whether they did a good job of making us feel smart. Simple factors that might influence our mood in the moment can alter our perceived level of happiness and have direct outcomes in the decisions we make. We rarely pause to think about how happy we are on an overall level, and if we do, it is hard to untangle the things that are influencing our current mood from our perception of our general life happiness. It is important to recognize how much the current moment can shape our overall happiness so that we can pause and adjust our behaviors and attitudes to better reflect our reality. Having a minor inconvenience should not throw off our entire mood and outlook on life. Similarly, if we are in positions we dislike and find unbearable, we should not put up with the status quo just because someone flatters us but makes no real changes to improve our situation. Ultimately, it is important for us to be able to recognize what is happening in our minds and to be able to recognize when our minds are likely to be influenced by small and rather meaningless things.

Biased in Predictable Ways

“A judgment that is based on substitution will inevitably be biased in predictable ways,” writes Daniel Kahneman in his book Thinking, Fast and Slow. Kahneman uses an optical illusion to show how our minds can be tricked in a specific way and led to an incorrect conclusion. The key takeaway is that we can understand and predict our biases and how those biases will lead to specific patterns of thinking. The human mind is complex and varied, but the errors it makes can be studied, understood, and predicted.

We don’t like to admit that our minds are biased, and even if we are willing to admit a bias in our thinking, we are often even less willing to accept a negative conclusion about ourselves or our behavior resulting from such a bias. However, as Kahneman’s work shows, our biases are predictable and follow patterns. We know that we hold biases, and we know that certain biases can arise or be induced in certain settings. If we are going to accept these biases, then we must accept what they tell us about our brains and about the consequences of these biases, regardless of whether they are trivial or have major implications in our lives and societies.

In a lot of ways, I think this describes the conflicts we are seeing in American society today. There are many situations where we are willing to admit that biases occur, but to admit and accept a bias implicates greater social phenomena. Admitting a bias can make it hard to deny that larger social and societal changes may be necessary, and the costs of change can be too high for some to accept. This puts us in situations where many deny that bias exists, or live in contradiction where a bias is accepted, but a remedy to rectify the consequences of the bias is not accepted. A bias can be accepted, but the conclusion and recognition that biases are predictable and understandable can be rejected, despite the mental contradictions that arise.

As we have better understood how we behave and react to each other, we have studied more forms of bias in certain settings. We know that we are quick to form in-groups and out-groups. We know that we see some people as more threatening than others, and that we are likely to have very small reactions that we might not consciously be aware of, but that can nevertheless be perceived by others. Accepting and understanding these biases with an intention to change is difficult. It requires not just that one person adapt their behavior, but that many people change some aspect of their lives, often giving up material goods and resources or status. The reason there is so much anger and division in the United States today is that there are many people who are ready to accept these biases, to accept the science that Kahneman shows, and to make changes, while many others are not. Accepting the science of how the brain works and the biases that can be produced in the brain challenges our sense of self, reveals things about us that we would rather leave in the shadows, and might call for change that many of us don’t want to make, especially when a fiction that denies such biases helps propel our status.