
Categories Are Approximations

My last post was on the human tendency to put things into categories and how that can cause problems when things don’t fit nicely into the categories we have created. We like to define and group things based on shared characteristics, but those characteristics can have undefined edge cases. This isn’t a big deal when we are classifying types of mushrooms or shoes, but it can be a problem when we are classifying people and when we extend particular qualities of a group of people to everyone perceived as part of that group.
 
 
This post takes the danger in that idea a step further. In The Better Angels of Our Nature, Steven Pinker writes, “people tend to moralize their categories, assigning praiseworthy traits to their allies and condemnable ones to their enemies.” We create groups and view them as binaries. If we step back, we realize this doesn’t make sense when evaluating people, but nevertheless, we do it. We view an entire group of people as good or bad based on how we categorize them and based on a few salient traits of the category.
 
 
Pinker continues, “people tend to essentialize groups. As children, they tell experimenters that a baby whose parents have been switched at birth will speak the language of her biological rather than her adoptive parents.” As we get older we realize this is not the case, but it hints at a general human disposition. We pay little attention to environmental and contextual factors when judging people. We assume that essential characteristics of the group they belong to, whether or not those characteristics are actually valid, apply to every member, even if members are separated from the group and placed in a new context.
 
 
Categorizing people can end up placing them in a frame of reference that denies their individuality and humanity. We see people as inherently geared toward certain dispositions simply because they share characteristics with other people we assume to have such dispositions. From this categorizing and these harmful tendencies follow xenophobia and racism. We wish to be seen as individuals ourselves, but we put others into categories and judge them to be inherently good or bad. We assume good people are all like us, and that all people like us are good. Conversely, we assume all people unlike us are in some way bad, or that all bad people are unlike us. This oversimplified thought process fuels polarization and a host of negative thinking shortcuts that we have to overcome to live in a peaceful, equitable, and cooperative society.

Categorization & People

Human beings really like to categorize things. We categorize coffee brews, nuts and bolts, cars, plants, berries, galaxies, and even other people. Doing so helps us think about an incredibly complex world and helps us make better decisions. I know what kind of coffee I like for which brewing methods, so I can simplify my decision process to get a good cup when I am at a new coffee shop. Sorting nuts and bolts allows us to find the right fastener for each and every situation, meaning we can ultimately build more complex cars and trucks. And knowing a little bit about different categories of berries helps us make a delicious pie. Categorization is essential for human survival, recreation, and scientific advancement. It is an exceptional tool for humanity.
 
 
But categorization can also be troublesome, especially when we start trying to categorize things that don’t fit neatly within our pre-defined categories. Nuts and bolts can be manufactured to match our predefined categories, but not everything fits nicely into the categories we adopt and use every day. Nature’s creations, as well as our own, can blend across multiple categories. Some plants, animals, seeds, fruits, geological formations, and clouds are very distinct and easy to categorize. A mountain is very different from a valley, which is very different from the ocean. But all of these things can have fuzzy boundaries and can overlap in complex ways.
 
 
This is often the case with people. As Steven Pinker writes in his book The Better Angels of Our Nature, “the problem with categorization is that it often goes beyond the statistics. For one thing, when people are pressured, distracted, or in an emotional state, they forget that a category is an approximation and act as if a stereotype applies to every last man, woman, and child.” Pinker is describing the consequences of using a simplifying process that doesn’t fit everything we want to put in the category. Humans rarely fit perfectly within the categories we create, and there can be bad consequences when we act as if they do. On top of bad categorization, we also forget that the categories we put people into don’t always matter very much. When we forget these things, we can treat people poorly and make poor judgments about the categories we have placed them into. This takes a useful tool of humanity and turns it into a dangerous shortcut that can cause serious harm to real people.

Sharks, The Navy, And the Availability Heuristic

In the book Grunt, Mary Roach investigates what navies across the globe do to keep their sailors, pilots, and personnel safe from shark attacks. To some extent, Roach’s findings can be summed up by describing the availability heuristic. Our minds make predictable cognitive errors, and our fear of sharks, and Roach’s subsequent curiosity about how navies protect their personnel from sharks, is inspired more by cognitive error than by real threat and danger.
In Thinking Fast and Slow, Daniel Kahneman writes, “We defined the availability heuristic as the process of judging frequency by the ease with which instances come to mind.” That is to say, we don’t actually have a good mental database of shark attack frequencies relative to other nautical maladies. Neither do we have a great mental database of times when we were successful on the job, the number of electric vehicles on the road, or how many Asian-American actors have been in major motion pictures. We rely on availability. The easier it is for us to think of instances of a shark attack, instances of us doing something good at work, times we saw Teslas in the neighborhood, or whether we just saw Shang-Chi, the more we will think that each of these things occurs with high frequency.
Naval Special Warfare Command communications specialist Joe Kane is quoted as saying the following in Grunt: “You’re coming at this the wrong way. The question is not, Do Navy SEALs need shark repellent? The question is, Do sharks need Navy SEAL repellent?”
Shark attacks are sensationalized and make headlines around the world. It’s easy to think of times when we have seen a shark bite victim on the news or remember seeing a news headline about a shark attack. These stories are highly available, so we think shark attacks are more common than they really are, and we think sharks are more dangerous than they really are. After all, a shark encounter that ended with the shark being scared away without biting anyone doesn’t make the news to become available to our minds. Roach writes, “a floating sailor could dispatch a curious shark by hitting it or churning the water with his legs. (Baldridge [a researcher Roach spoke with] observed that even a kick to a shark’s nose from the rear leg of a swimming rat was enough to cause a startled response and rapid departure from the vicinity.)”
It is probably still a good idea for navies to think about sharks and how best to train personnel to respond to them. However, our fear of sharks is overblown, a consequence of the availability heuristic. Sharks should only be considered up to a point; beyond that, navies will face diminishing marginal returns and unnecessary expenses trying to keep their personnel safe from a minimal threat. It is the availability heuristic they may have to worry about more than sharks.

Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, the best way to go to the grocery store and post office in a single trip, and how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take while still leaving a cushion in case something runs long. In planning an errand route we are confronted with a version of the traveling salesman problem, which is connected to the famous P versus NP question, one of the most vexing open problems in mathematics. And the price of bread was once the object of focus for teams of Soviet economists who could not pinpoint the right price for a loaf that would create the right supply to match the population’s demand.
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don’t waste the whole night doing math problems to find the perfect time to set an alarm, don’t miss the entire day trying to calculate the best route to run all our errands, and don’t waste tons of brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing that we ought to leave the house ready for work by a certain time to reduce the risk of being late. We stick to main roads and travel similar routes to get where we need to go, eliminating the thousands of right- or left-turn alternatives we could choose from. We rely on open markets to determine the price of bread without setting a universal standard.
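To make the errand example concrete, here is a minimal sketch in Python, with made-up stop locations, of why the exact calculation explodes while a rule of thumb stays cheap: checking every ordering of n stops requires n! route evaluations, whereas a simple nearest-stop-next heuristic needs only a handful of comparisons and usually lands close to the best route.

```python
from itertools import permutations
from math import dist

# Hypothetical coordinates for home and four errand stops.
home = (0, 0)
stops = {"grocery": (2, 3), "post office": (5, 1), "pharmacy": (1, 6), "bank": (4, 4)}

def route_length(order):
    """Total distance of a round trip from home through the stops in the given order."""
    points = [home] + [stops[name] for name in order] + [home]
    return sum(dist(a, b) for a, b in zip(points, points[1:]))

# Exact answer: evaluate all 4! = 24 orderings (already 3,628,800 orderings for ten stops).
best_order = min(permutations(stops), key=route_length)

# Rule of thumb: always drive to the nearest remaining stop.
remaining, here, greedy_order = set(stops), home, []
while remaining:
    nearest = min(remaining, key=lambda name: dist(here, stops[name]))
    greedy_order.append(nearest)
    here = stops[nearest]
    remaining.remove(nearest)

print(route_length(best_order), route_length(greedy_order))  # the heuristic is close, but not guaranteed optimal
```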
Rules of thumb are necessary in a complex world, but that doesn’t mean they are without their own downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman from Thinking Fast and Slow, “We are hard-wired to use simple rules of thumb (‘heuristics’) to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors.” Useful but inadequate rules of thumb can create predictable and reliable errors. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to be open to improvements or alternatives that would make our decision-making better.
What is important to recognize is that systematic and predictable errors from rules of thumb can be corrected. If we know where errors and mistakes are likely to arise, then we can take steps to mitigate and reduce them. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of rules of thumb that we know are likely to produce errors, biases, and inaccurate judgements and assumptions. Companies, governments, and markets do this all the time, though not always in a neat step-by-step process (sometimes it is one step forward and two steps back), and over time it leads to progress. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive pitfalls of heuristics.

Believing We Are Well Informed

In his book Risk Savvy, Gerd Gigerenzer demonstrates that people often overestimate their level of knowledge about the benefits of cancer screening. “A national telephone survey of U.S. adults,” he writes, “reported that the majority were extremely confident in their decision about prostate, colorectal, and breast screening, believed they were well informed, but could not correctly answer a single knowledge question.” I think this quote reveals something important about the way our minds work. We often believe we are well informed, but that belief and our confidence in our knowledge is often an illusion.
This is something I have been trying to work on. My initial reaction any time I hear a fact or a discussion about a topic is to position myself as a knowledgeable semi-expert on that topic. I have noticed that I do this with ideas and topics that I have really only heard once or twice in a commercial, or that I have seen in a headline, or that I once overheard someone talking about. I immediately feel like an expert even though my knowledge is often less than surface deep.
I think that what is happening in these situations is that I am substituting a different, easier question for the question of whether I actually have expertise or knowledge. Instead, I am answering the question: can I recall a time when I thought about this thing? Mental substitution is common but hard to detect. I suspect that the easier a topic comes to mind, even if it is a topic I know nothing about beyond the name, the more likely I am to feel like an expert.
Gigerenzer’s quote shows that people will believe themselves to be well informed even if they cannot answer a basic knowledge question about the topic. Rather than substituting the question can I recall a time when I thought about this thing, patients may also be substituting another question. Instead of analyzing their confidence in their own decision regarding cancer screening, people may be substituting the question do I trust my doctor? Trust in a physician, even without any knowledge about the procedure, may be enough for people to feel extremely confident in their decisions. They don’t have to know a lot about their health or how a procedure is going to impact it, they just need to be confident that their physician does.
These types of substitutions are important for us to recognize. We should try to identify when we are falling victim to the availability bias and when we are substituting different questions that are easier for us to answer. In a well-functioning and accurate healthcare setting these biases and cognitive errors may not harm us too much, but in a world of uncertainty, we stand to lose a lot when we fail to recognize how little we actually know. Being honest about our knowledge and thinking patterns can help us develop better systems and structures in our lives to improve and guide our decision-making.

Satisficing

Satisficing gets a bad rap, but it isn’t actually that bad a way to make decisions, and it realistically accommodates the constraints and challenges that decision-makers in the real world face. None of us would like to admit when we are satisficing, but the reality is that we satisfice all the time, and we are often happy with the results.

 

In Risk Savvy, Gerd Gigerenzer recommends satisficing when trying to choose what to order at a restaurant. Regarding this strategy for ordering, he writes:

 

“Satisficing: This … means to choose the first option that is satisfactory; that is, good enough. You need the menu for this rule. First, you pick a category (say, fish). Then you read the first item in this category, and decide whether it is good enough. If yes, you close the menu and order that dish without reading any further.”

 

Satisficing works because we often have more possibilities than we have time to carefully weigh and consider. If you have never been to the Cheesecake Factory, reading each option on the menu for the first time would probably take you close to 30 minutes. If you are eating on your own and don’t have any time constraints, then sure, read the whole menu, but the staff will probably be annoyed with you. If you are out with friends or on a date, you probably don’t want to take 30 minutes to order, and you will feel pressured to make a choice relatively quickly without having full knowledge and information regarding all your options. Satisficing helps you make a selection that you can be relatively confident you will be happy with given some constraints on your decision-making.
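As a rough illustration, here is a minimal sketch of the rule in Python; the menu items and the “good enough” test are hypothetical, added only to show the stop-at-the-first-acceptable-option logic Gigerenzer describes.

```python
def satisfice(options, is_good_enough):
    """Return the first option judged good enough, or None if nothing qualifies."""
    for option in options:
        if is_good_enough(option):
            return option  # stop searching as soon as something is satisfactory
    return None

# Hypothetical "fish" section of a long menu.
fish_dishes = ["grilled salmon", "fish and chips", "seared ahi tuna", "halibut special"]

# A simple aspiration level: anything that isn't fried counts as good enough.
choice = satisfice(fish_dishes, lambda dish: "chips" not in dish)
print(choice)  # -> grilled salmon; we close the menu and order without reading further
```

The key design choice is that the search stops at the first satisfactory option rather than scoring every dish and taking the maximum, which is exactly what separates satisficing from maximizing.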

 

The term satisficing was coined by the Nobel Prize-winning political scientist and economist Herbert Simon, and I remember hearing a story from a professor of mine about Simon’s decision to remain at Carnegie Mellon University in Pittsburgh. When asked why he hadn’t taken a position at Harvard or a more prestigious Ivy League school, Simon replied that his wife was happy in Pittsburgh, and while Carnegie Mellon wasn’t as renowned as Harvard, it was still a good school and still offered him enough of what he wanted to stay. In other words, Carnegie Mellon satisfied his basic needs and met his criteria in enough areas to make him happy, even though a school like Harvard might have maximized his prestige and influence. Simon was satisficing.

 

Without always recognizing it, we turn to satisficing for many of our decisions. We often can’t buy the perfect home (because of timing, price, and other bidders), so we satisfice and buy the first home we can get a good offer on that meets enough of our desires (but doesn’t fit all our desires perfectly). The same goes for jobs, cars, where we are going to get take-out, what movie we want to rent, what new clothes to buy, and more. Carefully analyzing every potential decision we have to make can be frustrating and exhausting. We will constantly doubt whether we made the best choice, and we may be too paralyzed to even make a decision in the first place. If we satisfice, however, we accept that we are not making the best choice but are instead making an adequate choice that satisfies the greatest number of our needs while simplifying the choice we have to make. We can live with what we get and move on without the constant doubt and loss of time that we might otherwise experience. Satisficing, while getting a bad rep from those who favor rationality in all instances, is actually a pretty good decision-making heuristic.

Navigating Uncertainty with Nudges

In Risk Savvy, Gerd Gigerenzer makes a distinction between known risks and uncertainty. In a footnote for a figure, he writes, “In everyday language, we make a distinction between certainty and risk, but the terms risk and uncertainty are used mostly as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.” Gigerenzer’s distinction between risk and uncertainty is important. He demonstrates that people can manage decision-making when facing calculable risks, but that they need to rely on intuition and good judgement when dealing with uncertainty. One way to improve judgement and intuition is to use nudges.

 

In the book Nudge, Cass Sunstein and Richard Thaler encourage choice architects to design systems and structures that help individuals make the best decision in a given situation, as defined by the chooser. Much of their argument is supported by research presented by Daniel Kahneman in Thinking Fast and Slow, where Kahneman demonstrates how predictable biases and cognitive errors can lead people to make decisions they likely wouldn’t make if they had clearer information, could free themselves from irrelevant biases, and could improve their statistical thinking. Gigerenzer’s quote supports Sunstein and Thaler’s nudges by building on Kahneman’s research. Distinguishing between risk and uncertainty helps us understand when to use nudges and how aggressive our nudges may need to be.

 

Gigerenzer uses casino slot machines as an example of known risk, and stocks, romance, earthquakes, business, and health as examples of uncertainty. When we are gambling, we can know the statistical chances that our bets will pay off and calculate optimal strategies (there is a reason the casino dealer stands on 17). We won’t know what the outcome will be ahead of time, but we can precisely define the risk. The same cannot be said for picking the right stocks or the right romantic partner, or for creating business, earthquake-preparedness, or health plans. We may know the five-year rate of return of a company’s stock, the divorce rate in our state, the average frequency and strength of earthquakes in our region, and how old our grandfather lived to be, but we cannot use this information alone to calculate risk. We don’t know exactly what business trends will arise in the future, we don’t know for sure whether we have a genetic disease that will strike us (or our romantic partner) down sooner than expected, and we can’t say for sure that a 7.0 earthquake is or is not possible next month.
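To make the known-risk side concrete, here is a rough Monte Carlo sketch in Python of one such calculable quantity: how often a blackjack dealer who draws to 17 and then stands ends up busting. The infinite-deck draw and the stand-on-all-17s rule are simplifying assumptions added for illustration; the point is only that every probability in the game is knowable, which is what makes it risk rather than uncertainty.

```python
import random

# One "infinite deck" of card values: 2 through 9, four ten-valued ranks, and the ace counted as 11 first.
CARD_VALUES = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11]

def dealer_total():
    """Play out a dealer hand: hit below 17, stand at 17 or more."""
    total, soft_aces = 0, 0
    while total < 17:
        card = random.choice(CARD_VALUES)
        if card == 11:
            soft_aces += 1
        total += card
        while total > 21 and soft_aces:  # downgrade an ace from 11 to 1 to avoid busting
            total -= 10
            soft_aces -= 1
    return total

trials = 100_000
busts = sum(dealer_total() > 21 for _ in range(trials))
print(f"Estimated dealer bust rate: {busts / trials:.1%}")  # roughly 28%
```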

 

But nudges can help us in these decisions. We can use statistical information about business development and international stock returns to identify general rules of thumb for investing. We can listen to parents and elders and learn from their advice and mistakes when selecting a romantic partner, intuiting the traits that make a good (or bad) spouse. We can overengineer our bridges and skyscrapers by 10% to give us a little more assurance that they can survive a major and unexpected earthquake. Nudges are helpful because they can augment our gut instincts and make the rules of thumb we rely on more visible.

 

Expecting everyone’s individual intuition and heuristics to be up to the task of navigating uncertainty is likely to lead to many poor choices. But if we pool the statistical information available, provide guides, communicate rules of thumb that have panned out for many people, and structure choices in ways that present this information clearly, then people can likely make marginally better decisions. My suggestion in this post is a nudge to use more nudges in moments of uncertainty. When certainty exists, or even when calculable risks exist, nudges may not be needed. However, once we get beyond calculable risk, where we must rely on judgement and intuition, nudges are important tools to help people navigate uncertainty and improve their decision-making.

Quick Heuristics

I really like the idea of heuristics. I have always thought of heuristics as shortcuts for problem solving, or rules of thumb to apply to given situations to ease cognitive demand. We live in an incredibly complex world, and the nature of reality cannot be deduced just by observing the world around us. For the world to get to the point where I can drink an espresso while listening to music streamed across the internet as I write a blog post, humanity collectively had to make discoveries involving microscopes, electromagnetism, and electricity, none of which were easily observable or intuitively understandable to our human ancestors.

 

To cope with a complex world and a limited ability to explore and understand that world, humans thrived through the use of heuristics. When faced with difficult problems and decisions, we substitute approximate but not exact answers. We can make a category judgement and reduce the number of decisions we have to make, taking a generalized path that will usually turn out well. Heuristics help us cope with the overwhelming complexity of the world, but they are not perfect, and they simplify the world according to the information we can observe and readily take in.

 

In Thinking Fast and Slow, Daniel Kahneman writes, “the heuristic answer is not necessarily simpler or more frugal than the original question – it is only more accessible, computed more quickly and easily. The heuristic answers are not random, and they are often approximately correct. And sometimes they are quite wrong.”

 

Heuristics are quick, which is important if you are foraging and hear a dangerous sound, if you need to pick a quick place for shelter as a storm approaches, or if you have to make fast decisions about how to behave in a small tribal group. The more fluidly and quickly a heuristic comes to mind, the more natural it will feel and the more strongly people will grasp it, even if it is not true. Stories and myths contain relatable elements and extend common experiences to complex problems like how to govern an empire, why storms occur, and how we should organize an economy. Heuristics give us shortcuts to understanding these complexities, but they are biased toward our accessible world and experiences, which means they only approximate reality and cannot fully and accurately answer our questions. While they can get some concepts more or less correct and give us good approaches to life in general, they can also be very wrong, with serious consequences for many people over many generations.

More on Affect Heuristics

For me, one of the easiest examples of heuristics that Daniel Kahneman shares in his book Thinking Fast and Slow is the affect heuristic. It is a bias that I know I fall into all the time, and that has led me to buy particular brands of shoes, has influenced how I think about certain foods, and has shaped the way I think about people. In his book Kahneman writes, “The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).”

 

The world is a complex and tricky place, and we can only focus a lot of attention in one direction at a time. For a lot of us, that means we are focused on getting kids ready for school, cooking dinner, or trying to keep the house clean. Trying to fully understand the benefits and drawbacks of a social media platform, a new traffic pattern, or how to invest in retirement may seem important, but it can be hard to find the time and mental energy to focus on a complex topic and organize our thoughts in a logical and coherent manner. Nevertheless, we are likely to be presented with situations where we have to make decisions about what level of social media is appropriate for our children, offer comments on new traffic patterns around the water cooler, or finally get around to setting up our retirement plan and deciding what to do with that old 401K from that job we left.

 

Without adequate time, energy, and attention to think through these difficult decisions, we have to make choices and are asked to have opinions on topics we are not very informed about. “The affect heuristic,” Kahneman writes, “simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy.” We substitute a simple question for the hard question that requires detailed thought: do I like social media? Did I feel that the new traffic pattern made my commute slower? Do I like the way my retirement savings advisor presented a new investment strategy? In each case, we rely on affect, our emotional reaction to something, and make decisions in line with our gut feelings. Of course my kid can use social media; I’m on it, I like it, and I want to see what they are posting. Ugh, that new traffic pattern is awful; what were they thinking putting that utility box where it blocks the view of the intersection? Obviously this is the best investment strategy for me; my advisor explained it well, and I liked it when they told me I was making a smart decision.

 

We don’t notice when we default to the affect heuristic. It is hard to recognize that we have shifted away from making detailed calculations and are relying solely on intuitions about how something makes us feel. Rather than admitting that we buy Nike shoes because our favorite basketball player wears them and we want to be like LeBron, we create a story in our head about the quality of the shoes, the innovative design, and the complementary colors. We fall back on a quick set of factors that gives the impression of a thoughtful decision. In a lot of situations we probably can’t do much better than the affect heuristic, but it is worth considering whether our decisions are really being driven by affect. We might be able to avoid buying things out of pure brand loyalty, and we might be a little calmer and more reasonable in debates and arguments with friends and family when we realize we are acting on affect and not on reason.

Fluency Versus Frequency

When it comes to the availability heuristic, fluency seems to be the most important factor. The ease with which an example of something comes to mind matters more than the real world frequency of the event. Salient examples of people being pulled over by the police, of celebrity divorces, or of wildfires cause our brains to consider these types of events to be more common and likely than they really are.

 

In Thinking Fast and Slow, Daniel Kahneman shares results from a study by the German psychologist Norbert Schwarz that demonstrates fluency versus frequency in our analysis of the world. Schwarz asked one group of participants to list six instances in which they behaved assertively and then to rate their overall level of assertiveness. He asked a second group to list twelve instances in which they were assertive and then to rate their overall level of assertiveness. The studies show that those asked to come up with six instances of assertiveness considered themselves to be more assertive than those asked to come up with twelve. Kahneman describes the results by writing, “Self-ratings were dominated by the ease with which examples had come to mind. The experience of fluent retrieval of instances trumped the number retrieved.”

 

The logical expectation would be that asking people to list twelve instances of assertiveness would give them more reason to believe they are assertive. However, that is not what the study showed. Instead, Kahneman explains, as you are asked to pull more examples from memory, your brain has a harder time remembering times when you were assertive. You easily remember a few stand-out assertive moments, but eventually you start to run out of examples. As you struggle to think of assertive times in your life, you start to underrate your assertiveness. On the other hand, if you only have to think of a handful of assertive moments, and your brain pulls those moments from memory easily, then the experience of easily identifying moments of assertiveness gives you more confidence in rating yourself as assertive.

 

What I find fascinating about the study Kahneman presents is that the brain doesn’t rely on facts or statistics to make judgments and assessments about the world. It doesn’t set a bar before the analysis at which it can say: more examples than this and I am assertive, fewer and I am not. It operates on feeling and intuition, fluidly moving through the world and making judgments by heuristics. The brain is not an objective observer of the world, and its opinions, perspectives, and conclusions are biased by the way it operates. The study suggests that we cannot trust our simple judgments, even when they are about something as personal as our own level of assertiveness.