Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, the best way to hit the grocery store and post office in a single trip, and how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take while still leaving a cushion in case something runs long. In planning a shopping trip we are confronted with a version of the traveling salesman problem, which sits behind P versus NP, one of the most vexing open problems in mathematics. And the price of bread was once the focus of teams of Soviet economists who could not pinpoint the price that would create the right supply to match the population’s demand.
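To make the errand-routing point concrete, here is a toy sketch in Python (the stop names are made up) showing why brute-force route planning collapses so quickly: the number of possible orderings of n stops is n!, which is the combinatorial explosion lurking behind the traveling salesman problem.

```python
# Toy illustration: the number of ways to order n errand stops is n!,
# so exhaustively checking every route stops being feasible almost immediately.
from itertools import permutations
from math import factorial

stops = ["grocery store", "post office", "bank", "pharmacy"]

# Enumerate every possible ordering of the stops (only feasible for tiny n).
routes = list(permutations(stops))
print(f"{len(stops)} stops -> {len(routes)} possible routes")  # 4 -> 24

# Factorial growth dwarfs any realistic errand list.
for n in (5, 10, 15, 20):
    print(f"{n} stops -> {factorial(n):,} possible routes")
```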
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don’t spend the whole night computing the perfect time to set an alarm, don’t lose the entire day calculating the best route to run all our errands, and don’t burn brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing that we ought to leave the house ready for work by a certain time to reduce the risk of being late. We stick to main roads and travel familiar routes to get where we need to go, eliminating the thousands of turn-by-turn alternatives we could choose from. We rely on open markets to determine the price of bread without setting a universal standard.
Rules of thumb are necessary in a complex world, but that doesn’t mean they come without downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman’s Thinking Fast and Slow, “We are hard-wired to use simple rules of thumb (‘heuristics’) to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors.” Useful but imperfect, rules of thumb can create predictable, reliable errors. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to stay open to improvements or alternatives that would make our decision-making better.
What is important to recognize is that systematic and predictable errors from rules of thumb can be corrected. If we know where errors are systematically likely to arise, then we can take steps to mitigate and reduce them. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of those we know are likely to produce errors, biases, and inaccurate judgments and assumptions. Companies, governments, and markets do this all the time, though rarely in a neat step-by-step process (sometimes it is one step forward and two steps back), and the result is progress over time. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive downfalls of heuristics.
Believing We Are Well Informed

In his book Risk Savvy, Gerd Gigerenzer demonstrates that people often overestimate how much they know about the benefits of cancer screening. “A national telephone survey of U.S. adults,” he writes, “reported that the majority were extremely confident in their decision about prostate, colorectal, and breast screening, believed they were well informed, but could not correctly answer a single knowledge question.” I think this quote reveals something important about the way our minds work. We often believe we are well informed, but that belief, and our confidence in our knowledge, is often an illusion.
This is something I have been trying to work on. My initial reaction whenever I hear a fact or a discussion about a topic is to position myself as a knowledgeable semi-expert. I have noticed that I do this with ideas and topics I have really only heard once or twice in a commercial, seen in a headline, or overheard someone talking about. I immediately feel like an expert even though my knowledge is often less than surface deep.
I think what is happening in these situations is substitution. Instead of assessing how much I actually know, I am answering an easier question: can I recall a time when I thought about this thing? Mental substitution is common but hard to detect in the moment. I suspect that the easier a topic comes to mind, even if it is a topic I know nothing about beyond the name, the more likely I am to feel like an expert.
Gigerenzer’s quote shows that people will believe themselves to be well informed even if they cannot answer a basic knowledge question about the topic. Rather than substituting the question can I recall a time when I thought about this thing, patients may also be substituting another question. Instead of analyzing their confidence in their own decision regarding cancer screening, people may be substituting the question do I trust my doctor? Trust in a physician, even without any knowledge about the procedure, may be enough for people to feel extremely confident in their decisions. They don’t have to know a lot about their health or how a procedure is going to impact it, they just need to be confident that their physician does.
These types of substitutions are important to recognize. We should try to identify when we are falling victim to the availability bias and when we are substituting easier questions for the ones actually in front of us. In a well-functioning and accurate healthcare setting these biases and cognitive errors may not harm us too much, but in a world of uncertainty we stand to lose a lot when we fail to recognize how little we actually know. Being honest about our knowledge and thinking patterns can help us develop better systems and structures in our lives to improve and guide our decision-making.
Satisficing

Satisficing gets a bad rap, but it isn’t actually that bad a way to make decisions, and it realistically accommodates the constraints and challenges that decision-makers in the real world face. None of us would like to admit when we are satisficing, but the reality is that we satisfice all the time, and we are often happy with the results.

In Risk Savvy, Gerd Gigerenzer recommends satisficing when trying to choose what to order at a restaurant. Regarding this strategy for ordering, he writes:

“Satisficing: This … means to choose the first option that is satisfactory; that is, good enough. You need the menu for this rule. First, you pick a category (say, fish). Then you read the first item in this category, and decide whether it is good enough. If yes, you close the menu and order that dish without reading any further.”

Satisficing works because we often have more possibilities than we have time to carefully weigh and consider. If you have never been to the Cheesecake Factory, reading each option on the menu for the first time would probably take you close to 30 minutes. If you are eating on your own and don’t have any time constraints, then sure, read the whole menu, but the staff will probably be annoyed with you. If you are out with friends or on a date, you probably don’t want to take 30 minutes to order, and you will feel pressured to make a choice relatively quickly without having full knowledge and information regarding all your options. Satisficing helps you make a selection that you can be relatively confident you will be happy with given some constraints on your decision-making.
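Gigerenzer’s rule translates almost directly into code. Below is a minimal sketch in Python, with a hypothetical menu and a stand-in “good enough” test; the point is the early return, which bounds the cost of the decision by how soon something clears your bar rather than by the size of the menu.

```python
# A minimal sketch of the satisficing rule Gigerenzer describes: pick a
# category, then take the first item that clears the "good enough" bar and
# stop reading. The menu data and the satisfaction test are hypothetical.

menu = {
    "fish": ["grilled salmon", "fish tacos", "miso cod"],
    "pasta": ["carbonara", "pesto gnocchi"],
}

def good_enough(dish):
    # Stand-in for a real preference check; any quick yes/no test works.
    return "salmon" in dish or "carbonara" in dish

def satisfice(menu, category):
    """Return the first satisfactory dish in the category, reading no further."""
    for dish in menu[category]:
        if good_enough(dish):
            return dish  # close the menu here
    return None  # nothing cleared the bar; relax it or try another category

print(satisfice(menu, "fish"))  # -> grilled salmon
```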

The term satisficing was coined by the Nobel Prize-winning political scientist and economist Herbert Simon, and I remember a professor of mine telling a story about Simon’s decision to remain at Carnegie Mellon University in Pittsburgh. When asked why he hadn’t taken a position at Harvard or a more prestigious Ivy League school, Simon replied that his wife was happy in Pittsburgh, and while Carnegie Mellon wasn’t as renowned as Harvard, it was still a good school and offered him enough of what he wanted to stay. In other words, Carnegie Mellon satisfied his basic needs and met his criteria in enough areas to make him happy, even though a school like Harvard might have maximized his prestige and influence. Simon was satisficing.

Without always recognizing it, we turn to satisficing for many of our decisions. We often can’t buy the perfect home (because of timing, price, and other bidders), so we satisfice and buy the first home that meets enough of our desires and on which our offer is accepted. The same goes for jobs, cars, where to get take-out, what movie to rent, what new clothes to buy, and more. Carefully analyzing every potential decision can be frustrating and exhausting. We will constantly doubt whether we made the best choice, and we may be too paralyzed to make a decision in the first place. If we satisfice, however, we accept that we are not making the best choice but an adequate one that satisfies the greatest number of our needs while simplifying the choice itself. We can live with what we get and move on without the constant doubt and lost time we might otherwise experience. Satisficing, while getting a bad rap from those who favor rationality in all instances, is actually a pretty good decision-making heuristic.
Navigating Uncertainty with Nudges

In Risk Savvy, Gerd Gigerenzer makes a distinction between known risks and uncertainty. In a footnote to a figure, he writes, “In everyday language, we make a distinction between certainty and risk, but the terms risk and uncertainty are used mostly as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.” Gigerenzer’s distinction between risk and uncertainty is important. He demonstrates that people can manage decision-making when the risks are calculable, but that they need to rely on intuition and good judgment when dealing with uncertainty. One way to improve judgment and intuition is to use nudges.

In the book Nudge, Cass Sunstein and Richard Thaler encourage choice architects to design systems and structures that help individuals make the best decision in a given situation, as defined by the chooser. Much of their argument rests on research presented by Daniel Kahneman in Thinking Fast and Slow, where Kahneman demonstrates how predictable biases and cognitive errors lead people to make decisions they likely wouldn’t make if they had clearer information, could free themselves from irrelevant biases, and could improve their statistical thinking. Gigerenzer’s quote supports Sunstein and Thaler’s nudges by building on Kahneman’s research. Distinguishing between risk and uncertainty helps us understand when to use nudges, and how aggressive our nudges may need to be.

Gigerenzer uses casino slot machines as an example of risk, and stocks, romance, earthquakes, business, and health as examples of uncertainty. When we are gambling, we can know the statistical chances that our bets will pay off and calculate optimal strategies (there is a reason the casino dealer stands on 17). We won’t know the outcome ahead of time, but we can precisely define the risk. The same cannot be said for picking the right stocks or the right romantic partner, or for creating business, earthquake-preparedness, or health plans. We may know the five-year rate of return for a company’s stock, the divorce rate in our state, the average frequency and strength of earthquakes in our region, and how old our grandfather lived to be, but we cannot use this information alone to calculate risk. We don’t know exactly what business trends will arise in the future, we don’t know whether we have a genetic disease that will strike us (or our romantic partner) down sooner than expected, and we can’t say for sure that a 7.0 earthquake is or is not possible next month.
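The difference is easy to see with numbers. For a known risk, the entire calculation fits in a few lines; the sketch below uses the standard odds of an even-money bet in American roulette (my example for illustration; Gigerenzer’s is slot machines, whose published odds vary by machine).

```python
from fractions import Fraction

# Known risk: an even-money bet on red in American roulette.
# 18 red pockets, 18 black, 2 green (0 and 00) -> 38 pockets total.
p_win = Fraction(18, 38)
p_lose = Fraction(20, 38)

# Expected value per $1 bet: a win pays +1, a loss costs -1.
ev = p_win * 1 + p_lose * (-1)
print(f"Expected value per $1 bet: {ev} = {float(ev):.4f}")  # -1/19 = -0.0526

# This is what calculable risk means: the probabilities are fixed and known,
# so the value of the bet can be computed in advance. No such calculation
# exists for stocks, romance, or next month's earthquake.
```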

But nudges can help us with these decisions. We can use statistical information about business development and international stock returns to identify general rules of thumb for investing. We can listen to parents and elders and learn from their advice and mistakes when selecting a romantic partner, intuiting the traits that make a good (or bad) spouse. We can overengineer our bridges and skyscrapers by 10% to give ourselves a little more assurance that they can survive a major and unexpected earthquake. Nudges are helpful because they augment our gut instincts and make visible the rules of thumb we might rely on.

Expecting everyone’s individual intuition and heuristics to be up to the task of navigating uncertainty is likely to lead to many poor choices. But if we pool the statistical information available, provide guides, communicate rules of thumb that have panned out for many people, and structure choices in ways that present this information well, then people can likely make marginally better decisions. My suggestion in this post is a nudge to use more nudges in moments of uncertainty. When certainty exists, or even when calculable risks exist, nudges may not be needed. But once we get beyond calculable risk, where we must rely on judgment and intuition, nudges are important tools to help people navigate uncertainty and improve their decision-making.
Quick Heuristics

I really like the idea of heuristics. I have always thought of heuristics as short-cuts for problem solving, or rules of thumb applied to given situations to ease cognitive demand. We live in an incredibly complex world, and the nature of reality cannot be deduced just by observing the world around us. For the world to get to the point where I can drink an espresso while listening to music streamed across the internet as I write a blog post, humanity collectively had to make discoveries involving microscopes, electromagnetism, and electricity, none of which were easily observable or intuitively understandable to our human ancestors.

To cope with a complex world and a limited ability to explore and understand that world, humans thrived through the use of heuristics. When faced with difficult problems and decisions, we substitute approximate but not exact answers. We can make a category judgement and reduce the number of decisions we have to make, taking a generalized path that will usually turn out well. Heuristics help us cope with the overwhelming complexity of the world, but they are not perfect, and they simplify the world according to the information we can observe and readily take in.

In Thinking Fast and Slow, Daniel Kahneman writes, “the heuristic answer is not necessarily simpler or more frugal than the original question – it is only more accessible, computed more quickly and easily. The heuristic answers are not random, and they are often approximately correct. And sometimes they are quite wrong.”

Heuristics are quick, which matters if you are foraging and hear a dangerous sound, if you need to pick a quick place for shelter as a storm approaches, or if you have to make fast decisions about how to behave in a small tribal group. The more fluidly and quickly a heuristic comes to mind, the more natural it will feel and the more strongly people will grasp it, even if it is not true. Stories and myths contain relatable elements and extend common experiences to complex problems like how to govern an empire, why storms occur, and how we should organize an economy. Heuristics give us short-cuts through these complexities, but they are biased toward our accessible world and experiences, which means they only approximate reality and cannot fully and accurately answer our questions. While they can get some concepts more or less correct and give us good approaches to life in general, they can also be very wrong, with serious consequences for many people over many generations.
More on Affect Heuristics

For me, one of the easiest examples of a heuristic that Daniel Kahneman shares in his book Thinking Fast and Slow is the affect heuristic. It is a bias that I know I fall into all the time, one that has led me to buy particular brands of shoes, has influenced how I think about certain foods, and has shaped the way I think about people. In the book, Kahneman writes, “The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).”

The world is a complex and tricky place, and we can only focus intense attention in one direction at a time. For a lot of us, that means we are focused on getting the kids ready for school, cooking dinner, or trying to keep the house clean. Fully understanding the benefits and drawbacks of a social media platform, a new traffic pattern, or how to invest for retirement may seem important, but it can be hard to find the time and mental energy to focus on a complex topic and organize our thoughts in a logical and coherent manner. Nevertheless, we are likely to be presented with situations where we have to decide what level of social media is appropriate for our children, offer comments on the new traffic pattern around the water cooler, or finally get around to setting up our retirement plan and deciding what to do with that old 401(k) from the job we left.

Without adequate time, energy, and attention to think through these difficult decisions, we still have to make choices and are asked to hold opinions on topics we are not very informed about. “The affect heuristic,” Kahneman writes, “simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy.” We substitute the hard question that requires detailed thought for a simple one: Do I like social media? Did the new traffic pattern make my commute slower? Do I like the way my retirement advisor presented the new investment strategy? In each case we rely on affect, our emotional reaction to something, and make decisions in line with our gut feelings. Of course my kid can use social media; I’m on it, I like it, and I want to see what they are posting. Ugh, that new traffic pattern is awful; what were they thinking putting that utility box where it blocks the view of the intersection? Obviously this is the best investment strategy for me; my advisor explained it well, and I liked it when they told me I was making a smart decision.

We don’t notice when we default to the affect heuristic. It is hard to recognize that we have shifted away from making detailed calculations and are relying solely on intuitions about how something makes us feel. Rather than admitting that we buy Nike shoes because our favorite basketball player wears them and we want to be like LeBron, we create a story in our heads about the quality of the shoes, the innovative design, and the complementary colors. We fall back on a quick set of factors that gives the impression of a thoughtful decision. In a lot of situations we probably can’t do much better than the affect heuristic, but it is worth asking whether our decisions are really being driven by affect. We might avoid buying things purely out of brand loyalty, and we might be a little calmer and more reasonable in debates with friends and family when we realize we are acting on affect and not on reason.
Fluency Versus Frequency

When it comes to the availability heuristic, fluency seems to be the most important factor. The ease with which an example of something comes to mind matters more than the real world frequency of the event. Salient examples of people being pulled over by the police, of celebrity divorces, or of wildfires cause our brains to consider these types of events to be more common and likely than they really are.
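As a toy model (my own construction, not one from the research), imagine estimating how common two kinds of events are by sampling from memory, where vivid events are easier to retrieve. Even when the vivid event is much rarer, the recall-based estimate inflates it.

```python
import random

random.seed(0)

# Toy model of fluency vs. frequency: memories are retrieved in proportion
# to salience rather than true frequency. The numbers are invented.
events = ["wildfire"] * 5 + ["house fire"] * 95        # true wildfire share: 5%
salience = {"wildfire": 10.0, "house fire": 1.0}       # vivid events surface easily

weights = [salience[e] for e in events]
recalled = random.choices(events, weights=weights, k=1000)

estimate = recalled.count("wildfire") / len(recalled)
print(f"True wildfire share: 5%; recall-based estimate: {estimate:.0%}")  # well above 5%
```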

In Thinking Fast and Slow, Daniel Kahneman shares the results of a study by the German psychologist Norbert Schwarz that demonstrates fluency versus frequency in our analysis of the world. Schwarz asked one group of participants to list six instances in which they behaved assertively and then rate their overall level of assertiveness. He asked a second group to list twelve instances of assertive behavior and then rate their overall assertiveness. The study showed that those asked to come up with six instances considered themselves more assertive than those asked to come up with twelve. Kahneman describes the results: “Self-ratings were dominated by the ease with which examples had come to mind. The experience of fluent retrieval of instances trumped the number retrieved.”

The logical expectation would be that asking people to list 12 instances of assertiveness would give people more reason to believe they were a more assertive person. However, that is not what the study showed. Instead, what Kahneman explains happened is that as you are asked to pull more examples from memory, your brain has a harder time remembering times when you were assertive. You easily remember a few stand-out assertive moments, but eventually you start to run out of examples. As you struggle to think of assertive times in your life, you start to underrate your assertiveness. On the other hand, if you only have to think of a handful of assertive moments, and your brain pulls those moments from memory easily, then the experience of easily identifying moments of assertiveness gives you more confidence with rating yourself as assertive.

What I find fascinating with the study Kahneman presents is that the brain doesn’t rely on facts or statistics to make judgments and assessments about the world. It is not setting a bar before analysis at which it can say, more examples of this and I am assertive, or fewer examples and I am not assertive. It is operating on feeling and intuition, fluidly moving through the world making judgments by heuristics. The brain is not an objective observer of the world, and its opinions, perspectives, and conclusions are biased by the way it operates. The study suggests that we cannot trust our simple judgments, even when they are about something as personal as our own level of assertiveness.
Thinking About Who Deserves Credit for Good Teamwork

Yesterday I wrote about the availability heuristic, the term Daniel Kahneman uses in his book Thinking Fast and Slow to describe the way our brains misjudge frequency, amount, and probability based on how easily an example of something comes to mind. In the book, Kahneman describes individuals being more likely to overestimate things like celebrity divorce rates if there was recently a high-profile and contentious celebrity divorce in the news. The easier it is to make an association or think of an example of a behavior or statistical outcome, the more we overweight that thing in our mental models and expectations for the world.

Overestimating celebrity divorce rates isn’t a very big deal, but the availability heuristic can have a serious impact on our lives if we work as part of a team or if we are married and have a family. The availability heuristic can influence how we think about who deserves credit for good teamwork.

Whenever you collaborate on a project, whether it is a college assignment, a proposal or set of training slides at work, or keeping the house clean on a regular basis, you are likely to overweight your own contributions relative to others’. You might be aware of someone who puts in a herculean effort and does well more than their share, but if everyone is chugging along completing a roughly equivalent workload, you will see yourself as doing more than the others. The reason is simple: you experience your own work firsthand. You only see everyone else’s handiwork once they have finished it and everyone has come back together. You suffer from availability bias because it is easier to recall the time and effort you put into the collaboration than it is to recognize how much work others pitched in. Kahneman describes the result in his book: “you will occasionally do more than your share, but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.”

Even if everyone did an equal amount of work, everyone is likely to feel as though they contributed more than the others. As Kahneman writes, there is more than 100% of credit to go around when you consider how much each person thinks they contributed. In marriages, this is important to recognize and understand. Spouses often complain that one person is doing more than the other to keep the house running smoothly, but if they complain to their partner about the unfair division of household labor, they are likely to end up in an unproductive argument with each person upset that their partner doesn’t recognize how much they contribute and how hard they work. Both will end up feeling undervalued and attacked, which is certainly not where any couple wants to be.
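Kahneman’s “more than 100%” observation is simple arithmetic, and a quick sketch makes it vivid (the inflation factor here is invented for illustration).

```python
# Four teammates who actually contribute equally (25% each), but each one
# inflates their own share because their own effort is the most available
# to memory. The 1.6x inflation factor is invented for illustration.

actual_share = 0.25
inflation = 1.6

perceived_own_share = actual_share * inflation
total_claimed = 4 * perceived_own_share

print(f"Each person's self-estimate: {perceived_own_share:.0%}")  # 40%
print(f"Total credit claimed: {total_claimed:.0%}")               # 160%
```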

Managers must be aware of this and find ways to encourage and celebrate the achievements of their team members while recognizing that each member may feel they are pulling more than their own weight. Letting everyone feel that they are doing more than their fair share is a good way to create unhelpful internal competition and factions within the workplace. No professional team wants to end up like a college or high school project group, where one person pulls an all-nighter and overwrites everyone else’s work while another seemingly disappears, then emails everyone at the last minute asking them not to rat them out to the teacher.

Individually, we should acknowledge that other people are not going to see and understand how much effort we feel that we put into the projects we work on. Ultimately, at an individual level we have to be happy with team success over our individual success. We don’t need to receive a gold star for every little thing that we do, and if we value helping others succeed as much as we value our own success, we will be able to overcome the availability heuristic in this instance, and become a more productive team member, whether it is in volunteer projects, in the workplace, or at home with our families.
The Science of Availability

Which presidential candidate is doing more advertising this year? Which college football team has been the most dominant over the last five years? Who has had the most songs on the Hot 100 over the last five years? You can probably come up with an intuitive answer to (at least one of) these questions even if you don’t follow politics, college football, or pop music very closely. But what you are doing when you come up with an intuitive answer isn’t really answering the question, but instead relying on substitution and the availability heuristic.

In Thinking Fast and Slow, Daniel Kahneman writes, “We defined the availability heuristic as the process of judging frequency by the ease with which instances come to mind.” So if you recently saw a few ads from the Trump campaign, your mind would probably intuit that his campaign is doing more advertising. If you remember that LSU won the college football national championship last year, you might have answered LSU; but if you see lots of people wearing Alabama hats on a regular basis, you might answer Alabama. And if you recently heard a Taylor Swift song, your intuitive guess might be that she has had the most Hot 100 hits.

Kahneman continues, “The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.” When we are asked to guess how often an event happens or what percent of a category fits a certain characteristic, our brains flip back through short-term memory for examples that match what we are looking for. The easier it is to remember an example the more weight we give to it.

I don’t really know who is doing more advertising, but I do know that I have seen a lot of Trump ads on YouTube, so it intuitively felt like he was doing more advertising, even though I might have just picked one channel where his ads were more salient. Overall, he may be doing less than the Biden campaign. Similarly, I didn’t initially remember that LSU won the national championship last year, but I did see someone wearing an Alabama sweatshirt recently, and that team came to mind quickly when I thought of dominant football programs. I also don’t have a clue who has had the most Hot 100 hits in the last five years, but people in my orbit on Twitter frequently post about Taylor Swift, so her name came to mind easily. I wasn’t doing any deep thinking; I was just scratching the surface of my memory for an easy answer.

Throughout Thinking Fast and Slow, Kahneman reveals instances where our thinking appears to be deep and nuanced but is really quick, intuitive, and prone to errors. In most instances we don’t do any deep calculation and just roll with the intuitive answer. But our intuition is often faulty, incomplete, and based on a substitution for the real question we are being asked. The stakes are low when we inaccurately estimate divorce rates for celebrities (an example from the book), but they can be high in other decision-making areas. If we are looking to buy a home and are concerned about flood risk, we will overweight the risk of a flood at a property if there have recently been a lot of news stories about flooding from a hurricane in the Gulf of Mexico. This could influence where we choose to live and whether we pay for expensive insurance. Little assumptions and misperceptions can nudge us in critical directions, positive or negative, and change whether we invest for our futures, fudge our taxes, or buy a new car. Recognizing that our brains make mistakes based on thinking strategies like the availability heuristic can help us in some large decision-making areas, so it is important to understand how our brains work, and where they can go wrong.
Affect Heuristics

I studied public policy at the University of Nevada, Reno, and one of the things I had to accept early in my studies was that humans are not as rational as we like to believe. We tell ourselves that we are making objective and unbiased judgments about the world to reach the conclusions we find. We tell ourselves that we are listening to smart people who truly understand the issues and the technicalities of policy and science, but studies of voting, policy preferences, and individual knowledge show that this is not the case.

We are nearing November, when the United States will vote for president and other elected officials. Few of us will spend much time investigating the candidates on the ballot in a thorough and rigorous way. Few of us will seek out in-depth and nuanced information about the policies our political leaders support or about referendum questions on the ballot. But many of us, perhaps the vast majority, will have strong views on policies ranging from tech company monopolies to tariffs to public health measures. We will reach unshakable conclusions and find a few snippets of fact to support our views. But this doesn’t mean we will truly understand any of the issues in a deep and complex manner.

Daniel Kahneman, in his book Thinking Fast and Slow, helps us understand what is happening when we vote, and reveals what I didn’t want to believe but was confronted with over and over in academic studies. He writes, “The dominance of conclusions over arguments is most pronounced where emotions are involved. The psychologist Paul Slovic has proposed an affect heuristic in which people let their likes and dislikes determine their beliefs about the world.”

Very few of us have a deep understanding of economics, international relations, or public health, but we are good at recognizing what is in our immediate self-interest and who represents the identities that are core to who we are. We know that having a leader who reflects and praises those identities will help improve the social standing of our group, and ultimately our own social status. By recognizing who our leader is and what it is in our self-interest to support, we learn which policy beliefs to adopt. We look to our leaders, learn what they believe and support, and follow their lead. We memorize a few basic facts and use them as justification for the beliefs we hold, rather than admit that our beliefs simply follow our emotional desire to align with a leader we believe will boost our social standing.

It is this affect heuristic that drives much of our political decision-making. It helps explain how we can support policies that don’t seem to immediately benefit us: we look at the larger group we want to be a part of and try to increase the social standing of that group, even at a personal cost. The affect heuristic shows that we want a conclusion to be true because we would benefit from it, and we use motivated reasoning to adopt beliefs that conveniently support our self-interest. The beliefs don’t need to be true; they just need to match our emotional valence and give us a shortcut to making decisions on complex topics.