Affect Heuristics

More on Affect Heuristics

For me, one of the easiest examples of a heuristic that Daniel Kahneman shares in his book Thinking Fast and Slow is the affect heuristic. It is a bias that I know I fall into all the time: it has led me to buy particular brands of shoes, has influenced how I think about certain foods, and has shaped the way I think about people. In his book Kahneman writes, “The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).”

 

The world is a complex and tricky place, and we can only focus a lot of attention in one direction at a time. For a lot of us, that means we are focused on getting kids ready for school, cooking dinner, or trying to keep the house clean. Trying to fully understand the benefits and drawbacks of a social media platform, a new traffic pattern, or how to invest in retirement may seem important, but it can be hard to find the time and mental energy to focus on a complex topic and organize our thoughts in a logical and coherent manner. Nevertheless, we are likely to be presented with situations where we have to make decisions about what level of social media is appropriate for our children, offer comments on new traffic patterns around the water cooler, or finally get around to setting up our retirement plan and deciding what to do with that old 401K from that job we left.

 

Without adequate time, energy, and attention to think through these difficult decisions, we have to make choices and are asked to have opinions on topics we are not very informed about. “The affect heuristic”, Kahneman writes, “simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy.” We substitute a simple question for the hard question that requires detailed thought: Do I like social media? Did the new traffic pattern make my commute slower? Do I like the way my retirement savings advisor presented a new investment strategy? In each case, we rely on affect, our emotional reaction to something, and make decisions in line with our gut feelings. Of course my kid can use social media; I’m on it, I like it, and I want to see what they are posting. Ugh, that new traffic pattern is awful; what were they thinking, putting that utility box where it blocks the view of the intersection? Obviously this is the best investment strategy for me; my advisor explained it well, and I liked it when they told me I was making a smart decision.

 

We don’t notice when we default to the affect heuristic. It is hard to recognize that we have shifted away from making detailed calculations to relying solely on intuitions about how something makes us feel. Rather than admitting that we buy Nike shoes because our favorite basketball player wears them and we want to be like LeBron, we create a story in our head about the quality of the shoes, the innovative design, and the complementary colors. We fall back on a quick set of factors that gives the impression of a thoughtful decision. In a lot of situations, we probably can’t do much better than the affect heuristic, but it is worth considering whether our decisions are really being driven by affect. We might avoid buying things out of pure brand loyalty, and we might be a little calmer and more reasonable in debates and arguments with friends and family when we realize we are acting on affect and not on reason.
Fluency of Ideas

Fluency of Ideas

Our experiences and narratives are extremely important to consider when we make judgments about the world; however, we rarely think deeply about the reasons why we hold the beliefs we do. We rarely pause to consider whether our opinions are biased, whether our limited set of experiences shapes the narratives that play in our minds, and how all of this influences our entire outlook on life. Instead, we rely on the fluency of ideas to judge our thoughts and opinions as accurate.

 

In Thinking Fast and Slow, Daniel Kahneman writes about ideas from legal scholar Cass Sunstein and economist Timur Kuran, explaining their views on fluency: “the importance of an idea is often judged by the fluency (and emotional charge) with which that idea comes to mind.” It is easy to characterize an entire group of people as hardworking, lazy, greedy, or funny based entirely on a single interaction with a single person from that group. We don’t pause to ask whether our interaction with one person is really a good reflection of everyone who fits the same group as that person; we instead allow the fluency of our past experiences to shape our opinions of all the people in that group.

 

And our ideas, and the fluency with which those ideas come to mind, don’t have to come from our own personal experience. If a claim is repeated often enough, we will have trouble distinguishing it from truth, even if it is absurd and has no connection to reality. The idea will come to mind more fluently, and consequently the idea will start to feel true. We don’t have to have direct experience with something if a great marketing campaign has lodged an opinion or slogan in our minds that we can quickly recall.

 

If we hold an important decision-making role, we must recognize this fluency bias. The fluency of ideas will drive us toward conclusions that might not be in our best interests. A clever marketing campaign, a trite saying repeated by salient public figures, or a few extreme yet random personal experiences can bias our judgment. We have to find a way to step back, recognize the narrative at hand, and find reliable data to help us make better decisions; otherwise we might end up judging ideas and making choices based on faulty reasoning.
As an addendum to this post (originally written on 10/04/2020), this morning I began reading The Better Angels of Our Nature: Why Violence Has Declined by Steven Pinker. Early in the introduction, Pinker states that violence in almost all forms is decreasing, despite the fact that for many of us it feels as though violence is as front and center in our world as ever before. Pinker argues that our subjective experience of out-of-control violence is in some ways due to the fluency bias that Kahneman describes from Sunstein and Kuran. Pinker writes,

 

“No matter how small the percentage of violent deaths may be, in absolute numbers there will always be enough of them to fill the evening news, so people’s impressions of violence will be disconnected from the actual proportions.” 

 

The fluency effect causes an observation to feel correct, even if it is not reflective of actual trends or rates in reality.
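Pinker’s point is ultimately arithmetic: even a very small rate, applied to a large population, produces a steady stream of absolute events. A minimal sketch, using invented round numbers purely for illustration:

```python
# Hypothetical numbers for illustration only: a low violent-death rate
# still yields a large absolute count in a big population.
population = 330_000_000      # roughly the size of a large country
rate_per_100k = 5             # a historically low rate per 100,000 people

deaths_per_year = population * rate_per_100k / 100_000
deaths_per_day = deaths_per_year / 365

print(f"{deaths_per_year:,.0f} deaths per year")  # 16,500 deaths per year
print(f"{deaths_per_day:,.1f} deaths per day")    # 45.2 deaths per day
```

Roughly forty-five incidents a day is more than enough to fill every evening newscast, even though the underlying rate is tiny, which is exactly why our impressions become disconnected from the actual proportions.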
Why We Talk About Human Nature

Why We Talk About Human Nature

I entered a Master’s in Public Administration program at the University of Nevada in 2016, the same semester that President Donald Trump was elected. I was drawn toward public policy because I love science, because I have always wanted to better understand how people come to hold political beliefs, and because I thought that bringing my rational, science-based mind to public policy would open doors and avenues for me that were desperately needed in the world of public administration and policy. What I learned, and what we have all learned since President Trump took office, is that politics is not about policy, public administration is not about the high-minded ideals we say it is about, and rationality is not and cannot be at the heart of public policy. Instead, politics is about identity, and public administration is about systems and structures that benefit those we deem deserving and punish those we deem deviant. Public policy isn’t rational; it’s about self-interest and individual and group preferences. And this connects to the title of this post. We talk about human nature because how we define, understand, and perceive human nature can help us rationalize why our self-interest is valuable in public policy, why one group should be favored over another, and why one system that rewards some people is preferable to another system that rewards other people.

 

In his book Thinking Fast and Slow, Daniel Kahneman writes, “policy is ultimately about people, what they want and what is best for them. Every policy question involves assumptions about human nature, in particular about the choices that people may make and the consequences of their choices for themselves and society.” The reason we talk about human nature is that it serves as the foundation upon which all of our social systems and structures are built. All of our decisions rest on fundamental assumptions about what we want, what we are inherently inclined to do, and how we will behave as individuals and as part of a collective. However, this discussion is complicated because what we consider to be human nature is subject to bias, misunderstanding, and motivated reasoning. Politics and public policy are not rational because we all live with narrow understandings of what we want human nature to mean.

 

Personally, I think our conceptions of human nature are generally too narrow and limiting. I am currently reading Yuval Noah Harari’s book Sapiens, and he makes a substantial effort to show the diversity and seeming randomness in the stories humans have created over tens of thousands of years, and how humans have lived in incredibly different circumstances, with different beliefs, different cultures, and different lifestyles throughout time. It is a picture of human nature that doesn’t quite make the jump to arguing that there is no human nature, but argues that human nature is a far broader topic than what we typically focus on. I think Harari is correct, but someone who wants questions of religion to be central to human nature, someone who wants capitalistic competition to be central to human nature, or someone who wants altruism to be a deep facet of human nature might disagree with him.

 

Ultimately, we argue over human nature because how we define it can influence who wins and who loses in our society. It can shape who we see as deserving and who we see as deviant. The way we frame human nature can structure the political systems we adopt, the leaders we favor, and the economic systems that will run most of our lives. Discussions about human nature appear to be scientific, but they are often biased and flawed, and in the end what we really care about is our personal self-interest and seeing our group advance, even at the expense of others. Politics is not rational, as we have all learned in nearly four years of a Donald Trump presidency, because we have different views of what the people want and what is best for them, and flawed understandings of human nature influence those views and the downstream political decisions that we make.
How We Choose to Measure Risk

How We Choose to Measure Risk

Risk is a tricky thing to think about, and how we choose to measure and communicate risk can make it even more challenging to comprehend. Our brains like to categorize things, and categorization is easiest when the categories are binary or represent three or fewer distinct possibilities. Once you start adding options and possible outcomes, decisions quickly become overwhelmingly complex, and our minds have trouble sorting through the possibilities. In his book Thinking Fast and Slow, Daniel Kahneman discusses the challenges of thinking about risk and highlights another layer of complexity: which measurements we are going to use to communicate and judge risk.

 

Humans are pretty good at estimating coin flips – that is to say, our brains do OK with binary 50-50 outcomes (although, as Kahneman shows in his book, this can still trip us up from time to time). Once we have to start thinking about complex statistics, like how many people will die from cancer caused by smoking if they smoke X packs of cigarettes per month for Y years, our brains start to have trouble keeping up. However, there is an additional decision that needs to be layered on top of statistics like these before we can begin to understand them: how we are going to report the death statistics. Will we choose to report deaths per thousand smokers? Will we choose to report deaths relative to the number of packs smoked over a number of years? Will we just choose to report deaths among all smokers, regardless of whether they smoked one pack per month or one pack before lunch every day?

 

Kahneman writes, “the evaluation of the risk depends on the choice of a measure – with the obvious possibility that the choice may have been guided by a preference for one outcome or another.”

 

Political decisions cannot be escaped, even when we are trying to make objective and scientific statements about risk. If we want to convey that something is dangerous, we might choose to report overall death numbers across the country. Those numbers might sound large, even though they may represent a very small fraction of incidents. In our lives today, this may be done with COVID-19 deaths, instances of voter fraud, or wildfire burn acreage. Our brains will have a hard time comprehending risk in each of these areas, and adding the complexity of how that risk is calculated, measured, and reported can make it virtually impossible for any of us to comprehend risk. Clear and accurate risk reporting is vital for helping us understand important risks in our lives and in society, but the entire process can be derailed if we choose measures that don’t accurately reflect risk or that muddy the waters of exactly what the risk is.
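The choice of measure really is just a framing choice over one dataset. A small sketch, with invented numbers for a hypothetical cohort of smokers, shows how the same figure can be made to sound alarming or trivial:

```python
# Invented numbers for illustration: one hypothetical cohort of smokers,
# with the same underlying data reported three different ways.
smokers = 2_000_000
deaths = 8_000  # hypothetical smoking-attributable deaths in the cohort

# Framing 1: absolute count (sounds large)
print(f"{deaths:,} deaths")                            # 8,000 deaths

# Framing 2: deaths per thousand smokers (sounds moderate)
per_thousand = deaths / (smokers / 1_000)
print(f"{per_thousand:.0f} deaths per 1,000 smokers")  # 4 deaths per 1,000 smokers

# Framing 3: percentage of the cohort (sounds small)
pct = deaths / smokers * 100
print(f"{pct:.1f}% of smokers")                        # 0.4% of smokers
```

All three lines describe the identical underlying data; which one a report leads with can be guided, as Kahneman warns, by a preference for one outcome or another.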
The Emotional Replica of Reality in our Brains

The Emotional Replica of Reality Within Our Brains

It feels weird to acknowledge that the model of reality within our brains is nothing more than a model. It is a construction of what constitutes reality based on our experiences and on the electrical stimuli that reach our brains from various sensory organs, tissues, and nerve endings. The brain doesn’t have a model for things that it has no way of experiencing or imagining. Like the experience of falling into a black hole, some experiences can never be fully substituted by representations and will forever be unknowable to our brains. Consequently, the model of reality that our brain uses for everyday operations can only include the limited slice of reality that is available to our experiences.

 

What results is a distorted picture of the world. This was not too much of a problem for our ancestors living as hunter-gatherers in small tribes. It didn’t matter if they fully understood the precise risk of tiger attacks or poisonous fungi, as long as they had heuristics to keep them away from dangerous situations and questionable foods. They didn’t need to hear at the frequency of a bat’s echolocation pulses, they didn’t need to see ultraviolet, and they didn’t need to sense the earth’s magnetic field. Precision and completeness weren’t as important as a general sense of the world for pattern recognition, and enough fear and memory to stay safe and find reliable food.

 

Today, however, we operate in complex social structures, and the narratives we tell about ourselves, our societies, and how we should interact can have lasting influences on our own lives and the lives of generations to come. How we understand the world is often shaped by our emotional reaction to it, rather than by a complete set of scientific, reality-based details and information. As Daniel Kahneman writes in Thinking Fast and Slow, “The world in our heads is not a precise replica of reality; our expectations about the frequency of events are distorted by the prevalence and emotional intensity of the messages to which we are exposed.”

 

Kahneman writes about news reporting of strange and extreme phenomena, and how it leads us to believe that very rare events, like tornado deaths, are more likely than mundane and common causes of death, such as those resulting from asthma complications. Things that are dramatic and unique feel more noteworthy and are easier for us to remember and recall. When that happens, the events feel less like strange outliers and more like normal occurrences. The picture of reality operating in our minds is altered and distorted by our experiences, the information we absorb, and our emotional reactions to both.

 

For a social species, this can have dramatic consequences. If we generalize a character trait of one person to an entire group, we can develop dangerous stereotypes that influence our interactions with hundreds or thousands of people. A single salient event can shape how we think about problems or opportunities in our communities and societies. Rather than fully understanding our reaction and the event itself, we struggle through narratives that seek to combine thousands of individual perceptions of reality, each influenced in unique ways by conflicting emotions and opinions about what has happened. Systems and structures matter, especially when our brains operate on incomplete versions of reality rather than concrete ones, and can be swayed by our emotional reactions to those systems and structures.
Fluency Versus Frequency

Fluency Versus Frequency

When it comes to the availability heuristic, fluency seems to be the most important factor. The ease with which an example of something comes to mind matters more than the real world frequency of the event. Salient examples of people being pulled over by the police, of celebrity divorces, or of wildfires cause our brains to consider these types of events to be more common and likely than they really are.

 

In Thinking Fast and Slow, Daniel Kahneman shares results from a study by German psychologist Norbert Schwarz that demonstrates fluency versus frequency in our analysis of the world. Schwarz asked participants to list six instances in which they behaved assertively and then to rate their overall level of assertiveness. In a second condition, Schwarz asked participants to list twelve instances in which they were assertive and then to rate their overall level of assertiveness. The study showed that those who were asked to come up with six instances of assertiveness considered themselves more assertive than those asked to come up with twelve. Kahneman describes the results by writing, “Self-ratings were dominated by the ease with which examples had come to mind. The experience of fluent retrieval of instances trumped the number retrieved.”

 

The logical expectation would be that asking people to list twelve instances of assertiveness would give them more reason to believe they were assertive people. However, that is not what the study showed. Instead, as Kahneman explains, the more examples you are asked to pull from memory, the harder your brain has to work to remember times when you were assertive. You easily remember a few stand-out assertive moments, but eventually you start to run out of examples. As you struggle to think of assertive times in your life, you start to underrate your assertiveness. On the other hand, if you only have to think of a handful of assertive moments, and your brain pulls those moments from memory easily, then the experience of easily identifying moments of assertiveness gives you more confidence in rating yourself as assertive.

 

What I find fascinating about the study Kahneman presents is that the brain doesn’t rely on facts or statistics to make judgments and assessments about the world. It does not set a bar before the analysis at which it can say, this many examples and I am assertive, fewer and I am not. It operates on feeling and intuition, fluidly moving through the world and making judgments by heuristics. The brain is not an objective observer of the world, and its opinions, perspectives, and conclusions are biased by the way it operates. The study suggests that we cannot trust our simple judgments, even when they are about something as personal as our own level of assertiveness.
Teamwork Contributions

Thinking About Who Deserves Credit for Good Teamwork

Yesterday I wrote about the availability heuristic, the term Daniel Kahneman uses in his book Thinking Fast and Slow to describe the way our brains misjudge frequency, amount, and probability based on how easily an example of something comes to mind. In the book, Kahneman describes individuals overestimating things like celebrity divorce rates after a high-profile and contentious celebrity divorce has been in the news. The easier it is for us to make an association or think of an example of a behavior or statistical outcome, the more we will overweight that thing in our mental models and expectations for the world.

 

Overestimating celebrity divorce rates isn’t a very big deal, but the availability heuristic can have a serious impact on our lives if we work as part of a team or if we are married and have a family. The availability heuristic can influence how we think about who deserves credit for good teamwork.

 

Whenever you collaborate on a project, whether it is a college assignment, a proposal or set of training slides at work, or keeping the house clean on a regular basis, you are likely to overweight your own contributions relative to others’. You might be aware of someone who puts in a herculean effort and does far more than their share, but if everyone is chugging along completing a roughly equivalent workload, you will see yourself as doing more than the others. The reason is simple: you experience your own work firsthand. You only see everyone else’s handiwork once they have finished it and everyone has come back together. You suffer from availability bias because it is easier for you to recall the time and effort you put into the group collaboration than it is for you to recognize and understand how much work and effort others pitched in. Kahneman describes the result in his book: “you will occasionally do more than your share, but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.”

 

Even if everyone did an equal amount of work, everyone is likely to feel as though they contributed more than the others. As Kahneman writes, there is more than 100% of credit to go around when you consider how much each person thinks they contributed. In marriages, this is important to recognize and understand. Spouses often complain that one person is doing more than the other to keep the house running smoothly, but if they complain to their partner about the unfair division of household labor, they are likely to end up in an unproductive argument with each person upset that their partner doesn’t recognize how much they contribute and how hard they work. Both will end up feeling undervalued and attacked, which is certainly not where any couple wants to be.
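Kahneman’s more-than-100% observation is easy to make concrete. A toy sketch with hypothetical self-ratings for a four-person team (the names and percentages are made up for illustration):

```python
# Hypothetical self-reported shares of credit on a four-person project.
# Each member experiences their own work firsthand, so each estimate
# is a little inflated, and the claims sum to well over 100%.
self_reported_share = {"Ana": 0.35, "Ben": 0.30, "Cai": 0.30, "Dee": 0.25}

total_claimed = sum(self_reported_share.values())
print(f"Total credit claimed: {total_claimed:.0%}")  # Total credit claimed: 120%
```

If the work were divided evenly, each person actually contributed 25%, yet every individual estimate but one is above that, and the overage is invisible to each member until the claims are put side by side.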

 

Managers must be aware of this and find ways to encourage and celebrate the achievements of their team members while recognizing that each team member may feel they are pulling more than their own weight. Letting everyone stew in the feeling that they are doing more than their fair share is a good way to create unhelpful internal competition and factions within the workplace. No professional work team wants to end up like a college or high school project group, where one person pulls an all-nighter, overwriting everyone else’s work, and another seemingly disappears, only to email everyone at the last minute asking them not to rat them out to the teacher.

 

Individually, we should acknowledge that other people are not going to see and understand how much effort we feel we put into the projects we work on. Ultimately, at an individual level we have to value team success over our individual success. We don’t need a gold star for every little thing we do, and if we value helping others succeed as much as we value our own success, we will be able to overcome the availability heuristic in this instance and become more productive team members, whether on volunteer projects, in the workplace, or at home with our families.
The Availability Heuristic

The Science of Availability

Which presidential candidate is doing more advertising this year? Which college football team has been the most dominant over the last five years? Who has had the most songs on the Hot 100 over the last five years? You can probably come up with an intuitive answer to (at least one of) these questions even if you don’t follow politics, college football, or pop music very closely. But what you are doing when you come up with an intuitive answer isn’t really answering the question, but instead relying on substitution and the availability heuristic.

 

In Thinking Fast and Slow, Daniel Kahneman writes, “We defined the availability heuristic as the process of judging frequency by the ease with which instances come to mind.” So if you recently saw a few ads from the Trump campaign, your mind would probably intuit that his campaign is doing more advertising. If you remember that LSU won the college football national championship last year, you might have answered LSU; but if you see lots of people wearing Alabama hats on a regular basis, you might answer Alabama. And if you recently heard a Taylor Swift song, your intuitive guess might be that she has had the most Hot 100 hits.

 

Kahneman continues, “The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.” When we are asked to guess how often an event happens or what percent of a category fits a certain characteristic, our brains flip back through short-term memory for examples that match what we are looking for. The easier it is to remember an example the more weight we give to it.

 

I don’t really know who is doing more advertising, but I do know that I have seen a lot of Trump ads on YouTube, so it intuitively felt like he was doing more advertising, even though I might have just picked one channel where his ads were more salient. Overall, he may be doing less than the Biden campaign. Similarly, I didn’t initially remember that LSU won the national championship last year, but I did see someone wearing an Alabama sweatshirt recently, and that team came to mind quickly when thinking of dominant football programs. I also don’t have a clue who has had the most Hot 100 hits in the last five years, but people in my orbit on Twitter frequently post about Taylor Swift, so her name came to mind easily. I wasn’t doing any deep thinking; I was just scratching the surface of my memory for an easy answer.

 

Throughout Thinking Fast and Slow, Kahneman reveals instances where our thinking appears to be deep and nuanced but is really quick, intuitive, and prone to errors. In most instances we don’t do any deep calculation or thinking; we just roll with the intuitive answer. But our intuition is often faulty, incomplete, and based on a substitution for the real question we are being asked. The stakes are low when we inaccurately estimate divorce rates for celebrities (an example from the book), but they can be high in other decision-making areas. If we are looking to buy a home and are concerned about flood risk, we will overweight the risk of a flood at a property if there have been a lot of news stories about flooding from a hurricane in the Gulf of Mexico. This could influence where we choose to live and whether we pay for expensive insurance. Little assumptions and misperceptions can nudge us in critical directions, positive or negative, and change whether we invest for our futures, fudge our taxes, or buy a new car. Recognizing that our brains make mistakes based on thinking strategies like the availability heuristic can help us in some large decision-making areas, so it is important to understand how our brains work and where they can go wrong.
The Environment of the Moment

The Environment of the Moment

“The main moral of priming research is that our thoughts and our behavior are influenced, much more than we know or want, by the environment of the moment. Many people find the priming results unbelievable, because they do not correspond to subjective experience. Many others find the results upsetting, because they threaten the subjective sense of agency and autonomy.”

 

Daniel Kahneman includes the above quote in his book Thinking Fast and Slow when recapping his chapter on anchoring effects. The quote highlights the surprising and conflicting reality of research on priming and anchoring. The research shows that our minds are not always honest with us, or at least are not capable of consciously recognizing everything taking place within them. Seemingly meaningless cues in our environment can influence a great deal of what happens in our brains. Symbols, ideas, and concepts present in our environment can make us more defensive, more generous toward charity, and more prone to think certain thoughts.

 

We all accept that when we are hungry, when our allergies are overwhelming, or when we are frustrated from being cut off on the freeway, our behaviors will change. We know these situations will make us less patient, more likely to glare at someone who didn’t mean to offend us, and more likely to grab a donut for breakfast because we are not in the mood for flavor-lacking oatmeal. But somehow, even though we know external events are influencing our internal thinking and decision-making, this still seems to be in our conscious control in one way or another. A hearty breakfast, a few allergy pills, and a few deep breaths to calm us down are all we need to get back to normal and be in control of our minds and behavior.

 

It is harder to accept that our minds, moods, generosity, behavior toward others, and stated beliefs could be impacted just as easily by factors that we don’t even notice. We see some kind of split between being short with someone because we are hungry and being short with someone because an advertisement on our way to work primed us to be more selfish. We don’t believe that we will donate more to charity when the charity asks for a $500 donation rather than a $50 donation. In each of these situations our conscious, rational brain produces an explanation for our behavior based on observations the conscious mind can make. We are not aware of the primes and anchors impacting our behavior, so consciously we don’t believe they have any impact on us at all.

 

Nevertheless, research shows that our minds are not as independent and controllable as we subjectively believe. Kahneman’s quote shows that traditional understandings of free will fall apart when faced with research on priming and anchoring effects. We don’t like to admit that random and seemingly innocuous cues in the environment of the moment shape us, because doing so threatens the narratives and stories we want to believe about who we are, why we do the things we do, and how our society is built. It is scary, possibly upsetting, and violates basic understandings of who we are, but it is accurate and important to accept if we want to behave and perform better in our lives.
Anchoring Effects

Anchoring Effects

Anchoring effects were one of the psychological phenomena I found most interesting in Daniel Kahneman’s book Thinking Fast and Slow. In many situations in our lives, random numbers seem to be able to influence other numbers that we consciously think about, even when there is no reasonable connection between the random number we see and the numbers we consciously use for another purpose. As Kahneman writes about the anchoring effect, “It occurs when people consider a particular value for an unknown quantity before estimating that quantity.”

 

Several examples of anchoring effects are given in the book. In one instance, judges were asked to assess how much a nightclub should be fined for playing loud music long after the town’s quiet hours. Real-life judges who have to make these legal decisions were presented with the name of the club and information about the violation of the noise ordinance. The fictitious club was named after the fictitious street it was located on. In some instances, the club name was something along the lines of 1500 First Street, and in others it was something like 10 First Street. Judges consistently assessed a higher fine for the club named 1500 First Street than for the club named 10 First Street. Seemingly random and unimportant information, the numbers in the street address in the club’s name, had a real impact on the fine that judges, on average, thought the club should pay.

 

In other examples, Kahneman shows that we come up with different guesses of how old Gandhi was when he died depending on whether we are first asked if he was older than 35 or younger than 114. In another experiment, a random spin of a wheel influenced people’s guesses of the number of African nations in the UN. In all of these examples, when we have to think of a number we don’t know, or one that involves subjective judgment, other random numbers can influence what comes to mind.

 

This can have real consequences in our lives when we are looking to buy something or make a donation or investment. Retailers may present us with high anchors in an effort to prime us to be willing to accept a higher price than we would otherwise pay for an item. If you walk into a sunglass shop and see two prominently displayed sunglasses with very high prices, you might not be as surprised by the high prices listed on other sunglasses, and might even consider slightly lower prices on other sunglasses as a good deal.

 

It is probably safe to say that sales prices in stores, credit card interest rates, and investment management fees are carefully crafted with anchoring effects in mind. Retailers want you to believe that a high price on an item is a fair price, and could be higher if they were not willing to offer you a deal. Credit card companies and investment brokers want you to believe the interest rates and management fees they charge are small, and might try to prime you with large numbers relative to the rates they quote you. We probably can’t completely overcome the power of anchoring effects, but if we know what to look for, we might be better at taking a step back and analyzing the rates and costs a little more objectively. If nothing else, we can pause and doubt ourselves a little more when we are sure we have found a great deal on a new purchase or investment.