Acquisition Responsibility

We are not always responsible for the acquisition of our virtues and vices. For some of us, being good-natured and virtuous toward other people comes naturally, and for others of us, being arrogant or closed-minded comes naturally or was pushed onto us by forces we could not control. I think it is reasonable to say that virtues likely require more training, habituation, imitation, and intentionality to acquire than vices, so in that sense we are more responsible for virtue acquisition than vice acquisition. It is useful to think about becoming versus being when we think about virtues and vices because it helps us better consider individual responsibility. Making this distinction helps us think about blameworthiness and deservingness, and it can shape the narratives that influence how we behave toward others.
In Vices of the Mind, Quassim Cassam writes, “a person who is not responsible for becoming dogmatic might still be responsible for being that way. Acquisition responsibility is backward-looking: it is concerned with the actual or imagined origin of one’s vices.”
In the book, Cassam focuses on epistemic vices, or vices that obstruct knowledge. He uses an example from Heather Battaly of a young man who is unfortunate enough to grow up in a part of the world controlled by the Taliban. The young man will undoubtedly be closed-minded (at the very least) as a result of being indoctrinated by the Taliban. There is little the man could do to be more open-minded, to avoid adopting a specific viewpoint informed by the biases, prejudices, and agendas of the Taliban. It is not reasonable to say that the man has acquisition responsibility for his closed-mindedness. Many of our epistemic vices are like this: they are the results of forces beyond our control or invisible to us, and they are in some ways natural cognitive errors that come from misperceptions of the world.
When we think about vices in this way, I would argue that it should change how we think about people who hold such vices. It seems to me that it would be unreasonable to scorn everyone who holds a vice whose acquisition they had no control over. Being backward-looking doesn’t help us think about how to move forward. It is important to recognize that people hate being held responsible for things they had no control over, even if that thing led to serious harms for other people. An example might be people who have benefited from structural racism, and might like to see systems and institutions change to be less structurally racist, but don’t want to be blamed for a system they didn’t recognize or know they contributed to. Being stuck with a backward-looking view frustrates people, makes them feel ashamed and powerless, and prevents progress. People would rather argue that it wasn’t their fault and that they don’t deserve blame than think about ways to move forward. Keeping this in mind when we think about how to address and eliminate vices for which people are not acquisition responsible is important if we want to continue to grow as individuals and societies and if we want to successfully overcome epistemic vices.
Do People Make the Best Choices?

My wife works with families with children with disabilities, and for several years I worked in the healthcare space. A common idea between our two worlds was that the people being assisted are the experts on their own lives, and they know what is best for them. Parents are the experts for their children, and patients are the experts in their own health. Even if parents don’t know all the intervention strategies to help a child with disabilities, and even if patients don’t have an MD from Stanford, they are still the experts in their own lives and what they and their families need.

 

But is this really true? In recent years there has been a bit of a customer service pushback in the world of business, more of a recognition that the customer isn’t always right. Additionally, research from the field of cognitive psychology, like much of the research from Daniel Kahneman’s book Thinking Fast and Slow that I wrote about, demonstrates that people can have huge blind spots in their own lives. People cannot always think rationally, in part because their brains are limited in their capacity to handle lots of information and because their brains can be tempted to take easy shortcuts in decision-making that don’t always take into account the true nature of reality. Add to Kahneman’s research the ideas put forth by Robin Hanson and Kevin Simler in The Elephant in the Brain, where the authors argue that our minds intentionally hide information from ourselves for political and personal advantage, and we can see that individuals can’t be trusted to always make the best decisions.

 

So while no one else may know a child as well as the child’s parents, and while no one knows your body and health as well as you do, your status as the expert of who you are doesn’t necessarily mean you are in the best position to always make choices and decisions that are in your own best interest. Biases, cognitive errors, and simple self-deception can lead you astray.

 

If you accept that you as an individual, and everyone else individually, cannot be trusted to always make the best choices, then it is reasonable to think that someone else can step in to help improve your decision-making in certain predictable instances where cognitive errors and biases can be anticipated. This is a key idea in the book Nudge by Cass Sunstein and Richard Thaler. In defending their ideas for libertarian paternalism, the authors write, “The false assumption is that almost all people, almost all of the time, make choices that are in their best interest or at the very least are better than the choices that would be made by someone else. We claim that this assumption is false – indeed, obviously false.”

 

In many ways, our country prefers to operate with markets shaping the main decisions and factors of our lives. We like to believe that we make the best choices for our lives, and that aggregating our choices into markets will allow us to minimize the costs of individual errors. The idea is that we will collectively make the right choices, driving society in the right direction and revealing the best option and decision for each individual without deliberate tinkering in the process. However, we have seen that markets don’t encourage us to save as much as we should and markets can be susceptible to the same cognitive errors and biases that we as individuals all share.  Markets, in other words, can be wrong just like us as individuals.

 

Libertarian paternalism helps overcome the errors of markets by providing nudges to help people make better decisions. Setting up systems and structures that make saving for retirement easier helps correct a market failure. Outsourcing investment strategies, rather than each of us individually making stock trades, helps ensure that shared biases and panics don’t overwhelm the entire stock exchange. The reality is that we as individuals are not rational, but we can develop systems and structures that provide us with nudges to help us act more rationally, overcoming the reality that we don’t always make the choices that are in our best interest.
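To see why a small structural change like this can matter, here is a toy sketch, in Python, of a default-enrollment nudge for retirement saving. The participation rates and head counts are hypothetical assumptions of mine, not figures from Nudge; the point is only that when most people stick with the default, choosing the default largely chooses the outcome, even though everyone remains free to switch.

```python
# Toy sketch of a default-enrollment nudge; all numbers are made up for illustration.
# The idea from Nudge: most people stick with whatever the default is, so flipping the
# default from opt-in to opt-out raises participation without removing anyone's choice.
EMPLOYEES = 1000
STICK_WITH_DEFAULT = 0.80      # assumed share of people who never change the default
WOULD_ACTIVELY_ENROLL = 0.50   # assumed share of active choosers who enroll

def participation(enrolled_by_default: bool) -> int:
    passive = int(EMPLOYEES * STICK_WITH_DEFAULT)          # people who keep the default
    active = EMPLOYEES - passive                           # people who make a choice
    enrolled_active = int(active * WOULD_ACTIVELY_ENROLL)  # active choosers who enroll
    return (passive if enrolled_by_default else 0) + enrolled_active

print("Opt-in default :", participation(False), "of", EMPLOYEES, "enrolled")
print("Opt-out default:", participation(True), "of", EMPLOYEES, "enrolled")
```

With these assumed numbers, flipping the default moves enrollment from 100 people to 900 people, without forbidding anyone from opting out.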
Can We Avoid Cognitive Errors?

Daniel Kahneman is not very hopeful when it comes to our ability to avoid cognitive errors. Toward the end of his book Thinking Fast and Slow, a book all about cognitive errors, predictable biases, and situations in which we can recognize such biases and thinking errors, Kahneman isn’t so sure there is much we can actually do in our lives to improve our thinking.

 

Regarding his own thinking, Kahneman writes, “little can be achieved without considerable effort. As I know from experience, System 1 is not readily educable. Except for some effects that I attribute mostly to age, my intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy as it was before I made a study of these issues.”

 

Kahneman’s book is fantastic in part because of his humility. It would be easy to take a book on illusions, cognitive errors, biases, and predictable fallacies and use it to show how much smarter you are than everyone else who makes such thinking mistakes. However, Kahneman uses his own real-life examples throughout the book to show how common and easy it is to fall into ways of thinking that don’t actually reflect reality. What is unfortunate, though, is how hard it is to actually take what you learn from the book and apply it to your own life. If the author himself can hardly improve his own thinking, then those of us who read the book likely won’t make big changes in our thinking either.

 

“The upshot is that it is much easier to identify a minefield when you observe others wandering into it than when you are about to do so. Observers are less cognitively busy and more open to information than actors,” Kahneman continues. While we might not be able to improve our thinking simply by knowing about cognitive errors and being aware of predictable biases, we can at least recognize them in others. This can help us be more thoughtful when we critique or gossip about others (something we all do even if we claim we don’t).

 

Beyond improving the way we gossip or judge others, Kahneman’s research and his book are incredibly valuable for anyone who is in a design-focused role. If you are creating a layout for a webpage, a seating arrangement at a restaurant, or the standard operating procedures for a company, you have an opportunity to design and develop a process and flow that takes cognitive errors and predictable biases into account. Because it is easier to observe others making mistakes than to observe those mistakes in ourselves, we can watch for situations where people are led astray, and help get them back on course. We can develop systems and structures that take our biases and cognitive errors into account, and minimize the damage they may do. We can set the world up to help guide us in a reasonable way through our cognitive errors and biases, but only if we know what to look for.
Endowment Effects

In his book Thinking Fast and Slow, Daniel Kahneman discusses an experiment he helped run to explore the endowment effect. The endowment effect is a cognitive fallacy that helps explain our attachment to things and our unwillingness to part with objects, even when we are offered something greater than the objective value of the object itself. We endow the object with greater significance than is really warranted, and in his book, Kahneman shows that this has been studied with Super Bowl tickets, wine, and coffee mugs.

 

Kahneman helped run experiments at a few different universities where college students were randomly given coffee mugs with the university logo. The mugs were worth about $6 each and were randomly distributed to about half of a classroom. Students were allowed to buy or sell the mugs, and the researchers saw a divergence in the value assigned to the mugs by the students who randomly obtained a mug and those who didn’t. Potential sellers were willing to part with the mug for about $7, a price above the actual value of the mug. Buyers, however, were generally only willing to purchase a mug for about $3, or half the value of the mug.
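To make the asymmetry concrete, here is a minimal simulation sketch in Python. This is my own illustration, not Kahneman’s actual study design: the class size, the range of private valuations, and the 2x endowment premium are all assumed numbers, chosen only to reproduce a gap like the roughly $7 asks and $3 bids described above.

```python
import random

# A minimal sketch of the mug market (assumed numbers, not Kahneman's study design).
random.seed(1)

N = 44                     # hypothetical class size
HALF = N // 2              # half the class is randomly given a mug
ENDOWMENT_PREMIUM = 2.0    # assumed: owners demand about twice their private valuation

# Everyone privately values a generic $6 mug at somewhere between $2 and $5.
private_values = [random.uniform(2.0, 5.0) for _ in range(N)]
owner_values, buyer_values = private_values[:HALF], private_values[HALF:]

# Owners' asking prices are inflated by the endowment premium;
# buyers simply bid their private valuation.
asks = sorted(v * ENDOWMENT_PREMIUM for v in owner_values)
bids = sorted(buyer_values, reverse=True)

# Pair the cheapest sellers with the most eager buyers and count trades that clear.
trades = sum(bid >= ask for ask, bid in zip(asks, bids))

print(f"median ask ~${asks[HALF // 2]:.2f}, median bid ~${sorted(bids)[HALF // 2]:.2f}")
print(f"mugs that change hands: {trades} of {HALF} possible")
```

Dropping ENDOWMENT_PREMIUM to 1.0 lets roughly half the mugs change hands, which is about what you would expect if owners treated the mug as just another object held for exchange.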

 

Kahneman suggests that the endowment effect has something to do with the unequal values assigned to the mug by those who received a mug and those who didn’t. He suggests that it is unlikely that those who received the mugs really wanted a university mug and particularly valued a mug relative to those who didn’t receive one. Those students should have been willing to trade the mug for $3, which could be used to purchase something that they may have actually wanted, rather than keep a random mug. To explain why they didn’t sell their mugs, Kahneman suggests that the mugs became endowed with additional value by those who received them.

 

A further study showed similar effects. When all students in the class randomly received either a chocolate bar or a mug, researchers found that fewer students were willing to make a trade than the researchers predicted. Again, it is unlikely that a random distribution of mugs and candy perfectly matched the mug versus candy preferences of the students. There should have been plenty of students who could have used a sugar boost more than an extra mug (and vice versa), but little trading actually took place. It appears that once someone randomly receives a gift, even if the value of the gift was very small, they are not likely to give it up. The gift becomes endowed with some meaning beyond its pure utility and value.

 

Kahneman describes part of what takes place in our minds when the endowment effect is at work, “the shoes the merchant sells you and the money you spend from your budget for shoes are held for exchange. They are intended to be traded for other goods. Other goods, such as wine and Super Bowl tickets, are held for use to be consumed or otherwise enjoyed. Your leisure time and the standard of living that your income supports are also not intended for sale or exchange.”

 

The random mug or candy bar were not seen as objective items intended to be traded or bartered in exchange for something that we actually want. They were viewed as a windfall over the status quo, and thus their inherent value to the individual was greater than the actual value of the object. Kahneman suggests that this is why so few students traded candy for mugs, and why mug sellers asked far more than what mug buyers wanted to pay in his experiments. The endowment effect is another example of how our emotional valence and the narrative surrounding an otherwise objectively unimportant object can shape our behaviors in ways that can seem irrational. Next spring when you are trying to de-clutter your house, remember this post and the endowment effect. Remember that you are imbuing objects with value simply because you happen to own them, and remember that you would only pay half price for them if they were actually offered to you for purchase now. Hopefully that helps you minimize the number of mugs you own and declutter some of your cabinets.
Overcoming Group Overconfidence

Overcoming group overconfidence is hard, but in Thinking Fast and Slow, Daniel Kahneman offers one partial remedy: a premortem. As opposed to a postmortem, an analysis of why a project failed, a premortem looks at why a project might fail before it has started.

 

Group communication is difficult. When the leader of a group is enthusiastic about an idea, it is hard to disagree with them. If you are a junior member of a team, it can be uncomfortable, and potentially even disadvantageous for you and your career to doubt the ideas that a senior leader is excited about. If you have concerns, it is not likely that you will bring them up, especially in a group meeting with other seemingly enthusiastic team members surrounding you.

 

Beyond the silencing of a member who has concerns but doesn’t want to speak up is another problem that contributes to overconfidence among teams: groupthink. Particularly among groups that lack diversity, groupthink can crush the planning stage of a project. When everyone has similar backgrounds, similar experiences, and similar styles of thinking, it is unlikely that anyone within the group will have a viewpoint or opinion that is significantly different than the prevailing wisdom of the rest. What seems like a good idea or the correct decision to one person probably feels like the correct idea or decision to everyone else – there is literally no one in the room who has any doubts or alternative perspectives.

 

Premortems help get beyond groupthink and the fear of speaking up against a powerful and enthusiastic leader. The idea is to brainstorm all the possible ways that a project might fail. It includes an element of creativity by asking everyone to imagine that the project has finally finished, either successfully but well over budget, way late, after a very turbulent series of events, or as a complete failure that never reached its intended end point. People have to describe the issues that came up and why the project did not reach the rosy outcome everyone initially pictured. Imagining that these failures have already taken place gets people to step beyond groupthink and encourages highlighting roadblocks that particularly enthusiastic members overlook.

 

Because premortems are hypothetical, it gives people a chance to speak up about failure points and weaknesses in plans and ideas without appearing to criticize the person the idea came from. It creates a safe space for imagining barriers and obstacles that need to be overcome to achieve success. It reduces groupthink by encouraging a creative flow of ideas of failure points. As Kahneman writes, “The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier.”

 

Overcoming group overconfidence is possible, but it needs the right systems and structures to happen. Groupthink and fear are likely to prevent people from bringing up real doubts and threats, but a premortem allows those concerns to be aired and seriously considered. It helps get people to look beyond the picture of success they intuitively connect with, and it helps prevent enthusiastic supporters from getting carried away with their overconfidence.
Thinking About Who Deserves Credit for Good Teamwork

Yesterday I wrote about the Availability Heuristic, the term that Daniel Kahneman uses in his book Thinking Fast and Slow to describe the ways in which our brains misjudge frequency, amount, and probability based on how easily an example of something comes to mind. In his book, Kahneman describes individuals being more likely to overestimate things like celebrity divorce rates if there was recently a high profile and contentious celebrity divorce in the news. The easier it is for us to make an association or to think of an example of a behavior or statistical outcome, the more likely we will overweight that thing in our mental models and expectations for the world.

 

Overestimating celebrity divorce rates isn’t a very big deal, but the availability heuristic can have a serious impact in our lives if we work as part of a team or if we are married and have a family. The availability heuristic can influence how we think about who deserves credit for good team work.

 

Whenever you are collaborating on a project, whether it is a college assignment, a proposal or set of training slides at work, or keeping the house clean on a regular basis, you are likely to overweight your own contributions relative to others. You might be aware of someone who puts in a herculean effort and does well more than their own share, but if everyone is chugging along completing a roughly equivalent workload, you will see yourself as doing more than others. The reason is simple: you experience your own work firsthand. You only see everyone else’s handiwork once they have finished it and everyone has come back together. You suffer from availability bias because it is easier for you to recall the time and effort you put into the group collaboration than it is for you to recognize and understand how much work and effort others pitched in. Kahneman describes the result in his book, “you will occasionally do more than your share, but it is useful to know that you are likely to have that feeling even when each member of the team feels the same way.”

 

Even if everyone did an equal amount of work, everyone is likely to feel as though they contributed more than the others. As Kahneman writes, there is more than 100% of credit to go around when you consider how much each person thinks they contributed. In marriages, this is important to recognize and understand. Spouses often complain that one person is doing more than the other to keep the house running smoothly, but if they complain to their partner about the unfair division of household labor, they are likely to end up in an unproductive argument with each person upset that their partner doesn’t recognize how much they contribute and how hard they work. Both will end up feeling undervalued and attacked, which is certainly not where any couple wants to be.
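Kahneman’s point that there is more than 100% of credit to go around is easy to see with a little arithmetic. Here is a toy sketch in Python; the names and percentages are hypothetical numbers of my own, not data from the book.

```python
# A toy illustration: each of four roommates estimates their own share of the housework.
# Every individual estimate sounds plausible on its own, but because each person recalls
# their own effort most easily, the self-reported shares sum to well over 100%.
self_reported_share = {
    "Roommate A": 0.40,
    "Roommate B": 0.35,
    "Roommate C": 0.30,
    "Roommate D": 0.25,
}

total = sum(self_reported_share.values())
print(f"Total claimed credit: {total:.0%}")                    # 130%
print(f"Credit that exists only in memory: {total - 1:.0%}")   # the 30% overrun
```

The overrun is the contribution that nobody actually made; it exists only in each person’s easy recall of their own work.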

 

Managers must be aware of this and must find ways to encourage and celebrate the achievements of their team members while recognizing that each team member may feel that they are pulling more than their own weight. Leaving everyone feeling that they are doing more than their fair share is a good way to create unhelpful internal team competition and factions within the workplace. No professional work team wants to end up like a college or high school project group, where one person pulls an all-nighter, overwriting everyone else’s work, and where one person seemingly disappears and emails everyone at the last minute to ask them not to rat them out to the teacher.

 

Individually, we should acknowledge that other people are not going to see and understand how much effort we feel that we put into the projects we work on. Ultimately, at an individual level we have to be happy with team success over our individual success. We don’t need to receive a gold star for every little thing that we do, and if we value helping others succeed as much as we value our own success, we will be able to overcome the availability heuristic in this instance, and become a more productive team member, whether it is in volunteer projects, in the workplace, or at home with our families.
Guided by Impressions of System 1

In Thinking Fast and Slow Daniel Kahneman shares research showing how easily people can be tricked or influenced by factors that seem to be completely irrelevant to the mental task they are asked to carry out. People will remember rhyming proverbs better than non-rhyming proverbs. People will trust a cited research source with an easy-to-say name over one with a difficult, foreign-sounding name. People will also be influenced by the quality of paper and colors used in advertising materials. No one would admit that rhymes, easy-to-say names, or paper quality is why they made a certain decision, but statistics show that these things can strongly influence how we decide.

 

Kahneman describes the research this way, “The psychologists who do these experiments do not believe that people are stupid or infinitely gullible. What psychologists do believe is that all of us live much of our life guided by the impressions of System 1 – and we often do not know the source of these impressions.”

 

Making tough and important decisions requires a lot of energy. In many instances, we have to make tough decisions that require a lot of mental effort in a relatively short time. We don’t always have a great pen-and-paper template to follow for decision-making, and sometimes we have to come to a conclusion in the presence of others, upping the stakes and increasing the pressure as we try to think through our options. As a result, the brain turns to heuristics reliant on System 1. It relies on intuition and quick impressions, and it substitutes easier questions for the harder ones actually in front of it.

 

We might not know why we intuitively favored one option over the other. When we ask our brain to think back on the decision we made, we are engaging System 2 to think deeply, and System 2 is likely to overlook seemingly inconsequential factors such as the color of the paper of the option we picked. It won’t remember that the first salesperson didn’t make much eye contact with us and that the second one did, but it will substitute some other aspect of competence to give us a reason for trusting salesperson number two more.

 

What is important to remember is that System 1 guides a lot of our lives. We don’t always realize it, but System 1 is passing along information to System 2 that isn’t always relevant for the decision that System 2 has to make. Intuitions and quick impressions can be biased and formed by unimportant factors, but even if we don’t consciously recognize them, they get passed along and calculated into our final choice.
Thoughts on Biases

“Anything that makes it easier for the associative machine to run smoothly will also bias beliefs,” writes Daniel Kahneman in his book Thinking Fast and Slow. Biases are an unavoidable part of our thinking. They can lead to terrible prejudices, habits, and meaningless preferences, but they can also help save us a lot of time, reduce the cognitive demand on our brains, and help us move smoothly through the world. There are too many decision points in our lives and too much information for us to absorb at any one moment for us to not develop shortcuts and heuristics to help our brain think quicker. Quick rules for associative thinking are part of the process of helping us actually exist in the world, and they necessarily create biases.

 

A bad sushi roll might bias us against sushi for the rest of our life. A jump-scare movie experience as a child might bias us toward romcoms and away from horror movies. And being bullied by a beefy kid in elementary school might bias us against muscular dudes and sports. In each instance, a negative experience is associated in our brains with some category of thing (food, entertainment, people) and our memory is helping us move toward things we are more likely to like (or at least less likely to bring us harm). The consequences can be low stakes, like not going to horror movies, but can also be high stakes, like not hiring someone because their physical appearance reminds you of a kid who bullied you as a child.

 

What is important to note here is that biases are natural and to some extent unavoidable. They develop from our experiences and the associations we make as we move through life and try to understand the world. They can be defining parts of our personality (I only drink black coffee), they can be incidental pieces of us that we barely notice (my doughnut choice order is buttermilk bar, maple covered anything, chocolate, plain glaze), and they can also be far more dangerous (I have an impulse to think terrible things about anyone with a bumper sticker for a certain political figure – and I have to consciously fight the impulse). Ultimately, we develop biases because they help us make easier decisions that will match our preferences and minimize our chances of being upset. They are mental shortcuts, saving us from having to make tough decisions and helping us reach conclusions about entire groups of things more quickly.

 

The goal for our society shouldn’t be to completely eliminate all instances of bias in our lives. That would require too much thought and effort for each of us, and we don’t really have the mental capacity to make so many decisions. It is OK if we are biased toward Starbucks rather than having to make a decision about what coffee shop to go to each morning, or which new coffee shop to try in a town we have never visited.

 

What we should do is work hard to recognize the biases that can really impact our lives and have negative consequences. We have to acknowledge that we have negative impulses toward certain kinds of people, and we have to think deeply about those biases and work to be aware of how we treat people. Don’t pretend that you move through the world free from problematic biases. Instead, work to see those biases, and work to push against your initial negative reaction and think about ways that you could have more positive interactions with others, and how you can find empathy and shared humanity with them. Allow biases to remain when helpful or insignificant (be biased toward vegetarian take-out for example), but think critically about biases that could have real impacts in your life and in the lives of others.
Recognize Situations Where Mistakes Are Common

“Because System 1 operates automatically and cannot be turned off at will, errors of intuitive thought are often difficult to prevent,” writes Daniel Kahneman in Thinking Fast and Slow. System 1 is how Kahneman describes the intuitive, quick-reacting part of our brain that continually scans the environment and filters information going to System 2, the more thoughtful, deliberate, calculating, and rational part of our brain. Biases in human thought often originate with System 1. When System 1 misreads a situation, makes a judgment on a limited set of information, or inaccurately perceives something about the world, System 2 will be working on a poor data set and is likely to reach faulty conclusions.

 

Kahneman’s book focuses on common cognitive errors and biases, not in the hope that we can radically change our brains and no longer fall victim to prejudices, optical illusions, and cognitive fallacies, but in the hopes that we can increase our awareness of how the brain and our thinking goes off the rails, to help us marginally improve our thought processes and final conclusions. Kahneman writes, “The best we can do is a compromise: learn to recognize situations in which mistakes are likely and try harder to avoid significant mistakes when the stakes are high.”

 

If we are aware that we will make snap judgments the instant we see a person, before either of us has even spoken a single word, we can learn to adjust our behavior to prevent an instantaneous bias from coloring the entire interaction. If we know that we are making a crucial decision on how we are going to invest our finances for retirement, we can pause and use examples from Kahneman’s book to remember that we have a tendency to answer simpler questions, we have a tendency to favor things that are familiar, and we have a tendency to trust other people based on factors that don’t truly align with trustworthiness. Kahneman doesn’t think his book and his discussions on cognitive fallacies will make us experts in investing, but he does think that his research can help us understand the biases we might make in an investment situation and improve the way we make some important decisions. Understanding how our biases may be impacting our decision can help us improve those decisions.

 

Self- and situational-awareness are crucial for accurately understanding the world and making good decisions based on sound predictions. It is important to know if you can trust an educated guess from yourself or others, and it is important to recognize when your confidence is unwarranted. It is important to know when your opinions carry weight, and when your direct observations might be incomplete and misleading. In most instances of our daily lives the stakes are low, and the costs of cognitive biases and errors are small, but in some situations, like serving on a jury, driving on the freeway, or choosing whether to hire someone, our (and other people’s) livelihoods could be on the line. We should honestly recognize the biases and limitations of the mind so we can further recognize situations where mistakes are common, and hopefully make fewer mistakes when it matters most.
Answering the Easy Question

One of my favorite pieces from Daniel Kahneman’s book Thinking Fast and Slow was the research Kahneman presented on mental substitution. Our brains work very quickly, and we don’t always recognize the times when our thinking has moved in a direction we didn’t intend. Our thinking seems to flow logically and naturally from one thought to the next, and we don’t notice the times when our brains make logical errors or jumps that are less than rational. Mental substitution is a great example of this, and one that I know my brain does, but that I often have trouble seeing even when I know to look for it.

 

Kahneman writes, “When the question is difficult and a skilled solution is not available, intuition still has a shot: an answer may come to mind quickly – but it is not an answer to the original question.” 

 

The example that Kahneman uses is of a business executive making a decision on whether to invest in Ford. To make a smart decision, the executive has to know what trends in the auto industry look like and whether Ford is well positioned to adapt to changing economic, climate, and consumer realities. They need to know what Ford’s competition is doing and think about how Ford has performed relative to other automobile companies and how the automotive sector has performed relative to other industries. The decision requires thinking about a lot of factors, and the executive’s time is limited, along with the amount of information they can hold in their head, especially given the other responsibilities at home and in the office that the executive has to worry about.

 

To simplify the decision, the executive might choose to answer a simpler question, as Kahneman explains: “Do I like Ford cars?” If the executive grew up driving a Ford truck, if they really liked the 1965 Mustang, or if the only car crash they were ever involved in was when a person driving a Ford rear-ended them, their decision might be influenced by an intuitive sense of Ford cars and people who drive Fords. Also, if the investor has personally met someone on the executive team, they may be swayed by whether or not they liked the person they met. Instead of asking a large question about Ford the company, they might substitute an easier question about a single Ford executive team member.

 

“This is the essence of intuitive heuristics,” writes Kahneman, “when faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.” 

 

Often, we already have a certain feeling in mind, and we switch the question being asked so that we can answer in line with our intuition. I grew up driving a Ford, so I might be inclined to favor investing in Ford. I might answer the question of investing in Ford before I am even asked the question, and then, instead of objectively setting out to review a lot of information, I might just cherry pick the information that supports my original inclination. I’m substituting the question at hand, and might even provide myself with plenty of information to support my choice, but it is likely biased and misguided information.

 

It is important to recognize when these prejudices and biases are influencing our decisions. By being aware of how we feel when asked a question, we can think critically to ask if we are being honest with the question that was asked of us. Are we truly answering the right question, or have we substituted for a question that is easier for us to answer?

 

In the question of cars and investments, the cost might not truly be a big deal (at least if you have a well diversified portfolio in other respects), but if we are talking about public policy that could be influenced by racial prejudice or by how deserving we think another group of people is, then our biases could be very dangerous. If we think that a certain group of people is inherently greedy, lazy, or unintelligent, then we might substitute the question, “will this policy lead to the desired social outcome” with the question, “do I like the group that stands to benefit the most from this policy?” The results from answering the wrong question could be disastrous, and could harm our society for years.