Knowledge and Perception

We often think of biases like prejudice as mean-spirited vices that cause people to lie and become hypocritical. The reality, according to Quassim Cassam, is that biases like prejudice run much deeper in our minds. Biases can become epistemic vices, inhibiting our ability to acquire and develop knowledge. They are more than just dispositions that make us behave in ways we profess to be wrong. Biases can literally shape the reality of the world we live in by altering the way we understand ourselves and the people around us.
“What one sees,” Cassam writes in Vices of the Mind, “is affected by one’s beliefs and background assumptions. It isn’t just a matter of taking in what is in front of one’s eyes, and this creates an opening for vices like prejudice to obstruct the acquisition of knowledge by perception.”
I am currently reading Steven Pinker’s book Enlightenment Now, in which Pinker argues that humans strive toward rationality and that, at the end of the day, subjectivity is ultimately overruled by reason, rationality, and objectivity. I have long been a strong adherent of the Social Construction Framework and the belief that our worlds are created and influenced, to a great degree, by individual differences in perception. Pinker challenges that assumption, but framing his challenge through the lens of Cassam’s quote helps show how Pinker is ultimately correct.
Individual-level biases shape our perception. Pinker describes a study in which university students watching a sporting event literally see more fouls called against their team than against the opponent, revealing the prejudicial vice that Cassam describes. Perception is altered by a prejudice against the team from the other school. Knowledge (in the study, the accurate number of fouls for each team) is inhibited for the sports fans by their prejudice. The reality they live in is to some extent subjective, shaped by their prejudices and misperceptions.
But this doesn’t mean that knowledge about reality is inaccessible to humans at a larger scale. A neutral third party (or committee of officials) could watch the game and accurately identify the correct number of fouls for each side. The sports fans and other third parties may quibble about the exact final number, but with enough neutral observers we should be able to settle on a more accurate count than if we left things to the biased sports fans. At the end of the day, rationality wins out through strength of numbers, and even the disgruntled sports fan will have to admit that the number of fouls they perceived differs from the more objective number agreed upon by the neutral observers.
I think this is at the heart of Cassam’s message and of the argument I am currently reading from Pinker. My first reaction to Cassam’s quote is to say that our realities are shaped by biases and perceptions, and that we cannot trust our understanding of reality. However, objective reality (or something close enough to it that a group of unbiased people could reasonably describe it) does seem to exist. Collectively, we can reach objective understandings and agreements as people recognize and overcome biases, and as the descriptions of the world offered by unbiased individuals prove more accurate over the long run. The key is to recognize that epistemic vices shape our perception at a deep level, that they are more than just hypocritical behaviors, and that they literally shape the way we interpret reality. The more we try to overcome these vices of the mind, the more accurately we can describe the world, and the more our perception can align with reality.

Incentives for Environmentally Responsible Markets

When it comes to environmental issues, no single actor is completely to blame, and that means no single actor can make the necessary changes to prevent catastrophic climate change. This means we can’t put all the weight on governments to take actions to change the course of our climate future, and we can’t blame individual actors either. We have to think about economies, polities, and incentive structures.

In their book Nudge, legal scholar Cass Sunstein and economist Richard Thaler look at what this means for markets and regulation as we try to find sustainable paths. They write, “markets are a big part of this system, and for all their virtues, they face two problems that contribute to environmental problems. First, incentives are not properly aligned. If you engage in environmentally costly behavior next year, through consumption choices, you will probably pay nothing for the environmental harms that you inflict. This is what is often called a tragedy of the commons.”

One reason markets bear some of the blame and responsibility for the climate change crisis is because market incentives can produce externalities that are hard to correct. Climate change mitigation strategies, such as research and development of more fuel efficient vehicles and technologies, are expensive, and the costs of climate change are far off. Market actors, both consumers and producers, don’t have proper incentives to make the costly changes today that would reduce the future costs of continued climate change.

A heavy-handed approach to our climate change crisis would be for governments to step in with dramatic regulation – eliminating fossil fuel vehicles, setting almost unattainably high energy efficiency standards for furnaces and dishwashers, and limiting air travel. Such an approach, however, might anger the population and erode support for climate mitigation measures, making the crisis even more dire. I don’t think many credible people really support heavy-handed government action, even if some favor regulation that comes close to the extremes I mentioned. Sunstein and Thaler’s suggestion of improved incentives to address market failures and change behaviors has advantages over heavy-handed regulation. The authors write, “incentive-based approaches are more efficient and more effective, and they also increase freedom of choice.”

To some extent, regulation looks at a problem and asks what the most effective way to stop it would be if everyone were acting rationally. An incentives-based approach asks what behaviors need to change, what existing forces encourage the negative behaviors, and what discourages shifts toward better ones. Taxes, independent certifications, and public shaming can be useful incentives to get individuals, groups, and companies to make changes. I predict that in 10-15 years, people who are not yet driving electric cars will start to be shamed for continuing to drive inefficient gas guzzlers (unfortunately, this probably means people with low incomes will be shamed for not being able to afford a new car). In the US, we have tried to introduce taxes on carbon output but have not been successful. Taxing energy consumption in terms of carbon output changes the incentives companies face with regard to the negative environmental externalities from energy and resource consumption. And independent certification boards, like the one behind the EnergyStar label, can continue to play an important role in encouraging the development of more efficient appliances.

The incentives approach might seem less direct, slower, and less certain to work, but in many areas, not just climate change, we need broad public support to make changes, especially when the up-front costs are high. This requires that we understand incentives and think about ways to change incentive structures. Nudges such as the ones I mentioned may work better than full government intervention when people are not acting fully rationally, which is usually the case for most of us. Nudges can get us to change behaviors while we believe we are making choices for ourselves, rather than having choices forced on us by an outside authority.

Why Terrorism Works

In the wake of terrorist attacks, deadly shootings, or bizarre accidents, I often find myself trying to talk down the threat and act as if my daily life shouldn’t change. I live in Reno, NV; my city has experienced school shootings, and my state experienced the deadliest mass shooting in modern United States history, but I personally have never been close to any of these extreme yet rare events. Nevertheless, despite my efforts to talk down the risk, I do notice the fear I feel following such events.

This fear is part of why terrorism works. Despite trying to rationally and logically talk myself through the aftermath of an attack, reminding myself that I am in more danger on the freeway than I am near a school or at a concert, there is still some apprehension under the surface, no matter how cool I look on the outside. In Thinking Fast and Slow, Daniel Kahneman examines why we behave this way following such attacks. Terrorism, he writes, “induces an availability cascade. An extremely vivid image of death and damage, constantly reinforced by media attention and frequent conversations becomes highly accessible, especially if it is associated with a specific situation.”

Availability is more powerful in our minds than statistics. If we know that a given event is incredibly rare but have strong mental images of such an event, we will overweight the likelihood of that event occurring again. The more easily an idea or possibility comes to mind, the more likely it will feel to us that it could happen again. On the other hand, if we have trouble recalling instances where rare outcomes did not happen, we will discount the possibility that they could occur. Terrorism succeeds because it shifts deadly events from feeling impossible to being easily accessible in the mind, making them feel as though they could happen again at any time. If our brains were coldly rational, terrorism wouldn’t work as well as it does. As it is, however, our brains respond to powerful mental images and memories, and the fluency of those mental images and memories shapes what we expect and what we think is likely or possible.

More on Affect Heuristics

For me, one of the easiest to grasp examples of the heuristics that Daniel Kahneman shares in his book Thinking Fast and Slow is the affect heuristic. It is a bias that I know I fall into all the time, one that has led me to buy particular brands of shoes, has influenced how I think about certain foods, and has shaped the way I think about people. In his book Kahneman writes, “The affect heuristic is an instance of substitution, in which the answer to an easy question (How do I feel about it?) serves as an answer to a much harder question (What do I think about it?).”

The world is a complex and tricky place, and we can only focus a lot of attention in one direction at a time. For a lot of us, that means we are focused on getting kids ready for school, cooking dinner, or trying to keep the house clean. Trying to fully understand the benefits and drawbacks of a social media platform, a new traffic pattern, or how to invest in retirement may seem important, but it can be hard to find the time and mental energy to focus on a complex topic and organize our thoughts in a logical and coherent manner. Nevertheless, we are likely to be presented with situations where we have to make decisions about what level of social media is appropriate for our children, offer comments on new traffic patterns around the water cooler, or finally get around to setting up our retirement plan and deciding what to do with that old 401K from that job we left.

Without adequate time, energy, and attention to think through these difficult decisions, we still have to make choices and are asked to have opinions on topics we are not very informed about. “The affect heuristic,” Kahneman writes, “simplifies our lives by creating a world that is much tidier than reality. Good technologies have few costs in the imaginary world we inhabit, bad technologies have no benefits, and all decisions are easy.” We substitute a simple question for the hard question that requires detailed thought: Do I like social media? Did the new traffic pattern make my commute slower? Do I like the way my retirement advisor presented the new investment strategy? In each case, we rely on affect, our emotional reaction to something, and make decisions in line with our gut feelings. Of course my kid can use social media; I’m on it, I like it, and I want to see what they are posting. Ugh, that new traffic pattern is awful; what were they thinking, putting that utility box where it blocks the view of the intersection? Obviously this is the best investment strategy for me; my advisor explained it well, and I liked it when they told me I was making a smart decision.

We don’t notice when we default to the affect heuristic. It is hard to recognize that we have shifted from making detailed calculations to relying solely on intuitions about how something makes us feel. Rather than admitting that we buy Nike shoes because our favorite basketball player wears them and we want to be like LeBron, we create a story in our heads about the quality of the shoes, the innovative design, and the complementary colors. We fall back on a quick set of factors that gives the impression of a thoughtful decision. In a lot of situations we probably can’t do much better than the affect heuristic, but it is worth considering whether our decisions are really being driven by affect. We might avoid buying things out of pure brand loyalty, and we might be a little calmer and more reasonable in debates with friends and family when we realize we are acting on affect and not on reason.

Anchoring Effects

Anchoring effects were one of the psychological phenomena I found most interesting in Daniel Kahneman’s book Thinking Fast and Slow. In many situations, random numbers seem able to influence the numbers we consciously think about, even when there is no reasonable connection between the random number we see and the number we need for another purpose. As Kahneman writes about the anchoring effect, “It occurs when people consider a particular value for an unknown quantity before estimating that quantity.”

Several examples of anchoring effects are given in the book. In one instance, judges were asked to assess how much a night club should be fined for playing loud music long after the quiet hours set by the club’s town. Real-life judges who make these legal decisions were presented with the name of the club and information about the violation of the noise ordinance. The fictitious club was named after the fictitious street it was located on. In some instances, the club name was something along the lines of 1500 First Street, and in others it was something like 10 First Street. Judges consistently assessed a higher fine against the club named 1500 First Street than against the club named 10 First Street. Seemingly random and unimportant information, the street number in the name of the club, had a real impact on the fine that judges, on average, thought the club should pay.

In other examples of anchoring effects, Kahneman shows that we come up with different guesses of how old Gandhi was when he died depending on whether we are first asked if he was older than 35 or younger than 114. In another experiment, a random wheel spin influenced the guesses people offered for the percentage of African nations in the UN. In all these examples, when we have to come up with a number we don’t know, or one that involves subjective judgment, other random numbers can influence what comes to mind.

This can have real consequences in our lives when we are looking to buy something or make a donation or investment. Retailers may present us with high anchors in an effort to prime us to accept a higher price than we would otherwise pay for an item. If you walk into a sunglasses shop and see two prominently displayed pairs with very high prices, you might not be as surprised by the high prices listed on the other sunglasses, and you might even consider slightly lower prices on other pairs a good deal.

It is probably safe to say that sales prices in stores, credit card interest rates, and investment management fees are carefully crafted with anchoring effects in mind. Retailers want you to believe that a high price on an item is a fair price, and could be higher if they were not willing to offer you a deal. Credit card companies and investment brokers want you to believe the interest rates and management fees they charge are small, and might try to prime you with large numbers relative to the rates they quote you. We probably can’t completely overcome the power of anchoring effects, but if we know what to look for, we might be better at taking a step back and analyzing the rates and costs a little more objectively. If nothing else, we can pause and doubt ourselves a little more when we are sure we have found a great deal on a new purchase or investment.

Accepting Unsound Arguments

Motivated reasoning is a major problem for those of us who want to have beliefs that accurately reflect the world. To live is to have preferences about how the world operates and relates to our lives. We would prefer not to endure suffering and pain, and would rather have comfort, companionship, and prosperity. We would prefer the world to provide for us, and we would prefer to not be too heavily strained. From pure physical needs and preferences all the way through social and emotional needs and preferences, our experiences of the world are shaped by what we want and what we would like. This is why we cannot get away from our own opinions and individual preferences in life, and part of why motivated reasoning becomes the problem that it is.

In Thinking Fast and Slow, Daniel Kahneman writes about how motivated reasoning works in our minds, in terms of the arguments we make to support the conclusions we believe in, or would like to believe in. He writes, “When people believe a conclusion is true, they are also very likely to believe arguments that appear to support it, even when these arguments are unsound.”

We justify conclusions we would like to believe with any argument that seems plausible and fits the conclusion we would like to believe. Our preference for one conclusion leads us to bend the arguments in favor of that conclusion. Rather than truly analyzing the arguments, we discount factors that don’t support what we want to believe, and we disregard arguments that come from people who are reaching an alternative conclusion. Our preferences take over, and the things we want become more important than reality. Motivated reasoning gives us a way to support what we want to believe by twisting the value we assign to different facts.

Even in our own minds, demonstrating that an argument in favor of our preferred conclusion is flawed is unlikely to make much of a difference. We will continue to hold on to the flawed argument, choosing to believe there is something true about it, even when we know it is unsound or contradicts other facts we would also have to accept for our preferred conclusion to hold.

This doesn’t make us humans look very good. We can’t reason our way to new beliefs, and we can’t rely on facts and data to change minds. In the end, if we want to change our own thoughts and behavior, as well as those of others, we have to shape people’s preferences. Motivated reasoning can support conclusions that do not accurately reflect the world around us, so those of us who care about reality have to heighten the salience of trusting science and expertise before we can get people to accept arguments grounded in rational evidence. If we don’t think about how preference and motivated reasoning lead people to believe inaccurate claims, we will fail to address the preferences that support problematic policies, and we won’t be able to guide our world in a direction based on reason and sound conclusions.

We Think of Ourselves as Rational

In his book Thinking Fast and Slow, Daniel Kahneman lays out two ways of thinking about our thought processes, which he calls System 1 and System 2. System 1 is fast, automatic, often subconscious, and usually pretty accurate in its quick judgments, assumptions, and estimations of the world. System 2 is where our heavy-duty thinking takes place. It is where we crunch through math problems, where the rational, problem-solving part of the brain is in action, and it’s the system that uses a lot of energy to help us remember important information and understand the world.

Despite the fact that we normally operate on System 1, that is not the part of our brain we think of as ourselves. Kahneman writes, “When we think of ourselves, we identify with System 2, the conscious, reasoning self that has beliefs, makes choices, and decides what to think about and what to do.” We believe ourselves to be rational agents, responding reasonably to the world around us. We see ourselves as free from bias, as logically coherent, and as considerate and understanding. Naturally, it is System 2 that we believe we spend most of our time with; however, this is not exactly the case.

A lot of our actions are influenced by factors that play out at the System 1 level rather than the System 2 level. If you are extra tired, if you are hungry, or if you feel insulted by someone close to you, then you probably won’t be thinking as rationally and reasonably as you would expect. You will likely operate on System 1, making sometimes faulty assumptions from incomplete data about the world around you. If you are hungry or tired enough, you will effectively be operating on autopilot, letting System 1 take over as you move about the cabin.

Even though we often operate on System 1, we feel as though we operate on System 2 because the part of us that thinks back to how we behaved, the part of us required for serious reflection, is part of System 2. It is critical, thoughtful, and takes its time generating logically coherent answers. System 1 is quick and automatic, so we don’t even notice when it is in control. When we think about who we are, why we did something, and what kind of person we aspire to be, it is System 2 that is flying the plane, and it is System 2 that we become aware of, fooling ourselves into believing that System 2 is all we are, that System 2 is what is really in our head. We think of ourselves as rational, but that is only because our irrational System 1 can’t pause to reflect back on itself. We only see the rational part of ourselves, and it is comforting to believe that is really who we are.

Unconscious Thought Theory

I like to think of myself as a pretty rational and empirical thinker. I try to understand points where my thoughts will be influenced by bias and my immediate reactions to situations. At these points, I try (not always successfully) to pause to be more reflective and considerate. I generally believe that striving for rationality and more evidence backed opinions is a good thing, but there is research which suggests that this strategy can lead to overthinking things and might involve parts of the brain which are not well suited for some decisions.

In Deep Work, author Cal Newport writes about Unconscious Thought Theory and research by Dutch psychologist Ap Dijksterhuis (I have no pronunciation help here). Research from Dijksterhuis shows that our unconscious brains are good at handling situations with complex, ambiguous information and no clear path forward. Newport describes it by writing, “if you need to do a math calculation, only your conscious mind is able to follow the precise arithmetic rules needed for correctness. On the other hand, for decisions that involve large amounts of information and multiple vague, perhaps even conflicting constraints, your unconscious mind is well suited to tackle the issue.”

The reason Newport brings this into Deep Work is that, in addition to strictly focused work, he also advocates for time away from work. “Your capacity for deep work in a given day is limited,” Newport explains. “If you’re careful about your schedule … you should hit your daily deep work capacity during your workday.”

The implication is that we should step away from our work to give our conscious minds a break when we max out on our deep work capacity. Some tasks are not well suited for the hyper-analytic conscious mind, and some of these tasks can be worked through by the unconscious mind at a time when the brain doesn’t have to marshal all resources for deep analytical thinking. By stepping away from work, closing out of our work email, and engaging with other life hobbies and our families, we can allow our unconscious brain to sort through the challenging ambiguities of the problems that had previously stymied our work. Unconscious Thought Theory suggests that our unconscious brain can work on these problems if given space, but continuing to check work emails after hours or logging back in here and there to check on our work prevents the unconscious brain from having the space it needs to do the background sorting that makes it a valuable tool.

In the end, what helps us perform at our best is turning off our rational brain for a little while, allowing ourselves to engage with something that doesn’t require such heightened focus, and knowing when to stop our deep work. The answer is not to continuously chug through all the analytic work we can force onto our brains in a day, but to maximize the time we can spend in deep work and turn ourselves off when we have hit our cognitive limit.

Different Angles

In his book How to Win Friends and Influence People, Dale Carnegie quotes Henry Ford on seeing things from another person’s perspective: “If there is any one secret of success, it lies in the ability to get the other person’s point of view and see things from that person’s angle as well as from your own.” 

I am fascinated by the mind, our perception of the universe, and how we interpret the information we take in to make decisions. There is so much data and information about the world, we all experience that information in different ways, and our brains literally construct different realities from the different timing and information we take in. There may be an objective reality underlying our experiences, but it is nearly impossible to pinpoint exactly what that reality is given all of our different perspectives.

Given the vast amount of data out there and our limited ability to take it all in and comprehend it, we can recognize that any individual understanding of the universe is woefully inadequate. We need the perspectives of others to really understand what is happening and to make sense of the universe. Everyone will see things slightly differently and understand the world in their own unique way.

Carnegie’s book addresses this point in the context of business. When we are trying to make a buck, we often become purely focused on ourselves, on what we want, on how we think it is best to accomplish our goals, and on the narrow set of things that we have identified as necessary steps to get from A to B.

However, we are always going to be working with others, and will need the help of other people and other companies to achieve our goals. We will have to coordinate, negotiate, and come to agreement on what actions we will all take, and we will all bring our own experiences and motivations to the table. If you approach business thinking purely about what you want and what your goals are, you won’t be able to do this successfully. You have to consider the perspectives of the other people that you work with or rely on to complete any given project.

Your employees’ motivations will be different from those of the companies that partner with you. Your goal might be to become or remain the leader in a certain industry, but no one else cares whether you are the leader in your space. Everyone wants to achieve their own ends, and the power of adopting multiple perspectives is that it helps you see how each unique goal can align to compound efforts and returns. Remember, your mind is limited, and your individual perspective alone is not going to give you the insight you need to succeed in a complex world. Only by seeing the different angles from which other people approach a given problem or situation can you successfully coordinate with and motivate the team you work with.

Creatures of Logic

One of the things I am most fascinated by is the way in which our lives, our thoughts, and our decisions feel purely rational to us but are clearly not as rational as we think. Our minds are bounded by the limited knowledge we can ever have, the limited information we can hold in our heads, and a host of biases, prejudices, and thinking vices that get in the way of rationality. We feel like we are in control of our minds, and our actions and decisions fit into a logically coherent, rational story we tell ourselves, but much is missing from the picture of the world that we develop.

Usually I turn these considerations inward, but it can be helpful to turn this reality outward as well. Dale Carnegie does so in his book How to Win Friends and Influence People. In the context of criticizing other people, Carnegie writes, “When dealing with people, let us remember we are not dealing with creatures of logic. We are dealing with creatures of emotion, creatures bristling with prejudices and motivated by pride and vanity.”

The other day I wrote about criticism and how it can often backfire when we want to change another person’s behavior. Carnegie notes the terrible consequences criticism can have, ranging from people abandoning work at which they might actually excel all the way to suicide, and his quote above is a reminder that people are not logical automatons. Our motivations and sense of self matter to how we perform, what we do, and how we think. Adding mean-spirited criticism, even if well deserved, can be harmful. What is more, our criticism often serves mostly to prop ourselves up; it is more about how special we think we are than about how poorly another person is performing or behaving.

Carnegie believes that we need to be more considerate of other people when dealing with them in any circumstance. His quote extends beyond moments of criticism to motivation, quality relationships, social responsibilities, and individual health and well-being. We cannot simply look at others and hold them purely responsible for the outcomes of their lives. People are not rational and will not be able to perfectly sort through everything to identify the best possible decisions for their present and future lives. We must help them by remembering their bounded rationality, and we must help develop structures that allow them to make the best decisions and perform at their best. People are going to make logical errors, but we can design society and the world they operate within to minimize the errors they make and the negative externalities of their biases.