Money Isn’t About Economic Security (For Most of Us)

Tyler Cowen started his February 28th, 2018 podcast interview with his colleague from George Mason University, Robin Hanson, with the following:

 

“Robin, if politics is not about policy, medicine is not about health, laughter is not about jokes, and food is not about nutrition, what are podcasts not about?”

 

Hanson goes on to explain that conversations are not really about imparting useful information and finding out useful things; conversation is more likely about showing off and signaling. When you share new information with someone, you are showing them that you are a valuable ally who knows useful things that might one day be helpful. When you share a particular piece of knowledge, you are signaling that you are the kind of person who knows such things.

 

I think that Hanson’s views on signaling are correct and deserve more attention and consideration. A lot of what we do has more to do with signaling than with the reasons we would give an observer for our behavior. Hanson is not alone in recognizing this reality.

 

In Thinking Fast and Slow, Daniel Kahneman writes the following about money:

 

“Except for the very poor, for whom income coincides with survival, the main motivators of money-seeking are not necessarily economic. For the billionaire looking for the extra billion, and indeed for the participant in an experimental economics project looking for the extra dollar, money is a proxy for points on a scale of self-regard and achievement. These rewards and punishments, promises and threats, are all in our head.”

 

Money is not really about economic well-being (for most of us). It’s not really about the things we can purchase or the vacations we can take. Money is really about social status. Having more of it elevates our social status, as does using it for impressive and expensive purposes. There is no objective ranking out there for our social status, but we act as if our social status is tangible and will reveal something important about our lives and who we are. Pursuing money gives us a chance to pursue social status in an oblique way, making it look as though we are doing something for high-minded reasons, when in reality we are trying to climb a social ladder and use money as our measuring stick of success.

 

Realistically, we are not going to be able to do much of anything about our signaling behaviors, especially if Hanson is correct in estimating that well over 90% of what we do is signaling. However, we can start to acknowledge signaling and choose where and how we send signals about ourselves. We can choose not to rely on money to signal something about who we are and can seek out healthier avenues for signaling, ones whose externalities are more environmentally friendly and socially conscientious.

Competing Biases

I am trying to remind myself that everyone, myself included, operates on a complex set of ideas, narratives, and beliefs that are sometimes coherent, but often conflicting. When I view my own beliefs, I am tempted to think of myself as rational and realistic. When I think of others who I disagree with, I am prone to viewing them in a simplistic frame that makes their arguments irrational and wrong. The reality is that all of our beliefs are less coherent and more complex than we typically think.

 

Daniel Kahneman’s book Thinking Fast and Slow has many examples of how complex and contradictory much of our thinking is, even if we don’t recognize it. One example is competing biases that manifest within us as individuals and can be seen in the organizations and larger groups that we form. We can be exaggeratedly optimistic and paralyzingly risk averse at the same time, and sometimes this tendency can actually be a good thing for us. “Exaggerated optimism protects individuals and organizations from the paralyzing effects of loss aversion; loss aversion protects them from the follies of overconfident optimism.”

 

On a first read, I would expect the outcome of what Kahneman describes to be gridlock. The optimist (or the optimistic part of our brain) wants to push forward with a big new idea and plan. Meanwhile, loss aversion halts decision making and prevents new ideas from taking root. The reality, as I think Kahneman would explain, is less a conscious and deliberate gridlock than an unnoticed drift toward certain decisions. Optimism wins out in an enthusiastic way when we see a safe bet or when a company sees an opportunity to capture rents. Loss aversion wins out when the bet isn’t safe enough, and when we want to hoard what we already have. We don’t even realize we are making these decisions; they just feel like obvious and clear directions. In reality, we are constantly being jostled between exaggerated optimism and loss aversion.

 

Kahneman shows that these two biases are not mutually exclusive even though they may conflict. We can act on both biases at the same time; we are not exclusively risk-seeking optimists or exclusively risk averse. When the situation calls for it, we apply the appropriate frame at an intuitive level. Kahneman’s quote above shows that this can be advantageous for us, but throughout the book he also shows how biases in certain directions and situations can be costly for us over time as well.

 

We like simple and coherent narratives. We like thinking that we are one thing or another, that other people are either good or bad, right or wrong. The reality, however, is that we contain multitudes, act on competing and conflicting biases, and have more nuance and incongruency in our lives than we realize. This isn’t necessarily a bad thing. We can still survive and prosper despite the complexity and incoherence of the beliefs we hold. Nevertheless, I think it is important that we acknowledge the reality we live within, rather than simply accepting the tidy stories we like to tell ourselves.

Narrow Framers

I like to write about the irrational nature of human thinking because it reminds me that I don’t have all the answers figured out, and that I often make decisions that feel like the best choice in the moment but likely aren’t the best choice if I step back and consider them more carefully. Daniel Kahneman writes about instances where we fail to be rational actors in his book Thinking Fast and Slow, and his examples have helped me better understand my own thinking process and the contexts in which I make decisions that feel logically consistent but likely fall short of being truly rational.

 

In one example, Kahneman describes humans as narrow framers, failing to take multiple outcomes into consideration and focusing instead on a limited set of salient possibilities. He writes, “Imagine a longer list of 5 simple (binary) decisions to be considered simultaneously. The broad (comprehensive) frame consists of a single choice with 32 options. Narrow framing will yield a sequence of 5 simple choices. The sequence of 5 choices will be one of the 32 options of the broad frame. Will it be the best? Perhaps, but not very likely. A rational agent will of course engage in broad framing, but Humans are by nature narrow framers.”

 

In the book Kahneman writes that we think sequentially and address problems only as they arise. We don’t have the mental capacity to hold numerous outcomes (even simple binary outcomes) in our mind simultaneously and make predictions on how the world will respond to each outcome. If we map out decisions and create tables, charts, and diagrams then we have a chance of making rational decisions with complex information, but if we don’t outsource the information to a computer or to pen and paper, then we are going to make narrow short-term choices. We will consider a simple set of outcomes and discount other combinations of outcomes that we don’t expect. In general, we will progress one outcome at a time, reacting to the world and making choices individually as the situation changes, rather than making long-term decisions before a problem has arisen.
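To make the arithmetic of narrow versus broad framing concrete, here is a minimal Python sketch. The dollar amounts and the loss-aversion factor are my own illustrative assumptions, not numbers from Kahneman: a mildly loss-averse chooser rejects a favorable gamble when each of five identical decisions is framed on its own, but accepts the bundle when all 2^5 = 32 combined outcomes are evaluated as a single choice.

```python
from itertools import product

# Assumed setup: five identical binary decisions, each a choice between
# doing nothing ($0 for sure) and a gamble that wins $200 or loses $100
# with equal probability.
WIN, LOSS, P_WIN = 200, -100, 0.5

def value(x, loss_aversion=2.5):
    """Toy loss-averse value function: losses loom ~2.5x larger than gains."""
    return x if x >= 0 else loss_aversion * x

# Narrow frame: evaluate each gamble in isolation.
per_gamble = P_WIN * value(WIN) + (1 - P_WIN) * value(LOSS)
print(f"Narrow frame, one gamble at a time: {per_gamble:+.1f}")

# Broad frame: evaluate one combined choice over all 2**5 = 32 equally likely outcomes.
broad = sum(value(sum(outcome)) for outcome in product([WIN, LOSS], repeat=5)) / 2**5
print(f"Broad frame, all five bundled:      {broad:+.1f}")
```

The narrow framer sees a negative value and turns the gamble down five times in a row; the broad framer, looking at the whole distribution of combined outcomes, sees a clearly positive value and takes all five.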

 

Deep thinking about complex systems is hard, and our brains default toward lower energy and lower effort decision-making. We only engage System 2, the logical and calculating part of the brain, when it is needed, and even then, we only engage it for one problem at a time with a limited set of information that we can easily observe about the world. This means that our thinking tends to focus on the present, without full consideration of the future consequences of our actions and decisions. It also means that our thinking is limited and doesn’t contain the full set of data that might be necessary for making accurate and rational choices or predictions. When necessary, we build processes, systems, and structures to help our minds be more rational, but that requires getting information out of our heads and outsourcing the effort to technologies beyond the brain; otherwise our System 2 will be bogged down and overwhelmed by the complexity of the information in the world around us.

Risk Averse and Risk Seeking

I would generally categorize myself as somewhat risk averse, but studies from Daniel Kahneman in Thinking Fast and Slow might suggest that I’m not really any different than anyone else. I might just be responding to the set of circumstances that I typically experience, similar to anyone else, and I might just be more aware of times when I am risk averse rather than times when I am more risk seeking. In particular, I might be risk averse in certain situations and categorize those situations correctly, but risk seeking in other situations without recognizing it.

 

Kahneman uses examples throughout his book to demonstrate to the audience that common cognitive errors and psychological tendencies are shared by even the most savvy readers who would pick up a book like Thinking Fast and Slow. Kahneman even uses anecdotes from his own life and his own thoughts to demonstrate how deep knowledge of cognitive biases and errors doesn’t make one immune. After demonstrating how our minds can lead us to be risk averse in some settings and risk seeking in others, Kahneman cautions us against a typical pattern that many of us will find ourselves in. “It is costly to be risk averse for gains and risk seeking for losses.”

 

On its own, this quote doesn’t seem to reveal anything especially interesting, but in the context of Kahneman’s experiments and examples, it reveals a lot about the way we behave whether we are risk seeking or risk averse. When we are offered a flat sum or a gamble with the potential to win more than the flat sum, we often won’t be willing to take the gamble. The guaranteed money is more appealing to us than the prospect of a larger payout that carries a chance of winning nothing. When it comes to gains, we are often risk averse, preferring the sure thing rather than the possibility of getting more with the risk of getting nothing or facing a cost.

 

However, we become risk seeking when we stand to lose something. As long as there is a small outside chance that we won’t lose anything, we will avoid a certain loss, risk a larger loss, and take a gamble. In Kahneman’s example he demonstrates how people will quickly turn down a sure loss of $750 for a 25% chance of losing nothing, even when there is a 75% chance of losing $1000.

 

When you do the math, the gamble’s expected loss is also $750, so over numerous trials the gamble leaves you no better off than the sure loss while exposing you to the larger $1000 hit. However, our minds don’t perceive things this way. When we stand to win something, we tend to become conservative and risk averse, but if we stand to lose something, we suddenly become more risk seeking. Combining these two tendencies can be dangerous. It means we stand to gain much less than we might if we flipped our biases around, and it also means we are likely to face greater losses with greater frequency than if we had been less risk seeking with regard to losses.
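A quick calculation (or simulation) makes the point; the numbers below are the ones from Kahneman’s example quoted above.

```python
import random

# Kahneman's example: a sure loss of $750 versus a gamble with a 75% chance
# of losing $1,000 and a 25% chance of losing nothing.
SURE_LOSS = -750
P_LOSE, GAMBLE_LOSS = 0.75, -1000

# The gamble's expected value is identical to the sure loss.
print(f"Sure loss: {SURE_LOSS}, expected value of gamble: {P_LOSE * GAMBLE_LOSS:.0f}")

# Over many repeated trials the gamble averages out to roughly the same -$750,
# so choosing it buys nothing except exposure to the larger $1,000 hit.
random.seed(0)
trials = 100_000
total = sum(GAMBLE_LOSS if random.random() < P_LOSE else 0 for _ in range(trials))
print(f"Average outcome over {trials:,} simulated gambles: {total / trials:.1f}")
```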

 

If we think about this in the context of our lives more generally, we can see that categorizing ourselves and most of our friends as either risk averse or risk seeking doesn’t necessarily make sense. When you are young, you really don’t have much that you will be worried about losing. It makes sense that you might be more risk seeking, more willing to take on risky behaviors and ideas that might have a big upside. You might procrastinate on important homework, retirement savings, and household chores because the main thing you stand to lose is time (often the only thing you have when you are really young), and you can gamble on the consequences.

As you get older, once you are established in a career, own a home, have a 401K, and move through life in general, you stand to lose more. Gains throughout your life become less significant thanks to Tyler Cowen’s favorite idea, diminishing marginal returns. It becomes harder to give up guaranteed gains because the marginal increase in a potential gain through a gamble is less appealing. You become risk averse as you get older and in more situations, as you grow to have more things to worry about losing. Therefore, categorizing people as generally risk averse or generally risk seeking is meaningless. You need to look at the circumstances of their lives, where they find themselves in terms of social status, what material possessions they have, what their family structure is like, to understand why they make generally more risk averse or generally more risk seeking decisions. There is probably some variability across people, but I would expect the structures and systems in place around us to shape our behavior more than any genetic or inherent factors.

Denominator Neglect

“The idea of denominator neglect helps explain why different ways of communicating risks vary so much in their effects,” writes Daniel Kahneman in Thinking Fast and Slow.

 

One thing we have seen in 2020 is how difficult it is to communicate and understand risk. Thinking about risk requires thinking statistically, and thinking statistically doesn’t come naturally for our brains. We are good at thinking in terms of anecdotes, and our brains like to identify patterns and potential causal connections between specific events. When our brains have to predict chance and deal with uncertainty, they easily get confused. They shift to easier problems rather than working through complex mathematical ones, substituting the answer to the easy problem without realizing it. Whether it is our risk of getting COVID or the probability we assigned to election outcomes before November 3rd, many of us have been thinking poorly about probability and chance this year.

 

Kahneman’s quote above highlights one example of how our thinking can go wrong when we have to think statistically. Our brains can be easily influenced by numbers that should be irrelevant, and that can throw off our decision-making when it comes to dealing with uncertainty. To demonstrate denominator neglect, Kahneman presents two situations in his book. There are two large urns full of white and red marbles. If you pull a red marble from an urn, you are a winner. The first urn has 10 marbles in it, with 9 white and 1 red. The second urn has 100 marbles in it, with 92 white and 8 red. Statistically, we should try our luck with the urn with 10 marbles, because 1 out of 10, or 10%, of its marbles are red. In the second urn, only 8% of the marbles are red.
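A few lines of Python make the comparison explicit; only the proportion of red marbles matters, not the count of winners.

```python
# Kahneman's urn example: the urn with fewer winning marbles is the better bet,
# because only the proportion of red marbles determines the chance of winning.
urns = {
    "small urn (1 red out of 10)":  {"red": 1, "total": 10},
    "large urn (8 red out of 100)": {"red": 8, "total": 100},
}
for name, urn in urns.items():
    print(f"{name}: P(win) = {urn['red'] / urn['total']:.0%}")
# System 1 is drawn to the 8 winners in the large urn; slowing down shows 10% beats 8%.
```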

 

When asked which urn they would want to select from, many people choose the second urn, demonstrating what Kahneman describes as denominator neglect. The chance of winning is lower with the second urn, but there are more winning marbles in it, making it seem like the better option if you don’t slow down and engage your System 2 thinking processes. If you pause and think statistically, you can see that the first urn offers better odds, but if you are moving quickly your brain can be distracted by the larger number of winning marbles and lead you to a worse choice.

 

What is important to recognize is that we can be influenced by numbers that shouldn’t mean anything to us. The number of winning marbles shouldn’t matter, only the percent chance of winning should matter, but our brains get thrown off. The same thing happens when we see sale prices, think about the risk of a family gathering of 10 people during a global pandemic, or think about polling errors. I like to check The Nevada Independent’s COVID-19 tracking website, and I have noticed denominator neglect in how I think about the numbers they report. For a sustained stretch, Nevada’s total number of reported cases was decreasing, but our test positivity rate was staying the same. Statistically, nothing was really changing regarding the state of the pandemic in Nevada; fewer tests were being completed and reported each day, so the overall number of positive cases was decreasing. If you scroll down the Nevada Independent website, you will get to a graph of the test positivity rate and see that things were staying the same. When looking at the decreasing number of positive tests reported, my brain was neglecting the denominator, the number of tests completed. My understanding of the pandemic was biased by the big headline number, not by how many of the people tested actually had the virus. Thinking statistically provides a more accurate view of reality, but it is hard to do, and it is tempting to look only at a single headline number.
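A toy example (with made-up numbers, not Nevada’s actual data) shows how the headline count can fall while the positivity rate, positives divided by tests, stays flat:

```python
# Hypothetical daily reports: positives drop only because testing drops,
# while the positivity rate (positives / tests) stays flat at 15%.
days = [
    {"tests": 10_000, "positives": 1_500},
    {"tests": 8_000,  "positives": 1_200},
    {"tests": 6_000,  "positives": 900},
]
for i, day in enumerate(days, start=1):
    rate = day["positives"] / day["tests"]
    print(f"Day {i}: {day['positives']:>5,} positives reported, positivity rate {rate:.0%}")
```

The falling count is the number our eyes jump to; the unchanging rate is the statistic that actually describes the situation.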

Imagining Success Versus Anticipating Failure

I am guilty of not spending enough time planning what to do when things don’t work out the way I want. I have written in the past about the importance of planning for failure and adversity, but like many others, I find it hard to sit down and think seriously about how my plans and projects may fail. Planning for resilience is incredibly important, but so many of us never get around to it. Daniel Kahneman, in Thinking Fast and Slow, helps us understand why we fail to plan for failure.

 

He writes, “The successful execution of a plan is specific and easy to imagine when one tries to forecast the outcome of a project. In contrast, the alternative of failure is diffuse, because there are innumerable ways for things to go wrong.”

 

Recently, I have written a lot about the fact that our minds understand the world not by accumulating facts, understanding data, and analyzing nuanced information, but by constructing coherent narratives. The less we know and the more simplistic the information we work with, the more coherent our narratives of the world can be. When we have less uncertainty, our narrative flows more easily, feels more believable, and is more comforting to our mind. When we descend into the particular, examine complexity, and weigh competing and conflicting information, we have to balance substantial cognitive dissonance with our prior beliefs, our desired outcomes, and our expectations. This is hard and uncomfortable work, and as Kahneman points out, a problem when we try to anticipate failures and roadblocks.

 

It is easy to project forward how our perfect plan will be executed. It is much harder to identify how different potential failure points can interact and bring the whole thing crashing down. For large and complex systems and processes, there can be so many obstacles that this exercise can feel entirely overwhelming and disconnected from reality. Nevertheless, it is important that we get outside our comfortable narrative of success and at least examine a few of the most likely mistakes and obstacles that we can anticipate. Any time that we spend planning ways to get the ship back on course if something goes wrong will pay off in the future when things do go wrong. It’s not easy, because it is mentally challenging and nebulous, but if we can get ourselves to focus on the likelihood of failure rather than the certainty of success, we will have a better chance of getting to where we want to be and overcoming the obstacles we will face along the way.

Why Terrorism Works

In the wake of terrorist attacks, deadly shootings, or bizarre accidents, I often find myself trying to talk down the threat and to act as if my daily life shouldn’t be changed. I live in Reno, NV; my city has experienced school shootings, and my state experienced the deadliest mass shooting in modern United States history, but I personally have never been close to any of these extreme yet rare events. Nevertheless, despite efforts to talk down any risk, I do notice the fear that I feel following such events.

 

This fear is part of why terrorism works. Despite rationally and logically talking myself through the aftermath of an attack and reminding myself that I am in more danger on the freeway than I am near a school or at a concert, there is still some apprehension under the surface, no matter how cool I make myself look on the outside. In Thinking Fast and Slow, Daniel Kahneman examines why we behave this way following such attacks. Terrorism, he writes, “induces an availability cascade. An extremely vivid image of death and damage, constantly reinforced by media attention and frequent conversations becomes highly accessible, especially if it is associated with a specific situation.”

 

Availability is more powerful in our minds than statistics. If we know that a given event is incredibly rare but have strong mental images of such an event, we will overweight the likelihood of that event occurring again. The more easily an idea or possibility comes to mind, the more likely it will feel to us that it could happen again. On the other hand, if we have trouble recalling experiences or instances where rare outcomes did not happen, then we will discount the possibility that they could occur. Terrorism succeeds because it shifts deadly events from feeling impossible to being easily accessible in the mind, making them feel as though they could happen again at any time. If our brains were coldly rational, terrorism wouldn’t work as well as it does. As it is, however, our brains respond to powerful mental images and memories, and the ease with which those images and memories come to mind shapes what we expect and what we think is likely or possible.

Desperate Gambles

Daniel Kahneman worked with Amos Tversky to develop many of the concepts that today make up Prospect Theory. Many people are familiar with the psychological and economic framework of Game Theory, and Prospect Theory is a similar psychological and economic theory of how people behave when faced with uncertainty. In his book Thinking Fast and Slow, Kahneman shares one of the early surprises that he and Tversky uncovered while developing Prospect Theory.

 

Prospect Theory gets its name from the way people behave when faced with different prospects, that is, different potential outcomes with different likelihoods attached. Unlike Game Theory, where your final outcome depends on decisions made by another actor, in Prospect Theory you are generally making a choice between a sure thing and a gamble. From the theory comes the fourfold pattern, which Kahneman uses to explain why large legal settlements are common, why people participate in lotteries, and why we buy insurance. What was surprising from Prospect Theory was the fourth block in the fourfold pattern, which describes why some people are willing to take desperate gambles that have incredibly small likelihoods of paying off.

 

Kahneman writes, “when you consider a choice between a sure loss and a gamble with a high probability of a larger loss, diminishing sensitivity makes the sure loss more aversive, and the certainty effect reduces the aversiveness of the gamble.”

 

If you are suing a large corporation for damages, you are likely to accept a settlement below what you are suing for. So, if you are suing the company for $1 million with a strong chance of winning (say 95%), and the company offers you a guaranteed $850,000, you are likely to feel pressure to take the settlement to make sure you walk away with something. Even though the expected value of going to trial is higher, the sure $850,000 is likely to be preferable to running the small risk of walking away with nothing. This is the square of the fourfold pattern, risk aversion for likely gains, that fit with Kahneman and Tversky’s prior expectations.

 

What surprised the pair was the tendency of individuals when the tables are turned. Say you are the defendant facing a probable loss of $100,000, with only a slim legal possibility of getting off without paying anything. If you are offered a settlement that caps your costs at $90,000, you are likely to feel pressure to turn it down and take your chances at trial, simply because some possibility remains of escaping without any loss. When we look at the expected value, accepting the settlement is both the better deal and the risk averse option, but few of us will be content taking it.
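A rough expected-value check of the two settlement scenarios above helps show how lopsided the choices are; the probabilities here are assumptions for illustration, not figures from the book.

```python
# Plaintiff (likely gain): 95% chance of winning $1,000,000 at trial,
# versus a guaranteed $850,000 settlement.
p_win, award, offer = 0.95, 1_000_000, 850_000
print(f"Plaintiff: trial EV ${p_win * award:,.0f} vs sure ${offer:,.0f} -> most settle anyway")

# Defendant (likely loss): 95% chance of losing $100,000 at trial,
# versus a guaranteed $90,000 settlement.
p_lose, damages, settlement = 0.95, 100_000, 90_000
print(f"Defendant: trial EV -${p_lose * damages:,.0f} vs sure -${settlement:,.0f} -> many gamble anyway")
```

The plaintiff gives up expected value to lock in the gain; the defendant gives up expected value to keep a sliver of hope alive, which is exactly the asymmetry Kahneman describes.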

 

We see this with politicians who take an “I’ll risk burning it all down to stay in power” approach, with poker players who get in too deep and misread an opponent or a hand, and with hockey teams who pull the goalie knowing that a sure loss is coming if they don’t take a big risk and get another offensive player on the ice while leaving their net open. When the sure loss is severe enough, then even large gambles with a minimal chance of success are worth the risk.

 

Kahneman and Tversky were surprised by this because it seems to violate our normal pattern of acting based on expected value. We don’t consciously calculate the expected value of an event, but we usually do act in accordance with expected value. However, in these desperate situations, we actually choose the option with the worse expected value. We become less sensitive to the very likely large loss and are unwilling to take the sure loss, violating expectations of risk aversion.

Decision Weights

On the heels of the 2020 election, I cannot decide if this post is timely or untimely. On the one hand, this post is about how we should think about unlikely events, and I will argue, based on a quote from Daniel Kahneman’s book Thinking Fast and Slow, that we overweight unlikely outcomes and should better align our expectations with realistic probabilities. On the other hand, the 2020 election was closer than many people expected, we almost saw some very unlikely outcomes materialize, and one can argue that a few unlikely outcomes really did come to pass. Ultimately, this post falls in a difficult space, arguing that we should discount unlikely outcomes more than we actually do, while acknowledging that sometimes very unlikely outcomes really do happen.

 

In Thinking Fast and Slow Kahneman writes, “The decision weights that people assign to outcomes are not identical to the probabilities of these outcomes, contrary to the expectation principle.”  This quote is referencing studies which showed that people are not good at conceptualizing chance outcomes at the far tails of a distribution. When the chance of something occurring gets below 10%, and especially when it pushes into the sub 5% range, we have trouble connecting that with real world expectations. Our behaviors seem to change when things move from 50-50 to 75-25 or even to 80-20, but we have trouble adjusting any further once the probabilities really stretch beyond that point.

 

Kahneman continues, “Improbable outcomes are overweighed – this is the possibility effect. Outcomes that are almost certain are underweighted relative to actual certainty. The expectation principle, by which values are weighted by their probability, is poor psychology.”

 

When something has only a 5% or lower chance of happening, we behave as though the probability were noticeably higher, more like 10 or 15 percent. We know the likelihood is very low, but we act as if it were well above a single-digit percentage. Meanwhile, an outcome that is 95%+ certain is discounted below what it really deserves. Very rare outcomes certainly do happen sometimes, but our minds have trouble conceptualizing them, and rather than keeping a perspective based on the actual probabilities and rational decision weights, we overweight the improbable and underweight the nearly certain.
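One way to see the possibility and certainty effects concretely is the probability weighting function Tversky and Kahneman later estimated in cumulative prospect theory. Here is a small sketch; the functional form and the gamma of roughly 0.61 come from their 1992 estimates for gains, so treat the exact outputs as illustrative rather than definitive.

```python
# Tversky & Kahneman (1992) probability weighting function for gains:
# low probabilities are overweighted, near-certain ones are underweighted.
def decision_weight(p: float, gamma: float = 0.61) -> float:
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

for p in [0.01, 0.05, 0.10, 0.50, 0.90, 0.95, 0.99]:
    print(f"stated probability {p:>4.0%} -> felt decision weight {decision_weight(p):.1%}")
```

A 1% chance is felt as roughly a 5% chance, while a 95% chance is felt as closer to 80%, which is the asymmetry behind both lottery tickets and insurance premiums.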

 

Our challenges with correctly weighting extremely certain or extremely unlikely events may have an evolutionary history. For our early ancestors, treating near-certainties as absolute certainties may have occasionally proved fatal. Those who were a tad more cautious may have been less likely to run across the log that actually gave way into the whitewater rapids below. And our ancestors who reacted to the improbable as though it were a little more certain may have also been better at avoiding the lion on the one occasion when the twig snapping outside the campground really was a lion. The ancestor who sat by the fire and said, “twigs snap every night, the chances that it actually is a lion this time have gotta be under 5%,” may not have lived long enough to pass their genes on to future generations. In most situations our early ancestors faced, being a little more cautious was probably advantageous. Today, however, being overly cautious and struggling with improbable or nearly certain decision weights can be costly: we over-purchase insurance, spend huge amounts to avoid the rare chance of losing a huge amount, and over-trust democratic institutions in the face of a coup attempt.

Maintaining the Rules of Fairness with Signaling and Altruistic Punishment

Society is held together by many unspoken rules of fairness, and maintaining those rules is messy but rewarding work. We don’t just advocate for fairness in our own lives; we will go out of our way to call out unfairness when we see it hampering the lives of others. We will protest, march in the streets, and post outraged messages on social media to call out the unfairness we see in the world, even if we are not directly affected by it, or even if we stand to gain from the unfair status quo.

 

Daniel Kahneman, in Thinking Fast and Slow, shares some research studying our efforts to maintain the rules of fairness and why we are so drawn to it. He writes, “Remarkably, altruistic punishment is accompanied by increased activity in the pleasure centers of the brain. It appears that maintaining the social order and the rules of fairness in this fashion is its own reward.”

 

This idea reminds me of Robin Hanson’s book The Elephant in the Brain, in which Hanson suggests that a staggering amount of human behavior is little more than signaling. Much of what we do is not about the high-minded rationale we attach to our actions. Much of what we do is about something else, and our stated rationales are little more than pretext and excuses. Altruistic punishment, going out of our way to inflict some sort of punishment (verbal reprimands, loss of a job, or imprisonment), is not necessarily about the person who was treated unfairly or the person who was being unfair to others. It is quite plausibly more about our own pleasure, about the maintenance or establishment of a social order that we presumably will benefit from, and about signaling to the rest of society that we are someone who believes in the rules and will adhere to strict moral principles.

 

Troublingly, Kahneman continues, “Altruistic punishment could well be the glue that holds societies together. However, our brains are not designed to reward generosity as reliably as they punish meanness. Here again, we find a marked asymmetry between losses and gains.”

 

The second part of Kahneman’s quote refers to biases in our mental thinking, connecting our meanness or niceness toward others with our tendency toward loss aversion. Losses have a bigger mental impact on us than gains. We might not be consciously aware of this, but our actions – our willingness to inflict losses on others and our reluctance to endow gains on others – seem to reflect this mental bias. We are constantly maintaining the social order by threatening others with a loss of social standing, while offering only minimal hope of gaining or improving social standing. Going back to the Hansonian framework from earlier, this makes sense. A gain in social status for another person is to some extent a loss to ourselves. Maintaining the social order involves maintaining or improving our relative social position. Tearing someone down signals to our allies that we are a valuable team member fighting on the right side, but lifting someone else up only diminishes our relative standing (unless they are a leader with whom we want to signal an alliance). Kahneman’s quote, when viewed through Robin Hanson’s perspective, is quite troubling for how our social order is built and maintained.