Incentives for Overestimating Risk

In the United States, and really across the globe, things are becoming more expensive. The price of food, gasoline, cars, and other goods has gone up quite a bit in the last year as the global economy adjusts to the new realities of the post-COVID world, as economies continue to respond to economic stimulus measures, and as uncertainty around whether the pandemic truly is in our rear-view mirror continues to hang over our global consciousness. During this time of high inflation, many TV pundits, politicians, and experts are forecasting doom and gloom for national and global economies. Forecasting bad news seems to be the norm right now.
 
 
Steven Pinker explored the incentives for overestimating risk and forecasting bad news in his 2011 book The Better Angels of Our Nature. Pinker specifically looked at violence and war and found that there are incentives for people to predict something negative, but not necessarily incentives to predict something positive. Pinker writes:
 
 
“Like television weather forecasters, the pundits, politicians, and terrorism specialists have every incentive to emphasize the worst-case scenario. It is undoubtedly wise to scare governments into taking extra measures to lock down weapons and fissile material and to monitor and infiltrate groups that might be tempted to acquire them. Overestimating the risk, then, is safer than underestimating it – though only up to a point.” (emphasis mine)
 
 
We might be safer if everyone predicts a worst-case scenario. If the people with the largest platforms focus on the dangerous potential for a terrorist attack, the public will demand action to reduce the risk. If there is a great focus on the need for improved safety equipment in hospitals responding to a new strain of COVID, then public officials are more likely to act. If there is overwhelming concern about inflation and economic collapse, the government will hopefully take better actions to balance the economy. Predicting that everything will work out and hum along on its own could be more dangerous than predicting the worst outcomes. Predicting doom and gloom not only gets attention, it can drive early and decisive decision-making.
 
 
But as Pinker notes, it is safer to overestimate risk only up to a point. Pinker cites the costs of the war in Iraq, launched in search of weapons of mass destruction that did not exist, as an example of dangerous worst-case forecasting. Overreacting to COVID in China through excessive lockdowns and insufficient vaccination efforts may be contributing to higher global prices for goods at the moment. And predicting an economic collapse could spook markets and scare consumers, leading to worse economic outcomes than might otherwise occur. There are incentives to predict the worst, but also costs if our predictions go too far.
 
 
I think our job as individuals is to be aware of the worst-case scenarios, but not to become too trapped by such predictions. We need to remember that making worst-case predictions provides feedback into what is already a noisy system. It is likely that forecasting the worst and spurring action will avert the worst. This doesn’t mean we can sit back and let others handle everything, but it should encourage us to think deeply about worst cases, our actions, and how panicked we should be.
Sunk Costs & Public Commitment

The Sunk Cost fallacy is an example of an error in human judgment that we should all try to keep in mind. Thinking about sunk costs and how we respond to them can help us make better decisions in the future. It is one small avenue of cognitive psychology research that we can act on and see immediate benefits in our lives.
 
 
Sunk costs pop up all over the place. Are you hesitant to change lines at the grocery store because you have already been waiting in one line for a while and might as well stick it out? Did you start a landscaping project that isn’t turning out the way you want, but you are afraid to give up and try something else because you have already put so much time, money, and effort into the current project? Would you be mad at a politician who wanted to pull out of a deadly war and give up because doing so would mean that soldiers died in vain?
 
 
All of these examples are instances where the sunk cost fallacy can lead us to make worse decisions. In his book The Better Angels of Our Nature, Steven Pinker writes, “though psychologists don’t fully understand why people are suckers for sunk costs, a common explanation is that it signals a public commitment.” In the examples above, changing course signals something weak about us. We are not patient enough to stick with our original decision to wait in one grocery store line rather than another. We are not committed to our vision of a perfect lawn and are willing to give up and put in cheap rock instead of seeing our sprinkler system repair all the way through. And we are not truly patriotic and don’t truly value the lives of soldiers lost in war if we are willing to give up a fight. In each of these areas, we may feel pressured to persist with our original decision, which has become more costly than we expected. Even as costs continue to mount, we feel a need to stay the course. We fail to recognize that sunk costs are in the past, that we can’t do anything to recoup them, and that we can make more efficient decisions moving forward if we can avoid feeling bad about sunk costs.
 
 
Tied to the pressure we feel is a misperception of incremental costs. Somehow additional time spent in line, additional effort spent on the lawn, and additional lives lost in battle matter less given everything that has already passed. “An increment is judged relative to the previous amount,” writes Pinker. One more life lost in a war doesn’t feel as tragic once many lives have already been lost. Another hundred dollars on sprinkler materials doesn’t feel as costly when we have already put hundreds into our landscaping project (even if $100 in rock would go further and be simpler). And another minute in line at the grocery store is compared to the time already spent waiting, distorting how we think about that time.
 
 
If we can reconsider sunk costs, we can start to make better decisions. We can get over the pride we feel waiting out the terrible line at the grocery store. We can reframe our landscaping and make the simpler decision and begin enjoying our time again. And we can save lives by not continuing fruitless wars because we don’t want those who already died to have died in vain. Changing our relationship to sunk costs and how we consider incremental costs can have an immediate benefit in our lives, one of the few relatively easy lessons we can learn from cognitive psychology research.
Street Gangs, Militias, Great Power Armies, & Game Theory

As we are learning with the War in Ukraine, understanding Game Theory is important if we want to understand why war breaks out, how long and how deadly a war will be, and how a war will come to a conclusion. Game Theory helps us think about the decision-making of the parties involved in a war and the incentives and risks that they face. Television pundits, journalists, politicians, and policy analysts are all engaging in game-theoretic evaluations of the current conflict in Ukraine to help think about a way that Russia could leave Ukraine without completely destroying the country and its population.
 
 
While much of the world has recently been thinking about Game Theory in the context of two large warring nations, the most basic way to think about and understand game theory is usually a two-person situation known as the prisoner’s dilemma. Conceptualizing Game Theory in this basic format gives us a framework that we can use to understand larger conflicts and dilemmas where parties have to make decisions anticipating the outcomes of their choices, the responses and choices of their opponents, and the subsequent reactions and decisions of everyone else along the way. The simple framework from a two-person prisoner’s dilemma scales well.
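To make the framework concrete, here is a minimal sketch of my own (not an example from Pinker’s book) that encodes the classic prisoner’s dilemma payoffs as years in prison and checks each pair of strategies for a Nash equilibrium, the outcome from which neither player can improve by switching strategy alone. The specific numbers are illustrative assumptions; any payoffs with the same ordering tell the same story.

```python
from itertools import product

# Classic prisoner's dilemma payoffs expressed as years in prison (lower is better).
# The exact numbers are illustrative; any values with the same ordering work.
payoff = {
    ("cooperate", "cooperate"): 1,
    ("cooperate", "defect"):    10,
    ("defect",    "cooperate"): 0,
    ("defect",    "defect"):    5,
}
moves = ["cooperate", "defect"]

def is_equilibrium(a, b):
    """True if neither player can lower their own sentence by unilaterally switching."""
    a_stays = all(payoff[(a, b)] <= payoff[(alt, b)] for alt in moves)
    b_stays = all(payoff[(b, a)] <= payoff[(alt, a)] for alt in moves)
    return a_stays and b_stays

for a, b in product(moves, moves):
    label = "equilibrium" if is_equilibrium(a, b) else ""
    print(f"{a:>9} / {b:<9} sentences: {payoff[(a, b)]}, {payoff[(b, a)]} {label}")
```

Only mutual defection comes out stable, even though both players would serve shorter sentences if they both cooperated. That tension between individually rational choices and mutually better outcomes is what makes the dilemma a useful model for escalation and standoffs between larger coalitions as well.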
 
 
In his book The Better Angels of Our Nature, Steven Pinker writes about Game Theory and how size doesn’t seem to matter when we think in a game-theoretic way. He writes:
 
 
“The same psychological or game-theoretic dynamics that govern whether quarreling coalitions will threaten, back down, bluff, engage, escalate, fight on, or surrender apply whether the coalitions are street gangs, militias, or armies of great powers. Presumably this is because humans are social animals who aggregate into coalitions, which amalgamate into larger coalitions, and so on.”
 
 
Our coalitions are large and complex, but they are still organized around humans. Our social nature is predictable, meaning that Game Theory can apply in any human coalition, regardless of size. Quite often our large coalitions, especially coalitions that employ violence, are ultimately led by a single individual who can command the decisions to use violence. This all contributes to Game Theory’s application across two-person interactions, high school gangs, or armies comprising hundreds of thousands of soldiers. Game Theory helps us understand the decision-making of all these groups, regardless of their complexity and size.
Scarcity & Short-Term Thinking

I find critiques of people living in poverty to generally be unfair and shallow. People living in poverty with barely enough financial resources to get through the day are criticized for not making smart investments of their time and money, and are criticized when they spend in a seemingly irrational manner. But for low income individuals who can’t seem to get ahead no matter what jobs they take, these critiques seem to miss the reality of life at the poorest socioeconomic level.
I wrote recently about the costs of work, which are not often factored into our easy critiques of the poor or unemployed. Much of America has inefficient and underinvested public transit. The time involved in catching a bus (or two) to get to work is huge compared with simply driving. Additionally, subways and other transit can be dangerous (there is no shortage of YouTube videos of people having phones stolen on public transit). This means that owning and maintaining a car can be essential for being able to work, an expense that can make working prohibitive for those living in poverty.
The example of transportation to work is meant to demonstrate that not working can be a more rational choice for the poorest among us. Work involves extra stress and costs, and the individual might not break even, making unemployment the more rational choice. There are a lot of instances where the socially desirable thing becomes the irrational choice for those living in poverty. If we do not recognize this reality, then we will unfairly criticize the choices and decisions of the poor.
In his book Evicted, Matthew Desmond writes about scarcity and short-term thinking, showing that they are linked and demonstrating how this shapes the lives of those living in poverty: “research show[s] that under conditions of scarcity people prioritize the now and lose sight of the future, often at great cost.” People living in scarcity have trouble thinking ahead and planning for their future. When you don’t know where you will sleep, where your next meal will come from, and whether you will be able to afford the next basic necessities, it is hard to think ahead to everything you need to do for basic living in American society. Your decisions might not make sense to the outside world, but to you they make sense because all you have is the present moment, with no prospects regarding the future to plan for or think about. Sudden windfalls may be spent irrationally, time may not be spent resourcefully, and tradeoffs that benefit the current moment at the expense of the future may seem like obvious choices if you live in constant scarcity.
Combined, the misperceptions about the cost of work and the psychological short-termism resulting from scarcity show us that we have to approach poverty differently from how we approach lazy middle-class individuals. I think we design our programs for assisting those in poverty while thinking of lazy middle-class people. We don’t think about individuals who are actually so poor that the costs of work that most of us barely think about become crippling. We don’t consider how scarcity shapes the way people think, leading them to make poor decisions that seem obvious for us to critique from the outside. Deep poverty creates challenges and obstacles that are separate from the problem of freeloading and lazy middle-class children or trust-fund babies. We have to recognize this if we are to actually improve the lives of the poorest among us and create a better social and economic system to help integrate those individuals.
The Screening-Off Effect

Sometimes to our great benefit, and sometimes to our detriment, humans like to put things into categories – at least Western, Educated, Industrialized, Rich, Democratic (WEIRD) people do. We break things into component parts and assign each part to a category. We do this with things like planets, animals, and players within sports. We like established categories and dislike it when our categorizations change. This ability has greatly helped us in science and strategic planning, allowing our species to do incredible things and learn crucial lessons about the world. What is remarkable about this ability is how natural and easy it is for us, but how hard it is to explain or program into a machine.
One component of this remarkable ability is referred to as the screening-off effect by Judea Pearl in The Book of Why. Pearl writes, “how do we decide which information to disregard, when every new piece of information changes the boundary between the relevant and the irrelevant? For humans, this understanding comes naturally. Even three-year-old toddlers understand the screening-off effect, though they don’t have a name for it. … But machines do not have this instinct, which is one reason that we equip them with causal diagrams.”
From a young age we know what information is the most important and what information we can ignore. We intuitively have a good sense for when we should seek out more information and when we have enough to make a decision (although sometimes we don’t follow this intuitive sense). We know there is always more information out there, but don’t have time to seek out every piece of information possible. Luckily, the screening-off effect helps us know when to stop and makes decision-making possible for us.
Beyond knowing when to stop, the screening-off effect helps us know when to ignore irrelevant information. The price of tea in China isn’t a relevant factor for us when deciding what time to wake up the next morning. We recognize that there are no meaningful causal pathways between the price of tea and the best time for us to wake up. This causal insight, however, doesn’t exist for machines that are only programmed with the specific statistics we build into them. We specifically have to code a causal pathway that doesn’t include the price of tea in China for a machine to know that it can ignore that information. The screening-off effect, Pearl explains, is part of what allows humans to think causally. In cutting-edge science there are many factors we wouldn’t think to screen out that may impact the results of scientific experiments, but for the most part, we know what can be ignored and can look at the world around us through a causal lens because we know what is and is not important.
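As a rough illustration of screening off (my own sketch, not an example from The Book of Why), consider a simple causal chain X → Z → Y. Once the mediator Z is known, learning X tells us essentially nothing more about Y, which is exactly the kind of information a causal diagram lets a machine ignore. The simulation below uses assumed toy relationships and only numpy.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# A toy causal chain X -> Z -> Y: X influences Y only through the mediator Z.
x = rng.normal(size=n)        # upstream cause
z = x + rng.normal(size=n)    # mediator, driven by x plus noise
y = z + rng.normal(size=n)    # outcome, driven by z plus noise

# Unconditionally, X and Y are clearly related...
print("corr(X, Y):", round(float(np.corrcoef(x, y)[0, 1]), 2))

# ...but once Z is (approximately) held fixed, X adds almost nothing about Y.
# Z "screens off" X, so a reasoner can safely ignore X when Z is known.
held = np.abs(z) < 0.1
print("corr(X, Y | Z ~ 0):", round(float(np.corrcoef(x[held], y[held])[0, 1]), 2))
```

The first correlation is substantial while the second hovers near zero: knowing the mediator makes the upstream variable irrelevant, which is the screening-off intuition three-year-olds apply without any arithmetic.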
When to Stop Thinking

My last post was about closed-mindedness and focused on how closed-minded people fail to make appropriate inquiries to gain the information necessary to make good decisions and accurately understand the world. What the post didn’t ask is when we should stop thinking and make a decision, versus when we should continue our investigations to gain more knowledge. A serious problem, and one we avoid when we are closed-minded, is often referred to as paralysis by analysis. It occurs when you lack confidence in decision-making and continually seek more information before making a decision, potentially delaying your choice or any action indefinitely.
Writing about this idea in Vices of the Mind, Quassim Cassam writes, “our investigations can be open-ended and there is often, though not always, scope for further investigation.” Sometimes we are asking questions and doing research on continually evolving topics. Sometimes we are working at a cutting edge where changes in politics, markets, social trends, and scientific breakthroughs can influence what we do from day to day. There never is a final answer, and we have to continually seek new information in order to adapt. However, this doesn’t mean that we can’t make important decisions that require thoughtful deliberation.
“A good investigator,” Cassam writes, “has a sense of when enough is enough and diminishing returns are setting in. But the decision to call a halt at that point isn’t properly described as closed-minded. What allows us to get on with our lives isn’t closed-mindedness but the ability to judge when no further research into the question at hand is necessary.”
Closed-minded people make decisions while ignoring pertinent information. Open-minded people make decisions while ignoring extraneous information. Over time, if we practice long enough, each of us should improve our judgements and become better at recognizing the diminishing returns of continued research. We might continue to learn a bit more as we continue to study, but the value of each new bit of information will be smaller and smaller, and at some point won’t truly impact our decisions. A novice might have trouble identifying this point, but an expert should be better. A closed-minded person doesn’t look for this optimal point, but an open-minded person does, continually updating their priors and judgements on when they have enough information to make a decision, rather than rigidly locking in with a specific set of information. This is how we avoid paralysis by analysis and how we improve our decision-making over time to get on with our lives, as Cassam writes.
A Leader’s Toolbox

In the book Risk Savvy, Gerd Gigerenzer describes the work of top executives within companies as being inherently intuitive. Executives and managers within high-performing companies are constantly pressed for time. There are more decisions, more incoming items that need attention, and more things to work on than any executive or manager can adequately handle on their own. Consequently, delegation is necessary, as is quick decision-making based on intuition. “Senior managers routinely need to make decisions or delegate decisions in an instant after brief consultation and under high uncertainty,” writes Gigerenzer. This combination of quick decision-making under uncertainty is where intuition comes into play, and the ability to navigate these situations is what truly comprises the leader’s toolbox.

 

Gigerenzer stresses that the intuitions developed by top managers and executives are not arbitrary. Successful managers and companies tend to develop similar toolboxes that help encourage trust and innovation. While many individual-level decisions are intuitive, the structure of the leader’s toolbox often becomes visible and intentional. As an example, Gigerenzer highlights a line of thinking he uncovered when working on a previous book. He writes, “hire well and let them do their jobs reflects a vision of an institution where quality control (hire well) goes together with a climate of trust (let them do their jobs) needed for cutting-edge innovation.”

 

In many companies and industries, the work to be done is incredibly complex, and a single individual cannot manage every decision. The structure of the decision-making process necessarily needs to be decentralized for the individual units of the team to work effectively and efficiently. Hiring talented individuals and providing them with the autonomy and tools necessary to be successful is the best approach to get the right work done well.

 

Gigerenzer continues, “Good leadership consists of a toolbox full of rules of thumb and the intuitive ability to quickly see which rule is appropriate in which context.”

 

A leader’s toolbox doesn’t consist of specific lists of what to do in certain situations or even specific skills that are easy to check off on a resume. A leader’s toolbox is built by experience in a diverse range of settings and intuitions about things as diverse as hiring, teamwork, and delegation. Because innovation is always uncertain and always includes risk, leaders must develop intuitive skills and be able to make quick and accurate judgements about how to best handle new challenges and obstacles. Intuition and gut-decisions are an essential part of leadership today, even if we don’t like to admit that we make important decisions on intuition.
Defensive Decision-Making

One of the downfalls of a negative error culture is that people become defensive over any mistake they make. Errors and mistakes are shamed, and people who commit errors do their best to hide them or deflect responsibility. Within negative error cultures you are more likely to see people taking steps to distance themselves from responsibility before a decision is made, practicing what is called defensive decision-making.

 

Gerd Gigerenzer expands on this idea in his book Risk Savvy by writing, “defensive decision making [is] practiced by individuals who waste time and money to protect themselves at the cost of others, including their companies. Fear of personal responsibility creates a market for worthless products delivered by high-paid experts.”

 

Specifically, Gigerenzer writes about companies that hire expensive outside experts and consultants to make market predictions and help improve company decision-making. The idea is that individual banks, corporations, and sales managers can’t accurately know the state of a market as well as an outside expert whose job it is to study trends, talk to market actors, and understand how the market relates to internal and external pressures. The problem, as Gigerenzer explains, is that even experts are not very good at predicting the future of a market. There is simply too much uncertainty for anyone to be able to say that market trends will continue, that a shock is coming, or that a certain product or service is about to take off. Experts make these types of predictions all the time, but evidence suggests that their predictions are not much better than just throwing dice.

 

So why do companies pay huge fees, sit through lengthy meetings, and spend time trying to understand and adapt to the predictions of experts? Gigerenzer suggests that it is because individuals within the company are practicing defensive decision-making. If you are a sales manager and you make a decision to sell to a particular market with a new approach after analyzing performance and trends of your own team, then you are responsible for the outcome of the new approach and strategy. If it works, you will look great, but if it fails, then you will be blamed for not understanding the market, for failing to see the signs that indicated your plan wasn’t going to succeed, and for misinterpreting past trends. However, if a consultant suggested a course of action, presented your team with a great visual presentation, and was certain that they understood the market, then you escape blame when the plan doesn’t work out. If even the expert couldn’t see what was going to happen, then how could you be blamed for a plan not working out?

 

Defensive decision-making is good for the individual, but bad for the larger organization that the individual is a part of. Companies would be better off if they made decisions quicker, accepted risk, and could openly evaluate success and failure without having to place too much blame on individuals. Companies could learn more about their errors and could do a better job identifying and promoting talent. Defensive decision-making is expensive, time consuming, and outsources blame, preventing companies and organizations from actually learning and improving their decision-making over the long run.
Positive Error Cultures

My last post was about negative error cultures and the harm they can create. Today is about the flip side, positive error cultures and how they can help encourage innovation, channel creativity, and help people learn to improve their decision-making. “On the other end of the spectrum,” writes Gerd Gigerenzer in Risk Savvy, “are positive error cultures, that make errors transparent, encourage good errors, and learn from bad errors to create a safe environment.”

 

No one likes to make errors. Whether it is a small error on our personal finances or a major error on the job, we would all rather hide our mistakes from others. In school we probably all had the experience of quickly stuffing away a test that received a bad grade so that no one could see how many questions we got wrong. Errors in life have the same feeling, but like mistakes on homework, reports, or tests, hiding our errors doesn’t help us learn for the future. In school, reviewing our mistakes and being willing to work through them helps us better understand the material and shows us where we need to study for the final exam. In life, learning from our mistakes helps us become better people, make smarter decisions, and be more prepared for future opportunities.

 

This is why positive error cultures are so important. If we are trying to do something new, innovative, and important, then we are probably going to be in a position where we will make mistakes. If we are new homeowners and don’t know exactly how to tackle a project, we will certainly err, but by learning from our mistakes, we can improve and better handle similar home improvement projects in the future. Hiding our errors will likely lead to greater costs in the future, and will leave us dependent on others to do costly work around the house. Business is the same way. If we want to earn a promotion or want to do something innovative to solve a new problem, we are going to make mistakes. Acknowledging where we were wrong and why we made an error helps us prepare for future challenges and opportunities. It helps us learn and grow rather than remaining stuck in one place, not solving any problems and not preparing for future opportunities.

 

80,000 Hours has a great “Our Mistakes” section on their website, demonstrating a positive error culture.
A Mixture of Risks

In the book Risk Savvy, Gerd Gigerenzer explains the challenges we have with thinking statistically and how these difficulties can lead to poor decision-making. Humans have trouble holding lots of complex and conflicting information. We don’t do well with decisions involving risk and decisions where we cannot possibly know all the relevant information necessary for the best decision. We prefer to make decisions involving fewer variables, where we can have more certainty about our risks and about the potential outcomes. This leads to the substitution effect that Daniel Kahneman describes in his book Thinking, Fast and Slow, where our minds substitute an easier question for the difficult question without us noticing.

 

Unfortunately, this can have bad outcomes for our decision-making. Gigerenzer writes, “few situations in life allow us to calculate risk precisely. In most cases, the risks involved are a mixture of more or less well known ones.” Most of our decisions that involve risk have a mixture of different risks. They are complex decisions with tiers and potential cascades of risk based on the decisions we make along the way. Few of our decisions involve just one risk independent of others that we can know with certainty.

 

If we consider investing for retirement, we can see how complex decisions involving risk can be and how a mixture of risks is present across all the decisions we have to make. We can hoard money in a safe in our house, where we reduce the risk of losing any of our money, but we risk being unable to save enough by the time we are ready to retire. We can invest our money, but we have to make decisions regarding whether we will keep it in a bank account, invest it in the stock market, or look to other investment vehicles. Our bank is unlikely to lose much money, and is low risk, but is also unlikely to help us increase the value of our savings enough for retirement. Investing with a financial advisor takes on more risk, such as the risk that we are being scammed, the risk that the market tanks and our advisor made bad investments on our behalf, and the risk that we won’t have access to our money if we were to need it quickly in case of an emergency. What this shows is that even the most certain option for our money, protecting it in a secret safe at home, still contains additional risks for the future. The option that is likely to provide us with the greatest return on our savings, investing in the stock market, has a mixture of risks associated with each investment decision we make after the initial decision to invest. There is no way we can calculate and fully comprehend every risk involved with such an investment decision.
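A small simulation can make the point that even the “safe” option carries risk. The sketch below is my own, with made-up parameters rather than anything from Risk Savvy: it compares saving cash in a home safe, where inflation quietly erodes real value, against investing in a volatile market, where the expected value is higher but the spread of outcomes is far wider.

```python
import numpy as np

rng = np.random.default_rng(1)
runs, years, annual_saving = 10_000, 30, 10_000   # made-up inputs for illustration

# Option 1: cash in a home safe. The nominal amount is certain,
# but inflation (assumed ~3% +/- 1% per year) erodes its real value.
inflation = rng.normal(0.03, 0.01, size=(runs, years))
cash = np.zeros(runs)
for t in range(years):
    cash = (cash + annual_saving) / (1 + inflation[:, t])

# Option 2: stock-market investing. Higher assumed real return (~5%),
# but with a much wider spread of outcomes (~15% standard deviation).
returns = rng.normal(0.05, 0.15, size=(runs, years))
invested = np.zeros(runs)
for t in range(years):
    invested = (invested + annual_saving) * (1 + returns[:, t])

for name, balances in (("cash in safe", cash), ("invested", invested)):
    lo, med, hi = np.percentile(balances, [5, 50, 95])
    print(f"{name:>12}: 5th={lo:,.0f}  median={med:,.0f}  95th={hi:,.0f}")
```

The point is not the specific numbers, which are invented, but the shape of the result: the “certain” option still loses ground to inflation, while the higher-return option trades that risk for volatility. Neither choice eliminates risk; each simply carries a different mixture of it.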

 

Risk is complex, and we rarely deal with a single decision involving a single calculable risk at one time. Our brains are likely to flatten the decision by substituting simpler decisions, eliminating some of the risks from consideration and helping our mind focus on fewer variables at a time. Nevertheless, the complex mixture of risks doesn’t go away just because our brains pretend it isn’t there.