The precautionary principle in governance

A Factor for Paralysis in Regulation & Legislation

A common complaint today in the United States is that nothing gets done. We are frustrated by political leaders who can’t pass important legislation. We dislike how slow local governments are to update infrastructure, adopt new technologies, and make improvements in the places we live. Gridlock has become the norm, and the actions that governments take seem to be too little too late.

 

But is this criticism really fair? Is the problem slow governments, ineffectual legislators, and inept public officials? Daniel Kahneman, in his book Thinking Fast and Slow, highlights a basic aspect of human psychology that might be one of the major contributing factors to the paralysis we see in governance today. It has nothing to do with the quality of officials and legislators, but instead is all about the structures and systems of incentives that elected officials and policy actors respond to. The precautionary principle, a side effect of our general tendency toward loss aversion and our stance against taboo tradeoffs, drives our paralysis, and it is a logical response to the structure of many of our governing institutions.

 

Governments are necessary parts of human society, helping us establish rules for how we will live, interact, and make decisions collectively. Governments make investments, determine safety and efficacy standards, and help allocate resources across populations. In each of these functions of governance there is a possibility for error, a possibility for failure, and risk involved in the decisions. This is where the precautionary principle comes in. Kahneman writes,

 

“In the regulatory context, the precautionary principle imposes the entire burden of proving safety on anyone who undertakes actions that might harm people or the environment. Multiple international bodies have specified that the absence of scientific evidence of potential damage is not sufficient justification for taking risks. … the precautionary principle is costly, and when interpreted strictly it can be paralyzing.”

 

When risk is involved in decision-making processes, elected officials and public leaders are held responsible for any bad outcomes that come to pass. There will always be a chance that a government investment fails, and no public official wants that failure to reflect poorly on their decision-making. There is always the risk that allocated resources could be misused, and it is often the official who approved the resource allocation (as well as the bad actor themselves) who faces consequences. When there is a deliberate decision to trade off some level of safety, or to accept an increase in risk in exchange for improved economic performance, faster traffic flows, or reduced government spending, public leaders and elected officials are the ones who look bad when something goes wrong.

 

The way our governance operates today encourages the precautionary principle. Risk is incredibly dangerous for public leaders, so the safer and more costly approach feels like the right choice in each individual decision. Over time, however, the costs add up, the paralysis becomes suffocating, and the public becomes dissatisfied and cynical. The answer might not be to completely cut out the regulation and safety apparatus of the government (that didn’t work well for President Trump, who eliminated the NSC directorate for global health security and biodefense). The answer will be new structures for governance, new ways to allow government to take risks, and new ways to understand the risks that we all take in our lives. None of these are easy or simple transitions, but they are likely what we need in order to survive in a more complex and turbulent world.
Taboo Tradeoffs


A taboo tradeoff occurs when we are faced with the dilemma of exchanging something that we are not supposed to give up for money, food, or other resources. Our time, attention, energy, and sometimes even our happiness are perfectly legitimate to trade, but things like health and safety generally are not. We are expected to exchange our time, attention, and physical labor for money, but we are not expected to exchange our personal health for money. When I first read about taboo tradeoffs in Daniel Kahneman’s book Thinking Fast and Slow, the year was 2019, and we had not yet entered a period defined by a global pandemic, in which people began to challenge the taboo against trading health and safety for entertainment, for trials of COVID-19 cures, and for signals of political allegiance.

 

In the book, Kahneman suggests that holding to hard rules against taboo tradeoffs actually makes us all worse off in the end. “The taboo tradeoff against accepting any increase in risk is not an efficient way to use the safety budget,” he writes. Kahneman’s point is that we can spend huge amounts of resources to ensure that there is absolutely no risk to ourselves, our children, or to others, but that we would be better off allocating those resources in a different way. I think Kahneman is correct, but I think that his message has the potential to be read very differently in 2020, and deserves more careful and nuanced discussion.

 

“The intense aversion to trading increased risk for some other advantage plays out on a grand scale in the laws and regulations governing risk.” The important point to note is that complete security and safety come at the cost of other advantages. The advantage of driving to a football game is that we get to enjoy watching live sports; the risk is that we could be in a serious traffic accident. The advantage of using bug spray is that we kill the creepy crawlies in the dark corners of the garage; the risk is that we (or a child or pet) could accidentally ingest the poison. The safest things to do would be to watch the game on TV and to use a broom and boot to kill the bugs, but if we avoid the risk then we give up the advantages of seeing live sports and using efficient pest control products.

 

Kahneman notes that when we make these decisions, we often make them based on a fear of regret more than out of altruistic concern for our own health and safety or for the health and safety of others. If you accepted some level of risk to your child’s safety, and they died, you would feel immense regret and shame, and so you avoid the taboo tradeoff to prevent your own shame. When this plays out across society in millions of large and small examples, we end up in a collectively risk averse paralysis, and society gives up huge advantages because there is a possibility of risk for some individuals.

 

To address the current global state of affairs, I think Kahneman would recognize the risk of COVID-19 and would not encourage us to trade our health and safety (and the health and safety of others) for the enjoyment of a birthday party, holiday meal, or other gathering without wearing masks and taking other precautions. Throughout the book Kahneman highlights the difficulties and challenges of thinking through risk. He addresses the many biases that play into how we behave and how we understand the world. He demonstrates the difficulties we have in thinking statistically and understanding complex probabilities. The takeaway from Kahneman in regard to the taboo tradeoff is that there is a level at which our safety efforts cost more than the advantages we could attain by giving up some of that safety. It isn’t necessarily on each of us individually to try to decide exactly what level of risk society should accept; it is up to the experts who can engage their System 2 brain and evaluate their biases to help the rest of us better understand and conceptualize risk. We might be able to do some things understanding that there is a level of risk we take when engaging in society in 2020, but adequate precautions can still mitigate that risk and help us maintain a reasonable balance of safety tradeoffs while enjoying our lives.
Risk Averse and Risk Seeking - Joe Abittan


I would generally categorize myself as somewhat risk averse, but studies from Daniel Kahneman in Thinking Fast and Slow might suggest that I’m not really any different than anyone else. I might just be responding to the set of circumstances that I typically experience, similar to anyone else, and I might just be more aware of times when I am risk averse rather than times when I am more risk seeking. In particular, I might be risk averse in certain situations and categorize those situations correctly, but risk seeking in other situations without recognizing it.

 

Kahneman uses examples throughout his book to demonstrate to the audience that common cognitive errors and psychological tendencies are shared by even the most savvy readers who would pick up a book like Thinking Fast and Slow. Kahneman even uses anecdotes from his own life and his own thoughts to demonstrate how deep knowledge of cognitive biases and errors doesn’t make one immune. After demonstrating how our minds can lead us to be risk averse in some settings and risk seeking in others, Kahneman cautions us against a typical pattern that many of us will find ourselves in. “It is costly to be risk averse for gains and risk seeking for losses.”

 

On its own, this quote doesn’t seem to reveal anything too interesting, but in the context of Kahneman’s experiments and examples, it reveals a lot about the way we behave when we are risk seeking or risk averse. When we are offered a flat sum or a gamble with the potential to win more than the flat sum, we often won’t be willing to take the gamble. The guaranteed money is more appealing to us than the prospect of a larger payoff that carries a chance of gaining nothing. When it comes to gains, we are often risk averse, preferring the sure thing rather than the possibility of getting more with the risk of getting nothing or facing a cost.

 

However, we become risk seeking when we stand to lose something. As long as there is a small outside chance that we won’t lose anything, we will avoid a certain loss, risk a larger loss, and take a gamble. In Kahneman’s example he demonstrates how people will quickly turn down a sure loss of $750 for a 25% chance of losing nothing, even when there is a 75% chance of losing $1000.

 

When you do the math, the gamble’s expected loss is the same $750 (75% of $1,000), so the sure loss is at least as good, and a risk averse decision maker should prefer it because it carries no chance of the larger $1,000 loss. However, our minds don’t perceive things this way. When we stand to win something, we tend to become conservative and risk averse, but if we stand to lose something, we suddenly become more risk seeking. Combining these two tendencies can be dangerous. It means we stand to gain much less than we might if we flipped our biases around, and it also means we are more likely to face greater losses with greater frequency than if we had been less risk seeking with regard to losses.
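A quick way to check the arithmetic is to compute the probability-weighted value of each option; this is a minimal sketch in Python, not an example from the book:

```python
def expected_value(outcomes):
    """Probability-weighted average of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

sure_loss = expected_value([(1.0, -750)])            # take the certain $750 loss
gamble = expected_value([(0.75, -1000), (0.25, 0)])  # gamble on the 25% escape
print(sure_loss, gamble)  # -750.0 -750.0
```

On average the two options cost exactly the same, which is why the risk averse choice is the sure loss: it offers the same expected cost with none of the variance.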

 

If we think about this in the context of our lives more generally, we can see that categorizing ourselves and most of our friends as either risk averse or risk seeking doesn’t necessarily make sense. When you are young, you really don’t have much that you worry about losing. It makes sense that you might be more risk seeking, more willing to take on behaviors and ideas that are risky but might have a big upside. You might procrastinate on important homework, retirement savings, and household chores because the main thing you stand to lose is time (often the only thing you have when you are really young), and you can gamble on the consequences.

As you get older, once you are established in a career, own a home, have a 401K, and move through life in general, you stand to lose more. Gains throughout your life become less significant due to Tyler Cowen‘s favorite idea, diminishing marginal returns. It becomes harder to give up guaranteed gains because the marginal increase in a potential gain through a gamble is less appealing. You become risk averse as you get older, and in more situations, as you grow to have more things to worry about losing.

Therefore, categorizing people as generally risk averse or generally risk seeking is meaningless. You need to look at the circumstances of their lives, where they find themselves in terms of social status, what material possessions they have, and what their family structure is like, and you will start to understand why they make generally more risk averse or generally more risk seeking decisions. There is probably some variability across people, but I would expect the structures and systems in place around us shape our behavior more than any genetic or inherent factors.
Desperate Gambles


Daniel Kahneman worked with Amos Tversky to develop many of the concepts that today make up Prospect Theory. Many people are familiar with the psychological and economic principle of Game Theory; Prospect Theory is a similar psychological and economic theory of how people behave when faced with uncertainty. In his book Thinking Fast and Slow, Kahneman shares one of the early surprises of Prospect Theory that he and Tversky uncovered.

 

Prospect Theory gets its name from the way people behave when faced with different prospects, that is, different potential outcomes with different likelihoods attached. This is similar to Game Theory, but instead of making decisions while another actor makes decisions that affect your final outcome, in Prospect Theory you are generally choosing between a sure thing and a gamble. From the theory comes the fourfold pattern, which Kahneman uses to explain why large legal settlements are common, why people participate in lotteries, and why we buy insurance. What was surprising from Prospect Theory was the fourth block in the fourfold pattern, which describes why some people are willing to take desperate gambles that have incredibly small likelihoods of paying off.

 

Kahneman writes, “when you consider a choice between a sure loss and a gamble with a high probability of a larger loss, diminishing sensitivity makes the sure loss more aversive, and the certainty effect reduces the aversiveness of the gamble.”

 

If you are suing a large corporation for damages, you are likely to accept a settlement below the expected value of your claim. Say you are suing the company for $1 million with a strong chance of winning (say 95%), and the company offers you a $900,000 settlement. You are likely to feel pressure to take the settlement to make sure you walk away with something, even though pressing on is worth $950,000 in expectation. A guaranteed $900,000 is likely to be preferable to the 95% chance of winning the full $1 million paired with the small chance of walking away with nothing. This is one square in the fourfold pattern that fit with Kahneman and Tversky’s prior expectations: risk aversion for high-probability gains.

 

What surprised the pair was the tendency of individuals when the tables are turned. Say you face the prospect of a large loss of $100,000 with only a fringe legal possibility of getting off without facing any losses. If you are offered a settlement where your costs will be $10,000 instead of $100,000, you are likely to feel pressure to turn down the settlement so long as some possibility of getting away without any losses remains. When we look at the expected value, we find that accepting the settlement is the risk averse option, but few of us will be content taking the settlement.
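Putting a concrete number on that fringe possibility makes the expected value easy to compute. The 5% figure below is my own hypothetical assumption, not Kahneman's:

```python
def expected_value(outcomes):
    """Probability-weighted average of (probability, payoff) pairs."""
    return sum(p * v for p, v in outcomes)

# Assumed odds: 95% chance of losing $100,000, 5% chance of losing nothing
gamble = expected_value([(0.95, -100_000), (0.05, 0)])
print(round(gamble))  # -95000
```

Any settlement that costs less than $95,000 beats the gamble in expectation, yet the slim hope of escaping the loss entirely pushes people toward court.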

 

We see this with politicians who take an “I’ll risk burning it all down to stay in power” approach, with poker players who get in too deep and misread an opponent or a hand, and with hockey teams who pull the goalie knowing that a sure loss is coming if they don’t take a big risk and get another offensive player on the ice while leaving their net open. When the sure loss is severe enough, then even large gambles with a minimal chance of success are worth the risk.

 

Kahneman and Tversky were surprised by this because it seems to violate our normal pattern of acting based on expected value. We don’t consciously calculate the expected value of an event, but we usually do act in accordance with expected value. However, in these desperate situations, we actually choose the option with the worse expected value. We become less sensitive to the very likely large loss, and are unwilling to take the sure loss, violating expectations of risk aversion.
Avoiding Gambles


“Most people dislike risk (the chance of receiving the lowest possible outcome), and if they are offered a choice between a gamble and an amount equal to its expected value they will pick the sure thing,” writes Daniel Kahneman in Thinking Fast and Slow. I don’t want to get too far into expected value, but in my mind I think of it as a discount on the total value of the best outcome of a gamble, blended with the possibility of getting nothing. Rather than the expected value of a $100 bet being $100, the expected value is going to come in somewhere less than that, maybe around $50, $75, or $85, depending on whether the odds of winning the bet are so-so or pretty good. You will either win $100 or $0, not $50, $75, or $85, but the risk factor causes us to value the bet at less than the full amount up for grabs.
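That informal "discount" is just the probability-weighted value of the bet; a minimal sketch:

```python
def expected_value(p_win, prize):
    """Value of a bet that pays `prize` with probability p_win, else nothing."""
    return p_win * prize

# The $50 / $75 / $85 figures above correspond to these win probabilities:
for p in (0.50, 0.75, 0.85):
    print(p, round(expected_value(p, 100), 2))
```

As the odds of winning improve, the expected value climbs toward the full $100, which is why we discount a long-shot bet much more heavily than a near-sure thing.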

 

What Kahneman describes in his book is an interesting phenomenon where people will mentally (or maybe subjectively is the better way to put it) calculate an expected value in their head when faced with a betting opportunity. If the expected value of the bet that people calculate for themselves is not much higher than a guaranteed option, people will pick the guaranteed option. The quote I used to open the post explains the phenomenon which you have probably seen if you have watched enough game show TV. As Kahneman continues, “In fact a risk-averse decision maker will choose a sure thing that is less than the expected value, in effect paying a premium to avoid the uncertainty.”

 

On game shows, people will frequently walk away with a modest sum of cash, forgoing the big possible payoff, if they are risk averse or if the odds seem really stacked against them. What is interesting is that we can study when people take the bet versus when they walk away, and observe patterns in our decision making. It turns out we can predict the situations that drive people toward avoiding gambles, and the situations which encourage them. If a certain outcome is close to a gamble’s expected value, people will pick the certain outcome. If there is no certain option, the potential reward usually has to be about two times the possible loss before people will be comfortable with a bet. We might like to take chances and gamble from time to time, but we tend to be pretty risk averse: we prefer guaranteed outcomes, even at a slight cost relative to the expected value of a bet, over the chance of losing it all.
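The two-to-one threshold can be captured in a one-line model of loss aversion. The coefficient of 2.0 is a rough figure consistent with Kahneman and Tversky's estimates (typically reported between 1.5 and 2.5), not an exact constant:

```python
def accepts_gamble(gain, loss, loss_aversion=2.0):
    """A loss averse agent takes an even-odds gamble only when the
    potential gain outweighs the psychologically inflated loss."""
    return gain > loss_aversion * loss

print(accepts_gamble(150, 100))  # False: a $150 upside doesn't offset a $100 downside
print(accepts_gamble(250, 100))  # True: the payoff exceeds twice the possible loss
```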
How We Choose to Measure Risk


Risk is a tricky thing to think about, and how we choose to measure and communicate risk can make it even more challenging to comprehend. Our brains like to categorize things, and categorization is easiest when the categories are binary or represent three or fewer distinct possibilities. Once you start adding options and different possible outcomes, decisions quickly become overwhelmingly complex, and our minds have trouble sorting through the possibilities. In his book Thinking Fast and Slow, Daniel Kahneman discusses the challenges of thinking about risk, and highlights another level of complexity: deciding what measurements we are going to use to communicate and judge risk.

 

Humans are pretty good at estimating coin flips – that is to say, our brains do OK with binary 50-50 outcomes (although, as Kahneman shows in his book, this can still trip us up from time to time). Once we have to start thinking about complex statistics, like how many people will die from cancer caused by smoking if they smoke X number of packs of cigarettes per month for X number of years, our brains start to have trouble keeping up. However, there is an additional decision that needs to be layered on top of statistics such as cigarette-related death statistics before we can begin to understand them. That decision is how we are going to report the death statistics. Will we choose to report deaths per thousand smokers? Will we choose to report deaths by the number of packs smoked over a number of years? Will we just choose to report deaths among all smokers, regardless of whether they smoked one pack per month or one pack before lunch every day?
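The same underlying statistic can be framed to sound alarming or modest depending on the measure chosen; a toy illustration with entirely made-up figures:

```python
smokers = 10_000_000   # hypothetical number of smokers
deaths = 25_000        # hypothetical annual smoking-related deaths

# The identical fact, reported two ways:
print(f"{deaths:,} deaths per year")                  # a raw count sounds large
print(f"{1000 * deaths / smokers} deaths per 1,000")  # a rate sounds small
```

Neither framing is false, but each nudges the reader toward a different judgment of how risky smoking is.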

 

Kahneman writes, “the evaluation of the risk depends on the choice of a measure – with the obvious possibility that the choice may have been guided by a preference for one outcome or another.”

 

Political decisions cannot be escaped, even when we are trying to make objective and scientific statements about risk. If we want to convey that something is dangerous, we might choose to report overall death numbers across the country. Those death numbers might sound large, even though they may represent a very small fraction of incidents. In our lives today, this may be done with COVID-19 deaths, voter fraud instances, or wildfire burn acreage. Our brains will have a hard time comprehending risk in each of these areas, and adding the complexity of how that risk is calculated, measured, and reported can make it virtually impossible for any of us to comprehend risk. Clear and accurate risk reporting is vital for helping us understand important risks in our lives and in society, but the entire process can be derailed if we choose measures that don’t accurately reflect risk or that muddy the waters of exactly what the risk is.
Probability Judgments


Julia Marcus, an epidemiologist at Harvard Medical School, was on a recent episode of the Ezra Klein show to discuss thinking about personal risk during the COVID-19 Pandemic. Klein and Marcus talked about the ways in which the United States Government has failed to help provide people with structures for thinking about risk, and how this has pushed risk decisions onto individuals. They talked about how this creates pressures on each of us to determine what activities are worthwhile, what is too risky for us, and how we can know if there is a high probability of infection in one setting relative to another.

 

On the podcast they acknowledged what Daniel Kahneman writes about in his book Thinking Fast and Slow – humans are not very good at making probability judgments. Risk is all about probability. It is fraught with uncertainty, with small likelihoods of very bad outcomes, and with conflicting opinions and desires. Our minds, especially in their normal operating mode of quick associations and judgments, don’t have the capacity to think statistically in the way that is necessary to make good probability judgments.

 

When we try to think statistically, we often turn to substitutions, as Kahneman explains in his book. “We asked ourselves how people manage to make judgments of probability without knowing precisely what probability is. We concluded that people must somehow simplify that impossible task and we set out to find how they do it. Our answer was that when called upon to judge probability, people actually judge something else and believe they have judged probability.”

 

This is very important when we think about our actions, and the actions of others, during this pandemic. We know it is risky to have family dinners with our loved ones, and we ask ourselves if it is too risky to get together with our parents, with siblings who are at risk due to health conditions, and if we shouldn’t be in the same room with a family member who is a practicing medical professional. But in the end, we answer a different question. We ask how much we miss our parents, if we think it is important to be close to our family, and if we really really want some of mom’s famous pecan pie.

 

As Klein and Marcus say during the podcast, it is a lot easier to be angry at people at a beach than to make probability judgments about a small family dinner. When governments, public health officials, and employers fail to establish systems to help us navigate the risk, we place the responsibility back onto individuals, so that we can have someone to blame, some sense of control, and an outlet for the frustrations that arise when our mind can’t process probability. We distort probability judgments and ask more symbolic questions about social cohesion, family love, and isolation. The answer to our challenges would be better and more responsive institutions and structures to manage risk and mediate probability judgments. The individual human mind can only substitute easier questions for complex probability judgments, and it needs visual aids, better structures, and guidance to help think through risk and probability in an accurate and reasonable manner.
The End is Always Near


The human mind thinks in narratives. We take in information about the world around us, and we create a story that weaves all of that information together in a cohesive manner. The mind creates the reality that it experiences, and it uses narrative to give the story meaning. Unfortunately, sometimes the stories don’t fit the actual world we inhabit very well.

 

One area where the narrative we tell ourselves doesn’t fully match the reality of our lives is with regard to our risk of dying on any given day. As our brains build the narrative of our lives and of who we are, it projects forward into the future of who we will become and the world we will inhabit. My assumption, based on the way I know that I think, is that we project forward a long life with our ending far off in the distant future. I recognize this tendency in myself all the time, and I suspect that even if I do make it to old age, this same tendency will be with me then.  It is hard to imagine that my end is not always going to be far away.

 

The end is always near, however. Or at least, the potential and risk of the end is always near. Our brains believe that we have lots of life left, because that is how the narrative we have crafted in our minds plays out. But the real world doesn’t have to follow the narrative in our minds. The real world is separate from what we think it should be or will be, and it doesn’t much care about how we think about it or understand it (or fail to understand it either).

 

In Letters From a Stoic, Seneca wrote, “who is not near death? It is ready for us in all places and at all times.”

 

It is important to remember that the actual course of our lives could diverge from the narrative path we create at any moment on any day. The possibility of a natural disaster, a clumsy mistake, or the malice of another person resulting in our early departure from life is always greater than zero. This means that whatever narrative we create, however far off death is in the story we tell ourselves, the reality is that the end is always near.

 

The take-away is to make our time meaningful, to be content that we have done our best each day, so that if we die, the narrative we lived out will end with us as a confident, complete individual. This is not an excuse for a YOLO way of life, and it shouldn’t be a reason to bury ourselves in work – effectively enslaving ourselves to a job, a cause, or a relationship. Instead, what we should learn from our always near ending is that we should do our best to fully apply ourselves in a way that meaningfully engages in the world to produce more than our own selfish happiness. We should seek opportunities to live a life where we  can develop a strong and fulfilling narrative that helps to lift up others who are doing the same. The end is always near, so we should make sure we have made of our life a narrative we can be proud of.

We Don’t Buy Insurance for Ourselves

Why do we buy insurance of any kind? Is it really for ourselves and our own benefit, or is there something else going on with insurance decisions? According to Venture Capitalist Chris Brookfield, as quoted in Dave Chase’s book The Opioid Crisis Wake-Up Call, there is something beyond our own self interest at play when we decide to buy insurance.

 

Brookfield is quoted as writing, “Persuading individuals to buy insurance is kind of backwards. I saw this in India all the time. Individuals do not value their own risks – their relatives and neighbors do.” 

 

Buying insurance is actually more about our loved ones and our responsibility to our community than it is about ourselves. It is about protecting the financial standing of our relatives and those who would help us if we were down as much as it is about protecting our own financial standing. The standard story tells us that insurance shifts risk from ourselves to a group of individuals, but as Brookfield continues in the book, it really shifts risks from our immediate known allies, into a broader group of people that we don’t necessarily know.

 

If I don’t have health insurance or auto insurance and die in a terrible car crash, I am not the one who will bear the costs of the accident. My loved ones and other people in the community involved with the crash (other drivers or the owners of any private property that was damaged) are the ones who will face the costs. On their own it would be hard to manage the costs, but pooled together, the costs and the risk could be shared. In a situation where my death occurs, it is other people who derive the value of the insurance.

 

I’m sure there are some insurance products that are pretty solidly just about the individual buying the insurance, but it doesn’t seem to always be that way. Buying insurance seems to be an act of signaling, as Robin Hanson discusses in his book The Elephant in the Brain. Buying insurance isn’t all about sharing risk; it is also about showing others how much you care about them and showing the community how responsible you are.

Risk

Joel Achenbach explored what went wrong on the Deepwater Horizon oil platform in the Gulf of Mexico the night it exploded and left an open gusher at the bottom of the ocean. He found that there was never one major mistake or any serious oversight that catastrophically caused the collapse of the system and the blowout of the oil well. What happened on the well was an accumulation of risky decisions, a failure to observe small and nonthreatening warning signs, and a cluster of poorly designed, or poorly integrated, backup systems. Everything played together to make it hard to determine the exact conditions on the sea floor, and misled the people who had the power to stop operations. Each indicator of a potential problem on its own was insignificant, but taken together they led to a total catastrophe.

 

“When doing something risky, remember that risk builds like plaque,” Achenbach wrote in his book. “Make sure that your back up plan is really in back and won’t get blown up out front along with your plan A.”

 

What Achenbach is encouraging us to do is take the time to plan out our backups and understand how seriously our entire operations or systems could fail. If we look for the best possible backup plans, and put real safeguards in place when the information we receive is potentially damaging, then we have a head start on preventing a disaster. The better we understand our warning signs, the better we will be able to adjust and make decisions that minimize risk.