Scarcity & Short-Term Thinking

I find most critiques of people living in poverty to be unfair and shallow. People with barely enough financial resources to get through the day are criticized for not investing their time and money wisely and for spending in seemingly irrational ways. But for low-income individuals who can’t seem to get ahead no matter what jobs they take, these critiques miss the reality of life at the poorest socioeconomic level.
I wrote recently about the costs of work, which are not often factored into our easy critiques of the poor or unemployed. Much of America has inefficient, underinvested public transit. The time involved in catching a bus (or two) to get to work is huge compared with simply driving. Additionally, subways and other transit options can be dangerous (there is no shortage of YouTube videos of people having phones stolen on public transit). This means that owning and maintaining a car can be essential for being able to work at all, an expense that can make working prohibitive for those living in poverty.
The example of transportation to work is meant to demonstrate that not working can be the more rational choice for the poorest among us. Work involves extra stress and costs, and the individual might not break even, making unemployment the rational option. There are many instances where the socially desirable thing becomes the irrational choice for those living in poverty. If we do not recognize this reality, we will unfairly criticize the choices and decisions of the poor.
In his book Evicted, Matthew Desmond writes about scarcity and short-term thinking, showing that they are linked and demonstrating how this shapes the lives of those living in poverty: “research show[s] that under conditions of scarcity people prioritize the now and lose sight of the future, often at great cost.” People living in scarcity have trouble thinking ahead and planning for their future. When you don’t know where you will sleep, where your next meal will come from, or whether you will be able to afford the next basic necessities, it is hard to think ahead to everything you need to do for basic living in American society. Your decisions might not make sense to the outside world, but to you they make sense, because all you have is the present moment and no prospects for the future to plan for or think about. Sudden windfalls may be spent irrationally, time may not be spent resourcefully, and tradeoffs that benefit the current moment at the expense of the future may seem like obvious choices if you live in constant scarcity.
Combined, the misperceptions about the cost of work and the psychological short-termism resulting from scarcity show us that we have to approach poverty differently from how we approach lazy middle-class individuals. I think we design our programs for assisting those in poverty while picturing lazy middle-class people. We don’t think about individuals who are actually so poor that the costs of work that most of us barely notice become crippling. We don’t consider how scarcity shapes the way people think, leading them to make poor decisions that seem easy to critique from the outside. Deep poverty creates challenges and obstacles that are separate from the problem of freeloading, lazy middle-class children, or trust fund babies. We have to recognize this if we are to actually improve the lives of the poorest among us and create a better social and economic system to help integrate those individuals.
The Screening-Off Effect

Sometimes to our great benefit, and sometimes to our detriment, humans like to put things into categories – at least Western, Educated, Industrialized, Rich, Democratic (WEIRD) people do. We break things into component parts and assign each part to a category. We do this with things like planets, animals, and players within sports. We like established categories and dislike it when our categorizations change. This ability has greatly helped us in science and strategic planning, allowing our species to do incredible things and learn crucial lessons about the world. What is remarkable about this ability is how natural and easy it is for us, yet how hard it is to explain or program into a machine.
One component of this remarkable ability is referred to as the screening-off effect by Judea Pearl in The Book of Why. Pearl writes, “how do we decide which information to disregard, when every new piece of information changes the boundary between the relevant and the irrelevant? For humans, this understanding comes naturally. Even three-year-old toddlers understand the screening-off effect, though they don’t have a name for it. … But machines do not have this instinct, which is one reason that we equip them with causal diagrams.”
From a young age we know what information is the most important and what information we can ignore. We intuitively have a good sense for when we should seek out more information and when we have enough to make a decision (although sometimes we don’t follow this intuitive sense). We know there is always more information out there, but don’t have time to seek out every piece of information possible. Luckily, the screening-off effect helps us know when to stop and makes decision-making possible for us.
Beyond knowing when to stop, the screening-off effect helps us know when to ignore irrelevant information. The price of tea in China isn’t a relevant factor when deciding what time to wake up the next morning. We recognize that there are no meaningful causal pathways between the price of tea and the best time for us to wake up. This causal insight, however, doesn’t exist for machines that are only programmed with the specific statistics we build into them. We have to explicitly encode a causal structure that excludes the price of tea in China for a machine to know that it can ignore that information. The screening-off effect, Pearl explains, is part of what allows humans to think causally. In cutting-edge science there are many factors we wouldn’t think to screen out that may impact the results of experiments, but for the most part, we know what can be ignored and can look at the world around us through a causal lens because we know what is and is not important.
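Pearl’s point about causal structure can be made concrete with a small simulation. The sketch below is my own illustration, not an example from The Book of Why: it assumes a made-up causal chain, rain → wet sidewalk → slippery, with invented probabilities, and shows that once we condition on the middle variable, the first variable no longer changes our estimate of the last. That is the screening-off effect in miniature.

```python
# A toy causal chain: rain -> wet sidewalk -> slippery.
# Screening-off: once we know whether the sidewalk is wet,
# knowing whether it rained tells us nothing more about slipperiness.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

rain = rng.random(n) < 0.3                      # P(rain) = 0.3
wet = np.where(rain, rng.random(n) < 0.9,       # P(wet | rain) = 0.9
                     rng.random(n) < 0.1)       # P(wet | no rain) = 0.1
slippery = np.where(wet, rng.random(n) < 0.8,   # P(slippery | wet) = 0.8
                         rng.random(n) < 0.05)  # P(slippery | dry) = 0.05

def cond_prob(event, given):
    """Estimate P(event | given) from the simulated samples."""
    return event[given].mean()

# Conditioning on the mediator (wet) screens rain off from slippery:
print(cond_prob(slippery, wet & rain))   # ~0.80
print(cond_prob(slippery, wet & ~rain))  # ~0.80 -- rain adds no information
```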
When to Stop Thinking

My last post was about closed-mindedness and focused on how closed-minded people fail to make appropriate inquiries to gain the information necessary to make good decisions and accurately understand the world. What the post didn’t ask is when we should stop thinking and make a decision, versus when we should continue our investigations to gain more knowledge. A serious problem, and one we avoid when we are closed-minded, is often referred to as paralysis by analysis. It occurs when we lack confidence in our decision-making and continually seek more information before deciding, potentially delaying the choice, or any action at all, indefinitely.
In Vices of the Mind, Quassim Cassam writes, “our investigations can be open-ended and there is often, though not always, scope for further investigation.” Sometimes we are asking questions and doing research on continually evolving topics. Sometimes we are working at a cutting edge where changes in politics, markets, social trends, and scientific breakthroughs can influence what we do from day to day. There is never a final answer, and we have to continually seek new information in order to adapt. However, this doesn’t mean that we can’t make important decisions that require thoughtful deliberation.
“A good investigator,” Cassam writes, “has a sense of when enough is enough and diminishing returns are setting in. But the decision to call a halt at that point isn’t properly described as closed-minded. What allows us to get on with our lives isn’t closed-mindedness but the ability to judge when no further research into the question at hand is necessary.”
Closed-minded people make decisions while ignoring pertinent information. Open-minded people make decisions while ignoring extraneous information. With enough practice, each of us should improve our judgements and become better at recognizing the diminishing returns of continued research. We might continue to learn a bit more as we keep studying, but the value of each new piece of information will be smaller and smaller, and at some point it won’t truly impact our decisions. A novice might have trouble identifying this point, but an expert should be better at it. A closed-minded person doesn’t look for this optimal point, but an open-minded person does, continually updating their priors and judgements about when they have enough information to make a decision, rather than rigidly locking in with a specific set of information. This is how we avoid paralysis by analysis and how we improve our decision-making over time to get on with our lives, as Cassam writes.
A Leader’s Toolbox

In the book Risk Savvy, Gerd Gigerenzer describes the work of top executives within companies as inherently intuitive. Executives and managers within high-performing companies are constantly pressed for time. There are more decisions, more incoming items that need attention, and more things to work on than any executive or manager can adequately handle on their own. Consequently, delegation is necessary, as is quick decision-making based on intuition. “Senior managers routinely need to make decisions or delegate decisions in an instant after brief consultation and under high uncertainty,” writes Gigerenzer. This combination of quick decision-making under uncertainty is where intuition comes into play, and the ability to navigate these situations is what truly constitutes the leader’s toolbox.
Gigerenzer stresses that the intuitions developed by top managers and executives are not arbitrary. Successful managers and companies tend to develop similar toolboxes that encourage trust and innovation. While many individual decisions are intuitive, the structure of the leader’s toolbox often becomes visible and intentional. As an example, Gigerenzer highlights a line of thinking he uncovered when working on a previous book. He writes that “hire well and let them do their jobs reflects a vision of an institution where quality control (hire well) goes together with a climate of trust (let them do their jobs) needed for cutting-edge innovation.”
In many companies and industries, the work to be done is incredibly complex, and a single individual cannot manage every decision. The decision-making process needs to be decentralized for the individual units of the team to work effectively and efficiently. Hiring talented individuals and providing them with the autonomy and tools necessary to be successful is the best approach to getting the right work done well.
Gigerenzer continues, “Good leadership consists of a toolbox full of rules of thumb and the intuitive ability to quickly see which rule is appropriate in which context.”
A leader’s toolbox doesn’t consist of specific lists of what to do in certain situations, or even specific skills that are easy to check off on a resume. It is built through experience in a diverse range of settings and through intuitions about things as varied as hiring, teamwork, and delegation. Because innovation is always uncertain and always involves risk, leaders must develop intuitive skills and be able to make quick, accurate judgements about how best to handle new challenges and obstacles. Intuition and gut decisions are an essential part of leadership today, even if we don’t like to admit that we make important decisions based on intuition.
Defensive Decision-Making

One of the downfalls of negative error cultures is that people become defensive over any mistake they make. Errors and mistakes are shamed, and people who commit them do their best to hide them or deflect responsibility. Within negative error cultures you are more likely to see people taking steps to distance themselves from responsibility before a decision is even made, practicing what is called defensive decision-making.
Gerd Gigerenzer expands on this idea in his book Risk Savvy, writing, “defensive decision making [is] practiced by individuals who waste time and money to protect themselves at the cost of others, including their companies. Fear of personal responsibility creates a market for worthless products delivered by high-paid experts.”
Specifically, Gigerenzer writes about companies that hire expensive outside experts and consultants to make market predictions and help improve company decision-making. The idea is that individual banks, corporations, and sales managers can’t accurately know the state of a market as well as an outside expert whose job it is to study trends, talk to market actors, and understand how the market relates to internal and external pressures. The problem, as Gigerenzer explains, is that even experts are not very good at predicting the future of a market. There is simply too much uncertainty for anyone to be able to say that market trends will continue, that a shock is coming, or that a certain product or service is about to take off. Experts make these types of predictions all the time, but evidence suggests that their predictions are not much better than just throwing dice.
So why do companies pay huge fees, sit through lengthy meetings, and spend time trying to understand and adapt to the predictions of experts? Gigerenzer suggests that it is because individuals within the company are practicing defensive decision-making. If you are a sales manager and you make a decision to sell to a particular market with a new approach after analyzing performance and trends of your own team, then you are responsible for the outcome of the new approach and strategy. If it works, you will look great, but if it fails, then you will be blamed for not understanding the market, for failing to see the signs that indicated your plan wasn’t going to succeed, and for misinterpreting past trends. However, if a consultant suggested a course of action, presented your team with a great visual presentation, and was certain that they understood the market, then you escape blame when the plan doesn’t work out. If even the expert couldn’t see what was going to happen, then how could you be blamed for a plan not working out?
Defensive decision-making is good for the individual but bad for the larger organization the individual is a part of. Companies would be better off if they made decisions more quickly, accepted risk, and could openly evaluate success and failure without placing too much blame on individuals. They could learn more from their errors and do a better job identifying and promoting talent. Defensive decision-making is expensive and time-consuming, and it outsources blame, preventing companies and organizations from actually learning and improving their decision-making over the long run.
Positive Error Cultures

My last post was about negative error cultures and the harm they can create. Today is about the flip side: positive error cultures, and how they can encourage innovation, channel creativity, and help people learn to improve their decision-making. “On the other end of the spectrum,” writes Gerd Gigerenzer in Risk Savvy, “are positive error cultures, that make errors transparent, encourage good errors, and learn from bad errors to create a safe environment.”
No one likes to make errors. Whether it is a small error on our personal finances or a major error on the job, we would all rather hide our mistakes from others. In school we probably all had the experience of quickly stuffing away a test that received a bad grade so that no one could see how many questions we got wrong. Errors in life have the same feeling, but like mistakes on homework, reports, or tests, hiding our errors doesn’t help us learn for the future. In school, reviewing our mistakes and being willing to work through them helps us better understand the material and shows us where we need to study for the final exam. In life, learning from our mistakes helps us become better people, make smarter decisions, and be more prepared for future opportunities.
This is why positive error cultures are so important. If we are trying to do something new, innovative, and important, then we are probably going to be in a position where we will make mistakes. If we are new homeowners and don’t know exactly how to tackle a project, we will certainly err, but by learning from our mistakes we can improve and better handle similar home improvement projects in the future. Hiding our errors will likely lead to greater costs in the future and will leave us dependent on others to do costly work around the house. Business is the same way. If we want to grow enough to earn a promotion or want to do something innovative to solve a new problem, we are going to make mistakes. Acknowledging where we were wrong and why we made an error helps us prepare for future challenges and opportunities. It helps us learn and grow rather than remaining stuck in one place, not solving any problems and not preparing for future opportunities.
80,000 Hours has a great “Our Mistakes” section on their website, demonstrating a positive error culture.
A Mixture of Risks

In the book Risk Savvy, Gerd Gigerenzer explains the challenges we have with thinking statistically and how these difficulties can lead to poor decision-making. Humans have trouble holding lots of complex and conflicting information in mind. We don’t do well with decisions involving risk, or with decisions where we cannot possibly know all the information relevant to the best choice. We prefer decisions involving fewer variables, where we can have more certainty about the risks and the potential outcomes. This leads to the substitution effect that Daniel Kahneman describes in his book Thinking, Fast and Slow, where our minds substitute an easier question for the difficult one without us noticing.
Unfortunately, this can have bad outcomes for our decision-making. Gigerenzer writes, “few situations in life allow us to calculate risk precisely. In most cases, the risks involved are a mixture of more or less well known ones.” Most of our decisions that involve risk have a mixture of different risks. They are complex decisions with tiers and potential cascades of risk based on the decisions we make along the way. Few of our decisions involve just one risk independent of others that we can know with certainty.
If we consider investing for retirement, we can see how complex decisions involving risk can be and how a mixture of risks runs through all the decisions we have to make. We can hoard money in a safe in our house, where we reduce the risk of losing any of it, but we risk being unable to save enough by the time we are ready to retire. We can invest our money, but then we have to decide whether to keep it in a bank account, invest it in the stock market, or look to other investment vehicles. Our bank is unlikely to lose much money and is low risk, but it is also unlikely to help us grow our savings enough for retirement. Investing with a financial advisor takes on more risk: the risk that we are being scammed, the risk that the market tanks and our advisor made bad investments on our behalf, and the risk that we won’t have access to our money if we need it quickly in an emergency. What this shows is that even the most certain option for our money, protecting it in a secret safe at home, still carries risks for the future. The option likely to provide the greatest return on our savings, investing in the stock market, has a mixture of risks attached to each decision we make after the initial decision to invest. There is no way we can calculate and fully comprehend every risk involved in such a decision.
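To make the layering of risks a little more concrete, here is a rough Monte Carlo sketch. It is my own illustration, not an example from Risk Savvy, and every number in it (the inflation rate, the return distribution, the savings amount) is an invented assumption rather than advice. It simply shows how the “safe” option quietly carries inflation risk while the investing option trades that for a spread of uncertain outcomes.

```python
# Rough Monte Carlo sketch of two saving strategies over 30 years.
# Every number here is an illustrative assumption, not financial advice.
import numpy as np

rng = np.random.default_rng(42)
years, sims, annual_savings = 30, 10_000, 5_000
inflation = 0.03  # assumed constant inflation rate

# Strategy 1: cash in a home safe -- no market risk, but inflation erodes it.
cash_real = sum(annual_savings / (1 + inflation) ** t for t in range(1, years + 1))

# Strategy 2: invested savings -- higher expected growth, uncertain each year.
returns = rng.normal(loc=0.07, scale=0.15, size=(sims, years))  # assumed return distribution
balances = np.zeros(sims)
for t in range(years):
    balances = (balances + annual_savings) * (1 + returns[:, t])
invested_real = balances / (1 + inflation) ** years

print(f"cash in a safe (real value):           {cash_real:,.0f}")
print(f"invested, median outcome (real value): {np.median(invested_real):,.0f}")
print(f"invested, 5th percentile outcome:      {np.percentile(invested_real, 5):,.0f}")
```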
Risk is complex, and we rarely deal with a single decision involving a single calculable risk at one time. Our brains are likely to flatten the decision by substituting simpler ones, eliminating some of the risks from consideration and helping the mind focus on fewer variables at a time. Nevertheless, the complex mixture of risks doesn’t go away just because our brains pretend it isn’t there.
Unconscious Rules of Thumb

Some of the decisions I make are based on thorough calculations, analysis, evaluation of available options, and deliberate consideration of costs and benefits. When I am planning my workout routine, I think hard about how my legs have been feeling and what distance, elevation, and pace are reasonable for my upcoming workouts. I think about how early I need to be out the door for a certain distance, and whether I can run someplace new to mix things up. I’ll map out routes, look at my training log for the last few weeks, and try to put together a plan that maximizes my enjoyment, physical health, and fitness given my time constraints.
However, outside of running, most of my decisions are based on rules of thumb and don’t receive the same level of attention as my running plans. I budget every two weeks around payday, but even when budgeting, I mostly rely on rules of thumb. There is a certain amount I like to keep in my checking account just in case I forgot a bill or something pops up at the last minute. It’s not a deliberate calculation; it is more of a gut feeling. The same goes for how much money I set aside for free spending, or for whether I feel that it is finally time to get that thing I have had my eye on for a while. My budget is probably more important than my running routine, but I actually spend more time rationally developing a running plan than I spend budgeting. The same goes for house and vehicle maintenance, spending time with friends and family, and choosing what to eat on the days we plan to do take-out.
The budget example is interesting because I am consciously and deliberately using rules of thumb to determine how my wife and I will use our money. I set aside a certain amount for gas without going to each vehicle and checking whether we are going to need to fill up soon. I am aware of the rules of thumb, and they are literally built into my spreadsheet where I sometimes ask if I should deviate, but usually decide to stick to them.
I also recognize that I have many unconscious rules of thumb. In his book Risk Savvy, Gerd Gigerenzer writes the following about unconscious rules of thumb:
“Every rule of thumb I am aware of can be used consciously and unconsciously. If it is used unconsciously, the resulting judgment is called intuitive. An intuition, or gut feeling, is a judgment:
  1. that appears quickly in consciousness,
  2. whose underlying reasons we are not fully aware of, yet
  3. is strong enough to act upon.”
I have lots of intuitive judgements that I don’t think about in the moment and only recognize when I reflect back on how I do something. When I am driving down the freeway, cooking, or writing a blog post, many of my decisions flow naturally and quickly. In the moment the decisions seem obvious, and I don’t have to think too deliberately about my actions or why I am making a specific choice. But if I were asked to explain why I made a decision, I would have a hard time finding exact reasons. I don’t know exactly how I know to change lanes at a certain point on the freeway, but I can often anticipate points where traffic will slow down and where I might be better off in another lane. I can’t tell you why I chose to add the Marsala wine to the mushrooms at the precise moment that I did. I also couldn’t explain why I chose to present a certain quote right at the beginning of a post rather than in the middle. My answer in all of these situations would simply be that it felt right.
We use unconscious rules of thumb like these all the time, but we don’t often notice when we do. When we are budgeting we might recognize our rules of thumb and be able to explain them, but our unconscious rules of thumb are harder to identify and explain. Nevertheless, they still have major impacts on our lives. Simply because we don’t notice them and can’t explain them doesn’t mean they don’t shape a lot of our decisions or don’t matter. The intuitions we have can be powerful and helpful, but they could also be wrong (maybe all this time I’ve been overcooking the mushrooms and should add the wine sooner!). Because these intuitions are unconscious, we don’t deliberately question them unless something calls them up to the conscious level. The feedback we get is probably indirect, meaning that we won’t consciously tie our outcomes to the unconscious rules of thumb that got us to them.
I am fascinated by things like unconscious rules of thumb because they reveal how little we actually control in our lives. We are the ones who act on these unconscious rules of thumb, but in a sense, we are not really doing anything at all. We are making decisions based on factors we don’t understand and might not be aware of. We have agency by being the one with the intuition, but we also lack agency by not being fully conscious of the how and why behind our own decisions. This should make us question ourselves and our choices more than we typically do.
Probability is Multifaceted

For five years my wife and I lived in a house at the base of the lee side of a small mountain range in Northern Nevada. When a storm came through the area it had to make it over a couple of small mountain ranges and valleys before reaching our house, and as a result we experienced less precipitation than most people in the Reno/Sparks area. Now we live in a house higher up on a different mountain that is more directly in the path of storms coming from the west. We receive snow at our house while my parents and family lower in the valley barely get any wind. At both houses we have learned to adjust our expectations for precipitation relative to the probabilities reported by weather stations, which reference the airport on the valley floor. Our experiences with rain and snow at the two places are a useful demonstration that probability (in this case the probability of precipitation) is multifaceted – that multiple factors play a role in the probability of a given event at a given place and time.
In his book Risk Savvy, Gerd Gigerenzer writes, “Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief.” Gigerenzer explains that frequency is about counting. To me, this is the most clearly understandable aspect of probability, and what we usually refer to when we discuss probability. On how many days does it usually rain in Reno each year? How frequently does a high school team from Northern Nevada win a state championship and how frequently does a team from Southern Nevada win a state championship? These types of questions simply require counting to give us a general probability of an event happening.
But probability is not just about counting and tallying events. Physical design plays a role as well. Our house on the lee side of a small mountain range was shielded from precipitation, so while it may have rained in the valley half a mile away, we didn’t get any. Conversely, our current home is positioned to get more precipitation than the rest of the region. In high school sports, fewer kids live in the Reno/Sparks area than in the Las Vegas region, so in terms of physical design, state championships are likely to be more common for high schools in Southern Nevada. Additionally, there may be differences in the density of students at each school, meaning the north could have more schools per student than the south, also influencing the probability of a northern or southern school winning. Probability, Gigerenzer explains, can be shaped by the physical design of systems, potentially making the statistics and chances more complicated to understand.
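The difference between the first two faces is easy to see in a few lines of code. The sketch below is my own illustration, not an example from the book: the physical design of a fair die fixes the probability of a six at 1/6 before any data exist, while a frequency estimate is nothing more than a count of observed outcomes, and the two only converge as the count grows. The third face, degrees of belief, has no line of code here because it lives in subjective experience, which is where the next paragraph picks up.

```python
# Two of Gigerenzer's three faces of probability, illustrated with a fair die:
# physical design fixes the probability before any data are collected,
# while a frequency estimate comes from counting observed outcomes.
import random

random.seed(1)

design_probability = 1 / 6                            # from the die's symmetry alone
rolls = [random.randint(1, 6) for _ in range(10_000)]
frequency_estimate = rolls.count(6) / len(rolls)      # counting observed sixes

print(f"design:    {design_probability:.4f}")
print(f"frequency: {frequency_estimate:.4f}")         # close to 1/6, but not exact
```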
Finally, degrees of belief play a role in how we comprehend probability. Gigerenzer states that degrees of belief rest on experience and personal impression, which are very subjective. Trusting two eyewitnesses rather than two people who heard about an event secondhand, Gigerenzer explains, can increase our sense that an unlikely story is probably accurate. Degrees of belief can also be seen in my experiences with rain at our two houses. I learned to discount the probability of rain at our first house and to increase my expectation of rain at our new house. If the meteorologist said there was a low chance of rain when we lived on the sheltered side of a hill, I didn’t worry much about storm forecasts. At our new house, however, if there is a chance of precipitation and a storm coming from the west, I will certainly remove anything from the yard that I don’t want to get wet, because I believe the chance that our specific neighborhood will see rain is higher than what the meteorologist predicted.
Probability, how we understand it, and how we consequently make decisions are complex, and Gigerenzer’s explanation of the multiple facets of probability helps us better grasp that complexity. Simply tallying outcomes and projecting them into the future often isn’t enough to give us a good sense of the probability of a given outcome. We have to think about physical design, and we have to think about the personal experiences and subjective opinions that form the probabilities people develop and express. Understanding probability requires that we hold a lot of information in our heads at one time, something humans are not great at doing, but that we can do better when we have better strategies for understanding complexity.
Navigating Uncertainty with Nudges

In Risk Savvy, Gerd Gigerenzer makes a distinction between known risks and uncertainty. In a footnote for a figure, he writes, “In everyday language, we make a distinction between certainty and risk, but the terms risk and uncertainty are used mostly as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.” Gigerenzer’s distinction between risk and uncertainty is important. He shows that statistical thinking can handle decisions involving known risks, but that people need to rely on intuition and good judgement when dealing with uncertainty. One way to improve judgement and intuition is to use nudges.
In the book Nudge, Cass Sunstein and Richard Thaler encourage choice architects to design systems and structures that help individuals make the best decision in a given situation, as defined by the chooser. Much of their argument is supported by research presented by Daniel Kahneman in Thinking, Fast and Slow, where Kahneman demonstrates how predictable biases and cognitive errors can lead people to make decisions they likely wouldn’t make if they had clearer information, could free themselves from irrelevant biases, and could improve their statistical thinking. Gigerenzer’s quote supports Sunstein and Thaler’s nudges by building on Kahneman’s research. Distinguishing between risk and uncertainty helps us understand when to use nudges, and how aggressive our nudges may need to be.
Gigerenzer uses casino slot machines as an example of known risk, and stocks, romance, earthquakes, business, and health as examples of uncertainty. When we are gambling, we can know the statistical chances that our bets will pay off and calculate optimal strategies (there is a reason the casino dealer stands on 17). We won’t know what the outcome will be ahead of time, but we can precisely define the risk. The same cannot be said for picking the right stocks or the right romantic partner, or for creating business, earthquake preparedness, or health plans. We may know the five-year rate of return for a company’s stock, the divorce rate in our state, the average frequency and strength of earthquakes in our region, and how old our grandfather lived to be, but we cannot use this information alone to calculate risk. We don’t know exactly what business trends will arise in the future, we don’t know for sure whether we have a genetic disease that will strike us (or our romantic partner) down sooner than expected, and we can’t say for sure that a 7.0 earthquake is or is not possible next month.
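The contrast shows up clearly in a quick calculation. The sketch below is my own illustration, using a single-number roulette bet rather than Gigerenzer’s slot machine example: for a game of known risk the expected value of a bet follows directly from the design of the game, while no analogous formula exists for the uncertain cases listed above.

```python
# Known risk: an American roulette wheel has 38 pockets, and a single-number
# bet pays 35 to 1. The probabilities come from the wheel's design, so the
# expected value of a $1 bet can be calculated exactly.
p_win = 1 / 38
p_lose = 37 / 38
expected_value = p_win * 35 + p_lose * (-1)
print(f"expected value per $1 bet: {expected_value:+.4f}")  # about -0.0526

# Uncertainty: there is no analogous formula for "will this startup succeed?"
# or "will a 7.0 earthquake hit next month?" -- the probabilities themselves
# are unknown, which is where rules of thumb and intuition take over.
```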
But nudges can help us with these decisions. We can use statistical information about business development and international stock returns to identify general rules of thumb for investing. We can listen to parents and elders and learn from their advice and mistakes when selecting a romantic partner, intuiting the traits that make a good (or bad) spouse. We can overengineer our bridges and skyscrapers by 10% to give us a little more assurance that they can survive a major and unexpected earthquake. Nudges are helpful because they can augment our gut instincts and make visible the rules of thumb we might rely on.
Expecting everyone’s individual intuition and heuristics to be up to the task of navigating uncertainty is likely to lead to many poor choices. But if we pool the statistical information available, provide guides, communicate rules of thumb that have panned out for many people, and structure choices in ways that present this information well, then people can likely make marginally better decisions. My suggestion in this post is a nudge to use more nudges in moments of uncertainty. When certainty exists, or even when calculable risks exist, nudges may not be needed. However, once we get beyond calculable risk, where we must rely on judgement and intuition, nudges are important tools to help people navigate uncertainty and improve their decision-making.