Negative Error Cultures

No matter how smart, observant, and rational we are, we will never have perfect information for all of the choices we make in our lives. There will always be decisions we have to make with limited information, and whenever that happens there is a risk that we won’t choose well. In reality, almost every decision carries some risk, because there are very few choices where we fully understand all the potential consequences of our decisions and actions. This means the chance for errors is huge, and we will make many mistakes throughout our lives. How our cultures respond to those errors shapes how we move forward from them.


In Risk Savvy, Gerd Gigerenzer writes the following about negative error cultures:


“On the one end of the spectrum are negative error cultures. People living in such a culture fear to make errors of any kind, good or bad, and, if an error does occur, they do everything to hide it. Such a culture has little chance to learn from errors and discover new opportunities.”


None of us want to live in a world with errors, but the reality is that we spend a lot of our time engulfed by them. We don’t want to make mistakes on the job and potentially lose a raise, a promotion, or our employment altogether. Many people belong to religious communities or families characterized by negative error cultures, where any social misstep feels like the end of the world and risks expulsion from the community or family. Additionally, our world of politics is typically a negative error culture, where one political slip-up is often enough to irrevocably damage an individual’s political career.


Gigerenzer encourages us to move away from negative error cultures because they stifle learning, reduce creativity, and fail to acknowledge that our world is inherently a world of risk. We cannot avoid all risk because we cannot be all-knowing, and that means we will make mistakes. We can try to minimize our mistakes and their consequences, but only by acknowledging them, owning up to them, learning, adapting, and improving our future decision-making.


Negative error cultures don’t expose mistakes and don’t learn from them. They prevent individuals and organizations from finding the root cause of an error and don’t allow for change and adaptation. Worse, efforts to hide errors can lead to more errors. Describing hospitals with negative error cultures, Gigerenzer writes, “zero tolerance for talking about errors produces more errors and less patient safety.” Being afraid to ever make a mistake makes us less willing to innovate, to learn, and to improve the world around us. It isolates us and keeps us from reducing risk for ourselves and others in the future. In the end, negative error cultures drive more of the very thing they fear, reinforcing a vicious cycle of errors, secrecy, and more errors.

Informed Bets

My last post was about the limitations of the human mind and why we should be willing to doubt our conclusions and beliefs. This post contrasts with that one, arguing that we can trust the informed bets our brains make. Our brains and bodies cannot capture all of the information necessary to perfectly replicate reality in our minds, but they do a good job of putting information together in a way that helps us successfully navigate the world and our lives. Informed guesses, that is, assumptions and intuitions based on experience and expertise rather than random and amateurish judgements, are actually very useful and often good approximations.


“Intelligence…” Gerd Gigerenzer writes in his book Risk Savvy, “is the art of making informed guesses.” Our brains make a lot of predictions and rely on heuristics, assumptions, and guesses to get by. It turns out that our brains do this well, as Gigerenzer argues in his book. We don’t need to pull out graph paper and a scientific calculator to catch a football. We don’t need to record every thought and action from the last month to know whether we are happy with our New Year’s resolutions and can keep them going. When we see someone standing in a long customer service line at the grocery store, we don’t need to approach them with a 100-point questionnaire to know whether they are bored or upset. Informed bets and reasonable guesses are sufficient for a decent, functional understanding of the world.
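Catching a football is a good example of how mechanical an informed bet can be. Gigerenzer is known for cataloging simple feedback rules like the gaze heuristic for catching a ball: fix your gaze on the ball, start running, and adjust your running speed so the angle of gaze stays constant. The snippet below is my own toy rendering of that kind of rule, not code from the book, and its step size and sign convention are arbitrary.

```python
def adjust_speed(gaze_angle, prev_gaze_angle, speed, step=0.1):
    """One iteration of a gaze-style heuristic: keep the gaze angle constant.

    No trajectory physics, wind models, or spin estimates -- just feedback
    on a single observable quantity. Step size and sign are illustrative.
    """
    if gaze_angle > prev_gaze_angle:
        return speed + step   # angle drifting one way: adjust speed to counter it
    if gaze_angle < prev_gaze_angle:
        return speed - step   # angle drifting the other way: counter that too
    return speed              # angle steady: hold speed; you'll meet the ball

# The angle rose slightly between two glances, so the fielder adjusts.
print(adjust_speed(gaze_angle=0.62, prev_gaze_angle=0.60, speed=3.0))  # 3.1
```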


Gigerenzer continues, “Intelligence means going beyond the information given and making informed bets on what’s outside.” This quote follows an optical illusion in which a grayscale checkerboard is shown with a figure casting a shadow across the board. Two squares on the board are the same shade of gray, yet our minds see the squares as different colors. Our minds are going beyond the information given, the literal wavelength of light reaching the back of our eyes, and making informed bets on the relative colors of the squares if there were no figure casting a shadow. In the case of the visual illusion, our brain’s guess about reality is actually more helpful to us than the literal fact that the squares are the same color.


Bounded rationality is a serious concern. We cannot absorb all the information in the world that might help us make better decisions. However, humans are intelligent. We can use the information we do receive to make informed bets about the best choices available to us. We might not be perfect, but by making informed bets and educated guesses we can come to understand the world and create systems and structures that help us improve our understanding over time.

Our Brains Are Not Mirrors

Humans are social creatures who need to interact for survival and meaning. Being part of a society requires that we take actions, behave in conjunction with others, make decisions, and have opinions about the world we inhabit. This puts us in difficult positions. We have to justify our decisions, actions, behaviors, and beliefs, but our brains are not mirrors of reality. They don’t reflect a perfect and objective view of the world to our inner mind, and we don’t always realize how imperfect the information we act on can be.


In Risk Savvy, Gerd Gigerenzer writes, “When we look around us, we think we perceive the world outside. But we don’t. Our brains are not mirrors. They have insufficient information to mirror the world.” We only perceive a limited slice of reality. We only absorb and process a limited amount of information. Our understanding of the true nature of reality can never be complete enough to say we truly understand the world.


In my opinion, this should lead us to be more cognitively humble. We should recognize that we can’t know everything and be more willing to question what we know, how we know what we know, and what we don’t and cannot ever know. In reality, few of us do this on a consistent basis. It is uncomfortable to live with uncertainty and a constant doubt that you know enough to take decisive action, to behave a certain way, or to hold certain beliefs. It is far more comfortable to believe that you know how and why the world is the way it is.


Ultimately, because we are social creatures and because we need to interact for our survival, we have to make decisions. We cannot question our decisions forever, or we would never get out of bed and would starve. We have to take action, but this doesn’t mean we have to behave as though we know everything. We can still make it through life recognizing that we rely on rules of thumb, heuristics, and judgements based on imperfect information. We can question what we know and be willing to update our beliefs when we have reason to change our minds. It is important that we think critically about what we believe and avoid assuming that things are good or right simply because we happen to benefit from them now. Those instances are precisely the times when we should be the most willing to question our beliefs. Our brains are not mirrors; they don’t tell us exactly what is, and we should remember that and be willing to change our thoughts, beliefs, and actions based on continual learning.

Risk and Innovation

To be innovative is to make decisions, develop processes, and create things in new ways that improve on the status quo. Being innovative is necessarily different, and requires stepping away from the proven path to do something new or unusual. Risk and innovation are tied together because you cannot venture into something new or stray from the tried and true without the possibility of making a mistake and being wrong. Appropriately managing and understanding risk is therefore imperative for innovation.


In Risk Savvy Gerd Gigerenzer writes, “Risk aversion is closely tied to the anxiety of making errors. If you work in the middle management of a company, your life probably revolves around the fear of doing something wrong and being blamed for it. Such a climate is not a good one for innovation, because originality requires taking risks and making errors along the way. No risks, no errors, no innovation.” Risk aversion is a fundamental aspect of human psychology. In Thinking Fast and Slow, Daniel Kahneman shows that we generally won’t accept a gamble unless the potential payoff is about twice the potential loss. We go out of our way to avoid risk because the possibility of losing something is often more paralyzing than a potential gain is exciting. Individuals and companies that want to be innovative have to find ways around risk aversion in order to create something new.
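A toy calculation makes that asymmetry concrete. Kahneman models it with a loss-aversion coefficient, with common estimates around 2; the sketch below is my own illustration with an assumed coefficient, not a model from either book.

```python
LOSS_AVERSION = 2.0  # assumed coefficient; empirical estimates cluster near 2

def felt_value(outcome):
    """Subjective value of a monetary outcome: losses loom larger than gains."""
    return outcome if outcome >= 0 else LOSS_AVERSION * outcome

def accepts_coin_flip(gain, loss):
    """Would a loss-averse agent take a 50/50 bet to win `gain` or lose `loss`?"""
    expected_felt_value = 0.5 * felt_value(gain) + 0.5 * felt_value(-loss)
    return expected_felt_value > 0

print(accepts_coin_flip(150, 100))  # False: a $150 gain doesn't offset a $100 loss
print(accepts_coin_flip(250, 100))  # True: roughly double the loss finally feels acceptable
```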


Gigerenzer’s example of middle management is excellent for thinking about innovation and why it is often smaller companies and start-ups that make innovative breakthroughs. It also helps explain why, in the United States, so many successful and innovative companies are started by immigrants or by the super-wealthy. Large established companies are likely to have employees who have been with the company for a long time and have become more risk averse. They have families, mortgages, and might be unsure they could find an equally attractive job elsewhere. Their incentives for innovation are diminished by their fear of loss if something were to go wrong and the blame were to fall on them. Better to stick with established methods and to maximize according to well-defined job evaluation statistics than to risk trying something new and uncharted.

Start-ups, immigrants, and the super-wealthy don’t have the same constraining fears. New companies attract individuals who are less risk averse to begin with, and they don’t have established methods that everyone is comfortable sticking to. Immigrants are less likely to have financial resources whose potential loss limits their willingness to take risks, and the super-wealthy may have so many resources that the risks they face are small relative to their overall wealth. The middle class, like middle management, is stuck in a position where they feel they have too much to lose by trying to be innovative, and as a result they stick to known and measured paths that ultimately reduce both risk and innovation.

A Mixture of Risks

In the book Risk Savvy, Gerd Gigerenzer explains the challenges we have with thinking statistically and how these difficulties can lead to poor decision-making. Humans have trouble holding lots of complex and conflicting information in our heads. We don’t do well with decisions involving risk, or with decisions where we cannot possibly know all the information relevant to the best choice. We prefer decisions involving fewer variables, where we can have more certainty about the risks and the potential outcomes. This leads to the substitution effect that Daniel Kahneman describes in his book Thinking Fast and Slow, where our minds swap in an easier question for the difficult one without our noticing.


Unfortunately, this can have bad outcomes for our decision-making. Gigerenzer writes, “few situations in life allow us to calculate risk precisely. In most cases, the risks involved are a mixture of more or less well known ones.” Most of our decisions that involve risk have a mixture of different risks. They are complex decisions with tiers and potential cascades of risk based on the decisions we make along the way. Few of our decisions involve just one risk independent of others that we can know with certainty.


Considering how we invest for retirement shows how complex decisions involving risk can be, and how a mixture of risks runs through every decision we have to make. We can hoard money in a safe in our house, where we reduce the risk of losing any of it, but we risk being unable to save enough by the time we are ready to retire. We can invest our money, but then have to decide whether to keep it in a bank account, put it in the stock market, or look to other investment vehicles. Our bank is unlikely to lose much money and is low risk, but it is also unlikely to grow our savings enough for retirement. Investing with a financial advisor takes on more risk: the risk that we are being scammed, the risk that the market tanks and our advisor made bad investments on our behalf, and the risk that we won’t have access to our money if we need it quickly in an emergency. Even the most certain option for our money, protecting it in a secret safe at home, still carries risks for the future. The option likely to provide the greatest return on our savings, investing in the stock market, has a mixture of risks attached to each decision we make after the initial decision to invest. There is no way we can calculate and fully comprehend every risk involved in such a decision.
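A crude simulation can make the mixture visible. All the numbers below are illustrative assumptions (2% inflation, a bank paying 1%, stocks averaging 7% with 15% volatility), not figures from the book and certainly not financial advice; the point is only that even the “safe” option carries a quiet risk of its own.

```python
import random

YEARS = 30
DEPOSIT = 10_000                    # saved per year; all figures are assumptions
INFLATION = 0.02                    # assumed inflation eroding idle cash
BANK_RATE = 0.01                    # assumed bank interest rate
STOCK_MEAN, STOCK_SD = 0.07, 0.15   # assumed stock return and volatility

def final_balance(annual_return):
    """Accumulate yearly deposits, applying a real (inflation-adjusted) return."""
    balance = 0.0
    for _ in range(YEARS):
        balance = (balance + DEPOSIT) * (1 + annual_return())
    return balance

random.seed(42)
safe = final_balance(lambda: -INFLATION)             # cash loses purchasing power
bank = final_balance(lambda: BANK_RATE - INFLATION)  # low risk, low real growth
stocks = sorted(final_balance(lambda: random.gauss(STOCK_MEAN, STOCK_SD) - INFLATION)
                for _ in range(10_000))

print(f"safe at home: {safe:,.0f}")
print(f"bank account: {bank:,.0f}")
print(f"stock market: {stocks[500]:,.0f} (5th pct) to {stocks[9500]:,.0f} (95th pct)")
```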


Risk is complex, and we rarely deal with a single decision involving a single calculable risk at one time. Our brains are likely to flatten the decision by substituting simpler ones, eliminating some of the risks from consideration and helping our minds focus on fewer variables at a time. Nevertheless, the complex mixture of risks doesn’t go away just because our brains pretend it isn’t there.

Intelligence

“Intelligence is not an abstract number such as an IQ, but similar to a carpenter’s tacit knowledge about using appropriate tools,” writes Gerd Gigerenzer in his book Risk Savvy. “This is why the modern science of intelligence studies the adaptive toolbox that individuals, organizations, and cultures have at their disposal; that is, the evolved and learned rules that guide our deliberate and intuitive decisions.”


I like Gigerenzer’s way of explaining intelligence. It is not simply a number or a ratio; it is our knowledge and our ability to understand the world. There are complex relationships between living creatures, physical matter, and information. Intelligence is an understanding of those relationships and an ability to navigate the complexity, uncertainty, and connections between everything in the world. Explicit rules, like mathematical formulas, help us understand some relationships, while statistical percentages help us understand others. Recognizing commonalities between different categories of things and identifying patterns also helps us understand these relationships, and serves as the basis for our intelligence.


What is important to note is that our intelligence is built with concrete tools for some situations, like 2+2=4, and less concrete rules of thumb for other situations, like the golden rule – do to others what you would like others to do to you. Gigerenzer shows that our intelligence requires that we know more than one mathematical formula, and that we have more than one rule of thumb to help us approach and address complex relationships in the world. “Granted, one rule of thumb cannot possibly solve all problems; for that reason, our minds have learned a toolbox of rules. … these rules of thumb need to be used in an adaptive way.”


Whether it is interpreting statistical chance, judging the emotions of others, or making plans now that delay gratification until a later time, our rules of thumb don’t have to be precise, but they do need to be flexible and adaptive given our current circumstances. 2+2 will always equal 4, but a smile from a family member might be a display of happiness or a nervous impulse and a silent plea for help in an awkward situation. It is our adaptive toolbox and our intelligence that allow us to figure out what a smile means. Similarly, adaptive rules of thumb and intelligence help us reduce complex interactions and questions to more manageable choices, reducing uncertainty about how much we need to save for retirement to a rule of thumb that tells us to save a small but significant amount of each paycheck. Intelligence is not just about facts and complex math. It is about adaptable rules of thumb that help us make sense of complexity and uncertainty, and the more adaptive these rules of thumb are, the more our intelligence can help us in the complex world of today and into the uncertain future.
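The retirement rule of thumb is a good example of how an imprecise but adaptive rule still does real work. The sketch below is my own illustration with assumed numbers (a 10% savings rate, 5% growth), not a recommendation from the book; the rule skips the impossible calculation and still lands somewhere reasonable.

```python
def savings_after(years, annual_income, save_rate=0.10, growth=0.05):
    """Apply the 'save a fixed share of each paycheck' rule of thumb.

    save_rate and growth are assumed illustrative values, not recommendations.
    """
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + growth) + annual_income * save_rate
    return balance

# A $60,000 earner following the rule for 30 years, under these assumptions:
print(f"{savings_after(30, 60_000):,.0f}")  # about 399,000
```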

Unconscious Rules of Thumb

Some of the decisions that I make are based on thorough calculations, analysis, evaluation of available options, and deliberate considerations of costs and benefits. When I am planning my workout routine, I think hard about how my legs have been feeling and what distance, elevation, and pace is reasonable for my upcoming workouts. I think about how early I need to be out the door for a certain distance, and whether I can run someplace new to mix things up. I’ll map out routes, look at my training log for the last few weeks, and try putting together a plan that maximizes my enjoyment, physical health, and fitness given time constraints.


However, outside of running, most of my decisions are generally based on rules of thumb and don’t receive the same level of attention as my running plans. I budget every two weeks around payday, but even when budgeting, I mostly rely on rules of thumb. There is a certain amount I like to keep in my checking account just in case I forgot a bill or something pops up last minute. It’s not a deliberate calculation; it is more of a gut feeling. The same goes for how much money I set aside for free spending, or for whether I feel it is finally time to get that thing I have had my eye on for a while. My budget is probably more important than my running routine, but I actually spend more time rationally developing a running plan than I spend budgeting. The same goes for house and vehicle maintenance, spending time with friends and family, and choosing what to eat on the days we plan to do take-out.


The budget example is interesting because I am consciously and deliberately using rules of thumb to determine how my wife and I will use our money. I set aside a certain amount for gas without going to each vehicle and checking whether we are going to need to fill up soon. I am aware of the rules of thumb, and they are literally built into my spreadsheet where I sometimes ask if I should deviate, but usually decide to stick to them.


I also recognize that I have many unconscious rules of thumb. In his book Risk Savvy, Gerd Gigerenzer writes the following about unconscious rules of thumb:


“Every rule of thumb I am aware of can be used consciously and unconsciously. If it is used unconsciously, the resulting judgment is called intuitive. An intuition, or gut feeling, is a judgment:
  1. that appears quickly in consciousness,
  2. whose underlying reasons we are not fully aware of, yet
  3. is strong enough to act upon.”

I have lots of intuitive judgements that I often don’t think about in the moment, but only realize when I reflect back on how I do something. When I am driving down the freeway, cooking, or writing a blog post, many of my decisions flow naturally and quickly. In the moment the decisions seem obvious, and I don’t have to think too deliberately about my actions and why I am making a specific decision. But if I were asked to explain why I made a decision, I would have a hard time finding exact reasons for my choices. I don’t know exactly how I know to change lanes at a certain point on the freeway, but I know I can often anticipate points where traffic will slow down and where I might be better off in another lane. I can’t tell you why I chose to add the Marsala wine to the mushrooms at the precise moment that I did. I also couldn’t explain why I chose to present a certain quote right at the beginning of a post rather than in the middle. My answer for all of these situations would simply be that it felt right.


We use unconscious rules of thumb like these all the time, but we don’t often notice when we do. When we are budgeting we might recognize our rules of thumb and be able to explain them, but our unconscious rules of thumb are harder to identify and explain. Nevertheless, they still have major impacts on our lives. Just because we don’t notice or can’t explain them doesn’t mean they don’t shape a lot of our decisions or that they don’t matter. The intuitions we have can be powerful and helpful, but they could also be wrong (maybe all this time I’ve been overcooking the mushrooms and should add the wine sooner!). Because these intuitions are unconscious, we don’t deliberately question them unless something calls them up to the conscious level. The feedback we get is probably indirect, meaning that we won’t consciously tie our outcomes to the unconscious rules of thumb that got us there.


I am fascinated by things like unconscious rules of thumb because they reveal how little we actually control in our lives. We are the ones who act on these unconscious rules of thumb, but in a sense, we are not really doing anything at all. We are making decisions based on factors we don’t understand and might not be aware of. We have agency by being the one with the intuition, but we also lack agency by not being fully conscious of the how and why behind our own decisions. This should make us question ourselves and our choices more than we typically do.

Probability is Multifaceted

For five years my wife and I lived in a house at the base of the lee side of a small mountain range in Northern Nevada. When a storm came through the area it had to make it over a couple of small mountain ranges and valleys before reaching our house, and as a result we experienced less precipitation than most people in the Reno/Sparks area. Now my wife and I live in a house higher up on a different mountain, more directly in the path of storms coming from the west. We receive snow at our house while my parents and family lower in the valley barely get any wind. At both houses we have learned to adjust our expectations for precipitation relative to the probabilities reported by weather stations, which reference the airport on the valley floor. Our experience with rain and snow at our two places is a useful demonstration that probability (in this case, the probability of precipitation) is multifaceted – that multiple factors play a role in the probability of a given event at a given place and time.


In his book Risk Savvy, Gerd Gigerenzer writes, “Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief.” Gigerenzer explains that frequency is about counting. To me, this is the most clearly understandable aspect of probability, and what we usually refer to when we discuss probability. On how many days does it usually rain in Reno each year? How frequently does a high school team from Northern Nevada win a state championship and how frequently does a team from Southern Nevada win a state championship? These types of questions simply require counting to give us a general probability of an event happening.


But probability is not just about counting and tallying events. Physical design plays a role as well. Our house on the lee side of a small mountain range was shielded from precipitation, so while it may have rained in the valley half a mile away, we didn’t get any precipitation. Conversely, our current home is positioned to get more precipitation than the rest of the region. In high school sports, fewer kids live in Reno/Sparks than in the Las Vegas region, so in terms of physical design, state championships are likely to be more common for high schools in Southern Nevada. Additionally, there may be differences in the density of students at each school, meaning the North could have more schools per student than the South, also influencing the probability of a northern or southern school winning. Probability, Gigerenzer explains, can be impacted by the physical design of systems, potentially making the statistics and chances more complicated to understand.
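Gigerenzer’s first two faces can be put side by side in a few lines. The dice example below is my own illustration, not one from this passage of the book: frequency estimates a probability by counting outcomes, while physical design reads it off the symmetry of the system before a single observation.

```python
import random

random.seed(1)

# Physical design: a fair six-sided die has six symmetric faces,
# so the design alone says P(rolling a 3) = 1/6, before any data.
design_p = 1 / 6

# Frequency: estimate the same probability by counting outcomes.
rolls = [random.randint(1, 6) for _ in range(100_000)]
frequency_p = rolls.count(3) / len(rolls)

print(f"design:    {design_p:.4f}")     # 0.1667
print(f"frequency: {frequency_p:.4f}")  # close to 0.1667, from counting alone
```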


Finally, degrees of belief play a role in how we comprehend probability. Gigerenzer states that degrees of belief include experience and personal impression, which are very subjective. Trusting two eyewitnesses, Gigerenzer explains, rather than two people who heard about an event from someone else, can increase our perception that an unlikely story is accurate. Degrees of belief can also be seen in my experiences with rain at our two houses. I learned to discount the probability of rain at our first house and to increase my expectation of rain at our new house. If the meteorologist said there was a low chance of rain when we lived on the sheltered side of a hill, I didn’t worry much about storm forecasts. At our new house, however, if there is a chance of precipitation and a storm coming from the west, I will certainly remove anything from the yard that I don’t want to get wet, because I believe the chance that our specific neighborhood will see rain is higher than what the meteorologist predicted.
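That house-by-house discounting is, in effect, an informal odds adjustment: start from the regional forecast and scale it by what experience says about the location. The formalization below is my own toy sketch with assumed adjustment factors, not anything from the book.

```python
def adjust_forecast(forecast_prob, experience_factor):
    """Scale the odds implied by a regional forecast by local experience.

    experience_factor > 1: the location catches more rain than the airport
    gauge suggests (our exposed west-facing house); < 1: a sheltered
    lee-side location. The factor itself is a subjective degree of belief,
    not a measurement.
    """
    odds = forecast_prob / (1 - forecast_prob)
    adjusted_odds = odds * experience_factor
    return adjusted_odds / (1 + adjusted_odds)

print(f"{adjust_forecast(0.30, 0.5):.2f}")  # sheltered house: ~0.18
print(f"{adjust_forecast(0.30, 2.0):.2f}")  # exposed house:  ~0.46
```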


Probability, how we understand it, and how we consequently make decisions are complex, and Gigerenzer’s explanation of the multiple facets of probability helps us better understand that complexity. Simply tallying outcomes and projecting them into the future often isn’t enough to give us a good sense of the probability of a given outcome. We have to think about physical design, and we have to think about the personal experiences and subjective opinions that shape the probabilities people develop and express. Understanding probability requires that we hold a lot of information in our heads at one time, something humans are not great at doing, but something we do better when we have good strategies for understanding complexity.

Navigating Uncertainty with Nudges

In Risk Savvy Gerd Gigerenzer makes a distinction between known risks and uncertainty. In a footnote to a figure, he writes, “In everyday language, we make a distinction between certainty and risk, but the terms risk and uncertainty are used mostly as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.” Gigerenzer’s distinction between risk and uncertainty is important. He demonstrates that people can manage decision-making when dealing with known risks, but that they need to rely on intuition and good judgement when dealing with uncertainty. One way to improve judgement and intuition is to use nudges.


In the book Nudge, Cass Sunstein and Richard Thaler encourage choice architects to design systems and structures that help individuals make the best decision in a given situation, as defined by the chooser. Much of their argument is supported by research presented by Daniel Kahneman in Thinking Fast and Slow, where Kahneman demonstrates how predictable biases and cognitive errors can lead people to make decisions they likely wouldn’t make if they had clearer information, could free themselves from irrelevant biases, and could improve their statistical thinking. Gigerenzer’s quote supports Sunstein and Thaler’s nudges by building on Kahneman’s research. Distinguishing between risk and uncertainty helps us understand when to use nudges, and how aggressive our nudges may need to be.


Gigerenzer uses casino slot machines as an example of known risk, and stocks, romance, earthquakes, business, and health as examples of uncertainty. When we are gambling, we can know the statistical chances that our bets will pay off and calculate optimal strategies (there is a reason the casino dealer stands on 17). We won’t know what the outcome will be ahead of time, but we can precisely define the risk. The same cannot be said for picking the right stocks or the right romantic partner, or for creating business, earthquake-preparedness, or health plans. We may know the five-year rate of return on a company’s stock, the divorce rate in our state, the average frequency and strength of earthquakes in our region, and how old our grandfather lived to be, but we cannot use this information alone to calculate risk. We don’t know exactly what business trends will arise in the future, we don’t know for sure whether we (or our romantic partner) carry a genetic disease that will strike us down sooner than expected, and we can’t say for sure that a 7.0 earthquake is or is not possible next month.
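The casino side of the distinction can be computed exactly, which is the whole point. A quick illustration with European roulette, a standard textbook example rather than one from this passage of the book:

```python
from fractions import Fraction

# European roulette: 37 pockets, 18 of them red. A $1 bet on red pays even money.
p_win = Fraction(18, 37)
expected_value = p_win * 1 + (1 - p_win) * (-1)
print(expected_value, float(expected_value))  # -1/37, about -$0.027 per dollar bet

# No comparable calculation exists for "will this marriage last?" or "will this
# business plan work?" -- there, the probability space itself is unknown.
```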


But nudges can help us with these decisions. We can use statistical information about business development and international stock returns to identify general rules of thumb for investing. We can listen to parents and elders and learn from their advice and mistakes when selecting a romantic partner, intuiting the traits that make a good (or bad) spouse. We can overengineer our bridges and skyscrapers by 10% to give us a little more assurance that they can survive a major and unexpected earthquake. Nudges are helpful because they can augment our gut instincts and make the rules of thumb we rely on more concrete and visible.


Expecting everyone’s individual intuition and heuristics to be up to the task of navigating uncertainty is likely to lead to many poor choices. But if we pool the statistical information available, provide guides, communicate rules of thumb that have panned out for many people, and structure choices in ways that present this information well, then people can likely make marginally better decisions. My suggestion in this post is a nudge to use more nudges in moments of uncertainty. When certainty exists, or even when calculable risks exist, nudges may not be needed. However, once we get beyond calculable risk, where we must rely on judgement and intuition, nudges are important tools to help people navigate uncertainty and improve their decision-making.

Inventing Excuses

With the start of the new year and the inauguration of a new president of the United States, many individuals and organizations are turning their eyes toward the future. Individuals are working on resolutions to make positive changes in their lives. Companies are making plans and strategy adjustments to fit economic and regulatory predictions. Political entities are charting a new course in anticipation of the goals, agendas, and actions of the new administration and the new distribution of political power in the country. However, almost all of the predictions and forecasts of individuals, companies, and political parties will end up being wrong, or at least not completely correct.


Humans are not great forecasters. We rarely do better than just assuming that what happened today will continue to happen tomorrow. We might be able to predict a regression to the mean, but usually we are not great at predicting when a new trend will come along, when a current trend will end, or when some new event will shake everything up. But this doesn’t mean that we don’t try, and it doesn’t mean that we throw in the towel or shrug our shoulders when we get things wrong.


In Risk Savvy Gerd Gigerenzer writes, “an analysis of thousands of forecasts by political and economic experts revealed that they rarely did better than dilettantes or dart-throwing chimps. But what the experts were extremely talented at was inventing excuses for their errors.” It is remarkable how poor our forecasting can be, and even more remarkable how much attention we still pay to forecasts. At the start of the year we all want to know whether the economy will improve, what a political organization is going to focus on, and whether a company will finally produce a great new product. We tune in as experts give us their predictions, running through all the forces and pressures that will shape the economy, political future, and performance of companies. And even when the experts are wrong, we listen to them as they explain why their initial forecast made sense, and why they should still be listened to in the future.
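The dart-throwing benchmark is easy to reproduce. The simulation below is my own toy setup with assumed accuracy rates, not data from the analysis Gigerenzer cites; it just shows how little daylight there is between a coin-flipping “chimp” and an expert who barely beats chance.

```python
import random

random.seed(0)
TRIALS = 10_000
CHIMP_ACCURACY = 0.50   # assumed: random darts on binary calls land at 50%
EXPERT_ACCURACY = 0.53  # assumed: experts only marginally beat chance

chimp_hits = sum(random.random() < CHIMP_ACCURACY for _ in range(TRIALS))
expert_hits = sum(random.random() < EXPERT_ACCURACY for _ in range(TRIALS))

print(f"dart-throwing chimp: {chimp_hits / TRIALS:.1%} correct")
print(f"expert forecaster:   {expert_hits / TRIALS:.1%} correct")
# Both land close to chance; only one of them will publish an explanation afterward.
```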


A human who threw darts, flipped a coin, or picked options out of a hat before making a big decision is likely to be just as right or just as wrong as the experts who recommend one choice over another. However, the coin flipper will have no excuse when they make a poor decision. The expert, on the other hand, will have no problem inventing excuses to explain away their culpability in poor decision-making. The smarter we are, the better we are at rationalizing our choices and inventing excuses, even for decisions that don’t turn out so well.