Inventing Excuses - Joe Abittan


With the start of the new year and the inauguration of a new president of the United States, many individuals and organizations are turning their eyes toward the future. Individuals are working on resolutions to make positive changes in their lives. Companies are making plans and strategy adjustments to fit with economic and regulatory predictions. Political entities are charting a new course in anticipation of the goals, agendas, and actions of the new administration and the new distribution of political power in the country. However, almost all of the predictions and forecasts of individuals, companies, and political parties will end up being wrong, or at least not completely correct.


Humans are not great forecasters. We rarely do better than just assuming that what happened today will continue to happen tomorrow. We might be able to predict a regression to the mean, but usually we are not great at predicting when a new trend will come along, when a current trend will end, or when some new event will shake everything up. But this doesn’t mean that we don’t try, and it doesn’t mean that we throw in the towel or shrug our shoulders when we get things wrong.
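
The “tomorrow will look like today” baseline can be made concrete with a short simulation (entirely synthetic data, my own illustration rather than anything from the books quoted below): on a random walk, where each step is just the previous value plus noise, the naive persistence forecast beats a forecast that chases the most recent trend.

```python
import random

# Synthetic random walk: each value is the previous value plus noise.
random.seed(42)
series = [100.0]
for _ in range(1000):
    series.append(series[-1] + random.gauss(0, 1))

# Persistence forecast: predict tomorrow = today.
persistence_errors = [abs(series[t + 1] - series[t]) for t in range(len(series) - 1)]

# Trend-chasing forecast: extrapolate the most recent step forward.
trend_errors = [
    abs(series[t + 1] - (series[t] + (series[t] - series[t - 1])))
    for t in range(1, len(series) - 1)
]

mae_persistence = sum(persistence_errors) / len(persistence_errors)
mae_trend = sum(trend_errors) / len(trend_errors)
print(mae_persistence, mae_trend)  # persistence comes out ahead
```

On a true random walk the trend forecast’s error has twice the variance of the persistence forecast’s, which is one reason such naive baselines are so hard to beat.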


In Risk Savvy, Gerd Gigerenzer writes, “an analysis of thousands of forecasts by political and economic experts revealed that they rarely did better than dilettantes or dart-throwing chimps. But what the experts were extremely talented at was inventing excuses for their errors.” It is remarkable how poor our forecasting can be, and even more remarkable how much attention we still pay to forecasts. At the start of the year we all want to know whether the economy will improve, what a political organization is going to focus on, and whether a company will finally produce a great new product. We tune in as experts give us their predictions, running through all the forces and pressures that will shape the economy, political future, and performance of companies. And even when the experts are wrong, we listen to them as they explain why their initial forecast made sense, and why they should still be listened to in the future.


A person who throws darts, flips a coin, or picks options out of a hat before making a big decision is likely to be just as right, or just as wrong, as the experts who recommend one choice over another. However, the coin flipper will have no excuse when they make a poor decision. The expert, on the other hand, will have no problem inventing excuses to explain away their culpability in poor decision-making. The smarter we are, the better we are at rationalizing our choices and inventing excuses, even excuses that don’t go over so well.
Predictable Outcomes


“In many domains people are tempted to think, after the fact, that an outcome was entirely predictable, and that the success of a musician, an actor, an author, or a politician was inevitable in light of his or her skills and characteristics. Beware of that temptation. Small interventions and even coincidences, at a key stage, can produce large variations in the outcome,” write Richard Thaler and Cass Sunstein in their book Nudge.


People are poor judges of the past. We lament the fact that the future is always unclear and unpredictable. We look back at the path that we took to get to where we are today, and are frustrated by how clear everything should have seemed. When we look back, each step to where we are seems obvious and unavoidable. However, what we forget, in our own lives and when we look at others, is how much luck, coincidence, and random chance played a role in the way things developed. Whether you are an athlete at Zion Williamson’s level, an author as skilled as J.K. Rowling, or just someone who is happy with the job you have, there were plenty of chances for things to go wrong, derailing what seems like an obvious and secure path. Injuries, deaths in our immediate family, or even just a disparaging comment from the right person could have turned Williamson away from basketball, shattered Rowling’s confidence in her writing, or denied you the job you enjoy.


What we should recognize is that there is a lot of randomness along the path to success; it is not entirely about hard work and motivation. This should humble us when we are successful, and comfort us when we catch a bad break. We certainly need to focus, work hard, develop good habits, and try to make the choices that will lead us to success, but when things don’t work out as well as we hoped, it is not necessarily because we lack something or are incompetent. At the same time, reaching fantastic heights is no reason to proclaim ourselves better than anyone else. We may have had the right mentor see us at the right time, we may have happened to get a good review at the right moment, or we may simply have been lucky where another person was unlucky. We can still be proud of where we are, but we shouldn’t use that pride to deny other people opportunity. We should use that pride to help other people have lucky breaks of their own. Nothing is certain, even if it looks like it always was when you look in the rear-view mirror.
Take the Outside View


Taking the outside view is a shorthand, colloquial way to say: think of the base rate of the reference class to which something belongs, and make judgments and predictions from that starting point. Take the outside view is advice from Daniel Kahneman in his book Thinking, Fast and Slow for anyone working on a group project, launching a start-up, or considering an investment in a particular company. It is easy to take the inside view, where everything seems predictable and success feels certain. However, it is often better for long-term success to take the outside view.


In his book, Kahneman writes, “people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.” He writes this after discussing a group project he worked on in which he and others attempted to estimate the time necessary to complete the project and the obstacles and hurdles they should expect along the way. For everyone involved, the barriers and the likelihood of being derailed and slowed down seemed minimal, but Kahneman asked the group what to expect based on the typical experience of similar projects. The outlook was much grimmer when viewed from the outside perspective, and the exercise helped the group better anticipate challenges they could face and set more reasonable timelines and work processes.


Kahneman continues, “when forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs.”


Taking the outside view helps us get beyond delusional optimism. It helps us set better expectations about how long a project will take, what rate of return we should expect, and what the risks really look like. It is like getting a medical second opinion, to ensure that your doctor isn’t missing anything and is following the most up-to-date practices. Taking the outside view shifts our base rate, anchors us to a reality that is more reflective of the world we live in, and helps us prepare for challenges that we would otherwise overlook.
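
As a toy illustration of that shift in base rate (every number here is invented), taking the outside view can be as simple as checking an inside-view estimate against percentiles from a reference class of similar past projects:

```python
# Hypothetical "outside view" check: compare our inside-view estimate to the
# distribution of completion times for comparable past projects.
inside_view_estimate_months = 18

# Invented completion times (in months) of ten similar past projects.
reference_class = [22, 30, 27, 41, 25, 36, 29, 52, 33, 38]

def percentile(data, q):
    """Return the q-th percentile (0-100) using linear interpolation."""
    s = sorted(data)
    k = (len(s) - 1) * q / 100
    lo, hi = int(k), min(int(k) + 1, len(s) - 1)
    return s[lo] + (s[hi] - s[lo]) * (k - lo)

median = percentile(reference_class, 50)
worst_quartile = percentile(reference_class, 75)
print(f"Inside view: {inside_view_estimate_months} months")
print(f"Outside view: median {median} months, 75th percentile {worst_quartile} months")
```

Even in this made-up example, the reference class suggests the optimistic inside-view number is well below what similar projects have actually taken.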
Understanding the Past


I am always fascinated by the idea, continually demonstrated in my own life, that the more we learn about something, the more we realize how little we actually know about it. I am currently reading Yuval Noah Harari’s book Sapiens: A Brief History of Humankind, and I am continually struck by how often Harari brings in events from mankind’s history that I had never heard about. The more I learn about the past, or about any given subject, the more I realize how little knowledge I have ever had, and how limited, narrow, and sometimes just flat-out inaccurate my understandings have been.


This is particularly important when it comes to how we think about the past. I believe very strongly that our reality and the worlds in which we live are mostly social constructions. The trees, houses, and roads are all real, but how we understand those physical objects, the spaces in which we operate, and the ways we use the real, material things in our worlds is shaped to an incredible degree by social constructions and the relationships we build between ourselves and the world we inhabit. In order to understand these constructions, and to shape them for a future that we want to live in (and are physiologically capable of living in), we need to understand the past and make predictions about the future with new social constructs that enable continued human flourishing.


To some extent, this feels easy and natural to us. We all have a story, and we learn and adopt family stories, national stories, and global stories about the grand arc of humanity. But while our stories seem to be shared, and while we seem to know where we are heading, we all operate based on individual understandings of the past, and of where that means we are (or should be) heading. As Daniel Kahneman writes in his book Thinking, Fast and Slow, “we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.”


As I laid out to begin this post, there is always so much more complexity and nuance to anything that we might study and be familiar with than we often realize. We can feel that we know something well when we are ignorant of the nuance and complexity. When we start to really untangle something, whether it be nuclear physics, the history of the American Confederacy, or how our fruits and veggies get to the supermarket, we realize that we really don’t know and understand anything as well as we might intuitively believe.


When we lack a deep and complex understanding of the past, because we simply don’t know about something or because we never received an accurate and detailed account of it, we are likely to misinterpret and misunderstand how we got to our current point. With a limited historical perspective and understanding, we will incorrectly assess where our best future lies. It is important that we recognize how limited our knowledge is, and remember that these limits shape the extent to which we can make valid predictions about the future.
Intensity Matching and Intuitive Predictions


“Intuitive predictions are almost completely insensitive to the actual predictive quality of the evidence,” writes Daniel Kahneman in Thinking, Fast and Slow. A lot of our thinking takes place in the part of our brain that is good at making quick connections, detecting patterns, and making fast judgments. The deeper and more thoughtful part of our brain only engages with the world when it really needs to, when we must do some critical thinking to sort out a math problem, write a blog post, or figure out how to grind down some grains to make bread. The result is that a lot of our thinking happens at a quick, intuitive level that is subject to biases and assumptions based on incomplete information. When we do finally turn our critical-thinking brain to a problem, it is operating only on the limited set of information from the quick part of our brain, which scanned the environment and grabbed the information that stood out.


When we make a prediction without sitting down and doing some math, or without weighing the factors that influence our prediction with pen and paper, our predictions will seem logical, but they will miss critical information. We will make connections between ideas and experiences that may not reflect the actual world. We will simplify the prediction by answering easier questions and substituting those answers for the more difficult question our prediction is actually trying to answer.
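
Kahneman’s corrective for this, sketched here with hypothetical numbers, is to start from the baseline of the reference class and move toward the intuitive estimate only in proportion to how well the evidence actually predicts the outcome:

```python
# A sketch of Kahneman's disciplined prediction (all numbers are hypothetical):
# regress the intuitive estimate toward the baseline in proportion to the
# correlation between the evidence and the outcome.
def disciplined_prediction(baseline, intuitive_estimate, evidence_correlation):
    """With correlation 0 we keep the baseline; with 1 we keep the intuition."""
    return baseline + evidence_correlation * (intuitive_estimate - baseline)

baseline_gpa = 3.0      # average GPA in the reference class (assumed)
intuitive_gpa = 3.8     # what a vivid interview tempts us to predict
correlation = 0.3       # interviews only weakly predict GPA (assumed)

print(round(disciplined_prediction(baseline_gpa, intuitive_gpa, correlation), 2))  # → 3.24
```

The weaker the evidence, the closer the disciplined prediction stays to the base rate; only perfectly predictive evidence would justify the extreme intuitive number.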


This year, as in 2016, we will see this in action. In 2016, for me and many of the people I know, it seemed as though very few people supported Donald Trump for president. I saw very few bumper stickers or yard signs for Trump, all the social media posts I saw highlighted his worst moments, and the news coverage I consumed described why he was unfit to be president. Naturally enough, I believed he would lose in a landslide. Of course, that did not happen. Intuitively I was sure that Clinton would win, and Kahneman’s research helps explain why I should have been more skeptical of my natural intuition.


Part of the problem was that my intuitive prediction was an exercise in intensity matching, and as Kahneman writes, “Intensity matching yields predictions that are as extreme as the evidence on which they are based.” All the information I saw highlighted how terrible Trump was. I didn’t see many people supporting Trump, I didn’t see news stories justifying his candidacy, and I didn’t see people in my immediate environment who strongly supported him, so my intuition was biased. It didn’t help that I did nothing to seek out people who did support him, or information outlets that posted articles or stories in support of him.


Kahneman’s writing aligns with my real-world experience. His studies of the brain and of our predictive machinery reveal biases and errors in our thinking. Our intuition is based on the limited set of information that the quick part of our brain can put together. When we do engage our deep-thinking brain, it can still only operate on that limited information, so even when we think critically, we are likely to make mistakes because we can’t see the full picture, and biases in the information we absorb will predictably shape the direction of our miscalculations. What feels natural and obvious to us could be the result of faulty intensity matching and random chance in the environment around us.
A Capacity for Surprise


For much of our waking life we operate on System 1, or we at least allow System 1 to be in control of many of our actions, thoughts, and reactions to the world around us. We don’t normally have to think very hard about our commute to work, we can practically walk through the house in the early morning on our way to the coffee machine with our eyes closed, and we can nod to the Walmart greeter and completely forget them half a second after we have passed. Most of the time, the pattern of associated ideas from System 1 is great at getting us through the world, but occasionally, something happens that doesn’t fit the model. Occasionally, something reveals our capacity for surprise.


Seeing someone juggling in the middle of the shopping aisle in Walmart would be a surprise (although less of a surprise in a Walmart than in some other places). Stepping on a stray Lego is an unwelcome early-morning, pre-coffee surprise, as is an unexpected road closure on our commute. These are examples of large surprises in our daily routine, but we can also have very small surprises, like when someone tells us we will be meeting with Aaron to discuss our personal financial plan, and in walks Erin, surprising us by being a woman in a position we may have subconsciously associated with men.


“A capacity for surprise is an essential aspect of our mental life,” writes Daniel Kahneman in his book Thinking Fast and Slow, “and surprise itself is the most sensitive indication of how we understand our world and what we expect from it.”


Because so much of our lives is in the hands of System 1, we are frequently surprised. If we consciously think about the world and the vast array of possibilities at any moment, we might not be too surprised at any given outcome. We also would be paralyzed by trying to make predictions of a million different outcomes for the next five minutes. System 1 eases our cognitive load and sets us up for routine expectations based on the model of the world it has adapted from experience. Surprise occurs when something violates our model, and one of the best ways to understand what that model looks like is to look at the situations that surprise us.
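
Information theory offers a loose quantitative analogy for this (it is not from Kahneman’s book): the surprisal of an event is the negative log of the probability our model assigned to it, so the less likely our model considered an event, the more bits of surprise it delivers when it happens.

```python
import math

# Surprisal in bits: -log2 of the probability our internal model assigned
# to the event. Low-probability events carry high surprisal.
def surprisal_bits(probability):
    return -math.log2(probability)

# Hypothetical probabilities a mental model might assign:
print(surprisal_bits(0.5))    # a coin flip: exactly 1 bit
print(surprisal_bits(0.001))  # a juggler in the shopping aisle: ~10 bits
```

The analogy only goes so far, but it captures the idea that our surprise measures the gap between our model and the world.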


Bias is revealed through surprise, when an associated pattern is interrupted by something that we were not expecting. The examples can be harmless, such as a friend unexpectedly answering the phone sick, with a raspy and sleepy voice. But often our surprise can reveal more consequential biases, such as when we are surprised to see a person of a racial minority in a position of authority. It might not seem like much, but our surprise can convey a lot of information about what we expect and how we understand the world, and other people might pick up on that even if we didn’t intend to convey a certain expectation about another person’s place in the world. We are constantly making predictions about what we will experience in the world, and our capacity for surprise reveals the biases that exist within our predictions, saying something meaningful about what our model of the world looks like.

We Might Be Wrong

“If you can be sure of being right only 55 percent of the time,” writes Dale Carnegie in the book How to Win Friends and Influence People, “you can go down to Wall Street and make a million dollars a day. If you can’t be sure of being right even 55 percent of the time, why should you tell other people they are wrong?”
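
Carnegie’s Wall Street claim has real mathematical teeth. A sketch, assuming repeated even-money bets (my framing, not Carnegie’s): with a 55 percent hit rate, the Kelly criterion prescribes staking a tenth of your bankroll each time, and that small edge compounds.

```python
import math

# Kelly criterion for an even-money bet: stake the fraction 2p - 1 of bankroll.
p = 0.55
kelly_fraction = 2 * p - 1  # 0.10 of the bankroll per bet

# Expected log-growth of the bankroll per bet at the Kelly stake:
growth = p * math.log(1 + kelly_fraction) + (1 - p) * math.log(1 - kelly_fraction)
print(kelly_fraction, growth)  # ~0.10 stake, ~0.5% expected growth per bet
```

Half a percent of compound growth per bet, repeated many times a day, is exactly why being reliably right 55 percent of the time would be worth a fortune.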


We always feel so sure of our judgments and conclusions. From the health and safety of GMO foods, to the impacts of a new tax, to who is going to win the Super Bowl, we are often very confident people. The world seems to always want our opinions, and we are usually very excited to offer them with a staggering amount of confidence. This has led to a lot of funny social media posts about people being incorrect about history, science, and sports, but more seriously, these thinking errors can lead nations to invade countries for poor reasons, lead to mechanical failures of spacecraft and oil platforms, and cause us to lose huge sums of money when the game doesn’t turn out the way we knew it would.


I think a good practice is to look for areas where we feel a high degree of confidence, and to then try to ascribe a confidence level to our thoughts. We can try to tie our confidence levels back to real world events to help us ground our predictions: The percent chance of getting blackjack in a given hand is 4.83%, Steph Curry’s 3-point shooting percentage is 43.5%, and the percent chance of getting heads in a coin flip is of course 50%. Can you anchor your confidence (or the chance you are wrong) to one of these percentages?
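
Those anchors are easy to sanity-check. For instance, the blackjack figure follows from drawing an ace and a ten-value card, in either order, from a fresh 52-card deck:

```python
# Probability of a natural blackjack from a fresh 52-card deck:
# an ace and a ten-value card (10, J, Q, K), in either order.
aces, ten_values, deck = 4, 16, 52
p_blackjack = (aces / deck) * (ten_values / (deck - 1)) * 2
print(f"{p_blackjack:.2%}")  # → 4.83%
```
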


I haven’t studied this (so I could be wrong – I’d wager the chance I’m wrong and this is not helpful at Steph Curry’s 3-point percentage), but I would expect that doing this type of exercise would help us recognize how overconfident we often are. It might even help us get to the next step, admitting that we might be wrong and considering different possibilities. Carnegie continues:


“You will never get into trouble by admitting that you may be wrong. That will stop all argument and inspire your opponent to be just as fair and open and broad-minded as you are. It will make him want to admit that he, too, may be wrong.”


The important thing to remember is that the world is incredibly complex, and our minds are only so good at absorbing lots of new data and articulating a comprehensive understanding of the information we synthesize. We should be more willing to consider ways in which our beliefs may be inaccurate, and more willing to listen to reasonable people (especially those who have demonstrated expertise or effective thinking skills) when they suggest an idea that does not conform to our prior beliefs. Try not to be closed-minded and overly confident in your own beliefs, and you will avoid more thinking errors and make better long-term decisions.

Be Calm Ahead of Your Obstacle

In Letters From a Stoic, Seneca writes, “There are more things … likely to frighten us than there are to crush us; we suffer more often in imagination than in reality.” Our minds work really hard to keep us safe, keep us in important positions, and keep us connected so that we can succeed and so that our children and grandchildren can enjoy a high-status life. Our minds are trying to help us navigate an uncertain future, but sometimes they go too far, and we become paralyzed by a fear that is worse than the outcome we want to avoid.


Seneca continues, “What I advise you to do is not to be unhappy before the crisis comes; since it may be that the dangers before which you paled as if they were threatening you, will never come upon you; they certainly have not come yet.”


We can live our lives worrying about what will go wrong five minutes from now, five days from now, or five years from now, but we never truly know what is around the corner. Sometimes we set artificial deadlines for ourselves, and sometimes deadlines are forced upon us, but that doesn’t mean we need to live every moment leading up to a deadline in fear of what will happen if we don’t achieve what we intended by that date. The fear we feel can be useful in pushing us to get things done and avoid procrastination, but when we notice that we can’t sleep at night because we are worried about what may happen if the thing we fear occurs, it is time to step back and refocus on the present moment. I find it helpful to look at my fears and recognize that in the present moment I am fine, and that the status quo will most likely continue even if I miss the deadline or the bad thing does happen. There are plenty of things to fear, and we should build the capacity to see that we will still be able to move on with life even if some of our worst fears come true.


Ultimately, we know we are going to have obstacles and setbacks in our lives, but that does not mean we need to live every moment in fear of what bad thing is around the corner. We can live conservatively and save money and resources to confidently weather such challenges, but we do not need to allow negative things in our lives to cause us trauma before they have occurred. Preparing ourselves ahead of time will help mitigate the fear, but learning to accept that bad things will happen and learning to enjoy the present moment are the only ways we can truly escape from the fear of what lies ahead.

Training Our Instincts

In his book Becoming Who We Need To Be, author Colin Wright explains how training in certain areas changes us. “Training our instincts is like feeding our subconscious. It grants us more informed, helpful knee-jerk reactions, rather than blind and potentially damaging impulses.” For example, Wright describes how experienced auto mechanics can diagnose vehicle problems in one area of an engine based on a signal from a different area, and how spending six months learning to cook gave him a new understanding and appreciation for the raw ingredients that can be combined to make a meal. With training, things we once didn’t know about or understand at all can become things that give us clues and slight insights based on our experience and knowledge.


Recently, Tyler Cowen interviewed Ezekiel Emanuel for his podcast, Conversations with Tyler, and I was struck by Emanuel’s efforts to learn and engage with something new each year. He has recently learned how to make his own jam and chocolate and in the interview talked about the insights and unexpected things that he has gained by trying something completely new. He doesn’t always stick with everything he learns and tries, but by applying himself in a lot of different areas, he picks up new perspectives, meets new people, and gains a new appreciation for something that was foreign to him in the past.


The lessons from Wright and Emanuel are worth keeping in mind and building into our own lives. When we only have a vague understanding of how the world works, we will move through it making assumptions that are not warranted. We will act in ways that seem intuitively obvious to us, but our way of moving through the world may be as foolish as asking the French why they didn’t have an air tanker drop water on Notre-Dame. Ignorance can be quite costly in our own lives and in the negative externalities we push onto the rest of the world, and as we take on more responsibility for the relationships, families, and businesses that count on us, our ignorance becomes costly for the rest of society as well. Becoming aware of areas where we have no expertise and no training is important so that we can identify where we might have knee-jerk reactions that won’t help anyone. Awareness of our ignorance can help us choose what we want to focus on, what we want to learn about, and what would help us become better people for our society.


On the other side of the coin, as we become more expert in a given area, we will be able to better sense what is happening around us and make choices and decisions that work well even when we can’t fully explain them. That expertise is something we should strive toward, but all the while we should recognize where it falls short, and how bad assumptions could harm us and others.