Overcoming Group Overconfidence

Overcoming group overconfidence is hard, but in Thinking Fast and Slow, Daniel Kahneman offers one partial remedy: a premortem. As opposed to a postmortem, an analysis of why a project failed, a premortem looks at why a project might fail before it has started.

Group communication is difficult. When the leader of a group is enthusiastic about an idea, it is hard to disagree with them. If you are a junior member of a team, it can be uncomfortable, and potentially even disadvantageous for you and your career, to doubt the ideas that a senior leader is excited about. If you have concerns, you are not likely to bring them up, especially in a group meeting surrounded by other seemingly enthusiastic team members.

Beyond the silencing of members who have concerns but don’t want to speak up, another problem contributes to overconfidence among teams: groupthink. Particularly in groups that lack diversity, groupthink can crush the planning stage of a project. When everyone has similar backgrounds, similar experiences, and similar styles of thinking, it is unlikely that anyone within the group will have a viewpoint or opinion that is significantly different from the prevailing wisdom of the rest. What seems like a good idea or the correct decision to one person probably feels like the correct idea or decision to everyone else – there is literally no one in the room who has any doubts or alternative perspectives.

Premortems help get beyond groupthink and the fear of speaking up against a powerful and enthusiastic leader. The idea is to brainstorm all the possible ways that a project might fail. It includes an element of creativity by asking everyone to imagine that the project has finished: perhaps it succeeded but ran well over budget, finished way late, or survived a very turbulent series of events, or perhaps it was a complete failure and never reached its intended end point. People have to describe the issues that came up and why the project did not reach the rosy outcome everyone initially pictured. Imagining that these failures have already taken place gets people to step beyond groupthink and encourages them to highlight roadblocks that particularly enthusiastic members overlook.

Because a premortem is hypothetical, it gives people a chance to speak up about failure points and weaknesses in plans and ideas without appearing to criticize the person the idea came from. It creates a safe space for imagining barriers and obstacles that need to be overcome to achieve success. It reduces groupthink by encouraging a creative flow of ideas about failure points. As Kahneman writes, “The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier.”

Overcoming group overconfidence is possible, but it needs the right systems and structures to happen. Groupthink and fear are likely to prevent people from bringing up real doubts and threats, but a premortem allows those concerns to be aired and seriously considered. It helps get people to look beyond the picture of success they intuitively connect with, and it helps prevent enthusiastic supporters from getting carried away with their overconfidence.
Take the Outside View

Taking the outside view is a shorthand, colloquial way of saying: think of the base rate of the reference class to which something belongs, and make judgments and predictions from that starting point. “Take the outside view” is advice from Daniel Kahneman in his book Thinking Fast and Slow for anyone working on a group project, launching a start-up, or considering an investment in a particular company. It is easy to take the inside view, where everything seems predictable and success feels certain. However, it is often better for long-term success to take the outside view.

In his book, Kahneman writes, “people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.” He writes this after discussing a group project he worked on, in which he and others attempted to estimate the time necessary to complete the project and the obstacles and hurdles they should expect along the way. To everyone involved, the barriers and the likelihood of being derailed and slowed down seemed minimal, but Kahneman asked the group what to expect based on the typical experience of similar projects. The outlook was much more grim when viewed from that outside perspective, and it helped the group better anticipate challenges they could face and set more reasonable timelines and work processes.

Kahneman continues, “when forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs.”

Taking the outside view helps us get beyond delusional optimism. It helps us form better expectations about how long a project will take, what rate of return we should expect, and what the risks really look like. It is like getting a medical second opinion to ensure that your doctor isn’t missing anything and is following the most up-to-date practices. Taking the outside view anchors us to a base rate that is more reflective of the world we live in, and it helps us prepare for challenges that we would otherwise overlook.
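As a rough sketch of what this kind of base-rate anchoring might look like, here is a toy calculation in Python. Every number, and the simple weighted blend itself, is invented purely for illustration; this is one possible heuristic, not a method Kahneman prescribes.

```python
# Toy inside-view vs. outside-view comparison for a project timeline.
# All figures are hypothetical.

inside_estimate_months = 18   # the team's gut-level forecast
reference_class_mean = 36     # typical duration of similar past projects

# A crude outside-view adjustment: start from the reference-class
# base rate and adjust toward the inside estimate, not the reverse.
weight_on_base_rate = 0.7     # arbitrary; more weight = more skepticism
blended = (weight_on_base_rate * reference_class_mean
           + (1 - weight_on_base_rate) * inside_estimate_months)

print(f"Inside view:      {inside_estimate_months} months")
print(f"Reference class:  {reference_class_mean} months")
print(f"Blended estimate: {blended:.1f} months")
```

The point is not the particular weights, but the order of operations: anchor on what typically happens to projects like yours, then let the specifics of your case move you modestly away from that anchor.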
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how a logically coherent personal narrative is easier to construct when we lack key information and have a limited set of experiences to draw from. The more we know and the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and tidy our narrative about the world should be, and the less confidence we should have about anything.

If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, like, say, an investment strategy, then you should probably discount their certainty. They may still be right in the end, but their certainty shouldn’t be a factor that leads you to support the outcome they tell you is a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”

We tend to be very trusting. Our society and economy run on the trust we place in complete strangers. Our inclination toward trust is what causes us to be so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions without having any concrete connection to reality.
A Large Collection of Miniskills

I really like the way that Daniel Kahneman describes expertise in his book Thinking Fast and Slow. His description is incredibly meaningful today, in a world where so many of us work in offices and perform knowledge work. Expertise is important, but it is a bit nebulous when you compare knowledge work expertise to craftsmanship expertise. Nevertheless, a good concept of what expertise is can be helpful when thinking about personal growth and success.

Kahneman writes, “The acquisition of expertise in complex tasks such as high-level chess, professional basketball, or firefighting is intricate and slow because expertise in a domain is not a single skill but rather a large collection of miniskills.” Thinking about expertise as a large collection of miniskills makes it more understandable and meaningful, even in the context of knowledge work. For sports, many crafts, and even physical labor, expertise as a collection of miniskills is so obvious that it is almost invisible – we take it for granted. But for knowledge work, expertise as a collection of miniskills is invisible for the opposite reason: the individual skills are not obvious, so we fail to notice them at all.

The image that comes to mind for me when I think of expertise as a series of miniskills is iron forging or glasswork. It is clear that one must have many different skills, ranging from noticing subtle changes in materials as heat is applied to the physical skills involved in shaping the material once it reaches a certain temperature. One also has to have imaginative skills in order to see the shape and design that one wants, and to connect the right twists, bends, and physical manipulations of the object to match the mental image. Forging a knife or making a glass marble requires many skills in related but different spheres to make one final product. It is obvious that one needs a lot of miniskills to be successful, but unless we enroll in a beginner’s class, we don’t necessarily think about all the miniskills that go into the craftsmanship.

In the knowledge work economy, our final work products are also an accumulation of miniskills, even though it can feel as though we just produce one thing with no real “skill” involved. Our work requires communication skills, writing skills (a particular variation of communication skills), scheduling and coordinating skills, and often the skills needed to create visually engaging materials. Whether we are creating a slide show, coordinating an important meeting, or drafting standard operating procedures, we are not simply doing one thing, but engaging an entire set of miniskills. True expertise in knowledge work is still derived from a set of miniskills, but the skills themselves don’t seem like real skills and are easily ignored or overlooked. Focusing on the miniskills needed for knowledge work expertise can help us understand where we can improve, what our image of success really entails, and how to approach important projects. It is the mastery and connection of various miniskills that enables us to be experts in what we do, even in our ubiquitous office environments.
Should You Be So Confident?

Are you pretty confident that your diet is a healthy option for you? Are you confident about the outcome of your upcoming job interview? And how confident are you that you will have enough saved for retirement? Whatever your level of confidence, you might want to reconsider whether you should be as confident as you are, or whether you are just telling yourself a narrative that you like and that makes you feel comfortable with the decisions you have made.

In Thinking Fast and Slow, Daniel Kahneman writes the following about confidence:

“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”

We feel confident in our choices, decisions, and predictions about the future when we can construct a coherent narrative. When we have limited information and experience, it is easy for us to fit that information together in a simplified manner that creates a logical story. The more conflicting and complex information and knowledge we obtain, the more diverse experiences and viewpoints we adopt, the harder it is to construct a simple narrative, and the harder it is for our story about the world to align in a way that makes us confident about anything.

A high level of confidence doesn’t represent reality, and it may actually reflect a lack of understanding of reality and all of its complexities. We are confident that our diet is good when we cut out ice cream and cookies, but we don’t really know that we are getting sufficient nutrients for our bodies and our lifestyles. We don’t really know how we perform in a job interview, but if we left feeling that we really connected and remembered to say the things we prepared, then we might be confident that we will land the job. And if we have a good retirement savings program through our job and also contribute to an IRA, we might feel that we are doing enough for retirement and be confident that we will be able to retire at 65, but few of us really do the calculations to ensure we are contributing what we need, and none of us can predict what housing or stock markets will look like as we get closer to retirement. Confidence is necessary for us to function in the world without being paralyzed by fear and never-ending cycles of analysis, but we shouldn’t mistake confidence in ourselves or in other people for actual certainty and knowledge.
Luck & Success

I am someone who believes that we can all learn from the lessons of others. I believe that we can read books, listen to podcasts, watch documentaries, and receive guidance from good managers and mentors that will help us learn, grow, and become better versions of ourselves. I read Good to Great and Built to Last by Jim Collins, and I have seen value in books that look at successful companies and individuals. I have believed that these books offer insights and lessons that can help me and others improve and adopt strategies and approaches that will help us become more efficient and productive over time to reach large, sustainable goals.

But I might be wrong. In Thinking Fast and Slow, Daniel Kahneman directly calls into question whether books from authors like Jim Collins are useful for us at all. The problem, as Kahneman sees it, is that such books fail to account for randomness and chance. They fail to recognize the halo effect and so see patterns where none truly exist. They ascribe causal mechanisms to randomness, and as a result, we derive lessons that don’t really fit the actual world.

Kahneman writes, “because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success.” Taking a group of 20 successful companies and looking for shared operations, management styles, leadership traits, and corporate cultures will inevitably end up with us identifying commonalities. The mistake is taking those commonalities and then ascribing a causal link between these shared practices or traits and the success of companies or individuals. Without randomized controlled trials, and without natural experiments, we really cannot identify a strong causal link, and we might just be picking up on random chance within our sample selection, at least as Kahneman would argue.
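To see how easily pure chance can masquerade as a success formula, here is a toy simulation in Python. It is my own sketch, not an example from the book, and every parameter in it is invented:

```python
# If company results were pure luck, the "top 20" selected on past
# success would still look special in hindsight, yet would regress
# to the mean going forward.
import random

random.seed(42)
n_companies, n_years = 1000, 5

def performance(years):
    # pure luck: each year is an independent random draw
    return sum(random.gauss(0, 1) for _ in range(years))

past = {c: performance(n_years) for c in range(n_companies)}
top20 = sorted(past, key=past.get, reverse=True)[:20]

avg_past = sum(past[c] for c in top20) / 20
avg_future = sum(performance(n_years) for _ in top20) / 20

print(f"Top 20 average, selection period: {avg_past:.2f}")
print(f"Top 20 average, follow-up period: {avg_future:.2f}  (near 0, the mean)")
```

In a world like this one, any commonalities we find among the top 20 are artifacts of the selection, and their future performance tells us so.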

I read Good to Great and I think there is a good chance that Kahneman is correct to a large extent. Circuit City was one of the success stories that Collins touted in the book, but the company barely survived another ten years after the book’s initial publication. Clearly some of the commonalities identified in books like Good to Great are no more than chance, or might themselves be artifacts of good luck. Perhaps randomness from good timing, fortunate economic conditions, or inexplicably poor decisions by the competition contributes to any given company’s or individual’s success just as much as the factors we identify by studying a group of success stories.

If this is the case, then there is not much to learn from case studies of several successful companies. Looking for commonalities among successful individuals and successful companies might just be an exercise in random pattern recognition, not anything specific that we can learn from. This doesn’t fit the reality that I want, but it may be the reality of the world we inhabit. Personally, I will still look to authors like Jim Collins and try to learn lessons that I can apply in my own life and career to help me improve the work I do. Perhaps I don’t have to fully implement everything mentioned in business books, but surely I can learn strategies that will fit my particular situation and needs, even if they are not broad panaceas to solve all productivity hang-ups in all times and places.
Hindsight Bias and Accountability

“Increased accountability is a mixed blessing,” writes Daniel Kahneman in his book Thinking Fast and Slow. This is an idea I have come across before in books like Political Realism by Jonathan Rauch and The New Localism by Bruce Katz and Jeremy Nowak. Our go-to answer to challenges and problems tends to be increased transparency and greater oversight. However, in some complex fields, simply opening processes and decision-making procedures to more scrutiny and review can create new problems that might be even worse. This is a particular challenge when we consider the way hindsight bias influences the thoughts and opinions of reviewing bodies.

Kahneman continues, “because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and – to an extreme – reluctance to take risks.”

Excess scrutiny and oversight can lead to rigid and mechanical decision-making processes. This might not be a problem when we are engineering a bridge and need to make technical decisions based on known mathematical calculations (I’ve never engineered a bridge, so I may be wrong here), but it can be a problem for doctors and policy makers. Doctors have to rely on their experience, their knowledge, and their intuitions to determine the best possible medical treatment. Checklists are fantastic ideas, but when things go wrong in an operating room, doctors and nurses have to make quick decisions balancing risk and uncertainty. If the oversight they will face is intense, then there is a chance that doctors stick to a rigid set of steps that might not really fit the current emergency. In his book, Kahneman writes about how this leads doctors to order unnecessary tests and procedures, more to cover themselves from liability than to truly help the patient, wasting time and money within the healthcare system.

For public decision-making, hindsight bias can be a disaster for public growth and development. The federal government makes loans and backs many projects. Like any venture capital firm or large bank making multiple investments, it will see some projects fail. It is impossible to know at the outset which of ten solar energy projects will be a massive success and which company is going to go bust. But thanks to hindsight bias and the intense oversight that public agencies and legislatures are subject to, an investment in a solar project that goes bust is likely to haunt the agency head or legislators who backed the project, even if the other nine projects were huge successes.
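The portfolio arithmetic here is worth making explicit. A toy example in Python, with entirely invented dollar figures:

```python
# Ten hypothetical $100M project loans: nine return 1.5x, one goes bust.
loans = [100] * 10            # $100M backed per project (invented figure)
payoffs = [150] * 9 + [0]     # nine successes at 1.5x, one total loss

invested, returned = sum(loans), sum(payoffs)
print(f"Invested ${invested}M, returned ${returned}M, "
      f"net {returned / invested - 1:+.0%}")  # +35% despite one failure
```

Judged as a portfolio, the program is a clear win; judged failure-by-failure with hindsight, the one bust is all anyone remembers.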

Oversight is important, but when oversight is subject to hindsight bias, accountability shifts into high gear, blaming decision-makers for failing to have the superhuman ability to predict the future. This creates risk-averse institutions that stagnate, waste resources, and are slow to act, potentially creating new problems and new vulnerabilities to hindsight bias in the future. Rauch, Katz, and Nowak, in the posts I linked to above, all favor reducing transparency in the public setting for this reason, but Kahneman might not agree with them, arguing that closing deliberations to public view won’t hide the outcomes from the public, and won’t stop hindsight bias from being an issue.
Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes New Prediction.” These headlines are always nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation – how could anyone fail to see this as far back as 2004?

The problem, of course, is that the inevitable present moment, and the pathway that seems so obvious in retrospect, was never clear at all. There was no way to predict the major housing bubble and financial collapse of 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction, are pulling at our emotions and playing on hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean with their predictions, and assume that their next grand claim is wrong.
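A toy simulation makes the regression argument concrete. This is my own sketch, not an example from the book, and every parameter in it is invented:

```python
# With enough analysts making yes/no crash calls at random, some will
# have "predicted" a rare crash -- and yet be no better than anyone
# else the following year.
import random

random.seed(0)
n_analysts = 10_000
p_predict_crash = 0.1   # assumed share of analysts calling a crash

def calls():
    return [random.random() < p_predict_crash for _ in range(n_analysts)]

# Year 1: a crash happens. The "heroes" are whoever happened to call it.
heroes = [i for i, call in enumerate(calls()) if call]

# Year 2: no crash this time. How do the heroes compare to the field?
year2 = calls()
hero_acc = sum(not year2[i] for i in heroes) / len(heroes)
field_acc = sum(not call for call in year2) / n_analysts

print(f"{len(heroes)} analysts 'called' the year-1 crash by luck")
print(f"Hero accuracy in year 2:  {hero_acc:.0%}")
print(f"Field accuracy in year 2: {field_acc:.0%}  (no hero advantage)")
```

With ten thousand analysts, roughly a thousand will have “called” any given crash by luck alone, and their next-year accuracy is indistinguishable from everyone else’s.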

Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”

Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future, lead us to trust individuals who don’t deserve trust, and lead us to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as fodder for misleading headlines.
Can You Remember Your Prior Beliefs?

“A general limitation of the human mind,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”

What Kahneman is referring to with this quote is the difficulty we have in understanding how our thinking evolves and changes over time. Our thinking adapts and revises itself, sometimes quite dramatically, but more often very slowly. We rarely notice these changes as they happen, unless a change was especially salient, which I would argue usually means it was tied in one way or another to an important aspect of our identity. For most changes in our mental approach, we generally don’t remember our prior beliefs and views, and we likely don’t remember the point at which our beliefs changed.

In the book, Kahneman uses the example of two football teams with the same record playing each other. One team crushes the other, but before we knew the outcome, we didn’t have a strong sense of how the game would go. After watching a resounding victory, it is hard to remember that we were once so uncertain about the outcome.

This tendency of the mind wouldn’t be much of a problem if it were restricted to our thinking about sports – unless we had a serious betting problem. However, it applies to our thinking on many more important topics, such as family members’ marriages, career choices, political voting patterns, and consumer brand loyalty. At this moment, many Democratic voters in our nation probably don’t remember exactly what their opinions were on topics like free trade, immigration, or infectious disease policy prior to the 2016 election. If they do remember their stances on any of those issues, they probably don’t remember all the legal and moral arguments they expressed at the time. Their minds and opinions have probably shifted in response to President Trump’s policy positions, but it is probably hard for many to say exactly how or why their views have changed.

For a less charged example, imagine that you are back in high school, and for years you have really been into a certain brand of shoes. But one day you are bullied for liking that brand, or perhaps someone you really dislike is now sporting the same brand, and you want to do everything in your power to distance yourself from any association with the bullying or the person you don’t like. Ditching the shoes and forgetting that you ever liked the brand is an easy switch for the mind to make, and you never have to remember that you, too, once wore those shoes.

The high school example is silly, but for me it helps put our brain’s failure to remember previous opinions and beliefs in context. Our brains evolved in a social context, and for our ancestors, navigating tribal social structures and hierarchies was complex and sometimes a matter of life and death (not just social media death for a few years of high school like today). Being able to ditch beliefs that no longer fit our needs was probably helpful for our ancestors, especially if it helped them fully commit to a new tribal leader’s strange quirks and new spiritual beliefs. Today, this behavior can cause us to form strange high school (or office) social cliques and can foment toxic political debates, but it may have served a more constructive role for our ancestors forming early human civilizations.

Understanding the Past

I am always fascinated by the idea, which continually demonstrates its validity in my own life, that the more we learn about something, the more we realize how little we actually know about it. I am currently reading Yuval Noah Harari’s book Sapiens: A Brief History of Humankind, and I am continually struck by how often Harari brings up events from mankind’s history that I had never heard about. The more I learn about the past, or about any given subject, the more I realize how little knowledge I have ever had, and how limited, narrow, and sometimes just flat-out inaccurate my understanding has been.

This is particularly important when it comes to how we think about the past. I believe very strongly that our reality and the worlds we inhabit are mostly social constructions. The trees, houses, and roads are all real, but how we understand the physical objects, the spaces in which we operate, and the uses of the real material things in our worlds is shaped to an incredible degree by social constructions and the relationships we build between ourselves and the world we inhabit. In order to understand these constructions, and in order to shape them for a future that we want to live in (and are physiologically capable of living in), we need to understand the past and make predictions about the future with new social constructs that enable continued human flourishing.

To some extent, this feels easy and natural to us. We all have a story, and we learn and adopt family stories, national stories, and global stories about the grand arc of humanity. But while our stories seem to be shared, and while we seem to know where we are heading, we each operate based on an individual understanding of the past and of where that means we are (or should be) heading. As Daniel Kahneman writes in his book Thinking Fast and Slow, “we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.”

As I laid out to begin this post, there is always so much more complexity and nuance to anything that we might study and be familiar with than we often realize. We can feel that we know something well when we are ignorant of the nuance and complexity. When we start to really untangle something, whether it be nuclear physics, the history of the American Confederacy, or how our fruits and veggies get to the supermarket, we realize that we really don’t know and understand anything as well as we might intuitively believe.

When we lack a deep and complex understanding of the past, because we simply don’t know about something or because we never received an accurate and detailed account of it, we are likely to misinterpret and misunderstand how we got to our current point. With a limited historical perspective and understanding, we will incorrectly assess where our best future lies. It is important that we recognize how limited our knowledge is and remember that these limits shape the extent to which we can make valid predictions about the future.