Discount Confidence


You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how logical coherence of personal narratives is easier when we lack key information and have a limited set of experiences to draw from. The more we know, and the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and neatly logical our narrative about the world will be, and the less confidence we should have about anything.


If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, like, say, an investment strategy, then you should probably discount the outcome based on their certainty. They may still be right in the end, but their certainty shouldn’t be what leads you to support the outcome they present as a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”


We tend to be very trusting. Our society and economy run on the trust that we place in complete strangers. Our inclination toward trust is what causes us to be so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions without having any concrete connection to reality.
A Large Collection of Miniskills


I really like the way that Daniel Kahneman describes expertise in his book Thinking Fast and Slow. His description is incredibly meaningful today, in a world where so many of us work in offices and perform knowledge work. Expertise is important, but it is a bit nebulous when you compare knowledge work expertise to craftsmanship expertise. Nevertheless, a good concept of what expertise is can be helpful when thinking about personal growth and success.


Kahneman writes, “The acquisition of expertise in complex tasks such as high-level chess, professional basketball, or firefighting is intricate and slow because expertise in a domain is not a single skill but rather a large collection of miniskills.” By thinking about expertise as a large collection of miniskills it becomes more understandable and meaningful, even in the context of knowledge work. For sports, many crafts, and even physical labor, expertise as a collection of miniskills is so obvious it is almost invisible. But for knowledge work, expertise as a collection of miniskills is invisible for a different reason: the individual skills are neither obvious nor ubiquitous.


The image that comes to mind when I think of expertise as a series of miniskills is iron forging or glasswork. It is clear that one must have many different skills, ranging from noticing subtle changes in materials as heat is applied to the physical skills involved in shaping the material once it reaches a certain temperature. One also has to have imaginative skills in order to see the shape and design that one wants, and be able to connect the right twists, bends, and physical manipulations to the object to match the mental image. Forging a knife or making a glass marble requires many skills in related but different spheres to make one final product. It is obvious that one needs a lot of miniskills to be successful, but unless we enroll in a beginner’s class, we don’t necessarily think about all the miniskills that go into the craftsmanship.


In the knowledge work economy, our final work products are also an accumulation of miniskills, even though it feels as though we just produce one thing or do one thing with no real “skill” involved. However, our work requires communication skills, writing skills (a particular variation of communication skills), scheduling and coordinating skills, and oftentimes skills that require us to be able to create visually stimulating and engaging materials. Whether it is creating a slide show, coordinating an important meeting, or drafting standard operating procedures, we are not simply doing one thing, but are engaging an entire set of miniskills. True expertise in knowledge work is still derived from a set of miniskills, but the skills themselves don’t seem like real skills, and are easily ignored or overlooked. Focusing on the miniskills needed for knowledge work expertise can help us understand where we can improve, what our image of success really entails, and how to approach important projects. It is the mastery and connection of various miniskills that enables us to be experts in what we do, even in our ubiquitous office environments.
Should You Be So Confident?


Are you pretty confident that your diet is a healthy option for you? Are you confident in the outcome of your upcoming job interview? And how confident are you that you will have enough saved for retirement? Whatever your level of confidence, you might want to reconsider whether you should be as confident as you are, or whether you are just telling yourself a narrative that you like and that makes you feel comfortable with the decisions you have made.


In Thinking Fast and Slow, Daniel Kahneman writes the following about confidence:


“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”


We feel confident in our choices, decisions, and predictions about the future when we can construct a coherent narrative. When we have limited information and experience, it is easy for us to fit that information together in a simplified manner that creates a logical story. The more conflicting and complex information and knowledge we obtain, the more diverse experiences and viewpoints we adopt, the harder it is to construct a simple narrative, and the harder it is for our story about the world to align in a way that makes us confident about anything.


A high level of confidence doesn’t represent reality, and it may actually reflect a lack of understanding of reality and all of its complexities. We are confident that our diet is good when we cut out ice cream and cookies, but we don’t really know that we are getting sufficient nutrients for our bodies and our lifestyles. We don’t really know how we perform in a job interview, but if we left feeling that we really connected and remembered to say the things we prepared, then we might be confident that we will land the job. And if we have a good retirement savings program through our job and also contribute to an IRA, we might feel that we are doing enough for retirement and be confident that we will be able to retire at 65, but few of us really do the calculations to ensure we are contributing what we need, and none of us can predict what housing or stock markets will look like as we get closer to retirement. Confidence is necessary for us to function in the world without being paralyzed by fear and never-ending cycles of analysis, but we shouldn’t mistake confidence in ourselves or in other people for actual certainty and knowledge.
Luck & Success - Joe Abittan


I am someone who believes that we can all learn from the lessons of others. I believe that we can read books, listen to podcasts, watch documentaries, and receive guidance from good managers and mentors that will help us learn, grow, and become better versions of ourselves. I read Good to Great and Built to Last by Jim Collins, and I have seen value in books that look at successful companies and individuals. I have believed that these books offer insights and lessons that can help me and others improve and adopt strategies and approaches that will help us become more efficient and productive over time to reach large, sustainable goals.


But I might be wrong. In Thinking Fast and Slow, Daniel Kahneman directly calls into question whether books from authors like Jim Collins are useful for us at all. The problem, as Kahneman sees it, is that such books fail to account for randomness and chance. They fall prey to the halo effect and see patterns where none truly exist. They ascribe causal mechanisms to randomness, and as a result, we derive lessons that don’t really fit the actual world.


Kahneman writes, “because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success.” Taking a group of 20 successful companies and looking for shared operations, management styles, leadership traits, and corporate cultures will inevitably end up with us identifying commonalities. The mistake is taking those commonalities and then ascribing a causal link between these shared practices or traits and the success of companies or individuals. Without randomized controlled trials, and without natural experiments, we really cannot identify a strong causal link, and we might just be picking up on random chance within our sample selection, at least as Kahneman would argue.
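Kahneman’s point about sample selection can be made concrete with a toy simulation (my own sketch, not something from the book; the trait names and numbers are invented for illustration). Give every company random traits and a success score driven purely by luck, then study only the winners, the way a business book might. Some trait will almost always look like a shared “secret of success.”

```python
import random

random.seed(42)

NUM_COMPANIES = 1000
NUM_TRAITS = 10  # hypothetical traits: "flat hierarchy", "visionary CEO", ...

# Every company gets coin-flip traits and a success score that is pure luck;
# by construction, no trait has any causal effect on success.
companies = [
    {
        "traits": [random.random() < 0.5 for _ in range(NUM_TRAITS)],
        "success": random.random(),
    }
    for _ in range(NUM_COMPANIES)
]

# Study only the 20 most "successful" companies, as a business book might.
winners = sorted(companies, key=lambda c: c["success"], reverse=True)[:20]

# How common is each trait among the winners?
shares = [sum(c["traits"][t] for c in winners) / len(winners)
          for t in range(NUM_TRAITS)]
best = max(range(NUM_TRAITS), key=lambda t: shares[t])
print(f"Trait {best} is shared by {shares[best]:.0%} of the top 20,")
print("even though every trait was assigned by a coin flip.")
```

Run it with different seeds and some trait routinely shows up in well over half the winners. That overlap is an artifact of selecting on success, not evidence of strategy.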


I read Good to Great and I think there is a good chance that Kahneman is correct to a large extent. Circuit City was one of the success stories that Collins touted in the book, but the company barely survived another 10 years after the book’s initial publication. Clearly there are commonalities identified in books like Good to Great that are no more than chance, or that might themselves be artifacts of good luck. Perhaps randomness from good timing, fortunate economic conditions, or inexplicably poor decisions by the competition contribute to any given company or individual success just as much as the factors we identify by studying a group of success stories.


If this is the case, then there is not much to learn from case studies of several successful companies. Looking for commonalities among successful individuals and successful companies might just be an exercise in random pattern recognition, not anything specific that we can learn from. This doesn’t fit the reality that I want, but it may be the reality of the world we inhabit. Personally, I will still look to authors like Jim Collins and try to learn lessons that I can apply in my own life and career to help me improve the work I do. Perhaps I don’t have to fully implement everything mentioned in business books, but surely I can learn strategies that will fit my particular situation and needs, even if they are not broad panaceas to solve all productivity hang-ups in all times and places.
Hindsight Bias and Accountability - Joe Abittan


“Increased accountability is a mixed blessing,” writes Daniel Kahneman in his book Thinking Fast and Slow. This is an idea I have come across in the past in books like Political Realism by Jonathan Rauch and The New Localism by Bruce Katz and Jeremy Nowak. Our go-to answer to challenges and problems tends to be increased transparency and greater oversight. However, in some complex fields, simply opening processes and decision-making procedures to more scrutiny and review can create new problems that might be even worse. This is a particular challenge when we consider the way hindsight bias influences the thoughts and opinions of those reviewing bodies.


Kahneman continues, “because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and – to an extreme – reluctance to take risks.”


Excess scrutiny and oversight can lead to rigid and mechanical decision-making processes. This might not be a problem when we are engineering a bridge and need to make technical decisions based on known mathematical calculations (I’ve never engineered a bridge, so I may be wrong here), but it can be a problem for doctors and policy makers. Doctors have to rely on their experience, their knowledge, and their intuitions to determine the best possible medical treatment. Checklists are fantastic ideas, but when things go wrong in an operating room, doctors and nurses have to make quick decisions balancing risk and uncertainty. If the oversight they will face is intense, then there is a chance that doctors stick to a rigid set of steps that might not really fit the current emergency. In his book, Kahneman writes about how this leads doctors to order unnecessary tests and procedures, more to cover themselves from liability than to truly help the patient, wasting time and money within the healthcare system.


For public decision-making, hindsight bias can be a disaster for public growth and development. The federal government makes loans and backs many projects. Like any venture capital firm or large bank making multiple investments, some of its projects will fail. It is impossible to know at the outset which of ten solar energy projects will be a massive success, and which company is going to go bust. But thanks to hindsight bias and the intense oversight that public agencies and legislatures are subject to, an investment in a solar project that goes bust is likely to haunt the agency head or legislators who backed the project, even if the other nine projects were huge successes.


Oversight is important, but when oversight is subject to hindsight bias, accountability shifts into high gear, blaming decision-makers for failing to have the superhuman ability to predict the future. This creates risk-averse institutions that stagnate, waste resources, and are slow to act, potentially creating new problems and new vulnerabilities to hindsight bias in the future. Rauch, Katz, and Nowak, in the posts I linked to above, all favor reducing transparency in the public setting for this reason, but Kahneman might not agree with them, arguing that closing deliberations to public view won’t hide the outcomes from the public, and won’t stop hindsight bias from being an issue.
Hindsight Bias and Misleading Headlines


I absolutely hate internet ads with headlines along the lines of “Analyst Who Predicted Stock Market Crash Makes New Prediction.” These headlines are nothing but clickbait, and reading Daniel Kahneman’s book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly our hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation. How could anyone fail to see this as far back as 2004?


The problem of course, is that the inevitable present moment and the pathway that seems so obvious in retrospect, was never clear at all. There was no way to predict a major housing bubble and financial collapse in 2008 if you were living in 2006. Headlines introducing some genius who saw what the rest of us couldn’t see before the Great Recession, and then claiming that this person has made another prediction are pulling at our emotions and playing with hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean with their predictions, and assume that their next grand claim is wrong.
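The regression-to-the-mean point can be illustrated with a toy simulation (again my own sketch, not Kahneman’s; the numbers are invented). If thousands of pundits make long-shot calls at random, a handful will inevitably “predict” the crash, and their follow-up predictions will be no better than anyone else’s.

```python
import random

random.seed(0)

NUM_PUNDITS = 10_000
HIT_RATE = 0.05  # every pundit "calls" a rare event 5% of the time, by luck

# Round 1: a crash happens; the lucky few who predicted it become famous.
celebrated = [i for i in range(NUM_PUNDITS) if random.random() < HIT_RATE]

# Round 2: the same pure-chance process decides the next prediction.
second_hit = [random.random() < HIT_RATE for _ in range(NUM_PUNDITS)]
repeat_rate = sum(second_hit[i] for i in celebrated) / len(celebrated)

print(f"{len(celebrated)} of {NUM_PUNDITS} pundits predicted the crash.")
print(f"On their next call, only {repeat_rate:.1%} were right again,")
print(f"roughly the baseline luck rate of {HIT_RATE:.0%}.")
```

The celebrated pundits regress to the mean because their fame came from luck, not skill, which is exactly why one famous call is no reason to trust the next one.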


Rather than using hindsight bias to convince more people to follow links to bogus news stories, we should be more cautious with hindsight bias and our proclivity toward inaccurate heuristics. As Kahneman writes, “Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.”


Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future and lead us to trust individuals who don’t deserve trust, and to mistrust those who are making the best possible decisions given a set of serious constraints. Hindsight bias deserves greater recognition and more respect than being used as bait for misleading headlines.
Can You Remember Your Prior Beliefs? - Joe Abittan


“A general limitation of the human mind,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”


What Kahneman is referring to with this quote is the difficulty we have in understanding how our thinking evolves and changes over time. Our thinking adapts and revises itself, sometimes quite dramatically, but often very gradually. We rarely notice these changes as they happen, unless a change was especially salient, which I would argue is usually because it is tied in one way or another to an important aspect of our identity. For most changes in our mental approach, we generally don’t remember our prior beliefs and views, and we likely don’t remember a point at which our beliefs changed.


In the book Kahneman uses an example of two football teams with the same record playing each other. One team crushes the other, but before we knew the outcome, we didn’t have a strong sense of how the game would go. After watching a resounding victory, it is hard to remember that we once were so uncertain about the future outcome.


This tendency of the mind wouldn’t be much of a problem if it were restricted to our thinking about sports – unless we had a serious betting problem. However, it applies to our thinking on many more important topics, such as family members’ marriages, career choices, political voting patterns, and consumer brand loyalty. At this moment, many Democratic voters in our nation probably don’t remember exactly what their opinions were on topics like free trade, immigration, or infectious disease policy prior to the 2016 election. If they do remember their stances on any of those issues, they probably don’t remember all the legal and moral arguments they expressed at that time. Their minds and opinions have probably shifted in response to President Trump’s policy positions, but it is probably hard for many to say exactly how or why their views have changed.


In a less charged example, imagine that you are back in high school, and for years you have really been into a certain brand of shoes. But, one day, you are bullied for liking that brand, or perhaps someone you really dislike is now sporting that same brand, and you want to do everything in your power to distance yourself from any association with the bullying or the person you don’t like. Ditching the shoes and forgetting that you ever liked that brand is an easy switch for our minds to make, and you never have to remember that you too wore those shoes.


The high school example is silly, but for me it helps put our brain’s failure to remember previous opinions and beliefs in context. Our brains evolved in a social context, and for our ancestors, navigating tribal social structures and hierarchies was complex and sometimes a matter of life and death (not just social media death for a few years in high school like today). Being able to ditch beliefs that no longer fit our needs was probably helpful for our ancestors, especially if it helped them fully commit to a new tribal leader’s strange quirks and new spiritual beliefs. Today, this behavior can cause us to form strange high school (or office) social cliques and can foment toxic political debates, but it may have served a more constructive role for our ancestors forming early human civilizations.

Understanding the Past


I am always fascinated by an idea that is continually validated in my own life: the more we learn about something, the more we realize how little we actually know about it. I am currently reading Yuval Noah Harari’s book Sapiens: A Brief History of Humankind, and I am continually struck by how often Harari brings in events from mankind’s history that I had never heard about. The more I learn about the past, or about any given subject, the more I realize how little knowledge I have ever had, and how limited, narrow, and sometimes just flat out inaccurate my understanding has been.


This is particularly important when it comes to how we think about the past. I believe very strongly that our reality and the worlds we inhabit are mostly social constructions. The trees, houses, and roads are all real, but how we understand the physical objects, the spaces we operate in, and the uses of the real material things in our worlds is shaped to an incredible degree by social constructions and the relationships we build between ourselves and the world we inhabit. In order to understand these constructions, and in order to shape them for a future that we want to live in (and are physiologically capable of living in), we need to understand the past and make predictions about the future with new social constructs that enable continued human flourishing.


To some extent, this feels easy and natural to us. We all have a story, and we learn and adopt family stories, national stories, and global stories about the grand arc of humanity. But while our stories seem to be shared, and while we seem to know where we are heading, we all operate based on individual understandings of the past, and of where that means we are (or should be) heading. As Daniel Kahneman writes in his book Thinking Fast and Slow, “we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do.”


As I laid out to begin this post, there is always so much more complexity and nuance to anything that we might study and be familiar with than we often realize. We can feel that we know something well when we are ignorant of the nuance and complexity. When we start to really untangle something, whether it be nuclear physics, the history of the American Confederacy, or how our fruits and veggies get to the supermarket, we realize that we really don’t know and understand anything as well as we might intuitively believe.


When we lack a deep and complex understanding of the past, either because we just don’t know about something or because we were never given an accurate and detailed presentation of it, we are likely to misinterpret and misunderstand how we got to our current point. With a limited historical perspective and understanding, we will incorrectly assess where our best future lies. It is important that we recognize how limited our knowledge is, and remember that these limits shape the extent to which we can make valid predictions about the future.
Scared Before You Even Know It


In Thinking Fast and Slow, Daniel Kahneman demonstrates how quickly our minds react to potential dangers and threats by showing us two very simple pictures of eyes. The pictures are black squares with a little bit of white space that our brains immediately perceive as eyes, and beyond that, our brains also immediately perceive an emotion within the eyes. They are similar to the simple eyes I sketched out here:

In my sketch, the eyes on the left are aggressive and threatening, and our brains will pick up on the threat they pose and we will have physiological responses before we can consciously think through the fact that those eyes are just a few lines drawn on paper. The same thing happens with the eyes on the right, which our brains recognize as anxious or worried. Our body will have a quick fear reaction, and our brain will be on guard in case there is something we need to be anxious or worried about as well.


Regarding a study that was conducted where subjects in a brain scanner were shown a threatening picture for less than 2/100 of a second, Kahneman writes, “Images of the brain showed an intense response of the amygdala to a threatening picture that the viewer did not recognize. The information about the threat probably traveled via a superfast neural channel that feeds directly into a part of the brain that processes emotions, bypassing the visual cortex that supports the conscious experience of seeing.” The study was designed so that the subjects were not consciously aware of having seen an image of threatening eyes, but nevertheless their brain perceived it and their body reacted accordingly.


The takeaway from this kind of research is that our environments matter and that our brains respond to more than what we are consciously aware of. Subtle cues and factors around us can shape the way we behave and feel about where we are and what is happening. We might not know why we feel threatened, and we might not even realize that we feel threatened, but our heart rate may be elevated, we might tense up, and we might become short and defensive in certain situations. When we think back on why we behaved a certain way, why we felt the way we did, and why we had the reactions we did, our brains won’t be able to recognize these subtle cues that never rose to the level of consciousness. We won’t be able to explain why we felt threatened; all we will be able to recall is the physiological response we had to the situation. We are influenced by far more than our conscious brain is aware of, and we should remember that while our conscious brain doesn’t provide us with a perfect picture of reality, our subconscious reacts to more of the world than we notice.
Ignore Our Ignorance


There is a quote that is attributed to Harry Truman along the lines of, “give me a one-handed economist.” The quote references the frustrations that any key decision-maker might have when faced with challenging and sometimes conflicting information and choices. On the one hand is a decision with a predicted set of outcomes, but on the other hand is another decision or a separate undesirable set of consequences. The quote shows how challenging it is to understand and navigate the world when you have complex and nuanced understandings of what is happening.


Living in ignorance actually makes choices and decisions easier – there is no other hand of separate choices, of negative consequences, or different points of view. Ignoring our ignorance is preferable when we live our own narrative constructions, where what we see is all there is, and reality is what we make it to be.


Daniel Kahneman writes about this in his book Thinking Fast and Slow, and how these narrative fallacies lead to so many of our predictable cognitive errors. He writes, “Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”


When I think about Kahneman’s quote, I think about myself upon graduating with a Master’s in Public Administration and Policy, and my older sister upon her high school graduation. My sister has had strong political views for a very long time, views that she readily adopted as a high school student. Her self-assured profession of her political views, which contrasted against the self-assured political views of my parents, is part of what sparked my interest in studying political science and public policy. I wanted to understand how people became so sure of political views that I didn’t fully understand, but which I could see contained multitudes of perspectives, benefits, and costs.


At the completion of my degree I felt that I had a strong understanding of the political processes in the United States. I could understand how public policy was shaped and formed, and I could describe how people came to hold various points of view and why some people might favor different policies. But what I did not gain was a sense that one particular political approach was necessarily correct or inherently better than any other. So much of our political process is dependent on who stands to benefit, what is in our individual self-interest, and what our true goals happen to be. At the completion of a study of politics, I felt that I knew more than many, but I did not exactly feel that my political opinions were stronger than the political opinions of my sister when she graduated high school. Her opinions were formed in ignorance (I don’t mean this in a mean way!), and her limited perspective allowed her to be more confident in her opinions than I could be with my detailed and nuanced understanding of political systems and processes.


Our views of the world and how we understand our reality are shaped by the information we absorb and the experiences we have. What you see is all there is, and the narrative you live within will make more sense when you are more ignorant of the complexities of the world around you. Your narrative will be simpler and more coherent since there won’t be other hands to contrast against your opinions, desires, and convictions.