Imaginary Reference Points

My last post was about reference points and how they can create subjective experiences that differ from person to person. If my reference point is dramatically different from another person’s, then our experience of the same objective fact or reality can be quite different. If I suddenly won $1 million my life might change dramatically, but if an incredibly wealthy person suddenly won $1 million, they might not care at all.

Reference points can get even more complicated than the example I just shared, which was borrowed from yesterday’s post. Sometimes reference points don’t need to be real in order to shape our subjective experiences of the world. In his book Thinking Fast and Slow, Daniel Kahneman uses the example of an expected raise. If someone knows that their colleagues have received a raise, and expects a similar raise but does not get one, they can feel as though they have really lost something. Their financial situation has not changed, but they created an imaginary reference point in their mind which has shaped the way they think about their current situation.

We all have certain expectations about the world that we adopt as reference points. These reference points don’t have to reflect anything real about the world, but they can still greatly impact our subjective opinions and considerations of the world. They can be vain, such as how attractive we expect our spouse to be, or more positively aspirational, such as how much we expect everyone in society to participate in social causes that help those who are most needy. There is no real reference point that we are using, just hazy ideas of the way we think things should be, but nevertheless, these imaginary reference points can guide a lot of our thinking and behavior.

In my own life, examining these imaginary reference points has been incredibly helpful in making me a happier and more confident person. It is easy to let imaginary reference points fade into the background, where they run our lives without being considered in a critical way. By thinking deeply about our reference points we can better consider what we should and should not strive for, how much effort or money we need to put toward certain endeavors, and whether our behaviors are really reasonable and worthwhile. Through self-reflection and self-awareness we can recognize goals serving as reference points that are unreasonable, desires that are vain and should be discarded, and ideas about who we are supposed to be that don’t truly align with our lives and with what would help us live in a meaningful and fulfilling manner. Imaginary reference points matter, and they can greatly influence how we live our lives. We should make sure we think about them and let go of those which drive us in the wrong direction.
Subjective Reference Points

One reason why we will never be able to perfectly understand other people and the opinions, decisions, and beliefs they hold is that we all have different reference points. I cannot be inside your head, I cannot see things from exactly the same angle that you see them, and I cannot have the same background and experiences that you have. Our differing reference points create subjectivity in our lives, and not just in areas that we would all agree don’t have one true correct answer. Even areas where it seems like there should be a single objective fact or reality can be very subjective. We expect to see a lot of subjective variability in preferences for living in the city versus the country, for private insurance versus state-sponsored insurance, or for soft versus firm mattresses, but we probably don’t expect subjective differences in how loud a sound is, how light or dark a shade of gray appears, or the value of a $2 million gambling win.

Each of these areas of unexpected subjectivity is discussed by Daniel Kahneman in his book Thinking Fast and Slow. He shows that understanding and predicting subjective experiences in these areas is possible if we understand the different reference points in play for each individual. There may be an objective and unchanging fact at the base of the reality each individual experiences, but the experience can nevertheless be subjective. Kahneman writes,

“To predict the subjective experience of loudness, it is not enough to know its absolute energy; you also need to know the reference sound to which it is automatically compared. Similarly, you need to know about the background before you can predict whether a gray patch on a page will appear dark or light. And you need to know the reference before you can predict the utility of an amount of wealth.”

I don’t want to end up at a point where we say there is no objective reality we can all observe; after all, “a dead body is a dead body, and someone is going to jail,” as a friend who is a federal judge here in Reno, NV once said to me. But I do want to highlight just how much of our world can be interpreted differently based on our reference points. Sounds, colors, and the value we would get from a certain amount of wealth are not obviously subjective, but Kahneman shows just how subjective these areas are in his book. This should make us consider how much our backgrounds, our unique points of view, and the circumstances in which we make our observations shape how we understand the world. A lot of our understanding of reality is context dependent, and that should cause us to pause before we say with absolute certainty that reality is exactly as we have experienced it. We should pause to consider the reference points which shape and influence how we experience the world, and how those references might be different from those of other people. We can’t just say that there is one way to interpret and experience everything in the world; we have to accept that how we experience the world will be shaped by many factors that we might not be aware of and might not consider if we don’t slow down to think about our references.
Avoiding Gambles

“Most people dislike risk (the chance of receiving the lowest possible outcome), and if they are offered a choice between a gamble and an amount equal to its expected value they will pick the sure thing,” writes Daniel Kahneman in Thinking Fast and Slow. I don’t want to get too far into expected value, but in my mind I think of it as a discount on the value of the best outcome of a gamble, blended with the possibility of getting nothing. Rather than the expected value of a $100 bet being $100, the expected value will come in somewhere less than that, maybe around $50, $75, or $85, depending on whether the odds of winning the bet are so-so or pretty good. You will either win $100 or $0, not $50, $75, or $85, but the chance of losing causes us to value the bet at less than the full amount up for grabs.
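To make this concrete, the arithmetic can be sketched in a few lines of Python. The function name, the 75% odds, and the dollar figures below are my own illustration, not numbers from Kahneman’s book:

```python
# Expected value is the probability-weighted average of a gamble's outcomes.
def expected_value(outcomes):
    """outcomes: list of (probability, payoff) pairs whose probabilities sum to 1."""
    return sum(p * payoff for p, payoff in outcomes)

# A bet that pays $100 with 75% odds and nothing otherwise:
bet = [(0.75, 100), (0.25, 0)]
print(expected_value(bet))  # 75.0
```

A risk-averse person might still prefer a guaranteed $70 over this bet, in effect paying a $5 premium to avoid the uncertainty.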

What Kahneman describes in his book is an interesting phenomenon where people will mentally (or perhaps subjectively is the better way to put it) calculate an expected value in their head when faced with a betting opportunity. If the expected value of the bet that people calculate for themselves is not much higher than a guaranteed option, people will pick the guaranteed option. The quote I used to open the post describes a phenomenon you have probably seen if you have watched enough game show TV. As Kahneman continues, “In fact a risk-averse decision maker will choose a sure thing that is less than the expected value, in effect paying a premium to avoid the uncertainty.”

On game shows, people will frequently walk away from the possibility of a big payoff with a modest sum of cash if they are risk averse or if the odds seem really stacked against them. What is interesting is that we can study when people make the bet versus when they walk away, and observe patterns in our decision making. It turns out we can predict the situations that drive people toward avoiding gambles, and the situations which encourage them. If a certain outcome is pretty close to the expected outcome of a gamble, people will pick the certain outcome. If there is no certain outcome, people usually need a potential reward of at least twice what they might lose before they will be comfortable with a bet. We might like to take chances and gamble from time to time, but we tend to be pretty risk averse, and we tend to prefer guaranteed outcomes, even at a slight cost relative to the expected value of a bet, rather than risk losing it all.
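That roughly two-to-one pattern can be sketched as a toy decision rule. The threshold constant and function names here are my own illustration, not Kahneman’s actual model:

```python
# Losses loom larger than gains; a common rough estimate of the
# loss-aversion ratio is about 2, matching the pattern described above.
LOSS_AVERSION = 2.0

def accepts_gamble(potential_gain, potential_loss):
    """Accept a 50/50 gamble only if the gain outweighs the
    psychologically amplified loss."""
    return potential_gain > LOSS_AVERSION * potential_loss

print(accepts_gamble(150, 100))  # False: the gain is only 1.5x the loss
print(accepts_gamble(250, 100))  # True: the gain is 2.5x the loss
```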
Overcoming Group Overconfidence

Overcoming group overconfidence is hard, but in Thinking Fast and Slow, Daniel Kahneman offers one partial remedy: a premortem. As opposed to a postmortem, an analysis of why a project failed, a premortem looks at why a project might fail before it has started.

Group communication is difficult. When the leader of a group is enthusiastic about an idea, it is hard to disagree with them. If you are a junior member of a team, it can be uncomfortable, and potentially even disadvantageous for you and your career to doubt the ideas that a senior leader is excited about. If you have concerns, it is not likely that you will bring them up, especially in a group meeting with other seemingly enthusiastic team members surrounding you.

Beyond the silencing of a member who has concerns but doesn’t want to speak up is another problem that contributes to overconfidence among teams: groupthink. Particularly among groups that lack diversity, groupthink can crush the planning stage of a project. When everyone has similar backgrounds, similar experiences, and similar styles of thinking, it is unlikely that anyone within the group will have a viewpoint or opinion that is significantly different than the prevailing wisdom of the rest. What seems like a good idea or the correct decision to one person probably feels like the correct idea or decision to everyone else – there is literally no one in the room who has any doubts or alternative perspectives.

Premortems help get beyond groupthink and the fear of speaking up against a powerful and enthusiastic leader. The idea is to brainstorm all the possible ways that a project might fail. It includes an element of creativity by asking everyone to imagine that the project has finally finished, either successfully but well over budget, way late, or after a very turbulent series of events, or that the project was a complete failure and never reached its intended end point. People have to describe the issues that came up and why the project did not reach the rosy outcome everyone initially pictured. Imagining that these failures have taken place gets people to step beyond groupthink and encourages highlighting roadblocks that particularly enthusiastic members overlook.

Because premortems are hypothetical, they give people a chance to speak up about failure points and weaknesses in plans and ideas without appearing to criticize the person the idea came from. They create a safe space for imagining barriers and obstacles that need to be overcome to achieve success, and they reduce groupthink by encouraging a creative flow of ideas about failure points. As Kahneman writes, “The main virtue of the premortem is that it legitimizes doubts. Furthermore, it encourages even supporters of the decision to search for possible threats that they had not considered earlier.”

Overcoming group overconfidence is possible, but it needs the right systems and structures to happen. Groupthink and fear are likely to prevent people from bringing up real doubts and threats, but a premortem allows those concerns to be aired and seriously considered. It helps get people to look beyond the picture of success they intuitively connect with, and it helps prevent enthusiastic supporters from getting carried away with their overconfidence.
Take the Outside View

Taking the outside view is a shorthand, colloquial way to say: think of the base rate of the reference class to which something belongs, and make judgements and predictions from that starting point. “Take the outside view” is advice from Daniel Kahneman in his book Thinking Fast and Slow for anyone working on a group project, launching a start-up, or considering an investment in a particular company. It is easy to take the inside view, where everything seems predictable and success feels certain. However, it is often better for long-term success to take the outside view.

In his book, Kahneman writes, “people who have information about an individual case rarely feel the need to know the statistics of the class to which the case belongs.” He writes this after discussing a group project he worked on, where he and others attempted to estimate the time necessary to complete the project and the obstacles and hurdles they should expect along the way. For everyone involved, the barriers and the likelihood of being derailed and slowed down seemed minimal, but Kahneman asked the group what to expect based on the typical experience of similar projects. The outlook was much grimmer when viewed from the outside perspective, and it helped the group better anticipate challenges they could face and set more reasonable timelines and work processes.

Kahneman continues, “when forecasting the outcomes of risky projects, executives too easily fall victim to the planning fallacy. In its grip, they make decisions based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs.”

Taking the outside view helps us get beyond delusional optimism. It helps us set better expectations about how long a project will take, what rate of return we should expect, and what the risks really look like. It is like getting a medical second opinion, to ensure that your doctor isn’t missing anything and is following the most up-to-date practices. Taking the outside view shifts our base rate, anchors us to a reality that is more reflective of the world we live in, and helps us prepare for challenges that we would otherwise overlook.
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how logical coherence of personal narratives is easier when we lack key information and have a limited set of experiences to draw from. The more we know, the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and easily logical our narrative about the world should be, and the less confidence we should have about anything.

If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, say an investment strategy, then you should probably discount the outcome based on their certainty. They may still be right in the end, but their certainty shouldn’t be a factor that leads to your support of the outcome they tell you is a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”

We tend to be very trusting. Our society and economy run on trust that we place in complete strangers, and our inclination toward trust is what causes us to be so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in assumptions that have no concrete connection to reality.
A Large Collection of Miniskills

I really like the way that Daniel Kahneman describes expertise in his book Thinking Fast and Slow. His description is incredibly meaningful today, in a world where so many of us work in offices and perform knowledge work. Expertise is important, but it is a bit nebulous when you compare knowledge work expertise to craftsmanship expertise. Nevertheless, a good concept of what expertise is can be helpful when thinking about personal growth and success.

Kahneman writes, “The acquisition of expertise in complex tasks such as high-level chess, professional basketball, or firefighting is intricate and slow because expertise in a domain is not a single skill but rather a large collection of miniskills.” By thinking about expertise as a large collection of miniskills it becomes more understandable and meaningful, even in the context of knowledge work. For sports, many crafts, and even physical labor, expertise as a collection of miniskills is so obvious it is almost taken for granted. But for knowledge work, the collection of miniskills is invisible, because the individual skills themselves are not obvious.

The image that comes to mind when I think of expertise as a series of miniskills is iron forging or glasswork. It is clear that one must have many different skills, ranging from noticing subtle changes in materials as heat is applied to physically shaping the material once it reaches a certain temperature. One also has to have imaginative skills in order to see the shape and design one wants, and to connect the right twists, bends, and physical manipulations of the object to match the mental image. Forging a knife or making a glass marble requires many skills in related but different spheres to make one final product. It is obvious that one needs a lot of miniskills to be successful, but unless we enroll in a beginner’s class, we don’t necessarily think about all the miniskills that go into the craftsmanship.

In the knowledge work economy, our final work products are also an accumulation of miniskills, even though it feels as though we just produce one thing or do one thing with no real “skill” involved. However, our work requires communication skills, writing skills (a particular variation of communication skills), scheduling and coordinating skills, and oftentimes skills that require us to be able to create visually stimulating and engaging materials. Whether it is creating a slide show, coordinating an important meeting, or drafting standard operating procedures, we are not simply doing one thing, but are engaging an entire set of miniskills. True expertise in knowledge work is still derived from a set of miniskills, but the skills themselves don’t seem like real skills, and are easily ignored or overlooked. Focusing on the miniskills needed for knowledge work expertise can help us understand where we can improve, what our image of success really entails, and how to approach important projects. It is the mastery and connection of various miniskills that enables us to be experts in what we do, even in our ubiquitous office environments.
Should You Be So Confident?

Are you pretty confident that your diet is a healthy option for you? Are you confident in the outcome of your upcoming job interview? And how confident are you that you will have enough saved for retirement? Whatever your level of confidence, you might want to reconsider whether you should be as confident as you are, or whether you are just telling yourself a narrative that you like and that makes you feel comfortable with the decisions you have made.

In Thinking Fast and Slow, Daniel Kahneman writes the following about confidence:

“Confidence is a feeling, which reflects the coherence of the information and the cognitive ease of processing it. It is wise to take admissions of uncertainty seriously, but declarations of high confidence mainly tell you that an individual has constructed a coherent story in his mind, not necessarily that the story is true.”

We feel confident in our choices, decisions, and predictions about the future when we can construct a coherent narrative. When we have limited information and experience, it is easy for us to fit that information together in a simplified manner that creates a logical story. The more conflicting and complex information and knowledge we obtain, the more diverse experiences and viewpoints we adopt, the harder it is to construct a simple narrative, and the harder it is for our story about the world to align in a way that makes us confident about anything.

A high level of confidence doesn’t represent reality, and it may actually reflect a lack of understanding of reality and all of its complexities. We are confident that our diet is good when we cut out ice cream and cookies, but we don’t really know that we are getting sufficient nutrients for our bodies and our lifestyles. We don’t really know how we perform in a job interview, but if we left feeling that we really connected and remembered to say the things we prepared, then we might be confident that we will land the job. And if we have a good retirement savings program through our job and also contribute to an IRA, we might feel that we are doing enough for retirement and be confident that we will be able to retire at 65, but few of us really do the calculations to ensure we are contributing what we need, and none of us can predict what housing or stock markets will look like as we get closer to retirement. Confidence is necessary for us to function in the world without being paralyzed by fear and never-ending cycles of analysis, but we shouldn’t mistake confidence in ourselves or in other people for actual certainty and knowledge.
Luck & Success - Joe Abittan

I am someone who believes that we can all learn from the lessons of others. I believe that we can read books, listen to podcasts, watch documentaries, and receive guidance from good managers and mentors that will help us learn, grow, and become better versions of ourselves. I read Good to Great and Built to Last by Jim Collins, and I have seen value in books that look at successful companies and individuals. I have believed that these books offer insights and lessons that can help me and others improve, and adopt strategies and approaches that will help us become more efficient and productive over time to reach large, sustainable goals.

But I might be wrong. In Thinking Fast and Slow, Daniel Kahneman directly calls into question whether books from authors like Jim Collins are useful for us at all. The problem, as Kahneman sees it, is that such books fail to account for randomness and chance. They fall prey to the halo effect and see patterns where none truly exist. They ascribe causal mechanisms to randomness, and as a result, we derive lessons that don’t really fit the actual world.

Kahneman writes, “because luck plays a large role, the quality of leadership and management practices cannot be inferred reliably from observations of success.” Taking a group of 20 successful companies and looking for shared operations, management styles, leadership traits, and corporate cultures will inevitably end up with us identifying commonalities. The mistake is taking those commonalities and then ascribing a causal link between these shared practices or traits and the success of companies or individuals. Without randomized controlled trials, and without natural experiments, we really cannot identify a strong causal link, and we might just be picking up on random chance within our sample selection, at least as Kahneman would argue.

I read Good to Great and I think there is a good chance that Kahneman is correct to a large extent. Circuit City was one of the success stories that Collins touted in the book, but the company barely survived another 10 years after the book’s initial publication. Clearly there are commonalities identified in books like Good to Great that are no more than chance, or that might themselves be artifacts of good luck. Perhaps randomness from good timing, fortunate economic conditions, or inexplicably poor decisions by the competition contribute to any given company or individual success just as much as the factors we identify by studying a group of success stories.

If this is the case, then there is not much to learn from case studies of several successful companies. Looking for commonalities among successful individuals and successful companies might just be an exercise in random pattern recognition, not anything specific that we can learn from. This doesn’t fit the reality that I want, but it may be the reality of the world we inhabit. Personally, I will still look to authors like Jim Collins and try to learn lessons that I can apply in my own life and career to help me improve the work I do. Perhaps I don’t have to fully implement everything mentioned in business books, but surely I can learn strategies that will fit my particular situation and needs, even if they are not broad panaceas to solve all productivity hang-ups in all times and places.
Hindsight Bias and Accountability - Joe Abittan

“Increased accountability is a mixed blessing,” writes Daniel Kahneman in his book Thinking Fast and Slow. This is an idea I have come across in the past in books like Political Realism by Jonathan Rauch and The New Localism by Bruce Katz and Jeremy Nowak. Our go-to answer to any challenge or problem tends to be increased transparency and greater oversight. However, in some complex fields, simply opening processes and decision-making procedures to more scrutiny and review can create new problems that might be even worse. This is a particular challenge when we consider the way hindsight bias influences the thoughts and opinions of those reviewing bodies.

Kahneman continues, “because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and – to an extreme – reluctance to take risks.”

Excess scrutiny and oversight can lead to rigid and mechanical decision-making processes. This might not be a problem when we are engineering a bridge and need to make technical decisions based on known mathematical calculations (I’ve never engineered a bridge, so I may be wrong here), but it can be a problem for doctors and policy makers. Doctors have to rely on their experience, their knowledge, and their intuitions to determine the best possible medical treatment. Checklists are fantastic ideas, but when things go wrong in an operating room, doctors and nurses have to make quick decisions balancing risk and uncertainty. If the oversight they will face is high, then there is a chance that doctors stick to a rigid set of steps that might not really fit the current emergency. In his book, Kahneman writes about how this leads doctors to order unnecessary tests and procedures, more to cover themselves from liability than to truly help the patient, wasting time and money within the healthcare system.

For public decision-making, hindsight bias can be a disaster for public growth and development. The federal government makes loans and backs many projects. Like any venture capital firm or large bank making multiple investments, some of its projects will fail. It is impossible to know at the outset which of ten solar energy projects will be a massive success, and which company is going to go bust. But thanks to hindsight bias and the intense oversight that public agencies and legislatures are subject to, an investment in a solar project that goes bust is likely to haunt the agency head or legislators who backed the project, even if the other nine projects were huge successes.

Oversight is important, but when oversight is subject to hindsight bias, accountability shifts into high gear, blaming decision-makers for failing to have the superhuman ability to predict the future. This creates risk-averse institutions that stagnate, waste resources, and are slow to act, potentially creating new problems and new vulnerabilities to hindsight bias in the future. Rauch, Katz, and Nowak, in the posts I linked to above, all favor reducing transparency in the public setting for this reason, but Kahneman might not agree with them, arguing that closing deliberations to transparency won’t hide the outcomes from the public, and won’t stop hindsight bias from being an issue.