On Ego – A Response to a Comment from Philip

Philip asked for my thoughts on ego in a recent comment. Several years back I read and wrote about Ryan Holiday’s book Ego is the Enemy, and it has been fundamental in shaping how I see and understand myself within our complex social world. In addition to Ego is the Enemy, Robin Hanson and Kevin Simler’s book The Elephant in the Brain and Daniel Kahneman’s work in Thinking Fast and Slow have dramatically shaped the way I understand the idea of the self, how we think, and the role of ego in our lives. Here are some of my thoughts on ego, and some specific responses to questions that Philip asked.
 
 
First, Philip said that he sees “ego as a closed loop of sort, independent of the acting self.”
 
 
I wouldn’t agree with Philip on this point, but that is because I reject the idea of an independent acting self. Yuval Noah Harari is a great person to read on meditation and the idea of the self. If you ever try meditating, you will quickly learn that you don’t truly have control over your thoughts. This suggests that we don’t have an independent self doing the thinking in our minds. “Thoughts think themselves,” Harari has said, and if you meditate, you will understand what he means. Thoughts frequently pop into our heads without our control. Ego, and the kinds of thoughts we associate with ego and megalomania, are all just thoughts swirling around in what appears to be a chaos of thoughts. Given the nature of thought, I don’t think we should treat ego as anything independent of the other thoughts within our mind.
 
 
Second, Philip says, “if you are in a state of security, you can choose not to act on ego.”
 
 
I also wouldn’t agree with this point. In his Meditations, Marcus Aurelius writes about Epictetus, a slave and a pioneer of Stoicism. Epictetus was not exactly secure, but he was able to put aside ego and focus on the present moment. His particular brand of Stoicism has resonated with prisoners of war and involves the dissolution of personal ego for survival. We can put aside ego at any point, regardless of how secure we are.
 
 
Also pushing against Philip’s thoughts is Donald Trump. Certainly, at many points in his life, Trump has been secure in terms of money, fame, power, and influence. Yet Trump is clearly an egomaniac who is unable to set his ego aside and pursues even the smallest slights and insults against him. I don’t think that a state of security is really an important consideration for whether we act in an egotistical way.
 
 
Philip’s third observation on ego is “as a self preserving mechanism, protecting you and helping you in motivating living the life you do.”
 
 
This is a view of our ego that I would agree with. When we think about how the mind works, I think we should always approach it from an evolutionary psychology standpoint. Very likely, our brains are the way they are because at some point in the history of human evolution it was beneficial for our minds to function in one way over another. There could be some accidents, some mental equivalents of vestigial organs, and some errors in our interpretations, but we probably would not have developed many psychological traits, and maintained them across generations, if they were not helpful for survival somewhere along the way.
 
 
When we view the ego through this lens, it is not hard to see how the ego could improve our chances of surviving and passing down our genes. If we are egotistical and think that we deserve the best and deserve larger amounts of resources, we will be more likely to advocate for ourselves and fight for a better lot in life. This could help our survival, help us find a better mate, and help ensure we pass more genes on to subsequent generations. Without the ego, we may choose to settle, we may be complacent, and we may not strive to pass our genes along or ensure that those subsequent genes have sufficient resources to further pass their genes into the following generation. Ego can push us to strive toward the higher salary, the fancier car, more exclusive golf clubs, and other things that are not really necessary for life, but that could help us amass more resources and help our kids have better connections to get into Stanford and ultimately find a spouse and have kids. Ego could certainly be antisocial and harmful for us and society, but it could also be important for genetic survival.
 
 
Philip’s fourth point is about a guy who is hurt because his wife forgot his birthday.
 
 
It is possible that an inflated ego is what made this guy upset when his wife forgot his birthday, but it could also stem from a number of other psychological or relationship issues between him and his spouse. He may have larger issues of self-worth and value independent of his ego. He may be codependent and perhaps need counseling to better manage his relationships with others. Or his wife could have just been having a bad day. To me, this didn’t seem like a great avenue for discussing and understanding ego.
 
 
The fifth point that Philip brings up ties back to his third, by viewing ego as a “fairness calculator.”
 
 
I think this could also be a useful way to view ego, and it seems like something that can be understood from a cognitive psychology perspective. We don’t want to feel like we are being cheated, yet we would be happy to bend the rules and cheat a little if we thought we could get away with it. This is a lot of what Robin Hanson and Kevin Simler discuss in The Elephant in the Brain. If we can signal that we are honest and trustworthy, without actually having to be honest and trustworthy, then we are at an advantage. However, if we suspect that another person is all signal, with no actual behavior to back those signals up, then we may act in an egotistical way by being defensive and pushing back against them. Ego does seem to help fuel this mindset and does seem to encourage a type of fairness-calculator behavior.
 
 
The final point that Philip makes is that, “ego needs to be controlled in a civilized society.”
 
 
I think here Philip is also correct. We live in large, complex societies, and ego helps us individually but also produces negative externalities. Ego certainly helped push Trump into the presidency and the history books, but I’m not sure the world was better for it. By pursuing our own self-interest and acting based on ego, we can damage the world around us.
 
 
Hanson and Simler would argue that many of these harmful effects of ego are moderated by our signaling ability. Hanson has estimated that up to 90% of what we do as humans is signaling, at least in rich countries like the United States. Signaling both helps us get ahead and tempers our ego. Overt displays are frowned upon, leaving less overt signaling as the way we display how amazing we are. An unchecked ego is going to break the rules for signaling, and unless it is Donald Trump in the 2016 election, such overt egotism will be punished. Ultimately, we do have to control ego because of its negative externalities if we want to cooperate and live in complex social communities.
 
 
I hope this helps explain some of how I think about ego!

Violence and Convenient Mysticism

Mysticism in the United States doesn’t really feel like it lends itself to violence. When we think of mystics, we probably think of someone close to a shaman, or maybe a modern mystic whose aesthetic is very homeopathic. Mystics don’t seem like they would be the most violent people today, but in the past, mysticism was a convenient motivating factor for violence.
 
 
In his book The Better Angels of Our Nature, Steven Pinker describes the way that mysticism lends itself to violence by writing, “the brain has evolved to ferret out hidden powers in nature, including those that no one can see. Once you start rummaging around in the realm of the unverifiable there is considerable room for creativity, and accusations of sorcery are often blended with self-serving motives.”
 
 
There are two important factors to recognize in this quote from Pinker, and both are often overlooked and misunderstood. First, our brains look for causal links between events. They are very good at, and very naturally inclined toward, thinking causally and pinpointing causation. However, as Daniel Kahneman wrote in Thinking Fast and Slow, the brain can often fall into cognitive fallacies and misattribute causation. Mystical thinking is a result of misplaced causal reasoning. It is important that we recognize that our brains can see causation that doesn’t truly exist, leading us to wrong conclusions.
 
 
The second important factor that we often manage to overlook is our own self-interest. As Kevin Simler and Robin Hanson explain in The Elephant in the Brain, our self-interest plays a much larger role in much of our decision-making and behavior than we like to admit. When combined with mysticism, self-interest can be dangerous.
 
 
If you have an enemy who boasts that they are special and offers mystical explanations for their special powers, then it suddenly becomes convenient to justify violence against that enemy. You don’t need actual proof of any wrongdoing, you don’t need actual proof of their danger to society; you just need to convince others that their mystical powers could be dangerous, and you now have a convenient excuse for disposing of those you dislike. You can promote your own self-interest without regard to reality if you can harness the power of mystical thinking.
 
 
Pinker explains that the world is becoming a more peaceful place in part because mystical thinking is retreating to smaller and smaller corners of the world. Legal systems don’t recognize mystical explanations and justifications for behaviors and crimes. Empirical facts and verifiable evidence have superseded mysticism in our evaluations and judgments of crime and the use of violence. By moving beyond mysticism we have created systems, structures, and institutions that foster more peace and less violence among groups of people.

Personally and Politically Disturbed by the Homeless

On the first page of the preface of The Homeless, Christopher Jencks writes about the responses that many Americans had to the rise of homelessness in American cities in the 1970s. He writes, “The spread of homelessness disturbed affluent Americans for both personal and political reasons. At a personal level, the faces of the homeless often suggest depths of despair that we would rather not imagine, much less confront in the flesh. … At a political level, the spread of homelessness suggests that something has gone fundamentally wrong with America’s economic or social institutions.”
I think the two books that most accurately describe the way that I understand our political and social worlds are Thinking Fast and Slow by Daniel Kahneman and The Elephant in the Brain by Kevin Simler and Robin Hanson. Kahneman suggests that our brains are far more susceptible to cognitive errors than we would like to believe. Much of our decision-making isn’t really so much decision-making as it is excuse making, finding ways to give us agency over decisions that were more or less automatic. Additionally, Kahneman shows that we very frequently, and very predictably, make certain cognitive errors that lead us to inaccurate conclusions about the world. Simler and Hanson show that we often deliberately mislead ourselves, choosing to intentionally buy into our minds’ cognitive errors. By deliberately lying to ourselves and choosing to view ourselves and our beliefs through a false objectivity, we can better lie to others, enhancing the way we signal to the world and making ourselves appear more authentic. [Note: some recent evidence has put some findings from Kahneman in doubt, but I think his general argument around cognitive errors still holds.]
Jencks published his book long before Thinking Fast and Slow and The Elephant in the Brain appeared, but I think his observation hints at the findings that Kahneman, Simler, and Hanson would write about in the coming decades. People wanted to hold onto beliefs they possibly knew or suspected to be false. They were disturbed by a reality that did not match the imagined reality in which they wanted to believe. They embraced cognitive errors and adopted beliefs and conclusions based on those cognitive errors. They deceived themselves about reality to better appear to believe the myths they embraced, and in the end they developed a political system where they could signal their virtue by strongly adhering to the initial cognitive errors that sparked the whole process.
Jencks’ quote shows why homelessness is such a tough issue for many of us to face. When we see large numbers of people failing and ending up homeless, it suggests that there is something more than individual shortcomings at work. It suggests that somewhere within society and our social structures are points of failure. It suggests that our institutions, from which we may benefit as individuals, are not serving everyone. This goes against the beliefs that reinforce our self-interest, and it is hard to accept. It is much easier to simply fall back on cognitive illusions and errors and to blame those who have failed. We truly believe that homelessness is the problem of individuals because we are deceiving ourselves, and because it serves our self-interest to do so. When we see the homeless, we see a reality we want to ignore and pretend does not exist, because we fear it and fear that we may be responsible for it in some way. We fear that homelessness will necessitate a change in the social structures and institutions that have helped us get to where we are, and that such changes may make things harder for us or somehow diminish our social status. This is why we are so disturbed by homelessness, why we prefer not to think about it, and why we develop policies based on the assumption that people who end up homeless are deeply flawed individuals responsible for their own situation. It is also likely why we have not done enough to help the homeless, why homelessness is becoming a bigger issue in American cities, and why we have been so bad at addressing its real causes in America. There is some truth to the argument that homelessness is the result of flawed individuals, which is why it is such a strong argument, but we should accept that there are some flawed causal thoughts at play and that it is often in our self-interest to dismiss the homeless as individual failures.

The Fundamental Nature of Cause and Effect

In my undergraduate and graduate studies I had a few statistics classes, and I remember the challenge of learning probability. Probability, odds, and statistics are not always easy to understand and interpret. There are some concepts that are pretty straightforward, and others that seem to contradict what we would expect if we had not gone through the math and studied the concepts in depth. In contrast to the difficult and sometimes counterintuitive nature of statistics, we can think about causality, which is a challenging concept but, unlike statistics, is something we are able to intuit from a very young age.
In The Book of Why Judea Pearl writes, “In both a cognitive and a philosophical sense, the idea of cause and effect is much more fundamental than probability. We begin learning causes and effects before we understand language and before we understand mathematics.”
As Pearl explains, we see causality naturally and experience causality as we move through our lives. From a young child who learns that crying brings attention to a nuclear physicist who learns what happens when two atoms collide at high energy levels, our minds are constantly looking at the world and looking for causes. This understanding begins with observations of the phenomena around us and continues as we predict which outcomes will follow from certain inputs to a system. Eventually, our minds reach a point where we can understand why our predictions are accurate or inaccurate, and we can imagine new ways to bring about certain outcomes. Even if we cannot explain all of this, we can still understand causation at a fundamental and intuitive level.
However, many of us deny that we can see and understand the world in a causal way. I am personally guilty of thinking in a purely statistical way and ignoring the causal. The classes I took in college helped me understand statistics and probability, but also told me not to trust my intuitive causal thinking. Books like Kahneman’s Thinking Fast and Slow cemented this mindset for me. Rationality, we believe, requires that we think statistically and discount our intuitions for fear of bias. Modern science says we can only trust evidence when it is backed by randomized controlled trials and directs us to think of the world through correlations and statistical relationships, not through a lens of causality.
Pearl pushes back against this notion. By arguing that causality is fundamental to the human mind, he implies that our causal reasoning can and should be trusted. Throughout the book he demonstrates that a purely statistical way of thinking leaves us falling short of the knowledge we really need to improve the world. He demonstrates that complex tactics to remove variables from equations in statistical methods are often unnecessary, and that we can accept the results of experiments and interventions even when they are not fully randomized controlled trials. For much of human history our causal thinking has led us astray, but Pearl argues that we have overcorrected in modern statistics and science, and that we need to return to our causal roots to move forward and solve problems that statistics tells us are impossible to solve.

Epistemic Optimists & Pessimists

A little while back I did a mini dive into cognitive psychology and behavioral economics by reading Thinking Fast and Slow by Daniel Kahneman, Nudge by Sunstein and Thaler, Risk Savvy by Gerd Gigerenzer, Vices of the Mind by Quassim Cassam, and The Book of Why by Judea Pearl. Each of these authors asked questions about the ways we think and tried to explain why our thinking so often seems to go awry. Recognizing that it is a useful but insufficient dichotomy, each of these authors can be thought of as either an epistemic optimist or an epistemic pessimist.
In Vices of the Mind Cassam gives us the definitions for epistemic optimists and pessimists. He writes, “Optimism is the view that self-improvement is possible, and that there is often (though not always) something we can do about our epistemic vices, including many of our implicit biases.” The optimist, Cassam argues, believes that we can learn about our mind, our biases, and how our thinking works to make better decisions and improve our beliefs to foster knowledge. Cassam continues, “Pessimism is much more sceptical about the prospects of self-improvement or, at any rate, of lasting self-improvement. … For pessimists, the focus of inquiry shouldn’t be on overcoming our epistemic vices but on outsmarting them, that is, finding ways to work around them so as to reduce their ill effects.” With Cassam’s framework, I think it is possible to look at the ways each author and researcher presents information in their books and to think of them as either optimists or pessimists.
Daniel Kahneman in Thinking Fast and Slow wants to be an optimist, but ultimately is a pessimist. He writes throughout the book that his own knowledge of biases, cognitive illusions, and thinking errors hardly helps him in his own life. He states that what he really hopes his book accomplishes is improved water-cooler talk and a better understanding of how the brain works, not necessarily better decision-making for those who read his book. Similarly, Sunstein and Thaler are pessimists. They clearly believe that we can outsmart our epistemic vices, though not through our own actions but through outside nudges that smarter people and responsible choice architects have designed for us. Neither Kahneman nor Sunstein and Thaler believe we really have any ability to control and change our thinking independently.
Gigerenzer and Pearl are both optimists. While Gigerenzer believes that nudges can be helpful and encourages the development of aids to outsmart our epistemic vices, he also clearly believes that we can overcome them on our own simply through gaining experience and through practice. For Gigerenzer, achieving epistemic virtuosity is possible, even if it isn’t something you explicitly work toward. Pearl focuses on how human beings are able to interpret and understand causal structures in the real world, and breaks from the fashionable viewpoint of most academics in saying that humans are actually very good at understanding, interpreting, and measuring causality. He is an epistemic optimist because he believes, and argues in his book, that we can improve our thinking, improve the ways we approach questions of causality, and improve our knowledge without having to rely on fancy tricks to outsmart epistemic vices. Both authors believe that growth and improved thinking are possible.
Cassam is harder to place, but I think he still is best thought of as an epistemic optimist. He believes that we are blameworthy for our epistemic vices and that they are indeed reprehensible. He also believes that we can improve our thinking and reach a more epistemically virtuous way of thinking if we are deliberate about addressing our epistemic vices. I don’t think that Cassam believes we have to outsmart our epistemic vices, only that we need to be able to recognize them and understand how to get beyond them, and I believe that he would argue that we can do so.
Ultimately, I think that we should learn from Kahneman, Sunstein, and Thaler and be more thoughtful about our nudges as we look for ways to overcome the limitations of our minds. However, I do believe that learning about epistemic vices and taking steps to improve our thinking can help us grow and become more epistemically virtuous. Simple experience, as I think Gigerenzer would argue, will help us improve naturally, and deliberate and calibrated thought, as Pearl might argue, can help us clearly see real and accurate causal structures in the world. I agree with Cassam that we are at least revision-responsible for our epistemic vices, and that we can take steps to get beyond them, improving our thinking and becoming epistemically virtuous. In the end, I don’t think humanity is a helpless pool of irrationality that can only improve its thinking and decision-making through nudges. I think we can, and over time will, improve our statistical thinking and decision-making, and limit cognitive errors and biases, as individuals and as societies (then again, maybe it’s just the morning coffee talking).

Self-Deceptive Rationalization

I don’t like doing online personality quizzes. Part of the reason why I dislike them is because I believe that three of the cognitive errors and biases identified by Daniel Kahneman in his book Thinking Fast and Slow are at play when we take online quizzes.
 
 
First, we are influenced by the availability heuristic. Our perception of how common or how accurate something is can be greatly influenced by whether we have an easy or hard time remembering the thing. This can influence how we answer questions about things we normally prefer or normally like to do. We might be answering based on how quickly we remember something, not on how we actually feel about something.
 
 
Second, we might substitute the questions being asked with easier-to-answer questions. In reality, this is what is happening with the availability heuristic. A difficult self-reflection question might not be answered directly. We might switch the question out and instead answer a simpler one. In the case of the availability heuristic, we are answering based on how easily something came to mind rather than answering the original question, but this substitution can happen outside of the availability heuristic as well. The result is that we are not really measuring what the question purports to measure.
 
 
Third, Kahneman argues that we can think of ourselves as having two selves: an experiencing self that acts and feels in the present moment, and a remembering self that reflects back on previous experiences. The remembering self has different perceptions than the experiencing self, and it doesn’t have an accurate memory of how much we liked or disliked certain experiences. Think about a vacation. You may be feeling burnt out with work and life, and all you want to do, what you would enjoy the most in the world, is to sit on a familiar beach doing absolutely nothing. But your remembering self won’t take any exciting and novel memories from a week sitting on a beach doing nothing. Your remembering self would much rather have you go on an exciting yet stressful vacation to a new foreign country. This tension between your experiencing and remembering selves makes the reliability of online personality quizzes questionable. Your remembering self answers the questions, not your experiencing self, and they don’t always have the same opinions.
 
 
What this means is that the kind of reflection that goes into online personality quizzes, or really any reflective activity, can potentially be self-deceptive. Quassim Cassam writes about these dangers in his book Vices of the Mind. He writes, “there is always the danger that what critical reflection produces is not self-knowledge, but self-deceptive rationalization.” Our biases and cognitive errors can lead us to incorrect answers about ourselves during self-reflection. This process can feel honest and insightful, but it can often be nothing more than a rationalization for behaviors and actions that we want to believe are true about ourselves. The only way through, Cassam explains, is to cultivate real epistemic virtues, to see the world more clearly, and to recognize our epistemic vices so we can become better thinkers.


Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, the best way to go to the grocery store and post office in a single trip, and how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take, while still providing a cushion of time in case something runs long. In making a shopping trip we are confronted with a version of the traveling salesman problem, a famously hard optimization problem tied to the vexing P versus NP question. And the price of bread was once the object of focus for teams of Soviet economists who could not pinpoint the right price for a loaf of bread that would create the right supply to match the population’s demand.
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don’t waste the whole night doing math problems to find the perfect time to set an alarm, don’t miss the entire day trying to calculate the best route to run all our errands, and don’t waste tons of brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing that we ought to leave the house ready for work by a certain time to reduce the risk of being late. We stick to main roads and travel similar routes to get where we need to go, eliminating the thousands of right or left turn alternatives we could choose from. We rely on open markets to determine the price of bread without setting a universal standard.
Rules of thumb are necessary in a complex world, but that doesn’t mean they are without their own downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman from Thinking Fast and Slow, “We are hard-wired to use simple rules of thumb (‘heuristics’) to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors.” Useful, but inadequate, rules of thumb can create predictable and reliable errors or mistakes. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to be open to improvements or alternatives that would make our decision-making better.
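The errand-routing example makes this concrete. Below is a minimal Python sketch (the coordinates are made up for illustration) comparing the quick nearest-neighbor rule of thumb against the brute-force optimal route. The heuristic answers instantly, but it can reliably settle on a longer route, exactly the kind of systematic error Cassam and Kahneman describe.

```python
# Rule of thumb vs. exact reasoning for errand routing: visit every stop
# once, starting from home. The greedy nearest-neighbor heuristic is fast
# but can be systematically worse than the brute-force optimal route.
# All coordinates are hypothetical, chosen only for illustration.
from itertools import permutations
from math import dist

home = (0, 0)
stops = [(2, 9), (3, 1), (9, 2), (5, 5)]  # grocery store, post office, etc.

def route_length(order):
    points = [home, *order]
    return sum(dist(points[i], points[i + 1]) for i in range(len(points) - 1))

def nearest_neighbor(start, remaining):
    """Rule of thumb: always drive to the closest unvisited stop next."""
    route, current = [], start
    remaining = list(remaining)
    while remaining:
        nxt = min(remaining, key=lambda p: dist(current, p))
        route.append(nxt)
        remaining.remove(nxt)
        current = nxt
    return route

greedy = nearest_neighbor(home, stops)
optimal = min(permutations(stops), key=route_length)  # feasible only for tiny n

print(f"greedy route length:  {route_length(greedy):.2f}")
print(f"optimal route length: {route_length(optimal):.2f}")
```

With these particular coordinates the greedy route comes out meaningfully longer than the optimal one, yet checking every permutation becomes hopeless after only a couple dozen stops, which is why the brain's shortcut, errors and all, is usually the right trade.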
What is important to recognize is that systematic and predictable errors from rules of thumb can be corrected. If we know where errors and mistakes are systematically likely to arise, then we can take steps to mitigate and reduce those errors. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of rules of thumb that we know are likely to produce errors, biases, and inaccurate judgements and assumptions. Companies, governments, and markets do this all the time, though not always in a neat step-by-step process (sometimes there is one step forward and two steps backward), leading to progress over time. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive downfalls of heuristics.

Risk and Innovation

To be innovative is to make decisions, develop processes, and create things in new ways that improve over the status quo. Being innovative necessarily means being different, and it requires stepping away from the proven path to do something new or unusual. Risk and innovation are tied together because you cannot venture into something new or stray from the tried and true without the possibility of making a mistake and being wrong. Therefore, appropriately managing and understanding risk is imperative for innovation.

 

In Risk Savvy Gerd Gigerenzer writes, “Risk aversion is closely tied to the anxiety of making errors. If you work in the middle management of a company, your life probably revolves around the fear of doing something wrong and being blamed for it. Such a climate is not a good one for innovation, because originality requires taking risks and making errors along the way. No risks, no errors, no innovation.” Risk aversion is a fundamental aspect of human psychology. Daniel Kahneman in Thinking Fast and Slow shows that we won’t accept a risk unless the potential pay-off is generally about two times greater than the potential loss. We go out of our way to avoid risk, because the prospect of losing something is often more paralyzing than a potential gain is exciting. Individuals and companies who want to be innovative have to find ways around risk aversion in order to create something new.
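As a rough illustration of that roughly two-to-one threshold, here is a minimal sketch using a linear approximation of prospect theory’s value function. The loss-aversion coefficient of 2 is a simplifying assumption standing in for Kahneman and Tversky’s empirical estimates, and the real value function is nonlinear, so treat this only as a feel for the arithmetic.

```python
# Loss-aversion sketch: losses are weighted roughly twice as heavily as
# equivalent gains. LAMBDA = 2.0 is an assumed round-number coefficient,
# and the linear value function is a deliberate simplification.
LAMBDA = 2.0

def subjective_value(gain, loss, p_gain=0.5):
    """Psychological value of a gamble: win `gain` with p_gain, else lose `loss`."""
    return p_gain * gain - (1 - p_gain) * LAMBDA * loss

# A fair coin flip that loses $100 on tails: how big must the win be to feel OK?
for gain in (100, 150, 200, 250):
    v = subjective_value(gain, 100)
    print(f"win ${gain} / lose $100 -> feels like {v:+.0f} "
          f"({'acceptable' if v > 0 else 'rejected'})")
```

Under these assumptions a $100 or $150 win feels like a losing proposition, and the gamble only starts to feel attractive once the pay-off clears about twice the potential loss.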

 

Gigerenzer’s example of middle management is excellent for thinking about innovation and why it is often smaller companies and start-ups that make innovative breakthroughs. It also helps explain why in the United States so many successful and innovative companies are started by immigrants or by the super-wealthy. Large established companies are likely to have employees who have been with the company for a long time and have become more risk averse. They have families, mortgages, and might be unsure they could find an equally attractive job elsewhere. Their incentives for innovation are diminished by their fear of loss if something were to go wrong and the blame were to fall on them. Better to stick with established methods and to maximize according to well-defined job evaluation statistics than to risk trying something new and uncharted. Start-ups, immigrants, and the super-wealthy don’t have the same constraining fears. New companies attract individuals who are less risk averse to begin with, and they don’t have established methods that everyone is comfortable sticking to. Immigrants are less likely to have the kinds of financial commitments that limit a willingness to take risks, and the super-wealthy may have so many resources that the risks they face are small relative to their overall wealth. The middle class, like middle management, is stuck in a position where they feel they have too much to risk in trying to be innovative, and as a result they stick to known and measured paths that ultimately reduce risk and innovation.

A Mixture of Risks

In the book Risk Savvy, Gerd Gigerenzer explains the challenges we have with thinking statistically and how these difficulties can lead to poor decision-making. Humans have trouble holding lots of complex and conflicting information. We don’t do well with decisions involving risk and decisions where we cannot possibly know all the relevant information necessary for the best decision. We prefer to make decisions involving fewer variables, where we can have more certainty about our risks and about the potential outcomes. This leads to the substitution effect that Daniel Kahneman describes in his book Thinking Fast and Slow, where our minds substitute an easier question for the difficult question without us noticing.

 

Unfortunately, this can have bad outcomes for our decision-making. Gigerenzer writes, “few situations in life allow us to calculate risk precisely. In most cases, the risks involved are a mixture of more or less well known ones.” Most of our decisions that involve risk have a mixture of different risks. They are complex decisions with tiers and potential cascades of risk based on the decisions we make along the way. Few of our decisions involve just one risk independent of others that we can know with certainty.

 

If we consider investing for retirement, we can see how complex decisions involving risk can be and how a mixture of risks is present across all the decisions we have to make. We can hoard money in a safe in our house, where we reduce the risk of losing any of it, but we risk being unable to save enough by the time we are ready to retire. We can invest our money, but then have to decide whether to keep it in a bank account, invest it in the stock market, or look to other investment vehicles. Our bank is unlikely to lose much money and is low risk, but it is also unlikely to grow our savings enough for retirement. Investing with a financial advisor takes on more risk, such as the risk that we are being scammed, the risk that the market tanks and our advisor made bad investments on our behalf, and the risk that we won’t have access to our money if we need it quickly in an emergency. What this shows is that even the most certain option for our money, protecting it in a secret safe at home, still carries risks for the future. The option likely to provide the greatest return on our savings, investing in the stock market, comes with a mixture of risks attached to each investment decision we make after the initial decision to invest. There is no way we can calculate and fully comprehend every risk involved with such an investment decision.
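A small simulation can make this mixture visible. The sketch below compares cash in a safe against a volatile stock portfolio over thirty years; the inflation and return figures are illustrative assumptions, not forecasts. The point is only that even the “safe” option carries a risk (inflation quietly eroding purchasing power), while the risky option spreads its risk across a whole distribution of outcomes.

```python
# Mixture-of-risks sketch: cash in a safe has no market risk but loses to
# inflation; stocks may grow but are volatile. All figures below are
# assumed round numbers for illustration, not financial advice.
import random

YEARS, START = 30, 100_000
INFLATION = 0.03                    # assumed steady annual inflation
STOCK_MEAN, STOCK_SD = 0.07, 0.16   # assumed mean return and volatility

def cash_outcome():
    # Nominal value never changes; real purchasing power shrinks every year.
    return START / (1 + INFLATION) ** YEARS

def stock_outcome():
    value = START
    for _ in range(YEARS):
        value *= 1 + random.gauss(STOCK_MEAN, STOCK_SD)
    return value / (1 + INFLATION) ** YEARS  # inflation-adjusted

runs = sorted(stock_outcome() for _ in range(10_000))
print(f"cash, real value after {YEARS} years: ${cash_outcome():,.0f}")
print(f"stocks, median outcome:               ${runs[len(runs) // 2]:,.0f}")
print(f"stocks, worst 5% of outcomes:         ${runs[len(runs) // 20]:,.0f}")
```

Even this toy model leaves out the risks the post mentions that cannot be reduced to a distribution at all, like being scammed or needing the money early, which is exactly Gigerenzer’s point about mixtures of more and less well known risks.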

 

Risk is complex, and we rarely deal with a single decision involving a single calculable risk at one time. Our brains are likely to flatten the decision by substituting simpler decisions, eliminating some of the risks from consideration and helping our mind focus on fewer variables at a time. Nevertheless, the complex mixture of risks doesn’t go away just because our brains pretend it isn’t there.

Navigating Uncertainty with Nudges

In Risk Savvy Gerd Gigerenzer makes a distinction between known risks and uncertainty. In a footnote to a figure, he writes, “In everyday language, we make a distinction between certainty and risk, but the terms risk and uncertainty are used mostly as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.” Gigerenzer’s distinction between risk and uncertainty is important. He demonstrates that people can manage decision-making when making risk-based decisions, but that people need to rely on intuition and good judgement when dealing with uncertainty. One route to improved judgement and intuition is to use nudges.

 

In the book Nudge, Cass Sunstein and Richard Thaler encourage choice architects to design systems and structures that will help individuals make the best decision in a given situation, as defined by the chooser. Much of their argument is supported by research presented by Daniel Kahneman in Thinking Fast and Slow, where Kahneman demonstrates how predictable biases and cognitive errors can lead people to make decisions that they likely wouldn’t make if they had clearer information, could free themselves from irrelevant biases, and could improve their statistical thinking. Gigerenzer’s quote supports Sunstein and Thaler’s nudges by building on the research from Kahneman. Distinguishing between risk and uncertainty helps us understand when to use nudges, and how aggressive our nudges may need to be.

 

Gigerenzer uses casino slot machines as an example of known risk, and uses stocks, romance, earthquakes, business, and health as examples of uncertainty. When we are gambling, we can know the statistical chances that our bets will pay off and calculate optimal strategies (there is a reason the casino dealer stays on 17). We won’t know what the outcome will be ahead of time, but we can precisely define the risk. The same cannot be said for picking the right stocks or the right romantic partner, or for creating business, earthquake-preparedness, or health plans. We may know the five-year rate of return for a company’s stock, the divorce rate in our state, the average frequency and strength of earthquakes in our region, and how long our grandfather lived, but we cannot use this information alone to calculate risk. We don’t know exactly what business trends will arise in the future, we don’t know for sure whether we have a genetic disease that will strike us (or our romantic partner) down sooner than expected, and we can’t say for sure that a 7.0 earthquake is or is not possible next month.
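The dealer’s stand-on-17 rule is a nice example of just how calculable a known risk is. Here is a minimal Monte Carlo sketch, assuming an infinite deck for simplicity (so every draw is independent), that estimates the dealer’s bust probability for different stand-on thresholds; no comparable calculation exists for stocks, romance, or earthquakes.

```python
# Known-risk sketch: under a simplifying infinite-deck assumption, the
# blackjack dealer's bust probability for any stand-on threshold can be
# estimated to arbitrary precision by simulation.
import random

CARDS = [2, 3, 4, 5, 6, 7, 8, 9, 10, 10, 10, 10, 11]  # 10/J/Q/K; ace as 11

def dealer_bust_rate(stand_on, trials=100_000):
    busts = 0
    for _ in range(trials):
        total, aces = 0, 0
        while total < stand_on:
            card = random.choice(CARDS)
            total += card
            if card == 11:
                aces += 1
            while total > 21 and aces:  # demote an ace from 11 to 1
                total -= 10
                aces -= 1
        busts += total > 21
    return busts / trials

for threshold in (16, 17, 18, 19):
    print(f"stand on {threshold}: busts ~{dealer_bust_rate(threshold):.1%} of hands")
```

Raising the stand-on threshold raises the bust rate in a way anyone can compute in advance, which is exactly what makes this a world of risk rather than uncertainty.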

 

But nudges can help us in these decisions. We can use statistical information for business development and international stock returns to identify general rules of thumb when investing. We can listen to parents and elders and learn from their advice and mistakes when selecting a romantic partner, intuiting the traits that make a good (or bad) spouse. We can overengineer our bridges and skyscrapers by 10% to give us a little more assurance that they can survive a major and unexpected earthquake. Nudges are helpful because they can augment our gut instincts and make visible the rules of thumb that we might utilize.

 

Expecting everyone’s individual intuition and heuristics to be up to the task of navigating uncertainty is likely to lead to many poor choices. But if we pool the statistical information available, provide guides, communicate rules of thumb that have panned out for many people, and structure choices in ways that help present this information, then people can likely make marginally better decisions. My suggestion in this post is a nudge to use more nudges in moments of uncertainty. When certainty exists, or even when calculable risks exist, nudges may not be needed. However, once we get beyond calculable risk, where we must rely on judgement and intuition, nudges are important tools to help people navigate uncertainty and improve their decision-making.