Personally and Politically Disturbed by the Homeless

On the first page of the preface of The Homeless, Christopher Jencks writes about the responses that many Americans had to the rise of homelessness in American cities in the 1980s. He writes, “The spread of homelessness disturbed affluent Americans for both personal and political reasons. At a personal level, the faces of the homeless often suggest depths of despair that we would rather not imagine, much less confront in the flesh. … At a political level, the spread of homelessness suggests that something has gone fundamentally wrong with America’s economic or social institutions.”
I think the two books that most accurately describe the way I understand our political and social worlds are Thinking Fast and Slow by Daniel Kahneman and The Elephant in the Brain by Kevin Simler and Robin Hanson. Kahneman suggests that our brains are far more susceptible to cognitive errors than we would like to believe. Much of our decision-making isn’t really decision-making at all so much as excuse-making, finding ways to give ourselves a sense of agency over decisions that were more or less automatic. Additionally, Kahneman shows that we very frequently, and very predictably, make certain cognitive errors that lead us to inaccurate conclusions about the world. Simler and Hanson show that we often deliberately mislead ourselves, intentionally buying into our minds’ cognitive errors. By lying to ourselves and viewing ourselves and our beliefs through a false objectivity, we can better lie to others, enhancing the way we signal to the world and making ourselves appear more authentic. [Note: some recent evidence has put some findings from Kahneman in doubt, but I think his general argument around cognitive errors still holds.]
Jencks published his book long before Thinking Fast and Slow and The Elephant in the Brain, but I think his observation hints at the findings that Kahneman, Simler, and Hanson would all write about in the coming decades. People wanted to hold onto beliefs they possibly knew or suspected to be false. They were disturbed by a reality that did not match the imagined reality in which they wanted to believe. They embraced cognitive errors and adopted beliefs and conclusions based on those errors. They deceived themselves about reality to better appear to believe the myths they embraced, and in the end they developed a political system where they could signal their virtue by strongly adhering to the initial cognitive errors that sparked the whole process.
Jencks’ quote shows why homelessness is such a tough issue for many of us to face. When we see large numbers of people failing and ending up homeless, it suggests that there is something more than individual shortcomings at work. It suggests that somewhere within society and our social structures are points of failure. It suggests that our institutions, from which we may benefit as individuals, are not serving everyone. This goes against the beliefs that reinforce our self-interest, and it is hard to accept. It is much easier to fall back on cognitive illusions and errors and to blame those who have failed. We truly believe that homelessness is the problem of individuals because we are deceiving ourselves, and because it serves our self-interest to do so. When we see the homeless, we see a reality we want to ignore and pretend does not exist, because we fear it and fear that we may be responsible for it in some way. We fear that homelessness will necessitate a change in the social structures and institutions that have helped us get to where we are, and that those changes may make things harder for us or somehow diminish our social status. This is why we are so disturbed by homelessness, why we prefer not to think about it, and why we develop policies based on the assumption that people who end up homeless are deeply flawed individuals responsible for their own situation. It is also likely why we have not done enough to help the homeless, why it is becoming a bigger issue in American cities, and why we have been so bad at addressing the real causes of homelessness in America. There is some truth to the argument that homelessness is the result of flawed individuals, which is why it is such a strong argument, but we should accept that there is some flawed causal thinking at play and that it is often in our self-interest to dismiss the homeless as individual failures.
The Fundamental Nature of Cause and Effect

In my undergraduate and graduate studies I took a few statistics classes, and I remember the challenge of learning probability. Probability, odds, and statistics are not always easy to understand and interpret. Some concepts are pretty straightforward, while others seem to contradict what we would expect if we had not gone through the math and studied the concepts in depth. In contrast to the difficult and sometimes counterintuitive nature of statistics, we can think about causality, which is also a challenging concept but, unlike statistics, is something we are able to intuit from a very young age.
In The Book of Why Judea Pearl writes, “In both a cognitive and a philosophical sense, the idea of cause and effect is much more fundamental than probability. We begin learning causes and effects before we understand language and before we understand mathematics.”
As Pearl explains, we see causality naturally and experience causality as we move through our lives. From a young child who learns that crying brings attention to a nuclear physicist who learns what happens when two atoms collide at high energy levels, our minds are constantly looking at the world and looking for causes. We begin by observing the phenomena around us and continue by predicting what outcomes will follow from certain inputs. Eventually, our minds reach a point where we can understand why our predictions are accurate or inaccurate, and we can imagine new ways to bring about certain outcomes. Even if we cannot explain all of this, we can still understand causation at a fundamental and intuitive level.
However, many of us deny that we can see and understand the world in a causal way. I am personally guilty of thinking in a purely statistical way and ignoring the causal. The classes I took in college helped me understand statistics and probability, but also told me not to trust my intuitive causal thinking. Books like Kahneman’s Thinking Fast and Slow cemented this mindset for me. Rationality, we believe, requires that we think statistically and discount our intuitions for fear of bias. Modern science says we can only trust evidence when it is backed by randomized controlled trials and directs us to think of the world through correlations and statistical relationships, not through a lens of causality.
Pearl pushes back against this notion. By arguing that causality is fundamental to the human mind, he implies that our causal reasoning can and should be trusted. Throughout the book he demonstrates that a purely statistical way of thinking leaves us falling short of the knowledge we really need to improve the world. He demonstrates that complex tactics to remove variables from equations in statistical methods are often unnecessary, and that we can accept the results of experiments and interventions even when they are not fully randomized controlled trials. For much of human history our causal thinking has led us astray, but Pearl argues that we have overcorrected in modern statistics and science, and that we need to return to our causal roots to move forward and solve problems that statistics tells us are impossible to solve.
Epistemic Optimists & Pessimists

A little while back I did a mini dive into cognitive psychology and behavioral economics by reading Thinking Fast and Slow by Daniel Kahneman, Nudge by Sunstein and Thaler, Risk Savvy by Gerd Gigerenzer, Vices of the Mind by Quassim Cassam, and The Book of Why by Judea Pearl. Each of these authors asked questions about the ways we think and tried to explain why our thinking so often seems to go awry. Recognizing that it is a useful but insufficient dichotomy, each of these authors can be thought of as either an epistemic optimist or an epistemic pessimist.
In Vices of the Mind Cassam gives us definitions for epistemic optimists and pessimists. He writes, “Optimism is the view that self-improvement is possible, and that there is often (though not always) something we can do about our epistemic vices, including many of our implicit biases.” The optimist, Cassam argues, believes that we can learn about our minds, our biases, and how our thinking works to make better decisions and improve our beliefs to foster knowledge. Cassam continues, “Pessimism is much more sceptical about the prospects of self-improvement or, at any rate, of lasting self-improvement. … For pessimists, the focus of inquiry shouldn’t be on overcoming our epistemic vices but on outsmarting them, that is, finding ways to work around them so as to reduce their ill effects.” With Cassam’s framework, I think it is possible to look at the way each author and researcher presents information in their books and to think of them as either optimists or pessimists.
Daniel Kahneman in Thinking Fast and Slow wants to be an optimist but ultimately is a pessimist. He writes throughout the book that his own knowledge about biases, cognitive illusions, and thinking errors hardly helps him in his own life. He states that what he really hopes his book accomplishes is improved water-cooler talk and a better understanding of how the brain works, not necessarily better decision-making for those who read it. Similarly, Sunstein and Thaler are pessimists. They clearly believe that we can outsmart our epistemic vices, but not through our own actions; rather, through outside nudges designed for us by smarter people and responsible choice architects. Neither Kahneman nor Sunstein and Thaler believe we really have any ability to control and change our thinking independently.
Gigerenzer and Pearl are both optimists. While Gigerenzer believes that nudges can be helpful and encourages the development of aids to outsmart our epistemic vices, he also clearly believes that we can overcome them on our own simply through experience and practice. For Gigerenzer, achieving epistemic virtuosity is possible, even if it isn’t something you explicitly work toward. Pearl focuses on how human beings are able to interpret and understand causal structures in the real world, and he breaks from the fashionable viewpoint of most academics in saying that humans are actually very good at understanding, interpreting, and measuring causality. He is an epistemic optimist because he believes, and argues in his book, that we can improve our thinking, improve the ways we approach questions of causality, and improve our knowledge without having to rely on fancy tricks to outsmart epistemic vices. Both authors believe that growth and improved thinking are possible.
Cassam is harder to place, but I think he still is best thought of as an epistemic optimist. He believes that we are blameworthy for our epistemic vices and that they are indeed reprehensible. He also believes that we can improve our thinking and reach a more epistemically virtuous way of thinking if we are deliberate about addressing our epistemic vices. I don’t think that Cassam believes we have to outsmart our epistemic vices, only that we need to be able to recognize them and understand how to get beyond them, and I believe that he would argue that we can do so.
Ultimately, I think we should learn from Kahneman, Sunstein, and Thaler and be more thoughtful about the nudges we design as we look for ways to overcome the limitations of our minds. However, I do believe that learning about epistemic vices and taking steps to improve our thinking can help us grow and become more epistemically virtuous. Simple experience, as I think Gigerenzer would argue, will help us improve naturally, and deliberate and calibrated thought, as Pearl might argue, can help us clearly see real and accurate causal structures in the world. I agree with Cassam that we are at least revision responsible for our epistemic vices, and that we can take steps to get beyond them, improving our thinking and becoming epistemically virtuous. In the end, I don’t think humanity is a helpless pool of irrationality that can only improve its thinking and decision-making through nudges. I think we can, and over time will, improve our statistical thinking and decision-making and limit cognitive errors and biases as individuals and as societies (then again, maybe it’s just the morning coffee talking).
Self-Deceptive Rationalization

I don’t like doing online personality quizzes. Part of the reason I dislike them is that I believe three of the cognitive errors and biases identified by Daniel Kahneman in his book Thinking Fast and Slow are at play when we take online quizzes.
 
 
First, we are influenced by the availability heuristic. Our perception of how common or how accurate something is can be greatly influenced by how easy or hard a time we have remembering it. This can influence how we answer questions about things we normally prefer or normally like to do. We might be answering based on how quickly something comes to mind, not on how we actually feel about it.
 
 
Second, we might substitute easier-to-answer questions for the questions actually being asked. In reality, this is what is happening with the availability heuristic: a difficult self-reflection question is not answered directly. We switch the question out and instead answer a simpler one. In the case of the availability heuristic, we are answering how easily something came to mind rather than the original question, but this substitution can happen outside of the availability heuristic as well. The result is that we are not really measuring what the question purports to measure.
 
 
Third, Kahneman argues that we can think of ourselves as having two selves: an experiencing self that acts and feels in the present moment, and a remembering self that reflects back on previous experiences. The remembering self has different perceptions than the experiencing self, and it does not have an accurate memory of how much we liked or disliked certain experiences. Think about a vacation. You may be feeling burnt out with work and life, and all you want to do, what you would enjoy most in the world, is to sit on a familiar beach doing absolutely nothing. But your remembering self won’t take any exciting and novel memories from a week sitting on a beach doing nothing. Your remembering self would much rather have you go on an exciting yet stressful vacation to a new foreign country. This tension between your experiencing and remembering selves makes the reliability of online personality quizzes questionable. Your remembering self answers the questions, not your experiencing self, and they don’t always have the same opinions.
 
 
What this means is that the kind of reflection that goes into online personality quizzes, or really any reflective activity, can potentially be self-deceptive. Quassim Cassam writes about these dangers in his book Vices of the Mind. He writes, “there is always the danger that what critical reflection produces is not self-knowledge, but self-deceptive rationalization.” Our biases and cognitive errors can lead us to incorrect answers about ourselves during self-reflection. This process can feel honest and insightful, but it can often be nothing more than a rationalization of behaviors and actions we want to believe are true about ourselves. The only way through, Cassam explains, is to cultivate real epistemic virtues, to see the world more clearly, and to recognize our epistemic vices so we can become better thinkers.

Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, the best way to visit the grocery store and post office in a single trip, and how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take while still leaving a cushion in case something runs long. In planning a shopping trip we are confronted with a version of the traveling salesman problem, a combinatorial puzzle tied to the famous P versus NP question and one of the most vexing problems in mathematics and computer science. And the price of bread was once the object of focus for teams of Soviet economists who could not pinpoint the right price for a loaf of bread that would create the right supply to match the population’s demand.
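To get a feel for how quickly the errand-routing problem blows up, here is a minimal Python sketch (the stops and travel times are made up for illustration): it brute-forces every possible ordering of a few stops, and the number of orderings grows factorially with the number of stops.

```python
from itertools import permutations
import math

# Hypothetical errand stops and made-up pairwise travel times (minutes).
stops = ["grocery", "post_office", "pharmacy", "bank"]
minutes = {
    ("home", "grocery"): 10, ("home", "post_office"): 7,
    ("home", "pharmacy"): 12, ("home", "bank"): 9,
    ("grocery", "post_office"): 6, ("grocery", "pharmacy"): 4,
    ("grocery", "bank"): 8, ("post_office", "pharmacy"): 5,
    ("post_office", "bank"): 3, ("pharmacy", "bank"): 7,
}

def travel(a, b):
    """Look up the symmetric travel time between two locations."""
    return minutes.get((a, b)) or minutes[(b, a)]

def route_time(order):
    """Total time for a round trip from home through the stops in the given order."""
    path = ["home", *order, "home"]
    return sum(travel(a, b) for a, b in zip(path, path[1:]))

# Brute force: check every possible ordering of the stops.
best = min(permutations(stops), key=route_time)
print(best, route_time(best), "minutes")

# The catch: the number of orderings is n!, so brute force explodes fast.
for n in (4, 8, 12, 16):
    print(f"{n} stops -> {math.factorial(n):,} possible routes")
```

With four stops there are only 24 orderings to check; with sixteen there are already more than twenty trillion, which is why nobody actually plans their errands this way.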
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don’t waste the whole night doing math problems to find the perfect time to set an alarm, don’t lose the entire day trying to calculate the best route to run all our errands, and don’t burn tons of brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing that we ought to leave the house ready for work by a certain time to reduce the risk of being late. We stick to main roads and travel similar routes to get where we need to go, eliminating the thousands of right- or left-turn alternatives we could choose from. We rely on open markets to determine the price of bread without setting a universal standard.
Rules of thumb are necessary in a complex world, but that doesn’t mean they come without downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman from Thinking Fast and Slow, “We are hard-wired to use simple rules of thumb (‘heuristics’) to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors.” Useful but inadequate rules of thumb can create predictable and reliable errors. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to be open to improvements or alternatives that would make our decision-making better.
What is important to recognize is that systematic and predictable errors from rules of thumb can be corrected. If we know where errors and mistakes are systematically likely to arise, then we can take steps to mitigate and reduce them. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of rules of thumb that we know are likely to produce errors, biases, and inaccurate judgements and assumptions. Companies, governments, and markets do this all the time, though not always in a steady, step-by-step way (sometimes it is one step forward and two steps back), leading to progress over time. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive pitfalls of heuristics.
Risk and Innovation

To be innovative is to make decisions, develop processes, and create things in new ways that improve on the status quo. Being innovative necessarily means being different, and requires stepping away from the proven path to do something new or unusual. Risk and innovation are tied together because you cannot venture into something new or stray from the tried and true without the possibility of making a mistake and being wrong. Therefore, appropriately managing and understanding risk is imperative for innovation.

 

In Risk Savvy Gerd Gigerenzer writes, “Risk aversion is closely tied to the anxiety of making errors. If you work in the middle management of a company, your life probably revolves around the fear of doing something wrong and being blamed for it. Such a climate is not a good one for innovation, because originality requires taking risks and making errors along the way. No risks, no errors, no innovation.” Risk aversion is a fundamental aspect of human psychology. Daniel Kahneman in Thinking Fast and Slow shows that we generally won’t accept a gamble unless the potential pay-off is roughly twice the potential loss. We go out of our way to avoid risk, because the prospect of losing something is often paralyzing in a way that outweighs the excitement of a potential gain. Individuals and companies who want to be innovative have to find ways around risk aversion in order to create something new.
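As a rough illustration of that two-to-one threshold, here is a minimal sketch of a loss-averse evaluation of a 50/50 gamble, assuming a simple linear value function with a loss-aversion coefficient of 2 (the exact coefficient varies across studies, and this is not Kahneman’s full prospect theory model):

```python
def subjective_value(gain, loss, p_gain=0.5, loss_aversion=2.0):
    """Loss-averse value of a gamble: losses weigh about twice as much as gains.

    A crude linear version of the loss-aversion idea, for illustration only.
    """
    return p_gain * gain - (1 - p_gain) * loss_aversion * loss

# A 50/50 coin flip that risks $100:
for payoff in (100, 150, 200, 250):
    v = subjective_value(gain=payoff, loss=100)
    verdict = "accept" if v > 0 else "reject"
    print(f"win ${payoff} / lose $100 -> subjective value {v:+.0f} ({verdict})")

# With a loss-aversion coefficient of about 2, the potential win has to be
# roughly twice the potential loss before the gamble starts to feel acceptable.
```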

 

Gigerenzer’s example of middle management is excellent for thinking about innovation and why it is often smaller companies and start-ups that make innovative breakthroughs. It also helps explain why, in the United States, so many successful and innovative companies are started by immigrants or by the super-wealthy. Large established companies are likely to have employees who have been with the company for a long time and have become more risk averse. They have families, mortgages, and might be unsure they could find an equally attractive job elsewhere. Their incentives for innovation are diminished by their fear of loss if something were to go wrong and the blame were to fall on them. Better to stick with established methods and to maximize according to well-defined job evaluation statistics than to risk trying something new and uncharted. Start-ups, immigrants, and the super-wealthy don’t have the same constraining fears. New companies attract individuals who are less risk averse to begin with, and they don’t have established methods that everyone is comfortable sticking to. Immigrants are less likely to have financial resources whose potential loss limits their willingness to take risks, and the super-wealthy may have so many resources that the risks they face are small relative to their overall wealth. The middle class, like middle management, is stuck in a position where they feel they have too much to risk in trying to be innovative, and as a result they stick to known and measured paths that ultimately reduce both risk and innovation.
A Mixture of Risks

In the book Risk Savvy, Gerd Gigerenzer explains the challenges we have with thinking statistically and how these difficulties can lead to poor decision-making. Humans have trouble holding lots of complex and conflicting information in mind. We don’t do well with decisions involving risk or decisions where we cannot possibly know all the relevant information necessary for the best choice. We prefer to make decisions involving fewer variables, where we can have more certainty about our risks and about the potential outcomes. This leads to the substitution effect that Daniel Kahneman describes in his book Thinking Fast and Slow, where our minds swap an easier question for the difficult one without our noticing.

 

Unfortunately, this can have bad outcomes for our decision-making. Gigerenzer writes, “few situations in life allow us to calculate risk precisely. In most cases, the risks involved are a mixture of more or less well known ones.” Most of our decisions that involve risk have a mixture of different risks. They are complex decisions with tiers and potential cascades of risk based on the decisions we make along the way. Few of our decisions involve just one risk independent of others that we can know with certainty.

 

If we consider investing for retirement, we can see how complex decisions involving risk can be and how a mixture of risks runs through all the decisions we have to make. We can hoard money in a safe in our house, where we reduce the risk of losing any of it, but we risk not having enough saved by the time we are ready to retire. We can invest our money, but then we have to decide whether to keep it in a bank account, put it in the stock market, or look to other investment vehicles. Our bank is unlikely to lose much money and is low risk, but it is also unlikely to help us grow our savings enough for retirement. Investing with a financial advisor takes on more risk: the risk that we are being scammed, the risk that the market tanks and our advisor made bad investments on our behalf, and the risk that we won’t have access to our money if we need it quickly in an emergency. What this shows is that even the most certain option for our money, protecting it in a secret safe at home, still carries risks for the future. The option likely to provide the greatest return on our savings, investing in the stock market, carries a mixture of risks with each investment decision we make after the initial decision to invest. There is no way we can calculate and fully comprehend every risk involved with such a decision.

 

Risk is complex, and we rarely deal with a single decision involving a single calculable risk at one time. Our brains are likely to flatten the decision by substituting simpler ones, eliminating some of the risks from consideration and helping our minds focus on fewer variables at a time. Nevertheless, the complex mixture of risks doesn’t go away just because our brains pretend it isn’t there.
Navigating Uncertainty with Nudges

In Risk Savvy Gerd Gigerenzer makes a distinction between known risks and uncertainty. In a footnote to a figure, he writes, “In everyday language, we make a distinction between certainty and risk, but the terms risk and uncertainty are used mostly as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.” Gigerenzer’s distinction between risk and uncertainty is important. He demonstrates that people can manage decision-making when the risks are known, but that people need to rely on intuition and good judgement when dealing with uncertainty. One way to support judgement and intuition is to use nudges.

 

In the book Nudge, Cass Sunstein and Richard Thaler encourage choice architects to design systems and structures that will help individuals make the best decision in a given situation, as defined by the chooser. Much of their argument is supported by research presented by Daniel Kahneman in Thinking Fast and Slow, where Kahneman demonstrates how predictable biases and cognitive errors can lead people to make decisions they likely wouldn’t make if they had clearer information, could free themselves from irrelevant biases, and could improve their statistical thinking. Gigerenzer’s quote supports Sunstein and Thaler’s nudges by building on Kahneman’s research. Distinguishing between risk and uncertainty helps us understand when to use nudges, and how aggressive our nudges may need to be.

 

Gigerenzer uses casino slot machines as an example of risk, and for examples of uncertainty he uses stocks, romance, earthquakes, business, and health. When we are gambling, we can know the statistical chances that our bets will pay off and calculate optimal strategies (there is a reason the casino dealer stands on 17). We won’t know what the outcome will be ahead of time, but we can precisely define the risk. The same cannot be said for picking the right stocks or the right romantic partner, or for creating business, earthquake-preparedness, or health plans. We may know the five-year rate of return for a company’s stock, the divorce rate in our state, the average frequency and strength of earthquakes in our region, and how old our grandfather lived to be, but we cannot use this information alone to calculate risk. We don’t know exactly what business trends will arise in the future, we don’t know for sure whether we have a genetic disease that will strike us (or our romantic partner) down sooner than expected, and we can’t say for sure that a 7.0 earthquake is or is not possible next month.
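The “known risk” side of that distinction really can be settled with a few lines of arithmetic. As a simple sketch, consider a $10 bet on red at American roulette, where every probability is fully known (18 red pockets out of 38):

```python
# Expected value of a $10 bet on red at American roulette: a known risk.
red_pockets, total_pockets = 18, 38   # standard American wheel: 18 red, 18 black, 0 and 00
bet = 10.0
p_win = red_pockets / total_pockets   # ~0.474
expected_value = p_win * bet + (1 - p_win) * (-bet)
print(f"Expected result per ${bet:.0f} bet: ${expected_value:+.2f}")  # about -$0.53

# This is what makes casino games "risk" rather than "uncertainty":
# every probability is known, so the calculation settles the question.
# No such table of probabilities exists for picking a spouse or a stock.
```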

 

But nudges can help us in these decisions. We can use statistical information about business development and international stock returns to identify general rules of thumb for investing. We can listen to parents and elders and learn from their advice and mistakes when selecting a romantic partner, intuiting the traits that make a good (or bad) spouse. We can overengineer our bridges and skyscrapers by 10% to give us a little more assurance that they can survive a major and unexpected earthquake. Nudges are helpful because they can augment our gut instincts and make visible the rules of thumb we might otherwise rely on implicitly.

 

Expecting everyone’s individual intuition and heuristics to be up to the task of navigating uncertainty is likely to lead to many poor choices. But if we pool the statistical information available, provide guides, communicate rules of thumb that have panned out for many people, and structure choices in ways that present this information well, then people can likely make marginally better decisions. My suggestion in this post is a nudge to use more nudges in moments of uncertainty. When certainty exists, or even when calculable risks exist, nudges may not be needed. However, once we get beyond calculable risk, where we must rely on judgement and intuition, nudges are important tools to help people navigate uncertainty and improve their decision-making.
Paternalistic Nudges

In their book Nudge, Cass Sunstein and Richard Thaler argue in favor of libertarian paternalism. Their argument is that our world is complex and interconnected, and it is impossible for people to truly make decisions on their own. Not only is it impossible for people to simply make their own decisions, it is impossible for other people to avoid influencing the decisions of others. Whether we decide to influence a decision in a particular way, or whether we decide to try to avoid any influence on another’s decision, we still shape how decisions are presented, understood, and contextualized. Given this reality, the best alternative is to try to help people make consistently better decisions than they would without aid and assistance.

 

The authors describe libertarian paternalism by writing:

 

“The approach we recommend does count as paternalistic, because private and public choice architects are not merely trying to track or to implement people’s anticipated choices. Rather, they are self-consciously attempting to move people in directions that will make their lives better. They nudge.”

 

The nudge is the key aspect of libertarian paternalism. Forcing people into a single choice, forcing them to accept your advice and perspective, and aggressively trying to change people’s behaviors and opinions don’t fit within the libertarian paternalism framework advocated by Sunstein and Thaler. Instead, a more subtle form of guidance toward good decisions is employed. People retain the widest possible range of choices if they want it; their opinions, decisions, and behaviors may be gently steered, but almost nothing is completely off the table.

 

“A nudge,” Sunstein and Thaler write, “as we will use the term, is any aspect of the choice architecture that alters people’s behavior in a predictable way without forbidding any options or significantly changing their economic incentives.”

 

Daniel Kahneman, in his book Thinking Fast and Slow, demonstrated that people make predictable errors and have predictable biases. If we can understand these thinking errors and biases, then we can identify situations in which they are likely to lead people to make suboptimal decisions. To go a step further, as Sunstein and Thaler would suggest, if we are choice architects, we should design and structure choices in a way that leads people away from predictable cognitive biases and errors. We should design choices in a way that takes those thinking mistakes into consideration and improves the way people understand their choices and options.

 

As a real-world example, if we are structuring a retirement savings plan, we can be relatively sure that people will anchor around the default contribution built into the plan. If we want to encourage greater retirement savings (knowing that economic data indicate people rarely save enough), we can set the default to 8% or higher, knowing that people may reduce the default rate but likely won’t eliminate contributions entirely. Setting a high default is a nudge toward better retirement saving. We could choose not to have a default rate at all, but then people, unsure about what rate to select, might choose a low rate whose returns won’t even keep pace with inflation, or simply not enter a rate at all and contribute nothing to the plan. It is clear that there is a better outcome that we, as choice architects, can help people attain if we understand how their minds work and can apply a subtle nudge.
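To see why the default matters so much, here is a minimal sketch comparing ending balances under different contribution rates; the salary, investment return, and time horizon are made-up assumptions for illustration, not a forecast:

```python
def retirement_balance(salary=60_000, rate=0.08, annual_return=0.06, years=35):
    """Future value of contributing `rate` of salary each year at a fixed return.

    All inputs are hypothetical; real plans involve raises, fees, and market swings.
    """
    balance = 0.0
    for _ in range(years):
        balance = balance * (1 + annual_return) + salary * rate
    return balance

for rate in (0.0, 0.03, 0.08):
    print(f"{rate:.0%} default contribution -> ${retirement_balance(rate=rate):,.0f}")

# Under these assumptions, anchoring people at an 8% default rather than 0%
# changes the ending balance by hundreds of thousands of dollars.
```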
Do People Make the Best Choices?

My wife works with families of children with disabilities, and for several years I worked in the healthcare space. A common idea between our two worlds was that the people being assisted are the experts on their own lives, and they know what is best for them. Parents are the experts on their children, and patients are the experts on their own health. Even if parents don’t know all the intervention strategies to help a child with disabilities, and even if patients don’t have an MD from Stanford, they are still the experts on their own lives and what they and their families need.

 

But is this really true? In recent years there has been a bit of a customer-service pushback in the world of business, a growing recognition that the customer isn’t always right. Additionally, research from the field of cognitive psychology, like much of the research from Daniel Kahneman’s book Thinking Fast and Slow that I wrote about, demonstrates that people can have huge blind spots in their own lives. People cannot always think rationally, in part because their brains are limited in their capacity to handle lots of information and because their brains are tempted to take easy shortcuts in decision-making that don’t always take the true nature of reality into account. Add to Kahneman’s research the ideas put forth by Robin Hanson and Kevin Simler in The Elephant in the Brain, where the authors argue that our minds intentionally hide information from ourselves for political and personal advantage, and we can see that individuals can’t be trusted to always make the best decisions.

 

So while no one else may know a child as well as the child’s parents, and while no one knows your body and health as well as you do, your status as the expert of who you are doesn’t necessarily mean you are in the best position to always make choices and decisions that are in your own best interest. Biases, cognitive errors, and simple self-deception can lead you astray.

 

If you accept that you as an individual, and everyone else individually, cannot be trusted to always make the best choices, then it is reasonable to think that someone else can step in to help improve your decision-making in certain predictable instances where cognitive errors and biases can be anticipated. This is a key idea in the book Nudge by Cass Sunstein and Richard Thaler. In defending their ideas for libertarian paternalism, the authors write, “The false assumption is that almost all people, almost all of the time, make choices that are in their best interest or at the very least are better than the choices that would be made by someone else. We claim that this assumption is false – indeed, obviously false.”

 

In many ways, our country prefers to operate with markets shaping the main decisions and factors of our lives. We like to believe that we make the best choices for our lives, and that aggregating our choices into markets will allow us to minimize the costs of individual errors. The idea is that we will collectively make the right choices, driving society in the right direction and revealing the best option and decision for each individual without deliberate tinkering in the process. However, we have seen that markets don’t encourage us to save as much as we should, and markets can be susceptible to the same cognitive errors and biases that we as individuals all share. Markets, in other words, can be wrong just as we can as individuals.

 

Libertarian paternalism helps overcome the errors of markets by providing nudges to help people make better decisions. Setting up systems and structures that make saving for retirement easier helps correct a market failure. Outsourcing investment strategies, rather than each of us individually making stock trades, helps ensure that shared biases and panics don’t overwhelm the entire stock exchange. The reality is that we as individuals are not rational, but we can develop systems and structures that provide us with nudges to help us act more rationally, overcoming the reality that we don’t always make the choices that are in our best interest.