When to Stop Thinking

My last post was about closed-mindedness and focused on how closed-minded people fail to make the inquiries necessary to gather the information needed to make good decisions and accurately understand the world. What the post didn't ask is when we should stop thinking and make a decision, versus when we should continue our investigations to gain more knowledge. A serious problem, and one we avoid when we are closed-minded, is often referred to as paralysis by analysis. It occurs when we lack confidence in our decision-making and continually seek more information before deciding, potentially delaying the choice, or any action at all, indefinitely.
In Vices of the Mind, Quassim Cassam writes, "our investigations can be open-ended and there is often, though not always, scope for further investigation." Sometimes we are asking questions and doing research on continually evolving topics. Sometimes we are working at a cutting edge where changes in politics, markets, social trends, and scientific breakthroughs can influence what we do from day to day. There is never a final answer, and we have to continually seek new information in order to adapt. However, this doesn't mean that we can't make important decisions that require thoughtful deliberation.
“A good investigator,” Cassam writes, “has a sense of when enough is enough and diminishing returns are setting in. But the decision to call a halt at that point isn’t properly described as closed-minded. What allows us to get on with our lives isn’t closed-mindedness but the ability to judge when no further research into the question at hand is necessary.”
Closed-minded people make decisions while ignoring pertinent information. Open-minded people make decisions while ignoring extraneous information. With enough practice, each of us should improve our judgements and become better at recognizing the diminishing returns of continued research. We might continue to learn a bit more as we keep studying, but the value of each new bit of information will be smaller and smaller, and at some point it won't truly affect our decisions. A novice might have trouble identifying this point, but an expert should be better at it. A closed-minded person doesn't look for this optimal point, but an open-minded person does, continually updating their priors and judgements about when they have enough information to make a decision, rather than rigidly locking in on a specific set of information. This is how we avoid paralysis by analysis and how we improve our decision-making over time so we can, as Cassam writes, get on with our lives.
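To make the idea of diminishing returns concrete, here is a minimal sketch (my own illustration, not anything from Cassam) built on a standard statistical fact: the uncertainty in an estimate shrinks roughly with the square root of the number of observations, so each additional observation buys less than the one before. The noise level and the 1% stopping threshold below are arbitrary assumptions.

```python
import math

def standard_error(sigma: float, n: int) -> float:
    """Uncertainty (standard error) of a sample mean after n observations."""
    return sigma / math.sqrt(n)

# Illustrative numbers (assumptions, not from the book): noise sigma = 10,
# and we decide to stop once one more observation would shrink our
# remaining uncertainty by less than 1%.
sigma = 10.0
threshold = 0.01

n = 1
while True:
    current = standard_error(sigma, n)
    with_one_more = standard_error(sigma, n + 1)
    marginal_gain = (current - with_one_more) / current
    if marginal_gain < threshold:
        break
    n += 1

print(f"Marginal gain per extra observation drops below 1% after n = {n}")
```

A "good investigator" in Cassam's sense is, in effect, someone with a well-calibrated version of this stopping rule: they notice when the next round of research will barely move the estimate and call a halt.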
Rules of Thumb: Helpful, but Systematically Error Producing

The world throws a lot of complex problems at us. Even simple and mundane tasks and decisions hold a lot of complexity behind them. Deciding what time to wake up, finding the best way to hit the grocery store and the post office in a single trip, and judging how much is appropriate to pay for a loaf of bread all have incredibly complex mechanisms behind them. In figuring out when to wake up we have to consider how many hours of sleep we need, what activities we need to do in the morning, and how much time each of those activities will take while still leaving a cushion in case something runs long. In planning a shopping trip we are confronted with a version of the traveling salesman problem, one of the notoriously hard problems tied to the famous P versus NP question. And the price of bread was once the focus of teams of Soviet economists who could not pinpoint a price for a loaf of bread that would bring supply in line with the population's demand.
The brain handles all of these problems with relatively simple heuristics and rules of thumb, simplifying decisions so that we don't waste the whole night doing math to find the perfect time to set an alarm, don't miss the entire day trying to calculate the best route for all our errands, and don't burn tons of brain power trying to set bread prices. We set a standard alarm time and make small adjustments, knowing we ought to be ready to leave the house by a certain time to reduce the risk of being late. We stick to main roads and travel familiar routes to get where we need to go, eliminating the thousands of right or left turn alternatives we could choose from. We rely on open markets to determine the price of bread rather than setting a universal standard.
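As a concrete illustration of the errand-route rule of thumb (my own toy example, not from Cassam or Kahneman), here is a sketch comparing a simple "always drive to the closest remaining stop" heuristic with an exhaustive search over every possible ordering. The stop names and coordinates are made up.

```python
from itertools import permutations
import math

# Hypothetical errand stops as (x, y) coordinates; purely illustrative.
stops = {"home": (0, 0), "post office": (1, 1), "grocery": (5, 4), "bank": (-2, -3.5)}

def dist(a, b):
    return math.dist(stops[a], stops[b])

def route_length(route):
    return sum(dist(route[i], route[i + 1]) for i in range(len(route) - 1))

def nearest_neighbor(start="home"):
    """Rule of thumb: always drive to the closest remaining stop."""
    remaining = set(stops) - {start}
    route = [start]
    while remaining:
        route.append(min(remaining, key=lambda s: dist(route[-1], s)))
        remaining.remove(route[-1])
    return route

def best_route(start="home"):
    """Exhaustive search: check every ordering (only feasible for a few stops)."""
    others = [s for s in stops if s != start]
    return min(([start] + list(p) for p in permutations(others)), key=route_length)

heuristic = nearest_neighbor()
optimal = best_route()
print(f"Heuristic: {heuristic} -> {route_length(heuristic):.1f}")
print(f"Optimal:   {optimal} -> {route_length(optimal):.1f}")
```

With these made-up coordinates the heuristic finishes instantly but produces a route roughly 15 percent longer than the best ordering, which is exactly the trade-off discussed below: fast, usually good enough, and systematically imperfect.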
Rules of thumb are necessary in a complex world, but that doesn't mean they are without their own downfalls. As Quassim Cassam writes in Vices of the Mind, echoing Daniel Kahneman's Thinking Fast and Slow, "We are hard-wired to use simple rules of thumb ('heuristics') to make judgements based on incomplete or ambiguous information, and while these rules of thumb are generally quite useful, they sometimes lead to systematic errors." Useful but inadequate rules of thumb can create predictable, reliable errors. Our thinking can be distracted by meaningless information, we can miss important factors, and we can fail to be open to improvements or alternatives that would make our decision-making better.
What is important to recognize is that the systematic and predictable errors that come from rules of thumb can be corrected. If we know where errors and mistakes are systematically likely to arise, then we can take steps to mitigate and reduce them. We can be confident in rules of thumb and heuristics that simplify decisions in positive ways while being skeptical of those we know are likely to produce errors, biases, and inaccurate judgements and assumptions. Companies, governments, and markets do this all the time, though not always in a neat step-by-step process (sometimes there are steps backward along the way), leading to progress over time. Embracing the usefulness of rules of thumb while acknowledging their shortcomings is a powerful way to improve decision-making while avoiding the cognitive pitfalls of heuristics.

Closed-Mindedness

One of the epistemic vices that Quassim Cassam describes in his book Vices of the Mind is closed-mindedness. An epistemic vice, Cassam explains, is a pattern of thought or a behavior that obstructs knowledge. Epistemic vices systematically get in the way of learning, communicating, or holding on to important and accurate information.
Regarding closed-mindedness, Cassam writes, “in the case of closed-mindedness, one of the motivations is the need for closure, that is, the individual’s desire for a firm answer to a question, any firm answer as compared to confusion and/or ambiguity [Italics indicate quote from A.W. Kruglanski]. This doesn’t seem an inherently bad motive and even has potential benefits. The point at which it becomes problematic is the point at which it gets in the way of knowledge.”
This quote about closed-mindedness reveals a couple of interesting aspects of the way we think and the patterns of thought we adopt. It shows that we can become closed-minded without intending to be closed-minded people. I'm sure very few people think it is a good thing to close ourselves off from new information or from diverse perspectives about how our lives should be. Instead, we seek knowledge, and we prefer feeling that we are correct and that we understand the world we live in. Closed-mindedness is in some ways a by-product of living in a complex world where we have to make decisions under uncertainty. It is uncomfortable to constantly question every decision we make, and it can become paralyzing if we agonize over each decision too much. Simply making a decision and deciding we are correct without revisiting the question is easier, but also characteristically closed-minded.
The second interesting point is that epistemic vices such as closed-mindedness are not always inherently evil. As I wrote in the previous paragraph, closed-mindedness (or at least a shade of it) can help us navigate an uncertain world. It can help us make an initial decision and move on in situations where we otherwise may feel paralyzed. In many instances, like buying socks, there is no real harm in being closed-minded. You might pay more than necessary for fancy socks, but the harm is pretty minimal.
However, closed-mindedness systematically hinders knowledge by making people unreceptive to new information that challenges existing or desired beliefs. It makes people worse at communicating information because the information they pass along may be incomplete or irrelevant. Knowledge is limited by closed-mindedness, and over time this creates the potential for substantial consequences in people's lives. Selecting a poor health insurance plan, starting a war, or spreading harmful chemical pesticides are real-world consequences that have followed from closed-mindedness. Substantial sums of money, people's lives, and people's health and well-being can hang in the balance when closed-mindedness prevents people from making good decisions, regardless of the motives that made someone closed-minded and regardless of whether being closed-minded helped resolve analysis paralysis. Many of the epistemic vices, and the characteristics of epistemic vices, that Cassam describes manifest in our lives in ways similar to closed-mindedness. Reducing such vices, like avoiding closed-mindedness, can help us prevent the serious harms that can accompany the systematic obstruction of knowledge.
Epistemic Vices

Quassim Cassam’s book Vices of the Mind is all about epistemic vices. Epistemic vices are intentional and unintentional habits, behaviors, personality traits, and patterns of thought that hinder knowledge, information sharing, and accurate and adequate understandings of the world around us. Sometimes we intentionally deceive ourselves, sometimes we simply fail to recognize that we don’t have enough data to confidently state our beliefs, and sometimes we are intentionally deceived by others without recognizing it. When we fall into thinking habits and styles that limit our ability to think critically and rationally, we are indulging in epistemic vices, and the results can often be dangerous to ourselves and people impacted by our decisions.
“Knowledge is something that we can acquire, retain, and transmit. Put more simply, it is something that we can gain, keep, and share. So one way to see how epistemic vices get in the way of knowledge is to see how they obstruct the acquisition, retention, and transmission of knowledge,” Cassam writes.
A challenge I have is living comfortably with the knowledge that my understanding of everything is incomplete, that the world is more complex than I can ever fully grasp, and that even at my best I will still not know everything that another person does. This realization is paralyzing for me, and I constantly feel inadequate because of it. However, Cassam's quote offers a hopeful perspective.
Knowledge is something we can always gain, retain, and transmit. We can improve all of those areas, gaining more knowledge, improving our retention and retrieval of knowledge, and doing better to transmit our knowledge. By recognizing and eliminating epistemic vices we can increase the knowledge that we have, use, and share, ultimately boosting our productivity and value to human society. Seeing knowledge as an iceberg that we can only access a tiny fraction of is paralyzing, but recognizing that knowledge is something we can improve our access to and use of is empowering. Cassam’s book is helpful in shining a light on epistemic vices so we can identify them, understand how they obstruct knowledge, and overcome our vices to improve our relationship with knowledge.
On The Opportunity To Profit From Uninformed Patients

The American medical system is in a difficult and dangerous place right now. Healthcare services have become incredibly expensive, and the entire system has become so complex that few people fully understand it and even fewer can successfully navigate it to get appropriate care at a price they can reasonably afford. My experience is that many people don't see value in much of the care they receive or in many of the actors connected with their care. They know they need insurance to afford their care, but they can't really see what value their insurance provides – it often appears to be more of a frustration than something to appreciate. The same can be true of primary care, anesthesiologists, and the variety of healthcare benefits that employers offer their employees. There seem to be lots of people ready to profit from healthcare, but not a lot of people ready to provide real value to the people who need it.
 
These sentiments are all generalizations, and of course many people really do see value in at least some of their healthcare and are grateful for the care they receive. However, the complexity, the lack of transparency, and the ever climbing costs of care have people questioning the entire system, especially at a moral and ethical level. I think a great deal of support for Medicare for All, or universal healthcare coverage, comes from people thinking that profit within medicine may be unethical and from a lack of trust that stems from an inability to see anything other than a profit motive in many healthcare actors and services.
 
Gerd Gigerenzer writes about this idea in his book Risk Savvy. The book isn't about healthcare specifically, but Gigerenzer uses healthcare examples to show the importance of being risk literate in today's complex world. Medical screening in particular is a good space to demonstrate the harms that can come from misinformed patients and doctors. A failure to understand and communicate risk can hurt patients, and it can create perverse incentives for healthcare systems by giving them the opportunity to profit from uninformed patients. Gigerenzer quotes Dr. Otis Brawley, who had been Director of the Georgia Cancer Center at Emory in Atlanta.
 
In Dr. Brawley’s quote, he discusses how Emory could have screened 1,000 men at a mall for prostate cancer and how the hospital could have made $4.9 million in billing for the tests. Additionally the hospital would have profited from future services when men returned for other unrelated healthcare concerns as established patients. In Dr. Brawley’s experience, the hospital could tell him how much they could profit from the tests, but could not tell him whether screening 1,000 men early for prostate cancer would have actually saved any lives among the 1,000 men screened. Dr. Brawley knew that screening many men would lead to false positive tests, and unnecessary stress and further medical diagnostic care for those false positives – again medical care that Emory would profit from. The screenings would also identify men with prostate cancer that was unlikely to impact their future health, but would nevertheless lead to treatment that would make the men impotent or potentially incontinent. The hospital would profit, but their patients would be worse off than if they had not been screened. Dr. Brawley’s experience was that the hospital could identify avenues for profit, but could not identify avenues to provide real value in the healthcare services they offer.
 
Gigerenzer found this deeply troubling. A failure to understand and communicate the risks of prostate cancer (a topic more complex than I can cover here) creates an opportunity for healthcare providers to profit by pushing unnecessary screening and treatment onto patients. Gigerenzer also notes that profiting from uninformed patients is not limited to cancer screening. Doctors who are not risk literate cannot adequately explain the risks and benefits of treatment, and their patients cannot make the best decisions for themselves. This needs to change if hospitals want to keep the trust of their patients and avoid becoming hated entities that fail to demonstrate value. Otherwise, they will go the way of health insurance companies, with frustrated patients wanting to eliminate them altogether.
 
Wrapping up the quote from Dr. Brawley, Gigerenzer writes, "profiting from uninformed patients is unethical. Medicine should not be a money game." I believe Gigerenzer and Dr. Brawley are right, and I think all healthcare actors need to clearly demonstrate their value; otherwise, any profits they earn will make them look like money-first rather than patient-first enterprises, frustrating the public and breeding distrust in the medical field. In the end, this will be harmful for everyone involved. Demonstrating real value in healthcare is crucial, and profiting from uninformed patients diminishes the value provided, hurts trust, and makes the entire healthcare system in our country even worse.

Missing Feedback

I generally think we are overconfident in our opinions. We should all be more skeptical that we are right, that we have made the best possible decisions, and that we truly understand how the world operates. Our worldviews can only be informed by our experiences and by the information we take in about events, phenomena, and stories in the world. We will always be limited because we can’t take in all the information the world has to offer. Additionally, beyond simply not being able to hold all the information possible, we are unable to get the appropriate feedback we need in all situations for comprehensive learning. Some feedback is hazy and some feedback is impossible to receive at all. This means that we cannot be sure that we have made the best choices in our lives, even if things are going well and we are making our best efforts to study the world.

 

In Nudge, Cass Sunstein and Richard Thaler write, "When feedback does not work, we may benefit from a nudge." When we can't get immediate feedback on our choices and decisions, or when the feedback we get is unclear, we can't adjust appropriately for future decisions. We can't learn, we can't improve, and we can't make better choices when we return to the same decision. However, we can observe where situations of poor feedback exist, and we can design those decision spaces to provide subtle nudges that help people choose well in the absence of feedback. A visual aid showing how much money people need for retirement and how much they can expect to have based on their current savings rate is a helpful nudge in a situation where we don't get feedback on how well we are saving. There are devices that glow red or green based on a home's current energy usage, providing a subtle nudge not to run appliances at peak demand times and giving people feedback on energy use they normally wouldn't receive. Nudges such as these can provide feedback, or provide helpful information where feedback is missing.
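For a sense of what the retirement-savings nudge might compute behind the scenes, here is a minimal sketch. Every figure in it (starting balance, monthly contribution, return, goal) is a made-up assumption for illustration, not financial advice and not from Nudge.

```python
# A minimal sketch of the kind of projection such a nudge might display.
# Every number here is a made-up assumption for illustration.
current_balance = 20_000      # assumed current savings
monthly_saving = 400          # assumed monthly contribution
annual_return = 0.05          # assumed average annual return
years_to_retirement = 30
target = 600_000              # assumed retirement goal

balance = current_balance
monthly_rate = annual_return / 12
for month in range(years_to_retirement * 12):
    balance = balance * (1 + monthly_rate) + monthly_saving

print(f"Projected balance in {years_to_retirement} years: ${balance:,.0f}")
print(f"Share of the ${target:,} goal: {balance / target:.0%}")
```

Turning an invisible, decades-long feedback loop into a single projected number is exactly the kind of substitute feedback the authors describe.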

 

Sunstein and Thaler also write, "many of life's choices are like practicing putting without being able to see where the balls end up, and for one simple reason: the situation is not structured to provide good feedback. For example, we usually get feedback only on the options we select, not the ones we reject." Missing feedback is an important consideration because its absence shapes how we understand the world and how we make decisions. The fact that we cannot get feedback on options we never chose should be nearly paralyzing. We can't say how the world works if we never experiment and try something different. We can settle into a decent rhythm and routine, but we may be missing out on better lifestyles, happier lives, or better societies that different choices would have produced. Yet we can never receive feedback on these non-choices. I don't know that this means we should constantly experiment instead of settling in with the feedback we can receive, but I do think it means we should discount our own confidence and accept that we don't know all there is to know. I also think it means we should look to increase nudges, use more visual aids, and structure our choices and decisions in ways that maximize useful feedback and improve learning for future decision-making.
Hindsight Bias and Misleading Headlines

I absolutely hate internet ads with headlines along the lines of "Analyst Who Predicted Stock Market Crash Makes New Prediction." These headlines are nothing but clickbait, and reading Daniel Kahneman's book Thinking Fast and Slow has given me even more reason to hate them. They play on cognitive errors in our thinking, particularly hindsight bias. When we look back at previous choices, decisions, and important events, whether in our individual lives or across the globe, our present state of being always seems inevitable. It was clear that the internet would lead to major social network platforms, and that those platforms would then contribute to major challenges and problems with misinformation; how could anyone have failed to see this as far back as 2004?

 

The problem, of course, is that the present moment that now seems inevitable, and the pathway that seems so obvious in retrospect, were never clear at all. There was no way, living in 2006, to predict the major housing bubble and financial collapse of 2008. Headlines introducing some genius who saw what the rest of us couldn't see before the Great Recession, and claiming that this person has now made another prediction, pull at our emotions and play on hindsight bias in a way that is deliberately misleading. The fact that someone made an unlikely prediction that came true is not a reason to believe they will be correct again in the future. If anything, we should expect some version of regression to the mean in their predictions and assume that their next grand claim is wrong.
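A quick simulation makes the regression-to-the-mean point concrete. This is my own toy model with arbitrary numbers: suppose thousands of forecasters are effectively guessing about a rare event, and then watch how the lucky few perform on their next prediction.

```python
import random

# Toy simulation (assumed numbers): many forecasters make pure guesses about
# a rare event, and we check how the "successful" ones do on their next call.
random.seed(42)
forecasters = 10_000
hit_rate = 0.05          # assumed chance a pure guess happens to be right

first_call = [random.random() < hit_rate for _ in range(forecasters)]
lucky = [i for i, hit in enumerate(first_call) if hit]

second_call_hits = sum(random.random() < hit_rate for _ in lucky)

print(f"Forecasters who 'predicted' the crash: {len(lucky)}")
print(f"Of those, right again next time: {second_call_hits} "
      f"({second_call_hits / len(lucky):.0%}), no better than the base rate")
```

Roughly five percent of pure guessers will "call" the crash, and roughly five percent of that lucky group will be right again, no better than anyone else, which is exactly why a past hit is not evidence about the next grand claim.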

 

Rather than letting hindsight bias be used to lure people into clicking on bogus news stories, we should be more cautious about it and about our proclivity for inaccurate heuristics. As Kahneman writes, "Hindsight is especially unkind to decision makers who act as agents for others—physicians, financial advisers, third-base coaches, CEOs, social workers, diplomats, politicians. We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact. There is a clear outcome bias. When the outcomes are bad, the clients often blame their agents for not seeing the handwriting on the wall—forgetting that it was written in invisible ink that became legible only afterward. Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight."

 

Our key decision-makers can be punished by our hindsight bias. It can cloud our judgment about what we should expect in the future, lead us to trust individuals who don't deserve trust, and lead us to mistrust those who are making the best possible decisions under serious constraints. Hindsight bias deserves more recognition and respect than being used as bait for misleading headlines.
Understanding the Past

I am always fascinated by the idea, which keeps proving itself in my own life, that the more we learn about something, the more we realize how little we actually know about it. I am currently reading Yuval Noah Harari's book Sapiens: A Brief History of Humankind, and I am continually struck by how often Harari brings up events from mankind's history that I had never heard about. The more I learn about the past, or about any given subject, the more I realize how little knowledge I have ever had, and how limited, narrow, and sometimes just flat-out inaccurate my understanding has been.

 

This is particularly important when it comes to how we think about the past. I believe very strongly that our reality and the worlds we inhabit are mostly social constructions. The trees, houses, and roads are all real, but how we understand these physical objects, the spaces we operate in, and the uses we make of the material things in our world is shaped to an incredible degree by social constructions and by the relationships we build between ourselves and the world we inhabit. In order to understand these constructions, and in order to shape them for a future we want to live in (and are physiologically capable of living in), we need to understand the past and make predictions about the future with new social constructs that enable continued human flourishing.

 

To some extent, this feels easy and natural to us. We all have a story, and we learn and adopt family stories, national stories, and global stories about the grand arc of humanity. But while our stories seem to be shared, and while we seem to know where we are heading, we each operate on an individual understanding of the past and of where that means we are (or should be) heading. As Daniel Kahneman writes in his book Thinking Fast and Slow, "we believe we understand the past, which implies that the future also should be knowable, but in fact we understand the past less than we believe we do."

 

As I laid out to begin this post, there is always so much more complexity and nuance to anything that we might study and be familiar with than we often realize. We can feel that we know something well when we are ignorant of the nuance and complexity. When we start to really untangle something, whether it be nuclear physics, the history of the American Confederacy, or how our fruits and veggies get to the supermarket, we realize that we really don’t know and understand anything as well as we might intuitively believe.

 

When we lack a deep and complex understanding of the past, whether because we simply don't know about something or because we never got an accurate and detailed account of it, we are likely to misinterpret and misunderstand how we got to our current point. With a limited historical perspective and understanding, we will incorrectly assess where our best future lies. It is important that we recognize how limited our knowledge is, and remember that these limits shape the extent to which we can make valid predictions about the future.
Substitution Heuristics

I think heuristics are underrated. As a society we should discuss them far more than we do. We barely acknowledge heuristics, but if we look closely, they are at the heart of many of our decisions, beliefs, and assumptions. They save us a lot of work and help us move through the world pretty smoothly, yet they are rarely discussed directly or even recognized.

 

In Thinking Fast and Slow, Daniel Kahneman highlights the substitution at the core of many heuristics, distinguishing two kinds of questions:

 

“The target question is the assessment you intended to produce.
The heuristic question is the simpler question that you answered instead.”

 

I have already written about our brains substituting easier questions for harder questions, but the idea of heuristics gives the process a deeper dimension. Kahneman defines a heuristic this way: "The technical definition of heuristic is a simple procedure that helps find adequate, though often imperfect, answers to difficult questions."

 

In my own life, and I imagine I am a relatively average case, I have relied on heuristics to help me make a huge number of decisions. I don't know the best possible investment strategy for my future retirement, but as a heuristic, I know that working with an investment advisor to manage mutual funds and IRAs can be an adequate (even if not perfect) way to save for the future. I don't know the healthiest possible foods to eat or which food combinations would maximize my nutrient intake, but as a heuristic I can build a colorful plate with varied veggies and not too many sweets to get enough of the vitamins and nutrients that I need.

 

We have to make a lot of difficult decisions in our lives. Most of us don’t have the time or the ability to compile all the information we need on a given subject to make a fully informed decision, and even if we try, most of us don’t have a reasonable way to sort through contrasting and competing information to determine what is true and what the best course of action would be. Instead, we make substitutions and use heuristics to figure out what we should do. Instead of recognizing that we are using heuristics, however, we ascribe a higher level of confidence and certainty to our decisions than is warranted. What we do, how we live, and what we believe become part of our identity, and we fail to recognize that we are adopting a heuristic to achieve some version of what we believe to be a good life. When pressed to think about it, our mind creates a justification for our decision that doesn’t acknowledge the heuristics in play.

 

In a world where we were quicker to recognize heuristics, we might be able to live with a little distance between ourselves, our decisions, and our beliefs. We could acknowledge that heuristics are driving us and be more open to change and more willing to be flexible with others. Acknowledging that we don't have all the answers (that we don't even have all the necessary information) and are operating on substitution heuristics for complex questions might help us be less polarized and better connected within our society.
What You See Is All There Is

In Thinking Fast and Slow, Daniel Kahneman gives us the somewhat unwieldy acronym WYSIATI – what you see is all there is. The acronym describes a phenomenon that stems from how our brains work. System 1, the name Kahneman gives to the part of our brain that is automatic, quick, and associative, can only take in so much information. It makes quick inferences about the world around it and establishes a simple picture of the world for System 2, the thoughtful, calculating part of our brain, to work with.

 

What you see is all there is means that we are limited by the observations and information System 1 can take in. It doesn't matter how good System 2 is at processing information and drawing deep insights about the world if System 1 is passing along poor information. Garbage in, garbage out, as the computer science majors like to say.

 

Kahneman explains in detail what this means for our day-to-day lives. He writes, "As the WYSIATI rule implies, neither the quantity nor the quality of the evidence counts for much in subjective confidence. The confidence that individuals have in their beliefs depends mostly on the quality of the story they can tell about what they see, even if they see little."

 

System 2 doesn’t recognize that System 1 hands it incomplete and limited information. It chugs along believing that the information handed off by System 1 is everything that it needs to know. It doesn’t ask for more information, it just accepts that it has been handed a complete data set and begins to work. System 2 creates a solid narrative out of whatever information System 1 gives it, and only momentarily pauses if it notices an inconsistency in the story it is stitching together about the world. If it can make a coherent narrative, then it is happy and doesn’t find a need to look for additional information. What you see is all there is, there isn’t anything missing.

 

But we know that we only take in a limited slice of the world. We can't sense the Earth's magnetic field, we can't see in ultraviolet or infrared, and we have no way of knowing what is really happening in another person's mind. When we read a long paper or finish a college course, we will remember some of it, but not everything. Our minds can only hold so much information, and System 2 is limited to what can be observed and retained. This should be a huge problem for our brains: we should recognize enormous blind spots and be paralyzed into inaction by the lack of information. But that isn't what happens. We don't even notice the blind spots; instead we make a story from the information we collect, building a complete world that makes sense of that information, no matter how limited it is. What you see is all there is. We make the world work, but we do so with only a portion of what is really out there, and we don't even notice that we do.