Scrutinizing Causal Assumptions

Recently I have been writing about my biggest takeaway from The Book of Why by Judea Pearl. The book is more technical than I could fully grasp, since it is written for a primarily academic audience with some knowledge of the fields Pearl dives into, but I still gained some insights from it. In particular, Pearl’s idea that humans are better causal thinkers than we typically give ourselves credit for was a big lesson for me. In thinking back on the book, I have been trying to recognize our powerful causal intuitions and to understand when our causal thinking can be trusted. Still, it feels dangerous to indulge our natural causal-thinking tendencies.
However, Pearl offers guidance on how and when we can trust our causal instincts. He writes, “causal assumptions cannot be invented at our whim; they are subject to the scrutiny of data and can be falsified.”
Our ability to imagine different future states and to understand causality at an instinctual level has allowed our species to move from hunter-gatherer bands to massive cities connected by electricity and Wi-Fi. However, our collective minds have also drawn causal connections between unfortunate events and imagined demons. Dictators have used implausible causal connections to justify eugenics and genocide, and to this day society is hampered by conspiracy theories that posit improbable causal links between disparate events.
The important thing to note, as Pearl demonstrates, is that causal assumptions can be falsified and must be supported with data. Supernatural demons cannot be falsified, and wild conspiracy theories often lack any supporting data or evidence. We can intuit causal relations, but we must be able to test them in situations that would falsify our assumptions if we are to truly believe them. Pearl doesn’t simply argue that we are good causal thinkers whose natural causal assumptions should be blindly trusted. Instead, he suggests that we lean into our causal faculties and test causal relationships and assumptions that are falsifiable and can be either supported or disproven by data. Statistics still has a role in this world, but importantly, we are not looking at the data without making causal assumptions. We are making predictions and determining whether the data falsifies those predictions.
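To make the falsification idea concrete, here is a minimal sketch in Python. It is my own illustration, not an example from Pearl’s book, and the assumed chain (X causes Z, which causes Y), the simulated data, and the simple linear test are all hypothetical choices. The point is only that a causal assumption implies a prediction about data, here that X and Y should be uncorrelated once we account for Z, and data can falsify that prediction.

```python
# Hypothetical example: a causal assumption makes a testable prediction.
# The assumed chain X -> Z -> Y implies X and Y are uncorrelated given Z.
import numpy as np

rng = np.random.default_rng(seed=0)
n = 50_000

def partial_corr(x, y, z):
    # Residualize x and y on z with a linear fit, then correlate the residuals.
    rx = x - np.polyval(np.polyfit(z, x, 1), z)
    ry = y - np.polyval(np.polyfit(z, y, 1), z)
    return np.corrcoef(rx, ry)[0, 1]

x = rng.normal(size=n)
z = 2.0 * x + rng.normal(size=n)
y_chain = 3.0 * z + rng.normal(size=n)             # world where the assumed chain holds
y_direct = 3.0 * z + 1.5 * x + rng.normal(size=n)  # world with an extra direct X -> Y effect

print(f"chain world:  {partial_corr(x, y_chain, z):+.3f}")   # near 0: the prediction survives
print(f"direct world: {partial_corr(x, y_direct, z):+.3f}")  # clearly nonzero: assumption falsified
```

The assumption survives in the first simulated world and is falsified in the second. A purely statistical summary of either dataset, with no causal assumption attached, would make no prediction and so would have nothing to falsify.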
The Fundamental Nature of Cause and Effect

In my undergraduate and graduate studies I took a few statistics classes, and I remember the challenge of learning probability. Probability, odds, and statistics are not always easy to understand and interpret. Some concepts are pretty straightforward, while others seem to contradict what we would expect if we had not gone through the math and studied the concepts in depth. In contrast to the difficult and sometimes counterintuitive nature of statistics, we can think about causality, which is a challenging concept but, unlike statistics, is something we are able to intuit from a very young age.
In The Book of Why Judea Pearl writes, “In both a cognitive and a philosophical sense, the idea of cause and effect is much more fundamental than probability. We begin learning causes and effects before we understand language and before we understand mathematics.”
As Pearl explains, we see causality naturally and experience it as we move through our lives. From a young child who learns that crying brings attention to a nuclear physicist who learns what happens when two atoms collide at high energy, our minds are constantly scanning the world for causes. The process begins with observing the phenomena around us and continues as we predict what outcomes will follow from certain inputs to a system. Eventually, our minds reach a point where we can understand why our predictions are accurate or inaccurate, and we can imagine new ways to bring about certain outcomes. Even if we cannot explain all of this, we can still understand causation at a fundamental and intuitive level.
However, many of us deny that we can see and understand the world in a causal way. I am personally guilty of thinking in a purely statistical way and ignoring the causal. The classes I took in college helped me understand statistics and probability, but also told me not to trust my intuitive causal thinking. Books like Kahneman’s Thinking Fast and Slow cemented this mindset for me. Rationality, we believe, requires that we think statistically and discount our intuitions for fear of bias. Modern science says we can only trust evidence when it is backed by randomized controlled trials and directs us to think of the world through correlations and statistical relationships, not through a lens of causality.
Pearl pushes back against this notion. By arguing that causality is fundamental to the human mind, he implies that our causal reasoning can and should be trusted. Throughout the book he demonstrates that a purely statistical way of thinking leaves us short of the knowledge we really need to improve the world. He shows that the complex tactics statistical methods use to remove variables from equations are often unnecessary, and that we can accept the results of experiments and interventions even when they are not fully randomized controlled trials. For much of human history our causal-thinking nature has led us astray, but Pearl argues that modern statistics and science have overcorrected, and that we need to return to our causal roots to move forward and solve problems that statistics tells us are impossible to solve.
A Leader’s Toolbox

In the book Risk Savvy, Gerd Gigerenzer describes the work of top executives within companies as inherently intuitive. Executives and managers within high-performing companies are constantly pressed for time. There are more decisions, more incoming items that need attention, and more things to work on than any executive or manager can adequately handle on their own. Consequently, delegation is necessary, as is quick decision-making based on intuition. “Senior managers routinely need to make decisions or delegate decisions in an instant after brief consultation and under high uncertainty,” writes Gigerenzer. This combination of quick decision-making under uncertainty is where intuition comes into play, and the ability to navigate these situations is what truly comprises the leader’s toolbox.
Gigerenzer stresses that the intuitions developed by top managers and executives are not arbitrary. Successful managers and companies tend to develop similar toolboxes that help encourage trust and innovation. While many individual decisions are intuitive, the structure of the leader’s toolbox often becomes visible and intentional. As an example, Gigerenzer highlights a line of thinking he uncovered when working on a previous book. He writes, “hire well and let them do their jobs reflects a vision of an institution where quality control (hire well) goes together with a climate of trust (let them do their jobs) needed for cutting-edge innovation.”
In many companies and industries, the work to be done is incredibly complex, and no single individual can manage every decision. The decision-making process needs to be decentralized for the individual units of the team to work effectively and efficiently. Hiring talented individuals and providing them with the autonomy and tools necessary to be successful is the best approach to getting the right work done well.
Gigerenzer continues, “Good leadership consists of a toolbox full of rules of thumb and the intuitive ability to quickly see which rule is appropriate in which context.”
A leader’s toolbox doesn’t consist of specific lists of what to do in certain situations, or even of specific skills that are easy to check off on a resume. It is built from experience in a diverse range of settings and from intuitions about things as varied as hiring, teamwork, and delegation. Because innovation is always uncertain and always includes risk, leaders must develop intuitive skills and be able to make quick and accurate judgements about how best to handle new challenges and obstacles. Intuition and gut decisions are an essential part of leadership today, even if we don’t like to admit that we make important decisions on intuition.
Gut Decisions

“Although about half of professional decisions in large companies are gut decisions, it would probably not go over well if a manager publicly admitted, I had a hunch. In our society, intuition is suspicious. For that reason, managers typically hide their intuitions or have even stopped listening to them,” Gerd Gigerenzer writes in Risk Savvy.
The human mind evolved first in small tribal bands trying to survive in a dangerous world. As our tribes grew, our minds became more politically savvy, learning to intuitively hide our selfish ambitions and appear honest and altruistic. This pushed our brains toward more complex activity that took place outside our direct consciousness, hiding in gut feelings and intuitions. Today, however, we don’t trust those intuitions and gut decisions, even though they never left us.
We do have good reason to discount intuitions. Our minds did not evolve to serve us perfectly in a complex, data-rich world full of uncertainty. Our brains are plagued by motivated reasoning, biases, and cognitive limitations. Making gut decisions can leave us vulnerable to these mental pitfalls, which is why we have learned to distrust our intuitions.
However, this doesn’t mean we have escaped gut decisions. Gerd Gigerenzer thinks that is actually a good thing, especially if we have developed years of insight and expertise through practice and real-life training. Gigerenzer argues that we still make many gut decisions in areas as diverse as vacation planning, daily exercise, and corporate strategy. We just don’t admit we are making decisions based on intuition rather than careful statistical analysis. Taking it a step further, Gigerenzer suggests that most of the time we “make a decision at a gut level, and produce reasons after the fact.” We rationalize and use motivated reasoning to explain why we made a decision, deceiving ourselves into believing that we always intended to do the rational calculation first and that we hadn’t really made up our mind until after we had done so.
Gigerenzer suggests that we acknowledge our gut decisions. Ignoring them and pretending they are not influential wastes our time and costs us money. An executive may have an intuitive sense of what to do about a business decision but be reluctant to say the decision was based on intuition. Instead, they spend time doing an analysis that didn’t need to be done. They create reasons to support the decision after the fact, again wasting time and energy that could go into implementing the decision that has already been made. Or an executive may bring in a consulting firm, hoping the firm will come up with the same answer they got from their gut. Time and money are both wasted, and the decision-making and action-taking structures of the individual and the organization are gummed up unnecessarily. Acknowledging gut decisions and moving forward more quickly, Gigerenzer seems to suggest, is better than rationalizing and finding support for them after the fact.
Informed Bets

My last post was about the limitations of the human mind and why we should be willing to doubt our conclusions and beliefs. This post contrasts with my last post, arguing that we can trust the informed bets that our brains make. Our brains and bodies cannot capture all of the information necessary to perfectly replicate reality in our minds, but they do a good job of putting information together in a way that helps us successfully navigate the world and our lives. Informed guesses, that is, assumptions and intuitions based on experience and expertise rather than random, amateurish judgements, are actually very useful and often good approximations.
“Intelligence…” Gerd Gigerenzer writes in his book Risk Savvy, “is the art of making informed guesses.” Our brains make a lot of predictions and rely on heuristics, assumptions, and guesses to get by. It turns out that our brains do this well, as Gigerenzer argues in his book. We don’t need to pull out graph paper and a scientific calculator to catch a football. We don’t need to record every thought and action from the last month to know whether we are happy with our New Year’s resolutions and can keep them going. When we see someone standing in a long customer service line at the grocery store, we don’t need to approach them with a 100-point questionnaire to know whether they are bored or upset. Informed bets and reasonable guesses are sufficient for a decent, functional understanding of the world.
Gigerenzer continues, “Intelligence means going beyond the information given and making informed bets on what’s outside.” This quote is introduced alongside an optical illusion: a grayscale checkerboard with a figure casting a shadow across the board. Two squares on the board are the same shade of gray, yet our minds see them as different colors. Our minds are going beyond the information given, the literal wavelength of light reaching the back of our eyes, and making informed bets on what the relative colors of the squares would be if no figure were casting a shadow. In the case of the illusion, our brain’s guess about reality is actually more helpful to us than the literal fact that the two squares are the same color.
Bounded rationality is a serious concern. We cannot absorb all the information in the world that might help us make better decisions. However, humans are intelligent. We can use the information we do receive to make informed bets about the best choices available. We might not be perfect, but by making informed bets and educated guesses we can come to understand the world and create systems and structures that improve our understanding over time.
Unconscious Rules of Thumb

Some of the decisions I make are based on thorough calculation, analysis, evaluation of available options, and deliberate consideration of costs and benefits. When I am planning my workout routine, I think hard about how my legs have been feeling and what distance, elevation, and pace are reasonable for my upcoming workouts. I think about how early I need to be out the door for a certain distance, and whether I can run someplace new to mix things up. I’ll map out routes, look at my training log for the last few weeks, and try to put together a plan that maximizes my enjoyment, physical health, and fitness given my time constraints.
However, outside of running, most of my decisions are based on rules of thumb and don’t receive the same level of attention as my running plans. I budget every two weeks around payday, but even when budgeting, I mostly rely on rules of thumb. There is a certain amount I like to keep in my checking account in case I forgot a bill or something pops up at the last minute. It’s not a deliberate calculation; it is more of a gut feeling. The same goes for how much money I set aside for free spending, or for whether I feel it is finally time to get that thing I have had my eye on for a while. My budget is probably more important than my running routine, but I actually spend more time rationally developing a running plan than I spend budgeting. The same goes for house and vehicle maintenance, spending time with friends and family, and choosing what to eat on the days we plan to do take-out.
The budget example is interesting because I am consciously and deliberately using rules of thumb to determine how my wife and I will use our money. I set aside a certain amount for gas without checking each vehicle to see whether we will need to fill up soon. I am aware of these rules of thumb; they are literally built into my spreadsheet, where I sometimes ask whether I should deviate from them but usually decide to stick with them.
I also recognize that I have many unconscious rules of thumb. In his book Risk Savvy, Gerd Gigerenzer writes the following about them:
“Every rule of thumb I am aware of can be used consciously and unconsciously. If it is used unconsciously, the resulting judgment is called intuitive. An intuition, or gut feeling, is a judgment:
  1. that appears quickly in consciousness,
  2. whose underlying reasons we are not fully aware of, yet
  3. is strong enough to act upon.”
I have lots of intuitive judgements that I don’t think about in the moment and only recognize when I reflect on how I do something. When I am driving down the freeway, cooking, or writing a blog post, many of my decisions flow naturally and quickly. In the moment the decisions seem obvious, and I don’t have to think deliberately about my actions or why I am making a specific choice. But if I were asked to explain a decision, I would have a hard time finding exact reasons. I don’t know exactly how I know to change lanes at a certain point on the freeway, but I can often anticipate points where traffic will slow down and where I might be better off in another lane. I can’t tell you why I chose to add the Marsala wine to the mushrooms at the precise moment that I did. I also couldn’t explain why I chose to present a certain quote at the beginning of a post rather than in the middle. My answer in all of these situations would simply be that it felt right.
We use unconscious rules of thumb like these all the time, but we don’t often notice when we do. When we are budgeting we might recognize our rules of thumb and be able to explain them, but our unconscious rules of thumb are harder to identify and explain. Nevertheless, they still have major impacts on our lives. Just because we don’t notice them and can’t explain them doesn’t mean they don’t shape a lot of our decisions or don’t matter. The intuitions we have can be powerful and helpful, but they can also be wrong (maybe all this time I’ve been overcooking the mushrooms and should add the wine sooner!). Because these intuitions are unconscious, we don’t deliberately question them unless something brings them up to the conscious level. The feedback we get is probably indirect, meaning that we won’t consciously tie our outcomes to the unconscious rules of thumb that produced them.
I am fascinated by things like unconscious rules of thumb because they reveal how little we actually control in our lives. We are the ones who act on these unconscious rules of thumb, but in a sense we are not really doing anything at all. We are making decisions based on factors we don’t understand and might not even be aware of. We have agency by being the one with the intuition, but we also lack agency by not being fully conscious of the how and why behind our own decisions. This should make us question ourselves and our choices more than we typically do.
Intuitive Predictions and Intensity Matching

“Intuitive predictions are almost completely insensitive to the actual predictive quality of the evidence,” writes Daniel Kahneman in Thinking Fast and Slow. A lot of our thinking takes place in the part of our brain that is good at making quick connections, detecting patterns, and forming fast judgments. The deeper, more thoughtful part of our brain only engages with the world when it really needs to, when we must do some critical thinking to sort out a math problem, write a blog post, or figure out how to grind grains down to make bread. The result is that much of our thinking happens at a quick and intuitive level that is subject to biases and to assumptions based on incomplete information. When we do finally turn our critical-thinking brain to a problem, it is operating only on the limited set of information that the quick part of our brain grabbed as it scanned the environment for whatever stood out.
When we make a prediction without sitting down to do the math or weigh the relevant factors with pen and paper, our predictions will seem logical but will miss critical information. We will make connections between ideas and experiences that might not reflect the actual world. We will simplify the prediction by answering an easier question and substituting that answer for the more difficult question our prediction is actually trying to address.
This year, as in 2016, we will see this in action. In 2016, for me and many of the people I know, it seemed as though very few people supported Donald Trump for president. I saw very few bumper stickers or yard signs for Trump, all the social media posts I saw highlighted his worst moments, and the news coverage I consumed described why he was unfit to be president. Naturally enough, I believed he would lose in a landslide. Of course, that did not happen. Intuitively I was sure that Clinton would win, and Kahneman’s research helps explain why I should have been more skeptical of my natural intuition.
Part of the problem was that my intuitive prediction was an exercise in intensity matching, and as Kahneman writes, “Intensity matching yields predictions that are as extreme as the evidence on which they are based.” All the information I saw highlighted how terrible Trump was. I didn’t see many people supporting Trump, and I didn’t see news stories justifying his candidacy. I didn’t see people in my immediate environment who strongly supported him, so my intuition was biased. It didn’t help that I did nothing to seek out people who did support him or outlets that published articles or stories in his favor.
Kahneman’s writing aligns with my real-world experience. His studies of the brain and of our predictive machinery reveal biases and errors in our thinking. Our intuition is based on the limited set of information that the quick part of our brain can put together. When we do engage our deep-thinking brain, it can still only operate on that limited information, so even when we think critically we are likely to make mistakes, because we can’t see the full picture and biases in the information we absorb will predictably shape the direction of our miscalculations. What feels natural and obvious to us could be the result of faulty intensity matching and random chance in the environment around us.
The Science of Availability

Which presidential candidate is doing more advertising this year? Which college football team has been the most dominant over the last five years? Who has had the most songs on the Hot 100 over the last five years? You can probably come up with an intuitive answer to at least one of these questions even if you don’t follow politics, college football, or pop music very closely. But when you come up with an intuitive answer, you aren’t really answering the question; you are relying on substitution and the availability heuristic.
In Thinking Fast and Slow, Daniel Kahneman writes, “We defined the availability heuristic as the process of judging frequency by the ease with which instances come to mind.” So if you recently saw a few ads from the Trump campaign, your mind would probably intuit that his campaign is doing more advertising. If you remember that LSU won the college football national championship last year, you might have answered LSU, but if you see lots of people wearing Alabama hats on a regular basis, you might answer Alabama. And if you recently heard a Taylor Swift song, your intuitive guess might be that she has had the most Hot 100 hits.
Kahneman continues, “The availability heuristic, like other heuristics of judgment, substitutes one question for another: you wish to estimate the size of a category or the frequency of an event, but you report an impression of the ease with which instances come to mind.” When we are asked to guess how often an event happens or what percentage of a category fits a certain characteristic, our brains flip back through memory for examples that match what we are looking for. The easier it is to remember an example, the more weight we give it.
I don’t really know who is doing more advertising, but I do know that I have seen a lot of Trump ads on YouTube, so it intuitively felt as though he was doing more advertising, even though I may have just picked one channel where his ads were more salient. Overall, he may be doing less than the Biden campaign. Similarly, I didn’t initially remember that LSU won the national championship last year, but I did see someone wearing an Alabama sweatshirt recently, and that team came to mind quickly when I thought of dominant football programs. I also don’t have a clue who has had the most Hot 100 hits in the last five years, but people in my orbit on Twitter frequently post about Taylor Swift, so her name came to mind easily. I wasn’t doing any deep thinking; I was just scratching the surface of my memory for an easy answer.
Throughout Thinking Fast and Slow, Kahneman reveals instances where our thinking appears deep and nuanced but is really quick, intuitive, and prone to errors. In most instances we don’t do any deep calculation or thinking; we just roll with the intuitive answer. But our intuition is often faulty, incomplete, and based on a substitution for the real question we are being asked. The stakes may be low when this means we inaccurately estimate divorce rates for celebrities (an example from the book), but they can be high in other decision-making areas. If we are looking to buy a home and are concerned about flood risk, we will overweight the risk of a flood at a property if there have recently been a lot of news stories about flooding from a hurricane in the Gulf of Mexico. This could influence where we choose to live and whether we pay for expensive insurance. Little assumptions and misperceptions can nudge us in critical directions, positive or negative, and change whether we invest for our futures, fudge our taxes, or buy a new car. Recognizing that our brains make mistakes based on shortcuts like the availability heuristic can help us in some large decision-making areas, so it is important to understand how our brains work and where they can go wrong.
Rarely Stumped

Daniel Kahneman starts one of the chapters in his book Thinking Fast and Slow by writing, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.”
When I read this quote I am reminded of Gus, the father in My Big Fat Greek Wedding. He is always ready to show how every word comes from a Greek root, even a Japanese word like kimono. He is sure of his intellect, sure that his heritage is perfect and is the foundation of all that is good in the world. He trusts his instincts and intuitions to a hilarious extent, even when he is clearly wrong and even when his decisions are gift-wrapped and planted in his mind in almost Inception-like fashion.
His character is part caricature, but it illustrates what Kahneman explains in the quote above. Our minds are good at finding intuitive answers that make sense of the world around us, even when we really don’t have any idea what is going on. We laugh at Gus and don’t consider ourselves guilty of behaving like him, but the only difference between most of us and Gus is that he is an exaggeration of the intuitive dogmatism and self-assurance that we all live with.
We scroll through social media, and trust that our initial judgment of a headline or post is the right frame for how to think about the issue. We are certain that our home remedy for tackling bug bites, cleaning windows, or curing a headache is based on sound science, even if it does nothing more than produce a placebo effect. We find a way to fit every aspect of our lives into a comprehensive framework where our decisions appear rational and justified, with us being the hero (or innocent victim if needed) of the story.
We should remember that we have a propensity to believe we are always correct, that we are never stumped. We should pause, ask more questions, think about what is important to know before making a decision, and then interrogate our thoughts to decide whether we have really obtained meaningful information to inform our opinions, or whether we are just acting on instinct, heuristics, self-interest, or groupthink. We cannot keep believing we are right and pushing baseless beliefs onto others when we have no real knowledge of an issue. We shouldn’t assume things are true just because they happen to align with the story we want to believe about ourselves and the world. When it comes to crucial issues and our interactions and relationships with others, we need to think more critically and recognize when we are merely assuming we are right. If we can pause at those times, think more deeply, gather more information, and ask more questions of ourselves, we can have more accurate and honest interactions and relationships. Hopefully this will help us lead more meaningful lives that better connect and develop the community we all need in order to thrive.
Judging Faces

One of the successes of System 1, the name Daniel Kahneman gives to the quick, intuitive part of the brain in his book Thinking Fast and Slow, is recognizing emotions in people’s faces. We don’t need much time to study someone’s face to recognize that they are happy, scared, or angry. We don’t even need to see someone’s face for a full second to get an accurate sense of their emotional state and to adjust our behavior accordingly.
The human mind is great at intuiting emotions from people’s faces. I can’t remember where, but I once came across the suggestion that the reason the whites of our eyes are so prominent is to help us see where each other’s eyes are looking and to help us better read each other’s emotions. Our ability to quickly and intuitively read each other’s faces helps us build social cohesion and connection. Yet even though we are so adept, this ability can still go wrong.
Kahneman explains that biases and baseless assumptions can be built into System 1’s assessment of faces. We are quick to notice faces that share features similar to our own. We are also quick to judge people as nice, competent, or strong based on their facial features. This is demonstrated in Thinking Fast and Slow with experiments conducted by Alex Todorov, who showed potential voters the faces of candidates, sometimes for only a fraction of a second, and found that faces influenced votes. Kahneman writes, “As expected, the effect of facial competence on voting is about three times larger for information-poor and TV-prone voters than for others who are better informed and watch less television.”
I’m not here to hate on information-poor and TV-prone voters, but to help us see that we can easily be influenced by people’s faces and by the traits we have associated with facial characteristics, even if we don’t consciously know those associations exist. For all of us, there will be situations where we are information-poor and ignorant of the issues or factors important to a decision (the equivalent of being TV-prone in electoral voting). We might trust what a mechanic or investment banker says because they have a square jaw and high cheekbones. We might trust the advice of a nurse simply because her facial features make her seem caring and sympathetic. Perhaps in both situations the person is qualified and competent to be giving us advice, but even if they were not, we might trust them based on little more than appearance. System 1, which is so good at telling us about people’s emotions, can jump ahead and make judgements about many characteristics of a person based on their face alone, and while it may sometimes be correct, it can also be wrong. System 2 will probably construct a coherent narrative to justify the quick decision made by System 1, but that narrative likely won’t have much to do with the person’s actual experience and qualifications. We may find ourselves in situations where, deep down, we are judging someone based on little more than what they look like and what System 1 thought of their face.