Laboratory Proof

“If the standard of laboratory proof had been applied to scurvy,” writes Judea Pearl in The Book of Why, “then sailors would have continued dying right up until the 1930s, because until the discovery of vitamin C, there was no laboratory proof that citrus fruits prevented scurvy.” Pearl’s point is that high scientific standards for definitive, exact causality are not always for the greater good. Modern science will sometimes spurn clear statistical relationships and evidence because statistical relationships alone cannot be counted on as concrete causal proof. A clear answer is withheld because some marginal unknowns may still exist, and that withholding has its own costs.
Sailors did not know why or how citrus fruits prevented scurvy, but repeated observation demonstrated that they did. There was no clear understanding of what scurvy was or why citrus fruits helped, but it was commonly understood that a causal relationship existed. People acted on these observations and lives were saved.
On two episodes, the Don’t Panic Geocast has discussed journal articles in the British Medical Journal that make the same point as Pearl. As a critique of the demand for randomized controlled trials, the two articles highlight the troubling reality that there have never been randomized controlled trials on the effectiveness of parachutes when jumping from airplanes. The articles are hilarious and clearly satirical, but they arrive at the same point Pearl makes in the quote above – laboratory proof is not always necessary, practical, or reasonable when lives are on the line.
Pearl argues that we can rely on our ability to identify causality even without laboratory proof when we have sufficient statistical analysis and understanding of relationships. Statisticians always tell us that correlation is not causation and that observational studies are not sufficient to determine causality, yet the citrus fruit and parachute examples show that this mindset is not always appropriate. Sometimes a more realistic, common-sense understanding of causation – even if supported only by correlational relationships and statistics – is more important than laboratory proof.
The Cigarette Wars

Recently I have been writing about Judea Pearl’s The Book of Why, in which Pearl asks whether our reliance on statistics and our adherence to the idea that correlation is not causation have gone too far in science. For most people, especially students getting into science and those who have studied politics, reiterating the idea that correlation does not imply causation is important. There are plenty of ways to misinterpret data, and there is no shortage of individuals and interest groups who would love to have an official scientist improperly assign causation to a correlation for their own political and economic gain. However, Pearl uses the cigarette wars to show us that failing to acknowledge that correlations can imply causation can also be dangerous.
“The cigarette wars were science’s first confrontation with organized denialism, and no one was prepared,” writes Pearl. For decades there was ample evidence from different fields and different approaches linking cigarette smoking to cancer. However, it isn’t the case that every single person who smokes a cigarette gets cancer. We all know people who smoked for 30 years and seem to have great lungs. Sadly, we also all know people who developed lung cancer but never smoked. The relationship between cigarettes and lung cancer is not a perfect one-to-one correlation, and tobacco companies jumped on this fact.
For years, it was abundantly clear that smoking greatly increased the risk of lung cancer, but no one was willing to say that smoking caused lung cancer, because powerful interest groups aligned against the idea and conditioned policy-makers and the public to believe that in the case of smoking and lung cancer, correlation was not causation. The evidence was obvious, but it was built on statistical information, and the organized denial was stronger. Who was to say whether people more susceptible to lung cancer were also more susceptible to start smoking in the first place? Arguments such as these hindered people’s willingness to adopt the clear causal picture that cigarettes caused cancer. People hid behind the possibility that the overwhelming evidence was wrong.
Today we are in a similar situation with climate change and other issues. It is clear that statistics cannot give us a 100% certain answer to a causal question, and it is true that correlation is not necessarily a sign of causation, but at a certain point we have to accept when the evidence is overwhelming. We have to accept when causal models that are not 100% proven have overwhelming support. We have to be able to make decisions without being derailed by organized denialism that seizes on the fact that correlation does not imply causation just to create doubt and confusion. Pearl’s warning is that failing to improve how we think about and understand causality can have real consequences (lung cancer in the cigarette wars, and devastating climate impacts today), and that we should take those consequences seriously when we look at the statistics and data that help us understand our world.

Justified Beliefs

A lot of us have beliefs that are formed out of biases and prejudices. Often those beliefs still end up being true in the end, but they are nevertheless unjustified. A skill of the human mind is to ignore contradictory evidence and focus on the limited evidence which supports what we want to believe and backs up our prior assumptions. Whether it is a belief about a sports team, a racial or ethnic group, or a restaurant, we often adopt unjustified beliefs that we support with anecdotal thinking. When these unjustified beliefs turn out to be correct, we use that as a defense of our biased thinking, and we risk becoming entrenched in inaccurate assumptions about how the world works.
In Vices of the Mind Quassim Cassam writes about this directly. He argues that people need to be more careful when considering whether a way of thinking is helpful or harmful, and whether a true result in the end justifies biased assumptions. Cassam writes, “leading to true belief is not the same as being conducive to knowledge. Even in cases where an epistemic vice leads someone to believe something true that doesn’t mean that they have the right to be confident that things are as they take them to be or that their belief is justified.”
To take a relatively harmless example, imagine two sports fans who bet on a college basketball game. One fan might be biased in favor of big-name schools, while the other is less biased and willing to consult sports analytics when deciding which team is likely to win. The biased fan may bet against a smaller school and may win that bet, but it is hard to say that he would systematically win by backing recognizable schools over smaller ones. Any individual bet might pay off, but over the long term we would expect the more objective, open-minded fan who relies on analytics or other evidence to win more bets. The biased fan who wins a lucky bet does not have justified beliefs, even when his bias pays off.
This type of thinking can be more harmful than bets among college basketball fans. The human mind has a remarkable ability to remember the evidence that supports the beliefs we want to be true and to ignore evidence that undermines our desired beliefs. The biased sports fan probably remembers when he was right about a small school being over-hyped, but probably doesn’t remember the times when big-name schools lost to smaller schools. The same thing happens with people who are biased against police officers, minority groups, or people who drive certain types of cars. The reference class doesn’t matter to our brain, but the individual anecdotes that support our prior beliefs are remembered.
Holding justified beliefs requires that we inform our beliefs with real-world evidence that has statistical value. Basing our beliefs on individual anecdotes will not consistently lead us to accurate beliefs, and even if we do hit upon a true belief from time to time, we won’t be justified in the beliefs, assumptions, and conclusions that we draw. It is important to recognize when our thinking is anecdotal, and to consider whether our beliefs are justified.
Thinking Conspiratorially Versus Evidence-Based Thinking

My last two posts have focused on conspiratorial thinking and whether it is an epistemic vice. Quassim Cassam in Vices of the Mind argues that we can only consider thinking conspiratorially to be a vice based on context. He means that whether conspiratorial thinking is a vice depends on whether there is reliable and accurate evidence to support a conspiratorial claim. Thinking conspiratorially is not an epistemic vice when we are correct and have solid evidence and rational justifications for our suspicions. Anti-conspiratorial thinking can be an epistemic vice if we ignore good evidence of a conspiracy in order to continue believing that everything is in order.
Many conspiracies are not based on reliable facts and information. They create causal links between disconnected events and fail to explain reality. Anti-conspiratorial thinking also creates a false picture of reality, but does so by ignoring causal links that actually do exist. As epistemic vices, both ways of thinking can be described consequentially and by examining the patterns of thought that contribute to the conspiratorial or anti-conspiratorial thinking.
However, that is not to say that conspiratorial thinking is only a vice in non-conspiracy environments, or that anti-conspiratorial thinking is only a vice in high-conspiracy environments. Regarding this line of thought, Cassam writes, “Seductive as this line of thinking might seem, it isn’t correct. The obvious point to make is that conspiracy thinking can be vicious in a conspiracy-rich environment, just as anti-conspiracy thinking can be vicious in contexts in which conspiracies are rare.” The key, according to Cassam, is evidence-based thinking and whether we have justified beliefs and opinions, even if they turn out to be wrong in the end.
Cassam generally supports the principle of parsimony, the idea that the simplest explanation for a scenario is often the best and the one that you should assume to be correct. Based on the evidence available, we should look for the simplest and most direct path to explain reality. However, as Cassam continues, “the principle of parsimony is a blunt instrument when it comes to assessing the merits of a hypothesis in complex cases.” This means that we will still end up with epistemic vices related to conspiratorial thinking if we only look for the simplest explanation.
What Cassam’s quotes about conspiratorial thinking and parsimony get at is the importance of good evidence-based thinking. When we are trying to understand reality, we should be thinking about what evidence should exist for our claims, what evidence would be needed to support our claims, and what kinds of evidence would refute our claims. Evidence-based thinking helps us avoid the pitfalls of conspiratorial or anti-conspiratorial thinking, regardless of whether we live in conspiracy-rich or conspiracy-poor environments. Accurately identifying or denying a conspiracy without any evidence, by assuming simple relationships, is ultimately not much better than making up beliefs based on magic. What we need to do is learn to adopt evidence-based thinking and to better understand the causal structures that exist in the world. That is the only true way to avoid the epistemic vices related to conspiratorial thinking.
Ignoring Conspiracy Theories

Knowledge deals with facts. In order to have or to gain knowledge, you need to understand, gain experience in, or directly learn accurate information. You cannot have knowledge of things that are not true. Therefore, beyond knowing that a conspiracy theory is factually inaccurate or understanding its origins, you cannot have knowledge of a conspiracy theory. Importantly, what this means for us is that we can ignore implausible conspiracy theories.
In his book Vices of the Mind Quassim Cassam asks whether it is closed-minded to ignore conspiracy theories and whether ignoring them is an epistemic vice. However, Cassam explains that epistemic vices inhibit knowledge since knowledge only deals with truth and facts. Conspiracy theories such as moon landing hoax theories do not deal with facts, so ignoring them does not hinder knowledge.
Cassam brings up conspiracy theories when discussing closed-mindedness and addressing an argument that people occasionally make in favor of closed-mindedness. The argument is that closed-minded people won’t be swayed by implausible conspiracy theories, and therefore some dose of closed-mindedness rather than universal open-mindedness is a good thing. Regarding this opinion, and diving into the heart of conspiracy theories, Cassam writes the following:
“If I listen to them long enough I might change my mind and lose the knowledge that I already have. I should do everything possible to avoid or ignore [conspiracy theories], and that looks like a way of saying that the way to protect my knowledge is to be closed-minded. However, the real reason I am entitled not to listen to the conspiracy theorists is not that their views are inconsistent with my prior conception but that they are unlikely to be correct given the available evidence. Only the evidence can justify a policy of non-engagement.”
I previously wrote about analysis-paralysis and when it is OK to stop investigating something and make a decision. At a certain point we have to judge that we have sufficient knowledge and understanding to move forward with our lives. We cannot spend time investigating every possibility, because we will run out of time and never decide what to wear, who to vote for, or what to eat for dinner. Fortunately, as Cassam shows, our decision-making can and should be limited by fact and by plausibility given the available evidence. Possibilities that fall far outside what is plausible can be ignored. We might be wrong once in a while, but systematically this approach is not going to inhibit knowledge. We don’t have to investigate every possible conspiracy theory. We can ignore choices, opinions, and possibilities when they don’t match the evidence and fall outside plausible ranges. This reduces our cognitive load, gives us an actionable way to move forward, and establishes a baseline of accuracy in which any decision, idea, or possibility must be rooted. Conspiracy theories can be ignored without us being closed-minded because they don’t meet that baseline.
Rarely Stumped

Daniel Kahneman starts one of the chapters in his book Thinking Fast and Slow by writing, “A remarkable aspect of your mental life is that you are rarely stumped. True, you occasionally face a question such as 17 × 24 = ? to which no answer comes immediately to mind, but these dumbfounded moments are rare. The normal state of your mind is that you have intuitive feelings and opinions about almost everything that comes your way.”
When I read this quote I am reminded of Gus, the father in My Big Fat Greek Wedding. He is always ready to show how every word comes from a Greek root, even a Japanese word like kimono. He is sure of his intellect, sure that his heritage is perfect and is the foundation of all that is good in the world. He trusts his instincts and intuitions to a hilarious extent, even when he is clearly wrong and even when his decisions are gift-wrapped and planted in his mind in almost Inception-like fashion.
His character is part caricature, but it illustrates what Kahneman explains in the quote above. Our minds are good at finding intuitive answers that make sense of the world around us, even if we really don’t have any idea what is going on. We laugh at Gus and don’t consider ourselves guilty of behaving like him, but the only difference between most of us and Gus is that he is an exaggeration of the intuitive dogmatism and sense of self-assurance that we all live with.
We scroll through social media, and trust that our initial judgment of a headline or post is the right frame for how to think about the issue. We are certain that our home remedy for tackling bug bites, cleaning windows, or curing a headache is based on sound science, even if it does nothing more than produce a placebo effect. We find a way to fit every aspect of our lives into a comprehensive framework where our decisions appear rational and justified, with us being the hero (or innocent victim if needed) of the story.
We should remember that we have a propensity to believe that we are always correct, that we are never stumped. We should pause, ask more questions, think about what is important to know before making a decision, and then deeply interrogate our thoughts to decide whether we have really obtained meaningful information to inform our opinions, or whether we are just acting on instinct, heuristics, self-interest, or groupthink. We cannot continue believing we are right and pushing baseless beliefs onto others when we have no real knowledge of an issue. We shouldn’t assume things are true just because they happen to align with the story we want to believe about ourselves and the world. When it comes to crucial issues and our interactions and relationships with others, we need to think more critically and recognize when we are merely assuming we are right. If we can pause at those times and think more deeply, gather more information, and ask more questions of ourselves, we can have more accurate and honest interactions and relationships. Hopefully this will help us lead more meaningful lives that better connect us and better develop the community we all need in order to thrive.