Beliefs Are Not Voluntary

One of the ideas that Quassim Cassam examines in his book Vices of the Mind is responsibility. Cassam recognizes two forms of responsibility and examines both through the lens of epistemic vices. The first is acquisition responsibility, our responsibility for acquiring beliefs or developing ways of thinking; the second is revision responsibility, our responsibility for changing beliefs and ways of thinking that are shown to be harmful.
 
 
Within this context Cassam provides interesting insight about our beliefs. He writes, “If I raise my arm voluntarily, without being forced to raise it, then I am in this sense responsible for raising it. Notoriously, we lack voluntary control over our own beliefs. Belief is not voluntary.”
 
 
Cassam explains that if it is raining outside, we cannot help but believe that it is raining. We don’t have control over many of our beliefs; they are in some ways inescapable, determined by factors beyond our control and almost forced on us by external factors. I think this is true for many of our beliefs, ranging from spiritual beliefs to objective beliefs about the world. As Cassam argues, we are not acquisition responsible for believing that we are individuals, that something is a certain color, or that our favorite sports team is going to have another dreadful season.
 
 
But we are revision responsible for our beliefs.
 
 
Cassam continues, “We do, however, have a different type of control over our own beliefs, namely, evaluative control, and this is sufficient for us to count as revision responsible for our beliefs.”
 
 
Cassam introduces ideas from Pamela Hieronymi to explain our evaluative control over our beliefs. Hieronymi argues that we can revise our beliefs when new information arises that challenges those beliefs. She uses the example of our beliefs about how long a commute will take, and how those beliefs shift if we hear about heavy traffic. We might not be responsible for the initial beliefs that we develop, but we are responsible for changing those beliefs if they turn out to be incorrect. We can evaluate our beliefs, reflect on their accuracy, and make adjustments based on those evaluations.
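To make evaluative control concrete, here is a minimal sketch in Python. It is my own illustration, not Cassam’s or Hieronymi’s: the function name, the numbers, and the 1.5 adjustment factor are hypothetical stand-ins for whatever evaluation a real commuter would perform.

```python
# A toy model of revision responsibility: we do not choose the initial
# belief, but we can evaluate it against new evidence and revise it.
# The 1.5 revision factor is an illustrative assumption, nothing more.

def revise_commute_estimate(prior_minutes: float, heavy_traffic: bool) -> float:
    """Revise a commute-time belief when new information challenges it."""
    if heavy_traffic:
        return prior_minutes * 1.5  # contrary evidence arrived: revise the belief
    return prior_minutes            # no contrary evidence: the belief stands

belief = 30.0  # the initial belief; we did not deliberately choose to acquire it
belief = revise_commute_estimate(belief, heavy_traffic=True)
print(f"Revised estimate: {belief:.0f} minutes")  # Revised estimate: 45 minutes
```

The point of the sketch is the structure: the starting value is simply given to us, but the function embodies the responsibility to re-evaluate it when evidence arrives.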
 
 
It is important for us to make this distinction because it helps us better think about how we assign blame for inaccurate beliefs. We cannot blame people for developing inaccurate beliefs, but we can blame them for failing to change those beliefs. We should not spend time criticizing people for developing racist beliefs, harmful spiritual beliefs, or wildly inaccurate beliefs about health, well-being, and social structures. What we should do is blame people for failing to recognize that their beliefs are wrong, and we should help people build evaluative capacities to better reflect on their own beliefs. This changes our stance from labeling people as racists, bigots, or jerks and instead puts the responsibility on us to foster a society of accurate self-reflection that pushes back against inaccurate beliefs. Labeling people blames them for acquiring vices, which is unreasonable, but fostering a culture that values accurate information will ease the transition to more accurate beliefs.

Justified Beliefs

A lot of us have beliefs that are formed out of biases and prejudices. Often those beliefs still end up being true in the end, but they are nevertheless unjustified. A skill of the human mind is to ignore contradictory evidence and focus on the limited evidence that supports what we want to believe and backs up our prior assumptions. Whether it is a belief about a sports team, a racial or ethnic group, or a restaurant, we often adopt unjustified beliefs that we support with anecdotal thinking. When these unjustified beliefs turn out to be correct, we use them as a defense of our biased thinking and risk becoming entrenched in inaccurate assumptions about how the world works.
In Vices of the Mind Quassim Cassam writes about this directly. He argues that people need to be more careful when considering whether a way of thinking is helpful or harmful, and whether a true result in the end justifies biased assumptions. Cassam writes, “leading to true belief is not the same as being conducive to knowledge. Even in cases where an epistemic vice leads someone to believe something true that doesn’t mean that they have the right to be confident that things are as they take them to be or that their belief is justified.”
To take a relatively harmless example, imagine two sports fans who bet on a college basketball game. One fan might be biased in favor of big-name schools, while the other might be less biased and willing to look at sports analytics when deciding which team is likely to win. The biased individual may bet against a smaller school, and may win that bet, but it is hard to say that they would systematically win bets against small schools in favor of more recognizable schools. In any individual instance their bet might pay off, but over the long term we would expect the more objective individual, who is open-minded about sports analytics and other evidence, to win more bets. The biased individual who wins a lucky bet does not have justified beliefs even when his bias pays off.
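A quick simulation makes the long-term claim concrete. This is my own sketch, not anything from Cassam’s book: the probabilities are invented, and it generously assumes that the analytics-minded fan’s estimates track each game’s true odds while the biased fan always backs the big-name school.

```python
# Compare a bettor who always backs big-name schools against one who
# follows the evidence, over many games with assumed true probabilities.
import random

random.seed(42)

def simulate(n_games: int) -> tuple[int, int]:
    """Count bets won by a biased fan vs. an evidence-following fan."""
    biased_wins = analytic_wins = 0
    for _ in range(n_games):
        # Assumed true chance the small school wins this particular game.
        p_small = random.uniform(0.2, 0.8)
        small_wins = random.random() < p_small
        if not small_wins:           # biased fan always bets the big-name school
            biased_wins += 1
        bet_small = p_small > 0.5    # analytic fan bets whichever side is favored
        if bet_small == small_wins:
            analytic_wins += 1
    return biased_wins, analytic_wins

biased, analytic = simulate(10_000)
print(f"Biased fan won   {biased / 10_000:.1%} of bets")
print(f"Analytic fan won {analytic / 10_000:.1%} of bets")
```

Under these assumptions the biased fan wins only about half of his bets while the evidence-follower wins closer to two thirds. The biased fan’s occasional wins are real, but they are luck, not justification.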
This type of thinking can be more harmful than bets among college basketball fans. The human mind has a remarkable ability to remember the evidence that supports the beliefs we want to be true and to ignore evidence that undermines our desired beliefs. The biased sports fan probably remembers when he was right about a small school being over-hyped, but probably doesn’t remember the times when big-name schools lost to smaller schools. This can happen with people who are biased against police officers, minority groups, or people who drive certain types of cars. The reference class doesn’t matter to our brain, but the individual anecdotes that support our prior beliefs are remembered.
Holding justified beliefs requires grounding them in real-world evidence with statistical value. Basing our beliefs on individual anecdotes will not consistently lead us to accurate beliefs, and even if we do hit upon a true belief from time to time, we won’t be justified in the beliefs, assumptions, and conclusions that we draw. It is important to recognize when our thinking is anecdotal, and to consider whether our beliefs are justified.
Thinking Conspiratorially

Over the last few years a number of wild conspiracy theories have become popular. Former President Donald Trump embraced a conspiracy theory that the 2020 Presidential Election was rigged (it was not), supported the QAnon conspiracy theory, and did little to push back against conspiracy theories surrounding COVID-19. His actions, behaviors, and beliefs demonstrate that thinking conspiratorially can be an epistemic vice. His willingness to believe wild falsehoods obstructed knowledge for himself and his most ardent supporters.
However, thinking conspiratorially is not always an epistemic vice. One reason why conspiracy theories become so gripping and why people sometimes fall into them is because real conspiracies do occur. Nixon’s Watergate Scandal, Trump’s withholding of financial and military aid unless Ukraine announced an investigation into Joe Biden and his son, and fraud schemes uncovered by inspectors general and government auditors demonstrate that nefarious conspiracies sometimes are real. While thinking conspiratorially can become an epistemic vice, the same is true for anti-conspiratorial thinking.
In the book Vices of the Mind, Quassim Cassam quotes Dr. Charles Pigden of the University of Otago in New Zealand: “there is nothing inherently vicious about believing or being disposed to believe conspiracy theories.” Cassam argues that conspiratorial thinking is not an epistemic vice on its own, but is instead a context-dependent vice or virtue. He continues, “there are environments in which either way of thinking can be epistemically virtuous or vicious, and a way to capture this context-relativity is to describe these thinking styles as conditionally virtuous or vicious.”
The examples I used earlier show how conspiratorial thinking can be either virtuous or vicious. In the case of our former President, his conspiratorial thinking spread misinformation, suppressed true and accurate information, and created a set of false beliefs that some of his supporters believed so strongly that they stormed the United States Capitol in an attempt to stop Congress from certifying the election. The context of his conspiracy theories obstructed knowledge and caused substantial harm to life and property. However, a government auditor who notices inconsistencies in paperwork and accounting practices may be rewarded for thinking conspiratorially, at least to a point. Believing that something nefarious could possibly be going on will encourage the auditor to review financial statements and testimony from personnel with more scrutiny, potentially helping them uncover real fraud. Of course, they could still go too far and push the issue beyond reasonable bounds by thinking conspiratorially, but this type of thinking is conditionally virtuous when it discovers true fraud and improves knowledge about fraud schemes.
Given the dramatic consequences of conspiracy thinking over the last few years, it is easy to dismiss thinking conspiratorially as an epistemic vice. However, we should remember that it is only conditionally an epistemic vice, and that sometimes conspiracies do turn out to be true (or at least partially true). We don’t have to give every conspiracy our respect and attention, but when a conspiracy does appear to be grounded in reality and supported by real evidence, then we should not be too quick to dismiss it.
Causal Links Between Unconnected Events

As a kid I grew up attending basketball camps at UCLA. I played in the old gym that used to host UCLA games in front of a few thousand fans, played on the current court in the main stadium, and slept in the dorms. With my history of basketball at UCLA, I have always been a fan of the men’s basketball team, rooting for them alongside the Nevada Wolf Pack – where I actually went to school. With the UCLA team making a deep run in the NCAA March Madness tournament, I have been reminded of all the superstitious thinking that surrounds sports and that I used to indulge in.
Sports seem to bring out superstitious thinking in even the most rational of people. I try very hard to think about causal structures and to avoid seeing non-existent causal links between unconnected events, but nevertheless, it is hard to not let superstitious thinking creep in. When you are watching a game it is hard not to feel like you have to sit in the right spot, have to watch from a certain room, or have to avoid certain behaviors in order to keep your team in the lead. However, it is absolute nonsense to think that your actions on your couch, miles away from the sporting venue where the game is taking place, could have any causal link to the way that a sports team performs.
In the book Vices of the Mind, Quassim Cassam spends time examining what is happening within our mind when we engage in superstitious thinking. He explains that superstitious thinking qualifies as an epistemic vice because it gets in the way of knowledge. It prevents us from forming accurate beliefs about the world. “Superstitious thinking,” Cassam writes, “isn’t a generally reliable method for forming true beliefs about the future; it won’t generally lead to true beliefs because it posits causal links between unconnected events. … beliefs based on superstitious thinking aren’t reasonable.”
In the book, Cassam gives the example of superstitions about walking under ladders. Someone who believes that bad luck will befall them if they walk under a ladder will probably avoid walking under ladders, and as a result they won’t be as likely to have paint drip on them, to have something fall on their head, or to knock over the ladder and anyone or anything on top of it. Their superstition leads to better outcomes for them, but not because it helped them form true beliefs about the dangers of walking under ladders. The individual ends up with the correct answer, but arrives there through the wrong causal chain.
Thinking about rational and plausible causal chains is a way to escape superstitious thinking. You can examine the risks, harms, and benefits of certain behaviors and actions, tracing real connections between events, to see when a superstition is nonsense and when it borrows from real-life causal chains in ways that improve life. Trying not to step on cracks will not prevent you from starting a causal chain that leads to your mother’s broken back, but it will help ensure you have more stable and steady footing when you walk. Wearing the same basketball jersey for each game has no causal connection with the team’s performance, and wearing it or not wearing it will not have an impact on how your favorite team performs. We should strive to have accurate beliefs about the world, we should work to see causal connections clearly, and we should limit superstitious thinking even when it is about trivial things like sports.
Superstitious Thinking

Would you consider superstitious thinking to be a vice? According to Quassim Cassam in Vices of the Mind, superstitious thinking is indeed an epistemic vice. That is to say, Cassam believes that superstitious thinking is a reprehensible, blameworthy way of thinking that systematically obstructs knowledge. By systematically obstructing knowledge, superstitious thinking causes people to adopt beliefs about the world that don’t match reality, leaving them vulnerable to poor decision-making that can have real-world consequences in their lives.
Cassam writes, “a gambler who sees a succession of coin tosses coming down heads thinks that the next toss will be tails because a tails is now due. This is an example of superstitious or magical thinking, thinking that posits a causal link between unconnected events, namely, previous coin tosses and the next toss.” This quote shows how superstitious thinking systematically obstructs knowledge. It causes us to see causal connections when none exist, distorting our perception and theory of reality.
A gambler making bets sees a causal connection between previous rolls of a die or spins of a roulette wheel and the next roll or spin. In reality, each time you flip a coin, roll a die, or spin a wheel, the previous result has no bearing on the current probability. A coin toss is a 50-50 affair that does not change because the previous flip was heads.
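The independence claim is easy to check empirically. Here is a quick simulation sketch (mine, not Cassam’s) that estimates the frequency of heads immediately after three heads in a row. If a tails really were “due,” the number would fall well below 50%.

```python
# Simulate a fair coin and measure P(heads | previous three tosses were heads).
import random

random.seed(0)
flips = [random.random() < 0.5 for _ in range(1_000_000)]  # True means heads

# Collect every toss that immediately follows three heads in a row.
after_streak = [
    flips[i]
    for i in range(3, len(flips))
    if flips[i - 3] and flips[i - 2] and flips[i - 1]
]
rate = sum(after_streak) / len(after_streak)
print(f"P(heads | three heads in a row) ~ {rate:.3f}")  # prints roughly 0.500
```

The conditional frequency stays pinned near 0.5: the streak carries no causal information about the next toss, which is exactly what the gambler’s intuition gets wrong.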
This type of thinking is prevalent in more than just gamblers. Sports enthusiasts regularly see causal links that cannot possibly exist. The same kind of thinking also shows up in people who have lucky clothing, special rituals in aspects of daily life, or who avoid certain phrases or behaviors. In many instances, the causal links we identify are absurd but don’t incur real costs in our lives. Avoiding stepping on cracks in the sidewalk doesn’t cost you anything and growing a beard because your favorite sports team is on a roll might even provide some social benefits and save you time from not shaving. However, giving in to superstitious thinking, as noted before, distorts your view of reality.
The causal chains misperceived through superstitious thinking create false understandings of how the world works. While it is harmless to believe that you need to sit in the same exact spot for your sports team to play well, it is not harmless to believe that hiring a woman to do a certain job is bad luck, and it is not harmless to bet your life savings on a gamble because of superstitious thinking. What may be even worse is that superstitious thinking in one area could spill into other areas, creating a habit of seeing causal chains that don’t exist. Over time, superstitious thinking will lead to worse outcomes and poor decision-making that will have real costs in our lives.
The Human Need for Certainty

Throughout the book Risk Savvy, Gerd Gigerenzer discusses the challenges that people face with thinking statistically, assessing different probable outcomes, and understanding risk. Gigerenzer also discusses how important it is that people become risk literate, and how the future of humanity will require that people better understand risk and uncertainty. What this future requires, he explains, is fighting against aspects of human psychology that are common to all of us and form part of our core nature. One aspect in particular that Gigerenzer highlights as a problem for humans moving forward is our need for certainty.

 

“Humans appear to have a need for certainty, a motivation to hold onto something rather than to question it,” he writes. Whether it is our religion, our plans for retirement, or the brand of shoes we prefer, we have a need for certainty. We don’t want to question whether our religious, political, or social beliefs are correct. It is more comforting to adopt beliefs and be certain that we are correct. We don’t want to continuously re-evaluate our savings plans and open ourselves to the possibility that we are not doing enough to save for retirement. And we like to believe that we purchased the best running shoes, that we bought the most sustainable shoes for the planet, and that our shoe choices are the most popular. In all of these areas, ambiguity makes our decisions harder, whereas a feeling of certainty gives us confidence and allows us to move through the world. In many ways, our need for certainty is simply a practicality. There are unlimited possibilities and decisions for us to make every day. Adopting certainty eliminates many possibilities and choices, simplifying our lives and allowing us to move through the world without having to question every action of every second of every day.

 

But in the modern world, humans have to be more comfortable living with ambiguity and have to be able to give up certainty in some areas. “For the mature adult,” Gigerenzer writes, “a high need for certainty can be a dangerous thing.” We live with risk and need to be able to adjust as we face new risks and uncertainties in our lives. We like to hold onto our beliefs and we are not comfortable questioning our decisions, but it can be necessary to do so in order to move forward and live in harmony in a changing world with new technologies, different demographics, and new uncertainties. A need for certainty can lead people to become dogmatic, to embrace apologetics that discount science demonstrating errors in their thinking, and to ignore the realities of a changing world. One way or another, we have to find ways to be flexible and adjust our choices and plans according to risk, otherwise we are likely to make poor choices and be crushed when the world does not align itself with our beliefs and wishes.
A Lack of Internal Consistency

Something I have been trying to keep in mind lately is that our internal beliefs are not as consistent as we might imagine. This is important right now because our recent presidential election has highlighted the divide between many Americans. In most of the circles I am a part of, people cannot imagine how anyone could vote for Donald Trump. Since they see President Trump as contemptible, it is hard for them to separate his negative qualities from the people who may vote for him. All negative aspects of Trump, and of the ideas people see him as representing, are heaped onto his voters. The problem, however, is that none of us has enough internal consistency among our thoughts, ideas, opinions, and beliefs to justify characterizing as much as half the country as bigoted, uncaring, selfish, or really any other adjective (except maybe self-interested).

 

I have written a lot recently about the narratives we tell ourselves. It is a problem that the more simplistic a narrative is, the more believable and accurate it feels to us. The world is incredibly complicated, and a simplistic story that seems to make sense of it all is almost certainly wrong. Given this, it is worth looking at our ideas and views and trying to identify areas where we have inconsistencies in our thoughts. This helps us tease apart our narratives and recognize where simplistic thinking is leading us to unfounded conclusions.

 

In Thinking Fast and Slow, Daniel Kahneman shows us how this inconsistency between our thoughts, beliefs, and behaviors can arise, using moral ambiguity as an example. He writes, “the beliefs that you endorse when you reflect about morality do not necessarily govern your emotional reactions, and the moral intuitions that come to your mind in different situations are not internally consistent.”

 

It is easy to adopt a moral position against some immoral behavior or attitude, but when we find ourselves in a situation where we are violating that moral position, we find ways to explain our internal inconsistency without directly violating our initial moral stance. We rationalize why our moral beliefs don’t apply to us in a given situation, and we create a story in our minds where there is no inconsistency at all.

 

Once we know that we do this with our own beliefs toward moral behavior, we should recognize that we do this with every area of life. It is completely possible for us to think entirely contradictory things, but to explain away those contradictions in ways that make sense to us, even if it leaves us with incoherent beliefs. And if we do this ourselves, then we should recognize that other people do this as well. So when we see people voting for a candidate and can’t imagine how they could vote for such a candidate, we should assume that they are making internally inconsistent justifications for voting for that candidate. They are creating a narrative in their head where they are making the best possible decision. They may have truly detestable thoughts and opinions, but we should remember that in their minds they are justified and making rational choices.

 

Rather than simply hating people and heaping every negative quality we can onto them, we should pause and ask what factors might be leading them to justify contemptible behavior. We should look for internal inconsistencies and try to help people recognize these areas and move forward more coherently. We should see in the negativity of others something we have the same capacity for, and we should try to find more constructive ways to engage with them and help them shift the narratives that justify their inconsistent thinking.
Can You Remember Your Prior Beliefs?

“A general limitation of the human mind,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is its imperfect ability to reconstruct past states of knowledge, or beliefs that have changed. Once you adopt a new view of the world (or any part of it), you immediately lose much of your ability to recall what you used to believe before your mind changed.”

 

What Kahneman is referring to with this quote is the difficulty we have in understanding how our own thinking evolves and changes over time. To each of us, our thinking seems to adapt and revise itself gradually, though sometimes quite dramatically. Our experience of our changing mind doesn’t reflect these changes well, unless there was a salient change that, I would argue, is tied in one way or another to an important aspect of our identity. For most changes in our mental approach, we generally don’t remember our prior beliefs and views, and we likely don’t remember a point at which our beliefs changed.

 

In the book Kahneman uses an example of two football teams with the same record playing each other. One team crushes the other, but before we knew the outcome, we didn’t have a strong sense of how the game would go. After watching a resounding victory, it is hard to remember that we once were so uncertain about the future outcome.

 

This tendency of the mind wouldn’t be much of a problem if it were restricted to our thinking about sports – unless we had a serious betting problem. However, it applies to our thinking on many more important topics such as family members’ marriages, career choices, political voting patterns, and consumer brand loyalty. At this moment, many Democratic voters in our nation probably don’t remember exactly what their opinions were on topics like free trade, immigration, or infectious disease policy prior to the 2016 election. If they do remember their stances on any of those issues, they probably don’t remember all the legal and moral arguments they expressed at that time. Their minds and opinions have probably shifted in response to President Trump’s policy positions, but it is probably hard for many to say exactly how or why their views have changed.

 

In a less charged example, imagine that you are back in high school, and for years you have really been into a certain brand of shoes. But, one day, you are bullied for liking that brand, or perhaps someone you really dislike is now sporting that same brand, and you want to do everything in your power to distance yourself from any association with the bullying or the person you don’t like. Ditching the shoes and forgetting that you ever liked that brand is an easy switch for our minds to make, and you never have to remember that you too wore those shoes.

 

The high school example is silly, but for me it helps put our brain’s failure to remember previous opinions and beliefs in context. Our brains evolved in a social context, and for our ancestors, navigating complex tribal social structures and hierarchies was difficult and sometimes a matter of life and death (not just social media death for a few years in high school like today). Being able to ditch beliefs that no longer fit our needs was probably helpful for our ancestors, especially if it helped them fully commit to a new tribal leader’s strange quirks and new spiritual beliefs. Today, this behavior can cause us to form strange high school (or office) social cliques and can foment toxic political debates, but it may have served a more constructive role for our ancestors forming early human civilizations.

The Mental Scaffolding of Religious Belief

Yesterday’s post was about our mental structure for seeing causality in the world where there is none. We attribute agency to inanimate objects, imbue them with emotions, attribute intentions, and ascribe goals to objects that don’t appear to have any capacity for conscious thought or awareness. From a young age, our minds are built to see causality in the world, and we attribute causal actions linked to preferred outcomes to people, animals, plants, cars, basketballs, hurricanes, computers, and more. This post takes an additional step, looking at how a mind that intuitively perceives causal actions all around it slides into a framework for religious belief. There are structures in the mind that act as mental scaffolding for the construction of religious beliefs, and understanding these structures helps shed light on what is taking place inside the human mind.

 

In Thinking Fast and Slow, Daniel Kahneman writes the following:

 

“The psychologist Paul Bloom, writing in The Atlantic in 2005, presented the provocative claim that our inborn readiness to separate physical and intentional causality explains the near universality of religious beliefs. He observes that we perceive the world of objects as essentially separate from the world of minds, making it possible for us to envision soulless bodies and bodiless souls. The two models of causation that we are set to perceive make it natural for us to accept the two central beliefs of many religions: an immaterial divinity is the ultimate cause of the physical world, and immortal souls temporarily control our bodies while we live and leave them behind as we die.”

 

From the time that we are small children, we experience a desire for a change in the physical state around us. When we are tiny, we have no control over the world around us, but as we grow we develop the capacity to change the physical world to align with our desires. Infants who cannot directly change their environment express discomfort by crying and (hopefully) receive loving attention in return. From a small age, we begin to understand that expressing some sort of discomfort brings change and comfort from a being that is larger and more powerful than we are.

 

This is an idea I heard years ago on a podcast. I don’t remember what show it was, but the argument that the guest presented was that humans have a capacity for imagining a higher being with greater power than what we have because that is literally the case when we are born. From the time we are in the womb to when we are first born, we experience larger individuals who provide for us, feed us, protect us, and literally talk down to us as if from above. In the womb we literally are within a protective world that nourishes our bodies and is ever present and ever powerful. We have an innate sense that there is something more than us, because we develop within another person, literally experiencing that we are part of something bigger. And when we are tiny and have no control over our world, someone else is there to protect and take care of us, and all we need to do to summon help is to cry out to the sky as we lay on our backs.

 

As we age, we learn to control our physical bodies with our mental thoughts and learn to use language to communicate our desires to other people. We don’t experience the build-up of action potentials between neurons prior to our decisions to do something. We only experience ourselves acting in the world and mentally interpreting what is around us. We carry with us the innate sense that we are part of something bigger and that there is a protector out there who will come to us if we cry out toward the sky. We don’t experience the phenomenological reality of the universe; we experience the narrative that we develop in our minds beginning at very young ages.

 

My argument in this piece is that both Paul Bloom, as presented in Kahneman’s book, and the guest from the podcast are correct. The mind contains scaffolding for religious beliefs, which makes the idea that a larger deity exists and is the original causal factor of the universe feel so intuitive. Our brains are effectively primed to look for things that support the intuitive sense of our religions, even if there is no causal structure there, or if the causal structure can be explained in a more scientific and rational manner.
Familiarity vs Truth

People who wish to spread disinformation don’t have to try very hard to get people to believe that what they are saying is true, or that their BS at least has some element of truth to it. All it takes is frequent repetition. “A reliable way to make people believe in falsehoods,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is frequent repetition, because familiarity is not easily distinguished from truth.”

 

Having accurate and correct representations of the world feels important to me. I really love science. I listen to lots of science-based podcasts, love sciency discussions with family members and friends, and enjoy reading science books. By accurately understanding how the world operates, by seeking to better understand the truth of the universe, and by developing better models and systems to represent the way nature works, I believe we can find a better future. I try not to fall unthinkingly into techno-utopian thinking, but I do think that having accurate beliefs and understandings is important for improving the lives of people across the planet.

 

Unfortunately, for many people, I don’t think that accurate and correct understandings of the world have such a high priority in their lives. I fear that religion and science may be incompatible or at odds with each other, and that there may be a willingness to accept inaccurate science or beliefs to support religious doctrine. I also fear that people in extractive industries may discount science, preferring to hold an inaccurate belief that supports their ability to profit through their extractive practices. Additionally, the findings, conclusions, and recommendations from science may just be scary for many ordinary people, and accepting what science says might be inconvenient or might require changes in lifestyle that people don’t want to make. In these situations, it isn’t hard to imagine why we might turn away from scientific consensus in favor of something comfortable but wrong.

 

And this is where accurate representations of the universe face an uphill battle. Inaccuracies don’t need to be convincing, don’t really need to sound plausible, and don’t need to come from credible authorities. They just need to be repeated on a regular basis. When we hear something over and over, we start to become familiar with the argument, and we start to have trouble telling truth and falsehood apart. This happened in 2016 when the number one word associated with Hillary Clinton was Emails. It happened with global warming when enough people suggested that human-related CO2 emissions were not connected to the climate change we see. And it happens every day in trite sayings and ideas, from trickle-down economics to the claim that popping your knuckles causes arthritis.

 

I don’t think that disproving inaccuracies is the best route to solving the problem of familiarity vs truth. I think the only thing we can hope to do is amplify those ideas, conclusions, experiments, and findings which accurately reflect the true nature of reality. We have to focus on what is true, not on all the misleading nonsense that gets repeated. We must repeat accurate statements about the universe so that they are what become familiar, rather than the mistaken ideas that become hard to distinguish from the truth.