Talking About Causation

In The Book of Why, Judea Pearl argues that humans are better at modeling, predicting, and identifying causation than we like to acknowledge. For Pearl, the idea that we can see direct causation and study it scientifically is not a radical or naïve belief, but a common-sense and defensible observation about human pattern recognition and our intuitions about causal structures in the world. He argues that we are overly reliant on statistical methods and randomized controlled trials that suggest relationships but never tell us exactly what causal mechanisms are at the heart of those relationships.
One of the greatest frustrations for Pearl is the limitations he feels have been placed around ideas and concepts of causality. For Pearl, there is a sense that certain research, certain ways of talking about causality, and certain approaches to solving problems are taboo, and that he and other causality pioneers are unable to talk in a way that might lead to new scientific breakthroughs. Regarding a theory of causation and the history of our study of causality, he writes, “they declared those questions off limits and turned to developing a thriving causality-free enterprise called statistics.”
Statistics doesn’t tell us a lot about causality. Statistical thinking is a difficult way for most people to think, and for non-statistically trained individuals it leads to frustration. I remember around the time of the 2020 election that Nate Silver, a statistics wonk at Fivethirtyeight.com, posted a cartoon where one person was trying to explain the statistical chance of an outcome to another person. The other person interpreted statistical chances as either 50-50 or all or nothing. They interpreted a low-probability event as a certainty that something would not happen and a high-probability event as a certainty that it would happen, while more middle-ground probabilities were simply lumped in as 50-50 chances. Statistics helps us understand these probabilities in terms of the outcomes we see, but doesn’t actually tell us anything about the why behind the statistical probabilities. That, I think Pearl would argue, is part of where the confusion of the individual in the cartoon stems from.
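The gap between a probability and a certainty is easy to see with a quick simulation. The sketch below is my own illustration, not something from Pearl or Silver, and the 70% figure is an arbitrary, hypothetical forecast: an event forecast at 70% still fails to happen roughly three times in ten, so treating the forecast as either a sure thing or a coin flip misreads what the number is saying.

```python
import random

# Hypothetical illustration: simulate many repetitions of an event that is
# forecast to happen with 70% probability, and count how often it occurs.
random.seed(42)
forecast = 0.70
trials = 10_000

occurred = sum(random.random() < forecast for _ in range(trials))

print(f"Forecast probability: {forecast:.0%}")
print(f"Observed frequency over {trials:,} simulations: {occurred / trials:.1%}")
print(f"The 'unlikely' outcome still happened {trials - occurred:,} times.")
```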
Humans think causally, not statistically. However, our statistical studies and the accepted way of doing science push against our natural causal mindsets. This has helped us better understand the world in many ways, but Pearl argues that we have lost something along the way. He argues that we should have been building better ways of thinking about causality, along with models and theories of causality, at the same time that we were building and improving our study of statistics. Instead, statistics took over as the only responsible way to discuss relationships between events, with causality becoming taboo.
“When you prohibit speech,” Pearl writes, “you prohibit thought and stifle principles, methods, and tools.” Pearl argues that this is what is happening in terms of causal thinking relative to statistical thinking. I think he, and other academics who make similar speech prohibition arguments, are hyperbolic, but I think it is important to consider whether we are limiting speech and knowledge in an important way. In many studies, we cannot directly see the causal structure, and statistics does have ways of helping us better understand it, even if it cannot point to a causal element directly. Causal thinking alone can lead to errors in thinking, and can be hijacked by those who deliberately want to do harm by spreading lies and false information. Sometimes regressions and correlations hint at possible causal structures or completely eliminate others from consideration. The point is that statistics is still useful, but that it is something we should lean into as a tool to help us identify causality, not as the endpoint of research beyond which we cannot make any assumptions or conclusions.
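To make that last point concrete, here is a minimal sketch of how a regression-style adjustment can rule a causal story in or out. It is my own toy example, not a method taken from Pearl's book: a hidden common cause Z drives both X and Y, so X and Y are strongly correlated even though neither causes the other, and controlling for Z makes the association all but vanish. The variable names and coefficients are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Toy model: a confounder Z causes both X and Y; X has no effect on Y.
z = rng.normal(size=n)
x = 2.0 * z + rng.normal(size=n)
y = 3.0 * z + rng.normal(size=n)

# The raw correlation makes X and Y look related...
print("corr(X, Y):    ", round(np.corrcoef(x, y)[0, 1], 2))

# ...but after regressing Z out of both, the residual correlation is near
# zero, which argues against a direct X -> Y effect in this toy model.
x_resid = x - np.polyval(np.polyfit(z, x, 1), z)
y_resid = y - np.polyval(np.polyfit(z, y, 1), z)
print("corr(X, Y | Z):", round(np.corrcoef(x_resid, y_resid)[0, 1], 2))
```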
Academics, such as Pearl and some genetic researchers, may want to push forward with ways of thinking that others consider taboo, and they sometimes fail to adequately understand and address the concerns that individuals have about their fields. Addressing these areas requires tact and an ability to connect research in fields deemed off limits to fields that are acceptable. The turn toward statistics and away from a language of causality may have been a missed opportunity in scientific understanding, but it is important to recognize that human minds have posited impossible causal connections throughout history, and that we needed statistics to help demonstrate how impossible those causal chains were. If causality became taboo, it was at least partly because there were major epistemic problems in the field of causality. The time may have come for addressing causality more directly, but I am not convinced that Pearl is correct in arguing that there is a prohibition on speech around causality, at least not if the opportunity exists to tactfully and responsibly address causality as I think he does in his book.
A Vaccine for Lies and Falsehoods

Vaccines are on everyone’s mind this year as we hope to move forward from the coronavirus pandemic, and I cannot help but think about today’s quote from Quassim Cassam’s book Vices of the Mind through a vaccine lens. While writing about ways to build and maintain epistemic virtues, Cassam writes, “only the inculcation and cultivation of the ability to distinguish truth from lies can prevent our knowledge from being undermined by malevolent individuals and organizations that peddle falsehoods for their own political or economic ends.” In other words, there is no vaccine for lies and falsehoods, only the hard work of building the skills to recognize truth, narrative, and outright lies.
I am also reminded of a saying that Steven Pinker included in his book Enlightenment Now: “any jackass can knock down a barn, but it takes a carpenter to build one.” Pinker’s line comes to mind here because building knowledge is hard, but spreading falsehoods is easy. Epistemic vices are easy, but epistemic virtues are hard.
Anyone can be closed-minded, anyone can use lies to try to better their own position, and anyone can be tricked by wishful thinking. It takes effort and concentration to be open-minded yet not gullible, to identify and counter lies, and to create and transmit knowledge for use by other people. The vast knowledge bases that humanity has built have taken years to develop, to weed out the inaccuracies, and to painstakingly home in on ever more precise and accurate understandings of the universe. All this knowledge and information has taken incredible amounts of hard work by people dedicated to building such knowledge.
But any jackass can knock it all down. Anyone can come along and attack science, attack knowledge, spread misinformation and deliberately use disinformation to confuse and mislead people. Being an epistemic carpenter and building knowledge is hard, but being a conman and acting epistemically malevolent is easy.
The task for all of us is to think critically about our knowledge, about the systems and structures that have facilitated our knowledge growth and development as a species over time, and to do what we can to be more epistemically virtuous. Only by working hard to identify truth, to improve systems for creating accurate information, and to enhance knowledge highways that help people learn and transmit knowledge effectively can we continue to move forward. At any point we can choose to throw sand in the gears of knowledge, bringing the whole system down, or we can find ways to make it harder to gum up the knowledge machinery we have built. We must do the latter if we want to continue to grow, develop, and live peacefully rather than at the mercy of the epistemically malevolent. After all, there is no vaccine to protect us from lies and falsehoods.
Lies Versus Epistemic Insouciance

My last post was about epistemic insouciance: being indifferent to whether or not your beliefs, statements, and ideas are accurate or inaccurate. Epistemic insouciance, Quassim Cassam argues in Vices of the Mind, is an attitude. It is a disposition toward accurate or false information that is generally case-specific.
In the book, Cassam distinguishes between lies and epistemic insouciance. He writes, “lying is something that a person does rather than an attitude, and the intention to conceal the truth implies that the liar is not indifferent to the truth or falsity of his utterances. Epistemic insouciance is an attitude rather than something that a person does, and it does imply an indifference to the truth or falsity of one’s utterances.”
The distinction is helpful when we think about people who deliberately lie and manipulate information for their own gain and people who are bullshitters. Liars, as the quote suggests, know and care about what information is true and what is false. They are motivated by factors beyond the accuracy of the information, and do their best within their lies to present false information as factual.
Bullshitters, however, don’t care whether their information is accurate. The tools that work to uncover inaccurate information and counter a liar don’t work against a bullshitter because of their epistemic insouciance. Liars contort evidence and create excuses for misstatements and lies. Bullshitters simply flood the space with claims and statements of varying accuracy. If confronted, they argue that it doesn’t matter whether they lied or not, and instead argue that their information was wrong, that they didn’t care about it being wrong, and as a result they were not actually lying. This creates circular arguments and distracts from the epistemic value of information and the real costs of epistemic insouciance. Seeing the difference between liars and epistemically insouciant bullshitters is helpful if we want to know how to address those who intentionally obstruct knowledge.

Paranormal Beliefs, Superstitions, and Conspiratorial Thinking

How we think, what we spend our time thinking about, and the way we view and understand the world is important. If we fail to develop accurate beliefs about the world, then we will make decisions based on causal structures that do not exist. Our actions, thoughts, and behaviors will inhibit knowledge for ourselves and others, and our species will be worse off because of it.
This idea is at the heart of Quassim Cassam’s book Vices of the Mind. Throughout our human history we have held many beliefs that cannot plausibly be true, or which we came to learn were incorrect over time. Cassam would argue (alongside others such as Steven Pinker, Yuval Noah Harari, and Joseph Henrich) that adopting more accurate and correct beliefs and promoting knowledge would help us systematically make better decisions to improve the lives of our fellow humans. Learning where we were wrong and using science, technology, and information to improve our decision-making has helped our world become less violent, given us more opportunity, provided better nutrition, and allowed us to be more cooperative on a global level.
This is why Cassam addresses paranormal beliefs, superstitions, and conspiratorial thinking in his book. While examining conspiracy theories in depth, he writes, “studies have also found that belief in conspiracy theories is associated with superstitious and paranormal beliefs, and it has been suggested that these beliefs are associated because they are underpinned by similar thinking styles” [the italicized text cites Swami et al. 2011]. Cassam argues that conspiracy theories are different from the other two modes of thinking because they can sometimes be accurate in their descriptions of the world. Sometimes a politician truly is running a corruption scheme, sometimes a group of companies is conspiring to keep prices high, and sometimes a criminal organization is hiding nefarious activities in plain sight. Conspiratorial thinking in some instances can reveal real causal connections in the world.
However, conspiratorial thinking is often bizarre and implausible. When our conspiratorial thinking pushes us off the deep end, it shares important characteristics with superstitious and paranormal thinking. All three posit causal connections that cannot possibly exist between real or imagined phenomena. They create explanations that are inaccurate and prevent us from identifying real information about the world. Superstitions posit causal connections between random and unconnected events, and paranormal thinking posits causal connections between non-existent entities and real-world events. Conspiratorial thinking falls in line with both ways of thinking when it is not describing reality.
Over the last few years we have seen how conspiratorial thinking can be vicious, how it can inhibit knowledge, and how it can have real life-and-death consequences when it goes wrong. Superstitious thinking doesn’t generally seem to have consequences as severe, but it still prevents us from making the best possible decisions and still drives us to adopt incorrect worldviews, sometimes entrenching unfair biases and prejudices. Paranormal thinking has been a foundation of many world religions and fables used to teach lessons and encourage particular forms of behavior. However, if it does not describe the world in a real way, then the value of paranormal thinking is minimized, and we should seriously consider the harms that can come from it, such as anxiety, suicide, or hours of lost sleep. These ideas are important to consider because we need to make the best possible decisions based on the most accurate information possible if we want to continue to advance human societies, to live sustainably, and to continue to foster cooperation and community among all humans on a global scale. Thinking accurately takes practice, so pushing against unwarranted conspiracy theories, superstitions, and paranormal beliefs helps us build our epistemic muscles and improve our thinking overall.
Thinking Conspiratorially

Over the last few years a number of wild conspiracy theories have become popular. Former President Donald Trump embraced a conspiracy theory that the 2020 Presidential Election was rigged (it was not), supported the Qanon conspiracy theory, and did little to push back against conspiracy theories surrounding COVID-19. His actions, behaviors, and beliefs demonstrate that thinking conspiratorially can be an epistemic vice. His willingness to believe wild falsehoods obstructed knowledge for himself and his most ardent supporters.
However, thinking conspiratorially is not always an epistemic vice. One reason why conspiracy theories become so gripping and why people sometimes fall into them is because real conspiracies do occur. Nixon’s Watergate Scandal, Trump’s withholding of financial and military aid unless Ukraine announced an investigation into Joe Biden and his son, and fraud schemes uncovered by inspectors general and government auditors demonstrate that nefarious conspiracies sometimes are real. While thinking conspiratorially can become an epistemic vice, the same is true for anti-conspiratorial thinking.
In the book Vices of the Mind, Quassim Cassam quotes Dr. Charles Pigden of the University of Otago in New Zealand: “there is nothing inherently vicious about believing or being disposed to believe conspiracy theories.” Cassam argues that conspiratorial thinking is not an epistemic vice on its own, but is instead a context-dependent vice or virtue. He continues, “there are environments in which either way of thinking can be epistemically virtuous or vicious, and a way to capture this context-relativity is to describe these thinking styles as conditionally virtuous or vicious.”
The examples I used earlier show how conspiratorial thinking can be either virtuous or vicious. In the case of our former President, his conspiratorial thinking spread misinformation, suppressed true and accurate information, and created a set of false beliefs that some of his supporters believed so strongly that they stormed the United States Capitol in an attempt to stop Congress from certifying the election. The context of his conspiracy theories obstructed knowledge and caused substantial harm to life and property. However, a government auditor who notices inconsistencies in paperwork and accounting practices may be rewarded for thinking conspiratorially, at least to a point. Believing that something nefarious could possibly be going on will encourage the auditor to review financial statements and testimony from personnel with more scrutiny, potentially helping them uncover real fraud. Of course, they could still go too far and push the issue beyond reasonable bounds by thinking conspiratorially, but this type of thinking is conditionally virtuous when it discovers true fraud and improves knowledge about fraud schemes.
Given the dramatic consequences of conspiracy thinking over the last few years, it is easy to dismiss thinking conspiratorially as an epistemic vice. However, we should remember that it is only conditionally an epistemic vice, and that sometimes conspiracies do turn out to be true (or at least partially true). We don’t have to give every conspiracy our respect and attention, but when a conspiracy does appear to be grounded in reality and supported by real evidence, then we should not be too quick to dismiss it.
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how logical coherence of personal narratives is easier to achieve when we lack key information and have a limited set of experiences to draw from. The more we know and the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and tidy our narrative about the world should be, and the less confidence we should have about anything.
If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, say an investment strategy, then you should probably discount their certainty. They may still be right in the end, but their certainty shouldn’t be the factor that leads you to support the outcome they tell you is a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”
We tend to be very trusting. Our society and economy run on the trust that we place in complete strangers. Our inclination toward trust is what causes us to be so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions without having any concrete connection to reality.
Narrative Confidence

We like to believe that having more information will make us more confident in our decisions and opinions. The opposite, however, may be true. I have written in the past about a jam study, where participants who selected jam from a sample of a few jams were happier with their choice than participants who selected jam from a sample of several dozen options. More information and more choices seem like they should make us happier and more confident in our decisions, but those who selected from the small sample were the ones who were happier.
We like simple stories. They are easy for our brains to construct a narrative around and easy for us to have confidence in. The stories we tell ourselves and the conclusions we reach are often simplistic, often built on incomplete information, and often lack the nuance that is necessary to truly reflect reality. Our brains don’t want to work too hard, and don’t want to hold conflicting information that forces an unpleasant compromise. We don’t want to constantly wonder if we made the right choice, if we should do something different, if we need to try another option. We just want to make a decision and have someone tell us it was a good decision, regardless of the actual outcome or its impact on our lives, the lives of others, or the planet.
Daniel Kahneman writes about this in his book Thinking Fast and Slow. He describes a study (not the jam study) where participants were presented with either one side or two sides of an argument. They had to choose which side they agreed with and rate their confidence. “Participants who saw one-sided evidence were more confident of their judgments than those who saw both sides,” writes Kahneman. “This is just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.”
Learning a lot and truly understanding any given issue is challenging because it means we must build a complex picture of the world. We can’t rely on simple arguments and outlooks on life when we start to get into the weeds of an issue or topic. We will see that admirable people have tragic flaws. We will see that policies which benefit us may exploit others. We will find that things we wish to be true about who we are and the world we live in are only semi-true. Ignorance is bliss in the sense that knowing only a little bit about the world will allow you to paint a picture that makes sense to you, but that picture won’t be accurate and it won’t acknowledge the negative externalities that the story may create. Simplistic narratives may help us come together as sports fans, or as consumers, or as a nation, but we should all be worried about what happens when we have to accept the inaccuracies of our stories. How do we weave a complex narrative that will bring people across the world together in a meaningful and peaceful way without driving inequality and negative externalities? That is the challenge of the age, and unfortunately, the better we try to be at accurately depicting the world we inhabit, the less confident any of us will be about the conclusions and decisions for how we should move forward.
Familiarity vs Truth

People who wish to spread disinformation don’t have to try very hard to get people to believe that what they are saying is true, or that their BS at least has some element of truth to it. All it takes is frequent repetition. “A reliable way to make people believe in falsehoods,” writes Daniel Kahneman in his book Thinking Fast and Slow, “is frequent repetition, because familiarity is not easily distinguished from truth.”
Having accurate and correct representations of the world feels important to me. I really love science. I listen to lots of science-based podcasts, love sciency discussions with family members and friends, and enjoy reading science books. By accurately understanding how the world operates, by seeking to better understand the truth of the universe, and by developing better models and systems to represent the way nature works, I believe we can find a better future. I try not to fall unthinkingly into techno-utopian thinking, but I do think that having accurate beliefs and understandings is important for improving the lives of people across the planet.
Unfortunately, for many people, I don’t think that accurate and correct understandings of the world have such a high priority in their lives. I fear that religion and science may be incompatible or at odds with each other, and that there may be a willingness to accept inaccurate science or beliefs to support religious doctrine. I also fear that people in extractive industries may discount science, preferring to hold inaccurate beliefs that support their ability to profit through their extractive practices. Additionally, the findings, conclusions, and recommendations from science may just be scary for many ordinary people, and accepting what science says might be inconvenient or might require changes in lifestyle that people don’t want to make. When we are in these situations, it isn’t hard to imagine why we might turn away from scientific consensus in favor of something comfortable but wrong.
And this is where accurate representations of the universe face an uphill battle. Inaccuracies don’t need to be convincing, don’t really need to sound plausible, and don’t need to come from credible authorities. They just need to be repeated on a regular basis. When we hear something over and over, we start to become familiar with the argument, and we start to have trouble telling truth and falsehood apart. This happened in 2016 when the number one word associated with Hillary Clinton was emails. It happened with global warming when enough people suggested that human-caused CO2 emissions were not related to the climate change we see. And it happens every day in trite sayings and ideas, from trickle-down economics to the claim that popping your knuckles causes arthritis.
I don’t think that disproving inaccuracies is the best route to solving the problem of familiarity vs truth. I think the only thing we can hope to do is amplify those ideas, conclusions, experiments, and findings which accurately reflect the true nature of reality. We have to focus on what is true, not on all the misleading nonsense that gets repeated. We must repeat accurate statements about the universe so that they are what become familiar, rather than the mistaken ideas that become hard to distinguish from the truth.

A Science Fiction Message

Larry Niven wrote a letter to James Harmon for Harmon to publish in his book, Take My Advice, and in his letter Niven offers his 19 “Niven Laws,” his observations of how the world works. I highlighted law number 14 because it brings to life, in a fun way, the idea that other people do not think the same way that we do: “14. The only universal message in science fiction: There exist minds that think as well as you do, but differently.” I think that once we have graduated from high school or college, our 12 to 16 years in the academic world can leave us in a place where we look for right answers and assume that there is one correct way to look at everything in the world. Standardized tests, teachers who push a single viewpoint, and competitions to see who can have the highest grades and GPA create a world where we constantly rank our intelligence and compete to see whose answers are most in tune with the ideas of the person grading our tests. Niven uses science fiction to show that there are incredible thinkers in this world who can use their minds as well as anyone, but who can apply them in innovative ways. The diversity of science fiction is simply a reflection of the diversity of thought in the human mind, and in today’s Age of Superheroes that diversity is being celebrated, even as 24-hour news networks champion single viewpoints and commercials direct materialistic views of happiness into our homes.
Perhaps we can look at Niven’s understanding of what science fiction is, a collection of thoughts that differ from the everyday, to explain why superhero movies and shows have come to dominate lately. We want to see the world in new ways and imagine what the world could be if we adopted new viewpoints. This whole notion could be a counter-reaction to the media coverage of what happens in the “real world.”
What first drew me to Niven’s quote was the idea that we compare the way we think to others. Academia puts a mark on who thinks well and who does not, and it teaches us to identify ourselves based on the quality of our thought as graded by a professor or as outlined by a series of multiple-choice questions. Years of schooling can force an individual into pre-set manners of thinking and can encourage assimilation toward a single viewpoint. Niven’s quote uses science fiction to show that we do not have to judge the way others think, or the quality of their thoughts, based on how aligned they are with what we consider to be normal or standard. The idea in the quote above shows that we can celebrate other ways of thinking without judgment.
It may be argued that those who study creative subjects such as literature or art throughout school will be more open to the idea of looking at multiple perspectives, while those who study science, math, or engineering are more closed off to the possibility of multiple answers. For me this is too simple an explanation for differences in thought and education. My first venture into the world of quantum physics was by reading the book The Dancing Wu Li Masters by Gary Zukav. Zukav explains that the science of quantum mechanics, which deals with what happens at the sub-atomic level inside the atom of every element, is a shifting science in which our interactions with the experiments we perform provide us with mixed results. The most famous example of this is how we study light. Depending on the test we run and how we measure it, we can show light to behave as a wave or as a particle. There is no single right answer to the question of exactly what light is, and the problem can only be approached by creative people who do not accept a single right answer but can look for answers in new directions. Another example of people abandoning the idea of a single right answer in science comes from the book I am currently reading, Stuff Matters by Mark Miodownik. In his book Miodownik dives into the world of materials science and shows us how new thinking and new applications of materials have changed our world in ways that are often hidden to most of us. He explains how complex concrete is and how modern-day concrete was created in part through the innovations of a gardener who simply wanted to build stronger pots. It took the unique viewpoint of someone outside the world of materials science to find an answer to concrete’s tendency to crack and crumble. As luck would have it, concrete and steel have very similar coefficients of thermal expansion, which allows concrete to be poured around steel, giving the material extra tensile strength even when poured in unique shapes.
If we stick to the idea that there is only one correct answer for everything, and if we assume that others think like us, then we are left in a world where we cease to innovate. We must remember, and science fiction helps us to do so, that we do not think “better” than anyone else. We simply all think in different ways, and combining our different ways of thinking is what builds a better world. Science fiction today has come to celebrate unique thoughts and ideas, and there is plenty of room to expand our celebration of unique thought to areas of the world where the dominant idea is that there is “one right answer.”

An Important Task

Scott Russell Sanders continues in his letter of advice for James Harmon’s book, Take My Advice, writing about self-awareness, spirituality, and philosophical ideas. Towards the end of his letter he writes, “To understand as well as we can who we truly are and in what sort of world we have been set down may be our most important task.” What I like about this quote is that it takes away the importance of obtaining material things and addresses the questions and doubts that we all constantly sift through. For Sanders, what this quote shows is the value of objectively understanding ourselves, the world, and our place in it.
The first part of Sanders’ quote speaks to me about the purpose of self-reflection. Being able to think about what we are good at, what we enjoy and why, and what we truly want in life will help us find a path that is comfortable and appropriate for us. This is a truly important task for each of us, because an increased self-awareness will allow us to begin to live our lives intentionally rather than living in a reactionary way. We do not have to chase the goals that our friends, the media, religion, and family tell us we need to chase. Self-awareness, knowing who we are and what we truly desire, will allow us to find a meaningful path to follow to a destination that we will be happy with.
The second half of Sanders’ quote seems to be a little more difficult in my opinion. I have come across many people who write and speak about self-awareness, and while the road to self-awareness is bumpy and full of obstacles (especially when you first set out), the road to a true understanding of the world we are in is more challenging and subjective. Self-reflection (examining your goals, desires, motivations, and skills) takes practice, and it can be hard to learn that life should not be judged by the sports car you drive, but there seems to be something more challenging about finding true sources for understanding the world. We will each approach the world with different perspectives and experiences, and we will each assign different values to ideas and topics. I do not think we can honestly understand the world if we have not first mastered honestly knowing ourselves, and even then it is a constant practice to sort out the good information from the bad. I am not saying, and I don’t think that Sanders would either, that we should just look for positive information about the world, but that we should search for objective information about things that will actually matter and have meaningful impacts in the world. With the avalanche of information on the internet, it is easy to get lost among fake news stories that do not represent the true world we live in. At the same time, we all have so many unique niche interests that we investigate and learn about, and each of these interests builds new experiences and perspectives through which we can understand the world.
I think that Sanders, in reaction to my writing, would say that the first step toward fulfilling our important task on this planet is to understand ourselves, including our perspective and how our experiences have shaped that perspective. Next, Sanders would argue that we should absorb as many other perspectives as possible to help us begin to view the world in a new and meaningful way. This involves vigorous research on our part to sift through the nonsense and gossip.