Restorative Justice

If our justice system were to change from a vehicle for legal revenge into a system focused on deterring criminals and helping them reintegrate into society in a meaningful way, ultimately preventing recidivism, what would the system look like? In his book The Better Angels of Our Nature, Steven Pinker argues that the system might look like the reconciliation process that took place in South Africa under the leadership of Nelson Mandela and Desmond Tutu.
 
 
Steadfast truth telling and accepting incomplete justice are key parts of that system that are missing from our current system. Pinker writes, “though truth-telling sheds no blood, it requires a painful emotional sacrifice on the part of the confessors in the form of shame, guilt, and a unilateral disarmament of their chief moral weapon, the claim to innocence.” Incomplete justice means that not every score needs to be settled. We don’t need to take an eye for an eye. It is okay if the punishment is not as violent and severe as the original crime; justice can be incomplete but still be compelling, still be just, and still lead to a better future.
 
 
Truth telling helps overcome the moralization gap, where we dismiss the harms we have done to the world while focusing only on how we feel we have been harmed. Truth telling requires that we acknowledge that we have harmed others and think about the world through their perspective. This may not seem like a substantial punishment, but the person still suffers a cost. “The punishment takes the form of hits to their reputation, prestige, and privileges rather than blood for blood,” describes Pinker.
 
 
This system actually allows for healing and acceptance. Justice based on revenge cuts people down and hinders advancement, healing, and acceptance. What is more important in the long run, rather than perfectly equal punishment in relation to crime, is that we become more cooperative, less violent, and less likely to commit crimes in the future. Revenge is a powerful motivating force, but it doesn’t serve the world as well as reconciliation. 

Evaluating Our Post Truth Moment

During the Trump Presidency I frequently heard people saying that we now live in a “post truth” society. People simply believed what they wanted to believe, the former President included, and the reality or veracity of information no longer mattered. All sources of knowledge were treated as valid, as long as the source provided the information we wanted to believe.
 
 
But do we really live in a post truth society? I am not so sure that truth no longer matters. I am also not sure that what we are seeing, with people choosing to believe things that cannot possibly be true, is actually new. What seems to have happened during the Trump Presidency is that numerous people became dramatically attached to Trump, the identity he represented, and the cultural values he reflected. They agreed that they would not validate or recognize any information that ran against what Trump said or that was politically damaging for him. People chose to exercise political power over the veracity of information. That is disconcerting, but it isn’t really anything new in humanity. We hadn’t seen it in the United States at such a high level (at least not in my 30-year lifetime), but humanity has seen such behavior in the past.
 
 
In his book The Better Angels of Our Nature, Steven Pinker writes, “faith, revelation, tradition, dogma, authority, and the ecstatic glow of subjective certainty – all are recipes for error, and should be dismissed as sources of knowledge.” The post truth moment we lived through included knowledge grounded in many of the sources Pinker suggests we discard, and it mirrors past human experiences of deriving knowledge from those sources. Trump was not the only authoritarian to claim that something was right (and to believe it himself) simply because it came from him or was something he said. Trump was elected on the dogma that a good business person was needed to run the government like a good business, and the ecstatic glow of subjective certainty played a role in many people feeling that Trump’s electoral victory (or demise) was inevitable.
 
 
And all of these things have been seen in the past. Pinker writes, “the history of human folly, and our own susceptibility to illusions and fallacies, tell us that men and women are fallible.” We were afraid of the post truth moment that Trump fueled, but it was nothing new, and coming out the other side we seem to be doing a better job of tying our knowledge and beliefs to empirical facts and data. Despite the upheaval of Trump’s four years in office, for almost all of us, our success in society depends on accurate interpretations of reality, not on illusions and beliefs born out of faith, tradition, or pure desires for how we want reality to be. In some large and concerning ways truth may take a back seat to our own desires, but for almost all of us, our daily lives still depend on accurate information.
Truth is a Poor Test for Knowledge

We live in what is being called a post-truth world, where facts don’t seem to stand up on their own and motivated reasoning drives what people believe. Politicians, activists, and people of note say wild things without regard to accuracy. Against this backdrop, many people have begun to argue that we need more truth in our news, statements, and beliefs.
 
 
This quest for truth is noble, but it also has its downsides. The COVID-19 pandemic is an example of how standards around truth can become self-defeating and can contribute to people’s motivated reasoning and cynicism around information. Science has moved very quickly with regard to COVID-19, but that has often meant changing recommendations for how to stay healthy. We have changed what we know about infection rates, hospitalization rates, treatment, prevention, and death. This means that what people know and believe about the disease may change on a weekly or monthly basis, and consequently public policy and recommendations change. Unfortunately, that change can be a difficult process. Former Press Secretary Sean Spicer unfairly used the quick changes in science around COVID for political purposes in a tweet. On the other end of the spectrum, people are not happy with how slowly some regulations update in the face of changing science, as George Mason economist Bryan Caplan unfairly mocked in another tweet.
 
 
Yuval Noah Harari would argue that truth shouldn’t be the goal. In his book Sapiens, Harari writes, “truth is a poor test of knowledge. The real test is utility. A theory that enables us to do new things constitutes knowledge.” We treat scientific knowledge and information about the world as clear and deterministic. The reality is that our scientific knowledge and understanding of the world is incomplete, especially on a personal level. We all live with models for reality, and we should not make complete truth and accuracy our goal. We should strive to be as accurate and truthful as possible, but we should recognize that knowledge comes from how well our models work in the real world. Improved information along with more accurate and true knowledge should help us perform better, do new things, make new advances, and improve the world. We don’t have to mock science, policy, or the statements of others. We need to look for ways to update our models and theories so that we can do the most with what we know. We should be willing to update when we learn that our information is not true or accurate. Holding ourselves to impossible truth standards doesn’t help us build knowledge, and can actually be an obstacle to developing knowledge. 
Talking About Causation - Judea Pearl - The Book of Why - Joe Abittan

In The Book of Why Judea Pearl argues that humans are better at modeling, predicting, and identifying causation than we like to acknowledge. For Pearl, the idea that we can see direct causation and study it scientifically is not a radical and naïve belief, but a common sense and defensible observation about human pattern recognition and intuition of causal structures in the world. He argues that we are overly reliant on statistical methods and randomized controlled trials that suggest relationships, but never tell us exactly what causal mechanisms are at the heart of such relationships.
One of the greatest frustrations for Pearl is the limitations he feels have been placed around ideas and concepts for causality. For Pearl, there is a sense that certain research, certain ways of talking about causality, and certain approaches to solving problems are taboo, and that he and other causality pioneers are unable to talk in a way that might lead to new scientific breakthroughs. Regarding a theory of causation and the history of our study of causality, he writes, “they declared those questions off limits and turned to developing a thriving causality-free enterprise called statistics.”
Statistics doesn’t tell us a lot about causality. Statistical thinking is a difficult way for most people to think, and for non-statistically trained individuals it leads to frustrations. I remember around the time of the 2020 election that Nate Silver, a statistics wonk at Fivethirtyeight.com, posted a cartoon where one person was trying to explain the statistical chance of an outcome to another person. The other person interpreted statistical chances as either 50-50 or all or nothing. They interpreted a low probability event as a certainty that something would not happen and a high probability event as a certainty that it would happen, while more middle ground probabilities were simply lumped in as 50-50 chances. Statistics helps us understand these probabilities in terms of the outcomes we see, but doesn’t actually tell us anything about the why behind the statistical probabilities. That, I think Pearl would argue, is part of where the confusion stems from for the individual in the cartoon who had trouble with statistics.
Humans think causally, not statistically. However, our statistical studies and the accepted way of doing science push against our natural causal mindsets. This has helped us better understand the world in many ways, but Pearl argues that we have lost something along the way. He argues that we needed to be building better ways of thinking about causality, and building models and theories of causality, at the same time that we were building and improving our studies of statistics. Instead, statistics took over as the only responsible way to discuss relationships between events, with causality becoming taboo.
“When you prohibit speech,” Pearl writes, “you prohibit thought and stifle principles, methods, and tools.” Pearl argues that this is what is happening in terms of causal thinking relative to statistical thinking. I think he, and other academics who make similar speech prohibition arguments, are hyperbolic, but I think it is important to consider whether we are limiting speech and knowledge in an important way. In many studies, we cannot directly see the causal structure, and statistics does have ways of helping us better understand it, even if it cannot point to a causal element directly. Causal thinking alone can lead to errors in thinking, and can be hijacked by those who deliberately want to do harm by spreading lies and false information. Sometimes regressions and correlations hint at possible causal structures or completely eliminate others from consideration. The point is that statistics is still useful, but that it is something we should lean into as a tool to help us identify causality, not as the endpoint of research beyond which we cannot make any assumptions or conclusions.
Academics, such as Pearl and some genetic researchers, may want to push forward with ways of thinking that others consider taboo, and sometimes fail to adequately understand and address the concerns that individuals have about the fields. Addressing these areas requires tact and an ability to connect research in fields deemed off limits to the fields that are acceptable. Statistics and a turn away from a language of causality may have been a missed opportunity in scientific understanding, but it is important to recognize that human minds have posited impossible causal connections throughout history, and that we needed statistics to help demonstrate how impossible these causal chains were. If causality became taboo, it was at least partly because there were major epistemic problems in the field of causality. The time may have come for addressing causality more directly, but I am not convinced that Pearl is correct in arguing that there is a prohibition on speech around causality, at least not if the opportunity exists to tactfully and responsibly address causality as I think he does in his book.
A Vaccine for Lies and Falsehoods

Vaccines are on everyone’s mind this year as we hope to move forward from the Coronavirus Pandemic, and I cannot help but think about today’s quote from Quassim Cassam’s book Vices of the Mind through a vaccine lens. While writing about ways to build and maintain epistemic virtues Cassam writes, “only the inculcation and cultivation of the ability to distinguish truth from lies can prevent our knowledge from being undermined by malevolent individuals and organizations that peddle falsehoods for their own political or economic ends.” In other words, there is no vaccine for lies and falsehoods, only the hard work of building the skills to recognize truth, narrative, and outright lies.
I am also reminded of a saying that Steven Pinker included in his book Enlightenment Now, “any jackass can knock down a barn, but it takes a carpenter to build one.” This quote comes to mind when I think about Cassam’s quote because building knowledge is hard, but spreading falsehoods is easy. Epistemic vices are easy, but epistemic virtues are hard.
Anyone can be closed-minded, anyone can use lies to try to better their own position, and anyone can be tricked by wishful thinking. It takes effort and concentration to be open-minded yet not gullible, to identify and counter lies, and to create and transmit knowledge for use by other people. The vast knowledge bases that humanity has built have taken years to develop, to weed out the inaccuracies, and to painstakingly hone in on ever more precise and accurate understandings of the universe. All this knowledge and information has taken incredible amounts of hard work by people dedicated to building such knowledge.
But any jackass can knock it all down. Anyone can come along and attack science, attack knowledge, spread misinformation and deliberately use disinformation to confuse and mislead people. Being an epistemic carpenter and building knowledge is hard, but being a conman and acting epistemically malevolent is easy.
The task for all of us is to think critically about our knowledge, about the systems and structures that have facilitated our knowledge growth and development as a species over time, and to do what we can to be more epistemically virtuous. Only by working hard to identify truth, to improve systems for creating accurate information, and to enhance knowledge highways to help people learn and transmit knowledge effectively can we continue to move forward. At any point we can choose to throw sand in the gears of knowledge, bringing the whole system down, or we can find ways to make it harder to gum up the knowledge machinery we have built. We must do the latter if we want to continue to grow, develop, and live peacefully rather than at the mercy of the epistemically malevolent. After all, there is no vaccine to cure us from lies and falsehoods.
Lies Versus Epistemic Insouciance

My last post was about epistemic insouciance: being indifferent to whether or not your beliefs, statements, and ideas are accurate or inaccurate. Epistemic insouciance, Quassim Cassam argues in Vices of the Mind, is an attitude. It is a disposition toward accurate or false information that is generally case specific.
In the book, Cassam distinguishes between lies and epistemic insouciance. He writes, “lying is something that a person does rather than an attitude, and the intention to conceal the truth implies that the liar is not indifferent to the truth or falsity of his utterances. Epistemic insouciance is an attitude rather than something that a person does, and it does imply an indifference to the truth or falsity of one’s utterances.”
The distinction is helpful when we think about people who deliberately lie and manipulate information for their own gain and people who are bullshitters. Liars, as the quote suggests, know and care about what information is true and what is false. They are motivated by factors beyond the accuracy of the information, and do their best within their lies to present false information as factual.
Bullshitters, however, don’t care whether their information is accurate. The tools that work to uncover inaccurate information and counter a liar don’t work against a bullshitter because of their epistemic insouciance. Liars contort evidence and create excuses for misstatements and lies. Bullshitters simply flood the space with claims and statements of varying accuracy. If confronted, they argue that it doesn’t matter whether they lied or not, and instead argue that their information was wrong, that they didn’t care about it being wrong, and as a result they were not actually lying. This creates circular arguments and distracts from the epistemic value of information and the real costs of epistemic insouciance. Seeing the difference between liars and epistemically insouciant bullshitters is helpful if we want to know how to address those who intentionally obstruct knowledge.

Paranormal Beliefs, Superstitions, and Conspiratorial Thinking

How we think, what we spend our time thinking about, and the way we view and understand the world is important. If we fail to develop accurate beliefs about the world, then we will make decisions based on causal structures that do not exist. Our actions, thoughts, and behaviors will inhibit knowledge for ourselves and others, and our species will be worse off because of it.
This idea is at the heart of Quassim Cassam’s book Vices of the Mind. Throughout our human history we have held many beliefs that cannot plausibly be true, or which we came to learn were incorrect over time. Cassam would argue (alongside others such as Steven Pinker, Yuval Noah Harari, and Joseph Henrich) that adopting more accurate and correct beliefs and promoting knowledge would help us systematically make better decisions to improve the life of our fellow humans. Learning where we were wrong and using science, technology, and information to improve our decision-making has helped our world become less violent, given us more opportunity, provided better nutrition, and allowed us to be more cooperative on a global level.
This is why Cassam addresses paranormal beliefs, superstitions, and conspiratorial thinking in his book. While examining conspiracy theories in depth, he writes, “studies have also found that belief in conspiracy theories is associated with superstitious and paranormal beliefs, and it has been suggested that these beliefs are associated because they are underpinned by similar thinking styles” (the italicized text is cited to Swami et al. 2011). Cassam argues that conspiracy theories are different from the other two modes of thinking because they can sometimes be accurate in their descriptions of the world. Sometimes a politician truly is running a corruption scheme, sometimes a group of companies are conspiring to keep prices high, and sometimes a criminal organization is hiding nefarious activities in plain sight. Conspiratorial thinking in some instances can reveal real causal connections in the world.
However, conspiratorial thinking is often bizarre and implausible. When our conspiratorial thinking pushes us off the deep end, it shares important characteristics with superstitious and paranormal thinking. All three posit causal connections that cannot possibly exist between phenomena happening or imagined in the real world. They create explanations that are inaccurate and prevent us from identifying real information about the world. Superstitions posit causal connections between random and unconnected events, and paranormal thinking posits causal connections between non-existent entities and real world events. Conspiratorial thinking falls in line with both ways of thinking when it is not describing reality.
Over the last few years we have seen how conspiratorial thinking can be vicious, how it can inhibit knowledge, and how it can have real life and death consequences when it goes wrong. Superstitious thinking doesn’t generally seem to have consequences as severe, but it still prevents us from making the best possible decisions and still drives us to adopt incorrect worldviews, sometimes entrenching unfair biases and prejudices. Paranormal thinking has been a foundation of many world religions and fables used to teach lessons and encourage particular forms of behavior. However, if it does not describe the world in a real way, then the value of paranormal thinking is minimized, and we should seriously consider the harms that can come from paranormal thinking, such as anxiety, suicide, or hours of lost sleep. These ideas are important to consider because we need to make the best possible decisions based on the most accurate information possible if we want to continue to advance human societies, to live sustainably, and to continue to foster cooperation and community between all humans on a global scale. Thinking accurately takes practice, so pushing against unwarranted conspiracy theories, superstitions, and paranormal beliefs helps us build our epistemic muscles to improve thinking overall.
Thinking Conspiratorially

Over the last few years a number of wild conspiracy theories have become popular. Former President Donald Trump embraced a conspiracy theory that the 2020 Presidential Election was rigged (it was not), supported the Qanon conspiracy theory, and did little to push back against conspiracy theories surrounding COVID-19. His actions, behaviors, and beliefs demonstrate that thinking conspiratorially can be an epistemic vice. His willingness to believe wild falsehoods obstructed knowledge for himself and his most ardent supporters.
However, thinking conspiratorially is not always an epistemic vice. One reason why conspiracy theories become so gripping and why people sometimes fall into them is because real conspiracies do occur. Nixon’s Watergate Scandal, Trump’s withholding of financial and military aid unless Ukraine announced an investigation into Joe Biden and his son, and fraud schemes uncovered by inspectors general and government auditors demonstrate that nefarious conspiracies sometimes are real. While thinking conspiratorially can become an epistemic vice, the same is true for anti-conspiratorial thinking.
In the book Vices of the Mind, Quassim Cassam quotes Dr. Charles Pigden from the University of Otago in New Zealand by writing, “there is nothing inherently vicious about believing or being disposed to believe conspiracy theories.” Cassam argues that conspiratorial thinking is not an epistemic vice on its own, but is instead a context dependent vice or virtue. He continues, “there are environments in which either way of thinking can be epistemically virtuous or vicious, and a way to capture this context-relativity is to describe these thinking styles as conditionally virtuous or vicious.”
The examples I used earlier show how conspiratorial thinking can be either virtuous or vicious. In the case of our former President, his conspiratorial thinking spread misinformation, suppressed true and accurate information, and created a set of false beliefs that some of his supporters believed so strongly that they stormed the United States Capitol in an attempt to stop Congress from certifying the election. The context of his conspiracy theories obstructed knowledge and caused substantial harm to life and property. However, a government auditor who notices inconsistencies in paperwork and accounting practices may be rewarded for thinking conspiratorially, at least to a point. Believing that something nefarious could possibly be going on will encourage the auditor to review financial statements and testimony from personnel with more scrutiny, potentially helping them uncover real fraud. Of course, they could still go too far and push the issue beyond reasonable bounds by thinking conspiratorially, but this type of thinking is conditionally virtuous when it discovers true fraud and improves knowledge about fraud schemes.
Given the dramatic consequences of conspiracy thinking over the last few years, it is easy to dismiss thinking conspiratorially as an epistemic vice. However, we should remember that it is only conditionally an epistemic vice, and that sometimes conspiracies do turn out to be true (or at least partially true). We don’t have to give every conspiracy our respect and attention, but when a conspiracy does appear to be grounded in reality and supported by real evidence, then we should not be too quick to dismiss it.
Discount Confidence

You should probably discount confidence, even your own, when it comes to the certainty of a given outcome or event. I previously wrote about confidence stemming from the logical coherence of the story we are able to tell ourselves. I have also written about how logical coherence of personal narratives is easier when we lack key information and have a limited set of experiences to draw from. The more we know, the more experiences we have, the harder it becomes to construct a narrative that can balance conflicting and competing information. Laddering up from this point, we should be able to see that the more detailed and complete our information, the less coherent and easily logical our narrative about the world should be, and the less confidence we should have about anything.

 

If you have a high level of confidence in your own intuitions, then you probably don’t know enough about the world. If someone tells you they are very confident in something, like say an investment strategy, then you should probably discount the outcome based on their certainty. They may still be right in the end, but their certainty shouldn’t be a factor that leads to your support of the outcome they tell you is a sure thing. As Daniel Kahneman writes in Thinking Fast and Slow, “The confidence that people have in their intuitions is not a reliable guide to their validity. In other words, do not trust anyone – including yourself – to tell you how much you should trust their judgment.”

 

We tend to be very trusting. Our society and economy run on the trust we place in complete strangers. Our inclination toward trust is what causes us to be so easily fooled by confidence. It is easy to assume that someone who has a lot of confidence in something is more trustworthy, because we assume they must know a lot in order to be so confident. But as I laid out at the start of this post, that isn’t always the case. In fact, the more knowledge you have about something, the less confidence you should have. With more knowledge comes more understanding of nuance, better conceptions of areas of uncertainty, and a better sense of trade-offs and contradictions. Confidence alone is not a predictor of accuracy. Our assumptions influence how accurate our predictions are, and we can be very confident in our assumptions without having any concrete connection to reality.
Narrative Confidence

We like to believe that having more information will make us more confident in our decisions and opinions. The opposite, however, may be true. I have written in the past about a jam study, where participants who selected jam from a sample of a few jams were happier with their choice than participants who selected from several dozen jam options. It seems like more information and more choices should make us happier and more confident in our decisions, but those who selected from the small sample were the happier ones.

 

We like simple stories. They are easy for our brain to construct a narrative around and easy for us to have confidence in. The stories we tell ourselves and the conclusions we reach are often simplistic, often built on incomplete information, and often lack the nuance that is necessary to truly reflect reality. Our brains don’t want to work too hard, and don’t want to hold conflicting information that forces an unpleasant compromise. We don’t want to constantly wonder if we made the right choice, if we should do something different, if we need to try another option. We just want to make a decision and have someone tell us it was a good decision, regardless of the actual outcome or impact on our lives, the lives of others, or the planet.

 

Daniel Kahneman writes about this in his book Thinking Fast and Slow. He describes a study (not the jam study) where participants were presented with either one side or two sides of an argument. They had to choose which side they agreed with and rate their confidence. “Participants who saw one-sided evidence were more confident of their judgments than those who saw both sides,” writes Kahneman, “This is just what you would expect if the confidence that people experience is determined by the coherence of the story they manage to construct from available information. It is the consistency of the information that matters for a good story, not its completeness. Indeed, you will often find that knowing little makes it easier to fit everything you know into a coherent pattern.”

 

Learning a lot and truly understanding any given issue is challenging because it means we must build a complex picture of the world. We can’t rely on simple arguments and outlooks on life when we start to get into the weeds of an issue or topic. We will see that admirable people have tragic flaws. We will see that policies which benefit us may exploit others. We will find that things we wish to be true about who we are and the world we live in are only semi-true. Ignorance is bliss in the sense that knowing only a little bit about the world will allow you to paint a picture that makes sense to you, but it won’t be accurate about the world and it won’t acknowledge the negative externalities that the story may create. Simplistic narratives may help us come together as sports fans, or as consumers, or as a nation, but we should all be worried about what happens when we have to accept the inaccuracies of our stories. How do we weave a complex narrative that will bring people across the world together in a meaningful and peaceful way without driving inequality and negative externalities? That is the challenge of the age, and unfortunately, the better we try to be at accurately depicting the world we inhabit, the less confident any of us will be about the conclusions and decisions for how we should move forward.