The Poisson Nature of War

When we look back at history and explain why the world is the way it is, we rarely attribute specific causes and results to chance. We don’t say that a group of terrorists happened to choose to fly planes into the World Trade Center on 9/11. We don’t say that a new technology happened to come along to advance the economy. And we don’t say that a war between two countries happened to break out. But in some ways it would make more sense for us to look back at history and view events as chance contingencies. Steven Pinker argues that we should do this when we look back at history’s wars.
 
 
Specifically, when we take a statistical view of the history of war, we see that the onsets of wars follow a Poisson process. When we record all the wars in human history, we see lots of short intervals between wars and fewer and fewer long gaps. When we look back at history and try to explain wars from a causal standpoint, we don’t look at the pauses and gaps between wars; we look instead at the triggering factors and the buildup to war. But what the statistics argue is that we are often seeing causal patterns and narratives where none truly exist. Pinker writes, “the Poisson nature of war undermines historical narratives that see constellations in illusory clusters.”
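To make that idea concrete, here is a minimal sketch in Python (with a made-up average interval; this is not Pinker’s data) that simulates war onsets as a Poisson process and counts how many onsets land in each 20-year window. Clusters and lulls show up even though every event is independent of the last.

```python
import random

# Illustrative only: simulate "war" onsets as a Poisson process with a
# hypothetical average of one onset every 5 years, then look at how the
# onsets bunch together even though each one is independent of the others.
random.seed(42)
MEAN_GAP_YEARS = 5.0   # hypothetical average interval between onsets
N_EVENTS = 30

# In a Poisson process the gaps between events are exponentially distributed.
gaps = [random.expovariate(1.0 / MEAN_GAP_YEARS) for _ in range(N_EVENTS)]

# Event times are the running total of the gaps.
times = []
t = 0.0
for g in gaps:
    t += g
    times.append(t)

# Count how many onsets fall inside each consecutive 20-year window.
window = 20.0
counts = []
start = 0.0
while start < times[-1]:
    counts.append(sum(start <= x < start + window for x in times))
    start += window

print("onset years:", [round(x, 1) for x in times])
print("onsets per 20-year window:", counts)
# Some windows hold several onsets and others hold none -- clusters and
# lulls appear even though nothing causal links one event to the next.
```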
 
 
We see one war as leading to another war. We see a large war as making people weary of fighting and death, ultimately leading to a long period of peace. We create narratives that explain the patterns we perceive, even if the patterns are not really there. Pinker continues,
 
 
“Statistical thinking, particularly an awareness of the cluster illusion, suggests that we are apt to exaggerate the narrative coherence of this history – to think that what did happen must have happened because of historical forces like cycles, crescendos, and collision courses.”
 
 
We don’t like to attribute history to chance events. We don’t like to attribute historical decisions to randomness. We like cohesive narratives that weave together multiple threads of history, even when random individual choices or chance events are what shape those threads and narratives. Statistics shows us that the patterns we see are not always real, but that doesn’t stop us from trying to pull patterns out of the Poisson-distributed randomness of history anyway.
Random Clusters

The human mind is not good at randomness. The human mind is good at identifying and seeing patterns. The mind is so good at pattern recognition and so bad at randomness that we will often perceive a pattern in a situation where no pattern exists. We have trouble accepting that statistics are messy and don’t always follow a set pattern that we can observe and understand.
 
 
Steven Pinker points this out in his book The Better Angels of Our Nature, and I think it is an important point to keep in mind. He writes, “events that occur at random will seem to come in clusters, because it would take a nonrandom process to space them out.” This problem with our perception of randomness comes into play when our music streaming apps shuffle songs at random. If we have a large library of our favorite songs to choose from, some of those songs will be by the same artist. If we hear two or more songs from the same artist back to back, we will assume there is some sort of problem with the random shuffling of the streaming service. We should expect to naturally get clusters of songs by the same artist, or even off the same album, but it doesn’t feel random to us when it happens. To solve this problem, music streaming services deliberately add algorithms that stop songs from the same artist from appearing in clusters. This makes the shuffle less random overall, but it makes the shuffle feel more random to us.
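As a quick check on that intuition, here is a rough sketch (a hypothetical library of 10 artists with 5 songs each; this is not how any real streaming service works) that shuffles a playlist uniformly at random many times and measures how often at least one same-artist pair plays back to back.

```python
import random

# Hypothetical library: 10 artists with 5 songs each -- 50 songs total.
random.seed(0)
playlist = [(artist, song) for artist in range(10) for song in range(5)]

def has_back_to_back(order):
    """True if any two consecutive songs share an artist."""
    return any(a[0] == b[0] for a, b in zip(order, order[1:]))

trials = 10_000
hits = 0
for _ in range(trials):
    shuffled = playlist[:]
    random.shuffle(shuffled)   # a truly uniform shuffle
    if has_back_to_back(shuffled):
        hits += 1

print(f"shuffles with a same-artist cluster: {hits / trials:.1%}")
# With this library, a uniform shuffle almost always produces at least one
# same-artist pair -- exactly what listeners read as "the shuffle is broken."
```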
 
 
Pinker uses lightning to describe the process in more detail. “Lightning strikes are an example of what statisticians call a Poisson process,” he writes. “In a Poisson process, events occur continuously, randomly, and independently of one another. … in a Poisson process the intervals between events are distributed exponentially: there are lots of short intervals and fewer and fewer of them as they get longer and longer.”
 
 
To understand a Poisson process, we have to hold two ideas at once: the events are independent of one another, and the intervals between events are themselves variables worth examining, not just the events. Both are hard to do. It is hard to look at a basketball team and think that their next shot is independent of the previous shot (this is largely true). It is hard to look at customer complaints and see them as independent (also largely true), and it is hard to look at the history of human wars and think that those events are independent too (Pinker shows this to be largely true as well). We tend to see events as connected even when they are not, a perspective error on our part. We also look only at the events, not at the time between them. Treating the gaps between events as something with a statistical distribution we can analyze shifts our focus away from the events themselves; we can then ask what shaped the pause rather than what caused the event. That shift helps us see the independence between events and lets us analyze both the events and the gaps that follow them. Looking at the world this way can help us recognize Poisson processes, random distributions with clusters, and patterns that we might otherwise miss or misinterpret.
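A small illustration of that shift in focus: the sketch below (made-up mean gap, purely illustrative) generates the intervals between independent events and buckets them, showing the “lots of short intervals and fewer and fewer of them as they get longer” that Pinker describes.

```python
import random
from collections import Counter

# Illustrative only: in a Poisson process the gaps between events follow an
# exponential distribution. Generate many gaps and bucket them to see lots
# of short gaps and fewer and fewer long ones.
random.seed(1)
MEAN_GAP = 5.0   # hypothetical mean gap between events
gaps = [random.expovariate(1.0 / MEAN_GAP) for _ in range(10_000)]

# Bucket each gap into 5-unit bins and count how many fall in each.
bins = Counter(int(g // 5) * 5 for g in gaps)
for start in sorted(bins):
    bar = "#" * (bins[start] // 200)
    print(f"{start:>3}-{start + 5:<3} {bins[start]:>5} {bar}")
# The counts drop off roughly geometrically as the gaps get longer,
# matching the exponentially distributed intervals Pinker describes.
```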
 
 
All of these factors are part of probability and statistics, which our minds have trouble with. We like to see patterns and think causally. We don’t like the larger, perspective-shifting view that statistics demands. We don’t like to think that there is statistical probability at work without an easily distinguishable pattern we can attribute to specific causal structures. However, as lightning and other Poisson processes show us, sometimes the statistical perspective is the better perspective to have, and sometimes our brains run amok finding patterns that do not exist in random clusters.
Bias Versus Discrimination

In The Book of Why, Judea Pearl writes about a distinction between bias and discrimination that comes from Peter Bickel, a statistician from UC Berkeley. Regarding sex bias and discrimination in graduate admissions, Bickel separated the two terms in a way that I find interesting. Describing his distinction, Pearl writes the following:
“He [Bickel] carefully distinguishes between two terms that, in common English, are often taken as synonyms: bias and discrimination. He defines bias as a pattern of association between a particular decision and a particular sex of applicant. Note the words pattern and association. They tell us that bias is a phenomenon on rung one of the Ladder of Causation.”
Bias, Pearl explains using Bickel’s quote, is simply an observation. There is no causal mechanism at play when dealing with bias and that is why he states that it is on rung one of the Ladder of Causation. It is simply recognizing that there is a disparity, a trend, or some sort of pattern or association between two things.
Pearl continues, “on the other hand, he defines discrimination as the exercise of decision influenced by the sex of the applicant when that is immaterial to the qualification for entry. Words like exercise of decision, influenced, and immaterial are redolent of causation, even if Bickel could not bring himself to utter that word in 1975. Discrimination, unlike bias, belongs on rung two or three of the Ladder of Causation.”
Discrimination is an intentional act. There is a clear causal pathway that we can posit between the outcome we observe and the actions or behaviors of individuals. In the case that Bickel studied, sex disparities in admissions can be directly attributed to discrimination if it can be proven that immaterial considerations were the basis for rejecting women (or men) who applied. Discrimination does not happen all on its own; it happens because of something else. Bias can exist on its own. It can be caused by discrimination, but it could also be caused by larger structural factors that are not themselves actively making decisions to create the disparity. Biases are results, patterns, and associations we can observe. Discrimination is deliberate behavior that generates, sustains, and reinforces biases.
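To see how a rung-one association can arise without rung-two discrimination, here is a toy example with made-up numbers (inspired by the kind of admissions data Bickel analyzed, but not his actual figures): each department admits men and women at identical rates, yet the aggregate numbers show a stark association between sex and admission.

```python
# Hypothetical numbers, chosen only to illustrate the point.
departments = {
    # dept: (men_applied, men_admitted, women_applied, women_admitted)
    "A": (800, 480, 200, 120),   # 60% admit rate for both sexes
    "B": (200,  40, 800, 160),   # 20% admit rate for both sexes
}

men_applied = sum(d[0] for d in departments.values())
men_admitted = sum(d[1] for d in departments.values())
women_applied = sum(d[2] for d in departments.values())
women_admitted = sum(d[3] for d in departments.values())

print(f"overall men admitted:   {men_admitted / men_applied:.0%}")    # 52%
print(f"overall women admitted: {women_admitted / women_applied:.0%}") # 28%
# A clear pattern of association (bias) appears in the aggregate, yet within
# each department the rates are identical -- women simply applied more often
# to the department that admits fewer applicants of either sex.
```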
Post Hoc Conclusions

Our minds see a lot of patterns that don’t exist. We make observations of randomness and find patterns that we assume to be based on a causal link when in reality no causal structure exists between our observations. This can happen in three-point shooting in basketball, in observations of bomb locations in WWII London, and in music streaming services. We are primed to see patterns and causes, and we can construct them even when we shouldn’t. One contributing factor in this incorrect pattern detection is that we tend to draw post hoc conclusions, making observations after the fact without predicting what we might expect to see beforehand.

 

Using the WWII example, Cass Sunstein and Richard Thaler show in their book Nudge how people constructed false patterns in the German bombing of London during the war. The German bombing wasn’t precise, and there was no real pattern to the bombing raids or to where bombs actually exploded across the city. Nevertheless, people perceived a pattern in the random distribution of bombs. The authors describe the mistaken pattern identification by writing, “We often see patterns because we construct our informational tests only after looking at the evidence.”
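A quick sketch of why the bombing map looked patterned (purely illustrative, not the actual London data): scatter impacts uniformly at random over a grid of city blocks and count the hits per block. Even with no targeting at all, some blocks collect several hits while many get none.

```python
import random
from collections import Counter

# Illustrative only: 100 "bomb" impacts dropped uniformly at random on a
# 10x10 grid of city blocks.
random.seed(7)
N_BOMBS = 100
hits = Counter((random.randrange(10), random.randrange(10))
               for _ in range(N_BOMBS))

per_block = Counter(hits.values())
print("blocks with 0 hits:", 100 - len(hits))
for k in sorted(per_block):
    print(f"blocks with {k} hit(s): {per_block[k]}")
# Expect roughly a third of the blocks untouched and a handful hit three or
# more times -- clusters produced by nothing but chance, ready to be read
# as evidence of deliberate targeting.
```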

 

People could map where bombs fell and then create explanations for what targets the Germans were aiming at, why the Germans would target a certain part of the city, and what strategic purpose the bombing was trying to accomplish. But these reasons are all post hoc constructions meant to satisfy a non-existent pattern that someone expected to find. We also see this in basketball, when a shooter makes a few baskets and is believed to have the hot hand or be on fire. In music streaming services, algorithms are deliberately tweaked to be less random, because listeners who hear two or more consecutive songs by the same band will assume the service isn’t randomizing the music, even though random chance will sometimes produce a string of songs from the same band or even the same album.

 

The examples I mentioned in the previous paragraph are harmless cognitive errors stemming from poorly constructed post hoc explanations of phenomena. However, post hoc conclusions based on non-existent patterns are important to consider because they can have real consequences in our lives and societies. If we are in a position to make important decisions for our families, our companies, or our communities, we should recognize that we can be wildly wrong about observed patterns. It is important that we use better statistical techniques, or listen to the experts who can honestly employ them, to help us make decisions. We should not panic about meaningless stock market fluctuations, and we should not incarcerate people based on poor understandings of crime statistics. We should instead remember that our brains will look for patterns and find them even when they don’t actually exist. We should state our assumptions before we make observations, rather than drawing post hoc conclusions from flimsy justifications for the patterns we want to see.
Detecting Rules

Our brains are built to think causally and look for patterns. We benefit when we can recognize that some foods make us feel sick, when certain types of clouds mean rain is on the way, or when our spouse gives us a certain look that means they are getting frustrated with something. Being able to identify patterns helps us survive and form better social groups, but it can also lead to problems. Sometimes we detect patterns and rules when there aren’t any, and we can adopt strange superstitions, traditions, or behaviors that don’t help us and might have a cost.

 

Daniel Kahneman demonstrates this in his book Thinking Fast and Slow by showing a series of letters and asking us to think about which series of letters would be more likely in a random draw situation. If we had a bag of letter tiles for each letter of the English alphabet, and we selected tiles at random, we wouldn’t expect to get a word we recognized. However, sometimes through random chance we do get a complete word. If you played thousands of Scrabble games, eventually you might draw 7 tiles that make a complete word on your first turn. The reality is that drawing the letters MGPOVIT is just as statistically likely as drawing the letters MORNING.
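A back-of-the-envelope version of that claim (assuming a uniform draw over the 26 letters with replacement, which is simpler than real Scrabble tile counts): any specific seven-letter sequence has exactly the same probability, whether or not it spells a word.

```python
# Assumption: each of the 26 letters is equally likely on every draw, with
# replacement. Real Scrabble tiles have unequal counts, but the point about
# specific sequences still holds under this simplified model.
p_any_specific_sequence = (1 / 26) ** 7

for word in ("MGPOVIT", "MORNING"):
    print(f"P(drawing {word} in that order) = {p_any_specific_sequence:.3e}")
# Both come out to about 1.25e-10. MORNING only feels less likely because it
# is one of the rare sequences our pattern-hungry brains recognize.
```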

 

For our brains, however, seeing a full word feels less likely than a jumble of letters. As Kahneman explains, “We do not expect to see regularity produced by a random process, and when we detect what appears to be a rule, we quickly reject the idea that the process is truly random.”
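To put a number on how often randomness “appears to be a rule,” here is a short sketch (a fair coin standing in for makes and misses) that counts how many 20-flip sequences contain a run of at least four identical outcomes.

```python
import random

# Illustrative only: flip a fair coin 20 times, many times over, and ask how
# often the sequence contains a run of four or more identical outcomes --
# the kind of streak that feels like a rule rather than chance.
random.seed(3)

def longest_run(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

trials = 10_000
streaky = sum(
    longest_run([random.choice("HT") for _ in range(20)]) >= 4
    for _ in range(trials)
)
print(f"20-flip sequences with a run of 4+: {streaky / trials:.0%}")
# Roughly three quarters of purely random sequences contain such a streak,
# so "detecting a rule" in twenty shots tells us very little on its own.
```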

 

We can go out of our way trying to prove that something is behaving according to a rule when it is truly random. We can be sure that a pattern is taking place, even when there is no pattern occurring. This happens in basketball with the hot hand phenomenon and in science when researchers search for a significant finding that doesn’t really exist in the data from an experiment. Most of the time, this doesn’t cause a big problem for us. It’s not really a big deal if you believe you need to eat Pringles for your favorite sports team to win a playoff game. It only adds a small cost if you tackle some part of a house project in an inefficient way because you are sure you have better luck with your roundabout approach than with a more direct one.

 

However, once we start to see patterns that don’t exist in social life with other people, there can be serious consequences. The United States saw this in the early days of marijuana prohibition, as prejudice and racial fear overwhelmed the public through inaccurate stories of the drug’s dangers. Ancient people who sacrificed humans to bring about rain were fooled by false pattern recognition. We see our brains looking for rules when we examine how every act of the president influences political polls for an upcoming election. Our powerful pattern- and rule-detecting brains can help us in a lot of ways, but they can also waste our time, make us look foolish, and have huge externalities for society.
Biased in Predictable Ways

“A judgment that is based on substitution will inevitably be biased in predictable ways,” writes Daniel Kahneman in his book Thinking Fast and Slow. Kahneman uses an optical illusion to show how our minds can be tricked in specific ways that lead us to an incorrect conclusion. The key takeaway is that we can understand and predict our biases and how those biases will lead to specific patterns of thinking. The human mind is complex and varied, but the errors it makes can be studied, understood, and predicted.

 

We don’t like to admit that our minds are biased, and even if we are willing to admit a bias in our thinking, we are often even less willing to accept a negative conclusion about ourselves or our behavior resulting from such a bias. However, as Kahneman’s work shows, our biases are predictable and follow patterns. We know that we hold biases, and we know that certain biases can arise or be induced in certain settings. If we are going to accept these biases, then we must accept what they tell us about our brains and about their consequences, regardless of whether those consequences are trivial or have major implications for our lives and societies.

 

In a lot of ways, I think this describes the conflicts we are seeing in American society today. There are many situations where we are willing to admit that biases occur, but to admit and accept a bias implicates larger social phenomena. Admitting a bias can make it hard to deny that larger social and societal changes may be necessary, and the costs of change can be too high for some to accept. This puts us in situations where many deny that bias exists, or live in contradiction, accepting a bias but rejecting any remedy for its consequences. A bias can be accepted while the recognition that biases are predictable and understandable is rejected, despite the mental contradictions that arise.

 

As we have come to better understand how we behave and react to each other, we have studied more forms of bias in more settings. We know that we are quick to form in-groups and out-groups. We know that we see some people as more threatening than others, and that we are likely to have very small reactions that we might not consciously be aware of but that can nevertheless be perceived by others. Accepting and understanding these biases with an intention to change is difficult. It requires not just that one person adapt their behavior, but that many people change some aspect of their lives, often giving up material goods, resources, or status. There is so much anger and division in the United States today because many people are ready to accept these biases, to accept the science that Kahneman presents, and to make changes, while many others are not. Accepting the science of how the brain works and the biases it can produce challenges our sense of self, reveals things about us that we would rather leave in the shadows, and might call for changes that many of us don’t want to make, especially when a fiction that denies such biases helps propel our status.
Patterns of Associated Ideas

In Thinking Fast and Slow, Daniel Kahneman argues that our brains try to conserve energy by operating on what he calls System 1. The part of our brain that is intuitive, automatic, and makes quick assessments of the world is System 1. It doesn’t require intense focus, it quickly scans our environment, and it simply ignores stimuli that are not crucially important to our survival or the task at hand. System 1 is our low-power resting mode, saving energy so that when we need to, we can activate System 2 for more important mental tasks.

 

Without our conscious recognition, System 1 builds mental models of the world that shape the narrative we use to understand everything that happens around us. It develops simple associations and expectations for things like when we eat, what we expect people to look like, and how we expect the world to react when we move through it. Kahneman writes, “as these links are formed and strengthened, the pattern of associated ideas comes to represent the structure of events in your life, and determines your interpretations of the present as well as your expectations of the future.”

 

It isn’t uncommon for different people to watch the same TV show, read the same news article, or witness the same event and walk away with completely different interpretations. We might not like a TV show that everyone else loves. We might reach a vastly different conclusion from reading a news article about global warming, and we might interpret the actions or words of another person completely differently. Part of why we don’t all see things the same way, Kahneman might argue, is that we have all trained our System 1 in unique ways. We have different patterns of associated ideas that we use to fit information into a comprehensive narrative.

 

If you never have interactions with people who are different from you, then you might be surprised when people don’t behave the way you expect. When you have a limited background and experience, your System 1 will develop a pattern of associated ideas that might not generalize to situations that are new to you. How you see and understand the world is in some ways automatic, determined by the pattern of associated ideas that your System 1 has built over the years. It is unique to you, and it won’t fit perfectly with the associated ideas that other people develop.

 

We don’t have control over System 1. If we activate our System 2, we can start to influence what factors stand out to System 1, but under normal circumstances, System 1 will move along building a world that fits its experiences and expectations. This works if we want to move through the world on autopilot with few new experiences, but if we want to be more engaged and to better understand the variety of humanity that exists in the world, our System 1 on its own will never be enough, and it will continually let us down.

Peak, Trough, Rebound

Dan Pink’s book When: The Scientific Secrets of Perfect Timing includes a lot of interesting information about time, how we think about time, and how humans and our societies interact with time. It is one of the books I recommend the most because it includes a lot of interesting ideas that Pink does a good job of combining in ways that can really help with productivity and organizing one’s day. We all deal with time and never have enough of it, and Pink helps us think about how to best manage and use our time.
One interesting study that Pink shares has to do with mood and affect throughout the day. A study of Twitter posts showed a striking pattern among people across the globe. For most people, excluding night owls, the peak of the day comes about three to four hours after waking. From there, we slowly trend downward until we hit the middle of our trough in the mid-afternoon. But we rebound, and our mood and affect improve in the late evening. Pink writes, “Across continents and time zones, as predictable as the ocean tides, was the same daily oscillation – a peak, a trough, and a rebound. Beneath the surface of our everyday life is a hidden pattern: crucial, unexpected, and revealing.”
The study Pink references shows that we are not in the same mood and attitude continuously throughout the day. We have a point where we are at our zenith and best able to tackle the challenges that come at us. However, our energy drains, and our mood and attentiveness diminish. We become irritable and easily distracted, and we can see this happen through the adjectives and emotion included in people’s social media posts. Through breaks, and the end of the workday, our energy levels come back and we rebound, becoming happier and more creative. We get through the low part of our day and can be functioning human beings again. This isn’t just something that we sometimes feel; it is a clear pattern that is common to humans across the globe.
What I find so interesting about Pink’s book and why I have recommended it so much is that timing is everything for us. So much of our lives is impacted by the way we relate to time, but very few of us ever think about it. There are patterns all around us relating to time, but usually these patterns are hidden and unknown to us. When we look at them and understand them, we can start to adjust our days and how we schedule things that we do.
I find it incredible that we can look at people on Twitter and see their mood based on the adjectives and words used in their posts. What is even more incredible is that we can watch the mood and attitude of a region change through a day, and change in a rhythmic pattern. If we want to be effective, and want to help others be effective, we should think about these patterns and organize our days and activities in a way that corresponds to them. I have tried to do that in my life, and I find it helpful to set up my day so that I am doing particular activities in line with the peak, trough, rebound flow of my days. Timing is important, and it should be a purposeful part of our days.

Attribution Bias

Our brains are pretty impressive pattern recognition machines. We take in a lot of information about the world around us, remember stories, pull information together to form new thoughts and insights, move through the world based on the information we take in, and we are able to predict the results of actions before they have occurred. Our brain evolved to help us navigate a complex, dangerous, and uncertain world.

 

Today, however, while our world is arguably more complex and uncertain than ever, it might not be as dangerous on a general day-to-day basis. I’m pretty sure I won’t encounter any animals that may try to eat me when I sit at the park to read during my lunch break, I won’t need to distinguish between two types of berries to make sure I don’t eat the poisonous kind, and if the thunderstorms scheduled for this evening drop golf-ball-sized hail, I won’t have to worry too much about where I will find safety and shelter. Nevertheless, my evolved brain is still going to approach the world as if it were the dangerous place it was when my ancestors were evolving their thought capacities, and that will throw some monkey wrenches into my life and lead me to see patterns that don’t really exist.

 

Colin Wright has a great quote about this in his book Becoming Who We Need to Be. He writes, “You ascribe meaning to that person’s actions through the lens of what’s called “attribution bias.” If you’re annoyed by their slow driving, that inferred meaning will probably not be generous to the other driver: they’re a bad person, they’re in the way, and they’re doing this because they’re stupid or incapable. That these assumptions about the situation are possibly incorrect – maybe they’re driving slowly because they’re in deep thought about elephant tool usage – is irrelevant. Ascribing meaning to acts unto itself is impressive, even if we often fail to arrive at a correct, or fully correct understanding of the situation.”

 

We often find ourselves in situations that are random and try to ascribe a greater meaning to the situation or event we are in. At least in the United States, it is incredibly common to hear people say that everything happens for a reason, casting each moment of inconvenience as part of a larger story filled with lessons, staircases, detours, successes, and failures that are all supposed to culminate in a narrative that will one day make sense. The prevalence of this way of thinking suggests to me that our pattern-recognition-focused brains are still in full swing, even though we no longer need them to be as active in as many situations of our lives. We don’t need every moment of our lives to happen for a reason, and if we allow for randomness and let go of the running narrative of our lives, we don’t have to work through challenging apologetics to understand something negative.

 

Attribution bias, as described by Wright, shows us how wrong our brains can be about the world. It shows us that our brains have certain tendencies that elevate ourselves in thought over the rest of the world when it doesn’t conform to our desires, interests, wishes, and preferences. It reveals that we are using parts of our brains that evolved to help our ancestors in ways we now understand to be irrational. If we can see that the slow driver in front of us with a political sticker that makes our blood boil is not all the terrible things we instantly think they are (that instead they are a 75-year-old grandfather driving in a new town, trying to get to the hospital where his child is sick), then we can recognize that not everything in life has a meaning, or at least not the meaning that our narrow pattern-recognizing brains want to ascribe. Remembering this mental bias, and making an effort to catch this type of thinking and move in a more generous direction, will help us move through the world with less friction, anger, and disappointment, because we won’t build false patterns that let us down when they fail to materialize in the outcomes we expected.