Should We Assume Rationality?

The world is a complex place, and people have to make a lot of decisions within that complexity. Whether we are deliberate about it or not, we create and manage systems and structures for navigating the complexity and framing the decisions we make. However, each of us operates from a different perspective. We make decisions that seem reasonable and rational from our individual point of view but may seem irrational from the outside. The question is, should we assume rationality in ourselves and others? Should we conclude that we and other people are behaving irrationally when our choices seem to go against our own interests, or should we assume that people have a good reason to do what they do?

This is a current debate and challenge in the world of economics and a long-standing, historical debate in the world of politics. In his book Thinking Fast and Slow, Daniel Kahneman seems to take the stance that people act rationally, at least from their own point of view. He writes, “when we observe people acting in ways that seem odd, we should first examine the possibility that they have a good reason to do what they do.”

Rational decision-making involves understanding a lot of risk. It involves processing many data points, having full knowledge of our choices and the potential outcomes we might face, and thinking through the short- and long-term consequences of our actions. Kahneman might argue, it would seem after reading his book, that truly rational thinking is beyond what our brains are ordinarily capable of managing. To him, though, this doesn’t mean that people cannot still make rational choices and do what is in their best interests. When we see behaviors that seem odd, it is possible that the choices other people have made are still rational but just require a different perspective to appreciate.

The way people get to rationality, Thinking Fast and Slow suggests, is through heuristics that create shortcuts to decision-making and eliminate data that is more or less just noise. Markets can be thought of as heuristics in this way, allowing people to aggregate decisions and make choices with an invisible hand directing them toward rationality. So when we see people who seem to be acting obviously irrationally, or opposed to their self-interest, we should ask whether they are making choices within an entirely different marketplace. What seems like odd behavior from the outside might be savvy signaling to a group we are not part of, might be a short-term indulgence that will stand out to the remembering self in the long run, or might make sense if we change the perspective through which we judge another person.

Kahneman shows that we can predict biases and patterns of thought in ourselves and others, but even so, we don’t know exactly what heuristics and thinking structures are involved in other people’s decision-making. A charitable way to look at people is to assume their decisions are rational from where they stand and in line with the goals they hold, even if those choices do not appear rational to us from the outside.

Personally, I am on the side that doubts human rationality. While it is useful, empathetic, and humanizing to assume rationality, I think it can be a mistake, especially if we go too far in accepting the perspective of others as justification for their acts. There are simply too many variables and too much information for us to make truly rational decisions or to fully understand the choices of others. My thinking is influenced by Kevin Simler and Robin Hanson, who argue in The Elephant in the Brain that we act on pure self-interest to a greater extent than we would ever admit, and that we hide our self-interested behaviors and decisions from everyone, including ourselves.

At the same time, I do believe that we can set up systems, structures, and institutions that help us make more rational decisions. Sunstein and Thaler, in Nudge, clearly show that markets can work and that people can be rational, but often only with proper incentives and easy choice structures that encourage better choices. Gigerenzer, in Risk Savvy, ends up at a similar place, showing that we can get ahead of the brain’s heuristics and biases to produce rational thought. Creating the right frames, offering the right visual aids, and helping the brain focus on the relevant information can lead to rational thought. Nevertheless, as Kahneman shows, our thinking can still be hijacked and derailed, leading to choices that feel rational from the inside but appear to violate our best interests when our decisions are stacked and combined over time. Ultimately, the greatest power in assuming rationality in others is that it helps us understand multiple perspectives, and it might help us see what nudges could help people change their behaviors and decisions to be more rational.

Thinking Statistically

In Thinking Fast and Slow, Daniel Kahneman personifies two modes of thought as System 1 and System 2. System 1 is fast. It takes in information, processes it rapidly, and doesn’t always make us cognizant of the information it took in. It reacts to the world around us on an intuitive level; it isn’t good at math, but it is great at positioning us to catch a football.

System 2 is slow. It is deliberate, calculating, and uses a lot of energy to maintain. Because it requires so much energy, we don’t actually activate it very often, not unless we really need to. What is worse, System 2 can operate only on the information that System 1 takes in (unless we have a lot of time to pause specifically for information intake), meaning it processes incomplete information.

System 1 and System 2 are important to keep in mind when we start to think statistically, something our minds are not good at. When we think back to the 2016 US Presidential election, we can see how hard statistical thinking is. Clinton was favored to win, but there was a statistical chance that Trump would win, which is what happened. The chance was small, but that didn’t mean the models were all wrong when he won; it just meant that the most likely forecasted event didn’t materialize. We had trouble thinking statistically about win percentages going into the election, and we had trouble understanding an unlikely outcome after it happened.
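
A small simulation makes the point concrete. The sketch below uses a hypothetical 30% upset probability (not the actual 2016 forecast numbers) to show that an event a model calls unlikely should still occur in roughly 3 of every 10 comparable trials, so a single upset is entirely consistent with a sound model.

```python
import random

def simulate_forecasts(p_upset: float, n_trials: int, seed: int = 0) -> float:
    """Simulate n_trials elections in which the underdog wins with
    probability p_upset; return the fraction of trials where the
    'unlikely' outcome actually occurred."""
    rng = random.Random(seed)
    upsets = sum(1 for _ in range(n_trials) if rng.random() < p_upset)
    return upsets / n_trials

# A forecast giving the underdog a ~30% chance implies that, across many
# comparable forecasts, the underdog should win about 3 times in 10.
rate = simulate_forecasts(p_upset=0.3, n_trials=100_000)
print(round(rate, 2))  # close to 0.3
```

Seeing the upset rate converge on the forecast probability is the statistical intuition the paragraph above describes: one surprising outcome is evidence about a single trial, not a verdict on the model.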


“Why is it so difficult for us to think statistically?” Kahneman asks in his book. “We easily think associatively, we think metaphorically, we think causally, but statistics requires thinking about many things at once, which is something that System 1 is not designed to do.”

System 1 operates quickly and cheaply. It takes less energy and effort to run on System 1, but because it is subject to bias and makes judgments on incomplete information, it is not reliable for important decisions and calculations that depend on nuance. We have to engage System 2 to be good at thinking statistically, but statistical thinking still trips up System 2 because it is hard to think about multiple competing outcomes at the same time and weigh them appropriately. In Risk Savvy, Gerd Gigerenzer shows that statistical thinking can be substantially improved and that we really can think statistically, but that we need some help from visual aids and tools so that our minds can grasp statistical concepts better. We have to help System 1 so that it can set up System 2 for success if we want to be good at thinking statistically.
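
One of Gigerenzer’s tools is recasting probabilities as natural frequencies, counts out of a concrete population, which the mind grasps far more easily than chained conditional probabilities. A minimal sketch of that reframing, using a hypothetical screening test (the numbers are illustrative, not drawn from the book):

```python
def positive_predictive_value(base_rate: float, sensitivity: float,
                              false_positive_rate: float,
                              population: int = 10_000) -> float:
    """Translate probabilities into natural frequencies (counts out of a
    population), then read the answer off the counts."""
    sick = round(population * base_rate)
    healthy = population - sick
    true_positives = round(sick * sensitivity)
    false_positives = round(healthy * false_positive_rate)
    return true_positives / (true_positives + false_positives)

# Hypothetical test: 1% base rate, 90% sensitivity, 9% false-positive rate.
# Out of 10,000 people: 100 are sick and 90 of them test positive; of the
# 9,900 healthy, 891 also test positive. So a positive result means
# 90 / (90 + 891), or roughly a 9% chance of actually being sick.
print(round(positive_predictive_value(0.01, 0.9, 0.09), 2))  # ≈ 0.09
```

Stated as counts, the surprisingly low answer is almost self-evident, while the same question posed in conditional probabilities routinely trips up even trained readers.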


From the framework that Kahneman lays out, with a quick-reacting System 1 running in power-save mode with limited processing power and System 2 operating on the incomplete information aggregated by System 1, statistical thinking is nearly impossible. System 1 can’t bring in enough information for System 2 to analyze appropriately. As a result, we fall back on biases or substitute an easier question for the challenging statistical one. Gigerenzer argues that we can think statistically, but that we need the appropriate framing and cues for System 1 so that System 2 can handle the number crunching and legwork that is needed. In the end, statistical thinking doesn’t happen quickly and requires an ability to hold competing and conflicting information in the mind at the same time, which makes it hard for us to think statistically rather than anecdotally or metaphorically.