Mary Roach on Reincarnation in India

In the book Spook, Mary Roach writes, “People don’t seem to approach life with the same terrified, risk-aversive tenacity that we do. I’m beginning to understand why, religious doctrine aside, the concept of reincarnation might be so popular here. Rural India seems like a place where life is taken away too easily – accidents, childhood diseases, poverty, murder. If you’ll be back for another go, why get too worked up about the leaving?” Roach is joking, of course, but the quote comes at the end of a lengthy description of dangers and risks she experienced in India that we would find appalling in the United States. Her travels brought her face to face with cyclists weaving through heavy traffic and breathing diesel smog. She was afraid of large trucks overflowing with potatoes and cauliflower that threatened to spill onto the vehicle she was riding in. And she feared for the lives of more than one woman riding precariously on the back of a fast-moving Vespa.
While the quote is funny, it gets at some interesting ways of thinking about life, death, and how we go about our days. I’m not sure how much of the difference in risk tolerance between the United States and India comes down to beliefs in reincarnation, but I can see how the idea of reincarnation would be comforting in a dangerous society. I don’t know whether reincarnation would be enough to create a moral hazard scenario in which people were intentionally negligent about safety because they expected to come back in another life, but I’m sure there is some impact that could be studied.
The quote from Roach also seems to suggest that Americans value life differently than individuals in India do. She highlights how risk averse Americans tend to be, referring to how much we go out of our way to ensure everything we interact with is safe, and how we try to limit risk in everything from roller coasters to strollers. What is likely going on is a difference in culture, shaped over many years by technological limitations and differences in population density. I am currently listening to an audiobook whose author interviewed friends from her childhood in rural Ohio in the 1960s and ’70s. Her dad was a doctor, and she notes how many individuals, including children, died in accidents involving farming equipment. Today we have built technology into nearly everything we do, allowing us to make the world safer, so risk stands out more than it did in the 1960s and ’70s, when we didn’t have the technology to make everything as safe as we can now. Perhaps the difference that Roach noted, and jokingly attributed to belief in reincarnation, is simply due to limitations in technology and a need to earn money.
Rent

During my senior year of high school I took a class that followed the secular version of Dave Ramsey’s personal financial management course. Ramsey provides many practical lessons about money management and financial well-being. One area he focuses on is how much of your income you should spend on categories such as housing, groceries, and other necessities. Ramsey follows the standard recommendation that you spend no more than 30% of your income on housing: a great goal, but one that simply isn’t possible for many Americans, as the sketch below illustrates.
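To make the 30% guideline concrete, here is a minimal sketch in Python. The income and rent figures are hypothetical, chosen only to show how quickly the cap is exceeded in a high-rent market.

```python
# Hypothetical illustration of Ramsey's 30% housing guideline.
def max_affordable_rent(monthly_income: float, cap: float = 0.30) -> float:
    """Maximum monthly rent under a percent-of-income cap."""
    return monthly_income * cap

monthly_income = 3200.00  # hypothetical take-home pay
market_rent = 1400.00     # hypothetical rent for a modest apartment

budget = max_affordable_rent(monthly_income)
print(f"30% guideline allows: ${budget:,.2f}")        # $960.00
print(f"Actual market rent:   ${market_rent:,.2f}")   # $1,400.00
print(f"Share of income:      {market_rent / monthly_income:.0%}")  # 44%
```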
Kathryn Edin and H. Luke Shaefer examine the high cost of rent and how it impacts the lives of those living in poverty in their book $2.00 A Day, originally published in 2015. “Between 1990 and 2013, rents rose faster than inflation in virtually every region of the country,” the authors write. This has serious consequences for those living in poverty. One impact discussed by the authors that I had not considered was child custody. In some cities and states there are limitations on how many children can share a single room. At a certain point, too many children, especially of mixed gender, are not allowed to share a room, and doing so could constitute neglect and lead to parents losing custody of their children.
Edin and Shaefer continue, “between 2000 and 2012 alone, rents rose by 6 percent. During that same period, the real income of the middling renter in the United States fell by 13 percent.” While wages stagnated and real incomes fell for lower-class workers, rents across the country were rising. The increase in rent was particularly high in large cities, where most of the country’s economic output and job creation has taken place. Renters faced a choice: live where rents are cheap but there are no jobs, or live where rents are high and jobs can be found. Living in a cheap place may mean an unreasonably long and expensive commute, while living where the jobs are might mean sharing a place with non-familial renters and crowding into living conditions that put renters at risk.
I haven’t studied affordable housing, and I don’t know the solution to rising rents for low-income individuals and families. But I think it is important to know the statistics shared by Edin and Shaefer. I live in a city where rents and home prices have skyrocketed (Reno, NV). One consequence of the rising rents is an increase in homelessness, particularly in short-term homelessness. We all notice when there are more people on the streets, but we don’t always notice the short-term homeless. The chronically homeless overshadow what is sometimes a larger, yet less visible, form of homelessness. Understanding the rise in rents, the stagnation of income (which we might hopefully be getting out of as we recover from COVID), and the impact on short-term homelessness helps us think more clearly and accurately about the challenges renters face, and about ways to help those who are unable to keep up with rising rents. It is important that we think about the obvious consequences of increased rents, like homelessness, and also the less obvious consequences, such as families potentially losing custody of their children. As rents have risen, Dave Ramsey’s advice to keep your housing costs below 30% of your income simply isn’t possible for many Americans, and the consequences have been dire for many individuals and communities.
Data Driven Methods

In the world of big data, scientists today have a real opportunity to push the limits of scientific inquiry in ways that were never before possible. We have the collection methods and computing power to analyze huge datasets and make observations in minutes that would have taken decades just a few years ago. However, many areas of science are not being strategic with this new power; instead, they simply plug variables into huge datasets and haphazardly look for correlations and associations. Judea Pearl is critical of this approach to science in The Book of Why and uses genome-wide association studies (GWAS) to demonstrate its shortcomings.
 
 
Pearl writes, “It is important to notice the word association in the term GWAS. This method does not prove causality; it only identifies genes associated with a certain disease in the given sample. It is a data-driven rather than hypothesis-driven method, and this presents problems for causal inference.”
 
 
In the 1950s and 1960s, Pearl explains, R. A. Fisher was skeptical that smoking caused cancer and argued that the correlation between smoking and cancer could simply be the result of a hidden variable. He suggested it was possible for a gene to exist that predisposed people both to smoke and to develop lung cancer. Pearl writes that such a smoking gene was indeed discovered in 2008 through GWAS, but he also notes that the existence of such a gene doesn’t actually provide us with any causal mechanism linking people’s genes to smoking behavior or cancer development. The smoking gene was not discovered by a hypothesis-driven method but rather by data-driven methods. Researchers simply looked at massive genomic datasets to see whether any genes were common to people who smoke and people who develop lung cancer. The smoking gene stood out in that study.
 
 
Pearl goes on to say that causal investigations have shown the gene in question is important for nicotine receptors in lung cells, suggesting a causal pathway between the gene and a predisposition to smoke. However, causal studies also indicate that the gene less than doubles a person’s chance of developing lung cancer. “This is serious business, no doubt, but it does not compare to the danger you face if you are a regular smoker,” writes Pearl. Smoking is associated with roughly a tenfold increase in the risk of developing lung cancer, while the smoking gene accounts for less than a twofold increase. GWAS tells us that the gene is involved in cancer, but we can’t draw any causal conclusions from an association alone. We have to go deeper to understand the causality and relate it to other factors we can study. This helps us contextualize the information from GWAS.
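To see why the two multipliers lead to such different conclusions, here is a minimal sketch. The 1% baseline risk is a hypothetical figure chosen purely for illustration; the “less than double” and “roughly tenfold” multipliers come from the passage above.

```python
# Hypothetical comparison of the relative risks Pearl discusses.
baseline_risk = 0.01        # hypothetical baseline lifetime risk of lung cancer
gene_multiplier = 1.8       # "less than double," per the causal studies Pearl cites
smoking_multiplier = 10.0   # roughly tenfold for regular smokers

print(f"Baseline risk:   {baseline_risk:.1%}")                       # 1.0%
print(f"With gene only:  {baseline_risk * gene_multiplier:.1%}")     # 1.8%
print(f"Regular smoker:  {baseline_risk * smoking_multiplier:.1%}")  # 10.0%
# Whatever the true baseline, the multiplier matters: the gene's contribution
# is dwarfed by smoking itself, which is easy to miss in a bare association.
```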
 
 
Much of science is still like GWAS, looking for associations and hoping to identify a causal pathway, as was done with the smoking gene. In some cases these data-driven methods can pay off by pointing researchers toward hypothesis-driven follow-up work, but we should recognize that data-driven methods themselves don’t answer our questions; they only reveal correlations, not underlying causal structures. This is important because studies and findings based on mere associations can be misleading. Discovering a smoking gene without explaining the actual causal relationship or its impact could harm people’s health, especially if they concluded that they would surely develop cancer because they had the gene. Association studies can ultimately be misleading, misused, misunderstood, and dangerous, and that is part of why Pearl suggests a need to move beyond simple association studies.

Risk Savvy Citizens

In Risk Savvy, Gerd Gigerenzer argues for changes to the way that financial systems, healthcare systems, and discourse around public projects operate. He argues that we are too afraid of risk, that we allow large organizations to profit from misunderstandings of risk, and that our goals are often thwarted by poor conceptions of risk. Becoming risk savvy citizens, he argues, can help us improve our institutions and make real change to move forward in an uncertain world.

“The potential lies in courageous and risk savvy citizens,” writes Gigerenzer.

I think that Gigerenzer is correct to identify the importance of risk savvy citizens. We are more interconnected than we have ever been, and developing new innovations will require taking new risks. Many of the institutions we have built exist to minimize both risk and uncertainty, unintentionally limiting innovation. Moving forward, we will have to develop better relationships with risk to accept and navigate uncertainty.

A population of risk savvy citizens can help reshape existing institutions, and will have to lean into risk to develop new institutions for the future. This idea was touched on in Bruce Katz and Jeremy Nowak’s book The New Localism, where they argue that we need new forms of public-private partnership to manage investment, development, and public assets. Public agencies, the institutions we have relied upon, have trouble managing and accepting risk, even if they are composed of risk savvy citizens. The solution, Katz and Nowak might suggest, is to reshape institutions so that risk savvy citizens can truly act and respond in ways that successfully manage reasonable risks. Institutions matter, and they need to be reshaped and reformed if our courageous and risk savvy citizens are going to help change the world and solve some of the ills that Gigerenzer highlights.

Risk Literacy & Reduced Healthcare Costs

Gerd Gigerenzer argues in his book Risk Savvy that risk literacy and reduced healthcare costs go together. By increasing risk literacy we will help both doctors and patients better understand how behaviors contribute to overall health, how screenings may or may not reveal dangerous medical conditions, and whether medications will or will not make a difference for an individual’s long-term well-being. Having both doctors and patients better understand and better discuss the risks and benefits of procedures, drugs, and lifestyle changes can help us use our healthcare resources more wisely, ultimately bringing costs down.
Gigerenzer argues that much of the modern healthcare system, not just in the US but globally, has been designed to sell more drugs and more technology. Increasing the number of people using medications, getting more doctors to order more tests with new high-tech diagnostic machines, and driving more procedures became more of a goal than actually improving people’s health. Globally, health and the quality of healthcare have improved, but healthcare is often criticized as a low-productivity sector, with relatively small gains in health or efficiency for the investments we make.
I don’t know that I am cynical enough to accept all of Gigerenzer’s argument at face value, but the story of opioids, the fact that we invest much larger sums of money in cancer research than in parasitic disease research, and the ubiquitous use of MRIs in our healthcare landscape do favor his argument. There hasn’t been much focus on improving doctor and patient statistical reasoning, and we haven’t put forward the same effort and funding to remove lead from public parks as we have for cancer treatments. We see medicine as using fancy new technologies and drugs to treat diseases after they have appeared. We don’t see medicine as improving risk and health literacy, or as helping improve the environment before people get sick.
This poor vision of healthcare that we have lived with for so long, Gigerenzer goes on to argue, has blinded us to the real possibilities within healthcare. Gigerenzer writes, “calls for better health care have been usually countered by claims that this implies one of two alternatives, which nobody wants: raising taxes or rationing care. I argue that there is a third option: by promoting health literacy of doctors and patients, we can get better care for less money.”
Improving risk and health literacy means that doctors can better understand and better communicate which medications, which tests, and which procedures are most likely to help patients. It will also help patients better understand why certain recommendations have been made and will help them push back against the feeling that they always need the newest drugs, the most cutting-edge surgery, and the most expensive diagnostic screenings. Regardless of whether we raise taxes or try to ration care, we have to help people truly understand their options in new ways that incorporate tools to improve risk literacy and reduce healthcare costs. By better understanding the system, our own care, and our systemic health, we can better utilize our healthcare resources, and hopefully bring down costs by moving our spending into higher-productivity healthcare spaces.
Believing We Are Well Informed

In his book Risk Savvy, Gerd Gigerenzer demonstrates that people often overestimate their level of knowledge about the benefits of cancer screenings. “A national telephone survey of U.S. adults,” he writes, “reported that the majority were extremely confident in their decision about prostate, colorectal, and breast screening, believed they were well informed, but could not correctly answer a single knowledge question.” I think this quote reveals something important about the way our minds work. We often believe we are well informed, but that belief and our confidence in our knowledge are often an illusion.
This is something I have been trying to work on. My initial reaction any time I hear a fact or a discussion about a topic is to position myself as a knowledgeable semi-expert on it. I have noticed that I do this with ideas and topics that I have only heard once or twice in a commercial, seen in a headline, or overheard someone talking about. I immediately feel like an expert even though my knowledge is often less than surface deep.
I think that what is happening in these situations is substitution: rather than honestly assessing my expertise or knowledge, I am answering a different question, namely, “can I recall a time when I thought about this thing?” Mental substitution is common, but hard to actually detect. I suspect that the easier a topic comes to mind, even a topic I know nothing about and have only heard the name of, the more likely I am to feel like an expert.
Gigerenzer’s quote shows that people will believe themselves to be well informed even if they cannot answer a basic knowledge question about the topic. Rather than substituting the question “can I recall a time when I thought about this thing?”, patients may be substituting another question. Instead of analyzing their confidence in their own decision regarding cancer screening, people may be substituting the question “do I trust my doctor?” Trust in a physician, even without any knowledge about the procedure, may be enough for people to feel extremely confident in their decisions. They don’t have to know a lot about their health or how a procedure will impact it; they just need to be confident that their physician does.
These types of substitutions are important for us to recognize. We should try to identify when we are falling victim to the availability bias and when we are substituting different questions that are easier for us to answer. In a well functioning and accurate healthcare setting these biases and cognitive errors may not harm us too much, but in a world of uncertainty, we stand to lose a lot when we fail to recognize how little we actually know. Being honest about our knowledge and thinking patterns can help us develop better systems and structures in our lives to improve and guide our decision-making.

Medical Progress

What does medical progress look like? To many, medical progress looks like new machines, artificial intelligence to read your medical reports and x-rays, or new pharmaceutical medications to solve all your ailments with a simple pill. However, much of medical progress might come from improved communication, better management and operating procedures, and better understandings of statistics and risk. In the book Risk Savvy, Gerd Gigerenzer suggests that improving physicians’ understanding of risk, communication around statistics, and processes related to risk represents a huge opportunity to spur real medical progress.

 

He writes, “Medical progress has become associated with better technologies, not with better doctors who understand these technologies.” Gigerenzer argues that there is currently an “unbelievable failure of medical schools to provide efficient training in risk literacy.” Much of the focus of medical schools and physician education is on memorizing facts about specific disease states, treatments, and how a healthy body should look. What is not focused on, in Gigerenzer’s 2014 argument, is how physicians understand the statistical results from empirical studies, how they interpret risk given a specific biological marker, and how they can communicate risk to patients in a way that adequately informs their healthcare decisions.

 

Our health is complex. We all have different genes, different family histories, different exposures to environmental hazards, and different lifestyles. These factors interact in many complex ways, and our health is often a downstream consequence of many fixed factors (like genetics) and many social determinants of health (like whether we have a safe park where we can walk, or whether we grew up in a house infested with mold). Understanding how all these factors interact and shape our current health is not easy.

 

Adding new technology to the mix can help us improve our treatments, our diagnoses, and our lifestyles or environments. However, simply layering new technology onto existing complexity is not enough to really improve our health; medical progress requires better ways to use and understand the technology we introduce, otherwise we are only adding layers to the existing complexity. If physicians cannot understand, cannot communicate, and cannot help people make reasonable decisions based on technology and the data that feeds into it, then we won’t see the medical progress we all hope for. It is important that physicians be able to understand the complexity, the risk, and the statistics involved so that patients can learn how to actually improve their behaviors and lifestyles, and so that societies can address social determinants of health to better everyone’s lives.
Risk Literacy and Emotional Stress

In Risk Savvy, Gerd Gigerenzer argues that better risk literacy could reduce emotional stress. To emphasize this point, Gigerenzer writes about parents who received false positive medical test results for their newborn children. The children had been screened for biochemical disorders, and the tests indicated that they had a disorder. However, upon follow-up screenings and evaluations, the children were found to be perfectly healthy. Nevertheless, in the long run (four years later), parents who initially received a false positive test result were more likely than other parents to say that their children required extra parental care, that their children were more difficult, and that they had more dysfunctional relationships with their children.

 

Gigerenzer suggests that the survey results represent a direct parental response to initially receiving a false positive test when their child was a newborn. He argues that parents received the biochemical test results without being informed about the chance of false positives and without understanding the prevalence of false positives, due to a general lack of risk literacy. Parents initially reacted strongly to the bad news of the test, and somewhere in their minds, even after the test was proven to be a false positive, they never adjusted their thoughts and evaluations of their children; the false positive test in some ways became a self-fulfilling prophecy.

 

As I write about Gigerenzer’s argument, it feels more far-fetched than it did on an initial reading, but I think his general claim that risk literacy and emotional stress are tied together is probably accurate. Regarding the parents in the study, he writes, “risk literacy could have moderated emotional reactions to stress that harmed these parents’ relation to their child.” Gigerenzer suggests that parents had strong negative emotional reactions when their children received a false positive and that those initial reactions carried four years into the future. However, had the doctors better explained the chance of a false positive and better communicated next steps, then the strong negative emotional reaction experienced by parents could have been avoided, and they would not have spent four years believing their child was in some way more fragile or more needy than other children. I recognize that receiving a medical test with a diagnosis no parent wants to hear is stressful, and I can see where better risk communication could reduce some of that stress, but I think there could have been other factors that the study picked up on. I think the results, as Gigerenzer reported them, overhype the connection between risk literacy and emotional stress.

 

Nevertheless, risk literacy is important for all of us living in a complex and interconnected world. We are constantly presented with risks, and new risks can seemingly pop up anywhere at any time. Being able to decipher and understand risk is important so that we can adjust and modulate our activities and behaviors as our environment and circumstances change. Doing so successfully should reduce our stress, while struggling to comprehend risk and adjust behaviors and beliefs is likely to increase emotional stress. When we don’t understand risks appropriately, we can become overly fearful, we can spend money on unnecessary insurance, and we can stress ourselves over incorrect information. Developing better charts, better communicative tools, and better information about risk will help individuals improve their risk literacy, and will hopefully reduce stress by allowing individuals to successfully adjust to the risks they face.
Understanding False Positives with Natural Frequencies

In a graduate course on healthcare economics, a professor of mine had us think about drug testing student athletes. We ran through a few scenarios in which we calculated how many true positive and how many false positive test results we should expect if we oversaw a university program that drug tested student athletes on a regular basis. The results were surprising, a little confusing, and hard to understand.

 

As it turns out, if you have a large student athlete population and very few of those students actually use any illicit drugs, then your testing program is likely to produce more false positive tests than true positive tests. The big determining factors are the accuracy of the test (its sensitivity and its false positive rate) and the percentage of students actually using illicit drugs. A false positive occurs when the drug test indicates that a student who is not using illicit drugs is using them. A true positive occurs when the test correctly identifies a student who does indeed use drugs. The dilemma we discussed occurs when you have a test with some percentage of error and a large student athlete population with a minimal percentage of drug users. In that instance you cannot be confident that a positive test result is accurate: you will receive a number of positive tests, but most of them will actually be false positives.
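A minimal sketch of the expected counts makes the dilemma concrete. All of the numbers below are hypothetical, chosen only to illustrate how a rare behavior plus an imperfect test yields more false positives than true positives.

```python
# Hypothetical drug-testing scenario: rare drug use, imperfect test.
athletes = 10_000           # hypothetical athlete population
drug_use_rate = 0.02        # hypothetical: 2% actually use illicit drugs
sensitivity = 0.95          # hypothetical: test flags 95% of actual users
false_positive_rate = 0.05  # hypothetical: test wrongly flags 5% of non-users

users = athletes * drug_use_rate                    # 200 actual users
non_users = athletes - users                        # 9,800 non-users

true_positives = users * sensitivity                # 190
false_positives = non_users * false_positive_rate   # 490

ppv = true_positives / (true_positives + false_positives)
print(f"True positives:  {true_positives:.0f}")
print(f"False positives: {false_positives:.0f}")
print(f"Chance a positive result is correct: {ppv:.0%}")  # ~28%
```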

 

In class, our teacher walked us through this example verbally before creating some tables we could use to multiply the percentages ourselves and see that the number of false positives will indeed exceed the number of true positives when you are dealing with a large population and a rare event. Our teacher went on to explain that this happens every day in the medical world with drug tests, cancer screenings, and other tests (including COVID-19 tests, as we are learning today). The challenge, as our professor explained, is that the math is complicated, and it is hard to explain to a person who just received a positive cancer test that they likely don’t have cancer. The statistics are hard to understand on their own.

 

However, Gerd Gigerenzer doesn’t think this problem is as limiting as my professor made it out to be. In Risk Savvy, Gigerenzer writes that understanding false positives with natural frequencies is simple and accessible. What took nearly a full graduate course to go through and discuss, Gigerenzer suggests, can be digested in simple charts using natural frequencies. Natural frequencies are numbers we can actually understand and multiply, as opposed to fractions and percentages, which are easy to mix up and hard to multiply and compare.

 

Telling someone that the actual incidence of cancer in the population is only 1% and that the chance of a false positive test is 9%, and then trying to convince them that they still likely don’t have cancer, is confusing. However, if you explain that for every 1,000 people who take a particular cancer test, only 10 actually have cancer and 990 don’t, the path to comprehension begins to clear up. Starting with the 10 who have cancer and the 990 who don’t, you can explain that the test correctly identifies 9 of the 10 who do have cancer, providing 9 true positive results for every 1,000 tests (or adjust according to the population and the test’s sensitivity). The false positive number can then be explained by saying that of the 990 people who really don’t have cancer, the test will err and tell 89 of them (9% in this case) that they do have cancer. So 89 individuals will receive false positives while only 9 people receive true positives. Since 89 of the 98 positive results are false, a positive test by itself means the chance of actually having cancer is only about 9%.
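The same natural-frequency walk-through can be written out as a short calculation. The numbers here are the ones from the passage above (1% incidence, 9 of 10 detected, a 9% false positive rate, 1,000 people tested).

```python
# Gigerenzer's natural-frequency example, worked out step by step.
population = 1_000
with_cancer = population * 0.01            # 10 people actually have cancer
without_cancer = population - with_cancer  # 990 people don't

true_positives = with_cancer * 0.9         # test catches 9 of the 10
false_positives = without_cancer * 0.09    # test wrongly flags ~89 of the 990

ppv = true_positives / (true_positives + false_positives)
print(f"True positives:  {true_positives:.0f}")   # 9
print(f"False positives: {false_positives:.0f}")  # 89
print(f"Chance a positive result means cancer: {ppv:.0%}")  # ~9%
```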

 

Gigerenzer uses very helpful charts in his book to show that the false positive problem can be understood more easily than we might think. Humans are not great at thinking statistically, but understanding false positives with natural frequencies is a way to reach better comprehension. With this background he writes, “For many years psychologists have argued that because of their limited cognitive capacities people are doomed to misunderstand problems like the probability of a disease given a positive test. This failure is taken as justification for paternalistic policymaking.” Gigerenzer shows that we don’t need to rely on the paternalistic nudges that Cass Sunstein and Richard Thaler encourage in their book Nudge. He suggests that in many instances where people have to make complex decisions, what is really needed are better tools and aids to help with comprehension. Rather than developing paternalistic policies to nudge people toward certain behaviors that they don’t fully understand, Gigerenzer suggests that working to help people understand problems will solve the dilemma of poor decision-making. The problem isn’t always that humans are incapable of understanding complexity and choosing the right option; the problem is often that we don’t present information in a clear and understandable way to begin with.
Procedure Over Performance

My wife works for a state agency serving families of children with disabilities. She and I often discuss some of the administrative challenges and frustrations of her job, and some of the creative ways that she and other members of her agency bend the rules to meet the human needs of the job, even though their decisions occasionally step outside management’s standard operating procedures. For my wife and her colleagues below the management level, helping families and doing what is best for children is the motivation for every decision; for the management team within the agency, however, avoiding errors and blame often seems to be the more important goal.

 

This disconnect between agency functions, mission, and procedures is not unique to my wife’s state agency. It is a challenge that Max Weber wrote about in the late 1800s and early 1900s. Somewhere along the line, public agencies and private companies seem to forget their mission. Procedure becomes more important than performance, and services or products suffer.

 

Gerd Gigerenzer offers an explanation for why this happens in his book Risk Savvy. Negative error cultures likely contribute to people becoming more focused on procedure over performance, because following perfect procedure is safe, even if it isn’t always necessary and doesn’t always lead to the best outcomes. A failure to accept risk and errors, and a failure to discuss and learn from errors, leads people to avoid situations where they could be blamed for failure. Gigerenzer writes, “People need to be encouraged to talk about errors and take the responsibility in order to learn and achieve better overall performance.”

 

As companies and government agencies age, their workforces age. People become comfortable in their roles; they don’t want to look for a new job, they take out mortgages, have kids, and send them to college. People become more conservative and risk averse as they have more to lose, which means they are less likely to take risks in their careers, because they don’t want to lose the income that supports their lifestyles, retirements, or their kids’ college plans. Following procedure, like getting meaningless forms submitted on time and documenting conversations promptly, becomes more important than actually ensuring valuable services or products are provided to constituents and customers. Procedure prospers over performance, and the agency or company as a whole suffers. Positive error cultures, where it is okay to take reasonable risks and acceptable to discuss errors without fear of blame, are important for overcoming the stagnation that can arise when procedure becomes more important than the mission of the agency or company.