“Increased accountability is a mixed blessing,” writes Daniel Kahneman in his book Thinking, Fast and Slow. It is an idea I have come across before in books like Political Realism by Jonathan Rauch and The New Localism by Bruce Katz and Jeremy Nowak. Our go-to answer to any challenge or problem tends to be increased transparency and greater oversight. However, in some complex fields, simply opening processes and decision-making procedures to more scrutiny and review can create new problems that might be even worse. This is a particular challenge once we consider how hindsight bias influences the thoughts and opinions of the reviewing bodies.
Kahneman continues, “because adherence to standard operating procedures is difficult to second-guess, decision makers who expect to have their decisions scrutinized with hindsight are driven to bureaucratic solutions and – to an extreme – reluctance to take risks.”
Excess scrutiny and oversight can lead to rigid and mechanical decision-making processes. That might not be a problem when we are engineering a bridge and need to make technical decisions based on known mathematical calculations (I’ve never engineered a bridge, so I may be wrong here), but it can be a problem for doctors and policy makers. Doctors have to rely on their experience, their knowledge, and their intuitions to determine the best possible medical treatment. Checklists are fantastic ideas, but when things go wrong in an operating room, doctors and nurses have to make quick decisions balancing risk and uncertainty. If they know they will face intense oversight afterward, there is a chance they will stick to a rigid set of steps that does not really fit the emergency in front of them. In his book, Kahneman writes about how this leads doctors to order unnecessary tests and procedures, more to protect themselves from liability than to truly help the patient, wasting time and money across the healthcare system.
In public decision-making, hindsight bias can be a disaster for growth and development. The federal government makes loans and backs many projects. Like any venture capital firm or large bank making multiple investments, some of its projects will fail. It is impossible to know at the outset which of ten solar energy projects will be a massive success and which company is going to go bust. But thanks to hindsight bias and the intense oversight that public agencies and legislatures face, an investment in a solar project that goes bust is likely to haunt the agency head or legislators who backed it, even if the other nine projects were huge successes.
Oversight is important, but when oversight is subject to hindsight bias, accountability shifts into high gear, blaming decision-makers for lacking the superhuman ability to predict the future. This creates risk-averse institutions that stagnate, waste resources, and are slow to act, potentially creating new problems and new vulnerabilities to hindsight bias down the road. Rauch, Katz, and Nowak, in the books mentioned above, all favor reducing transparency in the public setting for this reason. Kahneman might not agree with them, however, arguing that closing deliberations to public view won’t hide the outcomes from the public and won’t stop hindsight bias from being an issue.