Thinking the unthinkable

A recent EC Joint Research Centre paper, “Thinking the unthinkable”, explored the question of whether technological accidents caused by natural hazards (Natech accidents) are real “Black Swans”. A “Black Swan” is a concept introduced by Nassim Taleb in his book The Black Swan (second edition, 2010), where he uses it to mean an unpredictable and hence unpreventable event. Though the paper focusses on accidents caused by natural hazards, many of the issues can be applied more widely to other catastrophic events, such as the Deepwater Horizon explosion or the Grenfell Tower fire.

The author, Elisabeth Krausmann, concludes that “government and industry are using the term Black Swan too liberally in the wake of disaster as an excuse for poor planning” and that there is an “Act-of-God” mindset that needs to be challenged.

She cites two studies that examined major technological accidents (56 in one study, 50 in the other), both of which concluded that the vast majority, if not all, could have been foreseen and prevented using available information and knowledge. The Fukushima nuclear power station accident could likely have been prevented if information on large historical tsunamis on that coastline had been heeded and the power plant had been constructed elsewhere or on higher ground.

The paper is full of examples of similar catastrophic events that in retrospect seem like they could have been avoided.

Krausmann explores three types of event which are often called Black Swans:

  • Events completely unknown to science (“unknown unknowns”, e.g. due to novel chemistry or technology);
  • Events unknown according to a person’s present knowledge (“unknown knowns”, e.g. safety practices known in one company but unknown in another);
  • Events that are known but judged to have negligible probability (e.g. scenarios removed from risk analysis below a specific cut-off value – see the sketch below).
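
To illustrate that third category, here is a minimal sketch (with invented scenarios and figures, not taken from the paper) of how a fixed probability cut-off in a risk analysis can discard precisely the scenario that dominates the expected loss:

```python
# Invented risk register: (scenario, annual probability, impact in £m).
scenarios = [
    ("pump seal failure", 0.10, 2),
    ("control system outage", 0.05, 10),
    ("major flood of the site", 0.001, 5000),  # high impact, "negligible" probability
]

CUT_OFF = 0.01  # scenarios below this probability are dropped from the analysis

retained = [s for s in scenarios if s[1] >= CUT_OFF]
dropped = [s for s in scenarios if s[1] < CUT_OFF]

for name, prob, impact in scenarios:
    print(f"{name:25s} p={prob:<6g} impact={impact:>5}  expected loss={prob * impact:.2f}")

print("Retained for analysis:", [name for name, *_ in retained])
print("Dropped despite the largest expected loss:", [name for name, *_ in dropped])
```

With these invented numbers, the flood scenario falls below the cut-off and disappears from the analysis, even though its expected loss dwarfs those of the scenarios that remain.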

Taleb argued that a Black Swan is a subjective phenomenon, unexpected for a particular observer but not necessarily for others. Krausmann suggests that only “unknown unknowns” are true Black Swans: “unknown knowns” represent a failure of research or a failure to learn; high-impact, low-probability accidents reflect errors of probability assessment or a high level of risk appetite.

The paper goes on to explore the nature of these events and how they occur. I have grouped comments together into a few themes that seem to me to be pertinent.

COST / INCENTIVES

Economic considerations are a powerful driver in decision-making and can lead to bad safety decisions. Despite protestations to the contrary, in reality safety considerations are often traded off against costs and operational efficiency. Industry is generally reluctant to invest in preparing for events that are presumed to be extremely unlikely and that may never materialize at all.

Preparing for rare or unexpected events can be costly, and inevitably a balance needs to be struck between the prudence recommended by science and budget constraints. When there is a wide range of risks, cost challenges can be reduced by building generic rather than threat-specific responses. This is the approach hospitals take to respond to disasters – whether from natural hazards, technology failure or terrorist attack.

In some cases the trade-off is readily apparent: after the BP Texas City refinery explosion and fire in 2005, an analysis showed that under BP’s system of executive incentives, 70% of executive bonuses was tied to financial performance while a mere 15% was attributed to attaining safety targets.

POOR MESSAGING

False alarms cause major problems. Not all possible warning signs can or should be responded to with the same priority. Prioritising signals is a challenge – learning from “near misses” and other situations is key.

False alarms also create a fear of incurring costs for no reason – and a reluctance to report issues. This can extend into outright denial, so that information is not communicated or the potential severity of a situation is not believed at the decision-making level. Overly hierarchical organisations limit the communication of problems, to the extent that it seems senior management simply don’t want to hear bad news.

There is also an element of who is doing the signalling. Junior staff are often ignored. Residents at Grenfell Tower had flagged concerns about fire safety many times – had it been a luxury block occupied by influential people, would the messages have been taken more seriously?

LACK OF FOCUS – DRIFT

A remarkably common issue is complacency. Organisations tend to drift to a higher-risk state, relaxing safeguards and controls as they try to accommodate conflicting business goals and trade-offs. If nothing happens for long stretches of time, the view takes hold that nothing will happen in the future either. COVID-19 is just another example of our tendency to use the past as the basis for our future expectations, of assuming that if the worst did not happen before, it will also not happen in the future (“it’s just like swine flu”). Krausmann refers to a disaster “incubation” period, where a system moves closer to the edge of its safety envelope until it fails.

There are also difficulties with retaining information from previous incidents, poor knowledge sharing, failure to use available knowledge, and corporate memory loss due to changes in staff and management, frequent ownership changes, and instability in business continuity.

A major accident may well generate media attention and enquiries. However, once media attention fades, so does stakeholder interest, and a risk might no longer be considered a threat. This is usually accompanied by a redefinition of priorities and a drop in the resources made available for mitigating the risk.

PERCEPTION

Another aspect exposed by the pandemic is that human minds tend towards linear thinking. Exponential growth – where things potter along at a lowish level and then rapidly explode – is not a simple concept to get to grips with. Similarly, rare events with complex causal relationships are hard to grasp.
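
As a purely numerical illustration (the doubling time of three days and the starting figure are assumptions for the example, not claims about any real outbreak), the sketch below contrasts linear and exponential growth from the same starting point:

```python
# Linear vs exponential growth from the same starting point of 10 cases.
# Assumed for illustration: +10 cases per day (linear) vs doubling every 3 days.
linear, exponential = 10, 10
for day in range(0, 61, 3):
    print(f"day {day:2d}:  linear {linear:>6,}   exponential {exponential:>12,}")
    linear += 3 * 10   # 10 new cases per day over the next 3 days
    exponential *= 2   # one doubling every 3 days
```

With these assumptions, after two months the linear series has reached a few hundred cases while the exponential one has passed ten million – the pattern that linear intuition consistently misses.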

A range of cognitive biases explain why events may be referred to as Black Swans:

  • Confirmation bias: we look for evidence that confirms our beliefs and ignore facts that would refute them.
  • Narrative fallacy: we construct simplified stories out of sequential facts to make sense of the world. Taleb himself referred to the turkey illusion, in which the well-fed and cared-for bird cannot imagine that the good life will come to a sudden and catastrophic end.
  • Silent evidence: a sampling bias in which only evidence that catches the eye is considered, rather than searching for and considering all that is there.
  • Ludic fallacy: attempting to predict the future with tools and models that cannot capture rare events. Mathematical models of an uncertain risk create a false sense of certainty, thereby possibly doing more harm than good: “No probabilistic model based on in-box thinking can deal with out-of-box type events” (see the sketch below).
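
To make the ludic fallacy concrete, here is a small sketch of my own (not from the paper), comparing how improbable the same extreme outcome looks under a thin-tailed “in-box” model and under a heavier-tailed alternative:

```python
# Tail probability of exceeding the same extreme threshold under two models.
# The thin-tailed normal treats the event as essentially impossible; a
# heavy-tailed Student-t (3 degrees of freedom) treats it as merely rare.
from scipy.stats import norm, t

threshold = 5.0  # an outcome far out in the tail of the distribution

p_normal = norm.sf(threshold)   # survival function: P(X > threshold)
p_heavy = t.sf(threshold, 3)    # Student-t with 3 degrees of freedom

print(f"Normal model:       P ~ {p_normal:.1e}  (about 1 in {1 / p_normal:,.0f})")
print(f"Heavy-tailed model: P ~ {p_heavy:.1e}  (about 1 in {1 / p_heavy:,.0f})")
```

If reality is heavy-tailed but the model is not, the “impossible” event was never a true Black Swan – just a badly calibrated probability.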

SYSTEMIC RISKS

In complex and tightly-coupled systems, small initial shocks can propagate through the individual subsystems, interacting in unexpected ways and creating a chain reaction that can ultimately lead to complete system failure. This also increases the challenge of dealing with the incident – multiple issues cascade, overwhelming the response capacities of emergency responders.
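
As a toy sketch of tight coupling (the dependency graph below is invented purely for illustration, not a model of any real plant), a single failed subsystem can take down everything that directly or indirectly depends on it:

```python
# Toy cascade in a tightly-coupled system: a subsystem fails if anything it
# depends on has failed. The graph is invented for illustration only.
from collections import deque

depends_on = {
    "power": [],
    "cooling": ["power"],
    "control_system": ["power"],
    "reactor_unit": ["cooling", "control_system"],
    "emergency_comms": ["power"],
}

def cascade(initial_failure):
    failed = {initial_failure}
    queue = deque([initial_failure])
    while queue:
        down = queue.popleft()
        for system, deps in depends_on.items():
            if system not in failed and down in deps:
                failed.add(system)
                queue.append(system)
    return failed

print(cascade("cooling"))  # a local failure with limited reach
print(cascade("power"))    # one shock propagates to every dependent subsystem
```

In this invented graph, losing “cooling” affects only the reactor unit, while losing “power” brings down every subsystem – the kind of chain reaction that overwhelms responders.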

DEALING WITH TRUE BLACK SWANS

Apart from simply avoiding (or trying to avoid) the failings above through out-of-the-box thinking and a departure from the Act-of-God mindset, Krausmann identifies a number of strategies for coping with Black Swan challenges.

  • Risk-based versus precaution-based strategies: employing less dangerous substances and production processes, lowering the quantity of hazardous substances on site, and implementing passive safety systems
  • Disaster incubation theory and warning signals: analysis of multiple precursor events that accumulate before total failure occurs
  • Mindfulness: a preoccupation with failure, exploring one’s own or others’ near misses
  • Resilience engineering: locating the pathways to recover from an unforeseen crisis
  • Scenario planning

SCENARIOS

At SAMI, of course, we recognise the value of qualitative scenarios: they describe not the future to come, which is variable and unknowable, but a set of possible futures that helps decision-makers orient themselves in the maze of uncertainties they have to tackle. Scenarios question existing beliefs and worldviews, and frequently include elements that cannot be formally captured through models.

The scenario approach also includes:

  • Backcasting: starting at a final imagined outcome (e.g. the most catastrophic failure conceivable) and analysing backwards the conditions under which it could occur;
  • Red teaming: playing devil’s advocate to challenge linear thinking.

Scenarios are only useful if they are taken seriously and if there is no hesitation in acting on contingency plans once a crisis materializes. As the COVID pandemic has shown, the best preparedness planning will be unsuccessful if its implementation falters once a disaster looms on the horizon.

If you need help with building scenarios to avoid some of these problems, SAMI Consulting is well placed to help.

Written by Huw Williams, SAMI Principal

The views expressed are those of the author(s) and not necessarily of SAMI Consulting.

Achieve more by understanding what the future may bring. We bring skills developed over thirty years of international and national projects to create actionable, transformative strategy. Futures, foresight and scenario planning to make robust decisions in uncertain times. Find out more at www.samiconsulting.co.uk.

If you enjoyed this blog from SAMI Consulting, the home of scenario planning, please sign up for our monthly newsletter at newreader@samiconsulting.co.uk and/or browse our website at https://www.samiconsulting.co.uk

Image by Patty Jansen from Pixabay
