
Surviving Meltdowns That Cannot Be Prevented: Review of Gaps in Managing Uncertainty and Addressing Existential Vulnerabilities

Karamjeet S. Paul
Strategic Exposure Group, New York, NY 10031, USA
J. Risk Financial Manag. 2022, 15(10), 449;
Submission received: 6 July 2022 / Revised: 18 September 2022 / Accepted: 30 September 2022 / Published: 3 October 2022
(This article belongs to the Special Issue Innovation in Business and Energy Systems)


We make all decisions in the context of what we know and can envision. However, catastrophes often arise from what we had not known or had not envisioned previously. Approaches that work for addressing what can be envisioned are not useful in preventing catastrophic meltdowns arising from what cannot be envisioned ex ante. In extreme situations, such meltdowns can represent existential exposure to an organization, and thus cannot be ignored. Despite advances in risk management, a gap in addressing what cannot be envisioned ex ante has existed since Frank Knight’s distinction between risk and uncertainty in 1921. As a result, organizations continue to employ approaches that may be ineffective against catastrophic meltdowns arising from the unknown. Scholars urgently need to address this gap because, as our world becomes more complex and globally interconnected, our organizations and systems become increasingly vulnerable to this exposure from the unknown. The need and the urgency will only increase: beyond everyday operations, climate change represents a unique and growing challenge because everything about it, and its impact at the organization level, is unknown and beyond what can be envisioned today. Its impact, if it materializes, will be global and widespread, and it is likely that no organization or system will escape it. Thus, a mechanism to address the unknown that cannot be envisioned should be a priority for scholars and for organizations. There are clear differences between exposure from what can be envisioned and exposure arising from the unknown, and these differences require that the unknown be addressed differently. This paper explores these differences and then offers an approach to address the exposure from the unknown in practice through a disciplined managerial process, so that organizations can survive meltdowns that cannot be prevented.

1. Introduction

Catastrophes often arise from events that had not been envisioned previously, and thus not internalized in decision making. In extreme situations, such catastrophes can represent existential exposure to an organization. How is this exposure from what cannot be envisioned ex ante different from the exposure that arises in the normal course of business? How can organizations address and prepare for this existential exposure? The objective of this paper is to review issues related to this exposure from the unknown and to add to the understanding of certain aspects of risk and uncertainty.
“Human history is rife with catastrophe …” writes William MacAskill in The Beginning of History: Surviving the Era of Catastrophic Risk (Macaskill 2022). There is no reason to believe that we have learned to prevent all future catastrophes. Therefore, urgent attention is needed to address existential exposure from catastrophes that cannot be prevented.
Frank Knight’s work, Risk, Uncertainty and Profit (Knight 1921), constitutes the foundation of risk management today. In addition to defining risk and uncertainty, Knight also identified a gap in the designation of risk that arises from outcomes that cannot be envisioned ex ante. Over the last 100 years, scholars have discussed and written about this gap from time to time but have not fully addressed it. As a result, organizations continue either to rely on approaches that may be ineffective against catastrophic threats that cannot be envisioned ex ante or to ignore such threats in decision making. Ignoring such threats can have significant consequences if the unanticipated exposure materializes into extreme adversity. For example, the suspension of the certification of Nord Stream 2 by Germany in 2022 was an event that was hard to envision even a few days before this government action in response to the Russian invasion of Ukraine. It is not known how many organizations had internalized such a geopolitical exposure in their decision making and how many had ignored it. However, when this exposure materialized, it translated into huge unanticipated write-downs of investments for some businesses (such as Royal Dutch Shell and Uniper) as well as a sudden large increase in energy costs for many others (Wikipedia 2022).
There are various types of risk, such as financial risk, market risk, supply chain risk, obsolescence risk, sovereign risk, etc., posing different levels of threat for different operating models. One set of risks may impact organizations in one industry while another set may pose larger threats in another. However, exposure from the unknown transcends all decisions in all organizations of all sizes with all operating models, leaving no organization immune to its impact if it materializes.
Two factors may be making organizations increasingly vulnerable to threats from the unknown. First, today’s global interconnectedness of organizations means that a ripple in one part of the world can cascade into large adversities in places near and far. For example, the impact of high energy costs due to global geopolitical developments, combined with domestic factors, has created a national crisis in the UK with possible catastrophic consequences for many businesses. On 8 September 2022, the BBC reported that, according to the chief economist of Red Flag Alert, a business data analytics firm, “more than 75,000 larger firms that are high energy users are at risk of insolvency or are likely to lay off staff without government support” because they are not covered by the British government’s recent energy-help program (BBC 2022). As this interconnectedness and the complexity of our world grow, systems at a macro level (financial systems, energy grids, communication mechanisms, disease controls, supply chains, etc.) and organizations at a micro level may everywhere find themselves more vulnerable to threats from the unknown. Second, climate change is adding another dimension of vulnerability that cannot be envisioned, as everything about climate change and its impact at the organization level is unknown, and no organization will escape it.
Risk is commonly defined and managed using quantitative and qualitative metrics and measures such as probability distributions, likelihood, covariance, standard deviation, etc. For exposure from the unknown, by contrast, not only is there no disciplined framework to define and manage it, but the characteristics of this exposure are also not fully appreciated.
Thus, because the exposure from the unknown exists in every organization and system, can have existential consequences, and is growing, it is too important to be ignored or discussed only casually in the popular press. It requires an urgent and exhaustive debate among scholars.
This paper, building on prior scholarly work and drawing on examples, cases, and the observations and experience of the author, examines how exposure from the unknown differs in operational significance from exposure that can be envisioned and internalized in decision making. The paper concludes by elaborating an approach for organizations to address such potentially existential exposure, and with a call for scholars and senior managers to focus attention on advancing the understanding of this aspect of risk, which has not yet been addressed systematically in the academic literature or in practice.

2. Current Approach in Managing Risk

Humans naturally address potential adversities by focusing on prevention and mitigation (either or both referred to here as “prevention”). Precisely because of this natural tendency, the discipline of risk management is predicated upon the ability to envision, simulate and prevent potential adversities.
However, despite significant advances in risk management over the last 30–40 years, organizations can find themselves exposed to meltdowns from catastrophic adversities, as experienced by many well-known businesses, such as Hertz in the US in 2020, Virgin Australia in 2020, BlackBerry in Canada in the early 2010s and Nortel, also in Canada, in the early 2000s.

2.1. Prevention-Based Approach and Vulnerability to the Unknown

Organizations may remain vulnerable to such meltdowns because they rely primarily on prevention. However, we cannot prevent something if we do not know what we are trying to prevent. As a result, prevention is possible only in the context of what we know and can envision.
Experience shows that catastrophes—man-made or natural, financial or non-financial, the ones that make headline news or go unreported—that result in meltdowns often arise from events or cascading domino effects of events that cannot be envisioned and anticipated ex ante and are thus unknown until they happen. Therefore, despite all the advances in risk management, meltdowns that arise from the unknown cannot be prevented.
Ignoring the unknown can leave an organization vulnerable because, left unaddressed, such meltdowns can in extreme cases threaten the continued functioning of a system or the survival of a going concern, posing existential exposure. Richard Feynman, who identified the catastrophic failure of O-rings as the cause of the space shuttle Challenger disaster, once said: “It is not what we know, but what we do not know which we must address, to avoid major failures, catastrophes and panics” (Haldane and Madouros 2012). Thus, organizations mindful of existential exposure cannot ignore the unknown.

2.2. Challenges Related to the Unknown of Climate Change

Climate change represents a unique challenge. Nothing like it has happened on such a global scale in modern times, and thus it is disconcerting that the repercussions of climate change for individuals and organizations cannot be envisioned until they unfold, and are therefore not well understood today. Climate-change conversations focus on its effects decades from now, so one may feel no urgency to take action to address this unknown’s implications for one’s organization. This paper argues that such a belief is mistaken: while the time to address threats arising from the known is as soon as they are envisioned, the time to address threats arising from the unknown is well before they can be envisioned.

2.3. Two Types of Unknown Create Vulnerabilities

Possible unknown events fall into two broad categories. An event may be unknown because of ignorance, purposeful dismissal, or the discounting of its likelihood, so that it is not picked up and considered in decision making; this is referred to as a “failure of nerve”. Alternatively, an event may be unknown because it simply cannot be envisioned, as it lies outside the scope of one’s imagination of possibilities; this is referred to as a “failure of imagination” (Clarke 1962).
An example of the former would be ignoring the experience of the Great Flood of 1862, when a 43-day atmospheric-river storm poured 120 inches of rain on California, turning the 300-mile-long, 20-mile-wide Central Valley into an inland sea under 3 to 30 feet of water, reducing agricultural production by 80% and forcing the temporary relocation of the state capital from Sacramento to San Francisco (Ingram 2013). The US Geological Survey estimates that, based upon their historical frequency, such storms occur in California every 150–200 years. We are therefore already within the next frequency interval, and such a storm could result in damages of over USD700 billion (United States Geological Survey 2011). Ignoring the empirically established likelihood of its occurrence and not factoring it into planning would constitute a failure of nerve. An example of a failure of imagination would be the unprecedented freezing of global financial markets, which had never been experienced, nor envisioned, prior to its occurrence in 2008.
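The arithmetic behind the claim that we are already within the next frequency interval can be checked in a few lines (a back-of-the-envelope sketch using the figures above; the current-year value is an assumption based on this paper’s publication date):

```python
# Back-of-the-envelope check of the storm-recurrence claim above.
# Figures are from the text; the current year is assumed to be 2022,
# the publication year of this paper.
last_storm = 1862                        # the Great Flood of 1862
interval_low, interval_high = 150, 200   # USGS recurrence estimate (years)

window_start = last_storm + interval_low   # 2012
window_end = last_storm + interval_high    # 2062

current_year = 2022
in_window = window_start <= current_year <= window_end
print(in_window)  # True: we are already within the next frequency interval
```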
Even though the primary focus of this paper is the unknown arising from the failure of imagination, the approach discussed here is just as useful in addressing the unknown arising from the failure of nerve, as both may be unknown at the time of decision making.

2.4. Distinction between the Exposure from the Known and from the Unknown

Most experienced managers are aware of the exposure arising from the unknown. However, they may fail to make the critical distinction between extreme circumstances of the known that can be envisioned, which are addressed by traditional risk management, and extreme circumstances of the unknown that cannot be envisioned, often referring to any extreme exposure simply as tail risk or extreme risk. Without making this distinction, managers may not see the need for an explicit focus on extreme exposure from the unknown. The distinction is critical because, without it, whenever concerns about extreme exposure arise, managers may simply do more of what risk management calls for to prevent catastrophes, without realizing that such actions address only the causes of known exposure and cannot prevent extreme threats from the unknown, leaving the unknown unaddressed. So, by not making the distinction, managers may by default be ignoring the unknown even when they do not intend to.
Over the last 30–40 years, the primary focus of advances in the area of risk and uncertainty has been on exposure from the known, or what can be envisioned and addressed. Nassim Nicholas Taleb has been instrumental in raising awareness of what cannot be envisioned, particularly in relation to black swan events (Taleb 2004, 2007). However, to date, why and how extreme threats from the unknown create existential exposure has not been fully explored. As a result, a disciplined approach to operationalizing this challenge in organizations has not been developed.
To address this challenge, this paper will attempt to define and demonstrate how extreme exposure from what cannot be envisioned ex ante differs in operational significance from what can be envisioned, and then outline how organizations can address such potentially existential exposure.
This paper is structured into three sections. Using a conceptual framework, the first section dissects the exposure from the known, and the second section defines the unique characteristics of the exposure from the unknown to demonstrate that it can never be addressed by traditional risk management. The third section offers a disciplined approach to address such exposure so organizations can survive meltdowns that cannot be prevented.

3. Conceptual Analytic Framework

A conceptual analysis, applying the Johari Window Technique, is helpful in fully understanding, dissecting, and developing solutions to address risk and uncertainty (Luft and Ingham 1961).
We can describe future circumstances by possible outcomes that may be known or unknown and the probabilities of these possible outcomes that also may be known or unknown. Using the technique, we can organize combinations of known and unknown outcomes and probabilities into four distinct sets of possible circumstances, as shown in the four quadrants of Figure 1. Each quadrant has distinct characteristics and thus requires distinct solutions. To address risk and uncertainty effectively, the circumstances of each quadrant as related to a decision must be considered fully and simultaneously. We will address each quadrant in turn beginning with exposure from the known.
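The four combinations can be restated as a simple classification routine (an illustrative sketch of the framework above; the function name and example labels are hypothetical, not code from the paper):

```python
# Illustrative sketch of the four-quadrant framework described above
# (Figure 1). The function name and labels are hypothetical.

def classify(outcomes_known: bool, probabilities_known: bool) -> str:
    """Map the known/unknown status of possible outcomes and their
    probabilities to one of the four quadrants (Q1-Q4)."""
    if outcomes_known and probabilities_known:
        return "Q1: known-known (e.g., a coin toss)"
    if not outcomes_known and probabilities_known:
        return "Q2: unknown-known (e.g., an FX rate in 90 days)"
    if outcomes_known and not probabilities_known:
        return "Q3: known-unknown (e.g., an envisioned extreme scenario)"
    return "Q4: unknown-unknown (cannot be envisioned ex ante)"
```

Each quadrant returned by this sketch is examined in turn in the sections that follow.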

4. Exposure from the Known

4.1. Known-Known

Quadrant 1 (Q1) covers circumstances where all possible outcomes are known and their probabilities are also known, such as the tossing of a coin where all possible outcomes are known, and each has a known probability of 50%. For Q1 circumstances, each known outcome has defined implications and thus can be addressed with defined solutions. While important in decision-tree analytics, very few situations in reality have such defined characteristics where all possible outcomes along with the probabilities of each outcome are known.

4.2. Unknown-Known

Quadrant 2 (Q2) covers circumstances where we do not know what the specific outcome will be, such as the Euro–US Dollar exchange rate or oil prices in 90 days. The outcome is thus unknown; however, based upon historical data, the probabilities of the various possible outcomes are known. Using these known probabilities, we can derive an expected value, and with it we can address Q2 circumstances. However, when actual outcomes deviate from the expected value, adversities from Q2 circumstances can arise. Theoretically, such adversities should not pose catastrophic threats if there are (i) sound models to predict the deviation of each possible outcome from the expected value, and (ii) prudent policies that define the maximum acceptable downside deviation that can be absorbed by the operating model. The adversity from Q2 circumstances, if addressed effectively, will thus be limited to volatility within a policy-designated range. Therefore, in addressing Q2 circumstances the objective is to manage the volatility while maximizing the value of the outcome. This is accomplished by managing probabilities to keep the volatility within the acceptable limit. Such actions, referred to as “Operating Solutions”, are undertaken within the operating model of the organization.
Examples of Q2 circumstances abound, such as:
  • A critical component in manufacturing a product is unexpectedly out of stock, which can be prevented by maintaining excess inventory.
  • The market value of a financial asset drops unexpectedly, the impact of which can be mitigated by a diversified portfolio.
  • An unexpected imbalance occurs between an electricity grid’s active capacity and the current demand, which can be mitigated by multiple standby sources of power, such as peaking plants or a plan to shed loads.
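The Q2 logic of managing volatility against a policy-designated limit can be illustrated with a minimal numerical sketch (all figures, including the policy limit, are illustrative assumptions, not from the paper):

```python
# Minimal sketch of Q2: outcome unknown, probabilities known.
# All numbers below are illustrative assumptions.

outcomes = [90.0, 100.0, 110.0, 130.0]   # possible values of a position
probs = [0.10, 0.40, 0.40, 0.10]         # known probabilities (sum to 1)

# Expected value and volatility derived from the known distribution.
expected = sum(p * x for p, x in zip(probs, outcomes))      # 106.0
std_dev = sum(p * (x - expected) ** 2
              for p, x in zip(probs, outcomes)) ** 0.5

# A prudent policy defines the maximum downside deviation from the
# expected value that the operating model can absorb.
MAX_DOWNSIDE = 15.0
worst_deviation = expected - min(outcomes)                  # 16.0

# If the worst deviation exceeds the limit, an "Operating Solution"
# (e.g., reshaping the distribution by hedging) is called for.
within_policy = worst_deviation <= MAX_DOWNSIDE             # False
```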

4.3. Known-Unknown

Quadrant 3 (Q3) covers circumstances that can be envisioned, where a specific possible outcome can be defined and is thus known, such as an extreme scenario of a foreign exchange rate of USD1 = €2, an oil price of USD300 a barrel, a very large decline in asset values over a very short period (as happened in 2008), or a dramatic long-term change in social, political, technological, or economic trends, such as the adoption of e-commerce by consumers. A specific outcome is thus given or known; however, the probability of that outcome is unknown. By simulating specific known circumstances through stress testing and scenario analyses, solutions can be developed for each envisioned outcome. Adversities from Q3 circumstances arise when the organization did not prepare for a known possible outcome, and they can have a huge unfavorable impact in extreme situations, such as the demises of Long-Term Capital Management in 1998 and MF Global in 2011. Theoretically, Q3 circumstances should not pose a catastrophic threat if there are (i) sound models to simulate exposure from all possible known outcomes, and (ii) prudent policies that define the maximum acceptable exposure that the operating model can absorb. Therefore, in addressing Q3 circumstances, the objective is to manage the exposure to preserve the operating model. Q3 circumstances are addressed through backstop protections, or “Contingent Solutions/Strategies”, that are activated to limit the impact of adversity, such as insurance, guarantees, real options, alternative strategies or Plan B, and capital.
Some examples of Q3 circumstances:
  • The supplier of a critical component cannot supply it for an extended period, which can be prevented by maintaining standby suppliers for the critical component.
  • The market value of several financial assets drops precipitously, the impact of which can be mitigated by maintaining a cushion of capital to absorb losses.
  • All power plants supplying an electricity grid shut down, which can be prevented by installing standby generators called “black starts” or by providing spinning capacity.
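The Q3 logic of simulating envisioned scenarios against a backstop can be sketched as follows (scenario names, loss figures, and the capital cushion are illustrative assumptions, not from the paper):

```python
# Minimal sketch of Q3: known (envisioned) outcomes, unknown probabilities.
# Each scenario is stress-tested against a backstop; here the backstop is
# a capital cushion. All names and figures are illustrative assumptions.

capital_cushion = 50.0  # a "Contingent Solution": capital to absorb losses

# Simulated loss under each envisioned extreme scenario.
scenarios = {
    "FX shock (USD1 = EUR2)": 35.0,
    "Oil at USD300 a barrel": 42.0,
    "2008-style asset-value crash": 60.0,
}

# A scenario is covered if the backstop can absorb the simulated loss;
# uncovered scenarios call for additional Contingent Solutions.
covered = {name: loss <= capital_cushion for name, loss in scenarios.items()}
uncovered = [name for name, ok in covered.items() if not ok]
```

The point of the sketch is that Q3 management is only as complete as the list of scenarios: an outcome missing from the dictionary is, by definition, unaddressed.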
Theoretically, Q3 circumstances can be managed effectively, assuming all outcomes that can be envisioned are addressed. However, decisions are often based on intuitive reasoning, heuristics, or instinctive visceral desires, leading to failures of nerve (Levin and Milgrom 2004, pp. 22–23). Given that humans have limited cognitive capacity, there is simply no way to reason through every outcome. Thus, outcomes that are not considered due to the failure of nerve become unknown, and they must be treated as such rather than ignored.

4.4. Primary Focus in Addressing the Known: Event-Centric Prevention

All three sets of circumstances, those of Q1, Q2 and Q3, have one common theme: they can be envisioned. Once envisioned, they can be simulated and then managed through a combination of (i) controls that reduce the probability or likelihood of adversity, and (ii) solutions that absorb the impact of defined adversities. If managed prudently, these circumstances do not represent catastrophic threats. This combination of steps to control such circumstances and develop solutions for them constitutes an event-centric approach, in which the primary focus is on defining the event and its likelihood. The event-centric approach addresses risk and uncertainty through a combination of simulation, prevention, and absorption of the impact of adversities. This is traditional risk management.
What makes the event-centric approach work (its exclusive focus on what can be envisioned) also reveals the limitation of traditional risk management: it excludes what cannot be envisioned. By not making the distinction between extreme adversities arising from what can be envisioned and those arising from what cannot, organizations may end up using only the event-centric approach, and thus find themselves unprepared when catastrophic circumstances arise from the unknown.
For example, ignoring the distinction, and thus relying solely on event-centric approaches to prevent adversity, may have left Lehman Brothers with no options on the eve of its demise on 15 September 2008. Similarly, despite the knowledge of what a pandemic can bring, as envisioned and depicted in a 2011 movie titled Contagion, ignoring the distinction, and thus relying solely on event-centric approaches to prevent adversity, may have left some organizations unprepared and struggling for survival when demand for their products or services disappeared suddenly in 2020.

4.5. Scholarly Evolution

Even though the distinction between adversities that can be envisioned and the ones that arise from the unknown is often ignored, the critical importance of this distinction has been highlighted throughout the history of the evolution of risk and uncertainty theory.
One of the primary pillars of risk and uncertainty theory is based upon the work of Frank Knight. He classified circumstances into “three different types of probability situation”: “A priori probability” [referred to as Q1 in this paper], “Statistical probability” [referred to as Q2 in this paper], and “Estimates” [referred to as Q3 and Q4 in this paper] (Knight 1921). He further described risk as related to future situations where probability distributions of possible outcomes are known, and uncertainty where such probability distributions are unknown:
To preserve the distinction … between the measurable uncertainty and an unmeasurable one we may use the term “risk” to designate the former and the term “uncertainty” for the latter …. The practical difference between the two categories, risk and uncertainty, is that in the former the distribution of the outcome in a group of instances is known …, while in the case of uncertainty this is not true, the reason being in general that it is impossible to form a group of instances, because the situation dealt with is in a high degree unique.
This concept of differentiating uncertainty from risk has been further elaborated and dissected by economists (Guerron-Quintana 2012; Park and Shapira 2017; De Groot and Thurik 2018; Dizikes 2010).
However, the critical unknown element of uncertainty is not so much the probabilities as defining the possible outcomes. According to Langlois and Cosgel:
In Knight’s view, uncertainty arises out of our partial knowledge …. The key issue, however, is to understand what the partialness is about …. Knight’s use of partial knowledge, we argue, reveals that his distinction between risk and uncertainty has more to do with the initial classification of random outcomes than with the assignment of probabilities to the outcomes …. The point is not so much that we do not know probabilities as that we do not know the classification of outcomes … uncertainty as Knight understood it arises from the impossibility of exhaustive classification of states …. When a decision maker faces uncertainty … he or she would have first to “estimate” the possible outcomes to be able to “estimate” the probabilities of occurrence of each.
In the post-World War II decades, decision theory found applications in addressing uncertainty, as discussed by Kenneth Arrow in his 1951 and 1953 papers (Arrow 1951). Menahem Yaari summarized the concept in an essay: “The basic idea underlying this paper was to model the uncertainty faced by a decision maker not in terms of the probabilities of various events but in terms of the underlying states-of-nature which make up those events” (Yaari 2017).
It is this estimation of possible states-of-nature that distinguishes what can be envisioned from what cannot be envisioned, a point made by George L. S. Shackle, as described by S.F. Frowen:
At the heart of Shackle’s theory of choice is the idea that, when people consider the possible consequences of taking a decision, they give their attention only to those outcomes that they (a) imagine and (b) deem, to some degree, to be possible. This set of outcomes is not guaranteed to include what actually happens, which may be an event they had not even imagined and which comes as a complete surprise to them.
Several developments of the last few decades, such as scenario planning and stress testing, evolved out of Shackle’s thinking about possible consequences. However, in the absence of tools to fully operationalize this thinking, emphasis moved away from developing this theory further and towards the rational expectations hypothesis and probability distributions, which overshadowed Shackle’s theory and its evolution in relation to decision making.
However, partial knowledge of states-of-nature continues to present problems in addressing uncertainty, as was highlighted by the financial crisis of 2008 and summarized by Nelson and Katzenstein:
The financial crisis of 2008 reminds us that we live in a world of risk and uncertainty. Knight and Keynes developed the conceptual distinction between risk and uncertainty ninety years ago and it remains fundamentally important today. In risky environments, sorting events into different classes poses no special challenge for sophisticated decision makers. We cannot be sure what tomorrow will bring, but we can rest assured that unforeseen events will be drawn from known probability distributions “with fixed mean and variance”. In the world of risk the assumption that agents follow consistent, rational, instrumental decision rules is plausible. But that assumption becomes untenable when parameters are too unstable to quantify the prospects for events that may or may not happen in the future.
The fact that the states of nature cannot be envisioned and defined ex ante makes all event-centric approaches, including decision-tree analyses, scenario planning and stress testing, deficient in addressing the unknown. This gap between the states of nature that can and cannot be envisioned is not fully addressed in either the literature or practice and needs further exploration.

5. Exposure from the Unknown

The circumstances of Quadrant 4 (Q4) are unknown because they cannot be envisioned, and obviously the probabilities of such outcomes are also unknown: unknown-unknown. It is not possible to offer a hypothetical example of a future Q4 event, because once an example is outlined it can be envisioned and is thus no longer unknown. Therefore, examples of Q4 circumstances can only be described in the rearview mirror.
Some examples of past Q4 circumstances: a worldwide shortage of a critical component for an extended period, leading to existential adversities for a business (Ericsson) in 2000; the freezing of global financial markets, translating into a worldwide liquidity crisis in 2008; the coincidental out-of-commission status of most of the black-start generators, leaving an electricity grid with falling frequency perilously close to an uncontrolled collapse in Texas in 2021; the terrorist attacks of 9/11 in the US in 2001 and 7/7 in the UK in 2005; and the sanctions on Russia in 2022.

5.1. Current Approach to Address the Unknown

Experience shows that organizations deal with extreme exposure from the unknown in one of three ways:
  • Ignore it, believing that its near-zero likelihood equates to an expected value too small to be worth acting on, and that there is no point worrying about what cannot be controlled.
  • Take no explicit action, believing that the organization’s strength and resiliency are the best guards against extreme risk.
  • Make explicit efforts to address threats from the unknown through early warning systems, believing that extreme adversities can be prevented through pre-emptive steps, such as prudent policies and robust controls, disciplined monitoring of developments and trends, and a strong managerial process that anticipates alternative outcomes.
In fact, none of these approaches ensures that an organization can survive meltdowns that arise from Q4 circumstances. Yet many managers think otherwise because of the failure to distinguish between adverse circumstances that can be envisioned and those that cannot. Three important reasons for this failure may be:
  • Less than full appreciation of the distinct characteristics of Q4 exposure.
  • The hindsight bias that leads individuals to believe that they had previously predicted or thought of an adverse event before it happened and thus they can predict other future events (Kahneman et al. 1982).
  • Another bias that leads individuals to exaggerate the importance of a specific factor they identified as the cause of an adverse event while overlooking numerous other factors, some of which may have been unknown and had a greater impact, and thus to believe that they can identify the factors that will contribute to future adversities (Gilbert 2006).
This failure is the reason organizations may continue to rely on traditional risk management to prevent adversities that cannot be envisioned ex ante. Let us look at some of these misplaced beliefs and define the characteristics of Q4 circumstances in order to develop solutions.

5.2. Can Materialize Anytime, Anywhere, and Cannot Be Envisioned, Simulated or Prevented

People often think that extreme exposure from the unknown arises only from headline-making events beyond their control, such as major disasters, natural calamities, pandemics, market crashes, etc. In reality, it does not take much to start a cascading chain of events that leads to an unanticipated outcome. There are examples of small events that would normally be routine but cascaded into catastrophes, such as the sparks from a defective drill in a small factory making a US$8 part that led to a nationwide shutdown of Toyota operations in Japan at a critical marketing moment in 1997 (Nishiguchi and Beaudet 1998).
It is impossible to tell ex ante which minor event may turn into a catastrophe. An analogy would be observing many small wavelets somewhere in the middle of the ocean. Some of these wavelets may ultimately turn into waves … some into large waves … or even into huge waves. It is easy to forget that tsunamis also begin as what seems like small wavelets.
Tsunamis Begin with Almost Undetectable Small Wavelets
There was nothing out of the ordinary in Albuquerque, NM on this Friday. As the day ended, thunderstorms came rolling in. At their peak, just before 8 P.M., a huge lightning bolt struck an electric line causing large power fluctuations, creating a blaze in Fabricator No. 22 at a nearby Philips plant. A combination of sprinklers and employee efforts put out the fire in under 10 min.
Fabricator No. 22 churned out millions of semiconductors with complex circuits on chips barely half an inch square. Precision was critical, and the presence of any airborne foreign matter, too small to be visible to the human eye, on or around the chips could ruin these semiconductors. Such fabricators were referred to as "clean rooms".
The fire was minor. But the damage from water, smoke and soot was extensive, and it was not limited to the work-in-progress chips exposed to the air. The fabricator was no longer "clean", making the plant useless for fabrication.
On Monday morning, as the process of informing customers began, one of the first calls went to Espoo, Finland, about 5000 miles away, explaining the incident, the number of chips lost, and the expected plant recovery time of one week to the chief purchasing manager of Nokia, the world's leading electronics company with almost US$20 billion in revenues. A one-week delay in the supply chain was no big deal, as experienced managers know that supply disruptions are more the rule than the exception. The purchasing manager shared the news with all members of the Nokia team. The affected items were placed on "special monitor" and a routine of daily calls to Philips was initiated.
About two weeks later, Philips realized the full scope of the damage and estimated that it would take weeks, maybe months, to restore the clean rooms and restart production. Of the indispensable Albuquerque components purchased by Nokia, one, known as the ASIC chip, was made only by Philips. Nokia estimated that, in the absence of this chip, it would be 4 million handsets short during the holiday season, the strongest cell-phone sales period. That amounted to 5% of the company's total sales. With the cell-phone market booming, running short of handsets would be a major problem and could significantly erode the company's market share.
Nokia pulled out all the stops, internally and externally. The problem was elevated to the Nokia CEO, who obtained commitments from the Philips CEO to dedicate all excess capacity to Nokia until the Albuquerque plant was restored. Through such collaboration, Nokia avoided disruption, and handsets kept rolling off the assembly line to reach store shelves.
The second of the two major customers of the Albuquerque plant was Ericsson, Sweden's largest company with US$29 billion in sales. When Ericsson received the call on the Monday following the fire, the news seemed, as at Nokia, no big deal. However, it was treated as one technician talking to another, and the news was not shared widely within the company. Even two weeks later, Ericsson estimated the delay at only a few weeks, and the news was still not widely disseminated internally.
A week later, the full impact of the delay was realized. Major efforts were initiated at Ericsson to obtain the critical semiconductors, including the ASIC chips. Philips had no spare capacity to alleviate the problem, and no other manufacturer could supply them in time for the upcoming holiday season. With a significant delay in getting the new model to market, Ericsson missed the holiday season.
The end result was a favorable marketplace reception for Nokia cell phones, and its market share increased from 27% to 30%. Ericsson, on the other hand, found itself without a new model in the high-end cell-phone market and its market share went down from 12% to 9% in six months.
Eighteen months later, the WSJ reported: “Ericsson said a slew of component shortages, a wrong product mix and marketing problems sparked a loss of 16.2 billion kronor (US$1.68 billion) last year for the company’s mobile phone division … The company will take an additional restructuring charge of eight billion kronor to finance further restructuring of its mobile-phone unit. The news sent Ericsson shares falling 13.5% [last Friday] and shook other high-tech stocks around the world. … When the company revealed the damage from the fire for the first time publicly last July, its shares tumbled 14% in just hours. Since then, the shares have continued to fall along with the declining fortunes of many global telecommunications stocks. Ericsson shares are trading around 50% below where they were before the fire”.
Less than 24 months after the fire, the wavelets that began that Friday evening almost 5000 miles away had turned into a tsunami and forced Ericsson out of the handset-manufacturing business, with an almost 80% drop in its investor value over the period.
This case is based upon the events as discussed in The Spider’s Strategy (Mukherjee 2008) and reported in the press.
The disaster described here was triggered by an event so ordinary that it did not even make the local news in Albuquerque, NM, and it happened when no one had envisioned a universal shortage of chips. Two observations from this case are that extreme exposure from the unknown: (i) can materialize anytime, anywhere; and (ii) cannot be envisioned ex ante, and thus can be neither simulated nor prevented.

5.3. Early Warning Mechanisms and Metrics Not Useful in Predicting and Preventing

People often believe that an organization's strength and resiliency are the best guards against extreme exposure from the unknown. Therefore, they may rely on traditional metrics to gauge an entity's strength to withstand all extreme threats, including those that cannot be envisioned.
Traditional Metrics Can Create a False Sense of Security
Memoirs of a Senior Global-Bank Executive Who Managed the Client Relationship with AIG.
First Quarter 2007
The company was well capitalized. There was a tremendous amount of experience in all forms of insurance products and a very strong culture of sales and marketing because of CEO Sullivan’s successful experience. An entrepreneurial atmosphere permeated the company’s four major business segments. The AIG Financial Products (AIGFP) business was growing and contributing major profits to corporate earnings. It was a leader in structured swaps and credit default swap (CDS) transactions … One of the biggest corporate challenges at this point seemed to be cash management, as the company was flush with liquidity.
What could be better! Writing insurance for these financial products raked in huge profits, with no significant capital allocations, and no cash calls to post collateral. All the securities, regardless of their fair-market value designation of level 1, 2, or 3, were listed at par, and many were rated AAA by rating agencies. It was what any business would like to have!
September 2007
AIG reported a staggering US$352 million loss in CDS in the financial products business.
This caused a rush of adrenaline at our institution as loan agreements were pulled out for review. Our institution’s exposure to AIG ran into billions through direct loans, CDS protection purchased from AIG, and other “CDS wrapped” transactions. A downgrade would trigger a huge collateral call on AIG. The numbers were quite sobering. But I remember the general sentiment: “Hey, this is AIG!” The net assessment was that the downgrade was highly unlikely. AIG had just raised US$20 billion through a series of capital transactions. Their shareholders’ equity stood at over US$78 billion, with total assets of over US$1 trillion, as of 30 June 2007.
In addition, a review of traditional risk-management indicators suggested ample capital, cash, and securities to cover any collateral call that AIG may face in a hypothetically extreme scenario. The concern about the lack of transparency had grown significantly as the question kept coming up: “How much counter-party exposure to how many parties?” But discussions always concluded that AIG had sufficient resources to cover its obligations.
15 June 2008
Just weeks after J.P. Morgan Chase had completed the acquisition of Bear Stearns, AIG reported a shocking US$5.6 billion loss attributable to AIGFP’s CDS portfolio, along with US$20 billion in write-downs on CDS guarantees and an additional US$18 billion write-down on mortgage- and asset-backed securities during the previous three quarters.
Early September 2008
… we reviewed and discussed AIG’s credit. At that time, AIG seemed well capitalized, with significant global assets. There was some concern about the lack of complete transparency of AIG’s exposure to other counter parties, including the number of parties involved. But on that day, any concern about AIG almost seemed to be a side issue.
15 September 2008
The downgrade this evening from AA to A- prompted a collateral call for a staggering amount of US$14.5 billion. Since the Lehman Brothers’ bankruptcy this morning, no one could be sure of what is to come next. The subsequent panic has frozen the market; because of a lack of buyers, it is not clear where the market prices of assets are.
There was a full-blown panic at our institution as rumors began swirling around this morning. If Lehman wasn’t rescued, who could be next? Who will come to AIG’s rescue? What happens if there is no rescue?
It is common practice to use relevant metrics to describe the strength of a system (be it financial, operational or physical) to withstand adversity. By comparing a system's strength, as measured by these metrics, to implied standards established from observations over time, one can draw conclusions about the system's ability to withstand adversities. However, meeting or exceeding such standards provides no guidance as to how the system would perform under extreme Q4 adversities that cannot be envisioned and may far exceed any historically observed adversity.
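This limitation of standards-based metrics can be sketched in a few lines; the figures are illustrative assumptions only, loosely patterned on the case above rather than taken from it:

```python
# Sketch: a strength metric calibrated to observed history can say "ample",
# while an unenvisioned shock exceeds anything in the observation window.
# All figures are illustrative assumptions.
capital = 78.0                         # capital, in $bn
historical_worst_call = 5.0            # largest collateral call ever observed, $bn
standard = 3 * historical_worst_call   # implied standard: cover 3x the worst seen

meets_standard = capital >= standard   # True -> "ample capital" by the metric

q4_call = 20 * historical_worst_call   # a Q4 call far outside observed history
survives_q4 = capital >= q4_call       # the metric said nothing about this case
print(meets_standard, survives_q4)
```

The metric is not wrong on its own terms; it is simply silent about adversities outside the observation window, which is where Q4 circumstances live.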
Based upon evaluations using traditional metrics of strength, analysts and credit-rating agencies had assigned a strong AA rating to AIG and concluded, as quoted in the case above, that "traditional risk-management indicators suggested ample capital, cash, and securities to cover any collateral call that AIG may face in a hypothetically extreme scenario". And yet AIG fell apart. One observation from this case is that traditional metrics do not gauge an entity's strength to withstand extreme adversity from the unknown and, unless viewed with such explicit caveats, can create a false sense of security.
It is natural to make resiliency a priority, as it helps organizations recover from adversity. However, without the context of distinction between adversities that can be envisioned and that cannot be envisioned, it may be misleading to emphasize that “business resilience is fundamental for a company to withstand major disruptions and survive an unpredictable, turbulent future” (Wladawsky-Berger 2019).
Merriam-Webster Dictionary defines resiliency as an ability to recover from or adjust easily to adversity or change. Resiliency is relevant only if the organization can survive a catastrophe first. Unfortunately, efforts and investment to achieve high resiliency offer no assurance of survival from existential threats from the unknown. Ignoring the unknown because of concerted steps to establish a high degree of resiliency is analogous to relying on the best post-hospitalization rehab therapy to survive a catastrophic cardiac event.
Some people believe that close monitoring of developments and trends can provide early warnings to prepare for adversities through planning steps, such as stress testing and scenario analyses for fast-paced events (such as a financial and commodity market crash or a natural disaster), or Plan B protocols to respond to changes from slower-developing trends (such as the adoption of new technology or acceptance of new social and political attitudes). While appropriate to prevent what can be defined, such measures cannot address catastrophes from the unknown because of how catastrophes materialize.

5.3.1. Correlations Determine Outcomes

While any event can take a toll on organizations, catastrophes from the unknown are rarely the result of a direct hit from an incident. They arise from unanticipated correlations between an event and other factors present at the time. It is useful to internalize why early warning systems often cannot foresee and warn of impending outcomes arising from the unknown.
Any event correlates with factors present at the time and triggers a cascade of subsequent events; these in turn correlate with other factors to trigger further cascades, and this correlating-and-cascading process goes on.
While initial events are likely to influence how the following events may turn out, the correlation of the impact of an initial event with other factors may play a significant, if not greater, role in determining the outcomes of the following events. As a result, one initial event can have two very different outcomes for two separate contexts depending upon the correlations that follow, as shown in the Ericsson/Nokia Case. Therefore, given an initial event, the outcome depends on how the event correlates with other factors present at the time.

5.3.2. Nonlinear Correlations Make It Impossible to Envision, Simulate and Predict

Humans often form expectations about the outcomes of a chain of events based upon prior observations of correlations. Thus, initial events create expectations about the correlations that will shape the outcomes of subsequent events. Most of the time these correlations are linear, and outcomes turn out as anticipated and predicted. However, when correlations are nonlinear, outcomes are unpredictable and unanticipated. L. Douglas Kiel argues, for example, that in nonlinear situations outcomes are subject to high levels of uncertainty and unpredictability, and thus they surprise us (Kiel 1996).
There can be an uncountable number of possible correlations between any specific event and the other correlating factors that start a cascade of subsequent events. Humans—even with the help of supercomputers and analytics—are likely to envision and process only a small fraction of these correlations. Correlations that cannot be envisioned ex ante, or nonlinear correlations, result in Q4 unknowns that can cascade into catastrophes. Therefore, nonlinearities can make correlations deviate from anticipations and very difficult, if not impossible, to envision, rendering an early warning system useless for predicting Q4 outcomes.
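The contrast between linear and nonlinear correlation can be illustrated with a toy iteration; the dynamics below (a logistic-style step) are an assumed textbook example of nonlinearity, not a model of any real organization or market:

```python
# Toy illustration with assumed dynamics: how a shock propagates under
# linear vs. nonlinear cascading.
def cascade(x, steps, nonlinear):
    for _ in range(steps):
        # logistic-style nonlinear step vs. simple linear damping
        x = 3.9 * x * (1.0 - x) if nonlinear else 0.9 * x
    return x

a, b = 0.400, 0.401   # two almost indistinguishable initial shocks
lin_gap = abs(cascade(a, 40, False) - cascade(b, 40, False))
non_gap = abs(cascade(a, 40, True) - cascade(b, 40, True))
# Linearly, the tiny initial difference stays tiny; nonlinearly, the two
# trajectories diverge to very different outcomes, defeating prediction.
print(f"linear gap: {lin_gap:.2e}, nonlinear gap: {non_gap:.2e}")
```

Under the linear rule, near-identical starting conditions yield near-identical outcomes, which is what makes early warning feasible; under the nonlinear rule, they do not, which is why prediction from early observations fails.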
An example of correlations that had not been envisioned and that created Q4 complexity relates to the prevailing expectation that the combination of natural gas and wind farms achieves reliable electricity generation at affordable prices. Few people anticipated the correlated consequences of three disparate events: (i) a dead calm in the North Sea, normally one of the windiest places in Europe; (ii) a drop in natural gas shipments from the Middle East to Europe because supplies were diverted to meet increased demand in China; and (iii) reduced supplies from Russia. Each of these events could be envisioned individually, but their coincident correlation produced a nationwide energy crisis in the UK in October 2021 that had not been envisioned and led to the demise of several companies.

5.4. Nonlinear Correlations Cannot Be Recognized Ex-Ante

The pace of nonlinear correlations resulting in unanticipated outcomes is the reason Q4 unknowns may be hard to detect ex ante. This pace can be too fast for an organization to recognize and address the unanticipated outcome. For example, a fast-moving financial-market shock, a natural disaster, or a new technology application can turn into a catastrophe before an organization can formulate and implement effective responses.
Alternatively, the pace of nonlinear correlations can be so slow that their incremental cascading impact is imperceptible in any given period and is thus not recognizable until cumulative consequences become evident over time. Even though in hindsight each incremental impact may seem clear, in real time such increments may go unrecognized.
Thus, unanticipated nonlinear correlations can never be foreseen, simulated, predicted, or prevented, and can lead to catastrophes almost spontaneously or over a long period of time.
Correlating Factors Are Easily Missed in Real Time Even When Developing Over Years
Things were looking good. The company, the world’s largest private-sector producer and distributor in its industry, was named to Fortune Magazine’s 2008 list of America’s Most Admired Companies. Revenues had been growing consistently year-after-year. Its stock price reached an all-time high earlier in the year. Following record third quarter results, the announcement said: “We continue to differentiate ourselves with a very strong financial and operating performance, rising cash flows, major contract backlog and rapidly growing international contributions. In the face of current economic conditions, we are pleased to be raising our outlook”.
“We are seeing the first full quarter of benefits from our multi-year capital investment program”, said the CFO.
Also, Standard & Poor’s had just upgraded its corporate credit rating to BB+ based on the “strengthened credit measures due to the company’s financial performance and favorable outlook”.
The company, Peabody Energy based in St. Louis, Missouri in the US, was one of the world’s leading pure-play coal companies and operated through the entire value chain, from mining the coal, to selling and distributing it for electricity production and steelmaking. Prospects for price increase looked good as the global coal demand was growing. Peabody was well positioned to benefit from this.
The company said in 2010 that global demand for coal was entering a multi-year growth period: “We’re in the early stages of a 30-year supercycle in global coal markets”, and reiterated in 2011 that “the coal supercycle is just getting underway”.
Prospects remained positive through 2011. Peabody received CEO of the Year and Energy Company of the Year Honors at Global Energy Awards and S&P affirmed its BB+ credit rating. However, little attention was paid by the company or the marketplace to certain factors that had been developing over several years.
Pioneered by Halliburton Company, natural gas extraction through fracking dates to the 1940s but its cost, while declining over the years, remained high relative to coal until as recently as 2000. Coal used to be the cheapest source of electricity. But the cost of meeting new environmental standards raised its cost while the spurt in the number of fracking wells dropped natural gas prices. By 2012, the significant coal versus natural gas cost advantage was almost gone.
Historically, environmentalists, without much funding or disciplined organization, were viewed more as a nuisance than as game changers. However, beginning in the late 2000s, the Sierra Club in the US changed its modus operandi. Encouraged by the White House, and with Michael Bloomberg's commitment of tens of millions of dollars to launch an organization with strategy-based plans in 45 states, it began a disciplined "war on coal" at state and local levels that killed on average one coal-fired power plant every 10 days between 2010 and 2015. By 2012, such groups represented existential threats to the coal industry in the US.
Peabody reported record operating and net profit margins, return on equity, and net income of almost US$1 billion in 2011.
That was the last profitable year at Peabody. Starting in the early 2010s, the US coal industry began imploding, and the company began closing mines rapidly. Saddled with debt, and with the outlook growing bleaker by the day, Peabody Energy filed for bankruptcy on 13 April 2016.
This case is based upon information sourced by links inserted above as well as events as reported in Politico.
With the benefit of hindsight, it is easy to see clearly what happened. A sea change was underway, and the day of reckoning for the coal-mining business in the US was around the corner. But in real time it was obvious neither to the company, as shown by its statements and actions, nor to the market, as shown by the steady stock price and the affirmation of Peabody's BB+ credit rating.
Fracking began changing the economic advantage of coal and started what were, at the time, little-noticed small wavelets. But these wavelets quickly correlated with the success of the environmentalist movement and turned into a tsunami that changed the entire coal industry. This correlation materialized over several years but was missed by both the company and the marketplace until it was too late for Peabody.
A key observation from this case is that unanticipated correlations cannot be envisioned or monitored ex ante, even when they materialize over a long period of time.
Infrastructure companies tend to require high levels of initial investment, with very long project lives and extended time horizons for return on investment. Over such horizons, much can happen that could be envisioned but is not anticipated because of a failure of nerve, not to mention the unknowns that cannot be envisioned and can materialize and impact investor returns because of a failure of imagination. For example, with a financial-return horizon of 15–20 years, Peabody was making significant investment decisions in the early 2000s, when fracking on a large scale was not even practical and environmentalists were ad hoc groups throwing themselves in front of bulldozers to stop projects. As the unknown materialized, Peabody was challenged, leading to its demise in less than 10 years.

5.4.1. Climate Change Is an Unprecedented Challenge

This is significant in relation to climate change. Even though some macro-level hypotheses are being proposed, exposure from climate change, and how it would impact organizations and individuals, cannot be envisioned and defined with certainty. The closest examples of previous phenomena that could not be envisioned ex ante and that brought change on a global scale are the industrial revolution, the commercialization of the Internet and the adoption of digital social media. These phenomena upended the way we did everyday things, redefined how we lived and interacted with the rest of the world, wiped out many organizations and created new ones. However, these phenomena were created by humans, and none impacted the natural fundamentals of life on earth. Climate change has the potential to do just that. Therefore, because nothing like this has happened on such a global scale in modern times, and the anthropogenic influence on climate has only recently been postulated, there are many more unknowns about its impact than about any other type of risk we might be able to envision.
Established in 1988 and endorsed by the UN General Assembly, the Intergovernmental Panel on Climate Change (IPCC) has produced five Assessment Reports on the evidence and risks of climate change. The work of the IPCC has been widely reported in the news media, and concerns about warmer temperatures, rising sea levels, more frequent extreme weather, and risks to human settlements and food production, among other possible consequences, are now widely discussed. However, it is not clear how one could envision the impact on life at organization levels, either directly as a result of these suggested changes or indirectly because of the actions by others to counter the impact of climate change, making mitigation strategies nearly impossible to develop. According to Larry Fink, Chairman & CEO of BlackRock, Inc., every company and every industry may be transformed by the transition to a net zero carbon world (Fink 2022). In addition, because of the global interconnectivity of organizations today, any impact or action in one region or organization would likely induce cascading effects on other organizations near and far. Speaking to a gathering at Lloyd's of London in 2015, the Governor of the Bank of England, Mark Carney, spoke of a "tragedy of the horizon": "The challenges currently posed by climate change pale in significance compared with what might come … We don't need an army of actuaries to tell us that the catastrophic impacts of climate change will be felt beyond the traditional horizons of most actors …" (Carney 2015). So, because of its direct and indirect effects and the global interconnectivity, no organization may escape the unknown implications of climate change.
There is another related unknown. The effectiveness of any action one might take depends on, among other things, how well one understands the cause that the action is designed to address. While there is accumulated evidence, attributing causes to specific outcomes of a changing climate remains a theoretical and empirical challenge (Koonin 2021). These are emerging hypotheses, and there is no absolute certainty that we really know the real cause of climate change. It remains within the realm of possibility that the real cause itself cannot be envisioned today. This could mean that it is possible that actions one might take today to address any perceived cause of climate change and its projected consequences could be ineffectual or even counterproductive. Hence, the combination of the unknown about the cause of climate change, direct effects, and the implications of the actions taken to counter this cause and its direct effects compounds the extent of the unknown.
For these reasons, climate change represents an unprecedented challenge because so much about it cannot be envisioned today. Every organization, irrespective of industry, size or location, must develop capabilities to address the unknown. This is particularly true for organizations facing long-term planning horizons in capital-intensive industries.

5.4.2. Climate Change Exposure Requires Urgent Focus on the Unknown

Organizations may tend to eschew commitments to plan for uncertain consequences of climate change that seem far into the future. This orientation would be defensible in relation to Q2 and Q3 events that could be envisioned and prevented. Unfortunately, such postponements may magnify exposure resulting from Q4 circumstances, such as climate change, for which consequences can be neither envisioned nor prevented.
One can conclude that, because exposure from Q1, Q2 and Q3 events can only be addressed once it can be envisioned, the time to address it is as soon as it is envisioned. However, by the time a Q4 event can be envisioned, the exposure from it is already turning into adversity. Therefore, the time to address extreme exposure from Q4 threats is well before it can be envisioned.
The observations from these three case studies can be summarized into five distinct characteristics of extreme threats from the unknown:
  • They can materialize anytime, anywhere.
  • They cannot be envisioned ex ante, and thus they can neither be simulated, nor prevented.
  • Traditional metrics are useless in gauging an entity's strength to withstand adversity from such threats; they can be misleading and thus cultivate a false sense of security.
  • Correlations that give rise to them cannot be recognized ex ante even when they develop over a long period of time and are easily missed.
  • Their adverse impact can be so huge that it cannot be absorbed in the normal course of business and can have existential consequences.
Irrespective of the nature of exposure, one cannot classify, estimate, or extrapolate the kinds of conditions, correlations, timings, or situations that constitute Q4 circumstances. Therefore, it is prudent for all organizations to (i) distinguish between extreme exposure from the known and from the unknown, and (ii) address the unknown to survive meltdowns that cannot be prevented.
So, some salient questions are:
  • How do you ensure the next market crash will not devolve into an unanticipated meltdown for businesses, such as financial institutions?
  • How do you ensure survival when unanticipated longer-term trends redefine the marketplace for businesses, such as consumer products and services?
  • How do you protect shareholder value when expected payoffs from significant upfront investment are distributed over decades when political, regulatory, environmental, or technological changes that cannot be envisioned ex ante may render the operating model obsolete, such as for infrastructure businesses?
  • How do you ensure survival if any component of or link in the operating process fails with catastrophic consequences from an event that cannot be envisioned, such as for supply chains, or city, state and national infrastructure and services?

6. Practical Solutions

Adversities can arise from three distinct sources of exposure: (i) Q2, (ii) Q3, and (iii) Q4 circumstances.
Q2 events can be simulated and predicted using probabilities, and Q3 events can be simulated and stress-tested through scenario analyses, leading to decisions based upon the perceived expected value of cost–benefit considerations. In both cases, the critical component of the defense consists of prevention and/or absorption of the impact of potential adversities. These cases are addressed within the managerial process of the operating model, for which the objective is to preserve the operating model. This is done by managing probabilities and likelihood of occurrence to prevent adversity so that the upside potential of the business model can be maximized. This is traditional risk management.
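The two event-centric tools described here can be sketched in a few lines; the losses, probabilities, and scenario names below are hypothetical illustrations, not data from any case in this paper:

```python
# Sketch of the event-centric toolkit (illustrative figures throughout).
# Q2 (known outcomes, known probabilities): simulate an expected loss.
losses = [0.0, 1.0, 5.0, 20.0]          # possible losses, $m
probs = [0.90, 0.06, 0.03, 0.01]        # assumed probabilities (sum to 1)
expected_loss = sum(l * p for l, p in zip(losses, probs))

# Q3 (known outcomes, unknown probabilities): stress-test named scenarios
# and plan against the worst named one, without assigning probabilities.
scenarios = {"recession": 12.0, "supplier failure": 8.0, "rate shock": 15.0}
worst_named = max(scenarios.values())

print(f"Q2 expected loss: ${expected_loss:.2f}m")
print(f"Q3 worst named scenario: ${worst_named:.1f}m")
```

Both tools presuppose that the outcomes can be named ex ante, which is precisely what fails for Q4 circumstances.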
Q4 events cannot be prevented, nor can their catastrophic impact always be absorbed by the operating model and, thus, pose existential threats to going concerns. So, the objective in addressing Q4 events must be to preserve the going concern. This requires managing the impact of exposure so that existential vulnerabilities are minimized to survive adversity.

6.1. Primary Focus in Addressing the Unknown: Survival

In reality, all three sources of possible adversities must be addressed simultaneously. Therefore, in addition to the prevention focus of the event-centric approach to protect the operating model, organizations must have a second critical defense to explicitly preserve the going concern. This means thinking beyond probabilities and likelihood and recognizing that possibilities must be considered. This requires a focus on the possible worst-case exposure to assess "how vulnerable is the going concern?".
Regardless of the events giving rise to it, worst-case exposure can always be envisioned and defined, and for certain risk categories it can even be quantified. By focusing on the worst-case exposure, regardless of what event may cause it or how low its likelihood may be, one needs to ask how this worst-case exposure can be addressed to survive adversity and preserve the going concern. This is an exposure-centric approach where the focus is on defining and managing the exposure to survive catastrophe, instead of defining events and managing their likelihood to prevent catastrophe as in the event-centric approach.
Q4 circumstances, by themselves, can never be addressed, as we do not know what to address. However, by envisioning the worst-case exposure, we can turn an outcome of Q4 circumstances into a defined exposure, even though we do not know what events may turn it into a worst-case catastrophe. Defining the worst-case exposure thus turns Q4 circumstances (unknown outcomes, unknown probabilities) into Q3 outcomes (known outcomes, unknown probabilities) that can be simulated and addressed.

6.2. Defining Possible Worst Case Is Key

Defining the worst-case possibilities is the foundation, and it is the first step in implementing a disciplined exposure-centric approach. Defining such exposure requires reviewing and analyzing the worst case for every function in the organization: a complete breakdown of manufacturing and distribution functions; a sudden disappearance of demand for a product line; a steep drop in financial-asset values; an inability to maintain financial liquidity; or the collapse of an electrical or other service grid. Only then can the organization prioritize what needs to be addressed.
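The logic of this first step can be illustrated with a minimal sketch. All function names and figures below are hypothetical: once a worst-case exposure has been defined for each function, survival can be tested against the going concern's buffers even though the events that would trigger the loss remain unknown.

```python
# Hypothetical illustration: defined worst-case exposures per function (Q4
# outcomes made explicit), in millions. The triggering events and their
# probabilities remain unknown; only the possible worst-case loss is defined.
worst_case_exposure = {
    "manufacturing_breakdown": 120.0,
    "demand_disappearance": 200.0,
    "asset_value_drop": 150.0,
    "liquidity_freeze": 180.0,
}

# Hypothetical buffer: capital plus contingent liquidity available to absorb a meltdown
survival_buffer = 160.0

def existential_vulnerabilities(exposures, buffer):
    """Return the functions whose defined worst case exceeds the buffer,
    i.e., where the going concern would not survive the meltdown."""
    return [name for name, loss in exposures.items() if loss > buffer]

# Functions to prioritize for containment or vulnerability reduction
vulnerable = existential_vulnerabilities(worst_case_exposure, survival_buffer)
print(vulnerable)
```

Prioritization then starts from this list: any function whose defined worst case exceeds what the going concern can absorb is, by construction, an existential vulnerability.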

6.3. Preventing Adversity and Surviving Adversity: Two Distinct Objectives

The objective of efforts to address Q4 circumstances to preserve the going concern is not how to prevent the worst case, which is the focus of risk management, but how to survive when the worst-case meltdown cannot be prevented. This key distinction is often overlooked, and overlooking it can leave existential vulnerabilities unaddressed as organizations continue to focus on preventing, but not explicitly on surviving, the worst case. Both need to be addressed. To illustrate, consider the New York City government’s need to keep the city safe from terrorist attacks. This is critical to keeping the city operating. As a result, the New York Police Department (NYPD) spends enormous resources addressing Q2 and Q3 threats to prevent terrorist attacks. But there is also a recognition that, despite all these efforts, not every terrorist attempt can be foiled; one attack may succeed someday. So another unit of the city, separate from the NYPD, focuses on making sure that in the event of a terrorist attack, the city’s structure can survive the catastrophe. This unit’s mandate is not to prevent attacks but to ensure survival, and thus it focuses on preserving medical, water, safety, transportation, communications, and other infrastructure so that the city can continue to operate after an attack that could not be prevented. Similarly, organizations must ensure that the going concern is preserved in case of meltdowns that cannot be prevented.
Prioritizing between the need to prevent an adversity and the need to survive an adversity requires internalizing the relationship between operating models and going concerns.
A going concern exists to capture the potential of an operating model. At the same time, an operating model exists to achieve the objective of the going concern. The relationship between the going concern and its operating model is symbiotic, i.e., without a going concern the operating model does not have a platform to exist on, and without an operating model there is no purpose to having the going-concern platform.
Therefore, managing worst-case exposure does not mean placing a higher priority on addressing threats to the going concern than on preventing threats to the operating model, or vice versa. It requires managing worst-case exposure in such a way that addressing it does not get in the way of the operating model. It needs to be managed explicitly and distinctly so that the operating model can thrive and maximize its potential. The symbiotic relationship means that maximizing the operating model and preserving the going concern require thinking in another dimension.
Traditionally, when one thinks of downside exposure, one thinks of a tradeoff between two variables: risk and reward. The analysis above shows that to address risk and uncertainty completely, one needs to think in terms of three variables: reward (upside potential), risk (the downside exposure of Q2 and Q3), and existential vulnerability (the catastrophic exposure of Q4). Most organizations have disciplined frameworks for addressing the traditional two-dimensional tradeoff, but they often lack a similarly disciplined framework for the exposure from Q4. Therefore, a parallel framework is needed to address the catastrophic exposure from the unknown.
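The three-variable view can be sketched as a screening rule. This is an illustrative sketch, not a prescribed method, and all strategy names, limits, and figures are hypothetical: a strategy must pass the traditional risk/reward test and, in parallel, a survival test on its worst-case exposure, regardless of that worst case's likelihood.

```python
from dataclasses import dataclass

@dataclass
class Strategy:
    name: str
    reward: float      # expected upside, in millions
    risk: float        # Q2/Q3 downside exposure, in millions
    worst_case: float  # Q4 catastrophic (worst-case) exposure, in millions

def acceptable(s: Strategy, risk_limit: float, survival_buffer: float) -> bool:
    """Traditional risk screening plus a parallel going-concern test:
    the worst case must be survivable regardless of its likelihood."""
    return s.risk <= risk_limit and s.worst_case <= survival_buffer

# Hypothetical candidates and limits
candidates = [
    Strategy("expand_core", reward=50.0, risk=20.0, worst_case=400.0),
    Strategy("diversify", reward=35.0, risk=25.0, worst_case=150.0),
]
viable = [s.name for s in candidates if acceptable(s, risk_limit=30.0, survival_buffer=250.0)]
print(viable)
```

Note the outcome of the sketch: the strategy with the higher reward passes the two-dimensional risk/reward screen but fails the third, survival, dimension, which is precisely the vulnerability a two-variable framework cannot see.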
Addressing catastrophic exposure from the unknown is not about a decision or two, ad hoc projects, or just adding another step in the risk management function. Addressing it requires an explicit process that, if implemented effectively, would influence key operating decisions in preserving the going concern. This requires establishing a discrete objective that drives policies and strategies to establish prudent guidelines, which must begin at the highest level of the organization. Therefore, the board must define an objective to ensure that the going concern is always viable.
It is often believed that addressing existential risk results in a net cost, or that it limits the potential to pursue opportunities. Therefore, managers may tend to avoid addressing the ultra-low probability existential exposure by rationalizing its near-zero expected value in cost–benefit terms. This is an untenable belief, as discussed below, and arises from assuming that the only way to address the catastrophic exposure is by reducing it.
There are several ways to address extreme exposure, but they fall into two categories: (i) contain it through limits, structural solutions, and contingency plans that reduce its impact, and (ii) reduce the organization’s vulnerability to existential exposure. These categories are not mutually exclusive. Containment, although sometimes necessary, may compromise some upside potential of the current model, but the payoff from the ability to pursue new opportunities in the context of contained extreme exposure may be significantly greater than the compromised potential. Reducing vulnerabilities can also enhance upside potential if undertaken in a disciplined manner, as demonstrated by the next two cases. In practice, some combination of both is required, and the mix depends upon how the objective to address the worst-case exposure and preserve the going concern is established.
Similar to how the structure of an operating model is protected, a going concern is preserved by establishing specific policies and strategies and by implementing them effectively.
Three-part Framework for Living with the Unknown
Walter Wriston, chairman of Citicorp, was frustrated. Once again, Citicorp’s net interest revenue had declined in consecutive months. Until recently, there had been an assumption that when interest rates increased, Citicorp’s net interest revenue would go up. Yet, in the second half of 1979 and early 1980, the opposite was happening. Net interest revenue declined because interest expenses increased while interest income rose by a smaller amount. Knowing that alone was not very helpful, as it did not point to specific actions that could stem the decline in net interest margin from rising interest rates. And there was no near-term relief in sight from rising rates.
No one could be sure of how bad things could get. There had to be a way to turn the net interest margin dynamics to address the problem. It was clear that the company had too much exposure to interest rate changes, but how could this be addressed going forward … particularly if things got really bad?
Paul Collins, senior vice president, asked an analyst to look at the problem. The analyst created a framework to quantify interest rate exposure, defining the difference between the amount of assets re-pricing and the amount of liabilities re-pricing in a given period as the “interest rate gap”. He showed in a simple way that by managing the structure of this gap, the net interest margin’s exposure to interest rates could be addressed proactively.
Policy Framework
Until the worst-case exposure was defined, Citicorp’s finance committee was not aware of it and had not taken proactive actions to address the unknown: how bad could it get? Once the dynamics of the company’s interest rate gap were internalized, the analyst provided a startling perspective to the committee, showing that most of the company’s earnings would be wiped out if interest rates climbed by the largest 30-year historical increase in any 90-day period. This became the proxy for the worst-case exposure. There was a sudden urgency, as the committee felt such a large exposure was unacceptable.
So, the committee established an acceptable level of exposure that would provide for earnings growth and yet be prudent to ensure that the worst-case exposure would not be catastrophic. This turned into an interest-rate-gap-management policy that provided a framework to address and manage the company’s interest rate exposure.
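The committee's logic can be sketched numerically. All figures below are hypothetical, and this is an illustration of the reasoning rather than Citicorp's actual policy: take the largest 90-day rate rise in the historical record as the worst-case proxy, then back into the largest gap whose worst-case hit to earnings remains tolerable.

```python
# Hypothetical quarterly rate history in basis points; the worst-case proxy is
# the largest rise over any single quarter (~90 days) in the record.
rates_bp = [500, 550, 600, 800, 1100, 1050, 1200, 1250]

worst_case_move_bp = max(b - a for a, b in zip(rates_bp, rates_bp[1:]))
print(worst_case_move_bp)  # 300: largest one-quarter rise, the worst-case proxy

# Largest acceptable worst-case hit to earnings, in millions (hypothetical)
tolerable_loss = 6.0

# Gap limit: the largest gap whose worst-case impact stays within tolerance,
# since first-order impact = gap * (rate move as a decimal).
gap_limit = tolerable_loss * 10_000 / worst_case_move_bp
print(gap_limit)  # 200.0 million
```

The gap limit is the policy: so long as the measured gap stays within it, even a repeat of the worst historical rate move cannot produce a catastrophic earnings loss, while the model remains free to take interest-rate positions inside that boundary.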
The committee also devised a governance guideline to split the interest-rate gap into a structural gap to be managed by the committee and a short-term/treasury gap to be managed by operating units.
Strategy Framework
Using the policy framework, Paul Collins and the treasurer developed a strategy to restructure the balance sheet and for the funding of incremental assets. They also established structural and short-term/treasury interest rate gap limits for the company.
Implementation Framework
The new strategy translated into implementation on two fronts. First, a program was created to reduce the interest rate gap by selling certain fixed-rate assets, by raising fixed-rate funds (albeit at the then-prevailing higher rates), and by restructuring certain business units to obtain relief from external and regulatory constraints. Second, the short-term/treasury gap limit was allocated among operating units, with controls in place to monitor compliance and exposure.
Years later, Walter Wriston recalled (paraphrased): “… the late 70s and early 80s were perilous times for the industry as banks were being squeezed in a vise between usury limits, regulatory restrictions and a hyper-inflationary environment not envisioned previously … the bank’s net interest margin was in what seemed like a free fall … no one seemed to know where the bottom was … but by defining and getting a handle on the worst-case bottom, we were able to develop and implement a strategy to continue our growth while avoiding what could have been disastrous for the bank”.
This case is based upon the first-hand experience of the author as secretary of Citicorp’s finance committee. Portions of this case have been excerpted from (Paul 2013, pp. 50–52).
With a disciplined approach, and by defining the worst case, Citicorp developed new policies and strategies that turned something that had always been lurking into something that could be managed proactively. It led to the development of an interest-rate-gap management process, which today is the foundation of interest-rate-exposure management at financial institutions worldwide.
More importantly, addressing the exposure paved the way for Citicorp’s operating model to grow and take on increasing interest-rate risk prudently. These developments happened just before interest-rate derivatives were introduced as financial products and became a key driver of a new, fast-growing revenue stream that also increased extreme exposure significantly. That increase in exposure could have been catastrophic had it been left unaddressed.
In some cases, it is not possible to contain the worst-case exposure, and thus reducing the vulnerability to the existential exposure is the only option, or vice versa. Reducing vulnerabilities requires developing strategies with a focus on the organization’s strengths to maximize the upside potential while addressing catastrophic exposure to manage the downside.
Reducing Going Concern Vulnerabilities to Maximize Operating Model Potential
The two-day offsite board meeting was viewed as successful. It covered a lot of ground for all major areas of the company, and board members seemed satisfied with the company’s performance and prospects. But one question could not be answered and left some members concerned.
Founded in the early 1990s, the company is a large manufacturer of a line of electronics devices. With 23% market share, it is the second largest producer of such devices, and has a record of steadily growing sales with stable profitable margins. It maintains a strong distribution network, with close relationships with two of the largest distributors who serve all customer segments in the marketplace. The company is consistent and diligent about investing in product development to maintain its market position and monitors new technology and applications closely. It is led by a respected CEO and an aggressive operating management team with a balanced focus on strategy development. It has strong operational and financial controls in treasury, credit, manufacturing, and distribution.
At the offsite meeting, the board was pleased with product development updates. However, following a discussion of extreme risk late on the first day, concern grew as it became evident that what the company viewed as its strong market position also had a flip side. The company derived all its sales from consumers. Although there were no visible ripples in the marketplace, questions were raised about what would happen to the company if its product line suddenly became obsolete due to the introduction of new technology, as happened to Kodak, Nokia, and BlackBerry. During the wrap-up session on the second day, the concern was raised again, and the VP of strategy development was asked to address it.
Following a strategic review, the company concluded that the worst-case exposure from technology obsolescence could not be reduced beyond what had already been done. The company monitored marketplace activity diligently, invested in market research as well as in product and market development, and relied on external advisers for updates on technology development. Increased investment in any of these areas would not address the new concern; an alternative solution was needed.
Following the CEO’s recommendation, the board established an objective, with a policy to minimize existential vulnerability without impacting the current product line or market position. A new strategy was adopted to reduce the company’s 100% reliance on the consumer segment, with a goal of deriving 30% of sales from non-consumer segments by the end of year 5 of the new segments’ operation.
An operating plan was put into place that emphasized modifying the product line with applications for the corporate segment of the market. Critically, these modified applications in the corporate market would not be impacted if new technology made the company’s consumer devices obsolete.
A new division was created, with outsiders hired as GM, chief marketing officer, and VP of sales. A strategic review ruled out a major acquisition to diversify sales and distribution; instead, it was decided to leverage the relationship with a current large distributor that also had a strong presence in the corporate market.
Four years after the offsite board meeting, in the third year of the new division’s operations, the company derived 14% of its sales from the corporate segment, and it is optimistic about achieving the 30% diversification goal by the end of year 5.
This case is based upon the first-hand experience and observations by the author. The company in this case wishes to remain anonymous. Certain details have been modified to maintain this anonymity.
Despite investing significant resources, the company did not see a viable strategy that would effectively address the existential exposure arising from obsolescence risk. Therefore, a strategy focused on reducing the vulnerability was developed so that the going concern could survive relying only on its non-consumer-segment business. Through disciplined implementation of this strategy, the company created a second revenue stream that is growing profitably, while reducing the organization’s existential vulnerability.
One conclusion: a disciplined approach to addressing extreme exposure can actually pave the way for the operating model to continue striving towards maximizing its potential.
Many organizations turn to diversification with a similar objective. However, unless undertaken in a disciplined manner, focused on addressing the unknown and the defined worst-case exposure, one may not know the effectiveness of such a strategy, and thus complacency may lead to a false sense of security. Therefore, a disciplined approach to addressing the unknown is critical.
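This closing point, that diversification must be tested against the defined worst case rather than assumed effective, can also be sketched. The figures, segment names, and contribution margin below are hypothetical: the question is not the revenue mix itself but whether the going concern survives the sudden loss of the concentrated segment.

```python
def survives_segment_loss(segment_revenue: dict, lost_segment: str,
                          fixed_costs: float, margin: float) -> bool:
    """Check whether the going concern covers its fixed costs from the
    remaining segments' contribution after one segment vanishes overnight."""
    remaining = sum(rev for seg, rev in segment_revenue.items() if seg != lost_segment)
    return remaining * margin >= fixed_costs

# Hypothetical revenue mix, in millions, with a 30% contribution margin
revenue = {"consumer": 700.0, "corporate": 300.0}

# Worst case: the consumer segment disappears (e.g., technology obsolescence)
print(survives_segment_loss(revenue, "consumer", fixed_costs=80.0, margin=0.30))
```

A diversification strategy that passes this explicit survival test addresses the defined worst case; one adopted without such a test may only create the false sense of security the text warns against.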

7. Conclusions

7.1. Risk Arising from the Unknown Needs Urgent Attention

As the world gets more complex, risks and the scale of exposure from the unknown are increasing significantly. The events of the 21st century demonstrate this both at the macro level, such as the cataclysmic terrorist attacks, the Great Financial Crisis, the pandemic of 2020, and several natural disasters, and at the organization level, as shown by the examples cited in this paper. The history of financial and business management includes many momentous innovations to address the challenges of the times. Risk from the unknown represents such a challenge today, and it is a clarion call for scholars to address exposure from the unknown-unknown.

7.2. Risk from the Unknown Can Be Addressed and Requires High-Level Commitment in Organizations

Exposure arising from events that cannot be envisioned can materialize anytime, anywhere and cannot be simulated or predicted. Because of these characteristics, such exposure from the unknown cannot be addressed by traditional risk management as it lies beyond the scope of risk management, which is predicated upon the ability to envision, simulate, and predict adversities.
Over the last 30–40 years, significant advances have been made in addressing risk. But relying on traditional risk management to address the unknown means that exposure from the unknown would be missed and organizations would remain vulnerable to catastrophic meltdowns that cannot be prevented. Therefore, a different approach is needed.
This need will only increase because of growing threats from climate change. Everything about climate change is unknown, including its direct impact as well as the impact of the actions of other organizations and governments to counter it. Government and regulatory mandates with deadlines are already being established to address threats from climate change. This is starting to have ripple effects as organizations begin to position themselves for a new business environment. Such actions and mandates are adding a new dimension of complexity to decision making. Therefore, the need and the urgency to address the unknown are becoming increasingly critical.
Exposure from the unknown can be addressed, but it requires a different perspective on risk and uncertainty. Traditional risk management is based upon an event-centric approach that defines events and their likelihood in order to prevent adversities and maximize the upside. Addressing extreme exposure from the unknown requires a disciplined exposure-centric approach that focuses on defining and managing the possible worst-case exposure in order to survive meltdowns and preserve the going concern structure of the organization.
This requires commitment from boards and senior management of organizations to establish explicit policies and strategies to implement solutions that reduce the adverse impact of extreme exposure from the unknown and sustain the going concern structure. Anything less leaves organizations vulnerable to meltdowns that cannot be prevented.
Scholars of risk and finance stand as the potential purveyors of new theories and analytical methods to address the significant and relevant challenges presented by Q4, the unknown-unknown.


Funding

This research received no external funding.


Acknowledgments

The author would like to thank the reviewers for their critiques which have helped improve the quality of this paper. The author is also grateful to David Gautschi for his guidance and counsel that has contributed to significant enhancements in communicating the content of this paper. Finally, the author wishes to acknowledge and thank many business associates and friends who, over the last several years, have patiently listened to the author’s presentations. Their feedback has been helpful in fine-tuning the thesis presented in this paper.

Conflicts of Interest

The author declares no conflict of interest.


  1. Arrow, Kenneth J. 1951. Alternative Approaches to The Theory of Choice in Risk Taking Situations. Econometrica 19: 404–37. [Google Scholar] [CrossRef]
  2. BBC. 2022. Energy Bills: Tens of Thousands of Firms ‘Face Collapse’ without Help. Available online: (accessed on 10 September 2022).
  3. Carney, Mark. 2015. Breaking the Tragedy of the Horizon—Climate Change and Financial Stability. Speech at Lloyd’s of London. September 29. Available online: (accessed on 1 September 2022).
  4. Clarke, Arthur C. 1962. Profiles of the Future: An Inquiry into the Limits of the Possible. London: Victor Gollancz Ltd., p. 19. [Google Scholar]
  5. De Groot, Kristel, and Roy Thurik. 2018. Disentangling Risk and Uncertainty: When Risk-Taking Measures Are Not About Risk. Frontiers in Psychology 9: 2194. [Google Scholar] [CrossRef] [PubMed]
  6. Dizikes, Peter. 2010. Explained: Knightian uncertainty. MIT News. June 2. Available online: (accessed on 1 September 2022).
  7. Fink, Larry. 2022. The Power of Capitalism. Letter to CEOs 2022. Available online: (accessed on 1 September 2022).
  8. Frowen, Stephen F. 2004. Economists in Discussion: The Correspondence Between G.L.S. Shackle and Stephen F. Frowen, 1951–1992. New York: Palgrave Macmillan. [Google Scholar]
  9. Gilbert, Daniel. 2006. Stumbling on Happiness. New York: Knopf. [Google Scholar]
  10. Grunwald, Michael. 2015. Inside the War on Coal. Politico. May 26. Available online: (accessed on 1 September 2022).
  11. Guerron-Quintana, Pablo. 2012. Risk and Uncertainty. Business Review 2012 Federal Reserve Bank of Philadelphia. Available online: (accessed on 1 September 2022).
  12. Haldane, Andrew G., and Vasileios Madouros. 2012. The Dog and the Frisbee. Speech at the Federal Reserve Bank of Kansas City’s Economic Policy Symposium, “The Changing Policy Landscape”, Jackson Hole, WY, USA, August 31; Available online: (accessed on 1 September 2022).
  13. Ingram, B. Lynn. 2013. California Megaflood: Lessons from a Forgotten Catastrophe. Scientific American. January 1. Available online: (accessed on 1 September 2022).
  14. Kahneman, Daniel, Paul Slovic, and Amos Tversky. 1982. Judgment under Uncertainty: Heuristics and Biases. Cambridge: University of Cambridge Press. [Google Scholar]
  15. Kiel, L. Douglas. 1996. Lessons for Managing Periods of Extreme Instability. In California Research Bureau. CRB-96-005. Sacramento: California State Library. [Google Scholar]
  16. Knight, Frank. 1921. Risk, Uncertainty and Profit. Part III Ch VII and Part III Ch VIII. pp. 224–25, 233. Available online: (accessed on 1 September 2022).
  17. Koonin, Steven E. 2021. Unsettled: What Climate Science Tells Us, What It Doesn’t, and Why It Matters. Dallas: Ben Bella Books. [Google Scholar]
  18. Langlois, Richard N., and Metin M. Cosgel. 1993. Frank Knight on Risk, Uncertainty, And the Firm: A New Interpretation. Western Economic Association International XXXI: 459–60. [Google Scholar] [CrossRef]
  19. Latour, Almar. 2001. A Fire in Albuquerque Sparks Crisis For European Cell-Phone Giants. The Wall Street Journal. January 29. Available online: (accessed on 1 September 2022).
  20. Levin, Jonathan, and Paul Milgrom. 2004. Introduction to Choice Theory. Available online: (accessed on 1 September 2022).
  21. Luft, Joseph, and Harry Ingham. 1961. The Johari Window: A graphic model of awareness in interpersonal relations. Human Relations Training News 5: 6–7. [Google Scholar]
  22. MacAskill, William. 2022. The Beginning of History: Surviving the Era of Catastrophic Risk. Foreign Affairs, September/October. Volume 101, Number 5. [Google Scholar]
  23. Mukherjee, Amit S. 2008. The Spider’s Strategy: Creating Networks to Avert Crisis, Create Change, and Really Get Ahead. Upper Saddle River: FT Press. [Google Scholar]
  24. Nelson, Stephen C., and Peter J. Katzenstein. 2014. Uncertainty, Risk and the Financial Crisis of 2008. International Organization 68: 361–92. [Google Scholar] [CrossRef]
  25. Nishiguchi, Toshihiro, and Alexandre Beaudet. 1998. Case Study: The Toyota Group and the Aisin Fire. Sloan Management Review. Available online: (accessed on 1 September 2022).
  26. Park, K. Francis, and Zur Shapira. 2017. Risk and Uncertainty. In The Palgrave Encyclopedia of Strategic Management. Edited by Mie Augier and David J. Teece. New York: Palgrave Macmillan. [Google Scholar]
  27. Parker, Mario, and Noah Buhayar. 2010. Peabody Says Coal in Early Phase of ‘Super Cycle’. Bloomberg Markets. June 24. Available online: (accessed on 1 September 2022).
  28. Paul, Karamjeet. 2013. Managing Extreme Financial Risk: Strategies and Tactics for Going Concerns. New York: Academic Press, pp. 50–52, 113–15. [Google Scholar]
  29. Restuccia, Andrew. 2015. Michael Bloomberg’s War on Coal. Politico. April 8. Available online: (accessed on 1 September 2022).
  30. Taleb, Nassim Nicholas. 2004. Fooled by Randomness: The Hidden Role of Chance in Life and in the Markets. New York: Random House. [Google Scholar]
  31. Taleb, Nassim Nicholas. 2007. The Black Swan. New York: Random House. [Google Scholar]
  32. United States Geological Survey. 2011. Overview of the ARkStorm Scenario; pp. 171–72. Available online: (accessed on 1 September 2022).
  33. Wikipedia. 2022. Nord Stream 2. Available online: (accessed on 1 September 2022).
  34. Wladawsky-Berger, Irving. 2019. The Business Value of Resilience. The Wall Street Journal/CIO Blog. February 15. Available online: (accessed on 1 September 2022).
  35. Yaari, Menahem. 2017. Kenneth J. Arrow’s Work on Coping with Risk and Uncertainty. The Econometric Society: In Remembrance. Available online: (accessed on 1 September 2022).
Figure 1. Possible Future Circumstances.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Paul, K.S. Surviving Meltdowns That Cannot Be Prevented: Review of Gaps in Managing Uncertainty and Addressing Existential Vulnerabilities. J. Risk Financial Manag. 2022, 15, 449.
