Article

Mistranslation of Uncertainties: From Epistemological Uncertainties to Legitimate Resilience Governance

Department of Technology and Society, State University of New York, Korea, Incheon 21985, Republic of Korea
Systems 2026, 14(3), 273; https://doi.org/10.3390/systems14030273
Submission received: 14 January 2026 / Revised: 16 February 2026 / Accepted: 23 February 2026 / Published: 3 March 2026
(This article belongs to the Section Systems Practice in Social Science)

Abstract

This study examines how system resilience can be strained when governments define and mistranslate epistemological uncertainties as technical and managerial problems, drawing on two comparative case studies of sociotechnical disaster (nuclear safety failure and pandemic digital surveillance). Using Funtowicz and Ravetz’s post-normal science as a framework, the analysis distinguishes three types of uncertainty and conceptualizes the “mistranslation of uncertainties,” illuminating a structure in which epistemic diversity is marginalized, policy legitimacy deficits in implementation are amplified, and sociotechnical resilience is potentially eroded. The article contributes to the field of sociotechnical resilience by (1) visualizing how uncertainty mistranslation leads to a legitimacy deficit, (2) illustrating how mistranslation develops from a monolithic, technocentric understanding of uncertainties, and (3) proposing a framework of “resilience for legitimacy” that seeks to embed resilience work and the co-production of responses within institutional practices. This research highlights how the co-production of epistemological translation and social order, encompassing deliberation and social feedback, can support democratic legitimacy and thereby resilient governance of sociotechnical systems.

1. Introduction

Policymakers and experts often attempt to domesticate uncertainties by turning open-ended epistemological questions about what can be known, by whom, and under what assumptions into narrower technical or methodological scopes that appear more tractable within existing institutional routines [1]. However, scientific assessments of natural hazards, environmental uncertainties, and systemic vulnerabilities rarely offer the clean, “evidence-based” answers that policymakers look for to legitimize high-stakes decisions. Instead, what Funtowicz and Ravetz have called post-normal conditions, where facts are uncertain, values in dispute, stakes high, and decisions urgent, increasingly prevail rather than remaining the exception [2]. This growing prevalence arises because the governance of current technological systems, whether in nuclear power, public health surveillance, or other critical infrastructures, takes place under conditions of complex uncertainty [2,3,4]. These uncertainties often extend beyond errors in parameters or calculation formulas to the boundaries of social consensus on modeling assumptions, the validation of models, and even the policy value of research outcomes and political conclusions about the acceptable level of risk. Uncertainties therefore inevitably demand appropriate processes of public debate and institutional negotiation [2,3,4,5,6,7,8,9].
Sociotechnical systems are embedded in networks of institutions, expertise, and material infrastructures that blur the lines between scientific reasoning, political authority, and social legitimacy [10,11,12,13,14,15,16,17,18,19,20,21,22,23,24,25,26,27]. The more sociotechnical buffers distance us from nature, the more tightly our technological systems become intertwined with social institutions, political and cultural contexts, and ecological complexity [28]. Within this tangle, scientific and technical uncertainties of systems, especially those reduced to numbers and models, regularly run up against policymaking processes that still expect clear, defensible answers [29,30]. Regulatory science [5,6] highlights this tension by showing how science used for policy must negotiate gaps between expert knowledge and the social order it seeks to support. Traditional reductionist approaches tend to tame uncertainty through measurement and modeling, but this strategy strains against the epistemological dimensions of policy problems that fall outside narrow feasibility testing [2,5,6,7].
Therefore, this study argues that the mistranslation of uncertainties is not merely a cognitive or communicative glitch. Scientific uncertainties originate from incomplete agreement on subjective judgments, methodological vagueness, and the unavailability of information and data. The unpredictability of social dynamics and political instability, inherent limitations in defining variables, and limited datasets can hinder precise scientific assessment and quantified judgment. When epistemological uncertainty regarding the limitations of models, predictive scenarios, or monitoring equipment is reframed as an engineering problem solvable within a closed technocratic group, diverse approaches to those uncertainties and alternative interpretive proposals are excluded. The systemic marginalization of epistemic diversity during the Fukushima nuclear disaster and the constraints on daily life justified during South Korea’s COVID-19 response attest to this dynamic. In Japan, the long-standing debate over tsunami risk scenarios and seawall design was confined to specific expert circles as a probabilistic argument about technical feasibility. In South Korea, despite deep uncertainty about scientific facts, such as the probability of viral spread from asymptomatic transmission, mass data collection and restrictions on personal privacy were legalized and enforced as legitimate technocratic measures. In both cases, the way uncertainty was handled upstream conditioned not only material resilience but also the democratic quality of crisis governance.
Against this backdrop, the research advances the literature on environmental risk governance and resilience in three ways. First, it reframes the translation of scientific uncertainty as a mechanism that can give rise to legitimacy deficits in sociotechnical systems. Based on the distinction between technical, methodological, and epistemological uncertainty proposed in post-normal science, this study examines how misinterpretations or reductions of these uncertainties, along with the knowledge excluded in the process, became codified as “incompletely theorized agreements” and “uncomfortable knowledge” that the relevant institutions found difficult to accept [31] p. 112, [32]. Second, the legitimacy approach to the Fukushima nuclear accident and South Korea’s COVID-19 response illustrates how knowledge production processes unreasonably transformed epistemological uncertainty into technical problems, such as seawall height or digital tracking system design. In doing so, this study describes how the narrowed knowledge shrank the space of deliberation, excluded epistemic diversity, weakened policy legitimacy, and ultimately affected the resilience of sociotechnical systems.
Third, extending existing resilience and risk-governance work, the analysis proposes a “resilience for legitimacy” framework that specifies institutional practices of communication, inclusion, integration, and reflection for dealing with epistemological uncertainties over long temporal and broad spatial scales. Rejecting resilience as a purely engineering property of “bouncing back,” the framework translates and accommodates epistemological uncertainties by centering procedural openness, epistemic pluralism, and co-produced legitimacy. The knowledge-making process is a vital part of building legitimacy, ensuring good governance of complex modern systems (e.g., healthcare and energy grids), and enhancing the resilience of infrastructure to withstand a crisis. This means that figuring out “what we do not know” is not just a technical problem for engineers or scientists to solve; rather, it is a question of how society institutionalizes and governs uncertainties [33].
“Legitimacy hence relates not only to political arguments for institutions being justified in exercising political power but can also be extended to the analysis of the quality of knowledge” [8] p. 5.
As such, the critical area to explore is how governance systems requiring legitimacy under conditions of crisis systemically mistranslate epistemological uncertainties into technical and administrative scopes. Drawing on insights from science and technology studies with a systems approach, this research suggests a conceptual framework for understanding how complex sociotechnical governance manages uncertainty, failure, and resilience. However, this study does not aim to provide comprehensive empirical verification or to examine causal sequences. Rather, it aims to offer a conceptual framework for exploring the intersection of uncertainty mistranslation and governance in sociotechnical resilience contexts. Section 2 and Section 3 outline types of uncertainty, while Section 4 shows how epistemological uncertainties are systemically reduced to technical or methodological terms, limiting epistemic diversity and undermining democratic legitimacy, conditions that ultimately weaken system resilience. Section 5 develops a “resilience for legitimacy” approach that integrates co-production and resilience governance, treating uncertainty not as an obstacle but as a starting point for reflexive and adaptive policymaking [34].

2. Theoretical Review: Democratic Legitimacy Deficit in Translating Scientific Uncertainties

Policymaking, backed by scientific advice and discussion, frequently encounters legitimacy challenges when uncertainty is substantially involved. The transparency, rationality, and procedural integrity of scientific advice all shape the way policies are formulated. Although scientific evidence typically strengthens the rational basis for policy decisions and improves transparency, uncertain evidence complicates trust and weakens policy acceptance. Particularly in contexts where conflicting interests mistranslate scientific uncertainties, the composition and structure of scientific advisory committees may themselves become contested. Scientific uncertainty challenges policy accountability and precipitates judicial disputes (Figure 1).
A legitimacy deficit arises when administrative priorities for definitive outcomes blur the boundaries of epistemological uncertainty. Prioritizing definitive outcomes over open-ended deliberation forecloses alternative interpretations and excludes potentially relevant epistemic actors from decision-making, often including civic organizations, communities, and diverse scientific groups. This intensifies the institutional disconnect between scientific expertise and democratic governance [8] (①, ④). The “science advisory curtain” depicted in Figure 1 epitomizes this insulation, whereby expert judgments (③) are shielded from political scrutiny by stakeholders (②).
For instance, answering the question “To what extent do we require strict, intensive tracking systems in the COVID-19 response?” entails a societal inquiry that exceeds scientific assessment, including deliberation on how far the policy may interfere with constitutionally protected privacy rights. With evidence on asymptomatic transmission limited, quarantine and contact-tracing policies built on an unsubstantiated basis can undermine public trust in the scientific community’s judgment once the evidentiary weakness is revealed.
Scholars in science and technology studies have long theorized how scientific reasoning cannot be directly translated into its social implications (⑤). In disaster management, for example, models used to simulate floods generate deductive numerical results that rely on embedded assumptions and simplifications [35,36,37]. Sociotechnical models often privilege linearity and reductionism, obscuring epistemic plurality and overrepresenting deterministic interpretations. Even when models retrospectively match observed data, they construct only one possible representation of complex realities, particularly for rare or extreme events such as pluvial flooding [35,38,39,40].
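The point about retrospective fit can be made concrete with a toy numerical sketch. All numbers below are hypothetical, and neither model is drawn from the cited flood studies; the sketch simply shows how two structurally different models can “retrodict” the same calibration data almost equally well yet diverge sharply when extrapolated to a rare extreme event.

```python
import math

# Hypothetical calibration data: rainfall intensity (mm/h) vs. observed
# flood depth (m). Both simple models below fit these points closely.
rain = [10.0, 20.0, 30.0, 40.0, 50.0]
depth = [0.11, 0.24, 0.33, 0.46, 0.54]

def fit_line(xs, ys):
    """Ordinary least-squares fit y ≈ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Model 1: depth linear in rainfall. Model 2: depth linear in sqrt(rainfall),
# i.e., a saturating response. Both are defensible structural assumptions.
a1, b1 = fit_line(rain, depth)
a2, b2 = fit_line([math.sqrt(r) for r in rain], depth)

def predict_linear(r):
    return a1 * r + b1

def predict_sqrt(r):
    return a2 * math.sqrt(r) + b2

def rmse(pred):
    """In-sample root-mean-square error against the calibration data."""
    return math.sqrt(sum((pred(r) - d) ** 2
                         for r, d in zip(rain, depth)) / len(rain))

# Both models "retrodict" the calibration data almost equally well...
print(f"in-sample RMSE: linear={rmse(predict_linear):.3f}, "
      f"sqrt={rmse(predict_sqrt):.3f}")
# ...yet diverge sharply when extrapolated to a rare extreme event:
print(f"at 150 mm/h: linear={predict_linear(150):.2f} m, "
      f"sqrt={predict_sqrt(150):.2f} m")
```

Goodness of fit to the past, in other words, cannot by itself arbitrate between model structures whose predictions for extremes differ by tens of centimeters.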
Saltelli and Funtowicz [3] emphasize that scientific models are powerful exploratory tools but poor arbiters of certainty. Their assumptions and methodological constraints propagate through systems, leading to misleading precision. Policies that lean excessively on reductionist models are susceptible to fallacies and incomplete risk framing. As Saltelli et al. [4], p. 484 caution,
“Mathematical models are a great way to explore questions. They are also a dangerous way to assert answers… Asking models for certainty or consensus… can invite ritualistic use of quantification. Models’ assumptions and limitations must be appraised openly and honestly. Process and ethics matter as much as intellectual prowess.”
Legitimacy also begins to erode once scientific, political, and social systems fail to meet the criteria of “throughput legitimacy” [8]. Conventional scientific advisory structures that are compromised by uncertainty and lack epistemic diversity jeopardize procedural transparency and, in turn, policy legitimacy. This is particularly problematic in post-normal situations in which factual relationships are unclear and conflicts of interest are acute [2,9]. Under these conditions, the isolation of technocrats and scientific advisors instead amplifies and exposes the vulnerability of knowledge governance.
The danger of a homogeneous translation of uncertainty, and the legitimacy disconnect inherent in the pursuit of numeric complacency, has long been discussed [7,8,41,42,43]. When scientific judgments are used for policy justification without adequate filtering, the fractured interplay between science, society, and policy manifests, as it historically has in fields including health policy, environmental protection, and social welfare. In extreme cases, this lack of legitimacy contributes to disaster causation, policy incoherence, and administrative failure, as seen in the Fukushima nuclear accident in East Asia and the initial COVID-19 response [44,45,46,47].
The central question, therefore, is how societies can remedy a dearth of legitimacy when decisions remain grounded in a bounded factuality interpreted by a unitary group. The sections that follow analyze how different types of uncertainty operate within sociotechnical systems and how epistemological uncertainties can be connected to more inclusive, participatory structures that overcome those constraints [45,48,49].

3. Framework: Visualizing Uncertainties

A clear delineation of the types and sources of uncertainty is a prerequisite for strategizing resilience planning in knowledge making and disaster preparedness. Funtowicz and Ravetz [2] identify three core types of uncertainty, each corresponding to distinct characteristics and policy domains (Table 1). First, technological uncertainty concerns the numerical exactness and engineering precision to be translated into policy. Designing and constructing a dam, for example, involves calculations of water inflow and outflow, where variability and error margins must be technically quantified. Second, methodological uncertainty concerns model assumptions, for instance, choices of temporal scale, precipitation forecasts, or hydrological variations. Any real-world hypothesis test encompasses auxiliary assumptions, which may themselves introduce error [50] p. 16.
Lastly, epistemological uncertainty concerns problems implicating social, institutional, or ecological outcomes that exceed the boundaries of scientific calculation and modeling. Decisions about how to adjudicate water rights and allocate the outflow from a reservoir, or about which level of safety regulation should be implemented for nuclear facilities coping with tsunami risks, involve political judgments, value trade-offs, and long-term societal negotiation. Governing epistemological uncertainty therefore requires the practices of post-normal science: participatory engagement and reflexive deliberation [2]. Technical modeling alone rarely resolves disagreements about acceptable risk levels, infrastructural scales, the legitimacy of trade-offs, or the distribution of burdens in economic considerations that impact technical design. These are fundamentally social questions requiring multi-stakeholder deliberation.
Figure 2 visualizes the different contexts of uncertainty listed in Table 1. In general, approaches oriented toward technological uncertainty prioritize numerical accuracy and scientific models, resolving uncertainty by determining the exact gradient at point A, which makes it easy to define the direction of change in a variable (e.g., traffic flow) [54,55].
Considering the gradient from A to B or from A to C, it is evident that sociotechnical trajectories differ: the slope is positive during t0–t1 (A to B) but negative during t0–t2 (A to C). These differences illustrate methodological uncertainties associated with, for instance, selecting temporal scales, which models handling multiple variables often struggle to define. Even when a trajectory between A and C can be defined, choosing the appropriate temporal scale for predicting a gradient over any given period remains challenging (e.g., in Colorado River water allocation; see Gim [56]).
In Figure 2, over the more extended period of t0–t3, a model attempting to predict a gradient must first determine which trajectory among several moving targets (e.g., A–D, A–E) is assumed, if any can be. This requirement introduces epistemological uncertainties beyond calculation or variable issues, calling for reflexive and participatory governance to specify (social or technological) targets. Under such conditions, “simple concepts of causation” become insufficient for policy anticipation [57] p. 438.
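The window dependence behind this intuition can be sketched numerically. The series below is purely illustrative (it corresponds to no dataset in the article); the point is only that the estimated gradient of the same trajectory flips sign depending on the temporal window chosen, so the “direction of change” is partly an artifact of that methodological choice.

```python
# Illustrative observations of some variable at times t0..t7.
series = [5.0, 6.2, 7.1, 6.5, 5.4, 4.8, 5.9, 7.3]

def gradient(values, t_start, t_end):
    """Average rate of change between two observation times."""
    return (values[t_end] - values[t_start]) / (t_end - t_start)

print(gradient(series, 0, 2))  # short window: positive slope (cf. A to B)
print(gradient(series, 0, 5))  # medium window: negative slope (cf. A to C)
print(gradient(series, 0, 7))  # long window: positive again
```

No calibration of the gradient formula resolves this: deciding which window (and which target trajectory) is the relevant one is exactly the epistemological question the text describes.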
As Table 1 and Figure 2 together show, it remains unclear how even improved technical or methodological adjustments and calibrations can address and justify the variation in the outcomes of tsunami models, particularly when institutional blind spots intervene. This leaves the complex interplay of epistemological uncertainties unresolved. The epistemological uncertainties and ensuing social outcomes hidden in nuclear disaster preparations and tsunami predictions, such as forecasts of complex and irregular natural phenomena, differ significantly from the goals of “basic, curiosity-driven science” [39] p. 14; [58] p. 255. A focus on engineering and methodological uncertainty pursues deterministic causality, employing statistics and temporal–geospatial conditions to calculate correlations through “retrodiction” models that compare past observed data with current empirical data [35,48,50,59,60]. Acknowledging the inherent epistemological uncertainties in science instead opens scientific conversation toward more inclusive and balanced governance discussions.
“Anticipatory governance comprises the ability of a variety of lay and expert stakeholders, both individually and through an array of feedback mechanisms, to collectively imagine, critique, and thereby shape the issues presented by emerging technologies before they become reified in particular ways. Anticipatory governance evokes a distributed capacity for learning and interaction stimulated into present action by reflection on imagined present and future sociotechnical outcomes.” [61] pp. 992–993.

4. Analysis: Mistranslation of Uncertainties

The following case studies conceptually and theoretically analyze the Fukushima nuclear accident and South Korea’s COVID-19 response. The analysis demonstrates how various types of uncertainty are, in some cases, misinterpreted at the policy and institutional levels, thereby transforming into policy uncertainty or systemic vulnerability. Rather than conducting empirical investigations of these events, the analysis relies on secondary sources: journal articles, official governmental investigation reports, statutory documents, and news articles illuminating the cases.

4.1. Seeking a Definitive Number: 5.7 Meters (2011 Fukushima Nuclear Disaster)

The Fukushima case clarifies the mechanism by which epistemological uncertainty is mistranslated (t0–t2, A to C in Figure 2). The regulatory design for nuclear safety and tsunami preparedness extended far beyond engineering considerations. The Japan Meteorological Agency initially warned that only a three-meter tsunami would reach the Fukushima power plant area after the magnitude 9.0 earthquake on March 11 [62,63]. After the disaster, it was reported that an ostensibly “unanticipated” event caused a station blackout at Fukushima, which led to cooling system failures, explosions in Units 1, 3, and 4, and the meltdown of three reactor cores. Tsunami prediction is less technically mature and less statistically robust than seismological models of crustal stress and precursory phenomena [64]. Determining design wave heights is not merely a task of numerical decision: it entails the epistemological complexity of weighing the impacts of nuclear disaster, regulatory capture, economic feasibility, risk compliance, social cultures of risk perception, earthquake magnitude and scale, tidal conditions, and coastal geology [65].
The critical epistemological limitations thus revealed extend beyond methodological or technical boundaries (Table 1). Read together, Figure 1 and Figure 2 illustrate the process by which epistemological uncertainty is narrowed down to the more specific categories of technical or methodological uncertainty, producing “uncomfortable knowledge” [31]. Reductionist technical approaches prefer technical feasibility and numeric quantification over contemplating the nature of epistemological uncertainty. Regarding tsunami risks, diverse perspectives from different domains were much less appreciated, and questions of systemic uncertainty were narrowly defined [2,9,66,67]. Key epistemological questions of scientific and technological assessment, central to tsunami and nuclear safety regulation, were not fully considered [2,66,67]. Given the complexity of tsunami prediction, a clear epistemological consensus on the limitations of scientific models should have been revisited earlier [64]. No official warning of a large tsunami was issued, and the flooding hazards at Fukushima, clouded by insufficient scientific evidence, were not fully vetted or translated into preparedness measures [68]. Consequently, pre-existing regulations on tsunami prediction were ill-suited to nuclear plant construction in Japan [46,68].
The Tokyo Electric Power Company (TEPCO) case shows an organizational pattern of containing uncertainties within comfortable technical thresholds (Figure 3 and Figure 4), reinforcing a technological framing of preparedness. In the 2011 Fukushima nuclear disaster, the major issues in applying tsunami predictions were understood as obtaining precise data and making sound judgments [69]. TEPCO relied on its conventional method, searching for a single, selectively chosen number that would meet established regulations and thereby shield it from legal responsibility. Remaining within a narrow technological zone also allowed TEPCO to avoid broader, out-of-the-box social deliberation on nuclear-related uncertainties, deliberation that might have generated stronger mandates or prompted more extensive modeling and public hearing processes. Such a “fail-safe” engineering culture of risk assessment proved paradoxically unsustainable in response to nonlinear and evolving conditions [60,70]. This mistranslation of epistemological uncertainties reduced the challenge of tracking moving targets (e.g., A–D, A–E) to simplified gradients, such as A to B or A to C in Figure 2.
Notably, internal TEPCO documents revealed that as early as 2006, the company’s safety management team evaluated the probability of a tsunami exceeding 5.7 m, the maximum height in its preparedness plan [71]. However, this information did not translate into corrective policy or infrastructure decisions. In Fukushima’s case, technical forecasts alone did not prompt sufficient pre- or post-disaster action, signaling a consistent failure in preparedness and response systems [36,39]. TEPCO oversimplified uncertainties and misclassified epistemological issues as merely a matter of modeling accuracy (Table 1, Figure 2). It systematically dismissed historical evidence, social concerns, and alternative technical models that warned of larger tsunamis [46,68,71]. Reductionist models validating Earth system predictions possess inherent limitations, raising important questions about the extent to which suggested levels of risk, and the legitimacy of decisions derived from such models, can be accepted in policy contexts [39,62,63] (Figure 1).
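The sensitivity at stake can be sketched with a simple extreme-value calculation. The Gumbel exceedance formula below is a standard textbook form for annual maxima; the two parameter sets are hypothetical and are not taken from TEPCO’s internal evaluation or any published Fukushima hazard study. With sparse historical records, both parameterizations may look defensible, yet they imply annual probabilities of exceeding a 5.7 m design height that differ by an order of magnitude.

```python
import math

def gumbel_exceedance(h, mu, beta):
    """P(annual maximum wave height > h) under a Gumbel distribution
    with location mu and scale beta (both in meters)."""
    return 1.0 - math.exp(-math.exp(-(h - mu) / beta))

# Two hypothetical parameterizations, each plausibly consistent with a
# short, noisy record of observed wave heights:
p_a = gumbel_exceedance(5.7, mu=3.0, beta=0.6)
p_b = gumbel_exceedance(5.7, mu=3.5, beta=1.2)
print(f"model A: {p_a:.4f} per year; model B: {p_b:.4f} per year")
```

A single “probability of exceeding 5.7 m” thus conceals the choice of distribution and parameters behind it, which is precisely the kind of model-structure judgment that was treated as a closed technical matter.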
Safety regulations and key decisions on seawall design, inundation risk assessment, and backup generators underestimated epistemological uncertainties and were clouded primarily by engineering quantification. In the aftermath of the disaster, this amplified public distrust of TEPCO’s closed engineering practices and culture, and of the resilience of Japan’s entire energy system.

4.2. Tracking COVID-19: Uncertainties and Infringements on Freedom and Privacy in South Korea (2020–2021)

Korea’s COVID-19 response system epitomizes the structure of taming epistemological uncertainty into a realm of technocratic control with contested legitimacy. The trade-off between protecting public health and safeguarding individual privacy arose as the outbreak was contained through draconian restrictions on the freedom of movement of “infection-suspected persons” during the COVID-19 pandemic in South Korea. During the early stages of the pandemic in 2020, responsible national and local agencies implemented extensive containment policies. COVID-19, or coronavirus disease, is caused by SARS-CoV-2, an RNA virus belonging to the Coronaviridae family [72], and was designated a “Group 1 infectious disease—novel infectious syndrome” under Article 2, Subparagraph 2 (l) of South Korea’s Infectious Disease Control and Prevention Act (hereafter “the Act”). To manage viral transmission, the South Korean government undertook responsibilities to treat and hospitalize infected individuals, while simultaneously implementing policies to “generate, analyze, and provide information related to an infectious disease” and to build and operate an information system for outbreak prevention and management (Article 4 (2), Subparagraphs 1, 2, 5, and 14) (as of 2020) (Figure 5).
The COVID-19 Epidemiological Investigation Support System, launched on 26 March 2020, replaced earlier manual methods based on interviews and phone calls. It enabled authorities to track the movements of confirmed and suspected cases using credit card transaction histories, transportation card usage data, and image data obtained from surveillance cameras. These location-based data were then used to identify and trace close contacts for further investigation (Appendix B). This digital platform was initially devised under the National Strategic Smart City R&D Program, funded by the Ministry of Land, Infrastructure and Transport (Figure 6). However, it was repurposed during the pandemic to track viral transmission.
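The proximity-matching logic such a platform implies can be illustrated with a toy sketch. The record schema, place labels, and 30-minute contact window below are all hypothetical assumptions for illustration, not the actual design of the Korean system; the sketch only shows the kind of inference that joining card-transaction and CCTV-derived location records makes possible.

```python
from datetime import datetime, timedelta

# Hypothetical location pings: (person, place, timestamp).
pings = [
    ("case_01", "cafe_A", datetime(2020, 4, 1, 13, 5)),
    ("person_17", "cafe_A", datetime(2020, 4, 1, 13, 20)),
    ("person_23", "cafe_A", datetime(2020, 4, 1, 16, 0)),
    ("person_17", "bus_404", datetime(2020, 4, 1, 18, 0)),
]

def close_contacts(records, case_id, window=timedelta(minutes=30)):
    """Flag anyone present at the same place within `window`
    of a visit by the confirmed case."""
    case_visits = [(place, t) for pid, place, t in records if pid == case_id]
    contacts = set()
    for pid, place, t in records:
        if pid == case_id:
            continue
        for c_place, c_t in case_visits:
            if place == c_place and abs(t - c_t) <= window:
                contacts.add(pid)
    return contacts

print(close_contacts(pings, "case_01"))  # person_17 overlaps; person_23 does not
```

Even this toy version makes visible what the epistemological debate was about: the contact window and the equation of co-location with transmission risk are assumptions written into code, not scientific certainties.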
Using a digital tracking network, extensive personal information about name, resident registration number, address, age, gender, telephone number (including mobile number), prescription records, medical treatment history (e.g., symptom onset date and diagnosis date), and travel history of confirmed and suspected cases, was mandated to be disclosed throughout the medical service network of the National Health Insurance Service and the Health Insurance Review and Assessment Service (Appendix B). This information was generally accessible for medical examination, prescription, and treatment, as well as used to justify enforcing isolation and other compulsory measures by local governments (Infectious Disease Control and Prevention Act, Article 76-2 (1), (2), (3)).
Furthermore, pursuant to the Act, individuals who exhibited no symptoms but fell into the category of “infection-suspected persons” were also subject to diagnostic testing (Article 13 (2)), mandatory quarantine, and symptom monitoring through digital and communication technologies (Article 42 (2)). Those who refused investigation faced legally enforceable sanctions (Article 42 (4)), despite the scientific uncertainties surrounding COVID-19 transmission (Figure 5). These provisions raised critical questions about the contested means of compulsorily quarantining and tracking “infection-suspected persons” in the absence of a legitimating democratic consensus. For the detailed legal arrangements, refer to Appendix A.
To trace individuals categorized as “infection-suspected persons,” even detailed location data were collected and transmitted through relevant government agencies (e.g., the Korea Centers for Disease Control and Prevention, KCDC, as of 2020) for follow-up actions such as epidemiological investigations. The travel history and medical data of individuals suspected of infection were reconstructed, analyzed, and shared across the surveillance, isolation, and medical systems operated by the KCDC, local governments, quarantine facilities, and hospitals. Epistemological uncertainties and unexamined assumptions shaped policy decisions even though the authorities had a constrained understanding of COVID-19 transmission. The KCDC adopted intrusive, privacy-restrictive measures while disregarding scientific contestation over transmission and the limited, rapidly evolving evidence on the effectiveness and proportionality of those measures (e.g., mandatory testing, enforced quarantine, and behavioral monitoring), even as the tracking technologies and the disclosure of private information faced constitutional challenge. This is an instance of the mistranslation of epistemological uncertainties into technological methods. The limitations embedded in the control measures therefore constrained the legitimacy of the public health policies that scientific communities suggested and intensified social resistance to their intrusiveness.
For example, following the 2020 Itaewon outbreak, petitioners challenged the government’s collection and processing of Itaewon base-station connection data obtained from mobile network operators without individual consent or judicial authorization. They argued that tracing close contacts through telecommunications records infringed informational self-determination and privacy rights. Although on April 25, 2024, the Korean Constitutional Court upheld the constitutionality of Article 76-2(1)(1) of the Infectious Disease Control and Prevention Act (as in effect at the time) and dismissed the challenge to the specific data collection as inadmissible, contestation over the scope and targeting logic of blanket data collection continues to reverberate, disrupting social cohesion and weakening trust in the scientific basis for infection-risk claims [74]. Social stress and resistance to privacy intrusion, social stigma, unethical approaches to digital surveillance, expanded prosecutions, and the imposition of fines widened the distrust gap between the advice of scientific advisory communities and lay perceptions of local and national authorities [32,60,75,76].

5. Discussion: Legitimacy Governance

5.1. Conflicts Between Technological Translation and Epistemological Uncertainty

A recurring theme across both disasters is that technological approaches tend to confine understanding within technical boundaries and struggle to identify the deeper uncertainties embedded in scientific models themselves. Conflating distinct types of uncertainty in translation marginalizes divergent voices and weakens the legitimacy of knowledge-making. Technocratic convenience can “lead to greater costs than if no predictions were provided” [36] p. 98. Technocrats commonly argue that technical or methodological uncertainties in predictions (e.g., logistical challenges) were unavoidable and were therefore the root cause of system failures. Engineering-based frameworks tend to chase technical “puzzles” and “single determinate quantities” [77] p. 123; yet, even with advances in data science and machine learning, the classic insight that “the whole is more than the sum of its parts” remains relevant. Ironically, TEPCO’s post hoc justification framed the cause as an unavoidable failure of tsunami forecasting, a narrative strategically aligned with avoiding legal liability [78].
NISA, the regulatory authority responsible for nuclear power plant safety, operated under the Nuclear Science Committee and adhered to “The Regulatory Guide for Reviewing Safety Design of Light Water Nuclear Power Reactor Facilities.” These guidelines, last updated in 1990, did not address tsunami-related risks. This regulatory myopia reinforced TEPCO’s narrowly defined legal responsibilities, encouraging the company to focus exclusively on technical clarity while overlooking the broader epistemological context necessary for recognizing the limitations of reductionist models in addressing unpredictable phenomena. According to International Atomic Energy Agency guidelines, NISA should have required TEPCO to relocate backup generators to higher ground, protect lower sections of reactor units with impervious shielding, and secure additional backup pumps for flooding scenarios [68]. However, NISA mandated none of these measures, trusting instead in myopic calculation.
The bureaucratic responses of crisis management reveal a failure to engage critically with the complexity of tsunamis and pandemics. TEPCO dismissed plural scientific predictions regarding tsunami magnitude. Independent investigations subsequently questioned whether TEPCO and regulatory bodies had considered the possibility of system failure, given the nature of tsunami uncertainties and the operation of nuclear plants. Findings revealed that TEPCO was entrenched in a bureaucratic myth of technical perfectionism, a belief in the “absolute safety of nuclear power plants” [79,80] p. 14. It is therefore critical that the scientific limitations, institutional boundaries, and societal implications of technological or methodological approaches be openly acknowledged and discussed to minimize linear, bounded technocratic solutions that marginalize epistemological limitations [81].
Technological models face inherent constraints in mapping “the sensitivities of results to divergent assumptions” [4,11,77] p. 123. During 2020 and 2021 in South Korea, many individuals with confirmed cases who violated self-isolation orders or failed to comply with compulsory measures were prosecuted or fined by local governments and the justice department. Self-isolation mobile applications enabled public officials to monitor violators’ movements, mask-wearing behavior, interpersonal contacts, and credit card use, extracted from public transportation logs, card transaction histories, surveillance camera footage, and even video recordings from dashcams in private vehicles. Under Korean law, these restrictive means required no warrants for administrative implementation, while raising constitutional concerns regarding privacy and human rights. Misguided policy implementations amid epistemological uncertainties complicated decision-making coordination and engendered unanticipated, severe consequences [32,60,75,76].

5.2. From Scientific Uncertainties to Resilience for Legitimacy

Dependence on probabilistic calculations is misleading when addressing “fuzzy and controversial socio-political problems” [77] p. 119, [82]. When contextual uncertainties are substantial or poorly understood, socially constructed judgments, rather than narrow technocratic assessments, strengthen policy legitimacy, enhance knowledge quality, and improve public acceptance [8]. In contexts in which uncertainty intersects with contested value judgments held by diverse epistemic communities, broad stakeholder participation, rather than exclusive reliance on technical experts, opens up space for conversation, securing legitimacy and public acceptance, as illustrated in Figure 1 [8,83,84,85]. Under such conditions, decision-making under uncertainty shifts emphasis toward weighing the legitimacy of policies. These considerations are central to bridging science and democratic legitimacy [8,43,47,48,60,86].
The cases discussed in Table 2 (i.e., the Fukushima nuclear disaster and South Korea’s COVID-19 response) illustrate the dynamics of the social construction of scientific legitimacy. As Jasanoff [87] emphasizes, scientific assessments are not universal but are “constructed in different ways in different political and cultural settings” (p. 127). What constitutes a “sound” assessment varies depending on context, “scale,” “interactivity,” and “contingency” [87] p. 125. In both crises, system vulnerabilities escalated as knowledge governance broke down across institutional levels.
Institutional strategies of democratic deliberation, grounded in post-normal science and resilience frameworks, are outlined in Table 2. Ignoring the contextual complexity and the expanded temporal and geospatial scales of epistemological uncertainties shown in Figure 1 creates “socially comfortable zones” in which identified hazards appear manageable but later reemerge as latent factors in systemic failures. The framework suggested in the table applies precautionary principles and implements anticipatory governance in which “unforeseen consequences” and “long-term impacts” are explicitly acknowledged [59,86,88,89,90,91]. The openness and inclusivity of participatory and deliberative approaches can operationalize Funtowicz and Ravetz’s [2] concept of post-normal science while confronting contested values and ambiguous social translations [2,9,51,84].
Wildavsky’s (1988) warning remains instructive: “all resilience, no anticipation, or vice versa—would be destructive” [92] p. 83. His caveat that “under considerable uncertainty, resilience is the preferable strategy; under substantial certainty, anticipation does make sense” (p. 79) highlights the need for institutional flexibility and balanced resilience practices. Achieving this balance requires procedural openness and epistemic diversity to prevent legitimacy deficits within scientific bureaucracies that must navigate unpredictable or low-probability events [92] p. 79. In this regard, consistent with the framework proposed in this research (Table 2), societies should strive to better understand, innovate, and institutionalize resilience in response to technological, methodological, and epistemological uncertainties. Resonating with Wildavsky (1988) [92], the framework in Table 2 conceptualizes “resilience for legitimacy” as a governance system’s capacity to treat democratic legitimacy as an enabling condition (e.g., inclusive communication, integrative reflection) for resilience. Transparent procedures, inclusive deliberation, and institutional learning underpin comprehensive systemic resilience, maintaining core protective functions (e.g., safeguarding public health or critical infrastructure) in the face of epistemological uncertainties and challenges [57] p. 439.
The recent notion of “resilience engineering” also differentiates resilience from engineering control, emphasizing procedural and institutional practices rather than deterministic projections [70,93,94,95]. Bruneau et al. [54] conceptualize engineering resilience as the time a system requires to return to its pre-disturbance state after an engineering risk materializes, emphasizing robustness, redundancy, resourcefulness, and rapidity (p. 737). For resilience governance, however, prioritizing resilience processes over purely engineering-based outcomes is essential [70,93,94]. The legitimacy approaches advocated in this study stand “in stark contrast to equilibrium-centered, command-and-control strategies” [55] p. 255 and instead call for “the collaboration of a diverse set of stakeholders operating at different social and ecological scales in multi-level institutions and organizations” [55] p. 262. Optimizing a combination of engineering anticipation, social values, and legitimate reflection can help prevent destructive system failures and enhance overall system resilience [92].

5.3. Implications for System Governance

As Barben et al. [61] suggest, robust governance requires “feedback mechanisms” that mobilize collective imagination and epistemic diversity to shape technological trajectories. To make the institutional practices suggested in Table 2 more actionable, the following counterfactual implication, deduced across the Fukushima and COVID-19 cases and grounded in reflexive observation of legitimacy deficits and scientific uncertainties, can inform governance design.
  • Counterfactual implication:  If “extended peer communities” had been convened such that citizen science groups, civil society, academic scholars, and governmental agencies could have freely shared and discussed tsunami risks and privacy concerns, public trust would have increased, and the sociotechnical trajectory would have shifted away from limited engineering perspectives, triggering preventive precautions for undefined uncertainties in sensitivity testing. A narrow technical epistemology could have been extended into a better-balanced practice of weighing public concerns, formal agency reviews, anticipated hazard intensity, and social ramifications, which would have produced a different, legitimate pathway of knowledge-making prior to the crises. Obsolete assumptions would have been discarded along this hypothetical route of knowledge co-production.
In Fukushima, however, these mechanisms were missing. The epistemic diversity and feedback mechanisms essential for robust governance of the construction of the 5.7 m seawall were largely absent. Distorted preparedness efforts systematically pushed unresolved uncertainties out of discussion before the disaster occurred. TEPCO’s disaster-preparedness practices were fixated on calculation and favored the technical and methodological “low-hanging fruit” of the cost–benefit considerations associated with seawall construction.

5.4. Limitations of the “Resilience for Legitimacy” Framework and a Facilitating Checklist

The epistemic diversity framework is applicable to post-normal governance under conditions of uncertainty. As a plausible heuristic lens, it helps clarify contextual dynamics and reduce the risk that mistranslations of epistemological uncertainty produce legitimacy deficits. To implement the framework effectively, the checklist in Table 3 can serve as a resilience governance guideline for the respective stakeholders before, during, and after a disaster.

6. Conclusions

TEPCO dismissed conflicting tsunami projections, and NISA did not require further scrutiny of inconsistent model outputs prior to the 2011 Fukushima disaster. Historical evidence and divergent risk assessments were sidelined. In the COVID-19 response, the South Korean government concentrated on halting viral transmission, paying far less attention to the social ramifications for individuals subjected to intensive tracking. For a government to be trusted and regarded as legitimate, it is not enough for its leaders to hold formal authority; policymaking must also sustain trust in the processes through which major decisions are made.
Given the two cases investigated, we find that science and technology at times fail to fully model real-world dynamics or adequately capture the complexities of sociotechnical systems [36,39,69]. Science and technology play a crucial role in legitimizing and implementing public policies, especially in mitigating catastrophic natural disasters, such as floods or hurricanes, and in identifying system vulnerabilities. When applying democratic approaches to uncertainty in forecasting tsunamis or viral spread, policymakers must avoid overreliance on technological methods and devices for resolving wicked uncertainties. Instead, typologies of uncertainty should be translated into resilient, democratic management systems.
Taking epistemic diversity seriously through structured deliberation is essential to developing resilient systems. Insulated knowledge co-production in science and technology can become a double-edged sword: even though the insulation attempts to uphold policy integrity, it may undermine democratic legitimacy and policy effectiveness. As demonstrated in Fukushima’s probability modeling and South Korea’s tracking of symptomatic and asymptomatic COVID-19 cases, the oversimplification of sociotechnical uncertainties in post-normal science obscures the “wickedness” inherent in crisis governance, a critical factor contributing to disastrous outcomes and infringements on civil rights [96,97].
More critically, transformational shifts in how policymakers perceive, evaluate, and respond to epistemological uncertainties are the first step to innovating and institutionalizing resilience approaches that account for contextual uncertainties [64]. Ultimately, optimizing the alignment among engineering anticipation, societal values, and democratic legitimacy should be carefully considered to enhance knowledge legitimacy and subsequently system resilience and avoid destructive failures [92]. Ignoring the diversity of knowledge systems exacerbates legitimacy deficits and hampers system resilience, as illustrated by TEPCO’s inadequate seawall decisions and South Korea’s tracking systems during COVID-19. Conventional crisis management tools anchored in technological models and narrow assessments can mistranslate scientific and technological uncertainties, hinder social legitimacy (e.g., democratic deliberation and social consensus), and produce adverse consequences.
The concepts of “legitimacy deficit” and the “resilience for legitimacy” framework can be extended and applied to additional sectors, including climate extremes, public health crises, and energy outages. To identify how mistranslation emerges and how enabling conditions mature, future research could employ mixed methods, including document analysis, interviews, focus group meetings, and quantitative modeling, to locate signals of diminished legitimacy in decision-making systems.

Funding

This research was supported by Start-Up funding from SUNY Korea (2025–2026).

Data Availability Statement

The data analyzed in this study, where not readily accessible, can be made available by the corresponding author upon reasonable request.

Acknowledgments

The author appreciates the valuable feedback and constructive comments provided by the anonymous reviewers.

Conflicts of Interest

The author declares no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
IAEA: International Atomic Energy Agency
NISA: Nuclear and Industrial Safety Agency
TEPCO: Tokyo Electric Power Company

Appendix A

Relevant Legal Arrangements for Tracing COVID-19

The Infectious Disease Control and Prevention Act
Article 34-2 (Disclosure of Information during Infectious Disease Emergency)
(1) When an infectious disease harmful to citizens’ health is spreading, the Minister of Health and Welfare shall promptly disclose information with which citizens are required to be acquainted for preventing the infectious disease, such as the movement paths, transportation means, medical treatment institutions, and contacts of patients with the infectious disease; provided, that any relevant party with respect to whom there exist any matters inconsistent with the facts among the disclosed matters or who has any opinion on the disclosed matters, may file an objection with the Minister of Health and Welfare.
[This Article Newly Inserted by Act No. 13392, 6 July 2015]
(2) Necessary matters concerning the scope, procedures, methods, etc., of the disclosure of information as prescribed in paragraph (1), shall be prescribed by Enforcement Rule of the Ministry of Health and Welfare.
The Infectious Disease Control and Prevention Act
Article 76-2 (Request to Provide Information, etc.)
(1) If necessary to prevent infectious diseases and block the spread of infection, the Minister of Health and Welfare or the Director of the Korea Centers for Disease Control and Prevention may request the heads of relevant central administrative agencies (including affiliated agencies and responsible administrative agencies thereof), the heads of local governments (including superintendents of education prescribed in Article 18 of the Local Education Autonomy Act), public institutions and individuals, designated under Article 4 of the Act on the Management of Public Institutions, such as medical institutions, pharmacies, corporations, organizations, and individuals, to provide the following information concerning patients, etc. with infectious diseases and persons likely to be infected by infectious diseases, and persons in receipt of such request shall comply therewith: <Amended by Act No. 14286, 2 December 2016>
         1. Personal information, such as names, resident registration numbers prescribed in Article 7-2 (1) of the Resident Registration Act, addresses, and telephone numbers (including cell phone numbers);
         2. Prescriptions prescribed in Article 17 of the Medical Service Act, records of medical treatment prescribed in Article 22 of the same Act, etc.;
         3. Records of immigration control during the period determined by the Minister of Health and Welfare;
         4. Other information prescribed by Presidential Decree for monitoring the movement paths of patients with infectious diseases.
The Infectious Disease Control and Prevention Act
Article 76-2 (Request to Provide Information, etc.)
(2) If necessary, to prevent infectious diseases and block the spread of infection, the Minister of Health and Welfare may request the relevant head of the National Police Agency, regional police agency, and police station established under Article 2 of the Police Act (hereafter in this Article, referred to as “police agency”) to provide location information of patients, etc. with an infectious disease and persons likely to be infected by an infectious disease. In such cases, notwithstanding Article 15 of the Act on the Protection, Use, etc. of Location Information and Article 3 of the Protection of Communications Secrets Act, the relevant head of a police agency, upon request by the Minister of Health and Welfare, may request any location information provider defined in Article 5 (7) of the Act on the Protection, Use, etc. of Location Information and any telecommunications business operator defined in subparagraph 8 of Article 2 of the Telecommunications Business Act, to provide location information of patients, etc. with an infectious disease and persons likely to be infected by an infectious disease; and the location information provider and the telecommunications business operator in receipt of such request shall comply therewith, except in extenuating circumstances. <Amended by Act No. 13639, 29 December 2015>
Enforcement Decree of the Infectious Disease Control and Prevention Act
Article 32-2 (Information Requestable to be Provided)
“Information prescribed by Presidential Decree” in Article 76-2 (1) 4 of the Act, means the following:
         1. Credit card, debit card, and prepaid card statements defined in subparagraphs 3, 6, and 8 of Article 2 of the Specialized Credit Finance Business Act;
         2. Transportation card statements specified in Article 10-2 (1) of the Act on the Support and Promotion of Utilization of Mass Transit System;
         3. Image data compiled through image data processing equipment defined in subparagraph 7 of Article 2 of the Personal Information Protection Act.
Enforcement Rule of the Infectious Disease Control and Prevention Act
Article 27-3 (Scope, Procedures, etc. of Information Disclosure in the event of an Infectious Disease Crisis)
(1) In case the declaration of the “notice” or “warning” level is more serious than the “caution” level issued under Article 38 (2) of the Framework Act on the Management of Disasters and Safety, the movement paths, transportation means, medical treatment institutions, contact status, etc. of infectious disease patients shall be posted on the information and communications networks or released to the public by means of the press release, etc. pursuant to Article 34-2 of the Act.
(2) A party to information under paragraph (1) may file an objection with the Minister of Health and Welfare by means of oral, written, etc., if any of the matters disclosed are different from the facts or if there is an opinion, and the Minister of Health and Welfare shall take necessary measures, such as correction of the information disclosed accordingly.

Appendix B

An Example of the Movement Paths of a Confirmed Case: The First Community Transmission Patient in South Korea

Patient Number: 29
Personal Information: Male (South Korea, 38)
Infection Route: Under Investigation
Confirmation Date: February 16
Inpatient Institution: Seoul National University Hospital
Number of Contacts (Contact Persons in Isolation): 117 (117)
To investigate the cause of infection, the scope of the investigation is being expanded based on the patient’s activities during the two weeks (January 20 to February 4) prior to the symptom onset date. The patient was confirmed to have visited the Jongno Senior Welfare Center and Baduk Center before symptom onset, and the facility users are under investigation regarding symptoms and overseas travel histories.
On February 15, a COVID-19 diagnostic test was conducted based on the judgment of the medical staff, who identified pneumonia on a chest X-ray while the patient was visiting the emergency room of Korea University An-am Hospital for examination of chest discomfort and treatment for a suspected myocardial infarction. The patient tested positive on February 16.
Currently, the patient is being quarantined at the designated hospital (Seoul National University Hospital), and although there are signs of fever and pneumonia, the patient’s condition is stable overall.
The patient stated that he had not visited a foreign country since December 2019, and the source of the infection, the route of infection, and contact are being investigated by the immediate response team and the local government.
(February 4) Moving from Dongmyo Station to Sinseol-dong Station (15:53–15:57) by subway, and moving from Sinseol-dong Station to Dongmyo Station (21:36–21:46)
(February 5) Moving from Dongdaemun Station to Nokyang Station by subway (11:41–12:41), moving from Nokyang Station to Dongdaemun Station by subway (12:43–13:38), visiting a medical institution located in Jongno-gu, Seoul (Shinjung Hospital of Internal Medicine, Jibong-ro 61-1), and visiting a pharmacy located in Jongno-gu, Seoul (Boram Pharmacy, Jongno 326) at around 15:10
(February 6) (Checking the path of movement)
(February 7) Visiting a medical institution (Shinjung Hospital of Internal Medicine) located in Jongno-gu at around 14:20 and moving from Dongmyo Station to Soyosan Station by subway (14:37–15:53)
(February 8) Visiting a medical institution (Gangbuk Seoul Medical Center) in Jongno-gu at around 11:30 and visiting a pharmacy (Spring Pharmacy, Jibong-ro 37-1) in Jongno-gu at around 11:40
(February 9) (Checking the path of movement)
(February 10) Visiting a medical institution (Gangbuk Seoul Medical Center) in Jongno-gu, visiting a pharmacy (Boram Pharmacy) in Jongno-gu at around 9:15, moving from Sinseol-dong Station by subway (14:04–14:53), and moving from Deokjeong Station to Dongmyo Station (14:58–16:14)
(February 11) Visiting a medical institution (Gangbuk Seoul Medical Center) in Jongno-gu at around 11:00
(February 12) Visiting a medical institution (Gangbuk Seoul Medical Center) in Jongno-gu at around 10:50 and visiting a pharmacy (Spring Pharmacy) in Jongno-gu at around 11:05
(February 13) (Checking the path of movement)
(February 14) A round-trip from Changshin Station to Bonghwasan (17:57–18:53) by subway
(February 15) Visiting a medical institution (Gangbuk Seoul Medical Center) in Jongno-gu at around 11:00, visiting the emergency room (Korea University An-am Hospital) in Seongbuk-gu at around 11:45, and moving to a negative pressure isolation room at around 16:00
(February 16) Transferring to a designated hospital (Seoul National University Hospital)

References

  1. Linkov, I.; Bridges, T.; Creutzig, F.; Decker, J.; Fox-Lent, C.; Kröger, W.; Lambert, J.H.; Levermann, A.; Montreuil, B.; Nathwani, J.; et al. Changing the resilience paradigm. Nat. Clim. Change 2014, 4, 407–409. [Google Scholar] [CrossRef]
  2. Funtowicz, S.O.; Ravetz, J.R. Science for the post-normal age. Futures 1993, 25, 739–755. [Google Scholar] [CrossRef]
  3. Saltelli, A.; Funtowicz, S.O. When all models are wrong. Issues Sci. Technol. 2014, 30, 79–85. [Google Scholar]
  4. Saltelli, A.; Bammer, G.; Bruno, I.; Charters, E.; Di Fiore, M.; Didier, E.; Espeland, W.N.; Kay, J.; Piano, S.L.; Mayo, D.; et al. Five ways to ensure that models serve society: A manifesto. Nature 2020, 582, 482–484. [Google Scholar] [CrossRef]
  5. Jasanoff, S. The Fifth Branch: Science Advisers as Policymakers; Harvard University Press: Cambridge, MA, USA, 1990. [Google Scholar]
  6. Jasanoff, S. Procedural choices in regulatory science. Technol. Soc. 1995, 17, 279–293. [Google Scholar] [CrossRef]
  7. Turnhout, E.; Dewulf, A.; Hulme, M. What does policy-relevant global environmental knowledge do? The cases of climate and biodiversity. Curr. Opin. Environ. Sustain. 2016, 18, 65–72. [Google Scholar] [CrossRef]
  8. Wagner, N.; Sarkki, S.; Dietz, T. More than policy neutral: Justifying the power of science-policy interfaces through legitimacy. Earth Syst. Gov. 2024, 21, 100219. [Google Scholar] [CrossRef]
  9. Funtowicz, S.O.; Ravetz, J.R. Uncertainty, complexity and post-normal science. Environ. Toxicol. Chem. 1994, 13, 1881–1885. [Google Scholar] [CrossRef]
  10. Ackoff, R.L. Towards a system of systems concepts. Manag. Sci. 1971, 17, 661–671. [Google Scholar] [CrossRef]
  11. von Bertalanffy, L. General System Theory: Foundations, Development, Applications; George Braziller: New York, NY, USA, 1972. [Google Scholar]
  12. Star, S.L. The ethnography of infrastructure. Am. Behav. Sci. 1999, 43, 377–391. [Google Scholar] [CrossRef]
  13. Ramaswami, A.; Weible, C.; Main, D.; Heikkila, T.; Siddiki, S.; Duvall, A.; Pattison, A.; Bernard, M. A social-ecological-infrastructural systems framework for interdisciplinary study of sustainable city systems: An integrative curriculum across seven major disciplines. J. Ind. Ecol. 2012, 16, 801–813. [Google Scholar] [CrossRef]
  14. Corvellec, H.; Campos, M.J.Z.; Zapata, P. Infrastructures, lock-in, and sustainable urban development: The case of waste incineration in the Göteborg Metropolitan Area. J. Clean. Prod. 2013, 50, 32–39. [Google Scholar] [CrossRef]
  15. McPhearson, T.; Haase, D.; Kabisch, N.; Gren, Å. Advancing understanding of the complex nature of urban systems. Ecol. Indic. 2016, 70, 566–573. [Google Scholar] [CrossRef]
  16. Shove, E. Infrastructures and practices: Networks beyond the city. In Beyond the Networked City: Infrastructure Reconfigurations and Urban Change in the North and South London; Coutard, O., Rutherford, J., Eds.; Routledge: New York, NY, USA, 2016; pp. 242–258. [Google Scholar]
  17. Slota, S.; Bowker, G. How infrastructures matter. In The Handbook of Science and Technology Studies, 4th ed.; Felt, U., Fouche, R., Miller, C., Smith-Doerr, L., Eds.; MIT Press: Cambridge, MA, USA, 2017; pp. 529–554. [Google Scholar]
  18. Grabowski, Z.J.; Matsler, A.M.; Thiel, C.; McPhillips, L.; Hum, R.; Bradshaw, A.; Miller, T.; Redman, C. Infrastructures as socio-eco-technical systems: Five considerations for interdisciplinary dialogue. J. Infrastruct. Syst. 2017, 23, 02517002. [Google Scholar] [CrossRef]
  19. Grimm, N.B.; Pickett, S.T.A.; Hale, R.L.; Cadenasso, M.L. Does the ecological concept of disturbance have utility in urban social–ecological–technological systems? Ecosyst. Health Sustain. 2017, 3, e01255. [Google Scholar] [CrossRef]
  20. Miller, C. Engaging with societal challenges. In The Handbook of Science and Technology Studies, 4th ed.; Felt, U., Fouche, R., Miller, C., Smith-Doerr, L., Eds.; MIT Press: Cambridge, MA, USA, 2017; pp. 909–914. [Google Scholar]
  21. Mostafavi, A. A system-of-systems approach for integrated resilience assessment in highway transportation infrastructure investment. Infrastructures 2017, 2, 22. [Google Scholar] [CrossRef]
  22. Markolf, S.A.; Chester, M.V.; Eisenberg, D.A.; Iwaniec, D.M.; Davidson, C.I.; Zimmerman, R.; Miller, T.R.; Ruddell, B.L.; Chang, H. Interdependent infrastructure as linked social, ecological, and technological systems (SETSs) to address lock-in and enhance resilience. Earth’s Future 2018, 6, 1638–1659. [Google Scholar] [CrossRef]
  23. Stokols, D. Social Ecology in the Digital Age: Solving Complex Problems in a Globalized World; Academic Press: London, UK, 2018. [Google Scholar]
  24. Tellman, B.; Bausch, J.C.; Eakin, H.; Anderies, J.M.; Mazari-Hiriart, M.; Manuel-Navarrete, D.; Redman, C.L. Adaptive pathways and coupled infrastructure: Seven centuries of adaptation to water risk and the production of vulnerability in Mexico City. Ecol. Soc. 2018, 23, 1. [Google Scholar] [CrossRef]
  25. Helmrich, A.; Markolf, S.; Li, R.; Carvalhaes, T.; Kim, Y.; Bondank, E.; Natarajan, M.; Ahmad, N.; Chester, M.V. Centralization and decentralization for resilient infrastructure and complexity. Environ. Res. Infrastruct. Sustain. 2021, 1, 021001. [Google Scholar] [CrossRef]
  26. Gim, C.; Miller, C.A. Institutional interdependence and infrastructure resilience. Curr. Opin. Environ. Sustain. 2022, 57, 101203. [Google Scholar] [CrossRef]
  27. Hale, R.L.; Armstrong, A.; Baker, M.A.; Bedingfield, S.; Betts, D.; Buahin, C.; Buchert, M.; Crowl, T.; Dupont, R.R.; Ehleringer, J.R.; et al. iSAW: Integrating structure, actors, and water to study socio-hydro-ecological systems. Earth’s Future 2015, 3, 110–132. [Google Scholar] [CrossRef]
  28. Latour, B. We Have Never Been Modern; Harvard University Press: Cambridge, MA, USA, 1991. [Google Scholar]
  29. Kundzewicz, Z.W.; Kanae, S.; Seneviratne, S.I.; Handmer, J.; Nicholls, N.; Peduzzi, P.; Mechler, R.; Bouwer, L.M.; Arnell, N.; Mach, K.; et al. Flood risk and climate change: Global and regional perspectives. Hydrol. Sci. J. 2014, 59, 1–28. [Google Scholar] [CrossRef]
  30. Porter, T.M. Trust in Numbers: The Pursuit of Objectivity in Science and Public Life; Princeton University Press: Princeton, NJ, USA, 1995. [Google Scholar]
  31. Rayner, S. Uncomfortable knowledge: The social construction of ignorance in science and environmental policy discourses. Econ. Soc. 2012, 41, 107–125. [Google Scholar] [CrossRef]
  32. Rittel, H.W.J.; Webber, M.M. Dilemmas in a general theory of planning. Policy Sci. 1973, 4, 155–169. [Google Scholar] [CrossRef]
  33. Cash, D.W.; Clark, W.C.; Alcock, F.; Dickson, N.M.; Eckley, N.; Guston, D.H.; Jäger, J.; Mitchell, R.B. Knowledge systems for sustainable development. Proc. Natl. Acad. Sci. USA 2003, 100, 8086–8091. [Google Scholar] [CrossRef]
  34. Miller, C.A.; Wyborn, C. Co-production in global sustainability: Histories and theories. Environ. Sci. Policy 2020, 113, 88–95. [Google Scholar] [CrossRef]
  35. Oreskes, N.; Shrader-Frechette, K.; Belitz, K. Verification, validation, and confirmation of numerical models in the earth sciences. Science 1994, 263, 641–646. [Google Scholar] [CrossRef]
  36. Pielke, R.A., Jr. Who decides? Forecasts and responsibilities in the 1997 Red River flood. Appl. Behav. Sci. Rev. 1999, 7, 83–101. [Google Scholar] [CrossRef]
  37. Changnon, S.A. Flood prediction: Immersed in the quagmire of national flood mitigation strategy. In Prediction: Science, Decision Making, and the Future of Nature; Sarewitz, D., Pielke, R.A., Jr., Byerly, R.A., Jr., Eds.; Island Press: Covelo, CA, USA, 2000; pp. 85–106. [Google Scholar]
  38. Popper, K. A survey of some fundamental problems. In The Logic of Scientific Discovery; Routledge: New York, NY, USA, 1968. [Google Scholar]
  39. Sarewitz, D.; Pielke, R.A., Jr. Prediction in science and policy. In Prediction: Science, Decision Making, and the Future of Nature; Sarewitz, D., Pielke, R.A., Jr., Byerly, R.A., Jr., Eds.; Island Press: Covelo, CA, USA, 2000. [Google Scholar]
  40. Sarewitz, D. Unknown knowns. Issues Sci. Technol. 2020, 37, 18–19. [Google Scholar]
  41. Miller, C. Hybrid management: Boundary organizations, science policy, and environmental governance in the climate regime. Sci. Technol. Hum. Values 2001, 26, 478–500. [Google Scholar] [CrossRef]
  42. Rekola, A.; Paloniemi, R. Politics of knowledge use: Epistemic governance in marine spatial planning. J. Environ. Policy Plan. 2022, 24, 807–821. [Google Scholar] [CrossRef]
  43. Mehta, G.; Hopf, H.; Krief, A.; Matlin, S.A. Realigning science, society and policy in uncertain times. R. Soc. Open Sci. 2020, 7, 200554. [Google Scholar] [CrossRef]
  44. Guston, D.H. Boundary organizations in environmental policy and science: An introduction. Sci. Technol. Hum. Values 2001, 26, 399–408. [Google Scholar] [CrossRef]
  45. Nowotny, H.; Gibbons, M.; Scott, P. Re-Thinking Science: Knowledge and the Public in an Age of Uncertainty; Polity Press: Cambridge, UK, 2001. [Google Scholar]
  46. Acton, J.M.; Hibbs, M. Why Fukushima Was Preventable. Carnegie Endowment for International Peace. Available online: https://carnegieendowment.org/russia-eurasia/research/2012/03/why-fukushima-was-preventable (accessed on 20 November 2025).
  47. Gauchat, G.W. The legitimacy of science. Annu. Rev. Sociol. 2023, 49, 263–279. [Google Scholar] [CrossRef]
  48. Scolobig, A.; Pelling, M. The co-production of risk from a natural hazards perspective: Science and policy interaction for landslide risk management in Italy. Nat. Hazards 2016, 81, 7–25. [Google Scholar] [CrossRef]
  49. Scott, J.C. Seeing Like a State: How Certain Schemes to Improve the Human Condition Have Failed; Yale University Press: New Haven, CT, USA, 1998. [Google Scholar]
  50. Oreskes, N. The role of quantitative models in science. In Models in Ecosystem Science; Canham, C.D., Cole, J.J., Lauenroth, W.K., Eds.; Princeton University Press: Princeton, NJ, USA, 2003; pp. 13–31. [Google Scholar]
  51. Gim, C.; Miller, C.A.; Hirt, P.W. The resilience work of institutions. Environ. Sci. Policy 2019, 97, 36–43. [Google Scholar] [CrossRef]
  52. Meerow, S.; Newell, J.P.; Stults, M. Defining urban resilience: A review. Landsc. Urban Plan. 2016, 147, 38–49. [Google Scholar] [CrossRef]
  53. Muñoz-Erickson, T.A.; Meerow, S.; Hobbins, R.; Cook, E.; Iwaniec, D.M.; Berbés-Blázquez, M.; Grimm, N.B.; Barnett, A.; Cordero, J.; Gim, C.; et al. Beyond bouncing back? Comparing and contesting urban resilience frames in US and Latin American contexts. Landsc. Urban Plan. 2021, 214, 104173. [Google Scholar] [CrossRef]
  54. Bruneau, M.; Chang, S.E.; Eguchi, R.T.; Lee, G.C.; O’Rourke, T.D.; Reinhorn, A.M.; Shinozuka, M.; Tierney, K.; Wallace, W.A.; Von Winterfeldt, D. A framework to quantitatively assess and enhance the seismic resilience of communities. Earthq. Spectra 2003, 19, 733–752. [Google Scholar] [CrossRef]
  55. Folke, C. Resilience: The emergence of a perspective for social–ecological systems analyses. Glob. Environ. Change 2006, 16, 253–267. [Google Scholar] [CrossRef]
  56. Gim, C. Institutional Management for Infrastructure Resilience. Ph.D. Thesis, Arizona State University, Tempe, AZ, USA, 2019. [Google Scholar]
  57. Van Asselt, M.B.A.; Renn, O. Risk governance. J. Risk Res. 2011, 14, 431–449. [Google Scholar] [CrossRef]
  58. Herrick, C. Predictive modeling of acid rain: Obstacles to generating useful information. In Prediction: Science, Decision Making, and the Future of Nature; Sarewitz, D., Pielke, R.A., Jr., Byerly, R.A., Jr., Eds.; Island Press: Covelo, CA, USA, 2000; pp. 251–268. [Google Scholar]
  59. Beck, U. Risk Society: Towards a New Modernity; Sage Publications: London, UK, 1992. [Google Scholar]
  60. Maynard, A.D. Why we need risk innovation. Nat. Nanotechnol. 2015, 10, 730–731. [Google Scholar] [CrossRef]
  61. Barben, D.; Fisher, E.; Selin, C.; Guston, D. Anticipatory governance of nanotechnology: Foresight, engagement, and integration. In The Handbook of Science and Technology Studies, 3rd ed.; Hackett, E.J., Amsterdamska, O., Eds.; MIT Press: Cambridge, MA, USA, 2008; pp. 979–1000. [Google Scholar]
  62. Cyranoski, D. Japan faces up to failure of its earthquake preparations. Nature 2011, 471, 556–557. [Google Scholar] [CrossRef]
  63. Cyranoski, D. Japan’s tsunami warning system retreats. Nature 2011. [Google Scholar] [CrossRef]
  64. Lacassin, R.; Lavelle, S. The crisis of a paradigm. A methodological interpretation of Tohoku and Fukushima catastrophe. Earth-Sci. Rev. 2016, 155, 49–59. [Google Scholar] [CrossRef]
  65. Müller, V. Reliable risk analysis on the example of tsunami heights. Econ. Qual. Control 2013, 28, 45–55. [Google Scholar] [CrossRef]
  66. Funtowicz, S.; Strand, R. Models of science and policy. In Biosafety First: Holistic Approaches to Risk and Uncertainty in Genetic Engineering and Genetically Modified Organisms; Traavik, T., Lim, L.C., Eds.; Tapir Academic Press: Trondheim, Norway, 2007. [Google Scholar]
  67. Röckmann, C.; Ulrich, C.; Dreyer, M.; Bell, E.; Borodzicz, E.; Haapasaari, P.; Hauge, K.H.; Howell, D.; Mäntyniemi, S.; Miller, D.; et al. The added value of participatory modelling in fisheries management—What has been learnt? Mar. Policy 2012, 36, 1072–1085. [Google Scholar] [CrossRef]
  68. The Independent Investigation Commission of the Fukushima Nuclear Accident. The Fukushima Daiichi Nuclear Power Station Disaster: Investigating the Myth and Reality, 1st ed.; Routledge: Oxfordshire, UK, 2014. [Google Scholar]
  69. Stewart, T.R. Uncertainty, judgment, and error in prediction. In Prediction: Science, Decision Making, and the Future of Nature; Sarewitz, D., Pielke, R.A., Jr., Byerly, R.A., Jr., Eds.; Island Press: Covelo, CA, USA, 2000. [Google Scholar]
  70. Park, J.; Seager, T.P.; Rao, P.S.C.; Convertino, M.; Linkov, I. Integrating risk and resilience approaches to catastrophe management in engineering systems. Risk Anal. 2013, 33, 356–367. [Google Scholar] [CrossRef] [PubMed]
  71. Sakai, T.; Takeda, T.; Soraoka, H.; Yanagisawa, K.; Annaka, T. Development of a probabilistic tsunami hazard analysis in Japan. In Proceedings of the 14th International Conference on Nuclear Engineering, Miami, FL, USA, 17–20 July 2006; Volume 5: Safety and security; low level waste management, decontamination and decommissioning; Nuclear Industry Forum; ASME: Washington, DC, USA, 2006; pp. 69–75. [Google Scholar]
  72. Van Doremalen, N.; Bushmaker, T.; Morris, D.H.; Holbrook, M.G.; Gamble, A.; Williamson, B.N.; Tamin, A.; Harcourt, J.L.; Thornburg, N.J.; Gerber, S.I.; et al. Aerosol and surface stability of SARS-CoV-2 as compared with SARS-CoV-1. N. Engl. J. Med. 2020, 382, 1564–1567. [Google Scholar] [CrossRef] [PubMed]
  73. Ministry of Economy and Finance of South Korea. Flattening the Curve on COVID-19, 2020. Available online: https://www.mois.go.kr/eng/bbs/type002/commonSelectBoardArticle.do?bbsId=BBSMSTR_000000000022&nttId=76748 (accessed on 22 February 2026).
  74. People’s Solidarity for Participatory Democracy (PSPD). [Press Release] “Constitutional Complaint on the Processing of Itaewon Base-Station Connection Data and the Legal Basis for Non-Consensual Location Tracking Related to COVID-19,” 2020. Available online: https://www.peoplepower21.org/sue?mod=document&uid=1724967 (accessed on 22 February 2026). (In Korean)
  75. Sarewitz, D.; Nelson, R. Three rules for technological fixes. Nature 2008, 456, 871–872. [Google Scholar] [CrossRef] [PubMed]
  76. Maynard, A.D. Thinking differently about risk. Astrobiology 2018, 18, 244–245. [Google Scholar] [CrossRef]
  77. Stirling, A. Risk at a turning point? J. Environ. Med. 1999, 1, 119–126. [Google Scholar] [CrossRef]
  78. The Fukushima Nuclear Accident Independent Investigation Commission. The Official Report of The Fukushima Nuclear Accident Independent Investigation Commission (Executive Summary). The National Diet of Japan. 2012. Available online: https://www.nirs.org/wp-content/uploads/fukushima/naiic_report.pdf (accessed on 31 July 2025).
  79. Nöggerath, J.; Geller, R.J.; Gusiakov, V.K. Fukushima: The myth of safety, the reality of geoscience. Bull. At. Sci. 2011, 67, 37–46. [Google Scholar] [CrossRef]
  80. Funabashi, Y.; Kitazawa, K. Fukushima in review: A complex disaster, a disastrous response. Bull. At. Sci. 2012, 68, 9–21. [Google Scholar] [CrossRef]
  81. Solinas-Saunders, M. The U.S. federal response to COVID-19 during the first 3 months of the outbreak: Was an evidence-based approach an option? Am. Rev. Public Adm. 2020, 50, 713–719. [Google Scholar] [CrossRef]
  82. Mousavi, S.; Gigerenzer, G. Risk, uncertainty, and heuristics. J. Bus. Res. 2014, 67, 1671–1678. [Google Scholar] [CrossRef]
  83. Renn, O.; Klinke, A.; Van Asselt, M. Coping with complexity, uncertainty and ambiguity in risk governance: A synthesis. AMBIO 2011, 40, 231–246. [Google Scholar] [CrossRef]
  84. Renn, O.; Laubichler, M.; Lucas, K.; Kröger, W.; Schanze, J.; Scholz, R.W.; Schweizer, P. Systemic risks from different perspectives. Risk Anal. 2022, 42, 1902–1920. [Google Scholar] [CrossRef] [PubMed]
  85. Lejano, R.P.; Haque, C.E.; Berkes, F. Co-production of risk knowledge and improvement of risk communication: A three-legged stool. Int. J. Disaster Risk Reduct. 2021, 64, 102508. [Google Scholar] [CrossRef]
  86. Aven, T.; Renn, O. Some foundational issues related to risk governance and different types of risks. J. Risk Res. 2020, 23, 1121–1134. [Google Scholar] [CrossRef]
  87. Jasanoff, S. Bridging the two cultures of risk analysis. Risk Anal. 1993, 13, 123–129. [Google Scholar] [CrossRef]
  88. Beck, U. World at Risk; Polity Press: Cambridge, UK, 2009. [Google Scholar]
  89. Rogers, M.D. Scientific and technological uncertainty, the precautionary principle, scenarios and risk management. J. Risk Res. 2001, 4, 1–15. [Google Scholar] [CrossRef]
  90. Stirling, A. Renewables, sustainability and precaution: Beyond environmental cost-benefit and risk analysis. In Sustainability and Environmental Impact of the Renewable Energy Resources; Hester, R.E., Harrison, R.M., Eds.; The Royal Society of Chemistry: Cambridge, UK, 2003; Volume 19. [Google Scholar]
  91. Aven, T. Foundational issues in risk assessment and risk management. Risk Anal. 2012, 32, 1647–1656. [Google Scholar] [CrossRef]
  92. Wildavsky, A. Searching for Safety; Social Philosophy and Policy Center: Bowling Green, OH, USA, 1988. [Google Scholar]
  93. Hollnagel, E. Prologue: The scope of resilience engineering. In Resilience Engineering in Practice; Hollnagel, E., Pariès, J., Woods, D.D., Wreathall, J., Eds.; Ashgate Publishing Company: Burlington, VT, USA, 2011; pp. xxix–xxxix. [Google Scholar]
  94. Hollnagel, E.; Pariès, J.; Woods, D.D.; Wreathall, J. Resilience Engineering in Practice; Ashgate Publishing Company: Burlington, VT, USA, 2011. [Google Scholar]
  95. Hollnagel, E.; Pariès, J.; Woods, D.D.; Wreathall, J. (Eds.) Resilience Engineering in Practice: A Guidebook; CRC Press: Boca Raton, FL, USA, 2011. [Google Scholar]
  96. Wang, Q.; Chen, X. Regulatory failures for nuclear safety—The bad example of Japan—Implication for the rest of world. Renew. Sustain. Energy Rev. 2012, 16, 2610–2617. [Google Scholar] [CrossRef]
  97. Perrow, C. Fukushima and the inevitability of accidents. Bull. At. Sci. 2011, 67, 44–52. [Google Scholar] [CrossRef]
Figure 1. Democratic legitimacy deficit in the science-policy decision-making model on uncertainties (Source: author). Dotted lines indicate the institutional independence and structural boundaries of the knowledge body, i.e., the scientific advisory committee.
Figure 2. Visualizing different types of uncertainties (Source: author). Dotted lines denote undefined contexts and trajectories.
Figure 3. The relationship between the lack of epistemological consensus on uncertainties and increased vulnerabilities (Source: author).
Figure 4. The types of uncertainties and the mistranslation of uncertainty in the 2011 Fukushima disaster and COVID-19 (Source: author; modified from Funtowicz and Ravetz [2]).
Figure 5. South Korea’s tracking and surveillance systems during the COVID-19 outbreak as of 2020 (Source: author). Arrows depict the inflow and outflow of data associated with visitors, patients, COVID-19 testing results, and medical services. Key excerpts from A. Quarantine Act and B. Infectious Disease Control and Prevention Act are provided in Appendix A.
Figure 6. COVID-19 epidemiological investigation support system (Source: Ministry of Economy and Finance of South Korea [73], p. 44). The two panels illustrate the interface of South Korea’s COVID-19 Epidemiological Investigation Support System (EISS) as of 2020. The left panel summarizes the main dashboard, which combines real-time counts of confirmed COVID-19 cases with regional heat maps, demographic analysis, and tracking modules, and quantifies close contacts based on credit card transactions and cellular location data. The right panel displays an example map of a confirmed case’s trajectory in chronological order to visualize rapid contact tracing. The non-English text within the figure remains untranslated, as the tracking system was originally communicated to South Korean citizens in Korean during the pandemic.
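The quantification of close contacts from credit card and cellular location records, as described in the Figure 6 caption, can be sketched in miniature. The function name, record layout, and 30-minute window below are illustrative assumptions for exposition, not the EISS implementation.

```python
from datetime import datetime, timedelta

# Hypothetical location records: (person_id, place_id, timestamp)
records = [
    ("case_1", "store_A", datetime(2020, 5, 1, 14, 0)),
    ("visitor_2", "store_A", datetime(2020, 5, 1, 14, 20)),
    ("visitor_3", "store_B", datetime(2020, 5, 1, 14, 30)),
]

def close_contacts(records, case_id, window=timedelta(minutes=30)):
    """Return the people co-located with a confirmed case within the window."""
    case_visits = [(place, t) for pid, place, t in records if pid == case_id]
    contacts = set()
    for pid, place, t in records:
        if pid == case_id:
            continue
        for case_place, case_t in case_visits:
            if place == case_place and abs(t - case_t) <= window:
                contacts.add(pid)
    return contacts

print(close_contacts(records, "case_1"))  # {'visitor_2'}
```

Even this toy version makes the governance point visible: the result depends entirely on the chosen co-location threshold and on which data streams are fused, choices that are technical in form but consequential for privacy.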
Table 1. Types of uncertainties during the 2011 Fukushima disaster and the COVID-19 pandemic.

Technical uncertainty (bouncing back)
- Guiding question: “How many digits are reliable?”
- The inexactness of numbers; technological factors; known knowns
- Figuring out engineering predicaments (e.g., operationalizing infrastructures)
- Cases: constructing and managing seawalls; manufacturing COVID-19 testing kits and drive-through testing

Methodological uncertainty (adaptation; bouncing forward)
- Guiding question: “To what extent are the methods reliable?”
- The unreliability of prediction methodologies (assumptions); the complication of social and environmental factors; known unknowns
- Using prediction models and anticipating threats within a certain range (e.g., adjusting infrastructures)
- Cases: inundation areas with finite variation in tsunami heights; tracking close-contact patients and calculating their movements; testing and finding (a)symptomatic patients; modeling the spread of the virus (basic reproduction number (R0) modeling)

Epistemological uncertainty (transformation)
- Guiding questions: “What can be known about this phenomenon?” “How do we know that we know?”
- Ignorance and unknowable uncertainties; social-ecological-technological complexities; unknown unknowns
- Impactful decision-making based on a lack of scientific data (e.g., transformational changes from valleys to reservoirs with infrastructures)
- Cases: deciding the height of seawalls and the siting of backup generators (i.e., calculating tsunami heights and inundation areas); anticipating the surge of COVID-19 variants and implementing vaccination; restrictions on indoor dining; vaccination and mandates (vaccine safety and convincing people to be vaccinated)

(Source: author; adapted from Funtowicz and Ravetz [2], Gim et al. [51], Meerow et al. [52], and Muñoz-Erickson et al. [53]).
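Table 1 lists basic reproduction number (R0) modeling as an instance of methodological uncertainty: the arithmetic is exact, but projections inherit the uncertainty of the assumed R0. A deliberately minimal discrete-generation sketch (illustrative numbers, not any model cited here) makes the point:

```python
def project_cases(initial_cases, r0, generations):
    """Project case counts per transmission generation under a constant R0.
    A simple branching-process approximation; real epidemic models add
    contact structure, depletion of susceptibles, and interventions."""
    cases = [initial_cases]
    for _ in range(generations):
        cases.append(cases[-1] * r0)
    return cases

# The same starting data, under two plausible R0 assumptions, diverge fast:
low = project_cases(100, 1.5, 4)   # [100, 150.0, 225.0, 337.5, 506.25]
high = project_cases(100, 2.5, 4)  # [100, 250.0, 625.0, 1562.5, 3906.25]
```

The eight-fold gap after only four generations is why the assumptions behind R0, not the computation itself, are where methodological uncertainty resides.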
Table 2. A “resilience for legitimacy” framework to translate uncertainties and enhance system resilience (Source: author).

Technical uncertainties
- Resilience work fields: operational, engineering approaches
- Temporal scale/spatial/functional: minutes to months; discrete infrastructure or equipment; sustaining
- Organizational practices: equipment monitoring; optimizing operational practices; technical backup systems
- Instrument and governance: technological equipment; technical expertise; quantitative risk assessment
- Legitimacy mechanism: standardizing; engineering protocols
- Specific cases: real-time reactor monitoring; COVID-19 contact tracing and disclosure

Methodological uncertainties
- Resilience work fields: regulatory and methodological innovation
- Temporal scale/spatial/functional: months to years; sociotechnological systems; adapting
- Organizational practices: agile project management; multi-stakeholders; adaptive compliances and check-ups
- Instrument and governance: scientific/technological modeling; consultation with various experts; interfacing knowledge domains
- Legitimacy mechanism: boundary organizations; multi-disciplinary anticipation
- Specific cases: tsunami modeling convergence; COVID-19 transmission models

Epistemological uncertainties
- Resilience work fields: transformational approaches to policymaking
- Temporal scale/spatial/functional: years to decades; societal systems; transforming
- Organizational practices: collective values and reasoning; post-normal science approaches; social learning and feedback
- Instrument and governance: precautionary principles; anticipatory governance; democratic deliberation
- Legitimacy mechanism: epistemic pluralism; democratic legitimacy and inclusiveness
- Specific cases: height of Fukushima seawalls; asymptomatic transmission and data privacy

Note: adapted from Gim et al. [51].
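For readers who wish to apply the framework programmatically, the rows of Table 2 can be read as a lookup from uncertainty type to resilience work. The encoding below is a hypothetical sketch that abbreviates the table’s entries; the field names follow the column headers, and nothing here is part of the published framework beyond the table itself.

```python
# Hypothetical encoding of Table 2's rows; values abbreviate the table.
FRAMEWORK = {
    "technical": {
        "work_field": "operational, engineering approaches",
        "temporal_scale": "minutes to months",
        "legitimacy_mechanism": ["standardizing", "engineering protocols"],
    },
    "methodological": {
        "work_field": "regulatory and methodological innovation",
        "temporal_scale": "months to years",
        "legitimacy_mechanism": ["boundary organizations",
                                 "multi-disciplinary anticipation"],
    },
    "epistemological": {
        "work_field": "transformational approaches to policymaking",
        "temporal_scale": "years to decades",
        "legitimacy_mechanism": ["epistemic pluralism",
                                 "democratic legitimacy and inclusiveness"],
    },
}

def legitimacy_mechanisms(uncertainty_type):
    """Look up which legitimacy mechanisms the framework pairs with a type."""
    return FRAMEWORK[uncertainty_type]["legitimacy_mechanism"]

print(legitimacy_mechanisms("epistemological"))
```

Such an encoding is useful mainly as a diagnostic: if a governance response invokes only the “technical” row’s mechanisms for an epistemological uncertainty, the framework flags a likely mistranslation.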
Table 3. A checklist in the application of the “resilience for legitimacy” framework (Source: author).

Policymakers/public authorities
- Preparation: developing uncertainty scenarios; sharing information; funding for review and public deliberation
- Response and recovery: documentation of event analysis

Scientific advisory bodies
- Preparation: clarifying contested assumptions; coordinating with dissent; scientific communication
- Response and recovery: supporting institutional updates after post-event review

Industry operators
- Preparation: stress testing for surprises; enabling external feedback; compliance
- Response and recovery: transparent data sharing of operations during events

Civil society
- Preparation: local knowledge sharing; local interest matrix; community monitoring
- Response and recovery: participatory engagement and community feedback

Share and Cite


Gim, C. Mistranslation of Uncertainties: From Epistemological Uncertainties to Legitimate Resilience Governance. Systems 2026, 14, 273. https://doi.org/10.3390/systems14030273
