Review

Barry Turner: The Under-Acknowledged Safety Pioneer

School of Medical and Health Sciences, Edith Cowan University, Joondalup, WA 6027, Australia
* Author to whom correspondence should be addressed.
Safety 2023, 9(4), 68; https://doi.org/10.3390/safety9040068
Submission received: 26 April 2023 / Revised: 17 July 2023 / Accepted: 8 September 2023 / Published: 2 October 2023

Abstract: Barry Turner’s 1978 Man-made Disasters and Charles Perrow’s 1984 Normal Accidents were seminal books, but a detailed comparison has yet to be undertaken. Doing so is important to establish the content and priority of key ideas underpinning contemporary safety science. Turner’s research found socio-technical and systemic patterns that meant that major organisational disasters could be foreseen and were preventable. Perrow’s macro-structuralist industry focus was on technologically deterministic but unpredictable and unpreventable “system” accidents, particularly rare catastrophes. Andrew Hopkins and Nick Pidgeon have each suggested that some prominent writers who wrote after Turner may not have been aware of, or did not properly acknowledge, Turner’s work. Using a methodology involving systematic reading and historical, biographical and thematic theory analysis, a detailed review of Turner’s and Perrow’s backgrounds and publications sheds new light on Turner’s priority and accomplishment, highlighting substantial similarities as well as clear differences. Normal Accidents did not cite Turner in 1984 or when republished with major additions in 1999. Turner became better known after a 1997 second edition of Man-made Disasters, but under-acknowledgment by Perrow and others continued. Ethical citation and potential reasons for under-acknowledgment are discussed, together with lessons applicable more broadly. It is concluded that Turner’s foundational importance for safety science should be better recognised.

1. Introduction

1.1. Rationale for This Retrospective

Two major figures in the post-WW2 history and development of safety science and the understanding of accident causation are the organisational sociologists Barry Turner and Charles Perrow. Their seminal books, Man-made Disasters (MMD) by Turner in 1978 [1] and Normal Accidents: Living with High-Risk Technologies (NA) by Perrow in 1984 [2], and associated articles addressed how the interfaces between technology, people and organisations lead to major accidents and disasters in a variety of domains. Despite Turner’s earlier publication, his 1978 book is less known and acknowledged than Perrow’s.
A recent edited book by Pettersen Gould and Macrae [3] has contributed to an increased profile for Turner’s work but has limited discussion in relation to Perrow, whereas a recent book on Perrow and NA by Le Coze [4] made limited reference to Turner. Surprisingly, there is yet to be a detailed comparison of the two books or an examination of the extent to which Perrow [2] overlapped with, or was derivative of, Turner’s work and, if so, whether he appropriately acknowledged it. Andrew Hopkins, noting some parallels between Turner [1] and Perrow [2], stated that Perrow “apparently wrote his book in ignorance of Turner’s work, since he makes no reference to it at the time” [5] (p. 21). Nick Pidgeon [6] (pp. 212–213) has argued that Perrow [2] “elaborated extensively” upon aspects of MMD but left open the issue of Perrow’s direct knowledge. More broadly, Pidgeon has recently contended that Turner’s analysis was
so well done that it correctly anticipated developments and conclusions that other more prominent accident researchers would subsequently lay claim to (system complexity and uncertainty, how multiple failures undermine layered safety defences, the cultural blinkers that organisations adopt), even when those who followed claimed only superficial or no knowledge of Turner’s original writings.
[7] (p. 239, emphasis in original)
In an October 2022 episode of “The Safety of Work” podcast with David Provan, featuring Perrow’s NA, Drew Rae concluded that
I can’t see any sign that he [Perrow] was aware of Barry Turner’s work … The ability of people at different sides of the world to encounter each other’s work and understand where progress has been made relies on you knowing who else is working on the same things. … a lot of the work in safety, he [Perrow] never encountered, which I don’t think is his fault. Yeah, there are a lot of people today who’ve never heard of Barry Turner. … [Perrow] independently invented a lot of foundational thinking in safety, that he wasn’t the first to think of it, but he also did it without standing on the shoulders of other people who had those same ideas.
[8]
This paper stemmed from the primary author’s review of MMD in the context of other important publications involving accident causality, analysis and prevention. Many more similarities were found between Turner [1] and Perrow [2] than anticipated, yet NA, published six years later, made no reference to MMD. A range of other important safety scholars also appeared not to have seen, or to have ignored, understated or under-acknowledged, Turner’s work. This raised the issue of whether and when Perrow and the others knew of Turner’s MMD and associated work. Most of this review was researched and drafted independently of Hopkins and Pidgeon, before they provided drafts of the chapters [5,7] cited above and before the Pettersen Gould and Macrae [3] and Le Coze [4] books were published.

1.2. Key Aims

The paper’s first aim is to provide an historical and contextual exposition of the major accident and disaster books and theories of Turner and Perrow and their evolution, with a comparison of their work to better understand their contributions to safety science. The second aim is to test the priority and originality of each pioneer, the possibility that Perrow [2] came to similar views and theories independently of Turner [1], and how each cited the other. The third aim is to review the knowledge of Turner held by some other important safety pioneers and accident causation scholars, and their treatment of him and Perrow. A final aim is to consider reasons for the under-acknowledgment of Turner and potential ethical considerations linked to inadequate citation. By meeting these four aims, readers will gain a much greater understanding of the strengths and originality of both Turner’s and Perrow’s foundational influence on safety science.

1.3. Outline of the Article

Section 2 outlines the materials, methods and approach utilised to meet the stated four aims. Section 3 reviews central themes in MMD and Turner’s relevant background in an extended fashion to ensure readers are familiar with the scope of his work. Section 4 does the same for NA and Perrow, albeit with less detail because his work is better known. Section 5 assesses key similarities and differences between MMD and NA. Section 6 considers relevant developments in Turner’s and Perrow’s subsequent publications. Section 7 examines the awareness, acknowledgment and citation of each other’s work and establishes that Perrow had read Turner’s MMD while writing NA but did not acknowledge it, including in 1984 [2] and 1999 [9]. Section 8 considers citation of MMD and acknowledgement of Turner by major contemporaries and successors and finds cases of appropriate, mixed and poor citation of Turner, even after allowing for potential lack of awareness until the 1997 second edition of MMD [10]. Section 9 discusses some important issues arising and suggests reasons why Perrow’s work has been much better known and cited than Turner’s, highlights ethical issues associated with poor acknowledgment, and notes some study limitations. A conclusion summarises how the four key aims were addressed.

2. Materials, Methods and Approach

Turner’s MMD [1] and corpus of published work were carefully, comprehensively and “systematically” read from early 2020 and summarised. Many of Turner’s source documents were also considered. Perrow’s NA [2] and relevant corpus were similarly read from late 2020 before detailed comparisons were made between MMD and NA. Respective usage was annotated and transcribed to minimise omissions of relevant background or concept elaboration. This process established the earliest publication by each author of what have become key safety science concepts. Understanding the context of each author was also sought through biographical accounts. An assessment was made of whether similar terms in MMD and NA were being used in the same way or not, whether similar concepts were being described using different terminology, what was most important and unique for each author, and whether any contemporary explanation of words and concepts was required forty years after writing. A wide range of secondary literature was also considered, and an assessment was made of how MMD had been acknowledged in comparison with NA by important safety science pioneers and successors writing on accident causation, models and theory. With the exception of those (such as Pidgeon) writing with first-hand knowledge, the scope for discussing such work in this paper was mostly limited to sixteen examples of important safety science researchers/writers in the field.
The particular qualitative approach and method of “systematic reading” that was used evolved out of a broader literature review process and was developed by the primary author, having regard to the documentary data and research aims for this paper. This included historical, chronological, biographical and thematic theory analysis and an understanding of sociological frameworks. Le Coze is a prolific safety science researcher and methodologist who has recently advocated and utilised “systematic reading” [4], [11] (Section 2). Our approach was found to include a combination of three research strategies discussed by Le Coze in his review of NA: considering NA standalone; interpreting NA in light of Perrow’s earlier material; and including all of Perrow’s previous and subsequent writing. Le Coze favoured the third “complementary” approach [4] (p. 11). All three research strategies were utilised for each of MMD and NA, with a particular emphasis on the third. An additional strategy and method we used involved a detailed comparative analysis of the two books and their contexts and of relevant subsequent works, and an examination of the pioneers’ acknowledgment/citation of each other. Citation and acknowledgment of MMD and Turner was assessed in some detail for nine important contemporaries and close successors, with seven other researchers/writers reviewed in less detail. Brief comparisons were also made of their acknowledgment and citation of Perrow. To justify conclusions from the documentary material and to enable readers to review and replicate them, a deliberate strategy was employed of using extended quotations rather than just interpretive summaries, and of providing easy access to the large volume of source materials. This entailed searching for any available online source/DOI to list in the references and providing relevant page numbers rather than just a general reference. Periodic reading of drafts and critical textual review by the secondary authors was also significant.

3. Central Themes in MMD and Their Background

Barry Turner’s landmark book Man-made Disasters [1] was based on his PhD dissertation “The Failure of Foresight” at the University of Exeter [12] that used “grounded theory” to analyse 84 official British accident inquiry reports in various industries. In MMD, based on common patterns derived from some very different high-hazard industry cases, Turner proposed a “socio-technical” organisational model of disasters and accidents. This was to understand “failures of foresight” and included a sequence highlighting pre-accident work and culture norms, “incubation” of accident precursors, a “precipitating event”, and post-inquiry readjustment. Alongside this was a systemic explanation incorporating organisational hierarchy and structure, “negentropy” and “anti-tasks”, “system forgiveness” and “deviation amplifying feedback”. An accident analysis diagram was developed to assist understanding and learning [1]. Turner sought to provide a “general framework” for “understanding disasters as a socio-technical problem, with social, organizational and technical processes interacting” with emphasis in MMD “upon the more neglected social elements” which included the structure of information and communication in organisations with social and cultural context [1] (pp. 2–3, 5). Before Turner, all disasters and major accidents, and not just those that were purely natural, such as earthquakes, were largely treated as unique or random, so the focus was on response not prevention. MMD was the seminal work that demonstrated that it was possible to identify precursors and seek to improve foresight and prevention. Turner died in 1995 aged 57, and a second edition of MMD with an added chapter and updated language and references was completed by his close colleague Nick Pidgeon using notes Turner had left for the revision [10] (pp. xviii–xix).
Turner was born in 1937 into a working-class family in Birmingham in England. He studied and worked in engineering before completing a degree in sociology in 1966 [13,14,15], [16] (p. 101), [17]. Intensive fieldwork whilst an organisational sociology researcher at Imperial College in London led to the 1971 book Exploring the Industrial Subculture [17], which is considered a classic [18], and associated publications [19,20,21,22]. A less known 1975 book, Industrialism, had a multidisciplinary global focus [23]. Turner’s background and publications, particularly [17], were drawn upon in MMD. Turner [1] pragmatically balanced information theory, biology, physics, cybernetics and systems thinking with psychology, sociology and Berger and Luckmann’s [24] “social construction of reality” [16] (p. 281). He wrote that in 1979 Morgan had called him an “ontologically confused social realist” [25] (pp. 192–194), a label he wore as a badge of honour and “…always took more as a perceptive description than as a fault needing correction” [16] (p. 282). Before MMD, Turner published many key concepts [12,26,27,28,29,30,31], as can readily be seen in an abstract in the leading US journal Administrative Science Quarterly (ASQ)
Public inquiries into … three major disasters are examined and classified to study the conditions under which large-scale intelligence failures develop. Common causal factors are rigidities in institutional beliefs, distracting decoy phenomena, neglect of outside complaints, multiple information-handling difficulties, exacerbation of the hazards by strangers, failure to comply with regulations, and a tendency to minimize emergent danger. Such features form part of the incubation stage in a sequence of disaster development, accumulating unnoticed until a precipitating event leads to the onset of the disaster and a degree of cultural collapse. Recommendations following public inquiries are seen as part of a process of cultural readjustment after a disaster, allowing the ill-structured problem which led to the failure to be absorbed into the culture in a well-structured form. The sequence model of intelligence failure presented and the discussion of cases are intended to offer a paradigm for discussion of less tragic, but equally important organizational and interorganizational failures of foresight.
[27] (p. 378)
The initial MMD sequence model was based on the analysis of the official inquiries into three disasters: Aberfan (a coal mining waste tip collapse onto a Welsh school and village in which 144, mostly children, died), Hixon (a rail level crossing collision between a passenger express train and a slow-moving truck carrying a 120-ton transformer—11 died, 45 injured) and Summerland (a leisure centre building fire on the Isle of Man—50 died, 80 injured). The model comprised six stages, sketched in code after the list:
(1) a notionally “normal” starting point with culturally accepted beliefs about the world and its hazards, and associated precautionary norms that are followed through regulation and less formal practices;
(2) an “incubation period” when an unnoticed set of events or chains of “discrepant” events at odds with accepted beliefs and norms about hazards develop and accumulate;
(3) a “precipitating” event or incident, linking with the chain of discrepant events, produces a transformation revealing the “latent structure” of the incubation period and a “gap in defences that were previously considered secure”;
(4) the “onset” of a disaster or major accident follows immediately from the precipitating event, with direct and “unanticipated” consequences of the failure, and an onset of varying rate and intensity over varying scope and area;
(5) rescue and salvage—rapid and ad hoc changes in understanding and a first-stage adjustment to the disaster; and
(6) full “cultural readjustment” after an investigation into the accident or disaster to understand how it happened and to review hazards and associated precautions to seek to avoid future occurrences [1] (pp. 84–92), [27] (p. 381).
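As an illustrative aside, the ordering of the six stages can be rendered as a simple data structure. The following minimal Python sketch is our own paraphrase for exposition, not a formalism found in MMD:

```python
from enum import IntEnum

class DisasterStage(IntEnum):
    """Illustrative encoding of Turner's six-stage sequence [1] (pp. 84-92);
    the class and stage names are our paraphrase, not Turner's notation."""
    NOTIONALLY_NORMAL = 1       # accepted beliefs and precautionary norms hold
    INCUBATION = 2              # discrepant events accumulate unnoticed
    PRECIPITATING_EVENT = 3     # trigger reveals the latent structure
    ONSET = 4                   # immediate, unanticipated consequences
    RESCUE_AND_SALVAGE = 5      # rapid, ad hoc first-stage adjustment
    CULTURAL_READJUSTMENT = 6   # post-inquiry revision of beliefs and precautions

# The ordering carries the model's key claim: a long incubation (stage 2)
# precedes the precipitating event (stage 3), so foresight is possible.
assert DisasterStage.INCUBATION < DisasterStage.PRECIPITATING_EVENT
```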

3.1. Scope, Terms, Definitions and Data

Turner highlighted his concern for “an examination of some large-scale disasters that are potentially foreseeable and potentially avoidable … to look for a set of organizational patterns that precede such disasters.” [27] (p. 380) and “to consider whether there may be some general principles which could be formulated to deal with at least some ‘ill structured’ problems before they … [lead to a] disaster” [1] (p. 75, emphasis in original). Ill-structured referred to lack of necessary knowledge and framework for understanding, which created ambiguity [1] (pp. 64, 218 note 8). Ill-structured problems “often use symbolic or verbal variables, have vague, non-quantifiable goals, and lack available routines for solving them, so that rules of thumb or ad hoc heuristic procedures are commonly used” [27] (p. 21). Turner considered that disasters “may be regarded as transformations of socio-technical systems, in which the perceptions, expectations and understandings of those associated with the material world concerned are an integral part of the phenomena under study.” [1] (p. 188). MMD also provided a systemic “information-based view of disasters” [1] (p. 188).
The important concept of “variable disjunction of information”, based on Turner’s earlier research into engineering batch production [17] (pp. 126–128, 133–135), [22], was used in several places in MMD where different subculture groups had slightly different information and “theories” about what was going on, but the complexity and uncertainty could not be resolved because the level and distribution of knowledge changed dynamically and “time, money and energy are scarce” for obtaining additional data [1] (pp. 50–52, 61, 101, 217 note 43, 225).
Turner drew upon R.K. Merton’s classic 1936 article The Unanticipated Consequences of Purposive Social Action [32], using it in part to emphasise that lack of full knowledge of implications does not imply that unanticipated consequences of either individual or group actions will be undesirable [1] (pp. 131–132, 227–228). In addition to “bounded rationality” and “satisficing” decisions drawing from the work of H.A. Simon [33,34], Turner argued that “individuals within organizations may be thought of as having ‘perceptual horizons’ with regard to those things that are significant and important to them in the pursuit of their tasks, the positioning of these horizons being influenced and reinforced by institutional beliefs and norms” [1] (p. 59). This is partly the result of shared subgroup and organisational cultures and formal and informal communication channels leading to what will be routinised or not and what is an accepted “world view” [1] (pp. 120–121, 165–166). A parallel concept of “bounded decision zones” involved organisations limiting the information and environmental context that they will pay attention to, as a consequence of resource constraints and focusing on core goals [26] (p. 35), [1] (pp. 58, 165).
Turner’s working assumption was that “unless it was clearly shown to be otherwise, the patterns of behaviour displayed by the parties concerned were considered by them to be normal patterns of behaviour which could be indulged in without leading to disaster. … [they] followed a pattern of activity which was quite a normal one for them, and one which had been satisfactory in the past” [26] (pp. 21–22). His research had established that
It is rare that an individual, by virtue of a single error, can create a disastrous outcome in an area formerly believed to be relatively secure. To achieve such a transformation, he or she needs the unwitting assistance offered by access to the resources and resource flows … of large organizations, and time. The three accidents discussed here had been incubating for a number of years.
[27] (p. 395)
There are several places in MMD where Turner drew on his intensive fieldwork studies of work situations or further organisational investigations, comparing actual with intended work outcomes and what was normal work with what was imagined [1] (pp. 19, 51, 77, 84–85, 126, 130), [22] (p. 96); for example, he stated: “To prevent accidents and disasters … [among other things] We would need continuously to adjust the incipient discrepancies between the picture of the world envisaged in someone’s plan, and the way the world really is … [including as a result of] the social context within which such practices develop and are carried out” [1] (pp. 5, 194).
Readers may find the MMD terminology “man-made disasters” jarring in a couple of respects. While “man-made” employs a single gender to denote the human-influenced, as distinct from “natural” disasters such as earthquakes, gendered usage was still common in the 1970s, and Turner was leveraging “man-made” as a prior category used in insurance and other accident and disaster literature, such as Western’s [35] “etiologic” and “epidemiological” classifications [1] (pp. 12–14, 38). Turner also noted that multiple causality can challenge the division between natural and man-made disasters and that, over time, population growth and concentration, and human-influenced environmental impacts, would erode such boundaries [1] (pp. 1–2, 14, 190), [31] (p. 6).
Turner emphasised a sociological definition of “disaster” focusing on surprise and disruption of existing cultural understanding and norms, rather than necessarily an extensive loss of life and damage. The formal definition of disaster in MMD is “an event, concentrated in time and space which threatens a society or a relatively self-sufficient subdivision of a society with major unwanted consequences as a result of the collapse of precautions which had hitherto been culturally accepted as adequate” [1] (pp. 83–84). However, Turner sometimes used the terms “accident” and “disaster” interchangeably, acknowledging there was no agreement on the definition of either word and “study of disasters merges with the study of accidents, although for an accident to be labelled a ‘disaster’, it will probably need to be an unusually large-scale accident, an unusually costly accident, an unusually public accident, an unusually unexpected accident, or have some combination of these properties.” [1] (pp. 26, 82). However
If the only cause of an incident is an inappropriate response to a recognized warning, the incident is more likely to be one which we characterize as an accident: by contrast, in a pre-disaster situation, given the typically large accumulation of predisposing factors, the nature of the last error is relatively unimportant. … The incubation network only refers to those chains of events which are discrepant, but are not perceived or are misperceived. It is meaningful to compare accidents and disasters only in terms of incubation networks ….
[1] (p. 88, emphasis in original)
Turner sometimes used “incidents” as a term to include both accidents and disasters [1] (pp. 88, 171). For Turner “Accidents are produced as a result of the combination of misinformation or misunderstanding with sufficient energy to produce an undesired transformation. Disasters are accidents which are more surprising or more alarming than usual, and … this depends on a number of subjective elements” [1] (p. 184).
As mentioned, Turner developed his core precursor pattern themes using “grounded theory” [36], [37] (p. 3) from the three primary UK disaster reports [1] (p. 50 note 4), [12] (III. 5), [26] (p. 21). While precise patterns differed in the three cases, as would be expected in such diverse industries and contexts, some broadly similar themes were discovered and then tested with a further ten inquiry reports across multiple industrial sectors, to establish typical precursor patterns. Another 71 reports were then analysed to ensure robustness of the results. In Turner’s three initial cases, a large, complex, ill-structured problem, the limits and bounds of which were difficult to specify, was being addressed by multiple groups in separate organisations or separate departments within organisations [1] (p. 53). The 13 official British disaster and accident inquiry reports comprising Turner’s core case study data included 351 fatalities, ranging from two deaths to the 144 children and other residents at Aberfan [1] (p. 81, note 2). As Flin observed, most of the 84 reports analysed in MMD would not generally be considered “disasters” [38] (p. 88). They comprised 19 reports on mining accidents including Aberfan and the Cambrian Colliery and other “miscellaneous” accidents such as the Summerland fire, the Flixborough chemical plant explosion, the Hixon rail accident, and a laboratory smallpox escape; 33 boiler explosion reports; and 32 marine “wreck” reports that excluded fishing vessels. Turner omitted all 209 air accidents and 121 of the 122 rail accidents from the over 400 reports he originally contemplated [1] (pp. 202–203), [12] (III. 6), [28] (p. 755), probably deciding that 84 was sufficient as no new categories or inconsistencies emerged from the last 71 reports he had analysed.
In contemporary language, reflecting Turner’s own writing in and after MMD, Turner’s focus was predominantly unforeseen large-scale accidents involving death and major property damage such as a large fire, explosion or escape of contaminant, not all of which would commonly be described as “disasters”, but in all cases where decision making was intendedly rational, preconditions had incubated, and the onset and result were sufficiently unexpected and surprising as to require a substantial change in understanding and “cultural” adjustment after a thorough investigation or inquiry. His main interest was complex cases where there had been a “failure of foresight” rather than deliberate malfeasance or simpler, and more readily explained, types of accidents and failures, although these may also be illuminated by aspects of his sequence model and systemic socio-technical and informational approach. He also excluded smaller industrial Occupational Health and Safety (OHS) accidents that were focussed around individuals [1] (pp. 26–27, 31). Turner defined the incubation period as starting: “when the first discrepant event occurs unnoticed and it is brought to a conclusion by a precipitating incident which produces a transformation, revealing the latent structure of the events of the incubation period. A situation which had been presumed to have one set of properties is now revealed as having different and additional properties which must now be interpreted differently” [1] (pp. 89, 193, emphasis in original).

3.2. Incubation, Failures of Foresight, Prevention and Power

During incubation, Turner linked “precipitating incident” with Goffman’s [39] frame “triggers” [1] (p. 224) and varied his terminology from precipitating incident or event to, in some publications, “trigger” or “triggering” event [40] (p. 203), [41] (p. 366), [42] (p. 8), [43] (pp. 216–218). As noted, disaster incubation involving “complex chains” of discrepant events typically took several years [1] (pp. 87–89), [44] (p. 165). Turner asked a “largely sociological question”: “What stops people from acquiring and using appropriate advance warning information, so that large-scale accidents and disasters are prevented?” [1] (p. 195). Turner found that such “failures of foresight” [1] (pp. 50, 77, 92, 99, 107, 161, 170, 179, 217) occur during the incubation period in Stage 2 of his sequence model, as a result of:
  • Events unnoticed or misunderstood because of erroneous assumptions due to rigidities of belief and perception, “decoy” phenomena (focus on a problem that obscures a bigger one), and disregard of complaints from “outsiders”;
  • Events unnoticed or misunderstood because of difficulties in handling information in complex situations, information difficulties and “noise”, and the involvement of “strangers” (visitors and trespassers) on sites;
  • Effective violations of precautions passing unnoticed because of a cultural norm lag with existing precautions, including a failure to comply with regulations which may be unknown or out-of-date;
  • Events unnoticed or misunderstood because of a reluctance to fear the worst outcome, and minimising emergent danger and not taking action when things begin to go sour [1] (pp. 99–103).
These common themes and patterns were a Weberian “ideal-typical” or generalised formulation [1] (pp. 75–76 note 112, 222), [30] (p. 137) derived from analysis of the three initial cases [26] (p. 39), [45] (p. 344). Turner considered that these conditions and factors which might, in combination, produce a major accident or disaster, could potentially be addressed by administrators and managers to help avoid a disaster. He highlighted the importance of timely and accurate information and feedback, and proposed using the patterns found as something of a “conceptual tool-kit” and management checklist [1] (pp. 32, 75–76, 194–200). Turner also discussed the role of inquiries in converting ill-structured to well-structured problems (understood in hindsight) and enabling norm and cultural adjustment post-inquiry including through associated recommendations [1] (pp. 58–74). Turner noted that “disasters arise from an absence of some kind of knowledge at some point”, “failures of intention” and “the outcomes of misdirected energy”, so an aim of MMD was to consider foreknowledge from a socio-technical point of view and “demonstrate how an understanding of a number of aspects of disaster can be gained by seeking to discover the way in which knowledge and information relating to events provoking a disaster were distributed before the disaster … [the] ‘social distribution of knowledge’ about hazards … [before] energy is released at the wrong time, at the wrong rate or in the wrong place” [1] (pp. 3–4). Turner extended the energy focus on accidents and disasters by Gordon [46] and Haddon [47,48,49] by stating “the general principle that: disaster equals energy plus misinformation. If this proposition is accepted, it becomes clear that in order to understand the origins of disaster, it is necessary to study both the social and the non-social sources of both the energy and the misinformation which combine to produce disasters” [1] (p. 188, emphasis in original).
Turner’s 1971 research outcomes on industrial subcultures included the importance of symbolism, hierarchy, structure, power and context for conveying communication and meaning [17] (pp. vii, ix, 1–2, 4–7, 11–17, 38–41, 82–91, 127, 131–135) and the role of values and ethics [17] (pp. 92–96) and complexity [17] (pp. 128, 136). MMD drew on this as well as information and communication theory, with Turner noting that the success of a warning required it to be initiated in a timely manner, coded in an appropriate, accurate, unambiguous and reliable form, then transmitted, received by the correct recipient, and its reception and the response checked [1] (pp. 45–47). He discussed both warnings and learning in relation to the prevention of disasters, because both involved the treatment of relevant information needed in time to act within the incubation period ahead of a disaster trigger. Four information categories were distinguished: (1) that which is completely unknown and unsuspected—which he considered a rare but difficult-to-address category other than by better search procedures and imagining the worst [1] (p. 194); “(2) that which is known but not fully appreciated; (3) that which is known by someone, but is not brought together with other information at an appropriate time when its significance can be realized and its message acted upon; (4) that which was available to be known, but which could not be appreciated because there was no place for it within prevailing modes of understanding” [1] (p. 195).
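Turner’s warning chain is conjunctive: a warning succeeds only if every link from timely initiation through to a checked response holds. A minimal sketch, with hypothetical field names of our own rather than terms from MMD, makes this explicit:

```python
from dataclasses import dataclass, astuple

@dataclass
class Warning:
    """One hazard warning, with a flag per link in the chain Turner
    describes [1] (pp. 45-47); the field names are our own labels."""
    initiated_in_time: bool
    coded_unambiguously: bool
    transmitted: bool
    received_by_correct_recipient: bool
    reception_and_response_checked: bool

def warning_succeeds(w: Warning) -> bool:
    # Turner's requirement is conjunctive: a failure at any link leaves the
    # discrepant event unnoticed and the incubation period continues.
    return all(astuple(w))
```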
Hopkins [5] (p. 21), citing Pidgeon, suggested that Turner had not highlighted the issue of power in MMD. However, the importance of power, and of associated politics, is explicitly addressed on more than a dozen pages of MMD, including in relation to the risk of disaster from the concentration of power in large industries and organisations: “the sources of energy that men control, and which possess the potential for the creation of man-made disasters, are coming under the authority of centralized bodies and organizations, and are thus increasingly vulnerable to misuse if major errors are made at the centre … [involving those] in positions of power, those concerned with management and decision-making … growing concentrations of organizational power and the increasing growth of organizational interdependence” [1] (pp. 1, 3–6). Turner stated that, in addition to his sequence model
it becomes necessary, also, to start to take account of those other perennial concerns of the sociologist, the charting of the distribution of power, of the control of resources and of social reputation. … Powerful groups and organizations are able to specify the kinds of hazard that they recognize, to set out and implement the kinds of precautions which they think are necessary, and to exert their authority in intervening in areas which they regard as hazardous. … There may be confrontations between those who present an official definition of hazards and others who think that the situation is different … There is thus an overlay of differential power distributions which will affect knowledge, perceptions and expectations of accidents.
[1] (pp. 124–125, 152)

3.3. Turner’s Systems Approach

Among many MMD references relating to systemic factors and context, Turner stated that: “failures may arise, not merely because some component or sub-system is unable to function or to achieve its goal, but also, for example, because its goal may not have been devised as it ought to have been if some wider system, in turn, is to achieve its goal. The complexities created by such interconnections can be developed …” [1] (p. 19, emphasis in original). Turner’s systemic approach flows from a “modern” organisation’s “hierarchy of overlapping bounded decision zones, which has associated with it a hierarchical set of bounded rationalities, each defining a set of rules for describing a set of informational boxes within which the organization is prepared to receive messages about its environment” [1] (p. 170). Turner also drew from system thermodynamics: internal structures and hierarchies normally achieve the goals of an organization in an ordered way based on “man-made” energy inputs and processes that create negative entropy (“negentropy”), countering the system tendency towards the thermodynamic entropy of disorder and randomness. When there is an unexpected failure or emergent event internally, or via an external input, there is often an element of “system forgiveness”, such as Turner noted had been identified by Lindquist [50], that allows goals to still be achieved [1] (pp. 19–20). Turner also considered that “At times … outcomes from components, sub-systems or lower levels in the system which appear to be unacceptable … [may] be perfectly adequate … the system may accept as non-failures, results which are failures in formal terms” [1] (p. 20).
However, as Merton had explained, there may be failures of intent where unintended consequences emerge [32]. Turner stated that these will not be random but rather structured by the organisational system making “nonrandom use of the rules of the organization in their propagation”. Sometimes unintended consequences will be positive and sometimes negative, sometimes small and sometimes large, and large negative consequences can lead to system failure. Turner considered a key mechanism for these effects to be what systems and cybernetics theorist Maruyama [51] termed “deviation amplifying feedback” which Turner cited and applied to organisational accidents, where “the higher the level at which an error originates, the greater the chance it has of being compounded … in the course of its transmission down the hierarchy … to produce or magnify unintended consequences in a surprisingly ordered way” [1] (pp. 179–180, 236). Other systems writers considered and cited in MMD included Schrödinger [52] on negentropy and life, Brillouin [53,54] on thermodynamics and cybernetics, Shannon and Weaver [55] on information theory, Pask [56] on networks, systems properties and interrelationships, Rivas and Rudd [57] on “disaster-resistant systems”, and Thom [58] in relation to chaos theory and crisis [1]. Turner referred to the causal preconditions with structured unintended negative consequences as “anti-tasks” that could overcome negentropy and intended organisational goals. He summarised that
“All that is required is the introduction of unintended or unforeseen variety near to the organizing centre to produce a large-scale, but orderly error which makes use of the amplifying power of any ordered organizational hierarchy. If we consider organizational hierarchies as systems set up to carry out tasks, these ordered but undesired consequences could be regarded as ‘anti-tasks’ rather than as completely random errors. Large-scale disasters need time, resources and organization if they are to occur—if the ‘anti-task’ is to be successfully executed … [such disasters are] most unlikely to be met solely as a result of a concatenation of random events”.
[1] (p. 180)
In MMD, accident and disaster systems are orderly as they contain anti-tasks and order-seeking individuals, enabling Turner’s consideration of non-random “chaotic order”. MMD was also “concerned with a larger system which includes not only physical events, but also the perception of those events by individuals” [1] (pp. 185–186). Turner’s view of disasters as involving information and interlocking complex chains of causality [1] (pp. 88–89, 105) partly drew from cybernetic systems theory [25] (p. 6), [59].
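The amplification mechanism Turner takes from Maruyama can be illustrated with a toy calculation; the constant per-level gain below is our simplifying assumption, not a quantity found in MMD:

```python
def compounded_error(origin_level: int, depth: int, gain: float = 2.0) -> float:
    """Toy model of deviation-amplifying feedback as Turner applies it
    [1] (pp. 179-180): a unit-sized error introduced at origin_level
    (0 = top of the hierarchy) is multiplied by an assumed constant gain
    at each level it traverses on its way down to depth."""
    return 1.0 * gain ** (depth - origin_level)

# "The higher the level at which an error originates, the greater the
# chance it has of being compounded" in transmission down the hierarchy:
assert compounded_error(origin_level=0, depth=5) > compounded_error(origin_level=4, depth=5)
```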

3.4. Turner’s Multidisciplinarity and Optimism

Turner promoted the use of a multidisciplinary, and at times interdisciplinary, approach to study the nature and origins of disaster “in which psychologist, epidemiologist, engineer and manager might well co-operate with social scientists” [1] (pp. 31–32). He championed and undertook this in the 1980s [40] (p. 195), [14,16,60,61,62,63,64,65], albeit without an epidemiologist. He considered, citing Fischhoff [66] and Weick [67], that such a team must seek to avoid “hindsight bias” and “retrospective rationalisation” [1] (pp. 20 note 40, 209, 162–163 note 4, 234), and to understand local rationality
it is important that we should not assess the actions of decision-makers too harshly in the light of the knowledge that hindsight gives us. … it is necessary to look at the manner in which rationality becomes established and embedded within organizational procedures and habits, and to gain an understanding of the hierarchy of decision-making within which the individual administrator finds that he has to operate.
[1] (pp. 162–163 note 4, 234)
Turner was not completely optimistic about preventing major accidents and concluded MMD by stating that while
we can continue to try to improve … we are in a contingent universe, in which ultimately there are limits on our ability to reduce uncertainty, to master all of the open-ended and perverse qualities of our environment, and upon our ability to prevent disaster … We may come to realize that, even when our strategies are successful, they are still dependent upon the munificence of the environment and upon the mutability of fortune.
[1] (p. 201)
It can be seen that MMD included both a sequence model incorporating typical accident precursors based on qualitative case research in different high-risk industries, and a systemic organisational approach incorporating information, unforeseen variations, negentropy and amplification in a context of sociological thinking including norms, culture, power, hierarchy and structure. Turner’s socio-technical perspective was multivalent and encompassed several interrelated levels of analysis.

4. Central Themes in NA and Their Background

As an established professor of sociology, Perrow came to accident and disaster studies in August 1979, when he was invited by a former PhD student to provide social science input for the US Presidential Commission of Inquiry into the serious accident of 28 March 1979 at the Three Mile Island nuclear power station (TMI). Utilising his organisational background, a review of initial TMI Inquiry witness transcripts, and assistance from graduate students, Perrow developed his core argument for TMI inside a month [2] (p. vii), [68] (pp. 429–430). His TMI paper considered that the accident was “inevitable—that is it could not have been prevented, foreseen or quickly terminated, because it was incomprehensible … [it] is termed normal because it is inherent in the characteristics of tightly coupled, complex systems and cannot be avoided” [69] (p. 1), [70] (pp. 173–174). After several years of reviewing accident reports in the context of their high-risk industries, Perrow published Normal Accidents: Living with High-Risk Technologies in 1984 [2]. In NA, rare “normal” or “system” accidents were illustrated by a memorable 2 × 2 matrix of interactive complexity and coupling and contrasted with much more common, “prosaic” “component failure” accidents [2] (p. 327). A subset of system accidents could lead to “catastrophes”, but so too could some component failure accidents, such as those involving large dams [2] (pp. 343–344). NA was reprinted in 1999 with a substantial “Afterword”, material on “Y2K”, and additional references [9]. Perrow considered his central “normal” or “system” accident theory “unique” in focusing on systems
to chart the world of organized systems … [this] … constitutes a theory of systems, of their potential for failure and recovery from failure. As such, it is, I believe, unique in the literature on accidents and the literature on organizations. Perhaps the most original aspect of the analysis is that it focuses on the properties of systems themselves, rather than on the errors that owners, designers, and operators make in running them.
[2] (pp. 62–63)
Charles B. (“Chick”) Perrow was born in 1925 into a poor family in the small town of Tacoma in Washington State and died in 2019, at the age of 94, as Emeritus Professor of Sociology at Yale University. After service in WW2, he attended university and completed his PhD in sociology in 1960 on the topic “Authority, Goals and Prestige in a General Hospital” at the University of California at Berkeley [71] (p. 335). Perrow published some fine articles and reviews in organisational sociology from the early 1960s [71,72,73,74], including on “contingency theory”, in which technology influenced structure—and management should vary accordingly [75,76]. His first book, Organizational Analysis [77], was negatively reviewed [78] (pp. 338–339), and Perrow considered it “something of a quicky”, preferring his books from 1972 [68] (p. 424). He was proud of The Radical Attack on Business in 1972 [79], where his left-leaning critical views on corporate power and politics were fully on display, and of what became a classic book and textbook, Complex Organizations: A Critical Essay [80], with a third edition in 1986 [81], [82] (p. 131).
In his Introduction to NA, Perrow outlined the scope, purpose and focus of his review of high-risk “systems”
nuclear power plants, chemical plants, aircraft and air traffic control, ships, dams, nuclear weapons, space missions, and genetic engineering. Most of these risky enterprises have catastrophic potential … it is the possibility of managing high-risk technologies better … that motivates this inquiry. There are many improvements we can make that I will not dwell on … such as better operator training, safer designs, more quality control, and more effective regulation. … Rather, I will dwell upon characteristics of high-risk technologies that suggest that no matter how effective conventional safety devices are, there is a form of accident that is inevitable.
[2] (p. 3)
Depending on assessment criteria, around 50 accidents were discussed in some detail in NA, together with broader industry and system data [2]. Surprisingly, in NA Perrow did not cite MMD or Turner’s ASQ article [27], which appeared in a journal with which Perrow was closely associated as a contributor and editorial board member. We will draw, as space permits, from Perrow’s relevant prior work [69,70,75,76,79,80,81,83,84,85,86,87,88,89] and his comments about NA [9,68,81,90,91,92,93,94,95,96]. For example, in 1986 Perrow [81] stated that in NA, as he moved beyond the foundational TMI case
The inquiry … grew into a major analysis of a number of systems. … it is relentlessly ‘structural’. … Investigating a number of these accidents, I found a common pattern. While most accidents in risky systems stemmed from a major failure that could have been prevented, a substantial minority resulted from the unexpected interaction of two or more small failures. … The unexpected and generally incomprehensible interaction of small failures was found in all the complex systems I studied in any detail, including those with catastrophic potential … The sources of failure were diverse … The resulting accidents were ‘system accidents’, arising from the ability of the system to permit the unexpected interactions of failures. … Multiple, unexpectedly interacting failures in risky systems still might not be a serious concern if operators could intervene before significant damage occurs. But there is another system characteristic to consider … if coupling is tight, none of these safeguards is available. … the environment [once incorporated in analysis provides] further insight into the problem of safe systems … If we take an industry, rather than particular organizations, as the unit of analysis, we can see the impact of the industry and its ties to society upon the organization and its problems. … [e.g.,] comparing the error-inducing marine transport system with the error-avoiding airline system … The focus was on system interactions and control, whether at the level of the operator or elites deciding what kind of risks the rest of us should run.
[81] (pp. 146–149, 153, 155)

4.1. System Focus and Definitions

In order to focus on system characteristics among organisations in industries using high-risk technologies, Perrow said “it is important for analysis to treat humans in most systems as mere parts” [2] (pp. 63, 66). He divided a “system” into four levels of increasing complexity—part, unit, subsystem and system, with accidents reserved for “serious matters” involving the third and fourth levels, and incidents involving “disruptions at the first or second level” with damage “limited to parts or a unit”. Perrow’s “formal definition” of accident is: “a failure in a subsystem, or the system as a whole, that damages more than one unit and in doing so disrupts the ongoing or future output of the system”, but he said that “sometimes we will ignore it”. “Component failure accidents involve one or more component failures (part, unit or subsystem) that are linked in an anticipated sequence. System accidents involve the unanticipated interaction of multiple failures”. The types “are distinguished on the basis of whether any interaction of two or more failures is anticipated, expected or comprehensible” to system designers or trained operators, albeit Perrow said both accident types start with a component failure [2] (pp. 63, 65–66, emphasis in original).
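Perrow’s level-based distinction between accidents and incidents can be restated compactly; the sketch below is our encoding of his formal definition, not notation from NA:

```python
from enum import IntEnum

class Level(IntEnum):
    """Perrow's four levels of increasing complexity [2] (pp. 63-66)."""
    PART = 1
    UNIT = 2
    SUBSYSTEM = 3
    SYSTEM = 4

def classify_disruption(highest_level_damaged: Level) -> str:
    # Accidents are reserved for "serious matters" at the third and fourth
    # levels; incidents are disruptions limited to parts or a unit.
    return "accident" if highest_level_damaged >= Level.SUBSYSTEM else "incident"
```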
Perrow did not define “disaster” and simply said “we are concerned about those systems that have catastrophic potential—can cause damage to a great many humans”. He stated that NA’s concern is with “third- and fourth-party victims” not first-party operators or second-party associated personnel and system users such as passengers, but third-party “innocent bystanders” and fourth-party “fetuses and future generations” [2] (pp. 67–68, 257). In NA he stated that most high-risk enterprises “have catastrophic potential, the ability to take the lives of hundreds of people in one blow, or to shorten or cripple the lives of thousands or millions more” [2] (p. 3). In his subsequent Afterword, he defined “a catastrophe … [as] an accident that kills more than 100 people with one blow” [9] (p. 357). Here the “party” category of the dead is unclear. The (at least) “100 dead” criterion was not maintained later. Perrow highlighted that not all human-influenced catastrophes required complex systems, with dam failures—unless linked to a larger complex system—being a major exception [2] (pp. 13–14, 232, 254–255, 344–355). Neither Turner nor Perrow stuck to rigid definitions of their terms, which adds to the challenge of comparison.
Prior to NA, Perrow had used 2 × 2 matrices relating aspects of technology to organisational structure and noted that for poorly conceptualised problems “one draws upon the residue of unanalyzed experience or intuition or relies upon chance and guesswork”. His examples included “nuclear fuels”, located in a top-right quadrant 2 [75] (pp. 195–196), [77] (pp. 75–78)—the position where the complex and tightly coupled nuclear power and nuclear weapons systems of greatest concern to Perrow would later be located in NA’s famous 2 × 2 matrix [2] (p. 327).

4.2. Complexity, Coupling, System and Component Accidents

Perrow considered that the multiple failures in system accidents are likely to be in “reasonably independent units or subsystems” and can be termed “nonlinear”, “complex” and “complexly interactive” [2] (pp. 71, 78); with minimal “slack” and “tight coupling”, an accident caused by interactive complexity can spiral out of control. Unexpected, tightly coupled “dependent” events occur when failure in one “triggers” the other and “processes happen very fast and can’t be turned off … Indeed, operator action or the safety systems may make it worse, since … it is not known what the problem really is” [2] (pp. 4–6). Multiple interaction problems are “incomprehensible” for a critical period and “greatly magnified” because of factors such as “proximity, common mode connections, or unfamiliar or unintended feedback loops” within complex systems [2] (pp. 9, 82, 85–86). The combination of interactive complexity and tight coupling can produce Perrow’s “normal” or “system” accident. Towards the end of NA, Perrow wrote of complexity, tight coupling and normal accident inevitability that “by itself this argument is without practical application” and argued that it needed to be augmented by consideration of risk to society and “catastrophic potential”. He summarised this in a table that ranked inherent and actual system accidents and component accidents with catastrophic potential and considered the cost of alternatives [2] (pp. 342–344). In line with his introduction to NA, preventing catastrophes was Perrow’s [2] (p. 3) stated central goal.
Most industry cases in NA were considered to be “component failure accidents”, resulting from multiple failures of the factors summarised by the acronym “DEPOSE” (design, equipment, procedures, operators, supplies and materials, and environment) [2] (p. 8), and excluded the “unanticipated and incomprehensible” interactive complexity needed for a system accident. As noted, some component failure accidents, such as dam failures, could cause catastrophes.
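The classification logic of NA’s matrix and the component/system distinction can be sketched as follows. Only the quadrant-2 placement (complex interactions with tight coupling) is taken directly from NA; the rest of the encoding is our illustration:

```python
from enum import Enum

class Interactions(Enum):
    LINEAR = "linear"
    COMPLEX = "complex"

class Coupling(Enum):
    LOOSE = "loose"
    TIGHT = "tight"

# Perrow's DEPOSE sources of component failure [2] (p. 8).
DEPOSE = ("design", "equipment", "procedures", "operators",
          "supplies and materials", "environment")

def accident_propensity(interactions: Interactions, coupling: Coupling) -> str:
    # Only interactive complexity combined with tight coupling (quadrant 2
    # of NA's matrix, e.g., nuclear power and nuclear weapons) yields the
    # conditions for rare "normal" or "system" accidents; elsewhere, more
    # common, anticipated component failure accidents dominate.
    if interactions is Interactions.COMPLEX and coupling is Coupling.TIGHT:
        return "system (normal) accident possible"
    return "component failure accidents dominate"
```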

4.3. Perrow’s Sociological Background and Method

Perrow’s relevant earlier research included the importance of appropriate organisational centralisation and decentralisation [75,83,84,85]. In NA, Perrow outlined a “dilemma” wherein complex and tightly coupled systems in quadrant 2, such as nuclear power, also required loose coupling decentralisation to enable operator workers to deal with unexpected issues [2] (pp. 5, 10, 330–334). Later in the 1980s, High Reliability Organisation (HRO) researchers found cases where this combination was possible [97,98]. Perrow said that when he initially analysed TMI in 1979, he
performed a ‘hammer analysis’ … I had a primitive theory about complexity and coupling and when they handed me the transcripts I pounded them with it and broke it open … In thirty days I had produced a 45-page paper that applied the theory to Three Mile Island, to tanker collisions, aircraft failures, chemical plant explosions, and suggested why most factories would not have ‘normal accidents’ or, in a more technical term, what I called ‘system accidents’. The students at Stony Brook sent me a steady stream of material and critiqued my rough drafts and ideas.
[68] (pp. 430–431), cf. [92] (pp. 9–10)
Perrow claimed that in NA “organizations are at the center of our inquiry … [it is] a book about organizations … without the jargon and the sacred texts … [and] a book about technology” [2] (pp. 10–11). NA’s systemic focus was mostly on negative structural effects in organisations because of the nature of their “technology”. Detail of the groups, goals, management and other processes within particular organisations was largely, as with individuals, set to one side to focus at the macro industry and system level [2] (p. 3), [9] (p. 377), [81] (pp. 139–140). Perrow wrote a good deal about organisations and power in his articles and books from the late 1960s, notably his critical views on the power of large industrial and government organisations, bureaucracies and associated elites in the US. In NA he wrote that their expansion and concentration had increased the technological risk of major accidents and catastrophes. Perrow also considered that the “issue is not risk but power” by those who commission and assess risk analyses and “impose risks on the many for the benefit of the few” [2] (pp. 12, 306, 311). Le Coze argued that in addition to Max Weber, Karl Marx was the classical sociologist who shaped Perrow’s big-picture structuralist and “critical” (or radical) sociological stance [4] (pp. 15–18), [99] (p. 53) and highlighted Perrow’s structure and power focus from 1970, in contrast to US positivist functionalism with its conservative norms and assumed democratic outcomes from pluralism [4] (p. 12), [100,101].

5. Similarities and Differences between Turner’s MMD and Perrow’s NA

5.1. Similarities between MMD and NA

Apart from the obvious nationality, age and longevity differences, some parallels between Turner and Perrow may already be seen in the documentation and analysis so far. For example, both were Weberian organisational sociologists for whom power and politics were important; both used a form of accident report documentary case study method; and both focussed on unexpected events in high-risk industries involving multiple failures that, if triggered, could lead to a major accident, disaster or catastrophe that they sought to prevent.
Perrow’s early TMI summary of “the four characteristics of normal accidents: warning signals, equipment and design failures, operator errors, and unanticipated events” [70] (p. 175) demonstrated strong similarities to MMD. Both authors noted the importance of a gap between perceived and actual reality [1] (pp. 84, 94, 128–129, 138, 161, 194), [2] (pp. 9, 75). Perrow thought some warnings of an incomprehensible and unimaginable event cannot be seen because they cannot be believed [2] (pp. 9, 75, 351) and leveraged Karl Weick’s 1976 “I’ll see it when I believe it” [102] (p. 11). This had parallels with Turner’s “discrepant information”, “collective blindness” and “rigidities in perception and belief in organizational settings” [1] (pp. 58–59, 64, 71, 78, 88–89, 149, 153, 163, 169, 193, 195, 198, 200), [26] (pp. 35–36), [27] (p. 382), [30] (pp. 39–49), and particularly “that which was available to be known, but which could not be appreciated because there was no place for it within prevailing modes of understanding” stated in relation to Aberfan [29] (p. 12) and more generally in MMD [1] (p. 195). Both Turner and Perrow considered that a larger number of unseen, unexplained and unforeseen interactions were required for systemic accidents that led to a disaster or catastrophe.
In MMD Turner highlighted that “the kinds of energy … [being made] use of are inherently much more destructive … explosives … chemical plants … radioactive materials … [and intervening] more frequently and on a larger scale in the processes of the environment” [1] (pp. 1–2, see also pp. 4–6, 160, 199, 201). Shortly after MMD, Turner reiterated that: “the large national and international organizations which control many of the resources … are themselves undergoing an increasing growth and concentration, so that single errors or miscalculations are likely to have more far-reaching consequences than ever before” [103] (p. 53). Similar views were expressed by Perrow [2] (pp. 3, 102, 295–296, 311), [9] (pp. 354–355, 360) and on many other occasions before and since.
Examples of similarities and overlaps between Turner’s MMD and Perrow’s NA are summarised in Table 1, with some page reference examples bracketed. Each row highlights a similar concept; the table is illustrative, and many more rows could be added.

5.2. Differences between MMD and NA

At Imperial College in the University of London, Turner had been in a team led by Professor Joan Woodward, whose earlier research from 1953 had pioneered “contingency theory” [104]. Turner’s research preference and focus were ethnographic and cultural, and he left in 1969 [16] (p. 287), well before Woodward’s death in 1971. Perrow discussed Woodward’s work with her in the mid-1960s and in 1970 and respected it greatly. During a sabbatical year in 1972–1973 at the London School of Economics and Political Science (LSE), Perrow worked one day a week at Imperial College with remaining members of Woodward’s team [68,75,76,86,105]. As outlined earlier, there were major differences between Turner and Perrow in organisational sociology, academic background and interests that inevitably influenced MMD and NA.
Important differences between Turner and Perrow lie in the mechanism for some rare major system accidents. Unlike Perrow’s 2 × 2 matrix, Turner stressed informational difficulties when individuals and organisations seek to deal with ill-defined safety problems [65] (p. 245). For Pidgeon, Turner’s accident and disaster theory in complex systems “differed crucially” from Perrow’s by emphasising history and “social and cultural context” including “organizational, management and communication failings that occur before an accident” [106] (pp. 404–405), providing scope for prior detection and prevention.
The core “normal accidents theory” (NAT) 2 × 2 matrix of complexity and coupling was quite distinct from, and much more technologically deterministic and pessimistic than, Turner’s energy, information and symbolism/cultural focus. In MMD, Turner noted the possibility of rare organisational system events leading to large-scale accidents and disasters, but he provided a systemic explanation for most disasters related to organisational information, culture, subcultures, norms and hierarchies. Turner considered that the preconditions for disaster were “most unlikely to be met solely as a result of a concatenation of random events” [1] (p. 180). Perrow emphasised the similarity between normal events and those that lead to disasters, citing historical explanations of war, that “it is the obscure, accidental, and even random concatenation of normal disorders that produces a great event that we assume must have had great causes” [70] (p. 176), whereas “small beginnings all too often cause great events” [2] (pp. 9–10). Turner also established a longer “incubation” period, attributable to the impact of subcultures and organisational structure on communication (not seeing, understanding or reacting to the “discrepant” information and hazards), in contrast to Perrow’s rapid interaction timing, in which technology and scale influenced physical structures, proximity and events [2] (pp. 4–6).
Table 2 lists some key differences between Turner’s MMD and Perrow’s NA, with some example page references in brackets. The lists do not seek to be exhaustive, and the same number of items is included for each author. Some differences are starker than others, and most rows do not contain opposing or linked concepts; the concepts are simply different.
There is further analysis of similarities and differences between Turner’s MMD [1] and Perrow’s NA [2] and their evolution in the next section, especially Section 6.3, with additional discussion in Section 9.

6. Turner and Perrow after MMD and NA

This section considers relevant publications by Perrow and Turner in light of their landmark books and discusses some areas where one author adopted the ideas of the other or where both extended their analyses in similar ways.

6.1. Turner’s Work after 1978

Following MMD in 1978, Turner further developed his pioneering work [17] on organisational symbolism and culture [107,108,109,110] and in 1982 launched a working group that soon became the international Standing Conference on Organizational Symbolism (SCOS), with Turner on the Board and chair from 1988 to 1992 [13] (p. ii), [16] (pp. 291–294), [18] (pp. vii–xi). His leading work on qualitative and grounded theory methodology [12] (Section 3), [20,45,111,112] was also extended [3,16,113,114,115]. Turner fostered research collaborations involving sociology, engineering, psychology and learning [5,116] that led to a number of joint publications [14,16,40,60,61,62,63,64,65,116,117,118,119]. Safety science was far from Turner’s only focus, particularly after funding for a research team jointly led from 1983 by David Blockley, and including Nick Pidgeon and Brian Toft, ended in 1988. Turner also had administrative responsibilities as Dean and Head of Department at Exeter, worked in Italy as a visiting scientist in 1988–1989 and then in 1990 became Professor of Organizational Behaviour and Director of Research at Middlesex University Business School [42] (p. 19). Turner’s inaugural professorial lecture applied MMD theory to hazard management, emergency management and project management, as well as re-emphasising the need to learn from accident patterns in other industries [42] (pp. 4–5). Turner extended his work in MMD on safety culture, hazards, software and systems failure, poor management, and risk [15,16,41,42,43,44,107,108,112,120,121,122,123,124,125,126,127,128,129]. In the period before he died in February 1995, Turner was revising MMD and drawing upon its theory and his other major contributions in international collaborations on emergency management and security [128,130,131].
As noted, in the second edition of MMD, Pidgeon updated references for the original ten chapters and included additional material in a new Chapter 11. Pidgeon discussed this in a Preface, also written on behalf of Turner:
we do believe that the basic theoretical model set out here remains as relevant to concerns about understanding the nature and origins of acute failures of major socio-technical systems as it did … Account is … taken in Chapter 11 of the reports of work on ‘high-reliability organizations’, of the possibility of applying notions of organizational design to the encouragement of ‘safe’ organizations and ‘safety cultures’, and the more wide-ranging issues raised by a concern with institutional design as a way forward in hazard management. … Chapter 11, together with this Preface … was written by Nick Pidgeon, working initially from various notes Barry had compiled prior to his death.
[10] (p. xviii)
Second edition updates included the use and extensions of the MMD model, criticisms of its philosophy or coverage, and contributions to safety science since 1978 by Perrow, Reason, Vaughan, the HRO theorists, and Sagan, all considered by Turner and Pidgeon to be broadly complementary to MMD [10] (pp. 169–195). Organisational learning, considered important in MMD, received further emphasis in Chapter 11. With Toft, Turner had used the Schematic Report Analysis Diagram introduced in MMD [118,119], and Turner had supervised Toft’s 1990 PhD at the University of Exeter on “The Failure of Hindsight” [132].
In “Accidents and Non-random Error Propagation”, Turner [122] reinforced systems themes in MMD, stating that
A simple model to understand this nonrandom error propagation requires a description of the initial system structure in social and technical terms, specifying features such as the task and sentient boundaries of subsystems. … it is possible to trace the manner in which errors contributing to major system failures initiate structured consequences … When errors or distortions of intent appear … they interact with the negentropic or ordering properties of the system in which they occur to produce a novel chain of structured consequences. They create a small initial change which, depending on the location, timing, and structure, modifies the future arrangement of events in a manner that has its own logic and order. … In most accidents, it is axiomatic that there is never merely one starting point, but that there are at least six or seven … all of which must be taken account of in understanding the resulting multiple interaction patterns. … An unintended event will trace out those aspects of the preexisting system which it does not destroy. The error and the system intervention phases start in what we have previously referred to as the ‘incubation period’ of a large-scale accident, but they continue into the ‘onset’ stage and beyond … When unforeseen events occur, their consequences are strongly constrained by preexisting technical, task, and sentient structures. When the intervention is not strong enough to disrupt the structure completely, its consequences trace out a portion of the structure. We are thus encouraged to look for regularities in the apparently unstructured events surrounding large-scale accidents or large-scale system failures, and to reduce the extent to which we automatically assume ideas of ‘randomness’ will offer us an understanding of such phenomena.
[122] (pp. 437–444)
Turner’s socio-technical focus encompassed interrelated levels of analysis and included the close interdependence and interaction of technological hardware and software with the humans and social arrangements using them [40], [65] (p. 245), [123]. Turner also worked on “fractals and accidents” and was acknowledged for suggesting a degree of self-similarity between small initial errors, accidents and disasters that could explain structured “chaotic” system outcomes [133] (p. 37).
The importance in MMD of different hierarchical levels within organisations for norms, culture and communication was addressed early by Turner [17]. Later, in discussing safety culture, it was noted that
It is also possible to think of the culture of small groups of workers, of departments, of divisions and organisations as being nested successively within one another, and then to consider the organisational culture as being located within a national or international framework.
[65] (p. 249)
Turner developed his views on culture in a range of other publications, including to emphasise its roots
outside the precincts of classical rational-technical organizational theory and systems analysis … from positivism … [to] new interest in methods of qualitative inquiry and analysis … [to] symbols and culture in general … shared realities … a view based upon negotiation will see a complex of subcultures and counter-cultures … separating the ‘corporate culture’ from ‘culture in work’ which workers (and managers) weave for themselves while making sense of their experiences in the organization.
[108] (pp. 83, 86–87, 90, 94)
Turner raised concerns about technology and risk tolerability and the inequitable distribution of risk especially in relation to nuclear power and weapons [120] (p. 78) and later published a paper on safety culture in nuclear installations [130]. A 1994 article focusing on software and system failure drew from postmodernist philosopher Jacques Derrida’s “system of marks”, and in it Turner made clear that his emphasis on organisation should not be (mis)characterised as simply another element in a sequential model of accident causation
the classificatory world view emphasizes a changing and kaleidoscopic perspective in which symbols exist within a frame, and in this perspective symbols of reversal are seen as expected and nourishing. By contrast, the instrumental world view, a more technological and purposive one, emphasizes the sequential harnessing of means to an end. The instrumental view threatens and is threatened by symbols of reversal. … Some safety specialists seem to be confident that accidents can be instrumentally eliminated from organizations, especially now that the model of accident generation has been completed by the identification by some of them of the role of ‘organization’ as the final ‘variable’ contributing to accidents. This view, however, can only be sustained by pushing the instrumental view to the centre and suppressing or eliminating the negative and the inversion. A control system is effectively a system of marks, but by reframing, by allowing the marks to migrate, other possibilities come into view. Management, including hazard management, must take an instrumental view of the world almost by definition. But unless some of the potential for reversal and transformation is recognised, managerial activities will repeatedly be threatened by apparently inexplicable and uncontrollable transformations, upsets and contingencies.
[126] (p. 37)
Turner’s concluding sentence above has parallels with Perrow’s NA [2] but from a very different perspective.

6.2. Perrow’s Work after 1984 and Assessment of Normal Accidents

Following NA, Perrow was sought after as a consultant and speaker, completed two books [134,135], updated three others [9,81,94] and published numerous chapters, articles and reviews [68,90,91,92,93,95,96,105,135,136,137,138,139,140,141,142,143,144,145,146,147,148,149,150,151,152,153,154]. Into his 90s, Perrow promoted themes in NA, arguing for the abolition of nuclear power and weapons, applying NA theory (NAT) to environmental and other disasters, challenging powerful US organisations and institutions, and seeking better regulation [90,92,94,96,139,153,154]. He encouraged and endorsed work by younger scholars such as Gephart [155], Sagan [156] and Clarke [157,158,159] in NA’s Afterword [9], and later included Snook [160]. He debated with the American HRO theorists, despite their support for NAT [90,161,162,163]. Whereas Turner had emphasised safety culture, learning, information and addressing other accident preconditions, Perrow emphasised structural remedies such as reducing energy concentration, ending nuclear power, modularisation and the use of countervailing power. Reflecting, he considered himself
the kind of sociologist who emphasizes the overriding importance of power and interests in society, rather than the kind that emphasizes nurture, culture, or common humanity … [one who promoted better] structures … organizational forms, laws … Context shapes behavior, but the temptation to self-interested behavior is always there and must be fought.
[152] (p. 92)
Section 4 highlighted Perrow’s 1986 summary of NA [81]. In 2004, Perrow [92] posed a question and answered it with a bold claim: “What is the future of normal accident theory? Because no one has really said it is wrong (I think the most critical review was from anthropologist Mary Douglas, unfortunately in a sociology journal, and she just found it hard to follow), I guess it will stay around as it becomes assimilated and part of the background” [92] (p. 13, emphasis added). This claim is unreasonable in several ways. Dame Mary Douglas was a brilliant British anthropologist who also wrote many books and articles on safety and risk over four decades to 2004, including the book Risk and Culture published in 1982 with Aaron Wildavsky [164], which Perrow had cited somewhat negatively in NA [2] (pp. 311, 364 note 16). Douglas’s review of NA in 1985 [165] was positive about Perrow’s use of Weick’s “coupling” concept in relation to accidents, but considered that while NA’s “general idea is so good”, Perrow had embraced “complexity” from information theory, where it was a precise “if this, then that” term, while at the same time adopting “its opposite, confusion, ambiguity, disconnectedness”, and that overall his “swashbuckling style obscures a rather cavalier development of the argument” [165] (pp. 171–173). Perrow’s “no one has really said it is wrong” [92] also ignored Hopkins’s [166,167] withering critiques of NAT, criticism in reviews of NA by McGill [168], Hirschhorn [169], Kates [170], Roberts [171] and Rossi [172], and some critical comments by Turner and Pidgeon [10] (p. 179). Perrow’s own Afterword in NA had stated that a 1995 book by Wildavsky [173] included “an explicit rejection of Normal Accident Theory” [9] (p. 366), though perhaps by 2004 he had found that assessment to be incorrect. Of course, many reviews of NA were positive [174,175,176,177,178] and Perrow is widely cited and acknowledged as a safety science pioneer, e.g., [179,180,181]. Most recently, in Post Normal Accident, Le Coze built on an earlier article [182] to move away from NAT and to emphasise a second thesis in NA’s rich industry analyses, associated with “component failure” accidents, which he aligned with themes in Hopkins’s disaster studies in a provocatively titled chapter, “Hopkins, the Unofficial Theorist of NA” [4] (pp. xii, 35–65). Hopkins [166,167,183] was not only critical of Perrow’s main thesis but gave particular credit to Turner for his qualitative method and for some important safety themes that Hopkins had embraced [5], [184] (pp. 10–11, 16–18), [185], [186] (p. 54), [187], [188] (pp. 110–111).
In his Afterword, Perrow referred to a “normal accident” as a “metaphor”, including in a figure titled “The Metaphor Seeps” [9] (pp. 354, 387). In 2006, he suggested the term “normal accidents” was used, in part, to move the focus away from “operator error” and so he “coined the deliberately paradoxical phrase that became the title of my book” [143] (p. 47). “Normal accident” was always startling nomenclature, but it helped to draw attention to Perrow’s theory, advocacy and book, and to shift attention away from operator error.

6.3. Similarities and Differences after Publication of MMD and NA

Both Turner and Perrow referred to “drift” in their publications. Turner [21] (p. 66) first wrote about factory “drift inducing mechanisms” and later, with Toft, considered that “organisational personnel may have drifted into some form of complacent behaviour”, aiding the development of an “incident” [118] (p. 15). In MMD and after, discrepant events during incubation were associated with drift [1] (pp. 86, 88), [43] (p. 216). Perrow had referenced planning “drift” and stated that “Unplanned aspects of organizations are those which are subject to little administrative control and are not even noticed until their effects are quite evident, if even then” [80] (p. 179). He considered that while drift occurred as a result of “incompetent leadership”, deliberate management choice based on personal goals or “extraorganizational interests” should also be considered [81] (p. 263). In 1994, Turner reinforced his critique in MMD of top managers’ unrealistic view of safety, ignorance, deliberate misinformation, blindness and lack of control, and inadequate creativity and proactivity [1] (pp. 63, 66–67, 77–79, 125) to highlight “sloppy management”, including lack of caring attention, inadequacies of control, unprofessional behaviour and the need to make trade-offs explicitly [43] (pp. 215, 218). However, his balanced position noted that “Some disasters are caused by inadequacies of management, or by unprofessional behaviour but others arise when disaster preconditions are generated as a result of normal functioning of larger managerial and technical systems” [43] (pp. 215, 217–218). After NA, Perrow ratcheted up his critique of incompetent leadership [81] (p. 263), subsequently discussing “Executive failure” [135,142] and ultimately preferring “Executive malfeasance” [94] (p. viii). Turner wrote in MMD that managers and the powerful largely define the hazards society will face and how they might or might not be addressed [1] (pp. 4, 72, 124–125, 132, 152, 191). In addition to NA [2] (pp. 12, 14, 306–307, 311), Perrow, drawing from his earlier critical sociological analysis, reinforced a similar concept of owner operators and elites imposing risks on society [81] (p. 155).
Part of MMD’s systems approach to disasters referred to structured error magnification linked to “deviation amplifying feedback” [1] (pp. 179–180, 236). Perrow extended his concept of “negative synergies” [2] (pp. 82, 88, 98) to suggest that 2+2 could equal 5 or “5 million” (implying negative amplification) potentially leading to catastrophe [81] (p. 147). MMD and Turner’s associated publications highlighted the key role of population and demography in disasters and major accidents [1] (pp. 1, 4, 14). Absolute numbers, population movements, population density, and proximity of population to hazards, were all relevant [103] (pp. 53–55). After NA, Perrow noted the importance of population generally [81] (p. 140) and considered its importance for major accidents in The Next Catastrophe [94] (pp. 7, 14, 19, 188, 196–197, 209, 319).
By 2004, Perrow, using the term “disaster” in a contemporary manner, stated that
Only a few disasters, I believe will be exclusively due to design or human factors failures that cannot be attributed to higher level explanations. Any accidents, as opposed to disasters, can be traced to operator error … Disasters require a configuration that is more likely to be due to organizational and sociocultural factors. A few of these will be what I call ‘system accidents’, inherent in systems that are complexly interactive and also tightly coupled … The vast majority of disasters will be due to organizational, and ultimately, to sociocultural factors.
[92] (p. 284)
Leaving aside Perrow’s overly broad claim about “operator error” accidents, his conclusion that organisational and “ultimately sociocultural” factors are required for “the vast majority” of disasters closely follows Turner’s in MMD. While Turner is not mentioned, this is a major change. Later, Perrow [94] (p. 1), [135] (p. 1) used the term “man-made disasters”, with a rare subset of these leading to a “catastrophe”. Perrow had earlier referred to “male-made disasters” [9] (p. 364). Notwithstanding some definitional and terminological confusion involving both authors, strong parallels are evident.
Organisations have multiple goals that may conflict, and Perrow used “garbage can theory” [189,190] in NA, noting in the Afterword [9] (pp. 368–369, 374) that this had been extended by Sagan [156] and Clarke [157,158,159]. On the one hand, Perrow considered that the metaphor of garbage cans for storing and drawing out ideas could help to explain the randomness and unpredictability of major accidents better than H.A. Simon’s bounded rationality [33,34]; on the other, he considered the theory a “poor digging tool” for organisational analysis compared with bounded rationality and his structured neo-Weberian categories [81] (Chapter 4). In contrast, at the organisational level, while Turner [120] (pp. 52–53, 65–66) was well aware of garbage can theory, he extended Simon using bounded decision zones and stressed that internal hierarchy and communication structures, both formal and informal, imposed order on errors, so that the incubation and progression of disasters was typically non-random and structured [1] (pp. 179–180), [122].

7. Turner and Perrow: Acknowledgment and Citation

This section discusses how each author cited and acknowledged the other’s ideas and publications in the course of their careers. As noted in the Introduction at Section 1.1, Hopkins has stated that in NA, Perrow “apparently wrote his book in ignorance of Turner’s work, since he makes no reference to it at the time” [5] (p. 21), an assumption also made by Rae [8]. Pidgeon suggested that some prominent accident researchers claimed superficial or no knowledge of Turner [7] (p. 239). The current research tests and extends the perspectives of Hopkins, Rae and Pidgeon by examining whether or not Perrow wrote NA in ignorance of MMD. Our research did not assume that any overlap was derivative; overlap could equally arise from the authors’ separate paradigms, sociological perspectives, contexts and research analyses.

7.1. Perrow’s Knowledge of MMD

Turner reportedly telephoned Perrow when he saw that NA had been published in 1984 with no reference to MMD, and Perrow reportedly said that his publishers had made him remove the reference along with all references to European works [110,191]. A few months later, in a presentation on NA at Imperial College, Perrow reportedly stated that his publisher had required the removal of a literature review section that included reference to Turner [191,192].
On 21 November 1995, following Turner’s death on 24 February, Perrow emailed a response to a note from Mrs Turner [110] inviting him to contribute to a new volume and, in an attached one-page typed letter, stated that he “came across Barry’s disaster book when I was writing my own, and discussed it briefly in a ‘review of the literature’ section which my editor convinced me I should delete … I always regretted that in Barry’s case” [193]. In relation to Mrs Turner’s request for a written contribution, Perrow wrote
I just don’t feel that close to Barry’s work, so about all I could say would be that it was the earliest attempt to think through the matter of disasters in organizational terms, and thus very useful and insightful. To say so little would not serve much purpose. We approached disasters very differently, and indeed organizational analysis in general. Barry’s work has been strongest in the culture area, while I have been much more concerned with structure. He also had a catholic interest in disasters, whereas my concern has been quite narrow, focusing upon a small set of (very risky) systems, and even then with structural causes and little concern with the recovery phase, or even prevention. So there is not much in common.
[193]
Leaving aside the commonalities we have identified, and the culture versus structure differences, the latter part of the quotation is curious given the breadth of industry accidents and systems analysed in NA and Perrow’s stated purpose in NA of managing high-risk technologies better to prevent accidents and catastrophes [2] (p. 3). Following the 1997 second edition of MMD, NA was reprinted in 1999 with an extensive 59-page “Afterword and Postscript” and an additional five-page bibliography. Jens Rasmussen had been acknowledged in NA’s first edition [2] (p. ix), but in the 1999 book there was still no reference to MMD or to any publication by Barry Turner, even though the Briton James Reason (as well as the Dane Rasmussen) was among the Europeans added to the bibliography [9] (pp. 371, 438).

7.2. Perrow’s Citation of Turner

Although it has now been established that Perrow had read MMD before 1984 and said he “always regretted” not citing it, only two references to Turner were found in any of Perrow’s published papers up to 2019, both in 1994. The first is in an article in the 1994 Journal of Contingencies and Crisis Management [194], a journal for which Turner had been an editorial board member since its inception the previous year [193] (p. 71). In the article, which responded to criticism of Sagan [156] by La Porte [161], Perrow wrote
One might note one infrequent, but perverse, barrier to learning at this point, originally identified, I believe, by Turner (1978: 224) in Man-made Disasters, where accident investigations convert ill structured problems into well structured ones (see also Vaughan, 1994). Accident investigations are ‘left censored’ in that they examine only systems that failed, not the ones with the same characteristics that have not failed.
[90] (p. 214)
The reference to Turner is strange, particularly since page 224 of MMD [1] is a page of references and notes unrelated to the less-than-clear point about learning being made by Perrow. While there may have been a page number error, in MMD Turner was sensitive to hindsight bias [1] (pp. 100, 162–163, 173), had referenced systems that generally worked normally [26] (pp. 21–22), [27] (p. 395) and had considered near misses [1] (pp. 96, 182). Both Turner’s and Perrow’s analyses of major accidents used “censored” accident investigation and inquiry reports.
The second reference is in a rather negative review by Perrow of the 1994 first edition of a book on disasters by Turner’s close colleague Brian Toft, based on Toft’s PhD and written with Simon Reynolds [195], in which, inter alia, Perrow wrote
Nevertheless, there is a reasonably interesting ‘schematic report analysis diagram’ that analyses the Cambrian Colliery accident of 1965, based on the work of B.A. Turner (as is a great deal of the book), a pioneer in the accident field. It outlines the numerous failures and shows how the investigating committee ignored some of the more important ones. The diagram is useful for investigating committees, but when enlarged as a generic blueprint for an ‘organizational learning system’, as it is in the final chapter, it mimics the failure of 1960s system theory: everything is (equally) important, connected and must be taken into account.
[138] (p. 607)
The Cambrian diagram material was part of MMD [1] (pp. 97–98), and whilst Perrow acknowledged Turner as a pioneer, given Turner’s further emphasis on learning [1] (pp. 157–158) and his subsequent use of these diagrams with Toft [118,119], Perrow may unfairly have associated Turner with “failed” 1960s system theory and organisational learning. If Perrow meant general systems theory (GST) [59,196,197,198,199] rather than the early “structural functionalism” social systems of Talcott Parsons [100,101], it should be noted that Turner understood that while “small closed system” models are often overly limited abstractions, and “any real decisions can only deal with open systems” [16] (p. 287, emphasis in original), fully open system models such as the GST he had studied in the 1960s can be excessively complex. He therefore paid “particular attention to the manner in which bounded rationality operates within hierarchical organizations, and look[ed] at the kind of event associated with the failure of bounded rationality” [1] (pp. 135–136). While often different from Perrow’s, Turner’s use of systems theory was nuanced, not naïve, and he used middle range theory, models and diagrams to capture key aspects of system complexity in an understandable manner that could aid comprehension and learning. Beyond these two brief references, Perrow (e.g., [9]) did not acknowledge other substantive and relevant themes in MMD or in Turner’s other work.
While there are substantial differences between NA and MMD, the large number of parallels and similarities (e.g., Table 1) makes Perrow’s decision to omit any discussion and acknowledgment from NA in 1984 difficult to justify, regardless of editorial preference. It is even harder to justify given that in 1995 he said he “always regretted” it [193], yet the omission was still not addressed when his 1999 re-publication of NA, with its extensive Afterword and additional bibliography [9], followed the 1997 second edition of MMD [10]. Several other relevant articles and books (e.g., [92,94,96,135]) also ignored Turner.

7.3. Turner’s Citation of Perrow

In contrast with Perrow’s practice regarding Turner, Turner cited Perrow and his work throughout his academic career, from 1970 until he died [10] (pp. 178–179, 186), [16] (p. 295), [19] (pp. 3, 16), [20] (p. 81), [41] (p. 377), [60] (pp. 362–363), [65] (p. 246), [117] (p. 3), [120] (p. 69), [124] (p. 243), [125] (pp. 191–192), [130] (p. 329). In a paper prepared for a World Bank workshop, Turner et al. stated that: “The interactive complexity of events associated with large-scale accidents is discussed both by Turner (1978) and by Perrow (1984). Their accounts suggest that disaster results from unanticipated and complex interactions between sets of contributory causes that would be unlikely, singly, to defeat established safety systems” [117] (p. 4). Similar text, also citing Perrow [2], was included in [65] (p. 246).
Chapter 11 of MMD’s second edition, in Section 11.2 “Complementary theoretical frameworks”, discussed Perrow on four pages with associated notes [10] (pp. 177–179, 186, 228–230, 233), covering NA [2], Perrow’s autobiographical account [68] and Perrow’s “enhancement of NA” article [90]. A table was included to summarise NA’s complexity and coupling dimensions, and the discussion noted that Perrow’s
account shares with man-made disasters the view that major accidents in socio-technical systems arise from the interaction of a chain of unanticipated errors and misunderstood events in complex and ill-structured situations. However, the basic model differs from man-made disasters, being focused primarily upon the prior structural properties of complex technical systems, rather than upon the ways in which disasters develop unseen over time.
[10] (p. 178)
Citing Karl Weick’s 1993 analysis of the Tenerife 583-fatality airport disaster [200] and the HRO [161,162] debate with Perrow [90] mentioned earlier, Turner and Pidgeon stated that
In practice the concepts of ‘complexity’ and ‘coupling’ have turned out to be difficult to use analytically and it seems likely that they are not fully independent from each other, in that both express aspects of the fundamental complexity underlying dynamic and ill-structured systems. … Perrow’s original account appears overly deterministic, having been derived in the main from an analysis of the structural properties of technology and technological systems. … he does not clearly specify … whether similar effects are produced by both organizational and technical complexity and interdependence. However, his analysis does … [draw] attention to the safety implications of the growing complexity and interdependence of today’s most advanced industrial systems.
[10] (pp. 179, 230)

8. Citation and Acknowledgment of Turner by Other Important Accident Causation and Theory Scholars

This section considers knowledge of Turner and MMD and its use and acknowledgment by major safety science contemporaries and temporally close successors working on accident causation and theory, also noting their citation of Perrow and NA [2]. Hale, Weick, Rasmussen and Reason are the main contemporaries discussed. Hollnagel, Leveson, Hopkins and Vaughan mostly wrote as close successors and are discussed with Dekker, who is younger. Seven others are considered in less detail: Shrivastava, Sagan, Snook, Le Coze, Macrae, Hayes and Quinlan. In addition to earlier references, significant safety science contributions by Turner’s colleagues Pidgeon [5,7,106,201,202], Blockley [203,204,205], Toft [206,207] and Gherardi [208,209,210], and their acknowledgment of Turner, continued after his death and subsequent to the revised edition of MMD [10].

8.1. Citation of Man-Made Disasters and Normal Accidents

Forewords by Uriel Rosenthal and Diane Vaughan to MMD’s second edition [10] considered that MMD had been influential, especially in Europe. However, Vaughan stated that despite being “accompanied by two articles in well-regarded journals in the United States, the book nonetheless was seldom cited” [211] (p. xi). For Rosenthal, MMD “was received as a curiosity. … Nobody questioned the quality of the research, the more so since the author’s main argument had already found its way into the prestigious American-based Administrative Science Quarterly” [212] (p. vii). Based on Google Scholar, there were 173 citations for MMD from 1978 to 1996 and 1350 for NA from 1984 to 1996; 1850 for MMD and 10,100 for NA from 1997 to 2016; and 593 for MMD and 3021 for NA from 2017 to 29 May 2021, when the data were extracted [213]. Based on these data, the average number of annual citations for MMD was 9.1 for 1978–1996, 92.5 for 1997–2016, and 136.3 for 2017–2020 (the last full year). The average number of annual citations for NA was 103.8 for 1984–1996, 505 for 1997–2016 and 699.8 for 2017–2020. From its 1984 publication, NA was cited at more than ten times the annual rate of MMD until 1996 and at over five times the annual rate from 1997 to 2020 [213]. The increasing citation rate of both books is likely to reflect the much larger number of safety science articles published after 1997 [214] (pp. 69, 72).
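The quoted averages are simple quotients of total citations over the number of full calendar years in each period. As a minimal illustrative sketch in Python (not the authors’ analysis script), assuming only the aggregate Google Scholar totals quoted above, the full-year rates can be recomputed as follows:

def annual_rate(total_citations: int, first_year: int, last_year: int) -> float:
    """Average citations per calendar year, inclusive of both endpoint years."""
    return total_citations / (last_year - first_year + 1)

print(round(annual_rate(173, 1978, 1996), 1))    # MMD 1978-1996 -> 9.1
print(round(annual_rate(1350, 1984, 1996), 1))   # NA  1984-1996 -> 103.8
print(round(annual_rate(1850, 1997, 2016), 1))   # MMD 1997-2016 -> 92.5
print(round(annual_rate(10100, 1997, 2016), 1))  # NA  1997-2016 -> 505.0

# Note: the 2017-2020 averages (136.3 for MMD, 699.8 for NA) used full-year
# counts only; the totals quoted above run to 29 May 2021, so they cannot be
# reproduced from these aggregates alone.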

8.2. Andrew Hale

Psychologist Andrew Hale’s safety science roles and research output are longstanding, distinguished and well known, including in the area of accident causality [215,216,217]. A highly cited chapter that categorised history and schools within safety science appropriately referenced Turner [1], with several others including Perrow [2], under “Analysis and prevention of major disasters”, and Turner’s research group including Pidgeon, Blockley and Toft was categorised under “Studies deriving from the sociotechnical tradition and from Turner’s analyses of major disasters” [218] (pp. 133, 138, 164–165). However, despite the caveats in the article, it is difficult to understand the justification for ascribing Turner’s “Speculative” “High technology” “Case study” “Explicit theory” to “Perrow, sociotechnical” [218] (p. 143, Table 11.1, line 31). “Perrow” was defined to include “Normal accidents theory developed by Perrow in 1984 and tested and developed by others”, with Meshkati [219] and Sagan [156] listed. Perrow’s “High technology” “Case study” “Explicit theory” was stated to be “Systems” [218] (pp. 138, 143, 164). Use of “sociotechnical” principles as “Explicit theory” was illustrated by the Scandinavian school of participative management and ten papers [218] (pp. 137, 141, 161–162) unconnected with Turner. A separate “Sociological” category has Dwyer [220] as the sole representative, although his “Stage of work” in Table 11.1 is “completed book 1978” rather than his actual 1991 book [218] (pp. 137, 141, 162). Some of the foregoing references and characterisations are quite confusing and understate Turner’s priority and contribution.
The breadth of Turner’s work appears to be further under-acknowledged in tables analysing research contents. Categories assessed as not mentioned in the “(North) London” group publications, which included MMD and subsequent publications by Turner and colleagues mostly at Exeter [218] (pp. 164–165), were “goals”, “policy, plans”, “structure, hierarchy”, “supervision”, “feedback, audit”, “organizational learning”, “adaptive capacity”, “group norms, boundaries”, “experiential knowledge, competence”, “trade-off, long-term strategy, order seeking”, “top management commitment”, “common goals, attitudes of colleagues”, “multicausality”, and “safety culture, fit of system to organizational culture” [218] (pp. 148, 150–152). Our review of Turner’s work would support including many of these categories based on MMD [1] alone, and most of them if Turner’s other publications and those of close colleagues up to 1998 are considered. For example, Hale et al. [221] (p. 123) and Hale and Hovden [218] (p. 165) had referenced Turner [124] on “safety culture”. To ascribe Turner’s “Explicit theory” to Perrow [2], who wrote later, seems hard to justify. Such issues reinforce concern about the under-acknowledgment of Turner, at least until the second edition of MMD [10] became more widely known after 1997.
In 2000, Hale seemed to recognise somewhat better the importance of Turner as “one of the first to draw attention to the organisational processes deep within companies which incubate major accidents… Pidgeon’s and Turner’s analysis has its roots in the detailed study of major disasters in the past” [222] (p. 1, emphasis added). By 2017, in an interview in The Psychologist, Hale responded that Turner had had a “big influence”, albeit without highlighting the priority of Turner’s main ideas on accident causality and prevention that transcended sociology
the fascinating thing about Barry was that he was coming at things from a very different disciplinary background, sociology, while interacting with people coming mostly from an engineering science or psychology background. So his point of view was very new, but because he wrote so eloquently and his work was so readable, he had a big influence. You could say that he was there at the right moment to give that push to include sociological factors in the causation frameworks.
[223] (p. 66)
Two of Hale’s joint publications [215,216] were cited by Turner in MMD [1] (pp. 26–27, 210–211). Hale said of the second, the 1972 Review of the Industrial Accident Literature, that it “formed the basis for all my later work” [223] (p. 64), [224]. An additional 1991 joint publication by Hale was cited twice in MMD’s second edition [10] (p. 214 note 32, p. 236 note 133).

8.3. Karl Weick

Turner’s 1976 Administrative Science Quarterly article [27] was published the year before social psychologist Karl Weick became editor of that journal [225] (p. 535), but Weick seemed unaware of both it and MMD. After the publication of NA, Weick cited NA [2] multiple times and in a very positive manner, often in relation to points that Turner had identified earlier in MMD [1], but with no apparent knowledge of Turner [200], [226] (pp. 305–317). However, in 1998, after the second edition of MMD [10], Weick concluded a generous appreciation of MMD following Turner’s death with: “it takes a kaleidoscopic sensitivity to render clearly the multiple patterns that eventuate in disaster. This re-issue of Man-Made Disasters reminds us that Barry Turner is one of the best minds ever to have tackled this nest of issues” [227] (p. 74). Weick thereafter regularly cited MMD and Turner’s other work [228,229,230,231].
In 2004, when reflecting on NA’s 1999 re-issue, Weick considered that it remained “generative” because “it frames” through the 2 × 2 matrix, “it links” multiple levels of analysis, and “it provokes”, but he suggested that Perrow had “often” used NA theory “as a pretext to make some larger points about which he feels strongly” [228] (pp. 27–30). In the 2004 article, Weick also highlighted some of Turner’s contributions in MMD. He considered that Perrow’s dimensions of coupling and complexity were a “shrewd distillation” of key organisational dimensions, with the 2 × 2 matrix a “tidy compression” of three mechanisms in MMD [1] (p. 164) that organisations use to reduce diversity in order to act as a single entity, while still retaining sufficient “internal diversity to cope with external complexities” [228] (pp. 28–29). Weick considered that Perrow’s
continuum of loose-tight coupling reflects the way in which [Turner’s] hierarchy, power, distributed problem solving, suppressed conflict, and socialization pressures either enhance diversity through looser coupling or discourage it through tighter coupling. Likewise, the continuum of linear transformation—interactive complexity reflects the ways that Turner’s (1978) modes of operation, simultaneous consideration of problems at multiple levels of generality, conflicting actions, and discontinuities of practice make for more or less knowable chains of consequence. To worry about normal accidents is to worry about what it means to organize. As Turner put it, ‘It could be said that organizations achieve a minimal level of coordination by persuading their decision-makers to agree that they will all neglect the same kinds of consideration when they make decisions’ (p. 166; Turner and Pidgeon, 1997, p. 138). The main differences between Turner and Perrow and the rest of us lie in what each would say is the focus of that coordinated neglect.
[228] (p. 29)
Weick’s first major book in 1969, The Social Psychology of Organizing [67], was cited by Turner in relation to prospective and retrospective rationalisation [1] (p. 20, note 40, p. 209). Turner [43] (p. 218) cited Weick [102], and in the second edition of MMD four references by Weick [67,102,200,232] were cited [10] (pp. 201, 229–230, 235).

8.4. Jens Rasmussen

Jens Rasmussen’s prodigious output [233] transcended his electronics, control engineering and cognitive systems roots to include qualitative models and material on human errors that profoundly influenced James Reason, albeit not uncritically [234] (pp. 106–107). Four of Rasmussen’s theories and models have enduring relevance: the differentiation of “skill-rule-knowledge” [235] (p. 316), [236]; the safety boundary migration or “drift into danger” model; the “socio-technical system” hierarchical risk management model, which included Rasmussen’s support for multidisciplinarity in accident causality investigation; and the development of “Accimaps” [237,238,239,240]. Rasmussen drew attention to the “fallacy of defence in depth”, in which additional barriers and defences to address weaknesses can lead to risk homeostasis, different risks, and lack of visibility when a defence is breached [238] (pp. 208–209), [241] (pp. 47–48). This shares some similarities with Turner’s emphasis on multiple failures that reveal “a gap in defences previously regarded as secure” [1] (p. 84) and with Perrow’s concern in NA with “defence in depth” and his discussion of extra safety technologies and interventions leading to new risks [2] (pp. 43, 60, 230). Turner developed an analysis and learning diagram in MMD, included the concept of “drift” and emphasised the importance of multidisciplinarity, and his systemic approach was socio-technical and drew upon organisational hierarchical levels. There are a number of parallels with Rasmussen’s work. Rasmussen [235,236,242,243,244,245] and Turner [1,17,21,22] had examined and analysed workplace behaviour since at least the late 1960s, and both found that it often differed significantly from what was documented and intended. Turner considered that normal work as conducted was generally productive but could occasion an accident if precursors had incubated unnoticed, while Rasmussen noted that human variability could also be a positive factor with attendant safety implications [239] (p. 5).
Rasmussen thought that Quantitative Risk Assessment (QRA) techniques such as Probabilistic Risk Assessment (PRA) had their place in closed-loop risk control [241] (p. 46), [246,247] but considered that: “Major accidents in the past have typically been released by a systematic erosion of the preconditions of the quantitative predictions, not by a stochastic coincidence of independent events” [237] (p. 462). This is consistent with Turner’s discussion of incubation in MMD but less so with NA. An article by Rasmussen and Pedersen referred to risk related to the control of energy balances, “accidental chains of events” and the “propagation pattern of accidents” allowing accident scenarios to be “categorized in a reasonable number of classes”, and considered that “errors of management” lead to common-mode errors and may require feedback control [247] (pp. 183, 185). These are concepts found in MMD, arrived at from a different disciplinary base.
Much of Rasmussen’s safety science work had not been published when Turner wrote MMD, but MMD’s second edition [10] (p. 230) did cite Rasmussen’s 1986 book [248]. As noted above, Turner and colleagues provided a working paper for a World Bank Workshop on Safety Control and Risk Management held at Karlstad in early November 1989 [117]. A month earlier, in October 1989, Rasmussen, as a consultant and workshop co-chair with Batstone, had finalised a joint summary report on the first of three planned World Bank workshops on Safety Control and Risk Management, held in October 1988 with papers from Rasmussen, Reason, Westrum, Woods and others [249]. A 67-page summary of the 1989 Karlstad workshop was produced by Rasmussen et al. [250], but it did not mention the paper by Turner et al. [117]. While the paper was not presented in person, it seems unlikely that Rasmussen would not have known about it, but this could not be established. The field of accidents and causation in safety science was smaller in the 1980s and even the 1990s than it is today, but no evidence was found that Rasmussen was aware of Turner’s work, and there was no citation of Turner in the published material by Rasmussen that could be accessed when writing this article, perhaps because of the authors’ different disciplinary bases. While some key concepts may follow Turner, Rasmussen’s exceptional original contribution to safety science [233] is unassailable [251,252,253,254,255]. Rasmussen twice cited a 1986 draft article on “Risky Systems: Inducing and Avoiding Errors” sent to him by Perrow as a private communication [244] (p. 162), [241] (p. 48). Perrow acknowledged Rasmussen in NA [2] (p. ix) and in the 1999 reprint of NA [9] (p. 438) added a jointly authored book by Rasmussen to the references. However, neither Rasmussen nor Perrow drew much on the work of the other.

8.5. James Reason

Psychologist James Reason was the only participant at an August 1986 NATO symposium on “the failure analysis of information systems” [256] to cite Turner [1], albeit alongside many other references including Perrow [2], but Reason incorrectly implied that in MMD Turner had viewed disasters as “intrinsically unpredictable and largely unavoidable”, and he did not credit Turner for other relevant points in MMD [257] (pp. 211–212, 215). On 27 January 1987, Reason gave an invited lecture in the Psychology School at the University of Exeter and delivered a cognitively oriented presentation about accident causality [192]. At the end of the lecture, Reason reportedly stated that disasters were dissimilar and therefore no general lessons could be learned [191]. Afterwards, Turner, Toft and Pidgeon had a long informal discussion with Reason to explain the importance of more general socio-technical disaster patterns, and they reportedly suggested that he take a more organisationally focussed approach [191,192]. Toft wrote to Reason on 17 July 1987 after reading Reason’s 1987 article on “The Chernobyl errors” [258], which Toft stated had included a number of key matters discussed on 27 January and in material he had provided to Reason subsequently. Reason dramatically changed his approach from that article onwards, but the ideas and material stated to have been provided as the basis for this [191,192] were used without appropriate acknowledgment. An example of the change is Reason’s statement that “a purely cognitive analysis of error mechanisms fails to capture some of the more important human contributions to catastrophic system breakdowns. What is missing is a further level of analysis acknowledging … a complex social milieu” [258] (p. 204). Reason’s article [258] included Perrow [2] in the references but not Turner. Reason responded to Toft on 21 July 1987, suggested that there had been confusion about the level of analysis being discussed and apologised for the acknowledgment discourtesy [191]. Reason cited both MMD and NA once in a subsequent article on the Chernobyl disaster but did not explain that there were significant ideas from Turner that he was expanding upon [259] (pp. 537–538, 540).
Reason’s famous book Human Error was published in 1990 and briefly acknowledged “conversations and correspondence” with 40 people, including Turner, in a Preface that also acknowledged 33 others [260] (pp. v, xiii–xv). In the main text, after citing Turner’s use of case studies in one sentence, Reason included other ideas first published in MMD and associated publications [1,26,27,28,29,103], such as complex socio-technical organisational system accidents, a concatenation of linked multiple preconditions, latent factors and organisationally unforeseen or unforeseeable events, and disaster triggering (precipitating) events [260] (p. 197). Reason’s only other citation of Turner in the 1990 book was not in relation to MMD theory and might also confuse the reader as to Turner’s views: “The idea of personal responsibility is deeply rooted in Western cultures (Turner, 1978). The occurrence of a man-made disaster inevitably leads to a search for human culprits. …” [260] (p. 216). Turner did not seek culprits or blame front-line staff but considered their roles as part of normal work in a systemic and organisational management context [1] (pp. 160, 162–163, 198), [26] (pp. 21–22), [27] (p. 395). Reason cited Perrow’s NA [2] more frequently in the book [260] (pp. 177–178, 183, 191, 197, 216).
Reason’s 1990 article [261] included Turner’s original ideas, but only Perrow, not Turner, was included in the references. Among the ideas Turner had pioneered, in addition to those noted already, such as the long incubation of latent failures, was that these precursors most likely came from “high-level decision-makers” like managers, where “The higher an individual’s position within an organization, the greater is his or her opportunity for generating pathogens” [261] (pp. 476, 478), cf. MMD [1] (pp. 1, 3–6, 179–180, 236). Reason’s discussion of complex system accidents arising from “deliberate or unwitting disabling of defences by operators in pursuit of what, at the time seem to be sensible or necessary goals” [261] (p. 481) has parallels in MMD [1] (pp. 70, 101), including Turner’s views on normal work and bounded decision zones [1] (pp. 5, 58, 120–121, 165–166, 194). Most significantly, Reason highlighted the important shared features of the aetiology of major disasters and included a “general framework for accident causation” that shared many elements with MMD [261].
In 1997, Reason took a step towards addressing the acknowledgment gap in his other classic book Managing the Risks of Organizational Accidents, stating: “the organizational model owes its intellectual origins to two books. The first was Man-Made Disasters by the late (and greatly missed) Barry Turner, published in 1978. The second influence was Charles Perrow’s Normal Accidents” [262] (pp. 225–256). However, in a 1998 article [263], only Perrow [2] is cited, despite the thrust of the article covering ideas, such as culture’s role in recurrent accident patterns and the causation of organisational accidents, that Turner [1,124,130] had explored in detail. Unlike Perrow and other pioneers, Turner is not mentioned in Reason’s book The Human Contribution [264]. In his next book, A Life in Error in 2013, Reason [265] did not cite or acknowledge Turner despite including a chapter on “Organizational Accidents” which opens with: “The dozen years between 1976 and 1988 were marked by a succession of grisly major disasters worldwide, most of them manmade (see list below). They were also the years that I was developing the notions of ‘organizational accidents’ and latent failures—later to be modified to latent conditions” [265] (p. 73). However, by 2016, in his book Organizational Accidents Revisited, Reason was more generous
It is, I believe, fitting to begin this survey of alternative theoretical views with Barry Turner, a sociologist at the University of Exeter, who—if he didn’t actually coin the term ‘organizational accident’—laid the groundwork for understanding organizational breakdown in his pioneering book Man-Made Disasters in 1978. Later, Turner’s work was updated in a second edition … His most important concept was ‘incubation’. In other words, organizational disasters develop over long periods of time—in short they incubate within the system. Warning signs are ignored or misunderstood or even integrated into the pattern of organizational ‘normalcy’. As a result, safeguards and defences either do not get built or are allowed to atrophy. … Disasters, as noted elsewhere, are immensely diverse in their surface details. But Turner and Pidgeon have identified a set of developmental stages that appear universal. … These notions do not necessarily conflict with the idea of latent conditions: rather, their sociological emphasis upon cultural adjustments enriches them.
[234] (pp. 99–100)
Turner was generous in acknowledging Reason, writing in 1977 [29] (pp. 3–4) that from one perspective his framework could be considered as extending certain insights by psychologists into sociology, including Professor James Reason’s 1977 understanding of everyday slips, errors and accidents [266]. In MMD, when discussing transmission and amplification of error, Turner acknowledged some parallels between his organisational hierarchy approach and individual cognitive psychology, including by Reason [266] and a few others [1] (pp. 179, 236–237, note 29). Excluding introductory material, there are nine pages with citations of Reason [260,266,267] indexed in MMD’s second edition [10] (pp. 180–181, 186, 202–203, 224, 230–232). Turner and Pidgeon [10] mostly considered Reason’s work to be supportive of MMD [1], but not supplanting its more extensive sociological organisational analysis, and they suggested that “the distinction between active and latent failures can be seen as a splitting of the disaster incubation period into two linked but conceptually different phases” [10] (pp. 180–181). Without explicating the detail, in 1994 Turner [43] quietly indicated that his own work on complex organisational accidents preceded Reason’s
A multiplicity of minor causes, misperceptions, misunderstandings and miscommunications accumulate unnoticed during this ‘incubation period’. These preconditions which one researcher has subsequently called ‘pathogens’ (Reason, 1990) stay in place in the organization or managerial practice, ready to contribute to a major failure unless something happens to neutralize them by bringing them out into the open. … They constitute an accident waiting to happen…. Brought together by some trigger event. … the underlying pattern of the incubation period is common, and recurs in many disasters and in many industries.
[43] (pp. 216–218)
This brief summary indicates that Reason was aware of Turner’s first edition of MMD but probably misunderstood it until after the Exeter discussions in early 1987. His subsequent use of MMD [1] and associated insights from Turner, Toft and Pidgeon without appropriate referencing suggests a looseness with regard to academic integrity from an esteemed safety science pioneer. Matters improved after Turner’s death [234,262], but substantial and unfortunate gaps in citation and acknowledgment continued, perhaps mitigated by Reason not recognising that NA [2] contained many ideas that overlapped with MMD.

8.6. Diane Vaughan

Sociologist Diane Vaughan wrote that a good deal of Turner’s sociological work in MMD and associated articles had been confirmed by her investigation of the 1986 Challenger space shuttle launch disaster. While also citing Perrow’s NA very positively, she acknowledged Turner’s work in some detail [268] (pp. 225–226), [269], stating
Turner, investigating ‘man-made disasters’ (1976; 1978), pioneered in discovering organizational patterns that systematically contributed to the disasters he studied: norms and culturally accepted beliefs about hazards, poor communication, inadequate information handling in complex situations, and failure to comply with existing regulations instituted to assure safety (1976:391). He concluded that these factors created an absence of some kind of knowledge at some point. Crucial to understanding such accidents, then, is discovering how knowledge and information relating to events provoking a disaster were distributed in an organization before the incident (1978:3). Analysis of the Challenger accident not only confirms Turner’s findings about the relevance of knowledge and information in organizations, but also identifies structural factors that systematically affected the distribution of information and its interpretation at NASA: the competitive environment, the organization’s structure, processes, and transactions, and the regulatory environment. These factors combined to affect the decision to launch.
[269] (p. 248)
Vaughan’s renowned book The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA expanded on these and other MMD themes but her citations of Turner were relatively sparse [270] (pp. 69, 149, 410–411, 482 note 107), as they were for Perrow also [270] (pp. 34, 53, 415, 482 note 107). In Vaughan’s Foreword to the second edition of MMD in 1997, mentioned earlier, she stated
Published in 1978 and accompanied by two articles in well-regarded journals in the United States, the book nonetheless was seldom cited. The book had a cult following that advertised it by word of mouth. But the failure to become integral to mainstream sociology seems odd, given the quality of his work, its grounding in general organizational principles … Moreover, his approach was unprecedented. … Turner examined … preconditions, locating them in organizational systems. He was the first to demonstrate how technical, social, institutional and administrative arrangements, in combination, can systematically produce disasters. … Looking back, we must marvel not only at Turner’s prescience, but at his accomplishment. … classic ideas … Turner’s book contains two: the title … and his core idea of ‘failures of foresight’, which directs attention to a singularly important causal element that he found. … Man-made disasters not only had preconditions, but those preconditions had characteristics in common: long incubation periods studded with early warning signs that were ignored or misinterpreted. For Barry Turner, man-made disasters were distinguished not only by the institutional, organizational, and administrative structures associated with them, but by their process. In my view, this was his true intellectual breakthrough: disasters were not sudden cataclysmic events; they had long gestation periods. We must also marvel at the methodology and analysis on which his theoretical insights were based. … Using a grounded theory approach, Turner examined these archival data, identifying similarities and differences between these cases. … His effort produced a volume with richness that goes beyond his two key concepts. For example, his understanding of the relationship between information, error, and surprise in organizations was also farsighted.
[211] (pp. xii–xviii)
An article by Vaughan in 1997 cited MMD and Turner’s [27] article in relation to failures of foresight and incubation periods [271] (pp. 85, 96). In 1999, Vaughan also cited Turner and Pidgeon [10] in a fascinating article that included quotations from Turner’s broader work [272] (pp. 292, 294, 296). However, in other articles she did not mention Turner, despite the appropriateness of several aspects of the subject matter [273] (pp. 914, 916–917, 934), [274] (pp. 315–347), perhaps due to journal space constraints. When interviewed in May 2008, Vaughan acknowledged MMD [275]. In her monumental latest book, Dead Reckoning, she referred to Turner’s “famous” concept of “failure of foresight” [276] (p. 567), but some other concepts that Turner had originated were used without acknowledgment (e.g., [276] (p. 9)).

8.7. Nancy Leveson

Aeronautical engineer Nancy Leveson cited Turner’s MMD [1] in her 647-page 1995 book Safeware: System Safety and Computers: A Guide to Preventing Accidents and Losses Caused by Technology [277] as one of three references supporting the point “In fact, all hazards are affected by complex interactions among technological, ecological, sociopolitical and cultural systems” [277] (p. 4). However, relevant book sections without reference to Turner include: “Ignoring High-Consequence, Low-Probability Events”, in which Leveson states “A common discovery after accidents is that the events involved were recognised before the accident but were dismissed as incredible” [277] (p. 60, emphasis in original); “Ignoring Warning Signs” [277] (p. 64); and Chapter 10 on “Accident and Human Error Models” [277] (pp. 185–224). These sections incorporated major ideas found in MMD, as well as in many of Turner’s related articles. Leveson does not reference Turner in two influential articles [278,279] or in her subsequent 2011 book Engineering a Safer World: Systems Thinking Applied to Safety [280]. There is one reference to Turner in a joint 2009 article arguing that both NA and HRO theories are inadequate and that Leveson’s “pure” STAMP systems approach is to be preferred; there it is stated that although Perrow “was not the first social scientist to study major accidents (e.g., Turner 1978), his work was the starting point for many others to enter this area” [281] (p. 229). This article included several concepts found in MMD without acknowledging Turner.

8.8. Andrew Hopkins

We have already discussed some important publications by sociologist Andrew Hopkins who, from 1999, wrote a series of well-regarded books on major disasters and accidents and associated articles. Hopkins drew particularly on the theory and “desktop ethnography” analysis within MMD and Turner’s “sloppy management” article [1,10,43] as a guide and regularly acknowledged his debt to Turner [5], [184] (pp. 10–11, 16–18), [185], [186] (p. 54), [188] (pp. 110–111).
Hopkins also regularly cited Perrow, though not always favourably. Hopkins’s critiques of NAT [166,167] included that it did not follow from Perrow’s analysis of the Three Mile Island nuclear power station accident, which instead “conforms beautifully to Turner’s account. The exemplar case of a normal accident turns out to be just another case of sloppy management” [167] (p. 70). The critique of NA was revisited a decade later [282] (pp. 159–160), [183] (pp. 7–8). Hopkins also suggested that the high citation rates for some classic books such as NA reflected citing authors’ desire to be seen to know of them rather than their actually having been read [183]. As stated in Section 1.1, Hopkins noted parallels between Turner’s work and Perrow’s but assumed that Perrow must have written NA in ignorance of MMD [5] (p. 21).

8.9. Erik Hollnagel

Erik Hollnagel originally trained as a psychologist, worked with Rasmussen and extended his safety science contributions to include cognitive and information systems engineering and many other subjects, writing hundreds of papers since 1971 and 28 books since 1986 [283]. He was both a contemporary of and successor to Turner. Hollnagel’s 2004 book Barriers and Accident Prevention [284] surveyed a wide range of accident literature but, despite canvassing many ideas that had appeared in MMD, Turner was not included in any of the categories in the Preface and there is only one citation of Turner in the book: “The notion that latent or dormant conditions could contribute to the development and signature of an accident is, however, much older [than Reason] and may be traced back to Heinrich’s (1931, sic) domino model or Turner’s (1978) analogy of an incubation period in the build-up to man-made disasters” [284] (p. 55). A jointly written 2009 article included a somewhat positive section on MMD and, after classifying it as a complex linear system or epidemiological model, stated that: “Today, the complex linear accident model is best known as the Swiss Cheese model (Reason, 1997)” [285] (p. 1298). As outlined above, Turner employed a non-linear systemic approach as well as a sequential multiple chain (complex) model. The article was positive about Perrow’s NA model of “complex systems” and its focus “on two system properties, called coupling and interaction” [285] (pp. 1298–1299). In 2012, Hollnagel noted in his book on FRAM that
The distinction between work-as-imagined and work-as-done is often used in the ergonomics literature … Work-as-imagined represents what designers, managers, regulators and authorities believe happens or should happen, whereas work-as-done represents what actually happens. Differences … [are] classified as non-compliances, violations, errors or as performance adjustments and improvisations, depending on how one looks at it. An early discussion of this in the context of safety is found in Turner, B. (1978) Man-Made Disasters.
[286] (p. 38, emphasis in original)
Despite the relevant scope, Hollnagel did not cite Turner in The ETTO Principle (2009), Safety-I and Safety-II (2014), or Safety-II in Practice (2018) [287,288,289]. Time and space constraints did not allow for a close examination of Hollnagel’s prolific articles [283]. Much of Hollnagel’s relevant work appeared after MMD, and Turner did not cite Hollnagel.

8.10. Sidney Dekker

Psychologist Sidney Dekker’s publications on accidents, incidents and causality commenced just after the 1997 second edition of MMD [290]. The first and second editions of his “Field Guide” books in 2002 and 2006 [291,292] did not mention Turner, despite MMD’s relevance to their contents, though there was a brief mention in a 2005 book [293] (p. 23). By 2011, Dekker was citing Turner extensively, particularly MMD, acknowledging incubation and surprise, drift, risk as energy to be contained, and barrier analysis [294] (pp. 87–103). The Field Guide’s third edition in 2014 [295] was very positive about Turner’s legacy and listed him amongst pioneers in the Acknowledgements. Turner was cited in relation to drift and safety culture, man-made disaster theory, organisational information difficulties and decoy phenomena [295] (pp. 136–139, 170–171, 182–183). Dekker also wrote that MMD discussed complexity and other “precursors to New View thinking” [295] (p. 199). The 2015 second edition of Dekker’s 2005 book mentioned Turner more frequently [296] (pp. 29, 155, 239, 243, 244). Dekker’s 2019 book Foundations of Safety Science highlighted Turner’s contribution in Chapter 7, “The 1970s and Onward: Man-Made Disasters”, and considered MMD to be “prescient and foundational”, stating: “man-made disaster theory was really out there. It was a pioneer, riding well out front of other thinking at the time” [297] (pp. 219–223, 231, 233, 262, 432, emphasis in original). Dekker credited Turner’s MMD as the basis for many ideas currently in use in safety science
Defences-in-depth thinking (e.g., latent errors or resident pathogens that are already present and help incubate disaster (Reason, 1990); High reliability theory (e.g., weak signals that do not get communicated or picked up (Weick & Sutcliffe, 2001); Safety culture research (e.g., organizational cultural preconditions for disaster); Concepts such as the normalization of deviance (Vaughan, 1996), procedural drift (Snook, 2000), and drift into failure (Dekker, 2011), which all refer to disaster incubation in one way or another; Control-theoretic notions about erosion and loss of control (Leveson, 2012): the kind that Turner talked about in sociological, managerial, and administrative terms.
[297] (pp. 220–221)
Dekker has also been very positive about Perrow’s NA in many publications, especially in Chapter 8 of Foundations of Safety Science [297] (pp. 267–281). Turner died in 1995, before Dekker’s relevant publications appeared, and so never cited him.

8.11. Acknowledgment and Citation by Seven Other Accident Causality and Explanation Scholars

Paul Shrivastava’s book Bhopal: Anatomy of a Crisis [298] utilises concepts from MMD such as “triggering events” and cites Turner [1,27], including in relation to industrialisation and the concentration of population and energy by large corporations that can lead to major disasters [298] (pp. 16, 19). Many concepts in MMD could have been, but were not, drawn upon and cited. However, the book devoted significant attention to the post-disaster phases that were not Turner’s primary focus. Perrow’s NA [2] is included in a long list of “suggestions for further reading” but is not discussed in the text. Shrivastava [298] was cited in the second edition of MMD [10] (pp. 176, 229).
Scott Sagan, in his 1993 book The Limits of Safety [196], did not cite Turner, notwithstanding that a substantial part of his subject matter, excluding domain detail about nuclear weapons, overlaps with themes in MMD and Turner’s other articles. Unsurprisingly, given his intended comparison between NA and HRO theory, Sagan repeatedly cites Perrow and Weick and his eminent HRO colleagues, who are all US-based, as well as other important US authors such as Wildavsky [299]. As for the Europeans, however, Reason [260] is cited once for being influenced by Perrow [196] (p. 36, footnote 68) and once again in relation to latent problems and interactive common mode failures, where it is noted that Reason had referenced Rasmussen [196] (pp. 39–40, footnote 76). Turner is either unknown or ignored. However, Turner cited Sagan [10] (pp. 190–191, 193).
Scott Snook’s 2000 book Friendly Fire [160] cites Turner [27] (p. 383) in a footnote in relation to the variable disjunction of information [160] (p. 172, footnote 45). Surprisingly, this was the only reference: Snook did not cite or acknowledge MMD or Turner’s many other relevant articles and themes, including others within the 1976 ASQ article [27] itself. However, Snook [160] was generous in his citation of Perrow and NA.
Table 3 provides a simple summary of the foregoing twelve major safety science authors’ (left column) demonstrated knowledge of Turner’s MMD [1] or associated articles before or after 1997, with their acknowledgment of the relevant editions rated poor, mixed, good or a hybrid of these in column 3. Knowledge of NA [2,9] is indicated in column 5 (the 1984 and 1999 main body text and pagination are identical), with acknowledgment of NA rated poor, mixed or good in column 6. Neither Rasmussen nor Sagan cited Turner, and it is unclear whether they knew of his work. Of the six who knew and cited MMD [1] or associated articles, the best treatment was one hybrid good/mixed, with two mixed, two hybrid poor/mixed, and one poor. Of the four who cited Turner after the 1997 second edition [10], two were rated good, one mixed/good, and one mixed. All except Rasmussen were clearly familiar with NA, although Shrivastava only included it among references for further reading.
A number of important safety science scholars of the next generation are familiar with Turner and have acknowledged his work. Le Coze cited Turner in an early 2005 article [300] (p. 626) and regularly after that [99]. He credits Turner and his colleagues [1,17,117] with recognising culture as an important aspect of safety and seeking to make it compatible with social science research and insights [301] (p. 223). He included a very positive summary of his view of Turner [302] (pp. 3–4) when discussing his 2020 edited book [303]. In his latest book on NA, Le Coze [4] naturally focused more on Perrow’s positive legacy but briefly wrote of Turner
Before Perrow’s [1984] NA, in 1978, Turner published Man-Made Disasters, the failure of foresight, a book looking into disasters from a sociotechnical perspective (Turner, 1978). The contribution of Turner at the time was to go beyond an engineering view of disasters and to understand, study and conceptualise these events as … engineering, organisational and cultural phenomena. Accidents are the products of fallible institutionalised views created by a wide range of actors of organisations.
[4] (p. 126)
Carl Macrae drew heavily on Turner’s 1971 ethnographic field method [27], as well as MMD [1] and associated articles, in his 2014 book Close Calls: Managing Risk and Resilience in Airline Flight Safety [304]. Macrae referenced Turner’s work extensively and in an exemplary fashion and jointly edited a fine 2021 book on aspects of Turner’s work [3].
Jan Hayes acknowledged Turner and MMD when writing about gas pipeline failures with Hopkins [305] (p. 89). Hayes and colleagues based a significant part of a chapter on Turner’s MMD, with appropriate citation and acknowledgment [306]. She also acknowledged Turner in some detail in a 2020 chapter [307] (pp. 188–189, 202).
Michael Quinlan, an established industrial relations scholar of an older generation writing in safety science, acknowledged in his 2014 book Ten Pathways to Death and Disaster that: “The approach of reviewing a series of multi-fatality incidents has been pioneered by others—though still underutilised in my view—notably Barry Turner (1978) in his book Man Made Disasters who examined disasters in the United Kingdom over an 11 year period” [308] (p. 31). Quinlan included a balanced discussion of Perrow but found “no support for his technological complexity argument” [308] (pp. 19–21, 173).

9. Discussion

9.1. Turner and Perrow

Turner was a careful, creative and collaborative sociology researcher who utilised findings emerging from qualitative field data, together with models and theories from many different disciplines [1]. He was non-dogmatic and conscious of contingency, complexity and the limitations and fallibility of models, including his own [120] (pp. 59, 63–65). Turner’s organisational culture approach had a primarily micro- and meso-sociological, ethnographic and interpretative focus and method that attended to time and the cultural patterning of incubation events [7] (pp. 242, 244). Other than graduate student assistance [2] (pp. vii, 244), Perrow mostly worked and wrote alone, taking a macro-structuralist and sometimes provocative critical approach that battled corporate and executive self-interest and rarely took a backward step [2] (pp. 14, 306–307), [77] (p. viii), [81] (pp. 139–140, 146), [84] (pp. 31–32), [92] (p. 10), [136] (p. 726), [143] (p. 47), [146] (p. 915), [152] (p. 92). This approach was often grounded in his semi-radical American liberal and critical views of the economic and political power of large organisations and their various “sins” and shortcomings; he marshalled case study and documentary evidence to prosecute normative arguments or to highlight and seek to redress perceived research imbalances [68,86,93,94,95,96,105,134,139,144,146,147].
Turner’s MMD and associated publications from the 1970s until his death in 1995 provided the roots (and hefty branches) for subsequent and contemporary theories and understandings within safety science, from culture to accident aetiology and systems theory. The richness of his writing, his eclectic use of theory and his collaborative multidisciplinarity were remarkable. He also made linkages between safety and security, global industrial trends, population and the environment, and even insurance and risk [10] (p. 231, note 68), [309], with an associated interdisciplinarity that has assumed increasing importance. While the particulars of each accident typically differed, his view of systemic emergence and feedback focused on more structured sociological and cultural patterns within organisations that could be foreseen and potentially addressed ahead of a major accident. Turner also highlighted the key roles of both information and thermodynamics in complex systems.
Perrow’s main innovative focus in NA was on the small percentage of technology-based major accidents that were less amenable to prevention and could lead to catastrophes. He considered that only some reasons for tight coupling and interactive complexity could be proactively addressed. Perrow’s emphasis on unanticipated and “incomprehensible” negative systemic effects did not allow for the non-random incubation period in which, Turner argued, such effects may emerge and sometimes amplify, and which Turner thought could assist in developing foresight and prevention strategies. Some of Perrow’s rare negative systemic variants may align with what Turner placed in unknowable and unperceivable categories in MMD [1] (p. 195). Perrow’s more technologically deterministic complexity and coupling rationale can be considered complementary to Turner’s socio-technical, cultural, information and systemic explanation for system accident causality.
In his unpublished letter to Mrs Turner [110], Perrow wrote in 1995 that he had read MMD while writing NA and considered it “the earliest attempt to think through the matter of disasters in organizational terms, and thus very useful and insightful … strongest in the cultural area, while I have been much more concerned with structure … So there is not much in common” [193]. However, alongside many differences, a large number of similarities and overlaps were identified, such as the organisational and other matters summarised in Table 1. Perrow did not acknowledge Turner in his important books and material written after MMD, and there was a great deal more in common than he suggested.
Over time, Turner and Perrow found even more common ground. While interactive complexity and tight coupling could, in rare circumstances, produce unpredictable system failures leading to catastrophe, Perrow considered that most major accidents and disasters were caused by other organisational and sociocultural factors such as production pressures, downsizing, outsourcing, socialising risk, fantasy documents, power struggles, executive failure and deliberately ignoring warnings [9] (pp. 355, 360–362, 373–380), [94], [105] (p. 313), [135]. This and his “DEPOSE” factors [2] (p. 8) support the second thesis in NA emphasised by Le Coze [4] and attributed to Hopkins, who had drawn significantly on Turner. Perrow agreed that some major accidents he had considered normal or system accidents were not, and nor were the nuclear power plant disasters at Chernobyl and Fukushima [92,96]: failures of foresight, organisational and sociocultural factors and sloppy management in relation to high-risk technology provided better explanations, as Turner had emphasised.
Perrow employed a radical zero-sum “power over” perspective [81] (p. 262). His Afterword claimed for NA that “Group interests and power pervaded my book” [9] (p. 369), and he made the “wild assertion” that “much of the work in the risk area today is systematically detoxing the power aspects of my book” [9] (p. 378). However, in NA in 1984 he mentioned the word “power” explicitly only three times [2] (pp. 12, 306, 311), [228] (p. 31), and power was not a topic in the index. In MMD, Turner placed explicit emphasis on power [1] (pp. 1, 3–6, 124–125, 152, 199, 202). In NA [2], Perrow had little to say about organisational culture, but within two years he was treating it as important [81] (pp. 263, 265, 268–269, 278), writing that “a cultural approach is necessary, but it must be informed by an awareness of political and organizational power” [81] (p. 265). In 1999 he wrote of Vaughan’s [270] “otherwise excellent” “normalization of deviance” interpretation of the Challenger space shuttle accident that it “minimizes the corruption of the safety culture”, concluding “We miss a great deal when we substitute culture for power” [9] (pp. 379–380). By then, however, he considered, like Turner, that both concepts were important.
In 2009, Perrow argued that “it was my focus upon systems, rather than just the humans in them, that made the theory distinctive. Normal accident theory should stand alongside of deviance/drift theories, power theories, or component failures (whether of design, operator error, materials, or environment). We need them all. There is no need for reconciliation, just application” [150] (p. 1392). This stance appears more productive for the future of safety science. It has been demonstrated that Turner also had a focus on systems and was catholic in embracing other theories and perspectives that he considered added value to his preventive analysis.

9.2. Citation and Acknowledgment

Unlike NA, why was MMD “seldom cited”, at least in the US, for two decades? An important factor assisting Perrow’s relative prominence compared with Turner’s was timing [106]. Vaughan’s remark that “Timing is everything” is significant: when MMD was published in 1978, the threat of technical hazards was not considered a major social concern [211] (p. xi). After TMI in 1979, a succession of major disasters, including Bhopal, Chernobyl, the Herald of Free Enterprise capsize at Zeebrugge, Piper Alpha, Exxon Valdez and the Challenger space shuttle accident, likely elevated international social concern and hunger for a simple theory such as the “normal accident” that seemed to explain why they had occurred. However, while media, public and even academic commentators around the world saw these disasters as providing Perrow with “bragging rights”, and he became somewhat famous, he ultimately did not agree: those he examined after NA were “component failure” accidents [9] (p. 345), [92] (p. 10), [96]. While his potentially prescient “Y2K” disaster analysis did not materialise to further enhance NA’s reputation [9] (pp. 388–411), the increasingly pervasive complexity and interconnectedness of contemporary technological systems suggest that Perrow’s foundational ideas remain very relevant.
Another reason for sparse citation was that MMD was poorly marketed by Turner’s publisher in 1978 [16] (p. 294), [201] (p. 270). Alternatively, Short and Rosa [310] (pp. 93–94) suggested a paradigmatic “blind spot”: because Turner had made the organisation his unit of analysis, his treatment of risk in MMD, as well as the other advances in his “seminal work”, was ignored by scholars, including themselves, e.g., Short [311], leaving the field open for Douglas and Wildavsky [164] and Perrow [2]. While Perrow [2] (pp. 10, 330) also said in NA that his focus was on organisations, NA emphasised high-risk industries and their technologies more than organisations per se, and its focus was on avoiding rarer catastrophic events. However, had Perrow used the organisation as his central unit of analysis, it seems unlikely that NA would have been as sparsely cited as MMD was prior to 1997. A further reason may have been a parochial reluctance by some US scholars to acknowledge scholars in Europe/UK [110], even when published in ASQ, reinforced by their publishers and the “cut-throat” competitive pressures faced by American academics [16] (pp. 297–298). This appears true of Perrow in relation to Turner but less so regarding Rasmussen or Reason.
There are a great many parallels in the work of Turner and Perrow in MMD and NA and subsequently. In addition to his critical sociological perspective and important innovations, Perrow placed his own unique and colourful stamp on many ideas first raised by Turner and later embraced a number of Turner’s other ideas, such as culture, that he had initially ignored or indirectly criticised. It is possible that similar ideas could be developed without making conscious linkage to another author, especially when sociological perspectives are very different. Perhaps such factors help to explain elements of why Perrow read MMD [1] and used many similar approaches and ideas but did not cite it in NA [2]. While a counterfactual is difficult to establish, had Perrow done so, other scholars in the US and elsewhere may well have sought out and better acknowledged MMD prior to its 1997 second edition [10]. While we should remind ourselves that contemporary documentary access, especially through the Internet, is now much more rapid and comprehensive than in the 1980s and 1990s, this does not explain the extent of Perrow’s under-acknowledged overlaps with Turner because he had read MMD while writing NA.
Perrow’s two minor citations of Turner in 1994 [90,138] seem somewhat opportunistic. The first was perhaps calculated to head off any reviewer criticism for ignoring Turner in a journal with which Turner was closely associated. The second was a negative review of a book arising from the doctoral thesis of Brian Toft, which Turner had supervised; while Perrow called Turner a “pioneer” in it, the review contained substantial criticism directed at Toft and perhaps, by implication, at Turner. Having been invited in 1995 to write about Turner ahead of the developing second edition of MMD, Perrow declined but said that he had “always regretted” [193] not mentioning MMD in NA in 1984. Yet he continued to omit reference to either edition of MMD when NA was republished in 1999 [9] with an extensive Postscript and associated bibliography, and in his other relevant books and articles.
When the primary researcher/author commenced this research, there was no reason to expect that major works by two of the most important pioneers of safety science accident and disaster causality and theory, Perrow and Reason, might include content excessively derivative of Turner without proper attribution. It had been assumed that Turner was simply less well recognised and acknowledged. Given the breadth and geographical spread of safety science, it was possible (as Rae [8] noted) for authors in the 1970s and 1980s to be unaware of others with different disciplinary backgrounds, such as Rasmussen, and journal word limits could constrain citation. Reason’s shift in writing about major organisational accidents from 1987, after discussion with Turner and his close colleagues, suggests that cross-Atlantic academic competition was not the only factor. It seems clear that after an early 1987 discussion with Turner, Toft and Pidgeon, Reason modified his individual cognitive focus on accident causality to add social and organisational elements to his major accident theory and to employ concepts such as incubation, latent factors and triggers that had been published in MMD [1], which he had read [257,258,259,260]. It is possible that Reason did not notice that some ideas he had credited to Perrow had first been developed in MMD. Reason ultimately acknowledged a substantial debt to Turner [234,262]. Turner wrote well before Perrow and Reason, and it was, and is, standard scholarly and ethical practice to acknowledge important similar ideas that have priority and are known. We consider that Turner’s contribution has been under-recognised compared with those of other pioneers such as Perrow and Reason, and that this is due in part to issues of academic fairness or integrity. The rationale of the others in Section 8 and Table 3 who treated Turner poorly or in a mixed manner is less clear, and their ethics are therefore not considered to be in question.
In a book first published in 1993, the leading evolutionary biologist and science historian Stephen Jay Gould wrote about the “dark side” of academic integrity resulting from careless reading, lack of reflection and making “straw man” arguments without reading at all [312] (pp. 124–126). Calhoun [313] (p. 12) raised similar issues with citation of the renowned sociologist R.K. Merton, including by those who caricatured his work without reading it. Carsten Busch has documented a number of concerns in relation to citation of safety science pioneer H.W. Heinrich, involving lack of contextualisation, failing to review original sources (or at best doing so superficially), excessively harsh and dualistic criticism, and assuming that an early source must be old, tired and outdated [314] (pp. 67–69, 255). In his earlier 2018 thesis in relation to fellow “New View” scholars, Busch documented citation errors, wrong or unsubstantiated attributions, cherry picking, decontextualised quotations, failure to attribute concepts that Heinrich had developed, disrespectful judgments, and potentially writing with ideological or product marketing goals [315] (pp. iii, 44–45, 47–51, 54–61, 64–65, emphasis added). While many of these concerns differ from those found in the current review, they suggest that the findings in relation to Turner are not unique. “Publish or perish” pressure was also identified by Steven Shorrock [316] (pp. 230, 234–235) as an issue facing safety science, as well as other academic fields.
Among Busch’s “suggestions for further research” unrelated to the specifics of the Heinrich study were reviewing “The practices of citing within safety literature” and “Ways of dealing with and reducing quotation errors, notably concentrating on … primary sources and reducing pressures that encourage taking shortcuts” [315] (p. 89). These suggestions are important and worthy of follow-up. However, they would not fully address the use of another author’s work without proper acknowledgment, or its potential rebadging as one’s own. A safety science editorial in 1976 noted colourfully that when “theories are plentiful and hard facts are few, it is perhaps inevitable that ideas which were already shop-soiled some years ago tend to be regurgitated from time to time” [317] (p. 1), [318]. This issue was not new when MMD and NA were published, and Perrow’s (and Reason’s) differing academic backgrounds and contexts explain only some of the unattributed similarities with Turner’s MMD and his other publications.
The importance of acknowledging the ideas of others, and of not “Citing selectively to enhance own findings or to please editors…” [319] (p. 8), has received greater prominence from official and academic association bodies in recent decades [320] (Section 14(b)), [321] (Section 4.4), [322,323], but has always been central to academic integrity post-WW2. The current journal is very clear about its strict ethical policies and standards and its zero tolerance of plagiarism, inappropriate authorship credit and the like [324]. Future safety science research could well benefit from these lessons, as well as from considering the many aspects of Turner’s work summarised in this retrospective review.

9.3. Limitations

A qualitative, documentary-based research study designed to assist explanation and understanding inevitably requires greater researcher judgement and reflexivity, and a different methodology and type and standard of evidence, compared with applied quantitative or engineering research. This does not necessarily make it less “scientific” or less relevant to the advancement of knowledge in the field of safety science encompassed by the journal when it is conducted appropriately and thoroughly [325,326].
While it was possible to contact Turner’s spouse and several close academic collaborators to obtain additional background and material, this proved more difficult in the case of Perrow. One key academic collaborator, Emeritus Professor Lee Clarke, who as a graduate student assisted Perrow before the publication of NA, was contacted and advised that he was unaware of Perrow having read MMD prior to publishing NA [327]. Clarke did recall that Perrow had mentioned and “paid homage” to MMD subsequently [327]. It is likely that this was one of the publications considered in Section 7.2 above.
In the cases of Reason and others mentioned in relation to knowledge and citation of MMD, the constraints of time and space prevented a more expansive treatment, although much more material was reviewed and assessed than could be reported in this article. Future researchers may be able to test and extend the conclusions presented here by accessing available private papers of pioneers such as Perrow, Reason and Rasmussen.
Significant additional research was also undertaken that further highlighted how concepts developed by Turner are being used in contemporary safety science to help understand and prevent major accidents. Once again, space prevented further discussion.

10. Conclusions

The paper’s first aim was to provide an historical and contextual exposition of the major accident and disaster books and theories of Turner and Perrow and their evolution, together with a comparison of their work, to address a gap in the literature and better understand their contributions to safety science. This was to help clarify the intent of each author and the circumstances in which they wrote, so as to better inform contemporary safety science of its roots. It was addressed by systematic reading and the very detailed citation and analysis in Section 3, Section 4, Section 5 and Section 6. Turner’s research found patterns which meant that major accidents were not totally unique and could be preventable. Turner employed a pragmatic blend of moderate realism and constructivism in engagement with a variety of industry cases, and an extraordinary breadth of interdisciplinary reading. Perrow used a more critical, radical American structural approach and drew themes and lessons from a broad range of high-risk industry data. Among the important differences, Turner focused on non-random, systemic and predictable human, organisational, cultural and informational factors that took time to incubate and combine to trigger a disaster, and that were potentially foreseeable and preventable. Perrow, by contrast, focused on technological complexity, tight coupling and risk that could quickly, through random and emergent interactions, lead to accidents that were structurally determined, uncontrollable once initiated, and capable of occasioning an unpreventable catastrophe. Turner explained “failures of foresight” using concepts such as organisational disaster “incubation”, culture, internal hierarchies, information distribution and “negentropy” in socio-technical systems. Perrow’s macro-structuralist and technologically deterministic 2 × 2 matrix of “interactive complexity” and “tight coupling” focused on unpredictable “normal” or “system” accidents that in rare circumstances could be catastrophic. He also provided insights into more common “component failure” accidents. Significant similarities included the importance of poor management, the gap between perceived and actual reality, bounded rationality, unheeded warnings, systemic emergence and propagation, triggering of latent factors, the growing concentration of energy and power in large organisations, and not blaming individual operator error. Continuities and significant changes in the authors’ subsequent writing were outlined in Section 6. Additional commonalities included the importance of “drift”, demographic factors, risk-determining leaders, organisational amplification, and Perrow’s embrace of Turner’s views on socio-cultural and organisational factors in accidents.
The second aim was to test the priority and originality of each pioneer and the possibility that Perrow came to similar views and theories independently of Turner. In light of comments made by Hopkins, Rae and Pidgeon, an assessment was made of whether Perrow wrote NA in ignorance of MMD and whether he subsequently acknowledged Turner’s relevant work to the extent expected given his academic context. Originality within MMD and NA was introduced in Section 3 and Section 4, summarised in Table 1 and Table 2 in Section 5, and further discussed in Section 6. Acknowledgment was addressed in Section 7. Turner and Perrow were pioneers and original in different ways, as seen in Table 2 and in the differing emphases given to a number of common elements in Table 1. While not always agreeing, Turner consistently cited and acknowledged Perrow’s work. It was established that Pidgeon, and perhaps Weick, had correctly implied that Perrow in NA had used and developed concepts published six years earlier by Turner in MMD. Hopkins’s reasonable assumption, shared by Rae and based on the overlaps he found, that Perrow had not read MMD before completing NA was therefore shown to be incorrect. Perrow had read MMD while writing NA and before its publication in 1984. However, he chose to omit any reference to it, despite the many areas of overlap seen in Table 1. This was not remedied in the re-release of NA in 1999 with a substantial postscript and additional bibliography, or in Perrow’s other significant publications, despite his writing to Mrs Turner in 1995 that he “always regretted” the omission.
The third aim was to review the knowledge of Turner held by some other important safety pioneers and accident causation scholars, and their treatment of him and Perrow. We noted that the citation rate for MMD was more than ten times lower than that for NA in the period to 1996, and that both Vaughan and Rosenthal had remarked on MMD’s sparse citation, especially in the US. We found that in the UK, Reason had seriously under-acknowledged Turner in his articles from 1987 and in his classic 1990 book, though his later treatment was more mixed. Had Perrow and Reason appropriately acknowledged the first edition of MMD, its relatively sparse citation may well have been different. A summary of the knowledge and treatment of Turner by important contemporaries, close successors and some others is provided in Section 8. Some improvement is apparent since 2004, and even more so since 2014.
A final aim was to consider reasons for the under-acknowledgment of Turner and potential ethical considerations linked to inadequate citation. While some of the parallels and overlaps between NA and MMD were no doubt independent and differently focused, Perrow’s lack of citation of MMD (and Reason’s subsequently) likely reduced relative awareness of Turner’s legacy within safety science, as well as raising ethical issues. Although some of the darker side of academic writing was unexpectedly exposed in the course of this research in relation to Perrow’s, and to a lesser extent Reason’s, acknowledgment of Turner, both remain pioneers who have made major and enduring contributions. Despite dying decades younger than Perrow and Reason (who happily is still alive), Turner deserves to be recognised and acknowledged as being in the highest strata of safety science pioneers, with MMD celebrated as a seminal and classic foundational book. While the under-acknowledgment of Turner’s work is slowly being remedied, further progress, and greater knowledge and use of his work in research, are desirable.
In meeting these four aims, this review should have given readers a much greater understanding of the strengths, originality and influence of Turner and Perrow, and of the importance of careful, detailed, historically contextualised and ethical scholarship that appropriately acknowledges the major ideas being utilised. We conclude that Turner’s MMD, and his foundational importance for safety science more broadly, should be much better recognised.

Author Contributions

Conceptualization, K.B.; methodology, K.B.; formal analysis, K.B.; investigation and research, K.B.; data curation, K.B.; writing—original draft preparation, K.B.; writing—review and editing, K.B., M.C. and L.C.; supervision, M.C., L.C. and K.B. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding. The primary author/researcher holds an Australian Government Research Training Program (fee remission) Ph.D. scholarship.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Any non-public data presented in this study are available on request.

Acknowledgments

In addition to the two PhD supervisors who are the other authors, the primary author/researcher wishes to acknowledge the assistance of Janet Howd (Janet Turner), Brian Toft and Nick Pidgeon in accessing some hard-to-obtain data and background on B.A. Turner and for some context in relation to C.B. Perrow and J.T. Reason. The primary author is grateful to Andrew Hopkins for his initial supportive reaction to the research and for providing a draft of his 2021 chapter on Barry Turner.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Turner, B.A. Man-Made Disasters; Wykeham: London, UK, 1978. [Google Scholar]
  2. Perrow, C.B. Normal Accidents: Living with High-Risk Technologies; Basic Books: New York, NY, USA, 1984. [Google Scholar]
  3. Gould, K.P.; Macrae, C. (Eds.) Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; CRC Press: Boca Raton, FL, USA, 2021. [Google Scholar]
  4. Le Coze, J.C. Post Normal Accident: Revisiting Perrow’s Classic; CRC Press: Boca Raton, FL, USA, 2021. [Google Scholar]
  5. Hopkins, A. Turner and the Sociology of Disasters. In Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; Pettersen Gould, K., Macrae, C., Eds.; CRC Press: Boca Raton, FL, USA, 2021; pp. 19–32. [Google Scholar]
  6. Pidgeon, N.F. Systems Thinking, Culture of Reliability and Safety. Civ. Eng. Environ. Syst. 2010, 27, 211–217. [Google Scholar] [CrossRef]
  7. Pidgeon, N.F. Afterword: Connoisseurship, the Sociological Imagination and Turner’s Qualitative Method. In Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; Gould, K.P., Macrae, C., Eds.; CRC Press: Boca Raton, FL, USA, 2021; pp. 237–248. [Google Scholar]
  8. Rae, A.J. Can Major Accidents be Prevented? Transcript of Episode 100 of The Safety of Work Podcast by Provan, D.; Rae, A.J. Broadcast on 9 October 2022. Available online: https://safetyofwork.com/episodes/ep-100-can-major-accidents-be-prevented/transcript (accessed on 28 November 2022).
  9. Perrow, C.B. Normal Accidents: Living with High-Risk Technologies (Republished with a New Afterword and a Postscript on the Y2K Problem and an Additional Bibliography); Princeton University Press: Princeton, NJ, USA, 1999. [Google Scholar]
  10. Turner, B.A.; Pidgeon, N.F. Man-Made Disasters, 2nd ed.; Butterworth-Heinemann: Oxford, UK, 1997. [Google Scholar]
  11. Le Coze, J.C. The ‘new view’ of human error. Origins, ambiguities, successes and critiques. Saf. Sci. 2022, 154, 105853. [Google Scholar] [CrossRef]
  12. Turner, B.A. The Failure of Foresight: An Examination of Some of the Conditions Leading to Failures of Foresight, and of Some of the Institutionalised Processes for Accommodating Such Failures. Ph.D. Thesis, The University of Exeter, Exeter, UK, 1976. Available online: https://ethos.bl.uk/ethos/Logon.do (accessed on 24 January 2021).
  13. Jeffcutt, P. Obituary—Professor Barry Turner (1937–1995). Stud. Cult. Organ. Soc. 1995, 1, i–iii. [Google Scholar] [CrossRef]
  14. Pidgeon, N.F.; Blockley, D.I.; Turner, B.A. Design practice and snow loading—Lessons from a roof collapse. Struct. Eng. 1986, 64A, 67–71. [Google Scholar]
  15. Turner, B.A. Sociological Aspects of Organizational Symbolism. Organ. Stud. 1986, 7, 101–115. [Google Scholar] [CrossRef]
  16. Turner, B.A. A Personal Trajectory through Organization Studies. In Research in the Sociology of Organizations; Bacharach, S.B., Gagliardi, P., Mundell, B., Eds.; JAI Press: Greenwich, CN, USA, 1995; Volume 13, pp. 275–301. [Google Scholar]
  17. Turner, B.A. Exploring the Industrial Subculture; The Macmillan Press: London, UK, 1971. [Google Scholar]
  18. Jeffcutt, P. Editorial: From the Industrial to the Post-Industrial Subculture. Organ. Stud. 1999, 20, vii–xv. [Google Scholar] [CrossRef]
  19. Reeves, T.K.; Turner, B.A.; Woodward, J. Technology and Organizational Behaviour. In Industrial Organization: Behaviour and Control; Woodward, J., Ed.; Oxford University Press: Oxford, UK, 1970; pp. 3–18. [Google Scholar]
  20. Reeves, T.K.; Turner, B.A. A Theory of Organization and Behavior in Batch Production factories. Adm. Sci. Q. 1972, 17, 81–98. [Google Scholar] [CrossRef]
  21. Turner, B.A. Control Systems: Development and Interaction. In Industrial Organization: Behaviour and Control; Woodward, J., Ed.; Oxford University Press: Oxford, UK, 1970; pp. 59–84. [Google Scholar]
  22. Turner, B.A. The Organization of Production—Scheduling in Complex Batch-production Situations: A comparative view of organizations as systems for getting work done. In Approaches to the Study of Organizational Behaviour: Operational Research and the Behavioural Sciences; Heald, G., Ed.; Tavistock: London, UK, 1970; pp. 87–99. [Google Scholar]
  23. Turner, B.A. Industrialism; Longman: London, UK, 1975. [Google Scholar]
  24. Berger, P.L.; Luckmann, T. The Social Construction of Reality: A Treatise in the Sociology of Knowledge; Doubleday: New York, NY, USA, 1966. [Google Scholar]
  25. Burrell, G.; Morgan, G. Sociological Paradigms and Organisational Analysis; Ashgate: Aldershot, UK, 1979. [Google Scholar]
  26. Turner, B.A. An Examination of Some of the Organisational Preconditions Associated with Some Major Disasters. Presentation to an Open University Seminar on Systems Failures; City University: London, UK, 1974; Reprinted with updates in Peters, G.; Turner, B.A. Part of the 9-unit third level Open University course TD342, Systems Performance: Human Factors and Systems Failures. In Unit 4: Catastrophe and its Preconditions; Peters, G., Ed.; The Open University Press: Milton Keynes, UK, 1976; pp. 4–45. [Google Scholar]
  27. Turner, B.A. The Organizational and Interorganizational Development of Disasters. Adm. Sci. Q. 1976, 21, 378–397. [Google Scholar] [CrossRef]
  28. Turner, B.A. The Development of Disasters—A sequence model for the analysis of the origins of disasters. Sociol. Rev. 1976, 2, 753–774. [Google Scholar] [CrossRef]
  29. Turner, B.A. The origins of disaster. In Safety at Work: Recent Research into the Causes and Prevention of Industrial Accidents; Philips, J., Ed.; Centre for Socio-Legal Studies, Conference Papers No.1; Wolfson College: Oxford, UK; Social Science Research Council: Oxford, UK, 1977; pp. 1–18. [Google Scholar]
  30. Turner, B.A. Perceptions of Bureaucracy: A Variable in Administrative Theory. Soc. Econ. Adm. 1977, 11, 137–149. [Google Scholar] [CrossRef]
  31. Turner, B.A. Research note: A comment on the nature of information in channels of observation. Cybernetica 1977, XX, 39–42. [Google Scholar]
  32. Merton, R.K. The Unanticipated Consequences of Purposive Social Action. Am. Sociol. Rev. 1936, 1, 894–904. Available online: https://www.jstor.org/stable/2084615 (accessed on 15 December 2020). [CrossRef]
  33. Simon, H.A. Administrative Behavior: A Study of Decision-Making Processes in Administrative Organization, 2nd ed.; Macmillan: London, UK, 1957. [Google Scholar]
  34. Simon, H.A. Models of Man, Social and Rational: Mathematical Essays on Rational Human Behavior in Social Settings; Wiley: New York, NY, USA, 1957. [Google Scholar]
  35. Western, K.A. The epidemiology of natural and man-made disasters: The present ‘state of the art’. In Diploma in Tropical Public Health; The Ross Institute, London School of Hygiene and Tropical Medicine, University of London: London, UK, 1972. [Google Scholar]
  36. Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; Aldine: Chicago, IL, USA, 1967. [Google Scholar]
  37. Le Coze, J.C. Broad (multilevel) safety research and strategy. A Sociological study. Saf. Sci. 2021, 136, 105132. [Google Scholar] [CrossRef]
  38. Flin, R. Safety Condition Monitoring: Lessons from Man-Made Disasters. J. Contingencies Crisis Manag. 1998, 6, 88–92. [Google Scholar] [CrossRef]
  39. Goffman, E. Frame Analysis: An Essay on the Organization of Experience; Peregrine Books, Penguin: London, UK, 1975. [Google Scholar]
  40. Pidgeon, N.F.; Turner, B.A. H.E. and Socio-Technical System Failure. In Modeling Human Error in Structural Design and Construction, Proceedings of a Workshop Sponsored by the National Science Foundation, Ann Arbor, MI, USA, 4–6 June 1986; Nowak, A.S., Ed.; Construction Division of the American Society of Civil Engineers: Reston, VA, USA, 1986; pp. 193–203. [Google Scholar]
  41. Turner, B.A. Failed Artifacts. In Symbols and Artifacts: Views of the Corporate Landscape; Gagliardi, P., Ed.; De Gruyter Studies in Organization 24; Walter de Gruyter: Berlin, Germany, 1990. [Google Scholar]
  42. Turner, B.A. Stepping into the same river twice: Learning to handle unique management problems. In Text of the Inaugural Professorial Lecture Delivered 8 December 1992 in the Middlesex University Business School, Middlesex University Inaugural Lectures 2, 1-19; Middlesex University: Middlesex, UK, 1992. [Google Scholar]
  43. Turner, B.A. Causes of Disaster: Sloppy Management. Br. Jnl. Manag. 1994, 5, 215–219. [Google Scholar] [CrossRef]
  44. Turner, B.A. The Making Sense of Unseemly Behavior in Organizations. Int. Stud. Manag. 1983, XIII, 164–181. Available online: https://www.jstor.org/stable/40396922 (accessed on 15 December 2020). [CrossRef]
  45. Turner, B.A. The Use of Grounded Theory for the Qualitative Analysis of Organizational Behaviour. J. Manag. Stud. 1983, 20, 333–348. [Google Scholar] [CrossRef]
  46. Gordon, J.E. The Epidemiology of Accidents. Am. J. Public Health 1949, 39, 504–515. [Google Scholar] [CrossRef]
  47. Haddon, W. A note concerning accident theory and research with special reference to motor vehicle accidents. Ann. N. Y. Acad. Sci. 1963, 107, 635–646. [Google Scholar] [CrossRef] [PubMed]
  48. Haddon, W. The Changing Approaches to the Epidemiology, Prevention, and Amelioration of Trauma: The transition to approaches epidemiologically rather than descriptively based. Am. J. Public Health 1968, 58, 1431–1438. [Google Scholar] [CrossRef]
  49. Haddon, W. Energy Damage and the Ten Countermeasure Strategies. J. Trauma 1973, 13, 321–331. [Google Scholar] [CrossRef]
  50. Lindquist, M.G. Analysis of system failure and corrective subsystems. Manag. Datamat. 1975, 4, 21–24. [Google Scholar]
  51. Maruyama, M. The Second Cybernetics: Decision-Amplifying Mutual Causal Processes. Am. Sci. 1963, 51, 164–179. [Google Scholar]
  52. Schrödinger, E. What Is Life? Cambridge University Press: Cambridge, UK, 1944. [Google Scholar]
  53. Brillouin, L. Life, Thermodynamics and Cybernetics. Am. Sci. 1949, 37, 554–568. Available online: https://www.jstor.org/stable/29773671 (accessed on 27 October 2020). [PubMed]
  54. Brillouin, L. Scientific Uncertainty and Information; Academic Press: New York, NY, USA, 1964. [Google Scholar]
  55. Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Champaign, IL, USA, 1949. [Google Scholar]
  56. Pask, G. The Natural History of Networks. In Self-Organizing Systems: Proceedings of an Interdisciplinary Conference, Chicago, IL, USA, 5–6 May 1959; Yovits, M.C., Cameron, S., Eds.; Pergamon Press: Oxford, UK, 1960; pp. 232–263. [Google Scholar]
  57. Rivas, J.R.; Rudd, D.F. Man-machine synthesis of disaster-resistant operations. Oper. Res. 1975, 23, 2–21. [Google Scholar] [CrossRef]
  58. Thom, R. Structural Stability and Morphogenesis: An Outline of a General Theory of Models; W.A. Benjamin: London, UK, 1975. [Google Scholar]
  59. Buckley, W. Sociology and Modern Systems Thinking; Prentice-Hall: Englewood Cliffs, NJ, USA, 1967. [Google Scholar]
  60. Pidgeon, N.F.; Turner, B.A.; Blockley, D.I. Hazard Assessment in Structural Engineering. In Reliability and Risk Analysis in Civil Engineering: Proceedings of the 5th International Conference on Applications of Statistics and Probability in Soil & Structural Engineering, Vancouver, Canada, 25–29 May 1987; Lind, N.C., Ed.; University of British Columbia: Vancouver, Canada, 1987; Volume 1, pp. 358–365. [Google Scholar]
  61. Pidgeon, N.F.; Blockley, D.I.; Turner, B.A. Site investigations: Lessons from a late discovery of hazardous waste. Struct. Eng. 1988, 66, 311–315. [Google Scholar]
  62. Pidgeon, N.F.; Stone, J.R.; Blockley, D.I.; Turner, B.A. Management of Safety Through Lessons from Case Histories. In Safety and Reliability in the 90s: Will Past Experience or Prediction Meet Our Needs? Walter, M.H., Cox, R.F., Eds.; Elsevier Applied Science: London, UK, 1990; pp. 201–216. [Google Scholar]
  63. Pidgeon, N.F.; Turner, B.A.; Blockley, D.I. The use of Grounded Theory for conceptual analysis in knowledge elicitation. Int. J. Man-Mach. Stud. 1991, 35, 151–173. [Google Scholar] [CrossRef]
  64. Pidgeon, N.F.; Turner, B.A.; Blockley, D.I.; Toft, B. Corporate Safety Culture: Improving the Management Contribution to System Reliability. In Reliability ’91, Proceedings of International Conference on Reliability Techniques and Their Application, London, UK, 10–12 June 1991; Matthews, R.H., Ed.; Elsevier: London, UK, 1991; reprinted in eBook; Chapman and Hall/CRC: London, UK, 2017; Chapter 63. [Google Scholar] [CrossRef]
  65. Pidgeon, N.F.; Turner, B.A.; Toft, B.; Blockley, D.I. Hazard management and safety culture. In Hazard Management and Emergency Planning: Perspectives on Britain; Also Published as an eBook in 2013; Parker, D.J., Handmer, J.W., Eds.; Routledge: London, UK, 1992; Chapter 17. [Google Scholar] [CrossRef]
  66. Fischhoff, B. Hindsight: Thinking backward? ONR Technical Report; Oregon Research Institute Monograph; US Office of Naval Research: Arlington, VA, USA, 1974; Volume 14, No. 1. [Google Scholar]
  67. Weick, K.E. The Social Psychology of Organizing; Addison Wesley: Reading, MA, USA, 1969. [Google Scholar]
  68. Perrow, C.B. An Almost Random Career. In Management Laureates: A Collection of Autobiographical Essays; Bedeian, A.G., Ed.; JAI Press: Greenwich, CT, USA, 1992; Volume 2, pp. 399–438, reprinted in Routledge ebook; Routledge: London, UK, 2018; Chapter 40. [Google Scholar] [CrossRef]
  69. Perrow, C.B. Three Mile Island: A normal accident. In The International Yearbook of Organization Studies 1981; Dunkerley, D., Salaman, G., Eds.; Routledge & Kegan Paul: London, UK, 1981; pp. 1–25. [Google Scholar]
  70. Perrow, C.B. The President’s Commission and the Normal Accident. In Accident at Three Mile Island: The Human Dimensions; Sills, D.L., Wolf, C.P., Shelanski, V.B., Eds.; Westview Press: Boulder, CO, USA, 1982; pp. 173–184. [Google Scholar]
  71. Perrow, C.B. Organizational Prestige: Some Functions and Dysfunctions. Am. J. Sociol. 1961, 66, 335–341. [Google Scholar] [CrossRef]
  72. Perrow, C.B. The Analysis of Goals in Complex Organizations. Am. Sociol. Rev. 1961, 26, 854–866. [Google Scholar] [CrossRef]
  73. Perrow, C.B. The Sociological Perspective and Political Pluralism. Soc. Res. 1964, 31, 411–422. Available online: https://www.jstor.org/stable/40969752 (accessed on 27 March 2021).
  74. Perrow, C.B. Hospitals: Technology, Structure and Goals. In Handbook of Organizations; March, J.G., Ed.; Rand McNally: Chicago, IL, USA, 1965. [Google Scholar]
  75. Perrow, C.B. A Framework for the Comparative Analysis of Organizations. Am. Sociol. Rev. 1967, 32, 194–208. [Google Scholar] [CrossRef]
  76. Perrow, C.B. Book Review—Industrial Organization: Theory and Practice by Joan Woodward (Oxford University Press, 1965). Am. Sociol. Rev. 1967, 32, 313–315. [Google Scholar] [CrossRef]
  77. Perrow, C.B. Organizational Analysis: A Sociological View; Tavistock Publications: London, UK, 1970. [Google Scholar]
  78. Zannetos, Z.S. Organizational Analysis: A Sociological View by Charles Perrow. J. Bus. 1971, 44, 338–339. Available online: https://www.jstor.org/stable/2351349 (accessed on 29 March 2021). [CrossRef]
  79. Perrow, C.B. The Radical Attack on Business: A Critical Analysis; Harcourt, Brace, Jovanovich: Boston, MA, USA, 1972. [Google Scholar]
  80. Perrow, C.B. Complex Organizations: A Critical Essay; Scott Foresman & Company: Glenview, IL, USA, 1972. [Google Scholar]
  81. Perrow, C.B. Complex Organizations: A Critical Essay, 3rd ed.; McGraw-Hill: New York, NY, USA, 1986. [Google Scholar]
  82. Lacy, R. Introduction. (Special issue on the occasion of the twentieth anniversary of the publication of Complex Organizations: A Critical Essay by Charles Perrow). Int. Public Manag. J. 2007, 10, 131–135. [Google Scholar] [CrossRef]
  83. Perrow, C.B. The Short and Glorious History of Organizational Theory. Organ. Dyn. 1973, 2, 3–15. [Google Scholar] [CrossRef]
  84. Perrow, C.B. Is Business Really Changing? Organ. Dyn. 1974, 3, 31–44. [Google Scholar] [CrossRef]
  85. Perrow, C.B. The Bureaucratic Paradox: The Efficient Organization Centralizes in Order to Decentralize. Organ. Dyn. 1977, 5, 3–14. [Google Scholar] [CrossRef]
  86. Perrow, C.B. Zoo story, or life in the organizational sandpit. In Control and Ideology in Organizations; Salaman, G., Thompson, K., Eds.; MIT Press: Cambridge, MA, USA, 1980; pp. 259–277. [Google Scholar]
  87. Perrow, C.B. Normal Accident at Three Mile Island. Society 1981, 18, 17–25. [Google Scholar] [CrossRef]
  88. Perrow, C.B. Not Risk but Power—Book Review of Societal Risk Assessment: How Safe Is Safe Enough? Schwing, R.C., Albers, W.A., Jr., Eds.; Plenum Press: New York, NY, USA, 1980. Contemp. Sociol. 1982, 11, 298–300. [Google Scholar] [CrossRef]
  89. Perrow, C.B. The Organizational Context of Human Factors Engineering. Adm. Sci. Q. 1983, 28, 521–541. [Google Scholar] [CrossRef]
  90. Perrow, C.B. The Limits of Safety: The Enhancement of a Theory of Accidents. J. Contingencies Crisis Manag. 1994, 2, 212–220. [Google Scholar] [CrossRef]
  91. Perrow, C.B. Accidents in High-Risk Systems. J. Technol. Stud. 1994, 1, 1–20. [Google Scholar]
  92. Perrow, C.B. A Personal note on Normal Accidents. Organ. Environ. 2004, 17, 9–14. [Google Scholar] [CrossRef]
  93. Perrow, C.B. A Response. Int. Public Manag. J. 2007, 10, 191–200. [Google Scholar] [CrossRef]
  94. Perrow, C.B. The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters; Updated paperback edition; Princeton University Press: Princeton, NJ, USA, 2011. [Google Scholar]
  95. Perrow, C.B. Fukushima and the inevitability of accidents. Bull. At. Sci. 2011, 67, 44–52. [Google Scholar] [CrossRef]
  96. Perrow, C.B. Getting to Catastrophe: Concentration, Complexity and Coupling. The Montreal Review. December 2012. Available online: https://www.themontrealreview.com/2009/Normal-Accidents-Living-with-High-Risk-Technologies.php (accessed on 28 March 2021).
  97. La Porte, T.R. High Reliability Organizations: Unlikely, Demanding and At Risk. J. Contingencies Crisis Manag. 1996, 4, 60–71. [Google Scholar] [CrossRef]
  98. Roberts, K.H. Book Review Essay, Managing the Unexpected: Six years of HRO-Literature Reviewed. J. Contingencies Crisis Manag. 2009, 17, 50–54. [Google Scholar] [CrossRef]
  99. Le Coze, J.C. In the Footsteps of Turner: From Grounded Theory to Conceptual Ethnography in Safety. In Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; Gould, K.P., Macrae, C., Eds.; CRC Press: Boca Raton, FL, USA, 2021; pp. 49–68. [Google Scholar]
  100. Parsons, T. On Building Social System Theory: A Personal History. Daedalus 1970, 99, 826–881. Available online: http://www.jstor.org/stable/20023975 (accessed on 17 June 2021).
  101. Parsons, T. The Social System (With a New Preface by Bryan S. Turner); First Edition 1951; Routledge: London, UK, 1991. [Google Scholar]
  102. Weick, K.E. Educational Organizations as Loosely Coupled Systems. Adm. Sci. Q. 1976, 21, 1–19. [Google Scholar] [CrossRef]
  103. Turner, B.A. The Social Aetiology of Disasters. Disasters 1979, 3, 53–59. [Google Scholar] [CrossRef]
  104. Woodward, J. Industrial Organization: Theory and Practice; Oxford University Press: Oxford, UK, 1965. [Google Scholar]
  105. Perrow, C.B. From Medieval History to Smashing the Medieval Account of Organizations. In Technology and Organization: Essays in the Honour of Joan Woodward; Phillips, N., Griffiths, D., Sewell, G., Eds.; Research in the Sociology of Organizations; Emerald Group Publishing: Bradford, UK, 2010; Volume 29, pp. 25–28. [Google Scholar] [CrossRef]
  106. Pidgeon, N.F. In Retrospect: Normal Accidents. Nature 2011, 477, 404–405. [Google Scholar] [CrossRef]
  107. Turner, B.A. Introduction. In Organizational Symbolism; Turner, B.A., Ed.; Walter de Gruyter: Berlin, Germany, 1990; pp. 364–384. [Google Scholar]
  108. Turner, B.A. The Rise of Organisational Symbolism. In The Theory and Philosophy of Organizations: Critical Issues and New Perspectives; Hassard, J., Pym, D., Eds.; Routledge: London, UK, 1990; Chapter 5; pp. 83–96. [Google Scholar]
  109. Hassard, J. Pop Culture Magicians Seek Honest-Grappler-after-Truth for Marginal Discussion. Organ. Stud. 1999, 20, 561–578. [Google Scholar] [CrossRef]
  110. Howd, J. Personal communication, 21–24 February, 2 April and 19–28 June 2021; 19 April 2023.
  111. Turner, B.A. Some Practical Aspects of Qualitative Data Analysis: One Way of Organising the Cognitive Processes Associated with the Generation of Grounded Theory. Qual. Quant. 1981, 15, 225–247. [Google Scholar] [CrossRef]
  112. Turner, B.A. Connoisseurship in the Study of Organizational Cultures. In Doing Research in Organizations; Alan, B., Ed.; Routledge: London, UK, 1988; Chapter 7; pp. 108–122. [Google Scholar]
  113. Gherardi, S.; Turner, B.A. Real Men Don’t Collect Soft Data. In Quaderno; Universita di Trento, Dipartimento di Politica: Trento, Italy, 1987; Volume 3; reprinted in The Qualitative Researcher’s Companion; Huberman, A.M., Miles, M.B., Eds.; SAGE Publications Inc.: New York, NY, USA, 2002; Part I, Chapter 4. [Google Scholar] [CrossRef]
  114. Gherardi, S.; Strati, A.; Turner, B.A. Industrial Democracy and Organizational Symbolism. In Organizational Democracy: Taking stock. International Handbook of Participation in Organizations; Lammers, C.J., Széll, G., Eds.; De Gruyter: Berlin, Germany; Oxford University Press: Oxford, UK, 1989. [Google Scholar]
  115. Martin, P.Y.; Turner, B.A. Grounded Theory and Organizational Research. J. Appl. Behav. Sci. 1986, 22, 141–157. [Google Scholar] [CrossRef]
  116. Pidgeon, N.F. Safety Culture and Risk Management in Organizations. J. Cross Cult. Psychol. 1991, 22, 129–140. [Google Scholar] [CrossRef]
  117. Turner, B.A.; Pidgeon, N.F.; Blockley, D.I.; Toft, B. Safety Culture: Its Importance in Future Risk Management. Position Paper for the Second World Bank Workshop on Safety Control and Risk Management, Karlstad, Sweden, 6–9 November 1989; University of Bristol: Bristol, UK, 1989.
  118. Toft, B.; Turner, B.A. The Schematic Report Analysis Diagram: A Simple Aid to Learning from Large-scale Failures. Int. CIS J. 1987, 1, 12–23; reprinted in Risk Management, Volume II: Management and Control; Mars, G., Weir, D.T., Eds.; Routledge: London, UK, 2000; pp. 435–446. [Google Scholar]
  119. Turner, B.A.; Toft, B. Organizational Learning from Disasters. In Emergency Planning for Industrial Hazards. Proceedings of the European Conference on Emergency Planning for Industrial Hazards, Varese, Italy, 4–6 November 1987; Gow, H.B., Kay, R.W., Eds.; Commission of the European Communities; Elsevier: London, UK, 1988; Chapter 31; pp. 297–313. [Google Scholar]
  120. Turner, B.A. Organisational Responses to Hazard. In Risk: A Seminar Series, IIASA Collaborative Proceedings Series CP-82-S2, 1981; Kunreuther, H., Ed.; International Institute for Applied Systems Analysis: Laxenburg, Austria, 1982; Part I; pp. 49–86. [Google Scholar]
  121. Turner, B.A. Empty Portmanteaux? Organ. Stud. 1984, 5, 269–273. [Google Scholar] [CrossRef]
  122. Turner, B.A. Accidents and Non-random Error Propagation. Risk Anal. 1989, 9, 437–444. [Google Scholar] [CrossRef]
  123. Turner, B.A. How can we design a safe organisation? In Proceedings of the Second International Conference on Industrial and Organisational Crisis Management, New York, NY, USA, 3–4 November 1989; Leonard, N., Ed.; Stern School of Business, New York University: New York, NY, USA, 1989. [Google Scholar]
  124. Turner, B.A. The Development of a Safety Culture. Chem. Ind. 1991, 1, 241–243. [Google Scholar]
  125. Turner, B.A. The Sociology of Safety. In Engineering Safety; Blockley, D.I., Ed.; McGraw Hill: London, UK, 1992; pp. 186–201. [Google Scholar]
  126. Turner, B.A. Software and Contingency: The Text and Vocabulary of System Failure? J. Contingencies Crisis Manag. 1994, 2, 31–38. [Google Scholar] [CrossRef]
  127. Turner, B.A. The Future for Risk Research. J. Contingencies Crisis Manag. 1994, 2, 146–156. [Google Scholar] [CrossRef]
  128. Turner, B.A. Patterns of crisis behaviour—A qualitative inquiry. In Analyzing Qualitative Data; Bryman, A., Burgess, R.G., Eds.; Routledge: London, UK, 1994; pp. 195–215. [Google Scholar] [CrossRef]
  129. Turner, B.A. The role of flexibility and improvisation in emergency response; and A perspective from the social sciences. In Natural Risk and Civil Protection; Horlick-Jones, T., Amendola, A., Casale, R., Eds.; European Commission/E&FN SPON: London, UK, 1995; pp. 463–475, 535–537. [Google Scholar]
  130. Turner, B.A. Safety Culture Management: Safety Culture and its Context, Proceedings of the International Topical Meeting on Safety Culture in Nuclear Installations, Vienna, Austria, 24–28 April 1995; Carnino, A., Weimann, G., Eds.; American Nuclear Society Austria Local Section: Vienna, Austria, 1995; pp. 322–329. Available online: https://inis.iaea.org/collection/NCLCollectionStore/_Public/27/036/27036465.pdf?r=1 (accessed on 27 April 2021).
  131. Horlick-Jones, T.; Amendola, A.; Casale, R. (Eds.) Natural Risk and Civil Protection; European Commission/E&FN SPON: London, UK, 1995. [Google Scholar]
  132. Toft, B. The Failure of Hindsight. Disaster Prev. Manag. 1992, 1, 48–60, reprinted in Risk Management Volume II: Management and Control; Mars, G., Weir, D., Eds.; Ashgate: Dartmouth, UK, 2000; Chapter 34. [Google Scholar] [CrossRef]
  133. Haastrup, P.; Funtowicz, S. Accident generating systems and chaos: A dynamic study of accident time series. Reliab. Eng. Syst. 1992, 35, 31–37. [Google Scholar] [CrossRef]
  134. Perrow, C.B. Organizing America: Wealth, Power, and the Origins of Corporate Capitalism; Princeton University Press: Princeton, NJ, USA, 2002. [Google Scholar]
  135. Perrow, C.B. The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters; Princeton University Press: Princeton, NJ, USA, 2007. [Google Scholar]
  136. Perrow, C.B. A Society of Organizations. Theory Soc. 1991, 20, 725–762. [Google Scholar] [CrossRef]
  137. Perrow, C.B. Organisational Theorists in a Society of Organisations. Int. Sociol. 1992, 7, 371–380. [Google Scholar] [CrossRef]
  138. Perrow, C.B. Negative Synergy—Review of Learning from Disasters: A Management Approach by Brian Toft and Simon Reynolds. Nature 1994, 370, 607–608. [Google Scholar] [CrossRef]
  139. Perrow, C.B. Organizing for Environmental Destruction. Organ. Environ. 1997, 10, 66–72. [Google Scholar] [CrossRef]
  140. Perrow, C.B. Organizing to Reduce the Vulnerabilities of Complexity. J. Contingencies Crisis Manag. 1999, 7, 150–155. [Google Scholar] [CrossRef]
  141. Perrow, C.B. An Organizational Analysis of Organizational Theory. Contemp. Sociol. 2000, 29, 469–476. [Google Scholar] [CrossRef]
  142. Perrow, C.B. Organizational or Executive Failures? Contemp. Sociol. 2005, 34, 99–107. [Google Scholar] [CrossRef]
  143. Perrow, C.B. Shrink the Targets. IEEE Spectrum 2006, 43, 46–49. [Google Scholar] [CrossRef]
  144. Perrow, C.B. Disasters Ever More? Reducing U.S. Vulnerabilities. In Handbook of Disaster Research; Rodriguez, H., Quarantelli, E.L., Dynes, R.R., Eds.; Springer: New York, NY, USA, 2007; pp. 521–533. [Google Scholar] [CrossRef]
  145. Perrow, C.B. Complexity, Catastrophe, and Modularity. Sociol. Inq. 2008, 78, 162–173. [Google Scholar] [CrossRef]
  146. Perrow, C.B. Conservative Radicalism. Organization 2008, 15, 915–921. [Google Scholar] [CrossRef]
  147. Perrow, C.B. Modeling firms in the global economy. Theory Soc. 2009, 38, 217–243. [Google Scholar] [CrossRef]
  148. Perrow, C.B. Resilience Rather than Prevention and Recovery. Build. Res. Inf. 2009, 37, 213–216. [Google Scholar] [CrossRef]
  149. Perrow, C.B. Book Review—High Reliability Management: Operating on the Edge by Emery Roe & Paul R. Schulman. Adm. Sci. Q. 2009, 54, 364–367. Available online: https://www.jstor.org/stable/27749335 (accessed on 27 March 2021).
  150. Perrow, C.B. What’s needed is application, not reconciliation: A response to Shrivastava, Sonpar and Pazzaglia. Hum. Relat. 2009, 62, 1391–1393. [Google Scholar] [CrossRef]
  151. Perrow, C.B. The meltdown was not an accident. In Markets on Trial: The Economic Sociology of the U.S. Financial Crisis: Part A; Research in the Sociology of Organizations; Lounsbury, M., Hirsch, P.M., Eds.; Emerald Group Publishing: Bingley, UK, 2010; Volume 30A, pp. 309–330. [Google Scholar] [CrossRef]
  152. Perrow, C.B. Drinking Deep at Black Mountain College. South. Cult. 2013, 19, 76–94. [Google Scholar] [CrossRef]
  153. Perrow, C.B. Cracks in the ‘Regulatory State’. Soc. Curr. 2015, 2, 203–212. [Google Scholar] [CrossRef]
  154. Perrow, C.B. Effectiveness of Regulatory Agencies. In The Routledge Companion to Risk, Crisis and Emergency Management; Gephart, R., Miller, C., Helgesson, K., Eds.; Routledge: London, UK, 2018; Chapter 36; pp. 508–512. [Google Scholar]
  155. Gephart, R.P. Making Sense of Organizationally Based Environmental Disasters. J. Manag. 1984, 10, 205–225. [Google Scholar] [CrossRef]
  156. Sagan, S.D. The Limits of Safety: Organizations, Accidents and Nuclear Weapons; Princeton University Press: Princeton, NJ, USA, 1993. [Google Scholar]
  157. Clarke, L. Acceptable Risk? Making Decisions in a Toxic Environment; University of California Press: Berkeley, CA, USA, 1989. [Google Scholar]
  158. Clarke, L. Mission Improbable: Using Fantasy Documents to Tame Disaster; University of Chicago Press: Chicago, IL, USA, 1999. [Google Scholar]
  159. Clarke, L.; Perrow, C.B. Prosaic Organizational Failure. Am. Behav. Sci. 1996, 39, 1040–1056. [Google Scholar] [CrossRef]
  160. Snook, S.A. Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq; Princeton University Press: Princeton, NJ, USA, 2000. [Google Scholar]
  161. La Porte, T.R. A Strawman Speaks Up: Comments on The Limits of Safety. J. Contingencies Crisis Manag. 1994, 2, 207–211. [Google Scholar] [CrossRef]
  162. La Porte, T.R.; Rochlin, G.I. A Rejoinder to Perrow. J. Contingencies Crisis Manag. 1994, 2, 221–227. [Google Scholar] [CrossRef]
  163. Rochlin, G.I. Safe operation as a social construct. Ergonomics 1999, 42, 1549–1560. [Google Scholar] [CrossRef]
  164. Douglas, M.; Wildavsky, A. Risk and Culture: An Essay on the Selection of Technological and Environmental Dangers; University of California Press: Berkeley, CA, USA, 1982. [Google Scholar]
  165. Douglas, M. Loose Ends and Complex Arguments—Review Essay of Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Contemp. Sociol. 1985, 14, 171–173. Available online: https://www.jstor.org/stable/2070132 (accessed on 27 March 2021). [CrossRef]
  166. Hopkins, A. The limits of normal accident theory. Saf. Sci. 1999, 32, 93–102. [Google Scholar] [CrossRef]
  167. Hopkins, A. Was Three Mile Island a normal accident? J. Contingencies Crisis Manag. 2001, 9, 65–72. [Google Scholar] [CrossRef]
  168. McGill, A.R. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Hum. Resour. Manag. 1984, 23, 434–436. [Google Scholar] [CrossRef]
  169. Hirschhorn, L. On Technological Catastrophe—Normal Accidents: Living with High-Risk Technologies. Science 1985, 228, 846–847. [Google Scholar] [CrossRef]
  170. Kates, R.W. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Prof. Geogr. 1986, 38, 121–122. [Google Scholar] [CrossRef]
  171. Roberts, K. The significance of Perrow’s Normal Accidents. Acad. Manag. Rev. 1989, 14, 285–289. [Google Scholar] [CrossRef]
  172. Rossi, P.H. Book Review—Normal Accidents: Living with High Risk Technologies by Charles Perrow. Am. J. Sociol. 1985, 91, 181–184. Available online: https://www.jstor.org/stable/2779895 (accessed on 28 March 2021). [CrossRef]
  173. Wildavsky, A. But Is It True? A Citizen’s Guide to Environmental Health and Safety Issues; Harvard University Press: Cambridge, MA, USA, 1995. [Google Scholar]
  174. Cummings, L.L. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Adm. Sci. Q. 1985, 29, 630–632. [Google Scholar] [CrossRef]
  175. Grimes, A.J. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Acad. Manag. Rev. 1985, 10, 366–368. [Google Scholar] [CrossRef]
  176. Ravetz, J. Making accidents ‘normal’—Review of Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Futures 1985, 17, 287–288. [Google Scholar] [CrossRef]
  177. Turkstra, C.J. Book Review—Normal Accidents: Living with High Risk Technologies, by Charles Perrow. Struct. Saf. 1986, 4, 165. [Google Scholar] [CrossRef]
  178. Williams, B. Accidents Will Happen—Review of Normal Accidents: Living with High Risk Technologies, by Charles Perrow. Soc. Stud. Sci. 1988, 18, 556–560. [Google Scholar] [CrossRef]
  179. Jermier, J.M. “Complex Systems Threaten to Bring Us Down …”: Introduction to the Symposium on Normal Accidents. Organ. Environ. 2004, 17, 5–8. [Google Scholar] [CrossRef]
  180. Rosa, E.A. Celebrating a Citation Classic—And More: Symposium on Charles Perrow’s Normal Accidents. Organ. Environ. 2005, 18, 229–234. [Google Scholar] [CrossRef]
  181. Sagan, S.D. Learning from Normal Accidents. Organ. Environ. 2004, 17, 15–19. [Google Scholar] [CrossRef]
  182. Le Coze, J.C. 1984–2014. Normal Accidents. Was Charles Perrow Right for the Wrong Reasons? J. Contingencies Crisis Manag. 2015, 23, 275–286. [Google Scholar] [CrossRef]
  183. Hopkins, A. Issues in safety science. Saf. Sci. 2014, 67, 6–14. [Google Scholar] [CrossRef]
  184. Hopkins, A. Managing Major Hazards: The Lessons of the Moura Mine Disaster; Allen & Unwin: Sydney, Australia, 1999. [Google Scholar]
  185. Hopkins, A. Counteracting the Cultural Causes of Disaster. J. Contingencies Crisis Manag. 1999, 7, 141–149. [Google Scholar] [CrossRef]
  186. Hopkins, A. Lessons from Longford; CCH: Sydney, Australia, 2000. [Google Scholar]
  187. Hopkins, A. A culture of denial: Sociological similarities between the Moura and Gretley mine disasters. J. Occup. Health Saf.—Aust. N. Z. 2000, 16, 29–36. [Google Scholar]
  188. Hopkins, A. Lessons from Gretley: Mindful Leadership and the Law; CCH: Sydney, Australia, 2007. [Google Scholar]
  189. Cyert, R.M.; March, J.G. A Behavioral Theory of the Firm; Prentice-Hall: Englewood Cliffs, NJ, USA, 1963. [Google Scholar]
  190. Cohen, M.D.; March, J.G.; Olsen, J.P. A Garbage Can Model of Organizational Choice. Adm. Sci. Q. 1972, 17, 1–25. [Google Scholar] [CrossRef]
  191. Toft, B. Personal communication, 21–30 June 2021 (including copies of correspondence to and from Professor J.T. Reason of 17 and 21 July 1987, respectively); 15 December 2022.
  192. Pidgeon, N.F. Personal communication, 21–23 April, 2–4 June and 8–9 June 2021.
  193. Perrow, C.B. Letter to Mrs Turner, 21 November 1995. The letter was provided to the primary author as an emailed attachment in a personal communication from Janet Howd (aka Janet Turner) on 24 February 2021.
  194. Editorial Introduction: A Special Symposium on Barry Turner’s Work on Man-Made Disasters. J. Contingencies Crisis Manag. 1998, 6, 71. [CrossRef]
  195. Toft, B.; Reynolds, S. Learning from Disasters: A Management Approach; Butterworth-Heinemann: Oxford, UK, 1994. [Google Scholar]
  196. Ashmos, D.P.; Huber, G.P. The Systems Paradigm in Organization Theory: Correcting the Record and Suggesting the Future. Acad. Manag. Rev. 1987, 12, 607–621. [Google Scholar] [CrossRef]
  197. Boulding, K.E. General Systems Theory—The Skeleton of Science. Manag. Sci. 1956, 2, 197–208. [Google Scholar] [CrossRef]
  198. Kast, F.E.; Rosenzweig, J.E. General Systems Theory: Applications for Organization and Management. Acad. Manag. J. 1972, 15, 447–465. [Google Scholar] [CrossRef]
  199. Von Bertalanffy, L. The History and Status of General Systems Theory. Acad. Manag. J. 1972, 15, 407–426. [Google Scholar] [CrossRef]
  200. Weick, K.E. The vulnerable system: An analysis of the Tenerife air disaster. In New Challenges to Understanding Organizations; Roberts, K.H., Ed.; Macmillan: New York, NY, USA, 1993; pp. 173–198. [Google Scholar]
  201. Pidgeon, N.F. Observing the English Weather: A Personal Journey from Safety I to Safety IV. In Safety Science Research: Evolution, Challenges and New Directions; Le Coze, J.C., Ed.; CRC Press: Boca Raton, FL, USA, 2020; pp. 269–279. [Google Scholar]
  202. Pidgeon, N.F.; O’Leary, M. Man-made Disasters: Why technology and organizations (sometimes) fail. Saf. Sci. 2000, 34, 15–30. [Google Scholar] [CrossRef]
  203. Blockley, D.I. Managing Proneness to Failure. J. Contingencies Crisis Manag. 1998, 6, 76–79. [Google Scholar] [CrossRef]
  204. Blockley, D.I. Building Bridges: Between Theory and Practice; World Scientific Publishing Europe Ltd.: London, UK, 2020. [Google Scholar]
  205. Blockley, D.I. Personal communication, 24 and 26 June 2021.
  206. Toft, B. External Review of Never Events in Interventional Procedures. Co-Commissioned by Sheffield Teaching Hospitals NHS Foundation Trust and Sheffield Clinical Commissioning Group. 2014. Available online: https://docplayer.net/4636370-Professor-brian-toft-obe-june-2014.html (accessed on 14 February 2021).
  207. Toft, B.; Reynolds, S. Learning from Disasters: A Management Approach, 3rd ed.; Perpetuity Press: Leicester, UK, 2005; Available online: https://link.springer.com/book/10.1007/978-1-349-27902-9 (accessed on 7 September 2023).
  208. Gherardi, S. Speaking Personally: Remembering Barry Turner. Organization 1995, 2, 547–549. [Google Scholar] [CrossRef]
  209. Gherardi, S. A Cultural Approach to Disasters. J. Contingencies Crisis Manag. 1998, 6, 80–83. [Google Scholar] [CrossRef]
  210. Gherardi, S. Man-made Disasters … Twenty Years On. Organ. Stud. 1999, 20, 695–700. [Google Scholar] [CrossRef]
  211. Vaughan, D.; Turner, B.A.; Pidgeon, N.F. Foreword. In Man-Made Disasters, 2nd ed.; Butterworth-Heinemann: Oxford, UK, 1997. [Google Scholar]
  212. Rosenthal, U.; Turner, B.A.; Pidgeon, N.F. Foreword. In Man-Made Disasters, 2nd ed.; Butterworth-Heinemann: Oxford, UK, 1997. [Google Scholar]
  213. Google Scholar. Citation searches for all editions of Man-made Disasters and Normal Accidents: Living with High-Risk Technologies, 29 May 2021. Available online: https://scholar.google.com.au/ (accessed on 29 May 2021).
  214. Merigó, J.; Miranda, J.; Modak, N.; Boustras, G.; de la Sotta, C. Forty years of Safety Science: A bibliometric overview. Saf. Sci. 2019, 115, 66–88. [Google Scholar] [CrossRef]
  215. Hale, A.R.; Hale, M. Accidents in Perspective. Occup. Psychol. 1970, 44, 115–121. [Google Scholar]
  216. Hale, A.R.; Hale, M. A Review of the Industrial Accident Research Literature; Research Paper 2; Committee on Safety and Health at Work, H.M.S.O.: London, UK, 1972. [Google Scholar]
  217. Hale, A.R.; Glendon, I. Individual Behaviour in the Control of Danger. Industrial Safety Series; Elsevier: Amsterdam, The Netherlands, 1987; Volume 2. [Google Scholar]
  218. Hale, A.R.; Hovden, J. Management and Culture: The third age of safety. A review of approaches to organizational aspects of safety, health and environment. In Occupational Injury: Risk, Prevention, and Intervention; Feyer, A.M., Williamson, A., Eds.; CRC Press: London, UK, 1998; pp. 129–165. [Google Scholar]
  219. Meshkati, N. Self-organisation, requisite variety and cultural environment: Three links of a safety chain to harness complex technological systems. In Proceedings of the Second World Bank Workshop on Risk Management (In Large-Scale Technological Operations) Organised Jointly with the Swedish Rescue Services Board, Karlstad, Sweden, 6–11 November 1989. [Google Scholar]
  220. Dwyer, T. Life and Death at Work: Industrial Accidents as a Case of Socially Produced Error; Springer Science+Business Media: New York, NY, USA, 1991. [Google Scholar]
  221. Hale, A.R.; Heming, B.H.; Carthey, J.; Kirwan, B. Modelling of Safety Management Systems. Saf. Sci. 1997, 26, 121–140. [Google Scholar] [CrossRef]
  222. Hale, A.R. Culture’s Confusions—Editorial. Saf. Sci. 2000, 34, 1–14. [Google Scholar] [CrossRef]
  223. Hale, A.R. I came into safety by accident: Dr Patrick Waterson (Loughborough University) meets Professor Andrew Hale from Health and Safety Technology and Management. Psychologist 2017, 30, 64–67. Available online: https://www.bps.org.uk/psychologist/i-came-safety-accident (accessed on 14 February 2021).
  224. Hale, A.R. Review of the Industrial Accident Research Literature. Hastam Blog. 2017. Available online: https://www.hastam.co.uk/review-industrial-accident-research-literature/ (accessed on 14 February 2021).
  225. Palmer, D. Taking Stock of the Criteria We Use to Evaluate One Another’s Work: ASQ 50 Years Out. Adm. Sci. Q. 2006, 51, 535–559. [Google Scholar] [CrossRef]
  226. Weick, K.E. Enacted Sensemaking in Crisis Situations. J. Manag. Stud. 1988, 25, 305–317. [Google Scholar] [CrossRef]
  227. Weick, K.E. Foresights of Failure: An Appreciation of Barry Turner. J. Contingencies Crisis Manag. 1998, 6, 72–74. [Google Scholar] [CrossRef]
  228. Weick, K.E. Normal Accident Theory as Frame, Link, and Provocation. Organ. Environ. 2004, 17, 27–31. [Google Scholar] [CrossRef]
  229. Weick, K.E. Making Sense of the Organization, Volume 2: The Impermanent Organization; Also Available Online as a ProQuest Ebook; John Wiley & Sons: Chichester, UK, 2009. [Google Scholar]
  230. Weick, K.E.; Sutcliffe, K.M.; Obstfeld, D. Organizing and the Process of Sensemaking. Organ. Sci. 2005, 16, 409–421. [Google Scholar] [CrossRef]
  231. Weick, K.E. Reflections on Enacted Sensemaking in the Bhopal Disaster. J. Manag. Stud. 2010, 47, 537–550. [Google Scholar] [CrossRef]
  232. Weick, K.E. Organizational culture as a source of high reliability. Calif. Manag. Rev. 1987, XXIX, 112–127. [Google Scholar] [CrossRef]
  233. Rasmussen, J. Publications. Technical University of Denmark Website, Denmark. 2021. Available online: http://www.jensrasmussen.org/publikations (accessed on 26 June 2021).
  234. Reason, J. Organizational Accidents Revisited; Ashgate: Farnham, UK, 2016. [Google Scholar]
  235. Rasmussen, J. Human Errors. A Taxonomy for describing human malfunction in industrial installations. J. Occup. Accid. 1982, 4, 311–333. [Google Scholar] [CrossRef]
  236. Rasmussen, J. Skills, Rules, and Knowledge; Signals, Signs, and Symbols, and Other Distinctions in Human Performance Models. IEEE Trans. Syst. Man Cybern. 1983, SMC-13, 257–266. [Google Scholar] [CrossRef]
  237. Rasmussen, J. Human Error and the problem of causality in analysis of accidents. Philos. Trans. R. Soc. Lond. 1990, B327, 449–462. [Google Scholar] [CrossRef]
  238. Rasmussen, J. Risk management in a dynamic society: A modelling problem. Saf. Sci. 1997, 27, 183–213. [Google Scholar] [CrossRef]
  239. Rasmussen, J.; Svedung, I. Proactive Risk Management in a Dynamic Society; Swedish Rescue Services Agency: Karlstad, Sweden, 2000. [Google Scholar]
  240. Svedung, I.; Rasmussen, J. Graphic representation of accident scenarios: Mapping structure and the causation of accidents. Saf. Sci. 2002, 40, 397–417. [Google Scholar] [CrossRef]
  241. Rasmussen, J. Human Factors in High-Risk Systems. In Proceedings of the Conference Record for 1988 IEEE Conference on Human Factors and Power Plants, Monterey, CA, USA, 5–9 June 1988; pp. 43–48. [Google Scholar] [CrossRef]
  242. Rasmussen, J. Man-Machine Communication in the Light of Accident Record. In IEEE Conference Records, Proceedings of the International Symposium on Man-Machine Systems, Cambridge, UK, 8–12 September 1969; 69C58-MMS. Volume 3, p. 3. [Google Scholar]
  243. Rasmussen, J. Outlines of a hybrid model of the process plant operation. In Monitoring Behavior and Supervisory Control, Proceedings of the International Symposium on Monitoring Behavior and Supervisory Control, Berchtesgaden, Germany, 8–12 March 1976; Plenum Press: New York, NY, USA, 1976; Chapter 31; pp. 371–383. [Google Scholar] [CrossRef]
  244. Rasmussen, J. Human Error Mechanisms in Complex Work Environments. Reliab. Eng. Syst. 1988, 22, 155–167. [Google Scholar] [CrossRef]
  245. Rasmussen, J.; Jensen, A. Mental Procedures in Real-Life Tasks: A Case Study of Electronic Trouble Shooting. Ergonomics 1974, 17, 293–307. [Google Scholar] [CrossRef] [PubMed]
  246. Rasmussen, J.; Pedersen, O.M. Formalized Search Strategies for Human Risk Contributions: A Framework for Further Development. Risø National Laboratory. Risø-M-2351. July 1982. Available online: https://backend.orbit.dtu.dk/ws/portalfiles/portal/53704802/ris_m_2351.pdf (accessed on 8 June 2021).
  247. Rasmussen, J.; Pedersen, O.M. Human factors in probabilistic risk analysis and in risk management. In IAEA Operational Safety of Nuclear Power Plants, Proceedings of the International Symposium on Operational Safety of Nuclear Power Plants, Marseilles, France, 2–6 May 1983; International Atomic Energy Agency Proceedings Series, IAEA-SM-268/2; IAEA: Vienna, Austria, 1984; pp. 181–194. [Google Scholar]
  248. Rasmussen, J. Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering; North-Holland System Science and Engineering Series Volume 12; North-Holland: New York, NY, USA, 1986. [Google Scholar]
  249. Rasmussen, J.; Batstone, R. Why Do Complex Organizational Systems Fail? Results of a Workshop on Safety Control and Risk Management Held in Washington, DC from 18–20 October 1988. The World Bank Policy Planning and Research Staff: Environment Working Paper No. 20, October 1989. Available online: https://documents.worldbank.org/en/publication/documents-reports/documentdetail/535511468766200820/why-do-complex-organizational-systems-fail (accessed on 6 June 2021).
  250. Rasmussen, J.; Batstone, R.; Rosenberg, T. (Eds.) Workshop on Safety Control and Risk Management: An Overview. Karlstad, Sweden, 6–8 November 1989; Sponsored by the World Bank and the Swedish Rescue Services Board. Paper Published in 1991 by the Swedish Rescue Services Board, Karlstad, Sweden. Available online: www.orbit.dtu.dk (accessed on 29 June 2021).
  251. Le Coze, J.C. New models for new times. An anti-dualist move. Saf. Sci. 2013, 59, 200–218. [Google Scholar] [CrossRef]
  252. Le Coze, J.C. Reflecting on Jens Rasmussen’s legacy. A strong program for a hard problem. Saf. Sci. 2015, 71, 123–141. [Google Scholar] [CrossRef]
  253. Le Coze, J.C. Reflecting on Jens Rasmussen’s legacy (2) behind and beyond, a ‘constructivist turn’. Appl. Ergon. 2017, 59, 558–569. [Google Scholar] [CrossRef]
  254. Dekker, S.W. Rasmussen’s legacy and the long arm of rational choice. Appl. Ergon. 2017, 59, 554–557. [Google Scholar] [CrossRef]
  255. Leveson, N.G. Rasmussen’s legacy: A paradigm change in engineering for safety. Appl. Ergon. 2017, 59, 581–591. [Google Scholar] [CrossRef]
  256. Wise, J.A.; Debons, A. (Eds.) Information Systems: Failure Analysis; Proceedings of the NATO Advanced Research Workshop on Failure Analysis of Information Systems, Bad Windsheim, Germany, 18–22 August 1986; Springer-Verlag: Berlin, Germany, 1987. [Google Scholar]
  257. Reason, J.T. An Interactionist’s View of System Pathology. In Information Systems: Failure Analysis. Proceedings of the NATO Advanced Research Workshop on Failure Analysis of Information Systems, Bad Windsheim, Germany, 18–22 August 1986; Wise, J.A., Debons, A., Eds.; Springer: Berlin, Germany, 1987; pp. 211–220. [Google Scholar] [CrossRef]
  258. Reason, J.T. The Chernobyl errors. Bull. Br. Psychol. Soc. 1987, 40, 201–206. [Google Scholar]
  259. Reason, J.T. Errors and Evaluations: The lessons of Chernobyl. In Proceedings of the 1988 IEEE Conference on Human Factors and Power Plants, Monterey, CA, USA, 5–9 June 1988; pp. 537–540. [Google Scholar] [CrossRef]
  260. Reason, J.T. Human Error; Cambridge University Press: Cambridge, UK, 1990. [Google Scholar]
  261. Reason, J.T. The contribution of latent human failures to the breakdown of complex systems. Phil. Trans R. Soc. B 1990, 327, 475–484. [Google Scholar] [CrossRef]
  262. Reason, J.T. Managing the Risks of Organizational Accidents; Ashgate: Aldershot, UK, 1997. [Google Scholar]
  263. Reason, J.T. Achieving a safe culture: Theory and practice. Work Stress 1998, 12, 293–306. [Google Scholar] [CrossRef]
  264. Reason, J.T. The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries; Ashgate: Farnham, UK, 2008. [Google Scholar]
  265. Reason, J.T. A Life in Error: From Little Slips to Big Disasters; Ashgate: Farnham, UK, 2013. [Google Scholar]
  266. Reason, J.T. Skill and error in everyday life. In Adult Learning: Psychological Research and Applications; Howe, M.J., Ed.; Wiley: London, UK, 1977. [Google Scholar]
  267. Reason, J.T.; Mycielska, K. Absent Minded: The Psychology of Mental Lapses and Everyday Errors; Prentice Hall: Englewood Cliffs, NJ, USA, 1983. [Google Scholar]
  268. Vaughan, D. Autonomy, interdependence and social control: NASA and the space shuttle Challenger. Adm. Sci. Q. 1990, 35, 225–237. [Google Scholar] [CrossRef]
  269. Vaughan, D. Regulating risk: Implications of the Challenger Accident. In Organizations, Uncertainties, and Risk; Short, J.F., Clarke, L., Eds.; Routledge & Westview Press: Boulder, CO, USA, 1992; pp. 235–253. [Google Scholar]
  270. Vaughan, D. The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA; The University of Chicago Press: Chicago, IL, USA, 1996. [Google Scholar]
  271. Vaughan, D. The Trickle-Down Effect: Policy Decisions, Risky Work, and the Challenger Tragedy. Calif. Manag. Rev. 1997, 39, 80–102. [Google Scholar] [CrossRef]
  272. Vaughan, D. The Dark Side of Organizations: Mistake, Misconduct, and Disaster. Annu. Rev. Sociol. 1999, 25, 271–305. [Google Scholar] [CrossRef]
  273. Vaughan, D. The Role of the Organization in the Production of Techno-Scientific Knowledge. Soc. Stud. Sci. 1999, 29, 913–943. [Google Scholar] [CrossRef]
  274. Vaughan, D. Theorizing Disaster: Analogy, historical ethnography, and the Challenger accident. Ethnography 2004, 5, 315–347. [Google Scholar] [CrossRef]
  275. Vaughan, D. Interview: Diane Vaughan—Sociologist, Columbia University. Consultant. 2008. Available online: https://www.consultingnewsline.com/Info/Vie%20du%20Conseil/Le%20Consultant%20du%20mois/Diane%20Vaughan%20%28English%29.html (accessed on 27 April 2021).
  276. Vaughan, D. Dead Reckoning: Air Traffic Control, System Effects, and Risk; The University of Chicago Press: Chicago, IL, USA, 2021. [Google Scholar]
  277. Leveson, N.G. Safeware: Systems Safety and Computers: A Guide to Preventing Accidents and Losses Caused by Technology; Addison-Wesley: Boston, MA, USA, 1995. [Google Scholar]
  278. Leveson, N.G. A new accident model for engineering safety systems. Saf. Sci. 2004, 42, 237–270. [Google Scholar] [CrossRef]
  279. Leveson, N.G. Applying systems thinking to analyze and learn from events. Saf. Sci. 2011, 49, 55–64. [Google Scholar] [CrossRef]
  280. Leveson, N.G. Engineering a Safer World: Systems Thinking Applied to Safety; The MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
  281. Leveson, N.G.; Dulac, N.; Marais, K.; Carroll, J. Moving Beyond Normal Accidents and High Reliability Organizations: A Systems Approach to Safety in Complex Systems. Organ. Stud. 2009, 30, 227–249. [Google Scholar] [CrossRef]
  282. Hopkins, A. Disastrous Decisions: The Human and Organisational Causes of the Gulf of Mexico Blowout; CCH: Sydney, Australia, 2012. [Google Scholar]
  283. Hollnagel, E. Books and Papers. Available online: https://erikhollnagel.com (accessed on 13 March 2021).
  284. Hollnagel, E. Barriers and Accident Prevention; Ashgate: Aldershot, UK, 2004. [Google Scholar]
  285. Lundberg, J.; Rollenhagen, C.; Hollnagel, E. What-You-Look-For-Is-What-You-Find—The consequences of underlying accident models in eight accident investigation manuals. Saf. Sci. 2009, 47, 1297–1311. [Google Scholar] [CrossRef]
  286. Hollnagel, E. FRAM, The Functional Resonance Analysis Method: Modelling Complex Socio-Technical Systems; Ashgate: Farnham, UK, 2012. [Google Scholar]
  287. Hollnagel, E. The ETTO Principle: Efficiency-Thoroughness Trade-Off—Why Things that Go Right Sometimes Go Wrong; Ashgate: Farnham, UK, 2009. [Google Scholar]
  288. Hollnagel, E. Safety-I and Safety-II: The Past and Future of Safety Management; Ashgate: Farnham, UK, 2014. [Google Scholar]
  289. Hollnagel, E. Safety-II in Practice: Developing the Resilience Potentials; Routledge: Abingdon, UK, 2018. [Google Scholar]
  290. Dekker, S.W. Books and Papers. Available online: https://sidneydekker.com (accessed on 13 March 2021).
  291. Dekker, S.W. The Field Guide to Human Error Investigations; Ashgate: Aldershot, UK, 2002. [Google Scholar]
  292. Dekker, S.W. The Field Guide to Understanding Human Error, 2nd ed.; Ashgate/CRC Press: Aldershot, UK, 2006. [Google Scholar]
  293. Dekker, S.W. Ten Questions About Human Error: A New View of Human Factors and System Safety; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2005. [Google Scholar]
  294. Dekker, S.W. Drift into Failure: From Hunting Broken Components to Understanding Complex Systems; Ashgate: Aldershot, UK, 2011. [Google Scholar]
  295. Dekker, S.W. The Field Guide to Understanding ‘Human Error’, 3rd ed.; CRC Press: Boca Raton, FL, USA, 2014. [Google Scholar]
  296. Dekker, S.W. Safety Differently: Human Factors for a New Era, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2015. [Google Scholar]
  297. Dekker, S.W. Foundations of Safety Science: A Century of Understanding Accidents and Disasters; CRC Press: Boca Raton, FL, USA, 2019. [Google Scholar]
  298. Shrivastava, P. Bhopal: Anatomy of a Crisis, 2nd ed.; Paul Chapman: London, UK, 1992. [Google Scholar]
  299. Wildavsky, A. Searching for Safety: Social Theory and Social Policy; Routledge: New York, NY, USA, 1988. [Google Scholar]
  300. Le Coze, J.C. Are organisations too complex to be integrated in technical risk assessment and current safety auditing? Saf. Sci. 2005, 43, 613–638. [Google Scholar] [CrossRef]
  301. Le Coze, J.C. How safety culture can make us think. Saf. Sci. 2019, 118, 221–229. [Google Scholar] [CrossRef]
  302. Le Coze, J.C. Ideas for the future of safety science. Saf. Sci. 2020, 132, 104966. [Google Scholar] [CrossRef]
  303. Le Coze, J.C. (Ed.) Safety Science Research: Evolution, Challenges and New Directions; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
  304. Macrae, C. Close Calls: Managing Risk and Resilience in Airline Flight Safety; Palgrave Macmillan: London, UK, 2014. [Google Scholar]
  305. Hayes, J.; Hopkins, A. Nightmare Pipeline Failures: Fantasy Planning, Black Swans and Integrity Management; Wolters Kluwer CCH: Sydney, Australia, 2014. [Google Scholar]
  306. Dechy, N.; Dien, Y.; Hayes, J.; Paltrinieri, N. Failures of Foresight in Safety: Fantasy Risk Analysis and Blindness. In ESReDA Project Group Foresight in Safety, Enhancing Safety: The Challenge of Foresight; EUR 30441 EN; Publications Office of the European Union: Luxembourg, 2020; Chapter 3. [Google Scholar] [CrossRef]
  307. Hayes, J. Investigating Accidents: The Case for Disaster Case Studies in Safety Science. In Safety Science Research: Evolution, Challenges and New Directions; Le Coze, J.C., Ed.; CRC Press: Boca Raton, FL, USA, 2020; pp. 187–202. [Google Scholar]
  308. Quinlan, M. Ten Pathways to Death and Disaster: Learning from Fatal Incidents in Mines and Other High Hazard Workplaces; The Federation Press: Alexandria, Australia, 2014. [Google Scholar]
  309. Turner, B.A. Teaching old dogs new tricks: Restructuring the insurance industry. In Insurance Viability and Loss Mitigation: Partners in Risk Reduction; Britton, N.R., McDonald, J., Oliver, J., Eds.; Alexander Howden Re: Sydney, Australia, 1995; pp. 47–65. [Google Scholar]
  310. Short, J.F.; Rosa, E.A. Organizations, Disasters, Risk Analysis and Risk: Historical and Contemporary Contexts. J. Contingencies Crisis Manag. 1998, 6, 93–95. [Google Scholar] [CrossRef]
  311. Short, J.F. The Social Fabric at Risk: Toward the Social Transformation of Risk Analysis. Am. Sociol. Rev. 1984, 49, 711–725. [Google Scholar] [CrossRef]
  312. Gould, S.J. Eight Little Piggies: Reflections in Natural History; Originally Published in 1993 by Jonathan Cape, London and W.W. Norton and Co, New York; Vintage Digital ebook 2014; Vintage Books: London, UK, 2007. [Google Scholar]
  313. Calhoun, C. (Ed.) Introduction: On Merton’s Legacy and Contemporary Sociology. In Robert K. Merton: Sociology of Science and Sociology as Science; Columbia University Press: New York, NY, USA, 2010; pp. 1–29. [Google Scholar]
  314. Busch, C. Preventing Industrial Accidents: Reappraising H.W. Heinrich—More Than Triangles and Dominoes; Routledge: Abingdon, UK, 2021. [Google Scholar]
  315. Busch, C. Heinrich’s Local Rationality: Shouldn’t ‘New View’ Thinkers Ask Why Things Made Sense to Him? Master’s Thesis, Division of Risk Management and Societal Safety, Lund University, Lund, Sweden, 2018. Available online: https://lup.lub.lu.se/student-papers/search/publication/8975267 (accessed on 18 June 2020).
  316. Shorrock, S.T. Safety Research and Safety Practice: Islands in a Common Sea. In Safety Science Research: Evolution, Challenges and New Directions; Le Coze, J.C., Ed.; CRC Press: Boca Raton, FL, USA, 2020; pp. 223–245. [Google Scholar]
  317. Eisner, H.S. Editorial. J. Occup. Accid. 1976, 1, 1. [Google Scholar] [CrossRef]
  318. Hale, A.R.; Mearns, K.; Wybo, J.L.; Boustras, G. The future of Safety Science. Saf. Sci. 2022, 150, 105705. [Google Scholar] [CrossRef]
  319. ALLEA. The European Code of Conduct for Research Integrity. Revised Edition; All European Academies: Berlin, Germany, 2017; Available online: https://www.allea.org/wp-content/uploads/2017/05/ALLEA-European-Code-of-Conduct-for-Research-Integrity-2017.pdf (accessed on 29 October 2021).
  320. ASA. American Sociological Association Code of Ethics; Approved by the ASA Membership in June 1997; ASA: Washington, DC, USA, 2010; Available online: https://www.asanet.org/sites/dfault/files/savvy/images/asa/docs/pdf/CodeofEthics.pdf (accessed on 29 October 2021).
  321. NHMRC. Publication and Dissemination of Research: A Guide Supporting the Australian Code for the Responsible Conduct of Research; National Health and Medical Research Council, Australian Research Council and Universities Australia, Commonwealth of Australia: Canberra, Australia, 2020. Available online: https://www.nhmrc.gov.au/sites/default/files/documents/attachments/publications/publication_and_dissemniation_of_research_guide.pdf (accessed on 30 November 2020).
  322. ORI. Federal Policy on Research Misconduct. Office of Research Integrity. Executive Office of the (US) President. 2000. Available online: https://ori.hhs.gov/content/chapter-2-research-misconduct-federal-policies (accessed on 29 October 2021).
  323. Singapore Statement. Singapore Statement on Research Integrity. In Proceedings of the 2nd World Conference on Research Integrity, Singapore, 21–24 July 2010, as a Global Guide to the Responsible Conduct of Research. Available online: https://wcrif.org/documents/327-singapore-statement-a4size/file (accessed on 18 February 2022).
  324. Safety MDPI. Instructions for Authors and MDPI Research and Publication Ethics. Available online: https://www.mdpi.com/journal/safety/instructions; https://www.mdpi.com/ethics (accessed on 5 November 2022).
  325. Flick, U. An Introduction to Qualitative Research, 6th ed.; SAGE Publications: London, UK, 2018. [Google Scholar]
  326. Stake, R.E. Qualitative Research; The Guilford Press: New York, NY, USA, 2010. [Google Scholar]
  327. Clarke, L. Personal communication, 19 April and 4–5 May 2023.
Table 1. Similar concepts and emphases in MMD [1] and NA [2].
Turner’s MMD 1978 | Perrow’s NA 1984
Multiple high-risk industry qualitative case documents | Multiple high-risk industry qualitative case documents
Patterns found in cases from inquiries | Patterns found in cases from inquiries
Organisational Sociology and Weberian background | Organisational Sociology and Weberian background
Technology and high-risk location important | Technology and high-risk location important
Man-made disaster focus (13–14, 190) | Man-made catastrophe focus (3, 11, 351)
Organisational failure (66, 75–78, 199–200) | Organisational failure (233, 330–331)
Socio-technical (2–3, 5, 8, 47–48, 89, 170, 185, 187–188) | Socio-technical (3, 7, 9, 10–11, 352)
Systemic (19, 135–136, 141–142, 145, 158–159, 161–162, 185, 188) | Systemic (3, 10, 62–71, 351)
Open systems/external environment (136, 151, 170, 201) | Open systems and external environment (75)
Emergence and propagation (89, 135, 158, 180) | Emergence and propagation (9–10)
Failures of control (7, 70, 191) | Failures of control (81, 83)
System forgiveness (19–20) | Cybernetic self-correcting and error-avoiding systems such as aviation (11, 79–81, 126–127, 146–147, 167–168)
Error magnification/feedback amplification (179–181, 187, 236) | Negative synergy, error-inducing systems, magnification, unfamiliar or unintended feedback loops (82, 88, 98)
Precursor contributory factors combine in complex, unexpected and discrepant ways to defeat safety systems (86, 88, 105, 126) | Interactive complexity: small failures and other unanticipated interactions can cause system accidents (4–5, 7, 10, 101)
Complex large-scale accidents and disasters with multiple chains of causes (14, 23–24, 75–76, 89, 105, 187) | Complex system accidents and catastrophes with multiple causes (7, 70–71, 75, 78, 85–86, 88)
Precipitating or triggering incident or event, last event is not focus (81, 88–90, 102, 107, 122, 150, 155–156, 193, 198) | Trigger event and particular events are not the focus (6–7, 71, 342, 344)
Surprise and unanticipated events (86, 126, 138, 145–146, 151, 159, 169, 184–186) | Unanticipated and unexpected outcomes from interactions (6, 70, 78)
Large-scale accidents, rare catastrophes (149–151, 178) | System accidents, rare catastrophes (343–345)
Latent structure of incubation events (86–87, 89, 94, 193) | Independent factors lying fallow for the fatal spark (111)
Less complex accidents separate from disasters (88–89, 99) | Component failure accidents with ‘DEPOSE’ factors (8, 77, 111, 343) separate from system accidents (70)
Bounded rationality and satisficing (133–138, 161) | Bounded rationality (315–321, 323–324)
Inability to see or comprehend hazard (93–95, 195, 198) | Inability to see or comprehend hazard (9, 75, 351)
Gap between perceived and actual reality (84, 94, 128–129, 138, 161, 194) | Gap between perceived and actual reality (9, 75)
Warnings not heeded or discerned (19, 61, 194–195) | Warnings ignored or didn’t fit mental model (10, 31, 351)
Miscommunication and misinformation (45–47, 61, 64–67, 121–124, 139) | Misinterpretation and indirect information sources (35, 73, 84)
Variable disjunction of information (50–52, 61, 101, 217, 225) and social construction of reality (165–166, 191) | Cognitive models of ambiguous situations and the social construction of reality (9, 75, 176)
Don’t blame individual operator error (160, 162–163, 198) | Don’t blame individual operator error (4, 9, 331, 351)
Importance of power/elites (4, 72, 124–125, 132, 152, 191) | Importance of power/elites (12, 155, 306, 311, 339, 352)
Growing concentration and power of large organisations and energy sources (1–2, 4–6, 160, 199, 201) | Growing concentration of energy sources and power of large organisations (102, 306, 311)
Intentional misinformation by managers (118, 125, 147) | Deception and lying, false logs by ship captains (10, 187)
Regulatory issues/inadequacies (70–71, 79, 87, 99, 103–104) | Regulatory issues/inadequacies (3, 176, 343)
Gap in defences and failure of precautions (84, 87, 91) | Defence in depth limits and failures (3–4, 43, 60)
Intuition, tacit knowledge, craft (11, 25, 51) | Intuition and use of heuristics (316–317, 319)
Poor and unrealistic management (63, 66–67, 77, 79) | Poor management (111–112, 177, 343)
Environmental disasters (2, 5–6, 14, 128, 131, 149, 190) | Eco-system disasters (233, 252–253, 255, 295–296)
Societal culture and context (84, 192) | Societal values and culture (12, 315–316, 321–328)
Importance of learning from near misses (96, 182) | Aviation occurrence reporting model important (167–169)
Table 2. Examples of different concepts and emphases in MMD [1] and NA [2].
Turner’s MMD 1978 | Perrow’s NA 1984
Organisational and social unit focus (160, 186, 199) | Macro industry and technology focus (3, 12–14, 339)
Multidisciplinary approach and theories are necessary to study large-scale accidents and disasters (31–32, 38, 127) | Own theory and radical critical paradigm mostly applied to high-risk accident reports and industry data
Somewhat optimistic about learning and prevention (32, 75–80, 194–200) | Somewhat pessimistic about learning and prevention (32, 60, 257, 343, 351)
Incubation network (86–89, 99–107, 125, 131, 193, 200) | Inevitable normal or system accidents—irretrievable for at least some time (3–5, 256, 328, 330)
Disaster timing usually after a long incubation often of years (87, 105, 180, 193) | Disaster timing rapid: unanticipated system interaction combined with external factors (4–5, 75, 233, 253–255)
Disasters require focused unintended organising attention on multiple fronts to occur (180) | Banality and triviality lie behind most catastrophes (9)
Sequence model with 6 stages (84–92) | Close or tight coupling with little slack (4–6, 10–11, 89–96, 330–332)
Failures of intention (4, 128–131, 160, 171, 181) and of foresight (50, 77, 92, 99, 107, 161, 170, 179) | Garbage can theory helps explain randomness of system accidents (324)
Schematic accident representation diagram (97–98) | 2 × 2 matrix or grid of complexity and coupling (97, 327)
Hierarchy of levels of information (145) | Catastrophic potential of risky technologies especially where complex and tightly coupled systems (342–346)
Sub-cultures and shared social context determine perception (4, 58, 78, 101, 120–121, 166–171) | Capitalist production imperatives and distorted market prices are important (310–313)
Bounded decision zones and perceptual horizons in an organisational worldview (58–59, 120–121, 165, 168–171, 200) | Common mode failures (72–73, 75, 85)
Ill-structured problems; confusion across organisations and divisions (19–22, 50, 52–53, 60, 72, 75, 77, 96, 107) | Unnecessary proximity and tight spacing can lead to unexpected interactions (82, 85, 88)
Well-structured problem post-disaster (52, 74–76, 103, 106, 179–188) | Centralisation and decentralisation (10, 331–335)
Intended actor rationality (129, 160, 171–178, 200) | Social rationality by non-experts in society (315–316, 321–324)
Negentropy, anti-tasks and non-random structured nature of unintended consequences (127, 179, 181, 187, 190) | Understanding of transformational designs and processes is limited (11, 84–86, 330)
Discrepant information and events (86–90, 122, 146) | Externalities imposed on society (339–341)
Importance of organisational culture (77, 103) | Incomprehensibility of system accidents (23, 277)
Catastrophe and chaos theory (153–156, 185–187, 194) | Complex systems seek productive efficiency (88)
Misdirected energy and misinformation (4, 182–184, 187, 189–191, 193) | Risk assessment has a narrow focus; typically assumes over-regulation (306–314)
Decoy problem takes the focus off more serious threats (59–61, 64, 78, 80, 86–87, 100, 102–104, 196) | Risk assessor ‘shamans’ support elites’ use of ‘evil’ technologies (12, 14, 307); some scientists, engineers and cognitive psychologists complicit (14, 307, 316–320)
Complaints from outsiders discounted; reluctance to fear the worst (73–74, 76, 102–104) | Social class distribution of risk, inequality linked to disproportionate risk (310)
Social and differentiated distribution of knowledge (3, 85, 106, 152) | Error-inducing systems such as marine shipping (11, 173–176, 181–190, 230)
Channels of observation not just communication (141, 159); what organisations pay attention to (58, 163–171) | Nuclear accidents like TMI, unreliability and inevitability (15–61, 344, 348)
Nuclear industry’s enormous hazards—but risk analysis, information and response (1–2, 18, 29–30, 35, 183) | Normative advocacy; technologies like nuclear power and weapons should not be used (x, 14, 347–352)
Table 3. Summary of 12 important accident theorists’ knowledge of, and acknowledgment of, Turner and Perrow.
Theorist | Knowledge of Turner (MMD 1978 or after 1997 2nd edn) | Acknowledgment of Turner’s Ideas | Knowledge of Perrow’s NA (1984 or 1999) | Acknowledgment of Perrow’s Ideas
Hale | MMD 1978 | mixed | NA | good
Weick | 2nd edn 1997 | good | NA | good
Rasmussen | Unclear | poor | Unclear | poor
Reason | MMD 1978 | poor/mixed | NA | good
Vaughan | MMD 1978 | good/mixed | NA | good
Leveson | MMD 1978 | mixed/poor | NA | good
Hopkins | 2nd edn 1997 | good | NA | good
Hollnagel | 2nd edn 1997 | mixed | NA | good
Dekker | 2nd edn 1997 | mixed/good | NA | good
Shrivastava | MMD 1978 | mixed | NA | poor
Sagan | Unclear | poor | NA | good
Snook | pre-MMD 1978 | poor | NA | good