Barry Turner: The Under-Acknowledged Safety Pioneer
Abstract
1. Introduction
1.1. Rationale for This Retrospective
so well done that it correctly anticipated developments and conclusions that other more prominent accident researchers would subsequently lay claim to (system complexity and uncertainty, how multiple failures undermine layered safety defences, the cultural blinkers that organisations adopt), even when those who followed claimed only superficial or no knowledge of Turner’s original writings.[7] (p. 239, emphasis in original)
I can’t see any sign that he [Perrow] was aware of Barry Turner’s work … The ability of people at different sides of the world to encounter each other’s work and understand where progress has been made relies on you knowing who else is working on the same things. … a lot of the work in safety, he [Perrow] never encountered, which I don’t think is his fault. Yeah, there are a lot of people today who’ve never heard of Barry Turner. … [Perrow] independently invented a lot of foundational thinking in safety, that he wasn’t the first to think of it, but he also did it without standing on the shoulders of other people who had those same ideas.[8]
1.2. Key Aims
1.3. Outline of the Article
2. Materials, Methods and Approach
3. Central Themes in MMD and Their Background
Public inquiries into … three major disasters are examined and classified to study the conditions under which large-scale intelligence failures develop. Common causal factors are rigidities in institutional beliefs, distracting decoy phenomena, neglect of outside complaints, multiple information-handling difficulties, exacerbation of the hazards by strangers, failure to comply with regulations, and a tendency to minimize emergent danger. Such features form part of the incubation stage in a sequence of disaster development, accumulating unnoticed until a precipitating event leads to the onset of the disaster and a degree of cultural collapse. Recommendations following public inquiries are seen as part of a process of cultural readjustment after a disaster, allowing the ill-structured problem which led to the failure to be absorbed into the culture in a well-structured form. The sequence model of intelligence failure presented and the discussion of cases are intended to offer a paradigm for discussion of less tragic, but equally important organizational and interorganizational failures of foresight.[27] (p. 378)
- (1) a notionally “normal” starting point with culturally accepted beliefs about the world and its hazards, and associated precautionary norms that are followed through regulation and less formal practices;
- (2) an “incubation period” when an unnoticed set of events, or chains of “discrepant” events at odds with accepted beliefs and norms about hazards, develops and accumulates;
- (3) a “precipitating” event or incident that links with the chain of discrepant events and produces a transformation, revealing the “latent structure” of the incubation period and a “gap in defences that were previously considered secure”;
- (4) the “onset” of a disaster or major accident, following immediately from the precipitating event, with direct and “unanticipated” consequences of the failure, and an onset of varying rate and intensity over varying scope and area;
- (5) rescue and salvage—rapid and ad hoc changes in understanding and a first stage adjustment to the disaster; and
- (6) full cultural readjustment: inquiries and assessments are carried out, and beliefs and precautionary norms are adjusted to fit the newly gained understanding of the hazards.
3.1. Scope, Terms, Definitions and Data
It is rare that an individual, by virtue of a single error, can create a disastrous outcome in an area formerly believed to be relatively secure. To achieve such a transformation, he or she needs the unwitting assistance offered by access to the resources and resource flows … of large organizations, and time. The three accidents discussed here had been incubating for a number of years.[27] (p. 395)
If the only cause of an incident is an inappropriate response to a recognized warning, the incident is more likely to be one which we characterize as an accident: by contrast, in a pre-disaster situation, given the typically large accumulation of predisposing factors, the nature of the last error is relatively unimportant. … The incubation network only refers to those chains of events which are discrepant, but are not perceived or are misperceived. It is meaningful to compare accidents and disasters only in terms of incubation networks ….[1] (p. 88, emphasis in original)
3.2. Incubation, Failures of Foresight, Prevention and Power
- Events unnoticed or misunderstood because of erroneous assumptions due to rigidities of belief and perception, “decoy” phenomena (focus on a problem that obscures a bigger one), and disregard of complaints from “outsiders”;
- Events unnoticed or misunderstood because of difficulties in handling information in complex situations, information difficulties and “noise”, and the involvement of “strangers” (visitors and trespassers) on sites;
- Effective violations of precautions passing unnoticed because cultural norms lag behind existing precautions, including failures to comply with regulations that may be unknown or out-of-date;
- Events unnoticed or misunderstood because of a reluctance to fear the worst outcome, and minimising emergent danger and not taking action when things begin to go sour [1] (pp. 99–103).
it becomes necessary, also, to start to take account of those other perennial concerns of the sociologist, the charting of the distribution of power, of the control of resources and of social reputation. … Powerful groups and organizations are able to specify the kinds of hazard that they recognize, to set out and implement the kinds of precautions which they think are necessary, and to exert their authority in intervening in areas which they regard as hazardous. … There may be confrontations between those who present an official definition of hazards and others who think that the situation is different … There is thus an overlay of differential power distributions which will affect knowledge, perceptions and expectations of accidents.[1] (pp. 124–125, 152)
3.3. Turner’s Systems Approach
All that is required is the introduction of unintended or unforeseen variety near to the organizing centre to produce a large-scale, but orderly error which makes use of the amplifying power of any ordered organizational hierarchy. If we consider organizational hierarchies as systems set up to carry out tasks, these ordered but undesired consequences could be regarded as ‘anti-tasks’ rather than as completely random errors. Large-scale disasters need time, resources and organization if they are to occur—if the ‘anti-task’ is to be successfully executed … [such disasters are] most unlikely to be met solely as a result of a concatenation of random events.[1] (p. 180)
3.4. Turner’s Multidisciplinarity and Optimism
it is important that we should not assess the actions of decision-makers too harshly in the light of the knowledge that hindsight gives us. … it is necessary to look at the manner in which rationality becomes established and embedded within organizational procedures and habits, and to gain an understanding of the hierarchy of decision-making within which the individual administrator finds that he has to operate.[1] (pp. 162–163 note 4, 234)
we can continue to try to improve … we are in a contingent universe, in which ultimately there are limits on our ability to reduce uncertainty, to master all of the open-ended and perverse qualities of our environment, and upon our ability to prevent disaster … We may come to realize that, even when our strategies are successful, they are still dependent upon the munificence of the environment and upon the mutability of fortune.[1] (p. 201)
4. Central Themes in NA and Their Background
to chart the world of organized systems … [this] … constitutes a theory of systems, of their potential for failure and recovery from failure. As such, it is, I believe, unique in the literature on accidents and the literature on organizations. Perhaps the most original aspect of the analysis is that it focuses on the properties of systems themselves, rather than on the errors that owners, designers, and operators make in running them.[2] (pp. 62–63)
nuclear power plants, chemical plants, aircraft and air traffic control, ships, dams, nuclear weapons, space missions, and genetic engineering. Most of these risky enterprises have catastrophic potential … it is the possibility of managing high-risk technologies better … that motivates this inquiry. There are many improvements we can make that I will not dwell on … such as better operator training, safer designs, more quality control, and more effective regulation. … Rather, I will dwell upon characteristics of high-risk technologies that suggest that no matter how effective conventional safety devices are, there is a form of accident that is inevitable.[2] (p. 3)
The inquiry … grew into a major analysis of a number of systems. … it is relentlessly ‘structural’. … Investigating a number of these accidents, I found a common pattern. While most accidents in risky systems stemmed from a major failure that could have been prevented, a substantial minority resulted from the unexpected interaction of two or more small failures. … The unexpected and generally incomprehensible interaction of small failures was found in all the complex systems I studied in any detail, including those with catastrophic potential … The sources of failure were diverse … The resulting accidents were ‘system accidents’, arising from the ability of the system to permit the unexpected interactions of failures. … Multiple, unexpectedly interacting failures in risky systems still might not be a serious concern if operators could intervene before significant damage occurs. But there is another system characteristic to consider … if coupling is tight, none of these safeguards is available. … the environment [once incorporated in analysis provides] further insight into the problem of safe systems … If we take an industry, rather than particular organizations, as the unit of analysis, we can see the impact of the industry and its ties to society upon the organization and its problems. … [e.g.,] comparing the error-inducing marine transport system with the error-avoiding airline system … The focus was on system interactions and control, whether at the level of the operator or elites deciding what kind of risks the rest of us should run.[81] (pp. 146–149, 153, 155)
4.1. System Focus and Definitions
4.2. Complexity, Coupling, System and Component Accidents
4.3. Perrow’s Sociological Background and Method
performed a ‘hammer analysis’ … I had a primitive theory about complexity and coupling and when they handed me the transcripts I pounded them with it and broke it open … In thirty days I had produced a 45-page paper that applied the theory to Three Mile Island, to tanker collisions, aircraft failures, chemical plant explosions, and suggested why most factories would not have ‘normal accidents’ or, in a more technical term, what I called ‘system accidents’. The students at Stony Brook sent me a steady stream of material and critiqued my rough drafts and ideas.
5. Similarities and Differences between Turner’s MMD and Perrow’s NA
5.1. Similarities between MMD and NA
5.2. Differences between MMD and NA
6. Turner and Perrow after MMD and NA
6.1. Turner’s Work after 1978
we do believe that the basic theoretical model set out here remains as relevant to concerns about understanding the nature and origins of acute failures of major socio-technical systems as it did … Account is … taken in Chapter 11 of the reports of work on ‘high-reliability organizations’, of the possibility of applying notions of organizational design to the encouragement of ‘safe’ organizations and ‘safety cultures’, and the more wide-ranging issues raised by a concern with institutional design as a way forward in hazard management. … Chapter 11, together with this Preface … was written by Nick Pidgeon, working initially from various notes Barry had compiled prior to his death.[10] (p. xviii)
A simple model to understand this nonrandom error propagation requires a description of the initial system structure in social and technical terms, specifying features such as the task and sentient boundaries of subsystems. … it is possible to trace the manner in which errors contributing to major system failures initiate structured consequences … When errors or distortions of intent appear … they interact with the negentropic or ordering properties of the system in which they occur to produce a novel chain of structured consequences. They create a small initial change which, depending on the location, timing, and structure, modifies the future arrangement of events in a manner that has its own logic and order. … In most accidents, it is axiomatic that there is never merely one starting point, but that there are at least six or seven … all of which must be taken account of in understanding the resulting multiple interaction patterns. … An unintended event will trace out those aspects of the preexisting system which it does not destroy. The error and the system intervention phases start in what we have previously referred to as the ‘incubation period’ of a large-scale accident, but they continue into the ‘onset’ stage and beyond … When unforeseen events occur, their consequences are strongly constrained by preexisting technical, task, and sentient structures. When the intervention is not strong enough to disrupt the structure completely, its consequences trace out a portion of the structure. We are thus encouraged to look for regularities in the apparently unstructured events surrounding large-scale accidents or large-scale system failures, and to reduce the extent to which we automatically assume ideas of ‘randomness’ will offer us an understanding of such phenomena.[122] (pp. 437–444)
It is also possible to think of the culture of small groups of workers, of departments, of divisions and organisations as being nested successively within one another, and then to consider the organisational culture as being located within a national or international framework.[65] (p. 249)
outside the precincts of classical rational-technical organizational theory and systems analysis … from positivism … [to] new interest in methods of qualitative inquiry and analysis … [to] symbols and culture in general … shared realities … a view based upon negotiation will see a complex of subcultures and counter-cultures … separating the ‘corporate culture’ from ‘culture in work’ which workers (and managers) weave for themselves while making sense of their experiences in the organization.[108] (pp. 83, 86–87, 90, 94)
the classificatory world view emphasizes a changing and kaleidoscopic perspective in which symbols exist within a frame, and in this perspective symbols of reversal are seen as expected and nourishing. By contrast, the instrumental world view, a more technological and purposive one, emphasizes the sequential harnessing of means to an end. The instrumental view threatens and is threatened by symbols of reversal. … Some safety specialists seem to be confident that accidents can be instrumentally eliminated from organizations, especially now that the model of accident generation has been completed by the identification by some of them of the role of ‘organization’ as the final ‘variable’ contributing to accidents. This view, however, can only be sustained by pushing the instrumental view to the centre and suppressing or eliminating the negative and the inversion. A control system is effectively a system of marks, but by reframing, by allowing the marks to migrate, other possibilities come into view. Management, including hazard management, must take an instrumental view of the world almost by definition. But unless some of the potential for reversal and transformation is recognised, managerial activities will repeatedly be threatened by apparently inexplicable and uncontrollable transformations, upsets and contingencies.[126] (p. 37)
6.2. Perrow’s Work after 1984 and Assessment of Normal Accidents
the kind of sociologist who emphasizes the overriding importance of power and interests in society, rather than the kind that emphasizes nurture, culture, or common humanity … [one who promoted better] structures … organizational forms, laws … Context shapes behavior, but the temptation to self-interested behavior is always there and must be fought.[152] (p. 92)
6.3. Similarities and Differences after Publication of MMD and NA
Only a few disasters, I believe, will be exclusively due to design or human factors failures that cannot be attributed to higher level explanations. Any accidents, as opposed to disasters, can be traced to operator error … Disasters require a configuration that is more likely to be due to organizational and sociocultural factors. A few of these will be what I call ‘system accidents’, inherent in systems that are complexly interactive and also tightly coupled … The vast majority of disasters will be due to organizational, and ultimately, to sociocultural factors.[92] (p. 284)
7. Turner and Perrow: Acknowledgment and Citation
7.1. Perrow’s Knowledge of MMD
I just don’t feel that close to Barry’s work, so about all I could say would be that it was the earliest attempt to think through the matter of disasters in organizational terms, and thus very useful and insightful. To say so little would not serve much purpose. We approached disasters very differently, and indeed organizational analysis in general. Barry’s work has been strongest in the culture area, while I have been much more concerned with structure. He also had a catholic interest in disasters, whereas my concern has been quite narrow, focusing upon a small set of (very risky) systems, and even then with structural causes and little concern with the recovery phase, or even prevention. So there is not much in common.[193]
7.2. Perrow’s Citation of Turner
One might note one infrequent, but perverse, barrier to learning at this point, originally identified, I believe, by Turner (1978: 224) in Man-made Disasters, where accident investigations convert ill structured problems into well structured ones (see also Vaughan, 1994). Accident investigations are ‘left censored’ in that they examine only systems that failed, not the ones with the same characteristics that have not failed.[90] (p. 214)
Nevertheless, there is a reasonably interesting ‘schematic report analysis diagram’ that analyses the Cambrian Colliery accident of 1965, based on the work of B.A. Turner (as is a great deal of the book), a pioneer in the accident field. It outlines the numerous failures and shows how the investigating committee ignored some of the more important ones. The diagram is useful for investigating committees, but when enlarged as a generic blueprint for an ‘organizational learning system’, as it is in the final chapter, it mimics the failure of 1960s system theory: everything is (equally) important, connected and must be taken into account.[138] (p. 607)
7.3. Turner’s Citation of Perrow
account shares with man-made disasters the view that major accidents in socio-technical systems arise from the interaction of a chain of unanticipated errors and misunderstood events in complex and ill-structured situations. However, the basic model differs from man-made disasters, being focused primarily upon the prior structural properties of complex technical systems, rather than upon the ways in which disasters develop unseen over time.[10] (p. 178)
In practice the concepts of ‘complexity’ and ‘coupling’ have turned out to be difficult to use analytically and it seems likely that they are not fully independent from each other, in that both express aspects of the fundamental complexity underlying dynamic and ill-structured systems. … Perrow’s original account appears overly deterministic, having been derived in the main from an analysis of the structural properties of technology and technological systems. … he does not clearly specify … whether similar effects are produced by both organizational and technical complexity and interdependence. However, his analysis does … [draw] attention to the safety implications of the growing complexity and interdependence of today’s most advanced industrial systems.[10] (pp. 179, 230)
8. Citation and Acknowledgment of Turner by Other Important Accident Causation and Theory Scholars
8.1. Citation of Man-Made Disasters and Normal Accidents
8.2. Andrew Hale
the fascinating thing about Barry was that he was coming at things from a very different disciplinary background, sociology, while interacting with people coming mostly from an engineering science or psychology background. So his point of view was very new, but because he wrote so eloquently and his work was so readable, he had a big influence. You could say that he was there at the right moment to give that push to include sociological factors in the causation frameworks.[223] (p. 66)
8.3. Karl Weick
continuum of loose-tight coupling reflects the way in which [Turner’s] hierarchy, power, distributed problem solving, suppressed conflict, and socialization pressures either enhance diversity through looser coupling or discourage it through tighter coupling. Likewise, the continuum of linear transformation—interactive complexity reflects the ways that Turner’s (1978) modes of operation, simultaneous consideration of problems at multiple levels of generality, conflicting actions, and discontinuities of practice make for more or less knowable chains of consequence. To worry about normal accidents is to worry about what it means to organize. As Turner put it, ‘It could be said that organizations achieve a minimal level of coordination by persuading their decision-makers to agree that they will all neglect the same kinds of consideration when they make decisions’ (p. 166; Turner and Pidgeon, 1997, p. 138). The main differences between Turner and Perrow and the rest of us lie in what each would say is the focus of that coordinated neglect.[228] (p. 29)
8.4. Jens Rasmussen
8.5. James Reason
It is, I believe, fitting to begin this survey of alternative theoretical views with Barry Turner, a sociologist at the University of Exeter, who—if he didn’t actually coin the term ‘organizational accident’—laid the groundwork for understanding organizational breakdown in his pioneering book Man-Made Disasters in 1978. Later, Turner’s work was updated in a second edition … His most important concept was ‘incubation’. In other words, organizational disasters develop over long periods of time—in short they incubate within the system. Warning signs are ignored or misunderstood or even integrated into the pattern of organizational ‘normalcy’. As a result, safeguards and defences either do not get built or are allowed to atrophy. … Disasters, as noted elsewhere, are immensely diverse in their surface details. But Turner and Pidgeon have identified a set of developmental stages that appear universal. … These notions do not necessarily conflict with the idea of latent conditions: rather, their sociological emphasis upon cultural adjustments enriches them.[234] (pp. 99–100)
A multiplicity of minor causes, misperceptions, misunderstandings and miscommunications accumulate unnoticed during this ‘incubation period’. These preconditions which one researcher has subsequently called ‘pathogens’ (Reason, 1990) stay in place in the organization or managerial practice, ready to contribute to a major failure unless something happens to neutralize them by bringing them out into the open. … They constitute an accident waiting to happen…. Brought together by some trigger event. … the underlying pattern of the incubation period is common, and recurs in many disasters and in many industries.[43] (pp. 216–218)
8.6. Diane Vaughan
Turner, investigating ‘man-made disasters’ (1976; 1978), pioneered in discovering organizational patterns that systematically contributed to the disasters he studied: norms and culturally accepted beliefs about hazards, poor communication, inadequate information handling in complex situations, and failure to comply with existing regulations instituted to assure safety (1976:391). He concluded that these factors created an absence of some kind of knowledge at some point. Crucial to understanding such accidents, then, is discovering how knowledge and information relating to events provoking a disaster were distributed in an organization before the incident (1978:3). Analysis of the Challenger accident not only confirms Turner’s findings about the relevance of knowledge and information in organizations, but also identifies structural factors that systematically affected the distribution of information and its interpretation at NASA: the competitive environment, the organization’s structure, processes, and transactions, and the regulatory environment. These factors combined to affect the decision to launch.[269] (p. 248)
Published in 1978 and accompanied by two articles in well-regarded journals in the United States, the book nonetheless was seldom cited. The book had a cult following that advertised it by word of mouth. But the failure to become integral to mainstream sociology seems odd, given the quality of his work, its grounding in general organizational principles … Moreover, his approach was unprecedented. … Turner examined … preconditions, locating them in organizational systems. He was the first to demonstrate how technical, social, institutional and administrative arrangements, in combination, can systematically produce disasters. … Looking back, we must marvel not only at Turner’s prescience, but at his accomplishment. … classic ideas … Turner’s book contains two: the title … and his core idea of ‘failures of foresight’, which directs attention to a singularly important causal element that he found. … Man-made disasters not only had preconditions, but those preconditions had characteristics in common: long incubation periods studded with early warning signs that were ignored or misinterpreted. For Barry Turner, man-made disasters were distinguished not only by the institutional, organizational, and administrative structures associated with them, but by their process. In my view, this was his true intellectual breakthrough: disasters were not sudden cataclysmic events; they had long gestation periods. We must also marvel at the methodology and analysis on which his theoretical insights were based. … Using a grounded theory approach, Turner examined these archival data, identifying similarities and differences between these cases. … His effort produced a volume with richness that goes beyond his two key concepts. For example, his understanding of the relationship between information, error, and surprise in organizations was also farsighted.[211] (pp. xii–xviii)
8.7. Nancy Leveson
8.8. Andrew Hopkins
8.9. Erik Hollnagel
The distinction between work-as-imagined and work-as-done is often used in the ergonomics literature … Work-as-imagined represents what designers, managers, regulators and authorities believe happens or should happen, whereas work-as-done represents what actually happens. Differences … [are] classified as non-compliances, violations, errors or as performance adjustments and improvisations, depending on how one looks at it. An early discussion of this in the context of safety is found in Turner, B. (1978) Man-Made Disasters.[286] (p. 38, emphasis in original)
8.10. Sidney Dekker
- Defences-in-depth thinking (e.g., latent errors or resident pathogens that are already present and help incubate disaster (Reason, 1990));
- High reliability theory (e.g., weak signals that do not get communicated or picked up (Weick & Sutcliffe, 2001));
- Safety culture research (e.g., organizational cultural preconditions for disaster);
- Concepts such as the normalization of deviance (Vaughan, 1996), procedural drift (Snook, 2000), and drift into failure (Dekker, 2011), which all refer to disaster incubation in one way or another;
- Control-theoretic notions about erosion and loss of control (Leveson, 2012): the kind that Turner talked about in sociological, managerial, and administrative terms.[297] (pp. 220–221)
8.11. Acknowledgment and Citation by Seven Other Accident Causality and Explanation Scholars
Before Perrow’s [1984] NA, in 1978, Turner published Man-Made Disasters, the failure of foresight, a book looking into disasters from a sociotechnical perspective (Turner, 1978). The contribution of Turner at the time was to go beyond an engineering view of disasters and to understand, study and conceptualise these events as … engineering, organisational and cultural phenomena. Accidents are the products of fallible institutionalised views created by a wide range of actors of organisations.[4] (p. 126)
9. Discussion
9.1. Turner and Perrow
9.2. Citation and Acknowledgment
9.3. Limitations
10. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Turner, B.A. Man-Made Disasters; Wykeham: London, UK, 1978.
- Perrow, C.B. Normal Accidents: Living with High-Risk Technologies; Basic Books: New York, NY, USA, 1984.
- Gould, K.P.; Macrae, C. (Eds.) Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; CRC Press: Boca Raton, FL, USA, 2021.
- Le Coze, J.C. Post Normal Accident: Revisiting Perrow’s Classic; CRC Press: Boca Raton, FL, USA, 2021.
- Hopkins, A. Turner and the Sociology of Disasters. In Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; Pettersen Gould, K., Macrae, C., Eds.; CRC Press: Boca Raton, FL, USA, 2021; pp. 19–32.
- Pidgeon, N.F. Systems Thinking, Culture of Reliability and Safety. Civ. Eng. Environ. Syst. 2010, 27, 211–217.
- Pidgeon, N.F. Afterword: Connoisseurship, the Sociological Imagination and Turner’s Qualitative Method. In Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; Gould, K.P., Macrae, C., Eds.; CRC Press: Boca Raton, FL, USA, 2021; pp. 237–248.
- Rae, A.J. Can Major Accidents be Prevented? Transcript of Episode 100 of The Safety of Work Podcast by Provan, D.; Rae, A.J. Broadcast on 9 October 2022. Available online: https://safetyofwork.com/episodes/ep-100-can-major-accidents-be-prevented/transcript (accessed on 28 November 2022).
- Perrow, C.B. Normal Accidents: Living with High-Risk Technologies (Republished with a New Afterword and a Postscript on the Y2K Problem and an Additional Bibliography); Princeton University Press: Princeton, NJ, USA, 1999.
- Turner, B.A.; Pidgeon, N.F. Man-Made Disasters, 2nd ed.; Butterworth-Heinemann: Oxford, UK, 1997.
- Le Coze, J.C. The ‘new view’ of human error. Origins, ambiguities, successes and critiques. Saf. Sci. 2022, 154, 105853.
- Turner, B.A. The Failure of Foresight: An Examination of Some of the Conditions Leading to Failures of Foresight, and of Some of the Institutionalised Processes for Accommodating Such Failures. Ph.D. Thesis, The University of Exeter, Exeter, UK, 1976. Available online: https://ethos.bl.uk/ethos/Logon.do (accessed on 24 January 2021).
- Jeffcutt, P. Obituary—Professor Barry Turner (1937–1995). Stud. Cult. Organ. Soc. 1995, 1, i–iii.
- Pidgeon, N.F.; Blockley, D.I.; Turner, B.A. Design practice and snow loading—Lessons from a roof collapse. Struct. Eng. 1986, 64A, 67–71.
- Turner, B.A. Sociological Aspects of Organizational Symbolism. Organ. Stud. 1986, 7, 101–115.
- Turner, B.A. A Personal Trajectory through Organization Studies. In Research in the Sociology of Organizations; Bacharach, S.B., Gagliardi, P., Mundell, B., Eds.; JAI Press: Greenwich, CT, USA, 1995; Volume 13, pp. 275–301.
- Turner, B.A. Exploring the Industrial Subculture; The Macmillan Press: London, UK, 1971.
- Jeffcutt, P. Editorial: From the Industrial to the Post-Industrial Subculture. Organ. Stud. 1999, 20, vii–xv.
- Reeves, T.K.; Turner, B.A.; Woodward, J. Technology and Organizational Behaviour. In Industrial Organization: Behaviour and Control; Woodward, J., Ed.; Oxford University Press: Oxford, UK, 1970; pp. 3–18.
- Reeves, T.K.; Turner, B.A. A Theory of Organization and Behavior in Batch Production Factories. Adm. Sci. Q. 1972, 17, 81–98.
- Turner, B.A. Control Systems: Development and Interaction. In Industrial Organization: Behaviour and Control; Woodward, J., Ed.; Oxford University Press: Oxford, UK, 1970; pp. 59–84.
- Turner, B.A. The Organization of Production—Scheduling in Complex Batch-production Situations: A comparative view of organizations as systems for getting work done. In Approaches to the Study of Organizational Behaviour: Operational Research and the Behavioural Sciences; Heald, G., Ed.; Tavistock: London, UK, 1970; pp. 87–99.
- Turner, B.A. Industrialism; Longman: London, UK, 1975.
- Berger, P.L.; Luckmann, T. The Social Construction of Reality: A Treatise in the Sociology of Knowledge; Doubleday: New York, NY, USA, 1966.
- Burrell, G.; Morgan, G. Sociological Paradigms and Organisational Analysis; Ashgate: Aldershot, UK, 1979.
- Turner, B.A. An Examination of Some of the Organisational Preconditions Associated with Some Major Disasters. Presentation to an Open University Seminar on Systems Failures; City University: London, UK, 1974; reprinted with updates in Peters, G.; Turner, B.A. Unit 4: Catastrophe and its Preconditions. In Systems Performance: Human Factors and Systems Failures (the 9-unit third level Open University course TD342); The Open University Press: Milton Keynes, UK, 1976; pp. 4–45.
- Turner, B.A. The Organizational and Interorganizational Development of Disasters. Adm. Sci. Q. 1976, 21, 378–397.
- Turner, B.A. The Development of Disasters—A sequence model for the analysis of the origins of disasters. Sociol. Rev. 1976, 24, 753–774.
- Turner, B.A. The origins of disaster. In Safety at Work: Recent Research into the Causes and Prevention of Industrial Accidents; Philips, J., Ed.; Centre for Socio-Legal Studies, Conference Papers No. 1; Wolfson College: Oxford, UK; Social Science Research Council: Oxford, UK, 1977; pp. 1–18.
- Turner, B.A. Perceptions of Bureaucracy: A Variable in Administrative Theory. Soc. Econ. Adm. 1977, 11, 137–149.
- Turner, B.A. Research note: A comment on the nature of information in channels of observation. Cybernetica 1977, XX, 39–42.
- Merton, R.K. The Unanticipated Consequences of Purposive Social Action. Am. Sociol. Rev. 1936, 1, 894–904. Available online: https://www.jstor.org/stable/2084615 (accessed on 15 December 2020).
- Simon, H.A. Administrative Behavior: A Study of Decision-Making Processes in Administrative Organization, 2nd ed.; Macmillan: London, UK, 1957.
- Simon, H.A. Models of Man, Social and Rational: Mathematical Essays on Rational Human Behavior in Social Settings; Wiley: New York, NY, USA, 1957.
- Western, K.A. The epidemiology of natural and man-made disasters: The present ‘state of the art’. In Diploma in Tropical Public Health; The Ross Institute, London School of Hygiene and Tropical Medicine, University of London: London, UK, 1972.
- Glaser, B.G.; Strauss, A.L. The Discovery of Grounded Theory: Strategies for Qualitative Research; Aldine: Chicago, IL, USA, 1967.
- Le Coze, J.C. Broad (multilevel) safety research and strategy. A sociological study. Saf. Sci. 2021, 136, 105132.
- Flin, R. Safety Condition Monitoring: Lessons from Man-Made Disasters. J. Contingencies Crisis Manag. 1998, 6, 88–92.
- Goffman, E. Frame Analysis: An Essay on the Organization of Experience; Peregrine Books, Penguin: London, UK, 1975.
- Pidgeon, N.F.; Turner, B.A. Human Error and Socio-Technical System Failure. In Modeling Human Error in Structural Design and Construction, Proceedings of a Workshop Sponsored by the National Science Foundation, Ann Arbor, MI, USA, 4–6 June 1986; Nowak, A.S., Ed.; Construction Division of the American Society of Civil Engineers: Reston, VA, USA, 1986; pp. 193–203.
- Turner, B.A. Failed Artifacts. In Symbols and Artifacts: Views of the Corporate Landscape; Gagliardi, P., Ed.; De Gruyter Studies in Organization 24; Walter de Gruyter: Berlin, Germany, 1990.
- Turner, B.A. Stepping into the same river twice: Learning to handle unique management problems. Text of the Inaugural Professorial Lecture Delivered 8 December 1992 in the Middlesex University Business School; Middlesex University Inaugural Lectures 2, pp. 1–19; Middlesex University: Middlesex, UK, 1992.
- Turner, B.A. Causes of Disaster: Sloppy Management. Br. J. Manag. 1994, 5, 215–219.
- Turner, B.A. The Making Sense of Unseemly Behavior in Organizations. Int. Stud. Manag. Organ. 1983, XIII, 164–181. Available online: https://www.jstor.org/stable/40396922 (accessed on 15 December 2020).
- Turner, B.A. The Use of Grounded Theory for the Qualitative Analysis of Organizational Behaviour. J. Manag. Stud. 1983, 20, 333–348.
- Gordon, J.E. The Epidemiology of Accidents. Am. J. Public Health 1949, 39, 504–515.
- Haddon, W. A note concerning accident theory and research with special reference to motor vehicle accidents. Ann. N. Y. Acad. Sci. 1963, 107, 635–646.
- Haddon, W. The Changing Approaches to the Epidemiology, Prevention, and Amelioration of Trauma: The transition to approaches epidemiologically rather than descriptively based. Am. J. Public Health 1968, 58, 1431–1438.
- Haddon, W. Energy Damage and the Ten Countermeasure Strategies. J. Trauma 1973, 13, 321–331.
- Lindquist, M.G. Analysis of system failure and corrective subsystems. Manag. Datamat. 1975, 4, 21–24.
- Maruyama, M. The Second Cybernetics: Deviation-Amplifying Mutual Causal Processes. Am. Sci. 1963, 51, 164–179.
- Schrödinger, E. What Is Life?; Cambridge University Press: Cambridge, UK, 1944.
- Brillouin, L. Life, Thermodynamics and Cybernetics. Am. Sci. 1949, 37, 554–568. Available online: https://www.jstor.org/stable/29773671 (accessed on 27 October 2020).
- Brillouin, L. Scientific Uncertainty and Information; Academic Press: New York, NY, USA, 1964.
- Shannon, C.E.; Weaver, W. The Mathematical Theory of Communication; University of Illinois Press: Champaign, IL, USA, 1949.
- Pask, G. The Natural History of Networks. In Self-Organizing Systems: Proceedings of an Interdisciplinary Conference, Chicago, IL, USA, 5–6 May 1959; Yovits, M.C., Cameron, S., Eds.; Pergamon Press: Oxford, UK, 1960; pp. 232–263.
- Rivas, J.R.; Rudd, D.F. Man-machine synthesis of disaster-resistant operations. Oper. Res. 1975, 23, 2–21.
- Thom, R. Structural Stability and Morphogenesis: An Outline of a General Theory of Models; W.A. Benjamin: London, UK, 1975.
- Buckley, W. Sociology and Modern Systems Theory; Prentice-Hall: Englewood Cliffs, NJ, USA, 1967.
- Pidgeon, N.F.; Turner, B.A.; Blockley, D.I. Hazard Assessment in Structural Engineering. In Reliability and Risk Analysis in Civil Engineering: Proceedings of the 5th International Conference on Applications of Statistics and Probability in Soil & Structural Engineering, Vancouver, BC, Canada, 25–29 May 1987; Lind, N.C., Ed.; University of British Columbia: Vancouver, BC, Canada, 1987; Volume 1, pp. 358–365.
- Pidgeon, N.F.; Blockley, D.I.; Turner, B.A. Site investigations: Lessons from a late discovery of hazardous waste. Struct. Eng. 1988, 66, 311–315.
- Pidgeon, N.F.; Stone, J.R.; Blockley, D.I.; Turner, B.A. Management of Safety Through Lessons from Case Histories. In Safety and Reliability in the 90s: Will Past Experience or Prediction Meet Our Needs?; Walter, M.H., Cox, R.F., Eds.; Elsevier Applied Science: London, UK, 1990; pp. 201–216.
- Pidgeon, N.F.; Turner, B.A.; Blockley, D.I. The use of Grounded Theory for conceptual analysis in knowledge elicitation. Int. J. Man-Mach. Stud. 1991, 35, 151–173.
- Pidgeon, N.F.; Turner, B.A.; Blockley, D.I.; Toft, B. Corporate Safety Culture: Improving the Management Contribution to System Reliability. In Reliability ’91, Proceedings of the International Conference on Reliability Techniques and Their Application, London, UK, 10–12 June 1991; Matthews, R.H., Ed.; Elsevier: London, UK, 1991; reprinted in eBook; Chapman and Hall/CRC: London, UK, 2017; Chapter 63.
- Pidgeon, N.F.; Turner, B.A.; Toft, B.; Blockley, D.I. Hazard management and safety culture. In Hazard Management and Emergency Planning: Perspectives on Britain; Parker, D.J., Handmer, J.W., Eds.; Routledge: London, UK, 1992; Chapter 17; also published as an eBook in 2013.
- Fischhoff, B. Hindsight: Thinking Backward? ONR Technical Report; Oregon Research Institute Monograph; US Office of Naval Research: Arlington, VA, USA, 1974; Volume 14, No. 1.
- Weick, K.E. The Social Psychology of Organizing; Addison Wesley: Reading, MA, USA, 1969.
- Perrow, C.B. An Almost Random Career. In Management Laureates: A Collection of Autobiographical Essays; Bedeian, A.G., Ed.; JAI Press: Greenwich, CT, USA, 1992; Volume 2, pp. 399–438; reprinted in Routledge eBook; Routledge: London, UK, 2018; Chapter 40.
- Perrow, C.B. Three Mile Island: A normal accident. In The International Yearbook of Organization Studies 1981; Dunkerley, D., Salaman, G., Eds.; Routledge & Kegan Paul: London, UK, 1981; pp. 1–25.
- Perrow, C.B. The President’s Commission and the Normal Accident. In Accident at Three Mile Island: The Human Dimensions; Sills, D.L., Wolf, C.P., Shelanski, V.B., Eds.; Westview Press: Boulder, CO, USA, 1982; pp. 173–184.
- Perrow, C.B. Organizational Prestige: Some Functions and Dysfunctions. Am. J. Sociol. 1961, 66, 335–341.
- Perrow, C.B. The Analysis of Goals in Complex Organizations. Am. Sociol. Rev. 1961, 26, 854–866.
- Perrow, C.B. The Sociological Perspective and Political Pluralism. Soc. Res. 1964, 31, 411–422. Available online: https://www.jstor.org/stable/40969752 (accessed on 27 March 2021).
- Perrow, C.B. Hospitals: Technology, Structure and Goals. In Handbook of Organizations; March, J.G., Ed.; Rand McNally: Chicago, IL, USA, 1965.
- Perrow, C.B. A Framework for the Comparative Analysis of Organizations. Am. Sociol. Rev. 1967, 32, 194–208.
- Perrow, C.B. Book Review—Industrial Organization: Theory and Practice by Joan Woodward, OUP, 1965. Am. Sociol. Rev. 1967, 32, 313–315.
- Perrow, C.B. Organizational Analysis: A Sociological View; Tavistock Publications: London, UK, 1970.
- Zannetos, Z.S. Organizational Analysis: A Sociological View by Charles Perrow. J. Bus. 1971, 44, 338–339. Available online: https://www.jstor.org/stable/2351349 (accessed on 29 March 2021).
- Perrow, C.B. The Radical Attack on Business: A Critical Analysis; Harcourt Brace Jovanovich: Boston, MA, USA, 1972.
- Perrow, C.B. Complex Organizations: A Critical Essay; Scott Foresman & Company: Glenview, IL, USA, 1972.
- Perrow, C.B. Complex Organizations: A Critical Essay, 3rd ed.; McGraw-Hill: New York, NY, USA, 1986.
- Lacy, R. Introduction (Special issue on the occasion of the twentieth anniversary of the publication of Complex Organizations: A Critical Essay by Charles Perrow). Int. Public Manag. J. 2007, 10, 131–135.
- Perrow, C.B. The Short and Glorious History of Organizational Theory. Organ. Dyn. 1973, 2, 3–15.
- Perrow, C.B. Is Business Really Changing? Organ. Dyn. 1974, 3, 31–44.
- Perrow, C.B. The Bureaucratic Paradox: The Efficient Organization Centralizes in Order to Decentralize. Organ. Dyn. 1977, 5, 3–14.
- Perrow, C.B. Zoo story, or life in the organizational sandpit. In Control and Ideology in Organizations; Salaman, G., Thompson, K., Eds.; MIT Press: Cambridge, MA, USA, 1980; pp. 259–277.
- Perrow, C.B. Normal Accident at Three Mile Island. Society 1981, 18, 17–25.
- Perrow, C.B. Not Risk but Power—Book Review of Societal Risk Assessment: How Safe Is Safe Enough?; Schwing, R.C., Albers, W.A., Jr., Eds.; Plenum Press: New York, NY, USA, 1980. Contemp. Sociol. 1982, 11, 298–300.
- Perrow, C.B. The Organizational Context of Human Factors Engineering. Adm. Sci. Q. 1983, 28, 521–541.
- Perrow, C.B. The Limits of Safety: The Enhancement of a Theory of Accidents. J. Contingencies Crisis Manag. 1994, 2, 212–220.
- Perrow, C.B. Accidents in High-Risk Systems. J. Technol. Stud. 1994, 1, 1–20.
- Perrow, C.B. A Personal note on Normal Accidents. Organ. Environ. 2004, 17, 9–14.
- Perrow, C.B. A Response. Int. Public Manag. J. 2007, 10, 191–200.
- Perrow, C.B. The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters, updated paperback ed.; Princeton University Press: Princeton, NJ, USA, 2011.
- Perrow, C.B. Fukushima and the inevitability of accidents. Bull. At. Sci. 2011, 67, 44–52.
- Perrow, C.B. Getting to Catastrophe: Concentration, Complexity and Coupling. The Montreal Review, December 2012. Available online: https://www.themontrealreview.com/2009/Normal-Accidents-Living-with-High-Risk-Technologies.php (accessed on 28 March 2021).
- La Porte, T.R. High Reliability Organizations: Unlikely, Demanding and At Risk. J. Contingencies Crisis Manag. 1996, 4, 60–71.
- Roberts, K.H. Book Review Essay, Managing the Unexpected: Six years of HRO-Literature Reviewed. J. Contingencies Crisis Manag. 2009, 17, 50–54.
- Le Coze, J.C. In the Footsteps of Turner: From Grounded Theory to Conceptual Ethnography in Safety. In Inside Hazardous Technological Systems: Methodological Foundations, Challenges and Future Directions; Gould, K.P., Macrae, C., Eds.; CRC Press: Boca Raton, FL, USA, 2021; pp. 49–68.
- Parsons, T. On Building Social System Theory: A Personal History. Daedalus 1970, 99, 826–881. Available online: http://www.jstor.org/stable/20023975 (accessed on 17 June 2021).
- Parsons, T. The Social System (with a New Preface by Bryan S. Turner); Routledge: London, UK, 1991; first edition 1951.
- Weick, K.E. Educational Organizations as Loosely Coupled Systems. Adm. Sci. Q. 1976, 21, 1–19.
- Turner, B.A. The Social Aetiology of Disasters. Disasters 1979, 3, 53–59.
- Woodward, J. Industrial Organization: Theory and Practice; Oxford University Press: Oxford, UK, 1965.
- Perrow, C.B. From Medieval History to Smashing the Medieval Account of Organizations. In Technology and Organization: Essays in Honour of Joan Woodward; Phillips, N., Griffiths, D., Sewell, G., Eds.; Research in the Sociology of Organizations; Emerald Group Publishing: Bradford, UK, 2010; Volume 29, pp. 25–28.
- Pidgeon, N.F. In Retrospect: Normal Accidents. Nature 2011, 477, 404–405.
- Turner, B.A. Introduction. In Organizational Symbolism; Turner, B.A., Ed.; Walter de Gruyter: Berlin, Germany, 1990; pp. 364–384.
- Turner, B.A. The Rise of Organisational Symbolism. In The Theory and Philosophy of Organizations: Critical Issues and New Perspectives; Hassard, J., Pym, D., Eds.; Routledge: London, UK, 1990; Chapter 5; pp. 83–96.
- Hassard, J. Pop Culture Magicians Seek Honest-Grappler-after-Truth for Marginal Discussion. Organ. Stud. 1999, 20, 561–578.
- Howd, J. Personal communications, 21–24 February, 2 April and 19–28 June 2021; 19 April 2023.
- Turner, B.A. Some Practical Aspects of Qualitative Data Analysis: One Way of Organising the Cognitive Processes Associated with the Generation of Grounded Theory. Qual. Quant. 1981, 15, 225–247.
- Turner, B.A. Connoisseurship in the Study of Organizational Cultures. In Doing Research in Organizations; Bryman, A., Ed.; Routledge: London, UK, 1988; Chapter 7; pp. 108–122.
- Gherardi, S.; Turner, B.A. Real Men Don’t Collect Soft Data. In Quaderno; Universita di Trento, Dipartimento di Politica: Trento, Italy, 1987; Volume 3; reprinted in The Qualitative Researcher’s Companion; Huberman, A.M., Miles, M.B., Eds.; SAGE Publications: New York, NY, USA, 2002; Part I, Chapter 4.
- Gherardi, S.; Strati, A.; Turner, B.A. Industrial Democracy and Organizational Symbolism. In Organizational Democracy: Taking Stock. International Handbook of Participation in Organizations; Lammers, C.J., Széll, G., Eds.; De Gruyter: Berlin, Germany; Oxford University Press: Oxford, UK, 1989.
- Martin, P.Y.; Turner, B.A. Grounded Theory and Organizational Research. J. Appl. Behav. Sci. 1986, 22, 141–157.
- Pidgeon, N.F. Safety Culture and Risk Management in Organizations. J. Cross Cult. Psychol. 1991, 22, 129–140.
- Turner, B.A.; Pidgeon, N.F.; Blockley, D.I.; Toft, B. Safety Culture: Its Importance in Future Risk Management. Position Paper for the Second World Bank Workshop on Safety Control and Risk Management, Karlstad, Sweden, 6–9 November 1989; University of Bristol: Bristol, UK, 1989.
- Toft, B.; Turner, B.A. The Schematic Report Analysis Diagram: A Simple Aid to Learning from Large-scale Failures. Int. CIS J. 1987, 1, 12–23; reprinted in Risk Management, Volume II: Management and Control; Mars, G., Weir, D.T., Eds.; Routledge: London, UK, 2000; pp. 435–446.
- Turner, B.A.; Toft, B. Organizational Learning from Disasters. In Emergency Planning for Industrial Hazards, Proceedings of the European Conference on Emergency Planning for Industrial Hazards, Varese, Italy, 4–6 November 1987; Gow, H.B., Kay, R.W., Eds.; Commission of the European Communities; Elsevier: London, UK, 1988; Chapter 31; pp. 297–313.
- Turner, B.A. Organisational Responses to Hazard. In Risk: A Seminar Series, 1981; IIASA Collaborative Proceedings Series CP-82-S2; Kunreuther, H., Ed.; International Institute for Applied Systems Analysis: Laxenburg, Austria, 1982; Part I, pp. 49–86.
- Turner, B.A. Empty Portmanteaux? Organ. Stud. 1984, 5, 269–273.
- Turner, B.A. Accidents and Non-random Error Propagation. Risk Anal. 1989, 9, 437–444.
- Turner, B.A. How can we design a safe organisation? In Proceedings of the Second International Conference on Industrial and Organisational Crisis Management, New York, NY, USA, 3–4 November 1989; Leonard, N., Ed.; Stern School of Business, New York University: New York, NY, USA, 1989.
- Turner, B.A. The Development of a Safety Culture. Chem. Ind. 1991, 1, 241–243.
- Turner, B.A. The Sociology of Safety. In Engineering Safety; Blockley, D.I., Ed.; McGraw Hill: London, UK, 1992; pp. 186–201.
- Turner, B.A. Software and Contingency: The Text and Vocabulary of System Failure? J. Contingencies Crisis Manag. 1994, 2, 31–38.
- Turner, B.A. The Future for Risk Research. J. Contingencies Crisis Manag. 1994, 2, 146–156.
- Turner, B.A. Patterns of crisis behaviour—A qualitative inquiry. In Analyzing Qualitative Data; Bryman, A., Burgess, R.G., Eds.; Routledge: London, UK, 1994; pp. 195–215.
- Turner, B.A. The role of flexibility and improvisation in emergency response; and A perspective from the social sciences. In Natural Risk and Civil Protection; Horlick-Jones, T., Amendola, A., Casale, R., Eds.; European Commission/E&FN SPON: London, UK, 1995; pp. 463–475, 535–537.
- Turner, B.A. Safety Culture Management: Safety Culture and its Context, Proceedings of the International Topical Meeting on Safety Culture in Nuclear Installations, Vienna, Austria, 24–28 April 1995; Carnino, A., Weimann, G., Eds.; American Nuclear Society Austria Local Section: Vienna, Austria, 1995; pp. 322–329. Available online: https://inis.iaea.org/collection/NCLCollectionStore/_Public/27/036/27036465.pdf?r=1 (accessed on 27 April 2021).
- Horlick-Jones, T.; Amendola, A.; Casale, R. (Eds.) Natural Risk and Civil Protection; European Commission/E&FN SPON: London, UK, 1995. [Google Scholar]
- Toft, B. The Failure of Hindsight. Disaster Prev. Manag. 1992, 1, 48–60, reprinted in Risk Management Volume II: Management and Control; Mars, G., Weir, D., Eds.; Ashgate: Dartmouth, UK, 2000; Chapter 34. [Google Scholar] [CrossRef]
- Haastrup, P.; Funtowicz, S. Accident generating systems and chaos: A dynamic study of accident time series. Reliab. Eng. Syst. 1992, 35, 31–37. [Google Scholar] [CrossRef]
- Perrow, C.B. Organizing America: Wealth, Power, and the Origins of Corporate Capitalism; Princeton University Press: Princeton, NJ, USA, 2002. [Google Scholar]
- Perrow, C.B. The Next Catastrophe: Reducing Our Vulnerabilities to Natural, Industrial, and Terrorist Disasters; Princeton University Press: Princeton, NJ, USA, 2007. [Google Scholar]
- Perrow, C.B. A Society of Organizations. Theory Soc. 1991, 20, 725–762. [Google Scholar] [CrossRef]
- Perrow, C.B. Organisational Theorists in a Society of Organisations. Int. Sociol. 1992, 7, 371–380. [Google Scholar] [CrossRef]
- Perrow, C.B. Negative Synergy—Review of Learning from Disasters: A Management Approach by Brian Toft and Simon Reynolds. Nature 1994, 370, 607–608. [Google Scholar] [CrossRef]
- Perrow, C.B. Organizing for Environmental Destruction. Organ. Environ. 1997, 10, 66–72. [Google Scholar] [CrossRef]
- Perrow, C.B. Organizing to Reduce the Vulnerabilities of Complexity. J. Contingencies Crisis Manag. 1999, 7, 150–155. [Google Scholar] [CrossRef]
- Perrow, C.B. An Organizational Analysis of Organizational Theory. Contemp. Sociol. 2000, 29, 469–476. [Google Scholar] [CrossRef]
- Perrow, C.B. Organizational or Executive Failures? Contemp. Sociol. 2005, 34, 99–107. [Google Scholar] [CrossRef]
- Perrow, C.B. Shrink the Targets. IEEE Spectrum 2006, 43, 46–49. [Google Scholar] [CrossRef]
- Perrow, C.B. Disasters Ever More? Reducing U.S. Vulnerabilities. In Handbook of Disaster Research; Rodriguez, H., Quarantelli, E.L., Dynes, R.R., Eds.; Springer: New York, NY, USA, 2007; pp. 521–533. [Google Scholar] [CrossRef]
- Perrow, C.B. Complexity, Catastrophe, and Modularity. Sociol. Inq. 2008, 78, 162–173. [Google Scholar] [CrossRef]
- Perrow, C.B. Conservative Radicalism. Organization 2008, 15, 915–921. [Google Scholar] [CrossRef]
- Perrow, C.B. Modeling firms in the global economy. Theory Soc. 2009, 38, 217–243. [Google Scholar] [CrossRef]
- Perrow, C.B. Resilience Rather than Prevention and Recovery. Build. Res. Inf. 2009, 37, 213–216. [Google Scholar] [CrossRef]
- Perrow, C.B. Book Review—High Reliability Management: Operating on the Edge by Emery Roe & Paul R. Schulman. Adm. Sci. Q. 2009, 54, 364–367. Available online: https://www.jstor.org/stable/27749335 (accessed on 27 March 2021).
- Perrow, C.B. What’s needed is application, not reconciliation: A response to Shrivastava, Sonpar and Pazzaglia. Hum. Relat. 2009, 62, 1391–1393. [Google Scholar] [CrossRef]
- Perrow, C.B. The meltdown was not an accident. In Markets on Trial: The Economic Sociology of the U.S. Financial Crisis: Part A; Research in the Sociology of Organizations; Lounsbury, M., Hirsch, P.M., Eds.; Emerald Group Publishing: Bingley, UK, 2010; Volume 30A, pp. 309–330. [Google Scholar] [CrossRef]
- Perrow, C.B. Drinking Deep at Black Mountain College. South. Cult. 2013, 19, 76–94. [Google Scholar] [CrossRef]
- Perrow, C.B. Cracks in the ‘Regulatory State’. Soc. Curr. 2015, 2, 203–212. [Google Scholar] [CrossRef]
- Perrow, C.B. Effectiveness of Regulatory Agencies. In The Routledge Companion to Risk, Crisis and Emergency Management; Gephart, R., Miller, C., Helgesson, K., Eds.; Routledge: London, UK, 2018; Chapter 36; pp. 508–512. [Google Scholar]
- Gephart, R.P. Making Sense of Organizationally Based Environmental Disasters. J. Manag. 1984, 10, 205–225. [Google Scholar] [CrossRef]
- Sagan, S.D. The Limits of Safety: Organizations, Accidents and Nuclear Weapons; Princeton University Press: Princeton, NJ, USA, 1993. [Google Scholar]
- Clarke, L. Acceptable Risk? Making Decisions in a Toxic Environment; University of California Press: Berkeley, CA, USA, 1989. [Google Scholar]
- Clarke, L. Mission Improbable: Using Fantasy Documents to Tame Disaster; University of Chicago Press: Chicago, IL, USA, 1999. [Google Scholar]
- Clarke, L.; Perrow, C.B. Prosaic Organizational Failure. Am. Behav. Sci. 1996, 39, 1040–1056. [Google Scholar] [CrossRef]
- Snook, S.A. Friendly Fire: The Accidental Shootdown of U.S. Black Hawks over Northern Iraq; Princeton University Press: Princeton, NJ, USA, 2000. [Google Scholar]
- La Porte, T.R. A Strawman Speaks Up: Comments on The Limits of Safety. J. Contingencies Crisis Manag. 1994, 2, 207–211. [Google Scholar] [CrossRef]
- La Porte, T.R.; Rochlin, G.I. A Rejoinder to Perrow. J. Contingencies Crisis Manag. 1994, 2, 221–227. [Google Scholar] [CrossRef]
- Rochlin, G.I. Safe operation as a social construct. Ergonomics 1999, 42, 1549–1560. [Google Scholar] [CrossRef]
- Douglas, M.; Wildavsky, A. Risk and Culture: An Essay on the Selection of Technological and Environmental Dangers; University of California Press: Berkeley, CA, USA, 1982. [Google Scholar]
- Douglas, M. Loose Ends and Complex Arguments—Review Essay of Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Contemp. Sociol. 1985, 14, 171–173. Available online: https://www.jstor.org/stable/2070132 (accessed on 27 March 2021). [CrossRef]
- Hopkins, A. The limits of normal accident theory. Saf. Sci. 1999, 32, 93–102. [Google Scholar] [CrossRef]
- Hopkins, A. Was Three Mile Island a normal accident? J. Contingencies Crisis Manag. 2001, 9, 65–72. [Google Scholar] [CrossRef]
- McGill, A.R. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Hum. Resour. Manag. 1984, 23, 434–436. [Google Scholar] [CrossRef]
- Hirschhorn, L. On Technological Catastrophe—Normal Accidents: Living with High-Risk Technologies. Science 1985, 228, 846–847. [Google Scholar] [CrossRef]
- Kates, R.W. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Prof. Geogr. 1986, 38, 121–122. [Google Scholar] [CrossRef]
- Roberts, K. The significance of Perrow’s Normal Accidents. Acad. Manag. Rev. 1989, 14, 285–289. [Google Scholar] [CrossRef]
- Rossi, P.H. Book Review—Normal Accidents: Living with High Risk Technologies by Charles Perrow. Am. J. Sociol. 1985, 91, 181–184. Available online: https://www.jstor.org/stable/2779895 (accessed on 28 March 2021). [CrossRef]
- Wildavsky, A. But Is It True? A Citizen’s Guide to Environmental Health and Safety Issues; Harvard University Press: Cambridge, MA, USA, 1995. [Google Scholar]
- Cummings, L.L. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Adm. Sci. Q. 1985, 29, 630–632. [Google Scholar] [CrossRef]
- Grimes, A.J. Book Review—Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Acad. Manag. Rev. 1985, 10, 366–368. [Google Scholar] [CrossRef]
- Ravetz, J. Making accidents ‘normal’—Review of Normal Accidents: Living with High-Risk Technologies by Charles Perrow. Futures 1985, 17, 287–288. [Google Scholar] [CrossRef]
- Turkstra, C.J. Book Review—Normal Accidents: Living with High Risk Technologies, by Charles Perrow. Struct. Saf. 1986, 4, 165. [Google Scholar] [CrossRef]
- Williams, B. Accidents Will Happen—Review of Normal Accidents: Living with High Risk Technologies, by Charles Perrow. Soc. Stud. Sci. 1988, 18, 556–560. [Google Scholar] [CrossRef]
- Jermier, J.M. “Complex Systems Threaten to Bring Us Down …”: Introduction to the Symposium on Normal Accidents. Organ. Environ. 2004, 17, 5–8. [Google Scholar] [CrossRef]
- Rosa, E.A. Celebrating a Citation Classic—And More: Symposium on Charles Perrow’s Normal Accidents. Organ. Environ. 2005, 18, 229–234. [Google Scholar] [CrossRef]
- Sagan, S.D. Learning from Normal Accidents. Organ. Environ. 2004, 17, 15–19. [Google Scholar] [CrossRef]
- Le Coze, J.C. 1984–2014. Normal Accidents. Was Charles Perrow Right for the Wrong Reasons? J. Contingencies Crisis Manag. 2015, 23, 275–286. [Google Scholar] [CrossRef]
- Hopkins, A. Issues in safety science. Saf. Sci. 2014, 67, 6–14. [Google Scholar] [CrossRef]
- Hopkins, A. Managing Major Hazards: The Lessons of the Moura Mine Disaster; Allen & Unwin: Sydney, Australia, 1999. [Google Scholar]
- Hopkins, A. Counteracting the Cultural Causes of Disaster. J. Contingencies Crisis Manag. 1999, 7, 141–149. [Google Scholar] [CrossRef]
- Hopkins, A. Lessons from Longford; CCH: Sydney, Australia, 2000. [Google Scholar]
- Hopkins, A. A culture of denial: Sociological similarities between the Moura and Gretley mine disasters. J. Occup. Health Saf.—Aust. N. Z. 2000, 16, 29–36. [Google Scholar]
- Hopkins, A. Lessons from Gretley: Mindful Leadership and the Law; CCH: Sydney, Australia, 2007. [Google Scholar]
- Cyert, R.M.; March, J.G. A Behavioral Theory of the Firm; Prentice-Hall: Englewood Cliffs, NJ, USA, 1963. [Google Scholar]
- Cohen, M.D.; March, J.G.; Olsen, J.P. A Garbage Can Model of Organizational Choice. Adm. Sci. Q. 1972, 17, 1–25. [Google Scholar] [CrossRef]
- Toft, B. Personal communications, 21–30 June 2021 and 15 December 2022; the June 2021 communications included copies of correspondence to and from Professor J.T. Reason dated 17 and 21 July 1987, respectively.
- Pidgeon, N.F. Personal communications, 21–23 April, 2–4 June and 8–9 June 2021.
- Perrow, C.B. Letter to Mrs Turner of 21 November 1995; provided to the primary author as an emailed attachment in a personal communication from Janet Howd (aka Janet Turner) on 24 February 2021.
- Editorial Introduction: A Special Symposium on Barry Turner’s Work on Man-Made Disasters. J. Contingencies Crisis Manag. 1998, 6, 71. [CrossRef]
- Toft, B.; Reynolds, S. Learning from Disasters: A Management Approach; Butterworth-Heinemann: Oxford, UK, 1994. [Google Scholar]
- Ashmos, D.P.; Huber, G.P. The Systems Paradigm in Organization Theory: Correcting the Record and Suggesting the Future. Acad. Manag. Rev. 1987, 12, 607–621. [Google Scholar] [CrossRef]
- Boulding, K.E. General Systems Theory—The Skeleton of Science. Manag. Sci. 1956, 2, 197–208. [Google Scholar] [CrossRef]
- Kast, F.E.; Rosenzweig, J.E. General Systems Theory: Applications for Organization and Management. Acad. Manag. J. 1972, 15, 447–465. [Google Scholar] [CrossRef]
- Von Bertalanffy, L. The History and Status of General Systems Theory. Acad. Manag. J. 1972, 15, 407–426. [Google Scholar] [CrossRef]
- Weick, K.E. The vulnerable system: An analysis of the Tenerife air disaster. In New Challenges to Understanding Organizations; Roberts, K.H., Ed.; Macmillan: New York, NY, USA, 1993; pp. 173–198. [Google Scholar]
- Pidgeon, N.F. Observing the English Weather: A Personal Journey from Safety I to Safety IV. In Safety Science Research: Evolution, Challenges and New Directions; Le Coze, J.C., Ed.; CRC Press: Boca Raton, FL, USA, 2020; pp. 269–279. [Google Scholar]
- Pidgeon, N.F.; O’Leary, M. Man-made Disasters: Why technology and organizations (sometimes) fail. Saf. Sci. 2000, 34, 15–30. [Google Scholar] [CrossRef]
- Blockley, D.I. Managing Proneness to Failure. J. Contingencies Crisis Manag. 1998, 6, 76–79. [Google Scholar] [CrossRef]
- Blockley, D.I. Building Bridges: Between Theory and Practice; World Scientific Publishing Europe Ltd.: London, UK, 2020. [Google Scholar]
- Blockley, D.I. Personal communications, 24 and 26 June 2021.
- Toft, B. External Review of Never Events in Interventional Procedures. Co-Commissioned by Sheffield Teaching Hospitals NHS Foundation Trust and Sheffield Clinical Commissioning Group. 2014. Available online: https://docplayer.net/4636370-Professor-brian-toft-obe-june-2014.html (accessed on 14 February 2021).
- Toft, B.; Reynolds, S. Learning from Disasters: A Management Approach, 3rd ed.; Perpetuity Press: Leicester, UK, 2005; Available online: https://link.springer.com/book/10.1007/978-1-349-27902-9 (accessed on 7 September 2023).
- Gherardi, S. Speaking Personally: Remembering Barry Turner. Organization 1995, 2, 547–549. [Google Scholar] [CrossRef]
- Gherardi, S. A Cultural Approach to Disasters. J. Contingencies Crisis Manag. 1998, 6, 80–83. [Google Scholar] [CrossRef]
- Gherardi, S. Man-made Disasters … Twenty Years On. Organ. Stud. 1999, 20, 695–700. [Google Scholar] [CrossRef]
- Vaughan, D.; Turner, B.A.; Pidgeon, N.F. Foreword. In Man-Made Disasters, 2nd ed.; Butterworth-Heinemann: Oxford, UK, 1997. [Google Scholar]
- Rosenthal, U.; Turner, B.A.; Pidgeon, N.F. Foreword. In Man-Made Disasters, 2nd ed.; Butterworth-Heinemann: Oxford, UK, 1997. [Google Scholar]
- Google Scholar. Citation searches for all editions of Man-made Disasters and Normal Accidents: Living with High-Risk Technologies. Available online: https://scholar.google.com.au/ (accessed on 29 May 2021).
- Merigó, J.; Miranda, J.; Modak, N.; Boustras, G.; de la Sotta, C. Forty years of Safety Science: A bibliometric overview. Saf. Sci. 2019, 115, 66–88. [Google Scholar] [CrossRef]
- Hale, A.R.; Hale, M. Accidents in Perspective. Occup. Psychol. 1970, 44, 115–121. [Google Scholar]
- Hale, A.R.; Hale, M. A Review of the Industrial Accident Research Literature; Research Paper 2; Committee on Safety and Health at Work, H.M.S.O.: London, UK, 1972. [Google Scholar]
- Hale, A.R.; Glendon, I. Individual Behaviour in the Control of Danger. Industrial Safety Series; Elsevier: Amsterdam, The Netherlands, 1987; Volume 2. [Google Scholar]
- Hale, A.R.; Hovden, J. Management and Culture: The third age of safety. A review of approaches to organizational aspects of safety, health and environment. In Occupational Injury: Risk, Prevention, and Intervention; Feyer, A.M., Williamson, A., Eds.; CRC Press: London, UK, 1998; pp. 129–165. [Google Scholar]
- Meshkati, N. Self-organisation, requisite variety and cultural environment: Three links of a safety chain to harness complex technological systems. In Proceedings of the Second World Bank Workshop on Risk Management (In Large-Scale Technological Operations) Organised Jointly with the Swedish Rescue Services Board, Karlstad, Sweden, 6–11 November 1989. [Google Scholar]
- Dwyer, T. Life and Death at Work: Industrial Accidents as a Case of Socially Produced Error; Springer Science+Business Media: New York, NY, USA, 1991. [Google Scholar]
- Hale, A.R.; Heming, B.H.; Carthey, J.; Kirwan, B. Modelling of Safety Management Systems. Saf. Sci. 1997, 26, 121–140. [Google Scholar] [CrossRef]
- Hale, A.R. Culture’s Confusions—Editorial. Saf. Sci. 2000, 34, 1–14. [Google Scholar] [CrossRef]
- Hale, A.R. I came into safety by accident: Dr Patrick Waterson (Loughborough University) meets Professor Andrew Hale from Health and Safety Technology and Management. Psychologist 2017, 30, 64–67. Available online: https://www.bps.org.uk/psychologist/i-came-safety-accident (accessed on 14 February 2021).
- Hale, A.R. Review of the Industrial Accident Research Literature. Hastam Blog. 2017. Available online: https://www.hastam.co.uk/review-industrial-accident-research-literature/ (accessed on 14 February 2021).
- Palmer, D. Taking Stock of the Criteria We Use to Evaluate One Another’s Work: ASQ 50 Years Out. Adm. Sci. Q. 2006, 51, 535–559. [Google Scholar] [CrossRef]
- Weick, K.E. Enacted Sensemaking in Crisis Situations. J. Manag. Stud. 1988, 25, 305–317. [Google Scholar] [CrossRef]
- Weick, K.E. Foresights of Failure: An Appreciation of Barry Turner. J. Contingencies Crisis Manag. 1998, 6, 72–74. [Google Scholar] [CrossRef]
- Weick, K.E. Normal Accident Theory as Frame, Link, and Provocation. Organ. Environ. 2004, 17, 27–31. [Google Scholar] [CrossRef]
- Weick, K.E. Making Sense of the Organization, Volume 2: The Impermanent Organization; also available online as a ProQuest ebook; John Wiley & Sons: Chichester, UK, 2009. [Google Scholar]
- Weick, K.E.; Sutcliffe, K.M.; Obstfeld, D. Organizing and the Process of Sensemaking. Organ. Sci. 2005, 16, 409–421. [Google Scholar] [CrossRef]
- Weick, K.E. Reflections on Enacted Sensemaking in the Bhopal Disaster. J. Manag. Stud. 2010, 47, 537–550. [Google Scholar] [CrossRef]
- Weick, K.E. Organizational culture as a source of high reliability. Calif. Manag. Rev. 1987, 29, 112–127. [Google Scholar] [CrossRef]
- Rasmussen, J. Publications. Technical University of Denmark Website, Denmark. 2021. Available online: http://www.jensrasmussen.org/publikations (accessed on 26 June 2021).
- Reason, J. Organizational Accidents Revisited; Ashgate: Farnham, UK, 2016. [Google Scholar]
- Rasmussen, J. Human Errors. A Taxonomy for describing human malfunction in industrial installations. J. Occup. Accid. 1982, 4, 311–333. [Google Scholar] [CrossRef]
- Rasmussen, J. Skills, Rules, and Knowledge; Signals, Signs, and Symbols, and Other Distinctions in Human Performance Models. IEEE Trans. Syst. Man Cybern. 1983, SMC-13, 257–266. [Google Scholar] [CrossRef]
- Rasmussen, J. Human Error and the problem of causality in analysis of accidents. Philos. Trans. R. Soc. Lond. 1990, B327, 449–462. [Google Scholar] [CrossRef]
- Rasmussen, J. Risk management in a dynamic society: A modelling problem. Saf. Sci. 1997, 27, 183–213. [Google Scholar] [CrossRef]
- Rasmussen, J.; Svedung, I. Proactive Risk Management in a Dynamic Society; Swedish Rescue Services Agency: Karlstad, Sweden, 2000. [Google Scholar]
- Svedung, I.; Rasmussen, J. Graphic representation of accident scenarios: Mapping structure and the causation of accidents. Saf. Sci. 2002, 40, 397–417. [Google Scholar] [CrossRef]
- Rasmussen, J. Human Factors in High-Risk Systems. In Proceedings of the Conference Record for 1988 IEEE Conference on Human Factors and Power Plants, Monterey, CA, USA, 5–9 June 1988; pp. 43–48. [Google Scholar] [CrossRef]
- Rasmussen, J. Man-Machine Communication in the Light of Accident Record. In IEEE Conference Records, Proceedings of the International Symposium on Man-Machine Systems, Cambridge, UK, 8–12 September 1969; 69C58-MMS. Volume 3, p. 3. [Google Scholar]
- Rasmussen, J. Outlines of a hybrid model of the process plant operator. In Monitoring Behavior and Supervisory Control, Proceedings of the International Symposium on Monitoring Behavior and Supervisory Control, Berchtesgaden, Germany, 8–12 March 1976; Plenum Press: New York, NY, USA, 1976; Chapter 31; pp. 371–383. [Google Scholar] [CrossRef]
- Rasmussen, J. Human Error Mechanisms in Complex Work Environments. Reliab. Eng. Syst. 1988, 22, 155–167. [Google Scholar] [CrossRef]
- Rasmussen, J.; Jensen, A. Mental Procedures in Real-Life Tasks: A Case Study of Electronic Trouble Shooting. Ergonomics 1974, 17, 293–307. [Google Scholar] [CrossRef] [PubMed]
- Rasmussen, J.; Pedersen, O.M. Formalized Search Strategies for Human Risk Contributions: A Framework for Further Development. Risø National Laboratory. Risø-M-2351. July 1982. Available online: https://backend.orbit.dtu.dk/ws/portalfiles/portal/53704802/ris_m_2351.pdf (accessed on 8 June 2021).
- Rasmussen, J.; Pedersen, O.M. Human factors in probabilistic risk analysis and in risk management. In IAEA Operational Safety of Nuclear Power Plants, Proceedings of the International Symposium on Operational Safety of Nuclear Power Plants, Marseilles, France, 2–6 May 1983; International Atomic Energy Agency Proceedings Series, IAEA-SM-268/2; IAEA: Vienna, Austria, 1984; pp. 181–194. [Google Scholar]
- Rasmussen, J. Information Processing and Human-Machine Interaction: An Approach to Cognitive Engineering; North-Holland System Science and Engineering Series Volume 12; North-Holland: New York, NY, USA, 1986. [Google Scholar]
- Rasmussen, J.; Batstone, R. Why Do Complex Organizational Systems Fail? Results of a Workshop on Safety Control and Risk Management Held in Washington, DC from 18–20 October 1988. The World Bank Policy Planning and Research Staff: Environment Working Paper No. 20, October 1989. Available online: https://documents.worldbank.org/en/publication/documents-reports/documentdetail/535511468766200820/why-do-complex-organizational-systems-fail (accessed on 6 June 2021).
- Rasmussen, J.; Batstone, R.; Rosenberg, T. (Eds.) Workshop on Safety Control and Risk Management: An Overview, Karlstad, Sweden, 6–8 November 1989; sponsored by the World Bank and the Swedish Rescue Services Board; published in 1991 by the Swedish Rescue Services Board: Karlstad, Sweden. Available online: www.orbit.dtu.dk (accessed on 29 June 2021).
- Le Coze, J.C. New models for new times. An anti-dualist move. Saf. Sci. 2013, 59, 200–218. [Google Scholar] [CrossRef]
- Le Coze, J.C. Reflecting on Jens Rasmussen’s legacy. A strong program for a hard problem. Saf. Sci. 2015, 71, 123–141. [Google Scholar] [CrossRef]
- Le Coze, J.C. Reflecting on Jens Rasmussen’s legacy (2) behind and beyond, a ‘constructivist turn’. Appl. Ergon. 2017, 59, 558–569. [Google Scholar] [CrossRef]
- Dekker, S.W. Rasmussen’s legacy and the long arm of rational choice. Appl. Ergon. 2017, 59, 554–557. [Google Scholar] [CrossRef]
- Leveson, N.G. Rasmussen’s legacy: A paradigm change in engineering for safety. Appl. Ergon. 2017, 59, 581–591. [Google Scholar] [CrossRef]
- Wise, J.A.; Debons, A. (Eds.) Information Systems: Failure Analysis; Proceedings of the NATO Advanced Research Workshop on Failure Analysis of Information Systems, Bad Windsheim, Germany, 18–22 August 1986; Springer-Verlag: Berlin, Germany, 1987. [Google Scholar]
- Reason, J.T. An Interactionist’s View of System Pathology. In Information Systems: Failure Analysis. Proceedings of the NATO Advanced Research Workshop on Failure Analysis of Information Systems, Bad Windsheim, Germany, 18–22 August 1986; Wise, J.A., Debons, A., Eds.; Springer: Berlin, Germany, 1987; pp. 211–220. [Google Scholar] [CrossRef]
- Reason, J.T. The Chernobyl errors. Bull. Br. Psychol. Soc. 1987, 40, 201–206. [Google Scholar]
- Reason, J.T. Errors and Evaluations: The lessons of Chernobyl. In Proceedings of the 1988 IEEE Conference on Human Factors and Power Plants, Monterey, CA, USA, 5–9 June 1988; pp. 537–540. [Google Scholar] [CrossRef]
- Reason, J.T. Human Error; Cambridge University Press: Cambridge, UK, 1990. [Google Scholar]
- Reason, J.T. The contribution of latent human failures to the breakdown of complex systems. Philos. Trans. R. Soc. Lond. 1990, B327, 475–484. [Google Scholar] [CrossRef]
- Reason, J.T. Managing the Risks of Organizational Accidents; Ashgate: Aldershot, UK, 1997. [Google Scholar]
- Reason, J.T. Achieving a safe culture: Theory and practice. Work Stress 1998, 12, 293–306. [Google Scholar] [CrossRef]
- Reason, J.T. The Human Contribution: Unsafe Acts, Accidents and Heroic Recoveries; Ashgate: Farnham, UK, 2008. [Google Scholar]
- Reason, J.T. A Life in Error: From Little Slips to Big Disasters; Ashgate: Farnham, UK, 2013. [Google Scholar]
- Reason, J.T. Skill and error in everyday life. In Adult Learning: Psychological Research and Applications; Howe, M.J., Ed.; Wiley: London, UK, 1977. [Google Scholar]
- Reason, J.T.; Mycielska, K. Absent-Minded? The Psychology of Mental Lapses and Everyday Errors; Prentice-Hall: Englewood Cliffs, NJ, USA, 1982. [Google Scholar]
- Vaughan, D. Autonomy, interdependence and social control: NASA and the space shuttle Challenger. Adm. Sci. Q. 1990, 35, 225–237. [Google Scholar] [CrossRef]
- Vaughan, D. Regulating risk: Implications of the Challenger Accident. In Organizations, Uncertainties, and Risk; Short, J.F., Clarke, L., Eds.; Routledge & Westview Press: Boulder, CO, USA, 1992; pp. 235–253. [Google Scholar]
- Vaughan, D. The Challenger Launch Decision: Risky Technology, Culture and Deviance at NASA; The University of Chicago Press: Chicago, IL, USA, 1996. [Google Scholar]
- Vaughan, D. The Trickle-Down Effect: Policy Decisions, Risky Work, and the Challenger Tragedy. Calif. Manag. Rev. 1997, 39, 80–102. [Google Scholar] [CrossRef]
- Vaughan, D. The Dark Side of Organizations: Mistake, Misconduct, and Disaster. Annu. Rev. Sociol. 1999, 25, 271–305. [Google Scholar] [CrossRef]
- Vaughan, D. The Role of the Organization in the Production of Techno-Scientific Knowledge. Soc. Stud. Sci. 1999, 29, 913–943. [Google Scholar] [CrossRef]
- Vaughan, D. Theorizing Disaster: Analogy, historical ethnography, and the Challenger accident. Ethnography 2004, 5, 315–347. [Google Scholar] [CrossRef]
- Vaughan, D. Interview: Diane Vaughan—Sociologist, Columbia University. Consultant. 2008. Available online: https://www.consultingnewsline.com/Info/Vie%20du%20Conseil/Le%20Consultant%20du%20mois/Diane%20Vaughan%20%28English%29.html (accessed on 27 April 2021).
- Vaughan, D. Dead Reckoning: Air Traffic Control, System Effects, and Risk; The University of Chicago Press: Chicago, IL, USA, 2021. [Google Scholar]
- Leveson, N.G. Safeware: Systems Safety and Computers: A Guide to Preventing Accidents and Losses Caused by Technology; Addison-Wesley: Boston, MA, USA, 1995. [Google Scholar]
- Leveson, N.G. A new accident model for engineering safety systems. Saf. Sci. 2004, 42, 237–270. [Google Scholar] [CrossRef]
- Leveson, N.G. Applying systems thinking to analyze and learn from events. Saf. Sci. 2011, 49, 55–64. [Google Scholar] [CrossRef]
- Leveson, N.G. Engineering a Safer World: Systems Thinking Applied to Safety; The MIT Press: Cambridge, MA, USA, 2011. [Google Scholar]
- Leveson, N.G.; Dulac, N.; Marais, K.; Carroll, J. Moving Beyond Normal Accidents and High Reliability Organizations: A Systems Approach to Safety in Complex Systems. Organ. Stud. 2009, 30, 227–249. [Google Scholar] [CrossRef]
- Hopkins, A. Disastrous Decisions: The Human and Organisational Causes of the Gulf of Mexico Blowout; CCH: Sydney, Australia, 2012. [Google Scholar]
- Hollnagel, E. Books and Papers. Available online: https://erikhollnagel.com (accessed on 13 March 2021).
- Hollnagel, E. Barriers and Accident Prevention; Ashgate: Aldershot, UK, 2004. [Google Scholar]
- Lundberg, J.; Rollenhagen, C.; Hollnagel, E. What-You-Look-For-Is-What-You-Find—The consequences of underlying accident models in eight accident investigation manuals. Saf. Sci. 2009, 47, 1297–1311. [Google Scholar] [CrossRef]
- Hollnagel, E. FRAM, The Functional Resonance Analysis Method: Modelling Complex Socio-Technical Systems; Ashgate: Farnham, UK, 2012. [Google Scholar]
- Hollnagel, E. The ETTO Principle: Efficiency-Thoroughness Trade-Off—Why Things that Go Right Sometimes Go Wrong; Ashgate: Farnham, UK, 2009. [Google Scholar]
- Hollnagel, E. Safety-I and Safety-II: The Past and Future of Safety Management; Ashgate: Farnham, UK, 2014. [Google Scholar]
- Hollnagel, E. Safety-II in Practice: Developing the Resilience Potentials; Routledge: Abingdon, UK, 2018. [Google Scholar]
- Dekker, S.W. Books and Papers. Available online: https://sidneydekker.com (accessed on 13 March 2021).
- Dekker, S.W. The Field Guide to Human Error Investigations; Ashgate: Aldershot, UK, 2002. [Google Scholar]
- Dekker, S.W. The Field Guide to Understanding Human Error, 2nd ed.; Ashgate/CRC Press: Aldershot, UK, 2006. [Google Scholar]
- Dekker, S.W. Ten Questions About Human Error: A New View of Human Factors and System Safety; Lawrence Erlbaum Associates Publishers: Mahwah, NJ, USA, 2005. [Google Scholar]
- Dekker, S.W. Drift into Failure: From Hunting Broken Components to Understanding Complex Systems; Ashgate: Aldershot, UK, 2011. [Google Scholar]
- Dekker, S.W. The Field Guide to Understanding ‘Human Error’, 3rd ed.; CRC Press: Boca Raton, FL, USA, 2014. [Google Scholar]
- Dekker, S.W. Safety Differently: Human Factors for a New Era, 2nd ed.; CRC Press: Boca Raton, FL, USA, 2015. [Google Scholar]
- Dekker, S.W. Foundations of Safety Science: A Century of Understanding Accidents and Disasters; CRC Press: Boca Raton, FL, USA, 2019. [Google Scholar]
- Shrivastava, P. Bhopal: Anatomy of a Crisis, 2nd ed.; Paul Chapman: London, UK, 1992. [Google Scholar]
- Wildavsky, A. Searching for Safety: Social Theory and Social Policy; Routledge: New York, NY, USA, 1988. [Google Scholar]
- Le Coze, J.C. Are organisations too complex to be integrated in technical risk assessment and current safety auditing? Saf. Sci. 2005, 43, 613–638. [Google Scholar] [CrossRef]
- Le Coze, J.C. How safety culture can make us think. Saf. Sci. 2019, 118, 221–229. [Google Scholar] [CrossRef]
- Le Coze, J.C. Ideas for the future of safety science. Saf. Sci. 2020, 132, 104966. [Google Scholar] [CrossRef]
- Le Coze, J.C. (Ed.) Safety Science Research: Evolution, Challenges and New Directions; CRC Press: Boca Raton, FL, USA, 2020. [Google Scholar]
- Macrae, C. Close Calls: Managing Risk and Resilience in Airline Flight Safety; Palgrave Macmillan: London, UK, 2014. [Google Scholar]
- Hayes, J.; Hopkins, A. Nightmare Pipeline Failures: Fantasy Planning, Black Swans and Integrity Management; Wolters Kluwer CCH: Sydney, Australia, 2014. [Google Scholar]
- Dechy, N.; Dien, Y.; Hayes, J.; Paltrinieri, N. Failures of Foresight in Safety: Fantasy Risk Analysis and Blindness. In ESReDA Project Group Foresight in Safety, Enhancing Safety: The Challenge of Foresight; EUR 30441 EN; Publications Office of the European Union: Luxembourg, 2020; Chapter 3. [Google Scholar] [CrossRef]
- Hayes, J. Investigating Accidents: The Case for Disaster Case Studies in Safety Science. In Safety Science Research: Evolution, Challenges and New Directions; Le Coze, J.C., Ed.; CRC Press: Boca Raton, FL, USA, 2020; pp. 187–202. [Google Scholar]
- Quinlan, M. Ten Pathways to Death and Disaster: Learning from Fatal Incidents in Mines and Other High Hazard Workplaces; The Federation Press: Alexandria, Australia, 2014. [Google Scholar]
- Turner, B.A. Teaching old dogs new tricks: Restructuring the insurance industry. In Insurance Viability and Loss Mitigation: Partners in Risk Reduction; Britton, N.R., McDonald, J., Oliver, J., Eds.; Alexander Howden Re: Sydney, Australia, 1995; pp. 47–65. [Google Scholar]
- Short, J.F.; Rosa, E.A. Organizations, Disasters, Risk Analysis and Risk: Historical and Contemporary Contexts. J. Contingencies Crisis Manag. 1998, 6, 93–95. [Google Scholar] [CrossRef]
- Short, J.F. The Social Fabric at Risk: Toward the Social Transformation of Risk Analysis. Am. Sociol. Rev. 1984, 49, 711–725. [Google Scholar] [CrossRef]
- Gould, S.J. Eight Little Piggies: Reflections in Natural History; originally published in 1993 by Jonathan Cape, London, and W.W. Norton and Co., New York; Vintage Digital ebook 2014; Vintage Books: London, UK, 2007. [Google Scholar]
- Calhoun, C. (Ed.) Introduction: On Merton’s Legacy and Contemporary Sociology. In Robert K. Merton: Sociology of Science and Sociology as Science; Columbia University Press: New York, NY, USA, 2010; pp. 1–29. [Google Scholar]
- Busch, C. Preventing Industrial Accidents: Reappraising H.W. Heinrich—More Than Triangles and Dominoes; Routledge: Abingdon, UK, 2021. [Google Scholar]
- Busch, C. Heinrich’s Local Rationality: Shouldn’t ‘New View’ Thinkers Ask Why Things Made Sense to Him? Master’s Thesis, Division of Risk Management and Societal Safety, Lund University, Lund, Sweden, 2018. Available online: https://lup.lub.lu.se/student-papers/search/publication/8975267 (accessed on 18 June 2020).
- Shorrock, S.T. Safety Research and Safety Practice: Islands in a Common Sea. In Safety Science Research: Evolution, Challenges and New Directions; Le Coze, J.C., Ed.; CRC Press: Boca Raton, FL, USA, 2020; pp. 223–245. [Google Scholar]
- Eisner, H.S. Editorial. J. Occup. Accid. 1976, 1, 1. [Google Scholar] [CrossRef]
- Hale, A.R.; Mearns, K.; Wybo, J.L.; Boustras, G. The future of Safety Science. Saf. Sci. 2022, 150, 105705. [Google Scholar] [CrossRef]
- ALLEA. The European Code of Conduct for Research Integrity. Revised Edition; All European Academies: Berlin, Germany, 2017; Available online: https://www.allea.org/wp-content/uploads/2017/05/ALLEA-European-Code-of-Conduct-for-Research-Integrity-2017.pdf (accessed on 29 October 2021).
- ASA. American Sociological Association Code of Ethics; Approved by the ASA Membership in June 1997; ASA: Washington, DC, USA, 2010; Available online: https://www.asanet.org/sites/dfault/files/savvy/images/asa/docs/pdf/CodeofEthics.pdf (accessed on 29 October 2021).
- NHMRC. Publication and Dissemination of Research: A Guide Supporting the Australian Code for the Responsible Conduct of Research; National Health and Medical Research Council, Australian Research Council and Universities Australia, Commonwealth of Australia: Canberra, Australia, 2020. Available online: https://www.nhmrc.gov.au/sites/default/files/documents/attachments/publications/publication_and_dissemniation_of_research_guide.pdf (accessed on 30 November 2020).
- ORI. Federal Policy on Research Misconduct. Office of Research Integrity. Executive Office of the (US) President. 2000. Available online: https://ori.hhs.gov/content/chapter-2-research-misconduct-federal-policies (accessed on 29 October 2021).
- Singapore Statement. Singapore Statement on Research Integrity; developed at the 2nd World Conference on Research Integrity, Singapore, 21–24 July 2010, as a global guide to the responsible conduct of research. Available online: https://wcrif.org/documents/327-singapore-statement-a4size/file (accessed on 18 February 2022).
- Safety MDPI. Instructions for Authors and MDPI Research and Publication Ethics. Available online: https://www.mdpi.com/journal/safety/instructions; https://www.mdpi.com/ethics (accessed on 5 November 2022).
- Flick, U. An Introduction to Qualitative Research, 6th ed.; SAGE Publications: London, UK, 2018. [Google Scholar]
- Stake, R.E. Qualitative Research; The Guilford Press: New York, NY, USA, 2010. [Google Scholar]
- Clarke, L. Personal communications, 19 April and 4–5 May 2023.
Turner’s MMD 1978 (page references in parentheses) | Perrow’s NA 1984 (page references in parentheses)
---|---
Multiple high-risk industry qualitative case documents | Multiple high-risk industry qualitative case documents |
Patterns found in cases from inquiries | Patterns found in cases from inquiries |
Organisational Sociology and Weberian background | Organisational Sociology and Weberian background |
Technology and high-risk location important | Technology and high-risk location important |
Man-made disaster focus (13–14, 190) | Man-made catastrophe focus (3, 11, 351) |
Organisational failure (66, 75–78, 199–200) | Organisational failure (233, 330–331) |
Socio-technical (2–3, 5, 8, 47–48, 89, 170, 185, 187–188) | Socio-technical (3, 7, 9, 10–11, 352) |
Systemic (19, 135–136, 141–142, 145, 158–159, 161–162, 185, 188) | Systemic (3, 10, 62–71, 351) |
Open systems/external environment (136, 151, 170, 201) | Open systems and external environment (75) |
Emergence and propagation (89, 135, 158, 180) | Emergence and propagation (9–10) |
Failures of control (7, 70, 191) | Failures of control (81, 83) |
System forgiveness (19–20) | Cybernetic self-correcting and error-avoiding systems such as aviation (11, 79–81, 126–127, 146–147, 167–168) |
Error magnification/feedback amplification (179–181, 187, 236) | Negative synergy, error inducing systems, magnification, unfamiliar or unintended feedback loops (82, 88, 98) |
Precursor contributory factors combine in complex, unexpected and discrepant ways to defeat safety systems (86, 88, 105, 126) | Interactive complexity: small failures and other unanticipated interactions can cause system accidents (4–5, 7, 10, 101) |
Complex large-scale accidents and disasters with multiple chains of causes (14, 23–24, 75–76, 89, 105, 187) | Complex system accidents and catastrophes with multiple causes (7, 70–71, 75, 78, 85–86, 88) |
Precipitating or triggering incident or event, last event is not focus (81, 88–90, 102, 107, 122, 150, 155–156, 193, 198) | Trigger event and particular events are not the focus (6–7, 71, 342, 344) |
Surprise and unanticipated events (86, 126, 138, 145–146, 151, 159, 169, 184–186) | Unanticipated and unexpected outcomes from interactions (6, 70, 78) |
Large-scale accidents, rare catastrophes (149–151, 178) | System accidents, rare catastrophes (343–345) |
Latent structure of incubation events (86–87, 89, 94, 193) | Independent factors lying fallow for the fatal spark (111) |
Less complex accidents separate from disasters (88–89, 99) | Component failure accidents with ‘DEPOSE’ factors (8, 77, 111, 343) separate from system accidents (70) |
Bounded rationality and satisficing (133–138, 161) | Bounded rationality (315–321, 323–324) |
Inability to see or comprehend hazard (93–95, 195, 198) | Inability to see or comprehend hazard (9, 75, 351) |
Gap between perceived and actual reality (84, 94, 128–129, 138, 161, 194) | Gap between perceived and actual reality (9, 75) |
Warnings not heeded or discerned (19, 61, 194–195) | Warnings ignored or didn’t fit mental model (10, 31, 351) |
Miscommunication and misinformation (45–47, 61, 64–67, 121–124, 139) | Misinterpretation and indirect information sources (35, 73, 84) |
Variable disjunction of information (50–52, 61, 101, 217, 225) and social construction of reality (165–166, 191) | Cognitive models of ambiguous situations and the social construction of reality (9, 75, 176) |
Don’t blame individual operator error (160, 162–163, 198) | Don’t blame individual operator error (4, 9, 331, 351) |
Importance of power/elites (4, 72, 124–125, 132, 152, 191) | Importance of power/elites (12, 155, 306, 311, 339, 352) |
Growing concentration and power of large organisations and energy sources (1–2, 4–6, 160, 199, 201) | Growing concentration of energy sources and power of large organisations (102, 306, 311) |
Intentional misinformation by managers (118, 125, 147) | Deception and lying, false logs by ship captains (10, 187) |
Regulatory issues/inadequacies (70–71, 79, 87, 99, 103–104) | Regulatory issues/inadequacies (3, 176, 343)
Gap in defences and failure of precautions (84, 87, 91) | Defence in depth limits and failures (3–4, 43, 60) |
Intuition, tacit knowledge, craft (11, 25, 51) | Intuition and use of heuristics (316–317, 319)
Poor and unrealistic management (63, 66–67, 77, 79) | Poor management (111–112, 177, 343) |
Environmental disasters (2, 5–6, 14, 128, 131, 149, 190) | Eco-system disasters (233, 252–253, 255, 295–296) |
Societal culture and context (84, 192) | Societal values and culture (12, 315–316, 321–328) |
Importance of learning from near misses (96, 182) | Aviation occurrence reporting model important (167–169) |
Turner’s MMD 1978 (page references in parentheses) | Perrow’s NA 1984 (page references in parentheses)
---|---
Organisational and social unit focus (160, 186, 199) | Macro industry and technology focus (3, 12–14, 339) |
Multidisciplinary approach and theories are necessary to study large-scale accidents and disasters (31–32, 38, 127) | Own theory and radical critical paradigm mostly applied to high-risk accident reports and industry data |
Somewhat optimistic about learning and prevention (32, 75–80, 194–200) | Somewhat pessimistic about learning and prevention (32, 60, 257, 343, 351) |
Incubation network (86–89, 99–107, 125, 131, 193, 200) | Inevitable normal or system accidents—irretrievable for at least some time (3–5, 256, 328, 330) |
Disaster timing usually after a long incubation often of years (87, 105, 180, 193) | Disaster timing rapid: unanticipated system interaction combined with external factors (4–5, 75, 233, 253–255) |
Disasters require focused unintended organising attention on multiple fronts to occur (180) | Banality and triviality lie behind most catastrophes (9) |
Sequence model with 6 stages (84–92) | Close or tight coupling with little slack (4–6, 10–11, 89–96, 330–332) |
Failures of intention (4, 128–131, 160, 171, 181) and of foresight (50, 77, 92, 99, 107, 161, 170, 179) | Garbage can theory helps explain randomness of system accidents (324) |
Schematic accident representation diagram (97–98) | 2 × 2 matrix or grid of complexity and coupling (97, 327) |
Hierarchy of levels of information (145) | Catastrophic potential of risky technologies especially where complex and tightly coupled systems (342–346) |
Sub-cultures and shared social context determine perception (4, 58, 78, 101, 120–121, 166–171) | Capitalist production imperatives and distorted market prices are important (310–313) |
Bounded decision zones and perceptual horizons in an organisational worldview (58–59, 120–121, 165, 168–171, 200) | Common mode failures (72–73, 75, 85)
Ill-structured problems; confusion across organisations and divisions (19–22, 50, 52–53, 60, 72, 75, 77, 96, 107) | Unnecessary proximity and tight spacing can lead to unexpected interactions (82, 85, 88) |
Well-structured problem post-disaster (52, 74–76, 103, 106, 179–188) | Centralisation and decentralisation (10, 331–335) |
Intended actor rationality (129, 160, 171–178, 200) | Social rationality by non-experts in society (315–316, 321–324)
Negentropy, anti-tasks and non-random structured nature of unintended consequences (127, 179, 181, 187, 190) | Understanding of transformational designs and processes is limited (11, 84–86, 330) |
Discrepant information and events (86–90, 122, 146) | Externalities imposed on society (339–341) |
Importance of organisational culture (77, 103) | Incomprehensibility of system accidents (23, 277) |
Catastrophe and chaos theory (153–156, 185–187, 194) | Complex systems seek productive efficiency (88) |
Misdirected energy and misinformation (4, 182–184, 187, 189–191, 193) | Risk assessment has a narrow focus; typically assumes over-regulation (306–314) |
Decoy problem takes the focus off more serious threats (59–61, 64, 78, 80, 86–87, 100, 102–104, 196) | Risk assessor ‘shamans’ support elites’ use of ‘evil’ technologies (12, 14, 307); some scientists, engineers and cognitive psychologists complicit (14, 307, 316–320) |
Complaints from outsiders discounted; reluctance to fear the worst (73–74, 76, 102–104) | Social class distribution of risk, inequality linked to disproportionate risk (310) |
Social and differentiated distribution of knowledge (3, 85, 106, 152) | Error-inducing systems such as marine shipping (11, 173–176, 181–190, 230) |
Channels of observation not just communication (141, 159); what organisations pay attention to (58, 163–171) | Nuclear accidents like TMI, unreliability and inevitability (15–61, 344, 348) |
Nuclear industry’s enormous hazards—but risk analysis, information and response (1–2, 18, 29–30, 35, 183) | Normative advocacy; technologies like nuclear power and weapons should not be used (x, 14, 347–352)
Researcher | Knowledge of Turner (MMD 1978 or after 1997 2nd edn) | Acknowledgment of Turner’s Ideas | Knowledge of Perrow’s NA (1984 or 1999) | Acknowledgment of Perrow’s Ideas
---|---|---|---|---
Hale | MMD 1978 | mixed | NA | good |
Weick | 2nd edn 1997 | good | NA | good |
Rasmussen | Unclear | poor | Unclear | poor
Reason | MMD 1978 | poor/mixed | NA | good |
Vaughan | MMD 1978 | good/mixed | NA | good |
Leveson | MMD 1978 | mixed/poor | NA | good |
Hopkins | 2nd edn 1997 | good | NA | good |
Hollnagel | 2nd edn 1997 | mixed | NA | good |
Dekker | 2nd edn 1997 | mixed/good | NA | good |
Shrivastava | MMD 1978 | mixed | NA | poor |
Sagan | Unclear | poor | NA | good |
Snook | pre MMD 1978 | poor | NA | good |