1. Introduction
Within contemporary safety science, analyses of complex technical systems reveal that the dynamic interactions between human behavior, operational conditions, and organizational processes simultaneously create pathways for enhanced performance and channels for risk [1,2]. In high-risk industries such as oil and gas, adopting a systemic perspective is essential, as errors rarely occur in isolation but emerge from interdependent socio-technical structures [3]. Organizations in this sector have become increasingly powerful, managing sophisticated technological infrastructures and developing financial and operational capacities that often rival those of states [4]. These entities grow in size and influence, managing ever more complex and continuously evolving technological systems. At the same time, societal and institutional pressures for accountability have made accidents progressively less tolerable [5,6]. When adverse events occur, public and organizational responses often converge on attributing culpability to individuals, most frequently frontline operators, while broader organizational and contextual determinants remain underexplored [7].
Classical frameworks for analyzing unsafe acts, such as Reason’s Swiss Cheese Model and the Human Factors Analysis and Classification System (HFACS), have been foundational in the evolution of safety science [1,2,8]. These models highlight how human errors are shaped not only by immediate actions but also by latent conditions, including supervisory deficiencies, organizational shortcomings, and regulatory pressures. Within the oil and gas sector, HFACS has been adapted into the HFACS-OGI framework, which structures accident causation across five levels of analysis, ranging from unsafe acts by frontline operators to external industry and regulatory influences [9,10].
More recent models also emphasize the variability of operator behavior and the organizational capacity to monitor, anticipate, respond, and learn [11,12]. They shift the focus from retrospective blame to prospective adaptability, recognizing unsafe acts as the emergent outcomes of complex socio-technical systems rather than isolated individual failings [13,14]. System-theoretic approaches illustrate this shift through frameworks such as Rasmussen’s socio-technological system risk management model [15], the AcciMap method [16], the Functional Resonance Analysis Method (FRAM) [17], and the System-Theoretic Accident Model and Processes (STAMP) [18,19]. In line with this perspective, contemporary theories increasingly conceptualize safety as a multicausal phenomenon, arising from the dynamic interactions among system components—technical, human, organizational, and regulatory—rather than as the effect of a single error or breakdown [20,21].
Despite these theoretical advances, practice in high-risk industries continues to be dominated by post-accident investigations. Such inquiries are frequently conducted under societal and organizational pressure to rapidly identify both causes and culpable parties, which often results in attributing errors exclusively to frontline operators [22]. This approach reinforces individual responsibility while overlooking the systemic and organizational conditions that shape unsafe acts. In human factors research, however, a paradigm shift is taking place—from a narrow focus on human–technology interaction toward a holistic approach that views accidents as emergent from dysfunctional interactions across multiple system components [23]. This shift highlights the importance of examining organizational influences not as background conditions but as active determinants in accident causation [24,25].
Against this backdrop, the present study investigates the role of organizational factors in unsafe acts within refinery maintenance operations in the oil and gas industry. Specifically, it examines how organizational communication, resource adequacy, and procedural design contribute to operator errors and violations. The study further examines the extent to which these factors serve as direct predictors of unsafe acts and, through partial correlation analysis, as linking variables between broader organizational conditions and operator behavior. By addressing these questions, the research seeks to extend the empirical evidence base of HFACS-OGI, moving beyond individual blame to demonstrate how organizational conditions shape frontline performance. In doing so, it contributes to both theory and practice: offering a more comprehensive understanding of accident causation in refinery operations and providing actionable insights for organizational safety governance.
1.1. Theoretical Framework
1.1.1. Factors of Accidents
One of the most prominent frameworks embodying the systemic view is Reason’s Swiss Cheese Model [1], which depicts accidents as the result of multiple breakdowns across different layers of organizational defense. Each layer, such as operational procedures, technological safeguards, or managerial oversight, contains inherent “holes” representing vulnerabilities. These may arise from active failures, such as operational errors, or from latent conditions, including flawed processes, resource shortages, or a weak safety culture. While a single weakness may not cause an accident, the model emphasizes that when such vulnerabilities align across several layers, hazards can penetrate all defenses, leading to system failure. In high-risk industries, this perspective shifts the focus from individual blame to strengthening systemic resilience.
To overcome the Swiss Cheese Model’s limited specification of active and latent factors and their practical application, Shappell and Wiegmann developed the Human Factors Analysis and Classification System (HFACS) as an extension of Reason’s framework. HFACS expands each category of active and latent causal factors, offering a more comprehensive and systematic approach to analyzing and mitigating human factors in accidents [26]. The model has been adapted for various high-risk industries, including oil and gas. In this sector, it evolved into the HFACS-OGI, developed from oil and gas technical reports, including those of the Society of Petroleum Engineers [10]. According to this model, there are five analytical levels: (1) Unsafe acts, errors or violations committed by front-line operators; (2) Preconditions for unsafe acts, environmental, situational, or human conditions that increase the likelihood of such errors; (3) Unsafe supervision, deficiencies in oversight, planning, or corrective action; (4) Organizational influences, systemic issues within the company, including safety culture, resource allocation, and management priorities; and (5) Outside regulatory and industry influences—external factors such as national and international regulations, industry standards, and market pressures. This structure enables investigators to trace accident causation from immediate human actions back through deeper organizational and regulatory contexts, being primarily designed for post-incident analysis. Thus, under the HFACS-OGI framework, unsafe acts by front-line operators represent only one facet of human error, leaving room in the analysis for another critical dimension: the role that decisions and actions by other actors across the work system play in shaping accident trajectories [15,27]. This perspective is consistent with research on safety leadership [28] and safety climate [29], which highlights the importance of managerial decisions and actions in creating either safe or unsafe working environments.
1.1.2. The Concept of Unsafe Acts
Unsafe acts are defined as human actions or omissions that deviate from established safety norms, procedures, or expected performance, and that have the potential to contribute to accidents or incidents [1,26]. Within the HFACS, unsafe acts constitute the first level of analysis and are considered active failures—the frontline errors and violations that directly precede an accident. Errors may be skill-based, decision-based, or perceptual, while violations represent deliberate departures from rules or procedures, whether routine or exceptional.
Consistent with Reason’s Swiss Cheese Model, unsafe acts rarely occur in isolation but are often the final link in a causal chain shaped by latent organizational conditions such as inadequate communication, insufficient resources, or poorly designed procedures. This systemic view is reinforced by the HFACS-OGI extension, which explicitly links unsafe acts to supervisory, organizational, and external regulatory influences in high-risk sectors like oil and gas.
Traditionally, human error models and methods have, in most cases, overlooked the fact that errors can occur at all levels of a system [30]. Instead, they have tended to focus analytical attention on the actions of operators and end-users, particularly those working at the “front line” or “sharp end,” such as pilots, drivers, or control room personnel. These individuals are perceived as the direct agents whose actions or omissions led to the incident. They are also the easiest to identify, and their actions or omissions are the most straightforward to document. As a result, limited-depth investigations frequently assign exclusive blame to such individuals, who, in reality, are only part of a larger system.
Analyzing the system as a whole is considerably more challenging, which is why many incident investigations concentrate solely on the performance of the controller. This narrow focus is reflected in behavioral safety programs that monitor individual rule compliance, performance appraisals, and incentive schemes. Underpinning these approaches is the assumption that if the individual exerted greater effort, paid closer attention, and adhered precisely to prescribed procedures, the outcome would be successful [31,32].
In reality, however, systemic safety models have shown that individual actions are an integral part of a broader sociotechnical system and are shaped by a complex interplay of factors, ranging from organizational culture and resource allocation to communication structures, supervisory practices, and regulatory environments. These models emphasize that unsafe acts at the “sharp end” are often the visible consequences of deeper latent conditions embedded within the system’s design and management [33]. Understanding this layered causation requires moving beyond a focus on individual performance towards an examination of the organizational and systemic contexts in which operators act, an approach central to frameworks such as the HFACS-OGI [34].
Undoubtedly, there are unsafe acts committed by front-line operators. These may be intentional or unintentional and most often represent active errors—failures arising from actions or omissions directly related to the performance of work duties. Unintentional acts are typically caused by a lack of the necessary knowledge to perform the task correctly, a misperception of what is required, or flawed decision-making regarding how to execute assigned duties. By contrast, violations refer to intentional acts, whether direct or indirect, that involve breaching established rules, work procedures, or instructions from hierarchical superiors. Such violations can generally be classified as disciplinary offences and, in the event of an incident or accident, may also constitute grounds for legal liability, whether civil or criminal.
1.2. Research Objectives
Building on the core principle of systemic error theories, namely, that the actions of front-line operators in the oil and gas industry are intrinsically connected to organizational conditions, this study sought to identify which potential operator errors may be influenced or caused by organizational factors. In line with the HFACS-OGI framework, which situates unsafe acts within a broader causal chain encompassing supervisory, organizational, and regulatory domains, a questionnaire was administered inviting operators to respond, based on their own experience, to questions concerning actions related to their current activities. This approach allows for a deeper exploration of workers’ perceptions, experiences, and insights, providing valuable qualitative information while also generating quantifiable results for identifying both immediate and underlying organizational factors influencing performance. In interpreting these responses, the principle of local rationality is essential: understanding each situation from the perspective of those directly involved, considering their mind-set, knowledge, objectives, demands, situational context, and the information available to them at the time [35,36]. This perspective is rooted in Simon’s [37] concept of local or bounded rationality, which posits that the rationality of a decision or action should be evaluated from the standpoint of the person acting within a particular situation, taking into account their knowledge, objectives, and the environmental constraints they faced [38]. Applying this approach helps avoid hindsight bias in safety investigations, recognizing that what appears obvious in retrospect may not have been evident to the actors during the event. Incorporating local rationality into the analysis therefore strengthens the validity of the findings by ensuring that operator actions are interpreted within the actual operational context, rather than against an idealized or retrospective standard. This perspective aligns with Rasmussen and Jensen’s [39] observation that studying normal performance and adaptability in real-world problem-solving reveals that the strategies people actually employ often diverge from the complex, rational approaches that psychologists—guided by idealized cognitive models—might expect.
Since most respondents were front-line operators performing civil and mechanical works on refinery installations, the questionnaire was developed by adapting the organizational factors from the HFACS-OGI framework to the specific activities undertaken by these workers. Three main categories of organizational factors were defined: resources, communication, and procedures. For the resources category, four items were formulated to assess operator work overload (Item 8), clarity in task allocation within the team (Item 20), provision of the necessary work equipment and procedures (Item 14), and efficiency of decision-making flows (Item 25). The communication category also included four items, addressing the absence of information regarding upcoming operations (Item 6), receipt of contradictory information (Item 19), receipt of incomplete information (Item 22), and the use of inappropriate language in information transfer (Item 23). The third category, procedures, reflected the highly proceduralized nature of the industry and the potential disconnect between ‘work as imagined’ and ‘work as performed’ [13]. Three items were included to determine whether procedures are perceived as overly complex or burdensome (Item 27), whether they are embraced by operators as relevant to their workplace and specific tasks (Item 16), and whether operators have the capacity to assimilate and comply with them (Item 28).
For the purposes of this study, operator behaviors were categorized into two overarching groups consistent with the safety literature: errors and violations [1,10,13]. The classification of unsafe acts in this study was adapted from the HFACS-OGI framework, with modifications to better capture the specific operational realities of refinery maintenance work. The original skill-based error category was replaced with two more precisely defined error types, decoding errors and model errors, while perception errors and decision errors were retained from the original framework. Violations were also analyzed as a distinct category. Perception errors were measured through two items: Item 4, where the operator does not correctly understand the task due to environmental conditions such as heat or noise, and Item 9, where the operator does not correctly understand which specific operation needs to be performed. Decoding errors referred to failures in interpreting visual or symbolic information and were measured through Item 7, where the operator misinterprets codes or signals, and Item 10, where the operator is unable to decipher a written instruction. Model errors were defined as failures in adhering to planned work processes. This category included Item 12, where the operator is unable to meet deadlines or maintain the quality of the work, and Item 13, where the operator undertakes operations other than those originally planned. Decision errors were operationalized through four items: Item 5, where the operator fails to detect and promptly communicate a hazard; Item 21, where the operator feels reluctant to request clarifications; Item 24, where the operator refrains from making decisions due to fear of making a mistake; and Item 26, where the operator does not promptly report a technical problem.
In contrast, violations were defined as deliberate deviations from established rules, procedures, or safe operating practices, which may be driven by situational pressures, organizational culture, or individual attitudes [1,2,10]. While violations may sometimes be motivated by perceived efficiency or the belief that procedural compliance is unnecessary, they nonetheless increase operational risk. This category included Item 11, where the operator disregards risk signage; Item 15, where the operator does not understand the work plan and chooses to perform tasks they personally consider necessary; and Item 17, where the operator deliberately breaches procedures, believing that their action is justified or beneficial.
Before defining the research objectives, it is necessary to clarify that all subsequent references to unsafe acts should be interpreted as the potential for unsafe acts to occur, since the actual occurrence of unsafe acts is necessarily subsumed within their potential for occurrence.
One potential constraint of the research concerns the relatively small sample size, restricted to the actual number of employees within the analyzed organization. However, it is important to note that expanding the sample by administering the questionnaire across multiple organizations could have resulted in findings of limited relevance, since in most cases organizations differ in their processes, procedures, organizational climate, and even core values. Organizational factors are intrinsic to each organization; therefore, the analysis must necessarily be confined to a specific organizational context. At the same time, the study took into account the statistical inference capacity of the data, ensuring that the results remain meaningful when compared with companies operating under similar conditions.
The research objectives are as follows:
H1—Communication deficiencies and operator errors.
Communication deficiencies, such as incomplete, contradictory, or unclear information, significantly increase the likelihood of unsafe acts in refinery maintenance operations. Communication breakdowns have been consistently identified as precursors to active errors in high-risk industries [29]. Within HFACS-OGI, miscommunication or incomplete task information constitutes a latent condition that can directly impair decision-making, task execution, and situational awareness, thereby increasing the probability of perception, model, and decision errors. This construct is operationalized via Items 6, 19, 22, and 23, which capture scenarios of missing (Item 6), contradictory (Item 19), incomplete (Item 22), or ambiguous information (Item 23).
H2—Inadequate resources and latent conditions for unsafe acts.
Insufficient resources, such as work overload, lack of necessary equipment, unclear task allocation, and delays in decision-making, significantly increase the likelihood of operator errors by creating latent organizational conditions conducive to unsafe acts. The systemic accident literature emphasizes that resource inadequacy is a recurrent latent factor undermining safe operations [2,40,41]. Overburdened operators, insufficient tools, and slow decision-making processes compromise both efficiency and safety margins, heightening susceptibility to active errors. Measured via Items 8, 14, 20, and 25, this construct captures the operational manifestation of latent resource constraints.
H3—Procedural complexity and deviations from standard work methods.
Overly complex, poorly adapted, or insufficiently applicable procedures significantly increase the likelihood of procedural violations and deviations from standard work methods. In heavily proceduralized industries such as oil and gas, the misalignment between ‘work as imagined’ and ‘work as performed’ [13] can lead to deliberate procedural deviations or errors of omission. When procedures are perceived as impractical, irrelevant, or excessively burdensome, compliance diminishes, and adaptive workarounds proliferate [42]. Measured via Items 16, 27, and 28, this construct examines how procedural design influences operator adherence.
To overcome the limitations of mediation analyses (caused by the restricted number of observations) and to capture more clearly the direct relationships among variables, the research hypotheses were formulated using partial correlations. This approach allows for the identification of specific links between organizational factors (communication, resources, procedures) and the potential for unsafe acts, while controlling for the influence of other organizational conditions.
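The partial-correlation approach described above can be illustrated numerically. The sketch below is not the study's analysis code; it uses synthetic data with hypothetical variable names and computes a partial correlation by the residual method, regressing the control variable out of both focal variables before correlating what remains.

```python
import numpy as np

def partial_corr(x, y, controls):
    """Partial correlation of x and y, controlling for the columns of
    `controls`: regress both variables on the controls (plus an
    intercept) and correlate the OLS residuals."""
    Z = np.column_stack([np.ones(len(x)), controls])
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

# Synthetic illustration (hypothetical variables, not study data):
# 'unsafe' depends on communication problems, with resource problems
# acting as a shared influence that the partial correlation removes.
rng = np.random.default_rng(0)
n = 46                                  # same size as the study sample
resources = rng.normal(size=n)
communication = 0.6 * resources + rng.normal(size=n)
unsafe = 0.5 * communication + 0.3 * resources + rng.normal(size=n)

r_zero_order = np.corrcoef(communication, unsafe)[0, 1]
r_partial = partial_corr(communication, unsafe, resources[:, None])
```

Comparing `r_partial` with `r_zero_order` shows how much of the raw association is attributable to the controlled organizational condition.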
H4—Organizational factors as control variables in the analysis of unsafe acts.
Organizational factors in the domains of communication, resources, and procedures show significant partial correlations with unsafe acts, even when controlling for the influence of other organizational conditions.
H4a—Communication as a control variable in the relationship between organizational factors and unsafe acts.
Communication deficiencies (lack of information, contradictions, incomplete details, unclear language) show significant partial correlations with unsafe acts when controlling for the influence of other organizational factors.
H4b—Resources as a control variable in the relationship between organizational factors and unsafe acts.
Inadequate resources (work overload, lack of equipment, unclear task allocation, delays in decision-making) show significant partial correlations with unsafe acts when controlling for the influence of other organizational factors.
H4c—Procedures as a control variable in the relationship between organizational factors and unsafe acts.
Procedural complexity and reduced adaptability show significant partial correlations with unsafe acts when controlling for the influence of communication and resources.
2. Materials and Methods
2.1. Participants
The study population comprised the entire workforce (N = 48) of a maintenance company operating within a refinery in Ploiesti, Romania. Of these, 46 valid questionnaires were collected, corresponding to a response rate of 95.8%. The participants had a mean age of 45.87 years (SD = 10.27; range: 23–62). The age distribution reflected a balanced representation across active professional generations, with 25% of respondents younger than 36.25 years and 25% older than 53.75 years.
In terms of organizational position, the majority of respondents held operational roles (n = 31; 67.4%), while the remainder occupied administrative/technical (TESA) positions (n = 15; 32.6%). This proportion reflects the typical workforce structure in refinery maintenance, where operational staff constitute the dominant group.
Organizational tenure ranged from 0 to 41 years, with an average of 11.52 years (SD = 8.53). Approximately half of the respondents reported between 7 and 14 years of experience, indicating a high level of familiarity with the specific procedures and working conditions of the industry.
This demographic composition offers a comprehensive basis for analysis, as it encompasses the entire workforce of the company, ensuring that both long-tenured employees and those in the early stages of their careers are represented, thus allowing for the examination of potential differences in perceptions and behaviors related to human error and organizational factors.
All procedures involving human participants complied with institutional ethical guidelines and conformed to the ethical standards outlined in the 1964 Helsinki Declaration and its subsequent amendments. To ensure compliance with ethical standards and protect participant confidentiality, no personal data were collected. Prior to participation, respondents received detailed information regarding the anonymous nature of their responses and their right to voluntary participation.
No material incentives or other forms of compensation were offered to participants.
2.2. Design
This study employed a cross-sectional observational design to examine the extent to which organizational workflows contribute to the occurrence of unsafe human acts. It constitutes an exploratory pilot investigation conducted on a limited sample, with the primary purpose of assessing methodological feasibility and generating hypotheses to guide future research. Data were collected through a structured questionnaire administered to a small sample of respondents, reflecting both the constraints of the organizational context and the exploratory nature of the research (Appendix A). All items were assessed using a five-point Likert scale, where 1 indicated low or no severity and 5 indicated high severity, with scale anchors nuanced according to the content of each question.
The conceptual framework was structured into three main components: latent conditions for error occurrence, procedures as organizational barriers, and unsafe acts.
The proposed factorial structure was derived from the HFACS-OGI conceptual framework, extensively validated in prior research. Given the limited sample, the statistical analyses were exploratory rather than confirmatory. An exploratory factor analysis (EFA) was used to assess item structure, while a confirmatory factor analysis (CFA) was performed only for indicative purposes and should not be regarded as conclusive. These preliminary results, with appropriate refinements, may inform future SEM analyses. However, due to the small sample size and exploratory design, the findings should be interpreted cautiously and are not generalizable beyond this study.
The results of the logistic regression and partial correlation analyses presented in the tables follow this sequence: first, the logistic regressions, then the partial correlations, in accordance with the research hypotheses formulated.
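As an illustration of the logistic-regression step in that sequence, the sketch below fits a logistic model by plain gradient descent on synthetic data; the predictor, outcome, and effect sizes are hypothetical stand-ins for an organizational-factor score and a dichotomized unsafe-act indicator, and a statistical package would normally be used instead.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, steps=5000):
    """Logistic regression via plain gradient descent on the log-loss
    (illustrative only)."""
    Xb = np.column_stack([np.ones(len(X)), X])   # add intercept column
    w = np.zeros(Xb.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))        # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)        # gradient of log-loss
    return w

# Hypothetical setup: a communication-deficiency score predicting a
# dichotomized unsafe-act indicator (synthetic, not study data).
rng = np.random.default_rng(3)
comm_deficit = rng.normal(size=46)
p_true = 1.0 / (1.0 + np.exp(-(1.2 * comm_deficit - 0.2)))
outcome = (rng.random(46) < p_true).astype(float)
weights = fit_logistic(comm_deficit[:, None], outcome)
```

The fitted slope (`weights[1]`) plays the role of the logistic coefficients reported in the results tables: its sign and magnitude indicate how the predictor shifts the odds of the unsafe-act outcome.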
Latent conditions for error occurrence—This component was operationalized through two factors identified via Exploratory Factor Analysis (EFA) and validated using Confirmatory Factor Analysis (CFA): Communication (Items 6, 19, 22, 23) and Resources (Items 8, 14, 20, 25). The EFA indicated moderate sample adequacy (KMO = 0.601) and a statistically significant Bartlett’s test (χ² = 554.826, df = 276, p < 0.001). Using promax rotation, two factors were extracted, together explaining 34.3% of the total variance, with factor loadings ranging from 0.434 to 0.870. CFA supported the bidimensional structure, yielding good fit indices (χ² = 22.757, df = 19, p = 0.248; CFI = 0.975; TLI = 0.963; GFI = 0.976; RMSEA = 0.066). The only index above the conventional threshold was SRMR = 0.101, which remains acceptable given the small sample size and the exploratory purpose of the study. The AVE values were 0.482 for Communication and 0.397 for Resources. Although the latter falls below the commonly recommended threshold of 0.40–0.50, it can be accepted in exploratory research, particularly when the factors are not used for hypothesis testing within structural models [43,44,45]. The exploratory analysis of the two proposed factors—Communication and Resources—highlights differences in internal consistency, as well as their complementarity in shaping the analysis of human risk potential within organizations. For this purpose, Cronbach’s Alpha (α), McDonald’s Omega (ω), and Guttman’s λ coefficients were calculated.
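For readers reproducing the reliability step, Cronbach's alpha can be computed directly from the item variance structure. The sketch below uses hypothetical Likert responses (not the study's data) for a four-item block with 46 simulated respondents.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, k_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
    X = np.asarray(items, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1).sum()
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1.0 - item_vars / total_var)

# Hypothetical 4-item Likert block (1-5): each item is a noisy copy
# of a common underlying score, so items correlate positively.
rng = np.random.default_rng(1)
true_score = rng.integers(1, 6, size=46)
items = np.clip(true_score[:, None] + rng.integers(-1, 2, size=(46, 4)),
                1, 5)
alpha = cronbach_alpha(items)
```

McDonald's omega and Guttman's lambda coefficients reported in the text require factor loadings or split-half schemes and are typically obtained from dedicated psychometric packages.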
The Communication factor shows satisfactory internal consistency (ω = 0.672; α = 0.634; λ = 0.651), with values exceeding the minimum thresholds generally accepted for exploratory studies. The calculated values and confidence intervals are presented in Table 1.
The Resources factor shows more modest values (ω = 0.508; α = 0.526; λ = 0.540), indicating fragile internal consistency and a low degree of homogeneity among items. The analysis of this factor is nevertheless justified both by its theoretical relevance—since access to resources is recognized as a critical determinant of organizational behavior and human risk potential—and by the need to cover the conceptual domain of the HFACS-OGI framework. The results are reported at this stage of the research, as the literature emphasizes that values close to 0.50 may be acceptable in the early stages of instrument development, particularly when the objective is exploratory testing. The calculated values and confidence intervals are presented in Table 2.
Consequently, the findings suggest that the two dimensions represent complementary elements, providing an integrated perspective on human risk and forming a basis for future validation and refinement. However, in view of the results obtained for these parameters, neither of the two proposed latent factors was retained for further statistical analysis, as their inclusion would not have provided robust or reliable outcomes at this stage of the research.
The heterotrait-monotrait (HTMT) ratio of 0.535 indicated acceptable discriminant validity between the two factors, supporting the interpretation that they capture distinct dimensions of the analyzed phenomenon.
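The HTMT ratio reported here is the mean heterotrait (between-block) item correlation divided by the geometric mean of the two mean monotrait (within-block) item correlations. A minimal sketch on synthetic two-factor data (hypothetical items, not the study's; a larger n is used so the sampling error stays small):

```python
import numpy as np

def htmt(block_a, block_b):
    """HTMT ratio for two item blocks, each (n_respondents, n_items):
    mean between-block correlation over the geometric mean of the
    mean within-block correlations."""
    R = np.corrcoef(np.column_stack([block_a, block_b]), rowvar=False)
    ka, kb = block_a.shape[1], block_b.shape[1]
    hetero = R[:ka, ka:].mean()
    mono_a = R[:ka, :ka][np.triu_indices(ka, k=1)].mean()
    mono_b = R[ka:, ka:][np.triu_indices(kb, k=1)].mean()
    return hetero / np.sqrt(mono_a * mono_b)

# Two synthetic 4-item blocks driven by latent factors correlated at 0.5;
# HTMT then estimates roughly that disattenuated factor correlation.
rng = np.random.default_rng(2)
f1 = rng.normal(size=200)
f2 = 0.5 * f1 + np.sqrt(1 - 0.25) * rng.normal(size=200)
comm_items = f1[:, None] + 0.8 * rng.normal(size=(200, 4))
res_items = f2[:, None] + 0.8 * rng.normal(size=(200, 4))
ratio = htmt(comm_items, res_items)
```

Values well below the common 0.85 cutoff, like the 0.535 reported above, indicate that the two blocks measure distinguishable constructs.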
To determine the level of overlap among the analyzed variables, item–item and item–total correlation analyses were performed. Following the calculation of the coefficients, Item 18 was removed from the analysis matrix, which led to a significant reduction in redundancy risk. The remaining correlations fall within the optimal range (r = 0.30–0.70), indicating sufficient homogeneity. The highest values were recorded between Item 15–Item 20 (r = 0.668), Item 19–Item 23 (r = 0.623), and Item 13–Item 16 (r = 0.598). Although these associations are close to the 0.70 threshold, they do not exceed it, suggesting that the respective items are not strictly redundant.
Overall, the removal of Item 18 contributed to reducing redundancy, and the results obtained support the retention of the remaining items at the exploratory stage. Subsequent confirmatory validation steps, to be conducted on much larger samples, will be able to test both the redundancy risk of the identified clusters and the potential for modeling specific sub-factors.
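The redundancy screening described above can be sketched as a simple correlation scan: flag item pairs whose correlation exceeds the 0.70 threshold and compute corrected item-total correlations (each item against the sum of the remaining items). The demo matrix below is hypothetical, not the study's item data.

```python
import numpy as np

def redundancy_screen(items, threshold=0.70):
    """Return (flagged_pairs, item_total): pairs with |r| above the
    redundancy threshold, and corrected item-total correlations."""
    X = np.asarray(items, dtype=float)
    R = np.corrcoef(X, rowvar=False)
    k = X.shape[1]
    flagged = [(i, j, R[i, j])
               for i in range(k) for j in range(i + 1, k)
               if abs(R[i, j]) > threshold]
    total = X.sum(axis=1)
    item_total = [np.corrcoef(X[:, i], total - X[:, i])[0, 1]
                  for i in range(k)]
    return flagged, item_total

# Hypothetical example: items 0 and 1 are duplicates, item 2 differs,
# so the scan should flag the (0, 1) pair as redundant.
demo = np.column_stack([
    [1., 2, 3, 4, 5, 4, 3, 2, 1, 2],
    [1., 2, 3, 4, 5, 4, 3, 2, 1, 2],
    [5., 1, 4, 2, 5, 1, 3, 2, 4, 1],
])
flagged, item_total = redundancy_screen(demo)
```

In the study, this kind of scan motivated removing Item 18 while retaining pairs in the 0.30–0.70 range as homogeneous but not redundant.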
To assess the risk of common method bias (CMB) in a cross-sectional survey-based design, both procedural measures (anonymity, random ordering of items) and a minimal statistical check were applied. Harman’s single-factor test indicated that the first unrotated factor explained approximately 24.1% of the total variance, well below the commonly used 50% threshold suggesting systematic bias [46]. Furthermore, the factorial separation was supported by the HTMT ratio of 0.535, confirming acceptable discriminant validity between the two factors. Given the small sample size and the exploratory, guiding role of the factors (rather than their use as latent constructs in causal models), a common latent method factor was not estimated, thereby avoiding the over-parameterization of a potentially unstable model. Overall, these results suggest a low probability that the observed relationships were substantially affected by common method bias.
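Harman's single-factor check can be approximated with a principal-component decomposition: the share of variance captured by the first unrotated component equals the largest eigenvalue of the item correlation matrix divided by the number of items. The sketch below uses a hypothetical item battery with a weak common component, not the study's data.

```python
import numpy as np

def first_factor_share(items):
    """Share of total variance captured by the first unrotated
    principal component of the item correlation matrix."""
    R = np.corrcoef(np.asarray(items, dtype=float), rowvar=False)
    eig = np.linalg.eigvalsh(R)   # eigenvalues in ascending order
    return eig[-1] / eig.sum()    # trace of R equals the item count

# Hypothetical 10-item battery with a weak common component, giving a
# first-factor share well under the 50% bias threshold.
rng = np.random.default_rng(4)
g = rng.normal(size=(200, 1))
items = 0.4 * g + rng.normal(size=(200, 10))
share = first_factor_share(items)
```

A share near the study's 24.1% would, as in the text, argue against a dominant method factor.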
Procedures as organizational barriers—Procedures were analyzed separately from latent conditions to reflect the theoretical distinction between factors that facilitate the emergence of errors and procedural barriers intended to prevent them [
2]. Three items were used to assess this dimension: whether procedures are perceived as overly complex or burdensome (Item 27), whether they are embraced by operators as relevant to their workplace and specific tasks (Item 16), and whether operators have the capacity to assimilate and comply with them (Item 28).
Unsafe acts—Drawing on established safety literature [
1,
26,
30], unsafe acts were classified into five categories: perception errors, decoding errors, model errors, decision errors, and violations.
Perception errors (Items 4 and 9) capture situations where the operator fails to correctly understand the task due to environmental conditions such as heat or noise.
Decoding errors (Items 7 and 10) refer to the misinterpretation of codes or signals.
Model errors (Items 12 and 13) occur when the operator performs operations other than those planned.
Decision errors (Items 5, 21, 24, and 26) include behaviors such as failing to promptly report a hazard.
Violations (Items 11, 15 and 17) capture intentional deviations from established safety protocols or supervisory instructions, including knowingly bypassing procedures.
This classification is consistent with the HFACS-OGI, which positions unsafe acts within a broader causal chain that includes supervisory practices, organizational conditions, and regulatory influences. From a systemic perspective, unsafe acts at the “sharp end” are rarely isolated human failings; rather, they often represent the final link in a chain of events shaped by latent organizational factors. By operationalizing errors and violations through specific questionnaire items, the study enables both the quantification of error prevalence and the mapping of each act to potential latent precursors. This approach aligns with Reason’s [
1] model of active and latent failures and reinforces the analytical value of connecting front-line behaviors to upstream organizational conditions.
To strengthen the validity of the results, a triangulation was applied by complementing the quantitative data obtained from the survey with qualitative information (
Appendix B). A total of 11 semi-structured interviews were conducted with both managers (3) and workers (8), in order to capture different perspectives on organizational conditions and unsafe acts. These interviews, lasting between 25 and 30 min, provided contextual insights that enriched the interpretation of the statistical results. The data were analyzed using reflexive thematic analysis [
47]. Transcripts were independently coded by the authors, who identified relevant meaning units that were subsequently organized into overarching themes. Coding consistency was ensured by comparing categories and resolving discrepancies through consensus. The viewpoints emerging from the interviews ensured that the analysis reflected not only individual perceptions but also organizational practices, thereby enhancing the robustness and credibility of the conclusions.
The qualitative analysis also highlighted aspects that could not be captured through the questionnaire. The interviews led to a deeper understanding of the reasons why certain statistical correlations emerge (for example, why communication deficiencies are associated with unsafe acts) and revealed differences in perception between managers and workers regarding the same organizational conditions. Moreover, the qualitative analysis contributed to the identification of additional mechanisms through which insufficient resources or complex procedures generate risky actions. Further elements not included in the questionnaire were also observed, such as organizational culture, time pressure, or interpersonal conflicts. Thus, triangulation provided a more nuanced and credible understanding of the relationship between communication, resources, procedures, and the occurrence of unsafe acts.
2.3. Procedure
Employees were informed about the study through internal communication channels within the company. They were provided with details regarding the estimated completion time (5–7 min) and were assured that the collected data would be processed exclusively for scientific purposes. Respondents were explicitly informed that no personal data would be disclosed and that no identifying information, such as names or email addresses, would be collected.
The questionnaire was administered individually, without the intervention of an interviewer. Given the organizational context and the aim of encouraging candid responses, data collection was conducted in a paper-based format during dedicated time slots provided by the employer.
A total of 48 questionnaires were distributed to all company employees. Two were excluded from the analysis due to incomplete responses, with more than 20% of answers missing for essential questions. This approach is consistent with established survey methodology practices, which emphasize minimizing bias arising from missing data. The final valid sample therefore comprised 46 respondents.
To ensure data reliability, responses were screened for inconsistencies, particularly where answers to positively and negatively worded items could contradict each other in an implausible manner.
2.4. Data Analysis
The hypotheses were tested using ordinal logistic regression models, selected to account for the ordinal nature of the dependent variables. All unsafe acts items were measured on a five-point Likert scale, indicating increasing degrees of severity or frequency. This approach allows for directly linking changes in organizational factors—communication, resources, and procedures—to changes in the probability of more severe unsafe behaviors, which is central to the study’s objectives.
The modelling procedure followed four steps:
H1: Communication factor predictors—Each of the four communication items was treated as an independent variable predicting each unsafe act item in a separate ordinal logistic regression.
H2: Resource factor predictors—Each of the four resource items was similarly tested as a predictor for each unsafe act item in an ordinal logistic regression.
H3: Procedural factor predictors—Each of the three procedural items was tested as a predictor for each unsafe act item in an ordinal logistic regression.
H4: Exploratory assessment of indirect associations—All significant correlations from H1, H2, and H3 were subsequently tested using partial correlations, with the remaining significant organizational factors (communication, resources, procedures) introduced as control variables for each specific error category. This step was designed to evaluate the direct associations between organizational conditions and unsafe acts, while controlling for the influence of other factors, in line with the HFACS-OGI framework’s emphasis on the interplay between latent conditions and active errors.
The use of ordinal logistic regression in this context not only respects the measurement properties of the data but also aligns with the study’s systemic perspective, allowing for a nuanced understanding of how variations in organizational factors influence the severity of unsafe acts in refinery maintenance operations.
Given the ordinal measurement scale, ordinal logistic regression was considered an appropriate method for analyzing the relationships between the variables. Considering the small sample size (n = 46) specific to this exploratory research, classical estimates based on asymptotic assumptions may be unstable and susceptible to Type I or Type II errors. To address these limitations, a bootstrap procedure with 10,000 replications was applied. The statistical analyses were conducted in SPSS v.26, using bootstrap with case resampling and without stratification. This approach provided more robust estimates of standard errors and confidence intervals, thereby reducing the sensitivity of the results to fluctuations caused by the small sample size. Integrating bootstrapping within ordinal logistic regression supported the exploratory identification of relationships among variables, where the main objective was not the testing of definitive hypotheses but rather the identification of trends and directions of association that may support subsequent confirmatory analyses on larger samples.
The adequacy of the ordinal logistic regression models was evaluated through a series of standard statistical procedures designed to confirm the validity and robustness of the results.
First, the Model Fitting Information section was examined to determine whether the inclusion of predictors in the model resulted in a statistically significant improvement compared to the baseline (intercept-only) model. A Chi-square test value with p < 0.05 was considered significant, indicating that the model with predictors described the data better than the null model, thus satisfying the first validation criterion.
Second, the overall fit of the model was assessed using Goodness-of-Fit indicators, specifically the Pearson and Deviance tests. A non-significant result (p > 0.05) for both indicators was interpreted as showing that no systematic differences existed between the observed and estimated values, thereby confirming a good fit of the model to the data.
Third, the proportional odds assumption was evaluated using the Test of Parallel Lines. A non-significant result (p > 0.05) indicated that the assumption of parallel lines was met, validating the applicability of ordinal logistic regression for the analyzed data.
In addition, β coefficients were interpreted in terms of both statistical significance and direction of effect. A statistically significant β coefficient (p < 0.05) indicated that the associated predictor significantly influenced the probability of transitioning to a higher category of the dependent variable. The sign of the coefficient (positive or negative) reflected the direction of the relationship, while its transformation into an odds ratio (OR = eβ) enabled interpretation of the effect in terms of changes in odds.
Only those regression models meeting all of the following conditions were retained for analysis: statistically significant improvement over the null model, adequate fit according to the Pearson and Deviance tests, compliance with the proportional odds assumption, and statistically significant Beta coefficients with coherent interpretation. This ensured the validity of the models used in the analysis and supported the relevance of the conclusions drawn.
3. Results
When interpreting the results, note that the results tables report two key parameters: the Beta coefficient (β) and the odds ratio (OR). In the logistic regression models calculated, the β coefficient represents the estimated change in the log-odds (the natural logarithm of the odds) of the occurrence of the analyzed event, in this case unsafe acts, associated with a one-unit increase in the Likert scale score for a given item, while holding all other variables in the model constant. A positive β indicates that, as the perceived severity of the latent condition increases, the probability of human error occurring increases accordingly.
The exponential transformation of β (eβ), referred to as the odds ratio (OR), directly expresses the multiplicative change in the odds of the event occurring for each unit increase in the predictor. OR values greater than 1 indicate risk factors, whereas values below 1 indicate a protective effect.
Reporting both metrics is essential: β retains the statistical scale and direction necessary for model evaluation and comparison across predictors, while OR conveys the practical magnitude of the effect, enabling applied interpretation in the context of safety management. The 95% confidence intervals for the beta coefficients were also reported, based on 10,000 bootstrap resamples.
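The OR = eβ transformation can be checked directly against the coefficients reported in Section 3.1: exponentiating each reported β reproduces the corresponding odds ratio to three decimals.

```python
import math

# Beta coefficients and odds ratios as reported in Section 3.1 (Table 3);
# OR = exp(beta) reproduces each reported value to three decimals.
reported = [
    ("Item 6 -> perception errors",  1.202, 3.327),
    ("Item 19 -> perception errors", 0.706, 2.026),
    ("Item 23 -> decision errors",   1.653, 5.223),
]
for label, beta, or_reported in reported:
    print(f"{label}: beta={beta:.3f} -> OR=exp(beta)={math.exp(beta):.3f} "
          f"(reported {or_reported})")
```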
3.1. Communication Deficiencies and Operators’ Unsafe Acts
The ordinal logistic regression analyses corresponding to hypothesis H1 (
Table 3) suggest that deficiencies in communication may increase the likelihood of unsafe acts across multiple error categories. All significant coefficients were positive, indicating that higher perceived severity of communication problems systematically raised the probability of errors.
Specifically, the absence of complete task-related information (Item 6) emerged as a predictor for perception errors (β = 1.202, OR = 3.327), suggesting that each one-unit increase on the Likert scale more than triples the odds of misperceiving operational requirements. Contradictory information (Item 19) was also associated with perception errors (β = 0.706, OR = 2.026), effectively doubling the risk of misinterpretation under conflicting task instructions.
Incomplete transmission of details (Item 22) showed consistent effects across several unsafe acts, including perception errors, decision errors, and violations, with β ranging from 1.004 to 1.345 and corresponding OR values between 2.729 and 3.838. These results indicate that insufficiently detailed communication substantially increases the likelihood of both unintentional errors and deliberate deviations.
Unclear or ambiguous language (Item 23) was found to be a particularly critical risk factor, with β values up to 1.653 and OR values up to 5.223 for decision errors, meaning that the odds of unsafe decision-making may increase more than fivefold when task-related communication lacks clarity.
Taken together, these findings offer preliminary indication that communication deficiencies operate as transversal determinants of unsafe acts. Rather than being confined to a single error category, poor communication simultaneously undermines perceptual accuracy, cognitive modelling, and decision-making, while also facilitating procedural violations. This systemic influence is consistent with the HFACS-OGI framework, which conceptualizes communication failures as latent organizational conditions that propagate through multiple pathways to generate active failures.
From a practical standpoint, the results underline the need to implement robust communication protocols in refinery operations, such as structured shift handovers, standardized reporting templates, and verification mechanisms (e.g., read-backs). By institutionalizing these practices, organizations can mitigate the latent risks associated with communication breakdowns and reduce the overall probability of unsafe acts.
3.2. Inadequate Resources and Latent Conditions for Unsafe Acts
The ordinal logistic regression analyses corresponding to hypothesis H2 (
Table 4) suggest that deficiencies in resources may significantly increase the probability of unsafe acts across several error categories. All statistically significant coefficients were positive, underscoring that inadequate resources function as amplifiers of risk within refinery maintenance operations.
Work overload (Item 8) was associated with a higher probability of decoding and modelling errors (β = 0.575, OR = 1.777; β = 0.891, OR = 2.436, respectively), suggesting that excessive cognitive and operational demands compromise both the interpretation of signals and adherence to planned work processes. This aligns with resilience engineering perspectives, which emphasize that overloaded operators have reduced adaptive capacity under time pressure, thereby increasing susceptibility to active errors.
Unclear allocation of responsibilities within teams (Item 20) emerged as one of the strongest predictors, with β coefficients between 1.196 and 2.141 and odds ratios ranging from 3.307 to 8.508. These results imply that when role definitions are vague, the probability of procedural violations can increase more than eightfold, while the risk of modelling and decision errors also triples. Such findings suggest that ambiguity in task distribution undermines accountability structures and fosters unsafe improvisations.
The lack of adequate equipment and procedural support (Item 14) also showed consistent effects, doubling the likelihood of model errors, decision errors, and violations (β between 0.715 and 0.887; OR between 2.044 and 2.428). These results highlight that deficiencies in material and informational resources compromise both compliance and decision accuracy.
Finally, delays in managerial decision-making (Item 25) were significantly linked to model errors (β = 0.651, OR = 1.917) and decision errors (β = 1.392, OR = 4.023). This suggests that organizational inertia in critical situations substantially increases the risk of erroneous or unsafe decisions, underscoring the importance of timely support from supervisors.
Taken together, these findings tentatively suggest that resource inadequacies may act as active determinants of unsafe acts. They operate transversally across error categories, affecting perception, modelling, decision-making, and rule compliance. This systemic influence is consistent with the HFACS-OGI framework, which conceptualizes resource constraints as latent organizational conditions that propagate through multiple pathways to generate active failures.
From an applied perspective, the results underline that resource adequacy must be managed comprehensively. Interventions should focus on workload balancing, clear role allocation, timely managerial responsiveness, and the provision of appropriate tools and procedural support. Such measures address both tangible and intangible dimensions of resources, transforming them from latent risk multipliers into protective factors that strengthen organizational resilience.
3.3. Procedural Complexity and Deviations from Standard Work Methods
The ordinal logistic regression analyses corresponding to hypothesis H3 suggested that perceptions of procedural complexity and misalignment significantly increase the likelihood of unsafe acts across multiple categories of human error (
Table 5). All significant β coefficients were positive, indicating that when procedures are viewed as excessively complex, poorly aligned with actual work, or burdensome, the probability of unsafe acts rises systematically.
The degree to which standard procedures reflect operators’ own working methods (Item 16) was a consistent predictor of unsafe behavior, with β values ranging from 0.540 to 1.004 and corresponding ORs between 1.716 and 2.729. These results suggest that when formal procedures diverge from intuitive or customary practices, the likelihood of modelling errors, decision errors, and even violations nearly doubles.
Overly detailed or overloaded procedures (Item 27) emerged as one of the strongest predictors, with β coefficients ranging from 0.893 to 1.527 and ORs between 2.442 and 4.604. This indicates that procedural overload more than doubles, and in some cases quadruples, the odds of errors in perception, modelling, and decision-making, while also significantly increasing the probability of deliberate violations.
Similarly, the misalignment between procedural sequences and the operators’ intuitive task order (Item 28) was associated with unsafe acts. Coefficients ranged from 0.588 to 1.063, corresponding to ORs between 1.800 and 2.895. These results tentatively indicate that when the logical flow of procedures is perceived as inconsistent with practical task execution, operators may be more likely to commit decoding errors, modelling errors, and intentional violations, with the odds of non-compliance nearly tripling.
Taken together, these findings offer preliminary indications that procedures are not merely passive reference documents but critical organizational barriers that directly influence safety outcomes. When procedures are poorly adapted, excessively complex, or counterintuitive, they cease to function as protective mechanisms and instead create latent conditions that promote unsafe acts. This dynamic is consistent with the HFACS-OGI framework, which conceptualizes procedural inadequacies as latent organizational factors that erode systemic defenses and channel upstream organizational weaknesses into active failures.
From a practical perspective, these results underline the need to shift from a compliance-driven procedural culture toward a usability-oriented approach. Organizations should engage frontline operators in the design and periodic revision of procedures, ensuring that they are concise, relevant, and aligned with operational realities. Such co-designed procedures not only enhance compliance but also strengthen the role of procedures as safety barriers rather than latent risk factors.
To ensure the validity of inferences in the presence of multiple testing, the Benjamini–Hochberg procedure [
48] was applied to control the false discovery rate (FDR) at q = 0.05 across the family of 33 tests performed (
Appendix C). In the present data, all 33 effects remained significant after adjustment, which indicates that, under the standard assumptions of independence or positive dependence, the average proportion of false discoveries is controlled at 5%. Accordingly, the empirical pattern of associations identified is not an artifact of multiple testing but remains robust under a correction framework recognized for its statistical power. Substantively, this result lends global credibility to the conclusions of the analysis: the statistical signals are consistent and pervasive across the family of tests, supporting the interpretation that the examined relationships are systematically present in the studied sample. At the same time, we maintain the caution appropriate to an exploratory study: the estimates should be viewed as strong but not definitive indications, requiring corroboration on independent samples and, ideally, within longitudinal designs. Overall, the application of the FDR-BH procedure at q = 0.05 strengthens the conclusiveness of the study: the findings remain significant after accounting for multiplicity, providing a credible statistical basis for interpreting and prioritizing both practical implications and directions for future research.
3.4. Communication as a Control Variable
The partial correlation analyses presented in
Table 6 assessed whether the relationship between organizational predictors and unsafe acts persisted after controlling for communication deficiencies. The results indicate that communication exerts a significant control effect, often weakening or eliminating associations once its influence is statistically accounted for (
Table 6).
For instance, the relationship between overloaded procedures (Item 27) and perception errors (Item 9) decreased from a moderate and significant correlation (r = 0.449, p = 0.002) to a weaker association when controlling for unclear language (Item 23), with the partial correlation dropping to r = 0.312 (p = 0.037). Although still marginally significant, the effect size reduction highlights that a considerable portion of the link between procedural overload and perceptual errors is attributable to communication problems.
Similarly, the association between unclear task allocation (Item 20) and model errors (Item 13) diminished from r = 0.396 (p = 0.006) to a non-significant r = 0.212 (p = 0.162) once communication was controlled. This finding suggests that communication deficiencies fully account for the observed relationship, functioning as a key explanatory mechanism.
The same pattern emerged for overloaded procedures (Item 27) and model errors (Item 13), where the initial correlation of r = 0.425 (p = 0.003) was attenuated to a non-significant r = 0.246 (p = 0.103) when controlling for communication. Likewise, the correlation between unclear task allocation (Item 20) and violations (Item 21) decreased from r = 0.361 (p = 0.014) to r = 0.178 (p = 0.242) under the same control, effectively eliminating the direct association.
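The attenuation pattern above follows from the first-order partial correlation formula, r_xy.z = (r_xy - r_xz * r_yz) / sqrt((1 - r_xz^2)(1 - r_yz^2)). The sketch below applies it to synthetic data in which two variables are related only through a shared control variable, mimicking (not reproducing) the reported pattern.

```python
import numpy as np

def partial_corr(rxy, rxz, ryz):
    """First-order partial correlation of x and y, controlling for z."""
    return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

rng = np.random.default_rng(7)

# Synthetic check: x and y are related only through a shared variable z
# (e.g., a communication deficiency), so the bivariate correlation should
# shrink toward zero once z is partialled out.
n = 46
z = rng.normal(size=n)
x = 0.7 * z + rng.normal(size=n)
y = 0.7 * z + rng.normal(size=n)

r = np.corrcoef([x, y, z])
rxy, rxz, ryz = r[0, 1], r[0, 2], r[1, 2]
print(f"bivariate r = {rxy:.3f}, partial r = {partial_corr(rxy, rxz, ryz):.3f}")
```

The same quantity can equivalently be computed as the correlation of the residuals from regressing x and y on z.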
Taken together, these results tentatively suggest that communication operates as a transversal organizational determinant shaping the manifestation of unsafe acts. Once communication deficiencies are accounted for, several direct pathways from organizational predictors to unsafe acts either weaken substantially or disappear altogether. This indicates that the effects of procedural overload and resource ambiguities on errors and violations are largely transmitted through communication mechanisms.
From a theoretical perspective, this finding is consistent with the HFACS-OGI framework, providing emerging evidence that communication may be not only an independent predictor of unsafe acts but also a critical channel through which other organizational factors exert their influence [
49,
50]. Practically, it underscores the importance of communication management as a leverage point for improving safety. Ensuring clarity, consistency, and completeness in task-related communication may disrupt the propagation of latent organizational weaknesses into active failures, thereby reducing the overall probability of unsafe acts [
51,
52].
3.5. Resources as a Control Variable
The partial correlation analyses presented in
Table 7 evaluated the extent to which relationships between organizational conditions and unsafe acts persisted once resource adequacy was statistically controlled. The results demonstrate that resources exert a significant control effect, often weakening or fully eliminating direct associations.
For example, the relationship between contradictory information (Item 19) and violations (Item 15) was initially significant (r = 0.454, p = 0.002), but after controlling for unclear task allocation (Item 20), the partial correlation decreased to r = 0.288 (p = 0.055), losing statistical significance. This suggests that a considerable part of the link between contradictory communication and non-compliance is in fact channeled through resource allocation.
A similar pattern was observed in the relationship between insufficient equipment/procedural support (Item 14) and model errors (Item 13). The bivariate correlation was moderate and significant (r = 0.318, p = 0.031), but the partial correlation dropped to r = 0.197 (p = 0.194) once resources were controlled, indicating that the effect of inadequate equipment on modelling errors is largely explained by resource distribution issues.
Task detail provision (Item 22) also showed an initial significant correlation with decision errors (Item 5, r = 0.329, p = 0.025). However, when controlling for resource adequacy (Item 20), the association diminished sharply to r = 0.115 (p = 0.452), effectively disappearing. This implies that detailed task information only prevents decision errors when supported by adequate resources, underscoring the conditional role of resources in error prevention.
In contrast, the association between overloaded procedures (Item 27) and decision errors (Item 5) remained significant even after controlling for resources, although reduced in magnitude (from r = 0.449, p = 0.002 to r = 0.315, p = 0.035). This partial effect suggests that while resource adequacy accounts for a large portion of the variance, procedural overload still exerts an independent influence on unsafe decision-making.
Taken together, these findings lend tentative support to the interpretation that resources may be a relevant organizational mechanism shaping the manifestation of unsafe acts. When resource adequacy is statistically controlled, many previously significant relationships lose significance, highlighting that communication deficiencies and procedural overload often influence unsafe behavior indirectly, through their impact on perceived and actual resource availability.
From a theoretical perspective, this appears consistent with the HFACS-OGI framework by suggesting that resources act both as direct determinants of unsafe acts and as possible indirect channels through which other organizational conditions exert their effects. From a practical standpoint, the results stress the necessity of prioritizing resource adequacy—defined broadly to include workload distribution, clarity of role allocation, equipment provision, and decision-making support—as a cornerstone of organizational safety strategies.
3.6. Procedures as a Control Variable
The partial correlation analyses presented in
Table 8 examined whether the relationships between organizational predictors and unsafe acts persisted once procedural factors were statistically controlled. The results show that procedures exert a significant control effect, at times absorbing the influence of other organizational variables, and thereby shaping the pathways through which unsafe acts emerge.
In the first model, the relationship between delays in decision-making (Item 25) and model errors (Item 13) was significant at the bivariate level (r = 0.420, p = 0.004). However, after controlling for procedural adaptability (Item 16), the partial correlation dropped to r = 0.243 (p = 0.108), losing significance. This attenuation suggests that much of the association between slow decision-making and model errors is transmitted through the way procedures are aligned (or misaligned) with operational realities.
A similar pattern was observed in the relationship between unclear or ambiguous language (Item 23) and model errors (Item 13). The bivariate correlation was strong and significant (r = 0.458, p = 0.001), and while the partial correlation remained significant after controlling for procedures (r = 0.332, p = 0.026), the reduction in effect size suggests a possible indirect association. This highlights that both poor communication and procedural misalignment jointly contribute to the occurrence of model errors, with procedures amplifying or attenuating the impact of communication deficiencies.
Finally, the link between incomplete task details (Item 22) and decision errors (Item 5) diminished from r = 0.329 (p = 0.025) to a non-significant r = 0.185 (p = 0.224) once procedural overload (Item 27) was controlled. This pattern appears to suggest that the effect of incomplete task information on unsafe decision-making may be channeled through procedural design. In other words, insufficient detail becomes a risk factor only when compounded by procedures perceived as complex or excessive.
Taken together, these findings suggest that procedures are not passive background conditions but active control mechanisms that structure the influence of communication and resource-related factors on unsafe acts. By statistically absorbing their effects, procedures emerge as a central organizational determinant, capable of either mitigating or exacerbating the risks associated with other latent conditions.
From a theoretical standpoint, this reinforces the HFACS-OGI framework, providing preliminary evidence that procedures serve as both barriers and conduits in the latent-to-active error chain. From a practical perspective, it points to the importance of designing procedures that are not only comprehensive but also adaptable and user-friendly. Poorly designed or overly complex procedures risk nullifying the benefits of good communication and adequate resources, whereas well-adapted procedures can break the chain of causation leading from organizational weaknesses to active failures.
3.7. Qualitative Analysis
The qualitative analysis reinforced the statistical results and provided deeper insight into the mechanisms through which organizational conditions shape unsafe acts in refinery maintenance operations. Five key themes emerged: communication deficiencies, resource constraints, procedural complexity and usability, divergent perspectives across organizational levels, and additional systemic mechanisms.
Communication emerged as a central theme and was frequently described as a precursor to unsafe acts. Workers emphasized incomplete or vague instructions, noting that critical details were sometimes missing at the outset of tasks: “Sometimes the instructions we receive from supervisors are not complete. We start the task and only later discover we were missing critical details.” (Worker, Interview 4). Managers also acknowledged communication gaps but tended to frame them in terms of clarity and time pressure: “I often have to clarify decisions twice, because the first message is vague. This delays the work and increases the chance of mistakes.” (Manager, Interview 2).
With regard to resources, both managers and workers highlighted the impact of limited personnel and equipment availability on safety. Workers reported that short-staffed shifts often led to risk-taking: “We are expected to finish complex operations with fewer people than needed. When shifts are short-staffed, everyone cuts corners.” (Worker, Interview 7). Equipment shortages were also a recurring concern: “Equipment availability is a constant issue. If the proper tool is not there, workers improvise, which is risky.” (Worker, Interview 9).
Procedural documentation was consistently described as overly detailed and difficult to apply in practice. As one worker explained: “Some procedures are too long and detailed. In practice, people skip steps they see as unnecessary.” (Worker, Interview 6). Managers recognized that bureaucratic requirements sometimes exceeded practical needs: “The paperwork takes more time than the actual maintenance task. This pushes workers to bypass formalities.” (Manager, Interview 1).
The analysis revealed differences in how managers and workers perceive the same organizational conditions. Workers stressed the gap between procedures and field realities: “From management’s point of view, procedures are clear. But when you’re on the field, conditions change and the instructions don’t always match reality.” (Worker, Interview 3). Managers, however, attributed difficulties to time pressure rather than unclear documentation: “Workers often claim the procedures are confusing, but the documents are designed to be comprehensive. The real issue is time pressure.” (Manager, Interview 2).
Beyond the three focal dimensions, the interviews also highlighted broader systemic influences not captured in the questionnaire. Workers pointed to a pervasive “rushing culture”: “Beyond communication and resources, there is a culture of rushing. Safety sometimes comes second to deadlines.” (Worker, Interview 8). Managers noted that interpersonal dynamics could undermine safety: “Conflicts between teams create noise and misunderstandings that no procedure can fully prevent.” (Manager, Interview 3).
Taken together, these qualitative findings contextualize the quantitative results by demonstrating how communication deficiencies, inadequate resources, and procedural complexity not only increase the potential for unsafe acts but also interact dynamically under conditions of time pressure and organizational culture [53,54]. The triangulation of survey data and interview evidence strengthens the credibility of the conclusions and underscores the systemic nature of unsafe acts in refinery maintenance operations.
4. Discussion
The present study investigated the role of organizational factors, specifically communication, resources, and procedural complexity, in predicting, and possibly indirectly influencing, the occurrence of unsafe acts within refinery maintenance operations. By integrating ordinal logistic regression modelling with partial correlation analysis, the research not only provided preliminary indications of direct associations between organizational deficiencies and human error but also suggested possible indirect pathways that link broader organizational contexts to active failures. The findings are consistent with systemic accident causation theories [24,26,34,55], which conceptualize unsafe acts as the downstream manifestations of upstream latent conditions.
The results suggest that communication deficiencies, defined here as incomplete, contradictory, or unclear task-related information, may increase the likelihood of perception, model, and decision errors. Ordinal logistic regression models indicated that increases in perceived communication problems were associated with higher log-odds of these errors occurring, even when controlling for other organizational factors. This is consistent with earlier work in high-risk sectors showing that breakdowns in communication erode situational awareness, impair decision-making, and disrupt the mental models necessary for accurate task execution [2,50,54].
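The “higher log-odds” reading of these models can be illustrated with a minimal sketch of a proportional-odds (cumulative logit) specification. All numbers below are hypothetical and chosen for illustration only; they are not the study’s estimates.

```python
import math

def cumulative_probs(x, beta, cutpoints):
    """P(Y <= k) under a proportional-odds (cumulative logit) model:
    logit P(Y <= k) = theta_k - beta * x."""
    return [1.0 / (1.0 + math.exp(-(theta - beta * x))) for theta in cutpoints]

# Hypothetical values for illustration only (not the study's estimates):
beta = 0.65                   # log-odds shift per unit of perceived communication problems
cutpoints = [-1.0, 0.5, 2.0]  # thresholds separating four ordered error-frequency categories

low = cumulative_probs(1, beta, cutpoints)   # low perceived communication problems
high = cumulative_probs(4, beta, cutpoints)  # high perceived communication problems

# A positive beta shifts probability mass toward higher error categories:
# P(Y <= k) falls at every threshold as the predictor increases.
for k, (p_lo, p_hi) in enumerate(zip(low, high), start=1):
    print(f"P(Y <= {k}): x=1 -> {p_lo:.3f}, x=4 -> {p_hi:.3f}")
```

The single slope applied across all thresholds is what makes the model “proportional odds”: one β summarizes the predictor’s effect on the entire ordered outcome.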
From a safety science perspective, communication deficiencies function as a latent condition in the HFACS-OGI framework [50,51,52]. While they may not immediately trigger an accident, they degrade the integrity of operational information flows, increasing the susceptibility of frontline personnel to misperception or misinterpretation of operational cues. Partial correlation analysis offered preliminary evidence of this role, indicating that communication partially explains or shapes the relationship between the general organizational context and both perception and decision errors. This means that part of the effect of unfavorable organizational conditions on unsafe acts may operate through the mechanism of impaired communication.
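The logic of partial correlation used here can be sketched in a few lines: residualize both variables on the control, then correlate the residuals. The data below are simulated for illustration, under the assumption that a shared organizational-context score drives both a communication score and an error score; they are not the study’s data.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after removing the linear influence of z:
    residualize both variables on z, then correlate the residuals."""
    Z = np.column_stack([np.ones_like(z), z])          # design matrix with intercept
    rx = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]  # residuals of x ~ z
    ry = y - Z @ np.linalg.lstsq(Z, y, rcond=None)[0]  # residuals of y ~ z
    return np.corrcoef(rx, ry)[0, 1]

# Simulated (hypothetical) data: organizational context z drives both
# communication problems x and error frequency y.
rng = np.random.default_rng(0)
z = rng.normal(size=300)
x = z + 0.4 * rng.normal(size=300)
y = z + 0.4 * rng.normal(size=300)

raw_r = np.corrcoef(x, y)[0, 1]  # strong zero-order association
part_r = partial_corr(x, y, z)   # shrinks once z is controlled for
print(f"zero-order r = {raw_r:.2f}, partial r = {part_r:.2f}")
```

When the zero-order correlation shrinks substantially after controlling for the third variable, as in this construction, the shared factor accounts for much of the observed association, which is the pattern the study interprets as an indirect pathway.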
This finding emphasizes the dual role of communication in safety performance: it is both a direct risk factor for error and a pathway through which other organizational deficiencies translate into active failures. Such a conclusion aligns with Dekker’s [36] emphasis on communication not merely as an operational process but as a safety-critical control function. Organizations that fail to implement robust verification and feedback loops for task-related communication risk amplifying the impact of otherwise manageable operational hazards.
The adequacy of resources emerged as another salient predictor of unsafe acts. Items capturing work overload, lack of necessary equipment, unclear task allocation, and delays in decision-making were significantly associated with violations and decision errors. The regression coefficients indicated that as perceptions of resource inadequacy increased, so too did the probability of these unsafe acts.
This relationship is well supported by systemic accident literature. Hollnagel’s [13] resilience engineering approach emphasizes that resource constraints narrow the margins for safe adaptation, while Perrow’s [3] Normal Accident Theory describes how complexity and tight coupling make systems more vulnerable to accidents when resources are insufficient. In the present study, partial correlation analyses revealed that inadequate resources amplified the effect of other organizational conditions, acting as an indirect channel through which structural weaknesses affected performance.
From a practical perspective, resource inadequacy should be viewed not only in terms of tangible assets such as tools and equipment but also in terms of intangible resources like time and decision-making capacity. Overloaded operators are more prone to procedural deviations, not necessarily due to negligence but as a coping mechanism under constrained conditions. This finding reinforces the need for workload management and timely provision of necessary resources as core components of safety management systems.
The analysis also suggested that procedural design may influence operator behaviour. Items measuring perceptions of procedural complexity, lack of relevance, and difficulty of assimilation were significantly linked to procedural violations. The literature [31,32,56] has long recognized that overly rigid or poorly adapted procedures can create a mismatch between prescribed work methods and the realities of operational practice, thereby encouraging the development of informal workarounds and creating additional potential for unsafe acts.
The partial correlation results showed that procedural complexity also influenced cognitive errors (model and decision errors), suggesting that operators who perceive procedures as impractical may disengage from formal guidance altogether, increasing reliance on potentially flawed personal judgement. This is a critical insight: procedures are often conceived as barriers against unsafe acts, but when poorly designed, they may paradoxically contribute to risk.
The findings tentatively suggest the need for a shift from a compliance-driven procedural culture to a usability-focused one. Procedures should be regularly reviewed and updated in consultation with frontline personnel to ensure they are relevant, concise, and adapted to actual working conditions. Involving operators in procedural design could reduce the perception of irrelevance and enhance compliance.
The findings also indicate that violations, understood as deliberate acts by operators who intentionally disregard work instructions or industry regulations, are significantly influenced by organizational factors. While such acts of volition may undoubtedly be classified as behaviors that could trigger disciplinary, civil, or even criminal liability depending on the circumstances, they do not occur in a vacuum. Instead, they emerge against a backdrop of complex organizational conditions that may facilitate or tacitly tolerate such deviations. Factors such as inadequate resources, procedural misalignment, or deficient communication can create operational contexts in which violations become more likely, whether as coping mechanisms, adaptive responses to impractical constraints, or expressions of resistance to perceived inefficiencies. This perspective reinforces the importance of analyzing violations not solely through an individual blame lens but also within the systemic framework in which they occur, in line with contemporary safety science approaches that emphasize organizational context as a determinant of behavior [1,13,57,58].
An important methodological point in this study is the reporting of both Beta coefficients (β) and odds ratios (OR) for all regression models. The β coefficient retains the statistical scale and direction necessary for evaluating model fit and comparing predictors, while the OR offers a more intuitive understanding of effect magnitude for applied safety management. For instance, an OR greater than 1 clearly communicates that each unit increase in the predictor multiplies the odds of the unsafe act occurring, whereas an OR below 1 indicates a protective effect. By presenting both metrics, the analysis bridges the gap between statistical validity and practical applicability.
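The relationship between the two metrics is simply OR = exp(β). A minimal sketch with hypothetical coefficients (labeled as such; they are not the study’s estimates) makes the interpretation concrete:

```python
import math

# Hypothetical ordinal-regression coefficients, for illustration only:
coefs = {
    "communication_problems": 0.55,   # beta > 0 -> OR > 1, risk-increasing
    "resource_inadequacy":    0.40,
    "procedure_usability":   -0.30,   # beta < 0 -> OR < 1, protective
}

for name, beta in coefs.items():
    odds_ratio = math.exp(beta)  # each unit increase multiplies the odds by OR
    direction = "increases" if odds_ratio > 1 else "decreases"
    print(f"{name}: beta = {beta:+.2f}, OR = {odds_ratio:.2f} "
          f"({direction} the odds of an unsafe act)")
```

Because exp(0) = 1, a null effect on the log-odds scale corresponds exactly to an odds ratio of 1, which is why the two metrics carry the same statistical information in different units.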
The combination of direct and partial correlation pathways identified in this study has clear implications for organizational safety strategies. First, communication systems must include verification mechanisms, such as read-backs, cross-checks, and structured shift handovers, to ensure clarity and completeness of task information. Second, resourcing decisions should explicitly account for workload distribution, availability of equipment, and speed of managerial decision-making to prevent the emergence of latent conditions. Third, procedural frameworks must balance standardization with operational flexibility, ensuring that rules are both applicable and accepted by those who must implement them.
Importantly, these findings suggest that addressing unsafe acts solely at the operator level, for example, through training or disciplinary action, will be insufficient if upstream organizational deficiencies remain unaddressed. Safety interventions must target the systemic roots of unsafe acts, integrating changes to organizational processes, structures, and resource allocation.
Moreover, the study suggests that the effectiveness of managerial actions is inherently conditioned by the understanding and overcoming of human barriers rooted in the principle of local rationality. In high-risk industrial environments, operators often make decisions that appear suboptimal when judged retrospectively but are entirely reasonable within the constraints, information, and pressures present at the time of action [51]. Recognizing and addressing these locally rational behaviors is essential for designing managerial strategies that enhance safety performance, prevent recurrent errors, and align operational practices with organizational objectives.
The qualitative analysis provided important context for the quantitative results by illustrating how organizational factors translate into unsafe acts in daily operations. Communication, resources, and procedural design emerged as recurrent themes, but interviews also revealed that these dimensions interact dynamically under time pressure and within a broader organizational culture that often prioritizes productivity over safety. This finding aligns with the HFACS-OGI framework, which conceptualizes unsafe acts as the end point of multilevel systemic influences, and with system-theoretic models that emphasize the interplay between latent conditions and active failures. Workers’ and managers’ divergent perspectives, where frontline staff emphasize gaps between procedures and field realities while managers highlight time constraints, further demonstrate how organizational factors are interpreted differently across hierarchical levels. By integrating these qualitative insights with statistical evidence, the study underscores that unsafe acts cannot be reduced to individual failings but should be understood as products of complex socio-technical interactions.
Beyond its theoretical and methodological contributions, the present study also provides practical insights for those directly involved in post-incident analysis and accountability assessment. Incident investigators, whether engineers, criminal investigation bodies, or judges, are often required to determine the degree of operator culpability when an incident occurs in a refinery or when a disciplinary breach is suspected. For legal professionals in particular, it is crucial to grasp the inherent technical and organizational complexity of work in high-risk sectors. By providing preliminary evidence that operator error potential may be shaped by organizational factors such as communication deficiencies, inadequate resources, and procedural design, this research advances an exploratory systemic perspective that helps decision-makers contextualize individual actions. Such an approach fosters more balanced and evidence-based attributions of responsibility, avoiding an overly narrow focus on individual performance while acknowledging the organizational backdrop against which unsafe acts occur. Accordingly, when responsibility for an incident must be established, a more comprehensive assessment is required, one that examines the extent to which the organization, through the aforementioned factors, has contributed to the unsafe behavior of the operator that ultimately led to the incident under judicial review.
All interpretations presented in this section derive from an exploratory study; the results will be examined in greater depth in future research based on larger samples and extended to other sectors of activity.
5. Conclusions
This study extends the HFACS-OGI framework by offering preliminary exploratory evidence that organizational factors, specifically deficiencies in communication, resources, and procedures, act both as independent predictors and as possible indirect pathways in the occurrence of unsafe acts in refinery maintenance operations. The partial correlation approach applied here links descriptive accident causation models to empirical hypothesis testing, tentatively suggesting that latent organizational conditions may channel systemic influences into distinct categories of human error.
By categorizing unsafe acts into perception errors, decoding errors, model errors, decision errors, and violations, the analysis achieved a level of specificity that enables targeted identification of relevant organizational predictors. This aligns with calls in safety science for greater precision in mapping latent conditions to active failures. Importantly, the findings reveal that violations, intentional breaches of work instructions or safety regulations, are not solely individual choices but are also shaped by complex organizational contexts. While such acts may warrant disciplinary, civil, or criminal liability, their occurrence is facilitated by systemic factors such as unrealistic procedural demands, insufficient resources, or inconsistent rule enforcement.
Methodologically, the use of ordinal logistic regression models was appropriate for the Likert-scale dependent variables and enabled a detailed interpretation of both the statistical (Beta coefficients) and practical (odds ratios) significance of predictors. The results tentatively suggest that improving communication clarity, ensuring adequate resources, and designing practical, relevant procedures can directly reduce unsafe acts and indirectly lower risk by disrupting possible indirect pathways from organizational deficiencies to operator behavior.
The study’s exploratory nature and relatively small sample size (n = 46) limit the generalizability of the results. Some factors showed modest reliability coefficients, acceptable for exploratory analysis but indicative of the need for refinement. The reliance on self-reported data may introduce bias, although procedural safeguards such as anonymity and randomized item ordering were implemented. The cross-sectional design prevents conclusions about temporal causality.
Future research should aim to replicate these findings in larger and more diverse samples, ideally across multiple organizations and high-risk industries. Longitudinal designs would allow for assessment of the stability of relationships over time and the impact of organizational changes on unsafe act frequency. Combining perceptual data with objective performance metrics and incident records would strengthen causal inferences.
Overall, this study reinforces the view that unsafe acts, including violations, should be addressed through systemic interventions. Targeted improvements in communication, resources, and procedural design represent both direct safety measures and strategic levers for influencing the organizational conditions that give rise to human error.