Article

Occupational Safety and Health as Assessable Transversal Competence in Higher Education

by Sorin Mihai Radu 1, Daniel Onut Badea 2,* and Victoria-Rodica Cioca 3,*
1 Department of Electrical and Power Engineering, University of Petrosani, 332006 Petrosani, Romania
2 National Research and Development Institute on Occupational Safety—I.N.C.D.P.M. “Alexandru Darabont”, 35A Ghencea Boulevard, Sector 6, 061692 Bucharest, Romania
3 Department of Clinical Psychology, Master’s Program in Clinical Psychology, Psychological Counseling and Psychotherapy, Faculty of Psychology and Educational Sciences, Babes Bolyai University, 1 Mihail Kogălniceanu, 400084 Cluj-Napoca, Romania
* Authors to whom correspondence should be addressed.
Educ. Sci. 2026, 16(2), 297; https://doi.org/10.3390/educsci16020297
Submission received: 14 January 2026 / Revised: 7 February 2026 / Accepted: 10 February 2026 / Published: 12 February 2026

Abstract

Occupational safety and health (OSH) appears in many higher education programs, yet universities rarely state what students must be able to do in situations involving occupational risk. This study analyses how OSH is defined and assessed in research on curriculum design, competence frameworks, and educational evaluation. The analysis used competence mapping, alignment checks, and cross-level tracing to examine learning outcomes, teaching activities, and assessment formats. The results suggest that safety is treated as knowledge of hazards and rules: assessment depends on recall and procedural compliance, while judgment, decisions under real conditions, and responsibility allocation are not evaluated. A framework is derived that defines safety through three forms of performance: risk interpretation, action selection, and decision justification, organized across non-equivalent levels of evidence. Learning outcomes, learning activities, and assessment are connected to these performance requirements, and progression can be defined and checked across program stages. Occupational safety and health becomes a form of transversal academic competence when it is defined through evidence from performed tasks instead of topic coverage or regulatory content.

1. Introduction

Higher education is responsible for developing competencies required for safe professional practice, including hazard identification, risk evaluation, and decision-making in work situations. Competency-based education defines learning outcomes in terms of observable performance rather than through the accumulation of factual knowledge alone (Marcotte & Gruppen, 2022).
In most university programs, occupational safety and health (OSH) appears at the edges of the curriculum. It is delivered as a single course with limited hours or as a brief module attached to practical training. Course content focuses on legislation, requirements, and procedures. Teaching relies mainly on lectures, while assessment targets declarative knowledge. This structure meets formal compliance, but it does not result in the development of operational competencies required in real work contexts (Marcotte & Gruppen, 2022).
Applied competencies such as contextual hazard identification, risk analysis, and selection of preventive measures are rarely implemented. In many programs, these competencies remain implicit and develop unevenly through informal learning experiences. Reliance on implicit curricula reduces consistency in competence development and limits transfer to professional practice (Wolford et al., 2025).
From a curricular perspective, OSH rarely appears as a learning outcome and rarely integrates across disciplinary content. This positioning conflicts with current orientations in higher education that emphasize transversal competencies and performance-based assessment. Without integration, university education produces declarative knowledge and leaves professional judgment regarding occupational risk undeveloped (Marcotte & Gruppen, 2022; Wolford et al., 2025). As a result, graduates enter the workforce with limited ability to apply safety knowledge in working contexts. They may recall regulatory requirements but struggle to identify hazards and make decisions under uncertainty. Research on safety education reports a gap between academic instruction and the competence required in the workplace. Curricula that prioritize facts and rules often fail to develop judgment in real-world work situations, as required in practice (Qian et al., 2023). Digitalization of work systems not only changes technical hazards, but it also modifies cognitive load, decision pressure, and exposure to psychosocial risk, with measurable effects on workers’ mental health in digitally mediated workplaces (Iordache et al., 2025a).
Changes in work organization reinforce this gap. Automation, data-driven systems, and modular task allocation generate risk profiles that differ from those addressed in traditional curricula. Safety Education 4.0 notes that conventional teaching centered on hazards and compliance no longer matches emerging workplace demands. Education requires integration of risk assessment with realistic work scenarios and decision processes (Qian et al., 2023).
Assessment practices further limit competence development. Traditional written exams and multiple-choice tests measure recall of rules and definitions. They provide limited information about students’ ability to apply knowledge in context or to monitor their own performance. Research on assessment and calibration reports that review-based testing overestimates competence, while practice-based assessment improves alignment between perceived and actual performance (Bol & Hacker, 2001). Without assessment formats that capture decision-making in context, education fails to develop professional judgment for complex work situations (Salas et al., 2012).
The research landscape reflects this fragmentation. OSH research is dominated by technical and regulatory perspectives, with a focus on hazard control, procedural compliance, and safety management systems. Within this tradition, safety performance is explained through adherence to prescribed measures, while judgment and decision-making under variable conditions receive limited attention in educational contexts (Hale & Borys, 2013).
Higher education research offers well-established models for curriculum design and learning outcomes. Constructive alignment defines competence through observable performance and requires agreement between objectives, instruction, and assessment. These models provide guidance for competence development but are rarely applied to OSH as a curricular outcome across disciplines (Biggs & Tang, 2011).
Research on human error and performance in complex systems further shows that safe behavior depends on interpretation, anticipation, and adaptation, not on strict rule-following. Safety emerges from judgment under constraints, especially when procedures do not fully match operational realities. These insights are well established in safety science but remain inadequately translated into higher education curricula (Reason, 1990/2019).
Across these research strands, OSH is positioned in three main ways. One approach treats it as a standalone subject delivered through dedicated courses. A second assigns responsibility for safety competence to organizational training after graduation. A third argues for integration within disciplinary education but lacks operational guidance on learning outcomes, teaching methods, and assessment.
The resulting gap lies in the absence of an operational educational framework that defines OSH as a transversal competence with progressive levels of complexity and aligned assessment strategies. Without such a framework, higher education programs cannot develop professional judgment related to occupational risk.
This study analyses how OSH is currently positioned within higher education curricula and identifies structural limitations that hinder competence development. The study has a dual objective. First, it conducts a descriptive-analytical examination of how OSH is defined, structured, and assessed in published curriculum documents. Second, it derives a normative educational framework based on the structural gaps identified in that analysis. The analysis focuses on educational design, not on technical risk control or regulatory compliance. The study makes three contributions. First, it reframes occupational safety and health as a form of transversal educational competence, defined through performance in real work situations rather than through topic coverage. Second, it draws on education research and safety science to explain why current curricula fail to develop professional judgment under occupational risk. Third, it proposes an operational curriculum framework that aligns learning outcomes, learning activities, and assessment with progressive levels of performance, from risk interpretation to decision justification. This structure permits use across disciplines, fits different higher education settings, and provides a practical basis for improving how OSH is taught and assessed.

2. Conceptual Background

2.1. Occupational Safety and Health as Educational Competence

Competence integrates knowledge, skills, and professional judgment applied to concrete tasks (Mulder, 2014). In OSH, this integration can be described through three components. Cognitive competence concerns understanding hazards and preventive principles. Procedural competence involves the use of methods such as risk assessment and control selection. Decisional competence concerns judgment under constraints, including prioritization and response to unexpected situations. Outcome-based education defines learning outcomes in terms of observable student performance, not through content coverage (Biggs, 1999). When OSH is framed as educational competence, learning outcomes must specify task-based actions rather than rule recall.

2.2. Transversal Competencies in Higher Education

Higher education emphasizes transversal competencies that apply across disciplines and professional fields. These competencies support transfer, adaptability, and decision-making under changing conditions. They are context-dependent, transferable, and developed progressively through integrated learning experiences (Rychen & Salganik, 2003). Their development requires alignment between learning outcomes, teaching activities, and assessment. Formal models of occupational safety that structure hazards, actions, and responsibilities into explicit decision frameworks show that safety is not reducible to rule compliance but depends on how alternatives are selected and justified in concrete situations (Badea et al., 2024). OSH fits this profile, applies across sectors, and requires contextual judgment beyond routine rule application. Despite this fit, higher education curricula do not frame OSH as a transversal competence and tend to treat it as a technical or regulatory subject (Le Deist & Winterton, 2005).

2.3. Emerging Risks and Educational Implications

Technological and organizational change affects how risks emerge in work environments. Digital systems, automation, and flexible work arrangements modify task execution, responsibility distribution, and risk perception (Qian et al., 2023). In organizational environments shaped by artificial intelligence and algorithmic coordination, safety depends on human judgment under conditions of incomplete, delayed, or conflicting information, and decision errors become a dominant source of risk (Iordache et al., 2025b). Under these conditions, static hazard lists and fixed procedures are insufficient. Safety depends on anticipation, adaptation, and management of variability instead of strict compliance (Hollnagel, 2014). From an educational perspective, the type of risk determines the type of competence required. Stable risks can be handled through procedural knowledge, while dynamic risks require situational judgment and adaptive decision-making. OSH education must therefore align learning outcomes and teaching strategies with risk characteristics.

3. Materials and Methods

3.1. Methodological Approach and Study Design

This study is a qualitative document-based conceptual analysis that derives an operational educational framework for OSH as a form of transversal competence in higher education. The approach combines a descriptive analysis of how OSH is integrated and assessed within higher education curricula with the construction of a normative framework based on the structural gaps identified. The analysis models how OSH competence is defined, structured, and evaluated in higher education curricula. The analytical design was specified before data collection and included four components: document parsing, competence coding, alignment analysis, and cross-level translation tracing.
The design aimed to identify how curriculum structures shape safety-related competence. The analyzed research and framework documents were treated as formal representations of intended educational practice. The analysis produced three types of outputs: competence coding matrices; alignment mappings between learning outcomes, teaching activities and assessment; and a synthesized educational framework for OSH.

3.2. Units of Analysis and Document Selection

The unit of analysis was the curriculum statement, defined as any explicit formulation of learning outcomes, teaching activities, or assessment requirements reported in published program or course descriptions. Each statement was treated as a separate analytical unit to avoid conflating intended competence, instructional exposure, and assessment evidence. Document selection was not aimed at institutional or national representativeness. It targeted recurring curricular structures.
The dataset comprised publicly available documents from three educational levels. The first level included policy and framework documents addressing competence-based education, transversal skills, and assessment principles in higher education. These documents were included if they provided definitions or operational descriptions of competence relevant to professional practice. Policy or theoretical texts were treated as curriculum evidence only when they described competence in a form intended to be implemented in higher education programs. Texts that presented general models or theoretical discussions without reference to learning outcomes, teaching activities, or assessment were not treated as representations of curriculum practice.
The second level included program-level curriculum descriptions from disciplinary domains characterized by intrinsic occupational exposure, including engineering, agriculture, applied sciences, and technology-oriented fields. Programs were included if they provided structured learning outcomes or curriculum specifications. Programs that contained only promotional or narrative information without educational descriptors were excluded.
The third level included course-level documents, such as syllabi, learning outcome statements, and assessment descriptions, where these were publicly accessible. Course documents were included if they referred to occupational safety, risk management, or safety-relevant activities. Documents lacking competence-related language or assessment information were excluded.
The document corpus consisted of 29 sources published between 1989 and 2025. These sources include research articles, competence frameworks, and curriculum analyses addressing professional competence, assessment theory, and OSH education. Several sources report or analyze higher education programs in fields such as engineering, applied sciences, process industries, and health professions. The framework was derived through document-based synthesis of these published analyses rather than through direct institutional curriculum auditing. The corpus reflects documented curriculum formulations, not abstract policy or theoretical models. The analyzed corpus corresponds to the reference list of this article.

3.3. Competence Mapping and Coding Strategy

Data analysis followed a structured qualitative coding process focused on competence mapping.
In the first coding phase, documents were analyzed to determine the curricular positioning of OSH. Codes captured whether safety-related content was presented as a standalone subject, integrated within disciplinary courses, or implicitly embedded without explicit learning outcomes. In the second phase, competence-oriented coding was applied. This coding was based on a three-dimensional competence structure comprising cognitive, procedural, and decisional components. Cognitive competence codes captured references to hazard knowledge, mechanisms of harm, and preventive principles. Procedural competence codes captured references to application-oriented activities such as risk assessment, selection of control measures, or implementation of safety procedures. Decisional competence codes captured references to judgment, prioritization, uncertainty, and context-dependent decision-making.
Decisional competence was treated as a primary analytical category. Coding focused on identifying whether documents required students to evaluate alternatives, manage trade-offs, or respond to variable work conditions. This approach permitted differentiation between procedural training and competence development oriented towards professional judgment.
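The three-dimensional coding scheme described above can be sketched as a simple classifier over curriculum statements. The indicator keywords and the example statement below are hypothetical illustrations; the study itself relied on qualitative coding by researchers, not on automated keyword matching.

```python
# Illustrative sketch of the three-dimensional competence coding.
# Indicator keywords are ASSUMPTIONS chosen for illustration only.

INDICATORS = {
    "cognitive": {"identify hazards", "describe", "list", "recognize"},
    "procedural": {"apply", "assess risk", "select control", "implement"},
    "decisional": {"justify", "prioritize", "trade-off", "uncertainty"},
}

def code_statement(statement: str) -> set[str]:
    """Assign competence codes to a curriculum statement by keyword match."""
    text = statement.lower()
    return {dim for dim, keys in INDICATORS.items()
            if any(key in text for key in keys)}

# A recall-oriented learning outcome codes only as cognitive.
outcome = "Students will list and describe workplace hazards and regulations."
print(code_statement(outcome))  # {'cognitive'}
```

Coding each statement separately, rather than whole courses, keeps intended competence, instructional exposure, and assessment evidence from being conflated, which mirrors the unit-of-analysis choice in Section 3.2.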

3.4. Curriculum Alignment Analysis

A structured alignment analysis was conducted to examine coherence between learning outcomes, teaching approaches, and assessment methods. For each document, the analysis evaluated whether the stated learning outcomes were supported by corresponding instructional activities and whether assessment methods evaluated the same competence components.
Alignment was assessed qualitatively by examining the internal consistency of curriculum descriptions. For example, learning outcomes referring to risk evaluation or decision-making were compared with teaching approaches and assessment formats to determine whether students were provided opportunities to practice and demonstrate such competencies.
This analysis was diagnostic rather than merely descriptive. It identified systematic patterns of misalignment, such as competence-oriented learning outcomes paired with assessment methods focused on factual recall, or procedural training without evaluation of judgment under uncertainty.
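The alignment check described above amounts to a set comparison: competence codes claimed in learning outcomes are compared with the codes actually supported by teaching activities and assessment formats. The record layout and example data below are hypothetical; they sketch the diagnostic logic, not the study's instruments.

```python
# Illustrative alignment diagnostic. The course record structure and
# example values are ASSUMPTIONS made for this sketch.

def diagnose_alignment(course: dict) -> dict:
    """Return competence codes stated in outcomes but unsupported
    by teaching activities or by assessment formats."""
    outcomes = set(course["outcome_codes"])
    return {
        "not_taught": outcomes - set(course["teaching_codes"]),
        "not_assessed": outcomes - set(course["assessment_codes"]),
    }

course = {
    "outcome_codes": {"cognitive", "procedural", "decisional"},
    "teaching_codes": {"cognitive", "procedural"},  # lectures plus lab tasks
    "assessment_codes": {"cognitive"},              # written exam only
}
gaps = diagnose_alignment(course)
print(sorted(gaps["not_assessed"]))  # ['decisional', 'procedural']
```

In this hypothetical record, decisional competence is promised in the outcomes but neither taught nor assessed, which is exactly the misalignment pattern reported in the Results.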

3.5. Cross-Level Comparison Between Policy and Curriculum

A cross-level comparison was conducted between competence formulations articulated at the policy or framework level and their implementation at the curriculum and course level. This comparison focused on the translation of transversal competence concepts across educational levels.
The analysis examined how abstract competence definitions were operationalized in program-level learning outcomes and how these were further translated into course-level teaching and assessment practices. Attention was given to points where competence concepts were simplified, fragmented, or reduced to declarative content.
This step identified structural mechanisms through which transversal competencies are weakened during curriculum design, without attributing gaps to individual courses or instructors.

3.6. Conceptual Educational Framework

The conceptual educational framework resulted from a synthesis of competence mapping, curriculum alignment checks, and cross-level translation traces. The framework formalizes OSH as a transversal educational competence defined by observable performance, not through content coverage. Three competence functions emerged in the coded material when safety-related statements appeared as curricular requirements. The first function, risk interpretation, covers the identification of hazards, exposure sources, affected actors, and responsibility within a defined work situation. The second function, action selection, covers the choice and use of preventive or protective measures for a specific task based on hazard characteristics and context. The third function, decision justification, covers comparison of alternatives, definition of selection criteria, documentation of trade-offs, and defense of a chosen action when procedures do not define a single correct response.
The framework structures these functions across three ordered levels of complexity to supply the progression markers that were missing in the analyzed curricula. Level 1 is structured interpretation. Evidence at this level consists of correct classification and mapping within a scenario, including hazard type, exposure pathway, and role-specific obligations. Level 2 is controlled action. Evidence consists of selecting a control strategy and applying a method in a defined task, with explicit linkage between hazard characteristics and control choice. Level 3 is situational judgment. Evidence consists of a justified decision in a scenario with uncertainty, competing constraints, or incomplete information, where multiple actions are feasible and the student must state why one option is acceptable and what residual risk remains. The levels are non-equivalent. Evidence at a lower level does not satisfy requirements at a higher level because the performance object changes from identification to method execution to judgment under constraints.
To operate as a curriculum design object, the framework specifies the evidentiary relationship between learning outcomes, learning activities, and assessment. For each level, learning outcomes are defined as task-based actions aligned to one of the three competence functions. Learning activities are specified as scenario-based tasks that elicit the targeted performance. Assessment is specified as direct evidence of performance in the same scenario family, using explicit criteria matched to the level. At Level 1, assessment evidence consists of structured mappings and classifications. At Level 2, evidence consists of documented method application and control selection steps. At Level 3, evidence consists of a decision record that includes option comparison, selection criteria, justification, and residual risk statement. Sector-specific hazards and discipline-specific tools are not prescribed. Invariant competence functions, level-specific performance requirements, and the minimum assessment evidence define when OSH has been achieved as a transversal learning outcome in higher education.
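The level structure and its minimum evidence requirements can be encoded as a small lookup, together with the non-equivalence rule that lower-level evidence never satisfies a higher-level requirement. The function names and evidence strings follow the text; the data layout itself is an illustrative assumption.

```python
# Minimal encoding of the framework: three competence functions, three
# ordered non-equivalent levels, and minimum assessment evidence per level.
# The dict layout is an ASSUMPTION for illustration.

FRAMEWORK = {
    1: {"function": "risk interpretation",
        "evidence": "structured mapping and classification within a scenario"},
    2: {"function": "action selection",
        "evidence": "documented method application and control selection steps"},
    3: {"function": "decision justification",
        "evidence": "decision record: options, criteria, justification, residual risk"},
}

def satisfies(evidence_level: int, required_level: int) -> bool:
    """Non-equivalence rule: evidence at a lower level never meets a higher
    requirement, because the performance object itself changes."""
    return evidence_level >= required_level

print(satisfies(1, 3))  # False: Level 1 mapping cannot evidence Level 3 judgment
```

A curriculum designer could use such a table to check, per program stage, that the assessment evidence demanded matches the level claimed in the learning outcome rather than defaulting to recall formats.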

4. Results

4.1. Positioning of Occupational Safety and Health in Educational Documents

Analysis of curriculum and competence documents (Biggs, 1999; Biggs & Tang, 2011; Mulder, 2014; Rychen & Salganik, 2003; Okun et al., 2016) reports that OSH is rarely formulated as a learning outcome and is mainly referenced as regulatory or contextual information.
Across documents, OSH is positioned mainly as a compliance-related requirement. Learning outcome statements, when present, refer to awareness of regulations, standards, or general safety principles. References to task execution, contextual application, or decision-making are largely absent. This pattern indicates that OSH is framed at the level of declarative knowledge and not as an assessable educational outcome, consistent with findings from analyses of competence representation in formal curricula (Eraut, 2004).
A structural analysis of documents reveals that OSH-related content is not integrated into curriculum alignment mechanisms. Safety references are disconnected from teaching activity descriptions and assessment criteria. In most cases, no connection is made between safety-related statements and the corresponding evaluation methods. Similar patterns of peripheral positioning have been documented in curriculum analyses where transversal competencies are not formally integrated into assessment structures (Young & Muller, 2015).
Cross-document comparison indicates that OSH is treated as a static curricular condition instead of a progressive learning construct. Safety-related statements do not vary across program stages and are not differentiated by increasing complexity. Elements related to judgment, contextual adaptation, or uncertainty management are not specified. This pattern aligns with documented tendencies to reduce complex professional competences to stable content descriptors in formal curriculum documents (Tynjälä, 2008). In sum, OSH is positioned as an implicit and compliance-oriented element in educational documents, without being formulated as a transversal learning outcome.

4.2. Representation of Occupational Safety and Health in University Curricula

Curriculum-level analyses of higher education programs and courses reported in the literature (Ajmal et al., 2021; Qian et al., 2023; Okun et al., 2016; Wolford et al., 2025) note that OSH is implemented through discrete curricular units, most often as a standalone course or as a short module related to laboratory or field activities. OSH-related learning outcomes are not distributed across multiple courses or across successive program stages, which indicates the absence of planned progression.
Learning outcomes associated with OSH content are formulated at the level of conceptual knowledge. They focus on recognition and description of hazards, regulations, and preventive principles. Verbs related to task execution, contextual application, or decision-making are not used. This reduction of competence to content descriptors has been documented in curriculum analyses of professional education, where learning outcomes prioritize knowledge coverage over performance (Wheelahan, 2015).
Descriptions of teaching activities reported in the literature (Ajmal et al., 2021; Qian et al., 2023; Wolford et al., 2025) indicate a strong reliance on lecture-based instruction. Practical activities are occasionally mentioned, usually through generic labels such as laboratory exercise or practical tasks. These activities are not linked to explicit performance criteria and are not associated with OSH-related learning outcomes. As a result, practical engagement remains instructional rather than competence-oriented, a pattern documented in studies of learning for work that distinguishes between activity exposure and competence development (Billett, 2011).
Assessment specifications reported in the same sources rely mainly on written examinations, quizzes, and short-answer tests. These formats assess recall of regulatory requirements and predefined procedures. They do not require students to analyze safety situations, compare alternative control measures, or justify decisions under uncertainty. Similar misalignments between intended competence and assessment practice are reported in studies of authentic assessment in professional education (Gulikers et al., 2004; Kane, 2013).
Competence mapping across learning outcomes, teaching activities, and assessment criteria reveals a consistent pattern. Cognitive competence is formulated and assessed. Procedural competence appears only indirectly through described activities and is not formalized as an assessable outcome. Decisional competence is not represented in learning outcomes, teaching activities, or assessment specifications.
To illustrate how competence coding was applied to curriculum statements, Table 1 presents examples of learning outcomes, teaching activities, and assessment formats extracted from the analyzed documents and their corresponding competence function classification.

4.3. Gaps in Occupational Safety and Health Competence Development

Studies on competence and assessment (Messick, 1994; Gulikers et al., 2004; Kane, 2013; Clark & Karvonen, 2021) report that when judgment is not defined as an evidentiary object, it is not represented in assessment. Consistent with these findings, analysis of curriculum documents in this study reveals recurrent structural gaps between learning outcomes, teaching activities, and assessment that constrain the development of OSH competence in higher education.
Learning outcomes that reference OSH in the analyzed literature (Ajmal et al., 2021; Qian et al., 2023; Wolford et al., 2025) often use competence-related terms such as responsibility, awareness, or preparedness. Assessment specifications linked to these outcomes rely on written examinations and short-answer tests that evaluate factual knowledge. They do not translate competence-oriented formulations into observable performance criteria. This pattern reflects a systematic disconnection between intended learning outcomes and evidentiary requirements, which corresponds to construct underrepresentation at the assessment level (Messick, 1994).
Procedural elements related to OSH appear in curriculum documents as prescribed methods, rules, or stepwise procedures. These are described without reference to situational choice, prioritization, or justification. Coding indicates that procedural verbs are not accompanied by indicators of uncertainty, alternative actions, or contextual constraints. Procedures are treated as fixed sequences rather than as inputs to professional judgment, which indicates that decisional competence is not defined as an educational target (Sadler, 1989). Quantitative risk scoring methods based on fixed matrices and predefined scales can create a false sense of objectivity, because they suppress situational variability and mask the role of professional judgment in risk evaluation (Băbuț et al., 2011).
Comparison of OSH-related learning outcome formulations across curriculum stages in the analyzed literature (Ajmal et al., 2021; Okun et al., 2016; Wolford et al., 2025) demonstrates no differentiation in complexity or expected autonomy. Similar OSH-related statements appear in early and late stages of study programs. There are no descriptors that indicate progression from basic understanding to applied performance or contextual judgment. This absence of progression markers corresponds to patterns of weakly articulated learning progressions reported in higher education programs (Shavelson et al., 2008).
When OSH appears within disciplinary courses, references remain at a descriptive level and are not linked to discipline-specific tasks or assessment criteria. Safety-related statements are not operationalized within the learning outcome structures of these courses. As a result, competence expectations weaken across disciplinary contexts and become fragmented, as described in studies of transversal competence implementation (Wheelahan, 2015; Rychen & Salganik, 2003).
Several curriculum-level sources indicate that advanced safety-related competence is expected to be developed during workplace induction or post-graduate training rather than during university study (Okun et al., 2016; Ajmal et al., 2021). This is visible in the absence of advanced learning outcomes and assessment requirements at program completion. OSH is therefore positioned as preparatory knowledge rather than as an educational outcome to be demonstrated within higher education.
The identified gaps are structural. They recur across documents and arise from misalignment between learning outcomes, teaching activities, and assessment.

4.4. Derived Competence Profile for OSH

Integrating the findings from document positioning analysis, curriculum-level examination, and structural gap identification yields a stable educational competence profile associated with OSH across the analyzed curricula. This profile emerges from the combined configuration of learning outcomes, teaching activities, and assessment specifications.
The resulting competence profile is predominantly cognitive. OSH is represented through declarative knowledge related to hazards, regulations, and prescribed procedures. Cognitive elements are formulated in learning outcomes and targeted by assessment formats. This configuration produces a competence profile centered on recognition and recall rather than on performance, a pattern previously observed in analyses of competence representations dominated by content-based formulations (Wheelahan, 2015).
Procedural competence is fragmentary and inadequately formalized. Procedural elements appear sporadically through descriptions of practical activities, laboratory work, or guided exercises. These elements are not translated into learning outcomes or assessment criteria. Procedural engagement remains instructional and does not contribute to an assessable competence profile. This separation between activity exposure and competence recognition aligns with documented distinctions between participation and learning outcomes in professional education (Billett, 2011).
Decisional competence is structurally absent from the resulting profile. Learning outcomes, teaching activity descriptions, and assessment specifications do not require learners to evaluate alternatives, manage uncertainty, or justify safety-related decisions in context. No curricular elements operationalize judgment as an educational outcome. The absence of decisional indicators confirms that the competence profile excludes contextual reasoning and adaptive decision-making, which are central to professional performance in complex systems (Hollnagel, 2014).
The competence profile shows no evidence of progression across curriculum stages. Identical or equivalent formulations of OSH-related expectations recur throughout programs without differentiation in complexity, autonomy, or responsibility. This lack of developmental structuring shows that competence is not designed as a cumulative construct. Instead, it remains static across the educational trajectory, consistent with patterns identified in curricula lacking explicit learning progressions (Shavelson et al., 2008).
Transversal integration is not reflected in the resulting profile. OSH remains confined to isolated curricular units, and competence expectations do not transfer across disciplinary contexts. When safety-related references appear outside dedicated units, they lack explicit linkage to discipline-specific tasks or assessment criteria. The competence profile degrades as it crosses curriculum boundaries, producing inconsistent and localized expectations.
The analyzed curricula produce a competence profile characterized by knowledge centrality, procedural marginality, decisional absence, lack of progression, and limited transversal coherence. This profile represents the cumulative educational outcome of current curricular arrangements for OSH.

4.5. Mapping of Curriculum Misalignments Affecting Occupational Safety and Health Competence

The analysis yields an analytical mapping that specifies how structural characteristics of existing curricula constrain the development of OSH competence. This mapping integrates document positioning, curriculum organization, identified structural gaps, and the resulting competence profile into a single analytical output. It is summarized in Table 2, which relates curriculum components to affected competence dimensions and observed structural constraints, consistent with alignment-based analyses of educational design (Biggs, 1999).
At the level of learning outcomes, OSH is formulated primarily through declarative statements related to awareness, responsibility, or knowledge of regulations. These formulations do not specify observable actions, contextual performance, or decision-making requirements. Cognitive competence is defined, while procedural and decisional dimensions remain underspecified or absent, reflecting limitations previously identified in outcome-based formulations disconnected from performance evidence (Gulikers et al., 2004).
At the level of teaching activities, OSH-related content is delivered mainly through lectures and guided instruction. Practical activities, when present, are not associated with explicit performance criteria or competence expectations. This configuration limits the translation of conceptual knowledge into applied competence and prevents the formal recognition of procedural learning within the curriculum structure (Billett, 2011).
At the level of assessment, evaluation formats prioritize efficiency and standardization. Written examinations and short-answer tests are used to verify knowledge acquisition, while no assessment evidence is required for contextual judgment or adaptive decision-making. This configuration reinforces a narrow competence profile and excludes decisional competence from formal evaluation, a pattern associated with weak construct representation in assessment design (Messick, 1994).
Across curriculum stages, OSH-related expectations remain static. Learning outcomes and assessment requirements do not increase in complexity or autonomy over time. The absence of progression prevents cumulative competence development and confines OSH to an introductory or peripheral role within programs, consistent with observations on the lack of learning progressions in higher education curricula (Shavelson et al., 2008).
Across disciplinary contexts, OSH-related expectations weaken as they cross curriculum boundaries. Safety-related elements are not operationalized within discipline-specific tasks or assessment criteria, resulting in fragmented and localized competence representations, a feature previously noted in analyses of transversal competence implementation (Wheelahan, 2015).
The analytical mapping suggests that current curricular arrangements prioritize cognitive competence while constraining procedural development and excluding decisional competence. These constraints arise from misalignments between learning outcomes, teaching activities, and assessment specifications, as well as from the absence of progression and transversal integration mechanisms.

4.6. Derived Educational Framework for Occupational Safety and Health Competence

An educational framework is derived that formalizes OSH as a transversal competence in higher education. The framework specifies competence dimensions, progression constraints, and alignment conditions as an integrated educational architecture.
The framework is organized around three functionally distinct competence dimensions. Cognitive competence is defined through the interpretation of hazards, exposure factors, and responsibility structures. Procedural competence is defined through the execution of risk-related methods within bounded task conditions. Decisional competence is defined through the selection, prioritization, and justification of actions in situations characterized by uncertainty and competing constraints. These dimensions correspond to established distinctions between knowledge-based, action-based, and judgment-based performance in professional learning contexts (Le Deist & Winterton, 2005).
Competence development is structured across three levels of complexity. At the foundational level, acceptable evidence is limited to the correct identification and interpretation of safety-related information. At the intermediate level, acceptable evidence requires method selection and execution within defined scenarios. At the advanced level, acceptable evidence requires justification of decisions in non-routine or variable contexts. Evidence valid at lower levels is not sufficient to demonstrate competence at higher levels, establishing formal non-equivalence between levels (Sadler, 1989).
Progression within the framework is governed by transition conditions. Transition from cognitive to procedural competence requires demonstrated application of methods linked to task conditions. Transition from procedural to decisional competence requires demonstrated judgment where procedures do not uniquely determine action. Progression is therefore conditional and cannot be achieved through content accumulation alone, consistent with findings on professional judgment development in complex work settings (Eraut, 2004).
Alignment is implemented as a structural requirement of the framework. For each competence dimension and level, learning outcomes specify observable performance, teaching activities enable enactment of that performance, and assessment requires direct evidence of performance. Misalignment between these elements invalidates competence attribution at the corresponding level, reflecting established principles of assessment validity (Kane, 2013).
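The alignment condition described above can be sketched as a small data model. This is an illustrative sketch, not part of the article's method, and all names are hypothetical: for each (dimension, level) cell of the framework, a competence claim is treated as valid only if the learning outcome, the teaching activity, and the assessment all address the same observable performance.

```python
# Illustrative sketch (hypothetical names): the framework's alignment rule
# modeled as a validity check over curriculum records.
from dataclasses import dataclass

DIMENSIONS = ("cognitive", "procedural", "decisional")
LEVELS = (1, 2, 3)  # foundational, intermediate, advanced

@dataclass(frozen=True)
class CurriculumCell:
    dimension: str
    level: int
    outcome_specifies_performance: bool     # learning outcome names an observable performance
    activity_enacts_performance: bool       # teaching activity lets students perform it
    assessment_evidences_performance: bool  # assessment collects direct evidence of it

def competence_attribution_valid(cell: CurriculumCell) -> bool:
    """A competence claim at (dimension, level) is valid only under full alignment."""
    return (cell.outcome_specifies_performance
            and cell.activity_enacts_performance
            and cell.assessment_evidences_performance)

# Example: outcome and activity address decisional performance,
# but the assessment verifies recall only, so attribution is invalidated.
misaligned = CurriculumCell("decisional", 3, True, True, False)
print(competence_attribution_valid(misaligned))  # False
```

The design choice mirrors the text: misalignment in any one of the three elements is sufficient to invalidate the competence attribution at that level.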
Transversal applicability is represented through abstraction from disciplinary content. Competence elements are defined independently of subject matter and can be instantiated within discipline-specific tasks without altering performance criteria. This structure supports consistent competence representation across curriculum boundaries, as required for transversal competence implementation in higher education (Rychen & Salganik, 2003).
The internal structure of the derived framework, including competence dimensions, progression levels, and alignment conditions, is represented schematically in Figure 1.

5. Discussion

Research on OSH education has focused mainly on courses, modules, and training interventions, with limited attention to how competent performance is defined and verified. In higher education, research on transversal competences suggests that when competences are expressed at a declarative level, assessment tends to measure awareness rather than performance (Gijón et al., 2025). This produces curricula in which competences are present in learning outcomes but absent from evaluation structures, a pattern also observed in the analyzed document corpus.
Within competence-based education, competence is defined through alignment between learning outcomes, instructional activities, and assessment evidence. Competence exists only if it can be demonstrated through observable task performance rather than through content recall (Biggs & Tang, 2011). Meta-analyses of competence assessment confirm that when frameworks do not specify task-based evidence, competence claims remain nominal and cannot be compared or accumulated across curricula (La Chimea et al., 2020). This establishes why task-based evidence is a necessary condition for competence to exist as an educational outcome.
Decisional competence does not disappear because curriculum designers ignore safety. In the analyzed documents, it appears to recede when assessment architectures do not accommodate judgment as an evidentiary object. When assessment systems rely on written tests, checklists, or predefined procedures, they can verify recognition and procedural compliance, but they cannot verify the quality of a decision under constraint. Uncertainty, trade-offs, and responsibility allocation have no measurable format and are therefore excluded from what counts as learning. Instructionally Embedded Assessment models argue that assessment design, delivery, and scoring must be aligned with instructional use to generate practical evidence about complex forms of competence (Clark & Karvonen, 2021).

This creates a structural filtering effect. Learning outcomes that refer to responsibility, preparedness, or professional judgment enter the curriculum at the policy level but lose operational meaning when translated into assessable course elements. Only competence components that fit stable and reproducible assessment formats survive this translation. Cognitive knowledge and procedural steps are retained. In the reviewed corpus, decisional competence is rarely formalized as assessment evidence; it is filtered out.
The framework derived in this study positions OSH within a performance-based competence tradition. Defining risk interpretation, action selection, and decision justification as separate competence functions specifies what students must demonstrate in work-related situations. This aligns OSH with how competence is operationalized in other professional domains, where judgment and action under task conditions constitute the basis for validation (Lindgren et al., 2024). By grounding OSH in observable performance instead of awareness of rules, the framework provides a formal competence definition that has been missing from safety education.
This framework introduces a different object of design. It does not define OSH through topics or rules, but through performance functions. Risk interpretation, action selection, and decision justification define three distinct forms of observable work that a learner must be able to perform in safety-relevant situations. This shifts the unit of analysis from knowledge about safety to performance in safety-relevant situations.
Assessment research indicates that when performance evidence is not specified, competence claims remain symbolic and cannot support cumulative development across curricula (La Chimea et al., 2020; Lindgren et al., 2024). Judgment cannot be inferred from multiple-choice scores or from completion of prescribed procedures. It requires task-embedded evidence in which learners must select, justify, and defend actions in context. When such evidence is absent, assessment systems default to what can be easily examined. Even when learning outcomes refer to responsibility or professional judgment, assessment formats translate these constructs into factual recall or rule application. The framework proposed in this study makes this filtering mechanism visible by separating interpretation, action, and justification as distinct competence functions and by revealing where existing curricula collapse decision into procedure.
This logic is reflected in how OSH is framed in existing curricula. Many existing OSH education frameworks describe content domains, hazard categories, and training requirements. They specify what should be covered, not what must be demonstrated. Even competence-oriented OSH models tend to list abilities without defining the form of evidence needed to validate them (Ajmal et al., 2021). As a result, within the reviewed corpus, OSH is present in curriculum language but remains weakly represented in assessment.
The framework also introduces a formal model of progression. Existing approaches assume that exposure to training leads to higher competence (Boafo et al., 2025). The present model defines progression through changes in the evidentiary object. At Level 1, competence is demonstrated through the correct interpretation of a situation. At Level 2, it is demonstrated through the execution of a control method. At Level 3, it is demonstrated through a justified choice between competing options under constraint. These levels are non-equivalent. Each requires a different form of evidence and cannot be inferred from the previous one (Okun et al., 2016).
As a consequence of this design, assessment becomes part of the competence definition. In this framework, competence does not exist unless a specific type of evidence is produced. Hazard maps, control selection logs, and decision records are not teaching aids. They are the objects through which OSH competence is claimed. This prevents the common situation in which curricula refer to judgment while assessments verify only recall or rule following.
The framework also supports transversal implementation without content standardization. The same competence functions and evidence types can be instantiated in different disciplines using different hazards, tools, and work processes. This allows OSH competence to be compared across programs while preserving disciplinary specificity.
At the curriculum level, this structure allows OSH to be distributed across program stages through explicit level-based requirements. A first-year laboratory course can require structured hazard mapping. A second-year field module can require documented control selection. A final-year design or capstone project can require justified safety decisions under real constraints. Under this structure, OSH becomes cumulative without the need for separate safety courses.
At the course level, instructors can incorporate OSH by adding decision points and justification criteria to existing tasks. A machine design assignment can include a comparison of guarding options. A chemical process task can require the selection of exposure controls under ventilation limits. A fieldwork protocol can require justification of work sequencing under time pressure. The content does not change; the performance object does.
At the assessment level, programs collect task-based evidence instead of attendance or test scores. Rubrics aligned to interpretation, action, and justification allow OSH competence to be evaluated with the same rigor as disciplinary skills. At the program level, coordinators can verify whether Level 1, Level 2, and Level 3 evidence appear across the curriculum and whether progression occurs before graduation.
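The program-level check described above can be sketched as a small verification routine. This is an illustrative sketch with hypothetical data, not a tool from the article: it assumes each program stage records the highest framework level for which task-based evidence exists, and checks that all three levels are covered and that the evidentiary level does not regress across stages.

```python
# Illustrative sketch (hypothetical data): program-level verification that
# Level 1-3 evidence appears across the curriculum and progression occurs.

# Program stage (year) -> highest framework level evidenced at that stage,
# e.g. 1 = hazard map, 2 = control selection log, 3 = justified decision record.
evidence_by_year = {1: 1, 2: 2, 3: 3}

def progression_complete(evidence: dict, top_level: int = 3) -> bool:
    """True if every level up to top_level is evidenced and levels never regress."""
    years = sorted(evidence)
    levels = [evidence[y] for y in years]
    covers_all = set(levels) >= set(range(1, top_level + 1))       # all levels present
    non_decreasing = all(a <= b for a, b in zip(levels, levels[1:]))  # no regression
    return covers_all and non_decreasing

print(progression_complete({1: 1, 2: 2, 3: 3}))  # True: cumulative progression
print(progression_complete({1: 1, 2: 1, 3: 1}))  # False: static expectations
```

Because the levels are non-equivalent, the check requires each level to be evidenced explicitly; a program that only produces Level 1 evidence repeatedly fails the verification, matching the static pattern found in the analyzed curricula.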
The study has methodological limits that follow from its analytical design and data sources. The framework is derived from published research on OSH education, competence models, and assessment theory, not from direct observation of curricula, teaching, or student work. It specifies what should count as evidence of OSH competence without indicating how often or how well such evidence is produced in real programs. Future research should test the framework against actual course syllabi, assignments, and assessment artifacts. Longitudinal studies should examine whether students progress across the three levels. Intervention studies can compare courses that use the framework with courses based on content instruction to test effects on assessment validity and the appropriateness of decisions.

6. Conclusions

This study provides a way to define OSH as a form of academic competence that can be checked. The findings describe structural patterns identified within the analyzed document corpus. The framework illustrates what students must be able to do, not what topics they have been exposed to. By connecting interpretation, action, and justification to assessment evidence, OSH can be built into existing courses and projects. No separate safety subjects are required. The framework allows programs to track progression from basic risk recognition to decision-making under constraints. This creates conditions for OSH to be compared across courses and disciplines. The framework provides a basis for treating safety as part of an academic qualification rather than as background knowledge.

Author Contributions

Conceptualization, S.M.R., D.O.B. and V.-R.C.; methodology, S.M.R., D.O.B. and V.-R.C.; formal analysis, S.M.R., D.O.B. and V.-R.C.; writing—original draft preparation, S.M.R., D.O.B. and V.-R.C.; writing—review and editing, D.O.B. and V.-R.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Under Romanian Law No. 206/2004 on good conduct in scientific research and the EU General Data Protection Regulation (EU) 2016/679, ethics committee approval is required only for research involving human subjects or the collection or processing of personal data. This study did not collect, process, or store any personal data and therefore falls outside the scope of GDPR and ethical review requirements.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Ajmal, M., Isha, A., & Nordin, S. (2021). Safety management practices and occupational health and safety performance: An empirical review. Jinnah Business Review, 9(2), 15–33.
2. Badea, D. O., Cioca, V., Darabont, D. C., Chiș, T. V., & Iordache, R. M. (2024). Ontology-based occupational safety and health management for workers with disabilities. Polish Journal of Management Studies, 30(1), 24–41.
3. Băbuț, G. B., Moraru, R. I., & Cioca, L. I. (2011, June 1–3). “Kinney methods”: Useful or harmful tools in risk assessment and management process? 5th International Conference on Manufacturing Science and Education (MSE 2011) (Vol. 2, pp. 315–318), Sibiu, Romania.
4. Biggs, J. (1999). What the student does: Teaching for enhanced learning. Higher Education Research & Development, 18(1), 57–75.
5. Biggs, J., & Tang, C. (2011). Teaching for quality learning at university. McGraw-Hill Education.
6. Billett, S. (2011). Subjectivity, self and personal agency in learning through and for work (M. Malloch, L. Cairns, K. Evans, & B. N. O’Connor, Eds.; pp. 60–72). SAGE Publications Ltd.
7. Boafo, C., Dornberger, U., & Boso, N. (2025). Enhancing international business competence: How cognitive and exposure training approaches matter. Africa Journal of Management, 11(1), 1–36.
8. Bol, L., & Hacker, D. J. (2001). A comparison of the effects of practice tests and traditional review on performance and calibration. The Journal of Experimental Education, 69(2), 133–151.
9. Clark, A. K., & Karvonen, M. (2021). Instructionally embedded assessment: Theory of action for an innovative system. Frontiers in Education, 6, 724938.
10. Eraut, M. (2004). Informal learning in the workplace. Studies in Continuing Education, 26(2), 247–273.
11. Gijón, M. K., Ruiz, I. Á., Villén, B. d. R., & de Quesada, M. G. (2025). Selecting and defining transversal competences for higher education training design. Frontiers in Education, 10, 1533505.
12. Gulikers, J. T. M., Bastiaens, T. J., & Kirschner, P. A. (2004). A five-dimensional framework for authentic assessment. Educational Technology Research and Development, 52(3), 67–86.
13. Hale, A., & Borys, D. (2013). Working to rule, or working safely? Part 1: A state of the art review. Safety Science, 55, 207–221.
14. Hollnagel, E. (2014). Safety-I and safety-II: The past and future of safety management. CRC Press.
15. Iordache, R., Cioca, V. R., & Mihaila, D. (2025a). Digitalization in the world of work—Benefits and new risks for employees’ mental health. Polish Journal of Management Studies, 32(2), 111–124.
16. Iordache, R., Cioca, V. R., Mihaila, D., Štreimikienė, D., & Ionescu, Ș. (2025b). An analysis on leadership and decision making errors in the new artificial intelligence influenced organizational environment. Polish Journal of Management Studies, 31(2), 123–137.
17. Kane, M. T. (2013). Validating the interpretations and uses of test scores. Journal of Educational Measurement, 50(1), 1–73.
18. La Chimea, T., Kanji, Z., & Schmitz, S. (2020). Assessment of clinical competence in competency-based education. Canadian Journal of Dental Hygiene, 54(2), 83–91.
19. Le Deist, F. D., & Winterton, J. (2005). What is competence? Human Resource Development International, 8(1), 27–46.
20. Lindgren, S., Argullos, J. L. P., & Millan, J. R. (2024). Assessment of clinical competence of medical students: Future perspectives for Spanish faculties. Medicina Clínica Práctica, 7(2), 100424.
21. Marcotte, K. M., & Gruppen, L. D. (2022). Competency-based education as curriculum and assessment for integrative learning. Education Sciences, 12(4), 267.
22. Messick, S. (1994). The interplay of evidence and consequences in the validation of performance assessments. Educational Researcher, 23(2), 13–23.
23. Mulder, M. (2014). Conceptions of professional competence. In S. Billett, C. Harteis, & H. Gruber (Eds.), International handbook of research in professional and practice-based learning (pp. 107–137). Springer.
24. Okun, A. H., Guerin, R. J., & Schulte, P. A. (2016). Foundational workplace safety and health competencies for the emerging workforce. Journal of Safety Research, 59, 43–51.
25. Qian, Y., Vaddiraju, S., & Khan, F. (2023). Safety education 4.0—A critical review and a response to the process industry 4.0 need in chemical engineering curriculum. Safety Science, 161, 106069.
26. Reason, J. (Ed.). (2019). Studies of human error. In Human error (pp. 19–52). Cambridge University Press. (Original work published 1990).
27. Rychen, D. S., & Salganik, L. H. (Eds.). (2003). Key competencies for a successful life and a well-functioning society. Hogrefe Publishing.
28. Sadler, D. R. (1989). Formative assessment and the design of instructional systems. Instructional Science, 18(2), 119–144.
29. Salas, E., Tannenbaum, S. I., Kraiger, K., & Smith-Jentsch, K. A. (2012). The science of training and development in organizations: What matters in practice. Psychological Science in the Public Interest, 13(2), 74–101.
30. Shavelson, R. J., Young, D. B., Ayala, C. C., Brandon, P. R., Furtak, E. M., Ruiz-Primo, M. A., Tomita, M. K., & Yin, Y. (2008). On the impact of curriculum-embedded formative assessment on learning: A collaboration between curriculum and assessment developers. Applied Measurement in Education, 21(4), 295–314.
31. Tynjälä, P. (2008). Perspectives into learning at the workplace. Educational Research Review, 3(2), 130–154.
32. Wheelahan, L. (2015). Not just skills: What a focus on knowledge means for vocational education. Journal of Curriculum Studies, 47(6), 750–762.
33. Wolford, L. L., Lugo-Neris, M. J., Liu, C. W., Nieves, L. E., Rodriguez, C. L., Patel, S. S., Lee, S. Y., & Naidoo, K. (2025). Uncovering the hidden curriculum in health professions education. Education Sciences, 15(7), 791.
34. Young, M., & Muller, J. (2015). Curriculum and the specialization of knowledge. Routledge.
Figure 1. Analytical mapping of curriculum components and competence dimensions in occupational safety and health education.
Table 1. Examples of curriculum statements and corresponding competence coding.
Curriculum Element | Statement Example | Competence Function | Evidence Type
Learning outcome | Awareness of safety regulations for laboratory work. | Risk interpretation | Cognitive
Teaching activity | Lectures on safety legislation and hazards. | Risk interpretation | Cognitive
Assessment | Written exam on safety rules. | Risk interpretation | Cognitive
Learning outcome | Apply basic safety procedures in laboratory tasks. | Action selection | Procedural
Teaching activity | Supervised laboratory work following safety steps. | Action selection | Procedural
Assessment | Checklist of completed safety steps. | Action selection | Procedural
Table 2. Characteristics of curriculum design affecting occupational safety and health competence.
Curriculum Component | Observed Structural Feature | Affected Competence Dimension | Resulting Constraint
Learning outcomes | Formulated as awareness, responsibility, or knowledge of regulations. | Cognitive competence. | Procedural and decisional competence are not specified as educational outcomes.
Teaching activities | Predominantly lecture-based, with generic references to practical activities. | Cognitive competence, partial procedural exposure. | Procedural learning remains instructional and is not formalized as competence.
Assessment methods | Written examinations and recall-based tests. | Cognitive competence only. | No assessment evidence for contextual judgment or adaptive decision-making.
Curriculum progression | Identical OSH-related formulations across program stages. | Cognitive, procedural, and decisional competence. | Absence of cumulative or progressive competence development.
Disciplinary integration | OSH confined to isolated courses or modules. | Transversal competence. | Fragmented and inconsistent competence expectations across disciplines.
Responsibility allocation | Advanced OSH competence deferred to workplace training. | Decisional competence. | OSH framed as preparatory knowledge, not as an educational outcome.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Radu, S.M.; Badea, D.O.; Cioca, V.-R. Occupational Safety and Health as Assessable Transversal Competence in Higher Education. Educ. Sci. 2026, 16, 297. https://doi.org/10.3390/educsci16020297
