Article

A Human-Centred Extended Reality (XR) System for Safe Human–Robot Collaboration (HRC) in Smart Logistics

Department of Supply Chain Management, International Hellenic University, Kanellopoulou 2, 601 32 Katerini, Greece
*
Author to whom correspondence should be addressed.
Systems 2026, 14(4), 348; https://doi.org/10.3390/systems14040348
Submission received: 12 February 2026 / Revised: 22 March 2026 / Accepted: 24 March 2026 / Published: 25 March 2026

Abstract

HRC is increasingly adopted in industrial and logistics environments, while workforce preparation often remains constrained by instructional approaches that provide limited embodied understanding of safety and ergonomics. This study examines the architectural design and system integration of a modular, human-centred XR platform intended to support safe and ergonomics-aware collaboration within smart logistics contexts. The proposed system integrates XR training scenarios deployed on consumer-grade hardware and follows a structured pedagogical progression from conceptual familiarisation through experiential task execution to reflective ergonomic evaluation. Multimodal feedback mechanisms based on posture-oriented guidance, attention-aware interaction design, and context-sensitive safety cues are incorporated without reliance on intrusive sensing technologies. A structured evaluation framework is defined to examine usability, task performance, and ergonomics-aligned posture indicators using standardised instruments and system-generated telemetry. The architectural design indicates that the framework supports scalable deployment, consistent interaction fidelity, and privacy-conscious data handling across educational and vocational settings. The proposed framework suggests that human-centred XR architectures can strengthen safety-oriented and ergonomically informed HRC within Industry 4.0 logistics environments.

1. Introduction

1.1. Industry 4.0 Logistics Systems and the Need for Human-Centred Capability Development

HRC is increasingly adopted within manufacturing and logistics systems, where coordinated interaction between human operators and robotic platforms is associated with improvements in efficiency, productivity, and operational safety in industrial environments [1,2]. In Industry 4.0 environments, the effective adoption of collaborative robotics is shaped not only by technological advances and system integration, but also by workforce-related factors, including the capacity to understand robot behaviour and engage within shared human–robot workspaces [2].
Simultaneously, artificial intelligence is reshaping supply chain management processes by enhancing forecasting accuracy, operational efficiency, and decision-making capabilities, while highlighting the importance of organisational readiness and human-centric integration [3]. Industry 4.0 technologies are increasingly integrated into distribution centre processes, supporting the automation of receiving, storage, packaging, and dispatching functions [4]. Applications such as UAV-enabled automated stocktaking illustrate the growing sophistication of robotic sensing and perception technologies in warehouse environments [5]. Similarly, mixed-case palletizing automation highlights the application of advanced algorithms and robotic solutions to improve efficiency, reduce errors, and optimise space utilisation in logistics operations [6]. Systematic analyses indicate that Industry 4.0 technologies influence logistics performance across economic, environmental, and social dimensions, while introducing trade-offs among sustainability indicators and lifecycle-related challenges [7]. From a broader systems perspective, digital transformation highlights the importance of not only technological advancement and system integration, but also the structured incorporation of human factors and ergonomics within digitally enabled work systems through human-centric socio-technical design approaches [8].
In this evolving socio-technical landscape, training practices are increasingly incorporating immersive technologies [9]. Immersive Virtual Reality has been identified as a promising educational medium due to its capacity to enhance engagement and experiential learning, particularly in adult and vocational training contexts [10]. XR technologies are being explored as instructional tools that support skill acquisition, safety training, and ergonomics-oriented learning processes [11]. Conventional instructional formats provide limited opportunities for embodied engagement and structured posture-based interaction, whereas immersive simulation environments enable the examination and assessment of spatial coordination and movement patterns [11,12]. XR environments have been shown to enhance skill development and safety-related training effectiveness compared with conventional instructional approaches, particularly in construction safety contexts [13]. In parallel, research on Augmented Reality applications in Industry 4.0 identifies maintenance, assembly, and HRC as principal domains of adoption, highlighting the role of XR technologies in supporting the digitalisation of industrial processes [14].
Nevertheless, certain robotics-oriented XR training systems concentrate primarily on programming and collaborative simulation, while placing comparatively less emphasis on ergonomics-oriented feedback mechanisms [15]. Empirical evidence indicates that immersive media can enhance learner engagement and reinforce the perception and retention of ergonomics risk factors in industrial training contexts, thereby supporting ergonomics-oriented learning processes [16]. However, improvements in engagement and risk perception do not necessarily result in sustained behavioural adaptation within collaborative workflows. These findings underscore the importance of XR systems that integrate technical instruction with safety-oriented design and human-centred system adaptation.
In response to these identified gaps, this study introduces a modular XR training architecture designed for scalable operation on consumer-grade head-mounted displays. The architecture adopts a distributed framework leveraging Unity and VIROO, thereby enabling interoperability across heterogeneous logistics infrastructures. Research on scalable XR environments emphasises the architectural and interaction challenges associated with multi-device and multi-user deployment, semantic scene segmentation, dynamic adaptation, and the preservation of consistency between virtual and physical contexts [17].
The system integrates posture proxies aligned with the Rapid Entire Body Assessment (REBA) framework [18], enabling ergonomics-oriented feedback without reliance on intrusive biometric sensing. This configuration is designed to connect immersive interaction, ergonomic safety, and HRC in Industry 4.0 logistics systems. This study examines the architectural design and system integration of an XR platform that supports human-centred capability development in digitally transformed logistics infrastructures. The analysis suggests that this modular architecture can support scalable and ergonomically informed training across contemporary logistics operations.

1.2. Architectural Framework and Research Contributions

This paper presents the architectural design and pilot study protocol of the HRC-XR Trainer system, developed to support ergonomics-aware training for HRC in smart logistics environments. While XR-based training and collaborative robotics education have been increasingly explored in recent studies, existing approaches often address technical interaction, safety training, and ergonomics awareness as separate concerns. As a result, there remains a need for integrated frameworks that combine pedagogically structured instruction with ergonomics-informed feedback within scalable XR-based training architectures. This study therefore examines how a modular XR architecture integrates pedagogically structured instruction with analytics-supported feedback derived from user interaction data. In this context, the research framework investigates whether telemetry captured from consumer-grade XR devices can support ergonomics-informed training evaluation and safety awareness within staged HRC learning scenarios. To guide the architectural design and pilot evaluation of the system, this study is structured around the following research questions:
  • RQ1: How can a modular XR architecture support ergonomics-aware training for human–robot collaboration in smart logistics environments?
  • RQ2: How can multimodal interaction telemetry and feedback mechanisms contribute to the assessment of safety awareness and ergonomics-related behaviour during XR-based HRC training?
  • RQ3: How can an XR-based training framework be integrated within a scalable ecosystem to support accessible collaborative robotics education for non-specialist learners?
The primary contribution concerns the design and implementation of an integrated XR platform that facilitates non-specialist engagement with collaborative robotics in warehouse and distribution settings. The HRC-XR Trainer combines Virtual and Mixed Reality (VR/MR) delivery with three interoperable toolkits, namely MANIPULAY XR [19] for exploring robot motion and kinematics, EMPOWER [20] for safety and ergonomics assessment, and ERGON-XR [21] for ergonomics-oriented coaching supported by standardised evaluation metrics. The platform builds upon the open XR ecosystem for robotics training introduced by the MASTER project [22,23], extending its functionality through modular integration and scalable deployment mechanisms compatible with consumer-grade hardware. This configuration demonstrates how interoperable XR components can support cross-disciplinary training scenarios that link technical system configuration with human factor analysis and reflective practice.
A second contribution concerns the pedagogically structured progression of the training modules. The instructional design follows a three-stage framework advancing from conceptual familiarisation to experiential safe-behaviour practice and finally to reflective ergonomic assessment. This staged progression is consistent with empirical findings indicating that immersive environments enhance learner engagement and can support posture-oriented and ergonomics-related learning processes [11,12], while maintaining coherence between learning objectives and system functionality.
A further contribution concerns the integration of multimodal feedback mechanisms designed to enhance embodied interaction and inclusive access. Posture-informed guidance, attention-aware interaction logic, and context-sensitive prompts are combined to support safety-oriented task execution. This approach builds upon empirical findings indicating that immersive media can enhance posture risk awareness [16] and engage novice users in collaborative XR training contexts [15]. Through this integration, the system extends beyond technical simulation toward human-centred system adaptation.
The architectural design addresses scalability and systematic validation through modular and interoperable integration. The implementation follows the integration interfaces of the MASTER XR ecosystem and supports deployment through Unity-based pipelines, ensuring consistent configuration across heterogeneous logistics infrastructures. MR passthrough capabilities on consumer-grade head-mounted devices enable flexible operation in authentic work settings, while defined performance targets ensure stable and responsive interaction. Privacy-conscious telemetry handling and support for both institutional and individual deployment contexts enable broad adoption without limiting the future extensibility of the system.
In parallel with these architectural considerations, this study defines a structured evaluation framework that integrates usability assessment, knowledge-based performance measures, ergonomics-aligned posture proxies grounded in the REBA method, and task execution indicators. This framework provides the methodological foundation for examining learning progression, safety awareness, and ergonomic adaptation within immersive logistics training contexts.
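To illustrate how ergonomics-aligned posture proxies of this kind might be derived from headset and controller telemetry, the following sketch maps trunk and neck flexion angles onto banded risk scores and combines them into a coarse exposure indicator. The angle bands, score combination, and function names are simplified illustrative assumptions, not the official REBA scoring tables or the system's actual implementation.

```python
# Illustrative sketch of a simplified, REBA-inspired posture proxy.
# The angle bands below are hypothetical simplifications for
# demonstration purposes, NOT the official REBA scoring tables.

def trunk_score(flexion_deg: float) -> int:
    """Map trunk flexion (degrees from upright) to a banded risk score."""
    if flexion_deg < 5:
        return 1          # near-upright posture
    if flexion_deg <= 20:
        return 2
    if flexion_deg <= 60:
        return 3
    return 4              # severe forward flexion

def neck_score(flexion_deg: float) -> int:
    """Map neck flexion to a banded risk score."""
    return 1 if flexion_deg <= 20 else 2

def posture_proxy(trunk_deg: float, neck_deg: float) -> dict:
    """Combine banded scores into a coarse ergonomic exposure indicator."""
    total = trunk_score(trunk_deg) + neck_score(neck_deg)
    level = "low" if total <= 3 else "medium" if total <= 5 else "high"
    return {"trunk": trunk_score(trunk_deg),
            "neck": neck_score(neck_deg),
            "total": total,
            "level": level}

# A pronounced forward lean with the head tilted down is flagged
# as elevated exposure, which could trigger a posture-oriented cue.
print(posture_proxy(35.0, 25.0))
```

In a deployed system, the input angles would be estimated from headset and controller poses rather than measured directly, and the resulting proxy would feed the coaching and analytics components rather than a console output.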
The remainder of this paper is organised as follows: Section 2 reviews related work on HRC training, XR applications, and human-centred industrial systems. Section 3 presents the pedagogical framework and articulates the system design goals and their mapping to the instructional modules. Section 4 details the layered system architecture and implementation framework, including modular integration and data flow mechanisms. Section 5 describes the instructional modules and associated learning scenarios that operationalise the proposed training approach. Section 6 outlines the evaluation design and pilot protocol. Section 7 discusses the findings in relation to prior work and highlights limitations and directions for future refinement.

2. Related Work on HRC, XR, and Human-Centred Industrial Systems

2.1. Training and Education for HRC

Research on HRC has examined the coordinated interaction mechanisms that enable joint task execution within industrial environments. Existing studies analyse human–robot communication frameworks, shared mental models, and modelling approaches that contribute to safe and efficient coordination between operators and robotic systems [24,25]. This body of work establishes conceptual and technical foundations for collaborative operation across manufacturing and logistics contexts.
Educational approaches addressing HRC have often emphasised engineering-oriented instruction, including robot programming laboratories, simulation environments, and system configuration exercises [22]. While such approaches support technical skill acquisition, they are frequently developed within specialist educational contexts, which may limit their direct applicability to broader vocational or operational training scenarios [15]. Consequently, HRC training methodologies remain unevenly distributed across learner profiles and heterogeneous in their pedagogical orientation.
In response to these constraints, XR technologies have been explored as instructional media for communicating robot behaviour and safety principles [26]. Reported XR-based training systems frequently rely on structured and sequential interaction formats, which may limit the extent of open-ended collaborative task participation. In addition, the scalability and contextual adaptability of such systems across heterogeneous industrial environments remain subjects of ongoing investigation.

2.2. XR Applications in Vocational and Industrial Training

XR technologies are increasingly adopted in vocational and industrial training as enablers of experiential learning processes [9,11]. Applications have been reported in domains such as assembly guidance [27], maintenance support [28], and safety instruction [9], where immersive environments contribute to procedural understanding and task familiarisation.
Despite these developments, a considerable proportion of XR training systems remain closely aligned with specific tasks or depend on fragmented hardware and software configurations, which constrains broader adaptability across industrial sectors [28]. In the context of collaborative robotics, training environments frequently prioritise procedural execution and system interaction while providing comparatively limited support for embodied perception, posture awareness, or biomechanical feedback during task performance [15]. This orientation influences the degree to which ergonomic considerations are systematically integrated into immersive training workflows.
Empirical investigations addressing XR-supported ergonomics indicate that immersive environments can enhance awareness of body alignment and musculoskeletal risk when integrated with structured training interventions [16]. Nevertheless, many XR safety training evaluations are still conducted under controlled experimental conditions and provide limited assessment of long-term or contextually complex deployment, indicating that the extension of such approaches to collaborative and multi-user industrial training environments remains comparatively under-explored [9].

2.3. Ethical, Safety, and Ergonomic Considerations in HRC

Safety considerations constitute a central dimension of shared human–robot workspaces. International standards, including ISO 10218-2, define safety requirements for industrial robot systems and specify protective measures, operational constraints, and risk-reduction mechanisms relevant to collaborative operation [29]. These standards establish a normative framework guiding the deployment, integration, and management of collaborative robotic systems within industrial environments.
Ergonomic assessment methods, such as REBA [18] and the Rapid Upper Limb Assessment (RULA) [30], are widely applied to evaluate musculoskeletal risk during task execution. In training and educational settings, these methods are typically implemented through observational coding procedures or post-task analysis, rather than through continuous, embodied feedback mechanisms embedded within interactive digital systems.
Beyond physical safety, ethical dimensions of human–robot interaction have received increasing scholarly attention. Concepts such as transparency, explainability, accountability, and the preservation of human agency are discussed extensively in the literature on ethical governance in robotics and artificial intelligence [31]. While these principles are clearly articulated at the conceptual and policy level, their systematic integration into interactive educational and training systems appears uneven across reported implementations.

2.4. Multimodal Feedback and Attention-Aware Interaction in XR

Multimodal feedback has been identified as an important factor in supporting embodied interaction within XR-based training environments. Visual guidance cues and spatial annotations in AR have been reported to assist task comprehension and support user cognition during industrial procedures [27]. Such approaches strengthen perceptual alignment between the user and the operational task environment.
Attention-aware interaction techniques, including gaze-based input and eye-tracking, enable implicit estimation of user focus and support adaptive content presentation in XR environments [32]. Approaches based on fixation mapping and gaze-mediated interaction illustrate the potential of attention inference to support interface adaptation and user-centred interaction design [33]. Nevertheless, many of these techniques rely on specialised sensing hardware, which may constrain large-scale deployment in cost-sensitive training contexts.
Posture-based feedback mechanisms have been explored within XR-supported ergonomics training. Empirical findings indicate that virtual task execution can reproduce joint-angle patterns comparable to those observed in real environments, supporting the feasibility of posture-based ergonomic assessment in XR settings [12]. Integrated interaction workflows that combine posture analysis, attention inference, and task-specific guidance within collaborative training scenarios, however, remain relatively uncommon in existing systems.

2.5. XR Platforms and Educational Ecosystems

A growing number of commercial and academic XR platforms support immersive education and industrial training. Platforms such as VIROO provide environments for the authoring, deployment, and management of virtual laboratories and collaborative simulations [34]. However, several XR-based robotics training implementations concentrate primarily on programming and task execution, with comparatively limited integration of ergonomics-oriented assessment or safety analytics within a unified instructional architecture [15].
Modular XR ecosystems have been proposed as an alternative approach, enabling the integration of specialised toolkits within shared platform infrastructures [22]. Such architectures facilitate cross-domain training configurations in which robotics simulation, safety-oriented functions, and instructional logic are coordinated within a common framework. The alignment of these ecosystems with structured pedagogical progression and multimodal interaction strategies remains an active area of investigation.
From this standpoint, the proposed framework is positioned through a comparison across key dimensions, including ergonomics integration, scalability, multimodal feedback, and pedagogical structure. Existing XR-based HRC training approaches typically address these aspects in isolation, for example by focusing on visualisation, task execution, or safety instruction, with limited integration across them. By contrast, the HRC-XR Trainer integrates these dimensions within a unified architecture combining staged pedagogy, ergonomics-informed feedback aligned with REBA principles, multimodal interaction, and scalable deployment (Table 1).
Overall, the reviewed literature documents substantial progress in XR-based industrial training and HRC education. At the same time, it reflects the diversity of existing approaches and the persistent complexity involved in integrating technical, ergonomic, and educational dimensions within immersive training systems. These observations underscore the need for explicitly articulated design goals and pedagogical principles capable of unifying these dimensions within a coherent XR training architecture.

3. Design Goals and Pedagogical Mapping of the HRC-XR Trainer

3.1. Pedagogical Framework and Learning Principles

The pedagogical framework of the HRC-XR Trainer is structured around four complementary learning paradigms that collectively shape the design of the training scenarios, interaction logic, and progression of learning activities within the simulated logistics environment. These paradigms operate as design principles that guide how knowledge, skills, and reflective judgement are developed through staged human–robot interaction.
The first paradigm concerns experiential learning. Drawing upon experiential learning theory [35] and case-based reasoning approaches that emphasise learning through the reinterpretation of prior experience [36], learning is understood as an iterative process in which concrete action, feedback, reflection, and conceptual understanding are progressively integrated. Within the training architecture, this paradigm is realised through repeated cycles of task execution and feedback, beginning with guided manipulation and advancing toward autonomous task performance and evaluation. Such cycles support the gradual internalisation of safety procedures, spatial reasoning, and ergonomic considerations through direct engagement with simulated robotic systems.
A second paradigm relates to embodied cognition, which emphasises the role of physical interaction, posture, and spatial coordination in shaping cognitive understanding [16]. This perspective informs the integration of posture-oriented cues, spatial proximity feedback, and attention-aware interaction mechanisms across all training scenarios. By aligning interaction design with the physical and perceptual demands of logistics tasks, the system supports learning processes that reflect the embodied nature of HRC in operational settings [37].
The third paradigm concerns situated and contextual learning. Training activities are embedded within a virtual environment that replicates an operational warehouse setting, ensuring that learning unfolds within task structures and spatial configurations representative of real logistics work. Such contextual alignment is consistent with findings emphasising the importance of aligning VR design elements with learning theories and domain-specific task structures in order to strengthen educational coherence and potential transfer [38]. The progressive scenario structure reinforces this alignment by increasing task realism and interaction complexity across training stages.
The fourth paradigm addresses reflective and responsibility-oriented practice. Effective HRC requires not only procedural competence but also the capacity to recognise risk, evaluate personal action, and adapt behaviour in response to system feedback. Reflective activities integrated into the training process are intended to foreground responsibility, system transparency, and human agency within collaborative workspaces, in alignment with principles articulated in the ethical governance literature [31]. Rather than abstract ethical deliberation, reflection is grounded in the analysis of task execution, ergonomic exposure, and safety-related decisions, thereby connecting ethical awareness with practical judgement in applied contexts [39].
Taken together, these learning paradigms structure the progression from initial familiarisation through embodied safe-behaviour practice to reflective ergonomic evaluation. In this way, pedagogical objectives are systematically integrated within the architectural and interaction design of the HRC-XR Trainer, establishing a coherent socio-technical learning pathway for collaborative logistics contexts.

3.2. System Design Goals

The design goals define the architectural configuration and functional priorities of the HRC-XR Trainer within Industry 4.0 logistics environments. These goals structure the system integration strategy and instructional logic, as summarised in Table 2. Each goal links technical implementation choices with human-centred capability development, ensuring alignment between system architecture, operational conditions, and pedagogical requirements.

3.3. Mapping of Pedagogical Objectives to Instructional Modules

Each instructional module operationalises the defined design goals through specific system functions, interaction mechanisms, and integrated toolkits. The mapping presented in Table 3 aligns learning focus, implemented XR components, and feedback modalities with the three instructional modules of the HRC-XR Trainer, ensuring coherence between pedagogical objectives and system-level implementation across the staged training pathway.
The resulting mapping embeds the progression from initial conceptual awareness through embodied safe-behaviour practice to reflective ergonomic evaluation directly within the system architecture. Across the three modules, cognitive and operational complexity increases in parallel with the depth of feedback integration, supporting structured learning progression and analytics-informed coaching within Industry 4.0 logistics training environments.

3.4. Alignment with Learning Outcomes

The HRC-XR Trainer addresses non-specialist learners, vocational trainees, and students by providing structured entry points into HRC in automated logistics contexts. Learning outcomes follow a staged progression consistent with Bloom’s taxonomy [40], advancing from conceptual awareness in the first module to embodied coordination during shared tasks in the second and culminating in reflective ergonomic evaluation in the third. This progression informs both the instructional structure and the architectural organisation of the system.
Learning outcomes are defined across three complementary dimensions that correspond directly to the implemented training scenarios. The knowledge dimension concerns the recognition of robot capabilities, workspace constraints, safety boundaries, and basic principles governing collaborative operation. The skills dimension focuses on the execution of safe and ergonomically appropriate actions during logistics-oriented tasks, including spatial coordination and posture management in shared human–robot workspaces. The attitudinal dimension relates to the cultivation of responsibility, situational awareness, and reflective judgement when interacting with robotic systems under varying operational conditions.
Assessment is integrated into the XR environment through a combination of task-based performance indicators, posture-related metrics, and guided reflection activities embedded within the training flow. These measures support both learner self-awareness and structured evaluation of learning progression across modules. In this way, learning outcomes are not treated as external criteria but are operationalised through interaction, feedback, and reflection within the modular system architecture.

4. System Architecture and Implementation

4.1. Layered System Architecture and Integration Workflow

The HRC-XR Trainer adopts a layered system architecture that supports modular integration, coordinated development, and feedback-informed refinement, as illustrated in Figure 1. The architecture is organised into three primary layers, namely the Design layer, the Implementation layer, and the Feedback layer, each addressing a distinct function within the system lifecycle.
The Design layer addresses pedagogical and technical configuration through scenario scripting, instructional structuring, and digital asset preparation. Educational content is developed collaboratively and managed through the MASTER XR platform, which supports version control and coordinated deployment across distributed logistics training contexts.
The Implementation layer operationalises the training scenarios by integrating interoperable toolkits, including MANIPULAY XR, EMPOWER, ERGON-XR, and VIROO, within a Unity-based runtime environment. Validated builds are configured through VIROO Studio for spatial orchestration and scenario management, enabling both VR and MR delivery on consumer-grade head-mounted displays. Runtime execution targets Meta Quest 3 devices via the OpenXR framework, ensuring compatibility with contemporary XR deployment standards.
The Feedback layer supports the collection and processing of interaction data derived from user input and task execution. This layer enables the aggregation and visualisation of anonymised telemetry and ergonomics-related indicators, supporting evaluation and content refinement while adhering to privacy-conscious data handling practices. Scalability and long-term accessibility across educational and institutional settings are maintained through platform-level data management.
Across all layers, modular interfaces within the MASTER XR ecosystem enable the integration of new instructional modules, content assets, or analytics components without modification of the core runtime. Interaction robustness is supported through attention-aware interaction mechanisms that provide fallback functionality on devices without dedicated eye-tracking hardware. This architectural configuration facilitates extensibility, maintainability, and scalable deployment within Industry 4.0 logistics training environments.
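The fallback behaviour described above can be sketched as a simple source-selection policy: when no valid eye-tracking ray is available, attention is approximated from the head-forward vector of the headset. All type and field names in this sketch are hypothetical illustrations, not identifiers from the MASTER, VIROO, or OpenXR APIs.

```python
# Hypothetical sketch of attention-source fallback for devices
# without dedicated eye-tracking hardware. Names are illustrative,
# not actual MASTER / VIROO / OpenXR API identifiers.
from dataclasses import dataclass
from typing import Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class AttentionSample:
    direction: Vec3   # unit vector of inferred attention
    source: str       # "eye" or "head"

def resolve_attention(gaze_dir: Optional[Vec3],
                      head_forward: Vec3) -> AttentionSample:
    """Prefer eye-tracking when a valid gaze ray exists; otherwise
    fall back to the head-forward vector as a coarse attention proxy."""
    if gaze_dir is not None:
        return AttentionSample(direction=gaze_dir, source="eye")
    return AttentionSample(direction=head_forward, source="head")

# Device without eye-tracking: the gaze ray is None, so the
# head pose supplies the attention estimate instead.
sample = resolve_attention(None, (0.0, 0.0, 1.0))
print(sample.source)
```

A policy of this shape keeps attention-aware interaction logic device-agnostic: downstream components consume a single `AttentionSample` regardless of which sensing source produced it.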

4.2. Modular SDK Architecture and Toolkit Integration

The instructional functionality of the HRC-XR Trainer is realised through modular software components distributed via the MASTER XR repository. The three core toolkits, namely MANIPULAY XR, EMPOWER, and ERGON-XR, correspond to successive stages of learning progression and collectively support architectural coherence across the system. Their modular configuration enables staged capability development from foundational understanding to applied coordination and reflective evaluation, while preserving interoperability within the wider XR ecosystem.
MANIPULAY XR supports the initial learning stage by providing an interactive XR environment for exploring the structure, motion, and kinematic relationships of robotic manipulators. Learners assemble robotic components and configure joint behaviour through structured, puzzle-based interaction, enabling accessible exploration of robot configuration, reachable workspace, and motion constraints. This interaction model supports conceptual understanding while remaining suitable for non-specialist learners [19].
EMPOWER addresses experiential safe-behaviour practice within shared human–robot workspaces. The toolkit integrates real-time visualisation, contextual assistance, structured data acquisition, and safety- and ergonomics-oriented assessment functions. Through its integration with the MASTER XR platform, EMPOWER supports immersive training scenarios that emphasise spatial awareness, posture management, and adaptive behaviour in simulated safety-sensitive collaborative settings [20].
ERGON-XR supports the reflective learning stage by enabling ergonomics-oriented evaluation within XR-based task environments. Learners perform representative logistics tasks while posture, movement patterns, and task-related demands are observed and visualised using standardised ergonomics indicators. Feedback is provided during and after task execution, supporting the interpretation of ergonomic exposure and the identification of opportunities for task and workspace improvement [21].
Together, these modular SDK components establish an integrated architectural framework that links conceptual exploration, embodied interaction, and reflective ergonomic analysis. Their coordinated integration supports extensibility, interoperability, and scalable deployment, reinforcing human-centred capability development in digitally enabled logistics and collaborative robotic systems.

4.3. Data Flow, Analytics, and Iterative Refinement

The modular toolkits are implemented as functional assets within the Unity-based application runtime environment and support successive stages of the training process. During training sessions, the system records anonymised interaction data derived from user input and task execution events, including task completion metrics and execution time. Posture-related and attention-related indicators are obtained through structured observation and interpretation. These data are aggregated and analysed to derive structured performance indicators and REBA-aligned proxies that support the evaluation of task performance, behavioural patterns, and ergonomics-related exposure. The resulting data are stored for subsequent analysis under privacy-conscious data governance procedures.
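The kind of aggregation described above can be sketched as follows. The event fields and metric names are illustrative assumptions rather than the platform's actual telemetry schema; the sketch shows only how anonymised task events might be reduced to simple performance indicators.

```python
# Illustrative aggregation of anonymised task-execution events into
# simple performance indicators (completion rate, mean execution time).
from statistics import mean

# Hypothetical event records: no identities, only task outcomes and timings.
events = [
    {"task": "pick_place", "completed": True, "seconds": 42.0},
    {"task": "pick_place", "completed": False, "seconds": 61.5},
    {"task": "pick_place", "completed": True, "seconds": 38.2},
]

completion_rate = sum(e["completed"] for e in events) / len(events)
mean_time_completed = mean(e["seconds"] for e in events if e["completed"])

print(f"completion rate: {completion_rate:.2f}")
print(f"mean execution time (completed tasks): {mean_time_completed:.1f} s")
```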
In the current pilot implementation, ergonomics indicators are derived from task execution characteristics and structured observation of posture during training tasks. These indicators follow REBA principles and support ergonomics-aware feedback. The analysis examines the derived indicators through descriptive statistics and comparative procedures in order to identify patterns related to task completion, ergonomic posture deviation, and attention allocation across training scenarios. In this context, the indicators also support the comparison of learning outcomes and ergonomics awareness across the instructional conditions implemented within the pilot study.
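As a loose illustration of how observation-based posture indicators might be banded, the sketch below maps a proxy score to the published REBA action-level bands [18]. The per-deviation weights in `proxy_score` are invented for illustration and do not reproduce the full REBA table lookup; only the score-to-band mapping follows the published action levels.

```python
# Simplified REBA-aligned proxy: observed posture deviations are mapped
# to a coarse score, then banded using the REBA action levels.
# The per-deviation weights are illustrative assumptions.

def risk_band(score: int) -> str:
    """Map a REBA-style score to its published action-level band."""
    if score <= 1:
        return "negligible"
    if score <= 3:
        return "low"
    if score <= 7:
        return "medium"
    if score <= 10:
        return "high"
    return "very high"


def proxy_score(trunk_flexed: bool, arms_raised: bool, twisting: bool) -> int:
    """Hypothetical proxy: each observed deviation adds a fixed weight."""
    score = 1  # neutral baseline
    score += 3 if trunk_flexed else 0
    score += 2 if arms_raised else 0
    score += 2 if twisting else 0
    return score


score = proxy_score(trunk_flexed=True, arms_raised=True, twisting=False)
print(score, risk_band(score))  # 6 -> medium
```

Banded indicators of this kind lend themselves to the descriptive and comparative analyses mentioned above, since they can be tabulated per scenario and per instructional condition.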
Analytical outputs are presented through structured visualisations that support the review of learning progression, task performance, and ergonomics-related risk patterns. Aggregated insights inform the refinement of instructional scenarios and system configuration, establishing a feedback loop between learner interaction, pedagogical adjustment, and architectural refinement. Through this data-informed process, the HRC-XR Trainer enables iterative refinement while preserving architectural scalability and alignment with defined data governance principles in Industry 4.0 logistics training contexts.

5. Instructional Modules and Learning Scenarios in HRC-XR Training

5.1. Module 1: Foundational Awareness and Conceptual Understanding

The first instructional module introduces learners to the fundamental mechanics of collaborative robotic manipulators within a structured logistics setting. Using the MANIPULAY XR toolkit, learners explore robot motion, joint constraints, and kinematic relationships through guided assembly and puzzle-oriented configuration tasks [19]. These activities are situated within a simulated warehouse environment that contextualises manipulation exercises and supports spatial reasoning through system-integrated visual cues and structured interaction sequences.
Conceptual understanding is developed through direct manipulation and visually guided exploration, fostering operational familiarity with motion limits, reachable workspace, and basic coordination principles relevant to safe HRC, consistent with experiential learning principles [35]. An onboarding sequence introduces interface conventions, core terminology, and controller mappings, ensuring consistent interaction across learner profiles. Interaction traces recorded during this module provide initial behavioural and ergonomic reference indicators for subsequent training stages, as illustrated in Figure 2a.
Scenario progression is organised across three curriculum-aligned levels to accommodate differences in prior exposure and technical familiarity. At the introductory level, targeting learners aged thirteen to sixteen, tasks focus on simplified robot assembly and observation of motion range within clearly marked safety zones. The intermediate level, designed for learners aged seventeen to twenty-one, introduces timed configuration tasks that require planning of joint sequences and basic optimisation of tool reach. The advanced level, addressing adult learners and vocational trainees, incorporates calibration activities involving collision envelopes and introductory waypoint-based motion definition.
Across all levels, task completion unlocks progressively more complex manipulation challenges. This structured progression reinforces cumulative understanding of robotic kinematics while gradually increasing cognitive and operational demands within a controlled learning environment.

5.2. Module 2: Experiential Learning and Safe-Behaviour Practice

The second instructional module extends foundational understanding by engaging learners in operationally realistic HRC within shared logistics workspaces. Using the EMPOWER toolkit, learners perform structured reaching and pick-and-place tasks while interacting with an autonomously moving robot, with the aim of fostering awareness of safe approach distances, posture management, and coordinated timing [20]. The training environment provides system-integrated visual cues and real-time safety notifications that are intended to support behavioural adjustment during task execution.
Learning in this module draws upon embodied interaction principles, whereby physical engagement and spatial awareness are integrated into task execution, consistent with findings on whole-body interactive learning [37]. Ergonomics-oriented prompts are informed by established assessment approaches such as the REBA method [18], enabling learners to associate posture deviations and unsafe positioning with interpretable risk indicators. As task demands increase, learners respond to variations in robot speed, safety margins, and environmental conditions, requiring continuous adaptation and spatial decision-making, as illustrated in Figure 2b.
Scenario progression is organised across three curriculum-aligned levels to support differentiated competence development. At the introductory level targeting learners aged thirteen to sixteen, activities focus on simple reaching routines performed while maintaining clearly defined safe distances from the robot. The intermediate level, designed for learners aged seventeen to twenty-one, introduces time-constrained pick-and-place tasks with reduced safety margins, requiring efficient execution while avoiding proximity warnings. The advanced level, addressing adult learners and vocational trainees, simulates a high-efficiency shared workspace in which robot behaviour adapts in response to unsafe proximity or ergonomically demanding actions.
Across all levels, increasing task complexity is achieved through systematic variation of robot speed, spatial constraints, and coordination demands. This progression supports the transition from guided interaction to autonomous, safety-aware task execution within collaborative logistics environments.
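A minimal sketch of the proximity-based warning logic implied by these scenarios is given below. The distances, thresholds, and graded states are invented for illustration; the actual EMPOWER safety logic is not published in this form.

```python
# Illustrative proximity check: compare learner-robot distance against a
# level-dependent safety margin and emit a graded warning state.
import math

def distance(a: tuple, b: tuple) -> float:
    return math.dist(a, b)

def proximity_state(d: float, safe_margin: float) -> str:
    """Hypothetical graded states: 'violation' inside the margin,
    'warning' within 1.5x the margin, otherwise 'safe'."""
    if d < safe_margin:
        return "violation"
    if d < 1.5 * safe_margin:
        return "warning"
    return "safe"

# Example: an intermediate level with a reduced margin (values assumed).
learner = (0.8, 0.0, 0.0)
robot_tool = (0.0, 0.0, 0.0)
print(proximity_state(distance(learner, robot_tool), safe_margin=0.6))
```

Tightening `safe_margin` or raising robot speed between levels would reproduce the kind of staged difficulty progression described above.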

5.3. Module 3: Task-Based Reflection and Ergonomics-Oriented Coaching

The third instructional module supports reflective learning and ergonomics-oriented self-assessment through the ERGON-XR toolkit. Learners review previously recorded task sessions and examine posture quality, movement patterns, and task timing using feedback derived from standardised ergonomics indicators. This structured review process is designed to facilitate informed behavioural adjustment and sustained ergonomic awareness within collaborative task contexts [21].
From a pedagogical and systems perspective, this module links task execution with reflective evaluation by enabling the interpretation of quantitative risk indicators in relation to individual interaction behaviour. Feedback is embedded within the learning workflow, allowing ergonomic exposure and movement efficiency to be examined alongside coordination and timing demands encountered during earlier training stages, as illustrated in Figure 2c.
Scenario progression is organised across three curriculum-aligned levels. At the introductory level, targeting learners aged thirteen to sixteen, visual replay functions and colour-coded posture indicators support intuitive recognition of safe and unsafe movement patterns. The intermediate level, designed for learners aged seventeen to twenty-one, introduces comparative task execution under varying workspace configurations, enabling reflection on how layout and task organisation influence ergonomic risk. The advanced level, addressing adult learners and vocational trainees, presents high-strain pick-and-place conditions under time constraints, inviting learners to interpret ergonomic feedback and propose realistic adjustments to task setup and working posture.
Completion of the third module concludes the learning cycle by connecting experiential interaction with evidence-informed reflection. In this way, ergonomic awareness and responsibility-oriented practice are emphasised within digitally mediated logistics and industrial training environments.

5.4. Learning Objectives and Assessment Mapping Across Modules

The three instructional modules of the HRC-XR Trainer constitute a staged learning pathway progressing from conceptual understanding to embodied coordination and finally to reflective ergonomic evaluation. Each module is associated with clearly defined learning objectives and corresponding assessment indicators, enabling structured observation of performance progression in terms of technical comprehension, behavioural indicators, and ergonomics-oriented measures. An overview of this alignment is provided in Table 4.
Across the three modules, the modular system architecture integrates MANIPULAY XR, EMPOWER, and ERGON-XR into a coherent instructional sequence that aligns learning objectives with system functionality and analytics support. Performance indicators are derived directly from interaction data generated during training activities, supporting structured evaluation without reliance on additional external assessment procedures. In this way, learning objectives, instructional design, and assessment logic are embedded within a unified XR-based training framework, providing a methodological foundation for the evaluation design and pilot protocol presented in the following section.

6. Evaluation Design and Pilot Protocol

6.1. Participants and Evaluation Design

Approximately seventy to eighty participants are recruited from secondary education, higher education, and vocational training contexts. Participants complete the three instructional modules sequentially, from M1 to M3, within a supervised session of approximately sixty minutes. A smaller comparison group follows an equivalent video-based instructional pathway covering the same content, enabling baseline comparison of learning outcomes across instructional formats.
The evaluation follows a quasi-experimental design combining pre- and post-intervention knowledge assessment, task-based performance indicators, and post-session usability measures. Learning progression will be examined using metrics aligned with the instructional modules, including task completion success and configuration accuracy in Module 1, posture deviation and proximity compliance during shared tasks in Module 2, and REBA-aligned ergonomic indicators and posture stability measures in Module 3. This alignment supports the analysis of conceptual understanding, behavioural indicators, and ergonomics-oriented performance, which are commonly examined in immersive training studies [16]. In addition, system-level evaluation is enabled by relating these indicators to telemetry-derived interaction data generated across the modular system architecture.
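One common way to express the pre-/post-intervention comparison is the normalised learning gain, g = (post − pre) / (max − pre), i.e., the fraction of the available improvement actually realised. The protocol above does not prescribe this metric, so the sketch below is an assumption, with hypothetical scores.

```python
# Normalised learning gain per participant: the share of the available
# improvement realised between pre- and post-test.
def normalised_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    if max_score == pre:
        return 0.0  # no headroom; gain undefined, treated as zero here
    return (post - pre) / (max_score - pre)

# Hypothetical scores for an XR-condition and a video-condition participant.
print(round(normalised_gain(40, 70), 2))   # 0.5
print(round(normalised_gain(40, 55), 2))   # 0.25
```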

6.2. Measures and Data Collection

The evaluation framework addresses four complementary dimensions, namely cognitive learning, behavioural adaptation, user experience, and system performance, as summarised in Table 5. These measures are designed to examine performance progression, ergonomics-oriented behaviour, engagement with the XR environment, and the technical reliability of the system during training sessions. Quantitative indicators, including pre- and post-intervention knowledge scores, REBA-aligned ergonomic indicators, task-based performance measures, and usability indices, are complemented by qualitative observation and participant feedback. This combined data structure supports consistent interpretation of learning outcomes across conceptual, behavioural, and technical dimensions.
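The usability index referred to above can be computed with the standard SUS procedure [41]: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0–100 score. The response values below are hypothetical.

```python
# Standard SUS scoring (Brooke, 1996): ten items rated on a 1-5 scale.
def sus_score(responses: list) -> float:
    assert len(responses) == 10, "SUS requires exactly ten item responses"
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# Hypothetical response set from one participant.
print(sus_score([4, 2, 4, 2, 5, 1, 4, 2, 5, 2]))  # 82.5
```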
A staged validation process supports methodological consistency across deployment contexts. Initial pilot implementation is planned to be conducted within educational and vocational settings, followed by comparative analysis and iterative refinement of instructional scenarios and interaction logic. This structured workflow is designed to support a reproducible evaluation pipeline aligned with the modular architecture of the MASTER XR ecosystem.
While large-scale trials are planned for subsequent project phases, the evaluation framework and pilot protocol presented provide an initial foundation for empirical assessment of XR-supported HRC training in Industry 4.0 logistics environments.

7. Discussion

This study examines the architectural and pedagogical integration of a modular XR-based training system designed to support ergonomics-aware HRC in digitally transformed logistics contexts.
The HRC-XR Trainer illustrates the feasibility of a modular, data-informed framework for ergonomics-aware HRC training within the MASTER XR ecosystem. Existing XR-based training systems often prioritise task execution, visualisation, or isolated safety instruction, with limited integration of ergonomics and pedagogical structure. The proposed system, by contrast, embeds a staged instructional configuration within a unified architectural framework, combining multimodal feedback, ergonomics-oriented analytics, and structured learning progression in a single environment and thereby improving the alignment between interaction design, learning objectives, and evaluation. In this way, the framework addresses limitations of prior systems in which these elements were treated as auxiliary, supporting more coherent training and more effective development of safety awareness and ergonomics-informed behaviour in practical training contexts.
By linking pedagogical structure with embodied interaction and analytics-supported feedback, the system reflects human-centred design principles within digitally transformed logistics environments. The architecture illustrates how immersive technologies can function not only as visualisation media but as capability-development infrastructures that support the staged instructional progression implemented across the three modules. From a systems perspective, this supports the development of human competencies that are increasingly relevant for adaptive and resilient logistics operations under Industry 4.0 conditions.
The system design suggests that consumer-grade XR devices can support acceptable levels of presence and usability while remaining accessible to non-specialist learner populations. This observation aligns with broader discussions highlighting scalability and cost-efficiency as relevant factors for the adoption of immersive training technologies in industrial and educational contexts. The integration of ergonomics-oriented analytics and safety- and responsibility-aware reflection extends collaborative robotics education beyond procedural skill acquisition, supporting broader socio-technical competence relevant to contemporary workforce development.
From an implementation perspective, the broader adoption of XR-based HRC training systems in industrial environments may depend on organisational and regulatory considerations. Factors such as integration with established safety procedures, training infrastructure costs, and organisational readiness to incorporate immersive technologies into existing workflows may influence deployment decisions. In this context, the current architecture emphasises scalability through the use of consumer-grade XR hardware. Future iterations may examine the integration of additional sensing modalities to support richer ergonomics analytics, provided that these enhancements preserve the modular structure and practical accessibility of the system.
Several limitations require consideration when interpreting these findings. Ergonomic assessment relies on task-based indicators and structured observation, which constrains precision compared to full-body motion capture systems. Attention inference mechanisms remain sensitive to environmental conditions and device calibration variability. In addition, pilot activities are planned within controlled educational settings, and broader deployment across authentic classroom and vocational environments will require further validation of scalability, robustness, and long-term learning retention. In this respect, the current pilot does not permit causal inference regarding long-term learning retention or sustained behavioural change beyond the immediate training context.
Future research should therefore examine enhanced sensing strategies, refinement of multimodal feedback mechanisms, and longitudinal assessment of learning effects across diverse learner groups. Collaboration with industrial training stakeholders will be important to evaluate transferability to operational contexts. Subsequent validation phases are expected to generate empirical evidence regarding learning progression, ergonomics-oriented behaviour, and system usability, informing iterative architectural refinement and contributing to the advancement of human-centred XR-based training for safe and responsible HRC in digitally mediated logistics environments.

Author Contributions

Conceptualization, A.D.; methodology, A.D.; validation, I.K.; formal analysis, A.D.; investigation, A.D.; resources, A.D.; writing—original draft preparation, A.D.; writing—review and editing, I.K.; visualisation, A.D.; supervision, I.K.; project administration, I.K. All authors have read and agreed to the published version of the manuscript.

Funding

This paper has been funded under the MASTER XR 2 Project, which has received funding from the European Union’s Horizon Europe research and innovation action programme, via the Open Call #2—MASTER issued and executed under project MASTER (grant agreement nr. 101093079).

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

During the preparation of this manuscript, the authors used (web-based version, accessed in 2026) to generate selected conceptual visualisations. The authors reviewed and edited the generated graphics and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
HRC: Human–Robot Collaboration
XR: Extended Reality
AR: Augmented Reality
REBA: Rapid Entire Body Assessment
RULA: Rapid Upper Limb Assessment
SDK: Software Development Kit

References

  1. Pietrantoni, L.; Favilla, M.; Fraboni, F.; Mazzoni, E.; Morandini, S.; Benvenuti, M.; De Angelis, M. Integrating collaborative robots in manufacturing, logistics, and agriculture: Expert perspectives on technical, safety, and human factors. Front. Robot. AI 2024, 11, 1342130. [Google Scholar] [CrossRef]
  2. Dhanda, M.; Rogers, B.A.; Hall, S.; Dekoninck, E.; Dhokia, V. Reviewing human-robot collaboration in manufacturing: Opportunities and challenges in the context of industry 5.0. Robot. Comput.-Integr. Manuf. 2025, 93, 102937. [Google Scholar] [CrossRef]
  3. Daios, A.; Kladovasilakis, N.; Kelemis, A.; Kostavelis, I. AI applications in supply chain management: A survey. Appl. Sci. 2025, 15, 2775. [Google Scholar] [CrossRef]
  4. Daios, A.; Kostavelis, I. Industry 4.0 Technologies in Distribution Centers: A Survey. In Supply Chains, Proceedings of the 5th Olympus International Conference; Springer: Berlin/Heidelberg, Germany, 2024; pp. 3–11. [Google Scholar] [CrossRef]
  5. Daios, A.; Xanthopoulos, A.; Folinas, D.; Kostavelis, I. Towards automating stocktaking in warehouses: Challenges, trends, and reliable approaches. Procedia Comput. Sci. 2024, 232, 1437–1445. [Google Scholar] [CrossRef]
  6. Daios, A.; Kladovasilakis, N.; Kostavelis, I. Mixed Palletizing for Smart Warehouse Environments: Sustainability Review of Existing Methods. Sustainability 2024, 16, 1278. [Google Scholar] [CrossRef]
  7. Sun, X.; Yu, H.; Solvang, W.D.; Wang, Y.; Wang, K. The application of Industry 4.0 technologies in sustainable logistics: A systematic literature review (2012–2020) to explore future research opportunities. Environ. Sci. Pollut. Res. 2022, 29, 9560–9591. [Google Scholar] [CrossRef] [PubMed]
  8. Kadir, B.A.; Broberg, O.; da Conceição, C.S. Current research and future perspectives on human factors and ergonomics in Industry 4.0. Comput. Ind. Eng. 2019, 137, 106004. [Google Scholar] [CrossRef]
  9. Scorgie, D.; Feng, Z.; Paes, D.; Parisi, F.; Yiu, T.; Lovreglio, R. Virtual reality for safety training: A systematic literature review and meta-analysis. Saf. Sci. 2024, 171, 106372. [Google Scholar] [CrossRef]
  10. Freina, L.; Ott, M. A literature review on immersive virtual reality in education: State of the art and perspectives. In Proceedings of the International Scientific Conference Elearning and Software for Education; SCIRP Open Access: Irvine, CA, USA, 2019; Volume 1. [Google Scholar] [CrossRef]
  11. Li, S.; Wang, Q.C.; Wei, H.H.; Chen, J.H. Extended reality (XR) training in the construction industry: A content review. Buildings 2024, 14, 414. [Google Scholar] [CrossRef]
  12. Rizzuto, M.A.; Sonne, M.W.; Vignais, N.; Keir, P.J. Evaluation of a virtual reality head mounted display as a tool for posture assessment in digital human modelling software. Appl. Ergon. 2019, 79, 1–8. [Google Scholar] [CrossRef]
  13. Man, S.S.; Wen, H.; So, B.C.L. Are virtual reality applications effective for construction safety training and education? A systematic review and meta-analysis. J. Saf. Res. 2024, 88, 230–243. [Google Scholar] [CrossRef]
  14. Reljić, V.; Milenković, I.; Dudić, S.; Šulc, J.; Bajči, B. Augmented reality applications in industry 4.0 environment. Appl. Sci. 2021, 11, 5592. [Google Scholar] [CrossRef]
  15. Mulero-Pérez, D.; Zambrano-Serrano, B.; Ruiz Zúñiga, E.; Fernandez-Vega, M.; Garcia-Rodriguez, J. Enhancing Robotics Education Through XR Simulation: Insights from the X-RAPT Training Framework. Appl. Sci. 2025, 15, 10020. [Google Scholar] [CrossRef]
  16. Diego-Mas, J.A.; Alcaide-Marzal, J.; Poveda-Bautista, R. Effects of using immersive media on the effectiveness of training to prevent ergonomics risks. Int. J. Environ. Res. Public Health 2020, 17, 2592. [Google Scholar] [CrossRef]
  17. Memmesheimer, V.M.; Ebert, A. Scalable extended reality: A future research agenda. Big Data Cogn. Comput. 2022, 6, 12. [Google Scholar] [CrossRef]
  18. Hignett, S.; McAtamney, L. Rapid entire body assessment (REBA). Appl. Ergon. 2000, 31, 201–205. [Google Scholar] [CrossRef] [PubMed]
  19. MASTER-XR Project Consortium. MANIPULAY XR: Playfully Learning Basics of Manipulators and Kinematics in Extended Reality, 2026. Available online: https://www.master-xr.eu/oc-project/manipulay-xr/ (accessed on 7 February 2026).
  20. MASTER-XR Project Consortium. EMPOWER: Enhancing Workplace Safety Through XR-Based Digital Assistance, 2026. Available online: https://www.master-xr.eu/oc-project/empower/ (accessed on 7 February 2026).
  21. MASTER-XR Project Consortium. ERGON-XR: Ergonomics Assessment in XR Environments, 2026. Available online: https://www.master-xr.eu/oc-project/ergon-xr/ (accessed on 7 February 2026).
  22. Barz, M.; Karagiannis, P.; Kildal, J.; Rivera Pinto, A.; de Munain, J.R.; Rosel, J.; Madarieta, M.; Salagianni, K.; Aivaliotis, P.; Makris, S.; et al. MASTER-XR: Mixed Reality Ecosystem for Teaching Robotics in Manufacturing. In Integrated Systems: Data Driven Engineering; Springer: Berlin/Heidelberg, Germany, 2024; pp. 167–182. [Google Scholar] [CrossRef]
  23. European Commission CORDIS. MASTER—Mixed Reality Ecosystem for Teaching Robotics in Manufacturing (Project 101093079). Available online: https://cordis.europa.eu/project/id/101093079 (accessed on 7 February 2026).
  24. Chan, W.P.; Crouch, M.; Hoang, K.; Chen, C.; Robinson, N.; Croft, E. Design and implementation of a human-robot joint action framework using augmented reality and eye gaze. arXiv 2022, arXiv:2208.11856. [Google Scholar] [CrossRef]
  25. Xia, G.; Ghrairi, Z.; Wuest, T.; Hribernik, K.; Heuermann, A.; Liu, F.; Liu, H.; Thoben, K.D. Towards human modeling for human-robot collaboration and digital twins in industrial environments: Research status, prospects, and challenges. Robot. Comput.-Integr. Manuf. 2025, 95, 103043. [Google Scholar] [CrossRef]
  26. Dodoo, J.E.; Al-Samarraie, H.; Alzahrani, A.I.; Tang, T. XR and Workers’ safety in High-Risk Industries: A comprehensive review. Saf. Sci. 2025, 185, 106804. [Google Scholar] [CrossRef]
  27. Wang, Z.; Bai, X.; Zhang, S.; Billinghurst, M.; He, W.; Wang, P.; Lan, W.; Min, H.; Chen, Y. A comprehensive review of augmented reality-based instruction in manual assembly, training and repair. Robot. Comput.-Integr. Manuf. 2022, 78, 102407. [Google Scholar] [CrossRef]
  28. Palmarini, R.; Erkoyuncu, J.A.; Roy, R.; Torabmostaedi, H. A systematic review of augmented reality applications in maintenance. Robot. Comput.-Integr. Manuf. 2018, 49, 215–228. [Google Scholar] [CrossRef]
  29. ISO 10218-2:2025; Robotics—Safety Requirements—Part 2: Industrial Robot Applications and Robot Cells. International Organization for Standardization: Geneva, Switzerland, 2025. Available online: https://www.iso.org/standard/73934.html (accessed on 7 February 2026).
  30. McAtamney, L.; Corlett, E.N. RULA: A survey method for the investigation of work-related upper limb disorders. Appl. Ergon. 1993, 24, 91–99. [Google Scholar] [CrossRef] [PubMed]
  31. Winfield, A.F.; Jirotka, M. Ethical governance is essential to building trust in robotics and artificial intelligence systems. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2018, 376, 20180085. [Google Scholar] [CrossRef] [PubMed]
  32. Plopski, A.; Hirzle, T.; Norouzi, N.; Qian, L.; Bruder, G.; Langlotz, T. The eye in extended reality: A survey on gaze interaction and eye tracking in head-worn extended reality. ACM Comput. Surv. (CSUR) 2022, 55, 53. [Google Scholar] [CrossRef]
  33. Barz, M.; Bhatti, O.S.; Alam, H.M.T.; Nguyen, D.M.H.; Sonntag, D. Interactive fixation-to-AOI mapping for mobile eye tracking data based on few-shot image classification. In Proceedings of the Companion Proceedings of the 28th International Conference on Intelligent User Interfaces; ACM Digital Library: New York, NY, USA, 2023; pp. 175–178. [Google Scholar] [CrossRef]
  34. Virtualware. VIROO Studio for Unity—Documentation Overview, 2025. Available online: https://virooportal.virtualwareco.com/docs/2.5/ (accessed on 7 February 2026).
  35. Kolb, D.A. Experiential Learning: Experience as the Source of Learning and Development, 2nd ed.; FT Press: Upper Saddle River, NJ, USA, 2014. [Google Scholar]
  36. Kolodner, J.L. An Introduction to Case-Based Reasoning. Artif. Intell. Rev. 1992, 6, 3–34. [Google Scholar] [CrossRef]
  37. Lindgren, R.; Tscholl, M.; Wang, S.; Johnson, E. Enhancing learning and engagement through embodied interaction within a mixed reality simulation. Comput. Educ. 2016, 95, 174–187. [Google Scholar] [CrossRef]
  38. Radianti, J.; Majchrzak, T.A.; Fromm, J.; Wohlgenannt, I. A systematic review of immersive virtual reality applications for higher education: Design elements, lessons learned, and research agenda. Comput. Educ. 2020, 147, 103778. [Google Scholar] [CrossRef]
  39. Coeckelbergh, M. AI Ethics; MIT Press: Cambridge, MA, USA, 2020. [Google Scholar]
  40. Anderson, L.W.; Krathwohl, D.R. A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, Complete Edition; Addison Wesley Longman, Inc.: Saddle River, NJ, USA, 2001. [Google Scholar]
  41. Brooke, J. SUS: A “Quick and Dirty” Usability Scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., Weerdmeester, B.A., McClelland, I.L., Eds.; Taylor & Francis: London, UK, 1996; pp. 189–194. [Google Scholar]
Figure 1. System architecture of the HRC-XR Trainer. Colours indicate the Design (yellow), Implementation (orange), and Feedback (green) layers. Arrows represent workflow, while the dashed arrow denotes iterative design.
Figure 2. Conceptual visualisations of the three HRC-XR training modules: MANIPULAY XR for robot assembly and workspace exploration; EMPOWER for real-time safety and ergonomics guidance; and ERGON-XR for session review and ergonomic risk assessment using REBA-based metrics.
Table 1. Qualitative comparison of XR-based HRC training approaches across key design dimensions.
Dimension | Prior XR Training Systems | HRC-XR Trainer Framework
Ergonomics integration | Partial | Integrated
Scalability | Limited | High
Multimodal feedback | Partial | Integrated
Pedagogical structure | Limited | Structured
Table 2. System design goals of the HRC-XR Trainer and their pedagogical orientation.
Goal | Description | Pedagogical Orientation
G1 Accessibility | Deployment on consumer-grade XR hardware to support participation by non-specialist users across varied logistics contexts. | Supports inclusive access, affordability, and scalable training deployment.
G2 Embodied Understanding | Integration of natural interaction mechanisms such as gaze inference, controller-based manipulation, and posture-oriented cues, subject to hardware capabilities. | Reinforces embodied cognition and promotes ergonomics-aware interaction during task execution.
G3 Progressive Scaffolding | Structuring of learning activities into staged modules progressing from initial familiarisation to applied practice and reflective evaluation. | Supports constructivist learning progression and gradual competency development [40].
G4 Immediate Multimodal Feedback | Provision of real-time visual and auditory feedback to support behavioural correction and adaptive task execution during interaction. | Strengthens experiential learning through responsive, context-sensitive system feedback.
G5 Safety and Responsibility Awareness | Integration of safety-oriented interaction logic designed to promote awareness of personal responsibility and adaptive behaviour in shared human–robot workspaces. | Encourages reflective judgement and safety-conscious conduct within collaborative task contexts.
G6 Reusability and Sustainability | Adoption of modular SDK-based integration within the MASTER XR ecosystem to support extensibility and reuse across training configurations. | Enables interoperability, long-term adaptability, and sustainable system evolution.
G7 Evaluation and Analytics | Collection of anonymised interaction and performance data to support analysis of learning progression and ergonomic exposure. | Facilitates evidence-based validation and iterative system refinement.
Table 3. Mapping of instructional modules to learning focus, implemented toolkits, and feedback modalities.
Module | Learning Focus | Implemented Toolkits | Primary Feedback Modalities
M1 Foundational Awareness | Understanding robotic structure, joint kinematics, motion limits, and reachable workspace in relation to safe interaction zones | MANIPULAY XR | Visual feedback supporting guided assembly, joint manipulation, reach testing, and spatial reasoning during structured tasks.
M2 Experiential Safe-Behaviour Practice | Development of embodied safety awareness and coordination skills in shared human–robot workspaces | EMPOWER | Real-time visual and auditory cues indicating proximity, unsafe posture, and behavioural adaptation during collaborative task execution.
M3 Reflective Ergonomic Evaluation | Interpretation of ergonomic exposure and movement patterns during logistics-oriented pick-and-place activities | ERGON-XR | Post-task and near-real-time feedback based on posture indicators, task timing, and REBA-aligned risk visualisations.
Table 4. Mapping of instructional modules to learning objectives and assessment indicators.
Module | Learning Objective | Assessment Indicator
M1 Foundational Awareness | Recognition of robot components, joint constraints, reachable workspace, and designated safety zones. | Task completion success, assembly accuracy, and configuration consistency.
M2 Experiential Safe-Behaviour Practice | Execution of safe spatial coordination and ergonomically appropriate posture during shared HRC tasks. | Posture deviation indicators, proximity threshold compliance, and task interruption frequency.
M3 Reflective Ergonomic Evaluation | Interpretation of ergonomic exposure and identification of corrective adjustments based on task feedback. | REBA-aligned risk indicators, posture stability measures, and variation across repeated task execution.
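To make the "REBA-aligned risk indicators" concrete, the sketch below shows one way such a proxy could be computed from per-frame joint-angle telemetry. The trunk and neck band thresholds follow the published REBA worksheet, but everything else is a deliberate simplification: the full REBA method also scores legs, arms, wrists, load, and coupling and combines them through lookup tables A, B, and C, none of which are reproduced here. Function names and the session data are illustrative, not part of the authors' implementation.

```python
def trunk_score(flexion_deg: float) -> int:
    """REBA trunk band: upright = 1, slight flexion = 2,
    moderate (20-60 deg) = 3, severe (>60 deg) = 4."""
    a = abs(flexion_deg)
    if a <= 5:
        return 1
    if a <= 20:
        return 2
    if a <= 60:
        return 3
    return 4

def neck_score(flexion_deg: float) -> int:
    """REBA neck band: 0-20 deg flexion = 1; beyond that (or extension) = 2."""
    return 1 if 0 <= flexion_deg <= 20 else 2

def posture_risk_proxy(samples):
    """Mean of per-frame (trunk + neck) band scores over a session.
    A REBA-aligned proxy indicator only -- NOT a full REBA score."""
    scores = [trunk_score(t) + neck_score(n) for t, n in samples]
    return sum(scores) / len(scores)

# Hypothetical telemetry: (trunk, neck) flexion in degrees per sampled frame.
session = [(3, 10), (25, 15), (70, 30)]
print(posture_risk_proxy(session))  # -> 4.0
```

A session-level mean like this supports the "variation across repeated task execution" indicator directly: comparing the proxy across repetitions shows whether a trainee's posture is improving.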
Table 5. Evaluation measures and data collection instruments.
Category | Measure | Instrument/Source
Cognitive Learning | Change in conceptual understanding across training modules | Pre- and post-intervention multiple-choice assessment and structured task checklist
Ergonomic Behaviour | REBA-aligned proxy indicators and posture stability measures | Ergonomics assessment supported by the ERGON-XR toolkit
User Experience | Perceived usability and interaction clarity | System Usability Scale [41] questionnaire administered post-session
Task Performance | Task completion success, execution time, and error occurrence | Application-level performance metrics recorded during training sessions
Qualitative Feedback | Participant reflections and observed interaction patterns | Semi-structured interviews and structured observation records
System Performance | Frame rate stability and interaction responsiveness | Runtime diagnostics from Unity and device-level telemetry
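The System Usability Scale entry in Table 5 uses the standard SUS scoring rule: ten Likert items (1-5), odd-numbered items positively worded and even-numbered items negatively worded, with the summed contributions scaled to 0-100. A minimal sketch of that computation follows; the example responses are invented for illustration.

```python
def sus_score(responses):
    """Standard System Usability Scale score (0-100) from ten Likert
    responses in 1-5. Odd-numbered items contribute (response - 1);
    even-numbered items contribute (5 - response); the sum is scaled by 2.5."""
    if len(responses) != 10 or not all(1 <= r <= 5 for r in responses):
        raise ValueError("SUS requires ten responses in the range 1-5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # 0-based index: even index = odd item
        for i, r in enumerate(responses)
    )
    return total * 2.5

# Hypothetical post-session questionnaire from one participant.
print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # -> 85.0
```

Scores above roughly 68 are conventionally read as above-average usability, which is why per-participant SUS scores pair naturally with the session-level telemetry in the same table.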

Share and Cite

MDPI and ACS Style

Daios, A.; Kostavelis, I. A Human-Centred Extended Reality (XR) System for Safe Human–Robot Collaboration (HRC) in Smart Logistics. Systems 2026, 14, 348. https://doi.org/10.3390/systems14040348