Augmented and Virtual Reality for Improving Safety in Railway Infrastructure Monitoring and Maintenance
Abstract
1. Introduction
2. Background
2.1. Virtual Reality
2.2. Augmented Reality
3. Methods
- To design and implement an integrated VR/AR system supporting immersive training and real-time, in-field decision making for railway infrastructure maintenance.
- To connect diagnostic data and intelligent anomaly detection (via AI and machine learning) with immersive visual interfaces to enhance situational awareness and reduce operator subjectivity.
- To apply a human-centred design (HCD) methodology to ensure the usability and accessibility of the developed tools across different user roles.
- To evaluate the effectiveness of the proposed system in improving the accuracy, efficiency, and traceability of inspection and maintenance tasks.
- Identification of the context of use and research on users.
- Definition of user needs, user requirements, and system requirements.
- Development of design solutions that meet those needs and requirements.
- Evaluation of the design solutions against the requirements.
4. Results
4.1. VR Application
- Initial immersion and orientation. At launch, operators are placed in a virtual environment that mimics a section of track (see Figure 2). Users can physically “rotate” and observe the surrounding area to familiarize themselves with the environment. Virtual signs and markers (such as “Defects: 5 m”) indicate the distance and direction of the defects to be analyzed.
- Navigation by teleportation. The application guides users through interactive “totems,” virtual structures that provide textual and graphical instructions (see Figure 3a,b). The first totem explains how to move around the environment: the user points the controller at an area of the ground and presses the trigger button to “teleport” to the desired location. The second and third totems illustrate the steps for checking and controlling bolts, with instructions on selecting and interacting with an element.
- Identification and selection of bolts. Once the bolts to be checked are reached, graphical highlights (colored columns or cylinders) overlay interactive information on the track. Via the controller, the user can point to each bolt and activate contextual menus (see Figure 4).
- Bolt status classification (see Figure 5a,b). Upon clicking on a single bolt, three representative status options appear:
  - No defect.
  - Check (to be checked).
  - Fix it! (needs replacement or repair).
- Confirmation and completion of operations. Once all bolts have been evaluated and display the coloring corresponding to their status, the user heads to the last totem, which presents the final instructions for confirming the inspection’s outcome. Once the confirmation button is pressed, the application registers that the operation is complete (see Figure 6).
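The classification and confirmation flow described above is, at its core, a small piece of state logic: every bolt must receive one of the three statuses before the final totem accepts the confirmation. The application itself is an immersive VR environment, so the following is only a minimal Python sketch of that logic under stated assumptions; `BoltStatus`, `InspectionSession`, and the bolt identifiers are hypothetical names, not the authors' implementation.

```python
from enum import Enum


class BoltStatus(Enum):
    """The three status options in the VR contextual menu, plus an initial state."""
    UNCLASSIFIED = "unclassified"
    NO_DEFECT = "no_defect"
    CHECK = "check"   # to be checked
    FIX = "fix"       # needs replacement or repair


class InspectionSession:
    """Tracks the status assigned to each bolt and gates the final confirmation."""

    def __init__(self, bolt_ids):
        self.statuses = {bolt_id: BoltStatus.UNCLASSIFIED for bolt_id in bolt_ids}

    def classify(self, bolt_id, status):
        # Called when the operator selects an option from a bolt's contextual menu.
        self.statuses[bolt_id] = status

    def all_classified(self):
        # The last totem only accepts confirmation once every bolt has a status.
        return all(s is not BoltStatus.UNCLASSIFIED for s in self.statuses.values())

    def confirm(self):
        # Registers the inspection outcome; fails if any bolt was skipped.
        if not self.all_classified():
            raise RuntimeError("Some bolts have not been evaluated yet")
        return dict(self.statuses)
```

This mirrors how the VR interface prevents incomplete inspections: the confirmation step is simply unavailable while any bolt remains unclassified.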
- To train personnel in the bolt assessment and maintenance procedure in a safe and infinitely replicable context without the need to be physically present along the tracks.
- To assist operators in real time by indicating the components to be inspected and the actions to be taken, with a future advanced version planned to integrate real diagnostic systems.
- To reduce errors through a graphical UI and step-by-step instructions that minimize the risks of incorrect assessment or incomplete actions.
- To document and analyze data on bolt status (i.e., correct, to be checked, to be replaced) so that it can be tracked automatically, simplifying maintenance reports and statistics.
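The last goal, automatic tracking of bolt statuses for reports and statistics, amounts to aggregating the per-bolt outcomes once an inspection is confirmed. A hedged sketch of that aggregation, assuming statuses are logged as simple strings (the function and field names are illustrative, not taken from the system):

```python
from collections import Counter


def maintenance_report(statuses):
    """Aggregate per-bolt statuses ('no_defect', 'check', 'fix') into
    per-status counts and a sorted list of bolts requiring intervention."""
    counts = Counter(statuses.values())
    requires_intervention = sorted(
        bolt for bolt, status in statuses.items() if status in ("check", "fix")
    )
    return {
        "counts": dict(counts),
        "requires_intervention": requires_intervention,
    }
```

Because the statuses are recorded by the application rather than transcribed by hand, such a summary can be produced with no extra effort from the operator.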
4.2. AR Application
- Target recognition and location. When the application is launched, the system shows the user a window depicting the “target object” to be searched for and framed (see Figure 7a). This “initialization” phase allows the head-mounted display (e.g., HoloLens or a similar device) to accurately recognize the rail component of interest (such as the joint or specific bolts on the tracks). Moving around the target gives the system different viewpoints for correct identification.
- Instructions and AR information visualization. Once the target is recognized, a virtual interface (see Figure 7b) appears to provide operational instructions. This UI section explains how the user can interact with objects, for example, by raising a hand to bring up a menu or selecting specific options. The idea is to keep the operator’s hands free while key information appears as graphical “overlays” of the physical components.
- Identification of defective bolts. After reading and confirming the instructions (see Figure 8a), the application directly highlights the bolts that need to be worked on in the field of view. Through visual cues (e.g., arrows, icons, or colored symbols) and spatialized audio cues, the system marks the precise points that require maintenance. In this way, the operator can immediately identify the intervention points without consulting paper documents or external maps.
- Defect type selection. The UI then allows the user to classify the detected defects into different categories (see Figure 8b, “Defect 1” and “Defect 2”), each indicated by a symbol. In this scenario, “?” (i.e., question mark) represents a bolt to be checked, and “X” a bolt to be replaced. By virtually touching either option, the operator records in the application which intervention will be performed at the marked bolt.
- Performing maintenance. Once the defect selection is finished, the user can proceed with the maintenance (i.e., tighten or replace the bolt, as needed). This part takes place in the real world, but the AR head-mounted display (HMD) provides constant feedback on the component being serviced through its visual overlays.
- Maintenance status confirmation and registration. After performing the operations, users lift their hands to display a new status “tab” (see Figure 9). They can confirm whether the bolts have been repaired, replaced, or remain defective. In addition, a video/report recording can be triggered so that the operator can vocally and visually document the activity performed, generating a useful record for later analysis or for sharing the work done.
- Process conclusion. At the end of the recording or after saving the status of all components, the operator can close the application. In this way, the intervention is “archived,” and the information collected (e.g., types of defects, corrective actions, documentation videos) can be sent to the reporting or maintenance management system.
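The AR procedure above is essentially a linear guided workflow in which each phase unlocks the next. As an illustration only (the phase names below paraphrase the steps and are not identifiers from the actual application), the gating between steps could be modeled as:

```python
# Ordered phases of the guided AR procedure; each phase only advances to the next.
PHASES = [
    "target_recognition",   # frame the target object from several viewpoints
    "instructions",         # read and confirm the AR instruction panel
    "defect_selection",     # classify each highlighted bolt ("?" or "X")
    "maintenance",          # tighten or replace bolts in the real world
    "confirmation",         # update the status tab, optionally record video
    "archived",             # intervention stored and forwarded for reporting
]


class ARWorkflow:
    """Linear state machine that prevents skipping steps of the procedure."""

    def __init__(self):
        self.index = 0

    @property
    def phase(self):
        return PHASES[self.index]

    def advance(self):
        # Move to the next phase; refuse to go past the archived state.
        if self.index >= len(PHASES) - 1:
            raise RuntimeError("Workflow already archived")
        self.index += 1
        return self.phase
```

Enforcing this ordering in software is what lets the headset guide the operator step by step instead of relying on memorized procedures.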
- To guide the operator step by step, showing which rail components require maintenance thanks to AR elements superimposed on reality.
- To provide instructions directly in the field of view, reducing the need to consult external manuals.
- To classify and document defects, facilitating tracking and reporting.
- To simplify communication between the operational (i.e., maintenance) phase and the back-end systems that manage interventions.
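Bridging the maintenance phase and the back-end systems, as described in the last goal, amounts to packaging each intervention into a structured record that the reporting or maintenance-management system can ingest. A minimal sketch, assuming a JSON payload; every field name here is an assumption for illustration, not the system's actual interface:

```python
import json
from datetime import datetime, timezone


def intervention_record(target_id, defects, video_ref=None):
    """Package one AR intervention as a JSON document for a back-end
    reporting system. Field names are illustrative, not prescriptive."""
    return json.dumps({
        "target": target_id,                                  # recognized rail component
        "timestamp": datetime.now(timezone.utc).isoformat(),  # when the work was logged
        "defects": defects,    # e.g. [{"bolt": "b2", "mark": "X", "action": "replaced"}]
        "video": video_ref,    # optional reference to the recorded documentation video
    })
```

A plain serialized record like this is enough for the types of defects, corrective actions, and documentation videos to flow into existing maintenance-management workflows without manual re-entry.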
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Ricci, M.; Mosca, N.; Di Summa, M. Augmented and Virtual Reality for Improving Safety in Railway Infrastructure Monitoring and Maintenance. Sensors 2025, 25, 3772. https://doi.org/10.3390/s25123772