Proceeding Paper

Enhancing Anatomy Teaching and Learning Through 3D and Mixed-Reality Tools in Medical Education †

1 School of Design Arts, Xiamen University of Technology, Xiamen 361024, China
2 Faculty of Math, University of Waterloo, Waterloo, ON N2L 3G1, Canada
3 The Second Affiliated Hospital of Xiamen Medical College, Xiamen 361021, China
* Authors to whom correspondence should be addressed.
Presented at the 8th Eurasian Conference on Educational Innovation 2025, Bali, Indonesia, 7–9 February 2025.
Eng. Proc. 2025, 103(1), 22; https://doi.org/10.3390/engproc2025103022
Published: 18 August 2025

Abstract

We developed an innovative system to enhance medical education, specifically in teaching anatomy and surgical techniques. We applied three-dimensional (3D) and mixed-reality (MR) technologies to create immersive learning experiences. The system enables students to gain hands-on experience in a virtual environment and understand anatomical structures and related surgical procedures. We designed the system architecture, user interface, and educational methodologies. By integrating modern technology into teaching methods, the system increases student engagement and learning efficiency, validating the potential of 3D and MR technologies in medical education.

1. Introduction

Lung anatomy presents challenges in thoracic surgery education, primarily due to its substantial anatomical variability and intricate three-dimensional architecture. Traditional teaching methods in thoracic surgery rely on anatomical textbooks and two-dimensional imaging materials for knowledge transmission. However, this conventional method fails to adequately convey the complex three-dimensional anatomical relationships of pulmonary structures. Moreover, the transition from open surgery to thoracoscopic procedures has steepened the learning curve, as surgeons must adapt to limited tissue manipulation and reduced tactile feedback [1].
To enhance the understanding of anatomical relationships and facilitate preoperative planning, three-dimensional (3D) visualization techniques are used. Advanced imaging technologies, including 3D computerized tomography (CT)-based reconstruction and simulation, are integrated into immersive extended reality (XR) platforms, such as virtual reality (VR), mixed reality (MR), and augmented reality (AR) [2,3]. These immersive technologies enable healthcare professionals to visualize, manipulate, and interact with patient-specific 3D models in fully virtual and hybrid simulated environments [4,5,6]. VR technology, in particular, bridges the gap between theoretical knowledge and practical training by allowing physicians to acquire cognitive understanding and technical skills in a risk-free environment outside the operating room.
In this study, we developed an innovative educational system to enhance thoracic surgical training. We examined the system architecture, user interaction modalities, pedagogical methodology, and learning outcomes of our VR-based platform. The system integrates interactive virtual environments into anatomical visualization, enabling medical trainees to acquire hands-on experience through immersive simulation. The system facilitates an understanding of complex pulmonary anatomical structures and associated surgical procedures, while providing a risk-free environment for skill development and procedural practice.

2. Background and Related Works

2.1. MR in Medical Education

Contemporary medical education is experiencing a paradigm shift driven by the proliferation of mobile MR technologies, encompassing smartphones, AR, VR, and 3D printing. These advanced technologies extend beyond traditional video-based educational methodologies and substantially enhance existing physical simulation training approaches. While conventional teaching models—including mannequin-based training, role-playing exercises, standardized patient interactions, and cadaveric dissection—remain valuable, digital media innovations have fostered diverse instructional modalities that significantly expand the scope and depth of clinical skills training [7,8].
Learners increasingly expect educational experiences characterized by interactive simulations and rich sensory engagement [9]. This expectation has transformed medical education from traditional didactic lectures toward self-directed learning approaches [10], online educational platforms [11], and experiential simulation-based training [12]. As empirical evidence continues to demonstrate the effectiveness of simulation and MR in enhancing learning outcomes [13] with the ubiquity of mobile devices [14], mobile MR tools integrated into medical education have become increasingly prevalent. In medicine, MR systems have been implemented in foundational subjects, particularly anatomy education. Jang et al. [15] demonstrated the pedagogical advantages of direct manipulation of anatomical structures in stereoscopic 3D environments using joystick controllers. Küçük, Kapakin, and Göktaş [16] subsequently examined the impact of AR-enhanced anatomy education on student learning effectiveness and cognitive load.
An advantage of MR lies in its capacity to facilitate real-time interactive integration between physical environments and virtual simulations. This distinctive capability has demonstrated remarkable potential in medical education and offered unprecedented opportunities for experiential learning and skill acquisition. As its applications expand, related research has progressed substantially, yielding innovative pedagogical approaches and transformative educational methodologies. The seamless integration of virtual and physical elements enables learners to engage with complex medical concepts and procedures in ways unattainable through conventional teaching methods, thereby revolutionizing medical education paradigms and opening up new avenues for educational innovation.

2.2. Volumetric Video and 3D Capturing

Volumetric video technology has been advanced based on established technologies, primarily photogrammetry, a technique that enables three-dimensional object reconstruction from multiple two-dimensional images [17]. Unlike conventional surgical instructional videos that merely present specific techniques in limited surgeries, volumetric video technology enables surgical spatial visualization from the user’s perspective. This technology transcends the limitations of traditional videos and 360-degree stereoscopic recordings by supporting full-range scene interaction: translational movements (anterior–posterior, superior–inferior, and lateral) and rotational movements (lateral tilt and anterior–posterior tilt). Through precise user position tracking, the native VR environment facilitates natural navigation in a virtual surgical environment, delivering an enhanced immersive experience.
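The photogrammetric reconstruction underlying volumetric video can be illustrated, in highly simplified form, by triangulating one 3D point from two calibrated camera views. The linear (DLT) triangulation below is a generic textbook method shown for intuition only, not the capture pipeline used in this study; the toy camera matrices are our own assumptions:

```python
import numpy as np

def triangulate_point(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) image coordinates of the same point in each view.
    Returns the reconstructed 3D point in world coordinates.
    """
    # Each view contributes two linear constraints on the homogeneous point X.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Solve A @ X = 0 via SVD; the solution is the last right singular vector.
    _, _, vt = np.linalg.svd(A)
    X = vt[-1]
    return X[:3] / X[3]

# Two toy cameras: identity pose, and one translated 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
point = np.array([0.5, 0.2, 4.0])                  # ground-truth 3D point
x1 = point[:2] / point[2]                          # projection in view 1
x2 = (point - np.array([1.0, 0.0, 0.0]))[:2] / point[2]  # projection in view 2
print(triangulate_point(P1, P2, x1, x2))           # recovers ≈ [0.5, 0.2, 4.0]
```

A production volumetric capture rig applies this idea across many synchronized, calibrated cameras and dense pixel correspondences rather than a single hand-picked point.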
This technological innovation aligns with contemporary trends in medical education and training. Simulation technologies have been extensively implemented in healthcare education in the civilian and military medical sectors [18], encompassing medical mannequins for psychomotor skill development and virtual training systems for procedural knowledge acquisition. Extensive research results have validated the effectiveness of simulation-based education, garnering widespread recognition within the medical community [19,20,21]. While less prevalent than physical simulation devices, games and virtual simulations demonstrate substantial advantages in learner engagement and educational outcomes [22,23]. Although related research continues to evolve, AR and VR technologies exhibit remarkable potential for enhancing learning experiences in healthcare education [24,25].
In this context, we investigated the technical workflow of volumetric video, encompassing the acquisition, generation, and semantic labeling of hybrid animation, and developed an enhanced educational system for thoracic surgery through the integration of volumetric video technology and wearable MR devices.

3. System Design and Implementation

3.1. Hardware and Software

In this study, we implemented Halocode (https://support.makeblock.com/hc/en-us/articles/20072180888343-About-Halocode) (accessed on 8 August 2025) as the primary data processing and control unit of the system. Halocode processes input signals from compatible sensors, including angular and magnetic sensors, to facilitate user data tracking, gesture recognition, and real-time interaction control. Within the assessment component, Halocode presents the question-answering interface and coordinates the logical processing of response selection, ensuring fluid interaction throughout the educational experience.
For the MR visualization component, we employed Microsoft HoloLens 2 to render volumetric videos in the system (Figure 1). The implementation comprises the Mixed Reality Toolkit (MRTK) 2.7 and Unity 2019.4.23. Beyond conventional surgical skills training, the system incorporates an interactive assessment module. The evaluation interface uses angular and magnetic sensors for response selection, enabling learners to indicate their choices through sensor-based interactions. The system automatically logs responses and delivers instantaneous feedback alongside a comprehensive evaluation based on predetermined assessment criteria.
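The sensor-based response selection can be sketched as follows. The angle thresholds, the choice labels, and the response-logging shape are hypothetical placeholders, since the paper does not specify the actual firmware API or assessment format:

```python
# Illustrative sketch: map an angular-sensor reading to one of four answer
# choices by dividing the tilt range into equal sectors. The sector scheme
# and the logging format are assumptions, not the deployed system's logic.

CHOICES = ["A", "B", "C", "D"]

def select_choice(tilt_deg, n_choices=4, full_range=180.0):
    """Quantize a tilt angle in [0, full_range) into an answer choice."""
    sector = full_range / n_choices
    index = min(int(tilt_deg // sector), n_choices - 1)
    return CHOICES[index]

def log_response(question_id, tilt_deg, log):
    """Record the learner's selection for later report generation."""
    choice = select_choice(tilt_deg)
    log.append({"question": question_id, "choice": choice})
    return choice

log = []
print(log_response(1, 30.0, log))   # tilt in the first sector -> "A"
print(log_response(2, 120.0, log))  # tilt in the third sector -> "C"
```

In the real system the tilt reading would come from the Halocode-connected angular sensor each frame; here it is passed in directly to keep the sketch self-contained.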

3.2. System Architecture

Surgical skill acquisition without expert supervision yields suboptimal outcomes. To enhance the effectiveness of surgical training, the developed system incorporates multiple high-precision sensors for comprehensive operative data tracking and recording. Inertial and magnetic sensors collect operational parameters, including the spatial positioning of surgical instruments, movement trajectories, and procedural angles. These high-fidelity metrics are used to evaluate procedural standardization and provide quantitative benchmarks for surgical skill progression. Figure 2 shows the system architecture through which these objective parameters are systematically collected and analyzed for continuous performance assessment and skill refinement in surgical training.
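As an illustration of how tracked parameters might be turned into quantitative benchmarks, the sketch below computes two simple metrics from a recorded instrument trajectory: total path length and mean deviation from a reference angle. These metric definitions are our illustrative choices, not those documented for the deployed system:

```python
import math

def path_length(points):
    """Total 3D distance traveled along a sampled instrument trajectory."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def mean_angle_deviation(angles_deg, reference_deg):
    """Mean absolute deviation of sampled tool angles from a reference angle."""
    return sum(abs(a - reference_deg) for a in angles_deg) / len(angles_deg)

# Toy data: a straight 3 cm advance along z, sampled at three points,
# with three sampled insertion angles against a 45-degree reference.
trajectory = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.5), (0.0, 0.0, 3.0)]
angles = [44.0, 46.0, 45.5]

print(path_length(trajectory))             # 3.0
print(mean_angle_deviation(angles, 45.0))  # ≈ 0.833
```

Shorter path length and smaller angular deviation for the same task would then serve as objective evidence of increasingly standardized technique across training sessions.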

3.3. Interactive Design

A distinguishing feature of the developed system is its capacity to facilitate physical movement replication, gesture memorization, and spatial comprehension of surgical instruments and anatomical structures before an operation, overcoming the limitations of traditional fixed-screen learning approaches. This design bridges the gap between simulated training and clinical practice in instrument positioning, equipment familiarization, and workflow optimization. The system transforms the educational experience from passive observation to active participation, and from 2D screen-based learning to immersive spatial interaction.
The system comprises two distinct phases. In the initial volumetric video learning phase, learners engage with surgical content through HoloLens 2 with free spatial manipulation. The system then transitions to an interactive assessment phase, in which learners use angular and magnetic sensors for response selection. The system provides real-time feedback on responses and generates comprehensive performance reports upon completion (Figure 3).
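The assessment phase above can be summarized in a small sketch that scores logged responses against an answer key and emits a summary report. The report fields and scoring rule are illustrative assumptions, as the paper does not specify the report format:

```python
# Illustrative sketch of the assessment phase: score logged responses
# against an answer key and produce a summary report. Field names and
# the accuracy-based scoring rule are assumptions for illustration only.

def grade_session(responses, answer_key):
    """responses: {question_id: choice}; answer_key: {question_id: correct}."""
    results = {
        q: responses.get(q) == correct for q, correct in answer_key.items()
    }
    n_correct = sum(results.values())
    return {
        "per_question": results,          # which questions were answered correctly
        "score": n_correct,
        "total": len(answer_key),
        "accuracy": n_correct / len(answer_key),
    }

report = grade_session(
    responses={1: "A", 2: "C", 3: "B"},
    answer_key={1: "A", 2: "D", 3: "B"},
)
print(report["score"], "/", report["total"])  # 2 / 3
```

A per-question breakdown like this is what lets the system pair instantaneous feedback during the session with a comprehensive report at the end.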

4. Conclusions

The convergence of MR and volumetric video technologies enhances thoracic surgical education and training. The developed system integrates volumetric video technology with wearable MR devices, complemented by multi-sensor data acquisition, to create an advanced educational platform for thoracic surgery training. The system delivers immersive surgical instruction while incorporating quantitative assessment mechanisms through an interactive evaluation module, providing objective metrics for evaluating educational efficacy. This dual functionality of immersive visualization and performance analytics enables both surgical skill acquisition and assessment. Positive user feedback suggests potential for expanded implementation and validation.

Author Contributions

Conceptualization, W.G. and R.G.; methodology, W.G. and Y.Y.; software, R.G. and Z.H.; validation, S.Y., T.L. and S.Y.; investigation, J.Y., Y.Z. and Z.H.; resources, W.G. and T.L.; writing—original draft preparation, W.G., Y.Y. and J.Y.; writing—review and editing, W.G., R.G. and T.L.; visualization, J.Y., Y.Y. and Z.H.; supervision, W.G. and R.G.; project administration, W.G.; funding acquisition, T.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Xiamen Educational Science Planning Project (No. 23026) and Xiamen Medical College Educational Research Project (No. XBLG2023008).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Meershoek, A.J.A.; Loonen, T.G.J.; Maal, T.J.J.; Hekma, E.J.; Hugen, N. Three Dimensional Printing as a Tool For Anatomical Training in Lung Surgery. Med. Sci. Educ. 2023, 33, 873–878. [Google Scholar] [CrossRef]
  2. Sadeghi, A.H.; El Mathari, S.; Abjigitova, D.; Maat, A.P.; Taverne, Y.J.; Bogers, A.J.; Mahtab, E.A. Current and future applications of virtual, augmented, and mixed reality in cardiothoracic surgery. Ann. Thorac. Surg. 2022, 113, 681–691. [Google Scholar] [CrossRef] [PubMed]
  3. Vervoorn, M.T.; Wulfse, M.; Hoesein, F.A.A.M.; Stellingwerf, M.; van der Kaaij, N.P.; de Heer, L.M. Application of three-dimensional computed tomography imaging and reconstructive techniques in lung surgery: A mini-review. Front. Surg. 2022, 9, 1079857. [Google Scholar] [CrossRef]
  4. Tokuno, J.; Chen-Yoshikawa, T.F.; Nakao, M.; Iwakura, M.; Motoki, T.; Matsuda, T.; Date, H. Creation of a video library for education and virtual simulation of anatomical lung resection. Interact. Cardiovasc. Thorac. Surg. 2022, 34, 808–813. [Google Scholar] [CrossRef]
  5. Dho, Y.-S.; Park, S.J.; Choi, H.; Kim, Y.; Moon, H.C.; Kim, K.M.; Kang, H.; Lee, E.J.; Kim, M.-S.; Kim, J.W.; et al. Development of an inside-out augmented reality technique for neurosurgical navigation. Neurosurg. Focus 2021, 51, E21. [Google Scholar] [CrossRef]
  6. Andrews, C.; Southworth, M.K.; Silva, J.N.A.; Silva, J.R. Extended reality in medical practice. Curr. Treat. Options Cardiovasc. Med. 2019, 21, 18. [Google Scholar] [CrossRef]
  7. Alaker, M.; Wynn, G.R.; Arulampalam, T. Virtual reality training in laparoscopic surgery: A systematic review & meta-analysis. Int. J. Surg. 2016, 29, 85–94. [Google Scholar] [CrossRef] [PubMed]
  8. Vitish-Sharma, P.; Knowles, J.; Patel, B. Acquisition of fundamental laparoscopic skills: Is a box really as good as a virtual reality trainer? Int. J. Surg. 2011, 9, 659–661. [Google Scholar] [CrossRef] [PubMed]
  9. Jones, C.; Ramanau, R.; Cross, S.; Healing, G. Net generation or Digital Natives: Is there a distinct new generation entering university? Comput. Educ. 2010, 54, 722–732. [Google Scholar] [CrossRef]
  10. Murad, M.H.; Coto-Yglesias, F.; Varkey, P.; Prokop, L.J.; Murad, A.L. The effectiveness of self-directed learning in health professions education: A systematic review. Med. Educ. 2010, 44, 1057–1068. [Google Scholar] [CrossRef]
  11. Clark, R.C.; Mayer, R.E. E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning; John Wiley & Sons: Hoboken, NJ, USA, 2016. [Google Scholar]
  12. Cook, D.A.; Brydges, R.; Hamstra, S.J.; Zendejas, B.; Szostek, J.H.; Wang, A.T.; Erwin, P.J.; Hatala, R. Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simul. Healthc. 2012, 7, 308–320. [Google Scholar] [CrossRef]
  13. Dalgarno, B.; Lee, M.J.W. What are the learning affordances of 3-D virtual environments? Br. J. Educ. Technol. 2010, 41, 10–32. [Google Scholar] [CrossRef]
  14. Akçayır, M.; Akçayır, G. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educ. Res. Rev. 2017, 20, 1–11. [Google Scholar] [CrossRef]
  15. Jang, S.; Vitale, J.M.; Jyung, R.W.; Black, J.B. Direct manipulation is better than passive viewing for learning anatomy in a three-dimensional VR environment. Comput. Educ. 2017, 106, 150–165. [Google Scholar] [CrossRef]
  16. Küçük, S.; Kapakin, S.; Göktaş, Y. Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load. Anat. Sci. Educ. 2016, 9, 411–421. [Google Scholar] [CrossRef]
  17. Mikhail, E.M.; Bethel, J.S.; McGlone, J.C. Introduction to Modern Photogrammetry; Wiley: Hoboken, NJ, USA, 2001. [Google Scholar]
  18. Kunkler, K. The role of medical simulation: An overview. Int. J. Med. Robot. Comput. Assist. Surg. 2006, 2, 203–210. [Google Scholar] [CrossRef]
  19. McGaghie, W.C.; Issenberg, S.B.; Cohen, E.R.; Barsuk, J.H.; Wayne, D.B. Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence. Acad. Med. 2011, 86, 706–711. [Google Scholar] [CrossRef] [PubMed]
  20. McGaghie, W.C.; Issenberg, S.B.; Petrusa, E.R.; Scalese, R.J. A critical review of simulation-based medical education research: 2003–2009. Med. Educ. 2010, 44, 50–63. [Google Scholar] [CrossRef]
  21. Okuda, Y.; Bryson, E.O.; DeMaria, S., Jr.; Jacobson, L.; Quinones, J.; Shen, B.; Levine, A.I. The utility of simulation in medical education: What is the evidence? Mt. Sinai J. Med. 2009, 76, 330–343. [Google Scholar] [CrossRef]
  22. Gorbanev, I.; Agudelo-Londoño, S.; González, R.A.; Cortes, A.; Pomares, A.; Delgadillo, V.; Yepes, F.J.; Muñoz, Ó. A systematic review of serious games in medical education: Quality of evidence and pedagogical strategy. Med. Educ. Online 2018, 23, 1438718. [Google Scholar] [CrossRef]
  23. Graafland, M.; Schraagen, J.M.; Schijven, M.P. Systematic review of serious games for medical education and surgical skills training. Br. J. Surg. 2012, 99, 1322–1330. [Google Scholar] [CrossRef] [PubMed]
  24. Barsom, E.Z.; Graafland, M.; Schijven, M.P. Systematic review on the effectiveness of augmented reality applications in medical training. Surg. Endosc. 2016, 30, 4174–4183. [Google Scholar] [CrossRef] [PubMed]
  25. Brown, R.; McIlwain, S.; Willson, B.; Hackett, M. Enhancing Combat Medic training with 3D virtual environments. In Proceedings of the 2016 IEEE International Conference on Serious Games and Applications for Health (SeGAH), Orlando, FL, USA, 11–13 May 2016; pp. 1–7. [Google Scholar]
Figure 1. Devices of the developed system: Halocode (left), Azure Kinect DK (middle), Microsoft HoloLens 2 (right).
Figure 2. System architecture.
Figure 3. Volumetric video surgery instruction.
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Yang, J.; Gang, W.; Ye, Y.; Zhu, Y.; You, S.; Huang, Z.; Lv, T.; Gang, R. Enhancing Anatomy Teaching and Learning Through 3D and Mixed-Reality Tools in Medical Education. Eng. Proc. 2025, 103, 22. https://doi.org/10.3390/engproc2025103022

