Engineering Proceedings
  • Proceeding Paper
  • Open Access

18 August 2025

Enhancing Anatomy Teaching and Learning Through 3D and Mixed-Reality Tools in Medical Education †

1 School of Design Arts, Xiamen University of Technology, Xiamen 361024, China
2 Faculty of Mathematics, University of Waterloo, Waterloo, ON N2L 3G1, Canada
3 The Second Affiliated Hospital of Xiamen Medical College, Xiamen 361021, China
* Authors to whom correspondence should be addressed.

Abstract

We developed an innovative system to enhance medical education, specifically in teaching anatomy and surgical techniques. We applied three-dimensional (3D) and mixed-reality (MR) technologies to create immersive learning experiences. The system enables students to gain hands-on experience in a virtual environment and to understand anatomical structures and related surgical procedures. We designed the system architecture, user interface, and educational methodologies. By integrating these technologies into teaching methods, the system increases student engagement and learning efficiency, demonstrating the potential of 3D and MR technologies in medical education.

1. Introduction

Lung anatomy presents challenges in thoracic surgery education, primarily due to its substantial anatomical variability and intricate three-dimensional architecture. Traditional teaching methods in thoracic surgery rely on anatomical textbooks and two-dimensional imaging materials for knowledge transmission. However, these conventional methods fail to adequately convey the complex three-dimensional anatomical relationships of pulmonary structures. Moreover, the transition from open surgery to thoracoscopic procedures has steepened the learning curve, as surgeons must adapt to limited tissue manipulation and reduced tactile feedback [1].
To enhance the understanding of anatomical relationships and facilitate preoperative planning, three-dimensional (3D) visualization techniques are used. Advanced imaging technologies, including 3D computed tomography (CT)-based reconstruction and simulation, are integrated into immersive extended reality (XR) platforms, such as virtual reality (VR), mixed reality (MR), and augmented reality (AR) [2,3]. These immersive technologies enable healthcare professionals to visualize, manipulate, and interact with patient-specific 3D models in fully virtual and hybrid simulated environments [4,5,6]. VR technology, in particular, bridges the gap between theoretical knowledge and practical training by allowing physicians to acquire cognitive understanding and technical skills in a risk-free environment outside the operating room.
In this study, we developed an innovative educational system to enhance thoracic surgical training. We examined the system architecture, user interaction modalities, pedagogical methodology, and learning outcomes of our VR-based platform. The system integrates interactive virtual environments into anatomical visualization, enabling medical trainees to acquire hands-on experience through immersive simulation. The system facilitates an understanding of complex pulmonary anatomical structures and associated surgical procedures, while providing a risk-free environment for skill development and procedural practice.

3. System Design and Implementation

3.1. Hardware and Software

In this study, we implemented Halocode (https://support.makeblock.com/hc/en-us/articles/20072180888343-About-Halocode) (accessed on 8 August 2025) as the primary data processing and control unit of the system. Halocode processes input signals from compatible sensors, including angular and magnetic sensors, to facilitate user data tracking, gesture recognition, and real-time interaction control. Within the assessment component, Halocode presents the question-answering interface and coordinates the logical processing of response selection, ensuring fluid interaction throughout the educational experience.
For the MR visualization component, we employed Microsoft HoloLens 2 to render volumetric videos in the system (Figure 1). The implementation is built on the Mixed Reality Toolkit (MRTK) 2.7 and Unity 2019.4.23. Beyond conventional surgical skills training, the system incorporates an interactive assessment module. The evaluation interface adopts angular and magnetic sensors for response selection, enabling learners to indicate their choices through sensor-based interactions. The system incorporates automatic response logging functionality and delivers instantaneous feedback alongside comprehensive evaluation based on predetermined assessment criteria.
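The paper does not describe how sensor readings are mapped to answer choices, but the idea can be illustrated with a minimal sketch. The angular range, number of options, and function name below are illustrative assumptions, not the authors' implementation:

```python
def select_option(roll_deg: float, num_options: int = 4) -> int:
    """Map an angular-sensor reading (assumed -90..90 degrees) to one
    of N answer options by dividing the range into equal sectors.
    Hypothetical example; thresholds are not from the paper."""
    # Clamp the reading to the expected range.
    roll = max(-90.0, min(90.0, roll_deg))
    # One equal angular sector per option.
    sector = 180.0 / num_options
    index = int((roll + 90.0) // sector)
    # Guard the upper boundary (exactly +90 degrees).
    return min(index, num_options - 1)
```

A reading near the center of the range would select a middle option, while readings at the extremes select the first or last option; the real system would additionally debounce the sensor stream before committing a choice.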
Figure 1. Devices of the developed system: Halocode (left), Azure Kinect DK (middle), Microsoft HoloLens 2 (right).

3.2. System Architecture

Surgical skill acquisition without expert supervision yields suboptimal outcomes. To enhance the effectiveness of surgical training, the developed system incorporates multiple high-precision sensors for comprehensive operative data tracking and recording. Inertial and magnetic sensors were used to collect operational parameters, including the spatial positioning of surgical instruments, movement trajectories, and procedural angles. These high-fidelity data metrics were used to evaluate procedural standardization and provide quantitative benchmarks for surgical skill progression. The system's collection and analysis of these objective parameters enable continuous performance assessment and skill refinement in surgical training (Figure 2).
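The specific metrics computed from the tracked trajectories are not listed in the paper; as one hedged illustration, a sampled instrument trajectory could be reduced to quantitative benchmarks such as total path length and straight-line efficiency (the function name and metric choice are assumptions):

```python
import math

def trajectory_metrics(samples):
    """Reduce a list of (x, y, z) instrument positions, sampled over
    time, to simple quantitative metrics: total path length and a
    straight-line efficiency ratio (1.0 = perfectly direct motion).
    Illustrative sketch; not the authors' evaluation criteria."""
    # Sum distances between consecutive samples to get the path length.
    path = sum(math.dist(a, b) for a, b in zip(samples, samples[1:]))
    # Distance from first to last sample: the ideal direct motion.
    direct = math.dist(samples[0], samples[-1])
    efficiency = direct / path if path > 0 else 1.0
    return {"path_length": path, "efficiency": efficiency}
```

Metrics of this kind are commonly compared against expert baselines to give trainees an objective indication of skill progression.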
Figure 2. System architecture.

3.3. Interactive Design

A distinguishing feature of the developed system is its capacity to facilitate physical movement replication, gesture memorization, and spatial comprehension of surgical instruments and anatomical structures before operation. This enables it to overcome the limitations of traditional fixed-screen learning approaches. This design was developed to effectively bridge the gap between simulated training and clinical practice in instrument positioning, equipment familiarization, and workflow optimization. The system transforms the educational experience from passive observation to active participation, from 2D screen-based learning to immersive spatial interaction.
The system comprises two distinct phases. In the initial volumetric video learning phase, learners engage with surgical content through HoloLens 2 and can freely manipulate the 3D content. The system then transitions to an interactive assessment phase, where learners use the angular and magnetic sensors for response selection. The system provides real-time feedback on responses and generates comprehensive performance reports upon completion (Figure 3).
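The assessment phase described above (logging each response, giving immediate feedback, and producing a final report) can be sketched as a small session object. The class and field names are hypothetical, chosen only to make the workflow concrete:

```python
from dataclasses import dataclass, field

@dataclass
class AssessmentSession:
    """Sketch of the assessment phase: logs each response, returns
    immediate feedback, and summarizes performance at the end.
    Illustrative only; not the authors' implementation."""
    answer_key: dict                       # question id -> correct option index
    log: list = field(default_factory=list)

    def submit(self, question_id, choice):
        # Record the response and return real-time feedback.
        correct = self.answer_key[question_id] == choice
        self.log.append({"question": question_id,
                         "choice": choice,
                         "correct": correct})
        return "correct" if correct else "incorrect"

    def report(self):
        # Comprehensive summary generated upon completion.
        total = len(self.log)
        right = sum(1 for entry in self.log if entry["correct"])
        return {"answered": total, "score": right / total if total else 0.0}
```

In the deployed system, `submit` would be driven by the sensor-based selection rather than direct calls, and the report could include per-question timing alongside the score.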
Figure 3. Volumetric video surgery instruction.

4. Conclusions

The convergence of MR and volumetric video technologies enhances thoracic surgical education and training. The developed system integrates volumetric video technology with wearable MR devices, complemented by multi-sensor data acquisition, to create an advanced educational platform for thoracic surgery training. The system delivers immersive surgical instruction while incorporating quantitative assessment mechanisms through an interactive evaluation module, providing objective metrics for evaluating educational efficacy. The system's dual functionality, combining immersive visualization with performance analytics, enables both surgical skill acquisition and assessment. Positive user feedback suggests potential for expanded implementation and validation.

Author Contributions

Conceptualization, W.G. and R.G.; methodology, W.G. and Y.Y.; software, R.G. and Z.H.; validation, S.Y., T.L. and S.Y.; investigation, J.Y., Y.Z. and Z.H.; resources, W.G. and T.L.; writing—original draft preparation, W.G., Y.Y. and J.Y.; writing—review and editing, W.G., R.G. and T.L.; visualization, J.Y., Y.Y. and Z.H.; supervision, W.G. and R.G.; project administration, W.G.; funding acquisition, T.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Xiamen Educational Science Planning Project (No. 23026) and Xiamen Medical College Educational Research Project (No. XBLG2023008).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Meershoek, A.J.A.; Loonen, T.G.J.; Maal, T.J.J.; Hekma, E.J.; Hugen, N. Three Dimensional Printing as a Tool For Anatomical Training in Lung Surgery. Med. Sci. Educ. 2023, 33, 873–878. [Google Scholar] [CrossRef]
  2. Sadeghi, A.H.; El Mathari, S.; Abjigitova, D.; Maat, A.P.; Taverne, Y.J.; Bogers, A.J.; Mahtab, E.A. Current and future applications of virtual, augmented, and MR in cardiothoracic surgery. Ann. Thorac. Surg. 2022, 113, 681–691. [Google Scholar] [CrossRef] [PubMed]
  3. Vervoorn, M.T.; Wulfse, M.; Hoesein, F.A.A.M.; Stellingwerf, M.; van der Kaaij, N.P.; de Heer, L.M. Application of three-dimensional computed tomography imaging and reconstructive techniques in lung surgery: A mini-review. Front. Surg. 2022, 9, 1079857. [Google Scholar] [CrossRef]
  4. Tokuno, J.; Chen-Yoshikawa, T.F.; Nakao, M.; Iwakura, M.; Motoki, T.; Matsuda, T.; Date, H. Creation of a video library for education and virtual simulation of anatomical lung resection. Interact. Cardiovasc. Thorac. Surg. 2022, 34, 808–813. [Google Scholar] [CrossRef]
  5. Dho, Y.-S.; Park, S.J.; Choi, H.; Kim, Y.; Moon, H.C.; Kim, K.M.; Kang, H.; Lee, E.J.; Kim, M.-S.; Kim, J.W.; et al. Development of an insideout augmented reality technique for neurosurgical navigation. Neurosurg. Focus 2021, 51, E21. [Google Scholar] [CrossRef]
  6. Andrews, C.; Southworth, M.K.; Silva, J.N.A.; Silva, J.R. Extended reality in medical practice. Curr. Treat. Options Cardiovasc. Med. 2019, 21, 18. [Google Scholar] [CrossRef]
  7. Alaker, M.; Wynn, G.R.; Arulampalam, T. VR training in laparoscopic surgery: A systematic review & meta-analysis. Int. J. Surg. 2016, 29, 85–94. [Google Scholar] [CrossRef] [PubMed]
  8. Vitish-Sharma, P.; Knowles, J.; Patel, B. Acquisition of fundamental laparoscopic skills: Is a box really as good as a VR trainer? Int. J. Surg. 2011, 9, 659–661. [Google Scholar] [CrossRef] [PubMed]
  9. Jones, C.; Ramanau, R.; Cross, S.; Healing, G. Net generation or Digital Natives: Is there a distinct new generation entering university? Comput. Educ. 2010, 54, 722–732. [Google Scholar] [CrossRef]
  10. Murad, M.H.; Coto-Yglesias, F.; Varkey, P.; Prokop, L.J.; Murad, A.L. The effectiveness of self-directed learning in health professions education: A systematic review. Med. Educ. 2010, 44, 1057–1068. [Google Scholar] [CrossRef]
  11. Clark, R.C.; Mayer, R.E. E-Learning and the Science of Instruction: Proven Guidelines for Consumers and Designers of Multimedia Learning; John Wiley & Sons: Hoboken, NJ, USA, 2016. [Google Scholar]
  12. Cook, D.A.; Brydges, R.; Hamstra, S.J.; Zendejas, B.; Szostek, J.H.; Wang, A.T.; Erwin, P.J.; Hatala, R. Comparative effectiveness of technology-enhanced simulation versus other instructional methods: A systematic review and meta-analysis. Simul. Healthc. 2012, 7, 308–320. [Google Scholar] [CrossRef]
  13. Dalgarno, B.; Lee, M.J.W. What are the learning affordances of 3-D virtual environments? Br. J. Educ. Technol. 2010, 41, 10–32. [Google Scholar] [CrossRef]
  14. Akçayır, M.; Akçayır, G. Advantages and challenges associated with augmented reality for education: A systematic review of the literature. Educ. Res. Rev. 2017, 20, 1–11. [Google Scholar] [CrossRef]
  15. Jang, S.; Vitale, J.M.; Jyung, R.W.; Black, J.B. Direct manipulation is better than passive viewing for learning anatomy in a three-dimensional VR environment. Comput. Educ. 2017, 106, 150–165. [Google Scholar] [CrossRef]
  16. Küçük, S.; Kapakin, S.; Göktaş, Y. Learning anatomy via mobile augmented reality: Effects on achievement and cognitive load. Anat. Sci. Educ. 2016, 9, 411–421. [Google Scholar] [CrossRef]
  17. Mikhail, E.M.; Bethel, J.S.; McGlone, J.C. Introduction to Modern Photogrammetry; Wiley: Hoboken, NJ, USA, 2001. [Google Scholar]
  18. Kunkler, K. The role of medical simulation: An overview. Int. J. Med. Robot. Comput. Assist. Surg. 2006, 2, 203–210. [Google Scholar] [CrossRef]
  19. McGaghie, W.C.; Issenberg, S.B.; Cohen, E.R.; Barsuk, J.H.; Wayne, D.B. Does Simulation-Based Medical Education With Deliberate Practice Yield Better Results Than Traditional Clinical Education? A Meta-Analytic Comparative Review of the Evidence. Acad. Med. 2011, 86, 706–711. [Google Scholar] [CrossRef] [PubMed]
  20. McGaghie, W.C.; Issenberg, S.B.; Petrusa, E.R.; Scalese, R.J. A critical review of simulation-based medical education research: 2003–2009. Med. Educ. 2010, 44, 50–63. [Google Scholar] [CrossRef]
  21. Okuda, Y.; Bryson, E.O.; DeMaria, S., Jr.; Jacobson, L.; Quinones, J.; Shen, B.; Levine, A.I. The utility of simulation in medical education: What is the evidence? Mt. Sinai J. Med. 2009, 76, 330–343. [Google Scholar] [CrossRef]
  22. Gorbanev, I.; Agudelo-Londoño, S.; González, R.A.; Cortes, A.; Pomares, A.; Delgadillo, V.; Yepes, F.J.; Muñoz, Ó. A systematic review of serious games in medical education: Quality of evidence and pedagogical strategy. Med. Educ. Online 2018, 23, 1438718. [Google Scholar] [CrossRef]
  23. Graafland, M.; Schraagen, J.M.; Schijven, M.P. Systematic review of serious games for medical education and surgical skills training. Br. J. Surg. 2012, 99, 1322–1330. [Google Scholar] [CrossRef] [PubMed]
  24. Barsom, E.Z.; Graafland, M.; Schijven, M.P. Systematic review on the effectiveness of augmented reality applications in medical training. Surg. Endosc. 2016, 30, 4174–4183. [Google Scholar] [CrossRef] [PubMed]
  25. Brown, R.; McIlwain, S.; Willson, B.; Hackett, M. Enhancing Combat Medic training with 3D virtual environments. In Proceedings of the 2016 IEEE International Conference on Serious Games and Applications for Health (SeGAH), Orlando, FL, USA, 11–13 May 2016; pp. 1–7. [Google Scholar]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
