Medical Augmented Reality: Definition, Principle Components, Domain Modeling, and Design-Development-Validation Process
Abstract
1. Introduction
- 1. Physical-World Modeling
- 2. Display, including Augmentation and Perceptual Rendering
- 3. Interaction
- 4. Evaluation, including functional Testing & Validation as well as Ethics
2. The Medical Augmented Reality Framework
2.1. Physical World
2.2. Digital World
2.2.1. Sensors
2.2.2. Perception
2.2.3. Digital World
2.3. AR/VR Display
2.4. AR/VR User Interaction
2.5. Evaluation
3. Exemplary Applications of the Medical Augmented Reality Framework
3.1. 3D Telepresence Based on Real-Time Point Cloud Transmission
- Physical World of ArTekMed: The physical world includes the patient, the surrounding environment and objects, and the local users such as paramedics and bystanders.
- Computer Sensors of ArTekMed: On the patient’s side, the sensors comprise the RGB-D cameras, the built-in sensors of the AR-HMD, and microphones. On the remote user’s side, infrared-based sensors capture the LED constellations on the VR headset, controllers, and body-tracking pucks.
- Computer Perception of ArTekMed: The computation units connected to the sensors interpret the acquired data. In particular, the inside-out sensors on the AR headset use simultaneous localization and mapping (SLAM) to compute the local user’s first-person perspective correctly. On the VR side, the tracking system returns the remote user’s six-degrees-of-freedom pose within the digital world.
- Digital World of ArTekMed: The digital world consists of the reconstructed point cloud and virtual avatars that represent the users to convey non-verbal communication and social presence (a minimal point-cloud back-projection sketch follows this list). Additional tools to aid the consultation, including 3D annotations and the Magnorama, are part of this digital world and are rendered for all participating users.
- Rendering Displays of ArTekMed: The local users wear AR-HMDs for in-situ augmentations inside the real environment. These allow the local user to perceive the real world alongside the digital visual and auditory augmentations that ArTekMed generates from co-located and remotely connected participants. The VR users, in turn, perceive the reconstructed point cloud within their VR headsets and can talk to and hear the local AR users.
- Augmentations of ArTekMed: Augmented components are virtual elements such as 3D annotations, the Magnorama, and the avatars. To fully utilize Augmentations, ArTekMed uses the SLAM reconstruction of the AR-HMD for occlusion handling in the real environment.
- Human Perception within ArTekMed: Occlusion handling with the SLAM reconstruction in AR allows users to quickly understand the depth of virtual objects relative to real objects. Stereoscopic rendering within the HMDs in AR and VR lets users observe the scene in 3D. Moreover, tracking of the HMD provides motion parallax as an additional visual cue for depth perception.
- Dynamic UI in ArTekMed: Conventional 2D-in-3D UIs, such as menus and a radial menu for adjusting settings and tools, allow users to interact with the system on a high level. Users also interact with the ArTekMed system using their bodies, which transforms the point cloud reconstruction or the digital avatar, respectively, for non-verbal communication. Diegetic virtual elements such as the 3D annotations and the Magnorama are part of the digital world and are fused with the real world in AR while users interact with the environment.
- Evaluation in ArTekMed: Novel systems such as ArTekMed disrupt standard practices in healthcare. Evaluation should therefore cover the fundamental acceptance of every component of the system as well as its usability. The evaluation covers the acceptance of teleconsultation versus conventional video calls [38], user representations [41], and advanced interaction techniques [42,43], all with clinical use cases in mind.
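To make the interplay of sensors, perception, and digital world described above concrete, the following minimal sketch back-projects one RGB-D frame into a colored point cloud expressed in a shared world frame. It assumes a pinhole depth camera with intrinsics fx, fy, cx, cy and a camera-to-world pose (e.g., from SLAM or outside-in tracking); the function and parameter names are illustrative and do not reproduce ArTekMed’s actual implementation.

```python
import numpy as np

def backproject_rgbd(depth_m, rgb, fx, fy, cx, cy, T_world_cam):
    """Convert one registered RGB-D frame into a colored point cloud in world coordinates.

    depth_m       : (H, W) float array, depth in meters (0 marks invalid pixels)
    rgb           : (H, W, 3) uint8 color image registered to the depth image
    fx, fy, cx, cy: pinhole intrinsics of the depth camera
    T_world_cam   : (4, 4) camera-to-world pose from SLAM or external tracking
    """
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    valid = depth_m > 0

    # Back-project valid pixels to 3D points in the camera frame (pinhole model).
    z = depth_m[valid]
    x = (u[valid] - cx) * z / fx
    y = (v[valid] - cy) * z / fy
    pts_cam = np.stack([x, y, z, np.ones_like(z)], axis=1)   # (N, 4) homogeneous

    # Transform into the shared world frame so all users see one consistent scene.
    pts_world = (T_world_cam @ pts_cam.T).T[:, :3]           # (N, 3)
    colors = rgb[valid]                                       # (N, 3)
    return pts_world, colors
```

Point clouds from several calibrated RGB-D cameras can be concatenated after this step, forming the reconstruction that remote VR users explore and that local AR users see augmented in place.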
3.2. Augmented Reality for Ophthalmology
- Physical World in Ophthalmology: The operating environment consists of a surgical microscope providing direct visual access to the ocular anatomy of the patient. The surgeon uses both hands to manipulate micro-surgical instruments and the feet to control the microscope as well as the iOCT system via pedals on the floor. In modern setups, visualization of the operating area is provided by 3D monitors next to the patient.
- Sensors in Ophthalmology: The sensors comprise (digital) operating microscopes providing a stereo view and intraoperative OCT lasers for 2D and 3D depth visualization.
- Computer Perception in Ophthalmic Applications: The operating microscope provides stereo RGB images, while 2D cross-sectional slices are acquired by the iOCT system. The compounding of these slices enables 3D, and in state-of-the-art systems even temporal 4D, visualization of the surgical area (a minimal compounding and depth-coloring sketch follows this list).
- Digital World in Ophthalmic Applications: The digital world consists of the raw imaging data of all sensors brought into a common coordinate frame, as well as semantic information about anatomical structures and surgical instruments and their relationships. It can further contain information about the surgical phases.
- Augmentations in Ophthalmic Applications: Augmentations are either integrated into the surgical microscope or the 3D display or provided via audio signals and sonification methods. In both cases, they leverage semantic understanding provided by the digital world and aim to improve the perception of the surgical scene.
- Human Perception in Ophthalmic Applications: Depth and distance perception in iOCT volume renderings is mainly provided by color transfer functions and by generating sound signals or modifying musical pieces.
- UI in Ophthalmic Applications: The surgeon’s hands manipulate the light guide and tools. Access to the surgical microscope and the iOCT system, and hence also the interaction with an AR system, is mainly provided via foot switches. Automatic surgical phase understanding could improve the user interface for AR systems and reduce the cognitive load on the surgeon. The design of a user-centric UI is of utmost importance, allowing optimal usability of the system and optimal perception of the provided information without disturbing the surgeon’s workflow.
- Evaluation in Ophthalmic Applications: Ophthalmic applications are carefully evaluated on phantom eye models, in ex-vivo animal wet-lab settings, and on surgical simulators. Close collaboration with surgeons during all stages of design, development, testing, and validation is required for ethical development. Only at a later stage are in-vivo animal studies conducted, and only then can systems be integrated into clinical studies on humans.
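The compounding and depth-encoding ideas mentioned above can be illustrated with a small sketch: consecutive 2D B-scans are stacked into a volume, and a toy transfer function colors each projected pixel by the depth of its strongest reflection. It assumes evenly spaced, motion-free B-scans and is a simplified illustration, not the rendering pipeline of the cited systems.

```python
import numpy as np

def compound_ioct_volume(b_scans):
    """Stack evenly spaced 2D iOCT B-scans (depth x width) into a normalized 3D volume.
    Motion compensation and resampling of a real iOCT engine are omitted."""
    volume = np.stack(b_scans, axis=0).astype(np.float32)    # (n_slices, depth, width)
    return (volume - volume.min()) / max(float(np.ptp(volume)), 1e-8)

def depth_coded_projection(volume, near_rgb=(1.0, 0.9, 0.2), far_rgb=(0.2, 0.4, 1.0)):
    """Toy depth-encoding transfer function: project along the depth (A-scan) axis and
    color each pixel by the depth of its strongest reflection, so shallow structures
    appear in a different hue than deeper retinal layers."""
    depth_idx = volume.argmax(axis=1)                         # (n_slices, width)
    intensity = volume.max(axis=1)                            # (n_slices, width)
    t = depth_idx / max(volume.shape[1] - 1, 1)               # 0 = shallow, 1 = deep
    color = (1 - t)[..., None] * np.array(near_rgb) + t[..., None] * np.array(far_rgb)
    return color * intensity[..., None]                       # (n_slices, width, 3)
```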
3.3. Camera-Augmented Mobile C-Arm
- Physical World in CAMC: The main component of the physical world for CAMC is the patient’s deep-seated anatomy, which is sensed using a mobile X-ray system; further components are the visible patient surface, the surgical tools, the surgeon’s hands, and, in later stages, the operating table, assistants, and the rest of the operating room. In the first versions of CAMC, the co-registration of X-ray sensing and optical imaging improved surgical viewing. Later, the system was extended to include augmentation of full operating-room interactions.
- Computer Sensors and Perception in CAMC: The sensors attached to the mobile C-arm have evolved over time. The first iteration started with a CCD camera mounted near the X-ray source [49]. With the help of a double-mirror system, it captured the live view of the patient anatomy [51], fully registered with the X-ray view without any need for dynamic calibration (a minimal overlay sketch follows this list). Later, the introduction of an RGB-D camera attached to the X-ray detector enabled acquiring a 3D representation of the surgical scene [52,53]. A recent advancement of CAMC adopted HMDs [54,55], both for visualization of the surgical site for the surgeon and for tracking the C-arm.
- Digital World in CAMC: With each iteration of the system, CAMC creates a different understanding of the digital world. The core aspect, however, is to establish a spatio-temporal relation between the imaging data provided by the C-arm, the patient anatomy captured by the attached cameras, the surgeon, and the tools in a common coordinate frame [56].
- Augmentations in CAMC Applications: The patient’s pre-operative or intra-operative 2D and 3D medical data are augmented onto and fused with the live optical view of the patient. The system can further visualize the trajectory of tools as well as annotated points, lines, and planes.
- Human Perception in CAMC Applications: Machine learning improves the perception of the scene by identifying relevant objects in both the X-ray and the optical images captured with the CAMC system, building a fused image that handles occlusion better [57]. In the presence of deformation, the best solution is to use intraoperative imaging such as ultrasound or optoacoustic imaging, and in particular cases low-dose fluoroscopy, to observe the motion of anatomical targets. Alternatively, researchers have used endoscopic views of surface deformation and biomechanical models to estimate the deformation of the target anatomy for AR [58]. In addition, real-time tool tracking can provide the precision required for executing accurate surgical actions based on intraoperative computer-aided guidance [59]. With AR-HMDs, users can understand the spatial relationship between medical imaging data and patient anatomy more effortlessly.
- UI in CAMC Applications: With the HMD variant of the CAMC system, the user interaction is mostly through hand gestures and voice commands [60]. The user can manipulate the scale and position of the spatial X-ray images.
- Evaluation in CAMC Applications: The system was first evaluated on phantoms [51,61], then on human cadaver and ex-vivo animal anatomy [62,63], and finally through a set of patient studies [64,65,66]. Quantitative measures such as radiation dose, planning and execution time, and K-wire insertion accuracy for various surgical tasks have been evaluated over the years with different versions of the CAMC [55,63,67,68]. We have also performed qualitative evaluations of system usability and user depth perception to help better understand and improve the visualization of CAMC.
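As noted in the sensors and perception item above, the essence of the original CAMC construction is that the video camera is registered to the X-ray geometry once, so fusing the two modalities reduces to a fixed warp and blend. The sketch below illustrates this idea with OpenCV; the homography parameter, single-channel X-ray input, and plain alpha blending are simplifying assumptions rather than the calibrated pipeline described in [51].

```python
import cv2
import numpy as np

def fuse_camc_overlay(video_bgr, xray_gray, H_xray_to_video, alpha=0.5):
    """Blend an X-ray image onto the co-registered live video view.

    video_bgr       : (H, W, 3) uint8 live camera image
    xray_gray       : 2D uint8 X-ray image
    H_xray_to_video : 3x3 homography from a one-time offline calibration of the
                      camera/mirror construction; since camera and X-ray source
                      virtually share one optical center, no per-frame
                      re-calibration is needed.
    """
    h, w = video_bgr.shape[:2]
    # Warp the X-ray into the video frame using the fixed calibration.
    xray_warped = cv2.warpPerspective(xray_gray, H_xray_to_video, (w, h))
    xray_bgr = cv2.cvtColor(xray_warped, cv2.COLOR_GRAY2BGR)
    # Simple alpha blending; real systems use richer fusion, e.g., to keep
    # tools and hands visible in front of the X-ray overlay [57].
    return cv2.addWeighted(video_bgr, 1.0 - alpha, xray_bgr, alpha, 0.0)
```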
3.4. Magic Mirror
- Physical World of Magic Mirrors: The physical world relevant to the Magic Mirror consists of its users. To complete the mirror view, the Magic Mirror additionally captures the environment behind the users.
- Computer Sensors and Perception for Magic Mirrors: An RGB-D camera provides both the mirror view and the data from which the users’ poses are computed.
- Digital World of Magic Mirrors: The internal representation of the system consists of the user’s body pose and an anatomical model. The user’s pose deforms the virtual representation of their anatomy accordingly.
- Rendering Displays of Magic Mirrors: The Magic Mirror renders the image of its color camera on a large 2D monitor.
- Augmentation of Magic Mirrors: The virtual anatomies are augmented onto the color image. The system further enhances the illusion of looking into the body, instead of seeing a simple overlay, by using a soft fall-off at the transition between the color image and the augmented view, based on the findings of Bichlmeier et al. [17] (a toy blending sketch follows this list).
- Dynamic UI of Magic Mirrors: Since the anatomy is augmented onto the viewing users, they can use the visuo-proprioceptive feedback loop provided by the mirror to move a hand to a specific organ and, once there, feel its location on their own body.
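The soft fall-off mentioned above can be illustrated with a toy alpha mask that fades a circular "window into the body" smoothly into the camera image. The circular region and linear ramp are simplifying assumptions; the contextual anatomic mimesis of Bichlmeier et al. [17] uses considerably richer, view-dependent weighting.

```python
import numpy as np

def blend_with_falloff(camera_rgb, anatomy_rgb, center_xy, radius, falloff=0.25):
    """Overlay rendered anatomy inside a circular window whose edge fades smoothly
    into the camera image instead of ending in a hard cut.

    camera_rgb, anatomy_rgb : (H, W, 3) float images in [0, 1] of equal size
    center_xy               : (x, y) pixel position of the window center
    radius                  : radius of the fully opaque augmentation region (pixels)
    falloff                 : width of the transition band as a fraction of the radius
    """
    h, w = camera_rgb.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    dist = np.hypot(xx - center_xy[0], yy - center_xy[1])
    # Alpha is 1 inside the window, 0 outside, with a linear ramp over the band.
    band = max(radius * falloff, 1.0)
    alpha = np.clip((radius + band - dist) / band, 0.0, 1.0)[..., None]
    return alpha * anatomy_rgb + (1.0 - alpha) * camera_rgb
```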
4. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Kemp, M. The Science of Art: Optical Themes in Western art from Brunelleschi to Seurat; Yale University Press: London, UK, 1992. [Google Scholar]
- International Year of Light: Ibn al Haytham, pioneer of modern optics celebrated at UNESCO. Available online: https://www.unesco.org/en/articles/international-year-light-ibn-al-haytham-pioneer-modern-optics-celebrated-unesco (accessed on 12 June 2022).
- The ’First True Scientist’. Available online: http://news.bbc.co.uk/2/hi/science/nature/7810846.stm (accessed on 12 June 2022).
- Wootton, D. The Invention of Science: A New History of the Scientific Revolution; Penguin: London, UK, 2015. [Google Scholar]
- Sielhorst, T.; Feuerstein, M.; Navab, N. Advanced Medical Displays: A Literature Review of Augmented Reality. J. Disp. Technol. 2008, 4, 451–467. [Google Scholar] [CrossRef] [Green Version]
- Birlo, M.; Edwards, P.E.; Clarkson, M.; Stoyanov, D. Utility of optical see-through head mounted displays in augmented reality-assisted surgery: A systematic review. Med. Image Anal. 2022, 77, 102361. [Google Scholar] [CrossRef] [PubMed]
- The HoloLens in Medicine: A systematic Review and Taxonomy. arXiv 2022, arXiv:2209.03245.
- Azuma, R.T. A Survey of Augmented Reality. Presence Teleoperators Virtual Environ. 1997, 6, 355–385. [Google Scholar] [CrossRef]
- Bajura, M.; Fuchs, H.; Ohbuchi, R. Merging virtual objects with the real world: Seeing ultrasound imagery within the patient. ACM SIGGRAPH Comput. Graph. 1992, 26, 203–210. [Google Scholar] [CrossRef]
- Clegg, N. Making the Metaverse: What it Is, How it will Be Built, and why it Matters. 2022. Available online: https://nickclegg.medium.com/making-the-metaverse-what-it-is-how-it-will-be-built-and-why-it-matters-3710f7570b04 (accessed on 22 September 2022).
- Özsoy, E.; Örnek, E.P.; Eck, U.; Tombari, F.; Navab, N. Multimodal Semantic Scene Graphs for Holistic Modeling of Surgical Procedures. arXiv Prepr. 2021, arXiv:2106.15309. [Google Scholar]
- Özsoy, E.; Örnek, E.P.; Eck, U.; Czempiel, T.; Tombari, F.; Navab, N. 4D-OR: Semantic Scene Graphs for OR Domain Modeling. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention MICCAI 2022, Singapore, 18–22 September 2022. [Google Scholar]
- Navab, N.; Traub, J.; Sielhorst, T.; Feuerstein, M.; Bichlmeier, C. Action- and Workflow-Driven Augmented Reality for Computer-Aided Medical Procedures. IEEE Comput. Graph. Appl. 2007, 27, 10–14. [Google Scholar] [CrossRef] [Green Version]
- Mezger, U.; Jendrewski, C.; Bartels, M. Navigation in surgery. Langenbeck’s Arch. Surg. 2013, 398, 501–514. [Google Scholar] [CrossRef] [Green Version]
- Okur, A.; Ahmadi, S.A.; Bigdelou, A.; Wendler, T.; Navab, N. MR in OR: First analysis of AR/VR visualization in 100 intra-operative Freehand SPECT acquisitions. In Proceedings of the 2011 10th IEEE International Symposium on Mixed and Augmented Reality, Basel, Switzerland, 26–29 October 2011; pp. 211–218. [Google Scholar] [CrossRef]
- Matinfar, S.; Nasseri, M.A.; Eck, U.; Roodaki, H.; Navab, N.; Lohmann, C.P.; Maier, M.; Navab, N. Surgical Soundtracks: Towards Automatic Musical Augmentation of Surgical Procedures. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention MICCAI 2017, Quebec City, QC, Canada, 10–14 September 2017; pp. 673–681. [Google Scholar]
- Bichlmeier, C.; Wimmer, F.; Heining, S.M.; Navab, N. Contextual Anatomic Mimesis Hybrid In-Situ Visualization Method for Improving Multi-Sensory Depth Perception in Medical Augmented Reality. In Proceedings of the 2007 6th IEEE and ACM International Symposium on Mixed and Augmented Reality, Washington, DC, USA, 13–16 November 2007; pp. 129–138. [Google Scholar] [CrossRef] [Green Version]
- Kutter, O.; Aichert, A.; Bichlmeier, C.; Michael, R.; Ockert, B.; Euler, E.; Navab, N. Real-time Volume Rendering for High Quality Visualization. In Proceedings of the International Workshop on Augmented environments for Medical Imaging including Augmented Reality in Computer-aided Surgery (AMI-ARCS 2008), MICCAI Society, New York, NY, USA, 6–10 September 2008. [Google Scholar]
- Martin-Gomez, A.; Weiss, J.; Keller, A.; Eck, U.; Roth, D.; Navab, N. The Impact of Focus and Context Visualization Techniques on Depth Perception in Optical See-Through Head-Mounted Displays. IEEE Trans. Vis. Comput. Graph. 2021, 1. [Google Scholar] [CrossRef]
- Kalia, M.; Schulte zu Berge, C.; Roodaki, H.; Chakraborty, C.; Navab, N. Interactive Depth of Focus for Improved Depth Perception. In Proceedings of the Medical Imaging and Augmented Reality, Tokyo, Japan, 1–2 August 2008; Zheng, G., Liao, H., Jannin, P., Cattin, P., Lee, S.L., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 221–232. [Google Scholar]
- Kalia, M.; Navab, N.; Fels, S.; Salcudean, T. A Method to Introduce & Evaluate Motion Parallax with Stereo for Medical AR/MR. In Proceedings of the 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR), Osaka, Japan, 23–27 March 2019; pp. 1755–1759. [Google Scholar] [CrossRef]
- Roodaki, H.; Navab, N.; Eslami, A.; Stapleton, C.; Navab, N. SonifEye: Sonification of Visual Information Using Physical Modeling Sound Synthesis. IEEE Trans. Vis. Comput. Graph. 2017, 23, 2366–2371. [Google Scholar] [CrossRef]
- Ostler, D.; Seibold, M.; Fuchtmann, J.; Samm, N.; Feussner, H.; Wilhelm, D.; Navab, N. Acoustic signal analysis of instrument–tissue interaction for minimally invasive interventions. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 771–779. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Jones, B.; Sodhi, R.; Murdock, M.; Mehra, R.; Benko, H.; Wilson, A.; Ofek, E.; MacIntyre, B.; Raghuvanshi, N.; Shapira, L. RoomAlive: Magical Experiences Enabled by Scalable, Adaptive Projector-camera Units. In Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, Honolulu, HI, USA, 5–8 October 2014; ACM: New York, NY, USA, 2014. UIST ’14. pp. 637–644. [Google Scholar] [CrossRef]
- Navab, N.; Feuerstein, M.; Bichlmeier, C. Laparoscopic Virtual Mirror New Interaction Paradigm for Monitor Based Augmented Reality. In Proceedings of the 2007 IEEE Virtual Reality Conference, Charlotte, NC, USA, 10–14 March 2007; pp. 43–50. [Google Scholar]
- Bichlmeier, C.; Heining, S.M.; Rustaee, M.; Navab, N. Virtually Extended Surgical Drilling Device: Virtual Mirror for Navigated Spine Surgery. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention—MICCAI 2007, Brisbane, Australia, 29 October–2 November 2007; Ayache, N., Ourselin, S., Maeder, A., Eds.; 2007; pp. 434–441. [Google Scholar]
- Bichlmeier, C.; Heining, S.M.; Feuerstein, M.; Navab, N. The Virtual Mirror: A New Interaction Paradigm for Augmented Reality Environments. IEEE Trans. Med. Imaging 2009, 28, 1498–1510. [Google Scholar] [CrossRef] [PubMed]
- Wendler, T.; Hartl, A.; Lasser, T.; Traub, J.; Daghighian, F.; Ziegler, S.I.; Navab, N. Towards Intra-operative 3D Nuclear Imaging: Reconstruction of 3D Radioactive Distributions Using Tracked Gamma Probes. In Proceedings of the Medical Image Computing and Computer-Assisted Intervention – MICCAI 2007, Brisbane, Australia, 29 October–2 November 2007; Ayache, N., Ourselin, S., Maeder, A., Eds.; pp. 909–917. [Google Scholar]
- Dünser, A.; Billinghurst, M. Evaluating augmented reality systems. In Handbook of Augmented Reality; Springer: Berlin/Heidelberg, Germany, 2011; pp. 289–307. [Google Scholar]
- Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Hum.-Comput. Interact. 2018, 34, 577–590. [Google Scholar] [CrossRef]
- Hart, S.G. NASA-Task Load Index (NASA-TLX); 20 Years Later. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting, San Francisco, CA, USA, 16–20 October 2006; Volume 50, pp. 904–908. [Google Scholar] [CrossRef] [Green Version]
- Wilson, M.R.; Poolton, J.M.; Malhotra, N.; Ngo, K.; Bright, E.; Masters, R.S. Development and validation of a surgical workload measure: The surgery task load index (SURG-TLX). World J. Surg. 2011, 35, 1961–1969. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Baños, R.M.; Botella, C.; Garcia-Palacios, A.; Villa, H.; Perpiñá, C.; Alcaniz, M. Presence and Reality Judgment in Virtual Environments: A Unitary Construct? CyberPsychol. Behav. 2000, 3, 327–335. [Google Scholar] [CrossRef]
- Nowak, K.L.; Biocca, F. The Effect Of The Agency And Anthropomorphism On Users’ Sense Of Telepresence, Copresence, And Social Presence In Virtual Environments. Presence Teleoperators Virtual Environ. 2003, 12, 481–494. [Google Scholar] [CrossRef]
- Schafer, W.A.; Bowman, D.A. Evaluating The Effects Of Frame Of Reference On Spatial Collaboration Using Desktop Collaborative Virtual Environments. Virtual Real. 2004, 7, 164–174. [Google Scholar] [CrossRef]
- Georgiou, Y.; Kyza, E.A. The Development And Validation Of The ARI Questionnaire: An Instrument For Measuring Immersion In Location-based Augmented Reality Settings. Int. J. Hum.-Comput. Stud. 2017, 98, 24–37. [Google Scholar] [CrossRef]
- Luo, H.; Lee, P.A.; Clay, I.; Jaggi, M.; De Luca, V. Assessment of fatigue using wearable sensors: A pilot study. Digit. Biomarkers 2020, 4, 59–72. [Google Scholar] [CrossRef]
- Strak, R.; Yu, K.; Pankratz, F.; Lazarovici, M.; Sandmeyer, B.; Reichling, J.; Weidert, S.; Kraetsch, C.; Roegele, B.; Navab, N.; et al. Comparison Between Video-Mediated and Asymmetric 3D Teleconsultation During a Preclinical Scenario. In Proceedings of Mensch und Computer 2021 (MuC ’21), Ingolstadt, Germany, 5–8 September 2021; Association for Computing Machinery: New York, NY, USA, 2021; pp. 227–235. [Google Scholar] [CrossRef]
- Roth, D.; Yu, K.; Pankratz, F.; Gorbachev, G.; Keller, A.; Lazarovici, M.; Wilhelm, D.; Weidert, S.; Navab, N.; Eck, U. Real-time Mixed Reality Teleconsultation for Intensive Care Units in Pandemic Situations. In Proceedings of the 2021 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Lisbon, Portugal, 27 March–1 April 2021; pp. 693–694. [Google Scholar]
- Song, T.; Eck, U.; Navab, N. If I Share with you my Perspective, Would you Share your Data with me? In Proceedings of the 2022 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW), Christchurch, New Zealand, 12–16 March 2022; pp. 666–667. [Google Scholar]
- Yu, K.; Gorbachev, G.; Eck, U.; Pankratz, F.; Navab, N.; Roth, D. Avatars for Teleconsultation: Effects of Avatar Embodiment Techniques on User Perception in 3D Asymmetric Telepresence. IEEE Trans. Vis. Comput. Graph. 2021, 27, 4129–4139. [Google Scholar] [CrossRef]
- Yu, K.; Winkler, A.; Pankratz, F.; Lazarovici, M.; Wilhelm, D.; Eck, U.; Roth, D.; Navab, N. Magnoramas: Magnifying Dioramas for Precise Annotations in Asymmetric 3D Teleconsultation. In Proceedings of the 2021 IEEE Virtual Reality and 3D User Interfaces (VR), Lisboa, Portugal, 27 March–1 April 2021; pp. 392–401. [Google Scholar]
- Yu, K.; Eck, U.; Pankratz, F.; Lazarovici, M.; Wilhelm, D.; Navab, N. Duplicated Reality for Co-located Augmented Reality Collaboration. IEEE Trans. Vis. Comput. Graph. 2022, 28, 2190–2200. [Google Scholar] [CrossRef] [PubMed]
- Yu, K.; Zacharis, K.; Eck, U.; Navab, N. Projective Bisector Mirror (PBM): Concept and Rationale. IEEE Trans. Vis. Comput. Graph. 2022, 28, 3694–3704. [Google Scholar] [CrossRef] [PubMed]
- Pauly, O.; Diotte, B.; Fallavollita, P.; Weidert, S.; Euler, E.; Navab, N. Machine learning-based augmented reality for improved surgical scene understanding. Comput. Med. Imaging Graph. 2015, 41, 55–60. [Google Scholar] [CrossRef] [PubMed]
- Roodaki, H.; Filippatos, K.; Eslami, A.; Navab, N. Introducing Augmented Reality to Optical Coherence Tomography in Ophthalmic Microsurgery. In Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality, Fukuoka, Japan, 29 September–3 October 2015; pp. 1–6. [Google Scholar] [CrossRef]
- Weiss, J.; Rieke, N.; Nasseri, M.A.; Maier, M.; Lohmann, C.P.; Navab, N.; Eslami, A. Injection Assistance via Surgical Needle Guidance using Microscope-Integrated OCT (MI-OCT). Invest. Ophthalmol. Vis. Sci. 2018, 59, 287. [Google Scholar]
- Weiss, J.; Eck, U.; Nasseri, M.A.; Maier, M.; Eslami, A.; Navab, N. Layer-Aware iOCT Volume Rendering for Retinal Surgery. In Proceedings of the Eurographics Workshop on Visual Computing for Biology and Medicine, The Eurographics Association, Brno, Czech Republic, 4–6 September 2019; Kozlíková, B., Linsen, L., Vázquez, P.P., Lawonn, K., Raidou, R.G., Eds.; [Google Scholar] [CrossRef]
- Navab, N.; Mitschke, M.; Schütz, O. Camera-augmented mobile C-arm (CAMC) application: 3D reconstruction using a low-cost mobile C-arm. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Cambridge, UK, 19–22 September 1999; Springer: Berlin/Heidelberg, Germany, 1999; pp. 688–697. [Google Scholar]
- Navab, N.; Heining, S.M.; Traub, J. Camera Augmented Mobile C-Arm (CAMC): Calibration, Accuracy Study, and Clinical Applications. IEEE Trans. Med. Imaging 2010, 29, 1412–1423. [Google Scholar] [CrossRef]
- Navab, N.; Bani-Kashemi, A.; Mitschke, M. Merging visible and invisible: Two camera-augmented mobile C-arm (CAMC) applications. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR’99), San Francisco, CA, USA, 20–21 October 1999; pp. 134–141. [Google Scholar]
- Habert, S.; Gardiazabal, J.; Fallavollita, P.; Navab, N. Rgbdx: First design and experimental validation of a mirror-based RGBD X-ray imaging system. In Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality, Fukuoka, Japan, 29 September–3 October 2015; pp. 13–18. [Google Scholar]
- Lee, S.C.; Fuerst, B.; Fotouhi, J.; Fischer, M.; Osgood, G.; Navab, N. Calibration of RGBD camera and cone-beam CT for 3D intra-operative mixed reality visualization. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 967–975. [Google Scholar] [CrossRef]
- Hajek, J.; Unberath, M.; Fotouhi, J.; Bier, B.; Lee, S.C.; Osgood, G.; Maier, A.; Armand, M.; Navab, N. Closing the calibration loop: An inside-out-tracking paradigm for augmented reality in orthopedic surgery. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Granada, Spain, 16–20 September 2018; Springer: Berlin/Heidelberg, Germany, 2018; pp. 299–306. [Google Scholar]
- Fotouhi, J.; Mehrfard, A.; Song, T.; Johnson, A.; Osgood, G.; Unberath, M.; Armand, M.; Navab, N. Development and pre-clinical analysis of spatiotemporal-aware augmented reality in orthopedic interventions. IEEE Trans. Med. Imaging 2020, 40, 765–778. [Google Scholar] [CrossRef]
- Fotouhi, J.; Unberath, M.; Song, T.; Hajek, J.; Lee, S.C.; Bier, B.; Maier, A.; Osgood, G.; Armand, M.; Navab, N. Co-localized augmented human and X-ray observers in collaborative surgical ecosystem. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 1553–1563. [Google Scholar] [CrossRef]
- Pauly, O.; Katouzian, A.; Eslami, A.; Fallavollita, P.; Navab, N. Supervised classification for customized intraoperative augmented reality visualization. In Proceedings of the 2012 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Atlanta, GA, USA, 5–8 November 2012; pp. 311–312. [Google Scholar]
- Paulus, C.J.; Haouchine, N.; Cazier, D.; Cotin, S. Augmented Reality during Cutting and Tearing of Deformable Objects. In Proceedings of the 2015 IEEE International Symposium on Mixed and Augmented Reality, Fukuoka, Japan, 29 September–3 October 2015; pp. 54–59. [Google Scholar] [CrossRef] [Green Version]
- Pakhomov, D.; Premachandran, V.; Allan, M.; Azizian, M.; Navab, N. Deep Residual Learning for Instrument Segmentation in Robotic Surgery. In Proceedings of the Machine Learning in Medical Imaging, Shenzhen, China, 13 October 2019; Suk, H.I., Liu, M., Yan, P., Lian, C., Eds.; Springer International Publishing: Cham, Switzerland, 2019; pp. 566–573. [Google Scholar]
- Fotouhi, J.; Unberath, M.; Song, T.; Gu, W.; Johnson, A.; Osgood, G.; Armand, M.; Navab, N. Interactive Flying Frustums (IFFs): Spatially aware surgical data visualization. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 913–922. [Google Scholar] [CrossRef]
- Mitschke, M.; Bani-Hashemi, A.; Navab, N. Interventions under video-augmented X-ray guidance: Application to needle placement. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Pittsburgh, PA, USA, 11–14 October 2000; Springer: Berlin/Heidelberg, Germany, 2000; pp. 858–868. [Google Scholar]
- Traub, J.; Ahmadi, S.A.; Padoy, N.; Wang, L.; Heining, S.M.; Euler, E.; Jannin, P.; Navab, N. Workflow Based Assessment of the Camera Augmented Mobile C-arm System. In Proceedings of the AMIARCS workshop of MICCAI 2008, New York, NY, USA, 6–10 September 2008. [Google Scholar]
- Wang, L.; Landes, J.; Weidert, S.; Blum, T.; von der Heide, A.; Euler, E.; Navab, N. First Animal Cadaver Study for Interlocking of Intramedullary Nails under Camera Augmented Mobile C-arm. In Proceedings of the Information Processing in Computer-Assisted Interventions, Geneva, Switzerland, 23 June 2010; Navab, N., Jannin, P., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 56–66. [Google Scholar]
- Weidert, S.; Wang, L.; von der Heide, A.; Navab, N.; Euler, E. Intraoperative augmented reality visualization. Current state of development and initial experiences with the CamC. Unfallchirurg 2012, 115, 209–213. [Google Scholar] [CrossRef]
- Navab, N.; Blum, T.; Wang, L.; Okur, A.; Wendler, T. First Deployments of Augmented Reality in Operating Rooms. Computer 2012, 45, 48–55. [Google Scholar] [CrossRef]
- von der Heide, A.M.; Fallavollita, P.; Wang, L.; Sandner, P.; Navab, N.; Weidert, S.; Euler, E. Camera-augmented mobile C-arm (CamC): A feasibility study of augmented reality imaging in the operating room. Int. J. Med. Robot. 2018, 14, e1885. [Google Scholar] [CrossRef] [PubMed]
- Fischer, M.; Fuerst, B.; Lee, S.C.; Fotouhi, J.; Habert, S.; Weidert, S.; Euler, E.; Osgood, G.; Navab, N. Preclinical usability study of multiple augmented reality concepts for K-wire placement. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 1007–1014. [Google Scholar] [CrossRef] [PubMed]
- Fotouhi, J.; Fuerst, B.; Lee, S.C.; Keicher, M.; Fischer, M.; Weidert, S.; Euler, E.; Navab, N.; Osgood, G. Interventional 3D augmented reality for orthopedic and trauma surgery. In Proceedings of the 16th Annual Meeting of the International Society for Computer Assisted Orthopedic Surgery (CAOS), Osaka, Japan, 8–11 June 2016. [Google Scholar]
- Maes, P.; Darrell, T.; Blumberg, B.; Pentland, A. The ALIVE system: Wireless, Full-body Interaction with Autonomous Agents. Multimed. Syst. 1997, 5, 105–112. [Google Scholar] [CrossRef]
- Blum, T.; Kleeberger, V.; Bichlmeier, C.; Navab, N. Mirracle: An Augmented Reality Magic Mirror System For Anatomy Education. In Proceedings of the 2012 IEEE Virtual Reality Workshops (VRW), Costa Mesa, CA, USA, 4–8 March 2012; pp. 115–116. [Google Scholar]
- Meng, M.; Fallavollita, P.; Blum, T.; Eck, U.; Sandor, C.; Weidert, S.; Waschke, J.; Navab, N. Kinect for interactive AR anatomy learning. In Proceedings of the 2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR), Adelaide, SA, Australia, 1–4 October 2013; pp. 277–278. [Google Scholar] [CrossRef]
- Bork, F.; Barmaki, R.; Eck, U.; Fallavolita, P.; Fuerst, B.; Navab, N. Exploring Non-reversing Magic Mirrors for Screen-based Augmented Reality Systems. In Proceedings of the 2017 IEEE Virtual Reality (VR), Los Angeles, CA, USA, 18–22 March 2017; pp. 373–374. [Google Scholar]
- Barmaki, R.; Yu, K.; Pearlman, R.; Shingles, R.; Bork, F.; Osgood, G.M.; Navab, N. Enhancement of Anatomical Education Using Augmented Reality: An Empirical Study of Body Painting. Anat. Sci. Educ. 2019, 12, 599–609. [Google Scholar] [CrossRef]
- Bork, F.; Stratmann, L.; Enssle, S.; Eck, U.; Navab, N.; Waschke, J.; Kugelmann, D. The Benefits of an Augmented Reality Magic Mirror System for Integrated Radiology Teaching in Gross Anatomy. Anat. Sci. Educ. 2019, 12, 585–598. [Google Scholar] [CrossRef]