Article

Augmented Reality in Surgery: A Scoping Review

1 Department of Information Engineering, University of Florence, 50139 Florence, Italy
2 Department of Biomedical Experimental and Clinical Sciences “Mario Serio”, University of Florence, 50139 Florence, Italy
3 Department of Medical Biotechnologies, University of Siena, 53100 Siena, Italy
4 Epica Imaginalis, 50019 Sesto Fiorentino, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2022, 12(14), 6890; https://doi.org/10.3390/app12146890
Submission received: 5 May 2022 / Revised: 1 July 2022 / Accepted: 4 July 2022 / Published: 7 July 2022

Abstract

Augmented reality (AR) is an innovative technology that enhances the real world by superimposing virtual objects onto it. The aim of this study was to analyze the applications of AR in medicine and identify which of its technical solutions are the most used. We carried out a scoping review of articles published between 2019 and February 2022. The initial search yielded a total of 2649 articles. After applying filters, removing duplicates and screening, we included 34 articles in our analysis. The analysis highlighted that AR has traditionally been used mainly in orthopedics, in addition to maxillofacial surgery and oncology. Regarding displays, the Microsoft HoloLens optical viewer is the most used device. For the tracking and registration phases, the marker-based method with rigid registration remains the most used system. Overall, the results of this study suggest that AR is an innovative technology with numerous advantages, finding applications in several new surgical domains. Considering the available data, however, it is not yet possible to clearly identify all the fields of application and the best AR technologies.

1. Introduction

Imaging is known to play an increasingly important role in many surgical domains [1]. Its origin can be dated back to 1895, when W. C. Roentgen discovered X-rays [2]. While X-rays found increasing application over the course of the twentieth century, other techniques have been developed in more recent years, and acquiring data from the internal structures of the human body has become more and more useful [1,3,4,5]. All this facilitated an increasing use of images to guide surgeons during interventions, leading to the establishment of image-guided surgery (IGS) [6]. The need to reduce surgical invasiveness, by supporting physicians in the diagnostic and preoperative phases as well as during surgeries themselves, led to the use of different solutions such as the 3D visualization of anatomical parts and the application of augmented reality (AR) in surgery [1,3,4]. Augmented reality consists in merging the real world with virtual objects (VOs) generated by computer graphics systems, creating a world for the user that is augmented with VOs. The first application of AR in medicine dates back to 1968, when Sutherland created the first head-mounted display [7]. The term AR is often used in conjunction with virtual reality (VR). The difference between them is that VR creates an artificial digital environment, stimulating the senses of the user and simulating the external world through computer graphics [8], while AR overlays computer-generated images onto the real world, enhancing the user's perception and showing something that would otherwise not be perceptible, as reported by Park et al. [1] and Desselle et al. [9].
The application of AR in IGS can be an increasingly important opportunity for the treatment of patients. In particular, AR allows surgeons to see 3D images projected directly onto patients thanks to special displays, described in the next section. This can ease the perception of the anatomy under examination and lighten the operators' workload compared to the traditional approach, in which 2D preoperative images displayed on 2D monitors require the doctor to mentally reconstruct them as 3D objects and to look away from the patient [1,5].
The purpose of this review is to provide an overview of AR, describing the medical applications in which it can be used and the aspects that characterize this technology, so as to give doctors information on this emerging tool. We would like it to be a starting point for more in-depth research and applications in the clinical field. In order to better understand AR applications, this review starts by describing some key technological aspects: tracking, registration and displays.

2. Theoretical Background

This section describes the main aspects leading to the visualization of VOs superimposed on the real world. The workflow of augmented-reality-enabled systems is shown in Figure 1: once the virtual model has been rendered, tracking and registration are the two basic steps. Tracking and registration provide the correct spatial positioning of the VOs with respect to the real world [10]. This is possible because, through tracking, the spatial characteristics of an object are detected and measured. Specifically, in AR, tracking indicates the operations necessary to determine the device's six degrees of freedom (3D location and orientation) within the environment, which are needed to compute the user's point of view in real time. Tracking can be performed outdoors or indoors; we focus on the latter. Two methods of indoor tracking are distinguishable: outside-in and inside-out. In the outside-in method, the sensors are placed at a stationary position in the environment and sense the device location, often resorting to marker-based systems [11]. In the inside-out method, the camera or sensors are placed on the device itself, whose spatial features are to be tracked in the environment. In this case, the device determines how its position changes in relation to the environment, as in head-mounted displays (HMDs). Inside-out tracking can be marker-based or marker-less. The marker-based vision technique, making use of optical sensors, measures the device pose starting from the recognition of fiducial markers placed in the environment. This method can also hyperlink physical objects to web-based content using graphic tags or automatic identification technologies such as radio-frequency identification (RFID) systems [12]. The marker-less method, conversely, does not require fiducial markers.
It bases its measurements on the recognition of distinct features present in the environment, which are in turn used to localize the position of the device, in combination with computer vision and image-processing techniques. Registration involves matching and aligning the tracked spatial features obtained from the real world (RW) with the corresponding points of the VOs, to reach an optimal overlap between them [1]. The accuracy of this process allows an accurate representation of the virtual content over the real world and determines the natural appearance of an augmented image [13]. The registration phase is connected to the tracking phase, and based on how the two are accomplished, the process is defined as manual, fully automatic or semiautomatic. The manual process refers to manual registration with manual tracking: it consists in finding landmarks both on the model and on the patient, and consequently manually orienting and resizing the preoperative 3D model displayed on the operative monitor to make it match the real images. The fully automatic process is the most complex one, especially with soft tissues. Since real-world objects (ROs) change shape over time, the same deformation needs to be applied to the VOs: any deformation during surgery, due to events such as respiration, can otherwise result in an inaccurate real-time registration and an imprecise overlap between the 3D VOs and the ROs. Finally, the semiautomatic process associates automatic tracking with manual registration. The identification of landmark structures, both on the 3D model and on the real structures, occurs automatically, while the overlay of the model, and its orienting and resizing, occurs manually. This aspect is what differentiates the automatic process from the semiautomatic one.
The latter provides the overlay of the AR images on the real scene statically and manually, while the former makes the 3D virtual models dynamically match the actual structures [1,14,15,16]. For the visualization of the VOs onto the real world, several AR display technologies exist, usually classified as head, body and world devices, depending on where they are located [7,17]. World devices are located in a fixed place. This category includes desktop displays used as AR displays, and projector-based displays. The former are equipped with a webcam, a virtual mirror showing the scene framed by the camera and a virtual showcase, allowing the user to see the scene alongside additional information. Projector-based displays cast virtual objects directly onto the surfaces of the corresponding real-world objects. Body devices usually refers to handheld Android-based platforms, such as tablets or mobile phones. These devices use the camera to capture the actual scene in real time, while sensors (e.g., gyroscopes, accelerometers and magnetometers) determine their rotation. They usually resort to fiducial image targets for the tracking-registration phase [18]. Finally, HMDs are near-eye displays: wearable devices in the form of glasses that have the advantage of leaving the hands free to perform other tasks. HMDs are mainly of two types: video see-through and optical see-through. The former use a camera that records the external real world in real time; its frames, combined with the VOs, produce the final images displayed directly on the user's lenses. The optical see-through devices, instead, consist of an optical combiner or holographic waveguides (the lenses) that overlay images transmitted by a projector onto the same lenses through which the real world is normally viewed.
In this way, the user directly views reality augmented with the VOs overlaid onto it [7,19]. Figure 2 shows an example of an HMD.
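As a concrete illustration of the marker-based tracking described above, the sketch below decodes the inner bit grid of a square fiducial marker into an orientation-independent ID. This is a simplified, hypothetical example, not taken from any of the reviewed systems; production libraries such as ArUco additionally perform corner detection, perspective rectification and error correction before this step.

```python
# Simplified fiducial-marker decoding (illustrative only). We assume the
# marker has already been detected in the camera image and its inner bit
# grid extracted; here we only show how that grid maps to a marker ID.

def decode_marker(bits):
    """Read a binary bit grid row by row into an integer marker ID."""
    marker_id = 0
    for row in bits:
        for bit in row:
            marker_id = (marker_id << 1) | bit
    return marker_id

def rotate(bits):
    """Rotate the grid 90 degrees clockwise (the camera may see the
    marker in any of four orientations)."""
    return [list(col) for col in zip(*bits[::-1])]

def canonical_id(bits):
    """Try all four orientations and keep the smallest ID, so the same
    physical marker decodes identically regardless of camera roll."""
    ids = []
    for _ in range(4):
        ids.append(decode_marker(bits))
        bits = rotate(bits)
    return min(ids)

grid = [[1, 0, 1],
        [0, 1, 0],
        [1, 1, 0]]
print(canonical_id(grid))
```

Because the canonical ID is the minimum over all four rotations, the tracker can identify the marker before it has estimated the device pose.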
The different techniques are summarized in Figure 3. The aim of this study was to describe the state of the art of AR in the surgical domain. A further objective was the description and analysis of the various procedures used to create the virtual images. This scoping review aims to summarize the surgical fields in which this new technology finds its best applications, providing doctors with an overview of the key aspects behind viewing accurate virtual images superimposed on the real world. The research highlighted that marker-based tracking and rigid registration are currently the most used systems, as reported in the following sections.
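The rigid registration mentioned above has a standard closed-form solution: given corresponding landmark points on the preoperative 3D model and on the patient (assumed already matched by the tracking system), the best-fit rotation and translation can be obtained from a singular value decomposition (the Kabsch algorithm). The sketch below, with synthetic data, is a minimal illustration of that step, not the implementation used in any of the reviewed systems.

```python
# Minimal rigid (landmark-based) registration via the Kabsch algorithm.
import numpy as np

def rigid_registration(model_pts, patient_pts):
    """Least-squares rigid transform mapping model_pts onto patient_pts.
    Both inputs are (N, 3) arrays of corresponding landmarks."""
    cm = model_pts.mean(axis=0)                   # centroids
    cp = patient_pts.mean(axis=0)
    H = (model_pts - cm).T @ (patient_pts - cp)   # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cp - R @ cm
    return R, t

# Synthetic check: rotate and translate some landmarks, then recover the motion.
rng = np.random.default_rng(0)
model = rng.normal(size=(6, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -2.0, 5.0])
patient = model @ R_true.T + t_true
R, t = rigid_registration(model, patient)
print(np.allclose(R, R_true), np.allclose(t, t_true))
```

This is exactly the assumption that breaks down with soft tissue: when the organ deforms between imaging and surgery, no single (R, t) pair aligns the landmarks, which is why deformable (non-rigid) registration is the harder, fully automatic case discussed above.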

3. Materials and Methods

We followed the PRISMA guidelines for scoping reviews [20]. The results are shown in Figure 4. The histogram in Figure 5 shows the trend in the number of publications from 1982 to 2021 indexed on Scopus when searching for English-language articles on “augmented reality in surgery”. Between 2020 and 2021, the number of publications increased by 40%. In 2022, at the time of writing, 50 articles had already been published and indexed on Scopus.

3.1. Inclusion Criteria

The studies included in the review had to be related to the main topic, augmented reality. We limited the selection by imposing restrictions on the document type (articles only) and on the language (English only). The query was limited to a relatively short period of time (2019–February 2022), ensuring that attention was focused on the innovations introduced in the latest years. The query we used on Scopus was: “TITLE-ABS-KEY (“augmented reality” AND surgery) AND (LIMIT-TO (DOCTYPE, “ar”)) AND (LIMIT-TO (LANGUAGE, “English”)) AND (LIMIT-TO (PUBYEAR, 2022) OR LIMIT-TO (PUBYEAR, 2021) OR LIMIT-TO (PUBYEAR, 2020) OR LIMIT-TO (PUBYEAR, 2019))”, with an analogous search on PubMed.
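For reproducibility, the query quoted above can be assembled programmatically from its parts, which makes the applied restrictions (search terms, document type, language, years) explicit. The helper below is purely illustrative; the function name and parameters are our own, not part of any Scopus tooling.

```python
# Illustrative reconstruction of the Scopus query string used in the search.
def build_scopus_query(keywords, doctype, language, years):
    """Compose a Scopus advanced-search query with the given restrictions."""
    year_clause = " OR ".join(f"LIMIT-TO (PUBYEAR, {y})" for y in years)
    return (
        f"TITLE-ABS-KEY ({keywords}) "
        f'AND (LIMIT-TO (DOCTYPE, "{doctype}")) '
        f'AND (LIMIT-TO (LANGUAGE, "{language}")) '
        f"AND ({year_clause})"
    )

query = build_scopus_query('"augmented reality" AND surgery', "ar", "English",
                           [2022, 2021, 2020, 2019])
print(query)
```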

3.2. Selection of Sources Criteria

The inclusion criteria were applied to filter the retrieved articles. Additional documents were then added based on citations from excluded articles that were deemed interesting for this review but not caught by the query because of the limitations we set. The team appointed two reviewers, E.B. and P.F., who, in both searches, independently screened all the articles, starting from the abstracts and titles and choosing the ones deemed pertinent according to their own judgement. The articles chosen by both reviewers were directly added to the list of articles to be downloaded. The studies chosen by only one of the two reviewers were added to the list only after the agreement of a third reviewer, L.B., who took the final decision on whether to include or discard the article. Starting from this list, the full texts of these studies were downloaded and the selection process was repeated based on their content, thus obtaining the final list of included articles.

4. Results

The initial search yielded a total of 2649 articles. After applying filters, removing duplicates and screening the studies based on abstracts and titles, 125 studies remained, from which those included in the study were chosen. The final summary refers to a total of 34 articles. Some articles were not included because of their content, in some cases deemed too specific, concerning clinical trials or topics outside the field of interest. The AR applications in the different surgical domains, as reported in the selected articles, are listed in Table 1, which gives an overview of the applications, areas and methods present in the chosen articles. Table 1 is organized as follows: the first column shows the author (or authors) of the article, the second the application to which the article refers, the third the technology used for processing, the fourth the display used to view the virtual object merged with reality, the fifth the registration method used, the sixth the error in terms of approximations and the seventh the data set used in the article.
Evaluating all the selected articles, both those from the filtered search and those added manually, we summarized the main aspects of the AR applications in three schemes, reported in Table 2, Table 3 and Table 4. The aspects we analyzed, reported as percentages of application in the analyzed studies, are the ones described in Section 2. Concerning the application of AR in different fields, Table 2 shows that this technique has traditionally been used mainly in orthopedics. Lately, the innovation has been represented by its increasingly widespread application in maxillofacial surgery, in addition to oncology. The numerous areas in which AR is used confirm the important role that this technology may have in the future of healthcare. Table 3 shows that projection over the patient is, at the moment, the least used method, while the Microsoft HoloLens optical viewer is the most used one. The first model (HoloLens 1) together with the second one (HoloLens 2) amounts to 38% in Table 3. Concerning tracking and registration, reported in Table 4, the marker-based method paired with rigid registration remains the most used system. Having analyzed all the articles listed in Table 2, we decided to delve into the applications most recurrent in our research and which, in our opinion, seemed to have the most interesting clinical implications. These applications are reported below.

4.1. Oncology

AR is frequently applied in oncology, being used for osteosarcoma [53], mandibular cancer [54], kidney and prostate cancer [55], meningioma [56], urological cancer, intracranial tumors [57], neuro-oncological surgery [58], and cancer of the liver [14]. Indeed, AR ensures an accurate visualization of the tumor, identifying its edges and position during surgery. The capability to visualize the real anatomical structures, such as convolutions, grooves, blood vessels and nervous tracts, allows control during their resection and permits surgeons to try to eradicate the tumor while removing as little of the surrounding healthy tissue as possible [59,60,61,62]. Adequate planning also provides bone information that, together with information about the tumor, can lead to its successful removal [54]. Furthermore, the combination of AR with the innovative digital twin simulation technique can also serve as a medical support tool. In particular, this solution may allow oncologists to monitor and control the patient, in addition to predicting the outcome of cancer through the development of appropriate simulation models and data sets [63].

4.2. Orthopedics

The application of AR in orthopedics [64,65,66] is relatively recent, dating back to the beginning of the 2000s [65]. The purpose of applying AR to computer-assisted surgery (CAS) systems in orthopedics is to increase accuracy during surgery, improving the possible outcomes while decreasing procedure-related complications. AR can also contribute to reducing both surgery time and radiographic doses for patients and surgical teams alike, since it avoids the use of X-rays to see through the patient, reducing exposure time [67].

4.3. Spinal Surgeries

AR is often used in spinal surgery [68,69]. In this application, accuracy is fundamental, since imprecision in the placement of an instrument can lead to spinal cord, nerve root or vascular injuries [70]. Open methods and direct visualization supporting the placement of instrumentation, such as pedicle screws, have historically characterized this type of surgery [70]. The use of AR in spine surgery dates back to 1997, when Peuchot and his team developed a system for visualizing a vertebra during surgery [71]. Over the past 20 years, many articles have targeted the study of minimally invasive surgery (MIS). This has led to the introduction of new approaches, such as intraoperative navigation, which brings several advantages in visualizing anatomy and precisely guiding surgery. Furthermore, MIS ensures a higher level of accuracy while minimizing possible damage to contiguous structures, providing access to deeper ones and improving dynamics and logistics in the operating room. The union of AR and MIS allows the surgeon to see more accurately inside the patient, possibly visualizing the preoperatively planned drilling trajectory on the display, ensuring advantages in terms of accuracy and reductions in radiation exposure, blood loss and hospital stay. The drawbacks are mainly related to high costs and to a learning curve that is still too steep [72].

4.4. Neurosurgery

The use of AR is also quite frequent in neurosurgery. Its application in this area has already been touched upon in the oncology section, but it finds its maximum utility in neuronavigation [73]. It can help surgeons reduce the consequences of treatment, improving the quality of the surgery and reducing the operation time [74,75,76,77]. The first neuronavigation system (NNS) dates back to 1986. The advantage offered by AR associated with an NNS consists in mapping the preoperative images directly onto the patient's visible surface, thus showing the anatomy on it [73,78].

4.5. Surgical Training and Medical Education

AR is also assuming a fundamental and emerging role in surgical training and medical education [79,80,81,82,83]. Its introduction provides students with better anatomical conceptualization and allows surgical simulations that improve their performance [84]. AR makes it possible to practice surgery without risks for the patient, reducing the need for a supervisor and consequently the costs for the institutions [85]. It also fosters the acquisition of skills such as speed, multitasking, accuracy, hand–eye coordination and bimanual operation. The evolution of these systems has led to telementoring, whereby experienced surgeons can train students remotely and take part in consultations among experts located in different countries [86].

5. Discussion

Augmented reality is an innovative technology that presents several advantages, with new applications still in development. Knowledge of this technology is becoming more important every day; it can inform medical doctors and encourage new applications and deeper research. The reason for its increasing success is the possibility it offers to visualize and interact with digital objects without having to look away from the real world to watch the monitor displaying the medical imaging of the area of interest [1]. Moreover, research has shown its capacity to reduce exposure to ionizing radiation. This aspect is important because ionizing radiation is well known to have harmful effects on biomolecules such as DNA, lipids and proteins, with associated cancer risks [87,88,89]. One study [71] calculated that the average staff radiation exposure using AR, compared to literature values, decreased to less than 0.01%. Moreover, the patient's absorbed dose decreased by up to 32% compared to conventional techniques [71]. All these aspects may favor the diffusion of AR and its adoption as a systematic tool in medicine. The analysis of the studies considered showed that AR finds applications in many surgical domains, especially maxillofacial surgery, orthopedics and oncology. In particular, oncology is one of the areas for which it is especially indicated: AR finds many applications in different kinds of cancer, with the aim of facilitating treatment and reducing its consequences as well as improving outcomes [14,53,54,55,56,57,58]. In orthopedics too, the use of AR can be particularly recommended, aiming to promote the quality of surgical interventions, improve outcomes and reduce the risk of complications [64,65,66,67].
In this sense, spinal surgery represents an important area of application where AR can be an important resource for surgeons [68,69]. Regarding the available display technology, the results show that the Microsoft HoloLens optical viewer is the most used [36,39,90]. The marker-based method paired with rigid registration was the most used solution for AR tracking and registration [42,43,44,45,46]. In this regard, it is clear that the goal is to reduce or eliminate the problems associated with tissue deformations; unfortunately, the limited data available did not allow a more in-depth analysis of this issue. AR is a technology that is becoming more popular every day. Here we provided an idea of what it is, which technologies it is built from and in which applications it is most used. Unfortunately, some limitations still affect the application of AR in the surgical field. From our study, we noticed that the output strongly depends on the accuracy of the registration and tracking systems, which need to be as reliable as possible: errors during those phases can lead to a misalignment of the VOs with the real world [91,92]. Mainly for HMDs, the difference in field of view between human vision and the visor also represents an obstacle [93,94]. Finally, one of the biggest issues affecting this technology is the vergence–accommodation conflict. In natural viewing, the point where the eyes verge and the point where they focus coincide, while AR displays have a fixed focal distance; consequently, the points of vergence and focus may differ. This causes discomfort, fatigue and altered depth perception [95,96,97]. Some limitations characterize this study: since the purpose of the review was to provide a contemporary view, the results may exclude longitudinal trends.
A potential problem in this study may also be the possible underrepresentation of documents about AR in surgery. Not all the studies published in the years analyzed may have been identified, despite our trying to be as comprehensive as possible within the chosen filters. For our search, we used only the terms indicated in Section 3, but others could have been chosen; moreover, some papers may have been excluded because they used synonyms instead of those specific words. Furthermore, our search was conducted using two multidisciplinary databases, PubMed and Scopus, but others could have yielded additional studies. We decided to use only English terms and to include only English articles, and we did not reach out to experts on the topic for consultation about additional studies that we may not have included.
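The vergence–accommodation conflict discussed above can be quantified with simple geometry: the vergence angle the eyes must adopt for an object at distance d is 2·arctan(IPD / 2d), where IPD is the interpupillary distance, while the display's focal plane stays fixed. The sketch below uses illustrative numbers of our own choosing (a 63 mm IPD and an assumed 2 m focal plane), not values reported in the reviewed studies, to show how the mismatch grows as the virtual object approaches the viewer.

```python
# Back-of-the-envelope model of the vergence-accommodation conflict
# (illustrative numbers; IPD and focal distance are assumed, not measured).
import math

IPD = 0.063          # interpupillary distance in meters (typical adult)
FOCAL_PLANE = 2.0    # assumed fixed display focal distance in meters

def vergence_deg(distance_m):
    """Vergence angle (degrees) for an object at the given distance."""
    return math.degrees(2 * math.atan(IPD / (2 * distance_m)))

# The eyes verge at the virtual object's apparent distance, but the display
# keeps focus at FOCAL_PLANE, so the two cues diverge for near objects.
for d in (0.3, 0.5, 1.0, 2.0):
    mismatch = abs(vergence_deg(d) - vergence_deg(FOCAL_PLANE))
    print(f"object at {d} m: vergence {vergence_deg(d):.2f} deg, "
          f"mismatch vs focal plane {mismatch:.2f} deg")
```

An object rendered at arm's length therefore demands several degrees more vergence than the fixed focal plane provides, which is consistent with the discomfort and altered depth perception reported in the literature [95,96,97].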

6. Conclusions

AR is a technology that is increasingly being applied in surgery, due to the numerous advantages it offers, although it is still evolving. Since AR allows an accurate visualization of anatomical structures and good control of the activities performed during surgical resections, the fields in which it is most commonly used are orthopedics and oncology. Concerning displays, the Microsoft HoloLens viewer is the most used device. Likewise, the marker-based system combined with rigid registration is the most common solution for tracking and registration. The need for highly accurate registration and tracking systems, as well as VO misalignment problems and the possible vergence–accommodation conflict, are important limitations that can hinder the use of AR in surgery. The results of this study, in addition to presenting the technological solutions used, show that AR can be applied in different fields of surgery. All of this can favor further studies aimed at overcoming the current limitations of AR in the clinical setting and at promoting its application. Considering the significant role that AR can play in the treatment of a large number of patients, further studies are needed to better define all possible fields of application of AR and the best technological solutions to be used.

Author Contributions

E.B., L.M., E.I. and L.B. designed the study. E.B., P.F. and L.B. performed the bibliographic research and organized the results. E.I., P.F., C.N. and L.M. aided in interpreting the results and wrote the final version of the manuscript with the support of all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by Fondazione Cassa di Risparmio di Firenze, Florence, Italy (grant number 2020.1515). The authors thank Ian Webster PGCE for revising the English content.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Park, B.J.; Hunt, S.J.; Martin, C., III; Nadolski, G.J.; Wood, B.; Gade, T.P. Augmented and Mixed Reality: Technologies for Enhancing the Future of IR. J. Vasc. Interv. Radiol. 2020, 31, 1074–1082. [Google Scholar] [CrossRef] [PubMed]
  2. Villarraga-Gómez, H.; Herazo, E.L.; Smith, S.T. X-ray computed tomography: From medical imaging to dimensional metrology. Precis. Eng. 2019, 60, 544–569. [Google Scholar] [CrossRef]
  3. Cutolo, F. Augmented Reality in Image-Guided Surgery. In Encyclopedia of Computer Graphics and Games; Lee, N., Ed.; Springer International Publishing: Cham, Switzterland, 2017; pp. 1–11. [Google Scholar] [CrossRef]
  4. Allison, B.; Ye, X.; Janan, F. MIXR: A Standard Architecture for Medical Image Analysis in Augmented and Mixed Reality; IEEE Computer Society: Washington, DC, USA, 2020; pp. 252–257. [Google Scholar] [CrossRef]
  5. Marmulla, R.; Hoppe, H.; Mühling, J.; Eggers, G. An augmented reality system for image-guided surgery: This article is derived from a previous article published in the journal International Congress Series. Int. J. Oral Maxillofac. Surg. 2005, 34, 594–596. [Google Scholar] [CrossRef] [PubMed]
  6. Peters, T.M. Image-guidance for surgical procedures. Phys. Med. Biol. 2006, 51, R505–R540. [Google Scholar] [CrossRef] [PubMed]
  7. Eckert, M.; Volmerg, J.S.; Friedrich, C.M. Augmented Reality in Medicine: Systematic and Bibliographic Review. JMIR Publ. 2019, 7, e10967. [Google Scholar] [CrossRef] [PubMed]
  8. Kim, Y.; Kim, H.; Kim, Y.O. Virtual Reality and Augmented Reality in Plastic Surgery: A Review. Arch. Plast. Surg. 2017, 44, 179–187. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  9. Desselle, M.R.; Brown, R.A.; James, A.R.; Midwinter, M.J.; Powell, S.K.; Woodruff, M.A. Augmented and Virtual Reality in Surgery. Comput. Sci. Eng. 2020, 22, 18–26. [Google Scholar] [CrossRef]
  10. Pérez-Pachón, L.; Poyade, M.; Lowe, T.; Gröning, F. Image Overlay Surgery Based on Augmented Reality: A Systematic Review. In Biomedical Visualisation. Advances in Experimental Medicine and Biology; Springer International Publishing: Cham, Switzterland, 2020; Volume 1260, pp. 175–195. [Google Scholar] [CrossRef]
  11. Zafari, F.; Gkelias, A.; Leung, K.K. A Survey of Indoor Localization Systems and Technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599. [Google Scholar] [CrossRef] [Green Version]
  12. Cheng, J.; Chen, K.; Chen, W. Comparison of marker-based AR and markerless AR: A case study on indoor decoration system. In Proceedings of the Lean and Computing in Construction Congress (LC3): Proceedings of the Joint Conference on Computing in Construction (JC3), Heraklion, Greece, 4–7 July 2017; pp. 483–490. [Google Scholar] [CrossRef] [Green Version]
  13. Thangarajah, A.; Wu, J.; Madon, B.; Chowdhury, A.K. Vision-based registration for augmented reality-a short survey. In Proceedings of the 2015 IEEE International Conference on Signal and Image Processing Applications (ICSIPA), Kuala Lumpur, Malaysia, 19–21 October 2015; pp. 463–468. [Google Scholar] [CrossRef]
  14. Quero, G.; Lapergola, A.; Soler, L.; Shahbaz, M.; Hostettler, A.; Collins, T.; Marescaux, J.; Mutter, D.; Diana, M.; Pessaux, P. Virtual and Augmented Reality in Oncologic Liver Surgery. Surg. Oncol. Clin. N. Am. 2019, 28, 31–44. [Google Scholar] [CrossRef]
  15. Tuceryan, M.; Greer, D.S.; Whitaker, R.T.; Breen, D.E.; Crampton, C.; Rose, E.; Ahlers, H.K. Calibration requirements and procedures for a monitor-based augmented reality system. IEEE Trans. Vis. Comput. Graph. 1995, 1, 255–273. [Google Scholar] [CrossRef]
  16. Maybody, M.; Stevenson, C.; Solomon, S.B. Overview of Navigation Systems in Image-Guided Interventions. Tech. Vasc. Interv. Radiol. 2013, 16, 136–143. [Google Scholar] [CrossRef]
17. Makhataeva, Z.; Varol, H.A. Augmented Reality for Robotics: A Review. Robotics 2020, 9, 21. [Google Scholar] [CrossRef] [Green Version]
  18. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N. Intelligent Predictive Maintenance and Remote Monitoring Framework for Industrial Equipment Based on Mixed Reality. Front. Mech. Eng. 2020, 6, 578379. [Google Scholar] [CrossRef]
19. Thomas, B.H. A Survey of Visual, Mixed, and Augmented Reality Gaming. Comput. Entertain. 2012, 10, 1. [Google Scholar] [CrossRef]
  20. Tricco, A.C.; Lillie, E.; Zarin, W.; O’Brien, K.K.; Colquhoun, H.; Levac, D.; Moher, D.; Peters, M.D.; Horsley, T.; Weeks, L.; et al. PRISMA Extension for Scoping Reviews (PRISMA-ScR): Checklist and Explanation. Ann. Intern. Med. 2018, 169, 467–473. [Google Scholar] [CrossRef] [Green Version]
  21. Schwam, Z.G.; Kaul, V.F.; Bu, D.D.; Iloreta, A.M.C.; Bederson, J.B.; Perez, E.; Cosetti, M.K.; Wanna, G.B. The utility of augmented reality in lateral skull base surgery: A preliminary report. Am. J. Otolaryngol. 2021, 42, 102942. [Google Scholar] [CrossRef]
  22. Coelho, G.; Trigo, L.; Faig, F.; Vieira, E.V.; da Silva, H.P.G.; Acácio, G.; Zagatto, G.; Teles, S.; Gasparetto, T.P.D.; Freitas, L.F.; et al. The Potential Applications of Augmented Reality in Fetoscopic Surgery for Antenatal Treatment of Myelomeningocele. World Neurosurg. 2022, 159, 27–32. [Google Scholar] [CrossRef]
  23. Gouveia, P.F.; Costa, J.; Morgado, P.; Kates, R.; Pinto, D.; Mavioso, C.; Anacleto, J.; Martinho, M.; Lopes, D.S.; Ferreira, A.R.; et al. Breast cancer surgery with augmented reality. Breast 2021, 56, 14–17. [Google Scholar] [CrossRef]
  24. Chen, F.; Cui, X.; Han, B.; Liu, J.; Zhang, X.; Liao, H. Augmented reality navigation for minimally invasive knee surgery using enhanced arthroscopy. Comput. Methods Programs Biomed. 2021, 201, 105952. [Google Scholar] [CrossRef]
  25. Golse, N.; Petit, A.; Lewin, M.; Vibert, E.; Cotin, S. Augmented Reality during Open Liver Surgery Using a Markerless Non-rigid Registration System. J. Gastrointest. Surg. 2021, 25, 662–671. [Google Scholar] [CrossRef]
  26. Gsaxner, C.; Pepe, A.; Li, J.; Ibrahimpasic, U.; Wallner, J.; Schmalstieg, D.; Egger, J. Augmented Reality for Head and Neck Carcinoma Imaging: Description and Feasibility of an Instant Calibration, Markerless Approach. Comput. Methods Programs Biomed. 2020, 200, 105854. [Google Scholar] [CrossRef] [PubMed]
27. Molina, C.; Sciubba, D.; Greenberg, J.; Khan, M.; Witham, T. Clinical Accuracy, Technical Precision, and Workflow of the First in Human Use of an Augmented-Reality Head-Mounted Display Stereotactic Navigation System for Spine Surgery. Oper. Neurosurg. 2021, 20, 300–309. [Google Scholar] [CrossRef] [PubMed]
  28. Ackermann, J.; Florentin, L.; Armando, H.; Jess, S.; Mazda, F.; Stefan, R.; Patrick, Z.; Furnstahl, P. Augmented Reality Based Surgical Navigation of Complex Pelvic Osteotomies—A Feasibility Study on Cadavers. Appl. Sci. 2021, 11, 1228. [Google Scholar] [CrossRef]
29. Liu, P.; Li, C.; Xiao, C.; Zhang, Z.; Ma, J.; Gao, J.; Shao, P.; Valerio, I.; Pawlik, T.M.; Ding, C.; et al. A Wearable Augmented Reality Navigation System for Surgical Telementoring Based on Microsoft HoloLens. Ann. Biomed. Eng. 2021, 49, 287–298. [Google Scholar] [CrossRef]
  30. Collins, T.; Pizarro, D.; Gasparini, S.; Bourdel, N.; Chauvet, P.; Canis, M.; Calvet, L.; Bartoli, A. Augmented Reality Guided Laparoscopic Surgery of the Uterus. IEEE Trans. Med. Imaging 2021, 40, 371–380. [Google Scholar] [CrossRef] [PubMed]
  31. Arpaia, P.; Benedetto, E.D.; Duraccio, L. Design, implementation, and metrological characterization of a wearable, integrated AR-BCI hands-free system for health 4.0 monitoring. Measurement 2021, 177, 109280. [Google Scholar] [CrossRef]
  32. Shrestha, G.; Alsadoon, A.; Prasad, P.W.C.; Al-Dala’in, T.; Alrubaie, A. A novel enhanced energy function using augmented reality for a bowel: Modified region and weighted factor. Multimed. Tools Appl. 2021, 80, 17893–17922. [Google Scholar] [CrossRef]
  33. Wei, W.; Ho, E.; McCay, K.; Damasevicius, R.; Maskeliunas, R.; Esposito, A. Assessing Facial Symmetry and Attractiveness using Augmented Reality. Pattern Anal. Appl. 2021, 1–17. [Google Scholar] [CrossRef]
  34. Lee, D.; Yu, H.W.; Kim, S.; Yoon, J.; Lee, K.; Chai, Y.J.; Choi, J.Y.; Kong, H.J.; Lee, K.E.; Cho, H.S.; et al. Vision-based tracking system for augmented reality to localize recurrent laryngeal nerve during robotic thyroid surgery. Sci. Rep. 2020, 10, 8437. [Google Scholar] [CrossRef]
  35. Hussain, R.; Lalande, A.; Marroquin, R.; Guigou, C.; Bozorg Grayeli, A. Video-based augmented reality combining CT-scan and instrument position data to microscope view in middle ear surgery. Sci. Rep. 2020, 10, 6767. [Google Scholar] [CrossRef] [Green Version]
  36. Rüger, C.; Feufel, M.; Moosburner, S.; Özbek, C.; Pratschke, J.; Sauer, I. Ultrasound in augmented reality: A mixed-methods evaluation of head-mounted displays in image-guided interventions. Int. J. Comput. Assist. Radiol. Surg. 2020, 15, 1895–1905. [Google Scholar] [CrossRef]
  37. Carl, B.; Bopp, M.H.A.; Benescu, A.; Saß, B.; Nimsky, C. Indocyanine green angiography visualized by augmented reality in aneurysm surgery. World Neurosurg. 2020, 142, e307–e315. [Google Scholar] [CrossRef]
  38. Chan, J.Y.K.; Holsinger, F.C.; Liu, S.; Sorger, J.M.; Azizian, M.; Tsang, R.K.Y. Augmented reality for image guidance in transoral robotic surgery. J. Robot. Surg. 2019, 14, 579–583. [Google Scholar] [CrossRef]
  39. Ferraguti, F.; Minelli, M.; Farsoni, S.; Bazzani, S.; Bonfè, M.; Vandanjon, A.; Puliatti, S.; Bianchi, G.; Secchi, C. Augmented Reality and Robotic-Assistance for Percutaneous Nephrolithotomy. IEEE Robot. Autom. Lett. 2020, 5, 4556–4563. [Google Scholar] [CrossRef]
  40. Auloge, P.; Cazzato, R.; Ramamurthy, N.; De Marini, P.; Rousseau, C.; Garnon, J.; Charles, Y.; Steib, J.P.; Gangi, A. Augmented reality and artificial intelligence-based navigation during percutaneous vertebroplasty: A pilot randomised clinical trial. Eur. Spine J. 2020, 29, 1580–1589. [Google Scholar] [CrossRef]
41. Libaw, J.; Sinskey, J. Use of Augmented Reality During Inhaled Induction of General Anesthesia in 3 Pediatric Patients: A Case Report. A&A Pract. 2020, 14, e01219. [Google Scholar] [CrossRef]
  42. Pietruski, P.; Majak, M.; Swiatek-Najwer, E.; Żuk, M.; Popek, M.; Jaworowski, J.; Mazurek, M. Supporting Fibula Free Flap Harvest With Augmented Reality: A Proof-of-Concept Study. Laryngoscope 2019, 130, 1173–1179. [Google Scholar] [CrossRef]
  43. Jiang, T.; Yu, D.; Wang, Y.; Zan, T.; Wang, S.; Li, Q. HoloLens-Based Vascular Localization System: Precision Evaluation Study With a Three-Dimensional Printed Model. J. Med. Internet Res. 2020, 22, e16852. [Google Scholar] [CrossRef]
  44. Samei, G.; Tsang, K.; Kesch, C.; Lobo, J.; Hor, S.; Mohareri, O.; Chang, S.; Goldenberg, S.L.; Black, P.C.; Salcudean, S. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med. Image Anal. 2020, 60, 101588. [Google Scholar] [CrossRef]
  45. Rose, A.; Kim, H.; Fuchs, H.; Frahm, J.M. Development of augmented-reality applications in otolaryngology-head and neck surgery: Augmented Reality Applications. Laryngoscope 2019, 129, S1–S11. [Google Scholar] [CrossRef]
  46. Carl, B.; Bopp, M.; Voellger, B.; Saß, B.; Nimsky, C. Augmented reality in transsphenoidal surgery. World Neurosurg. 2019, 125, e873–e883. [Google Scholar] [CrossRef] [PubMed]
  47. Sharma, A.; Alsadoon, A.; Prasad, P.W.C.; Al-Dala’in, T.; Haddad, S. A novel augmented reality visualization in jaw surgery: Enhanced ICP based modified rotation invariant and modified correntropy. Multimed. Tools Appl. 2021, 80, 1–25. [Google Scholar] [CrossRef]
  48. Abdel Al, S.; Abou Chaar, M.K.; Mustafa, A.; Al-Hussaini, M.; Barakat, F.; Asha, W. Innovative Surgical Planning in Resecting Soft Tissue Sarcoma of the Foot Using Augmented Reality With a Smartphone. J. Foot Ankle Surg. 2020, 59, 1092–1097. [Google Scholar] [CrossRef]
  49. Melero, M.; Hou, A.; Cheng, E.; Tayade, A.; Lee, S.C.; Unberath, M.; Navab, N. Upbeat: Augmented Reality-Guided Dancing for Prosthetic Rehabilitation of Upper Limb Amputees. J. Healthc. Eng. 2019, 2019, 1–9. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  50. Tu, P.; Yao, G.; Lungu, A.; Li, D.; Wang, H.; Chen, X. Augmented Reality Based Navigation for Distal Interlocking of Intramedullary Nails Utilizing Microsoft HoloLens 2. Comput. Biol. Med. 2021, 133, 104402. [Google Scholar] [CrossRef] [PubMed]
  51. Cofano, F.; Di Perna, G.; Bozzaro, M.; Longo, A.; Marengo, N.; Zenga, F.; Zullo, N.; Cavalieri, M.; Damiani, L.; Boges, D.; et al. Augmented Reality in Medical Practice: From Spine Surgery to Remote Assistance. Front. Surg. 2021, 8, 657901. [Google Scholar] [CrossRef] [PubMed]
  52. Heinrich, F.; Huettl, F.; Schmidt, G.; Paschold, M.; Kneist, W.; Huber, T.; Hansen, C. HoloPointer: A virtual augmented reality pointer for laparoscopic surgery training. Int. J. CARS 2021, 16, 161–168. [Google Scholar] [CrossRef] [PubMed]
  53. Brookes, M.J.; Chan, C.D.; Baljer, B.; Wimalagunaratna, S.; Crowley, T.P.; Ragbir, M.; Irwin, A.; Gamie, Z.; Beckingsale, T.; Ghosh, K.M.; et al. Surgical Advances in Osteosarcoma. Cancers 2021, 13, 388. [Google Scholar] [CrossRef]
  54. Kraeima, J.; Glas, H.; Merema, B.; Vissink, A.; Spijkervet, F.; Witjes, M. Three-dimensional virtual surgical planning in the oncologic treatment of the mandible. Oral Dis. 2021, 27, 14–20. [Google Scholar] [CrossRef]
  55. Wake, N.; Nussbaum, J.E.; Elias, M.I.; Nikas, C.V.; Bjurlin, M.A. 3D Printing, Augmented Reality, and Virtual Reality for the Assessment and Management of Kidney and Prostate Cancer: A Systematic Review. Urology 2020, 143, 20–32. [Google Scholar] [CrossRef]
56. Lavé, A.; Meling, T.R.; Schaller, K.; Corniola, M.V. Augmented reality in intracranial meningioma surgery: A case report and systematic review. J. Neurosurg. Sci. 2020, 64, 369–376. [Google Scholar] [CrossRef]
  57. Lee, C.; Wong, G.K.C. Virtual reality and augmented reality in the management of intracranial tumors: A review. J. Clin. Neurosci. 2019, 62, 14–20. [Google Scholar] [CrossRef]
  58. Gerard, I.J.; Kersten-Oertel, M.; Petrecca, K.; Sirhan, D.; Hall, J.A.; Collins, D.L. Brain shift in neuronavigation of brain tumors: A review. Med. Image Anal. 2017, 35, 403–420. [Google Scholar] [CrossRef]
  59. Inoue, D.; Cho, B.; Mori, M.; Kikkawa, Y.; Amano, T.; Nakamizo, A.; Yoshimoto, K.; Mizoguchi, M.; Tomikawa, M.; Hong, J.; et al. Preliminary study on the clinical application of augmented reality neuronavigation. J. Neurol. Surg. Part A Cent. Eur. Neurosurg. 2013, 74, 71–76. [Google Scholar] [CrossRef] [Green Version]
  60. Besharati, T.L.; Mehran, M. Augmented reality-guided neurosurgery: Accuracy and intraoperative application of an image projection technique. J. Neurosurg. 2015, 123, 206–211. [Google Scholar] [CrossRef] [Green Version]
  61. Cabrilo, I.; Sarrafzadeh, A.; Bijlenga, P.; Landis, B.N.; Schaller, K. Augmented reality-assisted skull base surgery. Neurochirurgie 2014, 60, 304–306. [Google Scholar] [CrossRef]
  62. Contreras López, W.; Navarro, P.; Crispin, S. Intraoperative clinical application of augmented reality in neurosurgery: A systematic review. Clin. Neurol. Neurosurg. 2019, 177, 6–11. [Google Scholar] [CrossRef]
  63. Mourtzis, D.; Angelopoulos, J.; Panopoulos, N.; Kardamakis, D. A Smart IoT Platform for Oncology Patient Diagnosis based on AI: Towards the Human Digital Twin. Procedia CIRP 2021, 104, 1686–1691. [Google Scholar] [CrossRef]
64. Casari, F.A.; Navab, N.; Hruby, L.A.; Kriechling, P.; Nakamura, R.; Tori, R.; dos Santos Nunes, F.L.; Queiroz, M.C.; Fürnstahl, P.; Farshad, M. Augmented Reality in Orthopedic Surgery Is Emerging from Proof of Concept Towards Clinical Studies: A Literature Review Explaining the Technology and Current State of the Art. Curr. Rev. Musculoskelet. Med. 2021, 14, 192–203. [Google Scholar] [CrossRef]
  65. Bagwe, S.; Singh, K.; Kashyap, A.; Arora, S.; Maini, L. Evolution of augmented reality applications in Orthopaedics: A systematic review. J. Arthrosc. Jt. Surg. 2021, 8, 84–90. [Google Scholar] [CrossRef]
  66. Negrillo-Cardenas, J.; Jimenez-Perez, J.R.; Feito, F.R. The role of virtual and augmented reality in orthopedic trauma surgery: From diagnosis to rehabilitation. Comput. Methods Programs Biomed. 2020, 191, 105407. [Google Scholar] [CrossRef] [PubMed]
  67. Jud, L.; Fotouhi, J.; Andronic, O.; Aichmair, A.; Osgood, G.; Navab, N.; Farshad, M. Applicability of augmented reality in orthopedic surgery—A systematic review. BMC Musculoskelet. Disord. 2020, 21, 103. [Google Scholar] [CrossRef] [PubMed]
  68. Molina, C.A.; Phillips, F.M.; Poelstra, K.A.; Colman, M.; Khoo, L.T. 151. A cadaveric precision and accuracy analysis of augmented reality mediated percutaneous pedicle implant insertion. Spine J. 2020, 20, S74. [Google Scholar] [CrossRef]
  69. Burstrom, G.; Persson, O.; Edstrom, E.; Elmi-Terander, A. Augmented reality navigation in spine surgery: A systematic review. Acta Neurochir. 2021, 163, 843–852. [Google Scholar] [CrossRef]
70. Yuk, F.J.; Maragkos, G.A.; Sato, K.; Steinberger, J. Current innovation in virtual and augmented reality in spine surgery. Ann. Transl. Med. 2021, 9, 94. [Google Scholar] [CrossRef]
71. Sakai, D.; Joyce, K.; Sugimoto, M.; Horikita, N.; Hiyama, A.; Sato, M.; Devitt, A.; Watanabe, M. Augmented, virtual and mixed reality in spinal surgery: A real-world experience. J. Vasc. Interv. Radiol. 2020, 3, 28. [Google Scholar] [CrossRef]
  72. Vadalà, G.; Salvatore, S.D.; Ambrosio, L.; Russo, F.; Papalia, R.; Denaro, V. Robotic Spine Surgery and Augmented Reality Systems: A State of the Art. Neurospine 2020, 17, 88–100. [Google Scholar] [CrossRef] [Green Version]
73. Liu, T.; Tai, Y.; Zhao, C.; Wei, L.; Zhang, J.; Pan, J.; Shi, J. Augmented reality in neurosurgical navigation: A survey. Int. J. Med. Robot. Comput. Assist. Surg. 2020, 16, e2160. [Google Scholar] [CrossRef]
  74. Deng, W.; Li, F.; Wang, M.; Song, Z. Easy-to-Use Augmented Reality Neuronavigation Using a Wireless Tablet PC. Stereotact. Funct. Neurosurg. 2014, 92, 17–24. [Google Scholar] [CrossRef]
  75. Gumprecht, H.K.; Widenka, D.C.; Lumenta, C.B. BrainLab VectorVision Neuronavigation System: Technology and clinical experiences in 131 cases. Neurosurgery 1999, 44, 97–104. [Google Scholar] [CrossRef]
  76. Grunert, P.; Darabi, K.; Espinosa, J.; Filippi, R. Computer-aided navigation in neurosurgery. Neurosurg. Rev. 2003, 26, 73–99. [Google Scholar] [CrossRef]
  77. Cleary, K.; Peters, T.M. Image-Guided Interventions: Technology Review and Clinical Applications. Annu. Rev. Biomed. Eng. 2010, 12, 119–142. [Google Scholar] [CrossRef]
  78. Incekara, F.; Smits, M.; Dirven, C.; Vincent, A. Clinical Feasibility of a Wearable Mixed-Reality Device in Neurosurgery. World Neurosurg. 2018, 118, e422–e427. [Google Scholar] [CrossRef]
  79. Moro, C.; Phelps, C.; Redmond, P.; Stromberga, Z. HoloLens and mobile augmented reality in medical and health science education: A randomised controlled trial. Br. J. Educ. Technol. 2020, 52, 680–694. [Google Scholar] [CrossRef]
  80. Kumar, N.; Pandey, S.; Rahman, E. A Novel Three-Dimensional Interactive Virtual Face to Facilitate Facial Anatomy Teaching Using Microsoft HoloLens. Aesthetic Plast. Surg. 2021, 45, 1005–1011. [Google Scholar] [CrossRef]
  81. Moro, C.; Phelps, C.; Jones, D.; Stromberga, Z. Using Holograms to Enhance Learning in Health Sciences and Medicine. Med. Sci. Educ. 2020, 30, 1351–1352. [Google Scholar] [CrossRef]
82. Parsons, D.; Mac Callum, K. Current Perspectives on Augmented Reality in Medical Education: Applications, Affordances and Limitations. Adv. Med. Educ. Pract. 2021, 12, 77–91. [Google Scholar] [CrossRef]
  83. Williams, M.A.; McVeigh, J.; Handa, A.I.; Regent, L. Augmented reality in surgical training: A systematic review. Postgrad. Med. J. 2020, 96, 537–542. [Google Scholar] [CrossRef]
  84. Cao, C.; Cerfolio, R.J. Virtual or Augmented Reality to Enhance Surgical Education and Surgical Planning. Thorac. Surg. Clin. 2019, 29, 329–337. [Google Scholar] [CrossRef]
85. Yeung, A.W.K.; Tosevska, A.; Klager, E.; Eibensteiner, F.; Laxar, D.; Stoyanov, J.; Glisic, M.; Zeiner, S.; Kulnik, S.T.; Crutzen, R.; et al. Virtual and Augmented Reality Applications in Medicine: Analysis of the Scientific Literature. J. Med. Internet Res. 2021, 23, e25499. [Google Scholar] [CrossRef]
  86. McKnight, R.R.; Pean, C.A.; Buck, J.S.; Hwang, J.S.; Hsu, J.R.; Pierrie, S.N. Virtual Reality and Augmented Reality—Translating Surgical Training into Surgical Technique. Curr. Rev. Musculoskelet. Med. 2020, 13, 663–674. [Google Scholar] [CrossRef]
  87. Fazel, R.; Krumholz, H.M.; Wang, Y.; Ross, J.S.; Chen, J.; Ting, H.H.; Shah, N.D.; Nasir, K.; Einstein, A.J.; Nallamothu, B.K. Exposure to low-dose ionizing radiation from medical imaging procedures. N. Engl. J. Med. 2009, 361, 849–857. [Google Scholar] [CrossRef] [Green Version]
  88. Hong, J.Y.; Han, K.; Jung, J.H.; Kim, J.S. Association of exposure to diagnostic low-dose ionizing radiation with risk of cancer among youths in South Korea. JAMA Netw. Open 2019, 2, e1910584. [Google Scholar] [CrossRef] [Green Version]
89. Reisz, J.; Bansal, N.; Qian, J.; Zhao, W.; Furdui, C. Effects of ionizing radiation on biological molecules—mechanisms of damage and emerging methods of detection. Antioxid. Redox Signal. 2014, 21, 260–292. [Google Scholar] [CrossRef]
  90. Peng, H.; Ding, C. Minimum redundancy and maximum relevance feature selection and recent advances in cancer classification. Feature Sel. Data Min. 2005, 3, 185–205. [Google Scholar] [CrossRef]
  91. Singh, V.K.; Ali, A.; Nair, P.S. A Report on Registration Problems in Augmented Reality. Int. J. Eng. Res. Technol. 2014, 3, 819–821. [Google Scholar]
  92. Chen, Y.; Wang, Q.; Chen, H.; Song, X.; Tang, H.; Tian, M. An overview of augmented reality technology. J. Phys. Conf. Ser. 2019, 1237, 022082. [Google Scholar] [CrossRef]
  93. Lee, Y.H.; Zhan, T.; Wu, S.T. Prospects and challenges in augmented reality displays. Virtual Real. Intell. Hardw. 2019, 1, 10–20. [Google Scholar] [CrossRef]
  94. Ren, D.; Goldschwendt, T.; Chang, Y.; Höllerer, T. Evaluating wide-field-of-view augmented reality with mixed reality simulation. In Proceedings of the 2016 IEEE Virtual Reality (VR), Greenville, SC, USA, 19–23 March 2016; pp. 93–102. [Google Scholar] [CrossRef]
  95. Zhou, Y.; Zhang, J.; Fang, F. Vergence-accommodation conflict in optical see-through display: Review and prospect. Results Opt. 2021, 5, 100160. [Google Scholar] [CrossRef]
  96. Erkelens, I.M.; MacKenzie, K.J. 19-2: Vergence-Accommodation Conflicts in Augmented Reality: Impacts on Perceived Image Quality. SID Symp. Dig. Tech. Pap. 2020, 51, 265–268. [Google Scholar] [CrossRef]
  97. Kim, J.; Kane, D.; Banks, M.S. The rate of change of vergence–Accommodation conflict affects visual discomfort. Vis. Res. 2014, 105, 159–165. [Google Scholar] [CrossRef] [PubMed] [Green Version]
Figure 1. Workflow of augmented-reality-enabled systems.
Figure 2. Example of HMD, HoloLens 2 (Microsoft, WA, USA).
Figure 3. Summary of the techniques.
Figure 4. Criteria for the inclusion of articles.
Figure 5. Trend of publications on augmented reality in surgery over the years.
Table 1. Most recent Augmented reality application for each field and method that resulted from the research.
| Author | Application | Technology | Display | Registration | Error | Data Set |
| --- | --- | --- | --- | --- | --- | --- |
| Schwam [21] | Lateral skull base surgery | BrainLab Curve™, Surgical Theater, and Zeiss OPMI® PENTERO® 900 | Microscope-based HUD | Marker-less, rigid | Not reported | 40 patients |
| Coelho [22] | Antenatal treatment of myelomeningocele: preoperative and postoperative simulation | Unity Engine, Google ARCore libraries, ray casting, target object rendering | Smartphone and tablet application | Object placed based on the rendering | Not reported | 1 pregnant woman at 27 weeks of gestation |
| Gouveia [23] | Left breast cancer: oncology | Contrast-enhanced MRI, Horos® software v2.4.0 | HoloLens AR headset | Marker-based, rigid | Not reported | 57 menopausal women |
| Chen [24] | Knee arthroscopy | CT scanner, optical tracking system | Glasses-free 3D display | Marker-based, rigid | Mean: 0.32 mm; reduced errors on targets of 2.10 mm and 2.70 mm | Preclinical knee phantom and in vitro swine knee experiments |
| Golse [25] | Liver resection | 3D segmentation, CT | Standard mobile external monitor | Real-time, marker-less, non-rigid | 7.9 mm root mean square error for internal landmark registration | In vivo: 5 patients; ex vivo: native tumor-free |
| Gsaxner [26] | Head and neck carcinoma: training | CT, PET-CT, and MRI scans; instant calibration | HoloLens AR headset | Marker-less, rigid | From a few mm up to 2 cm | 11 healthcare professionals |
| Molina [27] | Spinal navigation | iCT scans; Gertzbein–Robbins (GS) scale | AR-HMD xvision (Augmedics) | Marker-based, rigid | Linear deviation: 2.07 mm; angular deviation: 2.41° | 78-year-old female |
| Ackermann [28] | Osteotomy cuts and reorientation of the acetabular fragment: navigation system | CT scan | Microsoft HoloLens | Marker-based, rigid | Osteotomy starting points: 10.8 mm; osteotomy directions: 5.4°; reorientation errors: x = 6.7°, y = 7.0°, z = 0.9°; LCE angle postoperative error: 4.5° | 2 fresh-frozen human cadaveric hips |
| Liu [29] | Medical training and telementored surgery | 2 color digital cameras | Microsoft HoloLens | Marker-based, rigid | Overall tracking error: less than 2.5 mm; overall guidance error: less than 2.75 mm | Ex vivo arm phantom, in vivo rabbit model |
| Collins [30] | Uterus: laparoscopy | MR or CT and monocular laparoscopes | Monitor | Marker-less, rigid | Error increases towards the cervix (2 mm with 15 views up to 8 mm with 2 views) | Phantom and videos recorded during laparoscopic surgery |
| Arpaia [31] | Neurosurgery | Brain–computer interface equipment | Epson Moverio BT-350 glasses | Not reported | Not reported | 10 runs on the same patient |
| Shrestha [32] | Bowel surgery | CT scans and intraoperative endoscope camera | Monitor | Marker-based, rigid | Overlay accuracy error: 0.24777 px; performance: 44 fps | Subjects in three age groups: 15–25, 26–35, 35–60 |
| Wei [33] | Plastic surgery | Google Face API | Android display | Marker-based, rigid | Not reported | 4 benchmark data sets |
| Lee [34] | Thyroid surgery | CT; semiautomatic registration | AR screen; master surgical robot screen | Marker-based, rigid | Mean ± SD: 1.9 ± 1.5 mm | 9 patients |
| Hussain [35] | Ear surgery | CT, no tracking system; microscope 2D real-time video | DDM; bronchoscopy monitor | Marker-less, rigid | Surgical instrument tip position: 0.3 ± 0.22 mm | 6 artificial human temporal bone specimens |
| Feufel [36] | Ultrasound-guided needle placement | Reflective markers, ultrasound transducer | Microsoft HoloLens | Marker-based, rigid | Mean error of 7.4 mm | 20 participants |
| Carl [37] | Aneurysm surgery: indocyanine green (ICG) angiography | CT, 3D rotational angiography (DynaCT), or time-of-flight MR angiography; automatic registration | Operating microscope HUD | Marker-based, rigid | Target registration error: 0.71 ± 0.21 mm | 20 patients with 22 aneurysms |
| Chan [38] | Transoral robotic surgery | CT | 3D surgeon's console | Marker-based, rigid | Not reported | 2 cadavers |
| Ferraguti [39] | Percutaneous nephrolithotomy | CT or MRI, 3 electrodes; real-time registration | Microsoft HoloLens | Marker-based, rigid | Translation and orientation norm between 2 transformation matrices: 15.80 mm and 4.12° | 11 samples |
| Auloge [40] | Percutaneous vertebroplasty | Cone-beam CT | Monitor | Marker-based, rigid | Not reported | 2 groups of 10 patients |
| Libaw [41] | Inhaled induction of general anesthesia, pediatric | iPhone 7 | AR headset | Not reported | Not reported | 3 patients, 8 and 10 years old |
| Pietruski [42] | Fibula free flap harvest | 7 markers; actual-to-virtual registration; sagittal surgical saw (GB129R) with a tracking adapter | HMD: Epson Moverio BT-200 Smart Glasses | Marker-based, rigid | Not reported | 756 simulated osteotomies |
| Jiang [43] | Vascular localization system | CTA scan, non-ionic contrast agent; real-time registration | Microsoft HoloLens | Marker-based, rigid | Minimum 1.35 mm; maximum 3.18 mm | 7 operators |
| Samei [44] | Laparoscopic radical prostatectomy | MRI; 3 transformations | From da Vinci console to PC | Marker-based, rigid | Not reported | Agar prostate phantom ex vivo; 12 patients in vivo |
| Rose [45] | Otolaryngology: head and neck surgery | CT, MeshLab, and Unity | Microsoft HoloLens | Marker-based, rigid | Accuracy: 2.47 ± 0.46 mm (1.99, 3.30) | A phantom |
| Carl [46] | Transsphenoidal surgery | C-arm radiographic fluoroscopy; registration using iCT | Operating microscope HUD | Marker-based, rigid | Target registration error: 0.83 ± 0.44 mm | 288 cases of transsphenoidal surgery |
| Sharma [47] | Jaw surgery | CT scan; virtual scenes; stereo views | Monitor | Marker-less, rigid | Alignment error: 0.59 ± 0.62 mm | 20 samples after jaw surgery |
| Abdel Al [48] | Foot sarcoma: oncology | NDI Polaris; smartphone AR application: FINO | Samsung Galaxy smartphone | Marker-based, rigid | Not reported | A 39-year-old male patient |
| Melero [49] | Rehabilitation: upper limbs | Myo armband, EMG data, Microsoft Kinect sensor | Monitor | Marker-less, rigid | Not reported | 3 subjects, 10 trials each |
| Tu [50] | Orthopedics | C++ application on PC; C# application in Unity; connection via TCP/IP | HoloLens 2 | Marker-based, rigid | Distance error: 1.61 ± 0.44 mm; 3D angle error: 1.46 ± 0.46° | Phantom and cadaver experiments |
| Cofano [51] | Spine surgery | CT, TeamViewer software, and HoloSurgery | HoloLens 2 | Marker-less, rigid | Not reported | 2 patients |
| Heinrich [52] | Training | Not specified | HoloLens 1 | Marker-based, rigid | Error rates (p = 0.047) | 10 surgical trainees |
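Many of the accuracy figures in the table above are landmark-based registration errors reported in millimetres, often as mean ± SD (e.g., the target registration errors of Carl [37,46]). As an illustrative sketch only — the function name and the paired-point format are our assumptions, not taken from any cited system — such an error can be computed from registered and ground-truth landmark coordinates:

```python
import numpy as np

def target_registration_error(registered_pts, true_pts):
    """Mean and standard deviation of the Euclidean distances between
    registered landmarks and their ground-truth positions, in the same
    units as the input coordinates (typically mm)."""
    diffs = np.asarray(registered_pts, float) - np.asarray(true_pts, float)
    dists = np.linalg.norm(diffs, axis=1)
    return dists.mean(), dists.std()
```

Reporting the returned pair as "mean ± SD" matches the format used by several of the studies above.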
Table 2. Augmented Reality Applications.
| Application | Percentage of Application |
| --- | --- |
| Telemonitoring | 4% |
| Maxillofacial | 23% |
| Liver Surgery | 4% |
| Pediatric | 4% |
| Orthopedics | 27% |
| Oncology | 19% |
| Training | 8% |
| Puncture Surgery | 7% |
| Bowel Surgery | 4% |
Table 3. Percentage of distribution of the displays of Augmented Reality used in medical applications evaluated in our study.
| Type of Display | Percentage of Application |
| --- | --- |
| Smartphone | 14% |
| Video See-Through Device | 14% |
| Generic Head-Mounted Display | 17% |
| Unspecified Display | 14% |
| Projected Directly over the Patient | 3% |
| HoloLens 2 | 10% |
| HoloLens 1 | 28% |
Table 4. Augmented Reality tracking and registration methods.
| Tracking and Registration Method | Percentage of Application |
| --- | --- |
| Marker-based, Non-rigid Registration | 4% |
| Marker-less, Rigid Registration | 20% |
| Marker-less, Non-rigid Registration | 8% |
| Marker-based, Rigid Registration | 68% |
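Marker-based rigid registration, the dominant combination above, typically reduces to estimating the least-squares rotation and translation that map tracked fiducial markers onto their positions in the preoperative image. Below is a minimal sketch of the standard SVD-based solution (Kabsch method, no scaling), assuming paired 3D marker coordinates; no reviewed system is claimed to use exactly this code:

```python
import numpy as np

def rigid_register(source, target):
    """Least-squares rigid transform (R, t) such that R @ p + t maps
    each source point p onto its paired target point (Kabsch method)."""
    src = np.asarray(source, float)
    tgt = np.asarray(target, float)
    src_c, tgt_c = src.mean(axis=0), tgt.mean(axis=0)
    # Cross-covariance of the centered point sets
    H = (src - src_c).T @ (tgt - tgt_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the least-squares solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t
```

Non-rigid methods (e.g., the liver registration of Golse [25]) must additionally deform the preoperative model, which is considerably harder and helps explain why rigid registration still dominates.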
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Share and Cite

MDPI and ACS Style

Barcali, E.; Iadanza, E.; Manetti, L.; Francia, P.; Nardi, C.; Bocchi, L. Augmented Reality in Surgery: A Scoping Review. Appl. Sci. 2022, 12, 6890. https://doi.org/10.3390/app12146890


