Article

Preclinical Application of Augmented Reality in Pediatric Craniofacial Surgery: An Accuracy Study

by Federica Ruggiero 1,2,*,†, Laura Cercenelli 3,†, Nicolas Emiliani 3, Giovanni Badiali 1,4, Mirko Bevini 1,4, Mino Zucchelli 5, Emanuela Marcelli 3,‡ and Achille Tarsitano 1,4,‡
1 Department of Biomedical and Neuromotor Science, University of Bologna, 40138 Bologna, Italy
2 Maxillo-Facial Surgery Unit, AUSL Bologna, 40124 Bologna, Italy
3 Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
4 Oral and Maxillo-Facial Surgery Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, Via Albertoni 15, 40138 Bologna, Italy
5 Pediatric Neurosurgery, IRCCS Istituto delle Scienze Neurologiche di Bologna, Via Altura 3, 40138 Bologna, Italy
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
‡ These authors can be considered senior authors.
J. Clin. Med. 2023, 12(7), 2693; https://doi.org/10.3390/jcm12072693
Submission received: 25 February 2023 / Revised: 29 March 2023 / Accepted: 31 March 2023 / Published: 4 April 2023
(This article belongs to the Special Issue Innovation in Head and Neck Reconstructive Surgery—Series 2)

Abstract:
Background: Augmented reality (AR) allows virtual information to be overlaid on and integrated with the real environment: the camera of the AR device reads the scene and the system integrates the virtual data. AR has been widely applied in the medical and surgical sciences in recent years and has the potential to enhance intraoperative navigation. Materials and methods: In this study, the authors aim to assess the accuracy of AR guidance using the commercial HoloLens 2 head-mounted display (HMD) in pediatric craniofacial surgery. The authors selected fronto-orbital remodeling (FOR) as the procedure to test (specifically, the frontal and nasal osteotomies were considered). Six people (three surgeons and three engineers) were recruited to perform the osteotomies on a 3D printed stereolithographic model under AR guidance. By means of calibrated CAD/CAM cutting guides with grooves of different widths, the authors measured the accuracy of the performed osteotomies, testing accuracy levels of ±1.5 mm, ±1 mm, and ±0.5 mm. Results: With the HoloLens 2, the majority of the participants were able to successfully trace the trajectories of the frontal and nasal osteotomies with an accuracy level of ±1.5 mm. At the ±1 mm level, 80% succeeded for the nasal osteotomy and 52% for the frontal osteotomy; at the ±0.5 mm level, 61% succeeded for the nasal osteotomy and 33% for the frontal osteotomy. Conclusions: Despite this being an in vitro study, the authors report encouraging results for the prospective use of AR on actual patients.

1. Introduction

Augmented reality (AR) is a computerized technology which transfers virtual information to real environments. The technology is based on the “onlay” principle, i.e., the camera “reads” the object in the frame and the system recognizes it and activates a new level of communication, overlapping and integrating the virtual information with the actual object or environment [1]. Unlike Virtual Reality (VR), which creates a totally computer-generated artificial environment, AR uses the real environment and overlays new virtual information on top of it, thus providing a composite view that enhances the user’s sensory perception of the world. Tools and systems employing AR have been designed and tested in the context of several medical applications [2,3,4,5,6,7,8,9,10,11,12,13], including surgical navigation in neurosurgery [14], craniomaxillofacial surgery [1,15,16,17,18,19,20,21], and head and neck oncology [22].
In recent years, its use in surgical navigation has been widely validated; many procedures, such as ventriculo-peritoneal shunt insertion and tumor resection in craniofacial and neurosurgery, are now usually performed using AR navigation [14].
Standard surgical navigation relies on an external device that recognizes the patient’s position in the environment and then merges information from patient imaging, e.g., computed tomography (CT) or magnetic resonance imaging (MRI), onto a screen.
The navigator ensures that critical anatomical structures can be localized before and during the operation, improving the safety of the procedure. By means of navigation, it is also possible to import the trajectories of the osteotomies and the target lesions onto the patient’s imaging in order to limit the extent and invasiveness of the surgery itself [1,17].
Though conventional navigation is a well-established methodology in surgery, it is burdened by several drawbacks. One of these is the fact that the operator has to switch his or her attention continuously between the patient and the screen [1,12,15].
AR head-mounted displays (HMDs) represent a technology that overcomes this limitation and can improve surgical navigation. Indeed, HMDs have integrated displays that allow surgeons to receive pertinent information while keeping their view focused on the surgical field [12]. HMDs enable the operator to visualize critical structures, osteotomy trajectories, and incision points superimposed directly onto the patient. In optical see-through HMDs, such as the HoloLens 2 (Microsoft), the natural sight of the surgeon is not compromised, since the holograms are projected on transparent lenses.
Craniofacial surgery is a subspecialty that aims to address congenital skull dysmorphologies in pediatric patients; these include single-suture craniosynostosis, multiple-suture synostosis, and syndromic craniosynostosis.
This surgery demands a high degree of accuracy to avoid critical structures, and the morphological surgical outcome relies on the accurate design of the osteotomies [23].
In this study, the authors aim to assess the accuracy of AR guidance when using the commercial HoloLens 2 HMD [24] in pediatric craniofacial surgery.
In particular, we focus on the procedure used to address frontal skull anomalies, known as fronto-orbital remodeling (FOR).
While this is a pre-clinical study on a 3D printed phantom, the authors ultimately want to demonstrate the extent to which AR guidance using the HoloLens 2 smart glasses can accurately reproduce osteotomy trajectories in pediatric craniofacial surgery.

2. Materials and Methods

This study was designed first to implement the AR-based protocol using the HoloLens 2 smart glasses. The authors then arranged a test session to evaluate the success rate in executing AR-guided osteotomies for fronto-orbital remodeling (FOR) on a 3D printed phantom.
In the following sections, the development phase and the experimental phase of the study are discussed.

2.1. Development Phase

2.1.1. Virtual Content Preparation

We selected the preoperative CT scan of a patient who had already been admitted and undergone an operation (study protocol CE 499-2022-OSS-AUSLBO). A DICOM file dataset was acquired and segmented in order to reconstruct a three-dimensional (3D) virtual model of the skull. Areas of the subject’s head that were of anatomical interest (e.g., bones, brain, eye globes, and skin) were segmented using Mimics (Materialise, Leuven, Belgium). Next, 3D meshes were generated from all the segmented masks and saved in standard tessellation language (STL) format (Figure 1).
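The segmentation step can be illustrated with a minimal sketch. The study used Mimics; the snippet below only mimics the core idea of thresholding a CT volume at a bone Hounsfield-unit value, with a synthetic array standing in for the DICOM data (the threshold value and the array contents are assumptions, not the paper's settings):

```python
import numpy as np

# Hypothetical bone threshold in Hounsfield units; real values are chosen
# per scan and the paper's Mimics settings are not reported.
BONE_HU_THRESHOLD = 300

def segment_bone(ct_volume: np.ndarray, threshold: float = BONE_HU_THRESHOLD) -> np.ndarray:
    """Return a binary mask of voxels at or above the bone threshold."""
    return ct_volume >= threshold

# Synthetic 3-slice CT stand-in: soft tissue (~40 HU) containing a bony cube (~1200 HU).
ct = np.full((3, 8, 8), 40.0)
ct[:, 2:6, 2:6] = 1200.0

mask = segment_bone(ct)
print(mask.sum())  # number of "bone" voxels: 3 * 4 * 4 = 48
```

In the actual pipeline, the binary masks for each structure (bone, brain, eye globes, skin) would then be converted to triangular meshes and exported as STL files.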

2.1.2. 3D Printing of Skull Phantom and CAD/CAM Templates for Testing Accuracy

An appropriate portion of the reconstructed skull was selected for printing. In particular, relying on clinical information, we decided to visualize the skull from the top, as it would be viewed in theatre, and to cut the model behind the coronal sutures and bilaterally at the level of the fronto-zygomatic sutures.
From the cut STL files, a phantom model made of photosensitive resin was produced by means of a stereolithography (SLA) 3D printer (Form 3, Formlabs, Somerville, MA, USA).
To evaluate the AR guidance accuracy, CAD/CAM templates were designed using MeshMixer 3.5 software (Autodesk Inc., Mill Valley, CA, USA), and these were to be positioned on the surface of the phantom model, as cutting guides, in correspondence with the planned FOR osteotomies. For the AR-guided task, we selected the nasal and the frontal osteotomies that are part of the fronto-orbital remodeling (Figure 2, left).
The templates were 3D printed (Form 3, Formlabs) with grooves of different widths (3 mm, 2 mm, 1 mm) in order to evaluate three levels of achievable accuracy (±1.5 mm, ±1.0 mm, and ±0.5 mm) (Figure 2, right). Strips of calibrated adhesive tape were applied to each template and used to measure the cumulative length of the traced osteotomy included within the grooves. We considered the AR-guided tasks successfully completed (100% success rate) when the traced osteotomy profile fell within the grooves of the cutting guides along their entire length (nasal osteotomy: 27 mm; frontal osteotomy: 75 mm).
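The success-rate metric described above (cumulative in-groove length divided by the planned osteotomy length) can be sketched as follows; the measured segment lengths are invented for illustration:

```python
# in_groove_mm: lengths of the traced segments that fell inside the groove,
# as read off the calibrated tape (values here are invented, not study data).
def success_rate(in_groove_mm, osteotomy_length_mm):
    """Cumulative in-groove length as a percentage of the planned osteotomy."""
    return 100.0 * sum(in_groove_mm) / osteotomy_length_mm

# Nasal osteotomy is 27 mm long; suppose 24.3 mm of the trace stayed in-groove.
print(round(success_rate([10.0, 14.3], 27.0), 1))  # 90.0
```

A task was counted as fully successful (100%) only when the whole trace stayed within the groove, i.e., when this ratio reached 100% of the 27 mm or 75 mm trajectory.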

2.1.3. The AR Application

The virtual skull model with all its components (bone, skin, eye globes, and brain) was imported into Unity 3D software 2019.4.21f1 (Unity Technologies, San Francisco, CA, USA), extended with a specific software development kit for creating augmented reality apps (Vuforia Engine package 9.8.5, PTC, Inc., Boston, MA, USA).
By means of the Vuforia Engine software, the registration between the virtual osteotomy traces and the skull phantom was achieved using the “model target” function, which allows the system to recognize the shape of an actual object to be tracked. To achieve this, the object has to be observed from a certain perspective by the surgeon wearing the AR glasses; in this case, we decided to reproduce the point of view of the surgeon in theatre when performing the surgical procedure for fronto-orbital remodeling. The application projects in front of the user wearing the AR glasses a profile (“guide view”) of the model target; the user simply needs to move the headset until the projected outline matches the actual object.
In this study, a 3D model of the patient’s skull was used as the model target for the virtual-to-real scene registration.
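Vuforia's model target tracking is proprietary and shape-based, but the underlying idea of virtual-to-real registration can be illustrated with classic point-based rigid alignment (the Kabsch/SVD method). This sketch is not the paper's pipeline; the landmark coordinates and pose below are invented:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (Kabsch): find R, t such that dst ≈ src @ R.T + t."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)          # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Virtual model landmarks (mm) and the same landmarks "seen" on the phantom
# after a known rotation about z and a translation; values are illustrative.
rng = np.random.default_rng(0)
virtual = rng.uniform(-50, 50, size=(6, 3))
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([5.0, -3.0, 12.0])
real = virtual @ R_true.T + t_true

R, t = rigid_register(virtual, real)
residual = np.linalg.norm(virtual @ R.T + t - real, axis=1).max()
print(residual < 1e-9)  # True: the pose is recovered almost exactly
```

In the headset, an equivalent alignment is maintained continuously by the tracking engine rather than computed once from landmarks.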
The AR application generates several holograms, which are superimposed on the printed skull portion, for each structure we want to visualize, i.e., the bony skull, the skin, the brain, the eye globes, and the FOR osteotomy trajectories to be used as guiding information during the surgical task.
The AR application was built as a UWP (Universal Windows Platform) app deployed on the Microsoft HoloLens 2 smart glasses.
Interactive user interface toggles (check boxes) were added to turn the rendering of each virtual anatomical structure and the planned virtual osteotomy trajectories on and off. Voice commands to show/hide the virtual structures were also implemented in order to provide a completely hands-free AR guidance system.

2.2. Experimental Phase

The authors tested the AR application for the HoloLens 2 by having the selected FOR osteotomies, i.e., the nasal osteotomy (27 mm long) and the frontal osteotomy (75 mm long), performed under its guidance.
We recruited three surgeons and three engineers (three females and three males, aged between 25 and 50 years). Each user repeated the procedure six times on the same 3D printed phantom, with an interval of one week between trials. Each user was briefed about the task they were to perform: the osteotomies had to be performed on the phantom under the AR guidance provided by the HoloLens 2. Each user, after calibrating the HMD for optimal hologram perception, looked at the phantom in order to track it via the model target tracking function and then began the AR-guided task. Each user carefully drew the trajectory of the osteotomy on the skull phantom with a pencil, following the planned trajectories displayed as holograms in dashed lines. Voice commands allowed the users to show/hide the virtual structures during the execution of their tasks.
Afterward, using the 3D printed templates for accuracy evaluation, another operator assessed the extent to which the line traced under AR guidance fell within the groove of each template. Each template had a calibrated tape along the groove to facilitate the measurements.

2.3. Statistics

All measurements were recorded in an Excel spreadsheet. Success percentages were computed, and both a Kruskal–Wallis test and a Mann–Whitney test were performed on the measurements.
SPSS software (IBM, Armonk, NY, USA) was used to perform the statistical analysis, and a p value of <0.05 was considered statistically significant.
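The study used SPSS, but the same two tests are available in common open-source packages. As a hedged illustration, the analysis can be reproduced with SciPy on invented per-operator measurements (the values below are not the study's data):

```python
from scipy import stats

# Invented per-operator in-groove percentages for one cut, six trials each;
# the paper's raw measurements are not reproduced here.
operators = [
    [92, 88, 95, 90, 91, 93],
    [70, 75, 72, 68, 74, 71],
    [85, 83, 88, 86, 84, 87],
]

# Kruskal–Wallis: do the operators' distributions differ overall?
h_stat, p_kw = stats.kruskal(*operators)

# Mann–Whitney U: pairwise comparison of operator 1 vs. operator 2.
u_stat, p_mw = stats.mannwhitneyu(operators[0], operators[1])

print(p_kw < 0.05, p_mw < 0.05)  # with these well-separated groups: True True
```

Both tests are non-parametric, which suits small samples of bounded percentage data where normality cannot be assumed.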

3. Results

The results are summarized in Table 1. With the HoloLens 2, 97% of the users were able to successfully trace the osteotomy trajectory with an accuracy level of ±1.5 mm (verified with the “3 mm” template) for the nasal cut. The percentage falls to 80% when looking at the frontal cut.
For accuracy levels of ±1 mm and ±0.5 mm, lower success rates were recorded: for the nasal cut, 80% and 61%, respectively. For the frontal cut, the users accomplished the task with an accuracy level of ±1 mm in 52% of cases, whereas only 33% precisely followed the groove at the ±0.5 mm level.
The Kruskal–Wallis test demonstrated that all the users were able to complete the nasal cut task with no significant differences between them. However, more inter-operator differences were reported for the frontal cut (Table 2).
From the measurements, only one outlier was evident according to the Mann–Whitney test (Table 3).
The users reported that the usability of the AR guidance system was very good, but most of them reported a perceived loss of image quality when moving the pencil in front of the visor.

4. Discussion

AR is a promising technology in the medical field; more and more studies on its applications, especially in the surgical field, are being published due to increasing interest [1,2,10,11,12,13,14,15,16,17,25,26,27,28,29].
Its introduction enables the realization of a full concept of navigation: the operator no longer has to shift his or her attention and can stay focused on the patient, onto whom the holograms are projected. A specific taxonomy was introduced in the 1960s to categorize this technology [13].
AR HMDs are divided into optical see-through devices and video see-through devices [11,12,13,15,16,17,23,24,25,26,27,28,29,30,31]. In this study, the authors used the HoloLens 2, an optical see-through HMD.
Craniofacial surgery, and in particular corrective procedures for forehead morphological anomalies, can be challenging due to the critical structures underlying the bone [23]. Furthermore, accuracy in the execution of the osteotomies is necessary to achieve a good result. In most cases of single-suture synostosis, the indication is merely morphological; therefore, errors beyond a certain magnitude can lead to disastrous results. However, it has been reported in the literature that a 2 mm error is still acceptable and will not necessarily compromise the overall result [25,26].
Therefore, an HMD suitable for this surgery has to be accurate and maintain the hologram in the field of vision even if the surgeon moves. The HoloLens has already demonstrated good potential in this sense [13,24].
However, different drawbacks and limitations have been reported in the literature, such as depth of perception and registration errors [17,32].
We encountered some of these drawbacks, too. In an effort to overcome these limitations, our group has already tried to address the registration errors, for which a static error ranging from 1 mm to 10 mm has been reported, resulting in a misalignment between the perceived virtual image and the actual image [17]. This registration error contributed to the overall error quantified in this study at the tracing stage, i.e., while performing the AR-guided task of tracing the osteotomy lines on the skull phantom.
In this study, the authors evaluated the accuracy of the HoloLens 2 when used to perform a craniofacial surgery task. The procedure of choice was fronto-orbital remodeling, focusing in particular on two osteotomies, nasal and frontal, which define the orbital rim.
The authors selected six operators, and every operator had to repeat the task six times for each osteotomy, both nasal and frontal. In order to avoid an additive learning curve effect, an interval of one week passed between each trial.
The users were asked to trace the osteotomies with a pencil under the guidance of the HoloLens 2 projection. On the lenses, they could see the phantom and the dashed lines of the planned osteotomies. We noticed some errors in virtual-to-real alignment and a loss of sharpness when moving the pencil in the field of view.
All the users were able to complete the task. The osteotomy traced with the pencil was then checked with the cutting guides with grooves of different widths.
In terms of accuracy, our findings are consistent with previous findings outlined in the literature, and, to a certain extent, this is encouraging. We observed the maximum accuracy at an error of ±1.5 mm. We also noticed that a lower level of accuracy was recorded for the frontal cut; this might be due to the length of the trace that had to be followed (75 mm) and to the more complex rounded anatomy.
Despite these technical pitfalls, the levels of accuracy that we reported are consistent with what has been described elsewhere in the literature [12,13,33,34,35].
Scherl et al. reported an accuracy of less than 1.3 mm in their in vivo study [13], whereas Tang et al., in their account of their experience with the HoloLens 2 in head and neck oncology, reported that the mean deviation between the preoperative virtual osteotomy plane and the actual postoperative osteotomy plane was 1.68 ± 0.92 mm, with the largest deviation being 3.46 mm [36].
Han et al. described their experience in craniosynostosis, and this is the only available on-patient work of its kind in the literature. Their study involved seven patients undergoing calvarial remodeling for plagiocephaly, and they compared the planned intracranial volume with the obtained one, with encouraging results [37]. In our study, we assessed another type of osteotomy for calvarial remodeling. This specific procedure consists of plain osteotomies and very limited areas for drilling and/or cutting. In this case, the authors focused on the simplest tasks due to the technical limitations of the HoloLens 2, including, but not limited to, registration errors and the small augmentable field of view. Therefore, where there is a more stringent requirement for accuracy (i.e., sub-millimetric precision), other “surgery-specific” devices should be considered, such as the ones previously described in the literature [17,38,39,40,41].
Our findings have, however, been encouraging. Our results included only one outlier; nevertheless, we cannot exclude a learning curve effect.
Our study had limitations, principally due to the advantageous lighting conditions compared with those typically found in theatre, the errors in model target registration, and operator-dependent factors. The authors also wish to emphasize that, although loupe glasses are in common use, HMD technology has not yet been optimized for compatibility with loupes. Furthermore, the procedures were performed in ideal conditions. Our next step will be to enhance the navigation system by adding more details and more 3D objects, to be viewed simultaneously with the trajectories to be followed with the instruments (i.e., osteotomies). The preparation time for the 3D model reconstruction, starting from the DICOM data segmentation and including the setting up of the AR guidance software, was only one to two hours [17]. An in vivo study will be necessary to confirm these preliminary data.

5. Conclusions

Although this was an in vitro study, the encouraging results at the ±1.5 mm accuracy level suggest that AR guidance is suitable for application in craniofacial surgery, where such a margin of error is acceptable for certain tasks. More studies, including in vivo evaluations, are required to overcome the technical pitfalls of this promising technology.

Author Contributions

Conceptualization, F.R., L.C., G.B., E.M. and A.T.; methodology, F.R., L.C., M.B. and N.E.; software, F.R., L.C., N.E. and M.B.; validation, F.R., L.C., G.B., M.Z., E.M. and A.T.; formal analysis, F.R. and M.B.; investigation, F.R., L.C. and N.E.; resources, L.C., M.Z., E.M. and A.T.; data curation, F.R., L.C., N.E. and M.B.; writing—original draft preparation, F.R. and L.C.; writing—review and editing, F.R., L.C., E.M. and A.T. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

This study was conducted according to the guidelines of the Declaration of Helsinki, and it was approved by the Institutional Review Board (or Ethics Committee) of CEAVEC (protocol code CE 499-2022-OSS-AUSLBO, approved on 14 September 2022).

Informed Consent Statement

Not applicable.

Data Availability Statement

Data are available upon reasonable request.

Acknowledgments

The authors would like to thank Lorenzo Federico and Micol Babini for their participation.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Ayoub, A.; Pulijala, Y. The application of virtual reality and augmented reality in Oral & Maxillofacial Surgery. BMC Oral Health 2019, 19, 238.
2. Marzano, E.; Piardi, T.; Soler, L.; Diana, M.; Mutter, D.; Marescaux, J.; Pessaux, P. Augmented reality-guided artery-first pancreatico-duodenectomy. J. Gastrointest. Surg. 2013, 17, 1980–1983.
3. Ghaednia, H.; Fourman, M.S.; Lans, A.; Detels, K.; Dijkstra, H.; Lloyd, S.; Sweeney, A.; Oosterhoff, J.H.F.; Schwab, J.H. Augmented and virtual reality in spine surgery, current applications and future potentials. Spine J. 2021, 21, 1617–1625.
4. Verhey, J.T.; Haglin, J.M.; Verhey, E.M.; Hartigan, D.E. Virtual, augmented, and mixed reality applications in orthopedic surgery. Int. J. Med. Robot. 2020, 16, e2067.
5. Quero, G.; Lapergola, A.; Soler, L.; Shahbaz, M.; Hostettler, A.; Collins, T.; Marescaux, J.; Mutter, D.; Diana, M.; Pessaux, P. Virtual and Augmented Reality in Oncologic Liver Surgery. Surg. Oncol. Clin. N. Am. 2019, 28, 31–44.
6. Aguilar-Salinas, P.; Gutierrez-Aguirre, S.F.; Avila, M.J.; Nakaji, P. Current status of augmented reality in cerebrovascular surgery: A systematic review. Neurosurg. Rev. 2022, 45, 1951–1964.
7. Schiavina, R.; Bianchi, L.; Lodi, S.; Cercenelli, L.; Chessa, F.; Bortolani, B.; Gaudiano, C.; Casablanca, C.; Droghetti, M.; Porreca, A.; et al. Real-time Augmented Reality Three-dimensional Guided Robotic Radical Prostatectomy: Preliminary Experience and Evaluation of the Impact on Surgical Planning. Eur. Urol. Focus 2021, 7, 1260–1267.
8. Schiavina, R.; Bianchi, L.; Chessa, F.; Barbaresi, U.; Cercenelli, L.; Lodi, S.; Gaudiano, C.; Bortolani, B.; Angiolini, A.; Mineo Bianchi, F.; et al. Augmented Reality to Guide Selective Clamping and Tumor Dissection During Robot-assisted Partial Nephrectomy: A Preliminary Experience. Clin. Genitourin. Cancer 2021, 19, e149–e155.
9. Bianchi, L.; Chessa, F.; Angiolini, A.; Cercenelli, L.; Lodi, S.; Bortolani, B.; Molinaroli, E.; Casablanca, C.; Droghetti, M.; Gaudiano, C.; et al. The Use of Augmented Reality to Guide the Intraoperative Frozen Section During Robot-assisted Radical Prostatectomy. Eur. Urol. 2021, 80, 480–488.
10. Kim, Y.; Kim, H.; Kim, Y.O. Virtual reality and augmented reality in plastic surgery: A review. Arch. Plast. Surg. 2017, 44, 179–187.
11. Okamoto, T.; Onda, S.; Yanaga, K.; Suzuki, N.; Hattori, A. Clinical application of navigation surgery using augmented reality in the abdominal field. Surg. Today 2015, 45, 397–406.
12. Scherl, C.; Stratemeier, J.; Rotter, N.; Hesser, J.; Schönberg, S.O.; Servais, J.J.; Männle, D.; Lammert, A. Augmented Reality with HoloLens® in Parotid Tumor Surgery: A Prospective Feasibility Study. ORL J. Otorhinolaryngol. Relat. Spec. 2021, 83, 439–448.
13. Scherl, C.; Stratemeier, J.; Karle, C.; Rotter, N.; Hesser, J.; Huber, L.; Dias, A.; Hoffmann, O.; Riffel, P.; Schoenberg, S.O.; et al. Augmented reality with HoloLens in parotid surgery: How to assess and to improve accuracy. Eur. Arch. Otorhinolaryngol. 2021, 278, 2473–2483.
14. Guha, D.; Alotaibi, N.M.; Nguyen, N.; Gupta, S.; McFaul, C.; Yang, V.X.D. Augmented Reality in Neurosurgery: A Review of Current Concepts and Emerging Applications. Can. J. Neurol. Sci. 2017, 44, 235–245.
15. Badiali, G.; Ferrari, V.; Cutolo, F.; Freschi, C.; Caramella, D.; Bianchi, A.; Marchetti, C. Augmented reality as an aid in maxillofacial surgery: Validation of a wearable system allowing maxillary repositioning. J. Cranio Maxillo Facial Surg. 2014, 42, 1970–1976.
16. Qu, M.; Hou, Y.; Xu, Y.; Shen, C.; Zhu, M.; Xie, L.; Wang, H.; Zhang, Y.; Chai, G. Precise positioning of an intraoral distractor using augmented reality in patients with hemifacial microsomia. J. Craniomaxillofac. Surg. 2015, 43, 106–112.
17. Cercenelli, L.; Babini, F.; Badiali, G.; Battaglia, S.; Tarsitano, A.; Marchetti, C.; Marcelli, E. Augmented Reality to Assist Skin Paddle Harvesting in Osteomyocutaneous Fibular Flap Reconstructive Surgery: A Pilot Evaluation on a 3D-Printed Leg Phantom. Front. Oncol. 2022, 11, 804748.
18. Badiali, G.; Cercenelli, L.; Battaglia, S.; Marcelli, E.; Marchetti, C.; Ferrari, V.; Cutolo, F. Review on Augmented Reality in Oral and Cranio-Maxillofacial Surgery: Toward “Surgery-Specific” Head-Up Displays. IEEE Access 2020, 2020, 59015–59028.
19. Benmahdjoub, M.; van Walsum, T.; van Twisk, P.; Wolvius, E.B. Augmented reality in craniomaxillofacial surgery: Added value and proposed recommendations through a systematic review of the literature. Int. J. Oral. Maxillofac. Surg. 2021, 50, 969–978.
20. Battaglia, S.; Badiali, G.; Cercenelli, L.; Bortolani, B.; Marcelli, E.; Cipriani, R.; Contedini, F.; Marchetti, C.; Tarsitano, A. Combination of CAD/CAM and Augmented Reality in Free Fibula Bone Harvest. Plast. Reconstr. Surg. Glob. Open 2019, 7, e2510.
21. Battaglia, S.; Ratti, S.; Manzoli, L.; Marchetti, C.; Cercenelli, L.; Marcelli, E.; Tarsitano, A.; Ruggeri, A. Augmented Reality-Assisted Periosteum Pedicled Flap Harvesting for Head and Neck Reconstruction: An Anatomical and Clinical Viability Study of a Galeo-Pericranial Flap. J. Clin. Med. 2020, 9, 2211.
22. Ceccariglia, F.; Cercenelli, L.; Badiali, G.; Marcelli, E.; Tarsitano, A. Application of Augmented Reality to Maxillary Resections: A Three-Dimensional Approach to Maxillofacial Oncologic Surgery. J. Pers. Med. 2022, 12, 2047.
23. Mathijssen, I.M.J. Working Group Guideline Craniosynostosis. Updated Guideline on Treatment and Management of Craniosynostosis. J. Craniofac. Surg. 2021, 32, 371–450.
24. Vassallo, R.; Rankin, A.; Chen, E.C.S.; Peters, T.M. Hologram Stability Evaluation for Microsoft HoloLens. In Proceedings of the SPIE Medical Imaging 2017: Image Perception, Observer Performance, and Technology Assessment; Kupinski, M.A., Nishikawa, R.M., Eds.; Society of Photo-Optical Instrumentation Engineers (SPIE): Bellingham, WA, USA, 2017; Volume 10136.
25. Wandell, A.; Papanastassiou, A.; Tarasiewicz, I.; Miller, M. What is the Accuracy of PEEK Implants for Cranioplasty in Comparison to Their Patient Specific Surgical Plan? J. Oral. Maxillofac. Surg. 2023, 81, 24–31.
26. Mazzoni, S.; Bianchi, A.; Schiariti, G.; Badiali, G.; Marchetti, C. Computer-aided design and computer-aided manufacturing cutting guides and customized titanium plates are useful in upper maxilla waferless repositioning. J. Oral Maxillofac. Surg. 2015, 73, 701–707.
27. Porpiglia, F.; Fiori, C.; Checcucci, E.; Amparore, D.; Bertolo, R. Augmented reality robot-assisted radical prostatectomy: Preliminary experience. Urology 2018, 115, 184.
28. Elmi-Terander, A.; Nachabe, R.; Skulason, H.; Pedersen, K.; Söderman, M.; Racadio, J.; Babic, D.; Gerdhem, P.; Edström, E. Feasibility and accuracy of thoracolumbar minimally invasive pedicle screw placement with augmented reality navigation technology. Spine 2018, 43, 1018–1023.
29. Bong, J.H.; Song, H.J.; Oh, Y.; Park, N.; Kim, H.; Park, S. Endoscopic navigation system with extended field of view using augmented reality technology. Int. J. Med. Robot. 2018, 14, e1886.
30. Qian, L.; Barthel, A.; Johnson, A.; Osgood, G.; Kazanzides, P.; Navab, N.; Fuerst, B. Comparison of optical see-through head-mounted displays for surgical interventions with object-anchored 2D-display. Int. J. Comput. Assist. Radiol. Surg. 2017, 12, 901–910.
31. Bremers, A.W.D.; Yöntem, A.O.; Li, K.; Chu, D.; Meijering, V.; Janssen, C.P. Perception of Perspective in Augmented Reality Head-Up Displays. Int. J. Hum. Comput. Stud. 2021, 155, 102693.
32. Tokunaga, D.M.; Corrêa, C.G.; Bernardo, F.M.; Bernardes, J.; Ranzini, E.; Nunes, F.L.S.; Tori, R. Registration System Errors Perception in Augmented Reality Based on RGB-D Cameras. In Virtual, Augmented and Mixed Reality; Lecture Notes in Computer Science; Shumaker, R., Lackey, S., Eds.; Springer International Publishing: Cham, Switzerland, 2015; pp. 119–129.
33. Barber, S.R.; Jain, S.; Son, Y.-J.; Chang, E.H. Virtual functional endoscopic sinus surgery simulation with 3D-printed models for mixed-reality nasal endoscopy. Otolaryngol. Head Neck Surg. 2018, 159, 933–937.
34. Li, L.; Yang, J.; Chu, Y.; Wu, W.; Xue, J.; Liang, P.; Chen, L. A novel augmented reality navigation system for endoscopic sinus and skull base surgery: A feasibility study. PLoS ONE 2016, 11, e0146996.
35. Van Doormaal, T.P.C.; van Doormaal, J.A.M.; Mensink, T. Clinical Accuracy of Holographic Navigation Using Point-Based Registration on Augmented-Reality Glasses. Oper. Neurosurg. Hagerstown 2019, 17, 588–593.
36. Tang, Z.N.; Hu, L.H.; Soh, H.Y.; Yu, Y.; Zhang, W.B.; Peng, X. Accuracy of Mixed Reality Combined with Surgical Navigation Assisted Oral and Maxillofacial Tumor Resection. Front. Oncol. 2022, 11, 715484.
37. Han, W.; Yang, X.; Wu, S.; Fan, S.; Chen, X.; Aung, Z.M.; Liu, T.; Zhang, Y.; Gu, S.; Chai, G. A new method for cranial vault reconstruction: Augmented reality in synostotic plagiocephaly surgery. J. Craniomaxillofac. Surg. 2019, 47, 1280–1284.
38. Cercenelli, L.; Carbone, M.; Condino, S.; Cutolo, F.; Marcelli, E.; Tarsitano, A.; Marchetti, C.; Ferrari, V.; Badiali, G. The Wearable VOSTARS System for Augmented Reality-Guided Surgery: Preclinical Phantom Evaluation for High-Precision Maxillofacial Tasks. J. Clin. Med. 2020, 9, 3562.
39. Badiali, G.; Cutolo, F.; Cercenelli, L.; Carbone, M.; D’Amato, R.; Ferrari, V.; Marchetti, C. The Vostars Project: A New Wearable Hybrid Video and Optical See-Through Augmented Reality Surgical System for Maxillofacial Surgery. Int. J. Oral. Maxillofac. Surg. 2019, 48, 153.
40. Cutolo, F.; Freschi, C.; Mascioli, S.; Parchi, P.D.; Ferrari, M.; Ferrari, V. Robust and Accurate Algorithm for Wearable Stereoscopic Augmented Reality with Three Indistinguishable Markers. Electronics 2016, 5, 59.
41. Carbone, M.; Cutolo, F.; Condino, S.; Badiali, G.; Ferrari, V. Architecture of a Hybrid Video/Optical See-through Head-Mounted Display-Based Augmented Reality Surgical Navigation Platform. Information 2022, 13, 81.
Figure 1. Development phase: (a) from CT scan to virtual content preparation; the virtual 3D skull model was also 3D-printed to obtain a patient-specific phantom; (b) Unity software interface used for AR application development; (c) the planned osteotomy lines displayed in AR with HoloLens 2 smart glasses.
Figure 2. On the left, the planned osteotomies for the fronto-orbital bandeau of the fronto-orbital remodeling (FOR); on the right, the 3D-printed cutting guides with calibrated grooves for both osteotomies.
Table 1. Measurements taken from each recruited user. The left column shows the measurements for the nasal cut and the right column shows the measurements for the frontal cut. CG: cutting guide.
| PT | Meas. | Nose CG 3 (mm) | Nose CG 2 (mm) | Nose CG 1 (mm) | Frontal CG 3 (mm) | Frontal CG 2 (mm) | Frontal CG 1 (mm) |
|----|-------|----------------|----------------|----------------|-------------------|-------------------|-------------------|
| 1  | 1 | 27 | 27 | 24 | 75 | 70 | 40 |
| 1  | 2 | 27 | 27 | 27 | 75 | 75 | 55 |
| 1  | 3 | 27 | 25 | 23 | 75 | 75 | 60 |
| 1  | 4 | 27 | 24 | 22 | 75 | 72 | 65 |
| 1  | 5 | 27 | 27 | 27 | 70 | 70 | 60 |
| 1  | 6 | 27 | 26 | 25 | 75 | 75 | 75 |
| 2  | 1 | 27 | 26 | 23 | 75 | 55 | 35 |
| 2  | 2 | 27 | 27 | 26 | 75 | 75 | 75 |
| 2  | 3 | 27 | 27 | 27 | 75 | 73 | 55 |
| 2  | 4 | 27 | 27 | 27 | 75 | 75 | 74 |
| 2  | 5 | 27 | 27 | 25 | 75 | 75 | 75 |
| 2  | 6 | 27 | 27 | 27 | 75 | 75 | 35 |
| 3  | 1 | 27 | 25 | 20 | 75 | 75 | 70 |
| 3  | 2 | 27 | 27 | 27 | 75 | 71 | 75 |
| 3  | 3 | 27 | 27 | 27 | 75 | 75 | 70 |
| 3  | 4 | 27 | 27 | 27 | 75 | 75 | 75 |
| 3  | 5 | 27 | 27 | 27 | 75 | 70 | 60 |
| 3  | 6 | 27 | 27 | 27 | 75 | 70 | 35 |
| 4  | 1 | 27 | 27 | 27 | 75 | 75 | 65 |
| 4  | 2 | 27 | 15 | 10 | 75 | 75 | 75 |
| 4  | 3 | 27 | 27 | 26 | 75 | 75 | 75 |
| 4  | 4 | 27 | 27 | 22 | 75 | 70 | 55 |
| 4  | 5 | 27 | 27 | 27 | 73 | 71 | 65 |
| 4  | 6 | 27 | 27 | 27 | 70 | 60 | 60 |
| 5  | 1 | 27 | 27 | 27 | 70 | 50 | 35 |
| 5  | 2 | 27 | 27 | 27 | 70 | 65 | 55 |
| 5  | 3 | 20 | 12 | 11 | 65 | 45 | 35 |
| 5  | 4 | 27 | 25 | 25 | 75 | 59 | 54 |
| 5  | 5 | 27 | 27 | 27 | 75 | 75 | 45 |
| 5  | 6 | 27 | 27 | 27 | 75 | 57 | 45 |
| 6  | 1 | 27 | 27 | 27 | 75 | 75 | 75 |
| 6  | 2 | 27 | 27 | 27 | 65 | 60 | 55 |
| 6  | 3 | 27 | 27 | 27 | 75 | 75 | 75 |
| 6  | 4 | 27 | 27 | 27 | 75 | 75 | 75 |
| 6  | 5 | 27 | 27 | 27 | 75 | 75 | 75 |
| 6  | 6 | 27 | 27 | 27 | 75 | 75 | 75 |
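To illustrate how summary statistics for the frontal-cut measurements in Table 1 could be derived, here is a minimal plain-Python sketch (not part of the paper's analysis; the variable names are ours, and the values are transcribed from the table, one row of six measurements per operator):

```python
# Frontal-cut lengths (mm) transcribed from Table 1, six measurements per
# operator (PT 1-6), for the 3 mm and 1 mm cutting-guide grooves.
cg3 = [75, 75, 75, 75, 70, 75,   # PT 1
       75, 75, 75, 75, 75, 75,   # PT 2
       75, 75, 75, 75, 75, 75,   # PT 3
       75, 75, 75, 75, 73, 70,   # PT 4
       70, 70, 65, 75, 75, 75,   # PT 5
       75, 65, 75, 75, 75, 75]   # PT 6

cg1 = [40, 55, 60, 65, 60, 75,   # PT 1
       35, 75, 55, 74, 75, 35,   # PT 2
       70, 75, 70, 75, 60, 35,   # PT 3
       65, 75, 75, 55, 65, 60,   # PT 4
       35, 55, 35, 54, 45, 45,   # PT 5
       75, 55, 75, 75, 75, 75]   # PT 6

def mean(xs):
    return sum(xs) / len(xs)

print(f"3 mm guide: mean {mean(cg3):.1f} mm (range {min(cg3)}-{max(cg3)})")
print(f"1 mm guide: mean {mean(cg1):.1f} mm (range {min(cg1)}-{max(cg1)})")
```

The drop in mean length with the thinner guide groove reflects the shorter cuts visible in the raw data.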
Table 2. Kruskal–Wallis test demonstrating no significant differences between operators.
|                  | fro 3 mm | fro 2 mm | fro 1 mm | nos 3 mm | nos 2 mm | nos 1 mm |
|------------------|----------|----------|----------|----------|----------|----------|
| Kruskal–Wallis H | 6.992    | 9.579    | 13.083   | 5.000    | 4.883    | 6.521    |
| df               | 5        | 5        | 5        | 5        | 5        | 5        |
| Asymp. Sig.      | 0.221    | 0.088    | 0.023    | 0.416    | 0.430    | 0.259    |
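The Kruskal–Wallis comparison across operators can be sketched with SciPy; the groups below are the frontal 1 mm guide measurements from Table 1, one list per operator (the paper used SPSS, and SciPy's tie correction may shift the H statistic slightly relative to the value reported above):

```python
from scipy.stats import kruskal

# Frontal cut lengths (mm) with the 1 mm cutting guide, per operator (Table 1)
groups = [
    [40, 55, 60, 65, 60, 75],  # PT 1
    [35, 75, 55, 74, 75, 35],  # PT 2
    [70, 75, 70, 75, 60, 35],  # PT 3
    [65, 75, 75, 55, 65, 60],  # PT 4
    [35, 55, 35, 54, 45, 45],  # PT 5
    [75, 55, 75, 75, 75, 75],  # PT 6
]

# Kruskal-Wallis H test: nonparametric one-way comparison of k independent
# samples; df = k - 1 = 5 here, matching Table 2.
h, p = kruskal(*groups)
print(f"H = {h:.3f}, df = {len(groups) - 1}, p = {p:.3f}")
```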
Table 3. Mann–Whitney test, according to which only one outlier among the operators was evident.
| Comparison | Z      | Asymp. Sig. (2-tailed) |
|------------|--------|------------------------|
| op2–op1    | −0.315 | 0.752 |
| op3–op1    | −0.677 | 0.498 |
| op4–op1    | −1.051 | 0.293 |
| op5–op1    | −2.023 | 0.043 |
| op6–op1    | −1.841 | 0.066 |
| op3–op2    | 0.921  | 0.357 |
| op4–op2    | −1.214 | 0.225 |
| op5–op2    | −1.786 | 0.074 |
| op6–op2    | −1.361 | 0.174 |
| op4–op3    | −0.412 | 0.680 |
| op5–op3    | −1.997 | 0.046 |
| op6–op3    | −0.984 | 0.343 |
| op5–op4    | −2.207 | 0.027 |
| op6–op4    | −0.816 | 0.414 |
| op6–op5    | −2.041 | 0.041 |
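The pairwise comparisons in Table 3 can be sketched with SciPy's Mann–Whitney U test over all operator pairs (again using the frontal 1 mm guide data from Table 1 as the example column; note that SciPy reports the U statistic and p-value, whereas SPSS, used in the paper, reports the equivalent normal-approximation Z, so the statistics are not directly comparable even when the p-values agree):

```python
from itertools import combinations
from scipy.stats import mannwhitneyu

# Frontal cut lengths (mm) with the 1 mm cutting guide, per operator (Table 1)
ops = {
    "op1": [40, 55, 60, 65, 60, 75],
    "op2": [35, 75, 55, 74, 75, 35],
    "op3": [70, 75, 70, 75, 60, 35],
    "op4": [65, 75, 75, 55, 65, 60],
    "op5": [35, 55, 35, 54, 45, 45],
    "op6": [75, 55, 75, 75, 75, 75],
}

# Two-sided Mann-Whitney U test for each of the 15 operator pairs.
results = {}
for (a, xs), (b, ys) in combinations(ops.items(), 2):
    u, p = mannwhitneyu(xs, ys, alternative="two-sided")
    results[f"{b}-{a}"] = (u, p)
    print(f"{b}-{a}: U = {u:.1f}, p = {p:.3f}")
```

Running all 15 pairwise tests without a multiple-comparison correction, as here, inflates the family-wise error rate, which is one reason isolated p-values just under 0.05 in Table 3 should be read cautiously.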

Ruggiero, F.; Cercenelli, L.; Emiliani, N.; Badiali, G.; Bevini, M.; Zucchelli, M.; Marcelli, E.; Tarsitano, A. Preclinical Application of Augmented Reality in Pediatric Craniofacial Surgery: An Accuracy Study. J. Clin. Med. 2023, 12, 2693. https://doi.org/10.3390/jcm12072693