Systematic Review

Augmented Reality in Dental Extractions: Narrative Review and an AR-Guided Impacted Mandibular Third-Molar Case

1 Oral Surgery Unit, Department of Biomedical and Neuromotor Sciences (DIBINEM), University of Bologna, 40125 Bologna, Italy
2 Laboratory of Bioengineering—eDIMES Lab, Department of Medical and Surgical Sciences (DIMEC), University of Bologna, 40138 Bologna, Italy
3 Istituto di Ricovero e Cura a Carattere Scientifico Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(17), 9723; https://doi.org/10.3390/app15179723
Submission received: 21 July 2025 / Revised: 29 August 2025 / Accepted: 29 August 2025 / Published: 4 September 2025

Abstract

Background: Augmented-reality (AR) navigation is emerging as a means of turning pre-operative cone-beam CT data into intuitive, in situ guidance for difficult tooth removal, yet the scattered evidence has never been consolidated or illustrated with a full clinical workflow. Aims: This study aims to narratively synthesise AR applications limited to dental extractions and to illustrate a full AR-guided clinical workflow. Methods: We performed a PRISMA-informed narrative search (PubMed + Cochrane, January 2015–June 2025) focused exclusively on AR applications in dental extractions and found nine eligible studies. Results: These pilot reports—covering impacted third molars, supernumerary incisors, canines, and cyst-associated teeth—all used marker-less registration on natural dental surfaces and achieved mean target-registration errors below 1 mm with headset set-up times under three minutes; the only translational series (six molars) recorded a mean surgical duration of 21 ± 6 min and a System Usability Scale score of 79. To translate these findings into practice, we describe a case of AR-guided mandibular third-molar extraction. A QR-referenced 3D-printed splint, intra-oral scan, and CBCT were fused to create a colour-coded hologram rendered on a Magic Leap 2 headset. The procedure took 19 min, required only a conservative osteotomy and an accurate odontotomy, and ended without neurosensory disturbance (VAS pain 2/10 at one week). Conclusions: Collectively, the literature synthesis and clinical demonstration suggest that current AR platforms deliver sub-millimetre accuracy, minimal workflow overhead, and high user acceptance in high-risk extractions while highlighting the need for larger, controlled trials to prove tangible patient benefit.

1. Introduction

Augmented-reality (AR) technology projects computer-generated three-dimensional information directly onto the operative field, enabling surgeons to view virtual osteotomy lines, anatomical landmarks, or instrument trajectories while maintaining an unobstructed view of the patient [1,2,3]. By contrast, virtual reality immerses the user in a totally synthetic scene [4]. AR, therefore, lends itself more naturally to intra-operative navigation. The earliest maxillofacial prototypes, published just over a decade ago, used stereo cameras and half-silvered mirrors to display the inferior alveolar canal on cadaver mandibles and already achieved placement errors of about two millimetres [5]. Subsequent advances—most notably intra-oral optical scanners, marker-less tooth-recognition algorithms, and head-mounted displays such as Microsoft HoloLens 2—have reduced registration error below one millimetre and cut set-up times to fewer than three minutes [6,7].
Although implant navigation and maxillofacial surgeries such as orthognathic repositioning dominate the AR literature [8,9,10,11,12,13,14,15,16], an emerging line of enquiry has begun to address complex dental extractions, in which deeply embedded roots, nerves, and adjacent structures are notoriously difficult to visualise on two-dimensional imaging. Mandibular third-molar surgery concentrates the clinical risk: recent cohorts report ≈2–3% transient and ≈0.5–1% permanent inferior alveolar nerve (IAN) injury (e.g., 2.4% transient and 0.57% permanent in a 2024 series of 705 extractions), and lingual nerve (LN) injury rises markedly when lingual retraction is used (meta-analysis: ~0.08% permanent without retraction vs. ~7.9% temporary with repurposed retractors) [17,18]. Because greater bone and soft-tissue trauma correlates with more postoperative pain/trismus and slower recovery, minimising osteotomy is clinically consequential [19,20,21]. By accurately superimposing patient-specific cone-beam CT volumes on the jaw, an AR navigator can delineate the narrowest safe corticotomy and steer luxation vectors in real time; phantom experiments have already achieved mean registration errors well below a millimetre, a threshold comparable to state-of-the-art optical tracking [22,23].
Parallel research has flourished in the educational sphere, where mixed-reality headsets create immersive simulations that sharpen psychomotor skills and spatial understanding [24]. Yet the clinical corpus remains extremely small—fewer than twenty extraction-related patients have been reported—and no randomised trial has so far been undertaken. Systematic overviews of AR in oral and maxillofacial surgery continue to concentrate on implant navigation or orthognathic repositioning, touching on dentoalveolar indications only tangentially [25,26]. The resulting heterogeneity of head-mounted displays, registration strategies, and outcome metrics hampers cross-study comparison and blurs the true incremental value of AR guidance for tooth removal. Nevertheless, converging signals—sub-millimetre accuracy in bench tests, operative times comparable to standard care in first-in-human pilots, and consistently favourable usability scores—suggest that AR can enhance visualisation, minimise bone sacrifice, and may ultimately reduce iatrogenic nerve injury when managing impacted or ectopic teeth.
This study has two aims: (i) to provide a PRISMA-informed narrative synthesis of all PubMed-indexed studies on AR for dental extractions (visualisation, planning, and surgical treatment); and (ii) to present a worked AR-guided mandibular third-molar case that demonstrates the end-to-end workflow. To our knowledge, this is the first review focused solely on AR in dental extractions paired with a clinical example, and it highlights the technological and methodological advances still required for broad clinical adoption.

2. Materials and Methods

This narrative review investigates augmented-reality (AR) applications limited to dental extractions—including wisdom-tooth removal, surgery on impacted or supernumerary teeth, cyst enucleation associated with tooth removal, and exposure of impacted canines. Uses of AR in implantology and orthognathic or trauma surgery were intentionally excluded. The review protocol was prospectively specified in an internal document and subsequently registered in the PROSPERO database (CRD420251117535) after the initial searches; the review was conducted without deviation from the pre-specified protocol. The review followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) framework insofar as it is applicable to narrative reviews, ensuring transparent reporting of the search process, study selection, and data extraction [27].

2.1. Search Strategy

A comprehensive electronic search was performed in PubMed and the Cochrane Library (CENTRAL and Cochrane Reviews) covering the period 1 January 2015–12 June 2025. The PubMed search was last run on 12 June 2025; the Cochrane strategies were verified and archived on 12 August 2025 to document the exact queries and counts.
The strategy paired AR-related terms with dental-extraction terminology, explicitly targeting studies on AR guidance across education (undergraduate and postgraduate) and real-world surgical practice.
Full, verbatim database strings are provided in Appendix A (Appendix A.1: PubMed; Appendix A.2: Cochrane).

2.2. Inclusion and Exclusion Criteria

For this review, we considered only peer-reviewed articles published in English between 2015 and 2025 that applied augmented- or mixed-reality technology to any phase of a dental-extraction workflow—whether the work dealt with pre-operative simulation, training, surgical exposure, real-time guidance, or postoperative evaluation of wisdom-tooth removal, impacted or supernumerary teeth, dentigerous-cyst-associated extractions, or the surgical exposure of canines. Original investigations of any design, from single-patient case reports to randomised clinical trials and cadaver or phantom simulations, were eligible, provided that an AR component was integral to the procedure. Conversely, we excluded systematic or narrative reviews (which were consulted only for contextual background), conference abstracts without a full text, non-English papers, and studies that addressed virtual reality or conventional navigation without an AR element or that focused on maxillofacial, ENT, orthognathic, or implant surgery in which no tooth extraction was performed. Records lacking full-text access were also excluded.

2.3. Article Selection Process

The searches retrieved 237 records from PubMed and 15 from the Cochrane Library (CENTRAL), for a total of 252 records before de-duplication. After removing 2 duplicates, 250 unique records were screened. Limiting the time frame to studies published from 2015 onward reduced the pool to 226. Titles and abstracts were screened against the predefined eligibility criteria, and 181 articles were excluded because they did not address augmented-reality applications in dental extractions. The remaining 45 articles were assessed for full-text availability; 3 had no accessible full text (non-subscribed journals and language/hosting constraints). In keeping with PRISMA recommendations, the corresponding authors were contacted and given a two-week period to provide the missing manuscripts, but no responses were received, and these 3 studies were therefore excluded. Full-text review of the remaining 42 articles led to the exclusion of 33 additional papers that failed to meet the inclusion criteria, leaving 9 studies for the final qualitative synthesis. Screening was performed independently by two oral surgeons with research experience in implantology; when disagreement arose, consensus was reached with the help of a third reviewer of the same professional background. A PRISMA flow diagram illustrates the selection pathway (Figure 1).

2.4. Data Extraction and Quality Assessment

For each of the nine eligible studies, the following information was recorded in a standardised spreadsheet: publication year, study design, extraction scenario (e.g., wisdom-tooth removal, supernumerary-tooth extraction, or exposure of impacted canines), AR hardware and software employed, metrics of surgical accuracy and operative time, and any quantitative or qualitative data on user experience or training benefit. Although no formal risk-of-bias tool was applied—because the body of evidence consists largely of pilot reports, case series, and phantom studies—every paper was scrutinised for methodological clarity, completeness of outcome reporting, and coherence between aims and conclusions. Particular attention was paid to sample size, validation of accuracy measurements, and transparency of the AR workflow. This critical appraisal, while narrative in nature, provides a cautious context for interpreting the heterogeneous results that characterise early research in AR-assisted dental extractions.

2.5. Surgical Workflow of the Case

A mandibular third-molar extraction was selected as the exemplar because it is among the most frequent dentoalveolar procedures and carries a clinically relevant risk of inferior alveolar nerve injury, making conservative bone removal particularly important. This indication also maximises the expected benefit of in-field AR overlays for nerve visualisation and osteotomy planning. Although the present case centres on a third molar, the planning–registration–overlay steps are transferable to other extraction scenarios covered in the review (e.g., supernumerary incisors, impacted canines, and cyst-associated teeth).

2.5.1. Digital Workflow

The digital workflow began with an intra-oral surface acquisition: the patient’s mandibular arch was scanned with an intra-oral scanner (TRIOS 4, 3Shape, Copenhagen, Denmark) to generate an STL file accurate to 20 µm. A cone-beam CT (CBCT) (NewTom, Cefla, Imola, Italy) scan was then obtained. DICOM data were imported into Mimics Medical 27 software (Materialise, Leuven, Belgium) and threshold-segmented to isolate the mandibular cortex, the fully impacted right third molar, and the inferior alveolar canal. The original TRIOS STL was co-registered to the CBCT surface in a CAD software (3-Matic Medical, version 9.0, Materialise, Leuven, Belgium), yielding a unified virtual 3D model of the patient’s mandible with teeth accurately reconstructed from an intra-oral scan. The mandibular 3D model was utilised to design a dental splint specifically fitted to the subject’s lower dental arch. The splint included a holder engineered to uniquely attach to a QR-based marker (5 × 5 cm), ensuring a precise and consistent position relative to the patient that is easily reproducible during surgery. The holder was integrated into the splint in the region of the right canine to allow for an unobstructed line of sight during surgery. The splint with the holder was produced with a stereolithography 3D printer (Form3B, FormLabs, Somerville, MA, USA) using biocompatible resin (BioMed Clear, FormLabs, Somerville, MA, USA), while the QR-based marker was produced as a textured object using Polyjet 3D printing technology (J720 Dental 3D printer, Stratasys Ltd., Eden Prairie, MN, USA) and then securely connected to the holder through a mechanical joint (Figure 2).
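The co-registration of the intra-oral STL to the CBCT surface described above is, in essence, a rigid point-cloud alignment problem. As an illustration only (the commercial software's actual algorithm is not described here), the following Python sketch shows a minimal iterative closest point (ICP) alignment with an SVD-based (Kabsch) rigid fit; `icp` and `best_fit_transform` are hypothetical helper names, not part of any cited toolchain:

```python
import numpy as np

def best_fit_transform(src, dst):
    """SVD-based (Kabsch) rigid fit: R, t minimising ||(src @ R.T + t) - dst||."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:          # guard against reflections
        Vt[-1] *= -1
        R = Vt.T @ U.T
    return R, cd - R @ cs

def icp(source, target, iters=50, tol=1e-7):
    """Align `source` points to `target` points; returns (R, t, residual)."""
    src = source.copy()
    prev_err = np.inf
    err = prev_err
    for _ in range(iters):
        # brute-force nearest neighbours (fine for small demo clouds)
        d2 = ((src[:, None, :] - target[None, :, :]) ** 2).sum(axis=-1)
        matches = target[d2.argmin(axis=1)]
        R, t = best_fit_transform(src, matches)
        src = src @ R.T + t
        err = np.sqrt(d2.min(axis=1)).mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    # total rigid transform from the original cloud to its aligned position
    R_total, t_total = best_fit_transform(source, src)
    return R_total, t_total, err
```

Production surface-registration tools add robust outlier rejection, surface sampling, and a coarse initial alignment; this brute-force nearest-neighbour version is only meant to convey the principle behind fusing the two datasets into one model.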
Surgical planning was performed in CAD software (3-Matic Medical 19, Materialise, Leuven, Belgium). A trapezoidal flap and an 8 × 6 mm cortical window were positioned directly over the crown; for intra-operative clarity, the flap outline was coloured green, the inferior alveolar canal yellow, and the tooth crown white. The composite scene of the surgical planning was exported to a Unity-based application (Unity 2022.3.39f1, Unity Software Inc., San Francisco, CA, USA), extended with Vuforia 10.27.3 (PTC Inc., Boston, MA, USA), which was then compiled for the Magic Leap 2 head-mounted display (Magic Leap, Inc., Plantation, FL, USA).

2.5.2. Surgical Execution

The operating surgeon completed a short rehearsal on a 3D-printed model using the same application, focusing on overlay lock acquisition, head-movement compensation, and bur angulation relative to the projected cutting plane.
On the day of surgery, the splint was reseated on the patient. Using the Vuforia Image Target function, the Magic Leap cameras detected the QR code anchored to the splint, locked the surgical-planning holograms to the dentition, and displayed the colour-coded anatomy stereoscopically in the surgeon's field of view (Figure 2).

3. Results

3.1. Literature Synthesis

Nine papers published from 2015 to 2025 met the refined eligibility criteria and together document the technical accuracy, operative feasibility, and educational value of marker-less augmented-reality guidance for difficult tooth surgery. Across the nine included studies, clinical reports encompassed twelve patients, cadaveric work involved two donated human bodies, early human tests included two volunteers, and multiple phantom/bench models were used in proof-of-concept workflows (Table 1).
In the earliest volunteer test, Suenaga et al. (2015) [28] achieved a mean target-registration error (TRE) of 0.91 ± 0.18 mm while superimposing an impacted mandibular third molar and the inferior alveolar canal, with image updates at 3–5 frames s⁻¹, confirming that natural dental surfaces can replace physical markers. Wang et al. (2017) [29] refined the concept on 3D-printed jaws and a live subject using a single 4K camera, reporting TRE values between 0.67 and 1.05 mm (worst case 1.38 mm) at comparable frame rates, thus showing sub-millimetre fidelity under video see-through conditions. A mixed phantom–animal–human proof of concept by Pham-Dang et al. (2021) [30] visually confirmed stable overlay (<1 mm deviation on bench tests) and enabled a precisely located bone window for cyst removal in one patient, all without fiducial markers.
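For readers unfamiliar with the metric, target-registration error (TRE) is simply the Euclidean distance between corresponding planned (virtual) and achieved (physically tracked) target positions, usually summarised as mean ± SD in millimetres. A minimal Python sketch, with purely illustrative coordinates that are not taken from any cited study:

```python
import numpy as np

def target_registration_error(planned, achieved):
    """Mean and sample SD of Euclidean distances between paired
    planned (virtual) and achieved (tracked) target positions."""
    planned = np.asarray(planned, dtype=float)
    achieved = np.asarray(achieved, dtype=float)
    d = np.linalg.norm(planned - achieved, axis=1)
    return d.mean(), d.std(ddof=1)

# Five hypothetical targets on a phantom, coordinates in mm (illustrative only)
planned  = [[0, 0, 0], [10, 0, 0], [0, 10, 0], [10, 10, 0], [5, 5, 5]]
achieved = [[0.4, 0.3, 0.2], [10.5, 0.2, 0.1], [0.1, 10.6, 0.3],
            [10.2, 10.1, 0.4], [5.3, 5.2, 5.1]]
mean_tre, sd_tre = target_registration_error(planned, achieved)
print(f"TRE = {mean_tre:.2f} ± {sd_tre:.2f} mm")  # sub-millimetre in this example
```

Note that TRE is measured at the clinical targets of interest (crown, canal), not at the registration landmarks themselves, which is why it is the accuracy figure the included studies report.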
Clinical evidence then broadened: Suenaga et al. (2023) [31] removed two deeply impacted supernumerary maxillary incisors in paediatric patients; AR localisation limited the cortical window to the crown outline; and both extractions healed uneventfully with no root or follicle injury. Macrì et al. (2023) [32] exposed a palatally impacted canine using a monitor-based system whose real-time tracking inlier ratio remained at 0.39–0.48, allowing a targeted osteotomy that avoided postoperative morbidity. Suenaga et al. (2024) [33] employed the same marker-less platform to delineate a conservative posterior maxillary window over a dentigerous cyst harbouring an ectopic wisdom tooth; enucleation and extraction were completed intact, with preservation of the sinus walls and adjacent roots. Additionally, Koyama et al. (2023) [34] reported three paediatric mesiodens extractions performed under general anaesthesia with a HoloLens-based mixed-reality system; two cases were approached palatally and one labially, and the surgeons shared the same holographic model intra-operatively, underscoring MR’s value for team situational awareness and conservative bone removal.
Rieder et al. (2024) [35] presented the first translational series on mandibular third molars: six fully impacted teeth (two donated human bodies and four clinical) were removed after a mean AR set-up time of 166 ± 44 s and an operative time of 21 ± 6 min. Koyama et al. (2023) [34] reported a mean operative time of 32 min for the three mesiodens cases.
In a study by Rieder et al. (2024) [35], surgeons rated usability “good” with a mean System Usability Scale score of 79 ± 9 and reported no AR-related complications. Suenaga et al. (2023) [31] reported uneventful healing with no root or follicle injury, and Koyama et al. (2023) [34] reported no intra- or postoperative complications. Suenaga et al. (2024) [33] reported preservation of the sinus walls and adjacent roots.
Finally, Fudalej et al. (2024) [36] evaluated a HoloLens2 planner for impacted-canine traction: among 38 clinicians, the median Likert response for six of ten usability items reached the maximum (“totally agree”), and overall, >90% felt mixed reality improved three-dimensional understanding of tooth position and force vectors, indicating strong educational promise.
Taken together, these studies—despite modest sample sizes and heterogeneous hardware—consistently showed sub-millimetre overlay accuracy, set-up times under three minutes, operative success without AR-induced complications, and high user acceptance, collectively supporting augmented reality as a precise and clinically valuable adjunct for the visualisation and management of impacted or ectopic teeth.
Table 1. Overview of the studies included in the review.
Title | Authors | Year | Study Type | Extraction Context | AR Modality | Key Findings
Vision-based marker-less registration using stereo vision and an augmented reality surgical navigation system: a pilot study | Suenaga et al. [28] | 2015 | Pilot volunteer | Impacted third molar (visualised) | Marker-less stereo-vision AR | <1 mm registration error; real-time overlay of canal + roots
Video see-through augmented reality for oral and maxillofacial surgery | Wang et al. [29] | 2017 | Proof of concept (phantom + volunteer) | Impacted wisdom tooth and mandibular canal visualised (no extraction performed) | Video see-through AR | ~1 mm overlay error; real-time workflow
A proof-of-concept augmented reality system in oral and maxillofacial surgery | Pham-Dang et al. [30] | 2021 | Proof of concept (phantom, pig jaw, 1 patient) | Bone window and removal of a maxillary cyst/impacted tooth root; visualises nerve and foramina | Marker-less screen-based AR using dental-cusp landmarks | Accurate alignment; successful cyst removal
Computer-Assisted Pre-operative Simulation & AR for Extraction of Impacted Supernumerary Teeth: A Clinical Case Report of Two Cases | Suenaga et al. [31] | 2023 | Case report (2 patients) | Deeply impacted supernumerary maxillary incisors | Marker-less video see-through AR (CT + intra-oral-scan fused) | Precise localisation; atraumatic removals; no complications; demonstrates clinical feasibility
Mixed reality for extraction of maxillary mesiodens | Koyama et al. [34] | 2023 | Case series (3 paediatric patients) | Mesiodens (supernumerary maxillary incisors) extractions | Microsoft HoloLens-based mixed reality (CarnaLife Holo; marker-less volume rendering) | Boys aged 7–11; mean operative time 32 min; palatal approach in 2 cases, labial in 1; minimal bleeding; no intra- or postoperative complications; shared holograms among surgeons
AR-Assisted Surgical Exposure of an Impacted Tooth: a pilot study | Macrì et al. [32] | 2023 | Pilot single case (1 patient) | Impacted maxillary canine (exposure) | Marker-less, video-based AR (VisLab; single camera, monitor view) | Targeted flap and osteotomy; uneventful healing
Mixed reality-based technology to visualise impacted teeth: proof of concept | Fudalej et al. [36] | 2024 | Proof of concept (demo + survey) | Impacted-canine planning | HoloLens 2 mixed reality | >90% of clinicians found MR beneficial
AR-Guided Extraction of Fully Impacted Lower Third Molars | Rieder et al. [35] | 2024 | Pilot (2 cadaver + 4 patients) | Fully impacted lower third molars | HoloLens 2 CBCT overlay | Set-up 166 s; surgery 21 min; SUS ≈ 79
Marker-less AR-assisted surgery for resection of a dentigerous cyst in the maxillary sinus | Suenaga et al. [33] | 2024 | Case report | Dentigerous cyst + ectopic wisdom tooth | Marker-less HUD AR | Conservative window; complete enucleation

3.2. Case Report

A 25-year-old woman presented with a mesio-angular, fully impacted mandibular right third molar (tooth 48, Pell–Gregory IIB). Pre-operative CBCT showed the proximity of the inferior alveolar canal to the root apex (Figure 3).
After passive seating of the QR-referenced splint, the Magic Leap 2 detected the QR fiducial marker. Local anaesthesia comprised 1.8 mL of 3% mepivacaine for inferior alveolar and lingual blocks plus 1.8 mL of 4% articaine with 1:100,000 epinephrine delivered buccally and lingually to reinforce anaesthesia and promote haemostasis.
The surgeon viewed a stable holographic overlay—flap outline (green), tooth crown (white), and inferior alveolar canal (yellow) (Figure 4). The mucoperiosteal flap was incised along the projected incision line: a vertical releasing incision mesial to the mandibular second molar, an intrasulcular sweep around the second molar, and a distal relieving cut carried onto the anterior border of the ascending ramus. A full-thickness envelope was reflected, exposing the planned cortical window.
Using tungsten-carbide round and fissure burs under copious saline, an osteotomy was performed to expose the crown. Then, the odontotomy was performed following the holographically planned cutting plane. The crown fragment was removed first, after which the roots were luxated and delivered with straight elevators, all while the surgeon monitored the yellow canal trajectory in the headset. Total operative time—from initial incision to final suture—was nineteen minutes, six of which were devoted to rotary bone removal. Haemostasis was routine, and the cortical roof of the canal remained intact. Single sutures were applied (PGA 4/0) (Figure 5).
One week later, the patient reported a VAS pain score of 2/10, displayed normal lower-lip sensation, and showed uneventful soft-tissue healing.

4. Discussion

4.1. Strengths of AR-Guided Extraction

The present synthesis and clinical case underscore augmented reality (AR) as a credible adjunct for difficult dental extractions, with every published pilot showing sub-millimetre overlay accuracy when natural tooth surfaces are used for registration [29,34,37]. Across the available reports, sub-millimetre accuracy and short set-up times recur as consistent themes, indicating that AR can be integrated without materially prolonging procedures.
In the present case of mandibular third-molar extraction, the headset locked to the splint and remained stable throughout a 19 min procedure, permitting a conservative osteotomy and root delivery without neurovascular compromise. These timings mirror the 2 to 3 min set-up and similar operative durations reported by Rieder et al. [35], suggesting that AR integration does not prolong surgery once the workflow is rehearsed.
Compared with static guides or conventional dynamic navigation [38,39], AR offers two practical gains. First, three-dimensional radiographic data are projected directly into the operative field, eliminating the need to glance at a remote monitor and thereby preserving hand–eye coordination [40]. Second, recent marker-less algorithms avoid bulky physical trackers and halve the registration steps seen in earlier optical systems [29], although they may be more limited in terms of registration accuracy.
Together, these features deliver what has been described as real-time “X-ray vision”, helping surgeons tailor osteotomies to the true extent of hidden crowns and roots while continuously monitoring the inferior alveolar nerve trajectory. The present case exemplifies these themes by showing AR-constrained bone removal around the crown outline while maintaining a visible buffer from the canal, thereby illustrating the review’s central practical advantages in a high-risk indication.

4.2. Clinical Relevance

Clinical relevance is most evident in mandibular third-molar surgery, where inappropriate bone removal or nerve trauma remains a persistent problem. Conventional navigation has already shown that real-time tracking can reduce bony windows and postoperative morbidity [41]; AR builds on this by presenting guidance cues superimposed on the patient, not on a separate screen. In the present case, the yellow hologram of the nerve canal imposed a conscious buffer during bur work, and the white crown outline curtailed distal bone removal—qualitative advantages echoed in earlier single-patient reports of supernumerary incisor removal, cyst enucleation, and canine exposure, all completed uneventfully under AR guidance [37]. Although definitive evidence of superior patient-reported outcomes is not yet available, the consistent trend toward smaller osteotomies and shorter luxation paths is encouraging.

4.3. Educational Potential

The educational potential of AR is equally compelling. Fudalej et al. showed that over 90% of surveyed clinicians felt that a HoloLens application improved their spatial appreciation of impacted-canine position and traction vectors [36]. Holographic overlays allow trainees to rehearse flap design or instrument angulation on mannequins and cadavers, bridging the gap between two-dimensional textbooks and live surgery. Early work in orthognathic and oncologic planning suggests that such immersive rehearsal can accelerate psychomotor learning and facilitate remote telementoring, where an expert observes the same augmented view in real time.

4.4. Limitations

Nonetheless, AR-assisted oral surgery is still in its infancy [42,43,44,45,46,47,48,49,50,51,52,53]. Across the nine included studies, the clinical corpus comprises twelve patients, supplemented by work on two donated human bodies and early tests on two volunteers, alongside multiple phantom/bench models. Fewer than twenty extraction-related patients have been documented, and no randomised trial has yet benchmarked AR against best conventional practice [54]. Several reports are of a single patient or are very small series with limited or absent long-term follow-up, and heterogeneous devices/registration methods complicate pooled estimates. Most reports are proof-of-concept or in vitro studies, employing disparate hardware, tracking methods, and accuracy metrics, which complicates quantitative comparison. Pre-operative preparation remains labour-intensive: high-resolution CBCT, intra-oral scanning, segmentation, and registration demand digital expertise and additional chair time, although rapid AI-based segmentation promises to streamline this workflow in the near future. Technical constraints persist as well; limited headset field of view, device weight, and occasional tracking loss during irrigation or wide mouth opening oblige the surgeon to maintain situational awareness and be ready to revert to conventional cues when necessary [55,56,57,58].
Economic and regulatory considerations must also be addressed before widespread adoption. Head-mounted displays cost several thousand euros, and current reimbursement pathways rarely recognise digital navigation for tooth extraction. Additionally, the incremental radiation from a prerequisite CBCT scan may not be justified in routine low-risk cases; AR is therefore best reserved for high-risk scenarios—roots intimately related to the canal, ectopic teeth in the sinus or ramus, or cases combined with cyst enucleation—where its precision can offset the added cost and complexity. From a patient-outcome standpoint, these costs and device constraints will be justified only if prospective studies confirm reductions in nerve injury, procedure time, and postoperative morbidity relative to standard care.

4.5. Future Directions

In summary, a small body of case reports and pilot series suggests that AR achieves sub-millimetre accuracy, offers intuitive visualisation, and is generally well accepted by surgeons when managing impacted or ectopic teeth. Whether these advantages ultimately translate to fewer nerve injuries, faster recovery, or meaningful cost savings remains to be proven. Multicentre clinical trials with standardised outcome sets, coupled with cost–benefit analyses, are logical next steps. Until such data emerge, AR should be regarded as a powerful adjunct for carefully selected high-risk extractions and an invaluable training tool rather than a universal replacement for surgical expertise. Continued software automation, lighter headsets, and clearer reimbursement guidance are likely to accelerate its transition from experimental novelty to routine clinical practice.

5. Conclusions

Augmented-reality guidance appears to be a promising adjunct for high-risk extractions. The narrative review and the illustrative third-molar case suggest feasibility in clinical use, but the limited evidence base counsels caution. Before routine adoption, robust multicentre—ideally randomised—studies with patient-centred outcomes are needed; in the meantime, AR may be best regarded as a precision aid for selected complex cases and a useful educational tool, with wider uptake contingent on streamlined planning workflows and lighter head-mounted displays.

Author Contributions

Conceptualisation, G.P. and E.M.; methodology, S.T.; software, F.F., L.C. and S.S.; validation, G.P., P.F. and C.B.; formal analysis, S.T.; investigation, G.P. and E.M.; resources, P.F. and E.M.; data curation, S.T. and M.C.; writing—original draft preparation, S.T.; writing—review and editing, S.S. and E.V.; visualisation, F.F. and S.S.; supervision, C.B. and E.V.; project administration, G.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Ethical approval was not required for this single anonymised case report, in line with national and local regulations.

Informed Consent Statement

Written informed consent was obtained from the patient to publish this paper.

Data Availability Statement

No new data were created or analysed in this study. Data sharing is not applicable to this article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
AI	Artificial Intelligence
AR	Augmented Reality
CAD	Computer-Aided Design
CBCT	Cone-Beam Computed Tomography
CT	Computed Tomography
DIBINEM	Department of Biomedical and Neuromotor Sciences
DICOM	Digital Imaging and Communications in Medicine
HUD	Head-Up Display
PRISMA	Preferred Reporting Items for Systematic Reviews and Meta-Analyses
QR	Quick Response
STL	Stereolithography
SUS	System Usability Scale
TRE	Target-Registration Error
VAS	Visual Analogue Scale
VR	Virtual Reality

Appendix A

Appendix A.1. PubMed—Query (Verbatim)

(“augmented reality”[tiab] OR “mixed reality”[tiab])
AND
(“tooth extraction”[tiab] OR “dental extraction”[tiab] OR extraction[tiab] OR removal[tiab]
OR “impacted tooth”[tiab] OR “impacted teeth”[tiab] OR “impacted canine”[tiab]
OR “third molar”[tiab] OR “wisdom tooth”[tiab] OR “wisdom teeth”[tiab]
OR “supernumerary tooth”[tiab] OR “supernumerary teeth”[tiab]
OR dentigerous[tiab] OR cyst[tiab]
OR “oral surgery”[tiab] OR “maxillofacial surgery”[tiab] OR “oral and maxillofacial surgery”[tiab])

Appendix A.2. Cochrane Library

#1 (Technology): “augmented reality”:ti,ab,kw OR “mixed reality”:ti,ab,kw
#2 (Indication): “tooth extraction”:ti,ab,kw OR “dental extraction”:ti,ab,kw OR extraction:ti,ab,kw OR removal:ti,ab,kw
OR “impacted tooth”:ti,ab,kw OR “impacted teeth”:ti,ab,kw OR “impacted canine”:ti,ab,kw
OR “third molar”:ti,ab,kw OR “wisdom tooth”:ti,ab,kw OR “wisdom teeth”:ti,ab,kw
OR “supernumerary tooth”:ti,ab,kw OR “supernumerary teeth”:ti,ab,kw
OR dentigerous:ti,ab,kw OR cyst:ti,ab,kw
OR “oral surgery”:ti,ab,kw OR “maxillofacial surgery”:ti,ab,kw OR “oral and maxillofacial surgery”:ti,ab,kw
Combination: #1 AND #2

Figure 1. This diagram illustrates the systematic process of identifying, screening, and selecting studies for inclusion in the review.
Figure 2. (A) Photograph of the dental splint with the QR-based marker used for patient tracking and virtual-to-real registration during AR view wearing Magic Leap 2 glasses. (B) Intra-operative view through the Magic Leap 2 headset with colour-coded holograms: green—incision outline; white—crown; and yellow—inferior alveolar canal.
Figure 3. Pre-operative views of the impacted mandibular right third molar. (A) Panoramic radiograph demonstrating the mesio-angular impaction of tooth 48 and its intimate relationship to the inferior alveolar canal. (B) Intra-oral photograph showing the fully mucosa-covered inclusion site of tooth 48, with no crown exposure. (C) Three-dimensional reconstruction of the mandible showing the proximity of the dental roots to the inferior alveolar nerve (yellow) and the contralateral nerve (pink).
Figure 4. (A) Three-dimensional reconstruction of the mandible highlighting the planned mucoperiosteal flap (green) and the inferior alveolar nerve (yellow). (B) AR visualisation of the planned osteotomy (light blue) overlaid on the inferior alveolar nerve (yellow). (C) AR visualisation of the planned osteotomy (green).
Figure 5. Augmented-reality-assisted surgical sequence. (A) Flap elevation following the projected green trajectory. (B) Clinical aspect of the trapezoidal envelope incision. (C) Creation of the cortical window observed under AR guidance, preventing encroachment toward the canal roof and reducing the osteotomy footprint. (D) Clinical overview of the osteotomy. (E) Odontotomy under AR guidance to separate the crown and roots. The cutting line follows the planned plane, shortening the luxation path and limiting distal bone removal. (F) Clinical view of the odontotomy. (G) Delivery of the tooth crown. (H) Delivery of the root fragments with a straight elevator. (I) Placement of the final 4–0 polyglactin suture. (J) Extracted tooth 48 showing crown and root segments.
Figure 5. Augmented-reality-assisted surgical sequence. (A) Flap elevation following the projected green trajectory. (B) Clinical aspect of the trapezoidal envelope incision. (C) Creation of the cortical window observed under AR guidance, preventing encroachment toward the canal roof and reducing the osteotomy footprint. (D) Clinical overview of the osteotomy. (E) Odontotomy under AR guidance to separate the crown and roots. The cutting line follows the planned plane, shortening the luxation path and limiting distal bone removal. (F) Clinical view of the odontotomy. (G) Delivery of the tooth crown. (H) Delivery of the root fragments with a straight elevator. (I) Placement of the final 4–0 polyglactin suture. (J) Extracted tooth 48 showing crown and root segments.

Share and Cite

MDPI and ACS Style

Pellegrino, G.; Barausse, C.; Tayeb, S.; Vignudelli, E.; Casaburi, M.; Stradiotti, S.; Ferretti, F.; Cercenelli, L.; Marcelli, E.; Felice, P. Augmented Reality in Dental Extractions: Narrative Review and an AR-Guided Impacted Mandibular Third-Molar Case. Appl. Sci. 2025, 15, 9723. https://doi.org/10.3390/app15179723

