Review

Integration and Innovation in Digital Implantology–Part II: Emerging Technologies and Converging Workflows: A Narrative Review

by Tommaso Lombardi 1 and Alexandre Perez 2,*
1 Unit of Oral Medicine and Oral Maxillofacial Pathology, Division of Oral and Maxillofacial Surgery, Department of Surgery, Faculty of Medicine, University Hospitals of Geneva, University of Geneva, 1205 Geneva, Switzerland
2 Unit of Oral Surgery and Implantology, Division of Oral and Maxillofacial Surgery, Department of Surgery, Faculty of Medicine, University Hospitals of Geneva, University of Geneva, 1205 Geneva, Switzerland
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(23), 12789; https://doi.org/10.3390/app152312789
Submission received: 29 September 2025 / Revised: 19 November 2025 / Accepted: 21 November 2025 / Published: 3 December 2025

Abstract

Emerging artificial intelligence (AI) and robotic surgical technologies have the potential to influence digital implant dentistry substantially. As a narrative review, and building on the foundations outlined in Part I, which described current digital tools and workflows alongside their persistent interface-related limitations, this second part examines how AI and robotics may overcome these barriers. This synthesis is based on peer-reviewed literature published between 2020 and 2025, identified through searches in PubMed, Scopus, and Web of Science. Current evidence suggests that AI-based approaches, including rule-based systems, traditional machine learning, and deep learning, may achieve expert-level performance in diagnostic imaging, multimodal data registration, virtual patient model generation, implant planning, prosthetic design, and digital smile design. These methods offer substantial improvements in efficiency, reproducibility, and accuracy while reducing reliance on manual data handling across software, datasets, and workflow interfaces. In parallel, robotic-assisted implant surgery has advanced from surgeon-guided systems to semi-autonomous and fully autonomous platforms, with the potential to provide enhanced surgical precision and reduce operator dependency compared with conventional static or dynamic navigation. Several of these technologies have already reached early stages of clinical deployment, although important challenges remain regarding interoperability, standardization, validation, and the continuing need for human oversight. Together, these innovations may enable the gradual convergence of digital technologies into unified, real-time-assisted, end-to-end implant prosthodontic workflows with increasing levels of automation, although full automation remains a longer-term prospect.
By synthesizing current evidence and proof-of-concept applications, this review aims to provide clinicians with a comprehensive overview of the AI and robotics toolkit relevant to implant dentistry and to outline both the opportunities and remaining limitations of these disruptive technologies as the field progresses towards seamless, fully integrated treatment pathways.

1. Introduction

Digital tools and workflows have greatly enhanced the possibilities in modern implant dentistry [1]. Contemporary digital technologies enable prosthetically driven planning, guided and minimally invasive surgery, and streamlined CAD/CAM-based prosthetic fabrication [2,3,4,5,6]. These capabilities may yield measurable improvements in procedural accuracy, clinical efficiency, patient experience, and interdisciplinary coordination [7,8,9,10,11,12]. With the digitalization of the entire implant prosthodontic workflow, advanced treatment concepts have become available across all indications and clinical stages of implant treatment, including diagnosis, planning, surgical execution, and prosthetic delivery [1,13,14,15,16].
While potentially improving many aspects of treatment delivery, the adoption of digital workflows has also introduced increasing complexity [17,18,19,20]. A multitude of platforms, techniques, and multimodal datasets, ranging from cone-beam computed tomography (CBCT) and intraoral scans (IOS) to facial imaging and virtual articulators (VAs), must be synchronized and managed throughout the clinical workflow [1,20,21,22,23]. Despite the high level of technological maturity of individual tools and methods, the clinical reality remains constrained by persistent interfaces between software platforms, data formats, and procedural stages [19,20,23,24]. These interfaces require manual data transfers, conversions, and verification steps, demanding considerable training and expertise while introducing potential sources for errors, data loss, artifact generation, and cumulative inaccuracies [20,24].
Part I of this review series identified three main barriers towards the realization of integrated seamless digital implant end-to-end workflows:
(1) Dataset interfaces, which arise from the need to use and merge multimodal datasets. These interfaces limit, for example, the efficiency and accuracy of the virtual patient model setup, requiring robust multimodal registration (CBCT ↔ IOS ↔ facial scans) and reliable segmentation, while potentially being compromised by poor image quality (e.g., metallic artifacts), algorithmic choices, or landmark scarcity. Errors from such data-treatment steps may propagate to surgical and prosthetic stages and measurably contribute to overall implant and restorative errors [20,21,25,26,27,28,29,30,31,32].
(2) Restricted software interoperability: closed versus open software ecosystems and heterogeneous feature sets across individual software systems force cross-vendor file handling along the different steps of the digital implant prosthodontic workflow, increasing the number of conversion steps and the potential for errors and conversion artifacts. “Hybrid” approaches using standardized Application Programming Interfaces (APIs) have begun to reduce, but not eliminate, these barriers [33,34,35,36].
(3) Workflow synchronization: Advanced immediate procedures, involving immediate provisionalization with chair-side CAD/CAM, require parallel execution and alignment between surgical and prosthetic workflows to ensure prosthetic delivery with adequate passive fit; feedback loops and intra-treatment scans remain essential to update planning models but add complexity and potential error sources [14,25,27,37].
Collectively, these structural constraints in modern digital implant-prosthodontic workflows may affect reproducibility, scalability, and real-time responsiveness across indications and procedural outcomes [4,8,12,38,39,40,41].
In recent years, artificial intelligence (AI) and robotic technologies have emerged as promising tools with the potential to overcome the interface-related limitations of fully digital workflows [24,42,43,44,45,46]. Today, a wide array of proof-of-concept studies has demonstrated the feasibility of AI models across nearly every step of the implant-prosthodontic workflow. These include automated diagnostic evaluation, virtual patient model building, implant planning, prosthetic computer-aided design (CAD), digital smile design, and even intraoperative navigation and execution [24,42,43,44,45,46]. Robotic surgery, in turn, holds considerable potential to establish itself as a viable alternative to traditional guided surgery, enabling real-time adaptation with comparable or even enhanced surgical precision [47]. While these technologies hold considerable promise to improve treatment precision, efficiency, and accuracy, their introduction into clinical practice has also brought an influx of new tools, terms, and conceptual frameworks, making it increasingly difficult for clinicians to contextualize and compare their value and technological readiness level [46].
Several recent reviews have comprehensively examined the potential role of artificial intelligence, machine learning, and robotics in dental medicine and prosthodontics, highlighting their capacity to improve diagnostic precision, workflow efficiency, and treatment predictability [24,42,43,44,45,46,48,49,50,51,52,53]. However, the specific question of how these rapidly evolving AI- and robotics-based solutions may overcome the persistent interface-related limitations of current digital implant workflows deserves more detailed examination. Building upon the foundations established in Part I of this narrative review, which summarized the contemporary tools, methodologies, and workflow stages of current digital implant workflows with a specific focus on the limitations in achieving seamless end-to-end integration, this second part aims to explore how AI and robotics may address those very limitations toward realizing fully integrated end-to-end digital workflows.
Part II of this narrative review on integrated workflows in digital implant dentistry aimed, first, to provide clinicians with an accessible overview of the current AI toolkit relevant to implant dentistry, clarifying distinctions among different AI systems and highlighting their data requirements, strengths, and limitations. Second, this review sought to synthesize the growing body of literature on AI-driven techniques across the different stages of digital implant prosthodontic workflows, while summarizing how they may help reduce or eliminate the workflow interfaces and limitations identified in Part I. Finally, this review aimed to outline the integration of robotics into dental implantology, with particular attention to their capacity to translate virtual treatment plans into precise surgical procedures, and to situate these systems within the broader landscape of static and dynamic computer-assisted surgery.
By focusing on the recently reported capabilities of AI and robotics in implant dentistry, this narrative review aimed to outline their potential while providing a forward-looking perspective on how today’s fragmented digital workflows may evolve into unified, partially or even fully automated, and clinically scalable treatment pathways.

2. Methods

This review was conducted as a narrative synthesis of the contemporary literature on AI, machine learning (ML), and robotics in implant dentistry. The aim was to provide an integrative and descriptive overview of technological capabilities, clinical applications, and emerging trends relevant to digital implant workflows, rather than to perform a formal systematic review.

2.1. Literature Search Strategy

Peer-reviewed publications were identified through structured searches of PubMed, Scopus, and Web of Science and screened for relevance to the major research domains of implant dentistry. Searches focused on scientific manuscripts published between 2020 and October 2025, reflecting the rapid evolution of AI- and robotics-related developments during this period.
Searches used combinations of controlled vocabulary (MeSH) and free-text terms. An example search structure was:
(“digital implantology” OR “computer-assisted implantology” OR “artificial intelligence” OR “machine learning” OR “deep learning” OR “robotics”) AND (“dental implants” OR “implant dentistry” OR “dental implantology”).
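For reproducibility, such a boolean search structure can also be assembled programmatically. The sketch below builds the example query string above and formats it as parameters for NCBI's E-utilities esearch endpoint; the date window mirrors the 2020 to October 2025 search period. This is an illustrative helper written for this review, not the exact procedure used by the authors:

```python
# Illustrative sketch: assemble the example boolean query from the Methods
# section (term lists copied from the text above).
from urllib.parse import urlencode

TECH_TERMS = [
    "digital implantology", "computer-assisted implantology",
    "artificial intelligence", "machine learning", "deep learning", "robotics",
]
DOMAIN_TERMS = ["dental implants", "implant dentistry", "dental implantology"]

def build_query(tech_terms, domain_terms):
    """Combine two OR-groups with AND, quoting each phrase."""
    group = lambda terms: "(" + " OR ".join(f'"{t}"' for t in terms) + ")"
    return f"{group(tech_terms)} AND {group(domain_terms)}"

query = build_query(TECH_TERMS, DOMAIN_TERMS)

# Parameters for an NCBI E-utilities esearch request; the publication-date
# filter reflects the 2020 to October 2025 window described in the Methods.
params = urlencode({"db": "pubmed", "term": query,
                    "mindate": "2020/01/01", "maxdate": "2025/10/31",
                    "datetype": "pdat"})
```

Scopus and Web of Science use different query syntaxes, so the same term lists would need per-database formatting in practice.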

2.2. Inclusion and Exclusion Criteria

To maintain focus and ensure relevance to clinical implantology workflows, the following criteria were applied:
Inclusion criteria:
English-language, peer-reviewed publications; systematic reviews, narrative reviews, randomized controlled trials, prospective studies, and technical evaluations with direct relevance to clinical implantology and AI and robotics-related workflows; reports describing AI/ML models, robotics platforms, or automated workflow tools used in diagnosis, planning, surgery, or prosthodontics.
Device-specific regulatory status was verified using authoritative medical device databases, including the U.S. FDA Registration and Listing Database, the 510(k), De Novo, and PMA approval databases, and the MAUDE adverse event database. When available, European CE-mark documentation and publicly accessible records from EUDAMED were additionally consulted.
Exclusion criteria:
Case reports, opinion papers, conference abstracts, preprints, and non-peer-reviewed company technical documents; purely engineering or algorithm-development papers without a dental/clinical application; and studies focused exclusively on orthodontics, caries detection, dermatology, or generic medical image analysis.

2.3. Study Selection Process

Search results were screened for relevance by the authors. Titles and abstracts were evaluated for eligibility, followed by full-text assessment of potentially relevant articles. Reference lists of key reviews were hand-searched to identify additional publications. Duplicate records were removed manually.
Because the included studies vary widely in design, dataset size, validation rigor, and clinical maturity, performance metrics were interpreted with attention to these methodological factors. As this is a narrative review, no formal risk-of-bias grading tool was applied; however, higher-level evidence (systematic reviews, randomized trials, prospective studies) was prioritized, and limitations such as small sample sizes, single-center datasets, laboratory-only validation, and heterogeneity of imaging devices were explicitly considered when summarizing results.

2.4. Narrative Synthesis Structure

The review was organized thematically around the major components of the digital implantology workflow, covering the major domains of diagnostic and data-acquisition AI (CBCT segmentation, IOS processing, landmarking), AI-driven planning and decision support, robot-assisted implant surgery, and autonomous execution, in comparison with alternative guided implant placement methodologies.
Special attention was given to how these technologies reduce workflow fragmentation and support integrated, partly or fully automated clinical workflows.

3. Technologies That Disrupt and Deepen Integration

While interfaces between workflow entities persist, fully digital implant prosthodontic workflows are well-established, and their capabilities have been demonstrated [17,19,24,54]. Emerging technologies, particularly artificial intelligence (AI), including machine learning (ML) and deep learning (DL) artificial neural networks, as well as robotics, show considerable potential to meaningfully enhance these workflows [45,46,48,49,55]. Robotic-assisted implant placement and dynamic navigation systems enable real-time intraoperative adjustments, directly linking surgical execution to virtual planning environments [47]. In this context, “disruption” refers not only to advances in technical interoperability (such as improved multimodal data exchange, standardized annotation, and tighter coupling between diagnostic, planning, and execution modules) but also to deeper clinical workflow integration, where AI-supported decision-making and robot-assisted execution increasingly operate within unified, feedback-driven treatment pathways. The convergence of these technologies with digital implant workflows may contribute to evolving current treatment paradigms by significantly reducing, or even eliminating, many of the persistent interfaces that currently hinder efficiency and prevent the realization of truly integrated, end-to-end workflows [19,43].
While current AI systems and robotic platforms primarily exchange data through established file formats (e.g., DICOM, STL/PLY) and vendor-specific interfaces, fully standardized, real-time interoperability has not yet been achieved. However, AI-based multimodal data fusion, semantic segmentation, and automated plan generation are precisely the technologies most likely to overcome these interoperability barriers. By converting heterogeneous inputs into structured, robot-readable planning outputs, AI represents a potential enabler—rather than an obstacle—for future direct AI-to-robot integration within unified digital implant workflows.
Emerging commercial platforms, such as Exocad AI modules, 3Shape Automate, and DTX Studio, already provide a glimpse of what will be possible in the near future by enabling fully automated prosthetic design directly from digital impressions. While AI algorithms are being rapidly adopted in diagnostics, their integration into other parts of the implant-prosthodontic workflow, from treatment planning to surgical execution and prosthetic fabrication, remains largely limited to initial proof-of-concept studies. However, the development and extension of advanced, disruptive AI- and robotics-based techniques into digital implantology are advancing rapidly. They will be explored in detail in the following chapters of this review, with the aim of envisioning how these techniques may contribute to the realization of fully integrated, end-to-end implant treatment workflows [49].
Several of the commercially available systems discussed in this review (e.g., Diagnocat, 3Shape Automate, Exocad AI modules, DTX Studio, and relu), which rely primarily on deep learning architectures, have already obtained regulatory approval or market clearance for diagnostic imaging, automated design, or planning-related tasks. Their availability demonstrates that DL-based paradigms have reached a level of technical maturity and are now deployed in implantology-relevant workflows.

3.1. The Artificial Intelligence Toolkit–Types of AI Models and Their General Capabilities

Before discussing dental-specific applications, a brief overview of the main AI model families is provided to assist readers without formal training in computer science. Because many dental AI studies implicitly rely on these concepts, a basic understanding of the underlying architectures (e.g., rule-based systems, traditional machine learning, and deep learning) facilitates accurate interpretation of the evidence presented in implantology-specific research. This synopsis is intentionally concise and limited to concepts directly relevant to dental applications.
The term AI refers to a branch of computer science focused on developing machines and systems capable of performing tasks that typically require human intelligence, such as pattern recognition, reasoning, learning, problem-solving, perception, and natural language understanding, all of which are highly relevant to digital implant therapy [49,55].
Early computational systems relied primarily on manually defined decision rules, which provided the foundations for later data-driven approaches but were limited in adaptability beyond narrow, predefined tasks [56,57,58].
From the 1980s onward, technologies were developed that became increasingly capable of “learning” relationships between selected input features (e.g., medical parameters such as bone density or smoking status) and specific outcomes, such as complication risks [59,60]. Driven by increasing computational power, the availability of large datasets, and the emergence of practical algorithms and development frameworks, these advances culminated in the rise of deep learning (DL) in the 2010s—an approach capable of learning directly from complex data such as images, speech, and text at scale [61,62]. These architectures now underpin nearly all contemporary dental AI systems (Figure 1) [45,55].
In more detail, all AI systems share a common principle: they process input data, extract relevant features or patterns, and use them within an analytical model to generate an output or decision [45]. Depending on the degree of learning applied during feature extraction and model construction, AI models can be broadly categorized into explicit algorithmic models, shallow or traditional machine learning (TML), and deep learning (DL) models (Figure 1) [49,63].
Explicit algorithmic models, often implemented as rule-based logic or expert systems, represent the simplest and earliest forms of AI [57]. In modern terminology, these models are often considered outside of “true” AI as they do not learn from data. Instead, they rely on handcrafted rules and manually defined decision logic, fully specified by a human expert; for example, a rule-based CBCT analysis system may classify bone density according to predefined Hounsfield unit thresholds. These systems are highly interpretable but cannot adapt to inputs that fall outside their predefined rules [49,63]. Such approaches still appear in certain dental diagnostic pipelines but have largely been superseded by data-driven methods.
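To make the rule-based paradigm concrete, the logic of such a Hounsfield-unit classifier can be written in a few lines. The thresholds below approximate the commonly cited Misch D1 to D4 bone-density ranges and are used purely for illustration; a clinical system would require validated, scanner-calibrated cut-offs:

```python
def classify_bone_density(mean_hu: float) -> str:
    """Rule-based bone-density class from a mean Hounsfield-unit value.

    Thresholds approximate the Misch classification (illustrative only):
    D1 > 1250 HU, D2 850-1250 HU, D3 350-850 HU, D4 150-350 HU, D5 < 150 HU.
    """
    if mean_hu > 1250:
        return "D1"  # dense cortical bone
    elif mean_hu > 850:
        return "D2"
    elif mean_hu > 350:
        return "D3"
    elif mean_hu > 150:
        return "D4"
    return "D5"      # very low-density bone
```

Every decision boundary is explicitly specified by a human expert, which makes the system fully interpretable but, as noted above, unable to adapt when inputs fall outside its predefined rules (for instance, when metal artifacts shift the measured HU distribution).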
Shallow or traditional ML introduces data-driven modeling. Algorithms such as decision trees, support vector machines (SVMs), and logistic regression “learn” statistical relationships between engineered features and target outcomes [55,58,64,65]. However, these models still depend on manual feature selection and preprocessing (e.g., choosing variables such as bone density, implant length, and smoking status). Model training is supervised using datasets containing both input features and known outputs, allowing the algorithm to optimize its parameters for prediction or classification. In implant dentistry, such models have been used for structured tasks, such as risk-factor analysis and complication prediction.
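As an illustration of this shallow-ML paradigm, the scoring step of a logistic-regression complication-risk model can be sketched as follows. The features, weights, and intercept are invented for demonstration and carry no clinical validity; in practice, the parameters would be learned from a labeled training dataset:

```python
import math

# Hand-engineered binary features for a hypothetical complication-risk model.
# Weights and intercept are invented for illustration only; a real model
# would fit them to outcome-labeled patient data.
WEIGHTS = {"low_bone_density": 1.2, "short_implant": 0.8, "smoker": 1.5}
INTERCEPT = -3.0

def complication_risk(features: dict) -> float:
    """Logistic-regression scoring: sigmoid of a weighted feature sum."""
    z = INTERCEPT + sum(WEIGHTS[k] * float(v) for k, v in features.items())
    return 1.0 / (1.0 + math.exp(-z))

low = complication_risk({"low_bone_density": 0, "short_implant": 0, "smoker": 0})
high = complication_risk({"low_bone_density": 1, "short_implant": 1, "smoker": 1})
```

Note that the human expert still chooses the input features; the algorithm only learns (here, is given) the weights, which is precisely the limitation that deep learning removes.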
DL represents the most complex and powerful level of AI [55,63,66,67]. DL models employ trained, multi-layered artificial neural networks that are capable of performing both automatic feature extraction and end-to-end model construction, enabling the direct processing of raw, high-dimensional data, such as CBCT scans, 3D surface models, or unstructured textual records [67,68]. Through iterative optimization, these algorithms learn complex, hierarchical representations that capture mathematical descriptions, or “features,” of data from large, problem-specific datasets. This capacity to extract complex anatomical features directly from imaging data forms the basis of most AI-driven workflows in CBCT segmentation, pathology detection, virtual patient model construction, and automated planning [55].
Artificial neural network (ANN) models consist of interconnected artificial neurons (also called nodes or units). These nodes are arranged in layers, where each neuron performs a simple mathematical operation to transform input signals into outputs [45,69]. While TML models typically use only one or a few layers, DL models often comprise dozens of computational layers that progressively extract and combine higher-level features, enabling the detection of complex patterns in large, high-dimensional datasets. DL models, including convolutional neural networks (CNNs), recurrent neural networks (RNNs), transformers, and generative adversarial networks (GANs), automatically learn both features and predictive relationships directly from raw data (e.g., images, 3D scans, or textual records) [49,55,70]. Conversely, shallow ML models, such as decision trees, support vector machines (SVMs), k-nearest neighbors (kNN), and logistic regression, are better suited for structured, tabular datasets and smaller data samples but are limited when processing highly complex or multimodal data. Compared to TML, which is generally efficient and interpretable, DL models require larger datasets and greater computational resources. At the same time, their internal decision-making processes often remain less transparent, frequently described as operating like a “black box” [69,71]. Despite these limitations, DL architectures form the backbone of nearly all state-of-the-art dental imaging and planning systems discussed in subsequent sections.
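The layered computation described above can be illustrated with a toy forward pass: two fully connected layers in plain Python, with weights fixed arbitrarily for demonstration. A trained DL model would instead learn millions of such parameters from data:

```python
import math

def relu(values):
    """Nonlinear activation: pass positive signals, zero out negatives."""
    return [max(0.0, v) for v in values]

def dense(inputs, weights, biases):
    """One fully connected layer: each neuron computes a weighted sum + bias."""
    return [sum(w * x for w, x in zip(row, inputs)) + b
            for row, b in zip(weights, biases)]

def tiny_network(inputs):
    """Two-layer forward pass with arbitrary fixed weights (illustration only).

    Layer 1 extracts intermediate 'features'; layer 2 combines them into one
    logit squashed to (0, 1) by a sigmoid, as in a binary classifier head.
    """
    hidden = relu(dense(inputs,
                        weights=[[0.5, -0.2], [0.1, 0.9]],
                        biases=[0.0, -0.1]))
    (logit,) = dense(hidden, weights=[[1.0, -1.0]], biases=[0.2])
    return 1.0 / (1.0 + math.exp(-logit))

score = tiny_network([1.0, 2.0])
```

Stacking many such layers, and replacing the dense operations with convolutions for image data, yields the CNN architectures used in CBCT segmentation.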

3.2. AI-Powered Integration of Digital Workflows in Dental Implantology

AI-based methods offer significant potential for analyzing and interpreting large volumes of imaging and clinical data, making them highly relevant to modern dentistry [24,46,58,70,71,72]. Given its reliance on large, multimodal datasets, dental implantology has great potential to benefit from AI-based methods that enhance precision, consistency, and efficiency [46]. The application of AI technologies in medical science, and specifically in dentistry, became more accessible with breakthroughs in vision processing, when deep convolutional neural networks began to outperform traditional feature-based methods in large-scale image recognition (ImageNet), making automated AI-based image understanding both fast and reliable [61,66]. A second milestone in 2017 was the introduction of transformer architectures, which enabled more effective pattern recognition across complex, multimodal data [62]. Together, these advances laid the groundwork for the application of modern AI in dentistry, giving rise to a powerful toolkit spanning three main families: rule-based logic, TML, and DL-based systems [49].
Table 1 summarizes representative AI applications, their tasks, data, and performance metrics, and their approximate technological readiness level, distinguishing between commercially available tools and experimental prototypes.
With the advent of high-performance AI models, image-based diagnosis has become one of the most promising application areas in dental medicine, allowing for faster, more consistent, and often more accurate interpretation of radiographic and volumetric anatomical data [49,58]. For instance, AI-driven algorithms have demonstrated the capability to automate the diagnosis of dental and oral pathological conditions, including root resorption, root fractures, peri-implant and periodontal bone loss, and other conditions relevant to implant therapy, with comparable or improved diagnostic accuracy and significantly improved efficiency [45,67,104,105,106,107,108]. Despite its primary use for diagnostic purposes, recent advancements suggest a potential application of AI in all areas and processes throughout the entire implant prosthodontic workflow, from diagnostic model building to treatment planning, and up to delivery and maintenance (Figure 2) [19,45,46,48,49,109]. Furthermore, the implementation of surgical robotic systems and the integration of AI-based processes into prosthetic CAD processes will extend their use into chair-side operations, further reducing workflow interfaces and fostering integration [19,47,110,111].

3.2.1. AI in Patient Virtual Model Building, Dataset Registration, and Segmentation

The generation of a virtual patient model as a basis for digital prosthetic and implant planning often requires a sequence of time-consuming steps, comprising data collection, transfer, registration using existing landmarks or artificial fiducial markers, and segmentation to dissect and display the corresponding relevant anatomic structures necessary for planning [20,21,22,112,113]. A range of AI models and algorithms capable of automating these processes has been introduced, potentially increasing both accuracy and workflow efficiency while paving the way towards fully integrated digital workflows.
Ahlamari et al. recently conducted a systematic review of studies evaluating the accuracy of AI-based radiographic image segmentation for various anatomic structures, including teeth, individual jaws, the temporomandibular joint, and the mandibular inferior alveolar canal [83]. The authors identified 27 studies that employed DL models and 3 that utilized TML approaches, all of which reported high overlap between AI-generated and reference segmentations, indicating strong AI performance in delineating dental and maxillofacial structures. DL models consistently outperformed TML methods, with more advanced approaches incorporating preprocessing routines, hierarchical network architectures, and larger training datasets, resulting in improved segmentation accuracy and precision [69,76,114,115]. Collectively, these and other reports highlight the strong potential of AI-based segmentation techniques to automate image segmentation and facilitate anatomic model generation towards fully integrated implant prosthodontic workflows [73,116].
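The overlap between AI-generated and reference segmentations reported in such studies is typically quantified with the Dice similarity coefficient (DSC). A minimal sketch on binary voxel masks (real pipelines operate on full 3D volumes, e.g., with array libraries, but the metric is the same):

```python
def dice_coefficient(pred, truth):
    """Dice similarity coefficient between two binary masks (flattened lists).

    DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap, 0.0 none.
    """
    assert len(pred) == len(truth), "masks must share the same voxel grid"
    intersection = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 1.0 if total == 0 else 2.0 * intersection / total

# Toy 1D example: AI mask vs. reference mask over 8 voxels.
ai_mask  = [0, 1, 1, 1, 0, 0, 1, 0]
ref_mask = [0, 1, 1, 0, 0, 0, 1, 1]
score = dice_coefficient(ai_mask, ref_mask)
```

Because the DSC penalizes both over- and under-segmentation symmetrically, it is a convenient single summary for comparing DL and TML segmentation models against expert references.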
Verhelst et al., for example, demonstrated the capabilities of AI in CBCT image segmentation by applying a layered deep learning algorithm that enables the automatic generation of 3D surface models of the human mandible [68]. Compared to user-refined AI segmentations (RAI) and semi-automatic segmentation (SA, the current clinical standard), the AI model demonstrated near-instant processing with an average time of only 17 s versus 456.5 s for RAI and 1218.4 s for SA, an up to 71-fold reduction in processing time, while delivering comparable accuracy to both RAI and SA methods. The described AI algorithm has been commercialized as Relu Creator (Relu BV, Leuven, Belgium), which has recently obtained both U.S. and European regulatory approvals. The platform has been documented across multiple studies; for example, Elgarba et al. reported using its AI-based tools to perform integrated tasks, including CBCT segmentation and IOS-to-CBCT registration, for implant planning. In their study, AI-generated plans showed comparable qualitative outcomes to expert planning while requiring only half the time (198 ± 33 s vs. 435 ± 92 s) [73,74,75,76,77,78,79,80,81,82].
A second widely adopted commercial cloud-based AI platform for diagnostic image segmentation of CBCT, panoramic, and intraoral radiographs is Diagnocat (LLC Diagnocat, Miami, FL, USA, version 1.0). Initially described in methodological work on 3D CBCT pathology detection and coarse-to-fine volumetric segmentation of teeth and periapical lesions, the underlying deep learning pipeline has since been extended to multi-pathology CBCT diagnosis, implant-site analysis, and panoramic/intraoral detection of caries, restorations, missing teeth, periodontal bone loss, and endodontic outcomes [85,86,87,88,89,90,91]. The system combines 3D U-Net-based volumetric CNNs for CBCT analysis with 2D convolutional models for panoramic and intraoral radiographs. Specifically, a coarse-to-fine 3D CNN pipeline detects and refines anatomical structures in CBCT volumes, while 2D CNN classifiers and object-detection models (e.g., Faster R-CNN/YOLO) handle panoramic and intraoral images; their outputs are integrated via a multimodal fusion layer to produce consistent, structured diagnostic and implant-planning models. The software is CE-marked under the EU MDR, Health Canada-approved, and has obtained FDA clearance for its AI-powered CBCT visualization tool, making it a fully regulated medical device currently deployed in many countries.
Besides these commercial models, a range of other applications has been reported in the literature. Pankert et al. recently reported a two-step 3D U-Net-based segmentation pipeline capable of effectively removing streaking artifacts caused by metal restorations and implants, a common source of segmentation and registration artifacts, producing high-resolution, artifact-compensated 3D models directly from CT data [84]. Compared to manual and semi-automatic segmentation, the AI model achieved substantially faster processing (~31 s per scan) and higher accuracy. Taken together, these and other reports on implant planning highlight the significant advantages of AI-based tools in reducing workflow interfaces and markedly enhancing the efficiency and accuracy of treatment planning [73]. Likewise, the improved overall planning efficiency and accuracy of AI-driven segmentation algorithms compared to traditional human-intelligence-driven workflows may support their central role in future fully integrated digital workflows.
Notably, although the segmentation studies summarized above demonstrate consistently strong performance, most investigations relied on modest case numbers, heterogeneous imaging devices, or single-center datasets, often lacking external validation. Reported accuracy and efficiency gains should therefore be interpreted in the context of these methodological limitations.

3.2.2. Implant Planning

Bayrakdar et al. were among the first to explore AI for automated implant planning, employing the cloud-based Diagnocat platform to perform automated CBCT segmentation of anatomical landmarks and virtual implant placement [88]. The model, built on a modified 3D U-Net deep convolutional neural network (DCNN) architecture, achieved detection accuracies of 95.3% for missing teeth and 72.2% for mandibular canals, demonstrating the early potential of AI-driven planning tools in dental implantology. Al-Asali et al. recently presented a DL-based approach for automated dental implant planning using two consecutive U-Net models for 3D bone segmentation and the prediction of optimal implant positions from CBCT data [92]. The AI model achieved high accuracy and generated implant planning outputs almost instantly (within ~10 s). Similarly, Elgarba et al. described a cloud-based AI tool for fully automated implant planning in single mandibular molar and premolar sites and compared its outcomes with those of expert-driven planning [74,116]. The model employed multiple 3D U-Net networks for the segmentation of anatomical landmarks, AI-based registration of CBCT and IOS data, and automated implant positioning according to predefined prosthetic and anatomical guidelines. In a blinded assessment, 95% of AI-generated plans and 96% of expert plans required no major modifications, while AI-based planning was twice as fast as human planning and achieved 100% planning consistency. These findings demonstrate that AI-driven virtual implant planning can achieve expert-level accuracy while substantially increasing efficiency and reproducibility, highlighting its potential role in realizing fully integrated digital workflows.
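The AI-based registration of CBCT and IOS data mentioned above is, at its core, a rigid alignment problem. As a hedged illustration only (the cited tools' internal algorithms are proprietary and not described in code), the classic Kabsch/Procrustes solution aligns two point sets with known correspondences, such as AI-detected anatomical landmarks; all names and values below are illustrative:

```python
import numpy as np

def rigid_register(src: np.ndarray, dst: np.ndarray):
    """Best-fit rotation R and translation t mapping src -> dst (Kabsch)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)           # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))        # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Toy example: "IOS" landmarks are a rotated + translated copy of "CBCT" ones.
rng = np.random.default_rng(0)
cbct = rng.random((6, 3)) * 40.0                  # landmark coordinates in mm
theta = np.deg2rad(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
ios = cbct @ R_true.T + np.array([5.0, -2.0, 1.5])
R, t = rigid_register(cbct, ios)
rmse = np.sqrt(np.mean(np.sum((cbct @ R.T + t - ios) ** 2, axis=1)))
# rmse should be ~0 for noise-free correspondences
```

In practice, AI systems first detect or match the corresponding points (the hard part); the closed-form alignment above, or an iterative variant such as ICP, then recovers the transform.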
Beyond implant positional planning, machine learning has also been successfully employed for the diagnosis of dento-maxillofacial deformities and the surgical planning of orthodontic-orthognathic surgery [117]. Together, these reports indicate the considerable potential of DL models to automate entire implant planning workflows, from dataset registration and segmentation through to implant planning, thereby helping reduce certain interfaces in current digital workflows.
It should be noted, however, that many AI-driven planning frameworks remain at the proof-of-concept or early clinical validation stage, with potential limitations in sample sizes and testing performed in selective implant scenarios. As such, while planning time reductions and expert-level concordance are encouraging, these findings require confirmation in larger, multicenter clinical cohorts to establish generalizability and clinical robustness.

3.2.3. Prosthetic Design

Lerner et al. applied AI as a component of CAD software to automatically retrieve and align original abutment designs with intraoral scans and to digitally design subgingival margins as part of prosthodontic design workflows [93]. This approach streamlined the restoration process, eliminated the need for manual gingival margin tracing, and yielded excellent functional and aesthetic outcomes.
Cho et al. evaluated a deep learning method for designing implant-supported posterior crowns and compared its performance with traditional technician-driven CAD [94]. The DL-generated crowns required significantly less design time (~83 s) than both technician-optimized DL crowns (~322 s) and conventional CAD-derived crowns (~371 s), while maintaining comparable occlusal table area, cusp angle, cusp height, proximal contacts, and emergence profile, with only minor differences in occlusal contact intensity.
In addition to implant-borne restorative designs, several studies have reported AI- and DL-based methods for tooth-borne crown design. Recent investigations of DL-generated anterior crowns have demonstrated clinically acceptable morphology and functional outcomes, including comparable incisal path length, inclination, and occlusal relationships relative to conventional CAD controls [95,96]. Moreover, DL-based workflows have been reported to offer advantages in time efficiency and posterior crown quality, outperforming conventional automated CAD solutions in marginal adaptation, proximal contacts, and anatomical form [97]. However, in complex anterior scenarios, such as diastemas or atypical occlusal patterns, limited human refinement was considered beneficial to ensure optimal functional performance [97]. Models based on 3D deep convolutional generative adversarial networks have likewise been shown to yield crowns with functional occlusal and biomechanical outcomes comparable to those of conventional designs, and a systematic review by Shetty et al. concluded that AI-based algorithms for crown-shade matching hold promise [98,99]. Together, these studies demonstrate the considerable potential of DL-based methods to deliver clinically functional and aesthetically pleasing crown designs. While technician refinement may still be beneficial for enhancing geometric fidelity, these reports illustrate the potential of DL-based methods to remove interfaces in CAD-based workflows, ultimately supporting the realization of end-to-end integrated digital workflows.
Despite their promising performance, available studies on AI-generated restorations often rely on controlled experimental settings, relatively small datasets, or comparisons limited to a single software ecosystem. Evidence on long-term clinical performance and cross-platform interoperability remains limited, and minor technician-mediated refinements are still required in complex anterior or non-ideal occlusal situations. Consequently, while encouraging, the reported advantages should be interpreted with these limitations in mind.
Beyond prosthetic design, AI-based tools are increasingly being used in the aesthetic and functional planning phases of implant prosthodontics [118,119]. Contemporary digital smile design (DSD) systems no longer operate as isolated cosmetic add-ons, but are progressively converging with AI-driven facial landmarking, peri-oral segmentation, occlusal modeling, virtual patient construction, and prosthetically driven planning. As automated facial analysis, smile-type classification, and tooth morphology prediction begin to feed directly into restorative and implant planning decisions, DSD is increasingly being explored as a component of AI-assisted aesthetic–functional planning. This convergence strengthens the link between facial-driven design, occlusal harmony, and implant positioning and situates DSD naturally within unified, AI-enabled digital workflows.

3.2.4. Digital Smile Design

Recent studies and reports indicate significant potential for AI-based workflows in DSD. Mohsin et al. recently reported the use of a combination of CNNs and GANs for facial feature analysis, including lip contours, smile lines, and facial symmetry [100]. Combined with customized smile design, this approach was shown to yield significantly better aesthetic outcomes, superior patient satisfaction, and a 40% shorter design time compared with traditional methods. Complementing these findings, Ceylan et al. found that AI-generated smile designs yielded acceptable results, especially in symmetric cases, regardless of the patient's familiarity with the topic, and offered time-saving benefits [101]. Lee et al. recently presented a method for segmenting and classifying peri-oral tissues and smile types, addressing a key technical hurdle in developing fully automated, AI-driven DSD routines [102]. Commercial platforms such as IvoSmile and Smile Cloud, which claim to be AI-based, exemplify this uptake, offering rapid smile previews within minutes. At the same time, details of their specific AI methodologies remain largely undisclosed, and systematic comparative investigations of their benefits are still lacking [120]. Finally, a scoping review supported these reported outcomes while identifying only a limited number of studies on the subject to date [121].
Related to DSD, Ye et al. have demonstrated that deep learning and machine-learning-based cephalometric software, including MyOrthoX, Angelalign, and Digident, can automate landmark detection with accuracy comparable to that of experienced orthodontists while reducing analysis time by up to 50% [103]. This capability could significantly facilitate workflow integration by automatically providing relevant skeletal landmark information for prosthodontic analysis and treatment planning, while also enabling integration of DSD, implant planning, and digital wax-up datasets. Despite this steadily improving performance of AI-based models, Polizzi and Leonardi concluded that cephalometric landmarking still requires final human supervision [122].
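Automated cephalometric systems such as those evaluated by Ye et al. are conventionally benchmarked with the mean radial error (MRE) and the success detection rate (SDR) within a clinical tolerance, commonly 2 mm. A minimal sketch of these two metrics (illustrative only, not taken from the cited studies; the coordinates below are invented):

```python
import numpy as np

def landmark_errors(pred: np.ndarray, truth: np.ndarray, thresh_mm: float = 2.0):
    """Mean radial error (mm) and success detection rate at a clinical
    threshold, the usual yardsticks for automated cephalometric landmarking."""
    d = np.linalg.norm(pred - truth, axis=1)     # per-landmark Euclidean error
    return float(d.mean()), float((d <= thresh_mm).mean())

# Toy example: 4 landmarks, three falling inside the 2 mm acceptance window.
pred = np.array([[0.0, 0.0], [10.0, 10.0], [20.0, 5.0], [30.0, 8.0]])
truth = np.array([[0.5, 0.0], [10.0, 11.0], [20.0, 5.0], [33.0, 8.0]])
mre, sdr = landmark_errors(pred, truth)
# errors: 0.5, 1.0, 0.0, 3.0 mm  ->  MRE = 1.125 mm, SDR@2mm = 0.75
```

Reporting both values matters: a low MRE can coexist with clinically unacceptable outliers, which the SDR exposes.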
Looking forward, the integration of AI promises to streamline DSD by facilitating objective landmark detection and smile classification, resulting in optimized, personalized dental design with the benefit of instant, patient-driven visualization [23]. Larger, methodologically rigorous clinical trials, transparent algorithm reporting, and evaluation across diverse patient profiles and case complexities will further contribute to the seamless integration of DSD and the realization of fully integrated digital implant prosthodontic workflows.

3.2.5. Robotics and Smart Surgery

Besides artificial intelligence-based diagnostic and planning tools, surgical robots represent a third group of disruptive technologies that are increasingly being integrated into dental implantology to improve accuracy, safety, and procedural efficiency compared with freehand (FH), static computer-assisted implant surgery (S-CAIS), or dynamic computer-assisted implant surgery (D-CAIS) [12,47]. A series of studies has demonstrated that robot-assisted computer-aided implant surgery (R-CAIS) may achieve higher implant positional accuracy than S-CAIS or D-CAIS systems [110,123,124]. Across the clinical studies evaluating robotic-assisted implant surgery, positional accuracy was assessed by superimposing postoperative CBCT scans onto the preoperative virtual implant plan using voxel-based or surface registration. Deviation was consistently quantified along three validated metrics: global platform (coronal) deviation, global apex deviation, and angular deviation. This methodology aligns with the standard CAIS accuracy assessment framework and has been applied consistently across recent systematic reviews, including the most recent meta-analysis by Luo et al., which confirmed that robotic systems exhibit low coronal, apical, and angular deviations when measured against these established reference standards [125].
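The three deviation metrics described above (platform, apex, and angular deviation) follow directly from the planned and placed implant geometry once the postoperative CBCT has been registered to the plan. A minimal sketch, with hypothetical coordinates expressed in a shared reference frame:

```python
import numpy as np

def implant_deviations(plan_platform, plan_apex, post_platform, post_apex):
    """Coronal (platform), apical, and angular deviation between the planned
    and placed implant axes, in mm and degrees (standard CAIS metrics)."""
    plan_platform, plan_apex = np.asarray(plan_platform), np.asarray(plan_apex)
    post_platform, post_apex = np.asarray(post_platform), np.asarray(post_apex)
    coronal = np.linalg.norm(post_platform - plan_platform)  # 3D platform offset
    apical = np.linalg.norm(post_apex - plan_apex)           # 3D apex offset
    v_plan = plan_apex - plan_platform                       # planned implant axis
    v_post = post_apex - post_platform                       # placed implant axis
    cos_a = np.dot(v_plan, v_post) / (np.linalg.norm(v_plan) * np.linalg.norm(v_post))
    angular = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return coronal, apical, angular

# Hypothetical example: an 11 mm implant placed 0.5 mm off at the platform
# and tipped slightly toward the buccal.
c, a, ang = implant_deviations(
    plan_platform=[0.0, 0.0, 0.0], plan_apex=[0.0, 0.0, -11.0],
    post_platform=[0.5, 0.0, 0.0], post_apex=[1.0, 0.0, -11.0],
)
# c = 0.5 mm, a = 1.0 mm, angular ≈ 2.6 degrees
```

Because all three values depend on the registration of the postoperative scan to the plan, registration error propagates directly into the reported deviations, which is why the cited studies standardize on voxel-based or surface registration.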
Robotic systems typically consist of a robotic operating platform, a visual tracking system to register the real-time 3D orientation and position of the surgical field, and a central control unit that, together, replicate the roles of a dentist’s hands, eyes, and brain. While most current systems are pre-programmed, AI integration is emerging. Prototype models already use convolutional neural networks (CNNs) and image-based landmark detection to analyze CBCT data, predict optimal implant trajectories, and adjust intraoperatively based on real-time bone density and patient motion. For example, advanced versions of R-CAIS combine autonomous drilling with AI-driven haptic feedback and optical navigation, enabling highly precise angular and apical positioning [12,47].
Depending on their level of autonomy, dental implant robots can be classified as surgeon-guided, collaborative, or fully autonomous systems. Surgeon-guided systems (formerly termed passive robots) provide positional guidance but require the operator to control drilling and implant insertion [47]; examples include Yomi (Neocis, Miami, FL, USA) and DentRobot (Dcarer Medical Technology, China). Collaborative systems (formerly semi-active robots) combine autonomous site preparation with manual assistance for intraoral navigation; examples include Remebot (Baihui Weikang, China), Theta (Hangzhou Jianjia, China), and Langyue (Shecheng, Beijing, China). Fully autonomous systems, also called active robots, can independently perform both osteotomy preparation and implant placement according to a presurgical plan under the supervision of a surgeon; one example is YekeBot (YekeBot Technology, China), which automatically enters the mouth, drills, and places implants under surgical supervision.
Table 2 summarizes prominent robotic implant systems currently available, including their autonomy levels and regulatory status. At present, Yomi (Neocis, Miami, FL, USA) remains the only robot-assisted implant system with FDA clearance, while several systems developed in China (e.g., Remebot, Theta, YekeBot, Langyue) have obtained approval through the Chinese NMPA [126]. To the authors' knowledge, no dental implant robot currently holds a verified CE mark for marketing in the EU. Both FDA and NMPA approvals indicate a high technology readiness level (approximately TRL 8–9), yet commercial use remains concentrated almost exclusively in China and the United States.
Compared with S-CAIS, R-CAIS does not require the production and use of a surgical guide, thereby eliminating the associated planning and manufacturing steps and enabling same-day diagnostic planning and surgery. Additionally, operator-dependent factors known to influence S-CAIS accuracy, such as the surgeon's experience, become largely irrelevant [12,127,128]. Compared with D-CAIS, R-CAIS does not require continuous screen focus or heavy, bulky equipment and is less prone to a pronounced learning curve, all of which are potential risk factors for deviations from the surgical plan [111,129,130]. Together, these advantages and the increasing availability of robotic systems indicate that they represent a notable technological development with the potential to streamline current workflows and become a major contributor to fully integrated implant prosthodontic workflows.
Despite the strong potential of robotic systems in modern dental implant workflows, human clinical judgment remains indispensable for case selection, interpretation of patient-specific nuances, and intraoperative decision-making in unforeseen situations. Accordingly, R-CAIS should be regarded as a powerful adjunct that enhances surgical precision and workflow reliability while operating under the clinician's oversight and expertise. Current systems are primarily assistive technologies and are likely to remain so for some time, although ongoing developments in autonomy and AI-driven decision support suggest that more advanced forms of automation may emerge as the technology matures.
Table 2. Overview of commercially available and emerging robotic systems for dental implantology, including autonomy level, key functionalities, and current regulatory approval status.
Robotic System | Manufacturer | Level of Autonomy | Regulatory Status
Yomi [126,131] | Neocis Inc. (USA) | Surgeon-guided (passive) | FDA 510(k) cleared (2017; expanded indications 2020–2023)
Dcarer [126] | Dcarer Medical Technology Co., Ltd., Suzhou, China | Surgeon-guided (passive) | NMPA-approved (China)
Remebot [132] | Baihui Weikang Technology Co., Ltd., Beijing, China | Collaborative (semi-active) | NMPA-approved since 2021 (China)
Theta [126] | Hangzhou Jianjia Robot Co., Ltd., Hangzhou, China | Collaborative (semi-active) | NMPA-approved (China)
Cobot [126] | Langyue dental surgery robot, Shecheng Co., Ltd., Shanghai, China | Collaborative (semi-active) | NMPA-approved (China)
YekeBot [126] | Yakebot Technology Co., Ltd., Beijing, China | Fully autonomous (active robot), as claimed | NMPA-approved (China)

3.2.6. Implant Maintenance

Supportive implant care is regarded as crucial to prevent the onset or recurrence of peri-implant disease following successful implant integration or active/completed peri-implant therapy. Baseline and follow-up periapical radiographs represent important diagnostic tools for monitoring peri-implant health [133,134,135,136]. In this context, Cha et al. recently presented a modified region-based convolutional neural network (R-CNN) model capable of measuring radiographic peri-implant bone loss with performance comparable to human diagnosis, indicating that AI models may play an important future role not only in pre-placement diagnostics but also in post-delivery treatment and maintenance phases [104,137].

4. Conclusions

4.1. Current State of AI and Robotics in Digital Implantology

The emergence of artificial intelligence, deep learning, and robotics has significant potential to transform digital implant prosthodontic workflows. Pilot and proof-of-concept studies indicate that AI-based tools are capable of delivering expert-level performance in diagnostic image analysis and processing, data registration and segmentation, implant planning, prosthetic design, and smile simulation. Early clinical and preclinical studies suggest that robotic-assisted surgery may reduce operator dependency while enhancing workflow efficiency and surgical precision. Together, these technologies offer the potential to help reduce existing workflow interfaces, minimize human error, and enable real-time, adaptive treatment strategies.

4.2. Practical Implications for Clinicians

As these technologies mature, more sophisticated AI-based models and advanced robotic tools trained on larger datasets are expected to contribute to the convergence of current tools and tasks throughout the entire digital implant workflow. Pilot and proof-of-concept studies suggest that, in the near to mid-term, AI-based data processing and robotic surgery could guide digital implantology toward fully integrated, direct digital workflows with significantly improved efficiency and accuracy. This development may transform traditional practice by connecting fragmented systems into intelligent, end-to-end treatment ecosystems in which diagnostics, planning, execution, and prosthetic delivery operate seamlessly within a unified digital framework.
At the same time, broader clinical translation requires robust standardization: consistent datasets, harmonized annotation protocols, and transparent validation frameworks are essential to ensure that AI systems perform reliably across diverse patient populations, imaging modalities, and clinical environments. Such standardization will be critical for safe integration into daily practice.
From an implementation perspective, the economic and operational barriers differ substantially between AI and robotic systems. AI-based tools, which are typically cloud-delivered and offered on subscription or pay-per-use models, generally involve low upfront investment and therefore pose limited financial barriers to adoption. In contrast, robotic-assisted implant systems may require substantial capital investment and dedicated training, although these costs may be offset in part by advantages such as increased workflow robustness, potentially improved accuracy, and the elimination of guide fabrication steps, depending on the specific system. Questions of clinical responsibility, by comparison, are less contentious: current regulations place final accountability with the surgeon, and the introduction of assistive automation does not alter this fundamental professional obligation.

4.3. Future Outlook and Emerging Considerations

Looking ahead, AI and robotics hold great potential for realizing fully integrated, end-to-end digital implant workflows that link diagnostics, planning, execution, and prosthetic delivery. As autonomy and AI-driven decision support advance, new ethical, professional, and regulatory considerations will emerge, including transparency, explainability, clinician oversight, and medico-legal accountability. While current systems function primarily as powerful assistive technologies and are likely to remain so in the near term, ongoing progress may enable more advanced automation as the field matures. Ensuring this evolution remains patient-centered will require continued evaluation, standardized assessment methodologies, and clear regulatory frameworks.

Author Contributions

Conceptualization, methodology, investigation, writing—original draft preparation, writing—review and editing, A.P.; supervision and critical revision of the manuscript, T.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

No new data were created or analyzed in this study. Data sharing is not applicable.

Conflicts of Interest

The authors declare no conflicts of interest. The authors declare no financial, advisory, or commercial relationships with any of the companies or products mentioned in this manuscript. References to commercial systems are provided solely as descriptive examples of available technologies.

Abbreviations

The following abbreviations are used in this manuscript:
Abbreviation | Definition
AI | Artificial Intelligence
ML | Machine Learning
DL | Deep Learning
ANN | Artificial Neural Network
CNN | Convolutional Neural Network
RNN | Recurrent Neural Network
GNN | Graph Neural Network
GAN | Generative Adversarial Network
VAE | Variational Autoencoder
NLP | Natural Language Processing
EHR | Electronic Health Record
SVM | Support Vector Machine
kNN | k-Nearest Neighbors
CAIS | Computer-Assisted Implant Surgery
S-CAIS | Static Computer-Assisted Implant Surgery
D-CAIS | Dynamic Computer-Assisted Implant Surgery
R-CAIS | Robot-Assisted Computer-Assisted Implant Surgery
FH | Freehand (implant placement)
CAD | Computer-Aided Design
CAM | Computer-Aided Manufacturing
CBCT | Cone Beam Computed Tomography
CT | Computed Tomography
IOS | Intraoral Scanner/Intraoral Scan
DSD | Digital Smile Design
VA | Virtual Articulator
API | Application Programming Interface
TRL | Technology Readiness Level
IoU | Intersection over Union
HD95 | 95th Percentile Hausdorff Distance
U-Net | Convolutional Neural Network Architecture for Segmentation
DCNN | Deep Convolutional Neural Network
R-CNN | Region-based Convolutional Neural Network
OPG | Orthopantomogram (panoramic radiograph)
YOLO | "You Only Look Once" Object Detection Model
SaaS | Software-as-a-Service
FDA | U.S. Food and Drug Administration
NMPA | National Medical Products Administration (China)
CE | Conformité Européenne (CE mark)

References

1. Joda, T.; Zarone, F.; Ferrari, M. The Complete Digital Workflow in Fixed Prosthodontics: A Systematic Review. BMC Oral Health 2017, 17, 124.
2. Xiang, B.; Yu, J.; Lu, J.; Yan, Z. Comparisons between Digital-Guided and Non-Digital Protocol in Implant Planning, Placement, and Restorations: A Systematic Review and Meta-Analysis of Randomized Controlled Trials. J. Evid. Based Dent. Pract. 2023, 23, 101919.
3. Scolozzi, P.; Michelini, F.; Crottaz, C.; Perez, A. Computer-Aided Design and Computer-Aided Modeling (CAD/CAM) for Guiding Dental Implant Surgery: Personal Reflection Based on 10 Years of Real-Life Experience. J. Pers. Med. 2023, 13, 129.
4. Romandini, M.; Ruales-Carrera, E.; Sadilina, S.; Hämmerle, C.H.F.; Sanz, M. Minimal Invasiveness at Dental Implant Placement: A Systematic Review with Meta-analyses on Flapless Fully Guided Surgery. Periodontology 2000 2023, 91, 89–112.
5. Raico Gallardo, Y.N.; Da Silva-Olivio, I.R.T.; Mukai, E.; Morimoto, S.; Sesma, N.; Cordaro, L. Accuracy Comparison of Guided Surgery for Dental Implants According to the Tissue of Support: A Systematic Review and Meta-analysis. Clin. Oral Implant. Res. 2017, 28, 602–612.
6. Perez, A.; Lombardi, T. Integration and Innovation in Digital Implantology—Part I: Capabilities and Limitations of Contemporary Workflows: A Narrative Review. Appl. Sci. 2025, 15, 12214.
7. Tahmaseb, A.; Wu, V.; Wismeijer, D.; Coucke, W.; Evans, C. The Accuracy of Static Computer-Aided Implant Surgery: A Systematic Review and Meta-Analysis. Clin. Oral Implant. Res. 2018, 29, 416–435.
8. Schneider, D.; Marquardt, P.; Zwahlen, M.; Jung, R.E. A Systematic Review on the Accuracy and the Clinical Outcome of Computer-Guided Template-Based Implant Dentistry. Clin. Oral Implant. Res. 2009, 20, 73–86.
9. Vercruyssen, M.; Laleman, I.; Jacobs, R.; Quirynen, M. Computer-supported Implant Planning and Guided Surgery: A Narrative Review. Clin. Oral Implant. Res. 2015, 26, 69–76.
10. Nkenke, E.; Eitner, S.; Radespiel-Tröger, M.; Vairaktaris, E.; Neukam, F.W.; Fenner, M. Patient-centred Outcomes Comparing Transmucosal Implant Placement with an Open Approach in the Maxilla: A Prospective, Non-randomized Pilot Study. Clin. Oral Implant. Res. 2007, 18, 197–203.
11. Fortin, T.; Bosson, J.L.; Isidori, M.; Blanchet, E. Effect of Flapless Surgery on Pain Experienced in Implant Placement Using an Image-Guided System. Int. J. Oral Maxillofac. Implant. 2006, 21, 298–304.
12. Gargallo-Albiol, J.; Barootchi, S.; Salomó-Coll, O.; Wang, H. Advantages and Disadvantages of Implant Navigation Surgery. A Systematic Review. Ann. Anat.—Anat. Anz. 2019, 225, 1–10.
13. Pozzi, A.; Arcuri, L.; Moy, P.K. The Smiling Scan Technique: Facially Driven Guided Surgery and Prosthetics. J. Prosthodont. Res. 2018, 62, 514–517.
14. Sobczak, B.; Majewski, P. An Integrated Fully Digital Prosthetic Workflow for the Immediate Full-Arch Restoration of Edentulous Patients—A Case Report. Int. J. Environ. Res. Public Health 2022, 19, 4126.
15. Pariente, L.; Dada, K.; Linder, S.; Dard, M. Immediate Implant Placement in the Esthetic Zone Using a Novel Tapered Implant Design and a Digital Integrated Workflow: A Case Series. Int. J. Periodontics Restor. Dent. 2023, 43, 578–587.
16. Sobczak, B.; Majewski, P.; Egorenkov, E. Survival and Success of 3D-Printed Versus Milled Immediate Provisional Full-Arch Restorations: A Retrospective Analysis. Clin. Implant. Dent. Relat. Res. 2025, 27, e13418.
17. Michelinakis, G.; Apostolakis, D.; Kamposiora, P.; Papavasiliou, G.; Özcan, M. The Direct Digital Workflow in Fixed Implant Prosthodontics: A Narrative Review. BMC Oral Health 2021, 21, 37–61.
18. Rutkūnas, V.; Auškalnis, L.; Pletkus, J. Intraoral Scanners in Implant Prosthodontics. A Narrative Review. J. Dent. 2024, 148, 105152.
19. Elgarba, B.M.; Fontenele, R.C.; Tarce, M.; Jacobs, R. Artificial Intelligence Serving Presurgical Digital Implant Planning: A Scoping Review. J. Dent. 2024, 143, 104862.
20. Schubert, O.; Schweiger, J.; Stimmelmayr, M.; Nold, E.; Güth, J.-F. Digital Implant Planning and Guided Implant Surgery—Workflow and Reliability. Br. Dent. J. 2019, 226, 101–108.
21. Joda, T.; Gallucci, G.O. The Virtual Patient in Dental Medicine. Clin. Oral Implant. Res. 2015, 26, 725–726.
22. Mangano, C.; Luongo, F.; Migliario, M.; Mortellaro, C.; Mangano, F.G. Combining Intraoral Scans, Cone Beam Computed Tomography and Face Scans: The Virtual Patient. J. Craniofac. Surg. 2018, 29, 2241–2246.
23. Coachman, C.; Sesma, N.; Blatz, M.B. The Complete Digital Workflow in Interdisciplinary Dentistry. Int. J. Esthet. Dent. 2021, 16, 34–49.
24. Wang, J.; Wang, B.; Liu, Y.Y.; Luo, Y.L.; Wu, Y.Y.; Xiang, L.; Yang, X.M.; Qu, Y.L.; Tian, T.R.; Man, Y. Recent Advances in Digital Technology in Implant Dentistry. J. Dent. Res. 2024, 103, 787–799.
25. Papaspyridakos, P.; Chen, Y.; Gonzalez-Gusmao, I.; Att, W. Complete Digital Workflow in Prosthesis Prototype Fabrication for Complete-Arch Implant Rehabilitation: A Technique. J. Prosthet. Dent. 2019, 122, 189–192.
26. Carosi, P.; Ferrigno, N.; De Renzi, G.; Laureti, M. Digital Workflow to Merge an Intraoral Scan and CBCT of Edentulous Maxilla: A Technical Report. J. Prosthodont. 2020, 29, 730–732.
27. Auduc, C.; Douillard, T.; Nicolas, E.; El Osta, N. Fully Digital Workflow in Full-Arch Implant Rehabilitation: A Descriptive Methodological Review. Prosthesis 2025, 7, 85.
28. Preda, F.; Nogueira-Reis, F.; Stanciu, E.M.; Smolders, A.; Jacobs, R.; Shaheen, E. Validation of Automated Registration of Intraoral Scan onto Cone Beam Computed Tomography for an Efficient Digital Dental Workflow. J. Dent. 2024, 149, 105282.
29. Ruiz-Romero, V.; Jorba-Garcia, A.; Camps-Font, O.; Figueiredo, R.; Valmaseda-Castellón, E. Accuracy of Dynamic Computer-Assisted Implant Surgery in Fully Edentulous Patients: An in Vitro Study. J. Dent. 2024, 149, 105290.
30. Marquez Bautista, N.; Meniz-García, C.; López-Carriches, C.; Sánchez-Labrador, L.; Cortés-Bretón Brinkmann, J.; Madrigal Martínez-Pereda, C. Accuracy of Different Systems of Guided Implant Surgery and Methods for Quantification: A Systematic Review. Appl. Sci. 2024, 14, 11479.
31. Biun, J.; Dudhia, R.; Arora, H. The In-vitro Accuracy of Fiducial Marker-based versus Markerless Registration of an Intraoral Scan with a Cone-beam Computed Tomography Scan in the Presence of Restoration Artifact. Clin. Oral Implant. Res. 2023, 34, 1257–1266.
32. Woo, H.-W.; Mai, H.-N.; Lee, D.-H. Comparison of the Accuracy of Image Registration Methods for Merging Optical Scan and Radiographic Data in Edentulous Jaws. J. Prosthodont. 2020, 29, 707–711.
33. Watanabe, H.; Fellows, C.; An, H. Digital Technologies for Restorative Dentistry. Dent. Clin. N. Am. 2022, 66, 567–590.
34. Lepidi, L.; Galli, M.; Grammatica, A.; Joda, T.; Wang, H.-L.; Li, J. Indirect Digital Workflow for Virtual Cross-Mounting of Fixed Implant-Supported Prostheses to Create a 3D Virtual Patient. J. Prosthodont. 2021, 30, 177–182.
35. Flügge, T.; Kramer, J.; Nelson, K.; Nahles, S.; Kernen, F. Digital Implantology—A Review of Virtual Planning Software for Guided Implant Surgery. Part II: Prosthetic Setup and Virtual Implant Planning. BMC Oral Health 2022, 22, 23.
36. Kernen, F.; Kramer, J.; Wanner, L.; Wismeijer, D.; Nelson, K.; Flügge, T. A Review of Virtual Planning Software for Guided Implant Surgery—Data Import and Visualization, Drill Guide Design and Manufacturing. BMC Oral Health 2020, 20, 251.
37. Mukhopadhyay, P. The Passive Fit Concept- A Review of Methods to Achieve and Evaluate in Multiple Unit Implant Supported Screw Retained Prosthesis. J. Dent. Oral Sci. 2021, 3, 1–7.
38. Araujo-Corchado, E.; Pardal-Peláez, B. Computer-Guided Surgery for Dental Implant Placement: A Systematic Review. Prosthesis 2022, 4, 540–553.
39. Cristache, C.M.; Burlibasa, M.; Tudor, I.; Totu, E.E.; Di Francesco, F.; Moraru, L. Accuracy, Labor-Time and Patient-Reported Outcomes with Partially versus Fully Digital Workflow for Flapless Guided Dental Implants Insertion—A Randomized Clinical Trial with One-Year Follow-Up. J. Clin. Med. 2021, 10, 1102.
40. Sadilina, S.; Vietor, K.; Doliveux, R.; Siu, A.; Chen, Z.; Al-Nawas, B.; Mattheos, N.; Pozzi, A. Beyond Accuracy: Clinical Outcomes of Computer Assisted Implant Surgery. Clin. Exp. Dent. Res. 2025, 11, e70129.
41. Graf, T.; Keul, C.; Wismeijer, D.; Güth, J.F. Time and Costs Related to Computer-assisted versus Non-computer-assisted Implant Planning and Surgery. A Systematic Review. Clin. Oral Implant. Res. 2021, 32, 303–317.
42. Dhopte, A.; Bagde, H. Smart Smile: Revolutionizing Dentistry With Artificial Intelligence. Cureus 2023, 15, e41227.
43. Sirko, J.; Shi, W. Disruptive Innovation Events in Dentistry. J. Am. Dent. Assoc. 2024, 155, 899–901.
44. Aseri, A.A. Exploring the Role of Artificial Intelligence in Dental Implantology: A Scholarly Review. J. Pharm. Bioallied Sci. 2025, 17, S102–S104.
45. Najeeb, M.; Islam, S. Artificial Intelligence (AI) in Restorative Dentistry: Current Trends and Future Prospects. BMC Oral Health 2025, 25, 592.
46. Samaranayake, L.; Tuygunov, N.; Schwendicke, F.; Osathanon, T.; Khurshid, Z.; Boymuradov, S.A.; Cahyanto, A. The Transformative Role of Artificial Intelligence in Dentistry: A Comprehensive Overview. Part 1: Fundamentals of AI, and Its Contemporary Applications in Dentistry. Int. Dent. J. 2025, 75, 383–396.
47. Bahrami, R.; Pourhajibagher, M.; Nikparto, N.; Bahador, A. Robot-Assisted Dental Implant Surgery Procedure: A Literature Review. J. Dent. Sci. 2024, 19, 1359–1368.
48. Salvi, S.; Vu, G.; Gurupur, V.; King, C. Digital Convergence in Dental Informatics: A Structured Narrative Review of Artificial Intelligence, Internet of Things, Digital Twins, and Large Language Models with Security, Privacy, and Ethical Perspectives. Electronics 2025, 14, 3278.
  49. Shirani, M. Trends and Classification of Artificial Intelligence Models Utilized in Dentistry: A Bibliometric Study. Cureus 2025, 17, e81836. [Google Scholar] [CrossRef]
  50. Tuygunov, N.; Samaranayake, L.; Khurshid, Z.; Rewthamrongsris, P.; Schwendicke, F.; Osathanon, T.; Yahya, N.A. The Transformative Role of Artificial Intelligence in Dentistry: A Comprehensive Overview Part 2: The Promise and Perils, and the International Dental Federation Communique. Int. Dent. J. 2025, 75, 397–404. [Google Scholar] [CrossRef]
  51. Koul, R.; Upadhyay, G.; Kalia, D.; Verma, K. Artificial Intelligence in Prosthodontics: Current Applications and Future Avenues: A Narrative Review. J. Prim. Care Dent. Oral Health 2024, 5, 94–100. [Google Scholar]
  52. Karnik, A.P.; Chhajer, H.; Venkatesh, S.B. Transforming Prosthodontics and Oral Implantology Using Robotics and Artificial Intelligence. Front. Oral Health 2024, 5, 1442100. [Google Scholar] [CrossRef]
  53. Schwendicke, F.; Mohammad Rahimi, H.; Tichy, A. Artificial Intelligence in Prosthodontics. Dent. Clin. N. Am. 2025, 69, 315–326. [Google Scholar] [CrossRef]
  54. Joda, T.; Brägger, U. Time-Efficiency Analysis Comparing Digital and Conventional Workflows for Implant Crowns: A Prospective Clinical Crossover Trial. Int. J. Oral Maxillofac. Implant. 2015, 30, 1047–1053. [Google Scholar] [CrossRef]
  55. Janiesch, C.; Zschech, P.; Heinrich, K. Machine Learning and Deep Learning. Electron. Mark. 2021, 31, 685–695. [Google Scholar] [CrossRef]
  56. Turing, A.M. Computing Machinery and Intelligence. Mind 1950, 59, 433–460. [Google Scholar] [CrossRef]
  57. Weizenbaum, J. ELIZA—A Computer Program for the Study of Natural Language Communication between Man and Machine. Commun. ACM 1966, 9, 36–45. [Google Scholar] [CrossRef]
  58. Dey, D.; Slomka, P.J.; Leeson, P.; Comaniciu, D.; Shrestha, S.; Sengupta, P.P.; Marwick, T.H. Artificial Intelligence in Cardiovascular Imaging. J. Am. Coll. Cardiol. 2019, 73, 1317–1335. [Google Scholar] [CrossRef]
  59. Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning Representations by Back-Propagating Errors. Nature 1986, 323, 533–536. [Google Scholar] [CrossRef]
  60. Lecun, Y.; Bottou, L.; Bengio, Y.; Haffner, P. Gradient-Based Learning Applied to Document Recognition. Proc. IEEE 1998, 86, 2278–2324. [Google Scholar] [CrossRef]
  61. Krizhevsky, A.; Sutskever, I.; Hinton, G.E. ImageNet Classification with Deep Convolutional Neural Networks. Commun. ACM 2017, 60, 84–90. [Google Scholar] [CrossRef]
  62. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention Is All You Need. In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS 2017), Long Beach, CA, USA, 4–9 December 2017; Curran Associates Inc.: Red Hook, NY, USA, 2017; pp. 6000–6010. [Google Scholar]
  63. Goodfellow, I.; Bengio, Y.; Courville, A. Deep Learning; MIT Press: Cambridge, MA, USA, 2016. [Google Scholar]
  64. Quinlan, J.R. Induction of Decision Trees. Mach. Learn. 1986, 1, 81–106. [Google Scholar] [CrossRef]
  65. Cortes, C.; Vapnik, V. Support-Vector Networks. Mach. Learn. 1995, 20, 273–297. [Google Scholar] [CrossRef]
  66. LeCun, Y.; Bengio, Y.; Hinton, G. Deep Learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  67. Lee, C.; Kabir, T.; Nelson, J.; Sheng, S.; Meng, H.; Van Dyke, T.E.; Walji, M.F.; Jiang, X.; Shams, S. Use of the Deep Learning Approach to Measure Alveolar Bone Level. J. Clin. Periodontol. 2022, 49, 260–269. [Google Scholar] [CrossRef]
  68. Verhelst, P.-J.; Smolders, A.; Beznik, T.; Meewis, J.; Vandemeulebroucke, A.; Shaheen, E.; Van Gerven, A.; Willems, H.; Politis, C.; Jacobs, R. Layered Deep Learning for Automatic Mandibular Segmentation in Cone-Beam Computed Tomography. J. Dent. 2021, 114, 103786. [Google Scholar] [CrossRef] [PubMed]
  69. Samek, W.; Wiegand, T.; Müller, K.-R. Explainable Artificial Intelligence: Understanding, Visualizing and Interpreting Deep Learning Models. arXiv 2017, arXiv:1708.08296. [Google Scholar] [CrossRef]
  70. Mallineni, S.K.; Sethi, M.; Punugoti, D.; Kotha, S.B.; Alkhayal, Z.; Mubaraki, S.; Almotawah, F.N.; Kotha, S.L.; Sajja, R.; Nettam, V.; et al. Artificial Intelligence in Dentistry: A Descriptive Review. Bioengineering 2024, 11, 1267. [Google Scholar] [CrossRef]
  71. Schwendicke, F.; Samek, W.; Krois, J. Artificial Intelligence in Dentistry: Chances and Challenges. J. Dent. Res. 2020, 99, 769–774. [Google Scholar] [CrossRef]
  72. Thurzo, A.; Urbanová, W.; Novák, B.; Czako, L.; Siebert, T.; Stano, P.; Mareková, S.; Fountoulaki, G.; Kosnáčová, H.; Varga, I. Where Is the Artificial Intelligence Applied in Dentistry? Systematic Review and Literature Analysis. Healthcare 2022, 10, 1269. [Google Scholar] [CrossRef]
  73. Elgarba, B.M.; Ali, S.; Fontenele, R.C.; Meeus, J.; Jacobs, R. An AI-Based Tool for Prosthetic Crown Segmentation Serving Automated Intraoral Scan-to-CBCT Registration in Challenging High Artifact Scenarios. J. Prosthet. Dent. 2025, 134, 191–198. [Google Scholar] [CrossRef]
  74. Elgarba, B.M.; Fontenele, R.C.; Mangano, F.; Jacobs, R. Novel AI-Based Automated Virtual Implant Placement: Artificial versus Human Intelligence. J. Dent. 2024, 147, 105146. [Google Scholar] [CrossRef]
  75. Elgarba, B.M.; Van Aelst, S.; Swaity, A.; Morgan, N.; Shujaat, S.; Jacobs, R. Deep Learning-Based Segmentation of Dental Implants on Cone-Beam Computed Tomography Images: A Validation Study. J. Dent. 2023, 137, 104639. [Google Scholar] [CrossRef] [PubMed]
  76. Elsonbaty, S.; Elgarba, B.M.; Fontenele, R.C.; Swaity, A.; Jacobs, R. Novel AI-Based Tool for Primary Tooth Segmentation on CBCT Using Convolutional Neural Networks: A Validation Study. Int. J. Paediatr. Dent. 2025, 35, 97–107. [Google Scholar] [CrossRef] [PubMed]
  77. Fontenele, R.C.; Gerhardt, M.D.N.; Picoli, F.F.; Van Gerven, A.; Nomidis, S.; Willems, H.; Freitas, D.Q.; Jacobs, R. Convolutional Neural Network-based Automated Maxillary Alveolar Bone Segmentation on Cone-beam Computed Tomography Images. Clin. Oral Implant. Res. 2023, 34, 565–574. [Google Scholar] [CrossRef] [PubMed]
  78. Jindanil, T.; Marinho-Vieira, L.E.; de-Azevedo-Vaz, S.L.; Jacobs, R. A Unique Artificial Intelligence-Based Tool for Automated CBCT Segmentation of Mandibular Incisive Canal. Dentomaxillofacial Radiol. 2023, 52, 20230321. [Google Scholar] [CrossRef]
  79. Nogueira-Reis, F.; Morgan, N.; Suryani, I.R.; Tabchoury, C.P.M.; Jacobs, R. Full Virtual Patient Generated by Artificial Intelligence-Driven Integrated Segmentation of Craniomaxillofacial Structures from CBCT Images. J. Dent. 2024, 141, 104829. [Google Scholar] [CrossRef]
  80. Oliveira-Santos, N.; Jacobs, R.; Picoli, F.F.; Lahoud, P.; Niclaes, L.; Groppo, F.C. Automated Segmentation of the Mandibular Canal and Its Anterior Loop by Deep Learning. Sci. Rep. 2023, 13, 10819. [Google Scholar] [CrossRef]
  81. Swaity, A.; Elgarba, B.M.; Morgan, N.; Ali, S.; Shujaat, S.; Borsci, E.; Chilvarquer, I.; Jacobs, R. Deep Learning Driven Segmentation of Maxillary Impacted Canine on Cone Beam Computed Tomography Images. Sci. Rep. 2024, 14, 369. [Google Scholar] [CrossRef]
  82. Wang, X.; Alqahtani, K.A.; Van Den Bogaert, T.; Shujaat, S.; Jacobs, R.; Shaheen, E. Convolutional Neural Network for Automated Tooth Segmentation on Intraoral Scans. BMC Oral Health 2024, 24, 804. [Google Scholar] [CrossRef]
  83. Alahmari, M.; Alahmari, M.; Almuaddi, A.; Abdelmagyd, H.; Rao, K.; Hamdoon, Z.; Alsaegh, M.; Chaitanya, N.C.S.K.; Shetty, S. Accuracy of Artificial Intelligence-Based Segmentation in Maxillofacial Structures: A Systematic Review. BMC Oral Health 2025, 25, 350. [Google Scholar] [CrossRef]
  84. Pankert, T.; Lee, H.; Peters, F.; Hölzle, F.; Modabber, A.; Raith, S. Mandible Segmentation from CT Data for Virtual Surgical Planning Using an Augmented Two-Stepped Convolutional Neural Network. Int. J. Comput. Assist. Radiol. Surg. 2023, 18, 1479–1488. [Google Scholar] [CrossRef]
  85. Amasya, H.; Jaju, P.P.; Ezhov, M.; Gusarev, M.; Atakan, C.; Sanders, A.; Manulius, D.; Golitskya, M.; Shrivastava, K.; Singh, A.; et al. Development and Validation of an Artificial Intelligence Software for Periodontal Bone Loss in Panoramic Imaging. Int. J. Imaging Syst. Tech. 2024, 34, e22973. [Google Scholar] [CrossRef]
  86. Ezhov, M.; Gusarev, M.; Golitsyna, M.; Yates, J.M.; Kushnerev, E.; Tamimi, D.; Aksoy, S.; Shumilov, E.; Sanders, A.; Orhan, K. Clinically Applicable Artificial Intelligence System for Dental Diagnosis with CBCT. Sci. Rep. 2021, 11, 15006. [Google Scholar] [CrossRef] [PubMed]
  87. Kazimierczak, W.; Kazimierczak, N.; Issa, J.; Wajer, R.; Wajer, A.; Kalka, S.; Serafin, Z. Endodontic Treatment Outcomes in Cone Beam Computed Tomography Images—Assessment of the Diagnostic Accuracy of AI. J. Clin. Med. 2024, 13, 4116. [Google Scholar] [CrossRef] [PubMed]
  88. Kurt Bayrakdar, S.; Orhan, K.; Bayrakdar, I.S.; Bilgir, E.; Ezhov, M.; Gusarev, M.; Shumilov, E. A Deep Learning Approach for Dental Implant Planning in Cone-Beam Computed Tomography Images. BMC Med. Imaging 2021, 21, 86. [Google Scholar] [CrossRef] [PubMed]
  89. Mema, H.; Gaxhja, E.; Alicka, Y.; Gugu, M.; Topi, S.; Giannoni, M.; Pietropaoli, D.; Altamura, S. Application of AI-Driven Software Diagnocat in Managing Diagnostic Imaging in Dentistry: A Retrospective Study. Appl. Sci. 2025, 15, 9790. [Google Scholar] [CrossRef]
  90. Orhan, K.; Aktuna Belgin, C.; Manulis, D.; Golitsyna, M.; Bayrak, S.; Aksoy, S.; Sanders, A.; Önder, M.; Ezhov, M.; Shamshiev, M.; et al. Determining the Reliability of Diagnosis and Treatment Using Artificial Intelligence Software with Panoramic Radiographs. Imaging Sci. Dent. 2023, 53, 199. [Google Scholar] [CrossRef]
  91. Zakirov, A.; Ezhov, M.; Gusarev, M.; Alexandrovsky, V.; Shumilov, E. Dental Pathology Detection in 3D Cone-Beam CT. arXiv 2018, arXiv:1810.10309. [Google Scholar] [CrossRef]
  92. Al-Asali, M.; Alqutaibi, A.Y.; Al-Sarem, M.; Saeed, F. Deep Learning-Based Approach for 3D Bone Segmentation and Prediction of Missing Tooth Region for Dental Implant Planning. Sci. Rep. 2024, 14, 13888. [Google Scholar] [CrossRef]
  93. Lerner, H.; Hauschild, U.; Sader, R.; Ghanaati, S. Complete-Arch Fixed Reconstruction by Means of Guided Surgery and Immediate Loading: A Retrospective Clinical Study on 12 Patients with 1 Year of Follow-Up. BMC Oral Health 2020, 20, 15. [Google Scholar] [CrossRef]
  94. Cho, J.-H.; Çakmak, G.; Choi, J.; Lee, D.; Yoon, H.-I.; Yilmaz, B.; Schimmel, M. Deep Learning-Designed Implant-Supported Posterior Crowns: Assessing Time Efficiency, Tooth Morphology, Emergence Profile, Occlusion, and Proximal Contacts. J. Dent. 2024, 147, 105142. [Google Scholar] [CrossRef]
  95. Çakmak, G.; Cho, J.-H.; Choi, J.; Yoon, H.-I.; Yilmaz, B.; Schimmel, M. Can Deep Learning-Designed Anterior Tooth-Borne Crown Fulfill Morphologic, Aesthetic, and Functional Criteria in Clinical Practice? J. Dent. 2024, 150, 105368. [Google Scholar] [CrossRef]
  96. Cho, J.-H.; Çakmak, G.; Jee, E.-B.; Yoon, H.-I.; Yilmaz, B.; Schimmel, M. A Comparison between Commercially Available Artificial Intelligence-Based and Conventional Human Expert-Based Digital Workflows for Designing Anterior Crowns. J. Prosthet. Dent. 2025; in press. [Google Scholar] [CrossRef]
  97. Hlaing, N.H.M.M.; Çakmak, G.; Karasan, D.; Kim, S.-J.; Sailer, I.; Lee, J.-H. Artificial Intelligence-Driven Automated Design of Anterior and Posterior Crowns Under Diverse Occlusal Scenarios. J. Esthet. Restor. Dent. 2025. [Google Scholar] [CrossRef]
  98. Ding, H.; Cui, Z.; Maghami, E.; Chen, Y.; Matinlinna, J.P.; Pow, E.H.N.; Fok, A.S.L.; Burrow, M.F.; Wang, W.; Tsoi, J.K.H. Morphology and Mechanical Performance of Dental Crown Designed by 3D-DCGAN. Dent. Mater. 2023, 39, 320–332. [Google Scholar] [CrossRef]
  99. Shetty, S.; Gali, S.; Augustine, D.; Sv, S. Artificial Intelligence Systems in Dental Shade-Matching: A Systematic Review. J. Prosthodont. 2024, 33, 519–532. [Google Scholar] [CrossRef] [PubMed]
  100. Mohsin, L.; Alenezi, N.; Rashdan, Y.; Hassan, A.; Alenezi, M.; Alam, M.K.; Noor, N.F.B.M.; Akhter, F. Development of AI-Enhanced Smile Design Software for Ultra-Customized Aesthetic Outcomes. J. Pharm. Bioallied Sci. 2025, 17, S1282–S1284. [Google Scholar] [CrossRef] [PubMed]
  101. Ceylan, G.; Özel, G.S.; Memişoglu, G.; Emir, F.; Şen, S. Evaluating the Facial Esthetic Outcomes of Digital Smile Designs Generated by Artificial Intelligence and Dental Professionals. Appl. Sci. 2023, 13, 9001. [Google Scholar] [CrossRef]
  102. Lee, S.; Jin, G.; Park, J.-H.; Jung, H.-I.; Kim, J.-E. Evaluation Metric of Smile Classification by Peri-Oral Tissue Segmentation for the Automation of Digital Smile Design. J. Dent. 2024, 145, 104871. [Google Scholar] [CrossRef] [PubMed]
  103. Ye, H.; Cheng, Z.; Ungvijanpunya, N.; Chen, W.; Cao, L.; Gou, Y. Is Automatic Cephalometric Software Using Artificial Intelligence Better than Orthodontist Experts in Landmark Identification? BMC Oral Health 2023, 23, 467. [Google Scholar] [CrossRef]
  104. Cha, J.-Y.; Yoon, H.-I.; Yeo, I.-S.; Huh, K.-H.; Han, J.-S. Peri-Implant Bone Loss Measurement Using a Region-Based Convolutional Neural Network on Dental Periapical Radiographs. J. Clin. Med. 2021, 10, 1009. [Google Scholar] [CrossRef]
  105. Estrella, N.-F.; Alexandra, D.-S.; Yun, C.; Palma-Fernández, J.C.; Alejandro, I.-L. AI-Aided Volumetric Root Resorption Assessment Following Personalized Forces in Orthodontics: Preliminary Results of a Randomized Clinical Trial. J. Evid.-Based Dent. Pract. 2025, 25, 102095. [Google Scholar] [CrossRef]
  106. Chang, H.-J.; Lee, S.-J.; Yong, T.-H.; Shin, N.-Y.; Jang, B.-G.; Kim, J.-E.; Huh, K.-H.; Lee, S.-S.; Heo, M.-S.; Choi, S.-C.; et al. Deep Learning Hybrid Method to Automatically Diagnose Periodontal Bone Loss and Stage Periodontitis. Sci. Rep. 2020, 10, 7531. [Google Scholar] [CrossRef] [PubMed]
  107. Setzer, F.C.; Shi, K.J.; Zhang, Z.; Yan, H.; Yoon, H.; Mupparapu, M.; Li, J. Artificial Intelligence for the Computer-Aided Detection of Periapical Lesions in Cone-Beam Computed Tomographic Images. J. Endod. 2020, 46, 987–993. [Google Scholar] [CrossRef] [PubMed]
  108. Fukuda, M.; Inamoto, K.; Shibata, N.; Ariji, Y.; Yanashita, Y.; Kutsuna, S.; Nakata, K.; Katsumata, A.; Fujita, H.; Ariji, E. Evaluation of an Artificial Intelligence System for Detecting Vertical Root Fracture on Panoramic Radiography. Oral Radiol. 2020, 36, 337–343. [Google Scholar] [CrossRef] [PubMed]
  109. Ossowska, A.; Kusiak, A.; Świetlik, D. Artificial Intelligence in Dentistry—Narrative Review. Int. J. Environ. Res. Public Health 2022, 19, 3449. [Google Scholar] [CrossRef]
  110. Khaohoen, A.; Powcharoen, W.; Sornsuwan, T.; Chaijareenont, P.; Rungsiyakull, C.; Rungsiyakull, P. Accuracy of Implant Placement with Computer-Aided Static, Dynamic, and Robot-Assisted Surgery: A Systematic Review and Meta-Analysis of Clinical Trials. BMC Oral Health 2024, 24, 359. [Google Scholar] [CrossRef]
  111. Bolding, S.L.; Reebye, U.N. Accuracy of Haptic Robotic Guidance of Dental Implant Surgery for Completely Edentulous Arches. J. Prosthet. Dent. 2022, 128, 639–647. [Google Scholar] [CrossRef]
  112. Amin, S.A.; Hann, S.; Elsheikh, A.K.; Boltchi, F.; Zandinejad, A. A Complete Digital Approach for Facially Generated Full Arch Diagnostic Wax up, Guided Surgery, and Implant-supported Interim Prosthesis by Integrating 3D Facial Scanning, Intraoral Scan and CBCT. J. Prosthodont. 2023, 32, 90–93. [Google Scholar] [CrossRef]
  113. Joda, T.; Gallucci, G.O.; Wismeijer, D.; Zitzmann, N.U. Augmented and Virtual Reality in Dental Medicine: A Systematic Review. Comput. Biol. Med. 2019, 108, 93–100. [Google Scholar] [CrossRef]
  114. Ntovas, P.; Sirirattanagool, P.; Asavanamuang, P.; Jain, S.; Tavelli, L.; Revilla-León, M.; Galarraga-Vinueza, M.E. Accuracy and Time Efficiency of Artificial Intelligence-Driven Tooth Segmentation on CBCT Images: A Validation Study Using Two Implant Planning Software Programs. Clin. Oral Impl. Res. 2025, 36, 1312–1323. [Google Scholar] [CrossRef]
  115. Tarce, M.; Zhou, Y.; Antonelli, A.; Becker, K. The Application of Artificial Intelligence for Tooth Segmentation in CBCT Images: A Systematic Review. Appl. Sci. 2024, 14, 6298. [Google Scholar] [CrossRef]
  116. Elgarba, B.M.; Fontenele, R.C.; Du, X.; Mureșanu, S.; Tarce, M.; Meeus, J.; Jacobs, R. Artificial Intelligence Versus Human Intelligence in Presurgical Implant Planning: A Preclinical Validation. Clin. Oral Implant. Res. 2025, 36, 835–845. [Google Scholar] [CrossRef] [PubMed]
  117. Du, W.; Bi, W.; Liu, Y.; Zhu, Z.; Tai, Y.; Luo, E. Machine Learning-Based Decision Support System for Orthognathic Diagnosis and Treatment Planning. BMC Oral Health 2024, 24, 286. [Google Scholar] [CrossRef] [PubMed]
  118. Coachman, C.; Georg, R.; Bohner, L.; Rigo, L.C.; Sesma, N. Chairside 3D Digital Design and Trial Restoration Workflow. J. Prosthet. Dent. 2020, 124, 514–520. [Google Scholar] [CrossRef] [PubMed]
  119. Rokhshad, R.; Karteva, T.; Chaurasia, A.; Richert, R.; Mörch, C.-M.; Tamimi, F.; Ducret, M. Artificial Intelligence and Smile Design: An e-Delphi Consensus Statement of Ethical Challenges. J. Prosthodont. 2024, 33, 730–735. [Google Scholar] [CrossRef]
  120. Kurian, N.; Sudharson, N.A.; Varghese, K.G. Artificial Intelligence. Br. Dent. J. 2024, 236, 146. [Google Scholar] [CrossRef]
  121. Baaj, R.E.; Alangari, T.A. Artificial Intelligence Applications in Smile Design Dentistry: A Scoping Review. J. Prosthodont. 2025, 34, 341–349. [Google Scholar] [CrossRef]
  122. Polizzi, A.; Leonardi, R. Automatic Cephalometric Landmark Identification with Artificial Intelligence: An Umbrella Review of Systematic Reviews. J. Dent. 2024, 146, 105056. [Google Scholar] [CrossRef]
  123. Chen, J.; Bai, X.; Ding, Y.; Shen, L.; Sun, X.; Cao, R.; Yang, F.; Wang, L. Comparison the Accuracy of a Novel Implant Robot Surgery and Dynamic Navigation System in Dental Implant Surgery: An in Vitro Pilot Study. BMC Oral Health 2023, 23, 179. [Google Scholar] [CrossRef]
  124. Tao, B.; Feng, Y.; Fan, X.; Zhuang, M.; Chen, X.; Wang, F.; Wu, Y. Accuracy of Dental Implant Surgery Using Dynamic Navigation and Robotic Systems: An in Vitro Study. J. Dent. 2022, 123, 104170. [Google Scholar] [CrossRef] [PubMed]
  125. Luo, Z.; Li, A.; Unkovskiy, A.; Li, J.; Beuer, F.; Wu, Z.; Li, P. Accuracy of Robotic Computer-Assisted Implant Surgery in Clinical Studies: A Systematic Review and Meta-Analysis. BMC Oral Health 2025, 25, 540. [Google Scholar] [CrossRef]
  126. Liu, C.; Liu, Y.; Xie, R.; Li, Z.; Bai, S.; Zhao, Y. The Evolution of Robotics: Research and Application Progress of Dental Implant Robotic Systems. Int. J. Oral Sci. 2024, 16, 28. [Google Scholar] [CrossRef]
  127. Sigcho López, D.A.; García, I.; Da Silva Salomao, G.; Cruz Laganá, D. Potential Deviation Factors Affecting Stereolithographic Surgical Guides: A Systematic Review. Implant. Dent. 2019, 28, 68–73. [Google Scholar] [CrossRef]
  128. Cassetta, M.; Bellardini, M. How Much Does Experience in Guided Implant Surgery Play a Role in Accuracy? A Randomized Controlled Pilot Study. Int. J. Oral Maxillofac. Surg. 2017, 46, 922–930. [Google Scholar] [CrossRef]
  129. Block, M.S.; Emery, R.W.; Cullum, D.R.; Sheikh, A. Implant Placement Is More Accurate Using Dynamic Navigation. J. Oral Maxillofac. Surg. 2017, 75, 1377–1386. [Google Scholar] [CrossRef]
  130. Sun, T.-M.; Lee, H.-E.; Lan, T.-H. The Influence of Dental Experience on a Dental Implant Navigation System. BMC Oral Health 2019, 19, 222. [Google Scholar] [CrossRef]
  131. Mozer, P.S. Accuracy and Deviation Analysis of Static and Robotic Guided Implant Surgery: A Case Study. Int. J. Oral Maxillofac. Implant. 2020, 35, e86–e90. [Google Scholar] [CrossRef]
  132. Yang, S.; Chen, J.; Li, A.; Li, P.; Xu, S. Autonomous Robotic Surgery for Immediately Loaded Implant-Supported Maxillary Full-Arch Prosthesis: A Case Report. J. Clin. Med. 2022, 11, 6594. [Google Scholar] [CrossRef] [PubMed]
  133. Roccuzzo, M.; Layton, D.M.; Roccuzzo, A.; Heitz-Mayfield, L.J. Clinical Outcomes of Peri-implantitis Treatment and Supportive Care: A Systematic Review. Clin. Oral Implant. Res. 2018, 29, 331–350. [Google Scholar] [CrossRef] [PubMed]
  134. Rokn, A.; Aslroosta, H.; Akbari, S.; Najafi, H.; Zayeri, F.; Hashemi, K. Prevalence of Peri-implantitis in Patients Not Participating in Well-designed Supportive Periodontal Treatments: A Cross-sectional Study. Clin. Oral Implant. Res. 2017, 28, 314–319. [Google Scholar] [CrossRef] [PubMed]
  135. Perez, A.; Lombardi, T. Frontiers in the Understanding of Peri-Implant Disease. In Periodontal Frontiers; IntechOpen: London, UK, 2025. [Google Scholar] [CrossRef]
  136. Perez, A.; Lombardi, T. Treatment of Peri-Implant Disease: Current Knowledge. In Periodontal Frontiers; IntechOpen: London, UK, 2025. [Google Scholar] [CrossRef]
  137. Schwarz, F.; Derks, J.; Monje, A.; Wang, H.-L. Peri-Implantitis. J. Periodontol. 2018, 89 (Suppl. 1), S267–S290. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Overview of the artificial intelligence (AI) toolbox for dental implantology and prosthodontics. The figure illustrates the hierarchy of AI approaches, ranging from shallow or traditional machine learning (e.g., decision trees, SVMs) to deep learning architectures (e.g., CNNs, RNNs, transformers, GANs), and highlights their typical applications, including CBCT segmentation, implant site planning, multimodal data integration, and digital smile design. Abbreviations: AI: artificial intelligence; TML: traditional machine learning; DL: deep learning; ANNs: artificial neural networks; CNNs: convolutional neural networks; RNNs: recurrent neural networks; GNNs: graph neural networks; GANs: generative adversarial networks; VAEs: variational autoencoders; NLP: natural language processing; EHRs: electronic health records; CBCT: cone beam computed tomography.
Figure 2. AI-powered integration of digital workflows in dental implantology. Overview of artificial intelligence (AI) applications across diagnostic, planning, surgical, prosthetic, and follow-up stages in integrated digital implant workflows. Abbreviations: CNN = Convolutional Neural Network; GAN = Generative Adversarial Network; TML = Traditional Machine Learning; R-CNN = Region-based Convolutional Neural Network; NLP = Natural Language Processing; U-Net = Convolutional Neural Network architecture for image segmentation.
Table 1. Representative AI applications in digital implantology, including task, model, dataset, reported performance, and estimated technology readiness level (TRL) based on reported registration stage and commercial availability.
#Study (First Author, Year; Ref #)Clinical TaskAI ModelDataset (as Reported/Summarized)Primary Metric(s)Main Outcome (Short)Approx. TRL
1Verhelst et al., 2021; + subsequent Relu-based studies (2021–2025)
[68,73,74,75,76,77,78,79,80,81,82]
Automated CBCT segmentation of dentoalveolar structures (mandible, maxilla, teeth, mandibular canal, alveolar bone) and 3D virtual patient generationMulti-stage 3D U-Net CNN (Relu Creator/Virtual Patient Creator), cloud-based voxel-wise segmentationAggregated across studies: ~150–250 CBCTs per structure for training; 20–40 test scans per study; multi-device datasets (NewTom, Morita, Planmeca); includes CBCT-only and CBCT + IOS datasetsDice/IoU, HD95, surface deviation, segmentation time, inter-run consistencyHigh-accuracy segmentation (Dice 0.90–0.98 across structures), instant inference (20–60 s), consistent across CBCT devices, 50–100× faster than manual workflows; widely validated across multiple independent clinical studies9 (FDA-cleared & CE-marked commercial system; extensively validated; in routine clinical use)
2Alahamari et al., 2025 systematic review [83]Radiological segmentation of teeth, jaws, TMJ, mandibular canalMultiple DL CNNs and TML models30 included studies across CBCT/CT modalitiesDice, surface deviation, and other overlap metricsDL models consistently achieved high overlaps and outperformed TML approaches across most structures5–6 (portfolio of mostly prototype/early clinical tools)
3Pankert et al., 2023 [84]Mandible segmentation with metal artifact reduction on CTTwo-step 3D U-Net CNN pipelineCT datasets with metal restorations/implantsDice, accuracy vs. manual/semi-automatic, processing timeArtifact-compensated 3D models in ~31 s per scan with higher accuracy and markedly reduced manual workload4–5 (advanced proof-of-concept/early clinical validation)
4Ezhov et al., 2021; Bayrakdar et al., 2021; plus multiple subsequent Diagnocat evaluations (2021–2025)
[85,86,87,88,89,90,91]
AI-assisted diagnostic support and reporting on CBCT, panoramic, and intraoral radiographs; automated multi-pathology detection; implant-site analysis and planningHybrid 2D/3D CNN architecture: coarse-to-fine volumetric 3D CNN for CBCT tooth and pathology segmentation; 2D CNN modules for caries, restorations, periodontal bone loss, missing teeth; cloud-based SaaS platform (Diagnocat)CBCT datasets (100–300 scans per task); panoramic datasets (100–4500 annotated teeth); multimodal datasets (CBCT + OPG + IO) across heterogeneous devices; includes studies on implant planning, airway analysis, caries, periapical pathology, and periodontal diseaseAccuracy, sensitivity/specificity, Dice/IoU, AUC, inter-rater agreement (vs. expert panels), diagnostic concordance, time savingsConsistent diagnostic performance on CBCT and OPG; high agreement with expert references for periapical pathology, bone levels, and implant-site metrics; significant time savings in structured reporting; robust performance across imaging modalities9 (FDA-cleared, CE-marked, Health Canada–approved commercial system in routine clinical use)
5Al-Asali et al., 2024 [92]Fully automated implant planning: bone segmentation + implant position predictionTwo consecutive 3D U-Net modelsCBCT datasets with edentulous sites for implant placementSegmentation accuracy, positional error of proposed implants, planning timeAccurate bone segmentation and near-instant (~10 s) generation of implant proposals with high concordance to expert plans4–5 (technically robust, but still research-prototype)
6Lerner et al., 2020 [93]Automated retrieval/design of implant abutments and subgingival marginsAI module embedded in CAD software (feature-based ML/DL)IOS data and original abutment libraryWorkflow time, need for manual gingival margin tracing; qualitative fit/aestheticsAutomated realignment of original abutment designs and margin definition, eliminating manual margin tracing and streamlining abutment design5–6 (integrated into specialized CAD workflows, limited commercial roll-out)
| No. | Study | Application | AI model | Dataset | Outcome measures | Key findings | Readiness (1–9) |
|---|---|---|---|---|---|---|---|
| 7 | Cho et al., 2024 [94] | DL-based design of implant-supported posterior crowns | CNN-based DL crown generator | Digital models of posterior implant cases | Design time; occlusal table area; cusp height/angle; proximal contacts; occlusal contact pattern | DL crowns generated in ~83 s vs. 322–371 s for technician-optimized/DL-assisted or conventional CAD, with comparable occlusal and contact parameters | 5–6 (strong technical validation; not yet a mainstream clinical product) |
| 8 | Various authors, DL anterior and posterior crown design studies [94,95,96,97,98] | Automated design of tooth-borne and posterior crowns | Various 3D CNN and 3D-GAN models | IOS/cast scan datasets of anterior and posterior crowns | Morphologic and functional metrics (incisal path, inclination, occlusal relation, marginal fit, contact quality); design time | DL-generated crowns showed clinically acceptable morphology and function, with superior time efficiency and posterior crown quality vs. conventional automated CAD; limited human refinement remains helpful in complex esthetic cases | 4–5 (advanced research tools; early commercial pilots via Exocad AI, 3Shape Automate, DTX Studio, etc.) |
| 9 | Shetty et al., systematic review [99] | AI-based crown shade matching | Multiple CNN/ML shade-matching algorithms | Collection of in vitro and in vivo shade-matching studies | Shade-matching accuracy vs. visual methods; agreement with reference devices | AI-based shade matching is promising and can improve consistency, but available evidence is still limited and heterogeneous | 3–4 (early stage; few robust clinical implementations) |
| 10 | Mohsin et al., 2025 [100] | AI-enhanced digital smile design (DSD) with facial feature analysis | Hybrid CNN + GAN architecture | Clinical smile design cases with 2D facial photographs | Patient satisfaction scores; expert aesthetic ratings; design time | AI-enhanced DSD produced higher patient satisfaction and aesthetic ratings and reduced design time by ~40% compared with conventional DSD | 4–5 (pilot software; not yet widely commercialized) |
| 11 | Ceylan et al., 2024 [101] | Comparison of AI-generated vs. conventional DSD layouts | Proprietary AI DSD engine (likely CNN-based) | Clinical cases with symmetric/asymmetric smiles | Subjective aesthetic ratings; usability; design time | AI-generated designs were generally acceptable, especially in symmetric faces, with relevant time savings independent of user experience | 4–5 (experimental/early clinical tool) |
| 12 | Lee et al., 2024 [102] | Automatic segmentation and classification of peri-oral tissues and smile types | CNN-based segmentation + classifier | Clinical 2D facial/smile images | Segmentation accuracy; smile-type classification accuracy | Reliable segmentation of lips/teeth and classification of smile types, enabling a key technical step towards fully automated DSD pipelines | 3–4 (technical enabler; not yet a standalone clinical product) |
| 13 | Ye et al., 2023 [103] | Automated cephalometric landmarking for orthodontic/DSD integration | DL and ML in commercial cephalometric tools (MyOrthoX, Angelalign, Digident) | Lateral cephalograms assessed by software vs. experienced orthodontists | Landmark error vs. expert; analysis time | AI-based cephalometric systems achieved accuracy comparable to orthodontists and reduced analysis time by up to 50%, while still requiring human supervision | 8–9 (commercial software with broad clinical use) |
| 14 | Cha et al., 2022 [104] | Automated measurement of peri-implant bone loss on periapical radiographs | Modified R-CNN (deep CNN) | Periapical radiographs of implants with serial follow-up | Measurement error vs. human raters; diagnostic agreement indices | R-CNN model measured peri-implant bone loss with performance comparable to dentists, supporting its use as a future maintenance/recall aid | 4–5 (pilot stage; not yet widely commercialized) |
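The shade-matching systems reviewed by Shetty et al. [99] ultimately compare a measured tooth color against a shade-guide library in a perceptual color space; deep models mainly automate the extraction of robust, illumination-tolerant color features from images, while the final match is still commonly expressed as a color difference (ΔE). A minimal sketch of that matching step using the classic CIE76 ΔE*ab distance (the shade-tab CIELAB values below are illustrative placeholders, not manufacturer data):

```python
import math

# Illustrative CIELAB (L*, a*, b*) values for a few shade tabs.
# Hypothetical numbers -- real systems use spectrophotometer-derived libraries.
SHADE_LIBRARY = {
    "A1": (76.0, 1.5, 15.0),
    "A2": (73.0, 2.5, 18.0),
    "B1": (77.5, 0.5, 13.0),
    "C2": (69.0, 1.0, 16.5),
}

def delta_e_cie76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

def match_shade(measured_lab):
    """Return the closest shade tab and its delta-E to the measured color."""
    shade = min(SHADE_LIBRARY, key=lambda tab: delta_e_cie76(measured_lab, SHADE_LIBRARY[tab]))
    return shade, delta_e_cie76(measured_lab, SHADE_LIBRARY[shade])

shade, de = match_shade((74.0, 2.0, 17.0))
print(shade, round(de, 2))  # -> A2 1.5 with these illustrative values
```

Newer formulas such as CIEDE2000 weight lightness and chroma differences more perceptually, but the nearest-library-entry structure of the comparison is the same.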
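The radiographic bone-loss measurement of Cha et al. [104] also ends in a simple geometric post-processing step: once a detector localizes the implant platform, apex, and marginal bone level in pixel coordinates, the implant's known physical length provides the pixel-to-millimetre calibration. A sketch of that conversion (coordinates, implant length, and function names are illustrative assumptions; the landmark detection itself would come from the R-CNN):

```python
import math

def pixel_to_mm_scale(apex_px, platform_px, implant_length_mm):
    """Derive a mm-per-pixel scale from the implant's known physical length."""
    length_px = math.dist(apex_px, platform_px)
    return implant_length_mm / length_px

def bone_loss_mm(platform_px, bone_level_px, scale):
    """Marginal bone loss: platform-to-bone-level distance in millimetres."""
    return math.dist(platform_px, bone_level_px) * scale

# Illustrative landmark coordinates (pixels) on a periapical radiograph.
apex = (250.0, 400.0)
platform = (250.0, 100.0)    # 300 px from the apex
bone_level = (250.0, 160.0)  # 60 px apical to the platform

scale = pixel_to_mm_scale(apex, platform, implant_length_mm=10.0)
print(round(bone_loss_mm(platform, bone_level, scale), 2))  # -> 2.0 mm here
```

Calibrating against the implant body sidesteps the unknown magnification of periapical projections, though foreshortening from angulation errors still biases the result, which is one reason human verification remains part of these pipelines.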
