Article

Automating Dental Condition Detection on Panoramic Radiographs: Challenges, Pitfalls, and Opportunities

by Sorana Mureșanu 1,*,†, Mihaela Hedeșiu 1,†, Liviu Iacob 2, Radu Eftimie 3, Eliza Olariu 4, Cristian Dinu 1, Reinhilde Jacobs 5,6,7 and on behalf of Team Project Group
1 Department of Oral and Maxillofacial Surgery and Radiology, Iuliu Hațieganu University of Medicine and Pharmacy, 32 Clinicilor Street, 400006 Cluj-Napoca, Romania
2 Department of Computer Science, Technical University of Cluj-Napoca, 400114 Cluj-Napoca, Romania
3 Iuliu Hațieganu University of Medicine and Pharmacy, 32 Clinicilor Street, 400006 Cluj-Napoca, Romania
4 Department of Electrical Engineering, Technical University of Cluj-Napoca, 400114 Cluj-Napoca, Romania
5 OMFS IMPATH Research Group, Department of Imaging and Pathology, Faculty of Medicine, Katholieke Universiteit Leuven, 3000 Louvain, Belgium
6 Department of Oral and Maxillofacial Surgery, University Hospitals Leuven, 3000 Louvain, Belgium
7 Department of Dental Medicine, Karolinska Institute, 171 77 Stockholm, Sweden
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
The TEAM Project Consortium is provided in the Acknowledgments.
Diagnostics 2024, 14(20), 2336; https://doi.org/10.3390/diagnostics14202336
Submission received: 25 September 2024 / Revised: 13 October 2024 / Accepted: 14 October 2024 / Published: 21 October 2024

Abstract
Background/Objectives: The integration of AI into dentistry holds promise for improving diagnostic workflows, particularly in the detection of dental pathologies and pre-radiotherapy screening for head and neck cancer patients. This study aimed to develop and validate an AI model for detecting various dental conditions, with a focus on identifying teeth at risk prior to radiotherapy. Methods: A YOLOv8 model was trained on a dataset of 1628 annotated panoramic radiographs and externally validated on 180 radiographs from multiple centers. The model was designed to detect a variety of dental conditions, including periapical lesions, impacted teeth, root fragments, prosthetic restorations, and orthodontic devices. Results: The model showed strong performance in detecting implants, endodontic treatments, and surgical devices, with precision and recall values exceeding 0.8 for several conditions. However, performance declined during external validation, highlighting the need for improvements in generalizability. Conclusions: YOLOv8 demonstrated robust detection capabilities for several dental conditions, especially in training data. However, further refinement is needed to enhance generalizability in external datasets and improve performance for conditions like periapical lesions and bone loss.

1. Introduction

The integration of artificial intelligence (AI) into healthcare has advanced considerably over the past decade, and the field of dentistry is no exception to this trend [1,2]. The World Health Organization estimates that nearly 3.5 billion people suffer from a form of oral disease throughout their lives [3]. This underscores a pressing need for tools to ease the burden on practitioners and improve access to oral healthcare. AI algorithms could help alleviate some of these burdens, as they have been shown to decrease workloads and increase diagnostic efficacy [4]. This may lead to earlier detection and treatment of dental pathologies and result in better oral health outcomes for patients. AI holds particular promise in dentomaxillofacial radiology, where the large volume of imaging data provides an excellent foundation for algorithm training [5].
Panoramic radiography examination is a part of the pre-treatment dental screening for patients undergoing radiotherapy for head and neck cancers [6]. Patients exposed to therapeutic radiation often experience dental complications. Radiation damage to salivary glands can result in severe xerostomia and post-radiation caries, while bone damage, potentially leading to osteoradionecrosis, may take years to develop and is frequently triggered by dental extractions [7]. Screening aims to identify and extract teeth with poor prognoses to reduce the need for post-radiation therapy extractions, which are associated with an increased risk of osteoradionecrosis [8]. Screening is also carried out before cardiac surgery, to detect and eliminate potential sources of dental sepsis before treatment onset, thereby reducing the risk of infective endocarditis [9]. The recent rise of AI presents significant potential for its integration into dental screening applications, offering opportunities to enhance inter-specialty communication, detect hard-to-see lesions, and streamline workflows.
A substantial number of AI-based decision support systems have been developed and trained on panoramic radiographs. Applications include the detection and/or segmentation of dental caries [10,11], alveolar bone loss [12,13,14,15], impacted teeth [16,17,18], maxillary sinus pathology [19,20], vertical root fractures [21], prosthetic restorations [18,22], intraosseous cysts and tumors [23,24], and osteoporosis screening [25], as well as tooth identification and numbering [26,27,28]. To the best of our knowledge, no current applications specifically target the detection of teeth at risk or needing extraction prior to radiotherapy for head and neck cancers. AI applications are widely studied in the field of dental medicine; however, despite the large amount of research, there are comparatively few clinically viable AI models. This has been attributed to limited data availability, insufficient methodological standards, and concerns about these solutions’ value, ethics, and practicality [29]. A few examples of commercially available dental charting or radiographic interpretation applications include: CranioCatch Clinic (CranioCatch, Eskişehir, Turkey), DiagnoCat (DGNCT LLC, Miami, FL, USA), VelmeniAI (Velmeni, Sunnyvale, CA, USA), and Denti.AI (Denti.AI Technology Inc., Toronto, ON, Canada). The available literature has indicated strong performance in external validation studies [30,31,32,33,34], though outcomes seem to vary significantly across different lesions or conditions.
This study aimed to train and validate an AI model for automated dental lesion detection on panoramic X-ray images. We aimed to detect a varied pool of dental pathologies and treatments; however, the focus was on the preliminary identification of teeth that posed potential risks for oncologic patients. This includes teeth presenting with periapical lesions, as well as root fragments. Our model also detects prosthetic restorations, dental implants, dental fillings, endodontic treatments, caries, periodontal bone loss, impacted teeth, root resorption, fixed orthodontic treatments, and surgical devices (i.e., osteosynthesis plates, orthodontic temporary anchorage devices).

2. Materials and Methods

2.1. Panoramic Image Selection and Dataset

The training dataset consisted of 1628 panoramic images from the Department of Maxillofacial Surgery and Radiology at the Iuliu Hațieganu University of Medicine and Pharmacy, Cluj-Napoca, Romania. The images were retrospectively collected from the department’s database of patients who visited the hospital between April 2021 and December 2023. The images were acquired using a Vatech PCH-2500 machine (Vatech, Hwaseong, Republic of Korea) and stored in jpeg format. A second set consisting of 180 radiographs was acquired from different medical centers in Cluj-Napoca, Romania, and was used as external validation. Patient consent was waived due to the retrospective nature of the study by the Ethics Committee of the Iuliu Hațieganu University of Medicine and Pharmacy, reference number 117/04.06.2024. All images were de-identified before analysis. The study was conducted in accordance with the principles of the Declaration of Helsinki.
Panoramic images of diagnostic quality were selected based on the following inclusion criterion: adult patients with permanent dentition. Images containing large cysts, tumors, metallic artifacts, blurring, or severe technique errors that would impede radiological interpretation were excluded.

2.2. Image Annotation

The panoramic images were manually annotated by three calibrated researchers (S.M., M.H., and R.E.) using an open-source labeling platform (Makesense.ai v1.11.0 [35]). The inter-observer agreement was calculated using the intraclass correlation coefficient (ICC) to ensure examiner reliability, achieving a value of 0.91. Bounding boxes were created for the collected images to identify the following dental conditions: prosthetic restoration, dental implant, dental filling, endodontic treatment, caries, periapical lesion, periodontal bone loss, impacted tooth, root fragment, root resorption, orthodontic treatment (brackets), and surgical devices (i.e., osteosynthesis plates, orthodontic temporary anchorage devices). Finally, the annotations were exported in the YOLO label format.
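The YOLO label format mentioned above stores each bounding box as a class index followed by center coordinates and box dimensions, all normalized to the image size. A minimal conversion sketch (the pixel coordinates and class index below are illustrative, not taken from the study's annotations):

```python
def to_yolo_label(class_id, x_min, y_min, x_max, y_max, img_w, img_h):
    """Convert a pixel-space bounding box to a YOLO-format label line.

    YOLO stores: class_id, x_center, y_center, width, height,
    all normalized to [0, 1] by the image dimensions.
    """
    x_center = (x_min + x_max) / 2.0 / img_w
    y_center = (y_min + y_max) / 2.0 / img_h
    width = (x_max - x_min) / img_w
    height = (y_max - y_min) / img_h
    return f"{class_id} {x_center:.6f} {y_center:.6f} {width:.6f} {height:.6f}"

# Example: a hypothetical box on a 1024 x 1024 radiograph.
line = to_yolo_label(1, 256, 512, 384, 768, 1024, 1024)
```

One such line is written per annotated box, with one label file per image.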

2.3. Preprocessing, Architecture, Training, and Evaluation

The internal dataset was split into 1375 training images, 153 validation images, and 100 testing images. An additional 180 radiographs comprised the external validation set.
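A split of this shape can be reproduced as a shuffled partition of the image filenames. This is a sketch under assumed details (the seed, naming scheme, and shuffling strategy are not specified in the study):

```python
import random

def split_dataset(filenames, n_train=1375, n_val=153, n_test=100, seed=42):
    """Shuffle and partition a list of image filenames into train/val/test."""
    assert len(filenames) == n_train + n_val + n_test
    files = sorted(filenames)          # deterministic starting order
    rng = random.Random(seed)          # fixed seed for reproducibility
    rng.shuffle(files)
    train = files[:n_train]
    val = files[n_train:n_train + n_val]
    test = files[n_train + n_val:]
    return train, val, test

# 1628 placeholder names standing in for the internal panoramic images.
names = [f"pano_{i:04d}.jpg" for i in range(1628)]
train, val, test = split_dataset(names)
```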
The architecture used in this study was YOLOv8 [36], a variant of the YOLO (You Only Look Once) object detector, released in January 2023. All images were resized to 1024 × 1024 pixels before being input into the model. The final model was trained over 75 epochs using the entire training dataset. A batch size of 8 was chosen, meaning the model updated its weights after every 8 images. A patience parameter of 5 epochs was set, ensuring that if the model did not show improvement within 5 consecutive epochs, the training would be halted and the best weights would be restored.
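The patience mechanism described above can be sketched independently of any framework: track the best validation score per epoch and stop once it has not improved for 5 consecutive epochs, keeping the best checkpoint. The metric values below are illustrative, not the study's actual training curve:

```python
def train_with_patience(epoch_scores, patience=5):
    """Simulate early stopping: return (best_epoch, best_score, stopped_at).

    epoch_scores: validation metric per epoch (higher is better).
    Training halts after `patience` epochs without improvement,
    and the best checkpoint ("best weights") is the one restored.
    """
    best_epoch, best_score = 0, float("-inf")
    for epoch, score in enumerate(epoch_scores):
        if score > best_score:
            best_epoch, best_score = epoch, score
        elif epoch - best_epoch >= patience:
            return best_epoch, best_score, epoch  # stop, restore best weights
    return best_epoch, best_score, len(epoch_scores) - 1

# Illustrative metric curve: improves until epoch 3, then plateaus.
scores = [0.40, 0.48, 0.52, 0.55, 0.54, 0.53, 0.55, 0.54, 0.52]
best_epoch, best_score, stopped_at = train_with_patience(scores)
```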
The accuracy of the model was assessed using the following metrics: F1 score, precision, and recall. The formulas for these metrics are provided in Table 1.
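The metrics in Table 1 follow the standard detection definitions; a minimal sketch with hypothetical per-class counts (the numbers below are not from the study):

```python
def precision_recall_f1(tp, fp, fn):
    """Compute precision, recall, and F1 from detection counts.

    precision = TP / (TP + FP)  -- fraction of detections that were correct
    recall    = TP / (TP + FN)  -- fraction of ground-truth boxes found
    F1        = harmonic mean of precision and recall
    """
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)
          if precision + recall else 0.0)
    return precision, recall, f1

# Hypothetical counts for one class: 80 true positives,
# 20 false positives, 20 false negatives.
p, r, f1 = precision_recall_f1(80, 20, 20)
```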

3. Results

During training, the best model achieved the following results: a peak F1 score of 0.6 at a confidence threshold of 0.275, and recall = 0.657 at mAP@50. Although a confidence threshold of 0.275 could be considered low for other CNN architectures, 0.3 is the recommended threshold for YOLO architectures. For the external validation set, we obtained the following results: an F1 score of 0.47 at a confidence threshold of 0.192, and recall = 0.451 at mAP@50. Table 2 shows the performance results, in terms of precision and recall, for each class on the internal validation set, compared to the ground truth. Table 3 shows the performance on the external validation set. The F1-confidence curve and the precision-recall curve are provided in Appendix A.
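The reported "F1 at a confidence" values correspond to the peak of the F1-confidence curve: F1 is computed at a sweep of confidence thresholds, and the threshold maximizing F1 becomes the operating point. A sketch with illustrative curve samples (not the study's actual curve):

```python
def best_operating_point(curve):
    """Pick the confidence threshold that maximizes F1.

    curve: list of (confidence_threshold, f1_score) pairs sampled
    from an F1-confidence curve. Returns (best_confidence, best_f1).
    """
    return max(curve, key=lambda point: point[1])

# Illustrative samples of an F1-confidence curve peaking near 0.275.
curve = [(0.1, 0.48), (0.2, 0.57), (0.275, 0.60), (0.4, 0.55), (0.6, 0.41)]
conf, f1 = best_operating_point(curve)
```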
During the training epochs, the losses monitored for both the training and validation data were box_loss, class_loss, and dfl_loss (distribution focal loss).
The normalized confusion matrix for the internal and external datasets is presented in Figure 1. Figure 2 illustrates a side-by-side comparison of the model’s predictions and corresponding ground truths.
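A row-normalized confusion matrix like the one in Figure 1 can be built from paired ground-truth and predicted class labels; a sketch with made-up labels for three hypothetical classes:

```python
def normalized_confusion_matrix(y_true, y_pred, n_classes):
    """Row-normalized confusion matrix: entry [i][j] is the fraction of
    ground-truth class i that the model predicted as class j."""
    counts = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        counts[t][p] += 1
    matrix = []
    for row in counts:
        total = sum(row)
        matrix.append([c / total if total else 0.0 for c in row])
    return matrix

# Illustrative labels for 3 classes (e.g. filling, caries, implant).
y_true = [0, 0, 0, 1, 1, 2, 2, 2]
y_pred = [0, 0, 1, 1, 0, 2, 2, 2]
m = normalized_confusion_matrix(y_true, y_pred, 3)
```

Each row sums to 1, so the diagonal reads directly as per-class recall.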

4. Discussion

In this study, we evaluated YOLOv8 for the detection of various dental pathologies and treatments. During training, the algorithm showed decent performance in terms of precision, recall, and F1 score for the detection of implants, endodontic treatments, orthodontic devices, and surgical devices. Among the oral pathologies examined, impacted teeth, root fragments, and periapical lesions were the most reliably detected. The model's evaluation metrics declined when applied to the external dataset, indicating a need for improvement in terms of generalizability.
Bonfanti-Gris et al. [37] evaluated the diagnostic performance of three versions of YOLO (YOLOv5, YOLOv7, and YOLOv8), in terms of object detection and segmentation, on a multiclass panoramic dataset. They found that the latest versions of YOLO showed better results for both tasks. YOLOv8 supports the following computer vision tasks: object detection, segmentation, pose estimation, tracking, and classification [36]. Our team employed YOLOv8 for object detection, as it demonstrates improved throughput compared to earlier versions while maintaining a similar number of parameters [38]. Additionally, it shows enhanced object detection performance, particularly for smaller objects, owing to the loss functions (CIoU and DFL loss functions for bounding box loss and binary cross-entropy for classification loss) [39].
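The CIoU bounding-box loss mentioned above builds on the plain intersection-over-union between a predicted and a ground-truth box (CIoU additionally penalizes center distance and aspect-ratio mismatch). A minimal IoU sketch with illustrative boxes:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes (x1, y1, x2, y2)."""
    ix1 = max(box_a[0], box_b[0])
    iy1 = max(box_a[1], box_b[1])
    ix2 = min(box_a[2], box_b[2])
    iy2 = min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)  # 0 if no overlap
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

# Two partially overlapping boxes.
value = iou((0, 0, 10, 10), (5, 5, 15, 15))
```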

4.1. Detection of Dental Treatments

The automated detection of dental restorations and treatments has been proposed as a first step toward the later AI-based diagnosis of dental pathology [40]. Similar to our findings, the study by Bonfanti-Gris et al. [37] faced difficulties in identifying and differentiating between coronal restorations and crowns/pontics; however, we did not face this problem with endodontic treatments. Similar trends have been observed in other studies. One study used YOLOv4 to detect prosthetic restorations on a large panoramic dataset [22]. They found that the model struggled to detect crowns, partly due to the inclusion of images with artifacts and superpositions. Similarly, our study did not exclude such images. The use of data from multiple centers may be a key factor contributing to our performance being significantly lower than that reported in the literature. Abdalla-Aslan et al. [40] developed a machine learning algorithm for automatically detecting and classifying several dental restorations. Results for amalgam fillings, crowns, implants, and cores were high. However, composite fillings and root canal treatments had lower detection rates due to their similarity to normal teeth and narrow structures. In the case of our study, implants and endodontic treatments were reliably detected.
A novel aspect of our paper is the inclusion of labels for surgical devices (in this case osteosynthesis plates) and orthodontic devices (including brackets, palatal expanders, and TADs). The model demonstrated robustness in accurately detecting these elements.

4.2. Detection of Dental Pathology

The detection of impacted teeth has been extensively studied in the literature. One study achieved high precision and recall using YOLOv3 for the detection of impacted third molars [16]. Another study by Kuwada [20] showed that DetectNet and AlexNet had the potential for detecting impacted supernumerary teeth in the maxillary incisor region. Another framework proposed by Zhu et al. [41] also diagnosed impacted teeth with high sensitivity and specificity. Our dataset included a broader range of dental impaction cases, encompassing not only impacted molars but also canines and premolars. While the detection of impacted teeth yielded the highest results among the various dental pathological conditions, our model performance was still lower than reported in other studies.
Periapical lesions are common dental pathologies normally resulting from apical periodontitis [42]. While our model performed surprisingly well on the training dataset, performance dropped on the external images. A clinical validation of the DiagnoCat software showed high specificity and reproducibility for lesion detection in panoramic radiographs, but limited reliability for the assessment of caries and periapical lesions [30]. Another study on DiagnoCat performed this detection on periapical radiographs [34], achieving a 0.92 F1 score. Kazimierczak et al. [43] found that periapical lesion detection had higher sensitivity and specificity on CBCT compared to panoramic radiography. The difficulties in AI-based detection of periapical lesions are reflected by the limitations of available diagnostic methods, with panoramic radiography, although common, having the lowest sensitivities [44]. Periapical radiographs result in higher accuracies, but limitations such as superimpositions and distortions [45] still affect the diagnostic process. CBCT overcomes these limitations but is associated with increased false-positive results [46]. Furthermore, the increased radiation doses additionally prevent CBCT from being used routinely. Given these considerations, we see significant value in AI-based panoramic detection for periapical lesions. It offers a reliable second opinion and has the potential to identify early or small lesions that may otherwise be overlooked.
Several tools have been developed to assess periodontal bone levels. One study [47] trained a deep learning model to evaluate bone loss on panoramic radiographs, achieving dentist-level accuracy. Kim et al. [48] improved upon it with a larger dataset, transfer learning, and clinical knowledge in their DeNTNet model. Chang et al. [49] introduced a method to detect and classify bone loss into periodontitis stages, though it relied solely on radiographic data. In our study, bounding box annotation was not appropriate for these lesions. Instead, other labeling strategies may be necessary to achieve more robust results.
Carious lesion detection had some of the lowest precision and recall scores. Other studies have faced similar challenges in detecting caries on panoramic radiographs [30,41,50], most likely due to variations in the position, shape, and extension of these lesions. Another reason might be related to observer under-detection of incipient and/or interproximal caries on radiographic findings [51]. This also explains the large number of false negatives.

4.3. Panoramic Radiographs

Panoramic radiography is one of the most common diagnostic tools in dentistry. It provides a two-dimensional overview of the teeth and jaws. While being a valuable adjunct for diagnosis, it presents with certain pitfalls: variable image contrast (influenced by the selection of the appropriate peak kilovoltage), image magnification, superimpositions and distortions, the image’s tomographic character (reduced visibility of structures located outside the focal trough), and susceptibility to technical errors [52,53]. Therefore, diligence and a systematic approach to their interpretation are required [54]. We addressed this through meticulous calibration and training of the examiners; however, it is important to recognize that panoramic radiography may not be the optimal diagnostic examination for some conditions. Furthermore, the variability of panoramic radiograph quality may also constitute a technical issue for AI models trained on these images.

4.4. Potential for Pre-Radiation Therapy Dental Screening

Head and neck cancer patients often experience a variety of treatment-related oral symptoms, including xerostomia, thickened secretions, mucosal sensitivity, dental caries, periodontal disease, infections, odynophagia, and osteoradionecrosis [55]. Pre-radiation dental screening and interventions are necessary to manage the risk of osteonecrosis [8]. The primary goal is to prevent post-radiation dental extractions, when the risk of osteoradionecrosis is significantly higher [56]. Some AI models in the literature detect multiple dental conditions, including compromised teeth [32,50]. However, no studies focus specifically on this area. Regarding our model’s potential for dental screening, we achieved strong results in detecting periapical lesions and root fragments. One study emphasizes the need to evaluate even more pathologies, including dental caries, pulpal and periapical disease, root resorption, periodontal disease, tooth mobility, and tooth impactions [57]. Although defining tooth prognosis is a difficult task [58], we believe that AI-based screening could play a valuable role in this process in the future.

4.5. Practical Concerns for Clinical Implementation

Although AI has the potential to improve many aspects of dental practice, its translation into practice has been slow [59]. Reasons for this include concerns over data privacy, the lack of methodological transparency of AI methods (“black-box”), health inequity issues due to potential data biases, and underdeveloped government regulations [60]. In light of these challenges, van der Vegt et al. introduced SALIENT [61], a provisional implementation framework, which aims to aid healthcare providers in navigating the steps required to integrate AI into clinical practice. Along with new standards and regulations, physicians require training in machine learning to build the skills necessary to interpret the outputs of algorithmic decision support systems.

4.6. Limitations and Future Research

This study faced limitations due to the small dataset and the imbalanced classes. The latter was anticipated given the study’s focus and reflects the varying frequencies of different dental pathologies [62]. Overfitting was a concern in the starting phase; fortunately, the model exhibited overfitting in only a few classes (e.g., carious lesion vs. obturation, where it favored the obturation class). On the other hand, the model presented an underfitting behavior for the bone resorption class due to the limited context provided by the annotation bounding boxes. This issue will be addressed in the future by considering a broader context while training and potentially employing a different annotation strategy. Another class for which the model underfitted was apical surgery, because of the limited data regarding that lesion. We aim to address this in future studies by expanding the training dataset and incorporating images from various centers to enhance its generalizability.

5. Conclusions

The present method achieved reliable results in detecting several oral conditions, including impacted teeth, root fragments, periapical lesions, implants, endodontic treatments, orthodontic devices, and surgical devices. Our results also highlight AI’s potential to aid in pre-radiation therapy dental screening. Despite the promising findings, the inherent limitations of this study and the technical challenges associated with using panoramic radiographs for AI-based detection require that the results be interpreted with caution. Further research will need to improve generalizability, increase the sample size for validation, tackle annotation issues, and improve the model’s capability to detect critical conditions. Our results highlight the need for external as well as clinical validation in AI studies before their integration into dental practice.

Author Contributions

Conceptualization, S.M. and M.H.; data curation, S.M., M.H. and R.E.; formal analysis, L.I.; funding acquisition, S.M., M.H., C.D. and R.J.; investigation, S.M., M.H., L.I. and R.E.; methodology, S.M., M.H. and L.I.; project administration, C.D. and R.J.; resources, M.H., C.D. and Team Project Group; software, L.I. and E.O.; supervision, M.H., C.D. and R.J.; validation, S.M., L.I., R.E., E.O. and Team Project Group; visualization, S.M. and L.I.; writing—original draft, S.M.; writing—review and editing, M.H. and R.J. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by project PNRR-III-C9-2023-I8, Technologically Enabled Advancements in Dental Medicine (TEAM), Planul Național de Redresare și Reziliență (PNRR), CF.80/31.07.2023, number 760235/28 December 2023.

Institutional Review Board Statement

The study was conducted following the Declaration of Helsinki and approved by the Ethics Committee of the Iuliu Hațieganu University of Medicine and Pharmacy, Cluj-Napoca, Romania (approval number 117/4 June 2024).

Informed Consent Statement

Patient consent was waived due to the retrospective nature of the study.

Data Availability Statement

Data are available upon reasonable request from the corresponding author.

Acknowledgments

This article is an expanded version of a paper entitled Artificial Intelligence for Diagnosis of Dental Pathology on Panoramic Images, which was presented at the European Congress of Radiology, Vienna, Austria, 28 February–3 March 2024. Sorana Eftimie, Liviu Iacob, Alin Cordos, Raluca Roman, and Mihaela Hedeșiu, Artificial intelligence for diagnosis of dental pathology on panoramic images, Poster C-14505, Proceedings of the European Congress of Radiology, Vienna, Austria, 28 February–3 March 2024. We would like to acknowledge the contributions of the Team Project Group, which includes: Almășan O, Băciuț M, Bran S, Burde A, Cordoș A, Crișan B, Dinu C, Dioșan L, Mureșanu S, Hedeșiu M, Ilea A, Jacobs R, Leucuța DC, Lucaciu O, Manea A, Olariu E, Roman R, Rotaru H, Stoia S, Tamaș T, and Văcăraș S.

Conflicts of Interest

The authors declare no conflicts of interest.

Appendix A

Figure A1. F1–confidence curves on the internal (a) and external (b) datasets.
Figure A2. Precision–recall curves on the internal (a) and external (b) datasets.

References

  1. Rajpurkar, P.; Chen, E.; Banerjee, O.; Topol, E. AI in Health and Medicine. Nat. Med. 2022, 28, 31–38. [Google Scholar] [CrossRef]
  2. Mureșanu, S.; Almășan, O.; Hedeșiu, M.; Dioșan, L.; Dinu, C.; Jacobs, R. Artificial Intelligence Models for Clinical Usage in Dentistry with a Focus on Dentomaxillofacial CBCT: A Systematic Review. Oral Radiol. 2023, 39, 18–40. [Google Scholar] [CrossRef] [PubMed]
  3. Global Oral Health Status Report: Towards Universal Health Coverage for Oral Health by 2030; World Health Organization: Geneva, Switzerland, 2022.
  4. Nguyen, T.T.; Larrivée, N.; Lee, A.; Bilaniuk, O.; Durand, R. Use of Artificial Intelligence in Dentistry: Current Clinical Trends and Research Advances. J. Can. Dent. Assoc. 2021, 87, l7. [Google Scholar] [CrossRef]
  5. Heo, M.-S.; Kim, J.-E.; Hwang, J.-J.; Han, S.-S.; Kim, J.-S.; Yi, W.-J.; Park, I.-W. Artificial Intelligence in Oral and Maxillofacial Radiology: What Is Currently Possible? Dentomaxillofac. Radiol. 2021, 50, 20200375. [Google Scholar] [CrossRef]
  6. Lee, J.; Hueniken, K.; Cuddy, K.; Pu, J.; El Maghrabi, A.; Hope, A.; Hosni, A.; Glogauer, M.; Watson, E. Dental Extractions Before Radiation Therapy and the Risk of Osteoradionecrosis in Patients with Head and Neck Cancer. JAMA Otolaryngol.–Head. Neck Surg. 2023, 149, 1130–1139. [Google Scholar] [CrossRef] [PubMed]
  7. Watson, E.; Dorna Mojdami, Z.; Oladega, A.; Hope, A.; Glogauer, M. Clinical Practice Guidelines for Dental Management Prior to Radiation for Head and Neck Cancer. Oral Oncol. 2021, 123, 105604. [Google Scholar] [CrossRef]
  8. Kufta, K.; Forman, M.; Swisher-McClure, S.; Sollecito, T.P.; Panchal, N. Pre-Radiation Dental Considerations and Management for Head and Neck Cancer Patients. Oral Oncol. 2018, 76, 42–51. [Google Scholar] [CrossRef] [PubMed]
  9. Krasniqi, L.; Schødt Riber, L.P.; Nissen, H.; Terkelsen, C.J.; Andersen, N.H.; Freeman, P.; Povlsen, J.A.; Gerke, O.; Clavel, M.-A.; Dahl, J.S. Impact of Mandatory Preoperative Dental Screening on Post-Procedural Risk of Infective Endocarditis in Patients Undergoing Transcatheter Aortic Valve Implantation: A Nationwide Retrospective Observational Study. Lancet Reg. Health Eur. 2023, 36, 100789. [Google Scholar] [CrossRef]
  10. Lakshmi, M.M.; Chitra, P. Classification of Dental Cavities from X-ray Images Using Deep CNN Algorithm. In Proceedings of the 2020 4th International Conference on Trends in Electronics and Informatics (ICOEI) (48184), Tirunelveli, India, 15–17 June 2020; pp. 774–779. [Google Scholar]
  11. Mărginean, A.C.; Mureşanu, S.; Hedeşiu, M.; Dioşan, L. Teeth Segmentation and Carious Lesions Segmentation in Panoramic X-Ray Images Using CariSeg, a Networks’ Ensemble. Heliyon 2024, 10, e30836. [Google Scholar] [CrossRef]
  12. Uzun Saylan, B.C.; Baydar, O.; Yeşilova, E.; Kurt Bayrakdar, S.; Bilgir, E.; Bayrakdar, İ.Ş.; Çelik, Ö.; Orhan, K. Assessing the Effectiveness of Artificial Intelligence Models for Detecting Alveolar Bone Loss in Periodontal Disease: A Panoramic Radiograph Study. Diagnostics 2023, 13, 1800. [Google Scholar] [CrossRef]
  13. Vollmer, A.; Vollmer, M.; Lang, G.; Straub, A.; Kübler, A.; Gubik, S.; Brands, R.; Hartmann, S.; Saravi, B. Automated Assessment of Radiographic Bone Loss in the Posterior Maxilla Utilizing a Multi-Object Detection Artificial Intelligence Algorithm. Appl. Sci. 2023, 3, 1858. [Google Scholar] [CrossRef]
  14. Kurt, S.; Çelik, Ö.; Bayrakdar, İ.Ş.; Orhan, K.; Bilgir, E.; Odabas, A.; Aslan, A.F. Success of Artificial Intelligence System in Determining Alveolar Bone Loss from Dental Panoramic Radiography Images. Cumhur. Dent. J. 2020, 23, 318–324. [Google Scholar] [CrossRef]
  15. Siontis, G.C.M.; Sweda, R.; Noseworthy, P.A.; Friedman, P.A.; Siontis, K.C.; Patel, C.J. Development and Validation Pathways of Artificial Intelligence Tools Evaluated in Randomised Clinical Trials. BMJ Health Care Inform. 2021, 28, e100466. [Google Scholar] [CrossRef]
  16. Celik, M.E. Deep Learning Based Detection Tool for Impacted Mandibular Third Molar Teeth. Diagnostics 2022, 12, 942. [Google Scholar] [CrossRef] [PubMed]
  17. ResMIBCU-Net: An Encoder–Decoder Network with Residual Blocks, Modified Inverted Residual Block, and Bi-Directional ConvLSTM for Impacted Tooth Segmentation in Panoramic X-ray Images|Oral Radiology. Available online: https://link.springer.com/article/10.1007/s11282-023-00677-8 (accessed on 7 February 2024).
  18. Kohinata, K.; Kitano, T.; Nishiyama, W.; Mori, M.; Iida, Y.; Fujita, H.; Katsumata, A. Deep Learning for Preliminary Profiling of Panoramic Images. Oral Radiol. 2023, 39, 275–281. [Google Scholar] [CrossRef] [PubMed]
  19. Kotaki, S.; Nishiguchi, T.; Araragi, M.; Akiyama, H.; Fukuda, M.; Ariji, E.; Ariji, Y. Transfer Learning in Diagnosis of Maxillary Sinusitis Using Panoramic Radiography and Conventional Radiography. Oral Radiol. 2023, 39, 467–474. [Google Scholar] [CrossRef]
  20. Kuwana, R.; Ariji, Y.; Fukuda, M.; Kise, Y.; Nozawa, M.; Kuwada, C.; Muramatsu, C.; Katsumata, A.; Fujita, H.; Ariji, E. Performance of Deep Learning Object Detection Technology in the Detection and Diagnosis of Maxillary Sinus Lesions on Panoramic Radiographs. Dentomaxillofac. Radiol. 2021, 50, 20200171. [Google Scholar] [CrossRef]
  21. Fukuda, M.; Inamoto, K.; Shibata, N.; Ariji, Y.; Yanashita, Y.; Kutsuna, S.; Nakata, K.; Katsumata, A.; Fujita, H.; Ariji, E. Evaluation of an Artificial Intelligence System for Detecting Vertical Root Fracture on Panoramic Radiography. Oral. Radiol. 2020, 36, 337–343. [Google Scholar] [CrossRef] [PubMed]
  22. Altan, B.; Gunec, H.G.; Cinar, S.; Kutal, S.; Gulum, S.; Aydin, K.C. Detecting Prosthetic Restorations Using Artificial Intelligence on Panoramic Radiographs. Sci. Program. 2022, 2022, e6384905. [Google Scholar] [CrossRef]
  23. Watanabe, H.; Ariji, Y.; Fukuda, M.; Kuwada, C.; Kise, Y.; Nozawa, M.; Sugita, Y.; Ariji, E. Deep Learning Object Detection of Maxillary Cyst-like Lesions on Panoramic Radiographs: Preliminary Study. Oral Radiol. 2021, 37, 487–493. [Google Scholar] [CrossRef]
  24. Ver Berne, J.; Saadi, S.B.; Politis, C.; Jacobs, R. A Deep Learning Approach for Radiological Detection and Classification of Radicular Cysts and Periapical Granulomas. J. Dent. 2023, 135, 104581. [Google Scholar] [CrossRef] [PubMed]
  25. Evaluation of Transfer Learning with Deep Convolutional Neural Networks for Screening Osteoporosis in Dental Panoramic Radiographs. Available online: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7074309/ (accessed on 23 August 2024).
  26. Chandrashekar, G.; AlQarni, S.; Bumann, E.E.; Lee, Y. Collaborative Deep Learning Model for Tooth Segmentation and Identification Using Panoramic Radiographs. Comput. Biol. Med. 2022, 148, 105829. [Google Scholar] [CrossRef] [PubMed]
  27. Yilmaz, S.; Tasyurek, M.; Amuk, M.; Celik, M.; Canger, E.M. Developing Deep Learning Methods for Classification of Teeth in Dental Panoramic Radiography. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2024, 138, 118–127. [Google Scholar] [CrossRef] [PubMed]
  28. Putra, R.H.; Astuti, E.R.; Putri, D.K.; Widiasri, M.; Laksanti, P.A.M.; Majidah, H.; Yoda, N. Automated Permanent Tooth Detection and Numbering on Panoramic Radiograph Using a Deep Learning Approach. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2024, 137, 537–544. [Google Scholar] [CrossRef]
  29. Schwendicke, F.; Samek, W.; Krois, J. Artificial Intelligence in Dentistry: Chances and Challenges. 2020. Available online: https://journals.sagepub.com/doi/10.1177/0022034520915714 (accessed on 25 July 2024).
  30. Zadrożny, Ł.; Regulski, P.; Brus-Sawczuk, K.; Czajkowska, M.; Parkanyi, L.; Ganz, S.; Mijiritsky, E. Artificial Intelligence Application in Assessment of Panoramic Radiographs. Diagnostics 2022, 12, 224. [Google Scholar] [CrossRef]
  31. Bonfanti-Gris, M.; Garcia-Cañas, A.; Alonso-Calvo, R.; Salido Rodriguez-Manzaneque, M.P.; Pradies Ramiro, G. Evaluation of an Artificial Intelligence Web-Based Software to Detect and Classify Dental Structures and Treatments in Panoramic Radiographs. J. Dent. 2022, 126, 104301. [Google Scholar] [CrossRef]
  32. Orhan, K.; Aktuna Belgin, C.; Manulis, D.; Golitsyna, M.; Bayrak, S.; Aksoy, S.; Sanders, A.; Önder, M.; Ezhov, M.; Shamshiev, M.; et al. Determining the Reliability of Diagnosis and Treatment Using Artificial Intelligence Software with Panoramic Radiographs. Imaging Sci. Dent. 2023, 53, 199–208. [Google Scholar] [CrossRef]
  33. Bilgir, E.; Bayrakdar, İ.Ş.; Çelik, Ö.; Orhan, K.; Akkoca, F.; Sağlam, H.; Odabaş, A.; Aslan, A.F.; Ozcetin, C.; Kıllı, M.; et al. An Artificial Intelligence Approach to Automatic Tooth Detection and Numbering in Panoramic Radiographs. BMC Med. Imaging 2021, 21, 124. [Google Scholar] [CrossRef]
  34. Issa, J.; Jaber, M.; Rifai, I.; Mozdziak, P.; Kempisty, B.; Dyszkiewicz-Konwińska, M. Diagnostic Test Accuracy of Artificial Intelligence in Detecting Periapical Periodontitis on Two-Dimensional Radiographs: A Retrospective Study and Literature Review. Medicina 2023, 59, 768. [Google Scholar] [CrossRef]
  35. Make Sense. Available online: https://www.makesense.ai/ (accessed on 20 May 2024).
  36. Jocher, G.; Chaurasia, A.; Qiu, J. Ultralytics YOLO 2023. YOLO GitHub Repository. Available online: https://github.com/ultralytics/ultralytics (accessed on 20 May 2024).
  37. Bonfanti-Gris, M.; Herrera, A.; Paraíso-Medina, S.; Alonso-Calvo, R.; Martínez-Rus, F.; Pradíes, G. Performance Evaluation of Three Versions of a Convolutional Neural Network for Object Detection and Segmentation Using a Multiclass and Reduced Panoramic Radiograph Dataset. J. Dent. 2024, 144, 104891. [Google Scholar] [CrossRef]
  38. Hussain, M. YOLO-v1 to YOLO-v8, the Rise of YOLO and Its Complementary Nature toward Digital Manufacturing and Industrial Defect Detection. Machines 2023, 11, 677. [Google Scholar] [CrossRef]
  39. Ozsari, S.; Güzel, M.S.; Yılmaz, D.; Kamburoğlu, K. A Comprehensive Review of Artificial Intelligence Based Algorithms Regarding Temporomandibular Joint Related Diseases. Diagnostics 2023, 13, 2700. [Google Scholar] [CrossRef] [PubMed]
  40. Abdalla-Aslan, R.; Yeshua, T.; Kabla, D.; Leichter, I.; Nadler, C. An Artificial Intelligence System Using Machine-Learning for Automatic Detection and Classification of Dental Restorations in Panoramic Radiography. Oral Surg. Oral Med. Oral Pathol. Oral Radiol. 2020, 130, 593–602. [Google Scholar] [CrossRef]
  41. Zhu, J.; Chen, Z.; Zhao, J.; Yu, Y.; Li, X.; Shi, K.; Zhang, F.; Yu, F.; Shi, K.; Sun, Z.; et al. Artificial Intelligence in the Diagnosis of Dental Diseases on Panoramic Radiographs: A Preliminary Study. BMC Oral Health 2023, 23, 358. [Google Scholar] [CrossRef]
  42. Karamifar, K.; Tondari, A.; Saghiri, M.A. Endodontic Periapical Lesion: An Overview on the Etiology, Diagnosis and Current Treatment Modalities. Eur. Endod. J. 2020, 5, 54–67. [Google Scholar] [CrossRef]
  43. Kazimierczak, W.; Wajer, R.; Wajer, A.; Kiian, V.; Kloska, A.; Kazimierczak, N.; Janiszewska-Olszowska, J.; Serafin, Z. Periapical Lesions in Panoramic Radiography and CBCT Imaging-Assessment of AI’s Diagnostic Accuracy. J. Clin. Med. 2024, 13, 2709. [Google Scholar] [CrossRef]
  44. Nardi, C.; Calistri, L.; Pradella, S.; Desideri, I.; Lorini, C.; Colagrande, S. Accuracy of Orthopantomography for Apical Periodontitis without Endodontic Treatment. J. Endod. 2017, 43, 1640–1646. [Google Scholar] [CrossRef]
  45. Chang, L.; Umorin, M.; Augsburger, R.A.; Glickman, G.N.; Jalali, P. Periradicular Lesions in Cancellous Bone Can Be Detected Radiographically. J. Endod. 2020, 46, 496–501. [Google Scholar] [CrossRef] [PubMed]
  46. Pope, O.; Sathorn, C.; Parashos, P. A Comparative Investigation of Cone-Beam Computed Tomography and Periapical Radiography in the Diagnosis of a Healthy Periapex. J. Endod. 2014, 40, 360–365. [Google Scholar] [CrossRef]
  47. Krois, J.; Ekert, T.; Meinhold, L.; Golla, T.; Kharbot, B.; Wittemeier, A.; Dörfer, C.; Schwendicke, F. Deep Learning for the Radiographic Detection of Periodontal Bone Loss. Sci. Rep. 2019, 9, 8495. [Google Scholar] [CrossRef]
  48. Kim, J.; Lee, H.-S.; Song, I.-S.; Jung, K.-H. DeNTNet: Deep Neural Transfer Network for the Detection of Periodontal Bone Loss Using Panoramic Dental Radiographs. Sci. Rep. 2019, 9, 17615. [Google Scholar] [CrossRef] [PubMed]
  49. Chang, H.-J.; Lee, S.-J.; Yong, T.-H.; Shin, N.-Y.; Jang, B.-G.; Kim, J.-E.; Huh, K.-H.; Lee, S.-S.; Heo, M.-S.; Choi, S.-C.; et al. Deep Learning Hybrid Method to Automatically Diagnose Periodontal Bone Loss and Stage Periodontitis. Sci. Rep. 2020, 10, 7531. [Google Scholar] [CrossRef] [PubMed]
  50. Başaran, M.; Çelik, Ö.; Bayrakdar, I.S.; Bilgir, E.; Orhan, K.; Odabaş, A.; Aslan, A.F.; Jagtap, R. Diagnostic Charting of Panoramic Radiography Using Deep-Learning Artificial Intelligence System. Oral Radiol. 2022, 38, 363–369. [Google Scholar] [CrossRef] [PubMed]
  51. Gimenez, T.; Piovesan, C.; Braga, M.M.; Raggio, D.P.; Deery, C.; Ricketts, D.N.; Ekstrand, K.R.; Mendes, F.M. Visual Inspection for Caries Detection: A Systematic Review and Meta-Analysis. J. Dent. Res. 2015, 94, 895–904. [Google Scholar] [CrossRef]
  52. Różyło-Kalinowska, I. Panoramic Radiography in Dentistry. Clin. Dent. Rev. 2021, 5, 26. [Google Scholar] [CrossRef]
  53. White, S.C.; Pharoah, M.J. Oral Radiology: Principles and Interpretation; Elsevier Health Sciences: Amsterdam, The Netherlands, 2014; ISBN 978-0-323-09634-8. [Google Scholar]
  54. Perschbacher, S. Interpretation of Panoramic Radiographs. Aust. Dent. J. 2012, 57, 40–45. [Google Scholar] [CrossRef]
  55. Ganzer, H.; Touger-Decker, R.; Parrott, J.S.; Murphy, B.A.; Epstein, J.B.; Huhmann, M.B. Symptom Burden in Head and Neck Cancer: Impact upon Oral Energy and Protein Intake. Support. Care Cancer 2013, 21, 495–503. [Google Scholar] [CrossRef]
  56. Wanifuchi, S.; Akashi, M.; Ejima, Y.; Shinomiya, H.; Minamikawa, T.; Furudoi, S.; Otsuki, N.; Sasaki, R.; Nibu, K.; Komori, T. Cause and Occurrence Timing of Osteoradionecrosis of the Jaw: A Retrospective Study Focusing on Prophylactic Tooth Extraction. Oral Maxillofac. Surg. 2016, 20, 337–342. [Google Scholar] [CrossRef]
  57. Ben-David, M.A.; Diamante, M.; Radawski, J.D.; Vineberg, K.A.; Stroup, C.; Murdoch-Kinch, C.-A.; Zwetchkenbaum, S.R.; Eisbruch, A. Lack of Osteoradionecrosis of the Mandible after Intensity-Modulated Radiotherapy for Head and Neck Cancer: Likely Contributions of Both Dental Care and Improved Dose Distributions. Int. J. Radiat. Oncol. Biol. Phys. 2007, 68, 396–402. [Google Scholar] [CrossRef]
  58. Mordohai, N.; Reshad, M.; Jivraj, S.; Chee, W. Factors That Affect Individual Tooth Prognosis and Choices in Contemporary Treatment Planning. Br. Dent. J. 2007, 202, 63–72. [Google Scholar] [CrossRef]
  59. Cabitza, F.; Rasoini, R.; Gensini, G.F. Unintended Consequences of Machine Learning in Medicine. JAMA 2017, 318, 517–518. [Google Scholar] [CrossRef] [PubMed]
  60. Licitra, L.; Trama, A.; Hosni, H. Benefits and Risks of Machine Learning Decision Support Systems. JAMA 2017, 318, 2354. [Google Scholar] [CrossRef] [PubMed]
  61. van der Vegt, A.H.; Scott, I.A.; Dermawan, K.; Schnetler, R.J.; Kalke, V.R.; Lane, P.J. Implementation Frameworks for End-to-End Clinical AI: Derivation of the SALIENT Framework. J. Am. Med. Inform. Assoc. 2023, 30, 1503–1515. [Google Scholar] [CrossRef] [PubMed]
  62. White, D.A.; Tsakos, G.; Pitts, N.B.; Fuller, E.; Douglas, G.V.A.; Murray, J.J.; Steele, J.G. Adult Dental Health Survey 2009: Common Oral Health Conditions and Their Impact on the Population. Br. Dent. J. 2012, 213, 567–572. [Google Scholar] [CrossRef]
Figure 1. Normalized confusion matrix for the internal (a) and external (b) datasets.
Figure 2. Comparison between ground truths (a,c) and predicted classes (b,d). Green labels mark dental treatments, yellow labels moderate-risk lesions, and red labels high-risk lesions, which are the target of pre-radiotherapy screening. IMP—implant; PRR—prosthetic restoration; OBT—dental filling; END—root canal treatment; CAR—carious lesion; BON—alveolar bone loss; IMT—impacted tooth; API—periapical lesion; ROT—root fragment; FUR—furcation lesion; ORD—orthodontic treatment; SRD—surgical device; APS—apical surgery.
Table 1. Evaluation metrics.
Metric | Formula | Definition
Precision | TP / (TP + FP) | The fraction of relevant instances among all retrieved instances.
Recall | TP / (TP + FN) | The fraction of retrieved instances among all relevant instances.
F1 | TP / (TP + (FP + FN)/2) | Combines precision and recall into a single score (their harmonic mean).
TP—true positives; FP—false positives; FN—false negatives.
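For readers implementing these metrics, the formulas in Table 1 can be sketched in a few lines of Python. This is only an illustration of the definitions, not the authors' evaluation code, and the example counts are invented:

```python
def detection_metrics(tp: int, fp: int, fn: int) -> dict:
    """Compute precision, recall, and F1 from raw detection counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    # F1 = TP / (TP + (FP + FN)/2), equivalent to the harmonic mean
    # of precision and recall.
    f1 = tp / (tp + 0.5 * (fp + fn)) if (tp + fp + fn) else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}

# Invented example: 80 true positives, 20 false positives, 40 false negatives.
print(detection_metrics(80, 20, 40))
```

The zero-denominator guards matter for classes with no predictions or no ground-truth instances (such as APS in Tables 2 and 3), where the ratios would otherwise be undefined.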
Table 2. Model performance on the internal dataset.
Class | Images | Instances | Precision | Recall
all | 100 | 1369 | 0.625 | 0.587
IMP | 100 | 25 | 0.878 | 0.96
PRR | 100 | 89 | 0.311 | 0.483
OBT | 100 | 391 | 0.778 | 0.506
END | 100 | 267 | 0.879 | 0.899
CAR | 100 | 141 | 0.626 | 0.511
BON | 100 | 118 | 0.285 | 0.169
IMT | 100 | 58 | 0.722 | 0.759
API | 100 | 80 | 0.779 | 0.412
ROT | 100 | 33 | 0.649 | 0.818
FUR | 100 | 18 | 0.584 | 0.235
APS | 100 | 4 | 0 | 0
ORD | 100 | 25 | 0.718 | 0.917
SRD | 100 | 120 | 0.92 | 0.964
IMP—implant; PRR—prosthetic restoration; OBT—dental filling; END—root canal treatment; CAR—carious lesion; BON—alveolar bone loss; IMT—impacted tooth; API—periapical lesion; ROT—root fragment; FUR—furcation lesion; ORD—orthodontic treatment; SRD—surgical device; APS—apical surgery.
Table 3. Model performance on the external validation set.
Class | Images | Instances | Precision | Recall
all | 180 | 2980 | 0.539 | 0.498
IMP | 180 | 16 | 0.619 | 0.875
PRR | 180 | 206 | 0.659 | 0.639
OBT | 180 | 897 | 0.677 | 0.666
END | 180 | 528 | 0.734 | 0.777
CAR | 180 | 296 | 0.466 | 0.439
BON | 180 | 680 | 0.437 | 0.107
IMT | 180 | 111 | 0.765 | 0.793
API | 180 | 106 | 0.367 | 0.396
ROT | 180 | 93 | 0.561 | 0.57
FUR | 180 | 19 | 0.155 | 0.211
APS | 180 | 17 | 0 | 0
ORD | 180 | 6 | 0.563 | 1
SRD | 180 | 5 | 1 | 0
IMP—implant; PRR—prosthetic restoration; OBT—dental filling; END—root canal treatment; CAR—carious lesion; BON—alveolar bone loss; IMT—impacted tooth; API—periapical lesion; ROT—root fragment; FUR—furcation lesion; ORD—orthodontic treatment; SRD—surgical device; APS—apical surgery.
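Since Tables 2 and 3 report only precision and recall, the per-class F1 defined in Table 1 can be recovered as their harmonic mean. A minimal sketch, using a few values transcribed from Table 3 (external set); the class selection is illustrative:

```python
def f1_from_pr(precision: float, recall: float) -> float:
    """Harmonic mean of precision and recall; defined as 0 when both are 0."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Selected external-set values from Table 3: class -> (precision, recall).
external = {"END": (0.734, 0.777), "BON": (0.437, 0.107), "APS": (0.0, 0.0)}
for cls, (p, r) in external.items():
    print(f"{cls}: F1 = {f1_from_pr(p, r):.3f}")
```

Note how the harmonic mean penalizes imbalance: BON's moderate precision (0.437) is dragged down by its very low recall (0.107), yielding an F1 well below either a simple average would suggest.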

Mureșanu, S.; Hedeșiu, M.; Iacob, L.; Eftimie, R.; Olariu, E.; Dinu, C.; Jacobs, R.; on behalf of Team Project Group. Automating Dental Condition Detection on Panoramic Radiographs: Challenges, Pitfalls, and Opportunities. Diagnostics 2024, 14, 2336. https://doi.org/10.3390/diagnostics14202336

