Article

Diagnostic Performance of AI-Assisted Software in Sports Dentistry: A Validation Study

1 Egas Moniz Center for Interdisciplinary Research (CiiEM), Egas Moniz School of Health & Science, 2829-511 Almada, Portugal
2 Casa Pia Atlético Clube, 1500-462 Lisbon, Portugal
3 UCL Eastman Dental Institute, London WC1E 6BT, UK
* Author to whom correspondence should be addressed.
AI 2025, 6(10), 255; https://doi.org/10.3390/ai6100255
Submission received: 30 July 2025 / Revised: 23 September 2025 / Accepted: 25 September 2025 / Published: 1 October 2025

Abstract

Artificial Intelligence (AI) applications in sports dentistry have the potential to improve early detection and diagnosis. We aimed to validate the diagnostic performance of AI-assisted software in detecting dental caries, periodontitis, and tooth wear using panoramic radiographs in elite athletes. This cross-sectional validation study included secondary data from 114 elite athletes from the Sports Dentistry department at Egas Moniz Dental Clinic. The AI software’s performance was compared to clinically validated assessments. Dental caries and tooth wear were inspected clinically and confirmed radiographically. Periodontitis was registered through self-reports. We calculated sensitivity, specificity, positive predictive value (PPV), and negative predictive value (NPV), as well as the area under the curve and respective 95% confidence intervals. Inter-rater agreement was assessed using Cohen’s kappa statistic. The AI software showed high reproducibility, with kappa values of 0.82 for caries, 0.91 for periodontitis, 0.96 for periapical lesions, and 0.76 for tooth wear. Sensitivity was highest for periodontitis (1.00; AUC = 0.84), moderate for caries (0.74; AUC = 0.69), and lower for tooth wear (0.53; AUC = 0.68). Full agreement between AI and clinical reference was achieved in 86.0% of cases. The software generated a median of 3 AI-specific suggestions per case (range: 0–16). In 21.9% of cases, the AI’s interpretation of periodontal level was deemed inadequate; among these, only 2 cases were clinically confirmed as periodontitis, and 11 accounted for 32.4% of the 34 overall false positives for periodontitis. The AI-assisted software demonstrated substantial agreement with clinical diagnosis, particularly for periodontitis and caries. The relatively high false-positive rate for periodontitis and limited sensitivity for tooth wear underscore the need for cautious clinical integration, supervision, and further model refinements. However, the software showed overall adequate performance for application in Sports Dentistry.

1. Introduction

The incorporation of artificial intelligence (AI) into clinical diagnostics is transforming healthcare through innovative opportunities for early screening, standardization, and enhanced efficiency [1,2,3]. In dentistry, AI applications have rapidly expanded; however, their validation in specialized areas remains limited [4,5,6,7].
Oral health in elite athletes holds critical relevance [8]. Evidence suggests that poor oral health not only affects general well-being but also compromises athletic performance, elevates the risk of systemic complications, and increases susceptibility to sports-related injuries [9,10]. Despite these implications, oral health continues to be an underappreciated aspect of sports medicine [11]. A recent study applying the Universal Screening Protocol for Dental Examinations in Sports (USPDES) revealed that oral diseases coexist in nearly one in two athletes, underscoring the urgent need for effective screening tools tailored to this high-performance population [12].
Sports Dentistry, as a growing discipline, faces unique implementation challenges, particularly related to athletes’ season-dependent schedules and the need for rapid, actionable diagnostics [8]. Although tools like the recent USPDES systematize assessments, they remain time-consuming and may be impractical in elite contexts [13]. In this setting, athlete-centered outcomes and self-reported tools have gained attention as valuable yet underutilized resources to streamline oral health monitoring [14]. Panoramic radiographs, although widely available, demand expert interpretation and may be affected by inter-examiner variability, particularly when rapid decisions are required. These constraints contribute to under-diagnosis and delayed management of oral conditions in elite settings.
AI-supported diagnostics, when validated, could help overcome the limitations of conventional protocols by offering scalable, reproducible, and time-efficient alternatives that align with elite training and competition calendars. This is especially relevant given pre-season and in-season demands [11,15], particularly in collective sports, where athletes’ training, competition, and travel schedules impose inherent logistical constraints. For this reason, AI may be well suited to standard complementary diagnostic tools such as the panoramic X-ray.
To address these needs, the present study evaluates the diagnostic accuracy of CE-marked AI software in detecting common oral conditions using panoramic radiographs from elite athletes. Additionally, this investigation characterizes the oral health burden in this population and explores the potential role of AI in enhancing Sports Dentistry protocols. To our knowledge, this study may represent the first validation of AI-assisted diagnostics in sports dentistry using CE-marked AI software, and it provides a comprehensive characterization of oral health burden in elite athletes, thereby contextualizing the role of AI within sports dentistry protocols.

2. Materials and Methods

2.1. Study Design and Setting

To explore the performance of the AI software, we implemented a secondary analysis of a dataset from a previous cross-sectional study on elite athletes [12], where participants provided and signed informed consent. This group was recruited at the Sports Dentistry department of a university clinic (Egas Moniz Dental Clinic, Almada, Portugal). The study was approved by the Institutional Review Board (Ethics Committee ID nº. 1101) and adhered to the Declaration of Helsinki. Data collection originally took place over a fourteen-month period, from July 2023 to November 2024.

2.2. Participants and Study Size

This dataset included 114 elite athletes aged 18 or older, actively training or competing. Exclusion criteria included recent dental treatment and the inability to undergo, or the absence of, panoramic radiography.
This is a secondary study that reuses data from athletes assessed using the European Association for Sports Dentistry (EA4SD)/Academy for Sports Dentistry (ASD) [13] protocol for oral observation in sports.

2.3. Panoramic X-Rays

Panoramic radiographs were acquired by a certified radiologist utilizing the Viso G5 (Planmeca, Helsinki, Finland) in accordance with the manufacturer’s guidelines. These images were subsequently stored in an internal cloud via the Romexis software system database (Planmeca, Helsinki, Finland). The quality of the panoramic radiographs was qualitatively evaluated by a single examiner (AJ) using the criteria proposed by Sabarudin and Tiau [16]. The assessment was adapted to two regions—the upper and lower arches—and applied an ordinal scale covering three domains: anatomical coverage, image density and contrast, and visibility of anatomical structures [16]. When teeth were missing or implants were present, the category “Not Applicable (NA)” was used. All radiographs were classified as high quality, with every domain scored 3 or 4; no image was deemed low quality (i.e., any domain scored 1 or 2).

2.4. AI Software for Diagnosis

Recent advances in neural architecture search, including Pareto-wise ranking approaches for multiobjective optimization [17], provide a framework for refining such models, which may further enhance the detection of subtle radiographic changes in elite athletes.
For this study, we used WeDiagnostiX® [18], a CE-marked, commercially available, AI-powered diagnostic software designed for automated analysis of dental panoramic radiographs. WeDiagnostiX® was used in its standard operating mode, without manual overrides or post-processing. Developed for use in general and specialized dentistry, it integrates deep learning algorithms trained on large datasets to identify radiographic signs of oral health conditions, including but not restricted to dental caries, periodontal bone loss, and tooth wear patterns. The software provides color-coded overlays on radiographic images, along with probability scores for each detected condition, to support clinical decision-making. Its outputs were recorded as binary outcomes (presence or absence of disease) in a predefined Microsoft Excel table for comparison against validated reference standards.
We only accounted for information regarding the conditions defined for the goals of the study (dental caries, periodontitis, and periapical lesions). Information registered by the system as ‘possible bruxism’ was treated as tooth wear and complemented with observer supervision. Whenever adequate, observers were allowed to supervise and override the system’s diagnoses, and this information was registered. Radiographic periodontal bone loss was assigned when at least one sextant showed 20% or more AI-estimated periodontal bone loss, in line with a previously validated threshold [19,20].
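As a minimal sketch of this case-level rule (illustrative only; the function name and input format are our own and not part of WeDiagnostiX®), the sextant threshold can be expressed as:

```python
def radiographic_periodontitis(sextant_bone_loss_pct):
    """Case-level flag: True when at least one sextant shows >= 20%
    AI-estimated periodontal bone loss (the threshold stated in the Methods)."""
    return any(pct >= 20 for pct in sextant_bone_loss_pct)

# Six hypothetical per-sextant bone-loss percentages for one radiograph
print(radiographic_periodontitis([5, 12, 22, 0, 8, 10]))  # one sextant at 22% -> True
```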
The software was used on Macintosh systems at maximum screen brightness.

2.5. Measurement Reproducibility

To assess the reproducibility of the reference assessments, two independent observers (CR and JB) evaluated all panoramic radiographs independently for the presence of caries, periodontitis, and tooth wear. Both examiners were calibrated prior to the evaluation phase and blinded to the clinical observation results and each other’s assessments. Inter-rater agreement was computed using Cohen’s kappa (κ) coefficient [21] for each condition and respective 95% confidence interval (CI), to measure the degree of agreement between observers beyond what would be expected by chance.
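For binary ratings, Cohen’s κ can be computed directly from the two observers’ calls. A minimal sketch with hypothetical ratings (unweighted κ, without the confidence interval the study also reports):

```python
def cohens_kappa(rater_a, rater_b):
    """Unweighted Cohen's kappa for two raters' binary (0/1) labels."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement from each rater's marginal positive rate
    p_a = sum(rater_a) / n
    p_b = sum(rater_b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)
    return (observed - expected) / (1 - expected)

# Hypothetical calls for 10 radiographs (1 = condition present)
a = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
b = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
print(round(cohens_kappa(a, b), 2))  # -> 0.8
```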

2.6. Oral Diseases

Caries prevalence was assessed using the International Caries Detection and Assessment System II (ICDAS II) [22], considering its higher sensitivity in estimating caries prevalence compared to other criteria [23].
According to the ICDAS criteria, sites were evaluated using a scoring system ranging from 0 to 6: 0 indicates sound enamel; 1 represents the initial visual change in enamel; 2 signifies a clear visual change in enamel; 3 denotes localized enamel breakdown without visible signs of dentin involvement; 4 indicates an underlying dark shadow from dentin; 5 represents a distinct cavity with visible dentin; and 6 signifies an extensive distinct cavity with visible dentin. Patients with at least one lesion scored 4 or higher were classified as having caries present. Patients with carious lesions received the necessary treatment.
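The patient-level dichotomization described above can be sketched in a few lines (an illustration of the study’s rule, not the authors’ code):

```python
def caries_present(icdas_scores):
    """Patient-level caries status under the study's rule:
    present when any lesion is scored 4 or higher on ICDAS II (0-6 scale)."""
    return any(score >= 4 for score in icdas_scores)

print(caries_present([0, 2, 3, 5, 1]))  # a score-5 lesion -> True
print(caries_present([0, 1, 2, 3]))     # no score >= 4 -> False
```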
Tooth wear was documented using the Basic Erosive Wear Examination (BEWE) [24] with the exception of third molars. In each sextant, the surface with the most significant wear was recorded, and these scores were totaled. We then categorized the results as follows: no risk (BEWE ≤ 2) and at risk (BEWE ≥ 3) [24].
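The BEWE aggregation described above (worst surface per sextant, summed, then dichotomized at 3) can be sketched as follows, with hypothetical surface scores:

```python
def bewe_total(sextant_surface_scores):
    """BEWE total: record the worst (highest) surface score in each sextant, then sum."""
    return sum(max(scores) for scores in sextant_surface_scores)

def at_risk(total):
    """Dichotomization used in the study: no risk (BEWE <= 2) vs. at risk (BEWE >= 3)."""
    return total >= 3

# Hypothetical surface scores for six sextants (third molars excluded)
sextants = [[0, 1], [0, 0], [1, 1], [0, 2], [0, 0], [0, 0]]
total = bewe_total(sextants)
print(total, at_risk(total))  # -> 4 True
```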
The assessment of periodontal health was conducted through a previously validated thirteen-question self-report instrument. Two of these questions demonstrated predictive ability, with an area under the curve of 0.8 for the number of teeth lost as observed clinically [25]. The remaining questions covered the health of gums and teeth, the condition of loose teeth, bone loss, the appearance of teeth, and the use of dental floss and mouthwash.

2.7. Bias Controls

Independent validation was used as a reference standard. Operators were blinded to AI predictions during manual assessments [26].

2.8. Statistical Analysis

Binary classification was used (condition present vs. absent). We computed diagnostic accuracy metrics (sensitivity, specificity, positive predictive value [PPV], and negative predictive value [NPV]) and constructed ROC curves for each condition. The clinical diagnosis was compared against the predicted condition to generate the ROC curve and compute the area under the curve (AUC). An AUC ≥ 0.80 indicates good to excellent discriminative ability, while an AUC ≥ 0.70 indicates acceptable discriminative ability.
All analyses were conducted in R version 3.1 (www.r-project.org) and SPSS software 29.0 (IBM, New York, NY, USA). Statistical significance was set at p < 0.05.
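The four accuracy metrics follow directly from a 2×2 confusion table. A minimal sketch in Python (the counts are hypothetical, chosen only to roughly reproduce the periodontitis row of Table 2; they are not taken from the study data):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV, and NPV from 2x2 confusion-table counts."""
    return {
        "sensitivity": tp / (tp + fn),  # true positives among diseased
        "specificity": tn / (tn + fp),  # true negatives among healthy
        "ppv": tp / (tp + fp),          # precision of a positive call
        "npv": tn / (tn + fn),          # reliability of a negative call
    }

# Hypothetical counts for a 114-case cohort with 34 false positives
m = diagnostic_metrics(tp=7, fp=34, fn=0, tn=73)
print({k: round(v, 2) for k, v in m.items()})
```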

3. Results

3.1. Participants and Characteristics

The cohort was predominantly male (68.4%, n = 78), with a mean age of 25.4 (±5.2) years. Most practiced football (80.7%, n = 92), basketball (7.0%, n = 8), or martial arts (4.4%, n = 5). Age differences between males and females were not significant (p = 0.113) (Table 1). The prevalence of dental caries, periodontitis and tooth wear was 47.4%, 38.5% and 7.7%, respectively.

3.2. Agreement of Observers

The κ values were 0.82 (95% CI: 0.72–0.91) for caries, 0.91 (95% CI: 0.83–0.99) for periodontitis, 0.96 (95% CI: 0.93–0.99) for periapical lesions, and 0.76 (95% CI: 0.64–0.88) for tooth wear, indicating high reproducibility and consistency of the clinical reference standard.

3.3. Performance of AI Software vs. Clinical Diagnosis

The κ values were 0.82 (95% CI: 0.72–0.91) for caries, 0.91 (95% CI: 0.83–0.99) for periodontitis, 0.96 (95% CI: 0.93–0.99) for periapical lesions, and 0.76 (95% CI: 0.64–0.88) for tooth wear (Figure 1), indicating high reproducibility of the AI software against the clinical reference standard. The sensitivity, specificity, PPV, and NPV for each condition are presented in Table 2.
The overall agreement was considered substantial. Complete agreement across all three conditions (caries, periodontitis, and tooth wear) was achieved in 98 of the 114 cases (86.0%), whereas partial agreement was obtained in 7 cases (6.1%). The remaining cases (n = 9, 7.9%) showed no agreement; all were related to a single condition and were resolved by consensus, without the need to involve a third examiner.
The software registers so-called ‘AI specifics’ that are predefined suggestions from the software. The median number of specifics was 3, with a minimum of 0 and a maximum of 16.
In twenty-five cases (21.9%), the periodontal level display was considered inadequate. Of those, only two AI-suggested diagnoses were clinically confirmed as periodontitis, while eleven cases were flagged as periodontitis by the AI without clinical confirmation, accounting for 11 of the 34 overall false positives (32.4%).

4. Discussion

This study assessed the diagnostic performance of AI-assisted radiographic software in detecting oral health conditions among elite athletes, under the supervision of observers, compared to clinical gold-standard or self-reported validated methods. The results demonstrated high reproducibility in clinical assessments and substantial agreement between AI-generated diagnoses and reference standards, while also indicating that implementation requires clinical supervision. These findings support the potential integration of AI tools into Sports Dentistry workflows to facilitate more efficient and scalable diagnostic approaches.
The AI system showed excellent sensitivity for periodontitis (1.00; AUC = 0.84), aligning with previous findings that AI models can effectively identify radiographic signs of periodontal bone loss. However, a notable proportion of false positives (32.4%) was recorded, often due to overestimation of periodontal involvement by the AI. This observation underscores the importance of supervised interpretation, especially in populations with low disease prevalence or in cases where radiographic artifacts and anatomical variability may mimic pathology. For caries detection, the AI system yielded acceptable sensitivity (0.74; AUC = 0.69), comparable to previous studies using panoramic radiographs and AI, although slightly below thresholds typically seen with bitewing-based models. The system’s performance for tooth wear was more modest (sensitivity = 0.53; AUC = 0.68), likely reflecting the limitations of panoramic imaging in capturing occlusal and incisal wear facets, and the multifactorial nature of tooth wear patterns in athletes.
For periapical lesions, the performance obtained here aligns with that reported by the manufacturer, whose internal study refers to nearly 99% success [18].
The substantial overall agreement (86%) between the AI and clinical reference underscores the reproducibility and utility of this tool. However, 21.9% of the radiographic displays for periodontal levels were identified as inadequate, indicating a technical limitation that should be addressed in future software iterations. In clinical practice, particularly within the constrained timelines of sports seasons, tools that reduce time and subjectivity in diagnosis can significantly enhance care delivery. Nevertheless, this must be balanced with the necessity for clinical oversight to prevent overdiagnosis or misclassification.
These findings align with the current priorities in Sports Dentistry. As previously highlighted, the sports dentist encounters logistical challenges due to athletes’ rigorous training schedules, seasonality, and travel commitments. While traditional full-mouth examinations are comprehensive, they are often impractical in this context. AI-supported diagnostics and validated self-report tools offer a pragmatic alternative, potentially facilitating routine, preseason screening that integrates seamlessly with the broader sports medicine framework. Furthermore, the emphasis on athlete-centered outcomes and simplified protocols reflects a paradigm shift toward more personalized and accessible care models.
In elite athletes, oral health is influenced by unique physiological and behavioral patterns, including higher carbohydrate intake, dehydration, salivary changes, and increased incidence of orofacial trauma. These factors may affect the presentation and detectability of dental diseases in radiographic images, potentially altering AI diagnostic behavior when applied to this subgroup. Therefore, validating AI performance specifically in elite athletes is essential to ensure that diagnostic algorithms are reliable and clinically useful in the high-performance sports setting.

Strengths and Limitations

One notable strength of this study is the utilization of a real-world cohort of elite athletes alongside a CE-marked, commercially available AI tool. The methodological rigor is further enhanced by the blinding of observers, meticulous calibration, and the use of validated diagnostic references. Nonetheless, several limitations warrant consideration. Firstly, periodontitis was assessed through self-reporting, which, despite validation, may lead to underestimation or overestimation of disease prevalence. Athletes, in particular, may underreport symptoms due to low awareness of periodontal disease or a tendency to minimize health concerns that could affect their training or selection. This limitation may therefore have led to both underestimation and overestimation of disease prevalence, and the findings should be interpreted with caution. Secondly, the reliance on panoramic radiographs constrains the resolution and anatomical detail necessary for optimal assessment of certain conditions, particularly early carious lesions and incisal wear. Thirdly, the sample predominantly comprises football players, which may restrict the generalizability of findings to athletes in non-contact or endurance sports with distinct oral health risk profiles.
Clinical implications of false positives and low sensitivity are critical in this context. False positives for periodontitis may lead to unnecessary referrals, imaging, and overtreatment, disrupting athletes’ training and increasing costs. The moderate sensitivity for tooth wear raises underdiagnosis risks, delaying prevention measures like dietary counseling and protective appliances. Since athletes face erosive challenges from sports drinks, altered salivary flow, and stress-related bruxism, undetected tooth wear may progress until function and esthetics are affected. These findings highlight the need for AI-assisted diagnostics under clinical supervision to prevent disease overestimation while ensuring significant conditions are not missed.
Future research should continue to investigate the role of AI in Sports Dentistry, particularly through prospective study designs and real-time clinical integration within athletic settings. Longitudinal and interventional trials will be crucial to establish causality, validate predictive models, and assess the clinical utility of AI-driven decision support systems in preventing and managing orofacial injuries, dental trauma, and performance-related oral health issues. Moreover, multimodal approaches that integrate AI with athlete self-reported outcomes, salivary diagnostics, and wearable technologies—such as mouthguards equipped with biosensors or smart monitoring devices—hold promise for developing individualized risk profiles and dynamic monitoring strategies. These synergies could enable earlier detection of pathological changes, timely interventions, and more tailored preventive programs, ultimately enhancing both oral and systemic health in sports professionals and recreational athletes. Importantly, future work should also address ethical considerations, data privacy, and the need for standardized protocols to ensure safe, equitable, and widespread implementation of AI technologies in sports dentistry.

5. Conclusions

The AI-assisted software exhibited a substantial concordance with clinical diagnoses, particularly in cases of periodontitis and dental caries. This approach still requires cautious clinical integration, ongoing supervision, and further refinement of the model, despite adequate performance in this setting.

Supplementary Materials

The following supporting information can be downloaded at https://www.mdpi.com/article/10.3390/ai6100255/s1.

Author Contributions

Conceptualization, A.J. and J.B.; methodology, C.R. and J.B.; software, J.B.; validation, V.M., C.R. and J.B.; formal analysis, J.B.; writing—original draft preparation: all authors; writing—review and editing, all authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research received support from FCT/MCTES to CiiEM (10.54499/UIDB/04585/2020) through national funds.

Institutional Review Board Statement

The study was conducted in accordance with the Declaration of Helsinki and approved by the Institutional Review Board (or Ethics Committee) of Egas Moniz (protocol code 1101).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

All data generated or analyzed during this study are included in this published article as a Supplementary Information File.

Acknowledgments

We acknowledge DNI Medicina e Saúde, Lda for the logistic and technical support provided in analyzing athletes.

Conflicts of Interest

The authors declare no conflicts of interest. Author Dr. Gabriel Nogueira is employed by Casa Pia Atlético Clube, Lisbon, Portugal.

References

  1. D’Adderio, L.; Bates, D.W. Transforming Diagnosis through Artificial Intelligence. npj Digit. Med. 2025, 8, 54. [Google Scholar] [CrossRef]
  2. Qiu, W.; Quan, C.; Yu, Y.; Kara, E.; Qian, K.; Hu, B.; Schuller, B.W.; Yamamoto, Y. Federated Abnormal Heart Sound Detection with Weak to No Labels. Cyborg Bionic Syst. 2024, 5, 0152. [Google Scholar] [CrossRef]
  3. Wei, J.; Chen, H.; Yao, L.; Hou, X.; Zhang, R.; Shi, L.; Sun, J.; Hu, C.; Wei, X.; Jia, W. BioCompNet: A Deep Learning Workflow Enabling Automated Body Composition Analysis toward Precision Management of Cardiometabolic Disorders. Cyborg Bionic Syst. 2025, 6, 0381. [Google Scholar] [CrossRef] [PubMed]
  4. Lal, A.; Nooruddin, A.; Umer, F. Concerns Regarding Deployment of AI-Based Applications in Dentistry—A Review. BDJ Open 2025, 11, 27. [Google Scholar] [CrossRef] [PubMed]
  5. Schwendicke, F.; Samek, W.; Krois, J. Artificial Intelligence in Dentistry: Chances and Challenges. J. Dent. Res. 2020, 99, 769–774. [Google Scholar] [CrossRef]
  6. Shan, T.; Tay, F.R.; Gu, L. Application of Artificial Intelligence in Dentistry. J. Dent. Res. 2021, 100, 232–244. [Google Scholar] [CrossRef]
  7. Mallineni, S.K.; Sethi, M.; Punugoti, D.; Kotha, S.B.; Alkhayal, Z.; Mubaraki, S.; Almotawah, F.N.; Kotha, S.L.; Sajja, R.; Nettam, V.; et al. Artificial Intelligence in Dentistry: A Descriptive Review. Bioengineering 2024, 11, 1267. [Google Scholar] [CrossRef]
  8. Needleman, I.; Ashley, P.; Fine, P.; Haddad, F.; Loosemore, M.; de Medici, A.; Donos, N.; Newton, T.; van Someren, K.; Moazzez, R.; et al. Consensus Statement: Oral Health and Elite Sport Performance. Br. Dent. J. 2014, 217, 587–590. [Google Scholar] [CrossRef]
  9. Buti, J.; Ronca, F.; Burgess, P.W.; Gallagher, J.; Ashley, P.; Needleman, I. Association between Periodontitis and Physical Fitness in Law Enforcement Workers. Clin. Oral Investig. 2025, 29, 99. [Google Scholar] [CrossRef]
  10. Solleveld, H.; Slaets, B.; Goedhart, A.; VandenBossche, L. Associations of Masticatory Muscles Asymmetry and Oral Health with Postural Control and Leg Injuries of Elite Junior Soccer Players. J. Hum. Kinet. 2022, 84, 21–31. [Google Scholar] [CrossRef] [PubMed]
  11. Júdice, A.; Botelho, J.; Machado, V.; Proença, L.; Ferreira, L.M.; Fine, P.; Mendes, J.J. Sports Dentistry Intricacies with Season-Related Challenges and the Role of Athlete-Centered Outcomes. Front. Oral Health 2025, in press. [Google Scholar] [CrossRef]
  12. Júdice, A.; Brandão, D.; Botelho, J.; Machado, V.; Proença, L.; Ferreira, L.M.A.; Stamos, A.; Fine, P.; Mendes, J.J. Elite Athletes’ Overall Oral Health, Values and Related Quality of Life: A Cross-Sectional Study. Sci. Rep. 2025, 15, 25564. [Google Scholar] [CrossRef]
  13. Stamos, A.; Engels-Deutsch, M.; Cantamessa, S.; Dartevelle, J.; Crouzette, T.; Haughey, J.; Grosso, F.D.; Avgerinos, S.; Fritsch, T.; Nanussi, A.; et al. A Suggested Universal Protocol for Dental Examination in Sports. Dent. Traumatol. 2023, 39, 521–530. [Google Scholar] [CrossRef]
  14. Perazzo, M.F.; Serra-Negra, J.M.; Firmino, R.T.; Pordeus, I.A.; Martins-Júnior, P.A.; Paiva, S.M. Patient-Centered Assessments: How Can They Be Used in Dental Clinical Trials? Braz. Oral Res. 2020, 34, e075. [Google Scholar] [CrossRef]
  15. Pandey, A. Artificial Intelligence, Sports, and Dentistry: Synergistic Advances and Future Applications. Biomed. Res. Clin. Rev. 2024, 8, 1–3. [Google Scholar]
  16. Sabarudin, A.; Tiau, Y.J. Image Quality Assessment in Panoramic Dental Radiography: A Comparative Study between Conventional and Digital Systems. Quant. Imaging Med. Surg. 2013, 3, 43–48. [Google Scholar] [CrossRef]
  17. Ma, L.; Li, N.; Yu, G.; Geng, X.; Cheng, S.; Wang, X.; Huang, M.; Jin, Y. Pareto-Wise Ranking Classifier for Multiobjective Evolutionary Neural Architecture Search. IEEE Trans. Evol. Comput. 2023, 28, 570–581. [Google Scholar] [CrossRef]
  18. AI-Powered Dental X-Ray Report|Enhance Your Analysis with WeDiagnostiX—WeDiagnostix 2024. Available online: https://wediagnostix.ai/en/automated-dental-x-ray-analysis-with-ai (accessed on 1 July 2025).
  19. Rydén, L.; Buhlin, K.; Ekstrand, E.; de Faire, U.; Gustafsson, A.; Holmer, J.; Kjellström, B.; Lindahl, B.; Norhammar, A.; Nygren, Å.; et al. Periodontitis Increases the Risk of a First Myocardial Infarction: A Report From the PAROKRANK Study. Circulation 2016, 133, 576–583. [Google Scholar] [CrossRef]
  20. Nordendahl, E.; Gustafsson, A.; Norhammar, A.; Näsman, P.; Rydén, L.; Kjellström, B.; PAROKRANK Steering Committee. Severe Periodontitis Is Associated with Myocardial Infarction in Females. J. Dent. Res. 2018, 97, 1114–1121. [Google Scholar] [CrossRef]
  21. Cohen, J. Statistical Power Analysis for the Behavioral Sciences; Routledge: Oxfordshire, UK, 2013; ISBN 978-1-134-74270-7. [Google Scholar]
  22. Ismail, A.I.; Sohn, W.; Tellez, M.; Amaya, A.; Sen, A.; Hasson, H.; Pitts, N.B. The International Caries Detection and Assessment System (ICDAS): An Integrated System for Measuring Dental Caries. Community Dent. Oral Epidemiol. 2007, 35, 170–178. [Google Scholar] [CrossRef]
  23. Alves, L.S.; Susin, C.; Damé-Teixeira, N.; Maltz, M. Impact of different detection criteria on caries estimates and risk assessment. Int. Dent. J. 2018, 68, 144–151. [Google Scholar] [CrossRef] [PubMed]
  24. Bartlett, D.; Ganss, C.; Lussi, A. Basic Erosive Wear Examination (BEWE): A New Scoring System for Scientific and Clinical Needs. Clin. Oral Investig. 2008, 12, 65–68. [Google Scholar] [CrossRef]
  25. Machado, V.; Lyra, P.; Santos, C.; Proença, L.; Mendes, J.J.; Botelho, J. Self-Reported Measures of Periodontitis in a Portuguese Population: A Validation Study. J. Pers. Med. 2022, 12, 1315. [Google Scholar] [CrossRef] [PubMed]
  26. Moon, K.; Rao, S. Assessment of the Risk of Bias. In Principles and Practice of Systematic Reviews and Meta-Analysis; Patole, S., Ed.; Springer International Publishing: Cham, Switzerland, 2021; pp. 43–55. ISBN 978-3-030-71920-3. [Google Scholar]
Figure 1. Discriminative performance of WeDiagnostix® for caries (orange), self-reported periodontitis (purple), tooth wear (green), and periapical lesions (red) using categorical variables. Receiver operating characteristic (ROC) curves with area under the curve (AUC) values and 95% confidence intervals (CI). Original cut-off values for the diagnosis decision were confirmed.
Table 1. Participants characteristics.
                                       Total (N = 114)   Male (n = 78)   Female (n = 36)   p-Value
Age, mean (SD)                         25.4 (5.2)        25.9 (5.3)      24.4 (4.7)        0.113
Erosion risk (BEWE ≥ 3), % (n)         35.1 (40)         38.5 (30)       27.8 (10)         0.368
Dental caries (≥1 ICDAS ≥ 4), % (n)    47.4 (54)         44.9 (35)       52.8 (19)         0.726
Periodontitis, % (n)                   5.3 (6)           7.7 (6)         0 (0)             0.167
Table 2. Sensitivity, Specificity, PPV, and NPVs for each condition.
                      Sensitivity   Specificity   PPV    NPV
Caries                0.74          0.65          0.66   0.74
Periodontitis         1.00          0.68          0.17   1.00
Tooth wear            0.53          0.84          0.64   0.77
Periapical lesions    0.95          0.98          0.90   0.99
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Share and Cite

MDPI and ACS Style

Júdice, A.; Brandão, D.; Rodrigues, C.; Simões, C.; Nogueira, G.; Machado, V.; Ferreira, L.M.A.; Ferreira, D.; Proença, L.; Botelho, J.; et al. Diagnostic Performance of AI-Assisted Software in Sports Dentistry: A Validation Study. AI 2025, 6, 255. https://doi.org/10.3390/ai6100255

