Review

Artificial Intelligence in Digestive Endoscopy—Where Are We and Where Are We Going?

1 Institute of Gastroenterology and Hepatology, Saint Spiridon Hospital, “Grigore T. Popa” University of Medicine and Pharmacy, 700111 Iași, Romania
2 Institute of Computer Science, Romanian Academy—Iași Branch, 700481 Iași, Romania
* Author to whom correspondence should be addressed.
Diagnostics 2022, 12(4), 927; https://doi.org/10.3390/diagnostics12040927
Submission received: 28 February 2022 / Revised: 30 March 2022 / Accepted: 6 April 2022 / Published: 8 April 2022
(This article belongs to the Special Issue Modern Imaging and Computer-Aided Diagnosis in Gastroenterology)

Abstract
Artificial intelligence, a computer-based concept that tries to mimic human thinking, is slowly becoming part of the endoscopy lab. It has developed considerably since the first attempt at an automated medical diagnostic tool, and today it has been adopted in almost all medical fields, digestive endoscopy included. The detection rate of preneoplastic lesions (i.e., polyps) during colonoscopy may be increased with artificial intelligence assistance. It has also proven useful in detecting signs of ulcerative colitis activity. In upper digestive endoscopy, deep learning models may prove useful in the diagnosis and management of upper digestive tract diseases, such as gastroesophageal reflux disease, Barrett’s esophagus, and gastric cancer. As with all new medical devices, there are challenges to implementation in daily medical practice: regulatory, economic, and organizational-culture barriers, as well as the language barrier between humans and machines, are a few of them. Even so, many devices have been approved for use by their respective regulators. Researchers are currently striving to develop deep learning models that can replicate a growing amount of human brain activity. In conclusion, artificial intelligence may become an indispensable tool in digestive endoscopy.

1. Introduction

Artificial intelligence (AI) is a computer model created to mimic human behavior [1]. In various medical fields, this technology has made its presence felt, in many cases improving diagnosis, treatment, and disease follow-up procedures.
The first attempt to automate medical diagnosis and treatment recommendations was the well-known MYCIN study [2,3]. It was a backward chaining expert system that used AI to identify bacteria and recommend antibiotics (hence the name, MYCIN).
In the early years of AI, many classic methods were employed, from rule-based systems to neural networks, statistical methods, and signal and image processing, some of them using fuzzy, probability, possibility, or chaos theories. These methods ran mainly offline because of their time-consuming computations. The big step came in the last decade, when the diversification of machine learning and deep learning architectures was sustained by the development of new devices using parallel computing and multi-core graphics processing units (GPUs) [4].
Today, real-time tools that researchers struggled for over half a century to develop assist various medical procedures. Devices such as the NVIDIA Jetson microsystem series and NVIDIA GPU boards are making a difference in the field of real-time medical applications.

2. Artificial Intelligence in Colonoscopy

GLOBOCAN, the online database that provides statistical information on 36 types of cancer in 185 countries, was updated in 2020 by the International Agency for Research on Cancer (IARC) [5]. The research estimated over 19 million new cases across all cancers, both sexes, and all ages. Breast cancer was the most commonly diagnosed cancer (11.7% of all cases), followed by lung (11.4%), colorectal (CRC, 10%), prostate (7.3%), stomach (5.6%), liver (4.7%), cervix uteri (3.1%), and esophagus (3.1%) cancers, with other cancers accounting for the remaining 42.9% [6]. The mortality data showed that lung cancer was still the leading cause of cancer death (18%), followed by CRC (9.4%), liver (8.3%), stomach (7.7%), breast (6.9%), esophagus (5.5%), pancreas (4.7%), and prostate (3.8%) cancers. Almost 10 million cancer deaths occurred in 2020 [6].
CRC is a major public health problem. According to GLOBOCAN 2020 estimates, CRC is still the third most commonly diagnosed cancer and the second leading cause of cancer death, being responsible for over 900,000 deaths worldwide in 2020 [6,7]. More concerning is the fact that recent studies found that the pattern of CRC incidence is changing, noting a rise in early-onset cases, especially in high-income countries. Although a direct cause has not been found, the most likely culprit would be early-age exposure to large bowel carcinogens [8].
The diagnosis and management of many diseases have improved because of technological advancements in the healthcare industry and easier access to large medical databases for research. Integrating new technologies into clinical practice may be a key factor. Despite significant advances in the field, CRC diagnosis still relies on colonoscopy, an investigation considered essential in reducing CRC incidence and mortality [9]. Early detection is becoming increasingly important for both the medical community and the public, and many CRC screening programs have been launched since 2007; fifteen of 28 European countries had population-based CRC screening programs in 2019 [10].
A screening program is considered successful when an early CRC diagnosis is made, and precancerous lesions are diagnosed and treated. At the same time, such a program may be subject to multiple limitations. The increased number of procedures per endoscopist might be correlated with the increased adenoma miss rate, especially for diminutive polyps (≤5 mm). Moreover, during a busy work schedule, operator fatigue is associated with poorer colonoscopy performance. This may negatively affect the polyp detection rate, resulting in a low adenoma detection rate (ADR) [11]. In order to objectively assess the colonoscopy performance, criteria such as preprocedural colon preparation rate, cecal intubation rate, correct identification and management of lesions, and correct postprocedural follow-ups have to be considered [12]. It appears that technological advancement, including AI, might be a solution to these limitations.
The use of AI for colonoscopy has been studied over the last decade. Numerous papers highlight the benefits of implementing such an application in current medical practice [13]. According to current European Society of Gastrointestinal Endoscopy (ESGE) guidelines, all polyps larger than 5 mm need to be removed and sent for histopathological analysis to maximize screening effectiveness. The same applies to all diminutive polyps (≤5 mm) with adenomatous features. Diminutive polyps located in the rectum and the sigmoid colon characterized as hyperplastic by a high-confidence optical diagnosis can be left in situ or managed with a “resect and discard” strategy [14,15]. Most digestive endoscopists resect polyps and send them to histopathology because optical diagnosis tools are only available in expert medical institutions. Costs may increase, and physicians may become fatigued as a result. Technological development, including AI integration, may reduce this setback in the long run.
In endoscopy, AI has introduced two concepts: computer-assisted detection (CADe) and computer-assisted diagnosis (CADx). With CADe, the AI model acts as a “second pair of eyes” for the colonoscopist [16]. Its capacity to identify polyps may help increase the investigator’s ADR, especially for diminutive polyps. This technology is especially useful for beginner colonoscopists, who may achieve results similar to those of experts [17,18]. It also eliminates the need for immediate endoscopic reassessment and therefore reduces healthcare costs [19]. A meta-analysis published in 2021 highlighted the advantages of using CADe compared to other colonoscopy methods such as high-definition white-light endoscopy, chromoendoscopy (dye-based or Narrow Band Imaging [NBI]), or add-on devices, i.e., systems that increase mucosal visualization, such as full-spectrum endoscopy (FUSE) or G-EYE balloon endoscopy. This study showed that CADe is superior to high-definition white-light colonoscopy (an increase in ADR of 7.5%). Furthermore, both chromoendoscopy and increased mucosal visualization systems achieved better ADR (4.4% and 4.1%, respectively) compared to high-definition white-light colonoscopy. Cross-comparison of CADe with chromoendoscopy and with increased mucosal visualization systems showed higher ADR with CADe (OR 1.45 [95% CI 1.14–1.85], moderate certainty of evidence, and OR 1.54 [95% CI 1.22–1.94], low certainty of evidence, respectively) [20].
A CADx system is used to classify a lesion based on several morphological parameters (surface, vascular patterns, shape, size, location), thus generating probability scores for malignancy or non-malignancy [21]. A paper published in 2021 [22] showed that CADx using white light colonoscopy has a sensitivity of 95.5% and a specificity of 84.4%, resulting in an accuracy of 93.2%. CADx using blue light colonoscopy showed slightly superior results: sensitivity 96.3%, specificity 88.7%, and accuracy 94.7%. These data highlight the ability of CADx systems to diagnose the polyp type and thus help with the correct management. Moreover, it may enable one to decide whether to resect a lesion and send it to pathology, resect and discard, or even leave it in place and monitor it over time [14]. More studies are necessary before such an approach may be recommended.
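The sensitivity, specificity, and accuracy figures quoted throughout this review are simple functions of the underlying confusion-matrix counts. A minimal sketch of that arithmetic (the counts below are hypothetical, chosen only to illustrate the calculation, not taken from the cited study):

```python
def diagnostic_metrics(tp: int, fp: int, tn: int, fn: int):
    """Compute standard diagnostic performance metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)            # true-positive rate: neoplastic lesions correctly flagged
    specificity = tn / (tn + fp)            # true-negative rate: benign lesions correctly cleared
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

# Hypothetical counts for illustration only
sens, spec, acc = diagnostic_metrics(tp=191, fp=14, tn=76, fn=9)
print(f"sensitivity={sens:.1%} specificity={spec:.1%} accuracy={acc:.1%}")
```

Note that accuracy depends on the mix of neoplastic and non-neoplastic lesions in the test set, which is why two systems with identical sensitivity and specificity can report different accuracies on different cohorts.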
NBI-CADx has attracted the most attention from the research community, and numerous studies have highlighted the potential benefits of these two technologies working together. In 2018, Chen et al. developed an NBI-AI system capable of differentiating polyps with a sensitivity of 96.3%, specificity of 78.1%, and accuracy of 91% [23]. Zachariah et al. published a paper in 2020 describing the development of their NBI-CADx system based on convolutional neural networks. The AI model managed to exceed the ASGE PIVI (American Society for Gastrointestinal Endoscopy Preservation and Incorporation of Valuable endoscopic Innovations) standard for both “resect and discard” and “diagnose and leave” strategies. This AI system achieved an accuracy of 94% with NBI and was able to correctly classify diminutive polyps irrespective of the endoscopists’ experience [24]. In 2020, Song et al. developed a CADx model for predicting colorectal polyp histology from NBI images. The study included both trainees and expert NBI endoscopists. The AI system proved to have higher diagnostic accuracy than the trainees (81–82% vs. 63.8–71.8%) and results similar to those of the expert endoscopists. This study concluded that the CADx-NBI system could be an important tool for improving trainees’ diagnostic accuracy [25]. Similarly, Jin et al. showed that their AI system for NBI achieves high accuracy in distinguishing adenomas from hyperplastic polyps; moreover, their trainees reached a near-expert level of accuracy without needing extensive training [26].
Chromoendoscopy has been less appealing for AI development. Here, the pit pattern classification depends on the depth of color: the more dye is sprayed, the better the image quality. Because it is difficult to obtain uniform image quality, a CADx system may have a hard time learning these patterns [27]. Even so, an early study led by Takemura et al. in 2010 achieved an accuracy of 98.5% with a CADx model applied to chromoendoscopy [28].
Endocytoscopy is a novel in vivo imaging technique capable of offering a microscopic-like view of the mucosa in real time. Given the large quantity of information this technique extracts (i.e., cellular and microvascular patterns), integrating an AI system into endocytoscopy seems a natural fit. Furthermore, because endocytoscopy provides mostly focused, fixed-size images, a CADx system has an easier job analyzing them [27]. Misawa et al. developed an NBI-CADx for endocytoscopy with a sensitivity, specificity, and accuracy of 84.5%, 97.6%, and 90.0%, respectively. Moreover, they achieved a probability of diagnosis greater than 90%, which is the ASGE PIVI threshold for a “high-confidence” diagnosis [29]. In 2020, Mori et al. published an article on the cost savings achievable in colonoscopy with CADx systems. They used their AI system with endocytoscopy on 207 patients with 240 diminutive rectosigmoid polyps. The AI correctly differentiated neoplastic polyps with a sensitivity of 93.3%, a specificity of 95.2%, and a negative predictive value (NPV) of 95.2%. With an NPV > 90%, the ASGE threshold, the research team was confident in applying the diagnose-and-leave strategy to all non-neoplastic lesions. As a result, 105 polyps were removed and 145 polyps were left in place. The study estimated a reduction in the average colonoscopy cost and the annual reimbursement for colonoscopy of 18.9% (US$149.2 million) in Japan and 10.9% (US$85.2 million) in the United States, compared with the resect-all-polyps strategy [30]. Although the potential cost reduction is impressive, endocytoscopy is not yet widely used in clinical practice.
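The diagnose-and-leave decision described above hinges on the negative predictive value clearing the ASGE PIVI threshold. A minimal sketch of that check (the counts are hypothetical, chosen only to reproduce an NPV of 95.2%, and do not come from the study itself):

```python
ASGE_NPV_THRESHOLD = 0.90  # ASGE PIVI performance threshold for diagnose-and-leave


def negative_predictive_value(tn: int, fn: int) -> float:
    """Fraction of lesions predicted non-neoplastic that truly are non-neoplastic."""
    return tn / (tn + fn)


def diagnose_and_leave_supported(tn: int, fn: int) -> bool:
    """A diagnose-and-leave strategy for diminutive rectosigmoid polyps is only
    considered when the NPV meets or exceeds the ASGE PIVI threshold."""
    return negative_predictive_value(tn, fn) >= ASGE_NPV_THRESHOLD


# Hypothetical counts: 119 true negatives and 6 false negatives give NPV = 95.2%
print(diagnose_and_leave_supported(tn=119, fn=6))
```

Framing the threshold check this explicitly also makes clear why NPV, rather than accuracy, is the gating metric: the clinical risk of leaving a polyp in place is borne entirely by lesions the system predicts to be non-neoplastic.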
A colorectal cancer screening program is effective if the colonoscopy quality indicators are met. Currently, these indicators are subject to intentional or unintentional manipulation. They include pre-procedural indicators (i.e., the rate of adequate bowel preparation, the time interval for colonoscopy, the indication for colonoscopy), procedure indicators (i.e., cecal intubation rate, ADR, withdrawal time, polyp detection rate, management of pathology and complications, patient experience) and post-procedural indicators (appropriate post-polypectomy surveillance) [12]. Using AI together with other computerized management systems, these parameters may be objectively assessed and may provide a procedure quality score. This might be the next step towards achieving higher healthcare performance.
Inflammatory bowel disease (IBD) has also been a focus of AI development in colonoscopy [31]. Several studies have assessed the value of AI during colonoscopy in ulcerative colitis (UC), given its accessible location in the colon. Studies by Ozawa et al. [32] and Stidham et al. [33] highlighted the benefits of AI in distinguishing active disease from remission during standard colonoscopy. Another study, conducted by Bossuyt et al. [34], used a red density score that correlated mucosal redness with the severity of the inflammation. The results were biopsy-confirmed and were consistent with the Robarts histological index [35], the Mayo endoscopic score, and the Ulcerative Colitis Endoscopic Index of Severity (UCEIS) [36].
In Crohn’s Disease (CD), given the various locations of the lesions, video capsule endoscopy (VCE) has been the target for research when trying to implement an AI system [31]. Klang et al. [37] and Barash et al. [38] developed a CAD system capable of detecting ulcerations during VCE and concluded that AI might prove beneficial in CD diagnosis and monitoring. Another interesting study conducted by Ding et al. [39] concluded that the use of AI reduces the reading time of VCE from an average of 96.6 min to 5.9 min.

3. Artificial Intelligence in Upper Digestive Endoscopy

The majority of AI research appears to be focused on improving lower digestive endoscopy examinations, but there is a growing interest in upper digestive endoscopy (UDE) as well. Applying AI to diseases such as gastroesophageal reflux disease (GERD), Barrett’s esophagus (BE), and gastric cancer may have great healthcare benefits if newer technologies are implemented.
GERD is a fairly common disease, with symptoms ranging from occasional chest discomfort to severe heartburn and regurgitation. In 2021, Wang, C.-C. et al. [40] proposed a deep learning model, named GERD-VGGNet, that managed to identify and grade GERD according to the Los Angeles classification in both conventional and NBI endoscopy. According to the study, their AI model achieved an accuracy of 87.9%, significantly higher than the results of the trainees (75.0% and 65.6%). These results suggest that AI may be beneficial for the recognition of GERD; however, more research in this field is needed [40].
Barrett’s esophagus is usually the outcome of long-lasting untreated GERD. Because it is a pre-malignant lesion, research has been conducted, and guidelines have been developed for its detection, diagnosis, and follow-up [41]. Numerous recent studies demonstrated that AI might be the missing link for the optimal management of this disease. In 2020, Hashimoto R et al. [42] developed a convolutional neural network (CNN) algorithm designed to differentiate between dysplastic and non-dysplastic lesions in BE. It detected early neoplasia with 95.4% accuracy [42]. Further, in 2019, de Groof et al. [43] developed a CADe system capable of identifying neoplasia in patients with BE. Data have demonstrated that their system yielded better results than those of non-expert endoscopists [43]. Thus, AI may prove to be extremely beneficial to beginner endoscopists. Detecting early neoplasia in BE is a top priority, and AI research may add significant benefits in the future.
Similar to colorectal cancer, gastric cancer is still a major healthcare problem with a poor prognosis if left undiagnosed and untreated [44]. Upper digestive endoscopy (UDE) is the most important procedure to evaluate and diagnose gastric cancer and take biopsy samples. During UDE, novice endoscopists may not be able to correctly assess the entire mucosa. The unseen areas are called blind spots and may hide potential neoplastic lesions. A capable AI system may overcome this impediment and possibly enhance the quality of UDE. In a 2021 study, Wu L et al. [45] developed an AI model named ENDOANGEL. They showed that the number of blind spots per endoscopist was significantly reduced when using the AI model. This technology could detect gastric neoplasia with an accuracy of 84.7% [45].

4. Challenges in Implementing AI Systems in Healthcare

Regulations regarding safety, efficacy, and transparency must be approved before AI technology can be used in clinical settings [46]. It is equally important to consider the potential negative patient outcomes. These steps, although they may take time, are absolutely necessary for implementation [46].
AI systems require ongoing maintenance, large data incorporation, software updates, and hardware repairs. All of these activities require human resources and funding support. The economic burden may be significant. To decide if purchasing an AI system is advantageous for patient care, it is necessary to carefully balance investment and benefits. Data regarding AI technology implementation costs are scarce at present [47].
Integrating AI technology into existing systems, such as electronic health record databases, is an important issue. Adapting an AI model to a variety of clinical situations and gaining benefits at an organizational level can be challenging. Creating an AI model that is compatible with a large number of healthcare systems while still being relevant on an individual basis is not an easy task [48].
A further challenge is creating a good communication channel between the AI model and physicians. Essentially, this means translating digital information into the usual medical language in order to aid in diagnosis and treatment. The 2021 study by Fonollà R et al. [49] highlighted the possibility of creating an AI system that can overcome this difficulty. They used a large database that included two units: one unit housed polyp images and characteristics according to multiple classifications such as PARIS, NICE, KUDO, and BASIC. The second unit contained medical statements of experts in endoscopy that described the polyps. Common ground was established between the experts regarding the range of accepted terminology when describing polyps. Fonollà R et al. managed to create an AI system that automatically generates a textual polyp description based on the BASIC classification [49]. A step forward has been made, but more research is needed for improving collaboration between AI systems and physicians.
It is important to stress that in order to create an AI medical system, the most important role is played by the medical experts providing the necessary knowledge in a specific field. For instance, in the case of polyp detection during colonoscopy, it is necessary to gather a large variety of saved video colonoscopies performed by top endoscopists. To incorporate knowledge, medical experts must select colonoscopy frames that best represent different types of polyps with different sizes and in different circumstances. Then they have to assist in the important task of annotating colonoscopy frames so that the AI system is able to correctly learn how to detect polyps. The quality of the databases behind an AI system is of paramount importance for the effectiveness of its operation and is given by the value of the medical team that assisted in its development.

5. AI Systems Currently Approved for Use in Gastroenterology

Despite numerous difficulties, many AI systems have been validated and approved for use in medical practice. Healthcare corporations, together with their academic partners, were able to develop, test, and evaluate the results of incorporating such devices into clinical practice. Some of the currently available systems are Endo AID (Olympus Corp., Tokyo, Japan), CAD EYE (Fujifilm Corp., Tokyo, Japan), Discovery (Pentax Corp., Tokyo, Japan), GI Genius (Medtronic Corp., Dublin, Ireland), and EndoBRAIN (Cybernet Corp., Tokyo, Japan) (Table 1) [50]. The European Medicines Agency (EMA) and the Japanese Pharmaceuticals and Medical Devices Agency (PMDA) approved the use of these AI systems during 2018–2020. The rapid approvals from regulatory agencies reflect the great interest in developing and implementing these devices [51].
Although the United States FDA was somewhat more rigid than the EMA and the PMDA when evaluating AI technology in endoscopy, on 12 April 2021 it approved the first computer-aided polyp detection system, GI Genius (Medtronic Corp.) [52].
In a multicenter randomized trial in 2020, Repici et al. used GI Genius from Medtronic Corp. The aim of their study was to compare CADe colonoscopy to conventional colonoscopy. Their primary outcome was ADR, and their secondary outcomes were adenoma detection per colonoscopy, non-neoplastic resection rate, and withdrawal time. Using GI Genius, they concluded that the ADR (54.8% vs. 40.4%) and adenomas per colonoscopy were significantly higher in the CADe group compared to the control group. No significant difference was reported regarding withdrawal time or proportion of subjects with resection of non-neoplastic lesions [16].
Olympus’s Endo-AID was presented to the public for the first time during the October 2020 United European Gastroenterology (UEG) Week [53]. The system promises excellent improvement in polyp detection and the potential to reduce strain on the endoscopist by reducing the need for excessive eye movement. Koo et al. published an article in Endoscopy in 2021 demonstrating the performance of both the CADe and the CADx components of this AI system [54]. Even so, system improvements are still needed, as shown by a case report published in Endoscopy in 2022 by Lafeuille et al., in which the AI system failed to detect a 2.5 cm pseudo-depressed non-granular laterally spreading tumor of the transverse colon that pathological examination suggested was an adenocarcinoma [55].
Fujifilm’s CAD EYE was officially presented during the ESGE Connect 2020. In 2021, Helmut Neumann et al. [56] used CAD EYE in combination with a linked color imaging (LCI) technique on 240 polyps that covered the entire spectrum of adenomatous histology. In their study, the AI system achieved a sensitivity of 100% without missing a single lesion. They calculated a 0.001% false-positive frame rate. Above all, the AI system managed to identify all 34 sessile serrated lesions (100%). They concluded that the setup used in their study has the potential to significantly improve the quality of colonoscopy [56].
Cybernet Corp. developed multiple variations of their AI system called EndoBRAIN. EndoBRAIN-EYE is a CADe system capable of detecting polyps and reducing the chance of missed adenomas. Masashi Misawa et al. [57] published their research in Gastrointestinal Endoscopy (2021), where they highlighted the development and validation process of this AI system. To train the AI, they acquired a total of 1405 colonoscopy videos from five medical centers. The AI achieved a sensitivity of 90.5% and a specificity of 93.7% for frame-based analysis. They also reported the per-polyp sensitivity for all polyps (98.0%), for diminutive polyps (98.3%), for protruded polyps (98.5%), and for flat polyps (97.0%) [57]. EndoBRAIN-PLUS is a CADx system designed to identify the type of lesion in colonoscopy. In the development and validation study led by Yuichi Mori et al. in 2021 [58], the AI system was tasked with assigning one of three pathological class predictions (cancer, adenoma, or non-neoplastic) to endocytoscopic images obtained at 520-fold magnification. They used 30 cancers, 15 adenomas, and 5 non-neoplastic lesions for the validation test. The AI system identified the three pathological classes with an overall accuracy of 91.9% and differentiated cancer with a sensitivity of 91.8% and a specificity of 97.3% [58]. In collaboration with Olympus Corp., Cybernet Corp. also developed a CADx system dedicated to ulcerative colitis (UC) called EndoBRAIN-UC. This system is based on their previous CADx tool used for the differential diagnosis between neoplastic and non-neoplastic polyps [58]. In the study published in Gastrointestinal Endoscopy in 2019, Yasuharu Maeda et al. [59] described the development and validation process of this AI system. The study included a significant number of endocytoscopic images, and the performance of the CADx was determined based on the histological activity of the biopsy samples.
The overall diagnostic sensitivity, specificity, and accuracy for predicting persistent histological inflammation were 74%, 97%, and 91%, respectively [59]. The authors considered the 74% sensitivity to be tolerable because, until then, histologic inflammation was difficult to identify with endoscopy [59].
With more and more AI systems approved for clinical use, the opportunity for research and improvement is impressive, with potential future innovations on the way.

6. Future Potential of AI in Digestive Endoscopy

Although AI is typically defined as a machine that replicates human behavior and interactions, we are still far from that. A great deal of research, time, and funding is invested in developing an AI system capable of performing one basic human function (i.e., identifying a polyp). The human brain can accumulate a lot of information (i.e., polyp, normal mucosa, normal vascular pattern, normal colon haustra, bowel movement, feces, water, endoscopic instruments, and so on), sort that information, eliminate what is non-essential (or what we consider to be “normal”), and focus on the important aspects (i.e., the polyp). At this stage, the AI systems mentioned in this review are capable of achieving better results than a novice endoscopist but not better than an expert. Hence, the development of large databases and interactive algorithms for AI training may be a potential target for future studies.
Our group, which included colleagues from the Iasi Branch of the Institute of Computer Science of the Romanian Academy, developed a new experimental deep learning system using NVIDIA devices, specifically the Jetson Xavier NX, for real-time object and polyp detection on video colonoscopy and also proposed a method for semantically identifying different areas in colonoscopic frames [60,61,62] (Figure 1 and Figure 2). The human brain can comprehend each individual unit in an image and scrutinize and select it according to a set of criteria while at the same time comprehending how it interacts with the rest of the environment. If this can be achieved by an AI system, the possibilities may be endless.
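For readers unfamiliar with how such systems are structured, a real-time CADe pipeline reduces to pulling frames from the video stream, running the detector on each frame, and filtering detections by confidence before overlaying them. The sketch below is an illustration only, not the group’s actual pipeline: the detector and frame source are stubs standing in for the trained network deployed on the Jetson hardware.

```python
from dataclasses import dataclass
from typing import Iterable, List


@dataclass
class Detection:
    label: str          # e.g., "polyp", "instrument"
    confidence: float   # model score in [0, 1]
    box: tuple          # (x, y, width, height) in pixels


def stub_detector(frame: dict) -> List[Detection]:
    """Placeholder for the trained deep learning model; a real system would run
    an optimized network (e.g., on a Jetson Xavier NX) on the frame here."""
    if frame.get("suspicious"):
        return [Detection("polyp", 0.93, (120, 80, 40, 40))]
    return []


def process_stream(frames: Iterable[dict], threshold: float = 0.5) -> List[List[Detection]]:
    """Frame-by-frame CADe loop: detect, filter by confidence, hand off for overlay."""
    results = []
    for frame in frames:
        detections = [d for d in stub_detector(frame) if d.confidence >= threshold]
        results.append(detections)
    return results


# Simulated three-frame stream; only the second frame contains a lesion
stream = [{"suspicious": False}, {"suspicious": True}, {"suspicious": False}]
per_frame = process_stream(stream)
print([len(d) for d in per_frame])  # detections per frame
```

The confidence threshold is the main tuning knob in such a loop: lowering it raises sensitivity at the cost of more false-positive alerts distracting the endoscopist.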
Although this study offers a significant amount of valuable current information on artificial intelligence in digestive endoscopy, its main limitation is that it is not a systematic review; it might therefore be prone to subjective selection bias during the literature searches.

7. Conclusions

As we approach the end of the first quarter of the 21st century, technology has become an intrinsic part of our daily lives. The same applies to the field of medicine, gastroenterology included. AI models have achieved remarkable success in colonoscopy, with the potential to improve digestive disease diagnosis, support the training of novice endoscopists, and even play an important role in large CRC screening programs. In upper digestive endoscopy, AI devices can assist endoscopists as a second pair of eyes, demonstrating an ability to detect BE with potential neoplastic transformation, identify gastric cancer, and accurately assess gastric blind spots.
Our belief is that AI might soon become an essential tool in the endoscopic lab as the field of digestive endoscopy becomes more and more dependent on newer technology.

Author Contributions

Contributed equally and shared first co-authorship, R.-A.V. and M.L.; conceptualization, V.L.D.; methodology, V.L.D., R.-A.V., M.L. and O.-B.B.; formal analysis, O.-B.B., A.O. and A.C.; investigation, R.-A.V., M.L., A.C., O.-B.B. and A.O.; validation, V.L.D.; writing—original draft preparation, R.-A.V.; writing—review and editing, V.L.D., M.L., A.C. and O.-B.B.; supervision, V.L.D.; share senior co-authorship, V.L.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Topol, E.J. High-performance medicine: The convergence of human and artificial intelligence. Nat. Med. 2019, 25, 44–56.
2. Buchanan, B.G.; Shortliffe, E.H. Rule-Based Expert Systems: The MYCIN Experiments of the Stanford Heuristic Programming Project; Addison-Wesley: Reading, MA, USA, 1984; ISBN 0201101726.
3. Shortliffe, E.H.; Buchanan, B.G. A model of inexact reasoning in medicine. Math. Biosci. 1975, 23, 351–379.
4. Madiajagan, M.; Raj, S.S. Parallel Machine Learning and Deep Learning Approaches for Bioinformatics. In Deep Learning and Parallel Computing Environment for Bioengineering Systems; Academic Press: Cambridge, MA, USA, 2019; pp. 245–255.
5. GLOBOCAN 2020: New Global Cancer Data|UICC. Available online: https://www.uicc.org/news/globocan-2020-new-global-cancer-data (accessed on 7 February 2022).
6. Sung, H.; Ferlay, J.; Siegel, R.L.; Laversanne, M.; Soerjomataram, I.; Jemal, A.; Bray, F. Global Cancer Statistics 2020: GLOBOCAN Estimates of Incidence and Mortality Worldwide for 36 Cancers in 185 Countries. CA Cancer J. Clin. 2021, 71, 209–249.
7. Ferlay, J.; Colombet, M.; Soerjomataram, I.; Parkin, D.M.; Piñeros, M.; Znaor, A.; Bray, F. Cancer statistics for the year 2020: An overview. Int. J. Cancer 2021, 149, 778–789.
8. Siegel, R.L.; Torre, L.A.; Soerjomataram, I.; Hayes, R.B.; Bray, F.; Weber, T.K.; Jemal, A. Global patterns and trends in colorectal cancer incidence in young adults. Gut 2019, 68, 2179–2185.
9. Nishihara, R.; Wu, K.; Lochhead, P.; Morikawa, T.; Liao, X.; Qian, Z.R.; Inamura, K.; Kim, S.A.; Kuchiba, A.; Yamauchi, M.; et al. Long-term colorectal-cancer incidence and mortality after lower endoscopy. N. Engl. J. Med. 2013, 369, 1095–1105.
10. Siersema, P.D. Colorectal Cancer Awareness Issue 2019. Endoscopy 2019, 51, 207–208.
11. Dong, Z.; Wang, J.; Chen, Y.; Sun, H.; Li, B.; Zhang, Q.; Sun, K.; Wang, Z.; Qian, X.; Zhan, T.; et al. Negative Effects of Endoscopists’ Fatigue on Colonoscopy Quality on 34,022 Screening Colonoscopies. J. Gastrointest. Liver Dis. 2021, 30, 358–365.
12. Kaminski, M.F.; Thomas-Gibson, S.; Bugajski, M.; Bretthauer, M.; Rees, C.J.; Dekker, E.; Hoff, G.; Jover, R.; Suchanek, S.; Ferlitsch, M.; et al. Performance measures for lower gastrointestinal endoscopy: A European Society of Gastrointestinal Endoscopy (ESGE) Quality Improvement Initiative. Endoscopy 2017, 49, 378–397.
13. Kaul, V.; Enslin, S.; Gross, S.A. History of artificial intelligence in medicine. Gastrointest. Endosc. 2020, 92, 807–812.
14. Rex, D.K.; Kahi, C.; O’Brien, M.; Levin, T.R.; Pohl, H.; Rastogi, A.; Burgart, L.; Imperiale, T.; Ladabaum, U.; Cohen, J.; et al. The American Society for Gastrointestinal Endoscopy PIVI (Preservation and Incorporation of Valuable Endoscopic Innovations) on real-time endoscopic assessment of the histology of diminutive colorectal polyps. Gastrointest. Endosc. 2011, 73, 419–422.
15. Houwen, B.B.S.L.; Hassan, C.; Coupé, V.M.H.; Greuter, M.J.E.; Hazewinkel, Y.; Vleugels, J.L.A.; Antonelli, G.; Bustamante-Balén, M.; Coron, E.; Cortas, G.A.; et al. Definition of competence standards for optical diagnosis of diminutive colorectal polyps: European Society of Gastrointestinal Endoscopy (ESGE) Position Statement. Endoscopy 2022, 54, 88–99.
16. Repici, A.; Badalamenti, M.; Maselli, R.; Correale, L.; Radaelli, F.; Rondonotti, E.; Ferrara, E.; Spadaccini, M.; Alkandari, A.; Fugazza, A.; et al. Efficacy of Real-Time Computer-Aided Detection of Colorectal Neoplasia in a Randomized Trial. Gastroenterology 2020, 159, 512–520.e7.
17. Repici, A.; Spadaccini, M.; Antonelli, G.; Correale, L.; Maselli, R.; Galtieri, P.A.; Pellegatta, G.; Capogreco, A.; Milluzzo, S.M.; Lollo, G.; et al. Artificial intelligence and colonoscopy experience: Lessons from two randomised trials. Gut 2022, 71, 757–765.
18. Xu, Y.; Ding, W.; Wang, Y.; Tan, Y.; Xi, C.; Ye, N.; Wu, D.; Xu, X. Comparison of diagnostic performance between convolutional neural networks and human endoscopists for diagnosis of colorectal polyp: A systematic review and meta-analysis. PLoS ONE 2021, 16, e0246892.
19. Hassan, C.; Spadaccini, M.; Iannone, A.; Maselli, R.; Jovani, M.; Chandrasekar, V.T.; Antonelli, G.; Yu, H.; Areia, M.; Dinis-Ribeiro, M.; et al. Performance of artificial intelligence in colonoscopy for adenoma and polyp detection: A systematic review and meta-analysis. Gastrointest. Endosc. 2021, 93, 77–85.e6.
20. Spadaccini, M.; Iannone, A.; Maselli, R.; Badalamenti, M.; Desai, M.; Chandrasekar, V.T.; Patel, H.K.; Fugazza, A.; Pellegatta, G.; Galtieri, P.A.; et al. Computer-aided detection versus advanced imaging for detection of colorectal neoplasia: A systematic review and network meta-analysis. Lancet Gastroenterol. Hepatol. 2021, 6, 793–802.
21. Van Der Zander, Q.E.W.; Schreuder, R.M.; Fonollà, R.; Scheeve, T.; Van Der Sommen, F.; Winkens, B.; Aepli, P.; Hayee, B.; Pischel, A.B.; Stefanovic, M.; et al. Optical diagnosis of colorectal polyp images using a newly developed computer-aided diagnosis system (CADx) compared with intuitive optical diagnosis. Endoscopy 2021, 53, 1219–1226.
22. Sakamoto, T.; Nakashima, H.; Nakamura, K.; Nagahama, R.; Saito, Y. Performance of Computer-Aided Detection and Diagnosis of Colorectal Polyps Compares to That of Experienced Endoscopists. Dig. Dis. Sci. 2021, 1–8.
23. Chen, P.J.; Lin, M.C.; Lai, M.J.; Lin, J.C.; Lu, H.H.S.; Tseng, V.S. Accurate Classification of Diminutive Colorectal Polyps Using Computer-Aided Analysis. Gastroenterology 2018, 154, 568–575.
24. Zachariah, R.; Samarasena, J.; Luba, D.; Duh, E.; Dao, T.; Requa, J.; Ninh, A.; Karnes, W. Prediction of Polyp Pathology Using Convolutional Neural Networks Achieves ‘Resect and Discard’ Thresholds. Am. J. Gastroenterol. 2020, 115, 138.
25. Song, E.M.; Park, B.; Ha, C.A.; Hwang, S.W.; Park, S.H.; Yang, D.H.; Ye, B.D.; Myung, S.J.; Yang, S.K.; Kim, N.; et al. Endoscopic diagnosis and treatment planning for colorectal polyps using a deep-learning model. Sci. Rep. 2020, 10, 30.
26. Jin, E.H.; Lee, D.; Bae, J.H.; Kang, H.Y.; Kwak, M.S.; Seo, J.Y.; Yang, J.I.; Yang, S.Y.; Lim, S.H.; Yim, J.Y.; et al. Improved Accuracy in Optical Diagnosis of Colorectal Polyps Using Convolutional Neural Networks with Visual Explanations; The American Gastroenterological Association: Bethesda, MD, USA, 2020; Volume 158, ISBN 8222072293.
27. Parsa, N.; Byrne, M.F. Artificial intelligence for identification and characterization of colonic polyps. Ther. Adv. Gastrointest. Endosc. 2021, 14, 1–12.
28. Takemura, Y.; Yoshida, S.; Tanaka, S.; Onji, K.; Oka, S.; Tamaki, T.; Kaneda, K.; Yoshihara, M.; Chayama, K. Quantitative analysis and development of a computer-aided system for identification of regular pit patterns of colorectal lesions. Gastrointest. Endosc. 2010, 72, 1047–1051.
29. Misawa, M.; Kudo, S.E.; Mori, Y.; Nakamura, H.; Kataoka, S.; Maeda, Y.; Kudo, T.; Hayashi, T.; Wakamura, K.; Miyachi, H.; et al. Characterization of Colorectal Lesions Using a Computer-Aided Diagnostic System for Narrow-Band Imaging Endocytoscopy. Gastroenterology 2016, 150, 1531–1532.e3.
30. Mori, Y.; Kudo, S.-E.; East, J.E.; Rastogi, A.; Bretthauer, M.; Misawa, M.; Sekiguchi, M.; Matsuda, T.; Saito, Y.; Ikematsu, H.; et al. Cost savings in colonoscopy with artificial intelligence-aided polyp diagnosis: An add-on analysis of a clinical trial (with video). Gastrointest. Endosc. 2020, 92, 905–911.e1.
31. Takenaka, K.; Kawamoto, A.; Okamoto, R.; Watanabe, M.; Ohtsuka, K. Artificial intelligence for endoscopy in inflammatory bowel disease. Intest. Res. 2022.
32. Ozawa, T.; Ishihara, S.; Fujishiro, M.; Saito, H.; Kumagai, Y.; Shichijo, S.; Aoyama, K.; Tada, T. Novel computer-assisted diagnosis system for endoscopic disease activity in patients with ulcerative colitis. Gastrointest. Endosc. 2019, 89, 416–421.e1.
33. Stidham, R.W.; Liu, W.; Bishu, S.; Rice, M.D.; Higgins, P.D.R.; Zhu, J.; Nallamothu, B.K.; Waljee, A.K. Performance of a Deep Learning Model vs Human Reviewers in Grading Endoscopic Disease Severity of Patients With Ulcerative Colitis. JAMA Netw. Open 2019, 2, e193963.
34. Bossuyt, P.; Vermeire, S.; Bisschops, R. Scoring endoscopic disease activity in IBD: Artificial intelligence sees more and better than we do. Gut 2020, 69, 788–789.
35. Marchal-Bressenot, A.; Salleron, J.; Boulagnon-Rombi, C.; Bastien, C.; Cahn, V.; Cadiot, G.; Diebold, M.D.; Danese, S.; Reinisch, W.; Schreiber, S.; et al. Development and validation of the Nancy histological index for UC. Gut 2017, 66, 43–49.
36. Travis, S.P.L.; Schnell, D.; Krzeski, P.; Abreu, M.T.; Altman, D.G.; Colombel, J.F.; Feagan, B.G.; Hanauer, S.B.; Lichtenstein, G.R.; Marteau, P.R.; et al. Reliability and initial validation of the ulcerative colitis endoscopic index of severity. Gastroenterology 2013, 145, 987–995.
37. Klang, E.; Barash, Y.; Margalit, R.Y.; Soffer, S.; Shimon, O.; Albshesh, A.; Ben-Horin, S.; Amitai, M.M.; Eliakim, R.; Kopylov, U. Deep learning algorithms for automated detection of Crohn’s disease ulcers by video capsule endoscopy. Gastrointest. Endosc. 2020, 91, 606–613.e2.
38. Barash, Y.; Azaria, L.; Soffer, S.; Margalit Yehuda, R.; Shlomi, O.; Ben-Horin, S.; Eliakim, R.; Klang, E.; Kopylov, U. Ulcer severity grading in video capsule images of patients with Crohn’s disease: An ordinal neural network solution. Gastrointest. Endosc. 2021, 93, 187–192.
39. Ding, Z.; Shi, H.; Zhang, H.; Meng, L.; Fan, M.; Han, C.; Zhang, K.; Ming, F.; Xie, X.; Liu, H.; et al. Gastroenterologist-Level Identification of Small-Bowel Diseases and Normal Variants by Capsule Endoscopy Using a Deep-Learning Model. Gastroenterology 2019, 157, 1044–1054.e5.
40. Wang, C.C.; Chiu, Y.C.; Chen, W.L.; Yang, T.W.; Tsai, M.C.; Tseng, M.H. A Deep Learning Model for Classification of Endoscopic Gastroesophageal Reflux Disease. Int. J. Environ. Res. Public Health 2021, 18, 2428.
41. Qumseya, B.; Sultan, S.; Bain, P.; Jamil, L.; Jacobson, B.; Anandasabapathy, S.; Agrawal, D.; Buxbaum, J.L.; Fishman, D.S.; Gurudu, S.R.; et al. ASGE guideline on screening and surveillance of Barrett’s esophagus. Gastrointest. Endosc. 2019, 90, 335–359.e2.
42. Hashimoto, R.; Requa, J.; Dao, T.; Ninh, A.; Tran, E.; Mai, D.; Lugo, M.; El-Hage Chehade, N.; Chang, K.J.; Karnes, W.E.; et al. Artificial intelligence using convolutional neural networks for real-time detection of early esophageal neoplasia in Barrett’s esophagus (with video). Gastrointest. Endosc. 2020, 91, 1264–1271.e1.
43. De Groof, A.J.; Struyvenberg, M.R.; van der Putten, J.; van der Sommen, F.; Fockens, K.N.; Curvers, W.L.; Zinger, S.; Pouw, R.E.; Coron, E.; Baldaque-Silva, F.; et al. Deep-Learning System Detects Neoplasia in Patients With Barrett’s Esophagus With Higher Accuracy Than Endoscopists in a Multistep Training and Validation Study With Benchmarking. Gastroenterology 2020, 158, 915–929.e4.
44. Hu, Z.; Zuo, Z.; Miao, H.; Ning, Z.; Deng, Y. Incidence, Risk Factors and Prognosis of T4a Gastric Cancer: A Population-Based Study. Front. Med. 2022, 8, 767904.
45. Wu, L.; He, X.; Liu, M.; Xie, H.; An, P.; Zhang, J.; Zhang, H.; Ai, Y.; Tong, Q.; Guo, M.; et al. Evaluation of the effects of an artificial intelligence system on endoscopy quality and preliminary testing of its performance in detecting early gastric cancer: A randomized controlled trial. Endoscopy 2021, 53, 1199–1207.
46. Charow, R.; Jeyakumar, T.; Younus, S.; Dolatabadi, E.; Salhia, M.; Al-Mouaswas, D.; Anderson, M.; Balakumar, S.; Clare, M.; Dhalla, A.; et al. Artificial Intelligence Education Programs for Health Care Professionals: Scoping Review. JMIR Med. Educ. 2021, 7, e31043.
47. He, J.; Baxter, S.L.; Xu, J.; Xu, J.; Zhou, X.; Zhang, K. The practical implementation of artificial intelligence technologies in medicine. Nat. Med. 2019, 25, 30–36.
48. Singh, R.P.; Hom, G.L.; Abramoff, M.D.; Campbell, J.P.; Chiang, M.F. Current Challenges and Barriers to Real-World Artificial Intelligence Adoption for the Healthcare System, Provider, and the Patient. Transl. Vis. Sci. Technol. 2020, 9, 45.
49. Fonollà, R.; van der Zander, Q.E.W.; Schreuder, R.M.; Subramaniam, S.; Bhandari, P.; Masclee, A.A.M.; Schoon, E.J.; van der Sommen, F.; de With, P.H.N. Automatic image and text-based description for colorectal polyps using BASIC classification. Artif. Intell. Med. 2021, 121, 102178.
50. Taghiakbari, M.; Mori, Y.; von Renteln, D. Artificial intelligence-assisted colonoscopy: A review of current state of practice and research. World J. Gastroenterol. 2021, 27, 8103.
51. Mori, Y.; Neumann, H.; Misawa, M.; Kudo, S.E.; Bretthauer, M. Artificial intelligence in colonoscopy—Now on the market. What’s next? J. Gastroenterol. Hepatol. 2021, 36, 7–11.
52. Spadaccini, M.; De Marco, A.; Franchellucci, G.; Sharma, P.; Hassan, C.; Repici, A. Discovering the first FDA-approved computer-aided polyp detection system. Future Oncol. 2022, 18, 1405–1412.
53. Olympus. Olympus Launches ENDO-AID, an AI-Powered Platform for Its Endoscopy System—Olympus Europe, Middle East and Africa. Available online: https://www.olympus-europa.com/company/en/news/press-releases/2020-10-09t08-30-00/olympus-launches-endo-aid-an-ai-powered-platform-for-its-endoscopy-system.html (accessed on 4 March 2022).
54. Koo, C.S.; Dolgunov, D.; Koh, C.J. Key tips for using computer-aided diagnosis in colonoscopy—observations from two different platforms. Endoscopy 2021.
55. Lafeuille, P.; Yzet, C.; Rivory, J.; Pontarollo, G.; Latif, E.H.; Bartoli, A.; Pioche, M. Flat colorectal adenocarcinoma: A worrisome false negative of artificial intelligence-assisted colonoscopy. Endoscopy 2022.
56. Neumann, H.; Kreft, A.; Sivanathan, V.; Rahman, F.; Galle, P.R. Evaluation of novel LCI CAD EYE system for real time detection of colon polyps. PLoS ONE 2021, 16, e0255955.
57. Misawa, M.; Kudo, S.-E.; Mori, Y.; Hotta, K.; Ohtsuka, K.; Matsuda, T.; Saito, S.; Kudo, T.; Baba, T.; Ishida, F.; et al. Development of a computer-aided detection system for colonoscopy and a publicly accessible large colonoscopy video database (with video). Gastrointest. Endosc. 2021, 93, 960–967.e3.
58. Mori, Y.; Kudo, S.; Misawa, M.; Hotta, K.; Kazuo, O.; Saito, S.; Ikematsu, H.; Saito, Y.; Matsuda, T.; Kenichi, T.; et al. Artificial intelligence-assisted colonic endocytoscopy for cancer recognition: A multicenter study. Endosc. Int. Open 2021, 9, E1004–E1011.
59. Maeda, Y.; Kudo, S.-E.; Mori, Y.; Misawa, M.; Ogata, N.; Sasanuma, S.; Wakamura, K.; Oda, M.; Mori, K.; Ohtsuka, K. Fully automated diagnostic system with artificial intelligence using endocytoscopy to identify the presence of histologic inflammation associated with ulcerative colitis (with video). Gastrointest. Endosc. 2019, 89, 408–415.
60. Ciobanu, A.; Luca, M.; Barbu, T.; Drug, V.; Olteanu, A.; Vulpoi, R. Experimental Deep Learning Object Detection in Real-time Colonoscopies. In Proceedings of the 2021 International Conference on e-Health and Bioengineering (EHB), Iasi, Romania, 18–19 November 2021; Institute of Electrical and Electronics Engineers (IEEE): Iasi, Romania, 2021.
61. Luca, M.; Ciobanu, A.; Barbu, T.; Drug, V. Artificial Intelligence and Deep Learning, Important Tools in Assisting Gastroenterologists. In Handbook of Artificial Intelligence in Healthcare; Lim, C.-P., Vaidya, A., Jain, K., Mahorkar, V.U., Jain, L.C., Eds.; Springer: Cham, Switzerland, 2021; Volume 211, pp. 197–213.
62. Ciobanu, A.; Luca, M.; Oltean, A.; Barboi, O.; Drug, V. Cielab Automatic Colonoscopy Post-Evaluation. Endoscopy 2021, 53, S194–S195.
Figure 1. Assigning colors to different components in a colonoscopy frame by deep learning semantic segmentation: mucosa in red, residue in yellow and reflections in blue.
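The per-pixel color assignment described in the Figure 1 caption can be sketched in a few lines; this is an illustrative example only, and the class indices and exact palette are assumptions rather than the authors' implementation:

```python
import numpy as np

# Assumed class indices and colors matching the figure's scheme:
# 0 = mucosa (red), 1 = residue (yellow), 2 = reflections (blue).
PALETTE = np.array([
    [255, 0, 0],    # mucosa -> red
    [255, 255, 0],  # residue -> yellow
    [0, 0, 255],    # reflections -> blue
], dtype=np.uint8)

def colorize(mask: np.ndarray) -> np.ndarray:
    """Map an HxW array of per-pixel class indices to an HxWx3 RGB image."""
    return PALETTE[mask]

# A toy 2x2 "segmentation" of a frame:
mask = np.array([[0, 1],
                 [2, 0]])
print(colorize(mask).shape)  # (2, 2, 3)
```

In practice the mask would be the argmax over the class probabilities produced by the segmentation network for each frame, and the colored result is typically alpha-blended over the original image.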
Figure 2. Examples of deep learning colonoscopy detection in real-time on an NVIDIA Jetson Xavier microsystem.
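Real-time performance on embedded hardware such as the NVIDIA Jetson Xavier is usually summarized as frames per second. A minimal, model-agnostic way to measure it is sketched below; the `detect` callable is a placeholder for any detector, not the system used in Figure 2:

```python
import time

def measure_fps(detect, frames):
    """Average frames-per-second of a detection callable over a frame sequence."""
    start = time.perf_counter()
    for frame in frames:
        detect(frame)  # run inference; result handling omitted for brevity
    elapsed = time.perf_counter() - start
    return len(frames) / elapsed

# Example with a dummy detector that sleeps ~1 ms per frame:
fps = measure_fps(lambda f: time.sleep(0.001), list(range(50)))
```

For video endoscopy, throughput of roughly 25–30 FPS or better is generally needed so that detection overlays keep up with the live feed.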
Table 1. Currently approved colonoscopy computer-assisted tools for commercial use (adapted from Taghiakbari et al., 2021).

| Product | Manufacturer | Place of Approval and Year | Computer System Used |
|---|---|---|---|
| EndoBRAIN | Cybernet System Corp./Olympus Corp. | Japan 2018 | CADx |
| EndoBRAIN-EYE | Cybernet System Corp./Olympus Corp. | Japan 2020 | CADe |
| EndoBRAIN-PLUS | Cybernet System Corp./Olympus Corp. | Japan 2020 | CADx |
| EndoBRAIN-UC | Cybernet System Corp./Olympus Corp. | Japan 2020 | CADx |
| GI Genius | Medtronic Corp. | Europe 2019; United States 2021 | CADe |
| ENDO-AID | Olympus Corp. | Europe 2020 | CADe |
| CAD EYE | Fujifilm Corp. | Europe 2020; Japan 2020 | CADe/CADx |
| DISCOVERY | Pentax Corp. | Europe 2020 | CADe |
| WISE VISION | NEC Corp. | Europe 2021; Japan 2021 | CADe |
| CADDIE | Odin Vision | Europe 2021 | CADe |
| ME-APDS | Magentiq Eye | Europe 2021 | CADe |
| EndoAngel | Wuhan EndoAngel Medical Technology Company | China 2020 | CADe |

CADx—computer-assisted diagnosis. CADe—computer-assisted detection.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
