Review

Artificial Intelligence—The Rising Star in the Field of Gastroenterology and Hepatology

by Madalina Stan-Ilie 1,2, Vasile Sandru 2,*, Gabriel Constantinescu 1,2, Oana-Mihaela Plotogea 1,2, Ecaterina Mihaela Rinja 2, Iulia Florentina Tincu 1,3,†, Alexandra Jichitu 2,*, Adriana Elena Carasel 2, Andreea Cristina Butuc 2 and Bogdan Popa 1,2
1 Clinical Hospital, University of Medicine and Pharmacy Carol Davila, 050474 Bucharest, Romania
2 Gastroenterology Department, Clinical Emergency Hospital of Bucharest, 014461 Bucharest, Romania
3 Gastroenterology Department, “Dr Victor Gomoiu” Clinical Children Hospital, 022102 Bucharest, Romania
* Authors to whom correspondence should be addressed.
† With equal contribution as the first author.
Diagnostics 2023, 13(4), 662; https://doi.org/10.3390/diagnostics13040662
Submission received: 5 December 2022 / Revised: 31 January 2023 / Accepted: 7 February 2023 / Published: 10 February 2023
(This article belongs to the Special Issue Advances in the Diagnostic Imaging of Gastrointestinal Diseases)

Abstract: Artificial intelligence (AI) is a term that covers a multitude of techniques used to reproduce human intelligence. AI is helpful in various medical specialties that rely on imaging for diagnostic purposes, and gastroenterology is no exception. In this field, AI has several applications, such as detecting and classifying polyps, detecting malignancy in polyps, diagnosing Helicobacter pylori infection, gastritis, inflammatory bowel disease, gastric cancer, esophageal neoplasia, and pancreatic and hepatic lesions. The aim of this mini-review is to analyze the currently available studies regarding AI in the field of gastroenterology and hepatology and to discuss its main applications as well as its main limitations.

1. Introduction

Artificial intelligence (AI) is a term that covers a multitude of techniques used to solve problems by emulating human intelligence [1]. In the short and medium term, there is little doubt that, among all the technologies that continue to grow, artificial intelligence will bring the most important contributions to the field of medicine. More specifically, diagnostic imaging will play the largest part in this evolution, because image analysis is a task to which this kind of intelligence is readily applied [1]. In gastroenterology, the imaging modalities used for diagnostic and staging purposes are endoscopy, endoscopic ultrasound, radiology, and histopathologic examination.
Automatic analysis of gastrointestinal images is performed by a specific subtype of AI called machine learning (ML). ML refers to the ability of computers to learn and improve from experience [2]. The learning process can be either supervised or unsupervised. The supervised method involves teaching the model with a set of input data that are already paired with the corresponding output data. Unsupervised learning, on the other hand, means that the system trains itself for a certain purpose without labeled outputs. While supervised learning is usually used for image classification, unsupervised learning usually involves the detection of clusters or other forms of pattern recognition. Specific techniques used in supervised learning include the support vector machine (SVM), naïve Bayes, and random forest (RF) [1].
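For readers less familiar with these terms, the short Python sketch below contrasts the two paradigms on synthetic data with scikit-learn (the data and library choice are illustrative assumptions, not taken from any of the studies reviewed here): supervised classifiers, an SVM and a random forest, are fitted to feature vectors that already carry labels, while k-means clustering groups the same data without any labels.

```python
# Minimal sketch: supervised vs. unsupervised learning on synthetic "image feature" vectors.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC                        # support vector machine (supervised)
from sklearn.ensemble import RandomForestClassifier
from sklearn.cluster import KMeans                 # clustering (unsupervised)

# Synthetic feature vectors standing in for features extracted from endoscopic images.
X, y = make_classification(n_samples=500, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Supervised: the model learns from inputs that are already paired with output labels.
svm = SVC().fit(X_train, y_train)
rf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
print("SVM accuracy:", svm.score(X_test, y_test))
print("Random forest accuracy:", rf.score(X_test, y_test))

# Unsupervised: the model only sees inputs and looks for clusters/patterns on its own.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("Cluster sizes:", np.bincount(clusters))
```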
Both supervised and unsupervised techniques can be used to train artificial neural networks (ANNs). ANNs are a branch of ML algorithms made of node units organized in successive layers [1]. The concept underlying ANNs is the biological neural network. In the same way that neurons transmit signals through dendrites and axons, artificial neurons (nodes) also transmit signals. The difference between the two systems is that, unlike the biological neural network, the artificial one can receive other forms of signals in addition to activation signals.
If an ANN is composed of multiple layers, the system is called a deep neural network (DNN), and the use of this kind of system is called deep learning (DL). The first layer of the DNN receives the input information, and the last one produces the output. The layers between input and output are called hidden layers. This kind of compound layering allows the system to make complex decisions based on simpler information.
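The layered structure described above can be illustrated with a minimal sketch, again on synthetic data with scikit-learn (an assumed setup, not a model from the reviewed literature): the tuple passed to the classifier below defines two hidden layers sitting between the input features and the output class.

```python
# Minimal sketch of a deep (multi-layer) neural network using scikit-learn's MLPClassifier.
# The input layer is the feature vector, the output layer is the predicted class,
# and the hidden_layer_sizes tuple defines the hidden layers in between.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=1000, n_features=30, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

dnn = MLPClassifier(hidden_layer_sizes=(64, 32),  # two hidden layers of 64 and 32 nodes
                    max_iter=500, random_state=0)
dnn.fit(X_train, y_train)
print("Test accuracy:", dnn.score(X_test, y_test))
```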
DL is capable of adjusting the representations learned in each particular layer (representation learning), therefore providing the output more efficiently. A major advantage of this approach is transfer learning [3]: a model that was previously trained and has learned the characteristics of images in one task can be reused for a new task [4].
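As a hedged illustration of transfer learning, the generic PyTorch/torchvision sketch below (not the pipeline of any cited study; it assumes PyTorch is installed and the pretrained ImageNet weights can be downloaded) reuses a pretrained ResNet-18, freezes its previously learned layers, and retrains only a new final layer for a hypothetical two-class endoscopy task.

```python
# Minimal transfer-learning sketch: reuse a CNN pretrained on ImageNet and retrain
# just its final layer for a new two-class task (e.g., "polyp" vs. "no polyp").
import torch
import torch.nn as nn
from torchvision import models

model = models.resnet18(weights="IMAGENET1K_V1")   # CNN pretrained on a previous task
for param in model.parameters():                    # freeze the previously learned features
    param.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, 2)       # new output layer for the new task

# Only the new layer's parameters are optimized on the new, smaller dataset.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

dummy_batch = torch.randn(4, 3, 224, 224)           # stands in for real endoscopic images
labels = torch.tensor([0, 1, 0, 1])
loss = criterion(model(dummy_batch), labels)
loss.backward()
optimizer.step()
```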
For some medical specialties, especially those that require the interpretation of images, such as dermatology, gastroenterology, radiology, and pathology, AI is anticipated to be a technology of great importance.
The aim of this mini-review is to reveal and describe the implementation of AI in the field of gastroenterology.

2. Artificial Intelligence in Gastrointestinal Upper and Lower Endoscopy

AI technologies have a large variety of applications in gastrointestinal endoscopy, thereby improving both diagnostic and treatment performance.
Currently, AI is used in endoscopy to detect, classify, and assess the histology of colorectal polyps, for wireless capsule endoscopy (WCE), for the evaluation of the esogastric pathology by upper endoscopy, and for image analysis of endoscopic ultrasound (EUS) [3] (Table 1).

3. Colorectal Polyps

3.1. Polyp Detection

For colonoscopy, there are two implementations of AI, computer-aided polyp detection (CADe) and computer-aided polyp diagnosis (CADx) [5,6]. Both of these techniques have been intensively studied.
Taking into consideration that colonoscopy has limitations, such as bowel preparation quality, the variable adenoma detection rate (ADR), and even operator fatigue, the rate of missed polyps during colonoscopy might be as high as 25% [7]. Glòria Fernández et al. analyzed the capability of an automatic method to detect colonic polyps based on the creation of energy maps. Although only 24 videos containing polyps were analyzed, the specificity and sensitivity of this technique reached 72.4% and 70.4%, respectively [8].
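Sensitivity and specificity figures such as these recur throughout this review; the minimal sketch below (with illustrative labels only, not study data) shows how they are derived from a detector's per-frame output.

```python
# How sensitivity and specificity are computed from a detector's binary output.
import numpy as np
from sklearn.metrics import confusion_matrix

y_true = np.array([1, 1, 1, 0, 0, 0, 1, 0, 1, 0])   # 1 = frame actually contains a polyp
y_pred = np.array([1, 0, 1, 0, 0, 1, 1, 0, 1, 0])   # detector output for each frame

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
sensitivity = tp / (tp + fn)   # proportion of true polyps the system finds
specificity = tn / (tn + fp)   # proportion of polyp-free frames correctly cleared
print(f"sensitivity={sensitivity:.1%}, specificity={specificity:.1%}")
```

The ADR discussed above is, by contrast, simply the proportion of colonoscopies in which at least one adenoma is found.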
Moreover, a recent prospective randomized controlled study carried out by Pu Wang et al. investigated the effect of an automatic polyp detection system. In this study, 1058 patients were enrolled and randomized to either standard or computer-aided colonoscopy. The authors concluded that the AI system notably increased the ADR as well as the mean number of adenomas per patient, owing to a higher number of small adenomas that the AI system was able to find [6].

3.2. Polyp Classification

The classification of polyps is important, especially when it comes to small ones. This allows the endoscopist to make the best choice and either resect them or leave them in situ. To make these decisions, a close-up analysis using an enhanced imaging technique of the polyps is needed [3]. The final aim of the classification is to determine whether the polyp is malignant or non-malignant. In order to make this differentiation, AI takes into consideration specific characteristics of the polyps, such as the shape, texture, and color [9]. Chromoendoscopy, magnification narrow-band imaging (NBI), endocytoscopy, laser-induced autofluorescence, or confocal endomicroscopy are some techniques used for polyp classification [10,11,12,13] (Figure 1).
Yoshito Takemura et al. performed an analysis concerning narrow-band imaging magnifying colonoscopy to predict the histology of colorectal tumors [14]. Although there is a learning curve for the NBI classification system and the study was conducted in a single center, the accuracy of this system reached 97.8%.
On the other hand, endocytoscopy is able to provide microscopic visualization using mini probes [15]. Yuichi Mori et al. conducted a study regarding the capacity of an automated system for endocytoscopic diagnosis concerning small and diminutive polyps. They found an 89% accuracy of the system for diminutive and small polyps [16].
Laser-induced autofluorescence is a technique that has the capacity to detect colonic dysplasia in vivo [17]. WavSTAT is a device that is incorporated into biopsy forceps and uses laser-induced autofluorescence spectroscopy, allowing laser light to be absorbed by the tissue. The tissue itself emits light; this is further analyzed and represents an optical fingerprint [18].
Chromoendoscopy is a technique in which topical dyes are applied in order to enhance tissue features [19]. This is usually used combined with another technique, such as NBI [20].
Probe-based confocal laser endomicroscopy (pCLE) gives the endoscopist a live microscopic visualization of the epithelial tissue during endoscopy. An automatic software was designed in order to support pCLE, and Barbara André et al. compared its performance with an off-line method used by expert endoscopists. They concluded that both techniques have similarly high sensitivity and specificity [21].

3.3. Detection of Malignancy in Polyps

Diagnosing malignancy in polyps is very important because making the right diagnosis guides the optimal treatment for the patients. If deep submucosal invasion is present, surgery is required because there is a high risk of possible metastasis to the lymph nodes [22]. Taking this into consideration, proper endoscopic diagnostic tools should be used in order to be able to use the right therapeutic options.
Endoscopic treatment consists of endoscopic mucosal resection, endoscopic submucosal dissection, or endoscopic full-thickness resection [22,23,24]. Currently, several endoscopic techniques are available to assess the depth of invasion. Those are NBI, high-definition white light endoscopy (HD-WLE), and EUS [25].
Recently, Kenichi et al. evaluated another CAD system for assessing the grade of invasion, which uses ultra-high magnification endocytoscopy [26]. They concluded that this system might be a helpful diagnostic tool in the future, with a sensitivity of 98.1% and a specificity of 100%.

3.4. Inflammatory Bowel Disease

In inflammatory bowel disease, histologic healing of the mucosa is even more important than endoscopic healing. The risk of disease exacerbation and dysplasia is higher when histologic inflammation is still present, and this is harder to assess with conventional colonoscopy, especially in patients whose disease has evolved over several years.
The accuracy of a CAD system was evaluated by Yasuharu et al. using images from colonoscopies of patients with ulcerative colitis. They found a 74% sensitivity and 97% specificity of the system, concluding that it allows the identification of persistent histologic inflammation [27].
Another DNN system was developed by Kento et al. using colonoscopy images from patients with ulcerative colitis. The system’s accuracy was later tested prospectively on 875 patients with ulcerative colitis who underwent colonoscopy. The accuracy for identifying endoscopic remission was 90.1%, and for histologic remission it was 92.9% [28].
Regarding Crohn’s disease, a retrospective study conducted by Eyal et al. analyzed a deep learning algorithm for the automatic detection of ulcers located in the small intestine using images provided by capsule endoscopy. A convolutional neural network (CNN) was trained to classify the mucosal images as either normal or containing ulcers. They found an accuracy ranging from 95.4% to 96.7% [29].

3.5. Helicobacter Pylori Diagnosis

Helicobacter pylori is a leading cause of gastric cancer. In Asia, the diagnosis of H. pylori by assessing the mucosa is an important part of gastric cancer screening [3]. AI might be a useful tool for improving diagnostic performance, considering that this process is time-consuming and has a steep learning curve.
An algorithm able to detect H. pylori on specifically stained gastric biopsies was designed by Sebastian et al. [30]. They analyzed 87 cases, in which Giemsa-stained biopsies yielded a 100% sensitivity for the algorithm.

3.6. Gastritis

Chronic gastritis is an entity that has a high prevalence. It is diagnosed by evaluating the degree of both active and chronic inflammation, assessing the presence of atrophy or intestinal metaplasia, and testing for the presence of Helicobacter pylori infection [1].
A CNN was used in a study conducted by Georg et al. [31]. The capacity of this network to establish the proper classification of gastritis (autoimmune, bacterial, and chemical) was evaluated, and the accuracy of the test was 84% [31].

3.7. Gastric Cancer

Like colorectal lesions, gastric lesions need to be detected early and properly characterized in order to establish optimal treatment options [1]. Gastric lesions that might represent premalignant conditions are chronic gastritis and polyps. Several studies have analyzed algorithms developed to improve the detection of premalignant gastric conditions.
Toshiaki et al. studied the capacity of a CNN system to correctly diagnose gastric cancer lesions; the system reached an overall sensitivity of 92.2% [32].
Another study by Yan et al. focused on a CNN system designed to determine the depth of invasion of gastric cancer [33]. They concluded that the system was able to differentiate between early gastric cancer and deep submucosal invasion with a sensitivity of 76.47% and a specificity of 95.56%, values that were higher than those achieved by expert endoscopists using standard systems.
Another CNN-based system was developed at the Chinese PLA General Hospital in China and used to distinguish between malignant and benign gastric tumors. Their model achieved 100% sensitivity and 80.6% specificity [34].

3.8. Esophageal Neoplasia

Esophageal cancer is one of the most aggressive types of cancer. The main histological types are adenocarcinoma and squamous cell carcinoma [1]. Locally advanced esophageal cancer is mainly treated with chemotherapy, and studies demonstrate a positive correlation between the histopathological response and overall survival [35].
One of the most common esophageal lesions with premalignant potential is Barrett’s esophagus. The risk of progression to adenocarcinoma increases with the degree of dysplasia. Therefore, in order to improve prognosis, early detection is of main interest. Nowadays, histopathology represents the gold standard for diagnosing Barrett’s esophagus, although it has limitations regarding interobserver agreement [1]. In order to overcome this limitation, CAD approaches based on image analysis have recently been developed.
Shahriar et al. designed a DL model to help improve the histological diagnosis of dysplasia. Slides from 542 patients were included in the study and divided into 3 categories: nondysplastic, low-grade dysplasia, and high-grade dysplasia. The model was trained and validated to identify dysplasia on images, reaching an 81.3% sensitivity and 100% specificity for low-grade dysplasia and values above 90% for nondysplastic Barrett’s esophagus and high-grade dysplasia [36].
Another retrospective study conducted at the Cancer Institute in Japan used a CNN to detect esophageal cancer early. This system had a 98% sensitivity and was able to distinguish between superficial cancer and advanced esophageal cancer with a 98% accuracy [37].
Regarding Barrett’s esophagus diagnosis, a hybrid ResNet-UNet model was developed by Albert et al. in order to detect neoplasia. The CAD system was able to differentiate and classify the images as either nondysplastic Barrett’s esophagus or as containing neoplasia. Its overall specificity, sensitivity, and accuracy were 88%, 90%, and 89%, respectively [38].
Another aid for determining the grade of dysplasia in Barrett’s esophagus is computerized morphometry. In a study conducted by Edmon et al., it was used to measure several indices of the epithelial nuclei, such as shape, size, texture, architectural distribution, and symmetry. The study therefore proposed computerized morphometry as a suitable tool for determining the grade of dysplasia and predicting progression to adenocarcinoma [39].
Volumetric laser endomicroscopy is a modern image-based system that can provide a high-resolution scan of the layers of the esophagus. Although it has high potential to improve the diagnosis of dysplasia in Barrett’s esophagus, its limitations regarding the amount of data needed for real-time interpretation have made it difficult to use. To overcome this limitation, Anne-Fré Swager et al. designed an algorithm using a clinical volumetric laser endomicroscopy prediction score as input. The algorithm had a sensitivity of 90% and a specificity of 93%, suggesting that an automatic algorithm for detecting early neoplasia has the potential to assist endoscopists [40].

4. Wireless Capsule Endoscopy

WCE allows the physician to visualize the small bowel. Although it is highly useful, making it possible to diagnose multiple abnormalities, such as mucosal pathology, bleeding, or polyps, WCE has limitations. The most important ones are linked to the large amount of data that needs to be analyzed: nearly 60,000 images and up to 8 h of video in a classic evaluation [3,41].
In keeping with these limitations, a study carried out by YuanPu Zheng et al. showed that the detection rate of abnormalities in a classic WCE evaluation is not, in fact, affected by the endoscopist’s experience [42].
At the moment, the software used with WCE can remove image frames that bring no information to the reader and can improve the reader’s efficiency by, for example, using color to locate the frames that contain blood.
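A crude illustration of this kind of color-based frame triage is sketched below (a toy heuristic on random frames; the threshold values and function are illustrative assumptions, not the vendor software): frames whose red content exceeds an arbitrary threshold are flagged for the reader.

```python
# Toy sketch: flag WCE frames whose red-color content is unusually high, as a crude
# way to locate frames that might contain blood.
import numpy as np

def red_ratio(frame: np.ndarray) -> float:
    """Fraction of pixels in an RGB frame where red clearly dominates green and blue."""
    r = frame[..., 0].astype(int)
    g = frame[..., 1].astype(int)
    b = frame[..., 2].astype(int)
    red_dominant = (r > 100) & (r > g + 40) & (r > b + 40)   # thresholds chosen arbitrarily
    return red_dominant.mean()

# A video becomes a list of frames; keep only the ones worth showing to the reader.
frames = [np.random.randint(0, 256, (256, 256, 3), dtype=np.uint8) for _ in range(100)]
suspicious = [i for i, f in enumerate(frames) if red_ratio(f) > 0.05]
print(f"{len(suspicious)} of {len(frames)} frames flagged for review")
```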
One of the limitations of the CAD systems that are usually used with WCE is that every time a new application for WCE appears, a new CAD system has to be designed. Santi Seguí et al. developed a system that uses a CNN that reaches almost 96% accuracy for six intestinal motility events [43]. The large number of images that WCE is able to provide can be used to create databases that serve future CAD system development. A study that included 12 endoscopy centers in France retrospectively selected videos from small bowel WCE analysis [44]. They included 4174 videos and extracted from them the ones that contained any pathological findings.
Regarding gastrointestinal bleeding detection by WCE, Xiao Jia et al. designed an automatic bleeding-detection system based on a CNN. They evaluated their method on 10,000 WCE images and found a 99.9% precision value [45]. Yixuan Yuan et al. proposed a novel learning method for the detection of polyps. Their system was based on the idea that images with similar features should share the same category. The method had a 98% overall accuracy for polyp, bubble, turbid, and clear images [46].
For the detection of angiectasia, the most common lesion of the small bowel, a CAD system using a CNN was tested, reaching a sensitivity of 100% and a specificity of 96% [44]. Two datasets of still frames were created for machine learning and algorithm testing. Comparable deep learning systems have also been reported for the detection of erosions, ulcers, and hookworms [47,48,49].

5. Endoscopic Ultrasound

EUS is a useful examination, especially for diagnosing pancreatic lesions and differentiating them from chronic pancreatitis. Unfortunately, only limited studies on deep learning systems are available at the moment.
One study by Maoling Zhu et al. evaluated 262 patients with chronic pancreatitis or pancreatic cancer. Computer-based techniques were used to select texture characteristics from specific regions of interest. A total of 105 characteristics in 9 categories were extracted from the EUS images. The overall accuracy, sensitivity, and specificity of the system were 94.2%, 96.25%, and 93.38%, respectively [50].
Ananya Das et al. used digital image analysis of EUS images to create a model able to differentiate between chronic pancreatitis and pancreatic cancer. The analysis was conducted on three groups of patients: one with a normal pancreas, another with chronic pancreatitis, and one with pancreatic adenocarcinoma. Although the number of patients enrolled was small (110 in the normal pancreas group, 99 in the chronic pancreatitis group, and 110 in the adenocarcinoma group), they concluded that direct image analysis of EUS images has high accuracy in differentiating between the three entities [34].
Another aspect that needs to be taken into consideration is the differentiation between autoimmune and chronic pancreatitis. A small study carried out by Jianwei Zhu et al. analyzed 181 cases, 81 with autoimmune pancreatitis and 100 with chronic pancreatitis [51]. They showed that with local ternary pattern variance, textural feature CAD of EUS imaging might be a valuable tool in differentiating autoimmune from chronic pancreatitis.
Another tool that is usefully combined with EUS to differentiate between chronic pancreatitis and pancreatic cancer, or to characterize a pancreatic mass, is elastography [52]. A cross-sectional study by Săftoiu et al. analyzed the accuracy of EUS combined with elastography for pancreatic lesions. A total of 68 patients were included in the study, of whom 22 had a normal pancreas, 11 had chronic pancreatitis, 32 had pancreatic adenocarcinoma, and 3 had neuroendocrine tumors. Hue histograms of each individual image were calculated from the EUS elastography movies. The data were then subjected to an extended neural network analysis in order to differentiate benign from malignant characteristics. In their study, the sensitivity and specificity for differentiating benign from malignant lesions were 91.4% and 97.9%, respectively [53].
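The general idea of hue-histogram features can be sketched as follows (an illustrative OpenCV/NumPy example on random frames, not the pipeline used by Săftoiu et al.): each elastography frame is reduced to a normalized hue histogram, and the averaged histogram becomes the input vector for a subsequent neural network.

```python
# Sketch: reduce an elastography movie to a hue-histogram feature vector.
import cv2
import numpy as np

def hue_histogram(frame_bgr: np.ndarray, bins: int = 32) -> np.ndarray:
    """Normalized histogram of the hue channel for one BGR frame."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([hsv], [0], None, [bins], [0, 180])   # channel 0 = hue
    return (hist / hist.sum()).ravel()

# Fake elastography movie: a list of BGR frames; average the per-frame histograms.
movie = [np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8) for _ in range(50)]
features = np.mean([hue_histogram(f) for f in movie], axis=0)
print("Feature vector length:", features.shape[0])   # one input vector per examination
```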

6. Radiology

AI has become a major part of diagnostic and therapeutic procedures in radiology due to its high applicability across various pathologies.
ML prototypes have been used to process a wide variety of images from computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound (US) [1].
More recently, radiomics, a term first described in 2012, has received great interest due to its capacity to reveal correlations with biological processes [54,55].

7. AI in Radiology—Applications in Gastrointestinal Pathology

7.1. Liver Disease

The prevalence of metabolic syndrome is continuously growing, and it is replacing the previously most frequent causes of liver cancer, hepatitis and alcoholic cirrhosis [56]. In an article from 2020 that provides an update on global cancer incidence, liver cancer ranks third among the causes of cancer death [57]. Taking this into consideration, proper diagnostic tools providing early detection of cancer and proper staging need to be used in order to provide the right treatment.
The most frequent liver diseases are hepatocellular carcinoma, non-alcoholic fatty liver disease (NAFLD), benign tumors, viral hepatitis, chronic liver disease, and primary sclerosing cholangitis. For the assessment of liver disease, AI and abdominal US have been used for both diffuse liver disease and focal liver lesions [58]. At the moment, liver biopsy represents the gold standard for the diagnosis of fibrosis and NAFLD. Although it has high sensitivity and specificity, the possible complications of this procedure (hemorrhage, peritonitis, pneumothorax) cannot be ignored. Therefore, there is a great need for alternative diagnostic techniques.
Ilias Gatos et al. evaluated and classified chronic liver disease on ultrasound shear-wave elastography using a CAD system. A total of 85 images were analyzed, from 54 healthy subjects and 31 patients with chronic liver disease. The accuracy of this model was 87%, with a sensitivity of 83.3% and a specificity of 89.1% [59].
Regarding the detection and characterization of focal lesions, Schmauch et al. created a DL algorithm able to perform these two tasks simultaneously. A total of 367 ultrasound images from 367 individuals were used. The algorithm was trained with annotations from a radiologist and then tested on 177 subjects. It reached high areas under the receiver operating characteristic curve of 0.93 and 0.916 for lesion detection and characterization, respectively [60].
Koichiro et al. used a DL method with a CNN to differentiate between liver masses on CT. A set of liver mass images was used over three phases: non-contrast-enhanced, arterial, and delayed. For supervised training, the masses were grouped into five categories: hepatocellular carcinomas; other malignant liver tumors; indeterminate or mass-like lesions and rare benign liver masses; hemangiomas; and cysts. After training, the CNN was tested on 100 liver masses, and the median accuracy of the differential diagnosis among liver masses was 0.84 [61].
MRI is also a useful tool for the classification of liver and pancreatic lesions, and several studies have evaluated the success of AI when combined with this type of imaging. Fan et al. classified distinct types of liver tissue in patients diagnosed with hepatocellular carcinoma using 3-D MRI images and a CNN. Their method was further tested on 20 patients, with encouraging results [62].
At the moment, only T2-weighted MRI sequences are used for the automatic grading of liver lesions. Mariëlle Jansen et al. conducted a supplementary analysis of MRI sequences for automatic classification. Their study included 95 patients, with a total of 125 benign lesions and 88 malignant lesions. DCE-MR and T2-weighted images were analyzed, with an overall accuracy of 0.77 [63].
Concerning the risk of developing cirrhosis in patients with hepatitis, James et al. reported the importance of identifying the viral genetic markers most strongly associated with the progression of fibrosis. In their study, several sites correlating with the rate of fibrosis progression were identified using ML techniques, linear projection, and Bayesian networks [64].
Primary sclerosing cholangitis is a liver disease characterized by inflammation and fibrosis of the intra- and extrahepatic bile ducts. Moreover, it is a premalignant condition, and effective medical treatment options are lacking. Eaton et al. conducted a study on 509 subjects and estimated the risks and outcomes of patients with primary sclerosing cholangitis. In order to estimate the risk of disease decompensation, nine variables were taken into consideration: patient age, bilirubin, serum alkaline phosphatase, albumin, AST, platelet count, hemoglobin, sodium, and the number of years since diagnosis. Using an ML technique, their tool was able to accurately predict hepatic decompensation [65].
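To make this kind of tabular ML setup concrete, the sketch below trains a risk model on synthetic values of variables like those listed above and reports a cross-validated AUC; the data, the gradient-boosting classifier, and the variable names are illustrative assumptions, not the published PREsTo model.

```python
# Illustrative sketch: a tabular ML classifier for predicting hepatic decompensation.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "age": rng.normal(45, 12, n),
    "bilirubin": rng.lognormal(0.2, 0.6, n),
    "alkaline_phosphatase": rng.normal(300, 120, n),
    "albumin": rng.normal(4.0, 0.5, n),
    "ast": rng.normal(60, 25, n),
    "platelets": rng.normal(230, 70, n),
    "hemoglobin": rng.normal(13.5, 1.5, n),
    "sodium": rng.normal(139, 3, n),
    "years_since_diagnosis": rng.uniform(0, 15, n),
})
decompensation = rng.integers(0, 2, n)              # synthetic outcome labels

model = GradientBoostingClassifier(random_state=0)
auc = cross_val_score(model, df, decompensation, cv=5, scoring="roc_auc").mean()
print(f"Cross-validated AUC on synthetic data: {auc:.2f}")   # ~0.5, since labels are random
```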
More recently, AI has been used to predict graft failure and, therefore, to address the problems of liver transplantation, such as high mortality on waiting lists, insufficient donors, and graft failure [66,67,68]. Other factors associated with death after transplantation, such as diabetes, have also been analyzed using AI [69].

7.2. Pancreatic Disease

Several pancreatic diseases have been investigated using AI, such as acute pancreatitis and its complications, chronic pancreatitis, pancreatic cystic neoplasms, and pancreatic ductal adenocarcinoma [70,71]. In acute and chronic pancreatitis, AI has been used to improve disease severity scores and prognostic models. AI has also been used for the detection, differentiation, and prediction of the malignant potential of pancreatic cystic neoplasms, and pancreatic ductal adenocarcinoma has been evaluated with AI by differentiating it from benign conditions [72]. Moreover, AI is useful in interpreting tissue samples.
Considering that pancreatic cancer is the seventh most lethal cancer worldwide and that five-year survival depends on lesion size, the challenge in this pathology is to use proper diagnostic tools in order to provide early detection and to identify high-risk patients [57,73].
Oleg et al. compared different algorithms for risk prediction in pancreatic cancer. Their study included 379 patients, and urine biomarkers (LYVE1, REG1B, TFF1) were analyzed. They established a biomarker-based risk score able to stratify patients at risk of developing pancreatic cancer [74].
CT is frequently used for the diagnosis of pancreatic cancer, although its sensitivity is not that high, particularly for small lesions [75]. In order to overcome this issue, Liu et al. designed a CT-based model for the early detection of pancreatic cancer. The model was applied to 300 normal scans and 136 pancreatic ductal adenocarcinoma cases and achieved a 90.2% specificity and an 80.2% sensitivity [76].
A more sensitive tool for the diagnosis of pancreatic cancer is EUS. Ozkan et al. developed a CAD system for diagnosing pancreatic cancer that uses EUS images [77]. EUS images were extracted from 202 patients with pancreatic cancer and 130 non-cancer patients. Their system reached an 83.3% sensitivity and 93.3% specificity [77].
Another system that uses real-time CAD for pancreatic masses from endoscopic ultrasound imaging was developed by Anca et al. Their system was based on a hybrid convolutional and long short-term memory neural network model. Their study included 65 patients with focal pancreatic masses, and from those, they selected 20 images. The model had a 98.26% accuracy [78].
Acute pancreatitis is a disease whose outcome depends on its severity. In order to establish proper treatment and monitoring, acute pancreatitis needs rapid and suitable risk classification. Bodil et al. used ANNs to develop a system able to predict the severity of acute pancreatitis. A total of 208 patients were included in their study, and severe pancreatitis was defined according to the Atlanta criteria. The ANN selected as risk variables the duration of pain, hemoglobin level, creatinine, heart rate, alanine aminotransferase, and white blood cell count. The system reached 50% sensitivity [79].
Another important pancreatic lesion is represented by pancreatic cystic neoplasms, which are precursor lesions of pancreatic cancer. Their slow progression to invasive carcinoma gives enough time to detect them and to use proper curative treatment. Currently, available technologies that are used in order to establish the risk of cancer are limited [80].
Jayasree et al. retrospectively analyzed pancreatic cysts and parenchymal regions on CT scans from 103 patients in order to predict the risk of intraductal papillary mucinous neoplasms (IPMNs). IPMNs were categorized as either low or high risk after resection. Tenfold cross-validation was used, combined with clinical variables, obtaining an area under the curve of 0.81 [81].
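A simplified sketch of this radiomics-plus-clinical-variables workflow is shown below, using synthetic regions of interest, GLCM texture features from scikit-image, and tenfold cross-validation; it is not the feature set or model of the cited study, and the variable names are illustrative.

```python
# Sketch: radiomics-style texture features from a CT region of interest (ROI), combined
# with a clinical variable and evaluated with tenfold cross-validation (synthetic data).
import numpy as np
from skimage.feature import graycomatrix, graycoprops
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

def texture_features(roi: np.ndarray) -> list:
    """Gray-level co-occurrence matrix (GLCM) contrast and homogeneity for one ROI."""
    glcm = graycomatrix(roi, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    return [graycoprops(glcm, "contrast")[0, 0], graycoprops(glcm, "homogeneity")[0, 0]]

rng = np.random.default_rng(0)
rois = [rng.integers(0, 256, (64, 64), dtype=np.uint8) for _ in range(120)]   # fake cyst ROIs
age = rng.normal(65, 10, 120)                                                 # clinical variable
X = np.array([texture_features(r) + [a] for r, a in zip(rois, age)])
y = rng.integers(0, 2, 120)                                                   # low vs. high risk

auc = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                      cv=10, scoring="roc_auc").mean()
print(f"Tenfold cross-validated AUC (synthetic data): {auc:.2f}")
```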

8. Discussion

AI is a promising tool for diagnosis, prognosis, and treatment in the field of gastroenterology and hepatology. Although many studies have evaluated the specificity and sensitivity of AI systems with promising results, to date only a few devices have been approved [82]. Some of them are used in endoscopy, such as EndoBRAIN-EYE, EndoBRAIN, WISE VISION, WavSTAT4, and GI Genius, which are designed to detect colonic tumors [83,84,85,86]. Moreover, EndoBRAIN-Plus can estimate tumor invasion depth [83]. The CAD EYE and Discovery systems are able to assist the endoscopist in detecting colonic polyps, thereby raising the adenoma detection rate [87].
For CT, Liver AI was designed in order to detect liver lesions. Similar systems are Poseidon and Ultrasound, which are used for ultrasonography [82].
AI has promising applications in endoscopic techniques and may, in the future, gradually reduce the need for biopsies, which are currently the gold standard for a large variety of lesions. Moreover, the implementation of ML systems improves the quality of lesion detection in WCE, a laborious but very useful technique for the evaluation of the small intestine.
Pancreatic cancer has lately become an intensively studied pathology due to its high mortality caused by late diagnosis. The only efficient treatment in these cases is surgery, but only about 20% of patients benefit from it [88]. The diagnostic tools available at the moment are US, EUS, CT, MRI, and positron emission tomography-CT [89]. Of these, EUS seems to have the highest sensitivity and specificity for detecting pancreatic lesions [90]. The main problem of EUS is that the diagnosis relies heavily on the specialist’s experience. AI helps to overcome this problem by assisting professionals in this field to detect abnormalities. Dumitrescu et al. analyzed the sensitivity and specificity of AI in a meta-analysis that included 10 studies [89]. In their analysis, the diagnostic accuracy obtained with AI did not vary widely, in keeping with the literature. Moreover, EUS seems to have the highest sensitivity in detecting lesions of 3 cm, which represents an important step in early diagnosis. Compared to MRI and CT, which have 67% and 53% sensitivity, respectively, EUS has 94.4% sensitivity [91].
Another aspect that AI addresses is non-variceal upper gastrointestinal bleeding, which is a cause of high mortality. Recently, Ungureanu et al. assessed the use of an ANN for the prediction of mortality in patients presenting with non-variceal upper gastrointestinal bleeding. Their study included 914 patients, and the analysis was performed using the Rockall, Glasgow-Blatchford, and AIMS65 scores. Their ANN was able to predict mortality with an accuracy of >95%, which was higher than that of the three scores analyzed individually [92].
The European Society of Gastrointestinal Endoscopy (ESGE) position statement on AI, published in October 2022 and focused on the diagnosis and management of gastrointestinal neoplasia, stated that, in order to be implemented in a clinical setting, AI should ensure a high-quality standard for both the diagnosis and the treatment of gastrointestinal neoplasia [93]. For the diagnosis of potential lesions, AI should enhance the performance of less experienced endoscopists rather than that of more experienced ones, thereby increasing the detection rate. The ESGE advises against high expectations that AI will replace histopathologic examination of polyps in the future: AI should not replace histopathologic examination but should help endoscopists make the right decisions concerning colorectal polyps [93]. Moreover, their recommendation for future research is to compare the performance of less experienced endoscopists assisted by AI with that of more experienced ones.
Regarding the limitations of artificial intelligence, the most important ones are that further studies are still needed to evaluate its efficiency in larger numbers of patients and that implementing CADe systems is expensive, so trial programs are needed before purchasing them.
On 12 December 2022, a study carried out by Ahmad et al. concluded that the polyp detection rate on colonoscopy was significantly higher using CADe. The study was performed by 8 experienced endoscopists in a cancer screening program that included 614 patients who were randomized into either a CADe or control group. Although ADR was not significantly higher using CADe (2.4 versus 2.1 per colonoscopy), the polyp detection rate was higher in the CADe group (85.7% versus 79.7%) [94].
More recently, on 16 December 2022, Ladabaum et al. published a study evaluating the results of a CADe program used for 3 months at one center, the largest in the study, with another 5 units serving as controls. At the center that used CADe, the ADR was 40.1%, which was lower than the 41.8% observed at the control sites [95]. Considering that this result differs from those of multiple randomized controlled trials, it may suggest that other factors, such as motivation and training, are also important in the process.
To conclude, there is no doubt that AI might be called “the rising star” of the moment in the field of medicine, offering both physicians and patients new perspectives regarding diagnosis, prognosis, and treatment decisions, but further studies are still needed.

Author Contributions

Conceptualization, A.J., I.F.T., B.P., and M.S.-I.; validation, V.S., O.-M.P., and E.M.R.; writing—original draft preparation, A.J.; writing—review and editing, V.S., O.-M.P.; visualization, G.C., B.P., A.E.C., A.C.B.; supervision, M.S.-I., G.C., B.P.; project administration, I.F.T., G.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Berbís, M.A.; Aneiros-Fernández, J.; Mendoza Olivares, F.J.; Nava, E.; Luna, A. Role of Artificial Intelligence in Multidisciplinary Imaging Diagnosis of Gastrointestinal Diseases. World J. Gastroenterol. 2021, 27, 4395–4412. [Google Scholar] [CrossRef]
  2. Artificial Intelligence Committee. House of Lords-AI in the UK: Ready, Willing and Able? Available online: https://Publications.Parliament.Uk/Pa/Ld201719/Ldselect/Ldai/100/10005.Html (accessed on 2 January 2021).
  3. Pannala, R.; Krishnan, K.; Melson, J.; Parsi, M.A.; Schulman, A.R.; Sullivan, S.; Trikudanathan, G.; Trindade, A.J.; Watson, R.R.; Maple, J.T.; et al. Artificial Intelligence in Gastrointestinal Endoscopy. VideoGIE 2020, 5, 598–613. [Google Scholar] [CrossRef]
  4. Chartrand, G.; Cheng, P.M.; Vorontsov, E. Deep Learning: A Primer for Radiologists. Radiographics 2017, 37, 2113–2131. [Google Scholar] [CrossRef]
  5. Byrne, M.F.; Chapados, N.; Soudan, F.; Oertel, C.; Linares Pérez, M.; Kelly, R.; Iqbal, N.; Chandelier, F.; Rex, D.K. Real-Time Differentiation of Adenomatous and Hyperplastic Diminutive Colorectal Polyps during Analysis of Unaltered Videos of Standard Colonoscopy Using a Deep Learning Model. Gut 2019, 68, 94–100. [Google Scholar] [CrossRef]
  6. Wang, P.; Berzin, T.M.; Glissen Brown, J.R.; Bharadwaj, S.; Becq, A.; Xiao, X.; Liu, P.; Li, L.; Song, Y.; Zhang, D.; et al. Real-Time Automatic Detection System Increases Colonoscopic Polyp and Adenoma Detection Rates: A Prospective Randomised Controlled Study. Gut 2019, 68, 1813–1819. [Google Scholar] [CrossRef]
  7. Corley, D.A.; Levin, T.R.; Doubeni, C.A. Adenoma Detection Rate and Risk of Colorectal Cancer and Death. N. Engl. J. Med. 2014, 370, 2541. [Google Scholar] [CrossRef] [PubMed]
  8. Fernández-Esparrach, G.; Bernal, J.; López-Cerón, M.; Córdova, H.; Sánchez-Montes, C.; de Miguel, C.R.; Sánchez, F.J. Exploring the Clinical Potential of an Automatic Colonic Polyp Detection Method Based on the Creation of Energy Maps. Endoscopy 2016, 48, 837–842. [Google Scholar] [CrossRef] [PubMed]
  9. Wang, A.; Mo, J.; Zhong, C.; Wu, S.; Wei, S.; Tu, B.; Liu, C.; Chen, D.; Xu, Q.; Cai, M.; et al. Artificial Intelligence-Assisted Detection and Classification of Colorectal Polyps under Colonoscopy: A Systematic Review and Meta-Analysis. Ann. Transl. Med. 2021, 9, 1662. [Google Scholar] [CrossRef]
  10. Takemura, Y.; Yoshida, S.; Tanaka, S. Quantitative Analysis and Development of a Computer-Aided System for Identification of Regular Pit Patterns of Colorectal Lesions. Gastrointest. Endosc. 2010, 72, 1047–1051. [Google Scholar] [CrossRef] [PubMed]
  11. Tischendorf, J.J.; Gross, S.; Winograd, R.; Hecker, H.; Auer, R.; Behrens, A.; Trautwein, C.; Aach, T.; Stehle, T. Computer-Aided Classification of Colorectal Polyps Based on Vascular Patterns: A Pilot Study. Endoscopy 2010, 42, 203–207. [Google Scholar] [CrossRef] [PubMed]
  12. Mori, Y.; Kudo, S.E.; Misawa, M. Real-Time Use of Artificial Intelligence in Identification of Diminutive Polyps during Colonoscopy: A Prospective Study. Ann. Intern. Med. 2018, 169, 357–366. [Google Scholar] [CrossRef]
  13. Mori, Y.; Kudo, S.E.; Wakamura, K.; Misawa, M.; Ogawa, Y.; Kutsukawa, M.; Kudo, T.; Hayashi, T.; Miyachi, H.; Ishida, F.; et al. Novel Computer-Aided Diagnostic System for Colorectal Lesions by Using Endocytoscopy (with Videos). Gastrointest. Endosc. 2015, 81, 621–629. [Google Scholar] [CrossRef]
  14. Takemura, Y.; Yoshida, S.; Tanaka, S.; Kawase, R.; Onji, K.; Oka, S.; Tamaki, T.; Raytchev, B.; Kaneda, K.; Yoshihara, M.; et al. Computer-Aided System for Predicting the Histology of Colorectal Tumors by Using Narrow-Band Imaging Magnifying Colonoscopy (with Video). Gastrointest. Endosc. 2012, 75, 179–185. [Google Scholar] [CrossRef]
  15. Abad, M.R.A.; Shimamura, Y.; Fujiyoshi, Y.; Seewald, S.; Inoue, H. Endocytoscopy: Technology and Clinical Application in Upper Gastrointestinal Tract. Transl. Gastroenterol. Hepatol. 2020, 5, 28. [Google Scholar] [CrossRef]
  16. Mori, Y.; Kudo, S.E.; Chiu, P.W.; Singh, R.; Misawa, M.; Wakamura, K.; Kudo, T.; Hayashi, T.; Katagiri, A.; Miyachi, H.; et al. Impact of an Automated System for Endocytoscopic Diagnosis of Small Colorectal Lesions: An International Web-Based Study. Endoscopy 2016, 48, 1110–1118. [Google Scholar] [CrossRef] [PubMed]
  17. Cothren, R.M.; Sivak, M.V.; Van Dam, J.; Petras, R.E.; Fitzmaurice, M.; Crawford, J.M.; Wu, J.; Brennan, J.F.; Rava, R.P.; Manoharan, R.; et al. Detection of Dysplasia at Colonoscopy Using Laser-Induced Fluorescence: A Blinded Study. Gastrointest. Endosc. 1996, 44, 168–176. [Google Scholar] [CrossRef]
  18. Richards-Kortum, R.; Rava, R.P.; Petras, R.E.; Fitzmaurice, M.; Sivak, M.; Feld, M.S. Spectroscopic Diagnosis of Colonic Dysplasia. Photochem. Photobiol. 1991, 53, 777–786. [Google Scholar] [CrossRef] [PubMed]
  19. Wyllie, R.; Hyams, J.S.; Kay, M. Pediatric Gastrointestinal and Liver Disease, 6th ed.; Elsevier Health Sciences: Philadelphia, PA, USA, 2021. [Google Scholar]
  20. Efthymiou, M.; Allen, P.B.; Taylor, A.C.; Desmond, P.V.; Jayasakera, C.; De Cruz, P.; Kamm, M.A. Chromoendoscopy versus Narrow Band Imaging for Colonic Surveillance in Inflammatory Bowel Disease. Inflamm. Bowel. Dis. 2013, 19, 2132–2138. [Google Scholar] [CrossRef]
  21. André, B.; Vercauteren, T.; Buchner, A.M.; Krishna, M.; Ayache, N.; Wallace, M.B. Software for Automated Classification of Probe-Based Confocal Laser Endomicroscopy Videos of Colorectal Polyps. World J. Gastroenterol. 2012, 18, 5560–5569. [Google Scholar] [CrossRef]
  22. Ikematsu, H.; Yoda, Y.; Matsuda, T. Long-Term Outcomes after Resection for Submucosal Invasive Colorectal Cancers. Gastroenterology 2013, 144, 551–559. [Google Scholar] [CrossRef] [PubMed]
  23. Yoda, Y.; Ikematsu, H.; Matsuda, T. A Large-Scale Multicenter Study of Long-Term Outcomes after Endoscopic Resection for Submucosal Invasive Colorectal Cancer. Endoscopy 2013, 45, 718–724. [Google Scholar] [CrossRef] [PubMed]
  24. Ferlitsch, M.; Moss, A.; Hassan, C. Colorectal Polypectomy and Endoscopic Mucosal Resection (EMR): European Society of Gastrointestinal Endoscopy (ESGE) Clinical Guideline. Endoscopy 2017, 49, 270–297. [Google Scholar] [CrossRef] [PubMed]
  25. Backes, Y.; Moss, A.; Reitsma, J.B. Narrow Band Imaging, Magnifying Chromoendoscopy, and Gross Morphological Features for the Optical Diagnosis of T1 Colorectal Cancer and Deep Submucosal Invasion: A Systematic Review and Meta-Analysis. Am. J. Gastroenterol. 2017, 112, 54–64. [Google Scholar] [CrossRef]
  26. Takeda, K.; Kudo, S.E.; Mori, Y.; Misawa, M.; Kudo, T.; Wakamura, K.; Katagiri, A.; Baba, T.; Hidaka, E.; Ishida, F.; et al. Accuracy of Diagnosing Invasive Colorectal Cancer Using Computer-Aided Endocytoscopy. Endoscopy 2017, 49, 798–802. [Google Scholar] [CrossRef]
  27. Maeda, Y.; Kudo, S.E.; Mori, Y.; Misawa, M.; Ogata, N.; Sasanuma, S.; Wakamura, K.; Oda, M.; Mori, K.; Ohtsuka, K. Fully Automated Diagnostic System with Artificial Intelligence Using Endocytoscopy to Identify the Presence of Histologic Inflammation Associated with Ulcerative Colitis (with Video). Gastrointest. Endosc. 2019, 89, 408–415. [Google Scholar] [CrossRef] [PubMed]
  28. Takenaka, K.; Ohtsuka, K.; Fujii, T.; Negi, M.; Suzuki, K.; Shimizu, H.; Oshima, S.; Akiyama, S.; Motobayashi, M.; Nagahori, M.; et al. Development and Validation of a Deep Neural Network for Accurate Evaluation of Endoscopic Images From Patients with Ulcerative Colitis. Gastroenterology 2020, 158, 2150–2157. [Google Scholar] [CrossRef]
  29. Klang, E.; Barash, Y.; Margalit, R.Y.; Soffer, S.; Shimon, O.; Albshesh, A.; Ben-Horin, S.; Amitai, M.M.; Eliakim, R.; Kopylov, U. Deep Learning Algorithms for Automated Detection of Crohn’s Disease Ulcers by Video Capsule Endoscopy. Gastrointest. Endosc. 2020, 91, 606–613. [Google Scholar] [CrossRef] [PubMed]
  30. Klein, S.; Gildenblat, J.; Ihle, M.A.; Merkelbach-Bruse, S.; Noh, K.W.; Peifer, M.; Quaas, A.; Büttner, R. Deep Learning for Sensitive Detection of Helicobacter Pylori in Gastric Biopsies. BMC Gastroenterol. 2020, 20, 417. [Google Scholar] [CrossRef]
  31. Steinbuss, G.; Kriegsmann, K.; Kriegsmann, M. Identification of Gastritis Subtypes by Convolutional Neuronal Networks on Histological Images of Antrum and Corpus Biopsies. Int. J. Mol. Sci. 2020, 21, 6652. [Google Scholar] [CrossRef]
  32. Hirasawa, T.; Aoyama, K.; Tanimoto, T.; Ishihara, S.; Shichijo, S.; Ozawa, T.; Ohnishi, T.; Fujishiro, M.; Matsuo, K.; Fujisaki, J.; et al. Application of Artificial Intelligence Using a Convolutional Neural Network for Detecting Gastric Cancer in Endoscopic Images. Gastric. Cancer 2018, 21, 653–660. [Google Scholar] [CrossRef] [Green Version]
  33. Zhu, Y.; Wang, Q.C.; Xu, M.D.; Zhang, Z.; Cheng, J.; Zhong, Y.S.; Zhang, Y.Q.; Chen, W.F.; Yao, L.Q.; Zhou, P.H.; et al. Application of Convolutional Neural Network in the Diagnosis of the Invasion Depth of Gastric Cancer Based on Conventional Endoscopy. Gastrointest. Endosc. 2019, 89, 806–815. [Google Scholar] [CrossRef] [PubMed]
  34. Das, A.; Nguyen, C.C.; Li, F.; Li, B. Digital Image Analysis of EUS Images Accurately Differentiates Pancreatic Cancer from Chronic Pancreatitis and Normal Tissue. Gastrointest. Endosc. 2008, 67, 861–867. [Google Scholar] [CrossRef] [PubMed]
  35. Hammoud, Z.T.; Kesler, K.A.; Ferguson, M.K.; Battafarrano, R.J.; Bhogaraju, A.; Hanna, N.; Govindan, R.; Mauer, A.A.; Yu, M.; Einhorn, L.H. Survival Outcomes of Resected Patients Who Demonstrate a Pathologic Complete Response after Neoadjuvant Chemoradiation Therapy for Locally Advanced Esophageal Cancer. Dis. Esophagus 2006, 19, 69–72. [Google Scholar] [CrossRef] [PubMed]
  36. Faghani, S.; Codipilly, D.C.; Vogelsang, D.; Moassefi, M.; Rouzrokh, P.; Khosravi, B.; Agarwal, S.; Dhaliwal, L.; Katzka, D.A.; Hagen, C.; et al. Development of a Deep Learning Model for the Histologic Diagnosis of Dysplasia in Barrett’s Esophagus. Gastrointest. Endosc. 2022, 96, 918–925. [Google Scholar] [CrossRef]
  37. Horie, Y.; Yoshio, T.; Aoyama, K.; Yoshimizu, S.; Horiuchi, Y.; Ishiyama, A.; Hirasawa, T.; Tsuchida, T.; Ozawa, T.; Ishihara, S.; et al. Diagnostic Outcomes of Esophageal Cancer by Artificial Intelligence Using Convolutional Neural Networks. Gastrointest. Endosc. 2019, 89, 25–32. [Google Scholar] [CrossRef] [PubMed]
  38. de Groof, A.J.; Struyvenberg, M.R.; van der Putten, J.; van der Sommen, F.; Fockens, K.N.; Curvers, W.L.; Zinger, S.; Pouw, R.E.; Coron, E.; Baldaque-Silva, F.; et al. Deep-Learning System Detects Neoplasia in Patients with Barrett’s Esophagus with Higher Accuracy Than Endoscopists in a Multistep Training and Validation Study with Benchmarking. Gastroenterology 2020, 158, 915–929. [Google Scholar] [CrossRef]
  39. Sabo, E.; Beck, A.H.; Montgomery, E.A.; Bhattacharya, B.; Meitner, P.; Wang, J.Y.; Resnick, M.B. Computerized Morphometry as an Aid in Determining the Grade of Dysplasia and Progression to Adenocarcinoma in Barrett’s Esophagus. Lab. Investig. 2006, 86, 1261–1271. [Google Scholar] [CrossRef] [PubMed]
  40. Swager, A.F.; van der Sommen, F.; Klomp, S.R.; Zinger, S.; Meijer, S.L.; Schoon, E.J.; Bergman, J.J.G.H.M.; de With, P.H.; Curvers, W.L. Computer-Aided Detection of Early Barrett’s Neoplasia Using Volumetric Laser Endomicroscopy. Gastrointest. Endosc. 2017, 86, 839–846. [Google Scholar] [CrossRef]
  41. Wang, A.; Banerjee, S.; Barth, B.A. Wireless Capsule Endoscopy. Gastrointest. Endosc. 2013, 78, 805–815. [Google Scholar] [CrossRef]
  42. Zheng, Y.; Hawkins, L.; Wolff, J.; Goloubeva, O.; Goldberg, E. Detection of Lesions during Capsule Endoscopy: Physician Performance Is Disappointing. Am. J. Gastroenterol. 2012, 107, 554–560. [Google Scholar] [CrossRef]
  43. Segui, S.; Drozdzal, M.; Pascual, G. Generic Feature Learning for Wireless Capsule Endoscopy Analysis. Comput. Biol. Med. 2016, 79, 163–172. [Google Scholar] [CrossRef]
  44. Leenhardt, R.; Li, C.; Le Mouel, J.P.; Rahmi, G.; Saurin, J.C.; Cholet, F.; Boureille, A.; Amiot, X.; Delvaux, M.; Duburque, C.; et al. CAD-CAP: A 25,000-Image Database Serving the Development of Artificial Intelligence for Capsule Endoscopy. Endosc. Int. Open 2020, 8, E415–E420. [Google Scholar] [CrossRef]
  45. Jia, X.; Meng, M.Q. A Deep Convolutional Neural Network for Bleeding Detection in Wireless Capsule Endoscopy Images. In Proceedings of the 2016 38th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Orlando, FL, USA, 16–20 August 2016; pp. 639–642. [Google Scholar]
  46. Yuan, Y.; Meng, M.Q. Deep Learning for Polyp Recognition in Wireless Capsule Endoscopy Images. Med. Phys. 2017, 44, 1379–1389. [Google Scholar] [CrossRef]
  47. Aoki, T.; Yamada, A.; Aoyama, K. Automatic Detection of Erosions and Ulcerations in Wireless Capsule Endoscopy Images Based on a Deep Convolutional Neural Network. Gastrointest. Endosc. 2019, 89, 357–363. [Google Scholar] [CrossRef]
  48. Fan, S.; Xu, L.; Fan, Y. Computer-Aided Detection of Small Intestinal Ulcer and Erosion in Wireless Capsule Endoscopy Images. Phys. Med. Biol. 2018, 63, 165001. [Google Scholar] [CrossRef]
  49. He, J.Y.; Wu, X.; Jiang, Y.G. Hookworm Detection in Wireless Capsule Endoscopy Images with Deep Learning. IEEE Trans. Image Process. 2018, 27, 2379–2392. [Google Scholar] [CrossRef] [PubMed]
  50. Zhu, M.; Xu, C.; Yu, J.; Wu, Y.; Li, C.; Zhang, M.; Jin, Z.; Li, Z. Differentiation of Pancreatic Cancer and Chronic Pancreatitis Using Computer-Aided Diagnosis of Endoscopic Ultrasound (EUS) Images: A Diagnostic Test. PLoS ONE 2013, 8, e63820. [Google Scholar] [CrossRef]
  51. Zhu, J.; Wang, L.; Chu, Y.; Hou, X.; Xing, L.; Kong, F.; Zhou, Y.; Wang, Y.; Jin, Z.; Li, Z. A New Descriptor for Computer-Aided Diagnosis of EUS Imaging to Distinguish Autoimmune Pancreatitis from Chronic Pancreatitis. Gastrointest. Endosc. 2015, 82, 831–836. [Google Scholar] [CrossRef] [PubMed]
  52. Giovannini, M. Endoscopic Ultrasound Elastography. Pancreatology 2011, 11 (Suppl. S2), 34–39. [Google Scholar] [CrossRef] [PubMed]
  53. Săftoiu, A.; Vilmann, P.; Gorunescu, F.; Gheonea, D.I.; Gorunescu, M.; Ciurea, T.; Popescu, G.L.; Iordache, A.; Hassan, H.; Iordache, S. Neural Network Analysis of Dynamic Sequences of EUS Elastography Used for the Differential Diagnosis of Chronic Pancreatitis and Pancreatic Cancer. Gastrointest. Endosc. 2008, 68, 1086–1094. [Google Scholar] [CrossRef]
  54. Lambin, P.; Rios-Velazquez, E.; Leijenaar, R.; Carvalho, S.; van Stiphout, R.G.; Granton, P.; Zegers, C.M.; Gillies, R.; Boellard, R.; Dekker, A.; et al. Radiomics: Extracting More Information from Medical Images Using Advanced Feature Analysis. Eur. J. Cancer 2012, 48, 441–446. [Google Scholar] [CrossRef]
  55. Gillies, R.J.; Kinahan, P.E.; Hricak, H. Radiomics: Images Are More than Pictures, They Are Data. Radiology 2016, 278, 563–577. [Google Scholar] [CrossRef] [PubMed]
Figure 1. NBI visualization of colon polyps.
Table 1. Applications of CAD and AI in endoscopic procedures.

Lower digestive tract endoscopy: polyp detection; polyp classification; detection of malignancy in polyps; inflammatory bowel disease (ulcerative colitis and Crohn's disease).
Upper digestive tract endoscopy: diagnosis of Helicobacter pylori infection; inflammatory gastric disease (autoimmune, bacterial, and chemical chronic gastritis); gastric cancer; esophageal cancer and premalignant conditions (Barrett's esophagus).
Wireless capsule endoscopy: angiectasia; polyps; erosions/ulcers; hookworms.
Endoscopic ultrasound: chronic pancreatitis; pancreatic cancer; autoimmune pancreatitis; EUS elastography.