Article

Intraprocedure Artificial Intelligence Alert System for Colonoscopy Examination

Chen-Ming Hsu, Chien-Chang Hsu, Zhe-Ming Hsu, Tsung-Hsing Chen and Tony Kuo

1 Department of Gastroenterology and Hepatology, Taoyuan Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
2 Department of Gastroenterology and Hepatology, Linkou Chang Gung Memorial Hospital, Taoyuan 333, Taiwan
3 College of Medicine, Chang Gung University, Taoyuan 333, Taiwan
4 Department of Computer Science and Information Engineering, Fu-Jen Catholic University, New Taipei 242, Taiwan
* Author to whom correspondence should be addressed.
Sensors 2023, 23(3), 1211; https://doi.org/10.3390/s23031211
Submission received: 11 November 2022 / Revised: 13 January 2023 / Accepted: 18 January 2023 / Published: 20 January 2023
(This article belongs to the Special Issue Vision- and Image-Based Biomedical Diagnostics)

Abstract

Colonoscopy is a valuable tool for preventing and reducing the incidence and mortality of colorectal cancer. Although several computer-aided colorectal polyp detection and diagnosis systems have been proposed for clinical application, many remain susceptible to interference, including low image clarity, unevenness, and low accuracy in the analysis of dynamic images; these drawbacks affect the robustness and practicality of such systems. This study proposed an intraprocedure alert system for colonoscopy examination developed on the basis of deep learning. The proposed system features blurred image detection, foreign body detection, and polyp detection modules facilitated by convolutional neural networks. The training and validation datasets comprised high-quality images and low-quality images, the latter including blurred images and images containing folds, fecal matter, and opaque water. For the detection of blurred images and of images containing folds, fecal matter, and opaque water, the accuracy rate was 96.2%. Furthermore, the study results indicated a per-polyp detection accuracy of 100% when the system was applied to video images. The recall rates for high-quality image frames and polyp image frames were 95.7% and 92%, respectively. The overall alert accuracy rate and the false-positive rate for low-quality video image frames obtained through per-frame analysis were 95.3% and 0.18%, respectively. The proposed system can alert colonoscopists to the need to slow their procedural speed or to perform flushing or lumen inflation when the colonoscope is being moved too rapidly, when fecal residue is present in the intestinal tract, or when the colon has been inadequately distended.

1. Introduction

Colonoscopy is a valuable tool for detecting colorectal diseases. Chromoendoscopy is often used in the diagnosis of polyps through the enhancement of the color, vascular structure, and surface morphology of polyp lesions [1]. Generally, chromoendoscopy is divided into dye- and equipment-based types [2]. Equipment-based chromoendoscopy, or virtual chromoendoscopy, is the type that is most extensively used in current clinical practice; it involves narrowband imaging, flexible spectral imaging color enhancement, i-scan, and blue laser imaging [3]. When used in conjunction with magnifying colonoscopy, such equipment-based techniques can help to accurately distinguish non-neoplastic and neoplastic lesions, help predict the depth of invasion, and assist endoscopists in using the correct treatment methods [4], thereby effectively reducing the incidence and mortality of colorectal cancer [5].
Patients undergoing colonoscopy are required to maintain a low-residue diet and take laxatives to empty their bowels of excrement to allow the colonoscopists to have a clear view of the intestinal mucosa. While performing a colonoscopy, a colonoscopist inserts a colonoscope into the patient’s anus and further guides it into their bowel lumen, passing through the rectum, sigmoid colon, descending colon, transverse colon, and ascending colon before finally reaching the cecum. The colonoscopist then slowly pulls back the colonoscope from the cecum and may push it forward or pull it farther back as necessary to examine the mucosa in detail. Polypectomy or biopsy may be performed if a polyp or lesion is detected. Among the quality indicators of colonoscopy [6], the adenoma detection rate (ADR), quality of bowel cleansing, withdrawal time, cecal intubation rate, and complete polypectomy rate are closely correlated with the occurrence of colorectal cancer after colonoscopy [7]. However, considerable variance in ADRs may occur among endoscopists, which could, in turn, diminish the clinical benefits of colonoscopy.
Several deep learning–based systems have been formulated for the diagnosis and detection of colorectal polyps [8,9,10,11,12,13,14]. For example, Chen et al. [15] used deep neural networks to distinguish narrowband images of neoplastic and hyperplastic polyps at the Tri-Service General Hospital in Taiwan. Park et al. [16] used a convolutional neural network (CNN) to classify polyps captured in colonoscopic images. Shin et al. [17] proposed a method based on a region-based CNN (R-CNN) that applies false-positive learning and offline learning for polyp detection. Ren et al. [18] used an R-CNN to segment polyps in images. Wang et al. [19] used a segmentation network called SegNet to screen for polyps. Zheng et al. [20] employed a you only look once (YOLO) model to detect polyps in white-light and narrowband images. Hsu et al. [3] used grayscale images and a CNN-based deep network to enhance features in colonoscopic images, detect the location of colorectal polyps, and identify polyp types. Nogueira-Rodríguez et al. [21] used an updated version of the YOLO model, YOLOv3, for real-time polyp detection. Li et al. [22] collected publicly available endoscopic images, including those in the MICCAI 2017, CVC colon DB, GLRC, and KUMC datasets; the researchers extracted the polyp images of interest and employed multiple deep learning models (including Faster R-CNN, YOLOv3, RetinaNet, DetNet, RefineDet, YOLOv4, and adaptive training sample selection) for polyp detection and identification. However, many of the systems proposed in these studies rely on carefully selected, manually captured images or require magnified images for model training, validation, and testing. The resulting models may therefore be fragile, have low accuracy, or yield excessive false positives, making them difficult to apply in clinical settings [23].
The clarity of images of the colorectal mucosa strongly affects the quality of colonoscopy. The factors behind low colonoscopic image quality include inadequate bowel preparation, insufficient air or carbon dioxide insufflation, lens fogging or a colonoscopic lens stained with fecal matter, and blurred images (Figure 1). In addition, a patient not complying with the instruction to consume a low-residue diet or not taking their prescribed dose of bowel-cleansing medication can result in the presence of residual fecal material or water in their colorectal lumen, and insufficient air or carbon dioxide insufflation may lead to poor inflation of the lumen, which may hinder the clear identification of abnormalities. Finally, lens fogging or a colonoscopic lens stained by fecal matter or tissue fluid can hinder image recognition.
Blurred images refer to images rendered out of focus because of motion resulting from the rapid withdrawal of the colonoscope or from withdrawal with an unsteady hand, either of which can prevent the accurate observation and identification of lesions. Colonoscopists may miss lesions because of low image quality, and machine vision systems, whose accuracy is particularly dependent on image quality, may perform much worse if the problem of poor image quality is not addressed. Hassan et al. [24] reported an average rate of 3.2 false positives when a computer-aided polyp detection system was used in colonoscopy. Most of the false positives were due to bowel content, artifacts from the mucosal wall, air bubbles, fecal water, or blood in the intestinal tract. Another factor behind these false-positive cases was computer-aided diagnoses made on the basis of images obtained under insufficient air insufflation. Rutter et al. [25] suggested that patients may still develop colorectal cancer, despite having undergone screening; they reported that 2.5–7.7% of patients had colorectal cancers within 3–5 years after receiving a colonoscopy, primarily due to inaccurate screening results.
Sharp and clear images are necessary for the computer-aided detection and identification of colorectal polyps and lesions. The task of searching for and identifying the shape, texture, and morphology of protrusions in the intestinal tract is akin to identifying foreign object damage (FOD) on an airport runway, which severely threatens flight and passenger safety [26,27]. Some FOD detection systems used in airports worldwide are based on optical detection with deep learning or hybrid techniques. It is therefore crucial to establish a robust intraprocedure artificial intelligence (AI) alert system to reduce the burden on clinicians performing colonoscopy.
This study proposed an alert system for colonoscopic examinations that uses deep learning to alert the colonoscopist when an image is blurred because of rapid movement or when fecal matter or water in the lumen, insufficient air inflation, or a polyp is detected. Colonoscopists should pay attention to any abnormalities during examination to avoid missing lesions and to improve the quality of colonoscopy. A CNN model was used to determine when an alert message ought to be sent. The experimental results revealed that the number of polyps identified in each case by the proposed system was the same as the number of polyps detected by endoscopists through per-polyp analysis. In addition, the sensitivity rates for the detection of high-quality images and the detection of polyps were 95.7% and 92%, respectively, and the false alarm rate for low-quality images was 0.18%. The proposed system can be employed by clinicians to improve the quality of colonoscopies.
The remainder of this paper is organized as follows: Section 2 describes the materials and methods; Section 3 details the evaluation experiments; Section 4 discusses the results; and Section 5 concludes the paper.

2. Materials and Methods

The training and validation datasets used for the proposed system were obtained from the PolypsSet dataset [22] and Chang Gung Memorial Hospital (Table 1). In total, 3750 low-quality images and 2500 high-quality images were selected by experienced colonoscopists from the experimental datasets. The low-quality images included blurred images and images that contained folds, fecal matter, and opaque water. High-quality images were defined as images with clear and well-distended colon lumen and with no fecal residue or opaque fluid. Each image had a resolution of 640 × 480 pixels and was a TIF file. Among the low-quality images, the number of blurred images and the number of images containing folds, fecal matter, and opaque fluid were 2500 and 1250, respectively. In addition, the number of high-quality images containing polyps and the number of those containing no polyps were 1250 and 1250, respectively. The test dataset was derived from six videos (Table 2) that had been obtained for this study from Linkou Chang Gung Memorial Hospital. Each video was in MKV format, lasted approximately 15 min, and displayed 30 frames per second. The colonoscope model was CF-H290L/I, which featured a 170° angle of view, a forward-viewing direction of view, and a depth of field of 5–100 mm. After the images were de-identified and all of the non-intestinal information was cropped from the images, the images had a resolution of 720 × 960 pixels. After the deletion of the first 3–10 min portion of each video, which showed the insertion of the colonoscope into the cecum, the remaining footage was employed for polyp detection and identification at 3 frames per second. Among the dynamic images obtained from the videos, the number of blurred images and the number of images containing folds, fecal matter, and opaque water were 8716 and 1967, respectively, and the numbers of high-quality normal images and polyp images were 399 and 50, respectively. All of the videos featured one polyp, except for Case #1, which featured two polyps (Table 3). Polyp detection was performed using a CNN model for classification, and the training dataset comprised 612 images from the CVC-ClinicDB dataset and 500 images from the PolypsSet dataset (Table 4).
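The frame-sampling step described above (30 fps videos sampled at 3 frames per second, with non-intestinal information cropped away) can be sketched as follows. This is a minimal illustration, assuming OpenCV; the crop coordinates and function name are placeholders, as the exact preprocessing code is not published.

```python
import cv2

def sample_frames(video_path, target_fps=3, crop=None):
    """Sample frames from a colonoscopy video at roughly target_fps.

    crop: optional (y0, y1, x0, x1) region that removes non-intestinal
    information; the coordinates used in the study are not published,
    so any values passed here are placeholders.
    """
    cap = cv2.VideoCapture(video_path)
    native_fps = cap.get(cv2.CAP_PROP_FPS) or 30  # study videos were 30 fps
    step = max(int(round(native_fps / target_fps)), 1)
    frames, idx = [], 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        if idx % step == 0:
            if crop is not None:
                y0, y1, x0, x1 = crop
                frame = frame[y0:y1, x0:x1]
            frames.append(frame)
        idx += 1
    cap.release()
    return frames

# Example: keep every 10th frame of a 30 fps recording (about 3 fps).
# frames = sample_frames("case01.mkv", target_fps=3)
```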
Figure 2 shows the architecture of the proposed intraprocedure alert system, which provides blurred image detection, foreign body detection, and polyp detection. Blurred image detection identifies images that are blurred because of camera shake, overly rapid withdrawal of the colonoscope, or a lens stained with fecal matter or opaque fluid. Foreign body detection flags abnormal protrusions or contents in the lumen, such as haustral folds and creases, fecal residue, and opaque fluid. Finally, polyp detection identifies polyp protrusions in the colon lumen [3]. All of these functions are provided by the proposed CNN deep learning model. Figure 3 and Table 5 present the proposed CNN architecture for the detection of blurred images, fecal matter, opaque water, and colon folds. Figure 4 and Table 6 present the polyp detection architecture, comprising feature extraction layers and a bounding box transformation layer for the result output. Notably, polyp detection was performed on images from the six videos to verify the effectiveness of the system in identifying false alerts after low-quality images were excluded.
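The per-frame decision flow implied by Figure 2 can be summarized as in the sketch below. It assumes the three modules are applied in sequence, with the polyp detector run only on frames that pass both quality checks; the function and model names are illustrative, not taken from the paper.

```python
def process_frame(frame, blur_model, foreign_body_model, polyp_model):
    """Return the alert (or polyp detections) for a single colonoscopy frame.

    Assumed module order: blurred-image detection, foreign body detection,
    then polyp detection on frames that pass both quality checks.
    """
    if blur_model.predict(frame) == "blurred":
        return {"alert": "Image blurred: slow the withdrawal or clean the lens."}
    if foreign_body_model.predict(frame) == "folds_fecal_water":
        return {"alert": "Folds, fecal matter, or opaque water: flush or inflate the lumen."}
    boxes = polyp_model.detect(frame)  # list of bounding boxes, possibly empty
    if boxes:
        return {"alert": "Polyp detected.", "boxes": boxes}
    return {"alert": None}
```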
The size of each input image is expressed as width (W) × height (H) × filter number (N). All images were resized to fit the input specification of the CNN deep learning model. We employed convolution, batch normalization, rectified linear unit (ReLU), and max pooling operations for feature extraction [3]. Table 5 and Table 6 list the filters, size/stride, and output image size of each operation. A classification conversion layer was used to distinguish blurred images from non-polyp foreign body images, and fully connected, softmax, and classification output layers were used for classification.
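As an illustration of the layer pattern in Table 5 (convolution, batch normalization, ReLU, and max pooling blocks followed by fully connected, softmax, and classification output layers), the following is a minimal sketch of an equivalent classifier. The paper does not state the framework used; PyTorch is assumed here, the filter counts simply follow Table 5, and the final pooling is a simplification.

```python
import torch
import torch.nn as nn

def conv_block(in_ch, out_ch):
    # Convolution -> Batch Normalization -> ReLU -> Max Pooling, as in Table 5.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.MaxPool2d(kernel_size=2, stride=2),
    )

class QualityClassifier(nn.Module):
    """Binary frame classifier (e.g., blurred vs. good-quality frame).

    Filter counts follow Table 5 (16, 16, 32, 32, 32, 32, 32); the training
    configuration is not specified in the paper.
    """
    def __init__(self, num_classes=2):
        super().__init__()
        channels = [3, 16, 16, 32, 32, 32, 32, 32]
        self.features = nn.Sequential(
            *[conv_block(channels[i], channels[i + 1]) for i in range(7)]
        )
        self.pool = nn.AdaptiveAvgPool2d(1)   # collapse the remaining spatial grid
        self.classifier = nn.Linear(channels[-1], num_classes)

    def forward(self, x):
        x = self.features(x)
        x = self.pool(x).flatten(1)
        return self.classifier(x)             # softmax is applied in the loss

# Example: one 480 x 640 RGB frame.
# logits = QualityClassifier()(torch.randn(1, 3, 480, 640))
```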

3. Experimental Results

Table 7 and Table 8 present the training and validation datasets for blurred image detection and foreign body detection, which comprised 5000 and 2500 images, respectively. Table 9 and Table 10 present the corresponding confusion matrices for blurred image and foreign body detection, where “TP,” “FN,” “TN,” and “FP” denote true positive, false negative, true negative, and false positive, respectively. Five-fold cross-validation was used to verify performance. Table 11 presents the formulas for the classification performance indices, and Table 12 presents the classification performance for blurred image and foreign body detection. For the detection of blurred images, the accuracy, precision, recall, F1-measure, and F2-measure were 96.2%, 98.8%, 93.6%, 96.1%, and 94.6%, respectively. For the detection of images containing folds, fecal matter, and opaque water, these values were 96.2%, 97.5%, 94.8%, 96.1%, and 95.3%, respectively. Among the factors behind the false positives were an increased concentration of fecal matter or water near a polyp and the presence of a crease next to a tiny polyp, both of which tended to cause the system to issue an alert for fecal matter or colon folds (Figure 5 and Figure 6).
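For reference, the indices reported above follow directly from the confusion-matrix counts in Table 9 and Table 10 via the formulas in Table 11. A minimal sketch of that calculation is shown below (the helper function is illustrative); the resulting values agree with Table 12 up to rounding.

```python
def classification_metrics(tp, fn, fp, tn):
    """Compute the Table 11 indices from confusion-matrix counts."""
    acc = (tp + tn) / (tp + fp + tn + fn)
    prec = tp / (tp + fp)
    rec = tp / (tp + fn)
    f1 = 2 * prec * rec / (prec + rec)
    f2 = 5 * prec * rec / (4 * prec + rec)
    return {"Acc": acc, "Prec": prec, "Rec": rec, "F1": f1, "F2": f2}

# Blurred image detection (Table 9): TP=468, FN=32, FP=6, TN=494.
print(classification_metrics(468, 32, 6, 494))
# Foreign body detection (Table 10): TP=237, FN=13, FP=6, TN=244.
print(classification_metrics(237, 13, 6, 244))
```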
Table 13 and Table 14 present the numbers of polyps detected after blurred images and images containing colon folds, fecal matter, or opaque water were excluded, along with the identification accuracy for high-quality normal colon images and polyp images. The numbers of polyps, high-quality normal image frames, and polyp image frames were 7, 399, and 50, respectively. The number of polyps identified in each case by the proposed system was the same as the number of polyps detected by endoscopists through per-polyp analysis. The recall rates for high-quality image frames and polyp image frames were 95.7% and 92%, respectively. The overall alert accuracy rate and the false-positive rate for the low-quality dynamic images obtained through per-frame analysis were 95.3% and 0.18%, respectively.

4. Discussion

Nearly 70% of colorectal adenocarcinomas develop from conventional adenomas, with the remaining 30% developing from sessile serrated polyps [28]. The progression from polyp to adenocarcinoma generally occurs over 5–10 years [29], meaning that the incidence and mortality rates of colorectal cancer can be reduced through appropriate screening strategies. Several studies have indicated that appropriately performed screening colonoscopy and post-polypectomy colonoscopy surveillance can substantially decrease the incidence and mortality of colorectal cancer [5,30,31]. However, this protective effect is likely to be considerably compromised by low colonoscopy quality, which could increase the risk of post-colonoscopy colorectal cancer (PCCRC) [32].
Colorectal lesion detection is markedly affected by the quality of the colonoscopic image obtained, which is evaluated in terms of colon cleanliness, the clarity of mucosal images, and the degree of bowel distension [33]. Colon cleanliness refers to the degree of bowel cleanliness required for the careful examination of the mucosa after fecal water and residue have been suctioned. Although colon cleanliness is evaluated as excellent, good, fair, or poor, the differences between these levels are not governed by standardized criteria [34]. An alternative method is to assign quantitative scores based on the cleanliness of each region of the colon (e.g., the Boston Bowel Preparation Scale and the Ottawa Bowel Preparation Scale); however, these scoring systems are rather complicated [35,36]. Poor colon cleanliness prolongs the procedure and may lead to missed colorectal cancers or polyps. In addition, the requirement for early follow-up colonoscopy (within one year) because of poor colon cleanliness increases the economic burden on patients, medical institutions, and society as a whole [37]. A cleanliness score calculated after colonoscopy cannot improve a patient’s bowel cleanliness during the current examination. Therefore, we developed an alert system that provides real-time feedback on bowel cleanliness to the colonoscopist so that irrigation can be used to clean the bowel lumen and improve the quality of the colonoscopy. Nevertheless, further clinical trials are required to verify whether this system can improve the detection rate of colorectal adenomas.
In addition to poor bowel preparation, another factor that may affect the clarity of mucosal images is the rapid withdrawal of the colonoscope, which can result in blurred images. According to multiple studies, physicians whose colonoscope withdrawal time during a normal inspection is 6 min or longer have a considerably higher ADR than physicians whose withdrawal time is less than 6 min [38]. In addition, retrospective studies have confirmed that the longer the colonoscope withdrawal time, the lower the risk of PCCRC [39]. Blurred images may cause physicians to miss colorectal cancers or polyps; therefore, alerting colonoscopists to unclear mucosal images may remind them to withdraw the colonoscope more slowly or to move it back and forth to observe unclear regions, thus improving the examination quality. One clinical trial indicated that the use of the ENDOANGEL AI system, which features a built-in monitoring function for withdrawal speed, increases the ADR by 8% [40]. Another cause of blurred images is unsteady movement of the colonoscope. Automatic quality-control AI systems can help monitor the stability of colonoscope movement and alert physicians when an image is blurred [41]. One clinical trial revealed that such systems can increase the ADR by up to 12.4%.
Optimal colon distension is a prerequisite for colonoscopists to examine every part of the mucosa in detail. Several studies have confirmed that physicians who adequately distend the lumen have a lower miss rate for colorectal adenomas [34]. A study comparing conventional colonoscopy with virtual colonoscopy indicated that conventional colonoscopy missed approximately 11% of polyps, of which nearly 4% were adenomas larger than 6 mm [42]. Most of these missed adenomas were located on the proximal side of a fold or near the anal orifice. Therefore, optimal distension can improve the ADR, and the real-time alert system proposed in the present study could alert endoscopists to the need to distend the lumen when a region is insufficiently inflated; such distension could, in turn, reduce the risk of clinicians failing to spot lesions.
The results of our study revealed that the proposed system could effectively alert endoscopists to low-quality images or poor colon preparation, prompting them to focus the image, clean the bowel, or inflate the lumen for detailed examination. This system reduces the likelihood of clinicians failing to spot colon polyps. The main limitation of this study was that only retrospectively recorded videos from a single medical center were used. Therefore, a large-scale prospective multicenter clinical trial is needed to validate the efficacy of the proposed system in increasing the colon polyp detection rate.

5. Conclusions

This study proposed an intraprocedure AI alert system for colonoscopy examination. Using feature extraction and classification with a CNN model, the system can identify blurred images, instances of inadequate bowel cleansing, and instances of insufficient air insufflation during colonoscopies. The system then alerts the clinician to the need to correct or pay greater attention to specific elements during the examination in order to reduce the loss of crucial information and improve the reliability of the examination. The main novel feature of our study was the detection of low-quality images and foreign bodies in the intestinal lumen to alert endoscopists and thus achieve higher-quality colonoscopy examinations. Our experimental results indicated that blurred image and foreign body detection can prevent misjudgments and yield accurate polyp detection. In addition, our dynamic image sampling method showed that using just three or four images from each second of footage for detection can yield accurate results; this means that our method is computationally lightweight.
Several polyp detection and classification systems can reduce the false-positive rate by enlarging training datasets. However, these systems still encounter challenges in the presence of colon folds, fecal matter, or water during examination. Our experimental results indicated that blurred image and foreign body detection can prevent misjudgments and be used to accurately detect polyps. Furthermore, thanks to our dynamic image sampling method, the use of just three or four images from each second of footage for detection can reduce the information processing load and thus lower the hardware requirements for image processing.

Author Contributions

Conceptualization, C.-M.H. and C.-C.H.; methodology, C.-C.H. and Z.-M.H.; collection, classification and annotation of colon images (C.-M.H., C.-C.H. and T.K.); writing—original draft preparation, C.-M.H. and C.-C.H.; writing—review and editing, C.-M.H., C.-C.H., T.-H.C. and T.K.; funding acquisition, C.-M.H. and C.-C.H. All authors have read and agreed to the published version of the manuscript.

Funding

The grant supporting this work was obtained from the National Science and Technology Council of the Republic of China, Taiwan (Contract No. MOST 111-2221-E-030-018). The funders had no role in the study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Institutional Review Board of Chang Gung Medical Foundation (protocol IRB No.: 202200778B0 and date of approval 30 May 2022).

Informed Consent Statement

Not applicable.

Acknowledgments

We are very grateful to the reviewers for their valuable comments that have helped to improve the paper.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Galloro, G.; Ruggiero, S.; Russo, T.; Saunders, B. Recent advances to improve the endoscopic detection and differentiation of early colorectal neoplasia. Color. Dis. 2015, 17, 25–30.
2. Kaltenbach, T.; Sano, Y.; Friedland, S.; Soetikno, R.; American Gastroenterological Association. American Gastroenterological Association (AGA) Institute technology assessment on image-enhanced endoscopy. Gastroenterology 2008, 134, 327–340.
3. Hsu, C.M.; Hsu, C.C.; Hsu, Z.M.; Shih, F.Y.; Chang, M.L.; Chen, T.H. Colorectal Polyp Image Detection and Classification through Grayscale Images and Deep Learning. Sensors 2021, 21, 5995.
4. Suzuki, H.; Yamamura, T.; Nakamura, M.; Hsu, C.M.; Su, M.Y.; Chen, T.H.; Chiu, C.T.; Hirooka, Y.; Goto, H. An International Study on the Diagnostic Accuracy of the Japan Narrow-Band Imaging Expert Team Classification for Colorectal Polyps Observed with Blue Laser Imaging. Digestion 2020, 101, 339–346.
5. Zauber, A.G.; Winawer, S.J.; O’Brien, M.J.; Lansdorp-Vogelaar, I.; Ballegooijen, M.V.; Hankey, B.F.; Shi, W.; Bond, J.H.; Schapiro, M.; Panish, J.F.; et al. Colonoscopic Polypectomy and Long-Term Prevention of Colorectal-Cancer Deaths. N. Engl. J. Med. 2012, 366, 687–696.
6. Rex, D.K.; Schoenfeld, P.S.; Cohen, J.; Pike, I.M.; Adler, D.G.; Fennerty, M.B.; Lieb, J.G., 2nd; Park, W.G.; Rizk, M.K.; Sawhney, M.S.; et al. Quality indicators for colonoscopy. Am. J. Gastroenterol. 2015, 110, 72–90.
7. May, F.P.; Shaukat, A. State of the Science on Quality Indicators for Colonoscopy and How to Achieve Them. Am. J. Gastroenterol. 2020, 115, 1183–1190.
8. Korbar, B.; Olofson, A.M.; Miraflor, A.P.; Nicka, C.M.; Suriawinata, M.A.; Torresani, L.; Suriawinata, A.A.; Hassanpour, S. Looking under the Hood: Deep Neural Network Visualization to Interpret Whole-Slide Image Analysis Outcomes for Colorectal Polyps. In Proceedings of the 2017 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW), Honolulu, HI, USA, 21–26 July 2017; pp. 821–827.
9. Liu, Q. Deep Learning Applied to Automatic Polyp Detection in Colonoscopy Images. Master’s Thesis, University College of Southeast Norway, Notodden, Norway, 2017. Available online: http://hdl.handle.net/11250/2449603 (accessed on 1 January 2022).
10. Chao, W.L.; Manickavasagan, H.; Krishna, S.G. Application of Artificial Intelligence in the Detection and Differentiation of Colon Polyps: A Technical Review for Physicians. Diagnostics 2019, 9, 99.
11. Barua, I.; Vinsard, D.G.; Jodal, H.C.; Loberg, M.; Kalager, M.; Holme, O.; Misawa, M.; Bretthauer, M.; Mori, Y. Artificial intelligence for polyp detection during colonoscopy: A systematic review and meta-analysis. Endoscopy 2021, 53, 277–284.
12. Jin, P.; Ji, X.; Kang, W.; Li, Y.; Liu, H.; Ma, F.; Ma, S.; Hu, H.; Li, W.; Tian, Y. Artificial intelligence in gastric cancer: A systematic review. J. Cancer Res. Clin. Oncol. 2020, 146, 2339–2350.
13. Sanchez-Peralta, L.F.; Bote-Curiel, L.; Picon, A.; Sanchez-Margallo, F.M.; Pagador, J.B. Deep learning to find colorectal polyps in colonoscopy: A systematic literature review. Artif. Intell. Med. 2020, 108, 101923.
14. Parmar, R.; Martel, M.; Rostom, A.; Barkun, A.N. Validated Scales for Colon Cleansing: A Systematic Review. Am. J. Gastroenterol. 2016, 111, 197–204.
15. Chen, C.-W. Real-time Colorectal Polyp Segmentation with Deep Learning in NBI and WL Colonoscopy. Master’s Thesis, National Taiwan University, Taipei, Taiwan, 2021.
16. Park, S.Y. Colonoscopic Polyp Detection Using Convolutional Neural Networks. In Proceedings of the Medical Imaging 2016: Computer-Aided Diagnosis, San Diego, CA, USA, 27 February–March 2016; p. 978528.
17. Shin, Y.; Qadir, H.A.; Aabakken, L.; Bergsland, J.; Balasingham, I. Automatic Colon Polyp Detection Using Region Based Deep CNN and Post Learning Approaches. IEEE Access 2018, 6, 40950–40962.
18. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards Real-Time Object Detection with Region Proposal Networks. In Proceedings of the 28th International Conference on Neural Information Processing Systems (NIPS’15), Montreal, QC, Canada, 7–12 December 2015; pp. 91–99.
19. Wang, P.; Xiao, X.; Glissen Brown, J.R.; Berzin, T.M.; Tu, M.; Xiong, F.; Hu, X.; Liu, P.; Song, Y.; Zhang, D.; et al. Development and validation of a deep-learning algorithm for the detection of polyps during colonoscopy. Nat. Biomed. Eng. 2018, 2, 741–748.
20. Zheng, Y.; Yu, R.; Jiang, Y.; Mak, T.W.C.; Wong, S.H.; Lau, J.Y.W.; Poon, C.C.Y. Localisation of Colorectal Polyps by Convolutional Neural Network Features Learnt from White Light and Narrow Band Endoscopic Images of Multiple Databases. In Proceedings of the 40th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Honolulu, HI, USA, 18–21 July 2018; pp. 4142–4145.
21. Nogueira-Rodríguez, A.; Domínguez-Carbajales, R.; Campos-Tato, F.; Herrero, J.; Puga, M.; Remedios, D.; Rivas, L.; Sánchez, E.; Iglesias, Á.; Cubiella, J.; et al. Real-time polyp detection model using convolutional neural networks. Neural Comput. Appl. 2022, 34, 10375–10396.
22. Li, K.; Fathan, M.I.; Patel, K.; Zhang, T.; Zhong, C.; Bansal, A.; Rastogi, A.; Wang, J.S.; Wang, G. Colonoscopy polyp detection and classification: Dataset creation and comparative evaluations. PLoS ONE 2021, 16, e0255809.
23. Viscaino, M.; Torres Bustos, J.; Munoz, P.; Auat Cheein, C.; Cheein, F.A. Artificial intelligence for the early detection of colorectal cancer: A comprehensive review of its advantages and misconceptions. World J. Gastroenterol. 2021, 27, 6399–6414.
24. Hassan, C.; Badalamenti, M.; Maselli, R.; Correale, L.; Iannone, A.; Radaelli, F.; Rondonotti, E.; Ferrara, E.; Spadaccini, M.; Alkandari, A.; et al. Computer-aided detection-assisted colonoscopy: Classification and relevance of false positives. Gastrointest. Endosc. 2020, 92, 900–904.e4.
25. Rutter, M.D.; Beintaris, I.; Valori, R.; Chiu, H.M.; Corley, D.A.; Cuatrecasas, M.; Dekker, E.; Forsberg, A.; Gore-Booth, J.; Haug, U.; et al. World Endoscopy Organization Consensus Statements on Post-Colonoscopy and Post-Imaging Colorectal Cancer. Gastroenterology 2018, 155, 909–925.e3.
26. Liu, Y.; Li, Y.; Liu, J.; Peng, X.; Zhou, Y.; Murphey, Y.L. FOD Detection using DenseNet with Focal Loss of Object Samples for Airport Runway. In Proceedings of the IEEE Symposium Series on Computational Intelligence (SSCI), Bangalore, India, 18–21 November 2018; pp. 547–554.
27. Xu, H.; Han, Z.; Feng, S.; Zhou, H.; Fang, Y. Foreign object debris material recognition based on convolutional neural networks. EURASIP J. Image Video Process. 2018, 2018, 21.
28. Snover, D.C. Update on the serrated pathway to colorectal carcinoma. Hum. Pathol. 2011, 42, 1–10.
29. Hisabe, T.; Hirai, F.; Matsui, T. Development and progression of colorectal cancer based on follow-up analysis. Dig. Endosc. 2014, 26, 73–77.
30. Winawer, S.J.; Zauber, A.G.; Ho, M.N.; O’Brien, M.J.; Gottlieb, L.S.; Sternberg, S.S.; Waye, J.D.; Schapiro, M.; Bond, J.H.; Panish, J.F.; et al. Prevention of colorectal cancer by colonoscopic polypectomy. The National Polyp Study Workgroup. N. Engl. J. Med. 1993, 329, 1977–1981.
31. Nishihara, R.; Wu, K.; Lochhead, P.; Morikawa, T.; Liao, X.; Qian, Z.R.; Inamura, K.; Kim, S.A.; Kuchiba, A.; Yamauchi, M.; et al. Long-term colorectal-cancer incidence and mortality after lower endoscopy. N. Engl. J. Med. 2013, 369, 1095–1105.
32. Haug, U.; Regula, J. Interval cancer: Nightmare of colonoscopists. Gut 2014, 63, 865–866.
33. Rex, D.K. Colonoscopic withdrawal technique is associated with adenoma miss rates. Gastrointest. Endosc. 2000, 51, 33–36.
34. Aronchick, C.A.; Lipshutz, W.H.; Wright, S.H.; Dufrayne, F.; Bergman, G. A novel tableted purgative for colonoscopic preparation: Efficacy and safety comparisons with Colyte and Fleet Phospho-Soda. Gastrointest. Endosc. 2000, 52, 346–352.
35. Lai, E.J.; Calderwood, A.H.; Doros, G.; Fix, O.K.; Jacobson, B.C. The Boston bowel preparation scale: A valid and reliable instrument for colonoscopy-oriented research. Gastrointest. Endosc. 2009, 69, 620–625.
36. Rostom, A.; Jolicoeur, E. Validation of a new scale for the assessment of bowel preparation quality. Gastrointest. Endosc. 2004, 59, 482–486.
37. Johnson, D.A.; Barkun, A.N.; Cohen, L.B.; Dominitz, J.A.; Kaltenbach, T.; Martel, M.; Robertson, D.J.; Boland, C.R.; Giardello, F.M.; Lieberman, D.A.; et al. Optimizing adequacy of bowel cleansing for colonoscopy: Recommendations from the US multi-society task force on colorectal cancer. Gastroenterology 2014, 147, 903–924.
38. Barclay, R.L.; Vicari, J.J.; Doughty, A.S.; Johanson, J.F.; Greenlaw, R.L. Colonoscopic Withdrawal Times and Adenoma Detection during Screening Colonoscopy. N. Engl. J. Med. 2006, 355, 2533–2541.
39. Shaukat, A.; Rector, T.S.; Church, T.R.; Lederle, F.A.; Kim, A.S.; Rank, J.M.; Allen, J.I. Longer Withdrawal Time Is Associated with a Reduced Incidence of Interval Cancer After Screening Colonoscopy. Gastroenterology 2015, 149, 952–957.
40. Gong, D.; Wu, L.; Zhang, J.; Mu, G.; Shen, L.; Liu, J.; Wang, Z.; Zhou, W.; An, P.; Huang, X.; et al. Detection of colorectal adenomas with a real-time computer-aided system (ENDOANGEL): A randomised controlled study. Lancet Gastroenterol. Hepatol. 2020, 5, 352–361.
41. Su, J.R.; Li, Z.; Shao, X.J.; Ji, C.R.; Ji, R.; Zhou, R.C.; Li, G.C.; Liu, G.Q.; He, Y.S.; Zuo, X.L.; et al. Impact of a real-time automatic quality control system on colorectal polyp and adenoma detection: A prospective randomized controlled study (with videos). Gastrointest. Endosc. 2020, 91, 415–424.e4.
42. Pickhardt, P.J.; Nugent, P.A.; Mysliwiec, P.A.; Choi, J.R.; Schindler, W.R. Location of adenomas missed by optical colonoscopy. Ann. Intern. Med. 2004, 141, 352–359.
Figure 1. Poor colonoscopic image quality types: (a) out-of-focus image; (b) camera shake; (c) insufficient air insufflation resulting in a false positive.
Figure 2. System architecture.
Figure 3. Alert system architecture.
Figure 4. Polyp detection architecture.
Figure 5. Polyp erroneously signaled as fecal matter or water.
Figure 6. Polyp erroneously signaled as a colon fold.
Table 1. Training and validation image dataset.

Image Type                        Image #
Blurred image                     2500
Folds/fecal matter and water      1250
Good-quality image: polyp         1250
Good-quality image: normal        1250
Total                             6250
Table 2. Dynamic image test dataset.

Image Type                        Image #
Blurred image                     8716
Folds/fecal matter and water      1967
Good-quality image: polyp         50
Good-quality image: normal        399
Total                             11,132
Table 3. Number of polyps.

Case #    Polyp #
1         2
2         1
3         1
4         1
5         1
6         1
Table 4. Polyp-detection training dataset.

Dataset                   Image #    Size
CVC-ClinicDB (training)   612        384 × 288
PolypsSet (training)      500        640 × 480
Total                     1112
Table 5. CNN model of alert system.

Layers                   Filters (N)   Size/Stride   Output (W × H)
Image Input                                          480 × 640
Convolution              16            3 × 3         480 × 640
Batch Normalization                                  480 × 640
ReLU                                                 480 × 640
Max Pooling                            2 × 2/2       240 × 320
Convolution              16            3 × 3         240 × 320
Batch Normalization                                  240 × 320
ReLU                                                 240 × 320
Max Pooling                            2 × 2/2       480 × 640
Convolution              32            3 × 3         480 × 640
Batch Normalization                                  480 × 640
ReLU                                                 480 × 640
Max Pooling                            2 × 2/2       120 × 160
Convolution              32            3 × 3         120 × 160
Batch Normalization                                  120 × 160
ReLU                                                 120 × 160
Max Pooling                            2 × 2/2       60 × 80
Convolution              32            3 × 3         60 × 80
Batch Normalization                                  60 × 80
ReLU                                                 60 × 80
Max Pooling                            2 × 2/2       30 × 40
Convolution              32            3 × 3         30 × 40
Batch Normalization                                  30 × 40
ReLU                                                 30 × 40
Max Pooling                            2 × 2/2       15 × 20
Convolution              32            3 × 3         15 × 20
Batch Normalization                                  15 × 20
ReLU                                                 15 × 20
Max Pooling                            2 × 2/2       7 × 10
Fully Connected                                      7 × 10
Softmax                                              7 × 10
Classification Output                                7 × 10
Table 6. CNN model of polyp detection.

Layers                   Filters (N)   Size/Stride               Output (W × H)
Image Input                                                      128 × 128
Convolution              32            3 × 3 + 3 × 1 + 1 × 3     128 × 128
Batch Normalization                                              128 × 128
ReLU                                                             128 × 128
Max Pooling                            2 × 2/2                   64 × 64
Convolution              64            3 × 3 + 3 × 1 + 1 × 3     64 × 64
Batch Normalization                                              64 × 64
ReLU                                                             64 × 64
Max Pooling                            2 × 2/2                   32 × 32
Convolution              128           3 × 3 + 3 × 1 + 1 × 3     32 × 32
Batch Normalization                                              32 × 32
ReLU                                                             32 × 32
Max Pooling                            2 × 2/2                   16 × 16
Convolution              256           3 × 3 + 3 × 1 + 1 × 3     16 × 16
Batch Normalization                                              16 × 16
ReLU                                                             16 × 16
Max Pooling                            2 × 2/2                   8 × 8
Convolution              256           3 × 3 + 3 × 1 + 1 × 3     8 × 8
Batch Normalization                                              8 × 8
ReLU                                                             8 × 8
Convolution              256           3 × 3 + 3 × 1 + 1 × 3     8 × 8
Batch Normalization                                              8 × 8
ReLU                                                             8 × 8
Convolution              256           3 × 3 + 3 × 1 + 1 × 3     8 × 8
Batch Normalization                                              8 × 8
ReLU                                                             8 × 8
Convolution              24            1 × 1/1                   8 × 8
Transform                                                        8 × 8
Output
Table 7. Training and validation datasets for blurred image detection and classification.

Image Type                        Training Set #    Validation Set #    Total #
Blurred image                     2000              500                 2500
Good-quality image: polyp         1000              250                 1250
Good-quality image: normal        1000              250                 1250
Subtotal                          4000              1000                5000
Table 8. Training and validation datasets for foreign body detection and classification.

Image Type                           Training Set #    Validation Set #    Total #
Folds/fecal matter and water image   1000              250                 1250
Good-quality image: polyp            500               125                 625
Good-quality image: normal           500               125                 625
Subtotal                             2000              500                 2500
Table 9. Validation results for blurred image detection.

                               Blurred Image (Predicted)    Good-Quality Image (Predicted)
Blurred image (Actual)         468 (TP)                     32 (FN)
Good-quality image (Actual)    6 (FP)                       494 (TN)
Table 10. Validation results for detection of colon folds and fecal matter or water.

                                              Folds/Fecal Matter and Water (Predicted)    Good-Quality Image (Predicted)
Folds/fecal matter and water image (Actual)   237 (TP)                                    13 (FN)
Good-quality image (Actual)                   6 (FP)                                      244 (TN)
Table 11. Performance index.

Accuracy (Acc):     Acc = (TP + TN) / (TP + FP + TN + FN)
Precision (Prec):   Prec = TP / (TP + FP)
Recall (Rec):       Rec = TP / (TP + FN)
F1-measure (F1):    F1 = (2 × Prec × Rec) / (Prec + Rec)
F2-measure (F2):    F2 = (5 × Prec × Rec) / (4 × Prec + Rec)
Table 12. Performance index for blurred image and foreign body detection.

                           Acc (%)    Prec (%)    Rec (%)    F1 (%)    F2 (%)
Blurred image detection    96.2       98.8        93.6       96.1      94.6
Foreign body detection     96.2       97.5        94.8       96.1      95.3
Table 13. Total number of polyps.

Case #    Actual Polyp #    Predicted Polyp #
1         2                 2
2         1                 1
3         1                 1
4         1                 1
5         1                 1
6         1                 1
Total     7                 7
Table 14. Recall and false alarm rate for detection of image quality and polyps by per-frame analysis.

                    Good-Quality Image    Polyp Image    Total
Image #             399                   50             449
Predicted #         382                   46             428
Recall (%)          95.7                  92             95.3
False alarm rate    21/11,132 = 0.0018 = 0.18%