Artificial Intelligence in Urologic Robotic Oncologic Surgery: A Narrative Review
Simple Summary
Abstract
1. Introduction
2. Methods
3. Main Text
3.1. Machine Learning and Deep Learning in Artificial Intelligence
3.2. Evaluating Surgical Expertise
3.3. Addressing the Issue of Haptic Feedback
3.4. Exploring the Intersection of Artificial Intelligence and Logistics
3.5. Preoperative Identification of Exact Anatomy
3.6. Intraoperative Application of Artificial Intelligence in Surgical Procedures
3.7. Postoperative Outcomes Prediction in Surgical Procedures
4. Current Limitations and Implications for Future Research
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Andras, I.; Mazzone, E.; van Leeuwen, F.W.B.; De Naeyer, G.; van Oosterom, M.N.; Beato, S.; Buckle, T.; O'Sullivan, S.; van Leeuwen, P.J.; Beulens, A.; et al. Artificial intelligence and robotics: A combination that is changing the operating room. World J. Urol. 2019, 38, 2359–2366.
- Beckmann, J.S.; Lew, D. Reconciling evidence-based medicine and precision medicine in the era of big data: Challenges and opportunities. Genome Med. 2016, 8, 134.
- Yu, K.H.; Beam, A.L.; Kohane, I.S. Artificial intelligence in healthcare. Nat. Biomed. Eng. 2018, 2, 719–731.
- Tran, B.X.; Vu, G.T.; Ha, G.H.; Vuong, Q.-H.; Ho, M.-T.; Vuong, T.-T.; La, V.-P.; Ho, M.-T.; Nghiem, K.-C.P.; Nguyen, H.L.T.; et al. Global Evolution of Research in Artificial Intelligence in Health and Medicine: A Bibliometric Study. J. Clin. Med. 2019, 8, 360.
- Yildirim, M.; Bingol, H.; Cengil, E.; Aslan, S.; Baykara, M. Automatic Classification of Particles in the Urine Sediment Test with the Developed Artificial Intelligence-Based Hybrid Model. Diagnostics 2023, 13, 1299.
- Moustris, G.P.; Hiridis, S.C.; Deliparaschos, K.M.; Konstantinidis, K.M. Evolution of autonomous and semi-autonomous robotic surgical systems: A review of the literature. Int. J. Med. Robot. Comput. Assist. Surg. 2011, 7, 375–392.
- Roehrborn, C.G.; Teplitsky, S.; Das, A.K. Aquablation of the prostate: A review and update. Can. J. Urol. 2019, 26, 20–24.
- Rajkomar, A.; Dean, J.; Kohane, I. Machine Learning in Medicine. N. Engl. J. Med. 2019, 380, 1347–1358.
- Chen, C.L.; Mahjoubfar, A.; Tai, L.-C.; Blaby, I.K.; Huang, A.; Niazi, K.R.; Jalali, B. Deep Learning in Label-free Cell Classification. Sci. Rep. 2016, 6, 21471.
- Kriegeskorte, N.; Golan, T. Neural network models and deep learning. Curr. Biol. 2019, 29, R231–R236.
- Fard, M.J.; Ameri, S.; Ellis, R.D.; Chinnam, R.B.; Pandya, A.K.; Klein, M.D. Automated robot-assisted surgical skill evaluation: Predictive analytics approach. Int. J. Med. Robot. Comput. Assist. Surg. 2017, 14, e1850.
- Ghani, K.R.; Liu, Y.; Law, H.; Miller, D.C.; Montie, J.; Deng, J.; Michigan Urological Surgery Improvement Collaborative. PD46-04 Video Analysis of Skill and Technique (VAST): Machine learning to assess surgeons performing robotic prostatectomy. J. Urol. 2017, 197, e891.
- Wang, Z.; Fey, A.M. Deep learning with convolutional neural network for objective skill evaluation in robot-assisted surgery. Int. J. Comput. Assist. Radiol. Surg. 2018, 13, 1959–1970.
- Ershad, M.; Rege, R.; Fey, A.M. Automatic and near real-time stylistic behavior assessment in robotic surgery. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 635–643.
- Youssef, S.C.; Hachach-Haram, N.; Aydin, A.; Shah, T.T.; Sapre, N.; Nair, R.; Rai, S.; Dasgupta, P. Video labelling robot-assisted radical prostatectomy and the role of artificial intelligence (AI): Training a novice. J. Robot. Surg. 2023, 17, 695–701.
- Dai, Y.; Abiri, A.; Pensa, J.; Liu, S.; Paydar, O.; Sohn, H.; Sun, S.; Pellionisz, P.A.; Pensa, C.; Dutson, E.P.; et al. Biaxial sensing suture breakage warning system for robotic surgery. Biomed. Microdevices 2019, 21, 10.
- Ngiam, K.Y.; Khor, I.W. Big data and machine learning algorithms for health-care delivery. Lancet Oncol. 2019, 20, e262–e273.
- Piana, A.; Gallioli, A.; Amparore, D.; Diana, P.; Territo, A.; Campi, R.; Gaya, J.M.; Guirado, L.; Checcucci, E.; Bellin, A.; et al. Three-dimensional Augmented Reality–guided Robotic-assisted Kidney Transplantation: Breaking the Limit of Atheromatic Plaques. Eur. Urol. 2022, 82, 419–426.
- Zhao, B.; Waterman, R.S.; Urman, R.D.; Gabriel, R.A. A Machine Learning Approach to Predicting Case Duration for Robot-Assisted Surgery. J. Med. Syst. 2019, 43, 32.
- Liu, J.; Wang, S.; Turkbey, E.B.; Linguraru, M.G.; Yao, J.; Summers, R.M. Computer-aided detection of renal calculi from noncontrast CT images using TV-flow and MSER features. Med. Phys. 2015, 42, 144–153.
- Liu, J.; Wang, S.; Linguraru, M.G.; Yao, J.; Summers, R.M. Computer-aided detection of exophytic renal lesions on non-contrast CT images. Med. Image Anal. 2014, 19, 15–29.
- Iglesias, J.E.; Sabuncu, M.R. Multi-atlas segmentation of biomedical images: A survey. Med. Image Anal. 2015, 24, 205–219.
- Doyle, P.W.; Kavoussi, N.L. Machine learning applications to enhance patient specific care for urologic surgery. World J. Urol. 2021, 40, 679–686.
- Wang, H.; Suh, J.W.; Das, S.R.; Pluta, J.B.; Craige, C.; Yushkevich, P.A. Multi-Atlas Segmentation with Joint Label Fusion. IEEE Trans. Pattern Anal. Mach. Intell. 2012, 35, 611–623.
- Yildirim, M. Image Visualization and Classification Using Hydatid Cyst Images with an Explainable Hybrid Model. Appl. Sci. 2023, 13, 9926.
- Nosrati, M.S.; Amir-Khalili, A.; Peyrat, J.-M.; Abinahed, J.; Al-Alao, O.; Al-Ansari, A.; Abugharbieh, R.; Hamarneh, G. Endoscopic scene labelling and augmentation using intraoperative pulsatile motion and colour appearance cues with preoperative anatomical priors. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 1409–1418.
- Di Dio, M.; Barbuto, S.; Bisegna, C.; Bellin, A.; Boccia, M.; Amparore, D.; Verri, P.; Busacca, G.; Sica, M.; De Cillis, S.; et al. Artificial Intelligence-Based Hyper Accuracy Three-Dimensional (HA3D®) Models in Surgical Planning of Challenging Robotic Nephron-Sparing Surgery: A Case Report and Snapshot of the State-of-the-Art with Possible Future Implications. Diagnostics 2023, 13, 2320.
- Klén, R.; Salminen, A.P.; Mahmoudian, M.; Syvänen, K.T.; Elo, L.L.; Boström, P.J. Prediction of complication related death after radical cystectomy for bladder cancer with machine learning methodology. Scand. J. Urol. 2019, 53, 325–331.
- Checcucci, E.; Pecoraro, A.; Amparore, D.; De Cillis, S.; Granato, S.; Volpi, G.; Sica, M.; Verri, P.; Piana, A.; Piazzolla, P.; et al. The impact of 3D models on positive surgical margins after robot-assisted radical prostatectomy. World J. Urol. 2022, 40, 2221–2229.
- Baghdadi, A.; Hussein, A.A.; Ahmed, Y.; Cavuoto, L.A.; Guru, K.A. A computer vision technique for automated assessment of surgical performance using surgeons’ console-feed videos. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 697–707.
- Haifler, M.; Pence, I.; Sun, Y.; Kutikov, A.; Uzzo, R.G.; Mahadevan-Jansen, A.; Patil, C.A. Discrimination of malignant and normal kidney tissue with short wave infrared dispersive Raman spectroscopy. J. Biophotonics 2018, 11, e201700188.
- Checcucci, E.; Piazzolla, P.; Marullo, G.; Innocente, C.; Salerno, F.; Ulrich, L.; Moos, S.; Quarà, A.; Volpi, G.; Amparore, D.; et al. Development of Bleeding Artificial Intelligence Detector (BLAIR) System for Robotic Radical Prostatectomy. J. Clin. Med. 2023, 12, 7355.
- Porpiglia, F.; Checcucci, E.; Amparore, D.; Autorino, R.; Piana, A.; Bellin, A.; Piazzolla, P.; Massa, F.; Bollito, E.; Gned, D.; et al. Augmented-reality robot-assisted radical prostatectomy using hyper-accuracy three-dimensional reconstruction (HA3D™) technology: A radiological and pathological study. BJU Int. 2019, 123, 834–845.
- Porpiglia, F.; Checcucci, E.; Amparore, D.; Manfredi, M.; Massa, F.; Piazzolla, P.; Manfrin, D.; Piana, A.; Tota, D.; Bollito, E.; et al. Three-dimensional Elastic Augmented-reality Robot-assisted Radical Prostatectomy Using Hyperaccuracy Three-dimensional Reconstruction Technology: A Step Further in the Identification of Capsular Involvement. Eur. Urol. 2019, 76, 505–514.
- Checcucci, E.; Piana, A.; Volpi, G.; Piazzolla, P.; Amparore, D.; De Cillis, S.; Piramide, F.; Gatti, C.; Stura, I.; Bollito, E.; et al. Three-dimensional automatic artificial intelligence driven augmented-reality selective biopsy during nerve-sparing robot-assisted radical prostatectomy: A feasibility and accuracy study. Asian J. Urol. 2023, 10, 407–415.
- De Backer, P.; Van Praet, C.; Simoens, J.; Peraire Lores, M.; Creemers, H.; Mestdagh, K.; Allaeys, C.; Vermijs, S.; Piazza, P.; Mottaran, A.; et al. Improving Augmented Reality Through Deep Learning: Real-time Instrument Delineation in Robotic Renal Surgery. Eur. Urol. 2023, 84, 86–91.
- Amparore, D.; Checcucci, E.; Piazzolla, P.; Piramide, F.; De Cillis, S.; Piana, A.; Verri, P.; Manfredi, M.; Fiori, C.; Vezzetti, E.; et al. Indocyanine Green Drives Computer Vision Based 3D Augmented Reality Robot Assisted Partial Nephrectomy: The Beginning of “Automatic” Overlapping Era. Urology 2022, 164, e312–e316.
- Amparore, D.; Sica, M.; Verri, P.; Piramide, F.; Checcucci, E.; De Cillis, S.; Piana, A.; Campobasso, D.; Burgio, M.; Cisero, E.; et al. Computer Vision and Machine-Learning Techniques for Automatic 3D Virtual Images Overlapping During Augmented Reality Guided Robotic Partial Nephrectomy. Technol. Cancer Res. Treat. 2024, 23, 15330338241229368.
- Chen, J.; Remulla, D.; Nguyen, J.H.; Dua, A.; Liu, Y.; Dasgupta, P.; Hung, A.J. Current status of artificial intelligence applications in urology and their potential to influence clinical practice. BJU Int. 2019, 124, 567–577.
- Hung, A.J.; Chen, J.; Che, Z.; Nilanon, T.; Jarc, A.; Titus, M.; Oh, P.J.; Gill, I.S.; Liu, Y. Utilizing Machine Learning and Automated Performance Metrics to Evaluate Robot-Assisted Radical Prostatectomy Performance and Predict Outcomes. J. Endourol. 2018, 32, 438–444.
- Hung, A.J.; Chen, J.; Ghodoussipour, S.; Oh, P.J.; Liu, Z.; Nguyen, J.; Purushotham, S.; Gill, I.S.; Liu, Y. A deep-learning model using automated performance metrics and clinical features to predict urinary continence recovery after robot-assisted radical prostatectomy. BJU Int. 2019, 124, 487–495.
- Kantarjian, H.; Yu, P.P. Artificial Intelligence, Big Data, and Cancer. JAMA Oncol. 2015, 1, 573–574.
- Shademan, A.; Decker, R.S.; Opfermann, J.D.; Leonard, S.; Krieger, A.; Kim, P.C.W. Supervised autonomous robotic soft tissue surgery. Sci. Transl. Med. 2016, 8, 337ra64.
- Saeidi, H.; Le, H.N.D.; Opfermann, J.D.; Leonard, S.; Kim, A.; Hsieh, M.H.; Kang, J.U.; Krieger, A. Autonomous Laparoscopic Robotic Suturing with a Novel Actuated Suturing Tool and 3D Endoscope. In Proceedings of the 2019 International Conference on Robotics and Automation (ICRA), Montreal, QC, Canada, 20–24 May 2019; pp. 1541–1547.
Authors (Ref) | Outcome | Data Source | Type of AI | Findings |
---|---|---|---|---|
Fard et al. [11] | Tool for evaluating and measuring skills | Global movement characteristics of the robotic arms: task completion time, path length, depth perception, velocity, smoothness of movement, curvature, angular displacement, tortuosity | Three classification methods: k-nearest neighbours, logistic regression, and support vector machines | Automated differentiation between skilled and inexperienced robotic surgeons |
Ghani et al. [12] | Tool for evaluating and measuring skills | Videos of 11 surgeons performing bladder neck anastomosis; the algorithm measured the speed, trajectory, and smoothness of instrument movement, along with each instrument's position relative to the opposite side, step by step | Computer vision algorithm | Certain parameters, such as the relationship between needle-driver forceps and joint position, were closely linked to expertise |
Wang et al. [13] | Tool for evaluating and measuring skills | Movement data from the master tool manipulator and the patient-side manipulator | Artificial neural network | Automated evaluation of, and feedback on, surgical proficiency using an artificial neural network |
Ershad et al. [14] | Tool for evaluating and measuring skills | Position sensors installed on the surgeon's limbs; metrics for measuring stylistic behaviour: fluid/viscous, polished/coarse, sharp/unsteady, rapid/lethargic, tranquil/distressed, calm/anxious, decisive/indecisive, organised/disorganised | Sparse coding framework | Assessment of surgical expertise through evaluation of proficiency in stylistic behaviour |
Youssef et al. [15] | Tool for evaluating and measuring skills | 25 RARP videos used for video labelling by a novice surgeon | Proximie | 17 videos were suitable for video labelling, with an average step-tagging accuracy of 93.1% |
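As a rough illustration of how the classifiers cited above for skill assessment (k-nearest neighbours, logistic regression, support vector machines) operate, the sketch below implements a minimal k-nearest-neighbours vote over kinematic features such as task time, path length, and movement smoothness. All feature values and labels are synthetic and purely illustrative; they are not drawn from any study's dataset.

```python
from collections import Counter
from math import dist

# Hypothetical kinematic feature vectors: (task time [s], path length [cm],
# movement smoothness [arbitrary units]) -- synthetic values for illustration.
training_set = [
    ((95.0, 110.0, 0.82), "expert"),
    ((102.0, 125.0, 0.78), "expert"),
    ((98.0, 118.0, 0.85), "expert"),
    ((190.0, 240.0, 0.41), "novice"),
    ((175.0, 210.0, 0.47), "novice"),
    ((205.0, 260.0, 0.38), "novice"),
]

def classify_skill(features, k=3):
    """Label a trial 'expert' or 'novice' by majority vote of its
    k nearest neighbours in kinematic feature space."""
    neighbours = sorted(training_set, key=lambda item: dist(features, item[0]))
    votes = Counter(label for _, label in neighbours[:k])
    return votes.most_common(1)[0][0]
```

In practice such features would first be normalised to comparable scales, since Euclidean distance is otherwise dominated by the largest-magnitude feature; the published pipelines also validate against expert-assigned skill labels.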
Authors (Ref) | Outcome | Data Source | Type of AI | Findings |
---|---|---|---|---|
Dai et al. [16] | Haptic feedback surrogate | Robotic tools equipped with sensors | Not mentioned | A haptic feedback system designed to help prevent suture breakage, improve knot quality, and facilitate learning |
Piana et al. [18] | Haptic feedback surrogate | High-accuracy CT scans used to create 3D images | 3D augmented reality guidance during KT, with ML using the Da Vinci console software | Eliminates the need for tactile sensation (haptic feedback) during KT and facilitates identification of iliac vessel plaques |
Authors (Ref) | Outcome | Data Source | Type of AI | Findings |
---|---|---|---|---|
Zhao et al. [19] | Intraoperative outcomes | Variables used to construct the model: scheduled surgical procedure, procedure group, age, obesity, sex, combined case, robotic prototype, cancer, tumour location, hypertension, tobacco use, atrial fibrillation, obstructive sleep apnoea, coronary artery disease, diabetes mellitus, cirrhosis, chronic obstructive pulmonary disease (COPD), renal insufficiency, ASA classification, calendar month, time of day, anaesthesiologist, operating room aide | ML | Improved precision in forecasting case duration |
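The kind of case-duration model described above can be sketched, in greatly simplified form, as a least-squares fit of duration against a single predictor. The BMI values and durations below are invented purely for illustration; the actual model drew on the twenty-plus variables listed in the table and a more sophisticated machine learning method.

```python
# Hypothetical historical cases: (patient BMI, observed case duration in minutes).
history = [(22, 180), (25, 195), (28, 210), (31, 225), (35, 245)]

def fit_line(points):
    """Ordinary least squares for y = intercept + slope * x."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in points)
             / sum((x - mean_x) ** 2 for x, _ in points))
    return mean_y - slope * mean_x, slope

intercept, slope = fit_line(history)

def predict_duration(bmi):
    """Predicted case duration in minutes for a given BMI."""
    return intercept + slope * bmi
```

A realistic scheduler would extend this to many predictors (multiple regression or gradient-boosted trees) and report prediction intervals, since operating room block allocation is sensitive to underestimates as well as overestimates.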
Authors (Ref) | Outcome | Data Source | Type of AI | Findings |
---|---|---|---|---|
Nosrati et al. [26] | Intraoperative outcomes | Subsurface cues such as pulsation patterns, textures, and colours within the operative field, plus preoperative imaging | Machine learning | The framework applies to non-visible and partially occluded structures during PN |
Di Dio et al. [27] | Intraoperative outcomes | Preoperative CT pyelography | Computer vision to create an HA3D model | The model was used to selectively apply pressure to the artery during PN |
Klén et al. [28] | Postoperative outcomes | Data from 1099 RARC patients | Machine learning | Identification of patients at high risk of complications after RC; additional risk factors identified: ASA score, CPD, ACCI, and CHF |
Checcucci et al. [29] | Postoperative outcomes | 3D and non-3D prostate models from RARP and mpMRI | Multivariable linear regression models | Absence of extracapsular extension on mpMRI and the use of 3D models during RARP lowered the incidence of positive surgical margins |
Authors (Ref) | Outcome | Data Source | Type of AI | Findings |
---|---|---|---|---|
Baghdadi et al. [30] | Intraoperative assessment | 20 PLD videos | Machine learning | The machine learning model may accurately generate prostatectomy assessment and competence evaluation (PACE) scores |
Haifler et al. [31] | Intraoperative assessment | Short-wave infrared Raman spectroscopy data | Machine learning | Differentiation between malignant and normal renal tissue |
Checcucci et al. [32] | Intraoperative assessment | Endoscope footage captured at 3-s intervals during RARP | Artificial neural network | Capable of predicting bleeding |
Porpiglia et al. [33,34] | Intraoperative assessment | HA3D models of the prostate from mpMRI used during RARP | Virtual augmented reality | Increased recognition of ECE |
De Backer et al. [36] | Intraoperative assessment | Da Vinci recordings | Virtual augmented reality and deep learning | Detection of instruments during renal surgery, despite motion and alignment artefacts from the patient's breathing |
Amparore et al. [37] | Intraoperative assessment | HA3D model, with NIRF Firefly used as a reference point to enhance kidney visibility | Computer vision technology, Ignite for image anchoring | Overlay of the 3D model without manual assistance during AR-RAPN |
Amparore et al. [38] | Intraoperative assessment | Prerecorded surgery videos | Convolutional neural network technology vs. indocyanine green injection | Overlay of the 3D model without manual assistance during RAPN, even without indocyanine green injection |
Authors (Ref) | Outcome | Data Source | Type of AI | Findings |
---|---|---|---|---|
Hung et al. [40] | Postoperative outcomes | Automated performance metrics: overall duration, average idle time, total path length of all instruments; for the dominant and non-dominant instruments: path length, moving time, average velocity, and idle time; camera-position adjustments (frequency, path length, movement duration, average velocity, idle time); energy consumption; third-arm exchange; clutch use; console exit | Three machine learning algorithms, using data from the robot system and hospital length of stay | A machine learning algorithm designed to forecast hospital length of stay and the duration of Foley catheter use |
Hung et al. [41] | Postoperative outcomes | Automated performance metrics: timing metrics; instrument-motion metrics; camera-movement metrics; EndoWrist articulation metrics; system-event metrics. Patient characteristics: age, body mass index (BMI), PSA, preoperative Gleason score, ASA classification, surgical duration, extent of lymphadenectomy, urethropexy, nerve sparing, median lobe, abnormal Gleason score, pathological stage, prostate size, surgical margins, radiotherapy | DL model (DeepSurv) | A predictive model for continence outcomes following robotic radical prostatectomy; the emphasis lies on performance indicators rather than patient characteristics |
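The continence-recovery predictor in the table above is a deep survival model (DeepSurv). As a toy stand-in for how automated performance metrics can feed such a predictor, the sketch below fits a plain logistic regression by gradient descent on two hypothetical normalised metrics; all data, feature names, and weights are synthetic and illustrative only, and a survival model would additionally account for time to recovery.

```python
from math import exp

# Hypothetical training data: each row is (normalised instrument-path length,
# normalised idle time); label 1 = continence recovered at follow-up.
X = [(0.2, 0.1), (0.3, 0.2), (0.25, 0.15), (0.8, 0.7), (0.9, 0.8), (0.85, 0.75)]
y = [1, 1, 1, 0, 0, 0]

def sigmoid(z):
    return 1.0 / (1.0 + exp(-z))

# Fit weights by plain stochastic gradient descent on the log-loss.
w = [0.0, 0.0]
b = 0.0
lr = 0.5
for _ in range(2000):
    for xi, yi in zip(X, y):
        p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
        err = p - yi
        w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
        b -= lr * err

def predict_recovery(features):
    """Return the estimated probability of continence recovery."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, features)) + b)
```

Lower path length and idle time (smoother, more efficient performance) map to a higher recovery probability in this toy setup, mirroring the reported finding that performance indicators outweighed patient characteristics.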
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Bellos, T.; Manolitsis, I.; Katsimperis, S.; Juliebø-Jones, P.; Feretzakis, G.; Mitsogiannis, I.; Varkarakis, I.; Somani, B.K.; Tzelves, L. Artificial Intelligence in Urologic Robotic Oncologic Surgery: A Narrative Review. Cancers 2024, 16, 1775. https://doi.org/10.3390/cancers16091775