Integration of Real-Time Image Fusion in the Robotic-Assisted Treatment of Hepatocellular Carcinoma
Simple Summary
Abstract
1. Introduction
2. Background
2.1. Image Fusion Systems—General Considerations
2.2. Image Fusion and the Locoregional Therapies of HCC
2.3. Limitations of Image Fusion
2.4. Robotic-Assisted Targeted HCC Treatment
- Two identical robotic modules that operate “in mirror” (see Figure 1a, where the robotic system is evaluated under laboratory conditions with a phantom). The two modules operate as follows: the first module performs the needle insertion, while the second module guides an intraoperative US probe that provides visual feedback on the tumor location and the needle location within the tissue. Each robotic module was designed based on parallel mechanisms to ensure high accuracy of the treatment delivery, patient safety, and procedure ergonomics. Each robotic module has five degrees of freedom (DOF) for guiding the medical instruments (discussed further below) through three Cartesian motions and two rotations (Figure 1b).
- Each robotic module is equipped with one of two novel medical instruments: a multi-needle automated instrument with three DOFs for the accurate positioning, insertion, retraction, and release of specialized needles (for brachytherapy or chemotherapy) (Figure 2a) [33,34], or an automated medical instrument with four DOFs (Figure 2b) that guides a Hitachi Arietta intraoperative US probe (Figure 2c), providing insertion/retraction along the longitudinal axis of the probe, rotation about that axis, and two rotations of the distal head about two distinct orthogonal axes [34,35]. Consequently, the needle insertion robotic module has eight DOFs (three of which are redundant translations), whereas the US probe manipulation robotic module has nine DOFs (with two redundant rotations and one redundant translation for fine control of the transducer).
- The input console: the master part of the robot control. It is based on a portable computer and was designed to integrate the following components: (1) a graphical user interface with a real-time tumor detection and visualization module using IF (US with CT) received from the 3D reconstruction module, scalable motion for precise medical tool manipulation, and modular control allowing each robotic module to manipulate each automated medical instrument (for needle insertion and for US probe manipulation); (2) a motion input device for real-time continuous control (using a high-precision 3D motion input device) or for setting predefined positions (using mouse and keyboard input devices).
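The scalable motion mentioned for the input console can be illustrated with a minimal sketch; the scale factor and clamping threshold below are illustrative assumptions, not parameters reported for the actual robotic system:

```python
def scale_motion(delta_mm, scale=0.25, max_step_mm=1.0):
    """Map a raw 3D input-device displacement (mm) to a scaled-down,
    clamped robot displacement for fine tool manipulation.

    `scale` and `max_step_mm` are illustrative values, not parameters
    reported for the actual robotic system.
    """
    scaled = [d * scale for d in delta_mm]
    # Clamp each axis so that a sudden large hand motion cannot
    # command an unsafe robot step.
    return [max(-max_step_mm, min(max_step_mm, s)) for s in scaled]

# A 10 mm hand motion along x is reduced to a clamped 1 mm robot step.
print(scale_motion([10.0, 2.0, -3.0]))  # → [1.0, 0.5, -0.75]
```

Down-scaling plus per-axis clamping is one common way such consoles trade hand-motion range for sub-millimeter precision at the tool tip.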
3. Materials and Methods
3.1. Visualization and Detection System Integrated into a Robotic System for the Treatment of HCC
3.2. Computerized System for 3D Reconstruction, Image Fusion and HCC Detection
- 1. The segmentation module: performs the segmentation (detection and spatial delimitation) of the liver, the HCC tumor, and the blood vessels using specific methods, such as clustering, region growing, and convolutional neural networks (CNN); this module receives CT images acquired before surgery.
- a. Traditional methods: in order to achieve HCC segmentation through conventional methods, the fast and robust fuzzy-C-means clustering (FRFCM) technique [36] was adopted. FRFCM involves a preprocessing phase of morphological reconstruction, followed by a fuzzy-C-means clustering algorithm modified for improved speed. After applying FRFCM, the post-processing phase consisted of image thresholding and the labelling of the resulting objects; the object with the maximum area, corresponding to the advanced HCC tumor, was then selected. The best performance was obtained with 10 clusters, employing a square structural element of size 2 for morphological reconstruction and a disk structural element of size 3 for the closing operation.
- b. Deep learning techniques: multiple CNN architectures were evaluated, such as ERFNet, EDANet, DeepLabV3, and U-Net [37,38,39]. The ERFNet and EDANet networks were pretrained using traffic data in order to emphasize basic structures such as edges and curvatures. The DeepLabV3 CNN architecture with a ResNet-101 backbone [38], pre-trained on the Common Objects in Context (COCO) dataset [39], was evaluated in the following situations: (a) on the original images; (b) with a three-channel input, the first channel receiving the grayscale image, the second the morphological reconstruction of the grayscale image, and the third the FRFCM result, computed with 50 and 100 clusters, respectively. The U-Net CNN architecture, trained from scratch with our CT data, was also employed for HCC segmentation.
- 2. The tumor detection module: in order to enhance the robotic-assisted treatment modality, an automatic tumor detection system (using CT images) was also developed, which provides statistical maps (showing the HCC) for visual feedback.
- 3. The 3D reconstruction module: performs 3D reconstruction from the segmented 2D CT images and generates the 3D anatomic model of the HCC within the liver. This module is designed to provide visual feedback to the input console for the medical personnel (e.g., surgeons) operating the robotic system.
- 4. The fusion module: receives the ultrasound image, together with the spatial coordinates and orientation corresponding to the current transducer position, and emphasizes the corresponding 2D CT slice with the main anatomical elements within the 3D volume.
- 5. The communication module: ensures the real-time communication of the computerized system with the robotic system control. The computerized system for image fusion and 3D reconstruction and the computer application associated with the robot, being situated on different computers, communicate via Ethernet through a socket-based mechanism. First, the image fusion and 3D reconstruction system receives from the robot, during surgery, the spatial coordinates and the Euler angles associated with the current ultrasound transducer position and identifies the corresponding 2D slice within the 3D anatomic model. It then provides the robot application with the image of the corresponding 2D section, together with the 3D image associated with the 3D anatomic model. Thus, the computerized application for image fusion and 3D reconstruction, as well as the computer application associated with the robot, implement two communication threads, one for the input data and the other for the output data. The computer application associated with the robot finally displays the image corresponding to the 3D anatomic model, together with the 2D slice and the ultrasound image that corresponds to the current position of the transducer. The surgeon analyzes these images and, if necessary, applies, with the aid of the robot, the minimally invasive treatment for HCC reduction.
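The post-processing described for the traditional segmentation pipeline (thresholding the clustering result, labelling the objects, and keeping the maximum-area object as the advanced HCC tumor) can be sketched as follows; the FRFCM clustering itself from [36] is not reproduced here, so the sketch starts from an already-computed cluster map:

```python
import numpy as np
from scipy import ndimage

def largest_object_mask(cluster_map, tumor_label):
    """Post-processing after clustering: threshold the cluster map to the
    tumor cluster, label the connected components, and keep the one with
    the maximum area (assumed to correspond to the advanced HCC tumor)."""
    binary = cluster_map == tumor_label              # image thresholding
    labels, n = ndimage.label(binary)                # label the objects
    if n == 0:
        return np.zeros_like(binary)
    areas = ndimage.sum(binary, labels, index=range(1, n + 1))
    largest = 1 + int(np.argmax(areas))              # object with max area
    return labels == largest

# Toy 2D "cluster map": two objects belong to cluster 2; only the
# larger (2x2) one is kept, the isolated pixel is discarded.
m = np.array([[2, 2, 0],
              [2, 2, 0],
              [0, 0, 2]])
mask = largest_object_mask(m, tumor_label=2)
print(int(mask.sum()))  # → 4
```

The same three steps apply slice by slice to real CT data; only the cluster label corresponding to the tumor and the connectivity structure would need tuning.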
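The fusion module's task of identifying the 2D slice from the transducer pose can be sketched minimally. The frame conventions below (intrinsic xyz Euler angles, axial slicing along z, and the numeric values for volume origin and voxel spacing) are illustrative assumptions, not taken from the paper:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def slice_for_pose(position_mm, euler_deg, origin_mm, spacing_mm):
    """Identify the axial CT slice matching the transducer pose reported
    by the robot, and the orientation of the imaging plane.

    Frame conventions are illustrative: intrinsic xyz Euler angles and
    axial slicing along the z axis of the CT volume.
    """
    rot = Rotation.from_euler("xyz", euler_deg, degrees=True)
    normal = rot.apply([0.0, 0.0, 1.0])      # normal of the US image plane
    z_mm = position_mm[2] - origin_mm[2]     # probe-tip depth in CT frame
    return int(round(z_mm / spacing_mm[2])), normal

idx, normal = slice_for_pose([120.0, 80.0, 45.0], [0.0, 0.0, 30.0],
                             origin_mm=[0.0, 0.0, 5.0],
                             spacing_mm=[1.0, 1.0, 2.0])
print(idx)  # → 20
```

A full implementation would resample an oblique plane through the 3D volume rather than snap to the nearest axial slice, but the pose-to-index mapping above captures the core of the lookup.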
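The socket-based exchange between the two computers described for the communication module can be sketched on a loopback connection; the JSON message format and the 2 mm slice spacing are illustrative assumptions, not the paper's actual protocol:

```python
import json
import socket
import threading

def fusion_server(ready, port_holder):
    """Stands in for the image fusion / 3D reconstruction system: it
    receives the transducer pose and replies with the matching 2D slice
    index (a 2 mm slice spacing is an illustrative assumption)."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind(("127.0.0.1", 0))                 # OS picks a free port
        srv.listen(1)
        port_holder.append(srv.getsockname()[1])
        ready.set()
        conn, _ = srv.accept()
        with conn:
            pose = json.loads(conn.recv(4096).decode())
            reply = {"slice_index": int(pose["position"][2] / 2.0)}
            conn.sendall(json.dumps(reply).encode())

ready, port_holder = threading.Event(), []
threading.Thread(target=fusion_server, args=(ready, port_holder),
                 daemon=True).start()
ready.wait()

# Robot-side application: sends the current transducer pose and receives
# the index of the corresponding CT slice, mirroring the paper's
# separate input/output communication threads.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", port_holder[0]))
    cli.sendall(json.dumps({"position": [120.0, 80.0, 44.0],
                            "euler_deg": [0.0, 0.0, 30.0]}).encode())
    reply = json.loads(cli.recv(4096).decode())
print(reply)  # → {'slice_index': 22}
```

In the real system each side would additionally stream image payloads (the 2D section, the 3D model view, and the live US frame), which requires length-prefixed framing rather than a single `recv` call.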
4. Results and Discussion
4.1. The Dataset
4.2. Segmentation and Fusion Results
4.3. Discussion
5. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
- Bertuccio, P.; Turati, F.; Carioli, G.; Rodriguez, T.; La Vecchia, C.; Malvezzi, M.; Negri, E. Global trends and predictions in hepatocellular carcinoma mortality. J. Hepatol. 2017, 67, 302–309.
- Lee, M.W.; Kim, Y.J.; Park, H.S.; Yu, N.C.; Jung, S.I.; Ko, S.Y.; Hae, J.J. Targeted sonography for small hepatocellular carcinoma discovered by CT or MRI: Factors affecting sonographic detection. Am. J. Roentgenol. 2010, 194, 396–400.
- Dwyer, G.; Giataganas, P.; Pratt, P.; Hughes, M.; Yang, G.Z. Miniaturised Robotic Probe for Real-Time Intraoperative Fusion of Ultrasound and Endomicroscopy. In Proceedings of the 2015 IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, USA, 26–30 May 2015.
- Samei, G.; Tsang, K.; Kesch, C.; Lobo, J.; Hor, S.; Mohareri, O.; Chang, S.; Goldenberg, S.L.; Black, P.C.; Salcudean, S. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med. Image Anal. 2020, 60, 101588.
- Kaye, D.R.; Stoianovici, D.; Han, M. Robotic Ultrasound and Needle Guidance for Prostate Cancer Management: Review of the Contemporary Literature. Curr. Opin. Urol. 2014, 24, 75–80.
- Lim, S.; Jun, C.; Chang, D.; Petrisor, D.; Han, M.; Stoianovici, D. Robotic Transrectal Ultrasound Guided Prostate Biopsy. IEEE Trans. Biomed. Eng. 2019, 66, 2527–2537.
- Vaida, C.; Plitea, N.; Al Hajjar, N.; Burz, A.; Graur, F.; Gherman, B.; Pisla, D. A new robotic system for minimally invasive treatment of liver tumours. Proc. Rom. Acad. Ser. A Math. Phys. Tech. Sci. Inf. Sci. 2020, 21, 273–280.
- Pisla, D.; Vaida, C.; Birlescu, I.; Gherman, B.; Plitea, N. Risk management for the reliability of robotic assisted treatment of non-resectable liver tumors. Appl. Sci. 2020, 10, 52.
- Krücker, J.; Xu, S.; Venkatesan, A.; Locklin, J.K.; Amalou, H.; Glossop, N.; Wood, B.J. Clinical utility of real-time fusion guidance for biopsy and ablation. J. Vasc. Interv. Radiol. 2011, 22, 515–524.
- Li, C.; Xihui, Y.; Dengke, Z.; Linqiang, L.; Fazong, W.; Jianfei, T.; Jiansong, J. Iodine-125 Brachytherapy Can Prolong Progression-Free Survival of Patients with Locoregional Recurrence and/or Residual Hepatocellular Carcinoma After Radiofrequency Ablation. Cancer Biother. Radiopharm. 2020.
- Zhiyuan, W.; Ju, G.; Wei, H.; Qingbing, W.; Ziyin, W.; Qin, L.; Jingjing, L.; Junwei, G.; Xiaoyi, D.; Zhongmin, W. Evaluation of doxorubicin-eluting bead transcatheter arterial chemoembolization combined with endovascular brachytherapy for hepatocellular carcinoma with main portal vein tumor thrombus. BMC Cancer 2020. Preprint under review.
- Ewertsen, C.; Saftoiu, A.; Gruionu, L.G.; Karstrup, S.; Nielsen, M.B. Real-time image fusion involving diagnostic ultrasound. Am. J. Roentgenol. 2013, 200, 249–255.
- Abi-Jaoudeh, N.; Kruecker, J.; Kadoury, S.; Kobeiter, H.; Venkatesan, A.M.; Levy, E.; Wood, B.J. Multimodality image fusion-guided procedures: Technique, accuracy, and applications. Cardiovasc. Intervent. Radiol. 2012, 35, 986–998.
- Wood, B.J.; Kruecker, J.; Abi-Jaoudeh, N.; Locklin, J.K.; Levy, E.; Xu, S.; Solbiati, L.; Kapoor, A.; Amalou, H.; Venkatesan, A. Navigation systems for ablation. J. Vasc. Interv. Radiol. 2010, 21, 257–263.
- Zhi, D. Towards estimating fiducial localization error of point-based registration in image-guided neurosurgery. Biomed. Mater. Eng. 2015, 26, S943–S949.
- Fitzpatrick, J.M.; West, J.B.; Maurer, C.R. Predicting error in rigid-body point-based registration. IEEE Trans. Med. Imaging 1998, 17, 694–702.
- Boctor, E.M.; Taylor, R.H.; Fichtinger, G.; Choti, M.A. Robotically assisted intraoperative ultrasound with application to ablative therapy of liver cancer. In Medical Imaging 2003: Visualization, Image-Guided Procedures, and Display; International Society for Optics and Photonics: San Diego, CA, USA, 2003.
- Inchingolo, R.; Posa, A.; Mariappan, M.; Spiliopoulos, S. Locoregional treatments for hepatocellular carcinoma: Current evidence and future directions. World J. Gastroenterol. 2019, 25, 4614–4628.
- Galloway, R.L. The process and development of image-guided procedures. Annu. Rev. Biomed. Eng. 2001, 3, 83–108.
- Herline, A.; Stefansic, J.D.; Debelak, J.; Galloway, R.L.; Chapman, W.C. Technical advances toward interactive image-guided laparoscopic surgery. Surg. Endosc. 2000, 14, 675–679.
- Ahn, S.J.; Lee, J.M.; Lee, D.H.; Lee, S.M.; Yoon, J.H.; Kim, Y.J.; Lee, J.H.; Yu, S.U.; Han, J.K. Real-time US-CT/MR fusion imaging for percutaneous radiofrequency ablation of hepatocellular carcinoma. J. Hepatol. 2017, 66, 347–354.
- Lee, J.Y.; Choi, B.I.; Chung, Y.E.; Kim, M.W.; Kim, S.H.; Han, J.K. Clinical value of CT/MR-US fusion imaging for radiofrequency ablation of hepatic nodules. Eur. J. Radiol. 2012, 81, 2281–2289.
- Rafailidis, V.; Sidhu, P.S. Ultrasound of the Liver. In Imaging of the Liver and Intra-Hepatic Biliary Tract. Medical Radiology; Springer: Cham, Switzerland, 2020; pp. 51–76.
- Lee, M.W. Fusion imaging of real-time ultrasonography with CT or MRI for hepatic intervention. Ultrasonography 2014, 33, 227–239.
- Hakime, A.; Deschamps, F.; De Carvalho, E.G.M.; Teriitehau, C.; Auperin, A.; De Baere, T. Clinical evaluation of spatial accuracy of a fusion imaging technique combining previously acquired computed tomography and real-time ultrasound for imaging of liver metastases. Cardiovasc. Intervent. Radiol. 2011, 34, 338–344.
- Solorio, L.; Wu, H.; Hernandez, C.; Gangolli, M.; Exner, A.A. Ultrasound-guided intratumoral delivery of doxorubicin from in situ forming implants in a hepatocellular carcinoma model. Ther. Deliv. 2016, 7, 201–212.
- Birlescu, I.; Husty, M.; Vaida, C.; Gherman, B.; Tucan, P.; Pisla, D. Joint-Space Characterization of a Medical Parallel Robot Based on a Dual Quaternion Representation of SE(3). Mathematics 2020, 8, 1086.
- Vaida, C.; Tucan, P.; Plitea, N.; Lazar, V.; Al Hajjar, N.; Pisla, D. Kinematic analysis of a new parallel robotic system for minimally invasive therapy of non-resecable hepatic tumors. In IFToMM World Congress on Mechanism and Machine Science; Springer: Cham, Switzerland, 2019; pp. 719–728.
- Birlescu, I.; Husty, M.; Vaida, C.; Plitea, N.; Nayak, A.; Pisla, D. Complete Geometric Analysis Using the Study SE(3) Parameters for a Novel, Minimally Invasive Robot Used in Liver Cancer Treatment. Symmetry 2019, 11, 1491.
- Antal, A.T.; Antal, A. Helical gear dimensions in the case of the minimal equalized specific sliding. In Proceedings of the SYROM 2009—10th IFToMM International Symposium on Science of Mechanisms and Machines; Springer: Dordrecht, The Netherlands, 2010; pp. 85–93.
- Antal, A.T. Addendum modification of spur gears with equalized efficiency at the points where the meshing starts and ends. Mechanika 2015, 21, 480–485.
- Plitea, N.; Pisla, D.; Vaida, C.; Gherman, B.; Tucan, P. PRoHep-LCT-Parallel robot for the minimally invasive treatment of hepatic carcinoma. Patent Pending A 2018, 1017.
- Gherman, B.; Birlescu, I.; Burz, A.; Pisla, D. Automated medical instrument for the insertion of brachytherapy needles on parallel trajectories. Patent Pending A 2019, 806.
- Gherman, B.; Birlescu, I.; Burz, A.; Pisla, D. Kinematic analysis of two innovative medical instruments for the robotic assisted treatment of non-resectable liver tumors. In EuCoMeS 2020: New Trends in Mechanism and Machine Science; Springer: Cham, Switzerland, 2020; pp. 189–197.
- Birlescu, I.; Vaida, C.; Gherman, B.; Burz, A.; Tucan, P.; Plitea, N.; Pisla, D. Automated medical instrument for ultrasound laparoscopic probe guiding. Patent Pending A 2019, 752.
- Mitrea, D.; Marita, T.; Vancea, F.; Nedevschi, S.; Mitrea, P.; Neamt, G.M.; Timoftei, S.; Florian, V.; Pisla, D.; Radu, C.; et al. Towards building a computerized system for modelling advanced HCC tumors, in order to assist their minimum invasive surgical treatment. In New Trends in Mechanisms and Machine Science, the 8th European Conference on Mechanism Science (EuCoMeS); Springer: Cham, Switzerland, 2020; pp. 219–227.
- Chen, L.; Papandreou, G.; Schroff, F.; Adam, H. Rethinking Atrous Convolution for Semantic Image Segmentation. arXiv 2018, arXiv:1706.05587.
- Christ, P.F.; Ettlinger, F.; Grün, F.; Elshaera, M.E.; Lipkova, J.; Schlecht, S.; Ahmaddy, F.; Tatavarty, S.; Bickel, M.; Bilic, P.; et al. Automatic liver and tumor segmentation of CT and MRI volumes using cascaded fully convolutional neural networks. arXiv 2017, arXiv:1702.05970.
- Smith, L.N.; Topin, N. Super-Convergence: Very Fast Training of Residual Networks Using Large Learning Rates. arXiv 2017, arXiv:1708.07120.
- Schroeder, W.; Martin, K.; Lorensen, B. The Visualization Toolkit: An Object-Oriented Approach to 3D Graphics, 4th ed. Available online: http://www.kitware.com (accessed on 10 November 2020).
- Gong, Y.; Tang, Y.; Geng, Y.; Zhou, Y.; Yu, M.; Huang, B.; Sun, Z.; Tang, H.; Jian, Z.; Hou, B. Comparative safety and effectiveness of ultrasound guided radiofrequency ablation combined with preoperative three-dimensional reconstruction versus surgical resection for solitary hepatocellular carcinoma of 3–5 cm. J. Cancer 2019, 10, 5568.
- Li, K.; Su, Z.; Xu, E.; Huang, Q.; Zeng, Q.; Zheng, R. Evaluation of the ablation margin of hepatocellular carcinoma using CEUS-CT/MR image fusion in a phantom model and in patients. BMC Cancer 2017, 17, 1–10.
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Radu, C.; Fisher, P.; Mitrea, D.; Birlescu, I.; Marita, T.; Vancea, F.; Florian, V.; Tefas, C.; Badea, R.; Ștefănescu, H.; et al. Integration of Real-Time Image Fusion in the Robotic-Assisted Treatment of Hepatocellular Carcinoma. Biology 2020, 9, 397. https://doi.org/10.3390/biology9110397