Review

Innovations in Robot-Assisted Surgery for Genitourinary Cancers: Emerging Technologies and Clinical Applications

by Stamatios Katsimperis 1,*, Lazaros Tzelves 1, Georgios Feretzakis 2, Themistoklis Bellos 1, Ioannis Tsikopoulos 3, Nikolaos Kostakopoulos 4 and Andreas Skolarikos 1

1 Second Department of Urology, National and Kapodistrian University of Athens, Sismanogleio Hospital, 15126 Athens, Greece
2 School of Science and Technology, Hellenic Open University, 26335 Patras, Greece
3 Royal National Orthopaedic Hospital, London W1W 5AQ, UK
4 First Department of Urology, Metropolitan General Hospital, 15562 Athens, Greece
* Author to whom correspondence should be addressed.
Appl. Sci. 2025, 15(11), 6118; https://doi.org/10.3390/app15116118
Submission received: 14 April 2025 / Revised: 27 May 2025 / Accepted: 28 May 2025 / Published: 29 May 2025
(This article belongs to the Special Issue New Trends in Robot-Assisted Surgery)

Abstract

Robot-assisted surgery has transformed the landscape of genitourinary cancer treatment, offering enhanced precision, reduced morbidity, and improved recovery compared to open or conventional laparoscopic approaches. As the field matures, a new generation of technological innovations is redefining the boundaries of what robotic systems can achieve. This narrative review explores the integration of artificial intelligence, advanced imaging modalities, augmented reality, and connectivity in robotic urologic oncology. The applications of machine learning in surgical skill evaluation and postoperative outcome predictions are discussed, along with AI-enhanced haptic feedback systems that compensate for the lack of tactile sensation. The role of 3D virtual modeling, intraoperative augmented reality, and fluorescence-guided surgery in improving surgical planning and precision is examined for both kidney and prostate procedures. Emerging tools for real-time tissue recognition, including confocal microscopy and Raman spectroscopy, are evaluated for their potential to optimize margin assessment. This review also addresses the shift toward single-port systems and the rise of telesurgery enabled by 5G connectivity, highlighting global efforts to expand expert surgical care across geographic barriers. Collectively, these innovations represent a paradigm shift in robot-assisted urologic oncology, with the potential to enhance functional outcomes, surgical safety, and access to high-quality care.

1. Introduction

The introduction of robot-assisted surgery has marked a significant evolution in the surgical management of genitourinary malignancies, particularly prostate, kidney, and bladder cancers. Since the widespread adoption of robotic platforms, such as the da Vinci Surgical System, urologic surgeons have gained enhanced dexterity, superior visualization, and greater precision in performing complex oncologic procedures [1]. Minimally invasive approaches now outperform open surgery in terms of length of hospital stay and blood loss [2]. However, the landscape of robotic surgery is far from static. A new wave of technological innovation is reshaping the field, fueled by advances in artificial intelligence, imaging modalities, augmented reality, and connectivity [3]. These emerging tools not only refine the surgeon’s capabilities intraoperatively but also impact preoperative planning and surgical education.
In this narrative review, we present an up-to-date overview of the most promising innovations in robot-assisted surgery for genitourinary cancers. We aim to provide insight into their clinical relevance, current applications, and future potential in enhancing the quality, safety, and personalization of urologic oncology care.

2. Integration of Artificial Intelligence (AI) in Robotic Surgery

The integration of artificial intelligence (AI) into robotic surgery represents a transformative leap in surgical precision and efficiency. AI, especially through machine learning (ML) and deep learning (DL), plays an instrumental role in advancing surgical capabilities by enabling real-time data analysis and providing decision support tools to surgeons. While early applications are promising, most AI tools in surgery remain in early validation phases and lack prospective multicenter trials. Their clinical deployment is still largely confined to high-volume tertiary centers with robust informatics infrastructure. Critically, these applications—ranging from intraoperative decision support to postoperative prediction—are not yet uniformly validated. Clinical integration depends on factors such as data quality, surgeon compliance, and regulatory approval. Future efforts should prioritize prospective validation and cost-effectiveness analysis to facilitate broader adoption.

2.1. Machine Learning in Robotic Surgery

Machine learning, a subset of AI, empowers robotic systems to enhance the precision of surgical procedures. By analyzing complex datasets, including preoperative imaging, intraoperative dynamics, and patient-specific variables, AI algorithms can offer predictive analytics for a wide range of surgical outcomes. For example, machine learning has been employed to track anatomical changes in real time, which is crucial during procedures like prostatectomy and nephrectomy. Nosrati et al. presented an innovative approach for aligning preoperative imaging with real-time endoscopic views during partial nephrectomy procedures [4]. Their method leverages subsurface anatomical features, most notably vascular pulsation patterns, alongside color and texture information to automatically match the surgical field with preoperative scans. Machine learning algorithms are used to analyze the visual data related to color and texture. This research represents a significant advancement by integrating vascular pulsation as a novel cue for enhancing intraoperative image registration. Moreover, the proposed system includes a tissue-specific deformation model, allowing for accurate adaptation to the dynamic, non-rigid changes that occur during surgery [4].

2.2. Enhancing Surgical Skill Development

AI in robotic surgery not only aids surgeons during operations but also contributes to the education and training of new surgeons. Systems that analyze surgeon performance, such as the video analysis of skill and technique (VAST), evaluate surgical actions based on video feeds from robotic consoles [3]. In VAST, videos of 11 surgeons performing bladder neck anastomosis were analyzed. The goal was to train a computer vision system to evaluate aspects like instrument speed, trajectory, smoothness, and positioning. Surgeons were classified based on skill level, and the system’s performance was validated against expert peer assessments using the GEARS tool. The algorithm achieved 83% accuracy using individual instrument data, which increased to 92% with joint movement, and reached 100% when incorporating the contralateral instrument. The study also identified a strong correlation between surgical expertise and variables, such as joint position, acceleration, and velocity. Similarly, Youssef et al. developed an AI tool for skill evaluation and measurement, incorporating video annotation of 25 robot-assisted radical prostatectomy (RARP) procedures using the Proximie augmented reality platform [5]. A novice surgeon annotated the 12 procedural steps, with 17 videos deemed suitable for analysis and 8 excluded due to poor quality or excessive length. The temporal tagging accuracy ranged from 85.6% to 100%, averaging 93.1% [5].
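The kinematic variables these systems correlate with expertise (position, velocity, acceleration, smoothness) can be sketched in a few lines. The code below is a hedged illustration, not the VAST implementation: it derives a mean-squared-jerk smoothness metric from a synthetic instrument-tip trajectory, where lower jerk corresponds to smoother, more expert-like motion; the sampling interval and trajectories are invented for the example.

```python
import numpy as np

def kinematic_features(positions, dt=0.033):
    """Derive velocity, acceleration, and jerk from an instrument-tip
    trajectory (N x 3 array) sampled every dt seconds. Returns mean
    speed and mean squared jerk (a smoothness proxy: lower = smoother)."""
    v = np.gradient(positions, dt, axis=0)   # velocity
    a = np.gradient(v, dt, axis=0)           # acceleration
    j = np.gradient(a, dt, axis=0)           # jerk
    mean_speed = np.linalg.norm(v, axis=1).mean()
    mean_sq_jerk = (np.linalg.norm(j, axis=1) ** 2).mean()
    return mean_speed, mean_sq_jerk

# Synthetic trajectories: a smooth arc vs. the same arc with tremor
t = np.linspace(0, 2, 60)[:, None]
smooth = np.hstack([np.sin(t), np.cos(t), t])
rng = np.random.default_rng(0)
tremor = smooth + 0.5 * rng.standard_normal(smooth.shape)

_, jerk_smooth = kinematic_features(smooth)
_, jerk_tremor = kinematic_features(tremor)
assert jerk_tremor > jerk_smooth  # tremulous motion scores as less smooth
```

A classifier of the kind described above would consume such features, alongside joint position and velocity statistics, rather than raw video.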

2.3. AI for Postoperative Predictions

One of the most promising applications of AI in robotic surgery is in the prediction of postoperative outcomes. By analyzing data from various sources, including surgical performance metrics and patient histories, AI algorithms can forecast complications such as infection risk, recovery timelines, and potential for recurrence. This capability allows surgeons to customize postoperative care plans based on predictive models, leading to better patient outcomes. A Finnish multicenter study developed a machine learning model to estimate 90-day mortality after radical cystectomy using only preoperative data [6]. The model achieved good accuracy (AUROC 0.73) and identified key risk factors such as ASA class, congestive heart failure, and chronic pulmonary disease. A practical risk table was also created to support clinical decision-making. In the context of robot-assisted radical prostatectomy (RARP), Hung et al. applied machine learning to intraoperative automated performance metrics (APMs) to predict key outcomes, such as hospital length of stay, surgery time, and Foley catheter duration, achieving 87.2% accuracy for the length of stay (LOS) [7]. In a follow-up study, the same group used a deep learning model (DeepSurv) combining APMs and clinicopathological data to predict urinary continence recovery after RARP [8]. The model identified APMs as stronger predictors than clinical features, and surgeons with more efficient APMs had significantly better continence outcomes at 3 and 6 months [8]. Collectively, these studies highlight how AI-driven tools can enhance perioperative decision-making, functional outcome prediction, and individualized patient care in robot-assisted urologic oncology.
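The AUROC figures quoted for such risk models have a concrete meaning: the probability that a randomly chosen patient who experienced the outcome receives a higher risk score than one who did not. The sketch below is purely illustrative — the features, weights, and patients are invented, not those of the Finnish model [6] — and computes AUROC directly from that definition.

```python
import numpy as np

def auroc(scores, labels):
    """AUROC = P(score of a positive case > score of a negative case),
    counting ties as 0.5; computed directly from the definition."""
    pos = scores[labels == 1]
    neg = scores[labels == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Illustrative preoperative features: [ASA class, CHF, chronic pulmonary disease]
X = np.array([
    [2, 0, 0], [4, 0, 1], [4, 1, 1], [1, 0, 0],
    [3, 1, 0], [4, 1, 1], [2, 0, 0], [3, 0, 0],
])
died_90d = np.array([0, 0, 1, 0, 1, 1, 0, 0])

# Hypothetical linear risk score (weights chosen for illustration only)
w = np.array([0.5, 1.2, 0.8])
risk = X @ w
print(round(auroc(risk, died_90d), 3))  # imperfect but informative ranking
```

In practice, models of this kind are trained rather than hand-weighted, and validated on held-out multicenter data before any risk table is derived from them.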

2.4. Addressing Limitations in Haptic Feedback

A persistent challenge in robot-assisted surgery is the absence of haptic (tactile) feedback, which traditionally enables surgeons to assess tissue stiffness, tension, and other physical properties during procedures. This limitation can hinder precision and elevate the risk of intraoperative errors, such as excessive force application or unintentional tissue damage. Recent innovations leveraging artificial intelligence (AI) and advanced imaging technologies aim to bridge this gap by interpreting visual and sensory data to simulate feedback. Machine learning algorithms have been trained to analyze organ characteristics and predict tissue responses, thereby offering surgeons real-time virtual feedback to support more informed intraoperative decisions. In a notable example, Dai et al. developed a biaxial shear sensing system integrated into a robotic grasper, designed to detect excessive tension and provide haptic warnings prior to suture breakage [9]. Their study showed that this system significantly reduced suture failures by 59%, and lowered the average applied force by 25%, without compromising the quality of knot tying. This indicates that sensor-assisted feedback mechanisms can enhance safety and consistency in delicate surgical tasks. Similarly, in the context of robotic-assisted kidney transplantation (RAKT), Piana et al. employed three-dimensional augmented reality (3D AR) to address the inability to palpate atherosclerotic plaques in iliac vessels—an important limitation in minimally invasive vascular procedures [10]. By superimposing high-resolution CT-based virtual models onto the surgical field, the AR system enabled accurate intraoperative localization of vascular plaques, improving surgical navigation and procedural safety. These advancements highlight the critical role of AI and immersive technologies in compensating for the lack of tactile feedback, paving the way for safer and more precise robotic interventions. 
Figure 1 illustrates the integration of AI into robotic surgery workflows, including its applications in surgical skill evaluation, intraoperative guidance, and postoperative prediction.

3. Advancements in Imaging and Augmented Reality (AR)

Modern robot-assisted surgery increasingly relies on enhanced visualization techniques to optimize precision, safety, and personalization. Among these, three-dimensional imaging and augmented reality (AR) have emerged as transformative technologies that bridge the gap between conventional imaging and the dynamic surgical field. These tools are particularly valuable in urologic oncology, where anatomical complexity and oncologic considerations often dictate narrow surgical margins and organ preservation.

3.1. Three-Dimensional (3D) Virtual Models for Surgical Planning

Traditional two-dimensional imaging modalities, such as CT and MRI, while fundamental for diagnosis and staging, have limitations in spatial perception, and do not adequately represent the dynamic complexity of intraoperative anatomy. To address this, patient-specific three-dimensional virtual models (3DVMs) have been developed and integrated into robotic workflows. These 3DVMs are reconstructed from high-resolution cross-sectional images and allow the surgeon to interact with an anatomically faithful representation of the organ, tumor, and surrounding structures (Figure 2). In the context of robot-assisted partial nephrectomy (RAPN), 3DVMs enable a detailed evaluation of tumor size, location, endophyticity, and its relationship to vessels and calyces. Studies have shown that the preoperative use of 3DVMs improves the understanding of vascular anatomy, which in turn facilitates selective or super-selective arterial clamping, a critical strategy for minimizing warm ischemia and preserving renal function [11]. A recent study by Amparore et al. introduced perfusion-region 3DVMs (PR-3DVMs), which use Voronoi-based mathematical algorithms to delineate renal perfusion territories and guide tailored clamping strategies [12]. Their prospective analysis of 103 RAPN cases demonstrated a high concordance between planned and executed vascular management, with significantly better functional outcomes in terms of renal scintigraphy parameters when selective or super-selective clamping was employed.
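The Voronoi principle behind perfusion-region models can be sketched in a few lines: each parenchymal voxel is assigned to the territory of its nearest arterial branch point. The coordinates below are hypothetical and the published method [12] operates on segmented CT angiography, but the partitioning logic is the same.

```python
import numpy as np

def voronoi_perfusion_regions(voxels, branch_points):
    """Assign each parenchymal voxel (N x 3 coordinates) to the index of
    its nearest arterial branch point (M x 3) — a discrete Voronoi
    partition approximating perfusion territories."""
    # Pairwise Euclidean distances, shape (N, M), via broadcasting
    d = np.linalg.norm(voxels[:, None, :] - branch_points[None, :, :], axis=2)
    return d.argmin(axis=1)

# Hypothetical arterial branch endpoints (mm, in the CT frame)
branches = np.array([[10.0, 0.0, 0.0], [0.0, 10.0, 0.0], [0.0, 0.0, 10.0]])
# A few parenchymal voxels to be labeled
voxels = np.array([[9.0, 1.0, 0.0], [1.0, 9.0, 1.0], [0.0, 1.0, 8.0]])

regions = voronoi_perfusion_regions(voxels, branches)
print(regions.tolist())  # → [0, 1, 2]: each voxel labeled with its supplying branch
```

Clamping the artery feeding region 0 would then be predicted to devascularize exactly the voxels labeled 0, which is the basis for planning selective clamping preoperatively.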

3.2. Augmented Reality in Intraoperative Navigation

Augmented reality represents the next step in intraoperative image guidance. By superimposing 3D models onto the live surgical view from the endoscopic camera, AR enhances the surgeon’s perception of structures hidden beneath the visible surface.
AR has been employed during RAPN to project tumors, vessels, and collecting system anatomy directly onto the kidney, visible on the robotic console (Figure 3). This integration facilitates real-time orientation and navigation, minimizing the risk of intraoperative surprises. Intraoperative ultrasound, traditionally used for localization, has been shown to be less precise than AR-guided approaches in complex cases [13]. In a comparative study of patients with complex renal tumors (PADUA score ≥ 10), Porpiglia et al. demonstrated that AR guidance using Hyper Accuracy® 3D (HA3D®) models significantly improved intraoperative navigation during RAPN [13]. Compared to standard 2D ultrasound, AR guidance resulted in lower rates of global ischemia (45.8% vs. 69.7%, p = 0.03), higher tumor enucleation rates (62.5% vs. 37.5%, p = 0.02), and fewer collecting system violations (10.4% vs. 45.5%, p = 0.003). Postoperatively, AR was associated with a smaller drop in renal plasma flow at 3 months (–12.38 vs. –18.14, p = 0.01) and a lower rate of complications. These findings highlight the potential of AR to enhance surgical precision and improve functional outcomes in challenging renal tumor resections.
Supporting these findings, Kobayashi et al. showed that surgical navigation systems integrating real-time endoscopic views with 3D virtual models improved robotic efficiency during RAPN. Their 2019 study demonstrated that surgical navigation significantly reduced inefficient connected motions, such as “insert”, “pull”, and “rotate”, leading to faster identification of the renal artery (9 vs. 16 min, p = 0.008) [14]. A second study by the same group, using propensity score-matched analysis, found that navigation-assisted RAPN achieved significantly greater parenchymal preservation (90.0% vs. 83.5%, p = 0.042) and better extraparenchymal resection control (21.4 mL vs. 17.2 mL, p = 0.041) [15].
A recent innovation in this field is the development of iKidney, an artificial intelligence-driven platform for automated AR overlay during RAPN [16]. Sica et al. reported the first clinical use of this system, which leverages a convolutional neural network (ResNet-50) to segment the kidney from endoscopic images and align a 3D model with the in vivo anatomy in real time [16]. This eliminates the need for manual model manipulation by trained assistants, a major barrier to widespread adoption. In their case, the system enabled accurate identification and enucleoresection of a totally endophytic renal mass with selective clamping and no postoperative complications. Overlay precision reached 97.8% during model training, supporting its feasibility and clinical utility. The integration of AI into AR guidance may democratize this technology by making it more autonomous, scalable, and suitable for use even in non-tertiary centers.
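Segmentation quality of the kind reported for automated overlay systems is conventionally quantified with an overlap metric between the predicted mask and a reference. As a minimal illustration (the masks below are invented toy data, not iKidney output), the Dice coefficient on binary masks:

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice coefficient between two binary segmentation masks:
    2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

# Toy 8x8 masks: predicted kidney segmentation vs. ground truth
truth = np.zeros((8, 8), dtype=int)
truth[2:6, 2:6] = 1                 # 16 ground-truth pixels
pred = np.zeros((8, 8), dtype=int)
pred[2:6, 3:7] = 1                  # prediction shifted one column right
print(round(dice(pred, truth), 3))  # → 0.75: 2*12 / (16+16)
```

A real pipeline would compute such a score per endoscopic frame to monitor how well the 3D model stays registered to the in vivo kidney.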
In RARP, AR has enabled the overlay of prostate tumors and extracapsular extension (ECE) zones onto the real-time surgical field, offering enhanced anatomical visualization during key steps, such as nerve-sparing and dissection. This has proven particularly useful for selective biopsies and intraoperative adjustment of the dissection plane. Some platforms have also introduced elastic 3D models, which deform in response to robotic manipulation, improving anatomical fidelity during surgery. Porpiglia et al. were among the first to explore AR integration into RARP using HA3D models based on mpMRI data [17]. In their early experience, AR guidance enabled the accurate marking of intraprostatic lesions and suspected ECE, with high concordance between AR-identified targets and final pathology (100% lesion match; 73% positive NVB biopsies at marked ECE sites) [17]. A follow-up study validated the dimensional accuracy of these models, reporting <3 mm mismatch in 85% of prostate surfaces when compared to scanned specimens [18]. Building on this, the group introduced a three-dimensional elastic AR system to dynamically simulate prostate deformation during the nerve-sparing phase [19]. This technology achieved a 100% accuracy rate in identifying capsular involvement compared to only 47% with standard 2D cognitive guidance (p < 0.05), offering the potential to reduce positive surgical margins while preserving functional outcomes.

3.3. Immersive Surgical Planning in the Metaverse

Beyond intraoperative overlays, a new horizon is emerging with immersive virtual planning environments, often referred to as the surgical “metaverse”. Using VR headsets and high-speed connectivity, multiple surgical stakeholders can now enter a shared virtual space to review and manipulate patient-specific 3DVMs. Checcucci et al. (2024) reported a novel application where avatars of surgeons and moderators discussed partial nephrectomy cases in a fully virtual meeting room, interacting with hyperaccurate kidney models and simulating resection and clamping strategies prior to the actual surgery [20]. This technology not only overcomes geographic barriers but enhances spatial orientation and team collaboration. Preliminary findings suggest that such preoperative discussions can enrich planning quality and contribute to more precise surgical execution.

3.4. Limitations and Future Directions

Despite its promise, AR adoption is not without challenges. Manual model superimposition is time-consuming and often limited to centers with dedicated technical support. Current models may also fail to account for intraoperative organ deformation, which affects the precision of virtual overlays.
Automated platforms, like iKidney, represent an important step toward scalable solutions, but further validation and regulatory approvals are required before widespread adoption. Future advancements may include real-time deformable models, fully integrated AI-guided overlays, and cloud-based model sharing for remote collaboration. Furthermore, cost and access disparities, especially in resource-limited settings, remain significant barriers to equity in the implementation of these technologies.

4. Real-Time Tissue Recognition and Margin Assessment

The ability to identify tumor boundaries with high precision during oncologic surgery is essential for achieving negative surgical margins (NSMs) while minimizing collateral tissue damage. This challenge is particularly prominent in procedures, such as RARP and RAPN, where the trade-off between oncologic control and the preservation of neurovascular or renal parenchymal structures is delicate. A range of real-time intraoperative technologies has emerged to address this need, including fluorescence imaging, optical spectroscopy, digital histopathology, and machine learning-assisted systems.

4.1. Frozen Section and the NeuroSAFE Technique

The frozen section (FS) analysis remains the intraoperative standard for margin assessment in RARP, particularly when attempting to preserve the neurovascular bundles in cases where extracapsular extension is a concern. Among FS-based techniques, the NeuroSAFE protocol, pioneered at the Martini-Klinik, has gained significant international traction for enabling real-time decision-making on nerve preservation during surgery [21].
The technique involves a systematic FS analysis of the posterolateral margins adjacent to the neurovascular bundles, with the goal of confirming negative margins before committing to bilateral nerve-sparing. If a positive surgical margin (PSM) is detected, a secondary resection is performed on the involved side, often sacrificing the neurovascular bundle [21].
Until recently, the evidence supporting NeuroSAFE was largely retrospective or observational. However, the NeuroSAFE PROOF trial, the first multicenter, phase 3 randomized controlled trial evaluating this technique, has now provided high-level evidence for its effectiveness [22]. The trial demonstrated that this approach significantly improves erectile function at 12 months postoperatively, with no compromise in early oncologic outcomes [22].
Compared to standard RARP, patients in the NeuroSAFE arm had the following outcomes:
  • Better postoperative erectile function (mean IIEF-5 difference: +3.2);
  • Improved early continence at 3 months;
  • Increased nerve-sparing rates, particularly in cases initially deemed unsuitable for bilateral preservation.
Although a slightly higher rate of positive margins was observed, this did not affect short-term cancer control. These results support the use of NeuroSAFE to optimize functional outcomes without compromising oncologic safety.

4.2. Ex Vivo Imaging Technologies

To overcome the limitations of a conventional FS analysis during radical prostatectomy, several ex vivo imaging technologies have emerged, offering rapid tissue assessment combined with high-resolution digital imaging capabilities. Among the emerging intraoperative technologies, fluorescence confocal microscopy (FCM) has shown promise as a rapid, high-resolution imaging method for evaluating prostate margins during robot-assisted radical prostatectomy. Unlike the conventional FS analysis, FCM does not require tissue fixation or sectioning. Instead, freshly excised tissue is stained (typically with acridine orange), digitally scanned, and immediately analyzed. Early studies have demonstrated strong concordance with final histopathology, with reported diagnostic accuracy exceeding 90% and short turnaround times [23]. Recent developments have introduced standardized protocols for applying FCM to assess the neurovascular bundles and posterolateral margins intraoperatively. In comparative analyses, FCM has performed similarly to the NeuroSAFE technique in detecting clinically significant positive margins, while offering the advantage of significantly shorter processing times [24]. Moreover, the digital nature of FCM allows for a remote pathological review, which may be particularly valuable for centers without onsite pathology support. As the technology evolves, FCM may become an important adjunct to real-time surgical decision-making in robotic prostatectomy, enhancing oncologic safety while preserving functional outcomes. Another promising approach is specimen PET-CT imaging, exemplified by platforms, such as Aura-10™. This method involves the preoperative administration of PSMA-targeted radiotracers, followed by PET-CT imaging of the excised prostate specimen. The technique provides three-dimensional visualization of radiotracer uptake at the tumor margins with submillimetric resolution. 
Preliminary findings indicate a strong concordance with final histopathology, and the potential to detect focal positive surgical margins and lymph node metastases with high precision [25]. However, concerns regarding radiation exposure, cost, and clinical workflow integration remain challenges to its widespread adoption. Together, these ex vivo imaging modalities represent a shift toward faster, more precise intraoperative margin assessments, and may complement or eventually reduce the reliance on the conventional FS analysis in prostate cancer surgery.

4.3. In Vivo Optical and Spectroscopic Techniques

In contrast to ex vivo tools, several emerging technologies now offer the ability to assess tissue intraoperatively while it remains in situ. These optical and spectroscopic methods aim to provide real-time guidance during robot-assisted surgery, particularly in procedures requiring precise margin control and nerve preservation. One such technique is confocal laser endomicroscopy (CLE), a fiber-optic imaging technology that provides real-time, high-resolution visualization of tissues at the cellular level, using a 488-nm laser in conjunction with intravenously administered fluorescein. Originally applied in urology for upper tract and bladder tumors, CLE has also been investigated for intraoperative use in RARP. In a feasibility study by Lopez et al., CLE was successfully integrated into the da Vinci robotic platform using the TilePro™ interface, allowing direct visualization of the prostatic and periprostatic tissues during nerve-sparing dissection [26]. The system employs miniature probes, ranging from 0.85 to 2.6 mm in diameter, offering varying fields of view and resolution depending on the surgical access requirements. Key anatomical landmarks, including the bladder neck, urethral stump, and neurovascular bundles, were clearly visualized, suggesting that CLE could serve as a real-time adjunct to pathological assessment. To support clinical interpretation, a visual atlas of prostate-specific CLE images has been developed [27]. While promising, clinical data remain limited, and larger studies are needed to determine whether CLE can reliably guide margin assessment and improve functional outcomes in prostate cancer surgery.
Raman spectroscopy (RS) is another promising modality, utilizing inelastic light scattering to differentiate tissue types based on molecular composition. When applied intraoperatively through fiber-optic probes, RS has demonstrated high sensitivity (approximately 87%) and specificity (around 86%) in distinguishing benign from malignant prostate tissue [28]. A notable advancement in this field is the integration of RS directly into the da Vinci robotic surgical platform, as demonstrated by Pinto et al. They developed a dual-wavelength RS system (680 and 785 nm) adapted for robotic manipulation, and successfully collected in vivo spectra at the surgical margin during RARP. In a proof-of-concept study involving 20 prostatectomy specimens and additional intraoperative measurements in four patients, RS achieved impressive accuracy (91%), sensitivity (90.5%), and specificity (96%) in distinguishing prostatic from extraprostatic tissues [29]. The system’s compatibility with standard robotic instruments and fast acquisition time (~2 s per point) suggests it can be incorporated into surgical workflows with minimal disruption. These early results highlight the potential of RS to serve as a real-time margin assessment tool, potentially reducing positive surgical margins and supporting nerve-sparing decisions. However, broader clinical validation, including robust classification models and standardized workflows, is needed before routine adoption in robotic prostate surgery.
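The sensitivity and specificity figures quoted for RS reduce to simple confusion-matrix arithmetic. The sketch below, with invented margin-point labels purely to show the calculation, computes sensitivity (malignant tissue correctly flagged) and specificity (benign tissue correctly cleared) from a binary classifier's output.

```python
def sensitivity_specificity(y_true, y_pred):
    """Confusion-matrix metrics for a binary tissue classifier:
    1 = malignant (positive), 0 = benign (negative)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

# Invented labels for ten sampled margin points (illustration only)
truth      = [1, 1, 1, 1, 0, 0, 0, 0, 0, 1]
prediction = [1, 1, 1, 0, 0, 0, 0, 1, 0, 1]

sens, spec = sensitivity_specificity(truth, prediction)
print(f"sensitivity={sens:.2f}, specificity={spec:.2f}")
# → sensitivity=0.80, specificity=0.80
```

In the margin-assessment setting, sensitivity governs how often residual tumor is caught, while specificity governs how often healthy neurovascular tissue is spared an unnecessary secondary resection.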
Key variables, such as sensitivity, specificity, cost-effectiveness, and ease of use, differ significantly, suggesting that a single “best” tool may not be appropriate across all clinical contexts. Despite strong early-phase data, these technologies must overcome barriers including capital cost, training requirements, and lack of reimbursement pathways. Regulatory pathways and collaborative trials are necessary to establish their value in oncologic outcomes and patient-centered metrics.

4.4. Indocyanine Green (ICG) and Fluorescence-Guided Partial Nephrectomy

In robot-assisted partial nephrectomy (RAPN), the use of near-infrared fluorescence (NIRF) imaging with indocyanine green (ICG) has emerged as a widely adopted strategy for real-time anatomical and functional assessment [30]. ICG is a fluorescent dye that, upon intravenous injection, binds to plasma proteins and emits fluorescence when exposed to near-infrared light, enabling intraoperative visualization of vascularized structures [31]. One of the primary applications of ICG in RAPN is the assessment of renal perfusion. Following ICG administration, the perfusion status of the renal parenchyma becomes immediately visible, allowing the surgeon to confirm the efficacy of arterial clamping—whether selective or super-selective [31]. This visualization is crucial for avoiding unnecessary devascularization of healthy tissue and ensuring that the intended ischemic zone has been accurately targeted.
ICG is also valuable in enhancing tumor-to-parenchyma contrast, particularly in cases of clear cell renal cell carcinoma (ccRCC). Due to defective proximal tubule function, these tumors often exhibit altered ICG uptake, appearing hypofluorescent in comparison to the surrounding normal parenchyma. This differential fluorescence assists in tumor localization, real-time guidance of excision, and delineation of surgical margins [32]. Additionally, post-resection imaging with ICG can help confirm complete tumor removal by revealing any residual fluorescence within the resection bed, which may indicate retained tumor tissue or regions of poor perfusion.
Some surgeons further employ ICG-enhanced enucleation planes, which integrate visual and perfusion data to refine dissection accuracy. ICG has also played a role in the development of “zero-ischemia” techniques, where fluorescence is used to verify that non-clamped regions of the kidney remain perfused during selective arterial control [33]. Technological advancements, such as the Firefly® system integrated into da Vinci platforms, allow for seamless toggling between white light and NIRF modes, facilitating an efficient workflow during robotic surgery.

4.5. Augmented Reality and Artificial Intelligence for Margin Guidance

Emerging technologies, such as augmented reality (AR) and artificial intelligence (AI), are reshaping intraoperative strategies for margin assessment. AR platforms that overlay MRI-derived 3D models onto live endoscopic views provide surgeons with a non-invasive method to visualize areas at increased risk of extracapsular extension [34]. Though not direct detection tools, these systems enhance spatial orientation and can guide selective biopsies or dynamic adjustments to nerve-sparing plans. Advanced AR systems also simulate soft-tissue deformation, improving the anatomical accuracy of overlaid tumor boundaries during real-time manipulation [19]. In parallel, machine learning algorithms are being developed to interpret intraoperative imaging modalities with increasing autonomy. Automated segmentation of PSMA PET-CT data has already demonstrated high accuracy (up to 97–99%) in correlating radiologic signals with histopathologic findings [35]. A summary of AR and real-time margin assessment technologies can be found in Table 1. A comparative analysis of biosensor technologies can be found in Table 2.

5. Single-Port vs. Multi-Port Approaches in Robot-Assisted Urologic Oncology

Robot-assisted surgery in urology relies primarily on two core systems: multi-port (MP) platforms, such as the da Vinci Xi, and single-port (SP) platforms, such as the da Vinci SP. These systems differ significantly in terms of design, ergonomics, and clinical application (Table 3). The MP systems use multiple trocar insertions for individual instruments, which provide excellent triangulation and a broad working space. By contrast, the SP platform uses a single 25 mm multichannel port to deploy a flexible camera and multiple wristed instruments, all through a single incision. This allows for minimized invasiveness, reduced postoperative pain, and improved cosmesis, but presents challenges, such as reduced traction and a steeper learning curve.
The da Vinci Xi system offers high-definition 3D visualization, EndoWrist technology for enhanced dexterity, and easy docking for multi-quadrant access. It remains the most widely used platform for RARP, RARC, and RAPN. On the other hand, the SP system has enabled novel surgical approaches, such as transvesical RARP, retroperitoneal nephrectomy, and extraperitoneal procedures, which were previously difficult or impossible with MP systems. Studies have shown that SP surgery may offer comparable oncologic outcomes with advantages in pain scores and same-day discharge rates, although limitations remain, such as reduced lymph node yield and limited availability globally.
Importantly, the SP system incorporates a relocation pedal allowing the surgeon to reposition the entire camera-instrument array within the body cavity without undocking, which is essential for surgeries in confined anatomical spaces. However, the reduced range of motion and diminished haptic feedback compared to MP systems can impact delicate maneuvers, such as nerve-sparing. Consequently, surgeon experience and careful patient selection remain critical to the success of SP robotic procedures.

The Emergence of Single-Port Robotic Surgery

The emergence of the da Vinci SP platform marks a significant evolution in robotic surgery, introducing a compact system that operates through a single 2.5 cm multichannel port housing three articulating instruments and a flexible 3D endoscope. This design minimizes incisional morbidity, improves cosmesis, and facilitates access to deep pelvic and retroperitoneal spaces. Early multi-institutional experiences have confirmed the feasibility and safety of SP procedures, including radical prostatectomy (SP-RARP), partial nephrectomy (SP-RAPN), and radical cystectomy, although the current evidence largely stems from retrospective or single-arm series [36,37,38].
In radical prostatectomy, SP-RARP has been successfully performed using transperitoneal, extraperitoneal, and transvesical approaches. Outcomes appear comparable to multi-port RARP (MP-RARP), with reports of reduced postoperative opioid use, shorter hospital stays, excellent cosmetic satisfaction, and similar recovery of continence and erectile function at 3 months [39,40,41,42]. One large series also noted a higher rate of outpatient surgeries in the SP-RARP group, reflecting improved recovery profiles [43]. Studies such as those by Noh et al., Vigneswaran et al., and Ge et al. confirm equivalent oncologic and functional outcomes, including continence and complication rates, while SP techniques may result in decreased opioid consumption [44,45,46]. However, technical drawbacks, such as reduced triangulation and the absence of a fourth robotic arm, can hinder lymph node dissection or complex reconstruction efforts [47].
For partial nephrectomy, SP-RAPN is feasible in carefully selected patients, typically those with exophytic, low-complexity, or polar tumors. Warm ischemia times and renal function preservation are on par with MP-RAPN, though challenges remain with instrument triangulation, intracorporeal suturing, and assistant access [48,49]. Meta-analyses suggest reduced postoperative pain and better cosmetic outcomes with SP platforms, but operative times may be longer, and the learning curve steeper [48]. Importantly, complication and conversion rates appear similar between SP and MP approaches in various studies [50].
SP radical cystectomy with intracorporeal diversion has also been performed in limited series, demonstrating encouraging early outcomes [51]. However, the technical demands of complex reconstruction through a single port have limited widespread adoption. Similarly, procedures such as ureteral reimplantation and simple prostatectomy have been shown to be feasible using SP systems, with comparable outcomes to MP techniques when performed by experienced surgeons [52,53].
The strengths of SP platforms include reduced pain, superior cosmesis, a smaller surgical footprint ideal for narrow anatomical spaces, and the potential for future natural orifice transluminal endoscopic surgery (NOTES). Several series have reported shorter hospital stays and decreased opioid requirements [40]. However, key limitations persist, notably high cost (approximately $2 million per system), reduced instrument strength, lack of compatibility with some adjunct tools (e.g., Firefly fluorescence imaging), absence of a fourth robotic arm, and limited bedside assistant access [54]. Instrumentation often requires adaptation, such as the use of magnetic retractors or novel suction tools, to accommodate the constrained workspace. While future iterations of SP systems may address these issues, cost remains a major obstacle to broader implementation.
Transitioning to SP surgery also presents training and ergonomic challenges. Surgeons must relearn docking, orientation, and intracorporeal suturing within a more confined field, particularly in reconstructive or anatomically complex cases. Future developments may focus on expanding the range of instruments available (e.g., robotic staplers and suction devices), improving software integration with technologies like augmented reality and AI-guided movement, and exploring hybrid SP–MP workflows that balance access and invasiveness. Cost reduction through market competition and shared-use models will be crucial for the broader adoption of SP robotic surgery.

6. Telesurgery and the Future of Remote Robotic Urology

Telesurgery—the remote execution or guidance of surgical procedures via robotic systems—has become an increasingly viable model thanks to advances in robotic precision, low-latency networking, and global digital infrastructure. While early demonstrations, such as the Lindbergh transatlantic procedure in 2001, were hampered by high costs and transmission delays, recent breakthroughs using 5G technology have renewed interest in the field [55]. The ultra-low latency and high bandwidth of 5G networks, with transmission delays consistently below 200 milliseconds, now enable real-time control of surgical robots across vast distances.
Clinical studies in China have successfully demonstrated 5G-assisted telesurgeries involving radical cystectomy, radical nephrectomy, adrenalectomy, and varicocelectomy, all performed without intraoperative complications and with acceptable operative metrics [56,57,58,59,60]. These real-world applications suggest that latency under 300 milliseconds is generally safe for technically demanding tasks such as laparoscopic suturing. In these settings, novel robotic platforms have played a pivotal role. The MicroHand Edge MP1000 and KD-SR-01 (KangDuo) systems have demonstrated stable telesurgical performance, each offering dual-console capabilities to ensure safe handover in case of network interruptions [56,61]. The Edge MP1000, for example, has completed over 30 human telesurgeries across distances exceeding 6000 km, maintaining round-trip latencies under 200 ms and achieving full procedural success without perceptible delay. The KangDuo system has advanced the concept further by enabling triple-console telesurgery, which supports real-time collaboration and teaching. Its open-console design provides multi-angle visualization of the operating room, enhancing situational awareness. Similarly, the Toumai® system developed by MicroPort has been used in over 100 telesurgeries, including complex urologic procedures, such as radical prostatectomy, with latency values ranging from 24 to 159 ms depending on network distance. Notably, this platform has achieved safe operation over distances up to 5000 km using both 5G and dedicated fiber optic networks. In Japan, the Hinotori™ system has undergone successful preclinical telesurgery testing across distances up to 2000 km, with latency as low as 40 ms in cadaveric models [62]. While not yet used for human telesurgery, Hinotori’s compact design, haptic feedback, and intuitive interface have positioned it as a strong candidate for future deployment in clinical remote surgery. 
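The dual-console safeguards described above depend on detecting when round-trip latency drifts beyond a safe bound (the studies cited here treat roughly 300 ms as generally safe). The following is a minimal, hypothetical sketch of such a latency watchdog; the threshold, sampling window, and single-spike handover policy are illustrative assumptions, not any vendor's actual implementation:

```python
from collections import deque

SAFE_RTT_MS = 300.0   # assumed safety threshold, per the figure cited in the text
WINDOW = 20           # number of recent round-trip samples to retain (arbitrary)

class LatencyWatchdog:
    """Track recent round-trip latencies and flag when remote control should
    hand over to the local (bedside) console. Illustrative sketch only."""

    def __init__(self, threshold_ms: float = SAFE_RTT_MS, window: int = WINDOW):
        self.threshold_ms = threshold_ms
        self.samples = deque(maxlen=window)  # oldest samples drop off automatically

    def record(self, rtt_ms: float) -> None:
        self.samples.append(rtt_ms)

    def remote_control_safe(self) -> bool:
        # Conservative policy: every recent sample must be under threshold,
        # so a single dropout spike triggers handover.
        return bool(self.samples) and max(self.samples) < self.threshold_ms

wd = LatencyWatchdog()
for rtt in [150, 160, 155, 420, 158]:  # simulated RTTs in ms, with one spike
    wd.record(rtt)
print("remote safe:", wd.remote_control_safe())  # prints: remote safe: False
```

A production system would of course combine network telemetry with hardware interlocks and a staged handover protocol; the sketch only shows the decision logic in its simplest form.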
These developments illustrate the growing maturity of telesurgical platforms, underscoring the importance of robust network reliability, modular hardware design, and integrated safeguards as key enablers for global telesurgery adoption.
Beyond these technical advancements, telesurgery holds the potential to significantly expand access to specialized surgical care. By enabling expert surgeons to operate remotely, this model can help bridge the gap between high-volume centers and geographically isolated or underserved regions. Patients may benefit from reduced travel burdens, while healthcare systems can make more efficient use of specialized expertise. However, several challenges must be addressed before widespread adoption becomes feasible. Ensuring secure, encrypted data transmission is critical to maintaining patient safety and privacy. The initial costs associated with telesurgical infrastructure remain substantial, raising concerns about cost-effectiveness in the early stages of implementation. Furthermore, the shift to remote surgery demands a rethinking of training and credentialing processes. Surgeons must adapt to working without tactile feedback and develop proficiency through structured simulation and virtual environments to ensure safety and competence in remote settings.

7. Future Directions

As robotic surgery continues to evolve, future efforts will focus on integrating real-time AI-driven decision support, fully autonomous imaging overlays, and haptic feedback technologies into routine surgical workflows. Expanding access to advanced systems through cost-reduction strategies and refining training protocols for single-port and remote surgery will be key priorities. Ultimately, the convergence of intelligent automation, immersive visualization, and global connectivity may redefine the standard of care in urologic oncology. Barriers to widespread adoption include lack of standardization, training burdens, and limited cost-effectiveness data. AI-driven haptic systems, for example, remain underutilized due to hardware constraints and absence of reimbursement incentives. Broader adoption will depend on evidence generation, streamlined integration into existing platforms, and support from healthcare systems and industry. Table 4 summarizes the maturity level and clinical readiness of key technologies discussed in this review. It highlights their respective phases of validation, strengths, and considerations for broader implementation in robotic urologic oncology.

8. Conclusions

Robot-assisted surgery continues to evolve at a rapid pace, driven by innovations in artificial intelligence, imaging, augmented reality, and surgical platforms. These advancements are transforming every stage of urologic oncology care, from preoperative planning and intraoperative precision to postoperative recovery and remote surgical capabilities. Technologies such as machine learning-guided decision support, real-time tissue recognition, and immersive AR-based navigation are not only enhancing oncologic and functional outcomes but also expanding access through single-port and telesurgical systems. While challenges remain in terms of cost, training, and infrastructure, the integration of these emerging tools signals a shift toward more personalized, precise, and globally accessible urologic cancer care. Although many technologies show early promise, most require further validation in multicenter prospective trials. Key areas for future research include the impact of AR and biosensor tools on margin control and functional outcomes, cost-effectiveness evaluations of AI-driven tools, and the safe scalability of telesurgery. Ultimately, success in this field will depend not only on technological innovation but also on continued interdisciplinary collaboration, clinical validation, regulatory support, and equitable implementation across diverse healthcare settings.

Author Contributions

Conceptualization: S.K.; literature search: S.K., T.B., L.T., G.F., I.T. and N.K.; writing—original draft preparation: S.K. and T.B.; writing—review and editing: S.K., L.T., I.T. and N.K.; supervision: A.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data sharing is not applicable.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Mikhail, D.; Sarcona, J.; Mekhail, M.; Richstone, L. Urologic Robotic Surgery. Surg. Clin. N. Am. 2020, 100, 361–378. [Google Scholar] [CrossRef] [PubMed]
  2. Ilic, D.; Evans, S.M.; Allan, C.A.; Jung, J.H.; Murphy, D.; Frydenberg, M. Laparoscopic and robotic-assisted versus open radical prostatectomy for the treatment of localised prostate cancer. Cochrane Database Syst. Rev. 2017, 9, Cd009625. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  3. Bellos, T.; Manolitsis, I.; Katsimperis, S.; Juliebø-Jones, P.; Feretzakis, G.; Mitsogiannis, I.; Varkarakis, I.; Somani, B.K.; Tzelves, L. Artificial Intelligence in Urologic Robotic Oncologic Surgery: A Narrative Review. Cancers 2024, 16, 1775. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  4. Nosrati, M.S.; Amir-Khalili, A.; Peyrat, J.M.; Abinahed, J.; Al-Alao, O.; Al-Ansari, A.; Abugharbieh, R.; Hamarneh, G. Endoscopic scene labelling and augmentation using intraoperative pulsatile motion and colour appearance cues with preoperative anatomical priors. Int. J. Comput. Assist. Radiol. Surg. 2016, 11, 1409–1418. [Google Scholar] [CrossRef] [PubMed]
  5. Cheikh Youssef, S.; Hachach-Haram, N.; Aydin, A.; Shah, T.T.; Sapre, N.; Nair, R.; Rai, S.; Dasgupta, P. Video labelling robot-assisted radical prostatectomy and the role of artificial intelligence (AI): Training a novice. J. Robot. Surg. 2023, 17, 695–701. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  6. Klén, R.; Salminen, A.P.; Mahmoudian, M.; Syvänen, K.T.; Elo, L.L.; Boström, P.J. Prediction of complication related death after radical cystectomy for bladder cancer with machine learning methodology. Scand. J. Urol. 2019, 53, 325–331. [Google Scholar] [CrossRef] [PubMed]
  7. Hung, A.J.; Chen, J.; Che, Z.; Nilanon, T.; Jarc, A.; Titus, M.; Oh, P.J.; Gill, I.S.; Liu, Y. Utilizing Machine Learning and Automated Performance Metrics to Evaluate Robot-Assisted Radical Prostatectomy Performance and Predict Outcomes. J. Endourol. 2018, 32, 438–444. [Google Scholar] [CrossRef] [PubMed]
  8. Hung, A.J.; Chen, J.; Ghodoussipour, S.; Oh, P.J.; Liu, Z.; Nguyen, J.; Purushotham, S.; Gill, I.S.; Liu, Y. A deep-learning model using automated performance metrics and clinical features to predict urinary continence recovery after robot-assisted radical prostatectomy. BJU Int. 2019, 124, 487–495. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  9. Dai, Y.; Abiri, A.; Pensa, J.; Liu, S.; Paydar, O.; Sohn, H.; Sun, S.; Pellionisz, P.A.; Pensa, C.; Dutson, E.P.; et al. Biaxial sensing suture breakage warning system for robotic surgery. Biomed. Microdevices 2019, 21, 10. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  10. Piana, A.; Gallioli, A.; Amparore, D.; Diana, P.; Territo, A.; Campi, R.; Gaya, J.M.; Guirado, L.; Checcucci, E.; Bellin, A.; et al. Three-dimensional Augmented Reality-guided Robotic-assisted Kidney Transplantation: Breaking the Limit of Atheromatic Plaques. Eur. Urol. 2022, 82, 419–426. [Google Scholar] [CrossRef] [PubMed]
  11. Roberts, S.; Desai, A.; Checcucci, E.; Puliatti, S.; Taratkin, M.; Kowalewski, K.F.; Rivas, J.G.; Rivero, I.; Veneziano, D.; Autorino, R.; et al. “Augmented reality” applications in urology: A systematic review. Minerva Urol. Nephrol. 2022, 74, 528–537. [Google Scholar] [CrossRef] [PubMed]
  12. Amparore, D.; Pecoraro, A.; Checcucci, E.; Piramide, F.; Verri, P.; De Cillis, S.; Granato, S.; Angusti, T.; Solitro, F.; Veltri, A.; et al. Three-dimensional Virtual Models’ Assistance During Minimally Invasive Partial Nephrectomy Minimizes the Impairment of Kidney Function. Eur. Urol. Oncol. 2022, 5, 104–108. [Google Scholar] [CrossRef] [PubMed]
  13. Porpiglia, F.; Checcucci, E.; Amparore, D.; Piramide, F.; Volpi, G.; Granato, S.; Verri, P.; Manfredi, M.; Bellin, A.; Piazzolla, P.; et al. Three-dimensional Augmented Reality Robot-assisted Partial Nephrectomy in Case of Complex Tumours (PADUA ≥10): A New Intraoperative Tool Overcoming the Ultrasound Guidance. Eur. Urol. 2020, 78, 229–238. [Google Scholar] [CrossRef] [PubMed]
  14. Kobayashi, S.; Cho, B.; Huaulmé, A.; Tatsugami, K.; Honda, H.; Jannin, P.; Hashizume, M.; Eto, M. Assessment of surgical skills by using surgical navigation in robot-assisted partial nephrectomy. Int. J. Comput. Assist. Radiol. Surg. 2019, 14, 1449–1459. [Google Scholar] [CrossRef] [PubMed]
  15. Kobayashi, S.; Cho, B.; Mutaguchi, J.; Inokuchi, J.; Tatsugami, K.; Hashizume, M.; Eto, M. Surgical Navigation Improves Renal Parenchyma Volume Preservation in Robot-Assisted Partial Nephrectomy: A Propensity Score Matched Comparative Analysis. J. Urol. 2020, 204, 149–156. [Google Scholar] [CrossRef] [PubMed]
  16. Sica, M.; Piazzolla, P.; Amparore, D.; Verri, P.; De Cillis, S.; Piramide, F.; Volpi, G.; Piana, A.; Di Dio, M.; Alba, S.; et al. 3D Model Artificial Intelligence-Guided Automatic Augmented Reality Images during Robotic Partial Nephrectomy. Diagnostics 2023, 13, 3454. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  17. Porpiglia, F.; Fiori, C.; Checcucci, E.; Amparore, D.; Bertolo, R. Augmented Reality Robot-assisted Radical Prostatectomy: Preliminary Experience. Urology 2018, 115, 184. [Google Scholar] [CrossRef] [PubMed]
  18. Porpiglia, F.; Checcucci, E.; Amparore, D.; Autorino, R.; Piana, A.; Bellin, A.; Piazzolla, P.; Massa, F.; Bollito, E.; Gned, D.; et al. Augmented-reality robot-assisted radical prostatectomy using hyper-accuracy three-dimensional reconstruction (HA3D™) technology: A radiological and pathological study. BJU Int. 2019, 123, 834–845. [Google Scholar] [CrossRef] [PubMed]
  19. Porpiglia, F.; Checcucci, E.; Amparore, D.; Manfredi, M.; Massa, F.; Piazzolla, P.; Manfrin, D.; Piana, A.; Tota, D.; Bollito, E.; et al. Three-dimensional Elastic Augmented-reality Robot-assisted Radical Prostatectomy Using Hyperaccuracy Three-dimensional Reconstruction Technology: A Step Further in the Identification of Capsular Involvement. Eur. Urol. 2019, 76, 505–514. [Google Scholar] [CrossRef] [PubMed]
  20. Checcucci, E.; Amparore, D.; Volpi, G.; De Cillis, S.; Piramide, F.; Verri, P.; Piana, A.; Sica, M.; Gatti, C.; Alessio, P.; et al. Metaverse Surgical Planning with Three-dimensional Virtual Models for Minimally Invasive Partial Nephrectomy. Eur. Urol. 2024, 85, 320–325. [Google Scholar] [CrossRef] [PubMed]
  21. Beyer, B.; Schlomm, T.; Tennstedt, P.; Boehm, K.; Adam, M.; Schiffmann, J.; Sauter, G.; Wittmer, C.; Steuber, T.; Graefen, M.; et al. A feasible and time-efficient adaptation of NeuroSAFE for da Vinci robot-assisted radical prostatectomy. Eur. Urol. 2014, 66, 138–144. [Google Scholar] [CrossRef] [PubMed]
  22. Dinneen, E.; Almeida-Magana, R.; Al-Hammouri, T.; Pan, S.; Leurent, B.; Haider, A.; Freeman, A.; Roberts, N.; Brew-Graves, C.; Grierson, J.; et al. Effect of NeuroSAFE-guided RARP versus standard RARP on erectile function and urinary continence in patients with localised prostate cancer (NeuroSAFE PROOF): A multicentre, patient-blinded, randomised, controlled phase 3 trial. Lancet Oncol. 2025, 26, 447–458. [Google Scholar] [CrossRef] [PubMed]
  23. Puliatti, S.; Bertoni, L.; Pirola, G.M.; Azzoni, P.; Bevilacqua, L.; Eissa, A.; Elsherbiny, A.; Sighinolfi, M.C.; Chester, J.; Kaleci, S.; et al. Ex vivo fluorescence confocal microscopy: The first application for real-time pathological examination of prostatic tissue. BJU Int. 2019, 124, 469–476. [Google Scholar] [CrossRef] [PubMed]
  24. Rocco, B.; Sarchi, L.; Assumma, S.; Cimadamore, A.; Montironi, R.; Reggiani Bonetti, L.; Turri, F.; De Carne, C.; Puliatti, S.; Maiorana, A.; et al. Digital Frozen Sections with Fluorescence Confocal Microscopy During Robot-assisted Radical Prostatectomy: Surgical Technique. Eur. Urol. 2021, 80, 724–729. [Google Scholar] [CrossRef] [PubMed]
  25. Darr, C.; Costa, P.F.; Kahl, T.; Moraitis, A.; Engel, J.; Al-Nader, M.; Reis, H.; Köllermann, J.; Kesch, C.; Krafft, U.; et al. Intraoperative Molecular Positron Emission Tomography Imaging for Intraoperative Assessment of Radical Prostatectomy Specimens. Eur. Urol. Open Sci. 2023, 54, 28–32. [Google Scholar] [CrossRef] [PubMed Central]
  26. Lopez, A.; Zlatev, D.V.; Mach, K.E.; Bui, D.; Liu, J.J.; Rouse, R.V.; Harris, T.; Leppert, J.T.; Liao, J.C. Intraoperative Optical Biopsy during Robotic Assisted Radical Prostatectomy Using Confocal Endomicroscopy. J. Urol. 2016, 195, 1110–1117. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  27. Panarello, D.; Compérat, E.; Seyde, O.; Colau, A.; Terrone, C.; Guillonneau, B. Atlas of Ex Vivo Prostate Tissue and Cancer Images Using Confocal Laser Endomicroscopy: A Project for Intraoperative Positive Surgical Margin Detection During Radical Prostatectomy. Eur. Urol. Focus 2020, 6, 941–958. [Google Scholar] [CrossRef] [PubMed]
  28. Crow, P.; Molckovsky, A.; Stone, N.; Uff, J.; Wilson, B.; WongKeeSong, L.M. Assessment of fiberoptic near-infrared raman spectroscopy for diagnosis of bladder and prostate cancer. Urology 2005, 65, 1126–1130. [Google Scholar] [CrossRef] [PubMed]
  29. Pinto, M.; Zorn, K.C.; Tremblay, J.P.; Desroches, J.; Dallaire, F.; Aubertin, K.; Marple, E.T.; Kent, C.; Leblond, F.; Trudel, D.; et al. Integration of a Raman spectroscopy system to a robotic-assisted surgical system for real-time tissue characterization during radical prostatectomy procedures. J. Biomed. Opt. 2019, 24, 1–10. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  30. Katsimperis, S.; Tzelves, L.; Bellos, T.; Manolitsis, I.; Mourmouris, P.; Kostakopoulos, N.; Pyrgidis, N.; Somani, B.; Papatsoris, A.; Skolarikos, A. The use of indocyanine green in partial nephrectomy: A systematic review. Cent. Eur. J. Urol. 2024, 77, 15–21. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  31. Bjurlin, M.A.; McClintock, T.R.; Stifelman, M.D. Near-infrared fluorescence imaging with intraoperative administration of indocyanine green for robotic partial nephrectomy. Curr. Urol. Rep. 2015, 16, 20. [Google Scholar] [CrossRef] [PubMed]
  32. Gadus, L.; Kocarek, J.; Chmelik, F.; Matejkova, M.; Heracek, J. Robotic Partial Nephrectomy with Indocyanine Green Fluorescence Navigation. Contrast Media Mol. Imaging 2020, 2020, 1287530. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  33. Borofsky, M.S.; Gill, I.S.; Hemal, A.K.; Marien, T.P.; Jayaratna, I.; Krane, L.S.; Stifelman, M.D. Near-infrared fluorescence imaging to facilitate super-selective arterial clamping during zero-ischaemia robotic partial nephrectomy. BJU Int. 2013, 111, 604–610. [Google Scholar] [CrossRef] [PubMed]
  34. Martini, A.; Falagario, U.G.; Cumarasamy, S.; Jambor, I.; Wagaskar, V.G.; Ratnani, P.; Iii, K.G.H.; Tewari, A.K. The Role of 3D Models Obtained from Multiparametric Prostate MRI in Performing Robotic Prostatectomy. J. Endourol. 2022, 36, 387–393. [Google Scholar] [CrossRef] [PubMed]
  35. Rovera, G.; Grimaldi, S.; Oderda, M.; Finessi, M.; Giannini, V.; Passera, R.; Gontero, P.; Deandreis, D. Machine Learning CT-Based Automatic Nodal Segmentation and PET Semi-Quantification of Intraoperative (68)Ga-PSMA-11 PET/CT Images in High-Risk Prostate Cancer: A Pilot Study. Diagnostics 2023, 13, 3013. [Google Scholar] [CrossRef] [PubMed Central]
  36. Kaouk, J.; Garisto, J.; Eltemamy, M.; Bertolo, R. Pure Single-Site Robot-Assisted Partial Nephrectomy Using the SP Surgical System: Initial Clinical Experience. Urology 2019, 124, 282–285. [Google Scholar] [CrossRef] [PubMed]
  37. Kaouk, J.; Bertolo, R.; Eltemamy, M.; Garisto, J. Single-Port Robot-Assisted Radical Prostatectomy: First Clinical Experience Using The SP Surgical System. Urology 2019, 124, 309. [Google Scholar] [CrossRef] [PubMed]
  38. Zhang, M.; Thomas, D.; Salama, G.; Ahmed, M. Single port robotic radical cystectomy with intracorporeal urinary diversion: A case series and review. Transl. Androl. Urol. 2020, 9, 925–930. [Google Scholar] [CrossRef] [PubMed Central]
  39. Hinojosa-Gonzalez, D.E.; Roblesgil-Medrano, A.; Torres-Martinez, M.; Alanis-Garza, C.; Estrada-Mendizabal, R.J.; Gonzalez-Bonilla, E.A.; Flores-Villalba, E.; Olvera-Posada, D. Single-port versus multiport robotic-assisted radical prostatectomy: A systematic review and meta-analysis on the da Vinci SP platform. Prostate 2022, 82, 405–414. [Google Scholar] [CrossRef] [PubMed]
  40. Fahmy, O.; Fahmy, U.A.; Alhakamy, N.A.; Khairul-Asri, M.G. Single-Port versus Multiple-Port Robot-Assisted Radical Prostatectomy: A Systematic Review and Meta-Analysis. J. Clin. Med. 2021, 10, 5723. [Google Scholar] [CrossRef] [PubMed Central]
  41. Li, K.; Yu, X.; Yang, X.; Huang, J.; Deng, X.; Su, Z.; Wang, C.; Wu, T. Perioperative and Oncologic Outcomes of Single-Port vs Multiport Robot-Assisted Radical Prostatectomy: A Meta-Analysis. J. Endourol. 2022, 36, 83–98. [Google Scholar] [CrossRef] [PubMed]
  42. Noël, J.; Moschovas, M.C.; Sandri, M.; Bhat, S.; Rogers, T.; Reddy, S.; Corder, C.; Patel, V. Patient surgical satisfaction after da Vinci(®) single-port and multi-port robotic-assisted radical prostatectomy: Propensity score-matched analysis. J. Robot. Surg. 2022, 16, 473–481. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  43. Lenfant, L.; Sawczyn, G.; Aminsharifi, A.; Kim, S.; Wilson, C.A.; Beksac, A.T.; Schwen, Z.; Kaouk, J. Pure Single-site Robot-assisted Radical Prostatectomy Using Single-port Versus Multiport Robotic Radical Prostatectomy: A Single-institution Comparative Study. Eur. Urol. Focus 2021, 7, 964–972. [Google Scholar] [CrossRef] [PubMed]
  44. Noh, T.I.; Kang, Y.J.; Shim, J.S.; Kang, S.H.; Cheon, J.; Lee, J.G.; Kang, S.G. Single-Port vs Multiport Robot-Assisted Radical Prostatectomy: A Propensity Score Matching Comparative Study. J. Endourol. 2022, 36, 661–667. [Google Scholar] [CrossRef] [PubMed]
  45. Vigneswaran, H.T.; Schwarzman, L.S.; Francavilla, S.; Abern, M.R.; Crivellaro, S. A Comparison of Perioperative Outcomes Between Single-port and Multiport Robot-assisted Laparoscopic Prostatectomy. Eur. Urol. 2020, 77, 671–674. [Google Scholar] [CrossRef] [PubMed]
  46. Ge, S.; Zeng, Z.; Li, Y.; Gan, L.; Meng, C.; Li, K.; Wang, Z.; Zheng, L. Comparing the safety and efficacy of single-port versus multi-port robotic-assisted techniques in urological surgeries: A systematic review and meta-analysis. World J. Urol. 2024, 42, 18. [Google Scholar] [CrossRef] [PubMed]
  47. Xu, M.C.; Hemal, A.K. Single-Port vs Multiport Robotic Surgery in Urologic Oncology: A Narrative Review. J. Endourol. 2025, 39, 271–284. [Google Scholar] [CrossRef] [PubMed]
  48. Harrison, R.; Ahmed, M.; Billah, M.; Sheckley, F.; Lulla, T.; Caviasco, C.; Sanders, A.; Lovallo, G.; Stifelman, M. Single-port versus multiport partial nephrectomy: A propensity-score-matched comparison of perioperative and short-term outcomes. J. Robot. Surg. 2023, 17, 223–231. [Google Scholar] [CrossRef] [PubMed]
  49. Glaser, Z.A.; Burns, Z.R.; Fang, A.M.; Saidian, A.; Magi-Galluzzi, C.; Nix, J.W.; Rais-Bahrami, S. Single- versus multi-port robotic partial nephrectomy: A comparative analysis of perioperative outcomes and analgesic requirements. J. Robot. Surg. 2022, 16, 695–703. [Google Scholar] [CrossRef] [PubMed]
  50. Nguyen, T.T.; Ngo, X.T.; Duong, N.X.; Dobbs, R.W.; Vuong, H.G.; Nguyen, D.D.; Basilius, J.; Onder, N.K.; Mendiola, D.F.A.; Hoang, T.-D.; et al. Single-Port vs Multiport Robot-Assisted Partial Nephrectomy: A Meta-Analysis. J. Endourol. 2024, 38, 253–261. [Google Scholar] [CrossRef] [PubMed]
  51. Gross, J.T.; Vetter, J.M.; Sands, K.G.; Palka, J.K.; Bhayani, S.B.; Figenshau, R.S.; Kim, E.H. Initial Experience with Single-Port Robot-Assisted Radical Cystectomy: Comparison of Perioperative Outcomes Between Single-Port and Conventional Multiport Approaches. J. Endourol. 2021, 35, 1177–1183. [Google Scholar] [CrossRef] [PubMed]
  52. Ditonno, F.; Franco, A.; Manfredi, C.; Veccia, A.; De Nunzio, C.; De Sio, M.; Vourganti, S.; Chow, A.K.; Cherullo, E.E.; Antonelli, A.; et al. Single-port robot-assisted simple prostatectomy: Techniques and outcomes. World J. Urol. 2024, 42, 98. [Google Scholar] [CrossRef] [PubMed]
  53. Heo, J.E.; Kang, S.K.; Lee, J.; Koh, D.; Kim, M.S.; Lee, Y.S.; Ham, W.S.; Jang, W.S. Outcomes of single-port robotic ureteral reconstruction using the da Vinci SP(®) system. Investig. Clin. Urol. 2023, 64, 373–379. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
  54. Nguyen, T.T.; Basilius, J.; Ali, S.N.; Dobbs, R.W.; Lee, D.I. Single-Port Robotic Applications in Urology. J. Endourol. 2023, 37, 688–699. [Google Scholar] [CrossRef] [PubMed]
  55. Marescaux, J.; Leroy, J.; Gagner, M.; Rubino, F.; Mutter, D.; Vix, M.; Butner, S.E.; Smith, M.K. Transatlantic robot-assisted telesurgery. Nature 2001, 413, 379–380. [Google Scholar] [CrossRef] [PubMed]
  56. Zheng, J.; Wang, Y.; Zhang, J.; Guo, W.; Yang, X.; Luo, L.; Jiao, W.; Hu, X.; Yu, Z.; Wang, C.; et al. 5G ultra-remote robot-assisted laparoscopic surgery in China. Surg. Endosc. 2020, 34, 5172–5180. [Google Scholar] [CrossRef] [PubMed]
  57. Yang, X.; Wang, Y.; Jiao, W.; Li, J.; Wang, B.; He, L.; Chen, Y.; Gao, X.; Li, Z.; Zhang, Y.; et al. Application of 5G technology to conduct tele-surgical robot-assisted laparoscopic radical cystectomy. Int. J. Med. Robot. 2022, 18, e2412. [Google Scholar] [CrossRef] [PubMed]
  58. Li, J.; Yang, X.; Chu, G.; Feng, W.; Ding, X.; Yin, X.; Zhang, L.; Lv, W.; Ma, L.; Sun, L.; et al. Application of Improved Robot-assisted Laparoscopic Telesurgery with 5G Technology in Urology. Eur. Urol. 2023, 83, 41–44. [Google Scholar] [CrossRef]
  59. Li, J.; Jiao, W.; Yuan, H.; Feng, W.; Ding, X.; Yin, X.; Zhang, L.; Lv, W.; Ma, L.; Sun, L.; et al. Telerobot-assisted laparoscopic adrenalectomy: Feasibility study. Br. J. Surg. 2022, 110, 6–9. [Google Scholar] [CrossRef] [PubMed]
  60. Zhou, X.; Wang, J.Y.; Zhu, X.; Sun, H.J.; Aikebaer, A.; Tian, J.Y.; Shao, Y.; Maimaitijiang, D.; Muhetaer, W.; Li, J.; et al. Ultra-remote robot-assisted laparoscopic surgery for varicocele through 5G network: Report of two cases and review of the literature. Zhonghua Nan Ke Xue 2022, 28, 696–701. [Google Scholar] [PubMed]
  61. Ebihara, Y.; Hirano, S.; Kurashima, Y.; Takano, H.; Okamura, K.; Murakami, S.; Shichinohe, T.; Morohashi, H.; Oki, E.; Hakamada, K.; et al. Tele-robotic distal gastrectomy with lymph node dissection on a cadaver. Asian J. Endosc. Surg. 2024, 17, e13246. [Google Scholar] [CrossRef] [PubMed]
  62. Takahashi, Y.; Hakamada, K.; Morohashi, H.; Wakasa, Y.; Fujita, H.; Ebihara, Y.; Oki, E.; Hirano, S.; Mori, M. Effects of communication delay in the dual cockpit remote robotic surgery system. Surg. Today 2024, 54, 496–501. [Google Scholar] [CrossRef] [PubMed] [PubMed Central]
Figure 1. Workflow of AI integration into robot-assisted surgery.
Figure 2. A three-dimensional (3D) virtual model for surgical planning used to guide robotic partial nephrectomy.
Figure 3. Augmented Reality in RAPN. Real-time projection of tumors and vessels onto the kidney during surgery.
Table 1. Summary of AR and Margin Assessment Technologies in Robotic Urologic Oncology.
| Technology | Application | Clinical Validation | Reference |
|---|---|---|---|
| Hyperaccuracy 3D AR for RAPN | Navigation and tumor enucleation | Improved functional outcomes, reduced ischemia | Porpiglia et al. (2020) [13] |
| Elastic AR for RARP | Capsular involvement identification | 100% capsular accuracy vs. 47% in controls | Porpiglia et al. (2019) [19] |
| iKidney AI-AR System | Automated AR alignment | First clinical use reported; 97.8% overlay precision | Sica et al. (2023) [16] |
| Confocal Laser Endomicroscopy (CLE) | In vivo nerve-sparing visualization | Feasibility study; high-quality imaging of landmarks | Lopez et al. (2016) [26] |
| Fluorescence Confocal Microscopy (FCM) | Ex vivo prostate margin evaluation | >90% accuracy; rapid margin assessment | Puliatti et al. (2019); Rocco et al. (2021) [23,24] |
| Raman Spectroscopy (RS) | In vivo tissue differentiation | 91% accuracy in vivo; pilot use in RARP | Pinto et al. (2019) [29] |
| Indocyanine Green (ICG) with NIRF Imaging | Perfusion and tumor contrast in RAPN | Widely used; validated in numerous RAPN studies | Gadus et al. (2020); Borofsky et al. (2013) [32,33] |
Table 2. Comparative Analysis of Biosensor Technologies.
| Technology | Sensitivity | Specificity | Cost-Effectiveness | Ease of Use | Reference |
|---|---|---|---|---|---|
| Confocal Laser Endomicroscopy (CLE) | High | High | Moderate | Moderate | Lopez et al. (2016) [26] |
| Fluorescence Confocal Microscopy (FCM) | >90% | >90% | Moderate to High | High (ex vivo) | Puliatti et al. (2019); Rocco et al. (2021) [23,24] |
| Raman Spectroscopy (RS) | ≈91% | ≈96% | High | Moderate (requires training) | Pinto et al. (2019) [29] |
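For readers less familiar with the diagnostic metrics reported in Table 2, sensitivity and specificity are derived from the confusion matrix of a tissue classifier against the histopathological ground truth. A minimal sketch follows; the counts are purely illustrative and are not drawn from the cited studies.

```python
def sensitivity(tp: int, fn: int) -> float:
    """True-positive rate: fraction of malignant samples correctly flagged."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """True-negative rate: fraction of benign samples correctly cleared."""
    return tn / (tn + fp)

# Hypothetical counts for 200 assessed tissue samples (100 malignant, 100 benign):
# 91 malignant detected, 9 missed; 96 benign cleared, 4 falsely flagged.
print(sensitivity(91, 9))   # 0.91
print(specificity(96, 4))   # 0.96
```

A technology with high sensitivity but lower specificity errs toward flagging healthy tissue, which in margin assessment trades wider excision for fewer missed positive margins; the reverse trade-off holds for high-specificity, lower-sensitivity tools.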
Table 3. Comparative Features of Single-Port (SP) and Multi-Port (MP) Robotic Platforms.
| Feature | Single-Port (SP) | Multi-Port (MP) |
|---|---|---|
| Number of Ports | 1 multichannel port | 3–5 separate ports |
| Incision Size | ≈25 mm | 8–12 mm each |
| Instrument Triangulation | Limited | Excellent |
| Learning Curve | Steeper | Shorter |
| Access to Confined Spaces | Superior | Challenging |
| Pain and Recovery | Improved | Moderate |
| Lymph Node Yield | Often lower | Higher |
| Instrument Strength/Traction | Reduced | Stronger |
| Same-Day Discharge Rate | Higher | Lower |
| Availability | Limited globally | Widely available |
Table 4. Technology Maturity and Strengths.
| Technology | Maturity Level | Key Strengths | Key Study/Reference |
|---|---|---|---|
| AI-based Performance Prediction | Validated in multi-institutional studies | Outcome prediction, tailored recovery | Hung et al. (2018, 2019) [7,8] |
| Hyperaccuracy 3D AR | Applied in complex RAPN and RARP | Anatomical fidelity, margin accuracy | Porpiglia et al. (2020) [13] |
| Confocal Microscopy (FCM) | Validated ex vivo; clinical feasibility shown | Digital workflow, rapid turnaround | Puliatti et al. (2019) [23]; Rocco et al. (2021) [24] |
| Raman Spectroscopy (RS) | Pilot intraoperative use; promising results | High diagnostic accuracy, integration with da Vinci | Pinto et al. (2019) [29] |
| Indocyanine Green (ICG) Imaging | Routine in RAPN; well established | Perfusion mapping, tumor contrast | Gadus et al. (2020) [32]; Borofsky et al. (2013) [33] |
| iKidney AR System | First-in-human case; early stage | Automation, eliminates manual overlay | Sica et al. (2023) [16] |
