Review

Contemporary Preoperative Detection of Extraprostatic Extension in Prostate Cancer

1 Doctoral School, Poznań University of Medical Sciences, 60-812 Poznań, Poland
2 Department of Urology, Ministry of Internal Affairs Hospital, 61-701 Poznań, Poland
3 Department of Urology and Oncological Urology, Poznań University of Medical Sciences, 61-701 Poznań, Poland
4 Students’ Society of Urology, Poznań University of Medical Sciences, 60-812 Poznań, Poland
* Author to whom correspondence should be addressed.
Cancers 2026, 18(3), 456; https://doi.org/10.3390/cancers18030456
Submission received: 30 December 2025 / Revised: 19 January 2026 / Accepted: 27 January 2026 / Published: 30 January 2026
(This article belongs to the Special Issue Advances in the Use of PET/CT and MRI in Prostate Cancer)

Simple Summary

Extraprostatic extension occurs when prostate cancer grows beyond the prostate capsule and is an important factor influencing surgical strategy and complication rates. Standard tools, such as clinical parameters, risk calculators, and multiparametric MRI, help estimate this risk; however, their accuracy is limited and varies between observers. New artificial intelligence techniques are increasingly being explored to improve preoperative detection. Radiomics and deep-learning models can analyze subtle imaging patterns that are often invisible to the human eye and may support more personalized clinical decisions. This review provides a contemporary overview of current and emerging methods for detecting extraprostatic extension and discusses future directions of prostate cancer management.

Abstract

Extraprostatic extension (EPE) is an important prognostic factor in prostate cancer and influences nerve-sparing decisions during radical prostatectomy. Multiparametric MRI (mpMRI) is the standard for local staging, but its sensitivity for EPE remains limited, and its interpretation is subject to inter-reader variability. In this narrative review, we provide an overview of contemporary strategies for the preoperative detection of EPE. We searched PubMed, Embase, Web of Science, and Google Scholar, focusing on studies published between 2015 and 2025, including articles evaluating clinical parameters, mpMRI features, nomograms, radiomics, machine learning, and deep learning models for EPE prediction. The analyzed literature was compared with respect to diagnostic performance, validation strategy, and clinical applicability of individual methods. Clinical parameters and traditional nomograms provide moderate accuracy for EPE detection. mpMRI improves staging, with tumor–capsule contact length as the most important single imaging marker. Radiomics-based and machine-learning models matched and occasionally outperformed conventional approaches, achieving AUC values ranging from 0.75 to 0.85. Deep-learning models demonstrated similar performance by directly analyzing imaging data, although most lacked external validation and were sensitive to dataset heterogeneity. Several radiomics and deep learning models demonstrated performance comparable to, and in selected studies exceeding, expert radiologist assessment. Binary EPE classification has limited clinical value, while side-specific and graded EPE assessment offers a more clinically relevant approach. Translation of these tools into routine practice will require multimodal, side-specific, and externally validated models supported by automated segmentation and explainable artificial intelligence frameworks.

1. Introduction

Extraprostatic extension (EPE) of prostate cancer is a critical factor influencing various outcomes following radical prostatectomy. EPE has been associated with increased rates of positive surgical margins [1,2], biochemical recurrence [1,3], poorer progression-free survival [2,3], and a more aggressive form of cancer [1]. In addition, a wider extent of EPE is associated with a worse prognosis for the patient [3].
Given its strong prognostic implications, the accurate preoperative detection of EPE is essential not only for oncologic risk stratification but also for optimizing the balance between cancer control and functional preservation during surgery. In cases where EPE is confidently excluded, nerve-sparing prostatectomy may be safely performed, preserving erectile and urinary function [4,5,6]. Excluding EPE on one side of the prostate is already highly beneficial, as it enables nerve-sparing surgery on the unaffected side, which significantly increases the likelihood of postoperative erectile function recovery compared with non-nerve-sparing surgery [7]. Conversely, when EPE is suspected or confirmed bilaterally, wide excision or bilateral neurovascular bundle resection is required to achieve negative surgical margins [8], which reduces the chances of erectile function recovery to a minimum [7]. Hence, overtreatment due to false-positive findings of EPE can lead to unnecessary loss of function, while undertreatment increases the risk of recurrence. Therefore, the preoperative identification of EPE has direct and important implications for surgical strategy.
Traditionally, clinicians have relied on a combination of clinical factors, such as PSA, biopsy Gleason score, and the number of positive cores, together with nomograms such as the MSKCC nomogram [9] and Partin tables [10], to estimate the risk of EPE. More recently, multiparametric MRI (mpMRI) has become a cornerstone in the preoperative assessment of prostate cancer extension beyond the capsule, having been established as the gold standard for detecting prostate cancer [11,12], with dedicated scoring systems and radiological features that also indicate possible EPE [13]. However, these tools have limitations, such as the poor sensitivity of MRI for the detection of EPE [14]. The emergence of machine learning (ML) and deep learning (DL) approaches offers the potential to enhance predictive accuracy by learning complex patterns from large, multimodal datasets [15,16,17]. These tools may bridge the gap left by conventional methods and provide more individualized, reproducible, and scalable risk stratification for EPE.
This narrative review aims to provide an overview of the current methods used to detect EPE prior to radical prostatectomy. We explore traditional clinical predictors and biopsy-based nomograms, assess the diagnostic performance and limitations of mpMRI and MRI-based grading systems, and examine the contribution of quantitative imaging features and radiomics. Particular focus is placed on emerging machine learning and deep learning models, which hold promise for improved detection. Additionally, we discuss future research directions. Through this review, we aim to highlight both the current capabilities and unmet needs in the preoperative evaluation of capsular extension in prostate cancer.

2. Materials and Methods

Relevant literature for this narrative review was searched using major biomedical databases, including PubMed, Web of Science, Google Scholar, and Embase. Articles published between 2015 and 2025 were considered to reflect both foundational and state-of-the-art developments in EPE prediction. The search strategy involved combinations of the following keywords and MeSH terms: “prostate cancer”, “extraprostatic extension”, “EPE”, “extracapsular extension”, “ECE”, “capsular invasion”, “MRI”, “prostate mpMRI”, “PI-RADS”, “radiomics”, “machine learning”, “deep learning”, “black box”, “explainable AI”, “CNN”, “convolutional neural network”, “preoperative staging”, “radical prostatectomy”, “artificial intelligence”, “MRI prediction models”, “nomogram”, “nomogram validation”, “side-specific”.
Only original studies, reviews, and technical validation papers written in English were included. The included studies spanned a range of methodologies, including clinical risk calculators, MRI-based grading systems, quantitative MRI features, radiomics, and deep learning models trained on multi-institutional or single-center datasets. Priority was given to papers addressing the diagnostic performance of mpMRI, radiomic signatures, and artificial intelligence (AI)-based models in detecting EPE. Additional articles were identified through manual reference tracking and expert recommendations.
Artificial intelligence approaches were categorized into two groups. The first group comprised radiomics-based machine learning models. These workflows were defined as pipelines in which handcrafted quantitative features such as shape descriptors, first-order intensity statistics, and texture features were extracted from segmented regions of interest including the tumor, the tumor–capsule interface, or the whole gland, and were subsequently used as inputs to classical machine learning classifiers such as support vector machines, random forests, and gradient-boosted decision trees including XGBoost. The second group comprised deep learning models. These were defined as end-to-end or semi end-to-end convolutional neural network approaches that learn imaging representations directly from magnetic resonance imaging volumes or image patches, optionally preceded by automated segmentation.

3. Review

3.1. Parameters Used for EPE Detection

3.1.1. Clinical Parameters: PSA Level, PSA Density, ISUP Grade Group, Positive Biopsy Cores

Prostate-specific antigen (PSA) remains one of the most widely used serum biomarkers in the diagnostic and risk stratification pathway of prostate cancer. Elevated PSA levels, particularly when combined with PSA density (PSAD), may indicate a higher risk of adverse pathological features, including EPE [18,19]. Research shows that PSAD is an independent predictor of adverse pathologic stage and PSA-free survival [20].
Biopsy-derived factors, such as the Gleason grade group and the number or percentage of positive biopsy cores, are also strong predictors of cancer aggressiveness and extent. This was confirmed by a study in which patients with EPE, compared to those without EPE, had significantly higher PSA levels and higher International Society of Urological Pathology (ISUP) grade group in biopsy, but there were no differences in the percentage of positive biopsy cores [21]. However, other studies report that the percentage of positive biopsy cores is higher in patients with EPE, while PSA levels and ISUP grade group do not differ significantly between those with and without EPE [22]. Furthermore, the available literature suggests that including PSAD, tumor percentage, and tumor length in the Partin tables would improve EPE prediction [23].
Considering the TNM (Tumor, Node, Metastasis) classification in prostate cancer staging, the occurrence of EPE, classified as at least T3a, corresponds to the high-risk group regardless of the PSA level or ISUP grading on the three-tier prognostic scale (low, intermediate, high) included in the European Association of Urology (EAU) guidelines [24]. Some authors recommend using the five-tier Cambridge prognostic scale to improve prediction of cancer-specific mortality in primary non-metastatic prostate cancer [25,26]. On this scale, EPE places patients in risk group 4, together with Grade Group 4 and PSA > 20 ng/mL [25,26].
Moreover, in patients with PSA < 10 ng/mL, in addition to PSAD and Gleason score, lymphovascular invasion is a statistically significant prognostic factor for EPE [27]. Another preoperative predictor of EPE may be perineural invasion at needle biopsy, as evidenced by available publications [28,29].
However, some authors suggest that the detection of EPE during prostate biopsy is rare and should not exclude radical local treatment [30,31,32]. Taken together, PSA-based markers, particularly PSAD, along with biopsy-derived features such as ISUP grade group, tumor burden, and lymphovascular or perineural invasion, provide important but sometimes inconsistent information on the risk of EPE. Although EPE automatically upstages disease to at least T3a and places patients in a high-risk category, the variability of individual predictors highlights the need for their combined use within refined prognostic systems rather than reliance on any single preoperative parameter. The summary of clinical parameters used in EPE prediction is depicted in Table 1.
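Since PSAD recurs throughout this section as a key predictor, its computation is worth making explicit. Below is a minimal Python sketch; the 0.15 ng/mL/cc cut-off is one commonly cited threshold (it also appears later in this review), and exact cut-offs vary between studies.

```python
def psa_density(psa_ng_ml: float, prostate_volume_cc: float) -> float:
    """PSA density (PSAD) = serum PSA divided by prostate volume, in ng/mL/cc."""
    if prostate_volume_cc <= 0:
        raise ValueError("prostate volume must be positive")
    return psa_ng_ml / prostate_volume_cc

# Example: PSA 7.2 ng/mL and a 40 cc gland give a PSAD of 0.18 ng/mL/cc
psad = psa_density(7.2, 40.0)
# Flag against a commonly cited cut-off; thresholds vary between studies
elevated = psad >= 0.15
```

The same PSAD value feeds into several of the nomograms and risk models discussed below, which is why a consistent volume measurement (usually from MRI or ultrasound) matters in practice.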

3.1.2. The Role of Magnetic Resonance Imaging

Before the era of multiparametric magnetic resonance imaging, men with elevated PSA levels were referred for transrectal ultrasound-guided prostate biopsy. The guidelines changed after the PROMIS study, in which clinically significant prostate cancer (csPC) detection increased by 18% as a result of the performance and interpretation of mpMRI before the first set of biopsies [11]. Therefore, since 2019, this approach has been included in the official EAU guidelines and regarded as a gold standard diagnostic tool. Using mpMRI, it is now possible to detect the presence of EPE on a large scale. The inevitable next step was the creation of standardized scales to unify radiological descriptions. One such scale is PI-RADS v2.1, which assesses the likelihood of csPC based on mpMRI. The features visible on mpMRI that indicate EPE with the highest sensitivity and specificity include, among others, breach of the capsule with direct tumor extension and a tumor–capsule interface greater than 10 mm [33]. Depending on the source, the sensitivity and specificity of EPE detection based on mpMRI are 40.4–47.4% and 85.7–96.6%, respectively [34,35]. In addition, the ESUR scoring system shows moderate diagnostic effectiveness in detecting EPE [36]. Schieda et al. [37] reported that the PI-RADS scale can reduce differences in EPE detection accuracy caused by the lesser experience of the reporting radiologist, simultaneously increasing sensitivity without reducing specificity. However, there are reports of high specificity but poor and heterogeneous sensitivity for local PC staging in MRI [38]. Parameters improving sensitivity at the cost of specificity for the detection of EPE are tumor size, capsular contact, and ADC entropy [39]. Another useful tool is the 5-point Likert scale, which effectively determines the grade of EPE and seminal vesicle invasion (SVI) on mpMRI [40,41].
There is evidence that the maximum tumor diameter of Likert 3–5 lesions on MRI is an independent prognostic factor for EPE [42]. Additionally, Mehralivand EPE grade has the same diagnostic performance as the Likert scale, predicting both EPE and biochemical recurrence-free survival with a comparable degree of observer dependence [21,43].
An important aspect in EPE detection is the phenomenon of inter-reader variability—a situation in which two different radiologists evaluate the same MRI scan differently. The literature data indicate that the consistency in detecting index lesions based on the PI-RADS v2.1 scale by radiologists with different levels of experience is substantial, while the consistency in excluding index lesions is excellent regardless of the level of experience—85.1% for dedicated radiologists and 82.0% for non-dedicated radiologists [44]. In addition, clinically significant parameters for radiologist consistency, while minimizing inter-reader variability, are PSAD ≥ 0.15 ng/mL/cc, pre-MRI high risk for PC, a positivity threshold of PI-RADS score 4 + 5, PZ lesions, and homogeneous signal intensity of the PZ [44]. Also, reevaluation of the same image by another specialist improves EPE detection [45]. Therefore, it is reasonable to use EPE grading systems and artificial intelligence algorithms that estimate EPE risk without inter-reader variability [46]. Existing EPE markers are effective regardless of tumor location, even in anterior prostate cancers (APCs), and the arc-dimension ratio may be a new marker for APCs [47]. Overall, mpMRI has become a cornerstone of preoperative EPE assessment, providing high specificity through standardized scales, but its sensitivity remains heterogeneous and dependent on reader experience. These limitations underline the need to focus on concrete, reproducible morphological mpMRI parameters and systems immune to inter-reader variability.

3.1.3. mpMRI Parameters Used in EPE Prediction

Several morphological features on mpMRI have been identified as potential predictors of EPE and mEPE—Table 2. Among these, the length of tumor–capsule contact is one of the most widely studied and reproducible metrics. Thresholds ranging from >10 to >20 mm have been associated with increased risk of EPE, though exact cutoffs vary between studies [33,48,49]. Some authors claim that the ISUP grade group influences the relationship between tumor–capsule contact length and EPE, and that 15–16 mm can be considered the threshold value [50]. Other predictive signs include capsular bulging, irregularity, focal capsular disruption (“breach”), and extraprostatic protrusion into periprostatic fat. Some studies also consider asymmetry or obliteration of the neurovascular bundle (NVB) and the rectoprostatic angle [51]. These features, while helpful in identifying gross capsular extension, often fail to capture subtle or microscopic invasion, which does not result in observable changes in capsular contour. Bulging is a highly sensitive feature (81%), while macroscopic extension has a specificity of 100% [52]. In addition, the prediction of EPE can be improved by the capsular enhancement sign. When comparing MRI sequences, the literature refers to DCE-MRI as having superior performance in predicting EPE compared to other sequences [53]. Another useful parameter for predicting EPE is tumor contact area 1 (TCA1), which describes tumor dimensions across two planes, and tumor contact area 2 (TCA2), which describes the tumor’s contact area within the MRI volume. Although TCA1 and TCA2 do not show any advantage over tumor contact length (TCL), they are still useful parameters, especially in cT2N0M0 PC [54]. The EPE number in combination with radial distance is also an effective measurement in predicting biochemical recurrence and substaging of pT3a prostate cancer [55]. The above evidence indicates that there is no ideal parameter for the definitive detection of EPE.
In practice, combining multiple parameters with clinical assessment by specialist radiologists appears to be the best way to maximize the chances of a correct diagnosis.

3.1.4. Nomograms

Nomograms are tools used to assess the risk of EPE based on individual clinical parameters of the patient, mainly before planned radical prostatectomy. Traditional nomograms such as the Partin tables, the Memorial Sloan Kettering Cancer Center (MSKCC) nomogram, and the Cancer of the Prostate Risk Assessment (CAPRA) score are based on clinical parameters (PSA value, biopsy Gleason score, cT stage), do not take mpMRI findings into account, and predict EPE risk globally [9,56,57]. The tools that use MRI images are the Martini, Nyarangi–Dix, Soeterik, and Wibmer nomograms, which grants them the advantage of being side-specific—Table 2 [58,59,60,61]. They are especially important when deciding whether to perform a nerve-sparing radical prostatectomy on a certain side. Both traditional and MRI-inclusive nomograms have a moderate (0.72–0.80) AUC value in predicting EPE [15]. Additionally, the MSKCC nomogram has higher specificity than the Partin tables for predicting EPE. Furthermore, the Nyarangi–Dix nomogram is superior to other nomograms in terms of net benefit for risk thresholds between 20 and 30% [62]. However, its practical limitation is that the ESUR score must be assessed by a radiologist based on mpMRI, which is not always available. Therefore, in everyday practice, the Soeterik nomogram may prove more useful due to its more accessible combination of parameters (PSAD, clinical stage on MRI, ISUP biopsy grade), with acceptable AUCs ranging from 0.80 to 0.83 in the testing cohort and from 0.77 to 0.78 in the validation cohort [63]. Among all MRI features, the ESUR score and TCCL had the highest AUC and AIC values [62]. In addition, it was shown that the Wibmer nomogram overestimates the risk of EPE for thresholds > 25% [62]. Despite external validation and updates to the Martini nomogram, miscalibration is still present [59]. Nevertheless, it achieves sensitivity and specificity of 84.2% and 66.7%, respectively, compared with mpMRI [34].
Furthermore, the nomogram reported by Gandaglia et al. is characterized by higher discrimination (71.8% vs. 69.8%, p = 0.3 and 71.8% vs. 61.3%, p < 0.001) and similar miscalibration and net benefit for probability thresholds above 30% regarding EPE prediction when compared to the MSKCC nomogram and Partin tables [64,65]. Another useful nomogram is the one developed by Sayyid et al., which uses preoperative parameters to assess the risk of side-specific EPE without the use of mpMRI and has a predictive accuracy of 0.74 [66]. An important report is a publication comparing sixteen predictive models, which shows that the models developed by Pak, Patel, Martini, and Soeterik achieved the highest accuracy (AUC ranging from 0.73 to 0.77), adequate calibration for a probability threshold < 40%, and the highest net benefit for a probability threshold > 8% on decision curve analysis [67]. Overall, both traditional and MRI-based nomograms demonstrated moderate accuracy in predicting EPE, but models incorporating mpMRI provide clinically valuable side-specific information for surgical planning. However, limitations related to calibration, overestimation of risk, and restricted availability of MRI-derived parameters still hinder their widespread implementation in routine practice. The most important nomograms are depicted in Table 3 and Table 4.
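Most of the nomograms discussed above share a common logistic structure: weighted clinical and imaging predictors are summed and passed through a sigmoid to yield a probability. The sketch below illustrates only that structure; the coefficients and predictor set are hypothetical, chosen for illustration, and are not taken from any published nomogram.

```python
import math

# Hypothetical coefficients for illustration only -- NOT from any published
# nomogram; real models (Partin, MSKCC, Soeterik, etc.) report fitted values.
COEF = {"intercept": -4.0, "psad": 6.0, "isup": 0.5, "mri_t3": 1.5}

def epe_risk(psad: float, isup_grade: int, mri_suspected_t3: bool) -> float:
    """Illustrative logistic model returning a per-side EPE probability."""
    z = (COEF["intercept"]
         + COEF["psad"] * psad
         + COEF["isup"] * isup_grade
         + COEF["mri_t3"] * (1.0 if mri_suspected_t3 else 0.0))
    return 1.0 / (1.0 + math.exp(-z))  # sigmoid maps the linear score to (0, 1)

# Higher PSAD, grade, and MRI suspicion should monotonically raise the risk
p_hi = epe_risk(psad=0.2, isup_grade=3, mri_suspected_t3=True)
p_lo = epe_risk(psad=0.1, isup_grade=1, mri_suspected_t3=False)
```

Applying such a model side-specifically (once per prostate lobe, with per-side inputs) is what distinguishes the Martini-type nomograms from the global Partin/MSKCC approach.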

3.1.5. Alternative Imaging Modalities

Micro-Ultrasound
High-resolution micro-ultrasound has emerged as a promising modality for real-time preoperative assessment of extraprostatic extension. Early studies demonstrated high sensitivity and negative predictive value for detecting EPE, including focal microscopic extension, with risk increasing proportionally to the number of micro-ultrasound predictors such as capsular bulging, hypoechoic halo, and obliteration of the vesicle–prostatic angle [79]. A prospective study of 140 patients confirmed that these micro-ultrasound features were strongly associated with non-organ-confined disease, achieving an AUC of 0.88 when combined with clinical parameters [80]. More recently, a large prospective cohort of 391 patients and 612 prostate lobes was used to develop a side-specific micro-ultrasound-based nomogram integrating PSAD, ISUP grade group, maximal core involvement, and MUS-detected EPE, which achieved an internally validated AUC of 0.81, comparable to an mpMRI-based model, while identifying 36% of EPE cases missed by MRI, including lesions invisible on PI-RADS assessment [81]. Finally, a micro-ultrasound-based nomogram developed in a prospective cohort of 295 patients achieved an AUC of 0.77 for micro-ultrasound alone and 0.86 for the multivariable model after internal bootstrap validation [82]. However, it is important to note that none of the mentioned studies included external validation, which raises concerns regarding the generalizability of the reported high AUCs to routine clinical practice. Data suggest that micro-ultrasound may complement mpMRI by providing real-time, side-specific risk stratification for nerve-sparing surgical planning, but more studies are needed.
Positron Emission Tomography
The use of Prostate-Specific Membrane Antigen Positron Emission Tomography (PSMA PET), particularly in hybrid PSMA PET/MRI imaging, is an advancement in the preoperative staging of prostate cancer, offering a higher sensitivity for detecting EPE compared to mpMRI alone [14,83,84,85]. Meta-analyses indicate that PSMA PET achieved a pooled sensitivity of approximately 0.72 and a high specificity of 0.87, with an overall AUC of 0.87 [83,86]. The intensity of tracer uptake, represented by SUVmax, showed a strong correlation with tumor aggressiveness and Gleason scores, serving as an independent predictor of EPE [83,87]. Despite its high specificity, the method’s sensitivity remains limited by spatial resolution, which may hinder the detection of microscopic EPE [83,86].
Increasing attention has been directed toward multimodal nomograms that integrate metabolic PET parameters, such as SUVmax and PSMA-derived tumor volume (PSMA-TV), with clinical and MRI data [83,87]. Studies have shown that models combining overt EPE on PET, SUVmax ≥ 13.84, and PSMA-TV can achieve an exceptionally high AUC of 0.890 [83]. Integration approaches, which “upgrade” the suspicion of EPE if SUVmax > 12, have demonstrated high sensitivity of 80.4% [85]. Furthermore, recent nomograms utilizing 18F-DCFPyL PSMA-PET/CT with a threshold of SUVmax ≥ 13 have significantly outperformed traditional MRI-only models (AUC 0.754 vs. 0.735) [87]. However, while established clinical and MRI-based nomograms have undergone external validation, the latest PET-integrated models are based on single-center cohorts and require further external validation; whether these exceptionally good results would be maintained remains an open question.
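The “upgrade” integration strategy described above amounts to a simple decision rule combining MRI and PET. A hedged sketch follows; the function name and interface are illustrative, and the SUVmax > 12 cut-off is the one reported in the cited study.

```python
def integrated_epe_suspicion(mri_epe_suspected: bool, suvmax: float,
                             suv_threshold: float = 12.0) -> bool:
    """'Upgrade' rule sketch: MRI-negative cases are upgraded to EPE-suspicious
    when PSMA-PET SUVmax exceeds the threshold (illustrative interface;
    the >12 default follows the cited integration study)."""
    return mri_epe_suspected or suvmax > suv_threshold

# MRI-negative but PET-hot lesions are flagged, which is how the rule
# recovers sensitivity at some cost in specificity
flagged = integrated_epe_suspicion(mri_epe_suspected=False, suvmax=13.5)
```

The rule trades specificity for sensitivity by construction: it can only add positives to the MRI read, never remove them.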

3.1.6. Machine Learning Algorithms

A natural extension of nomograms are algorithms that use machine learning to recognize patterns of EPE in patient clinical data. The literature confirms their usefulness in predicting EPE; one such model obtained, in the test cohort, an area under the receiver operating characteristic curve (AUROC) of 0.81 and an area under the precision–recall curve (AUPRC) of 0.78 [78]. Further evidence is provided by the work of Kwong et al., in which the developed AI model, after external validation, achieved a pooled AUROC of 0.77 and a pooled AUPRC of 0.61 [76]. The Side-specific Extraprostatic Extension Risk Assessment tool (SEPERA) predicted side-specific EPE in 68% of cases. The model’s main weakness was false negatives, although none of them involved aggressive tumors (grade group > 2 or high-risk disease). A recent report is a study combining machine-learning nomograms with PSMA-PET, MRI data, and genomics in side-specific EPE prediction [88]. Interestingly, combining Decipher Genomic Classifier (DGC) scores with PET achieved a better AUC than PET + MRI + DGC (0.85 vs. 0.83). PET-only predictions were superior to MRI-only predictions; however, multimodal combinations achieved a significant improvement in prediction accuracy. Overall, machine-learning-based nomograms demonstrate similar, and sometimes better and more flexible, performance in EPE prediction compared with traditional models, particularly when combined with multimodal data such as PSMA-PET and genomic classifiers.

3.2. Artificial Intelligence in Image Analysis for EPE Prediction

Artificial intelligence is increasingly used in medical imaging as a tool to support clinical decisions. In prostate cancer, AI-based image analysis can be divided into two categories. The first uses radiomics combined with classical machine learning, where predefined image features are analyzed. The second focuses on deep learning, which learns patterns directly from images. Simplified differences between those two methods are depicted in Figure 1. The following sections present both approaches.

3.2.1. Machine Learning and Radiomics in Image Analysis for EPE Prediction

Radiomics is a method for analyzing medical images that goes beyond what radiologists can see with their eyes. Instead of just looking at an MRI or CT scan and describing what is visible, radiomics turns these images into large sets of numerical features [89,90]. These features capture detailed information about the shape, texture, intensity, and spatial patterns within the lesion and the nearby tissue [91]. The process starts by segmenting the area of interest, such as the suspected lesion or the prostate capsule, and then extracting features using software like PyRadiomics [92].
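The handcrafted intensity features described above can be illustrated with a minimal numpy sketch. This is only a toy illustration on synthetic data; in practice, toolkits such as PyRadiomics compute a much larger, standardized feature set (shape, first-order, and texture classes) from the segmented ROI.

```python
import numpy as np

def first_order_features(image: np.ndarray, mask: np.ndarray,
                         bins: int = 16) -> dict:
    """Compute a few first-order intensity features inside a segmented ROI.
    Illustrative subset only; PyRadiomics provides the standardized full set."""
    roi = image[mask > 0].astype(float)          # voxels inside the ROI
    hist, _ = np.histogram(roi, bins=bins)       # intensity histogram
    p = hist / hist.sum()
    p = p[p > 0]                                 # drop empty bins for log
    return {
        "mean": float(roi.mean()),
        "std": float(roi.std()),
        "entropy": float(-(p * np.log2(p)).sum()),  # histogram entropy (bits)
    }

# Toy 2D "slice" with a square ROI mask standing in for a lesion segmentation
rng = np.random.default_rng(0)
img = rng.normal(100.0, 10.0, size=(32, 32))
msk = np.zeros((32, 32), dtype=int)
msk[8:24, 8:24] = 1
feats = first_order_features(img, msk)
```

Texture features (GLCM, GLRLM, and related matrices) follow the same pattern of ROI-restricted computation but quantify spatial co-occurrence rather than the intensity distribution alone.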
After extraction, these features can help build predictive models. They can be used on their own or combined with clinical information, such as PSA levels or Gleason scores [16,93]. In prostate cancer, radiomics is especially valuable since it can detect very subtle changes at the boundary between the tumor and the capsule [16]. These changes may indicate extraprostatic extension or features indicating microscopic extension that are often not visible by eye in standard imaging. The main advantages of radiomics are that it makes image analysis objective, consistent, and measurable, and that it can fit into clinical decision support systems [89,90,91].
An important step in creating radiomic models is feature selection [89,90]. The initial count of features is usually very large, so only the most informative ones are retained. This reduces the risk of overfitting and makes the model easier to understand. Techniques like recursive feature elimination, mutual information ranking, or regularized regression are often used for this purpose [94,95].
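As a simplified illustration of filter-style feature selection, features can be ranked by their univariate correlation with the EPE label and only the top-ranked ones retained. Absolute Pearson correlation is used here as a minimal stand-in for the mutual-information and regularized-regression approaches named above; the data are synthetic.

```python
import numpy as np

def rank_features(X: np.ndarray, y: np.ndarray, top_k: int = 5) -> list:
    """Rank columns of X by absolute correlation with the binary label y.
    Simplified filter-selection sketch, not a production feature selector."""
    Xc = X - X.mean(axis=0)                      # center each feature
    yc = y - y.mean()
    denom = np.linalg.norm(Xc, axis=0) * np.linalg.norm(yc)
    corr = np.abs(Xc.T @ yc) / np.where(denom == 0, 1.0, denom)
    return list(np.argsort(corr)[::-1][:top_k])  # indices, strongest first

# Synthetic "radiomic" matrix: 200 cases, 20 features, feature 3 informative
rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=200)
X = rng.normal(size=(200, 20))
X[:, 3] += 2.0 * y                               # inject signal into feature 3
top = rank_features(X, y, top_k=3)
```

In real pipelines this filtering step is done inside cross-validation folds; selecting features on the full dataset before splitting is one common source of the inflated performance discussed below.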
Then, supervised machine learning models are used to predict extraprostatic extension before surgery based on those extracted features. Common algorithms include support vector machines, random forests, and XGBoost [96,97]. Support vector machines handle high-dimensional data well and can differentiate between cases with and without extension [98]. Random forests combine multiple decision trees, which improves robustness and aids in ranking feature importance [97]. XGBoost often achieves high accuracy by modeling complex relationships among variables [96]. In several studies, these models reached areas under the curve between 0.75 and 0.85, especially when trained on carefully chosen radiomic and clinical features [16]. Their main drawback is that they still depend on feature engineering and may struggle to generalize to other patient populations [99]. Another limitation is the risk of overfitting when applied to small datasets, potentially leading to artificially inflated diagnostic performance [100]. For radiomics, performance additionally relies on image quality, segmentation accuracy, and uniformity of imaging protocols [101]. Despite these challenges, radiomics and ML strategies are important milestones in advanced EPE prediction.
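The overall train-and-evaluate workflow can be sketched with a deliberately simple classifier. Logistic regression via gradient descent stands in here for the SVM, random-forest, and XGBoost pipelines used in the cited studies, and the synthetic "radiomic" features are purely illustrative; the point is the held-out AUC evaluation, which mirrors how the 0.75 to 0.85 figures above are obtained.

```python
import numpy as np

def train_logistic(X, y, lr=0.1, epochs=500):
    """Minimal logistic regression by gradient descent (stand-in for the
    SVM/RF/XGBoost classifiers used in published radiomics pipelines)."""
    Xb = np.hstack([X, np.ones((len(X), 1))])    # append bias column
    w = np.zeros(Xb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))        # predicted probabilities
        w -= lr * Xb.T @ (p - y) / len(y)        # mean log-loss gradient step
    return w

def predict_proba(X, w):
    Xb = np.hstack([X, np.ones((len(X), 1))])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

def auc(y, scores):
    """ROC AUC via the rank-sum (Mann-Whitney U) formulation."""
    order = np.argsort(scores)
    ranks = np.empty(len(scores))
    ranks[order] = np.arange(1, len(scores) + 1)
    pos = y == 1
    n1, n0 = pos.sum(), (~pos).sum()
    return (ranks[pos].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)

# Synthetic, partly separable "radiomic" data; train/test split for honesty
rng = np.random.default_rng(2)
n = 300
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, 5)) + 1.0 * y[:, None]   # class-shifted features
w = train_logistic(X[:200], y[:200].astype(float))
test_auc = auc(y[200:], predict_proba(X[200:], w))
```

Evaluating on a held-out split (or, better, an external cohort) is exactly the safeguard against the small-dataset overfitting and inflated AUCs described above.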
Several studies have already demonstrated the usefulness of radiomics and ML in predicting EPE. Ma et al. developed a T2-weighted MRI-based radiomics signature trained on 210 patients, achieving an AUC of 0.88 in independent validation, clearly outperforming expert radiologists [102]. Damascelli et al. applied a multiparametric MRI radiomic pipeline and demonstrated that a combined T2-weighted and ADC signature was significantly associated with extracapsular extension, reaching an accuracy of 0.84 [103]. Cuocolo et al. created a model based on T2-weighted and diffusion parameters, achieving an AUC of 0.85 in external validation [104]. Fan et al. built mpMRI-based machine-learning models and, using a random-forest classifier, achieved an AUC of 0.85 for predicting EPE [105].
Bai et al. introduced a peritumoral radiomics strategy using a 3–12 mm ring around the tumor and demonstrated better generalizability than intratumoral features, with an external validation AUC of 0.68 [106]. Losnegård et al. analyzed radiomic features derived from the capsule–tumor interface and reported superior diagnostic performance compared with expert radiologists [107].
Xu et al. constructed an mpMRI-based radiomics nomogram integrating radiomic and clinical features, achieving a validation AUC of 0.87 and significantly outperforming the clinical model alone [108]. He et al. evaluated MRI radiomics in a large cohort of 459 patients and showed that the best integrated radiomics–clinical model reached an AUC of 0.73 [109]. Stanzione et al. combined radiomics with the PI-RADS scoring system and demonstrated an incremental benefit over radiological assessment alone [110].
A 2024 meta-analysis pooled all radiomics studies available at that time and reported a pooled AUC of 0.82 [16]. Taken together, these studies suggest that radiomics can identify subtle capsular changes and extraprostatic extension more effectively than visual assessment by radiologists.
Another meta-analysis showed that MRI-inclusive nomograms and AI or radiomics-based models achieved comparable, moderate performance for EPE prediction, suggesting that advanced imaging features do not yet consistently outperform traditional clinical models.
Many studies also merge radiomic features with clinical data such as PSA, prostate volume, DRE findings, or biopsy results to create hybrid models, which often outperform models based on imaging or clinical variables alone. To date, however, only four radiomics and machine-learning studies have adopted this approach [105,106,107,109], reporting AUC values ranging from 0.72 to 0.85.
Another important limitation is the lack of side-specific modeling. As discussed earlier, accurate identification of the side affected by EPE is important for nerve-sparing surgery [7]. Among studies incorporating machine learning and radiomics, the majority predict only the presence of EPE, without specifying the side of involvement [16]. Future research should therefore build on the strong performance of current radiomics and hybrid radiomics-clinical models by developing robust side-specific approaches that translate these methods into personalized tools for surgical decision-making.

3.2.2. Deep Learning

Deep learning is a branch of artificial intelligence that enables computer systems to learn directly from raw data, such as complex medical images. Unlike traditional machine learning approaches, which depend on manually defined and extracted features, deep learning is based on convolutional neural networks that automatically identify patterns within the input data [111,112].
These networks learn in a hierarchical manner, beginning with simple visual characteristics such as edges or signal intensity differences and gradually combining them into more complex anatomical and contextual representations [111,112]. This makes deep learning particularly well-suited for the analysis of multiparametric MRI, where subtle spatial relationships within and around a lesion may carry important diagnostic information that is difficult to capture using conventional image interpretation [113,114].
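To illustrate the first rung of that hierarchy: the earliest convolutional layers typically learn small filters that respond to edges and intensity steps. The sketch below applies one hand-written edge filter to a toy intensity patch; in a trained network, the filter weights would be learned from data rather than specified by hand:

```python
def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation, the core operation of a CNN layer."""
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(len(image) - kh + 1):
        row = []
        for j in range(len(image[0]) - kw + 1):
            row.append(sum(image[i + a][j + b] * kernel[a][b]
                           for a in range(kh) for b in range(kw)))
        out.append(row)
    return out

# Tiny patch with a sharp vertical intensity step (e.g., a boundary between tissues)
patch = [[0, 0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1, 1],
         [0, 0, 0, 1, 1, 1]]
# Sobel-like vertical-edge filter
edge = [[-1, 0, 1],
        [-2, 0, 2],
        [-1, 0, 1]]
print(conv2d(patch, edge))  # → [[0, 4, 4, 0], [0, 4, 4, 0]]: nonzero only near the step
```

Deeper layers combine many such local responses into the larger anatomical and contextual representations described above.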
In recent years, several research groups have applied convolutional neural networks to improve the detection of EPE. Examples include models developed by Hou, Moroianu, Priester, Yao, and Khosravi [115,116,117,118,119]. Together, these studies represent a shift toward automated and potentially more flexible prediction systems in prostate cancer imaging, reporting AUCs between 0.72 and 0.88 on validation datasets.
It should be noted that side-specific outcome reporting, which is particularly relevant for nerve-sparing surgery, has only been incorporated in a limited number of previously published DL models [115,117,119]. This represents an important gap in the current literature and highlights the need for prediction frameworks that are not only accurate, but also anatomically and surgically relevant.
One of the main advantages of deep learning is that it does not depend on predefined mathematical descriptors or handcrafted image features [113]. Instead, it resembles the way a human observer recognizes visual patterns while offering the ability to analyze large volumes of data and detect image characteristics that may not be visible to the human eye [113]. As a result, deep learning has the potential to make image interpretation more objective and consistent, reduce inter-reader variability, and support clinical decision making by contributing to the standardization of radiological assessment [120,121].
Despite these advantages, deep learning also has important limitations. Many models operate as so-called black boxes, meaning that their internal decision-making processes are difficult to interpret from a clinical perspective [122]. This lack of transparency can limit clinician trust and slow the adoption of such tools in everyday practice. In addition, deep learning models are sensitive to variations in imaging data, scanner types, and acquisition protocols. Consequently, models that perform well on internal datasets may show reduced accuracy when applied to external cohorts from different centers [123]. This emphasizes the importance of robust methodological design and external validation. Another major challenge is the requirement for large, high-quality training datasets, which are often difficult to obtain in medical imaging, particularly for highly specific tasks such as predicting extracapsular extension [124].
To address these challenges, increasing attention has been directed toward explainable artificial intelligence, the use of larger and more diverse datasets, and the implementation of consistent validation strategies [123,125,126,127]. Explainable artificial intelligence aims to make the decision-making process of complex models more transparent by identifying which image regions or variables contribute most strongly to a given prediction [128]. By linking model outputs to clinically meaningful features, these approaches help align algorithmic reasoning with clinical intuition, which may improve clinician trust and facilitate integration into routine practice [129]. The way explainable AI improves the black box nature of deep learning model prediction is depicted in Figure 2.
Rather than functioning only as probability-generating tools, explainable models provide insight into why a particular prediction is made, allowing clinicians to better understand, evaluate, and contextualize the results [130]. The clinical value of such approaches has already been demonstrated in other areas of medicine, for example, in predicting hypoxemia risk during anesthesia, where explainability improved both model transparency and clinical acceptance [131]. With continued methodological advances and access to high-quality datasets, explainable deep learning approaches have the potential to become an important component of precision imaging and personalized treatment planning [114].
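One simple, model-agnostic way to generate such explanations is occlusion sensitivity: each input is masked in turn and the resulting change in the prediction is recorded as that input's importance. The sketch below uses a stand-in linear scoring function and hypothetical feature names; a real application would substitute the trained deep-learning model:

```python
def occlusion_importance(predict, features, baseline=0.0):
    """Score each input feature by how much the prediction drops
    when that feature is replaced with a neutral baseline value."""
    full = predict(features)
    importance = {}
    for name in features:
        occluded = dict(features, **{name: baseline})
        importance[name] = full - predict(occluded)
    return importance

# Stand-in "model": a fixed weighted sum (a trained network would go here);
# feature names and weights are purely illustrative.
weights = {"capsular_contact": 0.6, "adc_mean": 0.3, "psa_density": 0.1}
predict = lambda f: sum(weights[k] * f[k] for k in weights)

case = {"capsular_contact": 1.0, "adc_mean": 0.5, "psa_density": 0.8}
imp = occlusion_importance(predict, case)
print(max(imp, key=imp.get))  # → capsular_contact
```

Applied to images rather than tabular inputs, the same idea (masking patches and observing the prediction change) yields the saliency maps that explainable-AI tools overlay on MRI.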
In practice, the two approaches, ML and DL, are complementary. Classical machine learning can be effective when data volume is limited and interpretability is important [132,133]. Deep learning becomes advantageous when larger datasets are available and when the diagnostic task benefits from learning complex spatial relationships [132,133]. With the growing adoption of explainable AI and multicenter imaging datasets, both paradigms are increasingly being integrated into clinical decision-support systems, with the shared aim of improving accuracy, consistency, and personalized care. The main characteristics of the two approaches are summarized in Table 5.

3.2.3. AI vs. Specialists

MRI interpreted by an experienced radiologist remains the gold standard for prostate cancer imaging [12]. Artificial intelligence should act as a supportive tool designed to assist and augment radiological decision-making rather than replace the radiologist [134,135]. While AI-based systems may outperform less experienced readers, highly specialized genitourinary radiologists may still achieve better diagnostic accuracy, as shown in several studies [135,136,137]. However, a recent study suggests that newer AI models may outperform even expert radiologists [135]. At present, such evidence remains limited. Nevertheless, considering the inter-reader variability of MRI interpretation, AI may serve as a standardizing tool, contributing to more consistent and reproducible assessments across different levels of radiological expertise [135,138].

4. Future Directions

An important future direction in the prediction of extraprostatic extension is the development of multimodal models that combine imaging features from MRI, biopsy pathology, serum markers such as PSA or PSAD, and even genomic classifiers [139,140]. By combining different types of data, multimodal models may improve both sensitivity and specificity, especially in patients with indeterminate EPE risk. However, creating these models requires large, well-annotated datasets with synchronized clinical, imaging, and molecular data, which are difficult to obtain in most clinical settings [141]. Future studies should therefore focus on building integrated pipelines that combine these data types into personalized risk estimates for surgical decision-making, and on assembling the well-curated datasets needed to train them.
A significant limitation to clinical adoption of AI-based models for predicting EPE is the lack of external validation and the limited availability of multi-institutional datasets [141,142]. Many models are trained and tested at a single center, often involving similar patient groups, imaging protocols, and annotation standards. This raises concerns about overfitting and poor generalization to real-world clinical scenarios [143,144]. Future efforts must prioritize external validation cohorts with varied demographic and technical characteristics. Collaborative groups, federated learning setups, and publicly available collections of annotated MRI-histopathology pairs could speed up this process. Without strong validation across institutions, clinical use of AI models will probably remain restricted.
Another underexplored area is the direct comparison between artificial intelligence models and expert radiologists, as well as the evaluation of combined strategies in which AI models are used to support human interpretation [145]. While several studies regarding EPE detection have compared model performance with that of radiologists, only a limited number have assessed whether a radiologist assisted by an AI model outperforms either approach alone [115]. Future research should systematically investigate three scenarios: radiologist alone, model alone, and radiologist assisted by the model, to determine whether AI can meaningfully enhance human performance rather than merely replicate it.
The inconsistency in defining ground truth and reporting performance is a major limitation in the current literature on AI for EPE detection. Some studies define EPE based on clinical reports; others use registered histology with precise capsular annotations. This variability complicates comparison and evaluation between models. Guidelines such as TRIPOD-ML [146,147], QUADAS-AI [148], and PROBAST-AI [149,150] provide standards for good reporting. Their adoption would improve reproducibility and help clinicians better evaluate AI tools for EPE risk assessment.
Precise and reproducible segmentation of the prostate and lesions is essential for radiomics. Manual segmentation is time-consuming and can vary between experts, making automated segmentation crucial for real-world AI use. Recent advances in 3D architectures such as 3D U-Net [151], V-Net [152], and nnU-Net [153] have shown strong results in prostate MRI segmentation, often achieving Dice scores above 0.90 [154]. Incorporating such tools into EPE prediction workflows can reduce operator dependence and improve model scalability.
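For reference, the Dice score quoted above quantifies the overlap between a predicted and a reference segmentation as 2|A∩B|/(|A|+|B|); a minimal sketch on toy binary masks (not real prostate data):

```python
def dice(pred, truth):
    """Dice similarity coefficient: 2|A∩B| / (|A| + |B|) for binary masks."""
    inter = sum(p and t for p, t in zip(pred, truth))
    total = sum(pred) + sum(truth)
    return 2.0 * inter / total if total else 1.0

# Toy flattened masks (1 = prostate voxel); real masks are 3D volumes.
truth = [0, 1, 1, 1, 1, 0, 0, 0]
pred  = [0, 1, 1, 1, 0, 1, 0, 0]
print(dice(pred, truth))  # → 0.75
```

A Dice score of 0.90, as reported for the architectures above, thus means the predicted and reference masks share 90% of their combined voxel mass.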
In addition to the radiologist’s expertise and proper training of AI models, image quality is a critical determinant of EPE detection performance [155,156]. Poor MRI quality may lead to an underestimation of subtle capsular irregularities and compromise both human and algorithmic interpretation. Therefore, future studies should incorporate standardized tools for assessing prostate MRI quality, such as PI-QUAL [157]. Consequently, reporting image quality scores should become a mandatory element of future studies evaluating AI-based EPE prediction models.
Most studies included in this review treated extraprostatic extension as a binary outcome, classifying disease simply as present or absent; however, this approach may be clinically insufficient, as focal capsular breach and established extraprostatic extension carry different prognostic and surgical implications, particularly for nerve-sparing decisions and margin risk stratification [158,159]. Only a limited number of investigations attempted to grade EPE severity [59,61,115,116,117] despite evidence that this distinction provides incremental clinical value [160]. Consequently, binary prediction frameworks may underestimate the biological complexity of periprostatic tumor spread and contribute to discrepancies between reported model performance and real-world surgical outcomes. Future artificial intelligence models should therefore incorporate graded EPE assessment using multi-class or regression-based strategies to better reflect invasion extent and enhance clinical interpretability.
The most important features of future EPE prediction models are depicted in Figure 3.

5. Conclusions

Preoperative detection of extraprostatic extension is limited by the sensitivity of traditional clinical parameters and mpMRI, as well as by substantial inter-reader variability. These limitations may lead to inappropriate surgical planning with negative consequences for both oncological and functional outcomes.
Radiomics and machine-learning approaches provide more objective and quantitative assessments of mpMRI and match and sometimes even outperform conventional nomograms. In addition, emerging non-MRI imaging modalities, such as high-resolution micro-ultrasound and PSMA PET, may play a complementary role in selected clinical scenarios, particularly when mpMRI findings are equivocal. However, the performance of these advanced approaches remains dependent on image quality, segmentation accuracy, and the heterogeneity of available datasets.
Deep-learning models offer further improvements in analyzing complex imaging patterns and may support more comprehensive, multimodal assessment strategies in the future. Nevertheless, their clinical adoption is still restricted by limited interpretability, insufficient external validation, and the lack of reliable side-specific predictions. Future efforts should therefore focus on developing explainable, multimodal, and externally validated frameworks that integrate complementary imaging tools and artificial intelligence to support personalized surgical decision-making in routine clinical practice.

Author Contributions

Conceptualization, J.S.; Methodology, J.S., J.K. and A.K.; Investigation, J.S., J.H., J.K., A.K., T.M. and W.A.C.; Data curation, J.S. and J.K.; Writing—original draft preparation, J.S.; Writing—review and editing, J.S., T.M. and W.A.C.; Visualization, J.H.; Supervision, T.M. and W.A.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

No new data were created or analyzed in this study.

Acknowledgments

The authors acknowledge the use of AI-assisted tools (OpenAI 5.2 and Grammarly 14.1267.0) for minor language editing and refining, and help with image generation. The authors have reviewed and edited the output and take full responsibility for the content of this publication.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
EPE: Extraprostatic extension
mpMRI: Multiparametric Magnetic Resonance Imaging
ML: Machine learning
DL: Deep learning
PSA: Prostate-specific antigen
PSAD: PSA density
ISUP: International Society of Urological Pathology
TNM: Tumor, Node, Metastasis
EAU: European Association of Urology
csPC: Clinically significant prostate cancer
PC: Prostate cancer
SVI: Seminal vesicle invasion
NVB: Neurovascular bundle
TCA1: Tumor contact area 1
TCA2: Tumor contact area 2
TCL: Tumor contact length
MSKCC: Memorial Sloan Kettering Cancer Center
CAPRA: Cancer of the Prostate Risk Assessment
AUROC: Area under receiver operating characteristic curve
AUPRC: Area under precision–recall curve
SEPERA: Side-specific Extraprostatic Extension Risk Assessment
DGC: Decipher Genomic Classifier
PET: Positron emission tomography
PSMA: Prostate-specific membrane antigen
PSMA-PET: Prostate-specific membrane antigen positron emission tomography
PSMA-TV: Prostate-specific membrane antigen–derived tumor volume
SUVmax: Maximum standardized uptake value
TV: Tumor volume

References

  1. Wibmer, A.G.; Robertson, N.L.; Hricak, H.; Zheng, J.; Capanu, M.; Stone, S.; Ehdaie, B.; Brawer, M.K.; Vargas, H.A. Extracapsular Extension on MRI Indicates a More Aggressive Cell Cycle Progression Genotype of Prostate Cancer. Abdom. Radiol. 2019, 44, 2864–2873. [Google Scholar] [CrossRef] [PubMed]
  2. Cheng, L.; Darson, M.F.; Bergstralh, E.J.; Slezak, J.; Myers, R.P.; Bostwick, D.G. Correlation of Margin Status and Extraprostatic Extension with Progression of Prostate Carcinoma. Cancer 1999, 86, 1775–1782. [Google Scholar] [CrossRef]
  3. Ball, M.W.; Partin, A.W.; Epstein, J.I. Extent of Extraprostatic Extension Independently Influences Biochemical Recurrence-Free Survival: Evidence for Further PT3 Subclassification. Urology 2015, 85, 161–164. [Google Scholar] [CrossRef] [PubMed]
  4. Wang, X.; Wu, Y.; Guo, J.; Chen, H.; Weng, X.; Liu, X. Intrafascial Nerve-Sparing Radical Prostatectomy Improves Patients’ Postoperative Continence Recovery and Erectile Function: A Pooled Analysis Based on Available Literatures. Medicine 2018, 97, e11297. [Google Scholar] [CrossRef]
  5. Kyriazis, I.; Spinos, T.; Tsaturyan, A.; Kallidonis, P.; Stolzenburg, J.U.; Liatsikos, E. Different Nerve-Sparing Techniques during Radical Prostatectomy and Their Impact on Functional Outcomes. Cancers 2022, 14, 1601. [Google Scholar] [CrossRef]
  6. Liu, Y.; Deng, X.Z.; Qin, J.; Wen, Z.; Jiang, Y.; Huang, J.; Wang, C.J.; Chen, C.X.; Wang, L.; Li, K.P.; et al. Erectile Function, Urinary Continence and Oncologic Outcomes of Neurovascular Bundle Sparing Robot-Assisted Radical Prostatectomy for High-Risk Prostate Cancer: A Systematic Review and Meta-Analysis. Front. Oncol. 2023, 13, 1161544. [Google Scholar] [CrossRef]
  7. Xiang, P.; Du, Z.; Guan, D.; Yan, W.; Wang, M.; Guo, D.; Liu, D.; Liu, Y.; Ping, H. Is There Any Difference in Urinary Continence between Bilateral and Unilateral Nerve Sparing during Radical Prostatectomy? A Systematic Review and Meta-Analysis. World J. Surg. Oncol. 2024, 22, 66. [Google Scholar] [CrossRef]
  8. Lavallée, L.T.; Stokl, A.; Cnossen, S.; Mallick, R.; Morash, C.; Cagiannos, I.; Breau, R.H. The Effect of Wide Resection during Radical Prostatectomy on Surgical Margins. J. Can. Urol. Assoc. 2016, 10, 14–17. [Google Scholar] [CrossRef]
  9. Prostate Cancer Nomograms: Pre-Radical Prostatectomy|Memorial Sloan Kettering Cancer Center. Available online: https://www.mskcc.org/nomograms/prostate/pre_op (accessed on 28 December 2025).
  10. Partin, A.W.; Mangold, L.A.; Lamm, D.M.; Walsh, P.C.; Epstein, J.I.; Pearson, J.D. Contemporary Update of Prostate Cancer Staging Nomograms (Partin Tables) for the New Millennium. Urology 2001, 58, 843–848. [Google Scholar] [CrossRef]
  11. Ahmed, H.U.; El-Shater Bosaily, A.; Brown, L.C.; Gabe, R.; Kaplan, R.; Parmar, M.K.; Collaco-Moraes, Y.; Ward, K.; Hindley, R.G.; Freeman, A.; et al. Diagnostic Accuracy of Multi-Parametric MRI and TRUS Biopsy in Prostate Cancer (PROMIS): A Paired Validating Confirmatory Study. Lancet 2017, 389, 815–822. [Google Scholar] [CrossRef]
  12. Prostate Cancer—Uroweb. Available online: https://uroweb.org/guidelines/prostate-cancer/summary-of-changes/2019 (accessed on 28 December 2025).
  13. Turkbey, B.; Rosenkrantz, A.B.; Haider, M.A.; Padhani, A.R.; Villeirs, G.; Macura, K.J.; Tempany, C.M.; Choyke, P.L.; Cornud, F.; Margolis, D.J.; et al. Prostate Imaging Reporting and Data System Version 2.1: 2019 Update of Prostate Imaging Reporting and Data System Version 2. Eur. Urol. 2019, 76, 340–351. [Google Scholar] [CrossRef] [PubMed]
  14. Zhang, F.; Liu, C.L.; Chen, Q.; Shao, S.C.; Chen, S.Q. Accuracy of Multiparametric Magnetic Resonance Imaging for Detecting Extracapsular Extension in Prostate Cancer: A Systematic Review and Meta-Analysis. Br. J. Radiol. 2019, 92, 20190480. [Google Scholar] [CrossRef] [PubMed]
  15. Zhu, M.L.; Gao, J.H.; Han, F.; Yin, L.L.; Zhang, L.S.; Yang, Y.; Zhang, J.W. Diagnostic Performance of Prediction Models for Extraprostatic Extension in Prostate Cancer: A Systematic Review and Meta-Analysis. Insights Imaging 2023, 14, 140. [Google Scholar] [CrossRef] [PubMed]
  16. Ponsiglione, A.; Gambardella, M.; Stanzione, A.; Green, R.; Cantoni, V.; Nappi, C.; Crocetto, F.; Cuocolo, R.; Cuocolo, A.; Imbriaco, M. Radiomics for the Identification of Extraprostatic Extension with Prostate MRI: A Systematic Review and Meta-Analysis. Eur. Radiol. 2023, 34, 3981–3991. [Google Scholar] [CrossRef]
  17. Wen, J.; Liu, W.; Zhang, Y.; Shen, X. MRI-Based Radiomics for Prediction of Extraprostatic Extension of Prostate Cancer: A Systematic Review and Meta-Analysis. La Radiol. Medica 2024, 129, 702–711. [Google Scholar] [CrossRef]
  18. Gilliland, F.D.; Hoffman, R.M.; Hamilton, A.; Albertsen, P.; Eley, J.W.; Harlan, L.; Stanford, J.L.; Hunt, W.C.; Potosky, A. Predicting Extracapsular Extension of Prostate Cancer in Men Treated with Radical Prostatectomy: Results from the Population Based Prostate Cancer Outcomes Study. J. Urol. 1999, 162, 1341–1345. [Google Scholar] [CrossRef]
  19. Moon, H.W.; Kim, D.H.; Kim, J.; Kim, B.; Oh, S.N.; Choi, J.-I.; Rha, S.E.; Lee, J.Y. A Preoperative Scoring System for Predicting the Extraprostatic Extension of Prostate Cancer Following Radical Prostatectomy Using Magnetic Resonance Imaging and Clinical Factors. Abdom. Radiol. 2024, 49, 2683–2692. [Google Scholar] [CrossRef]
  20. Koie, T.; Mitsuzuka, K.; Yoneyama, T.; Narita, S.; Kawamura, S.; Kaiho, Y.; Tsuchiya, N.; Tochigi, T.; Habuchi, T.; Arai, Y.; et al. Prostate-Specific Antigen Density Predicts Extracapsular Extension and Increased Risk of Biochemical Recurrence in Patients with High-Risk Prostate Cancer Who Underwent Radical Prostatectomy. Int. J. Clin. Oncol. 2015, 20, 176–181. [Google Scholar] [CrossRef]
  21. Kim, S.H.; Cho, S.H.; Kim, W.H.; Kim, H.J.; Park, J.M.; Kim, G.C.; Ryeom, H.K.; Yoon, Y.S.; Cha, J.G. Predictors of Extraprostatic Extension in Patients with Prostate Cancer. J. Clin. Med. 2023, 12, 5321. [Google Scholar] [CrossRef]
  22. Lim, C.; Flood, T.A.; Hakim, S.W.; Shabana, W.M.; Quon, J.S.; El-Khodary, M.; Thornhill, R.E.; El Hallani, S.; Schieda, N. Evaluation of Apparent Diffusion Coefficient and MR Volumetry as Independent Associative Factors for Extra-Prostatic Extension (EPE) in Prostatic Carcinoma. J. Magn. Reson. Imaging 2016, 43, 726–736. [Google Scholar] [CrossRef]
  23. Sertkaya, Z.; Öztürk, M.İ.; Koca, O.; Güneş, M.; Karaman, M.İ. Predictive Values for Extracapsular Extension in Prostate Cancer Patients with PSA Values below 10 Ng/ML. Turk. J. Urol. 2014, 40, 130. [Google Scholar] [CrossRef]
  24. Merder, E.; Arıman, A.; Altunrende, F. A Modified Partın Table to Better Predict Extracapsular Extensıon in Clinically Localized Prostate Cancer. Urol. J. 2021, 18, 74–80. [Google Scholar] [CrossRef]
  25. Gnanapragasam, V.J.; Lophatananon, A.; Wright, K.A.; Muir, K.R.; Gavin, A.; Greenberg, D.C. Improving Clinical Risk Stratification at Diagnosis in Primary Prostate Cancer: A Prognostic Modelling Study. PLoS Med. 2016, 13, 1002063. [Google Scholar] [CrossRef] [PubMed]
  26. Gnanapragasam, V.J.; Bratt, O.; Muir, K.; Lee, L.S.; Huang, H.H.; Stattin, P.; Lophatananon, A. The Cambridge Prognostic Groups for Improved Prediction of Disease Mortality at Diagnosis in Primary Non-Metastatic Prostate Cancer: A Validation Study. BMC Med. 2018, 16, 31. [Google Scholar] [CrossRef] [PubMed]
  27. Parry, M.G.; Cowling, T.E.; Sujenthiran, A.; Nossiter, J.; Berry, B.; Cathcart, P.; Aggarwal, A.; Payne, H.; Van Der Meulen, J.; Clarke, N.W.; et al. Risk Stratification for Prostate Cancer Management: Value of the Cambridge Prognostic Group Classification for Assessing Treatment Allocation. BMC Med. 2020, 18, 114. [Google Scholar] [CrossRef] [PubMed]
  28. Cozzi, G.; Rocco, B.M.; Grasso, A.; Rosso, M.; Abed El Rahman, D.; Oliva, I.; Talso, M.; Costa, B.; Tafa, A.; Palumbo, C.; et al. Perineural Invasion as a Predictor of Extraprostatic Extension of Prostate Cancer: A Systematic Review and Meta-Analysis. Scand. J. Urol. 2013, 47, 443–448. [Google Scholar] [CrossRef]
  29. Chen, J.R.; Zhao, J.G.; Zhu, S.; Zhang, M.N.; Chen, N.; Liu, J.D.; Sun, G.X.; Shen, P.F.; Zeng, H. Clinical and Oncologic Findings of Extraprostatic Extension on Needle Biopsy in de Novo Metastatic Prostate Cancer. Asian J. Androl. 2020, 22, 427–431. [Google Scholar] [CrossRef]
  30. Goldberg, H.; Ramiz, A.H.; Glicksman, R.; Salgado, N.S.; Chandrasekar, T.; Klaassen, Z.; Wallis, C.J.D.; Hosni, A.; Moraes, F.Y.; Ghai, S.; et al. Extraprostatic Extension in Core Biopsies Epitomizes High-Risk but Locally Treatable Prostate Cancer. Eur. Urol. Oncol. 2019, 2, 88–96. [Google Scholar] [CrossRef]
  31. Fleshner, K.; Assel, M.; Benfante, N.; Lee, J.; Vickers, A.; Fine, S.; Carlsson, S.; Eastham, J. Clinical Findings and Treatment Outcomes in Patients with Extraprostatic Extension Identified on Prostate Biopsy. J. Urol. 2016, 196, 703–708. [Google Scholar] [CrossRef]
  32. Billis, A. Pathology. Int. Braz. J. Urol. 2004, 30, 346–348. [Google Scholar] [CrossRef][Green Version]
  33. Choi, M.H.; Kim, D.H.; Lee, Y.J.; Rha, S.E.; Lee, J.Y. Imaging Features of the PI-RADS for Predicting Extraprostatic Extension of Prostate Cancer: Systematic Review and Meta-Analysis. Insights Imaging 2023, 14, 77. [Google Scholar] [CrossRef] [PubMed]
  34. Majchrzak, N.; Cieśliński, P.; Milecki, T.; Głyda, M.; Karmelita-Katulska, K. MRI utility in predicting extraprostatic extension of prostate cancer and biochemical recurrence after radical prostatectomy. Biul. Pol. Tow. Onkol. Nowotw. 2021, 4, 261–265. [Google Scholar] [CrossRef]
  35. Ayaz, M.; Gülseren, Y.; İnan, İ.; Ok, F.; Kabaalioǧlu, A.; Yildirim, A. Extraprostatic Extension in Multiparametric MRI.; Is Presurgical Detection Possible? J. Cancer Res. Ther. 2023, 19, S639–S644. [Google Scholar] [CrossRef] [PubMed]
  36. Li, W.; Dong, A.; Hong, G.; Shang, W.; Shen, X. Diagnostic Performance of ESUR Scoring System for Extraprostatic Prostate Cancer Extension: A Meta-Analysis. Eur. J. Radiol. 2021, 143, 109896. [Google Scholar] [CrossRef]
  37. Schieda, N.; Quon, J.S.; Lim, C.; El-Khodary, M.; Shabana, W.; Singh, V.; Morash, C.; Breau, R.H.; McInnes, M.D.F.; Flood, T.A. Evaluation of the European Society of Urogenital Radiology (ESUR) PI-RADS Scoring System for Assessment of Extra-Prostatic Extension in Prostatic Carcinoma. Eur. J. Radiol. 2015, 84, 1843–1848. [Google Scholar] [CrossRef]
  38. de Rooij, M.; Hamoen, E.H.J.; Witjes, J.A.; Barentsz, J.O.; Rovers, M.M. Accuracy of Magnetic Resonance Imaging for Local Staging of Prostate Cancer: A Diagnostic Meta-Analysis. Eur. Urol. 2016, 70, 233–245. [Google Scholar] [CrossRef]
  39. Krishna, S.; Lim, C.S.; McInnes, M.D.F.; Flood, T.A.; Shabana, W.M.; Lim, R.S.; Schieda, N. Evaluation of MRI for Diagnosis of Extraprostatic Extension in Prostate Cancer. J. Magn. Reson. Imaging 2018, 47, 176–185. [Google Scholar] [CrossRef]
  40. Freifeld, Y.; DeLeon, A.D.; Xi, Y.; Pedrosa, I.; Roehrborn, C.G.; Lotan, Y.; Francis, F.; Costa, D.N. Diagnostic Performance of Prospectively Assigned Likert Scale Scores to Determine Extraprostatic Extension and Seminal Vesicle Invasion with Multiparametric MRI of the Prostate. AJR Am. J. Roentgenol. 2019, 212, 576–581. [Google Scholar] [CrossRef]
  41. Asfuroğlu, U.; Asfuroğlu, B.B.; Özer, H.; Gönül, İ.I.; Tokgöz, N.; İnan, M.A.; Uçar, M. Which One Is Better for Predicting Extraprostatic Extension on Multiparametric MRI: ESUR Score, Likert Scale, Tumor Contact Length, or EPE Grade? Eur. J. Radiol. 2022, 149, 110228. [Google Scholar] [CrossRef]
  42. Zapała, P.; Dybowski, B.; Bres-Niewada, E.; Lorenc, T.; Powała, A.; Lewandowski, Z.; Gołębiowski, M.; Radziszewski, P. Predicting Side-Specific Prostate Cancer Extracapsular Extension: A Simple Decision Rule of PSA, Biopsy, and MRI Parameters. Int. Urol. Nephrol. 2019, 51, 1545–1552. [Google Scholar] [CrossRef]
  43. Reisæter, L.A.R.; Halvorsen, O.J.; Beisland, C.; Honoré, A.; Gravdal, K.; Losnegård, A.; Monssen, J.; Akslen, L.A.; Biermann, M. Assessing Extraprostatic Extension with Multiparametric MRI of the Prostate: Mehralivand Extraprostatic Extension Grade or Extraprostatic Extension Likert Scale? Radiol. Imaging Cancer 2020, 2, e190071. [Google Scholar] [CrossRef]
  44. Engel, H.; Oerther, B.; Reisert, M.; Kellner, E.; Sigle, A.; Gratzke, C.; Bronsert, P.; Krauss, T.; Bamberg, F.; Benndorf, M. Quantitative Analysis of Diffusion Weighted Imaging May Improve Risk Stratification of Prostatic Transition Zone Lesions. In Vivo 2022, 36, 2323. [Google Scholar] [CrossRef] [PubMed]
  45. Wibmer, A.; Vargas, H.A.; Donahue, T.F.; Zheng, J.; Moskowitz, C.; Eastham, J.; Sala, E.; Hricak, H. Diagnosis of Extracapsular Extension of Prostate Cancer on Prostate MRI: Impact of Second-Opinion Readings by Subspecialized Genitourinary Oncologic Radiologists. AJR Am. J. Roentgenol. 2015, 205, W73–W78. [Google Scholar] [CrossRef] [PubMed]
  46. Li, W.; Shang, W.; Lu, F.; Sun, Y.; Tian, J.; Wu, Y.; Dong, A. Diagnostic Performance of Extraprostatic Extension Grading System for Detection of Extraprostatic Extension in Prostate Cancer: A Diagnostic Systematic Review and Meta-Analysis. Front. Oncol. 2022, 11, 792120. [Google Scholar] [CrossRef] [PubMed]
  47. Ahn, H.; Hwang, S.I.; Lee, H.J.; Suh, H.S.; Choe, G.; Byun, S.S.; Hong, S.K.; Lee, S.; Lee, J. Prediction of Extraprostatic Extension on Multi-Parametric Magnetic Resonance Imaging in Patients with Anterior Prostate Cancer. Eur. Radiol. 2020, 30, 26–37. [Google Scholar] [CrossRef]
  48. Kim, T.H.; Woo, S.; Han, S.; Suh, C.H.; Ghafoor, S.; Hricak, H.; Vargas, H.A. The Diagnostic Performance of the Length of Tumor Capsular Contact on MRI for Detecting Prostate Cancer Extraprostatic Extension: A Systematic Review and Meta-Analysis. Korean J. Radiol. 2020, 21, 684–694. [Google Scholar] [CrossRef]
  49. Futela, D.; Bhargava, M.; Rama, S.; Doddi, S.; Chen, Y.; Ramaiya, N.H.; Tirumani, S.H. 10 mm (PI-RADS v2.1) versus 15 Mm (PI-RADS v1.0) Tumor Capsule Contact Length in Predicting Extracapsular Extension in Prostate Cancer: Meta-Analysis and Systematic Review. Abdom. Radiol. 2025, 50, 6106–6118. [Google Scholar] [CrossRef]
  50. Bakir, B.; Onay, A.; Vural, M.; Armutlu, A.; Yildiz, S.Ö.; Esen, T. Can Extraprostatic Extension Be Predicted by Tumor-Capsule Contact Length in Prostate Cancer? Relationship With International Society of Urological Pathology Grade Groups. AJR Am. J. Roentgenol. 2020, 214, 588–596. [Google Scholar] [CrossRef]
  51. Gomez-Iturriaga, A.; Büchser, D.; Miguel, I.S.; Marban, M.; Urresola, A.; Ezquerro, A.; Gil, A.; Suarez, F.; Gonzalez, A.; Mairata, E.; et al. MRI Detected Extaprostatic Extension (EPE) in Prostate Cancer: Do All T3a Patients Have the Same Outcomes? Clin. Transl. Radiat. Oncol. 2020, 24, 135–139. [Google Scholar] [CrossRef]
  52. Martini, F.; Pigati, M.; Mattiauda, M.; Ponzano, M.; Piol, N.; Pigozzi, S.; Spina, B.; Cittadini, G.; Giasotto, V.; Zawaideh, J.P. Extra-Prostatic Extension Grading System: Correlation with MRI Features and Integration of Capsular Enhancement Sign for “Enhanced” Detection of T3a Lesions. Br. J. Radiol. 2024, 97, 971. [Google Scholar] [CrossRef]
  53. Asfuroğlu, U.; Asfuroğlu, B.B.; Özer, H.; İnan, M.A.; Uçar, M. A Comparative Analysis of Techniques for Measuring Tumor Contact Length in Predicting Extraprostatic Extension. Eur. J. Radiol. 2024, 181, 111753. [Google Scholar] [CrossRef] [PubMed]
  54. Tsujimoto, M.; Inoue, Y.; Taga, H.; Saito, Y.; Kaneko, M.; Miyashita, M.; Yamada, T.; Yamada, Y.; Ueda, T.; Fujihara, A.; et al. MRI-Determined Tumor Contact Area as a Predictor of Pathological Extraprostatic Extension in Clinical T2 Prostate Cancer. Prostate Cancer 2025, 2025, 9165949. [Google Scholar] [CrossRef]
  55. Park, C.K.; Chung, Y.S.; Choi, Y.D.; Ham, W.S.; Jang, W.S.; Cho, N.H. Revisiting Extraprostatic Extension Based on Invasion Depth and Number for New Algorithm for Substaging of PT3a Prostate Cancer. Sci. Rep. 2021, 11, 13952. [Google Scholar] [CrossRef] [PubMed]
  56. Eifler, J.B.; Feng, Z.; Lin, B.M.; Partin, M.T.; Humphreys, E.B.; Han, M.; Epstein, J.I.; Walsh, P.C.; Trock, B.J.; Partin, A.W. An Updated Prostate Cancer Staging Nomogram (Partin Tables) Based on Cases from 2006 to 2011. BJU Int. 2013, 111, 22–29. [Google Scholar] [CrossRef] [PubMed]
  57. Cooperberg, M.R.; Pasta, D.J.; Elkin, E.P.; Litwin, M.S.; Latini, D.M.; Duchane, J.; Carroll, P.R. The University of California, San Francisco Cancer of the Prostate Risk Assessment Score: A Straightforward and Reliable Preoperative Predictor of Disease Recurrence after Radical Prostatectomy. J. Urol. 2005, 173, 1938–1942. [Google Scholar] [CrossRef]
  58. Martini, A.; Gupta, A.; Lewis, S.C.; Cumarasamy, S.; Haines, K.G.; Briganti, A.; Montorsi, F.; Tewari, A.K. Development and Internal Validation of a Side-Specific, Multiparametric Magnetic Resonance Imaging-Based Nomogram for the Prediction of Extracapsular Extension of Prostate Cancer. BJU Int. 2018, 122, 1025–1033. [Google Scholar] [CrossRef]
  59. Soeterik, T.F.W.; van Melick, H.H.E.; Dijksman, L.M.; Küsters-Vandevelde, H.V.N.; Biesma, D.H.; Witjes, J.A.; van Basten, J.P.A. External Validation of the Martini Nomogram for Prediction of Side-Specific Extraprostatic Extension of Prostate Cancer in Patients Undergoing Robot-Assisted Radical Prostatectomy. Urol. Oncol. Semin. Orig. Investig. 2020, 38, 372–378. [Google Scholar] [CrossRef]
  60. Nyarangi-Dix, J.; Wiesenfarth, M.; Bonekamp, D.; Hitthaler, B.; Schütz, V.; Dieffenbacher, S.; Mueller-Wolf, M.; Roth, W.; Stenzinger, A.; Duensing, S.; et al. Combined Clinical Parameters and Multiparametric Magnetic Resonance Imaging for the Prediction of Extraprostatic Disease—A Risk Model for Patient-Tailored Risk Stratification When Planning Radical Prostatectomy. Eur. Urol. Focus 2020, 6, 1205–1212. [Google Scholar] [CrossRef]
  61. Wibmer, A.G.; Kattan, M.W.; Alessandrino, F.; Baur, A.D.J.; Boesen, L.; Franco, F.B.; Bonekamp, D.; Campa, R.; Cash, H.; Catalá, V.; et al. International Multi-Site Initiative to Develop an MRI-Inclusive Nomogram for Side-Specific Prediction of Extraprostatic Extension of Prostate Cancer. Cancers 2021, 13, 2627. [Google Scholar] [CrossRef]
  62. Heetman, J.G.; van der Hoeven, E.J.R.J.; Rajwa, P.; Zattoni, F.; Kesch, C.; Shariat, S.; Dal Moro, F.; Novara, G.; La Bombara, G.; Sattin, F.; et al. External Validation of Nomograms Including MRI Features for the Prediction of Side-Specific Extraprostatic Extension. Prostate Cancer Prostatic Dis. 2024, 27, 492–499. [Google Scholar] [CrossRef]
  63. Soeterik, T.F.W.; van Melick, H.H.E.; Dijksman, L.M.; Küsters-Vandevelde, H.; Stomps, S.; Schoots, I.G.; Biesma, D.H.; Witjes, J.A.; van Basten, J.P.A. Development and External Validation of a Novel Nomogram to Predict Side-Specific Extraprostatic Extension in Patients with Prostate Cancer Undergoing Radical Prostatectomy. Eur. Urol. Oncol. 2022, 5, 328–337. [Google Scholar] [CrossRef] [PubMed]
  64. Diamand, R.; Ploussard, G.; Roumiguié, M.; Oderda, M.; Benamran, D.; Fiard, G.; Quackels, T.; Assenmacher, G.; Simone, G.; Van Damme, J.; et al. External Validation of a Multiparametric Magnetic Resonance Imaging–Based Nomogram for the Prediction of Extracapsular Extension and Seminal Vesicle Invasion in Prostate Cancer Patients Undergoing Radical Prostatectomy. Eur. Urol. 2021, 79, 180–185. [Google Scholar] [CrossRef] [PubMed]
  65. Gandaglia, G.; Ploussard, G.; Valerio, M.; Mattei, A.; Fiori, C.; Roumiguié, M.; Fossati, N.; Stabile, A.; Beauval, J.B.; Malavaud, B.; et al. The Key Combined Value of Multiparametric Magnetic Resonance Imaging, and Magnetic Resonance Imaging–Targeted and Concomitant Systematic Biopsies for the Prediction of Adverse Pathological Features in Prostate Cancer Patients Undergoing Radical Prostatectomy. Eur. Urol. 2020, 77, 733–741. [Google Scholar] [CrossRef] [PubMed]
  66. Sayyid, R.; Perlis, N.; Ahmad, A.; Evans, A.; Toi, A.; Horrigan, M.; Finelli, A.; Zlotta, A.; Kulkarni, G.; Hamilton, R.; et al. Development and External Validation of a Biopsy-Derived Nomogram to Predict Risk of Ipsilateral Extraprostatic Extension. BJU Int. 2017, 120, 76–82. [Google Scholar] [CrossRef]
  67. Diamand, R.; Roche, J.B.; Lievore, E.; Lacetera, V.; Chiacchio, G.; Beatrici, V.; Mastroianni, R.; Simone, G.; Windisch, O.; Benamran, D.; et al. External Validation of Models for Prediction of Side-Specific Extracapsular Extension in Prostate Cancer Patients Undergoing Radical Prostatectomy. Eur. Urol. Focus 2023, 9, 309–316. [Google Scholar] [CrossRef]
  68. Tosoian, J.J.; Chappidi, M.; Feng, Z.; Humphreys, E.B.; Han, M.; Pavlovich, C.P.; Epstein, J.I.; Partin, A.W.; Trock, B.J. Prediction of Pathological Stage Based on Clinical Stage, Serum Prostate-Specific Antigen, and Biopsy Gleason Score: Partin Tables in the Contemporary Era. BJU Int. 2017, 119, 676–683. [Google Scholar] [CrossRef]
  69. Fanning, D.M.; Fan, Y.; Fitzpatrick, J.M.; Watson, R.W.G. External Validation of the 2007 and 2001 Partin Tables in Irish Prostate Cancer Patients. Urol. Int. 2010, 84, 174–179. [Google Scholar] [CrossRef]
  70. Karakiewicz, P.I.; Bhojani, N.; Capitanio, U.; Reuther, A.M.; Suardi, N.; Jeldres, C.; Pharand, D.; Péloquin, F.; Perrotte, P.; Shariat, S.F.; et al. External Validation of the Updated Partin Tables in a Cohort of North American Men. J. Urol. 2008, 180, 898–903. [Google Scholar] [CrossRef]
  71. Bhojani, N.; Salomon, L.; Capitanio, U.; Suardi, N.; Shariat, S.F.; Jeldres, C.; Zini, L.; Pharand, D.; Péloquin, F.; Arjane, P.; et al. External Validation of the Updated Partin Tables in a Cohort of French and Italian Men. Int. J. Radiat. Oncol. Biol. Phys. 2009, 73, 347–352. [Google Scholar] [CrossRef]
  72. Otles, E.; Denton, B.T.; Qu, B.; Murali, A.; Merdan, S.; Auffenberg, G.B.; Hiller, S.C.; Lane, B.R.; George, A.K.; Singh, K. Development and Validation of Models to Predict Pathological Outcomes of Radical Prostatectomy in Regional and National Cohorts. J. Urol. 2022, 207, 358–366. [Google Scholar] [CrossRef]
  73. Blas, L.; Shiota, M.; Nagakawa, S.; Tsukahara, S.; Matsumoto, T.; Lee, K.; Monji, K.; Kashiwagi, E.; Inokuchi, J.; Eto, M. Validation of User-Friendly Models Predicting Extracapsular Extension in Prostate Cancer Patients. Asian J. Urol. 2023, 10, 81–88. [Google Scholar] [CrossRef]
  74. Xu, L.; Zhang, G.; Zhang, X.; Bai, X.; Yan, W.; Xiao, Y.; Sun, H.; Jin, Z. External Validation of the Extraprostatic Extension Grade on MRI and Its Incremental Value to Clinical Models for Assessing Extraprostatic Cancer. Front. Oncol. 2021, 11, 655093. [Google Scholar] [CrossRef]
  76. Sighinolfi, M.C.; Sandri, M.; Torricelli, P.; Ligabue, G.; Fiocchi, F.; Scialpi, M.; Eissa, A.; Reggiani Bonetti, L.; Puliatti, S.; Bianchi, G.; et al. External Validation of a Novel Side-Specific, Multiparametric Magnetic Resonance Imaging-Based Nomogram for the Prediction of Extracapsular Extension of Prostate Cancer: Preliminary Outcomes on a Series Diagnosed with Multiparametric Magnetic Resonance Imaging-Targeted plus Systematic Saturation Biopsy. BJU Int. 2019, 124, 192–194. [Google Scholar] [CrossRef]
  76. Kwong, J.C.C.; Khondker, A.; Meng, E.; Taylor, N.; Kuk, C.; Perlis, N.; Kulkarni, G.S.; Hamilton, R.J.; Fleshner, N.E.; Finelli, A.; et al. Development, Multi-Institutional External Validation, and Algorithmic Audit of an Artificial Intelligence-Based Side-Specific Extra-Prostatic Extension Risk Assessment Tool (SEPERA) for Patients Undergoing Radical Prostatectomy: A Retrospective Cohort Study. Lancet Digit. Health 2023, 5, e435–e445. [Google Scholar] [CrossRef]
  77. Veerman, H.; Heymans, M.W.; van der Poel, H.G. External Validation of a Prediction Model for Side-Specific Extraprostatic Extension of Prostate Cancer at Robot-Assisted Radical Prostatectomy. Eur. Urol. Open Sci. 2022, 37, 50–52. [Google Scholar] [CrossRef] [PubMed]
  78. Kwong, J.C.C.; Khondker, A.; Tran, C.; Evans, E.; Cozma, A.I.; Javidan, A.; Ali, A.; Jamal, M.; Short, T.; Papanikolaou, F.; et al. Explainable Artificial Intelligence to Predict the Risk of Side-Specific Extraprostatic Extension in Pre-Prostatectomy Patients. Can. Urol. Assoc. J. 2022, 16, 213–221. [Google Scholar] [CrossRef] [PubMed]
  79. Regis, F.; Casale, P.; Persico, F.; Colombo, P.; Cieri, M.; Guazzoni, G.; Buffi, N.M.; Lughezzani, G. Use of 29-MHz Micro-Ultrasound for Local Staging of Prostate Cancer in Patients Scheduled for Radical Prostatectomy: A Feasibility Study. Eur. Urol. Open Sci. 2020, 19, 20–23. [Google Scholar] [CrossRef] [PubMed]
  80. Fasulo, V.; Buffi, N.M.; Regis, F.; Paciotti, M.; Persico, F.; Maffei, D.; Uleri, A.; Saita, A.; Casale, P.; Hurle, R.; et al. Use of High-Resolution Micro-Ultrasound to Predict Extraprostatic Extension of Prostate Cancer Prior to Surgery: A Prospective Single-Institutional Study. World J. Urol. 2022, 40, 435–442. [Google Scholar] [CrossRef]
  81. Pedraza, A.M.; Parekh, S.; Joshi, H.; Grauer, R.; Wagaskar, V.; Zuluaga, L.; Gupta, R.; Barthe, F.; Nasri, J.; Pandav, K.; et al. Side-Specific, Microultrasound-Based Nomogram for the Prediction of Extracapsular Extension in Prostate Cancer. Eur. Urol. Open Sci. 2023, 48, 72–81. [Google Scholar] [CrossRef]
  82. Frego, N.; Contieri, R.; Fasulo, V.; Maffei, D.; Avolio, P.P.; Arena, P.; Beatrici, E.; Sordelli, F.; De Carne, F.; Lazzeri, M.; et al. Development of a Microultrasound-Based Nomogram to Predict Extra-Prostatic Extension in Patients with Prostate Cancer Undergoing Robot-Assisted Radical Prostatectomy. Urol. Oncol. Semin. Orig. Investig. 2024, 42, 159.e9–159.e16. [Google Scholar] [CrossRef]
  84. Uslu-Beşli, L.; Durmaz, S.; Onay, A.; Bakır, B.; Gürses, İ.; Özel-Yıldız, S.; Demirdağ, Ç.; Sayman, H.B.; et al. Impact of 68Ga-PSMA PET/MRI on the Accuracy of MRI-Derived Grading Systems for Predicting Extraprostatic Extension in Prostate Cancer. Diagnostics 2025, 15, 2405. [Google Scholar] [CrossRef]
  84. Evangelista, L.; Zattoni, F.; Cassarino, G.; Artioli, P.; Cecchin, D.; dal Moro, F.; Zucchetta, P. PET/MRI in Prostate Cancer: A Systematic Review and Meta-Analysis. Eur. J. Nucl. Med. Mol. Imaging 2020, 48, 859–873. [Google Scholar] [CrossRef]
  85. Woo, S.; Ghafoor, S.; Becker, A.S.; Han, S.; Wibmer, A.G.; Hricak, H.; Burger, I.A.; Schöder, H.; Vargas, H.A. Prostate-Specific Membrane Antigen Positron Emission Tomography (PSMA-PET) for Local Staging of Prostate Cancer: A Systematic Review and Meta-Analysis. Eur. J. Hybrid Imaging 2020, 4, 16. [Google Scholar] [CrossRef] [PubMed]
  86. Mari, A.; Cadenar, A.; Giudici, S.; Cianchi, G.; Albisinni, S.; Autorino, R.; Di Maida, F.; Gandaglia, G.; Mir, M.C.; Valerio, M.; et al. A Systematic Review and Meta-Analysis to Evaluate the Diagnostic Accuracy of PSMA PET/CT in the Initial Staging of Prostate Cancer. Prostate Cancer Prostatic Dis. 2024, 28, 56–69. [Google Scholar] [CrossRef] [PubMed]
  88. Tillu, N.; Maheshwari, A.; Kolanukuduru, K.; Choudhary, M.; Agarwal, Y.; Joshi, H.; Goel, S.; Sur, H.; Ben-David, R.; Kaufmann, B.; et al. Predicting Side-Specific Extraprostatic Extension in Prostate Cancer Using an 18F-DCFPyL PSMA-PET/CT–Based Nomogram. Prostate Cancer Prostatic Dis. 2025, 2025, 1–8. [Google Scholar] [CrossRef]
  88. Chimmula, R.R.; Green, M.; Tann, M.; Koch, M.; Boris, R.; Collins, K.; Bahler, C.; Oderinde, O. Comparative Analysis of Machine Learning-Derived Nomogram and Biomarkers in Predicting Side-Specific Extraprostatic Extension: Preliminary Findings. Clin. Imaging 2025, 125, 110556. [Google Scholar] [CrossRef] [PubMed]
  89. Kumar, V.; Gu, Y.; Basu, S.; Berglund, A.; Eschrich, S.A.; Schabath, M.B.; Forster, K.; Aerts, H.J.W.L.; Dekker, A.; Fenstermacher, D.; et al. Radiomics: The Process and the Challenges. Magn. Reson. Imaging 2012, 30, 1234–1248. [Google Scholar] [CrossRef]
  90. Mayerhoefer, M.E.; Materka, A.; Langs, G.; Häggström, I.; Szczypiński, P.; Gibbs, P.; Cook, G. Introduction to Radiomics. J. Nucl. Med. 2020, 61, 488–495. [Google Scholar] [CrossRef]
  91. Cameron, A.; Khalvati, F.; Haider, M.A.; Wong, A. MAPS: A Quantitative Radiomics Approach for Prostate Cancer Detection. IEEE Trans. Biomed. Eng. 2016, 63, 1145–1156. [Google Scholar] [CrossRef]
  92. Van Griethuysen, J.J.M.; Fedorov, A.; Parmar, C.; Hosny, A.; Aucoin, N.; Narayan, V.; Beets-Tan, R.G.H.; Fillion-Robin, J.C.; Pieper, S.; Aerts, H.J.W.L. Computational Radiomics System to Decode the Radiographic Phenotype. Cancer Res. 2017, 77, e104–e107. [Google Scholar] [CrossRef]
  93. Sun, Y.; Reynolds, H.M.; Parameswaran, B.; Wraith, D.; Finnegan, M.E.; Williams, S.; Haworth, A. Multiparametric MRI and Radiomics in Prostate Cancer: A Review. Australas. Phys. Eng. Sci. Med. 2019, 42, 3–25. [Google Scholar] [CrossRef]
  94. Koçak, B.; Durmaz, E.Ş.; Ateş, E.; Kılıçkesmez, Ö. Radiomics with Artificial Intelligence: A Practical Guide for Beginners. Diagn. Interv. Radiol. 2019, 25, 485. [Google Scholar] [CrossRef] [PubMed]
  95. Zhovannik, I.; Bussink, J.; Traverso, A.; Shi, Z.; Kalendralis, P.; Wee, L.; Dekker, A.; Fijten, R.; Monshouwer, R. Learning from Scanners: Bias Reduction and Feature Correction in Radiomics. Clin. Transl. Radiat. Oncol. 2019, 19, 33–38. [Google Scholar] [CrossRef] [PubMed]
  96. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining 2016, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794. [Google Scholar] [CrossRef]
  97. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  98. Schölkopf, B. SVMs—A Practical Consequence of Learning Theory. IEEE Intell. Syst. Their Appl. 1998, 13, 18–21. [Google Scholar] [CrossRef]
  99. Zhang, Y.P.; Zhang, X.Y.; Cheng, Y.T.; Li, B.; Teng, X.Z.; Zhang, J.; Lam, S.; Zhou, T.; Ma, Z.R.; Sheng, J.B.; et al. Artificial Intelligence-Driven Radiomics Study in Cancer: The Role of Feature Engineering and Modeling. Mil. Med. Res. 2023, 10, 22. [Google Scholar] [CrossRef]
  100. Bejani, M.M.; Ghatee, M. Regularized Deep Networks in Intelligent Transportation Systems: A Taxonomy and a Case Study. Artif. Intell. Rev. 2019, 54, 6391–6438. [Google Scholar] [CrossRef]
  101. Yip, S.S.F.; Aerts, H.J.W.L. Applications and Limitations of Radiomics. Phys. Med. Biol. 2016, 61, R150–R166. [Google Scholar] [CrossRef]
  102. Ma, S.; Xie, H.; Wang, H.; Yang, J.; Han, C.; Wang, X.; Zhang, X. Preoperative Prediction of Extracapsular Extension: Radiomics Signature Based on Magnetic Resonance Imaging to Stage Prostate Cancer. Mol. Imaging Biol. 2020, 22, 711–721. [Google Scholar] [CrossRef]
  103. Damascelli, A.; Gallivanone, F.; Cristel, G.; Cava, C.; Interlenghi, M.; Esposito, A.; Brembilla, G.; Briganti, A.; Montorsi, F.; Castiglioni, I.; et al. Advanced Imaging Analysis in Prostate Mri: Building a Radiomic Signature to Predict Tumor Aggressiveness. Diagnostics 2021, 11, 594. [Google Scholar] [CrossRef]
  104. Cuocolo, R.; Stanzione, A.; Faletti, R.; Gatti, M.; Calleris, G.; Fornari, A.; Gentile, F.; Motta, A.; Dell’Aversana, S.; Creta, M.; et al. MRI Index Lesion Radiomics and Machine Learning for Detection of Extraprostatic Extension of Disease: A Multicenter Study. Eur. Radiol. 2021, 31, 7575–7583. [Google Scholar] [CrossRef] [PubMed]
  105. Fan, X.; Xie, N.; Chen, J.; Li, T.; Cao, R.; Yu, H.; He, M.; Wang, Z.; Wang, Y.; Liu, H.; et al. Multiparametric MRI and Machine Learning Based Radiomic Models for Preoperative Prediction of Multiple Biological Characteristics in Prostate Cancer. Front. Oncol. 2022, 12, 839621. [Google Scholar] [CrossRef] [PubMed]
  106. Bai, H.; Xia, W.; Ji, X.; He, D.; Zhao, X.; Bao, J.; Zhou, J.; Wei, X.; Huang, Y.; Li, Q.; et al. Multiparametric Magnetic Resonance Imaging-Based Peritumoral Radiomics for Preoperative Prediction of the Presence of Extracapsular Extension with Prostate Cancer. J. Magn. Reson. Imaging 2021, 54, 1222–1230. [Google Scholar] [CrossRef] [PubMed]
  107. Losnegård, A.; Reisæter, L.A.R.; Halvorsen, O.J.; Jurek, J.; Assmus, J.; Arnes, J.B.; Honoré, A.; Monssen, J.A.; Andersen, E.; Haldorsen, I.S.; et al. Magnetic Resonance Radiomics for Prediction of Extraprostatic Extension in Non-Favorable Intermediate- and High-Risk Prostate Cancer Patients. Acta Radiol. 2020, 61, 1570–1579. [Google Scholar] [CrossRef]
  108. Xu, L.; Zhang, G.; Zhao, L.; Mao, L.; Li, X.; Yan, W.; Xiao, Y.; Lei, J.; Sun, H.; Jin, Z. Radiomics Based on Multiparametric Magnetic Resonance Imaging to Predict Extraprostatic Extension of Prostate Cancer. Front. Oncol. 2020, 10, 533290. [Google Scholar] [CrossRef]
  109. He, D.; Wang, X.; Fu, C.; Wei, X.; Bao, J.; Ji, X.; Bai, H.; Xia, W.; Gao, X.; Huang, Y.; et al. MRI-Based Radiomics Models to Assess Prostate Cancer, Extracapsular Extension and Positive Surgical Margins. Cancer Imaging 2021, 21, 46. [Google Scholar] [CrossRef]
  110. Stanzione, A.; Cuocolo, R.; Cocozza, S.; Romeo, V.; Persico, F.; Fusco, F.; Longo, N.; Brunetti, A.; Imbriaco, M. Detection of Extraprostatic Extension of Cancer on Biparametric MRI Combining Texture Analysis and Machine Learning: Preliminary Results. Acad. Radiol. 2019, 26, 1338–1344. [Google Scholar] [CrossRef]
  112. Shen, D.; Wu, G.; Suk, H.-I. Deep Learning in Medical Image Analysis. Annu. Rev. Biomed. Eng. 2017, 19, 221–248. [Google Scholar] [CrossRef]
  112. Li, Y. Research and Application of Deep Learning in Image Recognition. In Proceedings of the 2022 IEEE 2nd International Conference on Power, Electronics and Computer Applications (ICPECA), Shenyang, China, 21–23 January 2022; Volume 2022, pp. 994–999. [Google Scholar] [CrossRef]
  113. Lundervold, A.S.; Lundervold, A. An Overview of Deep Learning in Medical Imaging Focusing on MRI. Z. Med. Phys. 2019, 29, 102–127. [Google Scholar] [CrossRef]
  114. Mazurowski, M.A.; Buda, M.; Saha, A.; Bashir, M.R. Deep Learning in Radiology: An Overview of the Concepts and a Survey of the State of the Art with Focus on MRI. J. Magn. Reson. Imaging 2019, 49, 939–954. [Google Scholar] [CrossRef]
  115. Hou, Y.; Zhang, Y.H.; Bao, J.; Bao, M.L.; Yang, G.; Shi, H.B.; Song, Y.; Zhang, Y.D. Artificial Intelligence Is a Promising Prospect for the Detection of Prostate Cancer Extracapsular Extension with MpMRI: A Two-Center Comparative Study. Eur. J. Nucl. Med. Mol. Imaging 2021, 48, 3805–3816. [Google Scholar] [CrossRef]
  116. Yao, F.; Lin, H.; Xue, Y.N.; Zhuang, Y.D.; Bian, S.Y.; Zhang, Y.Y.; Yang, Y.J.; Pan, K.H. Multimodal Imaging Deep Learning Model for Predicting Extraprostatic Extension in Prostate Cancer Using MpMRI and 18 F-PSMA-PET/CT. Cancer Imaging 2025, 25, 103. [Google Scholar] [CrossRef]
  117. Priester, A.; Mota, S.M.; Grunden, K.P.; Shubert, J.; Richardson, S.; Sisk, A.; Felker, E.R.; Sayre, J.; Marks, L.S.; Natarajan, S.; et al. Extracapsular Extension Risk Assessment Using an Artificial Intelligence Prostate Cancer Mapping Algorithm. BJUI Compass 2024, 5, 1100–1111. [Google Scholar] [CrossRef] [PubMed]
  118. Khosravi, P.; Saikali, S.; Alipour, A.; Mohammadi, S.; Boger, M.; Diallo, D.M.; Smith, C.J.; Moschovas, M.C.; Hajirasouliha, I.; Hung, A.J.; et al. AutoRadAI: A Versatile Artificial Intelligence Framework Validated for Detecting Extracapsular Extension in Prostate Cancer. Biol. Methods Protoc. 2025, 10, bpaf032. [Google Scholar] [CrossRef] [PubMed]
  119. Moroianu, Ş.L.; Bhattacharya, I.; Seetharaman, A.; Shao, W.; Kunder, C.A.; Sharma, A.; Ghanouni, P.; Fan, R.E.; Sonn, G.A.; Rusu, M. Computational Detection of Extraprostatic Extension of Prostate Cancer on Multiparametric MRI Using Deep Learning. Cancers 2022, 14, 2821. [Google Scholar] [CrossRef] [PubMed]
  120. Renard, F.; Guedria, S.; Palma, N.D.; Vuillerme, N. Variability and Reproducibility in Deep Learning for Medical Image Segmentation. Sci. Rep. 2020, 10, 13724. [Google Scholar] [CrossRef]
  121. Sonn, G.A.; Fan, R.E.; Ghanouni, P.; Wang, N.N.; Brooks, J.D.; Loening, A.M.; Daniel, B.L.; To’o, K.J.; Thong, A.E.; Leppert, J.T. Prostate Magnetic Resonance Imaging Interpretation Varies Substantially Across Radiologists. Eur. Urol. Focus 2019, 5, 592–599. [Google Scholar] [CrossRef]
  122. Castelvecchi, D. Can We Open the Black Box of AI? Nat. News 2016, 538, 20. [Google Scholar] [CrossRef]
  123. Yu, A.C.; Mohajer, B.; Eng, J. External Validation of Deep Learning Algorithms for Radiologic Diagnosis: A Systematic Review. Radiol. Artif. Intell. 2022, 4, e210064. [Google Scholar] [CrossRef]
  124. Ching, T.; Himmelstein, D.S.; Beaulieu-Jones, B.K.; Kalinin, A.A.; Do, B.T.; Way, G.P.; Ferrero, E.; Agapow, P.M.; Zietz, M.; Hoffman, M.M.; et al. Opportunities and Obstacles for Deep Learning in Biology and Medicine. J. R. Soc. Interface 2018, 15, 64908. [Google Scholar] [CrossRef]
  125. Ji, S.; Zeng, C.; Zhang, Y.; Duan, Y. An Evaluation of Conventional and Deep Learning-Based Image-Matching Methods on Diverse Datasets. Photogramm. Rec. 2023, 38, 137–159. [Google Scholar] [CrossRef]
  126. Gong, Z.; Zhong, P.; Hu, W. Diversity in Machine Learning. IEEE Access 2019, 7, 64323–64350. [Google Scholar] [CrossRef]
  127. Rudzicz, F.; Joshi, S. Explainable AI for the Operating Theater. In Digital Surgery; Springer: Cham, Switzerland, 2021; pp. 339–350. [Google Scholar] [CrossRef]
  128. Qian, J.; Li, H.; Wang, J.; He, L. Recent Advances in Explainable Artificial Intelligence for Magnetic Resonance Imaging. Diagnostics 2023, 13, 1571. [Google Scholar] [CrossRef] [PubMed]
  129. Shin, D. The Effects of Explainability and Causability on Perception, Trust, and Acceptance: Implications for Explainable AI. Int. J. Hum. Comput. Stud. 2021, 146, 102551. [Google Scholar] [CrossRef]
  131. Salgia, R.; Jolly, M.K.; Kulkarni, P.; Rangarajan, G. Cancer Systems Biology: Translational Mathematical Oncology. Available online: https://books.google.pl/books?hl=en&lr=&id=Wa2GEQAAQBAJ&oi=fnd&pg=PA187&ots=yKdypZxS7C&sig=nir4QwG0KT0FgJql-OD5uqQgqo4&redir_esc=y#v=onepage&q&f=false (accessed on 29 December 2025).
  131. Lundberg, S.M.; Nair, B.; Vavilala, M.S.; Horibe, M.; Eisses, M.J.; Adams, T.; Liston, D.E.; Low, D.K.W.; Newman, S.F.; Kim, J.; et al. Explainable Machine-Learning Predictions for the Prevention of Hypoxaemia during Surgery. Nat. Biomed. Eng. 2018, 2, 749–760. [Google Scholar] [CrossRef]
  132. Rashid, M.; Singh, H.; Goyal, V. The Use of Machine Learning and Deep Learning Algorithms in Functional Magnetic Resonance Imaging—A Systematic Review. Expert Syst. 2020, 37, e12644. [Google Scholar] [CrossRef]
  133. Latif, J.; Xiao, C.; Imran, A.; Tu, S. Medical Imaging Using Machine Learning and Deep Learning Algorithms: A Review. In Proceedings of the 2019 2nd International Conference on Computing, Mathematics and Engineering Technologies, iCoMET, Sukkur, Pakistan, 30–31 January 2019. [Google Scholar] [CrossRef]
  134. Jha, S.; Topol, E.J. Adapting to Artificial Intelligence: Radiologists and Pathologists as Information Specialists. JAMA 2016, 316, 2353–2354. [Google Scholar] [CrossRef]
  135. Saha, A.; Bosma, J.S.; Twilt, J.J.; van Ginneken, B.; Bjartell, A.; Padhani, A.R.; Bonekamp, D.; Villeirs, G.; Salomon, G.; Giannarini, G.; et al. Artificial Intelligence and Radiologists in Prostate Cancer Detection on MRI (PI-CAI): An International, Paired, Non-Inferiority, Confirmatory Study. Lancet Oncol. 2024, 25, 879–887. [Google Scholar] [CrossRef]
  136. Syer, T.; Mehta, P.; Antonelli, M.; Mallett, S.; Atkinson, D.; Ourselin, S.; Punwani, S. Artificial Intelligence Compared to Radiologists for the Initial Diagnosis of Prostate Cancer on Magnetic Resonance Imaging: A Systematic Review and Recommendations for Future Studies. Cancers 2021, 13, 3318. [Google Scholar] [CrossRef]
  137. Giganti, F.; Panebianco, V.; Tempany, C.M.; Purysko, A.S. Is Artificial Intelligence Replacing Our Radiology Stars in Prostate Magnetic Resonance Imaging? The Stars Do Not Look Big, But They Can Look Brighter. Eur. Urol. Open Sci. 2022, 48, 12. [Google Scholar] [CrossRef]
  138. Brembilla, G.; Dell’Oglio, P.; Stabile, A.; Damascelli, A.; Brunetti, L.; Ravelli, S.; Cristel, G.; Schiani, E.; Venturini, E.; Grippaldi, D.; et al. Interreader Variability in Prostate MRI Reporting Using Prostate Imaging Reporting and Data System Version 2.1. Eur. Radiol. 2020, 30, 3383–3392. [Google Scholar] [CrossRef] [PubMed]
  139. AlDubayan, S.H.; Conway, J.R.; Camp, S.Y.; Witkowski, L.; Kofman, E.; Reardon, B.; Han, S.; Moore, N.; Elmarakeby, H.; Salari, K.; et al. Detection of Pathogenic Variants with Germline Genetic Testing Using Deep Learning vs Standard Methods in Patients With Prostate Cancer and Melanoma. JAMA 2020, 324, 1957–1969. [Google Scholar] [CrossRef] [PubMed]
  140. Wong, E.Y.; Chu, T.N.; Ladi-Seyedian, S.S. Genomics and Artificial Intelligence: Prostate Cancer. Urol. Clin. N. Am. 2024, 51, 27–33. [Google Scholar] [CrossRef] [PubMed]
  141. Johann, L.I.; Guangming, Z.H.U.; Hua, C.; Feng, M.; Bennamoun, B.; Ping, L.I.; Xiaoyuan, L.U.; Song, J.; Shen, P.; Xu, X.U.; et al. A Systematic Collection of Medical Image Datasets for Deep Learning. ACM Comput. Surv. 2024, 56, 3615862. [Google Scholar] [CrossRef]
  142. Cho, J.; Lee, K.; Shin, E.; Choy, G.; Do, S. How Much Data Is Needed to Train a Medical Image Deep Learning System to Achieve Necessary High Accuracy? arXiv 2015, arXiv:1511.06348. [Google Scholar]
  143. Rice, L.; Wong, E.; Kolter, J.Z. Overfitting in Adversarially Robust Deep Learning. In International Conference on Machine Learning; PMLR: Boston, MA, USA, 2020; pp. 8093–8104. [Google Scholar]
  144. Thanapol, P.; Lavangnananda, K.; Bouvry, P.; Pinel, F.; Leprevost, F. Reducing Overfitting and Improving Generalization in Training Convolutional Neural Network (CNN) under Limited Sample Sizes in Image Recognition. In Proceedings of the CIT 2020—5th International Conference on Information Technology, Chonburi, Thailand, 21–22 October 2020; pp. 300–305. [Google Scholar] [CrossRef]
  145. Liu, X.; Faes, L.; Kale, A.U.; Wagner, S.K.; Fu, D.J.; Bruynseels, A.; Mahendiran, T.; Moraes, G.; Shamdas, M.; Kern, C.; et al. A Comparison of Deep Learning Performance against Health-Care Professionals in Detecting Diseases from Medical Imaging: A Systematic Review and Meta-Analysis. Lancet Digit. Health 2019, 1, e271–e297. [Google Scholar] [CrossRef]
  146. Collins, G.S.; Reitsma, J.B.; Altman, D.G.; Moons, K.G.M. Transparent Reporting of a Multivariable Prediction Model for Individual Prognosis Or Diagnosis (TRIPOD): The TRIPOD Statement. BJS 2015, 102, 148–158. [Google Scholar] [CrossRef]
  147. Collins, G.S.; Moons, K.G.M. Reporting of Artificial Intelligence Prediction Models. Lancet 2019, 393, 1577–1579. [Google Scholar] [CrossRef]
  148. Guni, A.; Sounderajah, V.; Whiting, P.; Bossuyt, P.; Darzi, A.; Ashrafian, H. Revised Tool for the Quality Assessment of Diagnostic Accuracy Studies Using AI (QUADAS-AI): Protocol for a Qualitative Study. JMIR Res. Protoc. 2024, 13, 58202. [Google Scholar] [CrossRef]
  149. Moons, K.G.M.; Damen, J.A.A.; Kaul, T.; Hooft, L.; Andaur Navarro, C.; Dhiman, P.; Beam, A.L.; Van Calster, B.; Celi, L.A.; Denaxas, S.; et al. PROBAST+AI: An Updated Quality, Risk of Bias, and Applicability Assessment Tool for Prediction Models Using Regression or Artificial Intelligence Methods. BMJ 2025, 388, e082505. [Google Scholar] [CrossRef]
  150. Wolff, R.F.; Moons, K.G.M.; Riley, R.D.; Whiting, P.F.; Westwood, M.; Collins, G.S.; Reitsma, J.B.; Kleijnen, J.; Mallett, S. PROBAST: A Tool to Assess the Risk of Bias and Applicability of Prediction Model Studies. Ann. Intern. Med. 2019, 170, 51–58. [Google Scholar] [CrossRef]
  151. Fouladi, S.; Darvizeh, F.; Gianini, G.; Di Meo, R.; Di Palma, L.; Damiani, E.; Maiocchi, A.; Fazzini, D.; Alì, M. Exploring UNet-Based Models for Prostate Lesion Segmentation from Multi-Sequence MRI (T2W, ADC, DWI). World Wide Web 2025, 29, 4. [Google Scholar] [CrossRef]
  152. Milletari, F.; Navab, N.; Ahmadi, S.A. V-Net: Fully Convolutional Neural Networks for Volumetric Medical Image Segmentation. In Proceedings of the 2016 4th International Conference on 3D Vision, 3DV, Stanford, CA, USA, 25–28 October 2016; pp. 565–571. [Google Scholar] [CrossRef]
  153. Bhandary, S.; Kuhn, D.; Babaiee, Z.; Fechter, T.; Benndorf, M.; Zamboglou, C.; Grosu, A.L.; Grosu, R. Investigation and Benchmarking of U-Nets on Prostate Segmentation Tasks. Comput. Med. Imaging Graph. 2023, 107, 102241. [Google Scholar] [CrossRef] [PubMed]
  154. Comelli, A.; Dahiya, N.; Stefano, A.; Vernuccio, F.; Portoghese, M.; Cutaia, G.; Bruno, A.; Salvaggio, G.; Yezzi, A. Deep Learning-Based Methods for Prostate Segmentation in Magnetic Resonance Imaging. Appl. Sci. 2021, 11, 782. [Google Scholar] [CrossRef] [PubMed]
  155. Ullrich, T.; Quentin, M.; Oelers, C.; Dietzel, F.; Sawicki, L.M.; Arsov, C.; Rabenalt, R.; Albers, P.; Antoch, G.; Blondin, D.; et al. Magnetic Resonance Imaging of the Prostate at 1.5 versus 3.0 T: A Prospective Comparison Study of Image Quality. Eur. J. Radiol. 2017, 90, 192–197. [Google Scholar] [CrossRef] [PubMed]
  156. Woernle, A.; Englman, C.; Dickinson, L.; Kirkham, A.; Punwani, S.; Haider, A.; Freeman, A.; Kasivisivanathan, V.; Emberton, M.; Hines, J.; et al. Picture Perfect: The Status of Image Quality in Prostate MRI. J. Magn. Reson. Imaging 2024, 59, 1930–1952. [Google Scholar] [CrossRef]
  157. Giganti, F.; Allen, C.; Emberton, M.; Moore, C.M.; Kasivisvanathan, V. Prostate Imaging Quality (PI-QUAL): A New Quality Control Scoring System for Multiparametric Magnetic Resonance Imaging of the Prostate from the PRECISION Trial. Eur. Urol. Oncol. 2020, 3, 615–619. [Google Scholar] [CrossRef]
  158. Danneman, D.; Wiklund, F.; Wiklund, N.P.; Egevad, L. Prognostic Significance of Histopathological Features of Extraprostatic Extension of Prostate Cancer. Histopathology 2013, 63, 580–589. [Google Scholar] [CrossRef]
  159. Lazzereschi, L.; Birks, J.; Colling, R. Does the Extent of Extraprostatic Extension at Radical Prostatectomy Predict Outcome?—A Systematic Review and Meta-Analysis. Histopathology 2024, 85, 727–742. [Google Scholar] [CrossRef]
  160. Mehralivand, S.; Shih, J.H.; Harmon, S.; Smith, C.; Bloom, J.; Czarniecki, M.; Gold, S.; Hale, G.; Rayn, K.; Merino, M.J.; et al. A Grading System for the Assessment of Risk of Extraprostatic Extension of Prostate Cancer at Multiparametric MRI. Radiology 2019, 290, 709–719. [Google Scholar] [CrossRef]
Figure 1. Overview of radiomics-based machine learning versus deep learning approaches for prostate MRI analysis.
Figure 2. Panel (A) shows a conventional deep learning model that generates an EPE prediction from MRI input without insight into the decision process. Panel (B) illustrates the same model augmented with explainable AI, providing additional information on the factors contributing to the prediction.
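One widely used, model-agnostic way to produce the kind of explanation Figure 2B depicts is occlusion sensitivity: mask portions of the input and record how much the model's EPE score changes. The sketch below is a toy illustration under stated assumptions — the `model` is a stand-in function scoring one image quadrant, not an actual network, and all names and sizes are hypothetical.

```python
import numpy as np

def occlusion_map(image, model, patch=4):
    """Toy occlusion sensitivity: zero out patches of the input and record
    how much the model's score drops; large drops mark influential regions."""
    base = model(image)
    h, w = image.shape
    heat = np.zeros_like(image)
    for i in range(0, h, patch):
        for j in range(0, w, patch):
            masked = image.copy()
            masked[i:i + patch, j:j + patch] = 0.0
            heat[i:i + patch, j:j + patch] = base - model(masked)
    return heat

# Stand-in "model": scores the mean intensity of the upper-left quadrant,
# so occluding that quadrant should change the score the most.
model = lambda img: img[:4, :4].mean()
img = np.ones((8, 8))
heat = occlusion_map(img, model)
print(heat[:4, :4].mean() > heat[4:, 4:].mean())  # -> True
```

A real explainable-AI pipeline would apply the same idea (or gradient-based saliency) to a trained network's EPE probability rather than a toy scoring function.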
Figure 3. Recommended future directions for new EPE prediction models.
Table 1. Clinical parameters used in EPE prediction.
| Parameter | Description | Association with EPE |
|---|---|---|
| PSA level | Serum prostate-specific antigen (ng/mL) | Higher PSA → increased risk |
| Age | Patient age at diagnosis | Older age → higher risk |
| PSAD | PSA ÷ prostate volume (ng/mL/cc) | Higher density → increased risk |
| cT stage | Clinical T stage (T1c–T3) | ≥T2b/T2c → higher risk |
| Biopsy Gleason score/grade group | Histologic grade from biopsy | Higher grade → higher risk |
| Number of positive cores | Absolute number | More cores → higher risk |
| % positive biopsy cores | Positive cores ÷ total cores | Higher % → increased risk |
| Perineural invasion | Presence on biopsy | Associated with EPE |
| Lymphovascular invasion | Presence on biopsy | Associated with EPE |
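Of the parameters above, PSA density is the only derived quantity, so its arithmetic is worth pinning down (mL and cc are equivalent units here). A minimal sketch, with the function name and example values as illustrative assumptions rather than anything from the review:

```python
def psa_density(psa_ng_ml: float, prostate_volume_cc: float) -> float:
    """PSA density (PSAD): serum PSA divided by prostate volume, in ng/mL/cc."""
    if prostate_volume_cc <= 0:
        raise ValueError("prostate volume must be positive")
    return psa_ng_ml / prostate_volume_cc

# Example: PSA 8.4 ng/mL, prostate volume 42 cc
print(round(psa_density(8.4, 42.0), 2))  # -> 0.2
```

Cut-offs in the region of 0.10–0.15 ng/mL/cc are commonly used in risk stratification, although no single threshold is universally standardized.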
Table 2. mpMRI parameters used in EPE prediction.
| Parameter | Description | Diagnostic Value | Comments |
|---|---|---|---|
| Breach of the capsule | Disruption of the capsule with direct tumor infiltration | Very high specificity | Detects mainly macroscopic EPE outside the prostate |
| Tumor–capsule contact length (TCL) | Length of contact between the tumor and the prostate capsule | Increased EPE risk | No standardized threshold |
| Tumor size | Tumor diameter | Improves sensitivity but reduces specificity | No impact on overall accuracy |
| ADC entropy | Tissue heterogeneity on MRI | Improves sensitivity but reduces specificity | No impact on overall accuracy |
| Capsular enhancement sign | Early enhancement of the prostate capsule adjacent to the tumor | Increases diagnostic accuracy | |
| Rectoprostatic angle obliteration | Obliteration of the acute angle between the posterior prostate capsule and the anterior rectal wall | Increased EPE risk | |
| Asymmetry/obliteration of the neurovascular bundle | Abnormal appearance of the NVB | Increased EPE risk | |
| Periprostatic fat infiltration | Visible tumor tissue projecting into the surrounding periprostatic fat | Increased EPE risk | |
| Capsular bulging/irregularity | A contour abnormality of the prostate capsule | Increased EPE risk | Highest diagnostic performance for EPE prediction when combined with clinical parameters |
| TCA1 | Tumor dimensions across two planes | Increased EPE risk | Tested on cT2N0M0 patients |
| TCA2 | Tumor contact area within the MRI volume | Increased EPE risk | Tested on cT2N0M0 patients |
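The ADC entropy listed in the table is a first-order (histogram) texture feature quantifying tissue heterogeneity. A minimal numpy sketch of how such a feature is typically computed — the bin count and example values are illustrative assumptions, not parameters from the cited studies:

```python
import numpy as np

def adc_entropy(adc_values: np.ndarray, bins: int = 32) -> float:
    """First-order histogram (Shannon) entropy of an ADC-map region, in bits.
    Higher entropy corresponds to more heterogeneous tissue."""
    counts, _ = np.histogram(adc_values, bins=bins)
    p = counts / counts.sum()
    p = p[p > 0]                       # drop empty bins; 0*log(0) := 0
    return float(-(p * np.log2(p)).sum())

# A perfectly uniform region has zero entropy; a heterogeneous one is higher.
homogeneous = np.full(1000, 1.2e-3)
heterogeneous = np.random.default_rng(0).normal(1.2e-3, 3e-4, 1000)
print(adc_entropy(homogeneous) < adc_entropy(heterogeneous))  # -> True
```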
Table 3. Non-side-specific nomograms.

| Nomogram | Parameters | AUC Before Validation | AUC (External Validation) |
|---|---|---|---|
| Partin tables | PSA, Gleason score, cT stage | 0.724 [68] | 0.61 [65]; 0.22 [69]; 0.71 [70]; 0.61 [71]; 0.67 [72]; 0.61 [73] |
| MSKCC | PSA, age, Gleason score, cT stage, biopsy cores | 0.71 [9] | 0.68 [65]; 0.76 [72]; 0.723 [74]; 0.74 [73] |
| CAPRA score | PSA, Gleason score, cT stage, % positive cores, age | 0.66 [57] | 0.704 [74] |
| Gandaglia without MRI | PSA, ISUP, cT stage, % positive cores | 0.67 [65] | none |
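The AUC values reported in these tables are areas under the ROC curve, which can be read as the probability that a randomly chosen EPE-positive patient receives a higher nomogram score than a randomly chosen EPE-negative patient (the Mann–Whitney interpretation). A small sketch with entirely made-up scores illustrates the computation:

```python
import numpy as np

def auc(scores_pos, scores_neg):
    """AUC via the Mann-Whitney U statistic: probability that a positive
    case outscores a negative one, counting ties as half a win."""
    pos = np.asarray(scores_pos, dtype=float)
    neg = np.asarray(scores_neg, dtype=float)
    wins = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (wins + 0.5 * ties) / (len(pos) * len(neg))

# Hypothetical nomogram outputs (predicted EPE probability):
epe_present = [0.81, 0.64, 0.72, 0.55]
epe_absent = [0.40, 0.58, 0.30, 0.22, 0.61]
print(round(auc(epe_present, epe_absent), 2))  # -> 0.9
```

An AUC of 0.5 corresponds to a non-informative model; the externally validated values of roughly 0.6–0.8 in Tables 3 and 4 therefore indicate moderate, imperfect discrimination.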
Table 4. Side-specific nomograms.

| Nomogram | Parameters | AUC Before Validation | AUC (External Validation) |
|---|---|---|---|
| Martini | PSA, Gleason grade, % core involvement, EPE at MRI | 0.821 [58] | 0.74 [62]; 0.78 [34]; 0.78 [59]; 0.68 [75]; 0.75 [73] |
| Gandaglia with MRI | PSA, Gleason grade, % positive cores, EPE at MRI, max. diameter of the index lesion at MRI | 0.70 [65] | none |
| Wibmer | Age, PSA, PSAD, ISUP, % positive cores, max. tumor extent, PI-RADS, max. lesion diameter, length of capsular contact, presence of EPE | 0.828 [61] | 0.72 [62] |
| Nyarangi-Dix | PSA, cT stage, prostate volume, ISUP, ESUR criteria, capsule contact length | 0.87 [60] | 0.76 [62] |
| Soeterik | PSAD, clinical stage on MRI, ISUP | 0.77–0.83 [63] | 0.75 [62]; 0.80 [60]; 0.69 [76]; 0.80 [77]; 0.81 [73] |
| Sayyid | Age, PSA, prostate volume, palpable nodule on DRE, hypoechoic nodule on TRUS, max. core involvement, % positive cores, ISUP | 0.74 [66] | 0.75 [78]; 0.75 [76]; 0.77 [73] |
Table 5. Comparison of basic features of Radiomics + Machine Learning models with Deep Learning Models.
| Category | Radiomics + Machine Learning | Deep Learning |
|---|---|---|
| Feature extraction | Based on handcrafted features extracted after segmentation | Features are learned automatically from raw images |
| Workflow complexity | Requires multiple steps: segmentation, feature extraction, feature selection, and modelling | End-to-end: a single network is trained directly on the images |
| Explainability | Relatively transparent; features can be interpreted clinically | Often a black box; requires explainable AI methods for interpretation |
| Data dependency | Can be applied with limited datasets, but sensitive to feature engineering | Requires large, well-annotated datasets to perform reliably |
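The contrast in Table 5 can be made concrete: in the radiomics path, quantities such as mean intensity, standard deviation, and histogram entropy are computed explicitly from a segmented lesion before any classifier sees them, whereas a deep network consumes raw voxels directly. Below is a minimal sketch of the handcrafted-feature step only; the function name, array sizes, and values are illustrative assumptions, not the feature set of any published model.

```python
import numpy as np

def handcrafted_features(region: np.ndarray) -> np.ndarray:
    """Radiomics path: explicit, clinically interpretable features computed
    from a segmented lesion (the segmentation/extraction steps of Table 5)."""
    counts, _ = np.histogram(region, bins=16)
    p = counts[counts > 0] / region.size
    entropy = -(p * np.log2(p)).sum()          # first-order texture feature
    return np.array([region.mean(), region.std(), entropy])

# Deep learning path (conceptual contrast): no explicit feature step --
# a network maps raw voxels straight to a prediction, e.g.
#   prob_epe = model(raw_mri_volume)           # pseudocode, hypothetical model

rng = np.random.default_rng(42)
lesion = rng.normal(1.0, 0.2, size=(8, 8))     # toy stand-in for a segmented ROI
feats = handcrafted_features(lesion)
print(feats.shape)  # -> (3,)
```

The explicit feature vector is what makes the radiomics path comparatively transparent: each entry has a clinical reading, at the cost of a multi-step pipeline sensitive to segmentation and feature-engineering choices.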

Share and Cite

MDPI and ACS Style

Stępka, J.; Milecki, T.; Ksepka, J.; Kujawska, A.; Hendrysiak, J.; Cieślikowski, W.A. Contemporary Preoperative Detection of Extraprostatic Extension in Prostate Cancer. Cancers 2026, 18, 456. https://doi.org/10.3390/cancers18030456
