Search Results (400)

Search Parameters:
Keywords = probabilistic distribution functions

32 pages, 3633 KB  
Article
Reliability Analysis of Turbine Blade–Disk Dovetail Joints Considering Failure Correlation
by Shaohua Wang, Hua Yuan, Xi Liu, Rongqiao Wang, Gaoxiang Chen and Dianyin Hu
Crystals 2026, 16(4), 257; https://doi.org/10.3390/cryst16040257 - 11 Apr 2026
Abstract
The service environment of the turbine blade–disk dovetail joint structure in aero-engines is complex. Uncertainties in material properties and geometry, as well as the failure correlations among multiple locations or components, make reliability assessment challenging. First, a probabilistic life modeling method based on linear heteroscedastic regression is proposed, and the Manson–Coffin probabilistic life models of DD6 and FGH96 alloys at 650 °C are established. Then, the Copula function is introduced to characterize the failure dependence structure, and the effectiveness of the method is verified through numerical examples. Fatigue-critical locations of the dovetail are identified, and a Kriging surrogate model is established to obtain the probabilistic stress distribution at the critical locations. Subsequently, the Copula method is employed to conduct reliability analysis of dovetail structures. The results show that the reliability of multiple dovetails considering correlation lies between that of a single dovetail and that under the assumption of complete independence. Moreover, the life of the entire disk dovetail structure is significantly influenced by the number of dovetails and the required reliability level. Finally, the study is extended to the blade–disk dovetail multi-component system. The results indicate that when correlation is considered, the reliability of both components decreases, and the overall structural life is dominated by the dovetail component with the lower life. The analytical method proposed in this paper provides theoretical support and engineering reference for the reliability design and life assessment of aero-engine rotor structures.
(This article belongs to the Special Issue Fatigue and Fracture of Crystalline Metal Structures)
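
As a rough illustration of the failure-correlation idea in this abstract, the sketch below compares the failure probability of a two-location series system under a Gaussian copula against the complete-independence assumption. The marginal failure probabilities and the correlation are illustrative values, not the paper's data.

```python
# Minimal sketch of correlated-failure reliability via a Gaussian copula.
# Marginal failure probabilities and rho are illustrative assumptions.
import numpy as np
from scipy.stats import norm, multivariate_normal

p1, p2 = 0.01, 0.015          # marginal failure probabilities of two locations
rho = 0.7                     # assumed failure correlation (copula parameter)

# Map marginals to standard-normal thresholds.
z1, z2 = norm.ppf(p1), norm.ppf(p2)

# Joint failure probability P(F1 and F2) under a Gaussian copula.
cov = [[1.0, rho], [rho, 1.0]]
p_joint = multivariate_normal(mean=[0, 0], cov=cov).cdf([z1, z2])

# Series system (fails if either location fails): inclusion-exclusion.
p_sys_corr = p1 + p2 - p_joint
p_sys_indep = p1 + p2 - p1 * p2   # complete-independence assumption

print(f"correlated:  {p_sys_corr:.5f}")
print(f"independent: {p_sys_indep:.5f}")
```

With positive correlation the series failure probability drops below the independence bound, qualitatively consistent with the abstract's finding that correlated reliability lies between the single-dovetail and fully independent cases.
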
24 pages, 2997 KB  
Article
A Controllability-Based Reliability Framework for Mechanical Systems with Scenario-Driven Performance Evaluation
by Daniel Osezua Aikhuele and Shahryar Sorooshian
Appl. Syst. Innov. 2026, 9(4), 72; https://doi.org/10.3390/asi9040072 - 27 Mar 2026
Abstract
In classical reliability engineering, failure is treated as probabilistic structural failure based on lifetime distributions such as Weibull models. However, in control-critical mechanical systems, functional failure can occur before material failure as a result of control power loss. This paper proposes a Controllability–Reliability Coupling (CRC) model, which redefines reliability as stabilizability in the face of progressive degradation. Actuator deterioration is modeled using a time-varying input effectiveness factor α(t), and the actuator is considered failed when the minimum singular value of the finite-horizon controllability Gramian falls below a stabilizability threshold ε. Simulations indicate that functional failure is a precursor of structural failure under several degradation conditions. A baseline comparison shows that the CRC metric forecasts loss of controllability at T_CRC = 17.0 s, whereas the classical Weibull reliability never reaches the structural failure threshold within the 20 s horizon. The system retains its Lyapunov stability and H-infinity robustness margins, remaining stable and attenuating disturbances even as control authority is lost. In practical degradation scenarios, the forecasted CRC failure times are 21.5 s (linear wear), 13.1 s (accelerated fatigue), 23.7 s (intermittent faults), and 24.4 s (shock damage), whereas maintenance recovery averted functional failure entirely. In a case study of an industrial robotic joint, functional collapse occurred at 27.0 s while structural reliability remained above the failure threshold. The findings support the hypothesis that structural survival and functional controllability are distinct concepts. The proposed CRC framework offers a control-aware reliability measure that can detect early failures and provide proactive maintenance guidance in cyber–physical systems.
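
The CRC criterion described above is easy to prototype: compute a finite-horizon controllability Gramian with a degraded input matrix and flag failure once its minimum singular value crosses a threshold. Everything in the sketch below (the plant, α(t), the horizon, and ε) is an illustrative assumption, not the paper's model.

```python
# Sketch of the CRC idea: track the minimum singular value of a finite-horizon
# controllability Gramian while an input-effectiveness factor alpha(t) decays.
import numpy as np
from scipy.linalg import expm

A = np.array([[0.0, 1.0], [-2.0, -0.5]])   # toy second-order plant
B = np.array([[0.0], [1.0]])
eps = 1e-3                                  # assumed stabilizability threshold
alpha = lambda t: np.exp(-0.15 * t)         # assumed wear-like degradation

def gramian_min_sv(t_now, horizon=2.0, n=200):
    """sigma_min of W = int_0^H e^{As} B_eff B_eff^T e^{A^T s} ds at time t_now."""
    B_eff = alpha(t_now) * B
    W = np.zeros((2, 2))
    for s in np.linspace(0.0, horizon, n):
        E = expm(A * s) @ B_eff
        W += E @ E.T * (horizon / n)        # simple Riemann-sum quadrature
    return np.linalg.svd(W, compute_uv=False).min()

for t in np.arange(0.0, 30.0, 1.0):
    if gramian_min_sv(t) < eps:
        print(f"functional (controllability) failure flagged at t = {t:.1f} s")
        break
```
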
25 pages, 3347 KB  
Article
Variational Bayesian-Based Reliability Evaluation of Nonlinear Structures by Active Learning Gaussian Process Modeling
by Wei-Chao Hou, Yu Xin, Ding-Tang Wang, Zuo-Cai Wang and Zong-Zu Liu
Infrastructures 2026, 11(4), 118; https://doi.org/10.3390/infrastructures11040118 - 27 Mar 2026
Abstract
In this study, variational Bayesian inference (VBI) with Gaussian mixture models is applied to update models of nonlinear structures, and the calibrated model is then employed to estimate the failure probability of structures using a subset simulation (SS) algorithm. To improve the computational efficiency of probabilistic nonlinear model updating, a Gaussian Process (GP) model is used to construct a surrogate likelihood function in Bayesian inference via an active learning algorithm, and Gaussian mixture models (GMMs) are employed to approximate the unknown posterior probability density functions (PDFs) of the model parameters. The optimized hyperparameters of the GMMs are obtained by maximizing the evidence lower bound (ELBO), and a stochastic gradient search method is used to solve this optimization problem. Based on the optimized hyperparameters, the posterior distributions of the model parameters can be approximated by a combination of multiple Gaussian components. Subsequently, the SS algorithm is used to calculate the earthquake-induced failure probability of structures based on the calibrated nonlinear model. To verify the feasibility and effectiveness of the proposed method, a numerical simulation of a two-span bridge structure subjected to seismic excitations is developed. The proposed strategy is further applied to estimate the failure probability of a scaled monolithic column structure subjected to bi-directional earthquake excitations. Both numerical and experimental results indicate that the proposed method is feasible and effective for probabilistic nonlinear model updating, and the updated model significantly enhances the accuracy of structural failure probability predictions.
(This article belongs to the Section Infrastructures and Structural Engineering)
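
The subset simulation ingredient can be sketched independently of the VBI-calibrated model: a toy limit state on standard-normal inputs, with intermediate levels set by sample quantiles and conditional sampling done by Metropolis steps. All constants below are illustrative, and the true failure probability is known in closed form for comparison.

```python
# Toy subset simulation estimating a small failure probability P(g(X) >= b)
# for standard-normal inputs. The limit state g and tuning values are
# illustrative; the paper couples SS with a VBI-calibrated nonlinear model.
import numpy as np
from math import erfc

rng = np.random.default_rng(0)
g = lambda x: x.sum(axis=1)              # toy response; failure when g >= b
b, p0, N = 7.0, 0.1, 5000                # threshold, level probability, samples

x = rng.standard_normal((N, 2))
pf = 1.0
for _ in range(30):                      # level cap; breaks much earlier
    y = g(x)
    b_i = np.quantile(y, 1.0 - p0)
    if b_i >= b:                         # final level: count true failures
        pf *= np.mean(y >= b)
        break
    pf *= p0
    seeds = x[y > b_i]                   # samples already in the intermediate set
    x = seeds[rng.integers(0, len(seeds), N)]
    for _ in range(5):                   # Metropolis sweeps targeting N(0,I)|F_i
        cand = x + 0.8 * rng.standard_normal(x.shape)
        ratio = np.exp(0.5 * ((x**2).sum(1) - (cand**2).sum(1)))
        ok = (rng.random(N) < np.minimum(1.0, ratio)) & (g(cand) > b_i)
        x[ok] = cand[ok]

exact = 0.5 * erfc(b / 2.0)              # P(N(0,2) >= 7), since X1+X2 ~ N(0,2)
print(f"subset simulation: {pf:.2e}   exact: {exact:.2e}")
```
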
22 pages, 3540 KB  
Article
A Method for Probability Forecasting of Daily Photovoltaic Power Output Based on Multivariate Dynamic Copula Functions and Reinforcement Learning
by Jun Zhao, Liang Wang, Chaoying Yang, Zhijun Zhao, Haonan Dai and Fei Wang
Electronics 2026, 15(7), 1387; https://doi.org/10.3390/electronics15071387 - 26 Mar 2026
Abstract
Accurate photovoltaic power probability forecasting assists dispatch departments in making rational decisions. Joint probability distributions constructed using Copula functions can flexibly characterize complex nonlinear correlations and tail dependencies among random variables. However, existing research has not thoroughly explored the multivariate dynamic coupling characteristics related to forecasting errors, nor has it sufficiently considered the complementary advantages among different Copula functions. To address this, we propose a method for probabilistic forecasting of photovoltaic power output days in advance, integrating multivariate dynamic Copula functions with reinforcement learning. First, to capture the time-varying structure of photovoltaic power-related variables, we introduce a sliding time window for segmented modeling of historical data, fitting marginal probability distributions for predicted irradiance, forecast power, and forecast error. Second, joint probability distributions are constructed with a dynamic Gaussian Copula and a t-Copula from historical samples within the time window, generating probabilistic prediction intervals for the target time. Finally, reinforcement learning is employed to adaptively combine the prediction intervals derived from the two Copula types, yielding the final photovoltaic power probability forecast. Simulations using actual operational data from a photovoltaic power plant in Shanxi Province validate the effectiveness of the proposed method.
(This article belongs to the Section Optoelectronics)
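
A minimal sketch of the copula step, under simple assumed marginals: draw dependent uniforms from a Gaussian copula and a t-copula sharing the same correlation, form conditional intervals for power given forecast irradiance, and blend them with a fixed weight standing in for the reinforcement-learning combiner. The marginals, rho, and the weight are all assumptions.

```python
# Sketch: prediction intervals from a Gaussian copula and a t-copula, blended
# with a fixed weight in place of the paper's reinforcement-learning combiner.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
rho, nu, n = 0.8, 4, 200_000
z = rng.multivariate_normal([0.0, 0.0], [[1, rho], [rho, 1]], size=n)

u_gauss = stats.norm.cdf(z)                      # Gaussian-copula uniforms
w = rng.chisquare(nu, size=(n, 1)) / nu
u_t = stats.t.cdf(z / np.sqrt(w), df=nu)         # t-copula uniforms

def interval(u, level=0.9, u_irr=0.7, tol=0.02):
    """PI for power given forecast irradiance near its u_irr quantile."""
    power = stats.norm.ppf(u[:, 1], loc=50.0, scale=8.0)  # assumed marginal (MW)
    sel = np.abs(u[:, 0] - u_irr) < tol
    a = (1.0 - level) / 2.0
    return np.percentile(power[sel], [100 * a, 100 * (1 - a)])

lo_g, hi_g = interval(u_gauss)
lo_t, hi_t = interval(u_t)
w_rl = 0.6                                       # stand-in for the learned weight
print(f"blended 90% PI: [{w_rl*lo_g + (1-w_rl)*lo_t:.1f}, "
      f"{w_rl*hi_g + (1-w_rl)*hi_t:.1f}] MW")
```
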
21 pages, 333 KB  
Article
Artificial Truth: Algorithmic Power, Epistemic Authority, and the Crisis of Democratic Knowledge
by Rosario Palese
Societies 2026, 16(3), 102; https://doi.org/10.3390/soc16030102 - 23 Mar 2026
Abstract
This article examines how artificial intelligence and algorithmic systems are reconfiguring truth regimes in digital societies, introducing the concept of “Artificial Truth” to describe an emerging form of epistemic governance where knowledge production and validation become infrastructural functions of sociotechnical systems. The study develops an integrated theoretical framework combining Foucault’s notion of truth regimes, Bourdieu’s theory of symbolic capital and fields, and Actor-Network Theory’s constructivist approach. Through conceptual analysis, the article investigates how algorithmic recommendation systems, generative AI, and automated fact-checking operate as epistemic devices that actively shape what is recognized as credible, authoritative, and true in public discourse. The analysis reveals three fundamental transformations: (1) the restructuring of trust economies, with epistemic authority shifting from institutional expertise to platform-native capital based on engagement metrics and affective proximity; (2) the emergence of generative AI as an epistemic actor producing “synthetic truth” through linguistic fluency rather than propositional understanding; (3) the institutionalization of computational veridiction in algorithmic fact-checking systems that translate situated epistemic judgments into probabilistic classifications presented as neutral. These dynamics configure a regime where truth is evaluated less by correspondence with reality and more by computational plausibility and platform integration. The article’s primary contribution lies in providing a unified theoretical framework for understanding contemporary transformations of epistemic authority, moving beyond disinformation studies to analyze AI as an epistemic actor. By integrating classical sociological perspectives with Science and Technology Studies, it conceptualizes algorithmic systems as epistemic infrastructures that embody specific power relations, restructure symbolic capital economies, and distribute epistemic authority asymmetrically, with profound implications for democratic knowledge, citizen epistemic agency, and public sphere pluralism.
22 pages, 2677 KB  
Article
A Hybrid Interval Prediction Framework for Photovoltaic Power Prediction Using BiLSTM–Transformer and Adaptive Kernel Density Estimation
by Laiyuan Li and Zhibin Li
Appl. Sci. 2026, 16(6), 3023; https://doi.org/10.3390/app16063023 - 20 Mar 2026
Abstract
Photovoltaic (PV) power forecasting is strongly influenced by volatility, randomness, and changing meteorological conditions, while conventional point forecasting provides limited uncertainty information for engineering use. This study proposes a hybrid interval forecasting framework for PV prediction. Similar-day clustering first segments weather data into distinct scenarios (sunny, cloudy, and overcast) to reduce noise and redundant information within sequences, enhancing stability and thereby providing a more refined feature space for deep learning. A BiLSTM–Transformer model is then used as the core forecaster, taking multiple meteorological variables as multi-feature time-series inputs. BiLSTM captures bidirectional temporal dependencies, and the Transformer enhances long-range feature extraction via attention. To improve robustness and stability, the Alpha Evolution (AE) algorithm is applied for hyperparameter optimization, balancing global exploration and local refinement. For probabilistic forecasting, Adaptive Bandwidth Kernel Density Estimation (ABKDE) is employed to construct prediction intervals, where the local bandwidth is determined by minimizing a local error function to adapt to data density and error distribution. Case studies utilizing a full-year, 5 min high-resolution dataset from the DKASC station demonstrate that the proposed AE-BiLSTM–Transformer achieves highly accurate point forecasts across diverse weather conditions, reducing the RMSE by 81.85%, 76.99%, and 72.26% under sunny, cloudy, and overcast scenarios, respectively, compared to the baseline LSTM. ABKDE further produces reliable and compact intervals; at the 90% confidence level on sunny days, it achieves PICP = 0.921 with PINAW = 0.0378, reducing PINAW by 75.16% relative to conventional KDE while maintaining comparable coverage.
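
The interval-construction step can be sketched generically: estimate the density of forecast errors with a kernel whose bandwidth adapts to local data density, then read interval bounds off the estimated CDF. The Abramson-style bandwidth rule below is a common stand-in, not the paper's local-error-minimizing ABKDE, and the residuals are mock data.

```python
# Sketch: prediction interval from forecast-error samples via kernel density
# estimation with an Abramson-style adaptive bandwidth (illustrative only).
import numpy as np

rng = np.random.default_rng(2)
errors = rng.normal(0, 1.0, 500) * (1 + 0.5 * rng.random(500))  # mock residuals

n = len(errors)
h0 = 1.06 * errors.std() * n ** (-1 / 5)            # Silverman pilot bandwidth

def kde(x, data, h):
    """Gaussian-kernel density at points x, with a per-sample bandwidth h."""
    u = (x[:, None] - data[None, :]) / h[None, :]
    return (np.exp(-0.5 * u**2) / (np.sqrt(2 * np.pi) * h)).mean(axis=1)

pilot = kde(errors, errors, np.full(n, h0))
h_loc = h0 * np.sqrt(np.exp(np.log(pilot).mean()) / pilot)   # adaptive factors

grid = np.linspace(errors.min() - 3 * h0, errors.max() + 3 * h0, 2000)
dens = kde(grid, errors, h_loc)
cdf = np.cumsum(dens); cdf /= cdf[-1]
lo, hi = grid[np.searchsorted(cdf, 0.05)], grid[np.searchsorted(cdf, 0.95)]

point = 42.0                                        # hypothetical point forecast (MW)
print(f"90% interval: [{point + lo:.2f}, {point + hi:.2f}] MW")
```
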
28 pages, 1600 KB  
Article
A Data-Driven Deep Reinforcement Learning Framework for Real-Time Economic Dispatch of Microgrids Under Renewable Uncertainty
by Biao Dong, Shijie Cui and Xiaohui Wang
Energies 2026, 19(6), 1481; https://doi.org/10.3390/en19061481 - 16 Mar 2026
Abstract
The real-time economic dispatch of microgrids (MGs) is challenged by the high penetration of renewable energy and the resulting source–load uncertainties. Conventional optimization-based scheduling methods rely heavily on accurate probabilistic models and often suffer from high computational burdens, which limits their real-time applicability. To address these challenges, a data-driven deep reinforcement learning (DRL) framework is proposed for real-time microgrid energy management. The MG dispatch problem is formulated as a Markov decision process (MDP), and a Deep Deterministic Policy Gradient (DDPG) algorithm is adopted to efficiently handle the high-dimensional continuous action space of distributed generators and energy storage systems (ESS). The system state incorporates renewable generation, load demand, electricity price, and ESS operational conditions, while the reward function is designed as the negative of the operational cost with penalty terms for constraint violations. A continuous-action policy network is developed to directly generate control commands without action discretization, enabling smooth and flexible scheduling. Simulation studies are conducted on an extended European low-voltage microgrid test system under both deterministic and stochastic operating scenarios. The proposed approach is compared with model-based methods (MPC and MINLP) and representative DRL algorithms (SAC and PPO). The results show that the proposed DDPG-based strategy achieves competitive economic performance, fast convergence, and good adaptability to different initial ESS conditions. In stochastic environments, the proposed method maintains operating costs close to the optimal MINLP reference while significantly reducing the online computational time. These findings demonstrate that the proposed framework provides an efficient and practical solution for the real-time economic dispatch of microgrids with high renewable penetration.
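
A minimal sketch of the reward design the abstract describes, i.e. the negative of operating cost plus constraint-violation penalties. All coefficients, limits, and variable names are assumptions, and the DDPG machinery itself is omitted.

```python
# Sketch of the dispatch MDP's reward shaping: negative operating cost plus
# penalties for constraint violations. All numbers are illustrative.
def reward(p_gen, p_ess, soc, load, pv, price,
           gen_cost=0.08, p_gen_max=50.0, soc_lim=(0.1, 0.9)):
    p_grid = load - pv - p_gen - p_ess          # residual met by the grid (kW)
    cost = gen_cost * p_gen + price * max(p_grid, 0.0)
    penalty = 0.0
    if not (0.0 <= p_gen <= p_gen_max):         # generator limit violation
        penalty += 100.0
    if not (soc_lim[0] <= soc <= soc_lim[1]):   # ESS state-of-charge violation
        penalty += 100.0
    return -(cost + penalty)

# A DDPG actor would emit continuous (p_gen, p_ess) given the state
# (pv, load, price, soc); here we just evaluate one candidate action.
print(reward(p_gen=20.0, p_ess=-5.0, soc=0.55, load=60.0, pv=30.0, price=0.12))
```
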
24 pages, 4894 KB  
Article
Power Load Probabilistic Prediction Based on Multi-Value Quantile Regression and Timing Fusion Ensemble Learning Model
by Yuhang Liu, Fei Mei, Jun Zhang, Xiang Dai and Wen Li
Entropy 2026, 28(3), 329; https://doi.org/10.3390/e28030329 - 16 Mar 2026
Abstract
The core component ensuring the refined and safe operation of distribution network scheduling is 10 kV bus load probabilistic prediction. However, existing probabilistic prediction methods suffer from insufficient dynamic feature extraction and compromised prediction reliability caused by quantile crossing. To address these issues, this paper proposes a 10 kV bus load probabilistic prediction method integrating multi-value quantile regression (MQR) and a temporal fusion ensemble learning model (ELM). Firstly, a temporal fusion ensemble learning model is constructed, which integrates multiple temporal fusion network (TFN) sub-models through a stacking framework to extract multi-dimensional temporal features of loads in parallel, effectively enhancing its feature capture capability for complex load data. Secondly, MQR is introduced as the core objective function to synchronously generate multi-quantile load forecasting results, comprehensively depicting the load probability distribution. Finally, a Listwise Maximum Likelihood Estimation (ListMLE) ranking constraint mechanism is embedded, which optimizes quantile ordering through monotonicity constraints, significantly reducing the degree of quantile crossing and improving the interpretability of forecasting results. The results show that the MQR-ELM algorithm achieves a Prediction Interval Coverage Probability of 94.624% (close to the nominal coverage rate of 95%), a Prediction Interval Averaged Width of 588.526, a Crossing Degree Index of only 0.0476, and a Continuous Ranked Probability Score as low as 84.931. All core indicators are significantly superior to those of the comparative algorithms.
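
The two loss ingredients, pinball loss over multiple quantiles and a non-crossing constraint, can be sketched as follows. The hinge penalty below only illustrates the monotonicity idea; it is not the paper's ListMLE ranking mechanism.

```python
# Sketch: multi-quantile pinball loss with a hinge penalty on quantile crossing.
import numpy as np

taus = np.array([0.05, 0.25, 0.5, 0.75, 0.95])

def pinball_with_crossing_penalty(y, q_pred, lam=1.0):
    """y: (n,) targets; q_pred: (n, len(taus)) predicted quantiles."""
    diff = y[:, None] - q_pred
    pinball = np.maximum(taus * diff, (taus - 1) * diff).mean()
    # Quantiles must be non-decreasing in tau; penalize any inversion.
    crossing = np.maximum(q_pred[:, :-1] - q_pred[:, 1:], 0).mean()
    return pinball + lam * crossing

y = np.array([10.0, 12.0])
q = np.array([[8, 9, 10, 11, 13],
              [9, 11, 12.5, 12.0, 14]])   # second row has one crossing pair
print(pinball_with_crossing_penalty(y, q))
```
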
17 pages, 344 KB  
Article
A Generalized Framework for the (a, b)-Transformation of Probability Measures
by Raouf Fakhfakh, Ghadah Alomani and Abdulmajeed Albarrak
Mathematics 2026, 14(6), 977; https://doi.org/10.3390/math14060977 - 13 Mar 2026
Abstract
In this paper, we propose an analytic deformation acting on probability measures, designed to encompass and extend two fundamental operators in free probability: the (a,b)- and T_c-deformations. This unified operator, denoted X(a,b,c), is introduced through a functional relation for the Cauchy–Stieltjes transform. We have X(a,b,0) = Ũ(a,b) and X(1,1,c) = T_c. We examine the structural properties of this transformation within the setting of Cauchy–Stieltjes kernel (CSK) families, with special emphasis on the behavior of the associated variance functions (VFs). An explicit formula for the VF corresponding to a measure deformed by X(a,b,c) is established. This result allows us to demonstrate a key invariance property: the free Meixner class of probability measures remains stable under the X(a,b,c)-transformation. Furthermore, a novel characterization of the semicircle law is obtained through the action of X(a,1,c), highlighting the role of symmetry in the deformation and preservation of free-probabilistic distributions.
(This article belongs to the Section D1: Probability and Statistics)
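
For reference, the transform through which the abstract's operator is defined is the standard Cauchy–Stieltjes transform; the specific functional relation defining X(a,b,c) is not given in the abstract and is not reproduced here.

```latex
% Standard Cauchy–Stieltjes transform of a probability measure \mu on \mathbb{R}.
% The deformation X_{(a,b,c)} is defined via a functional relation on G_\mu
% whose exact form the abstract does not state.
G_\mu(z) \;=\; \int_{\mathbb{R}} \frac{\mu(\mathrm{d}x)}{z - x},
\qquad z \in \mathbb{C} \setminus \operatorname{supp}(\mu).
```
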
22 pages, 2208 KB  
Article
Analysis and Cost Optimization of a Retrial Queue with Push-Out and Feedback Using Analytical and Metaheuristic Approaches
by Suganthi Poomalai, Saeid Jafari and Jayamani V. Nanjappan
Axioms 2026, 15(3), 204; https://doi.org/10.3390/axioms15030204 - 10 Mar 2026
Abstract
The paper explores an advanced single-server M/G/1 retrial queueing model that employs a push-out service discipline with two classes of customers: transient (priority) customers and recurrent customers. Customer arrivals follow a Poisson process. Service times and the retrial times of transient customers follow general probability distributions, while the inter-retrial time of recurrent customers is exponentially distributed. The system also includes feedback behavior of transient customers and probabilistic push-out of recurrent customers. Closed-form formulae expressing the steady-state distributions of important system states are obtained using the supplementary variable technique (SVT) and probability generating functions (PGFs). The impact of the parameters is illustrated through numerical experiments, and the Beetle Antennae Search (BAS) algorithm is used to optimize system performance. These results are useful in the design and optimization of priority-based service systems such as cloud computing systems, communication networks, and real-time task scheduling systems.
(This article belongs to the Section Mathematical Analysis)
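
As a sanity-check baseline only, a plain M/G/1 FIFO queue (ignoring the paper's retrial orbit, priorities, push-out, and feedback) can be simulated with the Lindley recursion and checked against the Pollaczek–Khinchine formula. All rates and the service distribution are illustrative.

```python
# Baseline only: plain M/G/1 FIFO waiting times via the Lindley recursion,
# compared with the Pollaczek-Khinchine mean-wait formula.
import numpy as np

rng = np.random.default_rng(3)
lam, n = 0.8, 200_000                   # Poisson arrival rate, no. of customers
inter = rng.exponential(1 / lam, n)     # interarrival times
serv = rng.gamma(2.0, 0.5, n)           # general service times, E[S] = 1.0

w = np.zeros(n)
for i in range(1, n):
    w[i] = max(0.0, w[i - 1] + serv[i - 1] - inter[i])   # Lindley recursion

rho = lam * serv.mean()                 # utilization (must be < 1 for stability)
pk = lam * (serv**2).mean() / (2 * (1 - rho))   # P-K mean waiting time
print(f"simulated mean wait {w.mean():.3f}  vs  Pollaczek-Khinchine {pk:.3f}")
```
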
13 pages, 241 KB  
Hypothesis
From Molecular Cleavage to Clinical Effect: A Probabilistic Field Model of Botulinum Toxin Action
by Andrea Felice Armenti and Francesco Armenti
Biology 2026, 15(5), 446; https://doi.org/10.3390/biology15050446 - 9 Mar 2026
Abstract
Botulinum toxin (BoNT) is a highly specific molecular enzyme whose therapeutic action is based on the proteolytic cleavage of SNARE proteins, most notably SNAP-25. Despite the deterministic nature of this molecular mechanism, the clinical effects of BoNT exhibit substantial variability in efficacy, spatial extent, and duration that cannot be fully explained by dose–response relationships or diffusion-based models. In this work, we propose the Molecular Probability Field (MPF-BoNT) as a conceptual framework that bridges discrete molecular events and emergent functional outcomes. The MPF is defined as the spatial–temporal distribution of the probability that presynaptic terminals reach a functional silencing state (operationalized via SNAP-25 cleavage exceeding a threshold), shaped by exposure, uptake, target density, and temporal dynamics following toxin exposure. Within this framework, clinical effects arise from the integration of probabilistic molecular events across space and time, rather than from toxin presence or concentration alone. The MPF-BoNT framework accounts for key features of botulinum toxin action, including spread, nonlinearity of dose effects, variability in duration, and differences between technical and biological non-response. By explicitly incorporating molecular variables such as local concentration, exposure time, terminal density, internalization probability, and functional silencing thresholds, the framework provides an integrative interpretation of tissue-level behavior grounded in molecular biology. The MPF-BoNT offers a formal language to describe how established enzymatic events generate observable spatial, temporal, and functional patterns. As a generative framework grounded in explicit testable structure, it establishes a foundation for future experimental and clinical research.
(This article belongs to the Section Biochemistry and Molecular Biology)
26 pages, 4251 KB  
Article
Reliability-Aware Robust Optimization for Multi-Type Sensor Placement Under Sensor Failures
by Shenghuan Zeng, Ding Luo, Pujingru Yan, Naiwei Lu, Ke Huang and Lei Wang
Buildings 2026, 16(5), 1024; https://doi.org/10.3390/buildings16051024 - 5 Mar 2026
Abstract
In the field of structural health monitoring systems, sensors serve as the fundamental components for assessing infrastructure integrity. The rationality of their spatial configuration significantly influences the precision of structural performance assessment, the efficacy of damage detection algorithms, and the operational reliability of the system throughout its designated lifecycle. A robust optimization methodology for the placement of multi-type sensors is proposed in this study, explicitly formulated to mitigate the negative impact of sensor malfunctions during long-term operation. First, a rigorous evaluation framework for sensor placement schemes is established based on Bayesian inference and the minimization of information entropy, thereby quantifying the uncertainty inherent in parameter identification. Then, a probabilistic model of sensor failure is developed utilizing the Weibull distribution to capture time-dependent reliability characteristics, combined with a modified information entropy calculation method that mathematically assimilates these failure probabilities into the optimization objective. Finally, a heuristic search strategy is employed to achieve the robust optimal placement of multi-type sensors, efficiently navigating the complex combinatorial search space. In contrast to deterministic information entropy (DIE) methodologies, which assume ideal sensor functionality, the robust information entropy (RIE) approach comprehensively accounts for the stochastic nature of sensor failures and their impact on the information content of the monitoring network, thereby significantly augmenting the robustness and redundancy of the sensor configuration. Validations utilizing a numerical frame structure and a finite element bridge model demonstrate that the RIE method effectively integrates the sensor failure probability model to yield robust optimal placement schemes, minimizing the risk of information loss and ensuring reliable structural health monitoring throughout the engineering lifecycle.
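
The Weibull failure-probability ingredient is straightforward to sketch. The scale and shape parameters below are illustrative, and the information-entropy objective that consumes these probabilities is not reproduced; only the time-dependent failure probabilities that feed it are shown.

```python
# Sketch of a Weibull sensor-failure model; eta and beta are illustrative.
import numpy as np

eta, beta = 15.0, 1.8                   # assumed Weibull scale (years) and shape

def p_fail(t):
    """Probability a sensor has failed by service time t (Weibull CDF)."""
    return 1.0 - np.exp(-(t / eta) ** beta)

t = np.array([1.0, 5.0, 10.0, 20.0])
pf = p_fail(t)
print(np.round(pf, 3))

# Expected number of working sensors in a 30-sensor network over the lifecycle,
# one quantity a robust (RIE-style) objective would weight configurations by.
print(np.round(30 * (1 - pf), 1))
```
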
19 pages, 1503 KB  
Review
Imaging Ductal Carcinoma In Situ in the Era of De-Escalation: Role, Limits, and Clinical Implications for Risk-Adapted Management
by Marcella Buono, Luigi Schiavone, Sighelgaita Rizzo, Lanfranco Aquilino Musto, Gianluca Gatta, Lucia Pilati and Francesca Caumo
Diagnostics 2026, 16(5), 776; https://doi.org/10.3390/diagnostics16050776 - 5 Mar 2026
Abstract
The widespread implementation of population-based mammographic screening has markedly increased the detection of ductal carcinoma in situ (DCIS), without a proportional reduction in breast cancer-specific mortality. This divergence has intensified concerns regarding overdiagnosis and overtreatment and has prompted increasing interest in treatment de-escalation and active surveillance strategies. Breast imaging remains indispensable for DCIS detection, extent assessment, and longitudinal monitoring. However, although imaging features correlate with histopathologic risk factors at the population level, their ability to predict individual biological progression is inherently probabilistic and limited. Overinterpretation of imaging phenotypes as surrogates of invasive destiny risks inappropriate reassurance or unjustified therapeutic escalation, particularly in the context of high-sensitivity modalities that may overestimate disease extent or trigger additional interventions without proven outcome benefits. This review examines the modality-specific roles of mammography, ultrasound, breast magnetic resonance imaging (MRI), contrast-enhanced mammography (CEM), and emerging artificial intelligence (AI) approaches within contemporary DCIS management, with particular attention to their implementation in active surveillance trials such as LORIS, COMET, LORD, and LORETTA. Across modalities, imaging primarily reflects lesion morphology, spatial distribution, and vascular behaviour, and functions most reliably as a risk-filtering and safety-gating instrument aimed at excluding radiologically unsafe scenarios, including occult invasion, underestimated disease extent, or imaging evolution incompatible with continued observation. By delineating both the capabilities and the epistemological limits of imaging, this review proposes a structured clinical decision framework in which imaging supports, but does not independently determine, risk-adapted management. Disciplined integration of imaging into multidisciplinary decision-making is essential to enable safe de-escalation, prevent false reassurance, and align DCIS care with patient-centred and value-based principles.
(This article belongs to the Special Issue Diagnostic Radiology for Breast Cancer)
63 pages, 1636 KB  
Article
Asymptotic Theory for Multivariate Nonparametric Quantile Regression with Stationary Ergodic Functional Covariates and Missing-at-Random Responses
by Hadjer Belhas, Mustapha Mohammedi and Salim Bouzebda
Symmetry 2026, 18(3), 445; https://doi.org/10.3390/sym18030445 - 4 Mar 2026
Abstract
Quantiles are among the most fundamental constructs in probability theory and statistics, intrinsically linked to order structures, stochastic dominance, and the principles of robust statistical inference. Although the univariate theory of quantiles is by now classical and well developed, their generalization to multivariate settings remains mathematically subtle and methodologically demanding. In particular, extending the notion of “location within a distribution” beyond one dimension raises delicate questions of geometry, ordering, and equivariance. Within this landscape, the spatial (or geometric) formulation of multivariate quantiles has emerged as a rigorous and conceptually unifying framework capable of reconciling these issues. In this work we advance this paradigm by introducing a kernel-based estimation procedure for nonparametric conditional geometric quantiles of a multivariate response Y ∈ ℝ^q (q ≥ 2) given a functional covariate X that takes values in an infinite-dimensional space. The data are assumed to form a strictly stationary and ergodic process, while the responses may be subject to a missing-at-random mechanism, a feature of substantial practical relevance. Our analysis establishes strong consistency of the proposed estimator, characterizes its optimal convergence rate, and derives its asymptotic distribution. These limit theorems, in turn, provide the theoretical foundation for constructing asymptotically valid confidence regions and for performing inference in multivariate quantile regression with functional covariates. The theoretical developments rest on natural complexity conditions for the involved functional classes together with mild smoothness and regularity assumptions. This balance between generality and mathematical precision ensures that the resulting methodology is not only robust in a rigorous probabilistic sense but also widely applicable to contemporary problems in high-dimensional and functional data analysis. The proposed methodology is numerically investigated through simulations and is implemented in a real data application.
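
For reference, the standard (unconditional) geometric quantile underlying the paper's conditional estimator is the minimizer below; the paper's version conditions on the functional covariate and accommodates missing responses, which this display omits.

```latex
% Standard geometric (spatial) quantile of a random vector Y in R^q,
% indexed by u in the open unit ball (u = 0 gives the spatial median).
Q(u) \;=\; \arg\min_{\theta \in \mathbb{R}^q}
\; \mathbb{E}\bigl[\, \lVert Y - \theta \rVert + \langle u,\, Y - \theta \rangle \,\bigr],
\qquad \lVert u \rVert < 1 .
```
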
28 pages, 18337 KB  
Article
Forecast of Electric Power Consumed by Public Buildings: Univariate and Multivariate Approaches Based on Quantile Regression Models
by Sara Perna, Anna Rita Di Fazio, Andrea Iacovacci, Francesco Conte and Pasquale De Falco
Energies 2026, 19(5), 1200; https://doi.org/10.3390/en19051200 - 27 Feb 2026
Abstract
Load forecasting has become a key tool, especially for distribution system operators, to ensure optimal grid management and control. In recent years, attention has shifted toward probabilistic load forecasting (PLF), as it can model forecast uncertainty. Because electricity demand is strongly influenced by time-dependent factors such as seasonal patterns and daily habits, non-parametric PLF methods are particularly suitable because they make no assumptions about the distribution of variables. This study focuses on quantile regression (QR), a widely studied non-parametric PLF technique that models forecast uncertainty by only assuming a linear dependency among variables. It is applied every hour to forecast the daily consumption of three large public buildings (an elderly healthcare center, a biomedical research facility, and a polyclinic) with different demand variability profiles. Forecasts are carried out using real-world consumption data and evaluated considering both univariate and multivariate approaches. The performance of both QR approaches is rigorously evaluated against that of two persistence-based methods through standard evaluation metrics. For the univariate case, two aggregation levels are considered: single buildings and aggregation of buildings. The results confirm the effectiveness of both uQR and mQR, which consistently outperform persistence-based benchmarks. In terms of the pinball loss (PL) function, the QR approaches exhibit values ranging from 1% to 1.8% across all case studies. Both approaches demonstrate reliable and sharp prediction intervals (PIs); for example, for the PI(10–90) using the uQR, the PI coverage probability (PICP) ranges from 0.78 to 0.89 and the PI normalized average width (PINAW) from 0.09 to 0.26. Overall, uQR achieves lower PL, whereas mQR yields slightly better PICP and PINAW results for the building characterized by an irregular and unpredictable consumption profile.
(This article belongs to the Special Issue Advanced Forecasting Methods for Sustainable Power Grid: 2nd Edition)
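
The two interval metrics quoted in the abstract are simple to compute: PICP is the share of observations falling inside the interval, and PINAW is the mean interval width normalized by the observed range. The sketch below uses mock data, not the paper's buildings.

```python
# Sketch of the interval metrics PICP and PINAW on mock consumption data.
import numpy as np

def picp(y, lo, hi):
    return np.mean((y >= lo) & (y <= hi))

def pinaw(y, lo, hi):
    return np.mean(hi - lo) / (y.max() - y.min())

rng = np.random.default_rng(4)
y = rng.normal(100.0, 10.0, 240)            # mock hourly consumption (kWh)
yhat = y + rng.normal(0.0, 5.0, y.size)     # mock point forecast
lo, hi = yhat - 10.0, yhat + 10.0           # mock PI(10-90) bounds
print(f"PICP = {picp(y, lo, hi):.2f}, PINAW = {pinaw(y, lo, hi):.2f}")
```
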