Search Results (146)

Search Parameters:
Keywords = probabilistic prediction interval

29 pages, 2805 KB  
Article
Probabilistic Links Between Quantum Classification of Patterns of Boolean Functions and Hamming Distance
by Theodore Andronikos, Constantinos Bitsakos, Konstantinos Nikas, Georgios I. Goumas and Nectarios Koziris
Stats 2026, 9(1), 5; https://doi.org/10.3390/stats9010005 (registering DOI) - 1 Jan 2026
Abstract
This article investigates the probabilistic relationship between quantum classification of Boolean functions and their Hamming distance. By integrating concepts from quantum computing, information theory, and combinatorics, we explore how Hamming distance serves as a metric for analyzing deviations in function classification. Our extensive experimental results confirm that the Hamming distance is a pivotal metric for validating nearest neighbors in the process of classifying random functions. One of the significant conclusions we arrived at is that the successful classification probability decreases monotonically with the Hamming distance. However, key exceptions were found in specific classes, revealing intra-class heterogeneity. We have established that these deviations are not random but are systemic and predictable. Furthermore, we were able to quantify these irregularities, turning potential errors into manageable phenomena. The most important novelty of this work is the demarcation, for the first time to the best of our knowledge, of precise Hamming distance intervals for the classification probability. These intervals bound the possible values the probability can assume, and provide a new foundational tool for probabilistic assessment in quantum classification. Practitioners can now endorse classification results with high certainty or dismiss them with confidence. This framework can significantly enhance any quantum classification algorithm’s reliability and decision-making capability. Full article
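As a concrete illustration of the distance metric involved (not the paper's quantum circuit), the Hamming distance between two n-variable Boolean functions is simply the number of inputs on which their truth tables disagree; a minimal Python sketch:

```python
from itertools import product

def truth_table(f, n):
    """Truth table of an n-variable Boolean function as a tuple of 0/1."""
    return tuple(f(*bits) for bits in product((0, 1), repeat=n))

def hamming_distance(f, g, n):
    """Number of the 2**n inputs on which f and g disagree."""
    return sum(a != b for a, b in zip(truth_table(f, n), truth_table(g, n)))

# Example: a balanced function vs. a constant function on 2 bits.
xor = lambda x, y: x ^ y       # balanced: outputs (0, 1, 1, 0)
const0 = lambda x, y: 0        # constant: outputs (0, 0, 0, 0)
```

For instance, `hamming_distance(xor, const0, 2)` is 2, since the two functions disagree on exactly half of the four inputs.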

24 pages, 11970 KB  
Article
Data-Driven Probabilistic Wind Power Forecasting and Dispatch with Alternating Direction Method of Multipliers over Complex Networks
by Lina Sheng, Nan Fu, Juntao Mou, Linglong Zhu and Jinan Zhou
Mathematics 2026, 14(1), 112; https://doi.org/10.3390/math14010112 - 28 Dec 2025
Abstract
This paper proposes a privacy-preserving framework that couples probabilistic wind power forecasting with decentralized anomaly detection in complex power networks. We first design an adaptive federated learning (FL) scheme to produce probabilistic forecasts for multiple geographically distributed wind farms while keeping their raw data local. In this scheme, an artificial neural network with quantile regression is trained collaboratively across sites to provide calibrated prediction intervals for wind power outputs. These forecasts are then embedded into an alternating direction method of multipliers (ADMM)-based load-side dispatch and anomaly detection model for decentralized power systems with plug-and-play industrial users. Each monitoring node uses local measurements and neighbor communication to solve a distributed economic dispatch problem, detect abnormal load behaviors, and maintain network consistency without a central coordinator. Experiments on the GEFCom 2014 wind power dataset show that the proposed FL-based probabilistic forecasting method outperforms persistence, local training, and standard FL in RMSE and MAE across multiple horizons. Simulations on IEEE 14-bus and 30-bus systems further verify fast convergence, accurate anomaly localization, and robust operation, indicating the effectiveness of the integrated forecasting–dispatch framework for smart industrial grids with high wind penetration. Full article
(This article belongs to the Special Issue Advanced Machine Learning Research in Complex System)
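The calibrated prediction intervals above come from quantile regression; the pinball (quantile) loss that such a network minimizes can be sketched as follows (illustrative only — the paper's federated training loop is not reproduced):

```python
import numpy as np

def pinball_loss(y_true, y_pred, q):
    """Average pinball (quantile) loss at quantile level q in (0, 1).

    Under-prediction is penalized by q, over-prediction by (1 - q), so
    minimizing this loss pushes y_pred toward the q-th conditional quantile.
    """
    diff = y_true - y_pred
    return float(np.mean(np.maximum(q * diff, (q - 1) * diff)))
```

At q = 0.9, under-predicting by one unit costs 0.9 while over-predicting by one unit costs only 0.1, which is why the fitted curve sits near the upper tail of the data.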

21 pages, 4860 KB  
Article
Data-Driven Probabilistic Analysis of Power System Faults Using Monte Carlo Simulation
by Franjo Pranjić and Peter Virtič
Technologies 2026, 14(1), 14; https://doi.org/10.3390/technologies14010014 - 24 Dec 2025
Abstract
This paper presents a data-driven probabilistic framework for analysing power system faults using Monte Carlo simulations. The study evaluates the operational reliability of multiple high-voltage switchgear topologies—including single-busbar systems, double-busbar systems, and ring-type configurations—by modelling the stochastic behaviour of disconnectors, circuit breakers, busbars, and withdrawable switching elements with bypass arrangements. Realistic unavailability parameters derived from statistical reliability data are used to generate fault intervals for each device, enabling the simulation of millions of operational scenarios and capturing both full and partial outage events. The proposed methodology quantifies outage probabilities, identifies critical components, and reveals how device count, switching logic, and system redundancy influence overall resilience. Results show significant reliability differences between topologies and highlight the importance of optimized substation design for fault tolerance. The developed probabilistic framework provides a transparent and computationally efficient tool to support planning, modernization, and predictive maintenance strategies in transmission and distribution networks. Findings contribute to improved fault diagnosis, enhanced grid stability, and increased reliability in both conventional and renewable-integrated power systems. Full article
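The Monte Carlo idea — sampling component states and counting outage scenarios — can be sketched for a toy topology: a breaker in series with two redundant feeders, so the load is lost if the breaker fails or both feeders fail. All parameters below are invented for illustration, not the paper's reliability data:

```python
import random

def outage_probability(p_breaker, p_feeder, n_trials=100_000, seed=42):
    """Monte Carlo estimate of outage probability for a breaker in series
    with two redundant (parallel) feeders. Analytically this equals
    1 - (1 - p_breaker) * (1 - p_feeder**2)."""
    rng = random.Random(seed)
    outages = 0
    for _ in range(n_trials):
        breaker_ok = rng.random() > p_breaker
        # Parallel redundancy: the load survives if either feeder is up.
        feeders_ok = rng.random() > p_feeder or rng.random() > p_feeder
        if not (breaker_ok and feeders_ok):
            outages += 1
    return outages / n_trials
```

With p_breaker = 0.01 and p_feeder = 0.1 the analytic value is 1 − 0.99 × 0.99 ≈ 0.0199, and the simulation converges to it as the trial count grows.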

21 pages, 3094 KB  
Article
Assessment of Load Reduction Potential Based on Probabilistic Prediction of Demand Response Baseline Load
by Xianjun Qi, Mengjie Gong, Feng Huang and Hao Liu
Processes 2026, 14(1), 52; https://doi.org/10.3390/pr14010052 - 23 Dec 2025
Abstract
The uncertainty of baseline load forecasting critically influences both the assessment of load reduction potential and demand response (DR) settlement. Therefore, this paper focuses on assessing load reduction potential based on probabilistic predictions of the baseline load. First, the uncertainty of the baseline load prediction is analyzed by calculating the conditional probability density function (PDF) and interval estimation of baseline load prediction errors from the convolutional neural network (CNN) model. Then, the probabilistic model of load reduction potential is proposed based on the results from the probabilistic prediction of baseline load and the interruptible-load terms in DR contracts. Finally, the Monte Carlo simulation method is used to assess the load reduction potential, and probability distributions of the load reduction states and of the lower and upper limits of the load reduction potential are analyzed. Case studies demonstrate that the proposed method effectively characterizes the uncertainty of prediction results, with the prediction interval normalized average width (PINAW) decreased by 10.97%, thereby enabling the effective assessment of load reduction potential from the probabilistic perspective and helping decision makers make better choices. Full article
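The PINAW metric cited above (decreased by 10.97%) has a simple definition: the mean interval width normalized by the range of the observations. A sketch, assuming `lower`/`upper` are the interval bounds and `y` the observations:

```python
import numpy as np

def pinaw(lower, upper, y):
    """Prediction Interval Normalized Average Width: mean width of the
    prediction intervals divided by the range of the observed values."""
    return float(np.mean(upper - lower) / (np.max(y) - np.min(y)))
```

Lower PINAW at equal coverage means sharper, more informative intervals, which is the sense in which the paper's 10.97% reduction is an improvement.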

20 pages, 1609 KB  
Article
Low-Cost Gas Sensing and Machine Learning for Intelligent Refrigeration in the Built Environment
by Mooyoung Yoo
Buildings 2026, 16(1), 41; https://doi.org/10.3390/buildings16010041 - 22 Dec 2025
Abstract
Accurate, real-time monitoring of meat freshness is essential for reducing food waste and safeguarding consumer health, yet conventional methods rely on costly, laboratory-grade spectroscopy or destructive analyses. This work presents a low-cost electronic-nose platform that integrates a compact array of metal-oxide gas sensors (Figaro TGS2602, TGS2603, and Sensirion SGP30) with a Gaussian Process Regression (GPR) model to estimate a continuous freshness index under refrigerated storage. The pipeline includes headspace sensing, baseline normalization and smoothing, history-window feature construction, and probabilistic prediction with uncertainty. Using factorial analysis and response-surface optimization, we identify history length and sampling interval as key design variables; longer temporal windows and faster sampling consistently improve accuracy and stability. The optimized configuration (≈143-min history, ≈3-min sampling) reduces mean absolute error from ~0.51 to ~0.05 on the normalized freshness scale and shifts the error distribution within specification limits, with marked gains in process capability and yield. Although it does not match the analytical precision or long-term robustness of spectrometric approaches, the proposed system offers an interpretable and energy-efficient option for short-term, laboratory-scale monitoring under controlled refrigeration conditions. By enabling probabilistic freshness estimation from low-cost sensors, this GPR-driven e-nose demonstrates a proof-of-concept pathway that could, after further validation under realistic cyclic loads and operational disturbances, support more sustainable meat management in future smart refrigeration and cold-chain applications. This study should be regarded as a methodological, laboratory-scale proof-of-concept that does not demonstrate real-world performance or operational deployment. 
The technical implications described herein are hypothetical and require extensive validation under realistic refrigeration conditions. Full article
(This article belongs to the Special Issue Built Environment and Building Energy for Decarbonization)
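GPR's probabilistic prediction reduces to a posterior mean and variance under a kernel. A minimal NumPy sketch with an RBF kernel — unit signal variance and a fixed length scale are assumptions for illustration; the paper's sensor-feature pipeline is omitted:

```python
import numpy as np

def rbf(a, b, length=1.0):
    """Squared-exponential kernel matrix between 1-D input arrays a and b."""
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-3, length=1.0):
    """Posterior mean and standard deviation of a GP regressor."""
    K = rbf(x_train, x_train, length) + noise * np.eye(len(x_train))
    K_s = rbf(x_test, x_train, length)
    alpha = np.linalg.solve(K, y_train)
    mean = K_s @ alpha
    v = np.linalg.solve(K, K_s.T)
    var = 1.0 - np.sum(K_s * v.T, axis=1)   # rbf(x, x) = 1 on the diagonal
    return mean, np.sqrt(np.maximum(var, 0.0))
```

The returned standard deviation is small near training points and grows toward the prior far from the data, which is what makes the freshness estimate "probabilistic prediction with uncertainty".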

25 pages, 6352 KB  
Article
Integrated Stochastic Framework for Drought Assessment and Forecasting Using Climate Indices, Remote Sensing, and ARIMA Modelling
by Majed Alsubih, Javed Mallick, Hoang Thi Hang, Mansour S. Almatawa and Vijay P. Singh
Water 2025, 17(24), 3582; https://doi.org/10.3390/w17243582 - 17 Dec 2025
Abstract
This study presents an integrated stochastic framework for assessing and forecasting drought dynamics in the western Bhagirathi–Hooghly River Basin, encompassing the districts of Bankura, Birbhum, Burdwan, Medinipur, and Purulia. Employing multiple probabilistic and statistical techniques, including the gamma-based standardized precipitation index (SPI), effective drought index (EDI), rainfall anomaly index (RAI), and the auto-regressive integrated moving average (ARIMA) model, the research quantifies spatio-temporal variability and projects drought risk under non-stationary climatic conditions. The analysis of century-long rainfall records (1905–2023), coupled with LANDSAT-derived vegetation and moisture indices, reveals escalating drought frequency and severity, particularly in Purulia, where recurrent droughts occur at roughly four-year intervals. Stochastic evaluation of rainfall anomalies and SPI distributions indicates significant inter-annual variability and complex temporal dependencies across all districts. ARIMA-based forecasts (2025–2045) suggest persistent negative SPI trends, with Bankura and Purulia exhibiting heightened drought probability and reduced predictability at longer timescales. The integration of remote sensing and time-series modelling enhances the robustness of drought prediction by combining climatic stochasticity with land-surface responses. The findings demonstrate that a hybrid stochastic modelling approach effectively captures uncertainty in drought evolution and supports climate-resilient water resource management. This research contributes a novel, region-specific stochastic framework that advances risk-based drought assessment, aligning with the broader goal of developing adaptive and probabilistic environmental management strategies under changing climatic regimes. Full article
(This article belongs to the Special Issue Drought Evaluation Under Climate Change Condition)

21 pages, 2101 KB  
Article
Probabilistic Prediction of Local Scour at Bridge Piers with Interpretable Machine Learning
by Jaemyeong Choi, Jongyeong Kim, Soonchul Kwon and Taeyoon Kim
Water 2025, 17(24), 3574; https://doi.org/10.3390/w17243574 - 16 Dec 2025
Abstract
Local pier scour remains one of the leading causes of bridge failure, calling for predictions that are both accurate and uncertainty-aware. This study develops an interpretable data-driven framework that couples CatBoost (Categorical Gradient Boosting) for deterministic point prediction with NGBoost (Natural Gradient Boosting) for probabilistic prediction. Both models are trained on a laboratory dataset of 552 measurements of local scour at bridge piers using non-dimensional inputs (y/b, V/Vc, b/d50, Fr). Model performance was quantitatively evaluated using standard regression metrics, and interpretability was provided through SHAP (Shapley Additive Explanations) analysis. Monte Carlo–based reliability analysis linked the predicted scour depths to a reliability index β and exceedance probability through a simple multiplicative correction factor. On the held-out test set, CatBoost offers slightly higher point-prediction accuracy, while NGBoost yields well-calibrated prediction intervals with empirical coverages close to the nominal 68% and 95% levels. This framework delivers accurate, interpretable, and uncertainty-aware scour estimates for target-reliability, risk-informed bridge design. Full article
(This article belongs to the Section Hydraulics and Hydrodynamics)
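Empirical coverage of the kind reported here (close to the nominal 68% and 95% levels) is just the fraction of observations falling inside their prediction intervals, often called PICP; a sketch:

```python
import numpy as np

def picp(lower, upper, y):
    """Prediction Interval Coverage Probability: fraction of observations
    that fall inside their [lower, upper] prediction intervals."""
    return float(np.mean((y >= lower) & (y <= upper)))
```

An interval method is well calibrated when PICP at each nominal level (e.g. 0.68 or 0.95) matches that level on held-out data.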

29 pages, 12360 KB  
Article
Vision-Guided Dynamic Risk Assessment for Long-Span PC Continuous Rigid-Frame Bridge Construction Through DEMATEL–ISM–DBN Modelling
by Linlin Zhao, Qingfei Gao, Yidian Dong, Yajun Hou, Liangbo Sun and Wei Wang
Buildings 2025, 15(24), 4543; https://doi.org/10.3390/buildings15244543 - 16 Dec 2025
Abstract
In response to the challenges posed by the complex evolution of risks and the static nature of traditional assessment methods during the construction of long-span prestressed concrete (PC) continuous rigid-frame bridges, this study proposes a risk assessment framework that integrates visual perception with dynamic probabilistic reasoning. By combining an improved YOLOv8 model with the Decision-Making Trial and Evaluation Laboratory–Interpretive Structural Modeling (DEMATEL–ISM) algorithm, the framework achieves intelligent identification of risk elements and causal structure modelling. On this basis, a dynamic Bayesian network (DBN) is constructed, incorporating a sliding window and forgetting factor mechanism to enable adaptive updating of conditional probability tables. Using the Tongshun River Bridge as a case study, at the identification layer, we refine onsite targets into 14 risk elements (F1–F14). For visualization, these are aggregated into four categories—“Bridge, Person, Machine, Environment”—to enhance readability. In the methodology layer, leveraging a priori causal information provided by DEMATEL–ISM, risk elements are mapped to scenario probabilities, enabling scenario-level risk assessment and grading. This establishes a traceable closed-loop process from “elements” to “scenarios.” The results demonstrate that the proposed approach effectively identifies key risk chains within the “human–machine–environment–bridge” system, revealing phase-specific peaks in human-related risks and cumulative increases in structural and environmental risks. The particle filter and Monte Carlo prediction outputs generate short-term risk evolution curves with confidence intervals, facilitating the quantitative classification of risk levels.
Overall, this vision-guided dynamic risk assessment method significantly enhances the real-time responsiveness, interpretability, and foresight of bridge construction safety management and provides a promising pathway for proactive risk control in complex engineering environments. Full article
(This article belongs to the Special Issue Big Data and Machine/Deep Learning in Construction)

19 pages, 3961 KB  
Article
Risk-Aware Multi-Horizon Forecasting of Airport Departure Flow Using a Patch-Based Time-Series Transformer
by Xiangzhi Zhou, Shanmei Li and Siqing Li
Aerospace 2025, 12(12), 1107; https://doi.org/10.3390/aerospace12121107 - 15 Dec 2025
Abstract
Airport traffic flow prediction is a basic requirement for air traffic management. Building an effective airport traffic flow prediction model helps reveal how traffic demand evolves over time and supports short-term planning. At the same time, a large amount of air traffic data supports using deep learning to learn traffic patterns with stable and accurate performance. In practice, airports need forecasts at short time intervals and need to know the departure flow and its uncertainty 1–2 h in advance. To meet this need, we treat airport departure flow prediction as a multi-step probabilistic forecasting problem on a multi-airport dataset that is organized by airport and time. Scheduled departure counts, recent taxi-out time statistics (P50/P90 over 30- and 60-minute windows), and calendar variables are put on the same time scale and standardized separately for each airport. Based on these data, we propose an end-to-end multi-step forecasting method built on PatchTST. The method uses patch partitioning and a Transformer encoder to extract temporal features from the past 48 h of multivariate history and directly outputs the 10th, 50th, and 90th percentile forecasts of departure flow for each 10 min step in the next 120 min. In this way, the model provides both point forecasts and prediction intervals. Experiments were conducted on 80 airports with the highest departure volumes, using April–July for training, August for validation, September for testing, and October for robustness evaluation. The results show that at a 10 min interval, the model achieves an MAE of 0.411 and an RMSE of 0.713 on the test set. The error increases smoothly with the forecast horizon and remains stable within the 60–120 min range. 
When the forecasts are aggregated to 1 h intervals in time or aggregated by airport clusters in space, the point forecast errors decrease further, the average empirical coverage is 0.78, and the width of the percentile-based intervals is 1.29, which can meet the risk-awareness requirements of tactical operations management. The proposed method is relatively simple and also provides a unified modeling framework for the later inclusion of external factors such as weather, runway configuration, and operational procedures, and for application across different airports and years. Full article
(This article belongs to the Special Issue AI, Machine Learning and Automation for Air Traffic Control (ATC))
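PatchTST-style patch partitioning simply slices the history window into overlapping sub-sequences before the Transformer encoder tokenizes them. A sketch over a 48 h history at 10-min resolution (288 steps); the patch length and stride below are illustrative, not the paper's settings:

```python
import numpy as np

def make_patches(series, patch_len, stride):
    """Split a 1-D series into (possibly overlapping) patches, as in
    PatchTST-style tokenization of a time series."""
    n = (len(series) - patch_len) // stride + 1
    return np.stack([series[i * stride : i * stride + patch_len] for i in range(n)])
```

For a 288-step history with patch length 16 and stride 8, this yields 35 patches of 16 steps each, each of which becomes one token for the encoder.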

28 pages, 2600 KB  
Article
Reliable and Adaptive Probabilistic Forecasting for Event-Driven Water-Quality Time Series Using a Gated Hybrid–Mixture Density Network
by Nadir Ehmimed, Mohamed Yassin Chkouri and Abdellah Touhafi
Sensors 2025, 25(24), 7560; https://doi.org/10.3390/s25247560 - 12 Dec 2025
Abstract
Real-time, reliable forecasting of water quality (WQ) is a critical component of sustainable environmental management. A key challenge, however, is modeling time-varying uncertainty (heteroscedasticity), particularly during disruptive events like storms where predictability decreases dramatically. Standard probabilistic models often fail in these high-stakes scenarios, producing forecasts that are either too conservative during calm periods or dangerously overconfident during volatile events. This paper introduces the Gated Hybrid–Mixture Density Network (GH-MDN), an architecture explicitly designed for adaptive uncertainty estimation. Its core innovation is a dedicated gating network that learns to adaptively modulate the prediction interval width in response to a domain-relevant, event-precursor signal. We evaluate the GH-MDN on both synthetic and real-world WQ datasets using a rigorous cross-validation protocol. The results demonstrate that our gated model provides robust calibration and trustworthy adaptive coverage; specifically, it successfully widens prediction intervals to capture extreme events where standard benchmarks fail. We further show that common aggregate metrics such as CRPS can mask over-confident behavior during rare events, underscoring the need for evaluation approaches that prioritize calibration. This science-informed approach to modeling heteroscedasticity prioritizes reliable risk coverage over aggregate error minimization, marking a critical step towards the development of more trustworthy environmental forecasting systems. Full article
(This article belongs to the Special Issue State-of-the-Art Sensors Technologies in Belgium 2024-2025)
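An MDN head outputs mixture weights, means, and scales; the predictive mean and standard deviation — the quantities a gating network would modulate to widen intervals during events — follow from standard Gaussian-mixture moments. A sketch (generic mixture arithmetic, not the GH-MDN architecture itself):

```python
import numpy as np

def mixture_mean_std(weights, means, stds):
    """Mean and standard deviation of a 1-D Gaussian mixture, as produced
    by a mixture density network output head."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    means = np.asarray(means, dtype=float)
    stds = np.asarray(stds, dtype=float)
    mu = float(np.sum(w * means))
    # Law of total variance: within-component + between-component spread.
    var = float(np.sum(w * (stds ** 2 + (means - mu) ** 2)))
    return mu, np.sqrt(var)
```

Note that separated component means inflate the predictive spread even when each component is narrow, which is how a mixture head expresses event-driven uncertainty.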

19 pages, 3804 KB  
Article
An Optimized CNN-BiLSTM-RF Temporal Framework Based on Relief Feature Selection and Adaptive Weight Integration: Rotary Kiln Head Temperature Prediction
by Jianke Gu, Yao Liu, Xiang Luo and Yiming Bo
Processes 2025, 13(12), 3891; https://doi.org/10.3390/pr13123891 - 2 Dec 2025
Abstract
The kiln head temperature of a rotary kiln is a core process parameter in cement clinker production, and its accurate prediction coupled with uncertainty quantification is crucial for process optimization, energy consumption control, and safe operation. To tackle the prediction challenges arising from strong multi-variable coupling and nonlinear time series characteristics, this paper proposes a prediction approach integrating feature selection, heterogeneous model ensemble, and probabilistic interval estimation. Firstly, the Relief algorithm is adopted to select key features and construct a time series feature set with high discriminability. Then, a hierarchical architecture encompassing deep feature extraction, heterogeneous model fusion, and probabilistic interval quantification is devised. CNN is utilized to extract spatial correlation features among multiple variables, while BiLSTM is employed to bidirectionally capture the long-term and short-term temporal dependencies of the temperature sequence, thereby forming a deep temporal–spatial feature representation. Subsequently, RF is introduced to establish a heterogeneous model ensemble mechanism, and dynamic weight allocation is implemented based on the Mean Absolute Error of the validation set to enhance the modeling capability for nonlinear coupling relationships. Finally, Gaussian probabilistic regression is leveraged to generate multi-confidence prediction intervals for quantifying prediction uncertainty. Experiments on a real rotary kiln dataset demonstrate that the R2 of the proposed model is improved by up to 15.5% compared with single CNN, BiLSTM and RF models, and the Mean Absolute Error is reduced by up to 27.7%, which indicates that the model exhibits strong robustness to the dynamic operating conditions of the rotary kiln and provides both an accuracy guarantee and a basis for risk quantification in process decision-making.
This method offers a new paradigm integrating feature selection, adaptive heterogeneous model collaboration, and uncertainty quantification for industrial multi-variable nonlinear time series prediction, and its hierarchical modeling concept is valuable for the intelligent perception of complex process industrial parameters. Full article
(This article belongs to the Special Issue Transfer Learning Methods in Equipment Reliability Management)
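Dynamic weight allocation based on validation MAE can be realized as inverse-MAE normalization — one plausible reading of the mechanism; the abstract does not publish the exact formula:

```python
import numpy as np

def inverse_mae_weights(maes, eps=1e-8):
    """Ensemble weights inversely proportional to each model's validation
    MAE, normalized to sum to one. eps guards against a zero MAE."""
    inv = 1.0 / (np.asarray(maes, dtype=float) + eps)
    return inv / inv.sum()
```

A model with half the validation MAE of another receives twice its weight, so the blended forecast leans toward whichever of CNN-BiLSTM or RF is currently more accurate.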

23 pages, 3169 KB  
Article
A Risk-Driven Probabilistic Framework for Blast Vibrations in Twin Tunnels: Integrating Monte Carlo Simulation to Quantify Cavity Effects
by Abdulkadir Karadogan, Meric Can Ozyurt, Ulku Kalayci Sahinoglu, Umit Ozer and Abdurrahim Akgundogdu
Appl. Sci. 2025, 15(23), 12643; https://doi.org/10.3390/app152312643 - 28 Nov 2025
Abstract
Predicting blast-induced vibrations in twin tunnels is challenging due to complex wave-cavity interactions, which render conventional scaled-distance (PPV-SD) models inadequate. This study introduces a hybrid empirical-probabilistic framework to quantify the probability of exceeding regulatory vibration thresholds. Field data from the Northern Marmara Highway project first quantitatively confirm the severe degradation of the classical scaled-distance (PPV-SD) method in twin-tunnel geometry, reducing a strong correlation (R = 0.82) to insignificance. A Random Forest sensitivity analysis, applied to 123 blast records, ranked the governing parameters, guiding the development of a deterministic multi-parameter regression model (R = 0.72). The core innovation of this framework is the embedding of this deterministic model within a Monte Carlo Simulation (MCS) to propagate documented input uncertainties, thereby generating a full probability distribution for PPV. This represents a fundamental advance beyond deterministic point-estimates, as it enables the direct calculation of exceedance probabilities for risk-informed decision-making. For instance, for a regulatory threshold of 10 mm/s, the framework quantified the exceedance probability as P (PPV > 10 mm/s) = 5.2%. The framework’s robustness was demonstrated via validation against 100 independent blast records, which showed strong calibration with 94% of observed PPV values captured within the model’s 90% confidence interval. This computationally efficient framework (<10,000 iterations) provides engineers with a practical tool for moving from deterministic safety factors to quantifiable, risk-informed decision-making. Full article
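The exceedance-probability calculation can be sketched by propagating input uncertainty through a generic scaled-distance attenuation law, PPV = k · (distance / √charge)^(−b). All distributions and constants below are invented placeholders, not the paper's fitted multi-parameter regression:

```python
import math
import random

def exceedance_probability(threshold=10.0, n_trials=50_000, seed=7):
    """Monte Carlo estimate of P(PPV > threshold) under a hypothetical
    attenuation model with normally distributed inputs (all assumed)."""
    rng = random.Random(seed)
    count = 0
    for _ in range(n_trials):
        k = rng.gauss(400.0, 40.0)         # site constant (assumed)
        b = rng.gauss(1.6, 0.1)            # attenuation exponent (assumed)
        distance = rng.gauss(100.0, 5.0)   # m (assumed)
        charge = rng.gauss(50.0, 5.0)      # kg per delay (assumed)
        sd = distance / math.sqrt(max(charge, 1.0))  # scaled distance
        ppv = k * sd ** (-b)               # mm/s
        if ppv > threshold:
            count += 1
    return count / n_trials
```

The point is the workflow, not the numbers: instead of a single deterministic PPV, the simulation yields a full distribution from which P(PPV > 10 mm/s) is read off directly.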

24 pages, 12859 KB  
Article
A Hybrid EMD–LASSO–MCQRNN–KDE Framework for Probabilistic Electric Load Forecasting Under Renewable Integration
by Haoran Kong, Bingshuai Li and Yunhao Sun
Processes 2025, 13(12), 3781; https://doi.org/10.3390/pr13123781 - 23 Nov 2025
Abstract
Accurate probabilistic load forecasting is essential for secure power system operation and efficient energy management, particularly under increasing renewable integration and demand-side complexity. However, traditional forecasting methods often struggle with issues such as non-linearity, non-stationarity, feature redundancy, and quantile crossing, which hinder reliable uncertainty quantification. To overcome these challenges, this study proposes a hybrid probabilistic load forecasting framework that integrates empirical mode decomposition (EMD), LASSO-based feature selection, and a monotone composite quantile regression neural network (MCQRNN) enhanced with kernel density estimation (KDE). First, EMD decomposes the raw load series into intrinsic mode functions and a trend component to mitigate non-stationarity. Then, LASSO selects the most informative features from both the decomposed components and the original time series, effectively reducing dimensionality and multicollinearity. Subsequently, the proposed MCQRNN model generates multiple quantiles under monotonicity constraints, eliminating quantile crossing and improving multi-quantile coherence through a composite loss function. Finally, Gaussian kernel density estimation reconstructs a continuous probability density function from the predicted quantiles, enabling full distributional forecasting. The framework is evaluated on two public datasets—GEFCom2014 and ISO New England—using point, interval, and density evaluation metrics. Experimental results demonstrate that the proposed EMD–LASSO–MCQRNN–KDE model outperforms benchmark approaches in both point and probabilistic forecasting, providing a robust and interpretable solution for uncertainty-aware grid operation and energy planning. Full article
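The final KDE step — turning discrete predicted quantiles into a continuous density — can be sketched with a Gaussian kernel and Silverman's bandwidth rule (a simplified stand-in for the paper's estimator):

```python
import numpy as np

def kde_from_quantiles(q_values, bandwidth=None):
    """Gaussian kernel density estimate built on predicted quantile values,
    returning a callable pdf."""
    q = np.asarray(q_values, dtype=float)
    if bandwidth is None:
        # Silverman's rule of thumb for the kernel bandwidth.
        bandwidth = 1.06 * np.std(q) * len(q) ** (-1 / 5)
    def pdf(x):
        x = np.asarray(x, dtype=float)
        z = (x[..., None] - q) / bandwidth
        return np.exp(-0.5 * z ** 2).sum(axis=-1) / (len(q) * bandwidth * np.sqrt(2 * np.pi))
    return pdf
```

Evaluating the returned `pdf` on a grid gives a smooth density whose mass integrates to one, which is what makes full distributional forecasting possible from a finite set of quantile outputs.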

26 pages, 2405 KB  
Article
Uncertainty-Aware QoS Forecasting with BR-LSTM for Esports Networks
by Ching-Fang Yang
Information 2025, 16(12), 1016; https://doi.org/10.3390/info16121016 - 21 Nov 2025
Viewed by 563
Abstract
Reliable forecasting of network QoS indicators such as latency, jitter, and packet loss is essential for managing real-time and risk-sensitive applications. This study addresses the challenge of uncertainty quantification in QoS prediction by proposing a Bayesian Regression-enhanced Long Short-Term Memory (BR-LSTM) framework. The method integrates Bayesian mean-variance estimates into sequential LSTM learning to enable accurate point forecasts and well-calibrated confidence intervals. Experiments are conducted using a Mininet-based emulation platform that simulates dynamic esports network environments. The proposed model is benchmarked against ten probabilistic and deterministic baselines, including ARIMA, Gaussian Process Regression, Bayesian Neural Networks, and Monte Carlo Dropout LSTM. Results demonstrate that BR-LSTM achieves competitive accuracy while providing uncertainty intervals that improve decision confidence for Service-Level Agreement (SLA) management. The calibrated upper bound (μ+kσ) can be compared directly against SLA thresholds to issue early warnings and prioritize rerouting, pacing, or bitrate adjustments when the bound approaches or exceeds policy limits, while calibration controls false alarms and prevents unnecessary interventions. The findings highlight the potential of uncertainty-aware forecasting for intelligent information systems in latency-critical networks. Full article
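The SLA comparison described here amounts to a simple threshold test on the calibrated upper bound μ+kσ. A minimal sketch, assuming per-timestep predictive means and standard deviations are available; the helper name, the choice k=1.64, and the toy latency numbers are illustrative assumptions, not from the paper.

```python
import numpy as np

def sla_alerts(mu, sigma, threshold, k=1.64):
    """Flag timesteps where the calibrated upper bound mu + k*sigma
    reaches the SLA threshold (illustrative helper; k=1.64 approximates
    a one-sided 95% bound under a Gaussian predictive distribution)."""
    upper = np.asarray(mu, dtype=float) + k * np.asarray(sigma, dtype=float)
    return upper >= threshold

# Toy latency forecast (ms): predictive means and standard deviations
mu = [38.0, 41.0, 45.0, 47.0]
sigma = [2.0, 2.5, 4.0, 5.0]
alerts = sla_alerts(mu, sigma, threshold=50.0)  # SLA: latency under 50 ms
```

Raising alerts on the upper bound rather than the point forecast is what makes the scheme risk-sensitive: a timestep with a moderate mean but high predictive variance can still trigger rerouting before the SLA is actually breached.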
(This article belongs to the Special Issue New Deep Learning Approach for Time Series Forecasting, 2nd Edition)

30 pages, 3310 KB  
Article
Probabilistic Analysis of Solar and Wind Energy Potentials at Geographically Diverse Locations for Sustainable Renewable Integration
by Satyam Patel, N. P. Patidar and Mohan Lal Kolhe
Energies 2025, 18(22), 6076; https://doi.org/10.3390/en18226076 - 20 Nov 2025
Viewed by 376
Abstract
The use of conventional fuel sources to generate electrical power leads to several environmental issues, such as carbon emissions and ozone depletion. Energy generation from renewable sources is among the most affordable and cleanest alternatives. However, generating power from non-conventional sources such as solar and wind requires identifying locations where these resources are plentiful and easily accessible. In this study, the solar and wind potential is investigated at five sites in different regions of India. Data on solar irradiance (W/m2) and wind speed (m/s) were taken at hourly intervals from the "NASA POWER DAV v.2.5.22" Data Access Viewer created by NASA, covering a ten-year period from January 2014 to December 2023. The solar and wind potential analysis was performed probabilistically to determine the parameters that support the installation of solar-PV panels and wind energy generators at the examined sites. The Beta and Weibull probability density functions (PDFs) were used to model the solar and wind resources, respectively, with their parameters estimated via the Maximum Likelihood method, which is known for its accuracy and efficiency on large datasets. Key performance prediction indicators were analyzed for the investigated solar and wind locations. The findings provide valuable insights that support renewable energy planning and the optimal design of hybrid power systems. Full article
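The Weibull parameter estimation named in this abstract can be sketched with the standard maximum-likelihood fixed-point iteration for the shape parameter, after which the scale has a closed form. This is a minimal sketch of the method, not the authors' implementation; the synthetic wind-speed sample and starting guess are assumptions.

```python
import numpy as np

def weibull_mle(x, tol=1e-8, max_iter=200):
    """Maximum-likelihood estimates of the Weibull shape k and scale c.
    Iterates the standard fixed-point equation for k, then solves c
    in closed form."""
    x = np.asarray(x, dtype=float)
    lnx = np.log(x)
    mean_lnx = lnx.mean()
    k = 1.0  # initial guess for the shape parameter
    for _ in range(max_iter):
        xk = x ** k
        # Fixed point: 1/k = sum(x^k ln x)/sum(x^k) - mean(ln x)
        k_new = 1.0 / ((xk * lnx).sum() / xk.sum() - mean_lnx)
        if abs(k_new - k) < tol:
            k = k_new
            break
        k = k_new
    c = np.mean(x ** k) ** (1.0 / k)  # closed-form MLE for the scale
    return k, c

# Synthetic hourly wind speeds (m/s) with known shape 2.0 and scale 6.0 m/s
rng = np.random.default_rng(0)
speeds = rng.weibull(2.0, size=20000) * 6.0
k_hat, c_hat = weibull_mle(speeds)
```

With a decade of hourly observations per site, as in the study, the sample is large enough that these estimates are tight, which is consistent with the abstract's remark about the method's efficiency on large datasets.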
(This article belongs to the Special Issue Energy Management of Renewable Energy Systems)
