Search Results (327)

Search Parameters:
Keywords = probabilistic density

41 pages, 995 KiB  
Article
A Max-Flow Approach to Random Tensor Networks
by Khurshed Fitter, Faedi Loulidi and Ion Nechita
Entropy 2025, 27(7), 756; https://doi.org/10.3390/e27070756 - 15 Jul 2025
Abstract
The entanglement entropy of a random tensor network (RTN) is studied using tools from free probability theory. Random tensor networks are simple toy models that help in understanding the entanglement behavior of a boundary region in the anti-de Sitter/conformal field theory (AdS/CFT) context. These can be regarded as specific probabilistic models for tensors with particular geometry dictated by a graph (or network) structure. First, we introduce a model of RTN obtained by contracting maximally entangled states (corresponding to the edges of the graph) on the tensor product of Gaussian tensors (corresponding to the vertices of the graph). The entanglement spectrum of the resulting random state is analyzed along a given bipartition of the local Hilbert spaces. The limiting eigenvalue distribution of the reduced density operator of the RTN state is provided in the limit of large local dimension. This limiting value is described through a maximum flow optimization problem in a new graph corresponding to the geometry of the RTN and the given bipartition. In the case of series-parallel graphs, an explicit formula for the limiting eigenvalue distribution is provided using classical and free multiplicative convolutions. The physical implications of these results are discussed, allowing the analysis to move beyond the semiclassical regime without any cut assumption, specifically in terms of finite corrections to the average entanglement entropy of the RTN. Full article
(This article belongs to the Section Quantum Information)
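The max-flow characterization lends itself to a quick numerical illustration. The sketch below is a hypothetical three-vertex toy network (not one from the paper): a plain Edmonds-Karp routine computes the min cut separating the boundary region from its complement, and, under the assumption that every contracted edge carries a maximally entangled pair of local dimension d, the leading entropy term is the min-cut weight times log d.

```python
from collections import deque
import math

def max_flow(capacity, s, t):
    """Edmonds-Karp maximum flow; capacity is a dict {u: {v: cap}}."""
    res = {u: dict(nbrs) for u, nbrs in capacity.items()}
    for u in list(res):
        for v in res[u]:
            res.setdefault(v, {}).setdefault(u, 0)  # reverse residual edges
    flow = 0
    while True:
        parent = {s: None}
        queue = deque([s])
        while queue and t not in parent:            # BFS for an augmenting path
            u = queue.popleft()
            for v, c in res.get(u, {}).items():
                if c > 0 and v not in parent:
                    parent[v] = u
                    queue.append(v)
        if t not in parent:
            return flow
        path, v = [], t
        while parent[v] is not None:
            path.append((parent[v], v))
            v = parent[v]
        bottleneck = min(res[u][w] for u, w in path)
        for u, w in path:
            res[u][w] -= bottleneck
            res[w][u] += bottleneck
        flow += bottleneck

# hypothetical RTN graph: source "A" = boundary region, sink "B" = complement;
# every internal edge has capacity 1, so the min cut counts severed edges
graph = {"A": {"x": 1, "y": 1}, "x": {"y": 1, "B": 1}, "y": {"B": 1}}
cut_weight = max_flow(graph, "A", "B")
d = 3
leading_entropy = cut_weight * math.log(d)  # leading term of the average entanglement entropy
```

The toy graph admits two edge-disjoint paths from A to B, so the min cut severs two edges and the leading entropy term is 2 log d.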
29 pages, 1971 KiB  
Article
Mathematical Model of Data Processing in a Personalized Search Recommendation System for Digital Collections
by Serhii Semenov, Wojciech Baran, Magdalena Andrzejewska, Maxim Pochebut, Inna Petrovska, Oksana Sitnikova, Marharyta Melnyk and Anastasiya Mekhovykh
Appl. Sci. 2025, 15(13), 7583; https://doi.org/10.3390/app15137583 - 6 Jul 2025
Abstract
This paper presents a probabilistic-temporal modeling approach for analyzing data processing stages in a personalized recommendation system for digital heritage collections. The methodology is based on the Graphical Evaluation and Review Technique (GERT) network formalism, which enables the representation of complex probabilistic workflows with feedback loops and alternative branches. For each processing stage, corresponding GERT schemes were developed, and equivalent transfer functions were derived. Using Laplace transform inversion techniques, probability density functions of processing time were recovered, followed by the calculation of key statistical metrics, including expectation, standard deviation, and quantiles. The results demonstrate that the proposed approach allows for detailed temporal performance evaluation, including the estimation of time exceedance probabilities at each stage. This provides a quantitative basis for optimizing recommendation system design and highlights the applicability of GERT-based modeling to intelligent data-driven services in the cultural domain. Full article
(This article belongs to the Special Issue Advanced Models and Algorithms for Recommender Systems)
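The GERT machinery can be sanity-checked on the simplest loop: a stage with exponential service rate lam and a feedback branch taken with probability 1 - p (illustrative numbers, not one of the paper's schemes). The geometric sum of branch transmittances collapses the equivalent transfer function to W(s) = p*lam / (s + p*lam), i.e. the stage time is itself exponential with rate p*lam, so its expectation is 1/(p*lam); a Monte Carlo run of the loop recovers that mean.

```python
import random

def stage_time(p_success, rate, rng):
    """One pass through a GERT-style stage: exponential service, feedback retry on failure."""
    t = 0.0
    while True:
        t += rng.expovariate(rate)      # service time of this attempt
        if rng.random() < p_success:    # exit branch taken with probability p
            return t

rng = random.Random(0)
p, lam = 0.8, 2.0
n = 200_000
mean_time = sum(stage_time(p, lam, rng) for _ in range(n)) / n
# equivalent transfer function W(s) = p*lam / (s + p*lam)  =>  E[T] = 1/(p*lam)
```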

28 pages, 47806 KiB  
Article
Experimental Validation of UAV Search and Detection System in Real Wilderness Environment
by Stella Dumenčić, Luka Lanča, Karlo Jakac and Stefan Ivić
Drones 2025, 9(7), 473; https://doi.org/10.3390/drones9070473 - 3 Jul 2025
Abstract
Search and rescue (SAR) missions require reliable search methods to locate survivors, especially in challenging environments. Introducing unmanned aerial vehicles (UAVs) can enhance the efficiency of SAR missions while simultaneously increasing the safety of everyone involved. Motivated by this, we experiment with autonomous UAV search for humans in a Mediterranean karst environment. The UAVs are directed using the Heat equation-driven area coverage (HEDAC) ergodic control method based on a known probability density and detection function. The sensing framework consists of a probabilistic search model, a motion control system, and object detection, which together allow the target’s detection probability to be calculated. This paper focuses on the experimental validation of the proposed sensing framework. A uniform probability density, achieved by assigning suitable tasks to 78 volunteers, ensures an even probability of finding targets. The detection model is based on the You Only Look Once (YOLO) model trained on a previously collected orthophoto image database. The experimental search is carefully planned and conducted, while recording as many parameters as possible. The thorough analysis includes the motion control system, object detection, and search validation. The assessment of the detection and search performance strongly indicates that the detection model in the UAV control algorithm is aligned with real-world results. Full article
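The interplay of a probability density and a detection function can be illustrated on a discretized domain: each visit to a cell removes a fraction of the undetected target probability mass, and the accumulated detection probability is one minus the remaining mass. The cell count and per-visit detection probability below are hypothetical, not values from the field experiment.

```python
cells = 10
undetected = [1.0 / cells] * cells   # uniform prior target density, as in the experiment
p_visit = 0.6                        # assumed per-visit sensor detection probability

def sweep(mass, visited, p):
    """Probability mass left undetected after visiting the given cells once."""
    return [m * (1 - p) if i in visited else m for i, m in enumerate(mass)]

undetected = sweep(undetected, set(range(cells)), p_visit)  # one full pass
undetected = sweep(undetected, {0, 1, 2}, p_visit)          # revisit a sub-region
p_detect = 1.0 - sum(undetected)
```

After the full pass 40% of the mass remains; revisiting three cells shrinks their share again, giving an overall detection probability of 0.672.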

23 pages, 678 KiB  
Article
Unified Probabilistic and Similarity-Based Position Estimation from Radio Observations
by Max Werner, Markus Bullmann, Toni Fetzer and Frank Deinzer
Sensors 2025, 25(13), 4092; https://doi.org/10.3390/s25134092 - 30 Jun 2025
Abstract
We propose a modeling approach for position estimation based on the observed radio propagation in an environment. The approach is purely similarity-based and therefore free of explicit physical assumptions. What distinguishes it from classical related methods are probabilistic position estimates. Instead of just providing a point estimate for a given signal sequence, our model returns the distribution of possible positions as a continuous probability density function, which allows for appropriate integration into recursive state estimation systems. The estimation procedure starts by using a kernel to compare incoming data with reference recordings from known positions. Based on the obtained similarities, weights are assigned to the reference positions. An arbitrarily chosen density estimation method is then applied given this assignment. Thus, a continuous representation of the distribution of possible positions in the environment is provided. We apply the solution in a Particle Filter (PF) system for smartphone-based indoor localization. The approach is tested both with radio signal strength (RSS) measurements (Wi-Fi and Bluetooth Low Energy RSSI) and round-trip time (RTT) measurements, given by Wi-Fi Fine Timing Measurement. Compared to distance-based models, which are dedicated to the specific physical properties of each measurement type, our similarity-based model achieved overall higher accuracy at tracking pedestrians under realistic conditions. Since it does not explicitly consider the physics of radio propagation, the proposed model has also been shown to work flexibly with either RSS or RTT observations. Full article
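The pipeline described (kernel similarities on signal space, weights on reference positions, then a density estimate) can be sketched in a few lines. The fingerprint database, kernel bandwidths, and the choice of a Gaussian KDE below are all illustrative assumptions, not the paper's configuration.

```python
import math

# hypothetical fingerprint database: reference position (m) -> RSS vector (dBm)
refs = {
    (0.0, 0.0): [-40.0, -70.0],
    (5.0, 0.0): [-55.0, -60.0],
    (10.0, 0.0): [-70.0, -45.0],
}

def signal_kernel(a, b, h=5.0):
    """Gaussian similarity kernel on signal space; h is an assumed bandwidth in dB."""
    d2 = sum((x - y) ** 2 for x, y in zip(a, b))
    return math.exp(-d2 / (2.0 * h * h))

def position_estimate(obs, bw=1.5):
    """Kernel weights on reference positions -> continuous pdf plus a point estimate."""
    w = {pos: signal_kernel(obs, rss) for pos, rss in refs.items()}
    total = sum(w.values())

    def pdf(x, y):
        # weighted Gaussian KDE over the reference positions
        return sum(
            (wi / total)
            * math.exp(-((x - px) ** 2 + (y - py) ** 2) / (2.0 * bw * bw))
            / (2.0 * math.pi * bw * bw)
            for (px, py), wi in w.items()
        )

    point = (
        sum(px * wi for (px, _), wi in w.items()) / total,
        sum(py * wi for (_, py), wi in w.items()) / total,
    )
    return pdf, point

pdf, point = position_estimate([-54.0, -61.0])  # observation closest to the middle fingerprint
```

Returning `pdf` rather than only `point` is what allows the estimate to serve as a measurement model inside a particle filter.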

16 pages, 5939 KiB  
Article
Modeling the Effects of Underground Brine Extraction on Shallow Groundwater Flow and Oilfield Fluid Leakage Pathways in the Yellow River Delta
by Jingang Zhao, Xin Yuan, Hu He, Gangzhu Li, Qiong Zhang, Qiyun Wang, Zhenqi Gu, Chenxu Guan and Guoliang Cao
Water 2025, 17(13), 1943; https://doi.org/10.3390/w17131943 - 28 Jun 2025
Abstract
The distribution of fresh and salty groundwater is a critical factor affecting the coastal wetlands. However, the dynamics of groundwater flow and salinity in river deltas remain unclear due to complex hydrological settings and impacts of human activities. The uniqueness of the Yellow River Delta (YRD) lies in its relatively short formation time, the frequent salinization and freshening alternation associated with changes in the course of the Yellow River, and the extensive impacts of oil production and underground brine extraction. This study employed a detailed hydrogeological modeling approach to investigate groundwater flow and the impacts of oil field brine leakage in the YRD. To characterize the heterogeneity of the aquifer, a sediment texture model was constructed based on a geotechnical borehole database for the top 30 m of the YRD. A detailed variable-density groundwater model was then constructed to simulate the salinity distribution in the predevelopment period and the disturbance by brine extraction in the past decades. Probabilistic particle tracking simulation was implemented to assess the alterations in groundwater flow resulting from brine resource development and to evaluate the potential risk of salinity contamination from oil well fields. Simulations show that the limited extraction of brine groundwater has significantly altered the hydraulic gradient and groundwater flow pattern, owing to the low permeability of the delta sediments. The vertical gradient, increased by brine pumping, has mitigated the salinization of the shallow groundwater that supports the coastal wetlands. The low groundwater velocity and long travel times suggest that the peak salinity concentration would be greatly reduced by dispersion and dilution before reaching the deep aquifers. Further detailed investigation of the complex groundwater salinization process in the YRD is warranted, including its association with alterations in the hydraulic gradient caused by brine extraction and by water injection/production in the oilfield. Full article
(This article belongs to the Section Hydrogeology)

33 pages, 7235 KiB  
Review
Hysteresis Modeling of Soft Pneumatic Actuators: An Experimental Review
by Jesús de la Morena, Francisco Ramos and Andrés S. Vázquez
Actuators 2025, 14(7), 321; https://doi.org/10.3390/act14070321 - 27 Jun 2025
Abstract
Hysteresis is a nonlinear phenomenon found in many physical systems, including soft viscoelastic actuators, where it poses significant challenges to their application and performance. Consequently, developing accurate hysteresis models is essential for the effective design and optimization of soft actuators. Moreover, a reliable model can be used to design compensators that mitigate the negative effects of hysteresis, improving closed-loop control accuracy and expanding the applicability of soft actuators in robotics. Physics-based approaches for modeling hysteresis in soft actuators offer valuable insights into the underlying material behavior. Nevertheless, they are often highly complex, making them impractical for real-world applications. Instead, phenomenological models provide a more feasible solution by representing hysteresis through input–output mappings based on experimental data. To effectively fit these phenomenological models, it is essential to rely on sensing data collected from real actuators. In this context, the primary objective of this work is a comprehensive comparative evaluation of the efficiency and performance of representative phenomenological hysteresis models (e.g., Bouc–Wen and Prandtl–Ishlinskii) using experimental data obtained from a pneumatic bending actuator made of a viscoelastic material. This evaluation suggests that the Generalized Prandtl–Ishlinskii model achieves the highest modeling accuracy, while the Preisach model with a probability density function formulation excels in terms of parameter compactness. Full article
(This article belongs to the Special Issue Advanced Mechanism Design and Sensing for Soft Robotics)
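Of the phenomenological models compared, Bouc–Wen is the most compact, so its discrete update is easy to sketch. The parameters below are illustrative, chosen with beta = gamma so that the loading branch has the closed form z = 1 - exp(-x); the nonzero value of z on returning to x = 0 is the history dependence that produces the hysteresis loop.

```python
import math

def bouc_wen(xs, A=1.0, beta=0.5, gamma=0.5, n=1):
    """Hysteretic variable z driven by displacement samples xs (explicit Euler):
    dz = A*dx - beta*|dx|*|z|**(n-1)*z - gamma*dx*|z|**n."""
    z, prev, hist = 0.0, xs[0], [0.0]
    for x in xs[1:]:
        dx = x - prev
        z += A * dx - beta * abs(dx) * abs(z) ** (n - 1) * z - gamma * dx * abs(z) ** n
        prev = x
        hist.append(z)
    return hist

steps = 1000
loading = [i / steps for i in range(steps + 1)]             # x: 0 -> 1
unloading = [1.0 - i / steps for i in range(1, steps + 1)]  # x: 1 -> 0
z_path = bouc_wen(loading + unloading)
z_peak = z_path[steps]    # z at the end of loading, analytically 1 - exp(-1)
z_return = z_path[-1]     # z back at x = 0: nonzero residual => hysteresis
```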

28 pages, 7731 KiB  
Article
AC-Induced Corrosion of Cathodically Protected Pipelines: Experimental Study and Probabilistic Modeling
by Yuhan Su, Emadoddin Majdabadi Farahani, Qindan Huang and Qixin Zhou
Corros. Mater. Degrad. 2025, 6(2), 26; https://doi.org/10.3390/cmd6020026 - 19 Jun 2025
Abstract
This study investigated the effects of alternating current (AC) interference on pipeline steel under cathodic protection (CP). In a simulated solution, real-time electrochemical measurements and corrosion rate analysis were conducted on two steel types (C1018 and X60) under various levels of AC interference with CP. Due to the complexity of AC-induced corrosion, relying on the shift in DC potential alone cannot accurately demonstrate the corrosion behavior in the presence of AC interference. In fact, such an approach may mislead the predictions of corrosion performance. It is observed that AC interference reduced the effectiveness of CP and increased the corrosion rate of the steel, as measured by both weight loss and Tafel extrapolation. The study concluded that conventional CP standards used in the field were inadequate in the presence of high levels of AC interference. Furthermore, this study found that a more negative CP current density (−0.75 A/m2) could reduce the effect of AC interference by 46–93%. This is particularly evident in the case of low-level AC interference, where the reduction can reach up to 93%. Utilizing the experimental data obtained by the two measurement methods, probabilistic models to predict the corrosion rate were developed with consideration of the uncertainty in the measurements. The sensitivity analysis showed how AC interference impacts the corrosion rate for a given CP level. Full article

15 pages, 6013 KiB  
Article
Urban Air Mobility Vertiport’s Capacity Simulation and Analysis
by Antoni Kopyt and Sebastian Dylicki
Aerospace 2025, 12(6), 560; https://doi.org/10.3390/aerospace12060560 - 19 Jun 2025
Abstract
This study presents a comprehensive simulation to assess and enhance the throughput capacity of unmanned air system vertiports, one of the most essential elements of urban air mobility ecosystems. The framework integrates dynamic grid-based spatial management, probabilistic mission duration algorithms, and EASA-compliant operational protocols to address the infrastructural and logistical demands of high-density UAS operations. The study focuses on two use cases—high-frequency food delivery utilizing small UASs and extended-range package logistics with larger UASs—and the model incorporates adaptive vertiport zoning strategies, segregating operations into dedicated sectors for battery charging, swapping, and cargo handling to enable parallel processing and mitigate congestion. The simulation evaluates critical variables such as vertiport dimensions, UAS fleet composition, and mission duration ranges while emphasizing scalability, safety, and compliance with evolving regulatory standards. By examining the interplay between infrastructure design, operational workflows, and resource allocation, the research provides a versatile tool for urban planners and policymakers to optimize vertiport layouts and traffic management protocols. Its modular architecture supports future extensions. This work underscores the necessity of adaptive, data-driven planning to harmonize vertiport functionality with the dynamic demands of urban air mobility, ensuring interoperability, safety, and long-term scalability. Full article
(This article belongs to the Special Issue Operational Requirements for Urban Air Traffic Management)

34 pages, 6941 KiB  
Article
Integrating Soil Parameter Uncertainty into Slope Stability Analysis: A Case Study of an Open Pit Mine in Hungary
by Petra Oláh and Péter Görög
Geosciences 2025, 15(6), 222; https://doi.org/10.3390/geosciences15060222 - 12 Jun 2025
Abstract
This study presents a probabilistic geotechnical analysis of the Visonta Keleti-III lignite mining area, focusing on the statistical evaluation of soil parameters and their integration into slope stability modeling. The objective was to provide a more accurate representation of the spatial variability of geological formations and mechanical soil properties in contrast to traditional deterministic approaches. The analysis was based on over 3300 laboratory samples from 28 boreholes, processed through multi-stage outlier filtering and regression techniques. Strong correlations were identified between physical soil parameters—such as wet and dry bulk density, void ratio, and plasticity index—particularly in cohesive soils. The probabilistic slope stability analysis applied the Bishop simplified method in combination with Latin Hypercube simulation. Results demonstrate that traditional methods tend to underestimate slope failure risk, whereas the probabilistic approach reveals failure probabilities ranging from 0% to 46.7% across different sections. The use of tailored statistical tools—such as Python-based filtering algorithms and distribution fitting via MATLAB—enabled more realistic modeling of geotechnical behavior. The findings emphasize the necessity of statistical methodologies in mine design, particularly in geologically heterogeneous, multilayered environments, where spatial uncertainty plays a critical role in slope stability assessments. Full article
(This article belongs to the Section Geomechanics)
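The combination of Latin Hypercube sampling with a limit-equilibrium stability check can be sketched compactly. The parameter distributions and the factor-of-safety expression below are toy stand-ins (not the Visonta values and not an actual Bishop analysis); the point is how stratified sampling of soil parameters turns a deterministic calculation into a failure probability.

```python
import math
import random
from statistics import NormalDist

def latin_hypercube(n, dims, rng):
    """n stratified samples in [0,1)^dims: exactly one sample per stratum per dimension."""
    cols = []
    for _ in range(dims):
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        cols.append(col)
    return list(zip(*cols))

# assumed parameter distributions (illustrative only)
c_dist = NormalDist(mu=20.0, sigma=5.0)    # cohesion, kPa
phi_dist = NormalDist(mu=25.0, sigma=3.0)  # friction angle, degrees

def factor_of_safety(c, phi_deg):
    """Toy stand-in for a Bishop-type FS: resisting over driving shear."""
    driving = 30.0  # assumed driving shear stress, kPa
    return (c + 40.0 * math.tan(math.radians(phi_deg))) / driving

rng = random.Random(42)
n = 2000
fs = [
    factor_of_safety(c_dist.inv_cdf(u), phi_dist.inv_cdf(v))
    for u, v in latin_hypercube(n, 2, rng)
]
p_failure = sum(f < 1.0 for f in fs) / n   # probability of FS < 1
```

A deterministic run at the mean parameters gives FS of about 1.29 and would report the slope as safe, while the probabilistic run still assigns a nonzero failure probability, which is the paper's central point.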

33 pages, 5060 KiB  
Article
The Extreme Value Support Measure Machine for Group Anomaly Detection
by Lixuan An, Bernard De Baets and Stijn Luca
Mathematics 2025, 13(11), 1813; https://doi.org/10.3390/math13111813 - 29 May 2025
Abstract
Group anomaly detection is a subfield of pattern recognition that aims at detecting anomalous groups rather than individual anomalous points. However, existing approaches mainly target the unusual aggregate of points in high-density regions. In this way, unusual group behavior with a number of points located in low-density regions is not fully detected. In this paper, we propose a systematic approach based on extreme value theory (EVT), a field of statistics adept at modeling the tails of a distribution where data are sparse, and one-class support measure machines (OCSMMs) to quantify anomalous group behavior comprehensively. First, by applying EVT to a point process model, we construct an analytical model describing the likelihood of an aggregate within a group with respect to low-density regions, aimed at capturing anomalous group behavior in such regions. This model is then combined with a calibrated OCSMM, which provides probabilistic outputs to characterize anomalous group behavior in high-density regions, enabling improved assessment of overall anomalous group behavior. Extensive experiments on simulated and real-world data demonstrate that our method outperforms existing group anomaly detectors across diverse scenarios, showing its effectiveness in quantifying and interpreting various types of anomalous group behavior. Full article
(This article belongs to the Section D1: Probability and Statistics)
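The EVT ingredient, modeling the sparse tail of a distribution, can be illustrated with a peaks-over-threshold estimate on synthetic data. This sketch assumes an exponential tail (a generalized Pareto with shape 0), which is not the paper's point-process model: the tail probability beyond the threshold is extrapolated from the exceedance fraction and the mean excess.

```python
import math
import random

rng = random.Random(3)
n = 100_000
data = [rng.expovariate(1.0) for _ in range(n)]   # synthetic sample with true tail exp(-x)

u = 2.0                                    # threshold into the low-density tail
excess = [x - u for x in data if x > u]
p_u = len(excess) / n                      # empirical P(X > u)
mean_excess = sum(excess) / len(excess)    # fits an exponential (GPD, shape 0) tail

def tail_prob(x):
    """POT estimate of P(X > x) for x above the threshold."""
    return p_u * math.exp(-(x - u) / mean_excess)

estimate = tail_prob(5.0)   # true value is exp(-5), far beyond most of the sample
```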

30 pages, 6136 KiB  
Article
Seismic Reliability Analysis of Highway Pile–Plate Structures Considering Dual Stochasticity of Parameters and Excitation via Probability Density Evolution
by Liang Huang, Ge Li, Chaowei Du, Yujian Guan, Shizhan Xu and Shuaitao Li
Infrastructures 2025, 10(6), 131; https://doi.org/10.3390/infrastructures10060131 - 28 May 2025
Abstract
This paper studies the impact of dual randomness of structural parameters and seismic excitation on the seismic reliability of highway pile–slab structures using the probability density evolution method. A nonlinear stochastic dynamic model was established, integrating for the first time the randomness of concrete material properties and seismic motion variability. The main findings include the following: Under deterministic seismic input, the displacement angle fluctuation range caused by structural parameter randomness is ±3%, and reliability decreases from 100% to 65.26%. For seismic excitation randomness, compared to structural parameter randomness, reliability at the 3.3% threshold decreases by 7.99%, reaching 92.01%. Dual randomness amplifies the variability of structural response, reducing reliability to 86.38% and 62%, with a maximum difference of 20.5% compared to single-factor scenarios. Compared to the Monte Carlo method, probability density evolution shows significant advantages in computational accuracy and efficiency for large-scale systems, revealing enhanced discreteness and irregularity under combined randomness. This study emphasizes the necessity of addressing dual randomness in seismic design, advancing probabilistic seismic assessment methods for complex engineering systems, thereby aiding the design phase in enhancing facility safety and providing a scientific basis for improved design specifications. Full article
(This article belongs to the Special Issue Seismic Engineering in Infrastructures: Challenges and Prospects)

26 pages, 529 KiB  
Article
A First-Order Autoregressive Process with Size-Biased Lindley Marginals: Applications and Forecasting
by Hassan S. Bakouch, M. M. Gabr, Sadiah M. A. Aljeddani and Hadeer M. El-Taweel
Mathematics 2025, 13(11), 1787; https://doi.org/10.3390/math13111787 - 27 May 2025
Abstract
In this paper, a size-biased Lindley (SBL) first-order autoregressive (AR(1)) process is proposed, the so-called SBL-AR(1). Some probabilistic and statistical properties of the proposed process are determined, including the distribution of its innovation process, the Laplace transformation function, multi-step-ahead conditional measures, autocorrelation, and spectral density function. In addition, the unknown parameters of the model are estimated via the conditional least squares and Gaussian estimation methods. The performance and behavior of the estimators are checked through some numerical results by a Monte Carlo simulation study. Additionally, two real-world datasets are utilized to examine the model’s applicability, and goodness-of-fit statistics are used to compare it to several pertinent non-Gaussian AR(1) models. The findings reveal that the proposed SBL-AR(1) model exhibits key theoretical properties, including a closed-form innovation distribution, multi-step conditional measures, and an exponentially decaying autocorrelation structure. Parameter estimation via conditional least squares and Gaussian methods demonstrates consistency and efficiency in simulations. Real-world applications to inflation expectations and water quality data reveal a superior fit over competing non-Gaussian AR(1) models, evidenced by lower values of the AIC and BIC statistics. Forecasting comparisons show that the classical conditional expectation method achieves accuracy comparable to some modern machine learning techniques, underscoring its practical utility for skewed and fat-tailed time series. Full article
(This article belongs to the Special Issue Statistical Simulation and Computation: 3rd Edition)
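One property cited for the SBL-AR(1) model, the exponentially decaying autocorrelation rho(k) = alpha**k, holds for any stationary AR(1) with i.i.d. finite-variance innovations, so it can be checked by simulation without the SBL innovation law. The sketch below uses exponential innovations as a stand-in for the (positive-valued) SBL innovations.

```python
import random

alpha = 0.6
rng = random.Random(1)
n = 100_000
x = [1.0]
for _ in range(n - 1):
    # exponential innovations as a stand-in for the size-biased Lindley innovation law
    x.append(alpha * x[-1] + rng.expovariate(1.0))

mu = sum(x) / n   # stationary mean E[eps] / (1 - alpha) = 2.5 here

def acf(series, lag, mean):
    """Sample autocorrelation at the given lag."""
    num = sum((series[i] - mean) * (series[i + lag] - mean) for i in range(len(series) - lag))
    den = sum((v - mean) ** 2 for v in series)
    return num / den
```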

21 pages, 1276 KiB  
Article
Quantifying Truthfulness: A Probabilistic Framework for Atomic Claim-Based Misinformation Detection
by Fahim Sufi and Musleh Alsulami
Mathematics 2025, 13(11), 1778; https://doi.org/10.3390/math13111778 - 27 May 2025
Abstract
The increasing sophistication and volume of misinformation on digital platforms necessitate scalable, explainable, and semantically granular fact-checking systems. Existing approaches typically treat claims as indivisible units, overlooking internal contradictions and partial truths, thereby limiting their interpretability and trustworthiness. This paper addresses this gap by proposing a novel probabilistic framework that decomposes complex assertions into semantically atomic claims and computes their veracity through a structured evaluation of source credibility and evidence frequency. Each atomic unit is matched against a curated corpus of 11,928 cyber-related news entries using a binary alignment function, and its truthfulness is quantified via a composite score integrating both source reliability and support density. The framework introduces multiple aggregation strategies—arithmetic and geometric means—to construct claim-level veracity indices, offering both sensitivity and robustness. Empirical evaluation across eight cyber misinformation scenarios—encompassing over 40 atomic claims—demonstrates the system’s effectiveness. The model achieves a Mean Squared Error (MSE) of 0.037, Brier Score of 0.042, and a Spearman rank correlation of 0.88 against expert annotations. When thresholded for binary classification, the system records a Precision of 0.82, Recall of 0.79, and an F1-score of 0.805. The Expected Calibration Error (ECE) of 0.068 further validates the trustworthiness of the score distributions. These results affirm the framework’s ability to deliver interpretable, statistically reliable, and operationally scalable misinformation detection, with implications for automated journalism, governmental monitoring, and AI-based verification platforms. Full article
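The aggregation step, combining atomic-claim scores into a claim-level veracity index by arithmetic or geometric mean, is easy to sketch. The claim texts, the reliability and support values, and the equal-weight convex combination below are all hypothetical illustrations, not the paper's scoring function or corpus.

```python
import math

# hypothetical atomic claims decomposed from one complex assertion:
# each carries a source-reliability score and a support-density score in [0, 1]
claims = [
    {"text": "breach occurred in March", "reliability": 0.9, "support": 0.8},
    {"text": "10M records leaked",       "reliability": 0.7, "support": 0.4},
    {"text": "state actor responsible",  "reliability": 0.3, "support": 0.1},
]

def atomic_score(c, w=0.5):
    """Composite veracity of one atomic claim (assumed convex combination)."""
    return w * c["reliability"] + (1 - w) * c["support"]

scores = [atomic_score(c) for c in claims]
arithmetic = sum(scores) / len(scores)                 # sensitive aggregate
geometric = math.prod(scores) ** (1 / len(scores))     # robust: dragged down by weak claims
```

The geometric mean penalizes the poorly supported third claim more heavily than the arithmetic mean, which is the sensitivity/robustness trade-off the abstract refers to.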

14 pages, 1984 KiB  
Article
Fast and Interpretable Probabilistic Solar Power Forecasting via a Multi-Observation Non-Homogeneous Hidden Markov Model
by Jiaxin Zhang and Siyuan Shang
Energies 2025, 18(10), 2602; https://doi.org/10.3390/en18102602 - 17 May 2025
Abstract
The increasing complexity and uncertainty associated with high renewable energy penetration require forecasting methods that provide more comprehensive information for risk analysis and energy management. This paper proposes a novel probabilistic forecasting model for solar power generation based on a non-homogeneous multi-observation Hidden Markov Model (HMM). The model is purely data-driven, free from restrictive assumptions, and features a lightweight structure that enables fast updates and transparent reasoning—offering a practical alternative to computationally intensive neural network approaches. The proposed framework is first formalized through an extension of the classical HMM and the derivation of its core inference procedures. A method for estimating the probability density distribution of solar power output is introduced, from which point forecasts are extracted. Thirteen model variants with different observation-dependency structures are constructed and evaluated using real PV operational data. Experimental results validate the model’s effectiveness in generating both prediction intervals and point forecasts, while also highlighting the influence of observation correlation on forecasting performance. The proposed approach demonstrates strong potential for real-time solar power forecasting in modern power systems, particularly where speed, adaptability, and interpretability are critical. Full article
(This article belongs to the Section A2: Solar Energy and Photovoltaic Systems)
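The core inference in such a model is forward filtering: predict the hidden regime, weight by the observation likelihood, normalize, and read the predictive output density off the resulting mixture. The two-regime setup below is a homogeneous toy (the paper's model makes the transitions non-homogeneous and uses multiple observations); all numbers are illustrative.

```python
import math

def gauss_pdf(x, mu, sigma):
    return math.exp(-((x - mu) ** 2) / (2.0 * sigma ** 2)) / (sigma * math.sqrt(2.0 * math.pi))

# two hypothetical regimes of normalized PV output: "cloudy" (0) and "clear" (1)
trans = [[0.9, 0.1], [0.2, 0.8]]        # assumed transition matrix
emission = [(0.2, 0.10), (0.7, 0.15)]   # assumed (mu, sigma) per regime

def forward_step(belief, obs):
    """One HMM forward-filtering step: predict, weight by likelihood, normalize."""
    predicted = [sum(belief[i] * trans[i][j] for i in range(2)) for j in range(2)]
    unnorm = [predicted[j] * gauss_pdf(obs, *emission[j]) for j in range(2)]
    z = sum(unnorm)
    return [u / z for u in unnorm]

belief = [0.5, 0.5]
for obs in [0.65, 0.72, 0.70]:          # a run of high normalized outputs
    belief = forward_step(belief, obs)

# predictive density of the next output: mixture of emission pdfs under the predicted state
predicted = [sum(belief[i] * trans[i][j] for i in range(2)) for j in range(2)]

def density_at(x):
    return sum(predicted[j] * gauss_pdf(x, *emission[j]) for j in range(2))
```

Both the point forecast and the prediction interval come from the same `density_at` mixture, which is how a single filter pass yields probabilistic output.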

20 pages, 416 KiB  
Article
An Adaptive Robust Event-Triggered Variational Bayesian Filtering Method with Heavy-Tailed Noise
by Di Deng, Peng Yi and Junlin Xiong
Sensors 2025, 25(10), 3130; https://doi.org/10.3390/s25103130 - 15 May 2025
Abstract
Event-triggered state estimation has attracted significant attention due to the advantage of efficiently utilizing communication resources in wireless sensor networks. In this paper, an adaptive robust event-triggered variational Bayesian filtering method is designed for heavy-tailed noise with inaccurate nominal covariance matrices. The one-step state prediction probability density function and the measurement likelihood function are modeled as Student’s t-distributions. By choosing inverse Wishart priors, the system state, the prediction error covariance, and the measurement noise covariance are jointly estimated based on the variational Bayesian inference and the fixed-point iteration. In the proposed filtering algorithm, the system states and the unknown covariances are adaptively updated by taking advantage of the event-triggered probabilistic information and the transmitted measurement data in the cases of non-transmission and transmission, respectively. The tracking simulations show that the proposed filtering method achieves good and robust estimation performance with low communication overhead. Full article
(This article belongs to the Special Issue Advances in Wireless Sensor Networks for Smart City)
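The robustness mechanism behind Student's t modeling can be isolated in a tiny fixed-point iteration: each residual is down-weighted by w_i = (nu + 1) / (nu + r_i^2 / sigma^2), so outliers barely influence the estimate. This is only the reweighting idea shared with the paper's variational fixed-point scheme, not the filter itself; the data and parameters are synthetic.

```python
import random

def robust_mean(data, nu=3.0, sigma2=1.0, iters=50):
    """Fixed-point (IRLS) location estimate under a Student's t noise model:
    w_i = (nu + 1) / (nu + r_i**2 / sigma2), mu = sum(w_i * x_i) / sum(w_i)."""
    mu = sum(data) / len(data)
    for _ in range(iters):
        w = [(nu + 1.0) / (nu + (x - mu) ** 2 / sigma2) for x in data]
        mu = sum(wi * x for wi, x in zip(w, data)) / sum(w)
    return mu

rng = random.Random(7)
clean = [rng.gauss(5.0, 1.0) for _ in range(500)]
outliers = [50.0] * 25                     # heavy-tailed contamination
data = clean + outliers
naive = sum(data) / len(data)              # pulled far off by the outliers
robust = robust_mean(data)                 # stays near the clean mean of 5
```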
