Search Results (167)

Search Parameters:
Keywords = deterministic and probabilistic evaluation

16 pages, 1851 KB  
Article
A Method for Determining Medium- and Long-Term Renewable Energy Accommodation Capacity Considering Multiple Uncertain Influencing Factors
by Tingxiang Liu, Libin Yang, Zhengxi Li, Kai Wang, Pinkun He and Feng Xiao
Energies 2025, 18(19), 5261; https://doi.org/10.3390/en18195261 - 3 Oct 2025
Abstract
Amid the global energy transition, rapidly expanding wind and solar installations challenge power grids with variability and uncertainty. We propose an adaptive framework for renewable energy accommodation assessment under high-dimensional uncertainties, integrating three innovations: (1) Response Surface Methodology (RSM) is adopted for the first time to construct a closed-form polynomial of renewable energy accommodation in terms of resource hours, load, installed capacity, and transmission limits, enabling millisecond-level evaluation; (2) LASSO-regularized RSM suppresses high-dimensional overfitting by automatically selecting key interaction terms while preserving interpretability; (3) a Bayesian kernel density extension yields full posterior distributions and confidence intervals for renewable energy accommodation in small-sample scenarios, quantifying risk. A case study on a renewable-rich grid in Northwest China validates the framework: two-factor response surface models achieve R2 > 90% with < 0.5% mean absolute error across ten random historical cases; LASSO regression keeps errors below 1.5% in multidimensional space; Bayesian density intervals encompass all observed values. The framework flexibly switches between deterministic, sparse, or probabilistic modes according to data availability, offering efficient and reliable decision support for generation-transmission planning and market clearing under multidimensional uncertainty. Full article
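The closed-form response-surface idea in innovation (1) can be illustrated with a least-squares quadratic fit in two factors. The sketch below uses synthetic data and made-up variable names (normalized resource hours and load), not the paper's grid data; it only shows why a fitted polynomial allows millisecond-level evaluation.

```python
import numpy as np

# Hypothetical two-factor response surface: accommodation as a quadratic
# polynomial in resource hours (x1) and load (x2). Data are synthetic,
# generated from a known quadratic so the fit can be checked.
rng = np.random.default_rng(0)
x1 = rng.uniform(0.8, 1.2, 50)   # normalized resource hours
x2 = rng.uniform(0.9, 1.1, 50)   # normalized load
y = 5.0 + 2.0 * x1 + 3.0 * x2 - 1.0 * x1**2 + 0.5 * x1 * x2

# Design matrix for a full second-order polynomial in (x1, x2).
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

def predict(a, b):
    """Closed-form (millisecond-level) evaluation of the fitted surface."""
    return np.array([1.0, a, b, a * a, b * b, a * b]) @ coef
```

Once `coef` is fitted offline, every evaluation is a six-term dot product, which is what makes the deterministic mode of such a framework fast.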

21 pages, 2101 KB  
Article
The Cost-Effectiveness of Sugemalimab Plus CAPOX in Treating Advanced Gastric Cancer: Analysis from the GEMSTONE-303 Trial
by Chen-Han Chueh, Wei-Ming Huang, Ming-Yu Hong, Yi-Wen Tsai, Nai-Jung Chiang and Hsiao-Ling Chen
Cancers 2025, 17(19), 3171; https://doi.org/10.3390/cancers17193171 - 29 Sep 2025
Abstract
Background/Objectives: Sugemalimab demonstrated clinical efficacy in the GEMSTONE-303 trial, but its cost-effectiveness remains unclear. This study aims to evaluate the cost-effectiveness of sugemalimab in combination with chemotherapy (CAPOX) as a first-line treatment for patients with advanced or metastatic gastric or gastroesophageal junction (G/GEJ) adenocarcinoma, compared to chemotherapy alone, from the perspective of Taiwan’s healthcare payer. Methods: A partitioned survival model was developed to simulate outcomes over a 40-year time horizon, and model parameters were derived from GEMSTONE-303 and the wider literature. Health benefits were measured in quality-adjusted life-years (QALYs), and only direct medical costs were included, with both discounted at an annual rate of 3%. The willingness-to-pay threshold was set at three times the 2024 GDP per capita. Deterministic and probabilistic sensitivity analyses were conducted alongside scenario analyses. Results: Compared to capecitabine and oxaliplatin (CAPOX) alone, adding sugemalimab yielded an incremental gain of 0.39 QALYs at an additional cost of USD 47,020, resulting in an incremental net monetary benefit of −USD 7478. Conclusions: Sugemalimab plus CAPOX is not cost-effective for advanced or metastatic G/GEJ adenocarcinoma from the Taiwan payer’s perspective. Achieving cost-effectiveness would require a 20–30% price reduction for sugemalimab (to USD 1204–USD 1376 per 600 mg), assuming first-line therapy is administered for the median treatment duration observed in the GEMSTONE-303 trial. If reimbursement continued until disease progression, a reduction of approximately 68% would be required (USD 550 per 600 mg). Full article
(This article belongs to the Special Issue Cost-Effectiveness Studies in Cancers)
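The reported figures follow the standard net-monetary-benefit identity INMB = λ·ΔQALY − ΔCost. The willingness-to-pay threshold λ below is back-calculated from the abstract's numbers (it is not quoted in the abstract itself), as a sketch of how the conclusion is reached:

```python
delta_qalys = 0.39   # incremental QALYs (from the abstract)
delta_cost = 47_020  # incremental cost in USD (from the abstract)

# Threshold: three times 2024 GDP per capita. The numeric value is implied
# by the reported INMB of -7478, not stated in the abstract.
wtp = (delta_cost - 7_478) / delta_qalys  # ~USD 101,390 per QALY

inmb = wtp * delta_qalys - delta_cost  # incremental net monetary benefit
icer = delta_cost / delta_qalys        # incremental cost per QALY gained
```

A negative INMB means the ICER exceeds the threshold, i.e., the combination is not cost-effective at that willingness to pay, which is the abstract's conclusion.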

55 pages, 6230 KB  
Review
Comprehensive Insights into Carbon Capture and Storage: Geomechanical and Geochemical Aspects, Modeling, Risk Assessment, Monitoring, and Cost Analysis in Geological Storage
by Abdul Rehman Baig, Jemal Fentaw, Elvin Hajiyev, Marshall Watson, Hossein Emadi, Bassel Eissa and Abdulrahman Shahin
Sustainability 2025, 17(19), 8619; https://doi.org/10.3390/su17198619 - 25 Sep 2025
Abstract
Carbon Capture and Storage (CCS) is a vital climate mitigation strategy aimed at reducing CO2 emissions from industrial and energy sectors. This review presents a comprehensive analysis of CCS technologies, focusing on capture methods, transport systems, geological storage, geomechanical and geochemical aspects, modeling, risk assessment, monitoring, and economic feasibility. Among capture technologies, pre-combustion capture is identified as the most efficient (90–95%) due to its high purity and integration potential. Notably, most operational CCS projects in 2025 utilize pre-combustion capture, particularly in hydrogen production and natural gas processing. For geological storage, saline aquifers and depleted oil and gas reservoirs are highlighted as the most promising due to their vast capacity and proven containment. In the transport phase, pipeline systems are considered the most effective and scalable method, offering high efficiency and cost-effectiveness for large-scale CO2 movement, especially in the supercritical phase. The study also emphasizes the importance of hybrid integrated risk assessment models, such as NRAP-Open-IAM, which combine deterministic simulations with probabilistic frameworks for robust site evaluation. In terms of monitoring, seismic methods are regarded as the most reliable subsurface technique for tracking CO2 plume migration and ensuring storage integrity. Economically, depleted reservoirs offer the most feasible option when integrated with existing infrastructure and supported by incentives like 45Q tax credits. The review concludes that successful CCS deployment requires interdisciplinary innovation, standardized risk protocols, and strong policy support. This work serves as a strategic reference for researchers, policymakers, and industry professionals aiming to scale CCS technologies for global decarbonization. Full article
(This article belongs to the Section Pollution Prevention, Mitigation and Sustainability)

28 pages, 6622 KB  
Article
Bayesian Spatio-Temporal Trajectory Prediction and Conflict Alerting in Terminal Area
by Yangyang Li, Yong Tian, Xiaoxuan Xie, Bo Zhi and Lili Wan
Aerospace 2025, 12(9), 855; https://doi.org/10.3390/aerospace12090855 - 22 Sep 2025
Abstract
Precise trajectory prediction in the airspace of a high-density terminal area (TMA) is crucial for Trajectory Based Operations (TBO), but frequent aircraft interactions and maneuvering behaviors can introduce significant uncertainties. Most existing approaches use deterministic deep learning models that lack uncertainty quantification and explicit spatial awareness. To address this gap, we propose the BST-Transformer, a Bayesian spatio-temporal deep learning framework that produces probabilistic multi-step trajectory forecasts and supports probabilistic conflict alerting. The framework first extracts temporal and spatial interaction features via spatio-temporal attention encoders and then uses a Bayesian decoder with variational inference to yield trajectory distributions. Potential conflicts are evaluated by Monte Carlo sampling of the predictive distributions to produce conflict probabilities and alarm decisions. Experiments based on real SSR data from the Guangzhou TMA show that the model reduces MADE by 60.3% relative to a deterministic ST-Transformer, with analogous reductions in horizontal and vertical errors (MADHE and MADVE); it quantifies uncertainty, significantly enhances the system’s ability to identify safety risks, and provides strong support for intelligent air traffic management with uncertainty-perception capabilities. Full article
(This article belongs to the Section Air Traffic and Transportation)
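The conflict-alerting step, Monte Carlo sampling of predictive distributions to estimate a conflict probability, can be sketched generically. The Gaussian predictive marginals below are illustrative stand-ins for the BST-Transformer's outputs, and the separation minima are example values, not the paper's settings.

```python
import math
import random

random.seed(42)

# Illustrative predictive marginals for two aircraft at one future step:
# ((mean_x NM, mean_y NM, mean_alt ft), (sd_x, sd_y, sd_alt)).
ac1 = ((10.0, 5.0, 9000.0), (0.8, 0.8, 150.0))
ac2 = ((13.0, 5.0, 9100.0), (0.8, 0.8, 150.0))

H_MIN, V_MIN = 5.0, 1000.0  # horizontal (NM) and vertical (ft) minima

def conflict_probability(a, b, n=20000):
    """Fraction of joint samples that violate both separation minima."""
    hits = 0
    for _ in range(n):
        ax = [random.gauss(m, s) for m, s in zip(a[0], a[1])]
        bx = [random.gauss(m, s) for m, s in zip(b[0], b[1])]
        horizontal = math.hypot(ax[0] - bx[0], ax[1] - bx[1])
        vertical = abs(ax[2] - bx[2])
        if horizontal < H_MIN and vertical < V_MIN:
            hits += 1
    return hits / n

p = conflict_probability(ac1, ac2)
```

An alarm decision is then a threshold on `p`, which is what makes the alerting probabilistic rather than a hard geometric check.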

37 pages, 3679 KB  
Review
Application of Artificial Intelligence in Hydrological Modeling for Streamflow Prediction in Ungauged Watersheds: A Review
by Jerome G. Gacu, Cris Edward F. Monjardin, Ronald Gabriel T. Mangulabnan and Jerime Chris F. Mendez
Water 2025, 17(18), 2722; https://doi.org/10.3390/w17182722 - 14 Sep 2025
Abstract
Streamflow prediction in ungauged watersheds remains a critical challenge in hydrological science due to the absence of in situ measurements, particularly in remote, data-scarce, and developing regions. This review synthesizes recent advancements in artificial intelligence (AI) for streamflow modeling, focusing on machine learning (ML), deep learning (DL), and hybrid modeling frameworks. Three core methodological domains are examined: regionalization techniques that transfer models from gauged to ungauged basins using physiographic similarity and transfer learning; synthetic data generation through proxy variables such as NDVI, soil moisture, and digital elevation models; and model performance evaluation using both deterministic and probabilistic metrics. Findings from recent literature consistently demonstrate that AI-based models, especially Long Short-Term Memory (LSTM) networks and hybrid attention-based architectures, outperform traditional conceptual and physically based models in capturing nonlinear hydrological responses across diverse climatic and physiographic settings. The integration of AI with remote sensing enhances generalizability, particularly in ungauged and human-impacted basins. This review also addresses several persistent research gaps, including inconsistencies in model evaluation protocols, limited transferability across heterogeneous regions, a lack of reproducibility and open-source tools, and insufficient integration of physical hydrological knowledge into AI models. To bridge these gaps, future research should prioritize the development of physics-informed AI frameworks, standardized benchmarking datasets, uncertainty quantification methods, and interpretable modeling tools to support robust, scalable, and operational streamflow forecasting in ungauged watersheds. Full article
(This article belongs to the Special Issue Application of Machine Learning in Hydrologic Sciences)
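Among the deterministic evaluation metrics the review contrasts with probabilistic ones, a standard choice in hydrology is the Nash–Sutcliffe efficiency (NSE). A minimal implementation, with made-up flows rather than any dataset from the review, is:

```python
def nash_sutcliffe(observed, simulated):
    """NSE = 1 - SSE / variance-around-mean. 1 is a perfect fit; <= 0 means
    the model is no better than predicting the mean observed flow."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    ssm = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / ssm

obs = [12.0, 18.0, 35.0, 22.0, 15.0]  # hypothetical daily flows (m^3/s)
sim = [13.0, 17.0, 31.0, 24.0, 14.0]
score = nash_sutcliffe(obs, sim)
```

Probabilistic counterparts such as CRPS score the whole forecast distribution instead of a single simulated value, which is the distinction the review draws.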

20 pages, 16598 KB  
Article
A Comparative Analysis of Slope Stability Methods for an Open-Pit Mine in Mongolia
by Tuvshinbaatar Tsevegmid, Yunhee Kim, Soyi Lee and Bumjoo Kim
Appl. Sci. 2025, 15(18), 9984; https://doi.org/10.3390/app15189984 - 12 Sep 2025
Abstract
Slope stability is a critical factor in the mining industry, directly impacting operational safety and economic performance. In large open-pit mines, slope failures can cause work stoppages and significant financial losses. Regions like Mongolia, with their complex topography, irregular geometries, and heterogeneous rock conditions, present a particular challenge for assessing slope stability. Conventional two-dimensional (2D) slope stability analysis and deterministic approaches have limitations in accounting for these complex topographies, irregular pit geometries, and lateral resistance forces. For a large open-pit mine in Mongolia, this study applied three-dimensional (3D) analyses with varying slope widths, using both limit equilibrium and finite element methods, to achieve a more reliable stability assessment under complex topographic conditions. To further enhance the reliability of evaluations under heterogeneous rock mass conditions, probabilistic approaches were employed alongside traditional deterministic methods. This enabled a more accurate estimation of safety factors and the identification of potential failure zones. The comparative study results demonstrate that 3D and probabilistic analyses consistently show 17–20% higher factors of safety and lower probabilities of failure than conventional 2D deterministic analyses. These findings highlight the effectiveness of these advanced methods for reliable slope stability assessment in complex geological conditions. Ultimately, the results underscore the importance of incorporating 3D and probabilistic analyses for more accurate and reliable assessments in complex open-pit mining, thereby contributing to improved safety and optimized operational efficiency. Full article
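The probabilistic side of such an assessment, a probability of failure rather than a single factor of safety, can be sketched with a Monte Carlo loop over sampled strength parameters. The toy planar-slide model and the distributions below are illustrative only, not the mine's geometry or data.

```python
import math
import random
import statistics

random.seed(1)

def factor_of_safety(cohesion, friction_deg, driving=100.0):
    """Toy planar-slide FS: resisting force from cohesion plus friction on a
    fixed normal force, against a fixed driving force (consistent units)."""
    resisting = cohesion + 80.0 * math.tan(math.radians(friction_deg))
    return resisting / driving

# Heterogeneous rock mass: treat strength parameters as random variables.
samples = [
    factor_of_safety(random.gauss(60.0, 10.0), random.gauss(35.0, 3.0))
    for _ in range(50_000)
]
mean_fs = statistics.mean(samples)
prob_failure = sum(fs < 1.0 for fs in samples) / len(samples)
```

A deterministic analysis would report only the mean-parameter FS; the Monte Carlo view shows that a slope with FS comfortably above 1 can still carry a non-trivial probability of failure when parameters are uncertain.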

15 pages, 3574 KB  
Article
Prior Knowledge Shapes Success When Large Language Models Are Fine-Tuned for Biomedical Term Normalization
by Daniel B. Hier, Steven K. Platt and Anh Nguyen
Information 2025, 16(9), 776; https://doi.org/10.3390/info16090776 - 7 Sep 2025
Abstract
Large language models (LLMs) often fail to correctly associate biomedical terms with their standardized ontology identifiers, posing challenges for downstream applications that rely on accurate, machine-readable codes. These linking failures can compromise the integrity of data used in precision medicine, clinical decision support, and population health. Fine-tuning can partially remedy these issues, but the degree of improvement varies across terms and terminologies. Focusing on the Human Phenotype Ontology (HPO), we show that a model’s prior knowledge of term–identifier pairs, acquired during pre-training, strongly predicts whether fine-tuning will enhance its linking accuracy. We evaluate prior knowledge in three complementary ways: (1) latent probabilistic knowledge, revealed through stochastic prompting, captures hidden associations not evident in deterministic output; (2) partial subtoken knowledge, reflected in incomplete but non-random generation of identifier components; and (3) term familiarity, inferred from annotation frequencies in the biomedical literature, which serve as a proxy for training exposure. We then assess how these forms of prior knowledge influence the accuracy of deterministic identifier linking. Fine-tuning performance varies most for terms in what we call the reactive middle zone of the ontology—terms with intermediate levels of prior knowledge that are neither absent nor fully consolidated. Fine-tuning was most successful when prior knowledge, as measured by partial subtoken knowledge, was ‘weak’ or ‘medium’, or when prior knowledge, as measured by latent probabilistic knowledge, was ‘unknown’ or ‘weak’ (p<0.001). These terms from the ‘reactive middle’ exhibited the largest gains or losses in accuracy during fine-tuning, suggesting that the success of knowledge injection critically depends on the level of term–identifier pair knowledge in the LLM before fine-tuning. Full article
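The "latent probabilistic knowledge" probe can be sketched as repeated stochastic sampling: query the model many times at nonzero temperature and estimate the hit rate for the gold identifier. Here `query_model` is a hypothetical stand-in that simulates an LLM call (the 0.35 hit rate is invented); a real probe would replace it with an actual model API. HP:0001250 is the genuine HPO identifier for Seizure.

```python
import random

random.seed(5)

def query_model(term, temperature=1.0):
    """Hypothetical stand-in for a stochastic LLM completion. A real probe
    would call an actual model; here latent knowledge is simulated."""
    gold_ids = {"Seizure": "HP:0001250"}
    if random.random() < 0.35:  # simulated latent association strength
        return gold_ids.get(term, "HP:0000000")
    return "HP:%07d" % random.randrange(10_000_000)  # plausible wrong ID

def latent_knowledge(term, gold, n=500):
    """Estimated probability that stochastic prompting yields the gold ID,
    even when a greedy (deterministic) decode would miss it."""
    hits = sum(query_model(term) == gold for _ in range(n))
    return hits / n

p = latent_knowledge("Seizure", "HP:0001250")
```

Terms whose estimated `p` is intermediate, neither near 0 nor near 1, are the kind the abstract places in the "reactive middle".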

19 pages, 6973 KB  
Article
A Bayesian Framework for the Calibration of Cyclic Triaxial Tests
by Luis Castillo-Suárez, Jesús Redondo-Mosquera, Vicente Mercado, Jaime Fernández-Gómez and Joaquín Abellán-García
Geotechnics 2025, 5(3), 63; https://doi.org/10.3390/geotechnics5030063 - 5 Sep 2025
Abstract
This research presents the calibration of a constitutive model to replicate the cyclic performance of soils using a Bayesian framework. This study uses data from laboratory-conducted consolidated undrained isotropic cyclic triaxial tests and numerical tools to estimate optimal parameters by the application of Slice Sampling in a Bayesian analysis and to determine the uncertainty of the model. For each calibrated parameter in the model, a probability distribution was obtained from the Markov chain. The means and the standard deviations from the distributions are compared with the laboratory results by the simulation of a series of consolidated undrained isotropic cyclic triaxial tests and a numerical model for a deposit that replicates the Wildlife site’s stratigraphic characteristics. The calibrated model response offers a good approximation of the recorded data, and the uncertainty due to the model is evaluated. The results of this study demonstrate that Bayesian calibration can reliably quantify parameter uncertainty, reveal parameter correlations that deterministic methods overlook, and improve confidence in liquefaction assessments. This probabilistic framework provides a robust basis for extending calibration to other soil types and site conditions. Full article
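Slice sampling, the MCMC scheme named in the abstract, can be shown in one dimension with the standard stepping-out procedure (Neal, 2003). The target below is a toy Normal posterior for a single calibrated parameter, not the paper's constitutive-model posterior.

```python
import math
import random

random.seed(0)

def slice_sample(log_density, x0, n, w=1.0):
    """1D slice sampler with stepping-out. log_density is the unnormalized
    log posterior of one model parameter; w is the initial bracket width."""
    xs, x = [], x0
    for _ in range(n):
        # Draw the slice height under the density at the current point.
        log_y = log_density(x) + math.log(random.random())
        # Step out an interval that contains the slice.
        left = x - w * random.random()
        right = left + w
        while log_density(left) > log_y:
            left -= w
        while log_density(right) > log_y:
            right += w
        # Shrink the interval until a point inside the slice is accepted.
        while True:
            cand = random.uniform(left, right)
            if log_density(cand) > log_y:
                x = cand
                break
            if cand < x:
                left = cand
            else:
                right = cand
        xs.append(x)
    return xs

# Toy posterior for one parameter: Normal(mean 2.0, sd 0.5), up to a constant.
log_post = lambda t: -0.5 * ((t - 2.0) / 0.5) ** 2
chain = slice_sample(log_post, x0=0.0, n=20_000)

burned = chain[1_000:]  # discard burn-in
chain_mean = sum(burned) / len(burned)
chain_sd = (sum((v - chain_mean) ** 2 for v in burned) / len(burned)) ** 0.5
```

As in the paper's workflow, the Markov chain yields a full distribution per parameter, from which means and standard deviations are read off.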

18 pages, 2000 KB  
Article
Transient Stability Constraints for Optimal Power Flow Considering Wind Power Uncertainty
by Songkai Liu, Biqing Ye, Pan Hu, Ming Wan, Jun Cao and Yitong Liu
Energies 2025, 18(17), 4708; https://doi.org/10.3390/en18174708 - 4 Sep 2025
Abstract
To address the issue of uncertainty in renewable energy and its impact on the safe and stable operation of power systems, this paper proposes a transient stability constrained optimal power flow (TSCOPF) calculation method that takes into account the uncertainty of wind power and load. First, a non-parametric kernel density estimation method is used to construct the probability density function of wind power, while the load uncertainty model is based on a normal distribution. Second, a TSCOPF model incorporating the critical clearing time (CCT) evaluation metric is constructed, and corresponding probabilistic constraints are established using opportunity constraint theory, thereby establishing a TSCOPF model that accounts for wind power and load uncertainties; then, a semi-invariant probabilistic flow calculation method based on de-randomized Halton sequences is used to convert opportunity constraints into deterministic constraints, and the improved sooty tern optimization algorithm (ISTOA) is employed for solution. Finally, the superiority and effectiveness of the proposed method are validated through simulation analysis of case studies. Full article
(This article belongs to the Section F1: Electrical Power System)
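The conversion of an opportunity (chance) constraint into a deterministic one can be illustrated for the simplest case of a Gaussian uncertainty: Pr(load ≤ L) ≥ α becomes mean + z_α·σ ≤ L. The numbers are illustrative; the paper itself handles general distributions via semi-invariant probabilistic flow with de-randomized Halton sequences, not this plain normal case.

```python
from statistics import NormalDist

def deterministic_limit(mean, sigma, alpha):
    """Smallest limit L such that Pr(N(mean, sigma) <= L) >= alpha."""
    z = NormalDist().inv_cdf(alpha)  # standard-normal quantile z_alpha
    return mean + z * sigma

# Uncertain load: mean 150 MW, sd 10 MW; require 95% confidence.
limit = deterministic_limit(150.0, 10.0, 0.95)  # ~166.4 MW
```

The optimizer then enforces the fixed bound `limit` instead of the probabilistic statement, which is what makes the resulting TSCOPF model solvable by a deterministic algorithm such as the ISTOA.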

46 pages, 47184 KB  
Article
Goodness of Fit in the Marginal Modeling of Round-Trip Times for Networked Robot Sensor Transmissions
by Juan-Antonio Fernández-Madrigal, Vicente Arévalo-Espejo, Ana Cruz-Martín, Cipriano Galindo-Andrades, Adrián Bañuls-Arias and Juan-Manuel Gandarias-Palacios
Sensors 2025, 25(17), 5413; https://doi.org/10.3390/s25175413 - 2 Sep 2025
Abstract
When complex computations cannot be performed on board a mobile robot, sensory data must be transmitted to a remote station to be processed, and the resulting actions must be sent back to the robot to execute, forming a repeating cycle. This involves stochastic round-trip times in the case of non-deterministic network communications and/or non-hard real-time software. Since robots need to react within strict time constraints, modeling these round-trip times becomes essential for many tasks. Modern approaches for modeling sequences of data are mostly based on time-series forecasting techniques, which impose a computational cost that may be prohibitive for real-time operation, do not consider all the delay sources existing in the sw/hw system, or do not work fully online, i.e., within the time of the current round-trip. Marginal probabilistic models, on the other hand, often have a lower cost, since they discard temporal dependencies between successive measurements of round-trip times, a suitable approximation when regime changes are properly handled given the typically stationary nature of these round-trip times. In this paper we focus on the hypothesis tests needed for marginal modeling of the round-trip times in remotely operated robotic systems with the presence of abrupt changes in regimes. We analyze in depth three common models, namely Log-logistic, Log-normal, and Exponential, and propose some modifications of parameter estimators for them and new thresholds for well-known goodness-of-fit tests, which are aimed at the particularities of our setting. We then evaluate our proposal on a dataset gathered from a variety of networked robot scenarios, both real and simulated; through >2100 h of high-performance computer processing, we assess the statistical robustness and practical suitability of these methods for these kinds of robotic applications. Full article
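A marginal-model check of the kind discussed, fitting a Log-normal to round-trip times and computing a one-sample Kolmogorov–Smirnov statistic against the empirical CDF, can be sketched as follows. The RTTs are synthetic, and the estimators are the plain moment-of-logs ones; the paper's modified estimators and recalibrated test thresholds are not reproduced here.

```python
import math
import random
from statistics import NormalDist, mean, stdev

random.seed(7)

# Synthetic round-trip times (seconds): exp of a Gaussian, i.e. Log-normal.
rtts = [math.exp(random.gauss(-2.0, 0.3)) for _ in range(2_000)]

# Fit a Log-normal by the mean and sd of the log-RTTs.
logs = [math.log(t) for t in rtts]
model = NormalDist(mean(logs), stdev(logs))  # CDF of lognormal via log(x)

# One-sample KS statistic: max gap between empirical and fitted CDFs.
xs = sorted(rtts)
n = len(xs)
ks = max(
    max(abs((i + 1) / n - model.cdf(math.log(x))),
        abs(i / n - model.cdf(math.log(x))))
    for i, x in enumerate(xs)
)
```

Because the parameters are estimated from the same sample, the usual KS critical values are conservative, which is one reason the paper derives thresholds tailored to its setting.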

31 pages, 1511 KB  
Article
Economic Evaluation During Physicochemical Characterization Process: A Cost–Benefit Analysis
by Despina A. Gkika, Nick Vordos, Athanasios C. Mitropoulos and George Z. Kyzas
ChemEngineering 2025, 9(5), 95; https://doi.org/10.3390/chemengineering9050095 - 2 Sep 2025
Abstract
As academic institutions expand, the proliferation of laboratories dealing with hazardous chemicals has risen. While the physicochemical characterization equipment employed in these academic chemical laboratories is widely recognized, its usage presents a notable risk to researchers at various levels. This paper presents a simplified approach for evaluating the effects of the implementation of prevention investments in regard to working with nanomaterials on a lab scale. The evaluation is based on modeling the benefits (avoided accident costs) and costs (safety training), as opposed to an alternative (not investing in safety training). Each scenario analyzed in the economic evaluation reflects a different level of risk. The novelty of this study lies in its objective to provide an economic assessment of the benefits and returns from safety investments—specifically training—in a chemical laboratory, using a framework that integrates qualitative insights to explore and define the context alongside quantitative data derived from a cost–benefit analysis. The Net Present Value (NPV) was evaluated. The results of the cost–benefit analysis demonstrated that the benefits exceed the cost of the investment. The findings from the sensitivity analysis highlight the significant influence of insurance benefits on safety investments in the specific case study. In this case study, the deterministic analysis yielded a Net Present Value (NPV) of €280,414.67, which aligns closely with the probabilistic results. The probabilistic NPV indicates 90% confidence that the investment will yield a positive NPV ranging from €283,053 to €337,356. The cost–benefit analysis results demonstrate that the benefits outweigh the costs, showing that with an 87% training success rate, this investment would generate benefits of approximately €6328 by preventing accidents in this study. 
To the best of the researchers’ knowledge, this is the first study to evaluate the influence of safety investment through an economic evaluation of laboratory accidents with small-angle X-ray scattering during the physicochemical characterization process of engineered nanomaterials. The proposed approach and framework are relevant not only to academic settings but also to industry. Full article
(This article belongs to the Special Issue New Advances in Chemical Engineering)
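The NPV calculation behind such a cost–benefit analysis is the standard discounted-cash-flow formula. The cash flows and discount rate below are hypothetical stand-ins, not the study's data.

```python
def npv(rate, cashflows):
    """Net Present Value: cashflows[t] occurs at end of year t, with
    cashflows[0] the undiscounted upfront investment (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

# Year 0: safety-training investment; years 1-5: avoided accident costs.
flows = [-5_000.0, 1_800.0, 1_800.0, 1_800.0, 1_800.0, 1_800.0]
value = npv(0.04, flows)
```

A positive NPV means the discounted avoided-accident benefits exceed the training cost, which is the decision criterion the study applies (deterministically, and then with distributions on the inputs for the probabilistic version).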

17 pages, 1466 KB  
Article
Deterministic and Probabilistic Risk Assessment of Chlorpyrifos Residues via Consumption of Tomato and Cucumber in Armenia
by Meline Beglaryan, Taron Kareyan, Monika Khachatryan, Bagrat Harutyunyan and Davit Pipoyan
Foods 2025, 14(16), 2871; https://doi.org/10.3390/foods14162871 - 19 Aug 2025
Abstract
Chlorpyrifos (CPF) is a widely used organophosphate insecticide; however, global concerns exist regarding its potential health risks, particularly developmental neurotoxicity. This study aimed to determine CPF residues in locally sourced tomatoes and cucumbers and assess the potential chronic and acute dietary risks associated with their consumption by the adult population of Armenia. As part of the national residue monitoring program, samples of the two most commonly consumed vegetables (tomato and cucumber) were collected from various regions of Armenia and analyzed using gas chromatography–tandem mass spectrometry (GC-MS/MS). Two databases were used for dietary exposure assessment: one containing CPF residue levels and another containing individual food consumption data from a food frequency questionnaire completed by 1329 Armenian residents. Chronic risk was assessed using the Margin of Exposure (MOE), while acute risk was evaluated using the Hazard Quotient (HQ) and the Hazard Index (HI). CPF residues were detected in 15% of tomato and 28.6% of cucumber samples, with a mean content of 0.003 mg/kg. Deterministic and probabilistic assessments indicated no health concern (i.e., MOE > 300 and >1000, HQ and HI < 1) for the general adult population at current exposure levels. However, higher cumulative risk estimates obtained for high-consumption groups emphasize the significance of these studied vegetables as notable contributors to overall CPF intake. The findings indicate the importance of establishing vegetable-specific maximum residue levels, strengthening monitoring, and considering vulnerable population groups in future research. Broader assessments, including other plant-origin products, are recommended to ensure comprehensive risk assessment and support science-based policy decisions for improved food safety and public health protection in Armenia. Full article
(This article belongs to the Section Food Quality and Safety)
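The risk metrics used, Margin of Exposure (MOE) for chronic risk and Hazard Quotient/Index for acute risk, are simple ratios. The residue level below is the abstract's reported mean (0.003 mg/kg); the consumption amounts, body weight, and toxicological reference values are illustrative assumptions, not the study's inputs.

```python
residue = 0.003          # mean CPF residue, mg/kg (from the abstract)
body_weight = 70.0       # kg, assumed adult body weight
intake_tomato = 0.150    # kg/day, assumed consumption
intake_cucumber = 0.100  # kg/day, assumed consumption

pod = 0.1    # mg/kg bw/day, illustrative point of departure for MOE
arfd = 0.005 # mg/kg bw/day, illustrative acute reference dose

# Estimated daily exposure, mg per kg body weight per day.
exposure = residue * (intake_tomato + intake_cucumber) / body_weight

moe = pod / exposure  # chronic risk: larger MOE is safer
hq_tomato = residue * intake_tomato / body_weight / arfd
hq_cucumber = residue * intake_cucumber / body_weight / arfd
hi = hq_tomato + hq_cucumber  # Hazard Index: sum of HQs across foods
```

With these example inputs, MOE well above its threshold and HI below 1 reproduce the "no health concern" pattern the abstract reports for the general adult population.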

18 pages, 2364 KB  
Article
Deterioration Modeling of Pavement Performance in Cold Regions Using Probabilistic Machine Learning Method
by Zhen Liu, Xingyu Gu and Wenxiu Wu
Infrastructures 2025, 10(8), 212; https://doi.org/10.3390/infrastructures10080212 - 14 Aug 2025
Abstract
Accurate and reliable modeling of pavement deterioration is critical for effective infrastructure management. This study proposes a probabilistic machine learning framework using Bayesian-optimized Natural Gradient Boosting (BO-NGBoost) to predict the International Roughness Index (IRI) of asphalt pavements in cold climates. A dataset only for cold regions was constructed from the Long-Term Pavement Performance (LTPP) database, integrating multiple variables related to climate, structure, materials, traffic, and constructions. The BO-NGBoost model was evaluated against conventional deterministic models, including artificial neural networks, random forest, and XGBoost. Results show that BO-NGBoost achieved the highest predictive accuracy (R2 = 0.897, RMSE = 0.184, MAE = 0.107) while also providing uncertainty quantification for risk-based maintenance planning. BO-NGBoost effectively captures long-term deterioration trends and reflects increasing uncertainty with pavement age. SHAP analysis reveals that initial IRI, pavement age, layer thicknesses, and precipitation are key factors, with freeze–thaw cycles and moisture infiltration driving faster degradation in cold climates. This research contributes a scalable and interpretable framework that advances pavement deterioration modeling from deterministic to probabilistic paradigms and provides practical value for more uncertainty-aware infrastructure decision-making. Full article
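What distinguishes a probabilistic predictor like NGBoost from the deterministic baselines is that each output is a distribution, so it can be scored both on point accuracy and on interval coverage. The sketch below shows that evaluation on fabricated (mean, sd) IRI predictions; it is not BO-NGBoost output.

```python
import math
from statistics import NormalDist

# Fabricated probabilistic IRI predictions: (mean, sd) per pavement
# section, alongside the observed IRI (m/km).
preds = [(1.10, 0.10), (1.45, 0.15), (2.00, 0.25), (1.30, 0.12)]
obs = [1.05, 1.52, 1.80, 1.33]

# Deterministic metrics computed on the predictive means.
means = [m for m, _ in preds]
rmse = math.sqrt(sum((m - o) ** 2 for m, o in zip(means, obs)) / len(obs))
mae = sum(abs(m - o) for m, o in zip(means, obs)) / len(obs)

# 95% central prediction intervals and their empirical coverage.
z = NormalDist().inv_cdf(0.975)
covered = sum((m - z * s) <= o <= (m + z * s)
              for (m, s), o in zip(preds, obs))
coverage = covered / len(obs)
```

Growing predictive `sd` with pavement age is how such a model expresses the increasing uncertainty the abstract describes, without degrading its point metrics.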

19 pages, 2473 KB  
Article
Learning Residual Distributions with Diffusion Models for Probabilistic Wind Power Forecasting
by Fuhao Chen and Linyue Gao
Energies 2025, 18(16), 4226; https://doi.org/10.3390/en18164226 - 8 Aug 2025
Abstract
Accurate and uncertainty-aware wind power forecasting is essential for reliable and cost-effective power system operations. This paper presents a novel probabilistic forecasting framework based on diffusion probabilistic models. We adopted a two-stage modeling strategy: a deterministic predictor first generates baseline forecasts, and a conditional diffusion model then learns the distribution of residual errors. This two-stage decoupling improves learning efficiency and sharpens uncertainty estimation. We employed the elucidated diffusion model (EDM) to enable flexible noise control and to enhance calibration, stability, and expressiveness. For the generative backbone, we introduced a time-series-specific diffusion Transformer (TimeDiT) that incorporates modular conditioning to separately fuse numerical weather prediction (NWP) inputs, noise, and temporal features. The proposed method was evaluated on the public dataset of ten wind farms from the Global Energy Forecasting Competition 2014 (GEFCom2014). We further compared our approach with two popular baselines, i.e., a distribution parameter regression model and a generative adversarial network (GAN)-based model. Results show that our method consistently achieves superior performance in both deterministic metrics and probabilistic accuracy, offering better forecast calibration and sharper distributions.
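The two-stage decoupling can be illustrated with a deliberately simplified stand-in: below, the paper's conditional diffusion model is replaced by a plain Gaussian fit to the baseline's residuals (an illustration of the decomposition only, not the authors' method):

```python
import statistics as st

def fit_residual_gaussian(y_true, y_base):
    # stage 2 (simplified): model the distribution of the residuals
    # left over after the stage-1 deterministic baseline
    res = [t - b for t, b in zip(y_true, y_base)]
    return st.mean(res), st.pstdev(res)

def probabilistic_forecast(y_base_new, mu, sigma, z=1.96):
    # baseline forecast plus a ~95% interval from the residual model;
    # a diffusion model would instead sample residuals conditionally
    return [(b + mu - z * sigma, b + mu, b + mu + z * sigma)
            for b in y_base_new]
```

The point of the decomposition is the same in both the sketch and the paper: the generative model only has to capture the (smaller, more stationary) residual distribution, not the full power signal.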
(This article belongs to the Section A3: Wind, Wave and Tidal Energy)

22 pages, 4621 KB  
Article
Probabilistic Forecasting and Anomaly Detection in Sewer Systems Using Gaussian Processes
by Mohsen Rezaee, Peter Melville-Shreeve and Hussein Rappel
Water 2025, 17(16), 2357; https://doi.org/10.3390/w17162357 - 8 Aug 2025
Abstract
This study investigates the capability of Gaussian process regression (GPR) models for probabilistic forecasting of water flow and depth in a combined sewer system. Traditionally, deterministic methods have been used for sewer flow forecasting and anomaly detection, two techniques crucial to effective wastewater network and treatment plant management. However, given the uncertain nature of the factors affecting sewer flow and depth, a probabilistic approach that accounts for these uncertainties is preferable. This research introduces a novel use of GPR in sewer systems for real-time control and forecasting. To this end, a composite kernel is designed to capture flow and depth patterns in dry- and wet-weather periods by considering the underlying physical characteristics of the system. The multi-input, single-output GPR model is evaluated using root mean square error (RMSE), coverage, and differential entropy. The model demonstrates high predictive accuracy for both treatment plant inflow and manhole water levels across various training durations, with coverage values ranging from 87.5% to 99.4%. Finally, the model is used for anomaly detection by identifying deviations from expected ranges, enabling the estimation of surcharge and overflow probabilities under various conditions.
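The composite-kernel idea can be sketched with NumPy: an RBF term for the smooth trend plus a periodic term for the diurnal dry-weather cycle, combined by summation. The kernel choices and hyperparameters here are illustrative assumptions, not the paper's tuned model:

```python
import numpy as np

def rbf(x1, x2, length=8.0, var=1.0):
    # smooth, slowly varying component (e.g. wet-weather trend)
    d = x1[:, None] - x2[None, :]
    return var * np.exp(-0.5 * (d / length) ** 2)

def periodic(x1, x2, period=24.0, length=1.0, var=1.0):
    # repeating component with a 24 h period (diurnal dry-weather pattern)
    d = np.abs(x1[:, None] - x2[None, :])
    return var * np.exp(-2.0 * np.sin(np.pi * d / period) ** 2 / length ** 2)

def composite(x1, x2):
    # sum kernel: trend + daily cycle
    return rbf(x1, x2) + periodic(x1, x2)

def gp_predict(X, y, Xs, noise=1e-4):
    # standard GP regression equations via a Cholesky factorization
    K = composite(X, X) + noise * np.eye(len(X))
    Ks = composite(X, Xs)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    mean = Ks.T @ alpha
    v = np.linalg.solve(L, Ks)
    var = np.diag(composite(Xs, Xs)) - np.sum(v ** 2, axis=0)
    return mean, np.sqrt(np.maximum(var, 0.0))
```

The predictive standard deviation is what enables the coverage statistic and the anomaly test described above: an observation falling outside the model's predictive interval is flagged as a potential surcharge or overflow event.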
(This article belongs to the Special Issue Advances in Management and Optimization of Urban Water Networks)