Search Results (10,627)

Search Parameters:
Keywords = data uncertainty

15 pages, 997 KB  
Article
Understanding the Financial Implications of Antimicrobial Resistance Surveillance in Nepal: Context-Specific Evidence for Policy and Sustainable Financing Strategies
by Yunjin Yum, Monika Karki, Dan Whitaker, Kshitij Karki, Ratnaa Shakya, Hari Prasad Kattel, Amrit Saud, Vishan Gajmer, Pankaj Chaudhary, Shrija Thapa, Rakchya Amatya, Timothy Worth, Claudia Parry, Wongyeong Choi, Clemence Nohe, Adrienne Chattoe-Brown, Deepak C. Bajracharya, Krishna Prasad Rai, Sangita Sharma, Kiran Pandey, Bijaya Kumar Shrestha, Runa Jha and Jung-Seok Lee
Antibiotics 2026, 15(1), 103; https://doi.org/10.3390/antibiotics15010103 - 20 Jan 2026
Abstract
Background/Objectives: Antimicrobial resistance (AMR) surveillance is a cornerstone of national AMR strategies but requires sustained, cross-sectoral financing. While the need for such financing is well recognized, its quantification remains scarce in low- and middle-income countries. This study aimed to estimate the full costs of AMR surveillance across the human health, animal health, and food sectors (2021–2030) in selected facilities in Nepal and generate evidence to inform sustainable financing. Methods: A bottom-up micro-costing approach was used to analyze data from five sites. Costs were adjusted for inflation using projected gross domestic product deflators, and probabilistic sensitivity analyses were conducted to assess uncertainty in laboratory sample volumes under four scenarios. Results: The total cost of AMR surveillance in Nepal was $6.7 million: $3.4 million for human health (50.3%), $2.7 million for animal health (39.8%), and $0.7 million for the food sector (9.9%). Laboratories accounted for >90% of total costs, with consumables and personnel as the main cost drivers. Average cost per sample was $150 (animal), $64 (food), and $6 (human). Conclusions: This study offers the first robust, multi-sectoral 10-year cost estimates of AMR surveillance in Nepal. The findings highlight that sustaining AMR surveillance requires predictable domestic financing, particularly to cover recurrent laboratory operations as donor support declines. These results provide cost evidence to support future budgeting and policy planning toward sustainable, nationally financed AMR surveillance in Nepal. Full article
35 pages, 2347 KB  
Article
Probabilistic Load Forecasting for Green Marine Shore Power Systems: Enabling Efficient Port Energy Utilization Through Monte Carlo Analysis
by Bingchu Zhao, Fenghui Han, Yu Luo, Shuhang Lu, Yulong Ji and Zhe Wang
J. Mar. Sci. Eng. 2026, 14(2), 213; https://doi.org/10.3390/jmse14020213 - 20 Jan 2026
Abstract
The global shipping industry is surging ahead, and with it, a quiet revolution is taking place on the water: marine lithium-ion batteries have emerged as a crucial clean energy carrier, powering everything from ferries to container ships. When these vessels dock, they increasingly rely on shore power charging systems to refuel—essentially, plugging in instead of idling on diesel. But predicting how much power they will need is not straightforward. Think about it: different ships, varying battery sizes, mixed charging technologies, and unpredictable port stays all come into play, creating a load profile that is random, uneven, and often concentrated—a real headache for grid planners. So how do you forecast something so inherently variable? This study turned to the Monte Carlo method, a probabilistic technique that thrives on uncertainty. Instead of seeking a single fixed answer, the model embraces randomness, feeding in real-world data on supply modes, vessel types, battery capacity, and operational hours. Through repeated random sampling and load simulation, it builds up a realistic picture of potential charging demand. We ran the numbers for a simulated fleet of 400 vessels, and the results speak for themselves: load factors landed at 0.35 for conventional AC shore power, 0.39 for high-voltage DC, 0.33 for renewable-based systems, 0.64 for smart microgrids, and 0.76 when energy storage joined the mix. Notice how storage and microgrids really smooth things out? What does this mean in practice? Well, it turns out that Monte Carlo is not just academically elegant, it is practically useful. By quantifying uncertainty and delivering load factors within confidence intervals, the method offers port operators something precious: a data-backed foundation for decision-making. Whether it is sizing infrastructure, designing tariff incentives, or weighing the grid impact of different shore power setups, this approach adds clarity. 
In the bigger picture, that kind of insight matters. As ports worldwide strive to support cleaner shipping and align with climate goals—China’s “dual carbon” ambition being a case in point—achieving a reliable handle on charging demand is not just technical; it is strategic. Here, probabilistic modeling shifts from a simulation exercise to a tangible tool for greener, more resilient port energy management. Full article
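The Monte Carlo sampling the abstract describes can be sketched as follows. This is illustrative only: the charging-power choices, arrival-time and dwell-time distributions are invented stand-ins, not the paper's calibrated inputs, and the resulting load factor will not match the published figures.

```python
import random

def simulate_fleet_load(n_vessels=400, n_trials=500, seed=42):
    """Monte Carlo sketch of aggregate shore-power charging demand.

    Each trial draws random vessel parameters (charging power, arrival
    hour, dwell time), accumulates an hourly load profile, and records
    the load factor (mean load / peak load). Distributions here are
    assumptions for illustration.
    """
    rng = random.Random(seed)
    load_factors = []
    for _ in range(n_trials):
        hourly_load = [0.0] * 24
        for _ in range(n_vessels):
            power = rng.choice([0.5, 1.0, 2.0])   # MW charging power (assumed)
            arrival = rng.randrange(24)           # arrival hour
            dwell = rng.randint(2, 8)             # hours at berth
            for h in range(arrival, arrival + dwell):
                hourly_load[h % 24] += power
        peak = max(hourly_load)
        avg = sum(hourly_load) / 24
        load_factors.append(avg / peak)
    return sum(load_factors) / n_trials

lf = simulate_fleet_load()
print(f"estimated load factor: {lf:.2f}")
```

Repeating the sampling many times is what lets the method report load factors with confidence intervals rather than single point estimates.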
18 pages, 966 KB  
Article
Anomaly Detection Based on Hybrid Kernelized Fuzzy Density
by Kaitian Luo, Shenhong Lei, Chaoqing Li and Yi Li
Symmetry 2026, 18(1), 192; https://doi.org/10.3390/sym18010192 - 20 Jan 2026
Abstract
Unsupervised anomaly detection has been extensively studied. However, most existing methods are designed for either numerical or nominal data, which struggle to detect anomalies effectively in real-world mixed-type datasets. Fuzzy information granulation is a key concept in granular computing, which offers a potent framework for managing uncertainty in mixed-type data and provides a viable pathway for unsupervised anomaly detection. Nevertheless, conventional fuzzy information granulation-based detection methods often model only simple, linear fuzzy relations between samples. This limitation prevents them from capturing the complex, nonlinear structures inherent in the data, leading to a degradation in detection performance. To address these shortcomings, we propose a Hybrid Kernelized Fuzzy Density-based anomaly detector (HKFD). HKFD pioneers a hybrid kernelized fuzzy relation by integrating a hybrid distance metric with kernel methods. This new relation allows us to define a hybrid kernelized fuzzy density for each sample within every feature subspace, effectively capturing the local data dispersion. Crucially, we introduce an information-theoretic weighting mechanism. By calculating the fuzzy information entropy of each feature’s distribution, HKFD automatically assigns higher weights to more informative feature subspaces that contribute more to identifying anomalies. The final anomaly factor is then calculated by the weighted fusion of these densities. Comprehensive experiments on 20 datasets demonstrate that HKFD significantly outperforms state-of-the-art methods, achieving superior anomaly detection performance. Full article
(This article belongs to the Special Issue Symmetry/Asymmetry in Fuzzy Sets and Fuzzy Systems)
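The core idea of a kernelized fuzzy density, as described in this abstract, can be sketched for purely numerical data: a sample's density is its mean Gaussian-kernel similarity to the other samples, so isolated points score low. This is a simplified stand-in; the paper's hybrid relation additionally handles nominal attributes and entropy-based subspace weighting.

```python
import math

def kernel_fuzzy_density(data, i, sigma=1.0):
    """Fuzzy density of sample i: mean Gaussian-kernel similarity to all
    other samples (numeric-only sketch). Low density suggests an anomaly."""
    xi = data[i]
    sims = [
        math.exp(-sum((a - b) ** 2 for a, b in zip(xi, xj)) / (2 * sigma ** 2))
        for j, xj in enumerate(data) if j != i
    ]
    return sum(sims) / len(sims)

# Three clustered points and one far outlier (illustrative data).
data = [(0.0, 0.0), (0.1, 0.0), (0.0, 0.1), (5.0, 5.0)]
dens_cluster = kernel_fuzzy_density(data, 0)
dens_outlier = kernel_fuzzy_density(data, 3)
```

The anomaly factor in the paper is then a weighted fusion of such densities over feature subspaces; here the contrast between the clustered point and the outlier already shows the ranking signal.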
35 pages, 4364 KB  
Article
Pedestrian Traffic Stress Levels (PTSL) in School Zones: A Pedestrian Safety Assessment for Sustainable School Environments—Evidence from the Caferağa Case Study
by Yunus Emre Yılmaz and Mustafa Gürsoy
Sustainability 2026, 18(2), 1042; https://doi.org/10.3390/su18021042 - 20 Jan 2026
Abstract
Pedestrian safety in school zones is shaped by traffic conditions and street design characteristics, whose combined effects involve uncertainty and gradual transitions rather than sharp thresholds. This study presents an integrated assessment framework based on the analytic hierarchy process (AHP) and fuzzy logic to evaluate pedestrian traffic stress level (PTSL) at the street-segment scale in school environments. AHP is used to derive input-variable weights from expert judgments, while a Mamdani-type fuzzy inference system models the relationships between traffic and geometric variables and pedestrian stress. The model incorporates vehicle density, pedestrian density, lane width, sidewalk width, buffer zone, and estimated traffic flow speed as input variables, represented using triangular membership functions. Genetic Algorithm (GA) optimization is applied to calibrate membership-function parameters, improving numerical consistency without altering the linguistic structure of the model. A comprehensive rule base is implemented in MATLAB (R2024b) to generate a continuous traffic stress score ranging from 0 to 10. The framework is applied to street segments surrounding major schools in the study area, enabling comparison of spatial variations in pedestrian stress. The results demonstrate how combinations of traffic intensity and street geometry influence stress levels, supporting data-driven pedestrian safety interventions for sustainable school environments and low-stress urban mobility. Full article
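The triangular membership functions mentioned in this abstract are a standard fuzzy-logic building block and can be sketched directly; the breakpoints below are invented for illustration, not the calibrated parameters from the paper's GA optimization.

```python
def tri_membership(x, a, b, c):
    """Triangular membership with feet a, c and peak b: degree to which
    x belongs to a linguistic term (e.g., 'narrow sidewalk')."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# Assumed breakpoints for a 'narrow sidewalk' term over widths in meters.
mu_peak = tri_membership(1.5, 0.0, 1.5, 3.0)   # full membership at the peak
mu_half = tri_membership(0.75, 0.0, 1.5, 3.0)  # partial membership
mu_foot = tri_membership(3.0, 0.0, 1.5, 3.0)   # zero outside the support
```

A Mamdani system evaluates many such degrees across its rule base and aggregates them into the continuous 0–10 stress score the study reports.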
33 pages, 4465 KB  
Article
Environmentally Sustainable HVAC Management in Smart Buildings Using a Reinforcement Learning Framework SACEM
by Abdullah Alshammari, Ammar Ahmed E. Elhadi and Ashraf Osman Ibrahim
Sustainability 2026, 18(2), 1036; https://doi.org/10.3390/su18021036 - 20 Jan 2026
Abstract
Heating, ventilation, and air-conditioning (HVAC) systems dominate energy consumption in hot-climate buildings, where maintaining occupant comfort under extreme outdoor conditions remains a critical challenge, particularly under emerging time-of-use (TOU) electricity pricing schemes. While deep reinforcement learning (DRL) has shown promise for adaptive HVAC control, existing approaches often suffer from comfort violations, myopic decision making, and limited robustness to uncertainty. This paper proposes a comfort-first hybrid control framework that integrates Soft Actor–Critic (SAC) with a Cross-Entropy Method (CEM) refinement layer, referred to as SACEM. The framework combines data-efficient off-policy learning with short-horizon predictive optimization and safety-aware action projection to explicitly prioritize thermal comfort while minimizing energy use, operating cost, and peak demand. The control problem is formulated as a Markov Decision Process using a simplified thermal model representative of commercial buildings in hot desert climates. The proposed approach is evaluated through extensive simulation using Saudi Arabian summer weather conditions, realistic occupancy patterns, and a three-tier TOU electricity tariff. Performance is assessed against state-of-the-art baselines, including PPO, TD3, and standard SAC, using comfort, energy, cost, and peak demand metrics, complemented by ablation and disturbance-based stress tests. Results show that SACEM achieves a comfort score of 95.8%, while reducing energy consumption and operating cost by approximately 21% relative to the strongest baseline. The findings demonstrate that integrating comfort-dominant reward design with decision-time look-ahead yields robust, economically viable HVAC control suitable for deployment in hot-climate smart buildings. Full article
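The Cross-Entropy Method refinement layer named in this abstract can be sketched in one dimension: sample candidate actions around the policy's proposal, keep the elite fraction by predicted cost, and refit the sampling distribution. The toy quadratic cost below is an assumption; the paper applies this to HVAC setpoints scored by a thermal model with comfort-first rewards.

```python
import random
import statistics

def cem_refine(evaluate, mu=0.0, sigma=1.0, n_samples=64, n_elite=8,
               iters=6, seed=0):
    """Cross-Entropy Method sketch for refining a 1-D action.

    evaluate: cost function (lower is better). Each iteration samples
    candidates from N(mu, sigma), keeps the n_elite cheapest, and refits
    mu and sigma to the elites.
    """
    rng = random.Random(seed)
    for _ in range(iters):
        cands = [rng.gauss(mu, sigma) for _ in range(n_samples)]
        elites = sorted(cands, key=evaluate)[:n_elite]
        mu = sum(elites) / n_elite
        sigma = statistics.pstdev(elites) + 1e-6  # avoid collapse to zero
    return mu

# Hypothetical cost: best action is 2.0 (e.g., a setpoint offset).
best = cem_refine(lambda a: (a - 2.0) ** 2)
```

In the hybrid framework, the SAC policy supplies the initial mu, and a short-horizon predictive model plays the role of `evaluate`.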
27 pages, 1619 KB  
Article
Uncertainty-Aware Multimodal Fusion and Bayesian Decision-Making for DSS
by Vesna Antoska Knights, Marija Prchkovska, Luka Krašnjak and Jasenka Gajdoš Kljusurić
AppliedMath 2026, 6(1), 16; https://doi.org/10.3390/appliedmath6010016 - 20 Jan 2026
Abstract
Uncertainty-aware decision-making increasingly relies on multimodal sensing pipelines that must fuse correlated measurements, propagate uncertainty, and trigger reliable control actions. This study develops a unified mathematical framework for multimodal data fusion and Bayesian decision-making under uncertainty. The approach integrates adaptive Covariance Intersection (aCI) for correlation-robust sensor fusion, a Gaussian state–space backbone with Kalman filtering, heteroskedastic Bayesian regression with full posterior sampling via an affine-invariant MCMC sampler, and a Bayesian likelihood-ratio test (LRT) coupled to a risk-sensitive proportional–derivative (PD) control law. Theoretical guarantees are provided by bounding the state covariance under stability conditions, establishing convexity of the aCI weight optimization on the simplex, and deriving a Bayes-risk-optimal decision threshold for the LRT under symmetric Gaussian likelihoods. A proof-of-concept agro-environmental decision-support application is considered, where heterogeneous data streams (IoT soil sensors, meteorological stations, and drone-derived vegetation indices) are fused to generate early-warning alarms for crop stress and to adapt irrigation and fertilization inputs. The proposed pipeline reduces predictive variance and sharpens posterior credible intervals (up to 34% narrower 95% intervals and 44% lower NLL/Brier score under heteroskedastic modeling), while a Bayesian uncertainty-aware controller achieves 14.2% lower water usage and 35.5% fewer false stress alarms compared to a rule-based strategy. The framework is mathematically grounded yet domain-independent, providing a probabilistic pipeline that propagates uncertainty from raw multimodal data to operational control actions, and can be transferred beyond agriculture to robotics, signal processing, and environmental monitoring applications. Full article
(This article belongs to the Section Probabilistic & Statistical Mathematics)
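The covariance intersection fusion at the core of this abstract can be sketched as follows. The paper optimizes the weight convexly on the simplex; the grid search and toy covariances below are illustrative stand-ins.

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, n_grid=101):
    """Fuse two estimates with unknown cross-correlation via CI:
    P^-1 = w*P1^-1 + (1-w)*P2^-1, with w chosen to minimize trace(P).
    Grid search stands in for the convex optimization in the paper.
    """
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    best = None
    for w in np.linspace(0.0, 1.0, n_grid):
        info = w * P1i + (1 - w) * P2i
        if np.linalg.det(info) <= 0:
            continue
        P = np.linalg.inv(info)
        if best is None or np.trace(P) < best[0]:
            best = (np.trace(P), w, P)
    _, w, P = best
    x = P @ (w * P1i @ x1 + (1 - w) * P2i @ x2)
    return x, P, w

# Two sensors, each precise in a different axis (assumed covariances).
x1, P1 = np.array([1.0, 0.0]), np.diag([1.0, 4.0])
x2, P2 = np.array([1.2, 0.1]), np.diag([4.0, 1.0])
xf, Pf, w = covariance_intersection(x1, P1, x2, P2)
```

The fused covariance trace never exceeds that of either input, which is the consistency guarantee that makes CI safe when cross-correlations are unknown.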
23 pages, 1109 KB  
Review
A Review of End-to-End Decision Optimization Research: An Architectural Perspective
by Wenya Zhang and Gendao Li
Algorithms 2026, 19(1), 86; https://doi.org/10.3390/a19010086 - 20 Jan 2026
Abstract
Traditional decision optimization methods primarily focus on model construction and solution, leaving parameter estimation and inter-variable relationships to statistical research. The traditional approach divides problem-solving into two independent stages: predict first and then optimize. This decoupling leads to the propagation of prediction errors: even minor inaccuracies in predictions can be amplified into significant decision biases during the optimization phase. To tackle this issue, scholars have proposed end-to-end decision optimization methods, which integrate the prediction and decision-making stages into a unified framework. By doing so, these approaches effectively mitigate error propagation and enhance overall decision performance. From an architectural design perspective, this review focuses on categorizing end-to-end decision optimization methods based on how the prediction and decision modules are integrated. It classifies mainstream approaches into three typical paradigms: constructing closed-loop loss functions, building differentiable optimization layers, and parameterizing the representation of optimization problems. It also examines their implementation pathways leveraging deep learning technologies. The strengths and limitations of these paradigms essentially stem from the inherent trade-offs in their architectural designs. Through a systematic analysis of existing research, this paper identifies key challenges in three core areas: data, variable relationships, and gradient propagation. Among these, handling non-convexity and complex constraints is critical for model generalization, while quantifying decision-dependent endogenous uncertainty remains an indispensable challenge for practical deployment. Full article
23 pages, 627 KB  
Article
Harnessing Blockchain for Transparent and Sustainable Accounting in Creative MSMEs amid Digital Disruption: Evidence from Indonesia
by I Made Dwi Hita Darmawan, Ni Putu Noviyanti Kusuma, Nir Kshetri, Ketut Tri Budi Artani and Wina Pertiwi Putri Wardani
J. Risk Financial Manag. 2026, 19(1), 80; https://doi.org/10.3390/jrfm19010080 - 20 Jan 2026
Abstract
Blockchain is widely promoted as a tool for enhancing transparency, trust, and sustainability in business, yet little is known about how creative micro, small, and medium enterprises (MSMEs) in emerging economies can meaningfully adopt it for finance and accounting purposes in times of global uncertainty. This study explores how blockchain can be harnessed for transparent and sustainable accounting in Indonesian creative MSMEs amid rapid digital disruption. Using an exploratory qualitative design, we conducted semi-structured, in-depth interviews with 18 owners and key decision-makers across diverse creative subsectors and analysed the data thematically through an integrated Technology Acceptance Model (TAM) and Diffusion of Innovation (DOI) lens. The findings show that participants recognise blockchain’s potential benefits for transaction transparency, verifiable records, intellectual property protection, and secure payments, but adoption is constrained by technical complexity, financial constraints, limited digital and accounting capabilities, and perceived regulatory and reputational risks. Government initiatives are seen as important for legitimacy yet insufficient without concrete guidance, capacity-building, and financial support. The study extends TAM–DOI applications to blockchain-enabled accounting in creative MSMEs and highlights the need for sequenced, ecosystem-based interventions to translate blockchain’s technical promise into accessible, ESG- and SDG-oriented accounting solutions in the creative economy. Full article
19 pages, 2065 KB  
Article
Multiscale Wind Forecasting Using Explainable-Adaptive Hybrid Deep Learning
by Fatih Serttas
Appl. Sci. 2026, 16(2), 1020; https://doi.org/10.3390/app16021020 - 19 Jan 2026
Abstract
This study presents a multiscale, uncertainty-aware hybrid deep learning approach addressing the short-term wind speed prediction problem, which is critical for the reliable planning and operation of wind energy systems. Wind signals are decomposed using adaptive variational mode decomposition (VMD), and the resulting wind components are processed together with meteorological data through a dual-stream CNN–BiLSTM architecture. Based on this multiscale representation, probabilistic forecasts are generated using quantile regression to capture best- and worst-case scenarios for decision-making purposes. Unlike fixed prediction intervals, the proposed approach produces adaptive prediction bands that expand during unstable wind conditions and contract during calm periods. The developed model is evaluated using four years of meteorological data from the Afyonkarahisar region of Türkiye. While the proposed model achieves competitive point forecasting performance (RMSE = 0.700 m/s and MAE = 0.54 m/s), its main contribution lies in providing reliable probabilistic forecasts through well-calibrated uncertainty quantification, offering decision-relevant information beyond single-point predictions. The proposed method is compared with a classical CNN–LSTM and several structural variants. Furthermore, SHAP-based explainability analysis indicates that seasonal and solar-related variables play a dominant role in the forecasting process. Full article
(This article belongs to the Topic Advances in Wind Energy Technology: 2nd Edition)
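The quantile-regression forecasts described in this abstract are typically trained with the pinball loss, which can be sketched as follows; this is the standard formulation, not code from the paper.

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss: penalizes under- and over-prediction
    asymmetrically, so minimizing it yields the q-th conditional
    quantile rather than the mean."""
    losses = []
    for yt, yp in zip(y_true, y_pred):
        e = yt - yp
        losses.append(q * e if e >= 0 else (q - 1) * e)
    return sum(losses) / len(losses)

# At q = 0.9, under-predicting by 1 m/s costs 0.9; over-predicting
# by 1 m/s costs only 0.1 -- pushing predictions toward the upper band.
under = pinball_loss([1.0], [0.0], 0.9)
over = pinball_loss([0.0], [1.0], 0.9)
```

Training separate models at low and high quantiles is what produces the adaptive prediction bands that widen in unstable wind and narrow in calm periods.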
40 pages, 2030 KB  
Article
A Climate–Geomechanics Interface for Adaptive and Resilient Geostructures
by Tamara Bračko and Bojan Žlender
Climate 2026, 14(1), 23; https://doi.org/10.3390/cli14010023 - 19 Jan 2026
Abstract
Geostructures, such as foundations, embankments, retaining structures, bridge abutments, and both natural and engineered slopes, interact with the ground to ensure structural safety and functionality. One significant factor influencing these systems is climate, which continuously affects soil conditions through dynamic processes. Over the past century, climate change has intensified, increasing uncertainties regarding the safety of both existing and planned geostructures. While the impacts of climate change on geostructures are evident, effective methods to address them remain uncertain. This paper presents an approach for mitigating and adapting to climate change impacts through a stepwise geomechanical analysis and geotechnical design framework that incorporates expected climatic conditions. A novel framework is introduced that systematically integrates projected climate scenarios into geomechanical modeling, enabling climate-resilient design of geostructures. The concept establishes an interface between climate effects and geomechanical data, capturing the causal chain of climate hazards, their effects, and potential consequences. The proposed interface provides a practical tool for integrating climate considerations into geotechnical design, supporting adaptive and resilient infrastructure planning. The approach is demonstrated across different geostructure types, with a detailed slope stability analysis illustrating its implementation. Results show that the interface, reflecting processes such as water infiltration, soil hydraulic conductivity, and groundwater flow, is often critical to slope stability outcomes. Furthermore, slope stability can often be maintained through simple, timely implemented nature-based solutions (NbS), whereas delayed actions typically require more complex and costly interventions. Full article
28 pages, 973 KB  
Article
Computable Reformulation of Data-Driven Distributionally Robust Chance Constraints: Validated by Solution of Capacitated Lot-Sizing Problems
by Hua Deng and Zhong Wan
Mathematics 2026, 14(2), 331; https://doi.org/10.3390/math14020331 - 19 Jan 2026
Abstract
Uncertainty in optimization models often causes awkward properties in their deterministic equivalent formulations (DEFs), even for simple linear models. Chance-constrained programming is a reasonable tool for handling optimization problems with random parameters in objective functions and constraints, but it assumes that the distribution of these random parameters is known, and its DEF is often associated with the complicated computation of multiple integrals, hence impeding its extensive applications. In this paper, for optimization models with chance constraints, the historical data of random model parameters are first exploited to construct an adaptive approximate density function by incorporating piecewise linear interpolation into the well-known histogram method, so as to remove the assumption of a known distribution. Then, in view of this estimation, a novel confidence set only involving finitely many variables is constructed to depict all the potential distributions for the random parameters, and a computable reformulation of data-driven distributionally robust chance constraints is proposed. By virtue of such a confidence set, it is proven that the deterministic equivalent constraints are reformulated as several ordinary constraints in line with the principles of the distributionally robust optimization approach, without the need to solve complicated semi-definite programming problems, compute multiple integrals, or solve additional auxiliary optimization problems, as done in existing works. 
The proposed method is further validated by the solution of the stochastic multiperiod capacitated lot-sizing problem, and the numerical results demonstrate that: (1) The proposed method can significantly reduce the computational time needed to find a robust optimal production strategy compared with similar ones in the literature; (2) The optimal production strategy provided by our method can maintain moderate conservatism, i.e., it has the ability to achieve a better trade-off between cost-effectiveness and robustness than existing methods. Full article
(This article belongs to the Section D: Statistics and Operational Research)
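The density-estimation step described in this abstract, a histogram combined with piecewise linear interpolation, can be sketched as follows. Normalization and the adaptive bin choices are simplified here; this is an illustrative stand-in, not the paper's construction.

```python
def piecewise_linear_density(data, n_bins=10):
    """Approximate density from historical samples: histogram densities
    at bin centers, linearly interpolated between them. Tails are held
    flat at the outermost bin values (a simplification)."""
    lo, hi = min(data), max(data)
    width = (hi - lo) / n_bins
    counts = [0] * n_bins
    for x in data:
        i = min(int((x - lo) / width), n_bins - 1)
        counts[i] += 1
    centers = [lo + (i + 0.5) * width for i in range(n_bins)]
    dens = [c / (len(data) * width) for c in counts]

    def f(x):
        if x <= centers[0]:
            return dens[0]
        if x >= centers[-1]:
            return dens[-1]
        for i in range(n_bins - 1):
            if centers[i] <= x <= centers[i + 1]:
                t = (x - centers[i]) / (centers[i + 1] - centers[i])
                return dens[i] + t * (dens[i + 1] - dens[i])
    return f

# Roughly uniform samples on [0, 1]: density should be close to 1.
f = piecewise_linear_density([i / 99 for i in range(100)])
```

In the paper this estimate seeds a finite-dimensional confidence set of candidate distributions, replacing the usual known-distribution assumption of chance-constrained programming.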
24 pages, 12276 KB  
Article
COVAS: Highlighting the Importance of Outliers in Classification Through Explainable AI
by Sebastian Roth, Adrien Cerrito, Samuel Orth, Ulrich Hartmann and Daniel Friemert
Mach. Learn. Knowl. Extr. 2026, 8(1), 24; https://doi.org/10.3390/make8010024 - 19 Jan 2026
Abstract
Understanding the decision-making behavior of machine learning models is essential in domains where individual predictions matter, such as medical diagnosis or sports analytics. While explainable artificial intelligence (XAI) methods such as SHAP provide instance-level feature attributions, they mainly summarize typical decision behavior and offer limited support for systematically exploring atypical yet correctly classified cases. In this work, we introduce the Classification Outlier Variability Score (COVAS), a framework designed to support hypothesis generation through the analysis of explanation variability. COVAS operates in the explanation space and builds directly on SHAP value representations. It quantifies how strongly an individual instance’s SHAP-based explanation deviates from class-specific attribution patterns by aggregating standardized SHAP deviations into a single score. Consequently, the applicability of COVAS inherits the model- and data-agnostic properties of SHAP, provided that explanations can be computed for the underlying model and data. We evaluate COVAS on publicly available datasets from the medical and sports domains. The results show that COVAS reveals explanation-space outliers not captured by feature-space outlier detection or prediction uncertainty measures. Robustness analyses demonstrate stability across parameter choices, class imbalance, model initialization, and model classes. Overall, COVAS complements existing XAI techniques by enabling targeted instance-level inspection and facilitating XAI-guided hypothesis formulation. Full article
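The aggregation idea behind COVAS, standardized deviations of an instance's attributions from its class's mean pattern, can be sketched as follows. The exact aggregation in the paper may differ; the attribution vectors below are invented, not real SHAP output.

```python
import math

def covas_score(class_shap, instance_shap):
    """Mean standardized absolute deviation of one instance's attribution
    vector from the per-feature mean/std of its class (sketch of the
    COVAS idea). Higher scores flag explanation-space outliers."""
    n_feat = len(instance_shap)
    n = len(class_shap)
    score = 0.0
    for j in range(n_feat):
        col = [row[j] for row in class_shap]
        mu = sum(col) / n
        sd = math.sqrt(sum((v - mu) ** 2 for v in col) / n) or 1e-12
        score += abs(instance_shap[j] - mu) / sd
    return score / n_feat

# Hypothetical class attribution patterns over two features.
class_shap = [[1.0, 0.0], [1.1, 0.1], [0.9, -0.1]]
score_typical = covas_score(class_shap, [1.0, 0.0])
score_outlier = covas_score(class_shap, [3.0, 2.0])
```

A correctly classified instance with a high score is explained atypically, which is exactly the kind of case the framework surfaces for hypothesis generation.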
33 pages, 326 KB  
Article
Intelligent Risk Identification in Construction Projects: A Case Study of an AI-Based Framework
by Kristijan Vilibić, Zvonko Sigmund and Ivica Završki
Buildings 2026, 16(2), 409; https://doi.org/10.3390/buildings16020409 - 19 Jan 2026
Abstract
Risk management in large-scale construction projects is a critical yet complex process influenced by financial, safety, environmental, scheduling, and regulatory uncertainties. Effective risk management contributes directly to project optimization by minimizing disruptions, controlling costs, and enhancing decision-making efficiency. Early identification and mitigation of risks allow resources to be allocated where they have the greatest effect, thereby optimizing overall project outcomes. However, conventional methods such as expert judgment and probabilistic modeling often struggle to process extensive datasets and complex interdependencies among risk factors. This study explores the potential of an AI-based framework for risk identification, utilizing artificial intelligence to analyze project documentation and generate a preliminary set of identified risks. The proposed methodology is implemented on the ‘Trg pravde’ judicial infrastructure project in Zagreb, Croatia, applying AI models (GPT-5, Gemini 2.5, Sonnet 4.5) to identify phase-specific risks throughout the project lifecycle. The approach aims to improve the efficiency of risk identification, reduce human bias, and align with established project management methodologies such as PM2. Initial findings suggest that the use of AI may broaden the range of identified risks and support more structured risk analysis, indicating its potential value as a complementary tool in risk management processes. However, human expertise remains crucial for prioritization, contextual interpretation, and mitigation. The study demonstrates that AI augments, rather than replaces, traditional risk management practices, enabling more proactive and data-driven decision-making in construction projects. Full article
(This article belongs to the Special Issue Applying Artificial Intelligence in Construction Management)
21 pages, 10379 KB  
Article
Spatial Optimization of Urban-Scale Sponge Structures and Functional Areas Using an Integrated Framework Based on a Hydrodynamic Model and GIS Technique
by Mengxiao Jin, Quanyi Zheng, Yu Shao, Yong Tian, Jiang Yu and Ying Zhang
Water 2026, 18(2), 262; https://doi.org/10.3390/w18020262 - 19 Jan 2026
Abstract
Rapid urbanization has exacerbated urban-stormwater challenges, highlighting the critical need for coordinated surface-water and groundwater management through rainfall recharge. However, current sponge city construction methods often overlook the crucial role of underground aquifers in regulating the water cycle and mostly rely on simplified engineering approaches. To address these limitations, this study proposes a spatial optimization framework for urban-scale sponge systems that integrates a hydrodynamic model (FVCOM), geographic information systems (GIS), and Monte Carlo simulations. This framework establishes a comprehensive evaluation system that synergistically integrates surface water inundation depth, geological lithology, and groundwater depth to quantitatively assess sponge city suitability. The FVCOM was employed to simulate surface water inundation processes under extreme rainfall scenarios, while GIS facilitated spatial analysis and data integration. Monte Carlo simulation was used to objectively determine factor weights, optimize the spatial layout, and evaluate the uncertainty of the results. Using Shenzhen City in China as a case study, this research combined the “matrix-corridor-patch” theory from landscape ecology to optimize the spatial structure of the sponge system. Furthermore, differentiated planning and management strategies were proposed based on regional characteristics and uncertainty analysis. The research findings provide a replicable and verifiable methodology for developing sponge city systems in high-density urban areas. The core value of this methodology lies in its creation of a scientific decision-making tool for direct application in urban planning. This tool can significantly enhance a city’s climate resilience and facilitate the coordinated, optimal management of water resources amid environmental changes. Full article
(This article belongs to the Special Issue "Watershed–Urban" Flooding and Waterlogging Disasters)
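The Monte Carlo weighting step described in the abstract — sampling factor weights and measuring how much the composite suitability varies — can be sketched in a few lines. Everything below is a hypothetical illustration: the three tiny grids are invented stand-ins for the real FVCOM/GIS layers, and the flat Dirichlet weight prior is an assumed sampling scheme, not necessarily the authors'.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical normalized suitability layers (0 = unsuitable, 1 = suitable)
# on a tiny 2x2 grid; real layers would come from FVCOM output and GIS data.
inundation = np.array([[0.2, 0.8], [0.5, 0.9]])
lithology = np.array([[0.7, 0.6], [0.3, 0.8]])
groundwater = np.array([[0.9, 0.4], [0.6, 0.7]])
layers = np.stack([inundation, lithology, groundwater])  # (3, 2, 2)

# Monte Carlo: draw factor weights from a flat Dirichlet so each draw
# sums to 1, then compute the weighted-overlay suitability per draw.
n_draws = 5000
weights = rng.dirichlet(np.ones(layers.shape[0]), size=n_draws)  # (n, 3)
suitability = np.einsum("nk,kij->nij", weights, layers)          # (n, 2, 2)

mean_suit = suitability.mean(axis=0)  # central suitability estimate
std_suit = suitability.std(axis=0)    # uncertainty from weight choice
```

Cells with a high `mean_suit` but low `std_suit` are robustly suitable regardless of how the factors are weighted; high-variance cells are the ones where weight uncertainty changes the planning conclusion.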
26 pages, 4506 KB  
Article
Global Tea Production Forecasting Using ARIMA Models: A Multi-Country Time-Series Analysis (1961–2028)
by Hediye Kumbasaroglu
Sustainability 2026, 18(2), 1005; https://doi.org/10.3390/su18021005 - 19 Jan 2026
Abstract
Understanding the long-term dynamics of global tea production is essential for assessing supply stability, climate sensitivity, and producer competitiveness. This study examines annual tea production data for major producing countries—China, India, Kenya, Sri Lanka, Türkiye, Vietnam, and other producer groups—over the period 1961–2023 and provides production forecasts for 2024–2028 using country-specific ARIMA models. Unlike most existing studies focusing on single countries or short-term horizons, this research offers a unified multi-country and long-term comparative framework that integrates time-series forecasting with market concentration indicators. The results reveal pronounced cross-country heterogeneity in production behavior, with China exhibiting strong structural growth, while other producers display more moderate or climate-sensitive patterns. Forecasts suggest a continued increase in global tea production toward 2028, although projections are subject to uncertainty, as reflected by model-based confidence intervals. Overall, the study contributes robust, statistically validated insights to support evidence-based strategies for sustainable tea supply and international market planning. These forecasts highlight a robust upward trend in global tea supply due to both technological advancements and market expansion. Full article
(This article belongs to the Section Economic and Business Aspects of Sustainability)