Search Results (3,753)

Search Parameters:
Keywords = Time Series Forecasting

18 pages, 606 KB  
Article
Information-Preserving Spiking for Accurate Time-Series Forecasting in Spiking Neural Networks
by Jiwoo Lee and Eun-Kyu Lee
Electronics 2026, 15(8), 1597; https://doi.org/10.3390/electronics15081597 - 10 Apr 2026
Abstract
Deep learning models have achieved high accuracy in forecasting problems, but at the cost of large computational energy demand. Brain-inspired spiking neural networks (SNNs) offer a promising, low-power alternative, yet their adoption for time-series forecasting has been limited by information loss from binary spikes and degraded performance in deeper networks. This paper proposes a fully spiking framework that bridges this gap by improving both the encoding and propagation of information in SNNs. The framework introduces a hybrid Delta-Rate encoding mechanism that captures both abrupt changes and gradual trends in time-series data, and a Mem-Spike mechanism that transmits analog membrane potential values to preserve fine-grained information between spiking layers. We further employ residual membrane connections to maintain signal flow in deep spiking networks. Using two public energy load datasets, our enhanced SNNs consistently outperform conventional spiking models, improving prediction accuracy by up to 61.6% and mitigating degradation in multi-layer networks. Notably, the framework narrows the gap to the selected deep learning baseline (LSTM), achieving comparable accuracy in some settings while requiring only about 10% of the estimated inference energy of that baseline under a common operation-level model. These results show that, within the empirical scope considered here, enhanced conventional SNNs can improve time-series forecasting accuracy while retaining favorable estimated efficiency.
(This article belongs to the Special Issue Feature Papers in Artificial Intelligence)
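
The abstract describes the Delta-Rate encoder only at a high level. The NumPy sketch below is one plausible reading, not the paper's implementation: a delta channel that spikes on abrupt changes and a rate channel whose spike probability tracks the normalized signal level. The threshold value and the Bernoulli rate coding are illustrative assumptions.

```python
import numpy as np

def delta_rate_encode(x, delta_threshold=0.5, rng=None):
    """Hypothetical Delta-Rate encoder returning two binary spike trains.

    Delta channel: spikes where |x[t] - x[t-1]| exceeds delta_threshold
    (abrupt changes). Rate channel: Bernoulli spikes whose probability is
    the min-max normalized signal level (gradual trends)."""
    rng = rng or np.random.default_rng(0)
    dx = np.abs(np.diff(x, prepend=x[0]))
    delta_spikes = (dx > delta_threshold).astype(np.int8)
    level = (x - x.min()) / (x.max() - x.min() + 1e-12)
    rate_spikes = (rng.random(len(x)) < level).astype(np.int8)
    return delta_spikes, rate_spikes

# Toy load-like series: slow oscillation plus one abrupt jump.
t = np.linspace(0, 4 * np.pi, 200)
load = np.sin(t) + np.where(t > 2 * np.pi, 1.5, 0.0)
d, r = delta_rate_encode(load)
print(d.sum(), "delta spikes;", r.sum(), "rate spikes")
```
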
50 pages, 1663 KB  
Review
Advances in Similar Day Methods for Short-Term Load Forecasting for Power Systems
by Monica Borunda, Luis Conde-López, Gerardo Ruiz-Chavarría, Guadalupe Lopez Lopez, Victor M. Alvarado and Edgardo de Jesús Carrera Avendaño
Forecasting 2026, 8(2), 32; https://doi.org/10.3390/forecast8020032 - 10 Apr 2026
Abstract
Short-term load forecasting is essential for the reliable, secure, efficient, and economic operation of modern power systems and electricity markets. Among many forecasting strategies, the similar day (SD) approach for short-term load forecasting was among the earliest used to assess power demand and remains one of the most intuitive and widely adopted techniques worldwide. However, over time, increasing system complexity, richer datasets, and advances in computational intelligence have led to the evolution of SD methodologies beyond heuristic-based rule formulations. This work presents a study of the relevant literature on short-term load forecasting using SD methods reported between 2000 and 2025. This study analyzes how similarity is defined, how forecasts are generated, and how both stages interact within the complete forecasting process in the reviewed literature. Based on these criteria, a unified taxonomy is proposed to classify SD methods into conventional, intelligent, and hybrid formulations. This study provides insight into the methodologies, their performance, and the systems in which they have been tested. The results show that SD-based approaches remain competitive for short-term forecasting and that incorporating artificial intelligence techniques can further enhance their accuracy.
(This article belongs to the Topic Short-Term Load Forecasting—2nd Edition)
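
In the review's terms, the conventional SD formulation reduces to "find the k most similar historical days and average their load profiles." A minimal sketch of that baseline follows; the feature choice (weekday, daily temperature), the Euclidean similarity metric, and k are illustrative assumptions.

```python
import numpy as np

def similar_day_forecast(hist_features, hist_loads, target_features, k=5):
    """Conventional similar-day (SD) baseline: select the k historical days
    whose feature vectors are closest to the target day in Euclidean
    distance, then average their 24-h load profiles."""
    dists = np.linalg.norm(hist_features - target_features, axis=1)
    nearest = np.argsort(dists)[:k]
    return hist_loads[nearest].mean(axis=0)

# Toy data: 365 days of (weekday, daily mean temperature) and 24-h loads.
rng = np.random.default_rng(1)
feats = np.column_stack([rng.integers(0, 7, 365), rng.normal(20, 8, 365)])
loads = rng.normal(100, 10, (365, 24))
print(similar_day_forecast(feats, loads, np.array([2, 21.5])).shape)  # (24,)
```
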
35 pages, 856 KB  
Article
Stock Forecasting Based on Informational Complexity Representation: A Framework of Wavelet Entropy, Multiscale Entropy, and Dual-Branch Network
by Guisheng Tian, Chengjun Xu and Yiwen Yang
Entropy 2026, 28(4), 424; https://doi.org/10.3390/e28040424 - 10 Apr 2026
Abstract
Stock price sequences are characterized by pronounced nonlinearity, non-stationarity, and multi-scale volatility. They are further influenced by complex, multi-source factors, such as macroeconomic conditions and market behavior, making high-precision forecasting highly challenging. Existing approaches are limited by noise and multi-dimensional market features, as well as difficulties in balancing prediction accuracy with model complexity. To address these challenges, we propose Wavelet Entropy and Cross-Attention Network (WECA-Net), which combines wavelet decomposition with a multimodal cross-attention mechanism. From an information-theoretic perspective, stock price dynamics reflect the time-varying uncertainty and informational complexity of the market. We employ wavelet entropy to quantify the dispersion and uncertainty of energy distribution across frequency bands, and multiscale entropy to measure the scale-dependent complexity and regularity of the time series. These entropy-derived descriptors provide an interpretable prior of “information content” for cross-modal attention fusion, thereby improving robustness and generalization under non-stationary market conditions. Experiments on Chinese stock indices, A-Share, and CSI 300 component stock datasets demonstrate that WECA-Net consistently outperforms mainstream models in Mean Absolute Error (MAE) and R² across all datasets. Notably, on the CSI 300 dataset, WECA-Net achieves an R² of 0.9895, underscoring its strong predictive accuracy and practical applicability. This framework is also well aligned with sensor data fusion and intelligent perception paradigms, offering a robust solution for financial signal processing and real-time market state awareness.
(This article belongs to the Section Complexity)
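
WECA-Net itself is not reproduced here, but its wavelet-entropy descriptor has a standard definition: Shannon entropy of the relative wavelet energies across decomposition bands. A self-contained sketch using PyWavelets follows; the wavelet family and decomposition level are illustrative assumptions.

```python
import numpy as np
import pywt  # PyWavelets

def wavelet_entropy(x, wavelet="db4", level=4):
    """Shannon entropy of the relative wavelet energy across bands:
    low entropy = energy concentrated in few bands (ordered signal),
    high entropy = energy spread out (complex / uncertain signal)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    energies = np.array([np.sum(c ** 2) for c in coeffs])
    p = energies / energies.sum()
    return -np.sum(p * np.log(p + 1e-12))

rng = np.random.default_rng(0)
trend = np.cumsum(rng.normal(0, 1, 1024))   # random-walk "price"
noise = rng.normal(0, 1, 1024)              # white noise
print(wavelet_entropy(trend), wavelet_entropy(noise))
```
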
19 pages, 73201 KB  
Article
Deterministic Drivers of Microbial Community Succession in Nongxiang Daqu Fermentation: Fungi Exhibit Stronger Environmental Selection Imprints than Bacteria
by Dongmei Wang, Fei Wang, Ping Tang, Lei Wang, Yusheng Xie, Maosen Xiong, Qian Luo, Yanping Luo, Dan Huang and Lei Yang
Fermentation 2026, 12(4), 193; https://doi.org/10.3390/fermentation12040193 - 10 Apr 2026
Abstract
Microbial communities are the fundamental determinants of Nongxiang Daqu quality. In this study, we systematically investigated the assembly and succession mechanisms of microbial communities during Nongxiang Daqu fermentation. Our findings reveal that this ecological succession is primarily driven by deterministic processes, encompassing dynamic environmental variables and interspecific microbial interactions. Significant stage-specific temporal variations in the community structure were observed, and biomarkers identified via a random forest model further corroborated these dynamic successional patterns. Both the neutral community model and Modified Stochasticity Ratio (MST) tests demonstrated that community assembly is dominated by deterministic processes, the influence of which intensifies as fermentation progresses. Notably, the fungal community exhibited a more pronounced response to these deterministic environmental selections than the bacterial community. Furthermore, co-occurrence network analysis, Mantel tests, and redundancy analysis (RDA) collectively illustrated that microbial interactions and environmental factors—specifically temperature, humidity, oxygen, carbon dioxide, and acidity—synergistically regulate this succession. Crucially, the rates of change in these environmental parameters directly dictated the pace of microbial turnover. Among these, oxygen and acidity had the greatest influence: oxygen accounted for 17.32% and 29.05% of the effects on fungi and bacteria, respectively, while acidity accounted for 16.77% and 25.23%, respectively. Time-series forecasting indicated that community structural assembly and stabilization predominantly conclude within the initial 30 days of fermentation. Ultimately, this study uncovers the ecological driving forces shaping the Nongxiang Daqu microbiome, providing a vital theoretical foundation for the targeted regulation of Daqu microecology and the enhancement of product quality.
(This article belongs to the Section Fermentation for Food and Beverages)
20 pages, 1293 KB  
Article
Enhancing Long-Term Forecasting Stability in Smart Grids: A Hybrid Mamba-LSTM-Attention Framework
by Fusheng Chen, Chong Fo Lei, Te Guo and Chiawei Chu
Energies 2026, 19(8), 1855; https://doi.org/10.3390/en19081855 - 9 Apr 2026
Abstract
Accurate multivariate long-term time series forecasting (LTSF) is critical for smart grid operations. However, non-stationary distribution shifts frequently induce compounding error accumulation in conventional architectures. This study proposes the Mamba-LSTM-Attention (MLA) framework, a distribution-aware architecture engineered for forecasting stability. The pipeline integrates Reversible Instance Normalization (RevIN) to neutralize statistical drift. To address computational bottlenecks, the architecture utilizes a linear-time Selective State Space Model (Mamba) to capture global trend dynamics, cascaded with a single-layer gated Long Short-Term Memory (LSTM) unit to model localized non-linear residuals. A terminal information bottleneck structurally bounds cross-step error propagation. Empirical results across standard ETT and Electricity benchmarks reveal a precision–stability trade-off. By prioritizing structural resilience, the MLA framework limits error accumulation on highly volatile datasets, yielding MSEs of 0.210 and 0.128 on ETTh2 and ETTm2 at the T = 96 horizon. This structural bottleneck inherently smooths high-frequency periodic patterns, yielding lower absolute accuracy on stationary benchmarks such as ETTh1 and ETTm1. Ultimately, the architecture establishes a computationally efficient, structurally stable baseline tailored for non-stationary anomaly tracking in smart grids.
(This article belongs to the Special Issue Forecasting Electricity Demand Using AI and Machine Learning)
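
RevIN is a published technique (Kim et al., ICLR 2022): normalize each input window by its own statistics before the forecaster and invert the transform on the output, so the backbone never sees the distribution drift. The PyTorch sketch below is a minimal statistics-only variant (no learnable affine parameters); tensor shapes and the backbone stand-in are assumptions.

```python
import torch

class RevIN(torch.nn.Module):
    """Minimal Reversible Instance Normalization: per-instance, per-variable
    standardization over the time axis, with an exact inverse applied to
    the forecast so outputs return to the original scale."""
    def __init__(self, eps=1e-5):
        super().__init__()
        self.eps = eps

    def normalize(self, x):                 # x: (batch, time, vars)
        self.mean = x.mean(dim=1, keepdim=True)
        self.std = x.std(dim=1, keepdim=True) + self.eps
        return (x - self.mean) / self.std

    def denormalize(self, y):               # y: (batch, horizon, vars)
        return y * self.std + self.mean

revin = RevIN()
x = torch.randn(8, 336, 7) + 5.0    # input window with a level shift
x_norm = revin.normalize(x)         # what the Mamba/LSTM backbone would see
y_raw = torch.randn(8, 96, 7)       # stand-in for the backbone's output
y = revin.denormalize(y_raw)        # forecast restored to the original scale
```
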
42 pages, 3444 KB  
Article
Global Food Price Dynamics, Undernourishment, and Human Development: Wavelet Coherence Evidence and SDG 2.1 Resilience Scenarios up to 2030
by Olena Pavlova, Oksana Liashenko, Kostiantyn Pavlov, Agata Kutyba, Nataliia Fastovets, Artur Machno, Oleksandr Holubiev and Tetiana Vlasenko
Sustainability 2026, 18(8), 3724; https://doi.org/10.3390/su18083724 - 9 Apr 2026
Abstract
This study examines whether international food price dynamics provide a reliable signal of undernourishment and human development outcomes relevant to the attainment of SDG 2 (Zero Hunger) by 2030. We apply wavelet coherence analysis to the FAO Food Price Index and the prevalence of undernourishment (SDG Indicator 2.1.1) over 2001–2023, testing statistical significance against an AR(1) red-noise null hypothesis. Hybrid ARIMA–Random Forest models generate probabilistic price forecasts through 2030. Despite strong raw coherence (R² ≈ 0.77), only 7.8% of time–frequency cells achieve statistical significance, indicating that apparent co-movement largely reflects autocorrelation rather than substantive dependence. Where significant coherence emerges, it concentrates at medium-run horizons (3–6 years), consistent with undernourishment as a habitual dietary adequacy measure linked to sustained affordability pressures affecting health, productivity, and human capital formation. Rolling correlation analysis reveals suggestive evidence of a regime change around 2012—from negative to positive correlation—coinciding with a slowdown in progress toward reducing hunger, although the 5-year rolling windows yield only 19 observations, limiting the power of formal structural break tests. Price forecasts exhibit rapidly widening confidence intervals (reaching ±131 index points by 2030), underscoring fundamental limits to predictability. The annual PoU series comprises only 23 observations, which constrains the estimation of long-run (8–12-year) wavelet cycles; results at those horizons should therefore be interpreted with caution. These findings caution against mechanistic inferences from global price indices to hunger and human development outcomes, redirecting policy emphasis toward domestic transmission channels and nutrition-sensitive safety nets.
(This article belongs to the Section Sustainable Food)
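
The hybrid ARIMA–Random Forest forecaster is named but not specified in the abstract. One common construction is sketched below as an assumption: ARIMA captures the linear dynamics, and a random forest trained on lagged ARIMA residuals models the remainder. This is a point-forecast variant; the paper's probabilistic intervals would require an additional layer.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from statsmodels.tsa.arima.model import ARIMA

def hybrid_arima_rf_forecast(y, order=(1, 1, 1), n_lags=6):
    """Hybrid sketch: ARIMA handles the linear structure; a random forest
    learns what ARIMA missed from lagged residuals. The one-step-ahead
    forecast is the sum of the two components."""
    arima = ARIMA(y, order=order).fit()
    resid = np.asarray(arima.resid)
    X = np.array([resid[t - n_lags:t] for t in range(n_lags, len(resid))])
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X, resid[n_lags:])
    rf_next = rf.predict(resid[-n_lags:].reshape(1, -1))[0]
    return arima.forecast(steps=1)[0] + rf_next

rng = np.random.default_rng(0)
y = np.cumsum(rng.normal(0.1, 1.0, 300))   # drifting toy "price index"
print(hybrid_arima_rf_forecast(y))
```
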
23 pages, 557 KB  
Article
A Multi-Stage Decomposition and Hybrid Statistical Framework for Time Series Forecasting
by Swera Zeb Abbasi, Mahmoud M. Abdelwahab, Imam Hussain, Moiz Qureshi, Moeeba Rind, Paulo Canas Rodrigues, Ijaz Hussain and Mohamed A. Abdelkawy
Axioms 2026, 15(4), 273; https://doi.org/10.3390/axioms15040273 - 9 Apr 2026
Abstract
Modeling and forecasting nonstationary and nonlinear economic time series remain fundamentally challenging due to structural breaks, volatility clustering, and noise contamination that distort the intrinsic stochastic structure. To address these limitations, this study proposes a novel three-stage hybrid statistical framework that systematically integrates multi-level signal decomposition with structured parametric modeling to enhance predictive accuracy. The proposed hybrid architectures—EMD–EEMD–ARIMA, EMD–EEMD–GMDH, and EMD–EEMD–ETS—employ a hierarchical decomposition–reconstruction strategy before forecasting. In the first stage, Empirical Mode Decomposition (EMD) decomposes the observed series into intrinsic mode functions (IMFs) and a residual component. In the second stage, Ensemble Empirical Mode Decomposition (EEMD) is applied to further refine the extracted components, mitigating mode mixing and improving signal separability. In the final stage, each reconstructed component is modeled using ARIMA, Exponential Smoothing State Space (ETS), and Group Method of Data Handling (GMDH) frameworks, and the individual forecasts are aggregated to obtain the final prediction. Empirical evaluation based on a recursive one-step-ahead forecasting scheme demonstrates consistent numerical improvements across all standard accuracy measures. In particular, the proposed EMD–EEMD–ARIMA model achieves the lowest forecasting error, reducing the root-mean-square error (RMSE) by approximately 6–7% relative to the best-performing single-stage model and by about 3–4% relative to the two-stage EMD-based hybrids. Similar improvements are observed in mean squared error (MSE), mean absolute error (MAE), and mean absolute percentage error (MAPE), indicating enhanced stability and robustness of the three-stage architecture. The results provide strong numerical evidence that multi-level decomposition combined with structured statistical modeling yields superior predictive performance for complex nonlinear and nonstationary time series. The proposed framework offers a mathematically coherent, computationally tractable, and systematically structured hybrid modeling strategy that effectively integrates noise-assisted decomposition with parametric and data-driven forecasting techniques.
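
The three-stage pipeline can be sketched with the PyEMD package (pip name EMD-signal) and statsmodels. The sketch below follows the abstract's outline; which components get the EEMD refinement is not specified, so refining only the first (noisiest) IMF, the ARIMA order, and the trial count are all assumptions.

```python
import numpy as np
from PyEMD import EMD, EEMD                     # pip install EMD-signal
from statsmodels.tsa.arima.model import ARIMA

def emd_eemd_arima_forecast(y, order=(1, 0, 1)):
    """Three-stage sketch: (1) EMD splits the series into IMFs plus
    residual; (2) EEMD re-decomposes the first IMF to mitigate mode
    mixing; (3) ARIMA forecasts each component one step ahead and the
    component forecasts are summed."""
    imfs = EMD().emd(y)                            # stage 1
    refined = list(EEMD(trials=20).eemd(imfs[0]))  # stage 2
    components = refined + list(imfs[1:])
    return sum(ARIMA(c, order=order).fit().forecast(1)[0]
               for c in components)                # stage 3

rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 20, 400)) + 0.3 * rng.normal(size=400)
print(emd_eemd_arima_forecast(y))
```
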
25 pages, 835 KB  
Article
Personalised Blood Glucose Time Series Forecasting in Type 1 Diabetes: Deep Collaborative Adversarial Learning
by Heydar Khadem, Hoda Nemat, Jackie Elliott and Mohammed Benaissa
J. Pers. Med. 2026, 16(4), 210; https://doi.org/10.3390/jpm16040210 - 8 Apr 2026
Abstract
Background/Objectives: Blood glucose prediction (BGP) for individuals with type 1 diabetes (T1D) is a clinically essential yet highly challenging task in time series forecasting (TSF) and an important problem in personalised medicine. Accurate bespoke BGP is crucial for individualised T1D management, reducing complications, and supporting patient-specific glycaemic risk mitigation. However, the pronounced volatility of glycaemic fluctuations in T1D, combined with the need for mathematical rigor and clinical relevance, hampers reliable prediction. This complexity underscores the demand to explore and enhance more advanced techniques. While adversarial learning is adept at modelling intricate data variability, its potential for BGP remains largely untapped. Methods: This work presents a novel approach for BGP by addressing a key limitation in conventional adversarial learning when applied to this task. Typically, these methods optimise prediction accuracy within a set horizon by minimising adversarial loss. This focus overlooks how predictions align with longer-term patterns, which are critical for clinical relevance in BGP, thereby yielding suboptimal results. To overcome this limitation, we introduce collaborative augmented adversarial learning, designed to improve the model’s temporal awareness. Incorporating collaborative interaction optimisation, this approach enables the model to reflect extended time dependencies beyond the immediate horizon, thereby improving both the clinical reliability of predictions and overall predictive performance. We develop and evaluate four learning systems for BGP: independent learning, adversarial learning, collaborative learning, and adversarial collaborative learning. The proposed systems were evaluated for two clinically relevant prediction horizons, namely 30 min and 60 min ahead. Results: The interdependent collaboratively augmented learning frameworks, validated using the well-established Ohio T1D datasets, demonstrate statistically significant superior performance in both clinical and mathematical evaluations. Conclusions: Beyond advancing BGP accuracy and clinical reliability, the proposed approach supports personalised medicine by improving subject-specific glucose forecasting from CGM data, with potential relevance for more individualised diabetes monitoring and decision support. The proposed approach also opens new avenues for advancements in other complex TSF domains, as outlined in our future work.
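
The collaborative augmentation itself is not specified enough in the abstract to reproduce, but the conventional adversarial-TSF setup it builds on is standard: a generator forecasts the horizon, a discriminator scores (history, forecast) pairs, and the generator minimizes a supervised loss plus an adversarial term. A minimal PyTorch sketch of that baseline follows; window sizes, architectures, and the loss weight are assumptions.

```python
import torch
from torch import nn

class Generator(nn.Module):
    """Maps a 24-step CGM history window to a 6-step forecast."""
    def __init__(self, hist=24, horizon=6):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(hist, 64), nn.ReLU(),
                                 nn.Linear(64, horizon))
    def forward(self, x):
        return self.net(x)

g = Generator()
d = nn.Sequential(nn.Linear(24 + 6, 64), nn.ReLU(), nn.Linear(64, 1))
bce, mse = nn.BCEWithLogitsLoss(), nn.MSELoss()

x, y = torch.randn(32, 24), torch.randn(32, 6)   # toy history / target pairs
y_hat = g(x)
d_fake = d(torch.cat([x, y_hat], dim=1))          # score (history, forecast)
g_loss = mse(y_hat, y) + 0.1 * bce(d_fake, torch.ones_like(d_fake))
g_loss.backward()   # a real loop would alternate discriminator/generator updates
```
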
68 pages, 7738 KB  
Review
An Overview of Complex Time Series Analysis
by Alejandro Ramírez-Rojas, Leonardo Di G. Sigalotti, Luciano Telesca and Fidel Cruz
Mathematics 2026, 14(7), 1231; https://doi.org/10.3390/math14071231 - 7 Apr 2026
Abstract
Different methodologies have been developed for the analysis and study of dynamical systems, including both theoretical models and natural systems. Examples span a wide range of applications, such as astronomy, financial and economic time series, biophysical systems, physiological phenomena, and Earth sciences, including [...] Read more.
Different methodologies have been developed for the analysis and study of dynamical systems, including both theoretical models and natural systems. Examples span a wide range of applications, such as astronomy, financial and economic time series, biophysical systems, physiological phenomena, and Earth sciences, including seismicity and climatic processes. The study of these complex systems is commonly based on the analysis of the signals they generate, using mathematical tools to extract relevant information. A broad spectrum of mathematical disciplines converges in this context, including stochastic, probability and statistical theory, entropic and informational measures, fractal and multifractal analysis, natural time analysis, modeling of non-linearity and recurrence methods, generalized entropies, non-extensive systems, machine learning, and high-dimensional and multivariate complexity. Research in this area is largely focused on the characterization of complex systems, providing indicators of determinism or stochasticity, distinguishing between regularity, chaos, and noise, and identifying topological as well as disorder-regularity features. In addition, short- and long-term forecasting, together with the identification of short- and long-range correlations, play a central role in such characterization. To address these objectives, numerous mathematical tools have been developed for the analysis of time series and point processes, each designed to capture specific signal properties. In this work, many of the most important tools used in time series analysis are compiled and reviewed, highlighting their main characteristics and the different types of complex systems to which they have been applied. Full article
(This article belongs to the Special Issue Recent Advances in Time Series Analysis, 2nd Edition)
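
Among the entropic measures such reviews cover, sample entropy is one of the simplest to state: the negative log of the conditional probability that sequences similar over m points remain similar over m + 1 points. A self-contained sketch follows; the template length m and tolerance factor are the conventional defaults, used here as illustrative choices.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy: -log of the conditional probability that template
    sequences similar for m points stay similar for m + 1 points.
    Lower values = more regular; higher values = more irregular."""
    r = r_factor * np.std(x)

    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(t[:, None] - t[None, :]), axis=2)  # Chebyshev
        return (np.sum(d <= r) - len(t)) / 2                 # drop self-matches

    return -np.log(matches(m + 1) / matches(m))

rng = np.random.default_rng(0)
print(sample_entropy(np.sin(np.linspace(0, 30, 400))))  # regular: low
print(sample_entropy(rng.normal(size=400)))             # noise: higher
```
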
22 pages, 22745 KB  
Article
Spectral Phenological Typologies for Improving Cross-Dataset Generalization in Mediterranean Winter Cereals
by Patricia Arizo-García, Sergio Castiñeira-Ibáñez, Beatriz Ricarte, Alberto San Bautista and Constanza Rubio
Appl. Sci. 2026, 16(7), 3598; https://doi.org/10.3390/app16073598 - 7 Apr 2026
Abstract
Accurate monitoring of crop phenology is essential for precision agriculture and yield forecasting. However, satellite-derived time series often suffer from inherent noise, such as residual atmospheric effects and mixed pixels, as well as a frequent lack of ground-truth data in agriculture. In response, this study proposes an algorithm to define the type of spectral signatures for the principal phenological stages of crops, using them as the foundation for training supervised machine learning classification models. The algorithm was developed using Fuzzy C-Means (FCM) clustering to identify the spectral signature reference groups in winter wheat across the Burgos region (Spain) during the 2020 and 2021 growing seasons. To enhance cluster independence and biological coherence, a multi-step filtering process was implemented, including spectral purity (membership degree, SAM, and SAMder) and temporal coherence filters. The filtered and labeled dataset (80% original Burgos dataset) was used to train supervised classification models (KNN and XGBoost). The models’ reliability was verified through three wheat tests (remaining 20%), labeled using other clustering techniques, and an independent barley dataset from diverse geographic locations (Valladolid and Soria). The filtering process significantly improved cluster stability by removing outliers and transition spectral signatures. The supervised models demonstrated exceptional performance; the KNN model slightly outperformed XGBoost, achieving a mean Accuracy of 0.977, a Kappa of 0.967, and an F1-score of 0.977 in the wheat external test. Furthermore, the model showed, when applied to barley, that its phenological spectral signatures are equivalent in shape to those of wheat, with an Accuracy of 0.965 and an F1-score of 0.974. In addition, it was verified that the type spectral signatures remain the same regardless of the location. This study presents a robust classification tool capable of labeling four key phenological stages (tillering, stem elongation, ripening, and senescence) without ground truth. By effectively removing inherent satellite noise, the proposed methodology produces organized, cleaned datasets. This structured foundation is critical for future research integrating spectral signatures with harvester data to develop high-precision yield prediction models. Full article
(This article belongs to the Special Issue Digital Technologies in Smart Agriculture)
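
The FCM clustering at the core of the pipeline admits a compact implementation, and the paper's membership-degree purity filter falls out of it naturally. The sketch below hand-rolls plain FCM; the toy "spectral signatures," cluster count, fuzzifier m, and the 0.9 purity threshold are illustrative assumptions.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=100, seed=0):
    """Plain Fuzzy C-Means: alternate membership and centroid updates.
    Memberships row-sum to 1, so a 'spectral purity' filter can simply
    threshold the maximum membership degree, as the paper describes."""
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(c), size=len(X))          # soft memberships
    for _ in range(n_iter):
        W = U ** m
        centers = (W.T @ X) / W.sum(axis=0)[:, None]    # weighted means
        d = np.linalg.norm(X[:, None] - centers[None], axis=2) + 1e-12
        U = 1.0 / np.sum((d[:, :, None] / d[:, None, :]) ** (2 / (m - 1)),
                         axis=2)
    return centers, U

# Toy "spectral signatures": two stage-like reflectance curves plus noise.
rng = np.random.default_rng(1)
stages = np.array([[0.2, 0.5, 0.8, 0.6], [0.3, 0.7, 0.5, 0.3]])
X = np.vstack([s + 0.05 * rng.normal(size=(50, 4)) for s in stages])
centers, U = fuzzy_c_means(X, c=2)
pure = U.max(axis=1) > 0.9    # illustrative membership-purity threshold
print(centers.round(2), round(pure.mean(), 2))
```
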
20 pages, 2061 KB  
Article
Long-Term Dew Analysis Through Multifractal Formalism and Hurst Exponent Under African Climate Conditions
by Gnonyi N’Kaina Mawinesso, Noukpo Médard Agbazo, Guy Hervé Houngue and Koto N’Gobi Gabin
Atmosphere 2026, 17(4), 375; https://doi.org/10.3390/atmos17040375 - 7 Apr 2026
Abstract
Dew constitutes a component of the near-surface water balance, but its large-scale fractal dynamical properties remain poorly documented across Africa. This study estimates dew amounts and investigates their fractal and multifractal behavior under African climatic conditions using gridded ERA5 datasets from 1993 to 2022. The Rescaled-Range (R/S) method, Multifractal Detrended Fluctuation Analysis (MFDFA), and the Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (ICEEMDAN) algorithm are used. The Hurst exponent (Hu) and the multifractal spectrum width (ω) are evaluated at daily and monthly scales over the full period and two sub-periods (1993–2007 and 2008–2022). The results reveal pronounced spatial heterogeneity in dew distribution. Daily mean amounts range between 0 and 0.18 mm, corresponding to annual accumulations reaching up to ~85 mm·yr⁻¹ in humid coastal, equatorial, and sub-equatorial regions, while remaining below 0.5 mm·yr⁻¹ in hyper-arid deserts. The continental mean annual amount is ~35.5 mm·yr⁻¹. The Hurst exponent exhibits values between zero and one, indicating region-dependent persistent and anti-persistent behaviors. This suggests that prediction schemes based on preceding values may be suitable for dew time series prediction in African regions exhibiting persistent characteristics. The multifractal spectrum width (ω), reaching values of up to 10, highlights strong scaling heterogeneity, particularly at the monthly timescale. These findings indicate that African dew dynamics exhibit significant long-range dependence and multifractal variability, providing new insights into the intrinsic temporal structure of dew and into appropriate approaches for its forecasting.
(This article belongs to the Special Issue Analysis of Dew under Different Climate Changes)
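
Of the three tools used, the R/S Hurst estimator is the most compact: Hu is the slope of log(R/S) against log(window size). A self-contained sketch follows; the window-size grid is an illustrative choice, and R/S estimates are known to be biased slightly upward for short windows.

```python
import numpy as np

def hurst_rs(x, min_chunk=8):
    """Rescaled-range (R/S) Hurst estimate: slope of log(R/S) vs log(n).
    Hu > 0.5 indicates persistence, Hu < 0.5 anti-persistence."""
    N = len(x)
    sizes = np.unique(np.logspace(np.log10(min_chunk),
                                  np.log10(N // 2), 10).astype(int))
    rs = []
    for n in sizes:
        chunks = x[: N // n * n].reshape(-1, n)
        dev = np.cumsum(chunks - chunks.mean(axis=1, keepdims=True), axis=1)
        R = dev.max(axis=1) - dev.min(axis=1)   # range of cumulative deviations
        S = chunks.std(axis=1) + 1e-12
        rs.append(np.mean(R / S))
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)
    return slope

rng = np.random.default_rng(0)
print(hurst_rs(rng.normal(size=4096)))   # white noise: close to 0.5
```
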

22 pages, 3050 KB  
Article
Event-Based Dual-Task Forecasting for SLA-Oriented Hospital Transport Operations Using Machine and Deep Learning Models
by Murat Akın
Appl. Sci. 2026, 16(7), 3570; https://doi.org/10.3390/app16073570 - 6 Apr 2026
Abstract
Service Level Agreement (SLA) compliance in hospital transport processes is essential in terms of patient safety, service continuity, and resource efficiency. However, transport requests occur as irregular events, limiting the applicability of equally spaced time-series assumptions. This study jointly addresses two complementary objectives in an event-based framework: predicting the interarrival time between consecutive transport requests (next-event forecasting) and forecasting the total request count within forward SLA horizons (forward-count forecasting). Machine learning methods such as Ridge Regression, Extra Trees, and Histogram-based Gradient Boosting, as well as deep learning architectures such as Long Short-Term Memory and Gated Recurrent Unit, were compared under different time horizons and adaptive history windows on time-stamped transport request records from the operational system supporting a private hospital in Turkey, including patient, specimen, and material transport requests. Results indicate that deep learning methods yield lower errors in demand count prediction at short time horizons; as the horizon lengthens, machine learning performs similarly and in some cases even outperforms them; and as the history window increases, the prediction error for the next request occurrence systematically decreases. The lowest mean absolute error values in request counts were obtained for demand forecasting within a 30 min time window: 2.10 for material transport, 3.88 for patient transport, and 2.84 for specimen transport. Additionally, the R² value reached 0.98 for next-event forecasting with a rolling-memory window of 20 events. Overall, the findings suggest that hospital transport demand is substantially predictable and that event-based forecasting can support SLA-oriented staffing, task dispatching, and delay mitigation.
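
The study's two targets can be materialized directly from a raw event log without resampling to a regular grid. The pandas sketch below builds both from a synthetic log; the exponential interarrival times and the 30-minute SLA horizon are illustrative assumptions.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
# Toy transport-request log: irregular event timestamps, one per request.
ts = pd.Timestamp("2026-01-01") + pd.to_timedelta(
    np.cumsum(rng.exponential(9.0, 500)), unit="m")
events = pd.DataFrame({"t": ts}).set_index("t")

# Next-event target: interarrival time in minutes (kept irregular).
events["interarrival_min"] = (events.index.to_series().diff()
                              .dt.total_seconds() / 60)

# Forward-count target: requests arriving in the next 30-minute SLA horizon.
ones = events.assign(one=1)["one"]
events["next_30min_count"] = [
    ones.loc[t + pd.Timedelta(seconds=1): t + pd.Timedelta(minutes=30)].sum()
    for t in events.index]
print(events.head())
```
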
22 pages, 9866 KB  
Article
Analysis of Driving Factors and Trend Prediction of Groundwater Levels in the West Liao River Basin Based on the STL-LSTM Model
by Sutong Fu, Liangping Yang, Junting Liu, Pengfei Hao, Fan Wang and Jianmin Bian
Water 2026, 18(7), 876; https://doi.org/10.3390/w18070876 - 6 Apr 2026
Abstract
In the ecologically fragile West Liao River Basin, characterizing groundwater dynamics is crucial for sustainable water management. Using 2000–2016 groundwater level data, this study applies Seasonal-Trend decomposition using Loess (STL) and change-point detection to analyse trends. Driving factors are quantified via random forest combined with SHapley Additive exPlanations (SHAP) analysis, and a novel STL–Long Short-Term Memory (STL-LSTM) hybrid model is developed for forecasting. Key findings include: (1) Groundwater levels declined persistently, with a significant change point in 2009. The post-2009 decline rate accelerated to −0.749 m/yr, a 55.7% increase. (2) Statistical attribution reveals that soil moisture (43.5%) and climatic factors (29.0%) are the primary predictors of groundwater variability. The dominance of soil moisture highlights the key role of agricultural irrigation, which strongly modifies soil water dynamics during the growing season. (3) The STL-LSTM model achieves optimal predictive performance (R² = 0.8805, RMSE = 0.7081 m), demonstrating enhanced accuracy for non-stationary sequences. This integrated framework combines trend diagnosis, driver interpretation, and hybrid modelling, offering scientific support for precise groundwater management in semi-arid agricultural basins.
(This article belongs to the Section Hydrology)
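
The STL half of the hybrid is a standard statsmodels call. The sketch below shows the decomposition on a synthetic groundwater-like series; where the paper trains an LSTM on the decomposed components, a persistence forecast stands in so the sketch stays self-contained.

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.seasonal import STL

# Toy monthly groundwater-level series: decline + seasonality + noise.
rng = np.random.default_rng(0)
idx = pd.date_range("2000-01", periods=204, freq="MS")
y = pd.Series(-0.03 * np.arange(204)
              + np.sin(np.arange(204) * 2 * np.pi / 12)
              + 0.2 * rng.normal(size=204), index=idx)

res = STL(y, period=12).fit()        # trend / seasonal / residual split

# The paper forecasts the de-seasonalized parts with an LSTM; a simple
# persistence forecast stands in here for those learned components.
trend_fc = res.trend.iloc[-1]
resid_fc = res.resid.iloc[-1]
seasonal_fc = res.seasonal.iloc[-12]  # same calendar month, one year back
print("next-month forecast:", trend_fc + resid_fc + seasonal_fc)
```
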
21 pages, 2194 KB  
Article
Sensor-Based Ozone Monitoring and Forecasting in a Synchrotron Radiation Laboratory Using Autoregressive Integrated Moving Average Models
by Po-Jiun Wen, Kuo-Wei Wu, Liang-Chen Ho, Chieh-Han Yang, Tsung-Hung Tsai and Shih-Hau Fang
Sensors 2026, 26(7), 2251; https://doi.org/10.3390/s26072251 - 6 Apr 2026
Abstract
Ozone monitoring in laboratory environments is essential for ensuring personnel safety and maintaining stable experimental conditions, particularly in enclosed facilities where ozone may accumulate during high-energy radiation operations. This study investigates the short-term prediction of ozone concentration using data obtained from a sensor-based ozone monitoring system deployed at the National Synchrotron Radiation Research Center (NSRRC). Ozone concentration measurements were collected using a UV absorption-based ozone analyzer and analyzed as a time-series dataset under controlled experimental conditions. Three forecasting models—Autoregressive Integrated Moving Average (ARIMA), Long Short-Term Memory (LSTM), and linear regression—were evaluated for short-term ozone concentration prediction. Experimental results indicate that the ARIMA model provides superior predictive performance for the small-sample dataset used in this study. In the Right direction, ARIMA achieved R² values of 89.5%, 86.3%, and 81.1% at distances of 5 cm, 10 cm, and 15 cm, respectively, while also demonstrating stable performance in the Up direction. The results highlight the effectiveness of classical time-series models for sensor data analysis in environments with limited sensing data. The proposed framework demonstrates the potential of integrating sensing devices with predictive data analytics to support real-time environmental monitoring and safety management in laboratory facilities.
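
The ARIMA workflow the study applies is the textbook statsmodels pattern: fit on the measured series, then forecast a few steps ahead. The analyzer data are not public, so the sketch below uses a synthetic ozone-like series, and the order (2, 1, 2) is an illustrative assumption rather than the paper's fitted order.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Toy ozone-like series: baseline level plus slow drift and sensor noise.
rng = np.random.default_rng(0)
ozone = 30 + np.cumsum(rng.normal(0, 0.3, 240)) + rng.normal(0, 1.0, 240)

model = ARIMA(ozone, order=(2, 1, 2)).fit()
forecast = model.forecast(steps=10)   # next 10 sampling intervals
print(forecast.round(2))
```
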
18 pages, 3189 KB  
Article
Continuous-Time Markov Chain Modelling for Service Life Prediction of Building Elements
by Artur Zbiciak, Dariusz Walasek, Vazgen Bagdasaryan and Eugeniusz Koda
Appl. Sci. 2026, 16(7), 3555; https://doi.org/10.3390/app16073555 - 5 Apr 2026
Abstract
A continuous-time Markov chain framework is developed for service life prediction of building assets, and three formulations are compared: a homogeneous generator, a time-varying generator, and a fractional model. The framework delivers survival, density of absorption time, hazard, and mean time to absorption. For the homogeneous case, state trajectories are computed using matrix exponentials. The time-varying case is solved both by local exponential propagation on a time grid and by direct integration of the Kolmogorov equation. The fractional case is implemented in two independent ways, via a truncated series expansion and via an in-house routine for the Mittag-Leffler function, which also allows the direct evaluation of survival and hazard from the standard fractional relations while avoiding singular behaviour at the origin. This study shows that non-homogeneous rates accelerate deterioration relative to the homogeneous benchmark, whereas fractional dynamics reproduce early-time acceleration followed by a slow decline of the hazard, which is consistent with heavy-tailed survival and longer effective service life. The two fractional solvers provide mutually consistent outputs, which supports the numerical robustness of the approach. The framework is readily applicable to sparse inspection data and short observation windows and provides a transparent basis for comparing modelling assumptions that affect life cycle forecasts used in asset management and maintenance planning.
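
The homogeneous case is compact enough to sketch: survival and mean time to absorption follow directly from the generator matrix via the matrix exponential and the inverse of the transient block. The state set and degradation rates below are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.linalg import expm

# Homogeneous CTMC sketch: condition states A > B > C with absorbing
# failure state F; off-diagonal entries of Q are transition rates per year.
Q = np.array([[-0.20,  0.20,  0.00, 0.00],
              [ 0.00, -0.15,  0.15, 0.00],
              [ 0.00,  0.00, -0.10, 0.10],
              [ 0.00,  0.00,  0.00, 0.00]])   # F is absorbing (zero row)
p0 = np.array([1.0, 0.0, 0.0, 0.0])           # new element starts in state A

for t in (10, 25, 50):
    p_t = p0 @ expm(Q * t)                    # Kolmogorov forward solution
    print(f"t={t:>2} yr  survival={1 - p_t[-1]:.3f}")

# Mean time to absorption from state A: (-Q_T^{-1} 1)_A over the
# transient 3x3 block; here 1/0.2 + 1/0.15 + 1/0.1 ≈ 21.7 years.
mtta = (-np.linalg.inv(Q[:3, :3])).sum(axis=1)[0]
print("mean service life from A:", round(mtta, 1), "years")
```
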