Search Results (1,458)

Search Parameters:
Keywords = Gated Recurrent Unit (GRU)

28 pages, 31378 KB  
Article
Real-Time UAV Flight Path Prediction Using GRU Networks for Autonomous Site Assessment
by Yared Bitew Kebede, Ming-Der Yang, Henok Desalegn Shikur and Hsin-Hung Tseng
Drones 2026, 10(1), 56; https://doi.org/10.3390/drones10010056 - 13 Jan 2026
Abstract
Unmanned Aerial Vehicles (UAVs) have become essential tools across critical domains, including infrastructure inspection, public safety monitoring, traffic surveillance, environmental sensing, and target tracking, owing to their ability to collect high-resolution spatial data rapidly. However, maintaining stable and accurate flight trajectories remains a significant challenge, particularly during autonomous missions in dynamic or uncertain environments. This study presents a novel flight path prediction framework based on Gated Recurrent Units (GRUs), designed for both single-step and multi-step-ahead forecasting of the four-dimensional UAV coordinates Easting (X), Northing (Y), Altitude (Z), and Time (T) using historical sensor flight data. Model performance was systematically validated against traditional Recurrent Neural Network architectures. On unseen test data, the GRU model demonstrated enhanced predictive accuracy in single-step prediction, achieving a Mean Absolute Error (MAE) of 0.0036, a Root Mean Square Error (RMSE) of 0.0054, and a coefficient of determination (R2) of 0.9923. Crucially, in multi-step-ahead forecasting designed to simulate real-world challenges such as GPS outages, the GRU model maintained exceptional stability and low error, confirming its resilience to error accumulation. The findings establish that the GRU-based model is a highly accurate, computationally efficient, and reliable solution for UAV trajectory forecasting. This framework enhances autonomous navigation and directly supports the data integrity required for high-fidelity photogrammetric mapping, ensuring reliable site assessment in complex and dynamic environments. Full article
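
The forecasting setup this abstract describes (a GRU trained on windows of past 4-D coordinates, with multi-step forecasts produced recursively) can be sketched roughly as follows. This is a minimal PyTorch illustration under assumed layer sizes, not the authors' implementation.

```python
import torch
import torch.nn as nn

class GRUForecaster(nn.Module):
    """Maps a window of past (X, Y, Z, T) samples to the next coordinate;
    multi-step forecasts are produced by feeding predictions back in."""
    def __init__(self, n_features=4, hidden=64, n_layers=2):
        super().__init__()
        self.gru = nn.GRU(n_features, hidden, n_layers, batch_first=True)
        self.head = nn.Linear(hidden, n_features)

    def forward(self, x):                       # x: (batch, window, 4)
        out, _ = self.gru(x)
        return self.head(out[:, -1, :])         # next-step (X, Y, Z, T)

    @torch.no_grad()
    def rollout(self, x, steps):
        """Multi-step-ahead forecast by recursive single-step prediction."""
        preds, window = [], x.clone()
        for _ in range(steps):
            nxt = self.forward(window)
            preds.append(nxt)
            window = torch.cat([window[:, 1:, :], nxt.unsqueeze(1)], dim=1)
        return torch.stack(preds, dim=1)        # (batch, steps, 4)
```

Recursive rollout of this kind is exactly what exposes error accumulation, which the abstract reports the GRU handles well in simulated GPS outages.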

23 pages, 2604 KB  
Article
A Model for Identifying the Fermentation Degree of Tieguanyin Oolong Tea Based on RGB Image and Hyperspectral Data
by Yuyan Huang, Yongkuai Chen, Chuanhui Li, Tao Wang, Chengxu Zheng and Jian Zhao
Foods 2026, 15(2), 280; https://doi.org/10.3390/foods15020280 - 12 Jan 2026
Viewed by 27
Abstract
The fermentation process of oolong tea is a critical step in shaping its quality and flavor profile. In this study, the fermentation degree of Anxi Tieguanyin oolong tea was assessed using image and hyperspectral features. Machine learning algorithms, including Support Vector Machine (SVM), Long Short-Term Memory (LSTM), and Gated Recurrent Unit (GRU), were employed to develop models based on both single-source features and multi-source fused features. First, color and texture features were extracted from RGB images and then processed through Pearson correlation-based feature selection and Principal Component Analysis (PCA) for dimensionality reduction. For the hyperspectral data, preprocessing was conducted using Normalization (Nor) and Standard Normal Variate (SNV), followed by feature selection and dimensionality reduction with Competitive Adaptive Reweighted Sampling (CARS), Successive Projections Algorithm (SPA), and PCA. We then performed mid-level fusion on the two feature sets and selected the most relevant features using L1 regularization for the final modeling stage. Finally, SHapley Additive exPlanations (SHAP) analysis was conducted on the optimal models to reveal key features from both hyperspectral bands and image data. The results indicated that models based on single features achieved test set accuracies of 68.06% to 87.50%, while models based on data fusion achieved 77.78% to 94.44%. Specifically, the Pearson+Nor-SPA+L1+SVM fusion model achieved the highest accuracy of 94.44%. This demonstrates that data feature fusion enables a more comprehensive characterization of the fermentation process, significantly improving model accuracy. SHAP analysis revealed that the hyperspectral bands at 967, 942, 814, 784, 781, 503, 413, and 416 nm, along with the image features Hσ and H, played the most crucial roles in distinguishing tea fermentation stages. These findings provide a scientific basis for assessing the fermentation degree of Tieguanyin oolong tea and support the development of intelligent detection systems. Full article
(This article belongs to the Section Food Analytical Methods)
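
As a rough illustration of the mid-level fusion plus L1 selection plus SVM pipeline described above, the sketch below concatenates pre-reduced image and spectral feature matrices, applies L1-based selection, and fits an SVM. `X_image`, `X_spectral`, and `y` are hypothetical placeholders, and the hyperparameters are guesses rather than the paper's settings.

```python
import numpy as np
from sklearn.feature_selection import SelectFromModel
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, LinearSVC

# Hypothetical inputs: PCA-reduced RGB image features and CARS/SPA-selected
# hyperspectral features for the same tea samples, plus fermentation labels.
X_image = np.random.rand(144, 10)
X_spectral = np.random.rand(144, 20)
y = np.random.randint(0, 4, size=144)

# Mid-level fusion: concatenate the two feature sets sample-wise.
X_fused = np.hstack([X_image, X_spectral])

# L1-regularized selection followed by an RBF SVM classifier.
model = make_pipeline(
    StandardScaler(),
    SelectFromModel(LinearSVC(C=0.1, penalty="l1", dual=False, max_iter=5000)),
    SVC(kernel="rbf", C=10.0),
)
model.fit(X_fused, y)
print("training accuracy:", model.score(X_fused, y))
```
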
22 pages, 4971 KB  
Article
Optimized Hybrid Deep Learning Framework for Reliable Multi-Horizon Photovoltaic Power Forecasting in Smart Grids
by Bilali Boureima Cisse, Ghamgeen Izat Rashed, Ansumana Badjan, Hussain Haider, Hashim Ali I. Gony and Ali Md Ershad
Electricity 2026, 7(1), 4; https://doi.org/10.3390/electricity7010004 - 12 Jan 2026
Viewed by 25
Abstract
Accurate short-term forecasting of photovoltaic (PV) output is critical to managing the variability of PV generation and ensuring reliable grid operation with high renewable integration. We propose an enhanced hybrid deep learning framework that combines Temporal Convolutional Networks (TCNs), Gated Recurrent Units (GRUs), and Random Forests (RFs) in an optimized weighted ensemble strategy. This approach leverages the complementary strengths of each component: TCNs capture long-range temporal dependencies via dilated causal convolutions; GRUs model sequential weather-driven dynamics; and RFs enhance robustness to outliers and nonlinear relationships. The model was evaluated on high-resolution operational data from the Yulara solar plant in Australia, forecasting horizons from 5 min to 1 h. Results show that the TCN-GRU-RF model consistently outperforms conventional benchmarks, achieving R2 = 0.9807 (MAE = 0.0136; RMSE = 0.0300) at 5 min and R2 = 0.9047 (RMSE = 0.0652) at 1 h horizons. Notably, the degradation in R2 across forecasting horizons was limited to 7.7%, significantly lower than the typical 10–15% range observed in the literature, highlighting the model’s scalability and resilience. These validated results indicate that the proposed approach provides a robust, scalable forecasting solution that enhances grid reliability and supports the integration of distributed renewable energy sources. Full article
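
The "optimized weighted ensemble strategy" can be illustrated, under assumptions about how the weights are fit, by choosing non-negative weights that minimize validation RMSE over the three base models' predictions. The function below is a generic sketch, not the paper's procedure.

```python
import numpy as np
from scipy.optimize import minimize

def fit_ensemble_weights(pred_tcn, pred_gru, pred_rf, y_true):
    """Find non-negative weights summing to 1 that minimize the RMSE of the
    weighted blend of TCN, GRU, and RF validation predictions."""
    preds = np.vstack([pred_tcn, pred_gru, pred_rf])    # (3, n_samples)

    def rmse(w):
        return np.sqrt(np.mean((w @ preds - y_true) ** 2))

    res = minimize(rmse, x0=np.full(3, 1 / 3),
                   bounds=[(0.0, 1.0)] * 3,
                   constraints={"type": "eq", "fun": lambda w: w.sum() - 1.0})
    return res.x    # e.g. array([w_tcn, w_gru, w_rf])

# At inference time the final forecast is the same weighted combination:
# y_hat = w_tcn * tcn(x) + w_gru * gru(x) + w_rf * rf(x)
```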

24 pages, 2272 KB  
Article
Short-Term Photovoltaic Power Prediction Using a DPCA–CPO–RF–KAN–GRU Hybrid Model
by Mingguang Liu, Ying Zhou, Yusi Wei, Weibo Zhao, Min Qu, Xue Bai and Zecheng Ding
Processes 2026, 14(2), 252; https://doi.org/10.3390/pr14020252 - 11 Jan 2026
Viewed by 80
Abstract
In photovoltaic (PV) power generation, the intermittency and uncertainty caused by meteorological factors pose challenges to grid operations. Accurate PV power prediction is crucial for optimizing power dispatching and balancing supply and demand. This paper proposes a PV power prediction model based on a Density Peak Clustering Algorithm (DPCA)–Crested Porcupine Optimizer (CPO)–Random Forest (RF)–Gated Recurrent Unit (GRU)–Kolmogorov–Arnold Network (KAN) pipeline. First, the DPCA is used to accurately classify weather conditions according to meteorological data such as solar radiation, temperature, and humidity. Then, the CPO algorithm is applied to optimize the RF used to screen characteristic feature variables. Subsequently, a hybrid GRU model with a KAN layer is introduced for short-term PV power prediction. The Shapley Additive Explanation (SHAP) method is used to evaluate feature importance and the impact of causal features. Compared with the benchmark models, the DPCA-CPO-RF-KAN-GRU model demonstrates better error reduction capabilities under three weather types, with an average fitting accuracy R2 reaching 97%. SHAP analysis indicates that the combined average SHAP value of total solar radiation and direct solar radiation contributes more than 70%. Finally, Kernel Density Estimation (KDE) is utilized to verify that the KAN-GRU model has high robustness in interval prediction, providing strong technical support for ensuring the stability of the power grid and precise decision-making in the electricity market. Full article
(This article belongs to the Section Energy Systems)
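
For the interval-prediction check mentioned at the end of the abstract, a KDE over forecast residuals is one common way to build prediction intervals. The helper below is a generic sketch of that idea, not the paper's exact procedure.

```python
import numpy as np
from scipy.stats import gaussian_kde

def kde_interval(point_forecast, residuals, alpha=0.1, n_draws=10000):
    """Build a (1 - alpha) prediction interval around a point forecast from
    the KDE of historical residuals (actual - predicted)."""
    kde = gaussian_kde(residuals)
    draws = kde.resample(n_draws).ravel()
    lo, hi = np.percentile(draws, [100 * alpha / 2, 100 * (1 - alpha / 2)])
    return point_forecast + lo, point_forecast + hi

# Example (synthetic residuals): a 90% interval around a 12.3 MW forecast.
residuals = np.random.normal(0.0, 0.4, size=500)
print(kde_interval(12.3, residuals, alpha=0.1))
```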

25 pages, 39412 KB  
Article
Enhanced Spatiotemporal Relationship-Guided Deep Learning for Water Quality Prediction
by Ruikai Chen, Yonggui Wang, Hongjun Wang, Shaofei Wang and Jun Yang
Water 2026, 18(2), 185; https://doi.org/10.3390/w18020185 - 10 Jan 2026
Viewed by 102
Abstract
Water quality prediction serves as a crucial basis for water environment supervision and is of great significance for water resource protection. This study utilized meteorological and water quality data from 40 monitoring stations in the Tuojiang River Basin, Sichuan Province, China. A Gated Recurrent Unit (GRU) model and a Graph Attention Network–Gated Recurrent Unit (GAT-GRU) model were constructed. Furthermore, based on the GAT-GRU framework, an Enhanced Spatio-Temporal Relation-Guided Gated Recurrent Unit (ESRG-GRU) model was developed by incorporating an explicit river network topology and a loss function that is sensitive to extreme values to strengthen spatio-temporal relationships. Water quality predictions were made for all 40 stations, and the performance of the three models was compared. The results show that, during the 7-day forecasting period, the training time of both the ESRG-GRU and the GAT-GRU models was only about 1/40 of that required for the GRU model. In terms of prediction accuracy, the average Nash–Sutcliffe efficiency (NSE) values over the 7-day forecast period were ESRG-GRU (0.7904) > GAT-GRU (0.7557) > GRU (0.6870), while the average root mean square error (RMSE) values were ESRG-GRU (0.0156) < GAT-GRU (0.0168) < GRU (0.0185). Regarding accuracy across different regions and seasons within the river basin, the ESRG-GRU model, guided by enhanced spatio-temporal deep learning, consistently outperformed both the GRU and the GAT-GRU models. This method can effectively enhance both the efficiency and accuracy of water quality prediction, thereby providing support for water environment supervision and regional water quality improvement. Full article
(This article belongs to the Special Issue Machine Learning Applications in the Water Domain)
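
Two ingredients named in the abstract, the Nash–Sutcliffe efficiency used for evaluation and a loss that is sensitive to extreme values, can be written compactly. The weighting scheme below is one plausible choice, since the paper's exact formulation is not given here.

```python
import torch

def nse(pred, obs):
    """Nash–Sutcliffe efficiency: 1 - SSE / variance of the observations."""
    return 1.0 - torch.sum((obs - pred) ** 2) / torch.sum((obs - obs.mean()) ** 2)

def extreme_weighted_mse(pred, obs, gamma=2.0):
    """MSE with weights that grow for observations far from the mean, so
    peaks and troughs contribute more to the loss (one possible choice)."""
    z = (obs - obs.mean()).abs() / (obs.std() + 1e-8)
    weights = 1.0 + gamma * z
    return torch.mean(weights * (pred - obs) ** 2)
```
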
30 pages, 4543 KB  
Article
Dynamic Risk Assessment of the Coal Slurry Preparation System Based on LSTM-RNN Model
by Ziheng Zhang, Rijia Ding, Wenxin Zhang, Liping Wu and Ming Liu
Sustainability 2026, 18(2), 684; https://doi.org/10.3390/su18020684 - 9 Jan 2026
Viewed by 104
Abstract
As the core technology for the clean and efficient utilization of coal, coal gasification plays an important role in reducing environmental pollution, improving coal utilization, and achieving sustainable energy development. To ensure the safe, stable, and long-term operation of coal gasification plants, and to address the strong subjectivity of dynamic Bayesian network (DBN) prior data in dynamic risk assessment, this study takes the coal slurry preparation system, the main piece of equipment in the initial stage of the coal gasification process, as its research object and uses a long short-term memory (LSTM) model combined with a back-propagation (BP) neural network to optimize the DBN prior data. To further validate the superiority of the model, a gated recurrent unit (GRU) model was introduced for comparative verification. The mean absolute error (MAE), root mean square error (RMSE), and coefficient of determination are used to evaluate the generalization ability of the LSTM model; the results show that the LSTM model's predictions are more accurate and stable. Bidirectional inference is then performed on the DBN of the optimized coal slurry preparation system to achieve dynamic reliability analysis. Through forward reasoning on the DBN, a quantitative analysis of system reliability is conducted to demonstrate how reliability evolves over time, providing data support for stable operation and subsequent upgrades. Through reverse reasoning, key events and weak links before and after system optimization are identified, and targeted improvement measures are proposed accordingly. Full article
(This article belongs to the Special Issue Process Safety and Control Strategies for Urban Clean Energy Systems)

42 pages, 3251 KB  
Article
Efficient and Accurate Epilepsy Seizure Prediction and Detection Based on Multi-Teacher Knowledge Distillation RGF-Model
by Wei Cao, Qi Li, Anyuan Zhang and Tianze Wang
Brain Sci. 2026, 16(1), 83; https://doi.org/10.3390/brainsci16010083 - 9 Jan 2026
Viewed by 218
Abstract
Background: Epileptic seizures are unpredictable, and while existing deep learning models achieve high accuracy, their deployment on wearable devices is constrained by high computational costs and latency. To address this, this work proposes the RGF-Model, a lightweight network that unifies seizure prediction and detection within a single causal framework. Methods: By integrating Feature-wise Linear Modulation (FiLM) with a Ring-Buffer Gated Recurrent Unit (Ring-GRU), the model achieves adaptive task-specific feature conditioning while strictly enforcing causal consistency for real-time inference. A multi-teacher knowledge distillation strategy is employed to transfer complementary knowledge from complex teacher ensembles to the lightweight student, significantly reducing complexity without sacrificing accuracy. Results: Evaluations on the CHB-MIT and Siena datasets demonstrate that the RGF-Model outperforms state-of-the-art teacher models in terms of efficiency while maintaining comparable accuracy. Specifically, on CHB-MIT, it achieves 99.54% Area Under the Curve (AUC) and 0.01 False Prediction Rate per hour (FPR/h) for prediction, and 98.78% Accuracy (Acc) for detection, with only 0.082 million parameters. Statistical significance was assessed using a random predictor baseline (p < 0.05). Conclusions: The results indicate that the RGF-Model provides a highly efficient solution for real-time wearable epilepsy monitoring. Full article
(This article belongs to the Section Neurotechnology and Neuroimaging)
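
The multi-teacher knowledge distillation step can be sketched as blending a hard-label loss with a KL term toward the teachers' averaged softened outputs. This is one common formulation with an assumed temperature and mixing weight, not necessarily the RGF-Model's exact objective.

```python
import torch
import torch.nn.functional as F

def multi_teacher_kd_loss(student_logits, teacher_logits_list, labels,
                          T=4.0, alpha=0.7):
    """Cross-entropy on hard labels blended with KL divergence toward the
    averaged softened distribution of several teacher networks."""
    ce = F.cross_entropy(student_logits, labels)
    teacher_probs = torch.stack(
        [F.softmax(t / T, dim=-1) for t in teacher_logits_list]).mean(dim=0)
    kd = F.kl_div(F.log_softmax(student_logits / T, dim=-1),
                  teacher_probs, reduction="batchmean") * (T * T)
    return alpha * kd + (1.0 - alpha) * ce
```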

27 pages, 7153 KB  
Article
State-Dependent CNN–GRU Reinforcement Framework for Robust EEG-Based Sleep Stage Classification
by Sahar Zakeri, Somayeh Makouei and Sebelan Danishvar
Biomimetics 2026, 11(1), 54; https://doi.org/10.3390/biomimetics11010054 - 8 Jan 2026
Viewed by 211
Abstract
Recent advances in automated learning techniques have enhanced the analysis of biomedical signals for detecting sleep stages and related health abnormalities. However, many existing models face challenges with imbalanced datasets and the dynamic nature of evolving sleep states. In this study, we present a robust algorithm for classifying sleep states using electroencephalogram (EEG) data collected from 33 healthy participants. We extracted dynamic, brain-inspired features, such as microstates and Lempel–Ziv complexity, which replicate intrinsic neural processing patterns and reflect temporal changes in brain activity during sleep. An optimal feature set was identified based on significant spectral ranges and classification performance. The classifier was developed using a convolutional neural network (CNN) combined with gated recurrent units (GRUs) within a reinforcement learning framework, which models adaptive decision-making processes similar to those in biological neural systems. Our proposed biomimetic framework illustrates that a multivariate feature set provides strong discriminative power for sleep state classification. Benchmark comparisons with established approaches revealed a classification accuracy of 98% using the optimized feature set, with the framework utilizing fewer EEG channels and reducing processing time, underscoring its potential for real-time deployment. These findings indicate that applying biomimetic principles in feature extraction and model design can improve automated sleep monitoring and facilitate the development of novel therapeutic and diagnostic tools for sleep-related disorders. Full article
(This article belongs to the Section Bioinspired Sensorics, Information Processing and Control)
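
One of the brain-inspired features mentioned above, Lempel–Ziv complexity, is easy to compute from a binarized EEG epoch. The function below follows the widely used Kaspar–Schuster counting scheme with median binarization, which may differ in detail from the authors' implementation.

```python
import numpy as np

def lz76_complexity(signal):
    """Lempel–Ziv complexity of a signal binarised around its median,
    normalised by n / log2(n)."""
    s = (np.asarray(signal) > np.median(signal)).astype(int).tolist()
    n = len(s)
    i, k, l = 0, 1, 1
    c, k_max = 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:        # reached the end while matching
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:           # new phrase found
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c * np.log2(n) / n    # normalised complexity
```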

34 pages, 6460 KB  
Article
Explainable Gait Multi-Anchor Space-Aware Temporal Convolutional Networks for Gait Recognition in Neurological, Orthopedic, and Healthy Cohorts
by Abdullah Alharthi
Mathematics 2026, 14(2), 230; https://doi.org/10.3390/math14020230 - 8 Jan 2026
Viewed by 135
Abstract
Gait recognition using wearable sensor data is crucial for healthcare, rehabilitation, and monitoring neurological and musculoskeletal disorders. This study proposes a deep learning framework for gait classification using inertial measurements from four body-mounted IMU sensors (head, lower back, and both feet). The data were collected from a publicly available, clinically annotated dataset comprising 1356 gait trials from 260 individuals with diverse pathologies. The framework, G-MASA-TCN (Gait Multi-Anchor, Space-Aware Temporal Convolutional Network), integrates multi-scale temporal fusion, graph-informed spatial modeling, and residual dilated convolutions to extract discriminative gait signatures. To ensure both high performance and interpretability, Integrated Gradients is incorporated as an explainable AI (XAI) method, providing sensor-level and temporal attributions that reveal the features driving model decisions. The framework is evaluated via repeated cross-validation experiments, reporting detailed metrics with cross-run statistical analysis (mean ± standard deviation) to assess robustness. Results show that G-MASA-TCN achieves 98% classification accuracy across neurological, orthopedic, and healthy cohorts and 98.4% accuracy in identifying individual subjects based on gait, demonstrating superior stability and resilience compared to baseline architectures, including the Gated Recurrent Unit (GRU), Transformer neural networks, and standard TCNs. Furthermore, the model offers clinically meaningful insights into which sensors and gait phases contribute most to its predictions. This work presents an accurate, interpretable, and reliable tool for gait pathology recognition, with potential for translation to real-world clinical settings. Full article
(This article belongs to the Special Issue Deep Neural Network: Theory, Algorithms and Applications)
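
The residual dilated convolutions at the core of the TCN can be sketched as a generic causal residual block. Channel counts, kernel size, and activation choices here are assumptions, not the G-MASA-TCN specification.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class CausalResidualBlock(nn.Module):
    """One residual TCN block: two dilated causal 1-D convolutions with a
    skip connection (a generic formulation, not the paper's exact block)."""
    def __init__(self, channels, kernel_size=3, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation      # left-pad for causality
        self.conv1 = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.conv2 = nn.Conv1d(channels, channels, kernel_size, dilation=dilation)
        self.act = nn.ReLU()

    def forward(self, x):                            # x: (batch, channels, time)
        y = self.act(self.conv1(F.pad(x, (self.pad, 0))))
        y = self.conv2(F.pad(y, (self.pad, 0)))
        return self.act(y + x)                       # residual connection
```

Stacking such blocks with exponentially growing dilation (1, 2, 4, ...) is what gives a TCN its long receptive field over the IMU time series.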

39 pages, 3706 KB  
Article
Performance Assessment of DL for Network Intrusion Detection on a Constrained IoT Device
by Armin Mazinani, Daniele Antonucci, Luca Davoli and Gianluigi Ferrari
Future Internet 2026, 18(1), 34; https://doi.org/10.3390/fi18010034 - 7 Jan 2026
Viewed by 98
Abstract
This work investigates the deployment of Deep Learning (DL) models for network intrusion detection on resource-constrained IoT devices, using the public CICIoT2023 dataset. In particular, we consider the following DL models: Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), Recurrent Neural Network (RNN), Convolutional Neural Network (CNN), Temporal Convolutional Network (TCN), and Multi-Layer Perceptron (MLP). Bayesian optimization is employed to fine-tune the models' hyperparameters and ensure reliable performance evaluation across both binary (2-class) and multi-class (8-class, 34-class) intrusion detection. Then, the computational complexity of each DL model, in terms of the number of Multiply-ACCumulate operations (MACCs), RAM usage, and inference time, is analyzed through the STMicroelectronics Cube.AI Analyzer tool, with models being deployed on an STM32H7S78-DK board. To assess the practical deployability of the considered DL models, a trade-off score (balancing classification accuracy and computational efficiency) is introduced: according to this score, our experimental results indicate that MLP and TCN outperform the other models. Furthermore, Post-Training Quantization (PTQ) to 8-bit integer precision is applied, allowing the model size to be reduced by more than 90% with negligible performance degradation. This demonstrates the effectiveness of quantization in optimizing DL models for real-world deployment on resource-constrained IoT devices. Full article
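
The 8-bit post-training quantization step can be illustrated with a generic TensorFlow Lite conversion using a representative calibration set. The paper deploys through STMicroelectronics Cube.AI on an STM32 board, so this sketch only mirrors the quantization idea; `keras_model` and `calib_windows` are hypothetical names.

```python
import numpy as np
import tensorflow as tf

def quantize_int8(keras_model, calib_windows):
    """Generic TensorFlow Lite full-integer post-training quantization sketch
    (not the Cube.AI flow used in the paper)."""
    def representative_data():
        # A few hundred calibration inputs are enough to estimate activation ranges.
        for window in calib_windows[:200]:
            yield [window[np.newaxis, ...].astype(np.float32)]

    converter = tf.lite.TFLiteConverter.from_keras_model(keras_model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.representative_dataset = representative_data
    converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
    converter.inference_input_type = tf.int8
    converter.inference_output_type = tf.int8
    return converter.convert()                 # serialized .tflite flatbuffer
```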

17 pages, 1469 KB  
Article
A MASPSO-Optimized CNN–GRU–Attention Hybrid Model for Short-Term Wind Speed Forecasting
by Haoran Du and Yaling Sun
Sustainability 2026, 18(2), 583; https://doi.org/10.3390/su18020583 - 6 Jan 2026
Viewed by 195
Abstract
Short-term wind speed forecasting is challenged by the nonlinear, non-stationary, and highly volatile characteristics of wind speed series, which hinder the performance of traditional prediction models. To improve forecasting capability, this study proposes a hybrid modeling framework that integrates multi-strategy adaptive particle swarm optimization (MASPSO), a convolutional neural network (CNN), a gated recurrent unit (GRU), and an attention mechanism. Within this modeling architecture, the CNN extracts multi-scale spatial patterns, the GRU captures dynamic temporal dependencies, and the attention mechanism highlights salient feature components. MASPSO is further incorporated to perform global hyperparameter optimization, thereby improving both prediction accuracy and generalization. Evaluation on real wind farm data confirms that the proposed modeling framework delivers consistently superior forecasting accuracy across different wind speed conditions, with significantly reduced prediction errors and improved robustness in multi-step forecasting tasks. Full article
(This article belongs to the Special Issue Advances in Sustainable Energy Technologies and Energy Systems)
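
A minimal CNN–GRU–attention forecaster of the kind described above (without the MASPSO hyperparameter search) might look like the sketch below; the layer widths and the additive attention form are assumptions.

```python
import torch
import torch.nn as nn

class GRUAttentionRegressor(nn.Module):
    """CNN front end, GRU over time, attention-weighted pooling, scalar output."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.cnn = nn.Conv1d(n_features, hidden, kernel_size=3, padding=1)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)       # attention scoring per time step
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):                       # x: (batch, time, n_features)
        h = torch.relu(self.cnn(x.transpose(1, 2))).transpose(1, 2)
        seq, _ = self.gru(h)                    # (batch, time, hidden)
        attn = torch.softmax(self.score(seq), dim=1)
        context = (attn * seq).sum(dim=1)       # attention-weighted summary
        return self.out(context).squeeze(-1)    # wind speed forecast
```

In the paper's framework, an optimizer such as MASPSO would search over choices like `hidden`, kernel size, and learning rate rather than fixing them as above.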

26 pages, 6799 KB  
Article
Research on Anomaly Detection and Correction Methods for Nuclear Power Plant Operation Data
by Ren Yu, Yudong Zhao, Shaoxuan Yin, Wei Mao, Chunyuan Wang and Kai Xiao
Processes 2026, 14(2), 192; https://doi.org/10.3390/pr14020192 - 6 Jan 2026
Viewed by 133
Abstract
The data collection and analytical capabilities of the Instrumentation and Control (I&C) system in nuclear power plants (NPPs) continue to advance, thereby enhancing operational state awareness and enabling more precise control. However, the data acquisition, transmission, and storage devices in NPP I&C systems typically operate in harsh environments. This exposure can lead to device failures and susceptibility to external interference, potentially resulting in data anomalies such as missing samples, signal skipping, and measurement drift. This paper presents a Gated Recurrent Unit and Multilayer Perceptron (GRU-MLP)-based method for anomaly detection and correction in NPP I&C system data. The goal is to improve operational data quality, thereby supplying more reliable input for system analysis and automatic controllers. Firstly, a short-term prediction algorithm for operational data based on the GRU model is studied to provide a reference for anomaly detection. Secondly, an MLP model is connected to the GRU model to recognize the difference between the collected value and the predicted value so as to identify and correct anomalies. Finally, a series of experiments were conducted using operational data from a pressurized water reactor (PWR) to evaluate the proposed method. The experiments were designed as follows: (1) The model's prediction performance was assessed across varying time horizons; prediction steps of 1, 3, 5, 10, and 20 were configured to verify the accuracy and robustness of the prediction capability over short and long terms. (2) The model's effectiveness in identifying anomalies was validated using three typical patterns: random jump, fixed-value drift, and growth drift. The growth drift category was further subdivided into linear, polynomial, and logarithmic growth to comprehensively test detection performance. (3) A comparative analysis was performed to demonstrate the superiority of the proposed GRU-MLP algorithm. It was compared against the interactive window center value method and the ARIMA algorithm. The results confirm the advantages of the proposed method for anomaly detection, and the underlying reasons are analyzed. (4) Additional experiments were carried out to verify the transferability of the prediction algorithm, ensuring its applicability under different operational conditions. Full article
(This article belongs to the Section Petroleum and Low-Carbon Energy Process Engineering)
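
A simplified reading of the GRU-MLP scheme above (a GRU predicts the next sample, an MLP judges the residual, and flagged values are replaced by the prediction) is sketched below. The layer sizes, the 0.5 decision threshold, and the correction rule are assumptions.

```python
import torch
import torch.nn as nn

class GRUMLPAnomaly(nn.Module):
    """GRU predictor plus MLP residual judge for detecting and correcting
    anomalous sensor samples (a simplified sketch of the GRU-MLP idea)."""
    def __init__(self, n_signals, hidden=32):
        super().__init__()
        self.gru = nn.GRU(n_signals, hidden, batch_first=True)
        self.predict = nn.Linear(hidden, n_signals)
        self.judge = nn.Sequential(
            nn.Linear(n_signals, hidden), nn.ReLU(), nn.Linear(hidden, 1))

    def forward(self, history, collected):   # history: (B, T, S), collected: (B, S)
        out, _ = self.gru(history)
        predicted = self.predict(out[:, -1, :])       # reference value
        residual = collected - predicted              # collected vs. predicted gap
        anomaly_logit = self.judge(residual).squeeze(-1)
        corrected = torch.where(                      # replace flagged samples
            torch.sigmoid(anomaly_logit).unsqueeze(-1) > 0.5, predicted, collected)
        return predicted, anomaly_logit, corrected
```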

29 pages, 3200 KB  
Article
Accurate Prediction of Type 1 Diabetes Using a Novel Hybrid GRU-Transformer Model and Enhanced CGM Features
by Loubna Mazgouti, Nacira Laamiri, Jaouher Ben Ali, Najiba El Amrani El Idrissi, Véronique Di Costanzo, Roomila Naeck and Jean-Mark Ginoux
Algorithms 2026, 19(1), 52; https://doi.org/10.3390/a19010052 - 6 Jan 2026
Viewed by 196
Abstract
Accurate prediction of Blood Glucose (BG) levels is essential for effective diabetes management and the prevention of adverse glycemic events. This study introduces a novel hybrid Gated Recurrent Unit-Transformer (GRU-Transformer) model tailored to forecast BG levels at 15, 30, 45, and 60 min horizons using only Continuous Glucose Monitoring (CGM) data as input. The proposed approach integrates an advanced CGM feature extraction step. The extracted statistical features include the mean, median, maximum, entropy, autocorrelation, and Detrended Fluctuation Analysis (DFA). In addition, to define more specific features, a custom 3-point monotonicity score, a sinusoidal time encoding, and a workday/weekend binary feature are proposed in this work. This approach enables the model to capture the physiological dynamics and contextual temporal patterns of Type 1 Diabetes (T1D) with high accuracy. To thoroughly assess the performance of the proposed method, we relied on several well-established metrics, including Root Mean Squared Error (RMSE), Coefficient of Determination (R2), Mean Absolute Error (MAE), Mean Absolute Percentage Error (MAPE), and Root Mean Squared Percentage Error (RMSPE). Experimental results demonstrate that the proposed method achieves superior predictive accuracy for both short-term (15–30 min) and long-term (45–60 min) forecasting. Specifically, the model attained the lowest average RMSE values of 4.00, 6.65, 7.96, and 8.91 mg/dL, while yielding consistently high R2 scores for the respective prediction horizons. This new method distinguishes itself by consistently exceeding current prediction models, reinforcing its potential for real-time CGM and clinical decision support. Its high accuracy and adaptability make it a favorable tool for improving diabetes management and personalized glycemic control. Full article
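
A rough sketch of a GRU front end feeding a Transformer encoder, with one regression output per horizon, is given below. The dimensions, number of heads and layers, and the multi-horizon head are assumptions, not taken from the paper.

```python
import torch
import torch.nn as nn

class GRUTransformerForecaster(nn.Module):
    """GRU encodes the CGM feature sequence; a Transformer encoder refines it;
    a linear head outputs one BG forecast per horizon (e.g. 15/30/45/60 min)."""
    def __init__(self, n_features, d_model=64, horizons=4):
        super().__init__()
        self.gru = nn.GRU(n_features, d_model, batch_first=True)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=4,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=2)
        self.head = nn.Linear(d_model, horizons)

    def forward(self, x):                       # x: (batch, time, n_features)
        seq, _ = self.gru(x)
        enc = self.encoder(seq)
        return self.head(enc[:, -1, :])         # BG forecasts for each horizon
```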

20 pages, 789 KB  
Article
Deep Hybrid CNN-LSTM-GRU Model for a Financial Risk Early Warning System
by Muhammad Ali Chohan, Teng Li, Mohammad Abrar and Shamaila Butt
Risks 2026, 14(1), 14; https://doi.org/10.3390/risks14010014 - 5 Jan 2026
Viewed by 214
Abstract
Financial risk early warning systems are essential for proactive risk management in volatile markets, particularly for emerging economies such as China. This study develops a hybrid deep learning model integrating Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRUs) to enhance the accuracy and robustness of financial risk prediction. Using firm-level quarterly financial data from Chinese listed companies, the proposed model is benchmarked against standalone CNN, LSTM, and GRU architectures. Experimental results show that the hybrid CNN–LSTM–GRU model achieves superior performance across all evaluation metrics, with prediction accuracy reaching 93.5%, precision reaching 92.2%, recall reaching 91.8%, and F1-score reaching 92.0%, significantly outperforming individual models. Moreover, the hybrid approach demonstrates faster convergence than LSTM and improved class balance compared to CNN and GRU, reducing false negatives for high-risk firms—a critical aspect for early intervention. These findings highlight the hybrid model’s robustness and real-world applicability, offering regulators, investors, and policymakers a reliable tool for timely financial risk detection and informed decision-making. By combining high predictive power with computational efficiency, the proposed system provides a practical framework for strengthening financial stability in emerging and dynamic markets. Full article
(This article belongs to the Special Issue Advances in Volatility Modeling and Risk in Markets)
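
The hybrid CNN–LSTM–GRU classifier can be sketched as a 1-D convolution over the quarterly feature sequence followed by stacked LSTM and GRU layers and a risk head; all sizes below are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class CNNLSTMGRURisk(nn.Module):
    """Sketch of a CNN -> LSTM -> GRU stack for binary financial-risk warning."""
    def __init__(self, n_features, hidden=64):
        super().__init__()
        self.cnn = nn.Sequential(
            nn.Conv1d(n_features, hidden, kernel_size=3, padding=1), nn.ReLU())
        self.lstm = nn.LSTM(hidden, hidden, batch_first=True)
        self.gru = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)        # logit for "high-risk firm"

    def forward(self, x):                       # x: (batch, quarters, n_features)
        h = self.cnn(x.transpose(1, 2)).transpose(1, 2)
        h, _ = self.lstm(h)
        h, _ = self.gru(h)
        return self.head(h[:, -1, :]).squeeze(-1)
```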

18 pages, 6832 KB  
Article
Enhancing Efficiency in Coal-Fired Boilers Using a New Predictive Control Method for Key Parameters
by Qinwu Li, Libin Yu, Tingyu Liu, Lianming Li, Yangshu Lin, Tao Wang, Chao Yang, Lijie Wang, Weiguo Weng, Chenghang Zheng and Xiang Gao
Sensors 2026, 26(1), 330; https://doi.org/10.3390/s26010330 - 4 Jan 2026
Viewed by 310
Abstract
In the context of carbon neutrality, the large-scale integration of renewable energy sources has led to frequent load changes in coal-fired boilers. These fluctuations cause key operational parameters to deviate significantly from their design values, undermining combustion stability and reducing operational efficiency. To address this issue, we introduce a novel predictive control method to enhance the control precision of key parameters under complex variable-load conditions, which integrates a coupled predictive model and real-time optimization. The predictive model is based on a coupled Transformer-gated recurrent unit (GRU) architecture, which demonstrates strong adaptability to load fluctuations and achieves high prediction accuracy, with a mean absolute error of 0.095% and a coefficient of determination of 0.966 for oxygen content (OC); 0.0163 kPa and 0.987 for bed pressure (BP); and 0.300 °C and 0.927 for main steam temperature (MST). These results represent substantial improvements over standalone implementations of the GRU, LSTM, and Transformer models. Based on these multi-step predictions, a WOA-based real-time optimization strategy determines coordinated adjustments of secondary fan frequency, slag discharger frequency, and desuperheating water valves before deviations occur. Field validation on a 300 t/h boiler over a representative 24 h load cycle shows that the method reduces fluctuations in OC, BP, and MST by 62.07%, 50.95%, and 40.43%, respectively, relative to the original control method. By suppressing parameter variability and maintaining key parameters near operational targets, the method enhances boiler thermal efficiency and steam quality. Based on the performance gain measured during the typical operating day, the corresponding annual gain is estimated at ~1.77%, with an associated CO2 reduction exceeding 6846 t. Full article
(This article belongs to the Section Industrial Sensors)
