Search Results (196)

Search Parameters:
Keywords = long short-term spatial-temporal dependencies

23 pages, 3168 KB  
Article
Spatio-Temporal Feature Fusion-Based Hybrid GAT-CNN-LSTM Model for Enhanced Short-Term Power Load Forecasting
by Jia Huang, Qing Wei, Tiankuo Wang, Jiajun Ding, Longfei Yu, Diyang Wang and Zhitong Yu
Energies 2025, 18(21), 5686; https://doi.org/10.3390/en18215686 - 29 Oct 2025
Abstract
Conventional power load forecasting frameworks face limitations in capturing dynamic spatial topology and modeling long-term dependencies. To address these issues, this study proposes a hybrid GAT-CNN-LSTM architecture for enhanced short-term power load forecasting. The model synergistically integrates three core components: a Graph Attention Network (GAT) that dynamically captures spatial correlations via adaptive node weighting, resolving static-topology constraints; a CNN-LSTM module that extracts multi-scale temporal features, with convolutional kernels decomposing load fluctuations while bidirectional LSTM layers model long-term trends; and a gated fusion mechanism that adaptively weights and fuses the spatio-temporal features, suppressing noise and enhancing sensitivity to critical load periods. Experimental validation on multi-city datasets shows significant improvements: the model outperforms baseline models by a notable margin in error reduction, exhibits stronger robustness under extreme weather, and maintains superior stability in multi-step forecasting. The study concludes that the hybrid model balances spatial topological analysis and temporal trend modeling, providing higher accuracy and adaptability for short-term load forecasting (STLF) in complex power grid environments.
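
As a rough illustration of the gated fusion step described in this abstract, here is a minimal PyTorch sketch that blends a spatial branch output with a temporal branch output via a learned sigmoid gate. The class name, layer sizes, and shapes are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of gated spatio-temporal feature fusion (assumed layer names).
import torch
import torch.nn as nn

class GatedFusion(nn.Module):
    def __init__(self, hidden_dim: int):
        super().__init__()
        # Gate computed from the concatenated branch outputs.
        self.gate = nn.Linear(2 * hidden_dim, hidden_dim)

    def forward(self, h_spatial: torch.Tensor, h_temporal: torch.Tensor) -> torch.Tensor:
        z = torch.sigmoid(self.gate(torch.cat([h_spatial, h_temporal], dim=-1)))
        # Convex combination: the gate decides, per feature, which branch dominates.
        return z * h_spatial + (1.0 - z) * h_temporal

fusion = GatedFusion(hidden_dim=64)
fused = fusion(torch.randn(32, 64), torch.randn(32, 64))  # (batch, hidden)
```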

26 pages, 5287 KB  
Article
Multi-Point Seawall Settlement Modeling Using DTW-Based Hierarchical Clustering and AJSO-LSTM Method
by Chunmei Ding, Xian Liu, Zhenzhu Meng and Yadong Liu
J. Mar. Sci. Eng. 2025, 13(11), 2053; https://doi.org/10.3390/jmse13112053 - 27 Oct 2025
Abstract
Seawall settlement is a critical concern in marine engineering, as excessive or uneven settlement can undermine structural stability and diminish the capacity to withstand marine hydrodynamic actions such as storm surges, waves, and tides. Accurate settlement prediction is therefore vital to ensuring seawall safety. To address the lack of clustering methods that capture the time-series characteristics of monitoring points, and the hyperparameter sensitivity of conventional LSTM models, this study proposes a hybrid model integrating Dynamic Time Warping-based Hierarchical Clustering (DTW-HC) and an Adaptive Joint Search Optimization-enhanced Long Short-Term Memory model (AJSO-LSTM). First, DTW-HC clusters monitoring points according to their time-series characteristics, constructing a spatial panel data structure that incorporates both temporal evolution and spatial heterogeneity. Then, an AJSO-LSTM model is developed within each cluster to capture temporal dependencies and improve prediction performance by overcoming the weaknesses of a conventional LSTM. Finally, the proposed method is validated on seawall settlement monitoring data from a real engineering case against a statistical model, a back-propagation neural network (BP-ANN), and a conventional LSTM. Results demonstrate that the proposed model consistently outperforms these three benchmarks in prediction accuracy and robustness, confirming its potential as an effective tool for seawall safety management and long-term service evaluation.
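
To illustrate the DTW-based hierarchical clustering step, the sketch below computes pairwise DTW distances between monitoring-point series with a plain dynamic-programming DTW and clusters them with SciPy. The O(n²) DTW and the cluster count are illustrative, not the paper's exact variant.

```python
# Illustrative DTW-based hierarchical clustering of monitoring-point series.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw(a: np.ndarray, b: np.ndarray) -> float:
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

series = np.random.rand(8, 120)          # 8 monitoring points, 120 epochs
dist = np.zeros((8, 8))
for i in range(8):
    for j in range(i + 1, 8):
        dist[i, j] = dist[j, i] = dtw(series[i], series[j])

Z = linkage(squareform(dist), method="average")   # hierarchical clustering
labels = fcluster(Z, t=3, criterion="maxclust")   # e.g., three clusters
```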

26 pages, 1737 KB  
Article
ECG-CBA: An End-to-End Deep Learning Model for ECG Anomaly Detection Using CNN, Bi-LSTM, and Attention Mechanism
by Khalid Ammar, Salam Fraihat, Ghazi Al-Naymat and Yousef Sanjalawe
Algorithms 2025, 18(11), 674; https://doi.org/10.3390/a18110674 - 22 Oct 2025
Abstract
The electrocardiogram (ECG) is a vital diagnostic tool used to monitor heart activity and detect cardiac abnormalities such as arrhythmias. Accurate classification of normal and abnormal heartbeats is essential for effective diagnosis and treatment. Traditional deep learning methods for automated ECG classification primarily focus on reconstructing the original ECG signal and detecting anomalies from reconstruction errors, which represent abnormal features; however, these approaches struggle with abnormalities that are unseen or underrepresented in the training data. Other methods rely on manual feature extraction, which can introduce bias and limit adaptability to new datasets. To overcome these problems, this study proposes an end-to-end model called ECG-CBA, which integrates a convolutional neural network (CNN), a bidirectional long short-term memory network (Bi-LSTM), and a multi-head attention mechanism. ECG-CBA learns discriminative features directly from the original dataset rather than relying on manual feature extraction or signal reconstruction, enabling higher accuracy and reliability in detecting and classifying anomalies. The CNN extracts local spatial features from raw ECG signals, while the Bi-LSTM captures temporal dependencies in the sequential data. The attention mechanism lets the model focus on critical segments of the ECG, improving classification performance. The model is trained on normal and abnormal ECG signals for binary classification. ECG-CBA demonstrates strong performance on the ECG5000 and MIT-BIH datasets, achieving accuracies of 99.60% and 98.80%, respectively, and surpasses traditional methods across key metrics, including sensitivity, specificity, and overall classification accuracy. This offers a robust and interpretable solution for both ECG-based anomaly detection and cardiac abnormality classification.
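
A minimal PyTorch sketch of the CNN → Bi-LSTM → multi-head attention pipeline this abstract describes follows; kernel sizes, hidden widths, and the 500-sample segment length are assumptions, not the paper's configuration.

```python
# Sketch of a CNN + Bi-LSTM + multi-head attention ECG classifier.
import torch
import torch.nn as nn

class ECGClassifier(nn.Module):
    def __init__(self, hidden: int = 64, heads: int = 4, classes: int = 2):
        super().__init__()
        self.cnn = nn.Sequential(                     # local morphology features
            nn.Conv1d(1, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.MaxPool1d(2),
        )
        self.bilstm = nn.LSTM(32, hidden, batch_first=True, bidirectional=True)
        self.attn = nn.MultiheadAttention(2 * hidden, heads, batch_first=True)
        self.head = nn.Linear(2 * hidden, classes)

    def forward(self, x):                             # x: (batch, 1, length)
        f = self.cnn(x).transpose(1, 2)               # (batch, steps, 32)
        h, _ = self.bilstm(f)                         # (batch, steps, 2*hidden)
        a, _ = self.attn(h, h, h)                     # self-attention over beats
        return self.head(a.mean(dim=1))               # pooled class logits

model = ECGClassifier()
logits = model(torch.randn(16, 1, 500))               # e.g., 500-sample segments
```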

20 pages, 4600 KB  
Article
Study on the Coupling and Coordination Degree of Virtual and Real Space Heat in Coastal Internet Celebrity Streets
by Yilu Gong, Sijia Han and Jun Yang
ISPRS Int. J. Geo-Inf. 2025, 14(10), 407; https://doi.org/10.3390/ijgi14100407 - 21 Oct 2025
Abstract
This study investigates the coupling and coordination mechanisms between virtual and physical spatial heat in coastal internet-famous streets under the influence of social media. Taking Dalian’s coastal internet-famous street as a case study, user interaction data (likes, favorites, shares, and comments) from the Xiaohongshu platform were integrated with multi-source spatio-temporal big data, including Baidu Heat Maps, to construct an “online–offline” heat coupling and coordination evaluation framework. The entropy-weight method was employed to quantify online heat, while nonlinear regression analysis and a coupling coordination degree model were applied to examine interaction mechanisms and spatio-temporal differentiation patterns. The results show that online heat exhibits significant polarization, with strong agglomeration in the Donggang area, while offline heat fluctuates periodically: rising during the day, stabilizing at night, and peaking on holidays at up to 3.5 times weekday levels, with diminishing marginal effects. Forwarding behavior is confirmed as the core driver of online popularity, highlighting the central role of cross-circle communication. The coupling coordination model identifies states ranging from high-quality coordination during holidays to discoordination in everyday under-conversion or overload scenarios. These findings verify the leading role of algorithmic recommendation in redistributing spatial power and show that the sustainability of coastal check-in destinations depends on balancing short-term traffic surges with long-term spatial quality, offering practical insights for governance and sustainable urban planning.
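
The coupling coordination degree model mentioned here has a standard closed form; a small sketch follows, assuming the online heat U1 and offline heat U2 have already been normalized to [0, 1]. The equal contribution weights a and b are illustrative.

```python
# Standard coupling coordination degree: C (coupling), T (development), D.
import numpy as np

def coupling_coordination(u1: np.ndarray, u2: np.ndarray, a=0.5, b=0.5):
    c = 2.0 * np.sqrt(u1 * u2) / (u1 + u2)   # coupling degree C
    t = a * u1 + b * u2                      # comprehensive development index T
    return np.sqrt(c * t)                    # coordination degree D

d = coupling_coordination(np.array([0.8, 0.3]), np.array([0.7, 0.9]))
```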

21 pages, 2245 KB  
Article
Frequency-Aware and Interactive Spatial-Temporal Graph Convolutional Network for Traffic Flow Prediction
by Guoqing Teng, Han Wu, Hao Wu, Jiahao Cao and Meng Zhao
Appl. Sci. 2025, 15(20), 11254; https://doi.org/10.3390/app152011254 - 21 Oct 2025
Abstract
Accurate traffic flow prediction is pivotal for intelligent transportation systems, yet existing spatial-temporal graph neural networks (STGNNs) struggle to jointly capture the long-term structural stability, short-term dynamics, and multi-scale temporal patterns of road networks. To address these shortcomings, we propose FISTGCN, a Frequency-Aware Interactive Spatial-Temporal Graph Convolutional Network. FISTGCN enriches raw traffic flow features with learnable spatial and temporal embeddings, providing comprehensive spatial-temporal representations for subsequent modeling. An interactive dynamic graph convolutional block generates a time-evolving fused adjacency matrix by combining adaptive and dynamic adjacency matrices, then applies dual sparse graph convolutions with cross-scale interactions to capture multi-scale spatial dependencies. A gated spectral block projects the input features into the frequency domain, adaptively separates low- and high-frequency components using a learnable threshold, extracts features from each band with learnable filters, and fuses the two bands through a gating mechanism, dynamically highlighting short-term fluctuations or long-term trends. Extensive experiments on four benchmark datasets demonstrate that FISTGCN delivers state-of-the-art predictive accuracy while maintaining competitive computational efficiency.
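
A hedged sketch of the frequency-split-and-gate idea follows: project a series to the frequency domain, filter low and high bands with learnable weights, and fuse the two reconstructions with a learned gate. A fixed cutoff index stands in for the paper's learnable threshold; all names and sizes are assumptions.

```python
# Sketch of a gated spectral block with a fixed (not learnable) cutoff.
import torch
import torch.nn as nn

class GatedSpectralBlock(nn.Module):
    def __init__(self, steps: int, cutoff: int = 4):
        super().__init__()
        bins = steps // 2 + 1                      # rFFT output length
        self.low_filter = nn.Parameter(torch.ones(bins))
        self.high_filter = nn.Parameter(torch.ones(bins))
        self.gate = nn.Linear(2 * steps, steps)
        self.cutoff = cutoff

    def forward(self, x):                          # x: (batch, steps)
        spec = torch.fft.rfft(x, dim=-1)
        low, high = spec.clone(), spec.clone()
        low[..., self.cutoff:] = 0                 # keep long-term trend
        high[..., :self.cutoff] = 0                # keep short-term fluctuation
        xl = torch.fft.irfft(low * self.low_filter, n=x.size(-1), dim=-1)
        xh = torch.fft.irfft(high * self.high_filter, n=x.size(-1), dim=-1)
        z = torch.sigmoid(self.gate(torch.cat([xl, xh], dim=-1)))
        return z * xl + (1 - z) * xh               # gated band fusion

block = GatedSpectralBlock(steps=24)
out = block(torch.randn(8, 24))
```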

24 pages, 1741 KB  
Article
Remaining Useful Life Estimation of Lithium-Ion Batteries Using Alpha Evolutionary Algorithm-Optimized Deep Learning
by Fei Li, Danfeng Yang, Jinghan Li, Shuzhen Wang, Chao Wu, Mingwei Li, Chuanfeng Li, Pengcheng Han and Huafei Qian
Batteries 2025, 11(10), 385; https://doi.org/10.3390/batteries11100385 - 20 Oct 2025
Abstract
The precise prediction of the remaining useful life (RUL) of lithium-ion batteries is of great significance for improving energy management efficiency and extending battery lifespan, with wide application in new energy systems and electric vehicles. Accurate RUL prediction remains challenging, however: although various deep learning methods have been proposed, their performance is strongly tied to hyperparameter choices. To overcome this limitation, this study proposes an approach that combines the Alpha evolutionary (AE) algorithm with a deep learning model. The hybrid architecture consists of a convolutional neural network (CNN), a temporal convolutional network (TCN), a bidirectional long short-term memory network (BiLSTM), and a multi-scale attention mechanism, which respectively extract the spatial features, long-term temporal dependencies, and key degradation information of battery data. To optimize model performance, the AE algorithm automatically tunes the hyperparameters of the hybrid model, including the number and size of convolutional kernels in the CNN, the dilation rate in the TCN, the number of units in the BiLSTM, and the parameters of the fusion layer in the attention mechanism. Experimental results demonstrate that this method significantly enhances prediction accuracy and model robustness compared to conventional deep learning techniques, improving battery RUL prediction while offering a new route to solving the parameter tuning problem of neural networks.
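
To make the hyperparameter-optimization loop concrete, here is a generic evolutionary search sketch standing in for the Alpha evolutionary algorithm, whose specific update rules are not reproduced here. The search space and `evaluate` placeholder (which would train the hybrid model and return a validation error) are assumptions.

```python
# Generic evolutionary hyperparameter search (stand-in for the AE algorithm).
import random

SPACE = {
    "cnn_kernels": [3, 5, 7],
    "tcn_dilation": [1, 2, 4, 8],
    "bilstm_units": [32, 64, 128],
}

def evaluate(cfg: dict) -> float:
    # Placeholder: train the CNN-TCN-BiLSTM model with cfg, return val error.
    return random.random()

def mutate(cfg: dict) -> dict:
    key = random.choice(list(SPACE))
    return {**cfg, key: random.choice(SPACE[key])}

population = [{k: random.choice(v) for k, v in SPACE.items()} for _ in range(10)]
for generation in range(20):
    scored = sorted(population, key=evaluate)       # lower error first
    parents = scored[:5]                            # keep the fittest half
    population = parents + [mutate(random.choice(parents)) for _ in range(5)]

best = min(population, key=evaluate)
```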

26 pages, 5440 KB  
Article
Improved Streamflow Forecasting Through SWE-Augmented Spatio-Temporal Graph Neural Networks
by Akhila Akkala, Soukaina Filali Boubrahimi, Shah Muhammad Hamdi, Pouya Hosseinzadeh and Ayman Nassar
Hydrology 2025, 12(10), 268; https://doi.org/10.3390/hydrology12100268 - 11 Oct 2025
Abstract
Streamflow forecasting in snowmelt-dominated basins is essential for water resource planning, flood mitigation, and ecological sustainability. This study presents a comparative evaluation of statistical, machine learning (Random Forest), and deep learning models (Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU), and Spatio-Temporal Graph Neural Network (STGNN)) using 30 years of data from 20 monitoring stations across the Upper Colorado River Basin (UCRB). We assess the impact of integrating meteorological variables, particularly the Snow Water Equivalent (SWE), and spatial dependencies on predictive performance. Among all models, the STGNN achieved the highest accuracy, with a Nash–Sutcliffe Efficiency (NSE) of 0.84 and a Kling–Gupta Efficiency (KGE) of 0.84 in the multivariate setting at the critical downstream node, Lees Ferry. Compared to the univariate setup, SWE-enhanced predictions reduced Root Mean Square Error (RMSE) by 12.8%. Seasonal and spatial analyses showed the greatest improvements at high-elevation and mid-network stations, where snowmelt dynamics dominate runoff. These findings demonstrate that spatio-temporal learning frameworks, especially STGNNs, provide a scalable and physically consistent approach to streamflow forecasting under variable climatic conditions.
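
STGNNs build on graph convolutions over the station network; the sketch below shows the generic symmetric-normalized graph convolution, H' = ReLU(D^-1/2 (A+I) D^-1/2 · H · W), not the authors' full spatio-temporal architecture. The adjacency matrix and feature dimensions are illustrative.

```python
# Generic graph convolution over a station graph (illustrative shapes).
import numpy as np

def gcn_layer(A: np.ndarray, H: np.ndarray, W: np.ndarray) -> np.ndarray:
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt       # D^-1/2 (A+I) D^-1/2
    return np.maximum(A_norm @ H @ W, 0.0)         # ReLU activation

A = (np.random.rand(20, 20) > 0.7).astype(float)   # 20 gauges, toy topology
H = np.random.rand(20, 8)                          # e.g., flow + SWE features
W = np.random.rand(8, 16)
H_next = gcn_layer(A, H, W)
```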

21 pages, 8249 KB  
Article
Short-Term Passenger Flow Forecasting for Rail Transit Integrating Multi-Scale Decomposition and Deep Attention Mechanism
by Youpeng Lu and Jiming Wang
Sustainability 2025, 17(19), 8880; https://doi.org/10.3390/su17198880 - 6 Oct 2025
Abstract
Short-term passenger flow prediction provides critical data-driven support for optimizing resource allocation, guiding passenger mobility, and enhancing risk response capabilities in urban rail transit systems. To further improve prediction accuracy, this study proposes a hybrid SMA-VMD-Informer-BiLSTM prediction model. To address error propagation caused by non-stationary components (e.g., noise and abrupt fluctuations) in passenger flow signals, the Variational Mode Decomposition (VMD) method is introduced to decompose raw flow data into multiple intrinsic mode functions (IMFs), and a Slime Mould Algorithm (SMA)-based optimization mechanism adaptively tunes the VMD parameters, mitigating mode redundancy and information loss. To circumvent the error accumulation inherent in serial modeling frameworks, a parallel prediction architecture is developed: the Informer branch captures long-term dependencies through its ProbSparse self-attention mechanism, while the Bidirectional Long Short-Term Memory (BiLSTM) branch extracts localized short-term temporal patterns. The outputs of both branches are fused via a fully connected layer, balancing global trend adherence and local fluctuation characterization. Experimental validation on historical entry flow data from Weihouzhuang Station on the Xi’an Metro demonstrated the superior performance of the SMA-VMD-Informer-BiLSTM model: compared to benchmark models (CNN-BiLSTM, CNN-BiGRU, Transformer-LSTM, ARIMA-LSTM), it achieved reductions of 7.14–53.33% in MSE, 3.81–31.14% in RMSE, and 8.87–38.08% in MAE, alongside a 4.11–5.48% improvement in R². Cross-station validation across multiple Xi’an Metro hubs further confirmed robust spatial generalizability, with prediction errors bounded within 0.0009–0.01 (MSE), 0.0303–0.1 (RMSE), and 0.0196–0.0697 (MAE), and R² between 0.9011 and 0.9971. The model also performed well when forecasting passenger inflows at multiple stations in Nanjing and Zhengzhou, demonstrating strong spatial transferability. By integrating multi-level, multi-scale data processing and adaptive feature extraction, the proposed model significantly mitigates the error accumulation observed in traditional approaches, offering a scientific foundation for refined operational decision-making and the sustainable, stable long-term operation of urban rail transit systems.
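
A minimal PyTorch sketch of the parallel two-branch fusion idea follows: a long-range branch and a BiLSTM branch run side by side and are fused by a fully connected layer. A vanilla TransformerEncoder stands in for the Informer branch, and all sizes are assumptions.

```python
# Sketch of parallel long-range + BiLSTM branches fused by a linear layer.
import torch
import torch.nn as nn

class ParallelForecaster(nn.Module):
    def __init__(self, d_model: int = 32, hidden: int = 32, horizon: int = 1):
        super().__init__()
        self.proj = nn.Linear(1, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
        self.long_branch = nn.TransformerEncoder(layer, num_layers=2)
        self.short_branch = nn.LSTM(1, hidden, batch_first=True,
                                    bidirectional=True)
        self.fuse = nn.Linear(d_model + 2 * hidden, horizon)

    def forward(self, x):                              # x: (batch, steps, 1)
        g = self.long_branch(self.proj(x)).mean(dim=1) # global trend summary
        s, _ = self.short_branch(x)                    # local patterns
        return self.fuse(torch.cat([g, s[:, -1]], dim=-1))

model = ParallelForecaster()
y_hat = model(torch.randn(16, 48, 1))                  # 48 past steps -> 1 ahead
```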

29 pages, 10675 KB  
Article
Stack Coupling Machine Learning Model Could Enhance the Accuracy in Short-Term Water Quality Prediction
by Kai Zhang, Rui Xia, Yao Wang, Yan Chen, Xiao Wang and Jinghui Dou
Water 2025, 17(19), 2868; https://doi.org/10.3390/w17192868 - 1 Oct 2025
Abstract
Traditional river water quality models struggle to make accurate predictions in watersheds dominated by non-point source pollution, owing to computational complexity and uncertain inputs. This study addresses the problem by developing a coupling model that integrates a gradient boosting algorithm (LightGBM) and a long short-term memory network (LSTM): LightGBM captures spatial data characteristics, while the LSTM models temporal sequence dependencies. Model outputs are reciprocally recalculated as inputs and coupled via linear regression, specifically tackling the lag effects of rainfall runoff and upstream pollutant transport. Applied to predicting concentrations of chemical oxygen demand as determined by the potassium permanganate index (COD) in South China’s Jiuzhoujiang River basin, which is characterized by rainfall-driven non-point pollution from agriculture and livestock, the coupled model outperformed the individual models, increasing prediction accuracy by 8–12% and stability by 15–40% relative to conventional models, making it a more accurate and broadly applicable method for water quality prediction. Analysis confirmed basin rainfall and upstream water quality as the primary drivers of 5-day water quality variation at the SHJ station, influenced by antecedent conditions within 10–15 days. This accurate and stable stack coupling method provides valuable scientific support for regional water management.
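
The sketch below illustrates the stacking step only: base predictions from a gradient-boosting model and a sequence model are combined by linear regression. LightGBM is trained on synthetic features here, and `lstm_pred` is a placeholder for the LSTM branch's output; it is an illustration under those assumptions, not the paper's coupling procedure.

```python
# Stacking LightGBM and (placeholder) LSTM predictions via linear regression.
import numpy as np
import lightgbm as lgb
from sklearn.linear_model import LinearRegression

X = np.random.rand(500, 6)                    # rainfall, upstream quality, ...
y = np.random.rand(500)                       # water quality target

gbm = lgb.LGBMRegressor(n_estimators=200).fit(X, y)
gbm_pred = gbm.predict(X)
lstm_pred = y + np.random.normal(0, 0.05, size=y.shape)  # stand-in branch

stack = np.column_stack([gbm_pred, lstm_pred])
meta = LinearRegression().fit(stack, y)       # couple the two branches
coupled_pred = meta.predict(stack)
```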

32 pages, 13081 KB  
Article
FedIFD: Identifying False Data Injection Attacks in Internet of Vehicles Based on Federated Learning
by Huan Wang, Junying Yang, Jing Sun, Zhe Wang, Qingzheng Liu and Shaoxuan Luo
Big Data Cogn. Comput. 2025, 9(10), 246; https://doi.org/10.3390/bdcc9100246 - 26 Sep 2025
Abstract
With the rapid development of intelligent connected vehicle technology, false data injection (FDI) attacks have become a major challenge in the Internet of Vehicles (IoV). While deep learning methods can effectively identify such attacks, the dynamic, distributed architecture of the IoV and limited computing resources hinder both privacy protection and lightweight computation. To address this, we propose FedIFD, a federated learning (FL)-based detection method for false data injection attacks. The lightweight threat detection model utilizes basic safety messages (BSM) for local incremental training, and the Q-FedCG algorithm compresses gradients for global aggregation. Original features are reshaped using a time window. To ensure temporal and spatial consistency, a sliding average strategy aligns samples before spatial feature extraction. A dual-branch architecture enables parallel extraction of spatiotemporal features: a three-layer stacked Bidirectional Long Short-Term Memory (BiLSTM) captures temporal dependencies, and a lightweight Transformer models spatial relationships. A dynamic feature fusion weight matrix calculates attention scores for adaptive feature weighting. Finally, a differentiated pooling strategy is applied to emphasize critical features. Experiments on the VeReMi dataset show that the accuracy reaches 97.8%.
(This article belongs to the Special Issue Big Data Analytics with Machine Learning for Cyber Security)
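
As a rough illustration of compressed federated aggregation in the spirit of schemes like Q-FedCG (whose exact quantization rules are not reproduced here), the sketch below applies top-k gradient sparsification per client before a FedAvg-style mean. Sizes and the value of k are assumptions.

```python
# Top-k gradient compression followed by federated averaging (illustrative).
import numpy as np

def compress_top_k(grad: np.ndarray, k: int):
    idx = np.argsort(np.abs(grad))[-k:]        # keep k largest-magnitude entries
    return idx, grad[idx]

def decompress(idx: np.ndarray, values: np.ndarray, size: int) -> np.ndarray:
    g = np.zeros(size)
    g[idx] = values
    return g

client_grads = [np.random.randn(100) for _ in range(5)]   # 5 vehicles
restored = [decompress(*compress_top_k(g, k=10), size=100)
            for g in client_grads]
global_update = np.mean(restored, axis=0)      # FedAvg-style aggregation
```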

22 pages, 6045 KB  
Article
Early Warning of Anthracnose on Illicium verum Through the Synergistic Integration of Environmental and Remote Sensing Time Series Data
by Junji Li, Yuxin Zhao, Tianteng Zhang, Jiahui Du, Yucai Li, Ling Wu and Xiangnan Liu
Remote Sens. 2025, 17(19), 3294; https://doi.org/10.3390/rs17193294 - 25 Sep 2025
Abstract
Anthracnose on Illicium verum Hook.f. (I. verum) significantly affects the yield and quality of I. verum, and timely detection methods are urgently needed for early control. Early warning is difficult, however, due to two major challenges: the sparse availability of optical remote sensing observations caused by frequent cloud and rain interference, and the weak spectral responses produced by infestation in its early stages. This article proposes an early-warning framework for anthracnose on I. verum that combines high-frequency environmental (meteorological and topographical) data with Sentinel-2 remote sensing time-series data through a Time-Aware Long Short-Term Memory (T-LSTM) network incorporating an attention mechanism (At-T-LSTM). First, all available environmental and remote sensing data during the study period were analyzed to characterize early anthracnose outbreaks, and sensitive features were selected as algorithm inputs. On this basis, to address the unequal temporal lengths of the environmental and remote sensing time series, the At-T-LSTM model uses a time-aware mechanism to capture intra-feature temporal dependencies, while a self-attention layer quantifies inter-feature interaction weights, enabling effective multi-source time-series feature fusion. The results show that the proposed framework achieves a spatial accuracy (F1-score) of 0.86 and a temporal accuracy of 83% in early-stage detection, demonstrating high reliability. By integrating remote sensing features with environmental drivers, this approach enables multi-feature collaborative modeling for risk assessment and monitoring of I. verum anthracnose, effectively mitigating the impact of sparse observations and significantly improving early-warning accuracy.
(This article belongs to the Special Issue Application of Remote Sensing in Agroforestry (Third Edition))
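
Time-aware LSTM cells handle irregular sampling by discounting the short-term part of the memory with the elapsed time between observations, in the style of T-LSTM (Baytas et al.); a small sketch of that adjustment follows, with illustrative shapes and decay function, not this paper's exact formulation.

```python
# Sketch of T-LSTM-style elapsed-time decay of the cell state.
import math
import torch
import torch.nn as nn

class TimeDecay(nn.Module):
    def __init__(self, hidden: int):
        super().__init__()
        self.short_term = nn.Linear(hidden, hidden)

    def forward(self, c_prev: torch.Tensor, delta_t: torch.Tensor):
        c_short = torch.tanh(self.short_term(c_prev))    # decomposed short-term part
        g = 1.0 / torch.log(math.e + delta_t)            # monotone time decay
        return (c_prev - c_short) + g * c_short          # long-term + decayed short-term

decay = TimeDecay(hidden=16)
c_adjusted = decay(torch.randn(4, 16), torch.tensor([[3.0]]))  # 3 time units elapsed
```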

25 pages, 12502 KB  
Article
BiLSTM-VAE Anomaly Weighted Model for Risk-Graded Mine Water Inrush Early Warning
by Manyu Liang, Hui Yao, Shangxian Yin, Enke Hou, Huiqing Lian, Xiangxue Xia, Jinsui Wu and Bin Xu
Appl. Sci. 2025, 15(19), 10394; https://doi.org/10.3390/app151910394 - 25 Sep 2025
Abstract
A new cascaded model is proposed to improve the accuracy and early warning capability of predicting mine water inrush accidents. The model sequentially applies a Bidirectional Long Short-Term Memory Network (BiLSTM) and a Variational Autoencoder (VAE) to capture the spatio-temporal dependencies between borehole water level data and water inrush events. First, the BiLSTM predicts borehole water levels, and the prediction errors are analyzed to summarize temporal patterns in water level fluctuations. Then, the VAE identifies anomalies in the predicted results. The spatial correlation between borehole water levels, induced by the cone of depression during water inrush, is quantified to assign weights to each borehole. A weighted comprehensive anomaly score is calculated for final prediction. In actual water inrush cases from Xin’an Coal Mine, the BiLSTM-VAE model triggered high-risk alerts 9 h and 30 min in advance, outperforming the conventional threshold-based method by approximately 6 h. Compared with other models, the BiLSTM-VAE demonstrates better timeliness and higher accuracy with lower false alarm rates in mine water inrush prediction. This framework extends the lead time for implementing safety measures and provides a data-driven approach to early warning systems for mine water inrush.
(This article belongs to the Special Issue Hydrogeology and Regional Groundwater Flow)
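
The weighted comprehensive anomaly score described above can be illustrated in a few lines: per-borehole anomaly scores are combined with weights derived from the spatial correlation of water levels. The weighting scheme and alert threshold below are assumptions for illustration.

```python
# Spatially weighted composite anomaly score from per-borehole scores.
import numpy as np

levels = np.random.rand(200, 6)                     # 200 hours, 6 boreholes
anomaly = np.random.rand(6)                         # per-borehole anomaly scores

corr = np.corrcoef(levels.T)                        # borehole spatial correlation
weights = np.abs(corr).sum(axis=1)
weights /= weights.sum()                            # normalize to sum to 1

composite = float(weights @ anomaly)                # weighted anomaly score
alert = composite > 0.6                             # illustrative threshold
```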

14 pages, 1809 KB  
Article
A Novel Convolutional Long Short-Term Memory Approach for Anomaly Detection in Power Monitoring System
by Hao Zhang, Jing Wang, Xuanyuan Wang, Xinyi Feng, Hongda Gao and Yingchun Niu
Energies 2025, 18(18), 4917; https://doi.org/10.3390/en18184917 - 16 Sep 2025
Abstract
With the rapid advancement of artificial intelligence, machine learning and big data analytics have become essential tools for enhancing the cybersecurity of power monitoring systems. This study proposes a network traffic anomaly detection model based on Convolutional Long Short-Term Memory (C-LSTM) networks, which integrates convolutional layers to capture spatial features and LSTM layers to model long-term temporal dependencies in network traffic. Incorporated into a cybersecurity situation awareness platform, the model enables comprehensive data collection, intelligent analysis, and rapid response to cybersecurity incidents, significantly enhancing the system’s ability to detect, warn, and mitigate potential threats. Experimental evaluations on the CICIDS2017 dataset demonstrate that the proposed model achieves high accuracy (95.3%) and recall (94.7%), highlighting its effectiveness and potential for practical application in safeguarding critical infrastructure against evolving cybersecurity challenges.

28 pages, 1812 KB  
Article
An Integrated Hybrid Deep Learning Framework for Intrusion Detection in IoT and IIoT Networks Using CNN-LSTM-GRU Architecture
by Doaa Mohsin Abd Ali Afraji, Jaime Lloret and Lourdes Peñalver
Computation 2025, 13(9), 222; https://doi.org/10.3390/computation13090222 - 14 Sep 2025
Abstract
Intrusion detection systems (IDSs) are critical for securing modern networks, particularly in IoT and IIoT environments where traditional defenses such as firewalls and encryption are insufficient against evolving cyber threats. This paper proposes an enhanced hybrid deep learning model that integrates convolutional neural networks (CNNs), Long Short-Term Memory (LSTM), and Gated Recurrent Units (GRUs) in a multi-branch architecture designed to capture spatial and temporal dependencies while minimizing redundant computation. Unlike conventional hybrid approaches, the proposed parallel–sequential fusion framework leverages the strengths of each component independently before merging features, improving detection granularity and learning efficiency. A rigorous preprocessing pipeline handles real-world data challenges: missing values are imputed with median filling, class imbalance is mitigated through the Synthetic Minority Oversampling Technique (SMOTE), and features are scaled with Min–Max normalization to ensure convergence consistency. The methodology is validated on the TON_IoT and CICIDS2017 datasets, chosen for their diversity and realism in IoT/IIoT attack scenarios. Three hybrid models—CNN-LSTM, CNN-GRU, and the proposed CNN-LSTM-GRU—are assessed for binary and multiclass intrusion detection. Experimental results demonstrate that the CNN-LSTM-GRU architecture achieves superior performance, attaining 100% accuracy in binary classification and 97% in multiclass detection, with balanced precision, recall, and F1-scores across all classes. Evaluation on the CICIDS2017 dataset further confirms the model’s generalization ability, achieving 99.49% accuracy with precision, recall, and F1-scores of 0.9954, 0.9943, and 0.9949, respectively, outperforming the CNN-LSTM and CNN-GRU baselines. Compared to existing IDS models, the approach delivers higher robustness, scalability, and adaptability, making it a promising candidate for next-generation IoT/IIoT security.
(This article belongs to the Section Computational Engineering)
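
The preprocessing pipeline named in this abstract maps directly onto standard scikit-learn and imbalanced-learn APIs; a sketch on synthetic data follows, with the column layout and missing-value rate as assumptions.

```python
# Median imputation, SMOTE oversampling, and Min-Max scaling (illustrative data).
import numpy as np
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import MinMaxScaler
from imblearn.over_sampling import SMOTE

X = np.random.rand(1000, 20)
X[np.random.rand(*X.shape) < 0.05] = np.nan        # simulate missing values
y = (np.random.rand(1000) < 0.1).astype(int)       # imbalanced attack labels

X = SimpleImputer(strategy="median").fit_transform(X)
X, y = SMOTE(random_state=42).fit_resample(X, y)   # balance the classes
X = MinMaxScaler().fit_transform(X)                # scale features to [0, 1]
```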

13 pages, 1699 KB  
Article
Study on Centroid Height Prediction of Non-Rigid Vehicle Based on Deep Learning Combined Model
by Guoqiang Pang, Zhiquan Xiao, Zhanwen Cai and Pei Wang
Sensors 2025, 25(18), 5692; https://doi.org/10.3390/s25185692 - 12 Sep 2025
Abstract
The height of the center of gravity (Z_CG) is a critical parameter for evaluating vehicle safety and performance. Systematic errors arise in Z_CG measurement via the tilt-table test method due to unlocked suspension systems and variable sprung mass conditions, which compromise accuracy. To address this limitation, a CNN–LSTM–Attention model integrating convolutional neural networks (CNNs), long short-term memory networks (LSTMs), and an attention mechanism is proposed. The CNN extracts spatial correlations among vehicle load transfer, suspension stiffness, and tilt angles; the LSTM captures temporal dependencies in tilt angle sequences; and the attention mechanism amplifies critical load-transfer features near the 0° region. Vehicles with unlocked suspension and variable sprung mass were simulated in Adams using tilt-table protocols. The CNN–LSTM–Attention model was trained on the simulation data and validated with real-world tilt-test data under identical suspension conditions. Results demonstrate that the model achieves at least a 6.9% improvement in computational speed and at least a 0.1% reduction in prediction error compared to CNN, CNN-LSTM, and Transformer baselines, and it shows valid predictive capability for Z_CG at a 0° tilt angle. This approach provides a robust solution for Z_CG measurement with the tilt-table test method, enhancing practical accuracy in vehicle dynamics parameter quantification.
(This article belongs to the Topic Vehicle Dynamics and Control, 2nd Edition)
