Search Results (111)

Search Parameters:
Keywords = bidirectional TCN

24 pages, 5799 KB  
Article
Robust Offshore Wind Power Forecasting Under Extreme Marine Conditions Using Multi-Source Feature Fusion and Kolmogorov–Arnold Networks
by Tongbo Zhu, Fan Cai and Dongdong Chen
J. Mar. Sci. Eng. 2026, 14(6), 573; https://doi.org/10.3390/jmse14060573 - 19 Mar 2026
Viewed by 139
Abstract
With the increasing penetration of offshore wind power, extreme marine conditions pose significant challenges to forecasting accuracy and grid stability. To address this issue, this study proposes a robust offshore wind power forecasting framework based on multi-source feature fusion and a hybrid TCN–BiLSTM–KAN architecture. Specifically, a Temporal Convolutional Network (TCN) is employed to extract local multi-scale temporal features and suppress high-frequency disturbances, followed by a Bidirectional Long Short-Term Memory (BiLSTM) network to capture long-term temporal dependencies. A Kolmogorov–Arnold Network (KAN) is further integrated as a nonlinear mapping module to approximate complex dynamics under extreme marine conditions. The model is validated using a real-world offshore wind power dataset with a 15 min forecasting horizon, where balanced samples are constructed across different operating conditions. Experimental results demonstrate that, under extreme conditions, the proposed model achieves an RMSE of 3.58 MW and an R2 of 97.84%, with RMSE reductions of 56.8% and 42.3% compared to CNN-BiLSTM and Transformer-KAN, respectively. Furthermore, cross-site validation confirms that the model maintains stable predictive performance, indicating its preliminary spatial generalization capability. Overall, the proposed framework provides an effective solution for enhancing forecasting reliability and supporting secure grid integration of offshore wind power under extreme marine environments. Full article
(This article belongs to the Section Marine Energy)
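As a concrete illustration of the TCN building block this abstract names, the following is a minimal pure-Python sketch of a dilated causal convolution, the operation a TCN stacks to extract local multi-scale temporal features without leaking future information. The function name, toy kernel, and inputs are illustrative and not taken from the paper.

```python
def dilated_causal_conv(x, kernel, dilation):
    """1-D dilated causal convolution: the output at step t depends only on
    x[t], x[t - dilation], x[t - 2*dilation], ... (no future leakage)."""
    out = []
    for t in range(len(x)):
        acc = 0.0
        for i, w in enumerate(kernel):
            j = t - i * dilation
            if j >= 0:  # taps before the start of the series are zero-padded
                acc += w * x[j]
        out.append(acc)
    return out

# With dilation 2, each output mixes the current step with the step two back:
# dilated_causal_conv([1, 2, 3, 4], [1, 1], 2) → [1.0, 2.0, 4.0, 6.0]
```

Increasing the dilation layer by layer (1, 2, 4, ...) is what lets a TCN cover long histories with few layers, while the causal padding keeps it usable for forecasting.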

22 pages, 793 KB  
Article
Comparative Analysis of Machine Learning and Deep Learning Models for Atrial Fibrillation Detection from Long-Term ECG
by Lerina Aversano, Ilaria Mancino, Agostino Marengo and Chiara Verdone
Appl. Sci. 2026, 16(5), 2390; https://doi.org/10.3390/app16052390 - 28 Feb 2026
Viewed by 222
Abstract
Atrial fibrillation is the most prevalent sustained cardiac arrhythmia and a major risk factor for stroke, heart failure, and premature mortality. Automatic detection remains challenging due to the variability of electrocardiogram (ECG) morphology, noise, and the paroxysmal nature of atrial fibrillation events. This study proposes a comprehensive framework that integrates optimised segmentation, feature extraction, and advanced deep learning architectures to improve detection accuracy. A coalescence window is introduced to dynamically cluster arrhythmic episodes, aligning computational analysis with clinical event distributions. Multiple classifiers are investigated, ranging from traditional machine learning models to state-of-the-art deep neural networks, including Temporal Convolutional Networks (TCNs), Convolutional Neural Networks (CNNs), and Bidirectional Long Short-Term Memory (BiLSTM). Experimental evaluation on a balanced dataset of ECG signals demonstrates the superior performance of deep learning models, with the best architecture achieving high accuracy and F1-score, significantly outperforming traditional approaches. Furthermore, the proposed pipeline is designed to be modular and resource-aware, supporting potential deployment in real-time and edge computing environments. These results highlight the feasibility of scalable atrial fibrillation monitoring systems that bridge algorithmic innovation with clinical applicability, ultimately contributing to earlier diagnosis and improved patient management. Full article

37 pages, 4176 KB  
Article
Real-Time Thermal Symmetry Control of Data Centers Based on Distributed Optical Fiber Sensing and Model Predictive Control
by Lin-Xiang Tang and Mu-Jiang-Shan Wang
Symmetry 2026, 18(3), 398; https://doi.org/10.3390/sym18030398 - 24 Feb 2026
Viewed by 406
Abstract
The high energy consumption and spatiotemporal thermal asymmetry of data center cooling systems have become critical bottlenecks constraining their green and sustainable development. Traditional point-type temperature sensors suffer from insufficient spatial coverage, while conventional feedback control strategies exhibit delayed responses and limited adaptability under dynamic workloads. To address these challenges, this study proposes a real-time thermal symmetry management framework for data centers based on distributed fiber optic temperature sensing and model predictive control (MPC). The proposed system employs Brillouin scattering-based distributed sensing to continuously acquire high-density temperature measurements from thousands of points along a single optical fiber, enabling fine-grained perception of the three-dimensional thermal field. On this basis, a hybrid prediction model integrating thermodynamic physical equations with a Temporal Convolutional Network–Bidirectional Gated Recurrent Unit (TCN–BiGRU) deep neural network is developed to achieve accurate and stable spatiotemporal temperature forecasting. Furthermore, a symmetry-aware MPC controller is designed with the dual objectives of minimizing cooling energy consumption and suppressing thermal field deviations, thereby restoring temperature uniformity through rolling-horizon optimization. Experimental validation in a production data center demonstrates that the distributed sensing system achieves a measurement deviation of 0.12 °C, while the hybrid prediction model attains a root mean square error of 0.41 °C, representing a 26.8% improvement over baseline methods. The MPC-based control strategy reduces daily cooling energy consumption by 14.4%, improves the power usage effectiveness (PUE) from 1.58 to 1.47, and significantly enhances both thermal symmetry and operational safety. 
The Thermal Symmetry Index (TSI) decreased from 0.060 to 0.035, indicating a 41.7% improvement in spatial temperature distribution uniformity. The TSI is defined as the ratio of spatial temperature standard deviation to mean temperature, where lower values indicate better thermal uniformity; TSI < 0.03 represents excellent symmetry, 0.03–0.05 indicates good symmetry, and TSI > 0.08 suggests significant asymmetry requiring intervention. These results provide an effective and practical solution for intelligent operation, energy-efficient control, and low-carbon transformation of next-generation green data centers. Full article
(This article belongs to the Section Engineering and Materials)
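The abstract defines the Thermal Symmetry Index as the ratio of spatial temperature standard deviation to mean temperature, with the bands TSI < 0.03 (excellent), 0.03–0.05 (good), and > 0.08 (significant asymmetry). A small sketch of that definition, assuming population standard deviation (the abstract does not specify sample vs. population, and the 0.05–0.08 range is left unclassified there):

```python
from statistics import mean, pstdev

def thermal_symmetry_index(temps):
    """TSI = spatial standard deviation / spatial mean temperature (lower = more uniform)."""
    return pstdev(temps) / mean(temps)

def classify_tsi(tsi):
    # Bands as stated in the abstract; 0.05-0.08 is not classified there.
    if tsi < 0.03:
        return "excellent symmetry"
    if tsi <= 0.05:
        return "good symmetry"
    if tsi > 0.08:
        return "significant asymmetry"
    return "unclassified band"
```

The reported drop from 0.060 to 0.035 is the quoted 41.7% improvement, i.e. (0.060 − 0.035) / 0.060 ≈ 0.417, and moves the data center from the unclassified band into "good symmetry".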

19 pages, 2559 KB  
Article
A CPO-Optimized BiTCN–BiGRU–Attention Network for Short-Term Wind Power Forecasting
by Liusong Huang, Adam Amril bin Jaharadak, Nor Izzati Ahmad and Jie Wang
Energies 2026, 19(4), 1034; https://doi.org/10.3390/en19041034 - 15 Feb 2026
Viewed by 477
Abstract
Short-term wind power prediction is pivotal for maintaining the stability of power grids characterized by high renewable energy penetration. However, wind power time series exhibit complex characteristics, including local turbulence-induced fluctuations and long-term temporal dependencies, which challenge traditional forecasting models. Furthermore, the performance of hybrid deep learning models is often compromised by the difficulty of tuning hyperparameters over non-convex optimization surfaces. To address these challenges, this study proposes a novel framework: CPO—BiTCN—BiGRU—Attention. Adopting a physically motivated “Filter–Memorize–Focus” strategy, the model first employs a Bidirectional Temporal Convolutional Network (BiTCN) with dilated causal convolutions to extract multi-scale local features and denoise raw data. Subsequently, a Bidirectional Gated Recurrent Unit (BiGRU) captures global temporal evolution, while an attention mechanism dynamically weights critical time steps corresponding to ramp events. To mitigate hyperparameter uncertainty, the Crowned Porcupine Optimization (CPO) algorithm is introduced to adaptively tune the network structure, balancing global exploration and local exploitation more effectively than traditional swarm algorithms. Experimental results obtained from real-world wind farm data in Xinjiang, China, demonstrate that the proposed model consistently outperforms State-of-the-Art benchmark models. Compared with the best competing methods, the proposed framework reduces MAE and MAPE by approximately 30–45%, while maintaining competitive RMSE performance, indicating improved average forecasting accuracy and robustness under varying operating conditions. The results confirm that the proposed architecture effectively decouples local noise from global trends, providing a robust and practical solution for short-term wind power forecasting in grid dispatching applications. Full article
(This article belongs to the Section A3: Wind, Wave and Tidal Energy)
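The MAE, MAPE, and RMSE figures quoted in this abstract (and in most of the forecasting entries on this page) follow the standard definitions; for reference, a minimal stdlib-only sketch, with illustrative inputs not drawn from any of the papers:

```python
from math import sqrt

def mae(y, yhat):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y, yhat)) / len(y)

def rmse(y, yhat):
    """Root mean square error; penalizes large misses (e.g. ramp events) more than MAE."""
    return sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def mape(y, yhat):
    """Mean absolute percentage error; undefined when an actual value is zero,
    a common caveat for wind power during calm or cut-out periods."""
    return 100.0 * sum(abs((a - b) / a) for a, b in zip(y, yhat)) / len(y)
```

That a model can cut MAE and MAPE sharply while only matching others on RMSE, as reported here, is consistent with improving typical-case accuracy more than worst-case errors.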

25 pages, 8031 KB  
Article
A Dual-Optimized Hybrid Deep Learning Framework with RIME-VMD and TCN-BiGRU-SA for Short-Term Wind Power Prediction
by Zhong Wang, Kefei Zhang, Xun Ai, Sheng Liu and Tianbao Zhang
Appl. Sci. 2026, 16(3), 1531; https://doi.org/10.3390/app16031531 - 3 Feb 2026
Viewed by 284
Abstract
Precise short-term forecasting of wind power generation is indispensable for ensuring the security and economic efficiency of power grid operations. Nevertheless, the inherent non-stationarity and stochastic nature of wind power series present significant challenges for prediction accuracy. To address these issues, this paper proposes a dual-optimized hybrid deep learning framework combining Spearman correlation analysis, RIME-VMD, and TCN-BiGRU-SA. First, Spearman correlation analysis is employed to screen meteorological factors, eliminating redundant features and reducing model complexity. Second, an adaptive Variational Mode Decomposition (VMD) strategy, optimized by the RIME algorithm based on Minimum Envelope Entropy, decomposes the non-stationary wind power series into stable intrinsic mode functions (IMFs). Third, a hybrid predictor integrating Temporal Convolutional Network (TCN), Bidirectional Gated Recurrent Unit (BiGRU), and Self-Attention (SA) mechanisms is constructed to capture both local trends and long-term temporal dependencies. Furthermore, the RIME algorithm is utilized again to optimize the hyperparameters of the deep learning predictor to avoid local optima. The proposed framework is validated using full-year datasets from two distinct wind farms in Xinjiang and Gansu, China. Experimental results demonstrate that the proposed model achieves a Root Mean Square Error (RMSE) of 7.5340 MW on the primary dataset, significantly outperforming mainstream baseline models. The multi-dataset verification confirms the model’s superior prediction accuracy, robustness against seasonal variations, and strong generalization capability. Full article
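The Spearman screening step this abstract describes ranks each meteorological factor against the power series and keeps only strongly correlated ones. A minimal sketch of the coefficient for the no-ties case, using the closed form ρ = 1 − 6·Σd² / (n(n² − 1)); real screening would need tie handling (e.g. via average ranks), which is omitted here:

```python
def spearman(x, y):
    """Spearman rank correlation, no-ties case:
    rho = 1 - 6 * sum(d_i^2) / (n * (n^2 - 1)), d_i = rank difference at i."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank + 1
        return r
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))
```

Because it works on ranks, the screen captures monotonic but nonlinear relationships (e.g. the cubic wind-speed-to-power curve) that a Pearson screen on raw values would understate.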

20 pages, 28542 KB  
Article
Accurate State of Charge Estimation for Lithium-Ion Batteries Using a Temporal Convolutional Network and Bidirectional Long Short-Term Memory Hybrid Model
by Jie Qiu, Zhendong Zhang, Zehua Zhu and Chenqiang Luo
Batteries 2026, 12(2), 50; https://doi.org/10.3390/batteries12020050 - 2 Feb 2026
Viewed by 535
Abstract
Lithium-ion batteries are extensively employed in new energy vehicles, where accurate State of Charge (SOC) estimation is fundamental for optimal battery management. However, existing methods often rely on single-model approaches and fail to leverage the complementary advantages of multiple models. This study proposes an innovative hybrid estimation model integrating a Temporal Convolutional Network (TCN) that efficiently captures long-range temporal dependencies via dilated convolution and residual blocks, with a Bidirectional Long Short-Term Memory Network (BiLSTM) that extracts bidirectional context information to enhance the accuracy of SOC estimation. First, the Panasonic datasets are utilized, with current, voltage, and cell temperature selected as input features. Subsequently, the proposed model is evaluated under various temperature conditions and driving cycles, demonstrating high accuracy and robustness. Finally, comparative experiments are conducted against traditional methods, such as standalone TCN and Long Short-Term Memory (LSTM) networks, under both 10 °C and −10 °C operating conditions. The results show that the hybrid model achieves superior performance in error metrics. Specifically, based on a second-order resistor-capacitor network, at −10 °C, the Root Mean Squared Error is reduced by 0.948%, and at 10 °C, it decreases by 0.398%. Additionally, the Maximum Absolute Error is lowered by 2.751% at −10 °C and by 2.192% at 10 °C. These improvements highlight the model’s significant potential as an effective solution for SOC estimation in lithium-ion batteries. Full article
(This article belongs to the Section Battery Modelling, Simulation, Management and Application)
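The abstract's error figures are reported "based on a second-order resistor-capacitor network", i.e. a Thevenin model with two RC polarization branches. A sketch of the standard discrete update for that model class, with hypothetical parameter values; the paper's actual identification procedure and parameters are not given in the abstract:

```python
from math import exp

def rc_branch_step(v_rc, i, r, tau, dt):
    """One discrete step of a single RC polarization branch:
    v' = v * exp(-dt/tau) + R * (1 - exp(-dt/tau)) * i."""
    a = exp(-dt / tau)
    return v_rc * a + r * (1.0 - a) * i

def terminal_voltage(ocv, i, v1, v2, r0):
    """Second-order RC (Thevenin) model, discharge current positive:
    V = OCV - R0*i - v1 - v2."""
    return ocv - r0 * i - v1 - v2
```

Each branch decays toward R·i with time constant τ, which is what gives the model its two-time-scale relaxation behavior after a load change.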

21 pages, 3149 KB  
Article
A Next-Day Dew Intensity Prediction Model Based on the Improved Hippopotamus Optimization
by Yingying Xu, Ziye Lv, Yifei Cai and Kefei Wang
Sustainability 2026, 18(3), 1445; https://doi.org/10.3390/su18031445 - 1 Feb 2026
Viewed by 217
Abstract
Accurate dew intensity prediction is vital in multiple fields, such as agriculture, meteorology, industry, and transportation. This study addresses the cross-disciplinary demands for dew intensity prediction by proposing a hybrid deep learning model based on the improved hippopotamus optimization (IHO). Key influencing factors were selected through multidimensional meteorological data correlation analysis, and a fusion architecture of a Bidirectional Temporal Convolutional Network (BiTCN) and a Support Vector Machine (SVM) was constructed. The IHO algorithm is adopted to optimize model parameters and enhance prediction accuracy adaptively. Experiments were conducted using ten years of meteorological data to verify the prediction of twelve-hour dew intensity in three typical ecosystems in Northeast China: farmland, marsh wetland, and urban areas. The results show that the optimized IHO-BiTCN-SVM model achieved significant improvements in key indicators, including MAE, MAPE, MSE, RMSE, and R². For the farmland ecosystem, MAE was reduced by 72.2% (0.0016572 vs. 0.0059659), MSE decreased from 6.8552 × 10⁻⁵ to 6.7874 × 10⁻⁶, and R² increased by 12.5% (0.98791 vs. 0.87793). The IHO algorithm reduced the MAE of the farmland system by 39.6%, the MAPE by 41.6%, and the MSE by 60.2%, while the R² increased by 1.8% compared with the benchmark model. This model effectively overcomes the subjectivity of traditional methods through an intelligent parameter optimization mechanism, providing reliable technical support for precise agricultural irrigation decisions, urban dew formation warnings, and wetland ecological protection. Full article

21 pages, 6112 KB  
Article
Machine Learning-Based Estimation of Knee Joint Mechanics from Kinematic and Neuromuscular Inputs: A Proof-of-Concept Using the CAMS-Knee Datasets
by Yara N. Derungs, Martin Bertsch, Kushal Malla, Allan Maas, Thomas M. Grupp, Adam Trepczynski, Philipp Damm and Seyyed Hamed Hosseini Nasab
Bioengineering 2026, 13(2), 173; https://doi.org/10.3390/bioengineering13020173 - 31 Jan 2026
Viewed by 749
Abstract
This study explores the feasibility of estimating tibiofemoral joint contact forces using deep learning models trained on in vivo biomechanical data. Leveraging the comprehensive CAMS-Knee datasets, we developed and evaluated two machine learning network architectures, a bidirectional Long Short-Term-Memory Network with a Multilayer Perceptron (biLSTM-MLP) and a Temporal Convolutional Network (TCN) model, to predict medial and lateral knee contact forces (KCFs) across various activities of daily living. Using a leave-one-subject-out validation approach, the biLSTM-MLP model achieved root mean square errors (RMSEs) as low as 0.16 body weight (BW) and Pearson correlation coefficients up to 0.98 for the total KCF (Ftot) during walking. Although the prediction of individual force components showed slightly lower accuracy, the model consistently demonstrated high predictive accuracy and strong temporal coherence. In contrast to the biLSTM-MLP model, the TCN model showed more variable performance across force components and activities. Leave-one-feature-out analyses underscored the dominant role of lower-limb kinematics and ground reaction forces in driving model accuracy, while EMG features contributed only marginally to the overall predictive performance. Collectively, these findings highlight deep learning as a scalable and reliable alternative to traditional musculoskeletal simulations for personalized knee load estimation, establishing a foundation for future research on larger and more heterogeneous populations. Full article
(This article belongs to the Section Biosignal Processing)

38 pages, 12849 KB  
Article
Research on an Ultra-Short-Term Wind Power Forecasting Model Based on Multi-Scale Decomposition and Fusion Framework
by Daixuan Zhou, Yan Jia, Guangchen Liu, Junlin Li, Kaile Xi, Zhichao Wang and Xu Wang
Symmetry 2026, 18(2), 253; https://doi.org/10.3390/sym18020253 - 30 Jan 2026
Viewed by 297
Abstract
Accurate wind power prediction is of great significance for the dispatch, security, and stable operation of energy systems. It helps enhance the symmetry and coordination between the highly stochastic and volatile nature of the power generation supply side and the stringent requirements for stability and power quality on the grid demand side. To further enhance the accuracy of ultra-short-term wind power forecasting, this paper proposes a novel prediction framework based on multi-layer data decomposition, reconstruction, and a combined prediction model. A multi-stage decomposition and reconstruction technique is first employed to significantly reduce noise interference: the Sparrow Search Algorithm (SSA) is utilized to optimize the parameters for an initial Variational Mode Decomposition (VMD), followed by a secondary decomposition of the high-frequency components using Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN). The resulting components are then reconstructed based on Sample Entropy (SE), effectively improving the quality of the input data. Subsequently, a hybrid prediction model named IMGWO-BiTCN-BiGRU is constructed to extract spatiotemporal bidirectional features from the input sequences. Finally, simulation experiments are conducted using actual measurement data from the Sotavento wind farm in Spain. The results demonstrate that the proposed hybrid model outperforms benchmark models across all evaluation metrics, validating its effectiveness in improving forecasting accuracy and stability. Full article

22 pages, 5927 KB  
Article
Research on a Temperature and Humidity Prediction Model for Greenhouse Tomato Based on iT-LSTM-CA
by Yanan Gao, Pingzeng Liu, Yuxuan Zhang, Fengyu Li, Ke Zhu, Yan Zhang and Shiwei Xu
Sustainability 2026, 18(2), 930; https://doi.org/10.3390/su18020930 - 16 Jan 2026
Viewed by 413
Abstract
Constructing a temperature and humidity prediction model for greenhouse-grown tomatoes is of great significance for achieving resource-efficient and sustainable greenhouse environmental control and promoting healthy tomato growth. However, traditional models often struggle to simultaneously capture long-term temporal trends, short-term local dynamic variations, and the coupling relationships among multiple variables. To address these issues, this study develops an iT-LSTM-CA multi-step prediction model, in which the inverted Transformer (iTransformer, iT) is employed to capture global dependencies across variables and long temporal scales, the Long Short-Term Memory (LSTM) network is utilized to extract short-term local variation patterns, and a cross-attention (CA) mechanism is introduced to dynamically fuse the two types of features. Experimental results show that, compared with models such as Gated Recurrent Unit (GRU), Temporal Convolutional Network (TCN), Recurrent Neural Network (RNN), LSTM, and Bidirectional Long Short-Term Memory (Bi-LSTM), the iT-LSTM-CA achieves the best performance in multi-step forecasting tasks at 3 h, 6 h, 12 h, and 24 h horizons. For temperature prediction, the R2 ranges from 0.96 to 0.98, with MAE between 0.42 °C and 0.79 °C and RMSE between 0.58 °C and 1.06 °C; for humidity prediction, the R2 ranges from 0.95 to 0.97, with MAE between 1.21% and 2.49% and RMSE between 1.78% and 3.42%. These results indicate that the iT-LSTM-CA model can effectively capture greenhouse environmental variations and provide a scientific basis for environmental control and management in tomato greenhouses. Full article

17 pages, 3529 KB  
Article
Study on Multimodal Sensor Fusion for Heart Rate Estimation Using BCG and PPG Signals
by Jisheng Xing, Xin Fang, Jing Bai, Luyao Cui, Feng Zhang and Yu Xu
Sensors 2026, 26(2), 548; https://doi.org/10.3390/s26020548 - 14 Jan 2026
Viewed by 701
Abstract
Continuous heart rate monitoring is crucial for early cardiovascular disease detection. To overcome the discomfort and limitations of ECG in home settings, we propose a multimodal temporal fusion network (MM-TFNet) that integrates ballistocardiography (BCG) and photoplethysmography (PPG) signals. The network extracts temporal features from BCG and PPG signals through temporal convolutional networks (TCNs) and bidirectional long short-term memory networks (BiLSTMs), respectively, achieving cross-modal dynamic fusion at the feature level. First, bimodal features are projected into a unified dimensional space through fully connected layers. Subsequently, a cross-modal attention weight matrix is constructed for adaptive learning of the complementary correlation between BCG mechanical vibration and PPG volumetric flow features. Combined with dynamic focusing on key heartbeat waveforms through multi-head self-attention (MHSA), the model’s robustness under dynamic activity states is significantly enhanced. Experimental validation using a publicly available BCG-PPG-ECG simultaneous acquisition dataset comprising 40 subjects demonstrates that the model achieves excellent performance with a mean absolute error (MAE) of 0.88 BPM in heart rate prediction tasks, outperforming current mainstream deep learning methods. This study provides theoretical foundations and engineering guidance for developing contactless, low-power, edge-deployable home health monitoring systems, demonstrating the broad application potential of multimodal fusion methods in complex physiological signal analysis. Full article
(This article belongs to the Section Biomedical Sensors)

16 pages, 2092 KB  
Article
Bidirectional Temporal Attention Convolutional Networks for High-Performance Network Traffic Anomaly Detection
by Feng Wang, Yufeng Huang and Yifei Shi
Information 2026, 17(1), 61; https://doi.org/10.3390/info17010061 - 9 Jan 2026
Viewed by 360
Abstract
Deep learning-based network traffic anomaly detection, particularly using Recurrent Neural Networks (RNNs), often struggles with high computational overhead and difficulties in capturing long-range temporal dependencies. To address these limitations, this paper proposes a Bidirectional Temporal Attention Convolutional Network (Bi-TACN) for robust and efficient network traffic anomaly detection. Specifically, dilated causal convolutions with expanding receptive fields and residual modules are employed to capture multi-scale temporal patterns while effectively mitigating the vanishing gradient. Furthermore, a bidirectional structure integrated with Efficient Channel Attention (ECA) is designed to adaptively weight contextual features, preventing sparse attack indicators from being overwhelmed by dominant normal traffic. A Softmax-based classifier then leverages these refined representations to execute high-performance anomaly detection. Extensive experiments on the NSL-KDD and UNSW-NB15 datasets demonstrate that Bi-TACN achieves average accuracies of 88.51% and 82.5%, respectively, significantly outperforming baseline models such as Bi-TCN and Bi-GRU in terms of both precision and convergence speed. Full article
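The "expanding receptive fields" this abstract attributes to dilated causal convolutions follow a simple rule: each layer with dilation d and kernel size k adds (k − 1)·d steps of history. A one-line sketch of that arithmetic (the dilation schedule below is the conventional doubling pattern, not one stated in the paper):

```python
def receptive_field(kernel_size, dilations):
    """Receptive field (in time steps) of stacked dilated causal conv layers:
    RF = 1 + (kernel_size - 1) * sum of the per-layer dilations."""
    return 1 + (kernel_size - 1) * sum(dilations)

# Four layers, kernel 3, dilations doubling 1, 2, 4, 8:
# receptive_field(3, [1, 2, 4, 8]) → 31 time steps of context
```

This geometric growth is why a convolutional detector can match the long-range context of an RNN at a fraction of the sequential computation, which is the efficiency argument the abstract makes against RNN baselines.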

28 pages, 6394 KB  
Article
Prediction of Blade Root Loads for Wind Turbine Based on RBMO-VMD and TCN-BiLSTM-Attention
by Yifan Liu and Jing Cheng
Mathematics 2026, 14(2), 218; https://doi.org/10.3390/math14020218 - 6 Jan 2026
Viewed by 265
Abstract
To address the challenges associated with wind turbine blade root loads, including nonlinearity, strong coupling effects, high computational complexity, and the limitations of conventional mathematical-physical modeling approaches, this paper proposes a wind turbine blade root load prediction model that integrates Variational Mode Decomposition (VMD), optimized by the Red-billed Blue Magpie Algorithm (RBMO), with a combined Temporal Convolutional Network (TCN)-Bidirectional Long Short-Term Memory (BiLSTM)-Attention mechanism. First, the RBMO algorithm optimizes VMD parameters. VMD decomposes data into multiple sub-sequences, which are combined with environmental and operational parameters to form input components for the TCN-BiLSTM-Attention ensemble prediction model. Finally, the RBMO algorithm determines the optimal hyperparameter configuration for the combined model. Prediction outputs from each component are then aggregated and reconstructed to yield the final blade root load prediction. Predictions are compared against actual data and results from other forecasting models. Results demonstrate superior predictive performance for the proposed model, effectively enhancing the accuracy of blade root load prediction for wind turbines. Full article
(This article belongs to the Collection Applied Mathematics for Emerging Trends in Mechatronic Systems)

25 pages, 3113 KB  
Article
Data-Driven Modeling for a Liquid Desiccant Dehumidification Air Conditioning System Based on BKA-BiTCN-BiLSTM-SA
by Xianhua Ou, Xinkai Wang, Zheyu Wang and Xiongxiong He
Appl. Sci. 2026, 16(1), 304; https://doi.org/10.3390/app16010304 - 28 Dec 2025
Viewed by 295
Abstract
The model of a liquid desiccant dehumidification air conditioning (LDAC) system is one of the key foundations for achieving efficient cooling, dehumidification and regeneration, and saving energy consumption. The data-driven modeling method does not need to understand the complex heat and mass transfer mechanism and equipment physical information, thus the modeling complexity is greatly reduced. This paper proposes a temperature and humidity prediction model integrating the Black Kite Algorithm (BKA), Bidirectional Temporal Convolutional Network (BiTCN), Bidirectional Long Short-Term Memory (BiLSTM), and Self-Attention mechanism (SA). The model extracts local spatiotemporal features from sequence data through BiTCN, enhances the understanding of contextual dependencies in temporal data using BiLSTM, and employs the SA to assign dynamic weights to different time steps. Furthermore, BKA is adopted to optimize the hyperparameter combinations of the neural network, thereby improving prediction accuracy. To validate the model performance, an experimental platform for an LDAC system was established to collect operational data under multiple working conditions, constructing a comprehensive dataset for simulation analysis. Experimental results demonstrate that compared to conventional time-series prediction models, the proposed model achieves higher accuracy in predicting outlet temperature and humidity across various operating conditions, providing reliable technical support for system real-time control and performance optimization. Full article

27 pages, 5037 KB  
Article
A TCN-BiLSTM and ANR-IEKF Hybrid Framework for Sustained Vehicle Positioning During GNSS Outages
by Senhao Niu, Jie Li, Chenjun Hu, Junlong Li, Debiao Zhang and Kaiqiang Feng
Sensors 2026, 26(1), 152; https://doi.org/10.3390/s26010152 - 25 Dec 2025
Viewed by 550
Abstract
The performance of integrated Global Navigation Satellite System and Inertial Navigation System (GNSS/INS) navigation often declines in complex urban environments due to frequent GNSS signal blockages. This poses a significant challenge for autonomous driving applications that require continuous and reliable positioning. To address this limitation, this paper presents a novel hybrid framework that combines a deep learning architecture with an adaptive Kalman Filter. At the core of this framework is a Temporal Convolutional Network and Bidirectional Long Short-Term Memory (TCN-BiLSTM) model, which generates accurate pseudo-GNSS measurements from raw INS data during GNSS outages. These measurements are then fused with the INS data stream using an Adaptive Noise-Regulated Iterated Extended Kalman Filter (ANR-IEKF), which enhances robustness by dynamically estimating and adjusting the process and observation noise statistics in real time. The proposed ANR-IEKF + TCN-BiLSTM framework was validated using a real-world vehicle dataset that encompasses both straight-line and turning scenarios. The results demonstrate its superior performance in positioning accuracy and robustness compared to several baseline models, thereby confirming its effectiveness as a reliable solution for maintaining high-precision navigation in GNSS-denied environments. Validated in 70 s GNSS outage environments, our approach enhances positioning accuracy by over 50% against strong deep learning baselines with errors reduced to roughly 3.4 m. Full article
(This article belongs to the Section Navigation and Positioning)
