Search Results (12,687)

Search Parameters:
Keywords = time-series modelling

25 pages, 13024 KB  
Article
Hybrid Frequency–Temporal Modeling with Transformer for Long-Term Satellite Telemetry Prediction
by Zhuqing Chen, Jiasen Yang, Zhongkang Yin, Yijia Wu, Lei Zhong, Qingyu Jia and Zhimin Chen
Appl. Sci. 2025, 15(21), 11585; https://doi.org/10.3390/app152111585 (registering DOI) - 30 Oct 2025
Abstract
Reliable forecasting of satellite telemetry is critical for spacecraft health management and mission planning. However, conventional data-driven methods often struggle to effectively capture both the long-term dependencies and local dynamics inherent in telemetry data. To tackle these challenges, we introduce FFT1D-Dual, a hybrid Transformer framework that unifies frequency-domain and temporal-domain modeling, capturing both long-term dependencies and local features in telemetry data to enable more accurate forecasting. The encoder replaces computationally expensive self-attention with a novel Dual-Path Mixer that combines one-dimensional Fast Fourier Transform (FFT) and temporal convolutions, adaptively fused via a learnable channel-wise gating mechanism. A standard attention-based decoder with dynamic positional encodings preserves temporal reasoning capability. Experiments on real-world satellite telemetry datasets demonstrate that FFT1D-Dual outperforms baselines in most settings, on both short- and long-term horizons and for three representative telemetry variables, while maintaining consistently lower error growth in long-horizon predictions. Ablation studies confirm that frequency-domain modeling and dual-path fusion jointly contribute to these gains. The proposed approach provides an efficient solution for accurate long-term forecasting in complex satellite telemetry scenarios. Full article
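A minimal sketch of the kind of dual-path block the abstract describes: an FNet-style FFT mixing path and a depthwise temporal-convolution path, fused by a learnable channel-wise gate. Module names, layer sizes, and the exact fusion rule are assumptions for illustration, not the authors' implementation.

```python
# Minimal sketch of a dual-path mixer: an FFT-based frequency path and a
# temporal-convolution path, fused by a learnable channel-wise gate.
# Names and shapes are illustrative assumptions, not the authors' code.
import torch
import torch.nn as nn

class DualPathMixer(nn.Module):
    def __init__(self, d_model: int, kernel_size: int = 3):
        super().__init__()
        self.freq_proj = nn.Linear(d_model, d_model)
        # temporal path: depthwise 1-D convolution over the time axis
        self.temp_conv = nn.Conv1d(d_model, d_model, kernel_size,
                                   padding=kernel_size // 2, groups=d_model)
        # learnable channel-wise gate, squashed to [0, 1] in forward()
        self.gate = nn.Parameter(torch.zeros(d_model))

    def forward(self, x):                        # x: (batch, time, d_model)
        freq = torch.fft.fft(x, dim=1).real      # FNet-style real-FFT mixing over time
        freq = self.freq_proj(freq)
        temp = self.temp_conv(x.transpose(1, 2)).transpose(1, 2)
        g = torch.sigmoid(self.gate)             # per-channel mixing weight
        return x + g * freq + (1.0 - g) * temp   # residual fusion of both paths

x = torch.randn(8, 96, 64)                       # 8 sequences, 96 steps, 64 channels
print(DualPathMixer(64)(x).shape)                # torch.Size([8, 96, 64])
```

The FFT path mixes information across all time steps at O(T log T) cost, which is the kind of saving the abstract attributes to replacing self-attention in the encoder.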

17 pages, 4959 KB  
Article
A Variational Mode Snake-Optimized Neural Network Prediction Model for Agricultural Land Subsidence Monitoring Based on Temporal InSAR Remote Sensing
by Zhenda Wang, Huimin Huang, Ruoxin Wang, Ming Guo, Longjun Li, Yue Teng and Yuefan Zhang
Processes 2025, 13(11), 3480; https://doi.org/10.3390/pr13113480 - 29 Oct 2025
Abstract
Interferometric Synthetic Aperture Radar (InSAR) technology is crucial for large-scale land subsidence analysis in cultivated areas within hilly and mountainous regions. Accurate prediction of this subsidence is of significant importance for agricultural resource management and planning. Addressing the limitations of existing subsidence prediction methods in terms of accuracy and model selection, this paper proposes a deep neural network prediction model based on Variational Mode Decomposition (VMD) and the Snake Optimizer (SO), termed VMD-SO-CNN-LSTM-MATT. VMD decomposes complex subsidence signals into stable intrinsic components, improving input data quality. The SO algorithm is introduced to globally optimize model parameters, preventing convergence to local optima and enhancing prediction accuracy. This model utilizes time-series subsidence data extracted via the SBAS-InSAR technique as input. Initially, the original sequence is decomposed into multiple intrinsic mode functions (IMFs) using VMD. Subsequently, a CNN-LSTM network incorporating a Multi-Head Attention mechanism (MATT) is employed to model and predict each component. Concurrently, the SO algorithm performs global optimization of the model hyperparameters. Experimental results demonstrate that the proposed model significantly outperforms comparative models (traditional Long Short-Term Memory (LSTM) neural network, VMD-CNN-LSTM-MATT, and Sparrow Search Algorithm (SSA)-optimized CNN-LSTM) across key metrics: Mean Absolute Error (MAE), Root Mean Square Error (RMSE), and Mean Absolute Percentage Error (MAPE). Specifically, the minimum improvements achieved are 29.85% for MAE, 8.42% for RMSE, and 33.69% for MAPE. This model effectively enhances the prediction accuracy of land subsidence in cultivated hilly and mountainous areas, validating its high reliability and practicality for subsidence monitoring and prediction tasks. Full article
(This article belongs to the Section AI-Enabled Process Engineering)
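A sketch of the per-component predictor the pipeline implies: a Conv1d front end, an LSTM, and multi-head attention over the LSTM outputs with a regression head, applied to each IMF. The VMD decomposition itself and the Snake-Optimizer hyperparameter search are omitted, and all layer sizes are assumed values.

```python
# Sketch of a per-IMF predictor: Conv1d front end, LSTM, multi-head attention
# over the LSTM outputs, and a one-step regression head. VMD decomposition and
# Snake-Optimizer tuning are omitted; sizes are illustrative assumptions.
import torch
import torch.nn as nn

class CNNLSTMAtt(nn.Module):
    def __init__(self, channels: int = 32, hidden: int = 64, heads: int = 4):
        super().__init__()
        self.conv = nn.Conv1d(1, channels, kernel_size=3, padding=1)
        self.lstm = nn.LSTM(channels, hidden, batch_first=True)
        self.att = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                          # x: (batch, window) for one IMF
        h = torch.relu(self.conv(x.unsqueeze(1)))  # (batch, channels, window)
        h, _ = self.lstm(h.transpose(1, 2))        # (batch, window, hidden)
        h, _ = self.att(h, h, h)                   # self-attention over time steps
        return self.head(h[:, -1])                 # next-step value for this IMF

model = CNNLSTMAtt()
print(model(torch.randn(16, 24)).shape)            # torch.Size([16, 1])
```

In a VMD-based pipeline of this kind, the overall subsidence forecast would then typically be the sum of the per-IMF predictions.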

13 pages, 3563 KB  
Article
Iterative Forecasting of Short Time Series
by Evangelos Bakalis
Appl. Sci. 2025, 15(21), 11580; https://doi.org/10.3390/app152111580 - 29 Oct 2025
Abstract
We forecast short time series iteratively using a model based on stochastic differential equations. The recorded process is assumed to be consistent with an α-stable Lévy motion. The generalized moments method provides the values of the scaling exponent and the parameter α, which determine the form of the stochastic term at each iteration. Seven weekly recorded economic time series (the DAX, CAC, FTSE100, MIB, AEX, IBEX, and STOXX600) were examined for the period from 2020 to 2025. The parameter α equals 2 for four of them (FTSE100, AEX, IBEX, and STOXX600), indicating quasi-Gaussian processes. For FTSE100, IBEX, and STOXX600, the processes are anti-persistent (H < 0.5). The rest of the examined markets show characteristics of uncorrelated processes whose values are drawn from either a log-normal or a log-Lévy distribution. Further, all processes are multifractal, as the non-zero value of the mean intermittency indicates. The model's forecasts, with a time horizon of one step ahead, are compared to the forecasts of a properly chosen ARIMA model combined with Monte Carlo simulations. The low values of the absolute percentage error indicate that both models function well. The model's outcomes are further compared to the ARIMA forecasts using the Diebold–Mariano test, which indicates better forecasting ability for the proposed model, since it has a lower average loss. The model's ability to forecast even short time series accurately is further supported by the low absolute percentage error, with a value of 4 serving as an upper limit for the majority of the forecasts. Full article
(This article belongs to the Special Issue Advanced Methods for Time Series Forecasting)
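A toy illustration of the iterative one-step-ahead idea: draw α-stable increments and take their Monte-Carlo mean for the next value. The scaling exponent H, stability index α, drift, and scale are placeholders here and would come from the generalized-moments estimation step, which is not reproduced.

```python
# Toy illustration of iterative one-step-ahead forecasting with alpha-stable
# increments. H, alpha, drift and scale are assumed to come from a separate
# estimation step (the paper uses the generalized moments method); the values
# below are placeholders, not fitted parameters.
import numpy as np
from scipy.stats import levy_stable

rng = np.random.default_rng(0)

def forecast_next(x_last, H=0.45, alpha=2.0, drift=0.0, scale=0.01,
                  dt=1.0, n_sim=5000):
    """Monte-Carlo mean of x_{t+1} = x_t + drift*dt + scale * dt**H * L_alpha."""
    increments = levy_stable.rvs(alpha, 0.0, size=n_sim, random_state=rng)
    return np.mean(x_last + drift * dt + scale * dt**H * increments)

series = np.cumsum(rng.normal(0, 0.01, 260)) + 100.0   # stand-in weekly index
print(forecast_next(series[-1]))                       # one-step-ahead forecast
```
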
20 pages, 2167 KB  
Article
Research on Fault Diagnosis Method for Autonomous Underwater Vehicles Based on Improved LSTM Under Data Missing Conditions
by Lingyan Dong and Yan Huo
Appl. Sci. 2025, 15(21), 11570; https://doi.org/10.3390/app152111570 - 29 Oct 2025
Abstract
Fault diagnosis for Autonomous Underwater Vehicles (AUVs) is a key technology for ensuring the safety of AUVs and an important capability for enabling them to perform tasks underwater autonomously for long periods. The effectiveness of current diagnostic methods is affected by the reliability of expert knowledge and the accuracy of model establishment. In addition, some data-driven diagnostic methods lack robustness. Unlike traditional model-based fault diagnosis methods, this paper proposes a fault diagnosis method for AUVs based on the LSTM (Long Short-Term Memory) algorithm. LSTM is well suited to processing time-series data and can learn complex temporal patterns. Therefore, the LSTM model is used to learn the mapping from state data to the corresponding fault types. The underwater environment in which AUVs work is complex and ever-changing, and packet loss may occur during data transmission, resulting in partial loss of online data. To address this issue, this paper fills in missing values during the feature processing stage and then uses a BiLSTM-Attention-MiniLoss algorithm to enhance the robustness of the diagnostic model. Finally, the fault diagnosis accuracy of the original LSTM and the BiLSTM-Attention-MiniLoss was compared on an open-source dataset under different degrees of data loss. The experimental results showed that the fault diagnosis methods for AUVs based on LSTM and on BiLSTM-Attention-MiniLoss could predict the type of fault from the navigation status data of the AUV, with BiLSTM-Attention-MiniLoss performing better. Full article
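A minimal sketch of the pipeline under missing data, assuming a simple forward-fill of dropped samples followed by a bidirectional LSTM classifier over the navigation-state window. The attention and MiniLoss components are not reproduced, and the feature count and number of fault classes are placeholders.

```python
# Sketch: forward-fill dropped samples, then classify the navigation-state
# window with a bidirectional LSTM. Attention/MiniLoss parts are omitted;
# feature count and number of fault classes are assumptions.
import torch
import torch.nn as nn

def forward_fill(x):                     # x: (batch, time, features) with NaNs
    x = x.clone()
    for t in range(1, x.size(1)):
        nan = torch.isnan(x[:, t])
        x[:, t][nan] = x[:, t - 1][nan]  # carry last valid value forward
    return torch.nan_to_num(x, nan=0.0)  # leading NaNs fall back to zero

class BiLSTMClassifier(nn.Module):
    def __init__(self, n_features=6, hidden=64, n_faults=4):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_faults)

    def forward(self, x):
        out, _ = self.lstm(forward_fill(x))
        return self.head(out[:, -1])     # logits over fault types

x = torch.randn(8, 50, 6)
x[x > 1.5] = float("nan")                # simulate packet loss
print(BiLSTMClassifier()(x).shape)       # torch.Size([8, 4])
```
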
48 pages, 1608 KB  
Systematic Review
A Systematic Review of Advances in Deep Learning Architectures for Efficient and Sustainable Photovoltaic Solar Tracking: Research Challenges and Future Directions
by Ali Alhazmi, Kholoud Maswadi and Christopher Ifeanyi Eke
Sustainability 2025, 17(21), 9625; https://doi.org/10.3390/su17219625 (registering DOI) - 29 Oct 2025
Abstract
The swift advancement of renewable energy technology has highlighted the need for effective photovoltaic (PV) solar energy tracking systems. Deep learning (DL) has surfaced as a promising method to improve the precision and efficacy of PV solar tracking by exploiting complex patterns in meteorological and PV system data. This systematic literature review (SLR) seeks to offer a thorough examination of the progress in deep learning architectures for photovoltaic solar energy tracking over the last decade (2016–2025). The review was structured around four research questions (RQs) aimed at identifying prevalent deep learning architectures, datasets, performance metrics, and issues within the context of deep learning-based PV solar tracking systems. The present research utilised SLR methodology to analyse 64 high-quality publications from reputed academic databases such as IEEE Xplore, ScienceDirect, Springer, and MDPI. The results indicated that deep learning architectures, including Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) networks, and Transformer-based models, are extensively employed to improve the accuracy and efficiency of photovoltaic solar tracking systems. Widely utilised datasets comprised meteorological data, photovoltaic system data, time series data, temperature data, and image data. Performance metrics, including Mean Absolute Error (MAE), Mean Squared Error (MSE), and Mean Absolute Percentage Error (MAPE), were employed to assess model efficacy. Significant challenges identified include inadequate data quality, restricted data availability, high computational complexity, and difficulties in model generalisation. Future research should concentrate on enhancing data quality and accessibility, creating generalised models, minimising computational complexity, and integrating deep learning with real-time photovoltaic systems. Resolving these challenges would facilitate advancements in efficient, reliable, and sustainable photovoltaic solar tracking systems, hence promoting the wider adoption of renewable energy technology. This review emphasises the capability of deep learning to transform photovoltaic solar tracking and stresses the necessity for interdisciplinary collaboration to address current limitations. Full article
25 pages, 1608 KB  
Article
Online Imputation of Corrupted Glucose Sensor Data Using Deep Neural Networks and Physiological Inputs
by Oscar D. Sanchez, Eduardo Mendez-Palos, Daniel Alexander Pascoe, Hannia M. Hernandez, Jesus G. Alvarez and Alma Y. Alanis
Algorithms 2025, 18(11), 688; https://doi.org/10.3390/a18110688 (registering DOI) - 29 Oct 2025
Abstract
One of the main challenges when working with time series captured online using sensors is the appearance of noise or null values, generally caused by sensor failures or temporary disconnections. These errors compromise data reliability and can lead to incorrect decisions. Particularly in the treatment of diabetes mellitus, where medical decisions depend on continuous glucose monitoring (CGM) systems provided by modern sensors, the presence of corrupted data can pose a significant risk to patient health. This work presents an approach that encompasses online detection and imputation of anomalous data using physiological inputs (insulin and carbohydrate intake), which enables decision-making in automatic glucose monitoring systems or for glucose control purposes. Four deep neural network architectures are proposed (CNN-LSTM, GRU, 1D-CNN, and Transformer-LSTM), evaluated under a controlled fault injection protocol, and compared with an ARIMA model and a Temporal Convolutional Network (TCN). Performance is compared using regression (MAE, RMSE, MARD) and classification (accuracy, precision, recall, F1-score, AUC) metrics. Results show that the CNN-LSTM network is the most effective for fault detection, achieving an F1-score of 0.876 and an accuracy of 0.979. Regarding data imputation, the 1D-CNN network obtained the best performance, with an MAE of 2.96 mg/dL and an RMSE of 3.75 mg/dL. Validation on the OhioT1DM dataset, which contains real CGM data with natural sensor disconnections, showed that the CNN-LSTM model accurately detected anomalies and reliably imputed missing glucose segments under real-world conditions. Full article
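A sketch of a 1D-CNN imputer of the kind the abstract evaluates: a small convolutional stack maps a window of glucose, insulin, and carbohydrate channels (with corrupted glucose samples masked to zero) back to a reconstructed glucose trace. The channel layout, window length, and layer sizes are illustrative assumptions.

```python
# Sketch of 1D-CNN imputation: map a masked multi-channel CGM window back to a
# reconstructed glucose trace. Channel layout and sizes are assumptions.
import torch
import torch.nn as nn

class CNNImputer(nn.Module):
    def __init__(self, in_channels=3, hidden=32):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(in_channels, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=5, padding=2), nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel_size=1),   # reconstructed glucose channel
        )

    def forward(self, x):            # x: (batch, 3, window) = glucose/insulin/carbs
        return self.net(x).squeeze(1)

window = torch.randn(4, 3, 60)
window[:, 0, 20:25] = 0.0            # corrupted glucose segment masked to zero
print(CNNImputer()(window).shape)    # torch.Size([4, 60])
```
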

22 pages, 2550 KB  
Article
Lightweight Signal Processing and Edge AI for Real-Time Anomaly Detection in IoT Sensor Networks
by Manuel J. C. S. Reis
Sensors 2025, 25(21), 6629; https://doi.org/10.3390/s25216629 - 28 Oct 2025
Abstract
The proliferation of IoT devices has created vast sensor networks that generate continuous time-series data. Efficient and real-time processing of these signals is crucial for applications such as predictive maintenance, healthcare monitoring, and environmental sensing. This paper proposes a lightweight framework that combines classical signal processing techniques (Fourier and Wavelet-based feature extraction) with edge-deployed machine learning models for anomaly detection. By performing feature extraction and classification locally, the approach reduces communication overhead, minimizes latency, and improves energy efficiency in IoT nodes. Experiments with synthetic vibration, acoustic, and environmental datasets showed that the proposed Shallow Neural Network achieved the highest detection performance (F1-score 0.94), while a Quantized TinyML model offered a favorable trade-off (F1-score 0.92) with a 3× reduction in memory footprint and 60% lower energy consumption. Decision Trees remained competitive for ultra-constrained devices, providing sub-millisecond latency with limited recall. Additional analyses confirmed robustness against noise, missing data, and variations in anomaly characteristics, while ablation studies highlighted the contributions of each pipeline component. These results demonstrate the feasibility of accurate, resource-efficient anomaly detection at the edge, paving the way for practical deployment in large-scale IoT sensor networks. Full article
(This article belongs to the Special Issue Internet of Things Cybersecurity)
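A sketch of the lightweight feature pipeline under stated assumptions: FFT band energies and wavelet sub-band energies computed per window, fed to a small neural classifier. The window length, wavelet, band count, and classifier size are illustrative, and the data are synthetic stand-ins for the vibration or acoustic signals.

```python
# Sketch of the lightweight pipeline: FFT band energies plus wavelet sub-band
# energies per window, classified by a small MLP. Settings are assumptions.
import numpy as np
import pywt
from sklearn.neural_network import MLPClassifier

def features(window, n_bands=8, wavelet="db4", level=3):
    spec = np.abs(np.fft.rfft(window)) ** 2
    fft_feats = [band.sum() for band in np.array_split(spec, n_bands)]
    wav_feats = [np.sum(c ** 2) for c in pywt.wavedec(window, wavelet, level=level)]
    return np.array(fft_feats + wav_feats)

rng = np.random.default_rng(0)
X = np.stack([features(rng.normal(size=256) + (i % 2) * np.sin(np.arange(256)))
              for i in range(200)])
y = np.arange(200) % 2                          # 1 = anomalous tone present
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500).fit(X, y)
print(clf.score(X, y))
```
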
27 pages, 2162 KB  
Article
A Dual-Attention Temporal Convolutional Network-Based Track Initiation Method for Maneuvering Targets
by Hanbao Wu, Yiming Hao, Wei Chen and Mingli Liao
Electronics 2025, 14(21), 4215; https://doi.org/10.3390/electronics14214215 - 28 Oct 2025
Abstract
In strong clutter and maneuvering scenarios, radar track initiation faces the dual challenges of a low initiation rate and a high false alarm rate. Although existing deep learning methods show promise, the commonly adopted "feature flattening" input strategy destroys the intrinsic temporal structure and feature relationships of track data, limiting discriminative performance. To address this issue, this paper proposes a novel radar track initiation method based on a Dual-Attention Temporal Convolutional Network (DA-TCN), reformulating track initiation as a binary classification task for very short multi-channel time series that preserve the complete temporal structure. The DA-TCN model employs the TCN as its backbone network to extract local dynamic features and innovatively constructs a dual-attention architecture: a channel attention branch dynamically calibrates the importance of each kinematic feature, while a temporal attention branch integrates Bi-GRU and self-attention mechanisms to capture the dependencies at critical time steps. Ultimately, a learnable gated fusion mechanism adaptively weights the dual-branch information for an optimal characterization of the tracks. Experimental results on maneuvering target datasets demonstrate that the proposed method significantly outperforms multiple baseline models across varying clutter densities: under the highest clutter density, DA-TCN achieves a 95.12% true track initiation rate (+1.6% over the best baseline) with a 9.65% false alarm rate (a 3.63% reduction), validating its effectiveness for high-precision and highly robust track initiation in complex environments. Full article
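A compact sketch of the dual-attention idea under stated assumptions: a channel-attention branch reweights the kinematic features, a Bi-GRU temporal branch summarizes the short track, and a learnable gate fuses the two before a true-track versus false-alarm classifier. The TCN backbone and the self-attention sub-branch are omitted, and all sizes are placeholders.

```python
# Compact sketch of the dual-attention fusion: channel-attention branch,
# Bi-GRU temporal branch, learnable gated fusion, binary classifier.
# Sizes are assumptions; the TCN backbone is not reproduced.
import torch
import torch.nn as nn

class DualAttentionHead(nn.Module):
    def __init__(self, n_features=4, hidden=32, steps=4):
        super().__init__()
        self.chan = nn.Sequential(nn.Linear(n_features, n_features), nn.Sigmoid())
        self.gru = nn.GRU(n_features, hidden, batch_first=True, bidirectional=True)
        self.proj = nn.Linear(2 * hidden, n_features * steps)
        self.gate = nn.Parameter(torch.zeros(1))
        self.cls = nn.Linear(n_features * steps, 2)   # true track vs. false alarm

    def forward(self, x):                 # x: (batch, steps, features)
        w = self.chan(x.mean(dim=1))      # channel weights from time-averaged feats
        chan_branch = (x * w.unsqueeze(1)).flatten(1)
        out, _ = self.gru(x)
        temp_branch = self.proj(out[:, -1])
        g = torch.sigmoid(self.gate)      # learnable fusion weight
        return self.cls(g * chan_branch + (1 - g) * temp_branch)

x = torch.randn(8, 4, 4)                  # 4 scans, 4 kinematic features
print(DualAttentionHead()(x).shape)       # torch.Size([8, 2])
```
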

20 pages, 3485 KB  
Article
Deformation Pattern Classification of Sea-Crossing Bridge InSAR Time Series Based on a Transfer Learning Framework
by Lichen Ren, Chengyin Liu and Jinping Ou
Remote Sens. 2025, 17(21), 3567; https://doi.org/10.3390/rs17213567 - 28 Oct 2025
Abstract
Interferometric Synthetic Aperture Radar (InSAR) provides unique advantages for sea-crossing bridge monitoring through continuous, large-scale deformation detection. Dividing monitoring data into specific deformation patterns helps establish the connection between bridge deformation and its underlying mechanisms. However, the classification of complex and nonlinear bridge deformations often requires extensive manual labeling work. To achieve automatic classification of deformation patterns with minimal labeled data, this study introduces a transfer learning approach and proposes an InSAR-based method for deformation pattern recognition of sea-crossing bridges. First, deformation time series of the study area are acquired by PS-InSAR, with GNSS results confirming less than 10% error. Then, six types of deformation are identified: stable, linear, step, piecewise linear, power law, and temperature-related. Large amounts of labeled simulated data are generated based on these six types. Subsequently, four models (TCN, Transformer, TFT, and ROCKET) are trained using the synthetic data and fine-tuned using a small amount of real data. Finally, the classification results of the individual models are combined by weighting. Although the confidence and global consistency of each individual model are also calculated, the final result combines the confidences across model types. ROCKET achieved the highest accuracy on simulation data (96.27%) among the four representative models, while ensemble weighting improved robustness on real data. The methodology addresses supervised learning's labeled data requirements through synthetic data generation and ensemble classification, producing probabilistic outputs that preserve uncertainty information rather than deterministic labels. The framework enables automatic classification of sea-crossing bridge deformation patterns with minimal labeled data, identifying patterns with distinct dominant factors and providing probabilistic information for engineering decision making. Full article
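A sketch of the ensemble-weighting step, assuming the sequence classifiers have already been trained on synthetic series: each model is weighted by its accuracy on a handful of labeled real points, and the final label is the weighted average of the class-probability outputs. Generic sklearn classifiers stand in for TCN, Transformer, TFT, and ROCKET, and the data are synthetic stand-ins.

```python
# Sketch of the ensemble step: weight models trained on synthetic series by
# their accuracy on a few labeled real points, then average class probabilities.
# Generic sklearn classifiers stand in for TCN / Transformer / TFT / ROCKET.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X_syn, y_syn = rng.normal(size=(600, 30)), rng.integers(0, 6, 600)   # simulated series, 6 pattern classes
X_real, y_real = rng.normal(size=(20, 30)), rng.integers(0, 6, 20)   # few labeled PS points

models = [RandomForestClassifier(random_state=0).fit(X_syn, y_syn),
          LogisticRegression(max_iter=1000).fit(X_syn, y_syn)]
weights = np.array([m.score(X_real, y_real) for m in models])        # accuracy on real labels
weights = weights / weights.sum()

proba = sum(w * m.predict_proba(X_real) for w, m in zip(weights, models))
print(proba.argmax(axis=1))               # ensemble deformation-pattern labels
```
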
22 pages, 3835 KB  
Article
Phenology-Guided Wheat and Corn Identification in Xinjiang: An Improved U-Net Semantic Segmentation Model Using PCA and CBAM-ASPP
by Yang Wei, Xian Guo, Yiling Lu, Hongjiang Hu, Fei Wang, Rongrong Li and Xiaojing Li
Remote Sens. 2025, 17(21), 3563; https://doi.org/10.3390/rs17213563 - 28 Oct 2025
Abstract
Wheat and corn are two major food crops in Xinjiang. However, the spectral similarity between these crop types and the complexity of their spatial distribution have posed significant challenges to accurate crop identification. To this end, the study aimed to improve the accuracy of crop distribution identification in complex environments in three ways. First, by analysing the kNDVI and EVI time series, the optimal identification window was determined to be days 156–176, a period when wheat is in the grain-filling to milk-ripening phase and maize is in the jointing to tillering phase, during which the strongest spectral differences between the two crops occur. Second, principal component analysis (PCA) was applied to Sentinel-2 data. The top three principal components were extracted to construct the input dataset, effectively integrating visible and near-infrared band information. This approach suppressed redundancy and noise while replacing traditional RGB datasets. Finally, the Convolutional Block Attention Module (CBAM) was integrated into the U-Net model to enhance feature focusing on key crop areas. An improved Atrous Spatial Pyramid Pooling (ASPP) module based on depthwise separable convolutions was adopted to reduce the computational load while boosting multi-scale context awareness. The experimental results showed the following: (1) Wheat and corn exhibit obvious phenological differences between the 156th and 176th days of the year, which can be used as the optimal time window for identifying their spatial distributions. (2) The method proposed by this research had the best performance, with its mIoU, mPA, F1-score, and overall accuracy (OA) reaching 83.03%, 91.34%, 90.73%, and 90.91%, respectively. Compared to DeeplabV3+, PSPnet, HRnet, Segformer, and U-Net, the OA improved by 5.97%, 4.55%, 2.03%, 8.99%, and 1.5%, respectively. The recognition accuracy of the PCA dataset improved by approximately 2% compared to the RGB dataset. (3) This strategy still had high accuracy when predicting wheat and corn distributions in Qitai County, Xinjiang, showing a certain degree of generalisability. In summary, the improved strategy proposed in this study holds considerable application potential for identifying the spatial distribution of wheat and corn in arid regions. Full article
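A sketch of the PCA step under stated assumptions: flatten a multispectral stack to (pixels, bands), keep the top three principal components, and reshape them into a 3-channel image that replaces the RGB input of the segmentation network. Band count and tile size are placeholders, not the study's actual Sentinel-2 configuration.

```python
# Sketch of the PCA preprocessing: reduce a multispectral stack to its top three
# principal components and reshape them into a 3-channel network input.
import numpy as np
from sklearn.decomposition import PCA

bands, height, width = 10, 256, 256                 # placeholder Sentinel-2 stack
cube = np.random.rand(bands, height, width).astype(np.float32)

flat = cube.reshape(bands, -1).T                    # (pixels, bands)
pcs = PCA(n_components=3).fit_transform(flat)       # (pixels, 3)
pc_image = pcs.T.reshape(3, height, width)          # 3-channel input for U-Net
print(pc_image.shape)                               # (3, 256, 256)
```
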

30 pages, 7894 KB  
Article
Polyacrylamide and Polyacrylamide/Polysaccharide Hydrogels for Well Water Shutoff in High-Temperature Reservoirs
by Aleksey Telin, Natalia Sergeeva, Rustem Asadullin, Ekaterina Gusarova, Ravil Yakubov, Vladimir Dokichev, Anatoly Politov, Elina Sunagatova, Natalia Gibadullina, Galina Teptereva and Lyubov Lenchenkova
Gels 2025, 11(11), 862; https://doi.org/10.3390/gels11110862 (registering DOI) - 28 Oct 2025
Abstract
Polyacrylamide and polyacrylamide/polysaccharide hydrogels exhibiting high structural and mechanical properties, along with acceptable gelation times and gelant viscosity, are proposed for water shutoff applications in high-temperature reservoirs. The obtained polyacrylamide gels demonstrate an elastic modulus 1.6–2.7 times higher than that of the baseline polyacrylamide–resorcinol–paraform–sulfamic acid gel (17.2 Pa), reaching up to 46.3 Pa, while the polyacrylamide/polysaccharide gels surpass it by a factor of 2.3–5.2, reaching up to 89.9 Pa. The gelation time of the polyacrylamide/polysaccharide gels ranges from 3 to 7 h, with the gelant viscosity varying from 685 to 2098 mPa·s at a shear rate of 100 s⁻¹. Crosslinking of polyacrylamide with polysaccharides was achieved using paraform. Using the gel based on polyacrylamide crosslinked with xanthan as an example, spectroscopic methods were used to characterize the copolymer that forms the basis of the plugging material. Our analysis established that crosslinking occurs between the amide group of polyacrylamide and the hydroxyl group of the polysaccharide. Model reactions with low-molecular-weight analogs (glucose, acetamide, and formaldehyde), coupled with mass spectrometric confirmation of the structure of the resulting products, revealed possible reaction pathways. The crosslinking of polyacrylamide was investigated using a broad range of polysaccharides of plant and microbiological origin. The resulting series of hydrogels, possessing the suite of properties required for water shutoff in high-temperature formations, will enable oil companies (operators) and service firms to select a specific gel-forming system based on project objectives, logistics, and budget constraints. Full article
(This article belongs to the Section Gel Chemistry and Physics)

19 pages, 2431 KB  
Article
Predicting the Remaining Service Life of Power Transformers Using Machine Learning
by Zimo Gao, Binkai Yu, Jiahe Guang, Shanghua Jiang, Xinze Cong, Minglei Zhang and Lin Yu
Processes 2025, 13(11), 3459; https://doi.org/10.3390/pr13113459 - 28 Oct 2025
Abstract
In response to the insufficient adaptability of power transformer remaining useful life (RUL) prediction under complex working conditions and the difficulty of multi-scale feature fusion, this study proposes an industrial time-series prediction model based on a parallel Transformer–BiGRU–GlobalAttention architecture. The parallel Transformer encoder captures long-range temporal dependencies, the BiGRU network enhances local sequence associations through bidirectional modeling, the global attention mechanism dynamically weights key temporal features, and cross-attention achieves spatiotemporal feature interaction and fusion. Experiments were conducted on the public ETT transformer temperature dataset, employing sliding window and piecewise linear label processing techniques, with MAE, MSE, and RMSE as evaluation metrics. The results show that the model achieved excellent predictive performance on the test set, with an MSE of 0.078, an MAE of 0.233, and an RMSE of 11.13. Compared with traditional LSTM, CNN-BiGRU-Attention, and other methods, the model achieved improvements of 17.2%, 6.0%, and 8.9%, respectively. Ablation experiments verified that the global attention mechanism rationalizes the feature contribution distribution, with the core temporal feature OT having a contribution rate of 0.41. Multiple experiments demonstrated that this method has higher precision than the compared methods. Full article
(This article belongs to the Section Energy Systems)
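A sketch of the data preparation mentioned in the abstract, under stated assumptions: sliding windows over the sensor record and piecewise-linear RUL labels that stay flat early in life and then decay linearly to zero. Window length and the RUL cap are placeholders.

```python
# Sketch of sliding-window inputs and piecewise-linear RUL labels
# (constant early in life, then linear count-down to failure).
import numpy as np

def sliding_windows(series, window):
    return np.stack([series[i:i + window] for i in range(len(series) - window)])

def piecewise_linear_rul(n_samples, cap):
    rul = np.arange(n_samples)[::-1].astype(float)   # linear count-down to failure
    return np.minimum(rul, cap)                      # flat "healthy" plateau early on

signal = np.random.rand(500)                         # stand-in sensor record
X = sliding_windows(signal, window=48)
y = piecewise_linear_rul(len(signal), cap=130)[48:]  # label = RUL at each window end
print(X.shape, y.shape)                              # (452, 48) (452,)
```
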

12 pages, 996 KB  
Article
Integrating 1D-CNN and Bi-GRU for ENF-Based Video Tampering Detection
by Xiaodan Lin and Xinhuan Zang
Sensors 2025, 25(21), 6612; https://doi.org/10.3390/s25216612 - 28 Oct 2025
Abstract
Electric network frequency (ENF) refers to the transmission frequency of a power grid, which fluctuates around 50 Hz or 60 Hz. Videos captured in a power grid environment may exhibit flickering artifacts caused by intensity variations in the light source, with the flicker following the ENF fluctuation. This flicker, notable for its temporal dynamics and quasi-periodic property, acts as an effective means for video tampering forensics. However, ground-truth ENF databases are often unavailable in a real-world authentication setting, posing challenges for conducting ENF examination in video forensics. In addition, dynamic scenes in videos also increase the difficulty of anomaly detection in ENF signals. To address these challenges, we propose a neural-network-based approach to detect inter-frame tampering in CMOS videos that incorporate ENF signals. To the best of our knowledge, this is the first work that deploys a data-driven approach for ENF-based video forensics. Without the aid of a reference ENF dataset, we exploit the implicit ENF variation in luminance signals and transform the video signal into a one-dimensional time series utilizing ENF priors. In addition, to alleviate the impact of moving objects, which also cause variations in the luminance signal, a preprocessing stage is proposed. On this basis, we design an anomaly detection model combining 1D-CNN and Bi-GRU to conduct experiments on static and dynamic video datasets. The experimental results demonstrate the effectiveness of our proposed method in inter-frame video tampering detection, implying its potential as a forensic tool for ENF-based video analysis. Full article
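A sketch of the preprocessing idea under stated assumptions: collapse each frame to a single mean-luminance value so the video becomes a one-dimensional series carrying the ENF-induced flicker, detrend it, and cut it into fixed-length windows for the detector. Frame size, window length, and the moving-average detrend are illustrative choices, not the paper's exact pipeline.

```python
# Sketch: per-frame mean luminance -> 1D series -> detrend -> fixed-length windows
# for a 1D-CNN + Bi-GRU detector. All sizes are illustrative assumptions.
import numpy as np

frames = np.random.rand(300, 120, 160)            # (frames, height, width) luminance
series = frames.mean(axis=(1, 2))                 # one luminance value per frame
series = series - np.convolve(series, np.ones(25) / 25, mode="same")  # remove slow scene drift

window = 64
segments = np.stack([series[i:i + window] for i in range(0, len(series) - window, window)])
print(segments.shape)                             # (4, 64) windows for the detector
```
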

26 pages, 5287 KB  
Article
Multi-Point Seawall Settlement Modeling Using DTW-Based Hierarchical Clustering and AJSO-LSTM Method
by Chunmei Ding, Xian Liu, Zhenzhu Meng and Yadong Liu
J. Mar. Sci. Eng. 2025, 13(11), 2053; https://doi.org/10.3390/jmse13112053 - 27 Oct 2025
Abstract
Seawall settlement is a critical concern in marine engineering, as excessive or uneven settlement can undermine structural stability and diminish the capacity to withstand marine hydrodynamic actions such as storm surges, waves, and tides. Accordingly, accurate settlement prediction is vital to ensuring seawall safety. To address the lack of clustering methods that capture the time-series characteristics of monitoring points and the hyperparameter sensitivity of conventional LSTM models, this study proposes a hybrid model integrating Dynamic Time Warping-based Hierarchical Clustering (DTW-HC) and an Adaptive Joint Search Optimization-enhanced Long Short-Term Memory model (AJSO-LSTM). First, DTW-HC is employed to cluster monitoring points according to their time-series characteristics, thereby constructing a spatial panel data structure that incorporates both temporal evolution and spatial heterogeneity. Then, an AJSO-LSTM model is developed within each cluster to capture temporal dependencies and improve prediction performance by overcoming the weaknesses of a conventional LSTM. Finally, using seawall settlement monitoring data from a real engineering case, the proposed method is validated by comparing it with a statistical model, a back-propagation neural network (BP-ANN), and a conventional LSTM. Results demonstrate that the proposed model consistently outperforms these three benchmark methods in terms of prediction accuracy and robustness. This confirms the potential of the proposed framework as an effective tool for seawall safety management and long-term service evaluation. Full article
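A sketch of the clustering stage under stated assumptions: a plain dynamic-programming DTW distance between the settlement series of each pair of monitoring points, followed by hierarchical clustering on the condensed distance matrix. The series, the number of points, and the cluster count are synthetic stand-ins; one AJSO-LSTM would then be fit per resulting cluster.

```python
# Sketch of the DTW-based hierarchical clustering of monitoring points.
# Series and cluster count are synthetic stand-ins.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw(a, b):
    """Plain dynamic-programming DTW distance between two 1D series."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

rng = np.random.default_rng(0)
points = [np.cumsum(rng.normal(0.1 * (k % 3), 0.5, 60)) for k in range(12)]  # 12 monitoring points
dist = np.array([[dtw(p, q) for q in points] for p in points])

labels = fcluster(linkage(squareform(dist, checks=False), method="average"),
                  t=3, criterion="maxclust")
print(labels)   # cluster id per monitoring point; one model is then fit per cluster
```
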

17 pages, 2247 KB  
Article
Retrospective Analysis and Cross-Validated Forecasting of West Nile Virus Transmission in Italy: Insights from Climate and Surveillance Data
by Francesco Branda, Mohamed Mustaf Ahmed, Dong Keon Yon, Giancarlo Ceccarelli, Massimo Ciccozzi and Fabio Scarpa
Trop. Med. Infect. Dis. 2025, 10(11), 305; https://doi.org/10.3390/tropicalmed10110305 - 27 Oct 2025
Abstract
Background. West Nile Virus (WNV) represents a significant public health concern in Europe, with Italy—particularly its northern regions—experiencing recurrent outbreaks. Climate variables and vector dynamics are known to significantly influence transmission patterns, highlighting the need for reliable predictive models to enable timely outbreak detection and response. Methods. We integrated epidemiological data on human WNV infections in Italy (2012–2024) with high-resolution climate variables (temperature, humidity, and precipitation). Using advanced feature engineering and a gradient boosting framework (XGBoost), we developed a predictive model optimized through time-series cross-validation. Results. The model achieved high predictive accuracy at the national level (R2 = 0.994, MAPE = 5.16%) and maintained robust performance across the five most affected provinces, with R2 values ranging from 0.896 to 0.996. SHAP analysis identified minimum temperature as the most influential climate predictor, while maximum temperature and rainfall demonstrated considerably weaker associations with case incidence. Conclusions. This machine learning approach provides a reliable framework for forecasting WNV outbreaks and supports evidence-based public health responses. The integration of climate and epidemiological data enhances surveillance capabilities and enables informed decision-making at regional and local levels. Full article
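A sketch of the forecasting setup described in the abstract: lagged-case and climate features, an XGBoost regressor, and expanding-window time-series cross-validation scored with MAPE. The feature names, lag choices, and hyperparameters are placeholders, not the study's tuned configuration.

```python
# Sketch: lagged-case + climate features, XGBoost regression, and
# time-series cross-validation. Features and hyperparameters are placeholders.
import numpy as np
import pandas as pd
from sklearn.model_selection import TimeSeriesSplit
from sklearn.metrics import mean_absolute_percentage_error
from xgboost import XGBRegressor

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "tmin": rng.normal(15, 5, 160), "tmax": rng.normal(27, 5, 160),
    "rain": rng.gamma(2, 3, 160), "cases": (rng.poisson(4, 160) + 1).astype(float),
})
df["cases_lag1"] = df["cases"].shift(1)
df["cases_lag2"] = df["cases"].shift(2)
df = df.dropna()

X, y = df.drop(columns="cases"), df["cases"]
scores = []
for train_idx, test_idx in TimeSeriesSplit(n_splits=5).split(X):
    model = XGBRegressor(n_estimators=300, max_depth=4, learning_rate=0.05)
    model.fit(X.iloc[train_idx], y.iloc[train_idx])
    scores.append(mean_absolute_percentage_error(y.iloc[test_idx],
                                                 model.predict(X.iloc[test_idx])))
print(np.mean(scores))   # average out-of-sample MAPE across folds
```
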
