Editorial

Advanced AI and Machine Learning Techniques for Time Series Analysis and Pattern Recognition

by Antonio Pagliaro 1,2,3,*, Antonio Alessio Compagnino 1,3 and Pierluca Sangiorgi 1,*
1 INAF IASF Palermo, Via Ugo La Malfa 153, I-90146 Palermo, Italy
2 Istituto Nazionale di Fisica Nucleare Sezione di Catania, Via Santa Sofia 64, I-95123 Catania, Italy
3 ICSC—Centro Nazionale di Ricerca in HPC, Big Data e Quantum Computing, I-40121 Bologna, Italy
* Authors to whom correspondence should be addressed.
Appl. Sci. 2025, 15(6), 3165; https://doi.org/10.3390/app15063165
Submission received: 10 March 2025 / Accepted: 11 March 2025 / Published: 14 March 2025

Abstract

Time series analysis and pattern recognition are cornerstones for innovation across diverse domains. In finance, these techniques enable market prediction and risk assessment. Astrophysicists use them to detect various phenomena and analyze data. Environmental scientists track ecosystem changes and pollution patterns, while healthcare professionals monitor patient vitals and disease progression. Transportation systems optimize traffic flow and predict maintenance needs. Energy providers balance grid loads and forecast consumption. Climate scientists model atmospheric changes and extreme weather events. Cybersecurity experts identify threats through anomaly detection in network traffic patterns. This editorial introduces this Special Issue, which explores state-of-the-art AI and machine learning (ML) techniques, including Long Short-Term Memory (LSTM) networks, Transformers, ensemble methods, and AutoML frameworks. We highlight innovative applications in data-driven finance, astrophysical event reconstruction, cloud masking, and healthcare monitoring. Recent advancements in feature engineering, unsupervised learning frameworks for cloud masking, and Transformer-based time series forecasting demonstrate the potential of these technologies. The papers collected in this Special Issue showcase how integrating domain-specific knowledge with computational innovations provides a pathway to achieving higher accuracy in time series analysis across various scientific disciplines.

1. Introduction

The rapid growth of time series data across domains has driven the adoption of advanced AI and ML techniques to address challenges such as non-linearity, high dimensionality, and noise. Traditional statistical methods often fall short in capturing the complex temporal dependencies inherent in modern datasets. Advanced ML approaches such as LSTM networks, Transformers, ensemble methods, and AutoML frameworks have revolutionized time series analysis by enabling accurate forecasting, anomaly detection, and pattern recognition.
Applications span many domains, including financial forecasting [1,2,3,4], astrophysical event reconstruction [5,6,7,8,9], cloud masking in satellite imagery [10,11,12], patient health monitoring [13,14], traffic flow optimization [15,16], energy demand prediction [17], climate change analysis [18,19], and cybersecurity threat detection [20,21]. Recent advancements include unsupervised frameworks such as Auto-CM [22] for cloud masking, Fourier-based feature engineering for financial applications [23], and Transformer models for ECG analysis [24]. These innovations underscore the importance of integrating domain-specific knowledge with computational advances to tackle real-world challenges.
This Special Issue aims to bridge the gap between theoretical advancements and practical implementations by presenting cutting-edge research that demonstrates the versatility and efficacy of AI/ML approaches in time series analysis across multiple domains.

2. Machine Learning Techniques for Time Series Analysis

2.1. Recurrent Neural Networks (RNNs) and LSTMs

Long Short-Term Memory (LSTM) networks have transformed how we model and predict patterns in sequential data. These networks maintain an internal memory state that allows them to recognize meaningful dependencies in data that unfold over time. As noted in recent work by Yakymiv [25], this capability has proven remarkably effective for predicting energy demand, among many other applications. What makes LSTMs so effective is their solution to a fundamental challenge in machine learning: retaining important information over extended sequences. Through specialized gating mechanisms, these networks selectively remember critical patterns while discarding irrelevant noise, loosely analogous to human memory but with mathematical precision. Despite requiring significant computational resources, LSTMs have become essential predictive tools. Their ability to model complex, non-linear relationships in time-dependent data continues to open new frontiers in fields ranging from climate science to healthcare.
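To make the gating and memory mechanism concrete, the following Python sketch shows a minimal LSTM forecaster in PyTorch. It is an illustration only: the window length, hidden size, and one-step-ahead setup are our own assumptions and do not reproduce any model from the cited works.

# A minimal sketch of a univariate LSTM forecaster (illustrative assumptions only).
import torch
import torch.nn as nn

class LSTMForecaster(nn.Module):
    def __init__(self, n_features=1, hidden_size=64, num_layers=2):
        super().__init__()
        # The LSTM's input, forget, and output gates decide what is kept in the
        # cell state across time steps (the "memory" discussed above).
        self.lstm = nn.LSTM(n_features, hidden_size, num_layers, batch_first=True)
        self.head = nn.Linear(hidden_size, 1)  # predict the next value

    def forward(self, x):
        # x: (batch, window_length, n_features)
        out, _ = self.lstm(x)
        return self.head(out[:, -1, :])        # use the final hidden state

model = LSTMForecaster()
x = torch.randn(32, 48, 1)                     # 32 windows of 48 past observations
y_hat = model(x)                               # (32, 1) one-step-ahead forecasts
loss = nn.MSELoss()(y_hat, torch.randn(32, 1))
loss.backward()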

2.2. Transformer Models

Though initially designed for natural language processing tasks, Transformers have demonstrated remarkable effectiveness in time series analysis. These models utilize self-attention mechanisms to capture relationships across temporal data points, providing significant advantages over traditional recurrent neural networks (RNNs) in scenarios requiring the modeling of long-range dependencies. Recent research by Logunova [26] documented the superior performance of Transformer architectures in traffic flow forecasting applications. In the financial sector, Qian (2025) [27] has adapted these models to capture complex temporal patterns in stock price movements, further validating their versatility across domains. The key innovation behind Transformers’ success lies in their self-attention mechanism, which allows the model to assign variable importance to different time steps when generating predictions. This capability proves particularly valuable for time series data where relevant information may be distributed across distant temporal points. Unlike RNNs, which process sequential information step by step, Transformers can directly model relationships between any points in a sequence, regardless of their distance from each other. This architectural advantage enables Transformers to identify subtle patterns and correlations within time series data that might otherwise remain undetected, making them increasingly valuable tools for researchers and practitioners working with temporal datasets across various scientific and industrial applications.
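The following minimal Python sketch computes scaled dot-product self-attention over the time steps of a batch of series, to make the weighting mechanism explicit. The random projections and shapes are illustrative assumptions; a full Transformer would also add positional encodings, multiple heads, and learned projection layers.

# Scaled dot-product self-attention over time steps (illustrative sketch).
import torch
import torch.nn.functional as F

def self_attention(x, d_k=32):
    # x: (batch, seq_len, d_model); in a real model these would be nn.Linear layers
    W_q = torch.randn(x.size(-1), d_k)
    W_k = torch.randn(x.size(-1), d_k)
    W_v = torch.randn(x.size(-1), d_k)
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    # Each time step attends to every other step, regardless of temporal distance,
    # unlike an RNN's step-by-step recurrence.
    scores = q @ k.transpose(-2, -1) / d_k ** 0.5
    weights = F.softmax(scores, dim=-1)        # (batch, seq_len, seq_len)
    return weights @ v, weights

x = torch.randn(8, 96, 16)                     # 8 series, 96 time steps, 16 features
context, attn = self_attention(x)
print(attn.shape)                              # torch.Size([8, 96, 96])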

2.3. Ensemble Methods

Ensemble methods like Random Forests and Extra Trees Classifiers combine multiple models to enhance predictive accuracy. In finance, these methods have demonstrated superior performance in forecasting stock price changes by capturing complex feature interactions [4]. Ensemble learning has also been applied to healthcare time series data for early disease detection [28].
A common approach to ensemble modeling for time series is stacking, where predictions from multiple base models are combined using a meta-learner. This technique integrates outputs from diverse forecasting methods such as ARIMA (statistical modeling), Prophet (decomposition-based forecasting), and LSTMs (deep learning). In this framework, each base model independently processes the input data to generate its prediction. A meta-learner then takes these individual predictions as inputs and produces a final forecast that typically outperforms any single constituent model. This methodology creates a powerful synergy by combining traditional statistical approaches with advanced deep learning techniques. Statistical models often excel at capturing seasonal patterns and trends, while deep learning models can identify complex non-linear relationships in the data. The meta-learner weighs the strengths of each model appropriately, compensating for individual weaknesses and producing more robust and accurate forecasts. The stacking approach has proven particularly valuable in domains with complex time-dependent patterns where no single modeling technique consistently dominates, such as financial forecasting, energy demand prediction, and retail sales forecasting.
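The sketch below illustrates the stacking step in Python with scikit-learn. The base-model forecasts are synthetic placeholders standing in for out-of-sample ARIMA, Prophet, and LSTM predictions; the Ridge meta-learner and the chronological train/test split are assumptions made for this example.

# Stacking sketch: base-model forecasts become features for a meta-learner.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)
y = np.sin(np.linspace(0, 20, 500)) + 0.1 * rng.standard_normal(500)

# Stand-in out-of-sample forecasts from three (assumed already fitted) base models.
pred_arima   = y + 0.20 * rng.standard_normal(500)
pred_prophet = y + 0.15 * rng.standard_normal(500)
pred_lstm    = y + 0.10 * rng.standard_normal(500)
X_meta = np.column_stack([pred_arima, pred_prophet, pred_lstm])

# Respect temporal order: train the meta-learner on the past, evaluate on the future.
split = 400
meta = Ridge(alpha=1.0).fit(X_meta[:split], y[:split])
stacked = meta.predict(X_meta[split:])
print("stacked MAE:", np.abs(stacked - y[split:]).mean())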

2.4. Unsupervised Learning Frameworks

The Auto-CM framework represents a significant advancement in unsupervised learning for satellite imagery cloud masking. By leveraging spatio-temporal dynamics, Auto-CM outperforms traditional physics-based methods and supervised ML models on diverse datasets [22]. The framework employs contrastive learning to identify cloud patterns without explicit labels, making it particularly valuable for regions with limited ground data.
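For readers unfamiliar with contrastive objectives, the Python sketch below shows a generic NT-Xent-style loss that pulls together embeddings of two views of the same image patch. It is included only to illustrate label-free representation learning; it is not the actual Auto-CM objective, whose formulation is detailed in [22].

# Generic contrastive (NT-Xent-style) loss sketch; not the Auto-CM implementation.
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature=0.5):
    # z1, z2: embeddings of two augmented views of the same patches, shape (N, d)
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    z = torch.cat([z1, z2], dim=0)                     # (2N, d)
    sim = z @ z.t() / temperature                      # pairwise similarities
    sim.fill_diagonal_(float("-inf"))                  # exclude self-pairs
    n = z1.size(0)
    # Each embedding's positive is the other view of the same patch.
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

z1, z2 = torch.randn(16, 128), torch.randn(16, 128)
print(nt_xent(z1, z2))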

3. Applications Across Domains

The versatility of advanced time series analysis techniques has led to their adoption across diverse scientific disciplines. The following examples highlight successful implementations that demonstrate the transformative impact of these methodologies. While these applications represent areas where AI and ML have made substantial contributions to time series analysis, the list is by no means exhaustive, as new applications continue to emerge across research and industry.

3.1. Data-Driven Finance

In finance, ML techniques are used for predicting significant market movements, risk assessment, portfolio optimization, and fraud detection. The Extra Trees Classifier has been particularly effective in forecasting stock price changes by capturing complex feature interactions [4]. Fourier transform-based feature engineering has further improved the accuracy of financial forecasting models by uncovering hidden periodicities in stock market data [23,29,30].
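The following Python sketch shows one way such a pipeline can be assembled, in the spirit of [4,23]: low-frequency Fourier magnitudes of a rolling price window feed an Extra Trees Classifier that predicts whether the price rises over a short horizon. The synthetic data, window length, number of retained frequencies, and labeling rule are all illustrative assumptions rather than the configurations used in the cited studies.

# Fourier-based features + Extra Trees Classifier (illustrative sketch).
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(1)
prices = np.cumsum(rng.standard_normal(2000)) + 100    # synthetic price series

window, horizon, k = 64, 5, 8
X, y = [], []
for t in range(window, len(prices) - horizon):
    segment = prices[t - window:t]
    spectrum = np.abs(np.fft.rfft(segment - segment.mean()))
    X.append(spectrum[:k])                              # low-frequency magnitudes as features
    y.append(int(prices[t + horizon] > prices[t]))      # 1 if price rises over the horizon
X, y = np.array(X), np.array(y)

split = int(0.8 * len(X))                               # chronological split, no shuffling
clf = ExtraTreesClassifier(n_estimators=200, random_state=0)
clf.fit(X[:split], y[:split])
print("hold-out accuracy:", clf.score(X[split:], y[split:]))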

3.2. Astrophysics

Astrophysical research has benefited from ML methods for event reconstruction in Imaging Atmospheric Cherenkov Telescopes (IACTs). Ensemble methods have enhanced the accuracy of Cherenkov event classification in the ASTRI Mini-Array project [8], enabling more precise studies of high-energy cosmic phenomena.

3.3. Cloud Masking in Satellite Imagery

Cloud masking is critical for improving the quality of satellite-based Earth observation. Recent advancements include Auto-CM [22], which uses unsupervised deep learning to outperform existing cloud masking methods across diverse geographic regions. Additionally, deep learning models like U-Net and Mask R-CNN have been employed to detect and replace cloud-contaminated pixels, improving the accuracy of climate variable retrievals [31].

3.4. Healthcare Monitoring

Time series analysis has become essential in healthcare for monitoring patient vital signs and predicting disease progression. Transformer models have been applied to analyze electrocardiogram (ECG) data with high accuracy [24]. Ensemble methods have also been used to detect anomalies in patient health records, enabling early intervention.

4. Emerging Trends and Future Directions

4.1. Explainable AI for Time Series

As AI/ML models become increasingly complex, the need for explainability has grown in importance, especially in critical domains like healthcare and finance. Recent research has focused on developing techniques such as SHAP (SHapley Additive exPlanations) values and LIME (Local Interpretable Model-agnostic Explanations) specifically adapted for time series data, allowing practitioners to understand which temporal patterns most influence model predictions.
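As a simple illustration of how SHAP can be applied to temporal data, the Python sketch below trains a tree-based forecaster on lag features and inspects the average contribution of each lag. It requires the shap package; the lag count, model choice, and synthetic series are assumptions made only for this example.

# SHAP on a lag-feature forecaster: which temporal offsets drive predictions?
import numpy as np
import shap
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
series = np.sin(np.linspace(0, 60, 1200)) + 0.1 * rng.standard_normal(1200)

n_lags = 12
X = np.array([series[t - n_lags:t] for t in range(n_lags, len(series))])
y = series[n_lags:]

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X[:50])             # (50, n_lags)

# Mean absolute contribution per lag: larger values mean more influential offsets.
print(np.abs(shap_values).mean(axis=0))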

4.2. Transfer Learning for Limited Data Scenarios

Transfer learning approaches, where models pre-trained on large datasets are fine-tuned for specific tasks with limited data, are gaining traction in time series analysis. This approach has shown promise in domains where labeled data are scarce or expensive to obtain, such as fault detection in industrial equipment or rare disease diagnosis from medical time series.
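A common recipe is to freeze a pretrained encoder and fine-tune only a small task-specific head on the scarce labeled data, as in the minimal PyTorch sketch below. The convolutional encoder and the fault-detection labels here are stand-ins; in practice the encoder weights would be loaded from a checkpoint pretrained on a large source dataset.

# Transfer learning sketch: frozen encoder, fine-tuned task head (stand-in encoder).
import torch
import torch.nn as nn

encoder = nn.Sequential(                  # stand-in for a pretrained feature extractor
    nn.Conv1d(1, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.Conv1d(32, 32, kernel_size=5, padding=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
# encoder.load_state_dict(torch.load("pretrained_encoder.pt"))  # hypothetical checkpoint
for p in encoder.parameters():
    p.requires_grad = False               # keep pretrained weights fixed

head = nn.Linear(32, 2)                   # new task, e.g. fault vs. normal
optimizer = torch.optim.Adam(head.parameters(), lr=1e-3)

x = torch.randn(16, 1, 128)               # a small labeled batch of 16 windows
labels = torch.randint(0, 2, (16,))
logits = head(encoder(x))
loss = nn.CrossEntropyLoss()(logits, labels)
loss.backward()
optimizer.step()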

4.3. Federated Learning for Privacy-Preserving Analysis

Federated learning enables model training across multiple decentralized devices or servers while keeping data localized, addressing privacy concerns in sensitive domains. This approach is particularly relevant for healthcare applications where patient data privacy is paramount but collaborative model improvement is beneficial.
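The Python sketch below outlines federated averaging (FedAvg), the canonical procedure behind this idea: each client updates a copy of the global model on its own data, and only the resulting weights are sent back and averaged. The tiny linear model and random client data are placeholders used purely for illustration.

# FedAvg sketch: clients share weights, never raw data (placeholder model and data).
import copy
import torch
import torch.nn as nn

def local_update(global_model, x, y, epochs=1, lr=0.01):
    model = copy.deepcopy(global_model)               # start from the global weights
    opt = torch.optim.SGD(model.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.MSELoss()(model(x), y)
        loss.backward()
        opt.step()
    return model.state_dict()

global_model = nn.Linear(10, 1)
clients = [(torch.randn(64, 10), torch.randn(64, 1)) for _ in range(5)]

for _ in range(3):                                    # communication rounds
    states = [local_update(global_model, x, y) for x, y in clients]
    avg = {k: torch.stack([s[k] for s in states]).mean(dim=0) for k in states[0]}
    global_model.load_state_dict(avg)                 # server aggregates weights only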

5. Conclusions

The convergence of artificial intelligence and machine learning with time series analysis has catalyzed transformative advances across scientific disciplines. Revolutionary approaches—including Transformer architectures, sophisticated ensemble methodologies, AutoML frameworks, and unsupervised systems like Auto-CM—have expanded our analytical capabilities and opened new frontiers for tackling previously intractable challenges. The contributions presented in this Special Issue not only document the current state of the art but also illuminate promising research trajectories. While significant progress has been achieved, several critical challenges persist. These include developing more efficient computational approaches for high-dimensional time series, mitigating data limitations through advanced transfer learning techniques, and enhancing model interpretability to facilitate adoption in sensitive domains where algorithmic transparency is paramount. As we look forward, the integration of domain-specific expertise with algorithmic innovation promises to accelerate progress in this field. The interdisciplinary collaboration between domain scientists and AI researchers continues to remove traditional barriers, suggesting that the most significant breakthroughs may emerge at these intersections. This evolving synthesis will likely yield increasingly sophisticated analytical tools capable of extracting deeper insights from temporal data, ultimately advancing our understanding of complex dynamic systems across scientific domains.

Acknowledgments

The authors acknowledge supercomputing resources and support from ICSC—Centro Nazionale di Ricerca in High Performance Computing, Big Data and Quantum Computing—and its hosting entity, funded by the European Union—NextGenerationEU.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Fu, T.C.; Chung, F.L.; Luk, R.; Ng, C.M. Preventing meaningless stock time series pattern discovery by changing perceptually important point detection. In Proceedings of the International Conference on Fuzzy Systems and Knowledge Discovery, Changsha, China, 27–29 August 2005; pp. 1171–1174. [Google Scholar]
  2. Markowska-Kaczmar, U.; Dziedzic, M. Discovery of technical analysis patterns. In Proceedings of the International Multiconference on Computer Science and Information Technology, Wisla, Poland, 20–22 October 2008; pp. 137–142. [Google Scholar]
  3. Chen, T.L.; Chen, F.Y. An intelligent pattern recognition model for supporting investment decisions in stock market. Inf. Sci. 2016, 346, 261–274. [Google Scholar] [CrossRef]
  4. Pagliaro, A. Forecasting Significant Stock Market Price Changes Using Machine Learning: Extra Trees Classifier Leads. Electronics 2023, 12, 4551. [Google Scholar] [CrossRef]
  5. Sahakyan, N. AI in the Cosmos. arXiv 2024, arXiv:2412.10093. [Google Scholar] [CrossRef]
  6. Olivares, E.; Curé, M.; Araya, I.; Fabregas, E.; Arcos, C.; Machuca, N.; Farias, G. Estimation of Physical Stellar Parameters from Spectral Models Using Deep Learning Techniques. Mathematics 2024, 12, 3169. [Google Scholar] [CrossRef]
  7. Special Issue: Newest Results in Gravitational Waves and Machine Learning. Universe 2024, 9, 10. Available online: https://www.mdpi.com/journal/universe/special_issues/48U1E55JLC (accessed on 12 March 2024).
  8. Pagliaro, A.; Cusumano, G.; La Barbera, A.; La Parola, V.; Lombardi, S. Application of Machine Learning Ensemble Methods to ASTRI Mini-Array Cherenkov Event Reconstruction. Appl. Sci. 2023, 13, 8172. [Google Scholar] [CrossRef]
  9. Bruno, A.; Pagliaro, A.; La Parola, V. Application of Machine and Deep Learning Methods to the Analysis of IACTs Data. In Intelligent Astrophysics; Zelinka, I., Brescia, M., Baron, D., Eds.; Emergence, Complexity and Computation; Springer: Berlin/Heidelberg, Germany, 2021; Volume 39, pp. 115–136. [Google Scholar]
  10. Anzalone, A.; Pagliaro, A.; Tutone, A. An Introduction to Machine and Deep Learning Methods for Cloud Masking Applications. Appl. Sci. 2024, 14, 2887. [Google Scholar]
  11. Liu, C.; Yang, S.; Di, D.; Yang, Y.; Zhou, C.; Hu, X.; Sohn, B.J. A Machine Learning-based Cloud Detection Algorithm for the Himawari-8 Spectral Image. Adv. Atmos. Sci. 2022, 39, 1994–2007. [Google Scholar] [CrossRef]
  12. Cilli, R.; Monaco, A.; Amoroso, N.; Tateo, A.; Tangaro, S.; Bellotti, R. Machine Learning for Cloud Detection of Globally Distributed Sentinel-2 Images. Remote Sens. 2020, 12, 2355. [Google Scholar] [CrossRef]
  13. Priyadharsan, D.M.J.; Sanjay, K.K.; Kathiresan, S.; Karthik, K.; Prasath, K.S. Patient Health Monitoring Using IoT with Machine Learning. Int. Conf. Intell. Syst. Inf. Manag. 2019, 6, 7514–7520. [Google Scholar]
  14. Awad, S.R.; Alghareb, F.S. Encoding-Based Machine Learning Approach for Health Status Classification and Remote Monitoring of Cardiac Patients. Algorithms 2025, 18, 94. [Google Scholar] [CrossRef]
  15. Agafonov, A. Traffic Flow Prediction Using Graph Convolutional Networks. In Proceedings of the 10th International Conference on Information Science and Technology (ICIST), Bath, London, and Plymouth, UK, 9–15 September 2020; pp. 91–95. [Google Scholar]
  16. Rasulmukhamedov, M.; Tashmetov, T.; Tashmetov, K. Forecasting Traffic Flow Using Machine Learning Algorithms. Eng. Proc. 2024, 70, 14. [Google Scholar] [CrossRef]
  17. Bedi, J.; Toshniwal, D. Energy load time-series forecast using decomposition and autoencoder integrated memory network. Appl. Soft Comput. 2020, 93, 106390. [Google Scholar] [CrossRef]
  18. Lenton, T.M.; Xu, C.; Abrams, J.F.; Ghadiali, A.; Loriani, S.; Sakschewski, B.; Zimm, C.; Ebi, K.L.; Dunn, R.R.; Svenning, J.-C.; et al. Quantifying the human cost of global warming. Nat. Sustain. 2023, 6, 1237. [Google Scholar] [CrossRef]
  19. Vázquez-Ramírez, S.; Torres-Ruiz, M.; Quintero, R.; Chui, K.T.; Guzmán Sánchez-Mejorada, C. An Analysis of Climate Change Based on Machine Learning and an Endoreversible Model. Mathematics 2023, 11, 3060. [Google Scholar] [CrossRef]
  20. Rajendran, T.; Imtiaz, N.M.; Jagadeesh, K.; Sampathkumar, B. Cybersecurity Threat Detection Using Deep Learning and Anomaly Detection Techniques. In Proceedings of the 2024 International Conference on Knowledge Engineering and Communication Systems (ICKECS), Chikkaballapur, India, 18–19 April 2024; pp. 1–6. [Google Scholar]
  21. Uzoka, A.; Cadet, E.; Ojukwu, P.U. Applying artificial intelligence in Cybersecurity to enhance threat detection, response, and risk management. Comput. Sci. Res. J. 2024, 5, 2511–2538. [Google Scholar]
  22. Xie, Y.; Li, Z.; Bao, H.; Jia, X.; Xu, D.; Zhou, X.; Skakun, S. Auto-CM: Unsupervised Deep Learning for Satellite Imagery Composition and Cloud Masking Using Spatio-Temporal Dynamics. AAAI Conf. Artif. Intell. 2023, 37, 14575–14583. [Google Scholar] [CrossRef]
  23. Krasner, S. Examining Applications of Fourier Transforms to Financial Data and Covariance Estimation, Carnegie Mellon University. 2020. Available online: https://kilthub.cmu.edu/articles/thesis/Examining_Applications_of_Fourier_Transforms_to_Financial_Data_and_Covariance_Estimation/12824255 (accessed on 12 March 2025).
  24. Krishna, G.V.; Avula, S.K.; Raju, V.V.K.; Lakshmi, T.V.H.; Tumuluru, P.; Balaji, T.; Jaya, N. Enhanced ECG Signal Classification Using Hybrid CNN-Transformer Models with Tuning Techniques and Genetic Algorithm Optimization. J. Theor. Appl. Inf. Technol. 2024, 102, 7510–7521. [Google Scholar]
  25. Yakymiv, V. Top Machine Learning Time Series Techniques: Pros and Cons. 2025. Available online: https://forbytes.com/blog/machine-learning-time-series-techniques/ (accessed on 12 March 2025).
  26. Logunova, I. Time Series Analysis in Machine Learning. 2025. Available online: https://serokell.io/blog/time-series-analysis-in-ml (accessed on 12 March 2025).
  27. Qian, Y. An enhanced Transformer framework with incremental learning for online stock price prediction. PLoS ONE 2025, 20, e0316955. [Google Scholar] [CrossRef]
  28. Alqahtani, A.; Alsubai, S.; Sha, M.; Vilcekova, L.; Javed, T. Cardiovascular Disease Detection Using Ensemble Learning. Comput. Intell. Neurosci. 2022, 2022, 5267498. [Google Scholar] [CrossRef]
  29. Mitra, A.; Wang, L. Improving Artificial Neural Network Based Stock Forecasting Using Fourier De-Noising. Future Transp. 2023, 70, 14. [Google Scholar]
  30. Ren, X.; Xu, W.; Duan, K. Fourier Transform-Based LSTM Stock Prediction Model Under Oil Shocks. Quant. Financ. Econ. 2022, 6, 342–358. [Google Scholar] [CrossRef]
  31. Ma, D.; Wu, R.; Xiao, D.; Sui, B. Cloud Removal from Satellite Images Using a Deep Learning Model with the Cloud-Matting Method. Remote Sens. 2023, 15, 904. [Google Scholar] [CrossRef]
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
