Time Series Forecasting Method Based on Multi-Scale Feature Fusion and Autoformer
Abstract
1. Introduction
1.1. Literature Review
1.2. Research Hypotheses and Methodology
- A multi-scale feature fusion network suitable for time series is proposed. Based on convolutions at different scales, it can effectively extract multi-scale features of time series.
- Based on the above-mentioned feature fusion network, combined with date–time encoding [25], a time-series prediction method, MD–Autoformer, is proposed. Comparative experiments and ablation experiments are carried out with baseline models on real-world clothing sales datasets and time-series datasets from multiple fields, verifying the effectiveness of the method proposed in this paper.
- We optimized MD–Autoformer’s hyperparameters with a TPE-based Bayesian optimization algorithm, identifying the settings that maximize prediction performance and make the model more reliable and effective across diverse tasks.
2. Materials and Methods
2.1. Data Source
2.2. Autoformer Network Structure
2.3. Proposed Method
2.3.1. Multi-Scale Feature Fusion Network
- The 1 × 1 convolutional branch specializes in inter-channel interactions and cross-scale feature recalibration. Through adaptive average pooling along the temporal dimension, this branch extracts compressed global contextual information while preserving channel interdependencies.
- The 3 × 3 convolutional branch extends the receptive field along the temporal axis, enabling the model to capture both localized temporal patterns and extended dependencies. This configuration facilitates the identification of seasonal patterns and macro-trends while discriminating between high-frequency noise and low-frequency trends through differentiated frequency responses of convolutional kernels [29].
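The two convolutional branches above can be sketched in plain NumPy. The fusion rule shown here (a sigmoid gate derived from the pooled 1 × 1 branch recalibrating the 3 × 3 branch, plus a residual connection) is an illustrative assumption, not the paper’s exact fusion operator:

```python
import numpy as np

def conv1d(x, w):
    """'Same'-padded 1-D convolution: x is (C_in, T), w is (C_out, C_in, K)."""
    C_out, C_in, K = w.shape
    pad = K // 2
    xp = np.pad(x, ((0, 0), (pad, pad)))
    T = x.shape[1]
    out = np.zeros((C_out, T))
    for t in range(T):
        # contract the (C_in, K) window against each output filter
        out[:, t] = np.tensordot(w, xp[:, t:t + K], axes=([1, 2], [0, 1]))
    return out

def multi_scale_block(x, w1, w3):
    """Two-branch multi-scale fusion (hypothetical fusion rule).

    The 1 x 1 branch mixes channels, then adaptive average pooling over the
    temporal axis yields a global-context channel gate; the 3 x 3 branch
    captures local temporal patterns over a wider receptive field.
    """
    b1 = conv1d(x, w1)                                        # (C, T) channel mixing
    gate = 1.0 / (1.0 + np.exp(-b1.mean(axis=1, keepdims=True)))  # pooled context -> (C, 1)
    b3 = conv1d(x, w3)                                        # (C, T) local temporal features
    return gate * b3 + x                                      # recalibrate + residual
```

Here `multi_scale_block` and its gate are hypothetical names introduced for illustration; in practice such branches would be learned jointly inside the network.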
2.3.2. Date–Time Encoding
2.3.3. Prediction Model
3. Results
3.1. Experimental Parameters and Evaluation Metrics
3.1.1. Experimental Parameters
3.1.2. Evaluation Metrics
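The comparison tables in this section report RMSE and MAE. As a minimal reference sketch (the paper’s exact aggregation over variables and forecast windows is not reproduced here), the two metrics can be computed as:

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error over a forecast window."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error over a forecast window."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs(y_true - y_pred)))
```

RMSE penalizes large deviations more heavily than MAE, which is why the two metrics can rank models differently on spiky series.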
3.2. Clothing Sales Forecasting: Case Study
3.3. Method Validation on Diverse Data
3.3.1. Prediction Experiments in the Disease Field
3.3.2. Prediction Experiments in the Financial Field
3.3.3. Prediction Experiments in the Meteorological Field
3.4. Model Performance Evaluation
3.4.1. Convergence
3.4.2. Computational Time Analysis
3.4.3. Parameter Sensitivity
3.5. Statistical Significance Analysis
3.6. Ablation Study
3.7. Bayesian Hyperparametric Optimization Experiments
Algorithm 1: TPE-based BOA for MD–Autoformer tuning
Require: Objective function f(x), hyperparameter space X, maximum number of trials T
Ensure: Optimal hyperparameters x*, minimal loss y*
1: Initialize observations D ← ∅
2: for t = 1 to T do
3:   Partition D into D_good and D_bad
4:   Build density models: l(x) from D_good, g(x) from D_bad
5:   Select hyperparameters: x_t ← argmax_x l(x)/g(x)
6:   Evaluate hyperparameters: y_t ← f(x_t)
7:   Update observed data: D ← D ∪ {(x_t, y_t)}
8: end for
9: return x* and y*
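The TPE loop of Algorithm 1 can be sketched in plain Python for a single continuous hyperparameter. The bandwidth rule, candidate count, warm-up length, and quantile γ below are illustrative choices, not the paper’s settings:

```python
import math
import random

def tpe_minimize(f, lo, hi, n_trials=30, n_startup=8, gamma=0.25,
                 n_candidates=24, seed=0):
    """Simplified 1-D Tree-structured Parzen Estimator (illustrative sketch).

    Observations are split into a 'good' set (lowest-loss gamma fraction) and
    a 'bad' set; Gaussian kernel densities l(x) and g(x) are built over each,
    and the candidate maximizing l(x)/g(x) is evaluated next.
    """
    rng = random.Random(seed)
    history = []                                   # list of (x, loss) pairs

    def kde(points, x):
        # crude fixed-bandwidth Gaussian kernel density estimate
        bw = (hi - lo) / max(len(points), 1) + 1e-12
        return sum(math.exp(-0.5 * ((x - p) / bw) ** 2)
                   for p in points) / (len(points) * bw)

    for t in range(n_trials):
        if t < n_startup:
            x = rng.uniform(lo, hi)                # random warm-up trials
        else:
            history.sort(key=lambda xy: xy[1])     # ascending loss
            n_good = max(1, int(gamma * len(history)))
            good = [x_ for x_, _ in history[:n_good]]
            bad = [x_ for x_, _ in history[n_good:]]
            cands = [rng.uniform(lo, hi) for _ in range(n_candidates)]
            x = max(cands, key=lambda c: kde(good, c) / (kde(bad, c) + 1e-12))
        history.append((x, f(x)))                  # evaluate and record

    return min(history, key=lambda xy: xy[1])      # (x*, minimal loss)
```

In practice a library such as Optuna or Hyperopt would replace this sketch; it is shown only to make the partition/model/select/evaluate cycle of Algorithm 1 concrete.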
4. Discussion
- RH1: Ablation and comparative experiments demonstrated that incorporating date–time encoding enhanced the model’s ability to recognize periodic temporal patterns, thereby supporting RH1.
- RH2: The multi-scale feature fusion network improved Autoformer’s prediction accuracy by extracting both local and global temporal features. Ablation studies confirmed that this approach effectively reduced error accumulation in scenarios with multi-scale seasonality, supporting RH2.
- RH3: Statistical significance analysis (p < 0.05) revealed that combining date–time encoding with the multi-scale fusion network led to a statistically significant improvement in prediction performance, validating RH3.
- RH4: The TPE-based Bayesian hyperparameter optimization improved the prediction accuracy of MD–Autoformer across datasets from multiple distinct domains, thereby validating RH4.
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
ILI | Influenza-Like Illness |
MSFFN | Multi-Scale Feature Fusion Network |
DE | Date–Time Encoding |
References
- Waqas, M.; Humphries, U.W.; Hlaing, P.T.; Ahmad, S. Seasonal WaveNet-LSTM: A Deep Learning Framework for Precipitation Forecasting with Integrated Large Scale Climate Drivers. Water 2024, 16, 3194. [Google Scholar] [CrossRef]
- Torres, J.F.; Hadjout, D.; Sebaa, A.; Martínez-Álvarez, F.; Troncoso, A. Deep learning for time series forecasting: A survey. Big Data 2021, 9, 3–21. [Google Scholar] [PubMed]
- Tanaka, K.; Akimoto, H.; Inoue, M. Production risk management system with demand probability distribution. Adv. Eng. Inform. 2012, 26, 46–54. [Google Scholar]
- Staffini, A. Stock price forecasting by a deep convolutional generative adversarial network. Front. Artif. Intell. 2022, 5, 837596. [Google Scholar] [CrossRef]
- Ahmad, Z.; Bao, S.; Chen, M. DeepONet-Inspired Architecture for Efficient Financial Time Series Prediction. Mathematics 2024, 12, 3950. [Google Scholar] [CrossRef]
- Staffini, A. A CNN–BiLSTM Architecture for Macroeconomic Time Series Forecasting. Eng. Proc. 2023, 39, 33. [Google Scholar] [CrossRef]
- Hyndman, R. Forecasting: Principles and Practice, 3rd ed.; OTexts: Melbourne, Australia, 2021. [Google Scholar]
- Ristanoski, G.; Liu, W.; Bailey, J. A time-dependent enhanced support vector machine for time series regression. In Proceedings of the 19th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, Chicago, IL, USA, 11–14 August 2013; pp. 946–954. [Google Scholar]
- Cortez, P. Sensitivity analysis for time lag selection to forecast seasonal time series using neural networks and support vector machines. In Proceedings of the 2010 International Joint Conference on Neural Networks (IJCNN), Barcelona, Spain, 18–23 July 2010; pp. 1–8. [Google Scholar]
- Cerqueira, V.; Torgo, L.; Soares, C. Machine learning vs statistical methods for time series forecasting: Size matters. arXiv 2019, arXiv:1909.13316. [Google Scholar]
- Box, G.E.; Pierce, D.A. Distribution of residual autocorrelations in autoregressive-integrated moving average time series models. J. Am. Stat. Assoc. 1970, 65, 1509–1526. [Google Scholar]
- Zhao, J.; Zhang, C. Research on Sales Forecast Based on Prophet-SARIMA Combination Model. J. Phys. Conf. Ser. 2020, 1616, 012069. [Google Scholar]
- Wei, J.; Zhu, J.; Huang, C.; Tang, Y.; Lin, X.; Mao, C. A novel prediction model for sales forecasting based on grey system. In Proceedings of the 2016 IEEE 9th International Conference on Service-Oriented Computing and Applications (SOCA), Macau, China, 4–6 November 2016; pp. 10–15. [Google Scholar]
- Zhang, B.; Tseng, M.-L.; Qi, L.; Guo, Y.; Wang, C.-H. A comparative online sales forecasting analysis: Data mining techniques. Comput. Ind. Eng. 2022, 176, 108935. [Google Scholar] [CrossRef]
- Zhou, X.; Meng, J.; Wang, G.; Qi, X. A demand forecasting model based on the improved Bass model for fast fashion clothing. Int. J. Cloth. Sci. Technol. 2020. ahead-of-print. [Google Scholar]
- Thomassey, S. Sales Forecasting in Apparel and Fashion Industry: A Review; Springer: Berlin/Heidelberg, Germany, 2014. [Google Scholar]
- Li, Y.; Yang, Y.; Zhu, K.; Zhang, J. Clothing sale forecasting by a composite GRU–Prophet model with an attention mechanism. IEEE Trans. Ind. Inform. 2021, 17, 8335–8344. [Google Scholar]
- He, K.; Ji, L.; Wu, C.W.D.; Tso, K.F.G. Using SARIMA–CNN–LSTM approach to forecast daily tourism demand. J. Hosp. Tour. Manag. 2021, 49, 25–33. [Google Scholar]
- Xu, D.; Zhang, Q.; Ding, Y.; Zhang, D. Application of a hybrid ARIMA-LSTM model based on the SPEI for drought forecasting. Environ. Sci. Pollut. Res. 2022, 29, 4128–4144. [Google Scholar]
- Dave, E.; Leonardo, A.; Jeanice, M.; Hanafiah, N. Forecasting Indonesia exports using a hybrid model ARIMA-LSTM. Procedia Comput. Sci. 2021, 179, 480–487. [Google Scholar]
- Fan, D.; Sun, H.; Yao, J.; Zhang, K.; Yan, X.; Sun, Z. Well production forecasting based on ARIMA-LSTM model considering manual operations. Energy 2021, 220, 119708. [Google Scholar]
- Wu, H.; Xu, J.; Wang, J.; Long, M. Autoformer: Decomposition transformers with auto-correlation for long-term series forecasting. Adv. Neural Inf. Process. Syst. 2021, 34, 22419–22430. [Google Scholar]
- Pan, K.; Lu, J.; Li, J.; Xu, Z. A Hybrid Autoformer Network for Air Pollution Forecasting Based on External Factor Optimization. Atmosphere 2023, 14, 869. [Google Scholar] [CrossRef]
- Zhong, J.; Li, H.; Chen, Y.; Huang, C.; Zhong, S.; Geng, H. Remaining Useful Life Prediction of Rolling Bearings Based on ECA-CAE and Autoformer. Biomimetics 2024, 9, 40. [Google Scholar] [CrossRef]
- Ma, S.; He, J.; He, J.; Feng, Q.; Bi, Y. Forecasting air quality index in Yan’an using temporal encoded Informer. Expert Syst. Appl. 2024, 255, 124868. [Google Scholar]
- Silva, E.S.; Hassani, H.; Madsen, D.Ø.; Gee, L. Googling fashion: Forecasting fashion consumer behaviour using google trends. Soc. Sci. 2019, 8, 111. [Google Scholar] [CrossRef]
- Skenderi, G.; Joppi, C.; Denitto, M.; Cristani, M. Well googled is half done: Multimodal forecasting of new fashion product sales with image-based google trends. J. Forecast. 2024, 43, 1982–1997. [Google Scholar] [CrossRef]
- Wiener, N. Generalized harmonic analysis. Acta Math. 1930, 55, 117–258. [Google Scholar]
- Ouyang, D.; He, S.; Zhang, G.; Luo, M.; Guo, H.; Zhan, J.; Huang, Z. Efficient multi-scale attention module with cross-spatial learning. In Proceedings of the ICASSP 2023—2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes Island, Greece, 4–10 June 2023; pp. 1–5. [Google Scholar]
- Zhou, H.; Zhang, S.; Peng, J.; Zhang, S.; Li, J.; Xiong, H.; Zhang, W. Informer: Beyond efficient transformer for long sequence time-series forecasting. In Proceedings of the AAAI Conference on Artificial Intelligence, Virtually, 2–9 February 2021; pp. 11106–11115. [Google Scholar]
- Li, S.; Jin, X.; Xuan, Y.; Zhou, X.; Chen, W.; Wang, Y.-X.; Yan, X. Enhancing the locality and breaking the memory bottleneck of transformer on time series forecasting. In Proceedings of the 32nd Conference on Neural Information Processing Systems (NeurIPS 2019), Vancouver, BC, Canada, 8–14 December 2019. [Google Scholar]
- Zeng, A.; Chen, M.; Zhang, L.; Xu, Q. Are transformers effective for time series forecasting? In Proceedings of the AAAI Conference on Artificial Intelligence, Washington, DC, USA, 7–14 February 2023; pp. 11121–11128. [Google Scholar]
- Ślusarczyk, D.; Ślepaczuk, R. Optimal Markowitz Portfolio Using Returns Forecasted with Time Series and Machine Learning Models; Working Papers 2023-17; Faculty of Economic Sciences, University of Warsaw: Warsaw, Poland, 2023. [Google Scholar]
- Kashif, K.; Ślepaczuk, R. Lstm-arima as a hybrid approach in algorithmic investment strategies. arXiv 2024, arXiv:2406.18206. [Google Scholar]
- Michańków, J.; Sakowski, P.; Ślepaczuk, R. Hedging Properties of Algorithmic Investment Strategies using Long Short-Term Memory and Time Series models for Equity Indices. arXiv 2023, arXiv:2309.15640. [Google Scholar]
- Stefaniuk, F.; Ślepaczuk, R. Informer in Algorithmic Investment Strategies on High Frequency Bitcoin Data. arXiv 2025, arXiv:2503.18096. [Google Scholar]
- Kryńska, K.; Ślepaczuk, R. Daily and Intraday Application of Various Architectures of the LSTM model in Algorithmic Investment Strategies on Bitcoin and the S&P 500 Index. Working Papers of Faculty of Economic Sciences, University of Warsaw, WP 25/2022 (401). Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=4628806 (accessed on 16 February 2025).
- Roszyk, N.; Ślepaczuk, R. The Hybrid Forecast of S&P 500 Volatility ensembled from VIX, GARCH and LSTM models. arXiv 2024, arXiv:2407.16780. [Google Scholar]
- Lai, G.; Chang, W.-C.; Yang, Y.; Liu, H. Modeling long-and short-term temporal patterns with deep neural networks. In Proceedings of the 41st International ACM SIGIR Conference on Research & Development in Information Retrieval, Ann Arbor, MI, USA, 8–12 July 2018; pp. 95–104. [Google Scholar]
- Zulfiqar, M.; Gamage, K.A.; Kamran, M.; Rasheed, M.B. Hyperparameter optimization of bayesian neural network using bayesian optimization and intelligent feature engineering for load forecasting. Sensors 2022, 22, 4446. [Google Scholar] [CrossRef]
- Jiang, B.; Gong, H.; Qin, H.; Zhu, M. Attention-LSTM architecture combined with Bayesian hyperparameter optimization for indoor temperature prediction. Build. Environ. 2022, 224, 109536. [Google Scholar]
- Du, L.; Gao, R.; Suganthan, P.N.; Wang, D.Z. Bayesian optimization based dynamic ensemble for time series forecasting. Inf. Sci. 2022, 591, 155–175. [Google Scholar]
- Snoek, J.; Larochelle, H.; Adams, R.P. Practical bayesian optimization of machine learning algorithms. In Proceedings of the Advances in Neural Information Processing Systems, Lake Tahoe, NV, USA, 3–6 December 2012; pp. 2951–2959. [Google Scholar]
Feature (Abbreviation) | Description |
---|---|
Date | Sales date |
Sales | The store’s daily apparel sales volume |
On Promotion | The number of promotional apparel items in the store |
Oil Price | The oil price in the region |
Holiday | Whether it was a holiday in the region |
Google Trend | Google search trend index for the category name |
USDX | The US dollar index |
Methods | MD–Autoformer | Autoformer | Informer | Reformer | DLinear | | | | | |
---|---|---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | |
Apparel Sales | 24 | 2.165 | 0.761 | 2.851 | 1.256 | 2.865 | 1.286 | 3.595 | 2.077 | 2.669 | 1.004 |
36 | 2.257 | 0.847 | 2.879 | 1.158 | 2.898 | 1.334 | 3.824 | 2.076 | 2.746 | 1.034 | |
48 | 2.320 | 0.876 | 2.949 | 1.199 | 2.981 | 1.391 | 4.274 | 2.251 | 2.838 | 1.083 | |
60 | 2.270 | 0.893 | 2.912 | 1.191 | 2.985 | 1.436 | 4.504 | 2.399 | 2.852 | 1.146 |
Methods | MD–Autoformer | Autoformer | Informer | Reformer | DLinear | ||||||
---|---|---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | |
Apparel Sales | 24 | ±0.014 | ±0.007 | ±0.013 | ±0.010 | ±0.037 | ±0.020 | ±0.039 | ±0.031 | ±0.020 | ±0.027 |
36 | ±0.019 | ±0.017 | ±0.025 | ±0.016 | ±0.027 | ±0.012 | ±0.059 | ±0.043 | ±0.011 | ±0.010 | |
48 | ±0.038 | ±0.030 | ±0.010 | ±0.023 | ±0.022 | ±0.013 | ±0.026 | ±0.023 | ±0.004 | ±0.011 | |
60 | ±0.018 | ±0.016 | ±0.025 | ±0.014 | ±0.023 | ±0.008 | ±0.019 | ±0.008 | ±0.028 | ±0.036 |
Methods | MD–Autoformer | Autoformer | Informer | Reformer | DLinear | ||||||
---|---|---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | |
ILI | 24 | 1.385 ± 0.022 | 0.834 ± 0.014 | 1.866 | 1.287 | 2.401 | 1.677 | 2.098 | 1.382 | 1.489 | 1.084 |
36 | 1.296 ± 0.023 | 0.736 ± 0.013 | 1.762 | 1.148 | 2.181 | 1.467 | 2.187 | 1.448 | 1.462 | 1.018 | |
48 | 1.318 ± 0.062 | 0.777 ± 0.018 | 1.634 | 1.085 | 2.182 | 1.469 | 2.198 | 1.465 | 1.460 | 1.031 | |
60 | 1.357 ± 0.039 | 0.813 ± 0.015 | 1.664 | 1.125 | 2.294 | 1.564 | 2.210 | 1.483 | 1.540 | 1.098 |
Methods | MD–Autoformer | Autoformer | Informer | Reformer | DLinear | ||||||
---|---|---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | |
Exchange | 96 | 0.352 ± 0.004 | 0.249 ± 0.008 | 0.444 | 0.323 | 0.920 | 0.752 | 1.032 | 0.829 | 0.292 | 0.211 |
192 | 0.466 ± 0.011 | 0.330 ± 0.010 | 0.548 | 0.369 | 1.097 | 0.895 | 1.090 | 0.906 | 0.405 | 0.299 | |
336 | 0.583 ± 0.012 | 0.411 ± 0.014 | 0.713 | 0.524 | 1.293 | 1.036 | 1.165 | 0.976 | 0.558 | 0.421 | |
720 | 0.944 ± 0.028 | 0.689 ± 0.013 | 1.203 | 0.941 | 1.574 | 1.310 | 1.229 | 1.016 | 0.807 | 0.609 |
Methods | MD–Autoformer | Autoformer | Informer | Reformer | DLinear | |||||
---|---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE |
Results | 0.370 | 0.283 | 0.413 | 0.302 | 1.934 | 1.473 | 2.932 | 2.432 | 0.479 | 0.327 |
Standard deviation | ±0.019 | ±0.014 | ±0.012 | ±0.015 | ±0.016 | ±0.018 | ±0.102 | ±0.060 | ±0.004 | ±0.008 |
Methods | MD–Autoformer | Autoformer | Informer | Reformer | DLinear | ||||||
---|---|---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | |
Weather | 96 | 0.445 ± 0.017 | 0.269 ± 0.008 | 0.516 | 0.336 | 0.548 | 0.384 | 0.830 | 0.596 | 0.421 | 0.239 |
192 | 0.533 ± 0.009 | 0.339 ± 0.011 | 0.554 | 0.367 | 0.773 | 0.544 | 0.867 | 0.638 | 0.470 | 0.284 | |
336 | 0.530 ± 0.004 | 0.325 ± 0.024 | 0.599 | 0.395 | 0.760 | 0.523 | 0.799 | 0.596 | 0.514 | 0.313 | |
720 | 0.599 ± 0.006 | 0.381 ± 0.013 | 0.647 | 0.428 | 1.029 | 0.741 | 1.063 | 0.792 | 0.570 | 0.363 |
Model | Time Cost (s) on Different Datasets | |||
---|---|---|---|---|
Apparel Sales | ILI | Exchange | Weather | |
MD–Autoformer | 1.134 | 0.512 | 22.671 | 192.809 |
Autoformer | 1.082 | 0.493 | 19.342 | 182.560 |
Informer | 0.568 | 0.291 | 8.197 | 67.081 |
Reformer | 0.639 | 0.348 | 23.648 | 200.964 |
DLinear | 0.046 | 0.026 | 0.247 | 2.882 |
Dataset | Apparel Sales | ILI | Exchange | Weather | ||||
---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE |
4 | 2.403 | 0.922 | 1.330 | 0.784 | 0.583 | 0.411 | 0.530 | 0.325 |
8 | 2.396 | 0.920 | 1.354 | 0.791 | 0.601 | 0.436 | 0.541 | 0.330 |
16 | 2.389 | 0.914 | 1.318 | 0.777 | 0.591 | 0.420 | 0.549 | 0.336 |
32 | 2.424 | 0.938 | 1.343 | 0.786 | 0.612 | 0.434 | 0.568 | 0.345 |
Datasets | t-Statistic | p-Value | Mean Diff |
---|---|---|---|
Apparel Sales | 57.63 | 0.000000 *** | 0.298 |
ILI | 47.91 | 0.000000 *** | 0.312 |
Exchange | 20.70 | 0.000000 *** | 0.252 |
Weather | 20.95 | 0.000000 *** | 0.047 |
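The paired t-statistics and mean differences above can be reproduced in form (not in value — the paper’s per-window error samples are not given here) by a standard paired test; computing the p-value additionally requires the t-distribution CDF, which is omitted from this sketch:

```python
import math
from statistics import mean, stdev

def paired_t(errors_a, errors_b):
    """Paired t-statistic and mean difference between two models' errors.

    errors_a / errors_b are per-window error samples for the two models
    (hypothetical inputs for illustration). Returns (t_statistic, mean_diff);
    the p-value would come from the t-distribution with n-1 degrees of freedom.
    """
    d = [a - b for a, b in zip(errors_a, errors_b)]  # per-window differences
    n = len(d)
    return mean(d) / (stdev(d) / math.sqrt(n)), mean(d)
```

A large positive t-statistic with p < 0.05, as in the table, indicates the baseline’s errors are significantly larger than MD–Autoformer’s.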
Methods | MD–Autoformer | Autoformer + MSFFN | Autoformer + DE | Autoformer | |||||
---|---|---|---|---|---|---|---|---|---|
Metric | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | |
Apparel Sales | 24 | 2.165 | 0.761 | 2.773 | 1.114 | 2.189 | 0.781 | 2.851 | 1.256 |
36 | 2.257 | 0.847 | 2.870 | 1.168 | 2.277 | 0.842 | 2.879 | 1.158 | |
48 | 2.320 | 0.876 | 2.945 | 1.184 | 2.346 | 0.86 | 2.949 | 1.199 | |
60 | 2.270 | 0.893 | 2.882 | 1.202 | 2.281 | 0.912 | 2.912 | 1.191 | |
ILI | 24 | 1.385 | 0.834 | 1.810 | 1.237 | 1.401 | 0.849 | 1.866 | 1.287 |
36 | 1.296 | 0.736 | 1.657 | 1.059 | 1.322 | 0.763 | 1.762 | 1.148 | |
48 | 1.318 | 0.777 | 1.627 | 1.061 | 1.321 | 0.788 | 1.634 | 1.085 | |
60 | 1.357 | 0.813 | 1.681 | 1.099 | 1.367 | 0.817 | 1.664 | 1.125 | |
Exchange | 96 | 0.352 | 0.249 | 0.395 | 0.290 | 0.361 | 0.256 | 0.444 | 0.323 |
192 | 0.466 | 0.330 | 0.506 | 0.371 | 0.488 | 0.347 | 0.548 | 0.369 | |
336 | 0.583 | 0.411 | 0.636 | 0.471 | 0.611 | 0.443 | 0.713 | 0.524 | |
720 | 0.944 | 0.689 | 1.086 | 0.835 | 0.950 | 0.693 | 1.203 | 0.941 | |
Weather | 96 | 0.445 | 0.269 | 0.499 | 0.325 | 0.459 | 0.283 | 0.516 | 0.336 |
192 | 0.533 | 0.339 | 0.550 | 0.356 | 0.541 | 0.349 | 0.554 | 0.367 | |
336 | 0.530 | 0.325 | 0.571 | 0.363 | 0.544 | 0.339 | 0.599 | 0.395 | |
720 | 0.599 | 0.381 | 0.628 | 0.399 | 0.617 | 0.412 | 0.647 | 0.428 |
Hyperparameter | Value Range |
---|---|
Dropout | [0.1, 0.3]
Learning rate | ]
Train epochs | [1, 10]
MSFFN channels |
Hyperparameter | Apparel Sales | ILI | Exchange | Weather |
---|---|---|---|---|
Dropout | 0.209 | 0.143 | 0.019 | 0.098 |
Learning rate | 0.0001 | 0.001 | 0.001 | 0.001 |
Train epochs | 8 | 6 | 6 | 2 |
MSFFN channels | 16 | 16 | 4 | 4 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Ma, X.; Zhang, H. Time Series Forecasting Method Based on Multi-Scale Feature Fusion and Autoformer. Appl. Sci. 2025, 15, 3768. https://doi.org/10.3390/app15073768