Time Series Determinism Recognition by LSTM Model
Abstract
1. Introduction
2. AI Time Series Models
3. Time Series
3.1. AR Model
3.2. Lorenz Attractor
3.3. Financial Time Series
4. LSTM Implementation (Results)
4.1. AR(2) Model
4.2. Lorenz Attractor
4.3. Financial Time Series
5. Conclusions and Future Work
5.1. Key Contributions
- This study introduces a novel application of deep learning models—specifically LSTM networks—as a metric for assessing determinism in time series data.
- Unlike traditional uses of AI for prediction or classification, this research inverts the paradigm by using the model’s performance as an indicator of the underlying structure in the data.
- Theoretical insights are provided regarding the deterministic nature of AI models, emphasizing their reliance on pattern recognition and the importance of avoiding overfitting.
- The approach was empirically validated on three types of time series:
  - Deterministic with noise: AR(2) model; performance degraded as noise increased.
  - Deterministic: Lorenz attractor; the trajectory was accurately reconstructed.
  - Stochastic: financial time series; poor model performance, supporting the hypothesis.
- The quality of the LSTM model, measured via RMSE, MAE, and R², correlates with the degree of determinism in the time series.
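The three error measures listed above are standard. As a point of reference, a minimal pure-Python sketch of how RMSE, MAE, and R² can be computed on held-out predictions (the array arguments are illustrative, not the paper's data or code):

```python
import math

def rmse(y_true, y_pred):
    # Root mean squared error: penalises large deviations quadratically.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    # Mean absolute error: average magnitude of the residuals.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def r2(y_true, y_pred):
    # Coefficient of determination: 1 for a perfect fit,
    # near 0 (or negative) when the model explains no variance.
    mean_t = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean_t) ** 2 for t in y_true)
    return 1.0 - ss_res / ss_tot
```

Under the determinism hypothesis above, a stochastic series drives R² toward zero while RMSE approaches the noise scale, whereas a deterministic series admits R² close to 1.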
5.2. Shortcomings
- The process of constructing AI models is not yet standardised, which may affect reproducibility and scalability.
- Designing optimal architectures remains a complex task, even though an optimization algorithm was employed.
- This study primarily used RMSE as the determinism indicator; however, this may not capture all aspects of model quality or data structure.
5.3. Future Prospects
- Develop a standardised pipeline for generating and validating AI models for determinism analysis.
- Develop a standardised metric for determinism analysis.
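As a starting point for such a pipeline, the two deterministic benchmarks from Section 5.1 can be generated in a few lines. The AR(2) coefficients, noise level, and seed below are illustrative placeholders, not the values used in the study; the Lorenz parameters are the classical ones (σ = 10, ρ = 28, β = 8/3), integrated here with a simple explicit Euler scheme rather than the authors' setup:

```python
import random

def ar2_series(n, a1=0.6, a2=0.3, noise_std=0.2, seed=42):
    """AR(2) with additive Gaussian noise: x_t = a1*x_{t-1} + a2*x_{t-2} + eps_t.
    The chosen a1, a2 satisfy the stationarity conditions (a1 + a2 < 1, etc.)."""
    rng = random.Random(seed)
    x = [0.0, 0.0]
    for _ in range(n - 2):
        x.append(a1 * x[-1] + a2 * x[-2] + rng.gauss(0.0, noise_std))
    return x

def lorenz_series(n, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Lorenz system integrated with an explicit Euler step (coarse sketch)."""
    x, y, z = 1.0, 1.0, 1.0
    out = []
    for _ in range(n):
        dx = sigma * (y - x)
        dy = x * (rho - z) - y
        dz = x * y - beta * z
        x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
        out.append((x, y, z))
    return out
```

Feeding both series through the same model-building and scoring stage, and comparing the resulting R² values, is the shape such a standardised pipeline would take.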
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
| | Mean Value | Max | Min | Var | Median |
|---|---|---|---|---|---|
| USD/EUR | 1.0677 | 1.1457 | 0.9596 | 0.0016 | 1.0730 |
| MSFT | 291.39 USD | 382.70 USD | 214.25 USD | 1676.01 USD² | 287.18 USD |
| | | Neuron Number | Dropout Rate | RMSE | MAE | R² |
|---|---|---|---|---|---|---|
| 0.2 | 0.15 | 324 | 0.003 | 0.206 | 0.168 | 0.059 |
| | 0.25 | 63 | 0.054 | 0.206 | 0.168 | 0.179 |
| | 0.35 | 196 | 0.072 | 0.207 | 0.168 | 0.391 |
| | 0.45 | 103 | 0.134 | 0.206 | 0.168 | 0.766 |
| 0.5 | 0.15 | 229 | 0.035 | 0.482 | 0.389 | −0.004 |
| | 0.25 | 61 | 0.441 | 0.478 | 0.385 | 0.086 |
| | 0.35 | 22 | 0.242 | 0.478 | 0.385 | 0.249 |
| | 0.45 | 224 | 0.228 | 0.477 | 0.384 | 0.632 |
| 0.8 | 0.15 | 228 | 0.450 | 0.791 | 0.622 | 0.037 |
| | 0.25 | 153 | 0.238 | 0.791 | 0.623 | 0.148 |
| | 0.35 | 27 | 0.050 | 0.794 | 0.623 | 0.342 |
| | 0.45 | 161 | 0.104 | 0.790 | 0.620 | 0.691 |
| Type of Network, Coordinate | Neuron Number | Dropout Rate | RMSE | MAE | R² |
|---|---|---|---|---|---|
| one layer, X | 23 | 0.003 | 0.021 | 0.015 | 1 |
| two layers, X | 151; 93 | 0.002 | 0.044 | 0.029 | 1 |
| one layer, Z | 156 | 0.018 | 0.324 | 0.309 | 0.999 |
| Time Series | Neuron Number | Dropout Rate | RMSE | MAE | R² |
|---|---|---|---|---|---|
| EUR/USD | 186 | 0.322 | 0.008 | 0.005 | 0.860 |
| MSFT | 138 | 0.166 | 4.720 | 3.754 | 0.958 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Miśkiewicz, J.; Witkowicz, P. Time Series Determinism Recognition by LSTM Model. Mathematics 2025, 13, 2000. https://doi.org/10.3390/math13122000