# Forecasting Multivariate Chaotic Processes with Precedent Analysis


## Abstract


## 1. Introduction

## 2. Materials and Methods

Let $Y(k),\ k = 1, \dots, N$ be an m-dimensional series of observations captured by a system that monitors the parameters of a real process occurring in an unstable immersion environment. As the observation model, we use a conventional additive relation of the form

$$Y(k) = X(k) + V(k),$$

where $X(k)$ is the so-called systemic component used to make management decisions and $V(k)$ is a random process that simulates observation noise. The systemic component $X(k)$ is assumed to be the output of some non-linear system with fixed coefficients and initial values. However, the strong dependence of the output on initial conditions makes the observer unable to predict the behavior of the system without prior knowledge. Several approaches in this area, including model-based and data-driven ones, are reviewed in [16,17,18,19,20]. In particular, in [20], the authors apply multidimensional sparse regression to identify the relevant terms of the system component $X(k)$ (SINDy).
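As a concrete illustration of the additive observation model, the sketch below generates a chaotic system component with the Lorenz equations and adds Gaussian observation noise. The generator, integration step, and noise level are illustrative assumptions, not parameters taken from this study.

```python
import numpy as np

def lorenz_series(n, dt=0.01, s=10.0, r=28.0, b=8.0 / 3.0):
    """Chaotic system component X(k): Euler integration of the Lorenz
    equations, an illustrative stand-in for the unobserved non-linear system."""
    x = np.empty((n, 3))
    x[0] = (1.0, 1.0, 1.0)
    for k in range(n - 1):
        xk, yk, zk = x[k]
        x[k + 1] = x[k] + dt * np.array([s * (yk - xk),
                                         xk * (r - zk) - yk,
                                         xk * yk - b * zk])
    return x

rng = np.random.default_rng(0)
N = 2000
X = lorenz_series(N)                     # systemic component X(k)
V = rng.normal(0.0, 0.5, size=X.shape)   # observation noise V(k)
Y = X + V                                # additive model Y(k) = X(k) + V(k)
```

Any other generator with sensitive dependence on initial conditions could replace the Lorenz system here; only the additive structure matters for the argument.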

The system component $X(k) = \{x_i(k)\},\ i = 1, \dots, m$ is an implementation of multidimensional dynamic chaos. Moreover, each of its parameters is an oscillatory non-periodic process that contains pronounced local trends. Usually, the system component $X(k),\ k = 1, \dots, N$ is isolated from the mixture $Y(k),\ k = 1, \dots, N$ by sequential filtering. However, it is very difficult to strictly separate the deterministic and random components under these conditions, which requires the introduction of additional subjective constraints.

The component $V(k)$ represents an m-dimensional noise process that is not taken into account in the decision-making process and is subject to filtering. Generally, it is close to a Wiener process; however, it deviates significantly from the stationarity condition due to the presence of heteroscedasticity and random variations of the autocorrelation function. The Gaussian assumption is also satisfied only approximately because of a large number of abnormal observations and the failure to meet the regularizing conditions of the central limit theorems [21,22].
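To make the deviation from stationarity concrete, the following sketch builds a Wiener-like path from Gaussian increments whose volatility drifts slowly, so the increment variance differs across observation windows. The volatility profile and window placement are hypothetical choices for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 4000
# Slowly drifting volatility makes the increments heteroscedastic
sigma = 0.5 + 0.4 * np.sin(2 * np.pi * np.arange(N) / 1000)
increments = rng.normal(0.0, sigma)
v = np.cumsum(increments)   # Wiener-like noise path with non-stationary increments

# The sample variance of the increments depends on the window,
# which a true (stationary-increment) Wiener process would forbid
var_a = increments[:500].var()
var_b = increments[700:1200].var()
```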

The observed components $y_i(k),\ i = 1, \dots, m$ are significantly correlated. In this case, the proposed relative stability (which has to be investigated individually) of the values of the covariance matrix $P(Y) = \mathrm{cov}(Y_i, Y_j),\ i, j = 1, \dots, m$ makes it possible to use this property as a regularizing factor in precedent forecasting. This assumption must be checked by analyzing the dynamics of pairwise correlations on sliding windows. Stability here is connected with the correctness of the proposed approximate flatness and must be verified for the data of interest.
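The stability check described above can be sketched as follows: compute the pairwise correlation matrices on sliding windows and inspect their spread across windows. The window length, step, and synthetic coupled series below are assumptions made only to keep the example runnable.

```python
import numpy as np

def sliding_corr(Y, win, step):
    """Pairwise correlation matrices of an (N, m) series on sliding windows."""
    mats = [np.corrcoef(Y[s:s + win].T)
            for s in range(0, len(Y) - win + 1, step)]
    return np.array(mats)   # shape: (n_windows, m, m)

rng = np.random.default_rng(2)
N, m = 3000, 4
shared = np.cumsum(rng.normal(size=(N, 1)), axis=0)   # shared component couples channels
Y = shared + rng.normal(scale=2.0, size=(N, m))

C = sliding_corr(Y, win=500, step=250)
spread = C.std(axis=0)   # small off-diagonal spread suggests relative stability
```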

Here, ${N}_{S}$ is the size of the training sample. In the case of continuous monitoring of the observed object, this array can be continuously replenished directly adjacent to the current moment in time.
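One way to realize such a continuously replenished training array is a fixed-capacity buffer of the most recent observation windows. The class name and interface below are illustrative assumptions, not code from this study.

```python
from collections import deque
import numpy as np

class PrecedentStore:
    """Keep the latest N_S observation windows as a training sample that is
    replenished as monitoring continues (oldest precedents are evicted)."""
    def __init__(self, n_s, win):
        self.windows = deque(maxlen=n_s)   # the training sample of size N_S
        self.recent = deque(maxlen=win)    # most recent raw observations

    def push(self, y):
        self.recent.append(np.asarray(y))
        if len(self.recent) == self.recent.maxlen:
            self.windows.append(np.array(self.recent))

store = PrecedentStore(n_s=5, win=3)
for k in range(10):
    store.push([float(k)])   # one new observation per monitoring step
```

With this design the training sample always ends directly adjacent to the current moment, as required.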

The statistics listed below, where ${P}_{1},{P}_{2}$ are the sample covariance matrices of the samples under consideration on the observation windows (3) and (4), are used as multivariate analogues of Fisher’s F-statistic.

Let ${\lambda }_{1} \ge \dots \ge {\lambda }_{m}$ denote the eigenvalues of ${P}_{2}^{-1}{P}_{1}$:

1. Hotelling (or Lawley–Hotelling) trace: ${T}_{LH} = \mathrm{tr}\left({P}_{2}^{-1}{P}_{1}\right) = {\sum }_{i=1}^{m}{\lambda }_{i}$;
2. Hotelling’s statistic: ${T}^{2} = \frac{{n}_{1}{n}_{2}}{{n}_{1}+{n}_{2}}{\left({\overline{Y}}_{1}-{\overline{Y}}_{2}\right)}^{\mathrm{T}}{P}^{-1}\left({\overline{Y}}_{1}-{\overline{Y}}_{2}\right)$, where ${\overline{Y}}_{1},{\overline{Y}}_{2}$ are the sample means on the two windows of sizes ${n}_{1},{n}_{2}$ and $P$ is the pooled covariance matrix;
3. Pillai’s trace: $V = \mathrm{tr}\left({P}_{1}{\left({P}_{1}+{P}_{2}\right)}^{-1}\right) = {\sum }_{i=1}^{m}{\lambda }_{i}/\left(1+{\lambda }_{i}\right)$;
4. Roy’s characteristic root: $\theta = {\lambda }_{1}$, the largest eigenvalue of ${P}_{2}^{-1}{P}_{1}$;
5. Wilks’ statistic: $\Lambda = \mathrm{det}\,{P}_{2}/\mathrm{det}\left({P}_{1}+{P}_{2}\right) = {\prod }_{i=1}^{m}{\left(1+{\lambda }_{i}\right)}^{-1}$.
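The covariance-based criteria above can all be computed from the eigenvalues of ${P}_{2}^{-1}{P}_{1}$, as the sketch below shows using the textbook definitions; the function name is illustrative. Hotelling’s ${T}^{2}$ is omitted here because it additionally requires the window means.

```python
import numpy as np

def covariance_similarity_stats(P1, P2):
    """Multivariate analogues of the F-statistic from two sample covariance
    matrices P1, P2 (m x m, positive definite), via eigenvalues of P2^{-1} P1."""
    lam = np.linalg.eigvals(np.linalg.solve(P2, P1)).real
    return {
        "lawley_hotelling": lam.sum(),          # tr(P2^{-1} P1)
        "pillai": (lam / (1.0 + lam)).sum(),    # tr(P1 (P1 + P2)^{-1})
        "roy": lam.max(),                       # largest characteristic root
        "wilks": (1.0 / (1.0 + lam)).prod(),    # det(P2) / det(P1 + P2)
    }

# Equal covariance matrices give the "no difference" baseline values
stats = covariance_similarity_stats(np.eye(3), np.eye(3))
```

For identical $3 \times 3$ covariance matrices all eigenvalues equal one, so the trace statistics reach their baseline values and Wilks’ statistic equals $2^{-3}$.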

## 3. Results

#### 3.1. Multidimensional Data Polygon for Studying the Effectiveness of Forecasting Algorithms: Preliminary Studies

#### 3.2. Property Analysis of Similarity Measures in Chaotic Processes

#### 3.3. Analog Search in Multidimensional Chaotic Processes

#### 3.4. Precedent Forecasting in Multidimensional Chaotic Processes: Numerical Studies

The quantity marked with the superscript “+” is the number of forecasting steps at which the direction of the linear trend of the observation data coincides in sign with the direction of the forecast trend.
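A minimal sketch of this directional score, using the sign of the first differences as a simple proxy for the local linear-trend direction (the function name and sample data are hypothetical):

```python
import numpy as np

def trend_sign_matches(observed, forecast):
    """Number of forecasting steps at which the trend direction (sign of the
    first difference) of the observations coincides with that of the forecast."""
    return int(np.sum(np.sign(np.diff(observed)) == np.sign(np.diff(forecast))))

observed = np.array([1.0, 1.5, 1.2, 1.4, 1.1])
forecast = np.array([1.0, 1.3, 1.1, 1.6, 1.2])
n_plus = trend_sign_matches(observed, forecast)   # all four step directions agree
```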

## 4. Discussion

## 5. Conclusions

## Author Contributions

## Funding

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## References

1. Smith, L. Chaos: A Very Short Introduction; Oxford University Press: Oxford, UK, 2007.
2. Horsthemke, W.; Lefever, R. Noise-Induced Transitions: Theory and Applications in Physics, Chemistry and Biology; Springer Series in Synergetics; Springer: Berlin/Heidelberg, Germany, 1984; Volume 15, pp. 1–315.
3. Gora, C.; Dovgal, V. Discrete Chaotic Processes and Information Processing; LAP Lambert Academic Publishing: Chisinau, Republic of Moldova, 2012.
4. Musaev, A.; Borovinskaya, E. Prediction in Chaotic Environments Based on Weak Quadratic Classifiers. Symmetry 2020, 12, 1630.
5. Pourafzal, A.; Fereidunian, A. A Complex Systems Approach to Feature Extraction for Chaotic Behavior Recognition. In Proceedings of the 2020 IEEE 6th Iranian Conference on Signal Processing and Intelligent Systems (ICSPIS), Mashhad, Iran, 23–24 December 2020; pp. 1–6.
6. Manneville, P. Instabilities, Chaos and Turbulence: An Introduction to Nonlinear Dynamics and Complex Systems; Imperial College Press: London, UK, 2004.
7. Klimontovich, Y.L. Turbulent Motion and the Structure of Chaos: A New Approach to the Statistical Theory of Open Systems, 2nd ed.; URSS: Moscow, Russia, 2010; ISBN 9785484011971.
8. Musaev, A.; Fenin, M.F. Research of inertia of dynamic processes in gas-dynamic chaotic media. Izvestiya SpbSIT 2018, 45, 114–122.
9. Peters, E.E. Chaos and Order in the Capital Markets: A New View of Cycles, Prices, and Market Volatility, 2nd ed.; John Wiley & Sons: Hoboken, NJ, USA, 1996.
10. Ananchenko, I.V.; Musaev, A.A. Mathematical and Information Technologies in the Forex Market; LAP Lambert Academic Publ.: Saarbrucken, Germany, 2013.
11. Wu, D.; Wang, X.; Su, J.; Tang, B.; Wu, S. A labeling method for financial time series prediction based on trends. Entropy 2020, 22, 1162.
12. Wernecke, H.; Sándor, B.; Gros, C. How to test for partially predictable chaos. Sci. Rep. 2017, 7, 1087.
13. Zhou, T.; Chu, C.; Xu, C.; Liu, W.; Yu, H. Detecting Predictable Segments of Chaotic Financial Time Series via Neural Network. Electronics 2020, 5, 823.
14. Flores, J.; González, J.; Farias, R.; Calderon, F. Evolving nearest neighbor time series forecasters. Soft Comput. 2019, 23, 1039–1048.
15. Gromov, V.; Baranov, P.; Tsybakin, A. Prediction after a Horizon of Predictability: Non-Predictable Points and Partial Multi-Step Prediction for Chaotic Time Series. 2020. Available online: https://bit.ly/3lF6NjV (accessed on 12 October 2021).
16. Musaev, A.A. Estimation of Inertia of Chaotic Processes Taking into Account Qualitative Characteristics of Local Trends. SPIIRAS Proc. 2014, 2, 83–87.
17. Bao, Y.; Xiong, T.; Hu, Z. Multi-step-ahead time series prediction using multiple-output support vector regression. Neurocomputing 2014, 129, 482–493.
18. Tang, L.; Pan, H.; Yao, Y. K-nearest neighbor regression with principal component analysis for financial time series prediction. In Proceedings of the 2018 International Conference on Computing and Artificial Intelligence, Chengdu, China, 12–14 March 2018; pp. 127–131.
19. Tang, L.; Pan, H.; Yao, Y. Computational Intelligence Prediction Model Integrating Empirical Mode Decomposition, Principal Component Analysis, and Weighted k-Nearest Neighbor. J. Electron. Sci. Technol. 2021, 18, 341–349.
20. Brunton, S.L.; Proctor, J.L.; Kutz, J.N. Discovering governing equations from data by sparse identification of nonlinear dynamical systems. Proc. Natl. Acad. Sci. USA 2016, 113, 3932–3937.
21. Ye, R.; Dai, Q. MultiTL-KELM: A multi-task learning algorithm for multi-step-ahead time series prediction. Appl. Soft Comput. 2019, 79, 227–253.
22. Sinai, Y.G. Probability Theory: An Introductory Course; Springer Science & Business Media: Berlin/Heidelberg, Germany, 1992.
23. Makshanov, A.V.; Musaev, A.A. Intellectual Data Analysis; Saint Petersburg Institute of Technology: Saint Petersburg, Russia, 2019.
24. Lin, G.; Lin, A.; Cao, J. Multidimensional KNN algorithm based on EEMD and complexity measures in financial time series forecasting. Expert Syst. Appl. 2021, 168, 114443.
25. Perner, P. (Ed.) Advances in Data Mining: Applications and Theoretical Aspects; Lecture Notes in Computer Science; Springer International Publishing: Basel, Switzerland, 2018; Volume 10933.
26. Fukunaga, K. Introduction to Statistical Pattern Recognition, 2nd ed.; Academic Press: Cambridge, MA, USA, 2013; ISBN 0122698517.
27. Bishop, C. Pattern Recognition and Machine Learning; Springer: New York, NY, USA, 2006; ISBN 9780387310732.
28. Musaev, A.A. Evolutionary-Statistical Approach to Self-Organization of Predictive Models of Technological Process Control. Autom. Ind. 2006, 7, 31–35.
29. Carreno, J.E. Multi-Objective Optimization by Using Evolutionary Algorithms: The p-Optimality Criteria. IEEE Trans. Evol. Comput. 2014, 18, 167–179.
30. Suganthan, P.N. Differential Evolution: A Survey of the State-of-the-Art. IEEE Trans. Evol. Comput. 2011, 15, 4–31.
31. Mukhopadhyay, A.; Maulik, U.; Bandyopadhyay, S.; Coello, C.A.C. A Survey of Multiobjective Evolutionary Algorithms for Data Mining: Part I. IEEE Trans. Evol. Comput. 2014, 18, 4–19.
32. Kirichenko, L.; Radivilova, T.; Bulakh, V. Binary Classification of Fractal Time Series by Machine Learning Methods. In International Scientific Conference “Intellectual Systems of Decision Making and Problem of Computational Intelligence”; Springer: Cham, Switzerland, 2019; pp. 701–711.

**Figure 2.** Simultaneous changes in the forecasted parameter (No. 1) and its three most correlated parameters (No. 11, 14, and 16).

**Figure 4.** (**a**) Similarity metric variation for the first difference of a multidimensional chaotic process and (**b**) similarity metric variation for a multidimensional chaotic process itself.

**Figure 5.** Changes in the Euclidean and Mahalanobis distances for a multidimensional chaotic process.

**Figure 7.** Precedent windows for four parameters of a multidimensional chaotic process by the Hotelling statistic.

**Figure 8.** Analog windows for the first parameter of a multidimensional chaotic process by the Hotelling statistic.

**Figure 10.** Two examples of implementing a multidimensional precedent forecast using the Hotelling and Wilks metrics.

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Musaev, A.; Makshanov, A.; Grigoriev, D.
Forecasting Multivariate Chaotic Processes with Precedent Analysis. *Computation* **2021**, *9*, 110.
https://doi.org/10.3390/computation9100110
