# A Combined Model Based on Recurrent Neural Networks and Graph Convolutional Networks for Financial Time Series Forecasting


## Abstract

R^{2}). These results represent a smaller difference between the values returned by the model and the real values and, therefore, greater precision in this model's predictions.

## 1. Introduction

#### 1.1. Artificial Neural Networks for Time Series Forecasting

#### 1.1.1. Long Short-Term Memory

#### 1.1.2. Bidirectional LSTM

#### 1.1.3. Graph Convolutional Networks

#### 1.2. Commodity Analysis

- Review of the main statistical techniques for forecasting economic time series.
- Creation of a new hybrid model that combines recurrent and graph convolutional networks.
- Demonstration that the new model improves on the results reported in the existing literature.
- Comparison of the main models used for time series prediction.

## 2. Methodology

#### 2.1. ANN Architecture

Here, n_{m} defines the number of neurons that are part of the hidden layer m, and O denotes the output layer.
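To make the layer-sizing notation concrete, a dense feed-forward pass with configurable layer widths can be sketched as follows (a minimal NumPy sketch; the random weight initialization, tanh activation, and the helper name `mlp_forward` are illustrative assumptions, not the authors' code):

```python
import numpy as np

def mlp_forward(x, layer_sizes, rng=None):
    """Forward pass through a fully connected network.

    layer_sizes[m] plays the role of n_m: the number of neurons in
    hidden layer m; the last entry is the size of the output layer O.
    Weights are randomly initialized here purely for illustration.
    """
    if rng is None:
        rng = np.random.default_rng(0)
    a = x
    for n_m in layer_sizes:
        W = rng.normal(scale=0.1, size=(a.shape[-1], n_m))
        b = np.zeros(n_m)
        a = np.tanh(a @ W + b)  # tanh activation at every layer
    return a

# Three hidden layers of 50 neurons and a single output neuron.
out = mlp_forward(np.ones((1, 8)), layer_sizes=[50, 50, 50, 1])
```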

#### 2.2. BiLSTM-GCN Network

The input x_{t} is processed in the forward layer, resulting in the output h_{t}, and in the backward layer, resulting in the output h_{t}′; when these two outputs are combined and the activation function is applied, they provide the output o_{t}.
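The forward/backward combination described above can be sketched as follows (a minimal NumPy sketch of the output step only; the linear weights `W_f`, `W_b`, the bias `b`, and the tanh activation are illustrative assumptions, since the paper does not specify them):

```python
import numpy as np

def bilstm_output(h_fwd, h_bwd, W_f, W_b, b):
    """Combine the forward hidden state h_t and the backward hidden
    state h_t' into the BiLSTM output o_t via a linear map followed
    by an activation (tanh chosen here for illustration)."""
    return np.tanh(h_fwd @ W_f + h_bwd @ W_b + b)

# Toy states of dimension 4; the weights simply average the two directions.
h_t = np.ones(4)
h_t_prime = np.ones(4)
o_t = bilstm_output(h_t, h_t_prime, np.eye(4) * 0.5, np.eye(4) * 0.5, np.zeros(4))
```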

## 3. Results

#### 3.1. Data Description

#### 3.2. Evaluation Metrics
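The evaluation metrics reported later in Table 4 (MSE, RMSE, R^{2} and MAPE) follow their standard textbook definitions; a minimal NumPy sketch (the function name and returned dictionary layout are illustrative, not the authors' code):

```python
import numpy as np

def evaluation_metrics(y_true, y_pred):
    """MSE, RMSE, R^2 and MAPE with their standard definitions."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    err = y_true - y_pred
    mse = np.mean(err ** 2)
    rmse = np.sqrt(mse)
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    # MAPE in percent; assumes y_true contains no zeros.
    mape = 100.0 * np.mean(np.abs(err / y_true))
    return {"MSE": mse, "RMSE": rmse, "R2": r2, "MAPE": mape}

m = evaluation_metrics([1.0, 2.0, 3.0], [1.0, 2.0, 4.0])
```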

#### 3.3. Model Parameters

- Learning rate: A value that is too low may require more epochs and make training slower.
- Batch size: Determines how many samples are processed before the model parameters are updated.
- Epochs: Determine how many times the model is trained on the entire dataset.
- Hidden layers: The complexity of the model is determined by the number of neurons and hidden layers, which defines the learning capacity of the model. To select the number of hidden layers, different values were tried, and the optimum was chosen by comparing the evaluation metrics.
- Optimization algorithm: The choice of optimization algorithm can have a notable impact on learning, since it updates the parameter values according to the set learning rate. In this case, Adam was selected because it combines the advantages of RMSProp with those of gradient descent with momentum [74].
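Using the BiLSTM row of Table 2, the configuration and a single Adam update step can be sketched as follows (a NumPy sketch with the usual Adam coefficients β1 = 0.9, β2 = 0.999; the dictionary layout and function name are illustrative assumptions):

```python
import numpy as np

# Hyperparameters matching the BiLSTM row of Table 2.
config = {"learning_rate": 0.001, "batch_size": 25, "epochs": 100,
          "hidden_layers": 3, "neurons": 50, "optimizer": "Adam"}

def adam_step(theta, grad, state, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update: momentum-style first moment (beta1) combined
    with RMSProp-style second-moment scaling (beta2), bias-corrected."""
    state["t"] += 1
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    m_hat = state["m"] / (1 - beta1 ** state["t"])
    v_hat = state["v"] / (1 - beta2 ** state["t"])
    return theta - lr * m_hat / (np.sqrt(v_hat) + eps)

state = {"t": 0, "m": 0.0, "v": 0.0}
theta = adam_step(1.0, 0.5, state, lr=config["learning_rate"])
```

On the first step the bias-corrected moments cancel the raw gradient scale, so the parameter moves by roughly one learning rate regardless of the gradient's magnitude.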

#### 3.4. Experimental Results

R^{2} value, which indicates a more accurate result on the test data. Given the distortion that the ARIMA and PROPHET results introduce into the graphical representation, it was decided not to show them in Figure 7.

## 4. Discussion

## 5. Conclusions and Future Lines of Research

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## References

- Box, G.E.P.; Jenkins, G.M. Time Series Analysis: Forecasting and Control; Holden-Day: San Francisco, CA, USA, 1970.
- Kamijo, K.I.; Tanigawa, T. Stock price pattern recognition: A recurrent neural network approach. In Proceedings of the 1990 IJCNN International Joint Conference on Neural Networks, San Diego, CA, USA, 7–21 June 1990; pp. 215–221.
- Chakraborty, K.; Mehrotra, K.; Mohan, C.K.; Ranka, S. Forecasting the behavior of multivariate time series using neural networks. Neural Netw. **1992**, 5, 961–970.
- Kohzadi, N.; Boyd, M.S.; Kermanshahi, B.; Kaastra, I. A comparison of artificial neural network and time series models for forecasting commodity prices. Neurocomputing **1996**, 10, 169–181.
- Kolarik, T.; Rudorfer, G. Time series forecasting using neural networks. ACM SIGAPL APL Quote Quad **1994**, 25, 86–94.
- Hornik, K.; Stinchcombe, M.; White, H. Multilayer feedforward networks are universal approximators. Neural Netw. **1989**, 2, 359–366.
- Gers, F.A.; Schraudolph, N.N.; Schmidhuber, J. Learning precise timing with LSTM recurrent networks. J. Mach. Learn. Res. **2002**, 3, 115–143.
- Malhotra, P.; Vig, L.; Shroff, G.; Agarwal, P. Long short term memory networks for anomaly detection in time series. Proceedings **2015**, 89, 89–94.
- Cinar, Y.G.; Mirisaee, H.; Goswami, P.; Gaussier, E.; Ait-Bachir, A.; Strijov, V. Time series forecasting using RNNs: An extended attention mechanism to model periods and handle missing values. arXiv **2017**, arXiv:1703.10089.
- Laptev, N.; Yosinski, J.; Li, L.E.; Smyl, S. Time-series extreme event forecasting with neural networks at Uber. In Proceedings of the International Conference on Machine Learning, Sydney, Australia, 6–11 August 2017; Volume 34, pp. 1–5.
- Guo, T.; Lin, T.; Lu, Y. An interpretable LSTM neural network for autoregressive exogenous model. arXiv **2018**, arXiv:1804.05251.
- Lara-Benítez, P.; Carranza-García, M.; Riquelme, J.C. An Experimental Review on Deep Learning Architectures for Time Series Forecasting. Int. J. Neural Syst. **2021**, 31, 2130001.
- Pirani, M.; Thakkar, P.; Jivrani, P.; Bohara, M.H.; Garg, D. A Comparative Analysis of ARIMA, GRU, LSTM and BiLSTM on Financial Time Series Forecasting. In Proceedings of the 2022 IEEE International Conference on Distributed Computing and Electrical Circuits and Electronics (ICDCECE), Ballari, India, 23–24 April 2022; pp. 1–6.
- Guo, Z.; Wang, H. A deep graph neural network-based mechanism for social recommendations. IEEE Trans. Ind. Inform. **2020**, 17, 2776–2783.
- Chen, Y.; Ding, F.; Zhai, L. Multi-scale temporal features extraction based graph convolutional network with attention for multivariate time series prediction. Expert Syst. Appl. **2022**, 200, 117011.
- Sezer, O.B.; Gudelek, M.U.; Ozbayoglu, A.M. Financial time series forecasting with deep learning: A systematic literature review: 2005–2019. Appl. Soft Comput. **2020**, 90, 106181.
- Zhang, G.; Patuwo, B.E.; Hu, M.Y. Forecasting with artificial neural networks: The state of the art. Int. J. Forecast. **1998**, 14, 35–62.
- Tang, Y.; Song, Z.; Zhu, Y.; Yuan, H.; Hou, M.; Ji, J.; Tang, C.; Li, J. A survey on machine learning models for financial time series forecasting. Neurocomputing **2022**, 512, 363–380.
- Zhang, G.P. Time series forecasting using a hybrid ARIMA and neural network model. Neurocomputing **2003**, 50, 159–175.
- Hill, T.; O’Connor, M.; Remus, W. Neural network models for time series forecasts. Manag. Sci. **1996**, 42, 1082–1092.
- Makridakis, S.; Anderson, A.; Carbone, R.; Fildes, R.; Hibon, M.; Lewandowski, R.; Newton, J.; Parzen, E.; Winkler, R. The accuracy of extrapolation (time series) methods: Results of a forecasting competition. J. Forecast. **1982**, 1, 111–153.
- Gheyas, I.A.; Smith, L.S. A neural network approach to time series forecasting. In Proceedings of the World Congress on Engineering, London, UK, 1–3 July 2009; Volume 2, pp. 1–3.
- Khashei, M.; Bijari, M. An artificial neural network (p, d, q) model for timeseries forecasting. Expert Syst. Appl. **2010**, 37, 479–489.
- Yolcu, U.; Egrioglu, E.; Aladag, C.H. A new linear & nonlinear artificial neural network model for time series forecasting. Decis. Support Syst. **2013**, 54, 1340–1347.
- Zhang, G.P.; Kline, D.M. Quarterly time-series forecasting with neural networks. IEEE Trans. Neural Netw. **2007**, 18, 1800–1814.
- Hochreiter, S.; Schmidhuber, J. Long short-term memory. Neural Comput. **1997**, 9, 1735–1780.
- Gers, F.A.; Schmidhuber, J.; Cummins, F. Learning to forget: Continual prediction with LSTM. Neural Comput. **2000**, 12, 2451–2471.
- Siami-Namini, S.; Tavakoli, N.; Namin, A.S. The performance of LSTM and BiLSTM in forecasting time series. In Proceedings of the 2019 IEEE International Conference on Big Data (Big Data), Los Angeles, CA, USA, 9–12 December 2019; pp. 3285–3292.
- Kim, J.; Moon, N. BiLSTM model based on multivariate time series data in multiple field for forecasting trading area. J. Ambient. Intell. Humaniz. Comput. **2019**, 1–10.
- Yang, M.; Wang, J. Adaptability of Financial Time Series Prediction Based on BiLSTM. Procedia Comput. Sci. **2022**, 199, 18–25.
- Zhou, J.; Cui, G.; Hu, S.; Zhang, Z.; Yang, C.; Liu, Z.; Sun, M. Graph neural networks: A review of methods and applications. AI Open **2020**, 1, 57–81.
- Han, Y.; Karunasekera, S.; Leckie, C. Graph neural networks with continual learning for fake news detection from social media. arXiv **2020**, arXiv:2007.03316.
- Sanchez-Gonzalez, A.; Heess, N.; Springenberg, J.T.; Merel, J.; Riedmiller, M.; Hadsell, R.; Battaglia, P. Graph networks as learnable physics engines for inference and control. In Proceedings of the International Conference on Machine Learning, Stockholm, Sweden, 10–15 July 2018; pp. 4470–4479.
- Sperduti, A.; Starita, A. Supervised neural networks for the classification of structures. IEEE Trans. Neural Netw. **1997**, 8, 714–735.
- Gori, M.; Monfardini, G.; Scarselli, F. A new model for learning in graph domains. In Proceedings of the 2005 IEEE International Joint Conference on Neural Networks, Montreal, QC, Canada, 31 July–4 August 2005; Volume 2, pp. 729–734.
- Scarselli, F.; Gori, M.; Tsoi, A.C.; Hagenbuchner, M.; Monfardini, G. The graph neural network model. IEEE Trans. Neural Netw. **2008**, 20, 61–80.
- Gallicchio, C.; Micheli, A. Graph echo state networks. In Proceedings of the 2010 International Joint Conference on Neural Networks (IJCNN), Barcelona, Spain, 18–23 July 2010; pp. 1–8.
- Wu, Z.; Pan, S.; Chen, F.; Long, G.; Zhang, C.; Yu, P.S. A Comprehensive Survey on Graph Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. **2021**, 32, 4–24.
- Wu, Z.; Pan, S.; Long, G.; Jiang, J.; Chang, X.; Zhang, C. Connecting the dots: Multivariate time series forecasting with graph neural networks. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, Virtual, 6–10 July 2020; pp. 753–763.
- Deng, A.; Hooi, B. Graph neural network-based anomaly detection in multivariate time series. Proc. AAAI Conf. Artif. Intell. **2021**, 35, 4027–4035.
- Jiang, W.; Luo, J. Graph neural network for traffic forecasting: A survey. Expert Syst. Appl. **2022**, 207, 117921.
- Wang, J.; Zhang, S.; Xiao, Y.; Song, R. A review on graph neural network methods in financial applications. arXiv **2021**, arXiv:2111.15367.
- Ma, D.; Guo, Y.; Ma, S. Short-Term Subway Passenger Flow Prediction Based on GCN-BiLSTM. IOP Conf. Ser. Earth Environ. Sci. **2021**, 693, 012005.
- Wu, Z.; Huang, M.; Zhao, A. Traffic prediction based on GCN-LSTM model. J. Phys. Conf. Ser. **2021**, 1972, 012107.
- Li, Z.; Xiong, G.; Chen, Y.; Lv, Y.; Hu, B.; Zhu, F.; Wang, F.Y. A hybrid deep learning approach with GCN and LSTM for traffic flow prediction. In Proceedings of the 2019 IEEE Intelligent Transportation Systems Conference (ITSC), Auckland, New Zealand, 27–30 October 2019; pp. 1929–1933.
- Kriechbaumer, T.; Angus, A.; Parsons, D.; Casado, M.R. An improved wavelet–ARIMA approach for forecasting metal prices. Resour. Policy **2014**, 39, 32–41.
- Jiang, W. Applications of deep learning in stock market prediction: Recent progress. Expert Syst. Appl. **2021**, 184, 115537.
- Almasarweh, M.; Alwadi, S. ARIMA model in predicting banking stock market data. Mod. Appl. Sci. **2018**, 12, 309.
- Chung, R.C.; Ip, W.H.; Chan, S.L. An ARIMA-intervention analysis model for the financial crisis in China’s manufacturing industry. Int. J. Eng. Bus. Manag. **2009**, 1, 5.
- Bhardwaj, G.; Swanson, N.R. An empirical investigation of the usefulness of ARFIMA models for predicting macroeconomic and financial time series. J. Econom. **2006**, 131, 539–578.
- Eğri, E.; Günay, S. Bayesian model selection in ARFIMA models. Expert Syst. Appl. **2010**, 37, 8359–8364.
- Gong, X.; Si, Y.W.; Fong, S.; Biuk-Aghai, R.P. Financial time series pattern matching with extended UCR suite and support vector machine. Expert Syst. Appl. **2016**, 55, 284–296.
- Kristjanpoller, W.; Minutolo, M.C. Forecasting volatility of oil price using an artificial neural network-GARCH model. Expert Syst. Appl. **2016**, 65, 233–241.
- Ghezelbash, A. Predicting changes in stock index and gold prices to neural network approach. J. Math. Comput. Sci. **2012**, 4, 227–236.
- Dehghani, H.; Bogdanovic, D. Copper price estimation using bat algorithm. Resour. Policy **2018**, 55, 55–61.
- Malliaris, A.G.; Malliaris, M. Are oil, gold and the euro inter-related? Time series and neural network analysis. Rev. Quant. Financ. Account. **2013**, 40, 1–14.
- Monge, M.; Lazcano, A. Commodity Prices after COVID-19: Persistence and Time Trends. Risks **2022**, 10, 128.
- Liu, Q.; Liu, M.; Zhou, H.; Yan, F. A multi-model fusion based non-ferrous metal price forecasting. Resour. Policy **2022**, 77, 102714.
- Zhang, A.; Lipton, Z.C.; Li, M.; Smola, A.J. Dive into Deep Learning. arXiv **2020**, arXiv:2106.11342v3.
- Rumelhart, D.E.; Hinton, G.E.; Williams, R.J. Learning representations by back-propagating errors. Nature **1986**, 323, 533–536.
- Güler, N.F.; Übeyli, E.D.; Güler, I. Recurrent neural networks employing Lyapunov exponents for EEG signals classification. Expert Syst. Appl. **2005**, 29, 506–514.
- Fu, L. Rule generation from neural networks. IEEE Trans. Syst. Man Cybern. **1994**, 24, 1114–1124.
- Hinton, G.E.; Salakhutdinov, R.R. Reducing the dimensionality of data with neural networks. Science **2006**, 313, 504–507.
- Glorot, X.; Bengio, Y. Understanding the difficulty of training deep feedforward neural networks. In Proceedings of the Thirteenth International Conference on Artificial Intelligence and Statistics, Sardinia, Italy, 13–15 May 2010; pp. 249–256.
- Jarrett, K.; Kavukcuoglu, K.; Ranzato, M.A.; LeCun, Y. What is the best multi-stage architecture for object recognition? In Proceedings of the 2009 IEEE 12th International Conference on Computer Vision, Kyoto, Japan, 29 September–2 October 2009; pp. 2146–2153.
- Nair, V.; Hinton, G.E. Rectified linear units improve restricted Boltzmann machines. In Proceedings of the ICML 2010, Haifa, Israel, 21–24 June 2010.
- Zhao, L.; Song, Y.; Zhang, C.; Liu, Y.; Wang, P.; Lin, T.; Li, H. T-GCN: A temporal graph convolutional network for traffic prediction. IEEE Trans. Intell. Transp. Syst. **2019**, 21, 3848–3858.
- Li, Y.; Tarlow, D.; Brockschmidt, M.; Zemel, R. Gated graph sequence neural networks. arXiv **2015**, arXiv:1511.05493.
- Rajalakshmi, V.; Ganesh Vaidyanathan, S. Hybrid CNN-LSTM for Traffic Flow Forecasting. In Proceedings of the 2nd International Conference on Artificial Intelligence: Advances and Applications, Meknes, Morocco, 14–15 October 2022; Springer: Singapore, 2022; pp. 407–414.
- Lacasa, L.; Luque, B.; Ballesteros, F.; Luque, J.; Nuno, J.C. From time series to complex networks: The visibility graph. Proc. Natl. Acad. Sci. USA **2008**, 105, 4972–4975.
- Dickey, D.A.; Fuller, W.A. Distributions of the estimators for autoregressive time series with a unit root. J. Am. Stat. Assoc. **1979**, 74, 427–481.
- Phillips, P.C.; Perron, P. Testing for a unit root in time series regression. Biometrika **1988**, 75, 335–346.
- Frechtling, D.C. Practical Tourism Forecasting; Butterworth-Heinemann: Oxford, UK, 1996.
- Sun, R. Optimization for deep learning: Theory and algorithms. arXiv **2019**, arXiv:1912.08957.
- Goodfellow, I. NIPS 2016 tutorial: Generative adversarial networks. arXiv **2016**, arXiv:1701.00160.
- Akaike, H. Maximum likelihood identification of Gaussian autoregressive moving average models. Biometrika **1973**, 60, 255–265.
- Akaike, H. A Bayesian extension of the minimum AIC procedure of autoregressive model fitting. Biometrika **1979**, 66, 237–242.
- González Casimiro, M.P. Análisis de series temporales: Modelos ARIMA. 2009. Available online: http://hdl.handle.net/10810/12492 (accessed on 23 July 2022).
- Friedman, M. A comparison of alternative tests of significance for the problem of m rankings. Ann. Math. Stat. **1940**, 11, 86–92.
- Rosner, B.; Glynn, R.J.; Lee, M.L.T. The Wilcoxon signed rank test for paired comparisons of clustered data. Biometrics **2006**, 62, 185–192.
- Abbasimehr, H.; Paki, R.; Bahrini, A. A novel approach based on combining deep learning models with statistical methods for COVID-19 time series forecasting. Neural Comput. Appl. **2022**, 34, 3135–3149.
- Ensafi, Y.; Amin, S.H.; Zhang, G.; Shah, B. Time-series forecasting of seasonal items sales using machine learning: A comparative analysis. Int. J. Inf. Manag. Data Insights **2022**, 2, 100058.

**Figure 8.** Results from the models: (**a**) ARIMA; (**b**) PROPHET; (**c**) BiLSTM; (**d**) GCN-LSTM; (**e**) BiLSTM-GCN.

| Metrics | ADF Values | PP Values |
|---|---|---|
| Statistical test | −1.753 | −1.451 |
| p-value | 0.404 | 0.558 |
| Number of observations | 10,248 | 10,248 |
| Critical value (5%) | −2.862 | −2.860 |
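The core of the unit-root testing behind these values can be sketched in NumPy as the t-statistic of a simplified (non-augmented, no-constant) Dickey–Fuller regression Δy_t = ρ·y_{t−1} + e_t; the full ADF and Phillips–Perron tests used for the table above additionally include deterministic terms, lag augmentation and corrected standard errors, so this is illustrative only:

```python
import numpy as np

def dickey_fuller_stat(y):
    """t-statistic of the no-constant Dickey-Fuller regression
    dy_t = rho * y_{t-1} + e_t (simplified sketch, not the full test)."""
    y = np.asarray(y, dtype=float)
    dy, ylag = np.diff(y), y[:-1]
    rho = (ylag @ dy) / (ylag @ ylag)          # OLS slope
    resid = dy - rho * ylag
    s2 = (resid @ resid) / (len(dy) - 1)       # residual variance
    se = np.sqrt(s2 / (ylag @ ylag))           # standard error of rho
    return rho / se

rng = np.random.default_rng(0)
walk = np.cumsum(rng.normal(size=500))   # random walk: unit root present
noise = rng.normal(size=500)             # white noise: strongly stationary
stat_walk = dickey_fuller_stat(walk)
stat_noise = dickey_fuller_stat(noise)
```

A stationary series yields a large negative statistic (the null of a unit root is rejected), while a random walk yields a statistic near zero, which matches the non-rejection reported for the price series above.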

**Table 2.** Selected parameters for the combined model proposed in this work and the DNN models implemented for comparative purposes.

| Model | Learning Rate | Batch | Epoch | Hidden Layers | Neurons | Optimizer |
|---|---|---|---|---|---|---|
| BiLSTM | 0.001 | 25 | 100 | 3 | 50 | Adam |
| GCN-LSTM | 0.001 | 32 | 50 | 2/1 | 16,10/100 | Adam |
| BiLSTM-GCN | 0.001 | 32 | 50 | 2 | 10 | Adam |

| Neural Network | Time Complexity |
|---|---|
| BiLSTM | O(2(4ih + 4h^{2} + 3h + ho)) |
| GCN-LSTM | O(LA_{0}i + Lni^{2}) + O(W) |
| BiLSTM-GCN | O(ih + ho) |

**Table 4.** Model performance in terms of MSE, RMSE, R^{2}, MAPE and Time. The best values for each metric are in bold.

| Model | MSE | RMSE | R^{2} | MAPE | Time |
|---|---|---|---|---|---|
| ARIMA | 81.219 | 9.012 | 0.716 | 22.500 | 5.23 s |
| PROPHET | 134.050 | 11.580 | 0.834 | 14.380 | 6.43 s |
| BiLSTM | 16.100 | 4.012 | 0.951 | 7.750 | 2.57 s |
| GCN-LSTM | 17.829 | 4.220 | 0.947 | 7.630 | 2.66 s |
| BiLSTM-GCN | **15.610** | **3.850** | **0.955** | **7.410** | **2.10 s** |

| | Value |
|---|---|
| Statistical value | 280.303 |
| p-Value | 1.821 × 10^{−60} |

| Compared Data | Statistical Value | p-Value |
|---|---|---|
| WTI and BiLSTM | 19,441.0 | 0.024 |
| WTI and GCN-LSTM | 14,162.0 | 9.591 × 10^{−9} |
| WTI and BiLSTM-GCN | 21,371.0 | 0.322 |
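The Friedman statistic used to compare the models over blocks of observations can be sketched as follows (a NumPy sketch without tie correction; the toy error matrix is illustrative, not the paper's data):

```python
import numpy as np

def friedman_statistic(scores):
    """Friedman chi-square statistic for an (n blocks x k models) matrix,
    ranking within each block. No tie correction is applied."""
    scores = np.asarray(scores, dtype=float)
    n, k = scores.shape
    # Double argsort yields ranks 0..k-1 within each row (assumes no ties).
    ranks = scores.argsort(axis=1).argsort(axis=1) + 1.0
    mean_ranks = ranks.mean(axis=0)
    return 12.0 * n / (k * (k + 1)) * np.sum((mean_ranks - (k + 1) / 2.0) ** 2)

# Toy example: the model in column 0 always has the lowest error.
errors = np.array([[1.0, 2.0, 3.0],
                   [1.5, 2.5, 3.5],
                   [0.5, 1.5, 2.5]])
chi2 = friedman_statistic(errors)
```

When the rankings are identical across every block, as in the toy matrix, the statistic reaches its maximum for the given n and k, signalling a significant difference among the models.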

| Values | Statistical Value |
|---|---|
| Kurtosis | 47.72 |
| F-statistic | 81.14 |
| Prob (F-statistic) | 1.03 × 10^{−18} |


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Lazcano, A.; Herrera, P.J.; Monge, M.
A Combined Model Based on Recurrent Neural Networks and Graph Convolutional Networks for Financial Time Series Forecasting. *Mathematics* **2023**, *11*, 224.
https://doi.org/10.3390/math11010224
