A Reinforced, Event-Driven, and Attention-Based Convolution Spiking Neural Network for Multivariate Time Series Prediction
Abstract
1. Introduction
- By incorporating the advantages of the Gramian Angular Field (GAF) and rate encoding techniques, the proposed spike-encoding strategy effectively represents the temporal patterns and inter-variable dependencies in MTS (a minimal encoding sketch is given after this list).
- The proposed Leaky-Integrate-and-Fire (LIF)-based pooling strategy is shown, both theoretically and experimentally, to extract more features from the regions of interest in a spike image than average-pooling strategies.
- The redesigned Spike-based Convolutional Block Attention Mechanism (SCBAM) makes its weighting operations event-driven while retaining a strong capability to capture the spatio-temporal correlations encoded in the spike images.
- Experiments on multiple MTS data sets show that the performance of our model rivals, and in some cases surpasses, that of CNN-, RNN-, and SNN-based techniques, while consuming less energy.
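As a rough illustration of the first contribution, the sketch below combines GAF imaging with Bernoulli rate coding to turn a normalized window of a single variable into a sequence of binary spike frames. The function and parameter names are illustrative, and the paper's GAFR encoder may differ in detail (e.g., in how the field is rescaled or how many time steps are used).

```python
import numpy as np

def gaf_rate_encode(series, num_steps=20, rng=None):
    """Illustrative GAF + rate encoding of a 1-D window into spike frames.

    1. Rescale the window to [-1, 1] and map each value to a polar angle.
    2. Build the Gramian Angular Summation Field G[i, j] = cos(phi_i + phi_j).
    3. Treat the rescaled field as firing probabilities and draw Bernoulli
       spikes for `num_steps` time steps (rate coding).
    """
    rng = rng or np.random.default_rng(0)
    x = np.asarray(series, dtype=float)
    x = 2 * (x - x.min()) / (x.max() - x.min() + 1e-12) - 1   # rescale to [-1, 1]
    phi = np.arccos(np.clip(x, -1, 1))
    gaf = np.cos(phi[:, None] + phi[None, :])                  # (T, T) GAF image
    p = (gaf + 1) / 2                                          # map [-1, 1] -> [0, 1]
    spikes = rng.random((num_steps, *p.shape)) < p             # (num_steps, T, T)
    return spikes.astype(np.uint8)

# Example: encode a nine-step window of closing prices into 20 spike frames.
window = np.array([10.2, 10.5, 10.4, 10.9, 11.1, 10.8, 11.3, 11.6, 11.4])
spike_frames = gaf_rate_encode(window, num_steps=20)
print(spike_frames.shape)  # (20, 9, 9)
```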
2. Related Work
2.1. ANN-Based Models
2.2. SNN-Based Models
3. Preliminary Work
3.1. GAF
3.2. Rate Coding
3.3. LIF
3.4. CBAM
4. Methodology
4.1. GAFR Coding
4.2. LIF-Pooling
- Case 1:
- Case 2:
- Case 3:
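A minimal sketch of the idea behind LIF-based pooling is given below: each pooling window feeds a leaky integrate-and-fire neuron, which accumulates the spikes inside the window, leaks over time, and emits an output spike when its membrane potential crosses a threshold. The window size, decay constant, threshold, and hard reset are assumptions for illustration, not the exact configuration used in REAT-CSNN.

```python
import numpy as np

def lif_pool_2x2(spike_frames, tau=2.0, v_th=1.0):
    """Hypothetical 2x2 LIF-based pooling over a sequence of binary spike frames."""
    T, H, W = spike_frames.shape
    out = np.zeros((T, H // 2, W // 2), dtype=np.uint8)
    v = np.zeros((H // 2, W // 2))                     # membrane potentials
    for t in range(T):
        # Sum each 2x2 window -> input current for the corresponding pooled unit.
        frame = spike_frames[t, : H // 2 * 2, : W // 2 * 2].astype(float)
        current = frame.reshape(H // 2, 2, W // 2, 2).sum(axis=(1, 3))
        v = v * (1 - 1 / tau) + current                # leaky integration
        fired = v >= v_th                              # threshold crossing
        out[t] = fired
        v = np.where(fired, 0.0, v)                    # hard reset after a spike
    return out
```

Unlike average pooling, which dilutes sparse activity, the LIF unit keeps accumulating evidence across time steps, so regions of interest that fire persistently are more likely to be passed on.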
4.3. SCBAM
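The sketch below outlines a spike-based variant of CBAM in the spirit of SCBAM: channel attention (SCAM) followed by spatial attention (SSAM), with the sigmoid gates of the original CBAM replaced by threshold units so that the attention masks stay binary and event-driven. The class and parameter names are hypothetical, and only the forward pass is shown (training would additionally require a surrogate gradient for the threshold).

```python
import torch
import torch.nn as nn

class SpikeGate(nn.Module):
    """Threshold unit standing in for the sigmoid, keeping the mask binary."""
    def __init__(self, v_th=0.5):
        super().__init__()
        self.v_th = v_th

    def forward(self, x):
        return (x >= self.v_th).float()

class SCBAMSketch(nn.Module):
    """Minimal spike-based CBAM: channel attention (SCAM), then spatial attention (SSAM)."""
    def __init__(self, channels, reduction=4):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(channels, channels // reduction),
            nn.ReLU(),
            nn.Linear(channels // reduction, channels),
        )
        self.spatial_conv = nn.Conv2d(2, 1, kernel_size=7, padding=3)
        self.gate = SpikeGate()

    def forward(self, x):                                   # x: (B, C, H, W) spike maps
        # SCAM: channel mask from average- and max-pooled channel descriptors.
        avg = self.mlp(x.mean(dim=(2, 3)))
        mx = self.mlp(x.amax(dim=(2, 3)))
        ch_mask = self.gate(avg + mx).unsqueeze(-1).unsqueeze(-1)
        x = x * ch_mask
        # SSAM: spatial mask from channel-wise average and max maps.
        sp = torch.cat([x.mean(dim=1, keepdim=True),
                        x.amax(dim=1, keepdim=True)], dim=1)
        sp_mask = self.gate(self.spatial_conv(sp))
        return x * sp_mask

# Example: apply the attention block to a batch of sparse spike feature maps.
feats = (torch.rand(2, 8, 9, 9) > 0.7).float()
out = SCBAMSketch(channels=8)(feats)
print(out.shape)  # torch.Size([2, 8, 9, 9])
```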
5. Experimental Results and Analysis
5.1. Tasks and Data Sets
- Stock prediction: We predict the direction of stock prices by taking into account the correlation between different stocks. For example, stocks in the same sector, such as banking or healthcare, are likely to move in the same direction and react to the market in a similar way. To this end, we collected data from five gaming companies (Activision, Blizzard, Electronic Arts, Nintendo, and Tencent) over a period of 3094 days, from 4 January 2010 to 4 April 2022. The data set includes the daily date, opening price, closing price, highest price, lowest price, and transaction volume, totaling 15,470 records. To standardize the prediction across all five companies, we normalized each feature in the data set. Specifically, our prediction objective is to use all of the features from the first nine time steps to forecast the closing price at the tenth time step (the preprocessing sketch after this list illustrates this windowing).
- PM2.5 prediction: We predict the rise and fall of PM2.5 levels in Beijing using the PM2.5 data [52] provided by the US Embassy in Beijing from 1 January 2010 to 31 December 2014, which consists of 43,824 records. This data set includes 12 features, such as the year, month, day, hour, dew point, temperature, PM2.5 concentration, wind direction, air pressure, wind speed, snowfall, and precipitation. For convenience, we merge the year, month, day, and hour into a single time feature, and we delete any record with missing values. Similarly, we use the preceding nine time steps to predict the PM2.5 concentration at the next time step.
- Air quality prediction: We predict the rise and fall of pollutant gas concentration to monitor air quality using the data set collected by an air quality chemical multisensor device deployed on highly polluted roads in an Italian city from March 2004 to February 2005 [53]. This data set records the average hourly concentrations of five types of polluting gases, including hydrocarbons, benzene, and nitrogen oxides. It has 9358 observations with 15 features, such as time, temperature, relative humidity, absolute humidity, and the polluting gas concentrations. As before, we delete records with missing values.
- Air pollution prediction: We predict the rise and fall of the emission gas concentration to track air pollution using the gas turbine emission data set. This data set contains 36,722 observations collected by 11 sensors located in northwest Turkey. The data span a period of 5 years, from 1 January 2011 to 31 December 2015, with a sampling frequency of one hour. Each record includes the ambient temperature, air pressure, humidity, air filter difference pressure, gas turbine exhaust pressure, turbine inlet temperature, turbine after temperature, compressor discharge pressure, turbine energy yield, and two emission gas concentrations, totaling 11 features. It is worth noting that, although the observations do not include an explicit time feature, the temporal order is preserved because the records are strictly sorted by time. As above, we delete records with missing values.
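Across all four tasks the preprocessing follows the same pattern: normalize each feature, drop records with missing values, and slide a nine-step window over the series to predict the target value at the tenth step. A minimal sketch with stand-in data is shown below; the target column index and array shapes are illustrative.

```python
import numpy as np

def min_max_normalize(data):
    """Column-wise min-max normalization to [0, 1]."""
    lo, hi = data.min(axis=0), data.max(axis=0)
    return (data - lo) / (hi - lo + 1e-12)

def make_windows(data, lookback=9, target_col=0):
    """Split a (num_records, num_features) array into nine-step inputs and next-step targets."""
    X, y = [], []
    for i in range(len(data) - lookback):
        X.append(data[i:i + lookback])            # nine consecutive time steps
        y.append(data[i + lookback, target_col])  # value at the tenth step
    return np.stack(X), np.array(y)

# Example with random stand-in data (records x features).
raw = np.random.rand(100, 5)
X, y = make_windows(min_max_normalize(raw), lookback=9)
print(X.shape, y.shape)  # (91, 9, 5) (91,)
```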
5.2. Experimental Setting
- RNN: A recurrent neural network with internal memory, which allows it to capture temporal correlations in a sequence. We used a simple one-layer RNN model (a minimal definition of the recurrent baselines is sketched after this list).
- LSTM: A variant of RNN that introduces a gating mechanism with the advantage of handling long-term dependencies. In this paper, we used a simple one-layer LSTM model.
- GRU: Another variant of RNN, which has a higher computational efficiency than LSTM. Similar to RNN and LSTM, we used a one-layer GRU model.
- GCN [54]: Operates directly on graph-structured data built from the relationships between stocks.
- LSNN [55]: Integrates a neuronal adaptation mechanism into a recurrent SNN (RSNN), enabling it to capture dynamic processes over large time scales, including neuronal excitation and inhibition, spike frequency, and inter-spike intervals.
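For reference, a minimal definition of the one-layer recurrent baselines (RNN, LSTM, GRU) is sketched below; the hidden size and output head are illustrative, as the exact hyperparameters used in the experiments are not reproduced here.

```python
import torch
import torch.nn as nn

class OneLayerRecurrent(nn.Module):
    """One-layer recurrent baseline (RNN / LSTM / GRU) for next-step prediction."""
    def __init__(self, cell="rnn", in_dim=5, hidden=16):
        super().__init__()
        rnn_cls = {"rnn": nn.RNN, "lstm": nn.LSTM, "gru": nn.GRU}[cell]
        self.rnn = rnn_cls(in_dim, hidden, num_layers=1, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                  # x: (batch, 9, in_dim)
        out, _ = self.rnn(x)               # out: (batch, 9, hidden)
        return self.head(out[:, -1])       # predict the value at the next step

# Example: forecast from a batch of nine-step windows with five features each.
model = OneLayerRecurrent(cell="lstm")
pred = model(torch.rand(4, 9, 5))          # (4, 1)
```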
5.3. Results
5.4. Ablation
- REAT-CSNN without any attention modules;
- REAT-CSNN when only considering SSAM;
- REAT-CSNN when only considering SCAM;
- REAT-CSNN with the order of SSAM and SCAM in SCBAM switched.
| SCAM | SSAM | Stock (Max-Pooling) | Stock (LIF-Pooling) | PM2.5 (Max-Pooling) | PM2.5 (LIF-Pooling) |
|---|---|---|---|---|---|
| ✗ | ✗ | | | | |
| ✗ | ✓ | | | | |
| ✓ | ✗ | | | | |
| ✓ | ✓ | | | | |
5.4.1. Pooling Strategy
5.4.2. Attention Modules
5.5. Energy Efficiency
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Full Form |
|---|---|
| CSNN | Convolution Spiking Neural Network |
| GAFR | Gramian Angular Field Rate |
| LIF | Leaky-Integrate-and-Fire |
| LIF-pooling | LIF-based Pooling |
| LIF-Conv | LIF-based Convolution |
| MTS | Multivariate Time Series |
| REAT-CSNN | Reinforced Event-driven Attention CSNN |
| SCBAM | Spike-based Convolutional Block Attention Mechanism |
| SSAM | Spike-based Spatial Attention Module |
| SCAM | Spike-based Channel Attention Module |
References
- Wu, Y.; Zhao, R.; Zhu, J.; Chen, F.; Xu, M.; Li, G.; Song, S.; Deng, L.; Wang, G.; Zheng, H.; et al. Brain-inspired global-local learning incorporated with neuromorphic computing. Nat. Commun. 2022, 13, 65. [Google Scholar] [CrossRef] [PubMed]
- Xu, M.; Liu, F.; Hu, Y.; Li, H.; Wei, Y.; Zhong, S.; Pei, J.; Deng, L. Adaptive synaptic scaling in spiking networks for continual learning and enhanced robustness. IEEE Trans. Neural Netw. Learn. Syst. 2024, 36, 5151–5165. [Google Scholar] [CrossRef] [PubMed]
- Xu, M.; Zheng, H.; Pei, J.; Deng, L. A unified structured framework for AGI: Bridging cognition and neuromorphic computing. In International Conference on Artificial General Intelligence; Springer: Cham, Switzerland, 2023; pp. 345–356. [Google Scholar]
- Maass, W. Networks of spiking neurons: The third generation of neural network models. Neural Netw. 1997, 10, 1659–1671. [Google Scholar] [CrossRef]
- Ghosh-Dastidar, S.; Adeli, H. Third generation neural networks: Spiking neural networks. In Advances in Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2009; pp. 167–178. [Google Scholar]
- Yamazaki, K.; Vo-Ho, V.K.; Bulsara, D.; Le, N. Spiking neural networks and their applications: A Review. Brain Sci. 2022, 12, 863. [Google Scholar] [CrossRef]
- Zhang, T.; Jia, S.; Cheng, X.; Xu, B. Tuning Convolutional Spiking Neural Network With Biologically Plausible Reward Propagation. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 7621–7631. [Google Scholar] [CrossRef]
- Zhang, A.; Li, X.; Gao, Y.; Niu, Y. Event-Driven Intrinsic Plasticity for Spiking Convolutional Neural Networks. IEEE Trans. Neural Netw. Learn. Syst. 2022, 33, 1986–1995. [Google Scholar] [CrossRef]
- Islam, R.; Majurski, P.; Kwon, J.; Tummala, S.R.S.K. Exploring High-Level Neural Networks Architectures for Efficient Spiking Neural Networks Implementation. In Proceedings of the 2023 3rd International Conference on Robotics, Electrical and Signal Processing Techniques (ICREST), Dhaka, Bangladesh, 7–8 January 2023; pp. 212–216. [Google Scholar] [CrossRef]
- Saunders, D.J.; Siegelmann, H.T.; Kozma, R.; Ruszinkó, M. STDP learning of image patches with convolutional spiking neural networks. In Proceedings of the 2018 International Joint Conference on Neural Networks (IJCNN), Rio de Janeiro, Brazil, 8–13 July 2018; pp. 1–7. [Google Scholar]
- Fei, X.; Jianping, L.; Jie, T.; Guangshuo, W. Image Recognition Algorithm Based on Spiking Neural Network. In Proceedings of the 2022 19th International Computer Conference on Wavelet Active Media Technology and Information Processing (ICCWAMTIP), Chengdu, China, 16–18 December 2022; pp. 1–5. [Google Scholar] [CrossRef]
- Zhou, R. A Method of Converting ANN to SNN for Image Classification. In Proceedings of the 2023 IEEE 3rd International Conference on Electronic Technology, Communication and Information (ICETCI), Changchun, China, 26–28 May 2023; pp. 819–822. [Google Scholar] [CrossRef]
- Li, J.; Hu, W.; Yuan, Y.; Huo, H.; Fang, T. Bio-inspired deep spiking neural network for image classification. In Proceedings of the Neural Information Processing: 24th International Conference, ICONIP 2017, Guangzhou, China, 14–18 November 2017; Proceedings, Part II 24. Springer: Berlin/Heidelberg, Germany, 2017; pp. 294–304. [Google Scholar]
- Reid, D.; Hussain, A.J.; Tawfik, H. Financial time series prediction using spiking neural networks. PLoS ONE 2014, 9, e103656. [Google Scholar] [CrossRef]
- Kasabov, N.; Capecci, E. Spiking neural network methodology for modelling, classification and understanding of EEG spatio-temporal data measuring cognitive processes. Inf. Sci. 2015, 294, 565–575. [Google Scholar] [CrossRef]
- Xing, Y.; Zhang, L.; Hou, Z.; Li, X.; Shi, Y.; Yuan, Y.; Zhang, F.; Liang, S.; Li, Z.; Yan, L. Accurate ECG classification based on spiking neural network and attentional mechanism for real-time implementation on personal portable devices. Electronics 2022, 11, 1889. [Google Scholar] [CrossRef]
- Maji, P.; Patra, R.; Dhibar, K.; Mondal, H.K. SNN Based Neuromorphic Computing Towards Healthcare Applications. In Internet of Things. Advances in Information and Communication Technology, Proceedings of the 6th IFIP International Cross-Domain Conference, IFIPIoT 2023, Denton, TX, USA, 2–3 November 2023; Springer: Cham, Switzerland, 2023; pp. 261–271. [Google Scholar]
- Gaurav, R.; Stewart, T.C.; Yi, Y. Reservoir based spiking models for univariate Time Series Classification. Front. Comput. Neurosci. 2023, 17, 1148284. [Google Scholar] [CrossRef]
- Fang, H.; Shrestha, A.; Qiu, Q. Multivariate time series classification using spiking neural networks. In Proceedings of the 2020 International Joint Conference on Neural Networks (IJCNN), Glasgow, UK, 19–24 July 2020; pp. 1–7. [Google Scholar]
- Sharma, V.; Srinivasan, D. A spiking neural network based on temporal encoding for electricity price time series forecasting in deregulated markets. In Proceedings of the 2010 International Joint Conference on Neural Networks (IJCNN), Barcelona, Spain, 18–23 July 2010; pp. 1–8. [Google Scholar]
- Liu, Q.; Long, L.; Peng, H.; Wang, J.; Yang, Q.; Song, X.; Riscos-Núñez, A.; Pérez-Jiménez, M.J. Gated spiking neural P systems for time series forecasting. IEEE Trans. Neural Netw. Learn. Syst. 2021, 34, 6227–6236. [Google Scholar] [CrossRef] [PubMed]
- Fukumori, K.; Yoshida, N.; Sugano, H.; Nakajima, M.; Tanaka, T. Epileptic Spike Detection by Recurrent Neural Networks with Self-Attention Mechanism. In Proceedings of the ICASSP 2022—2022 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Singapore, 23–27 May 2022; pp. 1406–1410. [Google Scholar] [CrossRef]
- Samadzadeh, A.; Far, F.S.T.; Javadi, A.; Nickabadi, A.; Chehreghani, M.H. Convolutional spiking neural networks for spatio-temporal feature extraction. Neural Process. Lett. 2023, 55, 6979–6995. [Google Scholar] [CrossRef]
- Ponghiran, W.; Roy, K. Spiking neural networks with improved inherent recurrence dynamics for sequential learning. In Proceedings of the AAAI Conference on Artificial Intelligence, Online, 22 February–1 March 2022; Volume 36, pp. 8001–8008. [Google Scholar]
- Elsayed, N.; Maida, A.S.; Bayoumi, M. Gated Recurrent Neural Networks Empirical Utilization for Time Series Classification. In Proceedings of the 2019 International Conference on Internet of Things (iThings) and IEEE Green Computing and Communications (GreenCom) and IEEE Cyber, Physical and Social Computing (CPSCom) and IEEE Smart Data (SmartData), Atlanta, GA, USA, 14–17 July 2019; pp. 1207–1210. [Google Scholar] [CrossRef]
- Xia, M.; Shao, H.; Ma, X.; de Silva, C.W. A stacked GRU-RNN-based approach for predicting renewable energy and electricity load for smart grid operation. IEEE Trans. Ind. Inform. 2021, 17, 7050–7059. [Google Scholar] [CrossRef]
- Gautam, A.; Singh, V. CLR-based deep convolutional spiking neural network with validation based stopping for time series classification. Appl. Intell. 2020, 50, 830–848. [Google Scholar] [CrossRef]
- Shih, S.Y.; Sun, F.K.; Lee, H.y. Temporal pattern attention for multivariate time series forecasting. Mach. Learn. 2019, 108, 1421–1441. [Google Scholar] [CrossRef]
- Choi, Y.; Lim, H.; Choi, H.; Kim, I.J. GAN-Based Anomaly Detection and Localization of Multivariate Time Series Data for Power Plant. In Proceedings of the 2020 IEEE International Conference on Big Data and Smart Computing (BigComp), Pusan, Republic of Korea, 19–22 February 2020; pp. 71–74. [Google Scholar] [CrossRef]
- Auge, D.; Hille, J.; Mueller, E.; Knoll, A. A survey of encoding techniques for signal processing in spiking neural networks. Neural Process. Lett. 2021, 53, 4693–4710. [Google Scholar] [CrossRef]
- George, A.M.; Dey, S.; Banerjee, D.; Mukherjee, A.; Suri, M. Online time-series forecasting using spiking reservoir. Neurocomputing 2023, 518, 82–94. [Google Scholar] [CrossRef]
- Long, L.; Liu, Q.; Peng, H.; Wang, J.; Yang, Q. Multivariate time series forecasting method based on nonlinear spiking neural P systems and non-subsampled shearlet transform. Neural Netw. 2022, 152, 300–310. [Google Scholar] [CrossRef]
- Guarda, L.; Tapia, J.E.; Droguett, E.L.; Ramos, M. A novel Capsule Neural Network based model for drowsiness detection using electroencephalography signals. Expert Syst. Appl. 2022, 201, 116977. [Google Scholar] [CrossRef]
- Jiang, P.; Zou, C. Convolution neural network with multiple pooling strategies for speech emotion recognition. In Proceedings of the 2022 6th International Symposium on Computer Science and Intelligent Control (ISCSIC), Beijing, China, 11–13 November 2022; pp. 89–92. [Google Scholar] [CrossRef]
- More, Y.; Dumbre, K.; Shiragapur, B. Horizontal Max Pooling a Novel Approach for Noise Reduction in Max Pooling for Better Feature Detect. In Proceedings of the 2023 International Conference on Emerging Smart Computing and Informatics (ESCI), Pune, India, 1–3 March 2023; pp. 1–5. [Google Scholar] [CrossRef]
- Gong, P.; Wang, P.; Zhou, Y.; Zhang, D. A Spiking Neural Network With Adaptive Graph Convolution and LSTM for EEG-Based Brain-Computer Interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 1440–1450. [Google Scholar] [CrossRef]
- Rao, A.; Plank, P.; Wild, A.; Maass, W. A long short-term memory for AI applications in spike-based neuromorphic hardware. Nat. Mach. Intell. 2022, 4, 467–479. [Google Scholar] [CrossRef]
- Chen, C.; Li, K.; Teo, S.G.; Chen, G.; Zou, X.; Yang, X.; Vijay, R.C.; Feng, J.; Zeng, Z. Exploiting spatio-temporal correlations with multiple 3d convolutional neural networks for citywide vehicle flow prediction. In Proceedings of the 2018 IEEE International Conference on Data Mining (ICDM), Singapore, 17–20 November 2018; pp. 893–898. [Google Scholar]
- Wang, K.; Li, K.; Zhou, L.; Hu, Y.; Cheng, Z.; Liu, J.; Chen, C. Multiple convolutional neural networks for multivariate time series prediction. Neurocomputing 2019, 360, 107–119. [Google Scholar] [CrossRef]
- Zhang, J.; Dai, Q. MrCAN: Multi-relations aware convolutional attention network for multivariate time series forecasting. Inf. Sci. 2023, 643, 119277. [Google Scholar] [CrossRef]
- Qin, Y.; Song, D.; Chen, H.; Cheng, W.; Jiang, G.; Cottrell, G. A dual-stage attention-based recurrent neural network for time series prediction. arXiv 2017, arXiv:1704.02971. [Google Scholar]
- Liu, Y.; Gong, C.; Yang, L.; Chen, Y. DSTP-RNN: A dual-stage two-phase attention-based recurrent neural network for long-term and multivariate time series prediction. Expert Syst. Appl. 2020, 143, 113082. [Google Scholar] [CrossRef]
- Xiao, Y.; Yin, H.; Zhang, Y.; Qi, H.; Zhang, Y.; Liu, Z. A dual-stage attention-based Conv-LSTM network for spatio-temporal correlation and multivariate time series prediction. Int. J. Intell. Syst. 2021, 36, 2036–2057. [Google Scholar] [CrossRef]
- Fu, E.; Zhang, Y.; Yang, F.; Wang, S. Temporal self-attention-based Conv-LSTM network for multivariate time series prediction. Neurocomputing 2022, 501, 162–173. [Google Scholar] [CrossRef]
- Tan, Q.; Ye, M.; Yang, B.; Liu, S.; Ma, A.J.; Yip, T.C.F.; Wong, G.L.H.; Yuen, P. Data-gru: Dual-attention time-aware gated recurrent unit for irregular multivariate time series. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 930–937. [Google Scholar]
- Chen, W.; Jiang, M.; Zhang, W.G.; Chen, Z. A novel graph convolutional feature based convolutional neural network for stock trend prediction. Inf. Sci. 2021, 556, 67–94. [Google Scholar] [CrossRef]
- Yan, F.; Liu, W.; Dong, F.; Hirota, K. A quantum-inspired online spiking neural network for time-series predictions. Nonlinear Dyn. 2023, 111, 15201–15213. [Google Scholar] [CrossRef]
- Maciąg, P.S.; Kasabov, N.; Kryszkiewicz, M.; Bembenik, R. Air pollution prediction with clustering-based ensemble of evolving spiking neural networks and a case study for London area. Environ. Model. Softw. 2019, 118, 262–280. [Google Scholar] [CrossRef]
- Capizzi, G.; Sciuto, G.L.; Napoli, C.; Woźniak, M.; Susi, G. A spiking neural network-based long-term prediction system for biogas production. Neural Netw. 2020, 129, 271–279. [Google Scholar] [CrossRef] [PubMed]
- Wang, Z.; Oates, T. Imaging time-series to improve classification and imputation. arXiv 2015, arXiv:1506.00327. [Google Scholar]
- Woo, S.; Park, J.; Lee, J.Y.; Kweon, I.S. Cbam: Convolutional block attention module. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 3–19. [Google Scholar]
- Chen, S. Beijing PM2.5 Data. UCI Machine Learning Repository. 2017. Available online: https://archive.ics.uci.edu/dataset/381/beijing+pm2+5+data (accessed on 12 March 2025).
- Vito, S. Air Quality. UCI Machine Learning Repository. 2016. Available online: https://archive.ics.uci.edu/dataset/360/air+quality (accessed on 12 March 2025).
- Kipf, T.N.; Welling, M. Semi-supervised classification with graph convolutional networks. arXiv 2016, arXiv:1609.02907. [Google Scholar]
- Bellec, G.; Salaj, D.; Subramoney, A.; Legenstein, R.; Maass, W. Long short-term memory and learning-to-learn in networks of spiking neurons. Adv. Neural Inf. Process. Syst. 2018, 31, 795–805. [Google Scholar]
- Kim, S.; Park, S.; Na, B.; Yoon, S. Spiking-yolo: Spiking neural network for energy-efficient object detection. In Proceedings of the AAAI Conference on Artificial Intelligence, New York, NY, USA, 7–12 February 2020; Volume 34, pp. 11270–11277. [Google Scholar]
| Network | Param (K) | T | Stock | PM2.5 | Air Quality | Air Pollution |
|---|---|---|---|---|---|---|
| RNN (one-layer) | 3.54 | 9 | | | | |
| LSTM (one-layer) | 5.95 | 9 | | | | |
| GRU (one-layer) | 4.48 | 9 | | | | |
| GCN [54] | 15.92 | 9 | | | | |
| LSNN [55] | 3.6 | 9 | | | | |
| REAT-CSNN (ours) | 3.57 | 9 | | | | |
| Model | FLOPs | Float32 | Int32 |
|---|---|---|---|
| | 89,984 | 206,963.2 | 143,974.4 |
| | 179,968 | 413,926.4 | 287,948.8 |
| | 224,960 | 517,408 | 359,936 |
| | 126,624 | 291,235.2 | 201,667.2 |
| | 5377 | 4839.3 | 537.7 |
| | 84,714.3 | 381,214.35 | 84,714.3 |
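Energy comparisons of this kind are usually obtained by multiplying operation counts by per-operation energy costs. The sketch below uses the widely cited 45 nm estimates (Horowitz, ISSCC 2014): roughly 4.6 pJ per 32-bit floating-point MAC, 0.9 pJ per floating-point accumulate, 3.2 pJ per 32-bit integer MAC, and 0.1 pJ per integer accumulate, and it assumes that an SNN layer only performs an accumulate when an input spike actually arrives. The exact accounting behind the table above may differ.

```python
# Per-operation energy costs in picojoules (45 nm estimates; assumptions for illustration).
E_FP32_MAC, E_FP32_AC = 4.6, 0.9
E_INT32_MAC, E_INT32_AC = 3.2, 0.1

def ann_energy_pj(macs, dtype="fp32"):
    """ANN layers are dominated by multiply-accumulate operations."""
    return macs * (E_FP32_MAC if dtype == "fp32" else E_INT32_MAC)

def snn_energy_pj(synaptic_ops, spike_rate, dtype="fp32"):
    """SNN layers only accumulate when a presynaptic spike arrives."""
    e_ac = E_FP32_AC if dtype == "fp32" else E_INT32_AC
    return synaptic_ops * spike_rate * e_ac

# Example: one million synaptic operations at a 10% firing rate.
print(ann_energy_pj(1_000_000), snn_energy_pj(1_000_000, spike_rate=0.1))
```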