QSA-QConvLSTM: A Quantum Computing-Based Approach for Spatiotemporal Sequence Prediction
Abstract
1. Introduction
2. Related Work
2.1. Spatiotemporal Sequence Forecasting
2.2. Quantum Computing in Machine Learning
3. Method
3.1. Quantum Self-Attention Mechanism Process
3.2. Quantum Implementation of the Linear Mapping
3.3. Quantum Implementation of the Convolutional Mapping
- Encoding Layer: Converts classical data into quantum state representations.
- Convolution Kernel Layer: Uses parameterized quantum gates to perform feature mapping.
- Dynamic Adjustment Layer: Adapts the number of quantum bits and encoding schemes based on task requirements.
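To make the three-layer structure above concrete, here is a minimal sketch of a parameterized quantum convolution kernel written in PennyLane. The 2×2 patch size, the qubit and layer counts, the `AngleEmbedding` encoding, and the `StronglyEntanglingLayers` ansatz are illustrative assumptions, not the authors' exact circuit; the dynamic adjustment layer is reflected only in the configurable `N_QUBITS` and `N_LAYERS` constants.

```python
# A minimal sketch of a quantum convolution kernel (assumptions: 2x2 patches,
# angle encoding, StronglyEntanglingLayers ansatz -- not the paper's circuit).
import numpy as np
import pennylane as qml

N_QUBITS = 4   # one qubit per pixel of a 2x2 patch (illustrative)
N_LAYERS = 2   # depth of the parameterized kernel (illustrative)

dev = qml.device("default.qubit", wires=N_QUBITS)

@qml.qnode(dev)
def qconv_kernel(patch, weights):
    # Encoding layer: classical pixel values -> quantum state via Y rotations.
    qml.AngleEmbedding(patch, wires=range(N_QUBITS), rotation="Y")
    # Convolution kernel layer: trainable rotations plus entangling gates.
    qml.StronglyEntanglingLayers(weights, wires=range(N_QUBITS))
    # Readout: one expectation value as the feature-map response.
    return qml.expval(qml.PauliZ(0))

def qconv2d(image, weights, patch=2):
    """Slide the quantum kernel over the image with stride equal to patch size."""
    h, w = image.shape
    out = np.empty((h // patch, w // patch))
    for i in range(0, h - patch + 1, patch):
        for j in range(0, w - patch + 1, patch):
            pixels = image[i:i + patch, j:j + patch].reshape(-1)
            # Rescale pixels in [0, 1] to rotation angles in [0, pi].
            out[i // patch, j // patch] = qconv_kernel(np.pi * pixels, weights)
    return out

# Example: a random 4x4 frame and randomly initialized kernel parameters.
rng = np.random.default_rng(0)
shape = qml.StronglyEntanglingLayers.shape(n_layers=N_LAYERS, n_wires=N_QUBITS)
weights = rng.uniform(0, 2 * np.pi, shape)
frame = rng.random((4, 4))
print(qconv2d(frame, weights))  # 2x2 quantum feature map
```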
3.4. Model Methodology
3.5. Quantum Circuit Design
4. Experiment
4.1. Moving MNIST Dataset
4.2. Experimental Results
4.3. ERA5 Dataset Experiment and Computational Complexity Analysis
5. Discussion
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| QSA | quantum self-attention |
| QConvLSTM | quantum convolutional LSTM |
| QCod | quantum coding |
References
- Shi, X.; Chen, Z.; Wang, H.; Yeung, D.Y.; Wong, W.K.; Woo, W.C. Convolutional LSTM network: A machine learning approach for precipitation nowcasting. Adv. Neural Inf. Process. Syst. 2015, 28, 802–810.
- Ma, Y.; Tresp, V.; Zhao, L.; Wang, Y. Variational quantum circuit model for knowledge graph embedding. Adv. Quantum Technol. 2019, 2, 1800078.
- Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30.
- Ma, Y.; Tresp, V. Quantum machine learning algorithm for knowledge graphs. ACM Trans. Quantum Comput. 2021, 2, 1–28.
- Lin, Z.; Feng, M.; Santos, C.N.D.; Yu, M.; Xiang, B.; Zhou, B.; Bengio, Y. A structured self-attentive sentence embedding. arXiv 2017, arXiv:1703.03130.
- Devlin, J.; Chang, M.W.; Lee, K.; Toutanova, K. BERT: Pre-training of deep bidirectional transformers for language understanding. arXiv 2018, arXiv:1810.04805.
- Farhi, E.; Goldstone, J.; Gutmann, S. A quantum approximate optimization algorithm. arXiv 2014, arXiv:1411.4028.
- Cong, I.; Choi, S.; Lukin, M.D. Quantum convolutional neural networks. Nat. Phys. 2019, 15, 1273–1278.
- Giovagnoli, A.; Tresp, V.; Ma, Y.; Schubert, M. QNEAT: Natural evolution of variational quantum circuit architecture. In Proceedings of the Companion Conference on Genetic and Evolutionary Computation, Lisbon, Portugal, 15–19 July 2023; pp. 647–650.
- Medsker, L.R.; Jain, L.C. Recurrent Neural Networks: Design and Applications; CRC Press: Boca Raton, FL, USA, 2001.
- Graves, A. Long short-term memory. In Supervised Sequence Labelling with Recurrent Neural Networks; Springer: Berlin/Heidelberg, Germany, 2012; pp. 37–45.
- Graves, A. Generating sequences with recurrent neural networks. arXiv 2013, arXiv:1308.0850.
- Cho, K.; Van Merriënboer, B.; Gulcehre, C.; Bahdanau, D.; Bougares, F.; Schwenk, H.; Bengio, Y. Learning phrase representations using RNN encoder-decoder for statistical machine translation. arXiv 2014, arXiv:1406.1078.
- Shi, X.; Gao, Z.; Lausen, L.; Wang, H.; Yeung, D.Y.; Wong, W.K.; Woo, W.C. Deep learning for precipitation nowcasting: A benchmark and a new model. Adv. Neural Inf. Process. Syst. 2017, 30.
- Wang, Y.; Long, M.; Wang, J.; Gao, Z.; Yu, P.S. PredRNN: Recurrent neural networks for predictive learning using spatiotemporal LSTMs. In Proceedings of the 31st International Conference on Neural Information Processing Systems (NIPS'17), Long Beach, CA, USA, 4–9 December 2017; Curran Associates Inc.: Red Hook, NY, USA, 2017; pp. 879–888.
- Wang, Y.; Long, M.; Wang, J.; Gao, Z.; Yu, P.S. PredRNN: A recurrent neural network for spatiotemporal predictive learning. IEEE Trans. Pattern Anal. Mach. Intell. 2022, 45, 2208–2225.
- Rebentrost, P.; Mohseni, M.; Lloyd, S. Quantum support vector machine for big data classification. Phys. Rev. Lett. 2014, 113, 130503.
- Dong, D.; Chen, C.; Li, H.; Tarn, T.J. Quantum reinforcement learning. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2008, 38, 1207–1220.
- Wei, S.; Chen, Y.; Zhou, Z.; Long, G. A quantum convolutional neural network on NISQ devices. AAPPS Bull. 2022, 32, 1–11.
- Li, G.; Zhao, X.; Wang, X. Quantum self-attention neural networks for text classification. Sci. China Inf. Sci. 2024, 67, 142501.
- Shi, S.; Wang, Z.; Li, J.; Li, Y.; Shang, R.; Zheng, H.; Zhong, G.; Gu, Y. A natural NISQ model of quantum self-attention mechanism. arXiv 2023, arXiv:2305.15680.
- Hersbach, H.; Bell, B.; Berrisford, P.; Hirahara, S.; Horányi, A.; Muñoz-Sabater, J.; Nicolas, J.; Peubey, C.; Radu, R.; Schepers, D.; et al. The ERA5 global reanalysis. Q. J. R. Meteorol. Soc. 2020, 146, 1999–2049.
- Sun, Y.; Wu, Z.; Ma, Y.; Tresp, V. Quantum architecture search with unsupervised representation learning. arXiv 2024, arXiv:2401.11576.
- Sun, Y.; Liu, J.; Wu, Z.; Ding, Z.; Ma, Y.; Seidl, T.; Tresp, V. SA-DQAS: Self-attention enhanced differentiable quantum architecture search. arXiv 2024, arXiv:2406.08882.
Effect of quantum circuit depth and qubit count on prediction performance (Moving MNIST):

| Circuit Structure | MSE | SSIM |
|---|---|---|
| 1 Layer, 2 Qubits | 96.8 | 0.715 |
| 2 Layers, 4 Qubits | 95.6 | 0.718 |
| 3 Layers, 6 Qubits | 95.9 | 0.717 |
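For reference, here is a minimal sketch of how the per-frame MSE and SSIM scores reported in these tables are commonly computed, using NumPy and scikit-image. The normalization (summing squared pixel errors per frame, then averaging over the prediction horizon) and the data range of 1.0 are assumptions, since the paper's evaluation script is not shown here.

```python
# A minimal sketch of per-frame MSE/SSIM evaluation (the normalization
# convention is an assumption, not the paper's exact protocol).
import numpy as np
from skimage.metrics import structural_similarity as ssim

def evaluate_sequence(pred, target):
    """pred, target: (T, H, W) float arrays of frames with pixel values in [0, 1]."""
    # Per-frame MSE: sum of squared errors over pixels, averaged over frames.
    mse = np.mean([np.sum((p - t) ** 2) for p, t in zip(pred, target)])
    # Per-frame SSIM with data range 1.0, averaged over the prediction horizon.
    s = np.mean([ssim(p, t, data_range=1.0) for p, t in zip(pred, target)])
    return mse, s

# Example with random stand-in frames (10 predicted 64x64 frames).
rng = np.random.default_rng(0)
pred = rng.random((10, 64, 64))
target = rng.random((10, 64, 64))
print(evaluate_sequence(pred, target))
```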
Comparison with baseline models on the Moving MNIST dataset:

| Model | MSE | SSIM |
|---|---|---|
| ConvLSTM | 101.1 | 0.705 |
| TrajGRU | 96.7 | 0.715 |
| PredRNN | 61.6 | 0.878 |
| PredRNN v2 | 51.9 | 0.891 |
| QSA-QConvLSTM | 44.3 | 0.906 |
Ablation of the quantum convolution (QConv) and quantum self-attention (QSA) modules on Moving MNIST:

| Method | MSE | SSIM |
|---|---|---|
| ConvLSTM | 101.1 | 0.705 |
| w QConv, w/o QSA | 95.6 | 0.718 |
| w/o QConv, w QSA | 46.2 | 0.897 |
| w QConv, w QSA | 44.3 | 0.906 |
Ablation of the QConv and QSA modules on the ERA5 dataset:

| Method | MSE | SSIM |
|---|---|---|
| ConvLSTM | 98.3 | 0.730 |
| w QConv, w/o QSA | 87.9 | 0.753 |
| w/o QConv, w QSA | 83.2 | 0.775 |
| w QConv, w QSA | 74.5 | 0.786 |