Article

A Pretrained Spatio-Temporal Hypergraph Transformer for Multi-Stock Trend Forecasting

1 School of Mathematics and Statistics, Wuhan University of Technology, Wuhan 430070, China
2 School of Computer Science, Guangdong University of Education, Guangzhou 510303, China
* Author to whom correspondence should be addressed.
Mathematics 2025, 13(10), 1565; https://doi.org/10.3390/math13101565
Submission received: 6 April 2025 / Revised: 4 May 2025 / Accepted: 7 May 2025 / Published: 9 May 2025
(This article belongs to the Section E1: Mathematics and Computer Science)

Abstract

Predicting stock trends has garnered extensive attention from investors and researchers due to its potential to optimize stock investment returns. Stock price fluctuations are complex and influenced by multiple factors, presenting two major challenges: the first lies in modeling the temporal dependence of individual stocks together with the spatial correlation among multiple stocks; the second stems from the insufficient historical data available for newly listed stocks. To address these challenges, this paper proposes a spatio-temporal hypergraph transformer (STHformer). The proposed model employs a temporal encoder with an aggregation module to capture temporal patterns, uses self-attention to dynamically generate hyperedges, and applies cross-attention to implement hypergraph-associated convolution. Furthermore, the model is pretrained by reconstructing masked sequences. This framework enhances the model's cold-start capability, making it more adaptable to newly listed stocks with insufficient training data. Experimental results show that the proposed model, after pretraining on data from over two thousand stocks, performs well on datasets from the United States and Chinese stock markets.
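The abstract names four components: a temporal encoder with an aggregation module, attention-based dynamic hyperedge generation, cross-attention hypergraph convolution, and masked-sequence pretraining. The following is a minimal, hypothetical PyTorch sketch of how such components might be wired together for multi-stock inputs; all class names, layer choices, and dimensions (e.g. STHformerSketch, num_hyperedges, the mean-pooling aggregation) are illustrative assumptions, not the authors' implementation, and the masked-sequence pretraining objective is omitted.

# Hypothetical sketch (not the authors' code): a minimal spatio-temporal
# hypergraph transformer in PyTorch for inputs of shape
# (num_stocks, lookback_window, num_features). Only the forward
# architecture is illustrated; the pretraining objective is left out.
import torch
import torch.nn as nn

class STHformerSketch(nn.Module):
    def __init__(self, num_features=6, d_model=64, num_hyperedges=16, nhead=4):
        super().__init__()
        # Temporal encoder: a Transformer encoder over each stock's feature
        # sequence, with mean pooling as a simple stand-in for the paper's
        # aggregation module.
        self.input_proj = nn.Linear(num_features, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.temporal_encoder = nn.TransformerEncoder(layer, num_layers=2)
        # Learnable hyperedge queries: attention between stock embeddings and
        # these queries stands in for dynamic hyperedge generation and
        # hypergraph-associated convolution.
        self.hyperedge_queries = nn.Parameter(torch.randn(num_hyperedges, d_model))
        self.stock_to_edge = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.edge_to_stock = nn.MultiheadAttention(d_model, nhead, batch_first=True)
        self.head = nn.Linear(d_model, 1)  # per-stock trend score

    def forward(self, x):
        # x: (num_stocks, lookback_window, num_features)
        h = self.temporal_encoder(self.input_proj(x))   # temporal patterns per stock
        stock_emb = h.mean(dim=1).unsqueeze(0)          # (1, num_stocks, d_model)
        edges = self.hyperedge_queries.unsqueeze(0)     # (1, num_hyperedges, d_model)
        # Stocks -> hyperedges: build hyperedge representations ...
        edge_repr, _ = self.stock_to_edge(edges, stock_emb, stock_emb)
        # ... hyperedges -> stocks: propagate group information back.
        stock_out, _ = self.edge_to_stock(stock_emb, edge_repr, edge_repr)
        return self.head(stock_out).squeeze(0).squeeze(-1)  # (num_stocks,)

if __name__ == "__main__":
    model = STHformerSketch()
    prices = torch.randn(100, 30, 6)   # 100 stocks, 30-day window, 6 features
    print(model(prices).shape)         # torch.Size([100])

In this sketch the two attention passes (stocks to hyperedges, then hyperedges back to stocks) approximate a hypergraph convolution in which group-level structure is formed dynamically rather than read from a fixed incidence matrix.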
Keywords: stock trend prediction; self-supervised pretraining; hypergraph neural network; time series analysis

Share and Cite

MDPI and ACS Style

Wu, Y.; Xie, L.; Wan, H.; Xu, H. A Pretrained Spatio-Temporal Hypergraph Transformer for Multi-Stock Trend Forecasting. Mathematics 2025, 13, 1565. https://doi.org/10.3390/math13101565

AMA Style

Wu Y, Xie L, Wan H, Xu H. A Pretrained Spatio-Temporal Hypergraph Transformer for Multi-Stock Trend Forecasting. Mathematics. 2025; 13(10):1565. https://doi.org/10.3390/math13101565

Chicago/Turabian Style

Wu, Yuchen, Liang Xie, Hongyang Wan, and Haijiao Xu. 2025. "A Pretrained Spatio-Temporal Hypergraph Transformer for Multi-Stock Trend Forecasting" Mathematics 13, no. 10: 1565. https://doi.org/10.3390/math13101565

APA Style

Wu, Y., Xie, L., Wan, H., & Xu, H. (2025). A Pretrained Spatio-Temporal Hypergraph Transformer for Multi-Stock Trend Forecasting. Mathematics, 13(10), 1565. https://doi.org/10.3390/math13101565

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.

