Energies
  • Article
  • Open Access

23 January 2026

Enhanced Transformer for Multivariate Load Forecasting: Timestamp Embedding and Convolution-Augmented Attention

1 China Electric Power Research Institute, Beijing 100892, China
2 School of Computer Science (National Pilot Software Engineering School), Haidian Campus, Beijing University of Posts and Telecommunications, Beijing 100876, China
* Author to whom correspondence should be addressed.
Energies 2026, 19(3), 596; https://doi.org/10.3390/en19030596
This article belongs to the Section F5: Artificial Intelligence and Smart Energy

Abstract

To address the insufficient capture of temporal dependencies and the weak coupling of external factors in multivariate load forecasting, this paper proposes a Transformer model that integrates timestamp-based positional embedding with convolution-augmented attention. The timestamp-based positional embedding strengthens the model's temporal representation, the convolution-augmented attention improves local contextual modeling, and the architecture deeply fuses load data with external factors such as temperature, humidity, and electricity price. Experiments on a full-year 2018 load dataset from a German region show that the proposed model outperforms both single-factor and multi-factor LSTM baselines in short-term (24 h) and long-term (cross-month) forecasting. These results confirm the model's accuracy and stability in multivariate load forecasting and provide technical support for smart-grid load dispatching.
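The two components named in the abstract can be illustrated with a minimal NumPy sketch. This is not the authors' implementation (the paper's code is not reproduced here); it is a hedged toy version under two assumptions: (1) "timestamp-based positional embedding" is read as per-field lookup tables for calendar attributes (hour, day of week, month) summed per time step, and (2) "convolution-augmented attention" is read as self-attention whose queries and keys are first mixed by a causal local window before the global attention step. All names (`timestamp_embedding`, `conv_augmented_attention`, the kernel size) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model = 16

# Assumed timestamp embedding: one learned-style lookup table per calendar
# field; the per-field vectors are summed for each time step.
hour_emb = rng.normal(scale=0.1, size=(24, d_model))
dow_emb = rng.normal(scale=0.1, size=(7, d_model))
month_emb = rng.normal(scale=0.1, size=(12, d_model))

def timestamp_embedding(hours, dows, months):
    """Sum hour / day-of-week / month embeddings for each time step."""
    return hour_emb[hours] + dow_emb[dows] + month_emb[months]

def conv_augmented_attention(x, kernel=3):
    """Single-head self-attention with a causal local-window average
    applied to queries and keys, injecting local context before the
    global attention step (a simplification of conv-augmented attention)."""
    T, d = x.shape
    pad = np.vstack([np.zeros((kernel - 1, d)), x])          # causal left-pad
    local = np.stack([pad[t:t + kernel].mean(axis=0) for t in range(T)])
    q, k, v = local, local, x
    scores = q @ k.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)           # row-wise softmax
    return weights @ v

# 48 hourly steps spanning two January days; `load` stands in for the
# projected multivariate features (load, temperature, humidity, price).
hours = np.arange(48) % 24
dows = (np.arange(48) // 24) % 7
months = np.zeros(48, dtype=int)
load = rng.normal(size=(48, d_model))

x = load + timestamp_embedding(hours, dows, months)
out = conv_augmented_attention(x)
print(out.shape)  # (48, 16)
```

The design point the abstract makes is visible even in this sketch: the calendar lookup gives the model explicit daily/weekly periodicity rather than an abstract sequence index, and the local convolution lets each attention query carry short-range context that plain dot-product attention lacks.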
