Multi-Task Graph Attention Net for Electricity Consumption Prediction and Anomaly Detection
Abstract
1. Introduction
- This paper proposes a Multi-Task Graph Attention Net (MGAT) that addresses the dual characteristics of electricity consumption. By decomposing consumption data into high-entropy (fluctuation-driven) and low-entropy (quasi-periodic) components, the MGAT models the two kinds of patterns separately: the low-entropy components are forecasted by a Multi-Scale Attention Network (MSAT), while the high-entropy components are analyzed using temporal-sensitive fluctuation graphs that quantify consumption peaks and troughs.
- By constructing temporal environmental information graphs and fusing them with the temporal-sensitive fluctuation graphs, the MGAT explicitly captures the interplay between numerical consumption changes and external environmental information. A Hierarchical Graph Attention Autoencoder (HGAE) designed within the MGAT forecasts the high-entropy components from these fused graphs (an illustrative sketch of this fusion follows). Finally, the MGAT synthesizes all forecasted components for electricity consumption forecasting and anomaly detection.
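The exact graph construction and the HGAE architecture are specified in Section 3.2; as a rough illustration only, the sketch below fuses a fluctuation adjacency matrix with an environmental-information adjacency matrix via a weighted sum and applies a single graph-attention aggregation step. The function names (`fuse_graphs`, `gat_layer`), the weighted-sum fusion, and the single attention head are assumptions made for illustration, not the authors' implementation.

```python
import numpy as np

def fuse_graphs(a_fluct, a_env, alpha=0.5):
    """Blend a temporal-sensitive fluctuation graph with a temporal
    environmental-information graph (illustrative weighted-sum fusion)."""
    return alpha * a_fluct + (1.0 - alpha) * a_env

def gat_layer(x, adj, w, a, leaky=0.2):
    """One single-head graph-attention aggregation step (GAT-style).

    x   : (n, f_in)     node features
    adj : (n, n)        fused adjacency matrix (nonzero entries are edges)
    w   : (f_in, f_out) linear projection
    a   : (2 * f_out,)  attention vector
    """
    h = x @ w                                   # project node features
    f_out = h.shape[1]
    # e[i, j] = LeakyReLU(a^T [h_i || h_j]) for every ordered node pair
    e = (h @ a[:f_out])[:, None] + (h @ a[f_out:])[None, :]
    e = np.where(e > 0, e, leaky * e)
    e = np.where(adj > 0, e, -1e9)              # mask out non-edges
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)       # row-wise softmax over neighbours
    return att @ h                              # attention-weighted aggregation

# Toy usage: 4 nodes, 3 input features, 2 output features.
rng = np.random.default_rng(0)
a_fluct = (rng.random((4, 4)) > 0.5).astype(float)   # fluctuation graph
a_env = (rng.random((4, 4)) > 0.5).astype(float)     # environmental graph
np.fill_diagonal(a_fluct, 1.0)
np.fill_diagonal(a_env, 1.0)                         # keep self-loops
x = rng.normal(size=(4, 3))
out = gat_layer(x, fuse_graphs(a_fluct, a_env),
                rng.normal(size=(3, 2)), rng.normal(size=(4,)))
print(out.shape)  # (4, 2)
```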
2. Related Works
3. Research Methodology
3.1. Adaptive Consumption Decomposition and Low-Entropy Component Prediction
3.1.1. Adaptive Electricity Consumption Data Decomposition
1. Calculate the Euclidean distance $bd_{ij}$ between all pairs of length-$k$ subsequences $\mathrm{Comp}_{k}(i)$ and $\mathrm{Comp}_{k}(j)$ ($i \neq j$), and count the number of pairs $B_i$ that satisfy $bd_{ij} \leq r$.
2. Construct subsequences of length $2k$, denoted $\mathrm{Comp}_{2k}(i)$. Calculate the Euclidean distance $ad_{ij}$ between all pairs $\mathrm{Comp}_{2k}(i)$ and $\mathrm{Comp}_{2k}(j)$ ($i \neq j$), and count the number of pairs $A_i$ that satisfy $ad_{ij} \leq r$.
3. The SeqEntropy of this IMF component is then calculated from the pair counts $A_i$ and $B_i$ (see the sketch after this list).
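The paper's exact SeqEntropy expression is not reproduced here; as a rough reference only, the sketch below assumes it follows the standard sample-entropy form $-\ln\left(\sum_i A_i / \sum_i B_i\right)$ applied to the pair counts defined above. The function name `seq_entropy`, the tolerance choice $r = 0.2\,\sigma$, and the toy signals are illustrative assumptions.

```python
import numpy as np

def seq_entropy(imf, k, r):
    """Entropy of one IMF component from pair counts of length-k and
    length-2k subsequences (sample-entropy-style sketch)."""
    def pair_count(m):
        # All contiguous subsequences of length m.
        subs = np.array([imf[i:i + m] for i in range(len(imf) - m + 1)])
        # Euclidean distance between every pair of subsequences.
        d = np.linalg.norm(subs[:, None, :] - subs[None, :, :], axis=-1)
        mask = ~np.eye(len(subs), dtype=bool)    # exclude i == j pairs
        return np.sum((d <= r) & mask)

    b = pair_count(k)        # pairs within tolerance at length k
    a = pair_count(2 * k)    # pairs within tolerance at length 2k
    if a == 0 or b == 0:
        return np.inf        # no matching pairs: maximal irregularity
    return -np.log(a / b)

# Toy usage: a quasi-periodic signal should score lower (more regular)
# than white noise of the same length.
rng = np.random.default_rng(0)
t = np.arange(500)
periodic = np.sin(2 * np.pi * t / 24) + 0.05 * rng.normal(size=500)
noise = rng.normal(size=500)
print(seq_entropy(periodic, k=2, r=0.2 * np.std(periodic)))
print(seq_entropy(noise, k=2, r=0.2 * np.std(noise)))
```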
3.1.2. Low-Entropy Component Prediction
3.2. High-Entropy Component Prediction
3.2.1. Construct Temporal-Sensitive Fluctuation Graph
3.2.2. Construct Temporal Environmental Information Graphs
3.2.3. Construct HGAE for High-Entropy Component Prediction
3.3. Electricity Consumption Forecasting and Anomaly Detection Based on All Components
4. Research Results
4.1. Dataset and Experiment Settings
1. The first electricity consumption dataset is obtained from Tetouan, Morocco (Electric Power Consumption, https://www.kaggle.com/datasets/fedesoriano/electric-power-consumption, accessed on 1 August 2022) (Data_kaggle) [21]. The dataset consists of 52,416 observations of energy consumption recorded in 10 min windows. Every observation is described by the following feature columns (a minimal loading sketch follows this list).
   - Date Time: time window of ten minutes.
   - Temperature: weather temperature.
   - Humidity: weather humidity.
   - Wind Speed: wind speed.
   - General Diffuse Flows: “diffuse flow” is a catch-all term for low-temperature (<0.2 °C to ~100 °C) fluids that slowly discharge through sulfide mounds, fractured lava flows, and assemblages of bacterial mats and macrofauna.
   - Zone 1 Power Consumption.
   - Zone 2 Power Consumption.
   - Zone 3 Power Consumption.
2. The second electricity consumption dataset is obtained from the China Southern Power Grid’s operational jurisdiction (https://pan.baidu.com/s/1b3S-EBYeaiIcNwHGBUtSYw?pwd=ggtv, accessed on 5 June 2025) and covers seven administrative divisions (Data_Southern). The selected time period includes the COVID-19 pandemic era, during which the changes in electricity consumption can effectively test the model’s predictive and anomaly detection capabilities. The dataset contains the aggregate regional electricity consumption, a three-tier industrial breakdown, and nine granular subsectors spanning manufacturing, logistics, and service industries.
3. For the anomaly detection:
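To make the Data_kaggle feature layout concrete, the snippet below loads the Kaggle CSV with pandas and separates the environmental features from the three zone-level consumption targets. The file name `powerconsumption.csv` and the exact column headers are assumptions about the downloaded dataset, not taken from the paper; adjust them to match the actual CSV.

```python
import pandas as pd

# Illustrative loading of the Tetouan dataset
# (https://www.kaggle.com/datasets/fedesoriano/electric-power-consumption).
# File name and column headers below are assumptions; adjust to the CSV.
df = pd.read_csv("powerconsumption.csv", parse_dates=["Datetime"])
df = df.set_index("Datetime").sort_index()

# Environmental features and zone-level targets from the feature list above.
env_cols = ["Temperature", "Humidity", "WindSpeed", "GeneralDiffuseFlows"]
target_cols = ["PowerConsumptionZone1",
               "PowerConsumptionZone2",
               "PowerConsumptionZone3"]

print(df.shape)                    # expected: 52,416 rows at 10 min resolution
print(df[env_cols].describe())     # summary of the environmental drivers
print(df[target_cols].head())      # first few zone consumption readings
```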
4.2. Baselines
4.3. Validating the Effectiveness of the MGAT
4.3.1. Validating on Data_kaggle
- A.
- Electricity Consumption Prediction
- B.
- Electricity Consumption Anomaly Detection
4.3.2. Validating on Data_Southern
- A.
- Electricity Consumption Prediction
- B.
- Electricity Consumption Anomaly Detection
4.4. Ablation Experiments for MGAT
4.5. Discussion
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Yildiz, B.; Bilbao, J.I.; Sproul, A.B. A review and analysis of regression and machine learning models on commercial building electricity load forecasting. Renew. Sustain. Energy Rev. 2017, 73, 1104–1122. [Google Scholar] [CrossRef]
- González, A.M.; Roque, A.S.; García-González, J. Modeling and forecasting electricity prices with input/output hidden Markov models. IEEE Trans. Power Syst. 2005, 20, 13–24. [Google Scholar] [CrossRef]
- Munkhammar, J.; van der Meer, D.; Widén, J. Very short term load forecasting of residential electricity consumption using the Markov-chain mixture distribution (MCM) model. Appl. Energy 2021, 282, 116180. [Google Scholar] [CrossRef]
- Velasquez, C.E.; Zocatelli, M.; Estanislau, F.B.; Castro, V.F. Analysis of time series models for Brazilian electricity consumption demand forecasting. Energy 2022, 247, 123483. [Google Scholar] [CrossRef]
- Gomez, W.; Wang, F.K.; Amogne, Z.E. Electricity Load and Price Forecasting Using a Hybrid Method Based Bidirectional Long Short-Term Memory with Attention Mechanism Model. Int. J. Energy Res. 2023, 2023, 3815063. [Google Scholar] [CrossRef]
- Zhang, X.M.; Grolinger, K.; Capretz, M.A.M.; Seewald, L. Forecasting Residential Energy Consumption: Single Household Perspective. In Proceedings of the 17th IEEE International Conference on Machine Learning and Applications, Orlando, FL, USA, 17–20 December 2018. [Google Scholar]
- Imani, M. Electrical load-temperature CNN for residential load forecasting. Energy 2021, 227, 120480. [Google Scholar] [CrossRef]
- Wang, S.; Wang, X.; Wang, S.; Wang, D. Bi-directional long short-term memory method based on attention mechanism and rolling update for short-term load forecasting. Int. J. Electr. Power Energy Syst. 2019, 109, 470–479. [Google Scholar] [CrossRef]
- Torres, J.F.; Martínez-Álvarez, F.; Troncoso, A. A deep LSTM network for the Spanish electricity consumption demand forecasting. Neural Comput. Appl. 2022, 34, 10533–10545. [Google Scholar] [CrossRef]
- Kim, N.; Choi, J. LSTM Based Short-term Electricity Consumption Forecast with Daily Load Profile Sequences. In Proceedings of the IEEE 7th Global Conference on Consumer Electronics, Nara, Japan, 9–12 October 2018. [Google Scholar]
- Zheng, J.; Xu, C.; Zhang, Z.; Li, X. Electric load forecasting in smart grids using Long-Short-Term-Memory based Recurrent Neural Network. In Proceedings of the 51st Annual Conference on Information Sciences and Systems (CISS), Baltimore, MD, USA, 22–24 March 2017. [Google Scholar]
- Yang, Y.; Gao, Y.; Wang, Z.; Li, X.A.; Zhou, H.; Wu, J. Multiscale-integrated deep learning approaches for short-term load forecasting. Int. J. Mach. Learn. Cybern. 2024, 15, 6061–6076. [Google Scholar] [CrossRef]
- Feng, S.; Miao, C.; Xu, K.; Wu, J.; Wu, P.; Zhang, Y. Multi-scale attention flow for probabilistic time series forecasting. IEEE Trans. Knowl. Data Eng. 2023, 36, 2056–2068. [Google Scholar] [CrossRef]
- Xu, A.; Guo, Y.; Wu, T.; Wang, X.; Jiang, Q.; Zhang, Y. Household Electricity Consumption Forecast Based on Bi-LSTM. Ind. Control Comput. 2020, 33, 11–13. [Google Scholar]
- Dedinec, A.; Filiposka, S.; Dedinec, A.; Kocarev, L. Deep belief network based electricity load forecasting: An analysis of Macedonian case. Energy 2016, 115, 1688–1700. [Google Scholar] [CrossRef]
- Tsai, C.L.; Chen, W.T.; Chang, C.S. Polynomial-Fourier series model for analyzing and predicting electricity consumption in buildings. Energy Build. 2016, 127, 301–312. [Google Scholar] [CrossRef]
- Gashler, M.; Ashmore, S. Modeling time series data with deep Fourier neural networks. Neurocomputing 2016, 188, 3–11. [Google Scholar] [CrossRef]
- Liu, X.; Li, S.; Gao, M. A discrete time-varying grey Fourier model with fractional order terms for electricity consumption forecast. Energy 2024, 296, 131065. [Google Scholar] [CrossRef]
- Imani, M.; Ghassemian, H. Lagged Load Wavelet Decomposition and LSTM Networks for Short-Term Load Forecasting. In Proceedings of the 4th International Conference on Pattern Recognition and Image Analysis, Tehran, Iran, 6–7 March 2019. [Google Scholar]
- Chang, S.; Zhang, Y.; Han, W.; Yu, M.; Guo, X.; Tan, W.; Cui, X.; Witbrock, M.; Hasegawa-Johnson, M.; Huang, T.S. Dilated Recurrent Neural Networks. Annu. Conf. Neural Inf. Process. Syst. 2017, 1, 77–87. [Google Scholar]
- Available online: https://www.kaggle.com/datasets/fedesoriano/electric-power-consumption (accessed on 1 August 2022).
- Zhen, H.; Niu, D.; Wang, K.; Shi, Y. Photovoltaic Power Forecasting Based on GA improved Bi-LSTM in Microgrid without Meteorological Information. Energy 2021, 231, 120908. [Google Scholar] [CrossRef]
- Ma, Z.; Mei, G. A hybrid attention-based deep learning approach for wind power prediction. Appl. Energy 2022, 323, 119608. [Google Scholar] [CrossRef]
- Liu, Y.; Guan, L.; Hou, C.; Han, H.; Liu, Z.; Sun, Y.; Zheng, M. Wind Power Short-Term Prediction Based on LSTM and Discrete Wavelet Transform. Appl. Sci. 2019, 9, 1108. [Google Scholar] [CrossRef]
- Meng, Z.; Xie, Y.; Sun, J. Short-term load forecasting using neural attention model based on EMD. Electr. Eng. 2022, 104, 1857–1866. [Google Scholar] [CrossRef]
- Wang, Z.; Pei, C.; Ma, M.; Wang, X.; Li, Z.; Pei, D.; Rajmohan, S.; Zhang, D.; Lin, Q.; Zhang, H.; et al. Revisiting vae for unsupervised time series anomaly detection: A frequency perspective. Proc. ACM Web Conf. 2024, 2024, 3096–3105. [Google Scholar]
- Wu, J.; Zhang, H.; Qiang, M. An attention graph stacked autoencoder for anomaly detection of electro-mechanical actuator using spatio-temporal multivariate signal. Chin. J. Aeronaut. 2024, 37, 506–520. [Google Scholar]
| Models | Zone 1 | Zone 2 | Zone 3 |
| --- | --- | --- | --- |
| Bi-LSTM | 0.0199 | 0.0260 | 0.0243 |
| SVR | 0.0223 | 0.0272 | 0.0267 |
| Attention | 0.0203 | 0.0249 | 0.0235 |
| GA-BLSTM | 0.0194 | 0.0255 | 0.0217 |
| DFT-Attention | 0.0192 | 0.0223 | 0.0204 |
| MS-Attention | 0.0193 | 0.0225 | 0.0199 |
| EMD-OSVR | 0.0165 | 0.0202 | 0.0187 |
| EMD-Attention | 0.0164 | 0.0213 | 0.0177 |
| MGAT | 0.0148 | 0.0191 | 0.0173 |
| Models | Accuracy (20-10) | Accuracy (10-10) | Accuracy (20-5) |
| --- | --- | --- | --- |
| Bi-LSTM | 0.902 | 0.901 | 0.892 |
| SVR | 0.855 | 0.852 | 0.838 |
| Attention | 0.903 | 0.913 | 0.890 |
| GA-BLSTM | 0.909 | 0.912 | 0.891 |
| DFT-Attention | 0.913 | 0.919 | 0.899 |
| MS-Attention | 0.921 | 0.924 | 0.901 |
| EMD-OSVR | 0.934 | 0.939 | 0.904 |
| EMD-Attention | 0.955 | 0.962 | 0.923 |
| VAE-AD | 0.961 | 0.964 | 0.939 |
| GAE-AD | 0.971 | 0.972 | 0.936 |
| MGAT | 0.983 | 0.983 | 0.941 |
| Models | Data_kaggle | Data_Southern |
| --- | --- | --- |
| Bi-LSTM | 0.0419 | 0.0429 |
| SVR | 0.0474 | 0.0431 |
| Attention | 0.0441 | 0.0425 |
| GA-BLSTM | 0.0406 | 0.0424 |
| DFT-Attention | 0.0401 | 0.0411 |
| MS-Attention | 0.0399 | 0.0411 |
| EMD-OSVR | 0.0395 | 0.0411 |
| EMD-Attention | 0.0384 | 0.0403 |
| MGAT | 0.0373 | 0.0400 |
| Models | Area_1 | Area_2 | Area_3 | Area_4 | Area_5 | Area_6 | Area_7 |
| --- | --- | --- | --- | --- | --- | --- | --- |
| Bi-LSTM | 0.0413 | 0.0389 | 0.0592 | 0.0320 | 0.0409 | 0.0305 | 0.0307 |
| SVR | 0.0423 | 0.0407 | 0.0693 | 0.0419 | 0.0511 | 0.0416 | 0.0391 |
| Attention | 0.0415 | 0.0385 | 0.0591 | 0.0317 | 0.0401 | 0.0302 | 0.0305 |
| GA-BLSTM | 0.0411 | 0.0386 | 0.0587 | 0.0314 | 0.0402 | 0.0301 | 0.0304 |
| DFT-Attention | 0.0404 | 0.0392 | 0.0562 | 0.0314 | 0.0400 | 0.0304 | 0.0301 |
| MS-Attention | 0.0401 | 0.0389 | 0.0557 | 0.0309 | 0.0392 | 0.0302 | 0.0301 |
| EMD-OSVR | 0.0383 | 0.0377 | 0.0527 | 0.0301 | 0.0394 | 0.0302 | 0.0302 |
| EMD-Attention | 0.0381 | 0.0384 | 0.0501 | 0.0301 | 0.0381 | 0.0301 | 0.0299 |
| MGAT | 0.0372 | 0.0372 | 0.0497 | 0.0294 | 0.0376 | 0.0296 | 0.0297 |
| Models | Industrial | Primary Sector | Secondary Sector | Tertiary Sector | Manufacturing | Household |
| --- | --- | --- | --- | --- | --- | --- |
| Bi-LSTM | 0.0404 | 0.0366 | 0.0384 | 0.0401 | 0.0409 | 0.0461 |
| SVR | 0.0472 | 0.0371 | 0.0426 | 0.0424 | 0.0421 | 0.0501 |
| Attention | 0.0405 | 0.0367 | 0.0385 | 0.0406 | 0.0409 | 0.0477 |
| GA-BLSTM | 0.0401 | 0.0365 | 0.0383 | 0.0398 | 0.0407 | 0.0456 |
| DFT-Attention | 0.0401 | 0.0368 | 0.0381 | 0.0395 | 0.0405 | 0.0449 |
| MS-Attention | 0.0397 | 0.0362 | 0.0379 | 0.0391 | 0.0402 | 0.0444 |
| EMD-OSVR | 0.0394 | 0.0356 | 0.0375 | 0.0387 | 0.0386 | 0.0441 |
| EMD-Attention | 0.0391 | 0.0342 | 0.0373 | 0.0385 | 0.0390 | 0.0434 |
| MGAT | 0.0384 | 0.0344 | 0.0375 | 0.0383 | 0.0384 | 0.0429 |
| Models | Accuracy (20-10) | Accuracy (10-10) | Accuracy (20-5) |
| --- | --- | --- | --- |
| Bi-LSTM | 0.917 | 0.914 | 0.891 |
| SVR | 0.813 | 0.821 | 0.808 |
| Attention | 0.881 | 0.885 | 0.873 |
| GA-BLSTM | 0.921 | 0.918 | 0.903 |
| DFT-Attention | 0.949 | 0.951 | 0.919 |
| MS-Attention | 0.950 | 0.953 | 0.921 |
| EMD-OSVR | 0.951 | 0.955 | 0.921 |
| EMD-Attention | 0.972 | 0.969 | 0.933 |
| VAE-AD | 0.964 | 0.971 | 0.935 |
| GAE-AD | 0.965 | 0.977 | 0.935 |
| MGAT | 0.980 | 0.983 | 0.943 |