# Mid- to Long-Term Electric Load Forecasting Based on the EMD–Isomap–Adaboost Model


## Abstract

Compared with the single Adaboost algorithm, the EMDIA model reduces MAE by 11.58, MAPE by 0.13%, and RMSE by 49.93 and increases R² by 0.04.

## 1. Introduction

- (1) By applying the idea of “decomposition and integration”, EMD is used to decompose the load and obtain its features at different frequencies. Through correlation analysis, the multiple IMFs are integrated into two categories: a trend item and a fluctuation item.
- (2) This is the first study to suggest using EMD–Isomap to decompose meteorological and economic features and then reduce the dimensionality to select high-quality features.

## 2. Related Work

## 3. Applied Methodologies

#### 3.1. The Principle of Adaboost

- (1) First, a weak learning algorithm and a training set are given: $(x_1,y_1),(x_2,y_2),\dots,(x_N,y_N)$, $x_i\in X$, $y_i\in Y$, where $X$ and $Y$ represent the instance space and the label domain.
- (2) Initialize the weights of the $N$ samples, assuming that the sample distribution $D_t(i)$ is uniform: $D_t(i)=1/N$. $D_t(i)$ represents the weight of sample $i$ in the $t$th iteration, and $N$ is the number of samples in the training set.
- (3) Under the current probability distribution $D_t$ of the training samples, the weak learner $h_t(x)$ is trained.
- (4) Calculate the error of the weak learner on each sample, $\epsilon_i$, and the average error ${\epsilon}_{t}=\frac{1}{N}\sum_{i=1}^{N}\epsilon_i$.
- (5) Update the sample weights: $D_t(i)=\frac{D_{t-1}(i)\,\beta_t{}^{-\epsilon_i}}{Z_t}$, where the weak-learner weight is $w_t=\frac{1}{2}\ln(1/\beta_t)$ with $\beta_t=\frac{\epsilon_t}{1-\epsilon_t}$, and $Z_t$ is the normalization factor ensuring $\sum_{i=1}^{N}D_t(x_i)=1$.
- (6) Return to step (3) and continue with the next iteration until the number of iterations reaches $T$.
- (7) Combine the trained weak learners into the strong learner $H(x)=\sum_{t=1}^{T}w_t h_t(x)$.
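The steps above can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the one-split regression stump used as the weak learner and the normalization of the final weighted combination are assumptions (the paper only specifies decision trees as base learners).

```python
import numpy as np

def fit_stump(X, y, w):
    """Weighted one-split regression stump; returns (feature, threshold, left value, right value)."""
    best, best_err = None, np.inf
    for j in range(X.shape[1]):
        for thr in np.unique(X[:, j])[:-1]:        # every split leaves both sides non-empty
            left = X[:, j] <= thr
            lv = np.average(y[left], weights=w[left])
            rv = np.average(y[~left], weights=w[~left])
            err = np.sum(w * (y - np.where(left, lv, rv)) ** 2)
            if err < best_err:
                best_err, best = err, (j, thr, lv, rv)
    return best

def stump_predict(stump, X):
    j, thr, lv, rv = stump
    return np.where(X[:, j] <= thr, lv, rv)

def adaboost_fit(X, y, T=10):
    """Steps (1)-(7): train up to T weak learners under evolving sample weights D_t."""
    N = len(y)
    D = np.full(N, 1.0 / N)                        # step (2): uniform initial weights
    learners, weights = [], []
    for _ in range(T):                             # steps (3)-(6)
        stump = fit_stump(X, y, D)                 # step (3): train under distribution D_t
        abs_err = np.abs(y - stump_predict(stump, X))
        if abs_err.max() == 0:                     # perfect learner: keep it and stop
            learners.append(stump); weights.append(1.0)
            break
        eps_i = abs_err / abs_err.max()            # step (4): per-sample error scaled to [0, 1]
        eps_t = float(np.mean(eps_i))              # average error as in the text
        if eps_t >= 0.5:                           # learner no better than chance: stop
            break
        beta = eps_t / (1.0 - eps_t)               # step (5)
        D = D * beta ** (-eps_i)                   # beta < 1, so high-error samples gain weight
        D /= D.sum()                               # Z_t normalization
        learners.append(stump)
        weights.append(0.5 * np.log(1.0 / beta))   # weak-learner weight w_t
    return learners, np.array(weights)

def adaboost_predict(learners, weights, X):
    """Step (7), with weights normalized so H(x) is a weighted average of h_t(x)."""
    preds = np.stack([stump_predict(s, X) for s in learners])
    return weights @ preds / weights.sum()
```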

#### 3.2. Empirical Mode Decomposition

- (1) Over the whole time range of the function, the number of local extreme points and the number of zero crossings must be equal or differ by at most one.
- (2) At any time point, the mean of the upper envelope of the local maxima and the lower envelope of the local minima must be zero.

Let the original data signal be $M$ and the mean of its upper and lower envelopes be $h_{11}$; the component $m_{11}$ is the difference between the original data and this average value:

$$m_{11}=M-h_{11}$$

If $m_{11}$ does not satisfy the IMF conditions, the sifting is repeated on it. In general, $m_{i(k-1)}$ is the component after the $(k-1)$th sifting, $h_{ik}$ is the envelope mean of the $k$th sifting, and the process of repeated sifting is as follows:

$$m_{ik}=m_{i(k-1)}-h_{ik}$$

When $m_{ik}$ meets the IMF conditions after $k$ siftings, it is defined as the $i$th IMF component $L_i$.
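The sifting recursion can be sketched as follows. This is a minimal illustration under stated assumptions: cubic-spline envelopes (a common choice) and a fixed number of sifting iterations in place of a formal stopping criterion.

```python
import numpy as np
from scipy.interpolate import CubicSpline
from scipy.signal import argrelextrema

def envelope_mean(x, t):
    """h_ik: mean of the upper (maxima) and lower (minima) cubic-spline envelopes."""
    mx = argrelextrema(x, np.greater)[0]
    mn = argrelextrema(x, np.less)[0]
    # pad with the signal endpoints so both splines cover the full time range
    upper = CubicSpline(np.r_[t[0], t[mx], t[-1]], np.r_[x[0], x[mx], x[-1]])
    lower = CubicSpline(np.r_[t[0], t[mn], t[-1]], np.r_[x[0], x[mn], x[-1]])
    return 0.5 * (upper(t) + lower(t))

def sift_imf(x, t, n_sift=8):
    """Repeated sifting m_ik = m_i(k-1) - h_ik; returns one IMF candidate."""
    m = x.copy()
    for _ in range(n_sift):
        m = m - envelope_mean(m, t)
    return m
```

Applied to a signal that mixes a fast and a slow oscillation, the first extracted IMF closely tracks the fast component; subtracting it and repeating yields the lower-frequency IMFs and finally the residual (trend).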

#### 3.3. The Principle of Isometric Mapping (Isomap)

For each sample point $X_i$ ($i = 1, 2, \dots, N$), calculate its $k$ nearest neighbors $X_{i1}, X_{i2}, \dots, X_{ik}$, recorded as $N_i$. Taking point $X_i$ as a vertex and the Euclidean distance $d(X_i, X_{ij})$ as the edge weight, the neighborhood relation graph $G(V, E)$ is established. The neighbors can be determined in two ways:

- (1) Using the ε-nearest-neighbor rule: if $\|X_i-X_j\|_2 \le \varepsilon$, then the point pair $X_i$, $X_j$ can be regarded as nearest neighbors.
- (2) Using the k-nearest-neighbor rule: the number of nearest neighbors $k$ is given in advance, and the $k$ closest points are then determined as the neighbors.

The geodesic distances $d_G(X_i, X_j)$ are then approximated by shortest paths on $G$, and multidimensional scaling (MDS) is applied to the $N \times N$ geodesic distance matrix $[d_G(X_i, X_j)]_{N \times N}$ to obtain the lowest-dimensional embedding $Y = \{y_1, y_2, \dots, y_N\}$.
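The three Isomap steps — neighborhood graph, shortest-path geodesic distances, classical MDS — can be sketched from scratch as follows, using the k-nearest-neighbor rule (an illustrative sketch, not the authors' implementation):

```python
import numpy as np
from scipy.sparse.csgraph import shortest_path

def isomap(X, k=6, d=2):
    """Minimal Isomap sketch: kNN graph -> geodesic distances -> classical MDS."""
    N = X.shape[0]
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # Euclidean distances
    # step 1: k-nearest-neighbor graph G (inf marks a non-edge for csgraph)
    G = np.full((N, N), np.inf)
    for i in range(N):
        nn = np.argsort(D[i])[1:k + 1]   # skip self at index 0
        G[i, nn] = D[i, nn]
        G[nn, i] = D[i, nn]              # symmetrize
    # step 2: geodesic distances = shortest paths on G (Dijkstra)
    DG = shortest_path(G, method="D", directed=False)
    # step 3: classical MDS on the geodesic distance matrix
    J = np.eye(N) - np.ones((N, N)) / N
    B = -0.5 * J @ (DG ** 2) @ J         # double-centered squared distances
    vals, vecs = np.linalg.eigh(B)
    idx = np.argsort(vals)[::-1][:d]     # top-d eigenpairs
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
```

On data lying along a one-dimensional curve in 3-D (e.g., a helix), the first embedding coordinate recovers the position along the curve, which is exactly the property exploited here for nonlinear feature reduction.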

#### 3.4. The Full Procedure of the EMDIA Model

- (1) The power load data are decomposed into IMFs and a residual by EMD.
- (2) The IMFs and residual, together with the meteorological and economic factors, are divided into two groups through correlation analysis: the trend term, related to economic factors, and the fluctuation term, related to meteorological factors.
- (3) The economic and meteorological factors are themselves decomposed by EMD into IMFs and residuals.
- (4) All decomposed data are normalized. The normalization process is shown in Equation (6): $${\overline{X}}_{a}(t)=\frac{{X}_{a}(t)-{\mathrm{min}}_{j}\,{X}_{a}(j)}{{\mathrm{max}}_{j}\,{X}_{a}(j)-{\mathrm{min}}_{j}\,{X}_{a}(j)}$$
- (5) Using the normalized data, Isomap dimensionality reduction is applied to select new high-quality features for Adaboost.
- (6) The Adaboost method, paired with the features after Isomap dimensionality reduction, predicts the trend and fluctuation terms.
- (7) The values predicted for the trend and fluctuation terms are integrated to obtain the final forecast.
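Step (4)'s min-max normalization of Equation (6) amounts to:

```python
import numpy as np

def minmax_normalize(x):
    """Equation (6): scale a series to [0, 1] by its own min and max."""
    x = np.asarray(x, dtype=float)
    return (x - x.min()) / (x.max() - x.min())
```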

#### 3.5. Model Evaluation Indicators

The evaluation indicators used in this study are the coefficient of determination (R²), the mean absolute error (MAE), the mean absolute percentage error (MAPE), and the root mean square error (RMSE), where $L_t$ and $\hat{L}_t$ are the actual and forecasted values of the load at time $t$ and $M$ is the total number of data points used. The formulas of the four evaluation indicators are as follows:

$$\mathrm{MAE}=\frac{1}{M}\sum_{t=1}^{M}\left|L_t-\hat{L}_t\right|$$

$$\mathrm{MAPE}=\frac{100\%}{M}\sum_{t=1}^{M}\left|\frac{L_t-\hat{L}_t}{L_t}\right|$$

$$\mathrm{RMSE}=\sqrt{\frac{1}{M}\sum_{t=1}^{M}\left(L_t-\hat{L}_t\right)^2}$$

$$R^2=1-\frac{\sum_{t=1}^{M}\left(L_t-\hat{L}_t\right)^2}{\sum_{t=1}^{M}\left(L_t-\overline{L}\right)^2}$$
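The four indicators above can be computed directly, for example:

```python
import numpy as np

def evaluate(actual, forecast):
    """Return (MAE, MAPE in %, RMSE, R^2) for a load forecast."""
    L, Lh = np.asarray(actual, float), np.asarray(forecast, float)
    mae = np.mean(np.abs(L - Lh))
    mape = 100.0 * np.mean(np.abs((L - Lh) / L))          # assumes no zero actuals
    rmse = np.sqrt(np.mean((L - Lh) ** 2))
    r2 = 1.0 - np.sum((L - Lh) ** 2) / np.sum((L - L.mean()) ** 2)
    return mae, mape, rmse, r2
```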

## 4. Data and Experiment Settings

#### 4.1. Data Sources

#### 4.2. Experiment Settings

## 5. Results and Analysis

#### 5.1. Results of EMD Decomposition Electric Load

#### 5.2. Analysis of the Factors Affecting Power Load

#### 5.3. EMD-Isomap for Feature Decomposition and Dimensionality Reduction

#### 5.4. Feature Importance Analysis

#### 5.5. Prediction Results and Analysis of Different Algorithms

- Random forest settings: number of trees = 100, maximum depth of tree = 30, maximum number of leaf nodes = 50.
- BP settings: learning rate = 0.1, number of iterations = 1000, number of hidden layer neurons = 100.
- GRU settings: hidden layer = 2, number of neurons = 64, stride = 7, epoch = 120, learning rate = 0.01.
- LSTM settings: LSTM network = 1, number of neurons = 20, learning rate = 0.001.
- Adaboost settings: base learner = decision tree, number of base learners = 100, learning rate = 1.
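For reference, the listed Adaboost settings correspond to the following scikit-learn configuration; this mapping is an assumption, as the paper does not name its implementation:

```python
import numpy as np
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeRegressor

# base learner = decision tree, number of base learners = 100, learning rate = 1
model = AdaBoostRegressor(DecisionTreeRegressor(), n_estimators=100, learning_rate=1.0)

# tiny illustrative fit (the paper's load data are not reproduced here)
X = np.arange(10, dtype=float).reshape(-1, 1)
y = np.arange(10, dtype=float)
model.fit(X, y)
```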

- (1) When Adaboost is compared with the RF, BP, LSTM, and GRU algorithms, its RMSE, MAE, and MAPE are all smaller, and its R² is higher than that of any other algorithm and closest to 1. The MAE of Adaboost is 16.58, the MAPE is 0.13%, and the RMSE is 57.83; for RF, the values are 476.42, 3.85%, and 574.46; for BP, 1041.74, 7.62%, and 1048.14; for LSTM, 624.93, 4.59%, and 912.62; and for GRU, 438, 3.49%, and 563.80. All three error indicators of Adaboost are lower than those of RF, BP, LSTM, and GRU, which shows that the Adaboost algorithm is the most applicable in this study.
- (2) When comparing the EMD–Adaboost algorithm with Adaboost, the results in Table 1 reveal that Adaboost after EMD has a better prediction effect than Adaboost alone. EMD–Adaboost reduces MAE by 5.9, MAPE by 0.02%, and RMSE by 20.5 and improves R² by 0.01, demonstrating that EMD is crucial in power load forecasting.
- (3) With the MAE index nearly comparable, the S-EMDIA algorithm improves R² by 0.03, reduces MAPE by 0.02%, and reduces RMSE by 13.06 when compared with EMD–Adaboost. This comparison reveals that the new features chosen after EMD–Isomap can improve the accuracy of the prediction results to some extent.
- (4) The approach in the literature [42] was also compared in this study. In contrast to EMDIA, EMD–PCA–Adaboost reduces feature dimensionality using PCA rather than Isomap. The experimental results are shown in Table 1: the R² of EMD–PCA–Adaboost is 0.95, the MAE is 426.53, the MAPE is 3.35%, and the RMSE is 573.90. Every indicator of EMD–PCA–Adaboost is at a disadvantage compared with EMD–Adaboost, implying that PCA may not be suitable for reducing the dimensionality of nonlinear data; the new features after EMD–PCA did not improve the prediction accuracy of Adaboost. In comparison with EMDIA, the indicators' disadvantages are even clearer, demonstrating the usefulness and necessity of Isomap in this method.
- (5) Compared with S-EMDIA, the MAE of EMDIA decreases by 6.2, the MAPE decreases by 0.08%, and the RMSE decreases by 16.37; the R² of EMDIA and S-EMDIA are both 0.99. Table 2 illustrates the forecast results for the IMFs, the trend term, and the fluctuation term. Compared with IMF1–IMF4, the fluctuation term reconstructed by correlation analysis shows a great improvement in R² (0.93, with an MAE of 111.68, a MAPE of 41.6%, and an RMSE of 39.52), while the residual serves as the trend term and reaches an R² of 0.99. These results show that producing the trend and fluctuation terms by correlation analysis after EMD considerably increases the prediction accuracy.

The R² results for RF, BP, LSTM, GRU, and Adaboost are 0.91, 0.84, 0.87, 0.95, and 0.95, respectively, while the R² of EMDIA is 0.99. Figure 6 shows the prediction results of EMDIA, BP, RF, Adaboost, LSTM, and GRU; the forecast time is in months. The results show that the predicted values of EMDIA fit the original values best with the smallest error, while the fitting effect of BP is the worst. The optimized Adaboost algorithm yields smoother curve fitting and higher accuracy.

## 6. Conclusions

- (1) EMD of the load sequence can effectively separate the complex information contained in the original sequence, and the decomposed components are divided into trend items and fluctuation items by correlation analysis. This effectively reduces the model's calculation time and improves the prediction accuracy.
- (2) Using the EMD–Isomap algorithm, the nonlinear meteorological and economic data are decomposed to obtain multidimensional features, and Isomap is then used to reduce the dimensionality and select new features. This effectively removes noise and redundancy from the features; the experimental results verify the effectiveness of EMD–Isomap.
- (3) The EMDIA approach has excellent accuracy in power load forecasting and can better predict the load trend, indicating a promising future for its application in medium- and long-term power load forecasting.

With the proposed method, R² increases to 0.99, showing much better performance than traditional Adaboost. Meanwhile, compared with the BP, RF, LSTM, and GRU algorithms, the four evaluation indexes of the proposed method are superior, which proves that the proposed method is potentially useful for power load forecasting.

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

- Nicolas, K.; Maxence, B.; Thibaut, B.; Robin, G.; Elena, M.; Georges, K.; Guillaume, P.; Pierre, C. Long-term forecast of local electrical demand and evaluation of future impacts on the electricity distribution network. CIRED—Open Access Proc. J.
**2017**, 2017, 2401–2405. [Google Scholar] - Lindberg, K.B.; Seljom, P.; Madsen, H.; Fischer, D.; Korpås, M. Long-term electricity load forecasting: Current and future trends. Util. Policy
**2019**, 58, 102–119. [Google Scholar] [CrossRef] - Jinghua, L.; Yongsheng, L.; Shuhui, Y. Mid-long term load forecasting model based on support vector machine optimized by improved sparrow search algorithm. Energy Rep.
**2022**, 8, 491–497. [Google Scholar] - Xiaole, L.; Yiqin, W.; Guibo, M.; Xin, C.; Qianxiang, S.; Bo, Y. Electric load forecasting based on Long-Short-Term-Memory network via simplex optimizer during COVID-19. Energy Rep.
**2022**, 8, 1–12. [Google Scholar] - Pan, G.; Ouyang, A. Medium and Long Term Power Load Forecasting using CPSO-GM Model. J. Netw.
**2014**, 9, 2121–2128. [Google Scholar] [CrossRef] - Yinsheng, S.; Chunxiao, L.; Bao, L.; Weisi, D.; Peishen, Z. Principal Component Analysis of Short-term Electric Load Forecast Data Based on Grey Forecast. J. Phys. Conf. Ser.
**2020**, 1486, 062031. [Google Scholar] [CrossRef] - Elkamel, M.; Schleider, L.; Pasiliao, E.L.; Diabat, A.; Zheng, Q.P. Long-Term Electricity Demand Prediction via Socioeconomic Factors—A Machine Learning Approach with Florida as a Case Study. Energies
**2020**, 13, 3996. [Google Scholar] [CrossRef] - WuNeng, L.; Pan, Z.; QiuWen, L.; Dong, M.; CuiYun, L.; YiMing, L.; ZhenCheng, L. Adaboost-Based Power System Load Forecasting. J. Phys. Conf. Ser.
**2021**, 2005, 012190. [Google Scholar] - Fan, G.F.; Peng, L.L.; Hong, W.C.; Sun, F. Electric load forecasting by the SVR model with differential empirical mode decomposition and auto regression. Neurocomputing
**2016**, 173, 958–970. [Google Scholar] [CrossRef] - Li, M.-a.; Zhu, W.; Liu, H.-n.; Yang, J.-f.; Acharya, U.R. Adaptive Feature Extraction of Motor Imagery EEG with Optimal Wavelet Packets and SE-Isomap. Appl. Sci.
**2017**, 7, 390. [Google Scholar] [CrossRef] [Green Version] - Semero, Y.K.; Zhang, J.; Zheng, D. EMD–PSO–ANFIS-based hybrid approach for short-term load forecasting in microgrids. IET Gener. Transm. Distrib.
**2020**, 14, 470–475. [Google Scholar] [CrossRef] - Meng, Z.; Xie, Y.; Sun, J. Short-term load forecasting using neural attention model based on EMD. Electr. Eng.
**2021**, 104, 1857–1866. [Google Scholar] [CrossRef] - Okolobah, V.A.; Ismail, Z. New Approach to Peak Load Forecasting based on EMD and ANFIS. Indian J. Sci. Technol.
**2013**, 6, 5600–5606. [Google Scholar] [CrossRef] - Yu, J.; Ding, F.; Guo, C.; Wang, Y. System load trend prediction method based on IF-EMD-LSTM. International Journal of Distributed Sensor Networks,(8). Int. J. Distrib. Sens. Netw.
**2019**, 15, 1–8. [Google Scholar] [CrossRef] - Luo, Y.; Jia, X.; Chen, S.W. Short-term Power Load Forecasting Based on EMD and ESN. In Proceedings of the 2012 International Conference on Engineering Materials (ICEM 2012), Singapore, 30–31 December 2012; 651, pp. 922–928. [Google Scholar]
- Sobhani, M.; Campbell, A.; Sangamwar, S.; Li, C.; Hong, T. Combining Weather Stations for Electric Load Forecasting. Energies
**2019**, 12, 1510. [Google Scholar] [CrossRef] [Green Version] - Hua, S.H.; Guo, W.W.; Gan, Z.Z.; Zhang, X.L. The Relationship Analysis of Temperature and Electric Load. Appl. Mech. Mater.
**2012**, 256–259, 2644–2647. [Google Scholar] - Yan, Q.Y.; Liu, Z.Y.; He, S.Q. Long Term Forecast of Electric Load in Beijing. Adv. Mater. Res.
**2012**, 516–517, 1490–1495. [Google Scholar] [CrossRef] - Kyle, P.; Clarke, L.; Rong, F.; Smith, S.J. Climate Policy and the Long-Term Evolution of the U.S. Buildings Sector. Energy J.
**2010**, 31, 145–172. [Google Scholar] [CrossRef] - Arash, M.; Sahar, Z.; Maryam, S.; Behnam, M.; Fazel, M. Short-Term Load Forecasting of Microgrid via Hybrid Support Vector Regression and Long Short-Term Memory Algorithms. Sustainability
**2020**, 12, 7076. [Google Scholar] - Hong, W.-C. Electric load forecasting by seasonal recurrent SVR (support vector regression) with chaotic artificial bee colony algorithm. Energy
**2011**, 36, 5568–5578. [Google Scholar] [CrossRef] - Jihoon, M.; Yongsung, K.; Minjae, S.; Eenjun, H. Hybrid Short-Term Load Forecasting Scheme Using Random Forest and Multilayer Perceptron. Energies
**2018**, 11, 3283. [Google Scholar] - Mingming, L.; Yulu, W. Power load forecasting and interpretable models based on GS_XGBoost and SHAP. J. Phys. Conf. Ser.
**2022**, 2195, 012028. [Google Scholar] - Xiaojin, L.; Yueyang, H.; Yuanbo, S. Ultra-Short Term Power Load Prediction Based on Gated Cycle Neural Network and XGBoost Models. J. Phys. Conf. Ser.
**2021**, 2026, 012022. [Google Scholar] - Jeenanunta, C. Electricity load forecasting using a deep neuralnetwork. Eng. Appl. Sci. Res.
**2019**, 46, 10–17. [Google Scholar] - Cai, Q.; Yan, B.; Su, B.; Liu, S.; Xiang, M.; Wen, Y.; Cheng, Y.; Feng, N. Short-term load forecasting method based on deep neural network with sample weights. Int. Trans. Electr. Energy Syst.
**2020**, 30, e12340. [Google Scholar] [CrossRef] - Jun, L.; Jin, M.; Jianguo, Z.; Yu, C. Short-term load forecasting based on LSTM networks considering attention mechanism. Int. J. Electr. Power Energy Syst.
**2022**, 137, 107818. [Google Scholar] - Kedong, Z.; Yaping, L.; Wenbo, M.; Feng, L.; Jiahao, Y. LSTM enhanced by dual-attention-based encoder-decoder for daily peak load forecasting. Electr. Power Syst. Res.
**2022**, 208, 107860. [Google Scholar] - Aurangzeb, K.; Alhussein, M.; Javaid, K.; Haider, S.I. A Pyramid-CNN Based Deep Learning Model for Power Load Forecasting of Similar-Profile Energy Customers Based on Clustering. IEEE Access
**2021**, 9, 14992–15003. [Google Scholar] [CrossRef] - Arurun, K.; Afzal, P.; Khwaja, A.S.; Bala, V.; Alagan, A. Performance comparison of single and ensemble CNN, LSTM and traditional ANN models for short-term electricity load forecasting. J. Eng.
**2022**, 2022, 550–565. [Google Scholar] - Shanmugasundar, G.; Vanitha, M.; Čep, R.; Kumar, V.; Kalita, K.; Ramachandran, M. A Comparative Study of Linear, Random Forest and AdaBoost Regressions for Modeling Non-Traditional Machining. Processes
**2021**, 9, 2015. [Google Scholar] [CrossRef] - Li, F.-F.; Wang, S.-Y.; Wei, J.-H. Long term rolling prediction model for solar radiation combining empirical mode decomposition (EMD) and artificial neural network (ANN) techniques. J. Renew. Sustain. Energy
**2018**, 10, 013704. [Google Scholar] [CrossRef] - Dang, X.J.; Chen, H.Y.; Jin, X.M. A Method for Forecasting Short-Term Wind Speed Based on EMD and SVM. Appl. Mech. Mater.
**2013**, 392, 622–627. [Google Scholar] [CrossRef] - Li, S.; Wang, P.; Goel, L. Short-term load forecasting by wavelet transform and evolutionary extreme learning machine. Electr. Power Syst.
**2015**, 122, 96–103. [Google Scholar] [CrossRef] - Peng, L.L.; Fan, G.F.; Yu, M.; Chang, Y.C.; Hong, W.C. Electric Load Forecasting based on Wavelet Transform and Random Forest. Adv. Theory Simul.
**2021**, 4, 2100334. [Google Scholar] [CrossRef] - Gao, X.; Li, X.; Zhao, B.; Ji, W.; Jing, X.; He, Y. Short-Term Electricity Load Forecasting Model Based on EMD-GRU with Feature Selection. Energies
**2019**, 12, 1140. [Google Scholar] [CrossRef] [Green Version] - Chen, G.C.; Zhang, X.; Guan, Z.W. The Wind Power Forecast Model Based on Improved EMD and SVM. Appl. Mech. Mater.
**2014**, 694, 150–154. [Google Scholar] [CrossRef] - Li, H.; Hu, L.; Jian, T.; Jun, H.; Yi, G.; Dehua, G. Sensitivity Analysis and Forecast of Power Load Characteristics Based on Meteorological Feature Information. IOP Conf. Ser. Earth Environ. Sci.
**2020**, 558, 052060. [Google Scholar] [CrossRef] - Qiu, Y.; Li, X.; Zheng, W.; Hu, Q.; Wei, Z.; Yue, Y. The prediction of the impact of climatic factors on short-term electric power load based on the big data of smart city. J. Phys. Conf. Ser.
**2017**, 887, 012023. [Google Scholar] [CrossRef] [Green Version] - Guan, J.; Zurada, J.; Shi, D.; Lopez Vargas, J.; Cabrera Loayza, M.C. A Fuzzy Neural Approach with Multiple Models to Time-Dependent Short Term Power Load Forecasting Based on Weather. Int. J. Multimed. Ubiquitous Eng.
**2017**, 12, 1–16. [Google Scholar] - Chen, H.J.; Xing, F.Z.; Shang, J.; Jun, Z.Y. The Sensitivity Analysis of Power Loads and Meteorological Factors in Typhoon Weather. Appl. Mech. Mater.
**2014**, 716–717, 110320131107. [Google Scholar] - Yunqin, Z.; Qize, C.; Wenjie, J.; Xiaofeng, L.; Liang, S.; Zehua, C. Photovoltaic power prediction model based on EMD-PCA-LSTM. J. Sun
**2021**. [Google Scholar] [CrossRef] - Ma, M.; Liu, Z.; Su, Y. Acoustic signal feature extraction method based on manifold learning and its application. Int. Core J. Eng.
**2021**, 7, 446–454. [Google Scholar] - Jinghua, L.; Shanyang, W.; Wei, D. Combination of Manifold Learning and Deep Learning Algorithms for Mid-Term Electrical Load Forecasting. IEEE Trans. Neural Netw. Learn. Syst.
**2021**, 1–10. [Google Scholar] [CrossRef] - Zhao, H.; Yu, H.; Li, D.; Mao, T.; Zhu, H. Vehicle Accident Risk Prediction Based on AdaBoost-SO in VANETs. IEEE Access
**2019**, 7, 14549–14557. [Google Scholar] [CrossRef] - Zeng, K.; Liu, J.; Wang, H.; Zhao, Z.; Wen, C. Research on Adaptive Selection Algorithm for Multi-model Load Forecasting Based on Adaboost. IOP Conf. Ser. Earth Environ. Sci.
**2020**, 610, 012005. [Google Scholar] [CrossRef] - Deng, Y.; Wang, W.; Qian, C.; Wang, Z.; Dai, D. Boundary-processing-technique in EMD method and Hilbert transform. Chin. Sci. Bull.
**2001**, 46, 954–961. [Google Scholar] [CrossRef] - Tawn, R.; Browell, J. A review of very short-term wind and solar power forecasting. Renew. Sustain. Energy Rev.
**2022**, 153, 111758. [Google Scholar] [CrossRef] - Xin, J.; Yiming, C.; Lei, W.; Huali, H.; Peng, C. Failure prediction, monitoring and diagnosis methods for slewing bearings of large-scale wind turbine: A review. Measurement
**2021**, 172, 108855. [Google Scholar] - Hadi, R.; Hamidreza, F.; Gholamreza, M. Stock price prediction using deep learning and frequency decomposition. Expert Syst. Appl.
**2020**, 169, 114332. [Google Scholar] - Liu, Q.; Cai, Y.; Jiang, H.; Lu, J.; Chen, L. Traffic state prediction using ISOMAP manifold learning. Phys. A Stat. Mech. Appl.
**2018**, 506, 532–541. [Google Scholar] [CrossRef] - Ding, S.; Keal, C.A.; Zhao, L.; Yu, D. Dimensionality reduction and classification for hyperspectral image based on robust supervised ISOMAP. J. Ind. Prod. Eng.
**2022**, 39, 19–29. [Google Scholar] [CrossRef] - Wang, Q.; Ying, Z. Facial Expression Recognition Algorithm Based on Gabor Texture Features and Adaboost Feature Selection via Sparse Representation. Appl. Mech. Mater.
**2014**, 511–512, 433–436. [Google Scholar] [CrossRef] - Medeiros, M.; Soares, L. Robust statistical methods for electricity load forecasting. In Proceedings of the RTE-VT Workshop on State Estimation and Forecasting Techniques, Paris, France, 29–30 May 2006; pp. 1–8. [Google Scholar]

**Figure 3.** Correlation coefficients between the data features and the power load data. (Different colors highlight the different levels of the data.)

**Figure 4.** Results of the EMD decomposition of factors affecting power load. (**A**) is the EMD decomposition result of A, X, P, K (A-average temperature, X-maximum temperature, P-periodic term, K-working population). (**B**) is the EMD decomposition result of N, E, G, C, W (N-minimum temperature, E-precipitation, G-GDP, C-gross construction, W-per capita wage).

**Table 1.** Prediction results of the different algorithms.

| Algorithm | MAE | MAPE (%) | RMSE | R² |
|---|---|---|---|---|
| RF | 476.42 | 3.85 | 574.46 | 0.91 |
| BP | 1041.74 | 7.62 | 1048.14 | 0.84 |
| LSTM | 624.93 | 4.59 | 912.62 | 0.87 |
| GRU | 438.00 | 3.49 | 563.80 | 0.95 |
| Adaboost | 16.58 | 0.13 | 57.83 | 0.95 |
| EMD–Adaboost | 10.69 | 0.10 | 37.33 | 0.96 |
| S-EMDIA | 11.20 | 0.08 | 24.27 | 0.99 |
| EMD–PCA–Adaboost | 426.53 | 3.35 | 573.90 | 0.95 |
| EMDIA | 5.00 | 0.0003 | 7.90 | 0.99 |

**Table 2.** Forecast results for the IMFs, the trend term, and the fluctuation term.

| Forecast Target | MAE | MAPE (%) | RMSE | R² |
|---|---|---|---|---|
| IMF1 | 372.24 | 471.64 | 473.01 | −0.28 |
| IMF2 | 658.27 | 79.94 | 1030.86 | 0.84 |
| IMF3 | 201.35 | 3761.89 | 912.62 | 0.83 |
| IMF4 | 47.37 | 407.30 | 73.94 | 0.82 |
| Residual (trend term) | 3.14 | 0.02 | 5.58 | 0.99 |
| Fluctuation term | 111.68 | 41.60 | 39.52 | 0.93 |


© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Han, X.; Su, J.; Hong, Y.; Gong, P.; Zhu, D.
Mid- to Long-Term Electric Load Forecasting Based on the EMD–Isomap–Adaboost Model. *Sustainability* **2022**, *14*, 7608.
https://doi.org/10.3390/su14137608
