# Short-Term Load Forecasting Using Smart Meter Data: A Generalization Analysis


## Abstract


## 1. Introduction

## 2. Related Work

## 3. Modeling Techniques

#### 3.1. Support Vector Regression (SVR)

#### 3.2. Gradient Boosted Regression Tree

#### 3.3. Feed Forward Neural Network (FFNN)

#### 3.4. Long Short-Term Memory Network (LSTM)

## 4. Data Normalization and Parameter Tuning

## 5. Error Metrics

## 6. Smart Metering Data and Statistical Analysis

## 7. Forecasting Experiments and Results

#### 7.1. Model Development and Tuning

#### 7.2. Model Evaluation

## 8. Conclusions

- Although the models were fed only past consumption values as predictors, they generally provided stable one-hour-ahead predictions over a one-year period.
- Compared to the other two techniques, the neural approaches (LSTM and FFNN) adapted better to changes while predicting, followed the trend of real consumption more closely, and on average achieved lower prediction errors.
- For daily peak-load estimation across the various profiles, GBRT and LSTM outperformed the other techniques.
- In terms of computational cost, GBRT was the fastest algorithm to train and fine-tune. In contrast, the training times of SVR and LSTM were significantly higher, especially as the SVR training set grew or the number of tuning variables in the LSTM network increased.
- Increasing the number of training houses improved forecast accuracy only when the additional profile(s) added knowledge about the test profile; a larger training set therefore does not necessarily boost model performance.
- Increasing the number of inputs did not affect all variants equally: some models performed better with only the most recent past-consumption values, while others benefited from a longer consumption history.
- A comparison among the five customer groups shows that customers with lower average yearly consumption and smaller hourly load volatility produce more predictable profiles.
- An analysis of seasonal predictions reveals that seasons with lower temperatures usually exhibit larger load variations, making forecasting more difficult for almost all models.
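The error metrics reported in the evaluation table (RMSE, MAE, MASE) follow standard definitions and can be sketched in a few lines. This is an illustrative implementation, not code from the paper: MASE here uses the common convention of scaling the forecast MAE by the in-sample MAE of the naive one-step forecast, while DpMAPE and CWE are paper-specific measures and are not reproduced.

```python
import numpy as np

def rmse(y_true, y_pred):
    """Root mean squared error (kWh)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def mae(y_true, y_pred):
    """Mean absolute error (kWh)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.mean(np.abs(y_true - y_pred)))

def mase(y_true, y_pred):
    """Mean absolute scaled error: forecast MAE divided by the MAE of
    the naive one-step forecast on the same series (scale-free)."""
    y_true = np.asarray(y_true, float)
    naive_mae = float(np.mean(np.abs(np.diff(y_true))))
    return mae(y_true, y_pred) / naive_mae
```

A MASE below 1 means the model beats the naive "last observed value" forecast on average, which is why it is a useful complement to the scale-dependent RMSE and MAE.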

## Author Contributions

## Funding

## Conflicts of Interest

## References


**Figure 7.** Error analysis with respect to the number of input variables and the size of the training set.

| Acorn User Group | Block Number | Number of Houses | Mean (± SD) of Hourly Consumption (kWh) | Mean (± SD) of Daily Consumption (kWh) | Mean (± SD) of Weekly Consumption (kWh) | Total Consumption Over 1 Year (kWh) |
|---|---|---|---|---|---|---|
| Group A | B0 | 15 | 0.87 (± 0.72) | 21.3 (± 6.51) | 145 (± 34.3) | 7707 |
| Group B | B3 | 15 | 0.53 (± 0.45) | 12.28 (± 3.99) | 88.4 (± 21.3) | 4689 |
| Group C | B4 | 15 | 0.52 (± 0.42) | 12.7 (± 3.44) | 88 (± 18.6) | 4666 |
| Group D | B10, B11 | 15 | 0.64 (± 0.52) | 15.4 (± 4.57) | 106 (± 24.2) | 5627 |
| Group E | B24, B27 | 15 | 0.44 (± 0.40) | 10.7 (± 3.29) | 74.2 (± 16.2) | 3933 |

All consumption values are averages over each group, in kWh.
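Group statistics like those above can be reproduced from raw half-hourly smart-meter readings by resampling to hourly, daily, and weekly totals. A minimal pandas sketch on synthetic data (the real values come from the London households dataset; the gamma-distributed load and the 2013 index are illustrative assumptions):

```python
import numpy as np
import pandas as pd

# Hypothetical half-hourly consumption for one house over a year (kWh per reading).
rng = np.random.default_rng(0)
idx = pd.date_range("2013-01-01", periods=2 * 24 * 365, freq="30min")
load = pd.Series(rng.gamma(shape=2.0, scale=0.2, size=len(idx)), index=idx)

hourly = load.resample("h").sum()   # kWh per hour
daily = load.resample("D").sum()    # kWh per day
weekly = load.resample("W").sum()   # kWh per week

stats = {
    "hourly": (hourly.mean(), hourly.std()),
    "daily": (daily.mean(), daily.std()),
    "weekly": (weekly.mean(), weekly.std()),
    "total_year": load.sum(),
}
```

Averaging these per-house statistics over the 15 houses in a block would yield one row of the table.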

| Best Variant | Training Size | Training Time (Minutes) | Parameters |
|---|---|---|---|
| SVR | 33 houses | 45 | Kernel: RBF, C: 10, Gamma: 0.001, Epsilon: 0.2, input: 6 load lags |
| GBRT | 45 houses | 15 | Max_depth: 2, Learning_rate: 0.06, n_estimators: 300, n_features: 6, 6 load lags |
| FFNN | 33 houses | 30 | Hidden layers: 1, Hidden neurons: 13, Optimizer: Adam, Weight init mode: Glorot Normal, 6 load lags |
| LSTM | 45 houses | 40 | LSTM layers: 1, LSTM cells: 10, Activation function: ReLU, Dropout rate: 0.1, 6 load lags |
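As a rough sketch, the tuned GBRT variant from the table can be reconstructed with scikit-learn, pairing it with a simple builder for the 6-load-lag input window. The `make_lag_features` helper and the synthetic sinusoidal load curve are illustrative assumptions, not from the paper:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

def make_lag_features(series, n_lags=6):
    """Turn a 1-D load series into (X, y): rows of n_lags past values
    and the next hour's consumption as the target."""
    series = np.asarray(series, dtype=float)
    X = np.column_stack(
        [series[i:len(series) - n_lags + i] for i in range(n_lags)]
    )
    y = series[n_lags:]
    return X, y

# GBRT with the tuned settings from the table (scikit-learn parameter names).
gbrt = GradientBoostingRegressor(max_depth=2, learning_rate=0.06, n_estimators=300)

# Toy usage on a synthetic daily-periodic load curve (kWh).
t = np.arange(500, dtype=float)
load = 0.5 + 0.3 * np.sin(2 * np.pi * t / 24)
X, y = make_lag_features(load, n_lags=6)
gbrt.fit(X, y)
pred = gbrt.predict(X[-1:])  # one-hour-ahead forecast from the last window
```

The same lag matrix could feed the SVR and FFNN variants; the LSTM instead consumes the window as a length-6 sequence.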

**Average Computed Over Predictions**

| Model | RMSE ± SD (kWh) | MAE ± SD (kWh) | MASE ± SD | DpMAPE ± SD | CWE |
|---|---|---|---|---|---|
| SVR | 0.36 ± 0.10 | 0.24 ± 0.05 | 1.12 ± 0.16 | 19.56 ± 3.68 | 0.48 |
| GBRT | 0.36 ± 0.10 | 0.23 ± 0.06 | 1.06 ± 0.12 | 17.88 ± 3.18 | 0.45 |
| FFNN | 0.35 ± 0.10 | 0.22 ± 0.06 | 1.01 ± 0.09 | 18.72 ± 4.08 | 0.44 |
| LSTM | 0.35 ± 0.10 | 0.21 ± 0.06 | 0.98 ± 0.07 | 17.76 ± 3.64 | 0.43 |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Mehdipour Pirbazari, A.; Farmanbar, M.; Chakravorty, A.; Rong, C.
Short-Term Load Forecasting Using Smart Meter Data: A Generalization Analysis. *Processes* **2020**, *8*, 484.
https://doi.org/10.3390/pr8040484
