# A Comparative Study of Time Series Forecasting Methods for Short Term Electric Energy Consumption Prediction in Smart Buildings

## Abstract


## 1. Introduction

- Analysis and comparison of the performance of statistical and ML based strategies;
- Establishing the size of the historical window to be used to optimize the predictions; and
- Analysis of the electricity consumption data collected from the smart buildings considered.

## 2. Related Works

## 3. Materials and Methods

#### 3.1. Data

#### 3.2. Methods

**Linear Regression (LM)**. Linear regression [48] is a statistical technique commonly used for time series forecasting. The basic idea behind linear regression is to find the relationship between two variables. In its simplest formulation, a linear equation, $Y=a+bX$, represents the association between the independent variable ($X$) and the dependent variable ($Y$), i.e., the variable to be predicted. When multiple independent variables are used to determine the value of a dependent variable, as is the case in this work, the process is called multiple linear regression. In this case, the goal is to model the linear relation between multiple independent, or explanatory, variables and a dependent variable using an equation of the form $Y=a+{b}_{1}{X}_{1}+\dots +{b}_{n}{X}_{n}+\epsilon$, where $\epsilon$ is the residual (the difference between the predicted and the observed value) and ${X}_{i},1\le i\le n$, are the $n$ explanatory variables. For this work, we used the implementation provided by the caret [49] R package.
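
As a minimal illustration of the multiple-regression formulation above (a Python sketch with illustrative function names, not the caret-based R implementation used in the paper), the coefficients can be obtained by ordinary least squares via the normal equations $X^{T}X\beta = X^{T}Y$:

```python
# Sketch: fit Y = a + b1*X1 + ... + bn*Xn by ordinary least squares,
# solving the normal equations with Gaussian elimination.

def fit_linear_regression(X, y):
    """X: list of rows of explanatory variables, y: list of targets.
    Returns coefficients [a, b1, ..., bn] (intercept first)."""
    rows = [[1.0] + list(r) for r in X]          # prepend intercept column
    n = len(rows[0])
    # Build the normal equations A * beta = b.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * t for r, t in zip(rows, y)) for i in range(n)]
    # Gaussian elimination with partial pivoting.
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    # Back-substitution.
    beta = [0.0] * n
    for i in reversed(range(n)):
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, n))) / A[i][i]
    return beta

# Exact data generated by y = 2 + 3*x1 - x2, so OLS recovers the coefficients.
X = [(0, 0), (1, 0), (0, 1), (1, 1), (2, 3)]
y = [2 + 3 * x1 - x2 for x1, x2 in X]
coefs = fit_linear_regression(X, y)
```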

**Auto-Regressive Integrated Moving Average (ARIMA)**. ARIMA is a classical method, based on the work of Box and Jenkins [50], that is widely used for time series forecasting, as shown in Section 2. ARIMA extends the simpler Auto-Regressive Moving Average (ARMA) model with the concept of integration. We adopted this method because it can be used to forecast non-stationary time series. In [51], ARIMA is described as a stochastic model-building process. In particular, the method can be viewed as an iterative strategy consisting of three steps:

- Identification. In this step, the data and all related information are used for selecting a sub-class of models that might best represent the data.
- Estimation. In a second step, the parameters of the model are estimated using the data.
- Diagnostic Checking. Finally, the resulting model is validated on the available data, and areas where the model can be improved are identified.
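
The integration idea can be sketched as follows (an illustrative ARIMA(1,1,0) in Python, chosen for simplicity; the experiments in the paper use a full ARIMA implementation): difference the series once to remove the trend, fit an AR(1) coefficient on the differences by least squares, forecast the next difference, then integrate back.

```python
# Sketch of ARIMA(1,1,0): d = 1 differencing + AR(1) on the differences.

def arima_110_forecast(series):
    # First-order differencing removes a (locally) linear trend.
    diffs = [b - a for a, b in zip(series, series[1:])]
    # AR(1) on the differences: d_t ~ phi * d_{t-1}; least-squares phi.
    num = sum(d0 * d1 for d0, d1 in zip(diffs, diffs[1:]))
    den = sum(d0 * d0 for d0 in diffs[:-1])
    phi = num / den
    next_diff = phi * diffs[-1]
    return series[-1] + next_diff   # undo the differencing

# A trending series whose differences decay exactly as d_t = 0.5 * d_{t-1}.
series = [0.0]
d = 4.0
for _ in range(8):
    series.append(series[-1] + d)
    d *= 0.5
forecast = arima_110_forecast(series)
```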

**Evolutionary Algorithms (EAs) for Regression Trees (EVTree)**. EAs [54] are optimization techniques inspired by the basic concepts of Darwinian evolution. EAs evolve a population of candidate solutions over a number of generations by applying genetic operators, such as selection, crossover and mutation. Each candidate solution, i.e., an individual of the population, is assigned a quality score, or fitness. Usually, EAs start from a randomly initialized population, which is evaluated to assign a fitness to each individual. A number of individuals are then selected, usually based on their fitness, and crossover and mutation operators are used to generate new individuals from the selected ones. The resulting offspring are then inserted into the population. The whole process is repeated until a stopping criterion is met, e.g., when a maximum number of generations has been performed. We used the particular EA provided by the EVTree [55] R package. This algorithm evolves regression trees [56], i.e., trees used to predict a continuous value based on a set of predictor variables.
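
The generic EA loop described above can be sketched as follows (a Python toy minimizing $f(x)=x^2$ over real-valued individuals; this illustrates the selection/crossover/mutation cycle only, not the tree-evolving EVTree algorithm itself):

```python
import random

def evolve(pop_size=30, generations=60, seed=1):
    rng = random.Random(seed)
    fitness = lambda x: x * x                    # lower is better
    pop = [rng.uniform(-10, 10) for _ in range(pop_size)]
    for _ in range(generations):
        def select():                            # tournament selection, size 2
            a, b = rng.sample(pop, 2)
            return a if fitness(a) < fitness(b) else b
        offspring = []
        for _ in range(pop_size):
            p1, p2 = select(), select()
            child = 0.5 * (p1 + p2)              # arithmetic crossover
            child += rng.gauss(0, 0.1)           # Gaussian mutation
            offspring.append(child)
        pop = offspring                          # generational replacement
    return min(pop, key=fitness)

best = evolve()   # converges close to the optimum x = 0
```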

**Generalized Boosted Regression Models (GBM)** [57,58]. This is an ensemble algorithm in which a set of regression trees is trained following a sequential procedure. During this procedure, GBM applies gradient descent, which is repeated until a given number of trees has been built or no improvement is detected. The latter condition is checked by using the current set of trees to produce predictions. Such predictions are also used to correct the model, so that mistakes made in previous iterations can be corrected. A known issue in gradient boosting is overfitting; GBM tackles this problem with regularization methods, which basically penalize various parts of the algorithm. To produce the final prediction, a voting mechanism among all the trees obtained is used. We used the GBM implementation provided by the caret R package.
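
The sequential correction of residuals can be sketched as follows (an illustrative Python gradient-boosting loop with depth-1 stumps and squared loss, not the GBM package itself): each round fits a stump to the current residuals, i.e., the negative gradient, and adds a shrunken copy of it to the ensemble.

```python
# Sketch of gradient boosting for 1-D regression with decision stumps.

def fit_stump(xs, residuals):
    """Best single-threshold split minimizing squared error."""
    best = None
    for thr in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= thr]
        right = [r for x, r in zip(xs, residuals) if x > thr]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((r - lm) ** 2 for r in left) + sum((r - rm) ** 2 for r in right)
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    _, thr, lm, rm = best
    return lambda x: lm if x <= thr else rm

def gradient_boost(xs, ys, rounds=50, lr=0.3):
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(rounds):
        # For squared loss, the negative gradient is simply the residual.
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

xs = [0, 1, 2, 3, 4, 5]
ys = [0.0, 0.0, 0.0, 3.0, 3.0, 3.0]   # a step function
model = gradient_boost(xs, ys)
```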

**Artificial Neural Networks (ANNs)**. In a rough analogy to biological learning systems, ANNs consist of densely interconnected units, called neurons. Each neuron receives a number of real-valued inputs, e.g., from other neurons of the network, and produces a single real-valued output. The output depends on an activation function applied in each unit, which introduces non-linearity. The activation function produces an output only if the input received by a unit exceeds a given activation threshold; otherwise, no output is produced. Normally, an ANN consists of different layers of neurons: an input layer, an output layer and, in between, one or more hidden layers. The behaviour of the network is determined by weights that are adjusted during a training phase. Several types of ANNs have been proposed. Among these, the simplest, and one of the most widely used, is the so-called feedforward ANN. In this type of network, neurons of adjacent layers are connected, and each of these connections is assigned a weight. The information advances from the input layer toward the output layer, which, in our regression setting, consists of a single unit producing the final prediction of the network. The caret package implementation was used.
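
A forward pass through such a feedforward network can be sketched as follows (a Python illustration with arbitrary example weights, not a trained model and not the caret implementation): each hidden neuron applies a sigmoid activation to a weighted sum of the inputs, and a single linear output unit combines the hidden activations.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def feedforward(inputs, hidden_weights, output_weights):
    """hidden_weights: one [bias, w1, ..., wn] list per hidden neuron;
    output_weights: [bias, v1, ..., vk] for the single output unit."""
    hidden = [sigmoid(w[0] + sum(wi * x for wi, x in zip(w[1:], inputs)))
              for w in hidden_weights]
    # Linear output unit: the final prediction of the network.
    return output_weights[0] + sum(v * h for v, h in zip(output_weights[1:], hidden))

# Two inputs, two hidden neurons, one output unit (illustrative weights).
hidden_w = [[0.0, 1.0, -1.0], [0.5, -0.5, 0.5]]
output_w = [0.1, 2.0, -1.0]
y = feedforward([1.0, 2.0], hidden_w, output_w)
```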

**Random Forests (RF)**. RF was first proposed by Breiman and Cutler in [59]. Like GBM, RF is an ensemble approach, where a set of trees is used to produce the final output via a voting scheme. Each tree is induced from a randomly selected training subset, using also a randomly selected subset of features. This implies that the trees depend on the values of an independently sampled input dataset, with the same distribution used for all trees. In the case of regression, the final prediction is the average of the predictions of the induced trees. For this method too, the implementation provided by caret was used.
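
The bagging-plus-random-features idea can be sketched as follows (a deliberately reduced Python illustration, not the caret implementation: each "tree" is a single stump trained on a bootstrap sample with one randomly chosen feature, and the regression prediction is the average over the ensemble):

```python
import random

def fit_stump(rows, ys, feature):
    """Best single-threshold split on one feature, minimizing squared error."""
    pairs = sorted(zip((r[feature] for r in rows), ys))
    best = None
    for i in range(1, len(pairs)):
        thr = pairs[i - 1][0]
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        if not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - lm) ** 2 for y in left) + sum((y - rm) ** 2 for y in right)
        if best is None or err < best[0]:
            best = (err, thr, lm, rm)
    if best is None:                      # degenerate bootstrap: predict the mean
        mean = sum(ys) / len(ys)
        return lambda r: mean
    _, thr, lm, rm = best
    return lambda r: lm if r[feature] <= thr else rm

def random_forest(rows, ys, n_trees=40, seed=0):
    rng = random.Random(seed)
    trees = []
    for _ in range(n_trees):
        idx = [rng.randrange(len(rows)) for _ in rows]     # bootstrap sample
        feature = rng.randrange(len(rows[0]))              # random feature subset (size 1)
        trees.append(fit_stump([rows[i] for i in idx], [ys[i] for i in idx], feature))
    return lambda r: sum(t(r) for t in trees) / len(trees)  # average the trees

rows = [(0, 5), (1, 4), (2, 3), (3, 2), (4, 1), (5, 0)]
ys = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]     # predictable from either feature
forest = random_forest(rows, ys)
```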

**Ensemble**. This method was proposed by Divina et al. [10]. It is a stacking ensemble scheme in which two layers are used to make the final prediction. In the first layer, EVTree, RF and ANN are used, and the predictions made by these three methods are then combined in the top layer by GBM to produce the final output. This method showed excellent performance on a problem regarding the prediction of global energy demand; thus, it is interesting to check its validity also in a local energy demand setting, such as the one used in this paper.
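
The two-layer structure can be sketched as follows (a generic Python stacking skeleton with toy base "models" and a simple inverse-error weighting as the combiner, purely for illustration; the paper's stack uses EVTree/RF/ANN as base learners and GBM on top): first-layer predictions become the features on which the top-layer combiner is fitted.

```python
def make_stack(base_models, meta_fit):
    def fit(X, y):
        # First layer: each base model produces a prediction per instance.
        base_preds = [[m(x) for m in base_models] for x in X]
        # Top layer: the combiner is trained on those predictions.
        combine = meta_fit(base_preds, y)
        return lambda x: combine([m(x) for m in base_models])
    return fit

# Toy, already-fitted base models.
base_models = [lambda x: 2 * x, lambda x: 2 * x + 1]

def meta_fit(base_preds, y):
    # Weight each base model by inverse mean absolute error (simple heuristic).
    errs = [sum(abs(p[i] - t) for p, t in zip(base_preds, y)) / len(y)
            for i in range(len(base_preds[0]))]
    weights = [1.0 / (e + 1e-9) for e in errs]
    total = sum(weights)
    weights = [w / total for w in weights]
    return lambda preds: sum(w * p for w, p in zip(weights, preds))

X = [0, 1, 2, 3]
y = [2 * x for x in X]          # the first base model is exact
stacked = make_stack(base_models, meta_fit)(X, y)
```

The combiner learns to put (almost) all its weight on the error-free first base model, so the stacked prediction tracks it.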

**Recursive Partitioning and Regression Trees (RPart)** [60]. This method is based on the CART algorithm and builds regression trees using a two-step strategy. The resulting model can be represented as a binary tree, which can be easily interpreted. In the first step, the algorithm determines the variable that best splits the data into two groups. For regression, as in this case, the attribute with the largest standard deviation reduction is considered the best and is used for building the decision node. The data are then separated according to the selected attribute, and the process is applied separately to each sub-group determined in the first phase, until the subgroups either reach a minimum size or no further improvement can be achieved. In a second step, the obtained full tree is pruned using a cross-validation strategy. Again, the caret implementation was used.
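
The split-selection criterion of the first step can be sketched as follows (an illustrative Python function, not the rpart package): the split with the largest standard deviation reduction (SDR), i.e., the drop from the parent's standard deviation to the size-weighted average of the children's, builds the decision node.

```python
import math

def std(ys):
    m = sum(ys) / len(ys)
    return math.sqrt(sum((y - m) ** 2 for y in ys) / len(ys))

def best_split(rows, ys):
    """Return (feature, threshold) with the largest SDR over all binary splits."""
    base = std(ys)
    best = None
    for f in range(len(rows[0])):
        for thr in sorted({r[f] for r in rows}):
            left = [y for r, y in zip(rows, ys) if r[f] <= thr]
            right = [y for r, y in zip(rows, ys) if r[f] > thr]
            if not left or not right:
                continue
            # Standard deviation reduction of this split.
            sdr = base - (len(left) * std(left) + len(right) * std(right)) / len(ys)
            if best is None or sdr > best[0]:
                best = (sdr, f, thr)
    _, f, thr = best
    return f, thr

# Feature 0 is noise; feature 1 perfectly determines the target.
rows = [(7, 0), (3, 0), (9, 1), (1, 1)]
ys = [0.0, 0.0, 10.0, 10.0]
feature, threshold = best_split(rows, ys)
```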

**Extreme Gradient Boosting (XGBoost)** [61]. This method is similar to GBM, as it shares the principle of gradient boosting for building an ensemble of trees. There are, however, important differences. For instance, XGBoost controls over-fitting (a known issue in gradient boosting) by adopting a more regularized model formalization. This allows XGBoost to generally obtain better performance than GBM. XGBoost has recently received much attention in the data science community, as it has been successfully applied in different domains. This popularity is mostly due to the scalability of the method: XGBoost can run up to ten times faster than other popular approaches on a single machine, and it is capable of scaling to billions of instances in distributed or memory-limited settings. The latter feature is achieved via different systems and algorithmic optimizations. Some of these improvements include a novel tree induction algorithm for managing sparse data and a strategy that makes it possible to handle instance weights in approximate tree learning. Parallel and distributed computing makes learning faster, allowing faster model exploration. Additionally, XGBoost employs out-of-core computation, which enables processing massive data even on a simple desktop machine. We used the implementation provided by the caret R package.
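
The "more regularized model formalization" can be illustrated with the closed-form leaf weight from the XGBoost objective, $w^{*}=-G/(H+\lambda)$, where $G$ and $H$ are the sums of first and second derivatives of the loss over the leaf's instances and $\lambda$ is the L2 regularization term (a Python sketch of the formula, not the library):

```python
def leaf_weight(preds, targets, lam):
    """Optimal regularized leaf weight w* = -G / (H + lambda) for squared loss.
    For l = (t - p)^2 / 2: gradient g = p - t, hessian h = 1 per instance."""
    G = sum(p - t for p, t in zip(preds, targets))
    H = float(len(preds))
    return -G / (H + lam)

# Four instances with residual 4 each: lambda = 0 gives the plain mean
# residual; a positive lambda shrinks the leaf weight toward zero.
preds = [0.0, 0.0, 0.0, 0.0]
targets = [4.0, 4.0, 4.0, 4.0]
w_unreg = leaf_weight(preds, targets, lam=0.0)
w_reg = leaf_weight(preds, targets, lam=4.0)
```

This shrinkage of individual leaf weights is one of the penalties that distinguish XGBoost's objective from plain gradient boosting.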

## 4. Results

## 5. Conclusions and Future Work

## Author Contributions

## Funding

## Conflicts of Interest

## Appendix A. Graphical Representation of Results

## References

- ExxonMobil. 2018 Outlook for Energy: A View to 2040. Available online: https://corporate.exxonmobil.com/en/~/media/Global/Files/outlook-for-energy/2018-Outlook-for-Energy.pdf (accessed on 25 March 2019).
- International Energy Agency. World Energy Outlook 2018; International Energy Agency Publications: Paris, France, 2018.
- Ledu, M.; Matthews, H.D.; de Elía, R. Regional estimates of the transient climate response to cumulative CO${}_{2}$ emissions. Nat. Clim. Chang. **2016**, 6, 474–478.
- Energy 2020, A Strategy for Competitive, Sustainable and Secure Energy. Available online: http://eur-lex.europa.eu/legal-content/EN/TXT/PDF/?uri=CELEX:52010DC0639&from=EN (accessed on 12 March 2019).
- Agency, T.I.E. Energy Efficiency: Buildings. Available online: https://www.iea.org/topics/energyefficiency/buildings (accessed on 20 March 2019).
- Raza, M.Q.; Khosravi, A. A review on artificial intelligence based load demand forecasting techniques for smart grid and buildings. Renew. Sustain. Energy Rev. **2015**, 50, 1352–1372.
- Sen, P.; Roy, M.; Pala, P. Application of ARIMA for forecasting energy consumption and GHG emission: A case study of an Indian pig iron manufacturing organization. Energy **2016**, 116, 1031–1038.
- Chujai, P.; Kerdprasop, N.; Kerdprasop, K. Time Series Analysis of Household Electric Consumption with ARIMA and ARMA Models. In Proceedings of the International MultiConference of Engineers and Computer Scientists, Hong Kong, China, 13–15 March 2013; Volume 1.
- Bonetto, R.; Rossi, M. Machine Learning Approaches to Energy Consumption Forecasting in Households. arXiv **2017**, arXiv:1706.09648.
- Divina, F.; Gilson, A.; Goméz-Vela, F.; García Torres, M.; Torres, J.F. Stacking Ensemble Learning for Short-Term Electricity Consumption Forecasting. Energies **2018**, 11, 949.
- Oliveira, E.M.; Oliveira, F.L.C. Forecasting mid-long term electric energy consumption through bagging ARIMA and exponential smoothing methods. Energy **2018**, 144, 776–788.
- Medina, A.; Cámara, A.; Monrobel, J.R. Measuring the Socioeconomic and Environmental Effects of Energy Efficiency Investments for a More Sustainable Spanish Economy. Sustainability **2016**, 8, 1039.
- Campillo, J.; Wallin, F.; Torstensson, D.; Vassileva, I. Energy Demand Model Design For Forecasting Electricity Consumption And Simulating Demand Response Scenarios In Sweden. In Proceedings of the 4th International Conference in Applied Energy 2012, Suzhou, China, 5–8 July 2012; pp. 1–7.
- Pinson, P.; Madsen, H.; O’Malley, M.; O’Connell, N. Benefits and challenges of electrical demand response: A critical review. Renew. Sustain. Energy Rev. **2014**, 39, 686–699.
- Deb, C.; Zhang, F.; Yang, J.; Lee, S.E.; Shah, K.W. A review on time series forecasting techniques for building energy consumption. Renew. Sustain. Energy Rev. **2017**, 74, 902–924.
- Changhong, C.; Bingyan, W.; Qingyan, F.; Green, C.; Streets, D.G. Reductions in emissions of local air pollutants and co-benefits of Chinese energy policy: A Shanghai case study. Energy Policy **2006**, 34, 754–762.
- Gooijer, J.G.D.; Hyndman, R.J. 25 years of time series forecasting. Int. J. Forecast. **2006**, 22, 443–473.
- Abdel-Aal, R.E.; Al-Garni, A.Z. Forecasting monthly electric energy consumption in eastern Saudi Arabia using univariate time-series analysis. Energy **1997**, 22, 1059–1069.
- Shilpa, G.N.; Sheshadri, G.S. Short-Term Load Forecasting Using ARIMA Model For Karnataka State Electrical Load. Int. J. Eng. Res. Dev. **2017**, 13, 75–79.
- Newsham, G.R.; Birt, B.J. Building-level Occupancy Data to Improve ARIMA-based Electricity Use Forecasts. In Proceedings of the 2nd ACM Workshop on Embedded Sensing Systems for Energy-Efficiency in Building (BuildSys ’10), Zurich, Switzerland, 2 November 2010; ACM: New York, NY, USA, 2010; pp. 13–18.
- Rallapallia, S.R.; Ghosh, S. Forecasting monthly peak demand of electricity in India—A critique. Energy Policy **2012**, 45, 516–520.
- Schrock, D.W.; Claridge, D.E. Predicting Energy Usage in a Supermarket. In Proceedings of the 6th Symposium on Improving Building Systems in Hot and Humid Climates, Dallas, TX, USA, 3–4 October 1989; Available online: https://core.ac.uk/download/pdf/79624985.pdf (accessed on 20 May 2019).
- Nowotarski, J.; Liu, B.; Weron, R.; Hong, T. Improving short term load forecast accuracy via combining sister forecasts. Energies **2016**, 98, 40–49.
- Samarasinghe, M.; Al-Hawani, W. Short-Term Forecasting of Electricity Consumption using Gaussian Processes. Master’s Thesis, University of Agder, Kristiansand, Norway, 2012.
- Rahman, H.; Selvarasan, I.; Begum, J. Short-Term Forecasting of Total Energy Consumption for India-A Black Box based Approach. Energies **2018**, 11, 3442.
- Park, D.; El-Sharkawi, M.; Marks, R.J.; Atlas, L.; Damborg, M. Electric Load Forecasting Using An Artificial Neural Network. IEEE Trans. Power Syst. **1991**, 6, 442–449.
- Nizami, J.; Al-Garni, A.Z. Forecasting electric energy consumption using neural networks. Energy Policy **1995**, 23, 1097–1104.
- Hong, T. Short Term Electric Load Forecasting. Ph.D. Thesis, Graduate Faculty of North Carolina State University, Raleigh, NC, USA, 2010.
- Kelo, S.; Dudul, S. A wavelet Elman neural network for short-term electrical load prediction under the influence of temperature. Int. J. Electr. Power Energy Syst. **2012**, 43, 1063–1071.
- Chitsaz, H.; Shaker, H.; Zareipour, H.; Wood, D.; Amjady, N. Short-term electricity load forecasting of buildings in microgrids. Energy Build. **2015**, 99, 50–60.
- Zheng, H.; Yuan, J.; Chen, L. Short-Term Load Forecasting Using EMD-LSTM Neural Networks with a Xgboost Algorithm for Feature Importance Evaluation. Energies **2017**, 10, 1168.
- Jaina, R.K.; Smith, K.M.; Culligan, P.J.; Taylor, J.E. Forecasting energy consumption of multi-family residential buildings using support vector regression: Investigating the impact of temporal and spatial monitoring granularity on performance accuracy. Appl. Energy **2014**, 123, 168–178.
- Liu, D.; Chen, Q.; Mori, K. Time series forecasting method of building energy consumption using support vector regression. In Proceedings of the 2015 IEEE International Conference on Information and Automation, Lijiang, China, 8–10 August 2015; pp. 1628–1632.
- Lora, A.T.; Santos, J.M.R.; Riquelme, J.C.; Expósito, A.G.; Ramos, J.L.M. Time-Series Prediction: Application to the Short-Term Electric Energy Demand. In Lecture Notes in Computer Science, Proceedings of the Current Topics in Artificial Intelligence TTIA 2003, San Sebastian, Spain, 12–14 November 2004; Springer: Berlin, Germany, 2004; Volume 3040, pp. 577–586.
- Zheng, J.; Xu, C.; Zhang, Z.; Li, X. Electric load forecasting in smart grids using Long-Short-Term-Memory based Recurrent Neural Network. In Proceedings of the 51st Annual Conference on Information Sciences and Systems (CISS), Baltimore, MD, USA, 22–24 March 2017; pp. 1–6.
- Torres, J.F.; Fernández, A.M.; Troncoso, A.; Martínez-Álvarez, F. Deep Learning-Based Approach for Time Series Forecasting with Application to Electricity Load. In Biomedical Applications Based on Natural and Artificial Computing; Springer International Publishing: Cham, Switzerland, 2017; pp. 203–212.
- Galicia, A.; Torres, J.F.; Martínez-Álvarez, F.; Troncoso, A. Scalable Forecasting Techniques Applied to Big Electricity Time Series. In Advances in Computational Intelligence, Proceedings of the 14th International Work-Conference on Artificial Neural Networks, IWANN 2017, Cadiz, Spain, 14–16 June 2017; Springer International Publishing: Cham, Switzerland, 2017; Part II; pp. 165–175.
- Castelli, M.; Vanneschi, L.; De Felice, M. Forecasting short-term electricity consumption using a semantics-based genetic programming framework: The South Italy case. Energy Econ. **2015**, 47, 37–41.
- Talavera-Llames, R.L.; Pérez-Chacón, R.; Martínez-Ballesteros, M.; Troncoso, A.; Martínez-Álvarez, F. A Nearest Neighbours-Based Algorithm for Big Time Series Data Forecasting. In Hybrid Artificial Intelligent Systems, Proceedings of the 11th International Conference, HAIS 2016, Seville, Spain, 18–20 April 2016; Springer International Publishing: Cham, Switzerland, 2016; pp. 174–185.
- Geng, J.; Huang, M.L.; Li, M.W.; Hong, W.C. Hybridization of seasonal chaotic cloud simulated annealing algorithm in a SVR-based load forecasting model. Neurocomputing **2015**, 151, 1362–1373.
- Fan, G.F.; Peng, L.L.; Hong, W.C.; Sun, F. Electric load forecasting by the SVR model with differential empirical mode decomposition and auto regression. Neurocomputing **2016**, 173, 958–970.
- Jetcheva, J.G.; Majidpour, M.; Chen, W.P. Neural network model ensembles for building-level electricity load forecasts. Energy Build. **2014**, 84, 214–223.
- Khairalla, M.A.; Ning, X.; AL-Jallad, N.T.; El-Faroug, M.O. Short-Term Forecasting for Energy Consumption through Stacking Heterogeneous Ensemble Learning Model. Energies **2018**, 11, 1605.
- Martínez-Álvarez, F.; Troncoso, A.; Asencio-Cortés, G.; Riquelme, J.C. A Survey on Data Mining Techniques Applied to Electricity-Related Time Series Forecasting. Energies **2015**, 8, 13162–13193.
- Daut, M.A.M.; Hassan, M.Y.; Abdullah, H.; Rahman, H.A.; Abdullah, M.P.; Hussin, F. Building electrical energy consumption forecasting analysis using conventional and artificial intelligence methods: A review. Renew. Sustain. Energy Rev. **2017**, 70, 1108–1118.
- Spanish-Government. Agencia Estatal de Meteorologia—AEMET. Gobierno de España. 2019. Available online: https://opendata.aemet.es/ (accessed on 25 April 2019).
- Kleiber, C.; Zeileis, A. Applied Econometrics with R; Springer: New York, NY, USA, 2008; ISBN 978-0-387-77316-2.
- Neter, J.; Kutner, M.H.; Nachtsheim, C.J.; Wasserman, W. Applied Linear Statistical Models; Irwin: Chicago, IL, USA, 1996.
- Kuhn, M. Building Predictive Models in R Using the caret Package. J. Stat. Softw. **2008**, 28, 1–26.
- Box, G.; Jenkins, G. Time Series Analysis: Forecasting and Control; John Wiley and Sons: New York, NY, USA, 2008.
- Box, G.E.P.; Jenkins, G. Time Series Analysis, Forecasting and Control; Holden-Day, Inc.: San Francisco, CA, USA, 1990.
- Salles, R.; Assis, L.; Guedes, G.; Bezerra, E.; Porto, F.; Ogasawara, E. A Framework for Benchmarking Machine Learning Methods Using Linear Models for Univariate Time Series Prediction. In Proceedings of the 2017 International Joint Conference on Neural Networks (IJCNN), Anchorage, AK, USA, 14–19 May 2017.
- Time Series Models for “Caret” Package. Available online: https://github.com/sfeuerriegel/caret.ts (accessed on 3 April 2019).
- Eiben, A.E.; Smith, J.E. Introduction to Evolutionary Computing; Springer: New York, NY, USA, 2003.
- Grubinger, T.; Zeileis, A.; Pfeiffer, K.P. evtree: Evolutionary Learning of Globally Optimal Classification and Regression Trees in R. J. Stat. Softw. **2014**, 61, 1–29.
- Loh, W.Y. Classification and regression trees. In Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery; Wiley: New York, NY, USA, 2011; Volume 1, pp. 14–23.
- Friedman, J.H. Greedy Function Approximation: A Gradient Boosting Machine. Ann. Stat. **2000**, 29, 1189–1232.
- Friedman, J.H. Stochastic gradient boosting. Comput. Stat. Data Anal. **2002**, 38, 367–378.
- Breiman, L. Random Forests. Mach. Learn. **2001**, 45, 5–32.
- Breiman, L.; Friedman, J.; Olshen, R.; Stone, C. Classification and Regression Trees; Wadsworth and Brooks: Monterey, CA, USA, 1984.
- Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining (KDD ’16), San Francisco, CA, USA, 13–17 August 2016; ACM: New York, NY, USA, 2016; pp. 785–794.
- Talavera-Llames, R.; Pérez-Chacón, R.; Troncoso, A.; Martínez-Álvarez, F. MV-kWNN: A novel multivariate and multi-output weighted nearest neighbours algorithm for big data time series forecasting. Neurocomputing **2019**, 353, 56–73.
- Galicia, A.; Talavera-Llames, R.; Troncoso, A.; Koprinska, I.; Martínez-Álvarez, F. Multi-step forecasting for big data time series based on ensemble learning. Knowl.-Based Syst. **2019**, 163, 830–841.
- Floreano, D.; Dürr, P.; Mattiussi, C. Neuroevolution: From architectures to learning. Evol. Intell. **2008**, 1, 47–62.
- Stanley, K.O.; Clune, J.; Lehman, J.; Miikkulainen, R. Designing neural networks through neuroevolution. Nat. Mach. Intell. **2019**, 1, 24–35.
- Real, E.; Aggarwal, A.; Huang, Y.; Le, Q.V. Regularized Evolution for Image Classifier Architecture Search. arXiv **2018**, arXiv:1802.01548.

**Figure 1.** Graphical representation of the time series of all the buildings. The values are shown on a logarithmic scale.

**Figure 4.** Dataset pre-processing. w determines the amount of historical data used, while h determines the prediction horizon.

**Figure 7.**Four examples of real data against the predictions from ARIMA and Random Forest for w = 10.

**Table 1.** Description of the selected buildings. The table reports, for each building, the year of construction (with the year of refurbishment, if any, in brackets), the size in m${}^{2}$, and a short description.

Building | Year (Refurbishment) | Size (m${}^{2}$) | Description |
---|---|---|---|

1 | 1956 (2005) | 75,323 | university canteen |

2 | 1956 (2001) | 527,277 | a four story building with several classrooms and offices |

3 | 1956 (1999) | 5231.01 | a four story building with several classrooms and offices |

6 | 1956 (1999) | 5231.01 | a four story building with four large classrooms |

7 | 1958 (2003) | 5231.01 | a four story building with several classrooms and offices |

8 | 1956 (1999) | 1948.16 | a two story building with four large classrooms |

10 | 1956 (1999) | 5231.01 | a four story building with several classrooms and offices |

12 | 1956 (1999) | 5231.01 | a four story building with several classrooms and offices |

14 | 1956 (1990) | 5231.01 | a four story building with several classrooms and offices |

24 | 1956 (2007) | 9134.93 | a three story building with several classrooms and offices |

25 | 2012 (-) | 16,281.71 | a new two story building hosting the university library and some classrooms |

32 | 1956 (2005) | 2530.78 | Administration offices |

44 | 2008 (-) | 1752.83 | a new three story building with research labs, classrooms and offices |

Year | Avg. Temp. (°C) | Avg. Max. Temp. (°C) | Avg. Min. Temp. (°C) | Rainfall (mm) | Avg. Wind Sp. (km/h) |
---|---|---|---|---|---|

2012 | 18.7 | 25.8 | 12.4 | 324.13 | 9.0 |

2013 | 18.5 | 25.3 | 12.6 | 425.67 | 9.9 |

2014 | 18.9 | 25.6 | 13.3 | 636.50 | 9.4 |

2015 | 19.1 | 26.8 | 13.1 | 318.46 | 8.6 |

2016 | 19.1 | 26.0 | 13.5 | 626.31 | 9.4 |

2017 | 19.6 | 27.0 | 13.2 | 321.0 | 8.8 |

**Table 3.**Results obtained by all the methods on all the buildings for each value of the historical window considered.

ARIMA | LM | EVTree | RF | NN | GBM | RPart | ENSEMBLE | XGBoost | |||||||||||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|

w | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | MAE | RMSE | |

Building 1 | 7 | 0.57 | 0.71 | 0.50 | 0.58 | 0.48 | 0.58 | 0.48 | 0.56 | 0.49 | 0.57 | 0.49 | 0.57 | 0.51 | 0.60 | 0.55 | 0.67 | 0.49 | 0.57 |

10 | 0.52 | 0.63 | 0.46 | 0.54 | 0.50 | 0.61 | 0.47 | 0.55 | 0.46 | 0.54 | 0.46 | 0.55 | 0.50 | 0.58 | 0.51 | 0.64 | 0.46 | 0.55 | |

15 | 0.52 | 0.63 | 0.46 | 0.54 | 0.48 | 0.58 | 0.48 | 0.55 | 0.46 | 0.54 | 0.47 | 0.55 | 0.52 | 0.59 | 0.54 | 0.66 | 0.47 | 0.55 | |

20 | 0.55 | 0.68 | 0.47 | 0.55 | 0.49 | 0.59 | 0.48 | 0.56 | 0.47 | 0.54 | 0.47 | 0.55 | 0.52 | 0.60 | 0.53 | 0.65 | 0.48 | 0.56 | |

Building 2 | 7 | 2.10 | 2.57 | 1.37 | 1.55 | 1.41 | 1.61 | 1.35 | 1.53 | 1.40 | 1.61 | 1.35 | 1.53 | 1.39 | 1.57 | 1.41 | 1.69 | 1.35 | 1.54 |

10 | 1.73 | 2.16 | 1.31 | 1.50 | 1.34 | 1.57 | 1.32 | 1.50 | 1.32 | 1.50 | 1.30 | 1.50 | 1.38 | 1.57 | 1.41 | 1.70 | 1.31 | 1.51 | |

15 | 1.52 | 1.89 | 1.32 | 1.53 | 1.38 | 1.63 | 1.33 | 1.51 | 1.33 | 1.53 | 1.30 | 1.53 | 1.39 | 1.57 | 1.39 | 1.72 | 1.33 | 1.54 | |

20 | 1.66 | 2.05 | 1.32 | 1.53 | 1.36 | 1.61 | 1.33 | 1.51 | 1.33 | 1.53 | 1.32 | 1.53 | 1.39 | 1.57 | 1.35 | 1.64 | 1.32 | 1.54 | |

Building 3 | 7 | 1.07 | 4.72 | 1.07 | 4.72 | 1.06 | 4.67 | 1.01 | 4.64 | 1.09 | 4.67 | 1.03 | 4.65 | 1.05 | 4.66 | 1.04 | 4.56 | 1.02 | 4.65 |

10 | 1.06 | 4.92 | 1.06 | 4.92 | 0.97 | 4.63 | 0.93 | 4.64 | 0.91 | 4.65 | 0.91 | 4.64 | 1.00 | 4.66 | 1.02 | 4.66 | 0.92 | 4.64 | |

15 | 1.16 | 5.14 | 1.16 | 5.14 | 0.96 | 4.68 | 0.94 | 4.66 | 0.93 | 4.68 | 0.91 | 4.66 | 1.00 | 4.67 | 0.98 | 4.67 | 0.93 | 4.66 | |

20 | 1.16 | 5.14 | 1.16 | 5.14 | 0.99 | 4.68 | 0.94 | 4.67 | 0.93 | 4.69 | 0.93 | 4.67 | 0.99 | 4.68 | 1.02 | 4.68 | 0.94 | 4.67 | |

Building 6 | 7 | 1.83 | 2.34 | 0.96 | 1.25 | 0.98 | 1.29 | 0.94 | 1.24 | 0.97 | 1.26 | 0.97 | 1.28 | 0.95 | 1.23 | 1.00 | 1.35 | 0.98 | 1.28 |

10 | 1.32 | 1.74 | 0.88 | 1.15 | 0.88 | 1.20 | 0.87 | 1.15 | 0.87 | 1.14 | 0.87 | 1.16 | 0.89 | 1.17 | 0.91 | 1.26 | 0.86 | 1.16 | |

15 | 1.13 | 1.48 | 0.88 | 1.16 | 0.91 | 1.25 | 0.87 | 1.17 | 0.89 | 1.17 | 0.86 | 1.18 | 0.91 | 1.20 | 0.93 | 1.28 | 0.89 | 1.19 | |

20 | 1.17 | 1.51 | 0.88 | 1.17 | 0.95 | 1.26 | 0.87 | 1.17 | 0.89 | 1.17 | 0.88 | 1.19 | 0.91 | 1.20 | 0.93 | 1.26 | 0.87 | 1.18 | |

Building 7 | 7 | 2.34 | 2.89 | 1.13 | 1.32 | 1.16 | 1.41 | 1.09 | 1.28 | 1.12 | 1.31 | 1.09 | 1.30 | 1.12 | 1.31 | 1.13 | 1.36 | 1.09 | 1.30 |

10 | 2.25 | 2.76 | 1.06 | 1.25 | 1.08 | 1.32 | 1.04 | 1.22 | 1.06 | 1.27 | 1.03 | 1.21 | 1.08 | 1.29 | 1.13 | 1.39 | 1.04 | 1.24 | |

15 | 1.06 | 1.26 | 1.06 | 1.26 | 1.08 | 1.28 | 1.05 | 1.22 | 1.08 | 1.26 | 1.03 | 1.21 | 1.09 | 1.26 | 1.11 | 1.34 | 1.04 | 1.22 | |

20 | 1.06 | 1.30 | 1.06 | 1.25 | 1.09 | 1.32 | 1.06 | 1.23 | 1.07 | 1.24 | 1.06 | 1.24 | 1.10 | 1.27 | 1.12 | 1.37 | 1.05 | 1.22 | |

Building 8 | 7 | 0.39 | 0.49 | 0.36 | 0.44 | 0.37 | 0.46 | 0.35 | 0.44 | 0.36 | 0.44 | 0.36 | 0.45 | 0.36 | 0.44 | 0.39 | 0.52 | 0.35 | 0.45 |

10 | 0.47 | 0.60 | 0.34 | 0.43 | 0.36 | 0.46 | 0.34 | 0.43 | 0.34 | 0.43 | 0.33 | 0.43 | 0.36 | 0.44 | 0.38 | 0.53 | 0.34 | 0.43 | |

15 | 0.36 | 0.45 | 0.34 | 0.43 | 0.36 | 0.47 | 0.34 | 0.43 | 0.34 | 0.43 | 0.34 | 0.43 | 0.36 | 0.44 | 0.37 | 0.50 | 0.34 | 0.43 | |

20 | 0.34 | 0.43 | 0.34 | 0.43 | 0.35 | 0.46 | 0.34 | 0.43 | 0.34 | 0.43 | 0.34 | 0.43 | 0.36 | 0.44 | 0.38 | 0.51 | 0.34 | 0.43 | |

Building 10 | 7 | 1.38 | 1.71 | 1.02 | 1.25 | 1.02 | 1.28 | 1.02 | 1.26 | 1.02 | 1.26 | 1.01 | 1.24 | 1.07 | 1.37 | 1.07 | 1.32 | 1.02 | 1.26 |

10 | 1.23 | 1.52 | 0.95 | 1.16 | 0.96 | 1.21 | 0.96 | 1.19 | 0.94 | 1.16 | 0.94 | 1.17 | 1.01 | 1.29 | 1.02 | 1.27 | 0.94 | 1.18 | |

15 | 1.01 | 1.27 | 0.97 | 1.17 | 0.97 | 1.21 | 0.97 | 1.19 | 0.97 | 1.18 | 0.94 | 1.15 | 1.01 | 1.27 | 1.03 | 1.30 | 0.95 | 1.17 | |

20 | 1.18 | 1.48 | 0.98 | 1.18 | 0.99 | 1.23 | 0.97 | 1.19 | 0.97 | 1.18 | 0.95 | 1.16 | 1.01 | 1.27 | 1.04 | 1.27 | 0.96 | 1.18 | |

Building 12 | 7 | 0.52 | 0.65 | 0.39 | 0.45 | 0.39 | 0.46 | 0.37 | 0.43 | 0.38 | 0.45 | 0.37 | 0.43 | 0.39 | 0.45 | 0.40 | 0.50 | 0.36 | 0.43 |

10 | 0.42 | 0.52 | 0.37 | 0.43 | 0.39 | 0.47 | 0.36 | 0.43 | 0.37 | 0.43 | 0.35 | 0.43 | 0.39 | 0.46 | 0.39 | 0.49 | 0.35 | 0.43 | |

15 | 0.43 | 0.53 | 0.37 | 0.43 | 0.38 | 0.46 | 0.36 | 0.43 | 0.37 | 0.43 | 0.36 | 0.43 | 0.39 | 0.46 | 0.38 | 0.48 | 0.36 | 0.44 | |

20 | 0.51 | 0.64 | 0.37 | 0.43 | 0.39 | 0.47 | 0.36 | 0.43 | 0.37 | 0.43 | 0.36 | 0.42 | 0.39 | 0.46 | 0.39 | 0.49 | 0.36 | 0.44 | |

Building 14 | 7 | 1.87 | 2.21 | 1.19 | 1.45 | 1.23 | 1.51 | 1.09 | 1.33 | 1.17 | 1.43 | 1.12 | 1.36 | 1.27 | 1.52 | 1.24 | 1.52 | 1.15 | 1.39 |

10 | 1.11 | 1.37 | 0.98 | 1.19 | 1.10 | 1.39 | 0.97 | 1.18 | 1.01 | 1.23 | 0.95 | 1.14 | 1.21 | 1.43 | 1.03 | 1.29 | 0.95 | 1.15 | |

15 | 0.90 | 1.09 | 0.89 | 1.07 | 1.00 | 1.23 | 0.94 | 1.14 | 0.90 | 1.08 | 0.89 | 1.09 | 1.04 | 1.26 | 0.94 | 1.20 | 0.89 | 1.09 | |

20 | 1.06 | 1.31 | 0.89 | 1.07 | 1.00 | 1.23 | 0.94 | 1.14 | 0.90 | 1.08 | 0.90 | 1.10 | 1.04 | 1.26 | 1.03 | 1.31 | 0.91 | 1.10 | |

Building 24 | 7 | 1.20 | 1.56 | 0.94 | 1.22 | 0.98 | 1.27 | 0.93 | 1.20 | 0.95 | 1.23 | 0.92 | 1.20 | 0.94 | 1.22 | 1.02 | 1.32 | 0.94 | 1.21 |

10 | 1.12 | 1.47 | 0.88 | 1.14 | 0.98 | 1.30 | 0.88 | 1.15 | 0.85 | 1.11 | 0.88 | 1.14 | 0.92 | 1.20 | 1.00 | 1.31 | 0.88 | 1.15 | |

15 | 1.25 | 1.63 | 0.87 | 1.14 | 0.95 | 1.25 | 0.88 | 1.14 | 0.88 | 1.16 | 0.90 | 1.17 | 0.92 | 1.18 | 0.94 | 1.22 | 0.88 | 1.16 | |

20 | 0.99 | 1.28 | 0.88 | 1.14 | 1.00 | 1.32 | 0.89 | 1.15 | 0.88 | 1.16 | 0.92 | 1.19 | 0.90 | 1.17 | 0.96 | 1.27 | 0.91 | 1.18 | |

Building 25 | 7 | 4.34 | 5.54 | 3.48 | 4.29 | 3.86 | 4.94 | 3.43 | 4.24 | 3.73 | 4.56 | 3.44 | 4.28 | 3.51 | 4.29 | 4.01 | 5.07 | 3.41 | 4.26 |

10 | 6.41 | 7.73 | 3.29 | 4.06 | 3.62 | 4.56 | 3.24 | 4.05 | 3.47 | 4.23 | 3.26 | 4.09 | 3.38 | 4.18 | 3.82 | 4.78 | 3.24 | 4.07 | |

15 | 3.83 | 4.84 | 3.33 | 4.12 | 3.84 | 4.83 | 3.34 | 4.16 | 3.49 | 4.32 | 3.40 | 4.29 | 3.76 | 4.73 | 3.70 | 4.79 | 3.41 | 4.29 | |

20 | 4.27 | 5.23 | 3.33 | 4.12 | 3.75 | 4.73 | 3.33 | 4.14 | 3.47 | 4.32 | 3.38 | 4.21 | 3.76 | 4.74 | 3.72 | 4.80 | 3.40 | 4.25 | |

Edificio 32 | 7 | 0.66 | 0.80 | 0.52 | 0.63 | 0.56 | 0.69 | 0.52 | 0.63 | 0.53 | 0.64 | 0.52 | 0.64 | 0.53 | 0.63 | 0.59 | 0.75 | 0.52 | 0.63 |

10 | 0.86 | 1.06 | 0.51 | 0.62 | 0.56 | 0.69 | 0.51 | 0.62 | 0.51 | 0.62 | 0.51 | 0.63 | 0.52 | 0.63 | 0.58 | 0.75 | 0.51 | 0.62 | |

15 | 0.63 | 0.78 | 0.52 | 0.62 | 0.56 | 0.70 | 0.51 | 0.62 | 0.52 | 0.62 | 0.51 | 0.63 | 0.53 | 0.64 | 0.55 | 0.70 | 0.52 | 0.65 | |

20 | 0.59 | 0.72 | 0.52 | 0.63 | 0.54 | 0.66 | 0.52 | 0.63 | 0.52 | 0.63 | 0.51 | 0.63 | 0.53 | 0.64 | 0.57 | 0.72 | 0.53 | 0.64 | |

Edificio 44 | 7 | 0.55 | 0.68 | 0.46 | 0.56 | 0.47 | 0.57 | 0.45 | 0.54 | 0.47 | 0.57 | 0.45 | 0.55 | 0.46 | 0.55 | 0.49 | 0.61 | 0.46 | 0.55 |

10 | 0.60 | 0.74 | 0.45 | 0.54 | 0.47 | 0.58 | 0.45 | 0.53 | 0.45 | 0.54 | 0.44 | 0.54 | 0.46 | 0.55 | 0.49 | 0.61 | 0.45 | 0.54 | |

15 | 0.44 | 0.53 | 0.45 | 0.53 | 0.48 | 0.59 | 0.45 | 0.53 | 0.45 | 0.53 | 0.45 | 0.53 | 0.47 | 0.55 | 0.50 | 0.63 | 0.45 | 0.54 | |

20 | 0.67 | 0.82 | 0.45 | 0.53 | 0.49 | 0.59 | 0.45 | 0.53 | 0.45 | 0.53 | 0.45 | 0.53 | 0.46 | 0.55 | 0.48 | 0.61 | 0.45 | 0.54 | |

Average | 7 | 1.45 | 2.07 | 1.03 | 1.52 | 1.08 | 1.60 | 1.00 | 1.49 | 1.05 | 1.54 | 1.01 | 1.50 | 1.04 | 1.53 | 1.10 | 1.63 | 1.01 | 1.50 |

10 | 1.47 | 2.09 | 0.96 | 1.46 | 1.02 | 1.54 | 0.95 | 1.43 | 0.97 | 1.45 | 0.94 | 1.43 | 1.01 | 1.50 | 1.05 | 1.59 | 0.94 | 1.44 | |

15 | 1.10 | 1.66 | 0.97 | 1.47 | 1.03 | 1.55 | 0.96 | 1.44 | 0.97 | 1.46 | 0.95 | 1.45 | 1.03 | 1.53 | 1.03 | 1.58 | 0.96 | 1.46 | |

20 | 1.17 | 1.74 | 0.97 | 1.47 | 1.03 | 1.55 | 0.96 | 1.44 | 0.97 | 1.46 | 0.96 | 1.45 | 1.03 | 1.53 | 1.04 | 1.58 | 0.96 | 1.46 |
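The historical window w in the table above is the number of past observations supplied to the models as regressors. As a minimal sketch of how such lagged feature vectors can be built (the function and variable names are illustrative, not taken from the study's code):

```python
def sliding_windows(series, w):
    # each sample pairs the w previous observations (features) with the next value (target)
    samples = []
    for i in range(len(series) - w):
        samples.append((series[i:i + w], series[i + w]))
    return samples

# toy consumption series; with w = 3 the first sample uses readings 0..2 to predict reading 3
windows = sliding_windows([1.0, 1.2, 0.9, 1.1, 1.3, 1.0], 3)
print(windows[0])  # ([1.0, 1.2, 0.9], 1.1)
```

Larger values of w give the learner more context but also fewer training samples, which is one reason the error does not decrease monotonically with w in the table.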

**Table 4.** Average MAE obtained by the methods for all values of the historical window and on all the buildings. The standard deviation is shown in brackets.

Method | Average | Average for w = 10 |
---|---|---|
RF | 0.97 (0.75) | 0.95 (0.75) |
GBM | 0.97 (0.76) | 0.94 (0.75) |
XGBoost | 0.97 (0.76) | 0.94 (0.75) |
LM | 0.98 (0.76) | 0.96 (0.75) |
NN | 0.99 (0.81) | 0.97 (0.81) |
rpart | 1.03 (0.75) | 1.01 (0.79) |
EVTree | 1.04 (0.86) | 1.02 (0.85) |
ENSEMBLE | 1.06 (0.87) | 1.05 (0.90) |
ARIMA | 1.30 (1.15) | 1.29 (1.14) |
previous day | 1.44 (1.18) | 1.43 (1.22) |
previous week | 1.33 (1.12) | 1.32 (1.14) |

**Table 5.** Average RMSE obtained by the methods for all values of the historical window and on all the buildings. The standard deviation is shown in brackets.

Method | Average | Average for w = 10 |
---|---|---|
RF | 1.45 (1.32) | 1.43 (1.35) |
GBM | 1.46 (1.33) | 1.43 (1.35) |
XGBoost | 1.46 (1.33) | 1.44 (1.35) |
NN | 1.47 (1.36) | 1.45 (1.38) |
LM | 1.48 (1.39) | 1.46 (1.40) |
rpart | 1.52 (1.37) | 1.50 (1.36) |
EVTree | 1.56 (1.41) | 1.54 (1.41) |
ENSEMBLE | 1.59 (1.42) | 1.58 (1.45) |
ARIMA | 1.89 (1.67) | 1.92 (1.85) |
previous day | 2.07 (1.90) | 2.14 (1.99) |
previous week | 2.05 (1.91) | 2.04 (1.96) |
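The MAE and RMSE figures reported in Tables 4 and 5 follow the standard definitions: MAE averages the absolute residuals, while RMSE penalizes large residuals more heavily. A minimal sketch (the consumption values below are made up for illustration, not taken from the study):

```python
import math

def mae(actual, predicted):
    # mean absolute error: average magnitude of the residuals
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def rmse(actual, predicted):
    # root mean squared error: square root of the mean squared residual
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

# hypothetical hourly consumption readings (kWh) and the corresponding forecasts
actual = [3.4, 3.1, 2.8, 3.0]
predicted = [3.2, 3.3, 2.9, 2.7]
print(round(mae(actual, predicted), 2))   # 0.2
print(round(rmse(actual, predicted), 2))  # 0.21
```

Because RMSE squares the residuals before averaging, RMSE ≥ MAE always holds, which is why every RMSE entry in Table 5 exceeds the corresponding MAE entry in Table 4.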

**Table 6.** Significance of the results regarding MAE according to a two-tailed t-test with a confidence level of 1%.

Method | EVTree | RF | NN | GBM | ARIMA | LM | rpart | ENSEMBLE | XGBoost |
---|---|---|---|---|---|---|---|---|---|
EVTree | s | s | s | s | s | s | | | |
RF | s | s | s | | | | | | |
NN | s | s | s | s | | | | | |
GBM | s | s | s | | | | | | |
ARIMA | s | s | s | s | | | | | |
LM | s | s | | | | | | | |
rpart | s | | | | | | | | |
ENSEMBLE | s | | | | | | | | |

**Table 7.** Significance of the results regarding RMSE according to a two-tailed t-test with a confidence level of 1%.

Method | EVTree | RF | NN | GBM | ARIMA | LM | rpart | ENSEMBLE | XGBoost |
---|---|---|---|---|---|---|---|---|---|
EVTree | s | s | s | s | s | s | s | | |
RF | s | s | s | s | s | | | | |
NN | s | s | s | | | | | | |
GBM | s | s | s | s | | | | | |
ARIMA | s | s | s | s | | | | | |
LM | s | | | | | | | | |
rpart | s | s | | | | | | | |
ENSEMBLE | s | | | | | | | | |
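Tables 6 and 7 mark with "s" the method pairs whose error differences are significant under a paired two-tailed t-test at the 1% level. A minimal sketch of such a test on two methods' per-building errors (the MAE vectors are hypothetical, not the study's data; the critical value 3.250 is the tabulated two-tailed 1% threshold for 9 degrees of freedom):

```python
import math

def paired_t_statistic(errors_a, errors_b):
    # paired t-test: work on the per-sample differences between the two error series
    diffs = [a - b for a, b in zip(errors_a, errors_b)]
    n = len(diffs)
    mean = sum(diffs) / n
    # sample standard deviation of the differences (n - 1 denominator)
    sd = math.sqrt(sum((d - mean) ** 2 for d in diffs) / (n - 1))
    return mean / (sd / math.sqrt(n)), n - 1

# hypothetical per-building MAE values for two methods on 10 buildings
mae_a = [0.44, 1.09, 0.98, 3.37, 0.58, 0.56, 1.02, 0.57, 0.62, 1.17]
mae_b = [0.36, 0.97, 0.88, 3.24, 0.51, 0.45, 0.93, 0.45, 0.52, 1.09]
t, df = paired_t_statistic(mae_a, mae_b)
T_CRIT_1PCT_DF9 = 3.250  # two-tailed 1% critical value for df = 9, from a t table
print(abs(t) > T_CRIT_1PCT_DF9)  # True: the difference is significant at the 1% level
```

Pairing the errors per building (rather than comparing the two pooled means) controls for the large between-building differences visible in the table above, such as Edificio 25 versus Edificio 32.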

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Divina, F.; García Torres, M.; Goméz Vela, F.A.; Vázquez Noguera, J.L.
A Comparative Study of Time Series Forecasting Methods for Short Term Electric Energy Consumption Prediction in Smart Buildings. *Energies* **2019**, *12*, 1934.
https://doi.org/10.3390/en12101934
