# Load Forecasting for a Campus University Using Ensemble Methods Based on Regression Trees

## Abstract


## 1. Introduction

## 2. Ensemble Methods Based on Regression Trees

#### 2.1. Bagging

#### 2.2. Random Forest

#### 2.3. Conditional Forest

#### 2.4. Boosting

- Tree constraints: there are several ways to introduce constraints when constructing regression trees. For example, the following tree constraints can be considered as regularization parameters:
  - The number of gradient boosting iterations N: increasing N reduces the error on the training dataset, but may lead to overfitting. An optimal value of N is often selected by monitoring the prediction error on a separate validation dataset.
  - Tree depth: the size of the trees, or the number of terminal nodes, which controls the maximum allowed level of interaction between variables in the model. The weak learners need to have skill, but they should remain weak, so shallower trees are preferred. In general, tree depths between 4 and 8 work well, and values greater than 10 are unlikely to be required, see [35].
  - The minimum number of observations per split: the minimum number of observations needed before a split can be considered. It helps to reduce the prediction variance at the leaves.
- Shrinkage or learning rate: in regularization by shrinkage, each update is scaled by the value of the learning rate parameter “eta” in (0,1]. Shrinkage reduces the influence of each individual tree and leaves space for future trees to improve the model. As stated in [28], small learning rates improve the model’s generalization ability over gradient boosting without shrinkage (eta = 1), but the computational time increases. Besides, the number of iterations and the learning rate are tightly related: a smaller learning rate “eta” requires a greater N.
- Random sampling: to reduce the correlation between the trees in the sequence, at each step a subsample of the training data is selected without replacement to fit the base learner. This modification, first introduced in [36] and known as stochastic gradient boosting, prevents overfitting. Friedman observed an improvement in gradient boosting’s accuracy with subsamples of around one half of the training dataset. An alternative to row sampling is column sampling, which prevents overfitting even more effectively, see [37].
- Penalize tree complexity: the complexity of a tree can be defined as a combination of the number of leaves and the L2 norm of the leaf scores. This regularization not only avoids overfitting, but also tends to select simple and predictive models. Following this approach, ref. [37] describes a scalable tree boosting system called XGBoost. In that paper, the objective to be minimized is a combination of the loss function and the complexity of the tree. In contrast to the previous ensemble methods, XGBoost requires a minimal amount of computational resources to solve real-world problems.
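The interplay of these regularization knobs can be seen in a minimal gradient-boosting sketch for squared loss. This is illustrative only, not the paper's implementation: it uses depth-1 trees (stumps) on a single predictor, and all function and variable names are ours.

```python
import random

def fit_stump(x, r):
    """Depth-1 regression tree: find the split on x minimizing the squared error of r."""
    best = None
    for t in sorted(set(x)):
        left = [ri for xi, ri in zip(x, r) if xi <= t]
        right = [ri for xi, ri in zip(x, r) if xi > t]
        if not left or not right:
            continue
        ml, mr = sum(left) / len(left), sum(right) / len(right)
        sse = sum((ri - ml) ** 2 for ri in left) + sum((ri - mr) ** 2 for ri in right)
        if best is None or sse < best[0]:
            best = (sse, t, ml, mr)
    _, t, ml, mr = best
    return lambda xi, t=t, ml=ml, mr=mr: ml if xi <= t else mr

def gradient_boost(x, y, n_rounds=50, eta=0.1, subsample=0.5, seed=0):
    """Gradient boosting for squared loss: each stump is fit to the current
    residuals on a row subsample drawn without replacement (stochastic gradient
    boosting), and its contribution is shrunk by the learning rate eta."""
    rng = random.Random(seed)
    base = sum(y) / len(y)                     # F_0: constant mean prediction
    pred = [base] * len(y)
    stumps = []
    for _ in range(n_rounds):
        resid = [yi - pi for yi, pi in zip(y, pred)]
        idx = rng.sample(range(len(x)), max(2, int(subsample * len(x))))
        stump = fit_stump([x[i] for i in idx], [resid[i] for i in idx])
        stumps.append(stump)
        pred = [pi + eta * stump(xi) for xi, pi in zip(x, pred)]
    return lambda xi: base + sum(eta * s(xi) for s in stumps)

# toy hourly-load-like data: load rises with hour of day, with small alternation
x = list(range(24))
y = [10 + 3 * h + (1 if h % 2 else -1) for h in x]
f = gradient_boost(x, y)
mse = sum((f(xi) - yi) ** 2 for xi, yi in zip(x, y)) / len(x)
```

Lowering `eta` while raising `n_rounds` trades computation for generalization, mirroring the eta/nrounds pairs explored for XGBoost later in the paper.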

Denoting by $I_L$ and $I_R$ the instance sets of the left and right nodes after the split, with $I = I_L \cup I_R$, the reduction in the objective after the split is given by:

$$\mathcal{L}_{split} = \frac{1}{2}\left[ \frac{\left(\sum_{i\in I_L} g_i\right)^2}{\sum_{i\in I_L} h_i + \lambda} + \frac{\left(\sum_{i\in I_R} g_i\right)^2}{\sum_{i\in I_R} h_i + \lambda} - \frac{\left(\sum_{i\in I} g_i\right)^2}{\sum_{i\in I} h_i + \lambda} \right] - \gamma$$

where $g_i$ and $h_i$ are the first- and second-order gradients of the loss function at instance $i$, $\lambda$ is the L2 regularization weight on the leaf scores, and $\gamma$ penalizes each additional leaf [37].
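This reduction can be computed directly from the gradient statistics of the two child nodes; a minimal sketch follows (the function name and the sample values are illustrative, not from the paper):

```python
def split_gain(g_left, h_left, g_right, h_right, lam=1.0, gamma=0.0):
    """Reduction in the regularized objective for one candidate split.
    g_* and h_* are the sums of first- and second-order loss gradients over
    the instances falling in the left/right child; lam is the L2 weight on
    leaf scores and gamma the per-leaf penalty."""
    def score(g, h):
        return g * g / (h + lam)
    parent = score(g_left + g_right, h_left + h_right)
    return 0.5 * (score(g_left, h_left) + score(g_right, h_right) - parent) - gamma

# For squared loss, g_i = prediction_i - y_i and h_i = 1. A split that cleanly
# separates positive from negative residuals yields a positive gain:
gain = split_gain(g_left=-6.0, h_left=3.0, g_right=6.0, h_right=3.0)
```

A negative gain means the split does not pay for the extra leaf and is pruned.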

- An exact greedy split-finding algorithm is available.
- Approximate global and approximate local algorithms are available for big datasets.
- It supports parallel learning, and an effective cache-aware block structure is available for out-of-core tree learning.
- It is efficient with sparse input data (including the presence of missing values).

## 3. Prediction Results for the University Buildings

#### 3.1. Customer Description: A Campus University

Buildings range in size up to 6500 m², plus a meeting zone (10,000 m²). Buildings are of two kinds: naturally ventilated cellular (individual windows, local light switches, and local heating control) and naturally ventilated open-plan (office equipment, lights switched in larger groups, and zonal heating control). The campus has an overall surface larger than 35,500 m² to fulfill the needs of different Faculties for classrooms, departmental offices, administrative offices, and laboratories for 1800 students and 200 professors. Unfortunately, the age of the buildings (50 years old in four cases) and their architectural condition are far from current energy efficiency standards, specifically in the two main electrical end-uses of the buildings: air conditioning/space heating (low performance, insufficient heat insulation, and an important cluster of individual appliances for offices and small laboratories) and lighting (where conventional magnetic ballasts and fluorescent lamps are still used to a great extent).

#### 3.2. Data Description

#### 3.3. Forecasting Results

## 4. Direct Market Consumers

#### 4.1. Law Framework for DMC and Market Performance

#### 4.2. Price of the Energy Participating as a DMC

- Regulated prices: these are prices set by the State and also depend on the supply rate. This component includes access fees, capacity payments, and loss coefficients. It does not depend on the type of supply, so the corresponding cost is the same for consumers through retailers and for DMC.
- Taxes: these are also regulated prices, although of a different nature from the previous ones. This component comprises the special tax on electricity (currently 5.113%) and VAT (currently 21%), and is likewise common to all consumers.
- Unregulated prices: this component of the billing covers the price of the energy consumed in the wholesale market and is therefore not regulated by the State. It includes the price of energy in the day-ahead and intraday markets, costs of bilateral contracts, costs of measured deviations (the difference between consumed and programmed energy), and costs of ancillary services.

- E(h) = Energy cost in the hour “h”, in €.
- ECBC(h) = Energy cost in the hour “h” from bilateral contracts, in €.
- DMP(h) = Daily Market price in the hour “h”, in €/kWh.
- EDM(h) = Energy bought in the Daily Market in the hour “h”, in kWh.
- IMP(h) = Intraday Market price in the hour “h”, in €/kWh.
- EIM(h) = Energy bought in the Intraday Market in the hour “h”, in kWh.
- SAC(h) = System adjustment cost passed on to the DMC in the hour “h”, in €/kWh.
- EMCB(h) = Energy measured in Central Bars in the hour “h”, in kWh.
- MDP(h) = Measured Deviations price in the hour “h”, in €/kWh.
- EMD(h) = Measured Deviation of Energy in the hour “h” = Difference between consumed energy and programmed energy in the hour “h”, in kWh.
- CPP(h) = Capacity payment price in the hour “h”, in €/kWh.
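A hedged sketch of how these billing terms could combine into the hourly energy cost E(h) follows; the exact formula in the paper may differ, each term simply pairs a price (€/kWh) with its corresponding energy (kWh), and all argument values below are purely illustrative.

```python
def hourly_energy_cost(ecbc, dmp, edm, imp, eim, sac, emcb, mdp, emd, cpp):
    """Illustrative hourly DMC energy cost E(h), in €, as a sum of the
    billing components defined above (an assumption, not the paper's exact
    expression): bilateral contracts, day-ahead and intraday market energy,
    adjustment services, measured deviations, and capacity payments."""
    return (ecbc            # ECBC(h): bilateral contracts cost, already in €
            + dmp * edm     # Daily (day-ahead) Market: price x energy bought
            + imp * eim     # Intraday Market: price x energy bought
            + sac * emcb    # system adjustment cost x energy in Central Bars
            + mdp * emd     # measured-deviations price x deviation energy
            + cpp * emcb)   # capacity payment price x energy in Central Bars

# hypothetical hour: 150 kWh in the day-ahead market, 10 kWh intraday
cost = hourly_energy_cost(ecbc=0.0, dmp=0.05, edm=150.0, imp=0.052, eim=10.0,
                          sac=0.004, emcb=160.0, mdp=0.01, emd=5.0, cpp=0.002)
```

Summing this over the hours of a month reproduces the per-component structure of the monthly cost tables later in the paper.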

#### 4.3. Case Study: A Campus University as a DMC

## 5. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

1. Hahn, H.; Meyer-Nieberg, S.; Pickl, S. Electric load forecasting methods: Tools for decision making. Eur. J. Oper. Res. **2009**, 199, 902–907.
2. Alfares, H.K.; Nazeeruddin, M. Electric load forecasting: Literature survey and classification of methods. Int. J. Syst. Sci. **2002**, 33, 23–34.
3. Yang, H.T.; Huang, C.M.; Huang, C.L. Identification of ARMAX model for short term load forecasting: An evolutionary programming approach. IEEE Trans. Power Syst. **1996**, 11, 403–408.
4. Taylor, W.; Menezes, L.M.; McSharry, P.E. A comparison of univariate methods for forecasting electricity demand up to a day ahead. Int. J. Forecast. **2006**, 22, 1–16.
5. Newsham, G.R.; Birt, B.J. Building-level occupancy data to improve arima-based electricity use forecasts. In Proceedings of the 2nd ACM Workshop on Embedded Sensing Systems for Energy-Efficiency in Building, Zurich, Switzerland, 3–5 November 2010; pp. 13–18.
6. Massana, J.; Pous, C.; Burgas, L.; Melendez, J.; Colomer, J. Short-term load forecasting in a non-residential building contrasting models and attributes. Energy Build. **2015**, 92, 322–330.
7. Bruhns, A.; Deurveilher, G.; Roy, J.S. A nonlinear regression model for midterm load forecasting and improvements in seasonality. In Proceedings of the 15th Power Systems Computation Conference, Liege, Belgium, 22–26 August 2005.
8. Charytoniuk, W.; Chen, M.S.; Van Olinda, P. Nonparametric regression based short-term load forecasting. IEEE Trans. Power Syst. **1998**, 13, 725–730.
9. Amber, K.P.; Aslam, M.W.; Mahmood, A.; Kousar, A.; Younis, M.Y.; Akbar, B.; Chaudhary, G.Q.; Hussain, S.K. Energy Consumption Forecasting for University Sector Buildings. Energies **2017**, 10, 1579.
10. Tso, G.K.F.; Yau, K.K.W. Predicting electricity energy consumption: A comparison of regression analysis, decision tree and neural networks. Energy **2007**, 32, 1761–1768.
11. Li, K.; Su, H.; Chu, J. Forecasting building energy consumption using neural networks and hybrid neuro-fuzzy system: A comparative study. Energy Build. **2011**, 43, 2893–2899.
12. Liao, G.C.; Tsao, T.P. Application of a fuzzy neural network combined with a chaos genetic algorithm and simulated annealing to short-term load forecasting. IEEE Trans. Evol. Comput. **2006**, 10, 330–340.
13. Hippert, H.S.; Pedreira, C.E.; Souza, R.C. Neural networks for short-term load forecasting: A review and evaluation. IEEE Trans. Power Syst. **2001**, 16, 44–55.
14. Karatasou, S.; Santamouris, M.; Geros, V. Modeling and predicting building’s energy use with artificial neural networks: Methods and results. Energy Build. **2006**, 38, 949–958.
15. Metaxiotis, K.; Kagiannas, A.; Askounis, D.; Psarras, J. Artificial intelligence in short term electric load forecasting: A state-of-the-art survey for the researcher. Energy Convers. Manag. **2003**, 44, 1525–1534.
16. Buitrago, J.; Asfour, S. Short-term forecasting of electric loads using nonlinear autoregressive artificial neural networks with exogenous vector inputs. Energies **2017**, 10, 40.
17. Hashmi, M.U.; Arora, V.; Priolkar, J.G. Hourly electric load forecasting using Nonlinear Auto Regressive with eXogenous (NARX) based neural network for the state of Goa, India. In Proceedings of the International Conference on Industrial Instrumentation and Control (ICIC), Pune, India, 28–30 May 2015; pp. 1418–1423.
18. Hanshen, L.; Yuan, Z.; Jinglu, H.; Zhe, L. A localized NARX Neural Network model for Short-term load forecasting based upon Self-Organizing Mapping. In Proceedings of the IEEE 3rd International Future Energy Electronics Conference and ECCE Asia (IFEEC 2017—ECCE Asia), Kaohsiung, Taiwan, 3–7 June 2017; pp. 749–754.
19. Fan, G.-F.; Qing, S.; Wang, H.; Hong, W.-C.; Li, H.-J. Support Vector Regression Model Based on Empirical Mode Decomposition and Auto Regression for Electric Load Forecasting. Energies **2013**, 6, 1887–1901.
20. Dong, Y.; Ma, X.; Ma, C.; Wang, J. Research and Application of a Hybrid Forecasting Model Based on Data Decomposition for Electrical Load Forecasting. Energies **2016**, 9, 1050.
21. Dudek, G. Short-Term Load Forecasting Using Random Forests. In Intelligent Systems’2014, Advances in Intelligent Systems and Computing; Springer: Cham, Switzerland, 2015; pp. 821–828. ISBN 978-3-319-11310-4.
22. Hedén, W. Predicting Hourly Residential Energy Consumption Using Random Forest and Support Vector Regression: An Analysis of the Impact of Household Clustering on the Performance Accuracy. Degree Project in Mathematics (Second Cycle), Royal Institute of Technology, SCI School of Engineering Sciences, 2016. Available online: https://kth.diva-portal.org/smash/get/diva2:932582/FULLTEXT01.pdf (accessed on 4 July 2018).
23. Ahmed Mohammed, A.; Aung, Z. Ensemble Learning Approach for Probabilistic Forecasting of Solar Power Generation. Energies **2016**, 9, 1017.
24. Lin, Y.; Luo, H.; Wang, D.; Guo, H.; Zhu, K. An Ensemble Model Based on Machine Learning Methods and Data Preprocessing for Short-Term Electric Load Forecasting. Energies **2017**, 10, 1186.
25. Sistema de Información del Operador del Sistema (Esios); Red Eléctrica de España. Alta Como Consumidor Directo en Mercado Peninsular. Available online: https://www.esios.ree.es/es/documentacion/guia-alta-os-consumidor-directo-mercado-peninsula (accessed on 4 July 2018).
26. MINETAD. Ministerio de Energía, Turismo y Agenda Digital. Gobierno de España. Available online: http://www.minetad.gob.es/ENERGIA/ELECTRICIDAD/DISTRIBUIDORES/Paginas/ConsumidoresDirectosMercado.aspx (accessed on 4 July 2018).
27. Touzani, S.; Granderson, J.; Fernandes, S. Gradient boosting machine for modeling the energy consumption of commercial buildings. Energy Build. **2018**, 158, 1533–1543.
28. James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning; Springer: New York, NY, USA, 2013; ISBN 978-1-4614-7138-7.
29. R Package: randomForest. Repository CRAN. Available online: https://cran.r-project.org/web/packages/randomForest/randomForest.pdf (accessed on 4 July 2018).
30. Hothorn, T.; Hornik, K.; Zeileis, A. Unbiased Recursive Partitioning: A Conditional Inference Framework. J. Comput. Graph. Stat. **2006**, 15, 651–674.
31. Strasser, H.; Weber, C. On the asymptotic theory of permutation statistics. Math. Methods Stat. **1999**, 8, 220–250.
32. R Package: party. Repository CRAN. Available online: https://cran.r-project.org/web/packages/party/party.pdf (accessed on 4 July 2018).
33. Freund, Y.; Schapire, R.E. A decision-theoretic generalization of on-line learning and an application to boosting. J. Comput. Syst. Sci. **1997**, 55, 119–139.
34. Friedman, J.H. Greedy function approximation: A gradient boosting machine. Ann. Stat. **2001**, 29, 1189–1232.
35. Hastie, T.; Tibshirani, R.; Friedman, J.H. Boosting and Additive Trees. In The Elements of Statistical Learning, 2nd ed.; Springer: New York, NY, USA, 2009; pp. 337–384. ISBN 0-387-84857-6.
36. Friedman, J.H. Stochastic Gradient Boosting. Comput. Stat. Data Anal. **2002**, 38, 367–378.
37. Chen, T.; Guestrin, C. XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13–17 August 2016; pp. 785–794.
38. R Package: xgboost. Repository CRAN. Available online: https://cran.r-project.org/web/packages/xgboost/xgboost.pdf (accessed on 4 July 2018).
39. Pérez-Lombard, L.; Ortiz, J.; Pout, C. A review on buildings energy consumption information. Energy Build. **2008**, 40, 394–398.
40. Strobl, C.; Boulesteix, A.L.; Kneib, T.; Augustin, T.; Zeileis, A. Conditional Variable Importance for Random Forests. BMC Bioinf. **2008**, 9, 307.
41. Boletín Oficial del Estado. Ley 24/2013, de 26 de Diciembre, del Sector Eléctrico. Available online: https://www.boe.es/buscar/doc.php?id=BOE-A-2013-13645 (accessed on 4 July 2018).
42. Boletín Oficial del Estado. Real Decreto 1955/2000, de 1 de Diciembre, Por el Que se Regulan las Actividades de Transporte, Distribución, Comercialización, Suministro y Procedimientos de Autorización de Instalaciones de Energía Eléctrica. Available online: http://www.boe.es/buscar/act.php?id=BOE-A-2000-24019&tn=1&p=20131230&vd=#a70 (accessed on 4 July 2018).
43. Comisión Nacional de Los Mercados y la Competencia (CNMC). List of Direct Market Consumers. Available online: https://www.cnmc.es/ambitos-de-actuacion/energia/mercado-electrico#listados (accessed on 4 July 2018).
44. Red Eléctrica Española (REE). Electricity Market Data. Available online: https://www.esios.ree.es/es/analisis (accessed on 4 July 2018).

**Figure 1.** Goodness-of-fit measures for each month in 2016 and each ensemble method: (**a**) using root mean square error (RMSE) (kWh); and (**b**) using mean absolute percentage error (MAPE) (%).

**Figure 2.** Goodness-of-fit measures for each ensemble method by hour of the day in 2016: (**a**) using RMSE (kWh); (**b**) using MAPE (%).
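For reference, the two goodness-of-fit measures used in Figures 1 and 2 follow the standard definitions, sketched here (function and variable names are ours):

```python
def rmse(actual, pred):
    """Root mean square error, in the same units as the load (kWh)."""
    return (sum((a - p) ** 2 for a, p in zip(actual, pred)) / len(actual)) ** 0.5

def mape(actual, pred):
    """Mean absolute percentage error, in %; assumes no zero loads."""
    return 100.0 * sum(abs(a - p) / abs(a) for a, p in zip(actual, pred)) / len(actual)

# illustrative hourly loads (kWh) and forecasts
actual = [100.0, 200.0, 400.0]
pred = [110.0, 190.0, 380.0]
```

RMSE weights large errors more heavily, while MAPE is scale-free, which is why the paper reports both.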

| End-Use | USA (%) | UK (%) | Spain (%) | University Buildings (%) (UPCT) |
|---|---|---|---|---|
| HVAC | 48 | 55 | 52 | 40–50 |
| Lighting | 22 | 17 | 33 | 25–30 |
| Equipment (appliances) | 13 | 5 | 10 | 7–12 |
| Other (WH, refrigeration) | 17 | 23 | 5 | 8–10 |

| Predictors | Description |
|---|---|
| H2, H3, …, H24 | Hourly dummy variables corresponding to the hour of the day |
| WH2, WH3, …, WH7 | Hourly dummy variables corresponding to the day of the week |
| MH2, MH3, …, MH12 | Hourly dummy variables corresponding to the month of the year |
| FH1 | Hourly dummy variables corresponding to the month of the year |
| FH2 | Hourly dummy variable corresponding to Christmas and Easter days |
| FH3 | Hourly dummy variable corresponding to academic holidays (patron saint festivities) |
| FH4 | Hourly dummy variable corresponding to national, regional or local holidays |
| FH5 | Hourly dummy variable corresponding to academic periods with no classes and no exams (tutorial periods) |
| Temperature_lag_i | Hourly external temperature lagged “i” hours. Depending on the prediction horizon, different lags will be considered. |
| LOAD_lag_i | Hourly load lagged “i” hours. Depending on the prediction horizon, different lags will be considered. |

| Term | Description |
|---|---|
| ntree (N) | Number of trees or iterations in bagging, random forest and conditional forest |
| mtry | Number of predictors considered at each split in bagging, random forest and conditional forest |
| node impurity | Importance measure in random forest |
| max_depth | Maximum depth of a tree |
| subsample | Subsample ratio of the training instances |
| eta | Shrinkage or learning rate |
| nrounds | Number of boosting iterations |
| gain | Fractional contribution of each feature to the model |

| XGBoost Pred. Horizon = 48 h | eta = 0.01, nrounds = 5700 | eta = 0.02, nrounds = 3400 | eta = 0.05, nrounds = 1700 | eta = 0.10, nrounds = 566 |
|---|---|---|---|---|
| RMSE_train (kWh) | 11.91 | 11.02 | 10.02 | 12.50 |
| RMSE_test (kWh) | 23.74 | 23.65 | 23.92 | 24.26 |
| R-squared_train | 0.988 | 0.989 | 0.991 | 0.986 |
| R-squared_test | 0.946 | 0.946 | 0.945 | 0.943 |
| MAPE_train (%) | 5.03 | 4.76 | 4.45 | 5.28 |
| MAPE_test (%) | 8.98 | 9.00 | 9.12 | 9.23 |
| E_mean_train (kWh) | 0.00 | 0.00 | 0.00 | 0.00 |
| E_mean_test (kWh) | −0.16 | −0.35 | −0.47 | −0.09 |
| E_skewness_train | 0.31 | 0.29 | 0.24 | 0.28 |
| E_skewness_test | 0.14 | 0.02 | 0.09 | 0.04 |
| E_kurtosis_train | 6.63 | 6.28 | 5.64 | 6.51 |
| E_kurtosis_test | 7.58 | 7.68 | 7.41 | 7.53 |
| Computational time | 13 min | 8 min | 4 min | 1.5 min |

| Pred. Horizon = 48 h | Bagging | RForest | CForest | XGBoost | MLR | Naïve |
|---|---|---|---|---|---|---|
| Optimal parameters | ntree = 200, mtry = 53 | ntree = 200, mtry = 20 | ntree = 3, mtry = 53 | max_depth = 6, subsample = 0.5, eta = 0.02, nrounds = 3400 | number of predictors = 53 | lag = 168 h |
| Error_mean_train (kWh) | 0.056 | 0.04 | 0.50 | 0.00 | 0.00 | 0.36 |
| Error_mean_test (kWh) | 0.25 | −0.13 | 1.41 | −0.35 | −3.41 | 1.31 |
| Error_skewness_train | 1.48 | 1.46 | 1.54 | 0.29 | 0.64 | 0.49 |
| Error_skewness_test | 1.19 | 0.61 | 1.81 | 0.02 | 0.61 | 0.35 |
| Error_kurtosis_train | 31.12 | 27.12 | 23.44 | 6.28 | 8.14 | 13.39 |
| Error_kurtosis_test | 13.18 | 10.19 | 15.68 | 7.68 | 7.13 | 12.28 |

| Pred. Horizon = 48 h | Bagging | RForest | CForest | XGBoost | MLR | Naïve |
|---|---|---|---|---|---|---|
| Optimal parameters | ntree = 200, mtry = 53 | ntree = 200, mtry = 20 | ntree = 3, mtry = 53 | max_depth = 6, subsample = 0.5, eta = 0.02, nrounds = 3400 | number of predictors = 53 | lag = 168 h |
| RMSE_train (kWh) | 8.83 | 8.79 | 25.86 | 11.02 | 42.34 | 52.6 |
| RMSE_test (kWh) | 27.65 | 25.45 | 33.09 | 23.65 | 40.67 | 50.87 |
| R-squared_train | 0.99 | 0.99 | 0.94 | 0.99 | 0.844 | 0.76 |
| R-squared_test | 0.93 | 0.94 | 0.89 | 0.95 | 0.841 | 0.75 |
| MAPE_train (%) | 2.6 | 2.69 | 8 | 4.76 | 18.05 | 14.6 |
| MAPE_test (%) | 9.6 | 9.33 | 10.82 | 8.99 | 19.33 | 15.63 |
| Total number of predictors | 53 | 53 | 53 | 53 | 53 | 0 |
| Number of important predictors (cumulative importance >99%) | 24 | 28 | 25 | 25 | 30 | Not applicable |
| Top 5 important predictors | LOAD_lag_168 (77.99%) | LOAD_lag_168 (42.55%) | LOAD_lag_168 (58.15%) | LOAD_lag_168 (75.85%) | LOAD_lag_168 (91.81%) | Not applicable |
| | LOAD_lag_48 (3.69%) | LOAD_lag_144 (16.37%) | WH7 (10.57%) | LOAD_lag_48 (3.76%) | FH3 (1.52%) | |
| | FH1 (2.88%) | LOAD_lag_48 (9.26%) | WH6 (6.37%) | LOAD_lag_144 (3.52%) | LOAD_lag_48 (0.93%) | |
| | LOAD_lag_144 (2.63%) | LOAD_lag_120 (6.67%) | LOAD_lag_48 (5.23%) | FH1 (2.86%) | FH1 (0.69%) | |
| | FH3 (1.97%) | WH7 (5.13%) | LOAD_lag_144 (3.89%) | FH3 (1.93%) | WH7 (0.46%) | |
| Computational time | 46 min | 18 min | 1.5 min | 8 min | 0 min | 0 min |

| Pred. Horizon = 48 h | Bagging | RForest | CForest | XGBoost |
|---|---|---|---|---|
| MAPE regular days (149) | 9.07 | 8.60 | 10.44 | 8.15 |
| MAPE special days (217) | 9.97 | 9.83 | 11.08 | 9.57 |
| MAPE total days (366) | 9.60 | 9.33 | 10.82 | 8.99 |

| XGBoost | Pred. Horizon = 1 h | Pred. Horizon = 2 h | Pred. Horizon = 12 h | Pred. Horizon = 24 h | Pred. Horizon = 48 h |
|---|---|---|---|---|---|
| RMSE_train (kWh) | 4.37 | 5.45 | 7.51 | 8.59 | 10.02 |
| RMSE_test (kWh) | 10.77 | 13.91 | 22.15 | 22.44 | 23.92 |
| R-squared_train | 0.998 | 0.997 | 0.995 | 0.993 | 0.991 |
| R-squared_test | 0.989 | 0.981 | 0.953 | 0.951 | 0.945 |
| MAPE_train (%) | 1.85 | 2.37 | 3.38 | 7.09 | 4.45 |
| MAPE_test (%) | 3.50 | 4.67 | 7.51 | 8.14 | 9.12 |
| Total number of predictors | 99 | 97 | 77 | 55 | 53 |
| Number of important predictors (cumulative importance >99%) | 34 | 47 | 38 | 20 | 26 |
| Top 5 important predictors | LOAD_lag_1 (85.24%) | LOAD_lag_168 (60.92%) | LOAD_lag_168 (72.87%) | LOAD_lag_168 (73.40%) | LOAD_lag_168 (75.63%) |
| | LOAD_lag_168 (8.22%) | LOAD_lag_2 (23.81%) | LOAD_lag_24 (8.72%) | LOAD_lag_24 (9.54%) | LOAD_lag_48 (3.79%) |
| | LOAD_lag_24 (7.94%) | LOAD_lag_24 (2.59%) | WH6 (2.58%) | WH6 (2.57%) | LOAD_lag_144 (3.33%) |
| | H7 (3.92%) | LOAD_lag_12 (1.47%) | FH1 (1.92%) | FH1 (2.07%) | FH1 (2.86%) |
| | LOAD_lag_5 (3.60%) | FH1 (0.85%) | FH3 (1.83%) | FH3 (1.85%) | FH3 (1.95%) |
| Computational time | 7 min | 7 min | 5 min | 4.5 min | 4 min |

| Month | DM Cost (in €) | AS Cost (in €) | MD Cost (in €) | CP Cost (in €) | DMC Cost (in €) |
|---|---|---|---|---|---|
| January | 5478 | 685 | 91 | 313 | 6567 |
| February | 4492 | 815 | 56 | 409 | 5772 |
| March | 3644 | 763 | 45 | 99 | 4551 |
| April | 2980 | 649 | 70 | 105 | 3804 |
| May | 3976 | 801 | 43 | 127 | 4948 |
| June | 6692 | 682 | 42 | 336 | 7752 |
| July | 7151 | 610 | 28 | 524 | 8313 |
| August | 4450 | 430 | 56 | 0 | 4936 |
| September | 8013 | 724 | 57 | 195 | 8989 |
| October | 8289 | 708 | 48 | 130 | 9176 |
| November | 7960 | 474 | 91 | 140 | 8665 |
| December | 8727 | 492 | 46 | 285 | 9575 |
| Total 2016 | 71,853 (86.52%) | 7834 (9.44%) | 697 (0.84%) | 2664 (3.2%) | 83,048 |

**Table 10.** Comparison of costs in four cases: average final price (AFP), pessimist, DMC, and retailer.

| Month | Consumption (kWh) | AFP (in €) | Pessimist (in €) | DMC (in €) | Retailer (in €) | Saving % |
|---|---|---|---|---|---|---|
| January | 125,702 | 6643 | 6677 | 6567 | 7434 | 12 |
| February | 136,620 | 5834 | 5821 | 5772 | 6760 | 15 |
| March | 119,103 | 4778 | 4628 | 4551 | 5338 | 15 |
| April | 108,475 | 3965 | 3874 | 3804 | 4346 | 12 |
| May | 130,149 | 5164 | 5001 | 4948 | 5571 | 11 |
| June | 157,785 | 7953 | 7802 | 7752 | 8815 | 12 |
| July | 160,212 | 8423 | 8361 | 8313 | 9315 | 11 |
| August | 100,343 | 5133 | 4957 | 4936 | 5477 | 10 |
| September | 167,116 | 9272 | 9040 | 8989 | 10,036 | 10 |
| October | 141,077 | 9410 | 9213 | 9176 | 9953 | 8 |
| November | 127,613 | 8818 | 8691 | 8665 | 9534 | 9 |
| December | 130,583 | 9717 | 9634 | 9575 | 10,524 | 9 |
| Total 2016 | 1,604,778 | 85,111 | 83,698 | 83,048 | 93,103 | 11 |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ruiz-Abellón, M.D.C.; Gabaldón, A.; Guillamón, A.
Load Forecasting for a Campus University Using Ensemble Methods Based on Regression Trees. *Energies* **2018**, *11*, 2038.
https://doi.org/10.3390/en11082038
