# Designing Robust Forecasting Ensembles of Data-Driven Models with a Multi-Objective Formulation: An Application to Home Energy Management Systems


## Abstract


## 1. Introduction

- Is the performance of ensemble forecasting models better than that of single ones?
- In previous works [28], 25 models were used for the ensemble. Would a larger number result in better performance?
- For NARX models (using exogenous variables), which would be the best technique for describing the evolution of the exogenous variables?
- Is the Prediction Interval Coverage Probability threshold met for all steps within the Prediction Horizon?

## 2. Materials and Methods

#### 2.1. Single Model Design

- A pre-processing stage where, from the existing data, suitable sets (training, testing, validation, etc.) are obtained; this is known as a data selection problem;
- Determination of the “best” set of inputs (feature/delays selection) and network topology, given the above data sets;
- Determination of the “best” network parameters, given the data sets, inputs/delays and network topology.

#### 2.1.1. Data Selection

**x** to H is obtained by:

**x** and the convex hull of **X**, denoted by conv(**X**), can be computed by solving the following quadratic optimization problem:

**a***, the distance of point **x** to conv(**X**) is given by:
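Although the quadratic program itself is not reproduced above, the computation it describes can be sketched. The following is a minimal sketch of the standard point-to-convex-hull distance QP, minimizing ||x − Xᵀa||² subject to a ≥ 0 and Σaᵢ = 1; the name `conv_hull_distance` is ours, and `scipy.optimize.minimize` with SLSQP stands in for whichever QP solver is actually used in ApproxHull.

```python
import numpy as np
from scipy.optimize import minimize

def conv_hull_distance(x, X):
    """Distance from point x to conv(X); rows of X are the hull-defining points.

    Solves  min_a ||x - X.T @ a||^2  s.t.  a >= 0, sum(a) = 1,
    the standard QP formulation of the point-to-convex-hull distance.
    """
    m = X.shape[0]
    a0 = np.full(m, 1.0 / m)                      # start at the centroid
    obj = lambda a: np.sum((x - X.T @ a) ** 2)    # squared distance to hull point
    cons = ({"type": "eq", "fun": lambda a: np.sum(a) - 1.0},)
    bnds = [(0.0, None)] * m
    res = minimize(obj, a0, bounds=bnds, constraints=cons, method="SLSQP")
    return np.sqrt(max(res.fun, 0.0))

# A point inside the unit square has distance 0; a point outside does not.
square = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
d_in = conv_hull_distance(np.array([0.5, 0.5]), square)
d_out = conv_hull_distance(np.array([2.0, 0.5]), square)
```

Points with a large hull distance are the ones ApproxHull would flag as lying outside the region covered by the existing data.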

#### 2.1.2. Parameter Separability

**u** is the (linear) output weights vector, and **v** represents the nonlinear parameters. For simplicity, we shall assume here only one hidden layer, and **v** is composed of **n** vectors of parameters, one for each neuron $\left(v={\left[\begin{array}{ccc}{v}_{1}& \dots & {v}_{n}\end{array}\right]}^{T}\right)$. This type of model comprises Multilayer Perceptrons, Radial Basis Function (RBF) networks, B-Spline and ASMOD models, Wavelet networks, and Mamdani, Takagi, and Takagi-Sugeno fuzzy models (satisfying certain assumptions) [32].

**X**, training the model means finding the values of **w**, such that the following criterion is minimized:

**u**, its optimal solution is given as:

- It lowers the problem dimensionality, as the number of model parameters to determine is reduced;
- The initial value of $\mathsf{\Psi}$ is much smaller than that of $\mathsf{\Omega}$;
- Typically, the rate of convergence of gradient algorithms using (9) is faster than using (7).
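The separability just described can be sketched as follows, assuming an ordinary least-squares criterion: for a fixed **v**, the optimal linear weights follow from a linear least-squares problem, and the reduced criterion depends on **v** only. `Phi` denotes the hidden-layer output matrix produced by **v**; the names are ours, not the paper's.

```python
import numpy as np

def separable_criterion(Phi, y):
    """Given the hidden-layer output matrix Phi(v), the optimal linear
    weights are u* = pinv(Phi) @ y, and the reduced criterion Psi(v) is
    the residual of projecting y onto the column space of Phi."""
    u_star = np.linalg.pinv(Phi) @ y          # optimal linear weights u*
    residual = y - Phi @ u_star               # projection residual
    return u_star, 0.5 * residual @ residual  # Psi(v)

# Toy example: 2 neurons, 5 samples, target exactly in the span of Phi.
rng = np.random.default_rng(0)
Phi = rng.normal(size=(5, 2))
u_true = np.array([1.5, -2.0])
y = Phi @ u_true
u_star, psi = separable_criterion(Phi, y)
```

A gradient algorithm then only searches over **v**, evaluating Psi(v) at each step; the linear weights never need to be part of the search space.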

#### 2.1.3. Training Algorithms

#### 2.1.4. Radial Basis Function Networks

d_{max} represents the maximum distance between centers.
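The spread formula itself was elided above; a common heuristic (see, e.g., Haykin [36]) sets a shared spread σ = d_max/√(2n) for n centers. The sketch below uses that heuristic and Gaussian basis functions; the function name is ours.

```python
import numpy as np

def rbf_design(X, centers):
    """Hidden-layer outputs of a Gaussian RBF network with a shared
    spread set by the common heuristic sigma = d_max / sqrt(2 n),
    where d_max is the maximum distance between the n centers."""
    n = centers.shape[0]
    d_max = max(np.linalg.norm(ci - cj) for ci in centers for cj in centers)
    sigma = d_max / np.sqrt(2 * n)
    # Squared distances of every sample to every center, shape (samples, n).
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return np.exp(-d2 / (2 * sigma ** 2)), sigma

X = np.array([[0.0], [1.0], [2.0]])
centers = np.array([[0.0], [2.0]])
Phi, sigma = rbf_design(X, centers)
```

The resulting matrix `Phi` plays the role of the hidden-layer output matrix in the separable formulation of Section 2.1.2.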

#### 2.1.5. MOGA

^{th} delay for variable i. This represents the one-step-ahead prediction within the prediction horizon. As we iterate (17) over PH, some or all of the indices on the right-hand side will be larger than k, which means that the corresponding forecasts must be employed. What has been said for NARX models is also valid for NAR models (with no exogenous inputs).
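The iteration just described can be sketched as follows: once a required lag index exceeds the current time k, the model's own forecast is fed back in place of the measured value. `iterate_forecast` and the toy one-lag model are illustrative names of ours, and the exogenous inputs are assumed known over the horizon.

```python
import numpy as np

def iterate_forecast(model, y_hist, x_future, n_lags, PH):
    """Iterate a one-step-ahead NAR(X) model over a prediction horizon,
    feeding forecasts back in as the lagged outputs become unavailable."""
    buf = list(y_hist[-n_lags:])          # most recent measured outputs
    preds = []
    for h in range(PH):
        lags = np.array(buf[-n_lags:])    # mixes measurements and forecasts
        y_hat = model(lags, x_future[h])  # exogenous inputs assumed known
        preds.append(y_hat)
        buf.append(y_hat)                 # feed the forecast back
    return np.array(preds)

# Toy linear "model": y_{k+1} = 0.5*y_k + x_k
model = lambda lags, x: 0.5 * lags[-1] + x
preds = iterate_forecast(model, y_hist=[2.0], x_future=[0.0, 0.0, 0.0],
                         n_lags=1, PH=3)
```

For a true NARX model the future exogenous values are themselves forecasts, which is exactly the design question raised in the Introduction.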

**F**, each model must select the most representative d features within a user-specified interval, $d\in \left[{d}_{m},{d}_{M}\right],\hspace{0.33em}{d}_{M}\le q$. For this reason, each ANN structure is codified as shown in Figure 1:

_{m} represent the minimum number of features, while the last white ones are a variable number of inputs, up to the predefined maximum number. The ${\lambda}_{j}$ values correspond to the indices of the features **f**_{j} in the columns of **F**.

**F**, together with the target data, must then be partitioned into three different sets: the training set, to estimate the model parameters; the test set, to perform early stopping; and the validation set, to analyze the MOGA performance.

**v**)—number of nonlinear parameters—or the norm of the linear parameters (${\Vert u\Vert}_{2}$). For forecasting applications, as is the case here, one criterion is also used to assess the forecasting performance. Assume a time series sim, a subset of the design data, with p data points. For each point, the model (14) is used to make predictions up to PH steps ahead. Then, an error matrix is built:

Denoting the i^{th} column of matrix **E** by ${\rho}_{sim}\left(.,i\right)$, the forecasting performance criterion is the sum of the RMS of the columns of **E**:
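The criterion just described, the sum over the horizon of the per-step RMS of the columns of **E**, can be sketched as:

```python
import numpy as np

def rho_sim(E):
    """Forecasting criterion: the sum, over the prediction horizon, of
    the RMS of each column of the error matrix E (rows: starting points,
    columns: steps ahead)."""
    rms_per_step = np.sqrt(np.mean(E ** 2, axis=0))  # RMS of each column
    return rms_per_step.sum()

# Two starting points, PH = 2: per-step RMS values are 1 and 2.
E = np.array([[1.0, 2.0],
              [1.0, 2.0]])
crit = rho_sim(E)
```

Because every step of the horizon contributes its own RMS term, a model that degrades quickly as forecasts are iterated is penalized even if its one-step-ahead error is small.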

_{pop}), number of iterations (n_{iter}), and genetic algorithm parameters (proportion of random immigrants, selective pressure, crossover rate, and survival rate), the hybrid evolutive-gradient method is executed.

In total, n_{pop} × n_{iter} different models are designed. As the problem is multi-objective, a subset of these models corresponds to non-dominated models (nd), or Pareto solutions. If one or more objectives are set as restrictions, a subset of nd, denoted as the preferential solutions, pref, corresponds to the non-dominated solutions which meet the goals. An example is shown in Figure 4.

#### 2.2. Model Ensemble

#### 2.3. Robust Models

(α/2)^{th} and (1 − α/2)^{th} quantiles of the cumulative distribution function of ${\epsilon}_{k}$. For instance, for the commonly used 90% PIs, which is the level used in this paper, the 5% and 95% quantiles of the error term are required. We will denote by ${\overline{\widehat{y}}}_{k}$ and ${\underset{\_}{\widehat{y}}}_{k}$ the upper and lower bounds, respectively.

- Sequential methods, which first receive the point forecast, and then use it to construct the PI;
- Direct methods, where the construction of the PI is carried out simultaneously with the identification. These are outside the scope of this paper. The reader is referred to [16] for a description of this class of methods.
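A minimal sketch of a sequential construction from the quantiles mentioned above, assuming the empirical quantiles of historical per-step forecast errors are used to widen the point forecast (the function name and the use of `np.quantile` are our illustrative choices, not the paper's method):

```python
import numpy as np

def prediction_interval(point_forecast, step_errors, alpha=0.10):
    """Sequential PI: shift the point forecast by the alpha/2 and
    1 - alpha/2 empirical quantiles of the historical error term
    (alpha = 0.10 gives the 90% PIs used in the paper)."""
    lo_q = np.quantile(step_errors, alpha / 2)        # 5% quantile
    hi_q = np.quantile(step_errors, 1 - alpha / 2)    # 95% quantile
    return point_forecast + lo_q, point_forecast + hi_q

errors = np.linspace(-1.0, 1.0, 201)  # symmetric historical errors
lo, hi = prediction_interval(10.0, errors)
```

In practice a separate error sample would be kept for each step of the prediction horizon, since the error distribution widens as forecasts are iterated.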

1. Delta Method [15]

**J** in (12), and the data noise variance, ${\sigma}_{\epsilon}^{2}$, can be estimated as:

2. Bayesian Method

3. Mean-Variance Estimation Method (MVEM) [43]

4. Bootstrap method

5. Covariance method

**X**, and the nonlinear parameters **v**, the total prediction variance for each prediction ${\widehat{y}}_{k}$ can be obtained as:

#### 2.4. Performance Criteria

^{2}) (40). Note that these criteria are computed with data normalized to the interval [−1, 1].

_{e}, and the latter by m_{s}. Different sets will be used to determine the ensemble solutions, namely: models from the set of non-dominated solutions obtained in the 1st MOGA execution, denoted as ${m}_{e}^{nd}$; models from the preferred set achieved in the 2nd MOGA execution (${m}_{e}^{pref}$); as well as three sets of models selected from the preferred set, denoted as ${m}_{e}^{nw}$, ${m}_{e}^{fore}$, and ${m}_{e}^{par}$.
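The two interval-quality criteria reported throughout the results, PICP and PINAW, can be sketched as follows. The elided equations are not reproduced, so normalizing the average width by the observed target range is an assumption of this sketch (it is the usual PINAW definition); the function names are ours.

```python
import numpy as np

def picp(y, lo, hi):
    """Prediction Interval Coverage Probability: fraction of targets
    falling inside [lo, hi]."""
    return np.mean((y >= lo) & (y <= hi))

def pinaw(y, lo, hi):
    """Prediction Interval Normalized Average Width: mean interval width
    normalized by the target range (smaller is better, at equal coverage)."""
    return np.mean(hi - lo) / (y.max() - y.min())

y = np.array([1.0, 2.0, 3.0, 4.0])
lo, hi = y - 0.5, y + 0.5       # intervals that cover every target
cov, width = picp(y, lo, hi), pinaw(y, lo, hi)
```

A 90% nominal level then translates into checking whether `picp` stays at or above 0.90 at every step of the horizon, which is the violation count reported in the tables.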

## 3. Results and Discussion

#### 3.1. The Data

_{D}) was acquired at 1 s intervals. The PV power generated (P_{G}) was acquired with a sampling time of 1 min. All these data were averaged over 15 min intervals. Additional data used include the daily household occupation (Occ) and a codification of the type of day (D_{E}), which characterizes each day of the week and the occurrence and severity of holidays based on the day on which they occur, as may be consulted in [45,46]. In Table 1, the regular day column shows the coding for the days of the week when these are not holidays; the following column presents the values encoded when there is a holiday; and the special column shows the values that substitute the regular day value in two special cases: Mondays when Tuesday is a holiday and Fridays when Thursday is a holiday.
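The codification of Table 1 amounts to a small lookup; the sketch below uses the values from the table, with function and flag names of our own choosing.

```python
# Day-type codification D_E from Table 1: regular-day codes, holiday codes,
# and the two special bridge cases (Monday before a Tuesday holiday,
# Friday after a Thursday holiday).
REGULAR = {"Mon": 0.05, "Tue": 0.10, "Wed": 0.15, "Thu": 0.20,
           "Fri": 0.25, "Sat": 0.30, "Sun": 0.35}
HOLIDAY = {"Mon": 0.40, "Tue": 0.80, "Wed": 0.50, "Thu": 1.00,
           "Fri": 0.60, "Sat": 0.30, "Sun": 0.35}

def day_code(day, is_holiday=False, next_day_holiday=False,
             prev_day_holiday=False):
    """Return the D_E code for a day, following Table 1."""
    if is_holiday:
        return HOLIDAY[day]
    if day == "Mon" and next_day_holiday:   # Tuesday is a holiday
        return 0.70
    if day == "Fri" and prev_day_holiday:   # Thursday was a holiday
        return 0.90
    return REGULAR[day]

codes = [day_code("Wed"), day_code("Thu", is_holiday=True),
         day_code("Mon", next_day_holiday=True)]
```

Encoding the day type as a single bounded number lets D_E enter the networks as one input rather than a one-hot block.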

**v**) and ${\rho}_{sim}\left(PH\right)$, while in the second execution, some objectives were set as goals.

- Prediction horizon: 28 steps (7 h);
- Number of neurons: ${n}_{n}\in \left[\begin{array}{ccc}2& \cdots & 10\end{array}\right]$;
- Initial parameter values: OAKM [38];
- Number of training trials: five, best compromise solution;
- Termination criterion: early stopping, with a maximum number of iterations of 50;
- Number of generations: 100;
- Population size: 100;
- Proportion of random immigrants: 0.10;
- Crossover rate: 0.70.

For M_{3} and M_{4}, the number of admissible inputs ranged from 1 to 30, while for M_{1} and M_{2} the range employed was from 1 to 20.

[20 9 9] for P_{D}, [20 9 0] for T, [1 0 0] for Occ, and [1 0 0] for D_{E}. This means that for P_{D} we considered the first 20 lags before the current sample, nine lags centered 24 h ago, and nine lags centered one week before. For T, the same number of lags before the current sample and centered one day ago will be used (but none from the third period), and for the other two variables only the first lag will be allowed. This means that the total number of lags (d_{max}) that MOGA considers is (20 + 9 + 9) + (20 + 9) + (1) + (1), i.e., 69 lags.
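Under the 15 min sampling used here (96 samples per day), the lag specification just described can be sketched as below; the exact centering convention around the daily and weekly marks is our assumption, and the helper name is ours.

```python
def candidate_lags(spec, steps_per_day=96):
    """Build the candidate lag set from a [recent, daily, weekly] spec:
    `recent` lags immediately before the current sample, plus `daily`
    lags centered one day back and `weekly` lags centered one week back
    (odd counts assumed, 96 steps/day for 15-min sampling)."""
    recent, daily, weekly = spec
    lags = list(range(1, recent + 1))
    for center, count in ((steps_per_day, daily),
                          (7 * steps_per_day, weekly)):
        half = count // 2
        lags += list(range(center - half, center + half + 1)) if count else []
    return sorted(set(lags))

# P_D spec [20 9 9]: 20 recent + 9 daily + 9 weekly = 38 candidate lags.
lags_PD = candidate_lags([20, 9, 9])
```

MOGA then searches over subsets of this candidate set, so d_max bounds the feature pool rather than the size of any individual model.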

l_{ind} = 676 samples. As a PH of 28 steps ahead was used, for this model we obtain: n_{pred_MOGA} = 1881 − 676 − 28 = 1161 samples and n_{pred_VAL} = 1962 − 676 − 28 = 1242 samples.

d_{max} = 38, and l_{ind}, n_{pred_MOGA}, and n_{pred_VAL} are the same as for M_{3}.

d_{max} = 29, l_{ind} = 100, n_{pred_MOGA} = 1745 (around 18 days), and n_{pred_VAL} = 1826 (around 19 days).

P_{G}, [20 9 9] lags for R, and [20 9 0] lags for T, which means that d_{max} = 105, and l_{ind}, n_{pred_MOGA}, and n_{pred_VAL} are the same as for M_{1} and M_{3}.

#### 3.2. Solar Radiation Models


R^{2}(s) over the prediction horizon, considering the set ${m}_{e}^{par}$. On the left side of each figure, the selected ensemble is shown, and on the right, the evolution for the multiple models is presented. Figure 16 shows, for three days within the time series considered, the measured solar radiation, its one-step-ahead forecast, and the corresponding upper and lower bounds.

#### 3.3. Atmospheric Air Temperature Models

#### 3.4. Load Demand Models

#### 3.5. Power Generation Models

RMSE_{val} is very high.

#### 3.6. Analysis of the Results

- In three out of four models, the ensemble versions present better results, for all performance criteria, than the single model chosen. In the power generation model, the single model obtained the best results in three of the performance indices. Overall, out of the total of 24 criteria, the ensemble approaches obtained the best results in 21 cases;
- Focusing now on the number of models to use in the ensemble, only in two performance criteria out of the 24 (${S}_{PINAW}$ in the air temperature model and ${S}_{RMSE}$ in the power generation model with measured data for the exogenous variables) did the use of more than 25 models obtain better results;
- Regarding the method used to select the 25 models (${m}_{e}^{nw}$, ${m}_{e}^{fore}$ or ${m}_{e}^{par}$), there is no clear choice: the numbers of wins for the three selection methods were 8, 6, and 9, respectively;
- For the NARX models, in all cases, the use of ${{m}_{e}^{fore}\rfloor}_{e}$ was found to be the best choice;
- Finally, violations of PICP were assessed using (35). As can be seen in Table 4, Table 7, Table 11 and Table 14, and in Figure 9, Figure 17, Figure 20 and Figure 23, in most cases the computed Prediction Interval Coverage Probability throughout PH, for the chosen model ensemble, was slightly larger than the specified level of 90%. In the cases where violations were found, the PICP values are near 90%.

1. Use ApproxHull to select the datasets from the available design data;
2. Formulate MOGA, in a first step, minimizing the different objectives;
3. Analyze the MOGA results, restrict some of the objectives, and redefine the input space and/or the model topology;
4. Use the preferential set of models obtained from the second MOGA execution;
5. Determine ${m}_{e}^{pref}$ using (44);
6. From ${m}_{e}^{pref}$, obtain ${m}_{e}^{par}$ using (47);
7. If the model to be designed is a NARX model, use ${m}_{e}^{par}$ for the exogenous variables;
8. Obtain the prediction intervals using (24) and (25).

#### 3.7. Discussion

^{2}. Our approach obtained, for the 15 min step-ahead forecast, values of 0.91, 14.5%, and 35 W/m^{2}, respectively.

## 4. Conclusions

## 5. Patents

## Author Contributions

## Funding

## Data Availability Statement

## Conflicts of Interest

## References

- Ahmad, T.; Chen, H. A review on machine learning forecasting growth trends and their real-time applications in different energy systems. Sustain. Cities Soc.
**2020**, 54, 102010. [Google Scholar] [CrossRef] - Wang, H.; Lei, Z.; Zhang, X.; Zhou, B.; Peng, J. A review of deep learning for renewable energy forecasting. Energy Convers. Manag.
**2019**, 198, 111799. [Google Scholar] [CrossRef] - Liu, H.; Li, Y.; Duan, Z.; Chen, C. A review on multi-objective optimization framework in wind energy forecasting techniques and applications. Energy Convers. Manag.
**2020**, 224, 113324. [Google Scholar] [CrossRef] - Sharma, V.; Cortes, A.; Cali, U. Use of Forecasting in Energy Storage Applications: A Review. IEEE Access
**2021**, 9, 114690–114704. [Google Scholar] [CrossRef] - Ma, J.; Ma, X. A review of forecasting algorithms and energy management strategies for microgrids. Syst. Sci. Control. Eng.
**2018**, 6, 237–248. [Google Scholar] [CrossRef] - Walther, J.; Weigold, M. A Systematic Review on Predicting and Forecasting the Electrical Energy Consumption in the Manufacturing Industry. Energies
**2021**, 14, 968. [Google Scholar] [CrossRef] - Gomes, I.; Bot, K.; Ruano, M.d.G.; Ruano, A. Recent Techniques Used in Home Energy Management Systems: A Review. Energies
**2022**, 15, 2866. [Google Scholar] [CrossRef] - Gomes, I.; Ruano, M.G.; Ruano, A.E. Minimizing the operation costs of a smart home using a HEMS with a MILP-based model predictive control approach. In Proceedings of the IFAC World Congress 2023, Yokohama, Japan, 9–14 July 2023. [Google Scholar]
- Breiman, L. Bagging predictors. Mach. Learn.
**1996**, 24, 123–140. [Google Scholar] [CrossRef] - Freund, Y.; Schapire, R.E. A Decision-Theoretic Generalization of On-Line Learning and an Application to Boosting. J. Comput. Syst. Sci.
**1997**, 55, 119–139. [Google Scholar] [CrossRef] - Voyant, C.; Notton, G.; Kalogirou, S.; Nivet, M.-L.; Paoli, C.; Motte, F.; Fouilloy, A. Machine learning methods for solar radiation forecasting: A review. Renew. Energy
**2017**, 105, 569–582. [Google Scholar] [CrossRef] - Gaboitaolelwe, J.; Zungeru, A.M.; Yahya, A.; Lebekwe, C.K.; Vinod, D.N.; Salau, A.O. Machine Learning Based Solar Photovoltaic Power Forecasting: A Review and Comparison. IEEE Access
**2023**, 11, 40820–40845. [Google Scholar] [CrossRef] - Kychkin, A.V.; Chasparis, G.C. Feature and model selection for day-ahead electricity-load forecasting in residential buildings. Energy Build.
**2021**, 249, 111200. [Google Scholar] [CrossRef] - Ungar, L.; De, R.; Rosengarten, V. Estimating Prediction Intervals for Artificial Neural Networks. In Proceedings of the 9th Yale Workshop on Adaptive and Learning Systems, Pittsburgh, PA, USA, 10–12 June 1996; pp. 1–6. [Google Scholar]
- Hwang, J.T.G.; Ding, A.A. Prediction Intervals for Artificial Neural Networks. J. Am. Stat. Assoc.
**1997**, 92, 748–757. [Google Scholar] [CrossRef] - Cartagena, O.; Parra, S.; Muñoz-Carpintero, D.; Marín, L.G.; Sáez, D. Review on Fuzzy and Neural Prediction Interval Modelling for Nonlinear Dynamical Systems. IEEE Access
**2021**, 9, 23357–23384. [Google Scholar] [CrossRef] - Khosravi, A.; Nahavandi, S.; Creighton, D. Construction of Optimal Prediction Intervals for Load Forecasting Problems. IEEE Trans. Power Syst.
**2010**, 25, 1496–1503. [Google Scholar] [CrossRef] - Quan, H.; Srinivasan, D.; Khosravi, A. Uncertainty handling using neural network-based prediction intervals for electrical load forecasting. Energy
**2014**, 73, 916–925. [Google Scholar] [CrossRef] - Gob, R.; Lurz, K.; Pievatolo, A. More Accurate Prediction Intervals for Exponential Smoothing with Covariates with Applications in Electrical Load Forecasting and Sales Forecasting. Qual. Reliab. Eng. Int.
**2015**, 31, 669–682. [Google Scholar] [CrossRef] - Antoniadis, A.; Brossat, X.; Cugliari, J.; Poggi, J.M. A prediction interval for a function-valued forecast model: Application to load forecasting. Int. J. Forecast.
**2016**, 32, 939–947. [Google Scholar] [CrossRef] - Zuniga-Garcia, M.A.; Santamaria-Bonfil, G.; Arroyo-Figueroa, G.; Batres, R. Prediction Interval Adjustment for Load-Forecasting using Machine Learning. Appl. Sci.
**2019**, 9, 20. [Google Scholar] [CrossRef] - Chu, Y.H.; Li, M.Y.; Pedro, H.T.C.; Coimbra, C.F.M. Real-time prediction intervals for intra-hour DNI forecasts. Renew. Energy
**2015**, 83, 234–244. [Google Scholar] [CrossRef] - Chu, Y.; Coimbra, C.F.M. Short-term probabilistic forecasts for Direct Normal Irradiance. Renew. Energy
**2017**, 101, 526–536. [Google Scholar] [CrossRef] - van der Meer, D.W.; Widen, J.; Munkhammar, J. Review on probabilistic forecasting of photovoltaic power production and electricity consumption. Renew. Sust. Energ. Rev.
**2018**, 81, 1484–1512. [Google Scholar] [CrossRef] - Nowotarski, J.; Weron, R. Recent advances in electricity price forecasting: A review of probabilistic forecasting. Renew. Sust. Energ. Rev.
**2018**, 81, 1548–1568. [Google Scholar] [CrossRef] - Wang, H.-z.; Li, G.-q.; Wang, G.-b.; Peng, J.-c.; Jiang, H.; Liu, Y.-t. Deep learning based ensemble approach for probabilistic wind power forecasting. Appl. Energy
**2017**, 188, 56–70. [Google Scholar] [CrossRef] - Lu, C.; Liang, J.; Jiang, W.; Teng, J.; Wu, C. High-resolution probabilistic load forecasting: A learning ensemble approach. J. Frankl. Inst.
**2023**, 360, 4272–4296. [Google Scholar] [CrossRef] - Bot, K.; Santos, S.; Laouali, I.; Ruano, A.; Ruano, M.G. Design of Ensemble Forecasting Models for Home Energy Management Systems. Energies
**2021**, 14, 7664. [Google Scholar] [CrossRef] - Khosravani, H.R.; Ruano, A.E.; Ferreira, P.M. A convex hull-based data selection method for data driven models. Appl. Soft Comput.
**2016**, 47, 515–533. [Google Scholar] [CrossRef] - Golub, G.; Pereyra, V. Separable nonlinear least squares: The variable projection method and its applications. Inverse Probl.
**2003**, 19, R1–R26. [Google Scholar] [CrossRef] - Ruano, A.E.B.; Jones, D.I.; Fleming, P.J. A New Formulation of the Learning Problem for a Neural Network Controller. In Proceedings of the 30th IEEE Conference on Decision and Control, Brighton, UK, 11–13 December 1991; pp. 865–866. [Google Scholar]
- Ruano, A.E.; Ferreira, P.M.; Cabrita, C.; Matos, S. Training Neural Networks and Neuro-Fuzzy Systems: A Unified View. IFAC Proc. Vol.
**2002**, 35, 415–420. [Google Scholar] [CrossRef] - Levenberg, K. A method for the solution of certain non-linear problems in least squares. Q. Appl. Math.
**1944**, 2, 164–168. [Google Scholar] [CrossRef] - Marquardt, D.W. An algorithm for least-squares estimation of nonlinear parameters. J. Soc. Ind. Appl. Math.
**1963**, 11, 431–441. [Google Scholar] [CrossRef] - Ruano, A.E. Applications of Neural Networks to Control Systems. Ph.D. Thesis, University College of North Wales, Bangor, UK, 1992. [Google Scholar]
- Haykin, S. Neural Networks: A Comprehensive Foundation, 2nd ed.; Prentice Hall: Hoboken, NJ, USA, 1999. [Google Scholar]
- Ferreira, P.; Ruano, A. Evolutionary Multiobjective Neural Network Models. In New Advances in Intelligent Signal Processing; Ruano, A., Várkonyi-Kóczy, A., Eds.; Studies in Computational Intelligence; Springer: Berlin/Heidelberg, Germany, 2011; Volume 372, pp. 21–53. [Google Scholar]
- Chinrunngrueng, C.; Séquin, C.H. Optimal adaptive k-means algorithm with dynamic adjustment of learning rate. IEEE Trans. Neural Netw.
**1995**, 6, 157–169. [Google Scholar] [CrossRef] - Lineros, M.L.; Luna, A.M.; Ferreira, P.M.; Ruano, A.E. Optimized Design of Neural Networks for a River Water Level Prediction System. Sensors
**2021**, 21, 6504. [Google Scholar] [CrossRef] - Bichpuriya, Y.; Rao, M.S.S.; Soman, S.A. Combination approaches for short term load forecasting. In Proceedings of the 2010 Conference Proceedings IPEC, Singapore, 27–29 October 2010. [Google Scholar]
- Kiartzis, S.; Kehagias, A.; Bakirtzis, A.; Petridis, V. Short term load forecasting using a Bayesian combination method. Int. J. Electr. Power Energy Syst.
**1997**, 19, 171–177. [Google Scholar] [CrossRef] - Fan, S.; Chen, L.; Lee, W.J. Short-term load forecasting using comprehensive combination based on multi-meteorological information. In Proceedings of the 2008 IEEE/IAS Industrial and Commercial Power Systems Technical Conference, Las Vegas, NV, USA, 4–8 May 2008; pp. 1–7. [Google Scholar]
- Nix, D.A.; Weigend, A.S. Estimating the mean and variance of the target probability distribution. In Proceedings of the 1994 IEEE International Conference on Neural Networks (ICNN’94), Orlando, FL, USA, 28 June–2 July 1994; Volume 51, pp. 55–60. [Google Scholar]
- Ruano, A.; Bot, K.; Ruano, M.G. Home Energy Management System in an Algarve residence. First results. In CONTROLO 2020: Proceedings of the 14th APCA International Conference on Automatic Control and Soft Computing, Gonçalves; Gonçalves, J.A., Braz-César, M., Coelho, J.P., Eds.; Lecture Notes in Electrical Engineering; Springer Science and Business Media Deutschland GmbH: Bragança, Portugal, 2021; Volume 695, pp. 332–341. [Google Scholar]
- Ferreira, P.M.; Ruano, A.E.; Pestana, R.; Koczy, L.T. Evolving RBF predictive models to forecast the Portuguese electricity consumption. IFAC Proc. Vol.
**2009**, 42, 414–419. [Google Scholar] [CrossRef] - Ferreira, P.M.; Cuambe, I.D.; Ruano, A.E.; Pestana, R. Forecasting the Portuguese Electricity Consumption using Least-Squares Support Vector Machines. IFAC Proc. Vol.
**2013**, 46, 411–416. [Google Scholar] [CrossRef] - Bot, K.; Ruano, A.; Ruano, M.G. Forecasting Electricity Demand in Households using MOGA-designed Artificial Neural Networks. IFAC-PapersOnLine
**2020**, 53, 8225–8230. [Google Scholar] [CrossRef] - Ruano, A.; Bot, K.; Ruano, M.d.G. The Impact of Occupants in Thermal Comfort and Energy Efficiency in Buildings. In Occupant Behaviour in Buildings: Advances and Challenges; Bentham Science: Sharjah, United Arab Emirates, 2021; Volume 6, pp. 101–137. [Google Scholar]
- Gomes, I.L.R.; Ruano, M.G.; Ruano, A.E. MILP-based model predictive control for home energy management systems: A real case study in Algarve, Portugal. Energy Build.
**2023**, 281, 112774. [Google Scholar] [CrossRef] - Galvan, I.M.; Huertas-Tato, J.; Rodriguez-Benitez, F.J.; Arbizu-Barrena, C.; Pozo-Vazquez, D.; Aler, R. Evolutionary-based prediction interval estimation by blending solar radiation forecasting models using meteorological weather types. Appl. Soft. Comput.
**2021**, 109, 13. [Google Scholar] [CrossRef] - Ni, Q.; Zhuang, S.X.; Sheng, H.M.; Wang, S.; Xiao, J. An Optimized Prediction Intervals Approach for Short Term PV Power Forecasting. Energies
**2017**, 10, 16. [Google Scholar] [CrossRef] - Ni, Q.; Zhuang, S.X.; Sheng, H.M.; Kang, G.Q.; Xiao, J. An ensemble prediction intervals approach for short-term PV power forecasting. Sol. Energy
**2017**, 155, 1072–1083. [Google Scholar] [CrossRef] - Harkat, H.; Ruano, A.E.; Ruano, M.G.; Bennani, S.D. GPR target detection using a neural network classifier designed by a multi-objective genetic algorithm. Appl. Soft Comput.
**2019**, 79, 310–325. [Google Scholar] [CrossRef] - Hajimani, E.; Ruano, M.G.; Ruano, A.E. An intelligent support system for automatic detection of cerebral vascular accidents from brain CT images. Comput. Methods Programs Biomed.
**2017**, 146, 109–123. [Google Scholar] [CrossRef] [PubMed] - Teixeira, C.A.; Pereira, W.C.A.; Ruano, A.E.; Ruano, M.G. On the possibility of non-invasive multilayer temperature estimation using soft-computing methods. Ultrasonics
**2010**, 50, 32–43. [Google Scholar] [CrossRef]

**Figure 1.** Chromosome and input space lookup table [37].

**Figure 2.** Model design cycle [29].

**Figure 4.** Detail of the objective space for a problem discussed in [39].

**Figure 8.** Selected solar radiation models for ${m}_{e}^{nw}$ (**top left**), ${m}_{e}^{fore}$ (**top right**), and ${m}_{e}^{par}$ (**bottom**). In the top graphs, the selected models are denoted with a red cross, while in the bottom one they are represented in green.

**Figure 16.** Solar radiation one-step-ahead detail. Target (blue solid); Prediction (red solid); Prediction upper bound (yellow dash); Prediction lower bound (magenta dash).

**Figure 19.** **Left**: Air temperature R^{2} evolution. **Right**: Air temperature one-step-ahead detail. Target (blue solid); Prediction (red solid); Prediction upper bound (yellow dash); Prediction lower bound (magenta dash).

**Figure 22.** **Left**: Load demand R^{2} evolution. **Right**: Load demand one-step-ahead detail. Target (blue solid); Prediction (red solid); Prediction upper bound (yellow dash); Prediction lower bound (magenta dash).

**Figure 25.** **Left**: Power generation R^{2} evolution. **Right**: Power generation one-step-ahead detail. Target (blue solid); Prediction (red solid); Prediction upper bound (yellow dash); Prediction lower bound (magenta dash).

Day of the Week | Regular Day | Holiday | Special |
---|---|---|---|

Monday | 0.05 | 0.40 | 0.70 |

Tuesday | 0.10 | 0.80 | |

Wednesday | 0.15 | 0.50 | |

Thursday | 0.20 | 1.00 | |

Friday | 0.25 | 0.60 | 0.90 |

Saturday | 0.30 | 0.30 | |

Sunday | 0.35 | 0.35 |

RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\left|\right|\mathit{w}\left|\right|$ | $\sum \mathit{f}\mathit{o}\mathit{r}\mathit{e}\left(.\right)$ | |
---|---|---|---|---|---|---|

min | 0.108 | 0.084 | 0.084 | 8 | 1.819 | 4.398 |

median | 0.114 | 0.086 | 0.086 | 45 | 21.85 | 4.643 |

max | 0.184 | 0.157 | 0.153 | 210 | - | - |

RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\left|\right|\mathit{w}\left|\right|$ | $\sum \mathit{f}\mathit{o}\mathit{r}\mathit{e}\left(.\right)$ | |
---|---|---|---|---|---|---|

min | 0.111 | 0.084 | 0.084 | 14 | 4.183 | 4.416 |

median | 0.114 | 0.086 | 0.086 | 42 | 16.19 | 4.649 |

max | 0.119 | 0.091 | 0.092 | 99 | - | - |

Sets | # | $\mathit{V}\mathit{i}\mathit{o}{\mathit{l}}_{\mathit{P}\mathit{I}\mathit{C}\mathit{P}}$ | ${\mathit{S}}_{\mathit{P}\mathit{I}\mathit{N}\mathit{A}\mathit{W}}$ | ${\mathit{S}}_{\mathit{R}\mathit{M}\mathit{S}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{R}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{P}\mathit{E}}$ | ${\mathit{S}}_{{\mathit{R}}^{2}}$ |
---|---|---|---|---|---|---|---|---|

m_{s} | 1 | 0 | 8.767 | 3.349 | 1.767 | 104.0 | 2443 | 35.12 |

${m}_{e}^{nd}$ | 59 | 0 | 8.748√ | 3.341√ | 1.737√ | 102.2√ | 2027√ | 35.13√ |

${m}_{mm}^{nd}$ | 59 | 0 | 9.015 | 3.448 | 1.801 | 105.9 | 2085 | 35.07 |

${m}_{e}^{nw}$ | 25 | 0 | 8.837 | 3.342√ | 1.716√ | 100.9√ | 2020 | 35.13√ |

${m}_{mm}^{nw}$ | 25 | 0 | 8.993 | 3.448 | 1.779 | 104.6 | 2061 | 35.07 |

${m}_{e}^{fore}$ | 25 | 0 | 8.603√ | 3.298√ | 1.714√ | 100.8√ | 2155 | 35.15√ |

${m}_{mm}^{fore}$ | 25 | 0 | 8.783 | 3.360 | 1.759 | 103.5 | 2295 | 35.12 |

${m}_{e}^{par}$ | 25 | 0 | 8.613√ | 3.299√ | 1.703√ | 100.2√ | 2093 | 35.15√ |

${m}_{mm}^{par}$ | 25 | 0 | 8.804 | 3.361 | 1.759 | 103.5 | 2201 | 35.12 |

RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\left|\right|\mathit{w}\left|\right|$ | $\sum \mathit{f}\mathit{o}\mathit{r}\mathit{e}\left(.\right)$ | |
---|---|---|---|---|---|---|

min | 0.018 | 0.017 | 0.017 | 8 | 4.342 | 3.802 |

median | 0.018 | 0.017 | 0.017 | 56 | 30.82 | 3.995 |

max | 0.020 | 0.018 | 0.018 | 210 | - | 4.767 |

RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\left|\right|\mathit{w}\left|\right|$ | $\sum \mathit{f}\mathit{o}\mathit{r}\mathit{e}\left(.\right)$ | |
---|---|---|---|---|---|---|

min | 0.018 | 0.017 | 0.017 | 8 | 4.342 | 3.802 |

median | 0.019 | 0.017 | 0.017 | 53 | 31.97 | 3.999 |

max | 0.020 | 0.018 | 0.018 | 100 | - | 4.767 |

Sets | # | $\mathit{V}\mathit{i}\mathit{o}{\mathit{l}}_{\mathit{P}\mathit{I}\mathit{C}\mathit{P}}$ | ${\mathit{S}}_{\mathit{P}\mathit{I}\mathit{N}\mathit{A}\mathit{W}}$ | ${\mathit{S}}_{\mathit{R}\mathit{M}\mathit{S}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{R}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{P}\mathit{E}}$ | ${\mathit{S}}_{{\mathit{R}}^{2}}$ |
---|---|---|---|---|---|---|---|---|

m_{s} | 1 | - | - | 3.425 | 2.699 | 219.9 | 3007 | 27.88 |

${m}_{e}^{nd}$ | 87 | 0.28 | 11.14√ | 3.412√ | 2.692√ | 219.3√ | 2949 | 27.93√ |

${m}_{mm}^{nd}$ | 87 | 0.41 | 11.68 | 3.487 | 2.742 | 233.4 | 2987 | 27.57 |

${m}_{e}^{nw}$ | 25 | 0.72 | 12.67 | 3.407√ | 2.689√ | 219.1√ | 2921 | 27.96√ |

${m}_{mm}^{nw}$ | 25 | 0.41 | 11.76 | 3.499 | 2.754 | 224.4 | 2979 | 27.52 |

${m}_{e}^{fore}$ | 25 | 0.33 | 11.54√ | 3.416√ | 2.696√ | 219.6√ | 2972 | 27.91√ |

${m}_{mm}^{fore}$ | 25 | 0.42 | 11.50 | 3.486 | 2.746 | 233.7 | 3007 | 27.57 |

${m}_{e}^{par}$ | 25 | 0.28 | 11.18√ | 3.412√ | 2.695√ | 219.6√ | 2951 | 27.93√ |

${m}_{mm}^{par}$ | 25 | 0.43 | 11.67 | 3.499 | 2.756 | 224.6 | 3006 | 27.52 |

RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\left|\right|\mathit{w}\left|\right|$ | $\sum \mathit{f}\mathit{o}\mathit{r}\mathit{e}\left(.\right)$ | |
---|---|---|---|---|---|---|

min | 0.146 | 0.117 | 0.119 | 8 | 1.358 | 6.208 |

median | 0.149 | 0.119 | 0.121 | 133 | - | 6.208 |

max | 0.216 | 0.169 | 21.78 | 261 | - | - |

RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\left|\right|\mathit{w}\left|\right|$ | $\sum \mathit{f}\mathit{o}\mathit{r}\mathit{e}\left(.\right)$ | |
---|---|---|---|---|---|---|

min | 0.111 | 0.084 | 0.084 | 14 | 4.183 | 4.416 |

median | 0.114 | 0.086 | 0.086 | 42 | 16.19 | 4.649 |

max | 0.119 | 0.091 | 0.092 | 99 | - | - |

Sets | # | $\mathit{V}\mathit{i}\mathit{o}{\mathit{l}}_{\mathit{P}\mathit{I}\mathit{C}\mathit{P}}$ | ${\mathit{S}}_{\mathit{P}\mathit{I}\mathit{N}\mathit{A}\mathit{W}}$ | ${\mathit{S}}_{\mathit{R}\mathit{M}\mathit{S}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{R}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{P}\mathit{E}}$ | ${\mathit{S}}_{{\mathit{R}}^{2}}$ |
---|---|---|---|---|---|---|---|---|

${{m}_{s}\rfloor}_{m}$ | 1 | 0 | - | 4.753 | 3.317 | 211.4 | 3185 | 18.16 |

${{m}_{s}\rfloor}_{s}$ | 1 | 0 | - | 4.735 | 3.332 | 211.9 | 3141 | 18.35 |

${{m}_{s}\rfloor}_{e}$ | 1 | 0 | - | 4.731 | 3.319 | 211.6 | 3143 | 18.16 |

${{m}_{e}^{nd}\rfloor}_{m}$ | 89 | 0 | 24.83 | 4.578√ | 3.199√ | 203.8√ | 3154 | 20.28√ |

${{m}_{e}^{nd}\rfloor}_{s}$ | 89 | 0 | 23.33 | 4.591√ | 3.208√ | 204.4√ | 3133 | 20.12√ |

${{m}_{e}^{nd}\rfloor}_{e}$ | 89 | 0 | 24.83 | 4.587√ | 3.205√ | 204.2√ | 3132 | 20.29√ |

${{m}_{e}^{pref}\rfloor}_{m}$ | 47 | 0 | 24.73 | 4.574√ | 3.186√ | 203.1√ | 3151 | 20.32√ |

${{m}_{e}^{pref}\rfloor}_{s}$ | 47 | 0 | 24.73 | 4.589√ | 3.197√ | 203.8√ | 3131 | 20.15√ |

${{m}_{e}^{pref}\rfloor}_{e}$ | 47 | 0 | 24.72 | 4.585√ | 3.193√ | 203.5√ | 3130 | 20.20√ |

Sets | # | $\mathit{V}\mathit{i}\mathit{o}{\mathit{l}}_{\mathit{P}\mathit{I}\mathit{C}\mathit{P}}$ | ${\mathit{S}}_{\mathit{P}\mathit{I}\mathit{N}\mathit{A}\mathit{W}}$ | ${\mathit{S}}_{\mathit{R}\mathit{M}\mathit{S}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{R}\mathit{E}}$ | ${\mathit{S}}_{\mathit{M}\mathit{A}\mathit{P}\mathit{E}}$ | ${\mathit{S}}_{{\mathit{R}}^{2}}$ |
---|---|---|---|---|---|---|---|---|

${{m}_{e}^{nw}\rfloor}_{m}$ | 25 | 0 | 19.45√ | 4.584√ | 3.192√ | 203.4√ | 3136 | 20.21√ |

${{m}_{e}^{nw}\rfloor}_{s}$ | 25 | 0 | 19.44√ | 4.597√ | 3.199√ | 203.9√ | 3129 | 20.04√ |

${{m}_{e}^{nw}\rfloor}_{e}$ | 25 | 0 | 19.44√ | 4.593√ | 3.195√ | 203.6√ | 3127 | 20.11√ |

${{m}_{e}^{fore}\rfloor}_{m}$ | 25 | 0 | 24.83 | 4.578√ | 3.199√ | 203.8√ | 3154 | 20.28√ |

${{m}_{e}^{fore}\rfloor}_{s}$ | 25 | 0 | 23.33 | 4.590√ | 3.208√ | 204.4√ | 3133 | 20.12√ |

${{m}_{e}^{fore}\rfloor}_{e}$ | 25 | 0 | 24.83 | 4.587√ | 3.205√ | 204.2√ | 3132 | 20.29√ |

${{m}_{e}^{par}\rfloor}_{m}$ | 25 | 0 | 24.71 | 4.575√ | 3.173√ | 202.2√ | 3162 | 20.31√ |

${{m}_{e}^{par}\rfloor}_{s}$ | 25 | 0 | 24.69 | 4.590√ | 3.179√ | 202.6√ | 3145 | 20.13√ |

${{m}_{e}^{par}\rfloor}_{e}$ | 25 | 0 | 24.69 | 4.585√ | 3.175√ | 202.4√ | 3145 | 20.19√ |

 | RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\|w\|$ | $\sum \mathrm{fore}(\cdot)$
---|---|---|---|---|---|---
min | 0.053 | 0.042 | 0.049 | 8 | 3.071 | 1.686
median | 0.063 | 0.049 | 0.052 | 72 | 3352 | 2.609
max | 0.145 | 0.106 | 0.129 | 300 | - | -

 | RMSE_{tr} | RMSE_{te} | RMSE_{val} | Comp | $\|w\|$ | $\sum \mathrm{fore}(\cdot)$
---|---|---|---|---|---|---
min | 0.058 | 0.042 | 0.049 | 14 | 6.727 | 2.199
median | 0.063 | 0.043 | 0.051 | 52 | 4358 | 2.671
max | 0.067 | 0.046 | 0.129 | 100 | - | -

Sets | # | $Viol_{PICP}$ | $S_{PINAW}$ | $S_{RMSE}$ | $S_{MAE}$ | $S_{MRE}$ | $S_{MAPE}$ | $S_{R^2}$
---|---|---|---|---|---|---|---|---
${{m}_{s}\rfloor}_{m}$ | 1 | 0 | 6.924 | 2.774 | 1.294 | 76.02 | 2826 | 35.38
${{m}_{s}\rfloor}_{s}$ | 1 | 0 | 8.011 | 3.159 | 1.430 | 83.99 | 3328 | 35.19
${{m}_{s}\rfloor}_{e}$ | 1 | 0 | 7.943 | 3.132 | 1.405 | 82.50 | 3046 | 35.21
${{m}_{e}^{pref}\rfloor}_{m}$ | 44 | 1 | 7.270 ✓ | 2.877 ✓ | 1.271 ✓ | 74.62 ✓ | 2244 ✓ | 35.33 ✓
${{m}_{e}^{pref}\rfloor}_{s}$ | 44 | 1 | 7.261 ✓ | 3.071 ✓ | 1.369 ✓ | 80.42 ✓ | 2623 ✓ | 35.24 ✓
${{m}_{e}^{pref}\rfloor}_{e}$ | 44 | 1 | 7.263 ✓ | 3.065 ✓ | 1.365 ✓ | 80.19 ✓ | 2533 ✓ | 35.24 ✓
${{m}_{e}^{nw}\rfloor}_{m}$ | 25 | 1 | 7.426 ✓ | 2.842 ✓ | 1.257 ✓ | 73.87 ✓ | 2192 ✓ | 35.35 ✓
${{m}_{e}^{nw}\rfloor}_{s}$ | 25 | 1 | 7.438 ✓ | 3.089 ✓ | 1.383 ✓ | 81.24 ✓ | 2658 ✓ | 35.23 ✓
${{m}_{e}^{nw}\rfloor}_{e}$ | 25 | 1 | 7.437 ✓ | 3.084 ✓ | 1.381 ✓ | 81.10 ✓ | 2520 ✓ | 35.23 ✓
${{m}_{e}^{fore}\rfloor}_{m}$ | 25 | 1 | 7.094 ✓ | 2.882 ✓ | 1.258 ✓ | 73.88 ✓ | 2135 ✓ | 35.33 ✓
${{m}_{e}^{fore}\rfloor}_{s}$ | 25 | 1 | 7.099 ✓ | 3.080 ✓ | 1.368 ✓ | 80.32 ✓ | 2576 ✓ | 35.23 ✓
${{m}_{e}^{fore}\rfloor}_{e}$ | 25 | 1 | 7.090 ✓ | 3.073 ✓ | 1.365 ✓ | 80.16 ✓ | 2461 ✓ | 35.24 ✓
${{m}_{e}^{par}\rfloor}_{m}$ | 25 | 1 | 7.328 ✓ | 2.894 ✓ | 1.267 ✓ | 74.44 ✓ | 2127 ✓ | 35.32 ✓
${{m}_{e}^{par}\rfloor}_{s}$ | 25 | 1 | 7.333 ✓ | 3.091 ✓ | 1.377 ✓ | 80.84 ✓ | 2553 ✓ | 35.23 ✓
${{m}_{e}^{par}\rfloor}_{e}$ | 25 | 1 | 7.323 ✓ | 3.083 ✓ | 1.327 ✓ | 80.61 ✓ | 2428 ✓ | 35.23 ✓
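For reference, the per-step definitions behind the metric columns in the tables above (the $S$-prefixed columns aggregate these over the steps of the prediction horizon) can be sketched as follows. This is a minimal illustration using the standard definitions of PICP, PINAW, RMSE, MAE, MAPE, and $R^2$; the function names and the normalization of PINAW by the target range are assumptions, not code from the paper.

```python
import math

def picp(y, lo, hi):
    # Prediction Interval Coverage Probability:
    # fraction of targets falling inside their prediction interval [lo, hi].
    return sum(l <= t <= h for t, l, h in zip(y, lo, hi)) / len(y)

def pinaw(y, lo, hi):
    # Prediction Interval Normalized Average Width:
    # mean interval width, normalized by the target range (assumed normalization).
    return (sum(h - l for l, h in zip(lo, hi)) / len(y)) / (max(y) - min(y))

def rmse(y, p):
    # Root Mean Square Error of the point forecast.
    return math.sqrt(sum((t - q) ** 2 for t, q in zip(y, p)) / len(y))

def mae(y, p):
    # Mean Absolute Error.
    return sum(abs(t - q) for t, q in zip(y, p)) / len(y)

def mape(y, p):
    # Mean Absolute Percentage Error (targets must be nonzero).
    return 100.0 / len(y) * sum(abs((t - q) / t) for t, q in zip(y, p))

def r2(y, p):
    # Coefficient of determination.
    y_bar = sum(y) / len(y)
    ss_res = sum((t - q) ** 2 for t, q in zip(y, p))
    ss_tot = sum((t - y_bar) ** 2 for t in y)
    return 1.0 - ss_res / ss_tot
```

A PICP violation at step *k* of the horizon (the $Viol_{PICP}$ column) would then simply be `picp(...) < threshold` for that step's intervals.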


© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Ruano, A.; Ruano, M.d.G.
Designing Robust Forecasting Ensembles of Data-Driven Models with a Multi-Objective Formulation: An Application to Home Energy Management Systems. *Inventions* **2023**, *8*, 96.
https://doi.org/10.3390/inventions8040096
