# Energy Consumption Patterns and Load Forecasting with Profiled CNN-LSTM Networks


## Abstract


## 1. Introduction

## 2. Related Work

#### 2.1. Clustering and Energy Profile Creation

#### 2.2. Forecasting Models

## 3. Dataset Description

#### 3.1. REFIT

- Correcting the time to account for daylight savings in the United Kingdom;
- Merging timestamp duplicates;
- Moving sections of IAM columns to correctly match the appliance they were recording when that appliance was reset or otherwise moved;
- Forward filling NaN values or zeroing them depending on the duration of the time gap;
- Removing spikes of greater than 4000 watts from the IAM values and replacing them with zeros;
- Appending an additional issues column that was set to 1 if the sum of the sub-metering IAMs was greater than the household aggregate; in this case, the data should either be discarded or, at the very least, the discrepancy noted.
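The cleaning steps above can be sketched with pandas roughly as follows. This is a minimal illustration, not the authors' implementation: the column names (`Aggregate`, the IAM columns), the gap-fill limit, and the order of operations are assumptions; only the 4000 W spike threshold and the issues-column rule come from the text.

```python
import pandas as pd

def clean_refit(df, iam_cols, agg_col="Aggregate",
                spike_watts=4000, max_ffill_gap=4):
    """Sketch of the REFIT cleaning steps. Assumes a DatetimeIndex
    at a fixed sampling interval; column names are hypothetical."""
    out = df.copy()
    # Merge duplicate timestamps by averaging their readings.
    out = out.groupby(level=0).mean()
    # Replace spikes of greater than `spike_watts` in the IAM columns with zeros.
    for col in iam_cols:
        out.loc[out[col] > spike_watts, col] = 0.0
    # Forward-fill short NaN gaps (up to `max_ffill_gap` samples); zero longer ones.
    out[iam_cols] = out[iam_cols].ffill(limit=max_ffill_gap).fillna(0.0)
    # Flag rows where the sub-metered IAMs exceed the household aggregate.
    out["issues"] = (out[iam_cols].sum(axis=1) > out[agg_col]).astype(int)
    return out
```

Flagged rows can then be dropped or inspected downstream rather than silently corrected.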

#### 3.2. UCID

#### 3.3. Meteorological Data

## 4. Methodology

#### 4.1. Stage 1—Data Collection and Cleaning

#### 4.2. Stage 2—Dimensionality Reduction and Clustering

- Morning: 06:00–11:00;
- Late morning/afternoon: 11:00–15:00;
- Late afternoon/early evening: 15:00–20:30;
- Evening: 20:30–23:30;
- Late evening/early morning: 23:30–06:00.
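The five periods above amount to a simple lookup from clock time to a period label. A minimal sketch (the label strings and the start-inclusive boundary convention are assumptions):

```python
def period_of_day(hour, minute=0):
    """Map a clock time to one of the five periods of the day.
    Boundaries are treated as start-inclusive."""
    t = hour + minute / 60.0
    if 6 <= t < 11:
        return "morning"
    if 11 <= t < 15:
        return "late morning/afternoon"
    if 15 <= t < 20.5:
        return "late afternoon/early evening"
    if 20.5 <= t < 23.5:
        return "evening"
    return "late evening/early morning"
```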

#### 4.3. Stage 3—Data Preprocessing

#### 4.4. Stage 4—Training and Testing

#### 4.4.1. Stage 4.1—Classification Tree

#### 4.4.2. Stage 4.2—CNN-LSTM Network

## 5. Results and Discussion

#### 5.1. Clustering

#### 5.2. Cluster Label Classification

- The UCID dataset contained a much larger number of samples (days).
- The distribution of the samples over the different days of the week and months was much more uniform in the UCID dataset.

#### 5.3. Forecasting Accuracy

## 6. Conclusions and Future Work

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Conflicts of Interest

## References

1. Wei, Y.; Zhang, X.; Shi, Y.; Xia, L.; Pan, S.; Wu, J.; Han, M.; Zhao, X. A Review of Data-Driven Approaches for Prediction and Classification of Building Energy Consumption. Renew. Sustain. Energy Rev. **2018**, 82, 1027–1047.
2. Chen, C.; Das, B.; Cook, D. Energy Prediction in Smart Environments. In Workshops Proceedings of the 6th International Conference on Intelligent Environments, Kuala Lumpur, Malaysia, 19–21 July 2010; pp. 148–157.
3. Yildiz, B.; Bilbao, J.I.; Dore, J.; Sproul, A. Household Electricity Load Forecasting Using Historical Smart Meter Data with Clustering and Classification Techniques. In Proceedings of the 2018 IEEE Innovative Smart Grid Technologies-Asia (ISGT Asia), Singapore, 22–25 May 2018.
4. Hsiao, Y.H. Household Electricity Demand Forecast Based on Context Information and User Daily Schedule Analysis from Meter Data. IEEE Trans. Ind. Inform. **2015**, 11, 33–43.
5. Raza, M.Q.; Khosravi, A. A Review on Artificial Intelligence Based Load Demand Forecasting Techniques for Smart Grid and Buildings. Renew. Sustain. Energy Rev. **2015**, 50, 1352–1372.
6. Al-Saudi, K. The Effectiveness of Different Forecasting Models on Multiple Disparate Datasets; Research Internship; University of Groningen: Groningen, The Netherlands, 2020.
7. Foucquier, A.; Robert, S.; Suard, F.; Stéphan, L.; Jay, A. State of the Art in Building Modelling and Energy Performances Prediction: A Review. Renew. Sustain. Energy Rev. **2013**, 23, 272–288.
8. Kong, W.; Dong, Z.Y.; Jia, Y.; Hill, D.J.; Xu, Y.; Zhang, Y. Short-Term Residential Load Forecasting Based on LSTM Recurrent Neural Network. IEEE Trans. Smart Grid **2019**, 10, 841–851.
9. Kim, T.Y.; Cho, S.B. Predicting Residential Energy Consumption Using CNN-LSTM Neural Networks. Energy **2019**, 182, 72–81.
10. UCI Machine Learning Repository: Individual Household Electric Power Consumption Data Set. Available online: https://archive.ics.uci.edu/ml/datasets/individual+household+electric+power+consumption (accessed on 1 February 2021).
11. Fallah, S.N.; Deo, R.C.; Shojafar, M.; Conti, M.; Shamshirband, S. Computational Intelligence Approaches for Energy Load Forecasting in Smart Energy Management Grids: State of the Art, Future Challenges, and Research Directions. Energies **2018**, 11, 596.
12. Backer, E.; Jain, A.K. A Clustering Performance Measure Based on Fuzzy Set Decomposition. IEEE Trans. Pattern Anal. Mach. Intell. **1981**, PAMI-3, 66–75.
13. Stephen, B.; Tang, X.; Harvey, P.R.; Galloway, S.; Jennett, K.I. Incorporating Practice Theory in Sub-Profile Models for Short Term Aggregated Residential Load Forecasting. IEEE Trans. Smart Grid **2017**, 8, 1591–1598.
14. Ester, M.; Kriegel, H.P.; Sander, J.; Xu, X. A Density-Based Algorithm for Discovering Clusters in Large Spatial Databases with Noise. In Proceedings of KDD'96; AAAI Press: Menlo Park, CA, USA, 1996.
15. Yildiz, B.; Bilbao, J.; Dore, J.; Sproul, A. Recent Advances in the Analysis of Residential Electricity Consumption and Applications of Smart Meter Data. Appl. Energy **2017**, 208, 402–427.
16. Kohonen, T. The Self-Organizing Map. Proc. IEEE **1990**, 78, 1464–1480.
17. James, G.; Witten, D.; Hastie, T.; Tibshirani, R. An Introduction to Statistical Learning with Applications in R; Springer: Berlin/Heidelberg, Germany, 2017.
18. Fumo, N.; Rafe Biswas, M. Regression Analysis for Prediction of Residential Energy Consumption. Renew. Sustain. Energy Rev. **2015**, 47, 332–343.
19. Shi, H.; Xu, M.; Li, R. Deep Learning for Household Load Forecasting—A Novel Pooling Deep RNN. IEEE Trans. Smart Grid **2018**, 9, 5271–5280.
20. O'Malley, T.; Bursztein, E.; Long, J.; Chollet, F.; Jin, H.; Invernizzi, L. Keras Tuner. 2019. Available online: https://github.com/keras-team/keras-tuner (accessed on 1 February 2021).
21. Murray, D.; Stankovic, L.; Stankovic, V. An Electrical Load Measurements Dataset of United Kingdom Households from a Two-Year Longitudinal Study. Sci. Data **2017**, 4, 1–12.
22. Global Solar Irradiance Data and PV System Power Output Data. 2019. Available online: https://solcast.com/ (accessed on 1 February 2021).
23. Van der Maaten, L.; Hinton, G. Visualizing Data Using t-SNE. J. Mach. Learn. Res. **2008**, 9, 2579–2605.
24. McInnes, L.; Healy, J.; Melville, J. UMAP: Uniform Manifold Approximation and Projection for Dimension Reduction. arXiv **2020**, arXiv:1802.03426.
25. Campello, R.J.G.B.; Moulavi, D.; Sander, J. Density-Based Clustering Based on Hierarchical Density Estimates. In Advances in Knowledge Discovery and Data Mining; Pei, J., Tseng, V.S., Cao, L., Motoda, H., Xu, G., Eds.; Springer: Berlin/Heidelberg, Germany, 2013; pp. 160–172.
26. Savitzky, A.; Golay, M.J.E. Smoothing and Differentiation of Data by Simplified Least Squares Procedures. Anal. Chem. **1964**, 36, 1627–1639.
27. Chawla, N.V.; Bowyer, K.W.; Hall, L.O.; Kegelmeyer, W.P. SMOTE: Synthetic Minority Over-sampling Technique. J. Artif. Intell. Res. **2002**, 16, 321–357.
28. Kingma, D.; Ba, J. Adam: A Method for Stochastic Optimization. arXiv **2015**, arXiv:1412.6980.

**Figure 3.** By utilizing a combination of the sine function and the cosine function, we eliminated the possibility that two different times would receive the same value, as could happen had we used either function independently. The combination of both functions can be thought of as an artificial two-axis coordinate system that represents the time of day. (**a**) The time of day represented as a combination of both sine and cosine waves. (**b**) Visualizing our cyclical encoding of the time of day.
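The cyclical encoding illustrated in Figure 3 can be written in a few lines. A minimal sketch (the function name is ours; the paper does not specify the exact implementation):

```python
import numpy as np

def encode_time(hour, minute):
    """Map the time of day onto the unit circle so that, e.g.,
    23:45 and 00:00 receive nearby (sin, cos) coordinates."""
    frac = (hour * 60 + minute) / (24 * 60)  # fraction of the day elapsed
    angle = 2 * np.pi * frac
    return np.sin(angle), np.cos(angle)
```

Using both coordinates is what removes the ambiguity: the sine alone assigns the same value to 03:00 and 09:00, but their cosines differ.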

**Figure 4.** Trimmed Granger causality matrix displaying the Granger causality of our independent features against our target variable for the UCID and REFIT datasets. (**a**) UCID dataset. (**b**) REFIT dataset.

**Figure 5.** Mutual information gain with regard to our independent features and target variable for the UCID and REFIT datasets. (**a**) UCID dataset. (**b**) REFIT dataset.

**Figure 6.** Illustration of the distribution of values with respect to the global active power of the UCID dataset before and after removing outlier values, as defined by Equation (1). (**a**) Before removing outliers. (**b**) After removing outliers.

**Figure 7.** Illustration of the application of the moving average method and the Savitzky–Golay filter method to smooth a subset of our raw data. (**a**) Application of the moving average method with a window size of 3. (**b**) Application of the Savitzky–Golay filter method with a polynomial order of 3 and a window size of 5.
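Both smoothing methods in Figure 7 can be sketched in NumPy. This is an illustrative implementation, not the authors' code: the Savitzky–Golay version below fits a centered least-squares polynomial per window and handles interior points only (SciPy's `savgol_filter` additionally handles the edges).

```python
import numpy as np

def moving_average(x, w=3):
    """Simple moving average; output is shorter than x by w - 1."""
    return np.convolve(x, np.ones(w) / w, mode="valid")

def savitzky_golay(x, window=5, order=3):
    """Savitzky-Golay smoothing for interior points: fit a degree-
    `order` polynomial to each centered window by least squares and
    take its value at the window centre."""
    half = window // 2
    t = np.arange(-half, half + 1)
    # Design matrix with rows [1, t, t^2, ..., t^order].
    A = np.vander(t, order + 1, increasing=True)
    # The first row of pinv(A) gives the constant term of the fit,
    # i.e. the smoothed value at the window centre.
    coeffs = np.linalg.pinv(A)[0]
    return np.array([coeffs @ x[i - half:i + half + 1]
                     for i in range(half, len(x) - half)])
```

For window 5 and order 3 these centre coefficients reduce to the classic (−3, 12, 17, 12, −3)/35 kernel, so linear trends pass through unchanged while high-frequency noise is attenuated.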

**Figure 9.**An illustration of the previously obtained trend component both with and without the application of LOESS.

**Figure 10.** An illustration of the Synthetic Minority Oversampling Technique (SMOTE) algorithm in the case of two classes, depicted by blue squares (minority class) and red circles (majority class). The blue square on the far left is isolated from other members of its class and is surrounded by members of the other class; this is considered to be a noise point. The cluster in the center contains several blue squares surrounded by members of the other class and is thus indicative of potentially unsafe points that are unlikely to be random noise. Finally, the cluster on the far right contains predominantly blue squares. The algorithm then generates new synthetic samples, prioritizing the safer regions.
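The interpolation step at the heart of SMOTE can be sketched in NumPy. This simplified version omits the safe-region weighting described in the caption and samples minority points uniformly; a full implementation (e.g., the reference algorithm of Chawla et al.) differs in those details.

```python
import numpy as np

def smote(X_min, n_new, k=3, rng=None):
    """Generate n_new synthetic minority samples: pick a minority
    point, one of its k nearest minority neighbours, and interpolate
    at a random position on the segment between them."""
    rng = np.random.default_rng(rng)
    # Pairwise distances among minority samples only.
    d = np.linalg.norm(X_min[:, None] - X_min[None, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]  # k nearest minority neighbours
    samples = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = nn[i, rng.integers(k)]
        lam = rng.random()  # position along the segment
        samples.append(X_min[i] + lam * (X_min[j] - X_min[i]))
    return np.vstack(samples)
```

Because every synthetic point lies on a segment between two real minority points, the oversampled class fills out its own region of feature space instead of merely duplicating observations.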

**Figure 12.**Assessing the number of important features through the use of the Recursive Feature Elimination and Cross-Validation (RFECV) algorithm. In this particular scenario, the optimal number of features was pruned down from a total of 77 to a mere 24.

**Figure 13.**The permutation importance of each of the features chosen as part of our fitted Random Forest classifier.

**Figure 15.**The output of performing the UMAP algorithm on the 20-dimensional UCID dataset. Each point in this figure represents a single sample (or day) within our dataset mapped onto a 2-dimensional surface.

**Figure 16.**The output obtained from performing the HDBSCAN algorithm on the 2-dimensional UCID dataset previously seen in Figure 15.

**Figure 17.**The output obtained by performing the k-means algorithm on the 2-dimensional UCID dataset previously seen in Figure 15.

**Figure 18.** Visualization of the generated clusters. (**a**) Average power consumption per hour of the day for each of the resulting clusters obtained after utilizing the HDBSCAN algorithm on our 2-dimensional representation of the UCID dataset. (**b**) Distribution of the clusters over the different months of the year. (**c**) Distribution of the clusters over the different days of the week. (**d**) Spread of the number of samples per cluster label.

**Figure 19.** Confusion matrices for the REFIT dataset and the UCID dataset. (**a**) Confusion matrix—UCID. (**b**) Confusion matrix—REFIT.

**Figure 20.** Showcasing the ability of our method to make one-step-ahead predictions on the UCID dataset. (**a**) UCID—Raw data. (**b**) UCID—Trend.

**Figure 21.** Showcasing the ability of our method to make one-step-ahead predictions on the REFIT dataset. (**a**) REFIT—Raw data. (**b**) REFIT—Trend.

**Table 1.** List of meteorological parameters that were taken into consideration, as outlined in Section 3.3.

| Parameter | Description |
|---|---|
| Air Temperature | The air temperature (2 m above ground level). Units are degrees Celsius. |
| Albedo | Average daytime surface reflectivity of visible light, expressed as a value between 0 and 1. 0 represents complete absorption; 1 represents complete reflection. |
| Azimuth | The angle between a line pointing due north and the sun's current position in the sky. Negative to the East; positive to the West; 0 at due North. Units are degrees. |
| Cloud Opacity | The measurement of how opaque the clouds are to solar radiation in the given location. Units are percentages. |
| Dewpoint | The air dewpoint temperature (2 m above ground level). Units are degrees Celsius. |
| Direct Normal Irradiance | Solar irradiance arriving in a direct line from the sun as measured on a surface held perpendicular to the sun. Units are W/m^{2}. |
| Direct (Beam) Horizontal Irradiance | The horizontal component of Direct Normal Irradiance. Units are W/m^{2}. |
| Global Horizontal Irradiance | The total irradiance received on a horizontal surface. It is the sum of the horizontal components of direct (beam) and diffuse irradiance. Units are W/m^{2}. |
| Global Tilted Irradiance–Fixed | The total irradiance received on a surface with a fixed tilt. The tilt is set to the latitude of the location. Units are W/m^{2}. |
| Global Tilted Irradiance–Horizontal Single-Axis Tracker | The total irradiance received on a sun-tracking surface. Units are W/m^{2}. |
| Precipitable Water | The total column precipitable water content. Units are kg/m^{2}. |
| Relative Humidity | The air relative humidity (2 m above ground level). Units are percentages. |
| SFC Pressure | The air pressure at ground level. Units are hPa. |
| Snow Depth | The snow depth liquid-water equivalent. Units are cm. |
| Wind Direction | The wind direction (10 m above ground level), following the meteorological convention: 0 represents northerly wind (from the north); 90 easterly; 180 southerly; 270 westerly. Units are degrees. |
| Wind Speed | The wind speed (10 m above ground level). Units are m/s. |
| Zenith | The angle between a line perpendicular to the earth's surface and the sun (90° = sunrise and sunset; 0° = sun directly overhead). Units are degrees. |

**Table 2.**List of temporal variables that were taken into consideration during the feature engineering process, as outlined in Section 4.3.

| Variable | Description |
|---|---|
| Day | An integer value between 1 and 31. |
| Weekday | An integer value between 0 and 6 denoting the different days of the week. |
| Month | An integer value between 1 and 12. |
| Year | An integer value between 2007 and 2010. |
| Hour | An integer value between 0 and 23. |
| Minute | An integer value between 0 and 45 in increments of 15. |
| Season | An integer value between 0 and 3, where 0 denotes Spring, 1 denotes Summer, 2 denotes Fall, and 3 denotes Winter. |
| Holiday | A categorical variable that takes on an integer value of 1 when the day concerned is a public holiday and 0 otherwise. |

**Table 3.**Results of training, optimizing, and evaluating a random forest classifier on the cluster labels obtained for the UCID and REFIT datasets.

| Data Set | No. of Clusters | Accuracy |
|---|---|---|
| UCID | 3 | 76% |
| REFIT—House 12 | 3 | 66% |

**Table 4.**Performance comparison of different methods used for the UCID and REFIT datasets. Note that these results were obtained for one-step-ahead predictions at a resolution of 15 min from the raw datasets.

| Data Set | Method | MAE (kW) | RMSE (kW) | MAPE |
|---|---|---|---|---|
| UCID | LSTM [9] | 0.62 | 0.86 | 51.45% |
| UCID | CNN-LSTM [9] | 0.34 | 0.61 | 34.84% |
| UCID | Proposed | 0.14 | 0.19 | 21.62% |
| REFIT | LSTM | N/A | N/A | N/A |
| REFIT | CNN-LSTM | N/A | N/A | N/A |
| REFIT | Proposed | 0.11 | 0.17 | 25.77% |
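The error metrics reported in the tables are standard and can be computed as follows. A minimal NumPy sketch (function names are ours); note that MAPE is undefined wherever the true value is zero, so near-zero loads must be handled before applying it.

```python
import numpy as np

def mae(y, yhat):
    """Mean absolute error, in the units of y (here kW)."""
    return np.mean(np.abs(y - yhat))

def rmse(y, yhat):
    """Root mean squared error; penalizes large errors more than MAE."""
    return np.sqrt(np.mean((y - yhat) ** 2))

def mape(y, yhat):
    """Mean absolute percentage error; requires y to be non-zero."""
    return 100.0 * np.mean(np.abs((y - yhat) / y))
```

Reporting MAE and RMSE in kW alongside the scale-free MAPE, as the tables do, makes the two datasets comparable despite their different consumption levels.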

**Table 5.**Performance metrics obtained when applying our method on the trend component of the UCID and the REFIT datasets to obtain one-step-ahead predictions.

| Data Set | MAE (kW) | RMSE (kW) | MAPE |
|---|---|---|---|
| UCID | 0.02 | 0.02 | 2.58% |
| REFIT | 0.02 | 0.02 | 4.32% |

**Table 6.**Performance metrics obtained when applying our method on both the raw data as well as the trend component of the UCID and REFIT datasets to obtain twelve-step-ahead predictions.

| Data Set | Method | MAE (kW) | RMSE (kW) | MAPE |
|---|---|---|---|---|
| UCID | Raw | 0.37 | 0.59 | 38.23% |
| UCID | Trend | 0.02 | 0.02 | 3.15% |
| REFIT | Raw | 0.17 | 0.31 | 39.75% |
| REFIT | Trend | 0.02 | 0.02 | 4.75% |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Al-Saudi, K.; Degeler, V.; Medema, M.
Energy Consumption Patterns and Load Forecasting with Profiled CNN-LSTM Networks. *Processes* **2021**, *9*, 1870.
https://doi.org/10.3390/pr9111870
