Comparison Analysis for Electricity Consumption Prediction of Multiple Campus Buildings Using Deep Recurrent Neural Networks
Abstract
1. Introduction
2. Deep RNN Algorithms for Peak Electricity Consumption Prediction
2.1. Features Considered in This Study
2.2. Peak Electricity Consumption Prediction Models Based on Deep RNN
3. Experiment Settings
3.1. Dataset
3.2. Measure Metrics
3.3. Training Details
4. Experiment Results
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
Input and output features considered in this study:

| Symbols | Categories | Variable | Units |
|---|---|---|---|
| Input | Weather | Temperature | |
| | | Humidity | % |
| | | Radiation | |
| | | Cloudiness | Number |
| | | Wind speed | |
| | Calendar | Seasonality | Number |
| | | Rest day | Number |
| | | Vacation | Number |
| | Trend | Electricity consumption | kW |
| Output | Prediction value | Peak electricity consumption | kW |
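For illustration, the sketch below shows one way these weather, calendar, and trend features could be arranged into supervised learning windows for the recurrent models. It assumes the measurements are held in a time-indexed pandas DataFrame; the column names, target name, and lookback length are illustrative assumptions, not taken from the paper.

```python
import numpy as np
import pandas as pd

def build_feature_windows(df: pd.DataFrame, lookback: int = 24):
    """Assemble sliding windows of the input features listed above.

    Assumes `df` is indexed by timestamp and already contains the weather,
    calendar, and trend columns; all names below are illustrative.
    """
    feature_cols = [
        "temperature", "humidity", "radiation", "cloudiness", "wind_speed",  # weather
        "seasonality", "rest_day", "vacation",                               # calendar
        "electricity_consumption",                                           # trend
    ]
    target_col = "peak_electricity_consumption"

    values = df[feature_cols].to_numpy(dtype=np.float32)
    targets = df[target_col].to_numpy(dtype=np.float32)

    X, y = [], []
    for t in range(lookback, len(df)):
        X.append(values[t - lookback:t])   # past `lookback` steps of all features
        y.append(targets[t])               # next-step peak consumption
    return np.stack(X), np.array(y)
```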
Electricity consumption statistics (kW) by building type:

| Building Type | Peak | Off-Peak | Average | Standard Deviation |
|---|---|---|---|---|
| Office | 451.0 | 95.0 | 202.1 | 49.9 |
| Nature science | 298.0 | 13.0 | 65.8 | 55.9 |
| General education | 565.0 | 66.0 | 178.4 | 77.3 |
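Statistics of this kind can be reproduced from raw meter readings with a short script. The sketch below assumes a long-format pandas DataFrame with one row per building and time step, and, as an assumption, treats the off-peak value as the minimum observed load; the authors' exact definitions may differ.

```python
import pandas as pd

def consumption_summary(readings: pd.DataFrame) -> pd.DataFrame:
    """Summarize per-building consumption (kW).

    `readings` is assumed to have columns "building" and "consumption_kw";
    both names are illustrative.
    """
    grouped = readings.groupby("building")["consumption_kw"]
    return pd.DataFrame({
        "peak": grouped.max(),               # highest observed load
        "off_peak": grouped.min(),           # assumed: lowest observed load
        "average": grouped.mean(),
        "standard_deviation": grouped.std(),
    }).round(1)
```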
Standard uncertainty level of each weather variable by season:

| Season | Temperature | Humidity | Radiation | Cloudiness | Wind Speed |
|---|---|---|---|---|---|
| Spring | 0.0066 | 0.0031 | 0.0891 | 0.0022 | 0.0041 |
| Summer | 0.0015 | 0.0022 | 0.0609 | 0.0009 | 0.0011 |
| Autumn | 0.0058 | 0.0024 | 0.0852 | 0.0008 | 0.0038 |
| Winter | 0.0054 | 0.0021 | 0.0611 | 0.0015 | 0.0017 |
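As a reference point, one common way to obtain a standard uncertainty level is a Type A evaluation, i.e., the sample standard deviation of the mean, u = s/√n, applied to (normalized) sensor readings within each season. The sketch below assumes that definition; the authors' exact procedure is not restated here and may differ.

```python
import numpy as np

def standard_uncertainty(samples) -> float:
    """Type A standard uncertainty of a set of readings (assumed definition).

    `samples` is a 1-D array of normalized readings for one weather
    variable within one season.
    """
    samples = np.asarray(samples, dtype=float)
    s = samples.std(ddof=1)             # sample standard deviation
    return s / np.sqrt(len(samples))    # u = s / sqrt(n)
```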
Prediction performance (MAE, RMSE, CV) of each model by building type:

| Model | Office MAE | Office RMSE | Office CV | Nature Science MAE | Nature Science RMSE | Nature Science CV | General Education MAE | General Education RMSE | General Education CV |
|---|---|---|---|---|---|---|---|---|---|
| XGBoost | 16.16 | 22.26 | 9.81 | 11.21 | 16.74 | 10.11 | 17.62 | 24.11 | 20.58 |
| LSTM | 14.63 | 20.68 | 8.48 | 10.42 | 15.24 | 9.94 | 15.50 | 22.73 | 19.82 |
| BiLSTM | 14.16 | 20.53 | 8.44 | 9.87 | 14.91 | 9.97 | 14.19 | 21.73 | 12.61 |
| CNN-LSTM | 12.85 | 18.47 | 8.48 | 6.64 | 9.80 | 10.75 | 17.06 | 24.17 | 13.81 |
| CNN-BiLSTM | 12.50 | 18.36 | 8.56 | 6.06 | 9.23 | 9.85 | 19.15 | 25.52 | 12.51 |
| CNN-LSTM-A | 12.36 | 17.90 | 8.11 | 6.17 | 9.43 | 10.41 | 15.03 | 24.21 | 14.77 |
| CNN-BiLSTM-A | 12.25 | 17.86 | 7.99 | 6.97 | 9.98 | 11.07 | 14.06 | 19.89 | 12.15 |
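The error measures reported above can be computed as follows. MAE and RMSE are standard; CV is assumed here to be the coefficient of variation of the RMSE (RMSE normalized by the mean observed value, expressed in percent), which is a common choice but not confirmed by this excerpt.

```python
import numpy as np

def mae(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.mean(np.abs(y_true - y_pred))

def rmse(y_true, y_pred):
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.sqrt(np.mean((y_true - y_pred) ** 2))

def cv_rmse(y_true, y_pred):
    # Assumed definition: RMSE normalized by the mean observed value, in percent.
    return 100.0 * rmse(y_true, y_pred) / np.mean(np.asarray(y_true, float))
```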
Prediction errors (MAE, RMSE) by loss function and number of training instances (4, 8, and 12, shown in parentheses):

| Loss Function | Model | MAE (4) | RMSE (4) | MAE (8) | RMSE (8) | MAE (12) | RMSE (12) |
|---|---|---|---|---|---|---|---|
| MAE | LSTM | 18.96 | 25.67 | 16.34 | 22.6 | 16.42 | 21.29 |
| MAE | BiLSTM | 18.91 | 25.72 | 15.29 | 21.82 | 15.49 | 21.73 |
| MAE | CNN-LSTM | 18.42 | 24.94 | 17.7 | 24.76 | 16.97 | 24.17 |
| MAE | CNN-BiLSTM | 20.23 | 27.60 | 17.83 | 23.95 | 18.58 | 25.52 |
| MAE | CNN-LSTM-A | 16.91 | 23.02 | 16.21 | 22.64 | 17.54 | 24.21 |
| MAE | CNN-BiLSTM-A | 18.09 | 24.23 | 15.66 | 22.06 | 16.86 | 23.89 |
| MSE | LSTM | 20.95 | 27.81 | 17.15 | 23.35 | 15.50 | 22.05 |
| MSE | BiLSTM | 20.24 | 26.84 | 15.57 | 22.13 | 15.69 | 22.35 |
| MSE | CNN-LSTM | 18.69 | 24.29 | 18.63 | 24.83 | 17.06 | 24.09 |
| MSE | CNN-BiLSTM | 20.39 | 26.54 | 18.1 | 24.38 | 19.15 | 25.34 |
| MSE | CNN-LSTM-A | 18.58 | 24.60 | 14.68 | 20.57 | 15.03 | 21.58 |
| MSE | CNN-BiLSTM-A | 19.08 | 24.87 | 15.49 | 22.12 | 14.26 | 20.49 |
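To make the loss-function comparison concrete, the sketch below builds a hypothetical CNN-BiLSTM model with a simple learned attention weighting over time steps and compiles it with either an MAE or an MSE loss, mirroring the two settings in the table above. The layer sizes, attention formulation, and optimizer are illustrative assumptions, not the authors' exact configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, models

def build_cnn_bilstm_attention(lookback: int, n_features: int, loss: str = "mae"):
    """Hypothetical CNN-BiLSTM model with a simple attention layer.

    `loss` can be "mae" or "mse", as compared in the table above.
    """
    inputs = layers.Input(shape=(lookback, n_features))
    x = layers.Conv1D(filters=32, kernel_size=3, padding="same", activation="relu")(inputs)
    x = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

    # Simple attention: score each time step, softmax over time, weighted sum.
    scores = layers.Dense(1, activation="tanh")(x)            # (batch, time, 1)
    weights = layers.Softmax(axis=1)(scores)                  # attention weights over time
    context = layers.Lambda(
        lambda t: tf.reduce_sum(t[0] * t[1], axis=1)          # (batch, hidden)
    )([x, weights])

    outputs = layers.Dense(1)(context)                        # predicted peak consumption
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam", loss=loss, metrics=["mae"])
    return model

# Example: same architecture trained under the two losses compared above.
# model_mae = build_cnn_bilstm_attention(lookback=24, n_features=9, loss="mae")
# model_mse = build_cnn_bilstm_attention(lookback=24, n_features=9, loss="mse")
```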
Citation: Lee, D.; Kim, J.; Kim, S.; Kim, K. Comparison Analysis for Electricity Consumption Prediction of Multiple Campus Buildings Using Deep Recurrent Neural Networks. Energies 2023, 16, 8038. https://doi.org/10.3390/en16248038