The importance of neural network (NN) modeling is evident from its performance benefits in a myriad of applications: unlike conventional techniques, NN modeling provides superior performance without relying on complex filtering or time-consuming parameter tuning specific to each application and its wide range of operating conditions. In this paper, we employ NN modeling with training-data generation based on sensitivity analysis to predict building energy consumption with improved performance and reliability. Unlike our previous work, in which insignificant input variables were successively screened out during training based on their mean impact values (MIVs), we use the receiver operating characteristic (ROC) plot to generate reliable data from either a conservative or a progressive point of view, which overcomes the data-insufficiency issue of the MIV method: by properly setting boundaries for the input variables based on the ROC plot and their statistics, instead of screening them out entirely as in the MIV-based method, we can generate new training data that maximize the true positive and true negative counts obtained from the partial data set. A NN model is then constructed and trained on the generated data using Levenberg–Marquardt back-propagation (LM-BP) to predict electricity consumption for commercial buildings. Experiments comparing the proposed data generation methods with the MIV method show that data generation using the successive and cross patterns provides satisfactory performance, following energy consumption trends with good phase agreement. Of the two data generation options, i.e., the successive option and the combination of two data sets, the successive option yields a root mean square error (RMSE) roughly 400 to 900 kWh (i.e., 30% to 75%) lower than the combination option.
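The ROC-based boundary selection summarized above can be sketched as follows. This is a minimal illustration, not the paper's exact procedure: the function names, the Youden-style criterion for the conservative view, the FPR cap for the progressive view, and the synthetic labels are all assumptions introduced for the example.

```python
import numpy as np

def roc_boundary(x, y, n_thresholds=100):
    """Sweep candidate boundaries over one input variable and return its ROC curve.
    x: 1-D array of input-variable values; y: binary labels (1 = significant sample)."""
    thresholds = np.linspace(x.min(), x.max(), n_thresholds)
    pos, neg = (y == 1).sum(), (y == 0).sum()
    tpr, fpr = [], []
    for t in thresholds:
        pred = x >= t                              # classify by the candidate boundary
        tpr.append((pred & (y == 1)).sum() / pos)  # true positive rate at this boundary
        fpr.append((pred & (y == 0)).sum() / neg)  # false positive rate at this boundary
    return thresholds, np.array(tpr), np.array(fpr)

def pick_boundary(thresholds, tpr, fpr, conservative=True, fpr_cap=0.5):
    """Conservative view: maximize TPR - FPR (a Youden-style criterion).
    Progressive view: maximize TPR subject to an FPR cap (assumed value)."""
    if conservative:
        return thresholds[np.argmax(tpr - fpr)]
    ok = fpr <= fpr_cap
    return thresholds[np.where(ok)[0][np.argmax(tpr[ok])]]

# Illustrative synthetic data: one input variable, labels split at x = 5.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 10.0, 1000)
y = (x > 5.0).astype(int)
th, tpr, fpr = roc_boundary(x, y)
boundary = pick_boundary(th, tpr, fpr, conservative=True)
```

With the boundary in hand, the variable is kept but bounded (rather than screened out as in the MIV method), and new training samples can be drawn from the retained region of the partial data set.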
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.