Article

A Novel Nonlinear Combined Forecasting System for Short-Term Load Forecasting

School of Statistics, Dongbei University of Finance and Economics, Dalian 116025, China
* Author to whom correspondence should be addressed.
Energies 2018, 11(4), 712; https://doi.org/10.3390/en11040712
Submission received: 27 February 2018 / Revised: 16 March 2018 / Accepted: 19 March 2018 / Published: 22 March 2018
(This article belongs to the Section F: Electrical Engineering)

Abstract

Short-term load forecasting plays an indispensable role in electric power systems; owing to complex nonlinear characteristics, it is not only an extremely challenging task but also an issue of concern for society as a whole. However, most previous combined forecasting models were based on optimizing weight coefficients to develop a linear combined forecasting model, ignoring that a linear combination considers only the contribution of linear terms to improving the model’s performance; neglecting the potentially significant nonlinear terms can lead to poor forecasting results. In this paper, a novel nonlinear combined forecasting system, which consists of three modules (an improved data pre-processing module, a forecasting module and an evaluation module), is developed for short-term load forecasting. Different from the simple data pre-processing of most previous studies, the improved data pre-processing module based on longitudinal data selection is developed in this system, which further improves the effectiveness of data pre-processing and thereby enhances the final forecasting performance. Furthermore, a modified support vector machine is developed to integrate all the individual predictors and obtain the final prediction, which successfully overcomes the aforementioned drawbacks of the linear combined model. Moreover, the evaluation module is incorporated to perform a scientific evaluation of the developed system. The half-hourly electrical load data from New South Wales are employed to verify the effectiveness of the developed forecasting system, and the results reveal that the developed nonlinear forecasting system can be employed in the dispatching and planning of smart grids.

1. Introduction

Electrical load forecasting plays a pivotal role in electrical systems [1]. High-precision forecasting models can significantly improve power system management and provide effective information for economic operators [2]. If the forecasting error were to decrease by 1%, the operating costs would decrease by 10 million pounds [3]. Conversely, inaccurate forecasting results can cause huge losses for electric power companies: overestimated forecasts lead to extra production costs, while underestimated forecasts lead to difficulties in supplying sufficient electricity, which could in turn result in large power system losses [4]. Many severe blackout events have occurred that have deeply affected social production and people’s lives. For example, on 14 August 2003, the U.S.–Canada power grid suffered a serious blackout. This accident affected approximately 50 million people and generated losses amounting to billions of dollars [5,6]. Furthermore, Hunan Province, China in 2008, Europe in 2006 and India in 2012 were affected by blackout events [7]. Clearly, if an effective forecasting model were in place to provide an early warning prior to such events, timely measures could be taken to prevent their occurrence. However, electrical systems are affected by various factors, such as national policies, population growth and the social environment [8]. Therefore, the development of an accurate, simple and robust forecasting model is meaningful for load forecasting.
In recent years, several methods have been developed to decrease electrical load prediction error. These methods can be broadly categorized into two groups, namely traditional and intelligent forecasting methods [9]. Traditional forecasting methods are widely used in load forecasting because they are simple to apply. These methods include regression models [10,11], the grey forecasting model (GM) [12], autoregressive moving average (ARMA) model [13], the autoregressive integrated moving average (ARIMA) model [14], the Kalman filtering (KF) method [15], etc. However, traditional forecasting methods cannot achieve sufficient accuracy for nonlinear load series [9].
Intelligent forecasting methods have been applied to improve model performance on nonlinear time series [16,17,18]. Many intelligent methods have been applied to load time series because they can effectively model complicated processes [19]. Moreover, intelligent methods are regarded as powerful tools for load forecasting problems owing to their accurate and robust forecasting levels [20]. With advances in intelligent algorithms, several intelligent prediction methods have been employed in power load prediction, such as fuzzy logic [21], artificial neural networks (ANN) [22,23] and support vector machines (SVM) [24,25].
None of these single forecasting models can achieve high precision on all occasions, because each model exhibits its own advantages and disadvantages [26]. To eliminate the weaknesses inherent in single models, many combined forecasting models have been proposed that are able to achieve desirable forecasting performance, and they are regarded as a promising research direction for obtaining effective performance [27,28]. More specifically, combined forecasting methods, first noted by Bates and Granger [29], are developed to improve forecasting performance by combining the advantages of each model. In recent years, different types of individual models have been integrated for load forecasting to decrease forecasting error. For example, Wang et al. [30] applied adaptive particle swarm optimization (PSO) to obtain the weight coefficients of a combined model based on the seasonal ARIMA, seasonal exponential smoothing and the weighted SVM in power load prediction. Similarly, Xiao et al. [31] developed a combined model that integrated several neural networks, incorporating the back propagation neural network (BPNN), radial basis function (RBF), generalized regression neural network (GRNN) and genetic-algorithm-optimized back propagation neural network (GABPNN), into load forecasting. Zhao et al. [32] proposed a novel combined model based on a high-order Markov chain to predict power consumption. Xiao et al. [33] developed a combined forecasting model for load forecasting, which employed an optimization method to obtain the weights of each individual model. From the above literature, it can be concluded that combined models exhibit preferable predictive performance compared to single models. Generally, the above-mentioned combined forecasting models integrate individual models by means of linear combinations, denoted as linear combined models, which cannot always achieve promising forecasting results.
To the best of our knowledge, most previous studies proposed linear combined models to forecast electrical power load, which can enhance the forecasting effectiveness to some extent. However, there are still defects in the linear combined model, which can be summarized as follows: (1) the linear combined model only takes the linear terms into account with fixed weights, ignoring the significance of potential nonlinear terms, which may cause a decline in forecasting accuracy; (2) the linear combined model can lead to poor forecasting results when there is a strong nonlinear relationship between the individual predictors and the final results.
With the above-mentioned analysis considered, a nonlinear combined method can be adopted to obtain better performance than linear combined models. Since the mid-1990s, the SVM model has been widely used in many fields, such as vessel traffic flow forecasting [34], air quality early warning [35] and electrical load forecasting [6]. In the field of electrical load forecasting in particular, Hong et al. [36,37,38] developed a series of SVM-based models that integrate advanced optimization algorithms and obtain better forecasting performance than other compared models. Inspired by these outstanding studies, the authors find that the SVM is superior for nonlinear time series forecasting and is powerful and simple to implement in applications. Therefore, a modified SVM model is developed, which employs an advanced optimization algorithm to determine the parameters to further improve the forecasting performance. More specifically, the modified SVM model is employed as a nonlinear combined method to combine the forecasters.
Therefore, with the limitations and strengths discussed above, a novel nonlinear combined forecasting system, based on an improved data pre-processing module, a forecasting module and an evaluation module, is developed in this study. More specifically, the data pre-processing module, improved by longitudinal data selection, is incorporated in the developed combined forecasting system to extract and identify the main features of the electrical power load data, which further enhances the effectiveness of data pre-processing and thereby the final forecasting performance. The forecasting module, which includes the individual forecasting models (the BPNN, the firefly-algorithm-optimized back propagation neural network (FABPNN), the Elman neural network (ENN) and the wavelet neural network (WNN)) and the combined model construction, performs multi-step forecasting for electrical power load with effective forecasting performance, providing basic information for the scientific operation of the electrical power system. In particular, the modified support vector machine is developed to integrate all the individual predictors and obtain the final prediction, which successfully overcomes the aforementioned drawbacks of the linear combined model. Furthermore, the comprehensive evaluation module is an integral part of a complete forecasting system, which verifies the forecasting effectiveness from the perspectives of typical evaluation metrics and statistical testing. In summary, the developed nonlinear combined model takes full advantage of each component and ultimately achieves success in electrical load forecasting.
The major contributions of this paper are as follows:
(1)
In this study, we develop a new nonlinear combined forecasting system that integrates the merits of individual forecasting models to achieve higher forecasting accuracy and stability. More specifically, the improved data pre-processing module based on longitudinal data selection is proposed, which further enhances the effectiveness of data pre-processing and thereby improves the final forecasting performance. Moreover, the modified support vector machine is developed to integrate all the individual predictors and obtain the final prediction, which successfully overcomes the aforementioned drawbacks of the linear combined model.
(2)
The proposed combined forecasting system aims to achieve effective performance in multi-step electrical load forecasting. Multi-step forecasting can effectively capture the future dynamic behavior of electrical loads, which is more beneficial to power systems than one-step forecasting. Thus, this study builds a combined forecasting system to achieve accurate results for multi-step electrical load forecasting, which provides a better basis for power system administration, load dispatch and energy transfer scheduling.
(3)
The superiority of the proposed nonlinear combined forecasting system is validated in a real electrical power market. The novel nonlinear combined forecasting system displays its superiority compared with the individual forecasting models, and its prediction validity also demonstrates its superiority in electrical load forecasting compared with linear combined models and the benchmark model (ARIMA). Therefore, the newly developed forecasting system can be widely used in engineering applications.
(4)
A more comprehensive evaluation is performed to further verify the forecasting system’s effectiveness and significance. The results of the Diebold–Mariano (DM) test and forecasting effectiveness reveal that the developed nonlinear combined forecasting system achieves a higher degree of prediction accuracy than the other comparison models and that it differs significantly from traditional forecasting models in terms of the level of prediction accuracy.
(5)
An insightful discussion is provided in this paper to further verify the forecasting effectiveness of the proposed system. Four discussions are performed, covering the significance of the proposed forecasting system, the comparison with linear combined models, the superiority of the optimization algorithm and the developed forecasting system’s stability, which bridge the knowledge gap in the relevant studies and provide more valuable analysis and information for electrical load forecasting.
The remainder of this paper is structured as follows. Section 2 illustrates the framework of the developed combined system. In Section 3, complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN), individual forecasting models (the BPNN, FABPNN, ENN and WNN), the method of constructing the combined model and certain forecasting evaluation criteria are provided. Section 4 describes the results of three experiments. Further discussion is described in Section 5, and finally, a conclusion is provided in Section 6.

2. Framework of Proposed Nonlinear Combined Forecasting System

A new nonlinear combined forecasting system is developed that exhibits greater effectiveness in electrical load forecasting. It addresses the drawbacks of the individual models, which cannot always be optimal in any given case; in addition, it considers the contribution of nonlinear terms of individual forecasting models to improving the final forecasting performance compared with the linear combined model. The basic framework of the developed combined forecasting system is outlined as follows.
Considering that uncertainty and randomness exist in raw electrical load series, the data pre-processing module improved by longitudinal data selection is employed during the first stage of electrical load forecasting to extract the primary features of the raw electrical load series.
Four ANNs, namely the BPNN, FABPNN, ENN and WNN, which are regarded as the individual forecasting models, are constructed to predict the filtered load time series. Then the combined forecast is constructed based on the modified SVM, which is used to combine the forecasting results obtained by the BPNN, FABPNN, ENN and WNN.
The prediction performance of the developed forecasting system is evaluated by employing typical accurate metrics, the Diebold–Mariano (DM) test and forecasting effectiveness.
Multi-step forecasting is applied in order to further test the forecasting abilities of the proposed combined forecasting system. Multi-step forecasting is an extrapolation process that produces forecast values from historical data and previously forecast values. The multi-step forecasting process is as follows (a brief code sketch is given after the list):
(a)
1-step forecasting: on the basis of the historical data $\{v(1), v(2), v(3), \ldots, v(M)\}$, the predicted value $\hat{v}(M+1)$ is acquired, where $M$ is the number of sampled points in the data sequence.
(b)
2-step forecasting: on the basis of the historical data $\{v(2), v(3), v(4), \ldots, v(M)\}$ and the previously predicted value $\hat{v}(M+1)$, the predicted value $\hat{v}(M+2)$ is acquired.
(c)
3-step forecasting: on the basis of the historical data $\{v(3), v(4), \ldots, v(M)\}$ and the previously predicted values $\{\hat{v}(M+1), \hat{v}(M+2)\}$, the predicted value $\hat{v}(M+3)$ is acquired.
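To make the rolling procedure above concrete, the following minimal Python sketch illustrates how previously predicted values are fed back into the input window; the `one_step_model` object is a hypothetical placeholder for any trained one-step predictor (e.g., one of the individual ANNs), not code from the paper.

```python
import numpy as np

def multi_step_forecast(history, one_step_model, steps=3, window=48):
    """Recursive multi-step forecasting: each prediction is appended to the
    series and reused as an input for the next step."""
    series = list(history)                               # historical data v(1), ..., v(M)
    forecasts = []
    for _ in range(steps):
        x = np.asarray(series[-window:]).reshape(1, -1)  # latest window as model input
        v_hat = float(one_step_model.predict(x))         # v^(M+1), then v^(M+2), ...
        forecasts.append(v_hat)
        series.append(v_hat)                             # feed the prediction back
    return forecasts
```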

3. Proposed Combined Forecasting System

To obtain high-precision forecasting results, the developed nonlinear combined forecasting system includes three modules: the data pre-processing, forecasting and evaluation modules. Figure 1 depicts the flowchart of the proposed forecasting system. The main procedure of the developed forecasting system in electrical load forecasting is as follows: first, the improved data pre-processing module based on longitudinal data selection is developed to eliminate the negative effects of noise, which is a promising technique to extract and identify the main features of electrical power load data, as shown in Figure 1 part I; second, the input–output structure for modeling is shown in Figure 1 part II; third, four individual forecasters are used to forecast future changes in the electrical load data, as shown in Figure 1 part III; then, as shown in Figure 1 part IV, the modified SVM model is developed as a nonlinear combined method to combine all forecasters and obtain the final forecasting result; finally, as shown in Figure 1 part V, the comprehensive evaluation module is used to verify the forecasting effectiveness from the perspectives of typical evaluation metrics and statistical testing. The details of each module are presented as follows.

3.1. Module 1: Improved Data Pre-Processing Module

Huang et al. [39] developed the empirical mode decomposition (EMD) technique, which can be employed to decompose original sequences into intrinsic mode functions (IMFs). EMD can analyze complex data, such as non-stationary data, and many studies [40,41,42] have successfully used the EMD method in electrical load forecasting. However, it exhibits the defect of mode mixing, and the ensemble empirical mode decomposition (EEMD) method [43] was therefore developed to solve this defect. However, EEMD introduces two additional difficulties: residual noise exists in the reconstructed signal, and the number of IMFs is likely to differ between realizations of the same decomposition. To address the mode mixing problem while also resolving these additional difficulties, CEEMDAN was developed [44], and its main steps are illustrated in Figure 2 part A. Compared to EEMD, the main distinction of CEEMDAN is the introduction of adaptive noise. Therefore, this work employs the CEEMDAN algorithm for data pre-processing. Several studies [45,46] have confirmed the successful application of CEEMDAN in the component filtering field. However, most previous studies only conducted simple data pre-processing while ignoring the significance of data selection, which leaves much to be desired. Therefore, the improved data pre-processing module based on longitudinal data selection is proposed, which further enhances the effectiveness of data pre-processing and thereby improves the final forecasting performance.
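As a rough illustration of this module (not the authors’ exact implementation), the open-source PyEMD package, assumed to be available here, provides a CEEMDAN implementation; a filtered series can be obtained by decomposing the load series into IMFs and subtracting the highest-frequency component, a common noise-removal convention.

```python
import numpy as np
from PyEMD import CEEMDAN  # pip install EMD-signal (assumed dependency)

def ceemdan_filter(load_series):
    """Decompose a load series with CEEMDAN and subtract the highest-frequency IMF,
    which is commonly treated as noise, to obtain a filtered series."""
    s = np.asarray(load_series, dtype=float)
    imfs = CEEMDAN()(s)        # rows: IMFs ordered from high to low frequency
    return s - imfs[0]         # filtered series passed to the forecasting module
```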

3.2. Module 2: Forecasting Module

3.2.1. Individual Forecasting Models

A variety of individual models can be used to obtain effective load forecasting performance. In this study, four widely used ANNs, namely the BPNN, FABPNN, ENN and WNN, are selected for the electrical load forecasting. The topological structure of the individual neural networks is depicted in Figure 2 part B.
Definition 1.
BPNN. There are two significant parameters in a BPNN: the weights and the thresholds. Suppose $w_{ab}$ denotes the weight connecting hidden node $b$ and output node $a$, $u_{bt}$ represents the weight connecting input node $t$ and hidden node $b$, and $\hat{\theta}_b$ and $\theta_a$ denote the threshold values of hidden node $b$ and output node $a$, respectively. Then the output $H_b$ of hidden node $b$ can be calculated as:
$$H_b = F\left(\sum_{t=1}^{T} u_{bt} I_t + \hat{\theta}_b\right) \tag{1}$$
where $I_t$ is the input of input node $t$, $T$ denotes the number of input nodes, and $F$ is the sigmoid activation function:
$$F(t) = \frac{1}{1 + e^{-t}} \tag{2}$$
Then the output layer calculates the sum through:
$$O_a = \sum_{b=1}^{B} w_{ab} H_b + \theta_a \tag{3}$$
where $B$ represents the number of hidden nodes.
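As a minimal numerical sketch of Equations (1)–(3) (illustrative random weights only, not the trained network used in the experiments), the hidden and output computations can be written as:

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))          # Equation (2)

def bpnn_forward(I, u, theta_hidden, w, theta_out):
    """One forward pass of a BPNN.
    I: inputs (T,); u: input-to-hidden weights (B, T); w: hidden-to-output weights (A, B)."""
    H = sigmoid(u @ I + theta_hidden)         # Equation (1): hidden outputs H_b
    O = w @ H + theta_out                     # Equation (3): linear output layer
    return O

# Tiny illustrative call with random weights (T = 4 inputs, B = 3 hidden, 1 output).
rng = np.random.default_rng(0)
print(bpnn_forward(rng.random(4), rng.random((3, 4)), rng.random(3),
                   rng.random((1, 3)), rng.random(1)))
```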
Definition 2.
FABPNN. The FABPNN is a neural network model that mainly consists of two parts: firefly algorithm (FA) optimization and the BPNN. More specifically, the thresholds and weight values of the BPNN are optimized by the FA method. A detailed description of the FA algorithm can be found in [36].
Definition 3.
ENN. The ENN model possesses four layers: the input layer, the hidden layer, the context layer, and the output layer. Suppose the inputs at time $t$ are $I_i^t$ ($i = 1, 2, \ldots, r$), and the context-layer neurons and net inputs at time $t$ are $c_v^t$ ($v = 1, 2, \ldots, n$) and $net_v^t$ ($v = 1, 2, \ldots, n$), respectively.
The hidden-layer neurons $H_v^t$ ($v = 1, 2, \ldots, n$) at time $t$ can be expressed as:
$$H_v^t(l) = F\big(net_v^t(l)\big) = F\left(\sum_{i=1}^{r} w_{iv} I_i^t(l) + \sum_{v=1}^{n} s_v c_v^t(l)\right) \tag{4}$$
where $F$ is the hidden-layer activation function given by Equation (2), $w_{iv}$ represents the weight connecting input-layer node $i$ and hidden-layer node $v$, and $s_v$ is the weight between hidden-layer node $v$ and the context layer.
The output layer $O^{t+1}$ is represented as follows:
$$O^{t+1}(l) = f\left(\sum_{v=1}^{n} k_v H_v^t(l)\right) \tag{5}$$
where $k_v$ is the weight between hidden-layer node $v$ and the output layer, and $f$ is an identity map used as the activation function.
Definition 4.
WNN. Suppose the input and output data of the WNN are $x_i$ ($i = 1, 2, \ldots, g$) and $y_j$ ($j = 1, 2, \ldots, l$), respectively. The output of hidden node $d$ is represented by:
$$H(d) = H_d\left(\frac{\sum_{i=1}^{g} w_{id} x_i - q_d}{a_d}\right) \tag{6}$$
where $H_d$ is the wavelet basis function, $w_{id}$ is the weight connecting the input layer and the hidden layer, $q_d$ is the translation factor of the wavelet basis function, and $a_d$ represents the scaling factor of the wavelet basis function. Then the output is given as:
$$o(j) = \sum_{i=1}^{p} w_{ij} H(i), \quad j = 1, 2, \ldots, l \tag{7}$$
where $p$ represents the number of hidden nodes, and $w_{ij}$ denotes the weight connecting the hidden layer and the output layer.

3.2.2. Combined Forecasting Model

The combination model is regarded as a promising method for acquiring prediction validity in load forecasting and can incorporate the advantages of the individual models.

The Theory of Combined Forecasting

Combined forecasting methods can be categorized as linear and nonlinear combined forecasting. The traditional linear combined method is represented as follows:
$$f = \sum_{t=1}^{d} w_t f_t \tag{8}$$
where $w_t$ denotes the weight coefficient of the $t$-th prediction method, $f_t$ is the forecasting result of the $t$-th prediction method, $d$ is the number of single models, and $f$ is the prediction result of the combined method. However, the linear combined model offers limited applicability, because it captures only a fixed linear influence of each model, which can result in poor forecasting accuracy. The nonlinear combined method can successfully solve this problem of the linear combined model, as well as reducing uncertainty and taking full advantage of the information from each forecasting method. Meanwhile, it avoids computing the weights of the linear combined method. The nonlinear combined model is represented as follows:
$$f = \varphi(f_1, f_2, \ldots, f_d) \tag{9}$$
where $f_t$ denotes the prediction result of the $t$-th forecasting method, $\varphi(\cdot)$ is the nonlinear combined forecasting function, and $f$ is the forecast value of the nonlinear combined model.
Considering the strong nonlinear function mapping abilities of neural networks, an electrical load combination prediction method can be implemented with a neural network. However, owing to its limited generalization ability, a neural network easily becomes trapped in a local optimum and cannot make full use of the information in small samples [47]. Compared to traditional neural networks, the SVM seeks structural risk minimization, and its convex optimization formulation guarantees that a global optimal solution can be obtained. Therefore, the modified SVM is developed as a nonlinear combined method to combine all the forecasters.

Support Vector Machine (SVM)

The SVM, developed by Vapnik [48,49], has been extensively applied to nonlinear regression estimation and is widely employed in the forecasting field [35]. Moreover, the SVM performs well in electricity load forecasting [50,51,52]. According to Vapnik’s theory, the main SVM equations are expressed as follows.
Suppose $p_i$ denotes the input vector, $p_i \in \mathbb{R}^n$, and $z_i$ denotes the output, $z_i \in \mathbb{R}$. The estimation function is represented as:
$$f(p) = \langle w, \tau(p) \rangle + q \tag{10}$$
where $\tau(p)$ is a nonlinear mapping that maps the input space into a high-dimensional feature space, $w$ denotes the weight vector, and $q$ represents a scalar bias. The values of $w$ and $q$ can be obtained by solving a quadratic programming problem:
$$\min_{w, q, \eta, \eta^*} \; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{N} (\eta_i + \eta_i^*) \qquad \text{s.t.} \quad \begin{cases} \left|z_i - \langle w, \tau(p_i)\rangle - q\right| \le \varepsilon + \eta_i \\ \eta_i, \eta_i^* \ge 0, \quad i = 1, 2, \ldots, N \end{cases} \tag{11}$$
where $C$ is the error penalty coefficient, $N$ is the number of samples in the training sequence, $\eta_i$ and $\eta_i^*$ are the slack variables, and $\varepsilon$ represents the admissible error. Nonlinear regression cases can be converted into linear regression cases using a kernel function $k(p_i, p_j)$. The nonlinear mapping can be obtained by:
$$f(p) = \sum_{i=1}^{N} (\sigma_i - \sigma_i^*)\, k(p, p_i) + q \tag{12}$$
where $\sigma_i$ and $\sigma_i^*$ are the Lagrange multipliers. The RBF kernel function is used in this paper and can be written as follows:
$$k(p_i, p_j) = \exp\left(-\gamma \|p_i - p_j\|^2\right) \tag{13}$$
where $\gamma$ denotes the kernel parameter, and $p_i$ and $p_j$ are two vectors in the input space. Two important parameters, $(\gamma, C)$, influence the prediction validity; in this paper, the MFO algorithm is employed to determine them.
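For orientation only (a sketch, not the authors’ implementation), the ε-SVR of Equations (10)–(13) corresponds to scikit-learn’s SVR with an RBF kernel, where the arguments C and gamma play the roles of $C$ and $\gamma$; in the paper these two values are determined by the MFO algorithm rather than by the simple grid search used below.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import GridSearchCV

# Toy data standing in for the (p_i, z_i) pairs; in the paper these come from load series.
rng = np.random.default_rng(1)
P = rng.random((200, 4))
z = P.sum(axis=1) + 0.05 * rng.standard_normal(200)

# RBF-kernel epsilon-SVR; C and gamma are the two parameters tuned (here by grid search).
search = GridSearchCV(SVR(kernel="rbf", epsilon=0.01),
                      {"C": [1, 10, 100], "gamma": [0.01, 0.1, 1.0]}, cv=3)
search.fit(P, z)
print(search.best_params_)
```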

Moth-Flame Optimization (MFO)

Mirjalili [53] provided a new metaheuristic algorithm known as MFO, a nature-inspired optimization method, by modeling the natural behavior of moths in a mathematical form. A specific presentation of this optimization method can be found in [53,54], and the main procedures are presented as follows.
Step 1 Parameter determination.
The parameters of the MFO method mainly include the quantity of moths, flames, and variables, the maximum number of iterations, and the lower and upper bounds of variables.
Step 2 Position initialization.
Equations (14) and (15) express the positions of the moths and flames, respectively:
$$U = \begin{bmatrix} u_{1,1} & u_{1,2} & \cdots & u_{1,d} \\ u_{2,1} & u_{2,2} & \cdots & u_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ u_{k,1} & u_{k,2} & \cdots & u_{k,d} \end{bmatrix} \tag{14}$$
$$V = \begin{bmatrix} V_{1,1} & V_{1,2} & \cdots & V_{1,d} \\ V_{2,1} & V_{2,2} & \cdots & V_{2,d} \\ \vdots & \vdots & \ddots & \vdots \\ V_{k,1} & V_{k,2} & \cdots & V_{k,d} \end{bmatrix} \tag{15}$$
where $k$ represents the number of moths, and $d$ is the number of variables.
Equation (16) calculates the initialization of U and V:
$$u_{\cdot h} \ \text{or} \ V_{\cdot h} = (Ub_h - Lb_h) \times rand(\,) + Lb_h \tag{16}$$
where $u_{\cdot h}$ and $V_{\cdot h}$ denote the values of $U$ and $V$ in the $h$-th dimension, respectively; the upper and lower bounds of the variables are represented by $Ub$ and $Lb$, respectively; and $rand(\,)$ is a random number in [0, 1].
Step 3 Selection of fitness values.
For the flames, there is a matrix OV, as shown in Equation (17), which can be used to obtain the corresponding fitness values:
$$OV = \begin{bmatrix} OV_1 \\ OV_2 \\ \vdots \\ OV_k \end{bmatrix} \tag{17}$$
where $k$ is the number of moths.
Step 4 Iteration optimization.
A logarithmic spiral is selected as the update formula for the MFO method, as follows:
$$U_c = S(U_c, V_h) = G_c \cdot e^{br} \cos(2\pi r) + V_h \tag{18}$$
$G_c$ can be determined by:
$$G_c = |V_h - U_c| \tag{19}$$
where $G_c$ represents the distance between the $h$-th flame and the $c$-th moth, $S$ denotes the spiral function, $b$ is a constant that determines the shape of the logarithmic spiral, and $r$ is a random value in the range [−1, 1]. Moreover, the parameter $r$ indicates how close the next position is to the flame; $r = 1$ corresponds to the farthest position, while $r = -1$ corresponds to the closest.
However, the position-updating method defined by Equation (18) can cause the MFO algorithm to converge quickly to a local optimum. To avoid converging to local optima, each moth may only update its position with respect to one of the flames, following Equation (18). The flames are sorted according to their fitness values at each iteration and are updated accordingly; the moths then update their positions with respect to their corresponding flames. However, having the moths update their positions with respect to approximately k different flame positions impairs the exploitation of the best solutions. An adaptive mechanism is therefore proposed to address this problem:
$$flame\ no = \mathrm{round}\left(W - s \times \frac{W - 1}{iter_{\max}}\right) \tag{20}$$
where $W$ denotes the maximum number of flames, $s$ is the current iteration number, and $iter_{\max}$ is the maximum number of iterations.
Step 5 Optimal flames selection.
If a flame is not superior to the best flame of the previous iteration, the flame positions are updated and the best flame is re-selected. When the termination criterion is reached, the best flame is returned as the most suitable approximation of the optimum. The pseudo-code of MFO is described in Algorithm 1:
Algorithm 1 Moth-Flame Optimization Algorithm.
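Since the pseudo-code figure itself is not reproduced here, the following compact Python sketch (an illustrative re-implementation of Steps 1–5 based on the description above and on [53], not the authors’ code) shows the main MFO loop: spiral updates around fitness-sorted flames with an adaptively shrinking flame count.

```python
import numpy as np

def mfo(fitness, dim, lb, ub, n_moths=30, max_iter=200, b=1.0):
    """Minimal Moth-Flame Optimization loop (minimization)."""
    rng = np.random.default_rng(0)
    U = lb + (ub - lb) * rng.random((n_moths, dim))       # Step 2: initialize moths, Eq. (16)
    best_pos, best_fit = None, np.inf
    for s in range(1, max_iter + 1):
        OU = np.apply_along_axis(fitness, 1, U)           # Step 3: fitness of the moths
        order = np.argsort(OU)
        V, OV = U[order].copy(), OU[order].copy()         # flames = moths sorted by fitness
        if OV[0] < best_fit:
            best_fit, best_pos = OV[0], V[0].copy()
        flame_no = int(round(n_moths - s * (n_moths - 1) / max_iter))  # Eq. (20)
        for c in range(n_moths):                          # Step 4: spiral position update
            h = min(c, flame_no - 1)                      # surplus moths share the last flame
            G = np.abs(V[h] - U[c])                       # Eq. (19): distance to the flame
            r = rng.uniform(-1.0, 1.0, dim)
            U[c] = G * np.exp(b * r) * np.cos(2 * np.pi * r) + V[h]   # Eq. (18)
            U[c] = np.clip(U[c], lb, ub)
    return best_pos, best_fit                             # Step 5: best flame found

# Example: minimize the 2-D sphere function.
pos, fit = mfo(lambda x: float(np.sum(x ** 2)), dim=2, lb=-10.0, ub=10.0)
```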

Construction of the Final Forecasting Result

The construction of the final forecasting result is an important step in the combined forecasting model. To overcome the above-mentioned drawbacks of the linear combined model, the modified support vector machine based on the MFO algorithm is developed in this paper and employed as a nonlinear combined method to search for the best function with which to combine the individual predictors. More specifically, the obtained function aggregates the forecasting results of each predictor from the previous steps into a final forecast of the original electric power load data. In other words, the forecasting results obtained from each individual predictor are input into the modified SVM model to predict future electrical load data, which can achieve desirable forecasting performance in engineering applications.
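To illustrate this combination step, a minimal sketch follows, under the assumption that the forecasts of the individual models for the training period are already arranged as columns of a matrix; the (C, gamma) values shown are placeholders for the MFO-determined parameters, and the function name is hypothetical.

```python
import numpy as np
from sklearn.svm import SVR

def fit_nonlinear_combiner(individual_forecasts, actual_load, C=10.0, gamma=0.1):
    """individual_forecasts: array (n_samples, n_models) with the BPNN/FABPNN/ENN/WNN
    forecasts for the training period; actual_load: observed load values.
    Returns an SVR combiner playing the role of phi(f1, ..., fd) in Equation (9)."""
    combiner = SVR(kernel="rbf", C=C, gamma=gamma)   # (C, gamma) would come from MFO
    combiner.fit(individual_forecasts, actual_load)
    return combiner

# Usage: final forecasts are obtained from the individual models' new outputs.
# final = combiner.predict(new_individual_forecasts)
```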

3.3. Module 3: Evaluation

It is vital to evaluate the forecasting system’s performance by employing appropriate metrics. To evaluate the forecasting accuracy, several evaluation criteria are applied in this study, including the average error (AE), mean absolute error (MAE), mean square error (MSE), mean absolute percentage error (MAPE) and $\zeta_{INDEX}$. Furthermore, to further verify the model’s effectiveness and significance, the Diebold–Mariano (DM) test and forecasting effectiveness are applied in this work.

3.3.1. Forecasting System Evaluation Criteria

Several evaluation criteria are applied in this study, as shown in Table 1, where $O_i$ is the observed value, $\hat{F}_i$ is the predicted value and $T$ represents the number of predicted values. In addition, the metric $\zeta_{INDEX}$ is employed to compare the forecasting effectiveness of the proposed combined model with that of the other models.
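For completeness, a short sketch of the four main error metrics using their standard definitions (Table 1 itself is not reproduced here) is given below.

```python
import numpy as np

def error_metrics(observed, predicted):
    """AE, MAE, MSE and MAPE over T forecast points."""
    O, F = np.asarray(observed, float), np.asarray(predicted, float)
    e = O - F
    return {"AE": e.mean(),                         # average error (bias)
            "MAE": np.abs(e).mean(),                # mean absolute error
            "MSE": (e ** 2).mean(),                 # mean square error
            "MAPE": 100.0 * np.abs(e / O).mean()}   # mean absolute percentage error, %
```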

3.3.2. DM Test

The DM test [55] is applied to compare the predictive accuracy of the proposed forecasting system with that of other models. The details are as follows:
Assume that the real values are $\{y_t;\ t = 1, \ldots, n+m\}$, and that the predictions of the two compared models are $\{\hat{y}_t^{(a)};\ t = 1, \ldots, n+m\}$ and $\{\hat{y}_t^{(b)};\ t = 1, \ldots, n+m\}$, respectively. Then, the prediction errors of these two models are:
$$e_{n+l}^{(a)} = y_{n+l} - \hat{y}_{n+l}^{(a)}, \quad l = 1, 2, \ldots, m \tag{21}$$
$$e_{n+l}^{(b)} = y_{n+l} - \hat{y}_{n+l}^{(b)}, \quad l = 1, 2, \ldots, m \tag{22}$$
The loss function $F(e_{n+l}^{(j)})$, $j = a, b$, is used for evaluating the forecasting accuracy of the compared models. Two popular loss functions are the following:
Square error loss:
$$F(e_{n+l}^{(j)}) = \left(e_{n+l}^{(j)}\right)^2 \tag{23}$$
Absolute deviation loss:
$$F(e_{n+l}^{(j)}) = \left|e_{n+l}^{(j)}\right| \tag{24}$$
The DM test statistic is defined as:
$$DM = \frac{\frac{1}{m}\sum_{l=1}^{m}\left(F(e_{n+l}^{(a)}) - F(e_{n+l}^{(b)})\right)}{\sqrt{S^2/m}} \tag{25}$$
where $S^2$ is an estimator of the variance of $d_l = F(e_{n+l}^{(a)}) - F(e_{n+l}^{(b)})$. The hypotheses are:
$$H_0: E(d_l) = 0 \tag{26}$$
$$H_1: E(d_l) \neq 0 \tag{27}$$
The DM test statistic asymptotically follows a standard normal distribution. When the value of DM satisfies the criterion represented by Equation (28), the null hypothesis is rejected:
$$|DM| > z_{\alpha/2} \tag{28}$$
where $z_{\alpha/2}$ is the critical value from the standard normal table and $\alpha$ is the significance level.
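A compact sketch of the DM statistic of Equation (25) under the square error loss is given below (a basic form without small-sample or autocorrelation corrections; whether the paper uses such corrections is not stated).

```python
import numpy as np
from scipy.stats import norm

def dm_test(actual, pred_a, pred_b, alpha=0.05):
    """Diebold-Mariano test comparing two forecast series under square error loss."""
    actual, pred_a, pred_b = map(np.asarray, (actual, pred_a, pred_b))
    d = (actual - pred_a) ** 2 - (actual - pred_b) ** 2   # loss differential d_l
    m = d.size
    dm = d.mean() / np.sqrt(d.var(ddof=1) / m)            # Equation (25)
    reject_h0 = abs(dm) > norm.ppf(1 - alpha / 2)         # Equation (28)
    return dm, reject_h0
```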

3.3.3. Forecasting Effectiveness

This work also introduces forecasting effectiveness to measure the prediction accuracy of the proposed forecasting system. Further details regarding forecasting effectiveness can be found in [56].
Definition 5.
Assume that the actual values are $\{B_m;\ m = 1, \ldots, M\}$, the forecast values are $\{\hat{B}_m;\ m = 1, \ldots, M\}$, and the forecast errors are $e_m = B_m - \hat{B}_m$. Then, the forecasting accuracy at time $m$ is calculated by:
$$A_m = \begin{cases} 1 - \left|\dfrac{e_m}{B_m}\right|, & 0 \le \left|\dfrac{e_m}{B_m}\right| \le 1 \\[2mm] 0, & \left|\dfrac{e_m}{B_m}\right| > 1 \end{cases} \tag{29}$$
Definition 6.
The kth-order forecasting effectiveness unit can be calculated as:
$$n^k = \sum_{m=1}^{M} Q_m A_m^k \tag{30}$$
where $k$ is a positive integer, $Q_m$ denotes the discrete probability distribution at time $m$, with $\sum_{m=1}^{M} Q_m = 1$ and $Q_m > 0$. Moreover, when no prior information on the discrete probability distribution is available, we set $Q_m = 1/M$.
Definition 7.
The kth-order forecasting effectiveness can be represented as:
$$E(n^1, n^2, \ldots, n^k) \tag{31}$$
where $E$ is a continuous function of the $k$ units. In particular, when $E(x) = x$, the first-order forecasting effectiveness is $E(n^1) = n^1$; when $E(x, y) = x\left(1 - \sqrt{y - x^2}\right)$, the second-order forecasting effectiveness is $E(n^1, n^2) = n^1\left(1 - \sqrt{n^2 - (n^1)^2}\right)$.
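A short sketch of the first- and second-order forecasting effectiveness under equal weights $Q_m = 1/M$ follows (the square-root form of the second-order measure is assumed here; see [56] for the full definition).

```python
import numpy as np

def forecasting_effectiveness(actual, forecast):
    """First- and second-order forecasting effectiveness with Q_m = 1/M."""
    actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
    rel_err = np.abs((actual - forecast) / actual)
    A = np.where(rel_err <= 1.0, 1.0 - rel_err, 0.0)   # accuracy A_m, Equation (29)
    n1 = A.mean()                                      # first-order unit n^1, Equation (30)
    n2 = (A ** 2).mean()                               # second-order unit n^2
    first_order = n1                                   # E(n^1) = n^1
    second_order = n1 * (1.0 - np.sqrt(max(n2 - n1 ** 2, 0.0)))
    return first_order, second_order
```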

4. Experimental Study

All experiments are conducted in MATLAB R2015a on Windows 10 with a 2.60 GHz Intel Core i7-6700HQ CPU, 64-bit and 8 GB RAM. The experimental parameters are displayed in Table 2.

4.1. Data Selection

In this study, half-hourly power load data from New South Wales in February and June from 2009 to 2011 are selected to assess the validity of the proposed combined forecasting system, as indicated in Table 3. To enhance the forecasting performance, longitudinal data selection is adopted to pre-process the raw data. Each series of raw load data is divided into seven subsets (from Monday to Sunday), based on the designated day of the week, to ensure that the inherent characteristics within each subset are the same. Figure 3 demonstrates the longitudinal data selection process. For instance, there are 12 Mondays (48 data points per day) in February from 2009 to 2011, which are selected as one subset. For each subset, we select the last day as the testing set and the remaining days as the training set. The training and testing data structures of the proposed combined forecasting system are illustrated in Figure 1 part II.
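As an illustration of the longitudinal selection described above (assuming the half-hourly data carry timestamps; the column name and index layout are hypothetical), the series can be split into seven weekday subsets, each with its last day held out for testing.

```python
import pandas as pd

def longitudinal_split(df):
    """df: half-hourly load with a DatetimeIndex and a 'load' column (assumed layout).
    Returns {weekday: (train_frame, test_frame)}, where the last day of each weekday
    subset is the test set and the remaining days form the training set."""
    subsets = {}
    for weekday, group in df.groupby(df.index.dayofweek):   # 0 = Monday, ..., 6 = Sunday
        days = group.index.normalize()
        test_day = days.unique().sort_values()[-1]           # last occurrence of this weekday
        subsets[weekday] = (group[days != test_day], group[days == test_day])
    return subsets
```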

4.2. Experiment Setup

To test the performance of the developed nonlinear combined forecasting system, three experiments are conducted in this study. The electrical load data collected from New South Wales in February from 2009 to 2011 are used in Experiment I. Meanwhile, because different electrical load datasets exhibit different characteristics, the datasets of June are also used as another case study, called Experiment II, to further test the forecasting superiority of the proposed combined forecasting system. In Experiments I and II, the developed combined forecasting system is compared with the individual forecasting models, namely, the BPNN, FABPNN, ENN and WNN. If the proposed forecasting system performs better than the individual models in different months, we can safely conclude that it has better forecasting performance and universal applicability for datasets with different characteristics; in other words, the proposed combined forecasting system based on improved data pre-processing and the modified SVM can achieve better forecasting performance than the comparison models in different environments. Moreover, the benchmark ARIMA model is employed to evaluate and compare the developed combined forecasting system in Experiment III, based on the electrical load data of February and June.

4.2.1. Experiment I: The Case of February

In this experiment, to test the forecasting performance of the novel nonlinear forecasting system with improved data pre-processing and the modified SVM, the 30-min electrical load data from Monday to Sunday in February are employed. More specifically, four performance metrics, the improvements of the combined model over the forecasting results of the other models, and statistical MAPE values are used to evaluate the forecasting accuracy and stability of the proposed forecasting system. The multi-step prediction results of the developed combined forecasting system and the single models are displayed in Table 4, Table 5 and Table 6, and Figure 4 and Figure 5.
(a)
Table 4 shows the prediction capability of the combined forecasting system and the four single models in 1-step to 3-step forecasting. Taking Friday’s forecasting results as an example: in 1-step forecasting, the proposed nonlinear combined forecasting system achieves superior results compared to the other models for different forecasting horizons. The combined forecasting system exhibits the minimum forecasting errors, with AE, MAE, MSE and MAPE values of −3.7850, 41.5131, 2979.40 and 0.4739%, respectively. The forecasting ability of the BPNN is ranked second, while the WNN exhibits the worst forecasting performance. In 2-step forecasting, the combined forecasting system achieves the most accurate prediction performance, with a MAPE value of 0.8163%, while the BPNN is the second most accurate model and the WNN the worst. For 3-step forecasting, the combined forecasting system still achieves the best performance. For the same model, 1-step forecasting exhibits better forecasting accuracy than multi-step prediction.
(b)
Table 5 presents the detailed results of the multi-step improvements of the developed combined forecasting system over the other prediction models. Taking Friday’s results as an example: in the 1-step predictions, the combined forecasting system decreases the MAE values by 57.4973%, 58.9966%, 61.2416% and 71.2267%, the MSE values by 82.4779%, 83.4804%, 84.4607% and 91.0522%, and the MAPE values by 57.6094%, 59.3157%, 61.7676% and 72.2005%, relative to the BPNN, FABPNN, ENN and WNN, respectively. In the 2-step and 3-step predictions, the combined forecasting system still decreases the MAE, MSE and MAPE values in comparison with the other models.
(c)
Table 6 shows the statistical values of MAPE (%). The developed combined forecasting system obtains lower minimum and maximum MAPE values among the four individual models for 1-step to 3-step prediction. Furthermore, the developed forecasting system achieves minimum standard deviation (Std.) values for MAPE compared to the individual models.
(d)
Figure 4 displays the average values of AE, MAE, MSE and MAPE for 1-step, 2-step and 3-step forecasting. To analyze the detailed forecasting results, the 1-step forecasting results for Monday are depicted in Figure 5. It can be observed that the prediction of the proposed nonlinear combined forecasting system is more precise than that of the single models. Moreover, the forecast values of the developed forecasting system are closer to the real data.
Remark. 
By comparing the forecasting error metrics for multi-step prediction, it is found that the proposed forecasting system is superior in almost every aspect. Meanwhile, the developed combined forecasting system achieves the lowest minimum and maximum MAPE values, implying that it is more accurate than the single models. The proposed combined forecasting system is also the steadiest, because it achieves the minimum Std. values for MAPE. Therefore, the proposed forecasting system has higher forecasting accuracy and stability than the other models. Most importantly, the improved data pre-processing algorithm and the modified SVM act as effective techniques to improve the forecasting performance of the proposed system, which can effectively predict the electrical load data.

4.2.2. Experiment II: The Case of June

To evaluate the forecasting performance of the developed combined forecasting system for different load datasets, the 30-min electrical data from Monday to Sunday in June are employed in Experiment II. The dataset structure is the same as that of February, and the results are displayed in Table 7, Table 8 and Table 9, and Figure 6 and Figure 7.
(a)
Table 7 shows the final evaluations of the results for the 1-step to 3-step forecasting. For 1-step forecasting, the proposed combined forecasting system outperforms the BPNN, FABPNN, ENN and WNN models, according to the comparison of AE, MAE, MSE and MAPE from Monday to Sunday. For example, the MAPE values of the combined forecasting system are 0.5270%, 0.5390%, 0.4489%, 0.5306%, 0.4429%, 0.5052% and 0.5332% from Monday to Sunday, respectively. For 2-step forecasting, the developed combined forecasting system achieves the most accurate prediction effect, with MAPE values of 0.8562%, 0.7543%, 0.7336%, 0.8335%, 0.6223%, 0.6898% and 0.7702% from Monday to Sunday, respectively. For 3-step forecasting, the developed nonlinear combined forecasting system is still the most accurate.
(b)
Table 8 illustrates the detailed multi-step improvements between the developed combined forecasting system and other prediction models. Taking the results of Sunday as an example, in the 1-step predictions, the combined forecasting system decreases the MAPE values by 49.1983%, 47.7906%, 59.3493% and 65.0099%, based on BPNN, FABPNN, ENN and WNN, respectively. In the 2-step and 3-step predictions, the proposed combined forecasting system also decreases the MAPE values.
(c)
Table 9 displays the statistics of the MAPE values. For the minimum, maximum, mean and Std. of the MAPE values, the developed combined forecasting system obtains lower values in all aspects than the BPNN, FABPNN, ENN and WNN.
(d)
Figure 6 summarizes the results of the average values of four forecasting error indexes for 1-step, 2-step, and 3-step forecasting in June. Furthermore, Figure 7 illustrates the detailed forecasting results of 1-step for Sunday. It is found that the combined forecasting system achieves a more precise prediction performance than the other four models.
Remark. 
Based on the above experiment, the proposed forecasting system exhibits superior performance in all forecasting error indexes. Furthermore, the developed system achieves the smallest maximum and minimum MAPE values, which means that it displays superior capability among the investigated models. The developed forecasting system also achieves forecasting stability, because its MAPE Std. values are the smallest. Therefore, the proposed combined forecasting system achieves forecasting accuracy and stability simultaneously. Moreover, we find that the combined forecasting system performs effectively in different months, from which we can safely conclude that the improved data pre-processing and the modified SVM contribute greatly to enhancing the forecasting effectiveness of the proposed system. In summary, the proposed forecasting system has better forecasting performance and universal applicability, and it can be widely applied to load forecasting as well as other fields.

4.2.3. Experiment III: Comparison with Benchmark Model

In this section, the ARIMA model is selected as a benchmark model for comparison with the developed combined forecasting system. The half-hourly power load data from February and June are applied to compare the prediction performances of the developed combined forecasting system and the ARIMA model. The average values of AE, MAE, MSE and MAPE for February and June are displayed in Table 10, which reveals that the combined forecasting system achieves lower MAPE values than the ARIMA model. To express the prediction capability clearly, the comparison of the prediction results of the combined forecasting system and the ARIMA model for Wednesday in June is shown in Figure 8. From the results of Table 10 and Figure 8, we can conclude that the proposed nonlinear combined forecasting system achieves better forecasting performance than the ARIMA model.

4.3. Summary

Based on experiments I–III, we conclude that:
(a)
For 1-step to 3-step forecasting, the developed combined forecasting system achieves smaller values for all forecasting error metrics than the single models. In addition, the developed combined forecasting system also obtains the lowest MAPE Std. results. Overall, through improved data preprocessing method and modified SVM, the developed system is superior to the four single models in terms of both validity and stability.
(b)
The developed combined forecasting system achieves lower MAPE results than the benchmark ARIMA model in 1-step to 3-step forecasting. Therefore, we can conclude that the developed combined forecasting system outperforms the ARIMA model in electrical load forecasting.
(c)
Compared with the individual prediction models, the predictive ability of the developed combined forecasting system exhibits significant improvements. According to relevant literature [3], if the electrical load forecasting error were to decrease by 1%, the operating costs would decrease by 10 million pounds. Consequently, considerable economic benefit could be generated.

5. Discussion

An insightful discussion based on the above case studies is conducted in this section, which provides a detailed and comprehensive analysis of the experimental results.

5.1. Discussion of the Significance of the Developed Forecasting System with Testing Method

The DM test and forecasting effectiveness are used as two testing methods to demonstrate the capability of the developed forecasting system.
(a)
Table 11 presents the DM statistics, where the square error loss function values are applied, and demonstrates that the combined forecasting system differs from BPNN, FABPNN, ENN, WNN and ARIMA at the 1% significance level in multi-step forecasting.
(b)
Table 11 also presents the first-order and second-order forecasting effectiveness of the developed forecasting system and the compared models. From Table 11, it can be determined that the proposed forecasting system obtains the largest forecasting effectiveness values among the compared models in multi-step forecasting.
Remark. 
From the results of the DM test and forecasting effectiveness, we can conclude that the developed nonlinear combined forecasting system exhibits superior forecasting performance to the other models and that its prediction accuracy differs significantly from that of the others.

5.2. Discussion of Comparison with Linear Combined Models

To further validate the effectiveness of the developed nonlinear combined forecasting system, two linear combined methods, namely the average value method and the entropy weight method, are compared with the proposed nonlinear combined forecasting system. More specifically, in the average value method the weight of each individual model is equal to 1/M (M is the number of individual models), whereas the entropy weight method calculates the weight of each single model by objectively evaluating the amount of information provided by each individual model.
To provide more detailed comparison information, the forecasting results of two randomly selected days (Saturday in February and Friday in June) are presented in Table 12. It is clearly revealed that the developed nonlinear combined forecasting system achieves lower MAPE values than the linear combined models in multi-step forecasting, indicating that the nonlinear combined forecasting system can provide more accurate forecasting results in engineering applications.
Remark. 
As demonstrated by the performances of the developed combined forecasting system and linear combined models, the developed combined forecasting system exhibits superior performance, indicating that improved data preprocessing and modified SVM can greatly enhance the performance of the nonlinear combined forecasting system in electrical load forecasting.

5.3. Discussion of the Superiority of the Optimization Algorithm

To test the superiority of the optimization algorithm used in the developed forecasting system, a comparison with other typical optimization algorithms, i.e., the FA, the ant lion optimizer (ALO) and the dragonfly algorithm (DA), is performed in this section. As shown in Figure 9, four typical test functions, including two unimodal test functions (Function 1 and Function 2) and two multimodal test functions (Function 3 and Function 4), are tested. Figure 9 presents the fitness value curves of the MFO and the compared algorithms, which are employed to validate the effectiveness of the MFO algorithm in terms of convergence performance. In brief, it can be observed that the convergence performance of the MFO is better than that of the other compared algorithms.
Remark. 
According to the comparison of the MFO algorithm and other compared algorithms, the MFO algorithm shows superior performance, indicating that the MFO algorithm can provide very promising and competitive optimization results, which further illustrates that the MFO algorithm can contribute greatly to the excellent performance of the developed forecasting system.

5.4. Further Validation for the Stability of the Developed Combined Forecasting System

Although the statistical MAPE values presented in Table 6 and Table 9 evaluate the stability of the developed forecasting system well, further validation by another method is needed to prove its effectiveness and applicability. Therefore, based on the performance variance used in [57], the standard deviation of the forecasting errors is employed to verify the stability of the developed combined forecasting system. As demonstrated in Table 13, the standard deviation values of the proposed forecasting system are smaller than those of the other compared models, indicating that the developed forecasting system is more stable than the comparison models.
Remark. 
The results for the standard deviation of the forecasting errors also support the conclusion obtained from Table 6 and Table 9, i.e., the developed combined forecasting system is superior to all considered models in terms of stability. In summary, we can conclude that the developed forecasting system performs better than all considered models in terms of both accuracy and stability.

6. Conclusions

Electrical load forecasting plays a significant role in power systems. More accurate forecasting results are vital for economic operation and provide more valid information for decision makers. Therefore, accurate forecasting of electrical loads appears particularly important for reducing costs and risks. However, it is difficult to achieve desirable performance using single methods. A combined method can incorporate the advantages of individual models; however, the application of linear combinations is limited because the possibility of nonlinear terms is ignored. Therefore, in this study, a novel nonlinear combined forecasting system, which consists of three modules (an improved data pre-processing module, a forecasting module and an evaluation module), is developed for electric load forecasting; it successfully overcomes the drawbacks of linear combined forecasting models. Different from the simple data pre-processing of most previous studies, the improved data pre-processing module based on longitudinal data selection is developed in this system, which further improves the effectiveness of data pre-processing and thereby enhances the final forecasting performance. Moreover, the modified support vector machine is developed to integrate all the individual predictors and obtain the final prediction, which successfully overcomes the aforementioned drawbacks of the linear combined model. Furthermore, the evaluation module is incorporated to perform a scientific evaluation of the developed system.
According to the experimental results and analyses, the developed forecasting system exhibits a more precise prediction capability than the four single models. For example, in 1-step prediction, the average MAPE values of the developed forecasting system, BPNN, FABPNN, ENN and WNN are 0.5281%, 0.9411%, 0.9581%, 1.1011% and 1.5047%, respectively; in 2-step prediction, the average MAPE values are 0.7650%, 1.2889%, 1.3036%, 1.4834% and 2.0018%, respectively; and in 3-step forecasting, the average MAPE values of the five models are 1.1631%, 1.6619%, 1.6800%, 1.8781% and 2.6247%, respectively. Furthermore, the proposed forecasting system obtains the lowest MAPE Std. results, indicating that it can maintain stability in electrical load forecasting. The results of the DM test and forecasting effectiveness confirm that the developed nonlinear combined forecasting system outperforms the single models and the ARIMA benchmark model. Furthermore, the proposed combined forecasting system exhibits effective prediction ability compared to the linear combined models. In summary, the developed combined forecasting system, which has excellent properties, is a promising model for power load forecasting as well as other fields.

Acknowledgments

This research was supported by the Major Program of National Social Science Foundation of China (Grant No. 14ZDB130).

Author Contributions

Chengshi Tian proposed the concept of this research and provided overall guidance, Yan Hao wrote the whole manuscript, and carried on data analysis.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Quan, H.; Srinivasan, D.; Khosravi, A. Uncertainty handling using neural network-based prediction intervals for electrical load forecasting. Energy 2014, 73, 916–925. [Google Scholar] [CrossRef]
  2. Shu, F.; Luonan, C. Short-term load forecasting based on an adaptive hybrid method. IEEE Trans. Power Syst. 2006, 21, 392–401. [Google Scholar] [CrossRef]
  3. Lee, C.-W.; Lin, B.-Y. Application of Hybrid Quantum Tabu Search with Support Vector Regression (SVR) for Load Forecasting. Energies 2016, 9, 873. [Google Scholar] [CrossRef]
  4. Zjavka, L.; Snášel, V. Short-term power load forecasting with ordinary differential equation substitutions of polynomial networks. Electr. Power Syst. Res. 2016, 137, 113–123. [Google Scholar] [CrossRef]
  5. Du, P.; Wang, J.; Yang, W.; Niu, T. Multi-step ahead forecasting in electrical power system using a hybrid forecasting system. Renew. Energy 2018, 122, 533–550. [Google Scholar] [CrossRef]
  6. Yang, W.; Wang, J.; Wang, R. Research and application of a novel hybrid model based on data selection and artificial intelligence algorithm for short term load forecasting. Entropy 2017, 19, 52. [Google Scholar] [CrossRef]
  7. The 12 Biggest Blackouts in History. Available online: http://www.msn.com/en-za/news/offbeat/the-12-biggest-blackouts-in-history/ar-CCeNdC#page=1 (accessed on 13 March 2018).
  8. Wang, Y.; Wang, J.; Zhao, G.; Dong, Y. Application of residual modification approach in seasonal ARIMA for electricity demand forecasting: A case study of China. Energy Policy 2012, 48, 284–294. [Google Scholar] [CrossRef]
  9. Huang, M.-L. Hybridization of Chaotic Quantum Particle Swarm Optimization with SVR in Electric Demand Forecasting. Energies 2016, 9, 426. [Google Scholar] [CrossRef]
  10. Dudek, G. Pattern-based local linear regression models for short-term load forecasting. Electr. Power Syst. Res. 2016, 130, 139–147. [Google Scholar] [CrossRef]
  11. Guo, Y.; Nazarian, E.; Ko, J.; Rajurkar, K. Hourly cooling load forecasting using time-indexed ARX models with two-stage weighted least squares regression. Energy Convers. Manag. 2014, 80, 46–53. [Google Scholar] [CrossRef]
  12. Wang, X. Grey prediction with rolling mechanism for electricity demand forecasting of Shanghai. In Proceedings of the 2007 IEEE International Conference on Grey Systems and Intelligent Services, GSIS 2007, Nanjing, China, 18–20 November 2007; pp. 689–692. [Google Scholar]
  13. Dong, Y.; Wang, J.; Wang, C.; Guo, Z. Research and Application of Hybrid Forecasting Model Based on an Optimal Feature Selection System—A Case Study on Electrical Load Forecasting. Energies 2017, 10, 490. [Google Scholar] [CrossRef]
  14. Lee, C.M.; Ko, C.N. Short-term load forecasting using lifting scheme and ARIMA models. Expert Syst. Appl. 2011, 38, 5902–5911. [Google Scholar] [CrossRef]
  15. Zhang, M.; Bao, H.; Yan, L.; Cao, J.; Du, J. Research on processing of short-term historical data of daily load based on Kalman filter. Power Syst Technol. 2003, 9, 39–42. [Google Scholar]
  16. Lin, W.-M.; Gow, H.-J.; Tsai, M.-T. An enhanced radial basis function network for short-term electricity price forecasting. Appl. Energy 2010, 87, 3226–3234. [Google Scholar] [CrossRef]
  17. Jain, R.K.; Smith, K.M.; Culligan, P.J.; Taylor, J.E. Forecasting energy consumption of multi-family residential buildings using support vector regression: Investigating the impact of temporal and spatial monitoring granularity on performance accuracy. Appl. Energy 2014, 123, 168–178. [Google Scholar] [CrossRef]
  18. García-Martos, C.; Rodríguez, J.; Sánchez, M.J. Modelling and forecasting fossil fuels, CO2 and electricity prices and their volatilities. Appl. Energy 2013, 101, 363–375. [Google Scholar] [CrossRef]
  19. Metaxiotis, K.; Kagiannas, A.; Askounis, D.; Psarras, J. Artificial intelligence in short term electric load forecasting: A state-of-the-art survey for the researcher. Energy Convers. Manag. 2003, 44, 1525–1534. [Google Scholar] [CrossRef]
  20. Li, P.; Li, Y.; Xiong, Q.; Chai, Y.; Zhang, Y. Application of a hybrid quantized Elman neural network in short-term load forecasting. Int. J. Electr. Power Energy Syst. 2014, 55, 749–759. [Google Scholar] [CrossRef]
  21. Liao, G.-C.; Tsao, T.-P. Application of fuzzy neural networks and artificial intelligence for load forecasting. Electr. Power Syst. Res. 2004, 70, 237–244. [Google Scholar] [CrossRef]
  22. Dong, Y.; Ma, X.; Ma, C.; Wang, J. Research and application of a hybrid forecasting model based on data decomposition for electrical load forecasting. Energies 2016, 9, 50. [Google Scholar] [CrossRef]
  23. Wang, J.; Yang, W.; Du, P.; Li, Y. Research and application of a hybrid forecasting framework based on multi-objective optimization for electrical power system. Energy 2018, 148, 59–78. [Google Scholar] [CrossRef]
  24. Li, H.; Guo, S.; Zhao, H.; Su, C.; Wang, B. Annual electric load forecasting by a least squares support vector machine with a fruit fly optimization algorithm. Energies 2012, 5, 4430–4445. [Google Scholar] [CrossRef]
  25. Peng, L.L.; Fan, G.F.; Huang, M.L.; Hong, W.C. Hybridizing DEMD and quantum PSO with SVR in electric load forecasting. Energies 2016, 9, 221. [Google Scholar] [CrossRef]
  26. Wang, J.; Yang, W.; Du, P.; Niu, T. A novel hybrid forecasting system of wind speed based on a newly developed multi-objective sine cosine algorithm. Energy Convers. Manag. 2018, 163, 134–150. [Google Scholar] [CrossRef]
  27. Wang, J.; Zhu, W.; Zhang, W.; Sun, D. A trend fixed on firstly and seasonal adjustment model combined with the ε-SVR for short-term forecasting of electricity demand. Energy Policy 2009, 37, 4901–4909. [Google Scholar] [CrossRef]
  28. Osório, G.J.; Matias, J.C.O.; Catalão, J.P.S. Short-term wind power forecasting using adaptive neuro-fuzzy inference system combined with evolutionary particle swarm optimization, wavelet transform and mutual information. Renew. Energy 2015, 75, 301–307. [Google Scholar] [CrossRef]
  29. Bates, J.M.; Granger, C.W.J. The Combination of Forecasts. Oper. Res. Soc. 1969, 20, 451–468. [Google Scholar] [CrossRef]
  30. Wang, J.; Zhu, S.; Zhang, W.; Lu, H. Combined modeling for electric load forecasting with adaptive particle swarm optimization. Energy 2010, 35, 1671–1678. [Google Scholar] [CrossRef]
  31. Xiao, L.; Wang, J.; Hou, R.; Wu, J. A combined model based on data pre-analysis and weight coefficients optimization for electrical load forecasting. Energy 2015, 82, 524–549. [Google Scholar] [CrossRef]
  32. Zhao, W.; Wang, J.; Lu, H. Combining forecasts of electricity consumption in China with time-varying weights updated by a high-order Markov chain model. Omega 2014, 45, 80–91. [Google Scholar] [CrossRef]
  33. Xiao, L.; Shao, W.; Liang, T.; Wang, C. A combined model based on multiple seasonal patterns and modified firefly algorithm for electrical load forecasting. Appl. Energy 2016, 167, 135–153. [Google Scholar] [CrossRef]
  34. Li, M.W.; Han, D.F.; Wang, W.L. Vessel traffic flow forecasting by RSVR with chaotic cloud simulated annealing genetic algorithm and KPCA. Neurocomputing 2015, 157, 243–255. [Google Scholar] [CrossRef]
  35. Xu, Y.; Yang, W.; Wang, J. Air quality early-warning system for cities in China. Atmos. Environ. 2017, 148, 239–257. [Google Scholar] [CrossRef]
  36. Hong, W.C.; Dong, Y.; Zhang, W.Y.; Chen, L.Y.; Panigrahi, B.K. Cyclic electric load forecasting by seasonal SVR with chaotic genetic algorithm. Int. J. Electr. Power Energy Syst. 2013, 44, 604–614. [Google Scholar] [CrossRef]
  37. Chen, Y.; Hong, W.-C.; Shen, W.; Huang, N. Electric Load Forecasting Based on a Least Squares Support Vector Machine with Fuzzy Time Series and Global Harmony Search Algorithm. Energies 2016, 9, 70. [Google Scholar] [CrossRef]
  38. Hong, W.C.; Dong, Y.; Lai, C.Y.; Chen, L.Y.; Wei, S.Y. SVR with hybrid chaotic immune algorithm for seasonal load demand forecasting. Energies 2011, 4, 960–977. [Google Scholar] [CrossRef]
  39. Huang, N.; Shen, Z.; Long, S.; Wu, M.; Shih, H.; Zheng, Q.; Yen, N.; Tung, C.; Liu, H. The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis. Proc. R. Soc. A Math. Phys. Eng. Sci. 1998, 454, 903–995. [Google Scholar] [CrossRef]
  40. Fan, G.F.; Peng, L.L.; Zhao, X.; Hong, W.C. Applications of hybrid EMD with PSO and GA for an SVR-based load forecasting model. Energies 2017, 10, 1713. [Google Scholar] [CrossRef]
  41. Fan, G.F.; Peng, L.L.; Hong, W.C.; Sun, F. Electric load forecasting by the SVR model with differential empirical mode decomposition and auto regression. Neurocomputing 2016, 173, 958–970. [Google Scholar] [CrossRef]
  42. Fan, G.-F.; Qing, S.; Wang, H.; Hong, W.-C.; Li, H.-J. Support vector regression model based on empirical mode decomposition and auto regression for electric load forecasting. Energies 2013, 6, 1887–1901. [Google Scholar] [CrossRef]
  43. Wu, Z.; Huang, N.E. Ensemble Empirical Mode Decomposition. Adv. Adapt. Data Anal. 2009, 1, 1–41. [Google Scholar] [CrossRef]
  44. Torres, M.E.; Colominas, M.A.; Schlotthauer, G.; Flandrin, P. A complete ensemble empirical mode decomposition with adaptive noise. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, Czech Republic, 22–26 May 2011; pp. 4144–4147. [Google Scholar]
  45. Zhang, W.; Qu, Z.; Zhang, K.; Mao, W.; Ma, Y.; Fan, X. A combined model based on CEEMDAN and modified flower pollination algorithm for wind speed forecasting. Energy Convers. Manag. 2017, 136, 439–451. [Google Scholar] [CrossRef]
  46. Afanasyev, D.O.; Fedorova, E.A. The long-term trends on the electricity markets: Comparison of empirical mode and wavelet decompositions. Energy Econ. 2016, 56, 432–442. [Google Scholar] [CrossRef]
  47. Chen, Y.; Yang, Y.; Liu, C.; Li, C.; Li, L. A hybrid application algorithm based on the support vector machine and artificial intelligence: An example of electric load forecasting. Appl. Math. Model. 2015, 39, 2617–2632. [Google Scholar] [CrossRef]
  48. Vapnik, V.N. The Nature of Statistical Learning Theory; Springer: New York, NY, USA, 1995. [Google Scholar]
  49. Vapnik, V.N. Statistical Learning Theory; Wiley: New York, NY, USA, 1998. [Google Scholar]
  50. Ju, F.Y.; Hong, W.C. Application of seasonal SVR with chaotic gravitational search algorithm in electricity forecasting. Appl. Math. Model. 2013, 37, 9643–9651. [Google Scholar] [CrossRef]
  51. Li, M.W.; Geng, J.; Wang, S.; Hong, W.C. Hybrid chaotic quantum bat algorithm with SVR in electric load forecasting. Energies 2017, 10, 2180. [Google Scholar] [CrossRef]
  52. Liang, Y.; Niu, D.; Ye, M.; Hong, W.-C. Short-term load forecasting based on wavelet transform and least squares support vector machine optimized by improved cuckoo search. Energies 2016, 9, 827. [Google Scholar] [CrossRef]
  53. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  54. Zhao, H.; Zhao, H.; Guo, S. Using GM (1,1) Optimized by MFO with Rolling Mechanism to Forecast the Electricity Consumption of Inner Mongolia. Appl. Sci. 2016, 6, 20. [Google Scholar] [CrossRef]
  55. Diebold, F.X.; Mariano, R.S. Comparing predictive accuracy. J. Bus. Econ. Stat. 1995, 13, 253–263. [Google Scholar] [CrossRef]
  56. Chen, H.; Hou, D. Research on superior combination forecasting model based on forecasting effective measure. J. Univ. Sci. Technol. China 2002, 2, 172–180. [Google Scholar]
  57. Wang, J.; Du, P.; Niu, T.; Yang, W. A novel hybrid system based on a new proposed algorithm—Multi-objective Whale Optimization Algorithm for wind speed forecasting. Appl. Energy 2017, 208, 344–360. [Google Scholar] [CrossRef]
Figure 1. Flowchart of the proposed combined forecasting system.
Figure 2. Structure of the individual models and complete ensemble empirical mode decomposition with adaptive noise (CEEMDAN). (A) Structure of the complete ensemble empirical mode decomposition with adaptive noise. (B) Structure of the individual models.
Figure 3. Process of longitudinal data selection.
Figure 4. The results of AE, MAE, MSE and MAPE for February.
Figure 5. 1-step forecasting results in February. (A) The MAPE values from Monday to Sunday. (B) The forecasting results of Monday, February.
Figure 6. The results of AE, MAE, MSE and MAPE for June.
Figure 7. 1-step forecasting results in June. (A) The MAPE values from Monday to Sunday. (B) The forecasting results of Sunday, June.
Figure 8. Comparison of 1-step to 3-step forecasting performance of the proposed combined forecasting system and ARIMA model.
Figure 9. Comparison of the curve of fitness value of the MFO and other compared algorithms.
Table 1. Evaluation rules.
Metric | Definition | Equation
AE | The average error of T forecasting results | $\mathrm{AE}=\frac{1}{T}\sum_{i=1}^{T}\left(O_i-\hat{F}_i\right)$
MAE | The mean absolute error of T forecasting results | $\mathrm{MAE}=\frac{1}{T}\sum_{i=1}^{T}\left|O_i-\hat{F}_i\right|$
MSE | The mean square error of T forecasting results | $\mathrm{MSE}=\frac{1}{T}\sum_{i=1}^{T}\left(O_i-\hat{F}_i\right)^2$
MAPE (%) | The mean absolute percentage error | $\mathrm{MAPE}=\frac{1}{T}\sum_{i=1}^{T}\left|\frac{O_i-\hat{F}_i}{O_i}\right|\times 100\%$
ζ_INDEX ¹ (%) | The decreased relative error of the index among different models | $\zeta^{\mathrm{INDEX}}=\frac{\mathrm{INDEX}_{\mathrm{model}\,i}-\mathrm{INDEX}_{\mathrm{model}\,j}}{\mathrm{INDEX}_{\mathrm{model}\,i}}\times 100\%$
¹ In this paper, the INDEX includes MAE, MSE and MAPE.
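For readers who wish to reproduce the evaluation, the following Python sketch computes the metrics of Table 1 from an observed series and a forecast series; the sample arrays and function names are illustrative assumptions rather than the authors' code.

```python
import numpy as np

def evaluate(observed, forecast):
    """Compute the evaluation rules of Table 1 for one forecasting horizon."""
    o = np.asarray(observed, dtype=float)
    f = np.asarray(forecast, dtype=float)
    err = o - f
    return {
        "AE": err.mean(),                            # average error
        "MAE": np.abs(err).mean(),                   # mean absolute error
        "MSE": (err ** 2).mean(),                    # mean square error
        "MAPE (%)": np.abs(err / o).mean() * 100.0,  # mean absolute percentage error
    }

def zeta(index_model_i, index_model_j):
    """Decreased relative error (%) of model j with respect to model i."""
    return (index_model_i - index_model_j) / index_model_i * 100.0

# Hypothetical half-hourly loads (MW) and one set of forecasts.
observed = [9334.3, 9871.1, 9520.6, 9049.8]
forecast = [9300.0, 9905.0, 9480.0, 9102.5]
print(evaluate(observed, forecast))
print(zeta(0.9411, 0.5281))  # e.g. the MAPE improvement of the combined system over BPNN
```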
Table 2. Experimental parameter values.
Model | Experimental Parameter | Default Value
CEEMDAN | Noise standard deviation | 0.2
 | The number of realizations | 500
 | Maximum number of sifting iterations | 5000
 | The removed intrinsic mode functions | IMF1
BPNN | Learning velocity | 0.1
 | Maximum number of training iterations | 1000
 | Training precision requirement | 0.00004
 | Neuron number in the input layer | 4
 | Neuron number in the hidden layer | 9
 | Neuron number in the output layer | 1
FABPNN | FA number of fireflies | 30
 | Maximum number of FA iterations | 500
 | FA randomness (0–1) | 0.5
 | FA minimum value of beta | 0.2
 | FA absorption coefficient | 1
 | BPNN maximum number of iteration times | 200
 | BPNN convergence value | 0.00001
 | BPNN learning rate | 0.1
 | Neuron number in the input layer | 4
 | Neuron number in the hidden layer | 9
 | Neuron number in the output layer | 1
ENN | Number of iterations | 1000
 | Neuron number in the input layer | 4
 | Neuron number in the hidden layer | 9
 | Neuron number in the output layer | 1
WNN | Number of iterations | 100
 | Learning rate | 0.01
 | Neuron number in the input layer | 4
 | Neuron number in the hidden layer | 9
 | Neuron number in the output layer | 1
MFO | The number of search agents | 30
 | Maximum number of iterations | 300
 | The lower bounds of variables | 0.01
 | The upper bounds of variables | 100
 | The number of variables | 2
SVM | The number of the input layer | 4
 | The number of the output layer | 1
 | The kernel function's name | RBF
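To illustrate how the MFO and SVM settings of Table 2 fit together, the sketch below tunes the two free parameters of an RBF-kernel support vector regressor within the MFO bounds [0.01, 100] using a compact moth-flame optimizer. The synthetic data, the assumption that the two optimized variables are the penalty C and the kernel width gamma, and the validation-MAPE fitness function are illustrative choices rather than the authors' exact implementation.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.metrics import mean_absolute_percentage_error

rng = np.random.default_rng(0)

# Hypothetical training/validation split of lagged load features (4 inputs, 1 output).
X = rng.uniform(0.0, 1.0, size=(200, 4))
y = 1.0 + X @ np.array([0.5, 1.0, -0.7, 0.3]) + rng.normal(0.0, 0.02, size=200)
X_train, X_val, y_train, y_val = X[:150], X[150:], y[:150], y[150:]

def fitness(params):
    """Validation MAPE of an RBF-kernel SVR using the candidate (C, gamma)."""
    C, gamma = params
    model = SVR(kernel="rbf", C=C, gamma=gamma).fit(X_train, y_train)
    return mean_absolute_percentage_error(y_val, model.predict(X_val))

def mfo(obj, dim=2, lb=0.01, ub=100.0, n_agents=30, max_iter=300, b=1.0):
    """Compact moth-flame optimization (after Mirjalili, 2015) for box-constrained minimization."""
    moths = rng.uniform(lb, ub, size=(n_agents, dim))
    fit = np.array([obj(m) for m in moths])
    order = np.argsort(fit)
    flames, flame_fit = moths[order].copy(), fit[order].copy()       # best solutions so far
    for it in range(1, max_iter + 1):
        n_flames = round(n_agents - it * (n_agents - 1) / max_iter)  # flame count shrinks over time
        a = -1.0 - it / max_iter                                     # spiral range shrinks from -1 to -2
        for i in range(n_agents):
            flame = flames[min(i, n_flames - 1)]
            dist = np.abs(flame - moths[i])
            t = (a - 1.0) * rng.random(dim) + 1.0
            moths[i] = np.clip(dist * np.exp(b * t) * np.cos(2 * np.pi * t) + flame, lb, ub)
        fit = np.array([obj(m) for m in moths])
        merged = np.vstack([flames, moths])                          # keep the best n_agents as new flames
        merged_fit = np.concatenate([flame_fit, fit])
        order = np.argsort(merged_fit)[:n_agents]
        flames, flame_fit = merged[order], merged_fit[order]
    return flames[0], flame_fit[0]

best_params, best_mape = mfo(fitness, max_iter=30)  # fewer iterations than Table 2, for a quick demo
print("best (C, gamma):", best_params, "validation MAPE:", best_mape)
```

In the actual system, the optimizer would run with the 30 search agents and 300 iterations listed in Table 2, and the SVR would be trained on the outputs of the individual predictors rather than on synthetic features.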
Table 3. Statistical values of data used in this study.
Week | Month | Mean (MW) | Median (MW) | Std. (MW) | Minimum (MW) | Maximum (MW)
MON. | February | 9334.305 | 9871.095 | 1520.628 | 6649.840 | 11,078.940
 | June | 9720.624 | 9952.190 | 1409.115 | 7157.530 | 11,937.500
TUE. | February | 8589.615 | 9019.185 | 1073.913 | 6488.980 | 9683.760
 | June | 9920.198 | 10,129.775 | 1238.084 | 7453.350 | 11,996.360
WED. | February | 8649.826 | 9031.205 | 1111.026 | 6452.950 | 9786.950
 | June | 9682.923 | 9848.785 | 1189.045 | 7228.830 | 11,713.400
THU. | February | 8864.992 | 9374.700 | 1255.005 | 6487.110 | 10,273.970
 | June | 9603.785 | 9743.965 | 1206.705 | 7140.740 | 11,602.130
FRI. | February | 8870.044 | 9170.695 | 1237.710 | 6516.410 | 10,200.400
 | June | 9735.629 | 9894.510 | 1182.482 | 7327.950 | 11,451.680
SAT. | February | 8540.942 | 8979.640 | 1133.162 | 6554.520 | 9928.520
 | June | 9124.386 | 9226.755 | 982.705 | 7268.430 | 10,864.100
SUN. | February | 8093.038 | 8554.020 | 1001.747 | 6391.530 | 9390.940
 | June | 8772.331 | 8663.305 | 1116.673 | 6977.150 | 10,926.030
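The descriptive statistics of Table 3 can be reproduced from the raw half-hourly series with a few lines of pandas; the file name and the column labels below are placeholders.

```python
import pandas as pd

# Placeholder file: half-hourly New South Wales load with a timestamp column and a load column in MW.
load = pd.read_csv("nsw_load.csv", parse_dates=["timestamp"])
load["month"] = load["timestamp"].dt.month_name()
load["weekday"] = load["timestamp"].dt.day_name()

stats = (
    load[load["month"].isin(["February", "June"])]
    .groupby(["weekday", "month"])["load_mw"]
    .agg(["mean", "median", "std", "min", "max"])  # the columns reported in Table 3
    .round(3)
)
print(stats)
```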
Table 4. Forecasting results obtained using February data from New South Wales.
Week | Model | AE (1-Step, 2-Step, 3-Step) | MAE (1-Step, 2-Step, 3-Step) | MSE (1-Step, 2-Step, 3-Step) | MAPE (1-Step, 2-Step, 3-Step)
MON.BPNN1.3639−10.7514−14.785876.968087.3096126.86979493.4612,421.4727,525.350.86320.95581.4429
FABPNN3.7445−6.6519−5.247476.758988.2479128.22599547.0312,511.8828,258.780.86120.96701.4564
ENN−1.0044−13.5185−15.261984.9175105.2535152.550112,425.1218,729.9641,127.410.95351.15991.7416
WNN−0.1408−25.7658−26.4372134.9493174.9007236.524432,121.2466,146.46183,341.011.50291.91782.6471
Combined forecasting system−1.89544.8475−25.305342.754755.748392.04542955.295066.1315,392.980.48960.61251.0435
TUE.BPNN−14.0677−23.9245−32.185082.8839104.7622135.133311,424.7219,811.9232,025.020.98561.25801.6543
FABPNN−16.8212−28.3703−39.842687.0675111.3953140.676212,832.8022,262.7034,626.021.03411.33851.7184
ENN−20.6985−32.9486−42.9214104.1221134.5153158.641017,145.1930,481.0145,606.381.26591.65041.9700
WNN−18.6602−31.9576−50.3965132.1093179.5787231.999332,314.9595,482.40228,634.121.61682.20882.8686
Combined forecasting system−3.5775−3.1009−1.327148.753472.6386109.75213697.709420.4221,857.920.59210.86791.3137
WED.BPNN−17.0143−31.5453−35.043579.8144113.2511144.338410,635.3723,724.3236,327.290.94911.34971.7504
FABPNN−15.8284−30.8274−33.777883.1057113.8130139.980211,747.1024,310.6734,389.390.99301.36081.6969
ENN−13.7888−25.2383−26.105887.7993117.0281142.433713,134.1925,093.0435,466.311.04981.39611.7187
WNN−11.8223−12.5006−3.6612118.3172160.6506198.120241,894.2158,325.4479,324.941.42701.94082.3941
Combined forecasting system−8.1485−9.1175−28.695949.433968.0790104.99884082.459029.2921,231.040.59980.81241.2458
THU.BPNN−6.3561−19.6933−22.5452104.6001132.4169174.419417,715.7832,893.4847,678.251.20091.53452.0338
FABPNN−8.0497−22.9996−23.4411102.1040129.9880166.014317,458.2132,981.1543,922.021.17141.50801.9394
ENN−14.2674−30.6256−29.6502114.5853143.5429171.519321,097.1136,114.8845,852.661.33631.69302.0229
WNN−33.6208−53.1476−63.9543160.5429200.8806247.578245,415.2182,931.44124,086.681.89292.38032.9301
Combined forecasting system−6.5061−10.1216−4.252755.522680.3931123.87444824.1012,375.8424,987.700.65530.94071.4463
FRI.BPNN−10.2567−23.5966−21.270797.6717140.9842187.974817,003.6137,922.8361,624.941.11791.60602.1836
FABPNN−14.0009−28.2144−28.2492101.2430147.2809198.521118,035.5641,018.7569,155.701.16471.68982.3062
ENN−12.1384−27.6435−22.5750107.1073146.7995190.703319,173.2638,936.8159,817.361.23941.70102.2384
WNN−28.1042−47.9970−52.1578144.2766196.5340258.134533,297.4066,162.33108,392.631.70462.31693.0409
Combined forecasting system−3.7850−0.8954−3.469941.513171.7784105.16592979.4010,037.0422,513.690.47390.81631.1747
SAT.BPNN−18.3668−58.8065−50.190583.6567128.2100149.553913,347.2630,695.8042,244.560.99491.53851.8127
FABPNN−18.2990−58.7480−52.026081.0710122.2612142.667012,678.2527,997.9837,079.790.95961.45851.7135
ENN−8.9057−33.5612−33.200788.3301129.8825159.456013,860.6028,814.8143,314.431.04731.54081.8951
WNN−76.2241−103.5840−110.5893161.0754203.1411247.7138107,836.75137,768.64166,982.051.98932.50323.0492
Combined forecasting system−9.9972−15.8985−16.938042.221360.004897.43073253.007601.3916,427.040.51150.72601.1988
SUN.BPNN−19.0006−41.8885−63.057787.1693138.3510146.672911,695.3830,279.0941,649.761.09271.73841.8840
FABPNN−14.7801−33.2463−50.962487.4692134.0810144.363611,198.2226,233.7634,582.411.08401.67331.8233
ENN−14.2234−29.7956−42.536981.0386123.0542124.798710,251.8122,741.0123,536.691.00411.53741.5672
WNN−30.0213−49.0184−68.3288113.8393164.9003194.340419,452.6442,417.1064,463.361.44732.10722.5010
Combined forecasting system−3.11203.1278−17.604340.627555.679674.50552549.955992.0010,360.290.49880.67460.9197
Table 5. Improvement percentages generated by the combined forecasting system from February data.
Week | Metric | vs. BPNN (1-Step, 2-Step, 3-Step) | vs. FABPNN (1-Step, 2-Step, 3-Step) | vs. ENN (1-Step, 2-Step, 3-Step) | vs. WNN (1-Step, 2-Step, 3-Step)
MON. ζ M A E 44.451336.148827.448944.299936.827628.216249.651447.034339.662268.317968.125861.0842
ζ M S E 68.870359.214744.077169.044959.509445.528576.215272.951762.572590.799692.341091.6042
ζ M A P E 43.277835.910127.681943.141936.653928.353648.647847.190740.086767.418868.059760.5802
TUE. ζ M A E 41.178730.663418.782344.005134.792121.982553.176745.999830.817363.096259.550652.6929
ζ M S E 67.634352.450731.747471.185657.685236.874378.433069.094152.072788.557390.133990.4398
ζ M A P E 39.922331.009620.589742.737035.156723.551353.222747.411933.313463.376160.705854.2041
ζ M A E 38.063939.886727.255140.516840.183424.990243.696641.826826.282358.219257.622947.0025
WED. ζ M S E 61.614461.940841.556265.247262.858738.262868.917464.016740.137590.255384.519173.2354
ζ M A P E 36.797439.806828.827439.593440.297526.583242.860241.808127.515757.965558.140547.9638
ζ M A E 46.919239.287928.979045.621638.153425.383351.544843.993727.778265.415759.979749.9655
THU. ζ M S E 72.769562.376047.591072.367862.476043.108977.133965.732045.504489.377885.077079.8627
ζ M A P E 45.434538.699128.886644.060737.621825.422550.961044.436628.502365.381860.479650.6387
FRI. ζ M A E 57.497349.087744.053258.996651.264347.025361.241651.104544.853771.226763.477959.2593
ζ M S E 82.477973.533063.466683.480475.530667.444984.460774.222262.362691.052284.829779.2295
ζ M A P E 57.609449.171246.205659.315751.690649.063761.767652.007847.522872.200564.766661.3709
SAT. ζ M A E 49.530353.198034.852547.920650.920831.707652.200553.800738.898173.787970.461560.6680
ζ M S E 75.627975.236461.114474.341972.850255.698176.530673.619962.074996.983494.482590.1624
ζ M A P E 48.588752.814433.867446.697150.226830.037651.161252.883336.742474.287970.999160.6855
SUN. ζ M A E 53.392559.754849.202953.552358.473248.390349.866554.752040.299464.311666.234461.6624
ζ M S E 78.197080.210875.125277.229077.159270.041775.126873.651155.982486.891585.873683.9284
ζ M A P E 54.351861.194151.182053.981759.684849.557150.321456.120341.313265.533767.986163.2263
Table 6. Statistical MAPE values of February data from New South Wales.
Week | Statistic | BPNN (1-Step, 2-Step, 3-Step) | FABPNN (1-Step, 2-Step, 3-Step) | ENN (1-Step, 2-Step, 3-Step) | WNN (1-Step, 2-Step, 3-Step) | Combined Forecasting System (1-Step, 2-Step, 3-Step)
MON.Minimum0.82630.88681.25210.81200.85291.13310.81200.92741.23150.97051.13731.37870.45290.54720.8730
Maximum0.94341.11931.74370.98231.26372.18281.08791.43982.17513.00414.07547.25840.51510.65051.2038
Mean0.86320.95581.44290.86120.96701.45640.95351.15991.74161.50291.91782.64710.48960.61251.0435
Std.0.03450.06570.16160.04870.10370.27400.07410.13850.24700.60150.93871.55680.01530.02890.0867
TUE.Minimum0.85131.04941.39430.89421.12931.41140.95881.21021.43131.08941.38841.79980.56080.79451.0911
Maximum1.19121.51851.97391.48061.86112.00361.52631.97352.38953.38436.957810.69870.61850.91871.5951
Mean0.98561.25801.65431.03411.33851.71841.26591.65041.97001.61682.20882.86860.59210.86791.3137
Std.0.08770.12990.18020.14930.20290.16220.15360.19370.22410.56931.36442.20100.01790.03660.1462
WED.Minimum0.89881.13421.36680.89591.06321.25430.95891.18111.38291.00151.22421.45470.53030.76430.7724
Maximum1.03761.48762.00361.32941.97172.30571.16921.57232.04033.29694.70784.46281.03700.85291.4643
Mean0.94911.34971.75040.99301.36081.69691.04981.39611.71871.42701.94082.39410.59980.81241.2458
Std.0.03760.09440.17500.11750.21260.25170.06920.11860.18460.55230.82680.69520.12230.03090.1919
THU.Minimum1.13281.35551.73451.05921.24461.62401.16951.48151.71971.44391.69011.94510.59960.68611.2255
Maximum1.68972.22293.18371.45712.06092.58221.45671.90752.42233.35815.06445.96950.69860.99701.7947
Mean1.20091.53452.03381.17141.50801.93941.33631.69302.02291.89292.38032.93010.65530.94071.4463
Std.0.13700.20100.33810.10430.18750.26720.07950.13000.20290.48190.84721.03810.02730.07610.1614
FRI.Minimum0.97891.35211.80900.98791.37891.98021.15331.57782.04251.19841.68952.19850.42170.72670.9891
Maximum1.32731.85292.63471.62432.19602.88171.52011.89152.49862.34403.08834.24230.51720.92591.4602
Mean1.11791.60602.18361.16471.68982.30621.23941.70102.23841.70462.31693.04090.47390.81631.1747
Std.0.08690.14810.21710.16250.23300.26010.09670.10500.13910.35050.43060.64770.02720.05080.1357
SAT.Minimum0.89251.40901.57620.87921.30741.51220.94971.42491.61320.91581.12411.35790.48070.63951.0126
Maximum1.09301.68872.11201.08351.64221.95601.22321.65952.183210.343711.134911.04340.54090.81481.4727
Mean0.99491.53851.81270.95961.45851.71351.04731.54081.89511.98932.50323.04920.51150.72601.1988
Std.0.05360.09120.15500.05730.09580.12370.09650.07950.17222.34442.43002.31400.01450.04720.1428
SUN.Minimum0.94231.45011.44590.91741.39931.43870.93671.44701.40251.07031.67191.66500.47500.49910.7836
Maximum1.70582.75973.62311.43691.91292.26791.12171.67491.89431.82942.64363.13350.51440.74941.0195
Mean1.09271.73841.88401.08401.67331.82331.00411.53741.56721.44732.10722.50100.49880.67460.9197
Std.0.24290.41490.67000.12150.17440.26220.05970.05520.14040.20720.27010.45090.01220.07840.0729
Table 7. Forecasting results obtained using June data from New South Wales.
Week | Model | AE (1-Step, 2-Step, 3-Step) | MAE (1-Step, 2-Step, 3-Step) | MSE (1-Step, 2-Step, 3-Step) | MAPE (1-Step, 2-Step, 3-Step)
MON.BPNN28.149040.280960.042087.2188116.6938157.321812,598.6224,248.9146,240.920.91741.25001.6564
FABPNN31.942643.433770.005191.8725122.9630174.090814,715.9927,952.6157,357.070.95921.30001.8176
ENN23.549725.629641.8058107.7753140.2308191.144322,021.5043,213.0684,770.041.11901.49002.0180
WNN21.363524.272040.4258131.7077172.9343259.793734,847.8261,929.17162,633.651.37221.82002.6891
Combined forecasting system13.164331.778333.247853.440681.4294118.39254923.0110,866.1129,343.950.57200.85621.2510
TUE.BPNN11.863211.094630.615268.820293.5565118.43687209.0014,736.0127,030.990.70280.94001.1964
FABPNN8.27114.784423.488572.621596.1279127.48868652.0216,947.8231,041.000.73720.96001.2870
ENN9.93182.246725.548297.0374117.4459147.841916,480.6730,382.5852,207.810.97741.18001.4659
WNN8.37534.232324.5869113.9295142.2588203.135122,966.5140,516.8691,325.841.15041.43002.0418
Combined forecasting system5.53245.2540−0.869552.560674.6922104.13884625.968844.6520,139.940.53900.75431.0397
WED.BPNN−12.3035−26.7509−41.059267.780794.4602135.40848288.4317,216.8134,143.770.68940.95681.3600
FABPNN−8.0366−20.7559−26.535367.442794.1133136.70378505.3617,816.2936,375.570.69170.96121.3900
ENN−12.0552−29.6080−34.321886.6654124.5985175.295414,665.4231,819.0764,573.050.88241.27151.7700
WNN−42.7244−70.8510−97.8009126.2426167.1746243.522938,007.5760,350.02123,256.041.30271.72002.4900
Combined forecasting system−4.8054−10.0858−26.129943.017270.973093.24342939.599012.5116,351.510.44890.73360.9664
THU.BPNN−5.8352−18.9423−40.713582.8959112.6687151.227911,293.1623,270.9543,996.530.85431.16601.5612
FABPNN−9.7177−24.0847−54.533284.0990111.3262155.727911,357.1021,913.8644,006.920.86461.14571.6058
ENN−14.1086−34.7318−62.7759104.5977137.7137201.825519,136.6338,040.7776,066.671.07511.43072.0714
WNN−27.2461−54.1089−91.1277119.6625160.5898231.175624,307.8549,421.97103,282.741.24401.68002.4100
Combined forecasting system−0.1579−1.2456−1.111450.431879.7675118.23273847.2010,580.4223,714.550.53060.83351.2603
FRI.BPNN12.285811.558619.335682.8441104.8186142.557113,523.9926,727.8653,413.520.83651.05931.4524
FABPNN14.724015.603922.331287.5349111.5343156.435415,983.8928,458.8558,309.170.88451.13511.6003
ENN12.12839.714017.3316101.8761131.7470170.128820,140.8838,866.1072,284.311.03341.35531.7397
WNN−4.3370−22.6447−4.8305155.6390198.0136266.981269,985.41102,668.91207,077.151.60502.06282.7662
Combined forecasting system5.18348.841228.714143.165060.7125100.76073316.257569.4821,664.290.44290.62231.0342
SAT.BPNN15.554333.109748.487985.7014113.2363130.224617,019.5126,498.7733,826.420.92081.22381.4201
FABPNN12.757230.728254.917691.6801118.8509135.095719,685.1129,051.8335,875.160.98651.29211.4751
ENN8.936027.697069.9136104.2473144.1342157.329826,924.2142,076.4251,964.641.11981.56481.7123
WNN5.304817.982355.9119119.7350165.8653194.238833,275.5655,086.7177,887.751.28681.79432.1188
Combined forecasting system−1.90622.6399−7.838146.377263.6460114.95713754.728630.0233,023.620.50520.68981.2506
SUN.BPNN12.108413.406816.683093.6335131.9624164.330415,544.8935,966.0955,245.021.04961.46831.8586
FABPNN9.44279.257614.321190.8458130.7386150.056314,898.6333,837.1946,475.601.02131.46081.6896
ENN7.704514.585332.2788117.4224162.3855212.778626,470.6954,663.7290,407.861.31171.79752.3623
WNN−7.0600−15.6229−14.4743135.7484191.2429248.316933,576.9872,513.26123,455.021.52392.14302.7985
Combined forecasting system3.4471−4.246910.802846.779668.7158101.12633391.828848.7323,580.160.53320.77021.1393
Table 8. Improvement percentages generated by the combined forecasting system from June data.
Week | Metric | vs. BPNN (1-Step, 2-Step, 3-Step) | vs. FABPNN (1-Step, 2-Step, 3-Step) | vs. ENN (1-Step, 2-Step, 3-Step) | vs. WNN (1-Step, 2-Step, 3-Step)
MON. ζ M A E 38.728230.219624.745041.831833.777331.993950.414841.931938.061259.424952.913154.4283
ζ M S E 60.924255.189336.541266.546561.126748.839977.644574.854665.384185.872882.454081.9570
ζ M A P E 37.651931.505924.472640.368934.140331.171548.884642.538538.008558.316452.957453.4797
TUE. ζ M A E 23.626220.163512.072327.623922.299118.315245.834736.402829.560753.865747.495548.7342
ζ M S E 35.830839.979325.493146.533247.812535.118271.931070.889161.423579.857878.170477.9472
ζ M A P E 23.300919.755413.092826.879921.427219.210744.849536.076329.071853.143147.251849.0783
WED. ζ M A E 36.534724.864631.139136.216624.587731.791650.364043.038646.807865.924957.545661.7106
ζ M S E 64.533847.652952.109865.438349.414355.048179.955671.675874.677592.265885.066386.7337
ζ M A P E 34.882323.326528.940135.098823.676630.473849.124942.305045.400365.539257.348561.1882
THU. ζ M A E 39.162529.201821.818240.032928.348024.077351.785042.077341.418357.855050.328448.8559
ζ M S E 65.933454.533846.099066.125251.718246.111879.896172.186668.824084.173078.591777.0392
ζ M A P E 37.889928.518119.275638.629827.254421.518150.645841.744639.156757.346750.388347.7055
FRI. ζ M A E 47.896042.078529.319150.688245.566135.589657.629953.917440.773972.265969.339262.2593
ζ M S E 75.478771.679559.440479.252573.402062.845883.534780.524270.029095.261592.627389.5381
ζ M A P E 47.058641.254728.793749.930145.173135.374457.143454.081240.551972.406069.830862.6123
SAT. ζ M A E 45.885243.793711.724049.414146.448914.906955.512355.842526.932461.266861.627940.8166
ζ M S E 77.938767.43242.373380.926170.29447.948586.054579.489636.449888.716384.333757.6010
ζ M A P E 45.130143.633111.933948.784446.615015.215354.881155.915726.962360.736661.556040.9745
SUN. ζ M A E 50.039747.927838.461648.506647.440432.607860.161357.683652.473565.539564.068959.2753
ζ M S E 78.180575.397057.317177.234173.849149.263387.186583.812473.918089.898487.797180.8998
ζ M A P E 49.198347.546238.703047.790647.273432.571659.349357.151651.773165.009964.059759.2897
Table 9. Statistical MAPE values of June data from New South Wales.
Week | Statistic | BPNN (1-Step, 2-Step, 3-Step) | FABPNN (1-Step, 2-Step, 3-Step) | ENN (1-Step, 2-Step, 3-Step) | WNN (1-Step, 2-Step, 3-Step) | Combined Forecasting System (1-Step, 2-Step, 3-Step)
MON.Minimum0.80431.08251.23660.89421.11301.49950.98501.32921.57921.06271.38281.81890.54570.75611.1092
Maximum0.98561.36881.94191.02611.50912.30991.26921.72542.26852.68802.97174.75480.59570.94561.6676
Mean0.91741.24511.65640.95921.30141.81761.11901.49402.01801.37221.81972.68910.57200.85621.2510
Std.0.05690.10210.23310.04410.12340.23000.08080.09980.22600.40920.41890.72600.01520.05480.1365
TUE.Minimum0.59350.72410.91080.62550.82791.10660.90181.04631.16640.83221.03901.39400.49930.69880.9078
Maximum0.85421.14701.56770.91091.22441.49251.12131.42291.89401.57531.81142.85230.58200.85321.2966
Mean0.70280.94381.19640.73720.96221.28700.97741.17721.46591.15041.43422.04180.53900.75431.0397
Std.0.07430.11080.18900.09360.11340.11160.05990.09770.18340.21870.22200.38260.02650.04130.1148
WED.Minimum0.52410.72110.94370.53850.73471.15550.81011.18461.50870.82261.10761.64400.42810.66650.8328
Maximum0.82111.20691.59340.85911.26941.71921.01591.42522.01764.13774.39095.73350.47040.79551.2923
Mean0.68940.95681.36290.69170.96121.39100.88241.27151.76511.30271.72172.49340.44890.73360.9664
Std.0.08920.14370.18630.09720.15760.20010.05410.06440.16880.81690.76330.98420.01290.03400.1311
THU.Minimum0.69160.94811.16720.67370.91231.30600.98991.30601.71241.05091.31081.80420.49910.75031.0305
Maximum1.01461.35232.24350.98861.28571.84341.16881.59362.41021.80082.44803.31780.58630.90701.5658
Mean0.85431.16601.56120.86461.14571.60581.07511.43072.07141.24401.67932.41110.53060.83351.2603
Std.0.10200.13130.30720.08550.11590.15330.05100.08390.21230.19210.27650.35580.02230.04570.1396
FRI.Minimum0.58440.68140.98060.63860.80481.16890.92081.17421.42780.87041.18431.70690.42760.54800.8679
Maximum0.98921.27721.76131.27841.52892.29551.10011.50492.05566.76727.31967.31200.46390.68331.1764
Mean0.83651.05931.45240.88451.13511.60031.03341.35531.73971.60502.06282.76620.44290.62231.0342
Std.0.12960.19850.23320.15420.19660.33060.05250.08810.13351.55141.65691.52440.00980.03760.0884
SAT.Minimum0.78301.06701.14810.81941.02281.11721.02021.43311.47781.12111.53911.63870.47920.63321.0244
Maximum1.14441.57211.84961.14001.52492.14471.20051.72591.94871.83262.20392.93830.53810.76041.4706
Mean0.92081.22381.42010.98651.29211.47511.11981.56481.71231.28681.79432.11880.50520.68981.2506
Std.0.12460.15060.20280.08480.15670.24470.05490.07330.15070.19600.21330.39530.01610.03920.1528
SUN.Minimum0.82271.09351.27440.82001.16291.33231.18231.63082.07061.31851.89572.20490.50420.65730.9825
Maximum1.25921.78742.36661.28041.78112.19081.52002.15952.66892.11732.66353.44490.56830.85271.5572
Mean1.04961.46831.85861.02131.46081.68961.31171.79752.36231.52392.14302.79850.53320.77021.1393
Std.0.15280.22660.36220.14000.20650.29220.08430.14250.18990.21580.23160.32700.02020.05000.1751
Table 10. Forecasting results of the proposed combined forecasting system and ARIMA model.
Model | AE (1-Step, 2-Step, 3-Step) | MAE (1-Step, 2-Step, 3-Step) | MSE (1-Step, 2-Step, 3-Step) | MAPE (1-Step, 2-Step, 3-Step)
February
ARIMA | 0.4733, 5.1282, 2.3802 | 116.3646, 168.6304, 239.2805 | 23,001.97, 52,536.41, 92,779.14 | 1.3620, 1.9657, 2.8060
Combined forecasting system | −5.2888, −4.4512, −13.9419 | 45.8323, 66.3317, 101.1104 | 3477.41, 8503.16, 18,967.24 | 0.5459, 0.7786, 1.1918
June
ARIMA | −0.2541, 8.4307, 14.9222 | 139.1223, 236.8844, 397.5834 | 36,617.72, 101,576.84, 279,856.30 | 1.4511, 2.4757, 4.1315
Combined forecasting system | 2.9225, 4.7050, 5.2594 | 47.9674, 71.4195, 107.2645 | 3828.36, 9193.13, 23,974.01 | 0.5103, 0.7514, 1.1345
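As a rough guide to reproducing the ARIMA benchmark in Table 10, the sketch below fits an ARIMA model with statsmodels and produces 1- to 3-step forecasts; the model order and the synthetic series are assumptions, since the exact specification is not restated here.

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Placeholder: one week of half-hourly load observations (MW) with a rough daily cycle.
rng = np.random.default_rng(1)
history = 9000 + 800 * np.sin(np.linspace(0, 14 * np.pi, 336)) + rng.normal(0, 60, 336)

# Fit an ARIMA model (order chosen for illustration only) and forecast 1 to 3 steps ahead.
fit = ARIMA(history, order=(2, 1, 2)).fit()
print("1-step, 2-step, 3-step forecasts:", fit.forecast(steps=3))
```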
Table 11. Results for the DM test and the forecasting effectiveness.
Test Method | Model | Average Value (1-step) | Average Value (2-step) | Average Value (3-step)
DM test | BPNN | 2.9934 *** | 2.8608 *** | 2.6476 ***
 | FABPNN | 2.9884 *** | 2.8418 *** | 2.7172 ***
 | ENN | 3.3992 *** | 3.1337 *** | 2.9694 ***
 | WNN | 3.7132 *** | 3.3533 *** | 3.3633 ***
 | ARIMA | 3.8003 *** | 3.4451 *** | 4.0389 ***
Forecasting effectiveness 1 | BPNN | 0.9913 | 0.9879 | 0.9851
 | FABPNN | 0.9913 | 0.9878 | 0.9853
 | ENN | 0.9895 | 0.9857 | 0.9824
 | WNN | 0.9885 | 0.9843 | 0.9796
 | ARIMA | 0.9859 | 0.9778 | 0.9653
 | Combined forecasting system | 0.9948 | 0.9925 | 0.9899
Forecasting effectiveness 2 | BPNN | 0.9838 | 0.9770 | 0.9713
 | FABPNN | 0.9836 | 0.9770 | 0.9717
 | ENN | 0.9801 | 0.9724 | 0.9659
 | WNN | 0.9787 | 0.9703 | 0.9616
 | ARIMA | 0.9736 | 0.9584 | 0.9385
 | Combined forecasting system | 0.9906 | 0.9858 | 0.9804
*** Indicates the 1% significance level; 1 Indicates the one-order forecasting effectiveness; 2 Indicates the two-order forecasting effectiveness.
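A minimal implementation of the DM statistic reported in Table 11 is sketched below, assuming a squared-error loss; the two error series are synthetic placeholders.

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2, h=1):
    """DM statistic comparing two forecast-error series under squared-error loss."""
    d = np.asarray(e1, dtype=float) ** 2 - np.asarray(e2, dtype=float) ** 2
    T, d_bar = d.size, d.mean()
    # Long-run variance of the loss differential from autocovariances up to lag h-1.
    gamma = [np.sum((d[k:] - d_bar) * (d[:T - k] - d_bar)) / T for k in range(h)]
    dm = d_bar / np.sqrt((gamma[0] + 2 * sum(gamma[1:])) / T)
    p_value = 2 * (1 - stats.norm.cdf(abs(dm)))
    return dm, p_value

# Hypothetical forecast errors of a benchmark model and of the combined system.
rng = np.random.default_rng(2)
errors_benchmark = rng.normal(0, 100, 336)
errors_combined = rng.normal(0, 55, 336)
print(diebold_mariano(errors_benchmark, errors_combined, h=1))
```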
Table 12. Comparison between the proposed combined forecasting system and linear combined models.
Model | AE (1-Step, 2-Step, 3-Step) | MAE (1-Step, 2-Step, 3-Step) | MSE (1-Step, 2-Step, 3-Step) | MAPE (1-Step, 2-Step, 3-Step)
SAT.
Average value method | 30.4489, −63.6749, −61.5016 | 84.582, 128.3602, 149.4587 | 13,397.84, 28,370.37, 38,526.36 | 1.0156, 1.5425, 1.8083
Entropy weight method | 29.3053, −62.6083, −59.9099 | 83.78, 127.4186, 147.9893 | 13,213.82, 28,112.81, 38,044.91 | 1.0044, 1.5293, 1.7879
Combined forecasting system | −9.9972, −15.8985, −16.9380 | 42.2213, 60.0048, 97.4307 | 3253.00, 7601.39, 16,427.04 | 0.5115, 0.7260, 1.1988
FRI.
Average value method | 8.7003, 3.5579, 13.542 | 83.3551, 104.9405, 142.5478 | 13,811.16, 26,506.87, 55,862.19 | 0.8416, 1.0666, 1.4595
Entropy weight method | 8.9007, 4.0851, 13.8869 | 83.196, 104.477, 141.4894 | 13,779.77, 26,349.11, 55,349.16 | 0.8398, 1.0618, 1.4479
Combined forecasting system | 5.1834, 8.8412, 28.7141 | 43.1650, 60.7125, 100.7607 | 3316.25, 7569.48, 21,664.29 | 0.4429, 0.6223, 1.0342
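To make the contrast in Table 12 concrete, the sketch below merges the forecasts of the four individual predictors first with the average-value method (a linear combination) and then with an RBF-kernel SVR trained on the stacked forecasts, in the spirit of the proposed nonlinear combination; the forecast matrices and hyperparameters are hypothetical stand-ins.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(3)

# Hypothetical stacked forecasts of BPNN, FABPNN, ENN and WNN (one column each) over a
# training window and a test window, together with the observed loads (MW).
y_train = 9000 + 500 * np.sin(np.linspace(0, 6 * np.pi, 200)) + rng.normal(0, 40, 200)
y_test = 9000 + 500 * np.sin(np.linspace(6 * np.pi, 8 * np.pi, 48)) + rng.normal(0, 40, 48)
preds_train = y_train[:, None] + rng.normal(0, 80, (200, 4))
preds_test = y_test[:, None] + rng.normal(0, 80, (48, 4))

# Linear combination: the average-value method simply averages the individual forecasts.
linear_combined = preds_test.mean(axis=1)

# Nonlinear combination: an RBF-kernel SVR learns how to merge the stacked forecasts.
combiner = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100.0, gamma="scale"))
nonlinear_combined = combiner.fit(preds_train, y_train).predict(preds_test)

mape = lambda obs, fc: np.mean(np.abs((obs - fc) / obs)) * 100
print("average-value combination MAPE (%):", mape(y_test, linear_combined))
print("SVR-based combination MAPE (%):", mape(y_test, nonlinear_combined))
```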
Table 13. Results for the standard deviation of forecasting errors.
Model | Average Value (1-Step) | Average Value (2-Step) | Average Value (3-Step)
BPNN | 103.0949 | 141.1311 | 177.1506
FABPNN | 104.3878 | 140.8473 | 175.7578
ENN | 127.8786 | 174.4013 | 217.9661
WNN | 132.0079 | 178.8134 | 236.4214
ARIMA | 172.1966 | 274.8526 | 416.6033
Combined forecasting system | 58.8549 | 89.8436 | 124.6596
