Application of Hybrid Quantum Tabu Search with Support Vector Regression (SVR) for Load Forecasting

Hybridizing chaotic evolutionary algorithms with support vector regression (SVR) to improve forecasting accuracy is an active topic in electricity load forecasting. Trapping at local optima and premature convergence are critical shortcomings of the tabu search (TS) algorithm. This paper investigates potential improvements of the TS algorithm by applying quantum computing mechanisms to enhance its search information-sharing mechanism (the tabu memory), and presents an SVR-based load forecasting model that integrates quantum behaviors and the TS algorithm with the SVR model (namely, SVRQTS) to obtain more satisfactory forecasting accuracy. Numerical examples demonstrate that the proposed model outperforms the alternatives.


Introduction
A booming economy is dramatically increasing electric loads in every industry and in people's daily lives. Meeting this demand has become an important goal of electricity providers. However, as mentioned by Bunn and Farmer [1], a 1% increase in the error of an electricity demand forecast corresponds to a £10 million increase in operating costs. Therefore, decision-makers seek accurate load forecasts to set effective energy policies, such as those concerning new power plants and investment in facilities [2]. Importing or exporting electricity in energy-limited developing economies, such as that of Taiwan, is almost impossible [3,4]. Unfortunately, electric load data exhibit various characteristics, including nonlinearity and chaos. Moreover, many exogenous factors interact with each other and affect forecasting, such as economic activities, weather conditions, population, and industrial production. These effects increase the difficulty of load forecasting [5].
In the last few decades, models for improving the accuracy of load forecasting have included the well-known Box-Jenkins ARIMA model [6], the exponential smoothing model [7], Kalman filtering/linear quadratic estimation models [8-10], Bayesian estimation models [11-13], and regression models [14-16]. However, most of these models are theoretically based on assumed linear relationships between historical data and exogenous variables, and so cannot effectively capture the complex nonlinear characteristics of load series or easily provide highly accurate load forecasts.
In the mid-1990s, support vector regression (SVR) [42] began to be used to solve forecasting problems [43], and in the 2000s, Hong et al. [44-56] developed various SVR-based load forecasting models by hybridizing evolutionary algorithms, chaotic mapping functions, and cloud theory with an SVR model to determine its three parameters effectively and thereby improve forecasting accuracy. Based on Hong's results, the accurate determination of the three parameters of the SVR model is critical to its forecasting performance. However, evolutionary algorithms suffer from drawbacks during the search for the optimal parameter combination, such as premature convergence and trapping at local optima. Therefore, Hong and his colleagues investigated chaotic mapping functions to increase ergodicity over the search space, transferred the three parameters into chaotic space to make the search more compact, and employed cloud theory to establish a cooling mechanism during the annealing process to enrich the effects of the temperature-decreasing mechanism, eventually improving the search quality of the SA algorithm and yielding better forecasting accuracy.
Inspired by the work of Hong et al., the authors note that the tabu search (TS) [57,58] algorithm is simple to implement and iteratively finds near-optimal solutions, so it is powerful and has been successfully applied to various optimization problems [59-61]. However, even with a flexible memory system that records recently visited solutions and an ability to climb out of local minima, the TS algorithm suffers from the tuning of the tabu tenure, meaning that it can still become stuck at local minima and has a low convergence speed [62,63]. Also, the best solution may remain fixed for many iterations, i.e., escaping from the current position to a near-global optimum takes a great deal of time [64]. Therefore, both intensification and diversification strategies should be considered to improve the robustness, effectiveness, and efficiency of simple TS; a more powerful neighborhood structure can feasibly be constructed by applying quantum computing concepts [65]. The same old problem, premature convergence or trapping at local optima, otherwise makes the forecasting accuracy unsatisfactory. This paper seeks to extend Hong's exploration to overcome the shortcomings of the TS algorithm and to use the improved TS algorithm to forecast electric loads.
In this work, quantum computing concepts are utilized to improve the intensification and diversification of the simple TS algorithm, to improve its search performance, and thus to improve its forecasting accuracy. The forecasting performance of the proposed hybrid quantum TS algorithm with an SVR model, the support vector regression quantum tabu search (SVRQTS) model, is compared with that of four other forecasting methods that were proposed by Hong [56] and Huang [66]. This paper is organized as follows. Section 2 presents the detailed processes of the proposed SVRQTS model; the basic formulation of SVR and the quantum tabu search (QTS) algorithm are introduced. Section 3 presents numerical examples and compares published methods with respect to forecasting accuracy. Finally, Section 4 draws conclusions.

Support Vector Regression (SVR) Model
A brief introduction to the SVR model is provided as follows. For a given training data set, G = {(x_i, y_i)}, i = 1, ..., n, where x_i is a vector of fed-in data and y_i is the corresponding actual value, G is mapped into a high-dimensional feature space by a nonlinear mapping function, ϕ(•). Theoretically, in the feature space, there should be an optimized linear function, f, to approximate the relationship between x_i and y_i. This optimized linear function is the so-called SVR function, shown as Equation (1), where f(x) represents the forecast values, and w and b are coefficients that are estimated by minimizing the empirical risk function shown in Equation (2), where L_ε(y, f(x)) is the ε-insensitive loss function. The ε-insensitive loss function is employed to find an optimal hyperplane in the high-dimensional feature space that maximizes the distance separating the training data into two subsets. Thus, SVR focuses on finding the optimal hyperplane and minimizing the training error between the training data and the ε-insensitive loss function.
The SVR model then minimizes the overall errors, as shown in Equation (3). The first term of Equation (3), by employing the concept of maximizing the distance between the two separated sets of training data, is used to regularize weight sizes, to penalize large weights, and to maintain the flatness of the regression function. The second term, which penalizes the training errors between f(x) and y, decides the balance between confidence risk and empirical risk by using the ε-insensitive loss function. C is a parameter that specifies the trade-off between the empirical risk and the model flatness. Training errors above ε are denoted by ξ*_i, whereas training errors below −ε are denoted by ξ_i; these are two positive slack variables that represent the distance from the actual values to the corresponding boundary values of the ε-tube.
After the quadratic optimization problem with inequality constraints is solved, the parameter vector w in Equation (1) is obtained as in Equation (4), where β*_i and β_i, satisfying the equality β_i × β*_i = 0, are the Lagrangian multipliers. Finally, the SVR regression function is obtained as Equation (5) in the dual space, where K(x_i, x_j) is the so-called kernel function, whose value equals the inner product of the two vectors x_i and x_j in the feature space, that is, K(x_i, x_j) = ϕ(x_i) • ϕ(x_j). However, computing the inner product in the high-dimensional feature space becomes computationally complicated as the input dimension increases. This contradiction between high dimensionality and computational complexity can be overcome by using the kernel trick, i.e., defining an appropriate kernel function in place of the dot product of the input vectors in the high-dimensional feature space. The kernel function computes the inner product directly from the input space rather than in the high-dimensional feature space, and thus provides a way to avoid the curse of dimensionality. Several types of kernel function exist, and it is hard to determine the appropriate type for specific data patterns [67]. The most commonly used kernel functions include linear functions, polynomial functions, Gaussian functions, sigmoid functions, and splines. The Gaussian function, K(x_i, x_j) = exp(−0.5‖x_i − x_j‖²/σ²), is the most widely used among these kernel functions because it can map the input sample set into a high-dimensional feature space effectively and is good at representing complex nonlinear relationships between the input and output samples. Furthermore, only one parameter (the width parameter, σ) has to be defined. Considering these advantages, the Gaussian radial basis function (RBF) is employed as the kernel function in this study.
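Equations (1)-(5) are referenced above but not reproduced in this excerpt; in standard ε-SVR notation consistent with the surrounding description, they take roughly the following form (a reconstruction, not the authors' exact typesetting):

```latex
f(x) = w^{T}\phi(x) + b \tag{1}

R_{\mathrm{emp}}(f) = \frac{1}{n}\sum_{i=1}^{n} L_{\varepsilon}\bigl(y_i, f(x_i)\bigr),
\qquad
L_{\varepsilon}(y, f(x)) =
\begin{cases}
0, & \lvert y - f(x)\rvert \le \varepsilon,\\
\lvert y - f(x)\rvert - \varepsilon, & \text{otherwise}
\end{cases} \tag{2}

\min_{w,\, b,\, \xi,\, \xi^{*}} \;
\frac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\bigl(\xi_i + \xi_i^{*}\bigr)
\quad \text{s.t.} \quad
\begin{aligned}
y_i - w^{T}\phi(x_i) - b &\le \varepsilon + \xi_i^{*},\\
w^{T}\phi(x_i) + b - y_i &\le \varepsilon + \xi_i,\\
\xi_i,\ \xi_i^{*} &\ge 0
\end{aligned} \tag{3}

w = \sum_{i=1}^{n}\bigl(\beta_i - \beta_i^{*}\bigr)\phi(x_i) \tag{4}

f(x) = \sum_{i=1}^{n}\bigl(\beta_i - \beta_i^{*}\bigr)\, K(x_i, x) + b \tag{5}
```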
The most important consideration in maximizing the forecasting accuracy of an SVR model is the proper determination of its three parameters: the hyper-parameters C and ε, and the kernel parameter σ. Therefore, finding efficient algorithms for determining these three parameters is critical. As indicated above, inspired by Hong's hybridization of chaotic mapping functions with evolutionary algorithms to find favorable parameter combinations and to overcome the premature convergence of evolutionary algorithms, this work uses another (quantum-based) method to obtain an effective hybrid algorithm without the drawbacks of the TS algorithm, for example, by improving its intensification and diversification. Accordingly, the QTS algorithm is developed and further improved by hybridizing it with a chaotic mapping function. The resulting chaotic QTS (CQTS) algorithm is hybridized with an SVR model to develop the support vector regression chaotic quantum tabu search (SVRCQTS) model, which optimizes parameter selection to maximize forecasting accuracy.
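As a concrete illustration of this parameter-search problem, the following sketch minimizes validation MAPE over randomly sampled (σ, C) values. It is our illustrative toy, not the authors' code: kernel ridge regression with the Gaussian kernel stands in for a full SVR solver, ε is omitted because the stand-in has no ε-tube, and the data, sampling ranges, and function names are our assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def rbf_kernel(A, B, sigma):
    # K(x_i, x_j) = exp(-0.5 * ||x_i - x_j||^2 / sigma^2)
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / sigma ** 2)

def fit_predict(X_tr, y_tr, X_va, sigma, C):
    # Kernel ridge regression as a stand-in for an SVR solver;
    # the ridge term 1/C plays the role of SVR's trade-off parameter C.
    K = rbf_kernel(X_tr, X_tr, sigma)
    alpha = np.linalg.solve(K + np.eye(len(X_tr)) / C, y_tr)
    return rbf_kernel(X_va, X_tr, sigma) @ alpha

def mape(y, f):
    return float(np.mean(np.abs((y - f) / y)) * 100)

# Toy load-like series: trend plus a 12-period seasonal cycle.
t = np.arange(60, dtype=float)
y = 100 + 2 * t + 10 * np.sin(2 * np.pi * t / 12)
X = t[:, None]
X_tr, y_tr, X_va, y_va = X[:48], y[:48], X[48:], y[48:]

# Random search over (sigma, C), keeping the pair with the lowest
# validation MAPE; QTS/CQTS replace this blind sampling in the paper.
best = (np.inf, None)
for _ in range(200):
    sigma = rng.uniform(0.5, 20.0)
    C = 10 ** rng.uniform(0, 4)
    err = mape(y_va, fit_predict(X_tr, y_tr, X_va, sigma, C))
    if err < best[0]:
        best = (err, (sigma, C))
```

The metaheuristics discussed below differ from this sketch only in how the next (σ, C, ε) candidate is proposed.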

Tabu Search (TS) Algorithm and Quantum Tabu Search (QTS) Algorithm
In 1986, Glover and Laguna first developed a renowned meta-heuristic algorithm called tabu search (TS) [57,58]. TS is an iterative procedure designed to explore the solution space to find a near-optimal solution. TS starts with a random solution, or a solution obtained by a constructive and deterministic method, and evaluates the fitness function. Then, all possible neighbors of the given solution are generated and evaluated. A neighbor is a solution that can be reached from the current solution by a simple move; a new solution is generated from the neighbors of the current one. To avoid retracing previous steps, the method records recent moves in a tabu list, which keeps track of previously explored solutions and forbids the search from returning to a previously visited solution.
If the best of these neighbors is not in the tabu list, it is picked as the new current solution. One of the most important features of TS is that a new solution may be accepted even if the best neighbor solution is worse than the current one; in this way, it is possible to escape local minima. The TS algorithm has been successfully applied to many optimization problems [59-61].
However, in the TS algorithm, if a neighboring solution is not in the tabu list, TS sets it as the new current solution, but this solution is commonly worse than the current best solution. TS typically finds local minima and thus does not change the best solution for many iterations; therefore, reaching a near-global minimum takes a long time and the convergence speed is low [62]. To overcome these shortcomings of the TS algorithm, to reduce its convergence time, and to address the familiar problem of premature convergence or trapping at local optima, the qubit concept and the quantum rotation gate mechanism can be used to construct a more powerful neighborhood structure based on quantum computing concepts [65].
In the traditional TS algorithm, an initial solution is randomly generated, and its fitness function is evaluated to determine whether it should be set as the current best solution. In quantum computing, however, the initial solution is generated by using the qubit concept to assign a real value in the interval (0,1), consistent with Equation (6). A qubit is the smallest unit of information in a quantum representation, and is mathematically represented as a column vector (unit vector) in two-dimensional Hilbert space. Equation (6) describes a quantum superposition between two states; upon quantum measurement, the superposition collapses into either the "ground state" or the "excited state",
where |0⟩ represents the "ground state", |1⟩ denotes the "excited state", (c_1, c_2) ∈ ℵ, c_1 and c_2 are the probability amplitudes of the two states, and ℵ is the set of complex numbers. The most popular quantum gate, the quantum rotation gate (given by Equation (7)), is used to update the initial solution,
where (α_i, β_i) is the updated qubit and θ_i is the rotation angle.
The quantum orthogonality process (Equation (8)) is implemented to ensure that the corresponding value exceeds rand(0,1). The tabu memory is introduced and set to null before the process is executed. The QTS begins with a single vector, v_best, and terminates when it reaches the predefined number of iterations. In each iteration, a new set of vectors, V(BS), is generated in the neighborhood of v_best. For each vector in V(BS), if it is not in the tabu memory and has a higher fitness value than v_best, then v_best is updated to that vector. When the tabu memory is full, the first-in-first-out (FIFO) rule is applied to eliminate a vector from the list,
where |c_1|² and |c_2|² are the two probabilities that are required to transform the superposition between the states (as in Equation (6)) into |0⟩ and |1⟩, respectively.
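Equations (6)-(8) are not shown in this excerpt; based on the surrounding description, they presumably resemble the standard qubit, rotation-gate, and measurement expressions below (our reconstruction):

```latex
\lvert \psi \rangle = c_1 \lvert 0 \rangle + c_2 \lvert 1 \rangle,
\qquad \lvert c_1 \rvert^{2} + \lvert c_2 \rvert^{2} = 1 \tag{6}

\begin{bmatrix} \alpha_i' \\ \beta_i' \end{bmatrix}
=
\begin{bmatrix}
\cos\theta_i & -\sin\theta_i\\
\sin\theta_i & \cos\theta_i
\end{bmatrix}
\begin{bmatrix} \alpha_i \\ \beta_i \end{bmatrix} \tag{7}

x_i =
\begin{cases}
1, & \text{if } \mathrm{rand}(0,1) < \lvert c_2 \rvert^{2},\\
0, & \text{otherwise}
\end{cases} \tag{8}
```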

Chaotic Mapping Function for Quantum Tabu Search (QTS) Algorithm
As mentioned, a chaotic variable can be adopted, exploiting the chaotic phenomenon to maintain diversity in the population and prevent premature convergence. The CQTS algorithm is based on the QTS algorithm but applies the chaotic strategy when premature convergence occurs during the iterative search; at other times, the QTS algorithm is implemented as described in Section 2.2.1.
To strengthen the effect of the chaotic characteristics, many studies have used the logistic mapping function as a chaotic sequence generator. The greatest disadvantage of the logistic mapping function is that its distribution is concentrated at both ends of its range, with little density in the middle. The Cat mapping function has better chaotic distribution characteristics, so in this paper the Cat mapping function is used as the chaotic sequence generator.
The classical Cat mapping function is the two-dimensional Cat mapping function [68], shown as Equation (9), where x mod 1 = x − [x]; mod, the so-called modulo operation, extracts the fractional part of a real number x by subtracting an appropriate integer.
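As a runnable sketch, the two-dimensional Cat (Arnold) map of Equation (9), x′ = (x + y) mod 1 and y′ = (x + 2y) mod 1, can be implemented as follows; the function names are ours.

```python
def cat_map(x, y):
    # One step of the two-dimensional Cat map (Equation (9)):
    #   x' = (x + y)      mod 1
    #   y' = (x + 2 * y)  mod 1
    return (x + y) % 1.0, (x + 2.0 * y) % 1.0

def chaotic_sequence(x0, y0, n):
    # Iterate the map n times from (x0, y0) to obtain a chaotic
    # sequence of points in the unit square.
    seq = []
    x, y = x0, y0
    for _ in range(n):
        x, y = cat_map(x, y)
        seq.append((x, y))
    return seq
```

For example, `chaotic_sequence(0.1, 0.2, 5)` starts at (0.3, 0.5) and stays in [0, 1) in both coordinates.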

Implementation Steps of Chaotic Quantum Tabu Search (CQTS) Algorithm
The procedure of the hybrid CQTS algorithm with an SVR model is as follows; Figure 1 presents the corresponding flowchart.
Step 1 Initialization. Randomly generate the initial solution, P, which includes the values of the three parameters of an SVR model.
Step 2 Objective value. Compute the objective values (forecasting errors) by using the initial solution, P. The mean absolute percentage error (MAPE), given by Equation (10), is used to measure the forecasting errors,
where N is the number of forecasting periods, y_i is the actual value in period i, and f_i denotes the forecast value in period i.
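Equation (10) is the standard MAPE; a direct implementation (our naming) is:

```python
def mape(actual, forecast):
    # MAPE = (1/N) * sum(|y_i - f_i| / y_i) * 100%
    n = len(actual)
    return sum(abs(y - f) / y for y, f in zip(actual, forecast)) / n * 100.0
```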
Step 3 Generate neighbors. Using the qubit concept, Equation (6) maps the initial solution, P, to a real value in the interval (0,1), yielding P′. Then, the quantum rotation gate, given by Equation (7), is used to generate the neighbor, P″.
Step 4 Pick. Pick a new individual from the examined neighbors based on the quantum tabu condition, which is determined by whether the corresponding value of P″ exceeds rand(0,1).
Step 5 Update the best solution (objective value) and the tabu memory list. If P″ > rand(0,1), then update the solution to P* in the quantum tabu memory, v_best. Eventually, the objective value is updated as the current best solution. If the tabu memory is full, then the FIFO rule is applied to eliminate a P* from the list.
Step 6 Premature convergence test. Calculate the mean square error (MSE), given by Equation (11), to evaluate the premature convergence status [69], and set the criterion, δ,
where f_i is the current objective value and f_avg is the mean of all previous objective values, given by Equation (12). An MSE of less than δ indicates premature convergence; in that case, the Cat mapping function, Equation (9), is used to find new optima, and the new optimal value is set as the best solution.
Step 7 Stopping criteria. If the stopping threshold (for MAPE, which quantifies the forecasting accuracy) or the maximum number of iterations is reached, then training is stopped and the results are output; otherwise, the process returns to Step 3.
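The seven steps above can be sketched as a simplified, runnable loop. This is our illustrative reading of the procedure, not the authors' implementation: a toy quadratic objective stands in for SVR training, and the tabu tenure, rotation-angle range, and δ are our arbitrary choices.

```python
import math
import random
from collections import deque

random.seed(1)

def objective(p):
    # Toy stand-in for the SVR training/validation error; the real model
    # evaluates the MAPE of an SVR trained with p = (sigma, C, epsilon).
    return sum((v - 0.5) ** 2 for v in p)

def rotate(q, theta):
    # Quantum rotation gate (Eq. (7)) applied to an amplitude pair.
    c, s = math.cos(theta), math.sin(theta)
    return (c * q[0] - s * q[1], s * q[0] + c * q[1])

def cat_map(x, y):
    # Two-dimensional Cat map (Eq. (9)), used on premature convergence.
    return (x + y) % 1.0, (x + 2.0 * y) % 1.0

def cqts(dim=3, iters=200, tenure=10, delta=1e-12):
    best = [random.random() for _ in range(dim)]          # Step 1
    best_err = objective(best)                            # Step 2
    tabu = deque(maxlen=tenure)                           # FIFO tabu memory
    history = []
    for _ in range(iters):
        # Step 3: encode the solution as qubit amplitudes and rotate.
        qubits = [(math.sqrt(1.0 - v), math.sqrt(v)) for v in best]
        theta = random.uniform(-0.05, 0.05)
        cand = tuple(min(max(rotate(q, theta)[1] ** 2, 0.0), 1.0)
                     for q in qubits)
        # Step 4: tabu condition (skip recently visited vectors).
        if cand in tabu:
            continue
        tabu.append(cand)
        err = objective(cand)
        if err < best_err:                                # Step 5
            best, best_err = list(cand), err
        history.append(err)
        # Step 6: premature-convergence test via MSE of recent objectives.
        if len(history) >= 10:
            recent = history[-10:]
            avg = sum(recent) / 10.0
            mse = sum((e - avg) ** 2 for e in recent) / 10.0
            if mse < delta:
                # Chaotic jump: perturb the solution with the Cat map,
                # then take the new point as the best solution (Step 6).
                best[0], best[1] = cat_map(best[0], best[1])
                best_err = objective(best)
    return best, best_err                                 # Step 7
```

In the paper, each candidate decodes to an (σ, C, ε) triple and the objective is the rolling-forecast MAPE described in Section 3.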

Data Set of Numerical Examples
3.1.1. The First Example: Taiwan Regional Load Data

In the first example, Taiwan's regional electricity load data from published papers [56,66] are used to establish the proposed SVRCQTS forecasting model, and its forecasting performance is compared with that of the alternatives. The data set comprises 20 years (from 1981 to 2000) of load values for four regions of Taiwan. It is divided into three subsets: a training set (12 years of load data, from 1981 to 1992), a validation set (four years of data, from 1993 to 1996), and a testing set (four years of data, from 1997 to 2000). The forecasting performances are measured using MAPE (Equation (10)).
In the training stage, the rolling forecasting procedure proposed by Hong [56] is utilized to help the CQTS algorithm determine appropriate parameter values of an SVR model and, eventually, obtain more satisfactory results. In detail, the training set is further divided into two subsets, namely the fed-in subset (for example, n load data) and the fed-out subset (12 − n load data). First, the preceding n load data are used to minimize the training error under the structural risk principle, yielding the one-step-ahead (in-sample) forecast load, i.e., the (n + 1)th forecast load. Second, the next n load data, i.e., from the 2nd to the (n + 1)th data, are set as the new fed-in subset and similarly used to minimize the training error again, yielding the second one-step-ahead (in-sample) forecast load, namely the (n + 2)th forecast load. This procedure is repeated until the 12th (in-sample) forecast load is obtained, along with its training error. The training error is obtained during each iteration, the parameters are decided by the QTS algorithm, and the validation error is also calculated in the meantime. Only the adjusted parameter combination with the smallest validation and testing errors is selected as the most appropriate parameter combination. The testing data set is used only to examine the forecasting accuracy level. Eventually, the four years of electricity load demand in each region are forecast by the SVRCQTS model. The complete process is illustrated in Figure 2.
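The rolling procedure can be sketched generically as follows; `fit_forecast` is a placeholder for fitting an SVR on one window (a naive last-value forecaster stands in here), and the function names are ours.

```python
def rolling_one_step_forecasts(series, n, fit_forecast):
    # Fit on a window of n observations, forecast the next one,
    # then slide the window forward by one observation.
    forecasts = []
    for start in range(len(series) - n):
        window = series[start:start + n]
        forecasts.append(fit_forecast(window))
    return forecasts

def naive_last_value(window):
    # Stand-in forecaster; the paper trains an SVR on each window.
    return window[-1]
```

For a 12-point training set with n = 8, this yields four in-sample one-step-ahead forecasts, one per slide of the window.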

The Third Example: 2014 Global Energy Forecasting Competition (GEFCOM 2014) Load Data
The third example involves 744 h of load data from the 2014 Global Energy Forecasting Competition [70] (from 00:00 1 December 2011 to 00:00 1 January 2012). The data set is divided into three subsets: a training set (552 h of load data, from 01:00 1 December 2011 to 00:00 24 December 2011), a validation set (96 h of load data, from 01:00 24 December 2011 to 00:00 28 December 2011), and a testing set (96 h of load data, from 01:00 28 December 2011 to 00:00 1 January 2012). The relevant modeling procedures are as in the preceding two examples.
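The 552/96/96 partition is a simple contiguous split of the 744 hourly observations; as a sketch (our function name):

```python
def split_hours(series, n_train=552, n_val=96, n_test=96):
    # Contiguous train/validation/test split of the hourly load series,
    # matching the 552/96/96 partition described above.
    assert len(series) == n_train + n_val + n_test
    return (series[:n_train],
            series[n_train:n_train + n_val],
            series[n_train + n_val:])
```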

Forecasting Results and Analysis for Example 1
In Example 1, the parameter combination of the most appropriate model is determined for each region using the QTS algorithm and the CQTS algorithm, yielding nearly the smallest testing MAPE value in each case. Table 1 presents these well-determined parameters for each region.

Table 2 presents the forecasting accuracy index (MAPE) and the forecast electricity load values of each region, obtained under the same conditions using the alternative models, which include the SVRCQTS, SVRQTS, SVR with chaotic quantum particle swarm optimization (SVRCQPSO), SVR with quantum PSO (SVRQPSO), and SVR with PSO (SVRPSO) models. Clearly, according to Table 2, the SVRCQTS model is superior to the other SVR-based models. Applying quantum computing mechanisms to the TS algorithm is a feasible means of improving the search quality, and thus the forecasting accuracy, of the SVR model. The Cat mapping function has a critical role in finding an improved solution when the QTS algorithm becomes trapped at local optima or requires a long time to solve the problem of interest. For example, for the central region, the QTS algorithm finds the best solution, (σ, C, ε) = (12.0000, 1.0000 × 10^10, 0.2800), with a forecasting error (MAPE) of 1.6870%. The solution is further improved by the CQTS algorithm to (σ, C, ε) = (6.0000, 1.6000 × 10^10, 0.5500), which has a smaller forecasting error of 1.2650%. For the other regions, the QTS algorithm yields MAPE values of 1.3260% (northern region), 1.3670% (southern region), and 1.9720% (eastern region). All of these results are further improved by using the Cat mapping function (the CQTS algorithm), yielding MAPE values of 1.0870% for the northern region, 1.1720% for the southern region, and 1.5430% for the eastern region.

To verify that the proposed SVRCQTS and SVRQTS models offer improved forecasting accuracy, the Wilcoxon signed-rank test, recommended by Diebold and Mariano [71], is used. In this work, the Wilcoxon signed-rank test is performed at two significance levels, α = 0.025 and α = 0.005, using one-tailed tests. Table 3 presents the test results, which reveal that the SVRCQTS model significantly outperforms the other models for the northern and eastern regions in terms of MAPE.
Note: * The values in parentheses are the absolute errors, defined as |y_i − f_i|, where y_i is the actual value in period i and f_i denotes the forecast value in period i. SVRCQPSO: support vector regression chaotic quantum particle swarm optimization; SVRQPSO: support vector regression quantum particle swarm optimization; SVRCPSO: support vector regression chaotic particle swarm optimization; SVRPSO: support vector regression particle swarm optimization.
In Example 2, the processing steps are those of the preceding example. The parameters of an SVR model are computed using the QTS algorithm and the CQTS algorithm; the finalized models exhibit the best forecasting performance with the smallest MAPE values. Table 4 presents the well-determined parameters for the annual electricity load data. For comparison with other benchmark algorithms, Table 4 also presents the results from relevant papers on SVR-based modeling, such as those of Hong [56], who proposed the SVRCPSO and SVRPSO models, and Huang [66], who proposed the SVRCQPSO and SVRQPSO models.
Table 5 presents the MAPE values and forecasting results obtained using the alternative forecasting models. The SVRCQTS model outperforms the other models, indicating that quantum computing is an effective approach for improving the performance of an SVR-based model, and that the Cat mapping function is very effective for solving the problem of premature convergence while also saving time. Clearly, the QTS algorithm yields (σ, C, ε) = (5.0000, 1.3000 × 10^11, 0.630) with a MAPE of 1.3210%, whereas the CQTS algorithm provides a better solution, (σ, C, ε) = (6.0000, 1.8000 × 10^11, 0.340), with a MAPE of 1.1540%. Figure 3 presents the actual values and the forecast values obtained using the various models.
Finally, Table 8 presents the results of the Wilcoxon signed-rank test for Example 3. They indicate that the proposed SVRCQTS model almost always achieves statistical significance in forecasting performance at the significance level α = 0.05; therefore, the proposed SVRCQTS model significantly outperforms the other alternatives at α = 0.05.

Conclusions
This work proposes a hybrid model that incorporates an SVR model, the chaotic Cat mapping function, and the QTS algorithm for forecasting electricity load demand. Experimental results reveal that the proposed model exhibits significantly better forecasting performance than other SVR-based forecasting models. In this paper, quantum computing mechanisms are utilized to improve the intensification and diversification of the simple TS algorithm, and thereby to improve its forecasting accuracy. Chaotic Cat mapping is also used to help prevent the QTS algorithm from becoming trapped at local optima during the modeling process. This work marks a favorable beginning for the hybridization of quantum computing mechanisms with chaotic mechanisms to expand the search space, which is typically limited by Newtonian dynamics.


3.1.2. The Second Example: Taiwan Annual Load Data

In the second example, Taiwan's annual electricity load data from published papers [56,66] are used. The data set comprises 59 years of load data (from 1945 to 2003), which are divided into three subsets: a training set (40 years of load data, from 1945 to 1984), a validation set (10 years of load data, from 1985 to 1994), and a testing set (nine years of load data, from 1995 to 2003).

Figure 3 .
Figure 3. Actual values and forecast values of the SVRCQTS, SVRQTS, and other models (Example 2). SVRCQTS: support vector regression chaotic quantum tabu search; SVRQTS: support vector regression quantum tabu search.

Table 1 .
Parameter determination of the SVRCQTS and SVRQTS models (Example 1). SVRCQTS: support vector regression chaotic quantum tabu search; SVRQTS: support vector regression quantum tabu search.
a denotes that the SVRCQTS model significantly outperforms other alternative models.

Table 8 .
Wilcoxon signed-rank test (Example 3). Note: a denotes that the SVRCQTS model significantly outperforms other alternative models.