Sustainability
  • Editor’s Choice
  • Article
  • Open Access

21 January 2020

A Compact Pigeon-Inspired Optimization for Maximum Short-Term Generation Mode in Cascade Hydroelectric Power Station

1 College of Computer Science and Engineering, Shandong University of Science and Technology, Qingdao 266590, China
2 College of Science and Engineering, Flinders University, 1284 South Road, Clovelly Park, SA 5042, Australia
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Renewable Energies for Sustainable Development

Abstract

Pigeon-inspired optimization (PIO) is a recent swarm intelligence algorithm that simulates the homing behavior of pigeons. In this paper, a new variant called compact pigeon-inspired optimization (CPIO) is proposed. The challenge addressed is not only combining the compact technique with the PIO operators, but also running the algorithm within the constraints of existing devices. The proposed algorithm aims to solve complex scientific and industrial problems involving large amounts of data, including classical optimization problems, while finding optimal solutions in large solution spaces with limited hardware resources. A real-valued prototype vector performs the probabilistic and statistical calculations and generates the candidate solutions of the CPIO algorithm. CPIO was evaluated on a set of continuous multimodal benchmark functions and on the maximum short-term generation model of a cascade hydropower station. The experimental results show that the proposed algorithm produces competitive results on devices with limited memory.

1. Introduction

The metaheuristic algorithm [1] has emerged as a very promising tool for solving complex optimization problems. Original pigeon-inspired optimization (OPIO) is a recent metaheuristic search algorithm [2] that simulates the homing behavior of pigeons. Preliminary studies indicate that it is a very promising optimizer and can outperform existing algorithms [3]. OPIO maintains a population of pigeons as candidate solutions within given bounds and optimizes the problem by moving the candidate solutions toward the best solutions found so far, according to a given measure of quality. The general steps of the algorithm are described below.
OPIO was designed for continuous solution spaces, and many variants have been proposed in recent years to handle both continuous and discrete problems. An improved Gaussian pigeon-inspired optimization algorithm preserves diversity in the early stages of evolution to avoid premature convergence; it shows excellent performance in global optimization and is effective for high-dimensional multimodal and non-convex problems. Multi-objective pigeon-inspired optimization (MPIO) has been used to design the parameters of brushless direct current motors [4]. The multimodal multi-objective pigeon-inspired optimization algorithm (MMPIO) was proposed to solve multimodal multi-objective optimization problems [5,6].
With the continuous development of metaheuristic algorithms, swarm intelligence optimization has become an emerging technology for solving many engineering problems, and metaheuristic algorithms also perform well on wireless sensor networks [7,8]. Since 2000, many scholars have designed ant colony optimization algorithms, particle swarm optimization (PSO) [9], grey wolf optimization (GWO) [10,11], the bat-inspired algorithm (BA) [12,13], the flower pollination algorithm (FPA) [14,15], cat swarm optimization (CSO) [16,17], differential evolution (DE) [18,19], the quasi-affine transformation evolution algorithm (QUATRE) [20,21], genetic algorithms (GA) [22,23], and others. These algorithms simulate adaptive mechanisms that are widely observed in nature. Among them, OPIO was proposed by Duan and other scholars in 2014 [24]; it is an intelligent optimization algorithm based on the homing behavior of pigeons. Although it was introduced only recently, the algorithm has been improved and applied in many studies. Because it adapts well and achieves high calculation accuracy, it has been used for optimization in fields such as unmanned aerial vehicle (UAV) formation [25], control parameter optimization [26], and image processing [27].
A country’s development and social progress are inseparable from its demand for energy [28,29]. Electric energy is a very flexible form of energy that is increasingly obtained from renewable sources such as the sun. It can be converted into heat, chemical energy, and mechanical energy, and it is convenient to use, easy to control, safe, and clean. More importantly, much of the development of today’s society relies on the development of science and technology. The main ways to generate electricity are thermal, wind, hydro, and nuclear power generation. Hydropower is a renewable energy source that can continuously generate and deliver electricity. Its main advantage is that it eliminates fuel cost: the operating cost of a hydropower station is not affected by rising prices of fossil fuels such as oil, natural gas, and coal, because hydroelectric power stations do not require fuel. The economic life of a hydroelectric power station is also longer than that of a fuel-fired power plant. In addition, hydropower stations are mostly operated automatically and have low operating costs [30,31].
The short-term optimal dispatching of cascade hydropower stations seeks the maximum value of an objective function over one or several days while satisfying the various constraints of the cascade hydropower stations [32]. In general, three mathematical models are used: the short-term maximum power generation model, the short-term minimum water consumption model [33], and the short-term maximum peak power model [34]. These three models have the same structure, i.e., a nonlinear multi-stage optimization problem under certain constraints. This paper analyzes only the model that maximizes the short-term power generation.
In this paper, we combine the compact technique with pigeon-inspired optimization to propose the compact pigeon-inspired optimization (CPIO) algorithm. The proposed CPIO not only improves time efficiency but also reduces the required hardware memory, giving the algorithm very good space complexity: only one particle is updated, whereas the original algorithm updates an entire population. Our goal is to reduce memory usage and simplify parameter selection when optimizing the short-term power generation of cascade hydropower stations. To this end, we introduce a sampling probability function that controls the perturbation vector and compare the result with other compact algorithms. The probability function is used to search for the optimal value of the CPIO algorithm, and a real-valued prototype vector generates each candidate solution. The algorithm has been tested on multiple continuous multimodal functions as well as on the short-term power generation of cascade hydropower stations [35,36,37].

3. Pigeon-Inspired Optimization

Without loss of generality, the minimization of an objective function $f(x)$ is discussed in this paper, where $x$ is the vector of the $n$ design variables defined over the domain $D$ in the decision space.
Pigeon-inspired optimization is a metaheuristic algorithm inspired by the homing behavior of pigeons and is widely used for continuous and discrete optimization problems; this article mainly addresses continuous problems. According to the literature, a group of pigeons moves in the decision space $D$ according to update rules in order to find the optimal value. More formally, to obtain a satisfactory value of the objective function $f(x)$, the population of pigeons is randomly scattered in a predefined search space. The objective function judges the quality of a solution based on the position of each pigeon. At any stage $t$, the $k$-th pigeon has its own position vector $x_k^t$ and velocity vector $v_k^t$. For each pigeon, the best objective value found so far and the best position visited are stored, and the global optimal solution is continuously updated. To move from step $t$ to step $t+1$, the more competitive solution is kept, and each particle is perturbed according to the following formula:
$v_k^{t+1} = v_k^t \, e^{-Rt} + \phi_1 (x_{gbest} - x_k)$  (9)
and:
$x_k^{t+1} = \phi_2 x_k^t + \phi_3 v_k^{t+1}$  (10)
As the formulas above suggest, $x_k^t$ is the current position of the $k$-th pigeon, $x_{gbest}$ is the best position ever found by the entire flock, and $v_k^t$ is a perturbation vector, namely the velocity. $R$ is the map and compass factor, $\phi_1$ is a random value limited to $(0, 1)$, and $\phi_2$, $\phi_3$ are two weight factors that can be constants or variables. This stage belongs to the map and compass operator. When the pigeons approach the destination, their dependence on the sun and the magnetic field decreases, and the landmark operator is entered.
$x_{center}^{t} = \dfrac{\sum_{k=1}^{N^t} x_k^t F(x_k^t)}{N^t \sum_{k=1}^{N^t} F(x_k^t)}$  (11)
From here on, the landmark operator is entered. In this operator, the pigeons continue to iterate by following the pigeons in the flock that know the route to the landmark. The purpose of the formula above is to find the pigeons with high fitness values in the flock; such a pigeon is considered to know the route, and the other pigeons iterate toward it. $N^t$ is the population size at the $t$-th iteration, and $F(x_k^t)$ is the fitness value of the $k$-th pigeon's position.
$N^{t} = \dfrac{N^{t-1}}{2}$  (12)
The purpose of this operation is to halve the flock and discard the pigeons that do not know the route, preventing them from misleading the population into a local optimum.
$x_i^{t+1} = x_i^t + \phi_4 (x_{center}^t - x_i^t)$  (13)
In the formula, $\phi_4$ is a random value generated from $(0, 1)$. In this operation, all pigeons that do not know the route iterate toward the pigeons that do.
$F(x_k^t) = \dfrac{1}{Fitness(x_k^t) + \chi}$  for the minimization problem  (14)
$F(x_k^t) = Fitness(x_k^t)$  for the maximization problem  (15)
For the minimization problem, OPIO uses Equation (14) to calculate $F(x_k^t)$ and identify the pigeons that know the route; for the maximization problem, it uses Equation (15). In Equation (14), $\chi$ is a non-zero constant whose purpose is to prevent the denominator from becoming zero.
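To make the two operators concrete, the following minimal Python sketch assembles Equations (9)–(14) into a complete OPIO run for a minimization problem. The sphere test function, the bounds, the population size, the choice $\phi_2 = \phi_3 = 1$ in Equation (10), and all identifiers are illustrative assumptions rather than the authors' reference implementation.

import numpy as np

def opio_minimize(f, dim=10, n_pigeons=30, max_t1=300, max_t2=200,
                  R=0.2, lower=-10.0, upper=10.0, chi=1e-12, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.uniform(lower, upper, (n_pigeons, dim))        # pigeon positions
    v = np.zeros((n_pigeons, dim))                         # pigeon velocities
    fit = np.array([f(p) for p in x])
    gbest, gbest_f = x[fit.argmin()].copy(), fit.min()

    # Stage 1: map and compass operator, Equations (9)-(10) with phi2 = phi3 = 1
    for t in range(1, max_t1 + 1):
        phi1 = rng.random((n_pigeons, 1))
        v = v * np.exp(-R * t) + phi1 * (gbest - x)        # Equation (9)
        x = np.clip(x + v, lower, upper)                   # Equation (10)
        fit = np.array([f(p) for p in x])
        if fit.min() < gbest_f:
            gbest, gbest_f = x[fit.argmin()].copy(), fit.min()

    # Stage 2: landmark operator, Equations (11)-(14)
    n = n_pigeons
    for t in range(1, max_t2 + 1):
        order = fit.argsort()
        n = max(n // 2, 1)                                 # Equation (12): halve the flock
        x, fit = x[order[:n]], fit[order[:n]]              # keep the better half
        F = 1.0 / (fit + chi)                              # Equation (14), minimization
        center = (x * F[:, None]).sum(axis=0) / (n * F.sum())     # Equation (11)
        phi4 = rng.random((n, 1))
        x = np.clip(x + phi4 * (center - x), lower, upper)         # Equation (13)
        fit = np.array([f(p) for p in x])
        if fit.min() < gbest_f:
            gbest, gbest_f = x[fit.argmin()].copy(), fit.min()
    return gbest, gbest_f

# Example usage on the sphere function
best_x, best_f = opio_minimize(lambda p: float(np.sum(p ** 2)))
print(best_f)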

4. Compact Pigeon-Inspired Optimization

The compact approach replicates the behavior of a population-based algorithm by building a probabilistic model of the population of solutions: the actual population is encoded as a probability representation, its virtual counterpart. Compact pigeon-inspired optimization is built on the pigeon-inspired optimization framework, and starting from the OPIO algorithm, the concept and design of the CPIO algorithm are explored in more detail below.
The purpose of CPIO is to simulate the behavior of the underlying population-based OPIO with a much smaller amount of memory. By constructing a dedicated probabilistic data structure, the actual population of OPIO is replaced by the perturbation vector (PV), a probabilistic model of the population of solutions.
$PV^t = [\mu^t, \sigma^t]$  (16)
In the formula, $\mu$ and $\sigma$ are the mean and standard deviation of the perturbation vector $PV$, and $t$ is the current iteration number. Each pair $(\mu[k], \sigma[k])$ defines a truncated Gaussian probability density function (PDF) [38] on the normalized interval $[-1, 1]$. The height of the PDF is normalized so that its area equals 1; when $\sigma$ is large, the truncated Gaussian is nearly flat and approximates a uniform distribution over the interval.
The virtual population is initialized as follows. For each design variable, $\mu = 0$ and $\sigma = \lambda$, where $\lambda$ is a large constant ($\lambda = 10$). This initialization yields a wide truncated normal distribution that is approximately uniform.
The mechanism for sampling a design variable $x_k^t$ of a generic candidate solution $x$ from $PV$ is not a simple process and requires some explanation. Each design variable indexed by $k$ is associated with a truncated Gaussian PDF with mean $\mu[k]$ and standard deviation $\sigma[k]$, described by the following formula:
$PDF(x) = \dfrac{e^{-\frac{(x-\mu[k])^2}{2\sigma[k]^2}} \sqrt{\frac{2}{\pi}}}{\sigma[k] \left( \mathrm{erf}\!\left(\frac{\mu[k]+1}{\sqrt{2}\,\sigma[k]}\right) - \mathrm{erf}\!\left(\frac{\mu[k]-1}{\sqrt{2}\,\sigma[k]}\right) \right)}$  (17)
$PDF$ is the probability distribution of $PV$, a truncated Gaussian parameterized by $\mu$ and $\sigma$. New candidate solutions are generated by iteratively biasing the sampling toward the promising region of the optimal solution, and every component of the probability vector is learned from previous generations. $\mathrm{erf}$ is the error function [39]. The PDF is converted into the corresponding cumulative distribution function (CDF) by means of Chebyshev polynomials [39], and the codomain of the CDF is $[0, 1]$. The CDF gives, for a real-valued random variable with this distribution, the probability that the obtained value is less than or equal to $x_k^t$.
The relationship between the CDF and the PDF can be written as $CDF(x) = \int_{-1}^{x} PDF(s)\, ds$, and the PV operation samples the design variable $x_k^t$ by generating a random value in $(0, 1)$ and mapping it back through the inverse CDF.
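As an illustration of this sampling step, the short Python sketch below draws a candidate solution from the truncated Gaussian defined by $PV$ using inverse-transform sampling. It uses scipy's erf/erfinv rather than the Chebyshev-polynomial construction mentioned above, and the function and variable names are assumptions made for this example only.

import numpy as np
from scipy.special import erf, erfinv

def sample_from_pv(mu, sigma, rng):
    """Draw one candidate solution x in [-1, 1]^dim from PV = [mu, sigma]."""
    u = rng.random(mu.shape)                       # uniform random numbers in (0, 1)
    root2 = np.sqrt(2.0)
    lo = erf((-1.0 - mu) / (root2 * sigma))        # erf value at the lower bound -1
    hi = erf((1.0 - mu) / (root2 * sigma))         # erf value at the upper bound +1
    # Invert the truncated-Gaussian CDF: map u back through the inverse error function
    x = mu + root2 * sigma * erfinv(lo + u * (hi - lo))
    return np.clip(x, -1.0, 1.0)

# PV initialization as in the text: mu = 0 and sigma = lambda = 10 for every variable
dim = 5
mu = np.zeros(dim)
sigma = np.full(dim, 10.0)
print(sample_from_pv(mu, sigma, np.random.default_rng(0)))

With $\sigma = 10$ the sampled values are spread almost uniformly over $[-1, 1]$, which is exactly the behavior the wide initialization is intended to produce.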
During the iterations of the compact algorithm, a comparison between two candidate solutions is used to identify the better individual. The two candidates are a solution newly sampled from the $PV$ operation and the current global best. The one whose fitness is better is called the winner, and the other is called the loser. The winner and the loser are determined by evaluating the objective function: each newly generated candidate solution is compared with the current global best solution to produce a new winner and loser. The $PV$ is then updated, moving each element of $\mu^t$ and $\sigma^t$ to $\mu^{t+1}$ and $\sigma^{t+1}$ according to the following rules [40]:
$\mu_i^{t+1} = \mu_i^t + \dfrac{winner_i - loser_i}{N}$  (18)
where $N$ is the virtual population size. The update rule for each element of $\sigma$ is given in the formula below.
$\sigma_i^{t+1} = \sqrt{(\sigma_i^t)^2 + (\mu_i^t)^2 - (\mu_i^{t+1})^2 + \dfrac{winner_i^2 - loser_i^2}{N}}$  (19)
The mathematical derivation of Equations (18) and (19), together with the persistent and non-persistent elitism structures of the real-valued compact genetic algorithm (rcGA), can be found in [41]. The virtual population size $N$ is only a parameter of the compact algorithm, which is not a true population-based algorithm; in real-valued compact algorithms, $N$ mainly affects the convergence speed.
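The sketch below illustrates the winner/loser competition and the update of Equations (18) and (19) for a minimization problem; the clamping of the variance against negative round-off and all identifiers are assumptions of this illustration, not part of the original description.

import numpy as np

def compete(x_new, x_best, f):
    """Return (winner, loser) according to the objective f (minimization)."""
    return (x_new, x_best) if f(x_new) < f(x_best) else (x_best, x_new)

def update_pv(mu, sigma, winner, loser, n_virtual):
    """One PV update: Equation (18) for mu and Equation (19) for sigma."""
    mu_new = mu + (winner - loser) / n_virtual
    var_new = sigma ** 2 + mu ** 2 - mu_new ** 2 + (winner ** 2 - loser ** 2) / n_virtual
    sigma_new = np.sqrt(np.maximum(var_new, 1e-12))   # guard against negative round-off
    return mu_new, sigma_new

# One update step with a virtual population of N = 120 pigeons
rng = np.random.default_rng(1)
mu, sigma = np.zeros(3), np.full(3, 10.0)
candidate, best = rng.uniform(-1, 1, 3), rng.uniform(-1, 1, 3)
winner, loser = compete(candidate, best, lambda p: float(np.sum(p ** 2)))
mu, sigma = update_pv(mu, sigma, winner, loser, n_virtual=120)
print(mu, sigma)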
In general, the probabilistic model used by compact OPIO represents the whole set of solutions of the pigeon flock; it stores neither the position nor the velocity information of a real population, only the newly generated candidate solutions. Therefore, only a limited amount of storage is required, which saves considerable time and hardware resources when optimizing the short-term power generation model of the cascade hydropower station.
CPIO uses a perturbation vector $PV$ with the same structure as in Equation (16). At the beginning of the optimization, following the initialization described for Equation (17), $PV$ is set to $\mu[k] = 0$ and $\sigma[k] = \lambda$ with $\lambda = 10$ for every $k$, each design variable is restricted to the continuous interval $[-1, 1]$, and the position $x$ and the velocity $v$ are randomly initialized within this range.
The velocity and position vectors are updated by a slightly revised pigeon-inspired rule:
$v_k^{t+1} = \omega_1 e^{-Rt} v_k^t + \omega_2 (x_{gbest} - x_k)$  (20)
and:
$x_k^{t+1} = \xi_1 x_k^t + \xi_2 v_k^{t+1}$  (21)
$\omega_1$ is an inertia weight, $\omega_2$ is a random variable between 0 and 1, and $\xi_1$ and $\xi_2$ are weighting factors that control the position update of the pigeon.
The speed update (Equation (20)) and the position update (Equation (21)) are similar to those of the OPIO algorithm. In the original version, pigeon $k$ is closely tied to the real population of size $N$; in the compact version there is no real population, and the single pigeon is only loosely coupled to the virtual population. Compact OPIO therefore updates just one pigeon with these formulas, so each update produces one solution and saves a large amount of memory.
In the landmark operator, the second stage of CPIO, the original algorithm uses Equation (11) to determine, from the fitness of each pigeon's position, the pigeons that know the route, takes them as the center point, and continues to update. Since CPIO has only one particle to update, this selection is not applicable. In this paper, a center point suitable for CPIO is proposed: a virtual center position is established from the historical fitness values of the pigeon, and the guiding pigeon is updated toward it. The number of historical points used is determined by the size of the virtual population.
$x_{center}^{t+1} = \dfrac{\sum_{i=l-N+1}^{l} F(x_k^i)}{N}$  (22)
Here $l$ is the number of iterations so far and $N$ is the virtual population size. According to Equation (22), the virtual center point is obtained from the historical values of the pigeon, and the pigeon continues to iterate toward it.
$x_k^{t+1} = x_k^t + \omega_3 (x_{center}^t - x_k^t)$  (23)
It is easy to see that CPIO saves a lot of memory space, so this approach can be applied to other variations of OPIO.
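The following sketch illustrates one possible reading of this second stage, in which the virtual center of Equation (22) is taken as the average of a sliding window over the most recent historical points and the single pigeon then moves toward it as in Equation (23). The window handling, the window size, and all names are assumptions made for illustration only.

import numpy as np
from collections import deque

def second_stage_step(x, history, omega3):
    """Move the single CPIO pigeon toward the virtual center point (Equation (23))."""
    center = np.mean(np.asarray(list(history)), axis=0)   # one reading of Equation (22)
    return x + omega3 * (center - x)

# Toy usage: keep a sliding window of at most N historical points
N = 120
history = deque(maxlen=N)
rng = np.random.default_rng(2)
x = rng.uniform(-1.0, 1.0, 4)
for _ in range(5):
    history.append(x.copy())
    x = second_stage_step(x, history, omega3=rng.random())
print(x)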

5. Numerical Results

The CPIO was tested on 29 test functions taken from [42]. Each test function is described in detail in Table 1 and Table 2; the functions differ in search range and expression.
Table 1. Details of 29 test functions.
Table 2. Details of 29 test functions.
In Equations (20)–(23) and Algorithm 1, the parameters of the proposed CPIO are $N = 120$ and $R = 0.2$. These values follow [43] with slight changes; more specifically, to make CPIO work better, the virtual population size is modeled on the population size used by OPIO. In this article, CPIO is compared with OPIO. On each test function, CPIO is run 30 times and the results are averaged, and the minimum value obtained by CPIO is reported.
Algorithm 1 Compact pigeon-inspired optimization (CPIO) pseudo-code.
1: Input: map and compass factor R, the maximum number of iterations of the first stage MaxDT1, the maximum number of iterations of the second stage MaxDT2, and the dimension dim;
2: for k = 1 to dim do
3:     // PV operation initialization; N is the total number of pigeons
4:     initialize μ[k] = 0
5:     initialize σ[k] = 10
6: end for
7: // Global best initialization
8: Generate the global best solution x_gbest by means of the perturbation vector PV
9: // Local best solution initialization
10: Generate the local best solution x_k by PV
11: for t = 1 to MaxDT1 do
12:     x_k^t = generate from the PV operation
13:     // Update position and velocity
14:     v_k^{t+1} = ω1 · e^{-Rt} · v_k^t + ω2 · (x_gbest - x_k)
15:     x_k^{t+1} = ξ1 · x_k^t + ξ2 · v_k^{t+1}
16:     // Best selection
17:     [winner, loser] = compete(x_k^{t+1}, x_gbest)
18:     // Update the PV operation
19:     for k = 1 : dim do
20:         μ^{t+1}[k] = μ^t[k] + (winner[k] - loser[k]) / N
21:         σ^{t+1}[k] = sqrt( (σ^t[k])^2 + (μ^t[k])^2 - (μ^{t+1}[k])^2 + (winner[k]^2 - loser[k]^2) / N )
22:     end for
23: end for
24: for t = MaxDT1 + 1 to MaxDT1 + MaxDT2 do
25:     // Enter the second stage
26:     // Select the virtual center pigeon from the historical best points
27:     x_k^{t+1} = x_k^t + ω3 · (x_center^t - x_k^t)
28:     // Best selection
29:     [winner, loser] = compete(x_k^{t+1}, x_gbest)
30:     // Update the PV operation
31:     for k = 1 : dim do
32:         μ^{t+1}[k] = μ^t[k] + (winner[k] - loser[k]) / N
33:         σ^{t+1}[k] = sqrt( (σ^t[k])^2 + (μ^t[k])^2 - (μ^{t+1}[k])^2 + (winner[k]^2 - loser[k]^2) / N )
34:     end for
35: end for
When initializing both CPIO and the original pigeon-inspired optimization (OPIO), the map and compass factor R is set to 0.2, and the two algorithms are then compared. The quality of the solutions and the running time of CPIO and OPIO are compared below. The reported results are the averages of 30 runs. Both algorithms run for 500 iterations, 300 in the first phase and 200 in the second phase.
In Table 3, CPIO performs better than OPIO on many test functions, and most of its values are good. In terms of time cost, CPIO is clearly much faster than OPIO; on several functions, OPIO takes more than one hundred times longer.
Table 3. Comparison, evaluation and speed of quality performance between CPIO and OPIO.
Comparing the two algorithms, the running time of CPIO is much lower than that of the original algorithm because the number of individuals used during the iterations differs. The new algorithm keeps iterating with a single individual, constantly adjusting the probability distribution according to the path already explored, so that new samples are more likely to fall where the function values are better. However, this approach also has a drawback: the search of a single particle is highly random, so it easily falls into a local optimum. Moreover, when the dimension is small and the optimum is relatively easy to reach, the advantages of the algorithm are not obvious. Thanks to this single-individual characteristic, the new algorithm saves time and reduces the time complexity.
Figure 2 shows the convergence trends of CPIO and OPIO; the best score obtained so far refers to the best value found by the algorithm during the iterations. Although OPIO converges quickly and needs few generations to reach its optimum, the optimal value found by CPIO is better than or nearly equal to that of OPIO. CPIO uses one particle for updating and iteration, while OPIO uses an entire population, so CPIO's search capability is far weaker than OPIO's; nevertheless, CPIO saves a large amount of memory and time while searching for the optimum.
Figure 2. Compact pigeon-inspired optimization (CPIO) and original pigeon-inspired optimization (OPIO) performance in test functions. (a) Ackley; (b) Crossit; (c) Drop; (d) Griewank.
For the four selected functions, Figure 3 shows the running time over 30 runs. In general, the time spent by CPIO and OPIO changes little between runs, but comparing the two algorithms, CPIO clearly runs much faster than OPIO.
Figure 3. CPIO and OPIO performance in test functions. (a) Ackley; (b) Crossit; (c) Drop; (d) Griewank.
Table 4 compares the memory variables of CPIO and OPIO, which makes the implementation cost of each algorithm easy to assess. The number of variables used by each algorithm is counted from the equations used in one optimization iteration: in each iteration CPIO uses Equations (16)–(23), whereas OPIO uses Equations (9)–(13). Table 4 shows that, for the same computation, CPIO uses less memory than OPIO.
Table 4. The space complexity of the two algorithms.
As can be seen from Table 4, the actual population size of OPIO is N, whereas the actual population size of CPIO is 1 with a virtual population of size N. For the same number of iterations t, the memory occupied by the variables of OPIO and CPIO grows as 4 × t × N and 8 × t, respectively, so the memory occupancy of OPIO is much larger than that of CPIO.
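As an illustration, substituting the parameter values used in this section (a virtual population of N = 120 and t = 500 iterations) into these expressions gives 4 × 500 × 120 = 240,000 memory units for OPIO versus 8 × 500 = 4,000 for CPIO, a reduction by a factor of 60. These numbers simply evaluate the expressions above with the stated settings.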
Figure 4 shows the results of the proposed algorithm and the other three metaheuristics. According to Table 5, the convergence trend and the optimal values of CPIO are essentially better than those of the other three algorithms, showing superior performance. Table 5 compares CPIO with OPIO and other algorithms, such as CPSO and PSO, on the 29 test functions. Because some of the convergence curves are not visually distinctive, four representative plots are extracted for display.
Figure 4. CPIO and three other meta-heuristic algorithms for testing function performance. (a) Ackley; (b) Crossit; (c) Drop; (d) Griewank.
Table 5. The optimal value and time cost of the four algorithms.

6. Experiments of Short-Term Power Generation Model for Cascade Hydropower Stations

Wanjiazhai Water Conservancy Project: the Wanjiazhai Water Conservancy Project is located in the canyon of the Tuoketuo to Longkou section of the north main stream of the Yellow River. It is the first of the eight cascades planned for the development of the middle reaches of the Yellow River and also supplies the Shanxi Yellow River Diversion Project. The left bank of the project belongs to Pianguan County, Shanxi Province, and the right bank to the Zhungeer Banner of the Inner Mongolia Autonomous Region. The dam site controls a drainage area of 395,000 square kilometers, with a total storage capacity of 896 million cubic meters and a regulating storage capacity of 445 million cubic meters. The project provides comprehensive benefits such as water supply, power generation, flood control, and ice control.
Longkou Hydropower Station is located at the junction of Hequ County, Shanxi Province and the Zhungeer Banner, Inner Mongolia, 25.6 km downstream of the Wanjiazhai Water Control Project and 70 km upstream of the Tianqiao Hydropower Station. It lies in the regional center of the energy and chemical bases of Shanxi Province and the Inner Mongolia Autonomous Region and controls a drainage area of 397,406 square kilometers.
Table 6 shows the monthly inflow values of the two cascade hydropower stations in wet, flat, and dry water years; ASP denotes the annual scheduling period.
Table 6. Cascade hydropower station monthly water supply.
The short-term power generation model of the cascade hydropower stations was introduced above, and Figure 5 shows the main flow of the algorithm. In this paper, the three periods of the two cascade hydropower stations are scheduled and modeled by Equations (1)–(8), with the objective of maximizing the total power generation of the two stations. As shown in Figure 6, at every stage CPIO obtains the largest scheduling output for the two cascade hydropower stations, and its total power generation is also the highest. With CPIO dispatching the two stations, the total power generation is 3.968 × 10^17 kWh in the wet year, 3.108 × 10^17 kWh in the flat (median) water year, and 2.396 × 10^17 kWh in the dry year.
Figure 5. The main process of optimizing hydropower station.
Figure 6. Comparison of four meta-heuristic algorithms in cascade hydropower stations: (a) Wet water years schedule; (b) flat water years schedule; and (c) dry water years schedule.

7. Conclusions

A novel optimization approach called compact pigeon-inspired optimization (CPIO) is proposed. The proposed CPIO was tested on 29 classical test functions to demonstrate the usefulness of the optimization method. The compact technique is successfully used in the pigeon-inspired optimization algorithm to reduce memory usage. The proposed CPIO was also applied to cascade hydroelectric power generation, and the simulation results show that CPIO can reach better results than some existing algorithms for the cascade hydroelectric power station problem.

Author Contributions

Conceptualization, J.-S.P. and W.-M.Z.; Data curation, A.-Q.T. and H.C.; Formal analysis, A.-Q.T., S.-C.C., J.-S.P., H.C., and W.-M.Z.; Investigation, A.-Q.T.; Methodology, A.-Q.T., S.-C.C., J.-S.P., H.C., and W.-M.Z.; Software, A.-Q.T.; Validation, J.-S.P.; Visualization, A.-Q.T. and S.-C.C.; Writing—original draft, A.-Q.T.; and Writing—review and editing, S.-C.C. and J.-S.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

We wish to confirm that there are no known conflicts of interest and that there has been no significant financial support for this work that could have influenced its outcome. We confirm that the manuscript has been read and approved by all named authors and that there are no other persons who satisfied the criteria for authorship but are not listed.

References

  1. Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Frome, UK, 2010; pp. 15–35. ISBN 978-1-905986-28-6. [Google Scholar]
  2. Jia, Z.; Sahmoudi, M. A type of collective detection scheme with improved pigeon-inspired optimization. Int. J. Intell. Comput. Cybern. 2016, 9, 105–123. [Google Scholar] [CrossRef]
  3. Chen, S.; Duan, H. Fast image matching via multi-scale Gaussian mutation pigeon-inspired optimization for low cost quadrotor. Aircr. Eng. Aerosp. Technol. 2017, 89, 777–790. [Google Scholar] [CrossRef]
  4. Qiu, H.; Duan, H. Multi-objective pigeon-inspired optimization for brushless direct current motor parameter design. Sci. China Technol. Sci. 2015, 58, 1915–1923. [Google Scholar] [CrossRef]
  5. Deng, X.W.; Shi, Y.Q.; Li, S.L.; Li, W.; Deng, S.W. Multi-objective pigeon-inspired optimization localization algorithm for large-scale agricultural sensor network. J. Huaihua Univ. 2017, 36, 37–40. [Google Scholar]
  6. Fu, X.; Chan, F.T.; Niu, B.; Chung, N.S.; Qu, T. A multi-objective pigeon inspired optimization algorithm for fuzzy production scheduling problem considering mould maintenance. Sci. China Inf. Sci. 2019, 62, 70202. [Google Scholar] [CrossRef]
  7. Pan, J.S.; Kong, L.; Sung, T.W.; Tsai, P.W.; Snášel, V. α-Fraction first strategy for hierarchical model in wireless sensor networks. J. Internet Technol. 2018, 19, 1717–1726. [Google Scholar]
  8. Wang, J.; Gao, Y.; Liu, W.; Sangaiah, A.K.; Kim, H.J. An intelligent data gathering schema with data fusion supported for mobile sink in wireless sensor networks. Int. J. Distrib. Sens. Netw. 2019, 15, 1550147719839581. [Google Scholar] [CrossRef]
  9. Wang, L.; Singh, C. Environmental/economic power dispatch using a fuzzified multi-objective particle swarm optimization algorithm. Electr. Power Syst. Res. 2007, 77, 1654–1664. [Google Scholar] [CrossRef]
  10. Hu, P.; Pan, J.S.; Chu, S.C.; Chai, Q.W.; Liu, T.; Li, Z.C. New Hybrid Algorithms for Prediction of Daily Load of Power Network. Appl. Sci. 2019, 9, 4514. [Google Scholar] [CrossRef]
  11. Emary, E.; Zawbaa, H.M.; Grosan, C.; Hassenian, A.E. Feature subset selection approach by gray-wolf optimization. In Afro-European Conference for Industrial Advancement; Springer: Berlin/Heidelberg, Germany, 2015; pp. 1–13. [Google Scholar]
  12. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  13. Dao, T.K.; Pan, T.S.; Pan, J.S. Parallel bat algorithm for optimizing makespan in job shop scheduling problems. J. Intell. Manuf. 2018, 29, 451–462. [Google Scholar] [CrossRef]
  14. Yang, X.S. Flower pollination algorithm for global optimization. In Proceedings of the International Conference on Unconventional Computation and Natural Computation, Orléans, France, 3–7 September 2012; Springer: Berlin/Heidelberg, Germany, 2012; pp. 240–249. [Google Scholar]
  15. Nguyen, T.T.; Pan, J.S.; Dao, T.K. An Improved Flower Pollination Algorithm for Optimizing Layouts of Nodes in Wireless Sensor Network. IEEE Access 2019, 7, 75985–75998. [Google Scholar] [CrossRef]
  16. Chu, S.C.; Tsai, P.W.; Pan, J.S. Cat swarm optimization. In Proceedings of the Pacific Rim International Conference on Artificial Intelligence, Guilin, China, 7–11 August 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 854–858. [Google Scholar]
  17. Kong, L.; Pan, J.S.; Tsai, P.W.; Vaclav, S.; Ho, J.H. A balanced power consumption algorithm based on enhanced parallel cat swarm optimization for wireless sensor network. Int. J. Distrib. Sens. Networks 2015, 11, 729680. [Google Scholar] [CrossRef]
  18. Qin, A.K.; Huang, V.L.; Suganthan, P.N. Differential evolution algorithm with strategy adaptation for global numerical optimization. IEEE Trans. Evol. Comput. 2008, 13, 398–417. [Google Scholar] [CrossRef]
  19. Meng, Z.; Pan, J.S.; Tseng, K.K. PaDE: An enhanced Differential Evolution algorithm with novel control parameter adaptation schemes for numerical optimization. Knowl.-Based Syst. 2019, 168, 80–99. [Google Scholar] [CrossRef]
  20. Meng, Z.; Pan, J.S. Quasi-affine transformation evolutionary (QUATRE) algorithm: A parameter-reduced differential evolution algorithm for optimization problems. In Proceedings of the 2016 IEEE Congress on Evolutionary Computation (CEC), Vancouver, BC, Canada, 24–29 July 2016; pp. 4082–4089. [Google Scholar]
  21. Liu, N.; Pan, J.S. A bi-population QUasi-Affine TRansformation Evolution algorithm for global optimization and its application to dynamic deployment in wireless sensor networks. EURASIP J. Wirel. Commun. Netw. 2019, 2019, 175. [Google Scholar] [CrossRef]
  22. Koza, J.R. Genetic Programming: Automatic Programming of Computers. EvoNews 1997, 1, 4–7. [Google Scholar]
  23. Hsu, H.P.; Chiang, T.L.; Wang, C.N.; Fu, H.P.; Chou, C.C. A Hybrid GA with Variable Quay Crane Assignment for Solving Berth Allocation Problem and Quay Crane Assignment Problem Simultaneously. Sustainability 2019, 11, 2018. [Google Scholar] [CrossRef]
  24. Duan, H.; Qiao, P. Pigeon-inspired optimization: A new swarm intelligence optimizer for air robot path planning. Int. J. Intell. Comput. Cybern. 2014, 7, 24–37. [Google Scholar] [CrossRef]
  25. Li, C.; Duan, H. Target detection approach for UAVs via improved pigeon-inspired optimization and edge potential function. Aerosp. Sci. Technol. 2014, 39, 352–360. [Google Scholar] [CrossRef]
  26. Deng, Y.; Duan, H. Control parameter design for automatic carrier landing system via pigeon-inspired optimization. Nonlinear Dyn. 2016, 85, 97–106. [Google Scholar] [CrossRef]
  27. Duan, H.; Wang, X. Echo state networks with orthogonal pigeon-inspired optimization for image restoration. IEEE Trans. Neural Networks Learn. Syst. 2015, 27, 2413–2425. [Google Scholar] [CrossRef] [PubMed]
  28. Wang, C.N.; Le, A. Measuring the Macroeconomic Performance among Developed Countries and Asian Developing Countries: Past, Present, and Future. Sustainability 2018, 10, 3664. [Google Scholar] [CrossRef]
  29. Wang, C.N.; Nguyen, H.K. Enhancing urban development quality based on the results of appraising efficient performance of investors—A case study in vietnam. Sustainability 2017, 9, 1397. [Google Scholar] [CrossRef]
  30. Scieri, F.; Miller, R.L. Hydro Electric Generating System. U.S. Patent 4,443,707, 17 April 1984. [Google Scholar]
  31. Davison, F.E. Electric Generating Water Power Device. U.S. Patent 4,163,905, 7 August 1979. [Google Scholar]
  32. Ma, C.; Lian, J.; Wang, J. Short-term optimal operation of Three-gorge and Gezhouba cascade hydropower stations in non-flood season with operation rules from data mining. Energy Convers. Manag. 2013, 65, 616–627. [Google Scholar] [CrossRef]
  33. Jain, A.; Ormsbee, L.E. Short-term water demand forecast modeling techniques—CONVENTIONAL METHODS VERSUS AI. J. Am. Water Work. Assoc. 2002, 94, 64–72. [Google Scholar] [CrossRef]
  34. Fan, C.; Xiao, F.; Wang, S. Development of prediction models for next-day building energy consumption and peak power demand using data mining techniques. Appl. Energy 2014, 127, 1–10. [Google Scholar] [CrossRef]
  35. Fosso, O.B.; Belsnes, M.M. Short-term hydro scheduling in a liberalized power system. In Proceedings of the 2004 International Conference on Power System Technology, PowerCon 2004, Singapore, 21–24 November 2004; Volume 2, pp. 1321–1326. [Google Scholar]
  36. Nguyen, T.T.; Vo, D.N. An efficient cuckoo bird inspired meta-heuristic algorithm for short-term combined economic emission hydrothermal scheduling. Ain Shams Eng. J. 2016, 9, 483–497. [Google Scholar] [CrossRef]
  37. Nazari-Heris, M.; Mohammadi-Ivatloo, B.; Gharehpetian, G. Short-term scheduling of hydro-based power plants considering application of heuristic algorithms: A comprehensive review. Renew. Sustain. Energy Rev. 2017, 74, 116–129. [Google Scholar] [CrossRef]
  38. Billingsley, P. Probability and Measure; John Wiley & Sons: Hoboken, NJ, USA, 2008. [Google Scholar]
  39. Bronshtein, I.N.; Semendyayev, K.A. Handbook of Mathematics; Springer Science & Business: Berlin/Heidelberg, Germany, 2013; ISBN 978-3-662-21982-9. [Google Scholar]
  40. Neri, F.; Mininno, E.; Iacca, G. Compact particle swarm optimization. Inf. Sci. 2013, 239, 96–121. [Google Scholar] [CrossRef]
  41. Mininno, E.; Cupertino, F.; Naso, D. Real-valued compact genetic algorithms for embedded microcontroller optimization. IEEE Trans. Evol. Comput. 2008, 12, 203–219. [Google Scholar] [CrossRef]
  42. Surjanovic, S.; Bingham, D. Virtual Library of Simulation Experiments: Test Functions and Datasets. Available online: http://www.sfu.ca/~ssurjano (accessed on 26 December 2019).
  43. Hao, R.; Luo, D.; Duan, H. Multiple UAVs mission assignment based on modified pigeon-inspired optimization algorithm. In Proceedings of the 2014 IEEE Chinese Guidance, Navigation and Control Conference, Yantai, China, 8–10 August 2014; pp. 2692–2697. [Google Scholar]
