Mathematics
  • Communication
  • Open Access

4 March 2021

Empirical Convergence Theory of Harmony Search Algorithm for Box-Constrained Discrete Optimization of Convex Function

1 Department of Mathematics and Statistics, Sejong University, Seoul 05006, Korea
2 College of IT Convergence, Gachon University, Seongnam 13120, Korea
* Author to whom correspondence should be addressed.

Abstract

The harmony search (HS) algorithm is an evolutionary computation technique inspired by music improvisation. So far, it has been applied to various scientific and engineering optimization problems, including project scheduling, structural design, energy system operation, car lane detection, ecological conservation, model parameter calibration, portfolio management, banking fraud detection, law enforcement, disease spread modeling, cancer detection, astronomical observation, music composition, fine art appreciation, and sudoku puzzle solving. While there are many application-oriented papers, only a few papers exist on how HS performs in finding optimal solutions. Thus, this preliminary study proposes a new approach to show how HS converges on an optimal solution under specific conditions. Here, we introduce a distance concept and prove the convergence based on the empirical probability. Moreover, a numerical example is provided to illustrate the theorem.

1. Introduction

In recent years, many researchers have utilized various nature-inspired metaheuristic algorithms to solve scientific and engineering optimization problems. One of the popular algorithms is harmony search (HS), which was inspired by jazz improvisation [1]. Just as a musician plays a musical note from memory or at random, HS generates a value from its memory or at random. Just as a new harmony, composed of musical notes, is evaluated at each practice session and memorized if it sounds good, a new solution vector, composed of values, is evaluated at each iteration and memorized if it performs well. This HS optimization process, which utilizes three basic operations (memory consideration, pitch adjustment, and random selection), continues until an optimal solution vector is found [2].
HS shares similarities with other algorithms [1]. It is similar to Tabu Search in that it keeps past vectors in a memory called the harmony memory (HM). In addition, HS can adapt its parameters HMCR (harmony memory consideration rate) and PAR (pitch adjustment rate) over time, much as Simulated Annealing varies its temperature. Furthermore, both HS and the genetic algorithm (GA) manage multiple vectors simultaneously. However, there are also differences between HS and GA. While HS generates a new vector by considering all the existing vectors, GA generates the new vector by considering only two of the existing vectors (the parents). In addition, HS considers each variable in a vector independently, whereas GA cannot, because its major operation is crossover, which keeps the gene sequence (multiple variables) together.
The HS algorithm has been applied to various optimization problems including project scheduling [3], structural design [4], energy system operation [5], car lane detection [6], ecological conservation [7], model parameter calibration [8], portfolio management [9], banking fraud detection [10], law enforcement [11], disease spread modeling [12], cancer detection [13], astronomical observation [14], music composition [15,16], fine art appreciation [17], and sudoku puzzle solving [18]. Furthermore, there are some application-oriented reviews of HS [19,20,21,22,23,24,25].
While many applications have been proposed so far, only a few studies have been dedicated to the theoretical background of the HS algorithm. Beyer [26] dealt with the expected population variance of several evolutionary algorithms, and Das et al. [27] proposed an approximated variance of the expectation of solution candidates and discussed the exploratory power of the HS algorithm. However, while the convergence of other optimization algorithms has been discussed in several studies [28,29,30,31], no study has yet discussed the convergence of the HS algorithm.
As an evolutionary computation algorithm, HS considers an optimization problem:
$$\min_{x \in X} f(x), \tag{1}$$

where

$$X = \{ x = (x_i) \in \mathbb{R}^n \mid l_i < x_i < u_i,\ i = 1, \ldots, n \} \quad \text{and} \quad f : \mathbb{R}^n \to \mathbb{R}.$$
If all the bounds are infinite, the above problem becomes an unconstrained optimization problem. However, most practical optimization problems involve variables restricted to prescribed ranges of values and therefore become box-constrained problems [32].
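To make this problem class concrete, the following minimal Python sketch sets up one such box-constrained discrete convex problem. The dimension, bounds, and target point are arbitrary illustrative choices, not values taken from the paper; this running example is reused in the sketches of Section 2.

```python
# Illustrative box-constrained discrete convex problem (assumed values).
n = 3                                   # number of decision variables
lower, upper = 0, 10                    # box bounds l_i and u_i (same for every i here)

# Candidate set Lambda_i for each variable: the integers in [lower, upper].
candidates = [list(range(lower, upper + 1)) for _ in range(n)]

def f(x):
    """Convex objective: squared distance from the interior point (4, ..., 4)."""
    return sum((xi - 4) ** 2 for xi in x)
```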
In this paper, we propose a convergence theory for HS, a box-constrained optimization algorithm, based on the empirical probability. In particular, we consider convex objective functions in both the single-variable and multiple-variable cases. To this end, we define a discrete sequence and prove convergence using a distance concept. This new approach can be applied to algorithms that, like HS, maintain a candidate set, store improved solutions in a memory, and update them iteratively.

2. Harmony Search Algorithm

2.1. Basic Structure of Harmony Search

The HS algorithm is an optimization method inspired by musical phenomena. It mimics the performing process that occurs when a musician searches for a better harmonic sound, as in jazz improvisation. Jazz improvisation seeks a musically pleasing harmony (perfect state) determined by aesthetic standards, just as the optimization process seeks a global solution (perfect state) determined by an objective function. Just as the pitch of each instrument determines the aesthetic quality of the harmony, the value of each decision/design variable determines the quality of the solution with respect to the objective function.
In order to optimize a problem using HS, we first define the set of all possible candidate values, called the candidate set (universal set) $\Lambda$. Let us assume that each $\Lambda_i$ includes $K_i$ candidate values. That is, $\Lambda = [\Lambda_1, \ldots, \Lambda_n]$ and $\Lambda_i = \{x_i(1), \ldots, x_i(K_i)\}$.
Here, the memory storage HM of the HS algorithm can be expressed as in Equation (2). To initialize HM, we randomly choose values from the universal set and generate as many vectors as HMS (harmony memory size, that is, the number of vectors stored in HM). The value of the objective function is also kept next to each solution vector.
$$\mathrm{HM} = \left[ \begin{array}{cccc|c} x_1^1 & x_2^1 & \cdots & x_n^1 & f(\mathbf{x}^1) \\ x_1^2 & x_2^2 & \cdots & x_n^2 & f(\mathbf{x}^2) \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ x_1^{HMS} & x_2^{HMS} & \cdots & x_n^{HMS} & f(\mathbf{x}^{HMS}) \end{array} \right] \tag{2}$$
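As a rough sketch of this initialization step, continuing the illustrative problem from the Introduction, HM can be represented as a list of (vector, objective value) pairs. The value `HMS = 5` and the function name `init_hm` are assumptions for illustration only.

```python
import random

HMS = 5  # harmony memory size (an assumed value for illustration)

def init_hm(candidates, f, hms=HMS):
    """Initialize HM: draw HMS random vectors from the candidate sets and
    store each vector next to its objective value, as in Equation (2)."""
    hm = []
    for _ in range(hms):
        x = [random.choice(cand) for cand in candidates]
        hm.append((x, f(x)))
    return hm
```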
Once HM is prepared, it is refreshed with better solution vectors iteration by iteration. If a newly generated vector $\mathbf{x}^{New}$ is better than the worst vector $\mathbf{x}^{Worst}$ stored in HM in terms of the objective function value, the new vector replaces the worst one:
$$\mathrm{HM}^{New} = \left( \mathrm{HM} \setminus \{\mathbf{x}^{Worst}\} \right) \cup \{\mathbf{x}^{New}\} \tag{3}$$
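A minimal sketch of this replacement rule for a minimization problem might look as follows; the list-of-pairs representation of HM continues the sketch above, and `update_hm` is a hypothetical helper name.

```python
def update_hm(hm, x_new, f):
    """Worst-replacement rule of Equation (3): replace the worst stored
    vector with x_new if the new vector has a smaller objective value;
    otherwise leave HM unchanged."""
    worst = max(range(len(hm)), key=lambda j: hm[j][1])  # index of worst vector
    if f(x_new) < hm[worst][1]:
        hm[worst] = (list(x_new), f(x_new))
    return hm
```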
The value of variable $x_i$ $(i = 1, 2, \ldots, n)$ can be selected in one of three ways. With probability $P_R$ (the random selection rate), it is drawn from the set of all candidate discrete values $\Lambda_i = \{x_i(1), x_i(2), \ldots, x_i(K_i)\}$. With probability $P_M$ (the probability with which the value is selected solely from HM), it is drawn from the set of stored values $HM_i = \{x_i^1, x_i^2, \ldots, x_i^{HMS}\}$, the $i$th column of HM. Finally, with probability $P_P$ (the probability with which the value is selected using pitch adjustment), a value $x_i(k_{P_i})$ is first selected from $HM_i$ and then slightly adjusted by moving to a neighboring value $x_i(k_{P_i} + m)$, as follows:
$$x_i^{New} \leftarrow \begin{cases} x_i(k_{R_i}) \in \Lambda_i & \text{with probability } P_R, \\ x_i^h \in HM_i & \text{with probability } P_M, \\ x_i(k_{P_i} + m) \in \Lambda_i & \text{with probability } P_P, \end{cases} \tag{4}$$

where $i = 1, \ldots, n$; $k_{R_i}, k_{P_i} \in \{1, \ldots, K_i\}$; $h \in \{1, \ldots, HMS\}$; $P_R + P_M + P_P = 1$ (100%); $0 \le P_R, P_M, P_P \le 1$; and $m$ is some nonzero integer such that $x_i(k_{P_i} + m) \in \Lambda_i$. Here, $m$ is a predetermined parameter for adjustment; that is, $m = \pm 1$ or $\pm 2$ or $\pm 3$, etc. For example, if $m = \pm 1$, then we take

$$m = \begin{cases} 1 & \text{with probability } \frac{1}{2}, \\ -1 & \text{with probability } \frac{1}{2}. \end{cases}$$
For $P_M$ and $P_P$, we first define HMCR by $\mathrm{HMCR} = 1 - P_R$. Then, given PAR, $P_M$ and $P_P$ are defined as $P_M = \mathrm{HMCR} \cdot (1 - \mathrm{PAR})$ and $P_P = \mathrm{HMCR} \cdot \mathrm{PAR}$.
Here, for the pitch adjustment, we first select $x_i^h$ randomly from $HM_i$ using a uniform distribution between 0 and 1, i.e., $h = \mathrm{Integer}(\mathrm{Rand}(0,1) \times HMS + 0.5)$. Then, we identify $k_{P_i}$, which satisfies $x_i^h = x_i(k_{P_i})$, where $k_{P_i} \in \{1, \ldots, K_i\}$. Next, we further adjust $x_i(k_{P_i})$ to $x_i(k_{P_i} + m)$, where $x_i(k_{P_i} + m) \in \Lambda_i$.
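Putting the three operators together, one possible per-variable improvisation step is sketched below, continuing the running example. The HMCR and PAR values are assumed, `improvise_variable` is a hypothetical name, and clipping the adjusted index to the valid range is only one of several ways to keep $x_i(k_{P_i} + m)$ inside $\Lambda_i$.

```python
HMCR, PAR = 0.9, 0.3        # assumed parameter values for illustration
P_R = 1 - HMCR              # random selection rate
P_M = HMCR * (1 - PAR)      # memory consideration only
P_P = HMCR * PAR            # memory consideration followed by pitch adjustment

def improvise_variable(i, hm, candidates, m=1):
    """Draw x_i^New by random selection, memory consideration, or pitch
    adjustment, with probabilities P_R, P_M, and P_P respectively."""
    u = random.random()
    if u < P_R:                                  # random selection from Lambda_i
        return random.choice(candidates[i])
    value = random.choice(hm)[0][i]              # memory consideration from HM_i
    if u < P_R + P_M:
        return value
    # Pitch adjustment: move to a neighboring candidate value x_i(k + m).
    k = candidates[i].index(value)
    step = m if random.random() < 0.5 else -m    # +m or -m, each with prob. 1/2
    k_new = min(max(k + step, 0), len(candidates[i]) - 1)  # stay inside Lambda_i
    return candidates[i][k_new]
```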
This is the basic structure of the HS algorithm [33]. Although there are many structural variants of the HS algorithm [34], most of them retain the three operations mentioned above: memory consideration, pitch adjustment, and random selection.
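The pieces above can be combined into a bare-bones HS loop. This is a sketch of the basic structure only, with an assumed fixed iteration budget and none of the structural variants surveyed in [34].

```python
def harmony_search(candidates, f, iterations=1000):
    """Bare-bones HS: improvise a full vector each iteration, apply the
    worst-replacement rule of Equation (3), then return the best stored pair."""
    hm = init_hm(candidates, f)
    for _ in range(iterations):
        x_new = [improvise_variable(i, hm, candidates)
                 for i in range(len(candidates))]
        hm = update_hm(hm, x_new, f)
    return min(hm, key=lambda pair: pair[1])     # (best vector, best value)
```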

2.2. Solution Formula for Harmony Search without Pitch Adjustment

Let us first formulate the solution for the HS algorithm in the case without pitch adjustment.
At the first generation, $x_i^1$ $(i = 1, \ldots, n)$ is chosen with the given probabilities as follows:

$$x_i^1 = \begin{cases} x_i(k) & \text{from } \Lambda_i \text{ with probability } P_R, \\ x_i^h & \text{from } HM_i^0 \text{ with probability } P_M, \end{cases} \tag{5}$$
where $i = 1, \ldots, n$; $k \in \{1, \ldots, K_i\}$; $h \in \{1, \ldots, HMS\}$; $P_R + P_M = 1$; and $HM_i^0$ is the initial harmony memory for $x_i$. Let $x_i^g$ be the solution with the given probabilities at the $g$th generation stage; then
$$x_i^g = \begin{cases} x_i(k) & \text{from } \Lambda_i \text{ with probability } P_R, \\ x_i^h & \text{from } HM_i^{g-1} \text{ with probability } P_M, \end{cases} \tag{6}$$
where $HM_i^{g-1}$ is the newly updated $i$th column of HM after the $(g-1)$th generation. At the first generation, $\mathbf{x}^1 = (x_1^1, x_2^1, \ldots, x_n^1)$ is obtained by two operations (random selection or memory consideration). If the newly generated vector $\mathbf{x}^1$ is better than $\mathbf{x}_{\mathrm{worst}}^0$, the worst vector in $HM^0$ in terms of the objective function value, then $\mathbf{x}_{\mathrm{worst}}^0$ is replaced with $\mathbf{x}^1$. Otherwise, the worst vector $\mathbf{x}_{\mathrm{worst}}^0$ stays in the memory.
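As a sketch, this two-way choice reduces the improvisation step of Section 2.1 to the following; the function name and the list-of-pairs HM representation are illustrative assumptions continuing the running example.

```python
def improvise_without_pitch(i, hm, candidates, p_r):
    """Two-way choice of Equations (5)-(6): draw x_i^g uniformly from
    Lambda_i with probability P_R, or uniformly from the stored column
    HM_i^{g-1} with probability P_M = 1 - P_R."""
    if random.random() < p_r:
        return random.choice(candidates[i])      # random selection
    return random.choice(hm)[0][i]               # memory consideration
```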
The element that we get from this comparison after the first generation will be represented by $\mathbf{x}_{\mathrm{new}}^1$. That is,

$$\mathbf{x}_{\mathrm{new}}^1 \equiv \begin{cases} \mathbf{x}^1 & \text{if } \mathbf{x}^1 \text{ is better than } \mathbf{x}_{\mathrm{worst}}^0, \\ \mathbf{x}_{\mathrm{worst}}^0 & \text{otherwise.} \end{cases}$$
Following a similar procedure, at the $g$th generation, if $\mathbf{x}^g$ is better than $\mathbf{x}_{\mathrm{worst}}^{g-1}$, the worst solution vector in $HM^{g-1}$, then $\mathbf{x}_{\mathrm{worst}}^{g-1}$ is replaced with $\mathbf{x}^g$. Otherwise, the worst element $\mathbf{x}_{\mathrm{worst}}^{g-1}$ stays in the memory.
The element that we get from this comparison after the $g$th generation will be represented by $\mathbf{x}_{\mathrm{new}}^g$. That is,

$$\mathbf{x}_{\mathrm{new}}^g \equiv \begin{cases} \mathbf{x}^g & \text{if } \mathbf{x}^g \text{ is better than } \mathbf{x}_{\mathrm{worst}}^{g-1}, \\ \mathbf{x}_{\mathrm{worst}}^{g-1} & \text{otherwise.} \end{cases} \tag{7}$$
Using an indicator function, the solution formula for $\mathbf{x}_{\mathrm{new}}^g$ in Equation (7) can be equivalently represented by:

$$\mathbf{x}_{\mathrm{new}}^g = \mathbf{x}^g \cdot I_g + \mathbf{x}_{\mathrm{worst}}^{g-1} \cdot (1 - I_g), \tag{8}$$

where

$$I_g = \begin{cases} 1 & \text{if } \mathbf{x}^g \text{ is better than } \mathbf{x}_{\mathrm{worst}}^{g-1}, \\ 0 & \text{otherwise.} \end{cases} \tag{9}$$
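The indicator form of Equations (7)-(9) can be written directly in code. This is a minimal sketch for a minimization problem, where "better" means a smaller objective value; `select_new` is a hypothetical name.

```python
def select_new(x_g, x_worst_prev, f):
    """Equation (8): return x^g * I_g + x_worst^{g-1} * (1 - I_g), where
    I_g = 1 exactly when x^g is better (smaller f) than x_worst^{g-1}."""
    I_g = 1 if f(x_g) < f(x_worst_prev) else 0
    return x_g if I_g == 1 else x_worst_prev
```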

2.3. Solution Formula for Harmony Search with Pitch Adjustment

Next, let us formulate the solution for the HS algorithm in the case with pitch adjustment.
At the first generation, $x_i^1$ $(i = 1, \ldots, n)$ is chosen with the given probabilities as follows:

$$x_i^1 = \begin{cases} x_i(k_{R_i}) & \text{from } \Lambda_i \text{ with probability } P_R, \\ x_i^h & \text{from } HM_i^0 \text{ with probability } P_M, \\ x_i(k_{P_i} + m) & \text{from } \Lambda_i \text{ with probability } P_P, \end{cases} \tag{10}$$
where $i = 1, \ldots, n$; $k_{R_i}, k_{P_i} \in \{1, \ldots, K_i\}$; $h \in \{1, \ldots, HMS\}$; $P_R + P_M + P_P = 1$; and $HM_i^0$ is the initial harmony memory. Then, let $x_i^g$ be the solution with the given probabilities at the $g$th generation stage, in which:

$$x_i^g = \begin{cases} x_i(k_{R_i}) & \text{from } \Lambda_i \text{ with probability } P_R, \\ x_i^h & \text{from } HM_i^{g-1} \text{ with probability } P_M, \\ x_i(k_{P_i} + m) & \text{from } \Lambda_i \text{ with probability } P_P, \end{cases} \tag{11}$$
where $HM_i^{g-1}$ is the newly updated $i$th column of HM after the $(g-1)$th generation. At the first generation, $\mathbf{x}^1 = (x_1^1, x_2^1, \ldots, x_n^1)$ is obtained by three operations (random selection, memory consideration, or pitch adjustment). If $\mathbf{x}^1$ is better than $\mathbf{x}_{\mathrm{worst}}^0$, the worst vector in $HM^0$ in terms of the objective function value, then $\mathbf{x}_{\mathrm{new}}^1 = \mathbf{x}^1$. Otherwise, $\mathbf{x}_{\mathrm{new}}^1 = \mathbf{x}_{\mathrm{worst}}^0$. That is,

$$\mathbf{x}_{\mathrm{new}}^1 \equiv \begin{cases} \mathbf{x}^1 & \text{if } \mathbf{x}^1 \text{ is better than } \mathbf{x}_{\mathrm{worst}}^0, \\ \mathbf{x}_{\mathrm{worst}}^0 & \text{otherwise.} \end{cases} \tag{12}$$
Therefore, after the first generation, $HM^1$ is updated by substituting $\mathbf{x}_{\mathrm{worst}}^0$ with $\mathbf{x}_{\mathrm{new}}^1$. Then, after the $g$th generation,

$$\mathbf{x}_{\mathrm{new}}^g \equiv \begin{cases} \mathbf{x}^g & \text{if } \mathbf{x}^g \text{ performs better than } \mathbf{x}_{\mathrm{worst}}^{g-1}, \\ \mathbf{x}_{\mathrm{worst}}^{g-1} & \text{otherwise,} \end{cases} \tag{13}$$

where $\mathbf{x}_{\mathrm{worst}}^{g-1}$ is the solution vector that performs the worst in $HM^{g-1}$. Therefore, after the $g$th generation, $HM^g$ is updated by substituting $\mathbf{x}_{\mathrm{worst}}^{g-1}$ with $\mathbf{x}_{\mathrm{new}}^g$.
Let $\mathbf{x}_{\mathrm{new}}^g = \hat{\mathbf{x}}^g$. Then, using an indicator function, the solution formula for $\hat{\mathbf{x}}^g$ in Equation (13) can be represented by:

$$\hat{\mathbf{x}}^g = \mathbf{x}^g \cdot I_g + \mathbf{x}_{\mathrm{worst}}^{g-1} \cdot (1 - I_g), \tag{14}$$

where

$$I_g = \begin{cases} 1 & \text{if } \mathbf{x}^g \text{ performs better than } \mathbf{x}_{\mathrm{worst}}^{g-1}, \\ 0 & \text{otherwise.} \end{cases} \tag{15}$$
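Under the running sketch from Section 2.1, the full procedure with pitch adjustment can be exercised end to end. The seed and iteration count are arbitrary, and with the convex toy objective above the search is expected, though not guaranteed in any single run, to settle on the minimizer.

```python
random.seed(0)                                   # reproducible illustrative run
best_x, best_f = harmony_search(candidates, f, iterations=500)
print(best_x, best_f)                            # expected to approach [4, 4, 4], f = 0
```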

4. Conclusions

In this communication, we employed a distance concept and proved the convergence of the HS algorithm based on the empirical probability. The solution behavior of HS for one or more discrete variables was discussed, and the given theorem was demonstrated with a numerical example.
In future studies, we will expand the theorem to include non-discrete variables, multi-modal functions, and adaptive parameters [35].

Author Contributions

J.H.Y. developed the conceptualization, proved the theorems, and drafted the manuscript. Supervision, reviewing, and editing were done by Z.W.G. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (MSIT) (No. 2020R1A2C1A01011131).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Geem, Z.W.; Kim, J.H.; Loganathan, G.V. A new heuristic optimization algorithm: Harmony search. Simulation 2001, 76, 60–68.
  2. Lee, K.S.; Geem, Z.W. A new metaheuristic algorithm for continuous engineering optimization: Harmony search theory and practice. Comput. Methods Appl. Mech. Eng. 2005, 194, 3902–3933.
  3. Geem, Z.W. Multiobjective Optimization of Time-Cost Trade-Off Using Harmony Search. J. Constr. Eng. Manag. ASCE 2010, 136, 711–716.
  4. Geem, Z.W. Harmony Search Algorithms for Structural Design Optimization; Springer: Berlin, Germany, 2009.
  5. Nazari-Heris, M.; Mohammadi-Ivatloo, B.; Asadi, S.; Kim, J.-H.; Geem, Z.W. Harmony search algorithm for energy system applications: An updated review and analysis. J. Exp. Theor. Artif. Intell. 2019, 31, 723–749.
  6. Moon, Y.Y.; Geem, Z.W.; Han, G.-T. Vanishing point detection for self-driving car using harmony search algorithm. Swarm Evol. Comput. 2018, 41, 111–119.
  7. Geem, Z.W. Can Music Supplant Math in Environmental Planning? Leonardo 2015, 48, 147–150.
  8. Lee, W.-Y.; Ko, K.-E.; Geem, Z.W.; Sim, K.-B. Method that determining the Hyperparameter of CNN using HS Algorithm. J. Korean Inst. Intell. Syst. 2017, 27, 22–28.
  9. Tuo, S.H. A Modified Harmony Search Algorithm for Portfolio Optimization Problems. Econ. Comput. Econ. Cybern. Stud. Res. 2016, 50, 311–326.
  10. Daliri, S. Using Harmony Search Algorithm in Neural Networks to Improve Fraud Detection in Banking System. Comput. Intell. Neurosci. 2020, 2020, 6503459.
  11. Shih, P.-C.; Chiu, C.-Y.; Chou, C.-H. Using Dynamic Adjusting NGHS-ANN for Predicting the Recidivism Rate of Commuted Prisoners. Mathematics 2019, 7, 1187.
  12. Fairchild, G.; Hickmann, K.S.; Mniszewski, S.M.; Del Valle, S.Y.; Hyman, J.M. Optimizing human activity patterns using global sensitivity analysis. Comput. Math. Organ. Theory 2014, 20, 394–416.
  13. Elyasigomari, V.; Lee, D.A.; Screen, H.R.C.; Shaheed, M.H. Development of a two-stage gene selection method that incorporates a novel hybrid approach using the cuckoo optimization algorithm and harmony search for cancer classification. J. Biomed. Inform. 2017, 67, 11–20.
  14. Deeg, H.J.; Moutou, C.; Erikson, A.; Csizmadia, S.; Tingley, B.; Barge, P.; Bruntt, H.; Havel, M.; Aigrain, S.; Almenara, J.M.; et al. A transiting giant planet with a temperature between 250 K and 430 K. Nature 2010, 464, 384–387.
  15. Geem, Z.W.; Choi, J.Y. Music Composition Using Harmony Search Algorithm. Lect. Notes Comput. Sci. 2007, 4448, 593–600.
  16. Navarro, M.; Corchado, J.M.; Demazeau, Y. MUSIC-MAS: Modeling a harmonic composition system with virtual organizations to assist novice composers. Expert Syst. Appl. 2016, 57, 345–355.
  17. Koenderink, J.; van Doorn, A.; Wagemans, J. Picasso in the mind’s eye of the beholder: Three-dimensional filling-in of ambiguous line drawings. Cognition 2012, 125, 394–412.
  18. Geem, Z.W. Harmony Search Algorithm for Solving Sudoku. Lect. Notes Comput. Sci. 2007, 4692, 371–378.
  19. Geem, Z.W. Music-Inspired Harmony Search Algorithm: Theory and Applications; Springer: New York, NY, USA, 2009.
  20. Manjarres, D.; Landa-Torres, I.; Gil-Lopez, S.; Del Ser, J.; Bilbao, M.N.; Salcedo-Sanz, S.; Geem, Z.W. A Survey on Applications of the Harmony Search Algorithm. Eng. Appl. Artif. Intell. 2013, 26, 1818–1831.
  21. Askarzadeh, A. Solving electrical power system problems by harmony search: A review. Artif. Intell. Rev. 2017, 47, 217–251.
  22. Yi, J.; Lu, C.; Li, G. A literature review on latest developments of Harmony Search and its applications to intelligent manufacturing. Math. Biosci. Eng. 2019, 16, 2086–2117.
  23. Ala’a, A.; Alsewari, A.A.; Alamri, H.S.; Zamli, K.Z. Comprehensive Review of the Development of the Harmony Search Algorithm and Its Applications. IEEE Access 2019, 7, 14233–14245.
  24. Alia, M.; Mandava, R. The variants of the harmony search algorithm: An overview. Artif. Intell. Rev. 2011, 36, 49–68.
  25. Gao, X.Z.; Govindasamy, V.; Xu, H.; Wang, X.; Zenger, K. Harmony Search Method: Theory and Applications. Comput. Intell. Neurosci. 2015, 2015, 258491.
  26. Beyer, H.-G. On the dynamics of EAs without selection. In Foundations of Genetic Algorithms; Banzhaf, W., Reeves, C., Eds.; Morgan Kaufmann: San Francisco, CA, USA, 1999; Volume 5, pp. 5–26.
  27. Das, S.; Mukhopadhyay, A.; Roy, A.; Abraham, A. Exploratory Power of the Harmony Search Algorithm: Analysis and Improvements for Global Numerical Optimization. IEEE Trans. Syst. Man Cybern. Part B Cybern. 2011, 41, 89–106.
  28. Wu, C.F.J. On the convergence properties of the EM algorithm. Ann. Stat. 1983, 11, 95–103.
  29. Bull, A.D. Convergence Rates of Efficient Global Optimization Algorithms. J. Mach. Learn. Res. 2011, 12, 2879–2904.
  30. Trelea, I.C. The particle swarm optimization algorithm: Convergence analysis and parameter selection. Inf. Process. Lett. 2003, 85, 317–325.
  31. Zhang, X.; Zheng, X.; Cheng, R.; Qiu, J.; Jin, Y. A competitive mechanism based multi-objective particle swarm optimizer with fast convergence. Inf. Sci. 2018, 427, 63–76.
  32. Facchinei, F.; Júdice, J.; Soares, J. Generating Box-Constrained Optimization Problems. ACM Trans. Math. Softw. 1997, 23, 443–447.
  33. Geem, Z.W. Novel derivative of harmony search algorithm for discrete design variables. Appl. Math. Comput. 2008, 199, 223–230.
  34. Zhang, T.; Geem, Z.W. Review of Harmony Search with Respect to Algorithm Structure. Swarm Evol. Comput. 2019, 48, 31–43.
  35. Almeida, F.; Giménez, D.; López-Espín, J.J.; Pérez-Pérez, M. Parameterized Schemes of Metaheuristics: Basic Ideas and Applications with Genetic Algorithms, Scatter Search, and GRASP. IEEE Trans. Syst. Man Cybern. Syst. 2013, 43, 570–586.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
