Efficient Chaotic Imperialist Competitive Algorithm with Dropout Strategy for Global Optimization
Abstract
1. Introduction
2. Literature Review
2.1. Imperialist Competitive Algorithm (ICA)
2.2. Chaotic Imperialist Competitive Algorithm (CICA)
2.3. Dropout
3. Chaotic Imperialist Competitive Algorithm with Dropout (CICAD)
Algorithm 1 CICAD 
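The pseudocode body is not reproduced in this extract. As an illustration of the core idea, applying a dropout-style mask to the assimilation move of ICA so that only a random subset of coordinates is updated in each move, a minimal sketch follows; the function name, the assimilation coefficient `beta`, and the drop rate `p` are illustrative placeholders, not the paper's exact formulation.

```python
import numpy as np

def assimilate_with_dropout(colony, imperialist, beta=2.0, p=0.3, rng=None):
    """One assimilation move with dropout (illustrative sketch).

    Plain ICA moves each coordinate of a colony a random fraction
    (uniform in [0, beta]) of the distance toward its imperialist.
    The dropout mask keeps each coordinate unchanged with probability p,
    so not every dimension is updated in a single move.
    """
    rng = np.random.default_rng() if rng is None else rng
    step = beta * rng.random(colony.shape) * (imperialist - colony)
    keep = rng.random(colony.shape) >= p  # True = dimension is updated
    return colony + step * keep

# Usage: a single assimilation move in a 5-dimensional search space.
col = np.zeros(5)
imp = np.ones(5)
moved = assimilate_with_dropout(col, imp, p=0.3)
```

Dropped dimensions simply retain their previous values, which is what distinguishes this move from the plain ICA assimilation step.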

4. Numerical Examples
4.1. Success Rate
4.2. Statistical Results
4.3. Computational Complexity
5. Application
5.1. Objective Function
5.2. Experimental Results
Figure: The optimal solution passes through the obstacles.
Figure: The optimal solution is worse than the median of all trials.
6. Conclusions
Author Contributions
Funding
Conflicts of Interest
References
1. Atashpaz-Gargari, E.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007; IEEE: Piscataway, NJ, USA, 2007; pp. 4661–4667.
2. Cattani, M.; Caldas, I.L.; Souza, S.L.d.; Iarosz, K.C. Deterministic chaos theory: Basic concepts. Rev. Bras. de Ensino de Física 2017, 39.
3. Rosso, O.; Larrondo, H.; Martin, M.; Plastino, A.; Fuentes, M. Distinguishing noise from chaos. Phys. Rev. Lett. 2007, 99, 154102.
4. Chen, Y.; Li, L.; Xiao, J.; Yang, Y.; Liang, J.; Li, T. Particle swarm optimizer with crossover operation. Eng. Appl. Artif. Intell. 2018, 70, 159–169.
5. Aliniya, Z.; Keyvanpour, M.R. CBICA: A crossover-based imperialist competitive algorithm for large-scale problems and engineering design optimization. Neural Comput. Appl. 2019, 31, 7549–7570.
6. Xu, S.; Wang, Y.; Lu, P. Improved imperialist competitive algorithm with mutation operator for continuous optimization problems. Neural Comput. Appl. 2017, 28, 1667–1682.
7. Ma, Z.; Yuan, X.; Han, S.; Sun, D.; Ma, Y. Improved Chaotic Particle Swarm Optimization Algorithm with More Symmetric Distribution for Numerical Function Optimization. Symmetry 2019, 11, 876.
8. Alatas, B.; Akin, E.; Ozer, A.B. Chaos embedded particle swarm optimization algorithms. Chaos Solitons Fractals 2009, 40, 1715–1734.
9. Gandomi, A.H.; Yang, X.S.; Talatahari, S.; Alavi, A.H. Firefly algorithm with chaos. Commun. Nonlinear Sci. Numer. Simul. 2013, 18, 89–98.
10. Wang, G.G.; Deb, S.; Gandomi, A.H.; Zhang, Z.; Alavi, A.H. Chaotic cuckoo search. Soft Comput. 2016, 20, 3349–3362.
11. Zhao, H.; Gao, W.; Deng, W.; Sun, M. Study on an Adaptive Co-Evolutionary ACO Algorithm for Complex Optimization Problems. Symmetry 2018, 10, 104.
12. Talatahari, S.; Azar, B.F.; Sheikholeslami, R.; Gandomi, A. Imperialist competitive algorithm combined with chaos for global optimization. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 1312–1319.
13. Fiori, S.; Di Filippo, R. An improved chaotic optimization algorithm applied to a DC electrical motor modeling. Entropy 2017, 19, 665.
14. Srivastava, N.; Hinton, G.; Krizhevsky, A.; Sutskever, I.; Salakhutdinov, R. Dropout: A simple way to prevent neural networks from overfitting. J. Mach. Learn. Res. 2014, 15, 1929–1958.
15. Wu, H.; Gu, X. Towards dropout training for convolutional neural networks. Neural Netw. 2015, 71, 1–10.
16. Park, S.; Kwak, N. Analysis on the dropout effect in convolutional neural networks. In Asian Conference on Computer Vision; Springer: Berlin/Heidelberg, Germany, 2016; pp. 189–204.
17. Moon, T.; Choi, H.; Lee, H.; Song, I. RnnDrop: A novel dropout for RNNs in ASR. In Proceedings of the 2015 IEEE Workshop on Automatic Speech Recognition and Understanding (ASRU), Scottsdale, AZ, USA, 13–17 December 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 65–70.
18. Kaveh, A.; Talatahari, S. Optimum design of skeletal structures using imperialist competitive algorithm. Comput. Struct. 2010, 88, 1220–1229.
19. May, R.M. Simple mathematical models with very complicated dynamics. In The Theory of Chaotic Attractors; Springer: Berlin/Heidelberg, Germany, 2004; pp. 85–93.
20. He, D.; He, C.; Jiang, L.G.; Zhu, H.W.; Hu, G.R. Chaotic characteristics of a one-dimensional iterative map with infinite collapses. IEEE Trans. Circuits Syst. I Fundam. Theory Appl. 2001, 48, 900–906.
21. Hilborn, R.C. Chaos and Nonlinear Dynamics: An Introduction for Scientists and Engineers; Oxford University Press: Oxford, UK, 2004.
22. Ott, E. Chaos in Dynamical Systems; Cambridge University Press: Cambridge, UK, 2002.
23. Zheng, W.M. Kneading plane of the circle map. Chaos Solitons Fractals 1994, 4, 1221–1233.
24. Little, M.; Heesch, D. Chaotic root-finding for a small class of polynomials. J. Differ. Equ. Appl. 2004, 10, 949–953.
25. Semeniuta, S.; Severyn, A.; Barth, E. Recurrent Dropout without Memory Loss. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, Osaka, Japan, 11–16 December 2016; pp. 1757–1766.
26. Wang, S.; Manning, C. Fast dropout training. In Proceedings of the International Conference on Machine Learning, Atlanta, GA, USA, 16–21 June 2013; pp. 118–126.
27. Choset, H.M.; Hutchinson, S.; Lynch, K.M.; Kantor, G.; Burgard, W.; Kavraki, L.E.; Thrun, S. Principles of Robot Motion: Theory, Algorithms, and Implementation; MIT Press: Cambridge, MA, USA, 2005.
28. Lamini, C.; Fathi, Y.; Benhlima, S. Collaborative Q-learning path planning for autonomous robots based on holonic multi-agent system. In Proceedings of the 2015 10th International Conference on Intelligent Systems: Theories and Applications (SITA), Rabat, Morocco, 20–21 October 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 1–6.
29. Woon, S.F.; Rehbock, V. A critical review of discrete filled function methods in solving nonlinear discrete optimization problems. Appl. Math. Comput. 2010, 217, 25–41.
30. Puchinger, J.; Raidl, G.R. Combining metaheuristics and exact algorithms in combinatorial optimization: A survey and classification. In Proceedings of the International Work-Conference on the Interplay Between Natural and Artificial Computation, Las Palmas, Canary Islands, Spain, 15–18 June 2005; Springer: Berlin/Heidelberg, Germany, 2005; pp. 41–53.
31. Šeda, M. Roadmap methods vs. cell decomposition in robot motion planning. In Proceedings of the 6th WSEAS International Conference on Signal Processing, Robotics and Automation, Corfu Island, Greece, 16–19 February 2007; WSEAS: Athens, Greece, 2007; pp. 127–132.
32. Cai, C.; Ferrari, S. Information-driven sensor path planning by approximate cell decomposition. IEEE Trans. Syst. Man Cybern. Part B (Cybern.) 2009, 39, 672–689.
33. Rimon, E.; Koditschek, D.E. Exact robot navigation using artificial potential functions. IEEE Trans. Robot. Autom. 1992, 8, 501–518.
34. Hocaoglu, C.; Sanderson, A.C. Planning multiple paths with evolutionary speciation. IEEE Trans. Evol. Comput. 2001, 5, 169–191.
35. Jung, I.K.; Hong, K.B.; Hong, S.K.; Hong, S.C. Path planning of mobile robot using neural network. In Proceedings of ISIE '99, the IEEE International Symposium on Industrial Electronics (Cat. No. 99TH8465), Bled, Slovenia, 12–16 July 1999; IEEE: Piscataway, NJ, USA, 1999; Volume 3, pp. 979–983.
36. Kennedy, J. Particle swarm optimization. In Encyclopedia of Machine Learning; Springer: Berlin/Heidelberg, Germany, 2010; pp. 760–766.
37. Huang, H.C.; Tsai, C.C. Global path planning for autonomous robot navigation using hybrid metaheuristic GA-PSO algorithm. In Proceedings of the SICE Annual Conference, Tokyo, Japan, 13–18 September 2011; IEEE: Piscataway, NJ, USA, 2011; pp. 1338–1343.
Map  Definition  Parameters

Logistic map [19]  ${x}_{k+1} = a{x}_{k}(1 - {x}_{k})$  $a = 4$
ICMIC map [20]  ${x}_{k+1} = \sin\left(\frac{a}{{x}_{k}}\right)$  $a = 0.9$
Sinusoidal map [19]  ${x}_{k+1} = a{x}_{k}^{2}\sin\left(\pi {x}_{k}\right)$  $a = 2.3,\ {x}_{0} = 0.7$
Gauss map [21]  ${x}_{k+1} = \exp(-\alpha {x}_{k}^{2}) + \beta$  $\alpha = 4.9,\ \beta = -0.58$
Tent map [22]  ${x}_{k+1} = \mu \min\{{x}_{k}, 1 - {x}_{k}\}$  $\mu = 2$
Circle map [23]  ${x}_{k+1} = {x}_{k} + b - (a/2\pi)\sin\left(2\pi {x}_{k}\right) \bmod (1)$  $a = 0.5,\ b = 0.2$
Complex squaring map [24]  ${x}_{k+1} = {x}_{k}^{2}$  ${x}_{n} = {x}_{0}^{{2}^{n}}$
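Chaotic algorithms such as CICA [12] replace pseudo-random numbers with sequences generated by iterating maps like those above. A minimal sketch of two maps from the table, using the parameter values listed there (the function and variable names are illustrative):

```python
def logistic(x, a=4.0):
    # Logistic map [19]: x_{k+1} = a * x_k * (1 - x_k); chaotic for a = 4.
    return a * x * (1.0 - x)

def tent(x, mu=2.0):
    # Tent map [22]: x_{k+1} = mu * min(x_k, 1 - x_k); chaotic for mu = 2.
    return mu * min(x, 1.0 - x)

def chaotic_sequence(step, x0, n):
    """Iterate a one-dimensional map from x0 and return n successive values."""
    seq, x = [], x0
    for _ in range(n):
        x = step(x)
        seq.append(x)
    return seq

# Usage: 100 values of the logistic map, which stay inside [0, 1].
seq = chaotic_sequence(logistic, 0.7, 100)
```

For both maps an initial point in (0, 1) keeps the whole sequence inside the unit interval, which makes the values drop-in replacements for uniform random numbers after any needed rescaling.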
Function  Definition  Interval  Optimum

Griewank  ${f}_{1}\left(X\right) = 1 + \frac{1}{4000}{\sum}_{i=1}^{n}{x}_{i}^{2} - {\prod}_{i=1}^{n}\cos(\frac{{x}_{i}}{\sqrt{i}})$  [−150, 150]  0.0
Ackley  ${f}_{2}\left(X\right) = -20\exp(-0.2\sqrt{\frac{1}{n}{\sum}_{i=1}^{n}{x}_{i}^{2}}) - \exp\left(\frac{1}{n}{\sum}_{i=1}^{n}\cos\left(2\pi {x}_{i}\right)\right) + 20 + e$  [−32, 32]  0.0
Brown  ${f}_{3}\left(X\right) = {\sum}_{i=1}^{n-1}{\left({x}_{i}^{2}\right)}^{({x}_{i+1}^{2}+1)} + {\left({x}_{i+1}^{2}\right)}^{({x}_{i}^{2}+1)}$  [−1, 4]  0.0
Rastrigin  ${f}_{4}\left(X\right) = {\sum}_{i=1}^{n}({x}_{i}^{2} - 10\cos\left(2\pi {x}_{i}\right) + 10)$  [−10, 10]  0.0
Schwefel's 2.22  ${f}_{5}\left(X\right) = {\sum}_{i=1}^{n}\left|{x}_{i}\right| + {\prod}_{i=1}^{n}\left|{x}_{i}\right|$  [−100, 100]  0.0
Schwefel's 2.23  ${f}_{6}\left(X\right) = {\sum}_{i=1}^{n}{x}_{i}^{10}$  [−10, 10]  0.0
Qing  ${f}_{7}\left(X\right) = {\sum}_{i=1}^{n}{({x}_{i}^{2} - i)}^{2}$  [−500, 500]  0.0
Rosenbrock  ${f}_{8}\left(X\right) = {\sum}_{i=1}^{n-1}(100{({x}_{i}^{2} - {x}_{i+1})}^{2} + {(1 - {x}_{i})}^{2})$  [−2.048, 2.048]  0.0
Schwefel  ${f}_{9}\left(X\right) = 418.9829 \cdot n - {\sum}_{i=1}^{n}{x}_{i}\sin\left(\sqrt{\left|{x}_{i}\right|}\right)$  [−10, 10]  0.0
Weierstrass  ${f}_{10}\left(X\right) = {\sum}_{i=1}^{n}({\sum}_{k=0}^{20}({0.5}^{k}\cos(2\pi \cdot {3}^{k}({x}_{i}+0.5)))) - n{\sum}_{k=0}^{20}({0.5}^{k}\cos\left({3}^{k}\pi \right))$  [−0.5, 0.5]  0.0
Whitley  ${f}_{11}\left(X\right) = {\sum}_{i=1}^{n}{\sum}_{j=1}^{n}(\frac{{(100{({x}_{i}^{2} - {x}_{j})}^{2} + {(1 - {x}_{j})}^{2})}^{2}}{4000} - \cos(100{({x}_{i}^{2} - {x}_{j})}^{2} + {(1 - {x}_{j})}^{2}) + 1)$  [−10.24, 10.24]  0.0
Zakharov  ${f}_{12}\left(X\right) = {\sum}_{i=1}^{n}{x}_{i}^{2} + {\left({\sum}_{i=1}^{n}0.5i{x}_{i}\right)}^{2} + {\left({\sum}_{i=1}^{n}0.5i{x}_{i}\right)}^{4}$  [−5, 10]  0.0
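Two of the benchmarks can be written directly from their definitions, which also confirms the stated optimum of 0.0 at the origin (a reproducibility sketch; the paper's dimensionality and evaluation budget are not assumed here):

```python
import math

def griewank(x):
    # f1: 1 + (1/4000) * sum(x_i^2) - prod(cos(x_i / sqrt(i))); minimum 0 at x = 0.
    s = sum(v * v for v in x) / 4000.0
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return 1.0 + s - p

def rastrigin(x):
    # f4: sum(x_i^2 - 10*cos(2*pi*x_i) + 10); minimum 0 at x = 0.
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

# Both attain their optimum 0.0 at the origin (30 dimensions here).
vals = (griewank([0.0] * 30), rastrigin([0.0] * 30))
```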
Map  CICA  CICAD(0.1)  CICAD(0.2)  CICAD(0.3)  CICAD(0.4)  CICAD(0.5)

Logistic map  76  63  56  38  49  52 
ICMIC map  55  47  42  39  31  36 
Sinusoidal map  89  83  63  72  54  67 
Gauss map  23  21  19  23  14  16 
Tent map  32  29  22  24  19  26 
Circle map  25  22  18  8  13  10 
Complex squaring map  39  32  27  25  29  27 
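A success rate of this kind counts the independent runs whose final error falls below a target accuracy (a generic sketch; the threshold used in the paper is not assumed):

```python
def success_rate(errors, tol):
    """Percentage of independent runs whose final error is within tol."""
    hits = sum(1 for e in errors if e <= tol)
    return 100.0 * hits / len(errors)

# Usage: 3 of 4 hypothetical runs reach 1e-6 accuracy.
rate = success_rate([1e-9, 2e-3, 5e-12, 7e-7], tol=1e-6)
```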
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  2.6990e-11  1.0341e-10  2.6780e-08  8.1404e-10
CICA  1.1707e-16  3.4777e-14  2.5794e-12  5.0708e-15
CICAD  0  1.0767e-08  2.9765e-07  5.4310e-08
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  8.3538e-07  7.1169e-05  9.7544e-05  8.2014e-06
CICA  5.7959e-08  1.0239e-07  5.1388e-06  1.2366e-07
CICAD  2.4248e-11  1.8701e-06  9.4602e-06  3.0191e-06
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  5.3185e-03  8.2913e-02  3.4768e-01  5.9350e-02
CICA  0  3.6893e-04  7.6523e-04  2.3507e-04
CICAD  0  1.6878e-03  3.6494e-03  1.1136e-03
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  0  1.6667e-06  0.00005  9.1287e-06
CICA  0  9.3427e-09  1.0685e-07  3.4296e-08
CICAD  0  1.0604e-07  1.9899e-06  3.9926e-07
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  2.1853e-06  3.6150e-05  4.3900e-05  1.0507e-05
CICA  8.4180e-09  1.3307e-08  1.5124e-08  1.8661e-09
CICAD  7.6125e-10  7.7876e-07  3.5125e-05  3.1734e-06
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  6.2357e-11  2.4227e-09  8.1524e-09  1.9060e-09
CICA  8.4576e-14  6.3189e-12  4.8157e-11  4.2206e-12
CICAD  3.6451e-15  1.2597e-09  7.6530e-08  6.9023e-09
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  3.7096e-04  1.2428e-03  5.8762e-02  5.2740e-03
CICA  5.2507e-08  7.8159e-08  8.3169e-07  6.9295e-08
CICAD  1.8346e-11  6.8298e-07  7.4924e-07  1.9475e-07
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  0.001296  0.201608  1.217682  0.362075
CICA  0.000182  0.024174  0.07179  0.021891
CICAD  0.000061  0.081672  0.36303  0.101833
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  3.2679e-06  2.9866e-05  7.2682e-05  8.4658e-06
CICA  6.2612e-09  8.1053e-09  1.5335e-08  1.2321e-09
CICAD  5.3896e-10  6.4659e-08  4.2363e-06  3.8251e-07
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  2.7647e-03  3.2945e-02  6.3290e-02  1.8813e-02
CICA  0  1.5156e-05  3.1263e-05  9.8972e-06
CICAD  0  3.9078e-04  7.9613e-04  2.5196e-04
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  5.3185e-09  3.0960e-08  3.4768e-08  7.4846e-09
CICA  6.8309e-14  9.5997e-13  7.6523e-12  6.5952e-13
CICAD  4.2985e-15  7.3188e-11  3.6494e-10  3.3985e-11
Algorithm  Min (best)  Mean  Max (worst)  St. Dev.

ICA  8.3546e-11  4.3202e-09  3.7554e-08  4.2478e-09
CICA  4.4263e-13  6.4897e-10  5.4924e-09  6.2908e-10
CICAD  3.8908e-15  8.2997e-09  8.8761e-08  7.7855e-09
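Statistics of the form reported in the tables above can be collected over independent runs as follows (a generic sketch; the optimizer, run count, and the sample error values are placeholders):

```python
import statistics

def summarize(errors):
    """Min/mean/max/standard deviation of final objective errors
    over a list of independent optimization runs."""
    return {
        "min": min(errors),
        "mean": statistics.fmean(errors),
        "max": max(errors),
        "st_dev": statistics.stdev(errors),
    }

# Usage with four hypothetical final errors from independent runs.
stats = summarize([1.2e-10, 3.4e-11, 2.6e-08, 8.1e-10])
```

Note that `statistics.stdev` computes the sample standard deviation; whether the paper reports sample or population deviation is not stated in this extract.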
  ICA  CICA  CICAD(0.1)  CICAD(0.3)  CICAD(0.5)

Fitness  Min.  64.21  61.83  60.93  61.98  63.33
Fitness  Mean  66.50  63.49  64.60  66.13  67.76
Fitness  Max.  69.76  67.91  70.13  70.97  72.41
Fitness  St. Dev.  1.6335  1.1657  2.7274  2.5325  2.7108
Execution time (s)  Mean  28.67  34.91  12.87  19.81  36.03
Execution time (s)  St. Dev.  14.28  18.15  7.24  8.14  14.56
Success Rate  86.67%  93.33%  87.50%  82.50%  75.83%
  ICA  CICA  CICAD(0.1)  CICAD(0.3)  CICAD(0.5)

Fitness  Min.  67.42  64.11  63.89  64.02  64.57
Fitness  Mean  69.68  66.18  67.57  70.28  72.62
Fitness  Max.  75.65  73.71  79.71  79.52  83.55
Fitness  St. Dev.  1.5567  1.3839  3.0785  4.0630  5.7615
Execution time (s)  Mean  52.18  57.89  17.98  46.57  69.05
Execution time (s)  St. Dev.  12.41  14.59  4.79  12.48  19.24
Success Rate  80.83%  88.33%  82.50%  76.67%  70.83%
© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, Z.S.; Lee, J.; Song, C.G.; Kim, S.J. Efficient Chaotic Imperialist Competitive Algorithm with Dropout Strategy for Global Optimization. Symmetry 2020, 12, 635. https://doi.org/10.3390/sym12040635