Adaptive Self-Scaling Brain-Storm Optimization via a Chaotic Search Mechanism
Abstract
1. Introduction
- An improved adaptive CLS strategy is proposed to enhance the global search performance of BSO.
- The adaptive self-scaling CLS effectively helps BSO escape local optima and avoid premature convergence.
- Extensive experimental results show that ABSO is superior to its competitors.
- The results of two nonparametric statistical tests indicate that the proposed adaptive self-scaling mechanism effectively improves the search ability and efficiency of ABSO.
2. Brain-Storm Optimization
3. Chaotic Maps
- A logistic map is a classic chaotic map in the nonlinear dynamics of a biological population that can be expressed as follows:
- A piecewise linear chaotic map (PWLCM) has a constant density function on its defined interval. The simplest PWLCM is given by the following equation: In our experiment, the initial chaotic sequence was , and we set .
- A Singer map is a one-dimensional chaotic system:
- A sine map is similar to a unimodal logistic map and can be expressed by the following formula:
- A circle map is a simplified one-dimensional model for driven mechanical rotors and phase-locked loops in electronics; it maps a circle onto itself and can be expressed by the following equation:
- A Bernoulli shift map is a type of piecewise linear map, similar to a tent map, that can be expressed as follows: In this study, we set and .
- An iterative chaotic map with infinite collapses (ICMIC) obtains infinite fixed points based on the following equation:
- A sinusoidal map can be generated by the following formula:
- Chebyshev maps are widely used in digital communication, neural computing and safety maintenance. The process to generate chaotic sequences is defined as follows:
- A topological conjugacy relationship exists between the famous logistic map and the tent map, so the two are interconvertible. Tent map sequences can be calculated by the following formula:
- A Gaussian map can be generated by the following formula:
- Cubic maps are commonly used in cryptography; the sequence is calculated by the following formula: We set and in our experiment.
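All twelve maps above are iterated with the same simple loop. As a minimal sketch, here are standard textbook forms of three of them (logistic, sine, and tent); the paper's exact Equations (3)–(14) and parameter settings are not reproduced in this extract, so the control parameters below (`mu`, `a`, the initial value `x0`) are illustrative assumptions, not the authors' settings.

```python
import math

# Standard textbook forms of three chaotic maps; parameters are illustrative.

def logistic_map(x, mu=4.0):
    """x_{n+1} = mu * x_n * (1 - x_n); fully chaotic on (0, 1) for mu = 4."""
    return mu * x * (1.0 - x)

def sine_map(x, a=1.0):
    """x_{n+1} = a * sin(pi * x_n)."""
    return a * math.sin(math.pi * x)

def tent_map(x, mu=2.0):
    """x_{n+1} = mu * min(x_n, 1 - x_n); topologically conjugate to the logistic map."""
    return mu * (x if x < 0.5 else 1.0 - x)

def chaotic_sequence(map_fn, x0=0.7, n=5):
    """Iterate a map n times from x0 and return the generated sequence."""
    seq, x = [], x0
    for _ in range(n):
        x = map_fn(x)
        seq.append(x)
    return seq

print(chaotic_sequence(logistic_map))  # first value: 4 * 0.7 * 0.3 = 0.84
```

Any of the twelve maps can be dropped into `chaotic_sequence` unchanged, which is what makes the random (ABSO-R) and parallel (ABSO-P) selection schemes in Algorithm 1 straightforward to implement.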
4. CLS Based on Adaptive Self-Scaling
Algorithm 1: ABSO
01: Randomly generate a population with N individuals and calculate the fitness of each individual.
02: Divide the N individuals into M clusters using the k-means method; the best individual in each cluster is chosen as the center.
03: Select the cluster center:
04: if , replace the cluster center with a randomly generated individual;
05: if , choose a center based on the current probability, or choose a randomly selected individual as the center;
06: if , and Steps 04 and 05 are not implemented, choose two cluster centers and combine them into a new cluster center, or randomly choose two individuals to generate a new cluster center.
07: ABSO-R: randomly select one of the 12 chaotic maps, each with equal selection probability.
    ABSO-P: select all 12 chaotic maps in parallel.
08: Calculate the chaotic variables according to the selected chaotic map, Equations (3)–(14).
09: Implement the CLS using the chaotic variables and the adaptive self-scaling strategy.
10: Compare the fitness of the current individual with its fitness before the chaotic search based on Equation (18), update the individual fitness, and output the optimal fitness value.
11: end
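Step 09 can be sketched as follows. Since the paper's self-scaling rule and acceptance criterion (Equation (18)) are not reproduced in this extract, the linear radius shrink, the greedy acceptance rule, and the toy `sphere` fitness function below are illustrative assumptions only.

```python
# Illustrative sketch of a chaotic local search (CLS) whose search radius
# shrinks adaptively as the iteration counter t approaches t_max.
# The shrink schedule and acceptance rule are assumptions, not the paper's.

def logistic(z, mu=4.0):
    return mu * z * (1.0 - z)

def sphere(x):
    # Toy fitness function for demonstration; minimum 0 at the origin.
    return sum(v * v for v in x)

def chaotic_local_search(x, fitness, t, t_max, bounds=(-5.0, 5.0)):
    lo, hi = bounds
    # Self-scaling: the search radius shrinks linearly with the iteration t.
    radius = (1.0 - t / t_max) * (hi - lo) / 2.0
    z = 0.7                                   # initial chaotic variable
    best, best_f = list(x), fitness(x)
    for _ in range(20):                       # a few chaotic probes
        z = logistic(z)                       # chaotic variable in (0, 1)
        step = radius * (2.0 * z - 1.0)       # map (0,1) to (-radius, +radius)
        cand = [min(hi, max(lo, v + step)) for v in best]
        f = fitness(cand)
        if f < best_f:                        # greedy acceptance: keep improvements
            best, best_f = cand, f
    return best, best_f

x, fx = chaotic_local_search([2.0, -1.5], sphere, t=50, t_max=100)
print(fx)
```

Because acceptance is greedy, the returned fitness can never be worse than the starting individual's, which matches the comparison-and-update described in Step 10.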
5. Experimental Studies
5.1. Benchmark Functions
5.2. Experimental Results and Analysis
6. Computational Complexity
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
All entries are reported as Mean ± Std.

| Algorithm | F1 | F3 | F4 | F5 | F6 |
|---|---|---|---|---|---|
| BSO | 2.39 ± 2.42 | 6.45 ± 3.11 | 5.55 ± 4.65 | 7.87 ± 3.87 | 6.59 ± 5.02 |
| CBSO | 2.57 ± 2.44 | 3.59 ± 2.11 | 5.09 ± 3.55 | 7.04 ± 4.00 | 7.01 ± 6.53 |
| CGSA-P | 1.99 ± 8.91 | 7.64 ± 5.88 | 4.91 ± 2.01 | 7.25 ± 2.11 | 6.49 ± 3.95 |
| DE | 2.41 ± 5.16 | 7.99 ± 2.06 | 5.97 ± 3.19 | 7.05 ± 2.14 | 6.13 ± 3.99 |
| WOA | 1.51 ± 2.16 | 1.47 ± 7.44 | 5.66 ± 2.91 | 7.26 ± 6.55 | 6.71 ± 1.32 |
| ABSO-R | 1.97 ± 1.71 | 5.77 ± 9.34 | 6.02 ± 1.11 | 7.62 ± 5.91 | 6.55 ± 2.54 |
| ABSO-P | 2.18 ± 5.14 | 3.47 ± 2.92 | 4.57 ± 1.86 | 6.44 ± 2.62 | 6.33 ± 3.59 |

| Algorithm | F7 | F8 | F9 | F10 | F11 |
|---|---|---|---|---|---|
| BSO | 1.59 ± 8.91 | 1.09 ± 3.99 | 1.09 ± 2.29 | 7.95 ± 6.77 | 1.45 ± 4.31 |
| CBSO | 1.45 ± 9.44 | 8.99 ± 2.89 | 5.01 ± 6.99 | 5.71 ± 6.21 | 1.41 ± 4.01 |
| CGSA-P | 8.88 ± 1.92 | 9.41 ± 2.12 | 3.03 ± 5.87 | 5.97 ± 3.59 | 1.51 ± 6.99 |
| DE | 1.24 ± 8.97 | 1.00 ± 1.32 | 4.59 ± 8.53 | 8.21 ± 3.44 | 1.31 ± 2.84 |
| WOA | 1.37 ± 9.41 | 1.09 ± 2.29 | 7.02 ± 2.51 | 6.11 ± 1.01 | 1.52 ± 1.23 |
| ABSO-R | 1.19 ± 9.89 | 8.89 ± 4.61 | 4.10 ± 7.64 | 6.03 ± 2.94 | 1.41 ± 5.79 |
| ABSO-P | 9.97 ± 1.06 | 8.86 ± 5.21 | 2.49 ± 1.25 | 5.89 ± 4.41 | 1.41 ± 5.57 |

| Algorithm | F12 | F13 | F14 | F15 | F16 |
|---|---|---|---|---|---|
| BSO | 1.17 ± 7.11 | 6.01 ± 2.41 | 1.14 ± 3.54 | 3.17 ± 1.96 | 3.89 ± 5.01 |
| CBSO | 1.98 ± 7.22 | 5.89 ± 4.09 | 8.11 ± 5.33 | 3.01 ± 2.14 | 3.43 ± 2.99 |
| CGSA-P | 1.50 ± 3.12 | 3.26 ± 4.39 | 4.81 ± 2.13 | 1.29 ± 2.97 | 3.19 ± 2.59 |
| DE | 5.39 ± 2.01 | 4.27 ± 4.97 | 5.58 ± 8.16 | 1.86 ± 4.01 | 3.22 ± 2.99 |
| WOA | 4.50 ± 3.64 | 1.28 ± 1.59 | 8.14 ± 7.01 | 7.12 ± 5.21 | 3.51 ± 2.64 |
| ABSO-R | 1.57 ± 7.11 | 5.97 ± 9.79 | 2.01 ± 3.13 | 3.09 ± 2.39 | 3.57 ± 3.01 |
| ABSO-P | 1.77 ± 4.51 | 5.06 ± 3.00 | 6.95 ± 2.67 | 3.01 ± 1.97 | 3.00 ± 2.93 |

| Algorithm | F17 | F18 | F19 | F20 | F21 |
|---|---|---|---|---|---|
| BSO | 2.74 ± 3.14 | 2.59 ± 1.26 | 1.52 ± 2.11 | 2.99 ± 3.01 | 2.81 ± 8.91 |
| CBSO | 2.38 ± 1.89 | 1.42 ± 7.98 | 1.51 ± 5.99 | 2.68 ± 3.10 | 2.62 ± 1.01 |
| CGSA-P | 3.01 ± 3.39 | 3.33 ± 2.21 | 1.61 ± 4.91 | 2.97 ± 9.98 | 2.61 ± 3.31 |
| DE | 2.39 ± 3.17 | 7.01 ± 2.01 | 2.00 ± 4.69 | 2.29 ± 2.30 | 2.53 ± 2.13 |
| WOA | 2.61 ± 2.12 | 2.94 ± 3.31 | 3.16 ± 1.99 | 2.69 ± 2.01 | 2.57 ± 5.99 |
| ABSO-R | 2.68 ± 1.96 | 1.77 ± 2.69 | 1.44 ± 7.16 | 2.70 ± 4.01 | 2.59 ± 5.17 |
| ABSO-P | 2.33 ± 2.57 | 1.14 ± 2.93 | 1.19 ± 5.96 | 2.58 ± 1.97 | 2.42 ± 4.11 |

| Algorithm | F22 | F23 | F24 | F25 | F26 |
|---|---|---|---|---|---|
| BSO | 9.51 ± 5.63 | 4.01 ± 1.61 | 3.74 ± 2.13 | 2.97 ± 3.41 | 1.18 ± 3.74 |
| CBSO | 6.87 ± 2.19 | 3.29 ± 2.31 | 3.89 ± 1.41 | 2.97 ± 2.16 | 8.33 ± 2.11 |
| CGSA-P | 5.97 ± 1.96 | 3.72 ± 2.97 | 3.25 ± 6.01 | 2.99 ± 1.97 | 7.01 ± 5.97 |
| DE | 2.94 ± 5.01 | 2.97 ± 2.62 | 3.04 ± 2.32 | 2.96 ± 3.02 | 3.51 ± 2.20 |
| WOA | 6.74 ± 2.10 | 3.59 ± 7.91 | 3.46 ± 9.23 | 2.98 ± 3.14 | 7.31 ± 2.11 |
| ABSO-R | 6.00 ± 1.73 | 3.66 ± 2.00 | 3.49 ± 1.09 | 2.96 ± 4.49 | 7.16 ± 4.71 |
| ABSO-P | 4.09 ± 1.91 | 2.88 ± 1.43 | 3.21 ± 8.87 | 2.87 ± 1.34 | 3.09 ± 1.08 |

| Algorithm | F27 | F28 | F29 | F30 |
|---|---|---|---|---|
| BSO | 4.68 ± 3.09 | 3.24 ± 4.58 | 5.19 ± 3.97 | 8.08 ± 6.99 |
| CBSO | 3.89 ± 2.71 | 3.24 ± 3.98 | 4.51 ± 2.19 | 7.12 ± 2.22 |
| CGSA-P | 4.68 ± 2.99 | 3.33 ± 4.61 | 4.52 ± 2.91 | 2.22 ± 8.03 |
| DE | 3.12 ± 4.52 | 6.20 ± 1.85 | 3.98 ± 1.35 | 1.99 ± 5.07 |
| WOA | 3.49 ± 7.77 | 3.49 ± 2.79 | 5.67 ± 6.11 | 1.74 ± 5.21 |
| ABSO-R | 4.01 ± 2.99 | 3.24 ± 3.67 | 4.41 ± 4.01 | 6.90 ± 4.09 |
| ABSO-P | 3.12 ± 2.71 | 3.16 ± 2.95 | 3.92 ± 3.92 | 5.07 ± 1.91 |
| Algorithm | ABSO-R | ABSO-P | BSO | CBSO | CGSA-P | DE | WOA |
|---|---|---|---|---|---|---|---|
| Ranking | 3.8295 | 1.7931 | 5.7586 | 4.069 | 3.8103 | 2.9959 | 5.6034 |
| ABSO-R vs. | p-Value | R+ | R- |
|---|---|---|---|
| BSO | 0.000074 | 376.0 | 30.0 |
| CBSO | 0.04788 | 233.0 | 173.0 |
| CGSA-P | 1 | 193.0 | 242.0 |
| DE | 1 | 134.0 | 272.0 |
| WOA | 0.000958 | 368.5 | 66.5 |
| ABSO-P vs. | p-Value | R+ | R- |
|---|---|---|---|
| BSO | 0.000002 | 435.0 | 0.0 |
| CBSO | 0.000009 | 421.5 | 13.5 |
| CGSA-P | 0.009664 | 293.0 | 142.0 |
| DE | 0.017052 | 259.0 | 147.0 |
| WOA | 0.000002 | 435.0 | 0.0 |
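The two numeric columns accompanying each p-value above are the Wilcoxon signed-rank sums (the original column headers did not survive extraction; labeling them R+ and R- is an inference from the test used). As a minimal sketch, assuming no tied differences, R+ and R- can be computed from paired per-function results like this; the benchmark values in the example are hypothetical:

```python
# Wilcoxon signed-rank sums: rank |d_i| ascending for the paired
# differences d_i = a_i - b_i, then sum the ranks of positive and
# negative differences separately. Ties are ignored here for simplicity
# (a full implementation would assign average ranks to tied |d_i|).

def wilcoxon_rank_sums(a, b):
    diffs = [x - y for x, y in zip(a, b) if x != y]   # drop zero differences
    ranked = sorted(range(len(diffs)), key=lambda i: abs(diffs[i]))
    r_plus = r_minus = 0.0
    for rank, i in enumerate(ranked, start=1):
        if diffs[i] > 0:
            r_plus += rank
        else:
            r_minus += rank
    return r_plus, r_minus

# Hypothetical mean errors of two algorithms on 5 benchmark functions.
a = [0.9, 0.8, 0.7, 0.6, 0.5]
b = [0.5, 0.6, 0.8, 0.35, 0.2]
print(wilcoxon_rank_sums(a, b))  # R+ + R- always equals n*(n+1)/2
```

With 29 benchmark functions and no ties, R+ + R- = 29 × 30 / 2 = 435, which matches the ABSO-P rows above.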
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Song, Z.; Yan, X.; Zhao, L.; Fan, L.; Tang, C.; Ji, J. Adaptive Self-Scaling Brain-Storm Optimization via a Chaotic Search Mechanism. Algorithms 2021, 14, 239. https://doi.org/10.3390/a14080239