Fitness Landscape Analysis for the Differential Evolution Algorithm
Abstract
1. Introduction
- To investigate the association between various FLCs and DE performance across a range of optimization problems and dimensions, using the performance metrics of solution quality, success rate, and success speed, and considering five key FLCs, namely ruggedness, gradients, funnels, deception, and searchability.
- To investigate the relationship between various FLCs and the search behavior of the DE algorithm across different problems and dimensions, using the diversity rate-of-change (DRoC) as a behavioral measure to quantify the search behavior of the DE algorithm.
2. Background
2.1. Differential Evolution
- Mutation: The mutation operator is applied first to produce a mutant vector for each individual in the current population. That is, for each parent vector xi(t), a mutant vector ui(t) is generated by selecting three distinct individuals from the population. The first selected vector, xi1(t), serves as the base vector (also called the target vector). The other two vectors, xi2(t) and xi3(t), are then used to calculate a scaled difference, giving ui(t) = xi1(t) + F(xi2(t) − xi3(t)), where F is the scale factor. The vectors are selected such that i ≠ i1 ≠ i2 ≠ i3, with i, i1, i2, i3 ∼ U(1, NP). Here, i denotes the population index and NP represents the population size.
- Crossover: The crossover operator in the DE algorithm creates a trial vector, u′i(t), by recombining components of the parent vector, xi(t), with the mutant vector, ui(t). With binomial crossover, each component of the trial vector is assigned as follows: u′ij(t) = uij(t) if U(0, 1) ≤ CR or j = j∗, and u′ij(t) = xij(t) otherwise, where CR is the crossover rate and j∗ ∼ U(1, D) is a randomly selected dimension that guarantees at least one component is inherited from the mutant vector.
- Selection: The selection operator in the DE algorithm determines whether the parent vector or its corresponding trial vector proceeds to the next generation, based on their fitness in a greedy selection scheme. Specifically, the trial vector, u′i(t), is compared to the parent vector, xi(t), and the vector with the better fitness value survives into the next generation.
- Strategy of the Differential Evolution Algorithm: Price and Storn proposed a naming convention for the DE algorithm based on the applied mutation and crossover operators [2]. This conventional notation, denoted as DE/x/y/z, is widely used to describe different strategies of the DE algorithm. In this notation, x specifies the method used to select the target vector, y indicates the number of difference vectors involved in the mutation operation, and z specifies the crossover technique employed. The basic DE/rand/1/bin strategy is applied in this paper, in which the target vector is randomly selected, a single difference vector is used in the mutation operation, and binomial crossover is performed. A summary of the DE/rand/1/bin strategy is provided in Algorithm 1, and an illustrative implementation sketch is given after the algorithm listing.
Algorithm 1 DE/rand/1/bin: Differential evolution algorithm with random target vector and binomial crossover.
- 1: Initialize generation counter: t ← 0
- 2: Set control parameters: population size NP, scale factor F, and crossover rate CR
- 3: Randomly generate initial population P(0) with NP individuals
- 4: while termination criteria not satisfied do
- 5: for each candidate solution xi(t) ∈ P(t) do
- 6: Evaluate fitness: f(xi(t))
- 7: Randomly choose distinct indices i1, i2, i3 ∼ U(1, NP) such that i ≠ i1 ≠ i2 ≠ i3
- 8: Select random dimension j∗ ∼ U(1, D)
- 9: for each dimension j = 1 to D do
- 10: Generate trial vector component: u′ij(t) ← xi1,j(t) + F(xi2,j(t) − xi3,j(t)) if U(0, 1) ≤ CR or j = j∗; otherwise u′ij(t) ← xij(t)
- 11: end for
- 12: if f(u′i(t)) ≤ f(xi(t)) then
- 13: xi(t + 1) ← u′i(t)
- 14: else
- 15: xi(t + 1) ← xi(t)
- 16: end if
- 17: end for
- 18: Form the next population P(t + 1) from the surviving vectors
- 19: t ← t + 1
- 20: end while
- 21: return the individual with the best fitness value in the final population
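The listing below is a minimal Python sketch of one generation of the DE/rand/1/bin strategy summarized in Algorithm 1, assuming a minimization problem, bound handling left to the caller, and NumPy for vector arithmetic; the function and variable names (de_rand_1_bin_step, pop, fitness, f_obj) are illustrative and not part of any specific library.

```python
import numpy as np

def de_rand_1_bin_step(pop, fitness, f_obj, F=0.5, CR=0.9, rng=None):
    """One generation of DE/rand/1/bin for a minimization problem.

    pop     : (NP, D) array of candidate solutions x_i(t)
    fitness : (NP,) array with f(x_i(t)) for the current population
    f_obj   : objective function mapping a D-vector to a scalar
    """
    rng = np.random.default_rng() if rng is None else rng
    NP, D = pop.shape
    new_pop = pop.copy()
    new_fitness = fitness.copy()

    for i in range(NP):
        # Mutation: pick three distinct indices, all different from i
        choices = [k for k in range(NP) if k != i]
        i1, i2, i3 = rng.choice(choices, size=3, replace=False)
        mutant = pop[i1] + F * (pop[i2] - pop[i3])

        # Binomial crossover: inherit at least one component from the mutant
        j_star = rng.integers(D)
        mask = rng.random(D) <= CR
        mask[j_star] = True
        trial = np.where(mask, mutant, pop[i])

        # Greedy selection between parent and trial vector
        trial_fitness = f_obj(trial)
        if trial_fitness <= fitness[i]:
            new_pop[i], new_fitness[i] = trial, trial_fitness

    return new_pop, new_fitness
```

Calling this function in a loop until the evaluation budget is exhausted reproduces the outer while loop of Algorithm 1; the comparison uses ≤ so that the trial vector also replaces the parent on ties, a common convention in DE implementations.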
2.2. Fitness Landscape Analysis
- Ruggedness: Ruggedness refers to the level of variation in fitness values across a fitness landscape. A landscape is highly rugged if neighboring solutions have extremely different fitness values. Highly rugged problems are difficult for search algorithms because of the risk of becoming trapped in one of numerous local optima. As a measure, ruggedness is therefore directly linked to problem difficulty: the more rugged a problem's landscape, the more challenging it is to find the global optimum. To measure the ruggedness of an optimization problem, the first entropic measure (FEM) is used. Originally introduced by Vassilev et al. [37] for discrete landscapes and later adapted to continuous landscapes by Malan and Engelbrecht [16], FEM for macro-scale ruggedness (denoted by FEM0.1) and for micro-scale ruggedness (denoted by FEM0.01) are adopted in this research. The FEM metric produces a value in the range [0, 1], where 0 indicates a flat landscape and 1 indicates maximum ruggedness (an illustrative sketch of the entropy computation is given after this list).
- Gradients: The steepness of gradients refers to the magnitude of fitness changes between neighboring solutions, i.e., the absolute difference in fitness values between neighboring solutions. A landscape with steep gradients may have a higher probability of being deceptive to search algorithms. Deception occurs when the gradients guide the algorithm away from a global optimum, causing it to converge prematurely on a suboptimal solution. In this study, two metrics introduced by Malan and Engelbrecht [17] are used to quantify gradients: the average gradient, Gavg, and the standard deviation of gradients, Gdev. Gavg is a positive real number, with larger values indicating steeper gradients. Gdev is also a positive real number, with larger values indicating greater deviation from the average gradient, which in turn reflects an uneven distribution of gradients across the landscape.
- Funnels: A funnel is a global basin shape in the fitness landscape consisting of clustered local optima. Single-funnel landscapes typically guide the search algorithm smoothly toward the global optimum. In contrast, multi-funnel landscapes present a greater challenge, as they may direct the algorithm toward different competing local optima, which increases difficulty and can lead to premature convergence. An approach for estimating the presence of multiple funnels in a landscape is the dispersion metric (DM) of Lunacek and Whitley [38]. A problem with small dispersion probably has a single-funneled landscape, whereas a high-dispersion problem probably has a multi-funneled landscape. Malan and Engelbrecht proposed an adapted DM metric that uses normalized solution vectors so that dispersion values can be compared across problems with different domains [17]. The DM metric produces a value in [−dispD, √D − dispD], where D is the dimensionality of the search space, and dispD is the dispersion of a large uniform random sample of a D-dimensional space normalized to [0, 1] in all dimensions. A positive DM value indicates the presence of multiple funnels, whereas negative DM values indicate a landscape with a single funnel.
- Deception: A deceptive landscape provides the search algorithm with false information, guiding the search in the wrong direction. For minimization problems, a landscape is considered easily searchable when fitness values decrease as the distance to the optimum decreases. Deception is primarily attributed to the landscape structure and the distribution of optima. Deceptive landscapes may contain gradients that lead away from the global optimum. However, the level of deception is related to the position of suboptimal solutions relative to the global optimum and the presence of isolation. In this paper, the fitness distance correlation (FDC) is used to quantify the deception in a landscape. Jones and Forrest [39] proposed the FDC measure, which was later adapted by Malan and Engelbrecht [18] to remove the need for prior knowledge of the global optimum. The FDC metric produces a value in the range [−1, 1]. For minimization problems, smaller FDC values indicate a higher degree of deception, making larger values more desirable (a computational sketch of FDC is given after this list).
- Searchability: Searchability (also referred to as evolvability) of a fitness landscape is the ability of an optimization algorithm to navigate the landscape towards better positions (i.e., fitter solutions) efficiently and effectively. Given the structural characteristics of a fitness landscape, highly searchable landscapes are those with less deceptive terrain, allowing the search algorithm to generate fitter offspring in a single move of a specific algorithmic operator. Malan and Engelbrecht [18] introduced the fitness cloud index (FCI) as a measure of searchability in the context of the PSO algorithm, adapted from the fitness cloud scatter plots of Verel et al. [40]. The FCI ranges from 0 to 1, where 0 indicates the worst possible searchability and 1 represents perfect searchability of a given problem with respect to a specific search operator of the optimization algorithm. In this study, the FCI is adapted for the DE algorithm, referred to as FCIDE. Specifically, the measure is calculated by comparing each target vector with its corresponding trial vector after one generational step of DE and recording the proportion of instances where the fitness of the trial vector improved upon that of the original target vector. For simplicity, the measure is termed FCI, with FCIdev indicating the average standard deviation in searchability (a sketch of this computation is given after this list).
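The following is a minimal sketch of the entropic ruggedness measure, assuming fitness values have already been sampled along a random walk through the search space (the walk generation, including the 1% or 10% step sizes that distinguish FEM0.01 from FEM0.1, is left to the caller). The three-symbol encoding and the base-6 entropy follow Vassilev et al. [37]; taking the maximum entropy over a range of sensitivity values ε is an assumption of this sketch rather than a restatement of the exact procedure of Malan and Engelbrecht [16].

```python
import numpy as np

def first_entropic_measure(walk_fitness, epsilons):
    """Entropy-based ruggedness of a fitness sequence sampled along a random walk.

    walk_fitness : 1D array of fitness values f(x_1), ..., f(x_n) along the walk
    epsilons     : iterable of sensitivity thresholds; the maximum entropy over
                   these thresholds is returned as the ruggedness estimate.
    """
    diffs = np.diff(np.asarray(walk_fitness, dtype=float))
    best_entropy = 0.0
    for eps in epsilons:
        # Encode each fitness change as -1 (decrease), 0 (neutral), or 1 (increase)
        symbols = np.where(diffs > eps, 1, np.where(diffs < -eps, -1, 0))
        pairs = list(zip(symbols[:-1], symbols[1:]))
        n = len(pairs)
        if n == 0:
            continue
        entropy = 0.0
        for p in (-1, 0, 1):
            for q in (-1, 0, 1):
                if p == q:
                    continue  # only "rugged" pairs p != q contribute
                prob = sum(1 for a, b in pairs if a == p and b == q) / n
                if prob > 0:
                    entropy -= prob * np.log(prob) / np.log(6)  # base-6 entropy
        best_entropy = max(best_entropy, entropy)
    return best_entropy  # value in [0, 1]: 0 = flat, 1 = maximally rugged
```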
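The next sketch computes the fitness distance correlation from a random sample of solutions, using the adaptation in which distances are measured to the best solution in the sample rather than to a known global optimum; the sampling strategy and sample size are left to the caller and are not prescribed here.

```python
import numpy as np

def fitness_distance_correlation(samples, fitness):
    """FDC estimated from a sample, with distances taken to the best sampled point.

    samples : (n, D) array of sampled solutions
    fitness : (n,) array of corresponding fitness values (minimization assumed)
    """
    samples = np.asarray(samples, dtype=float)
    fitness = np.asarray(fitness, dtype=float)
    best = samples[np.argmin(fitness)]               # best point stands in for the optimum
    dists = np.linalg.norm(samples - best, axis=1)   # Euclidean distance to the best point
    f_dev = fitness - fitness.mean()
    d_dev = dists - dists.mean()
    # Pearson correlation between fitness and distance; value lies in [-1, 1]
    return np.sum(f_dev * d_dev) / (len(fitness) * fitness.std() * dists.std())
```

Values close to 1 indicate that fitness decreases as the sample approaches the best region (easy for minimization), while values near −1 indicate a deceptive landscape.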
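Finally, the searchability measure for DE can be sketched as follows: a population is sampled, one DE/rand/1/bin generational step is applied, and the fraction of trial vectors that improve on their target (parent) vectors is recorded. This sketch reuses the illustrative de_rand_1_bin_step function from Section 2.1 and therefore inherits its assumptions; the sample size, number of repetitions, and sampling scheme are assumptions of the sketch, not the exact FCIDE procedure of the paper.

```python
import numpy as np

def fitness_cloud_index_de(f_obj, bounds, n_points=500, n_repeats=30, rng=None):
    """Estimate FCI for DE: the proportion of trial vectors that improve on their parents.

    f_obj  : objective function (minimization)
    bounds : (D, 2) array of [lower, upper] bounds per dimension
    Returns the mean and standard deviation of the improvement proportion
    over n_repeats independent samples.
    """
    rng = np.random.default_rng() if rng is None else rng
    bounds = np.asarray(bounds, dtype=float)
    D = bounds.shape[0]
    proportions = []
    for _ in range(n_repeats):
        # Uniformly sample a population within the bounds
        pop = rng.uniform(bounds[:, 0], bounds[:, 1], size=(n_points, D))
        fitness = np.array([f_obj(x) for x in pop])
        # One DE/rand/1/bin generational step (de_rand_1_bin_step from the Section 2.1 sketch)
        _, new_fitness = de_rand_1_bin_step(pop, fitness, f_obj, rng=rng)
        # A strict decrease in fitness means the trial vector improved on its parent
        proportions.append(np.mean(new_fitness < fitness))
    return float(np.mean(proportions)), float(np.std(proportions))
```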
2.3. Diversity Rate-of-Change
3. Related Work
4. Empirical Analysis
4.1. Experimental Setup
4.2. Benchmarks
4.3. Performance Measures
- Quality Metric (QM): The quality metric assesses the quality of the solutions obtained. To calculate the QM, the absolute fitness error is computed as the difference in fitness between the best solution found in a single run, fmin, and the optimum solution, f∗. The smaller the absolute error, the better the solution quality. Malan and Engelbrecht proposed a method to convert this fitness error into a positive, normalized quality measure, QM ∈ [0, 1], with QM = 1 indicating that the optimum was reached and QM = 0 indicating that no solution was found [19].
- Success Rate (SRate): The success rate is defined as the number of successful runs, i.e., runs that reach a solution within a fixed accuracy level of the global optimum, divided by the total number of runs. That is, SRate = (number of successful runs)/(total number of runs) [19].
- Success Speed (SSpeed): The speed at which the algorithm finds an acceptable solution is measured by the number of function evaluations consumed to reach that solution. The number of function evaluations required to reach the global optimum, or a solution within the fixed accuracy level, determines the success speed of a successful run. For unsuccessful runs, SSpeed = 0. For a successful run r, SSpeed is the fraction of the evaluation budget left unused when the acceptable solution is found, i.e., SSpeed = (FEmax − FEr)/FEmax, where FEr is the number of function evaluations used in run r and FEmax is the maximum number of function evaluations allowed [19] (a sketch aggregating these measures is given after this list).
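A small sketch of how per-run results might be aggregated into SRate and SSpeed, assuming the per-run success-speed formula given above and assuming SSpeed is averaged over all runs with unsuccessful runs contributing zero; the QM normalization of [19] is not reproduced here, and the accuracy threshold, budget, and variable names are illustrative assumptions.

```python
import numpy as np

def success_measures(run_errors, run_evals, fe_max, accuracy=1e-8):
    """Aggregate per-run results into SRate and SSpeed.

    run_errors : per-run absolute fitness errors |f_min - f*| at termination
    run_evals  : per-run function evaluations used to first reach the accuracy level
                 (only meaningful for successful runs)
    fe_max     : maximum number of function evaluations allowed per run
    accuracy   : fixed accuracy level defining a successful run
    """
    run_errors = np.asarray(run_errors, dtype=float)
    run_evals = np.asarray(run_evals, dtype=float)
    successful = run_errors <= accuracy

    # Success rate: fraction of runs that reached the accuracy level
    s_rate = successful.mean()

    # Success speed: unused fraction of the budget for successful runs, 0 otherwise
    speeds = np.where(successful, (fe_max - run_evals) / fe_max, 0.0)
    s_speed = speeds.mean()
    return s_rate, s_speed
```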
4.4. Experimental Procedure
5. Experiment 1: The Association Between Fitness Landscape Characteristics and the Performance of the Differential Evolution Algorithm
- Solved and fast (S+): Problems where QM = 1, SRate = 1, and SSpeed > 0.5 indicate that the optimal solution was found for all 30 runs, using less than 50% of the allowed time (i.e., maximum number of function evaluations).
- Solved (S): Problems with QM = 1, SRate = 1, and SSpeed ≤ 0.5 indicate that the optimal solution was found for all 30 runs, but required 50% or more of the allowed time (i.e., maximum number of function evaluations).
- Moderate (M): Problems where 0 < QM < 1 indicate that a near-optimal solution was found.
- Failed (F): Problems where QM = 0 indicate that no solution was found. (A small sketch mapping these thresholds to the four categories follows this list.)
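The four categories above can be expressed as a simple rule, sketched below; the function name and input types are illustrative.

```python
def classify_outcome(qm, s_rate, s_speed):
    """Map the performance measures of a (problem, dimension) pair to a category."""
    if qm == 1.0 and s_rate == 1.0:
        # Optimal solution found in all runs; speed decides S+ vs S
        return "S+" if s_speed > 0.5 else "S"
    if qm == 0.0:
        return "F"   # no solution found
    if 0.0 < qm < 1.0:
        return "M"   # near-optimal solution found
    raise ValueError("combination of measures not covered by the classification rules")
```

For example, the f_ack, D = 1 row of the results table (QM = 1.000, SRate = 1.000, SSpeed = 0.642) maps to S+, matching the reported classification.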
6. Experiment 2: Exploring the Relationship Between Fitness Landscape Characteristics and the Behavior of the Differential Evolution Algorithm
7. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
References
- Storn, R. On the Usage of Differential Evolution for Function Optimization. In Proceedings of the IEEE International Conference on Fuzzy Systems, New Orleans, LA, USA, 8–11 September 1996; pp. 519–523. [Google Scholar]
- Storn, R.; Price, K. Differential Evolution—A Simple and Efficient Heuristic for Global Optimization Over Continuous Spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
- Price, K.; Storn, R.M.; Lampinen, J.A. Differential Evolution: A Practical Approach to Global Optimization, 1st ed.; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
- Plagianakos, V.P.; Tasoulis, D.K.; Vrahatis, M.N. A Review of Major Application Areas of Differential Evolution. In Advances in Differential Evolution; Chakraborty, U.K., Ed.; Springer: Berlin/Heidelberg, Germany, 2008; pp. 197–238. [Google Scholar]
- Das, S.; Mullick, S.S.; Suganthan, P.N. Recent Advances in Differential Evolution—An Updated Survey. Swarm Evol. Comput. 2016, 27, 1–30. [Google Scholar] [CrossRef]
- Ahmad, M.F.; Isa, N.A.M.; Lim, W.H.; Ang, K.M. Differential Evolution: A Recent Review Based on State-of-the-Art Works. Alex. Eng. J. 2022, 61, 3831–3872. [Google Scholar] [CrossRef]
- Chakraborty, S.P.; Saha, A.K.; Ezugwu, A.E.; Agushaka, J.O.; Zitar, R.A.; Abualigah, L. Differential Evolution and Its Applications in Image Processing Problems: A Comprehensive Review. Arch. Comput. Methods Eng. 2022, 30, 985–1040. [Google Scholar] [CrossRef]
- Langdon, W.B.; Poli, R. Evolving Problems to Learn About Particle Swarm Optimizers and Other Search Algorithms. IEEE Trans. Evol. Comput. 2007, 11, 561–578. [Google Scholar] [CrossRef]
- Talbi, E. Metaheuristics: From Design to Implementation; John Wiley & Sons: Hoboken, NJ, USA, 2009. [Google Scholar]
- Yang, X. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Bristol, UK, 2010. [Google Scholar]
- Gendreau, M.; Potvin, J. (Eds.) Handbook of Metaheuristics, 2nd ed.; Springer: New York, NY, USA, 2010. [Google Scholar]
- Michalewicz, Z.; Fogel, D.B. Tuning the Algorithm to the Problem. In How to Solve It: Modern Heuristics, 2nd ed.; Springer: Berlin/Heidelberg, Germany, 2004; pp. 277–298. [Google Scholar]
- Wright, S. The Roles of Mutation, Inbreeding, Crossbreeding, and Selection in Evolution. In Proceedings of the Sixth International Congress on Genetics, Ithaca, NY, USA, 24–31 August 1932; pp. 356–366. [Google Scholar]
- Wright, S. Surfaces of Selective Value Revisited. Am. Nat. 1988, 131, 115–123. [Google Scholar] [CrossRef]
- Zou, F.; Chen, D.; Liu, H.; Cao, S.; Ji, X.; Zhang, Y. A Survey of Fitness Landscape Analysis for Optimization. Neurocomputing 2022, 503, 129–139. [Google Scholar] [CrossRef]
- Malan, K.M.; Engelbrecht, A.P. Quantifying Ruggedness of Continuous Landscapes Using Entropy. In Proceedings of the IEEE Congress on Evolutionary Computation, Trondheim, Norway, 18–21 May 2009; pp. 1440–1447. [Google Scholar]
- Malan, K.M.; Engelbrecht, A.P. Ruggedness, Funnels and Gradients in Fitness Landscapes and the Effect on PSO Performance. In Proceedings of the IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; pp. 963–970. [Google Scholar]
- Malan, K.M.; Engelbrecht, A.P. Characterising the Searchability of Continuous Optimisation Problems for PSO. Swarm Intell. 2014, 8, 275–302. [Google Scholar] [CrossRef]
- Malan, K.M.; Engelbrecht, A.P. Fitness Landscape Analysis for Metaheuristic Performance Prediction. In Recent Advances in the Theory and Application of Fitness Landscapes; Emergence, Complexity and Computation; Richter, H., Engelbrecht, A., Eds.; Springer: Berlin/Heidelberg, Germany, 2014; Volume 6, pp. 103–132. [Google Scholar]
- Malan, K.M.; Engelbrecht, A.P. Particle Swarm Optimisation Failure Prediction Based on Fitness Landscape Characteristics. In Proceedings of the Symposium on Swarm Intelligence, Orlando, FL, USA, 9–12 December 2014; pp. 1–9. [Google Scholar]
- Dennis, C.; Ombuki-Berman, B.M.; Engelbrecht, A.P. Predicting Particle Swarm Optimization Control Parameters from Fitness Landscape Characteristics. In Proceedings of the IEEE Congress on Evolutionary Computation, Krakow, Poland, 28 June–1 July 2021; pp. 2289–2298. [Google Scholar]
- Uludağ, G.; Uyar, A.Ş. Fitness Landscape Analysis of Differential Evolution Algorithms. In Proceedings of the International Conference on Soft Computing, Computing with Words and Perceptions in System Analysis, Decision and Control, Antalya, Turkey, 9–11 September 2009; pp. 1–4. [Google Scholar]
- Yang, S.; Li, K.; Li, W.; Chen, W.; Chen, Y. Dynamic Fitness Landscape Analysis on Differential Evolution Algorithm. In Proceedings of the Bio-Inspired Computing–Theories and Applications, Xi’an, China, 28–30 October 2016; pp. 179–184. [Google Scholar]
- Zhang, Z.; Duan, N.; Zou, K.; Sun, Z. Predictive Models of Problem Difficulties for Differential Evolutionary Algorithm Based on Fitness Landscape Analysis. In Proceedings of the 37th Chinese Control Conference, Wuhan, China, 25–27 July 2018; pp. 3221–3226. [Google Scholar]
- Huang, Y.; Li, W.; Ouyang, C.; Chen, Y. A Self-Feedback Strategy Differential Evolution with Fitness Landscape Analysis. Soft Comput. 2018, 22, 7773–7785. [Google Scholar] [CrossRef]
- Li, W.; Li, S.; Chen, Z.; Zhong, L.; Ouyang, C. Self-Feedback Differential Evolution Adapting to Fitness Landscape Characteristics. Soft Comput. 2019, 23, 1151–1163. [Google Scholar] [CrossRef]
- Li, K.; Liang, Z.; Yang, S.; Chen, Z.; Wang, H.; Lin, Z. Performance Analyses of Differential Evolution Algorithm Based on Dynamic Fitness Landscape. Int. J. Cogn. Inform. Nat. Intell. 2019, 13, 36–61. [Google Scholar] [CrossRef]
- Liang, J.; Li, Y.; Qu, B.; Yu, K.; Hu, Y. Mutation Strategy Selection Based on Fitness Landscape Analysis: A Preliminary Study. In Bio-Inspired Computing: Theories and Applications; Communications in Computer and Information Science; Pan, L., Liang, J., Qu, B., Eds.; Springer: Singapore, 2020; Volume 1159, pp. 284–298. [Google Scholar]
- Huang, Y.; Li, W.; Tian, F.; Meng, X. A Fitness Landscape Ruggedness Multiobjective Differential Evolution Algorithm with a Reinforcement Learning Strategy. Appl. Soft Comput. 2020, 96, 106693. [Google Scholar] [CrossRef]
- Tan, Z.; Li, K.; Tian, Y.; Al-Nabhan, N. A Novel Mutation Strategy Selection Mechanism for Differential Evolution Based on Local Fitness Landscape. J. Supercomput. 2021, 77, 5726–5756. [Google Scholar] [CrossRef]
- Tan, Z.; Li, K.; Wang, Y. Differential Evolution with Adaptive Mutation Strategy Based on Fitness Landscape Analysis. Inf. Sci. 2021, 549, 142–163. [Google Scholar] [CrossRef]
- Li, Y.; Liang, J.; Yu, K.; Yue, C.; Zhang, Y. Keenness for Characterizing Continuous Optimization Problems and Predicting Differential Evolution Algorithm Performance. Complex Intell. Syst. 2023, 9, 5251–5266. [Google Scholar] [CrossRef]
- Li, S.; Li, W.; Tang, J.; Wang, F. A New Evolving Operator Selector by Using Fitness Landscape in Differential Evolution Algorithm. Inf. Sci. 2023, 624, 709–731. [Google Scholar] [CrossRef]
- Liang, J.; Li, K.; Yu, K.; Yue, C.; Li, Y.; Song, H. A Novel Differential Evolution Algorithm Based on Local Fitness Landscape Information for Optimization Problems. Trans. Inf. Syst. 2023, 106, 601–616. [Google Scholar] [CrossRef]
- Zheng, L.; Luo, S. Adaptive Differential Evolution Algorithm Based on Fitness Landscape Characteristic. Mathematics 2022, 10, 1511. [Google Scholar] [CrossRef]
- Engelbrecht, A.P.; Bosman, P.A.N.; Malan, K.M. The Influence of Fitness Landscape Characteristics on Particle Swarm Optimisers. Nat. Comput. 2022, 21, 335–345. [Google Scholar] [CrossRef]
- Vassilev, V.K.; Fogarty, T.C.; Miller, J.F. Smoothness, Ruggedness and Neutrality of Fitness Landscapes: From Theory to Application. In Advances in Evolutionary Computing: Theory and Applications; Ghosh, A., Tsutsui, S., Eds.; Springer: Berlin/Heidelberg, Germany, 2003; pp. 3–44. [Google Scholar]
- Lunacek, M.; Whitley, D. The Dispersion Metric and the CMA Evolution Strategy. In Proceedings of the 8th Annual Conference on Genetic and Evolutionary Computation, Seattle, WA, USA, 8–12 July 2006; pp. 477–484. [Google Scholar]
- Jones, T.; Forrest, S. Fitness Distance Correlation as a Measure of Problem Difficulty for Genetic Algorithms. In Proceedings of the Sixth International Conference on Genetic Algorithms, Pittsburgh, PA, USA, 15–19 July 1995; pp. 184–192. [Google Scholar]
- Verel, S.; Collard, P.; Clergue, M. Where Are Bottlenecks in NK Fitness Landscapes? In Proceedings of the IEEE Congress on Evolutionary Computation, Canberra, Australia, 8–12 December 2003; pp. 273–280. [Google Scholar]
- Bosman, P.; Engelbrecht, A.P. Diversity Rate of Change Measurement for Particle Swarm Optimisers. In Proceedings of the 9th International Conference on Swarm Intelligence, Brussels, Belgium, 10–12 September 2014; pp. 86–97. [Google Scholar]
- Hayward, L.; Engelbrecht, A. Determining Metaheuristic Similarity Using Behavioral Analysis. IEEE Trans. Evol. Comput. 2025, 29, 262–274. [Google Scholar] [CrossRef]
- Pitzer, E.; Affenzeller, M. A Comprehensive Survey on Fitness Landscape Analysis. In Recent Advances in Intelligent Engineering Systems; Studies in Computational Intelligence; Fodor, J., Klempous, R., Araujo, C.P.S., Eds.; Springer: Berlin/Heidelberg, Germany, 2012; Volume 378, pp. 161–191. [Google Scholar]
- Ochoa, G.; Malan, K. Recent Advances in Fitness Landscape Analysis. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, Prague, Czech Republic, 13–17 July 2019; pp. 1077–1094. [Google Scholar]
- Malan, K.M. A Survey of Advances in Landscape Analysis for Optimisation. Algorithms 2021, 14, 40. [Google Scholar] [CrossRef]
- Malan, K.; Ochoa, G. Landscape Analysis of Optimization Problems and Algorithms. In Proceedings of the Companion Conference on Genetic and Evolutionary Computation, Lisbon, Portugal, 15–19 July 2023; pp. 1416–1432. [Google Scholar]
- Jones, T. Evolutionary Algorithms, Fitness Landscapes and Search. Ph.D. Thesis, University of New Mexico, Albuquerque, NM, USA, 1995. [Google Scholar]
- Barnett, L. Evolutionary Search on Fitness Landscapes with Neutral Networks. Ph.D. Thesis, University of Sussex, East Sussex, UK, 2003. [Google Scholar]
- Derbel, B.; Verel, S. Fitness Landscape Analysis to Understand and Predict Algorithm Performance for Single- and Multi-Objective Optimization. In Proceedings of the Genetic and Evolutionary Computation Conference Companion, Cancun, Mexico, 8–12 July 2020; pp. 993–1042. [Google Scholar]
- Bolshakov, V.; Pitzer, E.; Affenzeller, M. Fitness Landscape Analysis of a Simulation Optimisation Problem with HeuristicLab. In Proceedings of the UKSim 5th European Symposium on Computer Modeling and Simulation, Cambridge, UK, 30 March–1 April 2011; pp. 107–112. [Google Scholar]
- Marmion, M.; Jourdan, L.; Dhaenens, C. Fitness Landscape Analysis and Metaheuristics Efficiency. J. Math. Model. Algorithms Oper. Res. 2013, 12, 3–26. [Google Scholar] [CrossRef]
- Aboutaib, B.; Verel, S.; Fonlupt, C.; Derbel, B.; Liefooghe, A.; Ahiod, B. On Stochastic Fitness Landscapes: Local Optimality and Fitness Landscape Analysis for Stochastic Search Operators. In Proceedings of the 16th International Conference on Parallel Problem Solving from Nature, Leiden, The Netherlands, 5–9 September 2020; Lecture Notes in Computer Science. Bäck, T., Preuss, M., Deutz, A., Wang, H., Kallel, S.A., Juez, J.L., Sim, K., Eds.; Volume 12275, pp. 97–110. [Google Scholar]
- Russell, S.; Norvig, P. Artificial Intelligence: A Modern Approach, 4th ed.; Pearson: Hoboken, NJ, USA, 2020. [Google Scholar]
- Lu, H.; Shi, J.; Fei, Z.; Zhou, Q.; Mao, K. Measures in the Time and Frequency Domains for Fitness Landscape Analysis of Dynamic Optimization Problems. Appl. Soft Comput. 2017, 51, 192–208. [Google Scholar] [CrossRef]
- Lu, H.; Shi, J.; Fei, Z.; Zhou, Q.; Mao, K. Analysis of the Similarities and Differences of Job-Based Scheduling Problems. Eur. J. Oper. Res. 2018, 270, 809–825. [Google Scholar] [CrossRef]
- Tanabe, R.; Fukunaga, A. Success-history Based Parameter Adaptation for Differential Evolution. In Proceedings of the IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 71–78. [Google Scholar]
- Tanabe, R.; Fukunaga, A. Improving the Search Performance of SHADE Using Linear Population Size Reduction. In Proceedings of the IEEE Congress on Evolutionary Computation, Beijing, China, 6–11 July 2014; IEEE: Piscataway, NJ, USA, 2014; pp. 1658–1665. [Google Scholar]
- Smith, T.; Husbands, P.; O’Shea, M. Fitness Landscapes and Evolvability. Evol. Comput. 2002, 10, 1–34. [Google Scholar] [CrossRef] [PubMed]
- Xiaowang, H.; Bin, N.; Jicheng, W.; Qiong, G.; Bojun, C. A Fitness Distance Correlation-Based Adaptive Differential Evolution for Nonlinear Equations Systems. Int. J. Swarm Intell. Res. 2024, 15, 1–22. [Google Scholar] [CrossRef]
- Xinyu, Z.; Ningzhi, L.; Long, F.; Hongwei, L.; Bailiang, C.; Mingwen, W. Adaptive niching differential evolution algorithm with landscape analysis for multimodal optimization. Inf. Sci. 2025, 700, 121842. [Google Scholar] [CrossRef]
- Sun, Y.; Halgamuge, S.K.; Kirley, M.; Munoz, M.A. On the Selection of Fitness Landscape Analysis Metrics for Continuous Optimization Problems. In Proceedings of the International Conference on Information and Automation for Sustainability, Colombo, Sri Lanka, 22–24 December 2014; pp. 1–6. [Google Scholar]
- Saad, A.D.; Engelbrecht, A.P.; Khan, S.A. An Analysis of Differential Evolution Population Size. Appl. Sci. 2024, 14, 9976. [Google Scholar] [CrossRef]
- Gämperle, R.; Müller, S.D.; Koumoutsakos, P. A Parameter Study for Differential Evolution. Adv. Intell. Syst. Fuzzy Syst. Evol. Comput. 2002, 10, 293–298. [Google Scholar]
- Ronkkonen, J.; Kukkonen, S.; Price, K. Real-Parameter Optimization with Differential Evolution. In Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, UK, 2–5 September 2005; Volume 1, pp. 506–513. [Google Scholar]
- Montgomery, J.; Chen, S. An Analysis of the Operation of Differential Evolution at High and Low Crossover Rates. In Proceedings of the IEEE Congress on Evolutionary Computation, Barcelona, Spain, 18–23 July 2010; pp. 1–8. [Google Scholar]
- Ali, M.M.; Törn, A. Population Set-Based Global Optimization Algorithms: Some Modifications and Numerical Studies. Comput. Oper. Res. 2004, 31, 1703–1725. [Google Scholar] [CrossRef]
- Ali, M.; Pant, M. Improving the Performance of Differential Evolution Algorithm Using Cauchy Mutation. Soft Comput. 2011, 15, 991–1007. [Google Scholar] [CrossRef]
- Poikolainen, I.; Neri, F.; Caraffini, F. Cluster-Based Population Initialization for Differential Evolution Frameworks. Inf. Sci. 2015, 297, 216–235. [Google Scholar] [CrossRef]
- Opara, K.; Arabas, J. Comparison of Mutation Strategies in Differential Evolution—A Probabilistic Perspective. Swarm Evol. Comput. 2018, 39, 53–69. [Google Scholar] [CrossRef]
- Guo, S.; Yang, C. Enhancing Differential Evolution Utilizing Eigenvector-Based Crossover Operator. IEEE Trans. Evol. Comput. 2014, 19, 31–49. [Google Scholar] [CrossRef]
- Song, E.; Li, H. A Self-Adaptive Differential Evolution Algorithm Using Oppositional Solutions and Elitist Sharing. IEEE Access 2021, 9, 20035–20050. [Google Scholar] [CrossRef]
- Qiu, X.; Tan, K.C.; Xu, J. Multiple Exponential Recombination for Differential Evolution. IEEE Trans. Cybern. 2016, 47, 995–1006. [Google Scholar] [CrossRef]
- Zhou, Y.; Yi, W.; Gao, L.; Li, X. Adaptive Differential Evolution with Sorting Crossover Rate for Continuous Optimization Problems. IEEE Trans. Cybern. 2017, 47, 2742–2753. [Google Scholar] [CrossRef]
- Sallam, K.M.; Elsayed, S.M.; Sarker, R.A.; Essam, D.L. Landscape-Based Adaptive Operator Selection Mechanism for Differential Evolution. Inf. Sci. 2017, 418, 383–404. [Google Scholar] [CrossRef]
- Tian, M.; Gao, X.; Dai, C. Differential Evolution with Improved Individual-Based Parameter Setting and Selection Strategy. Appl. Soft Comput. 2017, 56, 286–297. [Google Scholar] [CrossRef]
- Zhao, Z.; Yang, J.; Hu, Z.; Che, H. A Differential Evolution Algorithm with Self-Adaptive Strategy and Control Parameters Based on Symmetric Latin Hypercube Design for Unconstrained Optimization Problems. Eur. J. Oper. Res. 2016, 250, 30–45. [Google Scholar] [CrossRef]
- Takahama, T.; Sakai, S. An Adaptive Differential Evolution Considering Correlation of Two Algorithm Parameters. In Proceedings of the 7th International Conference on Soft Computing and Intelligent Systems and 15th International Symposium on Advanced Intelligent Systems, Kitakyushu, Japan, 5–8 December 2014; pp. 618–623. [Google Scholar]
- Yao, X.; Liu, Y.; Lin, G. Evolutionary Programming Made Faster. IEEE Trans. Evol. Comput. 1999, 3, 82–102. [Google Scholar] [CrossRef]
- Rahnamayan, S.; Tizhoosh, H.R.; Salama, M.M.A. A Novel Population Initialization Method for Accelerating Evolutionary Algorithms. Comput. Math. Appl. 2007, 53, 1605–1614. [Google Scholar] [CrossRef]
- Mishra, S.K. Performance of Repulsive Particle Swarm Method in Global Optimization of Some Important Test Functions: A Fortran Program. SSRN Electron. J. 2006. [Google Scholar] [CrossRef]
- Hansen, N.; Kern, S. Evaluating the CMA Evolution Strategy on Multimodal Test Functions. In Proceedings of the International Conference on Parallel Problem Solving from Nature, Birmingham, UK, 18–22 September 2004; pp. 282–291. [Google Scholar]
- Mishra, S.K. Some New Test Functions for Global Optimization and Performance of Repulsive Particle Swarm Method; MPRA Paper 2718; North-Eastern Hill University: Shillong, India, 2006. [Google Scholar]
- Price, K.V.; Storn, R.M.; Lampinen, J.A. Appendix A.1: Unconstrained Uni-modal Test Functions. In Differential Evolution: A Practical Approach to Global Optimization; Natural Computing Series; Springer: Berlin/Heidelberg, Germany, 2005; pp. 514–533. [Google Scholar]
- Jong, K.A.D. An Analysis of the Behavior of a Class of Genetic Adaptive Systems. Ph.D. Thesis, University of Michigan, Ann Arbor, MI, USA, 1975. [Google Scholar]
- CIlib Fitness Landscape Analysis. Available online: https://github.com/ciren/fla (accessed on 26 March 2023).
- Spearman, C. The Proof and Measurement of Association Between Two Things. Am. J. Psychol. 1904, 15, 72–101. [Google Scholar] [CrossRef]
- Malan, K.M. Characterising Continuous Optimisation Problems for Particle Swarm Optimisation Performance Prediction. Ph.D. Thesis, University of Pretoria, Pretoria, South Africa, 2014. [Google Scholar]
- Locatelli, M. A Note on the Griewank Test Function. J. Glob. Optim. 2003, 25, 169–174. [Google Scholar] [CrossRef]
- Zar, J.H. Significance Testing of the Spearman Rank Correlation Coefficient. J. Am. Stat. Assoc. 1972, 67, 578–580. [Google Scholar] [CrossRef]
- Lampinen, J.; Zelinka, I. On Stagnation of the Differential Evolution Algorithm. In Proceedings of the 6th International Mendel Conference on Soft Computing, Brno, Czech Republic, 7–9 June 2000; Volume 6, pp. 76–83. [Google Scholar]
- Yu, C.; Jun, H. Average convergence rate of evolutionary algorithms in continuous optimization. Inf. Sci. 2021, 562, 200–219. [Google Scholar] [CrossRef]
- Morales-Castañeda, B.; Maciel-Castillo, O.; Navarro, M.A.; Aranguren, I.; Valdivia, A.; Ramos-Michel, A.; Oliva, D.; Hinojosa, S. Handling stagnation through diversity analysis: A new set of operators for evolutionary algorithms. In Proceedings of the IEEE Congress on Evolutionary Computation, Padua, Italy, 18–23 July 2022; IEEE: Piscataway, NJ, USA, 2022; pp. 1–8. [Google Scholar]
- Wang, S.; Li, Y.; Yang, H. Self-Adaptive Mutation Differential Evolution Algorithm Based on Particle Swarm Optimization. Appl. Soft Comput. 2019, 81, 105496. [Google Scholar] [CrossRef]
- Xiao, P.; Zou, D.; Xia, Z.; Shen, X. Multi-strategy different dimensional mutation differential evolution algorithm. In Proceedings of the 3rd International Conference on Advances in Materials, Machinery, and Electronics, Wuhan, China, 19–20 January 2019; Volume 2073, p. 020102. [Google Scholar]
- Mersmann, O.; Bischl, B.; Trautmann, H.; Preuss, M.; Weihs, C.; Rudolph, G. Exploratory Landscape Analysis. In Proceedings of the 13th Annual Conference on Genetic and Evolutionary Computation, Dublin, Ireland, 12–16 July 2011; pp. 829–836. [Google Scholar]
- Kerschke, P.; Preuss, M.; Hernández, C.; Schütze, O.; Sun, J.; Grimme, C.; Rudolph, G.; Bischl, B.; Trautmann, H. Cell Mapping Techniques for Exploratory Landscape Analysis. In Advances in Intelligent Systems and Computing; Springer International Publishing: Berlin/Heidelberg, Germany, 2014; pp. 115–131. [Google Scholar]
- Lang, R.D.; Engelbrecht, A.P. On the Robustness of Random Walks for Fitness Landscape Analysis. In Proceedings of the IEEE Symposium Series on Computational Intelligence, Xiamen, China, 6–9 December 2019; pp. 1898–1906. [Google Scholar]
- Lang, R.D.; Engelbrecht, A.P. Decision Space Coverage of Random Walks. In Proceedings of the IEEE Congress on Evolutionary Computation, Glasgow, UK, 19–24 July 2020; pp. 1–8. [Google Scholar]
Reference (Year, Authors) | FLC Metrics Used | Remarks |
---|---|---|
(1) Uludağ et al., 2009 [22] | Fitness Distance Correlation (FDC), Correlation Length (CL) | FDC and CL were effective but insufficient for landscapes with high ruggedness, deception, or large single basins. Suggested future use of evolvability metrics. |
(2) Yang et al., 2016 [23] | Dynamic Severity, Ruggedness | DE struggled with highly rugged landscapes and frequent dynamic changes. Study limited to 2D problems; future work suggested including more FLCs and analysis of DE control parameters. |
(3) Zhang et al., 2018 [24] | Fitness Distance Correlation (FDC), Information Landscape Measure (ILs) | Used regression and decision trees to investigate relationships between DE control parameter settings and problem features using FLA. Study included limited problems and ignored population size. |
(4) Huang et al., 2018 [25] | Number of Local Optima | Used local FLC (number of optima) to guide mutation strategy in a self-feedback DE variant. No discussion on how FLCs affect performance; limited to soil water texture problems. |
(5) Li et al., 2019 [26] | Number of Optima (Modality) | Adapted DE using landscape-based modality estimation to guide control parameters and operators. Showed strong results, but relied on a single FLC and introduced added complexity. Population size was not considered. |
(6) Li et al., 2019 [27] | Dynamic Severity, Gradients, Ruggedness, FDC | Analyzed DE performance across 12 problems using 4 FLCs. Stressed importance of multiple metrics but relied on visual inspection without statistical validation. Fixed iteration count may affect result credibility. |
(7) Liang et al., 2019 [28] | Number of Optima, Basin Size Ratio, Keenness, FDC | Used an AI model to predict suitable mutation strategies based on four FLCs. While results were promising, the correlation between FLCs and DE performance was not analyzed. |
(8) Huang et al., 2020 [29] | Ruggedness (Unimodal vs. Multimodal) | Proposed LRMODE algorithm using reinforcement learning to guide mutation selection based on ruggedness. Limited by use of a single FLC metric (number of optima). |
(9) Tan et al., 2021 [30] | Roughness (Avg. Distance to Local Optima) | Developed LFLDE, which selects mutation strategies based on roughness and adapts control parameters and population size. However, only one FLC was used and no justification was given for strategy selection. |
(10) Tan et al., 2021 [31] | FDC, Ruggedness | FLDE used a random forest model to select mutation strategies based on two FLCs. It included parameter adaptation and population reduction. Study lacked discussion on DE performance vs. FLCs and introduced high algorithmic complexity. |
(11) Zheng and Luo, 2022 [35] | Proportional Optima (Ruggedness Estimation) | FL-ADE dynamically adjusted population size based on ruggedness and used an archive-based adaptive mutation. Effective but relied on a single FLC (number of optima) without multi-feature analysis. |
(12) Li et al., 2023 [32] | Keenness (KEE), FDC, Neutrality, Dispersion Metric | Used four FLCs with a predictive model to link problem features to DE performance. Provided strong correlation insights but was limited to seven benchmark problems. |
(13) Li et al., 2023 [33] | FDC, Ruggedness | Proposed mutation and parameter selectors trained via ensemble learning and neural networks. Performance was limited due to use of only two FLCs; future work suggested using additional metrics like evolvability. |
(14) Liang et al., 2023 [34] | Population Density (avg. Euclidean distance) | FLIDE used a custom population density metric to guide mutation strategy and population reduction. Achieved good performance, but relied on a single FLC; authors noted the need for richer landscape feature extraction. |
(15) Hu et al., 2024 [59] | FDC | FDCADE used FDC to guide adaptive mutation and parameter control for solving nonlinear equation systems. Showed strong performance, including applications in robotics, but relied on a single FLC. |
(16) Zhou et al., 2025 [60] | FDC | ANFDE algorithm used FDC to balance niching strategies (speciation and crowding). Effective on CEC2013 benchmarks, but adaptation was limited to strategy allocation and did not include broader behavior or multi-metric analysis. |
#F | Function Name | Domain | Dimensions |
---|---|---|---|
F1 | Ackley [78] | x ∈ [−32, 32] | 1, 2, 5, 15, 30 |
F2 | Alpine [79] | x ∈ [−10, 10] | 1, 2, 5, 15, 30 |
F3 | Beale [80] | x ∈ [−4.5, 4.5] | 2 |
F4 | Bohachevsky [81] | x ∈ [−15, 15] | 2, 5, 15, 30 |
F5 | Egg Holder [82] | x ∈ [−512, 512] | 2 |
F6 | Goldstein-Price [78] | x ∈ [−2, 2] | 2 |
F7 | Griewank [78] | x ∈ [−600, 600] | 1, 2, 5, 15, 30 |
F8 | Levy [82] | x ∈ [−10, 10] | 2, 5, 15, 30 |
F9 | Michalewicz [80] | x ∈ [0, π] | 2, 5, 30 |
F10 | Pathological [79] | x ∈ [−100, 100] | 2, 5, 15, 30 |
F11 | Quadric (Schwefel 1.2) [78] | x ∈ [−100, 100] | 1, 2, 5, 15, 30 |
F12 | Quartic [78] | x ∈ [−1.28, 1.28] | 1, 2, 5, 15, 30 |
F13 | Rana [83] | x ∈ [−512, 512] | 2, 5, 15, 30 |
F14 | Rastrigin [78] | x ∈ [−512, 512] | 1, 2, 5, 15, 30 |
F15 | Rosenbrock [78] | x ∈ [−2.048, 2.048] | 1, 2, 5, 15, 30 |
F16 | Salomon [83] | x ∈ [−100, 100] | 1, 2, 5, 15, 30 |
F17 | Schwefel 2.22 [78] | x ∈ [−10, 10] | 1, 2, 5, 15, 30 |
F18 | Schwefel 2.26 [78] | x ∈ [−500, 500] | 1, 2, 5, 15, 30 |
F19 | Six-hump Camel Back [78] | x ∈ [−5, 5] | 2 |
F20 | Skew Rastrigin [81] | x ∈ [−5, 5] | 1, 2, 5, 15, 30 |
F21 | Spherical [84] | x ∈ [−100, 100] | 1, 2, 5, 15, 30 |
F22 | Step [78] | x ∈ [−20, 20] | 1, 2, 5, 15, 30 |
F23 | Weierstrass [80] | x ∈ [−0.5, 0.5] | 1, 2, 5, 15, 30 |
F24 | Zakharov [80] | x ∈ [−5, 10] | 2, 5, 15, 30 |
#F: Function number. |
#f | Function | D | QM | SRate | SSpeed | Overall |
---|---|---|---|---|---|---|
1 | f_ack | 1 | 1.000 | 1.000 | 0.642 | S+ |
1 | f_ack | 2 | 1.000 | 1.000 | 0.614 | S+ |
1 | f_ack | 5 | 1.000 | 1.000 | 0.611 | S+ |
1 | f_ack | 15 | 1.000 | 1.000 | 0.649 | S+ |
1 | f_ack | 30 | 1.000 | 1.000 | 0.459 | S |
2 | f_alp | 1 | 1.000 | 1.000 | 0.835 | S+ |
2 | f_alp | 2 | 1.000 | 1.000 | 0.762 | S+ |
2 | f_alp | 5 | 1.000 | 1.000 | 0.637 | S+ |
2 | f_alp | 15 | 0.999 | 0.800 | 0.200 | M |
2 | f_alp | 30 | 0.569 | 0.000 | 0.000 | M |
3 | f_bea | 2 | 1.000 | 1.000 | 0.438 | S |
4 | f_boh | 2 | 1.000 | 1.000 | 0.788 | S+ |
4 | f_boh | 5 | 1.000 | 1.000 | 0.786 | S+ |
4 | f_boh | 15 | 1.000 | 1.000 | 0.788 | S+ |
4 | f_boh | 30 | 1.000 | 1.000 | 0.768 | S+ |
5 | f_egg | 2 | 0.000 | 0.433 | 0.369 | F |
6 | f_gp | 2 | 1.000 | 1.000 | 0.662 | S+ |
7 | f_grw | 1 | 1.000 | 1.000 | 0.750 | S+ |
7 | f_grw | 2 | 1.000 | 1.000 | 0.635 | S+ |
7 | f_grw | 5 | 1.000 | 1.000 | 0.402 | S |
7 | f_grw | 15 | 1.000 | 1.000 | 0.717 | S+ |
7 | f_grw | 30 | 1.000 | 1.000 | 0.757 | S+ |
8 | f_lvy | 2 | 1.000 | 1.000 | 0.610 | S+ |
8 | f_lvy | 5 | 1.000 | 1.000 | 0.610 | S+ |
8 | f_lvy | 15 | 1.000 | 1.000 | 0.604 | S+ |
8 | f_lvy | 30 | 1.000 | 1.000 | 0.548 | S+ |
9 | f_mic | 2 | 1.000 | 1.000 | 0.750 | S+ |
9 | f_mic | 5 | 0.885 | 0.000 | 0.000 | M |
9 | f_mic | 30 | 0.000 | 0.000 | 0.000 | F |
10 | f_pth | 2 | 1.000 | 1.000 | 0.474 | S |
10 | f_pth | 5 | 0.000 | 0.000 | 0.000 | F |
10 | f_pth | 15 | 0.000 | 0.000 | 0.000 | F |
10 | f_pth | 30 | 0.000 | 0.000 | 0.000 | F |
11 | f_qdr | 1 | 1.000 | 1.000 | 0.953 | S+ |
11 | f_qdr | 2 | 1.000 | 1.000 | 0.911 | S+ |
11 | f_qdr | 5 | 1.000 | 1.000 | 0.840 | S+ |
11 | f_qdr | 15 | 1.000 | 1.000 | 0.400 | S |
11 | f_qdr | 30 | 0.765 | 0.000 | 0.000 | M |
12 | f_qrt | 1 | 1.000 | 1.000 | 0.997 | S+ |
12 | f_qrt | 2 | 1.000 | 1.000 | 0.982 | S+ |
12 | f_qrt | 5 | 1.000 | 1.000 | 0.968 | S+ |
12 | f_qrt | 15 | 1.000 | 1.000 | 0.962 | S+ |
12 | f_qrt | 30 | 1.000 | 1.000 | 0.956 | S+ |
13 | f_ran | 2 | 0.000 | 0.000 | 0.000 | F |
13 | f_ran | 5 | 0.000 | 0.000 | 0.000 | F |
13 | f_ran | 15 | 0.000 | 0.000 | 0.000 | F |
13 | f_ran | 30 | 0.000 | 0.000 | 0.000 | F |
14 | f_ras | 1 | 1.000 | 1.000 | 0.799 | S+ |
14 | f_ras | 2 | 1.000 | 1.000 | 0.763 | S+ |
14 | f_ras | 5 | 1.000 | 1.000 | 0.699 | S+ |
14 | f_ras | 15 | 0.000 | 0.033 | 0.001 | F |
14 | f_ras | 30 | 0.000 | 0.000 | 0.000 | F |
15 | f_ros | 2 | 1.000 | 1.000 | 0.345 | S |
15 | f_ros | 5 | 0.873 | 0.467 | 0.011 | M |
15 | f_ros | 15 | 0.205 | 0.000 | 0.000 | M |
15 | f_ros | 30 | 0.135 | 0.000 | 0.000 | M |
16 | f_sal | 1 | 1.000 | 1.000 | 0.609 | S+ |
16 | f_sal | 2 | 1.000 | 1.000 | 0.515 | S+ |
16 | f_sal | 5 | 1.000 | 1.000 | 0.089 | S |
16 | f_sal | 15 | 0.000 | 0.000 | 0.000 | F |
16 | f_sal | 30 | 0.000 | 0.000 | 0.000 | F |
17 | f_sch2.22 | 1 | 1.000 | 1.000 | 0.877 | S+ |
17 | f_sch2.22 | 2 | 1.000 | 1.000 | 0.863 | S+ |
17 | f_sch2.22 | 5 | 1.000 | 1.000 | 0.918 | S+ |
17 | f_sch2.22 | 15 | 1.000 | 1.000 | 0.910 | S+ |
17 | f_sch2.22 | 30 | 1.000 | 1.000 | 0.895 | S+ |
18 | f_sch2.26 | 1 | 1.000 | 1.000 | 0.776 | S+ |
18 | f_sch2.26 | 2 | 1.000 | 1.000 | 0.773 | S+ |
18 | f_sch2.26 | 5 | 1.000 | 1.000 | 0.776 | S+ |
18 | f_sch2.26 | 15 | 1.000 | 1.000 | 0.762 | S+ |
18 | f_sch2.26 | 30 | 1.000 | 1.000 | 0.689 | S+ |
19 | f_skr | 1 | 1.000 | 1.000 | 0.767 | S+ |
19 | f_skr | 2 | 1.000 | 1.000 | 0.716 | S+ |
19 | f_skr | 5 | 0.865 | 0.867 | 0.636 | M |
19 | f_skr | 15 | 0.271 | 0.067 | 0.001 | M |
19 | f_skr | 30 | 0.000 | 0.000 | 0.000 | F |
20 | f_sph | 1 | 1.000 | 1.000 | 0.956 | S+ |
20 | f_sph | 2 | 1.000 | 1.000 | 0.930 | S+ |
20 | f_sph | 5 | 1.000 | 1.000 | 0.918 | S+ |
20 | f_sph | 15 | 1.000 | 1.000 | 0.911 | S+ |
20 | f_sph | 30 | 1.000 | 1.000 | 0.900 | S+ |
21 | f_stp | 1 | 1.000 | 1.000 | 0.997 | S+ |
21 | f_stp | 2 | 1.000 | 1.000 | 0.987 | S+ |
21 | f_stp | 5 | 1.000 | 1.000 | 0.974 | S+ |
21 | f_stp | 15 | 1.000 | 1.000 | 0.969 | S+ |
21 | f_stp | 30 | 1.000 | 1.000 | 0.962 | S+ |
22 | f_wei | 1 | 1.000 | 1.000 | 0.460 | S |
22 | f_wei | 2 | 1.000 | 1.000 | 0.356 | S |
22 | f_wei | 5 | 1.000 | 1.000 | 0.199 | S |
22 | f_wei | 15 | 0.000 | 0.000 | 0.000 | F |
22 | f_wei | 30 | 0.000 | 0.000 | 0.000 | F |
23 | f_zak | 2 | 1.000 | 1.000 | 0.960 | S+ |
23 | f_zak | 5 | 1.000 | 1.000 | 0.968 | S+ |
23 | f_zak | 15 | 1.000 | 1.000 | 0.993 | S+ |
23 | f_zak | 30 | 1.000 | 1.000 | 0.900 | S+ |
24 | f_6h | 2 | 1.000 | 1.000 | 0.663 | S+ |
Dimension | FLCs | QM | SRate | SSpeed |
---|---|---|---|---|
D = 1 | FEM0.01 | NA | NA | −0.674
 | FEM0.1 | NA | NA | −0.421
 | DM | NA | NA | −0.313
 | Gavg | NA | NA | −0.684
 | Gdev | NA | NA | −0.432
 | FDC | NA | NA | 0.146
 | FCI | NA | NA | 0.442
 | FCIdev | NA | NA | 0.492
D = 2 | FEM0.01 | 0.020 | −0.015 | −0.063
 | FEM0.1 | −0.372 | −0.358 | −0.088
 | DM | −0.427 | −0.478 | −0.475
 | Gavg | −0.203 | −0.285 | −0.396
 | Gdev | −0.292 | −0.306 | −0.476
 | FDC | 0.433 | 0.478 | 0.401
 | FCI | 0.400 | 0.415 | 0.374
 | FCIdev | 0.028 | 0.083 | 0.095
D = 5 | FEM0.01 | −0.188 | 0.036 | −0.393
 | FEM0.1 | −0.161 | −0.046 | −0.388
 | DM | −0.420 | −0.472 | −0.293
 | Gavg | −0.318 | −0.149 | −0.587
 | Gdev | −0.401 | −0.266 | −0.535
 | FDC | 0.416 | 0.488 | 0.321
 | FCI | 0.087 | 0.152 | 0.051
 | FCIdev | −0.151 | −0.027 | 0.051
D = 15 | FEM0.01 | −0.531 | −0.482 | −0.561
 | FEM0.1 | −0.480 | −0.430 | −0.490
 | DM | −0.409 | −0.310 | −0.312
 | Gavg | −0.541 | −0.539 | −0.565
 | Gdev | −0.536 | −0.535 | −0.496
 | FDC | 0.348 | 0.209 | 0.253
 | FCI | 0.468 | 0.319 | 0.380
 | FCIdev | 0.348 | 0.303 | 0.236
D = 30 | FEM0.01 | −0.471 | −0.382 | −0.489
 | FEM0.1 | −0.523 | −0.425 | −0.515
 | DM | −0.482 | −0.416 | −0.436
 | Gavg | −0.514 | −0.416 | −0.518
 | Gdev | −0.507 | −0.364 | −0.424
 | FDC | 0.378 | 0.295 | 0.308
 | FCI | 0.560 | 0.495 | 0.513
 | FCIdev | −0.210 | −0.182 | −0.272
FLCs | D = 1 | D = 2 | D = 5 | D = 15 | D = 30 |
---|---|---|---|---|---|
FEM0.01 | −0.124 | −0.383 | 0.019 | 0.030 | 0.066 |
FEM0.1 | −0.036 | 0.412 | 0.111 | 0.100 | 0.314 |
DM | 0.041 | 0.257 | 0.441 | 0.477 | 0.408 |
Gavg | −0.119 | −0.280 | 0.054 | −0.067 | 0.081 |
Gdev | 0.091 | −0.110 | 0.192 | 0.089 | 0.155 |
FDC | −0.300 | −0.207 | −0.395 | −0.284 | −0.373 |
FCI | −0.475 | −0.019 | −0.419 | −0.543 | −0.577 |
FCIdev | −0.530 | −0.192 | 0.178 | −0.376 | 0.087 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).