Article

A New Single-Parameter Bees Algorithm

by Hamid Furkan Suluova * and Duc Truong Pham

Department of Mechanical Engineering, The University of Birmingham, Birmingham B15 2TT, UK
* Author to whom correspondence should be addressed.
Biomimetics 2024, 9(10), 634; https://doi.org/10.3390/biomimetics9100634
Submission received: 2 September 2024 / Revised: 7 October 2024 / Accepted: 9 October 2024 / Published: 18 October 2024
(This article belongs to the Special Issue Nature-Inspired Metaheuristic Optimization Algorithms 2024)

Abstract

Based on bee foraging behaviour, the Bees Algorithm (BA) is a metaheuristic optimisation algorithm which has found many applications in both the continuous and combinatorial domains. The original version of the Bees Algorithm has six user-selected parameters: the number of scout bees, the number of high-performing bees, the number of top-performing or “elite” bees, the number of forager bees following the elite bees, the number of forager bees recruited by the other high-performing bees, and the neighbourhood size. These parameters must be chosen with care, as their values can affect the algorithm’s performance, particularly when the problem is complex. However, determining the optimum values for those parameters can be time-consuming for users who are not familiar with the algorithm. This paper presents BA1, a Bees Algorithm with just one parameter. BA1 eliminates the need to specify the numbers of high-performing and elite bees and the other associated parameters. Instead, it uses incremental k-means clustering to divide the scout bees into groups. By reducing the required number of parameters, BA1 simplifies the tuning process and increases efficiency. BA1 has been evaluated on 23 benchmark functions in the continuous domain and on 12 problems from the TSPLIB in the combinatorial domain. The results show good performance against popular nature-inspired optimisation algorithms on the problems tested.

Graphical Abstract

1. Introduction

Metaheuristics have attracted interest because of their ability to find near-optimal solutions through rapid iterative computations [1,2]. As the size and complexity of problems increase, exact methods become less effective, making metaheuristics more practical [3]. Metaheuristics can address multi-objective optimisation challenges and perform well with limited data or computing resources. They are versatile and adjustable, making them effective for solving a broad spectrum of optimisation problems across different sectors [4,5,6].
Metaheuristics provide advantages in solving intricate problems [7]. However, they have several limitations [8,9]. These include the risk of becoming trapped at a local optimum and converging prematurely, difficulty in selecting appropriate parameter configurations, which can cause poor performance, and the inability to guarantee that the solution found is optimal [9,10].
The number of initial parameters and their numerical values are important in finding near-optimal solutions because they affect various factors, such as the convergence rate, the number of evaluations and the exploration–exploitation balance. These factors are typically employed to assess the efficiency of a metaheuristic [11]. Additionally, novice researchers may be unsure about the parameter configuration needed to solve a particular problem. Changing the parameter settings can lead to different exploration and exploitation behaviours, ultimately affecting the quality of the solutions achieved. Therefore, users must carefully select appropriate parameter values for the problem at hand to obtain the best results.
The Bees Algorithm (BA) is an intelligent computing method based on the foraging behaviour of honeybees [12]. It was developed in 2005 to solve continuous optimisation problems. Since its first introduction, the BA has attracted attention because of its versatility and ability to discover near-optimal solutions efficiently. The BA has also been used to solve several well-known combinatorial problems, including the travelling salesman problem (TSP), vehicle routing problem (VRP) and production planning and scheduling problems.
This paper introduces BA1, a single-parameter Bees Algorithm, and evaluates its performance against well-known algorithms. Parameter reduction is achieved via k-means clustering. Clustering is an unsupervised learning method which categorises unlabelled data by identifying their shared properties [13]. The k-means algorithm is a well-studied method for categorising the provided data items into k distinct clusters through an iterative process which converges to a local minimum [14]. There are various examples of metaheuristics being used to perform clustering, treated as an optimisation problem; however, to the authors’ knowledge, employing clustering to enhance optimisation algorithms has not been studied.
Like metaheuristics, the standard k-means clustering algorithm requires an appropriate parameter (the K value) to be selected before clustering can proceed. There are several techniques for managing the parameter configuration of k-means, one of which is incremental k-means, developed by Pham et al. [15]. Incremental k-means explores the total distortion for each K value by performing cluster-centre jumps. Pham et al. [16] proposed a function which uses global distortion to evaluate clustering results. This function facilitates the automatic selection of the best number of clusters for the k-means method.
This paper is structured as follows. Section 2 introduces the Bees Algorithm and some of its recent variants. Section 3 explains the proposed single-parameter Bees Algorithm. Section 4 presents the results of evaluating this new algorithm against the Bat [17], Grey Wolf [18] and Whale [19] optimisation algorithms on 23 continuous benchmark functions and against the same three algorithms plus Moth Flame Optimisation [20] and Particle Swarm Optimisation [21] algorithms on 12 combinatorial problems from the TSPLIB. Section 5 concludes the paper.

2. The Bees Algorithm

Honeybees are social insects which have inspired researchers to use their behaviours, including food foraging, as the basis of numerous optimisation algorithms [22]. Bees in nature forage the fields surrounding their hive for food, looking for flower patches abundant in nectar. Upon reaching the hive, they deposit the collected nectar. Bees pass information about the food to the hive by dancing (i.e., waggle dancing) on a specific area of the hive called the “dance floor”. The waggle dance indicates the direction, distance and quality of the nectar supply. The number of recruited bees depends on the assessment of the patch’s quality. Food sources which are more abundant and have higher-quality nectar tend to attract more foragers, thus improving the food collection process [23].
The Bees Algorithm, which mimics this food-foraging behaviour, was first proposed by Pham et al. [24]. This version is known as the basic or original Bees Algorithm (BAO). In BAO, there are six user-selected parameters: the number of scout bees (n), the number of elite selected patches (e), the number of good selected patches (m–e), the number of recruited bees for the elite patches (nep), the number of recruited bees for the good patches (nsp) and the neighbourhood size (ngh). Although BAO was introduced to address continuous optimisation, it has subsequently been applied to combinatorial problems such as scheduling [25] and PCB assembly [26]. Algorithm 1 is the pseudocode of the original BA.
Algorithm 1: Original Bees Algorithm

Start
Input the required parameters: n, e, m, nep, nsp, ngh, MaxIt
Generate n initial solutions
Evaluate the fitness of the n initial solutions
Select the best m solutions for neighbourhood search
while iteration < MaxIt do
  for each site i (i = 1, …, e) do
    Exploit the site with nep forager bees within ngh of its centre (Equation (3)) and evaluate their fitness
    if a better solution is found, replace the site centre
  end for
  for each site j (j = e + 1, …, m) do
    Exploit the site with nsp forager bees within ngh of its centre (Equation (3)) and evaluate their fitness
    if a better solution is found, replace the site centre
  end for
  for each site k (k = m + 1, …, n) do
    Explore the search space with the remaining n − m scout bees (Equation (1)) and evaluate their fitness
  end for
end while
Return the best-so-far solution
In the initialisation step of BAO, the user inputs the aforementioned parameters and the stopping criteria. The algorithm then starts the optimisation process by sending n scout bees (Xs) to the search space with a uniform random distribution within its lower (xmin) and upper (xmax) boundaries (Equation (1)). Then, the algorithm evaluates the fitness of each scout bee. According to the quality of food sources, scout bees recruit forager bees (i.e., nep for the e elite patches and nsp for the (m–e) good selected patches) to exploit the source. These forager bees Xf look for locations with better fitness scores for their patches, the initial size of which is defined by ngh (Equations (2) and (3)). The remaining (n–m) scout bees continue to explore the search space randomly. The algorithm updates the obtained solution at the end of each cycle and returns the best-so-far solution when a stopping criterion is met:
$$X_s^{i} = U(x_{min},\ x_{max}), \qquad i = 1, \dots, n \tag{1}$$

$$r = ngh \times (x_{max} - x_{min}) \tag{2}$$

$$X_f^{i,j} = X_s^{i} + U(-r,\ r), \qquad i = 1, \dots, n; \quad j = \begin{cases} 1, \dots, nep & \text{if } i \le e \\ 1, \dots, nsp & \text{if } e < i \le m \end{cases} \tag{3}$$
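For concreteness, the following is a minimal Python sketch of BAO for continuous minimisation (the parameter defaults, the clipping to the domain bounds and the sphere objective in the usage line are illustrative choices made here, not values from the paper):

```python
import numpy as np

def bees_algorithm(f, dim, x_min, x_max,
                   n=50, m=15, e=5, nep=20, nsp=10, ngh=0.1, max_it=1000):
    """Minimal sketch of the original Bees Algorithm (BAO) for minimisation."""
    rng = np.random.default_rng()
    r = ngh * (x_max - x_min)                      # patch radius (Equation (2))
    X = rng.uniform(x_min, x_max, (n, dim))        # scatter n scouts (Equation (1))
    fit = np.array([f(x) for x in X])
    for _ in range(max_it):
        order = np.argsort(fit)                    # rank sites, best first
        X, fit = X[order], fit[order]
        for i in range(m):                         # local search on the m best sites
            foragers = nep if i < e else nsp       # more bees for the e elite sites
            F = np.clip(X[i] + rng.uniform(-r, r, (foragers, dim)),
                        x_min, x_max)              # foragers around the centre (Equation (3))
            f_fit = np.array([f(x) for x in F])
            j = int(np.argmin(f_fit))
            if f_fit[j] < fit[i]:                  # keep the best forager if it improves the site
                X[i], fit[i] = F[j], f_fit[j]
        X[m:] = rng.uniform(x_min, x_max, (n - m, dim))  # remaining scouts keep exploring
        fit[m:] = np.array([f(x) for x in X[m:]])
    best = int(np.argmin(fit))
    return X[best], fit[best]

# Usage: minimise the 10-dimensional sphere function (F1)
x_best, f_best = bees_algorithm(lambda x: float(np.sum(x * x)), 10, -100.0, 100.0)
```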
The shrinking strategy, which reduces the patch size when the algorithm is not able to improve the best solution on the patch, was employed to obtain a more exploitative local search and increase the density around a local optimum [27]. Pham and Darwish [28] also utilised this method in addition to fuzzy selection of the flower patches. Parameter reduction was performed via the fuzzy Bees Algorithm (BAFuzzy), which uses fuzzy logic to choose elite patches among the selected patches and the foragers to perform a local search.
An improved version of the BA, referred to as the standard Bees Algorithm (BAS), which includes shrinking and site abandonment, was introduced in [23]. With site abandonment, the bees leave a patch when the situation is stagnant (i.e., there is no improvement following a set number of trials during a local search). It is a method complementary to the shrinking strategy since they both act on a lack of improvement in the patch’s best solution. While shrinking improves the exploitation capability of the algorithm, site abandonment helps to avoid local optima.
BAS becomes an eight-parameter metaheuristic with the addition of the shrinking rate (shrink) and stagnation limit (stlim) to the existing parameters of BAO (i.e., n, e, m, nep, nsp and ngh). Here, shrink is a positive real multiplicative factor less than or equal to one to be applied to ngh, while stlim is a positive integer and is the number of trials made before a patch is abandoned.
Ismail et al. [29] achieved parameter reduction with BA2, a two-parameter version of the BA, for both continuous and combinatorial problems. The reduction is performed by combining the exploration and exploitation phases while maintaining the core principles of the BA. BA2 only requires the user to define the number of patches (n) and the maximum number of foragers sent to the top-ranked (elite) patch (nep). The number of foragers for the other patches is determined via Equation (4), where $w_{max}$ represents nep, $w_{min}$ equals one, and k is the rank of a patch (i.e., k = 1 for the best patch; see Figure 1). BA2 employs a triangular distribution to place foragers on the patches. The centre of a patch corresponds to the summit of its triangular distribution. There is no longer a crisply defined neighbourhood: each patch’s distribution covers the whole search space [29]:
$$w_k = w_{max} + \frac{(k - 1)\,(w_{min} - w_{max})}{n - 1} \tag{4}$$
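For illustration, Equation (4) reduces to a few lines of Python (rounding the weights to whole bees is an assumption made here for readability):

```python
def ba2_foragers(n, nep):
    """Forager count per patch rank in BA2 (Equation (4)); w_min = 1, w_max = nep."""
    w_max, w_min = nep, 1
    return [round(w_max + (k - 1) * (w_min - w_max) / (n - 1))
            for k in range(1, n + 1)]

print(ba2_foragers(5, 9))  # -> [9, 7, 5, 3, 1], matching Figure 1 (n = 5, nep = 9)
```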
The initial version of BA2 did not have a shrinking rate or stagnation limit (i.e., BA2 is a two-parameter version of BAO). If the shrinking rate and stagnation limit are included, then the total number of parameters is four, and the algorithm becomes a reduced-parameter version of BAS [30].
Hartono and Pham [31] introduced the Fibonacci Bees Algorithm (BAF), which reduces parameters via the Fibonacci sequence. BAF discards two parameters from BAS, namely the number of elite sites (e) and the number of recruited bees for elite sites (nep). The m selected sites are ranked, and the number of recruited bees for each selected site is determined according to its rank using the Fibonacci sequence.
Suluova, Hartono and Pham [22] proposed a continuous version of BAF with a total of five parameters: the number of scout bees (n), the number of selected sites (m), the maximum number of foragers for the top-ranked site (nr), the shrinking rate (shrink) and the stagnation limit (stlim). Their results showed that BAF is competitive and has an improved success rate for some benchmark functions compared with BAS.

3. Details of BA1

The k-means clustering method is employed to determine the number of patches by grouping the search agents (i.e., the bees). Grouping individuals in the search space eliminates the need for parameters related to the distribution of bees to sites, such as the number of elite patches (e), the number of recruited bees for elite patches (nep), the number of selected patches (m), and the number of recruited bees for other selected patches (nsp). As mentioned above, to group the bees in the search space without increasing the number of parameters, the cluster selection function f(K) by Pham et al. [16] is used to select the optimal number of clusters automatically. As explained in [16], to determine an appropriate K value, the distortion of each cluster is first calculated separately for each K value (Equation (5)). Then, the total distortion of the clusters is obtained via Equation (6):
$$I_K^{j} = \sum_{t=1}^{N_j} d(x_{jt},\ w_j)^2 \tag{5}$$

$$S_K = \sum_{j=1}^{K} I_K^{j} \tag{6}$$
In Equation (5), $I_K^{j}$ represents the distortion of cluster j when the bees form K clusters, $N_j$ is the number of bees belonging to cluster j, $x_{jt}$ is its t-th element, and $d(x_{jt}, w_j)$ is the distance between $x_{jt}$ and the centre $w_j$ of cluster j. $S_K$ is the actual distortion for a specified K value, whereas $\alpha_K S_{K-1}$ estimates the distortion for K based on the actual distortion for K − 1. BA1 keeps a record of the total distortions and determines the optimum K value using f(K) (Equation (7)) for the initialised bee distribution. If the distribution is uniform (i.e., the whole population forms a single cluster), then f(K) becomes one. However, if the bees in the search space are dense at some locations, then f(K) decreases, confirming that there are well-defined clusters. In the equations below, $\alpha_K$ is a weight factor, while $N_d$ is the number of dimensions:
$$f(K) = \begin{cases} 1 & \text{if } K = 1 \\ \dfrac{S_K}{\alpha_K S_{K-1}} & \text{if } S_{K-1} \ne 0,\ K > 1 \\ 1 & \text{if } S_{K-1} = 0,\ K > 1 \end{cases} \tag{7}$$

$$\alpha_K = \begin{cases} 1 - \dfrac{3}{4 N_d} & \text{if } K = 2 \text{ and } N_d > 1 \\ \alpha_{K-1} + \dfrac{1 - \alpha_{K-1}}{6} & \text{if } K > 2 \text{ and } N_d > 1 \end{cases} \tag{8}$$
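As an illustration, the cluster-number selection of Equations (5)–(8) might be implemented as follows (a sketch: scikit-learn's standard k-means stands in for the incremental k-means of [15], and K_max, n_init and the random seed are arbitrary choices):

```python
import numpy as np
from sklearn.cluster import KMeans

def select_K(X, K_max=10):
    """Choose the number of clusters with the f(K) measure of Pham et al. [16].

    X is an (n, Nd) array of scout-bee positions.
    Returns (best_K, cluster labels, cluster centres)."""
    Nd = X.shape[1]
    S, f, alpha, models = {}, {1: 1.0}, {}, {}   # f(1) = 1 by definition (Equation (7))
    for K in range(1, min(K_max, len(X)) + 1):
        km = KMeans(n_clusters=K, n_init=10, random_state=0).fit(X)
        models[K] = km
        S[K] = km.inertia_          # total distortion S_K (Equations (5) and (6))
        if K == 1:
            continue
        # Weight factor alpha_K (Equation (8)); assumes Nd > 1
        alpha[K] = (1 - 3 / (4 * Nd) if K == 2
                    else alpha[K - 1] + (1 - alpha[K - 1]) / 6)
        # Evaluation function f(K) (Equation (7))
        f[K] = 1.0 if S[K - 1] == 0 else S[K] / (alpha[K] * S[K - 1])
    best_K = min(f, key=f.get)      # a small f(K) indicates well-defined clusters
    km = models[best_K]
    return best_K, km.labels_, km.cluster_centers_
```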
Using the incremental k-means clustering method [15] and the K selection function [16], the total population of bees in the search space is categorised into one or more groups. As a first step in clustering, the entire population of n scout bees (Xs) explores the search space (Equation (9)). Then, the algorithm measures the Euclidean distances d between the fittest bee (Xs(1)) and the other individuals (Equation (10)). With the incremental k-means approach, using these distances, the algorithm chooses the most appropriate K value for the population and the problem. The selected K value (i.e., the number of clusters) becomes the number of patches, and the individuals belonging to each cluster become forager bees which exploit the site whose centre is the fittest bee in the cluster. As a result of this process, the proposed algorithm eliminates the need to set the numbers of elite and good patches and the numbers of bees recruited for those patches:
$$X_s^{i} = U(x_{min},\ x_{max}), \qquad i = 1, \dots, n \tag{9}$$

$$d_i = \left\lVert X_s^{(1)} - X_s^{(i)} \right\rVert_2, \qquad i = 1, \dots, n \tag{10}$$
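Continuing the sketch, the initialisation and clustering steps of Equations (9) and (10) might look as follows, reusing the select_K function above (the sphere objective, population size and seed are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):                                       # F1 benchmark as a stand-in objective
    return np.sum(x ** 2, axis=-1)

n, dim, x_min, x_max = 100, 30, -100.0, 100.0
Xs = rng.uniform(x_min, x_max, (n, dim))             # Equation (9): scatter n scouts
fit = sphere(Xs)
d = np.linalg.norm(Xs - Xs[np.argmin(fit)], axis=1)  # Equation (10): distances to the
                                                     # fittest bee, used by the clustering

K, labels, _ = select_K(Xs)                          # number of patches chosen automatically
# Each cluster becomes a patch whose centre is the fittest bee it contains
patch_centres = np.stack([Xs[labels == c][np.argmin(fit[labels == c])]
                          for c in range(K)])
```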
The triangular probability distribution (T[0, pk, 1]) is employed to integrate exploration and exploitation. Here, pk represents the peak point of a patch and is determined using Equation (11), where k is the rank number of a patch (i.e., k = 1 is the best patch) and K is the total number of patches. A random number generator based on a triangular distribution (TRNG) generates numbers between 0 and 1 to determine a swarm radius r for each forager bee j on patch k (Equation (12)), where zero corresponds to the patch centre and one represents the furthest point from the centre. Then, a new solution Xf is generated via Equation (13) within patch k, the centre of which is located at Xs. The forager bees tend to exploit a patch when it is highly ranked. As can be seen in Figure 2, the majority of the foragers swarm near the centres of highly ranked patches. As a result of using a TRNG, the ngh parameter of the original BA is eliminated, and the exploration (0.5 < pk < 1) and exploitation (0 < pk < 0.5) phases are merged:
$$p_k = \frac{k - 1}{K - 1}, \qquad k = 1, \dots, K \tag{11}$$

$$r_{k,j} = ngh_k \times (x_{max} - x_{min}) = T(0,\ p_k,\ 1) \times (x_{max} - x_{min}) \tag{12}$$

$$X_f^{k, j_k} = X_s^{k} + U(-r_{k, j_k},\ r_{k, j_k}), \qquad k = 1, \dots, K; \quad j_k = \begin{cases} 1, \dots, w_1 & \text{if } k = 1 \\ \quad \vdots & \\ 1, \dots, w_K & \text{if } k = K \end{cases} \tag{13}$$
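A sketch of this placement scheme, using NumPy's triangular generator as the TRNG (the function name, the clipping to the domain and the example arguments are choices made here, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)

def forager_positions(centre, rank_k, K, w_k, x_min, x_max):
    """Place w_k foragers around a patch centre (sketch of Equations (11)-(13)).

    rank_k = 1 is the best patch, so its foragers cluster near the centre."""
    p_k = (rank_k - 1) / (K - 1) if K > 1 else 0.0       # Equation (11): peak of T[0, p_k, 1]
    # Equation (12): one radius per forager, drawn from T(0, p_k, 1), scaled to the domain
    radii = rng.triangular(0.0, p_k, 1.0, (w_k, 1)) * (x_max - x_min)
    # Equation (13): a uniform step of at most +/- r from the patch centre, per dimension
    steps = rng.uniform(-1.0, 1.0, (w_k, len(centre))) * radii
    return np.clip(centre + steps, x_min, x_max)

# Foragers for the top-ranked of K = 4 patches concentrate near the centre (cf. Figure 2)
best_patch = forager_positions(np.zeros(30), rank_k=1, K=4, w_k=20,
                               x_min=-100.0, x_max=100.0)
```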
It is known that the efficiency of the BA increases with the shrinking and site abandonment strategies. However, determining the correct parameter values for those strategies can be challenging. Selecting an improper shrinking rate could cause the algorithm to miss an optimum and converge prematurely. Similarly, it is critical to select a suitable stagnation limit (i.e., the number of consecutive visits to a patch without improvement), since an incorrect limit could cause either early abandonment or staying on a patch even after it has run out of nectar. Although these two strategies can increase the algorithm’s performance, having to set two additional parameters makes the algorithm more complex for users. Thus, in BA1, the shrinking rate (shrink) is determined automatically by the algorithm itself based on the iteration number (Equation (14)). The shrinking rate decreases linearly, from close to 1 in the first iteration to 0.25 in the last iteration. The resulting reduction in patch size helps BA1 focus increasingly on exploitation towards the end of a run. Furthermore, the stagnation limit (stlim) is made dependent on the average number of bees per site. After the algorithm determines the number of patches, stlim is calculated using Equation (15):
$$shrink = 1 - \frac{3}{4} \cdot \frac{it}{MaxIt}, \qquad it = 1, \dots, MaxIt \tag{14}$$

$$stlim = \frac{ColonySize}{K} \tag{15}$$
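Both schedules reduce to one-line functions. The sketch below follows Equation (14) as reconstructed above and rounds stlim to a whole number of visits (an assumption made here, since the paper does not state how fractional values are handled):

```python
def shrink_rate(it, max_it):
    """Equation (14): patch-size multiplier, close to 1 at the first iteration,
    0.25 at the last."""
    return 1.0 - 0.75 * it / max_it

def stagnation_limit(colony_size, K):
    """Equation (15): stlim derived from the average number of bees per patch.
    Rounding to an integer is an assumption."""
    return max(1, round(colony_size / K))
```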
Figure 3 shows the general flowchart of BA1, which has been designed for both continuous and combinatorial optimisation problems. There are two minor differences between the continuous and combinatorial versions of BA1, namely the distance metric and the local operators used. In this work, the Hamming distance was adopted as the distance metric for combinatorial problems, including vehicle routing, sequence planning and production scheduling, for which the Euclidean distance ubiquitous in continuous optimisation cannot be employed.
The main local search operator in the continuous version is mutation, although other operators such as creep and randomisation can also be used. As with other combinatorial variants of the BA, the swap, insertion and reversion local operators are employed to generate new solutions. However, while previous variants used only one local operator to produce one solution from an existing solution in each iteration, BA1 applies all the local operators and selects the fittest of the solutions created. This greedy selection increases the number of fitness evaluations per cycle but helps the process converge to near-optimal solutions faster by leveraging the strengths of the different operators.
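A sketch of the three operators and the greedy selection applied to tours (the helper names and the tour_length callback are illustrative):

```python
import random

def swap(tour):                 # exchange two cities
    a, b = random.sample(range(len(tour)), 2)
    t = tour[:]
    t[a], t[b] = t[b], t[a]
    return t

def insertion(tour):            # remove one city and reinsert it elsewhere
    a, b = random.sample(range(len(tour)), 2)
    t = tour[:]
    t.insert(b, t.pop(a))
    return t

def reversion(tour):            # reverse the segment between two positions
    a, b = sorted(random.sample(range(len(tour)), 2))
    t = tour[:]
    t[a:b + 1] = reversed(t[a:b + 1])
    return t

def greedy_neighbour(tour, tour_length):
    """Apply all three operators and keep the fittest result (BA1's greedy selection)."""
    return min((op(tour) for op in (swap, insertion, reversion)), key=tour_length)
```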

4. Experiments and Results

Experiments to assess the performance of BA1 were conducted on both continuous and combinatorial problems. Twenty-three benchmark functions were used to evaluate BA1 in the continuous domain, where its performance was compared with those of the Bat Algorithm (BAT) [17], Grey Wolf Optimiser (GWO) [18] and Whale Optimisation Algorithm (WOA) [19]. For the combinatorial domain, BA1 was applied to 12 datasets from the Travelling Salesman Problem Library (TSPLIB) [32] and compared with BAT [17], the Discrete Whale Optimisation Algorithm (DWOA) [19], GWO [18], Moth Flame Optimisation (MFO) [20] and Particle Swarm Optimisation (PSO) [21]. The comparator algorithms were chosen for their proven competitive performance [33].
Table 1 lists the details of the functions used for continuous optimisation. The first six functions (F1–F6) were used to evaluate the exploitation ability of the algorithms. These functions have 30 dimensions and large domains. F7–F13 are 30-dimensional problems for testing the algorithms’ exploration performance. Functions F14–F23 were used to demonstrate the algorithms’ ability to handle low-dimensional problems whose global optima are not zero.
The parameter values for the different algorithms are given in Table 2. The single parameter of BA1, the total number of bees, was set to 100, equal to the number of search agents used by all the comparator algorithms. The other parameters of the comparator algorithms were taken from the literature [34]. Fifty independent runs of each algorithm were conducted per function, with a stopping criterion of 500,000 fitness evaluations. Any result smaller than $10^{-10}$ was recorded as 0. Table 3 presents the results of the continuous optimisation experiments.
From Table 3, it can be seen that BA1 was the sole top performer on 12 benchmark problems (F5, F6, F11, F12, F13, F14, F16, F18, F20, F21, F22 and F23) and shared first place on four others (F1, F2, F10 and F17). BA1 yielded strong results for both low- and high-dimensional functions, showing no search bias towards the origin. BA1 obtained the exact solutions for eight functions (F1, F2, F6, F10, F12, F13, F16 and F22) with zero standard deviation, reflecting its precision in reaching the global optimum. In addition, the proposed algorithm exhibited a high degree of robustness, generally finding stable solutions with low variability. Although BA1 was not the best performer on some problems, it still provided competitive solutions with low standard deviations, indicating its ability both to explore and to exploit. Thus, BA1 performed consistently well across numerous trials and provided precise and reliable solutions in the continuous domain.
Table 4 shows the results for the 12 combinatorial problems selected from the TSPLIB. The results for BAT, DWOA, GWO, MFO and PSO were taken from the study by Zhang et al. [35]. BA1 was again run 50 times, with each run limited to 1000 iterations. The population size was set equal to the number of cities in each problem. The comparison measure was the error rate (ER), which is the difference between the best-obtained solution (BOS) and the best-known solution (BKS) (Equation (16)):
$$ER = \frac{BOS - BKS}{BKS} \times 100 \tag{16}$$
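For example, the BA1 mean of 7930 on Berlin52 (best-known solution 7542) gives an ER of (7930 − 7542)/7542 × 100 ≈ 5.14%, matching the corresponding entry in Table 4:

```python
def error_rate(bos, bks):
    """Equation (16): percentage gap between the obtained and best-known tour lengths."""
    return (bos - bks) / bks * 100

print(round(error_rate(7930, 7542), 2))  # BA1 on Berlin52 in Table 4 -> 5.14
```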
From Table 4, it can be seen that BA1 had the lowest ER on eight of the problems, namely Ch150, D198, Eil51, Fl417, KroA100, Pr76, St70 and Tsp225. Only on one dataset (Oliver30) did BA1 have the poorest ER, sharing the bottom rank with PSO. BA1 performed better as the complexity of the problem increased: excluding Pr107, it was consistently the top performer on problems involving more than 100 cities. Although it was not the best algorithm on some problems (Oliver30, Berlin52 and Eil76), it was competitive and provided near-optimal solutions. Taking the average ER over all 12 problems, BA1 was in first place with 5.27%, followed by DWOA (6.58%), BAT (8.24%), MFO (8.52%), GWO (9.08%) and PSO (14.40%).

5. Conclusions

This paper presented BA1, a single-parameter BA for both continuous and combinatorial optimisation. BA1 only requires the user to set the size of the bee population. BA1 starts by randomly generating initial solutions in the search space. Next, it employs the incremental k-means method to cluster the bees based on their distances to the fittest bee. The algorithm performs the clustering iteratively and keeps a memory of the total distortion. It then selects the K value for which the distortion function f(K) is minimal. The bees are clustered into patches and forage the patch to which they belong; clustering automatically gives the number of bees for each patch. As in BA2, exploration and exploitation are merged via a probability distribution of foragers which concentrates bees around the top solutions and disperses them more randomly across the search space for lower-ranked solutions. Additionally, shrinking and site abandonment follow preset strategies (i.e., shrinking is iteration-dependent, and site abandonment is a function of the number of patches). Therefore, the number of parameters decreases from six or eight (original BA: n, e, m, nep, nsp, ngh; standard BA: the original BA parameters plus shrink and stlim) to one (n). BA1 simplifies the parameter configuration by minimising the number of parameters, making it even easier to use than BA2. The performance of BA1 was evaluated on 23 benchmark functions for continuous optimisation and 12 datasets from the TSPLIB for combinatorial optimisation and compared with those of popular metaheuristics. The results show that BA1 performed well against the comparator algorithms and demonstrated a high degree of reliability across numerous problems of varying complexity and size. Future work will extend the comparison to a larger set of benchmarks, including standard engineering test problems, and, in recognition of the no free lunch theorem [36], seek to define the classes of problems to which BA1 is best suited.

Author Contributions

Conceptualisation, H.F.S. and D.T.P.; methodology, H.F.S.; software, H.F.S.; writing—original draft preparation, H.F.S. and D.T.P.; writing—review and editing, H.F.S. and D.T.P.; visualisation, H.F.S.; supervision, D.T.P. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The source code for the proposed algorithm is accessible at https://github.com/hfsuluova/Single-Parameter_BeesAlgorithm (accessed on 8 October 2024).

Acknowledgments

The authors would like to thank the University of Birmingham’s BEAR Cloud Service for providing flexible resources for intensive computational work. Hamid Furkan Suluova’s doctoral studies are funded by the Republic of Türkiye’s Ministry of National Education.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Ezugwu, A.E.; Adeleke, O.J.; Akinyelu, A.A.; Viriri, S. A conceptual comparison of several metaheuristic algorithms on continuous optimisation problems. Neural Comput. Appl. 2020, 32, 6207–6251. [Google Scholar] [CrossRef]
  2. Malarczyk, M.; Katsura, S.; Kaminski, M.; Szabat, K. A Novel Meta-Heuristic Algorithm Based on Birch Succession in the Optimization of an Electric Drive with a Flexible Shaft. Energies 2024, 17, 4104. [Google Scholar] [CrossRef]
  3. Çaşka, S. The Performance of Symbolic Limited Optimal Discrete Controller Synthesis in the Control and Path Planning of the Quadcopter. Appl. Sci. 2024, 14, 7168. [Google Scholar] [CrossRef]
  4. Liu, H.; Zhou, R.; Zhong, X.; Yao, Y.; Shan, W.; Yuan, J.; Xiao, J.; Ma, Y.; Zhang, K.; Wang, Z. Multi-Strategy Enhanced Crested Porcupine Optimizer: CAPCPO. Mathematics 2024, 12, 3080. [Google Scholar] [CrossRef]
  5. Ismail, W.N.; Alsalamah, H.A. Efficient Harris Hawk Optimization (HHO)-Based Framework for Accurate Skin Cancer Prediction. Mathematics 2023, 11, 3601. [Google Scholar] [CrossRef]
  6. Ang, M.C.; Ng, K.W. Minimising printed circuit board assembly time using the bees algorithm with TRIZ-inspired operators. In Intelligent Production and Manufacturing Optimisation—The Bees Algorithm Approach; Springer International Publishing: Cham, Switzerland, 2022; pp. 25–41. [Google Scholar]
  7. Liu, C.; Zhang, D.; Li, W. Crown Growth Optimizer: An Efficient Bionic Meta-Heuristic Optimizer and Engineering Applications. Mathematics 2024, 12, 2343. [Google Scholar] [CrossRef]
  8. Mayouf, C.; Salhi, A.; Haidara, F.; Aroua, F.Z.; El-Sehiemy, R.A.; Naimi, D.; Aya, C.; Kane, C.S.E. Solving Optimal Power Flow Using New Efficient Hybrid Jellyfish Search and Moth Flame Optimization Algorithms. Algorithms 2024, 17, 438. [Google Scholar] [CrossRef]
  9. Zhang, Z.; Wang, X.; Yue, Y. Heuristic Optimization Algorithm of Black-Winged Kite Fused with Osprey and Its Engineering Application. Biomimetics 2024, 9, 595. [Google Scholar] [CrossRef]
  10. Riff, M.-C.; Montero, E. A new algorithm for reducing metaheuristic design effort. In Proceedings of the 2013 IEEE Congress on Evolutionary Computation, Cancun, Mexico, 20–23 June 2013; IEEE: Piscataway, NJ, USA, 2013; pp. 3283–3290. [Google Scholar]
  11. Barragan-Vite, I.; Medina-Marin, J.; Hernandez-Romero, N.; Anaya-Fuentes, G.E. A Petri Net-Based Algorithm for Solving the One-Dimensional Cutting Stock Problem. Appl. Sci. 2024, 14, 8172. [Google Scholar] [CrossRef]
  12. Castellani, M.; Pham, D.T. The bees algorithm—A gentle introduction. In Intelligent Production and Manufacturing Optimisation—The Bees Algorithm Approach; Springer International Publishing: Cham, Switzerland, 2022; pp. 3–21. [Google Scholar]
  13. Aljarah, I.; Faris, H.; Mirjalili, S. (Eds.) Evolutionary Data Clustering: Algorithms and Applications; Springer: Berlin/Heidelberg, Germany, 2021. [Google Scholar]
  14. Shi, N.; Liu, X.; Guan, Y. Research on k-means clustering algorithm: An improved k-means clustering algorithm. In Proceedings of the Third International Symposium on Intelligent Information Technology and Security Informatics, Jian, China, 2–4 April 2010; pp. 63–67. [Google Scholar]
  15. Pham, D.T.; Dimov, S.S.; Nguyen, C.D. An incremental K-means algorithm. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2004, 218, 783–795. [Google Scholar] [CrossRef]
  16. Pham, D.T.; Dimov, S.S.; Nguyen, C.D. Selection of K in K-means clustering. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2005, 219, 103–119. [Google Scholar] [CrossRef]
  17. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  18. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  19. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  20. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  21. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; IEEE: Piscataway, NJ, USA, 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  22. Suluova, H.F.; Hartono, N.; Pham, D.T. The Fibonacci Bees Algorithm for Continuous Optimisation Problems—Some Engineering Applications. In Proceedings of the International Workshop of the Bees Algorithm and Its Applications (BAA) 2023, Online, 15 November 2023. Paper 13. [Google Scholar]
  23. Pham, D.T.; Castellani, M. The bees algorithm: Modelling foraging behaviour to solve continuous optimization problems. Proc. Inst. Mech. Eng. Part C J. Mech. Eng. Sci. 2009, 223, 2919–2938. [Google Scholar] [CrossRef]
  24. Pham, D.T.; Ghanbarzadeh, A.; Koc, E.; Otri, S.; Rahim, S.; Zaidi, M. The Bees Algorithm. Technical Note; Manufacturing Engineering Centre, Cardiff University: Cardiff, UK, 2005. [Google Scholar]
  25. Pham, D.T.; Koc, E.; Lee, J.Y.; Phrueksanant, J. Using the bees algorithm to schedule jobs for a machine. In Proceedings of the Eighth International Conference on Laser Metrology, CMM and Machine Tool Performance, LAMDAMAP, Euspen, Cardiff, UK, 25–28 June 2007; pp. 430–439. [Google Scholar]
  26. Pham, D.T.; Otri, S.; Darwish, A.H. Application of the Bees Algorithm to PCB assembly optimisation. In Proceedings of the 3rd Virtual International Conference on Intelligent Production Machines and Systems (IPROMS 2007), Online, 2–13 July 2007; pp. 511–516. [Google Scholar]
  27. Pham, D.T.; Ghanbarzadeh, A. Multi-objective optimisation using the bees algorithm. In Proceedings of the 3rd International Virtual Conference on Intelligent Production Machines and Systems, Online, 2–13 July 2007; Volume 6. [Google Scholar]
  28. Pham, D.T.; Darwish, A.H. Fuzzy selection of local search sites in the Bees Algorithm. In Proceedings of the 4th International Virtual Conference on Intelligent Production Machines and Systems (IPROMS 2008), Cardiff, UK, 1–14 July 2008; pp. 1–14. [Google Scholar]
  29. Ismail, A.H.; Ruslan, W.; Pham, D.T. A user-friendly Bees Algorithm for continuous and combinatorial optimisation. Cogent Eng. 2023, 10, 2278257. [Google Scholar] [CrossRef]
  30. Ismail, A.H. Enhancing the Bees Algorithm Using the Traplining Metaphor. Ph.D. Thesis, University of Birmingham, Birmingham, UK, 2021. [Google Scholar]
  31. Hartono, N.; Pham, D.T. A novel Fibonacci-inspired enhancement of the Bees Algorithm: Application to robotic disassembly sequence planning. Cogent Eng. 2024, 11, 2298764. [Google Scholar] [CrossRef]
  32. Lin, S. Computer solutions of the traveling salesman problem. Bell Syst. Tech. J. 1965, 44, 2245–2269. [Google Scholar] [CrossRef]
  33. Ma, Z.; Wu, G.; Suganthan, P.N.; Song, A.; Luo, Q. Performance assessment and exhaustive listing of 500+ nature-inspired metaheuristic algorithms. Swarm Evol. Comput. 2023, 77, 101248. [Google Scholar] [CrossRef]
  34. Yang, X.S.; Slowik, A. Bat algorithm. In Swarm Intelligence Algorithms; CRC Press: Boca Raton, FL, USA, 2020; pp. 43–53. [Google Scholar]
  35. Zhang, J.; Hong, L.; Liu, Q. An improved whale optimization algorithm for the traveling salesman problem. Symmetry 2020, 13, 48. [Google Scholar] [CrossRef]
  36. Wolpert, D.H.; Macready, W.G. No Free Lunch Theorems for Optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
Figure 1. Triangular distribution for n = 5 and nep = 9. From left to right: the highest-ranked to the lowest-ranked patches. More foragers (9 bees) were recruited to the highest-ranked patch than to the other patches. The lowest-ranked patch received the smallest number of foragers (1 bee). (The orange lines indicate the peaks of the triangles.)
Figure 2. Example of swarming points for patches when K = 4. (a,b) Bees swarm near the centres of highly ranked patches; (c,d) bees are dispersed in lower-ranked patches.
Figure 3. Flowchart for BA1.
Table 1. Details of the benchmark functions.

Function | Dim | Bounds | Global Optimum
$F_1(x) = \sum_{i=1}^{n} x_i^2$ | 30 | [−100, 100] | 0
$F_2(x) = \sum_{i=1}^{n} \lvert x_i \rvert + \prod_{i=1}^{n} \lvert x_i \rvert$ | 30 | [−100, 100] | 0
$F_3(x) = \sum_{i=1}^{n} \left( \sum_{j=1}^{i} x_j \right)^2$ | 30 | [−100, 100] | 0
$F_4(x) = \max_i \{ \lvert x_i \rvert,\ 1 \le i \le n \}$ | 30 | [−100, 100] | 0
$F_5(x) = \sum_{i=1}^{n-1} \left[ 100 (x_{i+1} - x_i^2)^2 + (x_i - 1)^2 \right]$ | 30 | [−30, 30] | 0
$F_6(x) = \sum_{i=1}^{n} (\lfloor x_i + 0.5 \rfloor)^2$ | 30 | [−100, 100] | 0
$F_7(x) = \sum_{i=1}^{n} i x_i^4 + rand[0, 1)$ | 30 | [−1.28, 1.28] | 0
$F_8(x) = \sum_{i=1}^{n} -x_i \sin\left( \sqrt{\lvert x_i \rvert} \right)$ | 30 | [−500, 500] | −418.98 × n
$F_9(x) = \sum_{i=1}^{n} \left[ x_i^2 - 10 \cos(2\pi x_i) + 10 \right]$ | 30 | [−5.12, 5.12] | 0
$F_{10}(x) = -20 \exp\left( -0.2 \sqrt{\tfrac{1}{n} \sum_{i=1}^{n} x_i^2} \right) - \exp\left( \tfrac{1}{n} \sum_{i=1}^{n} \cos(2\pi x_i) \right) + 20 + e$ | 30 | [−32, 32] | 0
$F_{11}(x) = \tfrac{1}{4000} \sum_{i=1}^{n} x_i^2 - \prod_{i=1}^{n} \cos\left( \tfrac{x_i}{\sqrt{i}} \right) + 1$ | 30 | [−600, 600] | 0
$F_{12}(x) = \tfrac{\pi}{n} \left\{ 10 \sin^2(\pi y_1) + \sum_{i=1}^{n-1} (y_i - 1)^2 \left[ 1 + 10 \sin^2(\pi y_{i+1}) \right] + (y_n - 1)^2 \right\} + \sum_{i=1}^{n} u(x_i, 10, 100, 4)$, with $y_i = 1 + \tfrac{x_i + 1}{4}$ | 30 | [−50, 50] | 0
$F_{13}(x) = 0.1 \left\{ \sin^2(3\pi x_1) + \sum_{i=1}^{n-1} (x_i - 1)^2 \left[ 1 + \sin^2(3\pi x_{i+1}) \right] + (x_n - 1)^2 \left[ 1 + \sin^2(2\pi x_n) \right] \right\} + \sum_{i=1}^{n} u(x_i, 5, 100, 4)$ | 30 | [−50, 50] | 0
$F_{14}(x) = \left( \tfrac{1}{500} + \sum_{j=1}^{25} \tfrac{1}{j + \sum_{i=1}^{2} (x_i - a_{ij})^6} \right)^{-1}$ | 2 | [−65, 65] | 1
$F_{15}(x) = \sum_{i=1}^{11} \left[ a_i - \tfrac{x_1 (b_i^2 + b_i x_2)}{b_i^2 + b_i x_3 + x_4} \right]^2$ | 4 | [−5, 5] | 0.00030
$F_{16}(x) = 4 x_1^2 - 2.1 x_1^4 + \tfrac{1}{3} x_1^6 + x_1 x_2 - 4 x_2^2 + 4 x_2^4$ | 2 | [−5, 5] | −1.0316
$F_{17}(x) = \left( x_2 - \tfrac{5.1}{4\pi^2} x_1^2 + \tfrac{5}{\pi} x_1 - 6 \right)^2 + 10 \left( 1 - \tfrac{1}{8\pi} \right) \cos x_1 + 10$ | 2 | [−5, 5] | 0.398
$F_{18}(x) = \left[ 1 + (x_1 + x_2 + 1)^2 (19 - 14 x_1 + 3 x_1^2 - 14 x_2 + 6 x_1 x_2 + 3 x_2^2) \right] \times \left[ 30 + (2 x_1 - 3 x_2)^2 (18 - 32 x_1 + 12 x_1^2 + 48 x_2 - 36 x_1 x_2 + 27 x_2^2) \right]$ | 2 | [−2, 2] | 3
$F_{19}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{3} a_{ij} (x_j - p_{ij})^2 \right)$ | 3 | [1, 3] | −3.86
$F_{20}(x) = -\sum_{i=1}^{4} c_i \exp\left( -\sum_{j=1}^{6} a_{ij} (x_j - p_{ij})^2 \right)$ | 6 | [0, 1] | −3.32
$F_{21}(x) = -\sum_{i=1}^{5} \left[ (x - a_i)(x - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.1532
$F_{22}(x) = -\sum_{i=1}^{7} \left[ (x - a_i)(x - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.4028
$F_{23}(x) = -\sum_{i=1}^{10} \left[ (x - a_i)(x - a_i)^T + c_i \right]^{-1}$ | 4 | [0, 10] | −10.5363

where the penalty function used in F12 and F13 is
$$u(x_i, a, k, m) = \begin{cases} k (x_i - a)^m & \text{if } x_i > a \\ 0 & \text{if } -a \le x_i \le a \\ k (-x_i - a)^m & \text{if } x_i < -a \end{cases}$$
Table 2. Parameter settings used in the experiments.

Parameter | BA1 | BAT | GWO | WOA
Total population | 100 | 100 | 100 | 100
Loudness | NA | 1 | NA | NA
Pulse rate | NA | 1 | NA | NA
Alpha | NA | 0.97 | NA | NA
Gamma | NA | 0.1 | NA | NA
Minimum frequency | NA | 0 | NA | NA
Maximum frequency | NA | 2 | NA | NA

NA: not applicable.
Table 3. Results of the continuous optimisation experiments.

Function | BA1 Mean | BA1 Std Dev | BAT Mean | BAT Std Dev | GWO Mean | GWO Std Dev | WOA Mean | WOA Std Dev
F1 | 0.00 | 0.00 | 1.43 × 10^4 | 3.07 × 10^3 | 0.00 | 0.00 | 0.00 | 0.00
F2 | 0.00 | 0.00 | 1.15 × 10^3 | 1.02 × 10^2 | 0.00 | 0.00 | 0.00 | 0.00
F3 | 1.57 × 10^−2 | 1.72 × 10^−2 | 1.94 × 10^4 | 5.39 × 10^3 | 0.00 | 0.00 | 6.75 | 9.04
F4 | 3.91 | 3.54 | 6.06 × 10^1 | 4.65 | 0.00 | 0.00 | 1.62 | 5.43
F5 | 2.71 | 4.38 | 1.61 × 10^2 | 2.63 × 10^2 | 2.82 × 10^1 | 1.03 | 2.37 × 10^1 | 2.06 × 10^−1
F6 | 0.00 | 0.00 | 1.47 × 10^4 | 2.71 × 10^3 | 3.50 | 6.21 × 10^−1 | 7.90 × 10^−7 | 3.12 × 10^−7
F7 | 4.87 × 10^−2 | 1.99 × 10^−2 | 2.97 × 10^−2 | 1.24 × 10^−2 | 4.16 × 10^−4 | 1.01 × 10^−4 | 9.59 × 10^−5 | 1.07 × 10^−4
F8 | −1.14 × 10^4 | 1.79 × 10^2 | −6.03 × 10^3 | 5.84 × 10^2 | −5.93 × 10^3 | 7.84 × 10^2 | −1.24 × 10^4 | 4.37 × 10^2
F9 | 1.99 × 10^−2 | 1.39 × 10^−1 | 1.78 × 10^2 | 2.69 × 10^1 | 2.44 × 10^1 | 4.03 | 0.00 | 0.00
F10 | 0.00 | 0.00 | 1.90 × 10^1 | 2.23 × 10^−1 | 2.05 | 1.37 | 0.00 | 0.00
F11 | 1.48 × 10^−4 | 1.04 × 10^−3 | 4.68 × 10^2 | 4.27 × 10^1 | 6.14 × 10^−3 | 4.12 × 10^−3 | 4.03 × 10^−4 | 2.03 × 10^−3
F12 | 0.00 | 0.00 | 3.44 × 10^1 | 9.20 | 1.06 | 6.42 × 10^−1 | 1.44 × 10^−7 | 6.22 × 10^−8
F13 | 0.00 | 0.00 | 1.03 × 10^2 | 9.52 | 2.09 | 4.98 × 10^−1 | 2.68 × 10^−6 | 2.58 × 10^−6
F14 | 9.98 × 10^−1 | 3.33 × 10^−16 | 9.53 | 7.53 | 7.16 | 4.89 | 9.98 × 10^−1 | 4.66 × 10^−15
F15 | 6.68 × 10^−4 | 2.10 × 10^−3 | 1.34 × 10^−3 | 1.66 × 10^−3 | 4.04 × 10^−3 | 7.71 × 10^−3 | 4.19 × 10^−4 | 2.97 × 10^−4
F16 | −1.03 | 0.00 | −9.99 × 10^−1 | 1.60 × 10^−1 | −1.03 | 1.42 × 10^−11 | −1.03 | 1.96 × 10^−15
F17 | 3.98 × 10^−1 | 1.67 × 10^−16 | 3.98 × 10^−1 | 1.67 × 10^−16 | 3.98 × 10^−1 | 2.10 × 10^−9 | 3.98 × 10^−1 | 1.83 × 10^−11
F18 | 3.00 | 2.66 × 10^−15 | 7.32 | 9.90 | 3.00 | 7.34 × 10^−8 | 3.00 | 9.98 × 10^−10
F19 | −3.00 × 10^−1 | 2.78 × 10^−16 | −3.86 | 0.00 | −3.00 × 10^−1 | 2.78 × 10^−16 | −3.00 × 10^−1 | 2.78 × 10^−16
F20 | −3.32 | 3.11 × 10^−15 | −3.26 | 5.94 × 10^−2 | −3.27 | 5.87 × 10^−2 | −3.25 | 6.49 × 10^−2
F21 | −1.02 × 10^1 | 7.11 × 10^−15 | −5.48 | 3.08 | −8.45 | 2.94 | −1.02 × 10^1 | 1.72 × 10^−7
F22 | −1.04 × 10^1 | 0.00 | −5.41 | 3.37 | −9.68 | 2.01 | −1.04 × 10^1 | 9.04 × 10^−8
F23 | −1.05 × 10^1 | 1.74 × 10^−14 | −5.78 | 3.53 | −9.79 | 2.27 | −1.05 × 10^1 | 8.16 × 10^−8

The best solutions are in bold. If two algorithms found the same solution, the one with the smaller standard deviation was considered the best; if the standard deviations were also equal, both were considered best.
Table 4. Results of the combinatorial experiments.

Problem (BKS) | BA1 Mean | BA1 ER (%) | BAT Mean | BAT ER (%) | DWOA Mean | DWOA ER (%) | GWO Mean | GWO ER (%) | MFO Mean | MFO ER (%) | PSO Mean | PSO ER (%)
Berlin52 (7542) | 7930 | 5.14 | 7694 | 2.02 | 7727 | 2.45 | 7898 | 4.72 | 8184 | 8.51 | 7862 | 4.24
Ch150 (6528) | 6928 | 6.13 | 7440 | 13.97 | 7329 | 12.27 | 7384 | 13.11 | 7329 | 12.27 | 7833 | 19.99
D198 (15,780) | 16,240 | 2.92 | 16,849 | 6.77 | 16,603 | 5.22 | 17,109 | 8.42 | 16,911 | 7.17 | 18,130 | 14.89
Eil51 (426) | 437 | 2.58 | 439 | 3.05 | 445 | 4.46 | 441 | 3.52 | 449 | 5.40 | 445 | 4.46
Eil76 (538) | 562 | 4.46 | 561 | 4.28 | 579 | 7.62 | 565 | 5.02 | 577 | 7.25 | 595 | 10.59
Fl417 (11,861) | 13,099 | 10.44 | 15,532 | 30.95 | 13,886 | 17.07 | 15,492 | 30.61 | 14,087 | 18.77 | 18,688 | 57.56
KroA100 (21,282) | 22,018 | 3.46 | 23,424 | 10.06 | 22,471 | 5.59 | 22,963 | 7.90 | 23,456 | 10.22 | 23,480 | 10.33
Oliver30 (420) | 424 | 0.95 | 420 | 0 | 420 | 0 | 422 | 0.48 | 423 | 0.71 | 424 | 0.95
Pr76 (108,159) | 111,410 | 3.01 | 111,989 | 3.54 | 111,511 | 3.10 | 114,261 | 5.64 | 114,377 | 5.75 | 115,265 | 6.57
Pr107 (44,303) | 50,514 | 14.02 | 46,419 | 4.78 | 45,780 | 3.33 | 46,083 | 4.02 | 47,437 | 7.07 | 46,919 | 5.90
St70 (675) | 696 | 3.11 | 718 | 6.37 | 712 | 5.48 | 726 | 7.56 | 710 | 5.19 | 732 | 8.44
Tsp225 (3916) | 4192 | 7.05 | 4427 | 13.05 | 4399 | 12.33 | 4620 | 17.98 | 4469 | 14.12 | 5049 | 28.93

The best solutions are in bold.