This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution license (http://creativecommons.org/licenses/by/3.0/).
Optimization algorithms are search methods whose goal is to find an optimal solution to a problem, satisfying one or more objective functions, possibly subject to a set of constraints. Studies of social animals and social insects have resulted in a number of computational models of swarm intelligence. Within these swarms the collective behavior is usually very complex; it emerges from the behaviors of the individuals of the swarm. Researchers have developed biologically inspired computational optimization methods such as Genetic Algorithms, Particle Swarm Optimization, and Ant Colony Optimization. The aim of this paper is to describe an optimization algorithm called the Bees Algorithm, inspired by the natural foraging behavior of honey bees, to find the optimal solution. The algorithm combines an exploitative neighborhood search with a random explorative search. In this paper, after an explanation of the natural foraging behavior of honey bees, the basic Bees Algorithm and its improved versions are described and implemented in order to optimize several benchmark functions, and the results are compared with those obtained with different optimization algorithms. The results show that the Bees Algorithm offers advantages over other optimization methods, depending on the nature of the problem.
Swarm Intelligence (SI) is defined as the collective problem-solving capability of social animals [
Self-organization arises from four elements, as suggested by Bonabeau
Many animal species benefit from similar procedures that enable them to survive and to produce new, better-adapted generations. Honey bees, ants, flocks of birds and shoals of fish are some examples of this efficient system in which individuals find safety and food. Moreover, even some other complex life forms follow similar simple rules to benefit from each other’s strengths [
Swarm-based optimization algorithms (SOAs) mimic nature’s methods to drive a search towards the optimal solution. A key difference between SOAs and direct search algorithms such as hill climbing and random walk is that SOAs use a population of solutions in every iteration instead of a single solution. Since a population of solutions is processed in each iteration, the outcome of each iteration is also a population of solutions [
The aim of this paper is to describe an optimization algorithm called the Bees Algorithm, introduced by Pham [
In this paper, after an explanation of the natural foraging behavior of honey bees, the basic Bees Algorithm and its improved versions are described and implemented in order to optimize several benchmark functions, and the results are compared with those obtained with different optimization algorithms. The paper is organized as follows: an overview of swarm optimization algorithms is given in
Swarm Optimization Algorithms (SOAs) mimic the collective exploration strategy of swarms in nature to solve optimization problems [
Evolutionary Algorithms (EAs) are well-known SOAs inspired by natural selection, mutation and recombination in biological evolution. Several forms of these algorithms have been introduced, such as Evolution Strategies, Evolutionary Programming, Genetic Algorithms and so on. In EAs, the main strategy is to find the optimal points by applying stochastic search operators such as natural selection, mutation and recombination to the population. The algorithms work with a random population of solutions and efficiently exploit historical information to speculate on new search areas with improved performance [
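The selection-recombination-mutation loop described above can be sketched as follows. This is a minimal illustration on a 2-D sphere function with our own function and parameter names; it is not the EA variant benchmarked later in the paper:

```python
import random

def evolve(fitness, dim, lo, hi, pop_size=30, gens=200, mut_sigma=0.1, seed=1):
    """Minimal elitist EA: selection, recombination and mutation on a box-bounded problem."""
    rng = random.Random(seed)
    # random initial population of candidate solutions
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        # natural selection: keep the fitter half of the population as parents
        pop.sort(key=fitness)
        parents = pop[:pop_size // 2]
        children = []
        while len(parents) + len(children) < pop_size:
            # recombination: uniform crossover of two randomly chosen parents
            a, b = rng.sample(parents, 2)
            child = [rng.choice(pair) for pair in zip(a, b)]
            # mutation: small Gaussian perturbation, clamped to the bounds
            child = [min(hi, max(lo, x + rng.gauss(0, mut_sigma))) for x in child]
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = evolve(lambda x: sum(v * v for v in x), dim=2, lo=-5.12, hi=5.12)
```

Because the parents survive unchanged each generation, the best fitness never worsens, which is the "historical information" the text refers to.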
Particle Swarm Optimization (PSO) draws on the social behavior of groups of animals in nature, such as herds, bird flocks, or schools of fish. PSO consists of a population called a swarm, and each member of the swarm is called a particle [
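The particle update at the heart of PSO can be sketched as follows. The inertia and acceleration coefficients (w, c1, c2) are typical textbook values, not the settings used in the experiments later in this paper:

```python
import random

def pso(objective, dim, lo, hi, swarm_size=30, iters=200, seed=1):
    """Minimal global-best PSO sketch for minimization over a box-bounded domain."""
    rng = random.Random(seed)
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (typical values)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm_size)]
    vel = [[0.0] * dim for _ in range(swarm_size)]
    pbest = [p[:] for p in pos]           # each particle's best-known position
    gbest = min(pbest, key=objective)[:]  # the swarm's best-known position
    for _ in range(iters):
        for i in range(swarm_size):
            for d in range(dim):
                # velocity update: inertia + cognitive pull + social pull
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] = min(hi, max(lo, pos[i][d] + vel[i][d]))
            if objective(pos[i]) < objective(pbest[i]):
                pbest[i] = pos[i][:]
                if objective(pos[i]) < objective(gbest):
                    gbest = pos[i][:]
    return gbest

best = pso(lambda x: sum(v * v for v in x), dim=2, lo=-5.12, hi=5.12)
```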
Ant Colony Optimization (ACO) was inspired by the pheromone-based foraging strategy of ants in nature, whose behavior is based on finding the shortest path between a food source and their nest [
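The pheromone feedback behind ACO can be illustrated with a toy model, under the simplifying assumption of a fixed set of alternative routes between nest and food source (the function name and parameters below are our own, not from any ACO library):

```python
import random

def aco_two_paths(lengths, ants=20, iters=50, rho=0.5, seed=1):
    """Toy ACO sketch: ants choose among alternative routes of given lengths;
    pheromone evaporates at rate rho and each ant deposits 1/length on its route."""
    rng = random.Random(seed)
    tau = [1.0] * len(lengths)  # initial pheromone on each route
    for _ in range(iters):
        deposits = [0.0] * len(lengths)
        for _ in range(ants):
            # roulette-wheel route choice, proportional to pheromone
            r, k = rng.uniform(0, sum(tau)), 0
            while r > tau[k]:
                r -= tau[k]
                k += 1
            deposits[k] += 1.0 / lengths[k]  # shorter routes receive more pheromone
        # evaporation plus the new deposits
        tau = [(1 - rho) * t + d for t, d in zip(tau, deposits)]
    return tau

tau = aco_two_paths([1.0, 2.0])
```

The positive feedback concentrates pheromone on the shorter route, which is how real ant colonies converge on the shortest path.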
In nature, honey bees exhibit several complex behaviors such as mating, breeding and foraging. These behaviors have been mimicked in several honey-bee-based optimization algorithms.
One well-known algorithm inspired by the mating and breeding behavior of honey bees is Marriage in Honey Bees Optimization (MBO). The algorithm starts from a single queen without a family and progresses to the development of a colony with one or more queens. In the literature, several versions of MBO have been proposed, such as Honey-Bees Mating Optimization (HBMO) [
The other type of bee-inspired algorithm mimics the foraging behavior of honey bees. These algorithms use standard evolutionary or random explorative search to locate promising locations, and then apply exploitative search on the most promising locations to find the global optimum. The following algorithms were inspired by the foraging behavior of honey bees: Bee System (BS), Bee Colony Optimization (BCO), Artificial Bee Colony (ABC) and the Bees Algorithm (BA).
Bee System is an improved version of the Genetic Algorithm (GA) [
Bee Colony Optimization (BCO) was proposed to solve combinatorial optimization problems by [
Artificial Bee Colony optimization (ABC) was proposed by Karaboga
The Bees Algorithm was proposed by Pham
A colony of honey bees can exploit a large number of food sources over a wide area, flying up to 11 km to reach a food source [
The scout bees inform their peers waiting in the hive of the quality of the food source, based, amongst other things, on sugar levels. The scout bees deposit their nectar and go to the dance floor in front of the hive to communicate with the other bees by performing a dance known as the “waggle dance” [
The waggle dance is named after the wagging run (in which the dancers produce a loud buzzing sound by moving their bodies from side to side), which the scout bees use to communicate information about the food source to the rest of the colony. By means of the waggle dance, a scout bee conveys the quality of the food source, its distance from the hive and its direction [
The waggle dance path has a figure-of-eight shape. Initially the scout bee vibrates its wing muscles, producing a loud buzz, and runs in a straight line whose angle relative to the vertical on the comb indicates the direction of the food source relative to the sun’s azimuth in the field, see
(
The BA has both local and global search capability utilizing exploitation and exploration strategies, respectively. The BA uses the set of parameters given in
Pseudocode of the basic Bees Algorithm.
Basic parameters of the Bees Algorithm.
Parameter
Number of scout bees in the selected patches
Number of best patches in the selected patches
Number of elite patches in the selected best patches
Number of recruited bees in the elite patches
Number of recruited bees in the non-elite best patches
The size of the neighborhood for each patch
Number of iterations
Difference between the value of the first and last iterations

Flowchart of the basic Bees Algorithm.
The algorithm starts by sending
(
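Assuming minimization over a box-bounded continuous domain, the basic loop can be sketched as follows. This is a minimal illustration with our own function names, not the exact implementation benchmarked later; the parameters follow the table above (n scout bees, m selected sites, e elite sites, nep/nsp recruited bees, ngh neighborhood size):

```python
import random

def bees_algorithm(f, dim, lo, hi, n=50, m=15, e=3, nep=12, nsp=8,
                   ngh=1.0, iters=100, seed=1):
    """Sketch of the basic Bees Algorithm for minimizing f over [lo, hi]^dim."""
    rng = random.Random(seed)

    def scout():
        # a scout bee samples a random point in the search space
        return [rng.uniform(lo, hi) for _ in range(dim)]

    pop = [scout() for _ in range(n)]
    for _ in range(iters):
        pop.sort(key=f)  # rank the visited sites by fitness
        new_pop = []
        for i, site in enumerate(pop[:m]):
            recruits = nep if i < e else nsp  # elite sites get more recruited bees
            # exploitative neighborhood search around the selected site
            neighbours = [[min(hi, max(lo, x + rng.uniform(-ngh, ngh)))
                           for x in site] for _ in range(recruits)]
            new_pop.append(min(neighbours + [site], key=f))
        # the remaining bees keep scouting randomly (explorative global search)
        new_pop += [scout() for _ in range(n - m)]
        pop = new_pop
    return min(pop, key=f)

best = bees_algorithm(lambda x: sum(v * v for v in x), dim=2, lo=-5.12, hi=5.12)
```

Only the fittest bee of each selected site survives into the next iteration, which is what makes the neighborhood search exploitative.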
This section describes the proposed improvements to the BA, which apply an adaptive change to the neighborhood size and a site abandonment approach simultaneously. A combined neighborhood size change and site abandonment (NSSA) strategy was previously applied to the BA by Koc [
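A sketch of the two mechanisms acting on a single site is given below. The parameter names and the exact shrinking rule are illustrative assumptions, not the paper's implementation: the neighborhood shrinks whenever the site stops improving, and a site that stagnates for too long is abandoned in favor of a fresh random scout location.

```python
import random

def exploit_site(f, site, lo, hi, recruits=12, ngh0=1.0, shrink=0.8,
                 stagnation_limit=10, iters=200, seed=1):
    """Sketch of adaptive neighborhood shrinking plus site abandonment."""
    rng = random.Random(seed)
    dim = len(site)
    best, ngh, stagnant = site[:], ngh0, 0
    overall = best[:]  # best solution seen on any site so far
    for _ in range(iters):
        # recruited bees sample the current neighborhood of the site
        cands = [[min(hi, max(lo, x + rng.uniform(-ngh, ngh))) for x in best]
                 for _ in range(recruits)]
        cand = min(cands, key=f)
        if f(cand) < f(best):
            best, stagnant = cand, 0  # progress: keep the neighborhood size
        else:
            ngh *= shrink             # no progress: shrink the neighborhood
            stagnant += 1
        if f(best) < f(overall):
            overall = best[:]
        if stagnant >= stagnation_limit:  # site abandonment: restart elsewhere
            best = [rng.uniform(lo, hi) for _ in range(dim)]
            ngh, stagnant = ngh0, 0
    return overall

best = exploit_site(lambda x: sum(v * v for v in x), [4.0, -3.0], lo=-5.12, hi=5.12)
```

Shrinking focuses the recruited bees ever more tightly as a site converges, while abandonment frees them to explore when a site is exhausted.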
In this subsection, the ANSSA-based BA was tested on benchmark functions and the results were compared with those obtained using the basic BA and other well-known optimization techniques such as the Evolutionary Algorithm (EA), standard Particle Swarm Optimization (PSO) and the Artificial Bee Colony (ABC). There are several differences between these algorithms, as given in
EA has been applied to many optimization problems; however, the algorithm has advantages and disadvantages, as given below [
Advantages:
Feasibility of finding the global optimum for many problems,
Can be combined with other algorithms into hybrids,
Applicable to many optimization problems,
Suitable for both real-valued and binary problems.
Disadvantages:
Slow convergence rate,
Stability and convergence depend on the recombination and mutation rates,
The algorithm may converge to a suboptimal solution (risk of premature convergence),
Weak local search capability,
Encoding a problem can be difficult.
PSO has the following advantages and disadvantages [
Advantages:
The algorithm is easy to implement,
Its global search is efficient,
Its dependency on the initial solution is small,
It is fast,
It has few parameters to tune.
Disadvantages:
Weak local search capability,
Slow convergence rate,
It may get trapped in local minima on hard optimization problems.
Like the other algorithms, ABC has advantages and disadvantages:
Advantages:
The algorithm has strength in both local and global searches [
Implemented with several optimization problems [
Disadvantages:
Random initialization,
The algorithm has several parameters,
Parameters need to be tuned,
Probabilistic approach in the local search.
The BA also has advantages and disadvantages compared to the other algorithms [
Advantages:
The algorithm has local search and global search ability,
Implemented with several optimization problems,
Easy to use,
Can be hybridized with other algorithms.
Disadvantages:
Random initialization,
The algorithm has several parameters,
Parameters need to be tuned.
To analyze the behavior of each algorithm, a set of benchmark functions was used. The ten benchmark functions used for the tests are given in
The proposed algorithm was run one hundred times for each function. Its performance was assessed according to the accuracy and the average number of evaluations (
Furthermore, a t-test was used to measure the statistical significance of the difference between the proposed algorithm and the basic Bees Algorithm. The results are given in
The selected benchmark functions [
No  Function Name  Interval  Global Optimum
1  Goldstein & Price (2D)  [−2, 2]  X = [0, −1], F(X) = 3
2  Schwefel (2D)  [−500, 500]  X = [420.9687, 420.9687], F(X) = −837.9658
3  Schaffer (2D)  [−100, 100]  X = [0, 0], F(X) = 0
4  Rosenbrock (10D)  [−1.2, 1.2]  X = [1, 1, 1, 1, 1, 1, 1, 1, 1, 1], F(X) = 0
5  Sphere (10D)  [−5.12, 5.12]  X = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0], F(X) = 0
6  Ackley (10D)  [−32, 32]  X = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0], F(X) = 0
7  Rastrigin (10D)  [−5.12, 5.12]  X = [0, 0, 0, 0, 0, 0, 0, 0, 0, 0], F(X) = 0
8  Martin & Gaddy (2D)  [0, 10]  X = [5, 5], F(X) = 0
9  Easom (2D)  [−100, 100]  X = [π, π], F(X) = −1
10  Griewank (10D)  [−600, 600]  X = [100, 100, 100, 100, 100, 100, 100, 100, 100, 100], F(X) = 0
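Three of these functions can be coded directly from their standard textbook definitions. They are shown here only for illustration; the variants used in the paper's experiments may differ in minor details:

```python
import math

# Sphere: the simplest unimodal benchmark, global minimum 0 at the origin
def sphere(x):
    return sum(v * v for v in x)

# Rastrigin: highly multimodal, global minimum 0 at the origin
def rastrigin(x):
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

# Ackley: multimodal with a nearly flat outer region, global minimum 0 at the origin
def ackley(x):
    n = len(x)
    return (-20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / n))
            - math.exp(sum(math.cos(2 * math.pi * v) for v in x) / n)
            + 20 + math.e)

origin = [0.0] * 10  # the global optimum of all three functions
```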
The best test parameters for the BA after parameter tuning.
Parameters  Value
Number of Scout Bees in the Selected Patches  50
Number of Best Patches in the Selected Patches  15
Number of Elite Patches in the Selected Best Patches  3
Number of Recruited Bees in the Elite Patches  12
Number of Recruited Bees in the Non-Elite Best Patches  8
The Size of Neighborhood for Each Patch  1
Number of Iterations  5000
Difference between the First Iteration Value and the Last Iteration  0.001
Shrinking Constant  2
Number of Repetitions for Shrinking Process  10
Number of Repetitions for Enhancement Process  25
Number of Repetitions for Site Abandonment  100
The test parameters for the Evolutionary Algorithms (EA) [
Parameters  Crossover  No crossover 

Population size  100  
Evaluation cycles (max number)  5000  
Children per generation  99  
Crossover rate  1  0 
Mutation rate (variables)  0.05  0.8 
Mutation rate (mutation width)  0.05  0.8 
Initial mutation interval width α (variables)  0.1  
Initial mutation interval width ρ (mutation width)  0.1 
The test parameters for the Particle Swarm Optimization (PSO) [
Parameters  Value
Population size  100
PSO cycles (max number)  5000
Connectivity  See 
Maximum velocity  See 
C_{1}  2
C_{2}  2
Inertia weight (initial)  0.9
Inertia weight (final)  0.4
The test parameters for the PSO [
Maximum particle velocity for each connectivity (Connectivity, 
Max particle velocity 


Connectivity (number of neighbours)  (2, 0.005)  (2, 0.001)  (2, 0.05)  (2, 0.1) 
(10, 0.005)  (10, 0.001)  (10, 0.05)  (10, 0.1)  
(20, 0.005)  (20, 0.001)  (20, 0.05)  (20, 0.1)  
(100, 0.005)  (100, 0.001)  (100, 0.05)  (100, 0.1) 
The test parameters for the Artificial Bee Colony (ABC) [
Parameters  Value
Population size  100
ABC cycles (max number)  5000
Employed bees n_{e}  50
Onlooker bees n_{o}  49
Random scouts  1
Stagnation limit for site abandonment  50 × Dimension
Accuracy of the proposed algorithm compared with other wellknown optimization techniques.
No.  PSO  EA  ABC  BA  ANSSABA  

Avg. Abs. Dif.  Std. Dev.  Avg. Abs. Dif.  Std. Dev.  Avg. Abs. Dif.  Std. Dev.  Avg. Abs. Dif.  Std. Dev.  Avg. Abs. Dif.  Std. Dev.  
1  0.0000  0.0000  0.0000  0.0000  0.0000  0.0001  0.0000  0.0003  0.0000  0.0001 
2  4.7376  23.4448  4.7379  23.4448  0.0000  0.0000  0.0000  0.0005  0.0003  0.0007 
3  0.0000  0.0000  0.0009  0.0025  0.0000  0.0000  0.0000  0.0003  0.0001  0.0005 
4  0.5998  1.0436  61.5213  132.6307  0.0965  0.0880  44.3210  112.2900  0.0000  0.0003 
5  0.0000  0.0000  0.0000  0.0000  0.0000  0.0000  0.0000  0.0003  0.0000  0.0000 
6  0.0000  0.0000  0.0000  0.0000  0.0000  0.0000  1.2345  0.3135  0.0063  0.0249 
7  0.1990  0.4924  2.9616  1.4881  0.0000  0.0000  24.8499  8.3306  0.0002  0.0064 
8  0.0000  0.0000  0.0000  0.0000  0.0000  0.0000  0.0000  0.0003  0.0000  0.0000 
9  0.0000  0.0000  0.0000  0.0000  0.0000  2.0096  0.0000  0.0003  0.0000  0.0002 
10  0.0008  0.0026  0.0210  0.0130  0.0052  0.0078  0.3158  0.1786  0.0728  0.0202 
ANSSA: adaptive neighborhood sizes and site abandonment.
Average evaluation of proposed algorithm compared with other wellknown optimization techniques.
No.  PSO  EA  ABC  BA  ANSSABA  

Avg. evaluations  Std. Dev.  Avg. evaluations  Std. Dev.  Avg. evaluations  Std. Dev.  Avg. evaluations  Std. Dev.  Avg. evaluations  Std. Dev.  
1  3,262  822  2,002  390  2,082  435  504  211  250,049  0 
2  84,572  90,373  298,058  149,638  4,750  1,197  1,140  680  250,049  0 
3  28,072  21,717  219,376  183,373  21,156  13,714  121,088  174,779  250,049  0 
4  492,912  29,381  500,000  0  497,728  16,065  935,000  0  30,893.2  48,267.4 
5  171,754  7,732  36,376  2,736  13,114  480  285,039  277,778  25,098.3  36,483.4 
6  236,562  9,119  50,344  3,949  18,664  627  910,000  0  234,190.7  54,086.8 
7  412,440  67,814  500,000  0  207,486  57,568  885,000  0  93,580  97,429.1 
8  1,778  612  1,512  385  1,498  329  600  259  53,005.7  66,284.5 
9  16,124  15,942  36,440  28,121  1,542  201  5,280  6,303  250,049  0 
10  290,466  74,501  490,792  65,110  357,438  149,129  4,300,000  0  122,713.17  99,163.3 
In this paper, neighborhood search in the BA was investigated. The focus was on improving the BA by utilizing the adaptive neighborhood sizes and site abandonment (ANSSA) strategy. The accuracy of the algorithm was computed as the average absolute difference from the best results; accordingly, more accurate results are closer to zero. The proposed algorithm performed significantly better on high-dimensional functions. For example, its accuracy on the 10-D Rastrigin function was 0.0002 whereas the accuracy of the basic BA was 24.8499, and on the 10-D Ackley function it was 0.0063 whereas the accuracy of the basic BA was 1.2345. On the other hand, the proposed algorithm performs less well on lower-dimensional problems; for example, its accuracy on the 2-D Schwefel function was 0.0003, which was worse than the basic BA's result given in
The statistical significance of the difference between the adaptive neighborhood sizes and site abandonment (ANSSA)-based BA and the basic BA.
No.  Function  Significance between the basic BA and the improved BA  

Significant (p < 0.05)  p  
1  Goldstein & Price (2D)  No  0.200 
2  Schwefel (2D)  No  0.468 
3  Schaffer (2D)  No  0.801 
4  Rosenbrock (10D)  No  0.358 
5  Sphere (10D)  No  0.433 
6  Ackley (10D)  Yes  0.020 
7  Rastrigin (10D)  Yes  0.007 
8  Martin & Gaddy (2D)  No  0.358 
9  Easom (2D)  No  0.563 
10  Griewank (10D)  Yes  0.020 
As shown in
A t-test was carried out on the results obtained for the basic BA and the improved BA in order to determine whether there was any significant difference between the performances of the two methods. This was done by looking for evidence to reject the null hypothesis
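As an illustration, a two-sample t statistic can be computed as follows. This sketch uses Welch's form, which does not assume equal variances (the paper does not state which variant was used), and the data are made up, not the paper's results:

```python
import math

def welch_t(a, b):
    """Two-sample t statistic (Welch's version, no equal-variance assumption)."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    # unbiased sample variances of the two samples
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    # standard error of the difference between the means
    se = math.sqrt(va / len(a) + vb / len(b))
    return (ma - mb) / se

# Illustrative data: absolute errors of two algorithms over five repeated runs
t = welch_t([0.9, 1.1, 1.0, 0.95, 1.05], [0.1, 0.2, 0.15, 0.12, 0.18])
```

A large |t| relative to the critical value for the chosen significance level is the evidence against the null hypothesis that the two methods perform the same.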
In this paper, an optimization algorithm inspired by the natural foraging behavior of honey bees, called the Bees Algorithm, has been discussed, and an enhanced version called ANSSAbased Bees Algorithm has been proposed.
The proposed ANSSA-based BA has been successfully applied to continuous benchmark functions and compared with other well-known optimization techniques. To test the performance of the proposed algorithm, the following comparison approaches were utilized: accuracy analysis, average number of evaluations and a t-test. According to the results of the accuracy analysis and the average evaluations, the proposed algorithm performed better on higher-dimensional than on lower-dimensional functions.
Finally, the statistical significance of the proposed algorithm's results was assessed with a t-test against the basic Bees Algorithm. Based on the t-test results, it can be concluded that the proposed algorithm performed statistically significantly better than the basic Bees Algorithm on higher-dimensional functions such as Ackley (10D) and Griewank (10D).
We are grateful to our families and friends for their valuable support and motivation.
The authors declare no conflict of interest.