  • Article
  • Open Access

6 March 2025

Farmer Ants Optimization Algorithm: A Novel Metaheuristic for Solving Discrete Optimization Problems

1 Department of Computer Engineering, Shafagh Institute of Higher Education, Tonekabon 4683165363, Iran
2 Department of Computer Engineering, Rasht Branch, Islamic Azad University, Rasht 413353516, Iran
3 Ministry of Education, Mashhad 9133714165, Iran
* Author to whom correspondence should be addressed.
This article belongs to the Special Issue Intelligent Information Technology

Abstract

Currently, certain complex problems are classified as NP-hard, meaning that no exact solution exists or that one cannot be computed in a reasonable amount of time. As a result, metaheuristic algorithms have been developed as an alternative. These algorithms aim to approximate the optimal solution rather than provide a definitive one. Over recent years, they have gained considerable attention from the research community. Nature and its inherent principles serve as the primary inspiration for the development of metaheuristic algorithms. A notable subgroup is evolutionary algorithms, which are modeled on the behavior of social and intelligent animals and organisms. However, each metaheuristic algorithm typically excels only on specific types of problems, so researchers continuously endeavor to develop new ones. This study introduces a novel metaheuristic algorithm, the Farmer Ants Optimization Algorithm (FAOA). The algorithm is inspired by the life of farmer ants, which cultivate mushrooms for food, protect them from pests, and nourish them as they grow. These behaviors, rooted in the ants' social dynamics, serve as the foundation for the proposed algorithm. Experiments conducted on various engineering and classical problems demonstrate that the FAOA provides acceptable solutions for discrete optimization problems.

1. Introduction

In the modern era, problems arise with varying computational complexity. Some of these are known as non-deterministic polynomial-time hard (NP-hard) problems []. There are thousands of NP-hard problems, each with many applications in the engineering sciences, and no definitive method has yet been found to solve them []. For such problems, it is very unlikely that a solution can be found in deterministic polynomial time []. As the amount of data increases, finding solutions becomes even more challenging []. Therefore, optimization methods are considered an effective approach in this field. Optimization means finding, among all possible values, the parameter settings of a given system that maximize or minimize its output. Optimization problems arise in most engineering fields []. Because of the drawbacks of some conventional techniques, the possibility of falling into local optima, and the need to expand the search space [], optimization techniques [] have been developed over the last two decades [,]. With the increasing complexity of problems, the need for optimization methods is felt more than ever []. Optimization plays a very important role in industry, the development of science, management, and the solution of any problem that can be modeled in this framework. For multidimensional or discontinuous models and noisy data, which cannot be handled by traditional methods, optimization algorithms can be used as an alternative [].
Real-world problems in machine learning and artificial intelligence are generally continuous, discrete, bounded, or unbounded [,]. Because of these features, it is difficult to find an exact solution for some classes of problems using conventional mathematical methods [,]. Several studies have confirmed that these methods are not efficient enough to solve some problems []. Metaheuristic algorithms are an alternative solution to solve such problems. These algorithms are usually inspired by intelligent concepts such as physical rules, social phenomena, animal behavior, and evolution [].
With the rapid growth of science and industry and the emergence of recent issues, metaheuristic algorithms are deployed more and more [,]. However, a specific metaheuristic algorithm cannot solve all problems. On the other hand, the major challenge of metaheuristic algorithms is obtaining the solution in the shortest possible time with the highest accuracy. Some algorithms are highly accurate in solving some problems, but their response time is longer than similar algorithms. For this reason, algorithms should approach an acceptable point in terms of accuracy and speed []. Today, metaheuristic algorithms have attracted the attention of many scientists. For example, the Genetic Algorithm (GA) [] uses the theory of evolution to solve discrete problems. Another algorithm in this field is called the Artificial Immune System []. This algorithm is inspired by the principles and processes of the vertebrate immune system and is modeled based on the learning and memory characteristics of the immune system. Ant Colony Optimization (ACO) [] is one of the most prominent metaheuristic algorithms. This algorithm is inspired by the search behavior of ants to find food. Particle Swarm Optimization (PSO) [] is another well-known algorithm inspired by the social movement of birds. The Flower Pollination Algorithm (FPA) [] is inspired by the growth of plants and the pollination process. Another algorithm in this field is the Firefly Algorithm []. This algorithm is based on the blinking of fireflies. The Trees Social Relations Optimization Algorithm (TSR) [] is one of the new algorithms in this field, which is inspired by the hierarchical and social life of trees. This algorithm can solve discrete and continuous problems. The Water Optimization Algorithm (WOA) [] is another new algorithm that is inspired by the chemical features of water molecules. The Lion Optimization Algorithm (LOA) [] is based on the life of lions and their individual and social behavior.
Some metaheuristic algorithms are inspired by modeling natural features of the world and adaptability to the environment []. Humankind has drawn on nature's guidance to solve problems for many years, and in recent decades many efforts have been made to develop algorithms derived from nature []. Evolutionary algorithms cannot optimally solve all problems; each is suitable for a certain group of problems. For this reason, researchers try to find new algorithms. These algorithms are used to solve engineering problems and problems in other fields, especially challenging ones [,]. Nature-inspired computing has attracted computer scientists for a long time, giving rise to popular fields such as neural networks [], cellular automata [], molecular computing [], and evolutionary algorithms [,].
Despite the existence of a significant number of metaheuristic algorithms, there are still critical challenges facing this class of methods. Some of them are as follows:
Parameter tuning: Many metaheuristic algorithms require careful tuning of numerous parameters to achieve optimal performance. This process is often time-consuming and may not consistently produce the best results.
Falling in local optima: Traditional algorithms frequently get trapped in local optima. These are solutions that are better than their immediate neighbors but not the best overall. This limits their ability to find globally optimal solutions.
Scalability: As optimization problems grow in size and complexity, the performance of many metaheuristic algorithms deteriorates, rendering them inefficient for large-scale applications.
Early convergence: Some algorithms converge too quickly on a narrow set of solutions, reducing diversity and potentially missing better alternatives elsewhere in the solution space.
Complexity: Many metaheuristic algorithms are computationally expensive, requiring significant time and processing power, especially for complex or NP-hard problems.
To fill these gaps, a new metaheuristic algorithm called the Farmer Ants Optimization Algorithm (FAOA) is introduced. This algorithm is inspired by the social life of farmer ants and targets discrete NP-hard problems. Farmer ants have been farming and growing mushrooms for millions of years [] and use their crops to feed the colony []. These ants cultivate a special type of mushroom for food: they plant its seeds in special chambers, feed it, and prevent it from rotting by producing a chemical substance and protecting it against invaders. Knowledge sharing and concurrent behavior are the principal features of the FAOA.
To address these gaps, the FAOA, inspired by the natural behaviors of farmer ants, features few parameters to adjust, leveraging adaptive mechanisms that minimize manual tuning effort. It also employs the social dynamics of farmer ants, such as knowledge sharing, to enhance exploration, enabling it to escape local optima effectively. The FAOA's parallelizable design divides problems into smaller sub-tasks managed by independent ant colonies, improving scalability for larger optimization challenges. The FAOA uses adjustable parameters such as the social profile (SP) to maintain exploration diversity, preventing early convergence and ensuring broader searches of the solution space. Inspired by the efficiency of ant farming, the FAOA incorporates a simple mathematical model, reducing computational complexity and enhancing efficiency for NP-hard problems.
The contribution of the proposed method includes the following:
  • Presenting the new Optimization Algorithm based on the life of farmer ants;
  • The concurrency feature of the algorithm, which allows complex problems to be solved with high accuracy and speed;
  • The ability to share local knowledge to reach the best global solution;
  • Ability to solve discrete NP-hard problems;
  • Evaluation of the proposed method and comparing it with some state-of-the-art algorithms in solving engineering problems.
The rest of this paper is organized as follows. Section 2 reviews related work. Section 3 investigates the life of farmer ants and introduces the new algorithm inspired by them. Section 4 is devoted to the evaluation of the proposed method. Finally, the paper is concluded in Section 5.

3. Materials and Methods

In this section, the life of the farmer ants is explained first, and then the basics of the proposed algorithm are explained.

3.1. Inspiration

Over thousands of years, humans have domesticated more than 260 plant species, 470 animal species, and 100 mushroom species []. However, farmer ants, with more than 60 million years of agriculture, are known as the first farmers of the world []. Attine ants were the first of these farmers [], and they grow mushrooms []. Scientists believe that ants started farming in the humid forests and tropical regions of South America 55 to 60 million years ago [,]. A colony of farmer ants is usually founded by a mated female, and colonies are made up of one or more queens. Some of the unmated queens lose their wings after a while and start farming. These ants grow their mushrooms in underground chambers and fertilize the gardens with vegetable scraps and dead insects. Farmer ants are obligately dependent on their mushrooms; their young are raised on an exclusively mushroom diet. Farmer ants live in a complex and highly specialized multi-trophic symbiosis. The ants obtain food for the colony by cultivating a special type of mushroom in their nest; in return, they provide the food the mushrooms need and a suitable, parasite-free environment for their growth and cultivation. Some leaf-cutter ants use fresh leaves to prepare a suitable environment for the growth of mushrooms [,,,,]. Some colonies are smaller, consisting of fewer rooms and containing hundreds to thousands of ants; others comprise populations of up to 8 million ants, with many more rooms and multiple entrances. Agricultural processes are performed by all ants of the colony. The workers search around the nest to find food for the mushrooms. These food items can be divided into different parts: fresh leaves of trees, flowers, and fruits in humid seasons, and seeds and carcasses of arthropods and insects in cold and dry seasons. Smaller ants clean and chew these materials and turn them into compost for the mushrooms.
Ants provide other pieces of fertilizer needed by the mushrooms through their excrement. A smaller group of ants destroys foreign mushrooms that have randomly grown among the crop [,]. Each colony of farmer ants has a specific smell. Ants recognize these chemical signals, called social profiles, and distinguish members of their own colony from those of other colonies [,]. These colonies live together with numerous ants to plant and develop their fields. The carbon-based social profile of each colony also separates it from other colonies, so ants from other colonies can be easily identified. This complex and intelligent system is the source of inspiration for the authors in introducing the proposed method []. Figure 2 shows the mushroom-growing factors and their features [,].
Figure 2. Details of the farmer ant’s life; (a) leaf-cutter ants; (b) cultivation of mushrooms by ants; (c) the structure of the social carbon profile; (d) types of foods to feed mushrooms.

3.2. Farmer Ants Optimization Algorithm (FAOA)

In this section, the details of the FAOA, which is inspired by the life of farmer ants, are explained. At the beginning of the algorithm, the number of ants ($n$) that will be responsible for finding food and taking care of the mushrooms ($p$) is determined. Each mushroom is handled by one ant, and the number of ants is greater than or equal to the number of mushrooms, i.e., $n \ge p$. Each colony contains $K$ nests in which the mushrooms are placed for breeding. The number of mushrooms in the nests is not necessarily equal. Near the nest, the ants keep their food, and each mushroom is assigned to an ant that delivers the desired food. The set of foods ($F$) consists of $M$ different types. The growth rate of each mushroom depends on the type of food it feeds on; this dependency is unknown at the beginning of the algorithm. In the first step, each ant randomly chooses a food and measures its effect on the growth of the mushroom. At the end of the algorithm, the appropriate food for each mushroom will have been determined. The effect of the food assigned to a mushroom on its growth is denoted by the parameter $C_1$. According to the social profile of the colony, the ants of each nest are responsible for taking care of its mushrooms. The social profile keeps the ants of each nest from losing their way and going to the wrong nest. When $SP = 1$, ants never lose their nest; when $SP = 0$, the behavior of ants in reaching their nest and mushroom is completely random. Changing this factor determines the exploration behavior of the algorithm and can be adjusted for each problem; in other words, $0 \le SP \le 1$.
As mentioned earlier, the growth rate of a mushroom depends on the proper food it feeds on. Its growth also depends on the bacteria that the ants provide to take care of it; these bacteria protect the mushroom from pests. Each mushroom's growth rate is proportional to the type of bacteria produced by its ant, so this factor is also effective in mushroom growth. Bacteria can include several types, called bacteria types (BT), which must be determined at the beginning of the algorithm. The effectiveness of the bacteria in protecting the mushrooms is determined by a coefficient $C_2$, which can be adjusted according to the problem. The type of bacteria belonging to each ant does not change. To avoid this limitation, some ants randomly deliver their bacteria to other mushrooms to vary the distribution of bacteria across the mushrooms. Another important factor affecting the growth of mushrooms is pests. Pests can interfere with mushroom growth and cause its destruction or weight loss. Pests are beyond the control of the ants and are assumed to act randomly on the mushrooms. The pest factor ($PF$) is a parameter with a negative impact on mushroom growth, but its effect can be reduced by choosing the right bacteria for the mushroom. The effectiveness of $PF$ is determined by the coefficient $C_3$. Equation (1) shows the effect of all the mentioned factors on mushroom growth in each nest $k$.
$$W_k = \sum_{\text{all ants and mushrooms}} \Big[\, W_0 + (C_1 \times f_m + C_2 \times B_i) \times W_0 - C_3 \times PF \times W_0 \,\Big] \qquad (1)$$
where $W_0$ is the initial weight of the mushroom; $C_1$, $C_2$, $C_3$ are learning parameters; $f_m$ is the food quality value for food $m$; $B_i$ is the bacteria of ant $i$; and $PF$ is determined by Equation (2).
$$PF = I \times s^{\alpha} \qquad (2)$$
where I is the negative impact value of the pest, s is the volume of the pest, and α is the spread parameter of the pest on mushrooms, which can be adjusted according to the problem.
$PF$ plays a critical role in modeling the negative influences that challenge the optimization process. It introduces a realistic hurdle, preventing overfitting to ideal conditions and making the FAOA effective for complex, NP-hard problems with inherent trade-offs.
The effect of bacteria on mushroom growth is calculated by Equation (3), where $e$ is the positive effectiveness parameter of the bacteria (how good a specific bacteria type is at protecting or nourishing mushrooms, like rating a pesticide from 1 to 10), $v$ is the volume of bacteria applied (more bacteria, bigger effect, much as more fertilizer spread on a field), $t$ is the lifetime of the bacteria (how long it stays active; some wear off quickly, others last longer, like a slow-release fertilizer), and $\beta$ is a regulating parameter that fine-tunes the bacteria's impact according to the problem's needs.
$$B_i = e \times v \times t^{\beta} \qquad (3)$$
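Under the stated definitions, Equations (1)–(3) can be sketched as plain functions. This is an illustrative reading, not the authors' implementation: the helper names and all parameter values below are arbitrary choices of this sketch, and Equation (3) is read with $\beta$ as an exponent, by analogy with $\alpha$ in Equation (2).

```python
def pest_factor(I, s, alpha):
    # Equation (2): negative pest impact I, scaled by pest volume s and spread alpha
    return I * s ** alpha

def bacteria_effect(e, v, t, beta):
    # Equation (3): bacteria effectiveness e, applied volume v, lifetime t,
    # regulated by beta (read here as an exponent, by analogy with Equation (2))
    return e * v * t ** beta

def nest_weight(pairs, C1, C2, C3, PF, W0=1.0):
    # Equation (1): total mushroom weight of nest k, summed over all
    # (food quality f_m, bacteria effect B_i) pairs assigned in that nest
    return sum(W0 + (C1 * fm + C2 * Bi) * W0 - C3 * PF * W0 for fm, Bi in pairs)

# Illustrative values only
PF = pest_factor(I=0.2, s=0.5, alpha=1.5)
pairs = [(0.8, bacteria_effect(0.6, 0.4, 2.0, 1.2)),
         (0.5, bacteria_effect(0.9, 0.3, 1.0, 1.2))]
Wk = nest_weight(pairs, C1=0.7, C2=0.5, C3=0.3, PF=PF)
```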
To increase the exploration capability of the algorithm, some random solutions can be considered. The resulting variety of solutions can lead to better solutions and helps avoid falling into local optima. In this case, some ants lose their way and move to other nests, which expands the solution space of the problem. Equation (4) shows this behavior, where $p_{jik}$ is the probability of the lost ant $i$ of nest $k$ moving to nest $j$, and $SP_k$, $SP_j$ are the social profiles of nests $k$ and $j$, respectively.
$$p_{jik} = \frac{SP_k \times SP_j \times W_j}{(1 - SP_k) \times \sum_{M=1}^{K} W_M}, \qquad \text{where} \quad 0 < SP_j,\, SP_k \le 1 \qquad (4)$$
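A direct transcription of Equation (4) follows; the function name and the example nest weights are mine. Note that $SP_k = 1$ (ants never get lost) would zero the denominator, so the sketch requires $SP_k < 1$.

```python
def move_probability(SP_k, SP_j, W, j):
    # Equation (4): probability that a lost ant of nest k moves to nest j,
    # given the current nest weights W = [W_1, ..., W_K]
    assert 0 < SP_j <= 1 and 0 < SP_k < 1  # SP_k = 1 would zero the denominator
    return (SP_k * SP_j * W[j]) / ((1 - SP_k) * sum(W))

W = [4.0, 2.5, 3.5]
p = move_probability(0.6, 0.8, W, j=1)  # 0.48 * 2.5 / (0.4 * 10) = 0.3
```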
Equation (5) shows the global phase in finding the solution. At this stage, a $1 - FP$ fraction of the ants behaves randomly and goes to other nests to take care of the mushrooms, so the effect of lost ants on growing the mushrooms is considered. The number of these ants, $n_L$, must be defined in advance and may vary according to the problem.
$$W_L = (1 - FP)\, p_{jik} \sum_{i=1}^{L} \Big[\, W_0 + (r_1 \times f_m + r_2 \times B_i) \times W_0 - r_3 \times PF \times W_0 \,\Big] + FP \sum_{k=L+1}^{K} W_k \qquad (5)$$
$W_L$ consists of two parts: the first relates to the performance of the $L$ ants that have lost their nest and are cultivating another mushroom, and the second relates to the remaining ants that have correctly found their mushroom.
Here, the coefficients $r_1$, $r_2$, $r_3$, which are random numbers between zero and one, replace the coefficients $C_1$, $C_2$, $C_3$. The behavior of the algorithm is therefore more random, and a global search is performed. Figure 3 shows the problem model. The weight of the mushrooms produced in each nest $k$ is calculated from Equations (1) and (4).
Figure 3. Ant colony model.
In this figure, five nests are considered, one of which is the nest of the queen. M, F, B, and W represent mushrooms, food, bacteria, and weight, respectively. As this figure shows, not all nests contain all mushrooms. For example, there are no M3 mushrooms in Nest 1, or there are only M2 and M3 mushrooms in Nest 3. Calculations and operations on the nests are sent to the queen’s nest, so the queen’s nest has a complete structure and will contain all types of mushrooms. The weight of the queen’s nest mushrooms is equal to the total weight of the nests. We define Equation (6) as follows.
$$W = \max\left\{ \sum_{k=1}^{K} w_k \;,\;\; \sum_{i=1}^{L} w_L + \sum_{k=L+1}^{K} w_k \right\} \qquad (6)$$
where W is the total weight of the mushrooms in the entire colony. Mushroom breeding nests send their local experiences to the queen nest to achieve global solutions. In this regard, some local and global operations should be carried out. Figure 4 shows the exchange of the partial solutions to construct a complete solution in the queen nest.
Figure 4. Exchange of information.
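Read literally, Equation (6) lets the queen keep the better of the purely local total and the total in which $L$ lost ants contributed their $W_L$ terms. A minimal sketch of that choice (function name and numbers are mine):

```python
def queen_total_weight(nest_weights, lost_weights, L):
    # Equation (6): max of the all-local total and the total in which the
    # first L nest terms are replaced by the lost-ant weights W_L
    local = sum(nest_weights)
    mixed = sum(lost_weights[:L]) + sum(nest_weights[L:])
    return max(local, mixed)

W = queen_total_weight([3.0, 2.0, 4.0], [5.0, 1.0], L=2)  # max(9.0, 10.0) = 10.0
```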
Food change operation:
At this stage, while keeping the ant constant (no bacteria change), the food of the mushrooms is changed to determine its impact on mushroom growth. To achieve this, two solution vectors are combined to create two new solution vectors (offspring). Subsequently, the fitness or quality of these new solution vectors is evaluated and added to the existing population. Figure 5 shows the food change operation.
Figure 5. Food change operation.
Bacteria change operation:
In the food change operation, the ant assigned to each mushroom is constant, and only the type of food is changed. Each ant has its own bacteria type that cannot be changed. On the other hand, some bacteria types may not be suitable for the growth of some mushrooms, so it is necessary to change the bacteria to check their effectiveness for each mushroom. To do this, some ants outside the nest that have lost their way move randomly to other nests, and bacteria change operations occur. This expands the problem search space and increases the probability of obtaining better solutions. Figure 6 shows the bacteria change operation. In this figure, bacteria $B_9$ and $B_{10}$ replace $B_4$ and $B_7$, respectively, and new offspring are generated.
Figure 6. Bacteria change operation.
The proposed method is carried out in the following steps.
  • Step 1: Algorithm initialization including the number of ants, number of nests, types of mushrooms, foods, and bacteria;
  • Step 2: Random distribution of mushrooms in nests and assigning ants to mushrooms;
  • Step 3: Calculate the total weight of mushrooms in each nest using Equations (1)–(4);
  • Step 4: Do food change operations;
  • Step 5: Do bacterial change operation on some mushrooms according to SP;
  • Step 6: Participate 1 − SP percentage of ants in the global behavior of the algorithm;
  • Step 7: Send the best relative pattern to the queen nest;
  • Step 8: Calculate $W_k$, the total weight of mushrooms in each nest, using Equations (1)–(4);
  • Step 9: Calculate $W$, the total weight of mushrooms in the entire colony, using Equation (6);
  • Step 10: Remove weak solutions;
  • Step 11: Repeat the algorithm until the stop condition is reached.
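Steps 1–11 can be condensed into a toy, single-process sketch. Everything here is a simplification under my own assumptions, not the authors' implementation: solutions are bit strings, the food change operation is modeled as one-point crossover, the bacteria change as a bit flip applied with probability $1 - SP$, and the queen simply keeps the best individual seen; the real FAOA's operators and parallel nests are richer than this.

```python
import random

def faoa_sketch(fitness, n_bits=12, n_nests=4, pop_per_nest=6,
                iters=60, SP=0.8, seed=1):
    # Toy rendering of Steps 1-11: each nest evolves its own sub-population
    # (Steps 2-6), weak solutions are dropped (Step 10), and the best partial
    # pattern is forwarded to the queen (Step 7).
    rng = random.Random(seed)
    nests = [[[rng.randint(0, 1) for _ in range(n_bits)]
              for _ in range(pop_per_nest)] for _ in range(n_nests)]
    queen_best = None
    for _ in range(iters):                                  # Step 11: repeat
        for nest in nests:
            a, b = rng.sample(range(pop_per_nest), 2)
            cut = rng.randrange(1, n_bits)
            child = nest[a][:cut] + nest[b][cut:]           # Step 4: food change
            if rng.random() > SP:                           # Steps 5-6: 1 - SP share
                child[rng.randrange(n_bits)] ^= 1           # bacteria change
            nest.append(child)
            nest.sort(key=fitness, reverse=True)
            nest.pop()                                      # Step 10: drop weakest
            if queen_best is None or fitness(nest[0]) > fitness(queen_best):
                queen_best = nest[0][:]                     # Step 7: to the queen
    return queen_best

best = faoa_sketch(fitness=sum)  # toy objective: maximize the number of ones
```

With elitist replacement, the per-nest best is monotone, so the queen's solution only improves across iterations.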
Figure 7 shows the flowchart of the FAOA. Also, Algorithm 1 shows the Farmer's Side Process.
Algorithm 1: Farmer’s Side Process
Initialize parameters
  1. For each nest $k$:
  2.   Generate the initial population;
  3.   Randomly assign ants to mushrooms;
  4.   For each $S_k$:
  5.     Compute $W_k$ by Equations (1)–(4);
  6.     Do the food operation;
  7.     For $1 - SP$ percent of the mushrooms:
  8.       Do the bacteria operation;
  9.     Compute $W_k$ for the new solutions;
  10.   Generate a random number $r \in [0, 1]$;
  11.   If $r < p$:
  12.     Transfer the best partial solution to the queen nest by Equation (1);
  13.   Else transfer a random partial solution to the queen nest by Equations (5) and (6);
  14. Until the last iteration.
Figure 7. Flowchart of FAOA.
Once all nests have sent their partial solutions to the queen’s nest for aggregation, it is time to evaluate the complete solution. Algorithm 2 shows this step.
Algorithm 2: Queen Side Process
  1. Repeat;
  2.   For each iteration;
  3.     Receive partial solutions from all nests;
  4.     Compute $W$ by Equations (1), (5), and (6);
  5.     Do the global food operation;
  6.     Do the global bacteria operation;
  7.     Add the solutions to the population;
  8.     Keep the $P_s$ best solutions;
  9.   End For;
  10. Until the last iteration;
  11. Return the final solution.

4. Evaluation and Results

In this section, the effectiveness of the FAOA in solving some engineering problems and benchmark functions is evaluated and compared with some state-of-the-art algorithms.

4.1. Problems and Compared Algorithms

In this section, some engineering problems have been deployed to evaluate the performance of the FAOA. These problems include Antennal Location Arable Field Planning [], SLP [], Embedded System [], Fog Computing Systems Cloud [], Knapsack [], Truss Structures with Static Constraints [], TSP [], and UCAV Three-Dimension Path Planning [].
Additionally, the algorithms compared with the FAOA include ACO [], BWO [], CRO [], GA [], GP [], GWO [], ICA [], SA [], TSR [], and Tabu []. Table 1 shows the parameters utilized in these algorithms.
Table 1. Parameter values of algorithms.
For optimal values of the parameters, we used the Taguchi method, developed by Dr. Genichi Taguchi, which is a statistical approach for designing experiments, widely recognized for tuning optimization algorithms. The central concept is to identify the most effective parameter settings efficiently, using only a small number of experiments, while ensuring the outcomes are both dependable and robust. In optimization algorithms, parameters like step size, population size, or convergence thresholds significantly influence performance. Adjusting these parameters manually can be highly impractical due to the sheer volume of possible combinations. The Taguchi method provides a structured and time-saving solution to this challenge [].

4.2. Discrete Problems

Discrete optimization is a branch of optimization in applied mathematics and computer science. Unlike continuous optimization, some or all variables in a discrete optimization problem are restricted to discrete values, such as integers. Discrete problems arise throughout the scientific and engineering fields, and several approaches are used to solve them. The proposed algorithm is applied to such problems below.

4.2.1. The TSP Issue

The traveling salesman problem (TSP) is one of the classic optimization problems in computer science. A salesman has a list of cities and must choose a route that visits each city exactly once, returns to the starting city, and minimizes the total travel distance. This problem is NP-hard []. In this study, we utilized two scenarios to assess the proposed approach. The first scenario involved two hundred cities, with results displayed in Figure 8a, while the second comprised a thousand cities, with results shown in Figure 8b. The algorithm's number of iterations is set to 100 and 1000, respectively. In the first scenario, the FAOA's performance on this problem is relatively satisfactory []. In the second scenario, with 1000 cities, the proposed method demonstrates significantly improved performance. This superiority is attributed to the parallel feature of the method and its powerful operators, which effectively break complex problems into smaller sub-problems that can be solved more appropriately.
Figure 8. Comparison of FAOA with other discrete algorithms for TSP: (a) 200 cities; (b) 1000 cities.
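For concreteness, the fitness that any of the compared algorithms minimizes here is the closed-tour length. The helper below and the random 200-city instance are illustrative only, not the paper's benchmark data.

```python
import math, random

def tour_length(cities, tour):
    # Total length of the closed tour visiting each city exactly once
    return sum(math.dist(cities[tour[i]], cities[tour[(i + 1) % len(tour)]])
               for i in range(len(tour)))

random.seed(0)
cities = [(random.random(), random.random()) for _ in range(200)]
tour = list(range(200))
random.shuffle(tour)          # a candidate solution is a permutation of cities
length = tour_length(cities, tour)
```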

4.2.2. Knapsack Problem (KP)

The knapsack problem is a classic discrete optimization problem. Given items with weights and profits, the goal is to maximize the total profit of the items selected for the knapsack. For instance, if the knapsack can hold 20 kg, we must choose items with a total weight of at most 20 kg while maximizing profit. The knapsack problem is an important NP-hard optimization problem. In our experiments, the number of iterations is 100, and the maximum allowed number of objects is five. The objective is to maximize the profit of the knapsack without exceeding its capacity. As Figure 9 shows, the FAOA algorithm yields better results: despite performing poorly in the early iterations, its performance improves steadily after the 80th iteration.
Figure 9. Results of knapsack problem.
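The fitness being maximized can be written directly from the description. The five-object instance below is invented for illustration (the text fixes only the 20 kg capacity and the maximum of five objects), and with just 32 subsets the optimum can be checked by brute force.

```python
def knapsack_fitness(selection, weights, profits, capacity):
    # Total profit of the selected items; overweight selections score zero
    w = sum(wi for s, wi in zip(selection, weights) if s)
    p = sum(pi for s, pi in zip(selection, profits) if s)
    return p if w <= capacity else 0

weights = [3, 7, 5, 4, 9]   # hypothetical 5-object instance, 20 kg capacity
profits = [4, 9, 6, 5, 10]
best_profit = max(knapsack_fitness([(n >> i) & 1 for i in range(5)],
                                   weights, profits, 20) for n in range(32))
```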

4.2.3. Server Placement Problem (SP)

Proper placement of edge servers is crucial in mobile computing networks. This affects network response time, optimizes server load balance, and reduces server energy consumption. To evaluate the FAOA in the server placement problem, we consider 300 antennas in the network area and aim to place 100 servers in optimal locations [,,]. Figure 10 compares the performance of the FAOA algorithm with other algorithms. As this figure shows, the FAOA performs similarly to other algorithms between the 400th and 850th iteration but outperforms them thereafter.
Figure 10. Results of server placement problem.

4.2.4. Construction Site Layout Planning (CSLP)

Construction site layout planning has always been a concern for clients, contractors, and consultants. The major goal is to arrange facilities such as offices, warehouses, and residences so as to optimize the transfer of materials, information, and staff. A well-planned layout can enhance safety and efficiency, lower transport costs, and avoid bottlenecks and obstructions during material and equipment transfer, especially in large-scale projects. The site layout can be designed based on the preferences of decision makers and various criteria []. The objective is to minimize the total cost of material, information, and staff transfer, modeled as $f_{CSLP} = \sum_{\text{all } i,j} c_{i,j} \times d_{i,j}$, where $c_{i,j}$ is the cost of transferring resources between facilities $i$ and $j$, and $d_{i,j}$ is the distance between them. We set the number of rounds in this problem to 1000. Figure 11 compares the performance of the FAOA with other methods; as the figure shows, the proposed method performed better than the others.
Figure 11. Results of the CSLP problem.
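The objective $f_{CSLP}$ can be transcribed directly. The `assignment` vector, mapping each facility to a site, and the toy matrices are my own illustration of how a candidate layout would be scored.

```python
def cslp_cost(assignment, cost, dist):
    # f_CSLP: transfer cost between facilities i and j, weighted by the
    # distance between the sites they are assigned to
    n = len(assignment)
    return sum(cost[i][j] * dist[assignment[i]][assignment[j]]
               for i in range(n) for j in range(n) if i != j)

cost = [[0, 3], [3, 0]]            # toy 2-facility transfer costs
dist = [[0, 2], [2, 0]]            # toy site-to-site distances
c = cslp_cost([0, 1], cost, dist)  # 3*2 + 3*2 = 12
```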

4.2.5. AFP (Arable Field Problem)

The AFP concerns preparing land of a specific size and making it suitable for cultivation with the aid of machinery and tools. The aim is to determine the most efficient and cost-effective way to operate each device. The goal is to minimize the operational cost of land preparation, defined as $f_{AFP} = \sum_{t=1}^{T} (c_t u_t + e_t)$, where $c_t$ is the cost per truck $t$, $u_t$ is its usage time, and $e_t$ is the equipment overhead. Eight trucks are used to solve the problem with the FAOA []. The number of iterations is set to 1000. As Figure 12 shows, the FAOA produces better solutions than the other algorithms.
Figure 12. Results of arable field planning problem.

4.2.6. ES (Embedded System for Intelligent Vehicles)

The ES is one of the current issues in vehicle engineering. The objective is to minimize the redundancy error in a four-wheeled robotic system, expressed as $f_{ES} = \sum_{i=1}^{I} |v_i - v_{opt}|$, where $v_i$ is the velocity of wheel $i$ and $v_{opt}$ is the optimal velocity for mobility. The goal is to enhance the efficiency of intelligent systems in automobiles. Vehicles with all-wheel drive offer greater mobility and flexibility than other vehicles, and such omnidirectional mobile robots are very useful in applications like smart wheelchairs, industrial robots, and nursing robots []. However, there is a redundancy issue in this four-wheeled robotic system, as having more than three controllable degrees of freedom for a mobile vehicle creates a redundant system [].
Optimization algorithms are considered a good solution in this field []. Figure 13 shows the performance of the FAOA in solving this problem compared with other algorithms. As the figure shows, the FAOA produces better solutions than the other algorithms.
Figure 13. Results of the ES problem.
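The redundancy-error objective above can be sketched as a small function. The absolute deviation |v_i − v_opt| is assumed here, and the wheel velocities in the example are illustrative values, not measurements from the paper.

```python
def es_redundancy_error(wheel_velocities, v_opt):
    """Redundancy error of a four-wheeled omnidirectional platform:
    f_ES = sum over wheels i of |v_i - v_opt|."""
    return sum(abs(v - v_opt) for v in wheel_velocities)
```

A perfectly coordinated platform, with every wheel at the optimal velocity, scores zero; any deviation increases the error to be minimized.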

4.2.7. FCS (Fog Computing System-Cloud Problem)

Fog computing involves distributed computing on fog nodes, incorporating various devices (like sensors and home appliances), and has the potential to enhance the performance of Internet of Things (IoT) environments []. Fog nodes are usually deployed between low-level devices and high-level cloud computing platforms. Since the fog node deployment strategy affects the cost and performance of the fog computing system, determining an appropriate and efficient deployment strategy has become an optimization problem. The main objective of this problem is the fair allocation of computing resources []. The function to be minimized is the resource allocation cost, f_FCS = Σ_{n=1}^{N} (c_n × r_n + l_n), where c_n is the cost of fog node n, r_n is its resource demand, and l_n is its latency penalty. Figure 14 shows the performance of the FAOA on this problem compared to other algorithms. The experimental results indicate the superiority of the FAOA over the compared methods.
Figure 14. Results of fog computing system problem.
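The deployment cost above can be sketched as follows; the node costs, demands, and latency penalties are illustrative placeholders rather than data from the referenced fog computing study.

```python
def fcs_cost(deployed, cost, demand, latency):
    """Resource-allocation cost of a fog deployment:
    f_FCS = sum over deployed nodes n of (c_n * r_n + l_n).
    `deployed` lists the indices of the fog nodes chosen by a candidate
    solution; the metaheuristic searches over such subsets."""
    return sum(cost[n] * demand[n] + latency[n] for n in deployed)
```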

4.2.8. Truss Structures Problem (TS)

Aluminum and steel are commonly used in engineering []. Trusses are widely used in structural engineering. However, as the number of truss structures in a project increases, the design and analysis become more complex, which increases the number of candidate solutions for these structures. A primary objective is to create the most cost-effective structure that meets specific loading conditions, preventing the unnecessary use of materials in response to the growing demand for raw materials by ensuring the optimal design of truss structures. The objective of this problem is to minimize the weight of the truss while satisfying displacement and stress constraints []: f_Truss = Σ_{e=1}^{E} w_e, where w_e is the weight of truss element e. Optimization methods can be considered as a solution in this field. The performance of the FAOA on this problem is shown in Figure 15. The experimental results and comparisons with other algorithms show the superiority of the proposed method in solving this problem.
Figure 15. Results of truss structures problem.
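Constrained truss objectives are commonly handled with a penalty formulation; the sketch below assumes that formulation (the paper does not specify one), with element weights computed as density × area × length and invented element data in the example.

```python
def truss_weight(areas, lengths, density, stresses, stress_limit, penalty=1e6):
    """Penalized truss objective: total element weight rho * A_e * L_e,
    plus a large penalty for any stress-limit violation. Displacement
    constraints would be penalized analogously."""
    weight = sum(density * a * l for a, l in zip(areas, lengths))
    violation = sum(max(0.0, abs(s) - stress_limit) for s in stresses)
    return weight + penalty * violation
```

With a large penalty coefficient, any infeasible design scores far worse than every feasible one, steering the search toward designs that satisfy the constraints.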

4.2.9. UCAV Problem (UCAV Three-Dimension Path Planning)

The unmanned combat aerial vehicle (UCAV) problem is a complex optimization problem that focuses on flight path optimization, considering various types of constraints in complex battlefield environments. Unmanned aerial vehicles are remotely piloted or self-piloted aircraft capable of carrying various accessories, like cameras, sensors, and communication equipment. They have diverse applications in both civilian and military domains, and their popularity stems from their low cost, compact size, and extensive maneuverability []. The objective is to minimize the flight path cost, f_UCAV = w_1 × L + w_2 × T + w_3 × H, where L is the path length, T is the threat exposure, H is the altitude variation, and w_1, w_2, w_3 are weights. Path generation and planning are key technologies for UCAVs []. Optimization methods can be considered as a solution in this field. Figure 16 depicts the performance of the FAOA in this scenario: the FAOA initially shows average performance, but from round 93 onward, the algorithm consistently achieves the best results.
Figure 16. Results of UCAV 3D problem.
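The weighted-sum path cost translates directly into code. The default weights below are illustrative; the paper does not fix their values.

```python
def ucav_path_cost(length, threat, altitude_var, weights=(0.5, 0.3, 0.2)):
    """Weighted UCAV path cost f_UCAV = w1*L + w2*T + w3*H, where the
    weights trade off path length, threat exposure, and altitude variation."""
    w1, w2, w3 = weights
    return w1 * length + w2 * threat + w3 * altitude_var
```

Shifting the weights changes which criterion dominates: a large w2, for example, favors longer paths that skirt threat zones.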

4.3. Benchmark Functions

In this section, 22 standard benchmark functions with various characteristics are used to evaluate the performance of the FAOA and other algorithms. These functions consist of seven unimodal functions, nine multimodal functions, and six composite functions. Table 2, Table 3 and Table 4 display the unimodal, multimodal, and composite functions, respectively. Additionally, Table 5 presents the performance results of these algorithms []. The results in these tables show that, in most cases, the proposed method outperforms the other algorithms.
Table 2. Unimodal functions.
Table 3. Multimodal functions.
Table 4. Composite functions.
Table 5. Comparison results for benchmark functions.

4.4. Runtime

A useful metric for evaluating the performance of a metaheuristic algorithm is its execution time on benchmark functions and problems. Differences in runtime can stem from several factors. First, in iterative algorithms, reducing the number of iterations directly decreases the execution time required to solve the problem. Second, a simpler mathematical model leads to fewer computations, thereby reducing the overall execution time of the metaheuristic. Third, the degree to which an algorithm can be parallelized also plays a significant role: the problem is divided into smaller parts that are executed in parallel, and the global solution is derived from these local solutions. The proposed algorithm, by its inherent capabilities, exhibits advantages in all three areas. The parallelizable nature of the proposed method not only makes it more scalable than certain existing methods but also reduces the number of iterations needed for convergence by dividing the problem into parallel parts. Moreover, compared to some similar methods, the simpler mathematical model employed in the proposed algorithm significantly reduces the computations required to solve the problem. Table 6 presents the execution times of the FAOA and other algorithms; as expected, the proposed method exhibits a shorter execution time in most cases. This improvement is particularly significant when the problem space is large.
Table 6. Comparison results for benchmark function’s average runtimes (ms).
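The divide-and-evaluate structure described above can be sketched generically: split the candidate solutions into sub-populations and score each in parallel. `sphere` is a standard unimodal benchmark of the kind listed in Table 2; the chunking scheme is an illustration, not the authors' implementation, and a process pool would replace the thread pool for genuinely CPU-bound objectives.

```python
from concurrent.futures import ThreadPoolExecutor

def sphere(x):
    """Standard unimodal benchmark: f(x) = sum of x_i^2."""
    return sum(v * v for v in x)

def evaluate_chunk(chunk):
    """Score one sub-population's candidates independently."""
    return [sphere(x) for x in chunk]

def parallel_evaluate(population, k=4):
    """Split the population into k sub-groups and evaluate them in
    parallel, preserving the original candidate order in the output."""
    chunks = [population[i::k] for i in range(k)]
    with ThreadPoolExecutor(max_workers=k) as pool:
        results = pool.map(evaluate_chunk, chunks)
    return [f for chunk in results for f in chunk]
```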

4.5. Performance Metrics

In this section, the performance of the proposed method and other algorithms is evaluated based on critical criteria, including convergence rate, accuracy, scalability, and computational resource utilization, on real-world problems discussed in Section 4.2.1, Section 4.2.2, Section 4.2.3, Section 4.2.4, Section 4.2.5, Section 4.2.6, Section 4.2.7 and Section 4.2.8.

4.5.1. Convergence Rate

The convergence rate of an algorithm refers to the speed and quality with which it approaches an optimal or acceptable solution. This metric indicates how quickly the algorithm converges and whether the convergence leads to an accurate and reliable result. It can be measured as the number of iterations or steps required to reach an acceptable solution; an algorithm with a high convergence rate reaches the solution more quickly. Table 7 presents the convergence rate of all algorithms on the real-world problems. As the results in this table indicate, the proposed method outperformed the others in all but one instance, reaching the optimal solution in fewer iterations. The primary reason for this superiority is the inherent parallel nature of the proposed method, which divides the problem into smaller sub-problems and enables convergence in fewer iterations.
Table 7. Number of iterations or steps.
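The iterations-to-convergence metric can be computed from a run's history of objective values; the tolerance parameter and the histories in the test are assumptions for illustration.

```python
def iterations_to_converge(history, target, tol=1e-6):
    """Number of iterations until the best-so-far value is within `tol`
    of `target`. Returns None if the run never reaches the target."""
    best = float("inf")
    for i, value in enumerate(history, start=1):
        best = min(best, value)
        if abs(best - target) <= tol:
            return i
    return None
```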

4.5.2. Computational Resource Utilization

Assessing computational resource utilization as a performance metric for metaheuristic algorithms offers an understanding of how effectively these algorithms handle resources like CPU time, memory usage, and energy consumption. This evaluation is vital because it indicates the practical applicability and scalability of these algorithms in real-world scenarios. In our experiments, we used memory usage as the metric for resource utilization. Lower memory usage indicates better efficiency, allowing the algorithm to handle larger datasets or more complex models without overloading the system. Table 8 shows the memory utilization of the FAOA and the compared algorithms. As shown in this table, the proposed method demonstrates lower memory utilization in most cases. This is attributed to the reduced number of iterations required by the proposed approach due to its inherent parallel nature, as well as the simplicity of its mathematical model.
Table 8. Memory utilization (%).

4.5.3. Accuracy

Accuracy indicates how near an algorithm’s solution is to the ideal or best-known solution for a specific problem. In the case of metaheuristic algorithms, which are frequently used for NP-hard problems where finding exact solutions is computationally impractical, accuracy reflects the algorithm’s ability to effectively estimate the global optimum. It is usually represented as a percentage or a relative error measure, evaluating the algorithm’s result against the most recognized optimal solution. A widely used formula to quantify accuracy in optimization problems is shown in Equation (7).
Accuracy (%) = (1 − (Obtained value − Optimal value) / Optimal value) × 100
Table 9 shows the accuracy of the FAOA and other compared algorithms. The results show that the FAOA has better accuracy in all cases.
Table 9. Accuracy of the algorithms (%).
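Equation (7) translates directly into code; the sketch assumes a minimization problem with a nonzero known optimum, and the values in the usage check are illustrative.

```python
def accuracy_percent(obtained, optimal):
    """Accuracy as in Equation (7): the relative error of the obtained
    value, subtracted from 1 and expressed as a percentage."""
    return (1 - (obtained - optimal) / optimal) * 100
```

An algorithm that lands exactly on the known optimum scores 100%; overshooting the optimum by 5% of its value scores 95%.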

4.5.4. Scalability

Scalability in optimization algorithms describes how well they can manage growing problem sizes without excessive spikes in resource demands, such as time, memory, or iterations. If an algorithm is scalable, an increase in computational volume or dataset size will not cause a significant decline in its performance. Table 10 presents the scalability of the FAOA across various optimization problems, evaluated in terms of runtime and memory utilization. As the table demonstrates, the proposed method does not exhibit a substantial reduction in performance, in terms of either execution time or memory usage, despite increases in dataset size and the expansion of the problem space. This is attributed to the parallel nature of the proposed approach and the simplicity of its mathematical model, which enable the algorithm to converge in fewer iterations.
Table 10. The scalability of the FAOA.

4.6. Discussion

In this article, the Farmer Ants Optimization Algorithm (FAOA) was introduced as a new metaheuristic. The algorithm, which is suited to discrete problems, has features that distinguish it from similar algorithms. Most similar algorithms do not account for negatively influential parameters and problem constraints, which reduces their efficiency and accuracy. The proposed method considers both the factors that improve a candidate solution and the limiting factors that can significantly degrade it. Consider processor performance as an analogy: factors such as clock speed and RAM capacity are routinely modeled, whereas the heat generated at higher speeds, which can reduce computation speed, is often neglected. The influence of clock speed and RAM capacity also varies across applications; some workloads are processor-intensive, while others demand more memory. Through its operators and unique nature, the proposed algorithm weighs both positive and negative factors during the search, which makes the problem-solving process more realistic. Another important feature of the proposed algorithm is its parallelization capability. Population-based algorithms usually require many iterations to obtain the optimal solution. Breaking the problem into smaller components that are processed locally reduces its complexity, so convergence can be achieved in fewer iterations. The local and global features of the proposed algorithm expand the search space and increase the chance of reaching a global solution.

4.7. Time Complexity

In this section, we analyze the time complexity of the FAOA. The proposed algorithm starts with a population of size p, which is divided into k sub-rooms. Each sub-room contains m mushrooms, and each mushroom is assigned to an ant. By dividing the solutions among k nests, the calculations required in each sub-section are reduced by a factor of k and become logarithmic in nature. Taking the number of algorithm iterations (maxiter) as n and accounting for the three addition and subtraction operations performed, the total number of necessary calculations is as follows:
O(maxiter × (p/k) × (W_0 + (C_1 × f_m + C_2 × B_i) × W_0 − C_3 × PF × W_0)) = O(n(p/k + 3)) = O(n(log_2 n + 3)) = O(n log_2 n)

4.8. Limitation of the FAOA

Although the FAOA performs well in solving optimization problems and offers important advantages, its performance may be limited in some cases. The first limitation is that the algorithm currently targets only discrete optimization problems; the FAOA was designed because few methods exist in the field of discrete optimization. We plan to extend the FAOA to continuous optimization problems in future work. In addition, because the algorithm relies on its parallelization capability, its efficiency may be limited when a problem cannot be decomposed into smaller parts. We plan to address this issue in future research.

5. Conclusions

This paper introduced a new metaheuristic algorithm named the Farmer Ants Optimization Algorithm (FAOA). The FAOA is inspired by the life of a group of farmer ants who grow mushrooms; aspects of their life, including mushroom cultivation, feeding, and care, form the basis of the algorithm. The algorithm, which is suitable for solving discrete optimization problems, can tackle NP-hard problems and provide near-optimal solutions, especially for large-scale problems. Its important features include accounting for both the positive and negative impacts of problem parameters on the solution and its parallelization capability, which suits problems with a discrete nature, especially in high dimensions. Experiments on classic and new engineering problems, as well as on benchmark functions, show that the proposed method is efficient in solving optimization problems. As future work, we plan to extend the algorithm to problems with a continuous nature so that it can address a wider range of problems.

Author Contributions

A.A. and M.A. carried out the experiments. M.Z. wrote the manuscript with support from H.A. S.G. translated the manuscript into English and supervised the project. We confirm that the order of authors listed in the manuscript has been approved by all named authors. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Datasets are available upon reasonable request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. ChandraBora, K.; Kalita, B. Exact Polynomial-time Algorithm for the Clique Problem and P = NP for Clique Problem. Int. J. Comput. Appl. 2013, 73, 19–23. [Google Scholar] [CrossRef][Green Version]
  2. Woeginger, G.J. Exact algorithms for NP-hard problems: A survey. In Combinatorial Optimization—Eureka, You Shrink; Springer: Berlin/Heidelberg, Germany, 2003; pp. 185–207. [Google Scholar]
  3. Lin, F.-T.; Kao, C.-Y.; Hsu, C.-C. Applying the Genetic Approach to Simulated Annealing in Solving Some NP-Hard Problems. IEEE Trans. Syst. Man Cybern. 1993, 26, 1752–1767. [Google Scholar]
  4. Mirjalili, S. SCA: A Sine Cosine Algorithm for Solving Optimization Problems. Knowl. Based Syst. 2016, 96, 120–133. [Google Scholar] [CrossRef]
  5. Simpson, A.R.; Dandy, G.C.; Murphy, L.J. Genetic algorithms compared to other techniques for pipe optimization. J. Water Resour. Plan. Manag. 1994, 120, 423–443. [Google Scholar] [CrossRef]
  6. James, C. Introduction to Stochastics Search and Optimization; Wiley-Interscience: Hoboken, NJ, USA, 2003. [Google Scholar]
  7. Boussaïd, I.; Lepagnot, J.; Siarry, P. A survey on optimization metaheuristics. Inf. Sci. 2013, 237, 82–117. [Google Scholar] [CrossRef]
  8. Parejo, J.A.; Ruiz-Cortés, A.; Lozano, S.; Fernandez, P. Metaheuristic optimization frameworks: A survey and benchmarking. Soft Comput. 2012, 16, 527–561. [Google Scholar] [CrossRef]
  9. Mirjalili, S. Moth-Flame Optimization Algorithm: A Novel Nature-inspired Heuristic Paradigm. Knowl. Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  10. Wu, G.; Mallipeddi, R.; Suganthan, P.N. Ensemble strategies for population-based optimization algorithms—A survey. Swarm Evol. Comput. 2019, 44, 695–711. [Google Scholar] [CrossRef]
  11. Mcculloch, W.S.; Pitts, W. A Logical Calculus Of The Ideas Immanent In Nervous Activity. Bull. Math. Biophys. 1943, 5, 115–133. [Google Scholar] [CrossRef]
  12. Von Neumann, J.; Burks, A.W. Theory of Self-Reproducing Automata; University of Illinois Press: Urbana, IL, USA, 1966. [Google Scholar]
  13. Nocedal, J.; Wright, S.J. Numerical Optimization, 2nd ed.; Springer: New York, NY, USA, 2006. [Google Scholar]
  14. Wu, G. Across neighborhood search for numerical optimization. Inf. Sci. 2016, 329, 597–618. [Google Scholar] [CrossRef]
  15. Wu, G.; Pedrycz, W.; Suganthan, P.N.; Mallipeddi, R. A variable reduction strategy for evolutionary algorithms handling equality constraints. Appl. Soft Comput. 2015, 37, 774–786. [Google Scholar] [CrossRef]
  16. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  17. Reeves, C.R. Modern Heuristic Techniques for Combinatorial Problems; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 1993. [Google Scholar]
  18. Blum, C.; Puchinger, J.; Raidl, G.R.; Roli, A. Hybrid metaheuristics in combinatorial optimization: A survey. Appl. Soft Comput. 2011, 11, 4135–4151. [Google Scholar] [CrossRef]
  19. Daliri, A.; Asghari, A.; Azgomi, H. The water optimization algorithm: A novel metaheuristic for solving optimization problems. Appl. Intell. 2022, 52, 17990–18029. [Google Scholar] [CrossRef]
  20. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  21. De Castro, L.N.; Timmis, J. Artificial Immune Systems: A New Computational Intelligence Approach; Springer Science & Business Media: Berlin/Heidelberg, Germany, 2002. [Google Scholar]
  22. Dorigo, M.; Birattari, M.; Stutzle, T. Ant Colony Optimization. IEEE Comput. Intell. Mag. 2006, 1, 28–39. [Google Scholar] [CrossRef]
  23. Kennedy, J.; Eberhart, R. Particle swarm optimization (PSO). In Proceedings of the IEEE International Conference on Neural Networks; IEEE: Perth, Australia, 1995; pp. 1942–1948. [Google Scholar]
  24. Yang, X.-S. Flower Pollination Algorithm for Global Optimization. In Unconventional Computation and Natural Computation; Springer: Berlin/Heidelberg, Germany, 2012; pp. 240–249. [Google Scholar]
  25. Yang, X.-S. Firefly Algorithms for Multimodal Optimization. In SAGA 2009: Stochastic Algorithms: Foundations and Applications; Springer: Berlin/Heidelberg, Germany, 2009; pp. 169–178. [Google Scholar]
  26. Alimoradi, M.; Azgomi, H.; Asghari, A. Trees Social Relations Optimization Algorithm: A new Swarm-Based metaheuristic technique to solve continuous and discrete optimization problems. Math. Comput. Simul. 2022, 194, 629–664. [Google Scholar] [CrossRef]
  27. Yazdani, M.; Jolai, F. Lion Optimization Algorithm (LOA): A Nature-Inspired Metaheuristic Algorithm. J. Comput. Des. Eng. 2016, 3, 24–36. [Google Scholar] [CrossRef]
  28. Blum, C.; Roli, A. Metaheuristics in combinatorial optimization: Overview and conceptual comparison. ACM Comput. Surv. 2003, 35, 268–308. [Google Scholar] [CrossRef]
  29. Oftadeh, R.; Mahjoob, M.J.; Shariatpanahi, M. A novel meta-heuristic optimization algorithm inspired by group hunting of animals: Hunting search. Comput. Math. Appl. 2010, 60, 2087–2098. [Google Scholar] [CrossRef]
  30. Yang, X.-S.; Dey, N.; Fong, S. Nature-Inspired Metaheuristic Algorithms for Engineering Optimization Applications; Springer: Singapore, 2010. [Google Scholar]
  31. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  32. Adleman, L.M. Molecular computation of solutions to combinatorial problems. Science 1994, 266, 1021–1024. [Google Scholar] [CrossRef] [PubMed]
  33. De Jong, K.A. Evolutionary computation a unified approach. In GECCO Companion’15: Proceedings of the Companion Publication of the 2015 Annual Conference on Genetic and Evolutionary Computation; Association for Computing Machinery: New York, NY, USA, 2015; pp. 21–35. [Google Scholar] [CrossRef]
  34. Zheng, Y.-J. Water wave optimization: A new nature-inspired metaheuristic. Comput. Oper. Res. 2015, 55, 1–11. [Google Scholar] [CrossRef]
  35. Shik, J.Z.; Kooij, P.W.; Donoso, D.A.; Santos, J.C.; Gomez, E.B.; Franco, M.; Crumiere, A.J.J.; Arnan, X.; Howe, J.; Wcislo, W.T.; et al. Nutritional niches reveal fundamental domestication trade-offs in fungus-farming ants. Nat. Ecol. Evol. 2021, 5, 122–134. [Google Scholar] [CrossRef]
  36. De Fine Licht, H.H.; Schiøtt, M.; Mueller, U.G.; Boomsma, J.J. Evolutionary Transitions in Enzyme Activity of ant Fungus Gardens. Evolution 2010, 64, 2055–2069. [Google Scholar] [CrossRef]
  37. Dhiman, G.; Kumar, V. Spotted hyena optimizer: A novel bio-inspired based metaheuristic technique for engineering applications. Adv. Eng. Softw. 2017, 114, 48–70. [Google Scholar] [CrossRef]
  38. Parpinelli, R.S.; Lopes, H.S. New inspirations in swarm intelligence: A survey. Int. J. Bio-Inspired Comput. 2011, 3, 1–16. [Google Scholar] [CrossRef]
  39. Shah-Hosseini, H. Optimization with the Nature-Inspired Intelligent Water Drops Algorithm. Evol. Comput. 2009, 57, 297–338. [Google Scholar]
  40. Liu, B.; Zhou, Y. Artificial fish swarm optimization algorithm based on genetic algorithm. Comput. Eng. Des. 2008, 62, 617–629. [Google Scholar]
  41. Hassanien, A.-E.; Taha, M.H.N.; Khalifa, N.E.M. Enabling AI Applications in Data Science; Studies in Computational Intelligence; Springer: Cham, Switzerland, 2021; Volume 911. [Google Scholar]
  42. Kaveh, A.; Khayatazad, M. A new meta-heuristic method: Ray Optimization. Comput. Struct. 2012, 112–113, 283–294. [Google Scholar] [CrossRef]
  43. Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  44. Hashim, F.A.; Hussain, K.; Houssein, E.H.; Mabrouk, M.S.; Al-Atabany, W. Archimedes optimization algorithm: A new metaheuristic algorithm for solving optimization problems. Appl. Intell. 2021, 51, 1531–1551. [Google Scholar] [CrossRef]
  45. Kirkpatrick, S.; Gelatt, C.D.; Vecchi, M.P. Optimization by simulated annealing. Science 1983, 220, 671–680. [Google Scholar] [CrossRef]
  46. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
  47. Knowles, D.; Corne, D.W. M-PAES: A Memetic Algorithm for Multiobjective Optimization. In Proceedings of the 2000 Congress on Evolutionary Computation. CEC00 (Cat. No.00TH8512), La Jolla, CA, USA, 16–19 July 2000; IEEE: Piscataway, NJ, USA, 2000. [Google Scholar]
  48. Birbil, S.I.; Fang, S.C. An Electromagnetism-like Mechanism for Global Optimization. J. Glob. Optim. 2003, 25, 263–282. [Google Scholar] [CrossRef]
  49. Cai, W.; Yang, W.; Chen, X. A Global Optimization Algorithm Based on Plant Growth Theory: Plant Growth Optimization. In Proceedings of the 2008 International Conference on Intelligent Computation Technology and Automation (ICICTA), Changsha, China, 20–22 October 2008. [Google Scholar]
  50. Karci, A. Theory of Saplings Growing Up Algorithm. In Adaptive and Natural Computing Algorithms; Springer: Berlin/Heidelberg, Germany, 2007; pp. 450–460. [Google Scholar]
  51. Shukla, A.K.; Tripathi, D.; Reddy, B.R.; Chandramohan, D. A study on metaheuristics approaches for gene selection in microarray data: Algorithms. applications and open challenges. Evol. Intell. 2020, 13, 309–329. [Google Scholar] [CrossRef]
  52. Qin, A.K.; Huang, V.L.; Suganthan, P.N. Differential Evolution Algorithm With Strategy Adaptation for Global Numerical Optimization. IEEE Trans. Evol. Comput. 2009, 13, 398–417. [Google Scholar] [CrossRef]
  53. Trivedi, A.; Sanyal, K.; Verma, P.; Srinivasan, D. A Unified Differential Evolution Algorithm for Constrained Optimization Problems. In Proceedings of the IEEE Congress on Evolutionary Computation (CEC), Donostia/San Sebastian, Spain, 5–8 June 2017. [Google Scholar]
  54. Faradonbeh, R.S.; Monjezi, M.; Armaghani, D.J. Genetic programing and non-linear multiple regression techniques to predict backbreak in blasting operation. Eng. Comput. 2016, 32, 123–133. [Google Scholar] [CrossRef]
  55. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  56. Fathollahi-Fard, A.M.; Hajiaghaei-Keshteli, M.; Tavakkoli-Moghaddam, R. Red deer algorithm (RDA): A new nature-inspired meta-heuristic. Soft Comput. 2020, 24, 14637–14665. [Google Scholar] [CrossRef]
  57. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  58. Yang, X.-S. A New Metaheuristic Bat-Inspired Algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  59. Diamond, J. Evolution, consequences and future of plant and animal domestication. Nature 2002, 418, 700–707. [Google Scholar] [CrossRef]
  60. Mehdiabadi, N.J.; Schultz, T.R. Natural history and phylogeny of the fungus-farming ants (Hymenoptera: Formicidae: Myrmicinae: Attini). Myrmecol. News 2009, 13, 37–55. [Google Scholar]
  61. Nygaard, S.; Hu, H.; Li, C.; Schiøtt, M.; Chen, Z.; Yang, Z.; Xie, Q.; Ma, C.; Deng, Y.; Dikow, R.B.; et al. Reciprocal genomic evolution in the ant–fungus agricultural symbiosis. Nat. Commun. 2016, 7, 12233. [Google Scholar] [CrossRef] [PubMed]
  62. Vellinga, E.C. Ecology and Distribution of Lepiotaceous Fungi (Agaricaceae). Nova Hedwig. 2004, 78, 273–299. [Google Scholar] [CrossRef]
  63. Mueller, U.G.; Rehner, S.A.; Schultz, T.R. The Evolution of Agriculture in Ants. Science 1998, 281, 2034–2038. [Google Scholar] [CrossRef]
  64. Mueller, U.G.; Gerardo, N.M.; Aanen, D.K.; Six, D.L.; Schultz, T.R. The Evolution of Agriculture in Insects. Annu. Rev. Ecol. Evol. Syst. 2005, 36, 563–595. [Google Scholar] [CrossRef]
  65. Hölldobler, B.; Wilson, E.O. The Leafcutter Ants: Civilization by Instinct; WW Norton & Company: New York, NY, USA, 2011. [Google Scholar]
  66. Currie, C.R.; Bot, A.N.M.; Boomsma, J.J. Experimental evidence of a tripartite mutualism: Bacteria protect ant fungus gardens from specialized parasites. Oikos 2003, 101, 91–102. [Google Scholar] [CrossRef]
  67. Van Borm, S.; Billen, J.; Boomsma, J.J. The diversity of microorganisms associated with Acromyrmex leafcutter ants. BMC Evol. Biol. 2002, 2, 9. [Google Scholar] [CrossRef]
  68. Ariniello, L. Protecting Paradise: Fungus-farming ants ensure crop survival with surprising strategies and partnerships. BioScience 1999, 49, 760–763. [Google Scholar] [CrossRef][Green Version]
  69. Ronque, M.U.; Feitosa, R.M.; Oliveira, P.S. Natural history and ecology of fungus-farming ants: A field study in Atlantic rainforest. Insectes Sociaux 2019, 66, 375–387. [Google Scholar] [CrossRef]
  70. Richard, F.-J.; Poulsen, M.; Drijfhout, F.; Jones, G.; Boomsma, J.J. Specificity in Chemical Profiles of Workers, Brood and Mutualistic Fungi in Atta, Acromyrmex, and Sericomyrmex Fungus-growing Ants. J. Chem. Ecol. 2007, 33, 2281–2292. [Google Scholar] [CrossRef] [PubMed]
  71. Poulsen, M.; Boomsma, J.J. Mutualistic Fungi Control Crop Diversity in Fungus-Growing Ants. Science 2005, 307, 741–744. [Google Scholar] [CrossRef]
  72. Wilson, E.O. The Insect Societies. In Insect Societies; Harvard University Press: Cambridge, MA, USA, 1971. [Google Scholar]
  73. Hughes, W.O.H.; Eilenberg, J.; Boomsma, J.J. Trade-offs in group living: Transmission and disease resistance in leaf-cutting ants. Proc. R. Soc. B Biol. Sci. 2002, 269, 1811–1819. [Google Scholar] [CrossRef]
  74. Aguilar-Colorado, Á.S.; Rivera-Cháve, J. Ants/Nest-Associated Fungi and Their Specialized Metabolites: Taxonomy, Chemistry, and Bioactivity. Rev. Bras. Farmacogn. 2023, 33, 901–923. [Google Scholar] [CrossRef]
  75. Moghadam, E.K.; Vahdanjoo, M.; Jensen, A.L.; Sharifi, M.; Sørensen, C.A.G. An Arable Field for Benchmarking of Metaheuristic Algorithms for Capacitated Coverage Path Planning Problems. Agronomy 2020, 10, 1454. [Google Scholar] [CrossRef]
  76. Kaveh, A.; Vazirinia, Y. Construction Site Layout Planning Problem Using Metaheuristic Algorithms: A Comparative Study. Iran. J. Sci. Technol. Trans. Civ. Eng. 2019, 43, 105–115. [Google Scholar] [CrossRef]
  77. Huang, H.-C. A Hybrid Metaheuristic Embedded System for Intelligent Vehicles Using Hypermutated Firefly Algorithm Optimized Radial Basis Function Neural Network. IEEE Trans. Ind. Inform. 2019, 15, 1062–1069. [Google Scholar] [CrossRef]
  78. Chen, H.; Chang, W.-Y.; Chiu, T.-L.; Chiang, M.-C.; Tsai, C.-W. SEFSD: An effective deployment algorithm for fog computing systems. J. Cloud Comput. 2023, 12, 105. [Google Scholar] [CrossRef]
  79. Leguizamón, G.; Michalewicz, Z. A new version of ant system for subset problems. In Proceedings of the 1999 Congress on Evolutionary Computation-CEC99 (Cat. No. 99TH8406), Washington, DC, USA, 6–9 July 1999; IEEE: Piscataway, NJ, USA, 1999; Volume 2, pp. 1459–1464. [Google Scholar]
  80. Khodadadi, N.; Çiftçioğlu, A.Ö.; Mirjalili, S.; Nanni, A. A comparison performance analysis of eight meta-heuristic algorithms for optimal design of truss structures with static constraints. Decis. Anal. J. 2023, 8, 100266. [Google Scholar] [CrossRef]
  81. Reinelt, G. The Traveling Salesman: Computational Solutions for TSP Applications; Springer: Berlin/Heidelberg, Germany, 1994. [Google Scholar]
  82. Wang, G.; Guo, L.; Duan, H.; Wang, H.; Liu, L.; Shao, M. A Hybrid Metaheuristic DE/CS Algorithm for UCAV Three-Dimension Path Planning. Sci. World J. 2012, 2012, 583973. [Google Scholar] [CrossRef]
  83. Hussien, A.G.; Hassanien, A.E.; Houssein, E.H.; Amin, M. New binary whale optimization algorithm for discrete optimization problems. Eng. Optim. 2020, 52, 945–959. [Google Scholar] [CrossRef]
  84. Atashpaz-Gargari, S.; Lucas, C. Imperialist competitive algorithm: An algorithm for optimization inspired by imperialistic competition. In Proceedings of the 2007 IEEE Congress on Evolutionary Computation, Singapore, 25–28 September 2007. [Google Scholar] [CrossRef]
  85. Salcedo-Sanz, S.; Del Ser, J.; Landa-Torres, I.; Gil-López, S.; Portilla-Figueras, J.A. The Coral Reefs Optimization Algorithm: A Novel Metaheuristic for Efficiently Solving Optimization Problems. Sci. World J. 2014, 2014, 739768. [Google Scholar] [CrossRef] [PubMed]
  86. Gallego, R.A.; Romero, R.; Monticelli, A.J. Tabu search algorithm for network synthesis. IEEE Trans. Power Syst. 2000, 15, 490–495. [Google Scholar] [CrossRef]
  87. Karna, S.K.; Sahai, R. An overview on Taguchi method. Int. J. Eng. Math. Sci. 2012, 1, 1–7. [Google Scholar]
  88. Available online: http://www.math.uwaterloo.ca/tsp/world/index.html (accessed on 16 January 2025).
  89. Asghari, A.; Sohrabi, M.K. Server placement in mobile cloud computing: A comprehensive survey for edge computing, fog computing and cloudlet. Comput. Sci. Rev. 2024, 51, 100616. [Google Scholar] [CrossRef]
  90. Asghari, A.; Azgomi, H.; Zoraghchian, A.A.; Barzegarinezhad, A. Energy-aware server placement in mobile edge computing using trees social relations optimization algorithm. J. Supercomput. 2024, 80, 6382–6410. [Google Scholar] [CrossRef]
  91. Asghari, A.; Sohrabi, M.K. Multiobjective edge server placement in mobile-edge computing using a combination of multiagent deep q-network and coral reefs optimization. IEEE Internet Things J. 2022, 9, 17503–17512. [Google Scholar] [CrossRef]
  92. Cheung, S.-O.; Tong, T.K.-L.; Tam, C.-M. Site pre-cast yard layout arrangement through genetic algorithms. Autom. Constr. 2002, 11, 35–46. [Google Scholar] [CrossRef]
  93. Siegwart, R.; Nourbakhsh, I.R.; Scaramuzza, D. Introduction to Autonomous Mobile Robots, 2nd ed.; MIT Press: London, UK, 2011. [Google Scholar]
  94. Bonomi, F.; Milito, R.; Zhu, J. Fog computing and its role in the internet of things. In Proceedings of the Mobile Cloud Computing Workshop, Addepalli S (2012), Helsinki, Finland, 17 August 2012; ACM: New York, NY, USA, 2012; pp. 13–16. [Google Scholar]
  95. Kaveh, A.; Talatahari, S.; Khodadadi, N. The hybrid invasive weed optimization-shuffled frog-leaping algorithm applied to optimal design of frame structures. Period. Polytech. Civ. Eng. 2019, 63, 882–897. [Google Scholar]
  96. Kurnaz, S.; Cetin, O.; Kaynak, O. Adaptive neurofuzzy inference system based autonomous flight control of unmanned air vehicles. Expert Syst. Appl. 2010, 37, 1229–1234. [Google Scholar] [CrossRef]
