Review

Firebug Swarm Optimization Algorithm: An Overview and Applications

by
Faroq Awin
1,2,*,
Yasser Alginahi
3,4 and
Esam Abdel-Raheem
4
1
Department of Electrical Engineering, International University of Science and Technology Kuwait, Ardiya 92400, Kuwait
2
Department of Electrical and Electronic Engineering, University of Tripoli, Tripoli 13217, Libya
3
Department of Computer Science, Adrian College, 110 S Madison Street, Adrian, MI 49221, USA
4
Department of Electrical and Computer Engineering, University of Windsor, Windsor, ON N9B 3P4, Canada
*
Author to whom correspondence should be addressed.
Submission received: 1 October 2025 / Revised: 31 December 2025 / Accepted: 9 January 2026 / Published: 13 January 2026

Abstract

This survey delves into the Firebug Swarm Optimization (FSO) algorithm, an advanced global optimization algorithm that plays a pivotal role in modern swarm intelligence optimization techniques. It explores the core principles of the FSO algorithm and examines the various hybrid variants developed to address complex optimization challenges. This survey also traces the evolution of swarm optimization methods, shedding light on the natural phenomena and biological processes that have inspired these algorithms. Furthermore, it highlights the diverse real-world applications of the FSO algorithm, showcasing its effectiveness in fields such as engineering, data science, and artificial intelligence. To provide a comprehensive comparison, the survey includes a case study that evaluates the FSO algorithm’s performance against other existing algorithms. Lastly, the survey identifies key open research questions and suggests potential future directions for advancing the FSO algorithm and other nature-inspired optimization techniques, aiming to overcome current limitations and unlock new possibilities.

1. Introduction

Optimization problems permeate virtually all aspects of human endeavors and real-world designs. Typically, an optimization problem comprises three fundamental components: the objective function, decision variables, and constraints. The objective function encapsulates the optimization objective in mathematical form, with the aim of maximizing or minimizing a specific quantity. Furthermore, the type of optimization problem, whether linear or nonlinear, is dictated by the nature of the objective function. Decision variables denote the parameters manipulated by the optimization problem to achieve its optimal solution. The characteristics of the optimization problem dictate the type of decision variables, which can be discrete or continuous. Constraints delineate the conditions or limitations that the solution must satisfy to be deemed valid and feasible. These constraints may manifest as linear or nonlinear equations or inequalities, involving both the objective function and the decision variables [1]. In essence, the objective of any optimization problem is to determine the values of decision variables that maximize or minimize the objective function while adhering to all constraints. This collection of decision variable values constitutes the optimal solution, representing the most favorable outcome for the optimization problem.
To attain the optimal solution for an optimization problem, an algorithm is indispensable. Various types of algorithms exist, each possessing distinct strengths and weaknesses. The choice of algorithm is contingent upon the nature of the optimization problem. In essence, optimization falls under the purview of “operations research”, which encompasses various familiar branches like robust optimization, multi-objective optimization, and heuristic algorithms. Heuristic algorithms, contingent upon problem-specific technology, provide feasible solutions within reasonable time and space constraints but do not ensure optimal solutions. Due to their inherent greediness, they often converge to local optima and struggle to reach global optima. Nonetheless, heuristic strategies offer an efficient means of obtaining feasible solutions when finding the optimal solution proves arduous or impractical [2].
Meta-heuristic algorithms have been introduced as a black box approach to address a wide array of problems. The term “meta” conveys a philosophical notion, denoting an organizational unit within the world. Consequently, meta-heuristic algorithms serve as fundamental methods, irrespective of specific problem instances. They can be fine-tuned by adjusting internal parameters to suit particular problems. Because of their simplicity and versatility, numerous algorithms, including Particle Swarm Optimization (PSO) [3], the Ant Colony Optimization algorithm (ACO) [4], and the Bee Artificial Colony Optimization Algorithm (BAC) [5], have been successively proposed. As a result, various engineering applications have emerged, employing these algorithms to optimize structures such as top beam structures and reinforced concrete retaining walls. Additionally, these optimization algorithms have found applications in finance, healthcare systems, manufacturing, quality control, communication systems design, renewable energy, and various other engineering fields. This reflects the rapid proliferation of meta-heuristic algorithms, with numerous types now available. As depicted in Figure 1, most meta-heuristics draw inspiration from nature and can be categorized into four main classes: evolution-based [6], swarm-based [7], physics/chemistry-based [8], and human-based algorithms [9]. Evolution-based algorithms model natural evolutionary processes like natural selection and species migration. Genetic Algorithm (GA), a prevalent evolution-based algorithm, mimics Darwinian natural selection [10], leading to the development of various other evolution-based algorithms such as Differential Evolution (DE), evolution strategy, and queen-bee evolution.
The past two decades have witnessed remarkable advancements in swarm-based optimization algorithms, drawing inspiration from the behaviors of various creatures on Earth. These algorithms are categorized as nature-inspired optimization algorithms and have been employed to address a multitude of optimization challenges in practical engineering applications. This rapid progress in swarm-based algorithms has captured the attention of several researchers, leading to the creation of numerous surveys concentrating on such optimization techniques. This survey discusses the Firebug Swarm Optimization (FSO) algorithm, delving into its variations, classification, and applications. This survey underscores the popularity of the FSO algorithm, which can be attributed to several key factors: First, FSO simulates the movement of firebugs in search of optimal mates and their swarm cohesion, which can address a range of complex design challenges. This capability allows FSO to effectively tackle highly multimodal landscapes, dynamic or noisy environments, and complex engineering challenges characterized by multiple constraints. Second, FSO is characterized by its simplicity and fast convergence rate, requiring minimal parameter tuning. Third, FSO effectively optimizes multi-objective cost functions through its swarm coordination patterns. Consequently, it is widely utilized in renewable energy systems, multi-input DC–DC converters, scheduling, and network routing. Fourth, FSO demonstrates flexibility in hybridization, allowing for seamless integration with fuzzy logic; other heuristic optimization algorithms; and a range of artificial intelligence techniques, including machine learning (ML), neural network (NN), and convolutional neural network (CNN), which will be elaborated upon in the subsequent sections. To the best of the authors’ knowledge, there is no existing study that has conducted a comprehensive survey on FSO algorithms. Therefore, the primary contributions of this survey paper can be outlined as follows:
  • The history, sources of inspiration, phases, and applications of most swarm intelligence optimization algorithms are presented.
  • Focusing on the FSO algorithm as one of the recent swarm optimization algorithms, this survey delves into its operational principles, hybrid variants, strengths and limitations, and its applications in real-world engineering scenarios.
The remainder of this paper is organized as follows: Section 2 describes the history of swarm optimization algorithms; Section 3 focuses on FSO algorithms; Section 4 discusses the real-world applications of FSO; Section 5 provides a case study on optimizing wireless sensor networks (WSNs); Section 6 presents open issues and future research directions; and finally, Section 7 provides the conclusions.

2. Historical Swarm Algorithms

The swarm intelligence technique known as the PSO algorithm drew inspiration from the collective behavior observed in swarms of birds and fish. Initially proposed by Kennedy [3], PSO was the first of its kind. Subsequently, it was adapted for evolutionary fuzzy systems, enabling the optimization of neural network (NN) structures. PSO was indirectly applied to evolve network structures, thereby circumventing the need for pre-processing of input data [11]. Meanwhile, GA, a heuristic optimization technique inspired by the process of natural selection and mutation, was proposed by Holland [10]. GA and its hybrid versions have found widespread applications across various engineering domains. However, a primary issue with GA is its slow convergence. Consequently, the ACO algorithm was introduced to address global optimization challenges [4]. While ACO has been extensively utilized in solving numerous engineering design problems, it imposes a significant computational overhead [12]. This has spurred ongoing research endeavors aimed at devising superior meta-heuristic approaches with reduced complexity.
A multitude of swarm-based optimization techniques have been introduced from the mid-20th century to the present day. Figure 2 illustrates the chronological progression of almost all the algorithms of this category. The reader can readily observe that in the past five years, there has been a notable upsurge in the advancement of swarm-based optimization algorithms. This surge was linked to the swift progression of technologies that rely on computational power and artificial intelligence (AI), particularly in addressing solutions for intricate and extensive systems like Internet of Things (IoT), smart cities, telecommunication systems, robotics, engineering problems, and healthcare systems. Moreover, the collaboration between researchers from different disciplines has led to cross-pollination of ideas and methodologies. This interdisciplinary approach has spurred the development of novel swarm-based algorithms and optimization strategies.
It is noted that while GA is not typically classified as a swarm-based algorithm, it has been incorporated into the timeline presented in Figure 2 due to its significance and widespread adoption. This inclusion also illustrates that GA was introduced prior to PSO.
Swarm-based algorithms drew their inspiration from different species’ behaviors such as foraging, social, and innate natural behavior. Figure 3 displays the sources of inspiration for swarm-based algorithms. More specifically, swarm-based algorithms can be classified as follows.
  • Algorithms inspired by foraging behavior: The foraging behaviors observed in various species can typically be categorized into three primary types: hunting mechanisms, colony optimization strategies, and solitary foraging techniques. Each of these behaviors reflects different approaches to acquiring resources. Below, examples of swarm-based algorithms corresponding to each of these categories are provided.
    Hunting mechanism: Ant Lion Optimizer (ALO) [13] serves as an exemplary model, mimicking the predatory tactics of ant lions in their larval phase. ALO adopts a structured approach comprising five distinct phases, mirroring the hunting strategy: random walk of ants, trap construction, ant entrapment, prey capture, and trap reconstruction. The Chameleon Swarm Algorithm (ChSA) [14], Lion Optimization (LO) [15], and Whale Optimization (WO) [16] are other illustrative instances.
    Colony optimization: The ACO algorithm [4] represents a notable example of colony optimization, which performs local and global pheromone updates to optimize the colony. The BAC [5] and Blind Naked Mole-Rat Optimization (BNMR) [17] are also illustrative instances.
    Solitary foraging: Puma Optimizer (PO) [18] describes how pumas are inclined to hunt individually rather than in groups or packs. This solitary hunting approach helps them reduce the likelihood of competition with other predators and permits them to concentrate on their individual hunting techniques. Such behavior imbues the PO algorithm with distinctive and potent mechanisms for both exploration and exploitation phases, thereby enhancing its effectiveness across a spectrum of optimization challenges. Some other examples of foraging are Grasshopper Optimization (GO) [19], Beetle Swarm Optimization (BSO) [20], and Red Kite Optimization (RKO) [21].
  • Algorithms inspired by social behavior: This class can fall into four categories.
    Breeding behavior: The Cuckoo Search Algorithm (CuSA) [22] draws inspiration from the aggressive reproductive behavior observed in Cuckoos. Certain species, such as the Ani and Guira Cuckoos, exhibit a communal nesting strategy, where they lay their eggs alongside those of others. However, they may selectively remove other eggs from the nest to improve the hatching success of their own eggs. Additionally, many species practice obligate brood parasitism, whereby they lay their eggs in the nests of other host birds, often across different species. This parasitic behavior encompasses three primary types: intraspecific brood parasitism, cooperative breeding, and nest takeover. Two other illustrative examples are Dragonfly Optimization (DO) [23] and the Crow Search Algorithm (CSA) [24].
    Mating behavior: Bird Mating Optimization (BMO) [25] draws inspiration from the mating behavior of birds; during the mating season, birds exhibit a range of intelligent behaviors, including singing, tail drumming, or dancing, to allure potential mates. Certain courtship ceremonies are notably intricate and contribute to establishing a bond between the prospective partners. Chimp Optimization (CO) [26] is another instance.
    Reproductive behavior: The FSO algorithm [27] is modeled based on reproductive patterns observed in Firebug swarms. The Firebug reproductive behavior involves five stages, including colony formation, mate selection, female Firebugs’ movement, attraction between fittest mates, and swarm cohesion.
    Leadership hierarchy: Grey Wolf Optimization (GWO) [28] mimics the leadership structure and hunting strategies observed in Grey Wolves. It utilizes four categories of Grey Wolves (α, β, δ, and ω) to simulate the hierarchical leadership. Moreover, it incorporates the three primary hunting stages: searching for prey, encircling prey, and attacking prey.
  • Algorithms inspired by innate natural behavior: In general, the innate natural behaviors observed in various species are classified into mechanisms such as defense and survival. These mechanisms help species protect themselves and thrive in their environments. Below are brief explanations for several swarm-based algorithms that simulate these mechanisms.
    Defense mechanism: The Pufferfish Optimization Algorithm (POA) [29] draws inspiration from the Pufferfish’s natural defense mechanism against predators. In this mechanism, the Pufferfish fills its elastic stomach with water, transforming into a spherical ball adorned with pointed spines. As a result, hungry predators are deterred from attacking. The POA theory is then mathematically modeled in two phases: (a) exploration—simulating a predator’s attack on a Pufferfish—and (b) exploitation—simulating a predator’s escape from the spiny spherical Pufferfish.
    Survival mechanism: The Artificial Rabbit Optimization algorithm (ARO) [30] draws inspiration from the survival tactics of rabbits in their natural habitat. Specifically, it incorporates two key strategies: detour foraging and random hiding. In the detour foraging strategy, a rabbit detours to grass patches near other rabbits’ nests, avoiding drawing predators’ attention to its own nest; eating near other nests conceals its own nest location, reducing the risk of it being discovered. When facing a potential threat, a rabbit has multiple burrows (hiding spots) to choose from and randomly selects one of its own burrows for concealment. This unpredictability makes it harder for enemies to capture the rabbit; this is known as the random hiding strategy. In addition, when a rabbit’s energy decreases, it shifts from the detour foraging strategy to the random hiding strategy. This adaptive transition optimizes survival chances based on available resources and energy levels. The Gazelle Optimization Algorithm (GOA) [31] is also considered as an alternative instance.
    Echolocation mechanism: The Bat Optimization (BO) algorithm [32] draws inspiration from microbats, which employ a form of sonar known as echolocation to navigate their surroundings. In the darkness, microbats use this remarkable ability to accomplish several tasks, including detecting prey, avoiding obstacles, and locating their roosting spots. When hunting, microbats emit intense sound pulses and then listen for the echoes that bounce back from nearby objects. The specifics of their echolocation signals vary based on the species. While most bats utilize short, frequency-modulated signals that cover approximately one octave, others prefer constant-frequency signals. Additionally, the signal bandwidth varies depending on the species and is often enhanced by incorporating multiple harmonics.
Table 1 provides an overview of the inspirations behind some swarm-based algorithms and their real-world applications.

3. Firebug Swarm Optimization

The FSO algorithm, as introduced by [27], draws inspiration from the reproductive swarm behavior observed in Firebugs (Pyrrhocoris apterus), which unfolds in four distinct phases: roaming and exploring, aggregation, movement, and reproduction. These phases each emulate a biological activity of the firebugs. Specifically, the roaming phase involves the random generation of initial positions for firebugs within the colony (search space). The exploring phase is dedicated to identifying the optimal mate within the colony. The aggregation phase entails the gathering of male firebugs around the most attractive female firebug (optimal solution). The movement phase describes how male firebugs navigate and are drawn toward the best female firebug. Finally, the reproduction phase represents the process through which male firebugs reach their optimal female firebug.
The FSO algorithm is a global optimization technique characterized by a process in which individual firebugs seek the most suitable reproductive partners, analogous to the search for optimal solutions within a given search space. The Firebug swarm exhibits five distinct behaviors that are employed to construct the optimization algorithm. These behaviors include the formation of female colonies, mate selection, chemotactic movement of female bugs, attraction of male bugs to the fittest female bugs, and swarm cohesion. In essence, each bug aims to find the best, i.e., healthiest, mates for reproduction.
The advantages of the FSO algorithm can be summarized as follows. First, it is a fast optimization algorithm due to two factors: the simultaneous and parallel updating of all bug positions using element-wise Hadamard multiplication and the vectorization of the cost function, which reduces inefficient FOR loops. Second, it avoids premature convergence and ensures exploration of a broad region in the search space. Third, it enhances solution diversity by reducing the convergence of all female bugs towards the dominant male bug. However, FSO is specifically designed for multimodal optimization problems, which means it is not suitable for unimodal optimization tasks. Additionally, FSO has limitations in exploration and experiences an imbalance between exploration and exploitation. This issue occurs because exploration is primarily emphasized in the initial iterations, while exploitation predominates in the later iterations. As the FSO nears its maximum execution time, the time allocated to exploitation surpasses that designated for exploration, resulting in an imbalance in the exploration–exploitation ratio. In other words, excessive exploration can lead to a wastage of resources and time and may also result in the algorithm becoming trapped in local optima. Conversely, an overemphasis on exploitation may detrimentally affect overall performance. Therefore, striking a balance between exploration and exploitation is a critical aspect of local search algorithms. Effective exploration mitigates the risk of the algorithm becoming ensnared in local optima, while efficient exploitation ensures meaningful progress toward optimal solutions [49].
Two key differences set the FSO algorithm apart from other existing algorithms. First, the FSO algorithm utilizes element-wise Hadamard matrix multiplication for position updates. This approach leverages the fast Single Instruction Multiple Data (SIMD) capabilities of modern CPUs and GPUs, allowing for parallel execution of multiple arithmetic operations. Unlike standard matrix multiplication, the Hadamard product combines matrices by multiplying their corresponding elements. This operation maintains the original dimensions while enabling localized transformations. Furthermore, standard matrix multiplication has a time complexity of $O(n^3)$, whereas the Hadamard product exhibits a time complexity of $O(nm)$, where $nm$ represents the size of the matrices involved, indicating a significant reduction in computational complexity. Additionally, when comparing the time complexity of FSO with those of the PSO and ACO algorithms, FSO demonstrates a significant reduction: PSO and ACO incur time complexities of $O(TND)$ and $O(TNM^2)$, respectively. In this context, $N$ represents the number of particles and ants for PSO and ACO, respectively; $T$ denotes the number of iterations; $D$ indicates the dimensionality of the search space; and $M$ refers to the number of nodes or the problem size. Second, the FSO algorithm ensures swarm cohesion, which means that the entire swarm moves as a unit and individual bugs do not disperse. Specifically, male bugs are not attracted to each other but instead follow the direction of motion towards promising solutions. The swarm cohesion leads to the prevention of premature convergence to local minima. Furthermore, the evaluations of the cost function are highly vectorized, as the position vectors of all individuals are passed as a single matrix to benchmark suites such as CEC 2013. This suite accepts the matrix of position vectors and returns all corresponding costs as a single array. The use of matrix operations, as opposed to scalar operations, is widely acknowledged for significantly improving computational efficiency in prominent high-level programming languages, thereby avoiding inefficient FOR loops. In summary, FSO leverages efficient matrix operations to circumvent the drawbacks of inefficient scalar operations [27]. The phases of the FSO algorithm and the behaviors of Firebug swarms are depicted in Figure 4.
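To make the computational argument concrete, the following minimal NumPy sketch contrasts the Hadamard (element-wise) product with standard matrix multiplication and evaluates a vectorized cost function over an entire population in one call; the sphere function, the dimensions, and the variable names are illustrative placeholders rather than part of the published FSO implementation.

```python
import numpy as np

D, N = 30, 50                               # dimensionality and population size (illustrative)
positions = np.random.uniform(-100, 100, size=(D, N))   # one column per bug
coeffs = np.random.rand(D, N)               # element-wise attraction coefficients

# Hadamard (element-wise) product: O(D*N) multiplications, shape preserved.
hadamard_update = coeffs * positions

# Standard matrix multiplication of a D-by-D matrix with the population costs O(D^2 * N),
# and O(n^3) for two n-by-n matrices, which is what the complexity comparison refers to.
A = np.random.rand(D, D)
matmul_update = A @ positions

# Vectorized cost evaluation: all N candidates scored in a single call, no FOR loop.
def sphere_cost(P):
    """Placeholder benchmark: sum of squares of each column of P."""
    return np.sum(P**2, axis=0)

costs = sphere_cost(positions)              # shape (N,), one cost per bug
print(hadamard_update.shape, matmul_update.shape, costs.shape)
```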

3.1. Mathematical Model of FSO Algorithm

Consider a colony with $N_f$ female bugs and $N_m$ male bugs distributed across a search space of dimension $D$. The mathematical model can be described through the behaviors of the Firebugs as follows: Assume a matrix $P_F$ of dimension $D \times N_f$ whose columns represent the positions of the female bugs. Hadamard multiplication operations are used to simultaneously update all female bugs in a particular colony, as below:
$P_{DM} \leftarrow \mathrm{repmat}(P(m).x,\ 1,\ N_f)$     (1)
$P_{RM} \leftarrow \mathrm{repmat}(P(a).x,\ 1,\ N_f)$     (2)
where $a$ is a random integer between 1 and $N_f$. The function $\mathrm{repmat}(A, m, n)$ generates a matrix with $m$ copies of $A$ along the row dimension and $n$ copies along the column dimension.
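As a rough illustration of Equations (1) and (2), the snippet below uses NumPy's np.tile as a stand-in for repmat, replicating a single male position into a D-by-N_f matrix so that all females can be updated at once; the flat position matrix for the males (instead of the P(m).x record notation) and the index choices are simplifying assumptions made only for this sketch.

```python
import numpy as np

D, N_f, N_m = 10, 20, 5
P_males = np.random.uniform(-5, 5, size=(D, N_m))   # male positions, one column per male (assumed layout)

m = 0                                    # index of the dominant male (assumed)
a = np.random.randint(N_m)               # index of a randomly chosen male (assumed range)

# repmat(x, 1, N_f): replicate one position vector into a D-by-N_f matrix.
P_DM = np.tile(P_males[:, [m]], (1, N_f))   # Equation (1): dominant-male matrix
P_RM = np.tile(P_males[:, [a]], (1, N_f))   # Equation (2): random-male matrix
print(P_DM.shape, P_RM.shape)               # both (D, N_f)
```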

3.1.1. Behavior I: Formation of Female Colonies (Initialization)

This behavior represents the initial phase of initialization, where the positions of female bugs are set within the search space; each female bug’s position is modeled as a column vector with an associated real scalar cost. Each female’s initial position is considered a uniformly distributed random variable within the search space.

3.1.2. Behavior II: Mate Selection (Initialization)

This is a supplementary phase of initialization, which involves the following steps:
  • The location of each male bug is initialized to the position of the best female bug within its colony.
  • The best female bug is assigned to the dominant male bug controlling that specific colony.
  • The positions of the females are updated simultaneously using element-wise Hadamard matrix multiplication.

3.1.3. Behavior III: Chemotactic Movement of Female Bugs (Exploration)

This behavior represents the initial part of the exploration phase, during which each female moves strongly towards the dominant male in its colony while only weakly moving towards random male bugs. Mathematically, this behavior is simulated as follows:
  • The update equations for female bugs are arranged into a single matrix rather than a vector to enable synchronous updates and reduce inefficient search operations, as shown in the female position update in Equation (3) below.
    $P_F \leftarrow P_F + C_1 \circ (P_{DM} - P_F) + C_2 \circ (P_{RM} - P_F)$,     (3)
    where $P_F$ is a matrix that contains the females’ positions, $P_{DM}$ denotes the location of a dominant male, $P_{RM}$ denotes the location of a random male, $C_1$ is the attraction coefficient towards the dominant male, $C_2$ is the attraction coefficient towards other male bugs, and $\circ$ denotes the element-wise (Hadamard) product. Note that the term $C_1 \circ (P_{DM} - P_F)$ represents the movement of females towards the dominant male, while the term $C_2 \circ (P_{RM} - P_F)$ represents the movement of females towards random males.
  • The cost function is vectorized to return a scalar when applied to a column vector. This approach greatly reduces inefficiencies associated with using FOR loops, resulting in lower complexity.
  • To ensure each female moves more strongly towards the dominant male and less towards random male bugs, set $C_1 > C_2$.
Note that the attraction of females to random male bugs aids in exploration and enhances solution diversity, as it reduces the tendency of all female bugs to converge on the location of the dominant male.
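A minimal sketch of the female position update in Equation (3), assuming the placeholder matrices above and per-element random attraction coefficients, with C1 drawn from a wider range than C2 so that the pull towards the dominant male is stronger on average.

```python
import numpy as np

D, N_f = 10, 20
P_F  = np.random.uniform(-5, 5, size=(D, N_f))                  # current female positions
P_DM = np.tile(np.random.uniform(-5, 5, (D, 1)), (1, N_f))      # dominant-male matrix (placeholder)
P_RM = np.tile(np.random.uniform(-5, 5, (D, 1)), (1, N_f))      # random-male matrix (placeholder)

# Element-wise coefficients; sampling C1 over a wider range than C2 keeps C1 > C2 on average.
C1 = np.random.uniform(0.0, 2.0, size=(D, N_f))
C2 = np.random.uniform(0.0, 0.5, size=(D, N_f))

# Equation (3): all females updated in one vectorized Hadamard-product expression.
P_F = P_F + C1 * (P_DM - P_F) + C2 * (P_RM - P_F)
```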

3.1.4. Behavior IV: Attraction of Male Bugs to the Fittest Female Bugs (Exploitation)

This behavior is viewed as the exploitation phase and can be described as follows:
  • Each male bug is also attracted to a fit female outside its own colony. When males are drawn towards the same fittest female, it keeps the entire population clustered together.
  • There is no competition among males for a particular female in the colony to prevent all males from converging on the same female bug. Mathematically, this approach helps avoid premature convergence and ensures exploration of a broad area in the search space, rather than focusing solely on the location of the fittest female bugs.
Equation (4) describes the updated movement of all male bugs, including the dominant male bug, towards the fittest female bug in the colony, and it can be presented as follows:
$P_M \leftarrow P_M + C_3 \circ (P_{FF} - P_M)$,     (4)
where $P_M$ represents the matrix of male bugs’ positions, $P_{FF}$ denotes the position of the fittest female bug, and $C_3$ is the attraction coefficient towards the fittest female bug. Note that the term $C_3 \circ (P_{FF} - P_M)$ indicates the movement of males towards the fittest female bug.

3.1.5. Behavior V: Swarm Cohesion (Exploration)

This behavior is viewed as a complementary part of exploration, where the entire swarm moves together in a stochastic manner. Individual bugs do not disperse because they must follow the direction of randomly chosen bugs towards the fittest female, rather than moving towards the locations of other male bugs. This cohesion helps prevent premature convergence to local minima. The swarm cohesion is modeled as follows:
$P_M \leftarrow P_M + C_4 \circ (P_{FF} - P_I)$,     (5)
where $P_I$ denotes the matrix of individual male bugs’ positions, and $C_4$ represents the attraction coefficient for these individual male bugs towards the fittest female bug. The term $C_4 \circ (P_{FF} - P_I)$ describes the movement of individual males towards the fittest female bug, illustrating that individual male bugs align their movement with the direction of the dominant male bug. In other words, having the individual male bugs follow the direction of the dominant male bug ensures cohesion within the swarm.
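The male update of Equation (4) and the cohesion step of Equation (5) follow the same vectorized pattern; the coefficient ranges and the use of a random permutation of the male positions to stand in for the individual bugs P_I are illustrative assumptions for this sketch.

```python
import numpy as np

D, N_m = 10, 5
P_M  = np.random.uniform(-5, 5, size=(D, N_m))                  # male positions
P_FF = np.tile(np.random.uniform(-5, 5, (D, 1)), (1, N_m))      # fittest-female matrix (placeholder)

# Equation (4): every male moves towards the fittest female (exploitation).
C3 = np.random.uniform(0.0, 1.0, size=(D, N_m))
P_M = P_M + C3 * (P_FF - P_M)

# Equation (5): swarm cohesion -- each male follows the direction from a randomly
# chosen individual bug towards the fittest female rather than chasing other males.
P_I = P_M[:, np.random.permutation(N_m)]                        # randomly chosen individual positions (assumed)
C4 = np.random.uniform(0.0, 1.0, size=(D, N_m))
P_M = P_M + C4 * (P_FF - P_I)
```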
The performance of the FSO algorithm was assessed using twenty-eight functions from the CEC2013 benchmark. The results indicated that FSO surpasses ACO, PSO, DE, Dynamic Learning PSO (DLPSO), and other heuristic global optimization algorithms. This outperformance is attributed to FSO’s ability to effectively balance exploration and exploitation in a high-dimensional multimodal landscape, preventing premature convergence. Specifically, female bugs are not drawn to just one optimal male bug, and male bugs only mimic the movement of random male bugs rather than being attracted to one another. Algorithm 1 presents the pseudo-code for the FSO algorithm, while Figure 5 illustrates the flowchart of the FSO algorithm.
Algorithm 1 Pseudo-code of the FSO algorithm.
1: INPUT: cost function, S1max, S2max, L1, L2, Nf, Nm, and D.
2: Initialization: female colony formation with Nf female bugs; set s = 1 and I = 1.
3: Mate selection using Equations (1) and (2).
   while (s ≤ S1max) {
       Compute C1 and C2.
4:     Update the female bugs' positions using Equation (3) (exploration).
       Update the fittest female bug position PFF.
       I = I + 1
       IF (I < L1) {
           Go to Step 4.
       } ELSE {
           I = 1
5:         Update the male bugs' positions PM using Equation (4) (exploitation).
           IF (I > L2) {
               s = s + 1
               I = 1
           } ELSE {
               s = 1
               IF (s < S2max) {
6:                 Update the random male bugs' positions using Equation (5) (swarm cohesion).
                   s = s + 1
               }
           }
       }
   }
   END
For a comprehensive understanding, the pseudo-code of the FSO algorithm is presented in Algorithm 1. The parameter settings for FSO were empirically determined using the CEC 2013 benchmark, on which the algorithm demonstrates strong performance. The population size is calculated as the product of $N_f$ and $N_m$, with $N_m > N_f$ selected to enhance exploration. The balance between exploration and exploitation is regulated by $L_1$ and $L_2$, where $L_1$ promotes exploration and $L_2$ enhances exploitation. Furthermore, $S_{1max}$ and $S_{2max}$ influence exploration and exploitation, respectively, and are governed by competition criteria to ensure that the total number of evaluations does not exceed $1000D$.
To achieve an effective trade-off between exploration and exploitation, the parameters $S_{1max}$ and $S_{2max}$ are selected carefully. Additionally, to enhance exploration in high-dimensional multimodal landscapes, $S_{1max}$ is increased relative to $S_{2max}$ as the dimensionality and complexity of the problem rise. For higher accuracy, it is essential to increase both $S_{1max}$ and $S_{2max}$, but at the expense of increased computational time [27].
To overcome these limitations, newly proposed FSO algorithms have been introduced, as detailed in Section 3.2. Furthermore, for a comprehensive overview, the strengths and weaknesses of the FSO and some other nature-inspired optimization algorithms are contrasted in Table 2.

3.2. FSO Variant Algorithms

In addition to the FSO algorithm, four innovative variants have been proposed to enhance its performance and address the primary limitations of the original FSO algorithm. These variants include the Enhanced FSO (EnFSO) [65], the Improved FSO (IFSO) [50], Fuzzy FSO (F2SO) [51], and the modified FSO (mFSO) [49].
The core concept behind the EnFSO algorithm is to refine the local search process by incorporating two additional operators: mutation and crossover, inspired by the Pelican Optimization Algorithm (PeOA) [66]. The initialization phase begins by setting the system’s input parameters, which are then randomized. Next, the cost function is defined. Colony formation, bug movement, and swarm cohesion occur sequentially. Mutation and crossover guide the local search, and once the best current solution is identified, each candidate receives its own solution set. The crossover and mutation operators enhance population diversity, improve search effectiveness, and accelerate convergence. Algorithmically, EnFSO introduces two additional stages: randomizing the positions of female bugs immediately after initialization, and performing a local search phase placed after swarm cohesion and before termination in order to refine and identify the optimal global solution. The study in [65] introduces EnFSO to enhance the efficiency of renewable energy sources (RESs) integrated into a microgrid using a step-up multi-input DC–DC converter topology, where the FSO algorithm tunes the PI controller parameters to minimize the error between actual and reference active power, i.e., the cost function, while the PeOA algorithm manages the search activity to determine the best PI parameters for minimizing the cost function. Simulation results show that EnFSO delivers higher maximum-power efficiency than resource flow analysis (RFA), artificial neural network (ANN), and ASO. Specifically, EnFSO reaches an efficiency of 93%, whereas RFA, ANN, and ASO achieve 89%, 87%, and 91%, respectively. Additionally, EnFSO lowers total harmonic distortion (THD) to 2%, while the other algorithms reduce it only to 5.5%, 7.5%, and 4%, respectively.
The IFSO algorithm in [50] integrates the FSO algorithm with decision tree (DT) techniques. This integration reduces the maximum execution time and minimizes scalar operations, leading to faster convergence and more effective exploration and exploitation. This integration streamlines the FSO’s search space, thereby enhancing the efficiency of identifying the optimal feature within that space. In other words, the DT classifier serves as the fitness calculator for each female bug. The classification error rate produced by the DT classifier is used as the cost function. Consequently, by learning the simple decision rules derived from the input features, the DT classifier effectively predicts a fitness value. Furthermore, the proposed IFSO employs vectorized versions of the cost function to guide each female bug primarily towards the dominant male bug controlling its colony, while also directing them weakly towards random male bugs. The biological behavior of firebugs is utilized to address the optimization problem of selecting a significant set of features from input images for classification. The firebugs represent the extracted features. The firebugs are initialized within the local search space. The feature combination sets of the forty-seven extracted features correspond to the positions of the female bugs. The DT classifier is employed to compute the fitness function, with the classification error rate of the firebugs serving as the cost function values. Each female bug’s initial position is represented as a uniform vector random variable. Consequently, the search process continues by updating the positions of the firebugs, the fitness function, and the candidate solutions. To validate the IFSO algorithm, the FVC2002 fingerprint dataset, which includes four subsets (DB1, DB2, DB3, and DB4), is utilized. This dataset is selected for its diversity, standardization, scalability, and acceptance. The simulation results demonstrate that the IFSO algorithm outperforms existing methods in terms of accuracy, sensitivity, specificity, and error rate. Quantitatively, IFSO achieves an accuracy of 95%, whereas ANN, recurrent neural network (RNN), and DT attain accuracies of only 85%, 60%, and 55%, respectively.
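To illustrate how a decision tree can act as the fitness calculator in an IFSO-style wrapper, the sketch below scores a binary feature mask by the cross-validated classification error of a DecisionTreeClassifier; the synthetic dataset standing in for the forty-seven extracted fingerprint features and the random mask are placeholders, not the actual IFSO pipeline.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

# Synthetic stand-in for the forty-seven extracted features (placeholder data).
X, y = make_classification(n_samples=300, n_features=47, n_informative=10, random_state=0)

def dt_error_rate(mask, X, y):
    """Cost of a candidate feature subset: DT classification error rate (lower is better)."""
    if not mask.any():                     # an empty feature subset is invalid
        return 1.0
    clf = DecisionTreeClassifier(random_state=0)
    accuracy = cross_val_score(clf, X[:, mask], y, cv=5).mean()
    return 1.0 - accuracy

# A candidate solution (e.g., a thresholded female-bug position) expressed as a boolean mask.
mask = np.random.rand(47) > 0.5
print("classification error:", dt_error_rate(mask, X, y))
```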
The F2SO algorithm [51] addresses the imbalance in the exploration–exploitation ratio by integrating a fuzzy decision module with the FSO algorithm to facilitate the automatic adaptation of its search behavior by refining FSO parameters related to bug movement and the selection of the best female bug. The routing model is developed based on twelve fuzzy rules, which consider the distance between sensor nodes (SNs) and cluster heads (CHs), node degree, and residual energy. The FSO algorithm is employed to identify the optimal path between SNs and CHs. However, the FSO algorithm may encounter suboptimal search behavior during the optimization process, such as node transmission failures. To mitigate this issue, the fuzzy decision module automatically adjusts the search behavior of the FSO algorithm to prevent such scenarios. The experimental results showed that the F2SO algorithm outperforms existing routing methods in terms of network lifetime, end-to-end delay, energy consumption, and throughput. Further details can be found in Section 4.5.
The study in [49] aims to evaluate the economic factors associated with an off-grid hybrid photovoltaic (PV)/biomass/battery system designed to meet the energy needs of a rural area. This design problem is categorized as a high-dimensional multimodal optimization problem, where the exploration–exploitation trade-off is critical. Consequently, the study proposes the integration of a hybrid optimization framework of FSO with machine intelligence, including Logistic Chaotic Local Search (LCLS), the Opposite-Based Learning (OBL) technique, and Phasor and Transition Operators (PO, TO). The LCLS is utilized to tune swarm cohesion coefficients to improve exploration in areas with a higher potential for identifying optimal solutions. The OBL regulates the movement of the bugs and identifies the most fit female bug, thereby enhancing exploitation capabilities and minimizing the risk of convergence to local optima. Meanwhile, TO and PO are employed to generate a random distribution of bugs, which further promotes exploration and helps prevent entrapment in local optima during the search process.
The performance of the mFSO algorithm was assessed using ten functions from the CEC2020 benchmark. Additionally, simulation results for designing a standalone hybrid system that includes PV, biomass, and battery components demonstrated that the mFSO algorithm outperformed designs obtained with FSO, the Seagull Optimization Algorithm (SOA), and the Slime Mold Algorithm (SMA). Table 3 presents a comparison between the mFSO and other techniques. This comparison demonstrates that mFSO effectively minimizes the objective function and the cost of energy (COE).
In summary, the FSO algorithm is an efficient global search optimization technique specifically designed for multimodal optimization problems, characterized by its rapid convergence and resilience to noise. It is applicable to a wide range of real-world scenarios, including engineering design and WSN routing. However, the imbalance between exploration and exploitation significantly impacts the performance of the FSO algorithm, particularly in high-dimensional multimodal landscapes with constrained time and computational resources, such as hybrid renewable energy systems and low-resolution medical imaging and forensic applications. To enhance the efficacy of the FSO algorithm, integrating it with artificial intelligence techniques, such as NNs, CNNs, ML techniques, other heuristic optimization algorithms, and fuzzy logic, has led to the development of FSO variants that are capable of finding high-quality solutions in real-world applications with limited computational resources. Furthermore, from an algorithmic structure perspective, this integration aims to optimize the exploration and exploitation parameters within the FSO algorithm. The tuning process aims to confine the search space of the FSO algorithm, enabling it to more efficiently identify local optima within this confined search space, thereby mitigating the inherent limitations of the FSO algorithm while leveraging its advantage of fast convergence.

4. FSO Real-World Practical Applications

FSO has important engineering uses in various real-world applications, including cloud computing, WSNs, social media, healthcare systems, and electric vehicles. Figure 6 illustrates some important FSO real-world applications.

4.1. Task Scheduling

In cloud computing, task scheduling is vital to achieving multiple goals and satisfying different user needs. The increasing demand and number of users necessitate minimizing the task completion time and enhancing the load balancing capacity. Therefore, the work in [67] proposed a Hybrid Firebug Tunicate Optimizer (HFTO) as a task scheduling framework for fault tolerance and dynamic task scheduling in cloud computing. Virtualization is fundamental to cloud computing, allowing virtual machines (VMs) to execute client tasks. Due to the dynamic nature of the cloud, physical machine loads vary. Load balancing helps distribute workloads across resources such as CPUs, disk drives, network links, and computers.
The proposed HFTO algorithm classifies tasks; generates various VM variants; and optimizes several Quality of Service (QoS) parameters, including fault tolerance, response time, efficiency, and makespan. The proposed HFTO algorithm demonstrates several advantages, such as improved search capability and faster convergence. The algorithm enhances fault tolerance by assigning tasks to suitable resources based on peak resource loads. Integrating the Tunicate Swarm Optimizer (TSO) with the FSO algorithm addresses a key limitation of FSO, i.e., its constant exploration–exploitation ratio, while leveraging FSO’s strengths, such as fast convergence and effective feature extraction. TSO enhances search capabilities by balancing exploration and exploitation, allowing for improved swarm behavior updates. The original FSO’s exploitation swarm cohesion required adjustments, as it hindered convergence speed, search ability, and computational efficiency. By incorporating TSO’s swarm behavior into FSO’s exploitation cohesion, the overall performance of the FSO algorithm is enhanced.
In HFTO, tasks running in each VM are considered as male firebugs, incoming tasks in the queue are considered as dominant male firebugs, and tasks offloaded from an overloaded VM are considered as the female firebug population. The role of FSO is to determine the optimal under-loaded VM, i.e., female firebug, for the incoming task, i.e., dominant male firebug. A quantitative comparison with existing frameworks shows that the proposed framework achieves higher load balancing efficiency and enhances overall cloud task scheduling performance.

4.2. Routing

The work in [68] proposed an energy-efficient routing protocol that maximizes the network security with minimum energy consumption. The proposed algorithm consists of two phases: secure node selection and orthogonal routing. In the first phase, a secure node is chosen from the initial nodes of the WSN based on criteria such as trust, delay, and QoS, while in the subsequent phase, the Firebug Optimized Modified Bee Colony (FOMBC) technique is employed to determine an optimal route considering factors such as trust, distance, latency, and QoS.
The goal of combining FSO and MBCO is to avoid premature convergence, reduce computational costs, and accelerate convergence. This integration enhances search capability and speeds up convergence, with MBCO managing the balance between exploration and exploitation, while FSO improves convergence speed with minimal computational load. Initially, each SN is represented as a male firebug, while neighboring nodes are treated as female firebugs within a colony, i.e., a node cluster.
The location of all sensor nodes in a colony is influenced by the location of the optimal route node within that colony. When using FSO outside of the colony, sensor nodes are attracted to neighboring nodes, moving toward the most energy-efficient neighboring node. This keeps the nodes cohesive and prevents them from dispersing. For the optimal route node in a sensor node cluster, there is no competition, i.e., swarm cohesion, preventing all sensor nodes from being drawn to the same neighboring nodes and reducing the risk of early convergence.
The effectiveness of the FOMBC method is compared with existing techniques, including the Trust-Aware Routing Framework (TARF), the Simple Opportunistic Adaptive Routing (SOAR) Protocol, PSO, and WOA for secure routing. Experimental results indicate that the proposed FOMBC approach achieves superior performance in terms of detection rate, throughput maximization, minimal latency, and reduced range. Moreover, the proposed protocol using FOMBC remarkably prolongs the network lifetime. Quantitatively, FOMBC achieved a 39% higher throughput compared to PSO and WOA. Additionally, for routing 100 nodes in a WSN setup over an average of 100 rounds, FOMBC requires only 13 s, while PSO and WOA take 16 and 21 s, respectively. FOMBC outperforms the other two techniques in terms of delay, as it leverages the fast convergence of FSO and its minimal computational complexity.

4.3. Sarcastic Sentiments

Social media users frequently encounter sarcasm and ridicule, conveyed through both images and text. The work in [69] proposed a novel architecture based on FSO and Long Short-Term Memory (FSO-LSTM) to detect and categorize sarcastic sentiments expressed on Twitter, currently X. The work proposes a multi-objective optimization approach using FSO to balance optimal solutions for sarcasm prediction in text and images, leveraging RNN and LSTM. The fitness functions aim to minimize the root-mean-square error (RMSE) for the first objective and the mean absolute error (MAE) for the second, combining both objectives to determine the overall fitness value. Individuals in the population are ranked according to their objective function values, and mate selection in the FSO algorithm is guided by these fitness rankings. The proposed framework was trained using the CK+ dataset and utilizes past Twitter account history to analyze user behavioral changes. The proposed classifier surpasses existing approaches with a high average classification accuracy of 97.25%.
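As a rough sketch of how the two objectives described above could be folded into a single fitness value for ranking the population, the snippet below computes RMSE and MAE for each candidate's predictions and sorts candidates by an equally weighted sum; the equal weighting and the random placeholder predictions are assumptions, since the paper does not spell out the exact combination rule.

```python
import numpy as np

def combined_fitness(y_true, y_pred, w_rmse=0.5, w_mae=0.5):
    """Fold the two objectives (RMSE and MAE) into one scalar fitness (assumed equal weights)."""
    err = y_pred - y_true
    rmse = np.sqrt(np.mean(err ** 2))
    mae = np.mean(np.abs(err))
    return w_rmse * rmse + w_mae * mae

# Rank a population of candidate predictors by combined fitness (lower is better).
y_true = np.random.rand(100)                                    # placeholder targets
population_preds = [np.random.rand(100) for _ in range(10)]     # placeholder predictions per candidate
fitness = np.array([combined_fitness(y_true, p) for p in population_preds])
ranking = np.argsort(fitness)            # best candidates first, used to guide mate selection
print(ranking)
```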
The work in [70] proposed an ensemble framework with optimal features for sarcasm detection in social media data. The framework comprises four essential components: pre-processing, feature extraction, optimal feature selection, and sarcasm detection. The sarcasm detection module employs a new ensemble technique built around an optimized RNN deep learning classifier. The FSO algorithm is employed to enhance the sarcasm detection accuracy by optimizing the activation function of the RNN. To evaluate the effectiveness of an optimized RNN using FSO for sarcasm detection, rigorous testing on diverse datasets and comparisons with current models are essential. The success of this framework depends on factors such as data accuracy, feature quality, and careful tuning of hyperparameters, all of which influence the model’s performance. The classifier was trained on optimal features extracted using an optimized CNN, and the final results were produced by the optimized RNN. Comparative performance analysis of the proposed framework against existing algorithms, including RNN, CNN, and ANN, demonstrates the superior performance of the proposed framework.

4.4. Privacy and Security

Healthcare data are highly sensitive and private, necessitating stringent protection from intrusions and attacks. Consequently, research has focused on developing various privacy-preserving frameworks using different approaches. In [71], a novel framework employing Gaussian mutation-based FSO in cloud computing was proposed. This algorithm aims to generate an optimal key to enhance security in cloud-based healthcare applications.
The fitness function for generating the optimal key is designed to minimize the key size. The work utilizes FSO for two primary reasons. First, FSO leverages element-wise operations, specifically Hadamard matrix multiplication for position updates, enabling a fast and efficient parallel update process. Second, swarm cohesion in FSO prevents competition among male firebugs, allowing male and female firebug colonies to explore a broader search space without converging prematurely to the same optimal location. Additionally, Gaussian mutation is used to enhance mate selection by improving the crossover process. In this context, sensitive data is represented as male firebugs, while key sizes are represented by female firebugs.

4.5. Power Management

4.5.1. Power Generation Systems

Ensuring effective operation of the entire electric power system is a primary objective. One approach to enhance the efficiency of electricity networks is through the adoption of hybrid renewable energy sources (HRESs), such as PV–wind-based systems. However, a significant challenge in implementing such technology is the occurrence of unbalanced voltage caused by diverse loads and harmonic distortion. Therefore, mitigating unbalanced voltage across the electricity network is crucial.
The work proposed in [72] is a hybrid optimization approach integrating Moth-Flame Optimization (MFO) and the FSO algorithm. This hybrid algorithm acts as a controller to optimize energy allocation from renewable sources, improve power factors, reduce voltage unbalances, and meet total harmonic distortion (THD) criteria.
MFO commonly faces challenges such as low solution accuracy, slow convergence, limited diversity, and a tendency to get trapped in local optima due to difficulties in balancing exploration and exploitation, especially when boundaries are predefined. By integrating MFO with the FSO algorithm, these challenges are effectively addressed, leading to improved solution accuracy and faster convergence. This integrated approach enhances the algorithm’s ability to balance exploration and exploitation with reduced computational cost. The proposed integration not only enhances outcomes by increasing Maximum Power Point Tracking (MPPT) current and voltage values and managing wind turbine and battery switching to reduce THD but also continuously tracks the Maximum Power Point (MPP) of PV systems under varying illumination conditions, optimizing the conversion of solar and wind energy into electricity. Simulation results demonstrated the superiority of the proposed algorithm over existing approaches when applied to an islanded microgrid, particularly in terms of overall power quality (PQ), THD, voltage unbalance factor (VUF), and computational efficiency.

4.5.2. Photovoltaic System

For solar cell systems, the MPPT algorithm is employed to adjust the electrical operating point of the PV panels to ensure that they operate at their MPP under varying environmental conditions such as sunlight intensity and temperature. By dynamically adjusting the voltage and current, the MPPT algorithm ensures that the system extracts the maximum available power from the solar panels, thus maximizing the energy yield. Nevertheless, a significant challenge in PV systems arises from their non-uniform power–voltage characteristics, particularly noticeable under partial shading conditions where multiple peak points can occur. Traditional MPPT algorithms may face difficulties in such scenarios, potentially identifying local maximum peak points instead of the Global Maximum Peak Point (GMPP). Therefore, modifying MPPT algorithms becomes crucial to identifying the global maximum power point in PV systems under non-uniform radiance conditions such as partial shading conditions. Nature-inspired optimization algorithms such as PSO, ACO, and BAC have been employed for searching the GMPP under such circumstances. However, integrating those algorithms into the MPPT algorithm increases computational complexity and tracking time.
Therefore, the work in [73] proposed an FSO-based MPPT algorithm for 9 × 9 PV arrays to maximize their power output under partial shading conditions. The work employed the FSO as a monitoring system to capture the GMPP of the PV array over the duty cycle of the boost converter. Several scenarios of partial shading have been applied to examine the efficacy of incorporating FSO with the MPPT algorithm. Specifically, the best female firebug in the colony represents the highest estimated MPP of the PV array, while the male firebugs represent monitored voltages and currents under various shading conditions. Integrating the FSO algorithm with MPPT enables the system to effectively capture the GMPP of the PV panel with reduced computational complexity. The simulation results showed that the algorithm in [73] outperformed MPPT combined with the PSO, GOA, and GWO algorithms in performance metrics such as generated output power, fill factor, efficiency, and mismatch loss. The MPPT combined with FSO outperforms MPPT implementations using other swarm intelligence algorithms, such as PSO, GOA, and GWO, due to its efficient local search capabilities, fast convergence, and independence from initial conditions.
Recently, image processing has become widely adopted for MPPT. This approach, unlike others, remains unaffected by temperature variations but is sensitive to natural partial shading. An optical camera that distinguishes different lighting conditions is employed to capture images of the PV array. After identifying the shading pattern, the PV arrays are reconfigured using a meta-heuristic algorithm as an optimization tool. The work in [74] introduced an image processing-based FSO to perform MPPT under shadowing conditions on PV arrays. By capturing an image and applying an edge detection algorithm, the algorithm in [74] detects the shaded portion of a PV array. The image processing analysis is utilized to adjust the current rating accordingly. The FSO algorithm’s role involves identifying and controlling switch configurations to maximize the global peak output power amidst partial shading patterns. Moreover, the proposed algorithm reconfigures shaded PV arrays to significantly mitigate power and efficiency losses. Initially, the FSO determines the size of the firebug population based on the size of the PV arrays or the number of solar panels. The algorithm then iterates multiple times to track the maximum possible GMPP generated by different switching configurations under various partial shading patterns. Each completed iteration represents the firebug population, with the GMPP being identified in every population, highlighting the most significant point of the population. As a result, each shading profile will attain the optimal configuration, leading to a significant increase in the efficiency and energy yield of the photovoltaic array. Simulation results using a 280 W PV panel validate the effectiveness of the FSO-based MPPT technique with image processing. Comparative analysis against state-of-the-art MPPT methods demonstrates that this approach successfully detects the peak power point under partial shading conditions, thereby enhancing both power output and efficiency.

4.5.3. Fault Analysis

Induction generators used for grid-connected wind energy generation necessitate conversion switches. These switches introduce voltage ripple, magnitude fluctuations, and injected harmonics. These effects have a significant impact on voltage stability amid grid disturbance and severe weather conditions, as well as on the PQ of the generation process, particularly concerning fault ride-through capabilities during voltage reactions. Hence, conducting fault analysis becomes imperative for implementing grid-connected wind energy systems.
One proposed solution involves developing a flexible alternating current transmission system (FACTS), facilitated by a decoupled controller to manage aggregated power and ensure stability monitoring. The work in [75] proposed integrating MFO and FSO (MFO-FSO) to build the decoupled controller for this purpose. MFO can successfully handle a particular class of global optimization problems but tends toward early convergence; moreover, when MFO operates with predetermined boundaries, it is unable to merge the boundary sets. Therefore, FSO is employed alongside MFO to enhance performance and avoid premature convergence with reduced computational complexity. The interim outcomes of the system using the MFO-FSO approach are evaluated, and the transient model is iterated. FACTS are validated, and the additional positive impacts of MFO-FSO on system stability are also evaluated. The simulation results demonstrated that the proposed MFO-FSO-based FACTS model can provide the necessary reactive power during grid disturbances. The results also showed that integrating the proposed controller significantly improves voltage stability compared to existing methods.
On the other hand, integrating ML techniques with FSO improves fault detection performance. In [76], an optimal detection and classification technique, the MSVM-FSO algorithm, is proposed by integrating the Multiple Support Vector Machine (MSVM) with the FSO algorithm. The proposed technique aims to diagnose faults in hybrid systems with greater accuracy. The approach effectively classifies and identifies faults occurring in hybrid grid-connected systems with low complexity using the FSO algorithm and substantially enhances the grid-connected system’s PQ. The main objective of this combination is to provide a low-complexity Fault Identification and Diagnosis (FID) scheme that enhances the PQ of the hybrid system. In this approach, the MSVM method detects fault conditions in the grid-tied system, while the FSO method classifies the types of faults occurring in the grid-connected system. The fitness function of FSO is designed to determine whether a fault has occurred in the system. Faults are identified from the phase currents calculated at the end of each line or feeder in the grid, and the performance of the hybrid generation system is evaluated under both symmetrical and unsymmetrical faults. The simulation results demonstrate that the proposed method achieves an accuracy of 99.7% and an efficiency of 98%, outperforming various existing techniques. The results also show that the proposed technique effectively handles noise and uncertainties: by optimizing the MSVM parameters with FSO, the technique can manage variations and disturbances in grid-connected systems, enhancing robustness and reliability. Finally, the technique’s adaptability to various grid-connected systems makes it a versatile solution for multiple applications, improving detection and classification for grid-tied systems with greater accuracy, robustness, and efficiency [76].
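As a rough illustration of how FSO can wrap around the MSVM stage, the sketch below uses the cross-validated error of a multi-class SVM as the fitness that FSO would minimize when selecting the (C, gamma) parameters. A plain random search stands in for the FSO update loop here, and the synthetic dataset and parameter ranges are assumptions rather than details from [76].

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the phase-current feature set of the grid-tied system.
X, y = make_classification(n_samples=300, n_features=8, n_classes=3,
                           n_informative=5, random_state=0)

def fitness(params):
    """Fitness an FSO run would minimize: cross-validated error of a
    multi-class SVM built with the candidate (C, gamma) pair."""
    C, gamma = params
    clf = SVC(C=C, gamma=gamma)           # one-vs-one multi-class SVM
    return 1.0 - cross_val_score(clf, X, y, cv=3).mean()

# A plain random search stands in for the FSO position updates.
rng = np.random.default_rng(0)
candidates = np.column_stack([rng.uniform(0.1, 100.0, 40), rng.uniform(1e-3, 1.0, 40)])
best = min(candidates, key=fitness)
print("best (C, gamma):", best, "cv error:", fitness(best))
```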

4.5.4. Tuning Controller

The fluctuating electrical loads exert a substantial and direct influence on both the electric generators and the supply voltages they produce. An Automated Voltage Regulator (AVR) is a feedback control system designed to monitor and regulate the output voltage of a generator. It functions as a controller by adjusting the exciter voltage to maintain the generator’s output voltage at an optimal level, even under varying load conditions. Typically, AVRs are implemented using PID controllers. However, the work in [77] introduces a novel design approach utilizing a Fuzzy Fractional-Order PID (FOPID) controller [78], which offers greater flexibility and robustness due to its increased degrees of freedom. This added complexity, however, makes parameter tuning more challenging compared to traditional PID controllers.
To address this, the Fuzzy-FOPID controller parameters are finely tuned using the FSO algorithm, a derivative-free global optimization method. The FSO algorithm ensures that the best, i.e., fittest, female firebug, representing optimal FOPID control parameters, consistently achieves low-cost values. The optimization process starts with a population of male and female firebugs randomly distributed across the search space. Each firebug is assigned a cost function and a position vector. The process involves several steps: mate selection, where the best female firebug is identified; updating the positions of female firebugs; attracting male firebugs toward the best female; and achieving swarm cohesion, where all random male firebugs are drawn to the best female while mimicking the movement of other male firebugs. This approach aims to find the optimal solution while avoiding premature convergence to local minima. To evaluate the robustness of the FSO-tuned FOPID controller, performance indices such as the Integral Absolute Error (IAE) and Integral of Time-multiplied Absolute Error (ITAE) were used as objective functions. Simulation results demonstrated the superiority of the proposed FSO-tuned FOPID controller, showcasing enhanced responsiveness and performance in AVR systems compared to conventional PID, standard FOPID, and Kidney-Inspired Algorithm (KIA)-based Fuzzy FOPID controllers [79]. The comparative analysis covered transient performance metrics, including rise time, peak time, settling time, and overshoot, highlighting the effectiveness of the proposed approach.
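A minimal sketch of this optimization loop is given below, written as a simplified form of the mate-selection, female-update, and male-attraction steps described above; it is not the reference FSO implementation. The quadratic placeholder objective merely stands in for the ITAE of the FOPID-controlled AVR loop, which would require a full plant simulation to compute.

```python
import numpy as np

def fso_minimize(cost, dim, bounds, n_males=20, n_females=5, iters=200, seed=0):
    """Simplified FSO loop (sketch): each male firebug keeps a small group of
    females; females drift toward the globally best female, and every male is
    pulled toward his fittest female through element-wise random steps."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    males = rng.uniform(lo, hi, (n_males, dim))
    females = rng.uniform(lo, hi, (n_males, n_females, dim))
    g_best = females[0, 0].copy()
    for _ in range(iters):
        f_costs = np.apply_along_axis(cost, 2, females)      # cost of every female
        m, f = np.unravel_index(f_costs.argmin(), f_costs.shape)
        g_best = females[m, f].copy()                        # mate selection: best female
        for i in range(n_males):
            # swarm cohesion: females of male i drift toward the best female
            females[i] += rng.random((n_females, dim)) * (g_best - females[i])
            # attraction: male i moves toward his own fittest female
            local_best = females[i, f_costs[i].argmin()]
            males[i] = np.clip(males[i] + rng.random(dim) * (local_best - males[i]), lo, hi)
        females = np.clip(females, lo, hi)
    return g_best, float(cost(g_best))

# Placeholder cost standing in for the ITAE of the FOPID-controlled AVR loop;
# the "ideal" gain vector below is purely hypothetical.
target = np.array([1.2, 0.8, 0.3, 0.9, 1.1])
itae_like = lambda k: float(np.sum((k - target) ** 2))
best, val = fso_minimize(itae_like, dim=5, bounds=(0.0, 2.0))
print("tuned parameters:", best, "cost:", val)
```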
Furthermore, maintaining a balance between power generation and demand is a critical requirement for the stability of a power system. Variations in load and the integration of new power sources can significantly impact the system’s frequency. The Automatic Generation Controller (AGC) plays a vital role in maintaining frequency stability within the power system. Any mismatch between supply and demand frequencies can result in system errors.
To address this challenge, the study in [80] proposed an innovative approach using a cascaded fuzzy controller combined with a Three-Degree-of-Freedom FOPID (3DOF-FOPID) controller. This design was specifically applied to a hybrid power network incorporating an ocean thermal power source, aiming to enhance frequency regulation and overall system performance.
Optimizing the tuning of 3DOF-FOPID controller parameters is a critical task. However, there is no straightforward method for optimizing the membership functions of a fuzzy logic controller. To address this, the FSO algorithm was employed to fine-tune the proposed controller. The effectiveness of the FSO-tuned controller was evaluated using four performance indices: Integral Absolute Error (IAE), Integral of Time-multiplied Absolute Error (ITAE), Integral of Square Error (ISE), and Integral of Time-multiplied Square Error (ITSE). A constrained optimization problem was formulated to minimize these performance indices while adhering to constraints on the three PID parameters, set-point constraints for proportional and derivative control, and the low-pass filter coefficient. The simulation results demonstrate improved performance metrics such as settling time, undershoot, and overshoot. Furthermore, the analysis indicates that the proposed controller exhibits robustness against random load perturbations and variations in system parameters.
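For reference, these four indices are the standard integral error criteria, defined over the control error e(t) and simulation horizon T as

IAE = \int_0^T |e(t)| \, dt, \quad ITAE = \int_0^T t \, |e(t)| \, dt, \quad ISE = \int_0^T e^2(t) \, dt, \quad ITSE = \int_0^T t \, e^2(t) \, dt.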
Likewise, the work in [81] presented an innovative control strategy for a first-order plus time delay-based three-area interconnected hybrid power system. The proposed strategy employs the FSO algorithm for optimal tuning of controller parameters. The constrained optimization function aims to minimize the ITSE of the same systems under the constraints given in the work [77]. The simulation results demonstrated the robustness of the proposed strategy against load variations. Furthermore, the effectiveness of the FSO algorithm is validated through comparisons with other meta-heuristic algorithms, namely the PSO and Big Bang Big Crunch (BB-BC) algorithms, highlighting FSO’s superior performance.

4.5.5. Battery Thermal Management

In addition, the work in [82] proposed a hybrid approach that integrates PeOA and FSO to enhance the optimization capabilities of PeOA. The approach was applied to a battery thermal management system for electric vehicles (EVs)/hybrid EVs (HEVs). It aims to keep the maximum temperature variation of the cooling air and between battery cells within a specified range by optimizing the coolant passage spacing and thereby reducing temperature differences among the battery cells. Experimental results using a battery pack of 36 Lithium-ion (Li-ion) cells show that the proposed approach limits the maximum temperature change to 2.0 K in the cooling air among the passages and 7.5 K between the battery cells, while achieving a pressure drop of 228.03 Pa.

5. Case Study on Optimizing Routing in WSNs

Multi-hop communication in WSNs becomes essential when direct communication between a sensor node and the sink, i.e., base station, is either impractical or inefficient. This situation arises due to factors such as the distance between nodes and the sink, the limited energy resources of sensor nodes, or environmental challenges that obstruct communication. When the deployment area is extensive, or sensor nodes are dispersed across a wide geographic region, multi-hop communication enables seamless connectivity. It allows nodes to transmit data through intermediate nodes, ensuring that all nodes can communicate with the sink even if they are beyond its direct transmission range. This approach extends the effective communication radius of the network without requiring high-power transmissions from individual nodes. Additionally, multi-hop communication is particularly advantageous in scenarios with high node density, such as industrial process monitoring. In such cases, direct communication from multiple nodes to the sink can create congestion and overload the network. By distributing the communication load across multiple intermediate nodes, multi-hop communication alleviates bottlenecks and enhances network efficiency.
In a WSN, data transmission from the source to the target experiences delays due to collisions and fluctuations in node mobility. Energy consumption increases when the routing path is poorly maintained. The overall efficiency of clustering and routing is also diminished by the number of sensor nodes within a cluster. Nodes may become unreliable and malfunction in uncontrolled and hostile environments, reducing the energy available in the WSN and disrupting communication.
To establish an efficient WSN, this case study employs both distance and energy fitness functions. Furthermore, the WSN incorporates energy considerations to minimize packet loss. In this context, multi-hop routing is implemented to mitigate routing challenges.

Network Model

The assumptions employed in the development of the network model are as follows:
  • The sensors in the WSN are identical in terms of processing time and energy consumption.
  • A Euclidean distance formula is utilized for measuring the distance between sensor nodes.
  • The sensors are randomly located within the network area, and their positions remain constant once established within the system.
  • The residual energy and distance metrics of the nodes are provided to the BS, which selects CHs using a suitable CH selection technique. Additionally, the transmission route from the CHs to the BS is determined by the routing algorithm.
The transmitter and receiver energies are computed based on the distance D and packet length L:
E_{TX}(D, L) = L \times E_{dis} + L \times \epsilon_{FS} \times D^2
E_{RX}(D, L) = L \times E_{dis}
E_{TX} and E_{RX} represent the energy consumed by the transmitter and receiver, respectively, while \epsilon_{FS} denotes the free-space amplification coefficient and E_{dis} denotes the dissipated energy per bit. The energy cost of a node is a critical parameter in the design and evaluation of WSN routing algorithms. The optimization algorithm minimizes the total cost function T_C, which is influenced by the energy cost E_C, throughput (Thr), delay (T_D), and packet delivery (Pac_{del}).
T_C = \alpha_1 E_C + \alpha_2 \, Thr + \alpha_3 \, T_D + \alpha_4 \, Pac_{del}
where \alpha_i, i = 1, 2, 3, 4, are optimization weights such that \sum_i \alpha_i = 0. The optimal routes between a sensor node and its best neighbor node, as well as between the CHs and the BS, are selected to minimize T_C. The simulation settings are provided in Table 4.
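A short sketch of this energy and cost model, using the parameter values from Table 4, is given below; the three hop distances in the example are arbitrary and used only for illustration.

```python
# Energy model from the equations above, with the Table 4 parameter values.
L_BITS = 4000        # packet length L (bits)
E_DIS  = 50e-9       # dissipated energy E_dis (50 nJ/bit)
EPS_FS = 10e-12      # free-space amplification coefficient (10 pJ/bit/m^2)

def e_tx(d, l=L_BITS):
    """Transmitter energy E_TX(D, L) for one hop of distance d."""
    return l * E_DIS + l * EPS_FS * d ** 2

def e_rx(l=L_BITS):
    """Receiver energy E_RX(D, L)."""
    return l * E_DIS

def route_energy(hop_distances):
    """Energy cost of a multi-hop route: every hop pays one transmit and one receive."""
    return sum(e_tx(d) + e_rx() for d in hop_distances)

# Example: a 3-hop route with hops of 30 m, 45 m, and 20 m (illustrative values).
print(route_energy([30.0, 45.0, 20.0]), "J")
```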
This case study utilizes four nature-inspired optimization algorithms, namely, GA, PSO, ACO, and FSO, to improve multi-hop communication in WSNs by determining the optimal path for transmitting data packets from a sensor node to the sink in a densely deployed environment.
The evaluation focuses on four key performance metrics: throughput, delay, overhead, and packet delivery. Throughput measures the total data successfully delivered through the network within a specific time frame, in kbps. Delay is the time required for data packets to travel from the source node to the destination, in seconds. Overhead is the number of packets transmitted during route discovery and maintenance, in packets per round. Finally, packet delivery assesses the accuracy of data transmission, including detection by potential attackers.
To ensure the reliability of the simulation results, sensor deployment was randomized in terms of sensor positions and density, each sensor was assigned a random initial energy level, and cluster heads were selected randomly. Furthermore, to ensure statistical reliability, each round is executed multiple times. As a common rule of thumb, 20–30 runs are sufficient to achieve average convergence and variance reduction, while a higher number of runs (up to 100) may further improve accuracy at the cost of increased computational overhead. Accordingly, 25 runs per round are conducted in this study.
These performance metrics provide a comprehensive assessment of the network’s performance and of the effectiveness of the employed optimization algorithms, where the most efficient algorithm is the one that maximizes energy efficiency and packet delivery while minimizing delay and incurred overhead.
This case study aims to assess the performance of the FSO algorithm; its mapping to the routing problem is detailed below. SNs are represented as male firebugs, while neighboring nodes are represented as female firebugs. The positions of male firebugs within the network, referred to as the colony, are determined by the location of the optimal route node. Male firebugs, i.e., sensor nodes, outside the network are attracted to neighboring nodes. These male firebugs move toward the most energy-efficient neighboring node, identified as the optimal female firebug, maintaining swarm cohesion and preventing dispersion. Furthermore, there is no competition among male firebugs for the optimal female firebug.
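A small sketch of this neighbor-selection step is shown below: the current node (a male firebug) scores each candidate next hop (the female firebugs) and moves toward the fittest one. The neighbor table, the weights, and the exact scoring formula are illustrative assumptions; the case study only specifies that residual energy and distance drive the choice.

```python
# Hypothetical neighbor table: (node_id, residual_energy_J, distance_to_sink_m).
neighbors = [("n3", 0.42, 61.0), ("n7", 0.55, 74.0), ("n9", 0.30, 52.0)]

def neighbor_fitness(residual_energy, dist_to_sink, w_e=0.6, w_d=0.4):
    """Score a candidate next hop: favor high residual energy and a short
    remaining distance (the weights w_e and w_d are assumed, not taken
    from the case study)."""
    return w_e * residual_energy - w_d * (dist_to_sink / 100.0)

# The 'male firebug' (current node) moves toward the fittest neighbor ('best female').
best_hop = max(neighbors, key=lambda n: neighbor_fitness(n[1], n[2]))
print("selected next hop:", best_hop[0])
```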
The simulation results for the four algorithms are presented in Figure 7, Figure 8, Figure 9 and Figure 10. Specifically, Figure 7 and Figure 8 illustrate a comparison of the overhead and delay associated with each algorithm. It is noted that FSO requires 13 sec while GA, ACO, and PSO take 23, 21, and 16 sec, respectively. These figures demonstrate that the FSO algorithm consistently achieves lower overhead and delay compared to the other three algorithms. This superior performance can be attributed to the FSO algorithm’s use of element-wise Hadamard matrix multiplication for position updates. This approach facilitates the parallel execution of arithmetic operations, resulting in faster computation and reduced computational complexity. In contrast, the GA exhibits significantly higher overhead and delay. This is primarily due to its inherently high computational complexity, as it requires iterative processes such as selection, crossover, and mutation, which are computationally expensive and time-consuming. This comparison highlights the efficiency of the FSO algorithm in minimizing resource usage and optimizing execution time.
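The Hadamard-style update mentioned above can be written in a few vectorized lines, which is why the per-iteration cost stays low; the population size, search range, and two-dimensional encoding below are placeholders chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
x_males  = rng.uniform(0.0, 100.0, (20, 2))   # 20 candidate positions in a 2-D search space
x_best_f = rng.uniform(0.0, 100.0, 2)         # position of the best 'female' solution

# Element-wise (Hadamard) update: every coordinate of every bug moves in a single
# vectorized operation, with no per-dimension loop.
step = rng.random(x_males.shape) * (x_best_f - x_males)
x_males = x_males + step
```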
Figure 9 provides a comparison of the throughput achieved by the four algorithms. The results clearly indicate that the FSO algorithm outperforms the others, highlighting its superior search efficiency. Quantitatively, FSO, GA, PSO, and ACO achieve throughputs of 65%, 52%, 62%, and 58%, respectively, after completing 100 rounds. The figure shows that the ACO algorithm achieves comparatively lower throughput. This outcome can be explained by the inherently stochastic nature of ACO. The reliance on randomness in its solution construction process often leads to inconsistent results, as the algorithm may produce varying solutions for the same problem instance across different runs. This inconsistency undermines the reliability of ACO, especially in scenarios where throughput optimization is critical. Overall, the figure underscores the robustness and reliability of the FSO algorithm in delivering higher throughput, particularly in challenging optimization environments.
Figure 10 illustrates the delivery ratios of the four algorithms. The results reveal that the GA outperforms the other three algorithms. This superior performance can be attributed to GA’s ability to adapt effectively to changing network conditions, including varying traffic patterns, node failures, and security threats. Such adaptability ensures that GA maintains robust and reliable security mechanisms in dynamic and unpredictable WSN environments. In contrast, the FSO algorithm faces challenges in responding to highly dynamic scenarios, such as rapidly evolving attack vectors or shifting network topologies. Its limited adaptability in these situations may necessitate the integration of additional mechanisms to improve its responsiveness and flexibility.
In conclusion, this case study demonstrates the superior performance of the FSO algorithm compared to GA, PSO, and ACO in terms of overhead, delay, and throughput. However, it also highlights a limitation of the FSO algorithm in ensuring secure packet delivery compared to the other algorithms. This limitation can be mitigated by integrating the FSO algorithm with other nature-inspired approaches, as shown in [68], where FSO was combined with MBCO to enhance WSN security while minimizing energy consumption, as previously discussed in Section 4.2.

6. Open Issues and Future Research Directions

A new emerging research trend involves the integration of ML, CNN, and deep learning (DL) with heuristic algorithms for real-world applications, such as structural health monitoring from various perspectives in civil and mechanical engineering [83,84], as well as in photodynamic therapy [85]. Consequently, the forthcoming research initiative seeks to integrate FSO algorithms with a variety of AI techniques, including ML, DL, and CNN along with their variants. This integration will leverage the ability of AI techniques to manage complex data patterns and extract significant features, combined with the rapid local search capabilities of FSO. The objective is to develop an advanced optimization technique with enhanced capabilities for addressing complex challenges, such as operating in noisy environmental conditions. A relevant real-world application of this research direction includes the detection of brain tumors, breast cancer, and kidney stones in low-resolution medical imaging, as well as the forensic identification of faint fingerprints.
Another promising research direction involves integrating various nature-inspired algorithms with the FSO algorithm to develop models with enhanced capabilities. For instance, integrating DHO [37] with the FSO algorithm holds potential for significantly enhancing spectrum allocation, channel assembly, and spectrum sharing in Cognitive Radio Networks (CRNs). Moreover, instead of extracting a statistical model of primary user existence over a specific region as in [86], the FSO algorithm or IFSO can be used to extract features that characterize the primary users’ behavior over that region. The work in [87] employed various ML techniques to analyze and detect the existence of Primary Users (PUs) and Secondary Users (SUs) and to determine which frequency band is occupied by a PU or SU. Integrating ML techniques with the FSO algorithm promises to improve the detection and classification of such users, marking a promising research direction, and would also significantly improve the performance of ML-based trust models [88]. Furthermore, combining FSO with DL techniques could lead to substantial improvements in detection and data security for spectrum sensing databases and authentication in CRN systems.
An additional open issue and potential avenue for future research is the joint optimization of spectrum and power in CRNs to improve their QoS. The work in [89] proposed an integration of the PSO algorithm with the DE algorithm to address this challenge. While this integrated approach effectively maximizes CRN throughput and improves spectrum efficiency, it comes with a high computational complexity. To mitigate this, the FSO algorithm could be a promising alternative, offering comparable performance with a lower computational burden.
Another unresolved challenge is the limited processing power of IoT devices, which restricts their functionality and often leads to offloading operations to cloud services. While this approach helps prevent overloading the devices with excessive data, it still encounters issues such as high latency, increased network traffic, and higher energy consumption. To address these challenges, fog computing has been introduced in IoT environments to accelerate time-sensitive data processing and management. In [90], an enhanced MFO algorithm combined with OBL was employed to address task and resource allocation challenges in the IoT environment. However, the MFO algorithm is prone to premature convergence and can lead to increased computational complexity. The use of the FSO algorithm for this task could significantly reduce the computational burden and help avoid premature convergence.

7. Conclusions

This survey has provided an overview of the FSO algorithm as a nature-inspired heuristic optimization technique. We have thoroughly examined its operational principles, explored various hybrid variants, and reviewed its diverse real-world applications. Additionally, this survey has offered comprehensive insights into the historical development of nature-inspired optimization algorithms, contextualizing the FSO algorithm within the broader spectrum of such techniques. Moreover, the survey shows that FSO is an efficient global search optimizer with a rapid convergence rate for multimodal optimization compared to existing methods. However, it suffers from an imbalance between exploration and exploitation, which can lead to entrapment in local optima and premature convergence in high-dimensional multimodal optimization problems with limited time resources. This issue can be addressed by integrating FSO with nature-inspired algorithms, fuzzy logic, ML techniques, or NNs and their variants. Furthermore, the survey has included a case study to evaluate the performance of the FSO algorithm against other optimization algorithms. Finally, the survey has identified several promising avenues for future research, with the goal of further enhancing the FSO algorithm and broadening its applicability to complex optimization challenges, such as the detection of tumors in low-resolution medical imaging.

Author Contributions

Methodology, F.A. and Y.A.; software, F.A.; validation, Y.A. and E.A.-R.; resources, E.A.-R. and F.A.; writing—original draft, F.A.; writing—review and editing, Y.A. and F.A. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

The authors declare that data sharing is not applicable to this article as no new data was created or analyzed in this article.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
3DOF: Three Degrees of Freedom
ACO: Ant Colony Optimization
ARO: Artificial Rabbit Optimization
AVR: Automated Voltage Regulator
BAC: Bee Artificial Colony
BCO: Bee Colony Optimization
BO: Bat Optimization
BSO: Beetle Swarm Optimization
CH: Cluster Head
CNN: Convolutional Neural Network
CSA: Crow Search Algorithm
CuSA: Cuckoo Search Algorithm
DE: Differential Evolution
DO: Dragonfly Optimization
DHO: Deer Hunting Optimization
EnFSO: Enhanced Firebug Swarm Optimization
FA: Firefly Algorithm
FOPID: Fractional Order PID
FOMBC: Firebug Optimized Modified Bee Colony
FSO: Firebug Swarm Optimization
F2SO: Fuzzy FSO
FSO-LSTM: Firebug Swarm Optimization-Long Short Term Memory
GA: Genetic Algorithm
GMPP: Global Maximum Peak Point
GO: Grasshopper Optimization
GWO: Grey Wolf Optimization
HaHO: Harris Hawks Optimization
HBA: Honey Badger Algorithm
HEV: Hybrid Electric Vehicle
HFTO: Hybrid Firebug Tunicate Optimizer
HRES: Hybrid Renewable Energy Source
IFSO: Improved Firebug Swarm Optimization
IoT: Internet of Things
KHO: Krill Herd Optimization
MBO: Migrating Bird Optimization
MFO: Moth Flame Optimization
mFSO: Modified Firebug Swarm Optimization
MPP: Maximum Power Point
MPPT: Maximum Power Point Tracking
MSVM: Multiple Support Vector Machine
NN: Neural Network
PeOA: Pelican Optimization Algorithm
PO: Puma Optimizer
POA: Pufferfish Optimization Algorithm
PID: Proportional–Integral–Derivative
PSO: Particle Swarm Optimization
PU: Primary User
PV: Photovoltaic
QoS: Quality of Service
RFO: Red Fox Optimization
RKO: Red Kite Optimization
RNN: Recurrent Neural Network
SHO: Spotted Hyena Optimizer
SMA: Slime Mold Algorithm
SN: Sensor Node
SO: Seagull Optimization
SpSA: Sparrow Search Algorithm
SSA: Salp Swarm Algorithm
SU: Secondary User
THD: Total Harmonic Distortion
VUF: Voltage Unbalance Factor
WaO: Walrus Optimizer
WO: Whale Optimization

References

  1. Bellman, R. Mathematical Optimization Techniques; University of California Press: Berkeley, CA, USA, 1963. [Google Scholar]
  2. Yang, X.S. Metaheuristic optimization: Algorithm analysis and open problems. In Proceedings of the International Symposium on Experimental Algorithms; Springer: Berlin/Heidelberg, Germany, 2011; pp. 21–32. [Google Scholar]
  3. Eberhart, R.; Kennedy, J. Particle Swarm Optimization. In Proceedings of the IEEE International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; pp. 1942–1948. [Google Scholar]
  4. Colorni, A.; Dorigo, M.; Maniezzo, V. Distributed optimization by ant colonies. In Proceedings of the First European Conference on Artificial Life, Paris, France, 11–13 December 1991; Volume 142, pp. 134–142. [Google Scholar]
  5. Karaboga, D. An Idea Based on Honey Bee Swarm for Numerical Optimization; Technical Report TR06; Erciyes University, Engineering Faculty, Computer Engineering Department: Kayseri, Turkey, 2005. [Google Scholar]
  6. Cheng, R.; He, C.; Jin, Y.; Yao, X. Model-based evolutionary algorithms: A short survey. Complex Intell. Syst. 2018, 4, 283–292. [Google Scholar] [CrossRef]
  7. Ab Wahab, M.N.; Nefti-Meziani, S.; Atyabi, A. A comprehensive review of swarm optimization algorithms. PLoS ONE 2015, 10, e0122827. [Google Scholar]
  8. Vahidi, B.; Foroughi Nematolahi, A. Physical and physic-chemical based optimization methods: A review. J. Soft Comput. Civ. Eng. 2019, 3, 12–27. [Google Scholar]
  9. Rai, R.; Das, A.; Ray, S.; Dhal, K.G. Human-inspired optimization algorithms: Theoretical foundations, algorithms, open-research issues and application for multi-level thresholding. Arch. Comput. Methods Eng. 2022, 29, 5313–5352. [Google Scholar]
  10. Holland, J.J.; Domingo, E.; de la Torre, J.C.; Steinhauer, D.A. Mutation frequencies at defined single codon sites in vesicular stomatitis virus and poliovirus can be increased only slightly by chemical mutagenesis. J. Virol. 1990, 64, 3960–3962. [Google Scholar] [CrossRef]
  11. Shi, Y.; Eberhart, R.; Chen, Y. Implementation of evolutionary fuzzy systems. IEEE Trans. Fuzzy Syst. 1999, 7, 109–119. [Google Scholar] [CrossRef]
  12. Abasi, A.K.; Aloqaily, M.; Guizani, M.; Ouni, B. Metaheuristic algorithms for 6G wireless communications: Recent advances and applications. Ad Hoc Networks 2024, 149, 103474. [Google Scholar] [CrossRef]
  13. Mirjalili, S. The ant lion optimizer. Adv. Eng. Softw. 2015, 83, 80–98. [Google Scholar] [CrossRef]
  14. Braik, M.S. Chameleon Swarm Algorithm: A bio-inspired optimizer for solving engineering design problems. Expert Syst. Appl. 2021, 174, 114685. [Google Scholar] [CrossRef]
  15. Rajakumar, B. The Lion’s Algorithm: A new nature-inspired search algorithm. Procedia Technol. 2012, 6, 126–135. [Google Scholar] [CrossRef]
  16. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  17. Taherdangkoo, M.; Shirzadi, M.H.; Bagheri, M.H. A novel meta-heuristic algorithm for numerical function optimization: Blind, naked mole-rats (BNMR) algorithm. Sci. Res. Essays 2012, 7, 3566–3583. [Google Scholar]
  18. Abdollahzadeh, B.; Khodadadi, N.; Barshandeh, S.; Trojovskỳ, P.; Gharehchopogh, F.S.; El-kenawy, E.S.M.; Abualigah, L.; Mirjalili, S. Puma optimizer (PO): A novel metaheuristic optimization algorithm and its application in machine learning. Clust. Comput. 2024, 27, 1–49. [Google Scholar] [CrossRef]
  19. Saremi, S.; Mirjalili, S.; Lewis, A. Grasshopper optimisation algorithm: Theory and application. Adv. Eng. Softw. 2017, 105, 30–47. [Google Scholar] [CrossRef]
  20. Wang, T.; Yang, L. Beetle swarm optimization algorithm: Theory and application. arXiv 2018, arXiv:1808.00206. [Google Scholar] [CrossRef]
  21. Raeisi Gahrouei, J.; Beheshti, Z. The Electricity Consumption Prediction using Hybrid Red Kite Optimization Algorithm with Multi-Layer Perceptron Neural Network. J. Intell. Proced. Electr. Technol. 2022, 15, 19–40. [Google Scholar]
  22. Yang, X.S.; Deb, S. Cuckoo search via Lévy flights. In Proceedings of the 2009 World Congress on Nature & Biologically Inspired Computing (NaBIC), Coimbatore, India, 9–11 December 2009; pp. 210–214. [Google Scholar]
  23. Mirjalili, S. Dragonfly algorithm: A new meta-heuristic optimization technique for solving single-objective, discrete, and multi-objective problems. Neural Comput. Appl. 2016, 27, 1053–1073. [Google Scholar] [CrossRef]
  24. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12. [Google Scholar] [CrossRef]
  25. Askarzadeh, A. Bird mating optimizer: An optimization algorithm inspired by bird mating strategies. Commun. Nonlinear Sci. Numer. Simul. 2014, 19, 1213–1228. [Google Scholar] [CrossRef]
  26. Khishe, M.; Mosavi, M.R. Chimp optimization algorithm. Expert Syst. Appl. 2020, 149, 113338. [Google Scholar] [CrossRef]
  27. Noel, M.M.; Muthiah-Nakarajan, V.; Amali, G.B.; Trivedi, A.S. A new biologically inspired global optimization algorithm based on firebug reproductive swarming behaviour. Expert Syst. Appl. 2021, 183, 115408. [Google Scholar] [CrossRef]
  28. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  29. Al-Baik, O.; Alomari, S.; Alssayed, O.; Gochhait, S.; Leonova, I.; Dutta, U.; Malik, O.P.; Montazeri, Z.; Dehghani, M. Pufferfish Optimization Algorithm: A New Bio-Inspired Metaheuristic Algorithm for Solving Optimization Problems. Biomimetics 2024, 9, 65. [Google Scholar] [CrossRef]
  30. Wang, L.; Cao, Q.; Zhang, Z.; Mirjalili, S.; Zhao, W. Artificial rabbits optimization: A new bio-inspired meta-heuristic algorithm for solving engineering optimization problems. Eng. Appl. Artif. Intell. 2022, 114, 105082. [Google Scholar] [CrossRef]
  31. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Gazelle Optimization Algorithm: A novel nature-inspired metaheuristic optimizer. Neural Comput. Appl. 2023, 35, 4099–4131. [Google Scholar] [CrossRef]
  32. Yang, X.S. A new metaheuristic bat-inspired algorithm. In Nature Inspired Cooperative Strategies for Optimization (NICSO 2010); Springer: Berlin/Heidelberg, Germany, 2010; pp. 65–74. [Google Scholar]
  33. Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Bristol, UK, 2010. [Google Scholar]
  34. Duman, E.; Uysal, M.; Alkaya, A.F. Migrating birds optimization: A new meta-heuristic approach and its application to the quadratic assignment problem. In Proceedings of the Applications of Evolutionary Computation: EvoApplications 2011: EvoCOMPLEX, EvoGAMES, EvoIASP, EvoINTELLIGENCE, EvoNUM, and EvoSTOC, Torino, Italy, 27–29 April 2011; Springer: Berlin/Heidelberg, Germany, 2011; pp. 254–263. [Google Scholar]
  35. Gandomi, A.H.; Alavi, A.H. Krill herd: A new bio-inspired optimization algorithm. Commun. Nonlinear Sci. Numer. Simul. 2012, 17, 4831–4845. [Google Scholar] [CrossRef]
  36. Mirjalili, S. Moth-flame optimization algorithm: A novel nature-inspired heuristic paradigm. Knowl.-Based Syst. 2015, 89, 228–249. [Google Scholar] [CrossRef]
  37. Brammya, G.; Praveena, S.; Ninu Preetha, N.; Ramya, R.; Rajakumar, B.; Binu, D. Deer hunting optimization algorithm: A new nature-inspired meta-heuristic paradigm. Comput. J. 2019, bxy133. [Google Scholar] [CrossRef]
  38. Heidari, A.A.; Mirjalili, S.; Faris, H.; Aljarah, I.; Mafarja, M.; Chen, H. Harris hawks optimization: Algorithm and applications. Future Gener. Comput. Syst. 2019, 97, 849–872. [Google Scholar] [CrossRef]
  39. Dhiman, G.; Kumar, V. Seagull optimization algorithm: Theory and its applications for large-scale industrial engineering problems. Knowl. Based Syst. 2019, 165, 169–196. [Google Scholar] [CrossRef]
  40. Hayyolalam, V.; Kazem, A.A.P. Black widow optimization algorithm: A novel meta-heuristic approach for solving engineering optimization problems. Eng. Appl. Artif. Intell. 2020, 87, 103249. [Google Scholar] [CrossRef]
  41. Mohammadi-Balani, A.; Nayeri, M.D.; Azar, A.; Taghizadeh-Yazdi, M. Golden eagle optimizer: A nature-inspired metaheuristic algorithm. Comput. Ind. Eng. 2021, 152, 107050. [Google Scholar] [CrossRef]
  42. Abdollahzadeh, B.; Soleimanian Gharehchopogh, F.; Mirjalili, S. Artificial gorilla troops optimizer: A new nature-inspired metaheuristic algorithm for global optimization problems. Int. J. Intell. Syst. 2021, 36, 5887–5958. [Google Scholar] [CrossRef]
  43. Połap, D.; Woźniak, M. Red fox optimization algorithm. Expert Syst. Appl. 2021, 166, 114107. [Google Scholar] [CrossRef]
  44. Abualigah, L.; Yousri, D.; Abd Elaziz, M.; Ewees, A.A.; Al-Qaness, M.A.; Gandomi, A.H. Aquila optimizer: A novel meta-heuristic optimization algorithm. Comput. Ind. Eng. 2021, 157, 107250. [Google Scholar] [CrossRef]
  45. Braik, M.; Hammouri, A.; Atwan, J.; Al-Betar, M.A.; Awadallah, M.A. White Shark Optimizer: A novel bio-inspired meta-heuristic algorithm for global optimization problems. Knowl. Based Syst. 2022, 243, 108457. [Google Scholar] [CrossRef]
  46. Agushaka, J.O.; Ezugwu, A.E.; Abualigah, L. Dwarf mongoose optimization algorithm. Comput. Methods Appl. Mech. Eng. 2022, 391, 114570. [Google Scholar] [CrossRef]
  47. Abdel-Basset, M.; Mohamed, R.; Jameel, M.; Abouhawwash, M. Nutcracker optimizer: A novel nature-inspired metaheuristic algorithm for global optimization and engineering design problems. Knowl. Based Syst. 2023, 262, 110248. [Google Scholar] [CrossRef]
  48. Trojovskỳ, P.; Dehghani, M. A new bio-inspired metaheuristic algorithm for solving optimization problems based on walruses behavior. Sci. Rep. 2023, 13, 8775. [Google Scholar] [CrossRef]
  49. Abd El-Sattar, H.; Kamel, S.; Hashim, F.A.; Sabbeh, S.F. Optihybrid: A modified firebug swarm optimization algorithm for optimal sizing of hybrid renewable power system. Neural Comput. Appl. 2024, 36, 1–27. [Google Scholar] [CrossRef]
  50. Parvathy, J.; Patil, P.G. Fingerprint Recognition Model Using Improved Firebug Swarm Optimization and tanh-Based Fuzzy Activated Neural Network. SN Comput. Sci. 2024, 5, 575. [Google Scholar] [CrossRef]
  51. Suresh, K.; Sreeja Mole, S.; Joseph Selva Kumar, A. F2SO: An energy efficient cluster based routing protocol using fuzzy firebug swarm optimization algorithm in WSN. Comput. J. 2023, 66, 1126–1138. [Google Scholar] [CrossRef]
  52. Zhang, Y.; Wang, S.; Ji, G. A comprehensive survey on particle swarm optimization algorithm and its applications. Math. Probl. Eng. 2015, 2015, 931256. [Google Scholar] [CrossRef]
  53. Bai, Q. Analysis of particle swarm optimization algorithm. Comput. Inf. Sci. 2010, 3, 180. [Google Scholar] [CrossRef]
  54. Shami, T.M.; El-Saleh, A.A.; Alswaitti, M.; Al-Tashi, Q.; Summakieh, M.A.; Mirjalili, S. Particle swarm optimization: A comprehensive survey. IEEE Access 2022, 10, 10031–10061. [Google Scholar] [CrossRef]
  55. Gad, A.G. Particle swarm optimization algorithm and its applications: A systematic review. Arch. Comput. Methods Eng. 2022, 29, 2531–2561. [Google Scholar] [CrossRef]
  56. Rezvanian, A.; Vahidipour, S.M.; Sadollah, A. An overview of ant colony optimization algorithms for dynamic optimization problems. In Optimization Algorithms—Classics and Recent Advances; IntechOpen: London, UK, 2023; pp. 1–19. [Google Scholar]
  57. Wong, W.; Ming, C.I. A review on metaheuristic algorithms: Recent trends, benchmarking and applications. In Proceedings of the 2019 7th International Conference on Smart Computing & Communications (ICSCC), Miri, Sarawak, Malaysia, 28–30 June 2019; pp. 1–5. [Google Scholar]
  58. Rajwar, K.; Deep, K.; Das, S. An exhaustive review of the metaheuristic algorithms for search and optimization: Taxonomy, applications, and open challenges. Artif. Intell. Rev. 2023, 56, 13187–13257. [Google Scholar] [CrossRef]
  59. Al-Tashi, Q.; Md Rais, H.; Abdulkadir, S.J.; Mirjalili, S.; Alhussian, H. A review of grey wolf optimizer-based feature selection methods for classification. In Evolutionary Machine Learning Techniques: Algorithms and Applications; Springer: Berlin/Heidelberg, Germany, 2000; pp. 273–286. [Google Scholar]
  60. Faris, H.; Aljarah, I.; Al-Betar, M.A.; Mirjalili, S. Grey wolf optimizer: A review of recent variants and applications. Neural Comput. Appl. 2018, 30, 413–435. [Google Scholar] [CrossRef]
  61. Sharma, I.; Kumar, V.; Sharma, S. A comprehensive survey on grey wolf optimization. Recent Adv. Comput. Sci. Commun. Formerly Recent Patents Comput. Sci. 2022, 15, 323–333. [Google Scholar]
  62. Wang, Y.; Huang, L.; Zhong, J.; Hu, G. LARO: Opposition-based learning boosted artificial rabbits-inspired optimization algorithm with Lévy flight. Symmetry 2022, 14, 2282. [Google Scholar] [CrossRef]
  63. Turgut, O.E.; Turgut, M.S.; Kırtepe, E. A systematic review of the emerging metaheuristic algorithms on solving complex optimization problems. Neural Comput. Appl. 2023, 35, 14275–14378. [Google Scholar] [CrossRef]
  64. Alharbi, L.A. Artificial rabbits Optimizer with machine learning based emergency department monitoring and medical data classification at KSA hospitals. IEEE Access 2023, 11, 59133–59141. [Google Scholar] [CrossRef]
  65. Senthil Kumar, R.; Subash kumar, C.; Lakshmanan, M.; Rajamani, M. Resources using multi-input DC-DC converter topology for optimal utilization of a novel step-up interconnected enhanced technique renewable energy. Analog. Integr. Circuits Signal Process. 2025, 122, 42. [Google Scholar] [CrossRef]
  66. Trojovskỳ, P.; Dehghani, M. Pelican optimization algorithm: A novel nature-inspired algorithm for engineering applications. Sensors 2022, 22, 855. [Google Scholar] [CrossRef] [PubMed]
  67. Nanjappan, M.; Natesan, G.; Krishnadoss, P. HFTO: Hybrid Firebug Tunicate Optimizer for Fault Tolerance and Dynamic Task Scheduling in Cloud Computing. Wirel. Pers. Commun. 2023, 129, 323–344. [Google Scholar] [CrossRef]
  68. Alamelumangai, M.; Suresh, S. Firebug Optimized Modified Bee Colony Algorithm for Trusted WSN Routing. IETE J. Res. 2023, 70, 4903–4916. [Google Scholar] [CrossRef]
  69. Karthik, E.; Sethukarasi, T. Sarcastic user behavior classification and prediction from social media data using firebug swarm optimization-based long short-term memory. J. Supercomput. 2022, 78, 5333–5357. [Google Scholar] [CrossRef]
  70. Haripriya, V.; Patil, P.G. An Ensemble Framework with Optimal Features for Sarcasm Detection in Social Media Data. Int. J. Intell. Syst. Appl. Eng. 2024, 12, 748–760. [Google Scholar]
  71. Anand, K.; Vijayaraj, A.; Vijay Anand, M. Privacy preserving framework using Gaussian mutation based firebug optimization in cloud computing. J. Supercomput. 2022, 78, 9414–9437. [Google Scholar] [CrossRef]
  72. Gandikoti, C.; Jha, S.K.; Jha, B.M.; Mishra, P. Distributed Voltage Unbalance Mitigation in Islanded Microgrid using Moth Flame Optimization and Firebug Swarm Optimization. Int. J. Power Electron. Drive Syst. 2024, 15, 824–834. [Google Scholar] [CrossRef]
  73. GayathriMonicka, S.; Manimegalai, D.; Karthikeyan, M. FSO based MPPT Algorithm for Maximizing Power Output in PV System under Partial Shading Conditions. In Proceedings of the 2023 5th International Conference on Smart Systems and Inventive Technology (ICSSIT), Tirunelveli, Tamil Nadu, India, 23–25 January 2023; pp. 73–79. [Google Scholar]
  74. Subarnan, G.M.; Damodaran, M.; Madhu, K.; Rethinam, G. Optimization of Image Processing Based MPPT Algorithm Using FSO Algorithm. Electr. Power Components Syst. 2024, 52, 364–380. [Google Scholar] [CrossRef]
  75. Mallappa, P.K.B.; Martínez-García, H.; Velasco-Quesada, G. Implementation of Grid-Connected Wind Energy during Fault Analysis Using Moth Flame Optimization with Firebug Swarm Optimization. Renew. Energy Power Qual. J. 2023, 21, 4. [Google Scholar] [CrossRef]
  76. Stallon, S.R.D.; Anand, R.; Kannan, R.; Rajasekaran, S. Optimal detection and classification of grid connected system using MSVM-FSO technique. Environ. Sci. Pollut. Res. 2024, 31, 31064–31080. [Google Scholar] [CrossRef]
  77. Chakraborty, S.; Mondal, A.; Das, C. Fuzzy Fractional Order PID Controller Design for AVR System. In Proceedings of the 2024 IEEE 3rd International Conference on Control, Instrumentation, Energy & Communication (CIEC), Kolkata, India, 25–27 January 2024; pp. 31–36. [Google Scholar]
  78. Zamani, M.; Karimi-Ghartemani, M.; Sadati, N.; Parniani, M. Design of a fractional order PID controller for an AVR using particle swarm optimization. Control Eng. Pract. 2009, 17, 1380–1387. [Google Scholar] [CrossRef]
  79. Ekinci, S.; Hekimoğlu, B. Improved kidney-inspired algorithm approach for tuning of PID controller in AVR system. IEEE Access 2019, 7, 39935–39947. [Google Scholar] [CrossRef]
  80. Chakraborty, S.; Mondal, A.; Biswas, S.; Roy, P.K. Design of FUZZY-3DOF-PID controller for an Ocean Thermal hybrid Automatic Generation Control system. Sci. Iran. 2023, 30, e023423. [Google Scholar] [CrossRef]
  81. Chakraborty, S.; Mondal, A.; Biswas, S. Application of FUZZY-3DOF-PID controller for controlling FOPTD type communication delay based renewable three-area deregulated hybrid power system. Evol. Intell. 2024, 17, 2821–2841. [Google Scholar] [CrossRef]
  82. Justin Raj, P.; Vasan Prabhu, V.; Krishna Kumar, V. Battery Thermal Management System for Electric Vehicle (EV)/Hybrid EV (HEV) with the Incorporation of POA-FSO Strategy. J. Circuits, Syst. Comput. 2024, 33, 2450199. [Google Scholar] [CrossRef]
  83. Khatir, A.; Capozucca, R.; Khatir, S.; Magagnini, E.; Le Thanh, C.; Riahi, M.K. Advancements and emerging trends in integrating machine learning and deep learning for SHM in mechanical and civil engineering: A comprehensive review. J. Braz. Soc. Mech. Sci. Eng. 2025, 47, 1–34. [Google Scholar] [CrossRef]
  84. Mansouri, A.; Tiachacht, S.; Ait-Aider, H.; Khatir, S.; Khatir, A.; Cuong-Le, T. A novel Optimization-Based Damage Detection in Beam Systems Using Advanced Algorithms for Joint-Induced Structural Vibrations. J. Vib. Eng. Technol. 2025, 13, 1–30. [Google Scholar] [CrossRef]
  85. El-Sadek, M.Z.; El-Aziz, M.K.A.; Shaaban, A.H.; Mostafa, S.A.; Wadan, A.H.S. Advancements and emerging trends in photodynamic therapy: Innovations in cancer treatment and beyond. Photochem. Photobiol. Sci. 2025, 24, 1489–1511. [Google Scholar] [CrossRef] [PubMed]
  86. Corral-De-Witt, D.; Ahmed, S.; Awin, F.; Rojo-Álvarez, J.L.; Tepe, K. An Accurate Probabilistic Model for TVWS Identification. Appl. Sci. 2019, 9, 4232. [Google Scholar] [CrossRef]
  87. Mohammad, A.; Awin, F.; Abdel-Raheem, E. Case study of TV spectrum sensing model based on machine learning techniques. Ain Shams Eng. J. 2022, 13, 101540. [Google Scholar] [CrossRef]
  88. Eziama, E.; Ahmed, S.; Ahmed, S.; Awin, F.; Tepe, K. Detection of adversary nodes in machine-to-machine communication using machine learning based trust model. In Proceedings of the 2019 IEEE international symposium on signal processing and information technology (ISSPIT), Ajman, United Arab Emirates, 10–12 December 2019; pp. 1–6. [Google Scholar]
  89. Wang, H.; Yu, X.; Chen, Q.; Wang, F. Research on Optimal Allocation Method of Energy Storage Devices for Coordinated Wind and Solar Power Generation. In Proceedings of the 2023 6th International Conference on Energy, Electrical and Power Engineering (CEEPE), Shenzhen, China, 15–17 September 2023; pp. 1073–1078. [Google Scholar]
  90. Nematollahi, M.; Ghaffari, A.; Mirzaei, A. Task and resource allocation in the internet of things based on an improved version of the moth-flame optimization algorithm. Clust. Comput. 2024, 27, 1775–1797. [Google Scholar] [CrossRef]
Figure 1. Classifications of optimization algorithms.
Figure 2. Timeline of advancement of almost all swarm intelligence algorithms.
Figure 3. Inspiration sources of almost all swarm-based algorithms.
Figure 4. FSO optimization phases versus firebug swarm behaviors.
Figure 5. Illustrative flowchart of the FSO algorithm.
Figure 6. FSO practical real-world applications.
Figure 7. Comparison of the algorithms’ overhead.
Figure 8. Comparison of the algorithms’ delay.
Figure 9. Comparison of the algorithms’ throughput.
Figure 10. Comparison of the algorithms’ packet delivery.
Table 1. Nature-inspired algorithms, their inspiration source, algorithm phases, and applications.

| No. | Algorithm | Source of Inspiration | Authors, Year | Algorithm Phases | Applications |
|---|---|---|---|---|---|
| 1 | PSO | Collective behavior of swarms of birds or fish | Kennedy et al., 1995 [3] | Initialization; evaluation; update personal best; update global best; update velocity and position; termination | WSNs; bioinformatics; manufacturing; medical diagnosis; power systems; energy storage |
| 2 | FA | Flashing behavior of fireflies | Yang, 2008 [33] | Initialization; evaluation; attraction; intensity; randomization; solution | Image compression; feature selection; antenna design; load dispatch; classifications; clustering |
| 3 | MBO | V flight formation of the migrating birds to save energy | Duman et al., 2011 [34] | V formation; evaluation; selecting the best leader | Quadratic assignment |
| 4 | KHO | Herding behavior of krill individuals | Gandomi and Alavi, 2012 [35] | Initialization; krill movement; krill selection; krill elimination | Data clustering; image segmentation; WSNs; power optimization |
| 5 | MFO | Transverse orientation | Mirjalili, 2015 [36] | Moth movement; update | Feature selection; image processing; renewable energy |
| 6 | DHO | Hunting behavior of humans towards a deer | Brammya et al., 2019 [37] | Population initialization; parametric initialization; position propagation; termination | Spectrum sensing; feature selection; logistics; image processing |
| 7 | HaHO | Cooperative behaviors of the Harris hawks in hunting escaping preys | Heidari et al., 2019 [38] | Initialization; exploration; exploitation; local search | Power optimization; engineering design; medical diagnosis; finance |
| 8 | SO | Migration and hunting behavior of seagulls | Dhiman et al., 2019 [39] | Initialization; search behavior; update behavior; termination | Image processing; network optimization; neural networks; financial forecasting |
| 9 | BWO | The bizarre mating behavior of black widow spiders | Hayyolalam, 2020 [40] | Initialization; evaluation; procreating; cannibalism; mutation; updating | Engineering design; feature selection; cyber security; medical diagnosis; image processing |
| 10 | GEO | Intelligence of golden eagles in tuning speed at different stages of their spiral trajectory for hunting | Balani et al., 2021 [41] | Initialization; soaring behavior; hunting behavior; update behavior; termination | Engineering design; economic dispatch; medical diagnosis; feature selection; image processing |
| 11 | AGTO | Gorilla troops’ social intelligence | Abdollahzadeh et al., 2021 [42] | Initialization; movement; communication; memory update; selection | Engineering design; function optimization; feature selection; image processing; renewable energy |
| 12 | FSO | Reproductive swarming behavior of firebugs (Pyrrhocoris apterus) | Noel et al., 2021 [27] | Roaming and exploring; forming aggregations; movement; reproduction | Privacy and security; WSN routing; fault analysis; engineering design; sentiment analysis |
| 13 | RFO | Red fox live and hunting behavior | Połap et al., 2021 [43] | Initialization; global search; local search; reproduction and leaving the herd | Complex optimization; COVID-19 diagnosis; path planning; image segmentation; cancer detection |
| 14 | AO | Aquila’s natural behaviors while capturing prey | Abualigah et al., 2021 [44] | Selecting search space; narrowed exploration; expanded exploration; narrowed exploitation | Feature selection; engineering design; global optimization; identification of control scheduling |
| 15 | WSO | Scholastic behaviors of white sharks | Braik et al., 2022 [45] | Initialization; tracking the prey; search for prey; movement | Power optimization; engineering design; feature selection; image processing; robotics |
| 16 | DMO | Compensatory behavioral adaptation of the dwarf mongoose | Agushaka et al., 2022 [46] | Initialization; grouping; evaluation; updating | Engineering design; global optimization; data clustering; identification |
| 17 | NOA | Behavior of Clark’s nutcrackers | Abdel-Basset et al., 2023 [47] | Search seeds; storage in cache; recovery behaviors | Engineering design; global optimization; resource allocation; blockchain |
| 18 | WaO | Behaviors of walruses that choose to migrate, breed, roost, feed, gather, and escape by receiving key signals | Trojovský et al., 2023 [48] | Initialization; signaling; migration; reproduction | Engineering design; precise modeling; load dispatching problem; renewable energy |
| 19 | POA | Defense mechanism of pufferfish against predators | Al-Baik et al., 2024 [29] | Initialization; evaluation; predator attack; defense; update | Engineering design; optimization problems |
Table 2. Comparison between some nature-inspired algorithms in terms of advantages, disadvantages, and limitations.

1. FSO
  • Advantages: I. Effectively explores the solution search space and can avoid local optima. II. Simple and easy to implement. III. Adapts to different types of optimization problems. IV. Fast and robust in complex and multimodal optimization problems. V. Versatile for real-world applications. VI. Resilient in noisy environments.
  • Disadvantages: I. FSO is not designed for unimodal optimization problems. II. Imbalanced exploration–exploitation trade-off. III. Cannot always sustain population diversity.
  • Limitations: I. Limitations in exploration [50]. II. In some instances, it might cause an imbalanced exploration–exploitation trade-off, leading to being trapped in local optima [49]. III. FSO has a constant exploration–exploitation ratio, which leads to improper search behavior [51].
2. PSO
  • Advantages: I. Simple: relatively easy to understand and implement compared with other algorithms. II. Has fewer parameters to tune compared to GA. III. Converges quickly to a solution, especially in low-dimensional spaces. IV. Efficient for global search.
  • Disadvantages: I. Premature convergence. II. Sensitivity to parameters. III. Struggles in high-dimensional spaces. IV. Performs poorly in environments with noisy or highly variable objective functions.
  • Limitations: I. PSO can suffer from a lack of diversity in the swarm [52]. II. Subject to stagnation in exploring the search space [53]. III. Dependent on initialization [54]. IV. Balancing exploration and exploitation is challenging [55].
3. ACO
  • Advantages: I. Effectively explores large search spaces (global search ability). II. Suitable for dynamic optimization problems (adaptability). III. Versatile; can be applied to a wide range of problems. IV. Parallel processing.
  • Disadvantages: I. The performance is heavily dependent on the choice of parameters (parameter sensitivity). II. Slow convergence, especially in determining a precise solution. III. ACO can suffer from stagnation, limiting exploration. IV. Computationally intensive, especially in large-class problems.
  • Limitations: I. Premature convergence [56]. II. ACO might struggle with very high-dimensional problems. III. Initialization significantly impacts the algorithm’s performance [57]. IV. Mis-updating might result in saturation, where ants follow the same paths [58].
4. GWO
  • Advantages: I. Easy to understand and implement. II. Balances exploration and exploitation. III. Requires relatively few tuning parameters. IV. Designed to avoid local optima.
  • Disadvantages: I. Slow convergence, especially in the fine-tuning phase of the optimization process. II. Performance is sensitive to selected parameter values. III. Performs poorly in noisy environments.
  • Limitations: I. Prediction performance in various scenarios is challenging [59]. II. In complex problems, there is no guarantee of finding global optima [60]. III. The initial positions of wolves significantly impact the performance [61].
5. ARO
  • Advantages: I. Easy to understand and implement. II. Effectively balances exploration and exploitation. III. Adapts to changing environments and dynamic problems. IV. Versatile; can be applied to various optimization problems.
  • Disadvantages: I. Performance is sensitive to the selection of parameter values. II. Slow convergence speed. III. Intensive computational cost in large optimization problems.
  • Limitations: I. Lacks a robust theoretical framework [62]. II. Initial position dependence [63]. III. Struggles with high-dimensional optimization problems [64].
Table 3. Comparison between mFSO and other existing algorithms [49].

|  | mFSO | SOA | SMA | FSO |
|---|---|---|---|---|
| Best fitness function | 0.09701 | 0.09704 | 0.0971 | 0.09782 |
| Iteration | 43 | 18 | 29 | 23 |
| COE (USD/kWh) | 0.19186 | 0.19470 | 0.19309 | 0.36080 |
Table 4. Parameter settings.

| Parameter | Value |
|---|---|
| L | 4000 bits |
| E_dis | 50 nJ/bit |
| ε_FS | 10 pJ/bit/m² |
| Area | 100 × 100 m |
| Rounds | 100 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

