Gaslike Social Motility: Optimization Algorithm with Application in Image Thresholding Segmentation

by Oscar D. Sanchez 1,*,†, Luz M. Reyes 2,3,†, Arturo Valdivia-González 2,†, Alma Y. Alanis 2,† and Eduardo Rangel-Heras 2,†

1 Departamento Académico de Computación e Industrial, Universidad Autónoma de Guadalajara, Av. Patria 1201, Zapopan 45129, Mexico
2 University Center of Exact Sciences and Engineering, University of Guadalajara, Guadalajara 44100, Mexico
3 Max Planck Institute for the Physics of Complex Systems, 01187 Dresden, Germany
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Algorithms 2025, 18(4), 199; https://doi.org/10.3390/a18040199
Submission received: 4 February 2025 / Revised: 9 March 2025 / Accepted: 25 March 2025 / Published: 2 April 2025
(This article belongs to the Section Evolutionary Algorithms and Machine Learning)

Abstract: This work introduces a novel and practical metaheuristic algorithm, the Gaslike Social Motility (GSM) algorithm, designed for optimization and image thresholding segmentation. Inspired by a deterministic model that replicates social behaviors using gaslike particles, GSM is characterized by its simplicity, minimal parameter requirements, and emergent social dynamics. These dynamics include: (1) attraction between similar particles, (2) formation of stable particle clusters, (3) division of groups upon reaching a critical size, (4) inter-group interactions that influence particle distribution during the search process, and (5) internal state changes in particles driven by local interactions. The model’s versatility, including cross-group monitoring and adaptability to environmental interactions, makes it a powerful tool for exploring diverse scenarios. GSM is rigorously evaluated against established and recent metaheuristic algorithms, including Particle Swarm Optimization (PSO), Differential Evolution (DE), Bat Algorithm (BA), Artificial Bee Colony (ABC), Artificial Hummingbird Algorithm (AHA), AHA with Aquila Optimization (AHA-AO), Colliding Bodies Optimization (CBO), Enhanced CBO (ECBO), and Social Network Search (SNS). Performance is assessed using 22 benchmark functions, demonstrating GSM’s competitiveness. Additionally, GSM’s efficiency in image thresholding segmentation is highlighted, as it achieves high-quality results with fewer iterations and particles compared to other methods.

1. Introduction

The “No Free Lunch” theorem asserts that no single algorithm can universally outperform all others across every possible problem [1]. Consequently, a diverse range of metaheuristic algorithms (MAs) has been developed, including Evolutionary Algorithms (EAs) and Swarm Intelligence Algorithms (SAs) [2]. Among the most prominent MAs are Differential Evolution (DE) [3], Artificial Bee Colony (ABC) [4], Particle Swarm Optimization (PSO) [5], Genetic Algorithm (GA) [6], and Ant Colony Optimization (ACO) [7].
Recent advancements have introduced innovative particle-based algorithms with notable performance. These include the Artificial Hummingbird Algorithm (AHA), which emulates the foraging and flight behaviors of hummingbirds [8]; Colliding Bodies Optimization (CBO), which models solutions as physical bodies undergoing one-dimensional collisions [9]; and Enhanced Colliding Bodies Optimization (ECBO), an improved variant of CBO that incorporates mass and velocity to refine collision dynamics [10]. Additionally, hybrid approaches like the Artificial Hummingbird Algorithm with Aquila Optimization (AHA-AO) [11] and Social Network Search (SNS) have gained traction. SNS simulates social media interactions, where users influence one another through mood-driven behaviors such as Imitation, Conversation, Disputation, and Innovation, mirroring real-world opinion dynamics [12].
Metaheuristic algorithms (MAs) generate solutions by efficiently exploring the search space while progressively narrowing the search area. Their performance hinges on the ability to balance exploration, i.e., searching globally for diverse solutions, and exploitation, i.e., refining solutions within promising regions. Initially, when no prior information about the search space is available, MAs prioritize exploration to identify a wide range of potential solutions. As the algorithm progresses and converges toward potential optima, the focus shifts to exploitation, enhancing solution precision and accuracy [13].
Modern real-life optimization problems are often highly complex, making them difficult to solve using traditional exact methods. This complexity stems from factors such as high dimensionality, intricate parameter interactions, and multimodality [13]. Metaheuristic algorithms (MAs) have emerged as a robust alternative, offering effective solutions to these challenging problems [14].
Parametric identification is a crucial application in automatic control, but it presents significant challenges due to the complexity of the models involved. These models often include numerous parameters and multiple ordinary differential equations, making traditional identification methods inadequate. While parametric estimation techniques for linear systems are well-established [15], they often fail to address the complexities of non-linear systems. Conventional approaches are limited by their reliance on assumptions, such as unimodality, continuity, and differentiability of the objective function, which rarely apply to non-linear systems [16]. Another important application is in image analysis, where image segmentation remains a critical and extensively researched task [17].
Image segmentation is a critical preliminary step in image analysis, aiming to divide an image into meaningful and homogeneous regions based on characteristics such as gray values, edge information, and texture. Framing thresholding as an optimization problem involves minimizing the cross-entropy between the object and the background. However, this approach often encounters multiple local minima, complicating the computational process. Furthermore, the computational time grows exponentially with the number of thresholds, making it a challenging and resource-intensive task.
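To make the optimization view of thresholding concrete, the following Python sketch evaluates a Li-style minimum cross-entropy criterion for a single threshold and finds the best one by exhaustive search. This is an illustrative implementation of the standard criterion, not code from this paper; the function names are ours.

```python
import numpy as np

def cross_entropy(hist, t):
    """Li-style minimum cross-entropy criterion for a single threshold t.

    hist: histogram counts over gray levels 0..L-1.
    Lower values indicate a better object/background split.
    """
    levels = np.arange(len(hist), dtype=float) + 1.0  # shift to avoid log(0)
    lo, hi = hist[:t] * levels[:t], hist[t:] * levels[t:]
    m1, m2 = lo.sum(), hi.sum()
    if m1 == 0 or m2 == 0:
        return np.inf                 # one class is empty: invalid split
    mu1 = m1 / hist[:t].sum()         # mean gray level of the background
    mu2 = m2 / hist[t:].sum()         # mean gray level of the object
    return -(m1 * np.log(mu1) + m2 * np.log(mu2))

def best_threshold(hist):
    """Exhaustive search over all thresholds. With k thresholds this search
    grows combinatorially, which is what motivates metaheuristics here."""
    scores = [cross_entropy(hist, t) for t in range(1, len(hist))]
    return int(np.argmin(scores)) + 1
```

For multiple thresholds the same criterion is summed over regions, and the threshold vector becomes the particle position that a metaheuristic optimizes.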
To address key limitations, particularly the high computational cost of threshold searches, numerous algorithms have been developed. In scenarios where traditional methods fall short, metaheuristic algorithms (MAs) offer a promising alternative. Unlike conventional approaches, MAs do not depend on simplifying assumptions such as unimodality, continuity, or differentiability, making them highly effective for complex nonlinear systems with multiple parameters and time-varying dynamics. Their success has been demonstrated in various real-world applications [16], including the identification of unknown parameters in biological models [18].
In this study, we propose a novel population-based algorithm, the Gaslike Social Motility Algorithm (GSM), for solving optimization problems. Inspired by the movement patterns of gas motility [19], GSM dynamically adapts its search process to efficiently explore the solution space. This model is characterized by its deterministic nature, minimal parameter requirements, and ability to replicate social behaviors, such as: (1) attraction between similar particles, (2) formation of stable groups, (3) division of groups into smaller units upon reaching a critical size, (4) modification of particle distributions through inter-group interactions during the search process, and (5) changes in particles’ internal states through local interactions. The proposed algorithm was evaluated using 22 benchmark optimization functions and compared against the performance of the PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS algorithms. The GSM algorithm demonstrated superior performance, achieving optimal results for all benchmark functions with significantly fewer computational iterations than its counterparts.
The key contributions of this paper are: (1) the introduction of a novel swarm-based optimization algorithm, (2) a comprehensive comparison of various metaheuristic algorithms, and (3) the application of the proposed algorithm to image thresholding segmentation. These contributions underscore the significance of this work in advancing the field of metaheuristic algorithms (MAs).
The paper is structured as follows: first, the GSM algorithm is described in detail; next, a comparative analysis is conducted between GSM and state-of-the-art algorithms, including PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS; then, the application of GSM to image thresholding segmentation is presented; and finally, the conclusions are discussed.

2. The Proposed Gaslike Social Motility (GSM) Algorithm

We propose a novel swarm-based algorithm for global optimization, inspired by the principles of gaslike social motility as described in [19].
This algorithm is based on the concept of collective movement, commonly observed in systems such as bird flocks, bee swarms, and animal herds. Each particle is characterized by an internal state that evolves through interactions with neighboring particles, where the interaction strength depends on the internal states of the neighbors. Specifically, a particle i responds to particle j by either approaching or retreating, requiring an evaluation of the local environment to determine the appropriate action.
The behavior of gas particles is described below, along with its adaptation to develop a global optimization strategy inspired by their dynamics.

2.1. Gaslike Social Motility Behavior: Inspiration and Application

In [19], each particle i interacts within a bounded, time-varying neighborhood defined by the proximity and affinity between the gas particle i and its surrounding particles. These interactions determine whether a particle i forms groups with its neighbors or moves away from them. The behavior of each particle is governed by the following dynamic rules:
x_{t+1}^i = (1 − ε) f(x_t^i) + (ε / |η_t^i|) Σ_{j ∈ η_t^i} f(x_t^j),   (1)

r_{t+1}^i = r_t^i + γ ( Σ_{j ∈ η_t^i} (r_t^j − r_t^i) / |r_t^j − r_t^i| ) ( x_{t+1}^i − Σ_{j ∈ η_t^i} x_{t+1}^j ).   (2)
Here, x_t^i denotes the feature (state or mood) of the i-th particle at time step t, where i ranges from 1 to N, the total number of particles. The position vector r_t^i lies in a d-dimensional space, and the neighborhood of each particle is defined as η_t^i = { j : |r_t^j − r_t^i| ≤ R }, with |η_t^i| representing its cardinality. The parameter γ controls the coupling strength, determining how quickly a particle adjusts its position r_t^i to approach or retreat from others. The constant R defines the interaction radius of each particle, while ε represents the bounded coupling strength within the range 0 ≤ ε ≤ 1.
The state x_t^i of a particle is influenced not only by its own mood but also by the moods of its neighboring particles. The parameter ε determines the extent of this influence: a higher value of ε increases the contribution from neighbors, while a lower value diminishes it. This dynamic models a highly relevant social interaction mechanism, making it particularly suitable for application in metaheuristic algorithms.
In Equation (1), the function f(x) governs the internal behavior of each particle and is applied across the entire network. The dynamics are coupled through a weighted sum of the influences exerted by the neighbors of particle i. In this work, a network is defined as the set of relationships or connections that particles establish with their neighbors within their environment.
From Equation (2), it is clear that particle i evaluates its environment globally before deciding whether to stay close to or move away from its neighbors. This decision is primarily determined by the two factors on the right-hand side of (2). The movement, which can be either inward or outward, is driven by the term x_{t+1}^i − Σ_{j ∈ η_t^i} x_{t+1}^j. Here, particles exchange information about their “affinity” with their neighboring group, reflecting the similarity of their characteristics at time t + 1. The direction of movement is influenced by the first factor in the equation, which assesses the angular distribution of neighbors relative to particle i. Greater asymmetry in the angular distribution of neighbors results in a larger displacement magnitude for i.

2.2. Gaslike Social Motility Algorithm

The emergent behavior described by Equations (1) and (2) is well-suited for use as an optimization algorithm. To adapt this behavior for optimization, we introduced modifications while retaining the core principles of the original model. As a result, (1) and (2) are reformulated as follows:
x_{t+1}^i = (1 − ε) f(r_t^i) + (ε / |η_t^i|) Σ_{j ∈ η_t^i} f(r_t^j),   (3)

r_{t+1}^i = r_N + γ ( Σ_{j ∈ η_t^i} (r_B^j − r_t^i) / |r_B^j − r_t^i| ) ( x_{t+1}^i − Σ_{j ∈ η_t^i} x_{t+1}^j ).   (4)
Here, we consider a population of N motile particles, each with a continuous position r_t^i ∈ R^D. Each particle i navigates the D-dimensional search space, where x_t^i ∈ R represents the state (or “mood”) of the i-th gas particle, reflecting its perception of the environment. The evolution of the particle’s state depends on its interactions with its neighborhood. In particular, when r_B^j = r_t^i, the best position achieved by the neighboring particle j coincides with the position of particle i. In this case, the neighbor does not provide any meaningful contribution and is therefore ignored. Figure 1 illustrates an example of neighbor selection for particle i within the search space.
The primary objective is to optimize the function f(·) by adjusting the positions r of the gas particles. The optimization process depends entirely on the position r at time step t, which is influenced by the particle’s mood x_{t+1}^i. The factor x_{t+1}^i − Σ_{j ∈ η_t^i} x_{t+1}^j is computed to determine the direction of motion for the particle.
This factor quantifies the affinity of particle i with its neighborhood. If the affinity is “good”, particle i moves toward the dense region of the normal distribution corresponding to the best global position of the particles. If the affinity is “bad”, it moves in the opposite direction. Figure 2 illustrates whether the displacement is positive or negative based on the contributions of neighboring particles.
Moreover, in Equation (4), the direction of movement for particle i depends on the distribution of the best-visited positions r_B^j of all particles in the set η_t^i, relative to the position of i, as illustrated in Figure 3.
Finally, the term r_N is a vector generated as r_N ∼ N(r_Best, σ_bs), where r_N = (r_1, r_2, …, r_D) ∈ R^D. This introduces a randomness factor that models the uncertainty and variability inherent in social systems, where particles may act unpredictably in response to external influences. In the context of the model, this term can produce effects contrary to expectations, generating deviations in particle displacement and reflecting behavioral diversity within the network. This reinforces the analogy to real social systems, where interactions between agents are not always deterministic. Here, N(·) is a normal distribution centered on the best global position r_Best, and σ_bs is the standard deviation of the best positions of bs particles, where bs < N. This term represents the leadership of the best-performing particle, guiding collective movement toward feasible regions with optimal solutions. However, particles may choose to move closer or farther depending on interactions with nearby particles, as illustrated in Figure 4.
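The leadership term r_N ∼ N(r_Best, σ_bs) can be sketched as the following Python fragment. This is an illustrative reading of the text, not the authors’ code; the helper name and the assumption that particle bests are already sorted by fitness are ours.

```python
import numpy as np

def sample_r_n(r_best, r_b, bs, rng):
    """Draw the leadership vector r_N ~ N(r_Best, sigma_bs).

    r_best: best global position, shape (D,).
    r_b:    personal-best positions sorted by fitness, shape (N, D).
    bs:     number of top particles used for the spread (bs < N).
    """
    # sigma_bs: per-dimension standard deviation of the bs best positions
    sigma = np.std(r_b[:bs], axis=0)
    return rng.normal(loc=r_best, scale=sigma)
```

As the bs best positions cluster, sigma shrinks, so the leadership term naturally tightens the sampling region around r_Best over the course of the run.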
In summary, the displacement of the particles depends on how the position r^i relates to the best positions found by the set η_t^i and how similar the next state x_{t+1} is to the corresponding i-th component. Specifically, if a particle is far from the best positions, it will move more significantly than if it is near them. The particle is attracted to remain in the vicinity of the best positions only if its state is comparable to those of its neighbors. If a particle has no neighbors (η_t^i = ∅), its position is determined solely by the random variable from N(·), and its internal state is defined by the objective function at r_t^i. To avoid this scenario, an appropriate choice of R is crucial to ensure particles maintain interactions within the network, tailored to the properties of the objective function.
The selection of the parameters ε, γ, and R should be guided by the following considerations: ε regulates the coupling between a particle’s internal state and that of its neighbors; larger values (0 < ε < 1) increase the tendency to resemble neighboring particles, while ε = 0 causes the particle to evolve independently of its neighborhood. γ should be small enough to allow cluster formation, as γ = 0 results in no motion. R depends on the size of the search space: if the space is large and R is small, particles may become too dispersed, preventing cluster formation; conversely, if the space is small and R is large, particles will have nearly all others as neighbors. The interaction radius R defines the neighborhood size, and dynamically adjusting R allows the algorithm to balance local exploitation (small R) and global exploration (large R). When a particle has no useful neighbors (i.e., r_B^j = r_t^i), it acts independently, enabling it to escape local optima.
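The influence of R on connectivity can be checked numerically before a run. The following Python fragment (an illustrative helper of ours, not from the paper) counts each particle’s neighbors for a given radius:

```python
import numpy as np

def neighborhood_sizes(r, R):
    """Count |eta^i| for each particle given interaction radius R.

    r: particle positions, shape (N, D).
    Returns an integer array of neighbor counts, one per particle.
    """
    # pairwise Euclidean distances between all particles
    dist = np.linalg.norm(r[:, None] - r[None, :], axis=2)
    np.fill_diagonal(dist, np.inf)   # a particle is not its own neighbor
    return (dist <= R).sum(axis=1)
```

If most counts are 0, R is too small relative to the search-space size and particles act independently; if counts approach N − 1, R is so large that every particle sees almost the whole swarm.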
In this way, we preserve the original property, where individuals evaluate their internal state to determine how much they want to resemble their neighborhood and, in terms of position, how they are located relative to the best positions of their neighbors and how similar they are to their surroundings. Table 1 lists the GSM algorithm terms. The complete GSM algorithm is given in Algorithm 1.
Algorithm 1 GSM algorithm to solve minimization problems. f is the objective function, N the total number of particles, and D the dimension of the problem.

1:  R, ε, γ, bs ← define parameters
2:  x^i ← initialize the state of each particle i ∈ {1, …, N} randomly
3:  r^i ← initialize a random position for each i ∈ {1, …, N} such that r^i ∈ R^D
4:  r_B^i ← r_t^i, initialization of the best particle positions
5:  η^i ← initialize the neighbor set of each i ∈ {1, …, N} to empty
6:  repeat
7:      for i = 1 … N do
8:          if f(r_t^i) < f(r_B^i) then
9:              r_B^i ← r_t^i
10:     choose the particle with the best swarm position r_Best
11:     select the bs best particles from r_t
12:     for i = 1 … N do
13:         n ← 0
14:         for j = 1 … N do
15:             if |r_t^j − r_t^i| ≤ R and i ≠ j then
16:                 η_t^i(n) ← j
17:                 n ← n + 1
18:     for i = 1 … N do
19:         x_{t+1}^i ← (1 − ε) f(r_t^i) + (ε / |η_t^i|) Σ_{j ∈ η_t^i} f(r_t^j)
20:         r_{t+1}^i ← r_N + γ ( Σ_{j ∈ η_t^i} (r_B^j − r_t^i) / |r_B^j − r_t^i| ) ( x_{t+1}^i − Σ_{j ∈ η_t^i} x_{t+1}^j )
        t ← t + 1
21: until the total number of iterations G is fulfilled
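To make the update rules concrete, the loop of Algorithm 1 can be sketched in Python as follows. This is an illustrative reimplementation under our reading of Equations (3) and (4), not the authors’ MATLAB code; default parameter values follow Table 2, and all helper names are ours.

```python
import numpy as np

def gsm_minimize(f, bounds, n=30, iters=200, R=0.1, eps=1e-4,
                 gamma=1e-3, bs=5, seed=0):
    """Sketch of the Gaslike Social Motility (GSM) minimization loop.

    f: objective mapping a (D,) vector to a scalar.
    bounds: (low, high) arrays of shape (D,).
    Returns the best position and objective value found.
    """
    rng = np.random.default_rng(seed)
    low, high = map(np.asarray, bounds)
    d = low.size
    r = rng.uniform(low, high, size=(n, d))       # positions r_t^i
    r_b = r.copy()                                 # personal bests r_B^i
    f_b = np.array([f(p) for p in r_b])
    for _ in range(iters):
        # update personal bests and the swarm best
        f_now = np.array([f(p) for p in r])
        better = f_now < f_b
        r_b[better], f_b[better] = r[better], f_now[better]
        order = np.argsort(f_b)
        r_best = r_b[order[0]]
        sigma = np.std(r_b[order[:bs]], axis=0)    # spread of the bs best
        # neighborhoods: eta_t^i = {j != i : |r^j - r^i| <= R}
        dist = np.linalg.norm(r[:, None] - r[None, :], axis=2)
        np.fill_diagonal(dist, np.inf)
        x_next = np.empty(n)
        for i in range(n):                         # Eq. (3): state update
            nb = np.flatnonzero(dist[i] <= R)
            mean_nb = f_now[nb].mean() if nb.size else 0.0
            x_next[i] = (1 - eps) * f_now[i] + eps * mean_nb
        for i in range(n):                         # Eq. (4): position update
            nb = np.flatnonzero(dist[i] <= R)
            r_n = rng.normal(r_best, sigma)        # leadership term r_N
            step = np.zeros(d)
            if nb.size:
                diff = r_b[nb] - r[i]
                norm = np.linalg.norm(diff, axis=1)
                keep = norm > 0                    # ignore neighbors with r_B^j = r_t^i
                direction = (diff[keep] / norm[keep, None]).sum(axis=0)
                affinity = x_next[i] - x_next[nb].sum()
                step = gamma * direction * affinity
            r[i] = np.clip(r_n + step, low, high)
    k = int(np.argmin(f_b))
    return r_b[k], f_b[k]
```

With empty neighborhoods the update degenerates to sampling around r_Best, which matches the text’s remark that isolated particles are driven solely by N(·); the parameters n, iters, and seed above are conveniences of this sketch.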

3. GSM Algorithm Performance Results

To evaluate the efficiency of the GSM algorithm, it was implemented and tested on a set of benchmark functions. The GSM algorithm was compared on equal terms with several well-known optimization algorithms, including PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS. The benchmark functions were selected to represent a diverse range of problem types, such as unimodal, multimodal, regular, irregular, separable, non-separable, and multidimensional functions. The parameters used for each algorithm in this study are provided in Table 2.
A unimodal function has a single global optimum with no or only one local optimum, while a multimodal function features multiple local optima. Optimization algorithms are often tested on multimodal functions to assess their ability to avoid local optima. Algorithms with poor exploration capabilities may struggle to search the space thoroughly and may converge to suboptimal solutions. Additionally, flat search spaces pose challenges as they lack gradient information to guide the algorithm toward the global optimum [20]. Both separable and non-separable functions were included in the tests to further evaluate algorithm performance [20].
High-dimensional search spaces present another significant challenge for optimization algorithms. As the dimensionality of a problem increases, the search space volume grows exponentially, making it harder to locate the global optimum. Algorithms that perform well in low-dimensional spaces may struggle in high-dimensional environments. Therefore, global optimization algorithms are rigorously tested in high-dimensional search spaces to ensure robustness [21].
Scaling problems also pose difficulties for optimization algorithms. These problems involve significant variations in magnitude between the domain and the frequency of the hypersurface, complicating the search process [22]. For instance, Langerman functions are nonsymmetrical with randomly distributed local optima, making them particularly challenging to optimize. The quartic function introduces random noise (Gaussian or uniform), requiring algorithms to handle noisy data effectively. Algorithms that fail to perform well on noisy functions are likely to struggle in real-world applications where noise is prevalent. The benchmark functions used in this study are summarized in Table 3 and Table 4, categorized by unimodal and multimodal functions, respectively. These functions are further classified based on characteristics such as continuity, separability, scalability, and differentiability.
The GSM algorithm was compared against PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS using the 22 benchmark functions listed in Table 3 and Table 4. Each algorithm was executed 100 times, and the mean, standard deviation, and optimal values were recorded. The number of iterations and population size for each algorithm are detailed in Table 3 and Table 4. All algorithms were implemented in MATLAB 24.2.0.2871072 (R2024b) Update 5 (The MathWorks, Inc., Natick, MA, USA), and the results are presented in Table 5 and Table 6. The findings indicate that the GSM algorithm converges to optimal values with high precision, requiring fewer iterations and smaller population sizes compared to other algorithms.
In Table 7, a “+” denotes cases where the GSM algorithm outperformed the other algorithms in terms of mean values, while a “−” indicates inferior performance. Overall, the GSM algorithm demonstrated superior performance on most benchmark functions compared to PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS.
Table 2. Parameters of the algorithms.

| Algorithm | Parameters | Reference |
|---|---|---|
| GSM | R = 0.1, ε = 0.0001, γ = 0.001, bs = 5 | — |
| PSO | c1 = 2.0, c2 = 2.0, ω = 0.9–0.2 | [23] |
| DE | F = 0.9, cr = 0.2 | [24] |
| BA | fmin = 0, fmax = 1, A0 = 1, r0 = 0.1, α = 0.9, γ = 0.9 | [25] |
| ABC | Food sources = 10 | [20] |
| AHA | Mig_trigger = 20, flight step size Fsz = [−1, 1], r ∈ [0, 1] | [8] |
| CBO | no parameters | [9] |
| ECBO | Pro = 0.5 | [10] |
| AHA-AO | α = 0.1, δ = 0.1, ω = 0.005, r ∈ [0, 1] | [11] |
| SNS | no parameters | [12] |
Table 3. Unimodal benchmark functions used in the experiments: function F, number of iterations I, population P, and dimension D.

| No | Function (F) | Type | D | Range | Formulation | P | I |
|---|---|---|---|---|---|---|---|
| F1 | Booth | U, C, Di, NS, NSc | 2 | [−10, 10] | f(x) = (x1 + 2x2 − 7)² + (2x1 + x2 − 5)² | 50 | 100 |
| F2 | Sphere | U, S | 2 | [−5.12, 5.12] | f(x) = Σ_{i=1}^d x_i² | 50 | 50 |
| F3 | Sphere | U, S | 10 | [−5.12, 5.12] | f(x) = Σ_{i=1}^d x_i² | 150 | 100 |
| F4 | Sphere | U, S | 20 | [−5.12, 5.12] | f(x) = Σ_{i=1}^d x_i² | 150 | 150 |
| F5 | Beale | U, C, Di, NS, NSc | 2 | [−4.5, 4.5] | f(x) = (1.5 − x1 + x1·x2)² + (2.25 − x1 + x1·x2²)² + (2.625 − x1 + x1·x2³)² | 50 | 100 |
| F6 | Rotated Hyper-Ellipsoid | U, C, Di, NS, NSc | 2 | [−65.536, 65.536] | f(x) = Σ_{i=1}^d Σ_{j=1}^i x_j² | 50 | 250 |
| F7 | Sum Squares | U, C, Di, S, Sc | 20 | [−5.12, 5.12] | f(x) = Σ_{i=1}^d i·x_i² | 200 | 500 |
| F8 | Quartic | U, C, Di, S, Sc | 2 | [−1.28, 1.28] | f(x) = Σ_{i=1}^n i·x_i⁴ + random[0, 1) | 50 | 100 |
The GSM algorithm demonstrates superior performance on the unimodal, continuous, differentiable, non-separable, and non-scalable functions F1, F5, and F6, outperforming the compared algorithms, including the most recent ones in the literature. Notably, for F1, the GSM algorithm consistently finds the global optimum with few iterations. For unimodal and separable functions such as F2, F3, and F4, algorithms like AHA, AHA-AO, and SNS exhibit better performance in high-dimensional spaces, indicating their ability to converge quickly on smooth functions. However, in low-dimensional settings, the GSM algorithm shows a slight improvement over these algorithms. Functions F7 and F8 are unimodal, continuous, differentiable, separable, and scalable. For F7, the AHA, AHA-AO, and SNS algorithms achieve the best results. In the case of F8, the GSM, AHA, AHA-AO, and SNS algorithms perform similarly, making it difficult to determine which algorithm is superior for this function.
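For reference, a few of the unimodal benchmarks from Table 3 written out in Python. These follow the standard formulations; only the function names are ours.

```python
import numpy as np

def booth(x):
    """Booth function (F1): global minimum f(1, 3) = 0."""
    x1, x2 = x
    return (x1 + 2 * x2 - 7) ** 2 + (2 * x1 + x2 - 5) ** 2

def sphere(x):
    """Sphere function (F2-F4): global minimum f(0, ..., 0) = 0."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(x ** 2))

def rotated_hyper_ellipsoid(x):
    """Rotated hyper-ellipsoid (F6): sum of cumulative squared partial sums."""
    x = np.asarray(x, dtype=float)
    return float(np.sum(np.cumsum(x) ** 2))
```

Any of these can be passed directly as the objective f of a population-based optimizer, with the search range taken from Table 3.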
Table 4. Multimodal benchmark functions used in the experiments: function F, number of iterations I, population P, and dimension D.

| No | Function (F) | Type | D | Range | Formulation | P | I |
|---|---|---|---|---|---|---|---|
| F9 | Six-Hump Camel | M, C, Di, NS, NSc | 2 | [−3, 3], [−2, 2] | f(x) = (4 − 2.1x1² + x1⁴/3)x1² + x1x2 + (−4 + 4x2²)x2² | 15 | 20 |
| F10 | Ackley | M, C, Di, NS, Sc | 2 | [−32.768, 32.768] | f(x) = −a·exp(−b·√((1/d)Σ_{i=1}^d x_i²)) − exp((1/d)Σ_{i=1}^d cos(c·x_i)) + a + exp(1) | 50 | 100 |
| F11 | Michalewicz | M, S | 2 | [0, π] | f(x) = −Σ_{i=1}^d sin(x_i)·sin^{2m}(i·x_i²/π) | 50 | 15 |
| F12 | Michalewicz | M, S | 5 | [0, π] | f(x) = −Σ_{i=1}^d sin(x_i)·sin^{2m}(i·x_i²/π) | 200 | 100 |
| F13 | Michalewicz | M, S | 10 | [0, π] | f(x) = −Σ_{i=1}^d sin(x_i)·sin^{2m}(i·x_i²/π) | 150 | 40 |
| F14 | Griewank | M, C, Di, NS, Sc | 2 | [−600, 600] | f(x) = Σ_{i=1}^d x_i²/4000 − Π_{i=1}^d cos(x_i/√i) + 1 | 150 | 100 |
| F15 | Cross-in-Tray | M, C, NS, NSc | 2 | [−10, 10] | f(x) = −0.0001·(|sin(x1)·sin(x2)·exp(|100 − √(x1² + x2²)/π|)| + 1)^0.1 | 50 | 15 |
| F16 | Levy | M, NS | 5 | [−10, 10] | f(x) = sin²(πw1) + Σ_{i=1}^{d−1}(w_i − 1)²[1 + 10·sin²(πw_i + 1)] + (w_d − 1)²[1 + sin²(2πw_d)], where w_i = 1 + (x_i − 1)/4 for all i = 1, …, d | 50 | 100 |
| F17 | Easom | M, C, Di, S, NSc | 2 | [−100, 100] | f(x) = −cos(x1)·cos(x2)·exp(−(x1 − π)² − (x2 − π)²) | 50 | 25 |
| F18 | Branin | M, C, Di, NS, NSc | 2 | [−5, 10], [0, 15] | f(x) = a(x2 − b·x1² + c·x1 − r)² + s(1 − t)·cos(x1) + s | 50 | 15 |
| F19 | Bohachevsky | M, C, Di, S, NSc | 2 | [−100, 100] | f1(x) = x1² + 2x2² − 0.3·cos(3πx1) − 0.4·cos(4πx2) + 0.7 | 50 | 50 |
| F20 | Schwefel | M, C, Di, S, Sc | 2 | [−500, 500] | f(x) = 418.9829·d − Σ_{i=1}^d x_i·sin(√|x_i|) | 50 | 100 |
| F21 | Shubert | M, C, Di, S, NSc | 2 | [−10, 10] | f(x) = (Σ_{i=1}^5 i·cos((i + 1)x1 + i))·(Σ_{i=1}^5 i·cos((i + 1)x2 + i)) | 50 | 50 |
| F22 | Langermann | M, NS | 2 | [0, 10] | f(x) = Σ_{i=1}^m c_i·exp(−(1/π)Σ_{j=1}^d (x_j − A_ij)²)·cos(π·Σ_{j=1}^d (x_j − A_ij)²) | 50 | 100 |
On the other hand, functions F9 and F18 are multimodal, continuous, differentiable, non-separable, and non-scalable, featuring multiple local minima. For these functions, the GSM algorithm achieves the best results with a low standard deviation, indicating high precision. Functions F10 and F14 are multimodal, continuous, differentiable, non-separable, and scalable. For F10, the GSM algorithm performs slightly better than the other algorithms. However, for F14, the AHA and AHA-AO algorithms demonstrate superior performance. Functions F11, F12, and F13 are multimodal and separable, posing challenges for all algorithms in locating the global optimum. For F11 and F13, the GSM algorithm produces results closest to the optimum, while for F12, the DE algorithm outperforms the others. For the non-scalable function F15, the GSM algorithm surpasses all competing algorithms. Similarly, for F16, a multimodal and non-separable function, the GSM algorithm proves to be the best. In the case of F17, a continuous, differentiable, non-scalable, and multimodal function, both the GSM and ABC algorithms deliver strong results, with GSM performing slightly better. Functions F19 and F21 are multimodal, continuous, separable, non-scalable, and differentiable. While most algorithms perform well on these functions, the GSM algorithm stands out as the best. In contrast, F20 is a scalable function where all algorithms struggle to find the optimum, with the best results achieved by SNS, AHA, AHA-AO, and GSM. Finally, for F22, a multimodal and non-separable function, most algorithms fail to optimize effectively. However, the closest results to the optimum are obtained by CBO, ECBO, and GSM.
Table 5. Results of all proposed algorithms for unimodal functions.

| No | Min | Stat | GSM | PSO | DE | BA | ABC | AHA | CBO | ECBO | AHA-AO | SNS |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| F1 | 0 | Best | 0 | 4.7161 × 10^−4 | 1.5620 × 10^−15 | 1.9285 × 10^−23 | 5.4342 × 10^−17 | 2.2249 × 10^−9 | 2.1592 × 10^−20 | 1.4831 × 10^−5 | 1.0684 × 10^−8 | 7.2601 × 10^−21 |
| | | Mean | 0 | 3.3472 × 10^−8 | 6.2692 × 10^−12 | 8.4600 × 10^−20 | 5.8342 × 10^−13 | 1.5434 × 10^−9 | 2.1668 × 10^−18 | 8.3649 × 10^−5 | 3.9487 × 10^−9 | 8.1263 × 10^−19 |
| | | StdDev | 0 | 1.0584 × 10^−7 | 1.7007 × 10^−11 | 9.9731 × 10^−20 | 1.7111 × 10^−12 | 2.5825 × 10^−9 | 4.6605 × 10^−18 | 1.0683 × 10^−4 | 3.4400 × 10^−9 | 1.2731 × 10^−18 |
| F2 | 0 | Best | 7.0223 × 10^−47 | 5.211 × 10^−4 | 6.6253 × 10^−18 | 2.3480 × 10^−15 | 8.1189 × 10^−11 | 4.0049 × 10^−19 | 1.8585 × 10^−13 | 1.0575 × 10^−6 | 2.9348 × 10^−13 | 1.8608 × 10^−18 |
| | | Mean | 4.8067 × 10^−37 | 1.0242 × 10^−8 | 1.6800 × 10^−16 | 3.6863 × 10^−14 | 3.1458 × 10^−8 | 3.3871 × 10^−18 | 1.4750 × 10^−13 | 1.2829 × 10^−5 | 5.8306 × 10^−11 | 4.2709 × 10^−17 |
| | | StdDev | 1.5200 × 10^−36 | 3.2126 × 10^−8 | 1.4829 × 10^−16 | 3.9603 × 10^−14 | 5.4034 × 10^−8 | 4.6770 × 10^−18 | 2.3268 × 10^−13 | 1.3716 × 10^−5 | 1.5306 × 10^−10 | 8.0683 × 10^−17 |
| F3 | 0 | Best | 1.8983 × 10^−23 | 1.2195 × 10^−10 | 3.2206 × 10^−6 | 2.6370 × 10^−12 | 0.0143 | 2.2838 × 10^−28 | 2.6720 × 10^−5 | 0.0144 | 1.5281 × 10^−25 | 4.3313 × 10^−19 |
| | | Mean | 5.5377 × 10^−23 | 3.0311 × 10^−7 | 5.6897 × 10^−6 | 3.7566 × 10^−11 | 0.0303 | 2.9439 × 10^−28 | 2.6527 × 10^−5 | 0.0170 | 4.3047 × 10^−20 | 3.5935 × 10^−19 |
| | | StdDev | 6.4917 × 10^−23 | 6.1127 × 10^−7 | 1.8035 × 10^−6 | 3.7826 × 10^−11 | 0.0189 | 6.7651 × 10^−28 | 1.1917 × 10^−5 | 0.0060 | 8.1642 × 10^−20 | 2.3942 × 10^−19 |
| F4 | 0 | Best | 5.4565 × 10^−19 | 9.0912 × 10^−10 | 1.0301 × 10^−4 | 7.9196 × 10^−9 | 0.1569 | 1.0626 × 10^−50 | 0.0039 | 0.1433 | 1.0605 × 10^−36 | 2.3915 × 10^−25 |
| | | Mean | 4.4192 × 10^−8 | 2.4432 × 10^−5 | 1.8558 × 10^−4 | 2.6131 × 10^−8 | 0.4346 | 8.8246 × 10^−44 | 0.0090 | 0.1320 | 3.9312 × 10^−30 | 1.1474 × 10^−25 |
| | | StdDev | 1.3878 × 10^−7 | 3.9384 × 10^−5 | 6.3749 × 10^−5 | 1.0577 × 10^−8 | 0.2315 | 1.3711 × 10^−43 | 0.0027 | 0.0457 | 1.0593 × 10^−29 | 6.7661 × 10^−26 |
| F5 | 0 | Best | 0 | 2.6321 × 10^−16 | 8.9583 × 10^−13 | 5.7992 × 10^−21 | 2.9569 × 10^−15 | 2.1248 × 10^−10 | 9.5145 × 10^−12 | 2.9519 × 10^−6 | 3.6675 × 10^−11 | 1.0593 × 10^−16 |
| | | Mean | 2.7733 × 10^−33 | 9.6230 × 10^−4 | 5.2257 × 10^−10 | 0.0909 | 6.9116 × 10^−13 | 7.7244 × 10^−10 | 1.3407 × 10^−9 | 2.2842 × 10^−5 | 2.8535 × 10^−10 | 1.7591 × 10^−14 |
| | | StdDev | 8.7700 × 10^−33 | 0.0030 | 1.2922 × 10^−9 | 0.1917 | 1.4852 × 10^−12 | 1.0677 × 10^−9 | 3.3844 × 10^−9 | 5.1538 × 10^−5 | 4.5641 × 10^−10 | 5.1523 × 10^−14 |
| F6 | 0 | Best | 6.0353 × 10^−228 | 1.2180 × 10^−62 | 1.6365 × 10^−80 | 3.8400 × 10^−32 | 3.6787 × 10^−41 | 4.3497 × 10^−91 | 5.1664 × 10^−64 | 2.8114 × 10^−6 | 2.7457 × 10^−64 | 8.2010 × 10^−76 |
| | | Mean | 3.0077 × 10^−226 | 7.3249 × 10^−53 | 4.9137 × 10^−76 | 9.9511 × 10^−29 | 2.2754 × 10^−38 | 3.8621 × 10^−83 | 4.8796 × 10^−62 | 2.3092 × 10^−5 | 2.4815 × 10^−57 | 2.8823 × 10^−74 |
| | | StdDev | 0 | 2.1958 × 10^−52 | 1.0362 × 10^−75 | 2.8837 × 10^−28 | 5.0686 × 10^−38 | 6.6999 × 10^−83 | 7.6559 × 10^−62 | 2.6735 × 10^−5 | 7.2763 × 10^−57 | 7.6906 × 10^−74 |
| F7 | 0 | Best | 4.6778 × 10^−60 | 9.7328 × 10^−24 | 4.4776 × 10^−17 | 3.0762 × 10^−13 | 0.6562 | 2.4692 × 10^−152 | 1.9993 × 10^−12 | 0.0678 | 1.7838 × 10^−116 | 3.5424 × 10^−85 |
| | | Mean | 5.9890 × 10^−59 | 5.7186 × 10^−22 | 6.9886 × 10^−17 | 8.4214 × 10^−10 | 2.1627 | 2.8594 × 10^−152 | 6.7964 × 10^−12 | 0.1077 | 5.1417 × 10^−100 | 1.0878 × 10^−85 |
| | | StdDev | 1.0477 × 10^−58 | 1.5996 × 10^−21 | 2.2506 × 10^−17 | 2.1557 × 10^−9 | 1.1511 | 7.9217 × 10^−152 | 3.4927 × 10^−12 | 0.0610 | 1.6259 × 10^−99 | 1.0818 × 10^−85 |
| F8 | 0 | Best | 8.9795 × 10^−5 | 0.0188 | 4.7591 × 10^−4 | 9.5197 × 10^−5 | 0.5218 | 0.000604 | 2.8205 × 10^−4 | 0.0021 | 0.0013 | 4.1045 × 10^−4 |
| | | Mean | 6.1556 × 10^−4 | 0.0192 | 0.0011 | 0.0119 | 0.3278 | 5.5981 × 10^−4 | 5.8584 × 10^−4 | 0.0018 | 6.1085 × 10^−4 | 5.7738 × 10^−4 |
| | | StdDev | 6.9240 × 10^−4 | 0.0125 | 7.2484 × 10^−4 | 0.0228 | 0.0642 | 3.2712 × 10^−4 | 2.9234 × 10^−4 | 0.0016 | 7.7823 × 10^−4 | 6.8678 × 10^−4 |
Table 6. Results of all proposed algorithms for multimodal functions.
| No | Min | Stat | GSM | PSO | DE | BA | ABC | AHA | CBO | ECBO | AHA-AO | SNS |
| F9 | −1.0316 | Best | −1.0316 | −1.0315 | −1.0316 | −1.0316 | −0.8995 | −1.0289 | −1.0270 | −1.0316 | −1.0229 | −1.0296 |
| | | Mean | −1.0316 | −0.9268 | −1.0315 | −1.0251 | 4.2389 | −1.0269 | −1.0234 | −1.0300 | −1.0245 | −1.0301 |
| | | StdDev | 4.7291×10^−7 | 0.2331 | 1.4627×10^−4 | 0.0206 | 5.7466 | 0.0064 | 0.0151 | 0.0026 | 0.0063 | 0.0023 |
| F10 | 0 | Best | 8.8817×10^−16 | 0.0689 | 8.8817×10^−16 | 3.0121×10^−11 | 4.6990×10^−8 | 8.8818×10^−16 | 7.9670×10^−13 | 0.0410 | 7.9936×10^−15 | 7.9936×10^−15 |
| | | Mean | 1.2434×10^−16 | 2.9340×10^−10 | 5.5067×10^−15 | 1.1846×10^−9 | 1.3126×10^−6 | 8.8818×10^−16 | 5.7216×10^−13 | 0.0448 | 3.6840×10^−12 | 5.8620×10^−15 |
| | | StdDev | 0 | 5.6339×10^−10 | 4.4468×10^−15 | 1.5798×10^−9 | 1.3306×10^−6 | 0 | 8.0797×10^−13 | 0.0407 | 1.0636×10^−11 | 2.4841×10^−15 |
| F11 | −1.8013 | Best | −1.8013 | −1.8007 | −1.8012 | −1.8013 | −1.7612 | −1.7835 | −1.8009 | −1.8010 | −1.7994 | −1.8005 |
| | | Mean | −1.8013 | −1.7328 | −1.8012 | −1.6382 | −1.1260 | −1.7933 | −1.7693 | −1.7929 | −1.7972 | −1.8000 |
| | | StdDev | 4.2321×10^−7 | 0.1810 | 9.0472×10^−5 | 0.3365 | 0.5213 | 0.0107 | 0.0780 | 0.0114 | 0.0070 | 0.0018 |
| F12 | −4.6876 | Best | −4.6876 | −4.4186 | −4.6876 | −4.6984 | −4.6455 | −4.6677 | −3.0771 | −2.6610 | −4.6848 | −4.6795 |
| | | Mean | −4.5648 | −4.5497 | −4.6876 | −4.0940 | −3.9823 | −4.6835 | −3.1844 | −3.1365 | −4.6822 | −4.6792 |
| | | StdDev | 0.1205 | 0.2592 | 5.4176×10^−10 | 0.5330 | 0.6404 | 0.0060 | 0.1568 | 0.3326 | 0.0086 | 0.3155 |
| F13 | −9.6601 | Best | −9.3481 | −7.3345 | −8.6679 | −7.4825 | −4.0441 | −6.5437 | −4.2352 | −5.3466 | −7.1608 | −4.0027 |
| | | Mean | −8.4376 | −5.6146 | −7.5020 | −6.1008 | −2.7108 | −6.9682 | −4.0461 | −4.6553 | −6.9637 | −3.9526 |
| | | StdDev | 0.675264 | 1.5458 | 0.4730 | 1.1223 | 1.0923 | 0.3616 | 0.2349 | 0.4807 | 0.2925 | 0.4216 |
| F14 | 0 | Best | 0 | 0.0503 | 1.7840×10^−7 | 0.0073 | 0.0249 | 0 | 8.0040×10^−4 | 0.0340 | 0 | 1.6239×10^−6 |
| | | Mean | 2.5455×10^−6 | 0.0188 | 3.2405×10^−5 | 0.0182 | 0.7264 | 0 | 0.0018 | 0.0204 | 0 | 1.1947×10^−5 |
| | | StdDev | 8.0449×10^−6 | 0.0276 | 1.7840×10^−7 | 0.0189 | 0.5711 | 0 | 0.0018 | 0.0094 | 0 | 2.7079×10^−5 |
| F15 | −2.06261 | Best | −2.06261 | −2.06261 | −2.06260 | −2.06261 | −2.05771 | −2.0622 | 6.5049×10^−5 | −2.0625 | −2.0622 | −2.06197 |
| | | Mean | −2.06261 | −2.05583 | −2.06256 | −2.06260 | −1.91421 | −2.0623 | 1.1164×10^−4 | −2.0622 | −2.0622 | −2.06232 |
| | | StdDev | 6.51894×10^−12 | 0.01216 | 3.81885×10^−5 | 7.3581×10^−9 | 0.2170 | 2.8316×10^−4 | 7.3258×10^−5 | 2.9582×10^−4 | 3.3633×10^−4 | 2.8401×10^−4 |
| F16 | 0 | Best | 1.49×10^−32 | 6.3494×10^−13 | 2.7875×10^−12 | 0.0895 | 1.4886×10^−4 | 8.0073×10^−6 | 2.1805×10^−10 | 5.8909×10^−4 | 5.7749×10^−6 | 1.2099×10^−8 |
| | | Mean | 5.0346×10^−28 | 1.4721×10^−9 | 9.7101×10^−12 | 1.7433 | 0.0159 | 8.5798×10^−6 | 9.3221×10^−10 | 7.2814×10^−4 | 5.5570×10^−6 | 6.8459×10^−9 |
| | | StdDev | 1.5907×10^−27 | 4.6030×10^−9 | 1.0476×10^−11 | 2.1265 | 0.0359 | 6.6270×10^−6 | 1.2289×10^−9 | 5.8909×10^−4 | 4.3407×10^−6 | 1.3644×10^−8 |
| F17 | −1 | Best | −1 | −0.9957 | −0.9263 | −1.0000 | −1.0144 | −0.99207 | −0.974192 | −0.85191 | −0.9237 | −0.8505 |
| | | Mean | −0.9999 | −0.5063 | −0.2893 | −0.8000 | −1.0014 | −0.9435 | −7.4192×10^−3 | −8.5191×10^−2 | −0.7494 | −0.9566 |
| | | StdDev | 8.7416×10^−13 | 0.4702 | 0.3795 | 0.4216 | 0.0046 | 0.0863 | 1.6793×10^−58 | 0 | 0.2578 | 0.0465 |
| F18 | 0.3978 | Best | 0.3978 | 0.3979 | 0.3980 | 0.3979 | 0.3500 | 0.42234 | 0.4014 | 0.4003 | 0.4240 | 0.3984 |
| | | Mean | 0.3978 | 1.04669 | 0.4111 | 0.3979 | 0.4844 | 0.4116 | 0.4110 | 0.4053 | 0.4205 | 0.4020 |
| | | StdDev | 8.8638×10^−7 | 0.80193 | 0.0152 | 2.8015×10^−7 | 0.00623 | 0.0125 | 0.0209 | 0.0087 | 0.0168 | 0.0055 |
| F19 | 0 | Best | 0 | 6.4181×10^−12 | 2.8421×10^−14 | 3.0285×10^−12 | 2.4448×10^−5 | 0 | 3.3798×10^−8 | 0.3129 | 1.3847×10^−12 | 7.2187×10^−13 |
| | | Mean | 0 | 1.6284×10^−5 | 1.9197×10^−12 | 0.0413 | 0.2298 | 4.1744×10^−15 | 2.4281×10^−8 | 0.1556 | 2.2589×10^−10 | 1.2476×10^−12 |
| | | StdDev | 0 | 5.1098×10^−5 | 3.0728×10^−12 | 0.1306 | 0.5022 | 1.3123×10^−14 | 3.2786×10^−8 | 0.1604 | 4.7812×10^−10 | 1.3407×10^−12 |
| F20 | 0 | Best | 2.5455×10^−5 | 2.5451×10^−5 | −5.5072×10^−3 | −2.1668×10^−3 | −4.2575 | 3.5033×10^−5 | −8.6854×10^−3 | 3.876×10^−3 | 5.6815×10^−5 | 2.5455×10^−5 |
| | | Mean | 9.6119×10^−5 | 1.1175×10^−4 | −1.3876×10^−3 | −2.1682×10^−3 | −6.3527 | 9.1733×10^−5 | −4.1821×10^−3 | −1.9261×10^−3 | 2.1051×10^−4 | 2.5455×10^−5 |
| | | StdDev | 3.0395×10^−4 | 2.7290×10^−4 | 2.0601×10^−3 | 6.8514×10^−3 | 1.3092 | 2.6225×10^−4 | 1.3221×10^−3 | 3.0815×10^−2 | 2.9967×10^−4 | 6.3251×10^−4 |
| F21 | −186.7309 | Best | −186.7309 | −186.7309 | −186.7309 | −186.7309 | −186.6556 | −186.4788 | −176.6064 | −184.3464 | −186.4394 | −186.7299 |
| | | Mean | −186.7309 | −181.1768 | −186.7197 | −156.2154 | −154.1955 | −186.0850 | −180.0691 | −178.6231 | −186.2692 | −186.7201 |
| | | StdDev | 4.4613×10^−6 | 11.3440 | 0.0188 | 51.9967 | 39.2589 | 0.6731 | 5.8514 | 11.1027 | 0.4619 | 0.0140 |
| F22 | −1.4 | Best | −1.7541 | −1.7546 | −1.7546 | −1.7547 | −1.7547 | −1.0809 | −1.2551 | −1.6272 | −0.6639 | −1.7547 |
| | | Mean | −1.4259 | −1.6439 | −1.7546 | −1.3612 | −0.9023 | −1.0809 | −1.4072 | −1.4337 | −0.6639 | −1.7547 |
| | | StdDev | 0.5258 | 0.3464 | 3.5298×10^−8 | 0.5147 | 0.9375 | 1.2794×10^−7 | 0.1897 | 0.2222 | 2.2680×10^−7 | 4.3274×10^−7 |
Table 7. Pairwise comparison of GSM against each algorithm on the 22 benchmark functions.
NoGSM vs. PSOGSM vs. DEGSM vs. BAGSM vs. ABCGSM vs. AHAGSM vs. CBOGSM vs. ECBOGSM vs. AHA-AOGSM vs. SNS
F 1 +++++++++
F 2 +++++++++
F 3 ++++++++
F 4 +++++
F 5 +++++++++
F 6 +++++++++
F 7 ++++++
F 8 +++++
F 9 +++++++++
F 10 +++++++++
F 11 +++++++++
F 12 ++++++
F 13 +++++++++
F 14 +++++++
F 15 +++++++++
F 16 +++++++++
F 17 +++++++++
F 18 +++++++++
F 19 +++++++++
F 20 +++++++++
F 21 +++++++++
F 22 ++++++++

Wilcoxon Signed-Rank Test

The statistical results presented in Table 5 and Table 6 do not provide sufficient evidence to determine whether there are significant differences between the algorithms. To address this, a pairwise statistical test is often employed for more robust comparisons. In this study, the Wilcoxon Signed-Rank Test is applied to the results obtained from 100 runs of each algorithm to assess their relative efficiency across the set of benchmark functions. This statistical technique is widely used for comparing paired samples in optimization studies [1]. The test assumes n experimental data sets, each with two observations, a i and b i , corresponding to the performance of two algorithms on function i. This results in two paired samples: a 1 , , a n and b 1 , , b n . The T-statistic is calculated as the sum of the negative ranks derived from the differences z i = b i a i for all i = 1 , , n . The null hypothesis for this test is defined as follows:
  • H 0 : There is no significant difference between the mean solutions produced by algorithm A and algorithm B for the set of benchmark functions.
The rank sums provided by the Wilcoxon Signed-Rank Test, denoted T + and T − , are then examined to evaluate this hypothesis, as described in [1]. The critical values of the T-statistic are obtained from the normal approximation Z. A significance level of α = 0.05 is applied, with n = 22 (the number of benchmark functions). Table 8 presents the pairwise statistical results comparing the GSM algorithm with the other algorithms.
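The T statistic can be reproduced directly from paired run results. The sketch below is an illustrative helper, not the authors' code (in practice, `scipy.stats.wilcoxon` performs the full test with p-values); it computes the rank sums T+ and T−. For n = 22 functions with no zero differences, T+ + T− = 22·23/2 = 253, which matches the rank sums reported in Table 8.

```python
import numpy as np

def wilcoxon_T(a, b):
    """Rank sums T+ and T- of the Wilcoxon signed-rank test.

    a[i], b[i] are the paired performances of two algorithms on
    function i; differences z_i = b_i - a_i are ranked by |z_i|,
    zero differences are dropped, and ties receive average ranks.
    """
    z = np.asarray(b, dtype=float) - np.asarray(a, dtype=float)
    z = z[z != 0.0]                       # drop zero differences
    abs_z = np.abs(z)
    order = abs_z.argsort()
    ranks = np.empty(len(z))
    ranks[order] = np.arange(1, len(z) + 1)
    for v in np.unique(abs_z):            # average ranks over ties
        ranks[abs_z == v] = ranks[abs_z == v].mean()
    return ranks[z > 0].sum(), ranks[z < 0].sum()
```

Rejecting H0 at α = 0.05 then corresponds to the smaller of the two rank sums falling below the critical value of the normal approximation.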
The GSM algorithm demonstrates superior performance compared to the PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS algorithms. In all pairwise comparisons, the null hypothesis is rejected, indicating significant differences in performance. Specifically, the PSO algorithm underperforms relative to the GSM algorithm. While the DE algorithm is highly competitive, the statistical test results confirm that GSM outperforms it. The BA algorithm, a powerful swarm-based approach, has been successfully applied to various problems. However, its performance falls short when compared to the GSM algorithm. As shown in Table 8, the key difference between the GSM and ABC algorithms lies in their exploration capabilities. The ABC algorithm struggles with certain functions, resulting in poorer performance. The AHA algorithm, another swarm-based approach, delivers excellent results and excels in optimizing smooth functions, often finding the optimum quickly. However, it faces challenges with multimodal functions, where the GSM algorithm achieves better results.
Table 8. Wilcoxon signed-rank test results.
| GSM vs. | PSO | DE | BA | ABC | AHA | CBO | ECBO | AHA-AO | SNS |
| α | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 | 0.05 |
| T + | 253 | 235 | 246 | 253 | 199 | 229 | 253 | 205 | 229 |
| T − | 0 | 18 | 7 | 0 | 54 | 24 | 0 | 48 | 24 |
| H 0 | rejected | rejected | rejected | rejected | rejected | rejected | rejected | rejected | rejected |
Swarm-based algorithms CBO and ECBO exhibit lower performance in most problems, even underperforming compared to older algorithms. The AHA-AO algorithm, an improved version of AHA, delivers results very similar to its predecessor. It excels in optimizing smooth functions and performs slightly better in multimodal functions. The SNS algorithm, a recent swarm-based approach, demonstrates strong performance across various test functions. However, its performance declines in non-separable functions, such as the Six-Hump Camel and Easom functions, where the proposed GSM algorithm proves superior. Overall, the GSM algorithm performs well across a wide range of functions, including continuous, non-continuous, separable, non-separable, scalable, non-scalable, differentiable, and non-differentiable problems. Specifically, it outperforms other algorithms in unimodal separable functions and also delivers strong results for unimodal non-separable functions, surpassing PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS. While its performance is slightly lower for multimodal separable functions compared to unimodal separable ones, it still outperforms most population-based algorithms. For non-separable multimodal functions, the GSM algorithm maintains acceptable performance without significant issues. These results suggest that the GSM algorithm is particularly effective for separable functions. The GSM algorithm successfully optimized all implemented functions, which include challenging problems such as the QUARTIC function with random noise—useful for testing real-world applications—and functions with large search spaces like SCHWEFEL and GRIEWANK. This highlights the algorithm’s robustness and strong performance across diverse problem types.

4. GSM Algorithm Application to Threshold Segmentation

This section introduces the concepts of image segmentation and minimum cross entropy (MCE). In computer vision and pattern recognition, image segmentation is a critical preprocessing step that plays a vital role in obtaining a clear representation of an image and extracting essential information for subsequent analysis [17]. Image segmentation involves dividing a digital image into meaningful regions based on specific characteristics, such as gray values, texture, or edge details. Numerous algorithms and methods have been developed for image segmentation, which can be broadly classified into three categories: clustering-based methods, histogram thresholding-based methods, and region splitting and merging-based methods. Among these, histogram thresholding is widely used due to its efficiency and adaptability across various scenarios. Histogram thresholding utilizes the gray-level histogram of an image to determine threshold values that separate the image into distinct classes. For example, to divide an image into two classes, a single threshold value is required, a process known as bilevel thresholding (BTH). When more than two classes are needed, multilevel thresholding (MTH) is employed. In BTH, the threshold value is selected based on the following rule:
C_1 if 0 ≤ p < th
C_2 if th ≤ p < L − 1    (5)
where p represents a pixel in grayscale from the matrix I_g, which has dimensions m × n. The grayscale levels are defined as L = { 0 , 1 , 2 , … , L − 1 }. The pixel p is assigned to one of the two classes, C_1 or C_2, based on the threshold value th. Equation (5) can be extended to multilevel thresholding, as shown in Equation (6).
C_1 if 0 ≤ p < th_1
C_2 if th_1 ≤ p < th_2
⋮
C_i if th_i ≤ p < th_{i+1}
⋮
C_n if th_n ≤ p < L − 1    (6)
Consider Equation (6), where { t h 1 , t h 2 , , t h i + 1 , , t h n } represents the set of thresholds. Threshold-based segmentation methods can be categorized into two types: parametric and nonparametric. Nonparametric methods determine thresholds using discriminators such as entropy, interclass variance, or error rate. In contrast, parametric methods estimate thresholds by modeling the gray-level distribution of each category using a probability density function (PDF). This paper adopts entropy as the discriminator criterion for identifying optimal thresholds.
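The class-assignment rule in Equation (6) maps each gray level to the interval it falls in, which is exactly what NumPy's `digitize` computes. A minimal sketch (the function name `classify` and the sample thresholds are ours, for illustration):

```python
import numpy as np

def classify(img, thresholds):
    """Label each pixel with its class index under Eq. (6):
    class 0 holds p < th_1, class i holds th_i <= p < th_{i+1},
    and the last class holds p >= th_n."""
    return np.digitize(img, bins=np.sort(thresholds))

img = np.array([[10, 100], [180, 250]], dtype=np.uint8)
labels = classify(img, [64, 128, 192])   # three thresholds -> four classes
```

Segmenting then amounts to replacing each pixel by a representative value of its class, such as its class mean or the threshold itself, as in Section 4.2.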

4.1. Minimum Cross Entropy

The concept of directed divergence, also known as cross-entropy (CE), was introduced by Kullback in 1968 [26]. CE is an information-theoretic (IT) measure used to quantify the information difference between two probability distributions (PDs). The following expression defines CE:
D(B, C) = ∑_{k=1}^{N} b_k log( b_k / c_k )    (7)
From Equation (7), let B = { b 1 , b 2 , , b N } and C = { c 1 , c 2 , , c N } represent two distinct probability distributions (PDs) of the same set. The information-theoretic (IT) distance between B and C is denoted by D. The Minimum Cross-Entropy (MCE) quantifies the statistical difference in uncertainty related to the experimental outcomes transmitted from C to B. In image segmentation, the prior distribution B represents the available knowledge about the “correct” solution. The Minimum Cross-Entropy Thresholding (MCET) method [26] applies MCE to image thresholding by minimizing the cross-entropy between the original image and the thresholded image. A lower cross-entropy value indicates reduced uncertainty and greater homogeneity [27]. In image thresholding and segmentation, a set of thresholds B = { t h 1 , t h 2 , , t h n t } is selected from the image histogram h. The threshold values t h are chosen to minimize the MCET result between the thresholded image I t h and the original image I o r . For example, in the case of single-threshold segmentation ( t h = t h 1 ), the thresholded image I t h is processed as follows:
I_th(i, j) = μ(1, th_1) if I_or(i, j) < th_1
I_th(i, j) = μ(th_1, L + 1) if I_or(i, j) ≥ th_1    (8)
The term L = 255 represents the maximum intensity value in the histogram of an 8-bit grayscale image. Equation (8) can be extended to accommodate multiple thresholds ( th ). Given the normalized histogram h of the original image, the value μ for a specific range bounded by a and b is calculated as follows:
μ(a, b) = ∑_{i=a}^{b−1} i h(i) / ∑_{i=a}^{b−1} h(i)    (9)
To ensure accurate image segmentation, the Minimum Cross-Entropy Thresholding (MCET) method must be applied to digital images. The MCET is computed for a single threshold using the approach developed by [22], as shown in the following equation:
D(th) = ∑_{i=1}^{th−1} i h(i) log( i / μ(1, th) ) + ∑_{i=th}^{L} i h(i) log( i / μ(th, L + 1) )    (10)
The goal is to find the set of thresholds that minimizes the cross-entropy:
th_opt = arg min_{th} D(th)    (11)
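Equations (9)-(11) can be prototyped in a few lines. In the sketch below, the name `mcet_single` is ours, gray levels are shifted to start at 1 so that i·log(i) is defined at level 0 (a common implementation convention), and exhaustive search stands in for the optimizer; the paper replaces this search with GSM, which scales to the multilevel case where exhaustive search becomes infeasible.

```python
import numpy as np

def mcet_single(hist):
    """Single-threshold MCET by exhaustive search, Eqs. (9)-(11).

    hist[i] is the count of gray level i; levels are indexed 1..L
    internally so that every term i * log(i) is well defined.
    """
    h = np.asarray(hist, dtype=float)
    L = len(h)
    levels = np.arange(1, L + 1)             # 1-based gray levels

    def mu(a, b):                            # Eq. (9), over levels [a, b)
        seg = slice(a - 1, b - 1)
        s = h[seg].sum()
        return (levels[seg] * h[seg]).sum() / s if s > 0 else 1.0

    def D(th):                               # Eq. (10)
        lo, hi = slice(0, th - 1), slice(th - 1, L)
        d = 0.0
        if h[lo].sum() > 0:
            d += (levels[lo] * h[lo] * np.log(levels[lo] / mu(1, th))).sum()
        if h[hi].sum() > 0:
            d += (levels[hi] * h[hi] * np.log(levels[hi] / mu(th, L + 1))).sum()
        return d

    return min(range(2, L), key=D)           # Eq. (11): argmin over th
```

On a bimodal histogram the minimizer falls between the two modes, which is the behavior MCET is designed to produce.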

4.2. Gaslike Social Motility for Image Thresholding

In this paper, the GSM algorithm is applied as an efficient method to determine optimal threshold values for multilevel thresholding (MTH), which is essential for image segmentation. The process begins by loading a grayscale image ( I G r ) and generating its corresponding histogram ( h G r ). The parameters, including the population size (r), maximum number of iterations (G), and number of thresholds ( n t h ), are then specified. The initial population ( r i ) is randomly generated, with each particle representing a potential threshold value within the range of 0 to 255. The fitness value for each particle is calculated by evaluating the thresholds using the MCET function (Equation (12)). Through iterative refinement, the positions with the minimum fitness values are identified, and the solutions are improved until the stopping criteria are met. The best position among the population of particles is selected as the threshold vector ( t h ). Finally, the segmented image ( I s ) is produced using the threshold vector t h . One of the most commonly used segmentation rules for two thresholds is defined as follows:
I_s(x, y) = I_Gr(x, y) if I_Gr(x, y) < th_1
I_s(x, y) = th_1 if th_1 ≤ I_Gr(x, y) < th_2
I_s(x, y) = I_Gr(x, y) if I_Gr(x, y) ≥ th_2
Here, I s represents the gray value of the segmented image, while I G r ( x , y ) denotes the gray value of the original image at pixel position ( x , y ) . The thresholds t h 1 and t h 2 are obtained using the Gaslike Social Motility optimization algorithm. Each threshold t h corresponds to a decision variable within the population, which is defined as follows:
r = [ th^1 , th^2 , … , th^N ] ,  th^i = [ th_1 , th_2 , … , th_{n_t} ]
Here, N represents the size of the particle population, and n_t denotes the number of thresholds carried by each particle th^i, with i = 1, 2, …, N.

4.3. Experimental Results of Segmentation

This section presents the experimental results of the Gaslike Social Motility optimization algorithm. A set of nine images from the USC-SIPI Image Database (Miscellaneous section) [28] was used for the experiments. The results are summarized in Table 8, Table 9, Table 10, Table 11, Table 12 and Table 13. The performance of the proposed method was evaluated using the best fitness value together with seven quality metrics: Root Mean Square Error (RMSE) [29], Peak Signal-to-Noise Ratio (PSNR) [30], Structural Similarity Index (SSIM) [31], Feature Similarity Index (FSIM) [32], Haar Wavelet-based Perceptual Similarity Index (HPSI) [33], Quality Index based on Local Variance (QILV) [34], and Universal Image Quality Index (UIQI) [35]. These metrics are described in detail in Table 9. The fitness function, also known as the objective function, assesses the quality of candidate solutions and varies depending on the problem; it plays a critical role in verifying whether the algorithm converges to an optimal solution. A key aspect of image segmentation is the algorithm’s robustness in handling images with diverse grayscale intensity values. Figure 5 displays the nine test images converted to grayscale, along with their respective histograms. The variety of histograms reflects the heterogeneity of the images, enabling comprehensive testing of the proposed method. Additionally, the database provides a wide range of scenarios and challenges, ensuring that the algorithm’s robustness is thoroughly evaluated across different datasets.

4.4. Results and Discussion Related to the Segmentation Application

Table 7 presents the optimal threshold values obtained for the tested levels: 2, 4, 8, and 16. As the number of levels increases, the threshold values often change significantly. Table 8 displays the fitness results for all images, providing insights into the algorithm’s performance using measures of central tendency: Best Fitness, Mean, and Standard Deviation of the fitness values across experiments. Additionally, Table 7 includes the corresponding values for PSNR, RMSE, SSIM, FSIM, UIQI, HPSI, and QILV. The comparison was conducted using 35 independent runs for each image and four threshold levels (2, 4, 8, and 16). This resulted in seven metrics (mean values over the 35 runs) per image across the nine test images. Higher values of PSNR, SSIM, FSIM, HPSI, QILV, and UIQI indicate better segmentation quality, while lower RMSE values signify improved performance. Table 10, Table 11, Table 12, Table 13, Table 14 and Table 15 present the segmentation results and their corresponding histograms. Overall, the GSM algorithm demonstrates an exceptional balance between solution quality and computational efficiency, requiring only 50 particles and a maximum of 100 iterations. This performance is competitive compared to similar works [36,37]. The synergy between GSM and MCET brings the method closer to real-time application, highlighting its practical potential.

5. Conclusions and Discussion

Metaheuristic algorithms play a crucial role in solving optimization problems where traditional methods fall short. In recent studies, metaheuristic algorithms have gained prominence in image segmentation due to their ability to frame segmentation as an optimization problem, offering solutions with low computational cost and avoiding the pitfalls of local minima. This work introduces the GSM algorithm, a population-based metaheuristic designed for optimization problems. The GSM algorithm employs a movement adjustment scheme inspired by Gaslike Social Motility, leveraging its social behavior to balance exploration and exploitation for optimal value search. The GSM algorithm was tested on 22 benchmark functions, including unimodal, multimodal, separable, and non-separable functions of varying dimensions. Its performance was compared against the PSO, DE, BA, ABC, AHA, AHA-AO, CBO, ECBO, and SNS algorithms. The GSM algorithm outperformed most competing algorithms across the majority of test functions and demonstrated superior computational efficiency compared to other population-based metaheuristics. Overall, the GSM algorithm proves to be a strong candidate for solving online optimization problems.
Additionally, the algorithm successfully performed threshold segmentation. The algorithm demonstrates strong performance, requiring neither a large population nor excessive iterations to find effective solutions. Its balanced approach to exploration and exploitation enables it to identify optimal solutions efficiently, resulting in minimal error between the model’s output and the actual experimental data. The algorithm’s effectiveness as both an optimizer and a tool for threshold segmentation has been successfully validated. In future work, we plan to extend its application to online parametric identification and adaptive controllers. Furthermore, we aim to explore its potential in other optimization tasks.

Author Contributions

Investigation, O.D.S.; Methodology, A.Y.A.; Software, O.D.S. and A.V.-G.; Supervision, A.V.-G. and A.Y.A.; Visualization, O.D.S.; Writing—original draft, O.D.S., L.M.R. and E.R.-H.; Writing—review and editing; Development of the GSM algorithm, O.D.S. and L.M.R. All authors have read and agreed to the published version of the manuscript.

Funding

This research received funding from Universidad de Guadalajara.

Data Availability Statement

Data are available upon request to the corresponding author.

Acknowledgments

The authors thank the Universidad Autónoma de Guadalajara for giving us the support to develop this research. We also thank SECIHTI for the financing provided in the project.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. García, S.; Molina, D.; Lozano, M.; Herrera, F. A study on the use of non-parametric tests for analyzing the evolutionary algorithms’ behaviour: A case study on the CEC’2005 special session on real parameter optimization. J. Heuristics 2009, 15, 617. [Google Scholar] [CrossRef]
  2. Gandomi, A.H. Interior search algorithm (ISA): A novel approach for global optimization. ISA Trans. 2014, 53, 1168–1183. [Google Scholar] [PubMed]
  3. Yu, Y.; Gao, S.; Wang, Y.; Todo, Y. Global optimum-based search differential evolution. IEEE/CAA J. Autom. Sin. 2019, 6, 379–394. [Google Scholar] [CrossRef]
  4. Gu, W.; Yu, Y.; Hu, W. Artificial bee colony algorithm-based parameter estimation of fractional-order chaotic system with time delay. IEEE/CAA J. Autom. Sin. 2017, 4, 107–113. [Google Scholar] [CrossRef]
  5. Roy, P.; Mahapatra, G.S.; Dey, K.N. Forecasting of software reliability using neighborhood fuzzy particle swarm optimization based novel neural network. IEEE/CAA J. Autom. Sin. 2019, 6, 1365–1383. [Google Scholar] [CrossRef]
  6. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar]
  7. Ji, J.; Song, S.; Tang, C.; Gao, S.; Tang, Z.; Todo, Y. An artificial bee colony algorithm search guided by scale-free networks. Inf. Sci. 2019, 473, 142–165. [Google Scholar] [CrossRef]
  8. Zhao, W.; Wang, L.; Mirjalili, S. Artificial hummingbird algorithm: A new bio-inspired optimizer with its engineering applications. Comput. Methods Appl. Mech. Eng. 2022, 388, 114194. [Google Scholar]
  9. Kaveh, A.; Mahdavi, V.R. Colliding bodies optimization: A novel meta-heuristic method. Comput. Struct. 2014, 139, 18–27. [Google Scholar]
  10. Kaveh, A.; Ghazaan, M.I. Enhanced colliding bodies optimization for design problems with continuous and discrete variables. Adv. Eng. Softw. 2014, 77, 66–75. [Google Scholar]
  11. Elaziz, M.A.; Dahou, A.; El-Sappagh, S.; Mabrouk, A.; Gaber, M.M. AHA-AO: Artificial Hummingbird Algorithm with Aquila Optimization for Efficient Feature Selection in Medical Image Classification. Appl. Sci. 2022, 12, 9710. [Google Scholar] [CrossRef]
  12. Talatahari, S.; Bayzidi, H.; Saraee, M. Social network search for global optimization. IEEE Access 2021, 9, 92815–92863. [Google Scholar]
  13. Doğan, B.; Ölmez, T. A new metaheuristic for numerical function optimization: Vortex Search algorithm. Inf. Sci. 2015, 293, 125–145. [Google Scholar] [CrossRef]
  14. Askarzadeh, A. Bird mating optimizer: An optimization algorithm inspired by bird mating strategies. Commun. Nonlinear Sci. Numer. Simul. 2014, 19, 1213–1228. [Google Scholar] [CrossRef]
  15. Ljung, L. System Identification—Theory for the User, 2nd ed.; PTR Prentice Hall: Upper Saddle River, NJ, USA, 1999; Volume 101. [Google Scholar]
  16. Ursem, R.K.; Vadstrup, P. Parameter identification of induction motors using differential evolution. In Proceedings of the 2003 Congress on Evolutionary Computation, 2003. CEC’03, Canberra, Australia, 8–12 December 2003; Volume 2, pp. 790–796. [Google Scholar] [CrossRef]
  17. Tang, K.; Yuan, X.; Sun, T.; Yang, J.; Gao, S. An improved scheme for minimum cross entropy threshold selection based on genetic algorithm. Knowl.-Based Syst. 2011, 24, 1131–1138. [Google Scholar] [CrossRef]
  18. Zhan, C.; Situ, W.; Yeung, L.F.; Tsang, P.W.M.; Yang, G. A parameter estimation method for biological systems modelled by ode/dde models using spline approximation and differential evolution algorithm. IEEE/ACM Trans. Comput. Biol. Bioinform. 2014, 11, 1066–1076. [Google Scholar] [CrossRef]
  19. Parravano, A.; Reyes, L. Gaslike model of social motility. Phys. Rev. E 2008, 78, 026120. [Google Scholar] [CrossRef]
  20. Karaboga, D.; Akay, B. A comparative study of artificial bee colony algorithm. Appl. Math. Comput. 2009, 214, 108–132. [Google Scholar] [CrossRef]
  21. Yang, X.S.; Cui, Z.; Xiao, R.; Gandomi, A.H.; Karamanoglu, M. Swarm Intelligence and Bio-Inspired Computation: Theory and Applications; Newnes: New South Wales, Australia, 2013. [Google Scholar]
  22. Dall’Igna Júnior, A.; Silva, R.S.; Mundim, K.C.; Dardenne, L.E. Performance and parameterization of the algorithm simplified generalized simulated annealing. Genet. Mol. Biol. 2004, 27, 616–622. [Google Scholar] [CrossRef]
  23. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95—International Conference on Neural Networks, Perth, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  24. Storn, R.; Price, K. Differential evolution—A simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar]
  25. Yang, X.S. Chapter 10-Bat Algorithms. In Nature-Inspired Optimization Algorithms; Yang, X.S., Ed.; Elsevier: Oxford, UK, 2014; pp. 141–154. [Google Scholar] [CrossRef]
  26. Kullback, S. Information Theory and Statistics; Dover Publications: Garden City, NY, USA, 1968; p. 399. [Google Scholar]
  27. Rodríguez-Esparza, E.; Zanella-Calzada, L.A.; Oliva, D.; Heidari, A.A.; Zaldivar, D.; Pérez-Cisneros, M.; Foong, L.K. An efficient Harris hawks-inspired image segmentation method. Expert Syst. Appl. 2020, 155, 113428. [Google Scholar] [CrossRef]
  28. USC-SIPI. The USC-SIPI Image Database: Miscellaneous Volume, Image 36. 2025. Available online: https://sipi.usc.edu/database/database.php?volume=misc (accessed on 28 February 2025).
  29. Oh, I.S.; Lee, J.S.; Moon, B.R. Hybrid genetic algorithms for feature selection. IEEE Trans. Pattern Anal. Mach. Intell. 2004, 26, 1424–1437. [Google Scholar] [CrossRef]
  30. Avcibas, I.; Sankur, B.; Sayood, K. Statistical evaluation of image quality measures. J. Electron. Imaging 2002, 11, 206–223. [Google Scholar] [CrossRef]
  31. Wang, Z.; Bovik, A.; Sheikh, H.; Simoncelli, E. Image Quality Assessment: From Error Visibility to Structural Similarity. IEEE Trans. Image Process. 2004, 13, 600–612. [Google Scholar] [CrossRef] [PubMed]
  32. Zhang, L.; Zhang, L.; Mou, X.; Zhang, D. FSIM: A Feature Similarity Index for Image Quality Assessment. IEEE Trans. Image Process. 2011, 20, 2378–2386. [Google Scholar] [CrossRef]
  33. Reisenhofer, R.; Bosse, S.; Kutyniok, G.; Wiegand, T. A Haar wavelet-based perceptual similarity index for image quality assessment. Signal Process. Image Commun. 2018, 61, 33–43. [Google Scholar] [CrossRef]
  34. Aja-Fernández, S.; Estépar, R.S.J.; Alberola-López, C.; Westin, C.F. Image quality assessment based on local variance. In Proceedings of the Annual International Conference of the IEEE Engineering in Medicine and Biology-Proceedings, New York, NY, USA, 30 August–3 September 2006; pp. 4815–4818. [Google Scholar] [CrossRef]
  35. Wang, Z.; Bovik, A.C. A universal image quality index. IEEE Signal Process. Lett. 2002, 9, 81–84. [Google Scholar] [CrossRef]
  36. Aranguren, I.; Valdivia, A.; Morales-Castañeda, B.; Oliva, D.; Elaziz, M.A.; Perez-Cisneros, M. Improving the segmentation of magnetic resonance brain images using the LSHADE optimization algorithm. Biomed. Signal Process. Control 2021, 64, 102259. [Google Scholar] [CrossRef]
  37. Oliva, D.; Hinojosa, S.; Osuna-Enciso, V.; Cuevas, E.; Pérez-Cisneros, M.; Sanchez-Ante, G. Image segmentation by minimum cross entropy using evolutionary methods. Soft Comput. 2019, 23, 431–450. [Google Scholar]
Figure 1. Selection of neighbors of the particle r t i in the search space.
Figure 2. The direction of displacement of a particle r t i according to affinity with its neighbors.
Figure 3. Angular distribution of the displacement of the particle r t i according to the best positions visited by the neighbors.
Figure 4. Displacement of the particle r t i with contribution of the best global position and the contribution provided by the neighbors, scaled by the parameter γ .
Figure 5. Set of images from the dataset [28].
Table 1. Summary of the parameters and variables in the Gaslike Social Motility (GSM) algorithm.
Symbol/ParameterMeaning
NNumber of particles in the population
DDimensionality of the search space
GMaximum number of generations
tIteration number
iIndex of a particle in the population ( i = 1 , 2 , , N )
jIndex of a neighboring particle
x t i State (mood) of particle i at time t
r t i Position of particle i in a d-dimensional space
η t i Neighborhood of particle i at time t
| η t i | Cardinality (size) of the neighborhood η t i
RInteraction radius (defines the neighborhood size)
ϵ Coupling strength
γ Coupling factor
r B j Best-known position of particle j
r N Random displacement vector generated with a normal distribution
σ b s Standard deviation of the best positions of b s particles
f ( · ) Objective function being optimized
N ( · ) Normal distribution function used for randomness
x t + 1 i j η t i x t + 1 j Affinity factor determining the direction of movement
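The parameters in Table 1 that a user sets before running GSM can be collected in a small configuration object. The sketch below is purely illustrative; the default values are placeholders, not the settings used in the paper:

```python
from dataclasses import dataclass

@dataclass
class GSMConfig:
    """User-set GSM parameters from Table 1 (defaults are illustrative placeholders)."""
    N: int = 30           # number of particles in the population
    D: int = 2            # dimensionality of the search space
    G: int = 100          # maximum number of generations
    R: float = 1.0        # interaction radius (defines the neighborhood size)
    epsilon: float = 0.5  # coupling strength
    gamma: float = 0.5    # coupling factor scaling the neighbor contribution
```

A dataclass keeps the parameter list explicit and makes experiment configurations easy to log and reproduce.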
Table 9. Quality metrics employed to assess the performance of GSM in a segmentation application.
Metric | Formulation | Description
1. Peak Signal to Noise Ratio (PSNR) | $\mathrm{PSNR} = 20 \log_{10}\!\left(\frac{255}{\mathrm{RMSE}}\right)$ | Measures pixel-level fidelity between the original image and the processed image through the RMSE.
2. Structural Similarity Index (SSIM) | $\mathrm{SSIM} = \frac{(2\mu_{I_r}\mu_{I_s} + C_1)(2\sigma_{I_r I_s} + C_2)}{(\mu_{I_r}^2 + \mu_{I_s}^2 + C_1)(\sigma_{I_r}^2 + \sigma_{I_s}^2 + C_2)}$ | Measures the similarity of structural information between the original image and the processed image.
3. Feature Similarity Index (FSIM) | $\mathrm{FSIM} = \frac{\sum_{X \in \Omega} S_L(X)\, PC_m(X)}{\sum_{X \in \Omega} PC_m(X)}$ | Combines phase congruency and gradient magnitude to characterize the local quality of the image.
4. Haar wavelet-based Perceptual Similarity Index (HPSI) | $\mathrm{HPSI} = l_\alpha^{-1}\!\left(\frac{\sum_x \sum_{k=1}^{2} \mathrm{HS}_{f_1 f_2}^{k}[x] \cdot W_{f_1 f_2}^{k}[x]}{\sum_x \sum_{k=1}^{2} W_{f_1 f_2}^{k}[x]}\right)^{2}$ | Appraises the perceptual similarity between a reference image and a distorted image.
5. Quality Index based on Local Variance (QILV) | $\mathrm{QILV} = \frac{2\mu_{V_I}\mu_{V_J}}{\mu_{V_I}^2 + \mu_{V_J}^2} \cdot \frac{2\sigma_{V_I}\sigma_{V_J}}{\sigma_{V_I}^2 + \sigma_{V_J}^2} \cdot \frac{\sigma_{V_I V_J}}{\sigma_{V_I}\sigma_{V_J}}$ | Focuses on the image structure to appraise changes in the non-stationary behavior of images.
6. Universal Image Quality Index (UIQI) | $\mathrm{UIQI} = \frac{4\sigma_{xy}\,\bar{x}\,\bar{y}}{(\sigma_x^2 + \sigma_y^2)\left((\bar{x})^2 + (\bar{y})^2\right)}$ | Evaluates image distortion as a combination of correlation loss, luminance distortion, and contrast distortion.
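The PSNR entry in Table 9 follows directly from its definition. A minimal sketch, assuming 8-bit grayscale images and a pixel-wise RMSE:

```python
import numpy as np

def psnr(reference: np.ndarray, processed: np.ndarray) -> float:
    """Peak Signal to Noise Ratio for 8-bit grayscale images (Table 9, metric 1)."""
    diff = reference.astype(np.float64) - processed.astype(np.float64)
    rmse = np.sqrt(np.mean(diff ** 2))
    return 20.0 * np.log10(255.0 / rmse)
```

Higher PSNR indicates a segmented image closer to the original; identical images yield a zero RMSE (infinite PSNR), so callers may want to guard that case.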
Table 10. Best threshold values.
Image | Th | Thresholds
Boat | 2 | 69 132
Boat | 4 | 46 87 131 167
Boat | 8 | 24 41 61 86 113 136 157 186
Boat | 16 | 16 25 35 47 63 79 95 112 127 139 150 162 176 195 207 214
House | 2 | 81 151
House | 4 | 54 87 129 180
House | 8 | 41 59 84 106 127 154 189 222
House | 16 | 26 41 58 72 86 99 106 112 120 133 150 167 182 196 212 225
Airplane | 2 | 94 160
Airplane | 4 | 62 106 149 192
Airplane | 8 | 40 62 84 103 122 150 180 204
Airplane | 16 | 17 22 31 43 58 74 87 100 112 124 138 156 175 191 203 213
Lake | 2 | 73 141
Lake | 4 | 56 90 142 194
Lake | 8 | 40 54 70 93 122 154 177 202
Lake | 16 | 7 28 38 46 53 62 74 86 98 112 129 147 165 180 194 213
Tank | 2 | 61 102
Tank | 4 | 43 76 101 123
Tank | 8 | 32 50 64 77 90 105 120 133
Tank | 16 | 30 46 56 61 69 78 85 93 102 109 118 123 130 136 142 153
Couple | 2 | 73 134
Couple | 4 | 37 82 123 160
Couple | 8 | 19 41 65 89 112 132 154 183
Couple | 16 | 7 16 29 42 54 64 76 89 100 110 121 132 145 159 177 203
Peppers | 2 | 51 123
Peppers | 4 | 35 74 115 161
Peppers | 8 | 14 30 56 80 102 125 150 177
Peppers | 16 | 14 24 30 42 53 63 71 80 90 104 118 131 147 162 174 192
Truck | 2 | 92 134
Truck | 4 | 68 94 121 150
Truck | 8 | 49 67 82 98 115 134 150 163
Truck | 16 | 36 41 49 57 65 74 81 88 95 102 109 119 128 139 149 163
Hunter | 2 | 73 128
Hunter | 4 | 62 98 130 162
Hunter | 8 | 53 73 95 118 141 165 181 202
Hunter | 16 | 41 47 57 68 74 81 89 100 111 122 133 145 158 170 191 223
Table 11. Comparative study of the fitness values, considering MCET as objective function.
Image | Th | Best Fitness | Mean Fitness | Std
Boat | 2 | −645.5921485 | −645.5920 | 0.0004
Boat | 4 | −646.4887625 | −646.4847 | 0.0137
Boat | 8 | −646.9336700 | −646.9192 | 0.0149
Boat | 16 | −647.0805553 | −647.0692 | 0.0087
House | 2 | −689.0970058 | −689.0970 | 0.0000
House | 4 | −689.5663172 | −689.5309 | 0.0792
House | 8 | −689.7855957 | −689.7774 | 0.0145
House | 16 | −689.8643191 | −689.8633 | 0.0008
Airplane | 2 | −934.9338252 | −934.9338 | 0.0000
Airplane | 4 | −935.4267241 | −935.4259 | 0.0019
Airplane | 8 | −935.6618385 | −935.6541 | 0.0061
Airplane | 16 | −935.7293579 | −935.7130 | 0.0211
Lake | 2 | −622.7141857 | −622.7141 | 0.0001
Lake | 4 | −623.5186087 | −623.5182 | 0.0006
Lake | 8 | −623.9466955 | −623.9303 | 0.0192
Lake | 16 | −624.0848274 | −624.0810 | 0.0035
Tank | 2 | −509.474399 | −509.4741 | 0.0008
Tank | 4 | −509.929904 | −509.9279 | 0.0038
Tank | 8 | −510.113068 | −510.1014 | 0.0098
Tank | 16 | −510.174674 | −510.8449 | 0.0091
Couple | 2 | −593.056607 | −593.0566 | 0.0001
Couple | 4 | −594.1729886 | −594.1693 | 0.0081
Couple | 8 | −594.676015 | −594.6559 | 0.0311
Couple | 16 | −594.8555549 | −594.8412 | 0.0213
Peppers | 2 | −574.6142212 | −574.6142 | 0.0001
Peppers | 4 | −575.8100308 | −575.8100 | 0.0000
Peppers | 8 | −576.2891242 | −576.2821 | 0.0114
Peppers | 16 | −576.4626408 | −576.4535 | 0.0139
Truck | 2 | −617.7267227 | −617.7267 | 1.20 × 10^−6
Truck | 4 | −618.1589863 | −618.1575 | 0.0061
Truck | 8 | −618.3618319 | −618.3568 | 0.0059
Truck | 16 | −618.430783 | −618.6841 | 0.0089
Hunter | 2 | −541.6263 | −541.6263 | 0.0000
Hunter | 4 | −542.4250 | −542.4186 | 0.0078
Hunter | 8 | −542.7437 | −542.7212 | 0.0167
Hunter | 16 | −542.8376 | −542.9151 | 0.0218
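Table 11 uses MCET as the objective function. One common formulation of the multilevel minimum cross-entropy criterion (following the Li–Lee construction, with the threshold-independent term dropped and gray levels shifted by one to avoid log 0) can be sketched as below; this is an illustrative assumption and may differ in detail from the paper's exact objective:

```python
import numpy as np

def mcet(hist: np.ndarray, thresholds: list) -> float:
    """Multilevel minimum cross-entropy criterion (Li-Lee style sketch).

    hist: 256-bin grayscale histogram; thresholds: sorted interior thresholds.
    Returns the value to be minimized over candidate threshold vectors.
    """
    levels = np.arange(1, 257, dtype=np.float64)  # gray level i mapped to i + 1
    bounds = [0] + sorted(thresholds) + [256]
    total = 0.0
    for k in range(len(bounds) - 1):
        h = hist[bounds[k]:bounds[k + 1]]
        g = levels[bounds[k]:bounds[k + 1]]
        m0, m1 = h.sum(), (g * h).sum()  # zeroth and first moments of the class
        if m0 > 0:
            total -= m1 * np.log(m1 / m0)  # -m1 * log(class mean)
    return total
```

A metaheuristic such as GSM would search over threshold vectors to minimize this quantity, which matches the negative fitness values reported in the table only up to the exact formulation used.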
Table 12. Comparative study of metrics values from all benchmark images.
Image | Th | PSNR | SSIM | FSIM | UQI | RMSE | QILV | MSSIM | HPSI
Boat | 2 | 16.9371 | 0.5508 | 0.7702 | 0.1563 | 36.2821 | 0.7236 | 0.1563 | 0.3955
Boat | 4 | 20.3654 | 0.6687 | 0.8802 | 0.2900 | 24.4899 | 0.8911 | 0.2901 | 0.5982
Boat | 8 | 25.4429 | 0.8264 | 0.9517 | 0.5351 | 13.6615 | 0.9666 | 0.5352 | 0.8110
Boat | 16 | 30.6335 | 0.9258 | 0.9866 | 0.7583 | 7.5424 | 0.9913 | 0.7584 | 0.9362
House | 2 | 14.9671 | 0.6753 | 0.7788 | 0.0500 | 2071.8931 | 0.6833 | 0.2843 | 0.3644
House | 4 | 19.5605 | 0.7899 | 0.8624 | 0.1159 | 740.7460 | 0.9100 | 0.3431 | 0.5435
House | 8 | 23.4771 | 0.8587 | 0.9438 | 0.2904 | 297.2443 | 0.9509 | 0.4731 | 0.7836
House | 16 | 29.0015 | 0.9532 | 0.9847 | 0.5398 | 83.3580 | 0.9838 | 0.6590 | 0.9283
Airplane | 2 | 14.9286 | 0.7438 | 0.8022 | 0.1032 | 2090.3780 | 0.7681 | 0.1035 | 0.4623
Airplane | 4 | 21.0794 | 0.8084 | 0.8852 | 0.1946 | 508.0532 | 0.9500 | 0.1949 | 0.6426
Airplane | 8 | 26.1944 | 0.8579 | 0.9524 | 0.3390 | 157.3058 | 0.9865 | 0.3393 | 0.8209
Airplane | 16 | 30.1763 | 0.9126 | 0.9770 | 0.4675 | 66.3233 | 0.9946 | 0.4677 | 0.9091
Lake | 2 | 14.3312 | 0.5314 | 0.8317 | 0.1563 | 2398.6394 | 0.6421 | 0.1564 | 0.4803
Lake | 4 | 17.9617 | 0.6326 | 0.8793 | 0.2724 | 1039.7756 | 0.8951 | 0.2725 | 0.6111
Lake | 8 | 24.3399 | 0.8449 | 0.9504 | 0.4995 | 239.4792 | 0.9655 | 0.4996 | 0.8043
Lake | 16 | 29.8927 | 0.9218 | 0.9812 | 0.6992 | 67.1123 | 0.9917 | 0.6993 | 0.9279
Tank | 2 | 19.8834 | 0.6808 | 0.8593 | 0.1943 | 25.8458 | 0.7768 | 0.1944 | 0.5178
Tank | 4 | 24.5029 | 0.7927 | 0.9366 | 0.4270 | 15.1899 | 0.9486 | 0.4271 | 0.7358
Tank | 8 | 29.5175 | 0.9006 | 0.9756 | 0.6638 | 8.5439 | 0.9824 | 0.6638 | 0.8851
Tank | 16 | 33.8805 | 0.9519 | 0.9880 | 0.8035 | 5.7374 | 0.9928 | 0.8035 | 0.9471
Couple | 2 | 16.2978 | 0.5452 | 0.7579 | 0.1868 | 1525.1090 | 0.7125 | 0.1868 | 0.3767
Couple | 4 | 20.3973 | 0.7145 | 0.8800 | 0.3299 | 593.4222 | 0.8733 | 0.3300 | 0.5784
Couple | 8 | 24.9609 | 0.8420 | 0.9491 | 0.5209 | 210.6913 | 0.9488 | 0.5210 | 0.7804
Couple | 16 | 30.0299 | 0.9283 | 0.9837 | 0.7240 | 67.2588 | 0.9852 | 0.7241 | 0.9213
Peppers | 2 | 15.5288 | 0.5553 | 0.7518 | 0.0830 | 1821.4522 | 0.5997 | 0.0902 | 0.3487
Peppers | 4 | 20.4259 | 0.6570 | 0.8507 | 0.1973 | 589.5394 | 0.8832 | 0.2040 | 0.5321
Peppers | 8 | 25.0583 | 0.7940 | 0.9376 | 0.4326 | 203.0088 | 0.9654 | 0.4374 | 0.7698
Peppers | 16 | 30.1764 | 0.9200 | 0.9754 | 0.6999 | 64.8876 | 0.9865 | 0.7025 | 0.9044
Truck | 2 | 15.9196 | 0.5196 | 0.7598 | 0.2043 | 40.7937 | 0.8685 | 0.2044 | 0.4040
Truck | 4 | 21.6197 | 0.7376 | 0.8932 | 0.4708 | 21.1878 | 0.9584 | 0.4709 | 0.6488
Truck | 8 | 27.7703 | 0.8962 | 0.9679 | 0.7406 | 10.4357 | 0.9874 | 0.7406 | 0.8707
Truck | 16 | 31.0660 | 0.9179 | 0.9705 | 0.8110 | 9.7715 | 0.9857 | 0.8110 | 0.8932
Hunter | 2 | 16.1755 | 0.4927 | 0.7905 | 0.1546 | 1568.6721 | 0.7187 | 0.1545 | 0.4201
Hunter | 4 | 19.3444 | 0.6266 | 0.8849 | 0.3049 | 756.7669 | 0.9133 | 0.3048 | 0.6233
Hunter | 8 | 21.7507 | 0.7324 | 0.9318 | 0.4645 | 434.6091 | 0.9756 | 0.4643 | 0.7592
Hunter | 16 | 24.1840 | 0.8143 | 0.9124 | 0.5861 | 355.7702 | 0.9441 | 0.5859 | 0.7080
Table 13. Segmented images (Boat, Tank, and House) with proposed GSM algorithm.
Th = 2 | Th = 4 | Th = 8 | Th = 16
[Segmented image panels not reproduced in text.]
Table 14. Segmented images (Hunter, Jetplane, and Peppers) with the proposed GSM algorithm.
Th = 2 | Th = 4 | Th = 8 | Th = 16
[Segmented image panels not reproduced in text.]
Table 15. Segmented images (Couple, Truck, and Lake) with proposed GSM algorithm.
Th = 2 | Th = 4 | Th = 8 | Th = 16
[Segmented image panels not reproduced in text.]