Article

A Hybrid Multi-Strategy Optimization Metaheuristic Algorithm for Multi-Level Thresholding Color Image Segmentation

by
Amir Seyyedabbasi
Computer Engineering Department, Faculty of Engineering and Natural Science, Istinye University, 34396 Istanbul, Turkey
Appl. Sci. 2025, 15(13), 7255; https://doi.org/10.3390/app15137255
Submission received: 20 May 2025 / Revised: 16 June 2025 / Accepted: 21 June 2025 / Published: 27 June 2025
(This article belongs to the Topic Color Image Processing: Models and Methods (CIP: MM))

Abstract

Hybrid metaheuristic algorithms have been widely used to solve global optimization problems, making the concept of hybridization increasingly important. This study proposes a new hybrid multi-strategy metaheuristic algorithm named COSGO, which combines the strengths of grey wolf optimization (GWO) and Sand Cat Swarm Optimization (SCSO) to effectively address global optimization tasks. Additionally, a chaotic opposition-based learning strategy is incorporated to enhance the efficiency and global search capability of the algorithm. One of the main challenges in metaheuristic algorithms is premature convergence or getting trapped in local optima. To overcome this, the proposed strategy is designed to improve exploration and help the algorithm escape local minima. As a real-world application, multi-level thresholding for color image segmentation—a well-known problem in image processing—is studied. The COSGO algorithm is applied using two objective functions, Otsu’s method and Kapur’s entropy, to determine optimal multi-level thresholds. Experiments are conducted on 10 images from the widely used BSD500 dataset. The results show that the COSGO algorithm achieves competitive performance compared to other State-of-the-Art algorithms. To further evaluate its effectiveness, the CEC2017 benchmark functions are employed, and a Friedman ranking test is used to statistically analyze the results.

1. Introduction

Image segmentation aims to divide an image into several regions with similar features or semantic meanings for further processing and analysis [1]. This technique has attracted significant attention in various fields, including early disease detection [2], autonomous driving [3], and image processing applications [4]. To achieve accurate and robust segmentation, it is essential that the pixel characteristics within each region are consistent [1]. Threshold-based segmentation methods in image segmentation are typically divided into two categories [5]. Bi-level thresholding is a fundamental technique that segments an image into two separate classes by determining a single optimal threshold value. On the other hand, multi-level thresholding partitions the image into multiple segments by locating several threshold values.
Multi-threshold segmentation refers to the process of dividing an image into several object and background regions by determining multiple threshold values [6]. Existing image thresholding segmentation techniques primarily include two classical approaches: the Otsu method and the Kapur entropy method. The Kapur entropy approach selects thresholds based on the principles of information theory [1,6]. It operates by evaluating the entropy values of pixel distributions at various grey levels within the image. The threshold corresponding to the maximum entropy is chosen as the optimal point for image segmentation. The Otsu method is a popular technique for image segmentation that relies on analyzing greyscale distributions [1,6].
Its main strategy is to identify a threshold that separates the image into two groups, aiming to reduce the variance within each group as much as possible. By iteratively assessing intra-class variances, the method selects the threshold that yields the lowest total variance, leading to optimal binarization. The increase in the number of thresholds in multi-threshold image segmentation leads to an exponential rise in computational complexity, which in turn causes traditional multi-level thresholding techniques to suffer from reduced accuracy and slow convergence.
To address these limitations, metaheuristic algorithms have been employed in image segmentation tasks, demonstrating impressive performance. Many metaheuristic algorithms have been presented over the last few decades, and they are used to solve real-world problems of different dimensions, finding the best or near-optimal solution. According to the No-Free-Lunch (NFL) theorem [7], no single algorithm can solve all optimization problems equally well. A key difference between heuristic and metaheuristic algorithms lies in their ability to avoid becoming trapped in local optima: unlike heuristic algorithms, metaheuristic algorithms aim to find the global optimum or a near-optimal solution. They avoid local optima through the interplay of exploration and exploitation, which are the main operations of every metaheuristic algorithm, and the trade-off between these two phases is important. Exploration searches broadly across the search space to discover promising regions, while exploitation refines the best solutions found within those regions [8]. In most cases, the exploration ability of search agents should be emphasized in the initial iterations in order to locate promising areas, after which the discovered optima are exploited.
Metaheuristic algorithms can be divided into three categories: evolution-based, swarm-based, and physics-based. Evolutionary algorithms are largely inspired by natural laws and behaviors. The genetic algorithm (GA), based on Darwin's theory of evolution, is one such metaheuristic [9]. Differential evolution (DE) [10] is another evolution-based algorithm rooted in Darwinian theory, although it differs from GA in its selection phase. Algorithms inspired by swarms and societies in nature are called swarm-based algorithms; in these algorithms, members of a group interact with one another within the search space to find a solution. One of the most well-known algorithms in this category is particle swarm optimization (PSO) [11], which simulates the social behavior of flocking birds, with each particle in the swarm representing a possible solution. Other algorithms in this category include ant colony optimization (ACO) [12], grey wolf optimization [13], and Sand Cat Swarm Optimization [14]. Physics-based metaheuristic algorithms are derived from physical laws and phenomena, such as gravity and particle movement. Well-known examples include the Gravitational Search Algorithm (GSA) [15], Big-Bang Big-Crunch (BBBC) [16], and Charged System Search (CSS) [17].
In [5], a modified grey wolf optimizer (MGWO) is applied to multi-level thresholding color image segmentation to obtain optimal thresholds. The authors introduced strategies for leader selection and position updating and, in addition, used a mutation operator to guide search agents toward the optimum solution. In [1], the authors proposed ACOADE, a hybrid algorithm combining crayfish optimization with differential evolution (DE), for color multi-threshold image segmentation. The hybrid maximizes the foraging parameter of crayfish optimization and uses the core DE process to obtain the optimum threshold values, with the Otsu and Kapur entropy methods serving as objective functions. The authors of [18] proposed an improved whale optimization algorithm to overcome the limitations of the classical WOA in color multi-threshold image segmentation; this enhancement significantly improved segmentation accuracy and the robustness of the results.
In [19], the black widow optimization (BWO) algorithm is applied to the multi-level thresholding image segmentation problem. As in other studies in this field, the Otsu and Kapur entropy methods are used as the objective functions. Learning Enthusiasm-Based Teaching–Learning-Based Optimization (LebTLBO) is a recent metaheuristic known for its simplicity and low computational cost [20]. Its effectiveness in image segmentation is evaluated on the Berkeley Segmentation Dataset 500 (BSDS500), which provides diverse, high-quality images suitable for benchmarking, with objective functions such as Otsu's method and Kapur's entropy employed to assess segmentation performance. Improved elephant herding optimization (IEHO) integrates opposition-based learning and chaotic sequences to enhance the exploration–exploitation balance [21]; its efficiency is tested on multi-level thresholding (MTH) for image segmentation, using Kapur's entropy, Otsu's method, and Masi entropy as objective functions to determine optimal thresholds.
Recently, a novel 3D memristive cubic map employing dual discrete memristors (3D-MCM) has been proposed, exhibiting rich dynamical behaviors, including coexisting attractors, making it suitable for secure image encryption tasks [22]. To address the growing demand for secure and efficient image transmission, a novel color image encryption algorithm based on the 2D Logistic–Rulkov Neuron Map (2D-LRNM) was proposed, offering enhanced complexity and unpredictability through a hybrid chaotic model [23]. PSEQADE is a quantum differential evolutionary algorithm that uses a quantum-adaptive mutation strategy and a population state evaluation framework to address excessive mutation in QDE. The experimental results show excellent convergence performance, high accuracy, and stability in solving high-dimensional complex problems [24].
MGAD is a novel adaptive evolutionary multitask optimization algorithm designed to address challenges in evolutionary multitask optimization, including dynamic control, inaccurate task selection, negative knowledge transfer, and decreased optimization performance due to increased uncertainty [25]. The TSLS-SCSO algorithm, an enhanced version of Sand Cat Swarm Optimization (SCSO), is proposed for autonomous path planning in dynamic environments. It incorporates a population initialization strategy, spiral search, Livy flight, tent chaotic mapping, and sparrow alert mechanism, achieving higher accuracy and competitiveness [26].
The human parsing method proposed in [27] includes a PHFM-based domain transform strategy and the BHRN network structure: the input image is transformed into the PHFM domain for parsing and edge prediction, and an encoder–decoder structure enables mutual guidance learning across multiple stages, where a shared encoder enhances feature propagation while separate decoders predict the parsing and edge results. The study in [28] explores the use of chaotic systems in metaheuristic optimization techniques to improve search efficiency; the initial populations of the PSO and WSO algorithms are distributed according to Logistic, Chebyshev, Circle, Sine, and Piecewise chaotic maps.
The paper proposes a Polynomial Chebyshev Symmetric Chaotic-based GJO (PCSCGJO) algorithm to improve image segmentation in image processing. The PCSCGJO algorithm combines a chaotic generating function with the Golden Jackal Optimizer (GJO) algorithm, aiming to avoid local optima and improve segmentation results. The simulation results show the PCSCGJO method’s effectiveness in dealing with medical color images outperforms other metaheuristic algorithms in terms of quality and accuracy [29]. This paper proposes an improved Transit Search algorithm based on a chaotic map and local escape operator (CLTS). The algorithm introduces chaos initialization, simplifies transit criteria, improves convergence speed in the neighborhood phase, and balances exploration and exploitation. The experimental results show the CLTS algorithm achieves faster convergence speed and higher accuracy compared to the TS algorithm and other advanced algorithms. It also outperforms advanced image segmentation algorithms in terms of convergence and segmentation effects [30].
To improve the performance of metaheuristic algorithms, hybrid metaheuristic algorithms have been proposed. It is common to integrate two or more metaheuristic algorithms to solve optimization problems, with the primary objective of exploiting the advantages of each algorithm while minimizing its drawbacks. Hybrid metaheuristic algorithms generally fine-tune the balance between exploration and exploitation [31], and a hybrid algorithm is often proposed precisely to achieve this balance. The behavior of the convergence curve and the quality of the best solution during execution are essential for achieving good results: the algorithm samples the initial search space and then relies on its structure to steer the search toward the optimal solution. The diversity of solutions is also important here, and there is a relationship between solution diversity and convergence speed; fast convergence is often a symptom of low diversity. Most hybrid metaheuristic algorithms therefore maintain solution diversity, and the hybrid structure itself further promotes it [31,32]. The most important issue to avoid is becoming trapped in local optima.
This study makes the following major contributions:
  • A novel hybrid metaheuristic algorithm (COSGO) is proposed, which integrates grey wolf optimization (GWO) and Sand Cat Swarm Optimization (SCSO) to enhance global optimization performance.
  • The COSGO algorithm is specifically designed to overcome the local optima problem by improving exploration capabilities. The adaptive characteristics of SCSO are used to increase the diversity of GWO, thereby achieving a better balance between exploration and exploitation.
  • A chaotic opposition-based learning (COBL) strategy is incorporated into the hybrid metaheuristic algorithm to further improve convergence speed and solution quality by enhancing population diversity.
  • The proposed algorithm is applied to the multi-level thresholding color image segmentation problem, aiming to reduce computational effort in finding optimal threshold values. Both Otsu’s method and Kapur’s entropy are employed as objective functions.
  • Experimental validation is conducted using the BSD500 dataset (10 color images) to demonstrate the segmentation quality and effectiveness of the COSGO algorithm.
  • Comprehensive performance evaluation is carried out using the CEC2017 benchmark suite, and statistical significance is assessed using the Friedman ranking test, confirming the competitive performance of the proposed approach.

2. Fundamentals

An overview of the GWO algorithm is provided in the first part of this section. The second part explains the SCSO algorithm in detail. At the end of the section, the multi-level thresholding image segmentation problem is defined.

2.1. Grey Wolf Optimization (GWO)

The grey wolf optimization (GWO) algorithm was described by Mirjalili et al. [13]. GWO simulates the leadership hierarchy and social hunting behavior of grey wolves, which live and hunt in packs of usually 5–12 individuals. In each pack there are four kinds of grey wolves, namely, alpha (α), beta (β), delta (δ), and omega (ω), each with a different responsibility. The alpha (α) wolves have the strongest influence on the pack; the alpha makes decisions regarding hunting, sleeping locations, and wake-up times, and the rest of the group follows these decisions. Beta (β) wolves are on the second level of the pack hierarchy and act as co-leaders, supporting the alpha in decision making. Delta (δ) wolves are categorized at the third level of the hierarchy and must follow the instructions of the upper-level alpha and beta wolves. In the hierarchy of grey wolves, the omega (ω) wolf occupies the lowest position; a wolf that does not belong to any of the above categories is considered an omega. Omega wolves are the last allowed to eat, and it is important to note that if the omega is lost, the pack encounters internal fighting and problems.
The GWO algorithm consists of three phases: encircling, hunting, and attacking the prey. Each phase has a different mathematical model. The encircling model includes a number of parameters that allow the wolves to encircle their prey, after which the new position is updated. The following equations represent encircling the prey mathematically:
\[ \vec{D} = \lvert \vec{C} \cdot \vec{X}_p(t) - \vec{X}(t) \rvert \]
\[ \vec{X}(t+1) = \vec{X}_p(t) - \vec{A} \cdot \vec{D} \]
\[ \vec{A} = 2\vec{a} \cdot \vec{r}_1 - \vec{a} \]
\[ \vec{C} = 2 \cdot \vec{r}_2 \]
\[ a = 2\left(1 - \frac{t}{T}\right) \]
In this regard, A and C are coefficient vectors, and X and Xp are the position vectors of the grey wolf and the prey, respectively. Another important parameter is a, which decreases linearly from 2 to 0 over the iterations and guides the search agents toward the prey. As in most metaheuristic algorithms, some parameters are random numbers in [0, 1]; in the GWO algorithm, these are r1 and r2:
\[ \vec{D}_\alpha = \lvert \vec{C}_1 \cdot \vec{X}_\alpha - \vec{X} \rvert, \quad \vec{D}_\beta = \lvert \vec{C}_2 \cdot \vec{X}_\beta - \vec{X} \rvert, \quad \vec{D}_\delta = \lvert \vec{C}_3 \cdot \vec{X}_\delta - \vec{X} \rvert \]
\[ \vec{X}_1 = \vec{X}_\alpha - \vec{A}_1 \cdot \vec{D}_\alpha, \quad \vec{X}_2 = \vec{X}_\beta - \vec{A}_2 \cdot \vec{D}_\beta, \quad \vec{X}_3 = \vec{X}_\delta - \vec{A}_3 \cdot \vec{D}_\delta \]
\[ \vec{X}(t+1) = \frac{\vec{X}_1(t) + \vec{X}_2(t) + \vec{X}_3(t)}{3} \]
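To make the leader-guided update concrete, the following minimal NumPy sketch implements the equations above for a single agent; the function name gwo_update and its arguments are illustrative and do not come from the paper's implementation.

```python
import numpy as np

def gwo_update(X, X_alpha, X_beta, X_delta, t, T):
    """One GWO position update for a single agent X, following the equations above.

    X, X_alpha, X_beta, X_delta : 1-D NumPy arrays of equal dimension.
    t, T : current iteration and maximum number of iterations.
    """
    a = 2.0 * (1.0 - t / T)                   # a decreases linearly from 2 to 0
    candidates = []
    for leader in (X_alpha, X_beta, X_delta):
        r1, r2 = np.random.rand(X.size), np.random.rand(X.size)
        A = 2.0 * a * r1 - a                  # coefficient vector A
        C = 2.0 * r2                          # coefficient vector C
        D = np.abs(C * leader - X)            # distance to the leader
        candidates.append(leader - A * D)     # candidate positions X_1, X_2, X_3
    return np.mean(candidates, axis=0)        # average of the three candidates
```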

2.2. Sand Cat Swarm Optimization (SCSO)

Sand Cat Swarm Optimization (SCSO) is a metaheuristic algorithm inspired by the unique survival and hunting behaviors of sand cats, a species adapted to harsh desert environments [14]. Unlike domestic cats, sand cats can detect low-frequency sounds below 2 kHz, which plays a crucial role in locating prey in challenging conditions. SCSO simulates this remarkable sensory and hunting capability to navigate the search space and locate near-optimal solutions. Similar to other swarm-based optimization techniques, the algorithm begins with the random initialization of a population of search agents within defined lower and upper bounds. Each agent represents a potential solution in a matrix where rows correspond to individual agents and columns to the problem’s dimensions. A fitness function is employed to evaluate how well each solution satisfies the optimization objective—whether minimization or maximization—and guides the search process across iterations. The algorithm iteratively updates the agents’ positions based on fitness values until convergence, with the final solution representing the best outcome found. While the principal goal across metaheuristics is similar, the strategies employed, such as the foraging-inspired mechanism in SCSO, distinguish each algorithm in achieving optimal or near-optimal results.
Each search agent begins with a sensitivity range starting at 2 kHz, controlled by the rG parameter, which linearly decreases over iterations to balance exploration and exploitation. Position updates are based on both the best candidate and the current position, modulated by the sensitivity range r, ensuring adaptive movement across the search space. A circular movement strategy is employed using a random angle θ, enabling agents to effectively avoid local optima and converge toward the best solution:
\[ r_G = s_M - \frac{s_M \times iter_c}{iter_{Max}} \]
\[ R = 2 \times r_G \times rand(0,1) - r_G \]
\[ r = r_G \times rand(0,1) \]
\[ \vec{Pos}(t+1) = r \cdot \left( \vec{Pos}_{bc}(t) - rand(0,1) \cdot \vec{Pos}_c(t) \right) \]
\[ \vec{Pos}_{rnd} = \lvert rand(0,1) \cdot \vec{Pos}_b(t) - \vec{Pos}_c(t) \rvert, \quad \vec{Pos}(t+1) = \vec{Pos}_b(t) - r \cdot \vec{Pos}_{rnd} \cdot \cos\theta \]
\[ \vec{X}(t+1) = \begin{cases} \vec{Pos}_b(t) - r \cdot \vec{Pos}_{rnd} \cdot \cos\theta, & \lvert R \rvert \le 1\ (\text{exploitation}) \\ r \cdot \left( \vec{Pos}_{bc}(t) - rand(0,1) \cdot \vec{Pos}_c(t) \right), & \lvert R \rvert > 1\ (\text{exploration}) \end{cases} \]
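The piecewise rule above can be sketched as follows; this is an illustrative NumPy implementation of the SCSO update for one agent, in which pos_b stands in for the best candidate position and the helper name scso_update is not taken from the original code.

```python
import numpy as np

def scso_update(pos_c, pos_b, t, max_iter, s_M=2.0):
    """One SCSO position update for a single agent, following the equations above.

    pos_c : current position of the agent (1-D NumPy array).
    pos_b : best (candidate) position found so far.
    """
    r_G = s_M - (s_M * t) / max_iter                  # general sensitivity, 2 -> 0
    R = 2.0 * r_G * np.random.rand() - r_G            # phase-control parameter
    r = r_G * np.random.rand()                        # agent sensitivity range
    if abs(R) <= 1:                                   # exploitation: attack the prey
        theta = np.random.uniform(0.0, 2.0 * np.pi)   # random angle for circular move
        pos_rnd = np.abs(np.random.rand(pos_c.size) * pos_b - pos_c)
        return pos_b - r * pos_rnd * np.cos(theta)
    else:                                             # exploration: search for prey
        return r * (pos_b - np.random.rand(pos_c.size) * pos_c)
```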

2.3. Multi-Level Thresholding Image Segmentation

One of the most important and sensitive topics in image processing research is the multi-level thresholding problem, which can accurately capture complex details and structures in images and greatly enhances the effectiveness of image segmentation. Thresholding-based segmentation is applied to images in many areas of the medical sciences, such as pattern identification, medical image processing, and medical image analysis. In this study, the Otsu and Kapur methods are used as the objective functions for this problem.

2.3.1. Otsu Method

Otsu’s multi-threshold image segmentation is a statistical, unsupervised, and non-parametric method used to determine the optimal threshold values in image processing [4]. The goal of this method is to maximize the between-class variance, ensuring that pixels within the same segment have similar intensity values while also achieving strong separation between different segments. Assume an image is to be divided into N classes. To achieve this segmentation, N − 1 thresholds $T_1, T_2, \ldots, T_{N-1}$ must be determined. Multi-thresholding separates pixels based on their greyscale intensity values using these thresholds. The normalized histogram of the image is defined below in Equation (15):
\[ p(i) = \frac{h(i)}{NP}, \qquad \sum_{i=0}^{L-1} p(i) = 1 \]
Here, h(i) is the histogram count for intensity level i, NP is the total number of pixels in the image, and L is the number of greyscale levels. For a single threshold T, the probabilities of occurrence (weights) of the two classes separated by T are given by:
\[ w_0(T) = \sum_{i=0}^{T-1} p(i), \qquad w_1(T) = \sum_{i=T}^{L-1} p(i) \]
The class means are:
\[ \mu_0(T) = \frac{1}{w_0(T)} \sum_{i=0}^{T-1} i \cdot p(i), \qquad \mu_1(T) = \frac{1}{w_1(T)} \sum_{i=T}^{L-1} i \cdot p(i) \]
The total mean intensity of the image is:
\[ \mu_T = \sum_{i=0}^{L-1} i \cdot p(i) \]
Otsu’s method selects the threshold T* that maximizes the between-class variance:
\[ \sigma_b^2(T) = w_0(T) \cdot w_1(T) \cdot \left[ \mu_0(T) - \mu_1(T) \right]^2, \qquad T^* = \arg\max_T \sigma_b^2(T) \]
For multi-threshold segmentation (i.e., more than two classes), the principle is extended to maximize the sum of between-class variances for all N classes. This typically involves evaluating combinations of thresholds to find the set that yields the highest overall between-class variance.
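As a concrete illustration of this criterion, the short sketch below evaluates the generalized between-class variance (the sum over all classes of w_k (mu_k − mu_T)^2, which reduces to the two-class expression above) for an arbitrary threshold set; the function name and the 256-bin assumption are illustrative.

```python
import numpy as np

def otsu_objective(hist, thresholds):
    """Sum of between-class variances for the classes defined by `thresholds` (to maximize).

    hist : 1-D NumPy array holding the 256-bin grey-level histogram of one channel.
    thresholds : iterable of integer thresholds in (0, 255).
    """
    p = hist / hist.sum()                              # normalized histogram p(i)
    levels = np.arange(p.size)
    mu_T = np.sum(levels * p)                          # global mean intensity
    bounds = [0] + sorted(int(t) for t in thresholds) + [p.size]
    sigma_b = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()                             # class weight w_k
        if w > 0:
            mu = np.sum(levels[lo:hi] * p[lo:hi]) / w  # class mean mu_k
            sigma_b += w * (mu - mu_T) ** 2            # between-class variance term
    return sigma_b
```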

2.3.2. Kapur Method

An additional popular technique for multi-level thresholding is Kapur's method, which is founded on the idea of Shannon entropy. Unlike Otsu's method, which relies on variance-based class separation, Kapur's method seeks to maximize the sum of entropies of the segmented regions, thus ensuring that each region contains the most information possible. The method utilizes the normalized histogram of the greyscale image to calculate the probability distribution of pixel intensities. Given a combination of thresholds $T_1, T_2, \ldots, T_{K-1}$, the total entropy H for K segmented regions is calculated as:
\[ H = H_1 + H_2 + \cdots + H_K \]
The entropy of the final region is:
\[ H_K = - \sum_{i=T_{K-1}+1}^{L-1} \frac{p(i)}{W_K} \log\!\left( \frac{p(i)}{W_K} \right) \]
Based on the above equation, p(i) is the probability of the ith grey level, and L is the total number of grey levels. The probability sums (weights) for each segmented region are calculated as follows:
\[ W_K = \sum_{i=T_{K-1}+1}^{L-1} p(i) \]
The optimal set of thresholds is the one that maximizes the total entropy H, thereby achieving the most informative segmentation of the image. Kapur’s method is particularly effective for images with complex intensity distributions and is often used in applications requiring the precise segmentation of textures or objects with varying brightness.
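For completeness, a matching sketch of Kapur's criterion is given below; it accumulates the Shannon entropy of each region defined by the thresholds, and again the function and variable names are illustrative rather than taken from the paper.

```python
import numpy as np

def kapur_objective(hist, thresholds, eps=1e-12):
    """Total Kapur entropy of the regions defined by `thresholds` (to maximize).

    hist : 1-D NumPy array holding the grey-level histogram of one channel.
    """
    p = hist / hist.sum()                              # normalized histogram p(i)
    bounds = [0] + sorted(int(t) for t in thresholds) + [p.size]
    total_entropy = 0.0
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        w = p[lo:hi].sum()                             # region weight W_k
        if w > eps:
            q = p[lo:hi] / w                           # conditional distribution in region k
            total_entropy += -np.sum(q * np.log(q + eps))   # Shannon entropy H_k
    return total_entropy
```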

3. Hybrid Metaheuristic Algorithm

Here, the classic SCSO and GWO algorithms are used to propose a new hybrid algorithm that benefits from the advantages of both to create an efficient algorithm for solving global optimization problems. In addition, a chaotic opposition-based learning strategy is used in the proposed hybrid optimization algorithm.

3.1. Hybridization of SCSO and GWO

The SCSO algorithm uses the R parameter to control the movement of search agents within the search space and to maintain a balance between exploration and exploitation. Inspired by the unique characteristics of sand cats in nature, the R parameter reflects the sensitivity range, which decreases from 2 kHz to 0 over time. In contrast, the GWO algorithm uses the A parameter to control the movement of search agents: when the magnitude of A is greater than 1 the wolves diverge from the prey (exploration), and when it is less than 1 they converge toward it (exploitation). However, this mechanism can lead to drawbacks, such as becoming trapped in local optima, especially in large search spaces. To address this issue, the hybrid algorithm incorporates the R parameter to enhance control over agent movement and improve the balance between exploration and exploitation.
In this context, the COSGO algorithm benefits from the R-value of the SCSO algorithm to switch dynamically between two well-known metaheuristic algorithm strategies. When the R-value is between −1 and 1, the search agents follow the behavior rules of the SCSO algorithm; otherwise, they update their positions based on the GWO algorithm. COSGO also includes a mechanism to validate the fitness of newly updated positions. Updated positions are first checked against the search space boundaries. If they fall outside the allowed limits, correction parameters are applied to move them back within boundaries. Subsequently, a sorting mechanism is employed to merge the new positions with the existing ones, resulting in a combined population of old and new individuals. Likewise, the fitness values of both the existing and new individuals are merged. All individuals are ranked in ascending order based on their fitness values, with the best individual placed at the top. The population is then updated by selecting the top individuals according to the original number of search agents, ensuring that the best solutions are retained.
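The switching and survivor-selection mechanism described above can be summarized in the following schematic sketch (assuming a minimization objective); it reuses the gwo_update and scso_update helpers sketched in Section 2 and is meant to convey the control flow rather than reproduce the author's reference implementation.

```python
import numpy as np

def cosgo_step(pop, fitness, fobj, best3, t, max_iter, lb, ub):
    """One COSGO iteration: R-based SCSO/GWO switching plus merge-and-sort selection.

    pop, fitness : current population (n x d array) and its fitness values (n,).
    fobj         : objective function to minimize.
    best3        : (X_alpha, X_beta, X_delta), the three best agents found so far.
    """
    n, _ = pop.shape
    r_G = 2.0 - 2.0 * t / max_iter                        # shared sensitivity range
    new_pop = np.empty_like(pop)
    for i in range(n):
        R = 2.0 * r_G * np.random.rand() - r_G
        if abs(R) <= 1:                                   # SCSO-style move
            new_pop[i] = scso_update(pop[i], best3[0], t, max_iter)
        else:                                             # GWO-style move
            new_pop[i] = gwo_update(pop[i], *best3, t, max_iter)
    new_pop = np.clip(new_pop, lb, ub)                    # boundary correction
    new_fit = np.array([fobj(x) for x in new_pop])
    merged = np.vstack([pop, new_pop])                    # merge old and new individuals
    merged_fit = np.concatenate([fitness, new_fit])
    order = np.argsort(merged_fit)[:n]                    # keep the n best individuals
    return merged[order], merged_fit[order]
```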
Grey wolf optimization (GWO) was employed due to its effective global exploration capabilities, simple structure, and minimal parameter requirements, which make it highly adaptable across a variety of optimization problems. On the other hand, the Sand Cat Swarm Optimization (SCSO) algorithm has demonstrated efficient local search behavior and a balanced exploration–exploitation trade-off, making it a suitable candidate for hybridization. The integration of chaotic opposition-based learning further enhances diversity and helps avoid premature convergence. Comparative studies such as [13,14] have also highlighted the complementary strengths of these algorithms, supporting the rationale for their combination in the proposed framework.

3.2. Chaotic Opposition-Based Learning Strategy

One of the recently introduced techniques in optimization is opposition-based learning (OBL) [33]. OBL was first proposed by Tizhoosh and emphasizes evaluating the opposite position of a search agent within the search space. In this approach, an optimization algorithm evaluates the fitness of both the current position and the opposite position of a search agent, allowing it to explore the search space more broadly and cover diverse regions and directions. Moreover, in complex optimization problems with multiple local optima, search agents can easily become trapped; OBL helps mitigate this issue by increasing the chances of escaping local optima, thus enhancing the global search capability of the algorithm. In Equation (23), which represents the initial population of the optimization algorithm, Lb is the lower bound, Ub is the upper bound, and ri is a random number drawn from a uniform distribution in the range [0, 1]. Equation (24) calculates the opposite of the current position:
\[ pos(t+1) = Lb + r_i \left( Ub - Lb \right) \]
\[ Opos(t+1) = Lb + Ub - pos(t+1) \]
As mentioned earlier, OBL benefits from considering the opposite population based on fixed positions; however, this approach may face limitations. To address this issue, in [34], the authors proposed a modified version of OBL by introducing a random number, which adds a certain level of randomness to the solution and improves diversity. In the chaotic opposition-based learning strategy (COBLS), chaotic behavior is introduced to enhance population diversity and avoid premature convergence. A chaotic array is first generated by iteratively applying the sine map s times. This chaotic array is then used to replace the conventional random array, and opposite solutions are calculated accordingly. As a result, the search process is enriched by exploring different regions of the search space more effectively.
In this study, the sine map was selected as the chaotic map to enhance the diversity of the population during the optimization process. Compared to other chaotic maps such as the Logistic map, the sine map offers smoother and more uniformly distributed sequences, which can help to explore the search space more effectively. Previous studies have shown that the choice of chaotic map can significantly influence the convergence behavior and performance of metaheuristic algorithms [35,36]:
\[ Opos(t+1) = Lb + Ub - F_s \cdot pos(t+1) \]
\[ F_{i+1} = \sin(\pi F_i) \]
In Equation (25), Fs is a chaotic array obtained by applying the sine map s times. In this study, s is set to 20 in order to prevent both the loss of diversity caused by too few iterations and the increase in running time caused by too many iterations.
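A compact sketch of this chaotic opposition step is given below; it builds the sine-map array Fs and returns the opposite population according to Equation (25). The function name and the per-agent seeding are illustrative assumptions; in a full run, the opposite individuals would be evaluated and the fitter of each pair retained.

```python
import numpy as np

def chaotic_opposition(pop, lb, ub, s=20):
    """Chaotic opposition-based learning: opposite positions scaled by a sine-map array.

    pop : population matrix (n_agents x dim); lb, ub : bound vectors (or scalars).
    s   : number of sine-map iterations used to build the chaotic array F_s.
    """
    F = np.random.rand(*pop.shape)        # chaotic seed values in (0, 1)
    for _ in range(s):
        F = np.sin(np.pi * F)             # sine map: F_{i+1} = sin(pi * F_i)
    return lb + ub - F * pop              # chaotic opposite positions (Eq. (25))
```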

3.3. System Model

As aforementioned, this study proposes a new multi-strategy hybrid metaheuristic algorithm to address one of the NP-hard problems, multi-level thresholding for color image segmentation. The aim of this problem is to partition a color image into meaningful regions based on pixel intensity values. Initially, an RGB image is taken as input and converted into greyscale histograms, which are then used to determine the optimal threshold values. This combinatorial optimization problem is modeled so that the goal is to select a set of threshold values that maximizes an image quality criterion, and the proposed COSGO algorithm is employed to find these optimal thresholds. To evaluate the COSGO algorithm, a total of 10 real-world images were taken from the BSD500 dataset, one of the most widely used benchmark datasets. The flowchart of the proposed COSGO algorithm is given in Figure 1.
In addition, in the simulation, the main parameters used in the system and COSGO algorithm are explained below:
  • T: number of thresholds (3–5 in this study).
  • MaxIter: maximum number of iterations (e.g., 100).
  • PopulationSize: number of candidate solutions (e.g., 30).
  • Objective Functions: Otsu method and Kapur entropy.
  • Dataset: BSD500 (10 selected color images).
  • Channels: R, G, and B (treated independently for thresholding).
  • Evaluation Metric: PSNR, SSIM, and computation time.
The problem can be formulated as follows. Let h(i) be the image histogram for intensity level i ∈ [0, 255]. For a multi-level thresholding problem with T thresholds, the objective is to determine a threshold vector:
\[ \mathbf{T} = \{ t_1, t_2, \ldots, t_T \} \quad \text{with} \quad 0 < t_1 < t_2 < \cdots < t_T < 255 \]
The goal is to maximize f(T), a fitness function that can be either Otsu’s method or Kapur’s entropy. Thus, the optimization problem is formulated as:
\[ \max_{\mathbf{T}} f(\mathbf{T}) \quad \text{subject to} \quad t_1 < t_2 < \cdots < t_T \]
In order to determine the optimal threshold set T that produces the highest segmentation quality based on the chosen objective function, the COSGO algorithm searches the solution space.
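To illustrate how a threshold vector produced by the search is turned into a segmented image, the sketch below quantizes each channel independently (R, G, and B, as listed above); representing each class by its mean grey level is an assumption made here for illustration, since the paper does not specify how the segmented pixels are recolored.

```python
import numpy as np

def segment_channel(channel, thresholds):
    """Quantize one 8-bit channel using a sorted threshold vector (t_1, ..., t_T)."""
    bounds = [0] + sorted(int(t) for t in thresholds) + [256]
    out = np.zeros_like(channel)
    for lo, hi in zip(bounds[:-1], bounds[1:]):
        mask = (channel >= lo) & (channel < hi)
        if mask.any():
            out[mask] = int(channel[mask].mean())   # class represented by its mean level
    return out

def segment_rgb(image, thresholds_per_channel):
    """Apply per-channel thresholds to an RGB image (channels treated independently)."""
    return np.dstack([segment_channel(image[..., c], thresholds_per_channel[c])
                      for c in range(3)])
```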

4. Results and Analysis

This section evaluates the proposed algorithm using benchmark functions, which are commonly employed to assess metaheuristic algorithms due to their stochastic nature. To demonstrate the effectiveness of the proposed method, it is compared with several metaheuristic algorithms, including GWO [13], SCSO [14], and the WOA [37], as well as two hybrid metaheuristic algorithms, AGWO_CS [38] and the PSOGSA [39]. The evaluation includes benchmark tests and statistical analyses. Each algorithm was run independently 30 times using identical control settings, and performance was measured based on the mean and standard deviation. All simulations were conducted on the same system using MATLAB R2020b. The simulation parameters for each metaheuristic algorithm are given in Table 1.

4.1. Experimental Analysis Based on CEC2017

For numerical validation, a set of 29 benchmark functions from the CEC2017 test suite was employed; these functions are widely utilized for assessing the performance of metaheuristic algorithms. They are categorized into four groups: unimodal functions (F1–F3), multimodal functions (F4–F10), hybrid functions (F11–F20), and composition functions (F21–F30). The proposed COSGO algorithm was benchmarked using these functions and compared against several existing methods, including GWO, SCSO, the WOA, AGWO-CS, and the PSOGSA. Table 2 summarizes the experimental outcomes in terms of mean, worst, best, and standard deviation values. In addition, the Friedman ranking presented in Table 3 was used to compare the performance of COSGO with the other metaheuristic algorithms. For each benchmark function, the algorithms are ranked according to their performance, with the best-performing algorithm receiving the lowest rank; these ranks are then averaged over all problems to compute the overall Friedman rank for each algorithm, where a lower average rank indicates better overall performance. The Friedman test is a non-parametric statistical method that helps determine whether the observed differences in ranks among the algorithms are statistically significant. Based on Table 3, the average ranking of the COSGO algorithm is lower than that of the other algorithms, indicating that COSGO performs better overall.
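The ranking procedure can be reproduced with a few lines of SciPy, as sketched below; the results matrix here is a random placeholder standing in for the per-function mean errors of Table 2, not the reported values.

```python
import numpy as np
from scipy.stats import friedmanchisquare, rankdata

# Placeholder matrix: rows = 29 benchmark functions, columns = 6 algorithms.
results = np.random.rand(29, 6)
algorithms = ["COSGO", "GWO", "SCSO", "WOA", "AGWO-CS", "PSOGSA"]

ranks = np.vstack([rankdata(row) for row in results])   # rank per function (1 = best)
avg_rank = ranks.mean(axis=0)                            # Friedman average rank
stat, p_value = friedmanchisquare(*results.T)            # significance of rank differences

for name, r in zip(algorithms, avg_rank):
    print(f"{name}: average rank {r:.3f}")
print(f"Friedman chi-square = {stat:.3f}, p = {p_value:.4f}")
```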

4.2. Experimental Analysis Based on Multi-Level Thresholding

To assess the performance of the proposed COSGO algorithm against other methods, multiple quantitative metrics were used: the Peak Signal-to-Noise Ratio (PSNR), the Feature Similarity Index (FSIM), the Structural Similarity Index Measure (SSIM), the Dice coefficient, the Jaccard index, and accuracy, along with the optimal objective function values and their associated variances. Ten images were selected from the BSD500 dataset [40]. For ease of interpretation and comparative analysis, the best results averaged over 10 independent runs are highlighted in bold in the subsequent tables. Table 4 presents the average and standard deviation of the fitness function values for all compared methods across thresholding levels 2, 3, 4, and 5, evaluated on all images using the two objective functions, Otsu's method and Kapur's entropy. Moreover, the proposed COSGO algorithm demonstrates lower computational time at threshold levels 2 and 3. Based on the running times of each algorithm presented in Table 5, the COSGO algorithm achieves superior performance in terms of execution time.
The segmentation results based on Otsu and Kapur entropy are shown in Table 6, along with the PSNR, SSIM, and FSIM values. The PSNR is a traditional statistic for assessing image quality: it measures how much the segmented image differs from the original, and a higher PSNR value means the segmented image is closer to the original, indicating a more effective algorithm. In addition, the mean square error (MSE) was calculated for each algorithm with both objective functions. The results were obtained under the same running conditions, with 100 iterations and 30 search agents. It was observed that the proposed COSGO algorithm achieved higher PSNR and lower MSE values than the other algorithms.
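The two image-quality measures used here follow their standard definitions, as in the short sketch below (assuming 8-bit images with a peak value of 255); the helper names are illustrative.

```python
import numpy as np

def mse(original, segmented):
    """Mean squared error between the original and segmented images."""
    diff = original.astype(np.float64) - segmented.astype(np.float64)
    return np.mean(diff ** 2)

def psnr(original, segmented, max_val=255.0):
    """Peak Signal-to-Noise Ratio in dB; higher values mean closer to the original."""
    m = mse(original, segmented)
    return float("inf") if m == 0 else 10.0 * np.log10(max_val ** 2 / m)
```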
The Structural Similarity Index Measure (SSIM) evaluates the perceived quality of an image by incorporating luminance, contrast, and structural information. A higher SSIM value closer to 1 indicates a greater structural similarity between the segmented image and the original image. The Feature Similarity Index Measure (FSIM), which is widely used for image quality assessment, particularly in terms of texture and detail preservation, quantifies the similarity based on low-level image features, such as phase congruency and gradient magnitude. Table 7 presents the obtained value for each algorithm with two different objective functions as Otsu and Kapur entropy functions.
In Table 8, the Dice coefficient and Jaccard index are shown. The Dice coefficient and Jaccard index (intersection over union) are widely used metrics for evaluating image segmentation performance by measuring the similarity between the predicted segmentation and the ground truth. The Dice coefficient is computed as twice the size of the intersection divided by the sum of the sizes of the two sets, whereas the Jaccard index measures the ratio of the intersection to the union of the same sets. Both metrics range from 0 to 1, with higher values indicating greater similarity, and they are especially important in assessing the spatial accuracy of segmentation methods.
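Both overlap measures follow their standard set-based definitions, as in the brief sketch below for binary masks; the function name is illustrative.

```python
import numpy as np

def dice_and_jaccard(pred, truth):
    """Dice coefficient and Jaccard index between two binary segmentation masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    inter = np.logical_and(pred, truth).sum()
    union = np.logical_or(pred, truth).sum()
    total = pred.sum() + truth.sum()
    dice = 2.0 * inter / total if total else 1.0       # 2|A∩B| / (|A| + |B|)
    jaccard = inter / union if union else 1.0          # |A∩B| / |A∪B|
    return dice, jaccard
```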
Table 9 presents the quantitative evaluation of the segmentation accuracy achieved by each algorithm across the test image dataset. The evaluation is based on two widely recognized overlap-based similarity metrics: the dice coefficient and the Jaccard index. These metrics assess how closely the segmented regions produced by the algorithms align with the ground truth masks. Higher values in both metrics indicate superior segmentation performance, reflecting better boundary adherence and region consistency. The table highlights the comparative strengths of each method and provides a comprehensive understanding of their effectiveness in accurately delineating the target regions.
As shown in Figure 2, Figure 3, Figure 4 and Figure 5, the comparisons of SSIM and FSIM values across 10 test images demonstrate significant variations among different algorithms at threshold levels 2–5.

4.3. Performance Analysis

Based on the results, it should be noted that the COSGO algorithm performed well in finding the threshold values in the images. In this problem, threshold levels 2, 3, 4, and 5 were studied. According to Table 6 and Table 7, which report the PSNR, SSIM, and FSIM criteria, the values obtained for the algorithm in question were close to one, which shows a good improvement and considerable performance of the algorithm. An increase in the number of threshold levels generally results in more vivid and detailed segmented images. While images segmented using the COSGO algorithm are often visually distinguishable, the method exhibits certain weaknesses in specific regions. In addition, Figure 6 shows the images segmented by the different metaheuristic algorithms.

4.4. Objective Function Analysis

In this section, the results obtained for the Otsu and Kapur entropy objective functions are analyzed. Based on Figure 7, Figure 8 and Figure 9, it can be noted that the COSGO algorithm performed better overall. In Figure 7, which presents the PSNR results, the Otsu function produced superior results for the COSGO algorithm and the WOA, while with the Kapur function, the GWO and SCSO algorithms obtained better results, respectively. With respect to the threshold levels, the COSGO algorithm with the Otsu function at level 2, the SCSO algorithm with the Otsu function at level 3, the COSGO algorithm with the Otsu function at level 4, and the COSGO algorithm with the Otsu function at level 5 obtained promising PSNR results.
In Figure 8, the Structural Similarity Index Measure (SSIM) values were analyzed for different metaheuristic algorithms in combination with two distinct objective functions: Otsu and Kapur entropy. The results reveal that the proposed COSGO algorithm, when used with the Otsu objective function, generally exhibits strong performance in terms of structural similarity. Notably, at the multi-level thresholding level of 5, COSGO demonstrates superior performance with both Otsu and Kapur entropy compared to the other algorithms. This indicates that the COSGO algorithm is more capable of preserving structural information, such as luminance, contrast, and texture consistency, particularly at higher threshold levels. The effectiveness of COSGO under these conditions highlights its potential for more accurate and visually coherent image segmentation.
In Figure 9, the Feature Similarity Index Measure (FSIM) values obtained from various algorithms are presented and analyzed. The results indicate that the proposed COSGO algorithm demonstrates superior performance compared to other competing methods. Particularly, when integrated with the Otsu objective function, COSGO achieves the highest FSIM values across multiple test images. This suggests that COSGO is more effective in preserving low-level visual features, such as texture and edge information, during the segmentation process. The enhanced FSIM scores further confirm that the proposed approach maintains a high degree of similarity between the segmented and original images, thus contributing to improved perceptual quality and segmentation accuracy.

4.5. Theoretical Insights into the COSGO Algorithm

The proposed hybrid GWO-SCSO algorithm, enhanced by the COBL method, provides theoretical contributions beyond the simple combination of two metaheuristics. The hybrid structure achieves a more balanced search process by combining the local exploitation strength of SCSO with the global exploration capabilities of GWO; this combination allows the algorithm to converge more quickly while remaining resilient against local optima. Additionally, population diversity is enhanced by the chaotic opposition-based learning (COBL) method, which results in improved initialization and more consistent convergence behavior across runs. Unlike hybrid metaheuristic algorithms that frequently mix similar search behaviors, our approach combines complementary dynamics: the hierarchical leadership-based search of GWO and the adaptive motion of SCSO. Collectively, the COSGO hybridization offers advantages in both convergence speed and stability, as established by our experimental findings.

5. Conclusions

In this study, the COSGO metaheuristic algorithm is proposed as a hybrid multi-strategy metaheuristic algorithm. This algorithm benefits from the advantages of the grey wolf optimization and Sand Cat Swarm Optimization algorithms to create an efficient metaheuristic algorithm for solving global optimization problems. In addition, chaotic opposition-based learning was used to enhance the search ability of the agents and to obtain balanced exploration and exploitation phases. The COSGO algorithm displays better performance than the GWO and SCSO algorithms, the WOA, and two hybrid metaheuristic algorithms, AGWO-CS and the PSOGSA, on the CEC2017 benchmark functions. Based on the results of the Friedman ranking test, the COSGO algorithm's average ranking of 1.862068966 shows its competitive efficiency on the benchmark functions. In addition, one of the well-known NP-hard problems, multi-level thresholding color image segmentation, was studied with the proposed COSGO algorithm. Two objective functions, Otsu and Kapur, were used to find the optimum threshold values. The technique was evaluated with seven performance criteria (the PSNR, MSE, SSIM, FSIM, Dice coefficient, Jaccard index, and segmentation accuracy) to investigate the COSGO algorithm's performance in comparison with the other metaheuristic algorithms. It was observed that the COSGO algorithm performs well with the Otsu method in multi-level thresholding color image segmentation.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Dong, Y.; Li, M.; Zhou, M. Multi-Threshold Image Segmentation Based on the Improved Dragonfly Algorithm. Mathematics 2024, 12, 854. [Google Scholar] [CrossRef]
  2. Akay, R.; Saleh, R.A.; Farea, S.M.; Kanaan, M. Multilevel thresholding segmentation of color plant disease images using metaheuristic optimization algorithms. Neural Comput. Appl. 2022, 34, 1161–1179. [Google Scholar] [CrossRef]
  3. Oliva, D.; Abd Elaziz, M.; Hinojosa, S. Metaheuristic Algorithms for Image Segmentation: Theory and Applications; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; pp. 59–69. [Google Scholar]
  4. Abualigah, L.; Almotairi, K.H.; Elaziz, M.A. Multilevel thresholding image segmentation using meta-heuristic optimization algorithms: Comparative analysis, open challenges and new trends. Appl. Intell. 2023, 53, 11654–11704. [Google Scholar] [CrossRef]
  5. Hu, P.; Han, Y.; Zhang, Z. Multi-Level Thresholding Color Image Segmentation Using Modified Gray Wolf Optimizer. Biomimetics 2024, 9, 700. [Google Scholar] [CrossRef] [PubMed]
  6. Zhang, C.; Pei, Y.H.; Wang, X.X.; Hou, H.Y.; Fu, L.H. Symmetric cross-entropy multi-threshold color image segmentation based on improved pelican optimization algorithm. PLoS ONE 2023, 18, e0287573. [Google Scholar] [CrossRef]
  7. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evol. Comput. 1997, 1, 67–82. [Google Scholar] [CrossRef]
  8. Yang, X.S. Nature-Inspired Metaheuristic Algorithms; Luniver Press: Bristol, UK, 2010. [Google Scholar]
  9. Holland, J.H. Genetic algorithms. Sci. Am. 1992, 267, 66–73. [Google Scholar] [CrossRef]
  10. Storn, R.; Price, K. Differential evolution–a simple and efficient heuristic for global optimization over continuous spaces. J. Glob. Optim. 1997, 11, 341–359. [Google Scholar] [CrossRef]
  11. Kennedy, J.; Eberhart, R. Particle swarm optimization. In Proceedings of the ICNN’95-International Conference on Neural Networks, Perth, WA, Australia, 27 November–1 December 1995; Volume 4, pp. 1942–1948. [Google Scholar]
  12. Dorigo, M.; Birattari, M.; Stutzle, T. Ant colony optimization. IEEE Comput. Intell. Mag. 2007, 1, 28–39. [Google Scholar] [CrossRef]
  13. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey wolf optimizer. Adv. Eng. Softw. 2014, 69, 46–61. [Google Scholar] [CrossRef]
  14. Seyyedabbasi, A.; Kiani, F. Sand Cat swarm optimization: A nature-inspired algorithm to solve global optimization problems. Eng. Comput. 2023, 39, 2627–2651. [Google Scholar] [CrossRef]
  15. Rashedi, E.; Nezamabadi-Pour, H.; Saryazdi, S. GSA: A gravitational search algorithm. Inf. Sci. 2009, 179, 2232–2248. [Google Scholar] [CrossRef]
  16. Erol, O.K.; Eksin, I. A new optimization method: Big bang–big crunch. Adv. Eng. Softw. 2006, 37, 106–111. [Google Scholar] [CrossRef]
  17. Kaveh, A.; Talatahari, S. A novel heuristic optimization method: Charged system search. Acta Mech. 2010, 213, 267–289. [Google Scholar] [CrossRef]
  18. Ma, G.; Yue, X. An improved whale optimization algorithm based on multilevel threshold image segmentation using the Otsu method. Eng. Appl. Artif. Intell. 2022, 113, 104960. [Google Scholar] [CrossRef]
  19. Houssein, E.H.; Helmy, B.E.D.; Oliva, D.; Elngar, A.A.; Shaban, H. A novel black widow optimization algorithm for multilevel thresholding image segmentation. Expert Syst. Appl. 2021, 167, 114159. [Google Scholar] [CrossRef]
  20. Singh, S.; Mittal, N.; Singh, H. A multilevel thresholding algorithm using LebTLBO for image segmentation. Neural Comput. Appl. 2020, 32, 16681–16706. [Google Scholar] [CrossRef]
  21. Chakraborty, F.; Roy, P.K. An efficient multilevel thresholding image segmentation through improved elephant herding optimization. Evol. Intell. 2025, 18, 17. [Google Scholar] [CrossRef]
  22. Gao, S.; Iu, H.H.C.; Erkan, U.; Simsek, C.; Toktas, A.; Cao, Y.; Wu, R.; Mou, J.; Li, Q.; Wang, C. A 3D memristive cubic map with dual discrete memristors: Design, implementation, and application in image encryption. IEEE Trans. Circuits Syst. Video Technol. 2025. Early Access. [Google Scholar] [CrossRef]
  23. Gao, S.; Zhang, Z.; Iu, H.H.C.; Ding, S.; Mou, J.; Erkan, U.; Toktas, A.; Li, Q.; Wang, C.; Cao, Y. A Parallel Color Image Encryption Algorithm Based on a 2D Logistic-Rulkov Neuron Map. IEEE Internet Things J. 2025, 12, 18115–18124. [Google Scholar] [CrossRef]
  24. Deng, W.; Wang, J.; Guo, A.; Zhao, H. Quantum differential evolutionary algorithm with quantum-adaptive mutation strategy and population state evaluation framework for high-dimensional problems. Inf. Sci. 2024, 676, 120787. [Google Scholar] [CrossRef]
  25. Song, Y.; Song, C. Adaptive evolutionary multitask optimization based on anomaly detection transfer of multiple similar sources. Expert Syst. Appl. 2025, 283, 127599. [Google Scholar] [CrossRef]
  26. Deng, W.; Feng, J.; Zhao, H. Autonomous path planning via sand cat swarm optimization with multi-strategy mechanism for unmanned aerial vehicles in dynamic environment. IEEE Internet Things J. 2025. Early Access. [Google Scholar] [CrossRef]
  27. Liu, Y.; Wang, C.; Lu, M.; Yang, J.; Gui, J.; Zhang, S. From simple to complex scenes: Learning robust feature representations for accurate human parsing. IEEE Trans. Pattern Anal. Mach. Intell. 2024, 46, 5449–5462. [Google Scholar] [CrossRef]
  28. Serbet, F.; Kaya, T. New comparative approach to multi-level thresholding: Chaotically initialized adaptive meta-heuristic optimization methods. Neural Comput. Appl. 2025, 37, 8371–8396. [Google Scholar] [CrossRef]
  29. Hamza, A.; Grimes, M.; Boukabou, A.; Lekouaghet, B.; Oliva, D.; Dib, S.; Himeur, Y. A chaotic variant of the Golden Jackal Optimizer and its application for medical image segmentation. Appl. Intell. 2025, 55, 295. [Google Scholar] [CrossRef]
  30. Li, C.; Liu, H. A transit search algorithm with chaotic map and local escape operator for multi-level threshold segmentation of spleen CT images. Appl. Math. Model. 2025, 141, 115930. [Google Scholar] [CrossRef]
  31. Ting, T.O.; Yang, X.S.; Cheng, S.; Huang, K. Hybrid Metaheuristic Algorithms: Past, Present, and Future. In Recent Advances in Swarm Intelligence and Evolutionary Computation; Springer: Berlin/Heidelberg, Germany, 2015; pp. 71–83. [Google Scholar]
  32. Amirsadri, S.; Mousavirad, S.J.; Ebrahimpour-Komleh, H. A Levy flight-based grey wolf optimizer combined with back-propagation algorithm for neural network training. Neural Comput. Appl. 2018, 30, 3707–3720. [Google Scholar] [CrossRef]
  33. Tizhoosh, H.R. Opposition-based learning: A new scheme for machine intelligence. In Proceedings of the International Conference on Computational Intelligence for Modelling, Control and Automation and International Conference on Intelligent Agents, Web Technologies and Internet Commerce (CIMCA-IAWTIC’06), Vienna, Austria, 28–30 November 2005; Volume 1, pp. 695–701. [Google Scholar]
  34. Long, W.; Jiao, J.; Liang, X.; Cai, S.; Xu, M. A random opposition-based learning grey wolf optimizer. IEEE Access 2019, 7, 113810–113825. [Google Scholar] [CrossRef]
  35. Rani, G.S.; Jayan, S.; Alatas, B. Analysis of chaotic maps for global optimization and a hybrid chaotic pattern search algorithm for optimizing the reliability of a bank. IEEE Access 2023, 11, 24497–24510. [Google Scholar] [CrossRef]
  36. Lu, H.; Wang, X.; Fei, Z.; Qiu, M. The effects of using chaotic map on improving the performance of multiobjective evolutionary algorithms. Math. Probl. Eng. 2014, 2014, 924652. [Google Scholar] [CrossRef]
  37. Mirjalili, S.; Lewis, A. The whale optimization algorithm. Adv. Eng. Softw. 2016, 95, 51–67. [Google Scholar] [CrossRef]
  38. Sharma, S.; Kapoor, R.; Dhiman, S. A novel hybrid metaheuristic based on augmented grey wolf optimizer and cuckoo search for global optimization. In Proceedings of the 2021 2nd International Conference on Secure Cyber Computing and Communications (ICSCCC), Jalandhar, India, 21–23 May 2021; pp. 376–381. [Google Scholar]
  39. Mirjalili, S.; Hashim, S.Z.M. A new hybrid PSOGSA algorithm for function optimization. In Proceedings of the 2010 International Conference on Computer and Information Application, Tianjin, China, 3–5 December 2010; pp. 374–377. [Google Scholar]
  40. Martin, D.; Fowlkes, C.; Tal, D.; Malik, J. A database of human segmented natural images and its application to evaluating segmentation algorithms and measuring ecological statistics. In Proceedings of the Eighth IEEE International Conference on Computer Vision (ICCV 2001), Vancouver, BC, Canada, 7–14 July 2001; Volume 2, pp. 416–423. [Google Scholar]
Figure 1. The flowchart of the COSGO algorithm.
Figure 2. Comparison of SSIM and FSIM values obtained from 10 test images using different algorithms at threshold level 2.
Figure 3. Comparison of SSIM and FSIM values obtained from 10 test images using different algorithms at threshold level 3.
Figure 4. Comparison of SSIM and FSIM values obtained from 10 test images using different algorithms at threshold level 4.
Figure 5. Comparison of SSIM and FSIM values obtained from 10 test images using different algorithms at threshold level 5.
Figure 6. The segmented images with different algorithms.
Figure 7. Comparative PSNR analysis across different threshold levels (levels 2–5) using Otsu and Kapur methods with four algorithms. COSGO consistently outperforms other methods, especially under the Otsu-based segmentation, indicating its robustness in preserving image quality.
Figure 8. SSIM analysis for multiple threshold levels (levels 2–5) using Otsu and Kapur methods with various metaheuristic algorithms. COSGO again shows superior structural similarity performance, particularly under the Otsu thresholding strategy, indicating its effectiveness in preserving structural details.
Figure 9. FSIM analysis across different threshold levels (levels 2–5) using Otsu and Kapur methods with various algorithms. COSGO consistently outperforms others in feature similarity, especially when combined with Otsu, highlighting its robustness in preserving perceptual image quality.
Table 1. The simulation parameters.
Algorithm | Parameter | Value
GWO | a | [2, 0]
GWO | A | [2, 0]
AGWO_CS | a | 2 cos(rand) · l / Max_iter
PSOGSA | G_0 | 1
PSOGSA | C_1 | 0.5
PSOGSA | C_2 | 1.5
SCSO | Sensitivity range (r_G) | [2, 0]
SCSO | Phase control range (R) | [−2r_G, 2r_G]
WOA | a | [2, 0]
WOA | A | [2, 0]
WOA | C | 2 · rand(0, 1)
WOA | l | [−1, 1]
WOA | b | 1
Table 2. Results for F1-F30 algorithms (CEC17) D10.
Function COSGOGWOSCSOWOAAGWO-CSPSOGSA
F1Mean284035,7001.9 × 10937,800,00014,600.00108,000.00
Worst842054,8001.92 × 1091.14 × 109210,000.00240,000.00
Best12617,3001.89 × 10913416,600.0019,500.00
STD238094309,280,000207,00038,500.0045,000.00
F3Mean300141024608772520.00877.00
Worst301430027,50038308500.003830.00
Best300546300302378.00302.00
STD0.22196060009372170.00937.00
F4Mean424418421402424.00409.00
Worst527464529403527.00440.00
Best403407404400403.00406.00
STD27.819.828.10.81827.806.01
F5Mean521520535515521.00516.00
Worst537542565529537.00535.00
Best510506505508510.00504.00
STD7.0610.413.44.67.069.11
F6Mean600601613613604.00601.00
Worst600605643640614.00605.00
Best600600602600600.00600.00
STD0.02341.1111.411.63.131.08
F7Mean719731763724738.00734.00
Worst729746803749765.00756.00
Best713717730712715.00716.00
STD3.521020.59.8812.0011.10
F8Mean812816827834819.00817.00
Worst824837838857829.00835.00
Best805805814818811.00805.00
STD5.418.117.338.044.947.66
F9Mean900907997913919.00910.00
Worst9009691280970973.00984.00
Best900900903905900.00900.00
STD0.012414.388.811.424.1021.70
F10Mean15001720189015201900.001480.00
Worst22702840249020002520.002500.00
Best10001180124012501420.001130.00
STD450412296156308.00317.00
F11Mean11501120110011101150.001130.00
Worst12401150111011301240.001310.00
Best11201100110011001120.001110.00
STD31.311.32.015.8231.3044.20
F12Mean641,000948,000943,000844,000641,000.00844,000.00
Worst2,820,0007,210,0003,890,0002,740,0002,820,000.002,740,000.00
Best13,20027,70046,100866013,200.008660.00
STD740,0001,410,000999,000901,000740,000.00901,000.00
F13Mean830013,00011,60084508450.0012,600.00
Worst29,00026,50035,10015,40015,400.0029,700.00
Best13602280250031903190.003140.00
STD76507460874029902990.006760.00
F14Mean14502900210014703060.001540.00
Worst14807240523015605560.001610.00
Best14401450144014201450.001500.00
STD11.31860140035.21740.0024.00
F15Mean15404380271016905230.004120.00
Worst158010,1007690282010,600.006020.00
Best15101650156015201580.001640.00
STD16.1227014202552440.001150.00
F16Mean17701820180017801770.001700.00
Worst19902240200019601990.002050.00
Best16201600162016001620.001610.00
STD118178108111118.00115.00
F17Mean17601760176017901760.001790.00
Worst18701890182018801870.001880.00
Best17201720173017401720.001740.00
STD34.237.119.134.434.2034.40
F18Mean20,80025,30018,90016,10020,800.0030,800.00
Worst41,20055,30050,50040,90041,200.0055,600.00
Best36404060201024803640.002930.00
STD13,60013,90012,50012,10013,600.0016,200.00
F19Mean193082606690277015,400.008380.00
Worst197018,60014,70010,100258,000.0018,900.00
Best19101930191019001910.001920.00
STD13.261905640186046,200.006530.00
F20Mean20202200210020402090.002060.00
Worst20302280223021502220.002150.00
Best20002180204020002030.002020.00
STD7.5119.8554658.4043.10
F21Mean22702270226023002310.002310.00
Worst23302330234023402340.002340.00
Best22002100210022002200.002200.00
STD53.763.269.648.530.4021.70
F22Mean23002300230023102310.002310.00
Worst23102320233023402340.002330.00
Best22002220225023002300.002300.00
STD18.71614.18.768.766.33
F23Mean26202620264026202620.002620.00
Worst26502640269026302650.002650.00
Best26102600262026102610.002610.00
STD8.739.814.47.758.7311.20
F24Mean27502730275027402750.002750.00
Worst28002780282027802800.002770.00
Best27202510250025002720.002730.00
STD14.361.86753.514.3013.00
F25Mean29302940294029302930.002940.00
Worst29502950302029502950.003020.00
Best29102900290029002910.002900.00
STD1314.222.722.813.0022.80
F26Mean30402930301029403040.002960.00
Worst40603190347038304060.003840.00
Best27502900281026002750.002900.00
STD28959.6128186289.00171.00
F27Mean31003100310031003100.003090.00
Worst31703120318031603170.003100.00
Best30903090309030903090.003090.00
STD17.55.818.616.417.503.02
F28Mean33703380328032203370.003370.00
Worst34803450341034103480.003410.00
Best31703180316031003170.003170.00
STD99.762.610914499.7083.90
F29Mean32003200325032003200.003200.00
Worst32903320345033103290.003300.00
Best31603150316031503160.003140.00
STD38.55070.244.138.5042.50
F30Mean33,200550,000464,000253,000648,000.00893,000.00
Worst831,0001,920,0002,270,000821,0006,730,000.002,450,000.00
Best38205430712045208600.007470.00
STD151,000715,000658,000378,0001,460,000.00621,000.00
Table 3. Friedman rankings of COSGO and the compared algorithms.
Function | COSGO | GWO | SCSO | WOA | AGWO_CS | PSOGSA
1136524
31452.562.5
4534152
5436142
612.55.55.542.5
7136254
8125643
9126453
10245361
11531254
121.5653.51.53.5
131642.52.55
14154263
15153264
16265421
172.52.52.55.52.55.5
18352136
19143265
20165243
212.52.5145.55.5
22222555
23336333
24313233
252.5552.52.55
265.51425.53
27444441
28462144
29336333
30143256
Total66.5106.511985120102.5
Average Ranking2.29313.67244.10342.93104.13793.5345
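The average rankings in Table 3 follow the standard Friedman procedure: the algorithms are ranked per benchmark function (lower mean error ranks better, ties receive the average rank) and the ranks are then averaged per algorithm. The sketch below illustrates this computation with SciPy under the assumption of a results matrix with one row per function and one column per algorithm; it is not the statistical script used in the study.

```python
import numpy as np
from scipy.stats import rankdata, friedmanchisquare

def friedman_ranking(results: np.ndarray):
    """results[i, j] = mean value of algorithm j on function i (lower is better)."""
    # Rank the algorithms within each function; ties get the average rank.
    ranks = np.vstack([rankdata(row, method="average") for row in results])
    avg_rank = ranks.mean(axis=0)                 # one average rank per algorithm
    # Friedman test statistic and p-value across all functions.
    stat, p_value = friedmanchisquare(*results.T)
    return avg_rank, stat, p_value
```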
Table 4. The fitness values of the algorithms.
Level | Image | COSGO (Otsu, Kapur) | GWO (Otsu, Kapur) | SCSO (Otsu, Kapur) | WOA (Otsu, Kapur) | AGWO_CS (Otsu, Kapur) | PSOGSA (Otsu, Kapur)
2118031Value−18,504.54−5.15−18,504.54−5.15−18,504.54−5.15−18,504.51−5.15−18,504.32−5.15−18,504.54−5.15
STD0.000.000.000.000.000.000.050.000.150.000.000.00
118072Value−9517.97−5.04−9517.97−5.04−9517.97−5.04−9517.96−5.04−9517.91−5.04−9517.97−5.04
STD0.000.000.000.000.000.000.020.000.060.000.000.00
326025Value−6334.31−4.88−6334.31−4.88−6334.31−4.88−6334.31−4.88−6334.25−4.88−6334.31−4.88
STD0.000.000.000.000.000.000.000.000.080.000.000.00
120003Value−26,805.11−5.22−26,805.11−5.22−26,805.11−5.22−26,805.08−5.22−26,804.92−5.22−26,805.11−5.22
STD0.000.000.000.000.000.000.060.000.050.000.000.00
253092Value−15,027.04−4.99−15,027.01−4.99−15,027.04−4.99−15,027.04−4.99−15,026.96−4.99−15,027.04−4.99
STD0.000.000.080.000.000.000.010.000.090.000.000.00
8068Value−12,529.92−4.65−12,529.92−4.65−12,529.92−4.65−12,529.92−4.65−12,529.85−4.65−12,529.92−4.65
STD0.000.000.000.000.000.000.010.000.100.000.000.00
81095Value−15,094.56−5.16−15,094.56−5.16−15,094.56−5.16−15,094.56−5.16−15,094.53−5.16−15,094.56−5.16
STD0.000.000.000.000.000.000.000.000.060.000.000.00
92014Value−14,961.03−4.55−14,961.03−4.55−14,961.03−4.55−14,961.03−4.55−14,960.91−4.55−14,961.03−4.55
STD0.000.000.000.000.000.000.010.000.130.000.000.00
365072Value−17,439.29−5.48−17,439.29−5.48−17,439.29−5.48−17,439.29−5.48−17,439.23−5.48−17,439.29−5.48
STD0.000.000.000.000.000.000.000.000.090.000.000.00
384022Value−15,896.95−5.16−15,896.95−5.16−15,896.95−5.16−15,896.93−5.16−15,896.90−5.16−15,896.95−5.16
STD0.000.000.000.000.000.000.050.000.030.000.000.00
3118031Value−18,730.87−5.15−18,730.74−5.15−18,730.87−5.15−18,730.84−5.15−18,730.08−5.15−18,730.87−5.15
STD0.000.000.290.000.000.000.050.000.330.000.000.00
118072Value−9640.56−5.04−9640.56−5.04−9640.54−5.04−9640.52−5.04−9639.99−5.04−9640.56−5.04
STD0.000.000.000.000.040.000.050.000.310.000.000.00
326025Value−6504.93−4.88−6504.93−4.88−6504.92−4.88−6504.85−4.88−6504.16−4.88−6504.93−4.88
STD0.000.000.000.000.010.000.090.000.870.000.000.00
120003Value−27,071.72−5.22−27,071.70−5.22−27,071.71−5.22−27,071.54−5.22−27,070.76−5.22−27,071.72−5.22
STD0.000.000.040.000.030.000.400.000.940.000.000.00
253092Value−15,184.31−4.99−15,183.55−4.99−15,184.31−4.99−15,184.30−4.99−15,183.08−4.99−15,184.31−4.99
STD0.000.001.670.000.000.000.010.000.850.000.000.00
8068Value−12,611.96−4.65−12,611.96−4.65−12,611.96−4.65−12,611.91−4.65−12,611.23−4.65−12,611.96−4.65
STD0.000.000.000.000.000.000.090.000.550.000.000.00
81095Value−15,261.54−5.16−15,261.38−5.16−15,261.53−5.16−15,261.44−5.16−15,260.17−5.16−15,261.54−5.16
STD0.000.000.350.000.020.000.110.000.660.000.000.00
92014Value−15,071.07−4.55−15,070.85−4.55−15,071.09−4.55−15,071.08−4.55−15,070.23−4.55−15,071.09−4.55
STD0.040.000.550.000.000.000.020.000.760.000.010.00
365072Value−17,685.43−5.48−17,685.43−5.48−17,685.42−5.48−17,685.34−5.48−17,684.14−5.48−17,685.43−5.48
STD0.000.000.000.000.030.000.130.001.180.000.000.00
384022Value−16,124.65−5.16−16,124.65−5.16−16,124.65−5.16−16,124.62−5.16−16,123.64−5.16−16,124.65−5.16
STD0.000.000.000.000.010.000.040.000.770.000.000.00
4118031Value−18,805.074−5.152−18,804.707−5.152−18,805.081−5.152−18,804.720−5.152−18,801.351−5.152−18,805.082−5.152
STD0.0090.0000.8450.0000.0200.0000.3940.0002.1470.0000.0110.000
118072Value−9695.921−5.036−9695.365−5.036−9695.908−5.036−9695.822−5.036−9693.808−5.036−9695.941−5.036
STD0.0190.0001.2710.0000.0400.0000.1640.0001.8210.0000.0000.000
326025Value−6593.882−4.885−6593.882−4.885−6593.886−4.885−6593.854−4.885−6592.268−4.885−6593.886−4.885
STD0.0140.0000.0140.0000.0050.0000.0440.0001.1370.0000.0050.000
120003Value−27,163.825−5.220−27,163.808−5.220−27,163.801−5.220−27,163.555−5.220−27,161.155−5.220−27,163.828−5.220
STD0.0030.0000.0380.0000.0540.0000.2440.0001.2770.0000.0000.000
253092Value−15,254.533−4.990−15,254.046−4.990−15,254.540−4.990−15,254.434−4.990−15,252.271−4.990−15,254.543−4.990
STD0.0150.0001.0780.0000.0080.0000.1050.0001.0130.0000.0000.000
8068Value−12,666.569−4.645−12,666.569−4.645−12,666.571−4.645−12,666.394−4.645−12,664.976−4.645−12,662.381−4.645
STD0.0040.0000.0040.0000.0000.0000.2150.0001.1480.0009.3690.000
81095Value−15,321.807−5.156−15,321.806−5.156−15,321.817−5.156−15,321.695−5.156−15,320.714−5.156−15,321.807−5.156
STD0.0280.0000.0310.0000.0230.0000.1610.0000.5900.0000.0280.000
92014Value−15,132.280−4.552−15,132.276−4.552−15,132.281−4.552−15,132.270−4.552−15,129.595−4.552−15,132.276−4.552
STD0.0050.0000.0060.0000.0050.0000.0310.0001.5150.0000.0070.000
365072Value−17,791.010−5.483−17,790.996−5.483−17,791.033−5.483−17,790.846−5.483−17,787.168−5.483−17,791.008−5.483
STD0.0320.0000.0510.0000.0000.0000.2750.0001.6190.0000.0410.000
384022Value−16,180.251−5.160−16,180.238−5.160−16,169.119−5.160−16,179.835−5.160−16,177.785−5.160−16,179.035−5.160
STD0.0090.0000.0090.00024.8590.0000.6610.0001.1260.0002.7270.000
5118031Value−18,843.8−5.15164−1.88 × 104−5.15 × 100−18,836−5.15164−18,843.5−5.15164−18,843.8−5.15164−1.88 × 104−5.15 × 100
STD0.02457502.1858056.28 × 10−1617.2639600.43242800.02457502.1858056.28 × 10−16
118072Value−9.73 × 103−5.04 × 100−9.72 × 103−5.04 × 100−9.72 × 103−5.04 × 100−9.73 × 103−5.04 × 100−9.73 × 103−5.04 × 100−9.72 × 103−5.04 × 100
STD0.0152098.88 × 10−161.7137924.44 × 10−1613.315474.44 × 10−160.4378018.88 × 10−160.0152098.88 × 10−161.7137924.44 × 10−16
326025Value−6640.82−4.88486−6.64 × 103−4.88 × 100−6.64 × 103−4.88 × 100−6.64 × 103−4.88 × 100−6640.82−4.88486−6.64 × 103−4.88 × 100
STD0.0232500.0003556.28 × 10−160.0070647.69 × 10−160.3217447.69 × 10−160.0232500.0003556.28 × 10−16
120003Value−2.72 × 104−5.22 × 100−2.72 × 104−5.22 × 100−2.72 × 104−5.22 × 100−27,219.5−5.21957−2.72 × 104−5.22 × 100−2.72 × 104−5.22 × 100
STD0.0621417.69 × 10−160.1389898.88 × 10−160.2395238.88 × 10−161.98043300.0621417.69 × 10−160.1389898.88 × 10−16
253092Value−1.53 × 104−4.99 × 100−15,292.5−4.99043−15,293.9−4.99043−15,285.9−4.99043−1.53 × 104−4.99 × 100−15,292.5−4.99043
STD0.0887494.44 × 10−163.14083900.001474017.5307400.0887494.44 × 10−163.1408390
8068Value−1.27 × 104−4.65 × 100−1.27 × 104−4.65 × 100−1.27 × 104−4.65 × 100−1.27 × 104−4.65 × 100−1.27 × 104−4.65 × 100−1.27 × 104−4.65 × 100
STD0.0054994.44 × 10−162.9469294.44 × 10−160.0222741.09 × 10−150.9014768.88 × 10−160.0054994.44 × 10−162.9469294.44 × 10−16
81095Value−1.54 × 104−5.16 × 100−1.54 × 104−5.16 × 100−1.54 × 104−5.16 × 100−1.54 × 104−5.16 × 100−1.54 × 104−5.16 × 100−1.54 × 104−5.16 × 100
STD0.0066427.69 × 10−160.4216324.44 × 10−1616.243586.28 × 10−160.666824.44 × 10−160.0066427.69 × 10−160.4216324.44 × 10−16
92014Value−1.52 × 104−4.55 × 100−15,159.7−4.55222−1.52 × 104−4.55 × 100−15,160.5−4.55222−1.52 × 104−4.55 × 100−15,159.7−4.55222
STD0.0577626.28 × 10−161.83048900.0176744.44 × 10−160.05278100.0577626.28 × 10−161.8304890
365072Value−1.78 × 104−5.48 × 100−17,848.9−5.48253−1.78 × 104−5.48 × 100−1.78 × 104−5.48 × 100−1.78 × 104−5.48 × 100−17,848.9−5.48253
STD0.0362634.44 × 10−160.03086700.1165131.09 × 10−150.1509338.88 × 10−160.0362634.44 × 10−160.0308670
384022Value−1.62 × 104−5.16 × 100−16,225.7−5.16025−1.62 × 104−5.16 × 100−16,226−5.16025−1.62 × 104−5.16 × 100−16,225.7−5.16025
STD0.0207376.28 × 10−162.20318400.2115424.44 × 10−160.70488200.0207376.28 × 10−162.2031840
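The fitness values in Table 4 are the negated Otsu and Kapur objectives evaluated at the best threshold vectors found by each algorithm (the optimizers minimize the negative of each criterion, hence the negative signs). The following minimal sketch restates both criteria for a greyscale histogram; it is an illustrative reimplementation under the assumption of sorted integer thresholds, not the code used in the experiments.

```python
import numpy as np

def _class_bounds(thresholds, n_bins=256):
    t = [0] + sorted(int(x) for x in thresholds) + [n_bins]
    return [(t[i], t[i + 1]) for i in range(len(t) - 1)]

def otsu_objective(hist, thresholds):
    """Between-class variance (to be maximized) for a greyscale histogram."""
    p = hist / hist.sum()
    levels = np.arange(p.size)
    mu_total = (levels * p).sum()
    var = 0.0
    for lo, hi in _class_bounds(thresholds, p.size):
        w = p[lo:hi].sum()
        if w > 0:
            mu = (levels[lo:hi] * p[lo:hi]).sum() / w
            var += w * (mu - mu_total) ** 2
    return var

def kapur_objective(hist, thresholds):
    """Sum of class entropies (to be maximized) for a greyscale histogram."""
    p = hist / hist.sum()
    total = 0.0
    for lo, hi in _class_bounds(thresholds, p.size):
        w = p[lo:hi].sum()
        if w > 0:
            q = p[lo:hi] / w
            q = q[q > 0]
            total += -(q * np.log(q)).sum()
    return total

# Illustrative usage:
# hist = np.bincount(gray_image.ravel(), minlength=256).astype(float)
# fitness = -otsu_objective(hist, [62, 121, 190])   # minimized by the optimizer
```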
Table 5. The running time of each algorithm.
Level | Image | COSGO (Otsu, Kapur) | GWO (Otsu, Kapur) | SCSO (Otsu, Kapur) | WOA (Otsu, Kapur) | AGWO_CS (Otsu, Kapur) | PSOGSA (Otsu, Kapur)
21180310.87690.95590.28550.31860.29690.32180.28840.31460.30030.32980.29000.3198
1180720.83310.95630.28610.31810.28980.32230.28020.31360.29660.32900.28420.3195
3260250.83680.95900.28620.31970.29380.32250.28250.31430.29430.33160.29400.3191
1200030.84350.96370.28420.31890.28790.32680.28210.31520.29730.32950.28240.3220
2530920.84320.95070.28590.31730.29130.32150.27910.31010.29380.32750.28400.3171
80680.84370.95730.28310.31900.29130.32360.28090.31330.29300.32920.28780.3189
810950.83510.95470.27940.31870.29020.32530.28150.31170.29320.32950.29490.3202
920140.85070.95200.28600.31810.29190.32540.28460.31350.29510.33070.28590.3193
3650720.83250.94840.27860.31620.28300.32030.28280.31040.29190.32730.28390.3181
3840220.83340.94700.28130.32720.28720.32370.28150.31440.29120.33060.28990.3201
31180310.88830.97670.29310.32540.29450.33290.29340.32280.31390.34570.29490.3223
1180720.85390.96950.28350.32580.29350.32880.28430.32560.30300.34320.28740.3237
3260250.87390.98390.28760.32900.29120.33240.29290.32600.30680.34300.29760.3245
1200030.86530.97690.28790.32650.29340.33730.28870.32770.30560.34510.29120.3251
2530920.85730.96010.28920.31980.29330.32950.28860.32060.30250.33520.28950.3199
80680.85590.97880.28860.32570.29500.33300.28530.32470.30660.33760.29580.3290
810950.86180.96090.28580.32590.29420.33470.29070.32150.30450.33960.29780.3251
920140.86980.97250.28770.32700.29520.33230.28500.32380.30800.34360.29500.3256
3650720.85600.96180.28470.32520.29150.33220.28350.32120.30220.34180.28690.3213
3840220.86080.97530.28560.32460.28910.33330.28420.32650.30280.33910.28660.3258
41180310.90171.00800.29210.33690.29690.33550.28970.32710.31370.35030.30190.3348
1180720.89710.99840.29080.34280.29750.34100.28580.33390.31010.35000.29420.3341
3260250.87711.00220.28920.33350.29330.33610.28960.33650.31050.35190.29800.3373
1200030.87220.98900.29910.33490.30040.33940.28890.33380.31600.35330.29550.3375
2530920.87930.96810.29980.33090.29470.34120.28900.32990.30790.35030.29480.3290
80680.89010.97260.28940.33270.29740.33630.28520.32650.30940.34880.30100.3295
810950.87970.97990.29070.33200.29800.33430.28640.32780.30730.34910.30210.3356
920140.88961.00010.29320.33070.29610.35050.29120.32690.31130.35790.30060.3328
3650720.86100.99310.28920.33480.29380.33340.28920.33340.30630.35010.29310.3297
3840220.87900.99920.30080.33400.29680.33650.29190.32980.30550.34890.29500.3353
51180310.89700.99170.31130.34510.30470.33990.29650.32750.32050.35590.30450.3325
1180720.87630.98980.31340.34150.29680.33650.28990.32880.30920.36310.29580.3310
3260250.89020.99900.31290.34650.30180.33990.28820.32540.31460.35970.30130.3326
1200030.88570.99360.32030.34860.30070.33370.29180.33150.31310.35460.30060.3329
2530920.90020.98180.30620.33900.29650.33470.28920.32820.31550.37310.29790.3289
80680.88740.98330.30240.34010.29850.33230.28900.32610.31340.36050.30140.3330
810950.87220.99100.30970.33480.29640.33930.28850.32910.31740.36790.30390.3286
920140.87860.99420.30700.34210.29710.33990.29210.33940.31510.36990.29960.3276
3650720.86060.99630.29840.33950.29740.33640.28750.32850.30720.35220.29870.3273
3840220.88661.01770.29690.33460.29360.33760.28600.32830.30980.35480.29720.3271
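The running times in Table 5 are wall-clock measurements of single segmentation runs. A timing harness of the following form could produce such figures; the optimizer call signature shown here is hypothetical and only serves to illustrate the measurement pattern.

```python
import time

def timed_run(optimizer, objective, n_thresholds):
    """Measure the wall-clock time of a single optimization run.

    `optimizer` is any callable returning the best threshold vector; its
    signature here is an assumption, not the paper's interface.
    """
    start = time.perf_counter()
    best_thresholds = optimizer(objective, dim=n_thresholds)
    elapsed = time.perf_counter() - start
    return best_thresholds, elapsed
```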
Table 6. The PSNR and MSE values of the algorithms.
Level | Image | COSGO (Otsu, Kapur) | GWO (Otsu, Kapur) | SCSO (Otsu, Kapur) | WOA (Otsu, Kapur) | AGWO_CS (Otsu, Kapur) | PSOGSA (Otsu, Kapur)
2118031PSNR19.5116.1419.5116.7619.5116.2319.5016.4719.4215.4619.5116.18
MSE727.941603.79727.941416.16727.941561.26729.861484.20743.181919.64727.941588.24
118072PSNR19.5416.8019.5417.6619.5417.9119.5416.8719.5415.0619.5417.08
MSE723.021498.45723.021225.43723.021223.50723.091420.47723.362130.41723.021317.66
326025PSNR21.5419.7021.5416.2621.5415.3321.5516.0421.5416.8721.5415.22
MSE455.85761.11455.852075.13455.852277.03455.551862.70456.181659.86455.853026.49
120003PSNR20.2114.8520.2114.5520.2113.8320.2015.5520.1612.4420.2116.58
MSE619.692412.10619.692619.17619.692693.16620.892017.50627.364020.18619.691658.73
253092PSNR15.1614.4215.1614.3515.1614.3515.1714.1715.1713.8115.1614.81
MSE1981.662378.721979.692411.361981.662393.701979.082519.821975.382721.101981.662152.47
8068PSNR16.1410.5516.1410.5516.1411.4616.1410.9516.1410.5516.1412.04
MSE1582.455747.191582.455737.831582.454724.231582.645410.561583.095919.621582.454177.89
81095PSNR20.0015.6420.0016.0720.0016.5220.0016.8419.9914.9720.0015.98
MSE649.971775.69649.971667.19649.971506.76649.971375.75651.712083.89649.971668.81
92014PSNR9.749.599.749.459.749.529.759.439.759.519.749.43
MSE6898.327156.046898.327382.346898.327269.336895.707433.116894.527287.666898.327432.31
365072PSNR20.6712.8620.6712.5320.6713.0420.6714.4220.6613.5020.6716.95
MSE557.463465.46557.463683.50557.463291.42557.462572.63558.163347.27557.461614.96
384022PSNR19.9316.8919.9316.4019.9317.2919.9217.9719.9415.8519.9316.72
MSE661.261426.22661.261589.15661.261283.28661.921091.16659.151795.36661.261527.68
3118031PSNR22.935116.126822.939516.813222.935117.843222.977116.421823.001717.076622.935116.7904
MSE330.80661587.5032330.46631388.9745330.80661081.3988327.63471500.8736325.89381308.5358330.80661450.2195
118072PSNR21.867119.032121.867120.408221.868820.882321.872118.746721.878318.397221.867118.8232
MSE423.0325857.9918423.0325609.7335422.8679573.7079422.5413921.7215422.31211254.7416423.0325954.7287
326025PSNR24.074519.526624.074519.220424.068419.649524.065018.174024.019118.672024.074518.4796
MSE254.4667768.5329254.4667947.5201254.8246754.1504255.02231163.7450257.7481978.2080254.4667970.3092
120003PSNR23.568717.264323.566417.998923.567818.360323.565016.604223.517713.992523.568719.9053
MSE285.89431519.4354286.04631195.9356285.95801040.8176286.13981836.8011289.28052599.2134285.8943697.5528
253092PSNR15.619114.277915.630315.102215.619115.178615.619314.734015.595414.633915.619114.9065
MSE1783.06512450.05771778.49722014.73541783.06511985.25761783.01042188.16181792.86732267.97891783.06512107.3803
8068PSNR16.454611.088716.454612.515916.454611.850816.452611.210416.459710.132116.454612.1029
MSE1471.02925093.57851471.02923961.21851471.02924329.97971471.69884978.91801469.31466633.43751471.02924327.0509
81095PSNR23.503415.898823.493015.923023.523516.994323.524618.281823.549616.268823.503418.4466
MSE290.22671679.0968290.93041693.7111288.90141403.3690288.83231116.9040287.18321618.6846290.22671002.2381
92014PSNR9.98549.82449.98359.87569.98559.60249.98379.68139.97989.88779.98549.6678
MSE6524.40496773.20896527.21326691.63306524.19137135.66746527.02657000.34876532.76376674.33006524.35267024.2659
365072PSNR23.264413.146123.264415.381223.263514.468023.263812.360823.248215.926823.264419.0736
MSE306.64493344.1274306.64492150.0121306.70882687.2560306.68933777.8059307.79832194.2999306.64491173.5723
384022PSNR24.646416.294024.646417.205424.644318.082724.648116.470024.623217.731324.646418.0721
MSE223.06801558.4437223.06801277.8032223.17731046.8039222.98441499.4587224.27291210.3236223.06801060.0297
4118031PSNR24.5317.5024.5618.9324.5217.8224.5417.6624.4815.8824.5318.40
MSE229.081177.38227.77922.91229.691135.00228.781131.68232.161716.31228.921005.15
118072PSNR24.0319.0624.0221.9723.9920.3424.0821.3723.9319.9724.0219.57
MSE256.87935.98257.66438.79259.48658.31254.19638.95263.56807.32257.63876.66
326025PSNR25.9822.1725.9819.9325.9719.8325.9821.0525.9417.4625.9719.21
MSE164.02422.00164.02880.27164.37742.25164.04542.57165.602176.26164.37926.52
120003PSNR25.2616.7125.2620.6125.2717.5225.2519.9725.2013.1425.2718.46
MSE193.711893.83193.61621.12193.381472.89193.96670.40196.223350.84193.181321.95
253092PSNR15.9815.1015.9615.3815.9815.4615.9714.5815.8914.4615.9815.04
MSE1642.722017.101649.541898.591642.611854.571643.002306.101673.702356.261642.552066.07
8068PSNR16.4610.9816.4612.0716.4613.6016.4711.6416.459.9616.4912.68
MSE1469.125234.301469.124248.121468.473167.391466.014648.601472.556886.641459.923715.07
81095PSNR24.8816.5724.8619.5824.8817.4724.8719.1524.8616.8624.8817.32
MSE211.341514.17212.51780.56211.521278.55211.78868.95212.491527.53211.341352.85
92014PSNR10.109.8310.109.9710.1010.0510.109.6510.109.8810.109.85
MSE6356.416772.506356.946554.166356.256424.656356.117058.706355.096686.686357.706727.22
365072PSNR25.1812.7225.1817.3925.1914.9025.1816.8225.0913.9525.1817.74
MSE197.063535.42197.191317.40196.952608.09197.431726.96201.562873.21197.151578.84
384022PSNR25.6116.9825.6419.4325.3720.5025.5217.8825.5217.2925.6620.30
MSE178.721321.67177.47884.30189.47672.84182.511193.42182.591274.93176.83699.73
5118031PSNR26.0518.6026.0520.3426.1418.5126.1621.6625.9618.6326.0219.27
MSE161.681008.74161.58648.90158.231011.57157.78469.07165.381037.91162.54865.14
118072PSNR25.6920.5625.6221.4225.6320.8125.6121.6824.9721.3425.5723.31
MSE175.60681.95178.19495.47178.10694.31178.67449.19208.81504.80180.20315.30
326025PSNR27.6319.8027.6322.5327.6318.8727.6122.4927.3920.1127.6422.14
MSE112.251023.23112.18395.52112.231243.14112.68403.09118.59651.54111.92460.49
120003PSNR26.9418.9926.9419.3026.9314.7226.9217.3926.6714.1026.9121.18
MSE131.66846.31131.44998.71131.752343.70132.241702.88139.942545.59132.34541.65
253092PSNR16.2415.1116.2315.7016.2315.0716.2115.4616.2614.9516.2015.51
MSE1544.162021.331550.361770.171547.552028.601554.611856.851537.962106.701561.641850.45
8068PSNR16.6211.6916.6214.1216.6213.9816.6112.7116.6310.8516.6313.34
MSE1414.484476.091414.452726.311414.552954.991418.463795.221413.465740.311413.743287.62
81095PSNR26.4816.3126.4922.0726.4119.9826.4321.2325.9815.4026.0918.99
MSE146.241534.75145.93428.28148.49779.80148.11630.37164.101958.40161.68952.29
92014PSNR10.159.7210.149.8710.159.8110.149.9410.159.9410.159.82
MSE6285.966942.666289.186701.736287.076793.336298.676598.726288.246592.386288.956772.64
365072PSNR26.7414.6326.7218.4026.7416.9926.7116.6026.5116.1426.7419.83
MSE137.742659.72138.391148.47137.891879.41138.571962.17145.112454.43137.72747.78
384022PSNR27.1618.2627.1620.4027.1221.6726.8216.8926.8519.1327.1221.90
MSE125.141058.45125.01655.66126.19502.82136.741398.76134.341062.81126.24605.26
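The MSE and PSNR entries in Table 6 follow the usual definitions for 8-bit images, with PSNR = 10·log10(255²/MSE). A minimal sketch, assuming the original and segmented images are uint8 arrays of equal shape:

```python
import numpy as np

def mse(original: np.ndarray, segmented: np.ndarray) -> float:
    """Mean squared error between two images of equal shape."""
    diff = original.astype(np.float64) - segmented.astype(np.float64)
    return float(np.mean(diff ** 2))

def psnr(original: np.ndarray, segmented: np.ndarray) -> float:
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    m = mse(original, segmented)
    return float("inf") if m == 0 else 10.0 * np.log10(255.0 ** 2 / m)
```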
Table 7. The SSIM and FSIM values of the algorithms.
Level | Image | COSGO (Otsu, Kapur) | GWO (Otsu, Kapur) | SCSO (Otsu, Kapur) | WOA (Otsu, Kapur) | AGWO_CS (Otsu, Kapur) | PSOGSA (Otsu, Kapur)
2118031SSIM0.69290.54210.69290.55330.69290.51960.69280.51390.69140.50790.69290.5061
FSIM0.29400.18500.29400.18800.29400.16570.29390.15460.29340.15950.29400.1507
118072SSIM0.61880.44240.61880.48800.61880.52800.61880.45280.61880.40950.61880.4891
FSIM0.28840.17950.28840.20990.28840.23100.28860.18660.28890.15870.28840.1906
326025SSIM0.61850.56930.61850.43850.61850.45040.61850.46380.61790.49570.61850.4215
FSIM0.27500.25960.27500.20680.27500.22670.27490.22120.27440.24180.27500.2058
120003SSIM0.72240.59910.72240.63450.72240.70030.72220.69000.72230.63020.72240.6400
FSIM0.24860.20170.24860.20660.24860.21260.24870.22560.24870.19880.24860.2258
253092SSIM0.62070.52510.62040.51640.62070.53600.62060.50770.62000.46930.62070.5718
FSIM0.23820.21040.23810.20390.23820.19260.23820.20760.23810.19850.23820.2117
8068SSIM0.65970.40880.65970.39510.65970.43570.65970.41900.65970.40910.65970.4586
FSIM0.15190.11220.15190.12220.15190.13400.15200.12780.15200.13310.15190.1357
81095SSIM0.65950.58130.65950.58840.65950.59710.65950.60880.65940.58110.65950.5911
FSIM0.24540.14470.24540.15950.24540.17390.24540.17460.24550.15910.24540.1629
92014SSIM0.45990.45180.45990.42520.45990.43110.46020.41200.46050.43760.45990.4295
FSIM0.29150.28560.29150.27940.29150.27870.29150.24790.29120.27720.29150.2794
365072SSIM0.57630.24470.57630.22890.57630.25290.57630.31750.57620.28450.57630.4335
FSIM0.25070.06860.25070.05550.25070.07420.25070.10980.25040.08930.25070.1712
384022SSIM0.68900.60610.68900.59920.68900.61850.68880.63750.68870.56520.68900.6073
FSIM0.21670.19010.21670.18010.21670.20120.21670.20860.21590.17790.21670.1790
3118031SSIM0.77990.49210.78000.55590.77990.66440.78070.55360.77990.58980.77990.5601
FSIM0.36570.14770.36540.18960.36570.28540.36610.19320.36610.22980.36570.2006
118072SSIM0.71140.52450.71140.55360.71150.59000.71160.54100.71140.52270.71140.5292
FSIM0.37640.21330.37640.23870.37640.26640.37650.24220.37620.22820.37640.2315
326025SSIM0.69490.57790.69490.54880.69480.58000.69430.54800.69340.59410.69490.5332
FSIM0.34200.27480.34200.25640.34190.27760.34170.26820.34190.29470.34200.2520
120003SSIM0.78040.66090.78070.67610.78050.69890.78050.70750.77940.70710.78040.6996
FSIM0.28160.22670.28170.23180.28170.24380.28170.23760.28160.22570.28160.2514
253092SSIM0.64240.52500.64220.56040.64240.55970.64250.56210.64190.55740.64240.5635
FSIM0.29140.19650.29200.23790.29140.23720.29150.20680.29060.23690.29140.2186
8068SSIM0.68260.42580.68260.47130.68260.45490.68290.44150.68300.40600.68260.4437
FSIM0.17690.13640.17690.16200.17690.13640.17710.12240.17770.11570.17690.1582
81095SSIM0.73430.58500.73460.58230.73440.60160.73400.63340.73470.59320.73430.6395
FSIM0.29890.14770.29910.15500.29890.17420.29880.20920.29770.17320.29890.2195
92014SSIM0.54340.49700.54340.50620.54330.46450.54310.46660.54220.53510.54330.4583
FSIM0.37910.32790.37850.31390.37850.30750.37820.30880.38010.36020.37860.2978
365072SSIM0.67670.26480.67670.37200.67670.33140.67650.22670.67710.39650.67670.5367
FSIM0.34370.08210.34370.14030.34360.12630.34370.05750.34300.16200.34370.2575
384022SSIM0.73310.59300.73310.61930.73310.64810.73310.61220.73370.62020.73310.6317
FSIM0.26550.16360.26550.21530.26550.22190.26560.18270.26610.22700.26550.2186
4118031SSIM0.80910.57630.80910.63880.80870.60330.80880.58410.80910.52560.80900.6160
FSIM0.42710.19330.42740.25840.42750.22550.42700.19850.42650.18450.42740.2529
118072SSIM0.77260.54780.77120.63950.77190.56930.77240.60180.77010.62330.77200.5608
FSIM0.44870.25720.44800.31160.44830.25690.44860.28290.44650.31670.44850.2620
326025SSIM0.76300.63020.76300.55990.76310.60220.76250.61050.76220.52890.76310.5469
FSIM0.40120.30530.40120.26950.40140.30370.40100.30440.40080.27520.40140.2654
120003SSIM0.78570.63430.78480.72080.78460.71980.78440.73110.78600.66790.78490.6765
FSIM0.30420.23030.30470.26950.30460.24800.30490.26210.31110.21750.30460.2526
253092SSIM0.67740.57090.67750.61380.67740.62080.67760.54750.67720.54580.67720.5887
FSIM0.34020.23030.33990.26300.34010.25540.34030.21740.33940.21410.34000.2291
8068SSIM0.65870.43050.65870.45610.65860.52850.66190.44540.65670.38120.66480.4835
FSIM0.31180.12770.31180.16090.31180.16350.31280.12820.31060.14060.28890.1552
81095SSIM0.76260.60110.76230.66730.76250.61710.76210.65560.76120.60300.76260.6287
FSIM0.34060.16910.34040.26250.34070.19210.34020.23570.33850.18760.34060.2028
92014SSIM0.61570.49000.61550.53240.61590.55210.61580.47870.61360.54040.61490.5047
FSIM0.44720.32420.44690.34640.44750.35520.44730.31640.44430.37840.44660.3290
365072SSIM0.75100.24130.75080.46160.75090.34740.75100.45340.75060.31740.75090.4956
FSIM0.42790.06880.42780.19460.42770.13830.42780.20640.42770.12110.42780.2445
384022SSIM0.79240.60920.79200.64820.77970.66960.78940.63380.79030.61680.78560.6756
FSIM0.39030.20260.39000.23710.36490.27540.38870.24340.38750.24910.37130.2970
5118031SSIM0.83950.61540.83870.71600.83340.70940.83980.64020.83950.61540.83870.7160
FSIM0.47630.24640.47530.33650.46640.32780.47640.25940.47630.24640.47530.3365
118072SSIM0.81220.61650.81200.62960.80420.63220.81270.60160.81220.61650.81200.6296
FSIM0.50330.30230.50290.31160.49250.33160.50250.30460.50330.30230.50290.3116
326025SSIM0.81340.53480.81280.63010.81320.61670.81240.57740.81340.53480.81280.6301
FSIM0.45310.26060.45230.30830.45280.30860.45210.28670.45310.26060.45230.3083
120003SSIM0.81200.67190.81330.69950.81300.72280.81320.72070.81200.67190.81330.6995
FSIM0.33630.24900.33720.26320.33690.25880.33610.25140.33630.24900.33720.2632
253092SSIM0.70840.62790.70700.61010.70870.63170.70790.64520.70840.62790.70700.6101
FSIM0.37930.28010.37860.26040.37900.27080.37270.25900.37930.28010.37860.2604
8068SSIM0.66870.47000.67380.62280.66850.52190.66600.46850.66870.47000.67380.6228
FSIM0.33030.13740.32880.18450.32980.22120.32960.14570.33030.13740.32880.1845
81095SSIM0.79260.59210.79230.63310.78730.67780.79270.68380.79260.59210.79230.6331
FSIM0.37960.16020.38020.21730.37210.26400.37860.26520.37960.16020.38020.2173
92014SSIM0.65170.50270.64600.49800.65500.52190.65110.54970.65170.50270.64600.4980
FSIM0.50400.33920.49930.32360.50690.33980.50330.37550.50400.33920.49930.3236
365072SSIM0.80410.30940.80400.57400.80440.38500.80370.39050.80410.30940.80400.5740
FSIM0.49000.11340.49030.28700.49070.16300.48940.16390.49000.11340.49030.2870
384022SSIM0.81670.64480.81780.68150.81670.66750.81480.63240.81670.64480.81780.6815
FSIM0.42120.23370.42470.26410.42110.27400.42290.24110.42120.23370.42470.2641
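The SSIM column in Table 7 can be reproduced with the reference implementation in scikit-image; FSIM is not available in scikit-image and would require a separate implementation or a third-party package. The sketch below therefore covers SSIM only and assumes 8-bit greyscale inputs; it is not necessarily the exact configuration used in the experiments.

```python
from skimage.metrics import structural_similarity as ssim

def ssim_score(original, segmented) -> float:
    # data_range must match the dynamic range of the inputs (255 for uint8).
    return ssim(original, segmented, data_range=255)
```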
Table 8. The dice and Jaccard index values of the algorithms.
Level | Image | COSGO (Otsu, Kapur) | GWO (Otsu, Kapur) | SCSO (Otsu, Kapur) | WOA (Otsu, Kapur) | AGWO_CS (Otsu, Kapur) | PSOGSA (Otsu, Kapur)
2118031Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
118072Dice0.43980.06850.43980.11090.43980.39960.43980.11810.44130.67760.43980.4003
Jaccard0.28190.03940.28190.06100.28190.28670.28190.07080.28310.58710.28190.3060
326025Dice0.17420.15370.17420.11750.17420.42320.17370.29870.17070.43850.17420.0676
Jaccard0.09540.08570.09540.06260.09540.30020.09510.19180.09330.30960.09540.0357
120003Dice0.50130.11650.50130.51080.50130.89660.50130.75040.50360.93730.50130.4133
Jaccard0.33440.06790.33440.41550.33440.81260.33440.63310.33650.88630.33440.2755
253092Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
8068Dice0.28350.16930.28350.18780.28350.20800.28350.31330.28350.47830.28350.2166
Jaccard0.16520.09280.16520.10370.16520.11650.16520.23260.16520.39800.16520.1219
81095Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
92014Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
365072Dice0.43390.07850.43390.07830.43390.09920.43390.27340.43380.67310.43390.2790
Jaccard0.27710.04290.27710.04260.27710.05420.27710.17550.27700.63980.27710.1670
384022Dice0.56360.11310.56360.25980.56360.29140.56410.16170.56360.72770.56360.2853
Jaccard0.39230.06170.39230.20790.39230.21700.39290.08920.39230.63450.39230.2280
3118031Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
118072Dice0.31930.00490.31930.03720.31930.08030.31930.05680.31590.41780.31930.1969
Jaccard0.19000.00250.19000.01930.19000.04480.19000.03290.18770.33430.19000.1485
326025Dice0.12150.20550.12150.11000.12120.11490.12120.19640.11920.35470.12150.0971
Jaccard0.06470.12240.06470.05860.06450.06310.06450.12810.06340.22620.06470.0516
120003Dice0.45360.05780.45360.14170.45360.20360.45360.70260.45160.89440.45360.3224
Jaccard0.29330.02980.29330.08590.29330.11820.29330.59370.29170.80910.29330.2086
253092Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
8068Dice0.26640.13340.26640.21500.26640.22230.26640.18840.26660.49350.26640.1945
Jaccard0.15370.07150.15370.12190.15370.12600.15370.10420.15380.44250.15370.1085
81095Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
92014Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
365072Dice0.32830.03720.32830.13400.32830.14910.32910.03130.32340.17420.32830.3767
Jaccard0.19640.01960.19640.08470.19640.10140.19700.01610.19290.10430.19640.2438
384022Dice0.20290.02990.20290.09890.20290.12510.20200.04580.20780.62330.20290.0624
Jaccard0.11290.01530.11290.05290.11290.06750.11230.02360.11600.50760.11290.0325
4118031Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
118072Dice0.22030.05650.21880.08180.22180.01790.21720.01170.22320.35370.22030.1650
Jaccard0.12380.03270.12280.04580.12470.00910.12190.00590.12560.24660.12380.1196
326025Dice0.10580.04910.10580.08410.10550.14400.10550.08160.10610.45910.10550.0835
Jaccard0.05580.02530.05580.04440.05570.08340.05570.04290.05610.35640.05570.0444
120003Dice0.32720.12620.33150.22630.33000.61320.33020.26670.36380.91710.33000.2370
Jaccard0.19560.06910.19870.13890.19760.50110.19780.16350.22280.84960.19760.1397
253092Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
8068Dice0.26450.13060.26450.19010.26430.20530.26500.17620.26320.62940.25490.1967
Jaccard0.15240.06990.15240.10530.15230.11590.15270.09750.15160.56540.14620.1097
81095Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
92014Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
365072Dice0.26510.02500.26690.22230.26690.05290.26420.11180.25740.42170.26600.1919
Jaccard0.15280.01290.15400.13310.15400.02810.15220.06280.14780.40420.15340.1292
384022Dice0.19300.01250.19020.11610.19570.12560.19210.19700.19270.64540.17380.1601
Jaccard0.10680.00630.10510.06370.10850.07220.10620.16150.10670.51350.09570.0940
5118031Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
118072Dice0.16180.03380.16330.03110.16330.03770.16630.01080.19010.15500.16490.0082
Jaccard0.08800.01840.08890.01650.08890.01980.09070.00550.10530.08650.08980.0042
326025Dice0.08750.03660.08740.07200.08770.12400.08690.03040.09090.18050.08790.0787
Jaccard0.04570.01870.04570.03880.04580.07280.04540.01560.04760.10420.04600.0410
120003Dice0.29150.09980.29310.32650.29290.76650.28750.29880.29430.89290.29710.2040
Jaccard0.17060.05370.17170.24370.17160.67820.16790.22790.17300.80660.17450.1236
253092Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
8068Dice0.21190.13070.21060.21270.21130.21490.21430.21500.20450.41840.21190.2064
Jaccard0.11850.07060.11770.12070.11810.12150.12000.12110.11400.35320.11850.1159
81095Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
92014Dice0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
Jaccard0.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.00000.0000
365072Dice0.23150.04180.22870.13820.23240.05020.23420.03730.25010.30510.23150.2013
Jaccard0.13090.02170.12920.08430.13150.02620.13270.01960.14310.24880.13090.1191
384022Dice0.09230.04400.09130.14130.09100.13590.11120.01420.09630.40800.09160.1170
Jaccard0.04840.02280.04780.07720.04770.07480.05930.00710.05060.32740.04800.0637
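The Dice and Jaccard indices in Table 8 measure region overlap between a binarized segmentation and a reference mask; rows of zeros indicate images with no overlapping foreground for the evaluated class. A minimal sketch for boolean masks follows; how the binary masks are derived from the multilevel result is an assumption here, not specified by the table.

```python
import numpy as np

def dice_jaccard(pred: np.ndarray, ref: np.ndarray):
    """Dice and Jaccard indices for two boolean masks of equal shape."""
    pred, ref = pred.astype(bool), ref.astype(bool)
    inter = np.logical_and(pred, ref).sum()
    union = np.logical_or(pred, ref).sum()
    size_sum = pred.sum() + ref.sum()
    dice = 2.0 * inter / size_sum if size_sum else 0.0
    jaccard = inter / union if union else 0.0
    return float(dice), float(jaccard)
```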
Table 9. The accuracy values of the algorithms.
Level | Image | COSGO (Otsu, Kapur) | GWO (Otsu, Kapur) | SCSO (Otsu, Kapur) | WOA (Otsu, Kapur) | AGWO_CS (Otsu, Kapur) | PSOGSA (Otsu, Kapur)
21180310.90790.87200.90790.92990.90790.95580.90780.94680.90750.54800.90790.9297
1180720.28190.03940.28190.06100.28190.28670.28190.07080.28310.58710.28190.3060
3260250.09540.08570.09540.06260.09540.30020.09510.19180.09330.30960.09540.0357
1200030.33440.06790.33440.41550.33440.81260.33440.63310.33650.88630.33440.2755
2530920.87170.82140.87220.75720.87170.85160.87170.66970.87270.56850.87170.9111
80680.16520.09280.16520.10370.16520.11650.16520.23260.16520.39800.16520.1219
810950.79980.98980.79980.99110.79980.93420.79980.98580.79880.43000.79980.9897
920140.87810.83400.87810.77750.87810.89810.87810.70170.87810.76750.87810.7659
3650720.27710.04290.27710.04260.27710.05420.27710.17550.27700.63980.27710.1670
3840220.39230.06170.39230.20790.39230.21700.39290.08920.39230.63450.39230.2280
31180310.91730.94300.91760.91460.91730.81500.91680.92440.91680.92020.91730.9382
1180720.19000.00250.19000.01930.19000.04480.19000.03290.18770.33430.19000.1485
3260250.06470.12240.06470.05860.06450.06310.06450.12810.06340.22620.06470.0516
1200030.29330.02980.29330.08590.29330.11820.29330.59370.29170.80910.29330.2086
2530920.90620.92820.90330.91180.90620.91340.90620.92930.90570.70590.90620.9261
80680.15370.07150.15370.12190.15370.12600.15370.10420.15380.44250.15370.1085
810950.84120.99590.84120.79680.84200.79800.84280.93890.84810.77540.84120.9752
920140.88600.88460.88570.88710.88630.76060.88630.89340.88630.85580.88610.8921
3650720.19640.01960.19640.08470.19640.10140.19700.01610.19290.10430.19640.2438
3840220.11290.01530.11290.05290.11290.06750.11230.02360.11600.50760.11290.0325
41180310.92120.95450.92110.90680.92140.92880.92140.93720.92070.75220.92120.9349
1180720.12380.03270.12280.04580.12470.00910.12190.00590.12560.24660.12380.1196
3260250.05580.02530.05580.04440.05570.08340.05570.04290.05610.35640.05570.0444
1200030.19560.06910.19870.13890.19760.50110.19780.16350.22280.84960.19760.1397
2530920.91990.92460.91940.93400.91990.92540.91940.83740.92040.75240.92020.9118
80680.15240.06990.15240.10530.15230.11590.15270.09750.15160.56540.14620.1097
810950.86190.99680.86310.95420.86250.99730.86320.97160.86180.52120.86190.8155
920140.89450.91520.89430.88380.89450.89030.89450.84330.89550.84540.89480.8886
3650720.15280.01290.15400.13310.15400.02810.15220.06280.14780.40420.15340.1292
3840220.10680.00630.10510.06370.10850.07220.10620.16150.10670.51350.09570.0940
51180310.92370.97440.92620.91210.92390.94380.92340.90980.92470.92650.92400.9150
1180720.08800.01840.08890.01650.08890.01980.09070.00550.10530.08650.08980.0042
3260250.04570.01870.04570.03880.04580.07280.04540.01560.04760.10420.04600.0410
1200030.17060.05370.17170.24370.17160.67820.16790.22790.17300.80660.17450.1236
2530920.92980.95890.92980.93900.92940.79430.92790.91530.93480.72110.93970.9135
80680.11850.07060.11770.12070.11810.12150.12000.12110.11400.35320.11850.1159
810950.90470.99780.90530.98240.90340.97260.90210.93660.88260.21470.89480.9511
920140.90000.93090.90070.91380.90030.91370.89960.91230.90290.85960.90140.9049
3650720.13090.02170.12920.08430.13150.02620.13270.01960.14310.24880.13090.1191
3840220.04840.02280.04780.07720.04770.07480.05930.00710.05060.32740.04800.0637
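The accuracy values in Table 9 correspond to the fraction of pixels whose assigned class label agrees with the reference labeling. A one-line sketch for integer label maps of equal shape:

```python
import numpy as np

def pixel_accuracy(pred_labels: np.ndarray, ref_labels: np.ndarray) -> float:
    """Fraction of pixels whose predicted label matches the reference label."""
    return float(np.mean(pred_labels == ref_labels))
```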
