Article

Clustering-Based Binarization Methods Applied to the Crow Search Algorithm for 0/1 Combinatorial Problems

1 Dirección de Tecnologías de Información y Comunicación, Universidad de Valparaíso, 2361864 Valparaíso, Chile
2 Escuela de Ingeniería Informática, Pontificia Universidad Católica de Valparaíso, 2362807 Valparaíso, Chile
3 Escuela de Ingeniería Industrial, Universidad Diego Portales, 8370109 Santiago, Chile
4 Departamento de Informática, Universidad Técnica Federico Santa Maria, 2390123 Valparaíso, Chile
5 Escuela de Ingeniería Informática, Universidad de Valparaíso, 2362905 Valparaíso, Chile
* Authors to whom correspondence should be addressed.
Mathematics 2020, 8(7), 1070; https://doi.org/10.3390/math8071070
Submission received: 1 June 2020 / Revised: 25 June 2020 / Accepted: 28 June 2020 / Published: 2 July 2020

Abstract: Metaheuristics are smart problem solvers devoted to tackling particularly large optimization problems. During the last 20 years, they have been widely used to solve problems from both academia and the real world. However, most of them were originally designed to operate over real-domain variables, so their internal core must be tailored, for instance, to be effective in a binary space of solutions. Various works have demonstrated that this internal modification, known as binarization, is not a simple task, since the several existing binarization schemes may lead to very different results. This, of course, forces the user to implement and analyze a long list of binarization schemes to reach good results. In this paper, we explore two efficient clustering methods, namely KMeans and DBscan, to alter a metaheuristic in order to improve it, removing the need for expert knowledge to identify which binarization strategy works best during the run. Both techniques have been widely applied to solve clustering problems, allowing us to exploit useful information gathered during the search to efficiently control and improve the binarization process. We integrate these techniques into a recent metaheuristic called Crow Search, and we conduct experiments in which KMeans and DBscan are contrasted with 32 different binarization methods. The results show that the proposed approaches outperform most of the binarization strategies on a large list of well-known optimization instances.

1. Introduction

Optimization problems can be seen in different areas of the modern world, such as bridge reinforcement [1], load dispatch [2], location of emergency facilities [3], marketing [4], and social networks [5], among others. To solve them, we must identify to which type of model the problem belongs. There are three types of models: the first has discrete variables, the second has continuous variables, and the third has both. Discrete optimization has been widely used to define a large number of problems belonging to the NP-hard class [6]. It is necessary to emphasize that the time to solve this type of problem increases exponentially with its size. For this reason, we consider it prudent to solve these problems through metaheuristics [7,8], which deliver acceptable results in a limited period of time.
There are many metaheuristics, and a complete collection of nature-inspired algorithms can be seen in [9], but most of them work in continuous space. Some researchers have been working in depth on developing binary versions that make these metaheuristics operate in binary spaces. A study of putting continuous metaheuristics to work in binary search spaces [10] clearly describes the two-step technique. These techniques require a large number of combinations and much parameter tuning to binarize and thus obtain the best result of all, which is a great waste of time.
Despite successful published work, binary techniques require offline analysis and an expert user in the field to adjust the metaheuristic to the problem domain, in addition to working with the various available combinations of the two-step technique. Building on this expertise, our work focuses on two clustering-based binarization methods that smartly transform a set of real solutions into binary solutions by grouping decision variables, without the need for combinations as in the two-step technique, saving time and algorithm manipulation. The first one uses the KMeans [11,12] clustering algorithm, the most common unsupervised machine learning algorithm, to assign each variable to one of a number of groups defined by the user; a transition value must be assigned to each cluster in order to binarize each variable. The second one integrates DBscan [13,14], a powerful clustering method that arranges values into a number of clusters that is not determined a priori. Unlike KMeans, this technique creates the clusters by itself, needing only the minimum number of variables required to form a cluster and the distance between each point in the data set. Furthermore, the transition value used to binarize is calculated online and depends on each cluster. To test our approaches, we include these clustering methods in the Crow Search Algorithm (CSA) [15], a population-based metaheuristic inspired by the behavior of crows (easy to implement, very fast, and with few parameter settings). In this method, each individual follows others, trying to discover the places where they hide their food in order to steal it.
Based on [16,17], which were used as gold standards, we perform an experimental design led by a quantitative methodology. We implement different criteria such as quality of solutions and robustness across instances, solving large-scale, well-known binary optimization problems: the Set Covering Problem (SCP) [18] and the 0/1 Knapsack Problem (KP) [19]. These two problems are very popular benchmarks for trying new strategies because they have known results with which to compare.
The remainder of this manuscript is organized as follows: Section 2 presents the related work. Section 3 exposes binarization strategies. CSA is described in Section 4. Section 5 explains how to integrate binarizations on CSA. Next, experimental results including a research methodology are explained in Section 6. Discussions are shown in Section 7. At the end of the manuscript, conclusions are presented in Section 8.

2. Related Work

As previously mentioned, binarization is mandatory for continuous metaheuristics when the problem ranges over a binary domain. This task is not simple, since experimentation with different binarization schemes is needed to reach good results. The transfer function is the most used binarization method, and it was introduced in [20]. The transfer function is a very cheap operator; its range provides probability values and tries to model the transition of the particle positions. This function is responsible for the first step of the binarization method, which maps solutions in $\mathbb{R}^n$ to solutions in $[0,1]^n$. Two types of functions have been used in the literature, the S-shaped [21] and the V-shaped [22]. The second step is to apply a binarization rule to the result of the transfer function. Examples of binarization rules are the complement, roulette, static probability, and elitist rules [22]. "Two-step" binarization refers to the use of a transfer function and a binarization rule together. A detailed description of binarization techniques can be seen in [10].
Our goal here is to alleviate the user involvement in binarization testing and to provide a single alternative that works better on average when contrasted with a large set of "Two-step" binarization techniques. In this context, machine learning techniques appear as good candidates to support the binarization process and, as a consequence, to obtain better search processes. Machine learning has already been used to improve search in metaheuristics [23]. For instance, a first trend is to employ machine learning for parameter tuning, as in [24], where the authors use machine learning for parameter control in metaheuristics. The techniques reported in this context can be organized into four groups: parameter control strategies [25], parameter tuning strategies [26], instance-specific parameter tuning strategies [27], and reactive search [28]. The general idea is to exploit search information in order to find good parameter configurations, which may lead to better quality solutions. An example of this is [29], where the author uses a metaheuristic to determine the optimal number of hidden nodes and their proper initial locations in neural networks. Machine learning has also been used to build approximation models when the objective function and constraints are computationally expensive [30]. Population management can also be improved via machine learning algorithms, obtaining information from already visited solutions and using it to build new ones [31].
Machine learning has also been used to reduce the search space, for instance by using clustering [32] and neural networks [33]. In these works, two main areas of machine learning were used: forecasting and classification. It is important to mention the uses of machine learning with metaheuristics in these areas. It has been used to forecast different kinds of problems, such as optimizing the low-carbon flexible job shop scheduling problem [34] and electric load [35]; the authors of [36] use it to forecast economic recessions in Italy, and other industrial problems are addressed in [37]. As we mentioned, machine learning can also be used to classify different groups of data sets; in this scenario, the technique has been applied to problems such as image classification [38], electronic nose classification of bacterial foodborne pathogens [39], and urban management [40], to mention some recent studies.
Although the participation of machine learning in the metaheuristic field is large, its specific use in binarization is indeed very limited. To the best of our knowledge, only a few experiments with the clustering techniques DBscan and KMeans have been reported [13,41]. We therefore believe that binarization still has a lot of room for exploration: beyond improving the metaheuristic, using just one machine learning technique for binarization, compared to the 32 combinations of the two-step technique, gives the user simplicity, speed, and less effort, which is the aim pursued in this work.

3. Binarization

A study of putting continuous metaheuristics to work in binary search spaces [10] exposes two main groups of binarization techniques. The first group, called two-step binarization, allows one to put the continuous metaheuristic to work without operator modifications; the second group, called continuous-binary operator transformation, redefines the algebra of the search space, thereby reformulating the operators.
In this work, we use the first group for our experiments. The two steps refer to transfer functions, which are responsible for mapping solutions in the real domain to a binary domain. Then we use a machine learning clustering strategy called KMeans to cluster the columns of a solution, and another strategy called DBscan that clusters all the solutions (the matrix). Both strategies are used to map solutions between 0 and 1.

3.1. Two Step Binarization

Two-step binarization consists of applying, as its name says, two stages in order to bring the variables of the real search space to the world of binary variables. The first step uses eight transfer functions and the second step uses four discretization functions, therefore generating 32 possible strategies to apply.

3.1.1. First Step—Transfer Functions

These operators are very cheap to implement in the algorithm; their range provides probability values and attempts to model the transition of the particle positions. In other words, these functions transform a real solution belonging to the real domain into another real solution restricted to the domain between 0 and 1. For that, the two classic types in the literature [42,43], called S-Shape and V-Shape, are used.
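As an illustration, minimal Python sketches of the two families can be written as follows; the specific formulas below are the commonly cited basic S-shaped (sigmoid) and V-shaped (|tanh|) variants, an assumption rather than the paper's exact set:

```python
import math

def s_shape(x):
    """S-shaped transfer function: the sigmoid 1 / (1 + e^-x)."""
    return 1.0 / (1.0 + math.exp(-x))

def v_shape(x):
    """V-shaped transfer function: |tanh(x)|."""
    return abs(math.tanh(x))

# Both map any real value into [0, 1], interpreted as a flip probability.
```

Either function completes the first step: a continuous decision variable becomes a probability that the second step turns into a bit.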

3.1.2. Second Step—Binarization

A set of rules can be applied to transform a solution in the real domain into a solution in the binary domain. We use the Standard, Complement, Static probability, and Elitist rules. In particle swarm optimization, this approach was first used in [20].
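A hedged sketch of these four rules, assuming the usual formulations from the literature (the exact variants used in the paper may differ); here p is the probability delivered by the transfer function, x_old the variable's current binary value, and x_best its value in the best-known solution:

```python
import random

def standard(p, x_old, x_best):
    """Standard rule: set the bit with probability p."""
    return 1 if random.random() <= p else 0

def complement(p, x_old, x_best):
    """Complement rule: flip the current bit with probability p."""
    return 1 - x_old if random.random() <= p else 0

def static_probability(p, x_old, x_best, alpha=1 / 3):
    """Static probability rule; alpha is an assumed threshold parameter."""
    if p <= alpha:
        return 0
    if p <= 0.5 * (1 + alpha):
        return x_old
    return 1

def elitist(p, x_old, x_best):
    """Elitist rule: copy the bit from the best solution with probability p."""
    return x_best if random.random() < p else 0
```

Pairing each of the eight transfer functions with each of these four rules is what yields the 32 two-step combinations discussed later.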

3.2. KMeans

KMeans is one of the simplest and most popular unsupervised machine learning algorithms for clustering. The main goal of KMeans [44,45] is to group similar data points according to certain similarities. To achieve this objective, the number of clusters in the data set must be set. Then, every data point is allocated to one of the clusters; the arithmetic representation is as follows, see Equation (1):
$$J = \sum_{j=1}^{k} \sum_{i=1}^{n} \left\| x_i^j - c_l \right\|^2$$
where k represents the number of clusters, n depicts the number of cases, $x_i^j$ defines the jth case of the ith solution, and $c_l$ is the lth centroid (see pseudo-code in Algorithm 1).
Algorithm 1: KMeans pseudo-code
The mechanism initially requires the value k as the number of clusters and the best solution vector of dimension d composed by the decision variables $\hat{X} = (x_1^1, \ldots, x_j^1, \ldots, x_k^1, \ldots, x_l^1, \ldots, x_d^1)$. This vector is passed as input data to be grouped (see the left side of Figure 1). Iteratively, KMeans assigns each decision variable to one group based on the similarity of their features, allocating it to the closest centroid by minimizing Euclidean distances. Afterwards, KMeans recalculates the centroids by taking the average of all decision variables allocated to each centroid's cluster, thus decreasing the complete intracluster variance with respect to the previous phase.
The right side of Figure 1 illustrates the computed groups. Positions are adjusted in each iteration of the process until the algorithm converges.
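The grouping step can be sketched with scikit-learn; the solution vector below is hypothetical, and the reordering of labels by centroid size is an assumption matching the transition-value scheme described later:

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical continuous solution vector with d = 6 decision variables.
solution = np.array([0.1, 0.15, 2.3, 2.1, 5.0, 4.8])
k = 3  # number of clusters chosen by the user

km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(solution.reshape(-1, 1))
order = np.argsort(km.cluster_centers_.ravel())           # rank centroids ascending
rank = {old: new for new, old in enumerate(order)}
groups = np.array([rank[label] for label in km.labels_])  # group index per variable
print(groups)  # clusters ranked by centroid: [0 0 1 1 2 2]
```

Each decision variable thus receives a group index ordered by centroid magnitude, which the binarization step can map to a transition value.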

3.3. DBscan

Density-Based Spatial Clustering of Applications with Noise (DBscan) [46] is a well-known data clustering algorithm commonly used in data mining and machine learning. It can be used to identify clusters of any shape in a data set containing noise and outliers. Clusters are dense data space areas separated by areas with reduced point density.
The goal is to define dense areas that can be measured by the number of nearby points. It is indispensable to know that DBscan requires two important parameters to work.
  • Epsilon ( ϵ ): Sets how close points should be to each other to be regarded as part of a cluster.
  • Minimum points (MinPts): The minimum number of points to form a dense region.
The main idea is that there must be at least a minimum number of points in the neighborhood of a given radius for each group. The  ϵ parameter defines the neighborhood radius around a point x: any point in the data set with a number of neighbors greater than or equal to MinPts is marked as a core point, and x is a border point if the number of its neighbors is less than MinPts. Finally, if a point is neither a core point nor a border point, it is called a noise point or outlier.
In summary, this technique helps when not much is known about the data; see pseudo-code in Algorithm 2.
The right side of Figure 2 illustrates the computed groups. Positions are adjusted in each iteration of the process until the algorithm converges.
Algorithm 2: DBscan pseudo-code
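A minimal scikit-learn sketch of DBscan on hypothetical 1-D data, showing how ϵ (eps) and MinPts (min_samples) shape the clusters and how noise points are labelled:

```python
import numpy as np
from sklearn.cluster import DBSCAN

# Hypothetical data: two dense groups and one isolated point.
data = np.array([1.0, 1.1, 1.2, 5.0, 5.1, 5.05, 9.7]).reshape(-1, 1)

# eps is the neighborhood radius, min_samples is MinPts.
db = DBSCAN(eps=0.3, min_samples=2).fit(data)
print(db.labels_)  # two clusters plus one noise point labelled -1
```

Note that, unlike KMeans, the number of clusters (here two, plus the noise label -1) emerges from the data rather than being set in advance.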

4. Crow Search Algorithm

Crows (the crow family, or corvids) are considered the most intelligent birds. They have the largest brain relative to their body size. Based on the brain-to-body ratio, their brain is only slightly below a human brain. Evidence of the cleverness of crows is plentiful: they can remember faces and warn each other when an unfriendly one approaches. Moreover, they can use tools, communicate in sophisticated ways, and recall their food hiding places up to several months later.
CSA tries to simulate the intelligent behavior of crows to find the solution of optimization problems [15]. From an optimization point of view, the crows are searchers, the environment is the search space, each position of a crow corresponds to a feasible solution, the quality of a food source is the objective function, and the best food source in the environment is the global solution of the problem. The parameters of CSA are shown below.
  • N: Population.
  • AP: Awareness probability.
  • f l : Flight length.
  • i t e r : Maximum number of iterations.
When CSA is working, two things can happen. Let Crow 1 and Crow 2 be different feasible solutions belonging to the search space. Firstly, Crow 1 does not know that Crow 2 is following it; as a result, Crow 2 will approach the hiding place of Crow 1. This action is called State 1. On the other hand, if Crow 1 knows that Crow 2 is following it, then, to protect its hiding place from being stolen, Crow 1 will fool Crow 2 by going to another position of the search space. This action is called State 2. These states are represented by Equation (2).
$$x_{i,j}^{iter+1} = \begin{cases} x_{i,j}^{iter} + r_1 \cdot fl \cdot \left( m_{i,j}^{iter} - x_{i,j}^{iter} \right) & \text{if } r_2 \geq AP \\ \text{a random position} & \text{otherwise} \end{cases}$$
where:
  • r 1 and r 2 are random numbers with uniform distribution between 0 and 1.
  • f l denotes the flight length of crows.
  • A P denotes the awareness probability of crows (intensification and diversification).
Below, pseudo-code of CSA is shown in Algorithm 3.
Algorithm 3: CSA pseudo-code
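A minimal continuous sketch of the CSA update in Equation (2); the search-space bounds, parameter values, and random choice of the crow to follow are illustrative assumptions:

```python
import numpy as np

def csa_step(X, M, fl, AP, rng):
    """One CSA iteration: X = crow positions, M = memories (best food places)."""
    N, d = X.shape
    new_X = np.empty_like(X)
    for i in range(N):
        j = rng.integers(N)                   # crow i follows a random crow j
        if rng.random() >= AP:                # State 1: approach crow j's memory
            new_X[i] = X[i] + rng.random() * fl * (M[j] - X[i])
        else:                                 # State 2: the follower is fooled
            new_X[i] = rng.uniform(-5, 5, d)  # assumed search-space bounds
    return new_X

rng = np.random.default_rng(42)
X = rng.uniform(-5, 5, (10, 3))   # 10 crows, 3 decision variables
X = csa_step(X, X.copy(), fl=2.0, AP=0.1, rng=rng)
```

After each step, a full implementation would evaluate the objective function and update each crow's memory when its new position improves on it.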

5. Integration: Binarizations on CSA

As shown in Section 3, for CSA [15] to work in a binary search space, one must apply binarization techniques. Firstly, we solve the problem using the Two-steps technique; then we apply the KMeans technique.
To have a high level explanation, the steps are described below:
When CSA evaluates a solution x i , a real value is generated; then:
  • Two-steps: x i at iteration t + 1 is binarized column by column: the first step yields a continuous solution restricted to [0, 1] that enters as input to the second step, which performs the binarization; the resulting solution is evaluated in the objective function.
  • KMeans: In this case, x i at iteration t + 1 is clustered. Each cluster is then assigned a cluster transition value, already defined as a parameter: clusters with small centroids get small transition values, and clusters with high centroid values get high transition values. Every point in the clusters is evaluated by asking whether a random value between 0 and 1 is equal to or greater than the cluster transition value; if so, we take the position x i , j and apply the complement to x i at iteration t. Then the objective function is evaluated.
For these two techniques we take the same strategy, which is illustrated in Algorithm 4.
Algorithm 4: CSA + Two-steps and Kmeans binarization techniques pseudo-code
A graphic representation can be seen in Figure 3. Applying the first step, a continuous solution from the metaheuristic, belonging to the real domain, is transformed into a solution still in a real domain but restricted to values between 0 and 1. Then, applying step 2, the solution is transformed into a solution in a binary domain. Figure 4 shows how the real solution is clustered and then transformed into a binary solution by applying the cluster transition values of KMeans.
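The KMeans transition step described above can be sketched as follows; the transition values, the seeded KMeans call, and the exact form of the complement rule are assumptions based on the description:

```python
import numpy as np
from sklearn.cluster import KMeans

def kmeans_binarize(x_real, x_bin, trans_values, rng):
    """Cluster a continuous solution and flip bits of its binary counterpart.

    trans_values: one transition value per cluster, smallest centroid first
    (illustrative parameters, not the paper's tuned values).
    """
    k = len(trans_values)
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(x_real.reshape(-1, 1))
    order = np.argsort(km.cluster_centers_.ravel())    # rank clusters by centroid
    rank = {old: new for new, old in enumerate(order)}
    out = x_bin.copy()
    for j, label in enumerate(km.labels_):
        p = trans_values[rank[label]]                  # cluster transition value
        if rng.random() >= p:
            out[j] = 1 - out[j]                        # apply the complement
    return out

rng = np.random.default_rng(0)
x_real = rng.uniform(-2, 2, 12)          # hypothetical continuous solution
x_bin = rng.integers(0, 2, 12)           # its current binary counterpart
b = kmeans_binarize(x_real, x_bin, trans_values=[0.1, 0.5, 0.9], rng=rng)
```

The resulting binary vector is then evaluated in the objective function, as Algorithm 4 describes.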
The third technique used is DBscan. This strategy is slightly different from the previous ones because we binarize the complete matrix instead of the solution vector. For that, we create a replica of the matrix in its initial state ( x ); the metaheuristic then works normally, performing its movements in the real search space. Once each iteration is complete, the entire matrix is clustered. As we have seen before, DBscan receives a distance ( ϵ ) and a minimum number of points (MinPts) as input parameters; with these data, the algorithm delivers a number of clusters that depends only on the data behavior. The following steps are performed after the clustering has been completed:
  • Calculate the mean of the points to every cluster.
  • Sort the clusters from least to greatest taking into account the average value of each cluster.
  • Evaluate each point in the clusters by checking whether a random value between 0 and 1 is equal to or greater than the average of its cluster; if it is, we take the position x i , j and apply the complement to the replica of the matrix ( x ). Then the objective function is evaluated.
For this technique, we take the strategy illustrated in Algorithm 5. Finally, Figure 5 shows how the matrix x is clustered and then transformed into a binary matrix applying DBscan.
Algorithm 5: CSA + DBscan binarization technique pseudo-code
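The DBscan steps above can be sketched as follows; eps, MinPts, and the data are illustrative, and the flip rule follows the description of the complement applied to the replica:

```python
import numpy as np
from sklearn.cluster import DBSCAN

def dbscan_binarize(X_real, X_bin, eps, min_pts, rng):
    """Cluster all entries of the population matrix and flip bits in the replica."""
    flat = X_real.reshape(-1, 1)
    labels = DBSCAN(eps=eps, min_samples=min_pts).fit(flat).labels_
    # mean of the points of every cluster (noise points, label -1, included)
    means = {l: flat[labels == l].mean() for l in set(labels)}
    out = X_bin.copy()
    for idx, label in enumerate(labels):
        i, j = divmod(idx, X_real.shape[1])
        if rng.random() >= means[label]:
            out[i, j] = 1 - out[i, j]       # complement on the binary replica
    return out

rng = np.random.default_rng(1)
X_real = rng.uniform(0, 1, (5, 8))   # hypothetical population matrix
X_bin = rng.integers(0, 2, (5, 8))   # its binary replica
B = dbscan_binarize(X_real, X_bin, eps=0.05, min_pts=3, rng=rng)
```

Because the whole matrix is clustered at once, this step is noticeably heavier than the per-vector KMeans scheme, which matches the runtime behavior reported in the experiments.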

6. Experimental Results

We propose to test our approach by solving the instances of the SCP [47] and the 0/1 Knapsack Problem [48].

6.1. Methodology

A performance analysis is necessary to properly evaluate metaheuristics [49]. In this study, we use the well-known benchmarks taken from the OR-Library [50] in order to compare the best solution given by CSA against the best known result of the benchmark. Figure 6 illustrates the steps for an exhaustive analysis of the improved metaheuristic. For the experimental design, we define aims and guidelines to demonstrate that the proposed approach is a real alternative to binarization systems. Next, we establish the best value as the critical metric for evaluating the results. In this context, we apply an ordinal analysis and employ statistical tests to determine which approach is significantly better. Finally, we describe the hardware and software features needed to reproduce the computational experiments, and then report all results in tables and distribution charts.
Moreover, it is very important to show how significant the differences among the three strategies are for this type of problem. For this reason, we perform a contrast statistical test for each instance: the Kolmogorov-Smirnov-Lilliefors test to determine the independence of the samples [51] and the Mann–Whitney–Wilcoxon test [52] to compare the results statistically (see Figure 7).
The Kolmogorov-Smirnov-Lilliefors test allows us to analyze the independence of the samples by examining the $Z_{min}$ or $Z_{max}$ values (depending on whether the problem is minimization or maximization) obtained from the 31 executions of each instance.
The results are evaluated using the relative percentage deviation (RPD). The RPD value quantifies the deviation of the objective value $Z_{min}$ from $Z_{opt}$, which is the minimal best-known value for each instance in our experiment, and it is calculated as follows:
$$RPD = \frac{Z_{min} - Z_{opt}}{Z_{opt}}$$
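A one-line helper makes the metric concrete; the value is a fraction, conventionally multiplied by 100 to express a percentage (the example values are hypothetical):

```python
def rpd(z_min, z_opt):
    """Relative percentage deviation (Equation (3)) as a fraction."""
    return (z_min - z_opt) / z_opt

# e.g., a reached value of 515 against a best-known value of 512:
print(rpd(515, 512) * 100)  # ~0.586%
```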

6.2. Set Covering Problem (SCP)

The Set Covering Problem [47] is a classic combinatorial optimization problem, belonging to the NP-hard class [53], that has been applied in several real-world problems like the location of emergency facilities [3], airline and bus crew scheduling [54], vehicle scheduling [55], steel production [56], among others.
Formally, the SCP is defined as follows: let $A = (a_{ij})$ be a binary matrix with M rows ($i \in I = \{1, \ldots, M\}$) and N columns ($j \in J = \{1, \ldots, N\}$), and let $C = (c_j)$ be a vector representing the cost of each column j, assuming that $c_j > 0, \forall j \in J$. Then, a column j covers a row i if $a_{ij} = 1$. Therefore:
$$a_{ij} = \begin{cases} 1 & \text{if row } i \text{ can be covered by column } j \\ 0 & \text{otherwise} \end{cases}$$
The SCP consists of finding a set of elements that cover a range of needs at the lowest cost. In its matrix form, a feasible solution corresponds to a subset of columns and the needs are associated with the rows and treated as constraints. The problem aims at selecting the columns that optimally cover all the rows.
The Set Covering Problem finds a minimum cost subset S of columns such that each row is covered by at least one column of S. An integer programming formulation of the SCP is as follows:
$$\begin{aligned} \text{minimize} \quad & \sum_{j=1}^{n} c_j x_j \\ \text{subject to:} \quad & \sum_{j=1}^{n} a_{ij} x_j \geq 1 \quad \forall i \in I \\ & x_j \in \{0,1\} \quad \forall j \in J \end{aligned}$$
Instances: In order to evaluate the algorithm performance when solving the SCP, we use 65 instances taken from the Beasley’s OR-library [50], which are organized in 11 sets.
Table 1 describes, for each instance group, the number of rows M, the number of columns N, the range of costs, and the density (percentage of nonzeroes in the matrix).
Reducing the instance size of the SCP: In [57], different preprocessing methods have been proposed to reduce the size of the SCP; two of them, Column Domination and Column Inclusion, are taken as the most effective ones. These methods are used to accelerate the processing of the algorithm.
Column Domination: It consists of eliminating the redundant columns of the problem in such a way that it does not affect the final solution.
Steps:
  • All the columns are ordered according to their cost in ascending order.
  • If there are equal cost columns, these are sorted in descending order by the number of rows that the column j covers.
  • Verify whether the rows of column j can be covered by a set of other columns with a combined cost less than c j (the cost of column j).
  • If so, column j is said to be dominated and can be eliminated from the problem.
Column Inclusion: If a row is covered only by one column after the domination process, it means that there is no better column to cover those rows; consequently, this column must be included in the optimal solution.
This reduction will be used as input data of instances.
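The Column Domination steps above can be sketched as follows; this is a simple illustrative check over toy data, not the implementation of [57]:

```python
def column_domination(columns, costs):
    """columns: list of row-index sets; costs: parallel list of column costs.

    Returns the indices of the columns that survive the domination check.
    """
    # Steps 1-2: sort by ascending cost, breaking ties by descending coverage.
    order = sorted(range(len(columns)), key=lambda j: (costs[j], -len(columns[j])))
    kept = []
    for j in order:
        rows = set()                      # rows coverable by cheaper kept columns
        for k in kept:
            if costs[k] < costs[j]:
                rows |= columns[k]
        if columns[j] <= rows:            # Steps 3-4: j is dominated, eliminate it
            continue
        kept.append(j)
    return sorted(kept)

cols = [{0, 1}, {1}, {0}, {0, 1, 2}]      # hypothetical coverage sets
cost = [3, 1, 1, 5]
print(column_domination(cols, cost))      # column 0 is dominated by columns 1 and 2
```

Column Inclusion would then scan the reduced matrix for rows covered by a single remaining column and fix that column into the solution.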

6.3. Knapsack (KP)

The 0–1 Knapsack Problem is still a current topic [58]; it is a typical NP-hard problem in operations research. Many studies use this problem to test their algorithms, such as [59,60]. The problem may be defined as follows: given a set of N objects, each object has a certain weight and value. The number of items to include in the collection must be determined so that the total weight is less than or equal to the limit while the total value is as large as possible.
$$x_j = \begin{cases} 1 & \text{if item } j \text{ is included} \\ 0 & \text{otherwise} \end{cases} \quad j \in J = \{1, 2, \ldots, N\}$$
The problem is to optimize the objective function:
$$\max \sum_{j=1}^{n} p_j x_j$$
Subject to:
$$\sum_{j=1}^{n} w_j x_j \leq b, \qquad x_j \in \{0,1\}, \; j \in J$$
Binary decision variables $x_j$ indicate whether item j is included in the knapsack or not. It can be assumed that all profits and weights are positive and that all weights are smaller than the capacity b.
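To make the formulation concrete, small instances of this model can be solved exactly with the classic dynamic program (an illustration only; the paper solves the KP with the binarized CSA, not with this method):

```python
def knapsack(profits, weights, b):
    """Maximize sum p_j x_j subject to sum w_j x_j <= b, x_j in {0, 1}."""
    dp = [0] * (b + 1)                     # dp[cap] = best profit with capacity cap
    for p, w in zip(profits, weights):
        for cap in range(b, w - 1, -1):    # reverse order keeps each item 0/1
            dp[cap] = max(dp[cap], dp[cap - w] + p)
    return dp[b]

# Hypothetical instance: 3 items, capacity 50.
print(knapsack([60, 100, 120], [10, 20, 30], 50))  # 220
```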
Instances: In order to evaluate the algorithm performance of solving the KP, we used 10 instances [61].
Table 2 describes instance, dimension, and parameters: weight w, profit p, and capacity of the bag b.
As we can see in the instance Table 2, distinct sorts of problems are solved, each with specific sizes and values to get a wide concept of the algorithm’s behavior.
Software and Hardware: CSA was implemented in Java 1.8. The characteristics of the personal computer (PC) were: Windows 10, Intel Core i7-6700 CPU (3.4 GHz), and 16 GB of RAM.
Parameter settings: Below, Table 3 shows the setup for Two-steps strategy, Table 4 shows the setup for KMeans strategy, and finally Table 5 shows setup for DBscan strategy.
Sixty-five instances of the SCP and ten instances of the 0/1 Knapsack Problem have been considered, which were executed 31 times each.

6.4. Comparison Results and Strategies

In [62] and in the Appendix (Appendix A.1 and Appendix A.2), we can see the obtained results; the information is detailed, also indicating each of the transfer functions and binarization methods used. The data is grouped as follows: instance: the name of the instance; MIN: the minimum reached value; MAX: the maximum reached value; AVG: the average value; BKS: the best known solution; RPD: defined by Equation (3); and finally TIME: execution time in seconds.
As we mentioned, the first strategy implemented was Two-steps, the second strategy implemented was KMeans, and the last strategy implemented was DBscan.

Set Covering Problem Results

Comparison strategy: Table 6 shows a summary of the instances grouped by their complexity; we define three groups: G1 (easy), G2 (medium), and G3 (hard). The first group includes instances 4.1 to 6.5, the second group includes instances a.1 to d.5, and the third group includes instances nre.1 to nrh.5.
For more information on the comparison between strategies, see [62]; it makes clear how complex it is to develop all the combinations of Two-steps in order to get good results versus the clustering techniques.
Summary: Next, we rank the strategies in order of BKS obtained. In addition, the instances that obtained BKS in the 3 strategies are shown.
  • Score
    • Two-steps got 51/65 BKS.
    • DBscan got 25/65 BKS.
    • KMeans got 16/65 BKS.
  • The three strategies got BKS in the same instances: 4.1-4.3-4.6-4.7-5.1-5.4-5.5-5.6-5.7-5.8-5.9-6.2-6.4-a.5
Instance distribution: Now that we have all the results of all the strategies that solved the SCP, we compare the distribution of the samples of each instance through a box plot that shows the full distribution of the data. To summarize all the instances, we present and describe the hardest instances of each group (4.10, 5.10, 6.5, a.5, b.5, c.5, d.5, nre.5, nrf.5, nrg.5 and nrh.5):
We can see that Figure 8 and Figure 9 are alike. The Two-steps and DBscan strategies obtained the BKS but KMeans did not; however, the distribution is quite homogeneous. If we look at instance 4.10, illustrated in Figure 8, DBscan's behavior is remarkable.
Figure 10 shows again that the Two-steps and DBscan strategies obtained the BKS while the KMeans strategy did not. It can also be appreciated that DBscan is far more compact than Two-steps, which indicates that the clustering processing is accurate in this instance. Then, Figure 11 shows for the first time a very solidly compact distribution; it is necessary to emphasize that the three strategies obtained the BKS. Finally, Figure 12 shows how the Two-steps and KMeans strategies obtained the BKS while DBscan did not, although it was very close; moreover, the strategies most compact in their distributions were KMeans and DBscan.
If we continue analyzing the samples, we can see that Figure 13, Figure 14, Figure 15, Figure 16 and Figure 17 are very similar: the Two-steps strategies and KMeans far exceeded DBscan. This indicates that DBscan works better when solving small instances with this metaheuristic, perhaps due to the large number of calculations made by grouping the complete set of solutions, versus KMeans, which groups a single solution, and Two-steps, which performs the binarization instantly on each column of the solution vector.
Finally, Figure 18 shows a null contribution from DBscan, which could not complete the executions due to the excessive delay of the strategy when clustering the most difficult instances; we can also observe KMeans absolutely trapped in a local optimum. However, the Two-steps approach continues to show good results, being the most robust at the BKS level, as mentioned earlier in the strategy comparison.

6.5. Knapsack Problem Results

Unlike the previous problem (SCP), the Knapsack problem has fewer instances, so it was not necessary to group by complexity.
Comparison strategy: Table 7 shows a summary of the instances.
For more information on the comparison between strategies, see [62].
Summary: Next, we rank the strategies by the number of BKS obtained, and we list the instances in which all three strategies reached the BKS.
  • Score
    - Two-steps got 6/10 BKS.
    - DBscan got 5/10 BKS.
    - KMeans got 4/10 BKS.
  • The three strategies got BKS on the same instances: f3, f4, f7, f9.
Instance distribution: With all the results for the strategies on KP in hand, we compare the sample distributions of each instance through box plots that show the complete spread of the data. Instances f3, f4, f7, and f9 reached the BKS in every execution with all three strategies, so there is no need to graph them; below we present and describe only the instances where the optimum was not always reached.
Figure 19, Figure 20, Figure 21, Figure 22, Figure 23 and Figure 24 show that the Two-steps strategy achieved better results, while the DBscan strategy is far more compact than the KMeans strategy. Figure 24 shows the only instance where DBscan did not outperform KMeans.
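The five-number summary underlying such box plots can be sketched as follows; this is a minimal illustration with hypothetical run values, not the paper’s data, and a "more compact" strategy is read off as a smaller interquartile range.

```python
import numpy as np

def box_summary(samples):
    """Five-number summary underlying a box plot: min, Q1, median, Q3, max."""
    s = np.sort(np.asarray(samples, dtype=float))
    return {
        "min": float(s[0]),
        "q1": float(np.percentile(s, 25)),
        "median": float(np.percentile(s, 50)),
        "q3": float(np.percentile(s, 75)),
        "max": float(s[-1]),
    }

# Hypothetical objective values from 31 runs of two strategies on one instance.
rng = np.random.default_rng(0)
two_steps = 430 + rng.integers(0, 15, size=31)
dbscan = 430 + rng.integers(0, 6, size=31)  # tighter spread of values

s1, s2 = box_summary(two_steps), box_summary(dbscan)
print("Two-steps IQR:", s1["q3"] - s1["q1"], "DBscan IQR:", s2["q3"] - s2["q1"])
```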

7. Statistical Test

As we mentioned, in order to determine whether the samples follow a normal distribution, we propose the following hypotheses:
- H0: Zmin/Zmax follows a normal distribution.
- H1: states the opposite.
The test performed yielded a p-value lower than 0.05; therefore, H0 cannot be assumed. Now that we know that the samples are independent and cannot be assumed to follow a normal distribution, it is not feasible to use the central limit theorem. Therefore, to evaluate the heterogeneity of the samples we use a nonparametric test, the Mann–Whitney–Wilcoxon test. To compare all the results on the hardest instances, we propose the following hypotheses:
- H0: Two-steps is better than KMeans.
- H1: states the opposite.
- H2: Two-steps is better than DBscan.
- H3: states the opposite.
- H4: KMeans is better than DBscan.
- H5: states the opposite.
Finally, the statistical contrast test demonstrates which technique is significantly better.
Table 8, Table 9, Table 10, Table 11, Table 12, Table 13, Table 14, Table 15, Table 16, Table 17, Table 18, Table 19, Table 20, Table 21, Table 22, Table 23 and Table 24 compare the SCP and 0/1 KP binarization strategies on the hardest instances tested via Wilcoxon’s signed rank test. As the significance level is also set to 0.05, values smaller than 0.05 indicate that H0, H2, and H4 cannot be assumed.
To conduct the test runs supporting this study, we employ a method belonging to the PISA system. We feed it all the data distributions (one file per distribution, one value per line), and the algorithm returns a p-value for each hypothesis.
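As an illustration of how such a check can be run (this uses SciPy’s `mannwhitneyu` rather than the authors’ PISA-based tool, and the sample values are hypothetical):

```python
from scipy import stats

def compare_strategies(sample_a, sample_b, alpha=0.05):
    """One-sided Mann-Whitney-Wilcoxon test for minimization: does
    strategy A yield significantly smaller objective values than B?"""
    _, p = stats.mannwhitneyu(sample_a, sample_b, alternative="less")
    return p, p < alpha

# Hypothetical objective values from independent runs of two strategies.
two_steps = [429, 430, 430, 431, 432, 430, 429, 431, 430, 432]
kmeans    = [440, 442, 441, 443, 440, 444, 441, 442, 443, 440]

p_value, significant = compare_strategies(two_steps, kmeans)
print(f"p-value = {p_value:.5f}; Two-steps significantly better: {significant}")
```

A p-value below 0.05 rejects the null hypothesis that the first strategy is not better, matching how the tables below are read.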
The following tables show the results of the Mann–Whitney–Wilcoxon test. To read them, the following acronyms are needed:
  • SWS = Statistically without significance.
  • D/A = Does not apply.
SCP—p-value
KP—p-value
Reported p-values below 0.05 indicate significance; as noted above, SWS marks results without statistical significance, and D/A marks comparisons that cannot be made because one of the strategies contains no data. With this information, we can clearly see which of the three strategies was better on each reported instance. In summary, for the SCP, the p-values lower than 0.05 were 10 for Two-steps, 10 for KMeans, and 5 for DBscan; for the 0/1 KP, they were 0 for Two-steps, 10 for KMeans, and 5 for DBscan.

8. Conclusions

After implementing and analyzing the clustering-based binarization strategies, we can see that DBscan achieved better results than KMeans at the BKS level. However, DBscan’s execution times were longer than KMeans’s, which has a clear explanation: the way clustering is applied in the experiment is crucial, since KMeans clusters a single vector while DBscan clusters the complete matrix, which makes the latter technique take longer. Moreover, we do not report solving times because this measure depends primarily on the characteristics of the machine used. Our work focuses on enhancing the binarization process for metaheuristics in order to find better results; in doing so we have certainly sacrificed solving time, but we believe DBscan is very promising because, even though clustering the complete matrix takes longer to execute, it still obtained good results.
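The difference between the two schemes can be sketched with scikit-learn; the population shape, `eps`, and cluster counts below are illustrative assumptions, not the exact configuration used in the paper.

```python
import numpy as np
from sklearn.cluster import DBSCAN, KMeans

rng = np.random.default_rng(1)
population = rng.random((40, 100))  # 40 candidate solutions x 100 variables

# KMeans scheme: cluster one solution vector at a time, i.e., its 100
# component values form a small one-feature data set per call.
one_solution = population[0].reshape(-1, 1)
km_labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(one_solution)

# DBscan scheme: cluster the complete population matrix in a single call,
# which exploits more information per call but costs far more time.
db_labels = DBSCAN(eps=2.0, min_samples=3).fit_predict(population)

print(km_labels.shape, db_labels.shape)  # one label per variable vs. per solution
```

The cluster labels are then mapped to binary values (for instance, the cluster with the larger centroid becomes 1), which is one way such clustering output can drive a binarization step.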
When solving the SCP, we tested 65 non-unicost instances of the Beasley OR-Library, plus 10 well-known instances for the KP; the classical technique obtained better results than the clustering techniques on both problems. We believe that the number of combinations explored (992 experiments for Two-steps) explains its better performance with this metaheuristic, and that adjustments, perhaps via autonomous search on the clustering techniques, are needed to improve their performance.
The best results were obtained with the S-Shape transfer functions (S-Shape 4 predominates over the others) combined with the Elitist binarization method; the worst results came from combining the V-Shape transfer functions with the Complement binarization. It is very relevant that KMeans did not obtain better results than the Two-steps strategy. On the other hand, it is relevant that DBscan obtained good results, better than KMeans though worse than Two-steps; however, the quantity of Two-steps combinations is a very influential factor, since with 32 different binarization ways it is far from easy to implement, requiring time and the knowledge of an expert user. With DBscan we reduce the combinatorial Two-steps strategies to a single experiment (31 executions) while still obtaining good results; statistically, DBscan achieved better performance.
As future work, we plan to reverse the two strategies, solving the grouping problem as a matrix with KMeans and as a single solution with DBscan. In addition, we will implement a self-adaptive approach to obtain better results. Finally, we will apply online parameter control in the Two-steps strategy to reach the BKS on the instances that did not reach optimal values.

Author Contributions

Formal analysis, S.V.; investigation, R.S., B.C., S.V., N.C. and R.O.; methodology, R.S. and B.C.; resources, R.S. and B.C.; software, R.O., S.V. and N.C.; validation, R.O., S.V., F.P., C.C. and N.C.; writing—original draft, S.V.; writing—review and editing, R.S., R.O. and S.V. All the authors of this paper hold responsibility for every part of this manuscript. All authors have read and agreed to the published version of the manuscript.

Funding

Ricardo Soto is supported by Grant CONICYT/FONDECYT/REGULAR/1190129. B.C. is supported by Grant CONICYT/FONDECYT/REGULAR/1171243. Nicolás Caselli is supported by PUCV-INF 2019.

Conflicts of Interest

The authors declare no conflict of interest. The founding sponsors had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, and in the decision to publish the results.

Appendix A

Appendix A.1. Set Covering Problem Results

Appendix A.1.1. SCP—Crow Search Algorithm

Table A1. CSA/SCP—SShape 1—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | S1+Stand. (MIN, AVG) | S1+Comp. (MIN, AVG) | S1+Static (MIN, AVG) | S1+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1429438444.81459463.32465494.87432443.74429430429430.38
4.2512522550.35579586.39612698.06528551.10513515.84512514.77
4.3516520532.58552564.19587620.47564578.36516520.61516518.87
4.4494513524.58544555.26566620.13496515.84495498.87494496.10
4.5512518529.23554570.45601680.39518529.55514514.90512513.23
4.6560577590.23614626.23661798.55566590560562.35560560.19
4.7430436445.90485492.81498558.90438451.68430432.26430430.81
4.8492501513.74532540.29585680.61499508.52493496.13492494.10
4.9641675690.42722745.06807911.65657687.94645657.90641647.50
4.10514534554.74569578.90595678.27528550.23516519.9514515.81
5.1253265275.29291296.26309352.61258271.32253255.74253255.52
5.2302310327.55352363.68421471.19317326.74308309.81302306.68
5.3226230240.19249252.65278310.35229239.74228228.68226227.32
5.4242242249.43263264.90301354.12254269.84242242.23242243.29
5.5211211223.13242245.97247301.87245268.21211212.03211213.90
5.6213222235.71251254.39267309.87226237.87213215.10213213.04
5.7293305314.61327335.29360408.94305315.10293297.68293295.35
5.8288302310.06342348.48370424.65299310.84288290.35288290.39
5.9279283303.23321331.29357381.28287305.68279280.52279279.90
5.10265271277.45294298.45335376.16273277.10267268.77265267.97
6.1138146152.03159166210287.13145151.71140142.90140143.42
6.2146152157.29163169308401.10150156.10146148.87146149.71
6.3154148156.52162167.23268379.39148156.65147148.16145149.65
6.4131131138.26137141.94247359.87138147.64131131.68131133.03
6.5161171181.35186194.50267419.94173181.42163164.68162165.39
AVG336.08344.92356.75373.96382.11420.88491.21346.92360.91336.80339.43335.84346.80
a.1253257263.74269272.48474584.81256261.81254255.55254256
a.2252262274.42300305.23439565.55267274.74257259.48256260.65
a.3232242251256258.16410525.81242250.35235237.10233237.35
a.4234239250.71264268.35409523.32236250.97235237.39236239.97
a.5236239243.32258263.03421540.65238241.84236237.06236237.52
b.1697581.618286.10347518.137281.587479.616976.48
b.2768490.269294.19411549.618188.358390.038186.81
b.3808085.949191.90507535.98587.658488.558288.81
b.4798389.1096100.26517660.528488.558487.908388.16
b.5727278.928284.55498587.587680.357277.687876.03
c.1227232237.65263264.65512718.52233236.71228230.29233238.77
c.2219228236.61259263.13617782.81226235.58226222.71226232.74
c.3243257270.55297302.32738939.94261271254256.68253264
c.4219231243.45256260.17618783.23233243.58225227.97224233.10
c.5215219226.81245250.29529713.77218226.06215216.35222230.42
d.1606168.297070.87600889.426367.2366666879.65
d.2667073.527778.686861001.556874.1971717389.71
d.3727781.688386.748811114.297882.13828282100.84
d.4666568.297171.816449456568.2367677081.74
d.5616467.746969.97631909.136367.3966667281.29
AVG151.55156.85164.18174177.14544.45719.47157.25163.91155.70157.81156.55164
nre.1293033.23323313541643.543032.6830307080.03
nre.2303337.453637.6114051732.553336.19343483128.32
nre.3272833.613234.6111371416.773033.61343474112.35
nre.4283134343412101712.483033.90333483177.45
nre.5283032.943333.8712921860.233032.77303092150.52
nrf.11417191717.03667903.481518.421717372444.19
nrf.2151618.711818607836.291618.61181874404.58
nrf.3141719.391919.908581034.901719.291919335487.25
nrf.4141518.451718.61667902.971618.29181852420.52
nrf.5131518.291616.32653844.231517.61161665371.03
nrg.1176201208.26230233.4844735588.65192203.39197197287385.13
nrg.2154172181.06187190.4838484648.48166174.16168168208284.29
nrg.3166180190.55196197.8443195026.55180185.19183183211347.23
nrg.4168187200.03216219.2641424935.29180192.10186186250345.77
nrg.5168191201.10213217.3241985155181193.84186186230353.97
nrh.163102124.688283.65908010,028.427277.427777t.o.t.o.
nrh.26392120.108181.81908510,332.717176.487171t.o.t.o.
nrh.35989120.23747581019886.686773.616969t.o.t.o.
nrh.45891116.527374.9785929899.266770.946868t.o.t.o.
nrh.55588105.136868.7477869318.556267.166161t.o.t.o.
AVG67.1081.2591.6383.7085.273673.704385.3573.5078.7875.7575.80t.o.t.o.
Table A2. CSA/SCP—SShape 2—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | S2+Stand. (MIN, AVG) | S2+Comp. (MIN, AVG) | S2+Static (MIN, AVG) | S2+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1429432443.48457462.81622686.58431439.73429430429430.38
4.2512517549.26578587.269291084.16522549.50513515.84512514.77
4.3516519532555564.23789987.25578687.47516520.61516518.87
4.4494500520.55542555.81861963.39501517.03495498.87494496.10
4.5512518531.84557573.489011101.97514529.40514514.90512513.23
4.6560566588.48610624.8111941320.35568588.70560562.35560560.19
4.7430436447.13479491.35821892.06437448.60430432.26430430.81
4.8492499514.74525539.529461095.10493506.97493496.13492494.10
4.9641641652.16722743.3913681499.48662682.20645657.90641647.50
4.10514523550.52560575.358731064.16518546.53516519.9514515.81
5.1253257273.52287295.26490556.81260270.27253255.74253255.52
5.2302312325.23352361.77717800.55308321.53308309.81302306.68
5.3226231240.77250253.23466522.19229237.13228228.68226227.32
5.4242244249.13261264.68357487.25247278.64242242.23242243.29
5.5211215223.77242245.87287359.25287301.07211212.03211213.90
5.6213217235.61245253.68422498.10224234213215.10213213.04
5.7293307312.90331334.16563650.16299311.40293297.68293295.35
5.8288291310.71339346.74575654.52298308.57288290.35288290.39
5.9279288306.16320331.23487587.37297357.26279280.52279279.90
5.10265269276.58290297.55553609.87272278267268.77265267.97
6.1138148151.67161165.26569732.77143150.77140142.90140143.42
6.2146150158.58166169774995.52151157.90146148.87146149.71
6.3154150157.65162167.77766949.35150155.27147148.16145149.65
6.4131134138.65139142.06778954.21147157.28131131.68131133.03
6.5161175181.94186192.587921000.87170179.47163164.68162165.39
AVG336.08341.56354.92372.64381.55716842.13348.24367.78336.80339.43335.84338.25
a.1253258264.39270273.169811126.26257262.60254255.55254256
a.2252263275.06301305.329431066.48266274.33257259.48256260.65
a.3232247250.35256257.87812998.84241250.03235237.10233237.35
a.4234234240.06265268.90848977.29241251.03235237.39236239.97
a.5236238244.26259264.29834971.19239242.90236237.06236237.52
b.1697280.588285.2610681196.617680.837479.616976.48
b.2767681.45939410391179.848088.308390.038186.81
b.3808286.559191.8410541178.898691.048488.558288.81
b.4798488.6895100.2311531353.978387.578487.908388.16
b.5727379.558284.5211271201.317884.897277.687876.03
c.1227233240.23262264.6811281308.87230236.47228230.29233238.77
c.22192192225.10254262.6813591470.74230237.47226222.71226232.74
c.3243260272.03295301.5214471734.35259269.80254256.68253264
c.4219235243.74254259.4213651463.19233243.40225227.97224233.10
c.5215219225.65246249.7712531364.74219225.50215216.35222230.42
d.1606368.777070.8714831682.686267.6766666879.65
d.2666974.457778.2316081893.906973.5771717389.71
d.3727882.068386.6819002102.947780.40828282100.84
d.4666468.657071.8115311698.816468.2767677081.74
d.5616267.946969.9015211699.136466.8766667281.29
AVG151.55156.45262.97173.70177.051222.701383.50157.70164.15155.70157.81156.55164
nre.1292933.14323321472147.653033.8530307080.03
nre.2303237.293637.6126522897.613337343483128.32
nre.3273134.233334.5222422510.843033.20343474112.35
nre.4282934.323333.9725192804.133133.97333483177.45
nre.5283033.583333.9426972965.902932.67303092150.52
nrf.1141618.16171714381688.131618.531717372444.19
nrf.2151619.231717.9713351516.131718.47181874404.58
nrf.3141719.771919.8715401825.871719.501919335487.25
nrf.4141618.681718.7115241664.941618.37181852420.52
nrf.5131618.101616.3213131445.741517.83161665371.03
nrg.1176327371.45230233.1371277506.23192203.50197197287385.13
nrg.2154233279.13185189.7759946340.61167175.87168168208284.29
nrg.3166281315.81195197.7165806997.29178186.17183183211347.23
nrg.4168272303.65215218.9066017007.48183192.27186186250345.77
nrg.5168276329216217.5567787198.52184194.33186186230353.97
nrh.16311651405.978383.7712,35312,926.587277.977171t.o.t.o.
nrh.26311531326.817981.7711,77812,818.747378.577171t.o.t.o.
nrh.35911991361.747474.8711,99212,604.526973.536969t.o.t.o.
nrh.45811871334.777374.8412,29512,797.396671.036868t.o.t.o.
nrh.55510111181.946768.7111,47712,064.846368.876161t.o.t.o.
AVG67.10366.80423.8383.5085.195619.105986.4574.0579.2775.4575.50t.o.t.o.
Table A3. CSA/SCP—SShape 3—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | S3+Stand. (MIN, AVG) | S3+Comp. (MIN, AVG) | S3+Static (MIN, AVG) | S3+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1429431442.52452461.10587667.32432439.45429430429430.38
4.2512515542.26576585.359061028.52515545.26513515.84512514.77
4.3516516529.13554564.97798869.54579602.32516520.61516518.87
4.4494495518.71544553.81828914.39501513.42495498.87494496.10
4.5512517527.61559568.748631036514525.35514514.90512513.23
4.6560565585.53618624.1310821239.10566587.68560562.35560560.19
4.7430434445.52484491.16728830.16438447430432.26430430.81
4.8492498508.68529539.269421075.97497506.26493496.13492494.10
4.9641654684.03726741.9011451399.77656677.03645657.90641647.50
4.10514518540.97568578.268521017.61517537.58516519.9514515.81
5.1253260270.84289294.16471519.52257268.13253255.74253255.52
5.2302313325.77349360.81694761.77314323.06308309.81302306.68
5.3226229237249252.81419476.16228234.52228228.68226227.32
5.4242243248.53261264.45368589.12302317.22242242.23242243.29
5.5211214221.10241244.97358475.74235247.54211212.03211213.90
5.6213221233.32250253.84383469.61222234213215.10213213.04
5.7293301311.84327332.84558622.35302311.52293297.68293295.35
5.8288298307.87338345.74545631.48294307.29288290.35288290.39
5.9279281303.13315329.264412506.06301387.31279280.52279279.90
5.10265271276.39294297.97497573.84268275.19267268.77265267.97
6.1138145151.55159164.97701874.98144148.90140142.90140143.42
6.2146151157.61166169.19803941.94148156.29146148.87146149.71
6.3154148155.68160166.71788892.10148152.84147148.16145149.65
6.4131132137139141.81814912.32139150.01131131.68131133.03
6.5161170180.42183192816916.68165177.97163164.68162165.39
AVG336.08340.80353.72373.20380.81854.32809.68347.28362.93336.80339.44335.84338.25
a.1253257262.13270273.109031039.42256262.06254255.55254256
a.2252261274.94301305.19861980.68264271.61257259.48256260.65
a.3232241249.94255257.55812924.23242249.35235237.10233237.35
a.4234243251.97246268.48828921.84240249.87235237.39236239.97
a.5236238242.10260264.74846926.26237241.97236237.06236237.52
b.1697480.688284.909621110.107380.587479.616976.48
b.2768388.689093.719611083.718188.688390.038186.81
b.3808487.139091.719871087.368689.788488.558288.81
b.4798489.32959911051265.168188.038487.908388.16
b.5727479.618084.1311871398.658081.277277.687876.03
c.1227232237.48262264.5210981230.23233236.48228230.29233238.77
c.2219225236.58256262.4511931351.26225232.90226222.71226232.74
c.3243257270.03298301.1014661621.74262270.23254256.68253264
c.4219230242.74256259.4512201346.16233243.19225227.97224233.10
c.5215218225.13243249.6511181289.65219225.03215216.35222230.42
d.1606368.617970.9013671563.396166.5866666879.65
d.2667074.267778.3514151757.356973.3971717389.71
d.37278818586.7116651942.037881.13828282100.84
d.4666768.687171.90145816106567.8467677081.74
d.5616467.946969.8414431599.946366.8766667281.29
AVG151.55126.86133.19142.20144.5512431417.11127.26132.79126.46128.65127.73136.57
nre.1293033.45323322242478.473536.2130307080.03
nre.2303337.263537.1025352724.683236343483128.32
nre.3273033.653334.2920162332.483032.87343474112.35
nre.4283234.58343423862632.773134.29333483177.45
nre.5283033.163333.9425732762.352832.03303092150.52
nrf.1141519.35161713481565.771517.871717372444.19
nrf.2151618.77181812651442.901618.10181874404.58
nrf.3141720.291919.6815781739.681619.521919335487.25
nrf.4141719.061718.6813861546.101517.97181852420.52
nrf.5131618.741616.2612291367.841517.61161665371.03
nrg.1176585702231233.4264837083.65200210.65197197287385.13
nrg.2154441506.13186190.1956195953.74166177168168208284.29
nrg.3166484570.19196198.1659206475.87180188.23183183211347.23
nrg.4168506571.71217219.7459716549.42185196.81186186250345.77
nrg.5168515620.45213218.1365726799.97186198.19186186230353.97
nrh.16320432305.068383.9711,39112,133.23119156.137777t.o.t.o.
nrh.26319422188.558181.7711,64412,162.52112141.137171t.o.t.o.
nrh.35918702244.55757511,16911,969.55100141.816969t.o.t.o.
nrh.45821022259.947475.1011,57712,029.13109143.136868t.o.t.o.
nrh.55517782010.266768.9010,95211,382.8193118.716161t.o.t.o.
AVG67.10625.10712.3583.8085.315291.905656.6484.1596.7175.7575.80t.o.t.o.
Table A4. CSA/SCP—SShape 4—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | S4+Stand. (MIN, AVG) | S4+Comp. (MIN, AVG) | S4+Static (MIN, AVG) | S4+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1429432440.45456461.06440455.10429430.61429430429430.38
4.2512515542.77571583.74561598.77512515.74513515.84512514.77
4.3516516530.25554564.32523587.21516518.87516520.61516518.87
4.4494507517.45541553.48518551.81494500.10495498.87494496.10
4.5512516527.71551568.74547578.55512514.61514514.90512513.23
4.6560569586.65612624.77607643.74560563.61560562.35560560.19
4.7430434444.84483491.55451487.13430432.26430432.26430430.81
4.8492493508.06529539516558.61492494.19493496.13492494.10
4.9641668682.74728743.71705766.61647654.68645657.90641647.50
4.10514522541.81564576.52558589.06514517.19516519.9514515.81
5.1253254268.74288294.55279298.19253256.74253255.74253255.52
5.2302315326.68349360.74335368.16302308.39308309.81302306.68
5.3226228235.97246252.32244261.71226227.32228228.68226227.32
5.4242242247.74262264.87268278.36243246.23242242.23242243.29
5.5211212220.29241245.32250238.07212214.32211212.03211213.90
5.6213223232.84247253.65236260.35213213.90213215.10213213.04
5.7293301311.48327331.61322344.13293295.03293297.68293295.35
5.8288296309.32336344.90323348.68288289.90288290.35288290.39
5.9279279300.32314329.32280301.24280281.34279280.52279279.90
5.10265272276.23292297.48288306.77265267.87267268.77265267.97
6.1138145150.90158164.29157183.84138143.13140142.90140143.42
6.2146150155.58165168.71180225.68146150.61146148.87146149.71
6.3154148154.19162166.77170213.39145150.32147148.16145149.65
6.4131133136.55138141.32148154.24132134.23131131.68131133.03
6.5161171179.71187191.94208239.29161167163164.68162165.39
AVG336.08341.64353.17372.04380.58364.56393.54336.12339.52336.80339.43335.84338.25
a.1253253257.65270273.13322367.97254257.26254255.55254256
a.2252262272.58299304.87326381.03252260.13257259.48256260.65
a.3232242249.81255257.81310348.26232236.52235237.10233237.35
a.4234241250.10264267.94299348.39235240.65235237.39236239.97
a.5236239243.87261266.13295350.74236237.74236237.06236237.52
b.1697480.168285176246.526972.977479.616976.48
b.2767987.819293.74183261.267781.458390.038186.81
b.3808186.069091.58174257.038182.348488.558288.81
b.4798589.689599.13161297.457982.558487.908388.16
b.5727278.588283.90189247.987374.327277.687876.03
c.1227231237.90262264.55371424.48227232.77228230.29233238.77
c.2219224234.55258262.90372461.16220225.10226222.71226232.74
c.3243258270.42294300.68434560.23243252.81254256.68253264
c.4219233241.48256259.55353445.23219225.74225227.97224233.10
c.5215219225.55245250.42374444.32216220.55215216.35222230.42
d.1606367.976970.74376440.77606366666879.65
d.2667074.327778.45397493.066668.3971717389.71
d.3727681.328586.58439560.297275.52828282100.84
d.4666768.427171.81319450.906265.5867677081.74
d.56162686969.94317422.816163.8466667281.29
AVG151.55156.55163.31173.80176.94309.35390.49151.70155.96155.70157.81156.55164
nre.1293034.233232.84687748.323131.6530307080.03
nre.2303639.943537.26741904.423133.55343483128.32
nre.3273134.453334.58622751.812729.77343474112.35
nre.4283335.523434688860.132830.81333483177.45
nre.5283136.133333.94781936.062829.74303092150.52
nrf.1141518.581717.033624441415.651717372444.19
nrf.2151619.261717.97268394.611516.19181874404.58
nrf.3141719.611919.683555001516.611919335487.25
nrf.4141518.191818.61334429.771515.71181852420.52
nrf.5131618.351616.10272377.811415161665371.03
nrg.117610711186.32229234.1627573176.94179189.55197197287385.13
nrg.2154758861.84189190.5822262669.29160166.23168168208284.29
nrg.31668501008.58197198.3925742898.26168177.65183183211347.23
nrg.41688601008.48217219.2925692951.26175180.74186186250345.77
nrg.51689741078.77216219.3927323081.35174181.13186186230353.97
nrh.16330653318.328284.1052946065.267174.197777t.o.t.o.
nrh.26329623254.038181.9754865997.686773.487171t.o.t.o.
nrh.35930343253.357575.3948535984.236469.036969t.o.t.o.
nrh.45830103255.657375.5550026163.906267.716868t.o.t.o.
nrh.55526552934.166869.1047735589.065862.456161t.o.t.o.
AVG67.10973.951071.6884.0585.492168.802546.2169.8073.8475.7575.80t.o.t.o.
Table A5. CSA/SCP—VShape 1—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | V1+Stand. (MIN, AVG) | V1+Comp. (MIN, AVG) | V1+Static (MIN, AVG) | V1+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1429639699.94454463.13647723.19540600.03429430429430.38
4.25129591086.65573584.399531127.10742881.45513515.84512514.77
4.351610401172.58665754.21754845.32701732.21516520.61516518.87
4.4494869985.39539553.61817986.74723792.35495498.87494496.10
4.55129251102.23554561.529641143.55824904.32514514.90512513.23
4.656012591357.42614624.4511191371.909391038.55560562.35560560.19
4.7430751875.52481493.45800913.26590710.81430432.26430430.81
4.849210101144.26530539.4510281175.71796902.58493496.13492494.10
4.964113151513.10720740.2911921616.9010511193.26645657.90641647.50
4.105149751108.10566572.269871131.74748863.19516519.9514515.81
5.1253506566.03288296.23477557.65385446.97253255.74253255.52
5.2302730821.42350359.68726849.23545622.29308309.81302306.68
5.3226454502.19246251.19450537.45366407.32228228.68226227.32
5.4242438519.26321354.78451534.21369409.99242242.23242243.29
5.5211363402.842542987.54458587.32370411.21211212.03211213.90
5.6213435490.16248254.16454516.84354397.32213215.10213213.04
5.7293565658.52329336.03518666.68449526.74293297.68293295.35
5.8288598668.61337345.42585676.74483527.29288290.35288290.39
5.9279580673.10398401.05547654.21489531.07279280.52279279.90
5.10265539608.13291296.48555615.87398472.52267268.77265267.97
6.1138625702.68161165628744.58413504.45140142.90140143.42
6.21467081013.94161167.139231047.35576694.87146148.87146149.71
6.3154783946.39198209.198491021.68529641.48147148.16145149.65
6.4131534618.55180185.988471035.65584601.25131131.68131133.03
6.51618601012.23189194.068401009.48548682.58163164.68162165.39
AVG336.08738.40849.97385.88507.63742.76883.61580.48659.84336.80339.44335.84338.25
a.125310301148.10269271.069921181.42703810.94254255.55254256
a.22529031068.74294303.169501085.06612766.45257259.48256260.65
a.3232838998.48254258.109511019.65598714.74235237.10233237.35
a.4234908995.32263266.909251026.77635706.06235237.39236239.97
a.52369301017.94252259.168481043.42653732.81236237.06236237.52
b.16910921205.558285.4210911235.58656781.487479.616976.48
b.27610931215.069293.6511081225.10639798.138390.038186.81
b.38013751546.6596100.2811531482.72689799.258488.558288.81
b.47912501380.3995100.1911941402.32713914.658487.908388.16
b.57211081209.358798.2112011403.21721915.417277.687876.03
c.122711631349.45258262.3212091358.77804925.77228230.29233238.77
c.221913561502.10258261.9713111534.298371018.81226222.71226232.74
c.324316601769.61295301.0316751832.5810691218.94254256.68253264
c.421913541476.10255258.2313311500.718541014225227.97224233.10
c.521512831403.81240246.8112651448.26757980.29215216.35222230.42
d.16014941743.976970.5816021790.909421139.1666666879.65
d.26618321982.327677.9017842016.3211211287.3271717389.71
d.37219552154.878386.4519852192.7411871421.10828282100.84
d.46615491779.296876.2616341786.9410451184.4867677081.74
d.56116281772.136969.5515421781.559631165.3966667281.29
AVG151.551290.051435.96172.75177.361287.551467.41809.90964.76155.70157.82156.55164
nre.12923352506.263139.3625012975.2116251825.2130307080.03
nre.23027823022.133537.7726873002.6817161929.81343483128.32
nre.32722442547.813234.6122632628.3915071700.26343474112.35
nre.42827632912343426812944.8716631917.81333483177.45
nre.52828503078.263333.9727283036.2616742017.84303092150.52
nrf.11415701722.10171715661782.819861139.031717372444.19
nrf.21513761566181814501578.039311063.00181874404.58
nrf.31417331933.131919.8417161903.7711261275.971919335487.25
nrf.41415431729.351818.9014891718.8410191134.26181852420.52
nrf.51313341488.711616.2913321500.13820988.39161665371.03
nrg.117671567696.32230232.6874057914.9046735041.81197197287385.13
nrg.215461096442.68185189.2961556470.3239354351.13168168208284.29
nrg.316665077098.19195197.4265427166.6843754740.77183183211347.23
nrg.416864607072.45215218.1967927302.1340864713.61186186250345.77
nrg.516868377458.77214216.8167127387.6544164919.71186186230353.97
nrh.16312,51513,137.268283.7112,57413,310.87800188697777t.o.t.o.
nrh.26312,38113,125.458181.9412,22713,093.8778028797.237171t.o.t.o.
nrh.35912,24113,122.817474.9712,18013,132.8479578662.296969t.o.t.o.
nrh.45812,46413,209.457374.9712,09712,974.8779348595.196868t.o.t.o.
nrh.55511,62612,348.526768.5511,55812,356.2373398103.616161t.o.t.o.
AVG67.105741.306160.8883.4585.415732.756209.073679.254089.2975.7575.80t.o.t.o.
Table A6. CSA/SCP—VShape 2—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | V2+Stand. (MIN, AVG) | V2+Comp. (MIN, AVG) | V2+Static (MIN, AVG) | V2+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1429616687.77453462.45637706.10498582.52429430429430.38
4.25129251074.84577585.199681084.70769851.81513515.84512514.77
4.35169991129.90512548.09812987.63725974.15516520.61516518.87
4.4494841946.16541553.32834946.37681774.26495498.87494496.10
4.55129971082.71557568.269591089.29748875.52514514.90512513.23
4.656011611305.23614625.0311481310.87849990.03560562.35560560.19
4.7430762847.39478495.52751869.77665704.06430432.26430430.81
4.84929971105.65532540.299001128.16749863.55493496.13492494.10
4.964112651501.42736745.1613451512.9010261141.94645657.90641647.50
4.105148891045.52566574.109731103.13756844.03516519.9514515.81
5.1253504552.81289295.29488556.55390430.32253255.74253255.52
5.2302708790.77351360.06732816.40520602.45308309.81302306.68
5.3226427499.35245251.35426506.33356401.32228228.68226227.32
5.4242462507.42297301.02398547.14398409.94242242.23242243.29
5.5211347395.90287304.21421568.96371419.36211212.03211213.90
5.6213438483.77249254.45424499.13346382.42213215.10213213.04
5.7293588644.39329334.81538654.47426504.13293297.68293295.35
5.8288580665.32337345.74602675.07470523.06288290.35288290.39
5.9 | 279 | 567 | 652.19 | 338 | 354.32 | 413 | 501.06 | 487 | 524.98 | 279 | 280.52 | 279 | 279.90
5.10 | 265 | 544 | 598.87 | 294 | 298.23 | 508 | 598.17 | 419 | 468.97 | 267 | 268.77 | 265 | 267.97
6.1 | 138 | 588 | 701.55 | 159 | 164.06 | 613 | 711.50 | 388 | 474.45 | 140 | 142.90 | 140 | 143.42
6.2 | 146 | 777 | 962.42 | 160 | 167.03 | 885 | 1034.29 | 502 | 654.45 | 146 | 148.87 | 146 | 149.71
6.3 | 154 | 802 | 931.32 | 157 | 166.39 | 807 | 956.61 | 475 | 603.71 | 147 | 148.16 | 145 | 149.65
6.4 | 131 | 518 | 602.61 | 154 | 187.17 | 420 | 508.36 | 576 | 672.19 | 131 | 131.68 | 131 | 133.03
6.5 | 161 | 870 | 973.77 | 188 | 193.90 | 760 | 991.13 | 539 | 652.29 | 163 | 164.68 | 162 | 165.39
AVG | 336.08 | 726.88 | 827.56 | 376 | 387.02 | 710.48 | 834.56 | 565.16 | 653.04 | 336.80 | 339.43 | 335.84 | 338.25
a.1 | 253 | 1009 | 1141.32 | 268 | 271.23 | 999 | 1159.06 | 706 | 781.26 | 254 | 255.55 | 254 | 256
a.2 | 252 | 911 | 1045.35 | 297 | 303.90 | 900 | 1039.52 | 589 | 744.06 | 257 | 259.48 | 256 | 260.65
a.3 | 232 | 899 | 995.65 | 256 | 257.90 | 876 | 967.77 | 593 | 688.20 | 235 | 237.10 | 233 | 237.35
a.4 | 234 | 875 | 972.06 | 266 | 268.42 | 805 | 990.94 | 585 | 694.43 | 235 | 237.39 | 236 | 239.97
a.5 | 236 | 862 | 977.81 | 254 | 260.26 | 836 | 985.77 | 533 | 690.83 | 236 | 237.06 | 236 | 237.52
b.1 | 69 | 1072 | 1178.35 | 82 | 85.58 | 1089 | 1186.58 | 633 | 732.90 | 74 | 79.61 | 69 | 76.48
b.2 | 76 | 1037 | 1157.16 | 92 | 93.77 | 984 | 1174.29 | 636 | 753.30 | 83 | 90.03 | 81 | 86.81
b.3 | 80 | 1263 | 1475.23 | 100 | 132.14 | 1258 | 1748.19 | 716 | 845.14 | 84 | 88.55 | 82 | 88.81
b.4 | 79 | 1170 | 1346.77 | 95 | 100.16 | 1191 | 1348.10 | 730 | 853.63 | 84 | 87.90 | 83 | 88.16
b.5 | 72 | 1017 | 1200.29 | 98 | 111.12 | 1098 | 1569.02 | 730 | 784.96 | 72 | 77.68 | 78 | 76.03
c.1 | 227 | 1106 | 1313.23 | 260 | 263.29 | 1165 | 1315.68 | 761 | 906.87 | 228 | 230.29 | 233 | 238.77
c.2 | 219 | 1295 | 1461.61 | 258 | 262.35 | 1299 | 1479.65 | 811 | 980.50 | 226 | 222.71 | 226 | 232.74
c.3 | 243 | 1513 | 1710.23 | 296 | 301.29 | 1552 | 1728.39 | 1024 | 1130.37 | 254 | 256.68 | 253 | 264
c.4 | 219 | 1270 | 1418.29 | 256 | 258.97 | 1284 | 1447.32 | 784 | 937.70 | 225 | 227.97 | 224 | 233.10
c.5 | 215 | 1230 | 1372.42 | 244 | 247.81 | 1229 | 1385.45 | 828 | 933.03 | 215 | 216.35 | 222 | 230.42
d.1 | 60 | 1504 | 1693.84 | 70 | 70.81 | 1509 | 1685.35 | 935 | 1068.37 | 66 | 66 | 68 | 79.65
d.2 | 66 | 1640 | 1890.45 | 77 | 78.52 | 1750 | 1912.71 | 1058 | 1226.83 | 71 | 71 | 73 | 89.71
d.3 | 72 | 1799 | 2083.48 | 85 | 86.84 | 1807 | 2091.03 | 1153 | 1321.50 | 82 | 82 | 82 | 100.84
d.4 | 66 | 1638 | 1721.70 | 71 | 71.68 | 1424 | 1718.71 | 899 | 1108.30 | 67 | 67 | 70 | 81.74
d.5 | 61 | 1489 | 1693.55 | 69 | 69.68 | 1585 | 1703.68 | 946 | 1103.57 | 66 | 66 | 72 | 81.29
AVG | 151.55 | 1229.95 | 1392.4395 | 174.70 | 179.786 | 1232 | 1431.86 | 782.50 | 914.29 | 155.70 | 157.82 | 156.55 | 164
nre.1 | 29 | 2176 | 2403.26 | 36 | 36.87 | 2548 | 2857.31 | 1578 | 1987.47 | 30 | 30 | 70 | 80.03
nre.2 | 30 | 2695 | 2918.32 | 37 | 37.84 | 2619 | 2923.84 | 1672 | 1889.33 | 34 | 34 | 83 | 128.32
nre.3 | 27 | 2334 | 2521.45 | 33 | 34.74 | 2296 | 2500.48 | 1435 | 1614.57 | 34 | 34 | 74 | 112.35
nre.4 | 28 | 2553 | 2828.48 | 34 | 34 | 2526 | 2793.87 | 1614 | 1813.67 | 33 | 34 | 83 | 177.45
nre.5 | 28 | 2711 | 2938.65 | 33 | 33.87 | 2654 | 2970.90 | 1614 | 1909.20 | 30 | 30 | 92 | 150.52
nrf.1 | 14 | 1468 | 1682.45 | 17 | 17 | 1514 | 1659.13 | 925 | 1069.30 | 17 | 17 | 372 | 444.19
nrf.2 | 15 | 1402 | 1536.06 | 18 | 18 | 1398 | 1531.61 | 855 | 978 | 18 | 18 | 74 | 404.58
nrf.3 | 14 | 1701 | 1855.81 | 19 | 19.97 | 1628 | 1858.19 | 1029 | 1200.63 | 19 | 19 | 335 | 487.25
nrf.4 | 14 | 1488 | 1644.90 | 18 | 18.77 | 1542 | 1674.55 | 915 | 1058.73 | 18 | 18 | 52 | 420.52
nrf.5 | 13 | 1296 | 1442.48 | 16 | 16.35 | 1287 | 1475.16 | 782 | 919.70 | 16 | 16 | 65 | 371.03
nrg.1 | 176 | 7041 | 7478.61 | 231 | 233.19 | 6869 | 7443.55 | 4523 | 4843.27 | 197 | 197 | 287 | 385.13
nrg.2 | 154 | 5984 | 6380.45 | 185 | 189.77 | 5805 | 6303.19 | 3737 | 4111.17 | 168 | 168 | 208 | 284.29
nrg.3 | 166 | 6452 | 6925.32 | 193 | 196.97 | 6559 | 6929.65 | 3983 | 4468.67 | 183 | 183 | 211 | 347.23
nrg.4 | 168 | 6442 | 6891.16 | 214 | 218.90 | 6441 | 6973.35 | 4130 | 4511.93 | 186 | 186 | 250 | 345.77
nrg.5 | 168 | 6906 | 7285.71 | 215 | 217.35 | 6755 | 7169.55 | 4273 | 4642.87 | 186 | 186 | 230 | 353.97
nrh.1 | 63 | 12,063 | 12,811.13 | 83 | 83.90 | 11,907 | 12,923.58 | 7492 | 8301.20 | 77 | 77 | t.o. | t.o.
nrh.2 | 63 | 11,862 | 12,769.65 | 81 | 81.61 | 12,413 | 12,925.84 | 7687 | 8235.53 | 71 | 71 | t.o. | t.o.
nrh.3 | 59 | 11,860 | 12,599.94 | 75 | 75 | 12,040 | 12,671.87 | 7463 | 8225.97 | 69 | 69 | t.o. | t.o.
nrh.4 | 58 | 12,029 | 12,719.58 | 73 | 74.81 | 11,741 | 12,699.65 | 7657 | 8212.10 | 68 | 68 | t.o. | t.o.
nrh.5 | 55 | 11,348 | 11,962.68 | 66 | 68.52 | 11,376 | 11,968.42 | 7284 | 7840.06 | 61 | 61 | t.o. | t.o.
AVG | 67.10 | 5590.55 | 5979.80 | 83.85 | 85.37 | 5595.90 | 6012.68 | 3532.40 | 3891.67 | 75.75 | 75.80 | t.o. | t.o.
Table A7. CSA/SCP—VShape 3—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | V3+Stand. (MIN, AVG) | V3+Comp. (MIN, AVG) | V3+Static (MIN, AVG) | V3+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1 | 429 | 598 | 655.43 | 457 | 462.13 | 602 | 664.40 | 513 | 566.07 | 429 | 430 | 429 | 430.38
4.2 | 512 | 867 | 1018.73 | 574 | 583.63 | 929 | 1028.10 | 746 | 805.20 | 513 | 515.84 | 512 | 514.77
4.3 | 516 | 982 | 1085.40 | 553 | 560.43 | 784 | 995.35 | 684 | 789.33 | 516 | 520.61 | 516 | 518.87
4.4 | 494 | 831 | 930.40 | 536 | 552.20 | 796 | 909.37 | 669 | 732.60 | 495 | 498.87 | 494 | 496.10
4.5 | 512 | 886 | 1036.67 | 557 | 568.50 | 858 | 1053.90 | 749 | 813.83 | 514 | 514.90 | 512 | 513.23
4.6 | 560 | 1108 | 1241.53 | 608 | 622.70 | 1095 | 1250.77 | 848 | 949.10 | 560 | 562.35 | 560 | 560.19
4.7 | 430 | 717 | 827.30 | 487 | 494.83 | 767 | 833.60 | 609 | 662.93 | 430 | 432.26 | 430 | 430.81
4.8 | 492 | 891 | 1045.40 | 533 | 538.27 | 949 | 1073.60 | 694 | 827.87 | 493 | 496.13 | 492 | 494.10
4.9 | 641 | 1243 | 1391.57 | 728 | 741.90 | 1279 | 1403.10 | 975 | 1075.33 | 645 | 657.90 | 641 | 647.50
4.10 | 514 | 917 | 1003.93 | 561 | 570.33 | 929 | 1026.50 | 709 | 804.67 | 516 | 519.9 | 514 | 515.81
5.1 | 253 | 463 | 515.40 | 285 | 294.80 | 462 | 512.10 | 363 | 407.90 | 253 | 255.74 | 253 | 255.52
5.2 | 302 | 714 | 762.50 | 352 | 361.23 | 620 | 759.07 | 520 | 570.77 | 308 | 309.81 | 302 | 306.68
5.3 | 226 | 436 | 471.43 | 248 | 251.13 | 403 | 481.77 | 338 | 373.47 | 228 | 228.68 | 226 | 227.32
5.4 | 242 | 433 | 493.33 | 261 | 263.67 | 485 | 501.02 | 336 | 489.30 | 242 | 242.23 | 242 | 243.29
5.5 | 211 | 359 | 388.10 | 243 | 246.50 | 478 | 503.31 | 333 | 401.01 | 211 | 212.03 | 211 | 213.90
5.6 | 213 | 423 | 458.60 | 249 | 253.17 | 415 | 459.23 | 334 | 362.60 | 213 | 215.10 | 213 | 213.04
5.7 | 293 | 541 | 603.30 | 326 | 332.40 | 541 | 613 | 431 | 476.60 | 293 | 297.68 | 293 | 295.35
5.8 | 288 | 552 | 608.03 | 339 | 346.10 | 576 | 635.13 | 412 | 492.63 | 288 | 290.35 | 288 | 290.39
5.9 | 279 | 549 | 630.70 | 319 | 328.03 | 541 | 587.01 | 444 | 498.21 | 279 | 280.52 | 279 | 279.90
5.10 | 265 | 519 | 566.27 | 294 | 298 | 506 | 558.87 | 402 | 442.03 | 267 | 268.77 | 265 | 267.97
6.1 | 138 | 545 | 650.13 | 160 | 164.10 | 496 | 638.97 | 355 | 417.13 | 140 | 142.90 | 140 | 143.42
6.2 | 146 | 824 | 937.90 | 161 | 166.10 | 682 | 917.63 | 466 | 593.47 | 146 | 148.87 | 146 | 149.71
6.3 | 154 | 778 | 879.17 | 162 | 166.07 | 737 | 869.20 | 430 | 539.07 | 147 | 148.16 | 145 | 149.65
6.4 | 131 | 488 | 554.37 | 141 | 142.23 | 714 | 842.36 | 441 | 478.85 | 131 | 131.68 | 131 | 133.03
6.5 | 161 | 807 | 938.37 | 186 | 191.83 | 781 | 904.33 | 469 | 584.17 | 163 | 164.68 | 162 | 165.39
AVG | 336.08 | 698.84 | 787.76 | 372.80 | 380.01 | 697 | 800.87 | 530.80 | 606.17 | 336.80 | 339.44 | 335.84 | 338.25
a.1 | 253 | 951 | 1070.40 | 269 | 271.50 | 952 | 1080 | 563 | 702.97 | 254 | 255.55 | 254 | 256
a.2 | 252 | 919 | 1003.17 | 300 | 304.80 | 897 | 995.60 | 570 | 667.50 | 257 | 259.48 | 256 | 260.65
a.3 | 232 | 791 | 913.70 | 256 | 257.70 | 856 | 936.30 | 525 | 627.07 | 235 | 237.10 | 233 | 237.35
a.4 | 234 | 805 | 901 | 265 | 268.23 | 773 | 907.73 | 497 | 619.87 | 235 | 237.39 | 236 | 239.97
a.5 | 236 | 828 | 925.97 | 257 | 262.07 | 826 | 933.07 | 523 | 630.43 | 236 | 237.06 | 236 | 237.52
b.1 | 69 | 979 | 1084.37 | 83 | 85.43 | 988 | 1091.90 | 520 | 653.60 | 74 | 79.61 | 69 | 76.48
b.2 | 76 | 870 | 1094.33 | 92 | 93.73 | 975 | 1107.30 | 563 | 671.77 | 83 | 90.03 | 81 | 86.81
b.3 | 80 | 1193 | 1379.63 | 90 | 91.73 | 1172 | 1854.32 | 687 | 761.87 | 84 | 88.55 | 82 | 88.81
b.4 | 79 | 1076 | 1238.70 | 97 | 99.87 | 1053 | 1226 | 646 | 769.07 | 84 | 87.90 | 83 | 88.16
b.5 | 72 | 1065 | 1149.37 | 81 | 84.03 | 1254 | 1425.65 | 666 | 701.35 | 72 | 77.68 | 78 | 76.03
c.1 | 227 | 1089 | 1238.87 | 261 | 263.87 | 1088 | 1228.87 | 663 | 784.23 | 228 | 230.29 | 233 | 238.77
c.2 | 219 | 1205 | 1382.37 | 259 | 262.17 | 1245 | 1386.47 | 759 | 873.50 | 226 | 222.71 | 226 | 232.74
c.3 | 243 | 1432 | 1609.63 | 296 | 300.67 | 1477 | 1613.73 | 875 | 1029.33 | 254 | 256.68 | 253 | 264
c.4 | 219 | 1207 | 1342.40 | 258 | 259.40 | 1197 | 1333.23 | 656 | 842.50 | 225 | 227.97 | 224 | 233.10
c.5 | 215 | 1036 | 1276.17 | 247 | 248.67 | 1148 | 1284.57 | 215 | 216.50 | 215 | 216.35 | 222 | 230.42
d.1 | 60 | 1355 | 1578.83 | 70 | 70.90 | 1434 | 1562.33 | 773 | 967.23 | 66 | 66 | 68 | 79.65
d.2 | 66 | 1597 | 1747.63 | 77 | 78.53 | 1558 | 1770.80 | 957 | 1081.63 | 71 | 71 | 73 | 89.71
d.3 | 72 | 1728 | 1964.13 | 83 | 86.67 | 1648 | 1923.90 | 1010 | 1174.70 | 82 | 82 | 82 | 100.84
d.4 | 66 | 1477 | 1592.73 | 71 | 71.77 | 1472 | 1603.60 | 835 | 966.50 | 67 | 67 | 70 | 81.74
d.5 | 61 | 1292 | 1579.23 | 69 | 70 | 1470 | 1588.57 | 809 | 959.03 | 66 | 66 | 72 | 81.29
AVG | 151.55 | 1144.75 | 1303.63 | 174.05 | 176.59 | 1174.15 | 1342.70 | 665.60 | 785.03 | 155.70 | 157.82 | 156.55 | 164
nre.1 | 29 | 2053 | 2254.20 | 33 | 33 | 2147 | 2457.32 | 1111 | 1365.21 | 30 | 30 | 70 | 80.03
nre.2 | 30 | 2527 | 2711.20 | 36 | 37.37 | 2386 | 2741.63 | 1378 | 1644.57 | 34 | 34 | 83 | 128.32
nre.3 | 27 | 2074 | 2322.83 | 33 | 34.60 | 2185 | 2346.80 | 1211 | 1414.67 | 34 | 34 | 74 | 112.35
nre.4 | 28 | 2433 | 2608.93 | 33 | 33.97 | 2461 | 2614 | 1423 | 1610.63 | 33 | 34 | 83 | 177.45
nre.5 | 28 | 2609 | 2775.47 | 33 | 33.97 | 2516 | 2794.60 | 1528 | 1664.90 | 30 | 30 | 92 | 150.52
nrf.1 | 14 | 1416 | 1569.43 | 17 | 17.03 | 1340 | 1587.47 | 829 | 955.90 | 17 | 17 | 372 | 444.19
nrf.2 | 15 | 1321 | 1443.47 | 17 | 17.97 | 1271 | 1446.23 | 762 | 868.80 | 18 | 18 | 74 | 404.58
nrf.3 | 14 | 1567 | 1738.87 | 19 | 19.53 | 1475 | 1712.23 | 905 | 1045.07 | 19 | 19 | 335 | 487.25
nrf.4 | 14 | 1399 | 1552.83 | 17 | 18.47 | 1274 | 1547.93 | 792 | 935.03 | 18 | 18 | 52 | 420.52
nrf.5 | 13 | 1231 | 1354.60 | 16 | 16.13 | 1247 | 1361.70 | 689 | 824.87 | 16 | 16 | 65 | 371.03
nrg.1 | 176 | 6430 | 7032.93 | 229 | 234.50 | 6671 | 7135.37 | 3734 | 4242.67 | 197 | 197 | 287 | 385.13
nrg.2 | 154 | 5334 | 5918.47 | 189 | 191.03 | 5595 | 5990.23 | 3194 | 3623.40 | 168 | 168 | 208 | 284.29
nrg.3 | 166 | 6059 | 6399.60 | 197 | 198.53 | 5854 | 6377.47 | 3451 | 3881.77 | 183 | 183 | 211 | 347.23
nrg.4 | 168 | 6162 | 6558.77 | 215 | 219.80 | 6050 | 6507.13 | 3620 | 3956.83 | 186 | 186 | 250 | 345.77
nrg.5 | 168 | 6343 | 6718.60 | 216 | 219.80 | 6296 | 6798.60 | 3516 | 4139.43 | 186 | 186 | 230 | 353.97
nrh.1 | 63 | 11,013 | 11,981 | 82 | 84.37 | 11,191 | 12,000.60 | 6901 | 7398.43 | 77 | 77 | t.o. | t.o.
nrh.2 | 63 | 11,607 | 12,096.40 | 81 | 82.40 | 11,396 | 12,110.83 | 6578 | 7364.83 | 71 | 71 | t.o. | t.o.
nrh.3 | 59 | 11,473 | 11,903.43 | 75 | 75.83 | 11,155 | 11,836.20 | 6625 | 7112.07 | 69 | 69 | t.o. | t.o.
nrh.4 | 58 | 11,435 | 12,043.30 | 75 | 75.77 | 11,313 | 11,985.40 | 6649 | 7158.20 | 68 | 68 | t.o. | t.o.
nrh.5 | 55 | 10,618 | 11,222.19 | 69 | 69.26 | 10,486 | 11,328.03 | 6233 | 6871.97 | 61 | 61 | t.o. | t.o.
AVG | 67.10 | 5255.20 | 5610.33 | 84.10 | 85.67 | 5215.45 | 5633.99 | 3056.45 | 3403.96 | 75.75 | 75.80 | t.o. | t.o.
Table A8. CSA/SCP—VShape 4—Two-steps vs. KMeans vs. DBscan (t.o = time out).
Instance | BKS | V4+Stand. (MIN, AVG) | V4+Comp. (MIN, AVG) | V4+Static (MIN, AVG) | V4+Elitist (MIN, AVG) | KMeans (MIN, AVG) | DBscan (MIN, AVG)
4.1 | 429 | 605 | 664.50 | 453 | 459.93 | 588 | 636.33 | 514 | 535.30 | 429 | 430 | 429 | 430.38
4.2 | 512 | 903 | 1026.30 | 571 | 581.20 | 873 | 956.63 | 672 | 743.70 | 513 | 515.84 | 512 | 514.77
4.3 | 516 | 952 | 1078.60 | 552 | 558.87 | 802 | 879.87 | 655 | 748.21 | 516 | 520.61 | 516 | 518.87
4.4 | 494 | 820 | 908.53 | 530 | 546.47 | 801 | 858.87 | 657 | 702.67 | 495 | 498.87 | 494 | 496.10
4.5 | 512 | 870 | 1024.23 | 553 | 564.67 | 892 | 970.57 | 676 | 757.03 | 514 | 514.90 | 512 | 513.23
4.6 | 560 | 1087 | 1236.17 | 613 | 621.43 | 1048 | 1163 | 786 | 882.90 | 560 | 562.35 | 560 | 560.19
4.7 | 430 | 728 | 808.67 | 485 | 490.90 | 663 | 781.77 | 547 | 629.87 | 430 | 432.26 | 430 | 430.81
4.8 | 492 | 960 | 1045.10 | 525 | 536.23 | 884 | 986.07 | 706 | 761.53 | 493 | 496.13 | 492 | 494.10
4.9 | 641 | 1229 | 1415.40 | 726 | 737.37 | 1132 | 1305.30 | 909 | 995.10 | 645 | 657.90 | 641 | 647.50
4.10 | 514 | 905 | 1011.53 | 552 | 566.40 | 859 | 943.57 | 711 | 757.73 | 516 | 519.9 | 514 | 515.81
5.1 | 253 | 474 | 524.30 | 288 | 292.70 | 415 | 486.23 | 335 | 380.70 | 253 | 255.74 | 253 | 255.52
5.2 | 302 | 642 | 755.50 | 350 | 358.47 | 654 | 712.43 | 468 | 523.30 | 308 | 309.81 | 302 | 306.68
5.3 | 226 | 399 | 472 | 246 | 249.97 | 411 | 446.87 | 314 | 344.33 | 228 | 228.68 | 226 | 227.32
5.4 | 242 | 393 | 490.40 | 258 | 262.10 | 389 | 419.68 | 355 | 398.21 | 242 | 242.23 | 242 | 243.29
5.5 | 211 | 354 | 386.63 | 239 | 243.93 | 401 | 420.12 | 364 | 378.21 | 211 | 212.03 | 211 | 213.90
5.6 | 213 | 402 | 458.30 | 242 | 251.33 | 399 | 429.60 | 314 | 341.67 | 213 | 215.10 | 213 | 213.04
5.7 | 293 | 534 | 613.47 | 326 | 330.13 | 506 | 566.87 | 414 | 447.57 | 293 | 297.68 | 293 | 295.35
5.8 | 288 | 521 | 622.43 | 335 | 342.77 | 533 | 588.17 | 429 | 461.73 | 288 | 290.35 | 288 | 290.39
5.9 | 279 | 556 | 619.40 | 315 | 324.60 | 555 | 578.98 | 422 | 463.14 | 279 | 280.52 | 279 | 279.90
5.10 | 265 | 512 | 568.17 | 291 | 296.20 | 479 | 531.17 | 368 | 406.97 | 267 | 268.77 | 265 | 267.97
6.1 | 138 | 555 | 636.10 | 156 | 162.13 | 465 | 566.53 | 251 | 358.43 | 140 | 142.90 | 140 | 143.42
6.2 | 146 | 771 | 920.67 | 161 | 165 | 683 | 835.03 | 400 | 488.83 | 146 | 148.87 | 146 | 149.71
6.3 | 154 | 710 | 862.73 | 160 | 164.97 | 700 | 815.03 | 379 | 471.63 | 147 | 148.16 | 145 | 149.65
6.4 | 131 | 447 | 571.50 | 138 | 140.67 | 647 | 852.38 | 477 | 542.45 | 131 | 131.68 | 131 | 133.03
6.5 | 161 | 802 | 907.20 | 185 | 190.20 | 645 | 823.30 | 417 | 502.50 | 163 | 164.68 | 162 | 165.39
AVG | 336.08 | 685.24 | 776.24 | 370 | 394.93 | 656.96 | 763.27 | 501.60 | 573.14 | 336.80 | 430 | 335.84 | 338.25
a.1 | 253 | 861 | 1067.30 | 268 | 271.07 | 830 | 958.87 | 539 | 622.40 | 254 | 255.55 | 254 | 256
a.2 | 252 | 843 | 971 | 299 | 303.40 | 829 | 902.43 | 507 | 568.20 | 257 | 259.48 | 256 | 260.65
a.3 | 232 | 833 | 924.73 | 253 | 256.90 | 761 | 860.47 | 482 | 546.47 | 235 | 237.10 | 233 | 237.35
a.4 | 234 | 807 | 917.10 | 264 | 267.53 | 732 | 850.13 | 499 | 553.30 | 235 | 237.39 | 236 | 239.97
a.5 | 236 | 790 | 917.73 | 259 | 262.80 | 751 | 846.83 | 455 | 550.77 | 236 | 237.06 | 236 | 237.52
b.1 | 69 | 962 | 1099.53 | 83 | 84.73 | 896 | 1002.83 | 440 | 556.20 | 74 | 79.61 | 69 | 76.48
b.2 | 76 | 977 | 1078.37 | 92 | 93.87 | 912 | 1001.17 | 452 | 545.80 | 83 | 90.03 | 81 | 86.81
b.3 | 80 | 1217 | 1404 | 90 | 91.63 | 901 | 1000.25 | 547 | 682.44 | 84 | 88.55 | 82 | 88.81
b.4 | 79 | 1159 | 1273.37 | 94 | 98.63 | 992 | 1124.93 | 559 | 632.40 | 84 | 87.90 | 83 | 88.16
b.5 | 72 | 974 | 1119.90 | 82 | 84.23 | 999 | 1212.21 | 547 | 636.96 | 72 | 77.68 | 78 | 76.03
c.1 | 227 | 1143 | 1229 | 263 | 264.57 | 1011 | 1134.37 | 598 | 689.30 | 228 | 230.29 | 233 | 238.77
c.2 | 219 | 1221 | 1356.13 | 258 | 261.90 | 1176 | 1268.40 | 635 | 754.17 | 226 | 222.71 | 226 | 232.74
c.3 | 243 | 1317 | 1619.40 | 294 | 299.77 | 1343 | 1476.83 | 704 | 862.70 | 254 | 256.68 | 253 | 264
c.4 | 219 | 1186 | 1342.43 | 257 | 259.03 | 1084 | 1209.93 | 659 | 735.50 | 225 | 227.97 | 224 | 233.10
c.5 | 215 | 1196 | 1292.50 | 245 | 248.77 | 1024 | 1164.20 | 607 | 697.50 | 215 | 216.35 | 222 | 230.42
d.1 | 60 | 1455 | 1584.57 | 71 | 71 | 1306 | 1441.97 | 651 | 776.13 | 66 | 66 | 68 | 79.65
d.2 | 66 | 1657 | 1785.97 | 76 | 78.23 | 1444 | 1626.53 | 753 | 894.77 | 71 | 71 | 73 | 89.71
d.3 | 72 | 1791 | 1922.27 | 83 | 86.60 | 1574 | 1787.67 | 801 | 980.60 | 82 | 82 | 82 | 100.84
d.4 | 66 | 1456 | 1590.87 | 70 | 71.83 | 1307 | 1449.17 | 663 | 793.40 | 67 | 67 | 70 | 81.74
d.5 | 61 | 1459 | 1596.77 | 69 | 69.87 | 1296 | 1463.17 | 682 | 801.03 | 66 | 66 | 72 | 81.29
AVG | 151.55 | 1165.20 | 1347.83 | 173.50 | 176.32 | 1189.12 | 1198.33 | 589 | 697 | 155.70 | 157.82 | 156.55 | 164
nre.1 | 29 | 2025 | 2226.93 | 33 | 33.33 | 2168 | 2541.84 | 1147 | 1665.25 | 30 | 30 | 70 | 80.03
nre.2 | 30 | 2472 | 2763.80 | 37 | 38.10 | 2262 | 2471.93 | 1145 | 1360.57 | 34 | 34 | 83 | 128.32
nre.3 | 27 | 2140 | 2323.47 | 34 | 35.33 | 1951 | 2141.83 | 1025 | 1160.67 | 34 | 34 | 74 | 112.35
nre.4 | 28 | 2381 | 2647.10 | 34 | 34.60 | 2194 | 2398.90 | 1142 | 1344.43 | 33 | 34 | 83 | 177.45
nre.5 | 28 | 2465 | 2776.80 | 34 | 34.73 | 2231 | 2516.80 | 1188 | 1413.73 | 30 | 30 | 92 | 150.52
nrf.1 | 14 | 1467 | 1572.30 | 16 | 17.17 | 1264 | 1411.83 | 683 | 769.90 | 17 | 17 | 372 | 444.19
nrf.2 | 15 | 1218 | 1433.20 | 17 | 17.97 | 1123 | 1323.43 | 625 | 712.90 | 18 | 18 | 74 | 404.58
nrf.3 | 14 | 1591 | 1739.23 | 19 | 20.13 | 1463 | 1572.83 | 729 | 872.40 | 19 | 19 | 335 | 487.25
nrf.4 | 14 | 1446 | 1550.40 | 18 | 18.83 | 1287 | 1419.83 | 641 | 781.23 | 18 | 18 | 52 | 420.52
nrf.5 | 13 | 1169 | 1363.90 | 16 | 16.47 | 1125 | 1244.10 | 569 | 677.17 | 16 | 16 | 65 | 371.03
nrg.1 | 176 | 6422 | 7063.57 | 245 | 255.80 | 5962 | 6396.57 | 3202 | 3556.80 | 197 | 197 | 287 | 385.13
nrg.2 | 154 | 5598 | 5951.20 | 194 | 207.33 | 4949 | 5405.80 | 2708 | 3021.37 | 168 | 168 | 208 | 284.29
nrg.3 | 166 | 5967 | 6446.33 | 206 | 217.57 | 5561 | 5965.03 | 2886 | 3267.30 | 183 | 183 | 211 | 347.23
nrg.4 | 168 | 6148 | 6522.20 | 231 | 240.77 | 5393 | 5925.83 | 3007 | 3284.60 | 186 | 186 | 250 | 345.77
nrg.5 | 168 | 6460 | 6827.30 | 229 | 242.33 | 5786 | 6213.17 | 2984 | 3381.37 | 186 | 186 | 230 | 353.97
nrh.1 | 63 | 11,451 | 12,041.17 | 120 | 157.23 | 10,475 | 11,056.33 | 5678 | 6236.73 | 77 | 77 | t.o. | t.o.
nrh.2 | 63 | 11,284 | 12,032.90 | 119 | 162.70 | 10,574 | 11,033.20 | 5389 | 6096.30 | 71 | 71 | t.o. | t.o.
nrh.3 | 59 | 11,281 | 11,932.43 | 117 | 151.30 | 10,301 | 10,855.47 | 5542 | 6041.67 | 69 | 69 | t.o. | t.o.
nrh.4 | 58 | 11,166 | 11,880.47 | 129 | 152.83 | 10,441 | 10,867.60 | 5682 | 6047.33 | 68 | 68 | t.o. | t.o.
nrh.5 | 55 | 10,676 | 11,213.71 | 110 | 137.35 | 9538 | 10,217.13 | 4999 | 5777.45 | 61 | 61 | t.o. | t.o.
AVG | 67.10 | 5241.35 | 5615.42 | 97.90 | 109.59 | 4802.40 | 5148.97 | 2548.55 | 2937.04 | 77.93 | 75.80 | t.o. | t.o.

Appendix A.1.2. SCP—Two-Steps

Table A9. Two-steps results set.
Instance | Step 1 | Step 2 | BKS | MIN | MAX | AVG | RPD | TIME
4.1 | S-Shape 4 | Elitist | 429 | 429 | 433 | 438 | 0 | 22
4.2 | S-Shape 4 | Elitist | 512 | 512 | 564 | 515.74 | 0 | 25
4.3 | S-Shape 4 | Standard | 516 | 516 | 548 | 530.25 | 0 | 5.0
4.4 | S-Shape 4 | Elitist | 494 | 494 | 512 | 500.10 | 0 | 23
4.5 | S-Shape 4 | Elitist | 512 | 512 | 522 | 514.61 | 0 | 24
4.6 | S-Shape 4 | Elitist | 560 | 560 | 577 | 563.61 | 0 | 24
4.7 | S-Shape 4 | Elitist | 430 | 430 | 444 | 432.26 | 0 | 21
4.8 | S-Shape 4 | Elitist | 492 | 492 | 499 | 494.19 | 0 | 24
4.9 | S-Shape 2 | Standard | 641 | 641 | 664 | 652.16 | 0 | 95
4.10 | S-Shape 4 | Elitist | 514 | 514 | 525 | 517.19 | 0 | 22
5.1 | S-Shape 4 | Elitist | 253 | 253 | 262 | 256.74 | 0 | 24
5.2 | S-Shape 4 | Elitist | 302 | 302 | 324 | 308.39 | 0 | 27
5.3 | S-Shape 4 | Elitist | 226 | 226 | 231 | 227.32 | 0 | 24
5.4 | S-Shape 1 | Standard | 242 | 242 | 258 | 249.43 | 0 | 2.9
5.5 | S-Shape 1 | Standard | 211 | 211 | 239 | 223.13 | 0 | 2.5
5.6 | S-Shape 4 | Elitist | 213 | 213 | 223 | 213.90 | 0 | 23
5.7 | S-Shape 4 | Elitist | 293 | 293 | 308 | 295.03 | 0 | 24
5.8 | S-Shape 4 | Elitist | 288 | 288 | 295 | 289.90 | 0 | 26
5.9 | S-Shape 4 | Standard | 279 | 279 | 328 | 301.24 | 0 | 3.2
5.10 | S-Shape 4 | Elitist | 265 | 265 | 271 | 267.87 | 0 | 24
6.1 | S-Shape 4 | Elitist | 138 | 138 | 148 | 143.13 | 0 | 22
6.2 | S-Shape 4 | Elitist | 146 | 146 | 164 | 150.61 | 0 | 20
6.3 | S-Shape 4 | Elitist | 145 | 145 | 158 | 150.32 | 0 | 23
6.4 | S-Shape 1 | Standard | 131 | 131 | 144 | 138.34 | 0 | 2.4
6.5 | S-Shape 4 | Elitist | 161 | 161 | 175 | 167 | 0 | 23
a.1 | S-Shape 4 | Standard | 253 | 253 | 264 | 257.65 | 0 | 180
a.2 | S-Shape 4 | Elitist | 252 | 252 | 273 | 260.13 | 0 | 53
a.3 | S-Shape 4 | Elitist | 232 | 232 | 246 | 237.1 | 0 | 63
a.4 | S-Shape 2 | Standard | 234 | 234 | 249 | 240.06 | 0 | 170
a.5 | S-Shape 4 | Elitist | 236 | 236 | 242 | 273.74 | 0 | 55
b.1 | S-Shape 4 | Elitist | 69 | 69 | 77 | 72.97 | 0 | 49
b.2 | S-Shape 2 | Standard | 76 | 76 | 89 | 81.45 | 0 | 120
b.3 | S-Shape 1 | Standard | 80 | 80 | 98 | 86.99 | 0 | 8.9
b.4 | S-Shape 4 | Elitist | 79 | 79 | 87 | 82.55 | 0 | 57
b.5 | S-Shape 1 | Standard | 72 | 72 | 87 | 78.92 | 0 | 5.8
c.1 | S-Shape 4 | Elitist | 227 | 227 | 241 | 232.77 | 0 | 96
c.2 | S-Shape 2 | Standard | 219 | 219 | 231 | 225.10 | 0 | 330
c.3 | S-Shape 4 | Elitist | 243 | 243 | 265 | 252.81 | 0 | 110
c.4 | S-Shape 4 | Elitist | 219 | 219 | 237 | 225.74 | 0 | 110
c.5 | V-Shape 3 | Elitist | 215 | 215 | 218 | 216.55 | 0 | 130
d.1 | S-Shape 4 | Elitist | 60 | 60 | 68 | 63 | 0 | 88
d.2 | S-Shape 4 | Elitist | 66 | 66 | 71 | 68.39 | 0 | 100
d.3 | S-Shape 4 | Elitist | 72 | 72 | 83 | 75.52 | 0 | 120
d.4 | S-Shape 4 | Elitist | 66 | 66 | 72 | 65.58 | 0 | 94
d.5 | S-Shape 4 | Elitist | 61 | 61 | 76 | 63.84 | 0 | 93
nre.1 | S-Shape 2 | Standard | 29 | 29 | 41 | 33.14 | 0 | 10.0
nre.2 | S-Shape 4 | Elitist | 30 | 31 | 39 | 33.55 | 0.033 | 160
nre.3 | S-Shape 4 | Elitist | 27 | 27 | 33 | 29.77 | 0 | 220
nre.4 | S-Shape 4 | Elitist | 28 | 28 | 35 | 30.81 | 0 | 150
nre.5 | S-Shape 4 | Elitist | 28 | 28 | 33 | 29.74 | 0 | 200
nrf.1 | S-Shape 4 | Elitist | 14 | 14 | 18 | 15.62 | 0 | 210
nrf.2 | S-Shape 4 | Elitist | 15 | 15 | 18 | 16.19 | 0 | 100
nrf.3 | S-Shape 4 | Elitist | 14 | 15 | 19 | 16.61 | 0.071 | 180
nrf.4 | S-Shape 4 | Elitist | 14 | 15 | 17 | 15.71 | 0.071 | 91
nrf.5 | S-Shape 4 | Elitist | 13 | 14 | 16 | 15 | 0.077 | 160
nrg.1 | S-Shape 4 | Elitist | 176 | 179 | 206 | 189.55 | 0.017 | 1100
nrg.2 | S-Shape 4 | Elitist | 154 | 160 | 187 | 166.23 | 0.039 | 1000
nrg.3 | S-Shape 4 | Elitist | 166 | 168 | 190 | 177.65 | 0.012 | 1200
nrg.4 | S-Shape 4 | Elitist | 168 | 175 | 189 | 180.74 | 0.042 | 1000
nrg.5 | S-Shape 4 | Elitist | 168 | 174 | 196 | 181.13 | 0.036 | 1000
nrh.1 | S-Shape 4 | Elitist | 63 | 71 | 83 | 74.19 | 0.127 | 560
nrh.2 | S-Shape 4 | Elitist | 63 | 67 | 86 | 73.48 | 0.063 | 1100
nrh.3 | S-Shape 4 | Elitist | 59 | 64 | 80 | 69.03 | 0.085 | 680
nrh.4 | S-Shape 4 | Elitist | 58 | 62 | 75 | 66.71 | 0.069 | 950
nrh.5 | S-Shape 4 | Elitist | 55 | 58 | 70 | 62.45 | 0.055 | 850
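For readers replicating Table A9: each Step 1/Step 2 pair names a two-step binarization scheme, in which a transfer function first maps every real-valued component of a solution into [0, 1] and a discretization rule then turns that probability into a bit. The sketch below is illustrative only; it assumes the common sigmoid form of the S-Shape 1 transfer function and the standard discretization rule (the function names are ours, and the exact transfer-function variants are the ones defined in the main text).

```python
import math
import random

def s_shape1(x):
    # Sigmoid (S-Shape 1) transfer function: maps a real value to (0, 1).
    return 1.0 / (1.0 + math.exp(-x))

def standard_rule(prob, rng=random):
    # Standard discretization rule: set the bit to 1 with probability `prob`.
    return 1 if rng.random() < prob else 0

def binarize(solution):
    # Two-step binarization of a real-valued solution vector.
    return [standard_rule(s_shape1(x)) for x in solution]

random.seed(0)
print(binarize([2.0, -2.0, 0.5]))  # → [1, 0, 1]
```

Swapping in a V-shaped transfer function or the complement/static/elitist rules only changes the two small functions, which is why the paper can compare 32 transfer-function/rule combinations against the clustering-based binarizations.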

Appendix A.1.3. SCP—KMeans and DBscan

Table A10. KMeans results set.
Ins. | BKS | MIN | MAX | AVG | RPD | TIME
4.1 | 429 | 429 | 430 | 430 | 0 | 47
4.2 | 512 | 513 | 522 | 515.84 | 0.002 | 30
4.3 | 516 | 516 | 526 | 520.61 | 0 | 32
4.4 | 494 | 495 | 505 | 498.87 | 0.002 | 30
4.5 | 512 | 514 | 522 | 514.90 | 0.003 | 30
4.6 | 560 | 560 | 565 | 562.35 | 0 | 33
4.7 | 430 | 430 | 434 | 432.26 | 0 | 27
4.8 | 492 | 493 | 499 | 496.13 | 0.002 | 30
4.9 | 641 | 645 | 666 | 657.90 | 0.006 | 33
4.10 | 514 | 516 | 526 | 519.9 | 0.003 | 28
5.1 | 253 | 253 | 259 | 255.74 | 0 | 30
5.2 | 302 | 308 | 312 | 309.81 | 0.019 | 36
5.3 | 226 | 228 | 230 | 228.68 | 0.008 | 29
5.4 | 242 | 242 | 244 | 242.23 | 0 | 31
5.5 | 211 | 211 | 214 | 212.03 | 0 | 26
5.6 | 213 | 213 | 219 | 215.10 | 0 | 28
5.7 | 293 | 293 | 301 | 297.68 | 0 | 30
5.8 | 288 | 288 | 295 | 290.35 | 0 | 31
5.9 | 279 | 279 | 281 | 280.52 | 0 | 31
5.10 | 265 | 267 | 271 | 268.77 | 0.007 | 31
6.1 | 138 | 140 | 145 | 142.90 | 0.014 | 14
6.2 | 146 | 146 | 151 | 148.87 | 0 | 14
6.3 | 145 | 147 | 151 | 148.16 | 0.013 | 15
6.4 | 131 | 131 | 135 | 131.68 | 0 | 18
6.5 | 161 | 163 | 168 | 164.68 | 0.012 | 29
a.1 | 253 | 254 | 257 | 255.55 | 0.004 | 71
a.2 | 252 | 257 | 262 | 259.48 | 0.019 | 68
a.3 | 232 | 235 | 240 | 237.10 | 0.012 | 71
a.4 | 234 | 235 | 244 | 237.39 | 0.004 | 66
a.5 | 236 | 236 | 238 | 237.06 | 0 | 71
b.1 | 69 | 74 | 80 | 79.61 | 0.072 | 13
b.2 | 76 | 83 | 93 | 90.03 | 0.092 | 14
b.3 | 80 | 84 | 89 | 88.55 | 0.050 | 14
b.4 | 79 | 84 | 89 | 87.90 | 0.063 | 15
b.5 | 72 | 72 | 79 | 77.68 | 0 | 13
c.1 | 227 | 228 | 233 | 230.29 | 0.004 | 130
c.2 | 219 | 220 | 226 | 222.71 | 0.004 | 94
c.3 | 243 | 254 | 260 | 256.68 | 0.045 | 110
c.4 | 219 | 225 | 232 | 227.97 | 0.027 | 90
c.5 | 215 | 215 | 218 | 216.35 | 0 | 140
d.1 | 60 | 66 | 66 | 66 | 0.100 | 24
d.2 | 66 | 71 | 71 | 71 | 0.075 | 26
d.3 | 72 | 82 | 82 | 82 | 0.138 | 27
d.4 | 62 | 67 | 67 | 67 | 0.080 | 24
d.5 | 61 | 66 | 66 | 66 | 0.082 | 24
nre.1 | 29 | 30 | 30 | 30 | 0.034 | 41
nre.2 | 30 | 34 | 34 | 34 | 0.133 | 41
nre.3 | 27 | 34 | 34 | 34 | 0.259 | 38
nre.4 | 28 | 33 | 33 | 33 | 0.179 | 47
nre.5 | 28 | 30 | 30 | 30 | 0.071 | 43
nrf.1 | 14 | 17 | 17 | 17 | 0.214 | 43
nrf.2 | 15 | 18 | 18 | 18 | 0.200 | 40
nrf.3 | 14 | 19 | 19 | 19 | 0.357 | 49
nrf.4 | 14 | 18 | 18 | 18 | 0.286 | 37
nrf.5 | 13 | 16 | 16 | 16 | 0.231 | 37
nrg.1 | 176 | 197 | 197 | 197 | 0.119 | 170
nrg.2 | 154 | 168 | 168 | 168 | 0.091 | 150
nrg.3 | 166 | 183 | 183 | 183 | 0.102 | 160
nrg.4 | 168 | 186 | 186 | 186 | 0.107 | 160
nrg.5 | 168 | 186 | 186 | 186 | 0.107 | 170
nrh.1 | 63 | 77 | 77 | 77 | 0.222 | 340
nrh.2 | 63 | 71 | 71 | 71 | 0.127 | 330
nrh.3 | 59 | 69 | 69 | 69 | 0.169 | 330
nrh.4 | 58 | 68 | 68 | 68 | 0.172 | 330
nrh.5 | 55 | 61 | 61 | 61 | 0.109 | 320
Table A11. DBscan results set (t.o = time out).
Ins. | BKS | MIN | MAX | AVG | RPD | TIME
4.1 | 429 | 429 | 432 | 430.38 | 0 | 3700
4.2 | 512 | 512 | 526 | 514.77 | 0 | 2400
4.3 | 516 | 516 | 533 | 518.87 | 0 | 2600
4.4 | 494 | 494 | 500 | 496.10 | 0 | 2400
4.5 | 512 | 512 | 514 | 513.23 | 0 | 2300
4.6 | 560 | 560 | 564 | 560.19 | 0 | 2500
4.7 | 430 | 430 | 435 | 430.81 | 0 | 7300
4.8 | 492 | 492 | 503 | 494.10 | 0 | 3900
4.9 | 641 | 641 | 658 | 647.50 | 0 | 4600
4.10 | 514 | 514 | 518 | 515.81 | 0 | 3900
5.1 | 253 | 253 | 258 | 255.52 | 0 | 4100
5.2 | 302 | 302 | 314 | 306.68 | 0 | 4500
5.3 | 226 | 226 | 229 | 227.32 | 0 | 3600
5.4 | 242 | 242 | 245 | 243.29 | 0 | 4100
5.5 | 211 | 211 | 218 | 213.90 | 0 | 3400
5.6 | 213 | 213 | 214 | 213.04 | 0 | 3700
5.7 | 293 | 293 | 300 | 295.35 | 0 | 4000
5.8 | 288 | 288 | 300 | 290.39 | 0 | 4300
5.9 | 279 | 279 | 280 | 279.90 | 0 | 3800
5.10 | 265 | 265 | 271 | 267.97 | 0 | 4300
6.1 | 138 | 140 | 147 | 143.42 | 0.014 | 2500
6.2 | 146 | 146 | 155 | 149.71 | 0 | 2900
6.3 | 145 | 145 | 154 | 149.65 | 0 | 2700
6.4 | 131 | 131 | 135 | 133.03 | 0 | 2400
6.5 | 161 | 162 | 169 | 165.39 | 0.006 | 2900
a.1 | 253 | 254 | 259 | 256 | 0.004 | 17,000
a.2 | 252 | 256 | 265 | 260.65 | 0.015 | 18,000
a.3 | 232 | 233 | 245 | 237.35 | 0.004 | 17,000
a.4 | 234 | 236 | 245 | 239.97 | 0.008 | 16,000
a.5 | 236 | 236 | 240 | 237.52 | 0 | 17,000
b.1 | 69 | 69 | 85 | 76.48 | 0 | 15,000
b.2 | 76 | 81 | 94 | 86.81 | 0.065 | 16,000
b.3 | 80 | 82 | 99 | 88.81 | 0.025 | 20,000
b.4 | 79 | 83 | 99 | 88.16 | 0.050 | 24,000
b.5 | 72 | 78 | 77 | 76.03 | 0.083 | 16,000
c.1 | 227 | 233 | 247 | 238.77 | 0.026 | 27,000
c.2 | 219 | 226 | 244 | 232.74 | 0.032 | 28,000
c.3 | 243 | 253 | 279 | 264 | 0.041 | 28,000
c.4 | 219 | 224 | 244 | 233.10 | 0.022 | 26,000
c.5 | 215 | 222 | 241 | 230.42 | 0.032 | 26,000
d.1 | 60 | 68 | 90 | 79.65 | 0.133 | 130,000
d.2 | 66 | 73 | 105 | 89.71 | 0.106 | 150,000
d.3 | 72 | 82 | 125 | 100.84 | 0.138 | 160,000
d.4 | 62 | 70 | 96 | 81.74 | 0.129 | 130,000
d.5 | 61 | 72 | 91 | 81.29 | 0.180 | 120,000
nre.1 | 29 | 70 | 103 | 80.03 | 1.413 | 260,000
nre.2 | 30 | 83 | 205 | 128.32 | 1.766 | 420,000
nre.3 | 27 | 74 | 152 | 112.35 | 1.740 | 350,000
nre.4 | 28 | 83 | 799 | 177.45 | 1.964 | 390,000
nre.5 | 28 | 92 | 813 | 150.52 | 2.285 | 490,000
nrf.1 | 14 | 372 | 508 | 444.19 | 25.571 | 250,000
nrf.2 | 15 | 74 | 468 | 404.58 | 3.933 | 210,000
nrf.3 | 14 | 335 | 551 | 487.25 | 22.928 | 240,000
nrf.4 | 14 | 52 | 493 | 420.52 | 2.714 | 190,000
nrf.5 | 13 | 65 | 416 | 371.03 | 4 | 170,000
nrg.1 | 176 | 287 | 436 | 385.13 | 0.630 | 230,000
nrg.2 | 154 | 208 | 327 | 284.29 | 0.350 | 220,000
nrg.3 | 166 | 211 | 417 | 347.23 | 0.271 | 240,000
nrg.4 | 168 | 250 | 402 | 345.77 | 0.488 | 220,000
nrg.5 | 168 | 230 | 398 | 353.97 | 0.369 | 240,000
nrh.1 | 63 | t.o. | t.o. | t.o. | t.o. | t.o.
nrh.2 | 63 | t.o. | t.o. | t.o. | t.o. | t.o.
nrh.3 | 59 | t.o. | t.o. | t.o. | t.o. | t.o.
nrh.4 | 58 | t.o. | t.o. | t.o. | t.o. | t.o.
nrh.5 | 55 | t.o. | t.o. | t.o. | t.o. | t.o.

Appendix A.2. 0/1 Knapsack Problem Results

Appendix A.2.1. KP—Crow Search Algorithm

Table A12. CSA/KP—SShape 1—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | S1+Stand. (MAX, AVG) | S1+Comp. (MAX, AVG) | S1+Static (MAX, AVG) | S1+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 287 | 276.12 | 279 | 267.21 | 290 | 287.32 | 256 | 222.81 | 290 | 286.32 | 294 | 292
f2 | 1024 | 789 | 771.23 | 729 | 652.87 | 782.01 | 719.08 | 756 | 701.33 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 34 | 34 | 35 | 35 | 34 | 34 | 35 | 35 | 35 | 35 | 35 | 35
f4 | 23 | 23 | 23 | 20 | 20 | 22 | 21.50 | 23 | 22.50 | 23 | 23 | 23 | 23
f5 | 481.06 | 327.15 | 310.14 | 300.47 | 276.34 | 312.23 | 309.93 | 299.76 | 280.80 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 50 | 50 | 49 | 48.45 | 51 | 50.50 | 43 | 43 | 51 | 50.23 | 52 | 52
f7 | 107 | 100 | 94.23 | 90 | 85.32 | 99 | 97.43 | 105 | 90.32 | 107 | 107 | 107 | 107
f8 | 9767 | 8743 | 7843.32 | 8943 | 7332.05 | 9321 | 8883.45 | 6983 | 5574.45 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 129 | 129 | 130 | 130 | 128 | 127.50 | 129 | 128.50 | 130 | 130 | 130 | 130
f10 | 1025 | 973 | 965.04 | 789 | 734.55 | 723 | 712.66 | 832 | 799.12 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1145.52 | 1049.61 | 1136.45 | 958.18 | 1176.22 | 1124.34 | 946.18 | 789.78 | 825.81 | 832.05 | 813.88 | 789.28
Table A13. CSA/KP—SShape 2—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | S2+Stand. (MAX, AVG) | S2+Comp. (MAX, AVG) | S2+Static (MAX, AVG) | S2+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 234 | 222.03 | 189 | 176.29 | 162 | 150.32 | 178 | 168.20 | 290 | 286.32 | 294 | 292
f2 | 1024 | 900 | 850 | 732 | 699.81 | 823 | 800.10 | 809 | 771.25 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 33 | 31.75 | 19 | 17.33 | 23 | 20.67 | 24 | 22.24 | 35 | 35 | 35 | 35
f4 | 23 | 21 | 19.34 | 18 | 13.38 | 18 | 16.25 | 17 | 16.50 | 23 | 23 | 23 | 23
f5 | 481.06 | 223.32 | 201.10 | 199.97 | 189.71 | 254.09 | 250.12 | 301.01 | 276.87 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 51 | 49.23 | 50 | 47.66 | 50 | 49.50 | 42 | 40.01 | 51 | 50.23 | 52 | 52
f7 | 107 | 102 | 100.45 | 99 | 98.45 | 83 | 78.33 | 97 | 93.45 | 107 | 107 | 107 | 107
f8 | 9767 | 8231 | 7822.78 | 6432 | 5993.33 | 6982 | 6777.89 | 7014 | 6897.21 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 102 | 98.76 | 95 | 93.34 | 89 | 85.25 | 102 | 97.33 | 130 | 130 | 130 | 130
f10 | 1025 | 865 | 763.67 | 701 | 678.88 | 799 | 740.01 | 732 | 699.65 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1076.23 | 1015.91 | 853.49 | 800.82 | 928.31 | 896.84 | 931.60 | 908.27 | 825.81 | 832.05 | 813.88 | 789.28
Table A14. CSA/KP—SShape 3—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | S3+Stand. (MAX, AVG) | S3+Comp. (MAX, AVG) | S3+Static (MAX, AVG) | S3+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 289 | 245.33 | 282 | 254.33 | 276 | 232.76 | 281 | 276.34 | 290 | 286.32 | 294 | 292
f2 | 1024 | 991 | 937.35 | 934 | 900.45 | 876 | 835.25 | 766 | 750.66 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 35 | 33.45 | 33 | 33 | 34 | 34.50 | 35 | 35 | 35 | 35 | 35 | 35
f4 | 23 | 22 | 20.75 | 21 | 20.33 | 23 | 22.50 | 23 | 23 | 23 | 23 | 23 | 23
f5 | 481.06 | 423.05 | 402.23 | 333.45 | 330.45 | 299.22 | 272.55 | 317.65 | 309.25 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 49 | 48.23 | 50 | 47.33 | 47 | 44.35 | 51 | 48.66 | 51 | 50.23 | 52 | 52
f7 | 107 | 102 | 100.11 | 98 | 89.34 | 103 | 97.23 | 88 | 77.21 | 107 | 107 | 107 | 107
f8 | 9767 | 8653 | 8432.20 | 8933 | 8739.66 | 9001 | 8990.78 | 9112 | 9004.89 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 111 | 101.10 | 116 | 109.47 | 127 | 119.39 | 121 | 118.33 | 130 | 130 | 130 | 130
f10 | 1025 | 970 | 960.05 | 897 | 865.87 | 749 | 718.88 | 969 | 939.24 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1164.51 | 1128.06 | 1169.75 | 1139.02 | 1153.52 | 1136.82 | 1176.37 | 1158.26 | 825.81 | 832.05 | 813.88 | 789.28
Table A15. CSA/KP—SShape 4—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | S4+Stand. (MAX, AVG) | S4+Comp. (MAX, AVG) | S4+Static (MAX, AVG) | S4+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 276 | 256.22 | 273 | 242.31 | 222 | 218.23 | 250 | 245.66 | 290 | 286.32 | 294 | 292
f2 | 1024 | 990 | 934.75 | 943 | 913.13 | 893 | 833.90 | 881 | 843.29 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 29 | 28.48 | 27 | 22.98 | 30 | 28.45 | 29 | 26.70 | 35 | 35 | 35 | 35
f4 | 23 | 21 | 19.20 | 22 | 20.70 | 13 | 11.18 | 19 | 17.26 | 23 | 23 | 23 | 23
f5 | 481.06 | 413.31 | 400.23 | 388.69 | 377.09 | 410.54 | 399.78 | 354.76 | 326.87 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 42 | 38.48 | 49 | 45.09 | 51 | 50.17 | 52 | 52 | 51 | 50.23 | 52 | 52
f7 | 107 | 106 | 105.45 | 98 | 79 | 93 | 92.50 | 88 | 73.19 | 107 | 107 | 107 | 107
f8 | 9767 | 7823 | 7698.12 | 7901 | 7406.21 | 8432 | 8023.36 | 9301 | 9000.98 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 123 | 109.33 | 114 | 110.20 | 126 | 112.33 | 101 | 110.34 | 130 | 130 | 130 | 130
f10 | 1025 | 872 | 762.98 | 899 | 843.21 | 901 | 862.23 | 962 | 943.32 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1069.53 | 1035.32 | 1071.46 | 1005.99 | 1117.15 | 1063.21 | 1203.77 | 1163.96 | 825.81 | 832.05 | 813.88 | 789.28
Table A16. CSA/KP—VShape 1—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | V1+Stand. (MAX, AVG) | V1+Comp. (MAX, AVG) | V1+Static (MAX, AVG) | V1+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 222 | 210.12 | 212 | 199.33 | 203 | 193.45 | 208 | 193.66 | 290 | 286.32 | 294 | 292
f2 | 1024 | 777 | 732.23 | 789 | 745.10 | 798 | 755.33 | 801 | 790.65 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 35 | 33.33 | 32 | 29.67 | 31 | 29.22 | 34 | 33.50 | 35 | 35 | 35 | 35
f4 | 23 | 23 | 23 | 23 | 22.50 | 22 | 20.50 | 19 | 17.66 | 23 | 23 | 23 | 23
f5 | 481.06 | 399.98 | 369.09 | 382.67 | 342.65 | 376.33 | 323.66 | 381.23 | 379.48 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 49 | 48.23 | 50 | 49.32 | 51 | 50.50 | 47 | 46.33 | 51 | 50.23 | 52 | 52
f7 | 107 | 99 | 98.23 | 90 | 80.33 | 95 | 93.33 | 100 | 99.55 | 107 | 107 | 107 | 107
f8 | 9767 | 8010 | 7988.77 | 9343 | 9123.32 | 8990 | 8600.45 | 9734 | 9456.25 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 121 | 120.33 | 119 | 118.50 | 128 | 125.77 | 130 | 130 | 130 | 130 | 130 | 130
f10 | 1025 | 970 | 967.23 | 955 | 943.32 | 961 | 933.89 | 896 | 869.08 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1070.59 | 1059.06 | 1199.57 | 1165.40 | 1165.53 | 1112.61 | 1235.02 | 1201.62 | 825.81 | 832.05 | 813.88 | 789.28
Table A17. CSA/KP—VShape 2—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | V2+Stand. (MAX, AVG) | V2+Comp. (MAX, AVG) | V2+Static (MAX, AVG) | V2+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 295 | 294.14 | 294 | 293.41 | 229 | 219.47 | 286 | 278.88 | 290 | 286.32 | 294 | 292
f2 | 1024 | 990 | 985.66 | 987 | 866.62 | 798 | 774.44 | 888 | 8500.99 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 33 | 32.45 | 35 | 35 | 25 | 22.66 | 29 | 28.33 | 35 | 35 | 35 | 35
f4 | 23 | 23 | 23 | 21 | 20.66 | 23 | 23 | 20 | 18.05 | 23 | 23 | 23 | 23
f5 | 481.06 | 288.54 | 240.11 | 301.78 | 252.56 | 297.23 | 283.11 | 334.98 | 319.56 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 47 | 45.87 | 50 | 48.59 | 52 | 51.50 | 37 | 33.87 | 51 | 50.23 | 52 | 52
f7 | 107 | 87 | 86.23 | 100 | 100.76 | 99 | 87.78 | 96 | 96.06 | 107 | 107 | 107 | 107
f8 | 9767 | 6538 | 6234.90 | 6234 | 6002.76 | 5372 | 5008.66 | 500.98 | 64990.87 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 129 | 127.77 | 126 | 126.76 | 101 | 109.87 | 101 | 99.7651 | 130 | 130 | 130 | 130
f10 | 1025 | 972 | 962.01 | 811 | 800.17 | 901 | 900.12 | 914 | 907.98 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 940.25 | 903.22 | 895.97 | 854.73 | 789.72 | 748.06 | 320.69 | 1617.22 | 825.81 | 832.05 | 813.88 | 789.28
Table A18. CSA/KP—VShape 3—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | V3+Stand. (MAX, AVG) | V3+Comp. (MAX, AVG) | V3+Static (MAX, AVG) | V3+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 287 | 264.78 | 279 | 255.55 | 190 | 181.21 | 257 | 231.84 | 290 | 286.32 | 294 | 292
f2 | 1024 | 787 | 754.33 | 888 | 854.25 | 901 | 890.50 | 954 | 927.99 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 35 | 34.50 | 35 | 35 | 35 | 33.25 | 34 | 33.50 | 35 | 35 | 35 | 35
f4 | 23 | 23 | 23 | 23 | 20.33 | 20 | 17.45 | 23 | 24.25 | 23 | 23 | 23 | 23
f5 | 481.06 | 214.32 | 200.21 | 398.35 | 345.32 | 401.39 | 395.75 | 436.29 | 426.35 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 51 | 48.25 | 51 | 43.33 | 37 | 36.87 | 48 | 45.50 | 51 | 50.23 | 52 | 52
f7 | 107 | 107 | 100.55 | 107 | 99.10 | 106 | 103.98 | 104 | 100.66 | 107 | 107 | 107 | 107
f8 | 9767 | 9701 | 9688.40 | 8997 | 8787.23 | 9565 | 9333.88 | 9487 | 9221.91 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 130 | 126.67 | 129 | 129.33 | 129 | 128.50 | 130 | 130 | 130 | 130 | 130 | 130
f10 | 1025 | 971 | 965.10 | 970 | 969.22 | 971 | 966.32 | 899 | 890.78 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1230.63 | 1220.57 | 1187.74 | 1153.86 | 1235.54 | 1208.77 | 1237.23 | 1203.28 | 825.81 | 832.05 | 813.88 | 789.28
Table A19. CSA/KP—VShape 4—Two-steps vs. KMeans vs. DBscan.
Instance | BKS | V4+Stand. (MAX, AVG) | V4+Comp. (MAX, AVG) | V4+Static (MAX, AVG) | V4+Elitist (MAX, AVG) | KMeans (MAX, AVG) | DBscan (MAX, AVG)
f1 | 295 | 222 | 210.58 | 284 | 264.58 | 290 | 289.98 | 278 | 266.66 | 290 | 286.32 | 294 | 292
f2 | 1024 | 990 | 987.21 | 988 | 980.21 | 850 | 876.58 | 884 | 821.12 | 895 | 987.23 | 880 | 829.87
f3 | 35 | 35 | 35 | 35 | 34.50 | 34 | 33.25 | 35 | 34.58 | 35 | 35 | 35 | 35
f4 | 23 | 22 | 20.33 | 24 | 23.09 | 23 | 23.56 | 23 | 24.50 | 23 | 23 | 23 | 23
f5 | 481.06 | 410.25 | 408.84 | 378.99 | 368.29 | 413.39 | 410.14 | 399.36 | 387.41 | 377.07 | 354.65 | 436.82 | 412.19
f6 | 52 | 52 | 50.21 | 52 | 50.65 | 50 | 47.99 | 52 | 52 | 51 | 50.23 | 52 | 52
f7 | 107 | 107 | 106.50 | 107 | 105.82 | 100 | 98.47 | 107 | 107 | 107 | 107 | 107 | 107
f8 | 9767 | 7888 | 7784.57 | 8745 | 8654.25 | 8932 | 8774.77 | 6544 | 6533.63 | 5377 | 5377.54 | 5377 | 5262.35
f9 | 130 | 129 | 121.69 | 130 | 127.87 | 119 | 110.77 | 128 | 126.33 | 130 | 130 | 130 | 130
f10 | 1025 | 874 | 865.22 | 855 | 850.26 | 847 | 835.66 | 701 | 699.50 | 973 | 969.52 | 804 | 749.39
AVG | 1293.91 | 1072.92 | 1059.02 | 1159.89 | 1145.95 | 1165.83 | 1150.12 | 915.13 | 905.27 | 825.81 | 832.05 | 813.88 | 789.28

Appendix A.2.2. KP—Two-Steps

Table A20. KP—Two-steps results.
Instance | Step 1 | Step 2 | BKS | MIN | MAX | AVG | RPD | TIME
f1 | V-Shape 2 | Standard | 295 | 287 | 295 | 294.14 | 0 | 4.35
f2 | S-Shape 3 | Standard | 1024 | 821 | 991 | 937.35 | 0.032 | 4.35
f3 | S-Shape 3 | Elitist | 35 | 35 | 35 | 35 | 0 | 4.35
f4 | S-Shape 3 | Elitist | 23 | 23 | 23 | 23 | 0 | 0.2
f5 | V-Shape 3 | Elitist | 481.06 | 413.36 | 436.29 | 426.35 | 0.093 | 0.19
f6 | V-Shape 4 | Elitist | 52 | 52 | 52 | 52 | 0 | 0.07
f7 | V-Shape 4 | Elitist | 107 | 105 | 107 | 106 | 0 | 0.48
f8 | V-Shape 1 | Elitist | 9767 | 8754 | 9734 | 9456.25 | 0.003 | 2.6
f9 | V-Shape 1 | Elitist | 130 | 130 | 130 | 130 | 0 | 0.1
f10 | S-Shape 1 | Standard | 1025 | 954 | 973 | 965.04 | 0.051 | 1.2

Appendix A.2.3. KP—KMeans and DBscan

Table A21. KP—KMeans results.
Instance | BKS | MIN | MAX | AVG | RPD | TIME
f1 | 295 | 274 | 290 | 286.32 | 0.017 | 0.27
f2 | 1024 | 745 | 895 | 987.23 | 0.126 | 0.39
f3 | 35 | 35 | 35 | 35 | 0 | 0.25
f4 | 23 | 23 | 23 | 23 | 0 | 0.28
f5 | 481.06 | 347 | 377.07 | 354.65 | 0.216 | 0.32
f6 | 52 | 49 | 51 | 50.23 | 0.019 | 0.29
f7 | 107 | 107 | 107 | 107 | 0 | 0.27
f8 | 9767 | 4398 | 5377 | 5073.54 | 0.449 | 0.25
f9 | 130 | 130 | 130 | 130 | 0 | 0.26
f10 | 1025 | 953 | 973 | 969.52 | 0.051 | 0.40
Table A22. KP—DBscan results.
Instance | BKS | MIN | MAX | AVG | RPD | TIME
f1 | 295 | 290 | 294 | 292 | 0.003 | 580
f2 | 1024 | 778 | 880 | 829.87 | 0.141 | 170
f3 | 35 | 35 | 35 | 35 | 0 | 0.22
f4 | 23 | 23 | 23 | 23 | 0 | 0.23
f5 | 481.06 | 387.55 | 436.82 | 412.19 | 0.092 | 140
f6 | 52 | 52 | 52 | 52 | 0 | 0.97
f7 | 107 | 107 | 107 | 107 | 0 | 0.23
f8 | 9767 | 5147 | 5377 | 5262.35 | 0.449 | 0.23
f9 | 130 | 130 | 130 | 130 | 0 | 0.26
f10 | 1025 | 694 | 804 | 749.39 | 0.216 | 1.6
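The RPD columns in the tables above can be recomputed from the BKS and the best value found. A minimal sketch, assuming the usual definitions RPD = (MIN − BKS)/BKS for the minimization SCP instances and RPD = (BKS − MAX)/BKS for the maximization KP instances; these formulas are inferred from spot checks against the reported values, not stated in this appendix.

```python
def rpd(bks, best, maximization=False):
    """Relative percentage deviation of the best value found
    from the best known solution (BKS)."""
    if maximization:
        return (bks - best) / bks  # KP: how far MAX falls short of BKS
    return (best - bks) / bks      # SCP: how far MIN exceeds BKS

# SCP instance 4.2 (Table A10): BKS = 512, MIN = 513
print(round(rpd(512, 513), 3))                     # → 0.002
# KP instance f1 (Table A21): BKS = 295, MAX = 290
print(round(rpd(295, 290, maximization=True), 3))  # → 0.017
```

Under this convention an RPD of 0 means the best known solution was reached, which is consistent with the rows above where MIN (SCP) or MAX (KP) equals BKS.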

References

1. Valdivia, S.; Crawford, B.; Soto, R.; Lemus-Romani, J.; Astorga, G.; Misra, S.; Salas-Fernández, A.; Rubio, J.M. Bridges Reinforcement Through Conversion of Tied-Arch Using Crow Search Algorithm. In Computational Science and Its Applications—ICCSA 2019; Springer International Publishing: Berlin/Heidelberg, Germany, 2019; pp. 525–535.
2. Apostolopoulos, T.; Vlachos, A. Application of the Firefly Algorithm for Solving the Economic Emissions Load Dispatch Problem. Int. J. Comb. 2011, 2011, 523806.
3. Toregas, C.; Swain, R.; ReVelle, C.; Bergman, L. The Location of Emergency Service Facilities. Oper. Res. 1971, 19, 1363–1373.
4. Fu, C.; Cheng, S.; Yi, Y. Dynamic Control of Product Innovation, Advertising Effort, and Strategic Transfer-Pricing in a Marketing-Operations Interface. Math. Probl. Eng. 2019, 2019, 8418260.
5. Talukder, A.; Alam, M.G.R.; Tran, N.H.; Niyato, D.; Hong, C.S. Knapsack-Based Reverse Influence Maximization for Target Marketing in Social Networks. IEEE Access 2019, 7, 44182–44198.
6. De Haan, R.; Szeider, S. A Compendium of Parameterized Problems at Higher Levels of the Polynomial Hierarchy. Algorithms 2019, 12, 188.
7. Torres-Jiménez, J.; Pavón, J. Applications of metaheuristics in real-life problems. Prog. Artif. Intell. 2014, 2, 175–176.
8. Soto, R.; Crawford, B.; Olivares, R.; Galleguillos, C.; Castro, C.; Johnson, F.; Paredes, F.; Norero, E. Using autonomous search for solving constraint satisfaction problems via new modern approaches. Swarm Evol. Comput. 2016, 30, 64–77.
9. Tzanetos, A.; Fister, I.; Dounias, G. A comprehensive database of Nature-Inspired Algorithms. Data Brief 2020, 31, 105792.
10. Crawford, B.; Soto, R.; Astorga, G.; García, J.; Castro, C.; Paredes, F. Putting Continuous Metaheuristics to Work in Binary Search Spaces. Complexity 2017, 2017, 8404231.
11. Hernández, L.; Baladrón, C.; Aguiar, J.; Carro, B.; Sánchez-Esguevillas, A. Classification and Clustering of Electricity Demand Patterns in Industrial Parks. Energies 2012, 5, 5215–5228.
12. Li, X.; Song, K.; Wei, G.; Lu, R.; Zhu, C. A Novel Grouping Method for Lithium Iron Phosphate Batteries Based on a Fractional Joint Kalman Filter and a New Modified K-Means Clustering Algorithm. Energies 2015, 8, 7703–7728.
13. García, J.; Moraga, P.; Valenzuela, M.; Crawford, B.; Soto, R.; Pinto, H.; Peña, A.; Altimiras, F.; Astorga, G. A Db-Scan Binarization Algorithm Applied to Matrix Covering Problems. Comput. Intell. Neurosci. 2019, 2019, 1–16.
14. García, J.; Yepes, V.; Martí, J.V. A Hybrid k-Means Cuckoo Search Algorithm Applied to the Counterfort Retaining Walls Problem. Mathematics 2020, 8, 555.
15. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12.
16. Creswell, J. Research Design: Qualitative, Quantitative, and Mixed Methods Approaches; Sage: Thousand Oaks, CA, USA, 2013.
17. Talbi, E.G. Metaheuristics: From Design to Implementation; John Wiley & Sons: Hoboken, NJ, USA, 2009.
18. Caprara, A.; Fischetti, M.; Toth, P. A Heuristic Method for the Set Covering Problem. Oper. Res. 1999, 47, 730–743.
19. Zhou, Y.Q.; Chen, X.; Zhou, G. An Improved Monkey Algorithm for a 0-1 Knapsack Problem. Appl. Soft Comput. 2015, 38.
20. Kennedy, J.; Eberhart, R.C. A discrete binary version of the particle swarm algorithm. In Proceedings of the 1997 IEEE International Conference on Systems, Man, and Cybernetics, Computational Cybernetics and Simulation, Orlando, FL, USA, 12–15 October 1997; Volume 5, pp. 4104–4108.
21. Yang, Y.; Mao, Y.; Yang, P.; Jiang, Y. The unit commitment problem based on an improved firefly and particle swarm optimization hybrid algorithm. In Proceedings of the 2013 Chinese Automation Congress, Changsha, China, 7–8 November 2013; pp. 718–722.
22. Crawford, B.; Soto, R.; Olivares-Suarez, M.; Palma, W.; Paredes, F.; Olguin, E.; Norero, E. A Binary Coded Firefly Algorithm that Solves the Set Covering Problem. Rom. J. Inf. Sci. Technol. 2014, 17, 252–264.
23. Song, H.; Triguero, I.; Özcan, E. A review on the self and dual interactions between machine learning and optimisation. Prog. Artif. Intell. 2019, 8, 143–165.
24. Liang, Y.C.; Cuevas Juarez, J.R. A self-adaptive virus optimization algorithm for continuous optimization problems. Soft Comput. 2020.
25. Lobo, F.G.; Lima, C.F.; Michalewicz, Z. (Eds.) Parameter Setting in Evolutionary Algorithms; Springer: Berlin/Heidelberg, Germany, 2007.
26. Huang, C.; Li, Y.; Yao, X. A Survey of Automatic Parameter Tuning Methods for Metaheuristics. IEEE Trans. Evol. Comput. 2020, 24, 201–216.
27. Dobslaw, F. A Parameter-Tuning Framework For Metaheuristics Based on Design of Experiments and Artificial Neural Networks. In Proceedings of the International Conference on Computer Mathematics and Natural Computing, Rome, Italy, 28 April 2010.
28. Battiti, R.; Brunato, M. Reactive Search Optimization: Learning While Optimizing. In Handbook of Metaheuristics; Springer: Boston, MA, USA, 2010; pp. 543–571.
29. Ong, P.; Zainuddin, Z. Optimizing wavelet neural networks using modified cuckoo search for multi-step ahead chaotic time series prediction. Appl. Soft. Comput. 2019, 80, 374–386.
30. Regis, R.G. Evolutionary Programming for High-Dimensional Constrained Expensive Black-Box Optimization Using Radial Basis Functions. IEEE Trans. Evol. Comput. 2014, 18, 326–347.
31. Dalboni, F.; Drummond, L.M.A.; Ochi, L.S. On Improving Evolutionary Algorithms by Using Data Mining for the Oil Collector Vehicle Routing Problem. In Proceedings of the International Network Optimization Conference, Evry, France, 27–29 October 2003.
32. Senjyu, T.; Saber, A.; Miyagi, T.; Shimabukuro, K.; Urasaki, N.; Funabashi, T. Fast technique for unit commitment by genetic algorithm based on unit clustering. IEE Proc.-Gener. Transm. Distrib. 2005, 152, 705–713.
33. Lee, C.; Gen, M.; Kuo, W. Reliability optimization design using a hybridized genetic algorithm with a neural-network technique. IEICE Trans. Fundam. Electron. Commun. Comput. Sci. 2001, 84, 627–637.
34. Luan, F.; Cai, Z.; Wu, S.; Liu, S.Q.; He, Y. Optimizing the Low-Carbon Flexible Job Shop Scheduling Problem with Discrete Whale Optimization Algorithm. Mathematics 2019, 7, 688.
35. Bouktif, S.; Fiaz, A.; Ouni, A.; Serhani, M.A. Multi-Sequence LSTM-RNN Deep Learning and Metaheuristics for Electric Load Forecasting. Energies 2020, 13, 391.
36. Cicceri, G.; Inserra, G.; Limosani, M. A Machine Learning Approach to Forecast Economic Recessions—An Italian Case Study. Mathematics 2020, 8, 241.
37. Ly, H.B.; Le, T.T.; Le, L.M.; Tran, V.Q.; Le, V.M.; Vu, H.L.T.; Nguyen, Q.H.; Pham, B.T. Development of Hybrid Machine Learning Models for Predicting the Critical Buckling Load of I-Shaped Cellular Beams. Appl. Sci. 2019, 9, 5458.
38. Korytkowski, M.; Senkerik, R.; Scherer, M.M.; Angryk, R.A.; Kordos, M.; Siwocha, A. Efficient Image Retrieval by Fuzzy Rules from Boosting and Metaheuristic. J. Artif. Intell. Soft Comput. Res. 2020, 10, 57–69.
39. Hoang, N.D.; Tran, V.D. Image Processing-Based Detection of Pipe Corrosion Using Texture Analysis and Metaheuristic-Optimized Machine Learning Approach. Comput. Intell. Neurosci. 2019, 2019, 8097213.
40. Bui, Q.T.; Van, M.P.; Hang, N.T.T.; Nguyen, Q.H.; Linh, N.X.; Ha, P.M.; Tuan, T.A.; Cu, P.V. Hybrid model to optimize object-based land cover classification by meta-heuristic algorithm: An example for supporting urban management in Ha Noi, Viet Nam. Int. J. Digit. Earth 2019, 12, 1118–1132.
41. García, J.; Crawford, B.; Soto, R.; Astorga, G. A clustering algorithm applied to the binarization of Swarm intelligence continuous metaheuristics. Swarm Evol. Comput. 2019, 44, 646–664.
42. Mirjalili, S.; Lewis, A. S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization. Swarm Evol. Comput. 2013, 9, 1–14.
43. Feng, Y.; An, H.; Gao, X. The importance of transfer function in solving set-union knapsack problem based on discrete moth search algorithm. Mathematics 2019, 7, 17.
44. Celebi, M.E.; Kingravi, H.A.; Vela, P.A. A comparative study of efficient initialization methods for the k-means clustering algorithm. Expert Syst. Appl. 2013, 40, 200–210.
  45. Zhang, C.; Xiao, X.; Li, X.; Chen, Y.J.; Zhen, W.; Chang, J.; Zheng, C.; Liu, Z. White Blood Cell Segmentation by Color-Space-Based K-Means Clustering. Sensors 2014, 14, 16128–16147. [Google Scholar] [CrossRef] [PubMed]
  46. Ester, M.; Kriegel, H.P.; Sander, J.; Xu, X. A density-based algorithm for discovering clusters in large spatial databases with noise. In Proceedings of the Second International Conference on Knowledge Discovery and Data Mining (KDD-96); AAAI Press: Palo Alto, CA, USA, 1996; pp. 226–231. [Google Scholar]
  47. Gass, S.; Fu, M. Set-covering Problem. In Encyclopedia of Operations Research and Management Science; Springer: Cham, Switzerland, 2013; p. 1393. [Google Scholar]
  48. Lin, B.; Liu, S.; Lin, R.; Wu, J.; Wang, J.; Liu, C. Modeling the 0-1 Knapsack Problem in Cargo Flow Adjustment. Symmetry 2017, 9, 118. [Google Scholar] [CrossRef] [Green Version]
  49. Bartz-Beielstein, T.; Preuss, M. Experimental research in evolutionary computation. In Proceedings of the 9th Annual Conference Companion on Genetic and Evolutionary Computation, London, UK, 7–11 July 2007; pp. 3001–3020. [Google Scholar]
  50. Beasley, J. OR-Library. 1990. Available online: http://people.brunel.ac.uk/~mastjjb/jeb/info.html (accessed on 12 November 2017).
  51. Lilliefors, H. On the Kolmogorov-Smirnov Test for Normality with Mean and Variance Unknown. J. Am. Stat. Assoc. 1967, 62, 399–402. [Google Scholar] [CrossRef]
  52. Mann, H.B.; Whitney, D.R. On a Test of Whether one of Two Random Variables is Stochastically Larger than the Other. Ann. Math. Stat. 1947, 18, 50–60. [Google Scholar] [CrossRef]
  53. Garey, M.R.; Johnson, D.S. Computers and Intractability: A Guide to the Theory of NP-Completeness; W. H. Freeman & Co.: New York, NY, USA, 1979. [Google Scholar]
  54. Smith, B.M. IMPACS—A Bus Crew Scheduling System Using Integer Programming. Math. Program. 1988, 42, 181–187. [Google Scholar] [CrossRef]
  55. Foster, B.A.; Ryan, D.M. An Integer Programming Approach to the Vehicle Scheduling Problem. J. Oper. Res. Soc. 1976, 27, 367–384. [Google Scholar] [CrossRef]
  56. Vasko, F.J.; Wolf, F.E.; Stott, K.L. A set covering approach to metallurgical grade assignment. Eur. J. Oper. Res. 1989, 38, 27–34. [Google Scholar] [CrossRef]
  57. Caprara, A.; Toth, P.; Fischetti, M. Algorithms for the set covering problem. Ann. Oper. Res. 2000, 98, 353–371. [Google Scholar] [CrossRef]
  58. Wu, C.; Zhao, J.; Feng, Y.; Lee, M. Solving discounted {0-1} knapsack problems by a discrete hybrid teaching-learning-based optimization algorithm. Appl. Intell. 2020, 50, 1872–1888. [Google Scholar] [CrossRef]
  59. Zavala-Díaz, J.C.; Cruz-Chávez, M.A.; López-Calderón, J.; Hernández-Aguilar, J.A.; Luna-Ortíz, M.E. A Multi-Branch-and-Bound Binary Parallel Algorithm to Solve the Knapsack Problem 0–1 in a Multicore Cluster. Appl. Sci. 2019, 9, 5368. [Google Scholar] [CrossRef] [Green Version]
  60. Feng, Y.; Yu, X.; Wang, G.G. A Novel Monarch Butterfly Optimization with Global Position Updating Operator for Large-Scale 0-1 Knapsack Problems. Mathematics 2019, 7, 1056. [Google Scholar] [CrossRef] [Green Version]
  61. Bhattacharjee, K.K.; Sarmah, S. Shuffled frog leaping algorithm and its application to 0/1 knapsack problem. Appl. Soft Comput. 2014, 19, 252–263. [Google Scholar] [CrossRef]
  62. Valdivia, S.; Olivares, R.; Caselli, N. Clustering-Based Binarization Methods Applied to the Crow Search Algorithm for 0/1 Combinatorial Problems. Figshare 2020. [Google Scholar] [CrossRef]
Figure 1. Clustering process of the KMeans algorithm applied to the decision variables.
Figure 2. Clustering process of the DBscan algorithm applied to the best solution.
Figure 3. Continuous to Binary Two-steps.
Figure 4. Continuous to Binary KMeans.
Figure 5. Continuous to Binary DBscan.
Figure 6. Steps in performance analysis of a metaheuristic.
Figure 7. Statistical significance test.
Figure 8. Instance 4.10 distribution.
Figure 9. Instance 5.10 distribution.
Figure 10. Instance 6.5 distribution.
Figure 11. Instance a.5 distribution.
Figure 12. Instance b.5 distribution.
Figure 13. Instance c.5 distribution.
Figure 14. Instance d.5 distribution.
Figure 15. Instance nre.5 distribution.
Figure 16. Instance nrf.5 distribution.
Figure 17. Instance nrg.5 distribution.
Figure 18. Instance nrh.5 distribution.
Figure 19. Instance f1 distribution.
Figure 20. Instance f2 distribution.
Figure 21. Instance f5 distribution.
Figure 22. Instance f6 distribution.
Figure 23. Instance f8 distribution.
Figure 24. Instance f10 distribution.
Table 1. Instances taken from Beasley's OR-Library.

| Instance Group | M | N | Cost Range | Density (%) | Best Known |
|---|---|---|---|---|---|
| 4 | 200 | 1000 | [1,100] | 2 | Known |
| 5 | 200 | 2000 | [1,100] | 2 | Known |
| 6 | 200 | 1000 | [1,100] | 5 | Known |
| A | 300 | 3000 | [1,100] | 2 | Known |
| B | 300 | 3000 | [1,100] | 5 | Known |
| C | 400 | 4000 | [1,100] | 2 | Known |
| D | 400 | 4000 | [1,100] | 5 | Known |
| NRE | 500 | 5000 | [1,100] | 10 | Unknown (except NRE.1) |
| NRF | 500 | 5000 | [1,100] | 20 | Unknown (except NRF.1) |
| NRG | 1000 | 10,000 | [1,100] | 2 | Unknown (except NRG.1) |
| NRH | 1000 | 10,000 | [1,100] | 5 | Unknown |
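The SCP instances above share the standard structure: an M × N binary coverage matrix plus a cost per column. A minimal sketch of evaluating a candidate 0/1 column selection on a toy instance; the set-per-column `covers` format and the function name are illustrative simplifications, not the OR-Library file layout:

```python
def scp_evaluate(solution, costs, covers):
    """Cost of a 0/1 column-selection vector, or None if some row is uncovered.

    covers[j] is the set of rows covered by column j (toy format for
    illustration only).
    """
    covered = set()
    for j, selected in enumerate(solution):
        if selected:
            covered.update(covers[j])
    all_rows = set().union(*covers)
    if covered != all_rows:
        return None  # infeasible: at least one row is left uncovered
    return sum(c for c, s in zip(costs, solution) if s)

# Toy instance: 4 rows, 3 columns.
costs = [3, 2, 5]
covers = [{0, 1}, {1, 2}, {2, 3}]
print(scp_evaluate([1, 0, 1], costs, covers))  # 8: all four rows covered
print(scp_evaluate([1, 1, 0], costs, covers))  # None: row 3 uncovered
```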
Table 2. 0/1 Knapsack Problem instances.

| Instance | Dimension | Parameters |
|---|---|---|
| f1 | 10 | w = {95, 4, 60, 32, 23, 72, 80, 62, 65, 46}; p = {55, 10, 47, 5, 4, 50, 8, 61, 85, 87}; b = 269 |
| f2 | 20 | w = {92, 4, 43, 83, 84, 68, 92, 82, 6, 44, 32, 18, 56, 83, 25, 96, 70, 48, 14, 58}; p = {44, 46, 90, 72, 91, 40, 75, 35, 8, 54, 78, 40, 77, 15, 61, 17, 75, 29, 75, 63}; b = 878 |
| f3 | 4 | w = {6, 5, 9, 7}; p = {9, 11, 13, 15}; b = 20 |
| f4 | 4 | w = {2, 4, 6, 7}; p = {6, 10, 12, 13}; b = 11 |
| f5 | 15 | w = {56.358531, 80.87405, 47.987304, 89.59624, 74.660482, 85.894345, 51.353496, 1.498459, 36.445204, 16.589862, 44.569231, 0.466933, 37.788018, 57.118442, 60.716575}; p = {0.125126, 19.330424, 58.500931, 35.029145, 82.284005, 17.41081, 71.050142, 30.399487, 9.140294, 14.731285, 98.852504, 11.908322, 0.89114, 53.166295, 60.176397}; b = 75 |
| f6 | 10 | w = {30, 25, 20, 18, 17, 11, 5, 2, 1, 1}; p = {20, 18, 17, 15, 15, 10, 5, 3, 1, 1}; b = 60 |
| f7 | 7 | w = {31, 10, 20, 19, 4, 3, 6}; p = {70, 20, 39, 37, 7, 5, 10}; b = 50 |
| f8 | 23 | w = {983, 982, 981, 980, 979, 978, 488, 976, 972, 486, 486, 972, 972, 485, 485, 969, 966, 483, 964, 963, 961, 958, 959}; p = {81, 980, 979, 978, 977, 976, 487, 974, 970, 85, 485, 970, 970, 484, 484, 976, 974, 482, 962, 961, 959, 958, 857}; b = 10,000 |
| f9 | 5 | w = {15, 20, 17, 8, 31}; p = {33, 24, 36, 37, 12}; b = 80 |
| f10 | 20 | w = {84, 83, 43, 4, 44, 6, 82, 92, 25, 83, 56, 18, 58, 14, 48, 70, 96, 32, 68, 92}; p = {91, 72, 90, 46, 55, 8, 35, 75, 61, 15, 77, 40, 63, 75, 29, 75, 17, 78, 40, 44}; b = 879 |
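Using instance f1 from Table 2, a candidate bit string can be scored directly against the weights w, profits p, and capacity b; `kp_evaluate` is an illustrative helper, not code from the paper:

```python
def kp_evaluate(x, w, p, b):
    """Total profit of 0/1 selection x, or None when capacity b is exceeded."""
    weight = sum(wi for wi, xi in zip(w, x) if xi)
    if weight > b:
        return None  # infeasible selection
    return sum(pi for pi, xi in zip(p, x) if xi)

# Instance f1 from Table 2.
w = [95, 4, 60, 32, 23, 72, 80, 62, 65, 46]
p = [55, 10, 47, 5, 4, 50, 8, 61, 85, 87]
b = 269

x = [0, 1, 0, 1, 0, 0, 0, 1, 1, 1]   # an arbitrary feasible selection
print(kp_evaluate(x, w, p, b))       # 248 (weight 209 <= 269)
```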
Table 3. Crow Search Algorithm (CSA) Parameters for Set Covering Problem (SCP) and Knapsack Problem (KP)—Two-steps.

| Population | Awareness Probability | Flight Length | Iterations |
|---|---|---|---|
| 30 | 0.1 | 2 | 3000 |
Table 4. CSA Parameters for SCP and KP—KMeans.

| Population | Awareness Probability | Flight Length | Number of Clusters | Cluster Transition Value | Iterations |
|---|---|---|---|---|---|
| 30 | 0.3 | 2 | 3 | 0.04, 0.06, 0.08 | 3000 |
Table 5. CSA Parameters for SCP and KP—DBscan.

| Population | Awareness Probability | Flight Length | Minimum Number of Points | Maximum Distance | Iterations |
|---|---|---|---|---|---|
| 30 | 0.1 | 2 | 2 | 0.009 | 1000 |
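Tables 4 and 5 parameterize the clustering step of the binarization. As a rough, hypothetical sketch of the cluster-then-flip idea (the exact mapping rule is the one defined in the paper's method section, not reproduced here): a plain 1-D k-means groups each continuous position component, and each of the three clusters is assigned one of the transition values from Table 4 as a flip probability. The function names and the quantile-style initialization are illustrative assumptions:

```python
import random

def kmeans_1d(values, k, iters=50):
    """Plain 1-D Lloyd's algorithm with centroids spread over the sorted data;
    returns each value's cluster index, clusters ordered by ascending centroid."""
    srt = sorted(values)
    centroids = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [min(range(k), key=lambda c: abs(v - centroids[c])) for v in values]
        for c in range(k):
            members = [v for v, l in zip(values, labels) if l == c]
            if members:
                centroids[c] = sum(members) / len(members)
    order = sorted(range(k), key=centroids.__getitem__)
    rank = {c: r for r, c in enumerate(order)}
    return [rank[l] for l in labels]

def binarize(position, transition=(0.04, 0.06, 0.08), seed=1):
    """Hypothetical cluster-to-bit rule: each continuous position component is
    set to 1 with the transition probability assigned to its cluster."""
    rng = random.Random(seed)
    labels = kmeans_1d(position, k=len(transition))
    return [1 if rng.random() < transition[l] else 0 for l in labels]

position = [0.10, 0.12, 0.50, 0.55, 0.90, 0.95]
print(kmeans_1d(position, 3))  # [0, 0, 1, 1, 2, 2]
```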
Table 6. CSA/SCP—Two-steps vs. KMeans vs. DBscan results summary (t.o. = time out).

| Instance | BKS | Stand. MIN | Stand. AVG | Comp. MIN | Comp. AVG | Static MIN | Static AVG | Elitist MIN | Elitist AVG | KMeans MIN | KMeans AVG | DBscan MIN | DBscan AVG |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| S1 G1 | 336.08 | 344.92 | 356.75 | 373.96 | 382.11 | 420.88 | 491.21 | 346.92 | 360.91 | 336.80 | 339.43 | 335.84 | 346.80 |
| S1 G2 | 151.55 | 156.85 | 164.18 | 174 | 177.14 | 544.45 | 719.47 | 157.25 | 163.91 | 155.70 | 157.81 | 156.55 | 164 |
| S1 G3 | 67.10 | 81.25 | 91.63 | 83.70 | 85.27 | 3673.70 | 4385.35 | 73.50 | 78.78 | 75.75 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 194.34 | 204.18 | 210.55 | 214.84 | 1546.34 | 1865.34 | 192.55 | 201.20 | 189.41 | 191.01 | t.o. | t.o. |
| S2 G1 | 336.08 | 341.56 | 354.92 | 372.64 | 381.55 | 716 | 842.13 | 348.24 | 367.78 | 336.80 | 339.43 | 335.84 | 338.25 |
| S2 G2 | 151.55 | 156.45 | 262.97 | 173.70 | 177.05 | 1222.70 | 1383.50 | 157.70 | 164.15 | 155.70 | 157.81 | 156.55 | 164 |
| S2 G3 | 67.10 | 366.80 | 423.83 | 83.50 | 85.19 | 5619.10 | 5986.45 | 74.05 | 79.27 | 75.45 | 75.50 | t.o. | t.o. |
| AVG | 184.91 | 288.27 | 347.24 | 209.94 | 214.59 | 2519.26 | 2737.36 | 193.33 | 203.73 | 189.31 | 190.91 | t.o. | t.o. |
| S3 G1 | 336.08 | 340.80 | 353.72 | 373.20 | 380.81 | 854.32 | 809.68 | 347.28 | 362.93 | 336.80 | 339.44 | 335.84 | 338.25 |
| S3 G2 | 151.55 | 126.86 | 133.19 | 142.20 | 144.55 | 1243 | 1417.11 | 127.26 | 132.79 | 126.46 | 128.65 | 127.73 | 136.57 |
| S3 G3 | 67.10 | 625.10 | 712.35 | 83.80 | 85.31 | 5291.90 | 5656.64 | 84.15 | 96.71 | 75.75 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 364.25 | 399.75 | 199.73 | 203.55 | 2463.07 | 2627.81 | 186.23 | 197.47 | 179.67 | 181.29 | t.o. | t.o. |
| S4 G1 | 336.08 | 341.64 | 353.17 | 372.04 | 380.58 | 364.56 | 393.54 | 336.12 | 339.52 | 336.80 | 339.43 | 335.84 | 338.25 |
| S4 G2 | 151.55 | 156.55 | 163.31 | 173.80 | 176.94 | 309.35 | 390.49 | 151.70 | 155.96 | 155.70 | 157.81 | 156.55 | 164 |
| S4 G3 | 67.10 | 973.95 | 1071.68 | 84.05 | 85.49 | 2168.80 | 2546.21 | 69.80 | 73.84 | 75.75 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 490.71 | 529.38 | 209.96 | 214.33 | 947.57 | 1110.08 | 185.87 | 189.77 | 189.41 | 191.01 | t.o. | t.o. |
| V1 G1 | 336.08 | 738.40 | 849.97 | 385.88 | 507.63 | 742.76 | 883.61 | 580.48 | 659.84 | 336.80 | 339.44 | 335.84 | 338.25 |
| V1 G2 | 151.55 | 1290.05 | 1435.96 | 172.75 | 177.36 | 1287.55 | 1467.41 | 809.90 | 964.76 | 155.70 | 157.82 | 156.55 | 164 |
| V1 G3 | 67.10 | 5741.30 | 6160.88 | 83.45 | 85.41 | 5732.75 | 6209.07 | 3679.25 | 4089.29 | 75.75 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 2589.91 | 2815.60 | 214.02 | 256.80 | 2587.68 | 2853.36 | 1689.87 | 1904.63 | 189.41 | 191.02 | t.o. | t.o. |
| V2 G1 | 336.08 | 726.88 | 827.56 | 376 | 387.02 | 710.48 | 834.56 | 565.16 | 653.04 | 336.80 | 339.43 | 335.84 | 338.25 |
| V2 G2 | 151.55 | 1229.95 | 1392.43 | 174.70 | 179.78 | 1232 | 1431.86 | 782.50 | 914.29 | 155.70 | 157.82 | 156.55 | 164 |
| V2 G3 | 67.10 | 5590.55 | 5979.80 | 83.85 | 85.37 | 5595.90 | 6012.68 | 3532.40 | 3891.67 | 75.75 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 2515.79 | 3403.68 | 211.51 | 217.39 | 2512.79 | 2759.70 | 1626.68 | 1819.66 | 189.41 | 191.01 | t.o. | t.o. |
| V3 G1 | 336.08 | 698.84 | 787.76 | 372.80 | 380.01 | 697 | 800.87 | 530.80 | 606.17 | 336.80 | 339.44 | 335.84 | 338.25 |
| V3 G2 | 151.55 | 1144.75 | 1303.63 | 174.05 | 176.59 | 1174.15 | 1342.70 | 665.60 | 785.03 | 155.70 | 157.82 | 156.55 | 164 |
| V3 G3 | 67.10 | 5255.20 | 5610.33 | 84.10 | 85.67 | 5215.45 | 5633.99 | 3056.45 | 3403.96 | 75.75 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 2366.26 | 2567.24 | 210.31 | 214.09 | 2362.20 | 2592.52 | 1417.61 | 1598.38 | 189.41 | 191.02 | t.o. | t.o. |
| V4 G1 | 336.08 | 685.24 | 776.24 | 370 | 394.93 | 656.96 | 763.27 | 501.60 | 573.14 | 336.80 | 430 | 335.84 | 338.25 |
| V4 G2 | 151.55 | 1165.20 | 1347.83 | 173.50 | 176.32 | 1189.12 | 1198.33 | 589 | 697 | 155.70 | 157.82 | 156.55 | 164 |
| V4 G3 | 67.10 | 5241.35 | 5615.42 | 97.90 | 109.59 | 4802.40 | 5148.97 | 2548.55 | 2937.04 | 77.93 | 75.80 | t.o. | t.o. |
| AVG | 184.91 | 2363.93 | 2579.83 | 213.8 | 226.94 | 2216.16 | 2370.19 | 1213.05 | 1402.39 | 190.14 | 221.20 | t.o. | t.o. |
Table 7. CSA/KP—Two-steps vs. KMeans vs. DBscan results summary.

| Instance | BKS | Stand. MIN | Stand. AVG | Comp. MIN | Comp. AVG | Static MIN | Static AVG | Elitist MIN | Elitist AVG | KMeans MIN | KMeans AVG | DBscan MIN | DBscan AVG |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| S1 | 1293.91 | 1145.52 | 1049.61 | 1136.45 | 958.18 | 1176.22 | 1124.34 | 946 | 789.78 | 825.81 | 832.05 | 813.88 | 789.28 |
| S2 | 1293.91 | 1076.23 | 1015.91 | 853.49 | 800.82 | 928.31 | 896.84 | 931.6 | 908.27 | 825.81 | 832.05 | 813.88 | 789.28 |
| S3 | 1293.91 | 1164.51 | 1128.06 | 1169.75 | 1139.02 | 1153.52 | 1136.82 | 1176.37 | 1158.26 | 825.81 | 832.05 | 813.88 | 789.28 |
| S4 | 1293.91 | 1069.53 | 1035.32 | 1071.46 | 1005.99 | 1117.15 | 1063.21 | 1203.77 | 1163.96 | 825.81 | 832.05 | 813.88 | 789.28 |
| V1 | 1293.91 | 1070.59 | 1059.06 | 1199.57 | 1165.4 | 1165.53 | 1112.61 | 1235.02 | 1201.62 | 825.81 | 832.05 | 813.88 | 789.28 |
| V2 | 1293.91 | 940.25 | 903.22 | 895.97 | 854.73 | 789.72 | 748.06 | 320.69 | 1617.22 | 825.81 | 832.05 | 813.88 | 789.28 |
| V3 | 1293.91 | 1230.63 | 1220.57 | 1187.74 | 1153.86 | 1235.54 | 1208.77 | 1237.23 | 1203.28 | 825.81 | 832.05 | 813.88 | 789.28 |
| V4 | 1293.91 | 1230.63 | 1220.57 | 1187.74 | 1153.86 | 1235.54 | 1208.77 | 1237.23 | 1203.28 | 825.81 | 832.05 | 813.88 | 789.28 |
| AVG | 1293.91 | 1115.98 | 1079.04 | 1087.77 | 1028.98 | 1100.19 | 1062.43 | 1036.01 | 1155.71 | 825.81 | 832.05 | 813.88 | 789.28 |
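MIN columns such as these are commonly related to the BKS column through the relative percentage deviation, RPD = 100 · (value − BKS)/BKS for a minimization problem like the SCP. A one-line sketch (the paper's own comparison metric may differ):

```python
def rpd(value, best_known):
    """Relative percentage deviation of a result from the best-known value
    (for minimization problems such as the SCP)."""
    return 100.0 * (value - best_known) / best_known

# KMeans MIN vs. BKS on instance group S1 G1 (values from Table 6).
print(round(rpd(336.80, 336.08), 3))  # 0.214
```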
Table 8. p-values for instance 4.10.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | 6.47 × 10^-4 | SWS |
| KMeans | SWS | × | SWS |
| DBscan | 1.02 × 10^-2 | 2.91 × 10^-7 | × |

Table 9. p-values for instance 5.10.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | 3.34 × 10^-2 | SWS |
| KMeans | SWS | × | SWS |
| DBscan | SWS | 3.09 × 10^-2 | × |

Table 10. p-values for instance 6.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 1.84 × 10^-3 | × | 6.42 × 10^-4 |
| DBscan | SWS | SWS | × |

Table 11. p-values for instance a.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 1.85 × 10^-2 | × | 4.25 × 10^-2 |
| DBscan | SWS | SWS | × |

Table 12. p-values for instance b.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | SWS | × | SWS |
| DBscan | 1.14 × 10^-4 | 9.54 × 10^-4 | × |

Table 13. p-values for instance c.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | 4.17 × 10^-12 |
| KMeans | SWS | × | 4.69 × 10^-12 |
| DBscan | SWS | SWS | × |

Table 14. p-values for instance d.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | 2.29 × 10^-9 | 9.60 × 10^-12 |
| KMeans | SWS | × | 2.35 × 10^-13 |
| DBscan | SWS | SWS | × |

Table 15. p-values for instance nre.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | 5.59 × 10^-12 |
| KMeans | SWS | × | 2.34 × 10^-13 |
| DBscan | SWS | SWS | × |

Table 16. p-values for instance nrf.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | 9.16 × 10^-12 | 1.87 × 10^-12 |
| KMeans | SWS | × | 2.37 × 10^-13 |
| DBscan | SWS | SWS | × |

Table 17. p-values for instance nrg.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | 3.26 × 10^-7 | 6.49 × 10^-12 |
| KMeans | SWS | × | 2.37 × 10^-13 |
| DBscan | SWS | SWS | × |

Table 18. p-values for instance nrh.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | D/A |
| KMeans | 5.24 × 10^-4 | × | D/A |
| DBscan | D/A | D/A | × |

Table 19. p-values for instance f.1.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 1.68 × 10^-10 | × | 8.92 × 10^-12 |
| DBscan | SWS | SWS | × |

Table 20. p-values for instance f.2.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 9.66 × 10^-9 | × | 3.06 × 10^-2 |
| DBscan | 7.23 × 10^-8 | SWS | × |

Table 21. p-values for instance f.5.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 6.50 × 10^-12 | × | 6.68 × 10^-12 |
| DBscan | 3.49 × 10^-3 | SWS | × |

Table 22. p-values for instance f.6.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 1.32 × 10^-13 | × | 1.32 × 10^-13 |
| DBscan | SWS | SWS | × |

Table 23. p-values for instance f.8.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | 6.53 × 10^-12 | × | 2.77 × 10^-3 |
| DBscan | 6.54 × 10^-12 | SWS | × |

Table 24. p-values for instance f.10.

| | Two-Steps | KMeans | DBscan |
|---|---|---|---|
| Two-steps | × | SWS | SWS |
| KMeans | SWS | × | SWS |
| DBscan | 6.53 × 10^-12 | 6.49 × 10^-12 | × |
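The p-values in Tables 8–24 come from a non-parametric pairwise comparison; given references [51,52], this is presumably a Mann–Whitney–Wilcoxon test applied after a Kolmogorov–Smirnov normality check. A dependency-free sketch of the U statistic underlying such a test; in practice the p-value would be obtained from a statistics library such as scipy.stats.mannwhitneyu:

```python
def mann_whitney_u(xs, ys):
    """Mann-Whitney U statistic of sample xs versus ys, using midranks for ties.
    Only the statistic is computed here to keep the sketch dependency-free."""
    n1 = len(xs)
    combined = sorted((v, i) for i, v in enumerate(xs + ys))
    ranks = [0.0] * len(combined)
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j][0] == combined[i][0]:
            j += 1
        midrank = (i + j + 1) / 2.0  # average 1-based rank of the tied run
        for t in range(i, j):
            ranks[combined[t][1]] = midrank
        i = j
    r1 = sum(ranks[:n1])  # rank sum of the first sample
    return r1 - n1 * (n1 + 1) / 2.0

# Perfectly separated samples give the extreme U values 0 and n1*n2.
print(mann_whitney_u([1, 2, 3], [4, 5, 6]), mann_whitney_u([4, 5, 6], [1, 2, 3]))  # 0.0 9.0
```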

Valdivia, S.; Soto, R.; Crawford, B.; Caselli, N.; Paredes, F.; Castro, C.; Olivares, R. Clustering-Based Binarization Methods Applied to the Crow Search Algorithm for 0/1 Combinatorial Problems. Mathematics 2020, 8, 1070. https://doi.org/10.3390/math8071070