Article

A Quantum-Based Chameleon Swarm for Feature Selection

1 Department of Mathematics, Faculty of Science, Zagazig University, Zagazig 44519, Egypt
2 Artificial Intelligence Research Center (AIRC), College of Engineering and Information Technology, Ajman University, Ajman P.O. Box 346, United Arab Emirates
3 Faculty of Computer Science and Engineering, Galala University, Suez 435611, Egypt
4 Department of Electrical and Computer Engineering, Lebanese American University, Byblos 13-5053, Lebanon
5 Mechanical Engineering Department, Imam Mohammad Ibn Saud Islamic University, Riyadh 11564, Saudi Arabia
6 Department of Production Engineering and Mechanical Design, Faculty of Engineering, Tanta University, Tanta 31527, Egypt
7 Institute for High Performance Computing and Networking, National Research Council of Italy, 87036 Rende, Italy
* Author to whom correspondence should be addressed.
Mathematics 2022, 10(19), 3606; https://doi.org/10.3390/math10193606
Submission received: 19 August 2022 / Revised: 24 September 2022 / Accepted: 27 September 2022 / Published: 2 October 2022

Abstract: The Internet of Things is widely used, which results in the collection of enormous amounts of data with numerous redundant, irrelevant, and noisy features, many of which need to be managed. Consequently, developing an effective feature selection (FS) strategy becomes a difficult goal. Many FS techniques based on bioinspired metaheuristic methods have been developed to tackle this problem; however, these methods still suffer from limitations. In this paper, we therefore developed an alternative FS technique that integrates the operators of the chameleon swarm algorithm (Cham) with the quantum-based optimization (QBO) technique. Using eighteen datasets from various real-world applications, the proposed QCham is investigated and compared with well-known FS methods. The comparisons demonstrate the benefit of including a QBO operator in the Cham, since the proposed QCham can efficiently and accurately detect the most crucial features, achieving an accuracy of nearly 92.6% with a CPU time of nearly 1.7 s over all the tested datasets. This indicates the advantage of QCham over the comparative algorithms and the high efficiency of integrating QBO with the operators of the Cham algorithm to enhance the balance between exploration and exploitation.

1. Introduction

Recently, the Internet of Things (IoT) has received increasing attention, since it can be applied in different domains, for example, medicine [1,2,3], agriculture [4], industry [5,6], and education [7]. However, the rapid growth in data volume, as well as its complexity, gives rise to many challenging problems, such as noise, high dimensionality, and irrelevance [8]. These problems impair the accuracy and efficiency of machine learning systems and result in long computational times. Feature selection (FS) methods have been proposed to boost the accuracy of the classification process and reduce the corresponding computational costs [9]. FS approaches are usually employed to capture the properties of data by determining the subset with the most relevant features [10]; moreover, they are employed to remove noisy, irrelevant data [10]. Thus, FS techniques have been extensively used in various engineering fields, such as text classification [11], human motion detection [12], classification of computerized tomography images (especially for COVID-19 cases) [13], neuromuscular disorders [14], parameter determination of biochemical systems [15], data analysis problems [16], segmentation of magnetic resonance images [17], and other applications [18].
There are three key kinds of FS techniques, namely filter, wrapper, and embedded [18]. Filter-based methods utilize the properties of the dataset itself, while wrapper-based techniques utilize the learning approach to evaluate the selected features. Embedded-based methods learn, during the creation of the classification system, the features that contribute most to its robustness and accuracy. Wrapper-based methods thus have better efficiency, but higher computational costs, than filter-based techniques. Generally speaking, the main criteria used to assess the performance of an FS method are the minimization of the total number of selected features and the maximization of the classifier accuracy [18]. One of the most popular wrapper-based families of techniques relies on metaheuristic (MH) algorithms.
In general, MH optimizers have been commonly employed to solve complex optimization problems, such as parameter selection of solar cells [19], optimizing neural networks [20], and FS. Several MH optimizers have been developed in the literature as robust tools that are successfully utilized to solve FS problems. The most commonly used MH optimizers in this direction are particle swarm optimization (PSO) [21], genetic algorithms (GA) [22], the firefly algorithm [23], the manta ray foraging optimizer [24], the grey wolf optimizer [25], differential evolution (DE) [26], the Harris hawk optimizer (HHO) [27], the Henry gas optimizer [28,29], the political optimizer [30], the flower pollination optimizer [31], the Aquila optimizer [32], the crow search optimizer [33], the cat swarm optimizer [34], and the ecosystem-based optimizer [35]. According to the no free lunch theorem (NFLT), there is no single algorithm that can successfully solve all problems. Therefore, the hybridization concept has been widely employed to solve several intricate problems, such as FS. In this study, following this hybridization concept, a new FS approach is proposed, utilizing an innovative algorithm called the chameleon swarm optimizer (Cham).
Cham is a newly developed MH approach that mimics the behavior of chameleons while hunting food sources in jungles or deserts. Based on these behaviors, Cham has been applied to a variety of applications, for example, the economic load dispatch problem [36], optimization of automobile cruise control [37], parameter extraction of solid oxide fuel cells [38], global optimization and feature selection [39], and segmentation of plant disease [40]. However, it suffers from getting stuck in local minima and converges very slowly when the optimization problem has high dimensionality. To overcome these drawbacks of the standalone Cham algorithm, quantum-based optimization (QBO) is used. QBO has been successfully employed in several engineering applications, such as parameter identification of PEM fuel cells [41], wrapper feature selection [42], and inverse problems [43].
In this study, a hybrid MH optimizer called QCham is proposed, which is composed of the conventional Cham enhanced by QBO. The tested dataset is first split into two subsets: a training set containing 70% of the data and a testing set containing the remaining 30%. Then, the initial values of the individuals that represent the FS solutions are defined. Boolean versions of the individuals are computed to evaluate their efficacy, and the fitness value is calculated from these Boolean representations. The individual with the best fitness value is then selected as the best solution. Next, the MH operators of Cham and QBO are utilized in the exploration stage to determine the regions containing the optimal solutions, called the feasible region; this enhances the convergence toward the optimal solution. Furthermore, conventional Cham operators are used during the exploitation stage. This updating of the individuals is repeated until a stopping criterion is fulfilled. Then, the dimensionality of the testing set is reduced, considering the features of the best solution. Finally, the performance of the proposed algorithm is assessed using various statistical measures. To the best of our knowledge, there are no previous studies applying Cham or an improved version of it to FS problems.
The main contributions and objectives of this study are listed as follows:
  • Develop an innovative improved version of Cham using the mathematical operators of QBO to improve exploration capability.
  • Employ the enhanced hybrid QCham algorithm as a new FS method to detect and eliminate the irrelevant features, which results in improving the accuracy and efficiency of the classification process.
  • Evaluate the efficiency of the proposed QCham using eighteen UCI datasets and compare its performance with other well-known conventional FS approaches.
The remaining parts of this article are structured as follows. Section 2 presents the FS methods applied in the literature to solve the investigated problem. Section 3 discusses the fundamentals and mathematical representation of Cham and QBO. Section 4 presents the main stages of the proposed approach. Section 5 illustrates and discusses the experimental results. Finally, the main conclusions of the study and future prospects are given in Section 6.

2. Related Works

Herein, the main FS methods applied in the literature to solve the investigated FS problem are highlighted. Chaudhuri and Sahu [44] developed an improved crow search algorithm (CSA) and employed it in FS problems. They regulated both stages of the search process (exploitation and exploration) by applying a time-varying flight length of the crows. They assessed many variants of FS models and verified the algorithm's accuracy using 20 UCI datasets.
The butterfly optimizer (BO) and information theory were integrated to form a hybrid FS algorithm [45]. The hybrid algorithm overcomes the main drawbacks of the conventional BO model and exhibited excellent performance compared to other well-known MH models. Maleki et al. [46] employed a conventional genetic algorithm as an FS technique, along with a k-nearest neighbors (kNN) classifier, for the classification of lung cancer disease; the application of GA in this classification process enhanced the classification. Song et al. [47] proposed a new PSO variant, called bare-bones PSO, as an FS approach. The key idea of the proposed algorithm is a swarm initialization procedure based on label correlation. The exploitation process was also enhanced using two operators, called the deletion and supplementary operators. Furthermore, an adaptive mutation operator was used to avoid getting stuck in local minima. The proposed algorithm was applied to many datasets, along with kNN, and showed superior performance compared with other well-known MH algorithms.
In another study, a hybrid FS model composed of the grey wolf optimizer (GWO) and rough set was developed to analyze mammogram images [48]. The proposed hybrid algorithm outperformed the well-known FS techniques in the comparison investigation carried out. Tubishat et al. [49] established an FS algorithm based on a dynamic version of the salp swarm optimizer (SSO). The new version of SSO was built on two approaches: the first controls the position updating of the salps, while the second enhances the search capabilities of the conventional SSA. The proposed SSA was employed along with a conventional kNN classifier, assessed using recognized datasets, compared with the conventional SSA and other well-known MH algorithms, and exhibited superior performance.
Dhiman et al. [50] employed a modified binary version of the emperor penguin optimizer (EPO) to solve FS problems. The proposed approach was compared with other well-known approaches on twenty-five datasets; generally, the binary EPO exhibited excellent performance compared with the conventional EPO. A hybrid Elastic Net and genetic algorithm was employed for FS [51]. Neggaz et al. [52] proposed an FS approach based on the Henry gas solubility optimizer (HGSO). The authors in [10] developed an FS method combining two MH approaches, namely the firefly and slime mould algorithms; the developed method was assessed using various datasets, such as the QSAR datasets.
Yousri et al. [13] presented an FS technique to improve the classification of COVID-19 CT images. A modified cuckoo search (CS) optimizer was employed for this purpose, with heavy-tailed distributions and fractional-order (FO) calculus used to enhance the conventional CS algorithm. In general, many standalone and hybrid MH approaches have been established for different FS applications, and they outperform traditional techniques, as discussed in [53,54].

3. Background

3.1. Chameleon Swarm Optimizer

The chameleon swarm optimizer [55] is a bioinspired model that simulates the natural behavior of chameleons during their hunting process in different environments. Chameleons have an excellent ability to change their color to match their surroundings. They have outstanding eyesight, with a clear vision range of 10 m, which enables them to monitor their prey, such as grasshoppers, small birds, snails, lizards, crickets, and mantises. However, chameleons themselves may fall prey to snakes and birds, and they protect themselves by adapting their color to the surroundings. Chameleons move in deserts and climb trees to hunt for prey. Their excellent eyesight enables them to explore the search space to find and trace their prey, and they can look in two directions at the same time: each eye moves independently during the tracking process, which helps chameleons detect and track two prey simultaneously. They have a superior capability to track prey, with a full 360° stereoscopic view. Chameleons feed via sticky tongues, which adhere to the prey's body using surface mechanisms such as entanglement and wet adhesion, with very high tongue acceleration (2590 m/s²). The developed algorithm mimics this natural hunting process, including capturing, pursuing, and tracking. The mathematical formulation of the optimizer is discussed in this section.
First, the optimizer defines its initial population of m chameleons in a search space with n dimensions. Each chameleon represents a solution to the optimization problem and has a position at a certain iteration $t$, given by [55]:
$$x_t^j = \left[ x_{t,1}^j,\; x_{t,2}^j,\; x_{t,3}^j,\; \ldots,\; x_{t,n}^j \right]$$
where $j = 1, 2, 3, \ldots, m$, $m$ is the total number of chameleons, $t$ denotes the iteration number, $n$ denotes the problem dimension, and $x_{t,n}^j$ denotes the position of the $j$th chameleon in the $n$th dimension.
The population is randomly initiated, according to the following formula [55]:
$$x^j = LB_j + r \times \left( UB_j - LB_j \right)$$
where $x^j$ denotes the initial position of the $j$th chameleon, $r$ denotes a random number drawn from the range $[0, 1]$, and $UB_j$ and $LB_j$ denote the upper and lower bounds of the search space, respectively.
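For illustration, a minimal Python sketch of this initialization step is given below; NumPy and the function name are our assumptions, not part of the original algorithm description.

```python
import numpy as np

def init_population(m, n, lb, ub, seed=None):
    """Randomly place m chameleons in an n-dimensional box [lb, ub] (Equation (2))."""
    rng = np.random.default_rng(seed)
    r = rng.random((m, n))      # r ~ U[0, 1], drawn independently per dimension
    return lb + r * (ub - lb)   # element-wise scaling into the search bounds
```

For example, `init_population(20, 13, 0.0, 1.0)` creates the 20 chameleons used in the experiments for a 13-feature dataset.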
The solution quality is evaluated using a fitness function computed at each new chameleon position; the current position is then updated to improve the solution quality, and the chameleon does not move if the new position has lower quality than the current one. During the search for prey, the chameleon position is updated according to the following formula:
$$x_{t+1}^{j,i} = \begin{cases} x_t^{j,i} + s_1 \left( S_t^{j,i} - W_t^{i} \right) r_2 + s_2 \left( W_t^{i} - x_t^{j,i} \right) r_1 & r_i \geq S_s \\ x_t^{j,i} + z \left( \left( UB_i - LB_i \right) r_3 + LB_i \right) \mathrm{sgn}\left( \mathrm{rand} - 0.5 \right) & r_i < S_s \end{cases}$$
where $x_{t+1}^{j,i}$ denotes the updated position at iteration $t+1$, $x_t^{j,i}$ denotes the present position at iteration $t$, $S_t^{j,i}$ denotes the best position attained so far by chameleon $j$, $W_t^{i}$ denotes the global best solution at iteration $t$, $s_1$ and $s_2$ denote positive control parameters, $r_1$, $r_2$, $r_3$, and $r_i$ denote random values drawn from $[0, 1]$, $S_s$ denotes the perceiving probability, and $z$ is a time-decay number.
The line connecting $S$ and $W$ is given by:
$$L(r_1) = r_1 S + (1 - r_1) W$$
The positions of the chameleons can be characterized along this line path as follows:
$$F(r_1, r_2) = r_1 L + (1 - r_2) E, \qquad 0 \leq r_2 \leq 1$$
where $E = x_t^{j,i}$ is a position in an affine plane together with the other two positions $S$ and $W$.
Chameleons can track the prey position by exploiting their excellent vision, rotating their eyes over 360°. A chameleon updates its position via sequential movements and rotation steps to reach the prey. The algorithm translates the chameleon's original position to its center of gravity, computes the rotation matrix toward the prey, updates the chameleon position based on this rotation matrix at its center of gravity, and finally translates the chameleon back to its previous position. The updated position of the chameleon is computed according to the following formula:
$$x_{t+1}^{j} = xr_t^{j} + \bar{x}_t^{j}$$
where $x_{t+1}^{j}$ denotes the updated position after rotation, $\bar{x}_t^{j}$ denotes the center of gravity of the chameleon, and $xr_t^{j}$ denotes the rotated centered coordinates, given as:
$$xr_t^{j} = k \times xc_t^{j}$$
where $k$ denotes the rotation matrix and $xc_t^{j}$ denotes the coordinates relative to the rotation center at iteration $t$, given by:
$$xc_t^{j} = x_t^{j} - \bar{x}_t^{j}$$
where $x_t^{j}$ denotes the current position at iteration $t$. Based on the velocity of the chameleon around the prey, its new position is computed as follows:
$$x_{t+1}^{j,i} = x_t^{j,i} + \frac{\left( v_t^{j,i} \right)^2 - \left( v_{t-1}^{j,i} \right)^2}{a}, \qquad a = 2 \left( 2590 \times \left( 1 - e^{-\log(t)} \right) \right)$$
where $v_t^{j,i}$ and $v_{t-1}^{j,i}$ denote the chameleon velocity at iterations $t$ and $t-1$, respectively, and $a$ denotes the chameleon acceleration. The procedure of the Cham algorithm is given in Algorithm 1.
Algorithm 1. A pseudo-code of the Cham algorithm.
1. Set the perceiving probability $S_s = 0.1$
2. Initialize the positions and velocities of the chameleons.
3. While ($t < T$) do
4.     for $j = 1$ to $m$ do
5.         for $i = 1$ to $n$ do
6.             if $r_i \geq S_s$ then
7.                 $x_{t+1}^{j,i} = x_t^{j,i} + s_1 (S_t^{j,i} - W_t^{i}) r_2 + s_2 (W_t^{i} - x_t^{j,i}) r_1$
8.             else
9.                 $x_{t+1}^{j,i} = x_t^{j,i} + z ((UB_j - LB_j) r_3 + LB_j) \, \mathrm{sgn}(\mathrm{rand} - 0.5)$
10.            end if
11.        end for
12.    end for
13.    Rotate each chameleon $j$ about its center of gravity, $xr_t^{j} = k \times xc_t^{j}$, as in Equations (5)–(7).
14.    for $j = 1$ to $m$ do
15.        for $i = 1$ to $n$ do
16.            $x_{t+1}^{j,i} = x_t^{j,i} + \left( (v_t^{j,i})^2 - (v_{t-1}^{j,i})^2 \right) / \left( 5180 \times (1 - e^{-\log(t)}) \right)$
17.        end for
18.    end for
19.    Update the positions of the chameleons based on the predefined $LB$ and $UB$.
20.    Set $t = t + 1$
21. end while
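The loop below is a hedged Python rendering of Algorithm 1; the control parameters `s1`, `s2`, and `z`, the velocity handling, and the guard on the acceleration term are simplified stand-ins for the exact settings of the original Cham paper [55], not a reproduction of the authors' code.

```python
import numpy as np

def cham_step(X, V, V_prev, P, G, t, lb, ub,
              Ss=0.1, s1=1.0, s2=1.0, z=1.0, rng=None):
    """One Cham iteration (sketch of Algorithm 1).

    X: (m, n) positions; V, V_prev: current and previous velocities;
    P: (m, n) best position found by each chameleon; G: (n,) global best;
    lb, ub: (n,) bound arrays. s1, s2, z are placeholder control parameters.
    """
    rng = rng or np.random.default_rng()
    m, n = X.shape
    Xn = X.copy()
    for j in range(m):
        for i in range(n):
            r1, r2, r3, ri = rng.random(4)
            if ri >= Ss:  # prey-search move toward the best solutions
                Xn[j, i] += s1 * (P[j, i] - G[i]) * r2 + s2 * (G[i] - X[j, i]) * r1
            else:         # random jump inside the bounds (exploration)
                Xn[j, i] += z * ((ub[i] - lb[i]) * r3 + lb[i]) * np.sign(rng.random() - 0.5)
    # velocity step of Equation (9); e^{-log t} = 1/t, so the divisor is 5180(1 - 1/t),
    # guarded against t = 1 to avoid division by zero
    a = 5180.0 * (1.0 - 1.0 / max(t, 2))
    Xn += (V ** 2 - V_prev ** 2) / a
    return np.clip(Xn, lb, ub)
```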

3.2. Quantum-Based Optimization (QBO)

In the quantum-based optimization (QBO) technique, a binary number is used to represent each feature, indicating whether it will be selected (1) or removed (0). In QBO, each feature is denoted by a quantum bit (Q-bit, $q$), where $q$ refers to a superposition of the binary values '0' and '1'. The Q-bit can be formulated using the following equation [39]:
$$q = \alpha + i\beta = e^{i\theta}, \qquad |\alpha|^2 + |\beta|^2 = 1$$
where $\alpha$ and $\beta$ denote the probability amplitudes of the Q-bit being '0' and '1', respectively, and $\theta$ refers to the angle of $q$, updated using $\tan^{-1}(\beta / \alpha)$.
The QBO determines the change in the value of $q$ by computing the rotation angle $\Delta\theta$, which is formulated as:
$$q(t+1) = R(\Delta\theta) \times q(t) = R(\Delta\theta) \times \begin{bmatrix} \alpha(t) \\ \beta(t) \end{bmatrix}$$
$$R(\Delta\theta) = \begin{bmatrix} \cos(\Delta\theta) & -\sin(\Delta\theta) \\ \sin(\Delta\theta) & \cos(\Delta\theta) \end{bmatrix}$$
where $\Delta\theta_j^i$ refers to the rotation angle of the $i$th Q-bit of the $j$th Q-solution; it is determined according to the best solution $X_b$ and the predefined values in Table 1, which are based on experimental tests conducted on knapsack problems [56].
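To make the rotation concrete, the sketch below applies the Table 1 rule to one Q-solution stored directly as a vector of angles; storing angles rather than $(\alpha, \beta)$ pairs is an implementation choice of ours, not something the paper prescribes.

```python
import numpy as np

def rotate_qbits(theta, bx, bx_best, ge_best, step=0.01 * np.pi):
    """Rotate the Q-bit angles of one solution by the Table 1 rule.

    bx, bx_best: binary vectors of the current and best solutions;
    ge_best: True when f(x_t^j) >= f(x_b), i.e., the 'True' rows of Table 1.
    """
    dtheta = np.zeros_like(theta)
    if not ge_best:  # only the 'False' rows of Table 1 rotate the Q-bits
        dtheta[(bx == 0) & (bx_best == 1)] = step   # Delta-theta = 0.01*pi
        dtheta[(bx == 1) & (bx_best == 0)] = -step  # Delta-theta = -0.01*pi
    return theta + dtheta  # new Q-bit: q = e^{i(theta + Delta-theta)}, Eqs. (10)-(12)
```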

4. Proposed QCham Method

Figure 1 shows the procedure of the developed FS approach, which boosts Cham's efficiency by employing the quantum-based technique (QBO). The primary goal of employing QBO is to improve the balance between exploration and exploitation while searching for a feasible solution. The proposed FS method, QCham, starts by separating the data into training and testing sets of 70% and 30%, respectively. Random numerical values for the N chameleons are then allocated, and the fitness value of each is calculated. The individual with the best fitness value is then taken as the best chameleon. Following this, the solutions are updated using the Cham exploitation operators. This updating procedure is repeated until the stopping criteria are met. Then, the dimensionality of the testing set is reduced based on the best solution, and QCham as an FS method is evaluated using several metrics. QCham is described in depth in the following subsections.

4.1. First Phase

The initial chameleons, which represent the population of solutions, are created in this phase. Each solution contains $D$ Q-bits, where $D$ is the number of features. Therefore, the solution $X_j$ can be formulated as in Equation (13):
$$X_j = \left[ q_{j1} \,|\, q_{j2} \,|\, \ldots \,|\, q_{jD} \right] = \left[ \theta_{j1} \,|\, \theta_{j2} \,|\, \ldots \,|\, \theta_{jD} \right], \qquad j = 1, 2, \ldots, m$$
In this equation, $X_j$ refers to a set of superpositions of the probabilities of each feature being either selected or not.

4.2. Second Phase

Within this phase of QCham, the solutions are updated until the termination criteria are reached. The first step in this process is to convert $X_j$ into a binary form $BX_{j,i}$ using Equation (14):
$$BX_{j,i} = \begin{cases} 1 & \text{if } \mathrm{rand} < |\beta|^2 \\ 0 & \text{otherwise} \end{cases}$$
where $\mathrm{rand} \in [0, 1]$ is a random value. The next step is to train the KNN classifier using the training features at the indices corresponding to the ones in $BX_{j,i}$ and to calculate the fitness value, which is given as:
$$Fit_j = \rho \times \gamma + (1 - \rho) \times \frac{|BX_{j,i}|}{D}$$
In Equation (15), $|BX_{j,i}|$ refers to the total number of selected features, and $\gamma$ denotes the classification error of the KNN classifier. $\rho \in [0, 1]$ is a value that balances the two parts of the fitness function. The key reasons for choosing KNN are that it is simple and efficient and has just one parameter. The next step is to identify the best solution $X_b$, which has the smallest $Fit_b$, and then to apply the operators of Cham, as given in Equations (3)–(9).
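As a concrete illustration of Equations (14) and (15), the sketch below binarizes one Q-solution and scores it with scikit-learn's KNeighborsClassifier; the value $\rho = 0.99$ and $k = 5$ neighbors are our illustrative assumptions, since the paper does not report these settings here.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def fitness(theta, X_tr, y_tr, X_val, y_val, rho=0.99, rng=None):
    """Binarize a Q-solution (Equation (14)) and score it (Equation (15))."""
    rng = rng or np.random.default_rng()
    beta2 = np.sin(theta) ** 2                    # |beta|^2 from q = e^{i*theta}
    bx = (rng.random(theta.size) < beta2).astype(int)
    if bx.sum() == 0:                             # keep at least one feature
        bx[rng.integers(theta.size)] = 1
    idx = bx.astype(bool)
    knn = KNeighborsClassifier(n_neighbors=5).fit(X_tr[:, idx], y_tr)
    gamma = 1.0 - knn.score(X_val[:, idx], y_val)  # classification error
    return rho * gamma + (1 - rho) * bx.sum() / theta.size, bx
```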

4.3. Third Phase

Within this third phase, the test set is reduced by choosing only the features corresponding to the ones in $BX_b$ (step 12 in Algorithm 2). The reduced test set is then fed into the KNN classifier, and the output quality is assessed using various measures (step 13 in Algorithm 2). Algorithm 2 shows the full procedure of QCham.
Algorithm 2. Procedures of QCham
1. Input: number of iterations ($t_{max}$), tested dataset with $D$ features, number of solutions ($N$), and other parameters
First Stage
2. Construct the training and testing sets, which represent 70% and 30% of the data, respectively.
3. Apply Equation (13) to construct the population $X$.
Second Stage
4. $t = 1$
5. While ($t < t_{max}$)
6. Use Equation (14) to obtain the binary version of each $X_j$.
7. Calculate the fitness value of each $X_j$ on the training samples, as in Equation (15).
8. Allocate the best solution $X_b$.
9. Use Equations (3)–(9) to update $X$.
10. $t = t + 1$
11. EndWhile
Third Stage
12. Remove the irrelevant features from the testing set using $X_b$.
13. Assess the efficiency of QCham using different measures.
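Putting the pieces together, a hedged end-to-end sketch of Algorithm 2 is shown below; it reuses the `fitness` and `rotate_qbits` helpers sketched above, uses scikit-learn's train_test_split for the 70/30 division, and leaves the Cham operators of Equations (3)–(9) as a placeholder, so it is an outline under our assumptions rather than the authors' implementation.

```python
import numpy as np
from sklearn.model_selection import train_test_split

def qcham_fs(X, y, n_sol=20, t_max=30, seed=0):
    """QCham feature selection (sketch of Algorithm 2)."""
    rng = np.random.default_rng(seed)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=seed)
    D = X.shape[1]
    thetas = rng.random((n_sol, D)) * 2 * np.pi   # Equation (13): one angle per feature
    best_fit, best_bx = np.inf, None
    for t in range(1, t_max + 1):
        for j in range(n_sol):
            # fitness is computed on the training samples (Algorithm 2, step 7)
            fit, bx = fitness(thetas[j], X_tr, y_tr, X_tr, y_tr)
            if fit < best_fit:
                best_fit, best_bx = fit, bx
            thetas[j] = rotate_qbits(thetas[j], bx, best_bx, ge_best=fit >= best_fit)
        # the Cham operators (Equations (3)-(9)) would also update thetas here
    return best_bx.astype(bool)  # mask used to reduce the testing set (step 12)
```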

5. Experimental Results

This section assesses the performance of the QCham approach on eighteen benchmark datasets. In addition, the results are compared with other methods, including LSHADE [57], self-adaptive differential evolution (SaDE) [58], the teaching-learning-based optimizer (TLBO) [59], LSHADE with semi-parameter adaptation (LSPACMA) [60], the grey wolf optimizer (GWO) [61], the genetic algorithm (GA), and the whale optimizer (WOA) [62].

5.1. Description of Dataset and Setting of Parameter

Table 2 lists the descriptions of the eighteen UCI datasets. These datasets were obtained from several real-world applications and have distinct characteristics, as shown in the table. Cham, LSHADE, SaDE, LSPACMA, GWO, GA, TLBO, and WOA are compared with the developed QCham. Each algorithm's parameters are set according to its original implementation. The number of iterations and the number of chameleons are parameters common to all these approaches, and we set them to 30 and 20, respectively. Furthermore, each FS technique is run 25 times to ensure a fair comparison. We used six criteria to measure the performance: the average and standard deviation (Std) of the fitness, the best (MIN) and worst (MAX) fitness, the accuracy (Acc), and the CPU time(s).
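A small sketch of this evaluation protocol, under the assumption that each run returns its final fitness value, is given below; the statistics match those reported in Tables 3–5.

```python
import numpy as np

def evaluate_runs(run_fn, n_runs=25):
    """Repeat an FS run n_runs times and summarize the fitness statistics."""
    fits = np.array([run_fn(seed=s) for s in range(n_runs)])
    return {"avg": fits.mean(), "std": fits.std(),
            "min": fits.min(), "max": fits.max()}
```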

5.2. Experimental Results and Discussion

In this section, the results of QCham and the compared methods are discussed. Table 3 shows the FS outcomes for all techniques in terms of the average fitness value. We can see from the table that the proposed QCham ranked first in thirteen of the 18 datasets (S1–S4, S6–S8, S10, S12, S14, and S16–S18). The WOA approach came second, achieving the best value in two datasets (S11 and S15), while Cham, GWO, and LSPACMA each achieved the best value in only one dataset (S9, S5, and S13, respectively). Moreover, Figure 2 shows the average of each algorithm over all 18 datasets; the developed QCham has the smallest average fitness value among the compared algorithms, followed by the traditional Cham and WOA, which provided better fitness values than the other methods.
Table 4 shows the best fitness values. We can observe from this table that the proposed QCham approach produced competitive results compared with TLBO. The QCham had the best values in twelve datasets, while the Cham had the best results in four datasets, followed by WOA, which had the smallest value in three datasets. This indicates that the developed method still provides better fitness values than the others.
In terms of the worst fitness values, shown in Table 5, the proposed QCham outperformed the other examined approaches, achieving the best results in 77% of all datasets and competitive results in the remaining ones. The LSHADE, LSPACMA, GWO, and TLBO came second, third, fourth, and fifth, respectively.
Table 6 also lists the average number of relevant selected features. As seen in Table 6, the QCham selected the fewest features in nearly 62% of the tested datasets. The LSHADE, SaDE, and LSPACMA ranked second, selecting fewer features than the remaining methods, whereas GA selected the most features among the tested datasets. The same observation can be made from Figure 3.
Table 7 summarizes the accuracy results of the developed QCham and the compared approaches. The results of this test suggest that the proposed QCham is better than the alternatives: it had the best accuracy in 44% of all datasets and the same accuracy as the other approaches in a further 11%. This shows that the QCham can select the most relevant features while maintaining the classification accuracy. TLBO came second, followed by LSPACMA, GA, Cham, and LSHADE. In addition, Figure 4 shows the average accuracy over the eighteen datasets; QCham provides better results than the others, followed by TLBO, GWO, and WOA, respectively.
Table 8 shows the average CPU time(s) of the developed method and the competing algorithms. It can be observed from these results that the QCham requires the least CPU time on most of the tested datasets, in comparison with the other FS methods. However, the traditional Cham provides better results than the other methods on S5, S6, S12, S13, S15, and S17. In addition, GA has the best CPU time for only one dataset, namely S9.

5.3. Comparison with Other FS Models

In this section, the findings of the developed QCham are compared with commonly used FS models that rely on MH techniques. These include the enhanced GWO (EGWO) [63], the binary bat algorithm (BBA) [64], BGOA [65], PSO, two binary GWO algorithms called bGWO1 and bGWO2 [47], AGWO [44], biogeography-based optimization (BBO) [66], the enhanced crow search algorithm (ECSA) [67], and the satin bird optimizer (SBO) [63]. The classification accuracies of the developed QCham and the other approaches are shown in Table 9. These results show that the developed QCham has a high ability to enhance the classification accuracy across all the datasets examined.
In conclusion, it can be noticed that no single algorithm provides the best performance across all the tested datasets, which is consistent with the no free lunch theorem. Nevertheless, the earlier findings imply that applying the proposed QCham approach considerably enhances the capacity to tackle feature selection challenges. The Cham is significantly enhanced when the QBO operators are used in its structure. As a result, the QCham can be regarded as an effective and efficient optimization technique for tackling the feature selection problem, since it is able to discover the feasible regions that contain feasible solutions, as observed from the quality of the selected features and their influence on the classification accuracy.

6. Conclusions and Future Work

This research presented a novel variant of the chameleon swarm algorithm (Cham) that includes the operators of quantum-based optimization (QBO) in the Cham's exploration phase, providing an efficient feature selection (FS) optimizer. The Cham has limitations when treating high-dimensional optimization problems, for which the hybrid variant QCham was proposed. To complete the FS optimization objective, the proposed QCham was applied to eighteen different real-world datasets and compared with the basic Cham, LSHADE, SaDE, LSPACMA, GWO, GA, TLBO, and WOA. According to the comparisons, the QCham obtained the best fitness in 8 out of 18 datasets and the lowest Std value in 4 out of 18, and it selected the fewest features in 50% of the datasets. These results show that the QCham is able to choose the relevant features while preserving the classification quality. The proposed QCham also obtained the maximum accuracy averaged over all datasets. Additionally, the QBO operators are crucial in enhancing the exploration phase of the original Cham model.
In future work, the developed QCham will be tested on a number of applications, including time-series forecasting, parameter estimation, and image segmentation.

Author Contributions

M.A.E. and A.H.E.: conceptualization, supervision, methodology, formal analysis, resources, data curation, writing—original draft preparation, writing—review and editing, project administration, and funding acquisition. M.A., S.A., N.A., and A.F.: writing—review and editing. All authors have read and agreed to the published version of the manuscript.

Funding

The authors extend their appreciation to the Deputyship for Research and Innovation, Ministry of Education in Saudi Arabia, for funding this research work, through project number IFP-IMSIU202209.

Data Availability Statement

The data are collected from https://archive.ics.uci.edu/ml/index.php, accessed on 18 August 2022.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Zhao, W.; Wang, C.; Nakahira, Y. Medical Application on Internet of Things. In Proceedings of the IET International Conference on Communication Technology and Application (ICCTA 2011), Beijing, China, 14–16 October 2011; pp. 660–665.
2. Javaid, M.; Khan, I.H. Internet of Things (IoT) enabled healthcare helps to take the challenges of COVID-19 Pandemic. J. Oral Biol. Craniofacial Res. 2021, 11, 209–214.
3. Abd Elaziz, M.; Mabrouk, A.; Dahou, A.; Chelloug, S.A. Medical Image Classification Utilizing Ensemble Learning and Levy Flight-Based Honey Badger Algorithm on 6G-Enabled Internet of Things. Comput. Intell. Neurosci. 2022, 2022, 5830766.
4. Boursianis, A.D.; Papadopoulou, M.S.; Diamantoulakis, P.; Liopa-Tsakalidi, A.; Barouchas, P.; Salahas, G.; Karagiannidis, G.; Wan, S.; Goudos, S.K. Internet of Things (IoT) and Agricultural Unmanned Aerial Vehicles (UAVs) in smart farming: A comprehensive review. Internet Things 2022, 18, 100187.
5. Salih, K.O.M.; Rashid, T.A.; Radovanovic, D.; Bacanin, N. A Comprehensive Survey on the Internet of Things with the Industrial Marketplace. Sensors 2022, 22, 730.
6. Haghnegahdar, L.; Joshi, S.S.; Dahotre, N.B. From IoT-based cloud manufacturing approach to intelligent additive manufacturing: Industrial Internet of Things—an overview. Int. J. Adv. Manuf. Technol. 2022, 119, 1461–1478.
7. Abichandani, P.; Sivakumar, V.; Lobo, D.; Iaboni, C.; Shekhar, P. Internet-of-Things Curriculum, Pedagogy, and Assessment for STEM Education: A Review of Literature. IEEE Access 2022, 10, 38351–38369.
8. Tubishat, M.; Idris, N.; Shuib, L.; Abushariah, M.A.M.; Mirjalili, S. Improved Salp Swarm Algorithm based on opposition based learning and novel local search algorithm for feature selection. Expert Syst. Appl. 2020, 145, 113122.
9. Hancer, E.; Xue, B.; Karaboga, D.; Zhang, M. A binary ABC algorithm based on advanced similarity scheme for feature selection. Appl. Soft Comput. J. 2015, 36, 334–348.
10. Ewees, A.A.; Abualigah, L.; Yousri, D.; Algamal, Z.Y.; Al-qaness, M.A.A.; Ibrahim, R.A.; Abd Elaziz, M. Improved Slime Mould Algorithm based on Firefly Algorithm for feature selection: A case study on QSAR model. Eng. Comput. 2021, 38, 2407–2421.
11. Dahou, A.; Elaziz, M.A.; Zhou, J.; Xiong, S. Arabic Sentiment Classification Using Convolutional Neural Network and Differential Evolution Algorithm. Comput. Intell. Neurosci. 2019, 2019, 2537689.
12. Al-qaness, M.A.A. Device-free human micro-activity recognition method using WiFi signals. Geo Spat. Inf. Sci. 2019, 22, 128–137.
13. Sahlol, A.T.; Yousri, D.; Ewees, A.A.; Al-qaness, M.A.A.; Damasevicius, R.; Elaziz, M.A. COVID-19 image classification using deep features and fractional-order marine predators algorithm. Sci. Rep. 2020, 10, 15364.
14. Benazzouz, A.; Guilal, R.; Amirouche, F.; Hadj Slimane, Z.E. EMG Feature Selection for Diagnosis of Neuromuscular Disorders. In Proceedings of the 2019 International Conference on Networking and Advanced Systems (ICNAS), Annaba, Algeria, 26–27 June 2019; pp. 1–5.
15. Nobile, M.S.; Tangherloni, A.; Rundo, L.; Spolaor, S.; Besozzi, D.; Mauri, G.; Cazzaniga, P. Computational Intelligence for Parameter Estimation of Biochemical Systems. In Proceedings of the 2018 IEEE Congress on Evolutionary Computation (CEC), Brisbane, Australia, 10–15 June 2018; pp. 1–8.
16. Cheng, S.; Ma, L.; Lu, H.; Lei, X.; Shi, Y. Evolutionary computation for solving search-based data analytics problems. Artif. Intell. Rev. 2021, 54, 1321–1348.
17. Rundo, L.; Tangherloni, A.; Cazzaniga, P.; Nobile, M.S.; Russo, G.; Gilardi, M.C.; Vitabile, S.; Mauri, G.; Besozzi, D.; Militello, C. A novel framework for MR image segmentation and quantification by using MedGA. Comput. Methods Programs Biomed. 2019, 176, 159–172.
18. Ibrahim, R.A.; Ewees, A.A.; Oliva, D.; Elaziz, M.A.; Lu, S. Improved salp swarm algorithm based on particle swarm optimization for feature selection. J. Ambient Intell. Humaniz. Comput. 2018, 10, 1–15.
19. Ma, J.; Bi, Z.; Ting, T.O.; Hao, S.; Hao, W. Comparative performance on photovoltaic model parameter identification via bio-inspired algorithms. Sol. Energy 2016, 132, 606–616.
20. Chang, F.J.; Chang, Y.T. Adaptive neuro-fuzzy inference system for prediction of water level in reservoir. Adv. Water Resour. 2006, 29, 1–10.
21. Khanduzi, R.; Maleki, H.R.; Akbari, R. Two novel combined approaches based on TLBO and PSO for a partial interdiction/fortification problem using capacitated facilities and budget constraint. Soft Comput. 2018, 22, 5901–5919.
22. Elsheikh, A.H.; Muthuramalingam, T.; Abd Elaziz, M.; Ibrahim, A.M.M.; Showaib, E.A. Minimization of fume emissions in laser cutting of polyvinyl chloride sheets using genetic algorithm. Int. J. Environ. Sci. Technol. 2022, 19, 6331–6344.
23. Babikir, H.A.; Elaziz, M.A.; Elsheikh, A.H.; Showaib, E.A.; Elhadary, M.; Wu, D.; Liu, Y. Noise prediction of axial piston pump based on different valve materials using a modified artificial neural network model. Alexandria Eng. J. 2019, 58, 1077–1087.
24. Elmaadawy, K.; Elaziz, M.A.; Elsheikh, A.H.; Moawad, A.; Liu, B.; Lu, S. Utilization of random vector functional link integrated with manta ray foraging optimization for effluent prediction of wastewater treatment plant. J. Environ. Manag. 2021, 298, 113520.
25. Khoshaim, A.B.; Moustafa, E.B.; Bafakeeh, O.T.; Elsheikh, A.H. An Optimized Multilayer Perceptrons Model Using Grey Wolf Optimizer to Predict Mechanical and Microstructural Properties of Friction Stir Processed Aluminum Alloy Reinforced by Nanoparticles. Coatings 2021, 11, 1476.
26. Abd Elaziz, M.; Elsheikh, A.H.; Oliva, D.; Abualigah, L.; Lu, S.; Ewees, A.A. Advanced Metaheuristic Techniques for Mechanical Design Problems: Review. Arch. Comput. Methods Eng. 2022, 29, 695–716.
27. Shehabeldeen, T.A.; Elaziz, M.A.; Elsheikh, A.H.; Zhou, J. Modeling of friction stir welding process using adaptive neuro-fuzzy inference system integrated with harris hawks optimizer. J. Mater. Res. Technol. 2019, 8, 5882–5892.
28. Ekinci, S.; Hekimoğlu, B.; Izci, D. Opposition based Henry gas solubility optimization as a novel algorithm for PID control of DC motor. Eng. Sci. Technol. Int. J. 2021, 24, 331–342.
29. Shehabeldeen, T.A.; Elaziz, M.A.; Elsheikh, A.H.; Hassan, O.F.; Yin, Y.; Ji, X.; Shen, X.; Zhou, J. A Novel Method for Predicting Tensile Strength of Friction Stir Welded AA6061 Aluminium Alloy Joints based on Hybrid Random Vector Functional Link and Henry Gas Solubility Optimization. IEEE Access 2020, 30, 188–193.
30. Askari, Q.; Younas, I.; Saeed, M. Political Optimizer: A novel socio-inspired meta-heuristic for global optimization. Knowl. Based Syst. 2020, 195, 105709.
31. Too, J.; Abdullah, A.R.; Saad, N.M. A New Quadratic Binary Harris Hawk Optimization for Feature Selection. Electronics 2019, 8, 1130.
32. Abd Elaziz, M.; Dahou, A.; Alsaleh, N.A.; Elsheikh, A.H.; Saba, A.I.; Ahmadein, M. Boosting COVID-19 Image Classification Using MobileNetV3 and Aquila Optimizer Algorithm. Entropy 2021, 23, 1383.
33. Askarzadeh, A. A novel metaheuristic method for solving constrained engineering optimization problems: Crow search algorithm. Comput. Struct. 2016, 169, 1–12.
34. Songyang, L.; Haipeng, Y.; Miao, W. Cat swarm optimization algorithm based on the information interaction of subgroup and the top-N learning strategy. J. Intell. Syst. 2022, 31, 489–500.
35. Zhao, W.; Wang, L.; Zhang, Z. Artificial ecosystem-based optimization: A novel nature-inspired meta-heuristic algorithm. Neural Comput. Appl. 2020, 32, 9383–9425.
36. Said, M.; El-Rifaie, A.M.; Tolba, M.A.; Houssein, E.H.; Deb, S. An Efficient Chameleon Swarm Algorithm for Economic Load Dispatch Problem. Mathematics 2021, 9, 2770.
37. Izci, D.; Ekinci, S.; Kayri, M.; Eker, E. A novel improved arithmetic optimization algorithm for optimal design of PID controlled and Bode's ideal transfer function based automobile cruise control system. Evol. Syst. 2022, 13, 453–468.
38. Rizk-Allah, R.M.; El-Hameed, M.A.; El-Fergany, A.A. Model parameters extraction of solid oxide fuel cells based on semi-empirical and memory-based chameleon swarm algorithm. Int. J. Energy Res. 2021, 45, 21435–21450.
39. Mostafa, R.R.; Ewees, A.A.; Ghoniem, R.M.; Abualigah, L.; Hashim, F.A. Boosting chameleon swarm algorithm with consumption AEO operator for global optimization and feature selection. Knowl. Based Syst. 2022, 246, 108743.
40. Umamageswari, A.; Bharathiraja, N.; Irene, D.S. A Novel Fuzzy C-Means based Chameleon Swarm Algorithm for Segmentation and Progressive Neural Architecture Search for Plant Disease Classification. ICT Express 2021, in press.
41. Al-Othman, A.K.; Ahmed, N.A.; Al-Fares, F.S.; AlSharidah, M.E. Parameter Identification of PEM Fuel Cell Using Quantum-Based Optimization Method. Arab. J. Sci. Eng. 2015, 40, 2619–2628.
42. Agrawal, R.K.; Kaur, B.; Sharma, S. Quantum based Whale Optimization Algorithm for wrapper feature selection. Appl. Soft Comput. 2020, 89, 106092.
43. Ho, S.L.; Yang, S.; Ni, G.; Huang, J. A Quantum-Based Particle Swarm Optimization Algorithm Applied to Inverse Problems. IEEE Trans. Magn. 2013, 49, 2069–2072.
44. Chaudhuri, A.; Sahu, T.P. Feature selection using Binary Crow Search Algorithm with time varying flight length. Expert Syst. Appl. 2021, 168, 114288.
45. Sadeghian, Z.; Akbari, E.; Nematzadeh, H. A hybrid feature selection method based on information theory and binary butterfly optimization algorithm. Eng. Appl. Artif. Intell. 2021, 97, 104079.
46. Maleki, N.; Zeinali, Y.; Niaki, S.T.A. A k-NN method for lung cancer prognosis with the use of a genetic algorithm for feature selection. Expert Syst. Appl. 2021, 164, 113981.
47. Song, X.; Zhang, Y.; Gong, D.; Sun, X. Feature selection using bare-bones particle swarm optimization with mutual information. Pattern Recognit. 2021, 112, 107804.
48. Sathiyabhama, B.; Kumar, S.U.; Jayanthi, J.; Sathiya, T.; Ilavarasi, A.K.; Yuvarajan, V.; Gopikrishna, K. A novel feature selection framework based on grey wolf optimizer for mammogram image analysis. Neural Comput. Appl. 2021, 33, 14583–14602.
49. Aljarah, I.; Habib, M.; Faris, H.; Al-Madi, N.; Heidari, A.A.; Mafarja, M.; Elaziz, M.A.; Mirjalili, S. A dynamic locality multi-objective salp swarm algorithm for feature selection. Comput. Ind. Eng. 2020, 147, 106628.
50. Dhiman, G.; Oliva, D.; Kaur, A.; Singh, K.K.; Vimal, S.; Sharma, A.; Cengiz, K. BEPO: A novel binary emperor penguin optimizer for automatic feature selection. Knowl. Based Syst. 2021, 211, 106560.
51. Amini, F.; Hu, G. A two-layer feature selection method using Genetic Algorithm and Elastic Net. Expert Syst. Appl. 2021, 166, 114072.
52. Neggaz, N.; Houssein, E.H.; Hussain, K. An efficient henry gas solubility optimization for feature selection. Expert Syst. Appl. 2020, 152, 113364.
53. Rostami, M.; Berahmand, K.; Nasiri, E.; Forouzandeh, S. Review of swarm intelligence-based feature selection methods. Eng. Appl. Artif. Intell. 2021, 100, 104210.
54. Agrawal, P.; Abutarboush, H.F.; Ganesh, T.; Mohamed, A.W. Metaheuristic Algorithms on Feature Selection: A Survey of One Decade of Research (2009–2019). IEEE Access 2021, 9, 26766–26791.
55. Braik, M.S. Chameleon Swarm Algorithm: A bio-inspired optimizer for solving engineering design problems. Expert Syst. Appl. 2021, 174, 114685.
56. Srikanth, K.; Panwar, L.K.; Panigrahi, B.; Herrera-Viedma, E.; Sangaiah, A.K.; Wang, G.-G. Meta-heuristic framework: Quantum inspired binary grey wolf optimizer for unit commitment problem. Comput. Electr. Eng. 2018, 70, 243–260.
57. Tanabe, R.; Fukunaga, A.S. Improving the Search Performance of SHADE Using Linear Population Size Reduction. In Proceedings of the 2014 IEEE Congress on Evolutionary Computation (CEC), Beijing, China, 6–11 July 2014; pp. 1658–1665.
58. Qin, A.K.; Suganthan, P.N. Self-Adaptive Differential Evolution Algorithm for Numerical Optimization. In Proceedings of the 2005 IEEE Congress on Evolutionary Computation, Edinburgh, UK, 2–4 September 2005; Volume 2, pp. 1785–1791.
59. Gill, H.S.; Khehra, B.S.; Singh, A.; Kaur, L. Teaching-learning based optimization algorithm to minimize cross entropy for selecting multilevel threshold values. Egypt. Inform. J. 2018, 20, 11–25.
60. Mohamed, A.W.; Hadi, A.A.; Fattouh, A.M.; Jambi, K.M. LSHADE with Semi-Parameter Adaptation Hybrid with CMA-ES for Solving CEC 2017 Benchmark Problems. In Proceedings of the 2017 IEEE Congress on Evolutionary Computation (CEC), San Sebastian, Spain, 5–8 June 2017; pp. 145–152.
61. Mirjalili, S.; Mirjalili, S.M.; Lewis, A. Grey Wolf Optimizer. Adv. Eng. Softw. 2014, 69, 46–61.
62. Mirjalili, S.; Lewis, A. The Whale Optimization Algorithm. Adv. Eng. Softw. 2016, 95, 51–67.
63. Arora, S.; Singh, H.; Sharma, M.; Sharma, S.; Anand, P. A New Hybrid Algorithm Based on Grey Wolf Optimization and Crow Search Algorithm for Unconstrained Function Optimization and Feature Selection. IEEE Access 2019, 7, 26343–26361.
64. Nakamura, R.Y.M.; Pereira, L.A.M.; Costa, K.A.; Rodrigues, D.; Papa, J.P.; Yang, X.-S. BBA: A Binary Bat Algorithm for Feature Selection. In Proceedings of the 2012 25th SIBGRAPI Conference on Graphics, Patterns and Images, Washington, DC, USA, 22–25 August 2012; pp. 291–297.
65. Mafarja, M.; Aljarah, I.; Heidari, A.A.; Hammouri, A.I.; Faris, H.; Al-Zoubi, A.M.; Mirjalili, S. Evolutionary Population Dynamics and Grasshopper Optimization approaches for feature selection problems. Knowl. Based Syst. 2018, 145, 25–45.
66. Saremi, S.; Mirjalili, S.; Lewis, A. Biogeography-based optimisation with chaos. Neural Comput. Appl. 2014, 25, 1077–1097.
67. Ouadfel, S.; Abd Elaziz, M. Enhanced Crow Search Algorithm for Feature Selection. Expert Syst. Appl. 2020, 159, 113572.
Figure 1. Procedures of the QCham-based FS technique.
Figure 2. Fitness value mean of all the investigated datasets.
Figure 3. Average of selected features using QCham.
Figure 4. Average accuracy of the models for all datasets studied.
Table 1. Predefined values of $\Delta\theta_j^i$.

$x_t^{j,i}$  $x_b$  $f(x_t^j) \geq f(x_b)$  $\Delta\theta_j^i$
0  0  False  0
0  1  False  0.01π
1  0  False  −0.01π
1  1  False  0
0  0  True   0
0  1  True   0
1  0  True   0
1  1  True   0
Table 2. Description of datasets.

Code  Dataset       No. Instances  No. Features  No. Classes
S1    Breastcancer  699    9    2
S2    BreastEW      569    30   2
S3    CongressEW    435    16   2
S4    Exactly       1000   13   2
S5    Exactly2      1000   13   2
S6    HeartEW       270    13   2
S7    IonosphereEW  351    34   2
S8    KrvskpEW      3196   36   2
S9    Lymphography  148    18   2
S10   M-of-n        1000   13   2
S11   PenglungEW    73     325  2
S12   SonarEW       208    60   2
S13   SpectEW       267    22   2
S14   tic-tac-toe   958    9    2
S15   Vote          300    16   2
S16   WaveformEW    5000   40   3
S17   WaterEW       178    13   3
S18   Zoo           101    16   6
Table 3. Results of fitness values for QCham and others.

Dataset  QCham   Cham    LSHADE  SaDE    LSPACMA  GWO     GA      TLBO    WOA
S1       0.0555  0.0669  0.0833  0.0917  0.1067   0.0679  0.1018  0.0925  0.0738
S2       0.0638  0.0680  0.1254  0.1114  0.1342   0.0809  0.1280  0.0941  0.0699
S3       0.0373  0.0613  0.0655  0.1063  0.0575   0.1075  0.1018  0.0902  0.0738
S4       0.0467  0.0842  0.2504  0.3853  0.2977   0.1415  0.1923  0.2588  0.1598
S5       0.2357  0.2809  0.2258  0.2285  0.2223   0.1998  0.3306  0.3158  0.2170
S6       0.1457  0.1617  0.2019  0.2417  0.2519   0.2038  0.1958  0.2478  0.2160
S7       0.0352  0.0594  0.1160  0.1148  0.1561   0.0817  0.1206  0.1573  0.0993
S8       0.0784  0.0908  0.3904  0.3658  0.3584   0.0955  0.1148  0.1132  0.0971
S9       0.0938  0.0929  0.2567  0.2516  0.2167   0.1564  0.1818  0.1889  0.1287
S10      0.0491  0.0666  0.2118  0.2706  0.3197   0.0998  0.1179  0.1998  0.1176
S11      0.0546  0.1471  0.3200  0.3500  0.2474   0.0489  0.2022  0.0418  0.0408
S12      0.0571  0.0886  0.2833  0.3333  0.3917   0.0965  0.0890  0.1471  0.0673
S13      0.1568  0.1661  0.1630  0.2417  0.1370   0.2353  0.2047  0.2027  0.2336
S14      0.2148  0.2462  0.2635  0.2992  0.3208   0.2548  0.2279  0.2659  0.2572
S15      0.0568  0.0909  0.0567  0.0850  0.1142   0.0533  0.1052  0.0881  0.0457
S16      0.2568  0.2843  0.3574  0.4094  0.4381   0.3026  0.3075  0.3137  0.2996
S17      0.0267  0.0477  0.1833  0.1583  0.1597   0.0571  0.0878  0.0956  0.0699
S18      0.0196  0.0234  0.3333  0.0833  0.2333   0.0660  0.0563  0.0515  0.0533
Table 4. Best fitness values for QCham and others.

Dataset  QCham   Cham    LSHADE  SaDE    LSPACMA  GWO     GA      TLBO    WOA
S1       0.0526  0.0590  0.0833  0.0917  0.0967   0.0573  0.0830  0.0830  0.0590
S2       0.0458  0.0470  0.1254  0.0947  0.1342   0.0607  0.1107  0.1107  0.0491
S3       0.0373  0.0476  0.0655  0.0897  0.0575   0.0664  0.0769  0.0769  0.0560
S4       0.0462  0.0462  0.2048  0.3853  0.2977   0.0462  0.0615  0.0615  0.0462
S5       0.2327  0.2558  0.2258  0.2118  0.2223   0.1967  0.3032  0.3032  0.2102
S6       0.1205  0.1462  0.2019  0.2352  0.2278   0.1628  0.1692  0.1692  0.1731
S7       0.0176  0.0362  0.1160  0.1099  0.1493   0.0674  0.1048  0.1048  0.0742
S8       0.0645  0.0781  0.3866  0.3658  0.3254   0.0683  0.1003  0.1003  0.0660
S9       0.0556  0.0556  0.2500  0.2448  0.2167   0.1065  0.1322  0.1322  0.0471
S10      0.0462  0.0462  0.2118  0.2557  0.3135   0.0538  0.0615  0.0615  0.0615
S11      0.0071  0.0714  0.3200  0.3333  0.1872   0.0203  0.1982  0.1982  0.0031
S12      0.0300  0.0233  0.2833  0.3333  0.3333   0.0648  0.0750  0.0750  0.0481
S13      0.1394  0.1121  0.1630  0.2278  0.1370   0.1939  0.1773  0.1773  0.2182
S14      0.2120  0.2243  0.2635  0.2974  0.3208   0.2307  0.2120  0.2120  0.2354
S15      0.0275  0.0700  0.0567  0.0683  0.0917   0.0338  0.0625  0.0625  0.0363
S16      0.2354  0.2562  0.3574  0.4094  0.4290   0.2847  0.2951  0.2951  0.2730
S17      0.0154  0.0308  0.1833  0.1444  0.1194   0.0385  0.0692  0.0692  0.0462
S18      0.0188  0.0125  0.3333  0.0667  0.2333   0.0438  0.0438  0.0438  0.0375
Table 5. Worst fitness values for QCham and others.

Dataset  QCham   Cham    LSHADE  SaDE    LSPACMA  GWO     GA      TLBO    WOA
S1       0.0655  0.0719  0.0833  0.0917  0.1167   0.0818  0.1228  0.1023  0.0976
S2       0.0819  0.0895  0.1254  0.1281  0.1342   0.1011  0.1365  0.1065  0.0856
S3       0.0373  0.1080  0.0655  0.1230  0.0575   0.1431  0.1330  0.1017  0.1037
S4       0.0538  0.2524  0.2960  0.3853  0.2977   0.2372  0.3096  0.2845  0.2822
S5       0.2775  0.3000  0.2258  0.2452  0.2223   0.2121  0.3559  0.3469  0.3117
S6       0.1731  0.1859  0.2019  0.2481  0.2759   0.2692  0.2090  0.2795  0.2513
S7       0.0489  0.0860  0.1160  0.1197  0.1629   0.0996  0.1389  0.1894  0.1279
S8       0.0894  0.1130  0.3942  0.3658  0.3915   0.1215  0.1340  0.1399  0.1160
S9       0.1254  0.1322  0.2633  0.2583  0.2167   0.2052  0.2278  0.2600  0.2467
S10      0.0705  0.1007  0.2118  0.2855  0.3258   0.1785  0.2106  0.2582  0.1836
S11      0.1348  0.2252  0.3200  0.3667  0.3077   0.0905  0.2065  0.0449  0.0852
S12      0.0862  0.1157  0.2833  0.3333  0.4500   0.1243  0.1081  0.1705  0.0867
S13      0.1939  0.2167  0.1630  0.2556  0.1370   0.2652  0.2227  0.2273  0.2379
S14      0.2214  0.2847  0.2635  0.3010  0.3208   0.2924  0.2793  0.3040  0.2917
S15      0.0738  0.1138  0.0567  0.1017  0.1367   0.0975  0.1438  0.1075  0.0950
S16      0.2741  0.3094  0.3574  0.4094  0.4472   0.3215  0.3215  0.3348  0.3229
S17      0.0538  0.0692  0.1833  0.1722  0.2000   0.0712  0.1192  0.1192  0.0865
S18      0.0250  0.0375  0.3333  0.1000  0.2333   0.0866  0.0688  0.0866  0.0804
Table 6. Selected features numbers for all methods.

Dataset  QCham  Cham  LSHADE  SaDE  LSPACMA  GWO  GA  TLBO  WOA
S1443563532
S25959682176
S333441151154
S46874107969
S5365544944
S63672361165
S73945492898
S892111151421291519
S931134381494
S10387549969
S1125593520251072675841
S12133016192024502931
S136947571784
S14354475665
S152832241152
S16521715920341922
S17467655956
S18745548938
Table 7. Accuracy results for all methods.

Dataset  QCham   Cham    LSHADE  SaDE    LSPACMA  GWO     GA      TLBO    WOA
S1       0.9857  0.9689  0.9286  0.9643  0.9429   0.9567  0.9462  0.9729  0.9476
S2       0.9688  0.9592  0.8684  0.9123  0.9035   0.9415  0.9351  0.9573  0.9433
S3       0.9540  0.9552  0.9540  0.9195  0.9655   0.9157  0.9609  0.9655  0.9448
S4       0.9743  0.9723  0.7375  0.6400  0.6700   0.8987  0.8667  1.0000  0.8960
S5       0.7358  0.6370  0.7250  0.7450  0.7300   0.7900  0.7130  0.7507  0.7703
S6       0.8944  0.6676  0.7593  0.7500  0.7593   0.8272  0.8753  0.8877  0.7988
S7       0.9834  0.8148  0.9296  0.9789  0.9437   0.9380  0.9577  0.9859  0.9174
S8       0.9627  0.6291  0.5852  0.5250  0.5594   0.9577  0.9616  0.9547  0.9507
S9       0.9666  0.6400  0.8000  0.8073  0.8333   0.8756  0.8844  0.9337  0.8821
S10      0.9935  0.5713  0.7450  0.7325  0.7100   0.9627  0.9477  0.9990  0.9497
S11      0.8567  1.0000  0.7333  0.6667  0.8846   0.9822  0.8667  0.9511  0.9686
S12      0.9714  0.7976  0.6905  0.6905  0.5357   0.9381  0.9937  0.9794  0.9825
S13      0.8833  0.7556  0.8148  0.7500  0.8519   0.7716  0.8580  0.8531  0.7593
S14      0.7938  0.5828  0.7188  0.7630  0.8750   0.7819  0.8250  0.8354  0.7809
S15      0.9825  0.8283  0.9667  0.9500  0.9083   0.9700  0.9600  0.9711  0.9622
S16      0.7732  0.5707  0.5370  0.5580  0.5170   0.7186  0.7533  0.7530  0.7283
S17      1.0000  0.8486  0.8333  0.9167  0.9861   0.9833  0.9833  1.0000  0.9759
S18      1.0000  0.8452  0.6667  1.0000  0.8333   0.9841  1.0000  1.0000  0.9968
Table 8. CPU time(s) of the QCham and other methods.

Dataset  QCham   Cham    LSHADE   SaDE     LSPACMA  GWO      GA       TLBO     WOA
S1       1.3266  2.5323  3.7597   3.9311   3.7896   4.8045   3.0372   7.1621   3.4947
S2       2.3220  4.4723  3.6052   3.6613   3.6389   3.7984   2.8619   7.1806   3.4250
S3       0.2932  0.4607  3.5721   3.6060   3.6004   3.6473   2.7438   6.4168   3.3492
S4       0.3335  0.5275  3.9773   4.0450   4.0228   4.1293   3.2438   7.9622   3.8586
S5       0.9228  0.4734  3.8583   3.9539   3.9313   4.0432   3.3300   7.2139   3.4962
S6       0.7098  0.4767  3.4766   3.4843   3.4874   3.5319   2.5871   6.7831   3.2491
S7       0.2854  0.4329  3.4925   3.4959   3.4998   3.6391   2.7085   6.7888   3.2704
S8       0.6804  1.2744  8.4271   8.5472   8.4831   14.9635  13.6991  24.1174  12.5058
S9       3.8243  4.4320  3.3085   3.3253   3.3642   3.4318   2.5320   6.5917   3.1419
S10      2.3248  3.5335  4.0457   4.1456   4.0962   4.1284   3.2549   7.8806   3.9806
S11      3.3222  2.4463  3.5204   3.5962   3.5438   6.6027   2.7109   6.5656   3.1659
S12      3.2944  3.0633  3.4390   3.4875   3.4694   3.8382   2.6443   6.4529   3.1784
S13      1.8448  1.4495  3.4457   3.5273   3.5137   3.5143   2.5964   6.7163   3.1486
S14      2.4521  3.5254  4.1032   4.1750   4.1446   4.0571   3.2515   7.9514   3.8750
S15      2.5025  2.4369  3.5085   3.5751   3.5403   3.5511   2.6326   6.7927   3.1668
S16      1.4819  2.4791  21.0411  21.3093  21.1920  30.4215  34.6083  47.0547  28.8056
S17      3.2765  2.4687  4.6537   4.6825   4.6444   3.4650   2.6549   6.4435   3.2474
S18      0.2940  0.4447  0.0196   0.0304   0.9959   0.4348   0.6307   0.4831   0.2730
Table 9. Comparison with FS models.

Dataset  QCham   WOA-T  bGWO2  BBO    ECSA   WOA-R  PSO    AGWO   BBA    EGWO   BGOA   SBO
S1       0.9857  0.959  0.975  0.962  0.972  0.957  0.967  0.960  0.937  0.961  0.969  0.967
S2       0.9688  0.949  0.935  0.945  0.958  0.950  0.933  0.934  0.931  0.947  0.96   0.942
S3       0.9540  0.914  0.776  0.936  0.966  0.910  0.688  0.935  0.872  0.943  0.953  0.950
S4       0.9743  0.739  0.75   0.754  1      0.763  0.73   0.757  0.61   0.753  0.946  0.734
S5       0.7358  0.699  0.776  0.692  0.767  0.690  0.787  0.695  0.628  0.698  0.76   0.709
S6       0.8944  0.765  0.7    0.782  0.83   0.763  0.744  0.797  0.754  0.761  0.826  0.792
S7       0.9834  0.884  0.963  0.880  0.931  0.880  0.921  0.893  0.877  0.863  0.883  0.898
S9       0.9627  0.896  0.584  0.80   0.865  0.901  0.584  0.791  0.701  0.766  0.815  0.818
S10      0.9666  0.778  0.729  0.880  1      0.759  0.737  0.878  0.722  0.870  0.979  0.863
S11      0.9935  0.838  0.822  0.816  0.921  0.860  0.822  0.854  0.795  0.756  0.861  0.843
S12      0.8567  0.736  0.938  0.871  0.926  0.712  0.928  0.882  0.844  0.861  0.895  0.894
S13      0.9714  0.861  0.834  0.798  0.847  0.857  0.819  0.813  0.8    0.804  0.803  0.798
S14      0.8833  0.792  0.727  0.768  0.842  0.778  0.735  0.762  0.665  0.771  0.951  0.768
S15      0.7938  0.736  0.92   0.917  0.96   0.739  0.904  0.92   0.851  0.902  0.729  0.934
S17      0.9825  0.935  0.92   0.966  0.985  0.932  0.933  0.957  0.919  0.966  0.979  0.968
S18      0.7732  0.710  0.879  0.937  0.983  0.712  0.861  0.968  0.874  0.968  0.99   0.968
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
