# A Fuzzy Classifier with Feature Selection Based on the Gravitational Search Algorithm


## Abstract


## 1. Introduction

The main contributions of this paper are as follows:

- A new technique for generating a fuzzy-rule-based classifier.
- A method that selects a compact and efficient subset of features.
- A new method of tuning fuzzy-rule-based classifier parameters.
- A statistical comparison of the results achieved by the fuzzy-rule-based classifiers generated by our technique with those of two state-of-the-art learning algorithms.

## 2. Related Work

#### 2.1. Fuzzy Classifier Design Using Metaheuristics

#### 2.2. Feature Selection

## 3. Materials and Methods

#### 3.1. Fuzzy Classifier

Let **X** = ${x}_{1}$ × ${x}_{2}$ × … × ${x}_{n}$ ⊆ ℜ^{n} be an n-dimensional feature space. The fuzzy classifier assigns a class label to a point **x** in the input feature space with a calculable degree of confidence. In each rule, ${A}_{kj}$ is a fuzzy term that characterizes the k-th feature in the j-th rule (k = 1, …, n), and ${c}_{j}$ is the consequent class. **S** = (${s}_{1}$, ${s}_{2}$, …, ${s}_{n}$) is the binary vector of features: ${s}_{k}$ = 1 indicates the presence and ${s}_{k}$ = 0 the absence of the k-th feature in the classifier. ${\mu}_{jk}({x}_{pk})$ denotes the value of the membership function of term ${A}_{jk}$ at point ${x}_{pk}$.
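A minimal sketch of single-winner fuzzy-rule inference with Gaussian terms and a binary feature mask **S** (the function names, the product t-norm, and the winner-takes-all rule are illustrative assumptions, not the authors' exact formulation):

```python
import math

def gauss(x, b, c):
    """Symmetric Gaussian membership: peak at b, scatter c."""
    return math.exp(-((x - b) ** 2) / (2.0 * c ** 2))

def classify(x, rules, mask):
    """Assign x the class of the rule with the highest firing strength.

    rules: list of (terms, class_label), where terms[k] = (b, c) for feature k;
    mask:  binary feature vector S -- masked-out features are ignored.
    """
    best_class, best_strength = None, -1.0
    for terms, label in rules:
        # product t-norm over the active features only
        strength = 1.0
        for k, (b, c) in enumerate(terms):
            if mask[k]:
                strength *= gauss(x[k], b, c)
        if strength > best_strength:
            best_class, best_strength = label, strength
    return best_class, best_strength

rules = [([(0.0, 1.0), (0.0, 1.0)], "A"),
         ([(3.0, 1.0), (3.0, 1.0)], "B")]
label, conf = classify([2.9, 3.2], rules, mask=[1, 1])
# the point lies near the peaks of rule "B"
```

The firing strength of the winning rule also serves as the degree of confidence mentioned above.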

#### 3.2. Performance Measures

Let f(${x}_{p}$; **θ**, **S**) be the fuzzy classifier output with fuzzy-term parameters **θ** and feature vector **S** at point ${x}_{p}$. Classifier learning then reduces to finding the vectors **S** and **θ** = (θ^{1}, θ^{2}, …, θ^{D}) that maximize classification accuracy, where θ^{i}_{min} and θ^{i}_{max} are the lower and upper boundaries of the domain of each parameter θ^{i}, respectively. This problem is NP-hard; in this paper, we propose to solve it by splitting it into two tasks: feature selection and tuning of the fuzzy-term parameters.
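The quantity being maximized is, in essence, classification accuracy on the training sample. A self-contained sketch with a toy threshold predictor (the predictor is illustrative only; in the paper the predictions come from the fuzzy classifier f(${x}_{p}$; **θ**, **S**)):

```python
def accuracy(predict, data):
    """Fraction of observations (x_p, t_p) whose predicted class equals t_p."""
    hits = sum(1 for x, t in data if predict(x) == t)
    return hits / len(data)

# toy predictor: thresholds the first feature at 0.5
data = [([0.2], "A"), ([0.4], "A"), ([0.9], "B"), ([0.7], "B")]
acc = accuracy(lambda x: "A" if x[0] < 0.5 else "B", data)
# acc == 1.0 here: all four observations are classified correctly
```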

#### 3.3. Binary Gravitational Search Algorithm

The feature selection task is to find a binary feature vector that does not cause a decrease in classification accuracy as the number of features is reduced. The solution is represented as a binary vector **S** = (${s}_{1}$, ${s}_{2}$, …, ${s}_{n}$)^{T}, where ${s}_{i}$ = 0 means that the i-th feature does not participate in classification and ${s}_{i}$ = 1 means that the i-th feature is used by the classifier. This problem can be solved with the Binary Gravitational Search Algorithm, whose goal is to find the vector **S**_{best} that makes it possible to achieve the highest level of classification accuracy. The input data for the algorithm are: the vector of fuzzy-term parameters **θ**, the number of vectors P, the maximum number of iterations T, the initial value of the gravitational constant G_{0}, the coefficient α, and a small constant ε. The initial population **S** = {**S**_{1}, **S**_{2}, …, **S**_{P}} is randomly generated. Before the start, a classifier is built on the basis of each vector and the fitness function is evaluated; on termination, the algorithm passes **S**_{best} to the output.

#### 3.4. The Transfer Functions

**Algorithm 1.** Binary Gravitational Search Algorithm.

```
Input: θ, P, T, G0, α, ε.
Output: S_best.
begin
    initialize the population S = {S_1, S_2, …, S_P};
    while (t < T)
        estimate the fitness function fit_i by Equation (5) for i = 1, 2, …, P;
        find best(t) and worst(t);
        update G(t) by Equation (9);
        calculate the mass M_i(t) by Equation (6), the acceleration a_i(t)
            by Equation (7), and the velocity V_i(t) by Equation (10)
            for i = 1, 2, …, P;
        update the positions of the particles with one of Equations (11)–(14);
    end while
    output the particle with the best fitness value, S_best;
end
```
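Algorithm 1 can be sketched in Python. This is a simplified, illustrative implementation: the fitness-normalized masses, the exponential decay of G(t), and the sigmoid (S-shaped) transfer rule stand in for Equations (5)–(14), whose exact forms are not reproduced here, and `binary_gsa` with a one-max toy fitness is a hypothetical name for demonstration:

```python
import math
import random

def s1(v):
    """S-shaped (sigmoid) transfer: maps a velocity to the probability
    of setting the corresponding bit to 1 (clamped to avoid overflow)."""
    v = max(min(v, 50.0), -50.0)
    return 1.0 / (1.0 + math.exp(-2.0 * v))

def binary_gsa(fitness, n, P=20, T=100, G0=10.0, alpha=10.0, eps=0.01, seed=1):
    """Simplified Binary GSA: maximizes `fitness` over n-bit vectors."""
    rng = random.Random(seed)
    S = [[rng.randint(0, 1) for _ in range(n)] for _ in range(P)]
    V = [[0.0] * n for _ in range(P)]
    best_vec, best_fit = None, float("-inf")
    for t in range(T):
        fit = [fitness(s) for s in S]
        i_best = max(range(P), key=lambda i: fit[i])
        if fit[i_best] > best_fit:
            best_fit, best_vec = fit[i_best], S[i_best][:]
        worst = min(fit)
        denom = sum(f - worst for f in fit) or eps   # avoid division by zero
        M = [(f - worst) / denom for f in fit]       # normalized masses
        G = G0 * math.exp(-alpha * t / T)            # decaying gravitational constant
        for i in range(P):
            for d in range(n):
                # net gravitational pull along dimension d from all other agents
                a = sum(rng.random() * G * M[j] * (S[j][d] - S[i][d])
                        for j in range(P) if j != i)
                V[i][d] = rng.random() * V[i][d] + a
                # S-shaped rule: the bit becomes 1 with probability s1(V)
                S[i][d] = 1 if rng.random() < s1(V[i][d]) else 0
    return best_vec, best_fit

best, f = binary_gsa(sum, n=8)   # toy "one-max" fitness: count of selected bits
```

In the paper, the fitness of a candidate **S** would instead be the accuracy of a classifier built on the selected feature subset.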

#### 3.5. Algorithm for Generating Rule Base by Extreme Feature Values

Let Tr = {(**x**_{p}; ${t}_{p}$), p = 1, …, |Tr|} be the training set. Let us introduce the following notation: m is the number of classes, n is the number of features, and **Ω*** is the classifier rule base. Pseudocode of the generating algorithm is shown in Algorithm 2.

**Algorithm 2.** Algorithm for generating rule base by extreme feature values.

```
Input: m, n, Tr.
Output: classifier rule base Ω*.
begin
    Ω* := ∅;
    for j := 1 to m
        for k := 1 to n
            minclass_jk := min_p(x_pk);
            maxclass_jk := max_p(x_pk);
            form the fuzzy term A_jk covering the interval
                [minclass_jk, maxclass_jk];
        end for
        create the rule R_1j on the basis of the terms A_jk that assigns
            an observation to the class with identifier c_j;
        Ω* := Ω* ∪ {R_1j};
    end for
    output Ω*;
end
```
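A sketch of Algorithm 2 in Python. The realization of each covering term as a symmetric Gaussian centred on the class interval is an illustrative assumption (the algorithm above only requires that the term cover [minclass, maxclass]):

```python
def generate_rule_base(train, m, n):
    """One rule per class: each feature term covers the [min, max] interval
    of that feature over the class's training observations (Algorithm 2)."""
    rules = []
    for j in range(m):
        points = [x for x, t in train if t == j]
        terms = []
        for k in range(n):
            lo = min(x[k] for x in points)
            hi = max(x[k] for x in points)
            # symmetric Gaussian covering [lo, hi]: peak at the centre,
            # scatter set to the interval half-width (floored to stay positive)
            b = (lo + hi) / 2.0
            c = max((hi - lo) / 2.0, 1e-6)
            terms.append((b, c))
        rules.append((terms, j))
    return rules

train = [([0.1, 0.2], 0), ([0.3, 0.4], 0), ([0.9, 1.1], 1), ([1.1, 0.9], 1)]
rules = generate_rule_base(train, m=2, n=2)
```

The resulting rule base has exactly m rules; the subsequent search stages only refine the (b, c) parameters and the feature mask, never the rule count.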

#### 3.6. Continuous Gravitational Search Algorithm

The parameters of the fuzzy terms **θ** are tuned using continuous gravitational search. Each feature here is represented by three symmetric Gaussian terms, each of them determined by two parameters (b, the coordinate of the peak on the abscissa, and c, the scatter) included in the vector **θ** = (${b}_{11}$, ${c}_{11}$, ${b}_{12}$, ${c}_{12}$, ${b}_{13}$, ${c}_{13}$, ${b}_{21}$, ${c}_{21}$, …). The use of symmetric membership functions is preferable because of their better interpretability. The dimensions of **θ** are determined by the number of input features used in classification and by the number and type of terms describing each feature. For some datasets, asymmetric types of terms, such as triangular membership functions, can be a better choice. The initial population Θ = {**θ**_{1}, **θ**_{2}, …, **θ**_{P}} for the Continuous Gravitational Search Algorithm is created by copying the input vector **θ**_{1}, generated by the classifier structure generation algorithm, with normal deviation. The input data for the algorithm are: the vector of features **S**, the number of term-parameter vectors P, the maximum number of iterations T, the initial value of the gravitational constant G_{0}, the coefficient α, and a small constant ε. Before the start, a classifier is built on the basis of each vector and its classification accuracy is evaluated; the algorithm outputs the vector **θ**_{best} that possesses the highest level of classification accuracy.
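A sketch of how such a parameter vector **θ** might be assembled for three symmetric Gaussian terms per feature. The even spacing of the peaks and the half-step scatter are illustrative assumptions, not the authors' initialization procedure:

```python
def build_theta(feature_ranges, terms_per_feature=3):
    """Pack (b, c) pairs for symmetric Gaussian terms into a flat vector θ.

    Peaks b are spread evenly over each feature's [lo, hi] range; the
    scatter c is half the distance between neighbouring peaks."""
    theta = []
    for lo, hi in feature_ranges:
        step = (hi - lo) / (terms_per_feature - 1)
        for i in range(terms_per_feature):
            b = lo + i * step          # peak coordinate on the abscissa
            c = max(step / 2.0, 1e-6)  # scatter (kept strictly positive)
            theta.extend([b, c])
    return theta

theta = build_theta([(0.0, 1.0), (0.0, 2.0)])
# 2 features × 3 terms × 2 parameters = 12 values
```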

**Algorithm 3.** Continuous Gravitational Search Algorithm.

```
Input: S, P, T, G0, α, ε.
Output: θ_best.
begin
    initialize the population Θ = {θ_1, θ_2, …, θ_P};
    while (t < T)
        estimate the fitness function fit_i by Equation (17) for i = 1, 2, …, P;
        find best(t) and worst(t);
        update G(t) by Equation (9);
        calculate the mass M_i(t) by Equation (6), the acceleration a_i(t)
            by Equation (18), and the velocity V_i(t) by Equation (10)
            for i = 1, 2, …, P;
        update the positions of the particles with Equation (19);
    end while
    output the particle with the best fitness value, θ_best;
end
```
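One iteration of the continuous position update can be sketched as follows. The distance-normalized force law and the decaying G(t) follow the canonical GSA scheme; the exact forms of Equations (6), (9), (10), (18) and (19) are not reproduced here, so the details below are assumptions:

```python
import math
import random

def continuous_gsa_step(pop, fits, t, T, G0=10.0, alpha=10.0, eps=0.01,
                        velocities=None, rng=random):
    """One iteration of the continuous GSA position update (sketch).

    pop: list of parameter vectors θ_i; fits: their fitness values."""
    P, D = len(pop), len(pop[0])
    V = velocities or [[0.0] * D for _ in range(P)]
    worst = min(fits)
    denom = sum(f - worst for f in fits) or eps  # avoid division by zero
    M = [(f - worst) / denom for f in fits]      # normalized masses
    G = G0 * math.exp(-alpha * t / T)            # decaying gravitational constant
    new_pop = []
    for i in range(P):
        theta = pop[i][:]
        for d in range(D):
            # gravitational acceleration: pull toward heavier (fitter) agents,
            # normalized by the Euclidean distance between the agents
            a = 0.0
            for j in range(P):
                if j == i:
                    continue
                dist = math.dist(pop[i], pop[j]) + eps
                a += rng.random() * G * M[j] * (pop[j][d] - pop[i][d]) / dist
            V[i][d] = rng.random() * V[i][d] + a
            theta[d] += V[i][d]
        new_pop.append(theta)
    return new_pop, V

# two agents: the lighter one (fitness 0) is pulled toward the heavier (fitness 1)
new_pop, V = continuous_gsa_step([[0.0, 0.0], [1.0, 1.0]], [0.0, 1.0], t=0, T=10)
```

Note that the heaviest agent feels no pull from a zero-mass agent and stays put, which is what makes the search converge around the best-found **θ**.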

#### 3.7. Datasets

#### 3.8. Test Phase

The algorithms were run with initial gravitational constant G_{0} = 10, coefficient α = 10, and small constant ε = 0.01. The maximum number of iterations for the Continuous Gravitational Search Algorithm is T = 1000. The number of iterations for the Binary Gravitational Search Algorithm varied depending on the number of features in the dataset (100 to 1000 iterations). The values of the parameters were determined empirically.

## 4. Experimental Results

#### 4.1. Comparison of Feature Selection Results Using the Binary Gravitational Algorithm with Various Transfer Functions

#### 4.2. Comparison to Similar Solutions

Statistically significant differences were found in the compared characteristics (one test value on the order of 10^{−9}) and in the number of rules of the classifiers generated by our technique versus the FARC-HD algorithm (the test value is 2.48 × 10^{−8}). A test value on the order of 10^{−4} makes it possible to conclude that the Binary Gravitational Algorithm demonstrates a high level of performance.

## 5. Conclusions

In future work, further study of the combined GSA_{B} + GSA_{C} approach will be carried out.

## Author Contributions

## Funding

## Conflicts of Interest

## References

- Aggarwal, C.C. An Introduction to data classification. In Data Classification: Algorithms and Applications; Aggarwal, C.C., Ed.; CRC Press: New York, NY, USA, 2015; pp. 2–36.
- Hu, X.; Pedrycz, W.; Wang, X. Fuzzy classifiers with information granules in feature space and logic-based computing. Pattern Recognit. **2018**, 80, 156–167.
- Evsutin, O.; Shelupanov, A.; Meshcheryakov, R.; Bondarenko, D.; Rashchupkina, A. The algorithm of continuous optimization based on the modified cellular automaton. Symmetry **2016**, 8, 84.
- Das, A.K.; Goswami, S.; Chakrabarti, A.; Chakraborty, B. A new hybrid feature selection approach using feature association map for supervised and unsupervised classification. Expert Syst. Appl. **2017**, 88, 81–94.
- Bolon-Canedo, V.; Sanchez-Marono, N.; Alonso-Betanzos, A. Feature Selection for High-Dimensional Data; Springer: Heidelberg, Germany, 2015; ISBN 978-3-319-21857-1.
- Lavygina, A.; Hodashinsky, I. Hybrid algorithm for fuzzy model parameter estimation based on genetic algorithm and derivative based methods. In Proceedings of the International Conference on Evolutionary Computation Theory and Applications (FCTA-2011), Paris, France, 24–26 October 2011; pp. 513–515.
- Wolpert, D.H. The existence of a priori distinctions between learning algorithms. Neural Comput. **1996**, 8, 1341–1390.
- Wolpert, D.H. The lack of a priori distinctions between learning algorithms. Neural Comput. **1996**, 8, 1391–1420.
- Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification; John Wiley & Sons: New York, NY, USA, 2001; ISBN 0-476-05669-3.
- Rashedi, E.; Nezamabadi-pour, H.; Saryazdi, S. GSA: A Gravitational Search Algorithm. Inf. Sci. **2009**, 179, 2232–2248.
- Rashedi, E.; Rashedi, E.; Nezamabadi-pour, H. A comprehensive survey on gravitational search algorithm. Swarm Evolut. Comput. **2018**, 41, 141–158.
- Aziz, N.A.A.; Ibrahim, Z.; Mubin, M.; Sudin, S. Adaptive switching gravitational search algorithm: An attempt to improve diversity of gravitational search algorithm through its iteration strategy. Sādhanā **2017**, 42, 1103–1121.
- Pelusi, D.; Mascella, R.; Tallini, L. A fuzzy Gravitational Search Algorithm to Design Optimal IIR Filters. Energies **2018**, 11, 736.
- Pelusi, D.; Mascella, R.; Tallini, L.; Nayak, J.; Naik, B.; Abraham, A. Neural network and fuzzy system for the tuning of Gravitational Search Algorithm parameters. Expert Syst. Appl. **2018**, 102, 234–244.
- Pelusi, D.; Mascella, R.; Tallini, L. Revised gravitational search algorithms based on evolutionary-fuzzy systems. Algorithms **2017**, 10, 44.
- Tsai, H.-C.; Tyan, Y.-Y.; Wu, Y.-W.; Lin, Y.-H. Gravitational particle swarm. Appl. Math. Comput. **2013**, 219, 9106–9117.
- Yin, B.; Guo, Z.; Liang, Z.; Yue, X. Improved gravitational search algorithm with crossover. Comput. Electr. Eng. **2018**, 66, 505–516.
- Bahrololoum, A.; Nezamabadi-pour, H.; Bahrololoum, H.; Saeed, M. A prototype classifier based on gravitational search algorithm. Appl. Soft Comput. **2012**, 12, 819–825.
- Zhao, F.; Xue, F.; Zhang, Y.; Ma, W.; Zhang, C.; Song, H. A hybrid algorithm based on self-adaptive gravitational search algorithm and differential evolution. Expert Syst. Appl. **2018**, 113, 515–530.
- Kumar, P.G.; Devaraj, D. Fuzzy Classifier Design using Modified Genetic Algorithm. Int. J. Comput. Intell. Syst. **2010**, 3, 334–342.
- Chang, X.; Lilly, J.H. Evolutionary design of a fuzzy classifier from data. IEEE Trans. Syst. Man. Cybern. B Cybern. **2004**, 34, 1894–1906.
- Olivas, F.; Valdez, F.; Castillo, O. Fuzzy classification system design using PSO with dynamic parameter adaptation through fuzzy logic. Stud. Comput. Intell. **2015**, 574, 29–47.
- Chen, T.; Shen, Q.; Su, P.; Shang, C. Fuzzy rule weight modification with particle swarm optimization. Soft Comput. **2016**, 20, 2923–2937.
- Hodashinsky, I.A.; Bardamova, M.B. Tuning fuzzy systems parameters with chaotic particle swarm optimization. J. Phys. Conf. Ser. **2017**, 803, 012053.
- Pulkkinen, P.; Koivisto, H. Identification of interpretable and accurate fuzzy classifiers and function estimators with hybrid methods. Appl. Soft Comput. **2007**, 7, 520–533.
- Aydogan, E.K.; Karaoglan, I.; Pardalos, P.M. hGA: Hybrid genetic algorithm in fuzzy rule-based classification systems for high-dimensional problems. Appl. Soft Comput. **2012**, 12, 800–806.
- Mekh, M.A.; Hodashinsky, I.A. Comparative analysis of differential evolution methods to optimize parameters of fuzzy classifiers. J. Comput. Syst. Sci. Int. **2017**, 56, 616–626.
- Alcala-Fdez, J.; Alcala, R.; Herrera, F. A fuzzy association rule-based classification model for high-dimensional problems with genetic rule selection and lateral tuning. IEEE Trans. Fuzzy Syst. **2011**, 19, 857–872.
- Fazzolari, M.; Alcala, R.; Herrera, F. A multi-objective evolutionary method for learning granularities based on fuzzy discretization to improve the accuracy-complexity trade-off of fuzzy rule-based classification systems: D-MOFARC algorithm. Appl. Soft Comput. **2014**, 24, 470–481.
- Alkuhlani, A.; Nassef, M.; Farag, I. Multistage feature selection approach for high-dimensional cancer data. Soft Comput. **2017**, 21, 6895–6906.
- Kohavi, R.; John, G.H. Wrappers for feature subset selection. Artif. Intell. **1997**, 97, 273–324.
- Dash, M.; Liu, H. Feature selection for classification. Intell. Data Anal. **1997**, 1, 131–156.
- Torkkola, K. Information-theoretic methods. Stud. Fuzz. Soft Comput. **2006**, 207, 167–185.
- Veerabhadrappa; Rangarajan, L. Multi-level dimensionality reduction methods using feature selection and feature extraction. Int. J. Artif. Intell. Appl. **2010**, 1, 54–68.
- Yusta, S.C. Different Metaheuristic Strategies to Solve The Feature Selection Problem. Pattern Recognit. Lett. **2009**, 30, 525–534.
- Pedergnana, M.; Marpu, P.R.; Dalla Mura, M.; Benediktsson, J.A.; Bruzzone, L. A novel technique for optimal feature selection in attribute profiles based on genetic algorithms. IEEE Trans. Geosci. Remote Sens. **2013**, 51, 3514–3528.
- Aladeemy, M.; Tutun, S.; Khasawneh, M.T. A New Hybrid Approach for Feature Selection and Support Vector Machine Model Selection Based on Self-Adaptive Cohort Intelligence. Expert Syst. Appl. **2017**, 88, 118–131.
- Hodashinsky, I.A.; Mekh, M.A. Fuzzy Classifier Design Using Harmonic Search Methods. Programm. Comput. Soft. **2017**, 43, 37–46.
- Vieira, S.M.; Sousa, J.M.C.; Runkler, T.A. Ant colony optimization applied to feature selection in fuzzy classifiers. Lect. Notes Comput. Sci. **2007**, 4529, 778–788.
- Gurav, A.; Nair, V.; Gupta, U.; Valadi, J. Glowworm Swarm Based Informative Attribute Selection Using Support Vector Machines for Simultaneous Feature Selection and Classification. Lect. Notes Comput. Sci. **2015**, 8947, 27–37.
- Marinaki, M.; Marinakis, Y.; Zopounidis, C. Honey Bees Mating Optimization algorithm for financial classification problems. Appl. Soft Comput. **2010**, 10, 806–812.
- Rashedi, E.; Nezamabadi-pour, H. Feature subset selection using improved binary gravitational search algorithm. J. Intell. Fuzzy Syst. **2014**, 26, 1211–1221.
- Mirjalili, S.; Lewis, A. S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization. Swarm Evolut. Comput. **2013**, 9, 1–14.
- De Gregorio, M.; Giordano, M. An experimental evaluation of weightless neural networks for multi-class classification. Appl. Soft Comput. **2018**, 72, 338–354.
- Pelusi, D.; Elmougy, S.; Tallini, L.; Bose, B. m-ary Balanced Codes with Parallel Decoding. IEEE Trans. Inf. Theory **2015**, 61, 3251–3264.

**Figure 1.** Transfer functions: (**a**) example of an S-shaped asymmetric transfer function; (**b**) example of a V-shaped symmetric transfer function.
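The two families in Figure 1 can be sketched as follows. The particular formulas (a logistic sigmoid for the S-shape, |tanh| for the V-shape) are common choices from the binary metaheuristics literature and are assumptions here, not necessarily the exact functions S1, S2, V1, and V2 used in the experiments:

```python
import math

def s_shaped(v):
    """S-shaped (sigmoid) transfer: asymmetric around v = 0."""
    return 1.0 / (1.0 + math.exp(-v))

def v_shaped(v):
    """V-shaped transfer: symmetric, grows with |v|."""
    return abs(math.tanh(v))

# S-shaped output drifts with the sign of v; V-shaped depends only on |v|
probs = [(s_shaped(v), v_shaped(v)) for v in (-2.0, 0.0, 2.0)]
```

This asymmetry/symmetry distinction is exactly what separates the S1/S2 and V1/V2 bit-update rules compared in Section 4.1.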

Name | Abbreviation | Features | Instances | Classes
---|---|---|---|---
banana | bnn | 2 | 5300 | 2
haberman | hbm | 3 | 306 | 2
titanic | tit | 3 | 2201 | 2
iris | irs | 4 | 150 | 3
balance | bln | 4 | 625 | 3
newthyroid | nth | 5 | 215 | 3
phoneme | phn | 5 | 5404 | 2
bupa | bup | 6 | 345 | 2
pima | pim | 8 | 768 | 2
glass | gls | 9 | 214 | 7
wisconsin | wis | 9 | 683 | 2
page-blocks | pbl | 10 | 5472 | 5
magic | mag | 10 | 19,020 | 2
wine | win | 13 | 178 | 3
cleveland | clv | 13 | 297 | 5
heart | hrt | 13 | 270 | 2
penbased | pbs | 16 | 10,992 | 10
vehicle | veh | 18 | 846 | 4
hepatitis | hep | 19 | 80 | 2
segment | seg | 19 | 2310 | 7
ring | rin | 20 | 7400 | 2
twonorm | twn | 20 | 7400 | 2
thyroid | thr | 21 | 7200 | 3
satimage | sat | 36 | 6435 | 7
spambase | spb | 57 | 4597 | 2
coil2000 | coil | 85 | 9822 | 2

Dataset | Full Set #F | Full Set #T | S1 #F | S1 #T | S2 #F | S2 #T | V1 #F | V1 #T | V2 #F | V2 #T
---|---|---|---|---|---|---|---|---|---|---
newthyroid | 5 | 96.3 | 3.5 | 96.5 | 3.7 | 96.5 | 3.3 | 96.5 | 3.4 | 96.4
phoneme | 5 | 70.7 | 4 | 76.2 | 4 | 76.2 | 2.3 | 75.3 | 3.7 | 76.1
bupa | 6 | 49.0 | 2.7 | 60.0 | 2.8 | 59.8 | 2.7 | 60.0 | 2.8 | 57.1
pima | 8 | 70.2 | 3.9 | 71.0 | 3.9 | 71.0 | 2.6 | 70.8 | 4.1 | 70.6
glass | 9 | 49.1 | 5.2 | 55.9 | 5.1 | 56.0 | 5.9 | 53.2 | 5.5 | 53.9
wisconsin | 9 | 90.0 | 5.8 | 94.0 | 5.7 | 94.0 | 3.5 | 93.6 | 5.9 | 93.8
page-blocks | 10 | 6.1 | 2 | 80.5 | 2 | 80.5 | 2 | 80.5 | 2 | 80.5
magic | 10 | 56.1 | 4.1 | 70.7 | 4.1 | 70.7 | 4.1 | 70.7 | 4.1 | 70.7
wine | 13 | 88.2 | 5.9 | 92.6 | 5.8 | 94.8 | 6.8 | 92.2 | 6.2 | 94.5
cleveland | 13 | 53.5 | 7.4 | 53.1 | 7.3 | 52.5 | 2.8 | 54.4 | 5.6 | 48.8
heart | 13 | 57.4 | 3.1 | 67.1 | 2.8 | 67.0 | 3 | 67.7 | 4.1 | 67.7
penbased | 16 | 31.9 | 8.2 | 49.7 | 8.1 | 49.7 | 9.3 | 46.8 | 9 | 48.5
vehicle | 18 | 29.9 | 7.9 | 45.5 | 7.8 | 45.6 | 4.8 | 40.0 | 7.4 | 45.6
hepatitis | 19 | 61.0 | 7.7 | 87.4 | 7.9 | 87.2 | 5.3 | 82.5 | 6.7 | 85.1
segment | 19 | 78.2 | 10.2 | 85.4 | 9.1 | 85.7 | 8.8 | 84.1 | 8.5 | 85.7
ring | 20 | 49.5 | 1.0 | 58.6 | 1.0 | 58.6 | 1.0 | 57.9 | 2.5 | 55.5
twonorm | 20 | 96.8 | 19.7 | 96.8 | 19.9 | 96.8 | 17.8 | 96.1 | 17.1 | 95.8
thyroid | 21 | 99.3 | 19.9 | 99.3 | 20 | 99.3 | 16.9 | 99.3 | 14.6 | 99.3
satimage | 36 | 58.4 | 15.4 | 62.5 | 15.9 | 62.3 | 9.9 | 61.1 | 13.2 | 60.8
spambase | 57 | 56.3 | 29.7 | 65.9 | 27.0 | 65.4 | 2.7 | 70.0 | 27.9 | 64.2
coil2000 | 85 | 16.4 | 38.2 | 90.1 | 38.5 | 90.6 | 1 | 94.0 | 37.6 | 86.4

Transfer Function | All | S1 | S2 | V1 | V2
---|---|---|---|---|---
All | - | 0.064 | 0.064 | 0.087 | 0.159
S1 | 0.064 | - | 1.0 | 0.940 | 0.792
S2 | 0.064 | 1.0 | - | 0.910 | 0.734
V1 | 0.087 | 0.940 | 0.910 | - | 0.940
V2 | 0.159 | 0.792 | 0.734 | 0.940 | -

Transfer Function | All | S1 | S2 | V1 | V2
---|---|---|---|---|---
All | - | 0.004 | 0.004 | 0.00001 | 0.002
S1 | 0.004 | - | 1.0 | 0.082 | 0.960
S2 | 0.004 | 1.0 | - | 0.092 | 0.990
V1 | 0.00001 | 0.082 | 0.092 | - | 0.078
V2 | 0.002 | 0.960 | 0.990 | 0.078 | -

Dataset | MF Type | GS_{b}+GS_{c} #R | GS_{b}+GS_{c} #F | GS_{b}+GS_{c} #L | GS_{b}+GS_{c} #T | GS_{c} #L | GS_{c} #T | D-MOFARC #R | D-MOFARC #L | D-MOFARC #T | FARC-HD #R | FARC-HD #L | FARC-HD #T
---|---|---|---|---|---|---|---|---|---|---|---|---|---
bnn | triangle | 2 | 2 | 72.3 | 72.8 | 72.3 | 72.8 | 8.7 | 90.3 | 89.0 | 12.9 | 86.0 | 85.5
hbm | triangle | 2 | 3 | 75.6 | 74.4 | 75.6 | 74.4 | 9.2 | 81.7 | 69.4 | 5.7 | 79.2 | 73.5
tit | gaussoid | 2 | 3 | 77.8 | 78.6 | 77.8 | 78.6 | 10.4 | 78.9 | 78.7 | 4.1 | 79.1 | 78.8
irs | triangle | 3 | 4 | 98.3 | 97.3 | 98.3 | 97.3 | 5.6 | 98.1 | 96.0 | 4.4 | 98.6 | 95.3
bln | gaussoid | 3 | 4 | 83.7 | 81.8 | 83.7 | 81.8 | 20.1 | 89.4 | 85.6 | 18.8 | 92.2 | 91.2
nth | gaussoid | 3 | 3 | 98.2 | 99.0 | 98.3 | 98.1 | 9.5 | 99.8 | 95.5 | 9.6 | 99.2 | 94.4
phn | gaussoid | 2 | 4 | 78.4 | 78.5 | 77.3 | 77.5 | 9.3 | 84.8 | 83.5 | 17.2 | 83.9 | 82.4
bup | triangle | 2 | 4 | 71.6 | 68.7 | 68.9 | 69 | 7.7 | 82.8 | 70.1 | 10.6 | 78.2 | 66.4
pim | triangle | 2 | 2 | 75.4 | 77.9 | 76.9 | 74 | 10.4 | 82.3 | 75.5 | 20.2 | 82.3 | 76.2
gls | gaussoid | 7 | 4 | 66 | 70.7 | 63.4 | 57.5 | 27.4 | 95.2 | 70.6 | 18.2 | 79.0 | 69.0
wis | triangle | 2 | 4 | 97.3 | 97.2 | 96 | 96.3 | 9.0 | 98.6 | 96.8 | 13.6 | 98.3 | 96.2
pbl | gaussoid | 5 | 2 | 89.7 | 89.7 | 90.8 | 90.8 | 21.5 | 97.8 | 97.0 | 18.4 | 95.5 | 95.0
mag | gaussoid | 2 | 4 | 71.1 | 70.9 | 79.9 | 79.5 | 32.2 | 86.3 | 85.4 | 43.8 | 85.4 | 84.8
win | gaussoid | 3 | 7 | 99.9 | 97.4 | 99.3 | 97.1 | 8.6 | 100.0 | 95.8 | 8.3 | 100.0 | 95.5
clv | gaussoid | 5 | 2 | 58.1 | 58.3 | 63.4 | 62.6 | 45.6 | 90.9 | 52.9 | 42.1 | 82.2 | 58.3
hrt | gaussoid | 2 | 6 | 76.2 | 70.7 | 86.5 | 84.1 | 18.7 | 94.4 | 84.4 | 27.8 | 93.1 | 83.7
pbs | gaussoid | 10 | 8 | 68.0 | 67.8 | 55.1 | 55.0 | 119.2 | 97.4 | 96.2 | 152.7 | 97.0 | 96.0
veh | triangle | 4 | 7 | 50.4 | 51.1 | 53.4 | 50 | 22.4 | 84.5 | 70.6 | 31.6 | 77.2 | 68.0
hep | gaussoid | 2 | 7 | 91.5 | 93.3 | 94.1 | 89.9 | 11.4 | 100.0 | 90.0 | 10.4 | 99.4 | 88.7
seg | triangle | 7 | 9 | 88.3 | 89.1 | 84.4 | 82.8 | 26.2 | 98.0 | 96.6 | 41.1 | 94.8 | 93.3
rin | gaussoid | 2 | 3 | 74.9 | 74.3 | 82.1 | 82.5 | 15.3 | 94.2 | 93.3 | 24.9 | 95.1 | 94.0
twn | gaussoid | 2 | 14 | 96.9 | 96.8 | 94.4 | 94.4 | 10.2 | 94.5 | 93.1 | 60.4 | 96.6 | 95.1
thr | triangle | 3 | 12 | 99.1 | 98.6 | 99.5 | 99.3 | 5.9 | 99.3 | 99.1 | 4.9 | 94.3 | 94.1
sat | gaussoid | 7 | 8 | 85.5 | 84.6 | 84.6 | 83.7 | 56.0 | 90.8 | 87.5 | 30.2 | 84.4 | 83.8
spb | gaussoid | 2 | 3 | 73.7 | 74.0 | 70.5 | 69.7 | 24.3 | 91.7 | 90.5 | 30.5 | 92.4 | 91.6
coil | triangle | 2 | 1 | 94.0 | 94.0 | 92.2 | 92.1 | 89.0 | 94.0 | 94.0 | 2.6 | 94.0 | 94.0

Method | GSA_{B} + GSA_{C} | GSA_{C}
---|---|---
D-MOFARC | 0.297 | 0.161
FARC-HD | 0.346 | 0.241

Dataset | LR | GNB | kNN | SVC | RF | AB | GTB | MLP | WNN | GSA
---|---|---|---|---|---|---|---|---|---|---
bln | 0.8607 | 0.8381 | 0.8369 | 0.8881 | 0.712 | 0.8417 | 0.8083 | 0.9729 | 0.7808 | 0.818
bnn | 0.5709 | 0.614 | 0.9036 | 0.6504 | 0.897 | 0.7168 | 0.8989 | 0.8949 | 0.903 | 0.728
bup | 0.6463 | 0.5622 | 0.5911 | 0.5943 | 0.7363 | 0.7361 | 0.7334 | 0.7448 | 0.6781 | 0.687
clv | 0.5892 | 0.5345 | 0.5534 | 0.5656 | 0.5697 | 0.5623 | 0.5426 | 0.5023 | 0.5892 | 0.583
coil | 0.9402 | 0.1355 | 0.9405 | 0.9403 | 0.929 | 0.9403 | 0.9394 | 0.9371 | 0.9403 | 0.94
gls | 0.5802 | 0.469 | 0.6695 | 0.5906 | 0.7926 | 0.5146 | 0.7249 | 0.6893 | 0.7243 | 0.707
hbm | 0.7485 | 0.7424 | 0.7354 | 0.7353 | 0.6862 | 0.7355 | 0.7133 | 0.7389 | 0.732 | 0.744
hrt | 0.8444 | 0.8407 | 0.8259 | 0.8111 | 0.8222 | 0.8259 | 0.8111 | 0.8444 | 0.8333 | 0.707
hep | 0.8405 | 0.5919 | 0.8181 | 0.8405 | 0.8891 | 0.8827 | 0.8323 | 0.8038 | 0.875 | 0.933
irs | 0.9 | 0.9533 | 0.9533 | 0.9667 | 0.96 | 0.9467 | 0.96 | 0.96 | 0.9467 | 0.973
nth | 0.8885 | 0.9677 | 0.9437 | 0.9208 | 0.9582 | 0.9537 | 0.9487 | 0.9626 | 0.9721 | 0.99
pbl | 0.9445 | 0.8864 | 0.9519 | 0.9361 | 0.9686 | 0.9541 | 0.9698 | 0.9636 | 0.9567 | 0.897
pbs | 0.9306 | 0.8559 | 0.9931 | 0.9953 | 0.9915 | 0.6912 | 0.9915 | 0.9922 | 0.9916 | 0.678
phn | 0.7496 | 0.7605 | 0.8901 | 0.797 | 0.913 | 0.8231 | 0.9099 | 0.8458 | 0.899 | 0.785
pim | 0.7707 | 0.7629 | 0.759 | 0.7837 | 0.7524 | 0.7564 | 0.7564 | 0.7746 | 0.776 | 0.779
sat | 0.8236 | 0.7932 | 0.8925 | 0.8909 | 0.904 | 0.7748 | 0.9001 | 0.8814 | 0.892 | 0.846
seg | 0.9108 | 0.7987 | 0.961 | 0.9437 | 0.9784 | 0.8065 | 0.9818 | 0.9662 | 0.9714 | 0.891
thr | 0.9455 | 0.1235 | 0.9404 | 0.9385 | 0.9956 | 0.9894 | 0.9962 | 0.9843 | 0.9425 | 0.986
tit | 0.776 | 0.7733 | 0.7287 | 0.7819 | 0.7878 | 0.7783 | 0.7905 | 0.7892 | 0.7878 | 0.786
twn | 0.9778 | 0.9786 | 0.9749 | 0.9785 | 0.9736 | 0.9673 | 0.9734 | 0.9773 | 0.9759 | 0.968
veh | 0.7533 | 0.4611 | 0.6915 | 0.7567 | 0.7483 | 0.6089 | 0.7745 | 0.8181 | 0.7435 | 0.511
win | 0.9666 | 0.9663 | 0.9712 | 0.9826 | 0.9774 | 0.9215 | 0.9329 | 0.9826 | 0.9888 | 0.974
wis | 0.9693 | 0.962 | 0.9723 | 0.9693 | 0.965 | 0.9562 | 0.9634 | 0.965 | 0.9722 | 0.972

Method | LR | GNB | kNN | SVC | RF | AB | GTB | MLP | WNN
---|---|---|---|---|---|---|---|---|---
FC+GSA | 0.543 | 0.008 * | 0.503 | 0.808 | 0.114 | 0.429 | 0.144 | 0.094 | 0.107

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Bardamova, M.; Konev, A.; Hodashinsky, I.; Shelupanov, A.
A Fuzzy Classifier with Feature Selection Based on the Gravitational Search Algorithm. *Symmetry* **2018**, *10*, 609.
https://doi.org/10.3390/sym10110609
