# Feature Selection Based on Binary Tree Growth Algorithm for the Classification of Myoelectric Signals


## Abstract


## 1. Introduction

The number of possible feature subsets grows exponentially as $2^D$, where D is the number of features. Accordingly, it is impractical to perform an exhaustive search to look for the best feature subset [20]. Therefore, pre-processing, such as feature projection and feature selection, can be used to solve the dimensionality problem [7,21,22].

## 2. Standard Tree Growth Algorithm

In the tree growth algorithm (TGA), the population of trees (candidate solutions) is divided into four groups. First, the N$_{1}$ best trees are assigned to the first tree group. In this group, the new tree is generated as follows:

$$X_i^{t+1}=\frac{X_i^t}{\theta}+r\,X_i^t \qquad (1)$$

where X$_{i}$ is the tree (solution) at order i in the population, θ is the tree reduction rate of power, t is the iteration number, and r is a random number uniformly distributed in [0,1]. If the newly generated tree achieves a better fitness value, it replaces the current tree. Otherwise, the current tree is kept for the next generation.

Next, the N$_{2}$ trees are moved into the second group. For each tree, the two nearest trees (from the first and second groups) are determined by calculating the Euclidean distance as follows:

$$d_i=\sqrt{\sum_{j=1}^{D}\left(X_{N2,j}-X_{i,j}\right)^{2}} \qquad (2)$$

where X$_{N2}$ is the current tree and X$_{i}$ denotes the ith tree in the population. Note that the distance is set to infinity when X$_{N2}$ is the same as X$_{i}$, that is, when N2 = i. Afterward, the current tree moves toward the nearest trees to compete for light. A linear combination of the two nearest trees is computed as follows:

$$y=\lambda T_{1}+(1-\lambda)T_{2} \qquad (3)$$

where T$_{1}$ is the first nearest tree, T$_{2}$ is the second nearest tree, and λ is a parameter that controls the influence of the nearest trees. In the second group, the position of the tree is then updated by moving it toward this linear combination, as defined in Equation (4).

In the third group, the N$_{3}$ worst trees are removed and replaced with new trees (new solutions). The number N$_{3}$ is calculated as:

$$N_{3}=N-(N_{1}+N_{2}) \qquad (5)$$

where N is the population size, N$_{1}$ is the number of trees in the first group, and N$_{2}$ is the number of trees in the second group.

In the fourth group, N$_{4}$ new trees are generated around the best trees (first tree group) using the mask operator. Note that N$_{4}$ should not exceed the total number of N$_{1}$ and N$_{2}$ trees [34]. After that, the newly generated N$_{4}$ trees are added into the population, and the merged population is sorted according to fitness value. The best N trees are then selected as the new population for the next iteration. The algorithm is repeated until the termination criterion is satisfied. Finally, the global best tree is selected as the best solution.
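Putting the four stages together, a minimal NumPy sketch of one TGA iteration on a toy objective (the sphere function) is shown below. The group sizes, θ, and λ follow the description above; the move-toward step in the second group is marked as an assumed form, since Equation (4) is not reproduced in this excerpt.

```python
import numpy as np

rng = np.random.default_rng(0)

def fitness(x):
    return np.sum(x ** 2)          # minimise the sphere function (toy objective)

N, D = 30, 5                       # population size and dimensionality
N1, N2, N4 = 10, 15, 10            # group sizes
theta, lam = 0.8, 0.5              # reduction rate of power, nearest-tree influence

pop = rng.uniform(-5, 5, (N, D))
pop = pop[np.argsort([fitness(t) for t in pop])]   # best trees first

# Group 1: local search around the best N1 trees (Equation (1)).
for i in range(N1):
    r = rng.random()
    trial = pop[i] / theta + r * pop[i]
    if fitness(trial) < fitness(pop[i]):           # greedy selection
        pop[i] = trial

# Group 2: each tree moves toward its two nearest trees (Equations (2)-(3)).
for i in range(N1, N1 + N2):
    d = np.linalg.norm(pop - pop[i], axis=1)
    d[i] = np.inf                                  # distance to itself set to infinity
    t1, t2 = pop[np.argsort(d)[:2]]
    y = lam * t1 + (1 - lam) * t2                  # linear combination of nearest trees
    pop[i] = pop[i] + rng.random() * (y - pop[i])  # move toward y (assumed form of Eq. (4))

# Group 3: the N3 worst trees are replaced by fresh random trees (Equation (5)).
N3 = N - (N1 + N2)
pop[N - N3:] = rng.uniform(-5, 5, (N3, D))

# Group 4: N4 new trees via a mask operator around the best trees,
# then keep the best N of the merged population.
mask = rng.random((N4, D)) < 0.5
new = np.where(mask, pop[rng.integers(0, N1, N4)], rng.uniform(-5, 5, (N4, D)))
merged = np.vstack([pop, new])
pop = merged[np.argsort([fitness(t) for t in merged])][:N]
best = pop[0]                                      # global best tree of this iteration
```

Repeating this loop until the termination criterion is met yields the global best tree as the final solution.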

## 3. Materials and Methods

#### 3.1. EMG Data

#### 3.2. STFT Based Feature Extraction

#### 3.3. Proposed Feature Selection Approaches

#### 3.3.1. Binary Tree Growth Algorithm

In BTGA, the N$_{1}$ new trees (trial trees) are generated as shown in Equation (1). Note that in the first and second groups, the new trees are converted into binary form using Equation (8); the transfer function can be either Equation (6) or (7). If the new tree yields a better fitness value, it replaces the current one; otherwise, the current tree is kept for the next generation. For the second group, the two nearest trees of each tree are determined by applying Equation (2), and the position of each N$_{2}$ tree is then updated as shown in Equations (3) and (4), respectively. In the third group, the N$_{3}$ trees are removed and new trees are created. For the fourth group, the N$_{4}$ new trees are generated by applying the mask operation around the best trees in the first group. Afterward, the newly generated N$_{4}$ trees are added into the population, the merged population is ranked, and the best N trees are kept for the next iteration. In each iteration, the global best tree is updated. The algorithm is repeated until the termination criterion is satisfied. Finally, the global best tree is chosen as the optimal feature subset.
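The binary conversion step can be sketched as follows. Since Equations (6)–(8) are not reproduced in this excerpt, the S-shaped and V-shaped transfer functions below are the standard forms from the binary PSO literature [38,40], used here as assumed stand-ins:

```python
import numpy as np

rng = np.random.default_rng(42)

def s_shaped(x):
    # S-shaped (sigmoid) transfer function — the usual Eq. (6)-style mapping.
    return 1.0 / (1.0 + np.exp(-x))

def v_shaped(x):
    # V-shaped transfer function — a common Eq. (7)-style alternative.
    return np.abs(np.tanh(x))

def binarize(x_continuous, transfer=s_shaped):
    # Convert a continuous tree position into a binary feature mask:
    # a feature is selected (1) when a uniform draw falls below the
    # transfer-function probability (Eq. (8)-style rule, assumed form).
    prob = transfer(x_continuous)
    return (rng.random(x_continuous.shape) < prob).astype(int)

tree = rng.normal(0, 2, size=8)      # continuous position in an 8-feature space
mask = binarize(tree)                # binary feature subset (1 = feature kept)
```

Each 1 in the resulting mask marks a feature retained for classification; the fitness of the mask is then evaluated by the wrapper classifier.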

#### 3.3.2. Modified Binary Tree Growth Algorithm

In MBTGA, a crossover operator is computed between the current tree and its two nearest trees (Equation (10)), where X$_{i}$ is the ith tree in the population, T$_{1}$ is the first nearest tree, T$_{2}$ is the second nearest tree, d is the dimension of the search space, t is the iteration number, and r$_{1}$ is a random number drawn from a uniform distribution between 0 and 1. As can be seen from Equation (10), the crossover increases the possibility of trees moving toward the two nearest trees to compete for light. Furthermore, a mutation operator is employed to enhance the search ability of MBTGA as follows:

$$X_{id}=\begin{cases}1-X_{id}, & \text{if } rand < MR\\ X_{id}, & \text{otherwise}\end{cases} \qquad (11)$$

where X$_{id}$ is the dth dimension of the ith tree, MR is the mutation rate, and rand is a random number distributed between 0 and 1. Note that the mutation rate decreases linearly from 0.9 to 0 over the iterations, as shown in Equation (12).

In the first step of MBTGA, the N$_{1}$ trees are assigned to the first group. In this group, the new trees (trial trees) are generated using the swap operator, after which greedy selection is applied: if the newly generated tree obtains a better fitness value, it replaces the current tree; otherwise, the current tree is kept for the next generation. Next, the N$_{2}$ trees are allocated to the second group. For each tree, the two nearest trees are determined using Equation (2), and the crossover is then computed between the current tree and the two nearest trees for the competition for light. Additionally, a mutation operation is conducted based on the mutation rate. In the third tree group, the N$_{3}$ trees are removed and new trees are planted. In the final tree group, N$_{4}$ new trees are generated around the best solutions (first tree group) using the mask operator, and the newly generated trees are added into the current population. Afterward, the merged population is ranked, and the best N trees are selected as the new population for the next iteration. At the end of each iteration, the global best tree is updated. The algorithm is repeated until the termination criterion is satisfied. Finally, the global best solution is selected as the best feature subset.
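The three MBTGA operators described above (swap, crossover, mutation) act on binary trees. A minimal sketch follows; the uniform crossover is an assumed form of Equation (10), while the bit-flip mutation and linearly decreasing rate follow Equations (11) and (12) as described:

```python
import numpy as np

rng = np.random.default_rng(1)

def swap_operator(tree):
    # Local search for the first group: exchange the values at two
    # randomly chosen positions of the binary tree.
    i, j = rng.choice(len(tree), size=2, replace=False)
    new = tree.copy()
    new[i], new[j] = new[j], new[i]
    return new

def crossover(tree, t1, t2):
    # Competition for light: each dimension is inherited at random from
    # one of the two nearest trees (uniform crossover, assumed form).
    pick = rng.random(len(tree)) < 0.5
    return np.where(pick, t1, t2)

def mutate(tree, t, max_iter):
    # Bit-flip mutation with a rate that decreases linearly from 0.9 to 0
    # over the iterations.
    mr = 0.9 * (1 - t / max_iter)
    flip = rng.random(len(tree)) < mr
    return np.where(flip, 1 - tree, tree)

tree = np.array([1, 0, 0, 1, 1, 0, 1, 0])
child = mutate(crossover(tree, tree, 1 - tree), t=50, max_iter=100)
```

Note how the swap operator preserves the number of selected features (it only moves bits), which keeps the local search focused, while crossover and mutation can change the subset size and so drive exploration.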

- MBTGA applies the swap operator to perform the local search, which overcomes the limitation of BTGA in the first group and ensures high exploitation in MBTGA.
- The use of crossover and mutation operators in MBTGA increases the global search ability, thus leading to high exploration.
- MBTGA has fewer parameters to adjust than BTGA, which reduces the complexity of the algorithm.

#### 3.3.3. Fitness Function

The fitness function evaluates each candidate feature subset as a weighted combination of classification performance and subset size:

$$Fitness=\beta E_{R}+(1-\beta)\frac{|R|}{|S|}$$

where E$_{R}$ is the classification error rate, |R| is the number of selected features, |S| is the total number of features, and β is the parameter that controls the relative importance of classification error and feature reduction. In this work, β is set to 0.99 since the classification performance is considered to be the most important [43].
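The fitness function above can be sketched directly; the error rates in the usage example are illustrative values, not results from the paper:

```python
import numpy as np

def fitness(mask, error_rate, beta=0.99):
    # Wrapper fitness: weighted sum of the classification error rate E_R
    # and the feature selection ratio |R|/|S|. Smaller is better.
    mask = np.asarray(mask)
    ratio = mask.sum() / mask.size          # |R| / |S|
    return beta * error_rate + (1 - beta) * ratio

# With beta = 0.99, a subset with a lower error rate wins even if it keeps
# more features, because accuracy is weighted far more heavily.
f_small = fitness([1, 0, 0, 0, 0, 0, 0, 0], error_rate=0.20)
f_large = fitness([1, 1, 1, 1, 1, 1, 0, 0], error_rate=0.12)
```

Here f_large < f_small despite the larger subset, which matches the stated design choice of prioritising classification performance.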

#### 3.4. Evaluation Criteria

#### 3.4.1. Classification Accuracy

#### 3.4.2. Feature Selection Ratio

#### 3.4.3. Sensitivity and Specificity

#### 3.4.4. F-Measure

#### 3.4.5. Geometric Mean

#### 3.4.6. ROC Analysis
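The evaluation criteria in Sections 3.4.3–3.4.5 can all be derived from confusion-matrix counts. A minimal sketch, assuming the standard definitions (the paper's own equations are not reproduced in this excerpt):

```python
import numpy as np

def binary_metrics(tp, fp, tn, fn):
    # Per-class measures computed from confusion-matrix counts.
    sens = tp / (tp + fn)                      # sensitivity (recall)
    spec = tn / (tn + fp)                      # specificity
    prec = tp / (tp + fp)                      # precision
    f_measure = 2 * prec * sens / (prec + sens)
    g_mean = np.sqrt(sens * spec)              # geometric mean of sens and spec
    return sens, spec, f_measure, g_mean

# Hypothetical counts for one movement class (one-vs-rest).
sens, spec, f1, gm = binary_metrics(tp=90, fp=5, tn=95, fn=10)
```

For multi-class EMG classification, these are computed per class (one-vs-rest) and averaged, which is how the per-subject tables later in the paper are typically produced.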

## 4. Results and Discussions

Before the experiments, the features are normalized into the range [0,1] using min–max normalization, $x_{j}^{\prime}=\frac{x_{j}-\mathrm{min}_{j}}{\mathrm{max}_{j}-\mathrm{min}_{j}}$, where min$_{j}$ and max$_{j}$ are the minimum and maximum values of the features in dimension j, respectively.

The parameters N$_{1}$, N$_{2}$, and N$_{4}$ are chosen empirically as 10, 15, and 10, respectively. Note that BTGA has two additional parameters, θ and λ, that need to be adjusted; as recommended by the authors in [34], θ and λ are set to 0.8 and 0.5, respectively. For BDE, the crossover rate CR is set to 1 [50]. To ensure a fair comparison, the population size and the maximum number of iterations are fixed at 30 and 100, respectively.

#### 4.1. Classification Performance

#### 4.2. Feature Selection Ratio

#### 4.3. Convergent Analysis

#### 4.4. Performance Measurement

#### 4.5. ROC Analysis

#### 4.6. Class-Wise Classification Performance

## 5. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

- Atzori, M.; Gijsberts, A.; Kuzborskij, I.; Elsig, S.; Hager, A.G.M.; Deriaz, O.; Castellini, C.; Müller, H.; Caputo, B. Characterization of a Benchmark Database for Myoelectric Movement Classification. IEEE Trans. Neural Syst. Rehabil. Eng. **2015**, 23, 73–83.
- Atzori, M.; Gijsberts, A.; Castellini, C.; Caputo, B.; Hager, A.G.M.; Elsig, S.; Giatsidis, G.; Bassetto, F.; Müller, H. Electromyography data for non-invasive naturally-controlled robotic hand prostheses. Sci. Data **2014**, 23, 140053.
- Chowdhury, R.H.; Reaz, M.B.; Ali, M.A.B.M.; Bakar, A.A.; Chellappan, K.; Chang, T.G. Surface electromyography signal processing and classification techniques. Sensors **2013**, 13, 12431–12466.
- Vieira, T.; Muceli, S.; Farina, D.; Botter, A. Large inter-electrode distances lead to more representative bipolar EMGs, not necessarily affected by crosstalk. Gait Posture **2016**, 49, S28–S29.
- Gijsberts, A.; Atzori, M.; Castellini, C.; Muller, H.; Caputo, B. Movement Error Rate for Evaluation of Machine Learning Methods for sEMG-Based Hand Movement Classification. IEEE Trans. Neural Syst. Rehabil. Eng. **2014**, 22, 735–744.
- Pizzolato, S.; Tagliapietra, L.; Cognolato, M.; Reggiani, M.; Müller, H.; Atzori, M. Comparison of six electromyography acquisition setups on hand movement classification tasks. PLoS ONE **2017**, 12, e0186132.
- Al-Angari, H.M.; Kanitz, G.; Tarantino, S.; Cipriani, C. Distance and mutual information methods for EMG feature and channel subset selection for classification of hand movements. Biomed. Signal Process. Control **2016**, 27, 24–31.
- Khushaba, R.N.; Takruri, M.; Miro, J.V.; Kodagoda, S. Towards limb position invariant myoelectric pattern recognition using time-dependent spectral features. Neural Netw. **2014**, 55, 42–58.
- Waris, A.; Niazi, I.K.; Jamil, M.; Englehart, K.; Jensen, W.; Kamavuako, E.N. Multiday Evaluation of Techniques for EMG Based Classification of Hand Motions. IEEE J. Biomed. Health Inform. **2018**, 1.
- Englehart, K.; Hudgins, B.; Parker, P.A.; Stevenson, M. Classification of the myoelectric signal using time-frequency based representations. Med. Eng. Phys. **1999**, 21, 431–438.
- Karthick, P.A.; Ghosh, D.M.; Ramakrishnan, S. Surface electromyography based muscle fatigue detection using high-resolution time-frequency methods and machine learning algorithms. Comput. Methods Programs Biomed. **2018**, 154, 45–56.
- Tapia, C.; Daud, O.; Ruiz-del-Solar, J. EMG Signal Filtering Based on Independent Component Analysis and Empirical Mode Decomposition for Estimation of Motor Activation Patterns. J. Med. Biol. Eng. **2017**, 37, 140–155.
- Naik, G.R.; Selvan, S.E.; Nguyen, H.T. Single-Channel EMG Classification with Ensemble-Empirical-Mode-Decomposition-Based ICA for Diagnosing Neuromuscular Disorders. IEEE Trans. Neural Syst. Rehabil. Eng. **2016**, 24, 734–743.
- Tsai, A.C.; Luh, J.J.; Lin, T.T. A novel STFT-ranking feature of multi-channel EMG for motion pattern recognition. Expert Syst. Appl. **2015**, 42, 3327–3341.
- Doulah, A.S.U.; Fattah, S.A.; Zhu, W.P.; Ahmad, M.O. Wavelet Domain Feature Extraction Scheme Based on Dominant Motor Unit Action Potential of EMG Signal for Neuromuscular Disease Classification. IEEE Trans. Biomed. Circuits Syst. **2014**, 8, 155–164.
- Phinyomark, A.; Nuidod, A.; Phukpattaranont, P.; Limsakul, C. Feature Extraction and Reduction of Wavelet Transform Coefficients for EMG Pattern Classification. Elektron. Elektrotech. **2012**, 122, 27–32.
- Phinyomark, A.; Phukpattaranont, P.; Limsakul, C. Feature reduction and selection for EMG signal classification. Expert Syst. Appl. **2012**, 39, 7420–7431.
- Phinyomark, A.; Khushaba, R.N.; Scheme, E. Feature Extraction and Selection for Myoelectric Control Based on Wearable EMG Sensors. Sensors **2018**, 18, 1615.
- Gu, Y.; Yang, D.; Huang, Q.; Yang, W.; Liu, H. Robust EMG pattern recognition in the presence of confounding factors: Features, classifiers and adaptive learning. Expert Syst. Appl. **2018**, 96, 208–217.
- Hancer, E.; Xue, B.; Zhang, M.; Karaboga, D.; Akay, B. Pareto front feature selection based on artificial bee colony optimization. Inf. Sci. **2018**, 422, 462–479.
- Geethanjali, P. Comparative study of PCA in classification of multichannel EMG signals. Australas. Phys. Eng. Sci. Med. **2015**, 38, 331–343.
- Kakoty, N.M.; Hazarika, S.M.; Gan, J.Q. EMG Feature Set Selection Through Linear Relationship for Grasp Recognition. J. Med. Biol. Eng. **2016**, 36, 883–890.
- Subasi, A.; Kiymik, M.K. Muscle fatigue detection in EMG using time–frequency methods, ICA and neural networks. J. Med. Syst. **2010**, 34, 777–785.
- Zhang, X.; Zhou, P. High-density myoelectric pattern recognition toward improved stroke rehabilitation. IEEE Trans. Biomed. Eng. **2012**, 59, 1649–1657.
- Wang, N.; Lao, K.; Zhang, X. Design and myoelectric control of an anthropomorphic prosthetic hand. J. Bionic Eng. **2017**, 14, 47–59.
- Riillo, F.; Quitadamo, L.R.; Cavrini, F.; Gruppioni, E.; Pinto, C.A.; Pastò, N.C.; Sbernini, L.; Albero, L.; Saggio, G. Optimization of EMG-based hand gesture recognition: Supervised vs. unsupervised data preprocessing on healthy subjects and transradial amputees. Biomed. Signal Process. Control **2014**, 14, 117–125.
- Mohammadi, F.G.; Abadeh, M.S. Image steganalysis using a bee colony based feature selection algorithm. Eng. Appl. Artif. Intell. **2014**, 31, 35–43.
- Xue, B.; Zhang, M.; Browne, W.N. Particle swarm optimisation for feature selection in classification: Novel initialisation and updating mechanisms. Appl. Soft Comput. **2014**, 18, 261–276.
- Wan, Y.; Wang, M.; Ye, Z.; Lai, X. A feature selection method based on modified binary coded ant colony optimization algorithm. Appl. Soft Comput. **2016**, 49, 248–258.
- Zhang, Y.; Wang, S.; Phillips, P.; Ji, G. Binary PSO with mutation operator for feature selection using decision tree applied to spam detection. Knowl.-Based Syst. **2014**, 64, 22–31.
- Rejer, I. Genetic algorithm with aggressive mutation for feature selection in BCI feature space. Pattern Anal. Appl. **2015**, 18, 485–492.
- Hariharan, M.; Fook, C.Y.; Sindhu, R.; Ilias, B.; Yaacob, S. A comparative study of wavelet families for classification of wrist motions. Comput. Electr. Eng. **2012**, 38, 1798–1807.
- Subasi, A. Classification of EMG signals using PSO optimized SVM for diagnosis of neuromuscular disorders. Comput. Biol. Med. **2013**, 43, 576–586.
- Cheraghalipour, A.; Hajiaghaei-Keshteli, M.; Paydar, M.M. Tree Growth Algorithm (TGA): A novel approach for solving optimization problems. Eng. Appl. Artif. Intell. **2018**, 72, 393–414.
- Tsai, A.C.; Hsieh, T.H.; Luh, J.J.; Lin, T.T. A comparison of upper-limb motion pattern recognition using EMG signals during dynamic and isometric muscle contractions. Biomed. Signal Process. Control **2014**, 11, 17–26.
- Gonzalez-Izal, M.; Malanda, A.; Navarro-Amezqueta, I.; Gorostiaga, E.M.; Mallor, F.; Ibanez, J.; Izquierdo, M. EMG spectral indices and muscle power fatigue during dynamic contractions. J. Electromyogr. Kinesiol. **2010**, 20, 233–240.
- Karthick, P.A.; Ramakrishnan, S. Surface electromyography based muscle fatigue progression analysis using modified B distribution time–frequency features. Biomed. Signal Process. Control **2016**, 26, 42–51.
- Kennedy, J.; Eberhart, R.C. A discrete binary version of the particle swarm algorithm. In Proceedings of the IEEE International Conference on Systems, Man, and Cybernetics, Computational Cybernetics and Simulation, Orlando, FL, USA, 11–15 October 1997.
- Mafarja, M.; Eleyan, D.; Abdullah, S.; Mirjalili, S. S-Shaped vs. V-Shaped Transfer Functions for Ant Lion Optimization Algorithm in Feature Selection Problem. In Proceedings of the International Conference on Future Networks and Distributed Systems, Cambridge, UK, 19–20 July 2017; ACM: New York, NY, USA, 2017.
- Mirjalili, S.; Lewis, A. S-shaped versus V-shaped transfer functions for binary Particle Swarm Optimization. Swarm Evol. Comput. **2013**, 9, 1–14.
- Chandrasekaran, K.; Simon, S.P.; Padhy, N.P. Binary real coded firefly algorithm for solving unit commitment problem. Inf. Sci. **2013**, 249, 67–84.
- Emary, E.; Zawbaa, H.M.; Hassanien, A.E. Binary grey wolf optimization approaches for feature selection. Neurocomputing **2016**, 172, 371–381.
- Tawhid, M.A.; Dsouza, K.B. Hybrid Binary Bat Enhanced Particle Swarm Optimization Algorithm for solving feature selection problems. Appl. Comput. Inform. **2018**, in press.
- Chuang, L.Y.; Chang, H.W.; Tu, C.J.; Yang, C.H. Improved binary PSO for feature selection using gene expression data. Comput. Biol. Chem. **2008**, 32, 29–38.
- Chuang, L.Y.; Yang, C.H.; Li, J.C. Chaotic maps based on binary particle swarm optimization for feature selection. Appl. Soft Comput. **2011**, 11, 239–248.
- Tkach, D.; Huang, H.; Kuiken, T.A. Study of stability of time-domain features for electromyographic pattern recognition. J. NeuroEng. Rehabil. **2010**, 7, 21.
- Li, Q.; Chen, H.; Huang, H.; Zhao, X.; Cai, Z.; Tong, C.; Liu, W.; Tian, X. An Enhanced Grey Wolf Optimization Based Feature Selection Wrapped Kernel Extreme Learning Machine for Medical Diagnosis. Comput. Math. Methods Med. **2017**, 2017, 9512741.
- Purushothaman, G.; Vikas, R. Identification of a feature selection based pattern recognition scheme for finger movement recognition from multichannel EMG signals. Australas. Phys. Eng. Sci. Med. **2018**, 41, 549–559.
- Wang, M.; Chen, H.; Yang, B.; Zhao, X.; Hu, L.; Cai, Z.; Huang, H.; Tong, C. Toward an optimal kernel extreme learning machine using a chaotic moth-flame optimization strategy with applications in medical diagnoses. Neurocomputing **2017**, 267, 69–84.
- Zorarpacı, E.; Özel, S.A. A hybrid approach of differential evolution and artificial bee colony for feature selection. Expert Syst. Appl. **2016**, 62, 91–103.

**Figure 10.** Overall confusion matrix of four feature selection methods across 10 subjects. (**a**) BDE; (**b**) BTGA1; (**c**) BTGA2; (**d**) MBTGA.

| Time-Frequency and Statistical Feature | Equation |
|---|---|
| Renyi entropy (RE) | $RE=\frac{1}{1-\alpha}\log_2\sum_{n=1}^{L}\sum_{m=1}^{M}\left(\frac{S[n,m]}{\sum_{n}\sum_{m}S[n,m]}\right)^{\alpha}$ |
| Spectral entropy (SE) | $SE=-\sum_{n=1}^{L}\sum_{m=1}^{M}\frac{P[n,m]}{\sum_{n}\sum_{m}P[n,m]}\log_2\left(\frac{P[n,m]}{\sum_{n}\sum_{m}P[n,m]}\right)$ |
| Shannon entropy (Sh) | $Sh=-\sum_{n=1}^{L}\sum_{m=1}^{M}\frac{S[n,m]}{\sum_{n}\sum_{m}S[n,m]}\log_2\left(\frac{S[n,m]}{\sum_{n}\sum_{m}S[n,m]}\right)$ |
| Singular value decomposition based entropy (E$_{SVD}$) | $E_{SVD}=-\sum_{k=1}^{N}\overline{S_{k}}\log\overline{S_{k}}$ |
| Concentration measure (CM) | $CM=\left(\sum_{n=1}^{L}\sum_{m=1}^{M}\left|S[n,m]\right|^{1/2}\right)^{2}$ |
| Mean | $Mean=\frac{1}{LM}\sum_{n}\sum_{m}S[n,m]$ |
| Variance (VAR) | $VAR=\frac{1}{LM}\sum_{n}\sum_{m}\left(S[n,m]-\mu\right)^{2}$ |
| Coefficient of variation (CoV) | $CoV=\frac{\sigma}{\mu}$ |
| Mean frequency (MNF) | $MNF=\frac{\sum_{m=1}^{M}f_{m}P[n,m]}{\sum_{m=1}^{M}P[n,m]}$ |
| Median frequency (MDF) | $\sum_{m=1}^{MDF}P[n,m]=\sum_{m=MDF}^{M}P[n,m]=\frac{1}{2}\sum_{m=1}^{M}P[n,m]$ |

| Subject | Sensitivity (Original) | Sensitivity (BDE) | Sensitivity (BTGA1) | Sensitivity (BTGA2) | Sensitivity (MBTGA) | Specificity (Original) | Specificity (BDE) | Specificity (BTGA1) | Specificity (BTGA2) | Specificity (MBTGA) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.8958 | 0.9044 | 0.9088 | 0.9092 | 0.9282 | 0.9973 | 0.9975 | 0.9977 | 0.9977 | 0.9982 |
| 2 | 0.8958 | 0.9071 | 0.9147 | 0.9157 | 0.9249 | 0.9973 | 0.9976 | 0.9978 | 0.9978 | 0.9981 |
| 3 | 0.7792 | 0.7968 | 0.7981 | 0.8003 | 0.8282 | 0.9943 | 0.9948 | 0.9948 | 0.9949 | 0.9956 |
| 4 | 0.9042 | 0.9028 | 0.9026 | 0.9032 | 0.9203 | 0.9975 | 0.9975 | 0.9975 | 0.9975 | 0.9980 |
| 5 | 0.7708 | 0.7885 | 0.7925 | 0.7931 | 0.8224 | 0.9941 | 0.9946 | 0.9947 | 0.9947 | 0.9954 |
| 6 | 0.9083 | 0.9199 | 0.9260 | 0.9251 | 0.9478 | 0.9976 | 0.9979 | 0.9981 | 0.9981 | 0.9987 |
| 7 | 0.8083 | 0.8268 | 0.8422 | 0.8422 | 0.8671 | 0.9951 | 0.9956 | 0.9960 | 0.9960 | 0.9966 |
| 8 | 0.7708 | 0.7944 | 0.8038 | 0.8010 | 0.8363 | 0.9941 | 0.9947 | 0.9950 | 0.9949 | 0.9958 |
| 9 | 0.8292 | 0.8312 | 0.8372 | 0.8399 | 0.8618 | 0.9956 | 0.9957 | 0.9958 | 0.9959 | 0.9965 |
| 10 | 0.8292 | 0.8475 | 0.8508 | 0.8536 | 0.8840 | 0.9956 | 0.9961 | 0.9962 | 0.9962 | 0.9970 |
| Mean | 0.8392 | 0.8519 | 0.8577 | 0.8583 | 0.8821 | 0.9959 | 0.9962 | 0.9964 | 0.9964 | 0.9970 |

| Subject | F-Measure (Original) | F-Measure (BDE) | F-Measure (BTGA1) | F-Measure (BTGA2) | F-Measure (MBTGA) | G-Mean (Original) | G-Mean (BDE) | G-Mean (BTGA1) | G-Mean (BTGA2) | G-Mean (MBTGA) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 0.8925 | 0.9020 | 0.9063 | 0.9068 | 0.9264 | 0.9401 | 0.9457 | 0.9482 | 0.9483 | 0.9594 |
| 2 | 0.8963 | 0.9066 | 0.9148 | 0.9155 | 0.9251 | 0.9426 | 0.9487 | 0.9531 | 0.9538 | 0.9589 |
| 3 | 0.7711 | 0.7910 | 0.7918 | 0.7939 | 0.8226 | 0.8617 | 0.8741 | 0.8743 | 0.8760 | 0.8934 |
| 4 | 0.9036 | 0.9021 | 0.9017 | 0.9027 | 0.9200 | 0.9476 | 0.9466 | 0.9464 | 0.9468 | 0.9564 |
| 5 | 0.7565 | 0.7771 | 0.7823 | 0.7821 | 0.8127 | 0.8489 | 0.8629 | 0.8664 | 0.8671 | 0.8849 |
| 6 | 0.9047 | 0.9172 | 0.9231 | 0.9222 | 0.9463 | 0.9471 | 0.9544 | 0.9575 | 0.9571 | 0.9709 |
| 7 | 0.8078 | 0.8259 | 0.8398 | 0.8393 | 0.8664 | 0.8888 | 0.8996 | 0.9082 | 0.9089 | 0.9249 |
| 8 | 0.7605 | 0.7892 | 0.7991 | 0.7958 | 0.8318 | 0.8505 | 0.8702 | 0.8773 | 0.8747 | 0.9006 |
| 9 | 0.8265 | 0.8255 | 0.8308 | 0.8338 | 0.8564 | 0.8982 | 0.8987 | 0.9005 | 0.9028 | 0.9177 |
| 10 | 0.8206 | 0.8402 | 0.8443 | 0.8484 | 0.8795 | 0.8982 | 0.9095 | 0.9123 | 0.9147 | 0.9334 |
| Mean | 0.8340 | 0.8477 | 0.8534 | 0.8540 | 0.8787 | 0.9024 | 0.9110 | 0.9144 | 0.9150 | 0.9300 |

| Subject | AUC (Original) | AUC (BDE) | AUC (BTGA1) | AUC (BTGA2) | AUC (MBTGA) |
|---|---|---|---|---|---|
| 1 | 0.9466 | 0.9510 | 0.9532 | 0.9534 | 0.9632 |
| 2 | 0.9466 | 0.9524 | 0.9563 | 0.9568 | 0.9615 |
| 3 | 0.8868 | 0.8958 | 0.8964 | 0.8976 | 0.9119 |
| 4 | 0.9509 | 0.9501 | 0.9501 | 0.9504 | 0.9591 |
| 5 | 0.8825 | 0.8915 | 0.8936 | 0.8939 | 0.9089 |
| 6 | 0.9530 | 0.9589 | 0.9620 | 0.9616 | 0.9732 |
| 7 | 0.9017 | 0.9112 | 0.9191 | 0.9191 | 0.9318 |
| 8 | 0.8825 | 0.8946 | 0.8994 | 0.8979 | 0.9160 |
| 9 | 0.9124 | 0.9135 | 0.9165 | 0.9179 | 0.9291 |
| 10 | 0.9124 | 0.9218 | 0.9235 | 0.9249 | 0.9405 |
| Mean | 0.9175 | 0.9241 | 0.9270 | 0.9273 | 0.9395 |

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Too, J.; Abdullah, A.R.; Mohd Saad, N.; Mohd Ali, N. Feature Selection Based on Binary Tree Growth Algorithm for the Classification of Myoelectric Signals. *Machines* **2018**, *6*, 65.
https://doi.org/10.3390/machines6040065
