# Towards Real-Time Heartbeat Classification: Evaluation of Nonlinear Morphological Features and Voting Method


## Abstract


## 1. Introduction

#### 1.1. Aim of the Work

#### 1.2. State-of-the-Art

#### 1.3. Contribution

## 2. Methods

#### 2.1. Database

#### AAMI Class Labeling Recommendations

#### 2.2. Pre-Processing

- Mean separation from the noisy ECG,
- Moving-average filter of order five,
- High-pass filter with a 1 Hz cut-off frequency (for baseline wander suppression),
- Low-pass Butterworth filter with a 45 Hz cut-off frequency (to suppress any remaining high-frequency noise).
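The four-step chain above can be sketched in a few lines. This is a minimal sketch, assuming the MIT-BIH sampling rate of 360 Hz and second-order Butterworth stages (the paper does not state the filter orders of the two frequency-selective stages, so order 2 is an assumption; the function name is illustrative):

```python
import numpy as np
from scipy.signal import butter, filtfilt

def preprocess_ecg(x, fs=360):
    """Denoising chain of Section 2.2 (fs = 360 Hz for MIT-BIH records)."""
    x = x - np.mean(x)                               # mean separation
    x = np.convolve(x, np.ones(5) / 5, mode='same')  # order-5 moving average
    b, a = butter(2, 1.0 / (fs / 2), btype='high')   # 1 Hz high-pass (baseline wander)
    x = filtfilt(b, a, x)
    b, a = butter(2, 45.0 / (fs / 2), btype='low')   # 45 Hz Butterworth low-pass
    return filtfilt(b, a, x)
```

Zero-phase filtering (`filtfilt`) is used so the fiducial points of the beat are not shifted by the filter's group delay.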

#### 2.3. Feature Extraction

#### 2.3.1. ICEEMD

- (i) Some residual noise can be present in the modes.
- (ii) During the initial decomposition stages, compared with EEMD, information may appear "late", together with undesired modes.

- Compute the local means of $J$ realizations ${x}^{\left(j\right)}=x+{\beta}_{0}{E}_{1}\left({w}^{\left(j\right)}\right)$, $j=1,2,\dots ,J$, using EMD, to obtain the first residue ${r}_{1}=\langle M\left({x}^{\left(j\right)}\right)\rangle$.
- At the first stage ($l=1$), compute the first IMF:$${C}_{1}=x-{r}_{1}.$$
- For $l=2,\dots ,L$, calculate ${r}_{l}$ as$${r}_{l}=\langle M\left({r}_{l-1}+{\beta}_{l-1}{E}_{l}\left({w}^{\left(j\right)}\right)\right)\rangle .$$
- Calculate the ${l}^{\mathrm{th}}$ mode as$${C}_{l}={r}_{l-1}-{r}_{l}.$$
- Go to step 3 for the next $l$.
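The residue/mode bookkeeping of the steps above can be sketched as follows. This is a structural sketch only: a real implementation would compute $M(\cdot)$ and ${E}_{l}(\cdot)$ by EMD sifting (e.g., via a dedicated EMD library), whereas here a moving-average local mean stands in for both, and the same first-mode stand-in is reused for every ${E}_{l}$. All names are illustrative. Note that the telescoping identity $\sum_{l}{C}_{l}+{r}_{L}=x$ holds by construction, regardless of how the local mean is computed:

```python
import numpy as np

def local_mean(sig, width=11):
    """Stand-in for M(.), the EMD envelope local mean (a true EMD would sift here)."""
    return np.convolve(sig, np.ones(width) / width, mode='same')

def first_noise_mode(w):
    """Crude stand-in for E1(.): first IMF = signal minus its local mean."""
    return w - local_mean(w)

def iceemd_sketch(x, J=20, L=4, beta=0.2, rng=None):
    rng = np.random.default_rng(rng)
    noises = [rng.standard_normal(x.shape) for _ in range(J)]
    # Step 1: r1 = <M(x + beta0 * E1(w^(j)))>, averaged over the J realizations
    r = np.mean([local_mean(x + beta * first_noise_mode(w)) for w in noises], axis=0)
    modes = [x - r]                      # Step 2: C1 = x - r1
    for l in range(2, L + 1):            # Steps 3-5, repeated for l = 2..L
        r_next = np.mean([local_mean(r + beta * first_noise_mode(w)) for w in noises], axis=0)
        modes.append(r - r_next)         # C_l = r_{l-1} - r_l
        r = r_next
    return modes, r                      # L modes and the final residue r_L
```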

#### 2.3.2. Entropy Measures

**Shannon Entropy:**$${E}_{\mathrm{Shannon}}\left(s\right)=-\sum _{i}{s}_{i}^{2}\mathrm{log}\left({s}_{i}^{2}\right),$$**Log Energy Entropy:**$${E}_{\mathrm{log}\mathrm{energy}}\left(s\right)=\sum _{i}\mathrm{log}\left({s}_{i}^{2}\right),$$**Norm Entropy:** the ${l}^{p}$ norm entropy with $1\le p$ is defined as$${E}_{\mathrm{norm}}\left(s\right)=\sum _{i}{|{s}_{i}|}^{p}={\Vert s\Vert}_{p}^{p}.$$
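The three measures translate directly to code; here `s` is a vector of IMF coefficients, and the minus sign in the Shannon form follows the usual wavelet-entropy convention (zero coefficients are skipped to avoid $\mathrm{log}(0)$):

```python
import numpy as np

def shannon_entropy(s):
    p = np.asarray(s) ** 2
    p = p[p > 0]                      # skip zeros: 0*log(0) is taken as 0
    return -np.sum(p * np.log(p))

def log_energy_entropy(s):
    s = np.asarray(s)
    s = s[s != 0]                     # log(0) is undefined
    return np.sum(np.log(s ** 2))

def norm_entropy(s, p=1.1):
    return np.sum(np.abs(s) ** p)     # = ||s||_p^p, for 1 <= p
```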

#### 2.3.3. HOS

#### 2.4. Voting Scheme

**Mathematical Framework:** Consider a pattern recognition model in which a pattern $\mathbf{y}$ is to be assigned to one of $m$ possible classes $\left({\omega}_{1},{\omega}_{2},\dots ,{\omega}_{m}\right)$. Suppose $R$ classifiers are combined, and assume that each classifier operates on a different representation of the measurement vector, ${\mathbf{x}}_{i}$, $i=1,2,\dots ,R$.

**Product Rule:** $p({\mathbf{x}}_{1},{\mathbf{x}}_{2},\dots ,{\mathbf{x}}_{R}|{\omega}_{j})$ represents the joint probability distribution of the measurements computed by the classifiers. Assuming that these representations are statistically independent, the joint distribution factorizes as$$p({\mathbf{x}}_{1},{\mathbf{x}}_{2},\dots ,{\mathbf{x}}_{R}|{\omega}_{j})=\prod _{i=1}^{R}p({\mathbf{x}}_{i}|{\omega}_{j}).$$
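Under this independence assumption, the product rule of Kittler et al. [50] assigns $\mathbf{y}$ to the class maximizing $P({\omega}_{j}{)}^{-(R-1)}\prod _{i}P({\omega}_{j}|{\mathbf{x}}_{i})$. A small sketch (function name illustrative):

```python
import numpy as np

def product_rule(posteriors, priors):
    """Combine R classifiers: argmax_j P(w_j)^(-(R-1)) * prod_i P(w_j | x_i).

    posteriors: (R, m) array, one row of class posteriors per classifier.
    priors:     (m,) array of class prior probabilities.
    """
    P = np.asarray(posteriors, dtype=float)
    R = P.shape[0]
    scores = np.asarray(priors) ** (1 - R) * np.prod(P, axis=0)
    return int(np.argmax(scores))
```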

**Naïve Bayes Classifier:** A probability-based learning algorithm built on the Bayesian framework. According to Bayes' theorem, an unknown pattern $\mathbf{y}$ is assigned to the class, among the $m$ classes, with the highest posterior probability:$$\mathbf{y}\in {\omega}_{k}\phantom{\rule{1em}{0ex}}\text{if}\phantom{\rule{1em}{0ex}}P({\omega}_{k}|\mathbf{y})=\underset{j}{\mathrm{max}}\phantom{\rule{0.222222em}{0ex}}P({\omega}_{j}|\mathbf{y}).$$

**Linear and Quadratic Discriminant Analysis Based Classifiers:** Discriminant analysis derives a decision boundary, or discriminant function, from linear combinations of features that best separate the given classes, under the assumption that examples from the different categories follow Gaussian distributions. For instance, the discriminant function for a two-class problem based on Bayes theory can be written as the difference of the two class log-posteriors, $g\left(\mathbf{x}\right)={g}_{1}\left(\mathbf{x}\right)-{g}_{2}\left(\mathbf{x}\right)$, with the sign of $g\left(\mathbf{x}\right)$ deciding the class.
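For the linear (shared-covariance) case, the decision boundary can be derived in closed form. This is a minimal two-class LDA sketch under the Gaussian assumption, using the same ridge regularization value as the parameter table (function and variable names are illustrative):

```python
import numpy as np

def fit_lda_2class(X0, X1, ridge=1e-6):
    """Two-class LDA under a shared-covariance Gaussian assumption.

    Returns (w, w0) so that g(x) = w @ x + w0 is >= 0 for class 1, < 0 for class 0.
    """
    mu0, mu1 = X0.mean(axis=0), X1.mean(axis=0)
    n0, n1 = len(X0), len(X1)
    # pooled within-class covariance, ridge-regularized for numerical stability
    S = (np.cov(X0.T) * (n0 - 1) + np.cov(X1.T) * (n1 - 1)) / (n0 + n1 - 2)
    S += ridge * np.eye(S.shape[0])
    Sinv = np.linalg.inv(S)
    w = Sinv @ (mu1 - mu0)
    w0 = -0.5 * (mu1 @ Sinv @ mu1 - mu0 @ Sinv @ mu0) + np.log(n1 / n0)
    return w, w0
```

QDA follows the same derivation but keeps a separate covariance per class, which makes the discriminant quadratic in $\mathbf{x}$.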

**J48 Classifier:** Decision tree-based algorithms have recently become popular machine learning strategies. In practice, J48 is an implementation of the popular C4.5 algorithm proposed by Quinlan [53]. In this algorithm, the decision process involves constructing a tree by splitting on features, and the quality of matching $\mathbf{y}$ to a class label ${\omega}_{k}\in \mathit{\omega}$ depends on choosing the feature splits by information gain.

**J48 Consolidated (J48-C) Classifier:** A consolidated version of the C4.5 classifier. "J48 Consolidated" is the WEKA implementation of the consolidated tree construction (CTC) algorithm proposed by Arbelaiz et al. [54]. The basic idea is to build a single tree from several subsamples. In each iteration, the best feature is selected by information gain, as in J48; once the best split is found, all the subsamples are partitioned on that same feature. More details can be found in [54]. The parameters used for the J48 and J48-C classifiers are given in Table 4 below.
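One common instantiation of the voting combination evaluated in the experiments is plain majority voting over the individual classifier decisions; a sketch (names illustrative; ties fall to the first-seen label here, whereas a deployed system would need an explicit tie-break rule):

```python
from collections import Counter

def majority_vote(labels):
    """Majority vote over R classifier decisions for one beat."""
    return Counter(labels).most_common(1)[0][0]

def vote_ensemble(per_classifier_preds):
    """per_classifier_preds: list of R equal-length label sequences, one per classifier.
    Returns the per-beat majority decision."""
    return [majority_vote(beat) for beat in zip(*per_classifier_preds)]
```

For example, if J48, LDA, and naïve Bayes predict `N`, `N`, `V` for a beat, the ensemble outputs `N`.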

## 3. Results

#### The Performance Measures

## 4. Discussion

#### 4.1. Comparative Analysis

#### 4.2. Limitation and Future Scope

## 5. Conclusions

## Author Contributions

## Funding

## Conflicts of Interest

## References

1. Alwan, A. Global Status Report on Noncommunicable Diseases 2010; World Health Organization: Geneva, Switzerland, 2011.
2. Augustyniak, P.; Tadeusiewicz, R. Assessment of electrocardiogram visual interpretation strategy based on scanpath analysis. Physiol. Meas. **2006**, 27, 597.
3. Moody, G.B.; Mark, R.G. The impact of the MIT-BIH arrhythmia database. IEEE Eng. Med. Biol. Mag. **2001**, 20, 45–50.
4. Pławiak, P. Novel methodology of cardiac health recognition based on ECG signals and evolutionary-neural system. Expert Syst. Appl. **2018**, 92, 334–349.
5. Yang, J.; Bai, Y.; Lin, F.; Liu, M.; Hou, Z.; Liu, X. A novel electrocardiogram arrhythmia classification method based on stacked sparse auto-encoders and softmax regression. Int. J. Mach. Learn. Cybern. **2018**, 9, 1733–1740.
6. Tuncer, T.; Dogan, S.; Pławiak, P.; Acharya, U.R. Automated arrhythmia detection using novel hexadecimal local pattern and multilevel wavelet transform with ECG signals. Knowl. Based Syst. **2019**, 104923.
7. Rajesh, K.N.; Dhuli, R. Classification of ECG heartbeats using nonlinear decomposition methods and support vector machine. Comput. Biol. Med. **2017**, 87, 271–284.
8. Pławiak, P. Novel genetic ensembles of classifiers applied to myocardium dysfunction recognition based on ECG signals. Swarm Evol. Comput. **2018**, 39, 192–208.
9. Yıldırım, Ö.; Pławiak, P.; Tan, R.S.; Acharya, U.R. Arrhythmia detection using deep convolutional neural network with long duration ECG signals. Comput. Biol. Med. **2018**, 102, 411–420.
10. Pławiak, P.; Acharya, U.R. Novel deep genetic ensemble of classifiers for arrhythmia detection using ECG signals. Neural Comput. Appl. **2019**, 1–25.
11. Pławiak, P.; Abdar, M. Novel Methodology for Cardiac Arrhythmias Classification Based on Long-Duration ECG Signal Fragments Analysis. In Biomedical Signal Processing; Springer: Singapore, 2020; pp. 225–272.
12. Khalaf, A.F.; Owis, M.I.; Yassine, I.A. A novel technique for cardiac arrhythmia classification using spectral correlation and support vector machines. Expert Syst. Appl. **2015**, 42, 8361–8368.
13. Mert, A. ECG feature extraction based on the bandwidth properties of variational mode decomposition. Physiol. Meas. **2016**, 37, 530.
14. Li, H.; Liang, H.; Miao, C.; Cao, L.; Feng, X.; Tang, C.; Li, E. Novel ECG signal classification based on KICA nonlinear feature extraction. Circ. Syst. Signal Process. **2016**, 35, 1187–1197.
15. Alickovic, E.; Subasi, A. Medical decision support system for diagnosis of heart arrhythmia using DWT and random forests classifier. J. Med. Syst. **2016**, 40, 108.
16. Martis, R.J.; Acharya, U.R.; Lim, C.M.; Mandana, K.; Ray, A.K.; Chakraborty, C. Application of higher order cumulant features for cardiac health diagnosis using ECG signals. Int. J. Neural Syst. **2013**, 23, 1350014.
17. Sharma, P.; Ray, K.C. Efficient methodology for electrocardiogram beat classification. IET Signal Process. **2016**, 10, 825–832.
18. Mishra, A.K.; Raghav, S. Local fractal dimension based ECG arrhythmia classification. Biomed. Signal Process. Control **2010**, 5, 114–123.
19. Osowski, S.; Hoai, L.T.; Markiewicz, T. Support vector machine-based expert system for reliable heartbeat recognition. IEEE Trans. Biomed. Eng. **2004**, 51, 582–589.
20. Ye, C.; Kumar, B.V.; Coimbra, M.T. Heartbeat classification using morphological and dynamic features of ECG signals. IEEE Trans. Biomed. Eng. **2012**, 59, 2930–2941.
21. Lin, C.H. Classification enhancible grey relational analysis for cardiac arrhythmias discrimination. Med. Biol. Eng. Comput. **2006**, 44, 311–320.
22. Cuesta-Frau, D.; Biagetti, M.O.; Quinteiro, R.A.; Mico-Tormos, P.; Aboy, M. Unsupervised classification of ventricular extrasystoles using bounded clustering algorithms and morphology matching. Med. Biol. Eng. Comput. **2007**, 45, 229–239.
23. Tadeusiewicz, R. Neural networks as a tool for modeling of biological systems. Bio-Algorithms Med.-Syst. **2015**, 11, 135–144.
24. Kutlu, Y.; Kuntalp, D. A multi-stage automatic arrhythmia recognition and classification system. Comput. Biol. Med. **2011**, 41, 37–45.
25. Martis, R.J.; Acharya, U.R.; Min, L.C. ECG beat classification using PCA, LDA, ICA and discrete wavelet transform. Biomed. Signal Process. Control **2013**, 8, 437–448.
26. Martis, R.J.; Acharya, U.R.; Lim, C.M.; Suri, J.S. Characterization of ECG beats from cardiac arrhythmia using discrete cosine transform in PCA framework. Knowl.-Based Syst. **2013**, 45, 76–82.
27. Elhaj, F.A.; Salim, N.; Harris, A.R.; Swee, T.T.; Ahmed, T. Arrhythmia recognition and classification using combined linear and nonlinear features of ECG signals. Comput. Methods Prog. Biomed. **2016**, 127, 52–63.
28. Li, P.; Wang, Y.; He, J.; Wang, L.; Tian, Y.; Zhou, T.S.; Li, T.; Li, J.S. High-performance personalized heartbeat classification model for long-term ECG signal. IEEE Trans. Biomed. Eng. **2017**, 64, 78–86.
29. Desai, U.; Martis, R.J.; Nayak, C.G.; Sarika, K.; Seshikala, G. Machine intelligent diagnosis of ECG for arrhythmia classification using DWT, ICA and SVM techniques. In Proceedings of the India Conference (INDICON), New Delhi, India, 17–20 December 2015; pp. 1–4.
30. Desai, U.; Martis, R.J.; Gurudas Nayak, C.; Seshikala, G.; Sarika, K.; Shetty K., R. Decision support system for arrhythmia beats using ECG signals with DCT, DWT and EMD methods: A comparative study. J. Mech. Med. Biol. **2016**, 16, 1640012.
31. De Chazal, P.; O'Dwyer, M.; Reilly, R.B. Automatic classification of heartbeats using ECG morphology and heartbeat interval features. IEEE Trans. Biomed. Eng. **2004**, 51, 1196–1206.
32. Egila, M.G.; El-Moursy, M.A.; El-Hennawy, A.E.; El-Simary, H.A.; Zaki, A. FPGA-based electrocardiography (ECG) signal analysis system using least-square linear phase finite impulse response (FIR) filter. J. Elec. Syst. Inf. Technol. **2016**, 3, 513–526.
33. Raj, S.; Maurya, K.; Ray, K.C. A knowledge-based real time embedded platform for arrhythmia beat classification. Biomed. Eng. Lett. **2015**, 5, 271–280.
34. Zairi, H.; Talha, M.K.; Meddah, K.; Slimane, S.O. FPGA-based system for artificial neural network arrhythmia classification. Neural Comput. Appl. **2019**, 1–16.
35. Jewajinda, Y.; Chongstitvatana, P. FPGA-based online-learning using parallel genetic algorithm and neural network for ECG signal classification. In Proceedings of ECTI-CON 2010: The 2010 ECTI International Conference on Electrical Engineering/Electronics, Computer, Telecommunications and Information Technology, Chiang Mai, Thailand, 19–21 May 2010; pp. 1050–1054.
36. Rajesh, K.N.; Dhuli, R. Classification of imbalanced ECG beats using re-sampling techniques and AdaBoost ensemble classifier. Biomed. Signal Process. Control **2018**, 41, 242–254.
37. Amann, A.; Tratnig, R.; Unterkofler, K. Reliability of old and new ventricular fibrillation detection algorithms for automated external defibrillators. Biomed. Eng. Online **2005**, 4, 60.
38. Huang, N.E.; Shen, Z.; Long, S.R.; Wu, M.C.; Shih, H.H.; Zheng, Q.; Yen, N.C.; Tung, C.C.; Liu, H.H. The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis. Proc. R. Soc. Lond. A Math. Phys. Eng. Sci. **1998**, 454, 903–995.
39. Wu, Z.; Huang, N.E. Ensemble empirical mode decomposition: A noise-assisted data analysis method. Adv. Adapt. Data Anal. **2009**, 1, 1–41.
40. Torres, M.E.; Colominas, M.A.; Schlotthauer, G.; Flandrin, P. A complete ensemble empirical mode decomposition with adaptive noise. In Proceedings of the IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Prague, Czech Republic, 22–27 May 2011; pp. 4144–4147.
41. Colominas, M.A.; Schlotthauer, G.; Torres, M.E. Improved complete ensemble EMD: A suitable tool for biomedical signal processing. Biomed. Signal Process. Control **2014**, 14, 19–29.
42. Li, T.; Zhou, M. ECG classification using wavelet packet entropy and random forests. Entropy **2016**, 18, 285.
43. Rosso, O.A.; Blanco, S.; Yordanova, J.; Kolev, V.; Figliola, A.; Schürmann, M.; Başar, E. Wavelet entropy: A new tool for analysis of short duration brain electrical signals. J. Neurosci. Methods **2001**, 105, 65–75.
44. Shannon, C.E. A mathematical theory of communication, Part I, Part II. Bell Syst. Tech. J. **1948**, 27, 623–656.
45. Coifman, R.R.; Wickerhauser, M.V. Entropy-based algorithms for best basis selection. IEEE Trans. Inf. Theory **1992**, 38, 713–718.
46. Martis, R.J.; Acharya, U.R.; Ray, A.K.; Chakraborty, C. Application of higher order cumulants to ECG signals for the cardiac health diagnosis. In Proceedings of the 2011 Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Boston, MA, USA, 30 August–3 September 2011; pp. 1697–1700.
47. Nikias, C.L.; Mendel, J.M. Signal processing with higher-order spectra. IEEE Signal Process. Mag. **1993**, 10, 10–37.
48. Swami, A.; Mendel, J.M.; Nikias, C.L.M. Higher-Order Spectral Analysis Toolbox. **1984**.
49. Wolpert, D.H.; Macready, W.G. No free lunch theorems for optimization. IEEE Trans. Evolut. Comput. **1997**, 1, 67–82.
50. Kittler, J.; Hatef, M.; Duin, R.P.; Matas, J. On combining classifiers. IEEE Trans. Pattern Anal. Mach. Intell. **1998**, 20, 226–239.
51. John, G.H.; Langley, P. Estimating continuous distributions in Bayesian classifiers. In Proceedings of the Eleventh Conference on Uncertainty in Artificial Intelligence, Montréal, QC, Canada, 18–20 August 1995; pp. 338–345.
52. Duda, R.O.; Hart, P.E.; Stork, D.G. Pattern Classification; John Wiley & Sons: Hoboken, NJ, USA, 2012.
53. Quinlan, J.R. C4.5: Programs for Machine Learning; Elsevier: Amsterdam, The Netherlands, 2014.
54. Arbelaiz Gallego, O.; Gurrutxaga, I.; Lozano, F.; Muguerza, J.; Pérez, J.M. J48Consolidated: An Implementation of CTC Algorithm for WEKA. 2016. Available online: https://addi.ehu.es/handle/10810/17314 (accessed on 17 November 2019).
55. Yang, Y.; Webb, G.I. Discretization for naive-Bayes learning: Managing discretization bias and variance. Mach. Learn. **2009**, 74, 39–74.
56. Witten, I.H.; Frank, E.; Hall, M.A.; Pal, C.J. Data Mining: Practical Machine Learning Tools and Techniques; Morgan Kaufmann: Burlington, MA, USA, 2016.
57. Powers, D.M. Evaluation: From precision, recall and F-measure to ROC, informedness, markedness and correlation. Available online: https://bioinfopublication.org/files/articles/2_1_1_JMLT.pdf (accessed on 19 November 2019).
58. Luz, E.J.D.S.; Nunes, T.M.; De Albuquerque, V.H.C.; Papa, J.P.; Menotti, D. ECG arrhythmia classification based on optimum-path forest. Expert Syst. Appl. **2013**, 40, 3561–3573.
59. Queiroz, V.; Luz, E.; Moreira, G.; Guarda, Á.; Menotti, D. Automatic cardiac arrhythmia detection and classification using vectorcardiograms and complex networks. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 5203–5206.
60. Luz, E.J.D.S.; Merschmann, L.H.; Menotti, D.; Moreira, G.J. Evaluating a hierarchical approach for heartbeat classification from ECG. Int. J. Bioinf. Res. Appl. **2017**, 13, 146–160.
61. Garcia, G.; Moreira, G.; Luz, E.; Menotti, D. Improving automatic cardiac arrhythmia classification: Joining temporal-VCG, complex networks and SVM classifier. In Proceedings of the 2016 International Joint Conference on Neural Networks (IJCNN), Vancouver, BC, Canada, 24–29 July 2016; pp. 3896–3900.
62. Chen, S.; Hua, W.; Li, Z.; Li, J.; Gao, X. Heartbeat classification using projected and dynamic features of ECG signal. Biomed. Signal Process. Control **2017**, 31, 165–173.
63. Garcia, G.; Moreira, G.; Menotti, D.; Luz, E. Inter-patient ECG heartbeat classification with temporal VCG optimized by PSO. Sci. Rep. **2017**, 7, 10543.

| AAMI Classes | MIT-BIH Heartbeats | Total Data | Training (DS1) | Testing (DS2) |
|---|---|---|---|---|
| N | Normal, left and right bundle branch block, atrial and nodal escape beats | 83,761 | 41,746 | 42,015 |
| S | Atrial premature contraction, aberrated atrial, supraventricular and junctional premature beats | 2614 | 777 | 1837 |
| V | Premature ventricular contraction, ventricular flutter and escape beats | 6893 | 3787 | 3106 |
| F | Fusion of ventricular and normal beats | 526 | 266 | 260 |
| Q | Paced, unclassifiable, fusion of paced and normal beats | 12 | 6 | 6 |

| Parameters | Naïve Bayes |
|---|---|
| Use Kernel Estimator | False |
| Use Supervised Discretization | True |

| Parameters | LDA | QDA |
|---|---|---|
| Ridge | $1.0\times {10}^{-6}$ | $1.0\times {10}^{-6}$ |

| Parameters | J48 | J48-C |
|---|---|---|
| Minimum Objects | 1000 | 1000 |
| Use MDL correction | True | True |
| Number of folds | 3 | 3 |
| Sub-tree raising | True | True |

| Actual \ Predicted | N | V | S | F | Q | Sum |
|---|---|---|---|---|---|---|
| N | ${N}_{n}$ | ${N}_{v}$ | ${N}_{s}$ | ${N}_{f}$ | ${N}_{q}$ | ${R}_{N}$ |
| V | ${V}_{n}$ | ${V}_{v}$ | ${V}_{s}$ | ${V}_{f}$ | ${V}_{q}$ | ${R}_{V}$ |
| S | ${S}_{n}$ | ${S}_{v}$ | ${S}_{s}$ | ${S}_{f}$ | ${S}_{q}$ | ${R}_{S}$ |
| F | ${F}_{n}$ | ${F}_{v}$ | ${F}_{s}$ | ${F}_{f}$ | ${F}_{q}$ | ${R}_{F}$ |
| Q | ${Q}_{n}$ | ${Q}_{v}$ | ${Q}_{s}$ | ${Q}_{f}$ | ${Q}_{q}$ | ${R}_{Q}$ |
| Sum | ${C}_{N}$ | ${C}_{V}$ | ${C}_{S}$ | ${C}_{F}$ | ${C}_{Q}$ | $R/C$ |
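From a confusion matrix laid out as above (rows = actual labels, columns = predicted labels), the per-class sensitivity (SEN), false positive rate (FPR), positive predictive value (PPV), and overall accuracy (OA) reported in the following tables can be computed as (function name illustrative):

```python
import numpy as np

def per_class_measures(C):
    """Per-class SEN, FPR, PPV and overall accuracy from a square confusion matrix."""
    C = np.asarray(C, dtype=float)
    tp = np.diag(C)
    fn = C.sum(axis=1) - tp        # row sums (R_i) minus the diagonal
    fp = C.sum(axis=0) - tp        # column sums (C_i) minus the diagonal
    tn = C.sum() - tp - fn - fp
    with np.errstate(divide='ignore', invalid='ignore'):
        sen = np.where(tp + fn > 0, tp / (tp + fn), 0.0)   # sensitivity (recall)
        fpr = np.where(fp + tn > 0, fp / (fp + tn), 0.0)   # false positive rate
        ppv = np.where(tp + fp > 0, tp / (tp + fp), 0.0)   # positive predictive value
    oa = tp.sum() / C.sum()                                # overall accuracy
    return sen, fpr, ppv, oa
```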

**LDA**

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 40,842 | 33 | 47 | 1093 | 0 |
| V | 319 | 2787 | 0 | 0 | 0 |
| S | 1831 | 1 | 2 | 3 | 0 |
| F | 171 | 0 | 0 | 89 | 0 |
| Q | 0 | 0 | 0 | 0 | 6 |

**QDA**

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 2777 | 266 | 36,394 | 2578 | 0 |
| V | 5 | 3101 | 0 | 0 | 0 |
| S | 95 | 56 | 1675 | 11 | 0 |
| F | 91 | 64 | 20 | 85 | 0 |
| Q | 0 | 6 | 0 | 0 | 0 |

**naïve Bayes**

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 36,222 | 745 | 997 | 3289 | 762 |
| V | 930 | 1874 | 52 | 231 | 19 |
| S | 1321 | 3 | 133 | 27 | 353 |
| F | 8 | 0 | 2 | 245 | 5 |
| Q | 1 | 0 | 0 | 4 | 1 |

**J48**

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 41,801 | 214 | 0 | 0 | 0 |
| V | 363 | 2743 | 0 | 0 | 0 |
| S | 1819 | 18 | 0 | 0 | 0 |
| F | 259 | 1 | 0 | 0 | 0 |
| Q | 5 | 1 | 0 | 0 | 0 |

**J48-C**

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 0 | 6941 | 28,966 | 5290 | 818 |
| V | 0 | 2205 | 894 | 6 | 1 |
| S | 0 | 24 | 1785 | 19 | 9 |
| F | 0 | 30 | 23 | 191 | 16 |
| Q | 0 | 0 | 0 | 0 | 6 |

**LDA**

| OA = 92.60 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 97.2 | 44.6 | 94.6 |
| V | 89.7 | 0.1 | 98.8 |
| S | 0.1 | 0.1 | 4.1 |
| F | 34.2 | 2.3 | 7.5 |
| Q | 100 | 0 | 100 |

**QDA**

| OA = 16.20 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 6.6 | 3.7 | 93.6 |
| V | 99.8 | 0.9 | 88.8 |
| S | 91.2 | 80.2 | 4.4 |
| F | 32.7 | 5.5 | 3.2 |
| Q | 0 | 0 | 0 |

**naïve Bayes**

| OA = 81.47 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 86.2 | 43.4 | 94.1 |
| V | 60.3 | 1.7 | 71.5 |
| S | 7.2 | 2.3 | 11.2 |
| F | 94.2 | 7.6 | 6.5 |
| Q | 16.7 | 2.4 | 0.1 |

**J48**

| OA = 94.32 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 99.5 | 47 | 94.5 |
| V | 88.3 | 0.5 | 92.1 |
| S | 0 | 0 | 0 |
| F | 0 | 0 | 0 |
| Q | 0 | 0 | 0 |

**J48-C**

| OA = 8.86 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 0 | 0 | 0 |
| V | 71.0 | 15.9 | 24.0 |
| S | 97.2 | 65.8 | 5.6 |
| F | 73.5 | 11.3 | 3.5 |
| Q | 100 | 1.8 | 0.7 |

**Table 8.** Confusion matrix for combining J48, LDA, and naïve Bayes classifiers using a voting scheme.

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 39,542 | 53 | 489 | 1931 | 0 |
| V | 395 | 2708 | 3 | 0 | 0 |
| S | 1473 | 1 | 353 | 10 | 0 |
| F | 22 | 5 | 1 | 232 | 0 |
| Q | 0 | 0 | 0 | 0 | 6 |

**Table 9.** Performance measures for combining J48, LDA, and naïve Bayes classifiers using a voting scheme.

| OA = 90.71 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 94.1 | 36.3 | 95.4 |
| V | 87.2 | 0.1 | 97.9 |
| S | 19.2 | 1.1 | 41.7 |
| F | 89.2 | 4.1 | 10.7 |
| Q | 100 | 0 | 100 |

**Table 10.** Confusion matrix for combining J48, naïve Bayes, and QDA classifiers using a voting scheme.

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 35,629 | 253 | 3836 | 2297 | 0 |
| V | 11 | 3095 | 0 | 0 | 0 |
| S | 1188 | 11 | 624 | 14 | 0 |
| F | 28 | 61 | 11 | 160 | 0 |
| Q | 0 | 6 | 0 | 0 | 0 |

**Table 11.** Performance measures for combining J48, naïve Bayes, and QDA classifiers using a voting scheme.

| OA = 83.6 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 84.8 | 23.6 | 96.7 |
| V | 99.6 | 0.8 | 90.3 |
| S | 34 | 8.5 | 14 |
| F | 61.5 | 4.9 | 6.5 |
| Q | 0 | 0 | 0 |

**Table 12.** Confusion matrix for combining J48-C, naïve Bayes, and QDA classifiers using a voting scheme.

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 29,730 | 255 | 9089 | 2491 | 0 |
| V | 8 | 3098 | 0 | 0 | 0 |
| S | 1031 | 8 | 779 | 19 | 0 |
| F | 21 | 61 | 12 | 166 | 0 |
| Q | 0 | 6 | 0 | 0 | 0 |

**Table 13.** Performance measures for combining J48-C, naïve Bayes, and QDA classifiers using a voting scheme.

| OA = 71.51 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 70.8 | 20.3 | 96.6 |
| V | 99.7 | 0.7 | 90.4 |
| S | 42.4 | 20.1 | 7.9 |
| F | 63.8 | 6.3 | 5.3 |
| Q | 0 | 0 | 0 |

**Table 14.** Confusion matrix for combining J48-C, naïve Bayes, and LDA classifiers using a voting scheme.

| Actual \ Predicted | N | V | S | F | Q |
|---|---|---|---|---|---|
| N | 38,329 | 66 | 942 | 2678 | 0 |
| V | 553 | 2532 | 20 | 1 | 0 |
| S | 1405 | 2 | 416 | 14 | 0 |
| F | 20 | 4 | 1 | 235 | 0 |
| Q | 0 | 0 | 0 | 0 | 6 |

**Table 15.** Performance measures for combining J48-C, naïve Bayes, and LDA classifiers using a voting scheme.

| OA = 87.91 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 91.2 | 38 | 95.1 |
| V | 81.5 | 0.2 | 97.2 |
| S | 22.6 | 2.1 | 30.2 |
| F | 90.4 | 5.7 | 8 |
| Q | 100 | 0 | 100 |

**LDA**

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 41,875 | 27 | 113 |
| V | 280 | 2826 | 0 |
| S | 1835 | 0 | 2 |

**QDA**

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 4569 | 480 | 36,966 |
| V | 5 | 3101 | 0 |
| S | 105 | 56 | 1676 |

**naïve Bayes**

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 37,385 | 2968 | 1662 |
| V | 922 | 2120 | 64 |
| S | 1336 | 18 | 483 |

**J48**

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 41,998 | 17 | 0 |
| V | 1095 | 2011 | 0 |
| S | 1837 | 0 | 0 |

**J48-C**

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 34,999 | 7016 | 0 |
| V | 651 | 2455 | 0 |
| S | 1585 | 252 | 0 |

**LDA**

| OA = 95.19 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 99.7 | 42.8 | 95.2 |
| V | 91.0 | 0.1 | 99.1 |
| S | 0.1 | 0.3 | 1.7 |

**QDA**

| OA = 19.90 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 10.9 | 2.2 | 97.6 |
| V | 99.8 | 1.2 | 85.3 |
| S | 91.2 | 81.9 | 4.3 |

**naïve Bayes**

| OA = 85.15 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 89 | 45.7 | 94.3 |
| V | 68.3 | 6.8 | 41.5 |
| S | 26.3 | 3.8 | 21.9 |

**J48**

| OA = 93.71 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 100 | 59.3 | 93.5 |
| V | 64.7 | 0 | 99.2 |
| S | 0 | 0 | 0 |

**J48-C**

| OA = 79.76 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 83.3 | 45.2 | 94 |
| V | 79 | 16.6 | 25.2 |
| S | 0 | 0 | 0 |

**Table 18.** Confusion matrix for combining J48, LDA, and naïve Bayes classifiers using a voting scheme (N, S, V).

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 40,918 | 361 | 736 |
| V | 205 | 2897 | 4 |
| S | 1469 | 3 | 365 |

**Table 19.** Performance measures for combining J48, LDA, and naïve Bayes classifiers using a voting scheme (N, S, V).

| OA = 94.08 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 97.4 | 33.9 | 96.1 |
| V | 93.3 | 0.8 | 88.8 |
| S | 19.9 | 1.6 | 33 |

**Table 20.** Confusion matrix for combining J48, naïve Bayes, and QDA classifiers using a voting scheme (N, S, V).

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 37,421 | 574 | 4020 |
| V | 12 | 3094 | 0 |
| S | 1203 | 8 | 626 |

**Table 21.** Performance measures for combining J48, naïve Bayes, and QDA classifiers using a voting scheme (N, S, V).

| OA = 87.61 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 89.1 | 24.6 | 96.9 |
| V | 99.6 | 1.3 | 84.2 |
| S | 34.1 | 8.9 | 13.5 |

**Table 22.** Confusion matrix for combining J48-C, naïve Bayes, and QDA classifiers using a voting scheme (N, S, V).

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 32,011 | 598 | 9406 |
| V | 7 | 3099 | 0 |
| S | 1062 | 8 | 767 |

**Table 23.** Performance measures for combining J48-C, naïve Bayes, and QDA classifiers using a voting scheme (N, S, V).

| OA = 76.40 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 76.2 | 21.6 | 96.8 |
| V | 99.8 | 1.4 | 83.6 |
| S | 41.8 | 20.8 | 7.5 |

**Table 24.** Confusion matrix for combining J48-C, naïve Bayes, and LDA classifiers using a voting scheme (N, S, V).

| Actual \ Predicted | N | V | S |
|---|---|---|---|
| N | 40,248 | 688 | 1079 |
| V | 359 | 2735 | 12 |
| S | 1407 | 2 | 428 |

**Table 25.** Performance measures for combining J48-C, naïve Bayes, and LDA classifiers using a voting scheme (N, S, V).

| OA = 92.44 | SEN% | FPR% | PPV% |
|---|---|---|---|
| N | 95.8 | 35.7 | 95.8 |
| V | 88.1 | 1.6 | 79.9 |
| S | 23.3 | 2.4 | 28.2 |

| Feature Number | Features | N | S | V | F | Q |
|---|---|---|---|---|---|---|
| 1 | CUM2(IMF1) | 1.0 $\times {10}^{-6}$ ± 4.0 $\times {10}^{-6}$ | 8.0 $\times {10}^{-6}$ ± 4.03 $\times {10}^{-5}$ | 7.50 $\times {10}^{-5}$ ± 0.00045775 | 0.000162 ± 0.000326 | 0.0001405 ± 0.009121 |
| 2 | CUM3(IMF1) | 0 ± 0 | 0 ± 0 | 0 ± 2.0 $\times {10}^{-6}$ | 0 ± 5.0 $\times {10}^{-6}$ | 0 ± 0.000313 |
| 3 | CUM4(IMF1) | 0 ± 0 | 0 ± 0 | 1.0 $\times {10}^{-6}$ ± 2.58 $\times {10}^{-5}$ | 2.50 $\times {10}^{-6}$ ± 1.1 $\times {10}^{-5}$ | 4.5 $\times {10}^{-6}$ ± 0.006113 |
| 4 | Shan(IMF1) | 0.004769 ± 0.009686 | 0.017643 ± 0.06530625 | 0.112159 ± 0.4199265 | 0.216149 ± 0.330006 | 0.153638 ± 1.47632 |
| 5 | log(IMF1) | −4792.4962 ± 379.828 | −4735.1172 ± 286.978 | −4532.44 ± 467.507 | −4358.371 ± 271.824 | −4604.445 ± 1783.20 |
| 6 | norm(IMF1) | 0.1004 ± 0.0726 | 0.1538 ± 0.1719 | 0.0815 ± 0.05794 | 0.5541 ± 0.5434 | 18.1790 ± 20.55612 |
| 7 | CUM2(IMF2) | 0.001 ± 0.00347 | 0.00109 ± 0.0035 | 0.00028 ± 0.00137 | 0.00107 ± 0.00143 | 0.00411 ± 0.0134 |
| 8 | CUM3(IMF2) | 8.0 $\times {10}^{-6}$ ± 5.20 $\times {10}^{-5}$ | 4.0 $\times {10}^{-6}$ ± 5.10 $\times {10}^{-5}$ | 0 ± 6.0 $\times {10}^{-6}$ | 5.0 $\times {10}^{-6}$ ± 2.10 $\times {10}^{-5}$ | 0 ± 0.000387 |
| 9 | CUM4(IMF2) | 1.50 $\times {10}^{-5}$ ± 0.000185 | 2.10 $\times {10}^{-5}$ ± 0.000128 | 2.0 $\times {10}^{-6}$ ± 4.20 $\times {10}^{-5}$ | 1.60 $\times {10}^{-5}$ ± 5.20 $\times {10}^{-5}$ | 0.00046 ± 0.002581 |
| 10 | Shan(IMF2) | 1.3074 ± 2.9180 | 1.3675 ± 3.4250 | 0.431 ± 1.429 | 1.4044 ± 1.44285 | 2.90440 ± 6.76104 |
| 11 | log(IMF2) | −4055.72 ± 403.632 | −4045.57 ± 565.833 | −4064.90 ± 479.11 | −3788.21 ± 266.57 | −4023.39 ± 1366.58 |
| 12 | norm(IMF2) | 2.41 ± 3.81 | 2.57 ± 4.95 | 2.744 ± 1.768 | 2.69 ± 2.04 | 18.17 ± 20.556 |
| 13 | CUM2(IMF3) | 0.0069 ± 0.0097 | 0.00534 ± 0.009013 | 0.00443 ± 0.00936 | 0.011997 ± 0.00909 | 0.004148 ± 0.008015 |
| 14 | CUM3(IMF3) | −7.80 $\times {10}^{-5}$ ± 0.000322 | −8.60 $\times {10}^{-5}$ ± 0.0003605 | −3.10 $\times {10}^{-5}$ ± 0.000246 | −0.00019 ± 0.00062 | 8.95 $\times {10}^{-5}$ ± 0.000266 |
| 15 | CUM4(IMF3) | 0.000235 ± 0.000863 | 0.000104 ± 0.0006515 | 0.000117 ± 0.000747 | 0.0005255 ± 0.000767 | 0.0001945 ± 0.000396 |
| 16 | Shan(IMF3) | 6.3075515 ± 6.075866 | 5.532014 ± 6.254886 | 4.570122 ± 5.93904725 | 10.002892 ± 5.49255 | 3.8205085 ± 6.888019 |
| 17 | log(IMF3) | −3114.4106 ± 443.70730 | −3071.71175 ± 705.504 | −3178.659 ± 531.3571 | −2731.2660 ± 467.4692 | −3341.5431 ± 891.0573 |
| 18 | norm(IMF3) | 9.3564 ± 7.6444 | 8.69133 ± 8.37546 | 7.585 ± 4.1465 | 14.85 ± 7.440 | 18.179 ± 20.55 |
| 19 | CUM2(IMF4) | 0.013817 ± 0.0210 | 0.01233 ± 0.01653 | 0.0215 ± 0.03611 | 0.0370 ± 0.0297 | 0.0069 ± 0.0102 |
| 20 | CUM3(IMF4) | −0.00012 ± 0.00081 | −6.80 $\times {10}^{-5}$ ± 0.000469 | −0.00014 ± 0.001305 | −0.001542 ± 0.00207 | 0 ± 0.00039 |
| 21 | CUM4(IMF4) | 0.000147 ± 0.000734 | 4.60 $\times {10}^{-5}$ ± 0.0003225 | 0.000404 ± 0.00189 | 0.00027 ± 0.00124 | 0.000138 ± 0.00018 |
| 22 | Shan(IMF4) | 13.05585 ± 12.76457 | 12.2365 ± 11.295 | 16.104 ± 15.135 | 25.934 ± 15.110 | 7.113 ± 10.471 |
| 23 | log(IMF4) | −2150.494 ± 533.542 | −2143.567 ± 611.308 | −2206.943 ± 794.75 | −1614.17 ± 480.99 | −2593.22 ± 1271.81 |
| 24 | norm(IMF4) | 19.367 ± 15.509 | 18.686 ± 14.240 | 16.26 ± 9.166 | 35.59 ± 20.941 | 18.179 ± 20.556 |
| 25 | CUM2(IMF5) | 0.0084 ± 0.018 | 0.0066 ± 0.0156 | 0.033 ± 0.0648 | 0.056 ± 0.058 | 0.0075 ± 0.0149 |
| 26 | CUM3(IMF5) | 1.0 $\times {10}^{-6}$ ± 0.00017 | 0 ± 0.000103 | −1.0 $\times {10}^{-6}$ ± 0.00146 | 1.40 $\times {10}^{-5}$ ± 0.0022 | 0 ± 0.000405 |
| 27 | CUM4(IMF5) | −7.00 $\times {10}^{-6}$ ± 0.00012 | −6.0 $\times {10}^{-6}$ ± 0.000127 | −3.0 $\times {10}^{-5}$ ± 0.00193 | −0.00236 ± 0.00621 | −5.50 $\times {10}^{-6}$ ± 0.000287 |
| 28 | Shan(IMF5) | 10.358 ± 15.6374 | 8.68112 ± 14.823 | 25.882 ± 29.195 | 39.2240 ± 25.737 | 8.992 ± 16.965 |
| 29 | log(IMF5) | −1917.009 ± 593.83 | −1992.177 ± 872.920 | −1567.422 ± 918.993 | −1216.140 ± 489.3834 | −2179.500 ± 1495.143 |
| 30 | norm(IMF5) | 17.118 ± 19.0735 | 14.983 ± 19.2726 | 19.495 ± 14.659 | 52.96 ± 32.190 | 18.179 ± 20.556 |

31 | CUM2(IMF6) | 0.0040 ± 0.0092 | 0.0028 ± 0.00723 | 0.0184 ± 0.04606 | 0.0571 ± 0.0827 | 0.0073 ± 0.01072 | |

32 | CUM3(IMF6) | 1.0$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-6}$± 0.00010 | 0 ± 6.80$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-5}$ | 2.0$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-6}$± 0.0011117 | 8.0$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-6}$± 0.00527 | 0 ± 0.0002 | |

33 | CUM4(IMF6) | −1.50$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-5}$± 0.00011 | -8.0$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-5}$ 6± 6.25$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-5}$ | −0.00024 ± 0.00205 | −0.00389± 0.01428 | −4.40$\phantom{\rule{3.33333pt}{0ex}}\times \phantom{\rule{3.33333pt}{0ex}}{10}^{-5}$± 0.00017 | |

34 | Shan(IMF6) | 6.50405 ± 11.11697 | 5.26119 ± 9.9055 | 20.0112 ± 31.26347 | 44.49 ± 39.5131 | 10.284 ± 13.481 | |

35 | log(IMF6) | −1903.169 ± 637.510 | −1983.32 ± 713.457 | −1470.196 ± 744.176 | −1062.5 ± 546.13 | −1682.363 ± 1296.591 | |

36 | norm(IMF6) | 13.0914 ± 15.1495 | 11.4410 ± 14.268 | 10.566 ± 12.262 | 57.357 ± 46.317 | 18.179 ± 20.556 |
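The rows above report, per IMF, the second- to fourth-order cumulants (CUM2–CUM4) and the Shannon, log-energy and norm entropies. A minimal sketch of how one such feature set could be computed for a single ICEEMD mode, assuming the entropy definitions of Section 2.3.2 and standard sample-cumulant estimators; the norm order `p` is an illustrative placeholder, not a value stated in the paper:

```python
import numpy as np

def imf_features(s, p=1.1):
    """Entropy and cumulant features for one ICEEMD mode (IMF).

    Entropies follow the definitions of Section 2.3.2 (note the paper
    writes Shannon entropy without a leading minus sign); CUM2-CUM4 are
    the usual sample cumulants. The norm order p is a placeholder.
    """
    s = np.asarray(s, dtype=float)
    sc = s - s.mean()                        # centre for cumulant estimates
    m2, m3, m4 = (np.mean(sc**k) for k in (2, 3, 4))
    eps = np.finfo(float).tiny               # guard log(0)
    return {
        "CUM2": m2,                          # 2nd cumulant = variance
        "CUM3": m3,                          # 3rd cumulant = 3rd central moment
        "CUM4": m4 - 3 * m2**2,              # 4th cumulant
        "Shan": np.sum(s**2 * np.log(s**2 + eps)),  # Shannon entropy
        "log":  np.sum(np.log(s**2 + eps)),         # log-energy entropy
        "norm": np.sum(np.abs(s)**p),               # l^p norm entropy
    }
```

Stacking these six values for IMF1–IMF6 reproduces the 36-dimensional feature layout tabulated above.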

Literature | Feature Extraction | Classification |
---|---|---|
[58] | | |
method-1 | R–R intervals | Optimum Path Forest (OPF) |
method-2 | Wavelet-based features | OPF |
method-3 | Mean, standard deviation and average power of wavelet sub-bands | OPF |
method-4 | Autocorrelation and energy ratio of wavelet bands | OPF |
method-5 | Fast-ICA | OPF |
method-6 | Wavelet + ICA + RR interval | OPF |
[59] | (ECG + VCG) complex-network-based features | SVM |
[42] | Wavelet packet decomposition based entropy features | Random Forest |
[60] | | |
method-1 | Wavelet-based features | Hierarchical classification (tree approach) |
method-2 | Mean, standard deviation and average power of wavelet sub-bands | Hierarchical classification (tree approach) |
method-3 | Autocorrelation and energy ratio of wavelet bands | Hierarchical classification (tree approach) |
method-4 | Fast-ICA | Hierarchical classification (tree approach) |
method-5 | Wavelet + ICA + RR interval | Hierarchical classification (tree approach) |
[61] | Temporal vectorcardiogram (TCG) based features | SVM |
[62] | Combination of projected features (derived from the projected matrix and DCT) and RR intervals | SVM |
[63] | TCG feature selection by PSO | SVM |
proposed work | | |
method-1 | Entropy and statistical features calculated on ICEEMD modes | Voting (J48, LDA, naïve Bayes) |
method-2 | Entropy and statistical features calculated on ICEEMD modes | Voting (J48, QDA, naïve Bayes) |
method-3 | Entropy and statistical features calculated on ICEEMD modes | Voting (J48-C, QDA, naïve Bayes) |
method-4 | Entropy and statistical features calculated on ICEEMD modes | Voting (J48-C, LDA, naïve Bayes) |
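The proposed methods combine a decision tree, a discriminant classifier and naïve Bayes by voting. A minimal sketch of a method-1-style combiner, assuming scikit-learn stand-ins: `DecisionTreeClassifier` approximates Weka's J48 (both are C4.5-style trees), and the feature matrix `X` and labels `y` below are random placeholders for the ICEEMD-mode features and the five AAMI classes:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import VotingClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 36))     # placeholder: 6 features x 6 IMFs per beat
y = rng.integers(0, 5, size=300)   # placeholder: 5 AAMI classes (N, S, V, F, Q)

# Soft voting averages the three classifiers' class-probability estimates,
# one of the standard combination rules for a classifier ensemble.
vote = VotingClassifier(
    estimators=[("j48", DecisionTreeClassifier(max_depth=8)),
                ("lda", LinearDiscriminantAnalysis()),
                ("nb", GaussianNB())],
    voting="soft",
)
vote.fit(X, y)
print(vote.predict(X[:5]).shape)   # → (5,)
```

Swapping `LinearDiscriminantAnalysis` for `QuadraticDiscriminantAnalysis` gives the method-2 variant of the table above.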

Literature | N | S | V | F | Q |
---|---|---|---|---|---|
| SEN/FPR/PPV | SEN/FPR/PPV | SEN/FPR/PPV | SEN/FPR/PPV | SEN/FPR/PPV |
[58] | | | | | |
method-1 | 84.5/-/- | 1.0/-/- | 77.7/-/- | 38.4/-/- | 0/-/- |
method-2 | 86.4/-/- | 2.3/-/- | 40.8/-/- | 0.5/-/- | 0/-/- |
method-3 | 84.8/-/- | 18.3/-/- | 77.8/-/- | 7.5/-/- | 0/-/- |
method-4 | 92.5/-/- | 3.0/-/- | 61.8/-/- | 16.8/-/- | 0/-/- |
method-5 | 95.7/-/- | 17.7/-/- | 74.7/-/- | 3.9/-/- | 0/-/- |
method-6 | 93.2/-/- | 12.1/-/- | 85.5/-/- | 18.3/-/- | 0/-/- |
[59] | 89.3/25.2/96.6 | 38.6/6.7/18 | 81.2/4.9/53.6 | 0/0/0 | 0/0/0 |
[42] | 94.67/3.92/99.73 | 20/3.69/0.16 | 94.20/0.71/89.78 | 50/0.78/0.52 | 0/0/0 |
[60] | | | | | |
method-1 | 92.3/22.2/97.1 | 28.5/2.6/29.6 | 83.5/5.51/51.2 | 19.1/1.07/12.3 | 0/0/- |
method-2 | 93.6/57.1/93.0 | 0.49/0.47/3.81 | 67.9/3.99/54.2 | 0/1.63/0 | 0/0/0 |
method-3 | 98.2/41.2/95.1 | 4.72/0.71/20.3 | 81.7/1.25/82.0 | 2.58/0.40/4.88 | 0/0/0 |
method-4 | 98.6/39.8/95.3 | 9.15/0.56/38.6 | 83.2/1.21/82.7 | 0.26/0.38/0.53 | 0/0/- |
method-5 | 94.7/31.2/96.1 | 37.4/6.19/18.8 | 43.9/1.48/67.4 | 0.52/0.72/0.56 | 0/0/- |
proposed work | | | | | |
method-1 | 94.1/36.3/95.4 | 19.2/1.1/41.7 | 87.2/0.1/97.9 | 89.2/4.1/10.7 | 100/0/100 |
method-2 | 84.8/23.6/96.7 | 34/8.5/14 | 99.6/0.8/90.3 | 61.5/4.9/6.5 | 0/0/0 |
method-3 | 70.8/20.3/96.6 | 42.4/20.1/7.9 | 99.7/0.7/90.4 | 63.8/6.3/5.3 | 0/0/0 |
method-4 | 91.2/38/95.1 | 22.6/2.1/30.2 | 81.5/0.2/97.2 | 90.4/5.7/8 | 100/0/100 |

Literature | N | S | V |
---|---|---|---|
| SEN/FPR/PPV | SEN/FPR/PPV | SEN/FPR/PPV |
[61] | 95/27.9/96.5 | 29.6/3.1/26.4 | 85.1/3.01/66.3 |
[62] | 98.4/-/95.4 | 29.5/-/38.4 | 70.8/-/85.1 |
[63] | | | |
method on VCG | 79.1/27.0/96.3 | 31.2/8.4/13.0 | 89.5/7.2/46.1 |
proposed work | | | |
method-1 | 97.4/33.9/96.1 | 19.9/1.6/33 | 93.3/0.8/88.8 |
method-2 | 89.1/24.6/96.9 | 34.1/8.9/13.5 | 99.6/1.3/84.2 |
method-3 | 76.2/21.6/96.8 | 41.8/20.8/7.5 | 99.8/1.4/83.6 |
method-4 | 95.8/35.7/95.8 | 23.3/2.4/28.2 | 88.1/1.6/79.9 |
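The SEN/FPR/PPV figures in the comparison tables are per-class, one-vs-rest metrics. A short sketch of how they can be computed from a multiclass confusion matrix (the two-class matrix in the usage line is an illustrative example, not data from the paper):

```python
import numpy as np

def per_class_metrics(cm):
    """SEN, FPR and PPV (in %) for each class of a confusion matrix.

    cm[i, j] = number of beats of true class i predicted as class j.
    Each class is scored one-vs-rest, as in the tables above.
    """
    cm = np.asarray(cm, dtype=float)
    tp = np.diag(cm)                  # correctly classified beats per class
    fn = cm.sum(axis=1) - tp          # beats of the class that were missed
    fp = cm.sum(axis=0) - tp          # beats wrongly assigned to the class
    tn = cm.sum() - tp - fn - fp
    sen = 100 * tp / (tp + fn)        # sensitivity (recall)
    fpr = 100 * fp / (fp + tn)        # false positive rate
    ppv = 100 * tp / (tp + fp)        # positive predictive value (precision)
    return sen, fpr, ppv

# Illustrative 2-class example: 90/100 class-0 beats correct, 5 false alarms.
sen, fpr, ppv = per_class_metrics([[90, 10], [5, 95]])
print(round(sen[0], 1), round(fpr[0], 1), round(ppv[0], 1))  # → 90.0 5.0 94.7
```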

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Kandala, R.N.V.P.S.; Dhuli, R.; Pławiak, P.; Naik, G.R.; Moeinzadeh, H.; Gargiulo, G.D.; Gunnam, S.
Towards Real-Time Heartbeat Classification: Evaluation of Nonlinear Morphological Features and Voting Method. *Sensors* **2019**, *19*, 5079.
https://doi.org/10.3390/s19235079