# Multiple Transferable Recursive Feature Elimination Technique for Emotion Recognition Based on EEG Signals


## Abstract


## 1. Introduction

## 2. Related Works

## 3. Methods

#### 3.1. EEG Datasets for Effective Modeling

#### 3.2. Feature Extraction and the Target Emotion Classes

Given each subject's self-assessment ratings $a_1, a_2, \ldots, a_{40}$ ($a_i \in \mathbb{R}^2$), the arousal threshold $c_1$ and the valence threshold $c_2$ are computed from the ratings. It was found that $c_1 = 5.2543$ and $c_2 = 5.1567$ were the threshold values for the arousal and valence dimensions, respectively. Ratings above $c_1$ were assigned to the high-arousal state, and ratings above $c_2$ to the high-valence state.
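The quadrant assignment implied by these thresholds can be sketched as follows. The ratings here are randomly generated stand-ins for one subject's 40 self-assessments; only the two reported threshold values and the V-A quadrant labels of Figure 1 are taken from the paper:

```python
import numpy as np

# Hypothetical ratings: 40 trials x 2 dimensions (arousal, valence), scale 1-9.
rng = np.random.default_rng(0)
ratings = rng.uniform(1, 9, size=(40, 2))

# Subject-generic thresholds reported in the paper.
c1, c2 = 5.2543, 5.1567  # arousal, valence

high_arousal = ratings[:, 0] > c1
high_valence = ratings[:, 1] > c2

# Map each trial onto the four V-A plane quadrants (Figure 1).
labels = np.where(high_arousal & high_valence, "joy",
         np.where(high_arousal & ~high_valence, "anger",
         np.where(~high_arousal & high_valence, "peace", "depression")))
```

Each trial thus receives exactly one of the four target emotion classes used in the multiclass experiments.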

#### 3.3. Multiple Transferable Feature Elimination Based on LSSVM

Denoting the high emotion state as $V_\mathrm{H}$ and the low emotion state as $V_\mathrm{L}$, the corresponding state centers ${v}_{\mathrm{H}}$ and ${v}_{\mathrm{L}}$ are computed. For the $j$th feature of the $i$th subject, we define the Euclidean distance between the original EEG feature set $V^i$ and a novel transferring set, where $O_i$ is the newly extended space. $H < 0$ indicates that the feature value is far away from $V_\mathrm{H}$, and such a feature is eliminated from the high class. The details of M-TRFE are written as pseudocode in Table 2 and Table 3.

The loop over $j_1$ starting at line 9 performs the subject ranking; ${A}_{i}$ records the cross-subject performance of subject $i$. The distance $d_\mathrm{H}$ (respectively $d_\mathrm{L}$) quantifies the distance between the original set and the transferring set for the high (respectively low) class, and the distance difference $\tilde{\mathrm{D}}({r}_{1})$ is considered equally influential as the LSSVM margin loss $\tilde{w}({r}_{1})$ by taking ${\lambda}_{1}={\lambda}_{2}=0.5$. An auxiliary function ${f}_{a}$ is also used in the pseudocode; it is introduced only to simplify the representation of the algorithm.
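A minimal sketch of the combined elimination criterion $\Delta\tilde{\Phi}(r)=\lambda_1\tilde{w}(r)+\lambda_2\tilde{D}(r)$. Since the paper's exact definitions are not reproduced here, two loudly-labeled assumptions are made: $f_a$ is taken to be a min-max normalization, and the per-feature distances to the class centers stand in for $d_\mathrm{H}$ and $d_\mathrm{L}$:

```python
import numpy as np

def f_a(v):
    # ASSUMPTION: the paper's auxiliary function f_a is modeled here as a
    # min-max normalization to [0, 1]; it only simplifies the notation.
    v = np.asarray(v, dtype=float)
    span = v.max() - v.min()
    return (v - v.min()) / span if span > 0 else np.zeros_like(v)

def mtrfe_scores(X, y, w, lam1=0.5, lam2=0.5):
    """Combine the LSSVM margin contribution and the high/low class-center
    distances into one score per feature (a sketch, not the exact paper code)."""
    v_h = X[y == 1].mean(axis=0)    # state center of the high class
    v_l = X[y == -1].mean(axis=0)   # state center of the low class
    d_h = np.abs(X.mean(axis=0) - v_h)  # ASSUMPTION: per-feature distance to v_H
    d_l = np.abs(X.mean(axis=0) - v_l)
    w_tilde = f_a(w ** 2)               # margin loss term per feature
    d_tilde = f_a(d_h + d_l)            # distance term per feature
    return lam1 * w_tilde + lam2 * d_tilde

rng = np.random.default_rng(1)
X = rng.normal(size=(60, 5))
y = np.where(rng.random(60) > 0.5, 1, -1)
w = rng.normal(size=5)                   # toy stand-in for the LSSVM weights
scores = mtrfe_scores(X, y, w)
worst = int(np.argmin(scores))           # the feature eliminated first
```

With $\lambda_1=\lambda_2=0.5$ and both terms normalized to $[0,1]$, neither criterion can dominate the other, which is the stated design intent.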

## 4. Results

All experiments were conducted on a PC with an Intel® Core™ i5-7200U CPU @ 2.50 GHz and 8 GB RAM.

#### 4.1. Data Split and Cross-Validation Technique

#### 4.2. Cross-Subject Feature Selection and Binary Classification

#### 4.3. Multiclass Cross-Subject Emotion Recognition

## 5. Discussion

## 6. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References

1. Panksepp, J. Affective Neuroscience: The Foundations of Human and Animal Emotions; Oxford University Press: New York, NY, USA, 2005.
2. Schacter, D.L.; Gilbert, D.T.; Wenger, D.M.; Nock, M.K. Psychology, 3rd ed.; Worth: New York, NY, USA, 2014.
3. Siegert, I.; Böck, R.; Vlasenko, B.; Philippou-Hübner, D.; Wendemuth, A. Appropriate emotional labelling of non-acted speech using basic emotions, Geneva emotion wheel and self-assessment manikins. In Proceedings of the 2011 IEEE International Conference on Multimedia and Expo, Barcelona, Spain, 11–15 July 2011; pp. 1–6.
4. Bradley, M.M.; Lang, P.J. Measuring emotion: The self-assessment manikin and the semantic differential. J. Behav. Ther. Exp. Psychiatry 1994, 25, 49–59.
5. Parrott, W.G. Emotions in Social Psychology: Essential Readings; Psychology Press: Philadelphia, PA, USA, 2001.
6. Ekman, P.; Dalgleish, T.; Power, M. Handbook of Cognition and Emotion; Wiley: Chichester, UK, 1999.
7. Cambria, E.; Livingstone, A.; Hussain, A. The Hourglass of Emotions. Cogn. Behav. Syst. 2012, 7403, 144–157.
8. Mehrabian, A. Pleasure-arousal-dominance: A general framework for describing and measuring individual differences in temperament. Curr. Psychol. 1996, 14, 261–292.
9. Keltner, D.; Ekman, P. Facial Expression of Emotion, 2nd ed.; Guilford Publications: New York, NY, USA, 2000.
10. Zhang, Q.; Chen, X.; Zhan, Q.; Yang, T.; Xia, S. Respiration-based emotion recognition with deep learning. Comput. Ind. 2017, 92–93, 84–90.
11. Tan, D.; Nijholt, A. Human-Computer Interaction Series; Springer: London, UK, 2010.
12. Meehan, K.B.; Panfilis, C.D.; Cain, N.M.; Antonucci, C.; Soliani, A.; Clarkin, J.F.; Sambataro, F. Facial emotion recognition and borderline personality pathology. Psychiatry Res. 2017, 255, 347–354.
13. Christensen, J.; Estepp, J.; Wilson, G.; Russell, C. The effects of day-to-day variability of physiological data on operator functional state classification. Neuroimage 2012, 59, 57–63.
14. Yin, Z.; Fei, Z.; Yang, C.; Chen, A. A novel SVM-RFE based biomedical data processing approach: Basic and beyond. In Proceedings of the IECON 2016—42nd Annual Conference of the IEEE Industrial Electronics Society, Firenze, Italy, 24–27 October 2016; pp. 7143–7148.
15. Shao, Z.; Yang, S.L.; Gao, F.; Zhou, K.L.; Lin, P. A new electricity price prediction strategy using mutual information-based SVM-RFE classification. Renew. Sustain. Energy Rev. 2017, 70, 330–341.
16. Yin, Z.; Wang, Y.X.; Liu, L.; Zhang, W.; Zhang, J.H. Cross-Subject EEG Feature Selection for Emotion Recognition Using Transfer Recursive Feature Elimination. Front. Neurorobot. 2017, 11, 1662–5218.
17. He, X.; Zhang, W. Emotion recognition by assisted learning with convolutional neural networks. Neurocomputing 2018, 291, 187–194.
18. Yang, D.; Alsadoon, A.; Prasad, P.W.C.; Singh, A.K.; Elchouemi, A. An emotion recognition model based on facial recognition in virtual learning environment. Procedia Comput. Sci. 2018, 125, 2–10.
19. Kaya, H.; Karpov, A.A. Efficient and effective strategies for cross-corpus acoustic emotion recognition. Neurocomputing 2018, 275, 1028–1034.
20. Hakanpää, T.; Waaramaa, T.; Laukkanen, A.M. Emotion recognition from singing voices using contemporary commercial music and classical styles. J. Voice 2018.
21. Hu, J. An approach to EEG-based gender recognition using entropy measurement methods. Knowl. Based Syst. 2018, 140, 134–141.
22. Arnau, S.; Möckel, T.; Rinkenauer, G.; Wascher, E. The interconnection of mental fatigue and aging: An EEG study. Int. J. Psychophysiol. 2017, 117, 17–25.
23. Yin, Z.Y.; Zhang, J. Cross-subject recognition of operator functional states via EEG and switching deep belief networks with adaptive weights. Neurocomputing 2017, 260, 349–366.
24. Li, X.; Zhang, P.; Song, D.; Yu, G.; Hou, Y.; Hu, B. EEG based emotion identification using unsupervised deep feature learning. In Proceedings of the SIGIR2015 Workshop on Neuro-Physiological Methods in IR Research, Santiago, Chile, 9–13 August 2015.
25. Chen, S.; Gao, Z.; Wang, S. Emotion recognition from peripheral physiological signals enhanced by EEG. In Proceedings of the 2016 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Shanghai, China, 20–25 March 2016; pp. 2827–2831.
26. Shahnaz, C.; Shoaib-Bin-Masud; Hasan, S.M.S. Emotion recognition based on wavelet analysis of Empirical Mode Decomposed EEG signals responsive to music videos. In Proceedings of the 2016 IEEE Region 10 Conference (TENCON), Singapore, 22–25 November 2016; pp. 424–427.
27. Wen, Z.; Xu, R.; Du, J. A novel convolutional neural networks for emotion recognition based on EEG signal. In Proceedings of the 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), Shenzhen, China, 15–18 December 2017; pp. 672–677.
28. Tong, J.; Liu, S.; Ke, Y.F.; Gu, B.; He, F.; Wan, B.; Ming, D. EEG-based emotion recognition using nonlinear feature. In Proceedings of the 2017 IEEE 8th International Conference on Awareness Science and Technology (iCAST), Taichung, China, 8–10 November 2017; pp. 55–59.
29. Zhang, Y.; Ji, X.; Zhang, S. An approach to EEG-based emotion recognition using combined feature extraction method. Neurosci. Lett. 2016, 633, 152–157.
30. Atkinson, J.; Campos, D. Improving BCI-based emotion recognition by combining EEG feature selection and kernel classifiers. Expert Syst. Appl. 2016, 47, 35–41.
31. Li, H.; Qing, C.; Xu, X.; Zhang, T. A novel DE-PCCM feature for EEG-based emotion recognition. In Proceedings of the 2017 International Conference on Security, Pattern Analysis, and Cybernetics (SPAC), Shenzhen, China, 15–18 December 2017; pp. 389–393.
32. Oostenveld, R.; Praamstra, P. The five percent electrode system for high-resolution EEG and ERP measurements. Clin. Neurophysiol. 2001, 112, 713–719.
33. Hidalgo-Muñoz, A.R.; López, M.M.; Santos, I.M.; Pereira, A.T.; Vázquez-Marrufo, M.; Galvao-Carmona, A.; Tomé, A.M. Application of SVM-RFE on EEG signals for detecting the most relevant scalp regions linked to affective valence processing. Expert Syst. Appl. 2013, 40, 2102–2108.
34. Yin, Z.; Zhang, J. Operator functional state classification using least-square support vector machine based recursive feature elimination technique. Comput. Methods Prog. Biomed. 2014, 113, 101–115.
35. Hamada, Y.; Elbarougy, R.; Akagi, M. A method for emotional speech synthesis based on the position of emotional state in Valence-Activation space. In Proceedings of the Signal and Information Processing Association Annual Summit and Conference (APSIPA), Siem Reap, Cambodia, 9–12 December 2014; pp. 1–7.
36. Koelstra, S.; Mühl, C.; Soleymani, M.; Lee, J.S.; Yazdani, A.; Ebrahimi, T.; Pun, T.; Nijholt, A.; Patras, I. DEAP: A database for emotion analysis using physiological signals. IEEE Trans. Affect. Comput. 2012, 3, 18–31.
37. Atasoyu, M.; Metin, B.; Kuntman, H.; Cicekoglu, O. Simple realization of a third order Butterworth filter with MOS-only technique. AEU 2017, 81, 205–208.
38. Chen, X.; Liu, A.; Chen, Q.; Liu, Y.; Zou, L.; McKeown, M.J. Simultaneous ocular and muscle artifact removal from EEG data by exploiting diverse statistics. Comput. Biol. Med. 2017, 88, 1–10.
39. Zhang, J.; Yin, Z.; Wang, R. Recognition of mental workload levels under complex human-machine collaboration by using physiological features and adaptive support vector machines. Hum. Mach. Syst. 2015, 45, 200–214.
40. Naser, D.S.; Saha, G. Recognition of emotions induced by music videos using DT-CWPT. In Proceedings of the Indian Conference on Medical Informatics and Telemedicine (ICMIT), Kharagpur, India, 28–30 March 2013; pp. 53–57.
41. Zhu, Y.; Wang, S.; Ji, Q. Emotion recognition from users' EEG signals with the help of stimulus videos. In Proceedings of the 2014 IEEE International Conference on Multimedia and Expo (ICME), Chengdu, China, 8–12 July 2014; pp. 1–6.
42. Feradov, F.; Ganchev, T. Detection of negative emotional states from electroencephalographic (EEG) signals. Annu. J. Electron. 2014, 8, 66–69.
43. Candra, H.; Yuwono, M.; Handojoseno, A.; Chai, R.; Su, S.; Nguyen, H.T. Recognizing emotions from EEG subbands using wavelet analysis. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 25–29 August 2015; pp. 6030–6033.
44. Nakisa, B.; Rastgoo, M.N.; Tjondronegoro, D.; Chandran, V. Evolutionary computation algorithms for feature selection of EEG-based emotion recognition using mobile sensors. Expert Syst. Appl. 2018, 93, 143–155.
45. Gupta, V.; Chopda, M.D.; Pachori, R.B. Cross-Subject Emotion Recognition Using Flexible Analytic Wavelet Transform from EEG Signals. IEEE Sens. J. 2019, 19, 2266–2274.

**Figure 1.** V-A plane that defines four emotions with each subject's self-assessment ratings and the subject-generic thresholds.

**Figure 4.** Binary subject-specific classification accuracies on (**a**) arousal and (**b**) valence dimensions.

**Figure 5.** Influence of M-TRFE feature elimination on binary classification: (**a**) arousal accuracy, (**b**) valence accuracy, (**c**) arousal F1 score, and (**d**) valence F1 score, with different amounts of features eliminated.

**Figure 6.** Binary classification accuracy on the (**a**) arousal and (**b**) valence dimensions under the feature selection paradigms of subject-specific RFE, S-TRFE and M-TRFE.

**Figure 7.** Classification performances of three strategies for four emotions, (**a**) joy, (**b**) peace, (**c**) anger and (**d**) depression, using separate binary classifiers.

**Figure 8.** Illustration of M-TRFE feature transferring: (**a**) the number of features eliminated for each subject, (**b**) the best results of subject-specific RFE using OvO, and (**c**) OA of M-TRFE as the number of subjects employed increases.

| Feature Index | Notations |
|---|---|
| 44 EEG power features | Average PSD in four bands for all channels. |
| 16 EEG power differences | Difference of average PSD in four bands for four channel pairs (F4–F3, C4–C3, P4–P3 and O2–O1). |
| 77 EEG time-domain features | Mean, variance, zero-crossing rate, Shannon entropy, spectral entropy, kurtosis and skewness of eleven channels. |
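The three feature families in the table can be sketched as follows. The band edges, epoch length, and the plain FFT periodogram are illustrative assumptions (the paper does not reproduce its exact cut-offs here), and only five of the seven time-domain statistics are shown:

```python
import numpy as np

# ASSUMPTION: illustrative band edges for the four EEG bands.
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30), "gamma": (30, 45)}

def band_power(x, fs):
    """Average PSD per band for one channel, via a plain FFT periodogram."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2 / (fs * len(x))
    return [psd[(freqs >= lo) & (freqs < hi)].mean() for lo, hi in BANDS.values()]

def time_domain(x):
    """Mean, variance, zero-crossing rate, kurtosis and skewness
    (a subset of the seven time-domain statistics in Table 1)."""
    zcr = np.mean(np.abs(np.diff(np.sign(x))) > 0)
    m, s = x.mean(), x.std()
    skew = np.mean(((x - m) / s) ** 3)
    kurt = np.mean(((x - m) / s) ** 4)
    return [m, x.var(), zcr, kurt, skew]

fs = 128  # the DEAP EEG recordings are downsampled to 128 Hz
rng = np.random.default_rng(2)
eeg = rng.normal(size=(11, fs * 4))  # 11 channels, 4 s epoch (sizes illustrative)

powers = np.array([band_power(ch, fs) for ch in eeg])  # 11 channels x 4 bands = 44
features = np.concatenate([powers.ravel()]
                          + [time_domain(ch) for ch in eeg])
```

The 44 power features correspond to 4 bands across 11 channels; the 16 power differences would be formed by subtracting the band powers of the four listed channel pairs.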

| Line | Initialization of M-TRFE Algorithm |
|---|---|
| 1 | Start initialization |
| 2 | for i = 1:s |
| 3 | for j = 1:f |
| 4 | Define ${V}^{i}=\{{\mathbf{x}}_{k},{y}_{k}\}$ using the $j$th validating segment of subject $i$ |
| 5 | Define $\tilde{J}(\mathbf{w},b,{\zeta}_{k})=\frac{1}{2}\|\mathbf{w}\|^{2}+\frac{1}{2}\left({\gamma}_{j}^{(i)}\cdot\sum_{k=1}^{{v}^{i}}{\zeta}_{k}\right)$ |
| 6 | Train the LSSVM model ${y}_{j}(\mathbf{x})=\mathrm{sign}\left(\sum_{k=1}^{{v}^{i}}{\alpha}_{k}{y}_{k}{\mathbf{x}}_{k}^{\top}\mathbf{x}+b\right)$ |
| 7 | end for |
| 8 | Select the model and the regularization parameter ${\gamma}_{0}^{(i)}$ by a cross-validation technique |
| 9 | for $j_1$ = 1:s |
| 10 | if $j_1 = s + 1$ |
| 11 | $j_1 = 1$ |
| 12 | else $j_1 = i$ |
| 13 | end if |
| 14 | Define cross-subject data ${V}^{{j}_{1}}=\{{\mathbf{x}}_{k},{y}_{k}\}$ from the working segment of subject $j_1$ |
| 15 | Define $\tilde{J}(\mathbf{w},b,{\zeta}_{k})=\frac{1}{2}\|\mathbf{w}\|^{2}+\frac{1}{2}\left({\gamma}_{0}^{(i)}\cdot\sum_{k=1}^{{v}^{{j}_{1}}}{\zeta}_{k}\right)$ and train the model |
| 16 | Test the model with the validating segment ${V}^{i}=\{{\mathbf{x}}_{{k}_{1}},{y}_{{k}_{1}}\}$ from subject $i$ |
| 17 | Update the subject ranking vector ${A}_{\mathrm{H}}={A}_{i}\cup {A}_{\mathrm{H}}$ |
| 18 | end for |
| 19 | Rank the most trusted subjects through ${A}_{\mathrm{H}}$ |
| 20 | end for |
| 21 | End initialization |
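The initialization loop above amounts to training a classifier on each candidate subject and scoring it on the target subject's validating segment. A sketch follows, with a regularized least-squares classifier standing in for the linear LSSVM; the helper names and toy data are hypothetical:

```python
import numpy as np

def fit_linear(X, y, gamma=1.0):
    """Regularized least-squares classifier, a stand-in for the linear LSSVM:
    solves (X^T X + I / gamma) w = X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + np.eye(d) / gamma, X.T @ y)

def rank_subjects(data, target, gamma=1.0):
    """Table 2 sketch: train on the working data of every other subject,
    test on the target subject's validating segment, and rank subjects
    by cross-subject accuracy A_i (most trusted first)."""
    X_val, y_val = data[target]
    acc = {}
    for j, (X_j, y_j) in enumerate(data):
        if j == target:
            continue
        w = fit_linear(X_j, y_j, gamma)
        acc[j] = np.mean(np.sign(X_val @ w) == y_val)
    return sorted(acc, key=acc.get, reverse=True)

# Toy data: 5 hypothetical subjects sharing one underlying decision rule.
rng = np.random.default_rng(3)
w_true = rng.normal(size=6)
data = []
for _ in range(5):
    X = rng.normal(size=(40, 6))
    y = np.sign(X @ w_true + rng.normal(scale=0.5, size=40))
    data.append((X, y))

ranking = rank_subjects(data, target=0)  # trusted-subject ordering A_H
```

The resulting ordering plays the role of the "most trusted subjects" reported in Table 4.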

| Line | Feature Ranking of M-TRFE Algorithm |
|---|---|
| 1 | Start feature ranking |
| 2 | for i = 1:s |
| 3 | Load ${V}^{i}=\{{\mathbf{x}}_{k},{y}_{k}\}$ and ${\gamma}_{0}^{(i)}$ |
| 4 | Calculate ${\mathbf{V}}_{\mathrm{H}}$ for a certain emotion and create the blank set ${S}_{i}=\varnothing$ |
| 5 | for j = 1:${N}_{{o}_{i}}$ |
| 6 | if ${H}_{\mathrm{P}}({\mathbf{x}}_{j})<0$ |
| 7 | ${S}_{i}={S}_{i}\cup {\mathbf{x}}_{j}$ |
| 8 | else ${S}_{i}={S}_{i}$ |
| 9 | end if |
| 10 | for j = 1:L |
| 11 | Build ${O}_{i}={V}^{i}\cup {S}_{i}$ used for the transferring task |
| 12 | Define $\tilde{J}(\mathbf{w},b,{\zeta}_{k})=\frac{1}{2}\|\mathbf{w}\|^{2}+\frac{1}{2}\left({\gamma}_{0}^{(i)}\cdot\sum_{k=1}^{{O}_{i}}{\zeta}_{k}\right)$ |
| 13 | Find the support vector $\mathbf{w}=\sum_{k=1}^{{O}_{i}}{\alpha}_{k}{y}_{k}{\mathbf{x}}_{k}$ |
| 14 | for r = 1:L |
| 15 | $\tilde{w}(r)={f}_{a}[\|w(r)\|^{2}]$, $\tilde{\mathrm{D}}(r)={f}_{a}\left[{d}_{\mathrm{H}}(r)+{d}_{\mathrm{L}}(r)\right]$, $\Delta\tilde{\Phi}(r)={\lambda}_{1}\tilde{w}(r)+{\lambda}_{2}\tilde{D}(r)$ |
| 16 | end for |
| 17 | Create a blank feature ranking set $R=\varnothing$ |
| 18 | $R(j)=R(j)\cup \arg\min \Delta\tilde{\Phi}$ |
| 19 | Eliminate $R$ from the feature set $S$ |
| 20 | end for |
| 21 | Return the feature ranking set $S=\cup_{j=1}^{L}R(j)$ |
| 22 | End feature ranking |
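The elimination loop of the ranking procedure is ordinary recursive feature elimination: score the surviving features, move the worst one (the arg min) into $R$, and repeat. A sketch follows, with a simple least-squares weight magnitude standing in for the full criterion $\Delta\tilde{\Phi}$; the helper names and toy data are hypothetical:

```python
import numpy as np

def recursive_elimination(X, y, score_fn):
    """Table 3 sketch: repeatedly score the surviving features and move the
    worst one into the ranking set R (worst-ranked first)."""
    surviving = list(range(X.shape[1]))
    ranking = []  # R, filled from worst to best
    while surviving:
        scores = score_fn(X[:, surviving], y)
        worst_local = int(np.argmin(scores))
        ranking.append(surviving.pop(worst_local))
    return ranking

def weight_score(X, y):
    # ASSUMPTION: a stand-in for the combined criterion -- the squared
    # weight of a least-squares fit, min-max normalized to [0, 1].
    w = np.linalg.lstsq(X, y, rcond=None)[0] ** 2
    span = w.max() - w.min()
    return (w - w.min()) / span if span > 0 else w

rng = np.random.default_rng(4)
X = rng.normal(size=(80, 6))
y = np.sign(X[:, 0] + 0.1 * rng.normal(size=80))  # only feature 0 is informative

order = recursive_elimination(X, y, weight_score)
# The informative feature survives longest, i.e. appears last in the ranking.
```

Reversing the returned ranking gives a best-first feature ordering, from which the worst-feature tables (Tables 5 and 7) are read off.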

| Most Trusted Subject | Arousal | Valence |
|---|---|---|
| 1 | 16 | 9 |
| 2 | 4 | 7 |
| 3 | 3 | 20 |
| 4 | 6 | 15 |
| 5 | 15 | 18 |

**Table 5.** The worst features and their corresponding physiological significance, ranked for binary classification.

| Worst Feature | Arousal | Corresponding Physiological Significance | Valence | Corresponding Physiological Significance |
|---|---|---|---|---|
| 1 | 61 | Cz, β, PSD | 69 | Fz, γ, PSD |
| 2 | 66 | O2, β, PSD | 62 | P3, β, PSD |
| 3 | 52 | P4, α, PSD | 66 | O2, β, PSD |
| 4 | 129 | Fz, zero-crossing rate | 61 | Cz, β, PSD |
| 5 | 63 | P4, β, PSD | 64 | Pz, α, PSD |
| 6 | 65 | O1, β, PSD | 70 | C3, γ, PSD |
| 7 | 62 | P3, β, PSD | 129 | Fz, zero-crossing rate |
| 8 | 67 | F3, γ, PSD | 65 | O1, β, PSD |
| 9 | 127 | F3, zero-crossing rate | 71 | C4, γ, PSD |
| 10 | 58 | Fz, β, PSD | 58 | Fz, β, PSD |

**Table 6.** Binary classification performances of different cross-subject feature selection methods and the SS method.

| Classification Scheme | Mean Accuracy (Arousal) | Mean Accuracy (Valence) | Mean F1 Score (Arousal) | Mean F1 Score (Valence) |
|---|---|---|---|---|
| Direct scheme | 0.5089 (0.0257) | 0.5506 (0.0467) | 0.4961 (0.2701) | 0.4818 (0.0363) |
| S-TRFE | 0.6470 (0.0740) | 0.6875 (0.0588) | 0.6163 (0.0245) | 0.6838 (0.0489) |
| M-TRFE | 0.6494 (0.0496) | 0.6898 (0.0676) | 0.6571 (0.0513) | 0.6773 (0.0363) |
| G-TRFE | 0.5580 (0.0801) | 0.5680 (0.0696) | 0.5055 (0.0166) | 0.5361 (0.0482) |
| SS | 0.6549 (0.0701) | 0.6865 (0.1581) | 0.5364 (0.2864) | 0.6389 (0.1816) |

**Table 7.** Worst features ranked for multiclass classification and their corresponding physiological significance.

| Worst Feature | Joy | Peace | Anger | Depression | Mutual Ranking | Corresponding Physiological Significance |
|---|---|---|---|---|---|---|
| 1 | 61 | 132 | 63 | 69 | 63 | P4, β, PSD |
| 2 | 58 | 66 | 61 | 52 | 61 | Cz, β, PSD |
| 3 | 66 | 63 | 52 | 66 | 52 | P4, α, PSD |
| 4 | 63 | 52 | 129 | 63 | 66 | O2, β, PSD |
| 5 | 134 | 61 | 66 | 61 | 132 | Cz, zero-crossing rate |
| 6 | 127 | 65 | 50 | 67 | 127 | F3, zero-crossing rate |
| 7 | 129 | 127 | 51 | 62 | 129 | Fz, zero-crossing rate |
| 8 | 137 | 67 | 127 | 71 | 69 | Fz, γ, PSD |
| 9 | 132 | 71 | 71 | 132 | 58 | Fz, β, PSD |
| 10 | 52 | 49 | 67 | 51 | 67 | F3, γ, PSD |

| Most Trusted Subject | Joy | Peace | Anger | Depression | Mutual |
|---|---|---|---|---|---|
| 1 | 31 | 21 | 23 | 14 | 31 |
| 2 | 15 | 9 | 26 | 32 | 21 |
| 3 | 11 | 13 | 31 | 7 | 23 |
| 4 | 16 | 30 | 4 | 24 | 14 |
| 5 | 8 | 25 | 6 | 21 | 26 |

| Index | Emotion | SS | S-TRFE | M-TRFE | G-TRFE |
|---|---|---|---|---|---|
| OA | | 0.5908 | 0.5342 | 0.6513 | 0.6205 |
| Kappa value | | 0.4212 | 0.3182 | 0.4665 | 0.3016 |
| ANOVA | | - | p < 0.05 | p < 0.05 | p < 0.05 |
| Precision | Joy | 0.5476 | 0.3972 | 0.6416 | 0.4027 |
| | Peace | 0.7551 | 0.3475 | 0.7489 | 0.5212 |
| | Anger | 0.5746 | 0.3426 | 0.6146 | 0.5281 |
| | Depression | 0.7129 | 0.3688 | 0.8891 | 0.4938 |
| Recall | Joy | 0.5710 | 0.4822 | 0.5795 | 0.4121 |
| | Peace | 0.4900 | 0.3049 | 0.5173 | 0.3577 |
| | Anger | 0.7988 | 0.2978 | 0.8598 | 0.8544 |
| | Depression | 0.3364 | 0.4610 | 0.2723 | 0.2174 |
| F1 score | Joy | 0.5591 | 0.4356 | 0.6090 | 0.4073 |
| | Peace | 0.5943 | 0.3248 | 0.6119 | 0.4242 |
| | Anger | 0.6684 | 0.3186 | 0.7168 | 0.6527 |
| | Depression | 0.4571 | 0.4098 | 0.4169 | 0.3019 |

| Study | Feature Selection Method | Classifier | Cross-Subject? | OA (Binary) | OA (Multiclass) |
|---|---|---|---|---|---|
| Koelstra, 2012 [36] | - | SVM | No | 0.6235 | - |
| Naser, 2013 [40] | SVD | SVM | No | 0.6525 | - |
| Zhu, 2014 [41] | - | SVM | No | 0.5795 | - |
| Li, 2015 [24] | - | DBN | No | 0.5130 | - |
| Atkinson, 2015 [30] | mRMR | SVM | No | 0.6151 | - |
| Chen, 2016 [25] | CCA | SVM | No | 0.6040 | - |
| Shahnaz, 2016 [26] | PCA | SVM | No | 0.6561 | - |
| Feradov, 2014 [42] | - | SVM | No | - | 0.6200 |
| Candra, 2015 [43] | - | SVM | No | - | 0.6090 |
| Nakisa, 2018 [44] | EC | PNN | No | - | 0.6408–0.7085 |
| Gupta, 2018 [45] | FAWT | Random forest | Yes | - | 0.7143 |
| Yin, 2017 [16] | T-RFE | LSSVM | Yes | 0.5630 | 0.6205 |
| Our work | M-TRFE | LSSVM | Yes | 0.6695 | 0.6513 |

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Cai, J.; Chen, W.; Yin, Z.
Multiple Transferable Recursive Feature Elimination Technique for Emotion Recognition Based on EEG Signals. *Symmetry* **2019**, *11*, 683.
https://doi.org/10.3390/sym11050683
