Open Access Article

Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression

by Yanjia Sun 1,*, Hasan Ayaz 2,3,4,5 and Ali N. Akansu 1
1 Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ 07102, USA
2 School of Biomedical Engineering, Science and Health Systems, Drexel University, Philadelphia, PA 19104, USA
3 Department of Psychology, College of Arts and Sciences, Drexel University, Philadelphia, PA 19104, USA
4 Department of Family and Community Health, University of Pennsylvania, Philadelphia, PA 19104, USA
5 Center for Injury Research and Prevention, Children’s Hospital of Philadelphia, Philadelphia, PA 19104, USA
* Author to whom correspondence should be addressed.
Brain Sci. 2020, 10(2), 85;
Received: 5 October 2019 / Revised: 31 January 2020 / Accepted: 1 February 2020 / Published: 6 February 2020
(This article belongs to the Special Issue Brain Plasticity, Cognitive Training and Mental States Assessment)
Human facial expressions are regarded as a vital indicator of one’s emotion and intention, and may even reveal one’s state of health and wellbeing. Emotional states have been associated with information processing within and between subcortical and cortical areas of the brain, including the amygdala and prefrontal cortex. In this study, we evaluated the relationship between spontaneous human facial affective expressions and multimodal brain activity measured via two non-invasive, wearable neuroimaging modalities: functional near-infrared spectroscopy (fNIRS) and electroencephalography (EEG). The affective states of twelve male participants, detected via fNIRS, EEG, and spontaneous facial expressions, were investigated in response to both image-content and video-content stimuli. We propose a method to jointly evaluate fNIRS and EEG signals for affective state detection (emotional valence as positive or negative). Experimental results reveal a strong correlation between spontaneous facial affective expressions and the perceived emotional valence. Moreover, the affective states were estimated from the fNIRS, EEG, and fNIRS + EEG brain activity measurements. We show that the proposed fNIRS + EEG hybrid method outperforms fNIRS-only and EEG-only approaches. Our findings indicate that dynamic (video-content based) stimuli trigger a larger affective response than static (image-content based) stimuli. These findings also suggest joint utilization of facial expression and wearable neuroimaging (fNIRS and EEG) for improved emotional analysis and affective brain–computer interface applications.
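The hybrid fNIRS + EEG approach described in the abstract amounts to multimodal feature-level fusion followed by a binary valence classifier. The following is only an illustrative sketch of that general idea, not the authors' actual pipeline: the feature dimensions, the synthetic data, and the nearest-centroid classifier are all assumptions chosen for brevity (real features would be, e.g., oxy-/deoxy-hemoglobin changes for fNIRS and band-power values for EEG, fed to a trained classifier).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins for per-trial features (hypothetical dimensions).
n_trials = 40
fnirs = rng.normal(size=(n_trials, 8))       # 8 hypothetical fNIRS features
eeg = rng.normal(size=(n_trials, 16))        # 16 hypothetical EEG features
labels = rng.integers(0, 2, size=n_trials)   # 0 = negative, 1 = positive valence

def zscore(x):
    """Standardize each feature column so both modalities share a common scale."""
    return (x - x.mean(axis=0)) / x.std(axis=0)

# Feature-level fusion: z-score each modality separately, then concatenate,
# so that the joint feature vector carries information from both sensors.
fused = np.hstack([zscore(fnirs), zscore(eeg)])

# Minimal nearest-centroid classifier as a placeholder for the real model.
centroids = np.array([fused[labels == c].mean(axis=0) for c in (0, 1)])

def predict(x):
    """Assign a trial to the valence class with the closest centroid."""
    d = np.linalg.norm(x[None, :] - centroids, axis=1)
    return int(d.argmin())

preds = [predict(x) for x in fused]
```

The key design point the sketch illustrates is per-modality standardization before concatenation; without it, whichever modality has larger raw amplitudes would dominate the fused representation.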
Keywords: functional near-infrared spectroscopy (fNIRS); electroencephalography (EEG); facial emotion recognition; brain–computer interface (BCI)
MDPI and ACS Style

Sun, Y.; Ayaz, H.; Akansu, A.N. Multimodal Affective State Assessment Using fNIRS + EEG and Spontaneous Facial Expression. Brain Sci. 2020, 10, 85.

Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

