MFA-CNN: An Emotion Recognition Network Integrating 1D–2D Convolutional Neural Network and Cross-Modal Causal Features
Abstract
1. Introduction
- We proposed a novel causal metric for EEG–fNIRS signals that quantifies their cross-modal causal relationship.
- We designed an MFA-CNN emotion recognition model that integrates a 1D–2D CNN framework with a modality–frequency attention mechanism, which assigns weights to EEG, fNIRS, and cross-modal features to enhance recognition performance.
- We conducted systematic experiments on the TYUST-3.0 EEG-fNIRS multimodal emotion dataset [29] to evaluate MFA-CNN, including unimodal and multimodal settings, ablation studies, and comparisons with alternative classifiers.
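The contributions reference a GC-based causal metric between EEG and fNIRS without detailing it at this point. As an illustration only, a standard pairwise Granger-causality strength can be computed by comparing the residual variance of an autoregression of one signal with and without lagged values of the other; this is the textbook formulation, not necessarily the paper's exact EEG–fNIRS metric.

```python
import numpy as np

def granger_strength(x, y, order=2):
    """Pairwise Granger-causality strength from x to y (textbook sketch):
    log ratio of residual variances of the restricted (y lags only)
    vs. full (y and x lags) least-squares AR models. Values > 0 mean
    the past of x improves prediction of y."""
    n = len(y)
    Y = y[order:]
    # Lagged design columns: column k holds the signal delayed by k+1 samples.
    lags_y = np.column_stack([y[order - k - 1:n - k - 1] for k in range(order)])
    lags_x = np.column_stack([x[order - k - 1:n - k - 1] for k in range(order)])
    ones = np.ones((n - order, 1))
    Xr = np.hstack([ones, lags_y])            # restricted model
    Xf = np.hstack([ones, lags_y, lags_x])    # full model
    res_r = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
    res_f = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
    return float(np.log(np.var(res_r) / np.var(res_f)))
```

A quick sanity check is to drive one synthetic signal from the lagged values of another: the strength in the driving direction should clearly exceed the reverse direction.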
2. Related Works
2.1. The GC Method
2.2. Multimodal Emotion Recognition Network
3. Materials
3.1. Participants
3.2. Experimental Design
3.2.1. Stimulus Selection
3.2.2. Acquisition Equipment
3.2.3. Data Collection
3.3. Signal Pre-Processing
4. The Proposed MFA-CNN Method
4.1. The GC Features of EEG Signal
4.2. The Causal Feature of EEG-fNIRS
4.3. The MFA-CNN Module
4.4. The Proposed Weighted Multi-Loss Function (WML)
5. Experimental Results and Analysis
5.1. Parameter Settings
5.2. The Emotion Recognition Performance of Single Modal Features
5.3. Emotion Recognition Performance of Combination Features
5.4. Emotion Recognition Performance of the Proposed MFA-CNN Models
5.5. Performance Comparisons for Latest Schemes
5.6. The Subject-Independent Experiments
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
| EEG | electroencephalography |
| fNIRS | functional near-infrared spectroscopy |
| GC | Granger causality |
| CNN | convolutional neural network |
| MFA | modality–frequency attention |
| HbO2 | oxyhemoglobin |
| HbR | deoxyhemoglobin |
| GCN | graph convolutional network |
| WML | weighted multi-loss |
| SVM | support vector machine |
References
- Geng, Y.; Shi, S.; Hao, X. Deep learning-based EEG emotion recognition: A comprehensive review. Neural Comput. Appl. 2025, 37, 1919–1950. [Google Scholar] [CrossRef]
- Al-Hadithy, S.S.; Abdalkafor, A.S.; Al-Khateeb, B. Emotion recognition in EEG signals: Deep and machine learning approaches, challenges, and future directions. Comput. Biol. Med. 2025, 196, 110713. [Google Scholar] [CrossRef]
- Ma, F.; Yuan, Y.; Xie, Y.; Ren, H.; Liu, I.; He, Y.; Ren, F.; Yu, F.; Ni, S. Generative technology for human emotion recognition: A scoping review. Inf. Fusion 2025, 115, 102753. [Google Scholar] [CrossRef]
- Gkintoni, E.; Aroutzidis, A.; Antonopoulou, H.; Halkiopoulo, C. From neural networks to emotional networks: A systematic review of EEG-based emotion recognition in cognitive neuroscience and real-world applications. Brain Sci. 2025, 15, 220. [Google Scholar] [CrossRef] [PubMed]
- Liu, Z.M.; Shore, J.; Wang, M.; Yuan, F.P.; Buss, A.; Zhao, X.P. A systematic review on hybrid EEG/fNIRS in brain-computer interface. Biomed. Signal Process. Control 2021, 68, 102595. [Google Scholar] [CrossRef]
- Suhaimi, N.S.; Mountstephens, J.; Teo, J. EEG-based emotion recognition: A state-of-the-art review of current trends and opportunities. Comput. Intell. Neurosci. 2020, 2020, 8875426. [Google Scholar] [CrossRef]
- Sun, X.; Dai, C.; Wu, X.; Han, T.; Li, Q.; Lu, Y.; Liu, X.; Yuan, H. Current implications of EEG and fNIRS as functional neuroimaging techniques for motor recovery after stroke. Med. Rev. 2024, 4, 492–509. [Google Scholar] [CrossRef] [PubMed]
- Si, X.; Huang, H.; Yu, J.; Ming, D. EEG microstates and fNIRS metrics reveal the spatiotemporal joint neural processing features of human emotions. IEEE Trans. Affect. Comput. 2024, 15, 2128–2138. [Google Scholar] [CrossRef]
- Wang, R.; Zhuang, P. A strategy for network multi-layer information fusion based on multimodel in user emotional polarity analysis. Int. J. Cogn. Comput. Eng. 2025, 6, 120–130. [Google Scholar] [CrossRef]
- Chen, M.S.; Cai, Q.; Omari, D.; Sanghvi, D.; Lyu, S.; Bonanno, G. Emotion regulation and mental health across cultures: A systematic review and meta-analysis. Nat. Hum. Behav. 2025, 9, 1176–1200. [Google Scholar] [CrossRef] [PubMed]
- Mao, H.; Meng, H.; Tan, Q.; Samuel, O.; Li, G.; Fang, P. Evaluation of neurovascular coupling behaviors for FES-induced wrist movements based on synchronized EEG-fNIRS signals. IEEE Trans. Neural Syst. Rehabil. Eng. 2025, 33, 2622–2630. [Google Scholar] [CrossRef] [PubMed]
- Zhong, J.; Li, G.; Lv, Z.; Chen, J.; Wang, C.; Shao, A.; Gong, Z.; Wang, J.; Liu, S.; Luo, J.; et al. Neuromodulation of cerebral blood flow: A physiological mechanism and methodological review of neurovascular coupling. Bioengineering 2025, 12, 442. [Google Scholar] [CrossRef]
- Lin, J.; Lu, J.; Shu, Z.; Han, J.; Yu, N. Subject-specific modeling of EEG-fNIRS neurovascular coupling by task-related tensor decomposition. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 452–461. [Google Scholar] [CrossRef]
- Burma, J.; Oni, I.; Lapointe, A.; Rattana, S.; Schneider, K.; Debert, C.; Smirl, J.; Dunn, J. Quantifying neurovascular coupling through a concurrent assessment of arterial, capillary, and neuronal activation in humans: A multimodal EEG-fNIRS–TCD investigation. NeuroImage 2024, 302, 120910. [Google Scholar] [CrossRef]
- Chen, H.; Chan, I.Y.S.; Dong, Z.; Samuel, T. Unraveling trust in collaborative human–machine intelligence from a neurophysiological perspective: A review of EEG and fNIRS features. Adv. Eng. Inform. 2025, 67, 103555. [Google Scholar] [CrossRef]
- Llamas-Ramos, R.; Alvarado-Omenat, J.J.; Llamas-Ramos, I. Early EEG and NIRS measurements in preterm babies: A systematic review. Eur. J. Pediatr. 2024, 183, 4169–4178. [Google Scholar] [CrossRef]
- Suzen, E.; Yardibi, F.; Ozkan, O.; Ozkan, O.; Colak, O.; Ozen, S. The role of fNIRS in assessing motor function: A bibliometric review on extremity applications. Medicine 2025, 104, e43707. [Google Scholar] [CrossRef] [PubMed]
- Lee, H.T.; Shim, M.; Liu, X.; Cheon, H.; Kim, S.; Han, C.; Hwang, H. A review of hybrid EEG-based multimodal human–computer interfaces using deep learning: Applications, advances, and challenges. Biomed. Eng. Lett. 2025, 15, 587–618. [Google Scholar] [CrossRef]
- Bunterngchit, C.; Wang, J.; Su, J.; Wang, Y.; Liu, S.; Hou, Z. Temporal attention fusion network with custom loss function for EEG-fNIRS classification. J. Neural Eng. 2024, 21, 066016. [Google Scholar] [CrossRef] [PubMed]
- Qiu, L.; Feng, W.; Ying, Z.; Pan, J. EFMLNet: Fusion model based on end-to-end mutual information learning for hybrid EEG-fNIRS brain–computer interface applications. Proc. Annu. Meet. Cogn. Sci. Soc. 2024, 46, 5752–5758. [Google Scholar]
- Liu, M.; Yang, B.; Meng, L.; Zhang, Y.; Gao, S.; Zan, P.; Xia, X. STA-Net: Spatial–temporal alignment network for hybrid EEG-fNIRS decoding. Inf. Fusion 2025, 119, 103023. [Google Scholar]
- Nia, A.; Tang, V.; Talou, G.; Billinghurst, M. Decoding emotions through personalized multi-modal fNIRS-EEG systems: Exploring deterministic fusion techniques. Biomed. Signal Process. Control 2025, 105, 107632. [Google Scholar] [CrossRef]
- Ghouse, A.; Faes, L.; Valenza, G. Inferring directionality of coupled dynamical systems using Gaussian process priors: Application on neurovascular systems. Phys. Rev. E 2021, 104, 064208. [Google Scholar] [CrossRef]
- Granger, C.W.J. Investigating causal relations by econometric models and cross-spectral methods. Econom. J. Econom. Soc. 1969, 37, 424–438. [Google Scholar] [CrossRef]
- Zhang, X.; Li, Y.; Zhang, P.; Wang, D.; Yao, D.; Xu, P. Central-peripheral nervous system activation in exoskeleton modes: A Granger causality analysis via EEG–EMG fusion. Expert Syst. Appl. 2025, 268, 126311. [Google Scholar] [CrossRef]
- Wismüller, A.; Vosoughi, A.; Kasturi, A. Large-scale nonlinear Granger causality (lsNGC) analysis of functional MRI data for schizophrenia classification. In Proceedings of the Medical Imaging 2025: Computer-Aided Diagnosis (SPIE), San Diego, CA, USA, 16–21 February 2025; Volume 13407, pp. 300–307. [Google Scholar]
- Chen, J.; Zhou, G.; Han, J.; Su, P.; Zhang, H.; Tang, D. The effect of perceived groove in music on effective brain connectivity during cycling: An fNIRS study. Med. Sci. Sport. Exerc. 2024, 57, 857–866. [Google Scholar]
- Lin, R.; Dong, C.; Zhou, P.; Ma, P.; Ma, S.; Chen, X.; Liu, H. Motor imagery EEG task recognition using a nonlinear Granger causality feature extraction and an improved Salp swarm feature selection. Biomed. Signal Process. Control 2024, 88, 105626. [Google Scholar] [CrossRef]
- Liu, Y.; Zhang, X.; Chen, G.; Huang, L.; Sun, Y. EEG-fNIRS emotion recognition based on multi-brain attention mechanism capsule fusion network. J. Zhejiang Univ. (Eng. Sci.) 2024, 58, 2247–2257. [Google Scholar]
- Al-Shargie, F.; Tang, T.B.; Kiguchi, M. Stress assessment based on decision fusion of EEG and fNIRS signals. IEEE Access 2017, 5, 19889–19896. [Google Scholar] [CrossRef]
- Nour, M.; Öztürk, Ş.; Polat, K. A novel classification framework using multiple bandwidth method with optimized CNN for brain–computer interfaces with EEG-fNIRS signals. Neural Comput. Appl. 2021, 33, 15815–15829. [Google Scholar]
- Saadati, M.; Nelson, J.; Ayaz, H. Convolutional neural network for hybrid fNIRS-EEG mental workload classification. In Proceedings of the AHFE 2019 International Conference on Neuroergonomics and Cognitive Engineering, and the AHFE International Conference on Industrial Cognitive Ergonomics and Engineering Psychology, Washington, DC, USA, 24–28 July 2019; Springer: Cham, Switzerland, 2019; pp. 221–232. [Google Scholar]
- Sun, Y.J.; Ayaz, H.; Akansu, A.N. Multimodal affective state assessment using fNIRS+EEG and spontaneous facial expression. Brain Sci. 2020, 10, 85. [Google Scholar] [CrossRef]
- Liang, Z.; Wang, X.; Yu, Z.; Tong, Y.; Li, X.; Ma, Y.; Guo, H. Age-dependent neurovascular coupling characteristics in children and adults during general anesthesia. Biomed. Opt. Express 2023, 14, 2240–2259. [Google Scholar] [CrossRef] [PubMed]
- Kamat, A.; Norfleet, J.; Intes, X.; Dutta, A.; De, S. Perception–action cycle-related brain networks differentiate experts and novices: A combined EEG, fNIRS study during a complex surgical motor task. In Proceedings of the Clinical and Translational Neurophotonics 2022, SPIE BiOS, San Francisco, CA, USA, 22 January–28 February 2022; Volume 11945, pp. 47–55. [Google Scholar]
- Guo, J.L.; Fang, F.; Wang, W.; Ren, F.J. EEG emotion recognition based on Granger causality and CapsNet neural network. In Proceedings of the 2018 5th IEEE International Conference on Cloud Computing and Intelligence Systems (CCIS), Nanjing, China, 23–25 November 2018; IEEE: Piscataway, NJ, USA, 2018; pp. 47–52. [Google Scholar]
- Bagherzadeh, S.; Maghooli, K.; Shalbaf, A.; Maghsoudi, A. Emotion recognition using effective connectivity and pre-trained convolutional neural networks in EEG signals. Cogn. Neurodyn. 2022, 16, 1087–1106. [Google Scholar] [CrossRef]
- Pugh, Z.H.; Choo, S.; Leshin, J.C.; Lindquist, K.A.; Nam, C.S. Emotion depends on context, culture and their interaction: Evidence from effective connectivity. Soc. Cogn. Affect. Neurosci. 2022, 17, 206–217. [Google Scholar] [CrossRef] [PubMed]
- Hu, Z.S.; Lam, K.F.; Xiang, Y.T.; Yuan, Z. Causal cortical network for arithmetic problem-solving represents brain’s planning rather than reasoning. Int. J. Biol. Sci. 2019, 15, 1148–1160. [Google Scholar] [CrossRef]
- Lee, S.H.; Jin, S.H.; An, J. Distinction of directional coupling in sensorimotor networks between active and passive finger movements using fNIRS. Biomed. Opt. Express 2018, 9, 2859–2870. [Google Scholar] [CrossRef]
- Hamidi, A.; Kiani, K. Motor imagery EEG signals classification using a Transformer-GCN approach. Appl. Soft Comput. 2025, 170, 112686. [Google Scholar] [CrossRef]
- Li, Y.; Zhang, X.; Ming, D. Early-stage fusion of EEG and fNIRS improves classification of motor imagery. Front. Neurosci. 2023, 16, 1062889. [Google Scholar] [CrossRef] [PubMed]
- Kwak, Y.; Song, W.J.; Kim, S.E. FGANet: FNIRS-guided attention network for hybrid EEG-fNIRS brain-computer interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 30, 329–339. [Google Scholar] [CrossRef]
- Zhang, J.; Zhang, X.; Chen, G.; Huang, L.; Sun, Y. EEG emotion recognition based on cross-frequency Granger causality feature extraction and fusion in the left and right hemispheres. Front. Neurosci. 2022, 16, 974673. [Google Scholar] [CrossRef]
- Sun, Z.; Huang, Z.; Duan, F.; Liu, Y. A novel multimodal approach for hybrid brain–computer interface. IEEE Access 2020, 8, 89909–89918. [Google Scholar] [CrossRef]
- Hou, M.; Zhang, X.; Chen, G.; Huang, L.; Sun, Y. Emotion recognition based on an EEG–fNIRS hybrid brain network in the source space. Brain Sci. 2024, 14, 1166. [Google Scholar] [CrossRef]
- Wei, Y.; Liu, Y.; Li, C.; Cheng, J.; Song, R.; Chen, X. TC-Net: A Transformer capsule network for EEG-based emotion recognition. Comput. Biol. Med. 2023, 152, 106463. [Google Scholar] [CrossRef]
- Liu, Y.; Ding, Y.; Li, C.; Cheng, J.; Song, R.; Wan, F.; Chen, X. Multi-channel EEG-based emotion recognition via a multi-level features guided capsule network. Comput. Biol. Med. 2020, 123, 103927. [Google Scholar] [CrossRef]
- Wang, Z.; Chen, C.; Li, J.; Wan, F.; Sun, Y.; Wang, H. ST-CapsNet: Linking spatial and temporal attention with capsule network for P300 detection improvement. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 991–1000. [Google Scholar] [CrossRef] [PubMed]
- Zhao, W.; Jiang, X.; Zhang, B.; Xiao, S.; Weng, S. CTNet: A convolutional transformer network for EEG-based motor imagery classification. Sci. Rep. 2024, 14, 20237. [Google Scholar] [CrossRef] [PubMed]

| Layers | Filter Size | Stride | Padding |
|---|---|---|---|
| Input | GC(EEG), HbO2 + HbR, GC(EEG-fNIRS) | | |
| Conv1 | | 1 | 1 |
| ReLU | | | |
| Conv2 | | 1 | 1 |
| ReLU | | | |
| Conv3 | | 1 | 1 |
| ReLU | | | |
| FC1 | 128 | | |
| Concatenate | | | |
| FC2 | 4 | | |
| ReLU | | | |
| Softmax | 4 | | |
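The table above lists the three branch inputs (GC(EEG), HbO2 + HbR, GC(EEG-fNIRS)) that are concatenated before classification, but the attention step itself is not spelled out here. As a rough, assumption-laden sketch of how a modality attention mechanism might reweight the branch features before fusion (with `w` and `b` standing in for learned attention parameters, not values from the paper):

```python
import numpy as np

def softmax(z):
    # Numerically stable softmax over a 1-D score vector.
    z = z - np.max(z)
    e = np.exp(z)
    return e / e.sum()

def modality_attention_fusion(feats, w, b):
    """Hypothetical modality-attention step: score each branch feature
    vector (e.g. EEG, fNIRS, cross-modal GC), softmax the scores into
    attention weights, and return the weighted sum plus the weights."""
    feats = np.stack(feats)        # (n_branches, feat_dim)
    scores = feats @ w + b         # one scalar score per branch
    weights = softmax(scores)
    fused = (weights[:, None] * feats).sum(axis=0)
    return fused, weights
```

In a trained model the scoring parameters would be learned end-to-end; here they only illustrate that the weights form a convex combination over the three branches.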
| Features | EEG() | EEG() | EEG() | EEG() | EEG() |
|---|---|---|---|---|---|
| EEG + HbO2 | 84.56/3.60 | 85.07/3.19 | 84.16/3.98 | 89.01/9.12 | 94.65/4.56 |
| EEG + HbR | 84.87/3.53 | 84.05/3.95 | 83.90/4.54 | 82.68/7.30 | 93.66/4.92 |
| EEG + (HbO2 + HbR) | 86.64/3.21 | 87.61/3.85 | 86.51/4.21 | 90.52/4.12 | 94.76/3.81 |
| EEG + (EEG-fNIRS) | 88.86/5.37 | 89.02/5.37 | 88.93/5.45 | 88.86/5.05 | 93.43/4.88 |
| (HbO2 + HbR) + (EEG-fNIRS) | 87.05/7.02 | 89.95/3.90 | 89.55/5.52 | 85.56/8.25 | 92.08/6.65 |
| EEG + (HbO2 + HbR) + (EEG-fNIRS) | 86.71/5.13 | 88.95/4.97 | 86.54/4.85 | 89.15/5.92 | 94.86/3.81 |
| Method | EEG() + fNIRS | EEG() + fNIRS | EEG() + fNIRS | EEG() + fNIRS | Accuracy/SD | Time/s | Parameters |
|---|---|---|---|---|---|---|---|
| 1D2D-CNN | 86.71/5.13 | 88.95/4.97 | 86.54/4.85 | 89.15/5.92 | 94.86/3.81 | 624 | 128,972,816 |
| 1D2D-CNN (MFA) | 90.07/4.42 | 88.57/4.74 | 90.67/2.57 | 89.72/3.31 | 95.82/3.45 | 626.5 | 128,972,888 |
| 1D2D-CNN (WML) | 89.52/3.84 | 88.46/4.01 | 90.52/2.99 | 90.12/3.56 | 95.47/3.55 | 624.5 | 128,972,816 |
| MFA-CNN (Unweighted) | 91.97/4.37 | 90.60/4.74 | 91.70/2.57 | 90.75/3.32 | 96.21/3.32 | 628.6 | 128,972,888 |
| MFA-CNN | 92.07/4.28 | 90.28/4.60 | 92.09/2.40 | 92.46/2.01 | 96.85/1.92 | 632.6 | 128,972,888 |
| Feature Fusion/Model | Happy | Fear | Sad | Calm | Acc./Std |
|---|---|---|---|---|---|
| Cascade+SVM | 90.74 | 82.01 | 83.74 | 87.90 | 86.12/6.71 |
| Cascade+CNN | 93.46 | 94.21 | 89.13 | 92.51 | 92.33/4.94 |
| Cascade+GCN | 92.52 | 90.19 | 90.71 | 91.06 | 91.17/4.01 |
| Weighted fusion+GCN [44] | 98.70 | 91.94 | 91.32 | 98.55 | 95.15/3.15 |
| GNN-fusion [44] | 94.11 | 94.85 | 93.55 | 94.81 | 94.33/2.98 |
| Tensor fusion [45] | 92.41 | 91.60 | 91.83 | 95.79 | 92.91/3.00 |
| P-order polynomial fusion [45] | 94.62 | 94.92 | 92.97 | 92.44 | 93.74/3.14 |
| SG_SC [46] | 97.10 | 96.60 | 95.40 | 97.50 | 91.70/3.02 |
| TC-Net [47] | 85.62 | 89.35 | 83.84 | 88.54 | 86.84/9.25 |
| MLF-CapsNet [48] | 94.48 | 94.79 | 93.46 | 95.86 | 94.65/3.80 |
| ST-CapsNet [49] | 93.57 | 95.02 | 93.04 | 94.42 | 94.01/2.95 |
| Transformers [50] | 85.62 | 89.35 | 83.84 | 88.54 | 86.84/9.25 |
| MBA-CF-cCapsNet [29] | 96.57 | 97.31 | 95.02 | 97.76 | 96.67/2.68 |
| MFA-CNN (ours) | 97.05 | 96.66 | 96.07 | 96.88 | 96.85/1.92 |
| Method | Acc. | Std |
|---|---|---|
| Cascade+SVM | 47.85 | 14.51 |
| Cascade+CNN | 53.67 | 11.35 |
| Cascade+GCN | 55.85 | 10.46 |
| p-order polynomial fusion [45] | 52.33 | 10.80 |
| Tensor fusion [45] | 50.46 | 10.12 |
| Weighted fusion + GCN [44] | 52.80 | 11.50 |
| GNN-fusion [44] | 53.98 | 12.95 |
| Transformers [50] | 53.87 | 10.81 |
| MFA-CNN (ours) | 58.59 | 9.89 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Zhang, J.; Wang, A.; Li, S.; Zhang, D.; Li, X. MFA-CNN: An Emotion Recognition Network Integrating 1D–2D Convolutional Neural Network and Cross-Modal Causal Features. Brain Sci. 2025, 15, 1165. https://doi.org/10.3390/brainsci15111165

