LPGGNet: Learning from Local–Partition–Global Graph Representations for Motor Imagery EEG Recognition
Abstract
1. Introduction
- (1) We propose LPGGNet, a framework with a hierarchical architecture designed to capture local, partitioned, and global brain activities in MI-EEG.
- (2) A novel partition method for EEG electrodes is introduced to spatially isolate task-relevant brain activities and reduce inter-partition interference.
- (3) A Gaussian median distance (GMD)-based method is proposed to quantify inter-electrode relationships, which better aligns with the physiological characteristics of EEG signal propagation in the brain.
- (4) A BCI-based intelligent wheelchair system is developed to validate the practical usefulness of the proposed LPGGNet.
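For contribution (3), a common way to realise a Gaussian kernel whose bandwidth is set by the median pairwise electrode distance (the "median heuristic") is sketched below. This is illustrative only: the paper's exact GMD formulation may differ, and the function and variable names are ours.

```python
import numpy as np

def gmd_adjacency(pos):
    """Gaussian median distance (GMD) adjacency (illustrative sketch).

    pos : (N, 3) array of electrode coordinates.
    Returns an (N, N) weighted adjacency in which closer electrodes receive
    larger weights; the kernel bandwidth is the median pairwise distance.
    """
    diff = pos[:, None, :] - pos[None, :, :]
    d = np.linalg.norm(diff, axis=-1)                    # pairwise Euclidean distances
    sigma = np.median(d[np.triu_indices_from(d, k=1)])   # median-heuristic bandwidth
    A = np.exp(-d**2 / (2 * sigma**2))                   # Gaussian kernel weights
    np.fill_diagonal(A, 0.0)                             # no self-loops
    return A
```

Because the bandwidth adapts to the montage, the same construction works for caps with different electrode counts and spacings.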
2. Related Works
2.1. Brain Connectivity and Partitions of EEG Signals
2.2. Convolutional Neural Networks (CNNs)
2.3. Graph Convolutional Networks (GCNs)
3. Materials and Methods
3.1. Datasets
3.1.1. Dataset A
3.1.2. Dataset B
3.1.3. Data Processing
3.2. LPGGNet
- (1) Local Learning Module: Adjacency matrices are constructed from partial directed coherence (PDC) to capture directional dependencies between EEG channels. Two temporal convolution (TC) layers are then applied to extract higher-level temporal features from the channel signals. Finally, a graph convolutional network (GCN) captures the local topological features of the signals, integrating the temporal dynamics with their corresponding local graph structures.
- (2) Partition Learning Module: An adjacency matrix is first constructed for each partition based on the Gaussian median distance (GMD), and a graph filter is built on it to refine partition-level signal representations. Two temporal convolution layers then capture high-level temporal dynamics within each partition. Finally, features from all partitions are fused by arithmetic averaging into a new partition-based representation.
- (3) Global Learning Module: Node features from the Local Learning and Partition Learning modules are first fused into a unified representation. A dynamic global adjacency matrix is then constructed from cosine similarity to capture inter-node dependencies across all electrodes. Two residual graph convolutional layers exploit these relationships, extracting global inter-electrode features while their residual connections alleviate overfitting. Finally, the learned global representations are fed into two fully connected (FC) layers for classification.
3.3. Local Learning Module
3.3.1. PDC-Based Electrode Relationships
3.3.2. Temporal Graph Convolution Network (TGCN)
3.4. Partition Learning Module
3.4.1. Partition Strategy
3.4.2. Gaussian Median Distance (GMD) Method
3.4.3. Partitioned Feature Fusion
3.5. Global Learning Module
3.6. Classification Module
4. Experiment and Results
4.1. Evaluation Metrics
4.2. Model Parameters
4.3. Experiments on Dataset A and Dataset B
4.3.1. Overall Performance
4.3.2. Ablation Experiments
4.3.3. Online Wheelchair Experiment
5. Discussion
5.1. Effect of Gaussian Median Distance (GMD)
5.2. Visualization Analysis of LPGGNet
5.2.1. Visualization of Feature Separation
5.2.2. Visualization of Electrode Contributions
5.3. Practicality of LPGGNet
5.4. Relationship to Existing Hierarchical GCN and Spatio-Temporal GNN Models
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
- Su, J.; Wang, J.; Wang, W.; Wang, Y.; Bunterngchit, C.; Zhang, P.; Hou, Z.G. An adaptive hybrid brain computer interface for hand function rehabilitation of stroke patients. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 2950–2960.
- Oh, E.; Shin, S.; Kim, S.P. Brain–computer interface in critical care and rehabilitation. Acute Crit. Care 2024, 39, 24.
- Ma, Z.Z.; Wu, J.J.; Cao, Z.; Hua, X.Y.; Zheng, M.X.; Xing, X.X.; Xu, J.G. Motor imagery-based brain–computer interface rehabilitation programs enhance upper extremity performance and cortical activation in stroke patients. J. Neuroeng. Rehabil. 2024, 21, 91.
- Wang, A.; Sun, J. Personalized EEG-guided brain stimulation targeting in major depression via network controllability and multi-objective optimization. BMC Psychiatry 2025, 25, 723.
- Tozzi, L.; Zhang, X.; Pines, A.; Olmsted, A.M.; Zhai, E.S.; Anene, E.T.; Williams, L.M. Personalized brain circuit scores identify clinically distinct biotypes in depression and anxiety. Nat. Med. 2024, 30, 2076–2087.
- Mikołajewska, E.; Mikołajewski, D. Non-invasive EEG-based brain–computer interfaces in patients with disorders of consciousness. Mil. Med. Res. 2014, 1, 14.
- Yang, B.; Rong, F.; Xie, Y.; Li, D.; Zhang, J.; Li, F.; Gao, X. A multi-day and high-quality EEG dataset for motor imagery brain–computer interface. Sci. Data 2025, 12, 488.
- Degirmenci, M.; Yuce, Y.K.; Perc, M.; Isler, Y. EEG channel and feature investigation in binary and multiple motor imagery task predictions. Front. Hum. Neurosci. 2024, 18, 1525139.
- Almohammadi, A.; Wang, Y.-K. Revealing brain connectivity: Graph embeddings for EEG representation learning and comparative analysis of structural and functional connectivity. Front. Neurosci. 2024, 17, 1288433.
- Awais, M.A.; Yusoff, M.Z. Partial directed coherence for the classification of motor imagery-based brain–computer interface. In Proceedings of the Multimedia University Engineering Conference (MECON 2022); Atlantis Press: Dordrecht, The Netherlands, 2022; pp. 121–131.
- Cattai, T.; Colonnese, S.; Barbarossa, S. Robust graph topology inference for multiple brain EEG networks. IEEE Trans. Signal Inf. Process. Netw. 2025, 11, 1317–1331.
- Scrivener, C.L.; Reader, A.T. Variability of EEG electrode positions and their underlying brain regions: Visualizing gel artifacts from a simultaneous EEG–fMRI dataset. Brain Behav. 2022, 12, e2476.
- Stier, C.; Loose, M.; Loew, C.; Segovia-Oropeza, M.; Baek, S.; Lerche, H.; Focke, N.K. Comprehensive evaluation of EEG spatial sampling, head modeling, and parcellation effects on network alterations in idiopathic generalized epilepsy. medRxiv 2024.
- Wang, L.; Li, Z.X. EEG classification based on common spatial pattern and LDA. In Proceedings of the 7th International Conference on Artificial Life and Robotics (ICAROB), Oita, Japan, 13–16 January 2020; pp. 1–6.
- Lee, D.; Park, S.H.; Lee, S.G. Improving the accuracy and training speed of motor imagery brain–computer interfaces using wavelet-based combined feature vectors and Gaussian mixture model-supervectors. Sensors 2017, 17, 2282.
- Ma, W.; Zheng, Y.; Li, T.; Li, Z.; Li, Y.; Wang, L. A comprehensive review of deep learning in EEG-based emotion recognition: Classifications, trends, and practical implications. PeerJ Comput. Sci. 2024, 10, e2065.
- Saibene, A.; Ghaemi, H.; Dagdevir, E. Deep learning in motor imagery EEG signal decoding: A systematic review. Neurocomputing 2024, 610, 128577.
- Tabar, Y.R.; Halici, U. A novel deep learning approach for classification of EEG motor imagery signals. J. Neural Eng. 2016, 14, 016003.
- Lawhern, V.J.; Solon, A.J.; Waytowich, N.R.; Gordon, S.M.; Hung, C.P.; Lance, B.J. EEGNet: A compact convolutional neural network for EEG-based brain–computer interfaces. J. Neural Eng. 2018, 15, 056013.
- Li, Y.; Zheng, W.; Zong, Y.; Cui, Z.; Zhang, T.; Zhou, X. A bi-hemisphere domain adversarial neural network model for EEG emotion recognition. IEEE Trans. Affect. Comput. 2018, 12, 494–505.
- Lomelin-Ibarra, V.A.; Gutierrez-Rodriguez, A.E.; Cantoral-Ceballos, J.A. Motor imagery analysis from extensive EEG data representations using convolutional neural networks. Sensors 2022, 22, 6093.
- Tang, X.; Yang, C.; Sun, X.; Zou, M.; Wang, H. Motor imagery EEG decoding based on multi-scale hybrid networks and feature enhancement. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 1208–1218.
- Hamidi, A.; Kiani, K. Motor imagery EEG signals classification using a Transformer-GCN approach. Appl. Soft Comput. 2025, 170, 112686.
- Shan, X.; Cao, J.; Huo, S.; Chen, L.; Sarrigiannis, P.G.; Zhao, Y. Spatial–temporal graph convolutional network for Alzheimer classification based on brain functional connectivity imaging of electroencephalogram. Hum. Brain Mapp. 2022, 43, 5194–5209.
- Graña, M.; Morais-Quilez, I. A review of graph neural networks for electroencephalography data analysis. Neurocomputing 2023, 562, 126901.
- Tian, W.; Li, M.; Ju, X.; Liu, Y. Applying multiple functional connectivity features in GCN for EEG-based human identification. Brain Sci. 2022, 12, 1072.
- Mohammadi, H.; Karwowski, W. Graph neural networks in brain connectivity studies: Methods, challenges, and future directions. Brain Sci. 2024, 15, 17.
- Klepl, D.; Wu, M.; He, F. Graph neural network-based EEG classification: A survey. IEEE Trans. Neural Syst. Rehabil. Eng. 2024, 32, 493–503.
- Henry, J.C. Electroencephalography: Basic principles, clinical applications, and related fields. Neurology 2006, 67, 2092.
- Leeuwis, N.; Yoon, S.; Alimardani, M. Functional connectivity analysis in motor-imagery brain–computer interfaces. Front. Hum. Neurosci. 2021, 15, 732946.
- Maghsoudi, A.; Shalbaf, A. Hand motor imagery classification using effective connectivity and hierarchical machine learning in EEG signals. J. Biomed. Phys. Eng. 2022, 12, 161.
- Lun, X.; Liu, J.; Zhang, Y.; Hao, Z.; Hou, Y. A motor imagery signals classification method via the difference of EEG signals between left and right hemispheric electrodes. Front. Neurosci. 2022, 16, 865594.
- Yazıcı, M.; Ulutaş, M.; Okuyan, M. Effect of EEG electrode numbers on source estimation in motor imagery. Brain Sci. 2025, 15, 685.
- Schirrmeister, R.T.; Springenberg, J.T.; Fiederer, L.D.J.; Glasstetter, M.; Eggensperger, K.; Tangermann, M.; Ball, T. Deep learning with convolutional neural networks for EEG decoding and visualization. Hum. Brain Mapp. 2017, 38, 5391–5420.
- Zhao, X.; Zhang, H.; Zhu, G.; You, F.; Kuang, S.; Sun, L. A multi-branch 3D convolutional neural network for EEG-based motor imagery classification. IEEE Trans. Neural Syst. Rehabil. Eng. 2019, 27, 2164–2177.
- Hou, Y.; Jia, S.; Lun, X.; Hao, Z.; Shi, Y.; Li, Y.; Lv, J. GCNs-Net: A graph convolutional neural network approach for decoding time-resolved EEG motor imagery signals. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 7312–7323.
- Wan, Z.; Li, M.; Liu, S.; Huang, J.; Tan, H.; Duan, W. EEGformer: A transformer-based brain activity classification method using EEG signal. Front. Neurosci. 2023, 17, 1148855.
- Liu, K.; Yang, T.; Yu, Z.; Yi, W.; Yu, H.; Wang, G.; Wu, W. MSVTNet: Multi-scale vision transformer neural network for EEG-based motor imagery decoding. IEEE J. Biomed. Health Inf. 2024, 28, 7126–7137.
- Jin, M.; Du, C.; He, H.; Cai, T.; Li, J. PGCN: Pyramidal graph convolutional network for EEG emotion recognition. IEEE Trans. Multimed. 2024, 26, 9070–9082.
- Du, G.; Su, J.; Zhang, L.; Su, K.; Wang, X.; Teng, S.; Liu, P.X. A multi-dimensional graph convolution network for EEG emotion recognition. IEEE Trans. Instrum. Meas. 2022, 71, 2518311.
- Aung, H.W.; Li, J.J.; Shi, B.; An, Y.; Su, S.W. EEG_GLT-Net: Optimising EEG graphs for real-time motor imagery signals classification. Biomed. Signal Process. Control 2025, 104, 107458.
- Chiarion, G.; Sparacino, L.; Antonacci, Y.; Faes, L.; Mesin, L. Connectivity analysis in EEG data: A tutorial review of the state of the art and emerging trends. Bioengineering 2023, 10, 372.
- Brunner, C.; Leeb, R.; Müller-Putz, G.; Schlögl, A.; Pfurtscheller, G. BCI Competition 2008–Graz data set A. Inst. Knowl. Discov. Graz Univ. Technol. 2008, 16, 34.
- Altaheri, H.; Muhammad, G.; Alsulaiman, M.; Amin, S.U.; Altuwaijri, G.A.; Abdul, W.; Bencherif, M.A.; Faisal, M. Deep learning techniques for classification of electroencephalogram (EEG) motor imagery (MI) signals: A review. Neural Comput. Appl. 2023, 35, 14681–14722.
- Awais, M.A.; Yusoff, M.Z.; Khan, D.M.; Yahya, N.; Kamel, N.; Ebrahim, M. Effective connectivity for decoding electroencephalographic motor imagery using a probabilistic neural network. Sensors 2021, 21, 6570.
- Tian, G.; Liu, Y. Simple convolutional neural network for left-right hands motor imagery EEG signals classification. Int. J. Cogn. Inform. Nat. Intell. 2019, 13, 36–49.
- Giannopulu, I.; Mizutani, H. Neural kinesthetic contribution to motor imagery of body parts: Tongue, hands, and feet. Front. Hum. Neurosci. 2021, 15, 602723.
- Sauvage, C.; Jissendi, P.; Seignan, S.; Manto, M.; Habas, C. Brain areas involved in the control of speed during a motor sequence of the foot: Real movement versus mental imagery. J. Neuroradiol. 2013, 42, 115–125.
- Zhao, W.; Jiang, X.; Zhang, B.; Xiao, S.; Weng, S. CTNet: A convolutional transformer network for EEG-based motor imagery classification. Sci. Rep. 2024, 14, 20237.
- Chen, J.; Yu, Z.; Gu, Z.; Li, Y. Deep temporal-spatial feature learning for motor imagery-based brain–computer interfaces. IEEE Trans. Neural Syst. Rehabil. Eng. 2020, 28, 2356–2366.
- Ang, K.K.; Chin, Z.Y.; Wang, C.; Guan, C.; Zhang, H. Filter bank common spatial pattern algorithm on BCI Competition IV datasets 2a and 2b. Front. Neurosci. 2012, 6, 39.
- Li, Y.; Guo, L.; Liu, Y.; Liu, J.; Meng, F. A temporal-spectral-based squeeze-and-excitation feature fusion network for motor imagery EEG decoding. IEEE Trans. Neural Syst. Rehabil. Eng. 2021, 29, 1534–1545.
- Mohammadi, E.; Daneshmand, P.G.; Khorzooghi, S.M.S.M. Electroencephalography-based brain–computer interface motor imagery classification. J. Med. Signals Sens. 2022, 12, 40–47.
- Velickovic, P.; Cucurull, G.; Casanova, A.; Romero, A.; Lio, P.; Bengio, Y. Graph attention networks. arXiv 2017, arXiv:1710.10903.
- Song, Y.; Zheng, Q.; Liu, B.; Gao, X. EEG Conformer: Convolutional transformer for EEG decoding and visualization. IEEE Trans. Neural Syst. Rehabil. Eng. 2022, 31, 710–719.
- Shi, J.; Tang, J.; Lu, Z.; Zhang, R.; Yang, J.; Guo, Q.; Zhang, D. A brain topography graph embedded convolutional neural network for EEG-based motor imagery classification. Biomed. Signal Process. Control 2024, 95, 106401.
- Woolson, R.F. Wilcoxon signed-rank test. Wiley Encycl. Clin. Trials 2007, 1–3.
- Jia, J.; Zhang, B.; Lv, H.; Xu, Z.; Hu, S.; Li, H. CR-GCN: Channel-relationships-based graph convolutional network for EEG emotion recognition. Brain Sci. 2022, 12, 987.
- van der Maaten, L.; Hinton, G.E. Visualizing data using t-SNE. J. Mach. Learn. Res. 2008, 9, 2579–2605.
- Xia, M.; Wang, J.; He, Y. BrainNet Viewer: A network visualization tool for human brain connectomics. PLoS ONE 2013, 8, e68910.
- Ying, Z.; You, J.; Morris, C.; Ren, X.; Hamilton, W.; Leskovec, J. Hierarchical graph representation learning with differentiable pooling. Adv. Neural Inf. Process. Syst. 2018, 31, 4805–4815.
- Zhao, L.; Song, Y.; Zhang, C.; Liu, Y.; Wang, P.; Lin, T.; Deng, M.; Li, H. T-GCN: A temporal graph convolutional network for traffic prediction. IEEE Trans. Intell. Transp. Syst. 2019, 21, 3848–3858.









Classification accuracy (%) on Dataset A (nine subjects); the last two rows report the mean ± standard deviation and Cohen's kappa.

| Subject | FBCSP 2012 | Shallow ConvNet * 2017 | Deep ConvNet * 2017 | GAT * 2017 | TS-SEFFNet * 2021 | SWLDA 2022 | Conformer 2022 | GECNN 2024 | Ours |
|---|---|---|---|---|---|---|---|---|---|
| 1 | 76.00 | 74.75 | 80.15 | 82.36 | 76.17 | 71.43 | 88.19 | 87.90 | 87.5 |
| 2 | 56.50 | 55.32 | 60.39 | 67.12 | 54.62 | 43.87 | 61.46 | 67.49 | 72.9 |
| 3 | 81.25 | 80.14 | 88.53 | 90.33 | 85.76 | 77.35 | 93.40 | 93.41 | 95.5 |
| 4 | 61.00 | 67.00 | 70.42 | 68.33 | 61.63 | 59.44 | 78.13 | 71.49 | 74.3 |
| 5 | 55.00 | 66.56 | 71.33 | 75.67 | 64.38 | 51.74 | 52.08 | 83.70 | 82.3 |
| 6 | 45.25 | 63.12 | 64.66 | 62.08 | 58.75 | 48.20 | 65.28 | 60.93 | 76.7 |
| 7 | 82.75 | 83.11 | 85.89 | 88.01 | 80.69 | 69.10 | 92.36 | 90.61 | 87.8 |
| 8 | 81.25 | 75.20 | 83.16 | 80.90 | 78.82 | 74.48 | 88.19 | 83.76 | 81.9 |
| 9 | 70.75 | 76.82 | 85.31 | 82.31 | 79.55 | 83.97 | 88.89 | 84.85 | 87.2 |
| Avg ± Std (%) | 67.75 ± 13.73 | 71.35 ± 8.93 | 76.64 ± 10.2 | 77.45 ± 9.79 | 71.15 ± 11.31 | 64.40 ± 14.11 | 78.66 ± 15.3 | 80.46 ± 11.16 | 82.9 ± 7.38 |
| Kap (%) | 57.00 | 61.88 | 68.56 | 69.66 | 62.12 | 53.00 | 71.55 | 74.00 | 77.2 |
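The Kap row can be sanity-checked against the accuracy row: for a balanced K-class task with chance agreement 1/K, Cohen's kappa reduces to (acc − 1/K)/(1 − 1/K). Applying this shortcut to the four-class average accuracy of 82.9% gives roughly 77.2%, in line with the reported value; note that the true kappa is computed from the full confusion matrix, so this is only an approximation.

```python
def kappa_from_accuracy(acc, k):
    """Cohen's kappa assuming balanced classes and chance agreement 1/k."""
    p_e = 1.0 / k
    return (acc - p_e) / (1.0 - p_e)

print(round(100 * kappa_from_accuracy(0.829, 4), 1))  # 77.2 for the four-class average above
```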
Classification accuracy (%) on Dataset B (five subjects); the last two rows report the mean ± standard deviation and Cohen's kappa.

| Subject | Shallow ConvNet * 2017 | Deep ConvNet * 2017 | GAT * 2017 | TS-SEFFNet * 2021 | Ours |
|---|---|---|---|---|---|
| 1 | 81.42 | 83.09 | 85.34 | 77.38 | 87.50 |
| 2 | 79.55 | 83.02 | 78.83 | 72.13 | 85.91 |
| 3 | 87.10 | 92.54 | 89.05 | 86.26 | 94.05 |
| 4 | 82.30 | 72.74 | 82.01 | 76.79 | 81.33 |
| 5 | 81.89 | 84.04 | 83.55 | 80.08 | 88.75 |
| Avg ± Std (%) | 82.45 ± 2.8 | 83.08 ± 7.02 | 83.75 ± 3.8 | 78.52 ± 8.93 | 87.50 ± 4.61 |
| Kap (%) | 76.85 | 78.47 | 79.02 | 71.05 | 83.92 |
Wilcoxon signed-rank test p-values for LPGGNet versus each baseline method.

| Method | Shallow ConvNet | Deep ConvNet | GAT | TS-SEFFNet | GECNN |
|---|---|---|---|---|---|
| p-value | 0.008 | 0.015 | 0.032 | 0.046 | 0.020 |
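These p-values come from paired Wilcoxon signed-rank tests over per-subject accuracies. For small samples the exact null distribution can be enumerated directly; the stdlib-only sketch below is illustrative (the paper's exact pairing and any corrections are not restated here, so reproduced values may differ from the table).

```python
from itertools import product

def wilcoxon_exact(x, y):
    """Exact two-sided Wilcoxon signed-rank test for small paired samples.

    Assumes no zero differences and no tied |differences|.
    Returns (W, p), where W is the smaller of the two signed-rank sums.
    """
    d = [a - b for a, b in zip(x, y)]
    order = sorted(range(len(d)), key=lambda i: abs(d[i]))
    ranks = [0] * len(d)
    for rk, i in enumerate(order, start=1):
        ranks[i] = rk
    total = len(d) * (len(d) + 1) // 2
    w_plus = sum(rk for di, rk in zip(d, ranks) if di > 0)
    w = min(w_plus, total - w_plus)
    # Under H0 every sign pattern of the ranks is equally likely: enumerate all 2^n.
    dist = [sum(rk for rk, s in zip(range(1, len(d) + 1), signs) if s)
            for signs in product((0, 1), repeat=len(d))]
    p = sum(1 for v in dist if min(v, total - v) <= w) / len(dist)
    return w, p
```

Feeding two models' per-subject accuracies into `wilcoxon_exact` gives the same exact p-value as `scipy.stats.wilcoxon` in its exact mode, without the SciPy dependency.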
Ablation over the three learning modules (● = included, ○ = removed); values are classification accuracy (%).

| Configuration | Local Learning | Partition Learning | Global Learning | Dataset A ACC (%) | Dataset B ACC (%) |
|---|---|---|---|---|---|
| Local only | ● | ○ | ○ | 74.3 | 80.02 |
| Partition only | ○ | ● | ○ | 80.1 | 84.92 |
| Global only | ○ | ○ | ● | 70.6 | 77.00 |
| Local removed | ○ | ● | ● | 82.3 | 86.53 |
| Partition removed | ● | ○ | ● | 73.8 | 77.18 |
| Global removed | ● | ● | ○ | 79.4 | 83.88 |
| Ours | ● | ● | ● | 82.9 | 87.50 |

Ablation over individual components (● = used, ○ = not used): PDC versus Pearson correlation for the local adjacency, GMD versus inverse-square distance for the partition adjacency, the partition module itself, and the residual links. The last row is the full LPGGNet configuration.

| PDC | Pearson | GMD | Inverse Square | Partition | Residual Links | Dataset A ACC (%) | Dataset B ACC (%) |
|---|---|---|---|---|---|---|---|
| ○ | ● | ● | ○ | ● | ● | 82.1 | 86.3 |
| ● | ○ | ○ | ● | ● | ● | 81.3 | 85.2 |
| ● | ○ | ● | ○ | ● | ○ | 82.3 | 87.0 |
| ● | ○ | ● | ○ | ○ | ● | 81.7 | 85.6 |
| ● | ○ | ● | ○ | ● | ● | 82.9 | 87.5 |
Online wheelchair experiment: per-class command recognition accuracy (%) for five subjects.

| Subject | Left | Right | Foot | Tongue | Mean |
|---|---|---|---|---|---|
| 1 | 70.66 | 68.66 | 88.30 | 70.12 | 74.43 |
| 2 | 66.71 | 61.36 | 70.56 | 89.33 | 71.99 |
| 3 | 73.23 | 80.00 | 81.45 | 63.33 | 74.50 |
| 4 | 70.12 | 66.15 | 86.44 | 65.00 | 71.92 |
| 5 | 68.08 | 71.45 | 80.10 | 83.88 | 75.87 |
| Mean | 69.76 | 69.52 | 81.37 | 74.33 | 73.74 |
Comparison of model complexity and classification accuracy.

| Models | Parameters (M) | Training Time (min) | FLOPs (M) | ACC (%) |
|---|---|---|---|---|
| * Shallow ConvNet [26] | 0.048 | 9.836 | 48.3 | 71.35 |
| * TS-SEFFNet [41] | 0.283 | 10.210 | 301.2 | 71.15 |
| * GECNN [43] | 0.193 | 11.053 | 263.12 | 78.83 |
| Proposed | 0.513 | 14.361 | 486.2 | 82.90 |
Share and Cite
Zhang, N.; Jian, H.; Li, X.; Jiang, G.; Tang, X. LPGGNet: Learning from Local–Partition–Global Graph Representations for Motor Imagery EEG Recognition. Brain Sci. 2025, 15, 1257. https://doi.org/10.3390/brainsci15121257
