DAGMNet: Dual-Branch Attention-Pruned Graph Neural Network for Multimodal sMRI and fMRI Fusion in Autism Prediction
Abstract
1. Introduction
- We utilize both sMRI and fMRI data, incorporating three commonly used fMRI brain atlases (Harvard-Oxford (HO), Automated Anatomical Labeling (AAL), and Eickhoff-Zilles (EZ)), and design modality-specific feature extraction strategies. These strategies capture essential features from three-dimensional anatomical structures and from the topological organization of brain regions, yielding a compact and discriminative multimodal feature set.
- We design a multimodal aware representation learning module that separately learns shared and modality-specific features from structural and functional inputs. These components are adaptively fused via a gating mechanism to generate a compact and informative feature representation, which serves as the individualized embedding for each subject.
- We design a phenotypic similarity-based population graph and apply a dynamic attention-guided pruning strategy that iteratively refines subject connectivity to improve prediction performance.
- Experiments on the ABIDE-I dataset demonstrate the superiority of DAGMNet, achieving an accuracy of 91.59% and an AUC of 96.80%, outperforming several state-of-the-art models. Extensive ablation studies and t-SNE visualizations further support the interpretability and robustness of our framework.
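The gated fusion described in the second bullet can be illustrated with a minimal NumPy sketch. The dimensions, the sigmoid gate computed from the concatenated modalities, and the per-dimension convex blend are our illustrative assumptions; the paper's exact gating parameterization is not reproduced here, and the weights below are random stand-ins for learned parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Toy dimensions: d-dimensional structural and functional embeddings per subject.
n_subjects, d = 4, 8
h_s = rng.standard_normal((n_subjects, d))   # sMRI-branch features
h_f = rng.standard_normal((n_subjects, d))   # fMRI-branch features

# Gating parameters (learned in practice; random here for illustration).
W_g = rng.standard_normal((2 * d, d))
b_g = np.zeros(d)

# Gate computed from the concatenated modalities, then a per-dimension
# convex blend of the two branches yields the subject-level embedding.
g = sigmoid(np.concatenate([h_s, h_f], axis=1) @ W_g + b_g)
fused = g * h_s + (1.0 - g) * h_f

print(fused.shape)
```

Because the gate lies strictly in (0, 1), each embedding dimension of `fused` is an adaptive interpolation between the structural and functional branches rather than a fixed concatenation.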
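The population graph and pruning step in the third bullet can likewise be sketched. The choice of phenotypes (site and sex), the cosine feature affinity, and the top-k retention rule below are illustrative stand-ins for the paper's learned attention scores; in the actual framework the scores would be re-estimated and the graph re-pruned over training iterations.

```python
import numpy as np

rng = np.random.default_rng(1)
n, d = 6, 8
x = rng.standard_normal((n, d))              # subject embeddings (toy)
site = np.array([0, 0, 1, 1, 2, 2])          # hypothetical phenotypes
sex = np.array([0, 1, 0, 1, 0, 1])

# Phenotypic similarity: number of matching categorical phenotypes per pair.
pheno = (site[:, None] == site[None, :]).astype(float) + \
        (sex[:, None] == sex[None, :]).astype(float)

# Feature affinity: cosine similarity of subject embeddings.
xn = x / np.linalg.norm(x, axis=1, keepdims=True)
feat = xn @ xn.T

# Edge weights combine both signals; self-loops are removed.
w = pheno * feat
np.fill_diagonal(w, 0.0)

# Pruning sketch: keep each node's top-k strongest edges, drop the rest,
# then symmetrize so the graph remains undirected.
k = 2
pruned = np.zeros_like(w)
for i in range(n):
    keep = np.argsort(w[i])[-k:]
    pruned[i, keep] = w[i, keep]
pruned = np.maximum(pruned, pruned.T)

print((pruned > 0).sum())
```

Sparsifying the population graph this way limits message passing to the most similar subjects, which is the intuition behind attention-guided pruning of subject connectivity.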
2. Materials and Methods
2.1. Method Overview
2.2. Feature Extraction
2.2.1. sMRI and fMRI Image Feature Extraction
2.2.2. Multi-Atlas Brain Graph Feature Extraction
2.3. Multimodal Aware Representation Learning
2.4. Graph Neural Networks
2.5. Dataset
2.6. Implementation
2.7. Competitive Methods and Evaluation Metrics
3. Results
3.1. Comparison with Other Methods
3.2. Ablation Study
3.2.1. Impact of Key Architecture Components
3.2.2. Impact of Different Modality Inputs
3.2.3. Impact of Feature Fusion Mechanisms
3.3. ROC Curve Analysis
3.4. t-SNE Two-Dimensional Visualization
3.5. Biomarker Detection
4. Discussion
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
1. Centers for Disease Control and Prevention. Data & Statistics on Autism Spectrum Disorder. 2023. Available online: https://www.cdc.gov/ncbddd/autism/data.html (accessed on 1 September 2025).
2. Lai, M.C.; Lombardo, M.V.; Baron-Cohen, S. Autism. Lancet 2014, 383, 896–910.
3. Tafasca, S.; Gupta, A.; Kojovic, N.; Gelsomini, M.; Maillart, T.; Papandrea, M.; Schaer, M.; Odobez, J.M. The AI4Autism project: A multimodal and interdisciplinary approach to autism diagnosis and stratification. In Proceedings of the 25th International Conference on Multimodal Interaction, Paris, France, 9–13 October 2023; pp. 414–425.
4. Marbach, F.; Lipska-Ziętkiewicz, B.S.; Knurowska, A.; Michaud, V.; Margot, H.; Lespinasse, J.; Tran Mau Them, F.; Coubes, C.; Park, J.; Grosch, S.; et al. Phenotypic characterization of seven individuals with Marbach–Schaaf neurodevelopmental syndrome. Am. J. Med. Genet. Part A 2022, 188, 2627–2636.
5. Hirota, T.; King, B.H. Autism spectrum disorder: A review. JAMA 2023, 329, 157–168.
6. Okoye, C.; Obialo-Ibeawuchi, C.M.; Obajeun, O.A.; Sarwar, S.; Tawfik, C.; Waleed, M.S.; Wasim, A.U.; Mohamoud, I.; Afolayan, A.Y.; Mbaezue, R.N. Early diagnosis of autism spectrum disorder: A review and analysis of the risks and benefits. Cureus 2023, 15, e43226.
7. Heinsfeld, A.S.; Franco, A.R.; Craddock, R.C.; Buchweitz, C.; Meneguzzi, F. Identification of autism spectrum disorder using deep learning and the ABIDE dataset. NeuroImage Clin. 2018, 17, 16–23.
8. Khosla, M.; Jamison, K.; Ngo, G.H.; Kuceyeski, A.; Sabuncu, M.R. Machine learning in resting-state fMRI analysis. NeuroImage 2019, 202, 116–127.
9. Eslami, T.; Saeed, F.; Zhou, Y. ASD-DiagNet: A hybrid learning approach for detection of autism spectrum disorder using fMRI data. Front. Neuroinform. 2019, 13, 70.
10. Cheng, W.; Ji, X.; Zhang, J.; Feng, J. Multiview learning for autism diagnosis using structural and functional MRI. IEEE Trans. Cybern. 2021, 51, 3098–3110.
11. Li, H.; Fan, Y.; Cui, Y.; Zhu, H. Dynamic connectivity modeling using deep hybrid networks for ASD. Hum. Brain Mapp. 2022, 43, 872–885.
12. Zhang, H.; Song, R.; Wang, L.; Zhang, L.; Wang, D.; Wang, C.; Zhang, W. Classification of brain disorders in rs-fMRI via local-to-global graph neural networks. IEEE Trans. Med. Imaging 2022, 42, 444–455.
13. Chen, W.; Yang, J.; Sun, Z.; Zhang, X.; Tao, G.; Ding, Y.; Gu, J.; Bu, J.; Wang, H. DeepASD: A deep adversarial-regularized graph learning method for ASD diagnosis with multimodal data. Transl. Psychiatry 2024, 14, 375.
14. Ktena, S.I.; Parisot, S.; Ferrante, E.; Rajchl, M.; Lee, M.; Glocker, B.; Rueckert, D. Metric learning with spectral graph convolutions on brain connectivity networks. NeuroImage 2018, 169, 431–442.
15. Parisot, S.; Ktena, S.I.; Ferrante, E.; Lee, M.; Guerrero, R.; Glocker, B.; Rueckert, D. Disease prediction using graph convolutional networks: Application to autism spectrum disorder and Alzheimer's disease. Med. Image Anal. 2018, 48, 117–130.
16. Li, X.; Zhou, Y.; Dvornek, N.; Zhang, M.; Gao, S.; Zhuang, J.; Scheinost, D.; Staib, L.H.; Ventola, P.; Duncan, J.S. BrainGNN: Interpretable brain graph neural network for fMRI analysis. Med. Image Anal. 2021, 74, 102233.
17. Huang, Y.; Chung, A.C. Edge-variational graph convolutional networks for uncertainty-aware disease prediction. In Proceedings of the International Conference on Medical Image Computing and Computer-Assisted Intervention, Lima, Peru, 4–8 October 2020; Springer: Berlin/Heidelberg, Germany, 2020; pp. 562–572.
18. Gao, J.; Song, S. A hierarchical feature extraction and multimodal deep feature integration-based model for autism spectrum disorder identification. IEEE J. Biomed. Health Inform. 2025, 29, 4920–4931.
19. Huang, Z.A.; Zhu, Z.; Yau, C.H.; Tan, K.C. Identifying autism spectrum disorder from resting-state fMRI using deep belief network. IEEE Trans. Neural Netw. Learn. Syst. 2020, 32, 2847–2861.
20. Craddock, C.; Benhajali, Y.; Chu, C.; Chouinard, F.; Evans, A.; Jakab, A.; Khundrakpam, B.S.; Lewis, J.D.; Li, Q.; Milham, M.; et al. The Neuro Bureau Preprocessing Initiative: Open sharing of preprocessed neuroimaging data and derivatives. Front. Neuroinform. 2013, 7, 5.
21. Mahler, L.; Wang, Q.; Steiglechner, J.; Birk, F.; Heczko, S.; Scheffler, K.; Lohmann, G. Pretraining is all you need: A multi-atlas enhanced transformer framework for autism spectrum disorder classification. In Proceedings of the International Workshop on Machine Learning in Clinical Neuroimaging, Vancouver, BC, Canada, 8 October 2023; Springer: Berlin/Heidelberg, Germany, 2023; pp. 123–132.
22. Manikantan, K.; Jaganathan, S. A model for diagnosing autism patients using spatial and statistical measures using rs-fMRI and sMRI by adopting graphical neural networks. Diagnostics 2023, 13, 1143.
23. Zheng, S.; Zhu, Z.; Liu, Z.; Guo, Z.; Liu, Y.; Yang, Y.; Zhao, Y. Multi-modal graph learning for disease prediction. IEEE Trans. Med. Imaging 2022, 41, 2207–2216.
24. Wang, M.; Guo, J.; Wang, Y.; Yu, M.; Guo, J. Multimodal autism spectrum disorder diagnosis method based on DeepGCN. IEEE Trans. Neural Syst. Rehabil. Eng. 2023, 31, 3664–3674.
25. Li, S.; Li, D.; Zhang, R.; Cao, F. A novel autism spectrum disorder identification method: Spectral graph network with brain-population graph structure joint learning. Int. J. Mach. Learn. Cybern. 2024, 15, 1517–1532.
26. Liu, J.; Mao, J.; Lin, H.; Kuang, H.; Pan, S.; Wu, X.; Xie, S.; Liu, F.; Pan, Y. Multi-modal multi-kernel graph learning for autism prediction and biomarker discovery. IEEE/ACM Trans. Comput. Biol. Bioinform. 2025, 22, 842–854.
27. Shan, S.; Ren, Y.; Jiao, Z.; Li, X. MTGWNN: A multi-template graph wavelet neural network identification model for autism spectrum disorder. Int. J. Imaging Syst. Technol. 2025, 35, e70010.
28. Liu, S.; Sun, C.; Li, J.; Wang, S.; Zhao, L. DML-GNN: ASD diagnosis based on dual-atlas multi-feature learning graph neural network. Int. J. Imaging Syst. Technol. 2025, 35, e70038.
29. Xia, M.; Wang, J.; He, Y. BrainNet Viewer: A network visualization tool for human brain connectomics. PLoS ONE 2013, 8, e68910.
30. Li, X.; Zhai, J.; Hao, H.; Xu, Z.; Cao, X.; Xia, W.; Wang, J. Functional connectivity among insula, sensory and social brain regions in boys with autism spectrum disorder. Chin. J. Sch. Health 2023, 44, 335–338, 343.
31. Del Casale, A.; Ferracuti, S.; Alcibiade, A.; Simone, S.; Modesti, M.N.; Pompili, M. Neuroanatomical correlates of autism spectrum disorders: A meta-analysis of structural magnetic resonance imaging (MRI) studies. Psychiatry Res. Neuroimaging 2022, 325, 111516.
Method | sMRI | fMRI | Phenotype | ACC (%) | SEN (%) | SPE (%) | AUC (%) |
---|---|---|---|---|---|---|---|
DeepASD [13] | × | ✔ | ✔ | 87.38 ± 2.87 | 88.35 ± 6.83 | 86.51 ± 8.41 | 92.76 ± 4.00 |
LG-GNN [12] | × | ✔ | ✔ | 81.75 ± 1.10 | 83.22 ± 1.84 | 80.99 ± 1.17 | 85.22 ± 1.01 |
METAformer [21] | × | ✔ | ✔ | 83.70 ± 0.53 | 90.10 ± 0.27 | 81.90 ± 0.12 | 83.20 ± 0.03 |
EV-GCN [17] | × | ✔ | ✔ | 85.90 ± 4.47 | 88.23 ± 7.18 | 79.90 ± 7.37 | 84.72 ± 4.27 |
MMGL [23] | × | ✔ | ✔ | 89.77 ± 2.72 | 90.32 ± 4.21 | 89.81 ± 2.56 | 89.30 ± 6.04 |
WL-DeepGCN [24] | × | ✔ | ✔ | 77.27 ± 1.59 | 80.96 ± 2.57 | 77.70 ± 4.08 | 82.59 ± 2.80 |
BPGLNet [25] | × | ✔ | ✔ | 86.45 ± 0.62 | 86.71 ± 0.22 | 87.14 ± 1.08 | 90.75 ± 0.17 |
GCN [22] | ✔ | ✔ | ✔ | 81.23 ± 0.53 | 81.36 ± 0.30 | 81.02 ± 0.21 | 81.29 ± 0.13 |
MMKGL [26] | ✔ | ✔ | ✔ | 91.08 ± 0.59 | 91.97 ± 0.64 | 90.05 ± 1.37 | 91.01 ± 0.63 |
MTGWNN [27] | × | ✔ | ✔ | 87.25 ± 1.45 | 87.36 ± 2.49 | 87.75 ± 2.47 | 92.49 ± 1.55 |
DML-GNN [28] | × | ✔ | ✔ | 90.93 ± 0.42 | 90.75 ± 0.24 | 92.74 ± 0.13 | 93.69 ± 0.12 |
DAGMNet (Ours) | ✔ | ✔ | ✔ | 91.59 ± 0.41 | 90.33 ± 0.12 | 93.23 ± 0.03 | 96.80 ± 0.02 |
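The tables report accuracy (ACC), sensitivity (SEN), specificity (SPE), and AUC; the ± values are standard deviations over repeated runs. For reference, the point metrics can be computed as below. The 0.5 decision threshold, the toy labels/scores, and the rank-based AUC formulation are illustrative choices, not taken from the paper.

```python
import numpy as np

def metrics(y_true, y_score, thr=0.5):
    """ACC, SEN (true-positive rate), SPE (true-negative rate)."""
    y_true = np.asarray(y_true)
    y_pred = (np.asarray(y_score) >= thr).astype(int)
    tp = np.sum((y_pred == 1) & (y_true == 1))
    tn = np.sum((y_pred == 0) & (y_true == 0))
    fp = np.sum((y_pred == 1) & (y_true == 0))
    fn = np.sum((y_pred == 0) & (y_true == 1))
    acc = (tp + tn) / len(y_true)
    sen = tp / (tp + fn)
    spe = tn / (tn + fp)
    return acc, sen, spe

def auc(y_true, y_score):
    """AUC via the rank (Mann-Whitney) formulation; ties count half."""
    y_true = np.asarray(y_true)
    pos = np.asarray(y_score)[y_true == 1]
    neg = np.asarray(y_score)[y_true == 0]
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (len(pos) * len(neg))

# Toy example: 3 ASD (1) and 3 control (0) subjects with predicted scores.
y_true = [1, 1, 1, 0, 0, 0]
y_score = [0.9, 0.8, 0.3, 0.4, 0.2, 0.1]
acc, sen, spe = metrics(y_true, y_score)
print(acc, sen, spe, auc(y_true, y_score))
```

Note that AUC is threshold-free, which is why a model can trade SEN against SPE (as several rows above do) while its AUC stays fixed.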
Method | ACC (%) | SEN (%) | SPE (%) | AUC (%) |
---|---|---|---|---|
w/o ESPM | 87.63 ± 0.23 | 93.20 ± 0.83 | 90.11 ± 0.12 | 86.70 ± 0.06 |
w/o MARL | 86.25 ± 0.11 | 90.85 ± 0.62 | 89.76 ± 0.21 | 85.93 ± 0.51 |
w/o PAPG | 85.77 ± 0.21 | 89.12 ± 0.14 | 88.42 ± 0.34 | 84.65 ± 0.08 |
w/o ACFE | 82.38 ± 0.32 | 86.07 ± 0.13 | 87.93 ± 0.04 | 82.21 ± 0.02 |
DAGMNet (Ours) | 91.59 ± 0.41 | 90.33 ± 0.12 | 93.23 ± 0.03 | 96.80 ± 0.02 |
Method | ACC (%) | SEN (%) | SPE (%) | AUC (%) |
---|---|---|---|---|
w/o 3DConv | 83.20 ± 0.35 | 85.10 ± 0.40 | 82.50 ± 0.38 | 84.00 ± 0.30 |
w/o Channel Attention | 87.10 ± 0.28 | 88.50 ± 0.25 | 86.30 ± 0.30 | 89.20 ± 0.25 |
w/o Spatial Attention | 87.50 ± 0.30 | 88.00 ± 0.28 | 87.10 ± 0.33 | 89.80 ± 0.20 |
w/o Residual | 89.00 ± 0.26 | 89.50 ± 0.22 | 88.80 ± 0.27 | 91.00 ± 0.18 |
DAGMNet (ACFE) | 91.59 ± 0.41 | 90.33 ± 0.12 | 93.23 ± 0.03 | 96.80 ± 0.02 |
Method | ACC (%) | SEN (%) | SPE (%) | AUC (%) |
---|---|---|---|---|
DAGMNet (sMRI_img only) | 86.79 ± 0.13 | 79.65 ± 0.19 | 92.86 ± 0.29 | 94.71 ± 0.31 |
DAGMNet (fMRI_img only) | 85.77 ± 0.22 | 76.05 ± 0.43 | 94.49 ± 0.02 | 92.25 ± 0.21 |
HO atlas | 89.15 ± 0.62 | 78.85 ± 0.75 | 88.68 ± 0.64 | 89.36 ± 0.47 |
EZ atlas | 88.01 ± 0.45 | 77.35 ± 0.26 | 85.73 ± 0.73 | 86.93 ± 0.28 |
AAL atlas | 87.03 ± 0.83 | 83.74 ± 0.36 | 86.82 ± 0.57 | 88.27 ± 0.37 |
multi-atlas only (HO, EZ, AAL) | 91.15 ± 0.34 | 90.29 ± 0.67 | 91.93 ± 0.63 | 96.65 ± 0.33 |
DAGMNet (Ours) | 91.59 ± 0.41 | 90.33 ± 0.12 | 93.23 ± 0.03 | 96.80 ± 0.02 |
Multimodal Method | ACC (%) | SEN (%) | SPE (%) | AUC (%) |
---|---|---|---|---|
Concatenation | 80.04 ± 0.12 | 59.63 ± 0.28 | 97.63 ± 0.22 | 94.73 ± 0.24 |
ECMA | 83.23 ± 0.53 | 80.77 ± 0.65 | 85.23 ± 0.89 | 92.28 ± 0.05 |
Self-Attention | 86.44 ± 0.76 | 81.28 ± 0.25 | 90.48 ± 0.76 | 95.58 ± 0.41 |
Cross-Attention | 86.80 ± 0.11 | 86.55 ± 0.33 | 89.31 ± 0.34 | 95.33 ± 0.81 |
DAGMNet (Ours) | 91.59 ± 0.41 | 90.33 ± 0.12 | 93.23 ± 0.03 | 96.80 ± 0.02 |
Method | sMRI | fMRI | Phenotype | ACC (%) | SEN (%) | SPE (%) | AUC (%) |
---|---|---|---|---|---|---|---|
DeepASD [13] | × | ✔ | ✔ | 80.73 ± 2.47 | 77.94 ± 3.56 | 58.38 ± 5.72 | 67.46 ± 2.55 |
LG-GNN [12] | × | ✔ | ✔ | 82.09 ± 1.40 | 82.96 ± 0.91 | 51.66 ± 4.84 | 68.33 ± 3.47 |
METAformer [21] | × | ✔ | ✔ | 79.45 ± 1.07 | 81.19 ± 0.45 | 60.42 ± 1.37 | 68.81 ± 0.95 |
EV-GCN [17] | × | ✔ | ✔ | 80.42 ± 3.36 | 80.86 ± 4.70 | 62.56 ± 5.26 | 67.82 ± 3.86 |
MMGL [23] | × | ✔ | ✔ | 84.65 ± 1.89 | 81.35 ± 3.47 | 63.54 ± 2.36 | 67.74 ± 5.28 |
WL-DeepGCN [24] | × | ✔ | ✔ | 75.36 ± 1.45 | 74.53 ± 2.73 | 57.27 ± 3.15 | 63.20 ± 2.25 |
BPGLNet [25] | × | ✔ | ✔ | 80.65 ± 0.82 | 77.28 ± 0.39 | 62.45 ± 1.45 | 66.23 ± 1.71 |
GCN [22] | ✔ | ✔ | ✔ | 77.18 ± 0.68 | 72.46 ± 1.38 | 60.78 ± 0.57 | 63.53 ± 1.52 |
MMKGL [26] | ✔ | ✔ | ✔ | 85.75 ± 0.63 | 82.55 ± 1.59 | 63.29 ± 1.39 | 69.16 ± 1.46 |
MTGWNN [27] | × | ✔ | ✔ | 81.47 ± 1.56 | 78.28 ± 2.43 | 60.75 ± 2.64 | 65.77 ± 1.77 |
DML-GNN [28] | × | ✔ | ✔ | 83.88 ± 0.58 | 80.23 ± 0.88 | 62.35 ± 1.25 | 69.20 ± 0.94 |
DAGMNet (Ours) | ✔ | ✔ | ✔ | 87.09 ± 1.73 | 85.83 ± 0.76 | 65.92 ± 2.20 | 72.66 ± 2.65 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, L.; Li, X.; Yuan, J.; Chen, Y. DAGMNet: Dual-Branch Attention-Pruned Graph Neural Network for Multimodal sMRI and fMRI Fusion in Autism Prediction. Biomedicines 2025, 13, 2168. https://doi.org/10.3390/biomedicines13092168