A Symmetric Multiscale Feature Fusion Architecture Based on CNN and GNN for Hyperspectral Image Classification
Abstract
1. Introduction
- We propose MCGNet, a novel architecture that achieves multiscale feature modeling, from local details to global relationships and from regular Euclidean grids to non-Euclidean spaces, through the coordinated action of the SNS module and the LSE, SGC, and PGC branches. Experimental results show that, on multiple public datasets, MCGNet improves overall classification accuracy by 0.59% and reduces running time by 18.01 s relative to the best baseline method.
- We introduce the SNS and LSE modules. The SNS module improves the signal-to-noise ratio of the input data through a lightweight noise-suppression strategy, laying a solid foundation for subsequent feature extraction. The LSE module adopts depthwise separable convolution, which reduces computational complexity while enhancing the quality of the feature representations.
- We propose the SGC module, which captures dependencies among object regions through graph convolution on superpixel graphs. In addition, a self-attention mechanism decouples superpixel features back to the pixel level, further refining pixel-wise feature representations.
- We propose the PGC module, which combines spectral and spatial similarity to construct a sparse graph structure and captures complex inter-pixel dependencies through graph convolution, improving the model's ability to identify complex object boundaries and subtle spectral differences.
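The LSE bullet above mentions depthwise separable convolution. A minimal NumPy sketch (hypothetical shapes and function names, not the paper's actual layer configuration) shows why it is cheaper than a standard convolution: spatial filtering and channel mixing are factorized, so the parameter count drops from C_out·C·k² to C·k² + C_out·C.

```python
import numpy as np

def depthwise_separable_conv(x, dw_kernels, pw_weights):
    """Depthwise separable convolution on a (C, H, W) feature map.

    x          : (C, H, W) input cube (C spectral channels)
    dw_kernels : (C, k, k) one spatial kernel per input channel (depthwise step)
    pw_weights : (C_out, C) 1x1 pointwise channel-mixing weights
    Returns a (C_out, H-k+1, W-k+1) output ('valid' padding).
    """
    C, H, W = x.shape
    _, k, _ = dw_kernels.shape
    Ho, Wo = H - k + 1, W - k + 1
    # Depthwise step: each channel is filtered independently.
    dw = np.empty((C, Ho, Wo))
    for c in range(C):
        for i in range(Ho):
            for j in range(Wo):
                dw[c, i, j] = np.sum(x[c, i:i + k, j:j + k] * dw_kernels[c])
    # Pointwise step: a 1x1 convolution mixes the channels.
    return np.einsum('oc,chw->ohw', pw_weights, dw)
```

For example, with C = 8 input channels, C_out = 16, and k = 3, the factorized layer needs 8·9 + 16·8 = 200 weights instead of the 16·8·9 = 1152 of a standard convolution.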
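The two SGC ingredients described above can be sketched compactly (hypothetical helper names; in the paper the pixel-to-superpixel assignment would come from SLIC): pixel features are averaged into superpixel nodes, and one symmetrically normalized graph-convolution layer propagates information between adjacent regions.

```python
import numpy as np

def pool_pixels_to_superpixels(pixel_feats, labels, n_sp):
    """Average pixel features within each superpixel.

    pixel_feats : (N, D) per-pixel features
    labels      : (N,) superpixel index of each pixel (e.g., from SLIC)
    """
    feats = np.zeros((n_sp, pixel_feats.shape[1]))
    counts = np.bincount(labels, minlength=n_sp)
    np.add.at(feats, labels, pixel_feats)          # sum per superpixel
    return feats / np.maximum(counts, 1)[:, None]  # then average

def superpixel_graph_conv(node_feats, adj, weight):
    """One GCN layer with symmetric normalization: ReLU(D^-1/2 (A+I) D^-1/2 X W)."""
    a_hat = adj + np.eye(adj.shape[0])             # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    a_norm = a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(a_norm @ node_feats @ weight, 0.0)
```

A final step (self-attention in the paper) would scatter the refined node features back to pixels; here the sketch stops at the region level.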
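The PGC bullet describes a sparse graph built from combined spectral and spatial similarity. A hedged NumPy sketch of such a construction (the Gaussian weighting with `gamma` and `beta` and the top-k rule are illustrative assumptions, not the paper's exact formulation) keeps only the k strongest neighbours per pixel:

```python
import numpy as np

def build_sparse_pixel_graph(spectra, coords, k=4, gamma=1.0, beta=1.0):
    """Sparse kNN adjacency from joint spectral-spatial affinity.

    spectra : (N, B) per-pixel spectral vectors
    coords  : (N, 2) pixel positions
    Edge weight: exp(-gamma*||spectral diff||^2 - beta*||spatial diff||^2);
    each pixel keeps its k strongest neighbours, then the graph is symmetrized.
    """
    spec_d2 = ((spectra[:, None, :] - spectra[None, :, :]) ** 2).sum(-1)
    spat_d2 = ((coords[:, None, :] - coords[None, :, :]) ** 2).sum(-1)
    affinity = np.exp(-gamma * spec_d2 - beta * spat_d2)
    np.fill_diagonal(affinity, 0.0)                # no self-edges
    adj = np.zeros_like(affinity)
    idx = np.argsort(-affinity, axis=1)[:, :k]     # top-k neighbours per node
    rows = np.repeat(np.arange(len(spectra)), k)
    adj[rows, idx.ravel()] = affinity[rows, idx.ravel()]
    return np.maximum(adj, adj.T)                  # symmetrize
```

The resulting sparse adjacency can then feed a pixel-level graph-convolution layer; sparsity keeps the cost tractable compared with a dense N×N graph.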
2. Proposed Method
2.1. Spectral Noise Suppression and Local Spectral-Spatial Encoding
2.1.1. SNS
2.1.2. LSE
2.2. Regional and Pixel-Level Relationship Inference
2.2.1. SGC
2.2.2. PGC
2.3. Fusion and Final Classification
2.4. Theoretical Explanation of Symmetric Euclidean–Non-Euclidean Feature Fusion
3. Experiments
3.1. Hyperspectral Data Sets
3.2. Experiment Settings
3.2.1. Baseline
3.2.2. Model Setup
3.2.3. Evaluation Metrics
3.3. Comparison of Classification Performance
3.3.1. Experimental Results
3.3.2. Visualization of Dataset Classification
3.3.3. Comparison of Training and Testing Time
4. Ablation Studies
4.1. Comparison and Analysis of Results
4.2. Effectiveness of Different Modules in MCGNet
4.3. Parameter Sensitivity Analysis
4.3.1. Hyperparameter Sensitivity Analysis
4.3.2. Computational Complexity Analysis
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| CNN | Convolutional Neural Network |
| GCN | Graph Convolutional Network |
| GNN | Graph Neural Network |
| HSI | Hyperspectral Image |
| KNN | k-Nearest Neighbor |
| LSE | Local Spectral-Spatial Encoding |
| PGC | Pixel-level Graph Convolution |
| SLIC | Simple Linear Iterative Clustering |
| SGC | Superpixel Graph Convolution |
| SNR | Signal-to-Noise Ratio |
| SNS | Spectral Noise Suppression |
References
1. Zhou, B.; Deng, L.; Ying, J.; Wang, Q.; Cheng, Y. Dimensionality reduction method based on spatial-spectral preservation and minimum noise fraction for hyperspectral images. J. Eur. Opt. Soc.-Rapid Publ. 2025, 21, 31.
2. Mehmood, M.; Shahzad, A.; Zafar, B.; Shabbir, A.; Ali, N. Remote sensing image classification: A comprehensive review and applications. Math. Probl. Eng. 2022, 2022, 5880959.
3. Pande, C.B.; Moharir, K.N. Application of hyperspectral remote sensing role in precision farming and sustainable agriculture under climate change: A review. In Climate Change Impacts on Natural Resources, Ecosystems and Agricultural Systems; Springer: Cham, Switzerland, 2023; pp. 503–520.
4. Lv, W.; Wang, X. Overview of hyperspectral image classification. J. Sensors 2020, 2020, 4817234.
5. Wang, Y.; Xue, Z.; Jia, M.; Liu, Z.; Su, H. Hypergraph convolutional network with multiple hyperedges fusion for hyperspectral image classification under limited samples. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5526318.
6. Datta, D.; Mallick, P.K.; Bhoi, A.K.; Ijaz, M.F.; Shafi, J.; Choi, J. Hyperspectral image classification: Potentials, challenges, and future directions. Comput. Intell. Neurosci. 2022, 2022, 3854635.
7. Tejasree, G.; Agilandeeswari, L. An extensive review of hyperspectral image classification and prediction: Techniques and challenges. Multimed. Tools Appl. 2024, 83, 80941–81038.
8. Ullah, F.; Ullah, I.; Khan, R.U.; Khan, S.; Khan, K.; Pau, G. Conventional to deep ensemble methods for hyperspectral image classification: A comprehensive survey. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 3878–3916.
9. Ge, H.; Pan, H.; Wang, L.; Liu, M.; Li, C. Self-training algorithm for hyperspectral imagery classification based on mixed measurement k-nearest neighbor and support vector machine. J. Appl. Remote Sens. 2021, 15, 042604.
10. Liu, Z.; Zhang, Z.; Cai, Y.; Miao, Y.; Chen, Z. Semi-supervised classification via hypergraph convolutional extreme learning machine. Appl. Sci. 2021, 11, 3867.
11. Qin, Y.; Ye, Y.; Zhao, Y.; Wu, J.; Zhang, H.; Cheng, K.; Li, K. Nearest neighboring self-supervised learning for hyperspectral image classification. Remote Sens. 2023, 15, 1713.
12. Zhang, W.; Kasun, L.C.; Wang, Q.J.; Zheng, Y.; Lin, Z. A review of machine learning for near-infrared spectroscopy. Sensors 2022, 22, 9764.
13. Boateng, D. Advances in deep learning-based applications for Raman spectroscopy analysis: A mini-review of the progress and challenges. Microchem. J. 2025, 209, 112692.
14. Imani, M.; Ghassemian, H. An overview on spectral and spatial information fusion for hyperspectral image classification: Current trends and challenges. Inf. Fusion 2020, 59, 59–83.
15. Peng, J.; Sun, W.; Li, H.C.; Li, W.; Meng, X.; Ge, C.; Du, Q. Low-rank and sparse representation for hyperspectral image processing: A review. IEEE Geosci. Remote Sens. Mag. 2021, 10, 10–43.
16. Zhao, Y.; Yan, F. Hyperspectral image classification based on sparse superpixel graph. Remote Sens. 2021, 13, 3592.
17. Wang, N.; Zeng, X.; Duan, Y.; Deng, B.; Mo, Y.; Xie, Z.; Duan, P. Multi-scale superpixel-guided structural profiles for hyperspectral image classification. Sensors 2022, 22, 8502.
18. Jia, S.; Jiang, S.; Zhang, S.; Xu, M.; Jia, X. Graph-in-graph convolutional network for hyperspectral image classification. IEEE Trans. Neural Netw. Learn. Syst. 2022, 35, 1157–1171.
19. Bai, J.; Ding, B.; Xiao, Z.; Jiao, L.; Chen, H.; Regan, A.C. Hyperspectral image classification based on deep attention graph convolutional network. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5504316.
20. Li, L.; Chen, X.; Song, C. A robust clustering method with noise identification based on directed K-nearest neighbor graph. Neurocomputing 2022, 508, 19–35.
21. Subudhi, S.; Patro, R.N.; Biswal, P.K.; Dell'Acqua, F. A survey on superpixel segmentation as a preprocessing step in hyperspectral image analysis. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2021, 14, 5015–5035.
22. Subudhi, S.; Patro, R.; Biswal, P.K. Texture based superpixel segmentation algorithm for hyperspectral image classification. Res. Sq. 2022.
23. Yang, C.; Kong, Y.; Wang, X.; Cheng, Y. Hyperspectral image classification based on adaptive global–local feature fusion. Remote Sens. 2024, 16, 1918.
24. Liu, Q.; Xiao, L.; Yang, J.; Wei, Z. CNN-enhanced graph convolutional network with pixel- and superpixel-level feature fusion for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2020, 59, 8657–8671.
25. Bera, S.; Shrivastava, V.K.; Satapathy, S.C. Advances in hyperspectral image classification based on convolutional neural networks: A review. CMES-Comput. Model. Eng. Sci. 2022, 133, 219–250.
26. Ge, Z.; Cao, G.; Li, X.; Fu, P. Hyperspectral image classification method based on 2D–3D CNN and multibranch feature fusion. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 5776–5788.
27. Zhong, Z.; Li, J.; Luo, Z.; Chapman, M. Spectral–spatial residual network for hyperspectral image classification: A 3-D deep learning framework. IEEE Trans. Geosci. Remote Sens. 2017, 56, 847–858.
28. Chen, S.Y.; Chu, P.Y.; Liu, K.L.; Wu, Y.C. A multichannel hybrid 2D-3D-CNN for hyperspectral image classification with small training sample sizes. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5540915.
29. Chiney, A.; Paduri, A.R.; Darapaneni, N.; Kulkarni, S.; Kadam, M.; Kohli, I.; Subramaniyan, M. Handwritten data digitization using an anchor based multi-channel CNN (MCCNN) trained on a hybrid dataset (h-EH). Procedia Comput. Sci. 2021, 189, 175–182.
30. Liao, T.; Li, L.; Ouyang, R.; Lin, X.; Lai, X.; Cheng, G.; Ma, J. Classification of asymmetry in mammography via the DenseNet convolutional neural network. Eur. J. Radiol. Open 2023, 11, 100502.
31. Roy, S.K.; Manna, S.; Song, T.; Bruzzone, L. Attention-based adaptive spectral–spatial kernel ResNet for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2020, 59, 7831–7843.
32. Li, S.; Zhu, X.; Liu, Y.; Bao, J. Adaptive spatial-spectral feature learning for hyperspectral image classification. IEEE Access 2019, 7, 61534–61547.
33. Zhao, X.; Ma, J.; Wang, L.; Zhang, Z.; Ding, Y.; Xiao, X. A review of hyperspectral image classification based on graph neural networks. Artif. Intell. Rev. 2025, 58, 172.
34. Ding, Y.; Chong, Y.; Pan, S.; Zheng, C. Diversity-connected graph convolutional network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5518118.
35. Yang, A.; Li, M.; Ding, Y.; Hong, D.; Lv, Y.; He, Y. GTFN: GCN and transformer fusion network with spatial-spectral features for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 6600115.
36. Khatun, Z.; Jónsson, H., Jr.; Tsirilaki, M.; Maffulli, N.; Oliva, F.; Daval, P.; Tortorella, F.; Gargiulo, P. Beyond pixel: Superpixel-based MRI segmentation through traditional machine learning and graph convolutional network. Comput. Methods Programs Biomed. 2024, 256, 108398.
37. Zhao, H.; Zhou, F.; Bruzzone, L.; Guan, R.; Yang, C. Superpixel-level global and local similarity graph-based clustering for large hyperspectral images. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5519316.
38. Liu, Y.; Wang, X.; Jiang, B.; Chen, L.; Luo, B. SemanticFormer: Hyperspectral image classification via semantic transformer. Pattern Recognit. Lett. 2024, 179, 1–8.
39. Zhang, H.; Zou, J.; Zhang, L. EMS-GCN: An end-to-end mixhop superpixel-based graph convolutional network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5526116.
40. Yang, P.; Zhang, X. A dual-branch fusion of a graph convolutional network and a convolutional neural network for hyperspectral image classification. Sensors 2024, 24, 4760.
41. Zhu, W.; Sun, X.; Zhang, Q. DCG-Net: Enhanced hyperspectral image classification with dual-branch convolutional neural network and graph convolutional neural network integration. Electronics 2024, 13, 3271.
42. Gao, L.; Xiao, S.; Hu, C.; Yan, Y. Hyperspectral image classification based on fusion of convolutional neural network and graph network. Appl. Sci. 2023, 13, 7143.
43. Chen, H.; Long, H.; Chen, T.; Song, Y.; Chen, H.; Zhou, X.; Deng, W. M3FuNet: An unsupervised multivariate feature fusion network for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5513015.
44. Dong, Y.; Liu, Q.; Du, B.; Zhang, L. Weighted feature fusion of convolutional neural network and graph attention network for hyperspectral image classification. IEEE Trans. Image Process. 2022, 31, 1559–1572.
45. Tu, B.; Ren, Q.; Li, Q.; He, W.; He, W. Hyperspectral image classification using a superpixel–pixel–subpixel multilevel network. IEEE Trans. Instrum. Meas. 2023, 72, 5013616.
46. Wang, B.; Cao, C.; Kong, D. SGFNet: Redundancy-reduced spectral–spatial fusion network for hyperspectral image classification. Entropy 2025, 27, 995.
47. Zang, C.; Song, G.; Li, L.; Zhao, G.; Lu, W.; Jiang, G.; Sun, Q. DB-MFENet: A dual-branch multi-frequency feature enhancement network for hyperspectral image classification. Remote Sens. 2025, 17, 1458.
48. Vaswani, A.; Shazeer, N.; Parmar, N.; Uszkoreit, J.; Jones, L.; Gomez, A.N.; Kaiser, Ł.; Polosukhin, I. Attention is all you need. Adv. Neural Inf. Process. Syst. 2017, 30, 1–11.
49. Liu, Q.; Xiao, L.; Yang, J.; Wei, Z. Multilevel superpixel structured graph U-Nets for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5516115.







| No. | Class | TRAIN. | VAL. | TEST |
|---|---|---|---|---|
| 1 | Alfalfa | 1 | 1 | 45 |
| 2 | Corn-notill | 14 | 14 | 1401 |
| 3 | Corn-mintill | 8 | 8 | 815 |
| 4 | Corn | 2 | 2 | 233 |
| 5 | Grass-pasture | 5 | 5 | 473 |
| 6 | Grass-trees | 7 | 7 | 716 |
| 7 | Grass-pasture-mowed | 1 | 1 | 26 |
| 8 | Hay-windrowed | 5 | 5 | 468 |
| 9 | Oats | 1 | 1 | 18 |
| 10 | Soybean-notill | 10 | 10 | 952 |
| 11 | Soybean-mintill | 25 | 25 | 2405 |
| 12 | Soybean-clean | 6 | 6 | 581 |
| 13 | Wheat | 2 | 2 | 201 |
| 14 | Woods | 13 | 13 | 1239 |
| 15 | Buildings-Grass-Trees-Drives | 4 | 4 | 378 |
| 16 | Stone-Steel-Towers | 1 | 1 | 91 |
| | Total | 105 | 105 | 10,042 |
| No. | Class | TRAIN. | VAL. | TEST |
|---|---|---|---|---|
| 1 | Asphalt | 7 | 66 | 6565 |
| 2 | Meadows | 18 | 186 | 18,464 |
| 3 | Gravel | 2 | 21 | 2078 |
| 4 | Trees | 3 | 31 | 3033 |
| 5 | Painted metal sheets | 1 | 13 | 1332 |
| 6 | Bare Soil | 5 | 50 | 4979 |
| 7 | Bitumen | 1 | 13 | 1317 |
| 8 | Self-Blocking Bricks | 4 | 37 | 3645 |
| 9 | Shadows | 1 | 9 | 938 |
| | Total | 42 | 426 | 42,351 |
| No. | Class | TRAIN. | VAL. | TEST |
|---|---|---|---|---|
| 1 | Brocoli_green_weeds_1 | 2 | 20 | 1987 |
| 2 | Brocoli_green_weeds_2 | 4 | 37 | 3685 |
| 3 | Fallow | 2 | 20 | 1954 |
| 4 | Fallow_rough_plow | 1 | 14 | 1379 |
| 5 | Fallow_smooth | 3 | 27 | 2648 |
| 6 | Stubble | 4 | 40 | 3915 |
| 7 | Celery | 4 | 36 | 3539 |
| 8 | Grapes_untrained | 11 | 113 | 11,147 |
| 9 | Soil_vinyard_develop | 6 | 62 | 6135 |
| 10 | Corn_senesced_green_weeds | 3 | 33 | 3242 |
| 11 | Lettuce_romaine_4wk | 1 | 11 | 1056 |
| 12 | Lettuce_romaine_5wk | 2 | 19 | 1906 |
| 13 | Lettuce_romaine_6wk | 1 | 9 | 906 |
| 14 | Lettuce_romaine_7wk | 1 | 11 | 1058 |
| 15 | Vinyard_untrained | 1 | 7 | 718 |
| 16 | Vinyard_vertical_trellis | 2 | 18 | 1786 |
| | Total | 48 | 477 | 47,061 |
| Dataset | Largest Class (Samples) | Smallest Class (Samples) | Imbalance Ratio (IR) |
|---|---|---|---|
| Indian Pines | 2455 | 20 | 122.75 |
| Pavia University | 18,668 | 948 | 19.69 |
| Salinas | 11,271 | 726 | 15.53 |
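The imbalance ratio (IR) in the table is simply the size of the largest class divided by the size of the smallest, taken from the per-class tables above (e.g., Soybean-mintill with 2455 samples vs. Oats with 20 on Indian Pines):

```python
def imbalance_ratio(largest, smallest):
    # IR = samples in the largest class / samples in the smallest class
    return largest / smallest

print(imbalance_ratio(2455, 20))  # Indian Pines: 122.75
```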
| Class | CNN | GCN | CEGCN [24] | MSSGU [49] | SSSTNet [38] | EMS-GCN [39] | OURS |
|---|---|---|---|---|---|---|---|
| 1 | 15.27 ± 11.32 | 0 ± 0 | 11.22 ± 11.43 | 13.63 ± 8.25 | 4.93 ± 3.83 | 26.96 ± 21.77 | 18.33 ± 14.24 |
| 2 | 81.37 ± 6.07 | 72.57 ± 19.29 | 86.39 ± 4.24 | 63.96 ± 6.78 | 64.29 ± 18.32 | 87.22 ± 5.99 | 85.15 ± 5.06 |
| 3 | 39.64 ± 19.79 | 42.98 ± 16.67 | 48.27 ± 17.49 | 27.73 ± 3.80 | 24.83 ± 6.53 | 74.81 ± 4.64 | 72.85 ± 13.90 |
| 4 | 18.12 ± 11.66 | 47.76 ± 32.41 | 20.54 ± 15.53 | 15.23 ± 7.25 | 11.64 ± 9.82 | 53.51 ± 28.76 | 65.27 ± 11.91 |
| 5 | 70.85 ± 8.64 | 59.28 ± 30.59 | 66.49 ± 13.34 | 58.56 ± 7.31 | 40.80 ± 18.37 | 67.56 ± 19.04 | 78.75 ± 11.51 |
| 6 | 98.21 ± 1.25 | 64.12 ± 35.17 | 96.87 ± 2.39 | 83.75 ± 6.43 | 96.42 ± 1.94 | 96.56 ± 1.65 | 98.15 ± 0.72 |
| 7 | 35.64 ± 19.57 | 0 ± 0 | 50.48 ± 20.16 | 50.76 ± 26.35 | 12.70 ± 15.12 | 72.66 ± 27.47 | 85.07 ± 22.95 |
| 8 | 94.07 ± 6.06 | 97.55 ± 4.89 | 93.43 ± 7.09 | 94.91 ± 4.00 | 79.37 ± 15.45 | 99.54 ± 0.65 | 100.00 ± 0.00 |
| 9 | 5.26 ± 8.15 | 13.68 ± 27.36 | 7.42 ± 7.12 | 64.44 ± 15.94 | 7.42 ± 7.12 | 20.17 ± 19.48 | 21.11 ± 12.82 |
| 10 | 68.22 ± 7.33 | 75.34 ± 13.99 | 65.64 ± 11.51 | 62.33 ± 7.16 | 48.6 ± 16.49 | 79.17 ± 7.07 | 77.96 ± 9.93 |
| 11 | 88.04 ± 4.90 | 82.38 ± 11.81 | 87.07 ± 4.64 | 75.16 ± 9.44 | 94.38 ± 2.76 | 91.98 ± 8.75 | 91.85 ± 7.45 |
| 12 | 45.11 ± 18.54 | 92.61 ± 11.17 | 48.04 ± 17.47 | 21.06 ± 6.03 | 27.86 ± 11.49 | 70.89 ± 13.79 | 64.29 ± 17.33 |
| 13 | 95.51 ± 3.99 | 76.71 ± 38.72 | 94.32 ± 5.05 | 77.78 ± 18.06 | 90.61 ± 6.75 | 98.95 ± 1.46 | 99.70 ± 0.39 |
| 14 | 99.80 ± 0.13 | 92.06 ± 9.78 | 99.69 ± 0.32 | 94.85 ± 0.32 | 99.24 ± 0.89 | 99.38 ± 0.92 | 99.93 ± 0.07 |
| 15 | 44.83 ± 23.77 | 68.88 ± 26.63 | 47.68 ± 29.05 | 30.00 ± 12.73 | 28.44 ± 15.54 | 67.56 ± 12.76 | 76.10 ± 19.63 |
| 16 | 19.89 ± 7.18 | 0 ± 0 | 17.06 ± 10.62 | 16.26 ± 13.39 | 21.72 ± 31.32 | 34.33 ± 20.91 | 59.62 ± 19.37 |
| OA (%) | 76.16 ± 2.73 | 74.36 ± 4.39 | 77.04 ± 2.13 | 64.67 ± 3.05 | 67.99 ± 5.19 | 85.28 ± 1.52 | 85.87 ± 1.68 |
| AA (%) | 57.49 ± 3.92 | 55.37 ± 5.40 | 58.79 ± 4.19 | 53.15 ± 3.17 | 47.08 ± 6.08 | 71.33 ± 3.00 | 74.63 ± 2.37 |
| Kappa | 72.26 ± 3.43 | 70.90 ± 4.87 | 73.38 ± 2.67 | 59.24 ± 3.31 | 62.15 ± 6.38 | 83.14 ± 1.63 | 83.81 ± 1.87 |
| F1 (%) | 74.41 ± 3.02 | 72.38 ± 4.26 | 75.22 ± 2.64 | 63.55 ± 3.21 | 66.21 ± 4.95 | 84.73 ± 1.84 | 85.06 ± 2.77 |
| Class | CNN | GCN | CEGCN | MSSGU | SSSTNet | EMS-GCN | OURS |
|---|---|---|---|---|---|---|---|
| 1 | 97.02 ± 2.65 | 62.29 ± 7.60 | 95.58 ± 2.57 | 94.16 ± 4.92 | 58.77 ± 8.85 | 98.65 ± 2.57 | 93.41 ± 5.60 |
| 2 | 94.74 ± 2.97 | 93.48 ± 2.78 | 97.35 ± 1.10 | 95.14 ± 3.07 | 99.27 ± 7.91 | 98.21 ± 0.92 | 96.07 ± 2.32 |
| 3 | 53.44 ± 2.19 | 93.86 ± 5.07 | 88.46 ± 6.39 | 65.86 ± 14.44 | 20.00 ± 17.66 | 84.41 ± 1.10 | 80.05 ± 8.81 |
| 4 | 78.93 ± 4.84 | 38.15 ± 13.67 | 78.31 ± 10.93 | 88.61 ± 4.76 | 21.87 ± 7.82 | 93.45 ± 6.39 | 84.91 ± 9.25 |
| 5 | 98.96 ± 1.04 | 98.25 ± 2.04 | 99.95 ± 0.09 | 99.90 ± 0.19 | 66.17 ± 11.91 | 97.80 ± 10.93 | 99.92 ± 0.15 |
| 6 | 55.04 ± 11.59 | 90.05 ± 16.40 | 93.16 ± 10.63 | 92.96 ± 9.13 | 25.82 ± 10.67 | 96.75 ± 0.09 | 89.15 ± 11.78 |
| 7 | 54.72 ± 30.80 | 86.63 ± 6.30 | 93.42 ± 4.80 | 87.50 ± 14.23 | 16.08 ± 11.76 | 88.83 ± 9.63 | 95.46 ± 4.01 |
| 8 | 68.02 ± 30.52 | 82.34 ± 10.75 | 72.06 ± 17.32 | 81.83 ± 20.50 | 37.91 ± 23.80 | 75.06 ± 4.80 | 83.74 ± 7.58 |
| 9 | 96.29 ± 4.43 | 9.89 ± 6.70 | 53.95 ± 27.33 | 81.70 ± 9.52 | 10.35 ± 10.29 | 63.00 ± 27.33 | 83.89 ± 8.87 |
| OA (%) | 83.90 ± 3.66 | 81.42 ± 2.40 | 91.61 ± 2.24 | 91.30 ± 3.15 | 64.06 ± 2.91 | 91.19 ± 2.45 | 92.03 ± 1.95 |
| AA (%) | 77.46 ± 6.96 | 72.77 ± 2.03 | 85.80 ± 4.00 | 87.52 ± 4.48 | 39.58 ± 5.21 | 88.36 ± 2.25 | 89.62 ± 3.25 |
| Kappa | 78.16 ± 5.20 | 75.39 ± 3.22 | 88.81 ± 3.14 | 88.48 ± 4.20 | 44.81 ± 5.55 | 87.29 ± 2.74 | 89.41 ± 2.68 |
| F1 (%) | 82.47 ± 3.91 | 80.15 ± 3.18 | 90.26 ± 2.71 | 89.87 ± 3.58 | 62.14 ± 4.96 | 90.84 ± 2.53 | 91.95 ± 2.38 |
| Class | CNN | GCN | CEGCN | MSSGU | SSSTNet | EMS-GCN | OURS |
|---|---|---|---|---|---|---|---|
| 1 | 98.18 ± 1.48 | 98.09 ± 3.73 | 98.83 ± 1.73 | 83.87 ± 18.95 | 99.26 ± 0.50 | 99.65 ± 0.68 | 94.19 ± 10.75 |
| 2 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 97.04 ± 3.29 | 99.95 ± 0.09 | 100.00 ± 0.00 | 100.00 ± 0.00 |
| 3 | 82.20 ± 17.40 | 93.68 ± 12.50 | 90.75 ± 6.11 | 78.92 ± 16.93 | 69.89 ± 13.68 | 84.41 ± 17.37 | 87.88 ± 12.78 |
| 4 | 99.33 ± 0.58 | 75.80 ± 20.49 | 99.07 ± 1.15 | 98.55 ± 1.02 | 96.59 ± 1.94 | 93.45 ± 8.43 | 99.55 ± 0.30 |
| 5 | 95.92 ± 6.16 | 86.18 ± 8.34 | 96.86 ± 3.54 | 91.10 ± 2.56 | 99.13 ± 0.77 | 97.80 ± 2.60 | 96.30 ± 5.89 |
| 6 | 99.87 ± 0.24 | 96.40 ± 4.19 | 99.90 ± 0.07 | 97.13 ± 2.68 | 100.00 ± 0.00 | 99.75 ± 0.32 | 99.93 ± 0.05 |
| 7 | 98.52 ± 2.92 | 99.70 ± 0.44 | 99.99 ± 0.01 | 96.26 ± 4.71 | 99.91 ± 0.06 | 99.83 ± 0.23 | 99.94 ± 0.09 |
| 8 | 88.60 ± 5.26 | 98.02 ± 0.73 | 92.76 ± 0.01 | 70.79 ± 11.45 | 96.32 ± 1.74 | 89.78 ± 5.47 | 92.98 ± 2.07 |
| 9 | 99.98 ± 0.01 | 99.69 ± 0.60 | 100.00 ± 0.00 | 96.90 ± 2.43 | 99.79 ± 0.26 | 100.00 ± 0.00 | 100.00 ± 0.00 |
| 10 | 82.01 ± 10.68 | 87.32 ± 13.08 | 90.36 ± 12.15 | 67.19 ± 12.12 | 84.51 ± 12.34 | 82.42 ± 14.88 | 90.05 ± 9.43 |
| 11 | 93.27 ± 6.42 | 91.92 ± 9.88 | 99.10 ± 1.15 | 68.26 ± 24.87 | 85.17 ± 24.42 | 98.10 ± 2.46 | 99.41 ± 1.17 |
| 12 | 93.42 ± 11.85 | 86.10 ± 8.21 | 99.10 ± 1.11 | 87.41 ± 20.98 | 94.50 ± 7.85 | 98.45 ± 2.25 | 99.46 ± 3.06 |
| 13 | 84.08 ± 24.89 | 59.48 ± 19.07 | 80.02 ± 32.91 | 77.49 ± 19.90 | 79.05 ± 27.03 | 88.50 ± 8.16 | 94.35 ± 6.08 |
| 14 | 98.65 ± 0.85 | 91.97 ± 9.17 | 98.41 ± 0.73 | 97.94 ± 0.94 | 97.83 ± 1.14 | 98.81 ± 0.53 | 99.54 ± 0.55 |
| 15 | 66.64 ± 11.49 | 93.79 ± 2.67 | 82.95 ± 10.36 | 67.37 ± 14.96 | 27.43 ± 18.83 | 78.43 ± 13.80 | 87.52 ± 4.63 |
| 16 | 71.72 ± 13.74 | 99.46 ± 1.07 | 79.48 ± 27.36 | 57.09 ± 32.60 | 74.47 ± 26.02 | 84.45 ± 3.46 | 85.78 ± 5.16 |
| OA (%) | 89.40 ± 2.02 | 94.54 ± 0.79 | 93.94 ± 1.60 | 81.76 ± 3.68 | 85.50 ± 3.90 | 92.19 ± 2.45 | 94.75 ± 0.48 |
| AA (%) | 90.77 ± 3.99 | 91.10 ± 2.30 | 94.22 ± 3.66 | 83.33 ± 2.95 | 87.74 ± 5.47 | 93.36 ± 2.25 | 95.37 ± 0.99 |
| Kappa | 88.15 ± 2.30 | 93.92 ± 0.88 | 93.24 ± 1.79 | 79.66 ± 4.09 | 83.68 ± 4.46 | 91.29 ± 2.74 | 94.15 ± 0.54 |
| F1 (%) | 89.02 ± 2.13 | 93.80 ± 0.91 | 93.55 ± 1.64 | 80.93 ± 3.88 | 84.62 ± 4.12 | 91.07 ± 2.56 | 94.81 ± 0.50 |
| Model | Training Time (s) | Testing Time (s) | Parameter Quantity (M) | FLOPs (G) |
|---|---|---|---|---|
| CNN | 9.67 | 0.86 | 0.19 | 1.62 |
| GCN | 8.50 | 0.80 | 0.19 | 1.62 |
| CEGCN | 9.41 | 0.80 | 0.19 | 1.62 |
| MSSGU | 16.44 | 1.46 | 0.13 | 1.38 |
| SeFormer | 12.44 | 0.88 | 0.21 | 1.63 |
| EMS-GCN | 30.32 | 1.51 | 0.19 | 1.63 |
| MCGNet (Ours) | 12.31 | 0.62 | 0.13 | 1.63 |
| Symmetry | Self-Attn | Configuration | IP OA (%) | IP AA (%) | IP Kappa | PU OA (%) | PU AA (%) | PU Kappa | SA OA (%) | SA AA (%) | SA Kappa |
|---|---|---|---|---|---|---|---|---|---|---|---|
| ✓ | × | LSE-Only | 74.14 ± 2.01 | 54.07 ± 3.01 | 69.84 ± 2.48 | 84.27 ± 4.13 | 78.43 ± 8.65 | 78.82 ± 5.72 | 88.77 ± 2.09 | 89.67 ± 4.07 | 87.45 ± 2.37 |
| ✓ | × | PGC-Only | 83.36 ± 1.12 | 64.52 ± 2.60 | 80.21 ± 2.30 | 88.19 ± 4.06 | 86.42 ± 4.23 | 84.38 ± 5.30 | 91.74 ± 2.18 | 90.09 ± 3.31 | 90.81 ± 2.41 |
| ✓ | ✓ | SGC-Only | 82.21 ± 1.60 | 70.52 ± 4.02 | 83.53 ± 2.38 | 84.35 ± 1.32 | 74.89 ± 1.18 | 79.25 ± 1.79 | 94.10 ± 1.40 | 91.32 ± 3.22 | 90.43 ± 1.56 |
| ✓ | ✓ | LSE + SGC | 84.59 ± 1.71 | 70.62 ± 1.64 | 82.38 ± 1.95 | 90.64 ± 2.25 | 86.21 ± 4.81 | 87.58 ± 3.04 | 93.69 ± 1.07 | 94.86 ± 0.92 | 93.20 ± 1.19 |
| ✓ | × | LSE + PGC | 83.60 ± 3.98 | 62.96 ± 3.72 | 75.57 ± 1.10 | 83.60 ± 3.98 | 78.11 ± 7.52 | 77.98 ± 5.62 | 90.72 ± 1.25 | 91.82 ± 1.88 | 89.66 ± 1.41 |
| × | ✓ | PGC + SGC | 84.95 ± 1.25 | 73.74 ± 2.59 | 82.58 ± 3.66 | 90.37 ± 2.12 | 85.61 ± 3.55 | 87.23 ± 2.93 | 94.29 ± 0.83 | 95.01 ± 0.93 | 93.87 ± 0.93 |
| ✓ | ✓ | MCGNet | 85.87 ± 1.68 | 74.63 ± 2.37 | 83.81 ± 1.87 | 92.03 ± 1.95 | 89.62 ± 3.25 | 89.41 ± 2.68 | 94.75 ± 0.48 | 95.37 ± 0.99 | 94.15 ± 0.54 |
| Scale | IP (OA%) | PU (OA%) | SA (OA%) |
|---|---|---|---|
| 25 | 84.23 ± 1.85 | 90.45 ± 2.12 | 93.21 ± 0.65 |
| 50 | 85.12 ± 1.72 | 91.23 ± 1.98 | 94.08 ± 0.58 |
| 100 | 85.65 ± 1.68 | 92.03 ± 1.95 | 94.75 ± 0.48 |
| 150 | 85.78 ± 1.71 | 91.87 ± 2.01 | 94.52 ± 0.51 |
| 200 | 85.87 ± 1.68 | 91.56 ± 2.08 | 94.28 ± 0.55 |
| 250 | 85.34 ± 1.76 | 91.12 ± 2.15 | 93.89 ± 0.62 |
| 300 | 84.91 ± 1.82 | 90.67 ± 2.23 | 93.45 ± 0.68 |
| Model | Time Complexity | Memory Complexity |
|---|---|---|
| CNN | ||
| GCN | ||
| CEGCN | ||
| MCGNet (ours) | ||
| MSSGU | ||
| EMS-GCN |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Xu, Y.; Wang, J.; You, Z.; Li, X. A Symmetric Multiscale Feature Fusion Architecture Based on CNN and GNN for Hyperspectral Image Classification. Symmetry 2025, 17, 1930. https://doi.org/10.3390/sym17111930

