Bi-Kernel Graph Neural Network with Adaptive Propagation Mechanism for Hyperspectral Image Classification
Abstract
1. Introduction
1. We introduce a novel homophily degree matrix that estimates the degrees of homophily and heterophily widely present in the graph constructed from an HSI. During homophily degree matrix estimation, topological features and attribute features are learned by label propagation (LP) and a multilayer perceptron (MLP), which extract class-aware information. This allows us to incorporate both homophily and heterophily information into the graph convolution framework.
2. We propose a homophily-guided bi-kernel propagation mechanism that automatically adapts the feature propagation process by exploiting both homophily and heterophily information in the graph. To the best of our knowledge, this is the first time a homophily-guided GNN technique has been applied to the HSI classification task.
3. Extensive experiments on three real-world data sets, i.e., Indian Pines, Pavia University, and Kennedy Space Center, validate the performance of the proposed BKGNN both qualitatively and quantitatively. The experimental results demonstrate a significant improvement over previous methods.
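The bi-kernel idea in contributions 1 and 2 can be illustrated with a minimal sketch: each edge mixes a "homophily" weight matrix and a "heterophily" weight matrix according to the estimated homophily degree of that edge. All names and the averaging scheme below are illustrative assumptions, not the paper's exact formulation (which is given by Equations (15) and (16)).

```python
import numpy as np

def bi_kernel_propagate(X, A, H, W_homo, W_hetero):
    """One bi-kernel propagation step (illustrative sketch).

    X: (N, d) node features; A: (N, N) binary adjacency;
    H: (N, N) estimated homophily degree per edge in [0, 1];
    W_homo / W_hetero: (d, d_out) kernels for homophilous /
    heterophilous neighbors. Messages from each neighbor blend
    the two kernels by H[i, j], then are averaged over neighbors.
    """
    N = X.shape[0]
    out = np.zeros((N, W_homo.shape[1]))
    for i in range(N):
        deg = A[i].sum()
        for j in range(N):
            if A[i, j] == 0:
                continue
            msg = H[i, j] * (X[j] @ W_homo) + (1.0 - H[i, j]) * (X[j] @ W_hetero)
            out[i] += msg / deg
    return out
```

When H is all ones this reduces to ordinary mean aggregation with the homophily kernel alone; when an edge is heterophilous (H near 0), the second kernel takes over, which is the adaptive behavior the propagation mechanism is designed to provide.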
2. Methodology
2.1. Preliminaries
2.1.1. Graph Neural Network (GNN)
2.1.2. Homophily in Graphs
2.1.3. Label Propagation (LP) Algorithm
2.2. Overall Framework
2.3. Graph Projection and Re-Projection
2.4. Homophily Degree Matrix Calculation
2.5. Bi-Kernel Feature Transformation
2.6. Optimization Objective
Algorithm 1: BKGNN
Input: Original HSI data; training labels; number of superpixels N; number of iterations T; learning rate; hyper-parameters.
1: Calculate the attribute matrix and the adjacency matrix according to Equations (6) and (8), respectively.
2: for t = 1 to T do
3:   Perform MLP and LP according to Equations (10) and (13).
4:   Calculate the homophily degree matrix according to Equation (15).
5:   Update the outputs after two layers of bi-kernel feature transformation according to Equation (16).
6:   Perform graph re-projection according to Equation (9).
7:   Calculate the overall error over all labeled instances according to Equation (21), and update the weight matrices using Adam gradient descent.
8: end for
9: Conduct label prediction based on the trained network.
Output: Predicted label for each pixel.
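The LP step in the loop above follows the classical iterative label-propagation scheme. A minimal sketch is given below; the symmetric normalization and the teleport weight `alpha` are assumptions for illustration, since the paper's exact formulation is the one in Equation (13).

```python
import numpy as np

def label_propagation(A, Y, alpha=0.9, iters=50):
    """Classical label propagation: F <- alpha * S @ F + (1 - alpha) * Y.

    A: (N, N) adjacency matrix; Y: (N, C) one-hot labels with zero rows
    for unlabeled nodes; S is the symmetrically normalized adjacency
    D^{-1/2} A D^{-1/2} (normalization choice is an assumption here).
    """
    d = A.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))
    S = D_inv_sqrt @ A @ D_inv_sqrt
    F = Y.astype(float).copy()
    for _ in range(iters):
        F = alpha * S @ F + (1.0 - alpha) * Y  # diffuse, then re-inject labels
    return F
```

The returned soft label matrix F carries the class-aware topological information that, together with the MLP's attribute features, feeds the homophily degree estimation in step 4.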
2.7. Computational Complexity
3. Experiments
3.1. Data Set Description
3.1.1. Indian Pines (IP)
3.1.2. Pavia University (PU)
3.1.3. Kennedy Space Center
3.2. Experimental Settings
3.2.1. Implementation Details
3.2.2. Compared Methods
1. SVM-RBF: The spread of the RBF kernel and the penalty parameter C (controlling the magnitude of penalization) are selected by grid search.
2. MBCTU: A random forest classifier that performs color-texture feature extraction on selected spectral bands; the bands are chosen according to their feature importance, computed by another random forest classifier.
3. 1D CNN: This architecture consists of one convolutional layer with 20 kernels, one max pooling layer, a ReLU activation layer, and two fully connected layers.
4. 2D CNN: A semi-supervised classification model consisting of one convolutional layer and one max pooling layer, followed by three decoding layers; each decoding layer comprises one fully connected layer and one normalization layer.
5. 3D CNN: The 3D CNN model contains two convolutional layers and a fully connected layer; each convolutional layer is followed by a ReLU activation layer.
6. NLGCN: This network applies two graph convolutional layers and incorporates a graph learning procedure.
7. GSAGE: Graph convolution is achieved by graph sampling and aggregation, taking the second-order neighborhood of the target node into account.
8. DARMA: A superpixel-level GNN model composed of three convolutional blocks, each consisting of an ARMA graph convolutional layer, a ReLU activation layer, and a normalization layer.
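The SVM-RBF baseline's grid search can be reproduced with scikit-learn; the search ranges below are illustrative assumptions (the original ranges are not reproduced here), and the synthetic data stands in for the spectral vectors of labeled pixels.

```python
from sklearn.svm import SVC
from sklearn.model_selection import GridSearchCV
from sklearn.datasets import make_classification

# Synthetic stand-in for per-pixel spectral vectors (assumption).
X, y = make_classification(n_samples=200, n_features=30, n_informative=10,
                           n_classes=3, random_state=0)

# Grid-search gamma (RBF spread) and C (penalty magnitude); the
# grids here are illustrative, not the paper's search ranges.
grid = GridSearchCV(SVC(kernel="rbf"),
                    {"C": [0.1, 1, 10, 100], "gamma": [1e-3, 1e-2, 1e-1]},
                    cv=3)
grid.fit(X, y)
print(grid.best_params_, grid.best_score_)
```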
3.3. Experimental Results
4. Discussion
4.1. Analysis of the Number of Superpixels
4.2. Analysis of Weights and
4.3. Analysis of Trade-Off Parameters and
5. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
Notation | Definition | Type | Size
---|---|---|---
– | Original HSI data. | 3D matrix | –
H, W, B | Height, width, and number of bands in the HSI, respectively. | Scalar | –
C | Number of classes in the HSI. | Scalar | –
N | Number of nodes contained in the graph (equal to the number of superpixels). | Scalar | –
– | Graph with node set and edge set. | – | –
– | Adjacency matrix. | Matrix | –
– | Attribute matrix. | Matrix | –
– | Degree matrix. | Matrix | –
– | Projection matrix and its normalized version. | Matrix | –
– | Homophily degree matrix. | Matrix | –
– | Node embeddings. | Matrix | –
– | Set of pixels in a superpixel. | – | –
– | Neighborhood of a node. | – | –
– | Nodes in the training set. | – | –
Classification accuracy (%) per class on the Indian Pines data set (conventional classifiers: SVM-RBF, MBCTU; CNN-based: 1D/2D/3D CNN; GNN-based: NLGCN, GSAGE, DARMA, BKGNN).

Class No. | SVM-RBF | MBCTU | 1D CNN | 2D CNN | 3D CNN | NLGCN | GSAGE | DARMA | BKGNN
---|---|---|---|---|---|---|---|---|---
1 | 87.50 ± 5.10 | 95.83 ± 2.95 | 96.25 ± 4.15 | 97.50 ± 4.15 | 99.38 ± 1.87 | 98.75 ± 2.50 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 |
2 | 55.74 ± 3.99 | 47.76 ± 0.65 | 63.21 ± 6.78 | 58.06 ± 9.13 | 66.82 ± 7.58 | 74.93 ± 8.22 | 76.02 ± 6.53 | 80.07 ± 6.15 | 88.54 ± 2.34 |
3 | 58.66 ± 4.32 | 68.17 ± 4.25 | 69.05 ± 7.37 | 49.59 ± 6.16 | 58.53 ± 5.04 | 78.56 ± 3.59 | 70.30 ± 9.86 | 94.07 ± 3.32 | 94.68 ± 3.18 |
4 | 70.53 ± 8.15 | 98.87 ± 0.91 | 86.04 ± 6.76 | 83.14 ± 5.02 | 84.69 ± 4.50 | 92.08 ± 3.73 | 97.49 ± 1.61 | 99.61 ± 0.36 | 100.00 ± 0.00 |
5 | 84.32 ± 4.67 | 71.52 ± 7.19 | 73.95 ± 16.58 | 77.15 ± 6.78 | 87.24 ± 3.67 | 93.11 ± 2.17 | 91.30 ± 9.10 | 95.01 ± 2.12 | 96.25 ± 2.60 |
6 | 90.61 ± 2.13 | 86.68 ± 0.53 | 91.27 ± 3.62 | 95.61 ± 2.48 | 91.87 ± 3.19 | 96.96 ± 2.33 | 97.71 ± 2.60 | 96.51 ± 0.71 | 98.83 ± 1.21 |
7 | 90.74 ± 6.92 | 100.00 ± 0.00 | 86.92 ± 7.73 | 96.15 ± 7.09 | 98.46 ± 3.08 | 97.69 ± 4.93 | 98.46 ± 4.62 | 96.92 ± 3.77 | 100.00 ± 0.00 |
8 | 89.58 ± 3.01 | 86.53 ± 5.55 | 94.58 ± 1.78 | 98.57 ± 0.80 | 97.05 ± 1.35 | 99.67 ± 0.30 | 97.90 ± 4.63 | 99.51 ± 0.78 | 100.00 ± 0.00 |
9 | 96.66 ± 4.71 | 93.33 ± 9.43 | 98.00 ± 6.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 82.00 ± 32.80 | 100.00 ± 0.00 | 100.00 ± 0.00 |
10 | 70.24 ± 2.86 | 86.13 ± 6.07 | 70.91 ± 7.68 | 75.34 ± 5.23 | 72.46 ± 8.02 | 87.58 ± 3.42 | 86.28 ± 8.77 | 89.07 ± 2.97 | 93.48 ± 3.00 |
11 | 50.59 ± 3.72 | 60.89 ± 9.64 | 63.65 ± 7.91 | 63.56 ± 7.25 | 66.19 ± 5.15 | 71.95 ± 4.10 | 67.46 ± 6.25 | 86.95 ± 4.36 | 92.17 ± 3.74 |
12 | 62.87 ± 6.52 | 52.46 ± 13.08 | 70.55 ± 8.75 | 67.41 ± 9.45 | 72.02 ± 6.59 | 89.17 ± 2.64 | 86.77 ± 7.55 | 89.66 ± 4.41 | 96.45 ± 1.00 |
13 | 96.57 ± 0.93 | 99.43 ± 0.25 | 96.63 ± 1.90 | 99.89 ± 0.34 | 99.20 ± 0.78 | 99.49 ± 0.40 | 99.83 ± 0.26 | 99.89 ± 0.23 | 100.00 ± 0.00 |
14 | 82.37 ± 3.97 | 86.23 ± 1.03 | 83.28 ± 11.13 | 90.95 ± 5.56 | 90.87 ± 3.51 | 91.07 ± 2.82 | 95.51 ± 3.10 | 97.51 ± 2.67 | 98.17 ± 1.99 |
15 | 64.04 ± 2.98 | 74.44 ± 4.25 | 58.62 ± 12.50 | 61.74 ± 6.22 | 67.42 ± 9.75 | 88.01 ± 5.95 | 93.12 ± 3.12 | 98.43 ± 1.61 | 99.72 ± 0.36 |
16 | 89.41 ± 8.33 | 99.47 ± 0.75 | 94.29 ± 4.03 | 99.68 ± 0.95 | 98.41 ± 2.56 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 |
OA | 66.93 ± 1.50 | 70.62 ± 2.94 | 72.64 ± 2.41 | 72.31 ± 1.07 | 75.27 ± 1.11 | 83.62 ± 1.72 | 82.48 ± 2.22 | 90.91 ± 2.42 | 94.66 ± 0.91 |
AA | 77.53 ± 1.46 | 81.75 ± 2.17 | 81.07 ± 1.18 | 82.15 ± 1.38 | 84.41 ± 1.14 | 91.19 ± 0.97 | 90.01 ± 2.56 | 95.20 ± 1.35 | 97.39 ± 0.31 |
Kappa | 62.81 ± 1.66 | 66.83 ± 3.28 | 69.04 ± 2.64 | 68.76 ± 1.12 | 71.98 ± 1.26 | 81.44 ± 1.93 | 80.16 ± 2.48 | 89.62 ± 2.76 | 93.90 ± 1.03 |
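The OA, AA, and Kappa rows in the tables follow the standard definitions: overall accuracy is the fraction of correctly classified samples, average accuracy is the mean of per-class accuracies, and Cohen's kappa corrects overall accuracy for chance agreement. A short helper illustrating them:

```python
import numpy as np

def oa_aa_kappa(cm):
    """Overall accuracy, average accuracy, and Cohen's kappa from a
    confusion matrix (rows: true class, columns: predicted class)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                          # overall accuracy
    aa = np.mean(np.diag(cm) / cm.sum(axis=1))     # mean per-class accuracy
    pe = (cm.sum(axis=0) @ cm.sum(axis=1)) / n**2  # chance agreement
    kappa = (oa - pe) / (1.0 - pe)
    return oa, aa, kappa
```

For a balanced two-class matrix [[40, 10], [10, 40]], this gives OA = 0.8, AA = 0.8, and kappa = 0.6, matching the usual hand calculation.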
Classification accuracy (%) per class on the Pavia University data set (conventional classifiers: SVM-RBF, MBCTU; CNN-based: 1D/2D/3D CNN; GNN-based: NLGCN, GSAGE, DARMA, BKGNN).

Class No. | SVM-RBF | MBCTU | 1D CNN | 2D CNN | 3D CNN | NLGCN | GSAGE | DARMA | BKGNN
---|---|---|---|---|---|---|---|---|---
1 | 65.14 ± 5.45 | 75.89 ± 4.64 | 69.58 ± 4.00 | 66.94 ± 8.50 | 75.56 ± 11.85 | 88.30 ± 2.60 | 90.54 ± 3.01 | 94.03 ± 1.35 | 96.76 ± 0.72 |
2 | 59.19 ± 4.45 | 55.52 ± 14.56 | 65.41 ± 13.70 | 61.68 ± 4.49 | 74.15 ± 11.41 | 75.17 ± 7.54 | 81.87 ± 6.95 | 90.75 ± 5.53 | 98.62 ± 1.21 |
3 | 27.69 ± 2.46 | 66.30 ± 7.35 | 68.27 ± 23.92 | 62.24 ± 15.61 | 81.08 ± 8.34 | 89.70 ± 1.82 | 88.09 ± 8.48 | 95.90 ± 5.09 | 99.40 ± 0.77 |
4 | 95.25 ± 2.14 | 92.23 ± 1.75 | 93.84 ± 4.32 | 91.80 ± 2.05 | 91.14 ± 3.96 | 94.18 ± 1.78 | 94.74 ± 2.60 | 88.49 ± 3.01 | 96.86 ± 1.13 |
5 | 99.18 ± 0.14 | 100.00 ± 0.00 | 99.41 ± 0.20 | 99.38 ± 0.41 | 98.82 ± 0.59 | 99.69 ± 0.22 | 100.00 ± 0.00 | 97.83 ± 0.84 | 99.68 ± 0.42 |
6 | 70.42 ± 9.49 | 70.01 ± 20.63 | 59.33 ± 17.98 | 82.68 ± 5.23 | 67.06 ± 18.27 | 80.53 ± 6.42 | 89.98 ± 5.62 | 94.94 ± 4.72 | 99.82 ± 0.17 |
7 | 90.10 ± 1.45 | 95.82 ± 1.53 | 91.74 ± 1.47 | 85.19 ± 7.88 | 90.48 ± 2.92 | 96.15 ± 0.62 | 96.95 ± 2.60 | 99.91 ± 0.06 | 100.00 ± 0.00 |
8 | 87.03 ± 2.81 | 77.99 ± 11.50 | 71.60 ± 17.12 | 72.50 ± 12.96 | 92.27 ± 4.77 | 92.83 ± 2.73 | 86.19 ± 9.48 | 95.35 ± 1.68 | 98.96 ± 0.51 |
9 | 99.92 ± 0.05 | 98.80 ± 0.27 | 99.95 ± 0.05 | 99.13 ± 0.51 | 98.52 ± 0.77 | 99.97 ± 0.05 | 99.96 ± 0.05 | 96.53 ± 1.93 | 97.71 ± 2.05 |
OA | 67.93 ± 0.55 | 69.01 ± 4.20 | 70.65 ± 4.82 | 70.77 ± 2.50 | 78.43 ± 3.11 | 83.36 ± 3.22 | 87.17 ± 3.16 | 92.86 ± 2.62 | 98.47 ± 0.58 |
AA | 77.11 ± 0.16 | 81.40 ± 1.17 | 79.90 ± 1.47 | 80.17 ± 1.74 | 85.45 ± 1.67 | 90.72 ± 1.01 | 92.03 ± 1.06 | 94.86 ± 1.37 | 98.65 ± 0.27 |
Kappa | 60.27 ± 0.25 | 61.86 ± 3.97 | 63.18 ± 4.94 | 63.96 ± 2.74 | 72.59 ± 3.28 | 78.80 ± 3.83 | 83.58 ± 3.81 | 90.69 ± 3.31 | 97.98 ± 0.76 |
Classification accuracy (%) per class on the Kennedy Space Center data set (conventional classifiers: SVM-RBF, MBCTU; CNN-based: 1D/2D/3D CNN; GNN-based: NLGCN, GSAGE, DARMA, BKGNN).

Class No. | SVM-RBF | MBCTU | 1D CNN | 2D CNN | 3D CNN | NLGCN | GSAGE | DARMA | BKGNN
---|---|---|---|---|---|---|---|---|---
1 | 89.78 ± 1.89 | 97.84 ± 0.10 | 79.78 ± 17.29 | 79.49 ± 5.55 | 93.57 ± 6.16 | 97.13 ± 1.26 | 96.79 ± 2.00 | 99.64 ± 0.24 | 99.97 ± 0.05 |
2 | 84.66 ± 0.58 | 92.49 ± 2.26 | 83.90 ± 4.08 | 80.85 ± 11.14 | 74.65 ± 7.86 | 90.94 ± 2.46 | 91.92 ± 2.12 | 97.28 ± 3.15 | 100.00 ± 0.00 |
3 | 56.78 ± 22.21 | 94.87 ± 2.14 | 50.44 ± 34.24 | 69.03 ± 17.86 | 85.40 ± 12.67 | 94.65 ± 5.32 | 97.92 ± 5.66 | 97.26 ± 2.30 | 99.82 ± 0.22 |
4 | 26.42 ± 19.23 | 14.14 ± 4.63 | 32.93 ± 23.16 | 46.08 ± 14.75 | 22.52 ± 13.80 | 59.46 ± 9.54 | 54.23 ± 23.74 | 99.01 ± 1.41 | 98.65 ± 1.61 |
5 | 38.42 ± 2.00 | 44.43 ± 5.66 | 38.93 ± 13.56 | 66.26 ± 13.75 | 84.73 ± 5.67 | 85.65 ± 8.61 | 84.89 ± 18.63 | 87.79 ± 0.84 | 96.34 ± 2.95 |
6 | 41.37 ± 2.47 | 77.69 ± 1.29 | 38.89 ± 11.66 | 80.65 ± 15.11 | 79.90 ± 9.24 | 75.58 ± 8.85 | 87.04 ± 6.59 | 97.59 ± 4.11 | 100.00 ± 0.00 |
7 | 89.33 ± 2.17 | 95.47 ± 4.27 | 84.67 ± 16.44 | 95.33 ± 8.56 | 98.00 ± 1.54 | 95.20 ± 6.09 | 98.27 ± 3.16 | 100.00 ± 0.00 | 99.47 ± 1.07 |
8 | 44.80 ± 8.03 | 65.34 ± 4.14 | 83.44 ± 6.95 | 67.88 ± 5.05 | 69.58 ± 4.17 | 93.17 ± 2.93 | 97.33 ± 2.50 | 100.00 ± 0.00 | 100.00 ± 0.00 |
9 | 75.98 ± 6.67 | 83.43 ± 1.13 | 71.94 ± 16.71 | 84.02 ± 5.77 | 81.63 ± 8.74 | 96.88 ± 4.49 | 97.14 ± 2.66 | 100.00 ± 0.00 | 100.00 ± 0.00 |
10 | 65.95 ± 1.81 | 93.64 ± 2.67 | 78.66 ± 8.99 | 96.47 ± 2.61 | 94.92 ± 3.76 | 97.99 ± 1.61 | 96.39 ± 5.79 | 99.79 ± 0.11 | 99.89 ± 0.13 |
11 | 89.88 ± 1.15 | 100.00 ± 0.00 | 93.42 ± 1.39 | 99.54 ± 0.52 | 99.74 ± 1.65 | 98.92 ± 0.94 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 |
12 | 83.72 ± 2.10 | 89.81 ± 0.84 | 80.13 ± 4.95 | 94.28 ± 4.87 | 82.47 ± 1.81 | 92.79 ± 3.21 | 95.03 ± 3.28 | 99.37 ± 1.27 | 99.20 ± 1.50 |
13 | 99.85 ± 0.14 | 100.00 ± 0.00 | 100.00 ± 0.00 | 99.99 ± 0.03 | 100.00 ± 0.00 | 99.79 ± 0.46 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 |
OA | 75.98 ± 0.31 | 86.58 ± 0.34 | 78.03 ± 4.33 | 84.18 ± 1.93 | 85.87 ± 1.64 | 93.70 ± 0.73 | 94.70 ± 1.31 | 99.14 ± 0.41 | 99.73 ± 0.14 |
AA | 68.23 ± 0.15 | 80.70 ± 0.55 | 70.55 ± 4.21 | 79.86 ± 2.97 | 82.24 ± 2.80 | 90.63 ± 1.11 | 92.07 ± 1.60 | 98.29 ± 0.58 | 99.49 ± 0.26 |
Kappa | 73.14 ± 0.34 | 84.99 ± 0.38 | 75.53 ± 4.74 | 82.38 ± 2.15 | 84.22 ± 1.94 | 92.95 ± 0.81 | 94.08 ± 1.47 | 99.04 ± 0.45 | 99.70 ± 0.16 |
OA (%) and running time under different numbers of superpixels (OM indicates out of memory).

Data Set | Metric | 200 | 500 | 1000 | 2000 | 3000 | 4000 | 5000
---|---|---|---|---|---|---|---|---
IP | OA | 89.65 | 94.66 | 94.59 | 92.70 | 91.94 | 91.55 | 90.25
IP | Time | 18.7 | 21.2 | 24.8 | 70.4 | 76.71 | 339.1 | 342.8
PU | OA | 90.25 | 97.66 | 98.47 | 95.78 | 97.69 | 96.10 | 94.68
PU | Time | 45.0 | 50.8 | 66.2 | 104.1 | 185.4 | 301.1 | OM
KSC | OA | 95.57 | 98.74 | 99.73 | 99.60 | 98.18 | 98.37 | 98.47
KSC | Time | 69.9 | 74.7 | 89.5 | 125.9 | 216.5 | 285.5 | OM
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Hu, H.; Ding, Y.; He, F.; Zhang, F.; Zhao, J.; Yao, M. Bi-Kernel Graph Neural Network with Adaptive Propagation Mechanism for Hyperspectral Image Classification. Remote Sens. 2022, 14, 6224. https://doi.org/10.3390/rs14246224