WSSGCN: Hyperspectral Forest Image Classification via Watershed Superpixel Segmentation and Sparse Graph Convolutional Networks
Abstract
1. Introduction
- (1) Robustness Enhancement of Superpixel Segmentation: An adaptive superpixel size adjustment strategy is introduced that dynamically optimizes segmentation parameters based on object complexity. Specifically, by calculating the texture complexity of local regions (e.g., gray-level co-occurrence matrix energy values), small superpixels are adopted in fragmented areas (e.g., urban building clusters) to preserve detail, while large superpixels are used in homogeneous regions (e.g., farmland) to reduce redundant nodes.
- (2) Dynamic Optimization of Graph Structures: An Attentive Adjacency Refinement (AAR) algorithm based on attention mechanisms is proposed. The algorithm dynamically adjusts the connection strength between superpixel nodes through learnable attention weights, mitigating the interference of noisy edges. For example, for node pairs with high spectral variability (e.g., forests and shadowed regions), AAR automatically reduces the connection weights, thereby minimizing the propagation of erroneous information.
- (3) Hierarchical Design of Multi-Scale Feature Fusion: A cross-scale feature interaction mechanism is introduced in the dual-branch architecture. Multi-scale convolutional features from the local branch are fused with graph convolutional features from the global branch through skip connections, ensuring complementary expression of local details and global semantics. For instance, shallow convolutional features (capturing texture detail) and deep graph convolutional features (modeling spatial distributions) are adaptively fused through a gating mechanism, enhancing the discrimination of complex objects.
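The adaptive sizing rule in contribution (1) can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the 8-level GLCM, the 0.2 energy threshold, and the 64/512-pixel target sizes are assumed values chosen only to make the idea concrete.

```python
import numpy as np

def glcm_energy(patch, levels=8):
    """Energy (angular second moment) of a horizontal-offset GLCM.

    High energy means few dominant gray-level transitions, i.e. a
    homogeneous texture; low energy means a complex, fragmented one.
    """
    bins = np.linspace(patch.min(), patch.max() + 1e-9, levels + 1)
    q = np.digitize(patch, bins) - 1          # quantize to `levels` gray levels
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1                       # count horizontal neighbor pairs
    p = glcm / glcm.sum()
    return float((p ** 2).sum())

def superpixel_size(patch, small=64, large=512, threshold=0.2):
    """Low energy (complex texture) -> small superpixels to keep detail;
    high energy (homogeneous region) -> large superpixels to cut node count."""
    return small if glcm_energy(patch) < threshold else large
```

A perfectly flat patch has GLCM energy 1.0 and gets the large size, while a noisy patch concentrates no transition pair and falls below the threshold.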
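Contribution (2) can likewise be sketched with a minimal attention-weighted adjacency refinement. The function name, the single-head dot-product form, and the projection shapes are assumptions for illustration; the paper's AAR module may differ in detail. The key property shown is that only existing edges are reweighted, so the sparsity pattern of the graph is preserved.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def refine_adjacency(A, H, W_q, W_k):
    """Reweight superpixel edges with learned attention scores.

    A   : (n, n) binary adjacency from the superpixel graph
    H   : (n, d) node features; W_q, W_k: (d, d) learnable projections
    Returns a row-normalized adjacency where spectrally dissimilar
    neighbors (e.g. forest vs. shadow) receive lower weights.
    """
    Q, K = H @ W_q, H @ W_k
    scores = (Q @ K.T) / np.sqrt(Q.shape[1])
    scores = np.where(A > 0, scores, -np.inf)   # keep only existing edges
    att = softmax(scores, axis=1)               # masked entries get weight 0
    return np.nan_to_num(att)                   # rows with no edges -> zeros
```

Because masked entries receive exp(-inf) = 0, a noisy edge can be down-weighted by the learned scores but no new edge is ever created.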
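The gating mechanism in contribution (3) reduces to a learned per-feature interpolation between the two branches. Again a hedged sketch: the concatenation-then-sigmoid gate is a common design, but the function name, weight shapes, and bias are illustrative assumptions rather than the paper's exact implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_fusion(local_feat, global_feat, W_g, b_g):
    """Adaptively fuse local convolutional and global graph features.

    local_feat, global_feat : (n, d) branch outputs for n superpixels
    W_g : (2d, d) gate weights, b_g : (d,) gate bias (both learnable)
    The gate g in (0, 1) decides, per feature, how much local texture
    detail vs. global graph context to keep.
    """
    z = np.concatenate([local_feat, global_feat], axis=-1)
    g = sigmoid(z @ W_g + b_g)
    return g * local_feat + (1.0 - g) * global_feat
```

Driving the gate toward 1 recovers the local branch and toward 0 the global branch, so the fused output always lies between the two feature streams.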
2. Research Area, Data, and Algorithms
2.1. Hyperspectral Data Source
2.2. Methodology
2.2.1. Watershed Superpixel Segmentation Module
2.2.2. Lightweight Multi-Scale Convolution Module (LMC Module)
2.2.3. Adaptive Sparse Graph Modeling Module (ASG Module)
2.2.4. Classifier and Loss Function Design
2.3. Experimental Setup
3. Experimental Results
3.1. Ablation Study
3.2. Multi-Method Comparison
4. Discussion
5. Conclusions
- Develop and integrate an adaptive spectral correction mechanism that can dynamically adjust to varying spectral conditions, particularly in shadow regions and areas with high spectral variability;
- Advance the superpixel segmentation algorithm by incorporating deep learning techniques to improve the precision of forest boundary preservation and reduce the loss of important spatial information;
- Explore and implement multimodal data fusion strategies, combining hyperspectral data with LiDAR data to leverage forest height and structural information, thereby enhancing the classification of under-canopy vegetation and improving overall model performance in complex forest environments.
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
Abbreviation | Definition
---|---
GCN | Graph convolutional network
SVM | Support vector machine
RNN | Recurrent neural network
ROI | Region of interest
Class | CNN | GCN | SparseGCN | WatershedGCN | Ours
---|---|---|---|---|---
Red roof | 97.41 ± 0.62 | 95.16 ± 2.06 | 97.43 ± 1.46 | 98.74 ± 1.05 | 99.95 ± 0.04
Road | 86.59 ± 1.18 | 38.86 ± 4.96 | 53.65 ± 4.56 | 61.25 ± 5.24 | 99.66 ± 0.13
Bare soil | 95.43 ± 0.79 | 99.01 ± 1.66 | 97.12 ± 0.25 | 99.12 ± 0.27 | 97.40 ± 2.44
Cotton | 99.57 ± 0.06 | 99.66 ± 0.06 | 99.24 ± 0.35 | 99.88 ± 0.05 | 99.77 ± 0.10
Cotton firewood | 83.75 ± 1.12 | 90.25 ± 2.98 | 94.65 ± 2.26 | 93.65 ± 2.58 | 98.25 ± 1.09
Rape | 97.32 ± 0.29 | 94.50 ± 2.65 | 95.26 ± 0.16 | 99.25 ± 0.25 | 99.98 ± 0.01
Chinese cabbage | 91.69 ± 1.14 | 88.74 ± 3.88 | 95.95 ± 0.28 | 96.24 ± 0.54 | 99.98 ± 0.00
Pakchoi | 51.84 ± 3.71 | 30.48 ± 20.85 | 95.31 ± 4.27 | 91.26 ± 5.26 | 97.23 ± 0.36
Cabbage | 97.90 ± 0.74 | 92.42 ± 0.05 | 97.65 ± 0.24 | 97.89 ± 0.42 | 97.37 ± 1.01
Tuber mustard | 88.08 ± 2.09 | 88.62 ± 2.12 | 88.56 ± 2.14 | 89.12 ± 2.01 | 98.25 ± 1.09
Brassica parachinensis | 83.39 ± 1.79 | 80.24 ± 6.95 | 87.26 ± 2.35 | 89.26 ± 2.54 | 98.24 ± 1.01
Brassica chinensis | 78.82 ± 2.52 | 49.21 ± 12.35 | 86.56 ± 2.57 | 88.26 ± 2.15 | 97.24 ± 0.24
Small Brassica chinensis | 88.15 ± 1.46 | 89.11 ± 2.69 | 95.25 ± 0.95 | 96.24 ± 1.25 | 99.01 ± 0.04
Lactuca sativa | 88.80 ± 3.44 | 85.60 ± 4.85 | 92.37 ± 0.25 | 92.25 ± 0.57 | 98.89 ± 0.18
Celtuce | 77.96 ± 3.14 | — | 63.66 ± 8.57 | 65.58 ± 7.59 | 97.02 ± 0.21
Film-covered lettuce | 93.83 ± 2.25 | 94.07 ± 7.18 | 93.57 ± 2.59 | 93.45 ± 2.57 | 99.67 ± 0.24
Romaine lettuce | 90.63 ± 2.94 | 80.76 ± 13.65 | 91.56 ± 1.24 | 91.56 ± 1.25 | 98.24 ± 0.57
Carrot | 89.39 ± 2.01 | 44.09 ± 23.44 | 90.24 ± 1.59 | 91.26 ± 1.57 | 98.26 ± 0.47
White radish | 93.91 ± 1.09 | 83.87 ± 3.18 | 84.26 ± 1.65 | 83.56 ± 1.99 | 98.16 ± 0.57
Garlic sprout | 92.26 ± 1.32 | 21.12 ± 18.39 | 98.69 ± 1.69 | 99.16 ± 1.11 | 97.68 ± 3.34
Broad bean | 47.79 ± 12.28 | — | 97.68 ± 1.98 | 98.54 ± 1.12 | 97.68 ± 1.89
Tree | 84.86 ± 5.36 | 91.74 ± 2.48 | 96.84 ± 1.59 | 97.16 ± 1.28 | 98.67 ± 0.35
OA (%) | 94.57 ± 0.30 | 91.14 ± 0.40 | 97.58 ± 0.64 | 98.12 ± 0.56 | 99.66 ± 0.06
AA (%) | 86.34 ± 1.05 | 79.89 ± 1.75 | 92.01 ± 1.06 | 93.07 ± 1.26 | 98.84 ± 0.26
Kappa | 93.13 ± 0.38 | 88.77 ± 0.51 | 96.88 ± 0.98 | 97.66 ± 0.57 | 99.56 ± 0.08
Class | AB-LSTM | SPEFORMER | SSFTT | DFFN | GSAT | Ours
---|---|---|---|---|---|---
Dense Urban Fabric | 27.10 ± 13.73 | 38.92 ± 13.71 | 69.19 ± 7.15 | 46.80 ± 16.05 | 51.23 ± 12.03 | 67.80 ± 8.51
Mineral Extraction Sites | 93.33 ± 3.50 | 100.00 ± 0.00 | 90.67 ± 10.98 | 89.67 ± 8.59 | 86.54 ± 10.12 | 75.00 ± 37.82
Non-Irrigated Arable Land | 83.73 ± 2.74 | 69.84 ± 9.15 | 86.52 ± 3.40 | 83.20 ± 2.63 | 87.68 ± 4.26 | 92.99 ± 2.74
Fruit Trees | 7.04 ± 12.08 | 1.69 ± 2.25 | 51.83 ± 11.21 | 14.65 ± 7.43 | 17.26 ± 6.48 | 28.73 ± 12.52
Olive Groves | 86.60 ± 4.70 | 76.83 ± 4.00 | 92.23 ± 3.50 | 90.75 ± 2.92 | 91.64 ± 2.08 | 94.35 ± 1.58
Broad-leaved Forest | 33.13 ± 18.23 | 12.24 ± 9.66 | 59.10 ± 6.66 | 24.68 ± 9.60 | 38.59 ± 10.57 | 55.72 ± 13.57
Coniferous Forest | 46.67 ± 10.13 | 42.71 ± 2.61 | 71.02 ± 7.16 | 57.47 ± 3.06 | 68.59 ± 3.29 | 70.44 ± 5.81
Mixed Forest | 66.74 ± 7.00 | 66.34 ± 8.98 | 74.34 ± 8.91 | 67.79 ± 3.34 | 79.59 ± 3.48 | 82.07 ± 6.05
Dense Sclerophyllous Vegetation | 81.99 ± 2.87 | 76.14 ± 2.80 | 84.05 ± 4.37 | 85.09 ± 1.45 | 82.06 ± 1.54 | 83.42 ± 2.14
Sparse Sclerophyllous Vegetation | 83.63 ± 3.58 | 72.20 ± 1.60 | 86.34 ± 2.64 | 77.50 ± 2.60 | 80.21 ± 3.49 | 83.68 ± 2.34
Sparsely Vegetated Areas | 51.79 ± 9.19 | 64.68 ± 5.52 | 69.26 ± 7.84 | 59.72 ± 5.17 | 65.26 ± 8.57 | 76.25 ± 10.00
Rocks and Sand | 88.72 ± 4.98 | 92.74 ± 1.97 | 92.56 ± 6.03 | 92.65 ± 1.58 | 91.26 ± 1.59 | 94.52 ± 1.16
Water | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00
Coastal Water | 99.90 ± 0.12 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00
OA (%) | 79.80 ± 1.07 | 74.47 ± 0.72 | 85.38 ± 1.02 | 80.98 ± 1.13 | 82.26 ± 1.07 | 85.70 ± 0.32
AA (%) | 67.88 ± 2.46 | 65.31 ± 1.44 | 80.51 ± 2.11 | 70.71 ± 1.39 | 73.89 ± 1.05 | 78.93 ± 2.66
Kappa | 75.81 ± 1.35 | 69.48 ± 0.93 | 82.60 ± 1.16 | 77.24 ± 1.35 | 77.69 ± 0.98 | 83.03 ± 0.42
Time (s) | 300.61 | 196.65 | 436.85 | 232.59 | 326.85 | 177.68
Class | AB-LSTM | SPEFORMER | SSFTT | DFFN | GSAT | Ours
---|---|---|---|---|---|---
Red roof | 97.39 ± 0.69 | 98.47 ± 0.30 | 99.06 ± 0.28 | 99.25 ± 0.42 | 99.22 ± 0.48 | 99.95 ± 0.04
Road | 90.59 ± 1.33 | 94.41 ± 1.72 | 94.30 ± 2.00 | 96.95 ± 0.42 | 97.15 ± 1.02 | 99.66 ± 0.13
Bare soil | 96.42 ± 0.61 | 96.27 ± 0.78 | 97.95 ± 1.00 | 98.63 ± 0.63 | 98.93 ± 0.30 | 97.40 ± 2.44
Cotton | 99.61 ± 0.08 | 99.59 ± 0.13 | 99.84 ± 0.04 | 99.86 ± 0.06 | 99.87 ± 0.04 | 99.77 ± 0.10
Cotton firewood | 88.00 ± 1.70 | 90.83 ± 3.67 | 93.13 ± 1.12 | 96.62 ± 1.89 | 98.18 ± 0.70 | 98.25 ± 1.09
Rape | 97.48 ± 0.41 | 98.78 ± 0.26 | 99.36 ± 0.14 | 99.31 ± 0.19 | 99.65 ± 0.11 | 99.98 ± 0.01
Chinese cabbage | 92.10 ± 0.90 | 95.32 ± 0.38 | 97.07 ± 0.45 | 97.78 ± 0.45 | 98.55 ± 0.34 | 99.98 ± 0.00
Pakchoi | 60.05 ± 3.17 | 75.62 ± 2.47 | 90.58 ± 1.84 | 90.18 ± 1.71 | 94.38 ± 1.79 | 97.23 ± 0.36
Cabbage | 97.50 ± 0.65 | 99.09 ± 0.12 | 99.48 ± 0.30 | 99.50 ± 0.15 | 99.66 ± 0.12 | 97.37 ± 1.01
Tuber mustard | 89.47 ± 1.38 | 90.40 ± 1.44 | 97.20 ± 0.88 | 97.39 ± 0.54 | 97.94 ± 0.54 | 98.25 ± 1.09
Brassica parachinensis | 85.03 ± 2.22 | 90.39 ± 0.98 | 95.69 ± 1.08 | 95.81 ± 0.28 | 97.59 ± 0.62 | 98.24 ± 1.01
Brassica chinensis | 81.81 ± 1.49 | 88.56 ± 2.15 | 94.47 ± 1.79 | 95.84 ± 0.94 | 97.08 ± 0.79 | 97.24 ± 0.24
Small Brassica chinensis | 90.22 ± 0.62 | 93.32 ± 0.70 | 97.01 ± 0.29 | 97.68 ± 0.36 | 98.41 ± 0.60 | 99.01 ± 0.04
Lactuca sativa | 94.17 ± 0.43 | 94.65 ± 1.56 | 97.50 ± 0.58 | 98.78 ± 0.30 | 98.74 ± 0.26 | 98.89 ± 0.18
Celtuce | 82.24 ± 4.31 | 90.33 ± 2.17 | 93.35 ± 1.97 | 96.01 ± 2.83 | 96.87 ± 1.68 | 97.02 ± 0.21
Film-covered lettuce | 97.61 ± 0.60 | 98.60 ± 0.35 | 98.57 ± 0.57 | 99.34 ± 0.36 | 99.63 ± 0.29 | 99.67 ± 0.24
Romaine lettuce | 91.84 ± 1.65 | 92.57 ± 2.53 | 94.46 ± 1.80 | 97.58 ± 0.80 | 98.12 ± 0.69 | 98.24 ± 0.57
Carrot | 86.71 ± 3.04 | 91.37 ± 1.25 | 97.02 ± 0.90 | 96.15 ± 1.03 | 97.26 ± 0.65 | 98.26 ± 0.47
White radish | 90.77 ± 0.89 | 94.71 ± 0.65 | 96.05 ± 0.80 | 97.52 ± 0.44 | 97.78 ± 0.42 | 98.16 ± 0.57
Garlic sprout | 90.30 ± 1.51 | 94.44 ± 1.88 | 96.75 ± 1.51 | 97.39 ± 0.35 | 97.72 ± 0.26 | 97.68 ± 3.34
Broad bean | 45.77 ± 12.40 | 82.53 ± 10.69 | 94.95 ± 2.42 | 95.36 ± 2.80 | 96.55 ± 2.35 | 97.68 ± 1.89
Tree | 88.56 ± 1.68 | 93.80 ± 1.44 | 95.82 ± 2.40 | 97.85 ± 1.50 | 98.40 ± 0.36 | 98.67 ± 0.35
OA (%) | 95.26 ± 0.20 | 96.84 ± 0.18 | 98.40 ± 0.13 | 98.77 ± 0.17 | 99.13 ± 0.03 | 99.66 ± 0.06
AA (%) | 87.89 ± 0.51 | 92.91 ± 0.81 | 96.35 ± 0.29 | 97.31 ± 0.29 | 98.08 ± 0.17 | 98.84 ± 0.26
Kappa | 94.00 ± 0.26 | 96.00 ± 0.23 | 97.98 ± 0.17 | 98.45 ± 0.21 | 98.90 ± 0.04 | 99.56 ± 0.08
Time (s) | 167.29 | 138.07 | 240.36 | 182.68 | 216.58 | 122.29
Class | AB-LSTM | SPEFORMER | SSFTT | DFFN | GSAT | Ours
---|---|---|---|---|---|---
Strawberry | 97.79 ± 0.17 | 97.87 ± 0.36 | 98.18 ± 0.14 | 99.38 ± 0.18 | 99.45 ± 0.08 | 99.42 ± 0.12
Cowpea | 95.43 ± 0.41 | 94.47 ± 1.39 | 96.52 ± 0.61 | 99.03 ± 0.14 | 99.27 ± 0.18 | 99.18 ± 0.25
Soybean | 94.82 ± 1.22 | 97.36 ± 0.38 | 95.51 ± 2.09 | 99.28 ± 0.15 | 99.52 ± 0.16 | 99.34 ± 0.18
Sorghum | 97.56 ± 0.81 | 98.04 ± 0.76 | 98.31 ± 0.65 | 99.00 ± 0.45 | 99.66 ± 0.13 | 99.67 ± 0.19
Water spinach | 83.87 ± 3.94 | 93.22 ± 3.50 | 91.35 ± 2.03 | 98.20 ± 0.79 | 98.33 ± 0.99 | 98.12 ± 0.88
Watermelon | 71.74 ± 2.89 | 68.35 ± 2.21 | 82.59 ± 2.76 | 92.65 ± 1.54 | 94.49 ± 1.02 | 93.97 ± 1.37
Greens | 90.07 ± 2.62 | 88.91 ± 1.83 | 93.95 ± 0.80 | 97.13 ± 1.01 | 97.85 ± 1.03 | 97.82 ± 1.68
Trees | 91.05 ± 0.84 | 90.61 ± 1.20 | 93.75 ± 1.12 | 98.07 ± 0.49 | 97.91 ± 1.01 | 98.01 ± 0.67
Grass | 91.25 ± 0.58 | 90.62 ± 1.50 | 95.01 ± 1.36 | 98.53 ± 0.88 | 98.96 ± 0.20 | 98.99 ± 0.17
Red roof | 97.54 ± 0.57 | 98.38 ± 0.54 | 98.38 ± 0.46 | 99.39 ± 0.22 | 99.38 ± 0.27 | 99.37 ± 0.25
Gray roof | 97.17 ± 0.61 | 96.78 ± 0.41 | 97.78 ± 0.44 | 99.11 ± 0.28 | 99.24 ± 0.35 | 99.25 ± 0.26
Plastic | 80.75 ± 3.13 | 82.75 ± 1.11 | 89.85 ± 2.40 | 96.87 ± 1.77 | 98.33 ± 1.12 | 97.34 ± 1.55
Bare soil | 79.85 ± 2.49 | 82.29 ± 2.63 | 86.86 ± 2.17 | 94.22 ± 1.08 | 95.14 ± 1.17 | 94.78 ± 1.12
Road | 94.37 ± 0.77 | 92.44 ± 0.61 | 94.97 ± 1.16 | 98.85 ± 0.17 | 99.03 ± 0.22 | 99.01 ± 0.38
Bright object | 79.59 ± 2.85 | 86.67 ± 3.23 | 92.50 ± 3.42 | 95.28 ± 3.91 | 97.81 ± 1.66 | 97.12 ± 2.14
Water | 99.62 ± 0.07 | 99.37 ± 0.13 | 99.64 ± 0.14 | 99.77 ± 0.05 | 99.89 ± 0.07 | 99.88 ± 0.06
OA (%) | 95.33 ± 0.19 | 95.21 ± 0.14 | 96.73 ± 0.16 | 98.86 ± 0.09 | 99.09 ± 0.13 | 99.00 ± 0.27
AA (%) | 90.15 ± 0.59 | 91.13 ± 0.36 | 94.07 ± 0.34 | 97.80 ± 0.39 | 98.39 ± 0.20 | 98.29 ± 0.32
Kappa | 94.53 ± 0.23 | 94.39 ± 0.17 | 96.18 ± 0.19 | 98.67 ± 0.11 | 98.94 ± 0.15 | 98.72 ± 0.26
Time (s) | 180.08 | 139.62 | 241.36 | 141.23 | 200.71 | 136.85
Class | AB-LSTM | SPEFORMER | SSFTT | DFFN | GSAT | Ours
---|---|---|---|---|---|---
Corn | 99.82 ± 0.05 | 99.87 ± 0.04 | 99.94 ± 0.01 | 99.94 ± 0.03 | 99.96 ± 0.01 | 99.98 ± 0.03
Cotton | 98.47 ± 0.28 | 98.77 ± 0.38 | 99.77 ± 0.15 | 99.78 ± 0.11 | 99.81 ± 0.11 | 99.89 ± 0.08
Sesame | 98.13 ± 1.41 | 99.14 ± 0.58 | 99.45 ± 0.34 | 99.49 ± 0.36 | 99.85 ± 0.16 | 99.87 ± 0.11
Broad-leaf soybean | 99.67 ± 0.06 | 99.62 ± 0.06 | 99.85 ± 0.05 | 99.89 ± 0.02 | 99.88 ± 0.05 | 99.89 ± 0.02
Narrow-leaf soybean | 96.51 ± 0.45 | 96.62 ± 1.04 | 98.32 ± 0.87 | 98.23 ± 1.28 | 98.67 ± 0.48 | 98.94 ± 0.17
Rice | 99.77 ± 0.10 | 99.87 ± 0.13 | 99.95 ± 0.10 | 99.99 ± 0.01 | 99.99 ± 0.02 | 99.99 ± 0.01
Water | 99.98 ± 0.00 | 99.99 ± 0.00 | 99.98 ± 0.01 | 99.98 ± 0.01 | 99.98 ± 0.01 | 99.98 ± 0.02
Roads and houses | 97.97 ± 0.26 | 98.01 ± 0.72 | 98.88 ± 0.23 | 98.65 ± 0.41 | 99.05 ± 0.25 | 99.52 ± 0.16
Mixed weed | 97.04 ± 0.66 | 98.72 ± 0.34 | 98.72 ± 0.55 | 98.72 ± 0.46 | 98.92 ± 0.35 | 99.12 ± 0.27
OA (%) | 99.54 ± 0.03 | 99.61 ± 0.07 | 99.81 ± 0.04 | 99.82 ± 0.03 | 99.85 ± 0.02 | 99.87 ± 0.12
AA (%) | 98.59 ± 0.09 | 98.96 ± 0.26 | 99.43 ± 0.16 | 99.41 ± 0.16 | 99.57 ± 0.06 | 99.61 ± 0.04
Kappa | 99.40 ± 0.03 | 99.49 ± 0.09 | 99.75 ± 0.05 | 99.76 ± 0.04 | 99.81 ± 0.03 | 99.87 ± 0.09
Time (s) | 242.18 | 165.56 | 387.12 | 185.92 | 276.99 | 141.86
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Chen, P.; Li, X.; Peng, Y.; Fan, X.; Li, Q. WSSGCN: Hyperspectral Forest Image Classification via Watershed Superpixel Segmentation and Sparse Graph Convolutional Networks. Forests 2025, 16, 827. https://doi.org/10.3390/f16050827