Wide Sliding Window and Subsampling Network for Hyperspectral Image Classification
Abstract
1. Introduction
- Extracting higher-level spatial and spectral features efficiently through multiple layers of transform kernels, whose parameters can be learned with unsupervised learning or obtained directly by randomly choosing them from the training samples.
- Adjusting the extracted features easily by tuning the width and the field of view of the WSWS layers according to the size of the training data.
- Training the WSWS Net easily, because most of the trainable weights are in the fully connected layer and can be computed with least squares (a minimal sketch of this pipeline is given after this list).
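To make these points concrete, the following is a minimal NumPy sketch of the pipeline described above: transform kernels taken directly from randomly chosen training samples, sliding-window responses subsampled by pooling, and fully connected output weights solved with least squares. It is an illustration under stated assumptions rather than the authors' implementation; the function names (`wsws_layer`, `fit_wsws`, `predict_wsws`), the tanh activation, the pooling factor, and the ridge regularization term are hypothetical choices.

```python
# Hypothetical sketch of the WSWS idea (not the authors' code): kernels drawn
# from training patches, responses subsampled by max pooling, and the fully
# connected output weights computed in closed form with least squares.
import numpy as np

rng = np.random.default_rng(0)

def wsws_layer(patches, kernels, pool=2):
    """One WSWS layer: correlate patch vectors with kernels, then subsample."""
    # patches: (n_samples, d), kernels: (n_kernels, d)
    responses = np.tanh(patches @ kernels.T)           # (n_samples, n_kernels)
    n = (responses.shape[1] // pool) * pool
    responses = responses[:, :n].reshape(len(patches), -1, pool)
    return responses.max(axis=2)                        # subsampling by max pooling

def fit_wsws(X_train, y_train, n_kernels=64, n_classes=None, reg=1e-3):
    n_classes = n_classes or int(y_train.max()) + 1
    # Transform kernels obtained directly from randomly chosen training samples.
    idx = rng.choice(len(X_train), size=n_kernels, replace=False)
    kernels = X_train[idx]
    feats = wsws_layer(X_train, kernels)
    # Fully connected output weights computed with regularized least squares.
    H = np.hstack([feats, np.ones((len(feats), 1))])    # add bias column
    T = np.eye(n_classes)[y_train]                      # one-hot targets
    W = np.linalg.solve(H.T @ H + reg * np.eye(H.shape[1]), H.T @ T)
    return kernels, W

def predict_wsws(X, kernels, W):
    feats = wsws_layer(X, kernels)
    H = np.hstack([feats, np.ones((len(feats), 1))])
    return (H @ W).argmax(axis=1)

# Toy usage with random "patch vectors" standing in for flattened HSI neighborhoods.
X = rng.normal(size=(200, 9 * 9 * 10))   # 200 patches, 9x9 window, 10 bands
y = rng.integers(0, 3, size=200)
kernels, W = fit_wsws(X, y, n_kernels=32, n_classes=3)
print(predict_wsws(X[:5], kernels, W))
```

Because the output weights are obtained in closed form, training reduces to one feature-extraction pass plus a linear solve, which is what makes such a network easy to train compared with end-to-end backpropagation.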
2. Wide Sliding Window and Subsampling Network (WSWS Net)
2.1. Generating Patch Vectors for WSWS Net from HSI Data
2.2. Constructing the Transform Kernel Layer by Wide Sliding Window and Subsampling
2.3. Going Deeper with a Fully Connected Layer
2.4. Extracting Different Levels of Spatial and Spectral Features Stably and Effectively
3. Datasets and Experimental Settings
3.1. Dataset Description
3.2. Experimental Setup
4. Experimental Results
4.1. Classification Results for Pavia University
4.2. Classification Results for KSC
4.3. Classification Results for Salinas
5. Discussion
5.1. The Effects of Different Ratios of Training Samples
5.2. The Effects of Different Neighborhood Sizes
5.3. Visualization of Different Layers of Extracted Features
6. Conclusions
Author Contributions
Funding
Acknowledgments
Conflicts of Interest
Abbreviations
HSI | Hyperspectral image |
CNN | Convolutional neural network |
WSWS | Wide sliding window and subsampling |
KNN | K-nearest neighbors |
SVM | Support vector machine |
MLP | Multilayer perceptron |
RF | Random forest |
RBF | Radial basis function network |
SAE | Stacked autoencoder |
PCA | Principal component analysis |
OA | Overall accuracy |
AA | Average accuracy |
References
- Li, W.; Wu, G.; Zhang, F.; Du, Q. Hyperspectral image classification using deep pixel-pair features. IEEE Trans. Geosci. Remote Sens. 2016, 55, 844–853. [Google Scholar] [CrossRef]
- Tarabalka, Y.; Fauvel, M.; Chanussot, J.; Benediktsson, J.A. SVM- and MRF-based method for accurate classification of hyperspectral images. IEEE Geosci. Remote Sens. Lett. 2010, 7, 736–740. [Google Scholar] [CrossRef] [Green Version]
- Xi, J.; Ersoy, O.K.; Fang, J.; Wu, T.; Wei, X.; Zhao, C. Parallel Multistage Wide Neural Network; Department of Electrical and Computer Engineering Technical Reports; Purdue University: West Lafayette, IN, USA, 2020; Volume 757. [Google Scholar]
- Li, J.; Bioucas-Dias, J.M.; Plaza, A. Semisupervised hyperspectral image classification using soft sparse multinomial logistic regression. IEEE Geosci. Remote Sens. Lett. 2012, 10, 318–322. [Google Scholar]
- Fauvel, M.; Benediktsson, J.A.; Chanussot, J.; Sveinsson, J.R. Spectral and spatial classification of hyperspectral data using SVMs and morphological profiles. IEEE Trans. Geosci. Remote Sens. 2008, 46, 3804–3814. [Google Scholar] [CrossRef] [Green Version]
- Li, J.; Bioucas-Dias, J.M.; Plaza, A. Spectral–spatial hyperspectral image segmentation using subspace multinomial logistic regression and Markov random fields. IEEE Trans. Geosci. Remote Sens. 2011, 50, 809–823. [Google Scholar] [CrossRef]
- Chen, Y.; Nasrabadi, N.M.; Tran, T.D. Hyperspectral image classification using dictionary-based sparse representation. IEEE Trans. Geosci. Remote Sens. 2011, 49, 3973–3985. [Google Scholar] [CrossRef]
- Li, J.; Marpu, P.R.; Plaza, A.; Bioucas-Dias, J.M.; Benediktsson, J.A. Generalized composite kernel framework for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2013, 51, 4816–4829. [Google Scholar] [CrossRef]
- Lee, H.; Kwon, H. Going deeper with contextual CNN for hyperspectral image classification. IEEE Trans. Image Process. 2017, 26, 4843–4855. [Google Scholar] [CrossRef] [Green Version]
- Mei, S.; Ji, J.; Hou, J.; Li, X.; Du, Q. Learning sensor-specific spatial-spectral features of hyperspectral images via convolutional neural networks. IEEE Trans. Geosci. Remote Sens. 2017, 55, 4520–4533. [Google Scholar] [CrossRef]
- Gao, Q.; Lim, S.; Jia, X. Hyperspectral image classification using convolutional neural networks and multiple feature learning. Remote Sens. 2018, 10, 299. [Google Scholar] [CrossRef] [Green Version]
- Wang, W.; Dou, S.; Jiang, Z.; Sun, L. A fast dense spectral–spatial convolution network framework for hyperspectral images classification. Remote Sens. 2018, 10, 1068. [Google Scholar] [CrossRef] [Green Version]
- Paoletti, M.; Haut, J.; Plaza, J.; Plaza, A. A new deep convolutional neural network for fast hyperspectral image classification. ISPRS J. Photogramm. Remote Sens. 2018, 145, 120–147. [Google Scholar] [CrossRef]
- Zhang, M.; Li, W.; Du, Q. Diverse region-based CNN for hyperspectral image classification. IEEE Trans. Image Process. 2018, 27, 2623–2634. [Google Scholar] [CrossRef] [PubMed]
- Cheng, G.; Li, Z.; Han, J.; Yao, X.; Guo, L. Exploring hierarchical convolutional features for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2018, 56, 6712–6722. [Google Scholar] [CrossRef]
- Zhang, L.; Zhang, L.; Du, B. Deep learning for remote sensing data: A technical tutorial on the state of the art. IEEE Geosci. Remote Sens. Mag. 2016, 4, 22–40. [Google Scholar] [CrossRef]
- Mou, L.; Ghamisi, P.; Zhu, X.X. Deep recurrent neural networks for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2017, 55, 3639–3655. [Google Scholar] [CrossRef] [Green Version]
- Mei, X.; Pan, E.; Ma, Y.; Dai, X.; Huang, J.; Fan, F.; Du, Q.; Zheng, H.; Ma, J. Spectral-spatial attention networks for hyperspectral image classification. Remote Sens. 2019, 11, 963. [Google Scholar] [CrossRef] [Green Version]
- Roy, S.K.; Krishna, G.; Dubey, S.R.; Chaudhuri, B.B. HybridSN: Exploring 3-D–2-D CNN feature hierarchy for hyperspectral image classification. IEEE Geosci. Remote Sens. Lett. 2019, 17, 277–281. [Google Scholar] [CrossRef] [Green Version]
- Zheng, J.; Feng, Y.; Bai, C.; Zhang, J. Hyperspectral Image Classification Using Mixed Convolutions and Covariance Pooling. IEEE Trans. Geosci. Remote Sens. 2020, 59, 522–534. [Google Scholar] [CrossRef]
- Gong, Z.; Zhong, P.; Yu, Y.; Hu, W.; Li, S. A CNN With Multiscale Convolution and Diversified Metric for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 3599–3618. [Google Scholar] [CrossRef]
- Haut, J.M.; Paoletti, M.E.; Plaza, J.; Li, J.; Plaza, A. Active Learning With Convolutional Neural Networks for Hyperspectral Image Classification Using a New Bayesian Approach. IEEE Trans. Geosci. Remote Sens. 2018, 56, 6440–6461. [Google Scholar] [CrossRef]
- Cao, X.; Yao, J.; Xu, Z.; Meng, D. Hyperspectral image classification with convolutional neural network and active learning. IEEE Trans. Geosci. Remote Sens. 2020. [Google Scholar] [CrossRef]
- Feng, J.; Wu, X.; Shang, R.; Sui, C.; Li, J.; Jiao, L.; Zhang, X. Attention Multibranch Convolutional Neural Network for Hyperspectral Image Classification Based on Adaptive Region Search. IEEE Trans. Geosci. Remote Sens. 2020. [Google Scholar] [CrossRef]
- Chen, Y.; Zhu, K.; Zhu, L.; He, X.; Ghamisi, P.; Benediktsson, J.A. Automatic Design of Convolutional Neural Network for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 7048–7066. [Google Scholar] [CrossRef]
- Hong, D.; Wu, X.; Ghamisi, P.; Chanussot, J.; Yokoya, N.; Zhu, X.X. Invariant attribute profiles: A spatial-frequency joint feature extractor for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2020, 58, 3791–3808. [Google Scholar] [CrossRef] [Green Version]
- Wang, J.; Gao, F.; Dong, J.; Du, Q. Adaptive DropBlock-Enhanced Generative Adversarial Networks for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2020. [Google Scholar] [CrossRef]
- Shen, Y.; Zhu, S.; Chen, C.; Du, Q.; Xiao, L.; Chen, J.; Pan, D. Efficient Deep Learning of Nonlocal Features for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2020. [Google Scholar] [CrossRef]
- Liu, Q.; Xiao, L.; Yang, J.; Chan, J.C.W. Content-Guided Convolutional Neural Network for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2020, 58, 6124–6137. [Google Scholar] [CrossRef]
- Tang, X.; Meng, F.; Zhang, X.; Cheung, Y.; Ma, J.; Liu, F.; Jiao, L. Hyperspectral Image Classification Based on 3-D Octave Convolution With Spatial-Spectral Attention Network. IEEE Trans. Geosci. Remote Sens. 2020. [Google Scholar] [CrossRef]
- Masarczyk, W.; Głomb, P.; Grabowski, B.; Ostaszewski, M. Effective Training of Deep Convolutional Neural Networks for Hyperspectral Image Classification through Artificial Labeling. Remote Sens. 2020, 12, 2653. [Google Scholar] [CrossRef]
- Okwuashi, O.; Ndehedehe, C.E. Deep support vector machine for hyperspectral image classification. Pattern Recognit. 2020, 103, 107298. [Google Scholar] [CrossRef]
- Cao, F.; Guo, W. Cascaded dual-scale crossover network for hyperspectral image classification. Knowl. Based Syst. 2020, 189, 105122. [Google Scholar] [CrossRef]
- Neyshabur, B.; Li, Z.; Bhojanapalli, S.; LeCun, Y.; Srebro, N. The role of over-parametrization in generalization of neural networks. In Proceedings of the International Conference on Learning Representations, New Orleans, LA, USA, 6–9 May 2019. [Google Scholar]
- Lee, J.; Xiao, L.; Schoenholz, S.S.; Bahri, Y.; Sohl-Dickstein, J.; Pennington, J. Wide neural networks of any depth evolve as linear models under gradient descent. arXiv 2019, arXiv:1902.06720. [Google Scholar]
- Cheng, H.T.; Koc, L.; Harmsen, J.; Shaked, T.; Chandra, T.; Aradhye, H.; Anderson, G.; Corrado, G.; Chai, W.; Ispir, M.; et al. Wide & deep learning for recommender systems. In Proceedings of the 1st Workshop on Deep Learning for Recommender Systems, Boston, MA, USA, 15 September 2016; pp. 7–10. [Google Scholar]
- Worrall, D.E.; Garbin, S.J.; Turmukhambetov, D.; Brostow, G.J. Harmonic networks: Deep translation and rotation equivariance. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 5028–5037. [Google Scholar]
- Liu, C.; Li, J.; He, L.; Plaza, A.; Li, S.; Li, B. Naive Gabor Networks for Hyperspectral Image Classification. IEEE Trans. Neural Netw. Learn. Syst. 2020. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Azar, S.G.; Meshgini, S.; Rezaii, T.Y.; Beheshti, S. Hyperspectral image classification based on sparse modeling of spectral blocks. Neurocomputing 2020, 407, 12–23. [Google Scholar] [CrossRef]
- Yu, S.; Jia, S.; Xu, C. Convolutional neural networks for hyperspectral image classification. Neurocomputing 2017, 219, 88–98. [Google Scholar] [CrossRef]
Types and Descriptions | Related Works |
---|---|
Machine learning, spectral features | KNN [1], SVM [2], MLP [3], RBF [3], RF [4] |
Machine learning and other methods without deep learning, spectral and spatial features | MPs (Fauvel et al.) [5], IAPs (Hong et al.) [26], MRFs (Li et al.) [6], SVM-MRF (Tarabalka et al.) [2], sparsity-based method (Chen et al.) [7], generalized composite kernel framework (Li et al.) [8] |
Deep learning, spectral and spatial features | contextual CNN (Lee et al.) [9], Mei et al. [10], Gao et al. [11], FDSSC (Wang et al.) [12], 3-D CNN (Paoletti et al.) [13], diverse region-based CNN (Zhang et al.) [14], Chen et al. [15], Zhang et al. [16], deep RNN (Mou et al.) [17], Mei et al. [18] |
Deep learning combined with emerging methods, spectral and spatial features | HybridSN (Roy et al.) [19], mixed CNN (Zheng et al.) [20], CNN with active learning (Haut et al. [22]; Cao et al. [23]), attention multibranch CNN (Feng et al.) [24], MS-CNNs (Gong et al.) [21], automatic CNN (Chen et al.) [25], DropBlock GAN (Wang et al.) [27], ENL-FCN (Shen et al.) [28], CGCNN (Liu et al.) [29], 3DOC-SSAN (Tang et al.) [30], transfer learning (Masarczyk et al.) [31] |
Learning models with different novel architectures, spectral and spatial features | DSVM (Okwuashi et al.) [32], cascaded dual-scale crossover network (Cao et al.) [33], naive Gabor networks (Liu et al.) [38] |
Class NO. | Pavia University Class Name | NO. of Samples | KSC Class Name | NO. of Samples | Salinas Class Name | NO. of Samples |
---|---|---|---|---|---|---|
1 | Asphalt | 6631 | Scrub | 347 | Brocoli-green-weeds-1 | 2009 |
2 | Meadows | 18,649 | Willow swamp | 243 | Brocoli-green-weeds-2 | 3726 |
3 | Gravel | 2099 | CP hammock | 256 | Fallow | 1976 |
4 | Trees | 3064 | Slash pine | 252 | Fallow-rough-plow | 1394 |
5 | Painted metal sheets | 1345 | Oak/broadleaf | 161 | Fallow-smooth | 2678 |
6 | Bare soil | 5029 | Hardwood | 229 | Stubble | 3959 |
7 | Bitumen | 1330 | Swamp | 105 | Celery | 3579 |
8 | Self-blocking bricks | 3682 | Graminoid marsh | 390 | Grapes-untrained | 11,271 |
9 | Shadows | 947 | Spartina marsh | 520 | Soil-vinyard-develop | 6203 |
10 | | | Cattail marsh | 404 | Corn-senesced-green-weeds | 3278 |
11 | | | Salt marsh | 419 | Lettuce-romaine-4wk | 1068 |
12 | | | Mud flats | 503 | Lettuce-romaine-5wk | 1927 |
13 | | | Water | 927 | Lettuce-romaine-6wk | 916 |
14 | | | | | Lettuce-romaine-7wk | 1070 |
15 | | | | | Vinyard-untrained | 7268 |
16 | | | | | Vinyard-vertica-trellis | 1807 |
Total | | 42,776 | | 5211 | | 54,129 |
Class NO. | Class Name | MLP | RBF | SAE | CNN | RBFE | CNNE | SMSB [39] | WSWS |
---|---|---|---|---|---|---|---|---|---|
1 | Asphalt | 97.13 | 97.65 | 97.43 | 96.18 | 98.99 | 95.22 | 99.11 | 99.10 |
2 | Meadows | 98.43 | 99.53 | 98.60 | 96.69 | 99.86 | 99.03 | 98.97 | 100.00 |
3 | Gravel | 85.15 | 80.62 | 0.00 | 80.86 | 81.43 | 82.92 | 98.89 | 93.01 |
4 | Trees | 95.05 | 93.84 | 91.35 | 87.21 | 95.43 | 87.00 | 98.74 | 98.37 |
5 | Painted metal sheets | 99.88 | 91.58 | 99.75 | 99.63 | 99.88 | 99.75 | 100 | 99.88 |
6 | Bare soil | 96.35 | 87.06 | 76.30 | 88.30 | 81.67 | 80.88 | 99.87 | 99.97 |
7 | Bitumen | 90.85 | 90.30 | 0.00 | 82.58 | 84.46 | 90.73 | 99.79 | 99.00 |
8 | Self-blocking bricks | 93.21 | 92.43 | 85.84 | 94.12 | 92.85 | 93.44 | 98.99 | 98.33 |
9 | Shadows | 99.30 | 94.84 | 95.61 | 99.30 | 97.72 | 99.47 | 98.04 | 98.95 |
OA (%) | | 96.47 | 95.18 | 86.23 | 93.66 | 95.23 | 93.95 | 99.11 | 99.19 |
AA (%) | | 95.04 | 91.98 | 71.64 | 91.65 | 92.41 | 92.05 | 99.16 | 98.51 |
Kappa (%) | | 95.36 | 93.66 | 82.03 | 91.72 | 93.72 | 92.05 | 98.79 | 98.93 |
Test Time (s) | | 1.3 | 1.5 | 0.2 | 3.7 | 8.6 | 22.2 | 61.0 | 6.4 |
Class NO. | Class Name | MLP | RBF | SAE | CNN | RBFE | CNNE | WSWS |
---|---|---|---|---|---|---|---|---|
1 | Scrub | 99.78 | 98.47 | 98.69 | 97.37 | 96.94 | 97.81 | 100.00 |
2 | Willow swamp | 99.31 | 88.28 | 71.03 | 94.48 | 92.41 | 94.48 | 100.00 |
3 | CP hammock | 92.86 | 96.75 | 93.51 | 95.45 | 96.10 | 98.70 | 99.35 |
4 | Slash pine | 79.61 | 64.47 | 34.21 | 76.97 | 71.71 | 70.39 | 100.00 |
5 | Oak/broadleaf | 87.63 | 90.72 | 0.00 | 72.16 | 92.78 | 69.07 | 96.91 |
6 | Hardwood | 99.27 | 88.32 | 2.19 | 83.21 | 83.21 | 86.13 | 100.00 |
7 | Swamp | 100.00 | 96.83 | 12.70 | 100.00 | 95.24 | 90.48 | 100.00 |
8 | Graminoid marsh | 100.00 | 98.07 | 83.01 | 96.53 | 94.98 | 99.61 | 100.00 |
9 | Spartina marsh | 100.00 | 100.00 | 99.36 | 100.00 | 100.00 | 100.00 | 100.00 |
10 | Cattail marsh | 99.59 | 99.59 | 100.00 | 100.00 | 97.93 | 100.00 | 100.00 |
11 | Salt marsh | 98.41 | 90.84 | 97.61 | 100.00 | 95.62 | 100.00 | 100.00 |
12 | Mud flats | 99.34 | 98.01 | 92.68 | 96.01 | 98.67 | 98.34 | 100.00 |
13 | Water | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 | 100.00 |
OA (%) | | 97.95 | 95.36 | 83.72 | 95.75 | 95.52 | 95.97 | 99.87 |
AA (%) | | 96.60 | 93.10 | 68.31 | 93.25 | 93.51 | 92.69 | 99.71 |
Kappa (%) | | 97.72 | 94.85 | 82.03 | 95.28 | 95.03 | 95.52 | 99.86 |
Test Time (s) | | 0.6 | 0.1 | 0.1 | 0.9 | 2.2 | 5.9 | 1.9 |
Class NO. | Class Name | MLP | RBF | SAE | CNN | RBFE | CNNE | SMSB [39] | WSWS |
---|---|---|---|---|---|---|---|---|---|
1 | Bro.-gw-1 | 100.00 | 99.75 | 99.17 | 98.51 | 100.00 | 98.92 | 99.78 | 100.00 |
2 | Bro.-gw-2 | 100.00 | 100.00 | 99.87 | 99.82 | 100.00 | 99.87 | 99.97 | 99.87 |
3 | Fallow | 99.41 | 99.83 | 92.16 | 99.66 | 99.75 | 99.41 | 99.94 | 98.82 |
4 | Fal.-rough-plow | 99.52 | 99.52 | 97.61 | 98.68 | 99.52 | 98.80 | 99.28 | 97.73 |
5 | Fallow-smooth | 97.70 | 97.14 | 98.63 | 99.38 | 97.14 | 99.75 | 99.54 | 99.38 |
6 | Stubble | 100.00 | 100.00 | 99.92 | 99.96 | 100.00 | 99.92 | 99.97 | 99.96 |
7 | Celery | 0.00 | 100.00 | 99.63 | 99.95 | 100.00 | 99.95 | 99.88 | 99.91 |
8 | Gra.-untrained | 90.64 | 91.22 | 89.75 | 74.24 | 91.62 | 90.17 | 98.87 | 99.72 |
9 | Soil-vd | 100.00 | 100.00 | 99.87 | 100.00 | 100.00 | 100.00 | 99.91 | 99.76 |
10 | Corn-sgw | 99.08 | 98.98 | 96.19 | 93.44 | 99.08 | 93.13 | 98.85 | 99.64 |
11 | Let.-r-4wk | 99.53 | 99.69 | 93.75 | 96.72 | 99.69 | 97.19 | 99.79 | 100.00 |
12 | Let.-r-5wk | 100.00 | 100.00 | 98.96 | 99.74 | 100.00 | 99.91 | 99.94 | 99.91 |
13 | Let.-r-6wk | 99.64 | 99.27 | 100.00 | 98.91 | 99.64 | 99.82 | 99.03 | 99.82 |
14 | Let.-r-7wk | 99.84 | 99.69 | 96.42 | 100.00 | 99.53 | 99.69 | 98.86 | 100.00 |
15 | Vinyard-u | 85.53 | 79.33 | 77.50 | 88.65 | 79.54 | 75.16 | 97.63 | 99.52 |
16 | Vinyard-v-t | 99.91 | 100.00 | 99.45 | 98.53 | 99.91 | 98.62 | 99.92 | 100.00 |
OA (%) | | 89.27 | 95.14 | 93.86 | 92.42 | 95.26 | 93.97 | 99.26 | 99.67 |
AA (%) | | 91.92 | 97.78 | 96.18 | 96.64 | 97.84 | 96.90 | 99.45 | 99.63 |
Kappa (%) | | 88.20 | 94.64 | 93.23 | 91.69 | 94.77 | 93.35 | 99.17 | 99.63 |
Test Time (s) | | 4.1 | 4.6 | 0.7 | 4.5 | 25.5 | 46.9 | 51.0 | 11.8 |
Ratio of Training Samples | Pavia Univ. OA (%) | Pavia Univ. AA (%) | Pavia Univ. Kappa (%) | KSC OA (%) | KSC AA (%) | KSC Kappa (%) | Salinas OA (%) | Salinas AA (%) | Salinas Kappa (%) |
---|---|---|---|---|---|---|---|---|---|
0.05 | 96.33 | 94.21 | 95.16 | 91.94 | 88.23 | 91.08 | 97.19 | 98.20 | 96.89 |
0.1 | 98.38 | 97.34 | 97.86 | 96.38 | 94.24 | 95.98 | 98.62 | 98.97 | 98.47 |
0.15 | 98.84 | 98.07 | 98.46 | 99.21 | 98.39 | 99.12 | 99.19 | 99.23 | 99.10 |
0.2 | 99.11 | 98.44 | 98.82 | 99.52 | 99.24 | 99.47 | 99.44 | 99.57 | 99.38 |
0.25 | 98.92 | 97.94 | 98.56 | 99.65 | 99.37 | 99.62 | 99.48 | 99.64 | 99.42 |
0.3 | 98.96 | 98.17 | 98.62 | 99.79 | 99.57 | 99.77 | 99.37 | 99.47 | 99.30 |
Neighborhood Size | Pavia Univ. OA (%) | Pavia Univ. AA (%) | Pavia Univ. Kappa (%) | KSC OA (%) | KSC AA (%) | KSC Kappa (%) | Salinas OA (%) | Salinas AA (%) | Salinas Kappa (%) |
---|---|---|---|---|---|---|---|---|---|
| 95.13 | 92.41 | 93.61 | 80.46 | 69.71 | 78.53 | 92.83 | 96.66 | 92.12 |
| 98.22 | 96.87 | 97.65 | 95.87 | 93.14 | 95.42 | 93.07 | 96.55 | 92.37 |
| 99.11 | 98.44 | 98.82 | 98.31 | 97.00 | 98.11 | 98.67 | 99.15 | 98.52 |
| 99.19 | 98.51 | 98.93 | 99.52 | 99.24 | 99.47 | 99.44 | 99.57 | 99.38 |
| 99.17 | 98.63 | 98.91 | 99.87 | 99.71 | 99.86 | 99.56 | 99.53 | 99.51 |
| 97.45 | 96.36 | 96.64 | 99.26 | 99.12 | 99.18 | 99.67 | 99.63 | 99.63 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).