MRFP-Mamba: Multi-Receptive Field Parallel Mamba for Hyperspectral Image Classification
Abstract
1. Introduction
- Spatial–spectral Decoupling in Mamba Variants: While these methods excel at modeling sequential spectral patterns (e.g., distinguishing subtle reflectance variations between vegetation species), they inadequately integrate spatial hierarchies—the multi-scale geometric and contextual relationships between pixels.
- Scale Sensitivity of Single-Receptive-Field Convolutions: Current HSI classification methods rely on fixed-size convolutional kernels or attention windows, which restrict their ability to capture multi-granular spatial features.
- We propose MRFP-Mamba, a hierarchical architecture that combines multi-receptive-field convolutional feature extraction with a parameter-optimized Vision Mamba, enabling adaptive capture of multi-scale spatial features and efficient modeling of global spectral dependencies.
- We introduce a multi-receptive-field convolutional module to extract hierarchical spatial features, addressing scale sensitivity issues in single-kernel convolutions. This module captures fine-grained details and coarse contextual information simultaneously, improving representation of multi-scale objects and spatial–spectral interactions.
- We design a parameter-optimized Vision Mamba branch that models long-range spectral dependencies across bands, enabling effective fusion of local spatial hierarchies and global spectral correlations.
- Extensive experiments on four HSI datasets demonstrate that MRFP-Mamba outperforms state-of-the-art HSI classification methods, achieving significant improvements in Overall Accuracy (OA).
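The contributions above pair multi-receptive-field local feature extraction with a linear-time Mamba-style scan. As a rough illustration only (not the authors' implementation), the toy Python sketch below mimics both ideas on a 1-D spectral vector: parallel branches with window sizes 1/3/5/7 stand in for the 1 × 1 / 3 × 3 / 5 × 5 / 7 × 7 convolution branches examined in the ablation study, and a scalar state-space recurrence stands in for the Vision Mamba branch. All function names and constants here are hypothetical.

```python
def multi_receptive_field(x, windows=(1, 3, 5, 7)):
    """Average x over several window sizes in parallel (zero padding at the
    borders), one feature channel per window -- a stand-in for parallel
    1x1/3x3/5x5/7x7 convolution branches."""
    n = len(x)
    features = []
    for w in windows:
        half = w // 2
        channel = []
        for i in range(n):
            lo, hi = max(0, i - half), min(n, i + half + 1)
            channel.append(sum(x[lo:hi]) / w)  # values outside [0, n) count as 0
        features.append(channel)
    return features  # len(windows) channels, each of length n

def ssm_scan(x, a=0.9, b=0.1, c=1.0):
    """Minimal state-space recurrence h_t = a*h_{t-1} + b*x_t, y_t = c*h_t:
    linear in sequence length, yet every output depends on all earlier bands."""
    h, ys = 0.0, []
    for xt in x:
        h = a * h + b * xt
        ys.append(c * h)
    return ys

spectrum = [float(i % 5) for i in range(16)]          # toy spectral vector
feats = multi_receptive_field(spectrum)               # 4 multi-scale channels
fused = [sum(col) / len(col) for col in zip(*feats)]  # naive channel fusion
out = ssm_scan(fused)                                 # global spectral modeling
print(len(feats), len(out))  # → 4 16
```

The window-1 branch passes the input through unchanged (like a 1 × 1 convolution), while wider windows blur in more context; concatenating them is the simplest form of the multi-scale fusion the paper argues for, and the scan keeps the global-dependency step O(n) rather than the O(n²) of self-attention.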
2. Related Works
2.1. Convolutional Neural Networks for Hyperspectral Image Classification
2.2. Transformer Networks for Hyperspectral Image Classification
2.3. Mamba Networks for Hyperspectral Image Classification
3. Proposed Methodology
3.1. Preliminary
3.2. Overview of MRFP-Mamba
3.3. Multi-Receptive Field Convolutional Feature Extraction
Algorithm 1: MRFP-Mamba Implementation Process
3.4. Parallel Mamba Structure for Long-Range Spatial–Spectral Dependency Modeling
3.4.1. Vision Mamba Parameter Impact Analysis
3.4.2. Parallel Vision Mamba Module
4. Experiments
4.1. Datasets
4.1.1. Indian Pines Dataset
4.1.2. Pavia University Dataset
4.1.3. Houston 2013 Dataset
4.1.4. WHU-Hi-LongKou Dataset
4.2. Experimental Setup
4.2.1. Implementation Details
4.2.2. Comparison with State-of-the-Art Backbone Methods
4.3. Results and Analysis
4.4. Ablation Studies
4.4.1. Ablation Study of the Input Size
4.4.2. Ablation Study of Different Modules
4.4.3. Ablation Study of the Number of Training Samples
4.5. Discussion
5. Conclusions and Future Work
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Gevaert, C.M.; Suomalainen, J.; Tang, J.; Kooistra, L. Generation of Spectral–Temporal Response Surfaces by Combining Multispectral Satellite and Hyperspectral UAV Imagery for Precision Agriculture Applications. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2015, 8, 3140–3146.
- Hestir, E.L.; Brando, V.E.; Bresciani, M.; Giardino, C.; Matta, E.; Villa, P.; Dekker, A.G. Measuring freshwater aquatic ecosystems: The need for a hyperspectral global mapping satellite mission. Remote Sens. Environ. 2015, 167, 181–195.
- Chen, F.; Wang, K.; Voorde, T.V.D.; Tang, T.F. Mapping urban land cover from high spatial resolution hyperspectral data: An approach based on simultaneously unmixing similar pixels with jointly sparse spectral mixture analysis. Remote Sens. Environ. 2017, 196, 324–342.
- Sun, L.; Wu, F.; Zhan, T.; Liu, W.; Wang, J.; Jeon, B. Weighted Nonlocal Low-Rank Tensor Decomposition Method for Sparse Unmixing of Hyperspectral Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 1174–1188.
- Wang, J.; Zhang, L.; Tong, Q.; Sun, X. The Spectral Crust project–Research on new mineral exploration technology. In Proceedings of the 2012 4th Workshop on Hyperspectral Image and Signal Processing: Evolution in Remote Sensing (WHISPERS), Shanghai, China, 4–7 June 2012; IEEE: Piscataway, NJ, USA, 2012; pp. 1–4.
- Zhang, Y.; Yan, S.; Jiang, X.; Zhang, L.; Cai, Z.; Li, J. Dual Graph Learning Affinity Propagation for Multimodal Remote Sensing Image Clustering. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5521713.
- Roy, S.K.; Krishna, G.; Dubey, S.R.; Chaudhuri, B.B. HybridSN: Exploring 3-D–2-D CNN Feature Hierarchy for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett. 2020, 17, 277–281.
- Pasolli, E.; Melgani, F.; Tuia, D.; Pacifici, F.; Emery, W.J. SVM active learning approach for image classification using spatial information. IEEE Trans. Geosci. Remote Sens. 2013, 52, 2217–2233.
- Ham, J.; Chen, Y.; Crawford, M.M.; Ghosh, J. Investigation of the random forest framework for classification of hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2005, 43, 492–501.
- Ma, L.; Crawford, M.M.; Tian, J. Local Manifold Learning-Based k-Nearest-Neighbor for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2010, 48, 4099–4109.
- Song, W.; Li, S.; Kang, X.; Huang, K. Hyperspectral image classification based on KNN sparse representation. In Proceedings of the 2016 IEEE International Geoscience and Remote Sensing Symposium (IGARSS), Beijing, China, 10–15 July 2016; IEEE: Piscataway, NJ, USA, 2016; pp. 2411–2414.
- Zhang, Y.; Wang, X.; Jiang, X.; Zhang, L.; Du, B. Elastic Graph Fusion Subspace Clustering for Large Hyperspectral Image. IEEE Trans. Circuits Syst. Video Technol. 2025, early access.
- Zhang, Y.; Jiang, G.; Cai, Z.; Zhou, Y. Bipartite Graph-based Projected Clustering with Local Region Guidance for Hyperspectral Imagery. IEEE Trans. Multimed. 2024, 26, 9551–9563.
- Sheykhmousa, M.; MahdianPari, M.; Ghanbari, H.; Mohammadimanesh, F.; Ghamisi, P.; Homayouni, S. Support Vector Machine Versus Random Forest for Remote Sensing Image Classification: A Meta-Analysis and Systematic Review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 6308–6325.
- Villa, A.; Benediktsson, J.A.; Chanussot, J.; Jutten, C. Hyperspectral Image Classification with Independent Component Discriminant Analysis. IEEE Trans. Geosci. Remote Sens. 2011, 49, 4865–4876.
- Liao, W.; Pizurica, A.; Scheunders, P.; Philips, W.; Pi, Y. Semisupervised Local Discriminant Analysis for Feature Extraction in Hyperspectral Images. IEEE Trans. Geosci. Remote Sens. 2013, 51, 184–198.
- Wang, Q.; Meng, Z.; Li, X. Locality Adaptive Discriminant Analysis for Spectral-Spatial Classification of Hyperspectral Images. IEEE Geosci. Remote Sens. Lett. 2017, 14, 2077–2081.
- Dai, J.; Qi, H.; Xiong, Y.; Li, Y.; Zhang, G.; Hu, H.; Wei, Y. Deformable Convolutional Networks. In Proceedings of the 2017 IEEE International Conference on Computer Vision (ICCV), Venice, Italy, 22–29 October 2017; IEEE Computer Society: Piscataway, NJ, USA, 2017; pp. 764–773.
- Zhao, C.; Zhu, W.; Feng, S. Superpixel Guided Deformable Convolution Network for Hyperspectral Image Classification. IEEE Trans. Image Process. 2022, 31, 3838–3851.
- Yang, X.; Cao, W.; Lu, Y.; Zhou, Y. Hyperspectral Image Transformer Classification Networks. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5528715.
- Ahmad, M.; Khan, A.M.; Mazzara, M.; Distefano, S.; Ali, M.; Sarfraz, M.S. A Fast and Compact 3-D CNN for Hyperspectral Image Classification. IEEE Geosci. Remote Sens. Lett. 2022, 19, 5502205.
- He, X.; Chen, Y.; Lin, Z. Spatial-Spectral Transformer for Hyperspectral Image Classification. Remote Sens. 2021, 13, 498.
- Graham, B.; El-Nouby, A.; Touvron, H.; Stock, P.; Joulin, A.; Jégou, H.; Douze, M. LeViT: A Vision Transformer in ConvNet's Clothing for Faster Inference. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 12259–12269.
- Mei, S.; Song, C.; Ma, M.; Xu, F. Hyperspectral image classification using group-aware hierarchical transformer. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5539014.
- Xu, Y.; Wang, D.; Zhang, L.; Zhang, L. Dual selective fusion transformer network for hyperspectral image classification. Neural Networks 2025, 187, 107311.
- Cheng, S.; Chan, R.; Du, A. CACFTNet: A Hybrid Cov-Attention and Cross-Layer Fusion Transformer Network for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2024, 62, 1–17.
- Yao, J.; Hong, D.; Li, C.; Chanussot, J. SpectralMamba: Efficient Mamba for Hyperspectral Image Classification. arXiv 2024, arXiv:2404.08489.
- Lee, H.; Kwon, H. Going Deeper with Contextual CNN for Hyperspectral Image Classification. IEEE Trans. Image Process. 2017, 26, 4843–4855.
- Cao, X.; Zhou, F.; Xu, L.; Meng, D.; Xu, Z.; Paisley, J.W. Hyperspectral Image Classification with Markov Random Fields and a Convolutional Neural Network. IEEE Trans. Image Process. 2018, 27, 2354–2367.
- Wang, Z.; Chen, B.; Lu, R.; Zhang, H.; Liu, H.; Varshney, P.K. FusionNet: An Unsupervised Convolutional Variational Network for Hyperspectral and Multispectral Image Fusion. IEEE Trans. Image Process. 2020, 29, 7565–7577.
- Li, Y.; Zhang, H.; Shen, Q. Spectral-Spatial Classification of Hyperspectral Imagery with 3D Convolutional Neural Network. Remote Sens. 2017, 9, 67.
- Zhang, Y.; Yan, S.; Zhang, L.; Du, B. Fast Projected Fuzzy Clustering with Anchor Guidance for Multimodal Remote Sensing Imagery. IEEE Trans. Image Process. 2024, 33, 4640–4653.
- Hang, R.; Liu, Q.; Hong, D.; Ghamisi, P. Cascaded recurrent neural networks for hyperspectral image classification. IEEE Trans. Geosci. Remote Sens. 2019, 57, 5384–5394.
- Mei, X.; Pan, E.; Ma, Y.; Dai, X.; Huang, J.; Fan, F.; Du, Q.; Zheng, H.; Ma, J. Spectral-spatial attention networks for hyperspectral image classification. Remote Sens. 2019, 11, 963.
- Yang, X.; Cao, W.; Tang, D.; Zhou, Y.; Lu, Y. ACTN: Adaptive Coupling Transformer Network for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2025, 63, 5503115.
- Xu, Y.; Du, B.; Zhang, L. Self-Attention Context Network: Addressing the Threat of Adversarial Attacks for Hyperspectral Image Classification. IEEE Trans. Image Process. 2021, 30, 8671–8685.
- Hong, D.; Han, Z.; Yao, J.; Gao, L.; Zhang, B.; Plaza, A.; Chanussot, J. SpectralFormer: Rethinking Hyperspectral Image Classification With Transformers. IEEE Trans. Geosci. Remote Sens. 2022, 60, 5518615.
- Ouyang, E.; Li, B.; Hu, W.; Zhang, G.; Zhao, L.; Wu, J. When Multigranularity Meets Spatial-Spectral Attention: A Hybrid Transformer for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 1–18.
- Xu, Y.; Xie, Y.; Li, B.; Xie, C.; Zhang, Y.; Wang, A.; Zhu, L. Spatial-Spectral 1DSwin Transformer with Groupwise Feature Tokenization for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5516616.
- Wu, H.; Xiao, B.; Codella, N.; Liu, M.; Dai, X.; Yuan, L.; Zhang, L. CvT: Introducing Convolutions to Vision Transformers. In Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), Montreal, QC, Canada, 10–17 October 2021; IEEE: Piscataway, NJ, USA, 2021; pp. 22–31.
- Zhou, Y.; Huang, X.; Yang, X.; Peng, J.; Ban, Y. DCTN: Dual-Branch Convolutional Transformer Network with Efficient Interactive Self-Attention for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5508616.
- Gu, A.; Dao, T. Mamba: Linear-Time Sequence Modeling with Selective State Spaces. arXiv 2023, arXiv:2312.00752.
- Lu, S.; Zhang, M.; Huo, Y.; Wang, C.; Wang, J.; Gao, C. SSUM: Spatial–Spectral Unified Mamba for Hyperspectral Image Classification. Remote Sens. 2024, 16, 4653.
- He, Y.; Tu, B.; Liu, B.; Li, J.; Plaza, A. 3DSS-Mamba: 3D-Spectral-Spatial Mamba for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5534216.
- Wu, R.; Liu, Y.; Liang, P.; Chang, Q. UltraLight VM-UNet: Parallel Vision Mamba Significantly Reduces Parameters for Skin Lesion Segmentation. arXiv 2024, arXiv:2403.20035.
- Liu, B.; Yu, X.; Zhang, P.; Tan, X.; Yu, A.; Xue, Z. A semi-supervised convolutional neural network for hyperspectral image classification. Remote Sens. Lett. 2017, 8, 839–848.
- Sharma, V.; Diba, A.; Tuytelaars, T.; Van Gool, L. Hyperspectral CNN for Image Classification & Band Selection, with Application to Face Recognition; Technical Report KUL/ESAT/PSI/1604; KU Leuven, ESAT: Leuven, Belgium, 2016.
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S.; et al. An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale. arXiv 2021, arXiv:2010.11929.
- Heo, B.; Yun, S.; Han, D.; Chun, S.; Choe, J.; Oh, S.J. Rethinking spatial dimensions of vision transformers. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Montreal, QC, Canada, 11–17 October 2021; pp. 11936–11945.
- Li, Y.; Luo, Y.; Zhang, L.; Wang, Z.; Du, B. MambaHSI: Spatial–Spectral Mamba for Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2024, 62, 5524216.
Class No. | Class Name | Training | Testing |
---|---|---|---|
1 | Alfalfa | 5 | 41 |
2 | Corn-notill | 143 | 285 |
3 | Corn-mintill | 83 | 747 |
4 | Corn | 24 | 213 |
5 | Grass-pasture | 48 | 435 |
6 | Grass-trees | 73 | 657 |
7 | Grass-pasture-mowed | 3 | 25 |
8 | Hay-windrowed | 48 | 430 |
9 | Oats | 2 | 18 |
10 | Soybean-notill | 97 | 875 |
11 | Soybean-mintill | 123 | 1112 |
12 | Soybean-clean | 59 | 534 |
13 | Wheat | 20 | 185 |
14 | Woods | 126 | 1139 |
15 | Buildings-Grass-Trees-Drives | 39 | 347 |
16 | Stone-Steel-Towers | 6 | 84 |
Total | | 1024 | 9225 |
Class No. | Class Name | Training | Testing |
---|---|---|---|
1 | Asphalt | 66 | 6565 |
2 | Meadows | 186 | 18,463 |
3 | Gravel | 20 | 2079 |
4 | Trees | 30 | 3034 |
5 | Painted metal sheets | 13 | 1332 |
6 | Bare Soil | 50 | 4979 |
7 | Bitumen | 13 | 1317 |
8 | Self-Blocking Bricks | 36 | 3646 |
9 | Shadows | 9 | 938 |
Total | | 423 | 42,353 |
Class No. | Class Name | Training | Testing |
---|---|---|---|
1 | Healthy Grass | 125 | 126 |
2 | Stressed Grass | 125 | 129 |
3 | Synthetic Grass | 70 | 627 |
4 | Trees | 124 | 120 |
5 | Soil | 124 | 118 |
6 | Water | 33 | 292 |
7 | Residential | 127 | 1141 |
8 | Commercial | 124 | 120 |
9 | Road | 125 | 1127 |
10 | Highway | 123 | 1104 |
11 | Railway | 123 | 1112 |
12 | Parking Lot 1 | 123 | 123 |
13 | Parking Lot 2 | 47 | 422 |
14 | Tennis Court | 43 | 385 |
15 | Running Track | 66 | 594 |
Total | | 1502 | 13,527 |
Class No. | Class Name | Training | Testing |
---|---|---|---|
1 | Corn | 172 | 34,339 |
2 | Cotton | 41 | 8333 |
3 | Sesame | 15 | 3016 |
4 | Broad-leaf soybean | 316 | 62,896 |
5 | Narrow-leaf soybean | 20 | 4131 |
6 | Rice | 59 | 1795 |
7 | Water | 335 | 66,721 |
8 | Roads and houses | 35 | 7089 |
9 | Mixed weed | 26 | 5203 |
Total | | 1019 | 203,523 |
Methods grouped as CNNs (2D-CNN, 3D-CNN), Transformers (ViT, Deep-ViT, HiT, SSFTT, GAHT, DCTN), and Mambas (MambaHSI, 3DSS-Mamba, Ours).
Class No. | 2D-CNN | 3D-CNN | ViT | Deep-ViT | HiT | SSFTT | GAHT | DCTN | MambaHSI | 3DSS-Mamba | Ours |
---|---|---|---|---|---|---|---|---|---|---|---|
1 | 95.10 ± 3.52 | 11.46 ± 16.22 | 78.78 ± 16.65 | 9.27 ± 15.03 | 4.88 ± 8.09 | 89.62 ± 6.89 | 88.29 ± 7.30 | 87.07 ± 11.75 | 71.71 ± 17.49 | 86.59 ± 9.27 | 82.44 ± 13.22 |
2 | 91.56 ± 1.98 | 89.75 ± 2.74 | 94.21 ± 3.36 | 67.27 ± 12.07 | 90.67 ± 2.73 | 94.13 ± 1.09 | 92.67 ± 1.99 | 93.58 ± 3.20 | 92.45 ± 3.44 | 91.61 ± 2.80 | 93.25 ± 3.00 |
3 | 92.08 ± 1.56 | 85.98 ± 4.70 | 95.34 ± 3.90 | 50.04 ± 12.45 | 80.17 ± 7.87 | 90.10 ± 2.66 | 94.39 ± 4.27 | 96.14 ± 3.08 | 92.90 ± 3.03 | 95.77 ± 3.18 | 96.87 ± 3.19 |
4 | 97.94 ± 1.46 | 68.08 ± 10.32 | 96.48 ± 2.90 | 76.38 ± 17.90 | 87.37 ± 7.39 | 94.93 ± 3.45 | 94.27 ± 5.55 | 96.95 ± 3.37 | 85.07 ± 3.62 | 95.49 ± 2.84 | 94.74 ± 3.60 |
5 | 93.09 ± 3.34 | 86.07 ± 5.50 | 94.00 ± 2.93 | 40.00 ± 11.97 | 78.46 ± 7.07 | 93.09 ± 2.48 | 93.20 ± 3.54 | 94.83 ± 1.40 | 93.47 ± 1.37 | 95.13 ± 2.62 | 89.56 ± 4.38 |
6 | 95.67 ± 2.98 | 93.14 ± 1.47 | 97.17 ± 0.94 | 90.12 ± 4.25 | 93.93 ± 1.20 | 95.98 ± 1.25 | 96.21 ± 0.86 | 96.16 ± 1.02 | 95.25 ± 2.26 | 97.23 ± 1.21 | 95.80 ± 2.20 |
7 | 7.93 ± 19.30 | 0.00 ± 0.00 | 62.80 ± 25.43 | 0.00 ± 0.00 | 0.00 ± 0.00 | 54.65 ± 35.88 | 27.20 ± 28.78 | 46.40 ± 40.37 | 81.60 ± 22.57 | 50.00 ± 30.79 | 92.00 ± 8.39 |
8 | 99.69 ± 0.45 | 99.67 ± 0.68 | 100 ± 0.00 | 97.07 ± 1.33 | 99.74 ± 0.69 | 98.74 ± 1.40 | 100 ± 0.00 | 100.00 ± 0.00 | 98.56 ± 0.77 | 100.00 ± 0.00 | 97.26 ± 2.23 |
9 | 73.35 ± 29.10 | 0.00 ± 0.00 | 11.11 ± 11.65 | 0.00 ± 0.00 | 0.00 ± 0.00 | 14.35 ± 32.06 | 10.56 ± 19.95 | 3.33 ± 5.67 | 45.56 ± 12.86 | 73.89 ± 19.25 | 53.33 ± 20.67 |
10 | 87.75 ± 1.58 | 76.34 ± 1.67 | 79.65 ± 9.39 | 62.19 ± 10.05 | 75.43 ± 3.53 | 87.12 ± 1.76 | 81.13 ± 2.27 | 82.86 ± 2.39 | 94.74 ± 1.04 | 84.83 ± 5.98 | 92.57 ± 4.84 |
11 | 96.23 ± 1.26 | 95.69 ± 2.03 | 89.18 ± 19.02 | 89.67 ± 4.14 | 95.86 ± 1.70 | 97.68 ± 0.83 | 98.11 ± 1.03 | 98.16 ± 1.32 | 96.90 ± 1.08 | 97.61 ± 1.67 | 97.04 ± 1.90 |
12 | 91.75 ± 2.23 | 88.91 ± 6.05 | 90.11 ± 7.70 | 59.31 ± 16.66 | 88.97 ± 3.96 | 89.52 ± 3.33 | 93.75 ± 3.15 | 89.53 ± 8.68 | 88.20 ± 3.45 | 92.30 ± 2.85 | 86.03 ± 4.20 |
13 | 98.12 ± 1.25 | 80.11 ± 12.35 | 96.00 ± 2.18 | 79.03 ± 0.07 | 93.03 ± 3.61 | 95.02 ± 3.61 | 87.03 ± 9.35 | 94.49 ± 4.04 | 90.70 ± 6.48 | 90.92 ± 6.69 | 91.89 ± 6.84 |
14 | 98.25 ± 2.46 | 98.93 ± 0.76 | 99.75 ± 0.19 | 96.71 ± 3.23 | 99.44 ± 0.60 | 98.67 ± 0.64 | 99.46 ± 0.55 | 99.60 ± 0.28 | 98.00 ± 1.37 | 99.28 ± 0.52 | 98.02 ± 1.09 |
15 | 97.82 ± 1.45 | 77.84 ± 6.81 | 91.67 ± 4.58 | 74.47 ± 15.57 | 86.22 ± 12.60 | 96.10 ± 2.79 | 92.77 ± 5.91 | 92.51 ± 8.54 | 89.80 ± 3.54 | 93.63 ± 4.09 | 90.32 ± 3.56 |
16 | 51.99 ± 22.01 | 8.81 ± 16.07 | 30.36 ± 20.06 | 11.55 ± 18.72 | 8.81 ± 17.82 | 39.23 ± 32.33 | 31.90 ± 16.30 | 38.93 ± 25.76 | 85.71 ± 2.71 | 44.40 ± 20.75 | 86.90 ± 6.16 |
OA (%) | 94.28 ± 1.41 | 88.57 ± 0.78 | 91.73 ± 5.38 | 75.30 ± 5.40 | 88.94 ± 1.42 | 94.09 ± 0.97 | 93.56 ± 0.72 | 94.14 ± 0.63 | 94.16 ± 1.59 | 94.24 ± 0.59 | 94.46 ± 1.33 |
AA (%) | 85.52 ± 4.26 | 66.30 ± 2.22 | 81.66 ± 3.03 | 56.44 ± 5.88 | 67.69 ± 1.95 | 83.06 ± 2.15 | 80.06 ± 2.46 | 81.91 ± 3.57 | 87.54 ± 4.11 | 86.79 ± 3.16 | 89.88 ± 2.54 |
κ (%) | 93.70 ± 1.65 | 86.88 ± 0.90 | 90.61 ± 6.02 | 71.34 ± 6.42 | 87.32 ± 1.64 | 93.26 ± 0.12 | 92.64 ± 0.82 | 93.31 ± 0.73 | 93.34 ± 1.81 | 93.42 ± 0.68 | 93.67 ± 1.53 |
Class No. | 2D-CNN | 3D-CNN | ViT | Deep-ViT | HiT | SSFTT | GAHT | DCTN | MambaHSI | 3DSS-Mamba | Ours |
---|---|---|---|---|---|---|---|---|---|---|---|
1 | 93.04 ± 2.35 | 91.80 ± 4.64 | 73.71 ± 4.03 | 91.86 ± 5.58 | 97.13 ± 0.93 | 96.31 ± 2.85 | 96.47 ± 1.68 | 96.05 ± 1.55 | 96.24 ± 2.31 | 97.23 ± 1.50 | 97.19 ± 1.52 |
2 | 99.21 ± 0.45 | 97.73 ± 1.50 | 95.26 ± 2.42 | 44.13 ± 9.52 | 98.95 ± 0.53 | 99.80 ± 0.18 | 99.48 ± 0.43 | 99.58 ± 0.70 | 99.96 ± 0.04 | 99.76 ± 0.19 | 99.83 ± 0.19 |
3 | 37.26 ± 17.15 | 35.90 ± 16.58 | 27.47 ± 19.09 | 72.48 ± 8.67 | 87.94 ± 3.03 | 80.55 ± 20.82 | 89.03 ± 6.12 | 88.02 ± 6.48 | 90.37 ± 5.17 | 88.16 ± 5.40 | 91.58 ± 6.04 |
4 | 88.83 ± 3.44 | 74.78 ± 11.96 | 22.70 ± 10.64 | 98.77 ± 1.00 | 87.13 ± 2.74 | 88.73 ± 3.29 | 87.26 ± 3.89 | 86.47 ± 11.25 | 88.69 ± 2.53 | 88.23 ± 2.91 | 91.51 ± 2.23 |
5 | 99.86 ± 0.34 | 92.30 ± 12.75 | 97.18 ± 2.15 | 70.22 ± 19.06 | 100 ± 0.00 | 99.73 ± 0.53 | 99.50 ± 0.73 | 99.91 ± 0.18 | 99.98 ± 0.05 | 98.41 ± 2.59 | 99.88 ± 0.20 |
6 | 84.09 ± 14.11 | 79.47 ± 5.72 | 26.28 ± 9.46 | 31.47 ± 13.30 | 96.29 ± 1.07 | 100 ± 0.00 | 99.76 ± 0.23 | 99.42 ± 0.86 | 99.60 ± 0.32 | 99.95 ± 0.15 | 99.93 ± 0.18 |
7 | 39.07 ± 13.26 | 42.41 ± 7.49 | 19.65 ± 12.40 | 51.84 ± 28.80 | 97.44 ± 2.63 | 91.09 ± 10.69 | 93.02 ± 8.96 | 90.73 ± 10.92 | 90.10 ± 5.33 | 97.98 ± 2.72 | 98.90 ± 1.32 |
8 | 84.41 ± 15.87 | 50.72 ± 32.72 | 73.58 ± 3.53 | 42.04 ± 15.61 | 96.53 ± 1.54 | 97.77 ± 1.95 | 98.12 ± 1.46 | 97.06 ± 2.05 | 97.53 ± 1.38 | 98.05 ± 1.74 | 98.35 ± 1.12 |
9 | 56.89 ± 17.69 | 27.75 ± 19.52 | 4.79 ± 7.80 | 78.44 ± 2.24 | 73.97 ± 2.80 | 69.24 ± 8.30 | 59.47 ± 9.66 | 72.86 ± 5.87 | 74.65 ± 3.24 | 73.06 ± 6.06 | 72.76 ± 8.43 |
OA (%) | 88.63 ± 3.41 | 82.50 ± 3.62 | 69.13 ± 1.62 | 65.40 ± 2.36 | 96.19 ± 0.30 | 96.42 ± 1.52 | 96.46 ± 0.73 | 96.44 ± 1.70 | 96.99 ± 0.52 | 97.16 ± 0.37 | 97.68 ± 0.48 |
AA (%) | 75.85 ± 6.01 | 65.87 ± 5.40 | 48.96 ± 3.96 | 71.03 ± 2.70 | 92.82 ± 0.53 | 91.47 ± 4.02 | 91.35 ± 1.59 | 92.23 ± 3.02 | 93.01 ± 1.00 | 93.43 ± 0.94 | 94.44 ± 1.54 |
κ (%) | 84.67 ± 4.84 | 76.30 ± 4.94 | 56.53 ± 2.70 | 91.86 ± 5.58 | 94.94 ± 0.41 | 95.25 ± 2.03 | 95.30 ± 0.98 | 95.26 ± 2.29 | 96.00 ± 0.69 | 96.23 ± 0.50 | 96.92 ± 0.64 |
Class No. | 2D-CNN | 3D-CNN | ViT | Deep-ViT | HiT | SSFTT | GAHT | DCTN | MambaHSI | 3DSS-Mamba | Ours |
---|---|---|---|---|---|---|---|---|---|---|---|
1 | 95.20 ± 4.66 | 91.29 ± 5.38 | 87.20 ± 5.85 | 85.37 ± 5.00 | 89.44 ± 2.43 | 98.06 ± 1.15 | 96.94 ± 2.55 | 98.06 ± 1.15 | 96.77 ± 1.94 | 98.14 ± 1.55 | 98.08 ± 1.71 |
2 | 98.29 ± 1.26 | 92.12 ± 6.99 | 85.67 ± 5.45 | 89.79 ± 6.56 | 95.18 ± 1.02 | 96.18 ± 1.73 | 95.03 ± 1.75 | 96.18 ± 1.73 | 96.38 ± 1.92 | 95.97 ± 2.14 | 99.13 ± 0.86 |
3 | 99.74 ± 0.46 | 94.68 ± 3.26 | 98.41 ± 0.77 | 98.56 ± 0.95 | 97.75 ± 0.43 | 97.11 ± 1.07 | 97.02 ± 1.00 | 97.11 ± 1.07 | 98.79 ± 0.45 | 97.18 ± 0.51 | 99.49 ± 0.27 |
4 | 97.26 ± 2.47 | 93.31 ± 9.21 | 65.51 ± 6.81 | 70.55 ± 7.48 | 83.57 ± 4.52 | 96.71 ± 1.51 | 96.01 ± 2.01 | 96.71 ± 1.51 | 95.23 ± 2.03 | 97.51 ± 1.56 | 97.73 ± 0.92 |
5 | 99.53 ± 0.74 | 98.17 ± 1.72 | 98.80 ± 0.64 | 99.55 ± 0.50 | 99.75 ± 0.19 | 99.97 ± 0.04 | 99.94 ± 0.11 | 99.97 ± 0.04 | 99.94 ± 0.11 | 99.95 ± 0.11 | 99.61 ± 0.31 |
6 | 97.46 ± 2.34 | 87.21 ± 8.12 | 66.58 ± 6.82 | 78.22 ± 6.49 | 76.61 ± 2.78 | 89.38 ± 5.34 | 89.55 ± 1.52 | 89.38 ± 5.34 | 90.31 ± 3.44 | 94.32 ± 2.82 | 99.73 ± 0.26 |
7 | 93.04 ± 2.26 | 91.46 ± 3.55 | 70.21 ± 4.24 | 85.85 ± 4.60 | 77.59 ± 5.33 | 97.91 ± 2.15 | 98.36 ± 0.85 | 97.91 ± 2.15 | 98.31 ± 1.02 | 97.94 ± 1.80 | 95.21 ± 1.74 |
8 | 81.36 ± 3.99 | 84.41 ± 7.90 | 61.80 ± 4.02 | 79.27 ± 5.74 | 84.96 ± 2.55 | 96.18 ± 2.24 | 98.01 ± 1.43 | 96.18 ± 2.24 | 98.51 ± 0.85 | 96.75 ± 2.35 | 96.71 ± 1.33 |
9 | 90.02 ± 3.41 | 87.75 ± 3.82 | 65.19 ± 8.48 | 75.18 ± 3.30 | 79.47 ± 2.40 | 98.76 ± 0.89 | 98.00 ± 1.20 | 98.76 ± 0.89 | 98.19 ± 0.89 | 97.91 ± 1.75 | 97.18 ± 1.88 |
10 | 96.52 ± 1.51 | 82.10 ± 5.10 | 80.54 ± 6.08 | 96.29 ± 1.06 | 97.22 ± 2.33 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 99.92 ± 0.24 | 99.97 ± 0.08 | 99.86 ± 0.25 |
11 | 92.77 ± 2.16 | 85.91 ± 4.62 | 57.72 ± 8.01 | 77.74 ± 10.58 | 89.87 ± 3.52 | 99.69 ± 0.83 | 99.72 ± 0.84 | 99.69 ± 0.83 | 99.89 ± 0.24 | 99.29 ± 0.88 | 98.36 ± 1.12 |
12 | 91.19 ± 3.69 | 85.33 ± 5.86 | 65.55 ± 2.87 | 91.98 ± 2.55 | 96.10 ± 0.94 | 97.20 ± 1.23 | 98.39 ± 0.62 | 97.20 ± 1.23 | 98.93 ± 0.60 | 97.80 ± 1.54 | 98.90 ± 0.48 |
13 | 97.67 ± 2.03 | 86.81 ± 6.28 | 45.31 ± 20.56 | 78.72 ± 10.51 | 84.91 ± 5.79 | 96.16 ± 3.36 | 95.57 ± 4.28 | 96.16 ± 3.36 | 97.42 ± 2.08 | 97.42 ± 2.12 | 97.16 ± 1.04 |
14 | 100.00 ± 0.00 | 91.65 ± 6.35 | 89.40 ± 3.56 | 98.52 ± 1.12 | 99.95 ± 0.10 | 100.00 ± 0.00 | 99.97 ± 0.08 | 100.00 ± 0.00 | 100.00 ± 0.00 | 100.00 ± 0.00 | 99.38 ± 0.53 |
15 | 100.00 ± 0.00 | 96.18 ± 2.67 | 74.95 ± 2.22 | 95.15 ± 3.47 | 99.68 ± 0.37 | 99.95 ± 0.15 | 100.00 ± 0.00 | 99.95 ± 0.15 | 100.00 ± 0.00 | 99.95 ± 0.15 | 99.80 ± 0.27 |
OA (%) | 94.47 ± 0.82 | 89.13 ± 2.12 | 74.41 ± 1.75 | 86.23 ± 1.39 | 90.02 ± 1.15 | 97.74 ± 0.46 | 97.87 ± 0.46 | 97.91 ± 0.42 | 97.86 ± 0.34 | 98.10 ± 0.41 | 98.26 ± 0.26 |
AA (%) | 95.34 ± 0.77 | 89.89 ± 1.74 | 74.19 ± 2.19 | 86.72 ± 1.06 | 90.14 ± 1.11 | 97.91 ± 0.42 | 97.50 ± 0.51 | 97.55 ± 0.54 | 97.90 ± 0.39 | 98.01 ± 0.36 | 98.42 ± 0.23 |
κ (%) | 94.23 ± 1.92 | 88.24 ± 2.29 | 72.31 ± 1.91 | 85.11 ± 1.50 | 89.20 ± 1.24 | 97.55 ± 0.54 | 97.69 ± 0.50 | 97.74 ± 0.46 | 98.01 ± 0.37 | 97.95 ± 0.44 | 98.11 ± 0.28 |
Class No. | 2D-CNN | 3D-CNN | ViT | Deep-ViT | HiT | SSFTT | GAHT | DCTN | MambaHSI | 3DSS-Mamba | Ours |
---|---|---|---|---|---|---|---|---|---|---|---|
1 | 99.88 ± 0.02 | 98.37 ± 0.37 | 98.00 ± 0.50 | 98.71 ± 0.61 | 99.37 ± 0.36 | 99.77 ± 0.18 | 99.71 ± 0.10 | 99.90 ± 0.06 | 99.56 ± 0.28 | 99.86 ± 0.05 | 99.88 ± 0.04 |
2 | 99.73 ± 0.09 | 90.40 ± 5.37 | 91.54 ± 4.61 | 85.55 ± 7.66 | 91.48 ± 4.68 | 97.72 ± 0.83 | 97.82 ± 1.16 | 99.00 ± 0.78 | 96.65 ± 2.01 | 98.54 ± 1.56 | 99.48 ± 0.23 |
3 | 94.96 ± 1.26 | 1.00 ± 2.84 | 0.13 ± 0.38 | 65.74 ± 20.15 | 74.22 ± 12.89 | 96.41 ± 2.95 | 89.61 ± 3.43 | 99.05 ± 0.57 | 95.38 ± 2.80 | 95.43 ± 4.01 | 95.86 ± 1.45 |
4 | 99.12 ± 0.07 | 99.69 ± 0.19 | 98.21 ± 0.81 | 99.37 ± 0.22 | 99.59 ± 0.17 | 99.55 ± 0.18 | 99.83 ± 0.07 | 99.82 ± 0.11 | 99.77 ± 0.20 | 99.76 ± 0.11 | 99.49 ± 0.11 |
5 | 95.43 ± 0.72 | 68.09 ± 5.45 | 22.02 ± 6.25 | 58.82 ± 19.46 | 68.03 ± 12.74 | 89.49 ± 4.05 | 85.29 ± 2.93 | 97.14 ± 1.84 | 88.41 ± 3.99 | 91.08 ± 4.06 | 98.06 ± 0.92 |
6 | 98.56 ± 0.20 | 96.57 ± 2.16 | 91.60 ± 2.96 | 97.02 ± 2.81 | 98.11 ± 1.89 | 98.51 ± 1.01 | 98.60 ± 0.52 | 98.39 ± 0.66 | 97.47 ± 1.71 | 99.25 ± 0.24 | 98.09 ± 0.56 |
7 | 99.74 ± 0.05 | 99.93 ± 0.08 | 99.96 ± 0.03 | 99.84 ± 0.24 | 99.99 ± 0.00 | 99.85 ± 0.10 | 99.94 ± 0.05 | 99.94 ± 0.08 | 99.87 ± 0.10 | 99.98 ± 0.01 | 99.78 ± 0.10 |
8 | 85.56 ± 0.79 | 66.97 ± 8.10 | 71.64 ± 6.45 | 69.03 ± 11.48 | 81.13 ± 7.50 | 83.46 ± 5.58 | 91.46 ± 3.47 | 89.59 ± 5.46 | 83.93 ± 5.25 | 88.65 ± 3.06 | 92.83 ± 3.88 |
9 | 78.58 ± 2.05 | 67.57 ± 8.36 | 68.68 ± 7.22 | 75.84 ± 9.50 | 71.19 ± 4.74 | 80.82 ± 8.07 | 84.14 ± 5.38 | 77.41 ± 6.49 | 76.49 ± 9.90 | 78.68 ± 8.07 | 90.77 ± 2.17 |
OA (%) | 98.35 ± 0.09 | 94.92 ± 0.54 | 93.41 ± 0.37 | 95.73 ± 0.67 | 96.88 ± 0.49 | 98.26 ± 0.32 | 98.55 ± 0.20 | 98.76 ± 0.21 | 98.06 ± 0.31 | 98.61 ± 0.25 | 99.03 ± 0.22 |
AA (%) | 94.61 ± 0.51 | 76.51 ± 2.10 | 71.31 ± 1.09 | 83.32 ± 3.43 | 87.01 ± 2.57 | 93.95 ± 1.40 | 94.04 ± 0.69 | 95.58 ± 0.70 | 93.06 ± 1.40 | 94.58 ± 1.03 | 97.14 ± 0.90 |
κ (%) | 97.81 ± 0.12 | 93.22 ± 0.73 | 91.25 ± 0.49 | 94.35 ± 0.89 | 95.88 ± 0.65 | 97.71 ± 0.42 | 98.09 ± 0.26 | 98.37 ± 0.27 | 97.45 ± 0.42 | 98.15 ± 0.34 | 98.73 ± 0.28 |
Methods (Indian Pines) | OA | AA | κ | Methods (Houston 2013) | OA | AA | κ |
---|---|---|---|---|---|---|---|
SSFTT | 90.11 ± 1.31 | 67.88 ± 1.79 | 88.65 ± 1.51 | SSFTT | 93.97 ± 0.57 | 93.53 ± 0.57 | 93.47 ± 0.62 |
DCTN | 90.04 ± 1.05 | 68.93 ± 2.38 | 88.60 ± 1.21 | DCTN | 95.53 ± 0.54 | 95.10 ± 0.55 | 95.17 ± 0.58 |
MambaHSI | 88.36 ± 1.25 | 72.69 ± 1.56 | 86.45 ± 1.47 | MambaHSI | 95.06 ± 0.35 | 94.86 ± 0.89 | 94.26 ± 0.45 |
3DSS-Mamba | 89.23 ± 1.07 | 75.58 ± 1.72 | 87.72 ± 1.21 | 3DSS-Mamba | 95.65 ± 0.40 | 95.59 ± 0.48 | 95.30 ± 0.43 |
Ours | 92.18 ± 0.93 | 77.61 ± 2.16 | 91.07 ± 1.06 | Ours | 96.55 ± 0.45 | 96.20 ± 0.38 | 95.93 ± 0.49 |
Methods | FLOPs (G) | Param (MB) | Training Time (s) | Testing Time (s) | OA (%) | AA (%) | κ (%) |
---|---|---|---|---|---|---|---|
2D-CNN | 0.06 | 1.67 | 34.69 | 1.78 | 94.28 ± 1.41 | 85.52 ± 4.26 | 93.70 ± 1.65 |
ViT | 0.34 | 6.54 | 52.23 | 2.78 | 91.73 ± 5.38 | 81.66 ± 3.03 | 90.61 ± 6.02 |
GAHT | 0.31 | 0.68 | 46.79 | 2.84 | 93.56 ± 0.72 | 80.06 ± 2.46 | 92.64 ± 0.82 |
SSFTT | 0.24 | 0.94 | 45.12 | 2.46 | 94.09 ± 0.97 | 83.06 ± 2.15 | 93.26 ± 0.12 |
DCTN | 2.95 | 53.94 | 441.56 | 7.93 | 94.14 ± 0.63 | 81.91 ± 3.57 | 93.31 ± 0.73 |
MambaHSI | 0.01 | 0.42 | 360.26 | 6.75 | 94.16 ± 1.59 | 87.54 ± 4.11 | 93.34 ± 1.81 |
3DSS-Mamba | 0.01 | 0.43 | 356.45 | 7.68 | 94.24 ± 0.59 | 86.79 ± 3.16 | 93.42 ± 0.68 |
Ours | 0.73 | 1.64 | 80.36 | 3.86 | 94.46 ± 1.33 | 89.88 ± 2.54 | 93.67 ± 1.53 |
Sizes | OA (Indian Pines) | κ (Indian Pines) | OA (PaviaU) | κ (PaviaU) | OA (Houston 2013) | κ (Houston 2013) | OA (WHU-Hi-LongKou) | κ (WHU-Hi-LongKou) |
---|---|---|---|---|---|---|---|---|
7 × 7 | 97.99 ± 0.46 | 97.71 ± 0.53 | 98.97 ± 1.31 | 98.81 ± 1.41 | 98.41 ± 0.54 | 97.76 ± 0.71 | 99.63 ± 0.08 | 99.51 ± 0.11 |
9 × 9 | 97.50 ± 0.95 | 97.15 ± 1.08 | 98.78 ± 0.25 | 98.69 ± 0.27 | 98.25 ± 0.96 | 97.55 ± 1.27 | 99.58 ± 0.07 | 99.44 ± 0.09 |
11 × 11 | 97.26 ± 0.38 | 96.87 ± 0.43 | 98.46 ± 0.21 | 98.40 ± 0.35 | 97.89 ± 1.70 | 97.44 ± 2.27 | 99.35 ± 0.03 | 99.21 ± 0.15 |
13 × 13 | 95.71 ± 0.38 | 95.11 ± 0.43 | 98.39 ± 0.34 | 98.21 ± 0.37 | 97.80 ± 1.11 | 97.02 ± 1.47 | 99.19 ± 0.13 | 98.93 ± 0.17 |
15 × 15 | 94.46 ± 1.33 | 93.67 ± 1.53 | 98.26 ± 0.26 | 98.11 ± 0.28 | 97.68 ± 0.48 | 96.92 ± 0.64 | 99.03 ± 0.22 | 98.73 ± 0.28 |
17 × 17 | 92.59 ± 1.07 | 91.53 ± 1.23 | 95.51 ± 0.21 | 97.31 ± 0.29 | 96.82 ± 1.05 | 95.78 ± 1.39 | 98.71 ± 0.19 | 98.30 ± 0.25 |
Conv 1 × 1 | Conv 3 × 3 | Conv 5 × 5 | Conv 7 × 7 | OA (%) | AA (%) | κ (%) |
---|---|---|---|---|---|---|
✓ | ✓ | × | × | 97.91 ± 0.41 | 96.04 ± 0.89 | 97.86 ± 0.21 |
✓ | × | ✓ | × | 97.83 ± 1.21 | 95.87 ± 0.68 | 97.05 ± 0.62 |
✓ | × | × | ✓ | 98.56 ± 0.35 | 96.56 ± 1.20 | 98.03 ± 0.32 |
× | ✓ | ✓ | × | 98.43 ± 0.33 | 96.24 ± 0.45 | 97.56 ± 0.63 |
× | ✓ | × | ✓ | 98.32 ± 0.56 | 96.23 ± 0.89 | 97.89 ± 0.56 |
× | × | ✓ | ✓ | 97.22 ± 0.26 | 95.46 ± 0.74 | 96.56 ± 0.87 |
✓ | ✓ | ✓ | ✓ | 99.03 ± 0.22 | 97.14 ± 0.90 | 98.73 ± 0.28 |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Yang, X.; Li, L.; Xue, S.; Li, S.; Yang, W.; Tang, H.; Huang, X. MRFP-Mamba: Multi-Receptive Field Parallel Mamba for Hyperspectral Image Classification. Remote Sens. 2025, 17, 2208. https://doi.org/10.3390/rs17132208