TABS-Net: A Temporal Spectral Attentive Block with Space–Time Fusion Network for Robust Cross-Year Crop Mapping
Highlights
- A 3D CNN framework (TABS-Net) integrating CBAM3D and DOY positional encoding was developed for accurate crop mapping.
- By simulating phenological shifts via temporal jitter, the model achieves superior accuracy and Inter-Annual Robustness (IAR) values close to 1.
- Explicitly aligning seasonal timing and simulating shifts effectively overcomes the “date–spectrum–class” misalignment caused by climate variability.
- TABS-Net provides a scalable and transferable solution for consistent large-scale agricultural monitoring across different years.
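Two of the highlights rest on concrete mechanisms: day-of-year (DOY) positional encoding and temporal jitter. A minimal sketch of both, assuming a sinusoidal encoding and a single shared random date shift per sample (the paper's exact formulations may differ):

```python
import numpy as np

def doy_encoding(doys, dim=8, period=365.0):
    """Sinusoidal day-of-year encoding (assumed variant of the paper's
    DOY positional encoding). Returns an array of shape (len(doys), dim)."""
    doys = np.asarray(doys, dtype=np.float64)
    freqs = 2.0 ** np.arange(dim // 2)            # geometric frequency ladder
    angles = 2.0 * np.pi * doys[:, None] * freqs[None, :] / period
    return np.concatenate([np.sin(angles), np.cos(angles)], axis=1)

def temporal_jitter(doys, max_shift=10, rng=None):
    """Shift every acquisition date by one shared random offset, emulating
    an earlier or later growing season (assumed augmentation scheme)."""
    rng = rng if rng is not None else np.random.default_rng()
    shift = int(rng.integers(-max_shift, max_shift + 1))
    return np.clip(np.asarray(doys) + shift, 1, 365)
```

At training time the jittered dates would feed the DOY encoder, so the network sees the same spectra under many plausible seasonal timings.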
Abstract
1. Introduction
- Relative to mainstream temporal baselines (e.g., 3D CNN and LSTM), to what extent can our phenology-oriented TABS-Net improve the accuracy of fine-grained crop classification?
- How do the three components—CBAM3D, day-of-year (DOY) encoding, and temporal jitter—individually contribute to classification accuracy and cross-year generalization, and which component plays the leading role in boosting generalization?
2. Materials
2.1. Study Site
2.2. Sentinel-2 Data and Preprocessing
2.3. CDL Sample Construction and Dataset Preparation
2.4. Analysis of Phenological Drift Mechanism
3. Methodology
3.1. The Structure of TABS-Net
3.1.1. Overall Architecture
3.1.2. Spatio-Temporal Feature Extraction
3.1.3. Three-Dimensional Convolutional Block Attention Module
3.1.4. Cross-Year Generalization Strategies
3.2. Comparison Methods
3.3. Experimental Setup and Evaluation Metrics
3.3.1. Experimental Protocols
3.3.2. Evaluation Metrics
- OA and Kappa
- UA_mean and PA_mean
- F1 and Macro-F1 (%)
- IoU and mIoU
- Inter-Annual Robustness: IAR_F1 and IAR_IoU
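These metrics follow from per-class counts of true positives, false positives, and false negatives. A minimal sketch of Macro-F1 and mIoU, plus IAR treated here, as an assumption, as the min/max ratio of the same metric across two years (so 1.0 means perfect cross-year stability):

```python
import numpy as np

def macro_f1_miou(y_true, y_pred, n_classes):
    """Per-class F1 and IoU from label arrays, averaged into
    Macro-F1 and mIoU (standard definitions)."""
    f1s, ious = [], []
    for c in range(n_classes):
        tp = np.sum((y_pred == c) & (y_true == c))
        fp = np.sum((y_pred == c) & (y_true != c))
        fn = np.sum((y_pred != c) & (y_true == c))
        f1s.append(2 * tp / (2 * tp + fp + fn) if (2 * tp + fp + fn) else 0.0)
        ious.append(tp / (tp + fp + fn) if (tp + fp + fn) else 0.0)
    return float(np.mean(f1s)), float(np.mean(ious))

def iar(score_year_a, score_year_b):
    """Assumed Inter-Annual Robustness: ratio of the weaker year's score
    to the stronger year's (1.0 = fully stable across years)."""
    return min(score_year_a, score_year_b) / max(score_year_a, score_year_b)
```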
3.3.3. Training and Evaluation Procedures
3.4. Implementation Details
4. Results
4.1. Overall Performance
4.2. Ablation Study and Analysis
4.3. Spectro-Temporal Signature Analysis
4.4. Extrapolation Experiment and Visual Assessment for 2025
5. Discussion
5.1. Computational Efficiency and Spatial Coherence Analysis
5.2. Effect of Band Selection
5.3. Effect of Number of Observations
5.4. Cross-Regional Transferability Experiment: A Case Study in Horqin
5.5. Practical Implications and Economic Value
5.6. Limitations and Further Study
6. Conclusions
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Shiferaw, B.; Prasanna, B.M.; Hellin, J.; Bänziger, M. Crops That Feed the World 6. Past Successes and Future Challenges to the Role Played by Maize in Global Food Security. Food Sec. 2011, 3, 307–327. [Google Scholar] [CrossRef]
- Ray, D.K.; Mueller, N.D.; West, P.C.; Foley, J.A. Yield Trends Are Insufficient to Double Global Crop Production by 2050. PLoS ONE 2013, 8, e66428. [Google Scholar] [CrossRef]
- Karthikeyan, L.; Chawla, I.; Mishra, A.K. A Review of Remote Sensing Applications in Agriculture for Food Security: Crop Growth and Yield, Irrigation, and Crop Losses. J. Hydrol. 2020, 586, 124905. [Google Scholar] [CrossRef]
- Wu, B.; Zhang, M.; Zeng, H.; Tian, F.; Potgieter, A.B.; Qin, X.; Yan, N.; Chang, S.; Zhao, Y.; Dong, Q.; et al. Challenges and Opportunities in Remote Sensing-Based Crop Monitoring: A Review. Natl. Sci. Rev. 2023, 10, nwac290. [Google Scholar] [CrossRef] [PubMed]
- Belgiu, M.; Drăguţ, L. Random Forest in Remote Sensing: A Review of Applications and Future Directions. ISPRS J. Photogramm. Remote Sens. 2016, 114, 24–31. [Google Scholar] [CrossRef]
- Mountrakis, G.; Im, J.; Ogole, C. Support Vector Machines in Remote Sensing: A Review. ISPRS J. Photogramm. Remote Sens. 2011, 66, 247–259. [Google Scholar] [CrossRef]
- Li, Z.; He, W.; Cheng, M.; Hu, J.; Yang, G.; Zhang, H. SinoLC-1: The First 1 m Resolution National-Scale Land-Cover Map of China Created with a Deep Learning Framework and Open-Access Data. Earth Syst. Sci. Data 2023, 15, 4749–4780. [Google Scholar] [CrossRef]
- Karmakar, P.; Teng, S.W.; Murshed, M.; Pang, S.; Li, Y.; Lin, H. Crop Monitoring by Multimodal Remote Sensing: A Review. Remote Sens. Appl. Soc. Environ. 2024, 33, 101093. [Google Scholar] [CrossRef]
- Kussul, N.; Lavreniuk, M.; Skakun, S.; Shelestov, A. Deep Learning Classification of Land Cover and Crop Types Using Remote Sensing Data. IEEE Geosci. Remote Sens. Lett. 2017, 14, 778–782. [Google Scholar] [CrossRef]
- Adrian, J.; Sagan, V.; Maimaitijiang, M. Sentinel SAR-Optical Fusion for Crop Type Mapping Using Deep Learning and Google Earth Engine. ISPRS J. Photogramm. Remote Sens. 2021, 175, 215–235. [Google Scholar] [CrossRef]
- Lu, T.; Gao, M.; Wang, L. Crop Classification in High-Resolution Remote Sensing Images Based on Multi-Scale Feature Fusion Semantic Segmentation Model. Front. Plant Sci. 2023, 14, 1196634. [Google Scholar] [CrossRef]
- Gao, F.; Anderson, M.; Daughtry, C.; Karnieli, A.; Hively, D.; Kustas, W. A Within-Season Approach for Detecting Early Growth Stages in Corn and Soybean Using High Temporal and Spatial Resolution Imagery. Remote Sens. Environ. 2020, 242, 111752. [Google Scholar] [CrossRef]
- Pelletier, C.; Webb, G.; Petitjean, F. Temporal Convolutional Neural Network for the Classification of Satellite Image Time Series. Remote Sens. 2019, 11, 523. [Google Scholar] [CrossRef]
- Durrani, A.U.R.; Minallah, N.; Aziz, N.; Frnda, J.; Khan, W.; Nedoma, J. Effect of Hyper-Parameters on the Performance of ConvLSTM Based Deep Neural Network in Crop Classification. PLoS ONE 2023, 18, e0275653. [Google Scholar] [CrossRef]
- Rußwurm, M.; Körner, M. Multi-Temporal Land Cover Classification with Sequential Recurrent Encoders. ISPRS Int. J. Geo-Inf. 2018, 7, 129. [Google Scholar] [CrossRef]
- Zhang, R.; Wu, X.; Li, J.; Zhao, P.; Zhang, Q.; Wuri, L.; Zhang, D.; Zhang, Z.; Yang, L. A Bibliometric Review of Deep Learning in Crop Monitoring: Trends, Challenges, and Future Perspectives. Front. Artif. Intell. 2025, 8, 1636898. [Google Scholar] [CrossRef] [PubMed]
- Ji, S.; Zhang, C.; Xu, A.; Shi, Y.; Duan, Y. 3D Convolutional Neural Networks for Crop Classification with Multi-Temporal Remote Sensing Images. Remote Sens. 2018, 10, 75. [Google Scholar] [CrossRef]
- Gallo, I.; La Grassa, R.; Landro, N.; Boschetti, M. Sentinel 2 Time Series Analysis with 3D Feature Pyramid Network and Time Domain Class Activation Intervals for Crop Mapping. ISPRS Int. J. Geo-Inf. 2021, 10, 483. [Google Scholar] [CrossRef]
- Blickensdörfer, L. Mapping of Crop Types and Crop Sequences with Combined Time Series of Sentinel-1, Sentinel-2 and Landsat 8 Data for Germany. Remote Sens. Environ. 2022, 269, 112831. [Google Scholar] [CrossRef]
- Shen, Y.; Zhang, X.; Yang, Z. Mapping Corn and Soybean Phenometrics at Field Scales over the United States Corn Belt by Fusing Time Series of Landsat 8 and Sentinel-2 Data with VIIRS Data. ISPRS J. Photogramm. Remote Sens. 2022, 186, 55–69. [Google Scholar] [CrossRef]
- Hu, X.; Wang, X.; Zhong, Y.; Zhang, L. S3ANet: Spectral-Spatial-Scale Attention Network for End-to-End Precise Crop Classification Based on UAV-Borne H2 Imagery. ISPRS J. Photogramm. Remote Sens. 2022, 183, 147–163. [Google Scholar] [CrossRef]
- Woo, S.; Park, J.; Lee, J.-Y.; Kweon, I.S. CBAM: Convolutional Block Attention Module. In Computer Vision—ECCV 2018; Ferrari, V., Hebert, M., Sminchisescu, C., Weiss, Y., Eds.; Lecture Notes in Computer Science; Springer International Publishing: Cham, Switzerland, 2018; Volume 11211, pp. 3–19. ISBN 978-3-030-01233-5. [Google Scholar]
- Wang, Y.; Zhang, Z.; Feng, L.; Ma, Y.; Du, Q. A New Attention-Based CNN Approach for Crop Mapping Using Time Series Sentinel-2 Images. Comput. Electron. Agric. 2021, 184, 106090. [Google Scholar] [CrossRef]
- Garnot, V.S.F.; Landrieu, L.; Chehata, N. Multi-Modal Temporal Attention Models for Crop Mapping from Satellite Time Series. ISPRS J. Photogramm. Remote Sens. 2022, 187, 294–305. [Google Scholar] [CrossRef]
- Antonijević, O.; Jelić, S.; Bajat, B.; Kilibarda, M. Transfer Learning Approach Based on Satellite Image Time Series for the Crop Classification Problem. J. Big Data 2023, 10, 54. [Google Scholar] [CrossRef]
- Mohammadi, S.; Belgiu, M.; Stein, A. Few-Shot Learning for Crop Mapping from Satellite Image Time Series. Remote Sens. 2024, 16, 1026. [Google Scholar] [CrossRef]
- Elshamli, A.; Taylor, G.W.; Areibi, S. Multisource Domain Adaptation for Remote Sensing Using Deep Neural Networks. IEEE Trans. Geosci. Remote Sens. 2020, 58, 3328–3340. [Google Scholar] [CrossRef]
- You, N.; Dong, J.; Li, J.; Huang, J.; Jin, Z. Rapid Early-Season Maize Mapping without Crop Labels. Remote Sens. Environ. 2023, 290, 113496. [Google Scholar] [CrossRef]
- McDanel, J.J.; Meghani, N.A.; Miller, B.A.; Moore, P.L. Harmonized Landform Regions in the Glaciated Central Lowlands, USA. J. Maps 2022, 18, 448–460. [Google Scholar] [CrossRef]
- Lesiv, M.; Laso Bayas, J.C.; See, L.; Duerauer, M.; Dahlia, D.; Durando, N.; Hazarika, R.; Kumar Sahariah, P.; Vakolyuk, M.; Blyshchyk, V.; et al. Estimating the Global Distribution of Field Size Using Crowdsourcing. Glob. Change Biol. 2019, 25, 174–186. [Google Scholar] [CrossRef]
- Daly, C.; Halbleib, M.; Smith, J.I.; Gibson, W.P.; Doggett, M.K.; Taylor, G.H.; Curtis, J.; Pasteris, P.P. Physiographically Sensitive Mapping of Climatological Temperature and Precipitation across the Conterminous United States. Int. J. Climatol. 2008, 28, 2031–2064. [Google Scholar] [CrossRef]
- Seifert, C.A.; Roberts, M.J.; Lobell, D.B. Continuous Corn and Soybean Yield Penalties across Hundreds of Thousands of Fields. Agron. J. 2017, 109, 541–548. [Google Scholar] [CrossRef]
- Martens, D.A.; Jaynes, D.B.; Colvin, T.S.; Kaspar, T.C.; Karlen, D.L. Soil Organic Nitrogen Enrichment Following Soybean in an Iowa Corn–Soybean Rotation. Soil Sci. Soc. Am. J. 2006, 70, 382–392. [Google Scholar] [CrossRef]
- Ashourloo, D.; Nematollahi, H.; Huete, A.; Aghighi, H.; Azadbakht, M.; Shahrabi, H.S.; Goodarzdashti, S. A New Phenology-Based Method for Mapping Wheat and Barley Using Time-Series of Sentinel-2 Images. Remote Sens. Environ. 2022, 280, 113206. [Google Scholar] [CrossRef]
- Han, W.; Yang, Z.; Di, L.; Mueller, R. CropScape: A Web Service Based Application for Exploring and Disseminating US Conterminous Geospatial Cropland Data Products for Decision Support. Comput. Electron. Agric. 2012, 84, 111–123. [Google Scholar] [CrossRef]
- Duan, K.; Vrieling, A.; Schlund, M.; Bhaskar Nidumolu, U.; Ratcliff, C.; Collings, S.; Nelson, A. Detection and Attribution of Cereal Yield Losses Using Sentinel-2 and Weather Data: A Case Study in South Australia. ISPRS J. Photogramm. Remote Sens. 2024, 213, 33–52. [Google Scholar] [CrossRef]
- Yang, Y.; Tao, B.; Ruane, A.C.; Shen, C.; Matteson, D.S.; Cousin, R.; Ren, W. Widespread Advances in Corn and Soybean Phenology in Response to Future Climate Change Across the United States. JGR Biogeosci. 2025, 130, e2024JG008266. [Google Scholar] [CrossRef]
- Yang, Y.; Tao, B.; Liang, L.; Huang, Y.; Matocha, C.; Lee, C.D.; Sama, M.; Masri, B.E.; Ren, W. Detecting Recent Crop Phenology Dynamics in Corn and Soybean Cropping Systems of Kentucky. Remote Sens. 2021, 13, 1615. [Google Scholar] [CrossRef]
- Chen, W.; Liu, G. A Novel Method for Identifying Crops in Parcels Constrained by Environmental Factors Through the Integration of a Gaofen-2 High-Resolution Remote Sensing Image and Sentinel-2 Time Series. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 450–463. [Google Scholar] [CrossRef]
- Ople, J.J.M.; Yeh, P.-Y.; Sun, S.-W.; Tsai, I.-T.; Hua, K.-L. Multi-Scale Neural Network with Dilated Convolutions for Image Deblurring. IEEE Access 2020, 8, 53942–53952. [Google Scholar] [CrossRef]
- Zhang, S.; Yang, J.; Leng, P.; Ma, Y.; Wang, H.; Song, Q. Crop Type Mapping with Temporal Sample Migration. Int. J. Remote Sens. 2024, 45, 7014–7032. [Google Scholar] [CrossRef]
- Reuß, F.; Greimeister-Pfeil, I.; Vreugdenhil, M.; Wagner, W. Comparison of Long Short-Term Memory Networks and Random Forest for Sentinel-1 Time Series Based Large Scale Crop Classification. Remote Sens. 2021, 13, 5000. [Google Scholar] [CrossRef]
- Xu, L.; Zhang, H.; Wang, C.; Zhang, B.; Liu, M. Crop Classification Based on Temporal Information Using Sentinel-1 SAR Time-Series Data. Remote Sens. 2018, 11, 53. [Google Scholar] [CrossRef]
- Simón Sánchez, A.-M.; González-Piqueras, J.; De La Ossa, L.; Calera, A. Convolutional Neural Networks for Agricultural Land Use Classification from Sentinel-2 Image Time Series. Remote Sens. 2022, 14, 5373. [Google Scholar] [CrossRef]
- Bargiel, D. A New Method for Crop Classification Combining Time Series of Radar Images and Crop Phenology Information. Remote Sens. Environ. 2017, 198, 369–383. [Google Scholar] [CrossRef]
- Pontius, R.G.; Millones, M. Death to Kappa: Birth of Quantity Disagreement and Allocation Disagreement for Accuracy Assessment. Int. J. Remote Sens. 2011, 32, 4407–4429. [Google Scholar] [CrossRef]
- Stehman, S.V. Selecting and Interpreting Measures of Thematic Classification Accuracy. Remote Sens. Environ. 1997, 62, 77–89. [Google Scholar] [CrossRef]
- Farhadpour, S.; Warner, T.A.; Maxwell, A.E. Selecting and Interpreting Multiclass Loss and Accuracy Assessment Metrics for Classifications with Class Imbalance: Guidance and Best Practices. Remote Sens. 2024, 16, 533. [Google Scholar] [CrossRef]
- He, X.; Zhang, S.; Xue, B.; Zhao, T.; Wu, T. Cross-Modal Change Detection Flood Extraction Based on Convolutional Neural Network. Int. J. Appl. Earth Obs. Geoinf. 2023, 117, 103197. [Google Scholar] [CrossRef]
- Frantz, D.; Rufin, P.; Janz, A.; Ernst, S.; Pflugmacher, D.; Schug, F.; Hostert, P. Understanding the Robustness of Spectral-Temporal Metrics across the Global Landsat Archive from 1984 to 2019—A Quantitative Evaluation. Remote Sens. Environ. 2023, 298, 113823. [Google Scholar] [CrossRef]
- Swain, P.H.; King, R.C. Two Effective Feature Selection Criteria for Multispectral Remote Sensing. In LARS Technical Reports; Purdue University: West Lafayette, IN, USA, 1973. [Google Scholar]
- Welch, B.L. The Generalization of ‘Student’s’ Problem When Several Different Population Variances Are Involved. Biometrika 1947, 34, 28. [Google Scholar] [CrossRef]
- Wright, N.; Duncan, J.M.A.; Callow, J.N.; Thompson, S.E.; George, R.J. CloudS2Mask: A Novel Deep Learning Approach for Improved Cloud and Cloud Shadow Masking in Sentinel-2 Imagery. Remote Sens. Environ. 2024, 306, 114122. [Google Scholar] [CrossRef]
- Hong, J.; Zhang, Y.-D.; Chen, W. Source-Free Unsupervised Domain Adaptation for Cross-Modality Abdominal Multi-Organ Segmentation. Knowl.-Based Syst. 2022, 250, 109155. [Google Scholar] [CrossRef]
- Noman, M.; Fiaz, M.; Cholakkal, H.; Narayan, S.; Muhammad Anwer, R.; Khan, S.; Shahbaz Khan, F. Remote Sensing Change Detection With Transformers Trained From Scratch. IEEE Trans. Geosci. Remote Sens. 2024, 62, 4704214. [Google Scholar] [CrossRef]
- Wang, D.; Zhang, Q.; Xu, Y.; Zhang, J.; Du, B.; Tao, D.; Zhang, L. Advancing Plain Vision Transformer Toward Remote Sensing Foundation Model. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5607315. [Google Scholar] [CrossRef]
- Wang, D.; Zhang, J.; Du, B.; Xia, G.-S.; Tao, D. An Empirical Study of Remote Sensing Pretraining. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5608020. [Google Scholar] [CrossRef]
- Capliez, E.; Ienco, D.; Gaetano, R.; Baghdadi, N.; Salah, A.H.; Le Goff, M.; Chouteau, F. Multisensor Temporal Unsupervised Domain Adaptation for Land Cover Mapping With Spatial Pseudo-Labeling and Adversarial Learning. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5405716. [Google Scholar] [CrossRef]
- Shen, Z.; Ni, H.; Guan, H.; Niu, X. Optimal Transport-Based Domain Adaptation for Semantic Segmentation of Remote Sensing Images. Int. J. Remote Sens. 2024, 45, 420–450. [Google Scholar] [CrossRef]
- Eisfelder, C.; Boemke, B.; Gessner, U.; Sogno, P.; Alemu, G.; Hailu, R.; Mesmer, C.; Huth, J. Cropland and Crop Type Classification with Sentinel-1 and Sentinel-2 Time Series Using Google Earth Engine for Agricultural Monitoring in Ethiopia. Remote Sens. 2024, 16, 866. [Google Scholar] [CrossRef]
- Najem, S.; Baghdadi, N.; Bazzi, H.; Lalande, N.; Bouchet, L. Detection and Mapping of Cover Crops Using Sentinel-1 SAR Remote Sensing Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 1446–1461. [Google Scholar] [CrossRef]
- Xu, Y.; Ma, Y.; Zhang, Z. Self-Supervised Pre-Training for Large-Scale Crop Mapping Using Sentinel-2 Time Series. ISPRS J. Photogramm. Remote Sens. 2024, 207, 312–325. [Google Scholar] [CrossRef]
- Tao, C.; Qi, J.; Guo, M.; Zhu, Q.; Li, H. Self-Supervised Remote Sensing Feature Learning: Learning Paradigms, Challenges, and Future Works. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5610426. [Google Scholar] [CrossRef]
- Ma, S.; Tong, L.; Zhou, J.; Yu, J.; Xiao, C. Self-Supervised Spectral–Spatial Graph Prototypical Network for Few-Shot Hyperspectral Image Classification. IEEE Trans. Geosci. Remote Sens. 2023, 61, 5518915. [Google Scholar] [CrossRef]

| Bands Name | Central Wavelength (nm) | Resolution (m) | Description |
|---|---|---|---|
| B2 | 496.6 | 10 | Blue |
| B3 | 560.0 | 10 | Green |
| B4 | 664.5 | 10 | Red |
| B5 | 703.9 | 20 | Red Edge 1 |
| B6 | 740.2 | 20 | Red Edge 2 |
| B7 | 782.5 | 20 | Red Edge 3 |
| B8 | 835.1 | 10 | NIR |
| B8A | 864.8 | 20 | Narrow NIR |
| B11 | 1613.7 | 20 | SWIR-1 |
| B12 | 2202.4 | 20 | SWIR-2 |
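Only four of the listed bands come natively at 10 m. One common way to form a single 10-band stack, sketched here as an assumed preprocessing choice (the paper may use a different resampling), is nearest-neighbour upsampling of the 20 m bands onto the 10 m grid:

```python
import numpy as np

BANDS_10M = ["B2", "B3", "B4", "B8"]
BANDS_20M = ["B5", "B6", "B7", "B8A", "B11", "B12"]

def upsample_nearest(band, factor=2):
    """Nearest-neighbour upsampling of a 20 m band onto the 10 m grid."""
    return np.repeat(np.repeat(band, factor, axis=0), factor, axis=1)

def stack_bands(bands_10m, bands_20m):
    """Stack 10 m bands with upsampled 20 m bands into a (C, H, W) cube."""
    return np.stack(list(bands_10m) + [upsample_nearest(b) for b in bands_20m])
```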
| Method | Method Category | Key Characteristics and Inductive Bias | Strengths | Weaknesses |
|---|---|---|---|---|
| 2D CNN | Space-Dominant | Implicit Temporal: Time steps stacked as channels. | Strong spatial texture learning; high inference speed. | Ignores temporal order; sensitive to phenological shifts. |
| 2D LSTM | Time-Dominant (Recurrent) | Sequential: Pixel-wise recurrent processing. | Captures evolution dynamics; long-term memory. | Lacks spatial context; slow training; salt-and-pepper noise. |
| TempCNN-1D | Time-Dominant (Convolutional) | Local Temporal: Pixel-wise 1D convolution. | Efficient temporal extraction; interpretable filters. | Limited spatial modeling; fixed temporal receptive field. |
| Transformer-1D | Pure Attention | Global Attention: Self-attention-based weighting. | Models long-range dependencies; global context. | Data-hungry; computationally heavy; ignores local priors. |
| Model | OA | Kappa | MacroF1 | mIoU |
|---|---|---|---|---|
| TABS-Net | 0.941 | 0.907 | 0.932 | 0.875 |
| 3D-CNN baseline | 0.916 | 0.866 | 0.910 | 0.835 |
| 2D CNN | 0.913 | 0.862 | 0.903 | 0.825 |
| 2D LSTM | 0.890 | 0.825 | 0.867 | 0.773 |
| TempCNN-1D | 0.837 | 0.736 | 0.832 | 0.715 |
| Transformer-1D | 0.802 | 0.677 | 0.793 | 0.665 |
| Methods | IAR_IoU (Corn) | IAR_IoU (Soybean) | IAR_F1 (Corn) | IAR_F1 (Soybean) | UA_Mean (Corn) | UA_Mean (Soybean) | PA_Mean (Corn) | PA_Mean (Soybean) |
|---|---|---|---|---|---|---|---|---|
| ALL | 0.997 | 0.996 | 0.998 | 0.994 | 0.952 | 0.973 | 0.975 | 0.953 |
| DOY-0 | 0.956 | 0.906 | 0.976 | 0.947 | 0.900 | 0.951 | 0.969 | 0.865 |
| Jitter-0 | 0.986 | 0.971 | 0.992 | 0.984 | 0.916 | 0.964 | 0.973 | 0.911 |
| CBAM-0 | 0.989 | 0.977 | 0.994 | 0.987 | 0.919 | 0.960 | 0.960 | 0.924 |
| CBAM-2D | 0.979 | 0.960 | 0.989 | 0.978 | 0.914 | 0.961 | 0.973 | 0.914 |
| Model | Params (M) | FLOPs (G) | Latency (ms) | FPS | Speed (km2/min) |
|---|---|---|---|---|---|
| TABS-Net | 0.42 | 123.70 | 326.7 | 12.2 | 19,256.7 |
| 3D-CNN baseline | 0.41 | 118.13 | 63.5 | 63.0 | 99,119.7 |
| 2D CNN | 0.22 | 14.43 | 9.2 | 436.1 | 685,959.0 |
| 2D LSTM | 0.20 | 42.71 | 53.5 | 74.8 | 117,582.1 |
| TempCNN-1D | 0.34 | 1059.92 | 439.9 | 9.1 | 14,300.5 |
| Transformer-1D | 0.08 | 240.16 | 1356.8 | 2.9 | 4637.0 |
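The Speed column is consistent with the FPS column under the assumption of 512 × 512-pixel patches at 10 m GSD (about 26.2 km² per patch); the patch size and GSD below are assumptions chosen because they reproduce the table, not values stated in this section:

```python
def area_rate_km2_per_min(fps, patch_px=512, gsd_m=10.0):
    """Mapped area per minute from patch throughput. Patch size and GSD
    are assumed values that reproduce the table's Speed column from FPS."""
    patch_km2 = (patch_px * gsd_m / 1000.0) ** 2   # ~26.21 km^2 per patch
    return fps * 60.0 * patch_km2
```

For example, the 2D CNN row's 436.1 FPS maps to roughly 6.86 × 10⁵ km²/min under these assumptions, matching the tabulated value to within rounding of the FPS figure.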
| Band Set | Macro-F1 | mIoU | IAR_F1 | IAR_IoU |
|---|---|---|---|---|
| RGB (3 bands) | 0.756 | 0.693 | 0.736 | 0.721 |
| RGB+NIR (5 bands) | 0.837 | 0.796 | 0.759 | 0.743 |
| RGB+NIR+RE (8 bands) | 0.886 | 0.833 | 0.874 | 0.847 |
| Full-Spec (10 bands) | 0.932 | 0.875 | 0.996 | 0.994 |
| T | Macro-F1 | mIoU | IAR_F1 | IAR_IoU |
|---|---|---|---|---|
| 3 | 0.793 | 0.746 | 0.849 | 0.832 |
| 6 | 0.852 | 0.817 | 0.914 | 0.906 |
| 9 | 0.879 | 0.821 | 0.958 | 0.949 |
| 12 | 0.932 | 0.875 | 0.996 | 0.994 |
© 2026 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.
Share and Cite
Zhou, X.; Huang, Y.; Shen, Q.; Yao, Y.; Wen, Q.; Xi, F.; Ma, C. TABS-Net: A Temporal Spectral Attentive Block with Space–Time Fusion Network for Robust Cross-Year Crop Mapping. Remote Sens. 2026, 18, 365. https://doi.org/10.3390/rs18020365