Early Mapping of Farmland and Crop Planting Structures Using Multi-Temporal UAV Remote Sensing
Abstract
1. Introduction
2. Data and Research Area
2.1. Overview of the Research Area
2.2. Data Acquisition and Preprocessing
3. Research Method
3.1. Feature Calculation
3.2. Planting Structure Extraction Based on Optimal Feature Selection
3.2.1. Farmland Extraction
3.2.2. Optimal Feature Selection
3.2.3. Planting Structure Extraction
- (1) Random Forest (RF)
- (2) Support Vector Machine (SVM)
- (3) Convolutional Neural Network (CNN)
3.2.4. Accuracy Assessment
3.3. Experimental Setting
4. Results and Analysis
4.1. Farmland and Plot Extraction Result
4.1.1. Farmland Extraction Results
4.1.2. Plot Extraction Result
4.2. Optimal Feature Selection Results
4.2.1. Optimally Selected Features for the Entire Planting Structure
4.2.2. Optimally Selected Features for Monoculture Structures
4.3. Classification Results and Accuracy Assessment
5. Discussion
5.1. Crop Phenology and Early Identification of Planting Structures
5.2. Timeline and Feature Selection for Early Identification of Planting Structures
5.3. Effects of Temporal Phase, Parcel Heterogeneity, and Crop Density on Segmentation
5.4. The Impact of Classification Methods on Classification Results
6. Conclusions
- (1) In a representative Qingyang (Loess Plateau) site, April enables early-season identification of cropland extent and winter wheat; June allows fine discrimination of monocropped parcels, with April + June together supporting a preliminary wall-to-wall inventory; July yields high-accuracy, full-coverage classification for all crops except buckwheat; and by September, buckwheat likewise attains high-accuracy discrimination.
- (2) RF feature selection shows the top-10 cumulative importance peaking in July (72.26%). Pairwise (ratio/difference) features between April and June reach 67.36% (within 5 pp of July), suiting time-critical applications. Crop-specific optimal windows include July for maize, legumes, maize–legume intercropping, sorghum, millet, and vegetables; September for buckwheat; and May for wheat.
- (3) Accuracy and mapped areas: April early-season cropland OA = 82.1%; winter wheat 0.33 ha (4.95 mu) at 94%. June SAM-based segmentation reaches 92.8%. In July, full cropping-structure classification achieves OA = 92.66% (Kappa = 0.9163), with mapped areas of millet 7.61 ha (114.17 mu), legumes 7.02 ha (105.36 mu), vegetables 5.33 ha (79.89 mu), sorghum 0.23 ha (3.44 mu), maize 3.59 ha (53.89 mu), and maize–legume intercropping 7.01 ha (105.21 mu); buckwheat is 0.69 ha (10.32 mu) at 73% in July and 0.93 ha (13.89 mu) at 98% in September. SAM-based, U-Net-style segmentation mitigates misclassification of fine, fragmented parcels.
- (4) Multi-payload UAV data (LiDAR + thermal + multispectral + RGB) enhance cropland extraction; integrating GLCM textures with vegetation indices reduces spectral confusion; and RF performs strongly for UAV-scale mapping and feature prioritization. Given UAVs’ rapid, repeatable, low-cost acquisition, deep learning can be further leveraged once data volume suffices for large-scale training.
- (5) This study defines temporal windows and feature sets for UAV-based extraction of cropland and cropping structures in a representative Loess Plateau region, achieving high classification accuracy. Limitations remain: an operational pathway for transferring the approach to large-area satellite imagery is not yet specified, and segmentation/classification choices for regional-scale data are unvalidated. Our experiments focus on complex plots with stringent feature weighting; for simpler, conventional parcels, computation should be pared back to avoid redundancy and reduce cost. Future work will port the workflow to satellite sensors, clarify cross-sensor/scale interoperability, and enable early, high-accuracy cropland and cropping-structure mapping across the Loess Plateau at regional scales.
Author Contributions
Funding
Institutional Review Board Statement
Data Availability Statement
Conflicts of Interest
Abbreviations
RF | Random Forest |
UAVRSS | UAV Remote Sensing System |
UAV | Unmanned Aerial Vehicle |
SAM | Segment Anything Model |
PDI | Perpendicular Drought Index |
RCNN | Recurrent Convolutional Neural Network |
SAR | Synthetic Aperture Radar |
CRF | Conditional Random Field |
MRS | Multi-resolution Segmentation |
ICP | Iterative Closest Point |
GLCM | Gray-Level Co-occurrence Matrix |
RMSE | Root Mean Square Error |
RTK | Real-Time Kinematic |
NDVI | Normalized Difference Vegetation Index |
EVI | Enhanced Vegetation Index |
NDRE | Normalized Difference Red Edge Index |
SAVI | Soil-Adjusted Vegetation Index |
MSAVI | Modified Soil-Adjusted Vegetation Index |
GNDVI | Green Normalized Difference Vegetation Index |
RVI | Ratio Vegetation Index |
SR | Simple Ratio Index |
BNDVI | Blue Normalized Difference Vegetation Index |
DVI | Difference Vegetation Index |
MNLI | Modified Non-Linear Vegetation Index |
VAR | Variance |
HOM | Homogeneity |
CON | Contrast |
DIS | Dissimilarity |
ENT | Entropy |
ASM | Angular Second Moment |
CORR | Correlation |
TIR | Thermal Infrared |
CNN | Convolutional Neural Network |
CHM | Canopy Height Model |
OOB | Out-of-bag |
OA | Overall Accuracy |
UA | User’s Accuracy |
PA | Producer’s Accuracy |
FAR | False Alarm Rate |
SVM | Support Vector Machine |
References
- Ajayi, O.G.; Iwendi, E.; Adetunji, O.O. Optimizing crop classification in precision agriculture using AlexNet and high resolution UAV imagery. Technol. Agron. 2024, 4, e011. [Google Scholar] [CrossRef]
- Phang, S.K.; Chiang, T.H.A.; Happonen, A.; Chang, M.M.L. From satellite to UAV-based remote sensing: A review on precision agriculture. IEEE Access 2023, 11, 127057–127076. [Google Scholar] [CrossRef]
- Ji, S.; Zhang, Z.; Zhang, C.; Wei, S.; Lu, M.; Duan, Y. Learning discriminative spatiotemporal features for precise crop classification from multi-temporal satellite images. Int. J. Remote Sens. 2020, 41, 3162–3174. [Google Scholar] [CrossRef]
- Raja, S.Á.; Sawicka, B.; Stamenkovic, Z.; Mariammal, G. Crop prediction based on characteristics of the agricultural environment using various feature selection techniques and classifiers. IEEE Access 2022, 10, 23625–23641. [Google Scholar] [CrossRef]
- Saikhom, V.; Kalita, M. UAV for Remote Sensing Applications: An Analytical Review. In International Conference on Emerging Global Trends in Engineering and Technology; Springer: Singapore, 2022; pp. 51–59. [Google Scholar]
- De Swaef, T.; Maes, W.H.; Aper, J.; Baert, J.; Cougnon, M.; Reheul, D.; Steppe, K.; Roldán-Ruiz, I.; Lootens, P. Applying RGB- and thermal-based vegetation indices from UAVs for high-throughput field phenotyping of drought tolerance in forage grasses. Remote Sens. 2021, 13, 147. [Google Scholar] [CrossRef]
- Feng, Q.; Yang, J.; Liu, Y.; Ou, C.; Zhu, D.; Niu, B.; Liu, J.; Li, B. Multi-temporal unmanned aerial vehicle remote sensing for vegetable mapping using an attention-based recurrent convolutional neural network. Remote Sens. 2020, 12, 1668. [Google Scholar] [CrossRef]
- Yang, S.; Song, Z.; Yin, H.; Zhang, Z.; Ning, J. Crop classification method of UAV multispectral remote sensing based on deep semantic segmentation. Trans. Chin. Soc. Agric. Mach. 2021, 52, 185–192. [Google Scholar]
- Wang, F.; Yi, Q.; Hu, J.; Xie, L.; Yao, X.; Xu, T.; Zheng, J. Combining spectral and textural information in UAV hyperspectral images to estimate rice grain yield. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102397. [Google Scholar] [CrossRef]
- Deng, H.; Zhang, W.; Zheng, X.; Zhang, H. Crop classification combining object-oriented method and random forest model using unmanned aerial vehicle (UAV) multispectral image. Agriculture 2024, 14, 548. [Google Scholar] [CrossRef]
- Alvarez-Vanhard, E.; Corpetti, T.; Houet, T. UAV & satellite synergies for optical remote sensing applications: A literature review. Sci. Remote Sens. 2021, 3, 100019. [Google Scholar]
- Chang, B.; Li, F.; Hu, Y.; Yin, H.; Feng, Z.; Zhao, L. Application of UAV remote sensing for vegetation identification: A review and meta-analysis. Front. Plant Sci. 2025, 16, 1452053. [Google Scholar] [CrossRef]
- Ecke, S. Drone Remote Sensing for Forest Health Monitoring. Ph.D. Thesis, Universität Freiburg, Breisgau, Germany, 2025. [Google Scholar]
- Javan, F.D.; Samadzadegan, F.; Toosi, A. Air pollution observation—Bridging spaceborne to unmanned airborne remote sensing: A systematic review and meta-analysis. Air Qual. Atmos. Health 2025, 18, 2481–2549. [Google Scholar] [CrossRef]
- Ma, L.; Liu, Y.; Zhang, X.; Ye, Y.; Yin, G.; Johnson, B.A. Deep learning in remote sensing applications: A meta-analysis and review. ISPRS J. Photogramm. Remote Sens. 2019, 152, 166–177. [Google Scholar] [CrossRef]
- Kamilaris, A.; Prenafeta-Boldú, F.X. Deep learning in agriculture: A survey. Comput. Electron. Agric. 2018, 147, 70–90. [Google Scholar] [CrossRef]
- Osco, L.P.; Junior, J.M.; Ramos, A.P.M.; de Castro Jorge, L.A.; Fatholahi, S.N.; de Andrade Silva, J.; Matsubara, E.T.; Pistori, H.; Gonçalves, W.N.; Li, J. A review on deep learning in UAV remote sensing. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102456. [Google Scholar] [CrossRef]
- Rußwurm, M.; Körner, M. Multi-temporal land cover classification with sequential recurrent encoders. ISPRS Int. J. Geo-Inf. 2018, 7, 129. [Google Scholar] [CrossRef]
- Wang, X.; Wang, A.; Yi, J.; Song, Y.; Chehri, A. Small object detection based on deep learning for remote sensing: A comprehensive review. Remote Sens. 2023, 15, 3265. [Google Scholar] [CrossRef]
- Garnot, V.S.F.; Landrieu, L.; Giordano, S.; Chehata, N. Satellite image time series classification with pixel-set encoders and temporal self-attention. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 12325–12334. [Google Scholar]
- Ronneberger, O.; Fischer, P.; Brox, T. U-net: Convolutional networks for biomedical image segmentation. In International Conference on Medical Image Computing and Computer-Assisted Intervention; Springer: Berlin/Heidelberg, Germany, 2015; pp. 234–241. [Google Scholar]
- Chen, L.-C.; Zhu, Y.; Papandreou, G.; Schroff, F.; Adam, H. Encoder-decoder with atrous separable convolution for semantic image segmentation. In Proceedings of the European Conference on Computer Vision (ECCV), Munich, Germany, 8–14 September 2018; pp. 801–818. [Google Scholar]
- Wang, S.; Azzari, G.; Lobell, D.B. Crop type mapping without field-level labels: Random forest transfer and unsupervised clustering techniques. Remote Sens. Environ. 2019, 222, 303–317. [Google Scholar] [CrossRef]
- Kirillov, A.; Mintun, E.; Ravi, N.; Mao, H.; Rolland, C.; Gustafson, L.; Xiao, T.; Whitehead, S.; Berg, A.C.; Lo, W.-Y. Segment anything. In Proceedings of the IEEE/CVF International Conference on Computer Vision, Paris, France, 2–3 October 2023; pp. 4015–4026. [Google Scholar]
- Huang, Z.; Jing, H.; Liu, Y.; Yang, X.; Wang, Z.; Liu, X.; Gao, K.; Luo, H. Segment anything model combined with multi-scale segmentation for extracting complex cultivated land parcels in high-resolution remote sensing images. Remote Sens. 2024, 16, 3489. [Google Scholar] [CrossRef]
- Zhang, E.; Liu, J.; Cao, A.; Sun, Z.; Zhang, H.; Wang, H.; Sun, L.; Song, M. RS-SAM: Integrating multi-scale information for enhanced remote sensing image segmentation. In Proceedings of the Asian Conference on Computer Vision, Hanoi, Vietnam, 8–12 December 2024; pp. 994–1010. [Google Scholar]
- Osco, L.P.; Wu, Q.; De Lemos, E.L.; Gonçalves, W.N.; Ramos, A.P.M.; Li, J.; Junior, J.M. The segment anything model (sam) for remote sensing applications: From zero to one shot. Int. J. Appl. Earth Obs. Geoinf. 2023, 124, 103540. [Google Scholar] [CrossRef]
- Yang, R.; Qi, Y.; Zhang, H.; Wang, H.; Zhang, J.; Ma, X.; Zhang, J.; Ma, C. A study on the object-based high-resolution remote sensing image classification of crop planting structures in the loess plateau of eastern gansu province. Remote Sens. 2024, 16, 2479. [Google Scholar] [CrossRef]
- Wang, L.; Qi, Y.; Xie, W.; Yang, R.; Wang, X.; Zhou, S.; Dong, Y.; Lian, X. Estimating Gully Erosion Induced by Heavy Rainfall Events Using Stereoscopic Imagery and UAV LiDAR. Remote Sens. 2025, 17, 3363. [Google Scholar] [CrossRef]
- Raza, M.A.; Yasin, H.S.; Gul, H.; Qin, R.; Mohi Ud Din, A.; Khalid, M.H.B.; Hussain, S.; Gitari, H.; Saeed, A.; Wang, J. Maize/soybean strip intercropping produces higher crop yields and saves water under semi-arid conditions. Front. Plant Sci. 2022, 13, 1006720. [Google Scholar] [CrossRef]
- Glenn, E.P.; Huete, A.R.; Nagler, P.L.; Nelson, S.G. Relationship between remotely-sensed vegetation indices, canopy attributes and plant physiological processes: What vegetation indices can and cannot tell us about the landscape. Sensors 2008, 8, 2136–2160. [Google Scholar] [CrossRef]
- Carlson, T.N.; Ripley, D.A. On the relation between NDVI, fractional vegetation cover, and leaf area index. Remote Sens. Environ. 1997, 62, 241–252. [Google Scholar] [CrossRef]
- Peng, X.; Han, W.; Ao, J.; Wang, Y. Assimilation of LAI Derived from UAV Multispectral Data into the SAFY Model to Estimate Maize Yield. Remote Sens. 2021, 13, 1094. [Google Scholar] [CrossRef]
- Fitzgerald, G.; Rodriguez, D.; O’Leary, G. Measuring and predicting canopy nitrogen nutrition in wheat using a spectral index-The canopy chlorophyll content index (CCCI). Field Crops Res. 2010, 116, 318–324. [Google Scholar] [CrossRef]
- Roujean, J.L.; Breon, F.M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384. [Google Scholar] [CrossRef]
- Daughtry, C.S.T.; Walthall, C.L.; Kim, M.S.; de Colstoun, E.B.; McMurtrey, J.E. Estimating corn leaf chlorophyll concentration from leaf and canopy reflectance. Remote Sens. Environ. 2000, 74, 229–239. [Google Scholar] [CrossRef]
- Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
- Stone, K.H. Aerial photographic interpretation of natural vegetation in the Anchorage area, Alaska. Geogr. Rev. 1948, 38, 465–474. [Google Scholar] [CrossRef]
- Jordan, C.F. Derivation of Leaf-Area Index from Quality of Light on Forest Floor. Ecology 1969, 50, 663. [Google Scholar] [CrossRef]
- Yang, C.; Everitt, J.H.; Bradford, J.M. Airborne hyperspectral imagery and linear spectral unmixing for mapping variation in crop yield. Precis. Agric. 2007, 8, 279–296. [Google Scholar] [CrossRef]
- Gong, P.; Pu, R.L.; Biging, G.S.; Larrieu, M.R. Estimation of forest leaf area index using vegetation indices derived from Hyperion hyperspectral data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 1355–1362. [Google Scholar] [CrossRef]
- Haralick, R.M.; Shanmugam, K.; Dinstein, I.H. Textural features for image classification. IEEE Trans. Syst. Man Cybern. 1973, SMC-3, 610–621. [Google Scholar] [CrossRef]
- Li, Z.; Wang, Q.; Xu, H.; Yang, W.; Sun, W. Multi-Source Remote Sensing-Based Reconstruction of Glacier Mass Changes in Southeastern Tibet Since the 21st Century. EGUsphere 2025, 2025, 1–29. [Google Scholar]
- Kettig, R.L. Computer Classification of Remotely Sensed Multispectral Image Data by Extraction and Classification of Homogeneous Objects; Purdue University: West Lafayette, IN, USA, 1975. [Google Scholar]
- Song, X.; Xie, P.; Sun, W.; Mu, X.; Gao, P. The greening of vegetation on the Loess Plateau has resulted in a northward shift of the vegetation greenness line. Glob. Planet. Change 2024, 237, 104440. [Google Scholar] [CrossRef]
- Antonarakis, A.; Richards, K.S.; Brasington, J.; Bithell, M.; Muller, E. Retrieval of vegetative fluid resistance terms for rigid stems using airborne lidar. J. Geophys. Res. Biogeosci. 2008, 113, G02S07. [Google Scholar] [CrossRef]
- Pereira, L.G.; Fernandez, P.; Mourato, S.; Matos, J.; Mayer, C.; Marques, F. Quality control of outsourced LiDAR data acquired with a UAV: A case study. Remote Sens. 2021, 13, 419. [Google Scholar] [CrossRef]
- Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
- Strobl, C.; Boulesteix, A.-L.; Kneib, T.; Augustin, T.; Zeileis, A. Conditional variable importance for random forests. BMC Bioinform. 2008, 9, 307. [Google Scholar] [CrossRef] [PubMed]
- Awad, M.; Khan, L. Support vector machines. In Intelligent Information Technologies: Concepts, Methodologies, Tools, and Applications; IGI Global: Hershey, PA, USA, 2008; pp. 1138–1146. [Google Scholar]
- Krizhevsky, A.; Sutskever, I.; Hinton, G.E. Imagenet classification with deep convolutional neural networks. Adv. Neural Inf. Process. Syst. 2012, 25, 1097–1105. [Google Scholar] [CrossRef]
- Banko, G. A Review of Assessing the Accuracy of Classifications of Remotely Sensed Data and of Methods Including Remote Sensing Data in Forest Inventory; International Institute for Applied Systems Analysis: Laxenburg, Austria, 1998. [Google Scholar]
- Zhu, X.; Guo, R.; Liu, T.; Xu, K. Crop yield prediction based on agrometeorological indexes and remote sensing data. Remote Sens. 2021, 13, 2016. [Google Scholar] [CrossRef]
- Zhang, D.; Zhang, M.; Lin, F.; Pan, Z.; Jiang, F.; He, L.; Yang, H.; Jin, N. Fast extraction of winter wheat planting area in Huang-Huai-Hai Plain using high-resolution satellite imagery on a cloud computing platform. Int. J. Agric. Biol. Eng. 2022, 15, 241–250. [Google Scholar] [CrossRef]
- Li, Y.; Porto-Neto, L.; McCulloch, R.; McWilliam, S.; Alexandre, P.; Lehnert, S.; Reverter, A.; McDonald, J.; Smith, C. Comparing genomic prediction accuracies for commercial cows’ reproductive performance using GA2CAT and two machine learning methods. Proc. Assoc. Advmt. Anim. Breed. Genet. 2023, 25, 154–157. [Google Scholar]
- Sheykhmousa, M.; Mahdianpari, M.; Ghanbari, H.; Mohammadimanesh, F.; Ghamisi, P.; Homayouni, S. Support vector machine versus random forest for remote sensing image classification: A meta-analysis and systematic review. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2020, 13, 6308–6325. [Google Scholar] [CrossRef]
- Sawarkar, A.D.; Shrimankar, D.D.; Ali, S.; Agrahari, A.; Singh, L. Bamboo plant classification using deep transfer learning with a majority multiclass voting algorithm. Appl. Sci. 2024, 14, 1023. [Google Scholar] [CrossRef]
- Zhang, C.; Puspitasari, F.D.; Zheng, S.; Li, C.; Qiao, Y.; Kang, T.; Shan, X.; Zhang, C.; Qin, C.; Rameau, F. A survey on segment anything model (sam): Vision foundation model meets prompt engineering. arXiv 2023, arXiv:2306.06211. [Google Scholar] [CrossRef]
- Ke, L.; Ye, M.; Danelljan, M.; Tai, Y.-W.; Tang, C.-K.; Yu, F. Segment anything in high quality. Adv. Neural Inf. Process. Syst. 2023, 36, 29914–29934. [Google Scholar]
- Hall, O.; Hay, G.J.; Bouchard, A.; Marceau, D.J. Detecting dominant landscape objects through multiple scales: An integration of object-specific methods and watershed segmentation. Landsc. Ecol. 2004, 19, 59–76. [Google Scholar] [CrossRef]
- Zhang, Y.-L.; Wang, F.-X.; Shock, C.C.; Feng, S.-Y. Modeling the interaction of plastic film mulch and potato canopy growth with soil heat transport in a semiarid area. Agronomy 2020, 10, 190. [Google Scholar] [CrossRef]
- Yang, W.; Li, Z.; Chen, G.; Cui, S.; Wu, Y.; Liu, X.; Meng, W.; Liu, Y.; He, J.; Liu, D. Soybean (Glycine max L.) leaf moisture estimation based on multisource unmanned aerial vehicle image feature fusion. Plants 2024, 13, 1498. [Google Scholar] [CrossRef]
- Tan, W.; Yin, Q.; Zhao, H.; Wang, M.; Sun, X.; Cao, H.; Wang, D.; Li, Q. Disruption of chlorophyll metabolism and photosynthetic efficiency in winter jujube (Ziziphus jujuba) Induced by Apolygus lucorum infestation. Front. Plant Sci. 2025, 16, 1536534. [Google Scholar] [CrossRef]
- Ljubičić, N.; Popović, V.; Kostić, M.; Vukosavljev, M.; Buđen, M.; Stanković, N.; Stevanović, N. The normalized difference red edge index (NDRE) in grain yield and biomass estimation in maize (Zea mays L.). In Proceedings of the XV International Scientific Agricultural Symposium Agrosym, Jahorina, Bosnia and Herzegovina, 10–13 October 2024; pp. 373–378. [Google Scholar]
- Avtar, R.; Suab, S.A.; Syukur, M.S.; Korom, A.; Umarhadi, D.A.; Yunus, A.P. Assessing the influence of UAV altitude on extracted biophysical parameters of young oil palm. Remote Sens. 2020, 12, 3030. [Google Scholar] [CrossRef]
- Massey, R.; Sankey, T.T.; Congalton, R.G.; Yadav, K.; Thenkabail, P.S.; Ozdogan, M.; Meador, A.J.S. MODIS phenology-derived, multi-year distribution of conterminous US crop types. Remote Sens. Environ. 2017, 198, 490–503. [Google Scholar] [CrossRef]
- Inglada, J.; Vincent, A.; Arias, M.; Marais-Sicre, C. Improved early crop type identification by joint use of high temporal resolution SAR and optical image time series. Remote Sens. 2016, 8, 362. [Google Scholar] [CrossRef]
- Jin, M.; Xu, Q.; Guo, P.; Han, B.; Jin, J. Crop Classification Method from UAV Images based on Object-Oriented Multi-feature Learning. Remote Sens. Technol. Appl. 2023, 38, 588–598. [Google Scholar]
- Ramos, L.T.; Sappa, A.D. Dual-Branch ConvNeXt-Based Network with Attentional Fusion Decoding for Land Cover Classification Using Multispectral Imagery. In Proceedings of the SoutheastCon 2025, Concord, NC, USA, 22–30 March 2025; pp. 187–194. [Google Scholar]
- Allu, A.R.; Mesapam, S. Fusion of Satellite and UAV Imagery for Crop Monitoring. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2025, X-G-2025, 71–79. [Google Scholar] [CrossRef]
- Zhu, X.; Cai, F.; Tian, J.; Williams, T.K.-A. Spatiotemporal fusion of multisource remote sensing data: Literature survey, taxonomy, principles, applications, and future directions. Remote Sens. 2018, 10, 527. [Google Scholar] [CrossRef]
- Wu, J.; Zheng, D.; Wu, Z.; Song, H.; Zhang, X. Prediction of buckwheat maturity in UAV-RGB images based on recursive feature elimination cross-validation: A case study in Jinzhong, Northern China. Plants 2022, 11, 3257. [Google Scholar] [CrossRef] [PubMed]
- Zhang, Z.; Whish, J.P.; Bell, L.W.; Nan, Z. Forage production, quality and water-use-efficiency of four warm-season annual crops at three sowing times in the Loess Plateau region of China. Eur. J. Agron. 2017, 84, 84–94. [Google Scholar]
- Zhao, X.; Wang, J.; Ding, Y.; Gao, X.; Li, C.; Huang, H.; Gao, X. High-resolution (10 m) dataset of multi-crop planting structure on the Loess Plateau during 2018–2022. Sci. Data 2025, 12, 1190. [Google Scholar] [CrossRef]
- Wierzbicki, D.; Kedzierski, M.; Fryskowska, A. Assesment of the influence of UAV image quality on the orthophoto production. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 1–8. [Google Scholar] [CrossRef]
- Catania, P.; Ferro, M.V.; Orlando, S.; Vallone, M. Grapevine and cover crop spectral response to evaluate vineyard spatio-temporal variability. Sci. Hortic. 2025, 339, 113844. [Google Scholar] [CrossRef]
- Cheng, B.; Misra, I.; Schwing, A.G.; Kirillov, A.; Girdhar, R. Masked-attention mask transformer for universal image segmentation. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, New Orleans, LA, USA, 18–24 June 2022; pp. 1290–1299. [Google Scholar]
- Xie, E.; Wang, W.; Yu, Z.; Anandkumar, A.; Alvarez, J.M.; Luo, P. SegFormer: Simple and efficient design for semantic segmentation with transformers. Adv. Neural Inf. Process. Syst. 2021, 34, 12077–12090. [Google Scholar]
- Song, Y.; Pu, B.; Wang, P.; Jiang, H.; Dong, D.; Cao, Y.; Shen, Y. Sam-lightening: A lightweight segment anything model with dilated flash attention to achieve 30×. arXiv 2024, arXiv:2403.09195. [Google Scholar] [CrossRef]
- Wu, S.; Su, Y.; Lu, X.; Xu, H.; Kang, S.; Zhang, B.; Hu, Y.; Liu, L. Extraction and Mapping of Cropland Parcels in Typical Regions of Southern China Using Unmanned Aerial Vehicle Multispectral Images and Deep Learning. Drones 2023, 7, 285. [Google Scholar] [CrossRef]
- Li, J.; Feng, Q.; Zhang, J.; Yang, S. EMSAM: Enhanced multi-scale segment anything model for leaf disease segmentation. Front. Plant Sci. 2025, 16, 1564079. [Google Scholar] [CrossRef] [PubMed]
- Ji, W.; Li, J.; Bi, Q.; Liu, T.; Li, W.; Cheng, L. Segment anything is not always perfect: An investigation of sam on different real-world applications. Mach. Intell. Res. 2024, 21, 617–630. [Google Scholar] [CrossRef]
- Xu, W.; Lan, Y.; Li, Y.; Luo, Y.; He, Z. Classification method of cultivated land based on UAV visible light remote sensing. Int. J. Agric. Biol. Eng. 2019, 12, 103–109. [Google Scholar] [CrossRef]
- Gowda, S.N.; Clifton, D.A. Cc-sam: Sam with cross-feature attention and context for ultrasound image segmentation. In Proceedings of the European Conference on Computer Vision, Milan, Italy, 29 September–4 October 2024; pp. 108–124. [Google Scholar]
- Dosovitskiy, A.; Beyer, L.; Kolesnikov, A.; Weissenborn, D.; Zhai, X.; Unterthiner, T.; Dehghani, M.; Minderer, M.; Heigold, G.; Gelly, S. An image is worth 16x16 words: Transformers for image recognition at scale. arXiv 2020, arXiv:2010.11929. [Google Scholar]
- Shwartz-Ziv, R.; Armon, A. Tabular data: Deep learning is not all you need. Inf. Fusion 2022, 81, 84–90. [Google Scholar] [CrossRef]
- Yan, H.; Zhuo, Y.; Li, M.; Wang, Y.; Guo, H.; Wang, J.; Li, C.; Ding, F. Alfalfa yield prediction using machine learning and UAV multispectral remote sensing. Trans. Chin. Soc. Agric. Eng. 2022, 38, 64–71. [Google Scholar]
Parameter | Description |
---|---|
Experiment time | 30 April, 17 May, 14 June, 20 July, 15 August, and 20 September 2022
UAV platform | DJI Matrice 300 RTK UAV platform |
Sensor configuration | Dual-gimbal system carrying DJI L1 LiDAR camera, DJI P1 survey camera, and Yusense MS600 Pro multispectral sensor |
Data acquisition method | Multi-source remote sensing data were obtained through six synchronized UAV and ground experiments. |
Flight area | 0.6 km²
Flight altitude | 100 m (constant cruising altitude for all sensors)
Multispectral data bands | B1 (450 nm), B2 (555 nm), B3 (660 nm), B4 (720 nm), B5 (750 nm), and B6 (840 nm) |
Ground resolution | 7 cm |
LiDAR returns per pulse | 3
Point cloud density | 540 points/m²
Number of ground sample points | 75 points |
Ground object types | Maize, wheat, maize–legume strip intercropping, legumes, sorghum, millet, broomcorn millet, buckwheat, vegetables, and non-agricultural land
Other ground measurement data | Plant height, soil temperature, and plant greenness |
Category | Feature | Formula | Description
---|---|---|---
Vegetation indices | Normalized Difference Vegetation Index (NDVI) [32] | (NIR − R)/(NIR + R) | Sensitive to chlorophyll content and canopy density, enabling early differentiation between germinating crops and bare land.
 | Enhanced Vegetation Index (EVI) [33] | 2.5(NIR − R)/(NIR + 6R − 7.5B + 1) | Sensitive to high leaf area index (LAI) and canopy structure, facilitating early distinction between densely and sparsely planted crops.
 | Normalized Difference Red Edge Index (NDRE) [34] | (NIR − RE)/(NIR + RE) | Sensitive to the nitrogen content and physiological status of early-stage leaves, helping to distinguish legumes, millet, and other crops with markedly different nitrogen fertilization management.
 | Soil-Adjusted Vegetation Index (SAVI) [35] | (1 + L)(NIR − R)/(NIR + R + L), L = 0.5 | Suppresses interference from the early-stage soil background, especially in planting areas with a high proportion of bare soil.
 | Modified Soil-Adjusted Vegetation Index (MSAVI) [36] | (2NIR + 1 − √((2NIR + 1)² − 8(NIR − R)))/2 | Improves identification capability in areas with low vegetation coverage.
 | Green Normalized Difference Vegetation Index (GNDVI) [37] | (NIR − G)/(NIR + G) | Sensitive to early chlorophyll content; useful for distinguishing early-stage crops with high chlorophyll content.
 | Ratio Vegetation Index (RVI) [38] | NIR/R | Reflects the vegetation amount, enabling early differentiation between bare land and vegetation.
 | Simple Ratio Index (SR) [39] | NIR/R | Sensitive to total vegetation amount; improves accuracy when combined with other indices.
 | Blue Normalized Difference Vegetation Index (BNDVI) [40] | (NIR − B)/(NIR + B) | Enhances early sensitivity to moisture, facilitating early separation of dryland and irrigated planting areas.
 | Difference Vegetation Index (DVI) [33] | NIR − R | Reflects the reflectance difference between vegetation and soil; suited to early identification of sparsely planted crops.
 | Modified Non-Linear Vegetation Index (MNLI) [41] | 1.5(NIR² − R)/(NIR² + R + 0.5) | Sensitive to spectral changes in medium- and high-density vegetation, enabling early identification of densely planted and monoculture crops.
Texture features (GLCM) | Mean (MEAN) | Σᵢⱼ i·p(i, j) | Reflects image brightness, supporting early coarse separation of densely and sparsely planted crop areas.
 | Variance (VAR) | Σᵢⱼ (i − μ)²·p(i, j) | Reflects the dispersion of spectral values; sensitive to the spectral inhomogeneity of intercropped vegetation.
 | Homogeneity (HOM) | Σᵢⱼ p(i, j)/(1 + (i − j)²) | Reflects texture uniformity: high in densely planted areas, low in intercropped or vegetable areas.
 | Contrast (CON) | Σᵢⱼ (i − j)²·p(i, j) | Reflects the gray-level difference between neighboring pixels; high where canopy height and row spacing vary.
 | Dissimilarity (DIS) | Σᵢⱼ \|i − j\|·p(i, j) | Reflects the average gray-level difference between neighboring pixels, enabling early identification of crops with a high proportion of bare soil between rows.
 | Entropy (ENT) | −Σᵢⱼ p(i, j)·ln p(i, j) | Reflects the degree of disorder in the gray-level distribution; high in areas with complex vegetation structure.
 | Angular Second Moment (ASM) | Σᵢⱼ p(i, j)² | Reflects texture regularity; high for crops with high uniformity.
 | Correlation (CORR) | Σᵢⱼ (i − μᵢ)(j − μⱼ)·p(i, j)/(σᵢσⱼ) | Reflects the correlation of gray levels between neighboring pixels: high for uniformly growing crops, low for intercropping and vegetables.

Band symbols follow the multispectral configuration above: B (450 nm), G (555 nm), R (660 nm), RE (720 nm), and NIR (840 nm); p(i, j) denotes the normalized gray-level co-occurrence matrix entry.
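Several of the indices above can be computed directly from per-band reflectance rasters. The following is a minimal numpy sketch using the standard formulas; the band-to-variable mapping (blue 450 nm, green 555 nm, red 660 nm, red edge 720 nm, NIR 840 nm) is assumed from the data-acquisition table, and the arrays stand in for orthomosaic bands:

```python
import numpy as np

def vegetation_indices(blue, green, red, rededge, nir, L=0.5):
    """Compute a subset of the table's spectral indices from per-band
    reflectance arrays (values in [0, 1])."""
    eps = 1e-9  # guard against division by zero over shadow/water pixels
    return {
        "NDVI":  (nir - red) / (nir + red + eps),
        "GNDVI": (nir - green) / (nir + green + eps),
        "NDRE":  (nir - rededge) / (nir + rededge + eps),
        "SAVI":  (1 + L) * (nir - red) / (nir + red + L),
        "DVI":   nir - red,
        "RVI":   nir / (red + eps),
    }

# Toy 1x2 reflectance patch: a dense-canopy pixel vs. a bare-soil pixel.
red = np.array([0.05, 0.30])
nir = np.array([0.50, 0.35])
idx = vegetation_indices(blue=red * 0, green=red, red=red,
                         rededge=red, nir=nir)
print(idx["NDVI"])  # canopy pixel near 0.82, soil pixel near 0.08
```

In practice the same function would be mapped over each monthly orthomosaic before feeding the stacked features to the RF selector.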
Parameters Name | Final Parameters | Explanation |
---|---|---|
points_per_side | 128 | Defines sampling points along one image side. |
pred_iou_thresh | 0.86 | Filters masks based on predicted quality, within the range [0, 1]. |
stability_score_thresh | 0.92 | Adjusts cutoff for stability score calculation. |
crop_n_layers | 1 | Determines the number of image crop layers, where layer i runs prediction on 2^i image crops.
crop_n_points_downscale_factor | 2 | Scales down the points sampled per side in crop layer n by a factor of 2^n.
min_mask_region_area | 80 | Removes small regions and holes in masks smaller than the specified area. |
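These parameters correspond to constructor arguments of `SamAutomaticMaskGenerator` in the segment-anything package. A hedged configuration sketch (the checkpoint filename and `vit_h` backbone choice are illustrative, not stated in the table):

```python
# Tuned automatic-mask-generation parameters from the table above.
SAM_PARAMS = {
    "points_per_side": 128,             # denser point prompts -> finer parcels
    "pred_iou_thresh": 0.86,            # drop masks with low predicted quality
    "stability_score_thresh": 0.92,     # drop unstable masks
    "crop_n_layers": 1,                 # layer i adds 2**i image crops
    "crop_n_points_downscale_factor": 2,
    "min_mask_region_area": 80,         # remove tiny regions/holes (pixels)
}

def build_mask_generator(checkpoint="sam_vit_h_4b8939.pth"):
    """Construct the generator; imports deferred so the parameter dict is
    usable without segment-anything installed (pip install segment-anything)."""
    from segment_anything import sam_model_registry, SamAutomaticMaskGenerator
    sam = sam_model_registry["vit_h"](checkpoint=checkpoint)
    return SamAutomaticMaskGenerator(sam, **SAM_PARAMS)
```

Calling `build_mask_generator().generate(image)` on an RGB orthomosaic tile would then return one mask dictionary per candidate parcel.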
Month (Overall Accuracy) | NDVI | TIR | NDVI + TIR | NDVI + TIR + CHM
---|---|---|---|---|
April | 0.531 | 0.359 | 0.562 | 0.821 |
May | 0.575 | 0.492 | 0.591 | 0.893 |
June | 0.665 | 0.593 | 0.706 | 0.964 |
Accuracy Statistics | SAM-June | SAM-May | SAM-April |
---|---|---|---|
Detection accuracy | 0.874 | 0.721 | 0.683 |
False alarm rate | 0.094 | 0.158 | 0.209 |
Overall accuracy | 0.928 | 0.832 | 0.791 |
Period | April | May | Apr./May | June | Apr./Jun. | May/Jun. | July | August | September |
---|---|---|---|---|---|---|---|---|---|
Sum of Top10 | 16.39% | 21.95% | 43.27% | 51.95% | 67.36% | 58.52% | 72.26% | 71.07% | 70.83% |
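The "Sum of Top10" statistic is the cumulative share of total RF feature importance carried by the ten highest-ranked features in each period. A minimal numpy sketch of that computation; the importance vector here is synthetic (e.g., it would come from sklearn's `RandomForestClassifier.feature_importances_`), not the study's:

```python
import numpy as np

def topk_importance_share(importances, k=10):
    """Fraction of total feature importance carried by the top-k features
    (the 'Sum of Top10' statistic when k = 10)."""
    imp = np.asarray(importances, dtype=float)
    imp = imp / imp.sum()          # normalize to importance shares
    top = np.sort(imp)[::-1][:k]   # k largest shares
    return top.sum()

# Hypothetical importance vector over 40 candidate features for one period.
rng = np.random.default_rng(0)
imp = rng.dirichlet(np.ones(40))
print(f"top-10 cumulative importance: {topk_importance_share(imp):.2%}")
```

A period whose top-10 share approaches the July peak (72.26%) concentrates discriminative information in few features, which is what makes the April/June pairwise features (67.36%) attractive for time-critical mapping.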
Crop\Period | April | May | June | Apr./Jun. | July | August | September |
---|---|---|---|---|---|---|---|
Millet | 0.58 | 0.61 | 0.65 | 0.71 | 0.79 | 0.7 | 0.66 |
Legumes | 0.39 | 0.45 | 0.5 | 0.66 | 0.85 | 0.77 | 0.76 |
Vegetables | 0.54 | 0.62 | 0.75 | 0.80 | 0.83 | 0.76 | 0.65 |
Sorghum | 0.5 | 0.53 | 0.7 | 0.71 | 0.8 | 0.74 | 0.75 |
Maize | 0.57 | 0.65 | 0.73 | 0.78 | 0.82 | 0.8 | 0.78 |
Maize & Legumes | 0.49 | 0.68 | 0.69 | 0.73 | 0.86 | 0.81 | 0.74 |
Wheat | 0.73 | 0.81 | - | - | - | - | - |
Buckwheat | - | - | - | - | 0.66 | 0.79 | 0.87 |
(a)
Crop | April | May | June | July | August | September
---|---|---|---|---|---|---
Millet | 0.011 | 0.214 | 3.743 | 7.611 | 6.810 | 6.393
Legumes | 0.023 | 1.957 | 2.881 | 7.024 | 6.759 | 6.597
Vegetables | 0.142 | 1.519 | 3.346 | 5.326 | 5.065 | 4.821
Sorghum | 0.006 | 0.086 | 0.145 | 0.229 | 0.207 | 0.177
Maize | 0.047 | 1.065 | 1.909 | 3.593 | 3.249 | 3.087
Maize & Legumes | 0.010 | 2.053 | 3.960 | 7.014 | 6.597 | 6.145
Wheat | 0.330 | 0.341 | - | - | - | -
Buckwheat | - | - | - | 0.688 | 0.810 | 0.926
(b)
Crop | April | May | June | July | August | September
---|---|---|---|---|---|---
Millet | 0 | 0.03 | 0.46 | 0.93 | 0.83 | 0.78
Legumes | 0 | 0.27 | 0.4 | 0.97 | 0.94 | 0.91
Vegetables | 0.03 | 0.28 | 0.61 | 0.97 | 0.92 | 0.88
Sorghum | 0.03 | 0.36 | 0.61 | 0.97 | 0.87 | 0.74
Maize | 0.01 | 0.28 | 0.51 | 0.96 | 0.87 | 0.82
Maize & Legumes | 0 | 0.28 | 0.54 | 0.96 | 0.9 | 0.84
Wheat | 0.94 | 0.98 | - | - | - | -
Buckwheat | - | - | - | 0.73 | 0.86 | 0.98
(c)
Crop | April | May | June | July | August | September
---|---|---|---|---|---|---
Millet | 0.02 | 0.15 | 0.27 | 0.55 | 0.62 | 0.35
Legumes | 0.01 | 0.1 | 0.18 | 0.22 | 0.88 | 0.6
Vegetables | −0.1 | 0.1 | 0.56 | 0.82 | 0.03 | 0
Sorghum | 0.04 | 0.1 | 0.35 | 0.78 | 0.7 | 0.2
Maize | 0.05 | 0.15 | 0.65 | 0.85 | 0.75 | 0.35
Maize & Legumes | 0.02 | 0.12 | 0.52 | 0.78 | 0.5 | 0.1
Wheat | 0.75 | - | - | - | - | -
Buckwheat | - | - | - | 0.76 | 0.78 | 0.82
(a)
Segmentation Method | Classification Method | Kappa (July) | Kappa (August) | OA (July) | OA (August)
---|---|---|---|---|---
MRS | RF | 0.8919 | 0.8276 | 0.8905 | 0.8351
MRS | SVM | 0.8691 | 0.7843 | 0.8705 | 0.7903
MRS | CNN | 0.6356 | 0.5917 | 0.6569 | 0.6124
SAM | RF | 0.9163 | 0.8718 | 0.9266 | 0.8847
SAM | SVM | 0.9051 | 0.8505 | 0.9171 | 0.8433
SAM | CNN | 0.8293 | 0.7817 | 0.8509 | 0.7942
(b)
Month | Comparator Method | OA (SAM + RF) | OA (Comparator) | ΔOA (SAM + RF − Comparator) | 95% CI for ΔOA (pp) | p (Paired Permutation)
---|---|---|---|---|---|---
Jul | Best baseline (SAM + SVM) | 0.9266 | 0.9171 | 0.0095 | [+0.10, +1.80] | 0.036
Jul | MRS + RF | 0.9266 | 0.8905 | 0.0361 | [+2.20, +4.95] | <0.001
Jul | SAM + SVM | 0.9266 | 0.9171 | 0.0095 | [+0.10, +1.80] | 0.036
Jul | SAM + CNN | 0.9266 | 0.8509 | 0.0757 | [+6.20, +8.90] | <0.001
Aug | Best baseline (SAM + SVM) | 0.8847 | 0.8433 | 0.0414 | [+2.80, +5.50] | <0.001
Aug | MRS + RF | 0.8847 | 0.8351 | 0.0496 | [+3.60, +6.30] | <0.001
Aug | SAM + SVM | 0.8847 | 0.8433 | 0.0414 | [+2.80, +5.50] | <0.001
Aug | SAM + CNN | 0.8847 | 0.7942 | 0.0905 | [+7.80, +10.30] | <0.001
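The OA and Kappa values reported above derive from class confusion matrices in the standard way. A minimal sketch (the 3×3 counts are a toy example, not the study's matrix):

```python
import numpy as np

def oa_and_kappa(cm):
    """Overall Accuracy and Cohen's Kappa from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                        # observed agreement (OA)
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n**2  # chance agreement
    return po, (po - pe) / (1 - pe)              # (OA, Kappa)

# Toy 3-class confusion matrix with illustrative counts.
cm = [[90, 5, 5],
      [4, 88, 8],
      [6, 7, 87]]
oa, kappa = oa_and_kappa(cm)
print(f"OA = {oa:.4f}, Kappa = {kappa:.4f}")
```

Because Kappa subtracts chance agreement `pe`, it falls below OA whenever class proportions are imbalanced, which is why the Kappa columns in Table (a) sit slightly under the corresponding OA columns.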
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, L.; Qi, Y.; Zhang, J.; Yang, R.; Wang, H.; Zhang, J.; Ma, C. Early Mapping of Farmland and Crop Planting Structures Using Multi-Temporal UAV Remote Sensing. Agriculture 2025, 15, 2186. https://doi.org/10.3390/agriculture15212186