Article

Deep Learning-Driven Automatic Segmentation of Weeds and Crops in UAV Imagery †

1 Graduate School of Global Environmental Studies, Sophia University, Tokyo 102-8554, Japan
2 Faculty of Robot Science and Engineering, Northeastern University, Shenyang 110819, China
3 Graduate School of Frontier Sciences, The University of Tokyo, Tokyo 277-8563, Japan
4 College of Information Science and Engineering, Xinjiang College of Science & Technology, Urumqi 830046, China
5 Department of Environmental Health Sciences, University of California, Los Angeles, CA 90095, USA
6 Graduate School of Information, Production and Systems, Waseda University, Tokyo 169-8050, Japan
7 Graduate School of Information Science and Technology, The University of Tokyo, Tokyo 113-8654, Japan
* Author to whom correspondence should be addressed.
† This paper is an extended version of our paper published in Zhao, F.; Huang, J.; Liu, Y.; Wang, J.; Chen, Y.; Shao, X.; Ma, B.; Xi, D.; Zhang, M.; Tu, Z.; et al. A Deep Learning Approach Combining Super-Resolution and Segmentation to Identify Weed and Tobacco in UAV Imagery. In Proceedings of the 2024 IEEE International Conference on Computer Science and Blockchain (CCSB), Shenzhen, China, 6–8 September 2024.
Sensors 2025, 25(21), 6576; https://doi.org/10.3390/s25216576
Submission received: 10 September 2025 / Revised: 15 October 2025 / Accepted: 22 October 2025 / Published: 25 October 2025
(This article belongs to the Special Issue Smart Sensing and Control for Autonomous Intelligent Unmanned Systems)

Abstract

Accurate segmentation of crops and weeds is essential for enhancing crop yield, optimizing herbicide usage, and mitigating environmental impacts. Traditional weed management practices, such as manual weeding or broad-spectrum herbicide application, are labor-intensive, environmentally harmful, and economically inefficient. In response, this study introduces a novel precision agriculture framework that integrates Unmanned Aerial Vehicle (UAV)-based remote sensing with advanced deep learning techniques, combining Super-Resolution Reconstruction (SRR) and semantic segmentation. This study is the first to integrate UAV-based SRR and semantic segmentation for tobacco fields, to systematically evaluate recent Transformer- and Mamba-based models alongside traditional CNNs, and to release an annotated dataset that both ensures reproducibility and provides a resource for the research community to develop and benchmark future models. First, SRR enhanced the resolution of low-quality UAV imagery, significantly improving detailed feature extraction. Subsequently, to identify the optimal segmentation model for the proposed framework, semantic segmentation models based on CNN, Transformer, and Mamba architectures were used to differentiate crops from weeds. Among the evaluated SRR methods, RCAN achieved the best reconstruction performance, reaching a Peak Signal-to-Noise Ratio (PSNR) of 24.98 dB and a Structural Similarity Index (SSIM) of 69.48%. In semantic segmentation, the ensemble model integrating Transformer (DPT with DINOv2) and Mamba-based architectures achieved the highest mean Intersection over Union (mIoU) of 90.75%, demonstrating superior robustness across diverse field conditions. Additionally, comprehensive experiments quantified the impact of magnification factors, Gaussian blur, and Gaussian noise, identifying 4× as the optimal magnification factor and showing that, at these settings, the method is robust to common environmental disturbances.
Overall, this research establishes an efficient, precise framework for crop cultivation management, offering valuable insights for precision agriculture and sustainable farming practices.
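The abstract reports reconstruction quality as PSNR and segmentation quality as mIoU. As a rough illustration of how these standard metrics are defined (a minimal sketch, not the authors' evaluation code), both can be computed directly with NumPy:

```python
import numpy as np

def psnr(reference: np.ndarray, reconstructed: np.ndarray, max_val: float = 255.0) -> float:
    """Peak Signal-to-Noise Ratio in dB between a reference image and an SRR output."""
    mse = np.mean((reference.astype(np.float64) - reconstructed.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")  # identical images
    return 10.0 * np.log10(max_val ** 2 / mse)

def mean_iou(pred: np.ndarray, target: np.ndarray, num_classes: int) -> float:
    """Mean Intersection over Union over all classes present in either label map."""
    ious = []
    for c in range(num_classes):
        intersection = np.logical_and(pred == c, target == c).sum()
        union = np.logical_or(pred == c, target == c).sum()
        if union > 0:  # skip classes absent from both maps
            ious.append(intersection / union)
    return float(np.mean(ious))
```

In practice, library implementations such as those in scikit-image (for PSNR/SSIM) would typically be used; the sketch above only makes the definitions behind the reported 24.98 dB and 90.75% figures concrete.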
Keywords: automatic segmentation; crops-weed detection; deep learning; precision agriculture; super-resolution reconstruction; UAV remote sensing

Share and Cite

MDPI and ACS Style

Tao, J.; Qiao, Q.; Song, J.; Sun, S.; Chen, Y.; Wu, Q.; Liu, Y.; Xue, F.; Wu, H.; Zhao, F. Deep Learning-Driven Automatic Segmentation of Weeds and Crops in UAV Imagery. Sensors 2025, 25, 6576. https://doi.org/10.3390/s25216576