Efficient Weed Detection in Cabbage Fields Using a Dual-Model Strategy
Abstract
1. Introduction
2. Method
2.1. Overview
- Develop a cabbage segmentation training dataset and optimize a lightweight, efficient segmentation network based on YOLO11n-seg to enhance the accuracy of cabbage region extraction from images.
- Construct a classification training dataset comprising background, broadleaf weeds, and grass weeds, and independently train a high-performance classification model to accurately execute weed classification tasks.
- Apply the segmentation network to detect and delineate cabbage regions, then use image processing techniques to remove these regions from the original images. Subsequently, partition each processed image into an n × n grid, with each grid cell sized to match the effective coverage area of the herbicide sprayer nozzle.
- Employ the classification network to categorize each grid cell, thereby providing a foundation for targeted herbicide application (a minimal sketch of this pipeline is given after this list).
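The snippet below summarizes this dual-model workflow as a minimal sketch under stated assumptions rather than the authors' implementation: it assumes an ultralytics-style segmentation checkpoint (the file name `fbl_yolonet-seg.pt` is a placeholder), a user-supplied `classify_patch()` helper wrapping the classification model (hypothetical), and a 3 × 3 grid; in practice the grid size would be chosen to match the sprayer nozzle coverage described above.

```python
# Minimal sketch of the dual-model pipeline (assumptions: ultralytics-style
# segmentation weights, a user-supplied patch classifier, and a 3x3 grid).
import cv2
import numpy as np
from ultralytics import YOLO

def remove_cabbage(image, seg_model):
    """Mask out cabbage pixels found by the segmentation model."""
    result = seg_model(image)[0]
    cleaned = image.copy()
    if result.masks is not None:
        for mask in result.masks.data.cpu().numpy():       # one mask per cabbage instance
            mask = cv2.resize(mask, (image.shape[1], image.shape[0]))
            cleaned[mask > 0.5] = 0                         # zero out cabbage pixels
    return cleaned

def classify_grid(image, classify_patch, n=3):
    """Split the cabbage-free image into an n x n grid and label each cell."""
    h, w = image.shape[:2]
    labels = np.empty((n, n), dtype=object)
    for i in range(n):
        for j in range(n):
            cell = image[i * h // n:(i + 1) * h // n, j * w // n:(j + 1) * w // n]
            labels[i, j] = classify_patch(cell)             # 'background', 'broadleaf', or 'grass'
    return labels

# Example usage (paths, grid size, and the classifier are placeholders):
# seg_model = YOLO("fbl_yolonet-seg.pt")
# img = cv2.imread("cabbage_field.jpg")
# grid_labels = classify_grid(remove_cabbage(img, seg_model), classify_patch, n=3)
```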

2.2. Dataset and Experimental Setup
2.3. Segmentation Model
2.3.1. C3k2-Faster
2.3.2. BiFPN
2.3.3. Segment_LSCSBD
2.4. Classification Model
2.5. Evaluation Metrics
3. Results and Discussion
3.1. Cabbage Segmentation
3.1.1. Ablation Experiment
3.1.2. Comparative Experiment
3.2. Classification Experiment
3.3. Weed Detection
4. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
Abbreviations
| Abbreviation | Definition |
|---|---|
| ACCase | Acetyl-CoA carboxylase |
| AP50 | Average Precision at an IoU threshold of 0.50 |
| BiFPN | Bidirectional Feature Pyramid Network |
| CNN | Convolutional Neural Network |
| C2PSA | Convolutional block with parallel spatial attention |
| C3k2 | Cross-stage partial with kernel size 2 |
| DCPA | Dimethyl tetrachloroterephthalate |
| DWConv | Depthwise Separable Convolution |
| FBL-YOLONet | Faster-Block_BiFPN_LSCSBD_YOLONet |
| FLOPs | Floating Point Operations |
| FN | False Negative |
| FP | False Positive |
| FPN | Feature Pyramid Network |
| GConv | Group Convolution |
| GFLOPs | Giga Floating Point Operations |
| IoU | Intersection over Union |
| LSCSBD | Lightweight Shared Convolutional Separator Batch-Normalization Detection Head |
| mAP | mean Average Precision |
| MB | Megabyte |
| PConv | Partial Convolution |
| PM | Proposed method |
| P-R | Precision-Recall |
| R-CNN | Region-Based Convolutional Neural Network |
| SGD | Stochastic Gradient Descent |
| SPB | Separate Boom |
| SPPF | Spatial Pyramid Pooling–Fast |
| TP | True Positive |
| YOLO | You Only Look Once |
References
- Moreb, N.; Murphy, A.; Jaiswal, S.; Jaiswal, A.K. Cabbage. In Nutritional Composition and Antioxidant Properties of Fruits and Vegetables; Academic Press: Cambridge, MA, USA, 2020; pp. 33–54. [Google Scholar]
- Singh, B.K.; Sharma, S.R.; Singh, B. Variation in mineral concentrations among cultivars and germplasms of cabbage. J. Plant Nutr. 2009, 33, 95–104. [Google Scholar] [CrossRef]
- Kaur, S.; Kaur, R.; Chauhan, B.S. Understanding crop-weed-fertilizer-water interactions and their implications for weed management in agricultural systems. Crop Prot. 2018, 103, 65–72. [Google Scholar] [CrossRef]
- Yu, J.; Boyd, N.S.; Dittmar, P.J. Evaluation of herbicide programs in Florida cabbage production. HortScience 2018, 53, 646–650. [Google Scholar] [CrossRef]
- Zotarelli, L.; Dittmar, P.J.; Dufault, N.; Wells, B.; Desaeger, J.; Noling, J.W.; McAvou, E.; Wang, Q.; Miller, C.F. Cole crop production. Veg. Prod. Handb. Fla. 2019, 2020, 35–52. [Google Scholar]
- Kocourek, F.; Stará, J.; Holý, K.; Horská, T.; Kocourek, V.; Kováčová, J.; Kohoutková, J.; Suchanová, M.; Hajšlová, J. Evaluation of pesticide residue dynamics in Chinese cabbage, head cabbage and cauliflower. Food Addit. Contam. Part A 2017, 34, 980–989. [Google Scholar] [CrossRef]
- Zhang, Z.-Y.; Liu, X.-J.; Yu, X.-Y.; Zhang, C.-Z.; Hong, X.-Y. Pesticide residues in the spring cabbage (Brassica oleracea L. var. capitata) grown in open field. Food Control 2007, 18, 723–730. [Google Scholar] [CrossRef]
- Nascimento, A.; Pereira, G.; Pucci, L.; Alves, D.; Gomes, C.; Reis, M. Tolerance of cabbage crop to auxin herbicides. Planta Daninha 2020, 38, e020218387. [Google Scholar] [CrossRef]
- Yura, W.; Muhammad, F.; Mirza, F.; Maurend, Y.; Widyantoro, W.; Farida, S.; Aziz, Y.; Desti, A.; Edy, W.; Septy, M. Pesticide residues in food and potential risk of health problems: A systematic literature review. IOP Conf. Ser. Earth Environ. Sci. 2021, 894, 012025. [Google Scholar] [CrossRef]
- Ahmad, U.; Kondo, N.; Arima, S.; Monta, M.; Mohri, K. Weed detection in lawn field using machine vision utilization of textural features in segmented area. J. Jpn. Soc. Agric. Mach. 1999, 61, 61–69. [Google Scholar]
- El-Faki, M.S.; Zhang, N.; Peterson, D. Weed detection using color machine vision. Trans. ASAE 2000, 43, 1969–1978. [Google Scholar] [CrossRef]
- Bai, J.; Xu, Y.; Wei, X.; Zhang, J.; Shen, B. Weed identification from winter rape at seedling stage based on spectrum characteristics analysis. Trans. Chin. Soc. Agric. Eng. 2013, 29, 128–134. [Google Scholar]
- Chen, Y.; Lin, P.; He, Y. Velocity representation method for description of contour shape and the classification of weed leaf images. Biosyst. Eng. 2011, 109, 186–195. [Google Scholar] [CrossRef]
- Wu, Z.; Chen, Y.; Zhao, B.; Kang, X.; Ding, Y. Review of weed detection methods based on computer vision. Sensors 2021, 21, 3647. [Google Scholar] [CrossRef]
- Jin, X.; Han, K.; Zhao, H.; Wang, Y.; Chen, Y.; Yu, J. Detection and coverage estimation of purple nutsedge in turf with image classification neural networks. Pest Manag. Sci. 2024, 80, 3504–3515. [Google Scholar] [CrossRef]
- Razfar, N.; True, J.; Bassiouny, R.; Venkatesh, V.; Kashef, R. Weed detection in soybean crops using custom lightweight deep learning models. J. Agric. Food Res. 2022, 8, 100308. [Google Scholar] [CrossRef]
- Rehman, M.U.; Eesaar, H.; Abbas, Z.; Seneviratne, L.; Hussain, I.; Chong, K.T. Advanced drone-based weed detection using feature-enriched deep learning approach. Knowl.-Based Syst. 2024, 305, 112655. [Google Scholar] [CrossRef]
- Yu, J.; Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Deep learning for image-based weed detection in turfgrass. Eur. J. Agron. 2019, 104, 78–84. [Google Scholar] [CrossRef]
- Wang, G.; Li, Z.; Weng, G.; Chen, Y. An overview of industrial image segmentation using deep learning models. Intell. Robot 2025, 5, 143–180. [Google Scholar] [CrossRef]
- Tingting, Z.; Xunru, L.; Bohuan, X.; Xiaoyu, T. An in-vehicle real-time infrared object detection system based on deep learning with resource-constrained hardware. Intell. Robot. 2024, 4, 276–292. [Google Scholar] [CrossRef]
- Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You only look once: Unified, real-time object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
- Girshick, R.; Donahue, J.; Darrell, T.; Malik, J. Rich feature hierarchies for accurate object detection and semantic segmentation. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Columbus, OH, USA, 23–28 June 2014; pp. 580–587. [Google Scholar]
- Ren, S.; He, K.; Girshick, R.; Sun, J. Faster R-CNN: Towards real-time object detection with region proposal networks. IEEE Trans. Pattern Anal. Mach. Intell. 2016, 39, 1137–1149. [Google Scholar] [CrossRef]
- Zhu, W.; Zu, Q.; Wang, J.; Liu, T.; Maity, A.; Sun, J.; Li, M.; Jin, X.; Yu, J. CD-YOLO-Based deep learning method for weed detection in vegetables. Weed Sci. 2025, 73, e99. [Google Scholar] [CrossRef]
- Jin, X.; Sun, Y.; Che, J.; Bagavathiannan, M.; Yu, J.; Chen, Y. A novel deep learning-based method for detection of weeds in vegetables. Pest Manag. Sci. 2022, 78, 1861–1869. [Google Scholar] [CrossRef]
- Xu, K.; Shu, L.; Xie, Q.; Song, M.; Zhu, Y.; Cao, W.; Ni, J. Precision weed detection in wheat fields for agriculture 4.0: A survey of enabling technologies, methods, and research challenges. Comput. Electron. Agric. 2023, 212, 108106. [Google Scholar] [CrossRef]
- Grossmann, K. Auxin herbicides: Current status of mechanism and mode of action. Pest Manag. Sci. Former. Pestic. Sci. 2010, 66, 113–120. [Google Scholar] [CrossRef]
- McCullough, P.E.; Yu, J.; Raymer, P.L.; Chen, Z. First report of ACCase-resistant goosegrass (Eleusine indica) in the United States. Weed Sci. 2016, 64, 399–408. [Google Scholar] [CrossRef]
- Yu, J.; McCullough, P.E.; Czarnota, M.A. First report of acetyl-CoA carboxylase–resistant southern crabgrass (Digitaria ciliaris) in the United States. Weed Technol. 2017, 31, 252–259. [Google Scholar] [CrossRef]
- Howard, A.G.; Zhu, M.; Chen, B.; Kalenichenko, D.; Wang, W.; Weyand, T.; Andreetto, M.; Adam, H. Mobilenets: Efficient convolutional neural networks for mobile vision applications. arXiv 2017, arXiv:1704.04861. [Google Scholar] [CrossRef]
- Zhang, X.; Zhou, X.; Lin, M.; Sun, J. Shufflenet: An extremely efficient convolutional neural network for mobile devices. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Salt Lake City, UT, USA, 18–22 June 2018; pp. 6848–6856. [Google Scholar]
- Han, K.; Wang, Y.; Tian, Q.; Guo, J.; Xu, C.; Xu, C. Ghostnet: More features from cheap operations. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 1580–1589. [Google Scholar]
- Chen, J.; Kao, S.-h.; He, H.; Zhuo, W.; Wen, S.; Lee, C.-H.; Chan, S.-H.G. Run, don’t walk: Chasing higher FLOPS for faster neural networks. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Vancouver, BC, Canada, 17–24 June 2023; pp. 12021–12031. [Google Scholar]
- Tan, M.; Pang, R.; Le, Q.V. Efficientdet: Scalable and efficient object detection. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Seattle, WA, USA, 13–19 June 2020; pp. 10781–10790. [Google Scholar]
- Lin, T.-Y.; Dollár, P.; Girshick, R.; He, K.; Hariharan, B.; Belongie, S. Feature pyramid networks for object detection. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Honolulu, HI, USA, 21–26 July 2017; pp. 2117–2125. [Google Scholar]
- Lu, Z.; Chengao, Z.; Lu, L.; Yan, Y.; Jun, W.; Wei, X.; Ke, X.; Jun, T. Star-YOLO: A lightweight and efficient model for weed detection in cotton fields using advanced YOLOv8 improvements. Comput. Electron. Agric. 2025, 235, 110306. [Google Scholar] [CrossRef]
- Tan, M.; Le, Q. Efficientnetv2: Smaller models and faster training. In Proceedings of the International Conference on Machine Learning, Virtual, 18–24 July 2021; pp. 10096–10106. [Google Scholar]
- Tan, M.; Le, Q. Efficientnet: Rethinking model scaling for convolutional neural networks. In Proceedings of the International Conference on Machine Learning, Long Beach, CA, USA, 9–15 June 2019; pp. 6105–6114. [Google Scholar]
- Wang, C.-Y.; Liao, H.-Y.M. YOLOv1 to YOLOv10: The fastest and most accurate real-time object detection systems. APSIPA Trans. Signal Inf. Process. 2024, 13, 1–38. [Google Scholar] [CrossRef]
- Bai, Q.; Gao, R.; Li, Q.; Wang, R.; Zhang, H. Recognition of the behaviors of dairy cows by an improved YOLO. Intell. Robot. 2024, 4, 1–19. [Google Scholar] [CrossRef]
- Kong, X.; Liu, T.; Chen, X.; Jin, X.; Li, A.; Yu, J. Efficient crop segmentation net and novel weed detection method. Eur. J. Agron. 2024, 161, 127367. [Google Scholar] [CrossRef]
- Barbieri, G.F.; Young, B.G.; Dayan, F.E.; Streibig, J.C.; Takano, H.K.; Merotto, A., Jr.; Avila, L.A. Herbicide mixtures: Interactions and modeling. Adv. Weed Sci. 2023, 40, e020220051. [Google Scholar] [CrossRef]
- Aguero-Alvarado, R.; Appleby, A.P.; Armstrong, D.J. Antagonism of haloxyfop activity in tall fescue (Festuca arundinacea) by dicamba and bentazon. Weed Sci. 1991, 39, 1–5. [Google Scholar] [CrossRef]
- Osipe, J.B.; de Oliveira Júnior, R.S.; Constantin, J.; Braga, G.; Braz, P.; Takano, H.K.; Biffe, D.F. Interaction of dicamba or 2,4-D with acetyl-CoA carboxylase inhibiting herbicides to control fleabane and sourgrass. J. Agric. Sci. Eng. 2021, 3, 220–237. [Google Scholar]
- Merritt, L.H.; Ferguson, J.C.; Brown-Johnson, A.E.; Reynolds, D.B.; Tseng, T.-M.; Lowe, J.W. Reduced herbicide antagonism of grass weed control through spray application technique. Agronomy 2020, 10, 1131. [Google Scholar] [CrossRef]
- Deng, B.; Lu, Y.; Xu, J. Weed database development: An updated survey of public weed datasets and cross-season weed detection adaptation. Ecol. Inform. 2024, 81, 102546. [Google Scholar] [CrossRef]
- Krestenitis, M.; Raptis, E.K.; Kapoutsis, A.C.; Ioannidis, K.; Kosmatopoulos, E.B.; Vrochidis, S.; Kompatsiaris, I. CoFly-WeedDB: A UAV image dataset for weed detection and species identification. Data Brief 2022, 45, 108575. [Google Scholar] [CrossRef]
- Lameski, P.; Zdravevski, E.; Trajkovik, V.; Kulakov, A. Weed detection dataset with RGB images taken under variable light conditions. In Proceedings of the ICT Innovations 2017: Data-Driven Innovation. 9th International Conference, ICT Innovations 2017, Skopje, Macedonia, 18–23 September 2017; Proceedings 9. pp. 112–119. [Google Scholar]
- Jin, X.; Liu, T.; Yang, Z.; Xie, J.; Bagavathiannan, M.; Hong, X.; Xu, Z.; Chen, X.; Yu, J.; Chen, Y. Precision weed control using a smart sprayer in dormant bermudagrass turf. Crop Prot. 2023, 172, 106302. [Google Scholar] [CrossRef]

| Dataset | Number of Images |
|---|---|
| Training | 1716 |
| Validation | 215 |
| Testing | 214 |
| Dataset | Class | Number of Images |
|---|---|---|
| Training | Background | 3630 |
| Training | Broadleaf weeds | 1298 |
| Training | Grass weeds | 2126 |
| Validation | Background | 453 |
| Validation | Broadleaf weeds | 162 |
| Validation | Grass weeds | 266 |
| Testing | Background | 454 |
| Testing | Broadleaf weeds | 162 |
| Testing | Grass weeds | 266 |
| Method | C3k2-Faster | BiFPN 1 | Segment_LSCSBD 2 | Parameter Quantity | GFLOPs (G) 3 | Model Size (MB) | Mask mAP50 (%) | Mask mAP50–95 (%) | Mask Precision (%) | Mask Recall (%) | Speed (ms) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| YOLO11n-seg | | | | 2,834,763 | 10.2 | 6.0 | 97.7 | 86.8 | 97.7 | 94.2 | 7.5 |
| PM1 4 | √ | | | 2,540,611 | 9.6 | 5.4 | 97.6 | 86.2 | 96.5 | 94.6 | 7.4 |
| PM2 | | √ | | 2,101,511 | 10.1 | 4.6 | 98.0 | 87.0 | 96.8 | 95.4 | 8.3 |
| PM3 | | | √ | 2,594,956 | 9.7 | 5.5 | 98.2 | 87.0 | 95.4 | 97.1 | 8.0 |
| PM4 | √ | √ | | 1,912,975 | 9.5 | 4.2 | 98.1 | 86.7 | 96.1 | 96.8 | 7.7 |
| PM5 | √ | | √ | 2,451,700 | 9.4 | 5.2 | 98.3 | 84.0 | 96.0 | 96.0 | 7.5 |
| PM6 | | √ | √ | 1,900,904 | 9.4 | 4.2 | 98.0 | 86.1 | 96.5 | 95.1 | 7.8 |
| PM7 | √ | √ | √ | 1,757,648 | 9.1 | 3.9 | 98.3 | 87.9 | 96.2 | 97.2 | 7.0 |
| FBL-YOLONet 5 | √ | √ | √ | 1,117,200 | 8.6 | 2.6 | 98.3 | 86.6 | 97.0 | 95.5 | 6.4 |
| Method | Parameter Quantity | GFLOPs (G) 1 | Model Size (MB) | Mask mAP50 (%) | Mask mAP50–95 (%) | Mask Precision (%) | Mask Recall (%) | Speed (ms) |
|---|---|---|---|---|---|---|---|---|
| YOLOv8n-seg | 3,258,259 | 12.0 | 6.8 | 97.5 | 88.6 | 98.0 | 93.0 | 6.1 |
| YOLOv9c-seg | 27,625,299 | 157.6 | 56.3 | 98.1 | 89.7 | 96.3 | 95.2 | 38.7 |
| YOLO11n-seg | 2,834,763 | 10.2 | 6.0 | 97.7 | 86.8 | 97.7 | 94.2 | 7.5 |
| FBL-YOLONet 2 | 1,117,200 | 8.6 | 2.6 | 98.3 | 86.6 | 97.0 | 95.5 | 6.4 |
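For the off-the-shelf baselines in this comparison (YOLOv8n-seg and YOLO11n-seg), parameter counts and GFLOPs of the kind reported here can be queried with the ultralytics package. The snippet below is a sketch that assumes the package is installed and the pretrained weights are available; it does not cover the custom FBL-YOLONet, which requires its own model definition.

```python
# Sketch: print layer, parameter, gradient, and GFLOPs statistics for the
# publicly available baseline checkpoints (assumes ultralytics is installed).
from ultralytics import YOLO

for weights in ("yolov8n-seg.pt", "yolo11n-seg.pt"):
    model = YOLO(weights)                     # loads (and downloads, if needed) the checkpoint
    model.info(detailed=False, verbose=True)  # layers, parameters, gradients, GFLOPs
```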
| Method | Class | Precision (%) | Recall (%) | F1 Score (%) | Speed (ms) |
|---|---|---|---|---|---|
| Baseline (Direct Classification) | Background | 91.2 | 97.6 | 94.3 | 6.6 |
| Baseline (Direct Classification) | Broadleaf weeds | 93.3 | 85.8 | 89.4 | |
| Baseline (Direct Classification) | Grass weeds | 93.9 | 87.2 | 90.4 | |
| Proposed (Seg-Assisted) | Background | 93.4 | 96.9 | 95.1 | 6.7 |
| Proposed (Seg-Assisted) | Broadleaf weeds | 93.5 | 88.9 | 91.1 | |
| Proposed (Seg-Assisted) | Grass weeds | 93.8 | 90.6 | 92.2 | |
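The F1 values in this table follow from the harmonic mean of the listed precision and recall; the short check below is an illustrative calculation added for clarity, not part of the original analysis.

```python
# Illustrative check: F1 as the harmonic mean of precision and recall,
# using the proposed method's weed rows from the table above.
def f1_score(precision: float, recall: float) -> float:
    return 2 * precision * recall / (precision + recall)

print(round(f1_score(93.5, 88.9), 1))  # 91.1, matching the broadleaf-weed row
print(round(f1_score(93.8, 90.6), 1))  # 92.2, matching the grass-weed row
```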
| Study | Crop | Method Paradigm | Weed Classes | Key Performance (Reported) |
|---|---|---|---|---|
| [15] | Weeds in Bermudagrass turf | Grid-based image classification for weed detection and coverage estimation | Purple nutsedge | F1 scores of at least 0.972 |
| [16] | Weed detection in soybean plantation | Direct weed detection using CNNs | Grass and broadleaf weeds | The custom 5-layer CNN achieves 97.7% accuracy, 1.78 GB memory usage, and 22.245 ms latency |
| [17] | Soybean dataset | Direct weed detection using deep learning models | Multiple weed species | Achieved a precision of 72.5%, recall of 68.0%, and mAP@0.5 of 73.9% |
| [18] | Weeds in Bermudagrass turf | Direct weed detection using deep learning models | dollar weed, old world diamond-flower, Florida pusley, and annual bluegrass | In actively growing bermudagrass, VGGNet achieved F1 scores above 0.95 for detecting dollar weed, old world diamond-flower, and Florida pusley, outperforming GoogLeNet. In dormant bermudagrass, DetectNet achieved F1 scores above 0.99 for detecting annual bluegrass and mixed broadleaf weeds. |
| [24] | Vegetable seedlings (bok choy) | A detection network is first used to locate crop plants, and green pixels outside the crop bounding boxes are considered as weeds | Distinction between weed species is not required | The crop detection model achieved a mAP@50 of 98.1% |
| [36] | Weeds in cotton fields | Direct weed detection using deep learning models | 12 classes including cotton and multiple weed species | The reported study achieved mAP@50 = 98.0% and mAP@50–95 = 95.4% |
| [41] | Weeds in corn fields | Crop plants are indirectly segmented using a deep learning network, and green pixels outside the crop regions are treated as weeds | Distinction between weed species is not required | The reported experiment achieved a mIoU@50 of 90.9 |
| Ours | Weeds in cabbage fields | Crop plants are indirectly segmented using a deep learning network, and the processed images are classified on a grid to detect broadleaf and grass weeds | Broadleaf and grass weeds | In this study, the segmentation model achieved a mask mAP@50 of 98.3%, while the classification network attained F1 scores of 91.1% and 92.2% for broadleaf weeds and grass weeds, respectively |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license.

