Review

Deep Learning-Based Weed Detection in Turf: A Review

Xiaojun Jin, Teng Liu, Yong Chen and Jialin Yu
1 College of Mechanical and Electronic Engineering, Nanjing Forestry University, Nanjing 210037, China
2 Peking University Institute of Advanced Agricultural Sciences, Shandong Laboratory of Advanced Agricultural Sciences at Weifang, Weifang 261325, China
* Author to whom correspondence should be addressed.
Agronomy 2022, 12(12), 3051; https://doi.org/10.3390/agronomy12123051
Submission received: 15 November 2022 / Revised: 29 November 2022 / Accepted: 1 December 2022 / Published: 2 December 2022
(This article belongs to the Special Issue The Future of Weed Science—Novel Approaches to Weed Management)

Abstract

Precision spraying can significantly reduce herbicide input for turf weed management. A major challenge for autonomous precision herbicide spraying is to accurately and reliably detect weeds growing in turf. Deep convolutional neural networks (DCNNs), an important artificial intelligence tool, have demonstrated an extraordinary capability to learn complex features from images. The feasibility of using DCNNs, including various image classification and object detection neural networks, to detect weeds growing in turf has been investigated. Owing to their high weed detection performance, DCNNs are suitable for ground-based detection and discrimination of weeds growing in turf. However, reliable weed detection may be subject to the influence of weed factors (e.g., biotypes, species, densities, and growth stages) and turf factors (e.g., turf quality, mowing height, and dormancy vs. non-dormancy). The present review summarizes previous research findings on using DCNNs as the machine vision decision system of smart sprayers for precision herbicide spraying, with the aim of providing insights into future research.

1. Introduction

Turf is the predominant vegetation cover in urban landscapes, golf courses, residential lawns, and sports fields. In the United States, the total turf area was estimated at 163,812 km2, with a 95% confidence interval of ±35,850 km2 [1]. According to the National Golf Foundation, there are over 15,000 golf courses in the United States, averaging 50 to 73 ha each [2]. Turf offers many benefits, such as providing evaporative cooling in urban areas, remediating contaminated soil, absorbing atmospheric pollutants, and increasing the aesthetic value of residential and non-residential areas [3]. Nevertheless, weeds are a challenging issue for turf management. Weeds compete with turf for sunlight, nutrients, and water and may significantly reduce turf aesthetics and functionality [4,5,6].
Turf managers predominantly rely on synthetic herbicides for controlling weeds [7,8,9]. Unfortunately, for controlling certain weeds growing in turf, the present control programs relying on synthetic herbicides are not cost-efficient [6,10]. For example, repeated applications of the sulfonylurea herbicides thiencarbazone + foramsulfuron + halosulfuron in combination with amicarbazone at 0.25 kg ai ha−1 adequately controlled tropical signalgrass [Urochloa distachya (L.) T.Q. Nguyen] in bermudagrass (Cynodon dactylon (L.) Pers.) [11]. However, repeated application of this herbicide program is expensive, since a single application of amicarbazone at 0.25 kg ai ha−1 would cost approximately 1500 U.S. dollars.
Moreover, some synthetic herbicides used in turf are suspected of polluting the environment [12]. Possible adverse impacts include, but are not limited to, damage to non-target organisms, water pollution, and harmful effects on humans [13,14,15]. In the United States, it was reported that nearly 80% of stream samples in urban/suburban areas contained at least five pesticides [16]. Atrazine is one of the most commonly used herbicides in warm-season turfgrasses [17]; however, it is frequently detected in groundwater [16,18]. Monosodium methylarsenate (MSMA) is a highly effective broad-spectrum herbicide against a number of difficult-to-control weed species, including dallisgrass (Paspalum dilatatum Poir.), but pollutes groundwater [19]. Following application, MSMA is converted to a more toxic form of inorganic arsenic that may contaminate water through soil runoff [20]. In the United States, only spot treatment with MSMA is permitted on established golf courses [21].
Deep learning, a subset of machine learning, has been successfully applied in various scientific domains, including computer vision [22,23,24]. Deep convolutional neural networks (DCNNs) have demonstrated an extraordinary capability to extract complex features from images [25] and are utilized as a tool to detect weeds and perform precision herbicide spraying [26,27,28,29,30,31]. For example, See & Spray®, an autonomous smart sprayer utilizing DCNNs for weed detection, has been developed for precision herbicide application in agronomic crops [32]. Detection of weeds growing in turf needs to consider weed factors (e.g., growth stage, species, and biotype) and turf factors (e.g., turf quality, mowing height, and dormant vs. non-dormant stages). DCNNs recognize weeds based on plant morphological features, leaf texture, and color [33,34,35,36]. Therefore, it is logical to assume that detecting weeds growing in dormant turfgrass is easier than in actively growing turfgrass; that detecting large-leaved weeds is easier than detecting small-leaved weeds; and that detecting broadleaf weeds is easier than detecting grass or grass-like weeds growing in turfgrass (Figure 1) [37,38].
In recent years, researchers reported that DCNNs could potentially serve as a tool for detecting weeds growing in turf [37,38,39,40]. They suggested that a DCNN-based machine vision sub-system of smart sprayers might serve as an effective tool to reduce herbicide inputs and weed control costs for turf weed management. This review summarizes research findings from the past 10 years on deep learning-based weed detection in turf with the objective of offering insights for further research. The studies cited in this review were collected from various databases, including Web of Science, ScienceDirect, Scopus, and Google Scholar.

2. Detection of Weeds Growing in Turf

2.1. Image Classification versus Object Detection

As shown in Figure 2, weeds grow in turf either as scattered individuals or in relatively large patches. The preparation of training datasets for object detection neural networks involves drawing bounding boxes on the training images. For this reason, object detectors are typically used to detect scattered weeds growing in turf [37,40,41]. However, to detect inconspicuous weeds, such as common lespedeza (Kummerowia striata L.) and spotted spurge (Euphorbia maculata L.), labeling the ground-truth locations of individual weeds within images is painstaking and laborious. Moreover, when detecting weeds in relatively large patches, a large number of weeds per image need to be labeled prior to training the object detectors.
Compared with object detectors, image classification neural networks require less training data preparation time because bounding boxes do not need to be drawn. Instead, grid cells (sub-images) can be created on the input images, and the trained image classification neural networks can then determine whether each grid cell contains weeds [42]. Image classification neural networks can be employed to detect either scattered weeds or weeds growing in relatively large patches in turf. When using image classification neural networks as the machine vision decision system, the spray output of the smart sprayer needs to be the same size as, or slightly larger than, the sub-images in order to fully cover the sub-images containing the target weeds [43].
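As a rough illustration of this workflow, the sketch below (in Python, with a placeholder classifier callable and an assumed "weed" class index) splits an input image into equally sized grid cells and records which cells a trained classifier flags as containing weeds; it is a minimal example, not the implementation used in the cited studies.

```python
import numpy as np

def split_into_grid_cells(image: np.ndarray, rows: int, cols: int):
    """Split an H x W x 3 image into rows x cols equally sized sub-images."""
    h, w = image.shape[:2]
    cell_h, cell_w = h // rows, w // cols
    cells = []
    for r in range(rows):
        for c in range(cols):
            sub = image[r * cell_h:(r + 1) * cell_h, c * cell_w:(c + 1) * cell_w]
            cells.append(((r, c), sub))
    return cells

def find_weedy_cells(cells, classifier, weed_class: int = 1):
    """Return grid indices of sub-images the classifier labels as the weed class."""
    return [(r, c) for (r, c), sub in cells if classifier(sub) == weed_class]
```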

2.2. Detection of Weeds in Dormant Turfgrass

Yu et al. [37] evaluated DetectNet, GoogLeNet, and VGGNet for detecting annual bluegrass (Poa annua L.) alone or growing in proximity to various broadleaf weeds, such as common chickweed (Stellaria media (L.) Vill.), dandelion, and white clover (Trifolium repens L.). The authors reported that DetectNet was the most effective, while GoogLeNet was the least effective, among the neural networks evaluated for detecting annual bluegrass in dormant bermudagrass. DetectNet achieved high precision and recall values with the highest F1 score (≥0.99) for detecting annual bluegrass growing in dormant bermudagrass. In another study, Yu et al. [38] reported that VGGNet achieved high F1 scores with high recall values (1.00) for detecting various broadleaf weeds, including common chickweed, dandelion, henbit (Lamium amplexicaule L.), purple deadnettle (Lamium purpureum L.), and white clover, in dormant bermudagrass turf.

2.3. Detection of Broadleaf Weeds in Actively Growing Turfgrass

Yu et al. [44] compared AlexNet, DetectNet, GoogLeNet, and VGGNet for detecting dandelion, ground ivy (Glechoma hederacea L.), and spotted spurge in actively growing perennial ryegrass and reported that VGGNet was more effective than AlexNet and GoogLeNet at detecting these weeds. When the neural networks were trained with 15,486 negative images (without weeds) and 17,600 positive images (6500 containing spotted spurge, 4600 containing ground ivy, and 6500 containing dandelion), VGGNet achieved high F1 scores (≥0.9345) with high recall values (≥0.9952) for detecting these weeds; the F1 scores of AlexNet and GoogLeNet did not exceed 0.9103, while DetectNet was highly effective and achieved high F1 scores (≥0.9843) for detecting dandelion growing in perennial ryegrass.

2.4. Detection of Grass or Grass-Like Weeds in Actively Growing Turfgrass

It has been assumed that machine vision-based detection of grass or grass-like weeds in turfgrass is especially challenging due to the similarity in plant morphology [37,38,45]. Yu et al. [45] evaluated the use of image classification neural networks, including AlexNet, GoogLeNet, and VGGNet, for the detection of smooth crabgrass (Digitaria ischaemum (Schreb.) Muhl), dallisgrass, doveweed [Murdannia nudiflora (L.) Brenan], and tropical signalgrass [Urochloa distachya (L.) T.Q. Nguyen] growing in bermudagrass with erratic turf surface conditions (i.e., varying mowing heights and surface qualities). The authors found that VGGNet achieved excellent performance for detecting these weed species, with high F1 scores (≥0.93) and recall values (1.00). Although AlexNet and GoogLeNet achieved high recall, they exhibited low precision [45]. Low precision indicates that the neural networks are more likely to misclassify turfgrass as weeds, leading to herbicide applications in turf where weeds do not occur.
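For reference, the precision, recall, and F1 metrics used throughout these studies can be computed from counts of true positives (TP), false positives (FP), and false negatives (FN); the short sketch below, with hypothetical numbers, shows how low precision translates into false alarms on weed-free turf.

```python
def precision_recall_f1(tp: int, fp: int, fn: int):
    """Compute precision, recall, and F1 score from detection counts."""
    precision = tp / (tp + fp) if (tp + fp) else 0.0
    recall = tp / (tp + fn) if (tp + fn) else 0.0
    f1 = 2 * precision * recall / (precision + recall) if (precision + recall) else 0.0
    return precision, recall, f1

# Hypothetical example: 95 weedy images detected, 10 weed-free images flagged
# as weeds (false positives), and 5 weedy images missed (false negatives).
print(precision_recall_f1(tp=95, fp=10, fn=5))  # approx. (0.905, 0.950, 0.927)
```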

2.5. Weed Localization

Object detectors, such as Faster R-CNN [46], YOLO (You Only Look Once) [47], and SSD (Single Shot Detector) [48], generate bounding box outputs but do not delineate the exact boundaries of weeds within the images. Mask R-CNN, a segmentation network, can address this issue because it achieves finer image segmentation for object detection [49]. Nevertheless, this neural network requires pixel-wise precise ground-truth labeling, which is time-consuming. Xie et al. [39] developed an algorithm to generate synthetic data and constructed a nutsedge (Cyperus spp.) skeleton-based probabilistic map as the neural network input to reduce the dependence on pixel-wise precise labeling. This approach effectively overcame the effect of insufficient training images, reduced the labeling time by 95%, and outperformed the original Mask R-CNN approach for weed detection.
Despite the successes described in previous paragraphs, detecting weeds growing in turf with image classification neural networks faces challenges [37,50]. While previous researchers reported that image classification neural networks could detect and discriminate the sub-images containing weeds, they did not attempt to identify the location of weeds on the images [37,38,50]. When using image classification neural networks for weed detection, the exact location of the sub-images containing weeds on the input images needs to be determined to realize precision herbicide application with smart sprayers. To address this issue, Jin et al. [42] developed software that integrates image classification neural networks and OpenCV-Python to create grid cells on the input images. The software crops the testing image (1920 × 1080 pixels) into a total of 40 equally sized grid cells, marks a grid cell as “spray” if the inference of the developed neural networks indicates that it contains weeds, and marks it as “non-spray” otherwise. The x, y coordinates of the grid cells containing weeds are located by the software when it is used in conjunction with the image classification neural networks. Using this software, Jin et al. [42] found that EfficientNetV2 reliably inferred whether the grid cells contained the target weeds, with high F1 scores (≥0.980), and noted that DenseNet, EfficientNetV2, ResNet, RegNet, and VGGNet reliably detected and discriminated the grid cells containing dandelion, dallisgrass, purple nutsedge, and white clover. After the grid cells are located with this software in the machine vision sub-system of the smart sprayer, the nozzles over the grid cells containing weeds are turned on to realize precision herbicide spraying.
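The grid-cell localization idea can be sketched with OpenCV-Python as shown below. This is an illustrative approximation rather than the actual software of Jin et al. [42]; the 8 × 5 cell layout, the placeholder classifier callable, and the class label "spray" are assumptions.

```python
import cv2

def build_spray_map(image_path: str, classifier, cols: int = 8, rows: int = 5):
    """Divide a frame into equal grid cells and return coordinates of cells to spray."""
    image = cv2.imread(image_path)            # e.g., a 1920 x 1080 frame
    h, w = image.shape[:2]
    cell_w, cell_h = w // cols, h // rows     # 240 x 216 pixels for a 1920 x 1080 frame
    spray_cells = []
    for r in range(rows):
        for c in range(cols):
            x, y = c * cell_w, r * cell_h
            cell = image[y:y + cell_h, x:x + cell_w]
            if classifier(cell) == "spray":   # classifier: any callable returning a label
                spray_cells.append((x, y, cell_w, cell_h))
    return spray_cells                        # top-left coordinates and size of each "spray" cell
```

The returned coordinates could then be mapped to the nozzles positioned over the corresponding grid cells.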

2.6. Detection of Weeds Growing in Various Turfgrass Surface Conditions

Image classification and object detection neural networks can detect weeds growing in various turf surface conditions (Table 1) [37,50]. When the neural networks were trained with images taken at athletic fields, institutional lawns, and various golf course management zones (i.e., fairways, tees, putting greens, and rough), VGGNet demonstrated high F1 scores (≥0.95) and effectively detected dollarweed (Hydrocotyle spp.), old world diamond-flower (Hedyotis corymbosa (L.) Lam.), and Florida pusley (Richardia scabra L.) in actively growing bermudagrass turf [37].
Abiotic/biotic stresses or varying management practices (e.g., irrigation, mowing, and fertilization) can cause erratic turf conditions with different surface qualities. Soil water deficiency can alter plant leaf color and morphological features and thus affect the performance of neural networks for weed detection. Certain weed species, such as Florida pusley, are highly drought-tolerant and can thrive in drought-impacted bahiagrass. Zhuang et al. [50] investigated the feasibility of using object detection and image classification neural networks for the detection of Florida pusley growing in drought-stressed or unstressed bahiagrass (Paspalum notatum Flugge) and found that the evaluated object detectors, including YOLOv3, Faster R-CNN, and VFNet, did not reliably detect Florida pusley under either condition. In contrast, the evaluated image classification neural networks, including AlexNet, GoogLeNet, and VGGNet, achieved high F1 scores (≥0.97) in detecting Florida pusley growing in bahiagrass turf under varying levels of drought stress.

2.7. Detection of Weeds at Various Densities and Growth Stages

Weed density significantly affects the performance of DCNNs for weed detection [45]. AlexNet (for detection of crabgrass species, dallisgrass, doveweed, and tropical signalgrass) and GoogLeNet (for detection of smooth crabgrass) exhibited higher accuracy when detecting high weed densities (weeds ≥80% of the image area) compared with low weed densities (weeds ≤20% of the image area); however, VGGNet reliably detected all of these weed species regardless of weed density [45]. In perennial ryegrass turf, Yu et al. [44] reported that DetectNet achieved high F1 scores for detecting dandelion at varying densities and growth stages. At high dandelion densities, DetectNet-generated bounding boxes failed to cover every leaf of the weeds, reducing the recall values. However, it was hypothesized that this is unlikely to be an issue in field applications, since most weeds per image were detected and the few undetected weeds would likely fall within the spray zone if the smart sprayers use flat fan nozzles for herbicide application.

2.8. Detection of Weeds Based on Herbicide Weed Control Spectrum

Postemergence (POST) herbicides have specific weed control spectra. For instance, glyphosate and glufosinate are used to nonselectively control all winter weeds in dormant bermudagrass and zoysiagrass (Zoysia spp.) turf [6]. However, most POST herbicides used in turf are selective; for example, synthetic auxin herbicides (e.g., 2,4-D, dicamba, and MCPP) only control broadleaf weeds [6,51]; acetyl-CoA carboxylase (ACCase)-inhibiting herbicides (e.g., clethodim, sethoxydim, and fenoxaprop-P-ethyl) only control grass weeds [10,52]; and sulfentrazone controls broadleaf weeds, certain grass weeds (e.g., goosegrass), and sedges [45]. Therefore, instead of indiscriminately detecting all weed species growing in turf, a machine vision decision system that detects weeds according to the herbicide weed control spectrum can save herbicides more efficiently. Further investigations are needed to evaluate the feasibility of using DCNNs to detect weeds according to the herbicide weed control spectrum.
In a previous investigation, Jin et al. [43] trained GoogLeNet, VGGNet, MobileNet-v3, and ShuffleNet-v2 to discriminate vegetation into three classes according to the herbicide weed control spectrum: grass weeds (susceptible to ACCase-inhibiting herbicides), broadleaf weeds (susceptible to synthetic auxin herbicides), and turfgrass only (no herbicide spraying). The authors documented that VGGNet and ShuffleNet-v2 achieved a high overall accuracy (≥0.999) in classifying the vegetation, including crabgrass, dallisgrass, dollarweed, goosegrass, old world diamond-flower, tropical signalgrass, Virginia buttonweed, and white clover growing in turf, into these categories based on susceptibility to ACCase-inhibiting and synthetic auxin herbicides. ShuffleNet-v2 was noticeably faster than GoogLeNet and VGGNet, and the authors therefore concluded that ShuffleNet-v2 was the most efficient and reliable of the neural networks evaluated.
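Conceptually, the output of such a classifier maps directly to a spray decision. The sketch below is a hedged illustration of that mapping; the class names and herbicide examples follow the text above, but the dictionary itself is illustrative and not the implementation of Jin et al. [43].

```python
# Map each predicted vegetation class to a herbicide group (None = no spraying).
SPRAY_DECISION = {
    "grass_weed": "ACCase-inhibiting herbicide",    # e.g., fenoxaprop-P-ethyl
    "broadleaf_weed": "synthetic auxin herbicide",  # e.g., 2,4-D or dicamba
    "turfgrass_only": None,                         # no herbicide spraying
}

def select_herbicide(predicted_class: str):
    """Return the herbicide group to apply for the predicted class, or None."""
    return SPRAY_DECISION.get(predicted_class)
```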

3. Future Research Directions

A variety of weed species with comparable visual characteristics may occur in turfgrass. Detection and classification of weeds in turf are difficult because weeds and turfgrass often exhibit similar colors, morphologies, and textures; using these characteristics alone is insufficient to distinguish between them. Moreover, weed detection can be more challenging under certain conditions, such as when color and texture vary with illumination and lighting conditions or when weeds are overlapped or partially occluded by turfgrass leaves. Previous findings have demonstrated that deep learning-based methods outperformed conventional approaches, including image processing, support vector machines (SVM), K-nearest neighbors (KNN), and random forests (RF) [53]. Deep learning models have extraordinary feature learning and representation abilities, making them capable of addressing fine-grained detection and classification problems [54]. The studies cited in this paper offer a feasible basis and reference for applying deep learning methods to detect and discriminate weeds growing in turf. Deep learning datasets are essential for training DCNNs to learn all aspects of complex natural environments. The training datasets are expected to comprise diverse images, such as images of weeds and turfgrass acquired at different growth stages, positions, orientations, and illumination conditions, to improve the adaptability and robustness of the developed DCNNs. In general, higher accuracy can be achieved with larger training datasets. The following research directions should be pursued in the future to realize precision herbicide application in turfgrass landscapes.
First, the training image size has been reported to considerably affect the effectiveness of DCNNs for weed detection and discrimination [30,55]. For example, Yang et al. [55] investigated the impact of training image size on DCNNs for weed detection in alfalfa (Medicago sativa L.) and found that increasing the training image size from 200 × 200 pixels to 800 × 800 pixels reduced the detection accuracy of all deep learning models; the DCNNs trained with 200 × 200 pixel images achieved the best detection accuracy. In another study, Zhuang et al. [30] trained DCNNs with various image sizes, including 200 × 200, 300 × 300, and 400 × 400 pixels, for detecting weeds in wheat (Triticum aestivum L.). The authors reported that AlexNet and VGGNet achieved higher classification accuracy when trained with 200 × 200 pixel images than with 300 × 300 or 400 × 400 pixel images, whereas the opposite results were observed for DenseNet and ResNet. Nevertheless, when the DCNNs were trained with larger datasets, no noticeable difference was observed between training image sizes. The authors therefore concluded that increasing the quantity of training images generally boosts the performance of DCNNs while diminishing the impact of training image size [30]. Investigating the impacts of training image quantity and training image size on the performance of DCNNs for weed detection in turf is thus a first research direction toward effectively employing DCNNs as the machine vision sub-system of smart sprayers.
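A hedged sketch of such an experiment is shown below: the same image folder (a hypothetical path) is loaded at 200 × 200, 300 × 300, and 400 × 400 pixels using torchvision transforms, with simple orientation and illumination augmentations; identical models would then be trained on each resolution to compare detection accuracy.

```python
from torchvision import datasets, transforms

def build_dataset(root: str, image_size: int):
    """Load a labeled image folder resized to a given training image size."""
    transform = transforms.Compose([
        transforms.Resize((image_size, image_size)),
        transforms.RandomHorizontalFlip(),       # orientation variation
        transforms.ColorJitter(brightness=0.2),  # illumination variation
        transforms.ToTensor(),
    ])
    return datasets.ImageFolder(root, transform=transform)

# Hypothetical dataset path; train identical models on each resolution to compare.
datasets_by_size = {size: build_dataset("turf_weed_images/train", size)
                    for size in (200, 300, 400)}
```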
Second, because of phenotypic plasticity, significant morphological variations exist between the weed ecotypes from distinct turfgrass management regimes or geographical areas [4,56,57]. For example, morphologically different goosegrass (Eleusine indica L.) ecotypes are reported in Malaysia [57] and Florida in the United States [56]. In Florida, the dwarf ecotypes of goosegrass have an average internode length of 0.2 cm, 1 raceme per plant, and 6 cm plant height, while the wild ecotypes have an average internode length of 7 cm, 7 racemes per plant, and 36 cm plant height [56]. Therefore, the complexity of feature extraction would increase if the training and testing datasets contained various broadleaf and grass weed species and ecotypes, which might reduce the performance of weed detection. We hypothesize that the training images covering more diverse weed ecotypes across varying geographic regions, turf management zones, and traffic stresses will increase the datasets’ robustness for improving the performance of weed detection, which warrants further investigation.
Third, weed detection based on the herbicide weed control spectrum allows the smart sprayer to apply herbicides only onto susceptible weed species, thereby saving more herbicide than an approach that indiscriminately detects weed species. Jin et al. [43] confirmed that image classification neural networks could effectively detect and discriminate weeds growing in bermudagrass turf that are susceptible to ACCase inhibitors and synthetic auxin herbicides, but the authors did not attempt to develop neural networks for detecting and discriminating more categories of weed species based on their susceptibilities to herbicides. We suggest that additional research is needed to examine the feasibility of developing neural networks for detecting and discriminating three categories of weed species: broadleaf weeds, grass weeds, and sedges.
Fourth, weeds are often present in relatively large patches in turfgrass. For this reason, creating grid cells on the images and identifying whether the grid cells contain weeds is likely a more universally applicable method for detecting weeds growing in turf than object detection. In previous research, Jin et al. [42] developed software to create grid cells on input images containing dandelion, dallisgrass, purple nutsedge, or white clover growing in bermudagrass turf and to reliably identify whether the grid cells contained weeds or turfgrass only. Further research is needed to evaluate the feasibility of using this method to detect a more diverse range of weed species growing in turf.
Fifth, deep neural networks need to be trained with large, labeled datasets; however, labeling and processing large datasets for training neural networks is time-consuming, labor-intensive, and often requires professional knowledge. In previous works, researchers manually labeled and processed very large training datasets to develop neural networks for detecting various broadleaf and grassy weeds growing in turf. For example, Yu et al. [37] developed an image classification neural network using a total of 36,000 images, including 18,000 positive images (containing weeds) and 18,000 negative images (containing turfgrass only). In another work, a total of 39,000 images, including 19,500 positive and 19,500 negative images, were used to develop a neural network to detect common dandelion, ground ivy, and spotted spurge growing in perennial ryegrass [44]. A potential solution is to use a semi-supervised learning algorithm, which can iteratively train and refine the neural network using both labeled and unlabeled data [58]. Semi-supervised learning algorithms can use a small labeled dataset together with a larger unlabeled dataset to enhance feature representation and prediction; consequently, a relatively small labeled dataset may suffice for developing an effective neural network [58]. We hypothesize that using a semi-supervised learning algorithm could significantly reduce the size of the training dataset while achieving weed detection performance comparable to supervised learning; however, this assumption needs to be further verified.
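As an illustration only (not the algorithm of reference [58]), a simple pseudo-labeling round in PyTorch might look like the sketch below: a model trained on the small labeled set assigns labels to unlabeled images it predicts with high confidence, and those pseudo-labeled images are added to the training set for the next round; the confidence threshold is an assumption.

```python
import torch

def pseudo_label_round(model, unlabeled_loader, confidence_threshold: float = 0.95):
    """Collect high-confidence predictions on unlabeled images as pseudo-labels."""
    model.eval()
    pseudo_labeled = []
    with torch.no_grad():
        for images in unlabeled_loader:
            probs = torch.softmax(model(images), dim=1)
            confidence, labels = probs.max(dim=1)
            keep = confidence >= confidence_threshold
            if keep.any():
                pseudo_labeled.append((images[keep], labels[keep]))
    return pseudo_labeled  # merged with the labeled set for the next training round
```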
Last but not least, previous studies demonstrated the effectiveness of using DCNNs for weed detection with CUDA-enabled graphics processing units (GPUs). However, running DCNNs on edge devices with limited computing resources may reduce weed detection performance. Recently, Medrano [40] evaluated several object detectors, including YOLOv4 [59], YOLOv4-tiny [60], and YOLOv5 [61], on a Jetson Nano 4GB serving as the central computer of a mobile robotic platform for the real-time detection of dandelion in bermudagrass. On the Jetson Nano 4GB, YOLOv5 achieved excellent weed detection accuracy in real time. Additional studies are needed to explore image classification neural networks on more advanced or newer edge devices for real-time detection of weeds growing in turf.
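For orientation, the snippet below shows one commonly used way to run YOLOv5 inference through PyTorch Hub, roughly the kind of pipeline an edge device would execute; the custom weights file and image name are hypothetical, and this is not the exact setup used by Medrano [40].

```python
import torch

# Load custom YOLOv5 weights (hypothetical file trained to detect dandelion).
model = torch.hub.load("ultralytics/yolov5", "custom", path="dandelion.pt")
model.conf = 0.5                       # confidence threshold for detections

results = model("turf_frame.jpg")      # a single frame; a camera stream works similarly
detections = results.pandas().xyxy[0]  # bounding boxes with class labels and scores
print(detections[["xmin", "ymin", "xmax", "ymax", "confidence", "name"]])
```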

Author Contributions

Conceptualization, X.J. and J.Y.; methodology, X.J. and J.Y.; validation, T.L.; formal analysis, X.J. and J.Y.; investigation, T.L. and J.Y.; resources, Y.C. and J.Y.; data curation, writing—original draft preparation, X.J. and J.Y.; writing—review and editing, J.Y.; visualization, J.Y.; supervision, Y.C. and J.Y.; project administration, J.Y.; funding acquisition, Y.C. and J.Y. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the National Natural Science Foundation of China (Grant No. 32072498) and the Postgraduate Research & Practice Innovation Program of Jiangsu Province (Grant No. KYCX22_1051).

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Milesi, C.; Elvidge, C.; Dietz, J.; Tuttle, B.; Nemani, R.; Running, S. A strategy for mapping and modeling the ecological effects of US lawns. J. Turfgrass Manag. 2005, 1, 83–97. [Google Scholar]
  2. Beditz, J. The development and growth of the US golf market. In Science and Golf II; Taylor & Francis: Abingdon, UK, 2002; pp. 678–686. [Google Scholar]
  3. Stier, J.C.; Steinke, K.; Ervin, E.H.; Higginson, F.R.; McMaugh, P.E. Turfgrass benefits and issues. Turfgrass Biol. Use Manag. 2013, 56, 105–145. [Google Scholar]
  4. Busey, P. Cultural management of weeds in turfgrass: A review. Crop Sci. 2003, 43, 1899–1911. [Google Scholar] [CrossRef]
  5. Hao, Z.; Bagavathiannan, M.; Li, Y.; Qu, M.; Wang, Z.; Yu, J. Wood vinegar for control of broadleaf weeds in dormant turfgrass. Weed Technol. 2021, 35, 901–907. [Google Scholar] [CrossRef]
  6. McElroy, J.; Martins, D. Use of herbicides on turfgrass. Planta Daninha 2013, 31, 455–467. [Google Scholar] [CrossRef] [Green Version]
  7. Gómez de Barreda, D.; Yu, J.; McCullough, P.E. Seedling tolerance of cool-season turfgrasses to metamifop. HortScience 2013, 48, 1313–1316. [Google Scholar] [CrossRef]
  8. McCullough, P.E.; Yu, J.; de Barreda, D.G. Seashore paspalum (Paspalum vaginatum) tolerance to pronamide applications for annual bluegrass control. Weed Technol. 2012, 26, 289–293. [Google Scholar] [CrossRef]
  9. Tate, T.M.; Meyer, W.A.; McCullough, P.E.; Yu, J. Evaluation of mesotrione tolerance levels and [14C] mesotrione absorption and translocation in three fine fescue species. Weed Sci. 2019, 67, 497–503. [Google Scholar] [CrossRef]
  10. McCullough, P.E.; Yu, J.; Raymer, P.L.; Chen, Z. First report of ACCase-resistant goosegrass (Eleusine indica) in the United States. Weed Sci. 2016, 64, 399–408. [Google Scholar] [CrossRef]
  11. Pearsaul, D.G.; Leon, R.G.; Sellers, B.A.; Silveira, M.L.; Odero, D.C. Evaluation of verticutting and herbicides for tropical signalgrass (Urochloa subquadripara) control in turf. Weed Technol. 2018, 32, 392–397. [Google Scholar] [CrossRef]
  12. Balogh, J.C.; Anderson, J.L. Environmental Impacts of Turfgrass Pesticides; Golf Course Management & Construction: Boca Raton, FL, USA, 2020; pp. 221–353. [Google Scholar]
  13. Tappe, W.; Groeneweg, J.; Jantsch, B. Diffuse atrazine pollution in German aquifers. Biodegradation 2002, 13, 3–10. [Google Scholar] [CrossRef] [PubMed]
  14. Nitschke, L.; Schüssler, W. Surface water pollution by herbicides from effluents of waste water treatment plants. Chemosphere 1998, 36, 35–41. [Google Scholar] [CrossRef] [PubMed]
  15. Starrett, S.; Christians, N.; Al Austin, T. Movement of herbicides under two irrigation regimes applied to turfgrass. Adv. Environ. Res. 2000, 4, 169–176. [Google Scholar] [CrossRef]
  16. Petrovic, A.M.; Easton, Z.M. The role of turfgrass management in the water quality of urban environments. Int. Turfgrass Soc. Res. J. 2005, 10, 55–69. [Google Scholar]
  17. Yu, J.; McCullough, P.E. Triclopyr reduces foliar bleaching from mesotrione and enhances efficacy for smooth crabgrass control by altering uptake and translocation. Weed Technol. 2016, 30, 516–523. [Google Scholar] [CrossRef] [Green Version]
  18. Pimentel, D.; Zuniga, R.; Morrison, D. Update on the environmental and economic costs associated with alien-invasive species in the United States. Ecol. Econ. 2005, 52, 273–288. [Google Scholar] [CrossRef]
  19. Mahoney, D.J.; Gannon, T.W.; Jeffries, M.D.; Matteson, A.R.; Polizzotto, M.L. Management considerations to minimize environmental impacts of arsenic following monosodium methylarsenate (MSMA) applications to turfgrass. J. Environ. Manag. 2015, 150, 444–450. [Google Scholar] [CrossRef] [Green Version]
  20. Mulligan, C.; Yong, R.; Gibbs, B. Remediation technologies for metal-contaminated soils and groundwater: An evaluation. Eng. Geol. 2001, 60, 193–207. [Google Scholar] [CrossRef]
  21. Busey, P. Managing goosegrass II. Removal. Golf Course Manag. 2004, 72, 132–136. [Google Scholar]
  22. Shi, J.; Li, Z.; Zhu, T.; Wang, D.; Ni, C. Defect detection of industry wood veneer based on NAS and multi-channel mask R-CNN. Sensors 2020, 20, 4398. [Google Scholar] [CrossRef]
  23. He, T.; Liu, Y.; Yu, Y.; Zhao, Q.; Hu, Z. Application of deep convolutional neural network on feature extraction and detection of wood defects. Measurement 2020, 152, 107357. [Google Scholar] [CrossRef]
  24. Zhou, H.; Zhuang, Z.; Liu, Y.; Liu, Y.; Zhang, X. Defect Classification of Green Plums Based on Deep Learning. Sensors 2020, 20, 6993. [Google Scholar] [CrossRef]
  25. LeCun, Y.; Bengio, Y.; Hinton, G. Deep learning. Nature 2015, 521, 436–444. [Google Scholar] [CrossRef]
  26. Le, V.N.T.; Ahderom, S.; Alameh, K. Performances of the lbp based algorithm over cnn models for detecting crops and weeds with similar morphologies. Sensors 2020, 20, 2193. [Google Scholar] [CrossRef]
  27. Liu, B.; Bruch, R. Weed Detection for Selective Spraying: A Review. Curr. Robot. Rep. 2020, 1, 19–26. [Google Scholar] [CrossRef] [Green Version]
  28. Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Detection of Carolina geranium (Geranium carolinianum) growing in competition with strawberry using convolutional neural networks. Weed Sci. 2019, 67, 239–245. [Google Scholar] [CrossRef]
  29. Jin, X.; Che, J.; Chen, Y. Weed identification using deep learning and image processing in vegetable plantation. IEEE Access 2021, 9, 10940–10950. [Google Scholar] [CrossRef]
  30. Zhuang, J.; Li, X.; Bagavathiannan, M.; Jin, X.; Yang, J.; Meng, W.; Li, T.; Li, L.; Wang, Y.; Chen, Y. Evaluation of different deep convolutional neural networks for detection of broadleaf weed seedlings in wheat. Pest Manag. Sci. 2022, 78, 521–529. [Google Scholar] [CrossRef] [PubMed]
  31. Jin, X.; Sun, Y.; Che, J.; Bagavathiannan, M.; Yu, J.; Chen, Y. A novel deep learning-based method for detection of weeds in vegetables. Pest Manag. Sci. 2022, 78, 1861–1869. [Google Scholar] [CrossRef] [PubMed]
  32. Chostner, B. See & Spray: The next generation of weed control. Resour. Mag. 2017, 24, 4–5. [Google Scholar]
  33. Dyrmann, M.; Karstoft, H.; Midtiby, H.S. Plant species classification using deep convolutional neural network. Biosyst. Eng. 2016, 151, 72–80. [Google Scholar] [CrossRef]
  34. Ghosal, S.; Blystone, D.; Singh, A.K.; Ganapathysubramanian, B.; Singh, A.; Sarkar, S. An explainable deep machine vision framework for plant stress phenotyping. Proc. Natl. Acad. Sci. USA 2018, 115, 4613–4618. [Google Scholar] [CrossRef] [Green Version]
  35. Schmidhuber, J. Deep learning in neural networks: An overview. Neural Netw. 2015, 61, 85–117. [Google Scholar] [PubMed] [Green Version]
  36. Wang, A.; Zhang, W.; Wei, X. A review on weed detection using ground-based machine vision and image processing techniques. Comput. Electron. Agric. 2019, 158, 226–240. [Google Scholar] [CrossRef]
  37. Yu, J.; Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Deep learning for image-based weed detection in turfgrass. Eur. J. Agron. 2019, 104, 78–84. [Google Scholar] [CrossRef]
  38. Yu, J.; Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Detection of broadleaf weeds growing in turfgrass with convolutional neural networks. Pest Manag. Sci. 2019, 75, 2211–2218. [Google Scholar] [CrossRef]
  39. Xie, S.; Hu, C.; Bagavathiannan, M.; Song, D. Toward Robotic Weed Control: Detection of Nutsedge Weed in Bermudagrass Turf Using Inaccurate and Insufficient Training Data. arXiv 2021, arXiv:2106.08897. [Google Scholar] [CrossRef]
  40. Medrano, R. Feasibility of Real-Time Weed Detection in Turfgrass on an Edge Device. Master’s Thesis, The California State University, Camarillo, CA, USA, 2021. [Google Scholar]
  41. Sharpe, S.M.; Schumann, A.W.; Boyd, N.S. Goosegrass detection in strawberry and tomato using a convolutional neural network. Sci. Rep. 2020, 10, 9548. [Google Scholar] [CrossRef] [PubMed]
  42. Jin, X.; Bagavathiannan, M.; McCullough, P.E.; Chen, Y.; Yu, J. A deep learning-based method for classification, detection, and localization of weeds in turfgrass. Pest Manag. Sci. 2022, 78, 4809–4821. [Google Scholar] [CrossRef]
  43. Jin, X.; Bagavathiannan, M.; Maity, A.; Chen, Y.; Yu, J. Deep learning for detecting herbicide weed control spectrum in turfgrass. Plant Methods 2022, 18, 94. [Google Scholar] [CrossRef]
  44. Yu, J.; Schumann, A.W.; Cao, Z.; Sharpe, S.M.; Boyd, N.S. Weed Detection in Perennial Ryegrass With Deep Learning Convolutional Neural Network. Front. Plant Sci. 2019, 10, 1422. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  45. Yu, J.; Schumann, A.W.; Sharpe, S.M.; Li, X.; Boyd, N.S. Detection of grassy weeds in bermudagrass with deep convolutional neural networks. Weed Sci. 2020, 68, 545–552. [Google Scholar] [CrossRef]
  46. Ren, S.; He, K.; Girshick, R.; Sun, J. Faster r-cnn: Towards real-time object detection with region proposal networks. arXiv 2015, arXiv:1506.01497. [Google Scholar] [CrossRef] [Green Version]
  47. Redmon, J.; Divvala, S.; Girshick, R.; Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 779–788. [Google Scholar]
  48. Liu, W.; Anguelov, D.; Erhan, D.; Szegedy, C.; Reed, S.; Fu, C.-Y.; Berg, A.C. Ssd: Single shot multibox detector. In Proceedings of the European Conference on Computer Vision, Amsterdam, The Netherlands, 11–14 October 2016; pp. 21–37. [Google Scholar]
  49. He, K.; Gkioxari, G.; Dollár, P.; Girshick, R. Mask r-cnn. In Proceedings of the IEEE International Conference on Computer Vision, Venice, Italy, 22–29 October 2017; pp. 2961–2969. [Google Scholar]
  50. Zhuang, J.; Jin, X.; Chen, Y.; Meng, W.; Wang, Y.; Yu, J.; Muthukumar, B. Drought stress impact on the performance of deep convolutional neural networks for weed detection in Bahiagrass. Grass Forage Sci. 2022. early view. [Google Scholar] [CrossRef]
  51. Reed, T.V.; Yu, J.; McCullough, P.E. Aminocyclopyrachlor efficacy for controlling Virginia buttonweed (Diodia virginiana) and smooth crabgrass (Digitaria ischaemum) in tall fescue. Weed Technol. 2013, 27, 488–491. [Google Scholar] [CrossRef]
  52. Tate, T.M.; McCullough, P.E.; Harrison, M.L.; Chen, Z.; Raymer, P.L. Characterization of mutations conferring inherent resistance to acetyl coenzyme A carboxylase-inhibiting herbicides in turfgrass and grassy weeds. Crop Sci. 2021, 61, 3164–3178. [Google Scholar] [CrossRef]
  53. Hasan, A.M.; Sohel, F.; Diepeveen, D.; Laga, H.; Jones, M.G. A survey of deep learning techniques for weed detection from images. Comput. Electron. Agric. 2021, 184, 106067. [Google Scholar] [CrossRef]
  54. Quan, L.; Feng, H.; Lv, Y.; Wang, Q.; Zhang, C.; Liu, J.; Yuan, Z. Maize seedling detection under different growth stages and complex field environments based on an improved Faster R–CNN. Biosyst. Eng. 2019, 184, 1–23. [Google Scholar] [CrossRef]
  55. Yang, J.; Bagavathiannan, M.; Wang, Y.; Chen, Y.; Yu, J. A comparative evaluation of convolutional neural networks, training image sizes, and deep learning optimizers for weed detection in Alfalfa. Weed Technol. 2022, 36, 512–522. [Google Scholar] [CrossRef]
  56. Kerr, R.A.; Zhebentyayeva, T.; Saski, C.; McCarty, L.B. Comprehensive phenotypic characterization and genetic distinction of distinct goosegrass (Eleusine indica L. Gaertn.) ecotypes. J. Plant Sci. Phytopathol. 2019, 3, 95–100. [Google Scholar] [CrossRef] [Green Version]
  57. Saidi, N.; Kadir, J.; Hong, L.W. Genetic diversity and morphological variations of goosegrass [Eleusine indica (L.) Gaertn] ecotypes in Malaysia. Weed Turf. Sci. 2016, 5, 144–154. [Google Scholar] [CrossRef]
  58. Zhu, X.; Goldberg, A.B. Introduction to semi-supervised learning. Synth. Lect. Artif. Intell. Mach. Learn. 2009, 3, 1–130. [Google Scholar]
  59. Bochkovskiy, A.; Wang, C.-Y.; Liao, H.-Y.M. YOLOv4: Optimal Speed and Accuracy of Object Detection. arXiv 2020, arXiv:2004.10934. [Google Scholar]
  60. Wang, C.-Y.; Bochkovskiy, A.; Liao, H.-Y.M. Scaled-yolov4: Scaling cross stage partial network. In Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition, Nashville, TN, USA, 20–25 June 2021; pp. 13029–13038. [Google Scholar]
  61. Ultralytics. yolov5. Available online: https://github.com/ultralytics/yolov5 (accessed on 15 November 2022).
Figure 1. Presumed difficulty of using DCNNs for detecting weeds growing in turfgrass.
Figure 2. Dandelion (Taraxacum officinale F.H. Wigg.) growing as scattered individuals in perennial ryegrass (Lolium perenne L.) turf (A). Smooth crabgrass (Digitaria ischaemum (Schreb.) Muhl) growing in a relatively large patch in bermudagrass turf (B).
Table 1. A summary of published reports on the use of DCNNs for detecting weeds growing in turf.

Turfgrass Species | Turfgrass Conditions | Weeds | Deep Learning Models | Brief Summary | Reference
Bermudagrass | Dormant | Annual bluegrass alone or growing in proximity to various broadleaf weeds | DetectNet, GoogLeNet, and VGGNet | DetectNet exhibited high F1 scores (≥0.99) to detect annual bluegrass, broadleaf weeds, or annual bluegrass growing in proximity to broadleaf weeds. VGGNet reliably detected various broadleaf weeds (F1 scores ≥0.96). | Yu et al. [37,38]
Bermudagrass | Actively growing | Dollarweed, old world diamond-flower, and Florida pusley | DetectNet, GoogLeNet, and VGGNet | VGGNet outperformed GoogLeNet and achieved high F1 scores (≥0.95) with high recall (0.99) to detect all three weed species growing in bermudagrass turf. | Yu et al. [37]
Bermudagrass | Actively growing | Crabgrass species, doveweed, dallisgrass, and tropical signalgrass | AlexNet, GoogLeNet, and VGGNet | VGGNet achieved high F1 scores (1.00) to detect all four weed species regardless of weed densities. | Yu et al. [45]
Bermudagrass | Actively growing | A mix of yellow and purple nutsedge | Mask R-CNN | Mask R-CNN trained with synthetic data (generated with a nutsedge skeleton-based probabilistic map) and raw data reduced labeling time by 95% compared with Mask R-CNN trained with raw data only. | Xie et al. [39]
Bermudagrass | Actively growing | Common dandelion, dallisgrass, purple nutsedge, and white clover | DenseNet, EfficientNetV2, ResNet, RegNet, and VGGNet | Custom software was built to generate grid cell maps on the input images. Used in conjunction with the software, the image classification neural networks effectively detected and discriminated the grid cells containing weeds from those containing turfgrass only. | Jin et al. [42]
Bermudagrass | Actively growing | Crabgrass, dallisgrass, dollarweed, goosegrass, old world diamond-flower, tropical signalgrass, Virginia buttonweed, and white clover | GoogLeNet, MobileNet-v3, ShuffleNet-v2, and VGGNet | The study evaluated the feasibility of using image classification neural networks to detect and discriminate weed species according to their susceptibilities to ACCase-inhibiting and synthetic auxin herbicides. ShuffleNet-v2 performed best in overall accuracy and image processing speed compared with GoogLeNet, MobileNet-v3, and VGGNet. | Jin et al. [43]
Bermudagrass | Actively growing | Dandelion | YOLOv4, YOLOv4-tiny, and YOLOv5 | YOLOv5 achieved 97% precision, 91% recall, and 41.2 frames per second to detect dandelion with DeepStream on an NVIDIA Jetson Nano 4GB. | Medrano [40]
Bahiagrass | Drought-stressed or actively growing | Florida pusley | YOLOv3, Faster R-CNN, VFNet, AlexNet, GoogLeNet, and VGGNet | The object detectors (YOLOv3, Faster R-CNN, and VFNet) did not effectively detect Florida pusley growing in drought-stressed or unstressed bahiagrass, whereas the image classification neural networks (AlexNet, GoogLeNet, and VGGNet) detected it effectively under both conditions. | Zhuang et al. [50]
Perennial ryegrass | Actively growing | Dandelion, ground ivy, and spotted spurge | AlexNet, DetectNet, GoogLeNet, and VGGNet | DetectNet exhibited high F1 scores (≥0.98) to detect dandelion. VGGNet achieved high F1 scores (≥0.92) with high recall (≥0.99) to detect all three weed species. | Yu et al. [44]