Proceeding Paper

Deep Learning-Based Approach for Weed Detection in Potato Crops †

Data-Driven Smart Decision Platform, PMAS-Arid Agriculture University, Rawalpindi 46000, Pakistan
University Institute for Information Technology, PMAS Arid Agricultural University, Rawalpindi 46000, Pakistan
Department of Agronomy, PMAS-Arid Agriculture University, Rawalpindi 46000, Pakistan
Establishment of National Center for Industrial Biotechnology (NCIB), PMAS-Arid Agriculture University, Rawalpindi 46000, Pakistan
Faculty of Agricultural Engineering and Technology, PMAS Arid Agricultural University, Rawalpindi 46000, Pakistan
Author to whom correspondence should be addressed.
Presented at the 1st International Precision Agriculture Pakistan Conference 2022 (PAPC 2022)—Change the Culture of Agriculture, Rawalpindi, Pakistan, 22–24 September 2022.
Environ. Sci. Proc. 2022, 23(1), 6;
Published: 28 November 2022


The digital revolution is transforming agriculture through the application of artificial intelligence (AI) techniques. Potato (Solanum tuberosum L.) is one of the most important food crops, and it is susceptible to a variety of weeds that not only lower its yield but also degrade crop quality. AI and Computer Vision (CV) techniques have proven to be state-of-the-art in addressing various agricultural problems. In this study, a dataset of five different potato weeds was collected weekly in different environments and under different climatic conditions (sunny, cloudy, and partly cloudy) and at different times of the day. For weed detection, the Tiny-YOLOv4 model was trained on the collected potato weeds dataset. The proposed model obtained a 49.4% mAP, computed using the IoU criterion. The trained model will later be used as part of a site-specific spraying system to apply agrochemicals for weed management in potato crops.

1. Introduction

Artificial Intelligence (AI) and Deep Learning (DL) are leading choices for researchers working on weed detection in agriculture. DL-based Convolutional Neural Networks (CNNs) running on Graphics Processing Units (GPUs) have driven innovations in the agriculture sector [1]. Recently, they have played a vital role in automating harvesting through fast and accurate weed detection in real-time environments. State-of-the-art deep CNNs can extract complex and useful features from input images, enabling effective detection of weeds [2], diseases [3], pests [4], plant nutrient deficiencies [5], etc.
Potato (Solanum tuberosum L.) is one of the most important food crops for over a billion people worldwide. According to the study in [6], 37% of the potato crop is damaged by weeds. Piyazi booti (Asphodelus tenuifolius L.), Canada thistle (Cirsium arvense L.), jungli gajjar (Parthenium hysterophorus L.), bathu (Chenopodium album L.), and billi booti (Anagallis arvensis L.) are the most common weeds growing in potato fields, especially in the Potohar region. The Tiny-YOLOv4 model is capable of detecting and discriminating these weeds in a real-time environment with high prediction accuracy.
No image dataset of potato weeds is publicly available for training and validation of the model. The first objective of this study was therefore to collect a dataset locally from the Potohar region for training and validating the model. The second was to detect and classify the most common types of weeds in a potato field. The findings of this study will later be utilized in a real-time, spot-specific sprayer system for the management of potato fields.

2. Methodology

Various DL and Computer Vision (CV) techniques are being applied in the agriculture sector, which has become an active research area. The proposed methodology for weed detection and classification is shown in Figure 1. First, the image dataset of five potato weeds was collected with a Logitech camera. Next, image processing was performed, in which blurred and noisy images were discarded. Then, data annotation was carried out: the processed images were labeled with the Yolo_mark and LabelImg tools. The next step was to split the dataset into training and testing sets for feeding into the selected Tiny-YOLOv4 model. The trained weights with the maximum accuracy were selected for real-time detection of potato weeds.
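The train/test split step of the pipeline can be sketched in Python. The 80/20 ratio, the fixed seed, and the filenames below are illustrative assumptions; the paper does not report the exact split used:

```python
import random

def split_dataset(image_files, train_ratio=0.8, seed=42):
    """Shuffle annotated image filenames and split them into
    training and testing sets (assumed 80/20 ratio)."""
    files = list(image_files)
    random.Random(seed).shuffle(files)  # fixed seed for a reproducible split
    cut = int(len(files) * train_ratio)
    return files[:cut], files[cut:]

# Hypothetical annotated image files, one entry per labeled image.
images = [f"weed_{i:04d}.jpg" for i in range(100)]
train, test = split_dataset(images)
print(len(train), len(test))  # 80 20
```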

3. Results

3.1. Evaluation Indicators

Precision, Recall, F1-Score, and Mean Average Precision (mAP) were employed to evaluate the prediction model performance in this work.
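In object detection, a prediction counts as a True Positive only when its bounding box overlaps a ground-truth box of the same class sufficiently; that overlap is measured by the Intersection over Union (IoU). A minimal sketch, assuming axis-aligned boxes in (x1, y1, x2, y2) form:

```python
def iou(box_a, box_b):
    """Intersection over Union of two axis-aligned boxes,
    each given as (x1, y1, x2, y2)."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Overlap rectangle; width/height clamp to zero when boxes do not intersect.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = (ax2 - ax1) * (ay2 - ay1) + (bx2 - bx1) * (by2 - by1) - inter
    return inter / union if union > 0 else 0.0

print(iou((0, 0, 2, 2), (1, 1, 3, 3)))  # 1/7 ≈ 0.1429
```

A detection whose IoU with a ground-truth box exceeds a chosen threshold (0.5 in Figure 2) is scored as a TP; otherwise it is an FP.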
An alternative term for precision is “positive predictive value”. It is calculated by dividing the True Positives (TP) by the sum of TP and False Positives (FP), as shown in Equation (1). A TP means that the weed belongs to class 1 and the model correctly predicted class 1; an FP means that the model predicted class 1 for a weed that does not belong to that class. The best value for precision is 1.0 and the worst is 0.0.
P = TP / (TP + FP)
where P = Precision.
Recall is also called “sensitivity” or “true positive rate”; it is the ratio of TP to the sum of TP and False Negatives (FN), as shown in Equation (2). An FN means that the weed belongs to class 1 but the model failed to detect it or assigned it to another class. Its value varies from 0 to 1.
Recall = TP / (TP + FN)
The harmonic mean of recall and precision is known as the F1-Score where the best value is 1.0 and 0.0 is the worst. It is calculated by using Equation (3).
F1 = 2 × (Precision × Recall) / (Precision + Recall)
The mean average precision (mAP) is the mean of the average precision (AP) of the detected bounding boxes over all classes. Equation (4) provides the formula for calculating mAP, where AP_i is the average precision of class i and N is the number of classes.
mAP = (1/N) Σ_{i=1}^{N} AP_i
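Equations (1)–(4) translate directly into code. The TP/FP/FN counts and the per-class AP values below are illustrative only, not the paper's measured numbers:

```python
def precision(tp, fp):
    # Equation (1): fraction of predicted positives that are correct.
    return tp / (tp + fp)

def recall(tp, fn):
    # Equation (2): fraction of actual positives that were found.
    return tp / (tp + fn)

def f1_score(p, r):
    # Equation (3): harmonic mean of precision and recall.
    return 2 * p * r / (p + r)

def mean_average_precision(ap_per_class):
    # Equation (4): mean of per-class average precision.
    return sum(ap_per_class) / len(ap_per_class)

# Hypothetical counts for one weed class.
p = precision(tp=40, fp=10)   # 0.8
r = recall(tp=40, fn=40)      # 0.5
print(round(f1_score(p, r), 4))  # 0.6154
# Hypothetical per-class AP values for the five weed classes.
print(mean_average_precision([0.55, 0.48, 0.52, 0.45, 0.47]))  # 0.494
```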

3.2. Experimental Results

The Tiny-YOLOv4 model was tested on unseen images at 416 × 416 resolution, consistent with the training dataset. The experimental results of the Tiny-YOLOv4 model are presented in Figure 2: the red line represents the mAP, the blue line the loss, and the green line the iteration count. The model was trained for 10,000 iterations with 16 subdivisions at 416 × 416 input resolution, and Mish was used as the activation function.
The real-time detection results of the Tiny-YOLOv4 model are shown in Figure 3. The model predicts the potato weeds correctly and efficiently with a high confidence score.
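YOLO-family detectors emit many overlapping candidate boxes per frame; the final predictions shown in Figure 3 come from confidence filtering followed by non-maximum suppression (NMS). A minimal pure-Python sketch of that post-processing step, with illustrative threshold values:

```python
def nms(detections, conf_thresh=0.5, iou_thresh=0.4):
    """Greedy non-maximum suppression over detections given as
    (x1, y1, x2, y2, confidence) tuples. Thresholds are illustrative."""
    def iou(a, b):
        iw = max(0.0, min(a[2], b[2]) - max(a[0], b[0]))
        ih = max(0.0, min(a[3], b[3]) - max(a[1], b[1]))
        inter = iw * ih
        union = ((a[2] - a[0]) * (a[3] - a[1])
                 + (b[2] - b[0]) * (b[3] - b[1]) - inter)
        return inter / union if union > 0 else 0.0

    # Drop low-confidence boxes, then process the rest best-first.
    boxes = sorted((d for d in detections if d[4] >= conf_thresh),
                   key=lambda d: d[4], reverse=True)
    kept = []
    for box in boxes:
        # Keep a box only if it does not heavily overlap a kept box.
        if all(iou(box, k) < iou_thresh for k in kept):
            kept.append(box)
    return kept

dets = [(0, 0, 10, 10, 0.9), (1, 1, 11, 11, 0.8), (50, 50, 60, 60, 0.7)]
print(len(nms(dets)))  # 2 (the two overlapping boxes collapse to one)
```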

4. Conclusions

DL-based weed detection and classification techniques play a vital role in agriculture. In this work, a Tiny-YOLOv4 model was used to detect potato weeds in a real-time environment. The adopted model achieved a 49.4% mAP on a very limited dataset, and the best-trained weights were used for weed detection in potato crops. There is still room for improvement; we therefore plan to extend this system to site-specific spraying technology and to adopt a detection algorithm with higher accuracy.

Author Contributions

Conceptualization, F.K. and N.Z.; methodology, F.K.; validation, M.A., F.K. and N.Z.; formal analysis, F.K.; investigation, S.S.; resources, S.S. and M.N.T.; data acquisition, F.K. and Z.H.; writing—original draft preparation, F.K.; writing—review and editing, F.K. and M.A.; visualization, F.K. and M.A.; supervision, N.Z.; project administration, M.N.T.; funding acquisition, F.K. All authors have read and agreed to the published version of the manuscript.


Funding

The study is part of the PSDP-funded project No. PSDP-332 “Data-Driven Smart Decision Platform” to increase productivity.

Institutional Review Board Statement

Not Applicable.

Informed Consent Statement

Not Applicable.

Data Availability Statement

Not Applicable.


Acknowledgments

The authors would like to thank the DDSDP project.

Conflicts of Interest

The authors declare no conflict of interest.


  1. Gu, J.; Wang, Z.; Kuen, J.; Ma, L.; Shahroudy, A.; Shuai, B.; Liu, T.; Wang, X.; Wang, G.; Cai, J.; et al. Recent advances in convolutional neural networks. Pattern Recognit. 2018, 77, 354–377. [Google Scholar] [CrossRef]
  2. Subeesh, A.; Bhole, S.; Singh, K.; Chandel, N.S.; Rajwade, Y.A.; Rao, K.V.R. Deep convolutional neural network models for weed detection in polyhouse grown bell peppers. Artif. Intell. Agric. 2022, 6, 47–54. [Google Scholar] [CrossRef]
  3. Astani, M.; Hasheminejad, M.; Vaghefi, M. A diverse ensemble classifier for tomato disease recognition. Comput. Electron. Agric. 2022, 198, 107054. [Google Scholar] [CrossRef]
  4. Huang, M.L.; Chuang, T.C.; Liao, Y.C. Application of transfer learning and image augmentation technology for tomato pest identification. Sustain. Comput. Inform. Syst. 2022, 33, 100646. [Google Scholar] [CrossRef]
  5. Azimi, S.; Kaur, T.; Gandhi, T.K. A deep learning approach to measure stress level in plants due to Nitrogen deficiency. Measurement 2021, 173, 108650. [Google Scholar] [CrossRef]
  6. Mishra, A.M.; Harnal, S.; Gautam, V.; Tiwari, R.; Upadhyay, S. Weed density estimation in soya bean crop using deep convolutional neural networks in smart agriculture. J. Plant Dis. Prot. 2022, 129, 593–604. [Google Scholar] [CrossRef]
Figure 1. Proposed methodology diagram.
Figure 2. Tiny-YOLOv4 model mAP @ 0.5 (red), loss (blue), iterations (green).
Figure 3. Real-time detection results of the Tiny-YOLOv4 model.

