Development of Deep Learning-Based Variable Rate Agrochemical Spraying System for Targeted Weeds Control in Strawberry Crop

Agrochemical application is an important tool in the agricultural industry for the protection of crops. Agrochemical application with conventional sprayers wastes a large share of the applied agrochemicals, which not only increases financial losses but also contaminates the environment. Targeted agrochemical sprayers using smart control systems can substantially decrease the chemical input, weed control cost, and destructive environmental contamination. A variable rate spraying system was developed using deep learning methods to classify weeds and to accurately spray the desired weed targets. Laboratory and field experiments were conducted to assess the sprayer performance for weed classification and precise spraying of the target weeds using three classification CNN (convolutional neural network) models. The DCNN models (AlexNet, VGG-16, and GoogleNet) were trained using a dataset containing a total of 12,443 images captured from the strawberry field (4200 images of spotted spurge, 4265 images of Shepherd's purse, and 4178 images of strawberry plants). The VGG-16 model attained higher values of precision, recall, and F1 score than AlexNet and GoogleNet. Additionally, the VGG-16 model recorded the highest percentage of completely sprayed weed targets (CS = 93%). Overall, in all experiments, VGG-16 performed better than AlexNet and GoogleNet for real-time weed target classification and precision spraying. The experimental results revealed that sprayer performance decreased as the traveling speed increased above 3 km/h. The results indicate that the sprayer with the VGG-16 model can achieve performance high enough to make it suitable for real-time spraying applications. It is concluded that the developed variable rate spraying system has the potential for spot application of agrochemicals to control weeds in a strawberry field, reducing crop input costs and environmental pollution risks.


Introduction
Agrochemical application plays an important role in agricultural production. These agrochemicals are essential for controlling weeds, diseases, and pests for better crop yield. When weeds are present in large quantities, crop development suffers: weeds reduce yields, can make harvesting difficult, and may affect the quality of the produce. Spraying plays an important role in reducing crop losses and increasing productivity. Cho et al. [1] stated that spraying reduces dangerous insects and diseases and can eliminate production losses of 30–35%. Plant protection is important to ensure the required quantity and quality of the crop. The use of herbicides is the most preferred method for weed control because manual weeding is a laborious operation. Herbicides are the most widely used type of pesticide today, as weeds are an important factor limiting productivity in many crops [2].
The main goal of agricultural chemical application techniques is to control plant diseases, pests, and weeds with maximum efficiency of agrochemicals and minimum effort, while ensuring minimum pollution. Uniform agrochemical application has been shown to be effective in removing weeds, but it is also a labor-intensive and time-consuming practice. Hydraulic sprayers are generally used for uniform agrochemical applications. In hydraulic sprayers, the spray fluid is pressurized by a pump and then forced through the spray nozzles toward the foliage. The boom sprayer is a type of hydraulic sprayer used for uniform agrochemical applications; it has multiple spray tips spread out along both boom arms with even spacing and pointed straight down toward the target. Air-blast sprayers are also often used, in which the spray liquid is propagated by a high volume of airflow generated by a fan. Agrochemical application by hydraulic sprayers is highly inefficient. These commonly used uniform agrochemical sprayers cause over-application of harmful chemicals, increase crop input costs, deteriorate the environment [3][4][5], put human health at risk through contact [6], and result in low application efficiency [7]. Excessive use of agrochemicals is therefore one of the factors adversely affecting economic, environmental, and production parameters. Due to this negative impact, governments and farmers are trying to reduce the entry of herbicides into agricultural activities [8,9].
Precision agriculture offers a solution to this problem by including weed control mechanisms that apply the treatment at the single-plant level or to a small group of weeds [10]. In developed countries, there is a strong trend toward decreasing the use of herbicides in agricultural production [11][12][13][14]. Spraying application methods should ideally be targeted to ensure the safety of non-targets and the environment, since all pesticides and herbicides pose risks to the environment and the user. Therefore, there is a serious need to decrease dependence on traditional agricultural chemicals (and sprayers) without disturbing crop production. In the future, herbicide applications will likely need to be more precisely targeted at weeds. Intelligent weed removal, comprising weed recognition and elimination, has gained high acceptance among farmers. Variable rate agrochemical (VRA) application has become increasingly popular in recent years because of its abundant potential to increase weed control efficiency while decreasing environmental risks and economic expenses [15,16].
Variable rate agrochemical (VRA) application can greatly decrease the quantity of chemicals used and reduce the cost of weed control. Intelligent agrochemical spraying systems based on real-time sensing technology have emerged rapidly in recent decades. Spraying the crop at an appropriate application rate supports precision farming and precision fruit growing. It has been reported that only 30–40% of applied agrochemicals reach the targets, while the rest disperses into the environment, exposing workers and polluting the surroundings. The purpose of protecting plants with various chemicals is to prevent crop damage [17]. Selective agrochemical spraying is one of the most satisfactory weed control mechanisms today. To decrease these adverse effects, targeted agrochemical spraying methods have brought an essential advance in terms of effectiveness and protection by applying modern developments in microelectronics, artificial intelligence, and robotics [18]. Jianjun et al. [19] reported an infrared target detection system for variable rate spraying. The sprayer meets the precise design requirements for targeted spraying and can measure the distance between target and sprayer.
Bargen et al. [20] developed a near-infrared echo recognition method to identify plants. Plant detection is based on differences in canopy size and structure. However, target detection systems built on infrared equipment are unable to capture detailed characteristic information. Ultrasonic sensors have also been used for variable rate spraying systems. Giles et al. [21] developed an ultrasonic sensor-based conventional air-blast variable rate sprayer; however, ultrasonic sensor performance degrades with background noise. Palacin et al. [22] used a laser scanner to measure plant canopy volume in real time for variable rate spraying. Lasers are among the most commonly used sensors for variable rate spraying but also have some drawbacks: they cannot be used in snowy or foggy weather conditions, the laser beams cannot penetrate the vegetation, and they perform poorly in edge detection.
Oberti et al. [23] presented a targeted agrochemical spraying robot. They used crop image recognition to detect plants infected with powdery mildew, which initially gives plants a powdery appearance along with other characteristic symptoms. Berenstein et al. [24] showed that targeted agrochemical spraying can decrease the amount of agrochemicals used in present-day farming by 30%. Lee et al. [25] proposed a targeted spraying system using computer vision and accurate agrochemical application to control weeds, with agrochemical savings of up to 97%.
Machine vision is an effective means of recognizing the position, size, shape, color, and texture of vegetation. The use of computer vision in automated farming is becoming increasingly popular [26]. Lamm et al. [27] developed a spraying prototype using computer vision procedures that is capable of differentiating weeds in the cotton crop; 88.8% of the weeds and 21.3% of the crop were sprayed. Sun et al. and Song et al. [28,29] studied a variety of weed detection sensors and techniques, such as computer vision, remote sensing, thermal imaging, and spectral analysis. Computer vision has been used for several years and can differentiate weeds and plants from the background through image segmentation procedures based on the color variance among them. Hijazi et al. [30] presented high-speed imaging methods for targeted chemical spraying. Another machine vision system was developed for the automatic segmentation of plants at different growth stages under different imaging and lighting conditions; it achieved high accuracy and speed [31]. Burgos et al. [32] presented several machine vision-based image processing approaches for calculating the proportions of weed, crop, and soil present in an image representing an area of interest of the crop field.
The use of artificial intelligence can considerably increase the proficiency of spraying systems [33]. Artificial intelligence uses data-driven modeling techniques, taking processed and labeled images of the targets as input, aimed at the development of a computer vision method. Target plants are identified on the basis of their morphology and texture [34]. In the past few years, machine vision and deep learning methods have shown significant progress. Various machine learning applications have been developed to detect and distinguish weeds from crops in coordination with sensors [35,36].
Machine vision using deep learning artificial neural networks (DL-ANNs) is a relatively new and very effective tool for classifying digital images and identifying objects within images. Deep learning convolutional neural networks (CNNs) are an advanced form of image processing that can quickly and accurately classify images or objects within an image [37]. A major benefit of deep learning is that features are automatically extracted from the raw data. Among deep learning methods, the convolutional neural network (CNN) has shown great progress in large-scale image and live-video target identification [38]. The CNN was initially inspired by the visual system of animals. The rise of deep learning with CNNs (DL-CNN) has been driven by inexpensive multiprocessor graphics cards and graphics processing units (GPUs). GPUs excel at rapidly multiplying the matrices and vectors required for DL-CNN training, which can greatly accelerate learning [39].
In recent years, DL-ANNs have been widely tested and deployed in agricultural applications such as weed, pest, and disease detection, fruit and plant counting for yield estimation, and intelligent navigation in autonomous vehicles [40][41][42]. Deep learning methods have become the most popular today due to the state-of-the-art results achieved in image classification, target detection, and natural language processing. The performance of deep learning techniques depends more strongly on the quality of the dataset than that of other, traditional machine learning techniques.
Dos Santos Ferreira et al. [43] developed a convolutional neural network model to classify various broadleaf and grassy weeds in the soybean crop; the model performed well and achieved an overall accuracy of >99%. Other researchers used a combination of small components of a precision convolutional neural network (CNN) to obtain an accurate weed classification that achieved 90% accuracy while processing 1.07–1.83 frames per second [44]. Sa et al. [45] used an encoder–decoder cascaded convolutional neural network (CNN) to perform semantic classification of dense weed imagery with high classification accuracy.
Yang and Sun [46] reported that DCNNs are well suited to agricultural applications; they used DCNN models to classify weeds with >95% accuracy. Milioto et al. [47] reported achieving high classification accuracy (99%) using CNNs to distinguish sugar beets from weeds. Object detection models have been used for detection as well as real-time application of agrochemicals with high accuracy [48]. Lee et al. [49] introduced a DL-CNN method for plant identification based on leaf characteristics. The development of DL-CNN models is the first step toward a machine vision detection system for precise agrochemical application. Different convolutional neural network (CNN) models have been established to deliver consistent crop/weed recognition outcomes. Recently, convolutional neural networks (CNNs) have been used in agricultural applications for in-depth studies. A CNN model's main advantage is its high performance in target recognition and automated feature extraction. Two CNN models (AlexNet and GoogleNet) were used for classification in [50]. The performance of both models was assessed in terms of accuracy, recall, and precision; the results showed that GoogleNet performed better than the AlexNet architecture. In [51], the VGGNet model achieved higher classification accuracy (0.95). Among the several CNN models, VGGNet is among the most accurate in object classification.
This study evaluates the possibility of using three DL-CNN architectures, GoogleNet [52], AlexNet [53], and Visual Geometry Group Net (VGG-16) [36], for precision spraying in the strawberry field. The literature review revealed that the use of CNN models for weed classification in variable rate spraying still requires research to deliver a deep learning-based spraying system. Therefore, the purpose of this study was to evaluate the effectiveness of the three deep learning models (GoogleNet, AlexNet, and VGG-16) for targeted weed spraying. The effect of sprayer ground speed on the performance of the CNN models for spraying accuracy was also investigated. This article introduces and evaluates an economical variable rate sprayer for precise weed management. Three trained models (GoogleNet, AlexNet, and VGG-16) were deployed and tested on the sprayer in laboratory and field experiments for targeted weed spraying. In this paper, a new deep learning convolutional neural network approach was used for precise agrochemical spraying to control weeds (spotted spurge, Shepherd's purse) in the strawberry field.

The Spraying Machine
A deep learning-based variable rate spraying system was designed and developed for targeted spraying application. The sprayer consists of an electric four-wheeled chassis frame vehicle. The front two hub-motor wheels were used as the driving wheels, and the rear two wheels are independent caster-type wheels used for turning the sprayer. The sprayer had 0.60 m of ground clearance, and the spacing between the two tires was 0.70 m. The wheel spacing and sprayer height were adjustable according to the site conditions. The sprayer prototype, with a size of 1.2 m (length) × 0.80 m (width) × 0.75 m (height), is powered by a 24-V lithium battery, is driven by remote control (SAGA1-L8B), and is rated for a carrying capacity of up to 100 kg.

The Deep Learning-Based Variable Rate Spraying System
The developed variable rate spraying system using deep learning techniques consists of the following machine components for targeted spraying. A 50 L spraying tank was used for agrochemical storage and was connected to a water diaphragm pump with a maximum discharge flow of 15 L/min and a maximum pressure of 4 bar. The pump's main task was to supply the liquid toward the nozzles continuously. A pressure relief bypass valve was used to avoid backflow toward the agrochemical tank and to maintain the flow toward the nozzles in case one or more nozzles close when the system does not detect weeds. Three 12 V solenoid valves (ASCO 8262H002), which have a response time of less than 60 ms, were used to actuate the nozzle spraying system through relay switches. The distance between two adjacent spraying nozzles was kept at 0.7 m to cover one row of the strawberry crop, and a camera was fixed with each nozzle for image acquisition.
A microcontroller (Arduino ATmega328) panel was used to control all the spraying system devices. The spraying nozzles (Solo 4900654-P) were flexible, so the height and angle of each spraying nozzle could be adjusted to the site conditions. Three low-cost webcams (Aluratek AWC01F) were used for the real-time image acquisition process in the experimental field. The cameras were mounted on an iron bar at the heights and positions of the spray nozzles to capture images in real time for further processing by the spraying system. The total width of the bar was 1.40 m, which allowed the mounting of three cameras and three nozzles at the same time. The cameras were adjusted so that the overlap between their fields of view was minimized. A schematic working diagram of the spraying system is presented in Figure 1.
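Because the cameras sit ahead of the nozzles and the solenoid valves respond in under 60 ms, the actuation of each valve must be timed to the sprayer's ground speed. As a minimal sketch (this logic is not described in this form by the authors; the camera-to-nozzle offset value is a hypothetical assumption), the delay between detecting a weed and opening the valve could be estimated as:

```python
def trigger_delay_s(offset_m, speed_kmh, valve_response_s=0.060):
    """Delay between weed detection and valve actuation so the spray
    lands on the target: travel time over the camera-to-nozzle offset
    minus the solenoid valve response time (clamped at zero)."""
    speed_ms = speed_kmh / 3.6          # convert km/h to m/s
    travel_s = offset_m / speed_ms      # time for the nozzle to reach the target
    return max(travel_s - valve_response_s, 0.0)

# At 3 km/h with a hypothetical 0.25 m offset:
# travel time is 0.3 s; minus the 60 ms valve response, about 0.24 s.
```

At higher speeds the available margin shrinks toward the valve response time, which is consistent with the reported drop in sprayer performance above 3 km/h.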
An Nvidia GeForce GTX 1080 processing unit was used to run the convolutional neural networks (CNNs) in this study. The Nvidia GeForce GTX 1080 is a particularly powerful graphics card with 8 GB of GDDR5X RAM on a 256-bit memory bus and a Pascal GP104 GPU operating at a frequency of 1733 MHz. Three CNN models were selected for weed identification: AlexNet, VGG-16, and GoogleNet. AlexNet has 61 million parameters. In 2014, VGG-16 won the ILSVRC award. The VGG-16 architecture consists of 16 layers and 138 million parameters. GoogleNet has seven million parameters and a total of 22 layers. Improved versions of these modern models have been found to have remarkable target detection accuracy [54].

Data Acquisition and Image Processing
The images used for this study were collected from strawberry fields in southern Punjab (31°21′41.99″ N, 70°58′10.99″ E), Pakistan, during the 2020 crop season. Two digital cameras with resolutions ranging from 3000 × 2000 to 1500 × 1000 pixels captured color images of two weeds (spotted spurge, Shepherd's purse) and strawberry plants. Approximately 70% of the images were captured from waist height (0.99 ± 0.09 m), 15% from knee height (0.52 ± 0.04 m), and 15% from chest height (1.35 ± 0.07 m) to allow the CNNs to recognize targets at a range of distances. These images were captured at different times of the day, under varying light intensities, and from different angles to capture every possible setting for the CNN models.

After applying the augmentation process to the captured images, a total of 12,443 images were generated, of which 4200 images were of spotted spurge, 4265 of Shepherd's purse, and 4178 of strawberry plants (Figure 2). The image dataset was randomly subdivided into three sub-datasets for training, validation, and testing of the classification CNN models. Around 70% (8500 images) was used for training, of which 4200 images were of spotted spurge and Shepherd's purse weeds and 2300 were of strawberry plants. Approximately 20% (2488) of the dataset was used for validation (1660 of spotted spurge and Shepherd's purse, and 830 of strawberry plants), and 10% (1240) of the images, including both weeds and strawberry plants, were used for testing.
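The roughly 70/20/10 train/validation/test split described above can be sketched as a simple shuffled partition of the image paths; the function name and fixed seed below are illustrative assumptions, not the authors' code:

```python
import random

def split_dataset(paths, train=0.70, val=0.20, seed=42):
    """Shuffle image paths and partition them ~70/20/10 into
    train/validation/test subsets, mirroring the proportions
    used for the 12,443-image strawberry/weed dataset."""
    paths = list(paths)
    random.Random(seed).shuffle(paths)  # deterministic shuffle for reproducibility
    n = len(paths)
    n_train = int(n * train)
    n_val = int(n * val)
    return (paths[:n_train],
            paths[n_train:n_train + n_val],
            paths[n_train + n_val:])
```

A fixed seed keeps the split reproducible across training runs, so the same images never leak from the training subset into the test subset.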


DCNN Models Training and Testing
For training and testing the DCNN models, all images were re-scaled and cropped to 224 × 224 pixels using IrfanView software (Version 4.54) as the input dataset for the learning, validation, and testing processes. The ratio of training to testing/validation images was 70:30 for both the weed and plant image datasets. The images used for training were not reused during testing, so that model performance could be assessed fairly.
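The re-scaling and cropping step could, for instance, be reproduced by computing a centered square crop box before resizing to 224 × 224; this helper is an illustrative assumption, not the IrfanView workflow the authors actually used:

```python
def center_crop_box(width, height, target=224):
    """Return the (left, upper, right, lower) box of the largest
    centered square in a width x height image. The square is then
    resized to target x target pixels (e.g. with an image library's
    crop-and-resize operations) to form the CNN input."""
    side = min(width, height)           # largest square that fits
    left = (width - side) // 2          # center horizontally
    upper = (height - side) // 2        # center vertically
    return (left, upper, left + side, upper + side)
```

For a 3000 × 2000 capture, this yields a 2000 × 2000 centered square, which is then downsampled to the 224 × 224 network input size.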
All training experiments were performed on a GPU (Nvidia GTX 1080) under Ubuntu 16.04. The TensorFlow framework, developed by Google and initially designed for large-scale numerical computation [55], was used for model training. Figure 3 shows the convolutional neural network algorithm for weed classification. During the training of the convolutional neural network models, several hyper-parameters were selected to obtain high precision: a momentum of 0.95, an input image size of 224 × 224 pixels, a base learning rate of 0.001, a weight decay of 0.0005, and 6000 iterations. The VGG-16, AlexNet, and GoogleNet DCNN models were trained on the weed and plant classification datasets using the TensorFlow framework. After training, the DCNN models were ready to classify the weeds from real-time input images from the camera (Figure 4).

Precision, recall, F1 score, and overall accuracy were used to assess the performance of the convolutional neural network models used in this study. Performance results for all CNN models were compiled into binary confusion matrices containing true positives (Tp), false positives (Fp), true negatives (Tn), and false negatives (Fn) [56]. In this framework, Tp represents images with correctly classified weeds; Tn represents correctly classified images of strawberry plants without weeds; Fp is the number of images incorrectly classified as weed images; and Fn represents weed images incorrectly classified as non-weed images, such as strawberry plants. Precision, recall, overall accuracy, and F1 score range from 0 to 1; the higher the value, the better the network performs.

Precision represents the accuracy of the neural network in the event of a positive classification and is measured by Equation (1):

Precision = Tp/(Tp + Fp) (1)

Recall represents the effectiveness of the neural network in classifying the target objects and is calculated by Equation (2):

Recall = Tp/(Tp + Fn) (2)

Overall accuracy is the proportion of correctly classified observations and is measured by Equation (3):

Overall accuracy = (Tp + Tn)/(Tp + Fp + Fn + Tn) (3)

The F1 score is the harmonic mean of precision and recall, calculated by Equation (4):

F1 score = 2 × precision × recall/(precision + recall) (4)
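Equations (1)–(4) can be computed directly from the four confusion matrix counts; a minimal sketch (the counts in the usage test are hypothetical, not results from the paper):

```python
def classification_metrics(tp, fp, tn, fn):
    """Precision, recall, overall accuracy and F1 score from a
    binary confusion matrix, following Equations (1)-(4)."""
    precision = tp / (tp + fp)                     # Equation (1)
    recall = tp / (tp + fn)                        # Equation (2)
    accuracy = (tp + tn) / (tp + fp + fn + tn)     # Equation (3)
    f1 = 2 * precision * recall / (precision + recall)  # Equation (4)
    return {"precision": precision, "recall": recall,
            "accuracy": accuracy, "f1": f1}
```

Because the F1 score is a harmonic mean, it is pulled toward the lower of precision and recall, which is why it is a stricter summary than overall accuracy when the classes are imbalanced.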

Electronic Mechanism
The spraying system's electronic mechanism was automated with the controller system and relays. To act on the recommended weed identification results, an Arduino script was written to read the signal data coming from the processing unit containing the classified weeds. Data communication between the processing unit (GPU) and the Arduino microcontroller (ATmega328) unit was carried out over a universal serial bus (USB) connection. The trained CNN models produce classification results from the input images acquired through the cameras. During this phase, the image classification results from the neural network model are transmitted from the processing unit to the microcontroller unit; the microcontroller unit then generates the spraying control signals, which activate the relays and open the solenoid valves. As a solenoid valve opens, the spraying liquid starts flowing toward the corresponding spraying nozzle and is applied to the desired weeds.
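The host-side decision logic amounts to mapping each camera's classification label to a relay state for its nozzle. The label strings, function names, and byte packing below are illustrative assumptions rather than the authors' actual Arduino/USB protocol:

```python
# Hypothetical class labels for the three categories in this study.
WEED_CLASSES = {"spotted_spurge", "shepherds_purse"}  # spray targets
NON_TARGET = "strawberry"                             # never sprayed

def nozzle_commands(classifications):
    """Map per-camera classification labels to per-nozzle relay
    states: True opens the solenoid valve, False keeps it closed.
    Only weed classes trigger spraying; strawberry plants do not."""
    return [label in WEED_CLASSES for label in classifications]

def command_byte(states):
    """Pack the three nozzle states into one byte (bit i = nozzle i),
    e.g. for transmission to the microcontroller over serial/USB."""
    value = 0
    for i, on in enumerate(states):
        if on:
            value |= 1 << i
    return value
```

On the microcontroller side, the received byte would be unpacked bit by bit to drive the three relay outputs, keeping the serial protocol to a single byte per frame.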

Research Plan
The experiments were designed to evaluate the performance of the CNN models on input images acquired in real time from the camera, and to evaluate the working performance of the variable rate sprayer. The experiments were conducted in simulated lab conditions and later in actual field conditions. Two weeds, spotted spurge and Shepherd's purse, were selected as targets, and strawberry plants as non-targets. In all experiments, the performance of the variable rate sprayer was assessed after spraying the target weeds (spotted spurge, Shepherd's purse) by manually observing the red-colored spraying liquid on the target weeds to determine whether each weed was sprayed or not. Red paint was added to the spray liquid (water) so that sprayed weeds could easily be identified visually in all experiments. The performance of the spraying system was calculated using the assessment models described in Table 1. In the assessment models, states A, B, C, and D indicate a completely sprayed, incompletely sprayed, not sprayed, and mistakenly sprayed target, respectively. The percentages of states A, B, C, and D were also calculated, as shown in Table 1.
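The per-state percentages in Table 1 reduce to simple ratios over the number of targets in a run; a sketch, with hypothetical counts in the usage test (not results from the paper):

```python
def spray_assessment(counts, total_targets):
    """Percentages of completely sprayed (A), incompletely sprayed (B),
    not sprayed (C) and mistakenly sprayed (D) targets, as in Table 1.
    `counts` maps state letters to observed target counts; missing
    states default to zero."""
    return {state: 100.0 * counts.get(state, 0) / total_targets
            for state in "ABCD"}
```

For example, 27 completely sprayed targets out of 30 gives CS = 90%, the same style of figure as the CS = 93% reported for VGG-16.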

Lab Experiment
In the lab experiment, the classification accuracy of the three DCNN models (AlexNet, VGG-16, and GoogleNet) on input images acquired in real time from the camera and the precision spraying of the variable rate sprayer were assessed. The effect of sprayer (chassis) ground speed on classification accuracy and precision spraying was analyzed. Three different ground speeds (1, 3, and 5 km/h) were applied to evaluate the trained models' classification accuracy in real-time image acquisition and precision spraying. During the laboratory experiments, the wind speed, relative humidity, and air temperature were 2–4 km h−1, 20–25%, and 25–35 °C, respectively. For the lab experiment, a track was developed to simulate a strawberry field (Figure 5). The trial track contained the two weeds (spotted spurge, Shepherd's purse) placed randomly among strawberry plants in a straight line. The weeds served as targets and the strawberry plants as non-targets. The trial track held three parallel rows of weeds and strawberry plants. Each row had ten targets (both weeds) and ten non-targets (strawberry plants). The track was 10 m long and 1.4 m wide (two adjacent rows 0.7 m apart). The experiment was repeated ten times, and the average values were calculated for all three DCNN models; during each repetition, the positions of the target weeds and plants were rearranged manually.


Field Experiment
Field performance evaluation of the sprayer was conducted in a strawberry field (Figure 6), so that the variable rate sprayer could be assessed under the actual, complex conditions of the field. In the field experiments, the wind speed was 2-5 km h−1, with an ambient temperature of 27-32 °C and relative humidity of 14-20%. The classification accuracy and spraying precision of the system using the three neural network models (AlexNet, VGG-16, and GoogleNet) were evaluated. The most promising traveling speed from the lab experiments (1 km/h) was applied in the field evaluation. Three adjacent rows of the strawberry field were randomly selected to assess sprayer performance, with each nozzle of the sprayer covering one row of the strawberry crop. The length of the selected rows was 10 m and the total width of the three adjacent rows was 1.5 m (two adjacent rows were 0.7 m apart). The experiment was repeated five times, with 30 target weeds in each experiment. The average values were calculated for all three models to evaluate the performance of the variable rate sprayer. For good performance, the sprayer should spray only the weeds (targets) and not the strawberry plants.

Validation Dataset Results of DCNNs Models
The performance results (accuracy, precision, recall, and F1-score) of the CNN models on the validation dataset are provided in Table 2. For all models, classification accuracy was in the range of 0.95 to 0.97. The validation results show that the VGG-16 model outperformed the other two CNN models, achieving the highest values of accuracy, precision, recall, and F1-score: precision of 0.98, recall of 0.97, F1-score of 0.97, and accuracy of 0.97 for weed classification. GoogleNet achieved a precision of 0.96, recall of 0.97, F1-score of 0.96, and accuracy of 0.96 on the validation dataset, while the AlexNet model recorded the lowest values of the three models.
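The reported precision, recall, and F1-score follow the standard definitions computed from per-class true-positive, false-positive, and false-negative counts. A minimal sketch, using illustrative counts that are not the paper's actual confusion matrix:

```python
def precision_recall_f1(tp, fp, fn):
    """Per-class metrics from true-positive, false-positive, false-negative counts."""
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = (2 * precision * recall / (precision + recall)) if precision + recall else 0.0
    return precision, recall, f1

# Illustrative counts for one weed class (hypothetical, not from Table 2):
p, r, f = precision_recall_f1(tp=97, fp=2, fn=3)
print(round(p, 2), round(r, 2), round(f, 2))  # → 0.98 0.97 0.97
```

The macro-averaged values over the three classes (two weeds plus strawberry) would be the per-class means of these quantities.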

Deep Learning Models Results in the Lab Experiment
The DCNN models (AlexNet, VGG-16, and GoogleNet) were successfully trained to classify the target weeds in the strawberry crop, and their performance was analyzed on input images acquired in real time from the camera during the lab experiment. Table 3 shows the real-time weed classification results for the AlexNet, VGG-16, and GoogleNet models. The VGG-16 model classified the weeds more precisely than AlexNet and GoogleNet: its precision, recall, F1-score, and accuracy values were all higher, meaning that VGG-16 identifies spotted spurge and Shepherd's purse in the strawberry crop considerably better than the other two models. The maximum values of precision (0.96), recall (0.94), F1-score (0.94), and accuracy (0.95) were achieved by the VGG-16 model with the sprayer running at 1 km/h; the lowest values of precision (0.88), recall (0.85), F1-score (0.86), and accuracy (0.87) were recorded at 5 km/h, because the acquired images were blurry and model performance decreased. Similarly, the GoogleNet model reached its peak values of precision (0.94), recall (0.92), F1-score (0.92), and accuracy (0.93) at 1 km/h. The AlexNet model recorded the lowest values of the three models at 1 km/h: precision (0.92), recall (0.90), F1-score (0.90), and accuracy (0.91). The experimental results revealed that increasing the sprayer's forward speed reduces the classification accuracy of the system due to blurry real-time image acquisition; only minor performance differences were observed between 1 km/h and 3 km/h.

Performance Evaluation of Spraying System
The precision of the variable rate sprayer was evaluated in the laboratory experiment, where the sprayer performed well in terms of precision and accuracy. The number of target weeds in the experiment was 30. The sprayer using the VGG-16 model achieved better results than with the other two models (GoogleNet, AlexNet), specifically when comparing the completely sprayed weeds (CS) and missed targets (NS). The lab precision-spraying results are shown in Table 4. The highest percentage of completely sprayed weeds (CS = 93%) was noted for the VGG-16 model at 1 km/h, and the lowest (CS = 80%) at 5 km/h; this decrease in performance was due to classification failures caused by blurry image acquisition. Similarly, for the GoogleNet model, the highest percentage of completely sprayed weeds (CS = 90%) was achieved at 1 km/h and the lowest (CS = 76%) at 5 km/h. The AlexNet model recorded the lowest percentage of completely sprayed weeds of the three models at 1 km/h (CS = 87%). Some incompletely sprayed (IS) targets were noted with the GoogleNet and AlexNet models; this was reduced when spraying with the VGG-16 model. The sprayer did not mistakenly spray any non-weed target in the lab experiments. Overall, the VGG-16 model outperformed GoogleNet and AlexNet for precision spraying in the lab. The percentage of completely sprayed targets decreased as sprayer ground speed increased, because the models' classification accuracy dropped with blurry image acquisition; only minor differences in sprayer performance were recorded between 1 and 3 km/h.
The percentage of missed targets (NS) increased when the ground speed rose above 3 km/h. Consequently, this study suggests a real-time sprayer ground speed of 1 km/h to achieve the highest percentage of completely sprayed targets.
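Ground speed also constrains the valve timing: the nozzle must open exactly when a detected weed passes beneath it, so the available trigger delay shrinks as speed grows. A sketch of that relation, assuming a hypothetical 0.5 m camera-to-nozzle offset (the actual offset is not stated in the paper):

```python
def nozzle_trigger_delay(offset_m, speed_kmh):
    """Seconds to wait after detection before opening the solenoid valve,
    assuming the nozzle sits offset_m behind the camera's field of view."""
    speed_ms = speed_kmh / 3.6  # km/h -> m/s
    return offset_m / speed_ms

# At 1 km/h a hypothetical 0.5 m offset allows 1.8 s between detection
# and spraying; at 5 km/h the same offset allows only 0.36 s, leaving
# far less time for image capture, inference, and valve actuation.
print(round(nozzle_trigger_delay(0.5, 1.0), 2))  # → 1.8
print(round(nozzle_trigger_delay(0.5, 5.0), 2))  # → 0.36
```

Under these assumed geometry values, the shrinking time budget at higher speeds is consistent with the observed drop in completely sprayed targets above 3 km/h.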

Deep Learning Models Results in The Field Experiment
The deep learning models (AlexNet, VGG-16, and GoogleNet) successfully classified the weeds in live input images from the camera during the field experiments. In the complex field environment, the performance of the deep learning networks was assessed on images acquired in real time from the camera. Both VGG-16 and GoogleNet successfully classified the weeds (spotted spurge, Shepherd's purse) in the field experiments, with VGG-16 producing significantly better classification results than GoogleNet and AlexNet.
The VGG-16 model achieved peak field values of precision (0.90), recall (0.88), and F1-score (0.88); overall, it recorded higher precision, recall, and F1-score values than the other two models. Table 5 shows the weed classification results of the AlexNet, VGG-16, and GoogleNet models in the field experiments. The performance of the variable rate sprayer was also assessed under these complex field conditions. During the field experiments, the VGG-16 model clearly performed better than the GoogleNet and AlexNet models, particularly when comparing the completely sprayed targets (CS) and missed targets (NS) percentages (Table 6). The VGG-16 model achieved a higher percentage of completely sprayed targets (CS = 86%) than GoogleNet (CS = 83%), an increase of 3%, while the AlexNet model recorded a lower percentage of completely sprayed targets (CS = 77%).

Table 6. Sprayer performance in the field experiments (30 targets per experiment).

Model       CS   IS   NS   MS   CS (%)   IS (%)   NS (%)   MS (%)
VGG-16      26    1    3    0       86        3       10        0
GoogleNet   25    1    3    1       83        3       10        3
AlexNet     23    1    5    1       77        3       17        3

It can also be observed from Figure 7 that the AlexNet model missed more targets than GoogleNet and the VGG-16 model; the AlexNet model was therefore not adequate for complex field conditions. Another important observation was that not a single mistakenly sprayed (MS) non-weed target was recorded with the VGG-16 model. Overall, the sprayer performed significantly well with the VGG-16 model under complex field conditions.


Discussion
Among the image classification neural networks, VGG-16 performed better than AlexNet and GoogleNet. Precision, F1-score, recall, and accuracy values were all higher for the VGG-16 model than for the other two models, meaning that VGG-16 classifies the weeds (spotted spurge and Shepherd's purse) in strawberry fields considerably better. Other researchers [51,57,58] have also reported that VGG-16 can achieve better precision than the GoogleNet and AlexNet models in weed classification. The CNN models (AlexNet, VGG-16, and GoogleNet) were trained using a dataset of 12,443 images captured from the strawberry field (4200 images with spotted spurge, 4265 images with Shepherd's purse, and 4178 images of strawberry plants). The ratio of training to validation images was 70:30 for both the weed and plant image datasets.
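A 70:30 split of this kind is typically produced by shuffling and cutting the image list per class. A minimal sketch, with hypothetical file names and a fixed seed for reproducibility (both are illustrative assumptions, not details from the paper):

```python
import random

def split_dataset(image_paths, train_frac=0.7, seed=42):
    """Shuffle and split image paths into train/validation lists (70:30 by default)."""
    paths = list(image_paths)
    random.Random(seed).shuffle(paths)  # deterministic shuffle for reproducibility
    cut = int(len(paths) * train_frac)
    return paths[:cut], paths[cut:]

# Illustrative: the 4200 spotted-spurge images -> 2940 train / 1260 validation.
train, val = split_dataset([f"spurge_{i}.jpg" for i in range(4200)])
print(len(train), len(val))  # → 2940 1260
```

Splitting each class separately, as sketched here, keeps the 70:30 ratio per class rather than only in aggregate.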
The validation results showed that the VGG-16 model outperformed the other two CNN models, achieving higher values of accuracy, precision, recall, and F1-score. Laboratory and field experiments were performed to evaluate the smart sprayer's performance in classifying weeds from input images acquired in real time from the camera and in precisely spraying the target weeds using the three trained object classification CNN models.
In the laboratory experiments, the VGG-16 model showed the highest classification values of precision (0.96), recall (0.94), and F1-score (0.94), as well as the highest percentage of completely sprayed targets (CS = 93%). The second-best model was GoogleNet, which recorded a precision of 0.94, recall of 0.92, F1-score of 0.92, and a completely sprayed (CS) percentage of 90%. The AlexNet model recorded the lowest weed classification values of precision (0.92), recall (0.90), and F1-score (0.90), and also the lowest percentage of completely sprayed targets (CS = 87%).
In the complex field environment, the peak values of precision (0.90), recall (0.88), and F1-score (0.88) were achieved by the VGG-16 model. The VGG-16 model also recorded a 3% and 9% higher percentage of completely sprayed targets than the GoogleNet and AlexNet models, respectively, and reduced the percentage of missed weeds (spotted spurge, Shepherd's purse) from 17% to 10% compared with the AlexNet model. Both VGG-16 and GoogleNet recorded significant performance results in the field experiments. The AlexNet model appeared less accurate for real-time weed classification in the field, recording the lowest values of precision (0.85), recall (0.81), and F1-score (0.82), and the lowest percentage of completely sprayed targets (CS = 77%).
The experimental results also showed that sprayer performance decreased when the traveling speed rose above 3 km/h: the higher ground speed (5 km/h) resulted in weed classification failures due to blurry image quality in real-time image acquisition. Overall, across all experiments the VGG-16 model provided precise and accurate results for real-time weed (Shepherd's purse and spotted spurge) classification and precision spraying.
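The speed-blur link can be made quantitative: the smear in the image is roughly the distance traveled during the exposure multiplied by the camera's ground resolution. A sketch with assumed, illustrative exposure and resolution values (not the paper's camera parameters):

```python
def motion_blur_pixels(speed_kmh, exposure_s, pixels_per_meter):
    """Approximate image smear (pixels) caused by sprayer motion during exposure."""
    speed_ms = speed_kmh / 3.6  # km/h -> m/s
    return speed_ms * exposure_s * pixels_per_meter

# Assuming a 10 ms exposure and ~1000 px/m ground resolution (hypothetical),
# the smear grows linearly with speed: about 2.8, 8.3 and 13.9 px
# at 1, 3 and 5 km/h respectively.
for v in (1, 3, 5):
    print(v, round(motion_blur_pixels(v, 0.010, 1000), 1))
```

Under these assumptions, the smear at 5 km/h is five times that at 1 km/h, which is consistent with the reported classification failures at the highest speed and the minor differences between 1 and 3 km/h.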

Conclusions
A variable rate sprayer using deep learning methods to classify weeds in real time was developed, and its performance was evaluated by precisely spraying the desired target weeds. The sprayer comprises webcams for capturing images, a computing unit for image processing, a microcontroller board to control system operation, and spray nozzles with solenoid valves. It captures and processes images and sends trigger signals to open the nozzles and spray the targeted weeds. This study applied the AlexNet, VGG-16, and GoogleNet deep learning architectures for weed classification: the three object classification CNN models were successfully trained and tested with images of the weeds (spotted spurge, Shepherd's purse) and strawberry plants. The goal of this study was to evaluate the potential of convolutional neural network models to identify spotted spurge and Shepherd's purse weeds in real time for spraying. Based on model accuracy and spraying precision, VGG-16 was more effective than AlexNet and GoogleNet. The experimental results indicate that the sprayer with the VGG-16 model can achieve high performance for real-time spraying applications in the field.
In all experiments, the VGG-16 model demonstrated significant performance results. Based on the outcomes of the lab experiments and the real-time field assessments, it can be concluded that the developed variable rate sprayer is capable of precisely differentiating between the weeds (Shepherd's purse, spotted spurge) and non-weed targets (strawberry plants) and of spraying the target weeds. The developed system offers a potential solution to prevent the waste of agrochemical inputs, thereby increasing farmers' productivity and reducing environmental contamination.
Future research can focus on the identification of disease-affected plants. Additionally, further CNNs will be trained at lower resolutions and for other target weeds. It was also observed in the field experiments that sprayer performance was better under tree shadow than in the open field; therefore, artificial lighting will be added in future work for better performance.

Data Availability Statement:
The data presented in this study are available on request from the corresponding author.

Conflicts of Interest:
The authors declare no conflict of interest.