1. Introduction
Pesticides and fertilizers are essential for crop protection but pose significant health risks when applied manually. In 2018, the World Health Organization reported over a million health issues linked to manual pesticide spraying [1]. To address this issue, unmanned aerial vehicles (UAVs) have emerged as a promising solution for pesticide application and crop monitoring. This technology not only safeguards human health but also optimizes resource utilization.
Beyond pesticide application, UAVs have broad applications in precision agriculture. Equipped with advanced sensors, they can efficiently collect data for topographic surveys, crop health assessments, pest monitoring, and targeted treatment [2,3]. Early detection and precise intervention enabled by UAVs can significantly enhance crop yields and reduce losses due to pests and diseases.
Despite challenges such as cost and payload limitations, UAVs offer significant economic potential in agriculture due to their ability to increase productivity and reduce crop damage. Numerous studies have explored the potential of UAVs to transform agricultural practices, particularly in precision agriculture. UAVs excel in real-time data acquisition and crop monitoring, providing invaluable insights for farm management. The historical development of UAV-based low-altitude remote sensing in agriculture has been well documented, offering detailed analyses of UAV characteristics, sensor capabilities, and operational constraints. Applications range from soil and production mapping to the integration of Geographic Information Systems (GISs), as well as crop monitoring and optimized pesticide spraying [4,5,6].
Regular crop observation provides crucial data on the condition of the plants and identifies specific areas that need more attention from producers. This activity can be carried out efficiently through visual inspection, allowing the detection of signs that indicate the need for crop care, such as irregular plant growth, changes in leaf coloration that suggest the presence of pests, soil erosion, and problems in the post-planting process. According to [7], remote sensing monitoring is often carried out using drones equipped with multiple cameras. Recent advances in UAV aerial imagery highlight the technology’s potential for a number of applications, including the inspection of transmission lines, bridges, and pipelines [8,9,10,11], autonomous search of landing and takeoff platforms [12], monitoring of mining areas and ecological environments [13], assessment of forest conditions [14], traffic surveillance on highways [15,16], and mapping of underground agricultural drainage pipes [17].
UAVs have transformed modern agriculture by providing efficient, flexible, and cost-effective aerial spraying solutions. Their effectiveness has been demonstrated across various crops, including papaya, soybean, and pear orchards [18,19,20]. However, the operational parameters of UAVs significantly affect droplet distribution, which in turn influences pesticide efficiency. Meng et al. [21] examined how different UAV operational parameters impact droplet distribution in peach orchards with varying tree structures. Similarly, Tang et al. [22] investigated the effect of application height on droplet distribution in citrus orchards.
According to Basso and Pignaton de Freitas [23], selective spraying of agrochemicals is crucial in precision agriculture to ensure high productivity and quality of agricultural products. UAVs offer a significant advantage in this regard, as they reduce soil compaction compared to heavy machinery and minimize the waste of artificial substances through precise and self-regulating applications. The study proposes a UAV guidance system incorporating both hardware and software based on image processing techniques to optimize this process.
Row crop detection is another important technique in precision agriculture, as it can guide agricultural machinery more effectively. Ruan et al. [24] propose a method for row crop detection using aerial images and YOLO-R, addressing the limitations of traditional image processing methods, which are often affected by weeds and lighting variations. Additionally, Bah et al. [25] introduce CRowNet, a method combining a convolutional neural network (CNN) with the Hough transform to detect crop rows in UAV images. Notably, their model demonstrates the capability to identify rows across various crop types.
The approach to agricultural spraying using drones can be significantly enhanced by considering the weight of the pesticide being transported, rather than relying solely on the nozzle’s spraying rate, as is common in commercial models, such as those discussed in [1] and DJI’s Agras series (https://ag.dji.com/, accessed on 16 September 2024). Monitoring the weight provides a direct and accurate measurement of the actual amount of pesticide available, allowing for dynamic adjustments in the spraying process to optimize liquid distribution, minimize waste, and ensure uniform coverage. In contrast, systems that use the nozzle’s spraying rate base their calculations on fixed parameters, which may not adequately reflect real variations in the field, such as changes in liquid viscosity or degradation of pumping performance caused by nozzle wear over time. Thus, the weight-based approach offers greater accuracy and efficiency, particularly in variable field conditions, where real-time adjustments are essential for effective and sustainable spraying.
In this context, our research aims to enhance UAV-based spraying efficiency by correlating liquid payload with distribution to optimize resource utilization and improve crop health. We have also developed a drone-based plant health monitoring system that utilizes image processing to detect diseases and nutritional deficiencies based on plant coloration. This system provides valuable insights for farmers, enabling data-driven decisions on irrigation, fertilization, and pest control. Our findings highlight the potential of UAV technology to advance agricultural practices and tackle critical challenges such as pest and disease management.
To enhance clarity, this work is structured as follows. Section 2 outlines the problem addressed. Section 2.1 and Section 2.2 detail the proposed spraying system, including its mechanism and trigger strategy, as well as the image processing approach that enables it, while Section 2.3 summarizes the UAV model and control strategy. The results and their implications are presented in Section 3. Finally, concluding remarks are given in Section 4.
2. The Case Study
Figure 1 illustrates the practical scenario studied in this work, in which the UAV performs an autonomous inspection task over a vegetable garden; the approach can be extended to larger crops. Given the arrangement of plants in the beds, where they are usually equally spaced from each other and occupy the entire length of the bed, the UAV must optimally cover the whole cultivated area while monitoring plant health. To this end, the UAV follows a path composed of uniformly spaced points at a fixed height, generating straight segments connected by smooth curves. It should be emphasized that the list of points was ordered to allow continuous drone navigation over all beds; that is, once a bed has been inspected, the UAV moves to the nearest uninspected bed. Furthermore, the UAV’s yaw orientation must remain tangent to the curves connecting the beds, so that the same displacement pattern (moving forward) is maintained both while traversing the beds and during the transitions between them; otherwise, the drone would traverse some beds flying backwards. Once it has taken off from the takeoff/landing platform, the UAV heads towards the first point of the path and, upon reaching this position, passes over the first bed at a constant speed to ensure the sharpness of the captured photos. Upon reaching the last point of the path, the inspection is complete and the UAV returns to the base to end the task. A minimal waypoint-generation sketch for this coverage pattern is given below.
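As a rough illustration (not the authors’ implementation), the following Python sketch generates such an ordered route for rectangular beds aligned with the x-axis, reusing the bed length (3 m), plant spacing (0.5 m), and flight height (1.5 m) reported later in the paper; the function names and bed coordinates are assumptions.

```python
import numpy as np

def bed_waypoints(x_start, x_end, y, z, spacing, reverse=False):
    """Uniformly spaced waypoints along one bed at a fixed height."""
    xs = np.arange(x_start, x_end + 1e-9, spacing)
    if reverse:
        xs = xs[::-1]
    return [(x, y, z) for x in xs]

def inspection_route(bed_ys, x_start, x_end, spacing, z):
    """Visit beds in order of proximity, alternating the traversal
    direction so the UAV always moves forward along the path."""
    route = []
    for i, y in enumerate(bed_ys):
        route += bed_waypoints(x_start, x_end, y, z, spacing, reverse=(i % 2 == 1))
    # Yaw reference at each waypoint: tangent to the segment ahead,
    # so transitions between beds keep the forward displacement pattern.
    yaws = [np.arctan2(route[k + 1][1] - route[k][1], route[k + 1][0] - route[k][0])
            for k in range(len(route) - 1)]
    yaws.append(yaws[-1])
    return list(zip(route, yaws))

# Example: two 3 m beds 1 m apart, 0.5 m plant spacing, 1.5 m flight height.
path = inspection_route(bed_ys=[0.0, 1.0], x_start=0.0, x_end=3.0, spacing=0.5, z=1.5)
```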
The UAV’s spraying system is designed to operate based on the amount of pesticide in the tank. The pump activates when the tank’s weight is above a predefined minimum threshold, allowing the UAV to distribute the pesticide over the crops. As the liquid level decreases, the system continues spraying until the tank’s weight nears the empty reservoir’s weight. At this critical point, the UAV records its current position and returns to the base to prevent damage to the pump and conserve battery power. The farmer then has the option to refill the tank and resume spraying from the last saved location or to end the task. The monitoring and spraying process is considered complete once the UAV finishes the final crop row.
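A minimal sketch of this weight-gated trigger logic follows; the pump and UAV interfaces are hypothetical placeholders, and the 30 g near-empty threshold is the value reported in Section 3.

```python
EMPTY_THRESHOLD_G = 30.0  # minimum liquid weight before returning to base (Section 3)

class SprayController:
    def __init__(self, pump, uav):
        self.pump, self.uav = pump, uav   # assumed hardware interfaces
        self.resume_position = None

    def update(self, liquid_weight_g, diseased_plant_below):
        """Called on every new load-cell reading."""
        if liquid_weight_g <= EMPTY_THRESHOLD_G:
            # Near-empty tank: stop pumping to avoid dry running,
            # record the current position, and return to base for refilling.
            self.pump.off()
            self.resume_position = self.uav.position()
            self.uav.return_to_base()
        elif diseased_plant_below:
            self.pump.on()   # spray while over the infected plant
        else:
            self.pump.off()
```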
The strategy for capturing photos is shown in Figure 2. When flying at 1.5 m above the plant beds, it was observed that there was always more than one plant in the captured photos; however, the classification algorithm should analyze only the plant directly below the UAV and disregard any others. For this purpose, our image processing strategy was to frame the plant of interest in the center of the image, given that the camera resolution (14 megapixels) is known. Once this was implemented, whenever a photo was captured, a fixed-size crop was taken around the center of the image to check for the presence of a plant and, in case of a positive result, to check whether it was diseased. When a diseased plant was identified, the UAV paused its movement and started spraying over the infected plant.
2.1. The Spraying System
This study developed an automated UAV-based spraying system for precise sub-leaf application of pesticides or nutrients. The system incorporates a remotely controlled pump activated at optimal times during flight. A 1 kg load cell monitors the liquid weight in the reservoir. A mini submersible DC water pump, powered at 3.3 V and delivering 80–120 L/h, was used for experimental validation. The system’s nozzle, depicted in Figure 3a, features three red indicators marking the water jet exit points.
An ESP32-DevKit V1 microcontroller managed the prototype’s operation, while a Matlab-based control station received real-time data on the reservoir weight and pump status. ROS 1 facilitated seamless communication between components. The automated spraying system comprised a single ESP32 microcontroller, a 1 kg load cell, an LM7805 voltage regulator, a 24-bit HX711 converter module, a C1815 bipolar junction transistor, a mini pump operating between 3 and 6 V, an RGB LED, and three 220-ohm resistors.
Figure 3 depicts the spraying mechanism integrated into a Parrot Bebop 2 UAV. The 3D-printed prototype houses electronics for load sensor data acquisition and transmission. Careful design ensured the additional weight was balanced on the drone, and the tank’s position allowed for camera monitoring.
The load weight measurement system was calibrated using a commercial precision scale as a reference, which has a sensitivity of 1 g and a maximum capacity of 10 kg. The calibration process involved adjusting the load cell signals relative to the reference scale, based on multiple readings for weights ranging from 0 to 1 kg, both ascending and descending. This method enhances the accuracy and reliability of the measurements obtained by the system. In summary, this procedure ensures that the values obtained from the load cell accurately reflect the weight, in grams, of the liquid contained in the tank.
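A sketch of this calibration procedure, assuming a linear load-cell response: raw HX711 counts are paired with reference-scale readings taken for ascending and descending loads, and a least-squares line maps counts to grams. The sample values below are placeholders, not measured data.

```python
import numpy as np

# Raw 24-bit HX711 readings and the corresponding reference-scale weights (g),
# collected for ascending and descending loads (placeholder values).
raw_counts = np.array([84210, 132877, 181623, 230198, 278950,
                       230310, 181540, 132801, 84190])
ref_grams  = np.array([0.0, 200.0, 400.0, 600.0, 800.0,
                       600.0, 400.0, 200.0, 0.0])

# Least-squares fit of grams = gain * counts + offset.
gain, offset = np.polyfit(raw_counts, ref_grams, deg=1)

def counts_to_grams(counts):
    """Convert a raw HX711 reading to liquid weight in grams."""
    return gain * counts + offset
```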
2.2. The Monitoring Strategy
Firstly, it is important to note that the UAV’s onboard camera is displaced by about 10 cm along the x-axis from the UAV’s center of mass (the origin of the UAV’s reference system), as illustrated in Figure 2. Therefore, a homogeneous transformation between the camera reference system and the UAV reference system was necessary to accurately determine the corrected geometric location of the drone. Given that the position of each plant was known due to standardized planting, the condition required to capture images with the plants centered on the image plane was based on the positional error between the UAV and each plant. Specifically, let $\tilde{\mathbf{p}}_i = \mathbf{p} - \mathbf{p}_i$ denote this error, where $\mathbf{p}$ is the camera-corrected UAV position and $\mathbf{p}_i$ is the position of the $i$-th plant. The image capture process was allowed whenever $\|\tilde{\mathbf{p}}_i\|$ was less than 10 cm. A binary vector checks plant visitation, ensuring that each plant is photographed only once.
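The capture-gating condition can be sketched as follows, with the homogeneous transformation reduced to a planar rotation plus the 10 cm body-frame camera offset (level flight assumed); plant positions and the visitation vector follow the description above, and the 0.1 m radius matches the value reported in Section 3.

```python
import numpy as np

CAMERA_OFFSET_B = np.array([0.10, 0.0, 0.0])  # camera ~10 cm ahead along the body x-axis
CAPTURE_RADIUS_M = 0.10                       # capture allowed within 0.1 m of a plant

def camera_position(p_uav, yaw):
    """Project the body-frame camera offset into the global frame (level flight)."""
    R = np.array([[np.cos(yaw), -np.sin(yaw), 0.0],
                  [np.sin(yaw),  np.cos(yaw), 0.0],
                  [0.0,          0.0,         1.0]])
    return p_uav + R @ CAMERA_OFFSET_B

def plants_to_capture(p_uav, yaw, plant_positions, visited):
    """Return indices of unvisited plants currently centered under the camera."""
    p_cam = camera_position(p_uav, yaw)
    hits = []
    for i, p_plant in enumerate(plant_positions):
        err = np.linalg.norm(p_cam[:2] - p_plant[:2])  # horizontal error only
        if not visited[i] and err < CAPTURE_RADIUS_M:
            visited[i] = True  # binary visitation vector: photograph each plant once
            hits.append(i)
    return hits
```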
Our monitoring proposal basically involves identifying the presence of plants in the center of the images and assessing their condition. Image processing consists of three fundamental steps: image capture, cropping, and feature extraction. First, photos are taken to ensure that the UAV is positioned above the plant, using the position error between the drone and each plant, as discussed in Section 2. As several plants can appear in the image, the target plant for analysis is the one located in the center of the image plane, i.e., 10 cm from the UAV’s current position. The algorithm crops the image around its center to focus on the desired plant. The cropped image is then converted from the RGB color space to HSV to facilitate the application of a segmentation mask, which identifies green tones to confirm the presence of a plant. Another mask is applied to detect yellow tones, indicating the presence of a diseased plant directly below the drone. The result of the algorithm for each image is a Boolean vector that indicates whether a plant is present in the captured image and whether it is diseased. To avoid noise in the detection process, the presence of green or yellow was only considered significant if the segmented area exceeded 20 pixels². Finally, diseased plants are highlighted by coloring the edges of the yellow regions in red.
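A sketch of this segmentation pipeline using OpenCV; the HSV bounds and the crop half-size are assumptions (the paper does not report them), while the 20-pixel² area threshold and the red outlining of yellow regions follow the text.

```python
import cv2
import numpy as np

CROP = 200     # half-size of the center crop in pixels (assumed value)
MIN_AREA = 20  # minimum segmented area, in pixels^2, to count as a detection

# HSV bounds for healthy (green) and diseased (yellow) tissue -- assumed values.
GREEN_LO,  GREEN_HI  = (35, 60, 60), (85, 255, 255)
YELLOW_LO, YELLOW_HI = (20, 60, 60), (34, 255, 255)

def classify_center_plant(image_bgr):
    """Return (plant_present, diseased) for the plant at the image center."""
    h, w = image_bgr.shape[:2]
    crop = image_bgr[h // 2 - CROP:h // 2 + CROP, w // 2 - CROP:w // 2 + CROP]
    hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)

    green  = cv2.inRange(hsv, np.array(GREEN_LO),  np.array(GREEN_HI))
    yellow = cv2.inRange(hsv, np.array(YELLOW_LO), np.array(YELLOW_HI))

    plant_present = cv2.countNonZero(green) > MIN_AREA   # green confirms a plant
    diseased = cv2.countNonZero(yellow) > MIN_AREA       # yellow indicates disease

    if diseased:  # outline the yellow regions in red for visualization
        contours, _ = cv2.findContours(yellow, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(crop, contours, -1, (0, 0, 255), 2)
    return plant_present, diseased
```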
2.3. The Control Strategy
Successful autonomous UAV spraying hinges on factors including pesticide mass and target location geometry. While this paper focuses on the spraying system, the reader is referred to [26] for in-depth controller design and stability analysis. To simplify the presentation and implementation, this section provides a concise overview of the UAV model and its navigation controller.
The UAV’s translational coordinates are represented as $\mathbf{p} = [x \; y \; z]^\top$, while its rotational coordinates are denoted by the vector $\boldsymbol{\eta} = [\phi \; \theta \; \psi]^\top$, which corresponds to the Tait–Bryan angles of roll, pitch, and yaw, respectively, all relative to the global frame.
The Bebop 2 drone’s built-in firmware simplifies the modeling of its dynamics, particularly during tasks requiring precise control near critical points, by allowing its autopilot to maintain flight stability. This streamlined approach aids in the design of the flight controller. The drone’s dynamic model can be represented succinctly as follows:

$$\begin{bmatrix} \dot{x} \\ \dot{y} \\ \dot{z} \\ \dot{\psi} \end{bmatrix} = \begin{bmatrix} \cos\psi & -\sin\psi & 0 & 0 \\ \sin\psi & \cos\psi & 0 & 0 \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} u \\ v \\ w \\ \dot{\psi} \end{bmatrix},$$

where $\mathbf{x} = [x \; y \; z \; \psi]^\top$ is the UAV pose in the global frame and $[u \; v \; w \; \dot{\psi}]^\top$ collects the body-frame velocities. Since the autopilot’s inner loops track velocity references, the body velocities can be identified with the commands sent to the vehicle, so that, in a compact form, this can be written as $\dot{\mathbf{x}} = \mathbf{A}(\psi)\,\mathbf{u}$, where $\mathbf{u} = [u_\theta \; u_\phi \; u_{\dot{z}} \; u_{\dot{\psi}}]^\top$ are the normalized control signals. Here, $u_\theta$ and $u_\phi$ correspond to pitch and roll commands, influencing the linear velocity along the $x$ and $y$ axes, respectively. Meanwhile, $u_{\dot{z}}$ and $u_{\dot{\psi}}$ pertain to the control of $\dot{z}$ and $\dot{\psi}$. The variables $u$, $v$, and $w$ represent the linear velocities along the $x$, $y$, and $z$ axes in the UAV body frame, while $\dot{\psi}$ denotes the angular velocity around the $z$-axis in the global frame.
It is important to note, as discussed in [26], that while this model does not capture the full dynamics of the UAV, it effectively describes the influence of high-level control signals on the vehicle’s maneuvers. For path-following tasks, the control law is given by

$$\mathbf{u} = \mathbf{A}^{-1}(\psi)\left(\dot{\mathbf{x}}_d + \mathbf{K}_1 \tanh\!\left(\mathbf{K}_2\,\tilde{\mathbf{x}}\right)\right),$$

where $\tilde{\mathbf{x}} = \mathbf{x}_d - \mathbf{x}$ represents the pose error between the desired pose $\mathbf{x}_d$ and the current pose $\mathbf{x}$, and $\mathbf{K}_1$ and $\mathbf{K}_2$ are positive gain matrices.
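A compact sketch of this control law in code, under the kinematic model above; the gain values and the yaw-error wrapping are illustrative choices, and the complete design with stability analysis is found in [26].

```python
import numpy as np

K1 = np.diag([1.0, 1.0, 1.0, 1.0])  # proportional gains (illustrative)
K2 = np.diag([1.0, 1.0, 1.0, 1.0])  # saturation-shaping gains (illustrative)

def A(yaw):
    """Maps body-frame commands [u_theta, u_phi, u_zdot, u_psidot] to global rates."""
    c, s = np.cos(yaw), np.sin(yaw)
    return np.array([[c, -s, 0, 0],
                     [s,  c, 0, 0],
                     [0,  0, 1, 0],
                     [0,  0, 0, 1]])

def control(x, x_d, xdot_d):
    """Path-following law u = A^{-1}(psi) (xdot_d + K1 tanh(K2 (x_d - x)))."""
    err = x_d - x                                        # pose error [x, y, z, psi]
    err[3] = np.arctan2(np.sin(err[3]), np.cos(err[3]))  # wrap yaw error to [-pi, pi]
    u = np.linalg.inv(A(x[3])) @ (xdot_d + K1 @ np.tanh(K2 @ err))
    return np.clip(u, -1.0, 1.0)  # normalized commands for the autopilot
```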
Algorithm 1 outlines the logic for executing the spraying mission, incorporating real-time monitoring of the UAV’s payload.
Algorithm 1 Structure of the control system.
1: Initialize ROS communication, the tracking system (OptiTrack), and the Bebop UAV
2: Initialize the spraying device (load sensor and pump)
3: Take off with the Bebop and start the monitoring mission
4: while !(route accomplished) and !(empty tank) do
5:     Monitor the plantation (image processing)
6:     if sick plant detected then
7:         Pause path tracking
8:         Execute spraying
9:         if defensive amount reached then
10:            Stop spraying
11:            Resume path
12:        end if
13:    else
14:        Follow the path
15:    end if
16:    Get sensor payload data
17:    if tank is empty then
18:        Return to base to refill the tank | Finish the mission
19:    end if
20:    Get pose data and update the UAV states
21:    Compute the control signals
22:    if there are commands from the joystick then
23:        Override the autonomous controller with the joystick input
24:    end if
25:    Send control signals to the UAV
26:    Store state variables
27: end while
3. Results and Discussion
In this section, a proof-of-concept experiment was conducted using two simulated garden beds, each 3 m in length. Plant models were created using EVA leaves of varying sizes (0.4 m, 0.30 m, and 0.20 m) and colors (light and dark green) to represent different crop types. Plants were spaced 0.5 m apart, totaling 11 plants (with one intentional gap containing no plant). The experimental validation video can be accessed via the following link: https://youtu.be/lB_JT8X3N-8, accessed on 16 September 2024.
The image processing algorithm was responsible for classifying the plants as healthy, sick, or absent. Figure 4 shows examples of these categories, with the corresponding segmentation results displayed in the bottom row. Note that the irrigation nozzle is visible in the photographs captured by the UAV; however, it was intentionally moved back to avoid overlapping the mechanism with the plants in the cropped image at the center of the photographs (as indicated by the blue square). The images in the second row show the result of our segmentation, where we highlighted the presence of green, yellow, or black colors, which allowed us to identify and classify the health of the plants analyzed.
Figure 5 illustrates the variation in payload, as well as the times when the pump was activated and deactivated during the real-time experimental validation of the proof of concept. The periods labeled calibration, pump OFF, and pump ON are highlighted in gray, green, and red, respectively. These labels correspond to the time required to calibrate the load cell after initialization and the periods when the pump was switched on and off. During the calibration stage, it is not possible to determine the weight available in the tank, which explains the absence of a signal during this process. Both graphs show sharper variations when the UAV takes off; however, despite the initial noise, the signal stabilizes once the UAV is in flight. When the pump is activated, there is a gradual decrease in the signal over time, explained by the fact that the liquid is pumped out of the tank, thus reducing its weight. In both experiments, the same amount of pesticide, 5 g, was released on each sick plant. It is interesting to note that the time interval needed to spray the same amount of pesticide varied between the experiments. The time taken to release the 5 g on each diseased plant was longer in the first experiment compared to the second, indicating a lower flow rate out of the nozzle in the first experiment. Finally, it is worth noting that the noise observed in the signal in both graphs is caused by the vibration of the quadcopter itself in flight.
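The per-plant flow rate implied by these curves can be estimated directly from the logged weight signal, e.g., as the released mass divided by the activation time. Below is a sketch under assumed variable names; the 5 g dose is the value used in the experiments.

```python
import numpy as np

def mean_flow_rate(t, weight_g, t_on, t_off):
    """Estimate the mean nozzle flow rate (g/s) over one spraying window
    from the logged tank weight, averaging a few samples at each end to
    smooth out vibration-induced noise."""
    window = (t >= t_on) & (t <= t_off)
    w = weight_g[window]
    k = min(10, len(w))
    released = np.mean(w[:k]) - np.mean(w[-k:])
    return released / (t_off - t_on)

# Example: releasing the 5 g dose over a 4 s activation gives about 1.25 g/s.
```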
Pump activation was contingent upon the image segmentation stage, specifically the detection of diseased (yellow-colored) plants. In the initial experiment (Figure 5a), despite identifying three sick plants, the pump operated only twice. This discrepancy arose because, after the third sick plant was detected, the load-cell system indicated insufficient liquid for continued application, necessitating the UAV’s return to base for refilling. To prevent pump damage from dry running, a weight threshold of 30 g was established, which also accounts for potential liquid displacement due to the UAV’s movement. Spraying commenced only if a diseased plant was detected and the liquid weight exceeded this threshold; otherwise, the UAV returned for refilling. In the second experiment (Figure 5b), the initial pesticide load was approximately 50 g, sufficient to cover all beds and treat the three sick plants without intermediate refilling.
The UAV’s trajectory during both experiments is shown in Figure 6. In the first experiment (Figure 6a), the UAV detected insufficient pesticide upon encountering the third diseased plant and returned to base for refilling. This resulted in a deviation from the planned path and the omission of inspection for the remaining plants in the second bed. Blank entries indicate that the UAV failed to pass within 0.1 m of these plants, preventing image capture and subsequent analysis. In contrast, in the second experiment (Figure 6b), the UAV successfully identified, sprayed, and classified all plants in the garden. Notably, in both experiments, the desired pose $\mathbf{x}_d$ and the current pose $\mathbf{x}$ closely aligned, demonstrating the control system’s effectiveness in maintaining precise positioning, even while carrying a payload.
The analysis revealed that the UAV could carry a maximum payload of 228 g, comprising a 178 g spraying system and 50 g of liquid. This load represented approximately 45.6% of the UAV’s estimated 500 g weight. This study focused on developing a self-contained spraying prototype for drone integration, prioritizing weight-based liquid dispensing over nozzle flow-rate control. An RGB LED visual indicator was implemented to signal liquid levels during testing: steady green denoted a liquid weight above 40 g, flashing green a weight between 30 and 40 g, and red a weight below 30 g.
It is worth highlighting that our approach can be adapted for non-liquid payloads since the load sensor functions independently of the cargo type. This versatility allows for the utilization of drones in various applications, such as seed distribution in reforestation efforts or the deployment of larvicides in water reservoirs to combat dengue and other similar diseases. This potential for diverse applications highlights the broader implications of our work in advancing UAV technology for environmental and public health initiatives.
In this work, the agricultural spraying system has been designed for targeted applications, allowing for individualized treatment of each plant. This approach stands out compared to traditional systems, primarily due to its ability to ensure that a precise and uniform amount of pesticide is applied to each plant based on the weight of the applied product, rather than relying on time intervals or flow control. Unlike conventional systems that depend on the nozzle’s spray rate, our system measures the actual amount of pesticide deposited, guaranteeing that each plant receives the same quantity of product regardless of terrain variations, environmental conditions, or liquid viscosity. This precision is crucial in crops where uniform treatment directly affects productivity and plant health.
The precise measurement of the liquid’s weight guarantees uniform application, avoiding both under-dosing, which could compromise the treatment’s effectiveness, and over-dosing, which could lead to waste and undesirable environmental impacts. This optimization not only reduces the use of excess inputs, making operations more economical, but also maintains the effectiveness of the treatment.
Additionally, the proposed spraying system allows for flexibility in the amount of pesticide applied to each plant based on its disease level (as determined by the ratio of yellow to green colors). This feature is particularly advantageous for crops with individual treatment needs, such as orchards, vineyards, or gardens, ensuring that more diseased plants receive a higher quantity of pesticide. In essence, through the use of a plant disease index (PDI), the system can automatically adjust the amount of pesticide released in grams, adapting to the specific needs of each plant and variations in environmental conditions. However, this aspect falls outside the scope of the current work and remains a point for future improvement.
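Although left for future work, such an index could be as simple as the yellow-to-green area ratio of the segmented crop, mapped linearly to a dose in grams. The sketch below is hypothetical, with the dose bounds as assumptions.

```python
def plant_disease_index(yellow_px, green_px):
    """PDI in [0, 1]: fraction of the segmented plant area that is yellow."""
    total = yellow_px + green_px
    return yellow_px / total if total > 0 else 0.0

def dose_grams(pdi, min_dose=2.0, max_dose=8.0):
    """Linearly scale the released amount with disease severity (bounds assumed)."""
    return min_dose + pdi * (max_dose - min_dose)
```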
Finally, we conducted additional tests to validate alternative approaches, which are showcased in the accompanying video (https://youtu.be/IdVl9vTMFpg, accessed on 16 September 2024). Throughout our experiments, we explored various configurations, including adjustments to the control strategy, the simulated liquid dispersion system, and the monitoring and pesticide application strategies. These findings underscore the effectiveness of our proposed methods and provide further insight into the potential of UAV-based spraying systems.
4. Concluding Remarks
This study presents the development of a UAV-based spraying system designed for targeted pesticide application on color-identified diseased plants, aiming to optimize resource use and improve crop health and productivity. The experimental results confirmed the effectiveness of the digital image processing technique, the functionality of the onboard water pumping mechanism, and the accuracy of the UAV’s pesticide weight measurement. A main contribution of this work is the integration of a real-time payload monitoring system, which continuously tracks the weight during flight to ensure precise pesticide application and prevent over- or under-spraying. Additionally, the system supports automatic refilling by detecting low pesticide levels and directing the UAV to return to base when needed.
In conclusion, this work demonstrates an alternative application of UAVs in agriculture. Notably, the employed technique minimizes direct human exposure to agricultural pesticides and unintentional contact with crops, preventing potential developmental delays. The proposed inspection technique could be further expanded to identify mature and green plants, for example. Overall, our study contributes to the development of cost-effective technologies that maximize the potential of UAVs in precision agriculture.
Additionally, our approach can be extended to non-liquid payloads, as the load sensor operates independently of the type of cargo being transported. This enables the use of drones for seed distribution in reforestation missions or for deploying larvicides in water reservoirs to combat dengue and similar diseases.
Future research directions include conducting open-air field tests with real plants. This would introduce additional complexity due to external factors like wind gusts and varying lighting conditions. Additionally, employing RGB-D cameras would provide color and depth information, facilitating real-plant growth assessment based on height measurements, using the relative distance between the drone and the top of the plants.