Article

Integration of Payload Sensors to Enhance UAV-Based Spraying

by Celso O. Barcelos 1,†, Leonardo A. Fagundes-Júnior 1,†, André Luis C. Mendes 1, Daniel C. Gandolfo 2 and Alexandre S. Brandão 1,*,‡

1 Núcleo de Especialização em Robótica, Departamento de Engenharia Elétrica, Programa de Pós-graduação em Ciência da Computação, Universidade Federal de Viçosa, Viçosa 36570-900, Minas Gerais, Brazil
2 Instituto de Automática, Universidad Nacional de San Juan, Av. San Martín (Oeste) 1109, San Juan J5400ARL, Argentina
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
‡ Current address: Via da Agronomia, 299, Campus Universitário, Viçosa 36570-900, Minas Gerais, Brazil.
Drones 2024, 8(9), 490; https://doi.org/10.3390/drones8090490
Submission received: 5 August 2024 / Revised: 5 September 2024 / Accepted: 15 September 2024 / Published: 17 September 2024

Abstract

This work focuses on the use of payload sensors to assist spraying tasks performed by unmanned aerial vehicles (UAVs). The study details the construction of a load-measurement prototype to validate the proof of concept. To simulate the application of agricultural pesticides, the UAV follows a predefined route while an image processing system detects diseased plants. After a detection, the UAV momentarily pauses its route and activates the spraying device. The payload sensor monitors the application process and determines whether the prescribed amount of pesticide has been fully applied. If the storage tank is empty, or if the remaining quantity is insufficient for another application, the system commands the UAV to return to the base station for refilling. Experimental validations were carried out in a controlled indoor environment to verify the proposal and the functionality of the in-flight payload monitoring system. Additionally, the UAV's flight controller demonstrated robust performance, maintaining stability despite liquid-load oscillations and varying payloads during the spraying process. In summary, our main contribution is a real-time payload monitoring system that tracks weight during flight to avoid over- or under-spraying. This system also supports automatic refilling, detecting low pesticide levels and directing the UAV to return to base when necessary.

1. Introduction

Pesticides and fertilizers are essential for crop protection but pose significant health risks when applied manually. In 2018, the World Health Organization reported over a million health issues linked to manual pesticide spraying [1]. Unmanned aerial vehicles (UAVs) have emerged as a promising solution for pesticide application and crop monitoring to address this issue. This technology not only safeguards human health but also optimizes resource utilization.
Beyond pesticide application, UAVs have broad applications in precision agriculture. Equipped with advanced sensors, they can efficiently collect data for topographic surveys, crop health assessments, pest monitoring, and targeted treatment [2,3]. Early detection and precise intervention enabled by UAVs can significantly enhance crop yields and reduce losses due to pests and diseases.
Despite challenges such as cost and payload limitations, UAVs offer significant economic potential in agriculture due to their ability to increase productivity and reduce crop damage. Numerous studies have explored the potential of UAVs to transform agricultural practices, particularly in precision agriculture. UAVs excel in real-time data acquisition and crop monitoring, providing invaluable insights for farm management. The historical development of UAV-based low-altitude remote sensing in agriculture has been well documented, offering detailed analyses of UAV characteristics, sensor capabilities, and operational constraints. Applications range from soil and production mapping to the integration of Geographic Information Systems (GISs), as well as crop monitoring and optimized pesticide spraying [4,5,6].
Regular crop observation provides crucial data on the condition of the plants and identifies specific areas that need more attention from producers. This activity can be carried out efficiently through visual inspection, allowing the detection of signs that indicate the need for crop care, such as irregular plant growth, changes in leaf coloration that suggest the presence of pests, soil erosion, and problems in the post-planting process. According to [7], remote sensing monitoring is often carried out using drones equipped with multiple cameras. Recent advances in UAV aerial imagery highlight the technology’s potential for a number of applications, including the inspection of transmission lines, bridges, and pipelines [8,9,10,11], autonomous search of landing and takeoff platforms [12], monitoring of mining areas and ecological environments [13], assessment of forest conditions [14], traffic surveillance on highways [15,16], and mapping of underground agricultural drainage pipes [17].
UAVs have transformed modern agriculture by providing efficient, flexible, and cost-effective aerial spraying solutions. Their effectiveness has been demonstrated across various crops, including papaya, soybean, and pear orchards [18,19,20]. However, the operational parameters of UAVs significantly affect droplet distribution, which in turn influences pesticide efficiency. Meng et al. [21] examined how different UAV operational parameters impact droplet distribution in peach orchards with varying tree structures. Similarly, Tang et al. [22] investigated the effect of application height on droplet distribution in citrus orchards.
According to Basso and Pignaton de Freitas [23], selective spraying of agrochemicals is crucial in precision agriculture to ensure high productivity and quality of agricultural products. UAVs offer a significant advantage in this regard, as they reduce soil compaction compared to heavy machinery and minimize the waste of artificial substances through precise and self-regulating applications. The study proposes a UAV guidance system incorporating both hardware and software based on image processing techniques to optimize this process.
Row crop detection is another important technique in precision agriculture, which can guide agricultural machinery more effectively. Ruan et al. [24] propose a method for row crop detection using aerial images and YOLO-R, addressing the limitations of traditional image processing methods that are often affected by weeds and lighting variations. Additionally, Bah et al. [25] introduce CRowNet, a method utilizing a convolutional neural network (CNN) combined with the Hough transform to detect crop rows in UAV images. Notably, their model demonstrates the capability to identify rows across various crop types.
The approach to agricultural spraying using drones can be significantly enhanced by considering the weight of the pesticide being transported, rather than relying solely on the nozzle’s spraying rate, as is common in commercial models, such as those discussed by [1] and DJI’s Agras series (https://ag.dji.com/, accessed on 16 September 2024). Monitoring the weight provides a direct and accurate measurement of the actual amount of pesticide available, allowing for dynamic adjustments in the spraying process to optimize liquid distribution, minimize waste, and ensure uniform coverage. In contrast, systems that use the nozzle’s spraying rate base their calculations on fixed parameters, which may not adequately reflect real variations in the field, such as changes in liquid viscosity or the performance of the pumping system due to wear on the nozzles over time. Thus, the weight-based approach offers greater accuracy and efficiency, particularly in variable field conditions, where real-time adjustments are essential for effective and sustainable spraying.
In this context, our research aims to enhance UAV-based spraying efficiency by correlating liquid payload with distribution to optimize resource utilization and improve crop health. We have also developed a drone-based plant health monitoring system that utilizes image processing to detect diseases and nutritional deficiencies based on plant coloration. This system provides valuable insights for farmers, enabling data-driven decisions on irrigation, fertilization, and pest control. Our findings highlight the potential of UAV technology to advance agricultural practices and tackle critical challenges such as pest and disease management.
To enhance clarity, this work is structured as follows. Section 2 outlines the problem addressed. Section 2.1 and Section 2.2 detail the proposed spraying system, including its mechanism, trigger strategy, and the image processing approach that enables it, while Section 2.3 summarizes the UAV model and navigation controller. The results and their implications are presented in Section 3. Finally, concluding remarks are given in Section 4.

2. The Case Study

Figure 1 illustrates the practical scenario studied in this work, in which the UAV performs an autonomous inspection task over a vegetable garden; the same approach can be extended to larger crops. Given the arrangement of plants in the beds, where they are usually equally spaced and occupy the entire length of each bed, the UAV must optimally cover the whole cultivated area while monitoring plant health. To this end, the UAV follows a path composed of uniformly spaced points at a fixed height, generating straight segments connected by smooth curves. It should be emphasized that the list of points is ordered to allow continuous drone navigation over all beds: once a bed has been inspected, the UAV moves to the nearest uninspected bed. Furthermore, the UAV's yaw orientation must remain tangent to the curves that connect the beds, so that the drone keeps the same forward-moving displacement pattern while navigating and during the transitions between beds; otherwise, the drone would traverse some beds flying backwards. After taking off from the takeoff/landing platform, the UAV heads towards the first point of the path and, upon reaching it, passes over the first bed at a constant speed to ensure the sharpness of the captured photos. Upon reaching the last point of the path, the inspection is complete and the UAV returns to the base to end the task.
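To make the path construction concrete, the sketch below generates uniformly spaced waypoints over two beds, joins them with a smooth semicircular turn, and assigns a yaw reference tangent to the path. This is a minimal sketch: the bed layout, spacing, and turn geometry are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def bed_waypoints(x_start, x_end, y, spacing=0.5, z=1.5):
    # Uniformly spaced waypoints along one straight bed segment at fixed height.
    n = int(abs(x_end - x_start) / spacing) + 1
    xs = np.linspace(x_start, x_end, n)
    return np.column_stack([xs, np.full(n, y), np.full(n, z)])

def arc_waypoints(p_from, p_to, n=8):
    # Semicircular arc (smooth curve) connecting the end of one bed to the
    # start of the next; interior points only, the endpoints already exist.
    center = (p_from[:2] + p_to[:2]) / 2.0
    radius = np.linalg.norm(p_to[:2] - p_from[:2]) / 2.0
    a0 = np.arctan2(p_from[1] - center[1], p_from[0] - center[0])
    angles = a0 + np.linspace(0.0, np.pi, n)[1:-1]
    pts = center + radius * np.column_stack([np.cos(angles), np.sin(angles)])
    return np.column_stack([pts, np.full(len(pts), p_from[2])])

def path_with_yaw(points):
    # Yaw reference tangent to the path, so the UAV always moves forward.
    d = np.diff(points[:, :2], axis=0)
    yaw = np.arctan2(d[:, 1], d[:, 0])
    return np.column_stack([points, np.append(yaw, yaw[-1])])

# Two 3 m beds 1 m apart, inspected in opposite directions (illustrative layout).
bed1 = bed_waypoints(0.0, 3.0, y=0.0)
bed2 = bed_waypoints(3.0, 0.0, y=1.0)
path = path_with_yaw(np.vstack([bed1, arc_waypoints(bed1[-1], bed2[0]), bed2]))
```

Each row of `path` holds (x, y, z, ψ), matching the pose references used by the controller summarized in Section 2.3.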
The UAV’s spraying system is designed to operate based on the amount of pesticide in the tank. The pump activates when the tank’s weight is above a predefined minimum threshold, allowing the UAV to distribute the pesticide over the crops. As the liquid level decreases, the system continues spraying until the tank’s weight nears the empty reservoir’s weight. At this critical point, the UAV records its current position and returns to the base to prevent damage to the pump and conserve battery power. The farmer then has the option to refill the tank and resume spraying from the last saved location or to end the task. The monitoring and spraying process is considered complete once the UAV finishes the final crop row.
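A minimal sketch of this supervision logic is given below, assuming the 30 g dry-run protection threshold reported in Section 3; the state names and interface are illustrative, not the authors' code.

```python
EMPTY_THRESHOLD_G = 30.0  # dry-run protection threshold (see Section 3)

class SpraySupervisor:
    """Gates the pump on the measured liquid weight and triggers return-to-base."""

    def __init__(self, threshold_g=EMPTY_THRESHOLD_G):
        self.threshold_g = threshold_g
        self.resume_position = None  # route point saved before returning to base

    def decide(self, liquid_weight_g, uav_position, diseased_plant_detected):
        # Below the threshold the remaining liquid is insufficient (and the
        # pump could run dry): save the pose and head back for refilling.
        if liquid_weight_g <= self.threshold_g:
            self.resume_position = uav_position
            return "RETURN_TO_BASE"
        # Enough liquid and a sick plant under the nozzle: pause and spray.
        if diseased_plant_detected:
            return "PAUSE_AND_SPRAY"
        return "FOLLOW_PATH"
```

After refilling, the mission can resume from `resume_position`, mirroring the farmer's choice described above.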
The strategy for capturing photos is shown in Figure 2. When flying at 1.5 m above the plant beds, the captured photos always contained more than one plant; however, the classification algorithm should analyze only the plant directly below the UAV and disregard the others. For this purpose, our image processing strategy was to frame the plant of interest in the center of the image, given that the camera resolution is known (480 × 856 pixels, from a 14-megapixel sensor). Once this was implemented, whenever a photo was captured, a 150 × 150-pixel crop was taken around the center of the image to check for the presence of a plant and, in case of a positive result, to check whether it was diseased. When a diseased plant was identified, the UAV paused its movement and started spraying over the infected plant.

2.1. The Spraying System

This study developed an automated UAV-based spraying system for precise sub-leaf application of pesticides or nutrients. The system incorporates a remotely controlled pump activated at optimal times during flight. A 1 kg load cell monitors the liquid weight in the reservoir. A mini submersible DC water pump, powered at 3.3 V and delivering 80–120 L/h, was used for experimental validation. The system’s nozzle, depicted in Figure 3a, features three red indicators marking the water jet exit points.
An ESP32-DevKit V1 microcontroller managed the prototype's operation, while a Matlab-based control station received real-time data on the reservoir weight and pump status. ROS 1 facilitated seamless communication between components. The automated spraying system comprised a single ESP32 microcontroller, a 1 kg load cell, an LM7805 voltage regulator, a 24-bit HX711 converter module, a C1815 bipolar junction transistor, a mini pump operating between 3 and 6 V, an RGB LED, and three 220-ohm resistors.
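As an illustration of the ROS 1 side of this telemetry, the sketch below subscribes to the reservoir weight and pump status from a Python node. The topic names and message types are assumptions for illustration only; the paper's actual control station is Matlab-based.

```python
#!/usr/bin/env python
# Minimal ROS 1 listener for the spraying telemetry. The topic names and
# message types are illustrative assumptions, not the paper's interface.
import rospy
from std_msgs.msg import Bool, Float32

def on_weight(msg):
    rospy.loginfo("Reservoir weight: %.1f g", msg.data)

def on_pump(msg):
    rospy.loginfo("Pump %s", "ON" if msg.data else "OFF")

if __name__ == "__main__":
    rospy.init_node("spray_telemetry_listener")
    rospy.Subscriber("/sprayer/weight_g", Float32, on_weight)  # from the ESP32/HX711
    rospy.Subscriber("/sprayer/pump_on", Bool, on_pump)
    rospy.spin()
```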
Figure 3 depicts the spraying mechanism integrated into a Parrot Bebop 2 UAV. The 3D-printed prototype houses electronics for load sensor data acquisition and transmission. Careful design ensured the additional weight was balanced on the drone, and the tank’s position allowed for camera monitoring.
The load weight measurement system was calibrated using a commercial precision scale as a reference, which has a sensitivity of 1 g and a maximum capacity of 10 kg. The calibration process involved adjusting the load cell signals relative to the reference scale, based on multiple readings for weights ranging from 0 to 1 kg, both ascending and descending. This method enhances the accuracy and reliability of the measurements obtained by the system. In summary, this procedure ensures that the values obtained from the load cell accurately reflect the weight, in grams, of the liquid contained in the tank.
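A sketch of such a calibration is shown below: raw HX711 counts recorded against the reference scale, ascending and descending, are fitted with a least-squares line mapping counts to grams. The raw counts here are placeholder values; the actual figures depend on the cell and wiring.

```python
import numpy as np

# Reference masses (g) applied ascending then descending, and the raw 24-bit
# HX711 counts read for each (placeholder values for illustration).
known_mass_g = np.array([0, 200, 400, 600, 800, 1000,
                         800, 600, 400, 200, 0])
raw_counts = np.array([8400, 92150, 176300, 260100, 344500, 428200,
                       344300, 260400, 176100, 92300, 8500])

# Least-squares line: gain (g/count) and offset (g) of the load cell.
gain, offset = np.polyfit(raw_counts, known_mass_g, 1)

def counts_to_grams(counts):
    """Convert a raw HX711 reading to liquid weight in grams."""
    return gain * counts + offset
```

Fitting over both the ascending and descending series lets the line average out any small hysteresis in the cell.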

2.2. The Monitoring Strategy

Firstly, it is important to note that the UAV's onboard camera is displaced about 10 cm along the $x_B$-axis from the UAV's center of mass (the origin of the UAV's reference system), as illustrated in Figure 2. Therefore, a homogeneous transformation between the camera reference frame ($C$) and the UAV reference frame ($B$) was necessary to accurately determine the corrected geometric location of the drone. Given that the position of each plant was known due to standardized planting, the condition required to capture images with the plants centered on the image plane was based on the positional error between the UAV and each plant. Specifically, let $\tilde{\rho} = [\rho_1 \; \rho_2 \; \cdots \; \rho_n]$, where $\rho_n = \lVert \mathbf{x}_{UAV} - \mathbf{x}_{plant_n} \rVert$. The image capture process was allowed whenever a component $\rho_n$ of $\tilde{\rho}$ was less than $\rho_{min} = 10$ cm. A binary vector tracks plant visitation, ensuring that each plant is photographed only once.
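The gating described above can be sketched as follows; the flat-flight rotation (roll and pitch neglected) and the function names are assumptions for illustration.

```python
import numpy as np

CAM_OFFSET_B = np.array([0.10, 0.0, 0.0])  # camera ~10 cm along the body x-axis
RHO_MIN = 0.10                             # capture radius, metres

def camera_position_world(p_uav, yaw):
    # Project the body-frame camera offset into the world frame, assuming
    # near-level flight so only yaw matters (illustrative simplification).
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
    return p_uav + R @ CAM_OFFSET_B

def plant_to_capture(p_uav, yaw, plants_xy, visited):
    # Return the index of an unvisited plant inside the capture radius, else None.
    cam_xy = camera_position_world(p_uav, yaw)[:2]
    rho = np.linalg.norm(plants_xy - cam_xy, axis=1)  # positional errors rho_n
    for n in np.argsort(rho):
        if rho[n] < RHO_MIN and not visited[n]:
            visited[n] = True  # each plant is photographed only once
            return n
    return None
```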
Our monitoring proposal involves identifying the presence of plants in the center of the images and assessing their condition. Image processing consists of three fundamental steps: image capture, cropping, and feature extraction. First, photos are taken only when the UAV is positioned above a plant, using the position error between the drone and each plant, as discussed in Section 2. Since several plants can appear in the image, the target plant for analysis is the one located at the center of the image plane, i.e., 10 cm ahead of the UAV's current position. The algorithm crops the image around its center to focus on the desired plant. The cropped image is then converted from the RGB color space to HSV to facilitate the application of a segmentation mask, which identifies green tones to confirm the presence of a plant. Another mask is applied to detect yellow tones, indicating a diseased plant directly below the drone. The result for each image is a Boolean vector indicating whether a plant is present and whether it is diseased. To avoid noise in the detection process, the presence of green or yellow was only considered significant if the segmented area exceeded 20 pixels. Finally, diseased plants are highlighted by coloring the edges of the yellow regions in red.
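A minimal version of this pipeline is sketched below, assuming OpenCV and illustrative HSV bounds for the green and yellow masks; the paper does not report its exact thresholds, so these values would need tuning for the camera and lighting used.

```python
import cv2
import numpy as np

CROP = 150          # side of the centre crop, in pixels
MIN_AREA_PX = 20    # minimum segmented area to count as a detection

# Illustrative HSV bounds for green (healthy tissue) and yellow (disease).
GREEN_LO, GREEN_HI = (35, 60, 60), (85, 255, 255)
YELLOW_LO, YELLOW_HI = (20, 60, 60), (35, 255, 255)

def classify_center(image_bgr):
    """Crop the image centre and report (plant_present, plant_diseased)."""
    h, w = image_bgr.shape[:2]
    y0, x0 = (h - CROP) // 2, (w - CROP) // 2
    crop = image_bgr[y0:y0 + CROP, x0:x0 + CROP]

    hsv = cv2.cvtColor(crop, cv2.COLOR_BGR2HSV)
    green = cv2.inRange(hsv, GREEN_LO, GREEN_HI)
    yellow = cv2.inRange(hsv, YELLOW_LO, YELLOW_HI)

    plant = cv2.countNonZero(green) > MIN_AREA_PX
    diseased = cv2.countNonZero(yellow) > MIN_AREA_PX
    if diseased:
        # Highlight the diseased regions by drawing their edges in red (BGR).
        contours, _ = cv2.findContours(yellow, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(crop, contours, -1, (0, 0, 255), 2)
    return plant or diseased, diseased
```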

2.3. The Control Strategy

Successful autonomous UAV spraying hinges on factors including pesticide mass and target location geometry. While this paper focuses on the spraying system, the reader is referred to [26] for in-depth controller design and stability analysis. To simplify the presentation and implementation, this section provides a concise overview of the UAV model and its navigation controller.
The UAV's translational coordinates are represented as $\mathbf{x} = [x \; y \; z]^\top$, while its rotational coordinates are denoted by the vector $\boldsymbol{\eta} = [\phi \; \theta \; \psi]^\top$, which correspond to the Tait–Bryan angles of roll, pitch, and yaw, respectively, all relative to the global frame $W$.
The Bebop 2 drone’s built-in firmware simplifies the modeling of its dynamics, particularly during tasks requiring precise control near critical points, by allowing its autopilot to maintain flight stability. This streamlined approach aids in the design of the flight controller. The drone’s dynamic model can be represented succinctly as follows:
$$
\begin{bmatrix} \ddot{x} \\ \ddot{y} \\ \ddot{z} \\ \ddot{\psi} \end{bmatrix}
=
\begin{bmatrix}
k_1 c_{\psi} & -k_3 s_{\psi} & 0 & 0 \\
k_1 s_{\psi} & k_3 c_{\psi} & 0 & 0 \\
0 & 0 & k_5 & 0 \\
0 & 0 & 0 & k_7
\end{bmatrix}
\begin{bmatrix} u_{v_x} \\ u_{v_y} \\ u_{\dot{z}} \\ u_{\dot{\psi}} \end{bmatrix}
-
\begin{bmatrix}
k_2 c_{\psi} & -k_4 s_{\psi} & 0 & 0 \\
k_2 s_{\psi} & k_4 c_{\psi} & 0 & 0 \\
0 & 0 & k_6 & 0 \\
0 & 0 & 0 & k_8
\end{bmatrix}
\begin{bmatrix} v_x \\ v_y \\ \dot{z} \\ \dot{\psi} \end{bmatrix},
$$
where $c_{(\cdot)} = \cos(\cdot)$ and $s_{(\cdot)} = \sin(\cdot)$. In compact form, this can be written as $\ddot{\mathbf{q}} = F_1 \mathbf{u} - F_2 \mathbf{v}$, where $\mathbf{u} \in [-1, 1]^4$ contains the normalized control signals. Here, $u_{v_x}$ and $u_{v_y}$ correspond to pitch and roll commands, influencing the linear velocities along the $x_B$ and $y_B$ axes, respectively. Meanwhile, $u_{\dot{z}}$ and $u_{\dot{\psi}}$ pertain to the control of $\dot{z}$ and $\dot{\psi}$. The variables $v_x$, $v_y$, and $\dot{z}$ represent the linear velocities along the x, y, and z axes in the UAV body frame, while $\dot{\psi}$ denotes the angular velocity around the z-axis in the global frame.
It is important to note, as discussed in [26], that while this model does not capture the full dynamics of the UAV, it effectively describes the influence of high-level control signals on the vehicle’s maneuvers. For path-following tasks, the control law is given by
$$
\mathbf{u} = F_1^{-1} \left( \boldsymbol{\eta} + F_2 \mathbf{v} \right), \quad \text{with} \quad \boldsymbol{\eta} = \ddot{\mathbf{q}}_d + K_d \dot{\tilde{\mathbf{q}}} + K_p \tilde{\mathbf{q}},
$$
where $\tilde{\mathbf{q}} = \mathbf{q}_d - \mathbf{q}$ represents the pose error, and $K_d$ and $K_p$ are positive gain matrices.
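Under these definitions, the control law reduces to a few lines. The sketch below assumes identified model parameters $k_1, \dots, k_8$ (obtained as in [26]) and diagonal gain matrices; both are placeholders, not the authors' values.

```python
import numpy as np

def model_matrices(psi, k):
    # F1 and F2 of the simplified model, built from the identified k1..k8
    # (the array k is indexed from zero) and the current yaw angle psi.
    c, s = np.cos(psi), np.sin(psi)
    F1 = np.array([[k[0] * c, -k[2] * s, 0, 0],
                   [k[0] * s,  k[2] * c, 0, 0],
                   [0, 0, k[4], 0],
                   [0, 0, 0, k[6]]])
    F2 = np.array([[k[1] * c, -k[3] * s, 0, 0],
                   [k[1] * s,  k[3] * c, 0, 0],
                   [0, 0, k[5], 0],
                   [0, 0, 0, k[7]]])
    return F1, F2

def control_signals(q_d, dq_d, ddq_d, q, dq, v, psi, k, Kp, Kd):
    """u = F1^{-1} (eta + F2 v), with eta = ddq_d + Kd dq~ + Kp q~."""
    q_tilde, dq_tilde = q_d - q, dq_d - dq
    eta = ddq_d + Kd @ dq_tilde + Kp @ q_tilde
    F1, F2 = model_matrices(psi, k)
    u = np.linalg.solve(F1, eta + F2 @ v)
    return np.clip(u, -1.0, 1.0)  # keep the normalized commands in [-1, 1]
```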
Algorithm 1 outlines the logic for executing the spraying mission, incorporating real-time monitoring of the UAV’s payload.
Algorithm 1 Structure of the Control System
 1: Initialize ROS communication, the tracking system (OptiTrack), and the Bebop UAV
 2: Initialize the spraying device (load sensor and pump)
 3: Take off with the Bebop and start the monitoring mission
 4: while !(Route accomplished) and !(Tank empty) do
 5:     Monitor the plantation (image processing)
 6:     if Sick plant detected then
 7:         Pause path tracking
 8:         Execute spraying
 9:         if Defensive amount reached then
10:             Stop spraying
11:             Resume path
12:         end if
13:     else
14:         Follow the path
15:     end if
16:     Get payload sensor data
17:     if Tank is empty then
18:         Return to base to refill the tank | Finish the mission
19:     end if
20:     Get pose data and update the UAV states
21:     Compute the control signals
22:     if Joystick commands exist then
23:         Override the autonomous controller with the joystick input
24:     end if
25:     Send the control signals to the UAV
26:     Store state variables
27: end while

3. Results and Discussion

A proof-of-concept experiment was conducted using two simulated garden beds, each 3 m in length. Plant models were created using EVA leaves of varying sizes (0.40 m, 0.30 m, and 0.20 m) and colors (light and dark green) to represent different crop types. Plants were spaced 0.5 m apart, totaling 11 plants plus one intentional empty position (no plant). The experimental validation video can be accessed via the following link: https://youtu.be/lB_JT8X3N-8, accessed on 16 September 2024.
The image processing algorithm was responsible for classifying the plants as healthy, sick, or absent. Figure 4 shows examples of these categories, with the corresponding segmentation results displayed in the bottom row. Note that the irrigation nozzle is visible in the photographs captured by the UAV; however, it was intentionally set back so that the mechanism would not overlap the plants in the cropped region at the center of the photographs (indicated by the blue square). The images in the second row show the result of our segmentation, where the presence of green, yellow, or black is highlighted, allowing us to identify and classify the health of the analyzed plants.
Figure 5 illustrates the variation in payload, as well as the times when the pump was activated and deactivated during the real-time experimental validation of the proof of concept. The periods labeled calibration, pump OFF, and pump ON are highlighted in gray, green, and red, respectively. These labels correspond to the time required to calibrate the load cell after initialization and the periods when the pump was switched on and off. During the calibration stage, it is not possible to determine the weight available in the tank, which explains the absence of a signal during this process. Both graphs show sharper variations when the UAV takes off; however, despite the initial noise, the signal stabilizes once the UAV is in flight. When the pump is activated, there is a gradual decrease in the signal over time, explained by the fact that the liquid is pumped out of the tank, thus reducing its weight. In both experiments, the same amount of pesticide, 5 g, was released on each sick plant. It is interesting to note that the time interval needed to spray the same amount of pesticide varied between the experiments. The time taken to release the 5 g on each diseased plant was longer in the first experiment compared to the second, indicating a lower flow rate out of the nozzle in the first experiment. Finally, it is worth noting that the noise observed in the signal in both graphs is caused by the vibration of the quadcopter itself in flight.
Pump activation was contingent upon the image segmentation stage, specifically, the detection of diseased plants (yellow colored). In the initial experiment (Figure 5a), despite identifying three sick plants, the pump operated only twice. This discrepancy arose from the load-cell system indicating insufficient liquid for continued application after detecting the third sick plant, necessitating the UAV’s return to base for refueling. To prevent pump damage from dry running, a weight threshold of 30 g was established. This accounted for potential liquid displacement due to the UAV’s movement. Spraying commenced only if a diseased plant was detected and the liquid weight exceeded the threshold. Otherwise, the UAV returned for refueling. In the second experiment (Figure 5b), the initial pesticide load was approximately 50 g, sufficient to cover all beds and treat the three sick plants without intermediate refueling.
The UAV’s trajectory during both experiments is shown in Figure 6. In the first experiment (Figure 6a), the UAV detected insufficient pesticide upon encountering the third diseased plant and returned to base for refueling. This resulted in a deviation from the planned path and omission of inspection for the remaining plants in the second bed. Blank entries indicate that the UAV failed to pass within 0.1 m of these plants, preventing image capture and subsequent analysis. In contrast, in the second experiment (Figure 6b), the UAV successfully identified, sprayed, and classified all plants in the garden. Notably, in both experiments, the desired x d and current x UAV positions closely aligned, demonstrating the control system’s effectiveness in maintaining precise positioning, even while carrying a payload.
The analysis revealed the UAV could carry a maximum payload of 228 g, comprising a 178 g spraying system and 50 g of liquid. This load represents approximately 45.6% of the UAV's estimated 500 g weight. This study focused on developing a self-contained spraying prototype for drone integration, prioritizing weight-based liquid dispensing over nozzle flow-rate control. An RGB LED was implemented as a visual indicator of liquid level during testing: steady green, flashing green, and red denote a liquid weight above 40 g, between 30 and 40 g, and below 30 g, respectively.
It is worth highlighting that our approach can be adapted for non-liquid payloads since the load sensor functions independently of the cargo type. This versatility allows for the utilization of drones in various applications, such as seed distribution in reforestation efforts or the deployment of larvicides in water reservoirs to combat dengue and other similar diseases. This potential for diverse applications highlights the broader implications of our work in advancing UAV technology for environmental and public health initiatives.
In this work, the agricultural spraying system has been designed for targeted applications, allowing for individualized treatment of each plant. This approach stands out compared to traditional systems, primarily due to its ability to ensure that a precise and uniform amount of pesticide is applied to each plant based on the weight of the applied product, rather than relying on time intervals or flow control. Unlike conventional systems that depend on the nozzle’s spray rate, our system measures the actual amount of pesticide deposited, guaranteeing that each plant receives the same quantity of product regardless of terrain variations, environmental conditions, or liquid viscosity. This precision is crucial in crops where uniform treatment directly affects productivity and plant health.
The precise measurement of the liquid’s weight guarantees uniform application, avoiding both under-dosing, which could compromise the treatment’s effectiveness, and over-dosing, which could lead to waste and undesirable environmental impacts. This optimization not only reduces the use of excess inputs, making operations more economical, but also maintains the effectiveness of the treatment.
Additionally, the proposed spraying system allows for flexibility in the amount of pesticide applied to each plant based on its disease level (as determined by the ratio of yellow to green colors). This feature is particularly advantageous for crops with individual treatment needs, such as orchards, vineyards, or gardens, ensuring that more diseased plants receive a higher quantity of pesticide. In essence, through the use of a plant disease index (PDI), the system can automatically adjust the amount of pesticide released in grams, adapting to the specific needs of each plant and variations in environmental conditions. However, this aspect falls outside the scope of the current work and remains a point for future improvement.
In conclusion, we conducted additional tests to validate alternative approaches, which are showcased in the accompanying video (https://youtu.be/IdVl9vTMFpg, accessed on 16 September 2024). Throughout our experiments, we explored various configurations, including adjustments to the control strategy, the simulated liquid dispersion system, and the monitoring and pesticide application strategies. These findings underscore the effectiveness of our proposed methods and provide further insight into the potential of UAV-based spraying systems.

4. Concluding Remarks

This study presents the development of a UAV-based spraying system designed for targeted pesticide application on color-identified diseased plants, aiming to optimize resource use and improve crop health and productivity. The experimental results confirmed the effectiveness of the digital image processing technique, the functionality of the onboard water pumping mechanism, and the accuracy of the UAV’s pesticide weight measurement. A main contribution of this work is the integration of a real-time payload monitoring system, which continuously tracks the weight during flight to ensure precise pesticide application and prevent over- or under-spraying. Additionally, the system supports automatic refueling by detecting low pesticide levels and directing the UAV to return to base when needed.
In conclusion, this work demonstrates an alternative application of UAVs in agriculture. Notably, the employed technique minimizes direct human exposure to agricultural pesticides and unintentional contact with crops, preventing potential developmental delays. The proposed inspection technique could be further expanded to identify mature and green plants, for example. Overall, our study contributes to the development of cost-effective technologies that maximize the potential of UAVs in precision agriculture.
Additionally, our approach can be extended to non-liquid payloads, as the load sensor operates independently of the type of cargo being transported. This enables the use of drones for seed distribution in reforestation missions or for deploying larvicides in water reservoirs to combat dengue and similar diseases.
Future research directions include conducting open-air field tests with real plants. This would introduce additional complexity due to external factors like wind gusts and varying lighting conditions. Additionally, employing RGB-D cameras would provide color and depth information, facilitating real-plant growth assessment based on height measurements, using the relative distance between the drone and the top of the plants.

Author Contributions

Conceptualization, C.O.B., L.A.F.-J., and A.L.C.M.; methodology, C.O.B.; validation, C.O.B., L.A.F.-J., and A.S.B.; data curation, C.O.B. and L.A.F.-J.; writing—original draft preparation, C.O.B.; writing, C.O.B. and A.S.B.; review and editing, A.S.B. and D.C.G.; supervision, A.S.B.; project administration, A.S.B.; funding acquisition, A.S.B. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by FAPEMIG—Fundação de Amparo à Pesquisa do Estado de Minas Gerais (Grant Numbers APQ-02573-21 and APQ-02282-24).

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The data are available to interested readers upon request.

Acknowledgments

Barcelos and Fagundes-Junior thank CAPES—Coordenação de Aperfeiçoamento de Pessoal de Nível Superior and FAPEMIG—Fundação de Amparo à Pesquisa do Estado de Minas Gerais, respectively, for their scholarships.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Mogili, U.R.; Deepak, B. Review on application of drone systems in precision agriculture. Procedia Comput. Sci. 2018, 133, 502–509. [Google Scholar] [CrossRef]
  2. Amaral, L.R.d.; Zerbato, C.; Freitas, R.G.d.; Barbosa Júnior, M.R.; Simões, I.O.P.d.S. UAV applications in Agriculture 4.0. Rev. Ciênc. Agron. 2021, 51, e20207748. [Google Scholar] [CrossRef]
  3. Velusamy, P.; Rajendran, S.; Mahendran, R.K.; Naseer, S.; Shafiq, M.; Choi, J.G. Unmanned Aerial Vehicles (UAV) in precision agriculture: Applications and challenges. Energies 2021, 15, 217. [Google Scholar] [CrossRef]
  4. Tsouros, D.C.; Bibi, S.; Sarigiannidis, P.G. A review on UAV-based applications for precision agriculture. Information 2019, 10, 349. [Google Scholar] [CrossRef]
  5. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A review of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) use in agricultural monitoring in China. Remote Sens. 2021, 13, 1221. [Google Scholar] [CrossRef]
  6. Radoglou-Grammatikis, P.; Sarigiannidis, P.; Lagkas, T.; Moscholios, I. A compilation of UAV applications for precision agriculture. Comput. Netw. 2020, 172, 107148. [Google Scholar] [CrossRef]
  7. Daponte, P.; De Vito, L.; Glielmo, L.; Iannelli, L.; Liuzza, D.; Picariello, F.; Silano, G. A review on the use of drones for precision agriculture. In Proceedings of the IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2019; Volume 275, p. 012022. [Google Scholar]
  8. Souza, B.J.; Stefenon, S.F.; Singh, G.; Freire, R.Z. Hybrid-YOLO for classification of insulators defects in transmission lines based on UAV. Int. J. Electr. Power Energy Syst. 2023, 148, 108982. [Google Scholar] [CrossRef]
  9. Takaya, K.; Ohta, H.; Kroumov, V.; Shibayama, K.; Nakamura, M. Development of UAV system for autonomous power line inspection. In Proceedings of the 2019 23rd International Conference on System Theory, Control and Computing (ICSTCC), Sinaia, Romania, 9–11 October 2019; pp. 762–767. [Google Scholar]
  10. Peng, X.; Zhong, X.; Zhao, C.; Chen, A.; Zhang, T. A UAV-based machine vision method for bridge crack recognition and width quantification through hybrid feature learning. Constr. Build. Mater. 2021, 299, 123896. [Google Scholar] [CrossRef]
  11. da Silva, Y.M.; Andrade, F.A.; Sousa, L.; de Castro, G.G.; Dias, J.T.; Berger, G.; Lima, J.; Pinto, M.F. Computer vision based path following for autonomous unmanned aerial systems in unburied pipeline onshore inspection. Drones 2022, 6, 410. [Google Scholar] [CrossRef]
  12. Fagundes-Junior, L.A.; Barcelos, C.O.; Gandolfo, D.C.; Brandão, A.S. BDP-UaiFly system: A platform for the RoboCup Brazil Open Flying Robot Trial League. In Proceedings of the 2023 International Conference on Unmanned Aircraft Systems (ICUAS), Warsaw, Poland, 6–9 June 2023; pp. 1021–1028. [Google Scholar]
  13. Ren, H.; Zhao, Y.; Xiao, W.; Hu, Z. A review of UAV monitoring in mining areas: Current status and future perspectives. Int. J. Coal Sci. Technol. 2019, 6, 320–333. [Google Scholar] [CrossRef]
  14. Ecke, S.; Dempewolf, J.; Frey, J.; Schwaller, A.; Endres, E.; Klemmt, H.J.; Tiede, D.; Seifert, T. UAV-based forest health monitoring: A systematic review. Remote Sens. 2022, 14, 3205. [Google Scholar] [CrossRef]
  15. Khan, N.A.; Jhanjhi, N.; Brohi, S.N.; Usmani, R.S.A.; Nayyar, A. Smart traffic monitoring system using unmanned aerial vehicles (UAVs). Comput. Commun. 2020, 157, 434–443. [Google Scholar] [CrossRef]
  16. Elloumi, M.; Dhaou, R.; Escrig, B.; Idoudi, H.; Saidane, L.A. Monitoring road traffic with a UAV-based system. In Proceedings of the 2018 IEEE Wireless Communications and Networking Conference (WCNC), Barcelona, Spain, 15–18 April 2018; pp. 1–6. [Google Scholar]
  17. Allred, B.; Martinez, L.; Fessehazion, M.K.; Rouse, G.; Williamson, T.N.; Wishart, D.; Koganti, T.; Freeland, R.; Eash, N.; Batschelet, A.; et al. Overall results and key findings on the use of UAV visible-color, multispectral, and thermal infrared imagery to map agricultural drainage pipes. Agric. Water Manag. 2020, 232, 106036. [Google Scholar] [CrossRef]
  18. Ribeiro, L.F.O.; Vitória, E.L.d.; Soprani Júnior, G.G.; Chen, P.; Lan, Y. Impact of Operational Parameters on Droplet Distribution Using an Unmanned Aerial Vehicle in a Papaya Orchard. Agronomy 2023, 13, 1138. [Google Scholar] [CrossRef]
  19. Lopes, L.d.L.; Cunha, J.P.A.R.d.; Nomelini, Q.S.S. Use of Unmanned Aerial Vehicle for Pesticide Application in Soybean Crop. AgriEngineering 2023, 5, 2049–2063. [Google Scholar] [CrossRef]
  20. Qi, P.; Zhang, L.; Wang, Z.; Han, H.; Müller, J.; Li, T.; Wang, C.; Huang, Z.; He, M.; Liu, Y.; et al. Effect of Operational Parameters of Unmanned Aerial Vehicle (UAV) on Droplet Deposition in Trellised Pear Orchard. Drones 2023, 7, 57. [Google Scholar] [CrossRef]
  21. Meng, Y.; Su, J.; Song, J.; Chen, W.H.; Lan, Y. Experimental evaluation of UAV spraying for peach trees of different shapes: Effects of operational parameters on droplet distribution. Comput. Electron. Agric. 2020, 170, 105282. [Google Scholar] [CrossRef]
  22. Tang, Y.; Hou, C.; Luo, S.; Lin, J.; Yang, Z.; Huang, W. Effects of operation height and tree shape on droplet deposition in citrus trees using an unmanned aerial vehicle. Comput. Electron. Agric. 2018, 148, 1–7. [Google Scholar] [CrossRef]
  23. Basso, M.; Pignaton de Freitas, E. A UAV guidance system using crop row detection and line follower algorithms. J. Intell. Robot. Syst. 2020, 97, 605–621. [Google Scholar] [CrossRef]
  24. Ruan, Z.; Chang, P.; Cui, S.; Luo, J.; Gao, R.; Su, Z. A precise crop row detection algorithm in complex farmland for unmanned agricultural machines. Biosyst. Eng. 2023, 232, 1–12. [Google Scholar] [CrossRef]
  25. Bah, M.D.; Hafiane, A.; Canals, R. CRowNet: Deep network for crop row detection in UAV images. IEEE Access 2019, 8, 5189–5200. [Google Scholar] [CrossRef]
  26. Santana, L.V.; Brandão, A.S.; Sarcinelli-Filho, M. Navigation and cooperative control using the AR.Drone quadrotor. J. Intell. Robot. Syst. 2016, 84, 327–350. [Google Scholar] [CrossRef]
Figure 1. Illustration of the task of real-time monitoring of a carrot crop using the proposed path-following approach.
Figure 2. Reference system for Parrot Bebop 2 UAV using Tait–Bryan angles.
Figure 3. Developed spraying system reservoir. (a) Items: 1. UAV support; 2. load cell; 3. load cell support; 4. reservoir cover; 5. nozzle with 3 outlets; 6. pump power feed; 7. refueling inlet; 8. pump; 9. load cell power feed. (b) Spraying system mounted on the UAV.
Figure 4. Captured photos and segmented images. (left) Healthy plant. (middle) Sick plant. (right) No plant.
Figure 5. Evolution of pesticide mass in UAV tank and corresponding pump states.
Figure 6. UAV flight path with corresponding plant disease classification. Note: NO and NC mean ‘no plants’ and ‘not computed’, respectively.