Review

A Review of Intelligent Orchard Sprayer Technologies: Perception, Control, and System Integration

1 School of Agricultural Engineering, Jiangsu University, Zhenjiang 212013, China
2 Jiangsu Provincial Key Laboratory of Hi-Tech Research for Intelligent Agricultural Equipment, Jiangsu University, Zhenjiang 212013, China
3 Ping An Property & Casualty Insurance Company of China, Ltd., Hebei Branch, Shijiazhuang 050000, China
* Author to whom correspondence should be addressed.
Horticulturae 2025, 11(6), 668; https://doi.org/10.3390/horticulturae11060668
Submission received: 17 April 2025 / Revised: 31 May 2025 / Accepted: 5 June 2025 / Published: 11 June 2025
(This article belongs to the Section Fruit Production Systems)

Abstract

With the ongoing advancement of global agricultural modernization, intelligent technologies have gained significant attention in agricultural production—particularly in the field of intelligent orchard sprayers, where notable progress has been achieved. Intelligent orchard sprayers, equipped with precise sensing and control systems, enable targeted spraying. This enhances the efficiency of crop health management, reduces pesticide usage, minimizes environmental pollution, and supports the development of precision agriculture. This review focuses on three core modules of intelligent sprayer technology: perception and intelligent control, spray deposition and drift control, and autonomous navigation with system integration. By addressing key areas such as sensor technologies, object detection algorithms, and real-time control strategies, this review explores current challenges and future directions for intelligent orchard sprayer technology. It also discusses existing technical bottlenecks and obstacles to large-scale adoption. Finally, this review highlights the pivotal role of intelligent orchard sprayer technology in enhancing crop management efficiency, improving environmental sustainability, and facilitating the transformation of agricultural production systems.

1. Introduction

As the global population continues to grow and arable land becomes increasingly scarce, modern agriculture faces unprecedented challenges, highlighting the urgent need to enhance production efficiency to ensure food security [1,2,3]. Against this backdrop, plant protection plays a critical role in maintaining crop yield and quality as a core component of agricultural production. However, traditional plant protection methods suffer from several limitations, including low pesticide utilization caused by uniform large-scale spraying, severe environmental pollution, low operational efficiency, and significant pesticide residue concerns [4,5,6].
Studies have shown that the effective pesticide utilization rate in conventional spraying operations is relatively low. A significant portion of pesticides is deposited in non-target areas or released into the environment, resulting in economic losses and risks to ecosystems and human health [7,8,9]. In pesticide spraying applications, fixed-rate spraying is commonly adopted, often neglecting uneven crop distribution and complex spatial structures. This results in pesticide waste and reduced spraying efficacy [10]. Research has shown that excessive spraying not only fails to enhance deposition on target surfaces but also leads to droplet runoff and drift, ultimately decreasing spraying efficiency [11].
Orchards, characterized by complex spatial structures and heterogeneous crop distributions, place greater demands on the precision and adaptability of plant protection operations. Unlike field crops, fruit trees usually exhibit irregular canopy structures and varying plant heights, complicating the control of spray deposition on plant surfaces [12,13]. Moreover, fluctuating wind conditions, shading, and other environmental factors in orchards further challenge the sensing, decision-making, and actuation capabilities of spraying systems. In conventional spraying, the lack of target recognition and adaptive control often results in over-deposition on the canopy front and insufficient coverage on the rear, thereby reducing pesticide use efficiency and disease control efficacy.
In response, intelligent and precision-oriented orchard plant protection has emerged as a key focus in modern agriculture [14,15,16]. Intelligent orchard sprayers integrate advanced sensing technologies, intelligent decision-making algorithms, and precision actuators to dynamically adjust spraying strategies in response to real-time field conditions. Research indicates that such technologies significantly enhance pesticide utilization and reduce environmental pollution while maintaining effective crop protection [17,18].
In recent years, propelled by advancements in sensor technology, deep learning, and agricultural robotics, intelligent orchard sprayers have attracted increasing attention and seen preliminary implementation. RGB-D, LiDAR, and ultrasonic sensors demonstrate high adaptability in target detection and 3D modeling. YOLO algorithms and improved convolutional neural networks (CNNs) have greatly enhanced detection accuracy in complex orchard environments [19,20,21]. Concurrently, real-time control strategies, variable-rate spraying technologies, autonomous navigation, and integrated systems have become key enablers of precision spraying operations.
Despite rapid technological progress, significant challenges remain in practical applications, such as insufficient sensing accuracy under complex conditions, delayed control responses, poor deposition efficiency, and difficulty controlling spray drift [22,23]. For instance, Zhou et al. [24] found that ultrasonic sensors achieved a 91.2% detection accuracy for canopy thickness in laboratory settings, but this dropped to 80.9% in real-tree detection experiments due to interference from lighting and occlusions. Ramón Salcedo et al. [23] reported that mismatched sampling frequencies and spray valve response times caused system delays and led to a 6.2% increase in airborne drift.
To overcome these limitations, researchers have explored enhancements in detection algorithms, sensor accuracy, data fusion from multiple sources, and system integration. Looking ahead, intelligent orchard sprayers are expected to evolve with more efficient sensing, intelligent control mechanisms, and robust system integration, advancing sustainable and smart agricultural development.

2. Perception and Intelligent Control

2.1. Sensor Technologies

The perception system of an intelligent orchard sprayer serves as the foundational link in the technology chain, and its performance directly determines the accuracy of subsequent control processes. In recent years, rapid advancements in sensor technology have enabled orchard environment perception to evolve from single-modality visual detection to multimodal information fusion systems. This shift addresses challenges unique to orchards, such as light variability, canopy occlusion, and complex 3D structures. These systems facilitate 3D tree modeling, canopy parameter estimation, and targeted spray control. Key research focuses include the integration of heterogeneous sensing technologies for robust data acquisition, the advancement of detection algorithms to improve recognition accuracy under orchard-specific conditions, and the refinement of real-time control strategies to enable precise and adaptive spraying operations.

2.1.1. RGB-D Sensor

In complex orchard environments, sensor selection must comprehensively consider factors such as adaptability to lighting conditions, 3D reconstruction capability, and cost-effectiveness. RGB cameras are widely used in tasks such as canopy feature recognition and pest and disease monitoring due to their low cost, high resolution, and ease of deployment. However, their sensitivity to lighting variations and inability to capture 3D spatial information limit their effectiveness in target localization and canopy structure modeling. In contrast, RGB-D cameras capture both color and depth data, offering superior spatial perception and a favorable cost-performance ratio. They are well-suited for close-range fruit tree recognition and robotic navigation. Due to their rich color and depth information, RGB-D cameras have become a research hotspot [25,26].
Xue et al. [27] used depth information to assist in the identification and segmentation of citrus tree canopies. The results showed that both RGB and RGB-D achieved good segmentation accuracy. However, in more complex background environments, RGB-D detectors outperformed RGB detectors in segmentation accuracy. Using RGB-D images increased precision by 4.67% compared to RGB images alone. Tang et al. [28] utilized RGB-D data for apple recognition and localization. Under conditions of occlusion and overlap, their model achieved the highest F1-score and mAP (mean average precision) of 88.5% and 90.1%, respectively, on the apple instance segmentation dataset. The detection process is illustrated in Figure 1.

2.1.2. LiDAR Sensor

Light Detection and Ranging (LiDAR), with its active sensing capability and millimeter-level spatial resolution, has become a core sensor for environmental perception in intelligent orchard sprayers [29,30]. Unlike visual sensors, which are sensitive to changes in illumination, LiDAR emits laser pulses to directly measure distances to targets. It maintains stable performance even under challenging conditions such as haze or backlighting [31,32].
3D LiDAR generates dense point clouds through multi-beam rotational scanning, significantly improving the completeness of spatial information, although at a relatively higher cost. Yang et al. [33] proposed a LiDAR-based self-calibration method for measuring the 3D morphology of fruit tree canopies. The approach includes 3D point cloud generation, point cloud registration, and canopy information extraction for apple trees. Under the double-visual-angle condition, the average relative errors (ARE) between calculated and measured morphological parameters—tree height, maximum width, and canopy thickness—were 2.5%, 3.6%, and 3.2%, respectively. These results demonstrate the method’s accuracy in extracting 3D canopy information and its significance for intelligent control in standardized orchards.
Suchet Bargoti et al. [34] proposed a trunk localization pipeline based on ground LiDAR data to identify individual apple trees in orchards. The method achieved 89% segmentation accuracy and demonstrated the potential of LiDAR for precise tree detection in orchard environments.
Wang et al. [35] mounted the LiDAR sensor on a slider and controlled its movement speed using a motor to detect canopy volume at varying speeds. They proposed the Cuboid Accumulation of Divided Areas (CADA), an online LiDAR-based method for detecting fruit tree canopy volume using dynamic mesh division. Detection experiments were conducted on both simulated and real peach tree canopies. Results showed that the CADA method had an 8.33% error in estimating simulated canopy volume, primarily due to inaccuracies at the canopy edges. At a movement speed of 1 m/s, the CADA method achieved a canopy volume detection accuracy of 99.18%. The detection principle of the canopy volume calculation is illustrated in Figure 2.
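To illustrate the cuboid-accumulation idea behind this kind of method, the sketch below voxelizes a canopy point cloud into fixed-size cells and sums the volume of the occupied cells. The cell size, function name, and toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def canopy_volume_by_cuboids(points: np.ndarray, cell: float = 0.05) -> float:
    """Estimate canopy volume by accumulating occupied cubic cells.

    points: (N, 3) array of LiDAR returns from the canopy (x, y, z) in metres.
    cell:   edge length of each cubic cell in metres (illustrative value).

    Each cell is counted once regardless of how many points fall inside it,
    so the estimate equals the number of occupied cells times the cell volume.
    """
    if points.size == 0:
        return 0.0
    idx = np.floor(points / cell).astype(np.int64)   # cell index of every point
    occupied = np.unique(idx, axis=0)                # one entry per occupied cell
    return occupied.shape[0] * cell ** 3             # volume in m^3

if __name__ == "__main__":
    # Toy check: points sampled uniformly inside a 1 m x 1 m x 1 m block
    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 1.0, size=(50_000, 3))
    print(f"estimated volume ≈ {canopy_volume_by_cuboids(pts):.3f} m^3")  # close to 1.0
```

Finer cells capture canopy edges more faithfully but require denser point clouds, which mirrors the edge-related error reported for the simulated canopies above.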

2.1.3. Ultrasonic Sensor

Ultrasonic sensors, due to their low cost, strong environmental adaptability, and real-time distance measurement capabilities, have shown unique value in extracting fruit tree canopy contours. Their working principle is based on time-of-flight (ToF) measurement of sound waves. By emitting high-frequency acoustic signals and receiving the reflected echoes, the relative distance to a target can be determined [36,37].
Zhou et al. [24] used echo signals from ultrasonic sensors to estimate canopy thickness. Since ultrasonic waves propagate at a constant speed in a uniform medium and reflect upon encountering obstacles, this property was utilized for measurement. The distance to the target can be estimated based on the time interval between signal emission and reception, combined with the propagation speed of ultrasonic waves under current temperature conditions. Experimental results showed that the average error between measured and actual canopy thickness was 19.1%. The diagram of canopy thickness measurement is shown in Figure 3. Tomas Palleja et al. [38] also proposed a real-time method using an array of ultrasonic sensors to estimate canopy density in apple orchards and vineyards. The results indicated that the method achieved estimation errors of 3.8% in apple orchards and 14.1% in vineyards, demonstrating sufficient sensitivity in capturing canopy features.
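The time-of-flight principle described above can be expressed in a few lines of code. The temperature-corrected sound-speed approximation and the two-sided thickness calculation below are illustrative assumptions, not the exact setup of the cited studies.

```python
def sound_speed(temp_c: float) -> float:
    """Approximate speed of sound in air (m/s) at a given air temperature (°C)."""
    return 331.4 + 0.6 * temp_c

def echo_distance(tof_s: float, temp_c: float = 20.0) -> float:
    """Distance (m) to a target from the round-trip time of flight of an ultrasonic pulse."""
    return sound_speed(temp_c) * tof_s / 2.0

def canopy_thickness(sensor_spacing_m: float, d_left: float, d_right: float) -> float:
    """Canopy thickness (m) when two opposing sensors face the same row.

    sensor_spacing_m: distance between the two sensors across the row
    d_left, d_right:  echo distances from each sensor to the nearest canopy surface
    """
    return max(sensor_spacing_m - d_left - d_right, 0.0)

# Example: a 3.2 ms round trip measured at 25 °C from each side of a 4 m lane
d = echo_distance(3.2e-3, temp_c=25.0)
print(f"distance ≈ {d:.2f} m, thickness ≈ {canopy_thickness(4.0, d, d):.2f} m")
```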

2.1.4. Comparative Evaluation of Orchard Sensor Technologies

In actual orchard environments, RGB-D, LiDAR, and ultrasonic sensors each exhibit distinct strengths and limitations in terms of detection accuracy, environmental adaptability, and cost-effectiveness. RGB cameras offer high resolution and are cost-efficient, but their performance can be severely affected by backlighting, often leading to overestimated canopy volumes. LiDAR sensors provide high spatial accuracy and robustness under variable lighting but are relatively expensive and may be influenced by branch structures. Ultrasonic sensors are economical and effective under challenging weather conditions but tend to suffer from accuracy loss due to leaf occlusion and uneven canopy distribution. As shown in Table 1, the relative error and environmental sensitivity of each sensor type vary depending on orchard structure and canopy density.

2.1.5. Multi-Sensor Fusion

The complex and variable conditions in orchards—such as illumination fluctuations, leaf occlusion, and weather interference—make it difficult for a single sensor to comprehensively capture canopy features. As shown in Table 1, ultrasonic sensors performed well in lab tests with sparse foliage but showed higher error rates in denser canopies. These performance discrepancies highlight the need for sensor fusion strategies that leverage complementary features and mitigate individual limitations. As a result, multimodal sensor fusion has emerged as a key solution to enhance the perception reliability of intelligent spraying systems. By leveraging the complementary advantages and redundancy verification of heterogeneous sensors, the physical limitations of individual sensors can be effectively overcome, enabling the acquisition of multimodal data [42,43]. Multimodal data enable multi-stage information integration and offer a more comprehensive dataset than single-source inputs, thereby improving the accuracy and precision of prediction results. Another key advantage of multimodal data is the reduction of dependency on a single data source, which enhances model robustness [44,45]. Lu et al. [46] estimated the leaf area index (LAI) of citrus using multimodal data. Compared to estimations based solely on RGB or point cloud data, the multimodal approach improved the coefficient of determination (R2) by 10.0% and 6.8%, respectively. A comparison between multimodal and unimodal data is shown in Table 2.

2.2. Object Detection Algorithms

With the development of smart agriculture, orchard target detection algorithms have become a key technology for precision operation and automated management. Their core task is to accurately identify agricultural targets such as fruits, trunks, and branches in complex natural environments, providing fundamental data support for variable spraying and orchard pest and disease identification [47,48].

2.2.1. General Architectures—YOLO Series Algorithms

As a representative of single-stage detection algorithms, the YOLO series is widely used in orchard fruit detection due to its fast detection speed, simple architecture, and relatively high accuracy [49,50]. In practical detection, the model divides the image into grids, where each grid predicts bounding boxes and class probabilities. When a predicted box exceeds a confidence threshold and is classified as a fruit, it is considered a detection. The schematic of the YOLO detection algorithm is shown in Figure 4.
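A minimal post-processing sketch of this prediction step is given below: candidate boxes are filtered by a confidence threshold and overlapping duplicates are removed with non-maximum suppression (NMS). The thresholds and toy boxes are illustrative assumptions rather than the settings of any cited model.

```python
import numpy as np

def iou(a, b):
    """Intersection over union of two boxes given as (x1, y1, x2, y2)."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter + 1e-9)

def filter_detections(boxes, scores, conf_thr=0.25, iou_thr=0.45):
    """Keep boxes above the confidence threshold, then suppress overlapping duplicates."""
    order = [int(i) for i in np.argsort(scores)[::-1] if scores[i] >= conf_thr]
    keep = []
    while order:
        best = order.pop(0)
        keep.append(best)
        order = [i for i in order if iou(boxes[best], boxes[i]) < iou_thr]
    return keep

# Two overlapping candidates for the same fruit plus one separate fruit
boxes = np.array([[10, 10, 50, 50], [12, 12, 52, 52], [100, 100, 140, 140]], dtype=float)
scores = np.array([0.90, 0.60, 0.80])
print(filter_detections(boxes, scores))   # -> [0, 2]
```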
With continuous iterations, the YOLO series has shown improved adaptability and robustness in orchard detection tasks. Hamzeh Mirhaji et al. [52] applied YOLOv2, YOLOv3, and YOLOv4 to citrus detection and systematically evaluated the detection performance and processing speed of each version, as illustrated in Figure 5 and Figure 6. Results showed that YOLOv4 significantly outperformed its predecessors in both accuracy and speed. Its daytime detection mAP increased from 74.24% (YOLOv2) to 91.41%, and nighttime detection mAP improved from 73.51% to 90.36%. The per-image processing time also decreased from 28.31 ms to 23.6 ms, demonstrating excellent real-time performance and stability. However, the model still exhibited some misclassification in scenarios where yellow leaves and citrus fruits were visually similar, indicating a need to further enhance target discrimination under complex backgrounds.
Ranjan Sapkota et al. [53] applied YOLOv8 combined with instance segmentation to detect immature green apples. By integrating 3D point cloud geometric fitting, they achieved accurate detection of small fruits. YOLOv8 demonstrated superior target recognition under complex backgrounds and maintained a detection accuracy of 94% even when fruit and foliage had similar color and texture. The integration of point cloud modeling further improved the model’s accuracy and robustness in estimating fruit size, providing technical support for intelligent identification and precision management of small fruits in orchards.
Furthermore, Li et al. [54] introduced the D3-YOLOv10 model, specifically designed for tomato detection in protected cultivation environments. The model achieved a mAP0.5 of 91.8%, while significantly reducing the number of parameters and computational complexity (measured in FLOPs). Moreover, it maintained a high inference speed of 80.1 FPS, indicating strong potential for real-time applications in smart agriculture.

2.2.2. Adaptive Mechanisms—Attention Mechanisms

To enhance detection accuracy, the integration of attention mechanisms has become a key direction in algorithm optimization. These mechanisms automatically focus on target-relevant regions in an image, amplifying salient features while suppressing irrelevant background information, thereby improving the model’s ability to detect small targets such as fruits [55,56]. Typical approaches, such as channel attention, spatial attention, and Transformer-based architectures, have demonstrated promising results in the detection of orchard crops like apples and citrus.
The Squeeze-and-Excitation Module (SEM) adaptively adjusts the importance of each channel, enhancing the representation of fruit features along the channel dimension. This is particularly effective in scenarios where fruit colors closely resemble the background [57,58]. To improve the accuracy and speed of kiwifruit detection, Yang et al. [59] integrated SEM into the YOLOv4-tiny architecture. This enhancement significantly boosted feature extraction capabilities, resulting in a detection accuracy of 93.96%, a 7.07% increase in mAP, and a reduced per-image inference time to 8.50 ms—markedly outperforming the original model.
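The squeeze-and-excitation idea can be sketched in a few lines of PyTorch: global average pooling summarizes each channel, a small bottleneck produces per-channel weights, and the feature map is rescaled. The reduction ratio and tensor sizes below are illustrative, not the configuration of the cited YOLOv4-tiny study.

```python
import torch
import torch.nn as nn

class SqueezeExcite(nn.Module):
    """Minimal squeeze-and-excitation block: reweight channels using global context."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)          # squeeze: global average pooling
        self.fc = nn.Sequential(                     # excitation: bottleneck + gating
            nn.Linear(channels, channels // reduction),
            nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        b, c, _, _ = x.shape
        weights = self.fc(self.pool(x).view(b, c)).view(b, c, 1, 1)
        return x * weights                           # scale each channel by its weight

# Example: reweight a 64-channel feature map from a detector backbone
features = torch.randn(1, 64, 80, 80)
print(SqueezeExcite(64)(features).shape)             # torch.Size([1, 64, 80, 80])
```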
The Convolutional Block Attention Module (CBAM) combines channel and spatial attention to more precisely extract salient regions of targets in an image, thereby improving performance in detecting occluded and densely packed objects [60]. Jiang et al. [61] incorporated the CBAM module into YOLOv4, raising the average accuracy of apple fruitlet detection to 97.2%. The model also demonstrated enhanced robustness in low-quality images affected by occlusion, blurring, and lighting interference.
A Spatial Attention Module (SAM) enhances detection performance by emphasizing leaf color, structure, and branch texture, thereby reducing the likelihood of missed detections. In orchard detection tasks, Akdoğan et al. [62] integrated the SAM into the PP-YOLO model, increasing the F1-score from 94.3% to 95.8% and achieving a mAP50 of 98.3%. The SAM module alone contributed a 0.6% improvement, significantly enhancing detection accuracy under complex backgrounds.
Attention mechanisms based on Transformer architectures can model global dependencies among features, improving detection and recognition in complex fruit distribution scenarios such as multiple occluded fruits. In a pear detection task, Huang et al. [63] introduced a lightweight Transformer architecture and HiLo attention mechanism into the RT-DETR model. This improved the model’s feature extraction capacity under complex natural environments, achieving a detection accuracy of 93.7% and a mAP of 98%. Additionally, the number of parameters was reduced by 48.47%, achieving both high accuracy and deployment efficiency.

2.2.3. Customized Detection Architectures and Fusion Strategies

To address challenges such as species diversity, complex growth stages, and strong background interference in orchard target detection, researchers have developed various fusion optimization strategies and customized architectures based on general object detection models. These advancements aim to further improve detection accuracy and robustness in complex orchard environments [64]. These strategies include lightweight network design, improved feature fusion structures, loss function optimization, and deep integration with attention mechanisms, laying the foundation for efficient perception in orchard scenarios.
To tackle the challenge of detecting small objects under complex lighting conditions in densely planted pitaya orchards, Zhang et al. [65] proposed a lightweight improved model based on YOLOv5s. The model integrates the Ghost module, BiFPN feature fusion, and the SIoU loss function, enhancing detection accuracy while maintaining real-time performance. With the integration of the CAM attention module, the model achieved a pitaya detection accuracy of 92.1% in nighttime environments—an improvement of approximately 7.6% over the original YOLOv5s—demonstrating enhanced robustness and adaptability to varying scenes.
Liu et al. [66] addressed the challenge of detecting dense small targets in citrus orchards by developing a lightweight and efficient orchard-specific detection model based on YOLOv5. The model incorporates a Coordinate Attention (CA) module, BiFPN feature fusion, and the Varifocal Loss. The method achieved an average precision of 98.4% across four types of citrus trees, with improvements of up to 7.5% over the original YOLOv5. The average inference time per image was 0.019 s, enabling accurate recognition and fast inference for densely distributed small fruits. The model also demonstrated stable performance across multiple citrus varieties, indicating strong generalization capability and potential for real-world deployment.

2.2.4. Performance of YOLO Models Under Complex Orchard Conditions

While the aforementioned architectural enhancements and fusion strategies have significantly improved the generalizability and computational efficiency of YOLO-based models, their effectiveness in practical orchard environments remains subject to further scrutiny. Most existing evaluations are conducted under controlled conditions and do not fully capture the challenges posed by real-world scenarios—such as variable lighting, fruit occlusion, dense canopies, and complex background interference [67,68].
To bridge this gap, it is essential to assess how different YOLO architectures perform under such environmental constraints, as shown in Table 3. The following section synthesizes recent studies that have experimentally evaluated YOLO-based models in complex orchard conditions, offering insights into their robustness, limitations, and areas for further improvement.

2.3. Real-Time Control Strategies for Orchard Spraying Systems

2.3.1. Key Factors Affecting Deposition Efficiency

In intelligent orchard spraying systems, environmental perception is only the starting point; the key to achieving efficient spraying lies in promptly and accurately transmitting perception results to actuators to enable dynamic adjustment of spraying parameters [74,75]. Real-time control strategies allow for rapid response and dynamic adjustment during the spraying process. Intelligent orchard sprayers must dynamically control nozzle activation, spray volume, and spray angle based on real-time perception data—such as tree distribution, leaf area density, and wind conditions—to enhance pesticide utilization efficiency and reduce drift losses [76].
Through nozzle-array-based zoning and electromagnetic valve-based independent control, identified fruit tree areas can be mapped to corresponding nozzle zones, enabling precise “see-and-spray” responses [77]. Motalab et al. [78] developed a CAN-bus-based electronic control unit (ECU) that enables seamless communication between machine vision systems and boom sprayers. In field experiments conducted on a utility task vehicle (UTV) equipped with 12 nozzles and a six-camera machine vision system, the ECU demonstrated robust real-time control. Specifically, the system sent control commands every 10 ms and issued 400 ms spray bursts upon detection, achieving a minimum spray length of 345 mm at speeds up to 9.66 km/h. Multiple target distributions were tested—including individual, bulk, and sequential layouts—demonstrating 100% target spray accuracy and consistent control performance across speed levels. These results confirm the feasibility and accuracy of such systems in realistic spraying scenarios.
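As a rough sketch of such a "see-and-spray" control tick, the code below maps a detected canopy region to the nozzle indices that cover it and schedules a fixed-length burst. The image width, nozzle count, burst length, and pixel-to-nozzle mapping are illustrative assumptions, not the logic of the cited ECU.

```python
from dataclasses import dataclass

N_NOZZLES = 12        # nozzles across the boom
IMAGE_WIDTH = 1280    # camera image width in pixels (illustrative)
BURST_S = 0.4         # spray burst length per detection, in seconds

@dataclass
class Detection:
    x_min: int        # left edge of the detected canopy region (pixels)
    x_max: int        # right edge of the detected canopy region (pixels)

def nozzles_for(det: Detection) -> list:
    """Map a detected image region to the nozzle indices that cover it."""
    px_per_nozzle = IMAGE_WIDTH / N_NOZZLES
    first = int(det.x_min // px_per_nozzle)
    last = min(int(det.x_max // px_per_nozzle), N_NOZZLES - 1)
    return list(range(first, last + 1))

def control_tick(detections, now, off_times):
    """One control-loop tick: extend bursts for new detections, return valve states."""
    for det in detections:
        for n in nozzles_for(det):
            off_times[n] = max(off_times[n], now + BURST_S)
    return [now < t for t in off_times]

off_times = [0.0] * N_NOZZLES
states = control_tick([Detection(300, 520)], now=1.0, off_times=off_times)
print([i for i, on in enumerate(states) if on])   # nozzle indices currently spraying
```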
Delay compensation control based on predictive modeling addresses the time lag between target detection and spraying by incorporating target motion prediction, spray delay modeling, and mistargeting correction to improve hit accuracy [79]. Ma et al. [80] implemented an ultrasonic-based sprayer in orchards that dynamically adjusted nozzle activation timing based on vehicle speed and canopy position. Field trials showed that this delay-compensated system significantly improved spraying precision, with a measured 25.16% front canopy coverage under low-volume spray conditions. The study emphasized system responsiveness in actual orchard environments with variable canopy geometries, validating the algorithm’s effectiveness in minimizing over-spray and under-spray. Such field-tested implementations provide practical insights for real-time control strategy optimization in orchard-specific applications.
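A minimal sketch of this kind of delay compensation is shown below, assuming the sensor is mounted a fixed distance ahead of the nozzle along the direction of travel and the solenoid valve has a known response lag; both values are hypothetical.

```python
def spray_trigger_delay(sensor_to_nozzle_m: float,
                        ground_speed_mps: float,
                        valve_lag_s: float = 0.05) -> float:
    """Time to wait after a detection before energizing the valve.

    sensor_to_nozzle_m: longitudinal offset between sensor and nozzle (m)
    ground_speed_mps:   current vehicle ground speed (m/s)
    valve_lag_s:        electromechanical response time of the solenoid valve (s)
    """
    travel_time = sensor_to_nozzle_m / max(ground_speed_mps, 1e-3)  # avoid divide-by-zero
    return max(travel_time - valve_lag_s, 0.0)

# Sensor mounted 0.6 m ahead of the nozzle, vehicle travelling at 1.2 m/s
print(f"open valve after {spray_trigger_delay(0.6, 1.2):.3f} s")    # ≈ 0.450 s
```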

2.3.2. Spray Control Strategies and Algorithms

To achieve precise and efficient pesticide application, intelligent orchard sprayers must conduct real-time perception and spray control under complex environmental conditions. Control strategies serve as the system’s core, dictating the responsiveness, stability, and adaptability of the spraying process. Current mainstream spraying control strategies include traditional Proportional-Integral-Derivative (PID) control, fuzzy logic control, model predictive control (MPC), and rule-based approaches—each offering distinct advantages for specific application scenarios.
Wen et al. [81] applied a PID control algorithm to an orchard spraying system, achieving a steady-state deviation of 2.16% between actual and target flow rates and demonstrating its suitability for precision variable-rate spraying. Outdoor tests showed that the system, based on STM32 control and pulse width modulation, achieved rapid flow adjustment, with nozzle outputs ranging from 0.16 to 0.54 L/min, and actual droplet deposition closely matching prescription values, supporting its field applicability. Shi et al. [82] developed a sensor-integrated real-time variable-rate spraying system utilizing a fuzzy control algorithm for adaptive regulation of spray dosage. Fuzzy rules and membership functions were constructed and jointly simulated in Simulink. The system demonstrated short transition times, stable performance, and enhanced spraying sensitivity and accuracy, highlighting its potential for orchard applications. Ivo Vatavuk et al. [83] validated an MPC-based vineyard sprayer in field trials, achieving a root mean square (RMS) tracking error of 4.32 mm and a maximum error of 22.16 mm, thereby demonstrating centimeter-level path control under orchard conditions. However, two practical limitations remain: (1) vehicle tilt caused by uneven terrain is not yet compensated, and (2) canopy row identification still requires manual input. Future work aims to address these issues by integrating inertial correction and a stereo-vision foliage detection module. This case succinctly shows how coupling perception with advanced control moves theoretical models toward reliable, in-field application.
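For reference, a bare-bones discrete PID loop of the kind used for flow regulation is sketched below against a crude first-order valve model. The gains, time constant, and flow limits are illustrative assumptions, not parameters from the cited systems.

```python
class PID:
    """Discrete PID controller producing a PWM duty cycle in [0, 1]."""
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt
        deriv = (err - self.prev_err) / dt
        self.prev_err = err
        u = self.kp * err + self.ki * self.integral + self.kd * deriv
        return min(max(u, 0.0), 1.0)               # clamp to a valid duty cycle

# Drive a hypothetical valve toward a 0.40 L/min target flow
pid = PID(kp=2.0, ki=3.0, kd=0.01)
flow, target, dt = 0.16, 0.40, 0.01                # initial flow, setpoint, time step (s)
for _ in range(1000):                              # simulate 10 s of regulation
    duty = pid.update(target, flow, dt)
    flow += (0.54 * duty - flow) * dt / 0.2        # first-order valve/flow dynamics
print(f"flow after 10 s: {flow:.2f} L/min (target 0.40)")
```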
Control strategy selection should consider the complexity of orchard operations and specific system performance demands [84]. Recent trends highlight that integrating multi-sensor fusion with intelligent algorithms for adaptive control strategy selection or hybridization is critical to improving the real-time responsiveness and intelligence of smart spraying systems. Peter Berk et al. [85] developed a fuzzy logic-based spraying system for apple orchards, which used LiDAR-derived canopy leaf area estimates to dynamically adjust the pulse width of electromagnetic valves, enabling dosage control tailored to specific canopy segments. Field trials in apple orchards showed that this method reduced pesticide usage by an average of 17.92% compared to conventional fixed-rate spraying, with particularly notable savings during the later growth stage (BBCH 91). By enabling demand-based spraying, the system improved pesticide utilization efficiency and proved suitable for orchards with complex canopy structures and dense foliage, demonstrating strong application potential. Song et al. [86] introduced a variable-rate spraying control system combining adaptive fuzzy PID and chaotic optimization. The approach was validated through experimental tests on a real-time spraying control platform, reducing response time to 0.9 s (a 59% improvement over traditional PID), improving the effective droplet ratio to 89.4%, and limiting overshoot to under 2.67%. These results demonstrated improved control accuracy and stability under dynamic conditions, offering vital support for high-precision and high-efficiency orchard spraying.

3. Spray Deposition and Pesticide Drift Control

3.1. Key Factors Affecting Spray Deposition Efficiency

Spray deposition efficiency refers to the quantity of pesticide that settles on the target crop surface. It directly influences pesticide effectiveness, economic efficiency, and environmental safety. Optimizing spray deposition efficiency is essential for enhancing pesticide utilization, reducing waste, and minimizing off-target drift. Multiple factors influence spray deposition, primarily including nozzle type, droplet size, wind speed and direction, and canopy structure [87,88].
The nozzle is a critical component of the spraying system, as it directly determines the spray pattern, droplet size distribution, and consequently, the deposition efficiency. Different nozzle types generate varying spray characteristics, including atomization quality, spray angle, and flow rate, all of which significantly affect deposition performance [89,90,91]. Guo et al. [92] conducted a comparative study of four commercial nozzles in a pear orchard. The mean deposition for each nozzle type is summarized in Table 4. The IDK nozzle achieved the highest deposition on the upper canopy (0.488 μL/cm²), significantly surpassing the TR nozzle (0.284 μL/cm²). This disparity is mainly due to the IDK 90015 being an air-assisted nozzle. Air assistance is an effective method for enhancing droplet deposition.
Notably, electrostatic spraying has demonstrated significant advantages in improving leaf surface coverage, particularly on the abaxial (underside) surface. Compared to conventional spraying, electrostatic techniques can increase adaxial (upper) coverage by 54% and abaxial coverage by up to 112%. However, its overall deposition efficiency is highly sensitive to ambient humidity fluctuations and requires further optimization [93].
In orchard spraying, external wind alters droplet trajectories and may reduce deposition efficiency. Moderate wind can promote canopy disturbance, facilitating droplet penetration and enhancing internal leaf surface coverage. Air-assisted sprayers mitigate adverse wind effects on droplet deposition, thereby improving both operational efficiency and spraying precision [94,95,96]. Feng et al. [97] proposed an airflow grading and regulation strategy, adjusting airflow intensity and direction to align with the leaf area distribution of fruit tree canopies. The deposition reached 83.55% and the coefficient of variation fell below 33.24%.
The combined effects of temperature and humidity on droplet behavior warrant deeper exploration. Elevated temperatures and reduced relative humidity (RH) accelerate droplet evaporation on leaf surfaces, thereby reducing effective deposition [98]. Recent studies have shown that higher RH increases droplet size, coverage, and deposition due to suppressed evaporation, while temperature effects on deposition are less pronounced within moderate ranges (10–29 °C) [99]. In addition, droplet-scale evaporation experiments indicate that evaporation can be mitigated under high RH, particularly when organo-silicon adjuvants are used [100]. These findings highlight the importance of considering both temperature and RH to optimize droplet deposition and reduce pesticide loss in variable orchard environments.
Crop canopy structure and leaf density also play a critical role in determining deposition efficiency [101,102]. Jiang et al. [103] investigated the relationship between foliage area volume density (FAVD) in mango trees and droplet coverage. Due to the “leaf wall effect” on the canopy surface, air-assisted droplets face difficulty penetrating the inner canopy. As FAVD increases, internal canopy wind speed and droplet coverage exhibit a declining trend.

3.2. Mechanisms and Mitigation Techniques of Pesticide Drift

Pesticide drift refers to the loss of spray droplets during application, where the droplets fail to deposit on target crops and are instead dispersed by external forces such as wind or air currents. In orchard spraying, pesticide drift is a key factor that affects both the effectiveness of pesticide application and environmental safety. It primarily occurs in three forms: ground drift, airborne drift, and re-evaporation [104]. Apart from external conditions, poor management of droplet size and terminal velocity can exacerbate drift risks. Smaller droplets with low terminal velocity tend to remain airborne longer, making them more susceptible to long-distance transport, air pollution, and potential public health hazards. The drift reduces pesticide efficiency and may contaminate nearby non-target vegetation, soil, and water bodies, posing potential ecological risks to residential areas and water sources [105,106].
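To make the droplet-size argument concrete, the sketch below applies Stokes' law to estimate the terminal settling velocity of small droplets. It is a textbook approximation valid only for fine, drift-prone droplets (roughly below 100 µm, where Reynolds numbers are small) and overestimates the speed of larger droplets.

```python
RHO_WATER = 1000.0   # droplet density, kg/m^3
RHO_AIR = 1.2        # air density, kg/m^3
MU_AIR = 1.81e-5     # dynamic viscosity of air at ~20 °C, Pa·s
G = 9.81             # gravitational acceleration, m/s^2

def stokes_terminal_velocity(diameter_um: float) -> float:
    """Terminal settling velocity (m/s) of a spherical droplet under Stokes drag."""
    d = diameter_um * 1e-6
    return (RHO_WATER - RHO_AIR) * G * d ** 2 / (18.0 * MU_AIR)

for d_um in (50, 100, 200):
    print(f"{d_um:>4} µm droplet: terminal velocity ≈ {stokes_terminal_velocity(d_um):.2f} m/s")
```

A 50 µm droplet settling at roughly 0.08 m/s can stay airborne for a long time in even a light crosswind, which is why fine droplets dominate long-distance drift.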
To reduce drift and enhance pesticide efficiency, several mitigation strategies have been widely adopted, including electrostatic spraying, wind curtain technology, and low-drift nozzles. These technologies significantly reduce environmental pollution while improving the precision and sustainability of spraying operations. Future research should focus on integrating these drift-reduction technologies with emerging approaches such as multi-sensor fusion and AI-based optimization to enhance sustainable and precision agriculture.
Electrostatic spraying imparts an electric charge to the droplets, causing mutual repulsion and uniform distribution, and enhances their adhesion to target crop surfaces, effectively reducing drift [107,108]. Vigo Morancho et al. [109] investigated the effectiveness of electrostatic spraying in reducing droplet drift. Results showed that electrostatic spraying significantly reduced drift and improved droplet adhesion on crop foliage. Under laboratory conditions, total deposition increased by up to 66% using electrostatic systems, indicating strong effectiveness. In field trials, although electrostatic effects were less pronounced in dense or shaded crop areas, total deposition still improved, confirming its practical potential for drift reduction.
Wind curtain systems form an air barrier around the sprayer, regulating wind flow to reduce drift and improve targeting accuracy. By adjusting curtain intensity and direction, spray droplets are better directed toward target crops, reducing off-target contamination [110].
Low-drift nozzles are engineered to reduce spray drift by minimizing the generation of fine droplets susceptible to wind dispersal. These nozzles maintain efficacy in pest and disease control while significantly reducing both airborne and ground drift [111]. Recent experimental studies on air-induction nozzles (AINs) confirm that spray parameters—including pressure, nozzle orifice size, and concurrent airflow—have significant effects on droplet size classification, as defined by ASABE S-572.1 standards [112]. While increased spray pressure tends to reduce droplet size, excessively high air velocities can exacerbate droplet fragmentation, increasing drift risk. In contrast, moderate air velocities were shown to produce more stable and appropriately sized droplets for agricultural spraying. These findings reinforce the importance of tuning nozzle type and operational settings to minimize drift without compromising spray coverage and deposition.

3.3. Variable Rate Spraying Strategies

Variable rate spraying (VRS) enables dynamic adjustment of pesticide dosage during agricultural spraying based on precise real-time data analysis, thereby minimizing resource waste, improving pesticide use efficiency, and reducing environmental pollution. Unlike traditional uniform spraying across entire fields, VRS systems typically rely on multiple sensing technologies—such as RGB imaging, LiDAR, depth cameras, and ultrasonic sensors—to collect real-time crop data. Based on factors such as crop distribution, canopy structure, and growth status, the system dynamically adjusts both spray volume and pattern, enabling more precise pesticide application.
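A common way to turn such canopy measurements into a dose is the tree-row-volume (TRV) approach, sketched generically below; the dose coefficient is a hypothetical value chosen by the operator, not a recommendation from the cited studies.

```python
def tree_row_volume(canopy_height_m: float, canopy_width_m: float, row_spacing_m: float) -> float:
    """Canopy volume per hectare (m^3/ha) under the tree-row-volume (TRV) model."""
    return canopy_height_m * canopy_width_m * 10_000.0 / row_spacing_m

def application_rate(trv_m3_ha: float, dose_coeff_l_per_m3: float = 0.05) -> float:
    """Spray volume (L/ha) proportional to canopy volume; the coefficient is illustrative."""
    return trv_m3_ha * dose_coeff_l_per_m3

trv = tree_row_volume(canopy_height_m=3.0, canopy_width_m=1.8, row_spacing_m=4.0)
print(f"TRV = {trv:.0f} m^3/ha -> spray volume ≈ {application_rate(trv):.0f} L/ha")
```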
Caner Koc et al. [113] developed an intelligent orchard sprayer based on image analysis. The system uses a binocular camera to measure the distance between the tree canopy and the nozzle, enabling three-dimensional canopy perception. An electronic control unit dynamically adjusts nozzle activation, achieving precise variable spraying while reducing ground and airborne drift.
The core value of variable rate spraying in sustainable agriculture lies in its ability to significantly improve pesticide utilization, while reducing resource waste and environmental impact [114]. Compared with conventional uniform spraying, variable rate spraying enables precise dosage control, effectively avoiding over-application and significantly reducing unnecessary pesticide input. In addition, savings in pesticides and water can significantly reduce spraying costs, making VRS particularly suitable for large-scale orchard operations. VRS can also effectively reduce pesticide residues, enhance the market competitiveness of agricultural products, and promote the development of efficient, environmentally friendly, and sustainable agriculture. Pesticide savings rates are shown in Table 5.

4. Autonomous Navigation and System Integration

4.1. Autonomous Navigation in Orchard Environments

Autonomous Navigation Systems (ANS) play a critical role in intelligent orchard spraying. Compared to open-field agriculture, orchard environments present greater challenges due to uneven terrain, complex planting layouts, and severe visual occlusions. These factors place higher demands on perception accuracy, path planning, and real-time decision-making capabilities [119,120,121]. Therefore, developing a robust, high-precision, and coordinated autonomous navigation and path planning system is essential for ensuring the stability and accuracy of spraying operations. Current orchard navigation systems rely heavily on multi-sensor fusion technologies. By integrating GNSS/RTK, LiDAR, inertial measurement units (IMUs), and vision sensors, these systems achieve accurate localization and mapping in complex orchard environments [122,123].
Jiang et al. [124] proposed an autonomous navigation solution for orchard spraying robots based on 3D LiDAR SLAM. The system constructs a point cloud map using 3D LiDAR, enhances map accuracy via NDT-ICP registration, and converts the 3D data into a 2D grid map containing key features for path planning. A multi-threaded ROS-based cooperative obstacle avoidance algorithm further improves navigation stability. Experimental results showed that, in a standardized peach orchard, the robot achieved an average localization error of 1.173 m, a maximum lateral deviation of 7.04 cm, and an average heading error below 8°, meeting the precision requirements for orchard spraying.
Wang et al. [125] introduced an integrated navigation method combining LiDAR, IMU, and GNSS. Using the LIO-SAM algorithm, the system constructs a grid map of the orchard to support path planning and obstacle avoidance. Experimental results demonstrated a positioning accuracy of 2.215 cm and a maintained safety distance of 1 m during dynamic obstacle avoidance, significantly enhancing the robot’s autonomous navigation and obstacle avoidance performance.
Given the dense obstacles between orchard rows, path planning modules often combine global planning with local obstacle avoidance algorithms to ensure operational efficiency while avoiding trunks, branches, and terrain depressions [126]. Traditional methods include graph search algorithms, such as Dijkstra, and local trajectory optimization techniques, such as the Timed Elastic Band (TEB) and the Dynamic Window Approach (DWA). Recently, deep learning-based end-to-end navigation approaches, leveraging imitation learning or reinforcement learning, have emerged to enhance generalization and decision-making capabilities.
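As a minimal example of the global-planning side, the sketch below runs Dijkstra's algorithm on a small 2D occupancy grid in which tree rows are marked as obstacles; the grid layout and 4-connected motion model are simplifying assumptions.

```python
import heapq

def dijkstra_grid(grid, start, goal):
    """Shortest path on a 2D occupancy grid (0 = free, 1 = obstacle).

    grid:  list of lists of 0/1
    start, goal: (row, col) tuples
    Returns the path as a list of cells, or None if the goal is unreachable.
    """
    rows, cols = len(grid), len(grid[0])
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    moves = [(-1, 0), (1, 0), (0, -1), (0, 1)]     # 4-connected neighbourhood
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            path = [cell]
            while cell in prev:
                cell = prev[cell]
                path.append(cell)
            return path[::-1]
        if d > dist.get(cell, float("inf")):
            continue
        r, c = cell
        for dr, dc in moves:
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nd = d + 1.0
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(pq, (nd, (nr, nc)))
    return None

# Toy orchard block: two tree rows (1s) with a headland gap on the right
grid = [
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
    [1, 1, 1, 1, 0],
    [0, 0, 0, 0, 0],
]
print(dijkstra_grid(grid, (0, 0), (4, 0)))
```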
To address navigation signal instability and discontinuous tree rows, Cong Thanh Vu et al. [127] developed an autonomous navigation framework that integrates a binocular vision system with multi-sensor fusion. The system combines RTK-GNSS, visual odometry, and IMU data for high-precision localization. A hybrid controller based on DWA ensures stable operation even in occluded or GPS-denied areas without frequent control mode switching. Implemented on ROS2, the system has been validated through simulation and field tests, demonstrating strong robustness and path-tracking performance in complex environments, thus providing a reliable navigation solution for orchard spraying.
Autonomous navigation systems are responsible not only for path control of mobile platforms but also for deep coordination with spraying perception and control modules. Guided by fruit tree recognition results, navigation path generation must consider the spatial distribution of spraying targets to ensure temporal and spatial synchronization between spraying actions and target areas. Some studies have explored navigation–spraying coordination mechanisms driven by map information and visual recognition results. By integrating task region annotation, path generation, and perception-triggered activation, these approaches significantly improve spray accuracy and droplet utilization.

4.2. Modular Integration of Intelligent Spraying Systems

Integrating sensor modules, detection algorithms, spray control strategies, and navigation systems into a unified “perception–decision–execution” closed-loop architecture is fundamental to achieving intelligent orchard spraying. The integrated architecture of the spraying system is designed to ensure efficient coordination among all modules. The perception system employs RGB-D and LiDAR sensors to accurately identify and locate fruit trees, providing target data. Target detection algorithms (e.g., the YOLO series) work with intelligent control algorithms to schedule spraying tasks with precision. The autonomous navigation system optimizes path planning to avoid obstacles, ensure full area coverage, and minimize redundant spraying. Data fusion techniques integrate information from each module in real time, ensuring efficient coordination and decision support in complex environments, ultimately enabling precise and intelligent spraying operations [128].
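At its simplest, this closed loop can be pictured as the stubbed cycle below, where perception, decision, and execution are placeholder functions; a real system would replace them with sensor drivers, detection models, and valve drivers, so this is only an illustrative skeleton.

```python
import random
import time

def sense():
    """Stub perception: report a leaf-density proxy for each boom section."""
    return [random.random() for _ in range(4)]

def decide(densities, threshold=0.3):
    """Stub decision: open a section's nozzle only where canopy density is high enough."""
    return [d > threshold for d in densities]

def actuate(valve_states):
    """Stub execution: print the commands; a real system would drive solenoid valves."""
    print("valves:", ["open" if s else "closed" for s in valve_states])

# A few iterations of the perception-decision-execution loop at roughly 20 Hz
for _ in range(3):
    actuate(decide(sense()))
    time.sleep(0.05)
```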
The introduction of the CAN bus, ROS framework, and IoT communication modules enhances the efficiency and stability of inter-system information exchange. Additionally, edge computing and embedded deployment enable terminal devices to perform key tasks—such as image recognition, spray control, and path adjustment—reducing communication latency and reliance on computing resources. The integration of these technologies not only improves the feasibility and reliability of system integration but also provides strong support for the efficient operation of intelligent sprayers [129].
Wang et al. [130] proposed a multi-source information fusion strategy based on 3D LiDAR and millimeter-wave radar for orchard autonomous navigation and spraying systems, enabling efficient obstacle avoidance and accurate localization in complex environments. Experimental results showed that the system performed well in obstacle avoidance and navigation accuracy, with an average navigation error of 15 cm and a spraying coverage rate of 50%, significantly improving operational efficiency and precision.

5. Challenges and Future Perspectives

5.1. Current Technological Bottlenecks

The preceding sections have reviewed recent advancements in sensor technologies, control strategies, and system integration for intelligent orchard sprayers. Despite these developments, several technical limitations and implementation challenges remain. This section summarizes key research gaps and proposes future directions to support the wider adoption of intelligent spraying systems.

5.1.1. Limitations in Perception and Intelligent Control

Perception reliability in complex environments remains a significant challenge, primarily due to the physical limitations of sensors. For instance, RGB cameras suffer from reduced signal-to-noise ratio in low-light conditions, LiDAR is sensitive to fine water droplets, and millimeter-wave radar lacks sufficient spatial resolution. A more complex issue lies in the inherent trade-offs between performance and ecological impact. Enhancing system performance during dawn and dusk often requires additional illumination, which may disturb nocturnal beneficial species such as pollinating bats, thereby introducing new ecological concerns.
Jordi Gené-Mola et al. [131] conducted performance evaluations of RGB-D sensors under varying light conditions in an apple orchard. The results indicated that under moderate to high illuminance levels (>2000 lx), the sensor’s resolution and accuracy significantly declined. However, this negative effect was minimized when measurements were taken closer to the target. In contrast, illuminance levels below 50 lx adversely affected the quality of color data, potentially necessitating artificial lighting. Point clouds acquired from sensor stations K2S1, K2S2, and K2S3 under different lighting conditions are shown in Figure 7.
Target detection is one of the core functions of intelligent spraying systems. Existing target detection algorithms, such as the YOLO series and RT-DETR, still suffer from limitations in accuracy and reliability under complex conditions, particularly in scenarios involving variable lighting, fruit occlusion, and densely planted orchards [70]. Zhu et al. [132] proposed an improved YOLO-LM model based on YOLOv7-tiny, which effectively addresses fruit occlusion issues and achieves a detection accuracy of 93.96%. In contrast, mainstream lightweight models such as YOLOv3-tiny, YOLOv4-tiny, and YOLOv5s still suffer from missed detections and duplicate detections under complex environmental conditions.
Although the YOLO-LM model performs well in fruit detection, its generalizability across different camellia varieties and environmental conditions remains limited due to phenotypic variations. This highlights a major bottleneck in current technologies: the challenge of environmental adaptability and varietal diversity.

5.1.2. Limitations in Orchard Navigation

In intelligent orchard spraying systems, high-precision autonomous navigation is essential for accurate path execution. However, current navigation technologies often face limitations in orchard environments, particularly in densely planted areas, narrow paths, and under complex lighting conditions. Traditional vision sensors are susceptible to environmental interference, resulting in reduced ability to detect obstacles or paths accurately, which subsequently affects the precision and efficiency of spraying operations [133,134].
To address these challenges, Jiang et al. [135] proposed a novel navigation system that integrates 2D LiDAR and thermal imaging data. By employing deep learning algorithms such as YOLACT, the system performs navigation, target detection, and image segmentation, enabling efficient autonomous operation under varying lighting conditions. The system operates reliably in low-light or nighttime conditions, effectively avoiding the performance degradation commonly observed in traditional systems under insufficient or excessive lighting. In experiments, the system achieved a mean average precision (mAP) of 83.74% for box segmentation in a simulated orchard, and 62.03% in real orchard conditions. The average positional error relative to the intended path was 0.21 m, demonstrating good navigation accuracy in complex orchard environments.
Although the system performs well during daytime and nighttime, its segmentation accuracy significantly decreases at dusk due to reduced temperature contrast between trees and background, which diminishes the thermal camera’s ability to distinguish targets. Therefore, while the system performs reliably under most lighting conditions, further optimization is needed to improve target recognition accuracy in extremely low-light environments. This study indicates that although current navigation and path planning technologies achieve good accuracy under specific conditions, they still face technical challenges in extreme environments, such as at dusk or under complex lighting variations. Further optimization to address these challenges will be critical to enhancing the performance and operational efficiency of intelligent spraying systems.

5.2. Barriers to Application and Adoption

The promotion of intelligent orchard spraying technology faces multifaceted challenges, primarily related to system integration, environmental adaptability, user acceptance, and technical maintenance [136,137]. Intelligent orchard spraying systems consist of multiple subsystems—such as sensors, target detection modules, and navigation systems—whose data integration and coordination remain major challenges. Compatibility issues between devices of different brands and models may hinder overall system efficiency and stability. Effective integration of orchard spraying systems requires strong technical expertise, particularly in aligning hardware and software. Merging systems with varying technical standards remains a critical challenge in large-scale implementation. Deep integration between target detection algorithms and sensor output requires high computational power and real-time responsiveness. However, current integration technologies and data processing frameworks often perform inconsistently in complex scenarios.
Environmental adaptability is a key challenge for intelligent spraying systems in orchard automation. Orchard environments vary significantly across regions—such as hills, mountains, and wetlands. In rugged terrains, autonomous navigation systems face major difficulties in path planning and obstacle recognition, reducing both spraying efficiency and accuracy. Figure 8 illustrates terrain in orchards across different regions [92,138], highlighting how poor adaptability constrains the practical application of autonomous navigation systems.
Orchard crops are diverse, and their growth stages, spatial structures, and canopy morphologies vary significantly. For instance, spindle-shaped and trellised Y-shaped tree structures differ greatly in leaf density and spatial distribution, as shown in Figure 9 [139], placing varying demands on spray penetration and deposition uniformity. However, current intelligent spraying systems lack sufficient adaptability to diverse tree structures. Their inability to dynamically adjust spraying parameters according to tree types results in reduced accuracy, increased pesticide waste, and decreased operational efficiency and pest control effectiveness.
Operating intelligent spraying systems requires professional training. However, most farmers—particularly in traditional agricultural regions—lack the necessary technical background, making it difficult to master complex operations and maintenance procedures, thereby limiting technology adoption. Therefore, designing user-friendly interfaces and offering targeted training have become critical challenges in promoting this technology.
Technical maintenance and support pose additional challenges. The long-term stability and ongoing maintenance of intelligent spraying systems are critical yet often overlooked in large-scale adoption. Hardware components such as sensors, drones, and spraying devices require regular inspection and maintenance. However, a shortage of qualified technicians and the inability to resolve technical faults promptly reduce farmers’ reliance on these systems. The lack of professional service teams and technical support complicates equipment recovery and increases the technical burden on farmers.

5.3. Future Trends and Research Priorities

To enable large-scale deployment and efficient operation of intelligent orchard sprayers in real-world agricultural production, future research and applications should focus on the following key areas:
(1)
Development of robust and cost-effective perception systems. Given the frequent challenges of varying illumination, target occlusion, and morphological differences in orchards, future perception systems must enhance their robustness in complex environments. Emphasis should be placed on developing cost-effective multimodal sensor fusion technologies, such as integrated perception schemes combining RGB-D, LiDAR, and ultrasonic sensors, to enhance the accurate recognition of tree structures and fruit targets.
(2)
Synergistic optimization of intelligent navigation and spraying control systems. Current systems suffer from task segmentation and response delays. Future efforts should focus on establishing a linkage mechanism between path planning and spraying execution, enabling real-time dynamic path optimization and task scheduling. Additionally, navigation systems should incorporate multi-source perception fusion and self-learning capabilities to handle complex terrains and path interferences, thereby improving operational efficiency and spray coverage accuracy.
(3)
Environment-aware adaptive spraying strategies. Considering the variation in crop types, growth stages, and environmental conditions, deep learning and big data analytics should be leveraged to enable adaptive adjustment of spraying parameters. Causal inference models linking environment, crop characteristics, and spraying outcomes should be established to enable need-based and differentiated spraying, thereby reducing pesticide usage and improving environmental sustainability.
(4)
Interdisciplinary collaboration and system integration innovation. The inherent complexity of intelligent orchard spraying systems necessitates the deep integration of multidisciplinary technologies. Future efforts should enhance cross-domain collaboration among agricultural engineering, artificial intelligence, robotics, and the Internet of Things, achieving technological synergy in hardware platforms, algorithm optimization, and system integration to drive full-process intelligent spraying operations.
(5)
Agricultural data sharing and intelligent service platform development. Establishing a unified agricultural big data platform and open service system is crucial to improving the operational efficiency and intelligence level of orchard systems. Standardized acquisition and sharing of sensor data, remote sensing imagery, and operational information should be promoted to enhance data connectivity and technical interoperability among farmers, enterprises, and research institutions, thereby supporting intelligent decision-making.
(6)
Dual promotion through policy guidance and market mechanisms. The successful implementation of technology ultimately depends on both governmental support and market forces. On one hand, governments should introduce targeted support policies for smart agriculture, fostering pilot programs and industry standardization. On the other hand, enterprises should leverage technological leadership and product commercialization to accelerate the adoption of intelligent spraying systems and increase user acceptance and coverage.
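To make priorities (1) and (3) above more concrete, the following minimal sketch shows one way normalized readings from RGB-D, LiDAR, and ultrasonic sensors could be fused into a canopy-density score and then mapped to a nozzle flow rate. All names, weights, and thresholds are hypothetical and chosen purely for illustration; they are not drawn from any of the reviewed systems.

```python
# Minimal sketch (hypothetical values): multimodal canopy sensing fused into a
# density score, then a simple rule-based flow-rate decision. Illustrative only.
from dataclasses import dataclass


@dataclass
class CanopySection:
    rgbd_leaf_ratio: float         # fraction of RGB-D pixels classified as foliage (0-1)
    lidar_point_density: float     # normalized LiDAR return density for the section (0-1)
    ultrasonic_thickness_m: float  # canopy thickness estimated from ultrasonic echo timing


def fuse_canopy_density(s: CanopySection,
                        weights=(0.4, 0.4, 0.2),
                        max_thickness_m=2.0) -> float:
    """Weighted fusion of three normalized cues into a single 0-1 canopy-density score."""
    thickness_norm = min(s.ultrasonic_thickness_m / max_thickness_m, 1.0)
    cues = (s.rgbd_leaf_ratio, s.lidar_point_density, thickness_norm)
    return sum(w * c for w, c in zip(weights, cues))


def target_flow_rate(density: float,
                     wind_speed_ms: float,
                     base_flow_lpm=1.2,
                     wind_limit_ms=5.0) -> float:
    """Scale flow with canopy density; skip canopy gaps and drift-prone wind conditions."""
    if wind_speed_ms > wind_limit_ms or density < 0.05:
        return 0.0
    return base_flow_lpm * density


if __name__ == "__main__":
    section = CanopySection(rgbd_leaf_ratio=0.62,
                            lidar_point_density=0.55,
                            ultrasonic_thickness_m=1.1)
    d = fuse_canopy_density(section)
    print(f"density={d:.2f}, flow={target_flow_rate(d, wind_speed_ms=2.3):.2f} L/min")
```

In a deployed sprayer, the fixed weights and thresholds would typically be replaced by calibrated or learned models, and the flow command would drive a PWM-controlled solenoid valve of the kind used in the variable-rate systems discussed earlier.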
In summary, intelligent orchard spraying technology integrates advanced perception systems, target detection algorithms, spray control strategies, and autonomous navigation to achieve precise and intelligent spraying operations. Although challenges such as perception instability and complex system integration persist, the technology demonstrates great potential for enhancing agricultural productivity and environmental sustainability. With continued interdisciplinary collaboration and policy support, its large-scale deployment is expected to accelerate.

Author Contributions

M.W.: Formal Analysis, Investigation, Validation, Writing—original draft. S.L.: Conceptualization, Methodology, Writing—Review. Z.L.: Methodology and Investigation. M.O.: Investigation, Formal analysis. S.D.: Formal analysis, Supervision. X.D.: Investigation, Validation. X.W.: Conceptualization. L.J.: Methodology. W.J.: Conceptualization and Editing, Supervision, Project administration. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Priority Academic Program Development of Jiangsu Higher Education Institutions (grant number: PAPD-2023-87), and Integrated Research and Development of Key Technologies and Intelligent Equipment for Water, Fertilizer, and Pesticide Collaborative Operations in Rice and Wheat Cultivation (grant number: 8261200003).

Data Availability Statement

The data presented in this study are available within the article.

Acknowledgments

The authors thank the School of Agricultural Engineering of Jiangsu University for its facilities and support.

Conflicts of Interest

Author Siyuan Liu was employed by the company Ping An Property & Casualty Insurance Company of China, Ltd., Hebei Branch. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Shen, L.; Zenan, S.; Man, H. The impact of digital literacy on farmers’ pro-environmental behavior: An analysis with the Theory of Planned Behavior. Front. Sustain. Food Syst. 2024, 8, 1432184. [Google Scholar] [CrossRef]
  2. Xiao, L.; Liu, J.; Ge, J. Dynamic game in agriculture and industry cross-sectoral water pollution governance in developing countries. Agric. Water Manag. 2021, 243, 106417. [Google Scholar] [CrossRef]
  3. Jin, Y.; Liu, J.; Xu, Z.; Yuan, S.; Li, P.; Wang, J. Development status and trend of agricultural robot technology. Int. J. Agric. Biol. Eng. 2021, 14, 1–19. [Google Scholar] [CrossRef]
  4. Zhou, J.; Zou, X.; Song, S.; Chen, G. Quantum Dots Applied to Methodology on Detection of Pesticide and Veterinary Drug Residues. J. Agric. Food Chem. 2018, 66, 1307–1319. [Google Scholar] [CrossRef] [PubMed]
  5. Gao, Q.; Wang, Y.; Li, Y.; Yang, W.; Jiang, W.; Liang, Y.; Zhang, Z. Residue behaviors of six pesticides during apple juice production and storage. Food Res. Int. 2024, 177, 113894. [Google Scholar] [CrossRef]
  6. Lykogianni, M.; Bempelou, E.; Karamaouna, F.; Aliferis, K.A. Do pesticides promote or hinder sustainability in agriculture? The challenge of sustainable use of pesticides in modern agriculture. Sci. Total Environ. 2021, 795, 148625. [Google Scholar] [CrossRef]
  7. Xu, Y.; Hassan, M.; Sharma, A.S.; Li, H.; Chen, Q. Recent advancement in nano-optical strategies for detection of pathogenic bacteria and their metabolites in food safety. Crit. Rev. Food Sci. Nutr. 2023, 63, 486–504. [Google Scholar] [CrossRef] [PubMed]
  8. Jiang, L.; Hassan, M.; Ali, S.; Li, H.; Sheng, R.; Chen, Q. Evolving trends in SERS-based techniques for food quality and safety: A review. Trends Food Sci. Technol. 2021, 112, 225–240. [Google Scholar] [CrossRef]
  9. Wei, Z.; Xue, X.; Salcedo, R.; Zhang, Z.; Gil, E.; Sun, Y.; Li, Q.; Shen, J.; He, Q.; Dou, Q.; et al. Key Technologies for an Orchard Variable-Rate Sprayer: Current Status and Future Prospects. Agronomy 2022, 13, 59. [Google Scholar] [CrossRef]
  10. Louisa, P. Digital Agriculture and Labor: A Few Challenges for Social Sustainability. Sustainability 2021, 13, 5980. [Google Scholar] [CrossRef]
  11. Li, F.; Zhang, W. Research on the Effect of Digital Economy on Agricultural Labor Force Employment and Its Relationship Using SEM and fsQCA Methods. Agriculture 2023, 13, 566. [Google Scholar] [CrossRef]
  12. Xi, T.; Li, C.; Qiu, W.; Wang, H.; Lv, X.; Han, C.; Ahmad, F. Droplet Deposition Behavior on a Pear Leaf Surface under Wind-Induced Vibration. Appl. Eng. Agric. 2020, 36, 913–926. [Google Scholar] [CrossRef]
  13. Ye, L.; Wu, F.; Zou, X.; Li, J. Path planning for mobile robots in unstructured orchard environments: An improved kinematically constrained bi-directional RRT approach. Comput. Electron. Agric. 2023, 215, 108453. [Google Scholar] [CrossRef]
  14. Shepherd, M.; Turner, J.A.; Small, B.; Wheeler, D. Priorities for science to overcome hurdles thwarting the full promise of the ‘digital agriculture’ revolution. J. Sci. Food Agric. 2020, 100, 5083–5092. [Google Scholar] [CrossRef] [PubMed]
  15. Taseer, A.; Han, X. Advancements in variable rate spraying for precise spray requirements in precision agriculture using Unmanned aerial spraying Systems: A review. Comput. Electron. Agric. 2024, 219, 108841. [Google Scholar] [CrossRef]
  16. Dou, H.; Zhang, C.; Li, L.; Hao, G.; Ding, B.; Gong, W.; Huang, P. Application of variable spray technology in agriculture. In IOP Conference Series: Earth and Environmental Science; IOP Publishing: Bristol, UK, 2018; p. 012007. [Google Scholar] [CrossRef]
  17. Chen, P.; Ding, X.; Chen, M.; Song, H.; Imran, M. The Impact of Resource Spatial Mismatch on the Configuration Analysis of Agricultural Green Total Factor Productivity. Agriculture 2025, 15, 23. [Google Scholar] [CrossRef]
  18. Calatrava, J.; Martínez-Granados, D.; Zornoza, R.; Gonzalez-Rosado, M.; Lozano-Garcia, B.; Vega-Zamora, M.; Gómez-López, M.D. Barriers and Opportunities for the Implementation of Sustainable Farming Practices in Mediterranean Tree Orchards. Agronomy 2021, 11, 821. [Google Scholar] [CrossRef]
  19. Bahlol, H.Y.; Chandel, A.K.; Hoheisel, G.A.; Khot, L.R. The smart spray analytical system: Developing understanding of output air-assist and spray patterns from orchard sprayers. Crop Prot. 2020, 127, 104977. [Google Scholar] [CrossRef]
  20. Tang, Y.; Qiu, J.; Zhang, Y.; Wu, D.; Cao, Y.; Zhao, K.; Zhu, L. Optimization strategies of fruit detection to overcome the challenge of unstructured background in field orchard environment: A review. Precis. Agric. 2023, 24, 1183–1219. [Google Scholar] [CrossRef]
  21. Liu, Y.; Li, L.; Liu, Y.; He, X.; Song, J.; Zeng, A.; Wang, Z. Assessment of spray deposition and losses in an apple orchard with an unmanned agricultural aircraft system in China. Trans. Asabe 2020, 63, 619–627. [Google Scholar] [CrossRef]
  22. Peng, Y.; Wang, A.; Liu, J.; Faheem, M. A Comparative Study of Semantic Segmentation Models for Identification of Grape with Different Varieties. Agriculture 2021, 11, 997. [Google Scholar] [CrossRef]
  23. Salcedo, R.; Zhu, H.; Ozkan, E.; Falchieri, D.; Zhang, Z.; Wei, Z. Reducing ground and airborne drift losses in young apple orchards with PWM-controlled spray systems. Comput. Electron. Agric. 2021, 189, 106389. [Google Scholar] [CrossRef]
  24. Zhou, H.; Jia, W.; Li, Y.; Ou, M. Method for Estimating Canopy Thickness Using Ultrasonic Sensor Technology. Agriculture 2021, 11, 1011. [Google Scholar] [CrossRef]
  25. Niu, Z.; Huang, T.; Xu, C.; Sun, X.; Taha, M.F.; He, Y.; Qiu, Z. A Novel Approach to Optimize Key Limitations of Azure Kinect DK for Efficient and Precise Leaf Area Measurement. Agriculture 2025, 15, 173. [Google Scholar] [CrossRef]
  26. De Bortoli, L.; Marsi, S.; Marinello, F.; Carrato, S.; Ramponi, G.; Gallina, P. Structure from Linear Motion (SfLM): An On-the-Go Canopy Profiling System Based on Off-the-Shelf RGB Cameras for Effective Sprayers Control. Agronomy 2022, 12, 1276. [Google Scholar] [CrossRef]
  27. Xue, X.; Luo, Q.; Bu, M.; Li, Z.; Lyu, S.; Song, S. Citrus Tree Canopy Segmentation of Orchard Spraying Robot Based on RGB-D Image and the Improved DeepLabv3+. Agronomy 2023, 13, 2059. [Google Scholar] [CrossRef]
  28. Tang, S.; Xia, Z.; Gu, J.; Wang, W.; Huang, Z.; Zhang, W. High-precision apple recognition and localization method based on RGB-D and improved SOLOv2 instance segmentation. Front. Sustain. Food Syst. 2024, 8, 1403872. [Google Scholar] [CrossRef]
  29. Murcia, H.F.; Tilaguy, S.; Ouazaa, S. Development of a Low-Cost System for 3D Orchard Mapping Integrating UGV and LiDAR. Plants 2021, 10, 2804. [Google Scholar] [CrossRef]
  30. Liu, H.; Zhu, H. Evaluation of a Laser Scanning Sensor in Detection of Complex-Shaped Targets for Variable-Rate Sprayer Development. Trans. ASABE 2016, 59, 1181–1192. [Google Scholar] [CrossRef]
  31. Liu, L.; Liu, Y.; He, X.; Liu, W. Precision Variable-Rate Spraying Robot by Using Single 3D LIDAR in Orchards. Agronomy 2022, 12, 2509. [Google Scholar] [CrossRef]
  32. Jiang, A.; Ahamed, T. Navigation of an Autonomous Spraying Robot for Orchard Operations Using LiDAR for Tree Trunk Detection. Sensors 2023, 23, 4808. [Google Scholar] [CrossRef]
  33. Yang, H.; Wang, X.; Sun, G. Three-Dimensional Morphological Measurement Method for a Fruit Tree Canopy Based on Kinect Sensor Self-Calibration. Agronomy 2019, 9, 741. [Google Scholar] [CrossRef]
  34. Bargoti, S.; Underwood, J.P.; Nieto, J.I.; Sukkarieh, S. A Pipeline for Trunk Localisation Using LiDAR in Trellis Structured Orchards. Springer Tracts Adv. Robot. 2015, 105, 455–468. [Google Scholar] [CrossRef]
  35. Wang, M.; Dou, H.; Sun, H.; Zhai, C.; Zhang, Y.; Yuan, F. Calculation Method of Canopy Dynamic Meshing Division Volumes for Precision Pesticide Application in Orchards Based on LiDAR. Agronomy 2023, 13, 1077. [Google Scholar] [CrossRef]
  36. Zhmud, V.A.; Kondratiev, N.O.; Kuznetsov, K.A.; Trubin, V.G.; Dimitrov, L.V. Application of ultrasonic sensor for measuring distances in robotics. J. Phys. Conf. Ser. 2018, 1015, 032189. [Google Scholar] [CrossRef]
  37. Qiu, Z.; Lu, Y.; Qiu, Z. Review of Ultrasonic Ranging Methods and Their Current Challenges. Micromachines 2022, 13, 520. [Google Scholar] [CrossRef]
  38. Palleja, T.; Landers, A.J. Real time canopy density estimation using ultrasonic envelope signals in the orchard and vineyard. Comput. Electron. Agric. 2015, 115, 108–117. [Google Scholar] [CrossRef]
  39. Mahmud, S.; He, L.; Heinemann, P.; Choi, D.; Zhu, H. Unmanned aerial vehicle based tree canopy characteristics measurement for precision spray applications. Smart Agric. Technol. 2023, 4, 100153. [Google Scholar] [CrossRef]
  40. Gu, C.; Zhao, C.; Zou, W.; Yang, S.; Dou, H.; Zhai, C. Innovative Leaf Area Detection Models for Orchard Tree Thick Canopy Based on LiDAR Point Cloud Data. Agriculture 2022, 12, 1241. [Google Scholar] [CrossRef]
  41. Ou, M.; Hu, T.; Hu, M.; Yang, S.; Jia, W.; Wang, M.; Jiang, L.; Wang, X.; Dong, X. Experiment of Canopy Leaf Area Density Estimation Method Based on Ultrasonic Echo Signal. Agriculture 2022, 12, 1569. [Google Scholar] [CrossRef]
  42. Niu, Y.; Han, W.; Zhang, H.; Zhang, L.; Chen, H. Estimating maize plant height using a crop surface model constructed from UAV RGB images. Biosyst. Eng. 2024, 241, 56–67. [Google Scholar] [CrossRef]
  43. Wei, L.; Yang, H.; Niu, Y.; Zhang, Y.; Xu, L.; Chai, X. Wheat biomass, yield, and straw-grain ratio estimation from multi-temporal UAV-based RGB and multispectral images. Biosyst. Eng. 2023, 234, 187–205. [Google Scholar] [CrossRef]
  44. Li, L.; Xie, S.; Ning, J.; Chen, Q.; Zhang, Z. Evaluating green tea quality based on multisensor data fusion combining hyperspectral imaging and olfactory visualization systems. J. Sci. Food Agric. 2019, 99, 1787–1794. [Google Scholar] [CrossRef]
  45. Ali, M.M.; Hashim, N.; Aziz, S.A.; Lasekan, O. Utilisation of Deep Learning with Multimodal Data Fusion for Determination of Pineapple Quality Using Thermal Imaging. Agronomy 2023, 13, 401. [Google Scholar] [CrossRef]
  46. Lu, X.; Li, W.; Xiao, J.; Zhu, H.; Yang, D.; Yang, J.; Xu, X.; Lan, Y.; Zhang, Y. Inversion of Leaf Area Index in Citrus Trees Based on Multi-Modal Data Fusion from UAV Platform. Remote Sens. 2023, 15, 3523. [Google Scholar] [CrossRef]
  47. Zhou, X.; Chen, W.; Wei, X. Improved Field Obstacle Detection Algorithm Based on YOLOv8. Agriculture 2024, 14, 2263. [Google Scholar] [CrossRef]
  48. Zhang, Z.; Lu, Y.; Zhao, Y.; Pan, Q.; Jin, K.; Xu, G.; Hu, Y. TS-YOLO: An All-Day and Lightweight Tea Canopy Shoots Detection Model. Agronomy 2023, 13, 1411. [Google Scholar] [CrossRef]
  49. Ji, W.; Pan, Y.; Xu, B.; Wang, J. A Real-Time Apple Targets Detection Method for Picking Robot Based on ShufflenetV2-YOLOX. Agriculture 2022, 12, 856. [Google Scholar] [CrossRef]
  50. Ji, W.; Gao, X.; Xu, B.; Pan, Y.; Zhang, Z.; Zhao, D. Apple target recognition method in complex environment based on improved YOLOv4. J. Food Process Eng. 2021, 44, e13866. [Google Scholar] [CrossRef]
  51. Tian, Y.; Yang, G.; Wang, Z.; Wang, H.; Li, E.; Liang, Z. Apple detection during different growth stages in orchards using the improved YOLO-V3 model. Comput. Electron. Agric. 2019, 157, 417–426. [Google Scholar] [CrossRef]
  52. Mirhaji, H.; Soleymani, M.; Asakereh, A.; Mehdizadeh, S.A. Fruit detection and load estimation of an orange orchard using the YOLO models through simple approaches in different imaging and illumination conditions. Comput. Electron. Agric. 2021, 191, 106533. [Google Scholar] [CrossRef]
  53. Ranjan, S.; Dawood, A.; Martin, C.; Manoj, K. Immature Green Apple Detection and Sizing in Commercial Orchards Using YOLOv8 and Shape Fitting Techniques. IEEE Access 2024, 12, 43436–43452. [Google Scholar]
  54. Li, A.; Wang, C.; Ji, T.; Wang, Q.; Zhang, T. D3-YOLOv10: Improved YOLOv10-Based Lightweight Tomato Detection Algorithm Under Facility Scenario. Agriculture 2024, 14, 2268. [Google Scholar] [CrossRef]
  55. Guo, M.; Xu, T.; Liu, J.; Liu, Z.; Jiang, P.; Mu, T.; Zhang, S.; Martin, R.; Cheng, M.; Hu, S. Attention mechanisms in computer vision: A survey. Comput. Vis. Media 2022, 8, 331–368. [Google Scholar] [CrossRef]
  56. Ouyang, D.; He, S.; Zhang, G.; Luo, M.; Guo, H.; Zhan, J.; Huang, Z. Efficient Multi-Scale Attention Module with Cross-Spatial Learning. In Proceedings of the 2023 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), Rhodes Island, Greece, 4–10 June 2023. [Google Scholar]
  57. Hu, J.; Shen, L.; Sun, G. Squeeze-and-Excitation Networks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Salt Lake City, UT, USA, 18–22 June 2018; pp. 7132–7141. [Google Scholar]
  58. Wang, J.; Ding, J.; Ran, S.; Qin, S.; Liu, B.; Li, X. Automatic Pear Extraction from High-Resolution Images by a Visual Attention Mechanism Network. Remote Sens. 2023, 15, 3283. [Google Scholar] [CrossRef]
  59. Yang, Y.; Su, L.; Zong, A.; Tao, W.; Xu, X.; Chai, Y.; Mu, W. A New Kiwi Fruit Detection Algorithm Based on an Improved Lightweight Network. Agriculture 2024, 14, 1823. [Google Scholar] [CrossRef]
  60. Woo, S.; Park, J.; Lee, J.-Y.; Kweon, I.S. CBAM: Convolutional Block Attention Module; Springer International Publishing: Berlin/Heidelberg, Germany, 2018; pp. 3–19. [Google Scholar]
  61. Jiang, M.; Song, L.; Wang, Y.; Li, Z.; Song, H. Fusion of the YOLOv4 network model and visual attention mechanism to detect low-quality young apples in a complex environment. Precis. Agric. 2022, 23, 559–577. [Google Scholar] [CrossRef]
  62. Akdoğan, C.; Özer, T.; Oğuz, Y. PP-YOLO: Deep learning based detection model to detect apple and cherry trees in orchard based on Histogram and Wavelet preprocessing techniques. Comput. Electron. Agric. 2025, 232, 110052. [Google Scholar] [CrossRef]
  63. Huang, Z.; Zhang, X.; Wang, H.; Wei, H.; Zhang, Y.; Zhou, G. Pear Fruit Detection Model in Natural Environment Based on Lightweight Transformer Architecture. Agriculture 2024, 15, 24. [Google Scholar] [CrossRef]
  64. Tao, K.; Wang, A.; Shen, Y.; Lu, Z.; Peng, F.; Wei, X. Peach Flower Density Detection Based on an Improved CNN Incorporating Attention Mechanism and Multi-Scale Feature Fusion. Horticulturae 2022, 8, 904. [Google Scholar] [CrossRef]
  65. Zhang, B.; Wang, R.; Zhang, H.; Yin, C.; Xia, Y.; Fu, M.; Fu, W. Dragon fruit detection in natural orchard environment by integrating lightweight network and attention mechanism. Front. Plant Sci. 2022, 13, 1040923. [Google Scholar] [CrossRef] [PubMed]
  66. Liu, X.; Li, G.; Chen, W.; Liu, B.; Chen, M.; Lu, S. Detection of Dense Citrus Fruits by Combining Coordinated Attention and Cross-Scale Connection with Weighted Feature Fusion. Appl. Sci. 2022, 12, 6600. [Google Scholar] [CrossRef]
  67. Ji, W.; Wang, J.; Xu, B.; Zhang, T. Apple Grading Based on Multi-Dimensional View Processing and Deep Learning. Foods 2023, 12, 2117. [Google Scholar] [CrossRef] [PubMed]
  68. Hu, T.; Wang, W.; Gu, J.; Xia, Z.; Zhang, J.; Wang, B. Research on Apple Object Detection and Localization Method Based on Improved YOLOX and RGB-D Images. Agronomy 2023, 13, 1816. [Google Scholar] [CrossRef]
  69. Jiang, L.; Wang, Y.; Wu, C.; Wu, H. Fruit Distribution Density Estimation in YOLO-Detected Strawberry Images: A Kernel Density and Nearest Neighbor Analysis Approach. Agriculture 2024, 14, 1848. [Google Scholar] [CrossRef]
  70. Zhang, F.; Chen, Z.; Ali, S.; Yang, N.; Fu, S.; Zhang, Y. Multi-class detection of cherry tomatoes using improved YOLOv4-Tiny. Int. J. Agric. Biol. Eng. 2023, 16, 225–231. [Google Scholar] [CrossRef]
  71. Xu, Z.; Liu, J.; Wang, J.; Cai, L.; Jin, Y.; Zhao, S.; Xie, B. Realtime Picking Point Decision Algorithm of Trellis Grape for High-Speed Robotic Cut-and-Catch Harvesting. Agronomy 2023, 13, 1618. [Google Scholar] [CrossRef]
  72. Ji, W.; Zhang, T.; Xu, B.; He, G. Apple recognition and picking sequence planning for harvesting robot in a complex environment. J. Agric. Eng. 2024, 55, 1. [Google Scholar]
  73. Xie, H.; Zhang, Z.; Zhang, K.; Yang, L.; Zhang, D.; Yu, Y. Research on the visual location method for strawberry picking points under complex conditions based on composite models. J. Sci. Food Agric. 2024, 104, 8566–8579. [Google Scholar] [CrossRef]
  74. Rovira-Más, F.; Saiz-Rubio, V.; Cuenca, A.; Ortiz, C.; Teruel, M.P.; Ortí, E. Open-Format Prescription Maps for Variable Rate Spraying in Orchard Farming. J. Asabe 2024, 67, 243–257. [Google Scholar] [CrossRef]
  75. Hu, K.; Feng, X. Research on the Variable Rate Spraying System Based on Canopy Volume Measurement. J. Inf. Process. Syst. 2019, 15, 1131–1140. [Google Scholar]
  76. Li, J.; Nie, Z.; Chen, Y.; Ge, D.; Li, M. Development of Boom Posture Adjustment and Control System for Wide Spray Boom. Agriculture 2023, 13, 2162. [Google Scholar] [CrossRef]
  77. Zou, W.; Wang, X.; Deng, W.; Su, S.; Wang, S.; Fan, P. Design and test of automatic toward-target sprayer used in orchard. In Proceedings of the 2015 IEEE International Conference on Cyber Technology in Automation, Control, and Intelligent Systems (CYBER), Shenyang, China, 8–12 June 2015; IEEE: Piscataway, NJ, USA, 2015; pp. 697–702. [Google Scholar]
  78. Bin Motalab, M.; Al-Mallahi, A. Development of a flexible electronic control unit for seamless integration of machine vision to CAN-enabled boom sprayers for spot application technology. Smart Agric. Technol. 2024, 9, 100618. [Google Scholar] [CrossRef]
  79. Zhang, C.; Zhai, C.; Zhang, M.; Zhang, C.; Zou, W.; Zhao, C. Staggered-Phase Spray Control: A Method for Eliminating the Inhomogeneity of Deposition in Low-Frequency Pulse-Width Modulation (PWM) Variable Spray. Agriculture 2024, 14, 465. [Google Scholar] [CrossRef]
  80. Ma, C.; Li, G.; Peng, Q. Design and Test of a Jet Remote Control Spraying Machine for Orchards. Agriengineering 2021, 3, 797–814. [Google Scholar] [CrossRef]
  81. Wen, S.; Zhang, Q.; Deng, J.; Lan, Y.; Yin, X.; Shan, J. Design and Experiment of a Variable Spray System for Unmanned Aerial Vehicles Based on PID and PWM Control. Appl. Sci. 2018, 8, 2482. [Google Scholar] [CrossRef]
  82. Shi, Y.; Zhang, C.; Liang, A.; Yuan, H. Fuzzy Control of the Spraying Medicine Control System; Springer: New York, NY, USA, 2008; pp. 1087–1094. [Google Scholar]
  83. Vatavuk, I.; Vasiljević, G.; Kovačić, Z. Task Space Model Predictive Control for Vineyard Spraying with a Mobile Manipulator. Agriculture 2022, 12, 381. [Google Scholar] [CrossRef]
  84. Li, J.; Cui, H.; Ma, Y.; Xun, L.; Li, Z.; Yang, Z.; Lu, H. Orchard Spray Study: A Prediction Model of Droplet Deposition States on Leaf Surfaces. Agronomy 2020, 10, 747. [Google Scholar]
  85. Berk, P.; Stajnko, D.; Hočevar, M.; Malneršič, A.; Jejčič, V.; Belšak, A. Plant protection product dose rate estimation in apple orchards using a fuzzy logic system. PLoS ONE 2019, 14, e0214315. [Google Scholar] [CrossRef]
  86. Song, L.; Huang, J.; Liang, X.; Yang, S.X.; Hu, W.; Tang, D. An Intelligent Multi-Sensor Variable Spray System with Chaotic Optimization and Adaptive Fuzzy Control. Sensors 2020, 20, 2954. [Google Scholar]
  87. Liao, J.; Hewitt, A.J.; Wang, P.; Luo, X.; Zang, Y.; Zhou, Z.; Lan, Y.; O’Donnell, C. Development of droplet characteristics prediction models for air induction nozzles based on wind tunnel tests. Int. J. Agric. Biol. Eng. 2019, 12, 1–6. [Google Scholar] [CrossRef]
  88. Shi, Q.; Mao, H.; Guan, X. Numerical Simulation and Experimental Verification of the Deposition Concentration of an Unmanned Aerial Vehicle. Appl. Eng. Agric. 2019, 35, 367–376. [Google Scholar] [CrossRef]
  89. Gong, C.; Li, D.; Kang, C. Visualization of the evolution of bubbles in the spray sheet discharged from the air-induction nozzle. Pest Manag. Sci. 2022, 78, 1850–1860. [Google Scholar] [CrossRef] [PubMed]
  90. Pan, X.; Yang, S.; Gao, Y.; Wang, Z.; Zhai, C.; Qiu, W. Evaluation of Spray Drift from an Electric Boom Sprayer: Impact of Boom Height and Nozzle Type. Agronomy 2025, 15, 160. [Google Scholar] [CrossRef]
  91. Liu, J.; Liu, X.; Zhu, X.; Yuan, S. Droplet characterisation of a complete fluidic sprinkler with different nozzle dimensions. Biosyst. Eng. 2016, 148, 90–100. [Google Scholar] [CrossRef]
  92. Guo, S.; Yao, W.; Xu, T.; Ma, H.; Sun, M.; Chen, C.; Lan, Y. Assessing the application of spot spray in Nanguo pear orchards: Effect of nozzle type, spray volume rate and adjuvant. Pest Manag. Sci. 2022, 78, 3564–3575. [Google Scholar] [CrossRef]
  93. Amaya, K.; Bayat, A. Innovating an electrostatic charging unit with an insulated induction electrode for air-assisted orchard sprayers. Crop Prot. 2024, 181, 106701. [Google Scholar] [CrossRef]
  94. Hu, Y.; Chen, Y.; Wei, W.; Hu, Z.; Li, P. Optimization Design of Spray Cooling Fan Based on CFD Simulation and Field Experiment for Horticultural Crops. Agriculture 2021, 11, 566. [Google Scholar] [CrossRef]
  95. Lin, J.; Cai, J.; Xiao, L.; Liu, K.; Chen, J.; Ma, J.; Qiu, B. An angle correction method based on the influence of angle and travel speed on deposition in the air-assisted spray. Crop Prot. 2024, 175, 106444. [Google Scholar] [CrossRef]
  96. Guo, J.; Dong, X.; Qiu, B. Analysis of the Factors Affecting the Deposition Coverage of Air-Assisted Electrostatic Spray on Tomato Leaves. Agronomy 2024, 14, 1108. [Google Scholar] [CrossRef]
  97. Feng, F.; Dou, H.; Zhai, C.; Zhang, Y.; Zou, W.; Hao, J. Design and Experiment of Orchard Air-Assisted Sprayer with Airflow Graded Control. Agronomy 2024, 15, 95. [Google Scholar] [CrossRef]
  98. Bourodimos, G.; Koutsiaras, M.; Psiroukis, V.; Balafoutis, A.; Fountas, S. Development and Field Evaluation of a Spray Drift Risk Assessment Tool for Vineyard Spraying Application. Agriculture 2019, 9, 181. [Google Scholar] [CrossRef]
  99. Qi, H.; Lin, Z.; Zhou, J.; Li, J.; Chen, P.; Ouyang, F. Effect of temperature and humidity on droplet deposition of unmanned agricultural aircraft system. Int. J. Precis. Agric. Aviat. 2018, 1, 41–49. [Google Scholar] [CrossRef]
  100. Wang, Z.; Lan, L.; He, X.; Herbst, A. Dynamic evaporation of droplet with adjuvants under different environment conditions. Int. J. Agric. Biol. Eng. 2020, 13, 1–6. [Google Scholar] [CrossRef]
  101. Zhou, Q.; Xue, X.; Chen, C.; Cai, C.; Jiao, Y. Canopy deposition characteristics of different orchard pesticide dose models. Int. J. Agric. Biol. Eng. 2023, 16, 1–6. [Google Scholar] [CrossRef]
  102. Ma, J.; Liu, K.; Dong, X.; Huang, X.; Ahmad, F.; Qiu, B. Force and motion behaviour of crop leaves during spraying. Biosyst. Eng. 2023, 235, 83–99. [Google Scholar] [CrossRef]
  103. Jiang, S.; Yang, S.; Xu, J.; Li, W.; Zheng, Y.; Liu, X.; Tan, Y. Wind field and droplet coverage characteristics of air-assisted sprayer in mango-tree canopies. Pest Manag. Sci. 2022, 78, 4892–4904. [Google Scholar] [CrossRef]
  104. Sarwar, A.; Peters, T.R.; Shafeeque, M.; Mohamed, A.; Arshad, A.; Ullah, A.; Saddique, A.; Muzammil, M.; Aslam, R.A. Accurate measurement of wind drift and evaporation losses could improve water application efficiency of sprinkler irrigation systems—A comparison of measuring techniques. Agric. Water Manag. 2021, 258, 107209. [Google Scholar]
  105. Blanco, M.N.; Fenske, R.A.; Kasner, E.J.; Yost, M.G.; Seto, E.; Austin, E. Real-Time Monitoring of Spray Drift from Three Different Orchard Sprayers. Chemosphere 2019, 222, 46–55. [Google Scholar] [CrossRef]
  106. Chen, P.; Lan, Y.; Huang, X.; Qi, H.; Wang, G.; Wang, J.; Wang, L.; Xiao, H. Droplet Deposition and Control of Planthoppers of Different Nozzles in Two-Stage Rice with a Quadrotor Unmanned Aerial Vehicle. Agronomy 2020, 10, 303. [Google Scholar] [CrossRef]
  107. Lin, J.; Cai, J.; Ouyang, J.; Xiao, L.; Qiu, B. The Influence of Electrostatic Spraying with Waist-Shaped Charging Devices on the Distribution of Long-Range Air-Assisted Spray in Greenhouses. Agronomy 2024, 14, 2278. [Google Scholar] [CrossRef]
  108. Appah, S.; Wang, P.; Ou, M.; Gong, C.; Jia, W. Review of electrostatic system parameters, charged droplets characteristics and substrate impact behavior from pesticides spraying. Int. J. Agric. Biol. Eng. 2019, 12, 1–9. [Google Scholar] [CrossRef]
  109. Vigo-Morancho, A.; Videgain, M.; Boné, A.; Vidal, M.; García-Ramos, F.J. Characterization and Evaluation of an Electrostatic Knapsack Sprayer Prototype for Agricultural Crops. Agronomy 2024, 14, 2343. [Google Scholar] [CrossRef]
  110. Song, Y. Analysis of air curtain system flow field and droplet drift characteristics of high clearance sprayer based on CFD. Int. J. Agric. Biol. Eng. 2024, 17, 38–45. [Google Scholar] [CrossRef]
  111. Ellis, M.C.B.; Lane, A.G.; O’Sullivan, C.M.; Jones, S. Wind tunnel investigation of the ability of drift-reducing nozzles to provide mitigation measures for bystander exposure to pesticides. Biosyst. Eng. 2021, 202, 152–164. [Google Scholar] [CrossRef]
  112. Liao, J.; Luo, X.; Wang, P.; Zhou, Z.; O’Donnell, C.C.; Zang, Y.; Hewitt, A.J. Analysis of the Influence of Different Parameters on Droplet Characteristics and Droplet Size Classification Categories for Air Induction Nozzle. Agronomy 2020, 10, 256. [Google Scholar] [CrossRef]
  113. Koc, C.; Duran, H.; Koc, D.G. Orchard Sprayer Design for Precision Pesticide Application. Erwerbs-Obstbau 2023, 65, 1819–1828. [Google Scholar] [CrossRef]
  114. Gil, E.; Campos, J.; Salcedo, R.; García-Ruiz, F. Variable Rate Application in fruit orchards and vineyards in Europe: Canopy characterization and system improvement. In Proceedings of the 2023 ASABE Annual Meeting, Omaha, Nebraska, 8–12 July 2023; Volume 1. [Google Scholar]
  115. Li, L.; He, X.; Song, J.; Liu, Y.; Zeng, A.; Liu, Y.; Liu, C.; Liu, Z. Design and experiment of variable rate orchard sprayer based on laser scanning sensor. Int. J. Agric. Biol. Eng. 2018, 11, 101–108. [Google Scholar] [CrossRef]
  116. Fessler, L.; Fulcher, A.; Lockwood, D.; Wright, W.; Zhu, H. Advancing Sustainability in Tree Crop Pest Management: Refining Spray Application Rate with a Laser-guided Variable-rate Sprayer in Apple Orchards. HortScience 2020, 55, 1522–1530. [Google Scholar] [CrossRef]
  117. Nackley, L.L.; Warneke, B.; Fessler, L.; Pscheidt, J.W.; Lockwood, D.; Wright, W.C.; Sun, X.; Fulcher, A. Variable-rate Spray Technology Optimizes Pesticide Application by Adjusting for Seasonal Shifts in Deciduous Perennial Crops. HortTechnology 2021, 31, 479–489. [Google Scholar] [CrossRef]
  118. Nan, Y.; Zhang, H.; Zheng, J.; Yang, K.; Ge, Y. Low-volume precision spray for plant pest control using profile variable rate spraying and ultrasonic detection. Front. Plant Sci. 2023, 13, 1042769. [Google Scholar] [CrossRef] [PubMed]
  119. Cui, B.; Cui, X.; Wei, X.; Zhu, Y.; Ma, Z.; Zhao, Y.; Liu, Y. Design and Testing of a Tractor Automatic Navigation System Based on Dynamic Path Search and a Fuzzy Stanley Model. Agriculture 2024, 14, 2136. [Google Scholar] [CrossRef]
  120. Zhu, F.; Chen, J.; Guan, Z.; Zhu, Y.; Shi, H.; Cheng, K. Development of a combined harvester navigation control system based on visual simultaneous localization and mapping-inertial guidance fusion. J. Agric. Eng. 2024, 55. [Google Scholar] [CrossRef]
  121. Cui, L.; Mao, H.; Xue, X.; Ding, S.; Qiao, B. Optimized design and test for a pendulum suspension of the crop spray boom in dynamic conditions based on a six DOF motion simulator. Int. J. Agric. Biol. Eng. 2018, 11, 76–85. [Google Scholar]
  122. Su, Z.; Zou, W.; Zhai, C.; Tan, H.; Yang, S.; Qin, X. Design of an Autonomous Orchard Navigation System Based on Multi-Sensor Fusion. Agronomy 2024, 14, 2825. [Google Scholar] [CrossRef]
  123. Liu, W.; Hu, J.; Liu, J.; Yue, R.; Zhang, T.; Yao, M.; Li, J. Method for the navigation line recognition of the ridge without crops via machine vision. Int. J. Agric. Biol. Eng. 2024, 17, 230–239. [Google Scholar]
  124. Jiang, S.; Qi, P.; Han, L.; Liu, L.; Li, Y.; Huang, Z.; Liu, Y.; He, X. Navigation system for orchard spraying robot based on 3D LiDAR SLAM with NDT_ICP point cloud registration. Comput. Electron. Agric. 2024, 220, 108870. [Google Scholar] [CrossRef]
  125. Wang, W.; Qin, J.; Huang, D.; Zhang, F.; Liu, Z.; Wang, Z.; Yang, F. Integrated Navigation Method for Orchard-Dosing Robot Based on LiDAR/IMU/GNSS. Agronomy 2024, 14, 2541. [Google Scholar] [CrossRef]
  126. Guevara, J.; Fernando, A.; Cheein, A.; Gené-Mola, J.; Gregorio, E. Analyzing and overcoming the effects of GNSS error on LiDAR based orchard parameters estimation. Comput. Electron. Agric. 2020, 170, 105255. [Google Scholar] [CrossRef]
  127. Vu, C.T.; Chen, H.C.; Liu, Y.C. Toward Autonomous Navigation for Agriculture Robots in Orchard Farming. In Proceedings of the 2024 IEEE International Conference on Recent Advances in Systems Science and Engineering (RASSE), Taichung, Taiwan, 6–8 November 2024; IEEE: Piscataway, NJ, USA, 2024; pp. 1–8. [Google Scholar]
  128. Zhang, Y.; Zhang, B.; Shen, C.; Liu, H.; Huang, J.; Tian, K.; Tang, Z. Review of the field environmental sensing methods based on multi-sensor information fusion technology. Int. J. Agric. Biol. Eng. 2024, 17, 1–13. [Google Scholar]
  129. Jon, M.; Ander, A.; Inaki, M.; Aitor, G.; David, O.; Oskar, C. A Generic ROS-Based Control Architecture for Pest Inspection and Treatment in Greenhouses Using a Mobile Manipulator. IEEE Access 2021, 9, 94981–94995. [Google Scholar]
  130. Wang, S.; Song, J.; Qi, P.; Yuan, C.; Wu, H.; Zhang, L.; Liu, W.; Liu, Y.; He, X. Design and development of orchard autonomous navigation spray system. Front. Plant Sci. 2022, 13, 960686. [Google Scholar] [CrossRef] [PubMed]
  131. Gené-Mola, J.; Llorens, J.; Rosell-Polo, J.R.; Gregorio, E.; Arnó, J.; Solanelles, F.; Martínez-Casasnovas, J.; Escolà, A. Assessing the Performance of RGB-D Sensors for 3D Fruit Crop Canopy Characterization under Different Operating and Lighting Conditions. Sensors 2020, 20, 7072. [Google Scholar] [CrossRef]
  132. Zhu, X.; Chen, F.; Zheng, Y.; Chen, C.; Peng, X. Detection of Camellia oleifera fruit maturity in orchards based on modified lightweight YOLO. Comput. Electron. Agric. 2024, 226, 109471. [Google Scholar] [CrossRef]
  133. Zhong, W.; Yang, W.; Zhu, J.; Jia, W.; Dong, X.; Ou, M. An Improved UNet-Based Path Recognition Method in Low-Light Environments. Agriculture 2024, 14, 1987. [Google Scholar] [CrossRef]
  134. Ma, Z.; Yang, S.; Li, J.; Qi, J. Research on SLAM Localization Algorithm for Orchard Dynamic Vision Based on YOLOD-SLAM2. Agriculture 2024, 14, 1622. [Google Scholar] [CrossRef]
  135. Jiang, A.; Ahamed, T. Development of an autonomous navigation system for orchard spraying robots integrating a thermal camera and LiDAR using a deep learning algorithm under low- and no-light conditions. Comput. Electron. Agric. 2025, 235, 110359. [Google Scholar] [CrossRef]
  136. Ji, X.; Wang, A.; Wei, X. Precision Control of Spraying Quantity Based on Linear Active Disturbance Rejection Control Method. Agriculture 2021, 11, 761. [Google Scholar] [CrossRef]
  137. Xue, R.; Zhang, C.; Yan, H.; Disasa, K.N.; Lakhiar, I.A.; Akhlaq, M.; Hameed, M.U.; Li, J.; Ren, J.; Deng, S.; et al. Determination of the optimal frequency and duration of micro-spray patterns for high-temperature environment tomatoes based on the Fuzzy Borda model. Agric. Water Manag. 2025, 307, 109240. [Google Scholar] [CrossRef]
  138. Li, Z.; Li, C.; Zeng, Y.; Mai, C.; Jiang, R.; Li, J. Design and Realization of an Orchard Operation-Aid Platform: Based on Planting Patterns and Topography. Agriculture 2024, 15, 48. [Google Scholar] [CrossRef]
  139. Bloch, V.; Degani, A.; Bechar, A. A methodology of orchard architecture design for an optimal harvesting robot. Biosyst. Eng. 2018, 166, 126–137. [Google Scholar] [CrossRef]
Figure 1. Detection process of apple: (a) Original image. (b) Segmentation result. (c) Pickable apples. (d) Result of pickable apples [28].
Figure 2. Detection principle of the canopy volume calculation [35].
Figure 3. The diagram of canopy thickness measurement: (a) Diagram of canopy thickness detection principle; (b) Ultrasonic echo signal and echo interval time [24].
Figure 4. The schematic of the YOLO detection algorithm [51]. Reproduced with permission from Tian, Y.; Yang, G.; Wang, Z.; Wang, H.; Li, E.; Liang, Z., Computers and Electronics in Agriculture; published by Elsevier, 2019.
Figure 5. The detection performance of each algorithm [52]. Reproduced with permission from Mirhaji, H.; Soleymani, M.; Asakereh, A.; Mehdizadeh, S.A., Computers and Electronics in Agriculture; published by Elsevier, 2021.
Figure 6. The processing speed of each algorithm [52]. Reproduced with permission from Mirhaji, H.; Soleymani, M.; Asakereh, A.; Mehdizadeh, S.A., Computers and Electronics in Agriculture; published by Elsevier, 2021.
Figure 7. Point clouds acquired from sensors under different lighting conditions.
Figure 8. Different orchard terrain: (a) Terraced Nanguo pear orchard [92]. Reproduced with permission from Guo, S.; Yao, W.; Xu, T.; Ma, H.; Sun, M.; Chen, C.; Lan, Y., Pest Management Science; published by Wiley, 2022. (b) Lychee in hilly and mountainous terrains [138].
Figure 9. Pattern of fruit trees [139]. Reproduced with permission from Bloch, V.; Degani, A.; Bechar, A., Biosystems Engineering; published by Elsevier, 2018: (a) Tall spindle; (b) Y-trellis.
Table 1. Sensor performance comparison in orchard canopy detection.

| Sensor | Feature | Orchard | Relative Error | Shortcoming |
|---|---|---|---|---|
| RGB-D | Canopy segmentation | Citrus | 2.94% | Detection speed is slower compared to RGB [27] |
| RGB | Canopy volume | Apple | 6.64% | Strongly affected by backlighting, leading to overestimation of canopy volume [39] |
| LiDAR | Canopy height | Citrus | 26.80% | Affected by environmental noise such as weather and reflections [29] |
| LiDAR | Leaf area | Apple | 13.90% | Greatly affected by branches and trunks [40] |
| Ultrasonic | Leaf area density | Osmanthus | 2.84% | Greatly affected by leaf occlusion and irregular distribution [41] |
| Ultrasonic | Canopy thickness | Osmanthus | 18.8% | Strongly affected by lighting and denser canopies, with only 8.8% error in lab tests [24] |
Table 2. A comparison between multimodal and unimodal data.

| Data Source | Mean Squared Error | Mean Absolute Error | R² |
|---|---|---|---|
| RGB data | 0.087 | 0.013 | 0.814 |
| Point cloud data | 0.079 | 0.010 | 0.846 |
| Multimodal data | 0.062 | 0.005 | 0.914 |
Table 3. Performance of different YOLO architectures under challenging orchard conditions.

| YOLO Version | Environmental Challenge | Task | mAP/Accuracy | Notes |
|---|---|---|---|---|
| YOLOv8n | High fruit density | Density estimation | mAP: 87.3%; accuracy: 98.7% | Accurate cluster segmentation [69] |
| YOLOv4-Tiny + FEN | Occlusion, day/night | Multi-class tomato detection | mAP: 94.72% at night | Maintained performance under occlusion [70] |
| YOLOv4-SE | Fruit overlap, depth | Grape detection and picking-point localization | Average recognition success rate: 97% | Depth fusion improved precision [71] |
| EF-YOLOv5s | Lighting variation, occlusion | Apple detection and clustering | Precision: 98.84% | 2.86 s per pick [72] |
| YOLOv8s-seg-CBAM | Small targets, occlusion | Strawberry peduncle detection | Accuracy: 86.2% | 30.6 ms per image [73] |
Table 4. Four commercial nozzles and mean deposition [92].

| Nozzle Type | Description | Manufacturer | Mean Deposition |
|---|---|---|---|
| SX | Flat-fan | DJI (Shenzhen, China) | 0.442 μL/cm² |
| XR | Extended-range flat-fan | TeeJet (Wheaton, IL, USA) | 0.410 μL/cm² |
| IDK | Air-induction flat-fan | Lechler (Metzingen, Germany) | 0.488 μL/cm² |
| TR | Hollow cone | Lechler (Metzingen, Germany) | 0.284 μL/cm² |
Table 5. Pesticide saving rates under variable-rate spraying.

| Sensor | Feature | Orchard | Pesticide Saving Rate |
|---|---|---|---|
| Laser scanning | Canopy volume | Apple | 44.2% [115] |
| Real-time scanning laser | Canopy volume | Apple | 41.0% [116] |
| Laser | Canopy density | Grape | 48.8% [117] |
| Ultrasonic | Leaf area density | Osmanthus | 28.4% [118] |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
