Article

Research on a Multi-Sensor Fusion-Based Method for Fruit-Tree Dripline Path Detection

Daochu Wei, Zhichong Wang, Jingwei Wang, Xuecheng Li, Wei Zou and Changyuan Zhai
1 College of Mechanical and Electrical Engineering, Xinjiang Agricultural University, Urumqi 830052, China
2 Intelligent Equipment Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
* Authors to whom correspondence should be addressed.
Agronomy 2026, 16(1), 20; https://doi.org/10.3390/agronomy16010020
Submission received: 17 November 2025 / Revised: 15 December 2025 / Accepted: 19 December 2025 / Published: 21 December 2025
(This article belongs to the Special Issue Advances in Precision Pesticide Spraying Technology and Equipment)

Abstract

To enable automatic extraction of high-precision paths for intelligent orchard operations, a path detection method targeting the fruit-tree dripline is proposed. The method integrates a 2D LiDAR, RTK-GNSS, and an electronic compass, achieving time synchronization, coordinate-frame construction, and extrinsic calibration. Point clouds are rotation-normalized via least-squares trajectory fitting; ground segmentation and statistical filtering suppress noise; segment-wise extremal edge points, together with an α-shape-based concave hull algorithm, are used to fit and generate the dripline path; and an inverse rotation restores the result to the orchard-local coordinate frame. Field experiments demonstrated that the method accurately extracts dripline paths in orchard environments: relative to manual measurements, the overall mean absolute error was 0.23 m and the root-mean-square error was 0.30 m. Across different travel speeds, the system exhibited good adaptability and stability, meeting the path-planning requirements of precision orchard operations.

1. Introduction

China's orchard-planting area has reached 12.808 million hectares, and both its fruit-planting area and production rank first worldwide [1]. At present, orchards are moving toward scale and standardization, progressively integrating agronomy with agricultural machinery and creating favorable conditions for the promotion and application of intelligent orchard equipment [1]. The fruit-tree dripline is the outermost boundary of the canopy's vertical projection on the ground and is closely related to the root uptake zone. In engineering and agronomic practice, the dripline is commonly used as a spatial proxy for near-surface absorbing roots and is widely applied in determining fertilization-trench locations, arranging ring irrigation, selecting field soil-sampling points, and delineating obstacle-avoidance boundaries for pest and disease control and mechanized operations. For orchard intelligence, the stable geometric properties of the dripline provide drivable boundaries and safety-clearance constraints for path planning; existing studies employ the dripline (or canopy-edge projection) as the target boundary for image- or LiDAR-based navigation-line detection to enable automatic guidance for trench fertilization, variable irrigation, and inter-row operations [2,3,4,5]. Utilizing multi-sensor information for spatial compensation and structural optimization to derive an efficient and accurate dripline-based path is therefore a prerequisite for realizing precision agriculture [6,7].
Path planning is a prominent research topic relating to the intelligence of agricultural machinery, and extensive studies on path extraction for field crops have been reported in recent years [8,9,10,11,12,13,14]. For extracting canopy boundaries (driplines) and row-structure lines from orchard data, existing approaches fall into two categories: image-based methods [15,16,17,18,19] and LiDAR point-cloud methods [20,21,22,23]. Among image-based methods, He et al. [24] proposed a monocular-camera orchard navigation-path recognition algorithm that located the trunk region by horizontal projection, applied thresholding and morphological filtering, scanned and clustered edge points along the trunk-ground interface, and finally fitted boundary lines on both sides via least squares to obtain a stable edge-point set. Torres-Sospedra et al. [25] proposed a vision-based approach for orange orchards that employed wavelet and window-energy features plus a multilayer perceptron to classify ground/canopy/sky and used the Hough transform on interclass boundaries to extract row-direction lines and acquire inter-row boundary edge points. Sharifi et al. [26] proposed a camera-based inter-row guidance method that performed mean-shift clustering on color images, partitioned the scene into ground/tree/sky, and then fitted two straight lines along the ground-tree segmentation boundary using the Hough transform to extract edge points and a navigation centerline. Wang Heng et al. [27] proposed an image-processing-based dripline path detection method that employed a vertically mounted CCD camera to capture canopy-projection images and, through grayscale processing, binary segmentation, morphological dilation, boundary tracking, and Bezier-curve fitting, ultimately extracted the dripline path. Among LiDAR-based approaches, Li Guangjing et al. [28] proposed a real-time road-edge extraction algorithm based on 3D LiDAR that derived height and smoothness features, selected edge points, and finally applied polynomial fitting to delineate the edge. Wang et al. [29] proposed a 3D LiDAR curb-detection and tracking method comprising GPS/IMU de-skewing and ground removal, multi-feature candidate extraction, density-based clustering to separate left and right curbs, dual filtering with distance checks and RANSAC to suppress spurious points, least-squares quadratic fitting, and Kalman filtering for robust curb tracking. Zhang et al. [30] proposed a LiDAR-based road and curb detection method that obtained road candidates via height-domain filtering plus pattern recognition, followed by model-consistency evaluation in the ground-plane projection to recover and output 3D curb points. The foregoing LiDAR-based edge-extraction approaches suffer reduced navigation accuracy or unstable performance under terrain undulation, dense-foliage occlusion, and high-density structural point clouds; when cameras are used for canopy recognition, shadows and leaf-color variation frequently interfere, making accurate canopy-edge extraction difficult, with performance particularly poor under interlaced multi-canopy or irregular tree architectures.
Studies on path detection specifically for fruit-tree driplines are scarce. To address this gap and improve the adaptability and automation of dripline extraction, a path detection method integrating 2D LiDAR, RTK-GNSS, and electronic-compass information is proposed and validated in the field. From the processed data, canopy-outline edge points, namely the dripline feature points, were precisely extracted based on discontinuities in point-cloud normal vectors; the dripline point set was fitted and smoothed using an α-shape-based concave hull algorithm; and the result was inversely rotated to the orchard-local coordinate frame to obtain the dripline path. Field validations conducted at multiple times and across multiple rows using the custom tracked test platform showed that the method consistently accomplished edge-point extraction and path fitting with favorable performance under dense planting, irregular tree architecture, and local occlusions, indicating strong robustness.

2. Materials and Methods

2.1. System Hardware Composition

The architecture of the fruit-tree dripline path detection system is shown in Figure 1. The hardware mainly consisted of an electronic compass, a 2D LiDAR, an RTK-GNSS receiver, an onboard computer, and a tracked mobile platform. During detection, the sensor suite comprising the electronic compass, 2D LiDAR, and RTK-GNSS acquired heading, range, and position data in the orchard environment and transmitted them to the onboard computer. The electronic compass was the DFEC900 dynamic model (Dongfang Weidian Technology, Wuhan, China), with a 50 Hz output rate, 0.01° angular resolution, and ≤0.5° azimuth accuracy; it used USART communication and supported soft-iron and hard-iron calibration. The 2D LiDAR was an LMS111 (SICK, Waldkirch, Germany) featuring a 20 m scan radius, 25 Hz scan frequency, 0.25° angular resolution, and a −45° to 225° field of view. The RTK-GNSS receiver was the NAV-L1 (Suzhou New Coordinate Intelligent Equipment Co., Ltd., Suzhou, China), which accessed network RTK via a 4G connection and achieved 2 cm positioning accuracy. The computing platform was a laptop configured with an i9-14900HX CPU (2.2 GHz), 32 GB RAM, and an NVIDIA GeForce RTX 4060 Laptop GPU. The tracked chassis (Qingke Intelligent Technology Co., Ltd., Jinan, China) measured 100 cm in length and 80 cm in width, with a maximum travel speed of 5 km/h. It served as a remotely controlled data-collection platform, adopted a dual-motor differential-drive structure with two DC motors, and supported forward motion, reversing, and steering via remote control.

2.2. System Software Implementation

The software workflow and architecture of the proposed fruit-tree dripline path detection system are shown in Figure 2. (1) Raw data from 2D LiDAR, RTK-GNSS, and IMU are imported into the system. (2) Latitude–longitude coordinates from RTK-GNSS are converted to the orchard-local planar Cartesian coordinate frame. (3) The processed sensor streams are fused and used for 3D reconstruction of the LiDAR point cloud. (4) The vehicle trajectory is fitted using least squares to estimate the trajectory offset angle in the orchard-local frame, and the LiDAR point cloud is rotated by this angle for normalization. (5) Statistical filtering is applied to the rotated LiDAR point cloud for denoising and outlier rejection. (6) Canopy-outline edge points (the dripline feature points) are extracted, and an α-shape-based concave hull fitting of the edge points yields the dripline path coordinates. (7) The extracted dripline coordinates are inverse-rotated back to the original orchard-local frame to obtain the final path coordinates. The software runs as Python scripts on Windows 11 with Python 3.12.

2.2.1. Sensor-Data Import and Establishment of the Orchard-Local Coordinate Frame

In the preprocessing stage, the acquired multi-source sensor data (RTK-GNSS, electronic compass, and LiDAR) are imported into the system. All computations for dripline path extraction are performed in a local coordinate frame, established from RTK-GNSS outputs to define the transformation between WGS-84 geodetic coordinates and the local frame.
A topocentric (local horizon) coordinate system is used as the local frame in the dripline path-extraction process. The topocentric system is defined as follows: the origin $O_b$ lies on the local reference ellipsoid, the $X_b$-axis points east along the prime vertical, the $Y_b$-axis points toward the North Pole along the meridian, and the $Z_b$-axis points to the zenith along the outward normal of the ellipsoidal surface, as shown in Figure 3. The transformation from WGS-84 geodetic coordinates to WGS-84 Earth-centered, Earth-fixed (ECEF) Cartesian coordinates is given by Equation (1):

$$\begin{bmatrix} X_e \\ Y_e \\ Z_e \end{bmatrix} = \begin{bmatrix} (R_N + h)\cos\lambda\cos\varphi \\ (R_N + h)\sin\lambda\cos\varphi \\ \left[ R_N (1 - e^2) + h \right]\sin\varphi \end{bmatrix} \tag{1}$$

where $(X_e, Y_e, Z_e)$ denotes the WGS-84 Cartesian coordinates of an arbitrary point on the Earth's surface; $\lambda$, $\varphi$, and $h$ are the geodetic longitude, latitude, and ellipsoidal height of that point, respectively; $R_N$ is the prime-vertical radius of curvature at that point; and $e$ is the first eccentricity of the ellipsoid. The transformation between WGS-84 Cartesian and topocentric coordinates is given by Equation (2):

$$\begin{bmatrix} X_e \\ Y_e \\ Z_e \end{bmatrix} = \begin{bmatrix} -\sin\lambda & -\cos\lambda\sin\varphi & \cos\lambda\cos\varphi \\ \cos\lambda & -\sin\lambda\sin\varphi & \sin\lambda\cos\varphi \\ 0 & \cos\varphi & \sin\varphi \end{bmatrix} \begin{bmatrix} X_b \\ Y_b \\ Z_b \end{bmatrix} + \begin{bmatrix} X_{oe} \\ Y_{oe} \\ Z_{oe} \end{bmatrix} \tag{2}$$

where $(X_b, Y_b, Z_b)$ denotes the topocentric coordinates of a point; $(X_e, Y_e, Z_e)$ denotes the WGS-84 Cartesian coordinates of that point; $(X_{oe}, Y_{oe}, Z_{oe})$ denotes the WGS-84 Cartesian coordinates of the origin of the corresponding topocentric frame; and the signs of $\lambda$ and $\varphi$ are adjusted according to the hemisphere in which the point lies. Accordingly, designating a point as the origin establishes a topocentric coordinate system (the local coordinate frame) with that point as the origin; for any other point, its WGS-84 geodetic coordinates are converted to WGS-84 Cartesian coordinates, and the transformation between WGS-84 Cartesian and topocentric coordinates then yields its coordinates in the defined topocentric frame, namely its local coordinates, thereby completing the conversion from WGS-84 geodetic to local coordinates.
At the start of the tracked platform’s operation, the first RTK-GNSS position was automatically designated as the origin of the local coordinate system, establishing the local frame. All path-extraction and error analyses were conducted within this orchard-local frame.
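For concreteness, the following is a minimal NumPy sketch of this geodetic-to-local conversion (Equations (1) and (2)); the WGS-84 constants are standard values, while the function and variable names are illustrative rather than taken from the authors' code.

```python
import numpy as np

# Standard WGS-84 ellipsoid constants
A = 6378137.0                 # semi-major axis (m)
E2 = 6.69437999014e-3         # first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    """Equation (1): geodetic coordinates (radians, meters) -> ECEF Cartesian."""
    rn = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime-vertical radius R_N
    x = (rn + h) * np.cos(lat) * np.cos(lon)
    y = (rn + h) * np.cos(lat) * np.sin(lon)
    z = (rn * (1.0 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_local(p_ecef, lat0, lon0, origin_ecef):
    """Inverse of Equation (2): ECEF -> topocentric (east, north, up) frame
    whose origin has geodetic coordinates (lat0, lon0)."""
    sl, cl = np.sin(lon0), np.cos(lon0)
    sp, cp = np.sin(lat0), np.cos(lat0)
    # Rows are the east, north, and up unit vectors expressed in ECEF.
    r = np.array([[-sl,       cl,      0.0],
                  [-cl * sp, -sl * sp, cp],
                  [ cl * cp,  sl * cp, sp]])
    return r @ (p_ecef - origin_ecef)
```

In practice, the first RTK-GNSS fix would be converted once with `geodetic_to_ecef` to obtain `origin_ecef`, and every subsequent fix mapped into the orchard-local frame with `ecef_to_local`.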

2.2.2. Multi-Source Sensor Fusion and 3D Reconstruction of LiDAR Point Clouds

During the three-dimensional reconstruction of fruit-tree point clouds, because a 2D LiDAR cannot provide absolute pose, the tightly coupled integration of IMU data with GNSS and LiDAR proposed by Wang et al. [31] was adopted to enhance short-term attitude estimation and positioning robustness, thereby ensuring the accuracy of the subsequent motion compensation and three-dimensional point-cloud reconstruction. Using the LiDAR timestamps as the temporal axis, the IMU data were matched by a nearest-neighbor strategy: for each LiDAR timestamp, the IMU sample with the minimum absolute time difference was identified, and the matched IMU record was assigned the LiDAR timestamp.
With $t_{lidar}$ denoting a LiDAR timestamp and $T_{imu} = \{t_1, t_2, \ldots, t_n\}$ the IMU timestamp set, the matching relation is given in Equation (3):

$$t_{imu} = \arg\min_{t \in T_{imu}} \left| t_{lidar} - t \right| \tag{3}$$

where $t_{imu}$ denotes the timestamp within $T_{imu}$ that has the smallest absolute time difference from $t_{lidar}$.
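A minimal vectorized sketch of this nearest-neighbor matching, assuming both timestamp arrays are sorted in ascending order (function and array names are illustrative):

```python
import numpy as np

def match_imu_to_lidar(lidar_ts, imu_ts):
    """Equation (3): for each LiDAR timestamp, return the index of the IMU
    sample with the minimum absolute time difference."""
    idx = np.searchsorted(imu_ts, lidar_ts)               # insertion positions
    idx = np.clip(idx, 1, len(imu_ts) - 1)                # keep both neighbors valid
    left, right = imu_ts[idx - 1], imu_ts[idx]
    take_left = (lidar_ts - left) <= (right - lidar_ts)   # pick the closer neighbor
    return np.where(take_left, idx - 1, idx)
```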
RTK-GNSS provides centimeter-level absolute coordinates, but its update rate is typically 1 Hz, far lower than the LiDAR scan frequency. The RTK-GNSS data were therefore upsampled to 25 Hz via piecewise linear interpolation to estimate the position at any LiDAR timestamp in a continuous, piecewise-linear manner, preserving global accuracy while avoiding the introduction of higher-order errors. Interpolation and timestamp alignment were performed in the following three steps:
(1)
Time-interval partitioning: RTK epochs were used as nodes to form the continuous interval set $\{[t_0, t_1], [t_1, t_2], \ldots, [t_{n-1}, t_n]\}$.
(2)
Intra-interval interpolation: for each LiDAR timestamp $t_{lidar} \in [t_k, t_{k+1}]$, the linear interpolation Formula (4) was applied:

$$RTK_{val} = r_k + (r_{k+1} - r_k) \times \frac{t_{lidar} - t_k}{t_{k+1} - t_k} \tag{4}$$

where $r_k$ denotes the RTK value at time $t_k$, $r_{k+1}$ the RTK value at time $t_{k+1}$, and $(t_{lidar} - t_k)/(t_{k+1} - t_k)$ the time-progress ratio (ranging from 0 to 1).
(3)
Boundary handling: before the first node, the first RTK sample was held, $RTK_{val}(t_{lidar} < t_0) = r_0$; beyond the last node, the last RTK sample was held, $RTK_{val}(t_{lidar} > t_n) = r_n$; and an exact timestamp match used the corresponding RTK sample, $RTK_{val}(t_{lidar} = t_k) = r_k$. A minimal sketch of this interpolation follows below.
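Because `np.interp` implements exactly this piecewise linear rule, including holding the first and last samples outside the node range, the upsampling step reduces to a few lines (array names are assumptions):

```python
import numpy as np

def upsample_rtk(lidar_ts, rtk_ts, rtk_en):
    """Equation (4) plus the boundary rules above: piecewise linear
    interpolation of RTK east/north coordinates onto the LiDAR time axis.
    rtk_ts must be ascending; rtk_en has shape (n, 2)."""
    east = np.interp(lidar_ts, rtk_ts, rtk_en[:, 0])
    north = np.interp(lidar_ts, rtk_ts, rtk_en[:, 1])
    return np.column_stack([east, north])
```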
Strict alignment was enforced between each fused LiDAR frame and the IMU/RTK records at the corresponding time to ensure spatiotemporal consistency for motion-distortion compensation. The processed multi-source data were used for three-dimensional reconstruction, yielding a centimeter-level spatial representation of the orchard canopy.

2.2.3. Rotation of LiDAR Point-Cloud Coordinates Based on Least-Squares Trajectory Fitting

During coordinate rotation of the reconstructed LiDAR point cloud, because the vehicle's acquisition trajectory was generally not strictly linear, a least-squares linear fit was applied to the actual vehicle path. The set of trajectory points acquired during vehicle motion was denoted $\{(a_i, b_i)\}, i = 1, 2, \ldots, n$. Because the system automatically designated the first RTK-GNSS measurement as the origin of the local coordinate frame during data collection, the vehicle trajectory necessarily passed through this origin in the local frame, and the following linear model was established:

$$b_i = m a_i \tag{5}$$

The optimal fitted slope was determined using the least-squares method by minimizing the sum of squared residuals $S$:

$$S = \sum_{i=1}^{n} (b_i - m a_i)^2 \tag{6}$$

where $m$ denotes the slope of the vehicle trajectory to be estimated, $S$ the sum of squared residuals, and $n$ the number of RTK-GNSS samples along the vehicle trajectory. Differentiating $S$ with respect to $m$ and setting the derivative to zero yielded the optimal estimate of the slope $m$:

$$m = \frac{\sum_{i=1}^{n} a_i b_i}{\sum_{i=1}^{n} a_i^2} \tag{7}$$

Based on the slope $m$, the offset angle $\theta = \arctan(m)$ between the vehicle trajectory and the x-axis of the orchard-local coordinate system was determined. The entire fruit-tree point cloud was then rotated clockwise by the angle $90° + \theta$ so that the orchard travel direction in the reconstructed point cloud became parallel to the y-axis of the orchard-local coordinate system, providing a unified reference frame for subsequent analyses and processing.
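A compact sketch of Equations (5)-(7) and the subsequent alignment rotation, under the convention that a clockwise rotation corresponds to a negative counter-clockwise angle (names and sign handling are assumptions, not the authors' code):

```python
import numpy as np

def alignment_rotation(traj_xy):
    """Equations (5)-(7): through-origin least-squares slope of the
    trajectory, then the clockwise rotation by (90 deg + theta) that makes
    the travel direction parallel to the y-axis."""
    a, b = traj_xy[:, 0], traj_xy[:, 1]
    m = np.sum(a * b) / np.sum(a ** 2)     # closed-form slope, Equation (7)
    theta = np.arctan(m)                   # trajectory offset angle
    psi = -(np.pi / 2 + theta)             # clockwise 90 deg + theta
    rot = np.array([[np.cos(psi), -np.sin(psi)],
                    [np.sin(psi),  np.cos(psi)]])
    return rot, theta

# Applying the rotation to an (N, 2) cloud: rotated = cloud_xy @ rot.T
```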

2.2.4. Ground Segmentation and Outlier Filtering for LiDAR Point Clouds

After completing the rotation of the LiDAR point cloud, the entire rotated cloud was first subjected to ground segmentation. A ground-height threshold of 0.5 m was set; with the rotated point-cloud set denoted $P$ and each point denoted $p_i = (x_i, y_i, z_i)$, the condition for retaining valid points was as given in Equation (8):

$$P_f = \left\{ p_i \in P \mid z_i > z_g \right\} \tag{8}$$

where $z_g$ denotes the preset ground-height threshold and $z_i$ denotes the vertical coordinate of the $i$-th point in the cloud.
After removing ground points, the remaining point-cloud set was further processed with statistical filtering to eliminate noise and outliers. The statistical filter assumed that the spatial coordinates of the point cloud followed a normal distribution and identified and removed anomalous points by analyzing the standard deviation of the coordinates. The means and standard deviations of the LiDAR point-cloud coordinates along the x and y directions of the local frame were computed as follows.
$$\mu_x = \frac{1}{n} \sum_{i=1}^{n} x_i, \quad \mu_y = \frac{1}{n} \sum_{i=1}^{n} y_i \tag{9}$$

$$\sigma_x = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} (x_i - \mu_x)^2 }, \quad \sigma_y = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} (y_i - \mu_y)^2 } \tag{10}$$

where $\mu_x$ and $\mu_y$ denote the means of the point-cloud coordinates along the x-axis and y-axis, respectively, and $\sigma_x$ and $\sigma_y$ denote the corresponding standard deviations. Based on the valid point-cloud set $P_f$, an improved 3σ criterion grounded in the coordinate distributions was applied for statistical filtering. Direction-specific threshold parameters $L_{Horizontal}$ and $L_{Longitudinal}$ were set, and the outlier-rejection condition was formulated as in Equation (11):

$$\frac{\left| x_i - \mu_x \right|}{\sigma_x} > L_{Horizontal} \quad \text{or} \quad \frac{\left| y_i - \mu_y \right|}{\sigma_y} > L_{Longitudinal} \tag{11}$$

The parameters $L_{Horizontal}$ and $L_{Longitudinal}$ controlled the allowable coordinate-deviation ranges in the x and y directions, respectively. Because the acquisition platform swayed little in the x direction during motion while anomalies and sensor errors concentrated there, and to reduce point-cloud size and improve runtime efficiency, the x-direction threshold was set to $L_{Horizontal} = 1$, corresponding to the 68.27% confidence interval of a normal distribution, to effectively remove x-direction outliers. In the y direction, because the platform trajectory extended strongly forward, the forward scene contained complex structures, and the point-cloud variance was relatively large, the y-direction threshold was set to $L_{Longitudinal} = 3.5$, corresponding to the 99.95% confidence interval of a normal distribution, so as to retain as many valid y-direction points as possible. A comparison of preprocessing effects for the rotation-normalized point cloud (with local magnifications) is presented in Figure 4.
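The ground segmentation and directional statistical filter of Equations (8)-(11) reduce to a few vectorized operations; a minimal sketch using the thresholds quoted above:

```python
import numpy as np

def preprocess_cloud(points, z_ground=0.5, l_horiz=1.0, l_long=3.5):
    """Equations (8)-(11): drop returns at or below the ground-height
    threshold, then reject points whose normalized x/y deviations exceed
    the direction-specific thresholds."""
    pts = points[points[:, 2] > z_ground]        # Equation (8): ground removal
    mu = pts[:, :2].mean(axis=0)                 # Equation (9)
    sigma = pts[:, :2].std(axis=0)               # Equation (10)
    dev = np.abs(pts[:, :2] - mu) / sigma        # per-axis normalized deviation
    keep = (dev[:, 0] <= l_horiz) & (dev[:, 1] <= l_long)   # Equation (11)
    return pts[keep]
```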

2.2.5. Extraction of the Fruit-Tree Dripline Path

The processed three-dimensional LiDAR point cloud was projected onto a two-dimensional plane by removing the z-axis component, yielding the LiDAR data in the xy-plane. The two-dimensional data were then segmented along the y-axis with a fixed step size. In each segment, the left-side set ($x < 0$) and right-side set ($x > 0$) of points were extracted, and extreme-value points were selected as local edge points: the left edge point was taken as the point with the maximal $x$ in the segment, whereas the right edge point was taken as the point with the minimal $x$. Letting the segment range be $Seg_y \in [y_{min} + d_1,\, y_{max} - d_2]$, with $d_1$ and $d_2$ denoting the start and end skip distances, respectively, the edge-point sets were obtained as follows:

$$E_{left} = \left\{ g_i \mid g_i \in \arg\max(sp_j) \right\} \tag{12}$$

$$E_{right} = \left\{ g_i \mid g_i \in \arg\min(sp_j) \right\} \tag{13}$$
To enhance the quality of the edge fitting, x-direction outliers in the edge-point sequence were further removed. With the point sequence denoted $SP = \{sp_i\}$, a bidirectional moving-window mean was used to predict a trend value $\hat{sp}_i$ for each point, and the deviation $\delta_i$ was computed:

$$\hat{sp}_i = \frac{1}{2h+1} \sum_{j=i-h}^{i+h} sp_j \tag{14}$$

$$\delta_i = \left| sp_i - \hat{sp}_i \right| \tag{15}$$

where $\hat{sp}_i$ is the trend prediction at the $i$-th point, $h$ the window radius, and $sp_j$ the original coordinate of the $j$-th point within the window. A dynamic threshold $Threshold_{dynamic}$ was set using the median-absolute-deviation (MAD) method:

$$M_d = \mathrm{median}(\delta_1, \delta_2, \ldots, \delta_n) \tag{16}$$

$$MAD = \mathrm{median}\left( \left| \delta_1 - M_d \right|, \left| \delta_2 - M_d \right|, \ldots, \left| \delta_n - M_d \right| \right) \tag{17}$$

$$Threshold_{dynamic} = M_d + c \cdot MAD \tag{18}$$

where $M_d$ denotes the median deviation, $MAD$ the median absolute deviation, and $c$ the sensitivity coefficient. If $\delta_i > Threshold_{dynamic}$, the point was regarded as an outlier and removed, yielding the final edge-point set $E = \{E_m, E_n\}$, which constituted the fruit-tree dripline points.
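A sketch of the segment-wise extremum extraction (Equations (12) and (13)) and the MAD-based dynamic-threshold filter (Equations (14)-(18)); the step size, skip distances, window radius, and sensitivity coefficient are illustrative placeholders rather than the authors' tuned values:

```python
import numpy as np

def extract_edge_points(xy, step=0.5, d1=0.5, d2=0.5):
    """Equations (12)-(13): per-segment extremal x on each side of the corridor."""
    left, right = [], []
    y0, y1 = xy[:, 1].min() + d1, xy[:, 1].max() - d2    # skip start/end spans
    for lo in np.arange(y0, y1, step):
        seg = xy[(xy[:, 1] >= lo) & (xy[:, 1] < lo + step)]
        ls, rs = seg[seg[:, 0] < 0], seg[seg[:, 0] > 0]
        if len(ls):
            left.append(ls[np.argmax(ls[:, 0])])         # max x: left-row edge
        if len(rs):
            right.append(rs[np.argmin(rs[:, 0])])        # min x: right-row edge
    return np.array(left), np.array(right)

def mad_filter(edge, h=3, c=3.0):
    """Equations (14)-(18): moving-mean trend plus MAD dynamic threshold
    (the window is truncated at the sequence ends; adequate for a sketch)."""
    x = edge[:, 0]
    trend = np.convolve(x, np.ones(2 * h + 1) / (2 * h + 1), mode="same")
    delta = np.abs(x - trend)                            # Equation (15)
    md = np.median(delta)                                # Equation (16)
    mad = np.median(np.abs(delta - md))                  # Equation (17)
    return edge[delta <= md + c * mad]                   # Equation (18)
```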
An α-shape-based concave-hull algorithm (extended α-shape) was applied to the dripline point set $E$. Two points $E_m$ and $E_n$ form a concave-hull boundary edge if and only if there exists an empty circle of radius $r = \alpha^{1/2}$ (containing no other point of $E$) passing through both of them. The concave-hull boundary point set was therefore $B_\alpha = \{ (E_m, E_n) \mid \min_D r(D) \le \alpha^{1/2} \}$, where $D$ ranges over the empty circles through $E_m$ and $E_n$; the parameter $\alpha \in [0.1, 0.5]$ controlled boundary concavity, with smaller values producing a tighter envelope. Fitting the dripline points in this way ultimately yielded the path point set $B_\alpha$.
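One common way to realize such an α-shape concave hull is to filter a Delaunay triangulation by circumradius and keep the edges used by exactly one surviving triangle. The sketch below follows the convention $r = \alpha^{1/2}$ used above; it is an assumption about the implementation, not the authors' code:

```python
import numpy as np
from scipy.spatial import Delaunay

def alpha_shape_edges(points, alpha):
    """Boundary edges of the alpha-shape of a 2D point set: keep Delaunay
    triangles whose circumradius is at most sqrt(alpha); edges belonging to
    exactly one kept triangle form the concave-hull boundary."""
    tri = Delaunay(points)
    edge_count = {}
    for ia, ib, ic in tri.simplices:
        pa, pb, pc = points[ia], points[ib], points[ic]
        a = np.linalg.norm(pb - pc)                      # triangle side lengths
        b = np.linalg.norm(pa - pc)
        c = np.linalg.norm(pa - pb)
        s = (a + b + c) / 2.0
        area = max(np.sqrt(max(s * (s - a) * (s - b) * (s - c), 0.0)), 1e-12)
        if a * b * c / (4.0 * area) <= np.sqrt(alpha):   # circumradius test
            for e in [(ia, ib), (ib, ic), (ia, ic)]:
                e = tuple(sorted(e))
                edge_count[e] = edge_count.get(e, 0) + 1
    return [e for e, n in edge_count.items() if n == 1]  # boundary edges only
```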
The path point set $B_\alpha$ extracted by the α-shape-based concave hull algorithm constituted a closed boundary curve. Because the entire point cloud had been rotated, the boundary points of the left and right paths unfolded from the tree center toward both sides with normal directions diverging outward, so the tangent and normal directions were employed to distinguish the left and right paths of a tree row. The variable $q$ was taken as the arc-length parameter of the boundary curve, namely the accumulated distance when traversing the coordinates in order; a continuous curve $curve(q)$ was defined, and differentiating $curve(q)$ yielded the tangent vector $u_i$:

$$curve(q) = \begin{bmatrix} x(q) \\ y(q) \end{bmatrix} \tag{19}$$

$$u_i = \frac{d}{dq} \begin{bmatrix} x(q) \\ y(q) \end{bmatrix} = \begin{bmatrix} u_{x,i} \\ u_{y,i} \end{bmatrix} \tag{20}$$

Rotating the tangent vector $u_i$ clockwise by 90° yielded the outward-pointing normal vector $\mathrm{Normal}_i$ at the corresponding point:

$$\mathrm{Normal}_i = \begin{bmatrix} u_{y,i} \\ -u_{x,i} \end{bmatrix} \tag{21}$$

The sign of $\mathrm{Normal}_i$ along the x-axis partitioned a tree row into left- and right-side paths. Because the acquisition platform captured two rows in a single pass, the right-side path of the left row and the left-side path of the right row (relative to the travel direction) were selected. When $\mathrm{Normal}_{x,i} > 0$ (normal pointing right, toward positive x), right-side points of the right-hand row were discarded and left-side points were retained as the data for the right-side path relative to the travel direction; when $\mathrm{Normal}_{x,i} < 0$ (normal pointing left, toward negative x), left-side points of the left-hand row were discarded and right-side points were retained as the data for the left-side path relative to the travel direction. The process of extracting the fruit-tree dripline path is illustrated in Figure 5.
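A simplified sketch of the tangent/normal sign test (Equations (19)-(21)), using finite differences for the derivative along the ordered boundary; the row-level bookkeeping of which hull belongs to which tree row is omitted here:

```python
import numpy as np

def split_hull_by_normal(boundary):
    """Equations (19)-(21): finite-difference tangent u_i along the ordered
    boundary, clockwise 90-degree rotation to the normal (u_y, -u_x), and a
    sign test on the normal's x-component to separate the two sides."""
    tangent = np.gradient(boundary, axis=0)        # approximates d(curve)/dq
    normal = np.column_stack([tangent[:, 1], -tangent[:, 0]])
    side_neg = boundary[normal[:, 0] < 0]          # normal points toward -x
    side_pos = boundary[normal[:, 0] > 0]          # normal points toward +x
    return side_neg, side_pos
```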

2.2.6. Coordinate Recovery Based on a Two-Dimensional Inverse Rotation Matrix

The fitted dripline path coordinates were restored to their original orientation via an inverse rotation. As detailed in Section 2.2.3, the original LiDAR point cloud had been rotated clockwise by the angle $\theta + 90°$; consequently, an inverse rotation by $\theta + 90°$ was applied to the fitted paths using the constructed 2D inverse rotation matrix:

$$R^{-1} = \begin{bmatrix} \cos(\theta + 90°) & -\sin(\theta + 90°) \\ \sin(\theta + 90°) & \cos(\theta + 90°) \end{bmatrix} \tag{22}$$

The left and right dripline path point sets relative to the travel direction extracted in Section 2.2.5 were denoted $P_L = \{(x_i^L, y_i^L)\}$ and $P_R = \{(x_i^R, y_i^R)\}$; the local coordinates after inverse rotation were expressed as

$$P_L' = P_L (R^{-1})^T \tag{23}$$

$$P_R' = P_R (R^{-1})^T \tag{24}$$

Following the above transformation, the coordinates of the left and right dripline paths relative to the travel direction were obtained in the orchard-local coordinate system, as shown in Figure 6.
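The recovery step is a single matrix product; a minimal sketch consistent with the convention of Section 2.2.3 (the sign of the angle is an assumption tied to that convention):

```python
import numpy as np

def restore_path(path_xy, theta):
    """Equations (22)-(24): undo the earlier clockwise rotation by
    (theta + 90 deg), returning the fitted path to the orchard-local frame."""
    psi = theta + np.pi / 2                        # counter-clockwise inverse angle
    r_inv = np.array([[np.cos(psi), -np.sin(psi)],
                      [np.sin(psi),  np.cos(psi)]])
    return path_xy @ r_inv.T
```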

2.3. Field Validation Experiments

2.3.1. Orchard Experimental Protocol

To verify the proposed fruit-tree dripline path detection method, field trials were conducted in August 2025 in a peach orchard at the Peach Industry Technology Research Institute, Pinggu District, Beijing, China. The experiments comprised two parts: (1) a positioning-accuracy test, in which the static accuracy of the handheld RTK and the vehicle-mounted RTK was evaluated, with errors computed after transforming both datasets to the orchard-local coordinate system; and (2) a dripline path detection test, in which the data-collection platform moved at multiple constant speeds between peach trees, the resulting driplines were registered to manually surveyed driplines by linear interpolation along an identical y-coordinate sequence, and the lateral deviation was computed.
Before conducting the experiments, in situ coordinates of the orchard canopy driplines were acquired. To evaluate the accuracy of the dripline extraction algorithm, manual surveying was used to obtain ground-truth dripline coordinates as a reference. A G970II Pro handheld RTK unit was used to perform field measurements in the peach-orchard test area. As shown in Figure 7, an operator carried a long rod fitted with a string and plumb bob and walked slowly along the outer boundary of the tree row, visually estimating the maximal canopy outline, placing one end of the rod near the canopy edge, and allowing the plumb bob to hang to the ground; the touchdown point was taken as a characteristic point on the canopy dripline. The dripline feature points of each tree were manually surveyed with the handheld RTK, with 5–10 samples per tree; in densely planted areas, the number of samples was reduced as appropriate, and connecting all feature points yielded the measured canopy dripline for a tree row. The surveyed dripline coordinates were recorded and stored as latitude–longitude under the WGS-84 datum, then transformed to planar Cartesian coordinates in the orchard-local frame; all measurements were completed under conditions of clear weather and with good satellite reception, ensuring RTK accuracy and stability.
System Positioning-Accuracy Test
In the experiments, the fixed solution of RTK-GNSS was adopted as the reference ground truth for system position; non-fixed solutions (such as differential and float) exhibited large errors and did not meet accuracy requirements, so they were excluded during data processing. A G970II Pro handheld RTK receiver (Beijing UniStrong Science & Technology Co., Ltd., Beijing, China) was used to record latitude–longitude coordinates, with a positioning accuracy of 2 cm and an update rate of 1 Hz. A NAV-L1 RTK-GNSS receiver (Suzhou New Coordinate Intelligent Equipment Co., Ltd., Suzhou, China) was used to record the trajectory ground truth, with a positioning accuracy of 2 cm and an update rate of 1 Hz.
To determine the positioning accuracy and stability of the RTK devices in this study, a static accuracy test was conducted for the handheld RTK and the RTK module mounted on the mobile platform. A G970II Pro handheld RTK receiver and a NAV-L1 onboard RTK receiver (Suzhou New Coordinate Intelligent Equipment Co., Ltd., Suzhou, China) were placed on open, level ground, under clear weather with good satellite reception; the devices were kept stationary while latitude–longitude coordinates at the same location were continuously acquired. More than 500 samples were collected over a 10 min period. The data were statistically analyzed using coordinate transformations; samples were expressed in a local coordinate frame with the initial point as the origin, and the RTK mean error and standard deviation were computed as metrics of positioning accuracy.
Dripline Path Detection Test
A tracked mobile platform equipped with the aforementioned LiDAR and positioning sensors moved slowly at a constant speed between two rows of peach trees; during travel, the onboard LiDAR acquired canopy point-cloud data, while the RTK-GNSS and electronic compass recorded vehicle position and heading in real time, enabling synchronized acquisition of point clouds and pose; the data were then transferred to the dripline path detection system to extract the left and right dripline boundaries and to generate the path. For ease of subsequent description, a top-view schematic of the test orchard and segment labels (A–P) for the driplines were provided to map each segment to its spatial location in the actual orchard. A schematic of the platform’s inter-row trajectory, together with the dripline segment labels (A–P), is shown in Figure 8.
The accuracy of dripline extraction directly determined the overall system accuracy; by fitting precisely extracted dripline coordinates, the system planned a high-precision path consistent with the actual tree distribution, referred to as the “dripline path”. Accordingly, to assess the accuracy and stability of the dripline path detection system, a comparative inter-row path-detection experiment involving multiple tree groups was designed. The platform performed dripline path detection sequentially for four different peach rows at the same speed and, for each row, compared the automatically extracted left- and right-boundary coordinates with the manually surveyed driplines; in addition, to evaluate the effect of travel speed on detection accuracy, driplines were measured in the same orchard environment at three speeds (1 km/h, 3 km/h, and 5 km/h), and the relative error of the driplines was computed as a key metric used to evaluate the system’s detection performance.

2.3.2. Data Processing

In evaluating the accuracies of the handheld and vehicle-mounted RTK units, mean error and standard deviation were used as the primary metrics. The mean error reflects systematic bias and characterizes the accuracy of the positioning results, whereas the standard deviation characterizes random error and reflects the repeatability and stability of the measurements. The calculation formulas were as follows.
$$\bar{E} = \frac{1}{n} \sum_{i=1}^{n} \sqrt{E_{x_i}^2 + E_{y_i}^2} \tag{25}$$

$$\sigma = \sqrt{ \frac{1}{n-1} \sum_{i=1}^{n} \left( E_i - \bar{E} \right)^2 } \tag{26}$$

where, for $n$ observation points, $E_{x_i}$ and $E_{y_i}$ denote the eastward and northward errors at the $i$-th point ($i = 1, 2, \ldots, n$), and $E_i = \sqrt{E_{x_i}^2 + E_{y_i}^2}$ reflects the overall dispersion of the position at that point.
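In NumPy these two metrics reduce to a couple of lines (a sketch; names are illustrative):

```python
import numpy as np

def positioning_metrics(err_east, err_north):
    """Equations (25)-(26): mean horizontal error and its sample standard
    deviation from per-point east/north error components."""
    e = np.hypot(err_east, err_north)    # per-point horizontal error E_i
    return e.mean(), e.std(ddof=1)       # mean error, standard deviation
```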
In the dripline path detection experiment, the system-detected driplines were compared with the manually measured driplines: the detected dripline data were interpolated, and the lateral deviation Δx between the two curves at identical y-coordinates was computed; a schematic of the interpolation is shown in Figure 9. Root-mean-square error and mean absolute error were used as the primary evaluation metrics, with the following formulas:

$$RMSE = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} e_i^2 } \tag{27}$$

$$MAE = \frac{1}{n} \sum_{i=1}^{n} \left| e_i \right| \tag{28}$$

where $e_i = x_{sys,i} - x_{true,i}$ denotes the error at the $i$-th point, $x_{sys,i}$ is the x value of the system-detected fruit-tree dripline, $x_{true,i}$ is the x value of the measured dripline, and $|e_i|$ is the absolute value of that error.
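A minimal sketch of this evaluation, assuming the detected dripline's y-coordinates are ascending so that `np.interp` applies directly:

```python
import numpy as np

def dripline_errors(sys_y, sys_x, true_y, true_x):
    """Equations (27)-(28): interpolate the detected dripline onto the
    surveyed y-coordinates, then compute lateral RMSE and MAE."""
    x_interp = np.interp(true_y, sys_y, sys_x)   # align the two curves in y
    e = x_interp - true_x                        # lateral deviation at each y
    return np.sqrt(np.mean(e ** 2)), np.mean(np.abs(e))
```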

3. Results and Discussion

3.1. System Positioning-Accuracy Test

The handheld RTK positioning-accuracy results are shown in Figure 10a; the handheld unit exhibited excellent positioning accuracy. The overall mean error was 0.028 m and the standard deviation was 0.011 m, indicating small pointwise bias and low dispersion, and meeting the positioning requirements for orchard operations. The vehicle-mounted RTK results are shown in Figure 10b, and reflect an overall mean error of 0.059 m and a standard deviation of 0.018 m. Compared with the handheld unit, the vehicle module exhibited slightly higher overall error, yet maintained centimeter-level static positioning. Statistics from the two devices indicated that the handheld RTK was suitable for acquiring ground-truth driplines, whereas the vehicle-mounted RTK supported path detection and navigation positioning of the mobile platform in the orchard. The system’s positioning accuracy and stability satisfied the application requirements for inter-row navigation and dripline boundary extraction in orchards.

3.2. Dripline Path Detection Experiments

3.2.1. Dripline Path Detection Performance Test

The test platform traveled at a constant speed of 3 km/h between four peach-tree rows, each approximately 86 m in length. The left and right dripline paths relative to the travel direction were generated using the algorithm in Section 2.2; ground-truth driplines were obtained by the method described in Section Dripline Path Detection Test, and linear interpolation enabled comparison of the lateral deviation (Δx) at y-coordinate sequences identical to the system outputs. A comparison between the measured driplines and the system-extracted driplines is shown in Figure 11.
The root-mean-square error, mean absolute error, and maximum absolute error between the system-detected and measured driplines were summarized for each segment, and the results are shown in Table 1. Across the four rows, the overall root-mean-square error between the detected and measured driplines was 0.30 m, the mean absolute error was 0.23 m, and the maximum absolute error (MaxAE) was 1.19 m. The experimental results indicated small discrepancies between the system-detected and measured driplines; across segments of the four peach rows, the RMSE and MAE remained within acceptable ranges. The maximum absolute error of 1.19 m was attributable to localized canopy overlap, high foliage density, and transient sensor occlusion, which were regarded as local anomalies rather than system-wide bias; subjectivity in the manual dripline survey also contributed. From the overall error distribution, peak errors did not persist across segments, indicating that the system exhibited strong robustness and consistency. Therefore, the path fitted from the system-detected driplines retained high practical accuracy, demonstrating that the system effectively extracted driplines, matched the fruit-tree driplines, and could meet the precision requirements of orchard operations in practical applications.

3.2.2. Detection Performance for the Dripline Path at Different Travel Speeds

Comparative results for the measured and system-extracted driplines on the same segment at different travel speeds are shown in Figure 12 and Table 2.
From Table 2, at speeds of 1, 3, and 5 km/h, the RMSEs between the system-detected A–B dripline segment and the measured dripline were 0.23, 0.24, and 0.36 m, respectively. The corresponding MAEs were 0.19, 0.19, and 0.29 m, respectively. The maximum absolute errors were 0.55, 0.75, and 0.92 m, respectively. For the C–D dripline segment, the RMSEs between the detected and measured driplines were 0.26, 0.28, and 0.34 m, respectively. The MAEs were 0.18, 0.19, and 0.22 m, respectively. The maximum absolute errors were 1.19, 1.17, and 1.42 m, respectively. Errors increased with operating speed, indicating that speed affected the detection accuracy for the dripline path.

3.3. Discussion

3.3.1. Discussion of System Positioning Results

The experiments evaluated the positioning accuracy of two RTK-GNSS systems. The results indicated that both systems delivered high positioning performance, meeting the basic accuracy requirements of typical orchard operations. The handheld RTK exhibited an overall mean error of 0.028 m, reflecting high point-position measurement accuracy and suitability for small-area precision tasks with stringent accuracy demands. The vehicle-mounted RTK yielded an overall mean error of 0.059 m, slightly higher than the handheld unit, yet consistent with the accuracy requirements of large-scale mechanized operations. In orchard environments, canopy occlusion and terrain undulation may interfere with GNSS signal quality and thereby affect positioning stability. Accordingly, a multi-sensor fusion strategy was adopted, which markedly enhanced positioning robustness and continuity in orchards and provided more reliable localization support for field equipment.
The positioning accuracy test in this study (Figure 10) was performed with the platform in a static state, aiming to evaluate the intrinsic accuracy of the RTK modules in the absence of motion interference and thereby provide a reliable absolute positional reference for the system. Positioning accuracy during dynamic travel was subject to platform vibration, multipath effects, and the matching between the RTK signal update rate and the motion state. Although this study did not systematically test the standalone positioning accuracy at different speeds, the dripline path detection experiments in Section 3.2.2 demonstrated that the system maintained centimeter-level positional awareness and path generation capability (MAE 0.29 m) even at the maximum travel speed of 5 km/h. This indirectly indicated that through the tightly coupled integration strategy of multiple sensors (IMU/GNSS), the system could effectively sustain the reliability and continuity of positioning information under dynamic operating conditions, fulfilling the engineering requirements for path detection. Future work would specifically design experiments to quantitatively analyze the impact of driving speed on the accuracy of RTK and fusion positioning.

3.3.2. Discussion of Dripline Path Detection Results

The proposed multi-sensor fusion method demonstrated strong accuracy and robustness under complex peach-orchard scenarios. Despite wide canopies, dense foliage, and highly variable edge morphology, the system effectively identified and fitted the projected boundaries to generate navigation-ready driplines. When the speed increased from 1 km/h to 5 km/h, the mean absolute error rose from 0.19 m to 0.29 m. The primary cause was that a 2D LiDAR scan required 40 ms (25 Hz); at higher speed (5 km/h ≈ 1.39 m/s), the platform translated up to 5.6 cm within a single scan frame. Although motion compensation via IMU/GNSS fusion was applied, high-frequency jitter and stretching of the point cloud became pronounced at higher speeds—especially near canopy edges—thereby increasing uncertainty in edge-point extraction and degrading subsequent path-fitting accuracy. Even at 5 km/h, the system maintained a 0.29 m mean absolute error and a 0.36 m root-mean-square error, indicating good adaptability and preserving satisfactory navigation accuracy while supporting operational efficiency.
Existing methods for fruit-tree dripline path detection include image-processing approaches that acquire canopy projection images with a camera [27]. Image-based dripline detection in orchards is susceptible to illumination intensity, shadows, leaf-color variation, and background diversity; when multiple canopies interlace and boundaries are unstable, the risks of false and missed detections increase, reducing robustness. In contrast, the present approach relied primarily on LiDAR point clouds with multi-sensor fusion, did not depend on stable illumination, and enabled reliable canopy-boundary detection under all-weather scenarios.
It is worth noting that a direct quantitative comparison of accuracy metrics with prior works was challenging, as studies specifically focused on dripline extraction were limited and often reported different performance measures or focused on different primary tasks. The quantitative results presented in this study therefore served as a valuable benchmark for this particular task.
Three-dimensional point clouds or SLAM can capture more complete spatial structure, but typically involve higher equipment costs, greater computational and calibration complexity, and more demanding requirements for computing resources and deployment conditions. Rolling 2D LiDAR can also be used to form three-dimensional perception, but this requires additional mechanical actuation and synchronization control, increasing system-integration complexity. By comparison, this method employed a 2D LiDAR with multi-sensor fusion to detect fruit-tree driplines and achieved accuracy sufficient for field operations.
The ground-truth driplines contained errors arising jointly from handheld RTK coordinate error, subjective boundary judgment and plumb-bob landing offset, and canopy motion caused by wind. Evaluation of system dripline accuracy was based on the geometric discrepancy between the system-extracted driplines and the manually surveyed driplines. As described in Section 2.2, the final path output by the system was a smooth curve generated by fitting the extracted discrete dripline feature points with an α-shape-based concave hull algorithm. Consequently, the accuracy of the manual dripline survey also influenced the assessed accuracies of both the system-detected driplines and the final path.
However, the current study has several limitations. All experiments were conducted under favorable daytime weather conditions. The system’s performance in challenging environments such as dense fog, heavy rain, or low-light/nighttime operation has not been verified and may require further adaptation. The method was developed and validated in a specific peach orchard environment with particular tree structures and planting density. Its generalizability to orchards with entirely different architectures—such as elevated trellis systems, wide-row spacing, or irregular layouts—has not been established and necessitates additional testing.

4. Conclusions

A dripline path detection method for fruit trees was proposed that fused 2D LiDAR, RTK-GNSS, and electronic-compass measurements, achieving precise extraction of canopy projection boundaries and generation of paths. Compared with manually surveyed driplines, the system-extracted driplines yielded a mean absolute error of 0.23 m and a root-mean-square error of 0.30 m. At a travel speed of 1 km/h, the mean absolute error was 0.19 m and the root-mean-square error was 0.25 m. At travel speeds of 3 km/h and 5 km/h, the mean absolute errors were 0.19 m and 0.26 m, and the root-mean-square errors were 0.26 m and 0.35 m, respectively. The results indicated that multi-sensor fusion improved the accuracy of dripline path extraction and that the method maintained good adaptability and stability across speeds. The experiments validated the effectiveness of the system and provided an efficient and precise solution for orchard automation.

Author Contributions

Conceptualization, D.W. and W.Z.; methodology, D.W., W.Z., Z.W. and C.Z.; software, D.W.; validation, D.W., W.Z. and C.Z.; formal analysis, W.Z.; investigation, D.W.; resources, C.Z.; data curation, D.W. and X.L.; writing—original draft preparation, D.W.; writing—review and editing, W.Z., C.Z. and Z.W.; visualization, Z.W. and J.W.; supervision, C.Z.; project administration, W.Z.; funding acquisition, C.Z. and W.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by China Agriculture Research System of the Ministry of Finance and the Ministry of Agriculture and Rural Affairs (CARS-30-4-01) and Science and Technology Innovation Special Construction Funded Program of Beijing Academy of Agriculture and Forestry Sciences (KJCX20240509).

Data Availability Statement

The data presented in this study are available upon request from the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Su, Z.; Zou, W.; Zhai, C.; Tan, H.; Yang, S.; Qin, X. Design of an Autonomous Orchard Navigation System Based on Multi-Sensor Fusion. Agronomy 2024, 14, 2825. [Google Scholar] [CrossRef]
  2. Xie, S. Precautions for Fertilizing Fruit Trees. Fujian Agric. 2012, 10, 11. [Google Scholar]
  3. Jiang, A.; Ahamed, T. Navigation of an Autonomous Spraying Robot for Orchard Operations Using LiDAR for Tree Trunk Detection. Sensors 2023, 23, 4808. [Google Scholar] [CrossRef]
  4. Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard Mapping and Mobile Robot Localisation Using On-Board Camera and Laser Scanner Data Fusion–Part B: Mapping and Localisation. Comput. Electron. Agric. 2015, 119, 267–278. [Google Scholar] [CrossRef]
  5. Zhou, Y.; Wang, X.; Wang, Z.; Ye, Y.; Zhu, F.; Yu, K.; Zhao, Y. Rolling 2D Lidar-Based Navigation Line Extraction Method for Modern Orchard Automation. Agronomy 2025, 15, 816. [Google Scholar] [CrossRef]
  6. Brown, J.; Paudel, A.; Biehler, D.; Thompson, A.; Karkee, M.; Grimm, C.; Davidson, J.R. Tree detection and in-row localization for autonomous precision orchard management. Comput. Electron. Agric. 2024, 227, 109454. [Google Scholar] [CrossRef]
  7. Han, W.; Gu, Q.; Gu, H.; Xia, R.; Gao, Y.; Zhou, Z.; Luo, K.; Fang, X.; Zhang, Y. Design of Chili Field Navigation System Based on Multi-Sensor and Optimized TEB Algorithm. Agronomy 2024, 14, 2872. [Google Scholar] [CrossRef]
  8. Zhang, M.; Ji, Y.; Li, S.; Cao, R.; Xu, H.; Zhang, Z. Research Progress of Agricultural Machinery Navigation Technology. Trans. Chin. Soc. Agric. Mach. 2020, 51, 1–18. [Google Scholar]
  9. Zhang, S.; Liu, Y.; Xiong, K. A Review of Vision-Based Crop Row Detection Method: Focusing on Field Ground Autonomous Navigation Operations. Comput. Electron. Agric. 2024, 222, 109086. [Google Scholar] [CrossRef]
  10. Shi, J.; Bai, Y.; Diao, Z.; Zhou, J.; Yao, X.; Zhang, B. Row Detection BASED Navigation and Guidance for Agricultural Robots and Autonomous Vehicles in Row-Crop Fields: Methods and Applications. Agronomy 2023, 13, 1780. [Google Scholar] [CrossRef]
  11. de Silva, R.; Cielniak, G.; Wang, G.; Gao, J. Deep Learning-Based Crop Row Detection for Infield Navigation of Agri-Robots. J. Field Robot. 2024, 10, 2299–2321. [Google Scholar] [CrossRef]
  12. Song, Y.; Xu, F.; Yao, Q.; Liu, J.; Yang, S. Navigation Algorithm Based on Semantic Segmentation in Wheat Fields Using an RGB-D Camera. Inf. Process. Agric. 2023, 10, 475–490. [Google Scholar] [CrossRef]
  13. Li, B.; Li, D.; Wei, Z.; Wang, J. Rethinking the Crop Row Detection Pipeline: An End-to-End Method for Crop Row Detection Based on Row-Column Attention. Comput. Electron. Agric. 2024, 225, 109264. [Google Scholar] [CrossRef]
  14. Li, D.; Li, B.; Kang, S.; Feng, H.; Long, S.; Wang, J. E2CropDet: An Efficient End-to-End Solution to Crop Row Detection. Expert Syst. Appl. 2023, 227, 120345. [Google Scholar] [CrossRef]
  15. Mendez, E.; Piña Camacho, J.; Escobedo Cabello, J.A.; Gómez-Espinosa, A. Autonomous Navigation and Crop Row Detection in Vineyards Using Machine Vision with 2D Camera. Automation 2023, 4, 309–326. [Google Scholar] [CrossRef]
  16. Fu, D.; Chen, Z.; Yao, Z.; Liang, Z.; Cai, Y.; Liu, C.; Tang, Z.; Lin, C.; Feng, X.; Qi, L. Vision-Based Trajectory Generation and Tracking Algorithm for Maneuvering of a Paddy Field Robot. Comput. Electron. Agric. 2024, 226, 109368. [Google Scholar] [CrossRef]
  17. Khan, M.N.; Rahi, A.; Rajendran, V.P.; Hasan, M.A.; Anwar, S. Real-Time Crop Row Detection Using Computer Vision-Application in Agricultural Robots. Front. Artif. Intell. 2024, 7, 1435686. [Google Scholar] [CrossRef]
  18. Åstrand, B.; Baerveldt, A. A vision-based row-following system for agricultural field machinery. Mechatronics 2005, 15, 251–269. [Google Scholar] [CrossRef]
  19. Zhao, R.; Yuan, X.; Yang, Z.; Zhang, L. Image-based Crop Row Detection Utilizing the Hough Transform and DBSCAN Clustering Analysis. IET Image Process. 2024, 18, 1161–1177. [Google Scholar] [CrossRef]
  20. Zhang, C.; Yong, L.; Chen, Y.; Zhang, S.; Ge, L.; Wang, S.; Li, W. A Rubber-Tapping Robot Forest Navigation and Information Collection System Based on 2D LiDAR and a Gyroscope. Sensors 2019, 19, 2136. [Google Scholar] [CrossRef]
  21. Zhang, S.; Guo, C.; Gao, Z.; Sugirbay, A.; Chen, J.; Chen, Y. Research on 2D Laser Automatic Navigation Control for Standardized Orchard. Appl. Sci. 2020, 10, 2763. [Google Scholar] [CrossRef]
  22. Li, H.; Huang, K.; Sun, Y.; Lei, X.; Yuan, Q.; Zhang, J.; Lv, X. An Autonomous Navigation Method for Orchard Mobile Robots Based on Octree 3D Point Cloud Optimization. Front. Plant Sci. 2025, 15, 1510683. [Google Scholar] [CrossRef]
  23. Xia, Y.; Lei, X.; Pan, J.; Chen, L.; Zhang, Z.; Lv, X. Research on Orchard Navigation Method Based on Fusion of 3D SLAM and Point Cloud Positioning. Front. Plant Sci. 2023, 14, 1207742. [Google Scholar] [CrossRef] [PubMed]
  24. He, B.; Liu, G.; Ji, Y.; Si, Y.; Gao, R. Auto recognition of navigation path for harvest robot based on machine vision. In Computer and Computing Technologies in Agriculture IV (CCTA 2010); IFIP AICT; Springer: Berlin/Heidelberg, Germany, 2011; Volume 344, pp. 138–148. [Google Scholar]
  25. Torres-Sospedra, J.; Nebot, P. A New Approach to Visual-Based Sensory System for Navigation into Orange Groves. Sensors 2011, 11, 4086–4103. [Google Scholar] [CrossRef] [PubMed]
  26. Sharifi, M.; Chen, X. A novel vision-based row guidance approach for navigation of agricultural mobile robots in orchards. In Proceedings of the 2015 6th International Conference on Automation, Robotics and Applications (ICARA), Queenstown, New Zealand, 17–19 February 2015. [Google Scholar]
  27. Heng, W.; Weiyi, Z.; Chengsong, L.; Pei, W.; Lihong, W. Research on the navigation path detection method of fruit tree drop line based on image processing. J. Chin. Agric. Mech. 2023, 44, 183–190. [Google Scholar]
  28. Li, G.; Bao, H.; Xu, C. Real-time Road Edge Extraction Algorithm Based on 3D-Lidar. Comput. Sci. 2018, 45, 294–298. [Google Scholar]
  29. Wang, G.; Wu, J.; He, R.; Yang, S. A Point Cloud-Based Robust Road Curb Detection and Tracking Method. IEEE Access 2019, 7, 24611–24625. [Google Scholar] [CrossRef]
  30. Zhang, Y.; Wang, J.; Wang, X.; Dolan, J.M. Road-Segmentation-Based Curb Detection Method for Self-Driving via a 3D-LiDAR Sensor. IEEE Trans. Intell. Transp. Syst. 2018, 19, 3981–3991. [Google Scholar] [CrossRef]
  31. Wang, W.; Qin, J.; Huang, D.; Zhang, F.; Liu, Z.; Wang, Z.; Yang, F. Integrated Navigation Method for Orchard-Dosing Robot Based on LiDAR/IMU/GNSS. Agronomy 2024, 14, 2541. [Google Scholar] [CrossRef]
Figure 1. System hardware diagram.
Figure 2. System software implementation flowchart.
Figure 3. Schematic of WGS-84 coordinate system and topocentric (horizon) coordinate system.
Figure 4. Comparison of preprocessing results for the rotation-normalized point cloud (with local magnifications): (a) raw data; (b) ground-point removal (height threshold 0.5 m); (c) statistical filtering; (d–f) local magnifications of (a–c), respectively.
Figure 5. Schematic of key steps and results for dripline extraction: (a) two-dimensional point cloud after removing z-axis (x–y plane, local coordinate frame); (b) extraction of edge points on both sides of tree rows based on segmented extrema along the y direction; and (c) fitted left and right dripline paths along the direction of travel, using α-shape-based concave hull algorithm.
Figure 6. Comparison of coordinate recovery by inverse rotation for dripline paths: (a) processed driplines in the rotation-normalized coordinate frame (before inverse rotation); (b) processed driplines after inverse rotation, restored to original local coordinate system.
Figure 7. Schematic of manual measurement of fruit-tree canopy driplines: (a) top view in which one end of the long rod is positioned at the canopy edge, and the touchdown point of the string and plumb bob defines the ground-projection point of dripline; (b) a side view that shows the relative position between rod height and canopy edge and the vertical touchdown point of the plumb bob.
Figure 8. Schematic of platform's inter-row trajectory and labeled dripline segments (A–P) in the orchard.
Figure 9. Schematic of interpolation method.
Figure 10. Results of RTK positioning-accuracy tests: (a) handheld RTK; (b) vehicle-mounted RTK.
Figure 11. Comparison between measured fruit-tree driplines and system-extracted driplines.
Figure 12. Comparison between measured fruit-tree driplines and system-extracted driplines at different travel speeds.
Table 1. Error analysis for different dripline segments.

Dripline Segment | RMSE (m) | MAE (m) | MaxAE (m)
A–B | 0.24 | 0.19 | 0.75
C–D | 0.28 | 0.19 | 1.17
E–F | 0.35 | 0.29 | 0.97
G–H | 0.28 | 0.21 | 0.89
I–J | 0.35 | 0.27 | 0.97
K–L | 0.26 | 0.19 | 0.97
M–N | 0.33 | 0.26 | 1.19
O–P | 0.33 | 0.25 | 0.84
Table 2. Error analysis for A–B and C–D dripline segments at different travel speeds.

Dripline Segment | Speed (km/h) | RMSE (m) | MAE (m) | MaxAE (m)
A–B | 1 | 0.23 | 0.19 | 0.55
A–B | 3 | 0.24 | 0.19 | 0.75
A–B | 5 | 0.36 | 0.29 | 0.92
C–D | 1 | 0.26 | 0.18 | 1.19
C–D | 3 | 0.28 | 0.19 | 1.17
C–D | 5 | 0.34 | 0.22 | 1.42
