Article

Design of an Autonomous Orchard Navigation System Based on Multi-Sensor Fusion

1 College of Mechanical Engineering, Guangxi University, Nanning 530004, China
2 Intelligent Equipment Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
3 Information Technology Research Center, Beijing Academy of Agriculture and Forestry Sciences, Beijing 100097, China
4 Beijing PAIDE Science and Technology Development Co., Ltd., Beijing 100097, China
* Author to whom correspondence should be addressed.
† These authors contributed equally to this work.
Agronomy 2024, 14(12), 2825; https://doi.org/10.3390/agronomy14122825
Submission received: 24 October 2024 / Revised: 21 November 2024 / Accepted: 26 November 2024 / Published: 27 November 2024
(This article belongs to the Section Precision and Digital Agriculture)

Abstract

To address the limitations of traditional GNSS-based navigation systems in orchard environments, we propose a multi-sensor fusion-based autonomous navigation method for orchards. A crawler-type agricultural platform was used as a test vehicle, and an autonomous orchard navigation system was constructed using a 2D LiDAR, a dynamic electronic compass, and an encoder. The proposed system first filters LiDAR point cloud data and uses the DBSCAN–ratio–threshold method to process data and identify clusters of tree trunks. By matching the center coordinates of trunk clusters with a fruit tree distribution map, the platform’s positional measurements are determined. An extended Kalman filter fusion algorithm is then employed to obtain a posterior estimate of the platform’s position and pose. Experimental results demonstrate that in localization accuracy tests and navigation tests, the proposed system provides high navigation accuracy and robustness, making it suitable for autonomous walking operations in orchard environments.

1. Introduction

With its orchard planting area reaching 12.808 million hectares, China currently leads the world in both fruit cultivation acreage and production volume [1]. To reduce costs, orchards are shifting toward large-scale and standardized development. Orchard construction is increasingly focusing on the integration of agronomy and agricultural machinery, providing favorable conditions for the promotion and application of intelligent equipment in orchard operations. Orchard production involves multiple stages and a high demand for labor. Taking apple orchards as an example, labor costs account for more than 60% of the total production cost [1]. Therefore, it is urgent to develop simple, lightweight, and intelligent orchard machinery to replace manual labor. Autonomous navigation is not only a common support technology for orchard platforms performing tasks such as spraying, fertilizing, and harvesting but also a core technology for the intelligent operation of such equipment.
The localization of operation platforms is a prerequisite for achieving autonomous navigation, which relies on the efficient acquisition of position and pose information by the navigation system. To achieve position and pose acquisition for operation platforms, the primary technologies currently used include 2D LiDAR [2,3], 3D LiDAR [4], cameras [5], and GNSS [6,7,8]. Compared with LiDAR and cameras, GNSS is a relatively mature technology for localization and navigation. Real-time kinematic positioning (RTK)-GNSS can achieve centimeter-level localization accuracy in open environments. However, in orchard environments, when tree canopies or other objects heavily obstruct signals, RTK-GNSS may experience reduced localization accuracy or even localization failure [9]. Both cameras and LiDAR can capture rich environmental information; however, cameras are strongly affected by lighting conditions, whereas LiDAR offers all-weather environmental data acquisition. Many domestic and foreign scholars have studied LiDAR-based orchard navigation methods. To achieve the localization of operating platforms, the conventional approach is to obtain environmental information within the operation area using LiDAR, establish a prior map for autonomous navigation based on this information, and then match the environmental information sensed by the operating platform with the prior map. Navigation systems based on 3D LiDAR require extensive storage space and high computational power to construct prior maps due to the large amount of point cloud data generated, leading to higher equipment costs. In contrast, 2D LiDAR offers advantages such as compact size, easier integration, and lower computational demands.
Autonomous navigation in orchards with 2D LiDAR relies heavily on navigation line extraction. Higuti et al. [10] employed the least squares method to extract navigation lines from filtered 2D LiDAR point cloud data. Their system achieved total travel distances of 1115 m, 1022 m, and 660 m in two cornfields and one sorghum field, indicating the promising prospects of LiDAR-based navigation in outdoor environments. However, their approach was limited at headlands owing to insufficient environmental information, which made turning difficult and restricted its practical application in real-world scenarios. Malavazi et al. [11] developed an autonomous navigation method that does not rely on prior crop information, using an improved PEARL algorithm to extract navigation lines; however, this method was also limited in its turning function. Jiang et al. [12] used density-based spatial clustering of applications with noise (DBSCAN) and K-means to classify 2D LiDAR point cloud data, employed the random sample consensus (RANSAC) method to extract navigation lines, and adjusted the vehicle's steering angle during the turning stage based on the distance between the LiDAR and the tree trunks to realize the turning function. Andersen et al. [13] used the least squares method to extract navigation lines from LiDAR data and distinguished fruit tree rows from headlands by exploiting the higher point cloud density within tree rows; they used GPS results as the ground truth to evaluate navigation performance. The navigation system built by Bergerman et al. [14] used the least squares method to extract navigation lines and an extended Kalman filter to solve the pose estimation problem. The system was tested over 350 km, and the results showed that a vehicle with autonomous navigation can effectively improve production efficiency. Libby et al. [15] used the Hough transform [16,17] to extract navigation lines from LiDAR data. Tests over more than 20 km showed that their system navigated well, but reflective bands were needed to assist navigation at headlands. Barawid et al. [18] also used the Hough transform to identify tree rows; because its output contained noise that would destabilize steering control, they applied a linear time-series general auto-regression method to the Hough transform output to minimize the lateral error. For visual navigation of unmanned ground vehicles (UGVs), Zhou et al. [19] proposed a method combining the Hough matrix and RANSAC [20] to extract the navigation path, noting that the traditional Hough transform and least squares method are difficult to apply to outdoor visual navigation. Their experiments showed path acquisition accuracies of 90.36–96.81%, but the method had poor real-time performance, requiring more than 0.55 s per processing cycle. Guyonneau et al. [21] used the PEARL/Ruby algorithm [22] as the navigation line search algorithm; in offline simulations, this method showed good robustness to weeds, uneven tree rows, and other disturbances, although its final navigation performance in the field remained unclear.
Additional research has focused on recognizing trees and obstacles [23,24], the motion control of operation platforms [25,26], and the establishment of 2D LiDAR rangefinder models [27,28]. In most of the above studies, autonomous navigation depended on the accurate identification and extraction of crop row features. However, a key limitation is that a platform can only acquire distance information relative to the crop rows on either side, resulting in low localization accuracy. Shalal et al. [29] constructed a fruit tree distribution map of an operation area using 2D LiDAR and camera sensors and combined 2D LiDAR, camera, and IMU data to localize their platform and achieve autonomous navigation. This method requires only a single mapping process for the fruit tree distribution, after which the map can be reused for navigation. However, the inclusion of visual information in the navigation system introduces redundancy, wasting system resources. Additionally, the use of an IMU leads to cumulative errors in azimuth measurements, which increase the system's azimuth error. In contrast, a dynamic electronic compass does not accumulate error during azimuth measurement; it can provide long-term stable azimuth measurements and improve the accuracy and performance of autonomous navigation systems.
To address the redundancy in multi-sensor fusion navigation systems involving 2D LiDAR and achieve high navigation accuracy, we propose an autonomous navigation method that integrates 2D LiDAR and electronic compass data. The proposed method uses RTK-GNSS to construct a fruit tree distribution map of the operation area, filters LiDAR point cloud data using a pass-through filter, and applies the DBSCAN–ratio–threshold method to identify tree trunk clusters. The platform’s position is determined by matching the center coordinates of the trunk clusters with the fruit tree distribution map. An extended Kalman filter fusion algorithm is employed to obtain a posterior estimate of the platform’s position and pose. Finally, fuzzy control is used to achieve motion control and enable the platform’s autonomous navigation.

2. Materials and Methods

2.1. System Hardware Components

The hardware architecture of the proposed autonomous orchard navigation system is presented in Figure 1, and the actual setup is depicted in Figure 2. The main hardware components included a dynamic electronic compass, 2D LiDAR, an industrial control computer, and a crawler-type mobile platform. The system functions as follows: the sensor system, which consists of the electronic compass, 2D LiDAR, and encoder, transmits measurement data to the industrial control computer for state estimation, and the computed motion control quantities are then sent to the motor driver to execute the platform's motion control. The dynamic electronic compass used in this system is the DFEC900 [30] model manufactured by Dfwee Co., Ltd., Wuhan, China. It has a data output frequency of 50 Hz, an azimuth accuracy of ≤0.5°, and an angular resolution of 0.01°; it supports USART communication and allows both hard and soft iron calibration. The 2D LiDAR is the LMS111 [31] model from SICK AG, Waldkirch, Germany, which has a scanning radius of 20 m, a scanning range of −45° to 225°, a scanning frequency of 25 Hz, and a scanning angular resolution of 0.25°. The industrial control computer, produced by Shanghai Senke Electronic Technology Co., Ltd. (Shanghai, China), has a CPU clock frequency of 2.0 GHz, runs Windows 10, and is mounted on the upper support frame of the crawler chassis. The crawler chassis is manufactured by Shandong Qingke Intelligent Technology Co., Ltd., Jinan, China; it is 100 cm long and 80 cm wide, with a maximum driving speed of 4 km/h.

2.2. System Software Implementation

The software implementation and architecture of the proposed autonomous navigation method are illustrated in Figure 3. The main steps are as follows: (1) Collect sensor data and the fruit tree distribution map of the orchard; (2) Extract the center coordinates of trunk clusters from the LiDAR data; (3) Perform a coordinate transformation of the extracted trunk cluster center coordinates into the orchard’s local coordinate system; (4) Match the transformed trunk cluster center coordinates with the fruit tree distribution map to obtain the positional measurements of the platform at the current moment; (5) Apply the extended Kalman filter to obtain a posterior estimate of the platform’s current position and pose; (6) Complete the motion control of the platform using a fuzzy control method. The system’s state estimation and motion control programs run on Windows 10 using MATLAB R2019a and are executed as MATLAB scripts.
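To make the data flow of steps (1)–(6) concrete, the sketch below outlines one iteration of the state estimation and control loop in Python. It is an illustrative reconstruction, not the authors' MATLAB code; all function and object names (read_lidar, extract_trunk_centers, match_to_map, and so on) are hypothetical placeholders for the components described in Sections 2.2.1–2.2.6.

```python
# Illustrative sketch of one iteration of the navigation loop, steps (1)-(6).
# All names below are hypothetical placeholders; the authors' implementation
# ran as MATLAB scripts on the industrial control computer.
def navigation_step(sensors, tree_map, ekf, controller):
    scan = sensors.read_lidar()                  # (1) raw 2D LiDAR frame
    heading = sensors.read_compass()             # (1) azimuth from electronic compass
    odometry = sensors.read_encoder()            # (1) left/right track displacements

    centers = extract_trunk_centers(scan)        # (2) DBSCAN-ratio-threshold
    prior = ekf.predict(odometry)                # prior pose from the odometry model
    centers_local = to_local_frame(centers, prior, heading)   # (3)

    matches = match_to_map(centers_local, tree_map)           # (4) map matching
    position_meas = solve_position(matches, prior)            # (4) position measurement
    pose = ekf.update(position_meas, heading)                 # (5) posterior estimate

    left_v, right_v = controller.fuzzy_control(pose)          # (6) fuzzy motion control
    sensors.set_track_speeds(left_v, right_v)
    return pose
```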

2.2.1. Establishment of the Orchard’s Local Coordinate System and Fruit Tree Distribution Map

Prior to beginning navigation, the positional information of the fruit trees must be acquired to construct the orchard’s fruit tree distribution map. The positional information of the fruit trees can be obtained by measuring their spherical coordinates in the WGS-84 coordinate system. All navigation calculations were performed in the local coordinate system, which required the establishment of a local coordinate system and the definition of a transformation relationship between WGS-84 and local coordinates.
The local coordinate system was defined as a horizontal coordinate system with the origin O_L located at a point on the local reference ellipsoid's surface. The X_L axis points eastward along the reference ellipsoid's prime vertical, the Y_L axis points toward the geographic north pole along the reference ellipsoid's meridian, and the Z_L axis points outward along the outer normal of the ellipsoid's surface toward the zenith, as shown in Figure 4. The transformation between WGS-84 spherical coordinates and WGS-84 Cartesian coordinates is defined by Equation (1).
$$\begin{bmatrix} x_e \\ y_e \\ z_e \end{bmatrix} = \begin{bmatrix} (R_N + h)\cos\lambda\cos\varphi \\ (R_N + h)\sin\lambda\cos\varphi \\ \left(R_N(1 - e^2) + h\right)\sin\varphi \end{bmatrix} \tag{1}$$
Here, $x_e, y_e, z_e$ are the WGS-84 Cartesian coordinates of any point on the Earth's surface; $\lambda$, $\varphi$, and $h$ are the geographic longitude, latitude, and altitude of that point, respectively; $R_N$ is the radius of curvature in the prime vertical; and $e$ is the first eccentricity of the ellipsoid. The transformation between WGS-84 Cartesian coordinates and the local coordinate system is defined by Equation (2).
$$\begin{bmatrix} X_e \\ Y_e \\ Z_e \end{bmatrix} = \begin{bmatrix} -\sin\lambda & -\cos\lambda\sin\varphi & \cos\lambda\cos\varphi \\ \cos\lambda & -\sin\lambda\sin\varphi & \sin\lambda\cos\varphi \\ 0 & \cos\varphi & \sin\varphi \end{bmatrix} \begin{bmatrix} X_L \\ Y_L \\ Z_L \end{bmatrix} + \begin{bmatrix} X_{oe} \\ Y_{oe} \\ Z_{oe} \end{bmatrix} \tag{2}$$
Here, $X_L, Y_L, Z_L$ and $X_e, Y_e, Z_e$ are the horizontal and WGS-84 Cartesian coordinates of a point, respectively; $X_{oe}, Y_{oe}, Z_{oe}$ are the WGS-84 Cartesian coordinates of the origin of the local horizontal coordinate system; and $\lambda$ and $\varphi$ are the geographic longitude and latitude of that origin, respectively. The signs of $\lambda$ and $\varphi$ are adjusted based on the hemisphere in which the point is located.
By defining a point as the origin of the horizontal coordinate system, a local coordinate system can be established with this point as its origin. Once the WGS-84 spherical coordinates of any other point are obtained, they can be converted into WGS-84 Cartesian coordinates. By using the transformation relationship between the WGS-84 Cartesian coordinates and horizontal coordinates, the horizontal coordinates of the point in the defined local coordinate system can be determined, thereby enabling the conversion of WGS-84 spherical coordinates into local coordinates.
Before implementing navigation functionality, any point within the orchard can be selected as the origin of the local coordinate system to establish a local coordinate system. By acquiring the spherical coordinates of the fruit tree trunks in the WGS-84 coordinate system, the local coordinates of the fruit tree trunks in the established local coordinate system can be obtained based on the transformation relationship between the WGS-84 spherical coordinates and local coordinates. This process enables the construction of a fruit tree distribution map for the orchard.
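As a concrete illustration of Equations (1) and (2), the following Python sketch converts geodetic coordinates to WGS-84 Cartesian (ECEF) coordinates and then into a local east-north-up frame anchored at a chosen origin. The WGS-84 constants are standard values; the function names and the use of radians for the input angles are our assumptions.

```python
import numpy as np

# Standard WGS-84 ellipsoid constants
A = 6378137.0               # semi-major axis [m]
E2 = 6.69437999014e-3       # first eccentricity squared

def geodetic_to_ecef(lat, lon, h):
    """Equation (1): geodetic coordinates (rad, rad, m) -> WGS-84 Cartesian (ECEF)."""
    rn = A / np.sqrt(1.0 - E2 * np.sin(lat) ** 2)   # prime-vertical radius R_N
    x = (rn + h) * np.cos(lat) * np.cos(lon)
    y = (rn + h) * np.cos(lat) * np.sin(lon)
    z = (rn * (1.0 - E2) + h) * np.sin(lat)
    return np.array([x, y, z])

def ecef_to_local(p_ecef, lat0, lon0, h0):
    """Inverse of Equation (2): ECEF -> local east-north-up coordinates."""
    origin = geodetic_to_ecef(lat0, lon0, h0)
    sl, cl = np.sin(lon0), np.cos(lon0)
    sp, cp = np.sin(lat0), np.cos(lat0)
    # Rotation matrix of Equation (2); its transpose inverts it (orthogonal matrix)
    r = np.array([[-sl, -cl * sp, cl * cp],
                  [ cl, -sl * sp, sl * cp],
                  [0.0,       cp,      sp]])
    return r.T @ (p_ecef - origin)
```

With a surveyed origin (lat0, lon0, h0), applying ecef_to_local(geodetic_to_ecef(lat, lon, h), lat0, lon0, h0) to each surveyed trunk position yields the fruit tree distribution map in local coordinates.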

2.2.2. Identification of Fruit Tree Trunks and Acquisition of Trunk Cluster Center Coordinates

When performing state estimation of the platform’s position and pose through multi-source sensor fusion, the first step was to identify and extract fruit tree trunk information from the LiDAR scan results. This process involved determining the distance from the center of each fruit tree trunk to the LiDAR scanning center, as well as the azimuth of the trunk centers in the LiDAR coordinate system.
As the measurement distance increased, the measurement error of the LiDAR also increased, and interference in the scan results became more prominent. Therefore, prior to processing the LiDAR scan results, a pass-through filter was applied, and data outside the filtered region were discarded. After applying the pass-through filter, the DBSCAN algorithm was used to remove noise points from the filtered data and extract preliminary trunk clusters. Based on the characteristics of the DBSCAN algorithm, the processed output included not only identified noise points but also several clusters with distinct features. For example, because the LiDAR may scan the ground at an inclined angle, the output of the DBSCAN-processed LiDAR data could include ground clusters with linear features. To filter out ground clusters and extract only trunk clusters, the results processed by DBSCAN were further analyzed based on the arc-shaped characteristics of trunk clusters.
Let the scan result of a single LiDAR frame be denoted as $(r_i, \theta_i)$, indicating that the $i$-th point is located at polar radius $r_i$ and polar angle $\theta_i$ in the LiDAR polar coordinate system. Here, $i = 1, 2, 3, \ldots, m$, where $m$ is determined by the LiDAR's scanning range and angular resolution. The DBSCAN-processed LiDAR scan result is denoted as $(r_i^{(j)}, \theta_i^{(j)})$, indicating that the $i$-th point is located at polar radius $r_i$ and polar angle $\theta_i$ in the LiDAR polar coordinate system and belongs to the $j$-th cluster. Here, $j \in \{-1, 1, 2, 3, \ldots, n\}$, where $j = -1$ indicates that the $i$-th point is a noise point, $n$ represents the number of clusters in the scan frame, and $n > 0$. Let $class(i) = j$ indicate that the $i$-th point belongs to the $j$-th cluster. With $i$ and $j$ defined as above, we define the following function:
$$\delta(x, y) = \begin{cases} 1, & x = y \\ 0, & x \neq y \end{cases} \quad (x = 1, 2, 3, \ldots;\ y = -1, 1, 2, 3, \ldots) \tag{3}$$
therefore,
$$R(k) = \frac{\sum_{i=1}^{m} \delta\left(k, class(i)\right) r_i}{\sum_{i=1}^{m} \delta\left(k, class(i)\right)}, \quad k = 1, 2, 3, \ldots, n \tag{4}$$
$$\Theta(k) = \frac{\sum_{i=1}^{m} \delta\left(k, class(i)\right) \theta_i}{\sum_{i=1}^{m} \delta\left(k, class(i)\right)}, \quad k = 1, 2, 3, \ldots, n \tag{5}$$
Let $R(k)$ denote the polar radius and $\Theta(k)$ the polar angle of the $k$-th cluster center in the LiDAR polar coordinate system. A circular area is defined with the point $(R(k), \Theta(k))$ as its center and $radius\_class$ as its radius. When the ratio of the points of the $k$-th cluster that fall within this circular area to the total number of points in the $k$-th cluster exceeds a set threshold, as shown in Equation (6), the $k$-th cluster is considered a trunk cluster.
$$ratio\_inCircle = \frac{num(inCircle\_k)}{num(Class\_k)} > setValue \tag{6}$$
Here, $num(inCircle\_k)$ represents the number of points in the $k$-th cluster that fall within the circular area, $num(Class\_k)$ is the total number of points in the $k$-th cluster, and $(R(k), \Theta(k))$ are the center coordinates of the trunk cluster. The value of $radius\_class$ can be set based on the average radius of the fruit tree trunks in the orchard.
$$d_q(i) = \frac{r_i - r_{i+1}}{LiDAR\_AngularResolution} \tag{7}$$
Using Equation (7), the fruit tree trunk information in a single frame of LiDAR scan results can be preliminarily extracted, where $r_i$ represents the polar radius of the $i$-th point in the LiDAR polar coordinate system and $LiDAR\_AngularResolution$ represents the angular resolution of the LiDAR. It is important to note that, because of the LiDAR's measurement error, additional detection methods are required to verify the fruit tree trunk information.
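The following sketch illustrates the DBSCAN–ratio–threshold test of Equations (3)–(6) using scikit-learn's DBSCAN. The parameter values (eps, min_samples, radius_class, set_value) are illustrative assumptions, and the circular area of Equation (6) is interpreted in the Cartesian plane; the paper's own implementation may differ in these details.

```python
import numpy as np
from sklearn.cluster import DBSCAN

def find_trunk_clusters(r, theta, eps=0.15, min_samples=5,
                        radius_class=0.10, set_value=0.8):
    """Apply the DBSCAN-ratio-threshold test (Eqs. (3)-(6)) to one scan frame.

    r, theta : numpy arrays of polar radii [m] and angles [rad] of the
               pass-through-filtered scan points
    returns  : list of (R_k, Theta_k) centers accepted as trunk clusters
    """
    # Cluster in Cartesian coordinates; DBSCAN labels noise points as -1
    xy = np.column_stack((r * np.cos(theta), r * np.sin(theta)))
    labels = DBSCAN(eps=eps, min_samples=min_samples).fit_predict(xy)

    trunks = []
    for k in set(labels) - {-1}:
        mask = labels == k
        r_k, theta_k = r[mask].mean(), theta[mask].mean()     # Eqs. (4), (5)
        center = np.array([r_k * np.cos(theta_k), r_k * np.sin(theta_k)])
        # Eq. (6): fraction of the cluster's points falling inside the
        # circle of radius radius_class around the cluster center
        in_circle = np.linalg.norm(xy[mask] - center, axis=1) <= radius_class
        if in_circle.mean() > set_value:     # compact, arc-shaped -> trunk
            trunks.append((r_k, theta_k))
    return trunks
```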

2.2.3. Coordinate Transformation of Trunk Cluster Centers into the Orchard’s Local Coordinate System

The center coordinates of the trunk clusters obtained as described above were expressed in the LiDAR polar coordinate system and must be transformed into the orchard's local coordinate system. First, the trunk cluster center coordinates were converted into the LiDAR Cartesian coordinate system. Then, using the coordinate transformation method, these coordinates were transformed into the orchard's local coordinate system, in which the fruit tree distribution map is expressed. The rotation angle required for the transformation was measured by the dynamic electronic compass, while the translation was determined from the prior estimate of the LiDAR scanning center's current position. Let the center of the platform be the LiDAR scanning center, and define the platform's state vector as $v = [x, y, \phi]^T$, where $(x, y)$ are the Cartesian coordinates of the LiDAR scanning center in the orchard's local coordinate system and $\phi$ is the azimuth angle of the operating platform in that system. The azimuth is defined as the angle between the 90° radial direction of the LiDAR polar coordinate system and the positive $x$ axis of the orchard's local coordinate system, measured as positive counterclockwise and negative clockwise. The prior estimate of the platform's position and pose at time $k$ is denoted $v_k^- = [x_k^-, y_k^-, \phi_k^-]^T$, and the posterior estimate at time $k$ is denoted $v_k^+ = [x_k^+, y_k^+, \phi_k^+]^T$.
From the odometry model, we derive Equations (8) and (9).
$$\Delta D(k) = \frac{\Delta d_L + \Delta d_R}{2}, \qquad \Delta\phi(k) = \frac{\Delta d_R - \Delta d_L}{a} \tag{8}$$
$$R_{rotate}(k) = \frac{\Delta D(k)}{\Delta\phi(k)} \tag{9}$$
Here, $\Delta d_L$ and $\Delta d_R$ are the distances traveled by the left and right tracks of the platform from time $k-1$ to time $k$, respectively, and $a$ is the distance between the left and right tracks. The radius of the travel trajectory from time $k-1$ to time $k$, denoted $R_{rotate}(k)$, determines whether the arc or straight-line model is used in the odometry calculation. The prior estimate of the platform's position and pose at time $k$ using the arc model is expressed as
$$v_k^- = f\left(v_{k-1}^+, u_k\right) = \begin{bmatrix} x_{k-1}^+ + \Delta x(k) \\ y_{k-1}^+ + \Delta y(k) \\ \phi_{k-1}^+ + \Delta\phi(k) \end{bmatrix} = \begin{bmatrix} x_{k-1}^+ + \dfrac{\Delta D(k)}{\Delta\phi(k)} \left[ \sin\left(\phi_{k-1}^+ + \Delta\phi(k)\right) - \sin\phi_{k-1}^+ \right] \\ y_{k-1}^+ - \dfrac{\Delta D(k)}{\Delta\phi(k)} \left[ \cos\left(\phi_{k-1}^+ + \Delta\phi(k)\right) - \cos\phi_{k-1}^+ \right] \\ \phi_{k-1}^+ + \Delta\phi(k) \end{bmatrix} \tag{10}$$
The prior estimate using the straight-line model is expressed as
$$v_k^- = f\left(v_{k-1}^+, u_k\right) = \begin{bmatrix} x_{k-1}^+ + \Delta D(k)\cos\phi_{k-1}^+ \\ y_{k-1}^+ + \Delta D(k)\sin\phi_{k-1}^+ \\ \phi_{k-1}^+ \end{bmatrix} \tag{11}$$
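A minimal sketch of the odometry prediction in Equations (8)–(11) follows. The paper switches between the arc and straight-line models based on the turn radius $R_{rotate}(k)$; here a small heading change $|\Delta\phi|$ is used as the equivalent numerical guard against division by zero, which is our assumption.

```python
import numpy as np

def predict_pose(x, y, phi, d_left, d_right, a, eps=1e-6):
    """Prior pose estimate from the odometry model, Equations (8)-(11).

    d_left, d_right : track displacements since the last time step [m]
    a               : distance between the left and right tracks [m]
    """
    d = 0.5 * (d_left + d_right)          # Eq. (8): mean travel distance
    dphi = (d_right - d_left) / a         # Eq. (8): heading change
    if abs(dphi) < eps:
        # Straight-line model, Eq. (11)
        return x + d * np.cos(phi), y + d * np.sin(phi), phi
    # Arc model, Eq. (10), with turn radius R_rotate = d / dphi (Eq. (9))
    rr = d / dphi
    x_new = x + rr * (np.sin(phi + dphi) - np.sin(phi))
    y_new = y - rr * (np.cos(phi + dphi) - np.cos(phi))
    return x_new, y_new, phi + dphi
```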

2.2.4. Matching the Trunk Cluster Centers with the Fruit Tree Distribution Map

As described in the previous section, the trunk cluster centers were expressed in Cartesian coordinates within the orchard's local coordinate system ($O_{Local}X_{Local}Y_{Local}$ in Figure 5). Next, these centers were matched with the fruit tree distribution map (see Section 2.2.1). The matching method is as follows. If a trunk cluster center falls within a circular area of threshold radius centered on a fruit tree in the distribution map, the trunk cluster center is considered to match that fruit tree. The polar radius $R(k)$ of the trunk cluster in the LiDAR polar coordinate system is then taken as the distance from the LiDAR scanning center to the matched fruit tree. As shown in Figure 5, the trunk cluster center $D_1$ falls within the circular area centered on the fruit tree $T_1$, which confirms a successful match between the two points; the distance from the LiDAR scanning center to point $D_1$ is then taken as the distance to fruit tree $T_1$. Trunk centers that do not match successfully, such as points $D_4$, $D_7$, and $D_9$, could be obstacles encountered during the platform's movement (point $D_9$), trees absent from the fruit tree distribution map (point $D_4$), or trunk centers displaced by objects near the fruit trees, preventing a match with the corresponding tree (point $D_7$). The DBSCAN algorithm was used to perform the matching between the trunk cluster centers and the fruit tree distribution map. This part of the program outputs the Cartesian coordinates of the successfully matched fruit trees (e.g., trees $T_1$, $T_2$, $T_3$, $T_5$, and $T_8$ in Figure 5) in the orchard's local coordinate system, as well as the distances from the LiDAR scanning center to the matched trees.
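A simplified sketch of the matching step is given below. The paper states that DBSCAN was used to perform the matching; the version here uses plain nearest-neighbor association within a threshold radius (match_radius, an assumed parameter), which captures the circular-area criterion but is not necessarily the authors' exact procedure.

```python
import numpy as np

def match_centers_to_map(centers_xy, center_radii, tree_map_xy, match_radius=0.3):
    """Match trunk cluster centers to mapped trees (Section 2.2.4).

    centers_xy   : (n, 2) trunk cluster centers in the local frame
    center_radii : (n,) polar radii R(k) of the clusters in the LiDAR frame
    tree_map_xy  : (m, 2) surveyed tree positions from the distribution map
    match_radius : threshold radius of the circular matching area [m] (assumed)
    returns      : list of (tree_xy, distance_to_lidar) for matched trees
    """
    matches = []
    for center, radius in zip(centers_xy, center_radii):
        dists = np.linalg.norm(tree_map_xy - center, axis=1)
        nearest = np.argmin(dists)
        if dists[nearest] <= match_radius:
            # R(k) is taken as the LiDAR-to-tree distance for this match
            matches.append((tree_map_xy[nearest], radius))
    return matches
```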

2.2.5. Obtaining a Posterior Estimate

By using the Cartesian coordinates of successfully matched fruit trees within the orchard’s local coordinate system and distances from these trees to the LiDAR scan center, the Cartesian coordinates of the LiDAR scan center within the orchard’s local coordinate system can be calculated based on the relationship between two circles.
For each pair of successfully matched trees, the position of the LiDAR scan center within the orchard's local coordinate system can be computed. To ensure a unique solution, the posterior estimate of the platform's position and pose from the previous time step must be used. As shown in Figure 5, the computation of the LiDAR scan center's position based on two matched trees falls into two scenarios: one in which the two trees are located on opposite sides of the LiDAR scan center (trees $T_1$ and $T_5$ in Figure 5), and another in which both trees are on the same side (trees $T_1$ and $T_2$ in Figure 5). Because of LiDAR measurement errors and the offset between the trunk cluster center and the center of the matched tree, each scenario can yield no solution, one solution, or two solutions. The measurement error of the LiDAR also makes it necessary to further confirm the reliability of the current position measurement. Therefore, for these three cases, the concepts of “valid measurement” and “invalid measurement” were introduced. For example, consider tree $T_1$, tree $T_2$, and the distances $R_{T_1}$ and $R_{T_2}$ from the LiDAR scan center to these trees, respectively. If no solution is obtained, this is considered an “invalid measurement” for tree $T_1$, tree $T_2$, and the distances $R_{T_1}$ and $R_{T_2}$. If a single solution $(x^*, y^*)$ is found and it satisfies
$$\left\| (x^*, y^*) - (x_{k-1}^+, y_{k-1}^+) \right\|_2 \leq C, \tag{12}$$
then this is considered a “valid measurement” for tree $T_1$, tree $T_2$, and the distances $R_{T_1}$ and $R_{T_2}$; otherwise, it is considered an “invalid measurement”. Here, $C$ is a constant whose value is related to the LiDAR measurement error, and $\|\cdot\|_2$ denotes the two-norm of a vector. If two solutions $(x_i^*, y_i^*)$, $i = 1, 2$, are obtained and the solution $(x_i^*, y_i^*)$ satisfies
$$\left\| (x_i^*, y_i^*) - (x_{k-1}^+, y_{k-1}^+) \right\|_2 < \left\| (x_j^*, y_j^*) - (x_{k-1}^+, y_{k-1}^+) \right\|_2 \quad (i, j = 1, 2;\ i \neq j) \tag{13}$$
$$\left\| (x_i^*, y_i^*) - (x_{k-1}^+, y_{k-1}^+) \right\|_2 \leq C, \tag{14}$$
then this is considered a “valid measurement” for tree $T_1$, tree $T_2$, and the distances $R_{T_1}$ and $R_{T_2}$; otherwise, it is considered an “invalid measurement”. The mean of the multiple valid measurements of the LiDAR scan center's position in the orchard's local coordinate system was used as the positional measurement of the LiDAR scan center, which was also taken as the positional measurement of the platform in the local coordinate system. The platform's azimuth in the local coordinate system was measured by the dynamic electronic compass. Based on the extended Kalman filter, the prior estimate of the position and pose at time $k$, and the measurements, the posterior estimate $v_k^+$ of the platform's position and pose at time $k$ can be obtained. Simon [32] provides more details on the extended Kalman filter.
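The position computation from a pair of matched trees reduces to intersecting two circles centered on the trees with radii $R_{T_1}$ and $R_{T_2}$, followed by the validity checks of Equations (12)–(14). The sketch below implements this geometry; c_max plays the role of the constant $C$, and the function names are our own.

```python
import numpy as np

def circle_intersections(c1, r1, c2, r2):
    """Intersection points of two circles (centers c1, c2 as 2D arrays; radii r1, r2)."""
    d = np.linalg.norm(c2 - c1)
    if d > r1 + r2 or d < abs(r1 - r2) or d == 0.0:
        return []                                   # no solution
    a = (r1**2 - r2**2 + d**2) / (2.0 * d)          # distance from c1 to the chord midpoint
    h2 = r1**2 - a**2
    mid = c1 + a * (c2 - c1) / d
    if h2 <= 0.0:
        return [mid]                                # tangent circles: one solution
    h = np.sqrt(h2)
    perp = np.array([-(c2 - c1)[1], (c2 - c1)[0]]) / d   # unit vector normal to c1->c2
    return [mid + h * perp, mid - h * perp]         # two solutions

def valid_measurement(solutions, prev_pos, c_max):
    """Apply the validity rules of Equations (12)-(14)."""
    if not solutions:
        return None                                 # invalid measurement
    # Eq. (13): keep the solution closer to the previous posterior position
    best = min(solutions, key=lambda s: np.linalg.norm(s - prev_pos))
    if np.linalg.norm(best - prev_pos) <= c_max:    # Eqs. (12) and (14)
        return best
    return None                                     # invalid measurement
```

The mean of all valid measurements collected in one frame then serves as the position measurement fed to the extended Kalman filter update.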

2.2.6. Motion Control

Once the posterior estimate of the platform's position and pose at time $k$ was obtained, the control inputs could be derived: the vertical distance between the current position and the preset path at time $k$, and the azimuth deviation between the platform's heading and the preset path at time $k$. The speed difference between the left and right tracks was used as the control output of the fuzzy controller. The ranges of the vertical distance, the azimuth deviation, and the driving speed difference between the left and right tracks were [−2000 mm, 2000 mm], [−1.047 rad, 1.047 rad], and [−400, 400], respectively. The input and output quantities were each divided into seven grades: negative big (NB), negative medium (NM), negative small (NS), zero (Z0), positive small (PS), positive medium (PM), and positive big (PB). The membership functions and fuzzy control rules for motion control followed the method described by Zhang et al. [2].
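The sketch below shows a generic Mamdani-style fuzzy controller over the stated input and output ranges with the seven grades NB–PB. The triangular membership functions, the normalized universe, and the weighted-centroid defuzzification are assumptions for illustration; the actual membership functions and rule table follow Zhang et al. [2].

```python
import numpy as np

# Seven fuzzy grades over a normalized universe [-1, 1]; triangular
# membership functions are an assumption (the paper follows Zhang et al. [2]).
GRADES = ["NB", "NM", "NS", "Z0", "PS", "PM", "PB"]
CENTERS = np.linspace(-1.0, 1.0, 7)

def memberships(value, limit):
    """Triangular membership degrees of a crisp input scaled by its range limit."""
    v = np.clip(value / limit, -1.0, 1.0)
    width = CENTERS[1] - CENTERS[0]
    return np.maximum(0.0, 1.0 - np.abs(v - CENTERS) / width)

def fuzzy_speed_difference(distance_mm, heading_rad, rule_table):
    """Mamdani-style inference: inputs -> left/right track speed difference.

    rule_table[i][j] is the output grade index (0..6) for the i-th distance
    grade and j-th heading grade; the concrete table follows Zhang et al. [2].
    """
    mu_d = memberships(distance_mm, 2000.0)   # vertical distance in [-2000, 2000] mm
    mu_h = memberships(heading_rad, 1.047)    # azimuth deviation in [-1.047, 1.047] rad
    num, den = 0.0, 0.0
    for i in range(7):
        for j in range(7):
            w = min(mu_d[i], mu_h[j])         # rule firing strength
            num += w * CENTERS[rule_table[i][j]] * 400.0   # output in [-400, 400]
            den += w
    return num / den if den > 0.0 else 0.0    # weighted-centroid defuzzification
```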

2.3. Test Design

2.3.1. Orchard Tests

To verify the autonomous navigation method proposed in this paper, field tests were conducted in July of 2024 in a pear orchard located at the National Experiment Station for Precision Agriculture in Changping District, Beijing, China. The tests mainly focused on two aspects: (1) system localization accuracy tests and (2) orchard navigation tests. The navigation tests included overall system performance evaluation and trajectory tracking tests at different straight-line driving speeds. Following the test design by Shalal et al. [29], an RTK-GNSS system was used to provide ground-truth information for the platform’s position during the tests. Because RTK-GNSS non-fixed solutions (e.g., differential or float solutions) were unable to meet the accuracy requirements of the tests, only fixed RTK-GNSS solutions were used. The ground-truth trajectory was recorded using a NAV-L1 RTK-GNSS receiver manufactured by Suzhou Cenozoic Intelligence Equipment Pty., Ltd., Suzhou, China, which has a localization accuracy of up to 2 cm and a data updating frequency of 1 Hz.
Prior to conducting the tests, the fruit tree distribution map of the orchard needed to be obtained. The map was generated by manually surveying the fruit trees using an RTK-GNSS system. The RTK-GNSS receiver used for this task was a G970II Pro manufactured by Beijing UniStrong Science & Technology Co., Ltd., Beijing, China, with a localization accuracy of 2 cm. After obtaining the spherical coordinates of the trees in the WGS-84 coordinate system, the positional information of the trees in the orchard’s local coordinate system was computed via coordinate transformation to construct the fruit tree distribution map for the orchard. Figure 6 presents a schematic diagram of the fruit tree distribution map, where the green solid circles represent the positions of the tree trunks, and the red solid line indicates the preset path of the platform. The path was generated based on a straight-line driving strategy between the tree rows and semi-circular turns at the headlands. The red solid circles represent the midpoints between the two trees at the ends of each row, while the blue dashed ellipses represent the headland judgment areas. When the platform enters a headland judgment area, it performs a semi-circular turn based on the preset path to complete the row-switching operation.

System Localization Accuracy Tests

During the system localization accuracy tests, the pass-through filter area for LiDAR was defined as a region extending 8 m to both sides of the platform and 13 m along the driving direction. The number of sampling points was determined based on the number of state estimates. The platform’s driving speed was set to 1 km/h, and the total travel distance was set to 100 m. Approximately 8000 state estimates were conducted during the test, and samples were collected at the 2000th, 2500th, 3000th, 3500th, 4000th, 4500th, 5000th, 5500th, and 6000th estimates. After reaching the specified number of estimates, the platform executed a parking command, outputting the state estimate for the current position. Using RTK-GNSS, the platform’s parking location was measured in spherical coordinates under the WGS-84 coordinate system, and these data were converted into the orchard’s local coordinate system to obtain position measurements. The system’s localization accuracy was evaluated using the system localization error E , localization error E x in the x -axis direction, and localization error E y in the y -axis direction.

Orchard Navigation Tests

The orchard navigation tests consisted of two parts. For the overall system navigation performance evaluation, the LiDAR pass-through filter area was the same as in the localization accuracy tests. The platform traveled 250 m along the preset path at a speed of 1 km/h. The platform made two left turns and two right turns during the row-switching process. The RTK-GNSS fixed solution was used as the ground truth for the platform’s trajectory, and the preset path generation strategy was identical to that shown in Figure 6. For trajectory tracking tests at different straight-line driving speeds, the arrangements were similar to the overall system navigation tests. The difference was that the platform traveled at speeds of 1 km/h, 2 km/h, and 3 km/h along the straight paths between the tree rows. An encoder was used to monitor the platform’s driving speed, and the system’s navigation performance was evaluated based on trajectory tracking errors.

2.3.2. Data Processing

During the system localization accuracy tests, the system's estimate of the platform's current position in the orchard's local coordinate system was denoted as $(x_{estimate}, y_{estimate})$, and the RTK-GNSS measurement of the platform's position, taken as the ground truth, was denoted as $(x_{measure}, y_{measure})$. The system localization error is defined in Equation (15).
$$E = \sqrt{\left(x_{estimate} - x_{measure}\right)^2 + \left(y_{estimate} - y_{measure}\right)^2} \tag{15}$$
The localization errors in the x -axis and y -axis directions are defined in Equations (16) and (17), respectively.
$$E_x = x_{estimate} - x_{measure} \tag{16}$$
$$E_y = y_{estimate} - y_{measure} \tag{17}$$
In the orchard navigation tests, the trajectory tracking error was defined as the perpendicular distance between the platform’s current ground-truth position and the preset trajectory. If the ground truth was located to the left of the preset trajectory, the tracking error was negative, and if it was located to the right, the tracking error was positive. The ground truth for the platform’s position was defined in the same manner as in the localization accuracy tests. The trajectory tracking error serves as the evaluation metric for the system’s navigation performance.
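For reference, the evaluation metrics of Equations (15)–(17) and the signed trajectory tracking error can be computed as follows; the sign convention (negative to the left of the path, positive to the right) matches the definition above, and the function names are our own.

```python
import numpy as np

def localization_errors(est, meas):
    """Equations (15)-(17): localization error metrics for one sample."""
    ex = est[0] - meas[0]                    # Eq. (16): signed x-axis error
    ey = est[1] - meas[1]                    # Eq. (17): signed y-axis error
    return np.hypot(ex, ey), ex, ey          # Eq. (15): Euclidean error E

def tracking_error(truth, p0, p1):
    """Signed perpendicular distance from a ground-truth point to the
    preset path segment p0 -> p1 (negative left of the path, positive right)."""
    seg = p1 - p0
    # z-component of the 2D cross product: > 0 means truth lies left of the segment
    cross = seg[0] * (truth[1] - p0[1]) - seg[1] * (truth[0] - p0[0])
    return -cross / np.linalg.norm(seg)
```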

3. Test Results and Analysis

During our tests, manual RTK-GNSS point-based mapping was used to construct the fruit tree distribution map of the orchard, as shown in Figure 7. In this figure, the red solid line represents the preset driving path of the platform, and the red solid dots denote the midpoints of the lines connecting the two fruit trees at the ends of each row.

3.1. System Localization Accuracy Tests

The results of the system localization accuracy tests are presented in Figure 8. The sampling points were sequentially located in the orchard’s local coordinate system (as shown in Figure 7) at coordinates of (−31.42, 28.34) for 2000 estimates, (−18.35, 28.29) for 2500 estimates, (−9.30, 28.13) for 3000 estimates, (−0.41, 28.73) for 3500 estimates, (−22.26, 32.40) for 4000 estimates, (−22.66, 32.44) for 4500 estimates, (−31.35, 32.46) for 5000 estimates, (−56.27, 34.19) for 5500 estimates, and (−29.41, 36.04) for 6000 estimates (coordinates are in meters). The test results revealed that the mean absolute value of the localization error E x in the x -axis direction was 0.05 m with a standard deviation of 0.04 m, the mean absolute value of the localization error E y in the y -axis direction was 0.06 m with a standard deviation of 0.05 m, and overall mean localization error E of the system was 0.09 m with a standard deviation of 0.05 m. The results of the system localization accuracy tests indicated that the system meets the navigation accuracy requirements for orchard operations.
By analyzing the system localization accuracy test results, one can see that the valid measurement values for the platform’s position in the orchard’s local coordinate system at the current time follow truncated normal distributions in their projections along the coordinate axes. As an example, consider point 2500 (sampling point coordinates: (−18.35, 28.29), in meters). The distribution of valid measurement values for point 2500 and their projections onto the coordinate axes are presented in Figure 9. The blue solid dots represent valid measurements, while the red solid dots represent the ground truth. A chi-square goodness-of-fit test was conducted, revealing that the projections of the valid measurements along both the X and Y axes follow truncated normal distributions. The observed test statistics were 0.7058 and 6.5432, respectively, while the critical values at the 0.05 significance level were 5.99 and 7.81. Therefore, the projections of the valid measurements along the X and Y axes are considered to follow truncated normal distributions, which can be used to assess the system’s localization accuracy.
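A sketch of such a chi-square goodness-of-fit test is shown below. It assumes the truncated normal is obtained by fitting a normal by moments and truncating it to the observed range, with five bins (so that the degrees of freedom are 5 − 1 − 2 = 2, matching the critical value of 5.99 reported for one of the axes); the authors' exact binning and fitting procedure is not specified.

```python
import numpy as np
from scipy import stats

def truncnorm_gof(samples, n_bins=5, alpha=0.05):
    """Chi-square goodness-of-fit of samples against a truncated normal
    (simplified sketch: normal fitted by moments, truncated to the data range)."""
    mu, sigma = samples.mean(), samples.std(ddof=1)
    lo, hi = samples.min(), samples.max()
    edges = np.linspace(lo, hi, n_bins + 1)
    observed, _ = np.histogram(samples, bins=edges)
    # Expected bin probabilities under the truncated normal (normalized to the range)
    cdf = stats.norm(mu, sigma).cdf
    probs = np.diff(cdf(edges)) / (cdf(hi) - cdf(lo))
    expected = probs * len(samples)
    # ddof accounts for the two estimated parameters (mu, sigma)
    chi2_stat, _ = stats.chisquare(observed, expected, ddof=2)
    dof = n_bins - 1 - 2
    critical = stats.chi2.ppf(1.0 - alpha, dof)
    return chi2_stat, critical, chi2_stat <= critical   # True -> fit not rejected
```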

3.2. Orchard Navigation Tests

3.2.1. Overall System Navigation Performance Evaluation

The results of the system’s overall navigation performance evaluation, including the ground-truth trajectory, state estimation results, and preset trajectory, are presented in Figure 10. During the straight-line navigation phase (i.e., segments A-B, C-D, etc.), the mean absolute trajectory tracking error was 0.07 m with a standard deviation of 0.06 m. In the turning and row-switching phase (i.e., segments B-C, D-E, etc.), the mean absolute trajectory tracking error was 0.30 m with a standard deviation of 0.30 m. The output frequency of the system state estimation result was approximately 10 Hz.
According to the test results, significant deviations occurred between the system’s estimated and preset trajectories during the turning and row-switching phase, particularly in segments D-E and F-G, as shown in Figure 11. Considering segment D-E as an example, Figure 12 presents the filtered LiDAR data for state estimation points 3200, 3280, 3360, 3380, 3440, and 3560. The localization status and number of valid measurements (see Section 2.2.5) for each point are listed in Table 1. From Figure 10 and Table 1, it can be inferred that the primary reasons for localization failures at state estimation points 3200, 3280, and 3360 are twofold: (1) these points are located in the headland, where fewer fruit trees are available for localization; and (2) the LiDAR scanning area was mainly oriented away from the fruit trees. These factors collectively resulted in fewer measurable fruit trees within the LiDAR field of view, leading to a reduced number of valid measurements and ultimately causing localization failures at the current time step. In contrast, at points 3380, 3440, and 3560, the LiDAR scanning area was primarily oriented toward the fruit trees, resulting in more measurable fruit trees and a higher number of valid measurements, allowing for successful localization at these points. As shown in Figure 11, during the period between points 3200 and 3360 in segment D-E, the system was in a state of localization failure most of the time, during which the localization function was taken over by the odometry system. Figure 11 further reveals that during this interval, the deviation between the state estimation results and RTK-GNSS measurements gradually increased. However, between points 3360 and 3380, the system successfully re-localized as more fruit trees became measurable within the LiDAR’s field of view, causing the deviation between the state estimation and RTK-GNSS measurements to decrease gradually. After point 3420, the state estimation results largely coincided with the RTK-GNSS measurements.
In Figure 13 and Figure 14, one can see that during the straight-line travel within the fruit tree rows, the speeds of the left and right wheels remained relatively stable, with no significant fluctuations. Regarding the azimuth angle, only minor variations were observed, and no abrupt changes occurred. However, during the semi-circular turns for row switching, the speeds of both wheels exhibited significant changes to accomplish the differential steering, which was mirrored by substantial variations in the azimuth angle.
In orchard operations, such as spraying and harvesting, the straight-line phase is considered the operational segment, requiring high accuracy in trajectory tracking. In contrast, the turning and row-switching phase is classified as a non-operational segment. In this phase, provided that collisions with fruit trees are avoided, the requirements for trajectory tracking accuracy are relatively relaxed, effectively reducing the cost of navigation equipment. Based on our test results, the proposed system meets the operational requirements of orchard environments.

3.2.2. Trajectory Tracking Tests at Different Straight-Line Driving Speeds

The results of the trajectory tracking tests at different straight-line driving speeds are presented in Figure 15 and Table 2.
In Figure 15 and Table 2, one can see that the maximum trajectory tracking errors occurred shortly after the platform started moving. According to Table 2, the maximum absolute trajectory tracking errors at speeds of 1, 2, and 3 km/h were 0.32 m, 0.56 m, and 0.64 m, respectively. At these speeds, the proportions of absolute errors below 0.18 m, 0.27 m, and 0.47 m, respectively, exceeded 95%, meeting the application requirements of most orchard environments.
Figure 15 also shows that at different straight-line driving speeds, the proportion of positive trajectory tracking errors was relatively high, accounting for 75.64%, 85.96%, and 84.07% at speeds of 1, 2, and 3 km/h, respectively. This indicates that the platform remained on the right side of the pre-set path in more than 75% of the time during straight-line driving.

3.3. Discussion

3.3.1. Discussion of System Localization Results

From the system localization accuracy tests, it can be concluded that, in a single state estimation, the projections of the valid measurements on the X and Y axes followed truncated normal distributions. Taking the projection on the X-axis as an example, according to the central limit theorem, when the number of valid measurements in a single state estimation is large, the mean of the valid measurements approximately follows a normal distribution $N(\mu, \sigma^2/n)$. Here, $\mu$ and $\sigma^2$ represent the mathematical expectation and variance of the distribution of the valid measurement projections on the X-axis, respectively, and $n$ is the number of valid measurements. Therefore, when the number of valid measurements is large, the positional measurements of the operation platform in the orchard's local coordinate system approximately follow a normal distribution, allowing the deviation between the measured values and the ground truth to be estimated and predicted.
When computing the platform's position at time $k$ from the tree coordinates and the distances to the trees, Equation (12) or Equation (14) helps verify the correctness of the solution. Because the constant $C$ is chosen empirically, if $C$ is too small while $\left| x_{true}(k) - x_{k-1}^+ \right|$ is large (where $x_{true}(k)$ is the X-axis projection of the platform's ground-truth position at time $k$), the distribution characteristics of the valid measurements described in Section 3.1 could be disrupted. This disruption would shift the mathematical expectation, introduce systematic errors, and reduce the localization accuracy of the system.

3.3.2. Discussion of System Navigation Performance

As shown in Table 2 in Section 3.2.2, the proposed method achieved navigation results comparable to those of the “Kalman filter algorithm with line detection” used by Blok et al. [9]. In our study, the RMS trajectory tracking error was 0.09 m at a speed of 1 km/h, while Blok et al. reported an RMS error of 0.087 m at a speed of 0.25 m/s, indicating that the proposed method achieves a good navigation effect. Compared with Blok et al.'s [9] work, the proposed method additionally provides autonomous turning and improves localization accuracy (see Section 3.1).
As shown in Table 1 in Section 3.2, the number of successfully matched trunk clusters equaled the number of successfully matched fruit trees. However, theoretical analysis suggests that this relationship is not always valid. According to Section 2.2.4, when multiple trunk cluster centers fall within the threshold area around a single fruit tree, a situation may arise in which a single fruit tree matches multiple trunk cluster centers. In orchard environments, factors such as tree branches, dense weeds around trees, or free-ranging poultry may cause one fruit tree to match multiple trunk cluster centers. When this occurs, the polar radii of the multiple trunk cluster centers in the LiDAR polar coordinate system can be considered as distances from the LiDAR scanning center to the corresponding fruit tree. These distances can then be used to calculate the current position of the operation platform, enhancing the system’s adaptability to the operational environment.
In the trajectory tracking tests at different straight-line driving speeds, the system's trajectory tracking errors were positive in over 75% of all cases, meaning the platform remained on one side of the preset path during most of the straight-line driving phase. This phenomenon aligns with the findings of Blok et al. [9], although in their study the platform tended to the left, whereas in ours it tended to the right. Because this behavior did not occur in offline simulations of the motion control method, we attribute it to a deviation between the measured and true values of the platform's current position; in other words, the position measurements contained a systematic error. A likely cause is that the distribution characteristics of the valid measurements described in Section 3.1 were disrupted, so that the mathematical expectation of the distribution followed by the valid measurements deviated from the true current position, producing the systematic error. Future research should focus on understanding the causes of the distribution of valid measurements in order to reduce the systematic error in the navigation system's localization function and improve both localization and navigation accuracy.
When the average radius of the fruit tree trunks is small and the trunks are far from the LiDAR scanning center, only a limited number of laser points may fall on each trunk, which may limit the autonomous navigation method proposed in this study. Higuti et al. [10] raised the same concern and addressed it in their article: “… However, chances are that the navigation will be unreliable if only LiDAR is used, considering very young plants. When plants are very small, this is not a limiting factor. Indeed, GNSS is usable since there is little multipath error due to cluttered canopies.”

4. Conclusions

We developed an autonomous orchard navigation system that integrated sensor data from 2D LiDAR, an electronic compass, and an encoder. The system’s mean localization error was 0.09 m with a standard deviation of 0.05 m. The mean absolute localization errors in the X and Y axes were 0.05 m and 0.06 m, respectively, with corresponding standard deviations of 0.04 m and 0.05 m. At a driving speed of 1 km/h, the mean absolute trajectory tracking error in the operational segments was 0.07 m with a standard deviation of 0.06 m. At driving speeds of 2 and 3 km/h, the mean absolute trajectory tracking errors were 0.11 m and 0.18 m, respectively, with standard deviations of 0.09 m and 0.14 m. During straight-line driving, the platform remained on the right side of the preset path for more than 75% of the time. These results demonstrate that the proposed method provides effective navigation and is suitable for various orchard environments.

Author Contributions

Conceptualization, Z.S. and W.Z.; methodology, Z.S., W.Z. and C.Z.; software, Z.S.; validation, Z.S., W.Z. and C.Z.; formal analysis, W.Z.; investigation, Z.S.; resources, C.Z.; data curation, Z.S.; writing—original draft preparation, Z.S.; writing—review and editing, W.Z., C.Z. and H.T.; visualization, S.Y. and X.Q.; supervision, C.Z.; project administration, W.Z.; funding acquisition, C.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was financially supported by the National Key Research and Development Plan Project (grant number: 2022YFD2001402), the Science and Technology Innovation Special Construction Funded Program of Beijing Academy of Agriculture and Forestry Sciences (KJCX20240509), and the Laboratory Construction Project of the 2024 National Engineering Research Center for Intelligent Equipment in Agriculture (PT2024-41).

Data Availability Statement

The data presented in this study are available upon request from the corresponding author.

Conflicts of Interest

Authors Shuo Yang and Xiangyang Qin were employed by the company Beijing PAIDE Science and Technology Development Co., Ltd. The remaining authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

  1. Dou, H.; Chen, Z.; Zhai, C.; Zou, W.; Song, J.; Feng, F.; Zhang, Y.L.; Wang, X. Research progress on autonomous navigation technology for orchard intelligent equipment. Trans. Chin. Soc. Agric. Mach. 2024, 55, 1–22. [Google Scholar]
  2. Zhang, C.; Yong, L.; Chen, Y.; Zhang, S.; Ge, L.; Wang, S.; Li, W. A rubber-tapping robot forest navigation and information collection system based on 2D LiDAR and a gyroscope. Sensors 2019, 19, 2136. [Google Scholar] [CrossRef] [PubMed]
  3. Zhang, S.; Guo, C.; Gao, Z.; Sugirbay, A.; Chen, J.; Chen, Y. Research on 2D laser automatic navigation control for standardized orchard. Appl. Sci. 2020, 10, 2763. [Google Scholar] [CrossRef]
  4. Jiang, S.; Qi, P.; Han, L.; Liu, L.; Li, Y.; Huang, Z.; Liu, Y.; He, X. Navigation system for orchard spraying robot based on 3D LiDAR SLAM with NDT_ICP point cloud registration. Comput. Electron. Agric. 2024, 220, 108870. [Google Scholar] [CrossRef]
  5. Gu, B.; Liu, Q.; Tian, G.; Wang, H.; Li, H.; Xie, S. Recognizing and locating the trunk of a fruit tree using improved YOLOv3. Trans. Chin. Soc. Agric. Eng. 2022, 38, 122–129. [Google Scholar]
  6. Han, J.H.; Park, C.H.; Kwon, J.H.; Lee, J.; Kim, T.S.; Jang, Y.Y. Performance evaluation of autonomous driving control algorithm for a crawler-type agricultural vehicle based on low-cost multi-sensor fusion positioning. Appl. Sci. 2020, 10, 4667. [Google Scholar] [CrossRef]
  7. Han, J.H.; Park, C.H.; Jang, Y.Y.; Gu, J.D.; Kim, C.Y. Performance evaluation of an autonomously driven agricultural vehicle in an orchard environment. Sensors 2021, 22, 114. [Google Scholar] [CrossRef]
  8. Han, J.H.; Park, C.H.; Jang, Y.Y. Development of a moving baseline RTK/motion sensor-integrated positioning-based autonomous driving algorithm for a speed sprayer. Sensors 2022, 22, 9881. [Google Scholar] [CrossRef]
  9. Blok, P.M.; van Boheemen, K.; van Evert, F.K.; IJsselmuiden, J.; Kim, G.H. Robot navigation in orchards with localization based on Particle filter and Kalman filter. Comput. Electron. Agric. 2019, 157, 261–269. [Google Scholar] [CrossRef]
  10. Higuti, V.A.; Velasquez, A.E.; Magalhaes, D.V.; Becker, M.; Chowdhary, G. Under canopy light detection and ranging-based autonomous navigation. J. Field Robot. 2019, 36, 547–567. [Google Scholar] [CrossRef]
  11. Malavazi, F.B.P.; Guyonneau, R.; Fasquel, J.B.; Lagrange, S.; Mercier, F. LiDAR-only based navigation algorithm for an autonomous agricultural robot. Comput. Electron. Agric. 2018, 154, 71–79. [Google Scholar] [CrossRef]
  12. Jiang, A.; Ahamed, T. Navigation of an autonomous spraying robot for orchard operations using LiDAR for tree trunk detection. Sensors 2023, 23, 4808. [Google Scholar] [CrossRef] [PubMed]
  13. Andersen, J.C.; Ravn, O.; Andersen, N.A. Autonomous rule-based robot navigation in orchards. IFAC Proc. Vol. 2010, 43, 43–48. [Google Scholar] [CrossRef]
  14. Bergerman, M.; Maeta, S.M.; Zhang, J.; Freitas, G.M.; Hamner, B.; Singh, S.; Kantor, G. Robot farmers: Autonomous orchard vehicles help tree fruit production. IEEE Robot. Automat. Mag. 2015, 22, 54–63. [Google Scholar] [CrossRef]
  15. Libby, J.; Kantor, G. Deployment of a point and line feature localization system for an outdoor agriculture vehicle. In Proceedings of the 2011 IEEE International Conference on Robotics and Automation, Shanghai, China, 9–13 May 2011; IEEE Publications: Piscataway Township, NJ, USA, 2011; pp. 1565–1570. [Google Scholar]
  16. Hough, P.V. Method and Means for Recognizing Complex Patterns. U.S. Patent No. 3,069,654, 18 December 1962. [Google Scholar]
  17. Chen, J.; Qiang, H.; Wu, J.; Xu, G.; Wang, Z. Navigation path extraction for greenhouse cucumber-picking robots using the prediction-point Hough transform. Comput. Electron. Agric. 2021, 180, 105911. [Google Scholar] [CrossRef]
  18. Barawid, O.C., Jr.; Mizushima, A.; Ishii, K.; Noguchi, N. Development of an autonomous navigation system using a two-dimensional laser scanner in an orchard application. Biosyst. Eng. 2007, 96, 139–149. [Google Scholar] [CrossRef]
  19. Zhou, M.; Xia, J.; Yang, F.; Zheng, K.; Hu, M.; Li, D.; Zhang, S. Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC. Int. J. Agric. Biol. Eng. 2021, 14, 176–184. [Google Scholar] [CrossRef]
  20. Fischler, M.A.; Bolles, R.C. Random sample consensus: A paradigm for model fitting with applications to image analysis and automated cartography. Commun. ACM 1981, 24, 381–395. [Google Scholar] [CrossRef]
  21. Guyonneau, R.; Mercier, F.; Oliveira Freitas, G.F. LiDAR-only crop navigation for symmetrical robot. Sensors 2022, 22, 8918. [Google Scholar] [CrossRef]
  22. Isack, H.; Boykov, Y. Energy-based geometric multi-model fitting. Int. J. Comput. Vis. 2012, 97, 123–147. [Google Scholar] [CrossRef]
  23. Wang, Y.; Geng, C.; Zhu, G.; Shen, R.; Gu, H.; Liu, W. Information perception method for fruit trees based on 2D LiDAR sensor. Agriculture 2022, 12, 914. [Google Scholar] [CrossRef]
  24. Shang, Y.; Wang, H.; Qin, W.; Wang, Q.; Liu, H.; Yin, Y.; Song, Z.; Meng, Z. Design and test of obstacle detection and harvester pre-collision system based on 2D lidar. Agronomy 2023, 13, 388. [Google Scholar] [CrossRef]
  25. Velasquez, A.E.B.; Higuti, V.A.H.; Guerrero, H.B.; Gasparino, M.V.; Magalhães, D.V.; Aroca, R.V.; Becker, M. Reactive navigation system based on H∞ control system and LiDAR readings on corn crops. Precis. Agric. 2020, 21, 349–368. [Google Scholar] [CrossRef]
  26. Mújica-Vargas, D.; Vela-Rincón, V.; Luna-Álvarez, A.; Rendón-Castro, A.; Matuz-Cruz, M.; Rubio, J. Navigation of a differential wheeled robot based on a type-2 fuzzy inference tree. Machines 2022, 10, 660. [Google Scholar] [CrossRef]
  27. Hiremath, S.A.; Van Der Heijden, G.W.A.M.; Van Evert, F.K.; Stein, A.; Ter Braak, C.J.F. Laser range finder model for autonomous navigation of a robot in a maize field using a particle filter. Comput. Electron. Agric. 2014, 100, 41–50. [Google Scholar] [CrossRef]
  28. Thrun, S.; Burgard, W.; Fox, D. Probabilistic Robotics; China Machine Press: Beijing, China, 2017. [Google Scholar]
  29. Shalal, N.; Low, T.; McCarthy, C.; Hancock, N. Orchard mapping and mobile robot localisation using on-board camera and laser scanner data fusion—Part B: Mapping and localisation. Comput. Electron. Agric. 2015, 119, 267–278. [Google Scholar] [CrossRef]
  30. DFEC900 Dynamic Electronic Compass. Available online: http://www.dfwee.com/h-pd-127.html (accessed on 14 November 2024).
  31. SICK LMS1xx 2D LiDAR Sensors. Available online: https://www.sick.com/cn/zh/catalog/products/lidar-and-radar-sensors/lidar-sensors/lms1xx/c/g91901?tab=downloads (accessed on 14 November 2024).
  32. Simon, D. Optimal State Estimation: Kalman, H Infinity, and Nonlinear Approaches; John Wiley & Sons, Inc.: Hoboken, NJ, USA, 2006. [Google Scholar]
Figure 1. Hardware architecture diagram of the proposed system.
Figure 2. Actual hardware setup of the system.
Figure 3. Software implementation flow diagram of the proposed system.
Figure 4. Schematic diagram of the WGS-84 coordinate system and horizontal coordinate system.
Figure 5. Schematic diagram of matching trunk cluster centers with the current fruit tree distribution map.
Figure 6. Schematic diagram of the fruit tree distribution map.
Figure 7. Fruit tree distribution map.
Figure 8. System localization accuracy test results.
Figure 9. Distribution of valid measurement values for point 2500.
Figure 10. Results of the overall system navigation performance evaluation.
Figure 11. Zoomed-in views of segments D-E and F-G.
Figure 12. Some LiDAR data in segment D-E during the turning and row-switching phase.
Figure 13. Speeds of the left and right wheels of the platform.
Figure 14. Azimuth results from state estimation.
Figure 15. Trajectory tracking errors at different straight-line driving speeds.
Table 1. Localization states in segment D-E.

| State Estimation Point | Number of Trunk Clusters | Number of Successfully Matched Trunk Clusters | Number of Successfully Matched Fruit Trees | Number of Valid Measurements | Localization State |
|---|---|---|---|---|---|
| 3200 | 13 | 1 | 1 | 0 | Failed |
| 3280 | 12 | 1 | 1 | 0 | Failed |
| 3360 | 10 | 3 | 3 | 0 | Failed |
| 3380 | 16 | 5 | 5 | 3 | Success |
| 3440 | 9 | 8 | 8 | 25 | Success |
| 3560 | 13 | 7 | 7 | 21 | Success |
Table 2. Trajectory tracking error results at different straight-line driving speeds.

| Speed | Mean Absolute Error | Standard Deviation of Absolute Error | Maximum Absolute Error | RMS | Proportion of Absolute Error ≥ Threshold |
|---|---|---|---|---|---|
| 1 km/h | 0.07 m | 0.06 m | 0.32 m | 0.09 m | ≥0.18 m (5.79%) |
| 2 km/h | 0.11 m | 0.09 m | 0.56 m | 0.14 m | ≥0.27 m (5.06%) |
| 3 km/h | 0.18 m | 0.14 m | 0.64 m | 0.22 m | ≥0.47 m (5.08%) |