Article

Evaluation of LiDAR for the Free Navigation in Agriculture

by Matthias Reger 1,*, Jörn Stumpenhausen 2 and Heinz Bernhardt 1

1 Chair of Agricultural Systems Engineering, TUM School of Life Sciences, Technical University of Munich, 85354 Freising, Germany
2 Faculty Sustainable Agriculture and Energy Systems, University of Applied Sciences Weihenstephan-Triesdorf, 85354 Freising, Germany
* Author to whom correspondence should be addressed.
AgriEngineering 2022, 4(2), 489-506; https://doi.org/10.3390/agriengineering4020033
Submission received: 19 April 2022 / Revised: 24 May 2022 / Accepted: 31 May 2022 / Published: 9 June 2022
(This article belongs to the Section Sensors Technology and Precision Agriculture)

Abstract:
Driverless transport systems (DTS) or automated guided vehicles (AGV) have been part of intralogistics for over six decades. The uniform and structured environment in industrial halls provided ideal conditions for simple automation tasks, such as goods transport. Initially, simply designed safety devices, e.g., bumpers, were sufficient to reduce risk to an acceptable level. However, these conditions are not present in an agricultural environment, where soiling and harsh weather conditions must be anticipated both indoors and outdoors. The state of the art in intralogistics is light detection and ranging (LiDAR) scanners, which are suitable for both navigation and collision avoidance, including personal protection. In this study, the outdoor and navigation suitability of LiDAR is assessed in test series. The aim is to provide guidance on validating LiDAR as a candidate technology for navigation and collision avoidance in freely navigating automatic feeding systems.

1. Introduction

Present-day commercial automatic feeding systems (AFS) only incorporate an incomplete automation of the feeding process (“semi-automatic feeding”, stage 2) [1]. Stage 2 means that filling, mixing, and distribution of the feed ration are performed automatically by the system, whereas feed removal from the silos and transport from the silos to the stable are performed manually by the farmer. These systems can only remove feed from an interim storage and convey it into mixers; the interim storage, in turn, has to be filled by a human operator. Moreover, the feed mixer either serves as the distribution unit itself or conveys the mixed ration to a dedicated feed distributor, which then dispenses the feed along the feeding fence. Furthermore, some of these systems push up the feed at programmable intervals. The duration of the interim storage of silage is limited to a few hours or a few days, as the air supply leads to rapid fodder spoilage after removal from the silo [2,3]. The removal of the roughage from the silos and its transport to the mixer or interim storage are, in turn, steps in feeding that cannot yet be managed by any commercial automatic feeding system. The most significant challenges here are the safe removal of silage with cutting or rotating tools and the safe transport of vehicles weighing several tons over non-restricted traffic areas.
Contrary to the harsh and complex environmental conditions in agriculture, standardized industrial working environments laid the foundation for the automation of transport vehicles [4]. The success story of driverless transport systems or automated guided vehicles (AGVs) in intralogistics began more than six decades ago [5]. There, conditions are optimal: constant illuminance, paved floors, no changing or demanding weather conditions, and demarcated areas with trained staff. In the early days of driverless transport systems, in the 1950s, navigation was based on current-carrying conductors laid in the ground, known as inductive lane guidance [5]. These were simple to install in the already paved surfaces of industrial halls and warehouses. LiDAR technology is now used in most AGVs. Using the time of flight and the angle of the emitted laser beams, LiDAR scanners enable self-driving vehicles to navigate freely in indoor applications. Additionally, they are often part of perception systems in automated machinery, as they can maximize personal protection and collision avoidance [6,7,8,9]. LiDAR is an optical measuring principle, which makes it sensitive to external light and optical impairments. Therefore, LiDAR is only suitable to a limited extent for use outdoors. Dirt, fog, rain, soil conditions, and light-shadow transitions can substantially impact LiDAR technology [10].
Automatic feeding systems move on a range of floor coverings and operate both outdoors and indoors. This makes the use of ground control points difficult or rules them out entirely. Likewise, using the global navigation satellite system (GNSS) alone is inadequate for two reasons: first, navigation in buildings is not possible due to the shading of the signal by the building envelope and, second, additional technologies are required for direct monitoring of the vehicle environment [8]. LiDAR is an advanced technology that is highly promising for use in agricultural machinery as well. This potential is predominantly seen in two sectors, the automotive industry and intralogistics, where LiDAR technology is an important technological pillar for automation [11,12]. Even in land preparation, LiDAR has an important standing, although GNSS and red-green-blue (RGB) cameras are more commonly used in agricultural automated machinery [13]. This is because the open-air environment of fields is ideal for GNSS, while the main tasks of identifying crops, rows, or weeds favor RGB vision. LiDAR can create a precise image of the near and far range, which makes it an excellent fit both for navigation [14,15,16] and for safety devices [11,12,14,17]. In an experimental setting, a robot vehicle was equipped with a LiDAR scanner and tested in practical agricultural driving tests. Vehicle navigation with LiDAR was analyzed using standardized criteria, and robustness against challenging environmental conditions was documented. The aim of this article is to analyze the navigation of a driverless feed mixer with LiDAR under practical conditions and to provide orientation on how to evaluate these navigation technologies.

2. State of Knowledge

Laser navigation is the most prominent representative of free navigation [11]. It makes use of the physical characteristics of a laser (light amplification by stimulated emission of radiation) and implements them in the LiDAR method [18]. LiDAR is a method for optical distance and speed measurement [19]. It is based on the round-trip time of emitted laser pulses and their backscatter, known as the time-of-flight (TOF) principle [11,12]. The laser beams of the LiDAR are electromagnetic waves and can be distinguished by their frequencies. Because LiDAR is an optical measurement method, the emitted pulses are absorbed, deflected, or scattered by particles, bodies, and surfaces.
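As a simple illustration of the TOF principle described above, the following minimal sketch converts a pulse round-trip time into a range; the numbers are illustrative and not taken from any particular scanner datasheet.

```python
# Minimal sketch of time-of-flight (TOF) ranging; values are illustrative.
C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Range to a target from the round-trip time of a laser pulse.

    The pulse travels to the target and back, so the one-way distance
    is half of the total path length.
    """
    return C * round_trip_time_s / 2.0

# A pulse returning after ~133 ns corresponds to a target about 20 m away.
print(tof_distance(133e-9))  # ~19.94 m
```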
For the purposes of this study, LiDAR sensors can be divided into two categories. The first design principle, referred to here simply as a LiDAR sensor, contains no moving parts and relies solely on diode arrays. These LiDAR sensors work according to the multi-beam principle: they are equipped with several permanently mounted transmitter and receiver units arranged horizontally next to each other. The angular resolution depends on the beam width, and the lateral opening angle depends on the number of beams. Ranges of up to 150 m are possible in this configuration. The absence of moving parts is the main advantage of this design. In the automotive industry (e.g., Volkswagen Up, Ford Focus), 2D LiDAR systems based on the multi-beam principle are used for active emergency braking. Due to their limited horizontal detection area, their use is limited to longitudinal guidance. They are not yet used in the AGV industry [20].
The so-called laser scanners are the second type of LiDAR sensor. They are equipped with one or more mechanically rotating mirrors that redirect the emitted light pulses. The angle to the object is recorded based on the rotation of the mirror. The exact position of an object can be determined using the distance to the object and the angular position of the mirror [20].
The multi-target capability of the sensors is a critical foundation for navigation, as several objects can be located in the sensor area. To differentiate between the targets, a separation ability is required, which, among other things, depends on the resolution of the angle measurement. The laser beams leave the sensor strongly collimated and can therefore generate high-resolution measurement data [16].
Classic laser navigation is based on the use of artificial landmarks. These are attached to walls and pillars at a specific height to avoid shadowing from people or other objects. A rotating laser scanner can precisely measure these reference marks even over long distances (Figure 1). Depending on the procedure, at least two or three artificial markings must be visible to determine the position. The creation of new driving courses is possible directly via programming or a learning run (teach-in), which guarantees a significantly heightened level of flexibility with regard to the layout. To avoid the ambiguities of marks, it may be necessary to code them [5].
The coordinates of the reference marks are saved when the vehicle is commissioned and configured. While driving, the laser scanner continuously detects the positions of these stationary marks. Therefore, the current position and orientation of the vehicle can be determined from the comparison of the position data. This position estimate is known as a feature-based method, which seeks to identify features or landmarks [9]. This can be achieved in two ways: By means of the relative positions or by means of the absolute positions (Figure 2). When covering the course, the vehicle computer continuously corrects the vehicle’s course deviations, which can occur due to various factors [5].
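To make the absolute-position variant concrete, the sketch below estimates a vehicle position from measured distances to three surveyed reference marks by least squares. The coordinates and range values are synthetic examples; a real system would additionally use the measured bearing angles and reject ambiguous mark constellations.

```python
# Hedged sketch of absolute positioning from distances to reference marks
# (trilateration); coordinates and measurements are synthetic examples.
import numpy as np
from scipy.optimize import least_squares

landmarks = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 8.0]])  # surveyed marks (m)
measured = np.array([5.39, 9.43, 3.61])                      # scanner ranges (m)

def residuals(pose_xy):
    # Difference between predicted and measured distances to each mark.
    return np.linalg.norm(landmarks - pose_xy, axis=1) - measured

# The vehicle position that best explains all three range measurements.
fit = least_squares(residuals, x0=np.array([1.0, 1.0]))
print(fit.x)  # ~[2.0, 5.0] for this synthetic constellation
```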
To reduce, omit, or support artificial landmarks, information concerning the contours of the environment is necessary. The laser scanner outputs the measured contours, from which an external computer recognizes natural landmarks using suitable algorithms. The marks must be clearly recognizable, and their position must not change. A distance-measuring laser scanner can be used, for example, to drive along a wall. If a complete position determination/navigation is to succeed, additional procedures, such as edge detection or artificial bearing marks, are added. This method of environmental navigation was also applied in this manuscript's test series [15].
If the laser scanner is additionally swiveled around another axis, a 3D image of the surroundings can be created. This method is also used by Strautmann und Söhne GmbH in the “Verti-Q” concept presented in 2017 (Figure 3). It enables quite reliable ceiling navigation, as the view of the ceiling is usually free of obstacles. The disadvantage, however, is the greater computing power and time required for this multi-dimensional recording. At present, only slowly moving vehicles in the single-digit kilometers-per-hour range can be implemented with this technology in a practical manner. Additionally, the method is inherently suitable only for indoor applications [15].

3. Materials and Methods

To evaluate the automatic lane guidance by LiDAR scanner, it is validated on the basis of three characteristics: Accuracy, Precision, and Consistency [22]. For this purpose, a dynamic driving test was carried out using a robot vehicle in an agricultural environment (Figure 4). At the test farm “Veitshof” of the Technical University of Munich, the vehicle drove along a route in and outside a stable building using a LiDAR scanner. A “run” was considered to be a series of 10 “test drives”, each completing the total route, as shown in Figure 5. The climatic environment was consistent throughout test runs 2, 3, and 4 regarding lighting and weather conditions: an overcast December day between 10 a.m. and 4 p.m. at about 5 to 7 °C.
The LMS 100 laser scanner was mounted on the front of the vehicle in the middle above the steering axle. The scanner used a light wavelength of 905 nm and updated measurements at a repetition rate of 50 Hz. The measuring range was between 0.5 and 20 m with an opening angle of 270° and an angular resolution of 0.50°. The scanner was insensitive to extraneous light up to 40,000 lx, and its systematic error was 30 mm. The LMS 100 features fog correction and multiple-echo evaluation. The onboard PC ran an Ubuntu Linux operating system on which the robot operating system (ROS) was installed. The “Cartographer” program was used to graphically display the LiDAR data. An inertial measurement unit of the type MTi-30-2A5G4 additionally supported the determination of spatial movement and geographical positioning.
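For orientation, the scanner's 0.50° angular resolution can be related to the lateral spacing of neighboring measurement points at a given range with a small-angle approximation; the quick calculation below uses only the figures quoted above.

```python
# Lateral spacing between neighbouring scan points at range r is
# approximately r * delta_alpha (small-angle approximation).
import math

ANGULAR_RESOLUTION_DEG = 0.50

for r in (0.5, 5.0, 20.0):  # lower limit, midpoint, upper limit (m)
    spacing = r * math.radians(ANGULAR_RESOLUTION_DEG)
    print(f"range {r:5.1f} m -> point spacing {spacing * 100:5.1f} cm")
# At the 20 m range limit, neighbouring beams are ~17.5 cm apart,
# which bounds the size of objects that can be resolved reliably.
```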
The route (target trajectory) is programmed using waypoints on a digitally created map. The digital map is generated using the measurement data from the LMS 100 laser scanner from Sick AG and the “Cartographer” software during a teach-in drive [23]. The target trajectory led over a drive-on feeding alley in the dairy stable (marks 2–3), around the stable building (marks 3–4–1), and back to the starting point at the feeding alley (mark 2), corresponding to a route of about 110.5 m in length (Figure 5). The feeding alley has a metal catching/feeding fence on both sides and is divided lengthwise by a metal drawbridge. The outside walls are wooden with a concrete base. Calf hutches are set up in the eastern area of the barn. The environment offers highly variable conditions: floor coverings from gravel (79 m) to concrete (31.5 m), indoor and outdoor lighting conditions (31.5 m and 79 m, respectively), changing distances to reflective objects, changing materials/surface properties of the reflective objects, both static structural and dynamic living objects, such as cattle, in addition to dry or rainy conditions.
The extent to which the test drives (drive trajectories) deviate from the planned route (target trajectory) (accuracy) should be verified. Moreover, the extent to which the test drives (drive trajectories) deviate from one another (precision) should be examined. The criterion consistency is derived from accuracy and precision. The accuracy of a measurement system is the degree of deviation of measurement data from a true value. The accuracy was calculated by determining the closest neighbors of the drive trajectories to the target trajectory using the 1-nearest neighbor method (1NN method) and Euclidean distance calculation [24]. The reproducibility of a measuring system describes the extent to which random samples under the same conditions scatter over several repetitions (precision). The precision was calculated by determining the closest neighbors of the drive trajectories to each other using the 1NN method and Euclidean distance calculation. If the measurement results of an evaluated test are accurate and precise, then they are consistent (consistency) [22,25].
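A compact sketch of this 1NN evaluation is shown below; the trajectory arrays and function names are ours for illustration, not taken from the authors' software.

```python
# Sketch of the 1-nearest-neighbour (1NN) deviation computation with
# Euclidean distances; trajectories are arrays of (x, y) points.
import numpy as np
from scipy.spatial import cKDTree

def nn_deviations(reference, trajectory):
    """For each reference point, the Euclidean distance to its nearest
    neighbour on the other trajectory."""
    distances, _ = cKDTree(trajectory).query(reference, k=1)
    return distances

target  = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0]])      # target trajectory
drive_a = np.array([[0.0, 0.03], [0.9, -0.02], [2.1, 0.04]])  # test drive A
drive_b = np.array([[0.1, 0.02], [1.0, 0.01], [1.9, -0.03]])  # test drive B

accuracy  = nn_deviations(target, drive_a)   # drive vs. target trajectory
precision = nn_deviations(drive_a, drive_b)  # drive vs. drive
print(accuracy.mean(), precision.mean())     # mean deviations in metres
```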
In this work, a tolerance limit of 0.05 m is specified for precision and accuracy. This tolerance limit is user-defined and may take a different value in other scenarios. The limit is oriented on real-time kinematic (RTK) GNSS, which is used, e.g., for the guidance of farm machinery on fields and is capable of precision steering with deviations of only ±3 cm. It thus takes into account other high-precision navigation technologies as well as the restricted conditions inside buildings. The deviations of the individual trajectories are statistically evaluated with multiple comparisons, controlling the false discovery rate and using Welch's test.
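A hedged sketch of this statistical evaluation is given below: pairwise Welch tests between the deviation samples of individual drives, with the resulting p-values corrected by the Benjamini-Hochberg false discovery rate procedure. The sample data are synthetic; in the study, these would be the 1NN deviation samples per drive.

```python
# Pairwise Welch tests with false-discovery-rate correction; synthetic data.
import itertools
import numpy as np
from scipy.stats import ttest_ind
from statsmodels.stats.multitest import multipletests

rng = np.random.default_rng(0)
drives = {f"D{i}": rng.normal(0.04 + 0.01 * i, 0.02, size=200) for i in range(3)}

p_values, pairs = [], []
for a, b in itertools.combinations(drives, 2):
    # Welch's t-test: unequal variances and sample sizes are allowed.
    _, p = ttest_ind(drives[a], drives[b], equal_var=False)
    p_values.append(p)
    pairs.append((a, b))

# Control the false discovery rate over all pairwise comparisons.
rejected, p_adjusted, _, _ = multipletests(p_values, alpha=0.05, method="fdr_bh")
for (a, b), p_adj, rej in zip(pairs, p_adjusted, rejected):
    print(a, b, f"adjusted p = {p_adj:.3f}", "different" if rej else "similar")
```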
During the experiment, the raw laser data are processed in real time with a Monte Carlo filter, a method widely used for laser positioning [26], to remove measurement noise and thus increase the accuracy of the data. After the test has been carried out, the data are further processed using a Kalman filter and the associated high-resolution physical measuring system. The aim of using a Kalman filter is to eliminate systematic errors [27]. The systematic error is one of three types of errors; it is the only type that is considered avoidable and can be eliminated a priori. Random errors are random system errors whose cause is largely unknown [25,27]. Chaotic errors are mathematically inexplicable system errors whose cause is unknown [25].
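As a point of reference, one step of a generic Monte Carlo (particle) filter is sketched below. This is a textbook variant under assumed noise parameters, not the authors' implementation.

```python
# Generic Monte Carlo (particle) filter step for denoising 2D positions.
import numpy as np

rng = np.random.default_rng(1)
N = 500                                          # number of particles
particles = rng.normal(0.0, 1.0, size=(N, 2))    # initial (x, y) hypotheses
weights = np.full(N, 1.0 / N)

def particle_filter_step(particles, weights, control, measurement,
                         motion_noise=0.05, meas_noise=0.1):
    # Predict: propagate every particle with the motion model plus noise.
    particles = particles + control + rng.normal(0, motion_noise, particles.shape)
    # Update: weight particles by the likelihood of the laser measurement.
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-d2 / (2 * meas_noise ** 2))
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

particles, weights = particle_filter_step(
    particles, weights,
    control=np.array([0.1, 0.0]), measurement=np.array([0.12, 0.01]))
print(particles.mean(axis=0))  # filtered position estimate
```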
The high-resolution physical measuring system was developed to guarantee the optimal use of the Kalman filter for the evaluation. This can be described as follows.
$$\gamma(t+h) = \gamma(t) + \dot{\gamma}(t)\,h + O(h^2) \quad (1)$$

Equation (1) describes the mathematical relationship between the position vector $\gamma(t)$ at time $t$ and the position vector $\gamma(t+h)$ at time $t+h$ (current time plus a small increment $h$) and has an approximation quality of $O(h^2)$.

$$\gamma_x(t+h) = \gamma_x(t) + \cos\varphi(t)\,\dot{\gamma}(t)\,h + O(h^2) \quad (2)$$

Equation (2) is the x-component representation of Equation (1).

$$\gamma_y(t+h) = \gamma_y(t) + \sin\varphi(t)\,\dot{\gamma}(t)\,h + O(h^2) \quad (3)$$

Equation (3) is the y-component representation of Equation (1).

$$\varphi(t+h) = \varphi(t) + \frac{\dot{\gamma}(t)}{\sqrt{\gamma_x^2(t) + \gamma_y^2(t)}}\,h + O(h^2) \quad (4)$$

Equation (4) describes the mathematical relationship between the heading angle $\varphi(t)$ at time $t$ and the heading angle $\varphi(t+h)$ at time $t+h$ and has an approximation quality of $O(h^2)$.
$$\dot{\gamma}(t+h) = \dot{\gamma}(t) + \ddot{\gamma}(t)\,h + O(h^2) \quad (5)$$

$$\dot{\gamma}(t+h) = \dot{\gamma}(t) + \left[ \mathrm{const}\,\frac{\dot{\gamma}(t)}{\lVert \dot{\gamma}(t) \rVert} + \frac{\lVert \dot{\gamma}(t) \rVert^2}{\rho(t)} \begin{pmatrix} \cos\frac{\pi}{2} & -\sin\frac{\pi}{2} \\ \sin\frac{\pi}{2} & \cos\frac{\pi}{2} \end{pmatrix} \frac{\dot{\gamma}(t)}{\lVert \dot{\gamma}(t) \rVert} \right] h + O(h^2) \quad (6)$$

$$\dot{\gamma}(t+h) = \dot{\gamma}(t) + \left[ \mathrm{const} \begin{pmatrix} \cos\varphi(t) \\ \sin\varphi(t) \end{pmatrix} + \frac{\dot{\gamma}(t)^2}{\rho(t)} \begin{pmatrix} \cos\frac{\pi}{2} & -\sin\frac{\pi}{2} \\ \sin\frac{\pi}{2} & \cos\frac{\pi}{2} \end{pmatrix} \begin{pmatrix} \cos\varphi(t) \\ \sin\varphi(t) \end{pmatrix} \right] h + O(h^2), \quad \mathrm{const} = 0.6~\mathrm{m/s} \quad (7)$$

where the radius of curvature $\rho(t)$ is given by

$$\rho(t) = \frac{\left[ \left( \cos\varphi(t)\,\dot{\gamma}(t) \right)^2 + \left( \sin\varphi(t)\,\dot{\gamma}(t) \right)^2 \right]^{3/2}}{\cos\varphi(t)\,\dot{\gamma}(t)\,a_y - \sin\varphi(t)\,\dot{\gamma}(t)\,a_x} \quad (8)$$

and the quantities $a_x$ and $a_y$ in Equation (8) are given by the $O(h^2)$ central-difference approximations

$$a_x(t) = \frac{\cos\!\left(\varphi(t+10^{-5})\right)\left(\mathrm{const}+10^{-5}\right) - \cos\!\left(\varphi(t-10^{-5})\right)\left(\mathrm{const}-10^{-5}\right)}{2 \cdot 10^{-5}} + O(10^{-10}) \quad (9)$$

and

$$a_y(t) = \frac{\sin\!\left(\varphi(t+10^{-5})\right)\left(\mathrm{const}+10^{-5}\right) - \sin\!\left(\varphi(t-10^{-5})\right)\left(\mathrm{const}-10^{-5}\right)}{2 \cdot 10^{-5}} + O(10^{-10}) \quad (10)$$

Equation (5) describes the mathematical relationship between the absolute velocity $\dot{\gamma}(t)$ at time $t$ and the absolute velocity $\dot{\gamma}(t+h)$ at time $t+h$. By decomposing the acceleration vector $\ddot{\gamma}(t)$ into its tangential and normal components, one first arrives at Equation (6) and finally, by means of trigonometric simplifications, at Equation (7). The radius of curvature $\rho(t)$ at time $t$ enters through Equation (8). The quantities given by Equations (9) and (10) are required for calculating the radius of curvature $\rho(t)$ and describe the x and y components of the acceleration vector $\ddot{\gamma}(t)$ at time $t$.
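To illustrate how Equations (2)-(4) advance the vehicle state numerically, the following sketch propagates position and heading with an explicit Euler-type step. The step size, start state, and circle radius are assumptions chosen for the demonstration, not values from the experiment.

```python
# Illustrative numerical propagation of the planar vehicle model.
import math

def propagate(x, y, phi, v, omega, h):
    """One integration step of the planar vehicle model.

    x, y  : position components (gamma_x, gamma_y)
    phi   : heading angle
    v     : absolute velocity (gamma_dot)
    omega : heading rate, i.e. v / rho with rho the radius of curvature
    """
    x_new = x + math.cos(phi) * v * h      # Equation (2)
    y_new = y + math.sin(phi) * v * h      # Equation (3)
    phi_new = phi + omega * h              # Equation (4), with omega = v/rho
    return x_new, y_new, phi_new

# Drive a quarter circle of radius 5 m at 0.6 m/s (the const from Eq. (7)).
x, y, phi, v, rho, h = 0.0, 0.0, 0.0, 0.6, 5.0, 0.01
for _ in range(int((math.pi / 2) * rho / v / h)):
    x, y, phi = propagate(x, y, phi, v, v / rho, h)
print(round(x, 2), round(y, 2))  # ~(5.0, 5.0): end of the quarter circle
```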
$$z_k = H_k\,X_{k,k} + v_k \quad \mathrm{with} \quad v_k \sim N(0, R_k) \quad (11)$$

Equation (11) describes the relationship between the measurement data $z_k$, the measurement matrix $H_k$, and the actual noise-free position data $X_{k,k}$. The noise of the measurement data is modeled by a normally distributed error $v_k \sim N(0, R_k)$, where $R_k$ denotes the covariance matrix.
The physical system, Equations (1)-(10), and the measurement system, Equation (11), are linked to form the extended Kalman filter. The results of the Kalman filter are optimal estimates of the system state.
Predicted state estimate:
$$x_{k,k-1} = f(x_{k-1,k-1}) + w_k \quad \mathrm{with} \quad w_k \sim N(0, Q_{k-1}) \quad (12)$$
Predicted covariance estimate:
$$P_{k,k-1} = F_{k-1}\,P_{k-1,k-1}\,F_{k-1}^{T} + Q_{k-1} \quad (13)$$
Innovation:
$$y_k = z_k - H_k\,x_{k,k-1} \quad (14)$$
Innovation covariance:
$$S_k = H_k\,P_{k,k-1}\,H_k^{T} + R_k \quad (15)$$
Near-optimal Kalman gain:
$$K_k = P_{k,k-1}\,H_k^{T}\,S_k^{-1} \quad (16)$$
Updated state estimate:
$$x_{k,k} = x_{k,k-1} + K_k\,y_k \quad (17)$$
Updated covariance estimate:
$$P_{k,k} = \left(I - K_k H_k\right) P_{k,k-1} \quad (18)$$
The prediction in Equation (12), which in practice is subject to normally distributed noise $w_k$ with covariance matrix $Q_{k-1}$, yields the predicted trajectory $x_{k,k-1}$, which is then refined in the update step of Equation (17) using the Kalman gain. To predict the covariance matrix of the denoised measurement data, Equation (13), the Jacobian matrix $F_{k-1}$, the current covariance matrix $P_{k-1,k-1}$ of the denoised measurement data, and the covariance matrix $Q_{k-1}$ of the physical system are required. The Jacobian matrix $F_{k-1}$ is approximated by central finite differences with sufficient accuracy up to $O(h^2)$. The matrix $H_k$ in Equation (14) is the Jacobian observation matrix of the measurement system and relates the measurement data to the product of the observation matrix and the predicted position vector $x_{k,k-1}$. Equation (16) gives the near-optimal Kalman gain $K_k$, which is required in the update of Equation (17) to calculate the actual position vector $x_{k,k}$. Equation (18) describes the update of the covariance matrix of the denoised measurement data. The innovation covariance matrix of Equation (15) is required for Equation (16).
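The predict-update cycle of Equations (12)-(18) can be written compactly as follows. The model function f, its Jacobian, and the matrices are placeholders that would be filled with the physical system above, so this is a generic sketch rather than the authors' code.

```python
# Generic extended Kalman filter step matching Equations (12)-(18).
import numpy as np

def ekf_step(x, P, z, f, F_jac, H, Q, R):
    x_pred = f(x)                                  # Eq. (12): predicted state
    F = F_jac(x)
    P_pred = F @ P @ F.T + Q                       # Eq. (13): predicted covariance
    y = z - H @ x_pred                             # Eq. (14): innovation
    S = H @ P_pred @ H.T + R                       # Eq. (15): innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)            # Eq. (16): Kalman gain
    x_new = x_pred + K @ y                         # Eq. (17): updated state
    P_new = (np.eye(len(x)) - K @ H) @ P_pred      # Eq. (18): updated covariance
    return x_new, P_new

# Toy usage: track a 2D position that is measured directly (H = I).
f = lambda x: x
F_jac = lambda x: np.eye(2)
H, Q, R = np.eye(2), 0.01 * np.eye(2), 0.1 * np.eye(2)
x, P = np.zeros(2), np.eye(2)
for z in np.array([[0.1, 0.0], [0.12, 0.05], [0.2, 0.04]]):
    x, P = ekf_step(x, P, z, f, F_jac, H, Q, R)
print(x)  # smoothed position estimate
```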
In total, four runs were conducted. The first run was a test run to check that every aspect of the test setup was working. The second run revealed some issues, which were tracked down with observations made during the test and the notes in the test protocol. The robot displayed an erratic driving pattern throughout run 2 on every drive. The suspected reason was an insufficient number of waypoints on the programmed route. An evaluation did not provide any quantifiable results, so these measurement data were excluded from further analysis. For runs 3 and 4, the number of waypoints in the route was increased, and the erratic driving pattern could no longer be observed. In addition to the x and y coordinates for each measurement point, the measurement time is also saved, which makes it easier to assign the data. The laser measurement data of every drive are recorded with ROS as a “.bag” file and then saved as a text file. The measurement data can thus be evaluated in Microsoft Excel and MATLAB.
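One possible export path for the recorded “.bag” files is sketched below, assuming a ROS 1 environment with the rosbag Python API. The topic name /scan and the file names are assumptions for illustration; the article does not state them.

```python
# Hedged sketch: export laser scans from a ROS 1 bag to a tab-separated
# text file for evaluation in Excel or MATLAB.
import csv
import rosbag  # available in a ROS 1 (e.g., Melodic/Noetic) installation

with rosbag.Bag("run3_drive6.bag") as bag, \
        open("run3_drive6.txt", "w", newline="") as out:
    writer = csv.writer(out, delimiter="\t")
    writer.writerow(["stamp", "ranges"])
    for _, msg, t in bag.read_messages(topics=["/scan"]):
        # One row per scan: timestamp plus the 270°/0.5° range readings.
        writer.writerow([t.to_sec()] + list(msg.ranges))
```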

4. Results

The test results demonstrate the capabilities of the automatic lane guidance via LiDAR scanner; their evaluation is validated on the basis of three characteristics: Accuracy, Precision, and Consistency. For a clear and definite assignment in the following paragraphs, runs and drives are denoted by codes, e.g., R3D6 (run 3, drive 6).

4.1. Accuracy

The determination of the accuracy refers to the distances between the target trajectory and the test drives. The waypoints for the navigation represented the target trajectory, which in turn formed the true (target) trajectory in the accuracy calculation. Then, the accuracy for each point of the target trajectory was determined with the help of the 1NN method. For each point of the target trajectory, the nearest neighbor of all nine considered trajectories from run 3 in the section [mark 2–3] (referred to as “alley”) inside the facility, was determined by means of the Euclidean distance (Figure 5).
In Figure 6, a greater deviation of the trajectories at the beginning of the section “alley” during run 3 is visible, which resulted from the previous cornering. The “steering angle” produced by the different wheel speeds on the drive axle (rear axle) favored these deviations when cornering. The driving accuracy itself can be very high despite the fluctuations in the data, since even small steering commands are amplified at the scanner, which is placed above the front axle, by the lever arm to the drive axle. These measurement data were also corrected on the assumption that they were influenced by the previous cornering. Nearest neighbors that may have been incorrectly assigned were adjusted to reduce the systematic error. This two-stage preprocessing and cleaning of the measurement data led to a reliable data basis and enabled robust evaluation results.
Figure 7 shows an example of how the accuracy of the drives from run 3 in the route section is determined relative to a reference point of the target trajectory. The target trajectory runs roughly horizontally through the point XTRUE. No regularity was discernible in the scatter of the measurement points of the drive trajectories around the target trajectory. The deviation of the closest neighbors to the point XTRUE varied from 0.0058 m (R3D2) to 0.0572 m (R3D7). Therefore, the measured values of the individual trajectories are classified as accurate in relation to this measuring point.
In Figure 8, the 1NN method is illustrated in a 1-point scenario. As can be deduced from the figure, the calculated distance is greater than the actual deviation. From this, it can be concluded that the deviation is actually smaller than the evaluation of the data suggests. An evident uncertainty behind this finding is that it is not known how the robot moved between the two measuring points. It can be assumed that it drove the direct route; however, it is also possible that it moved irregularly in the area. Basically, a positive trend can be assumed, which implies that the deviations from the target trajectory are actually smaller than those determined in the evaluations.
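This geometric argument can be made concrete with a point-to-segment distance, which can never exceed the 1NN point-to-point distance when straight-line motion between measured points is assumed; the coordinates below are illustrative.

```python
# Point-to-segment distance as a tighter deviation estimate than 1NN.
import numpy as np

def point_to_segment(p, a, b):
    """Distance from point p to the line segment a-b."""
    ab, ap = b - a, p - a
    t = np.clip(np.dot(ap, ab) / np.dot(ab, ab), 0.0, 1.0)
    return np.linalg.norm(p - (a + t * ab))

p = np.array([0.5, 0.04])                          # a target-trajectory point
a, b = np.array([0.0, 0.0]), np.array([1.0, 0.0])  # consecutive drive points

nn = min(np.linalg.norm(p - a), np.linalg.norm(p - b))
print(point_to_segment(p, a, b), "<=", nn)  # 0.04 <= 0.502
```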
The results of the accuracy assessment of all drives from run 3 in the route section are shown in Figure 9. In total, 171 nearest neighbors were evaluated for the nine trajectories of run 3. Only the adjusted results from the processes described above were taken into account.

4.2. Precision

In contrast to the accuracy calculation, the precision results from the calculation of the distances between the drives. In the precision calculation, the 1NN method was used to determine the precision in the repetition of the drives. For each point of the drive, the closest neighbor of all nine considered drives from runs 3 and 4 was determined using the Euclidean distance. The data were processed in the same way as for the determination of the accuracy.
The closest neighbors are calculated for the measurement data of the nine selected drives from run 3 in the route section “alley”. As an example, Figure 10 shows how the precision is calculated between the drives R3D6 and R3D1. A total of 514 closest neighbors between R3D6 and R3D1 were determined. The trajectories of the drives R3D6 and R3D1 are between 0.0008 and 0.0825 m apart. The average precision for this pair of trajectories is 0.0327 m. Therefore, the pair of trajectories is precise.
Figure 11 shows the statistical relationship between the precision of the nine considered trajectories from run 3 in the route section [mark 2–3]. A total of 15,584 nearest neighbors form the data basis for the histogram. Overall, 1.4% of the values were over 0.1 m.
Figure 12 shows the statistical relationship between the precisions of the nine considered trajectories from run 3 in the route section. A total of 35,192 nearest neighbors formed the data basis for the histogram. Overall, 18.9% of the values were over 0.1 m.
Figure 13 shows the statistical relationship between the precisions of the nine considered trajectories from run 4 in the route section. A total of 23,437 nearest neighbors formed the data basis for the histogram. Overall, 27.3% of the values were over 0.10 m, 6.4% were over 0.20 m, and 1.5% were over 0.3 m.

4.3. Accuracy, Precision, and Consistency

In this work, a tolerance limit of 0.05 m is specified for precision and accuracy. This tolerance limit can be selected by the user and may thus take a different value in other scenarios. Measured against this threshold value, the navigation performance of the vehicle with a LiDAR scanner can be evaluated as follows:
In the route section “alley” from run 3, the navigation is rated as accurate and precise, and consequently consistent. An average deviation of 0.0487 m (Accuracy) or 0.0439 m (Precision) was determined. The standard deviation was 0.0286 m (Accuracy) and 0.0238 m (Precision). The maximum value was 0.1406 m (Accuracy) and 0.1511 m (Precision), the minimum value was 0.0047 m (Accuracy) and 0.00001 m (Precision) (Figure 14).
In run 3, the navigation is assessed as inaccurate and imprecise, and subsequently as inconsistent. An average deviation of 0.0608 m (Accuracy) or 0.0707 m (Precision) was determined. The standard deviation was 0.0450 m (Accuracy) and 0.0588 m (Precision). The maximum value was 0.3337 m (Accuracy) and 0.4162 m (Precision), the minimum value was 0.0047 m (Accuracy) and 0.0001 m (Precision).
In run 4, the navigation is assessed as inaccurate and imprecise, and subsequently as inconsistent. An average deviation of 0.0878 m (Accuracy) or 0.0820 m (Precision) was determined. The standard deviation was 0.0657 m (Accuracy) and 0.0645 m (Precision). The maximum value was 0.3510 m (Accuracy) and 0.4276 m (Precision), the minimum value was 0.0099 m (Accuracy) and 0.0002 m (Precision).
The data showed a relationship between the number of measurement data and the accuracy or precision of the navigation. Drives with a more extensive measurement data basis resulting from the 1NN method are assessed as more accurate than drives with a smaller one. This observation holds for all drives.

5. Discussion

The aim of this article is to analyze the navigation of a driverless feed mixer with LiDAR under practical conditions and present an orientation on how to evaluate these navigation technologies.
A practical, realistic environment is characterized by undefined dynamics. Animals (cows and calves), people (employees and test staff), and machines (farm loaders and mixer wagons) were present in the direct test environment at the TUM “Veitshof” research farm. At the same time, the route leads past a differentiated environment and over heterogeneous ground conditions. This environment is highly demanding on the technology. Particular challenges include the navigation over the cramped feeding table in the stable building, the navigation in the outside area with few natural landmarks for direction coordination, and the condition of the subsoil with inclines, slopes, and changing surfaces. Therefore, the driverless navigation in and around the stable building can be considered a significant success. It exhibited an easy yet functional implementation, despite not meeting the self-set deviation limits.
The selected characteristics are well suited for evaluating laser-assisted navigation. The accuracy in the repetition of a previously programmed route by the automated vehicle is a fundamental requirement of the system. However, the navigation performance cannot be comprehensively assessed through accuracy alone, since chaotic or random errors can lead to dispersion in the reproduction of the drives. Only by considering the repeated drives and their deviation from one another (precision) in connection with the accuracy of the lane guidance does the consistent correctness (consistency) of the navigation performance result.
The results of the calculation of Accuracy and Precision show that navigation using the LiDAR scanner is inaccurate and imprecise when measured against the tolerance limit of 0.05 m. This binary result must, however, be considered in a more differentiated manner. First, the tolerance limit of 0.05 m for Accuracy and Precision can be freely selected and thus adapted to different requirements. This value was selected in the present experiment since the routes of an automatic feed mixer wagon can also be located inside buildings and in confined conditions. As described in the results, the navigation performance improves significantly in the stable building on the cramped feeding table between the feed fences of the stall. The navigation performance in the section “alley” is described as both accurate and precise, and therefore consistent. The main factors that positively influence this improved navigation are the paved concrete feeding table and the numerous detectable objects in the area.
In the outside area, on the other hand, the navigation performance deteriorates but remains within an acceptable range. It is reasonable to select a lower accuracy requirement in the outdoor area due to the smaller number of obstacles or objects in the driving area. One manufacturer of an automatic feeding system even showed that the driving accuracy and reproduction can be deliberately manipulated with a random or regular offset so that floor coverings, such as tar or pavement, are stressed more evenly and do not tend to form ruts. This does not affect the finding that the navigation performance increases as the number of objects in the detection area of the LiDAR scanner increases.
The most inaccurate and imprecise places were revealed when cornering, especially in the area of the incline at the stable entrance [mark 2] and the gradient at the stable exit [mark 3]. Here, the navigation is impaired by the angular transition of the floor covering from gravel to concrete, the rigid mounting of the laser scanner on the test vehicle with its resulting strong vibrations and inclinations in a disadvantageous detection area, and the inconvenient positioning of the laser scanner above the front axle at an appreciable distance (lever) from the drive axle. The smallest steering commands are amplified by this lever up to the laser scanner, which makes it difficult to steer the vehicle very accurately and precisely. This effect is intensified by the type of steering of the vehicle, which is realized by different wheel speeds on the drive axle (rear axle).
As a final point, the calculated characteristics tend to understate the navigation performance. This is due to the 1NN method and the calculation of the Euclidean distance: since the target trajectory is subdivided into a limited number of waypoints, the computed distances of the drive trajectories are generally not perpendicular distances to the target trajectory. It can be assumed that the vehicle traveled directly between the waypoints and that the navigation performance is therefore better than determined in the calculations. Considering the circumstances encountered throughout the tests, navigation with LiDAR scanners in an agricultural environment can be classified as entirely sufficient and suitable.
The factors most detrimental to the navigation were found in the machine construction, not in the capabilities of the LiDAR scanner.
The scope of the measurement data when determining the accuracy must be viewed critically. With the inclusion of further measuring points, the informative value and reliability of the parameter could be increased. Before starting the experiment, more waypoints or target trajectory positions would have to be stored. With fewer measurement gaps in the sequence, the data-cleansing effort could also be reduced. Additionally, an increased number of XTRUE points and a reduced distance between two XTRUE points should improve the accuracy of the measurement data. However, there is a limit, since too many stored waypoints could cause the software to overcompensate, thereby making the data more inaccurate. Moreover, due to the interpolation between two waypoints, several waypoints could no longer be approached precisely. It can be assumed that the influence of discretization is generally higher on curves than on straight lines. In contrast to the determination of the Accuracy, a sufficiently large database was available for determining the Precision. Although the significance of the Precision was rated highly, it could be further improved by increasing the number of waypoints. As with the Accuracy, the number of useful waypoints is limited if imprecision is to be avoided.

6. Conclusions

The calculation of the characteristics Accuracy, Precision, and Consistency describes the LiDAR navigation as inaccurate and imprecise. However, a differentiated, analytical assessment of the test conditions yields the result that LiDAR scanner-based navigation is a suitable option for automating automatic feeding systems. The potential of this highly developed technology is also relevant for use on other agricultural machines. Beyond that, the usage of LiDAR in other challenging environments seems promising, e.g., for the exploration of unknown terrain.
Additionally, the navigation should be tested with a LiDAR scanner on a mixer wagon. Long-term tests should assess whether there are restrictions in the use of LiDAR technology on an agricultural vehicle. Furthermore, it would be advantageous to evaluate how navigation with LiDAR can be achieved outdoors with few detectable objects. In this scenario, can artificial landmarks be added that blend in homogeneously with the surroundings, or is it advisable to use GNSS support in addition to LiDAR in the outside area?
A critical concern in automated vehicles is personal protection and collision avoidance. Due to their physical properties, laser scanners are only partially suitable as an optical measurement technology for personal protection in outdoor areas. Here, it is important to clarify whether laser scanners alone are sufficient for these tasks or whether other technologies should supplement the concept of an automatic feeding system.

Author Contributions

Conceptualization, M.R.; methodology, M.R.; validation, M.R.; formal analysis, M.R.; investigation, M.R.; resources, M.R.; data curation, M.R.; writing—original draft preparation, M.R.; writing—review and editing, H.B. and J.S.; visualization, M.R.; supervision, H.B. and J.S.; project administration, J.S.; funding acquisition, J.S. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Bavarian State Ministry for Science and Art (Bayerisches Staatsministerium für Unterricht und Kultus). Funding indicator VIII.2-F1116.WE/15/3.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

The authors would like to thank the project partner Mayer Maschinenbaugesellschaft mbH. Thanks to Veronica Ramirez for critical reading of the manuscript.

Conflicts of Interest

The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

  1. Haidn, B.; Macuhova, J.; Maier, S.; Oberschätzl, R. Automatisierung der Milchviehhaltung in Beständen bis 200 Kühe - Schwerpunkt Fütterung, Kiel. 2013. Available online: http://www.wgmev.de/download/jahrestagungen/jahrestagungen-archiv/jahrestagung-2013/automatisierung-fuetterung-in-der-milchviehhaltung.html (accessed on 10 January 2018).
  2. Grothmann, A. Einfluss von Automatischen Fütterungsverfahren in der Milchviehhaltung auf das Tierverhalten und die Futterqualität. Ph.D. Thesis, University of Hohenheim, Hohenheim, Germany, 2015. [Google Scholar]
  3. Adeili, S.; Haidn, B.; Robert, M. Development of a control unit for the autonomous guidance of manure removal-, cubicle cleaning- and bedding machines, as well as of automotive fodder-mixing-vehicles. In Proceedings of the Environmentally Friendly Agriculture and Forestry for Future Generations, XXXVI CIOSTA & CIGR SECTION V Conference, Sankt Petersburg, Russia, 26–28 May 2015; Popov, V.D., Belyakov, V.V., Eds.; Saint Petersburg State Agrarian University: Sankt Petersburg, Russia, 2015; pp. 19–26, ISBN 978-5-85983-257-6. [Google Scholar]
  4. Bechar, A.; Vigneault, C. Agricultural robots for field operations: Concepts and components. Biosyst. Eng. 2016, 149, 94–111. [Google Scholar] [CrossRef]
  5. Ullrich, G. The history of automated guided vehicle systems. In Automated Guided Vehicle Systems; Ullrich, G., Ed.; Springer: Berlin/Heidelberg, Germany, 2015; pp. 1–14. ISBN 978-3-662-44813-7. [Google Scholar]
  6. Csaba, G.; Somlyai, L.; Vamossy, Z. Mobil robot navigation using 2D LIDAR. In Proceedings of the 2018 IEEE 16th World Symposium, Kosice and Herlany, Slovakia, 2 July 2018; pp. 143–148. [Google Scholar]
  7. Fritsche, P.; Kueppers, S.; Briese, G.; Wagner, B. Radar and LiDAR Sensorfusion in low visibility environments. In Proceedings of the 13th International Conference on Informatics in Control, Automation and Robotics, Lisbon, Portugal, 29–31 July 2016; SCITEPRESS-Science and Technology Publications: Lisbon, Portugal, 2016; pp. 30–36, ISBN 978-989-758-198-4. [Google Scholar]
  8. Kubinger, W.; Peschak, B.; Wöber, W.; Sulz, C. Bildgebende Sensorsysteme für robotische Systeme in der Agrar- und Landtechnik. Elektrotech. Inftech. 2017, 134, 316–322. [Google Scholar] [CrossRef]
  9. D’Adamo, T.; Phillips, T.; McAree, P. LiDAR-Stabilised GNSS-IMU Platform Pose Tracking. Sensors 2022, 22, 2248. [Google Scholar] [CrossRef] [PubMed]
  10. Vargas, J.; Alsweiss, S.; Toker, O.; Razdan, R.; Santos, J. An Overview of Autonomous Vehicles Sensors and Their Vulnerability to Weather Conditions. Sensors 2021, 21, 5397. [Google Scholar] [CrossRef] [PubMed]
  11. Frost & Sullivan. LiDAR: Driving the Future of Autonomous Navigation, Mountain View, CA. 2016. Available online: https://pdf4pro.com/download/lidar-driving-the-future-of-autonomous-navigation-2f66.html (accessed on 9 March 2021).
  12. Gotzig, H.; Geduld, G.O. LIDAR-Sensorik. In Handbuch Fahrerassistenzsysteme; Winner, H., Hakuli, S., Lotz, F., Singer, C., Eds.; Springer Fachmedien Wiesbaden: Wiesbaden, Germany, 2015; pp. 317–334. ISBN 978-3-658-05733-6. [Google Scholar]
  13. Oliveira, L.F.P.; Moreira, A.P.; Silva, M.F. Advances in Agriculture Robotics: A State-of-the-Art Review and Challenges Ahead. Robotics 2021, 10, 52. [Google Scholar] [CrossRef]
  14. Hata, A.; Wolf, D. Road marking detection using LIDAR reflective intensity data and its application to vehicle localization. In Proceedings of the 17th International IEEE Conference on Intelligent Transportation Systems (ITSC), Qingdao, China, 8–10 October 2014; pp. 584–589, ISBN 978-1-4799-6078-1. [Google Scholar]
  15. Ullrich, G. Fahrerlose Transportsysteme: Eine Fibel-mit Praxisanwendungen-zur Technik-für die Planung, 2nd ed.; Revised and Expanded Edition; Springer Fachmedien Wiesbaden: Wiesbaden, Germany, 2014; ISBN 978-3-8348-2591-9. [Google Scholar]
  16. Winner, H.; Hakuli, S.; Lotz, F.; Singer, C. (Eds.) Handbuch Fahrerassistenzsysteme; Springer Fachmedien Wiesbaden: Wiesbaden, Germany, 2015; ISBN 978-3-658-05733-6. [Google Scholar]
  17. Wang, H.; Wang, B.; Liu, B.; Meng, X.; Yang, G. Pedestrian recognition and tracking using 3D LiDAR for autonomous vehicle. Robot. Auton. Syst. 2017, 88, 71–78. [Google Scholar] [CrossRef]
  18. Eichler, J.; Eichler, H.-J. Laser: Bauformen, Strahlführung, Anwendungen, 7th ed.; Updated Edition; Springer: Berlin/Heidelberg, Germany; Dordrecht, The Netherlands; London, UK; New York, NY, USA, 2010; ISBN 9783642104626. [Google Scholar]
  19. Geduld, G. Lidarsensorik. In Handbuch Fahrerassistenzsysteme: Grundlagen, Komponenten und Systeme für Aktive Sicherheit und Komfort, 2nd Corrected Edition; Winner, H., Hakuli, S., Wolf, G., Eds.; Vieweg + Teubner: Wiesbaden, Germany, 2012; pp. 172–185. ISBN 9783834814579. [Google Scholar]
  20. Cacilo, A.; Schmidt, S.; Wittlinger, P.; Herrmann, F.; Bauer, W.; Sawade, O.; Doderer, H.; Hartwig, M.; Scholz, V. Hochautomatisiertes Fahren auf Autobahnen - Industriepolitische Schlussfolgerungen; Dienstleistungsprojekt 15/14: Study Commissioned by the Federal Ministry for Economic Affairs and Energy (BMWi); Fraunhofer-Gesellschaft for the Promotion of Applied Research Registered Association: Munich, Germany, 2015. [Google Scholar]
  21. Götting, K.G. Laserscanner zur Navigation. Available online: https://www.goetting.de/komponenten/43600 (accessed on 19 September 2018).
  22. Wörz, S.; Mederle, M.; Heizinger, V.; Bernhardt, H. A novel approach to piecewise analytic agricultural machinery path reconstruction. Eng. Optim. 2017, 49, 2150–2173. [Google Scholar] [CrossRef]
  23. Hess, W.; Kohler, D.; Rapp, H.; Andor, D. Real-time loop closure in 2D LIDAR SLAM. In Proceedings of the 2016 IEEE International Conference on Robotics and Automation, Stockholm, Sweden, 16–21 May 2016; Okamura, A., Menciassi, A., Eds.; IEEE: Piscataway, NJ, USA, 2016; pp. 1271–1278, ISBN 978-1-4673-8026-3. [Google Scholar]
  24. Guo, G.; Wang, H.; Bell, D.; Bi, Y.; Greer, K. KNN Model-Based Approach in Classification. In On The Move to Meaningful Internet Systems 2003: CoopIS, DOA, and ODBASE; Goos, G., Hartmanis, J., van Leeuwen, J., Meersman, R., Tari, Z., Schmidt, D.C., Eds.; Springer: Berlin/Heidelberg, Germany, 2003; pp. 986–996. ISBN 978-3-540-20498-5. [Google Scholar]
  25. Deutsches Institut für Normung e.V. Grundlagen der Meßtechnik - Teil 1: Grundbegriffe: 1995-01; Beuth Verlag GmbH: Berlin, Germany, 1995.
  26. Liu, Y.; Zhao, C.; Wei, Y. A Robust Localization System Fusion Vision-CNN Relocalization and Progressive Scan Matching for Indoor Mobile Robots. Appl. Sci. 2022, 12, 3007. [Google Scholar] [CrossRef]
  27. Jetti, H.V.; Salicone, S. A Possibilistic Kalman Filter for the Reduction of the Final Measurement Uncertainty, in Presence of Unknown Systematic Errors. Metrology 2021, 1, 39–51. [Google Scholar] [CrossRef]
Figure 1. The rotating laser scanner is mounted on the vehicle and measures the reference marks on pillars and walls. Reprinted/adapted with permission from Ref. [21]. 2018, Götting KG.
Figure 2. Basic sketch of the determination of the relative position between the reference mark and the vehicle (left) and the absolute position in the coordinate system (right) (Adapted after SICK AG, in [5]).
Figure 3. The sequence of images shows the clockwise rotation of the 2D laser scanner (white arrow) in the direction of travel and the output of the processed scanner data as a 3D point cloud on a monitor. Visitors to the fair can be seen passing by (yellow arrows). (Own recordings).
Figure 4. The robot vehicle with a laser scanner (blue-black) and radar scanner in the early prototype stage (white). (Own recordings).
Figure 5. The route with marks 1 to 4. The route of the robotic vehicle did not change during the runs. Only the starting point has been relocated. (Scale 1:10) (google.maps.de).
Figure 6. Plot of the nine considered test drives from run 3 in the route section [mark 2–3].
Figure 7. Selected 1-point zoom in target and single drive trajectories.
Figure 8. With the 1NN method, the Euclidean calculated distance could only approximate the true distance. Herein, the calculated distance of a point of the trajectory R3D10 is, for example, greater than the actual deviation.
Figure 9. The box plots show the deviations of the closest neighbors (accuracy) for each single drive from run 3 in the route section. (+—maximum deviation).
Figure 10. The precision calculated using the 1NN method for the drives R3D6 and R3D1. The precision was between 0.0008 and 0.0825 m.
Figure 11. The histogram shows the statistical relationship between the precision of the nine drives taken into account from run 3 in the route section [mark 2–3].
Figure 12. The histogram shows the statistical relationship between the precision of the nine drives taken from run 3 in the route section.
Figure 13. The histogram shows the statistical relationship between the precision of the nine drives taken from run 4 in the route section.
Figure 14. The diagram shows the average deviation, standard deviation, maximum and minimum of three different runs, for both accuracy and precision.
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
