Article

Quality Analysis of a High-Precision Kinematic Laser Scanning System for the Use of Spatio-Temporal Plant and Organ-Level Phenotyping in the Field

Institute of Geodesy and Geoinformation, University of Bonn, Nußallee 17, 53115 Bonn, Germany
*
Author to whom correspondence should be addressed.
Remote Sens. 2023, 15(4), 1117; https://doi.org/10.3390/rs15041117
Submission received: 3 January 2023 / Revised: 10 February 2023 / Accepted: 16 February 2023 / Published: 18 February 2023
(This article belongs to the Special Issue 3D Modelling and Mapping for Precision Agriculture)

Abstract

Spatio–temporal determination of phenotypic traits, such as height, leaf angles, and leaf area, is important for the understanding of crop growth and development in modern agriculture and crop science. Measurements of these parameters for individual plants have so far only been possible in greenhouse environments using high-resolution 3D measurement techniques, such as laser scanning or image-based 3D reconstruction. Although aerial and ground-based vehicles equipped with laser scanners and cameras are increasingly used under field conditions to perform large-scale phenotyping, these systems usually provide parameters at the plot level rather than at the single-plant or organ level. The reason for this is that the quality of the 3D information generated with those systems is mostly not high enough to reconstruct single plants or plant organs. This paper presents the use of a robot equipped with a high-resolution mobile laser scanning system. We use the system, which is usually used to create high-definition 3D maps of urban environments, for plant and organ-level morphological phenotyping in agricultural field conditions. The analysis focuses on the point cloud quality as well as the system’s potential by defining quality criteria for the point cloud and system and by using them to evaluate the measurements taken in an experimental agricultural field with different crops. Criteria for evaluation are the georeferencing accuracy, point precision, spatial resolution, and point cloud completeness. Additional criteria are the large-scale scan efficiency and the potential for automation. Wind-induced plant jitter that may affect the crop point cloud quality is discussed afterward. To show the system’s potential, exemplary phenotypic traits of plant height, leaf area, and leaf angles for different crops are extracted based on the point clouds.
The results show a georeferencing accuracy of 1–2 cm, a point precision on crop surfaces of 1–2 mm, and a spatial resolution of just a few millimeters. Point clouds become incomplete in the later stages of growth since the vegetation is denser. Wind-induced plant jitter can lead to distorted crop point clouds depending on wind force and crop size. The phenotypic parameter extraction of leaf area, leaf angles, and plant height from the system’s point clouds highlights the outstanding potential for 3D crop phenotyping at the plant-organ level in agricultural fields.

Graphical Abstract

1. Introduction

Computing large-scale phenotypic traits of crops under field conditions within the phases of growth is one of the most important tasks in the context of breeding, crop health and plant performance monitoring, and many other fields in crop sciences [1,2,3,4,5,6]. Crop traits of interest are often leaf area index (LAI), leaf angle distribution (LAD), leaf length and width, plant height, and biomass [7,8,9,10,11,12]. In agricultural science, these traits play an important role in understanding the crop’s responses to environmental conditions [13], predicting yield [14], specifying evapotranspiration in water cycles [15], and determining chlorophyll content [16] or daily accumulated light [17] by using leaf area information. Traits are often used to optimize mixed-cropping cultivation in terms of resistance to diseases and yield increase [18]. The classic destructive method to calculate phenotyping parameters requires single plants to be removed from the field, which is very time-consuming and prevents individual plants from being monitored over time. For that reason, non-destructive approaches have become more popular. By using sensor technology for the generation of high-resolution, three-dimensional point clouds, traits of interest can be extracted in a short time without removing single plants from the field, allowing spatio–temporal crop monitoring. The following three sections summarize the most relevant sensor technologies for point cloud creation under field conditions.
(1) Terrestrial laser scanning (TLS) is a popular sensor technology in this context, well established in a wide range of industrial and monitoring tasks. To create point clouds of the environment, scans are taken from several views. Large-scale and detailed point clouds are created via scan registration, using corresponding targets or object geometries. However, this method is very time-consuming to plan and offers little potential for automation. In the context of high-resolution field crop scanning, several works deal with the computation of phenotypic traits from TLS point clouds [7,8,19,20] for crops such as maize, soybean, and wheat.
(2) Multi-view photogrammetry provides an alternative to terrestrial laser scanning. By taking images from different perspectives, three-dimensional colorized point clouds can be calculated. Challenges in this approach arise from image registration, varying illumination conditions, and a potentially large number of required images. Several works deal with image-based 3D reconstruction in the context of phenotyping [8,21,22,23] for crops such as maize, cabbage, cauliflower, sugar beet, and brassica.
(3) Kinematic laser scanning is another technique for high-resolution point cloud generation. By georeferencing single-profile laser scans with the position and orientation of the system, high-resolution point clouds of the environment can be captured in a short time. Position and orientation of the system for georeferencing are usually estimated by fusing sensor data from GNSS (global navigation satellite system) and IMU (inertial measurement unit) using algorithms such as Kalman filter techniques [24]. In crop phenotyping, kinematic laser scanning systems have been used in several robotic field platforms [11,25,26,27].
The quality of point clouds generated with those systems is essentially influenced by the sensor properties. Two decisive factors affecting point cloud quality are the laser distance measurement noise and the angular resolution of the rotating laser scanner. In most existing research approaches related to crop point cloud creation in fields, low-cost LIDAR (light detection and ranging) sensors with a range precision of a few centimeters have been used. The generated crop point clouds indicate the shape of plant geometries but do not capture details such as small leaves and stems. Therefore, the extraction of 3D phenotyping parameters at the plant and organ level is not feasible.
In this work, a high-quality kinematic laser scanning system that was designed for surveying urban environments and roads is used. The system is equipped with a high-resolution and high-precision profile scanner, which has a range precision below 1 mm and delivers a spatial resolution of 0.5 mm at a measurement range of 3 m. In contrast to low-cost LIDAR sensors, this scanner has a much higher spatial resolution and significantly lower distance measurement noise. We use this system in combination with a field robot to create an extensive spatio–temporal data set of several crops at different growth stages in an agricultural field experiment. The main contribution of this paper is the quality analysis of the 3D crop point clouds created with the system under field conditions by definition of suitable quality parameters and their application on the gathered data set. We also present some exemplary morphological traits, such as leaf inclination angles, leaf areas, and plant height, which we determined from single plants in the field. We present and discuss the results concerning the capability of the presented measurement methodologies for single plant and organ-level morphological phenotyping.
The paper is structured as follows. The description of the laser scanning system and direct georeferencing of laser profiles for point cloud creation is given in Section 2.1, followed by a detailed description of the agricultural experimental field and the datasets taken for the evaluation in Section 2.2. In Section 2.3, we first describe the quality criteria and how we derive them and then explain the trait estimation pipeline. The results and discussion are presented in Section 3 of this paper.

2. Materials and Methods

2.1. Mobile Laser Scanning System

The laser scanning system used in this study is usually applied for high-resolution 3D mapping in urban environments. It consists of an inertial navigation system (iMAR iNAV FJi-LSURV) with an integrated global navigation satellite system (GNSS) receiver and a Leica AS10 GNSS antenna used for the system’s trajectory state estimation of position and orientation. For scanning the environment, the system is equipped with a profile scanner (Zoller+Froehlich Profiler 9012 A). The single-beam laser scanner rotates around its sensor axis and creates a single laser profile with every rotation. Based on the manufacturer’s manual, relevant information about the sensor specifications is given as follows. The scanner has a vertical field of view of 360° and measures up to 1 million points per second at a maximum scan rate of 200 profiles per second. The angular resolution of 0.0088° leads to spatial point distances along the profile of about 0.5 mm at a distance to the object of 3 m. The distance measurement works with a phase-based method and a wavelength of 635 nm. The laser spot size at a distance of 0.1 m is specified by the manufacturer with a diameter of 1.9 mm and can be assumed to be constant due to the short laser distance measurements of just a few meters in our application. The manufacturer specifies a range standard deviation below 1 mm in close-range mode. The scanner is tilted by 30° relative to the rigid base plate of our system, see Figure 1. An example scan profile is drawn in the right image of Figure 1. The laser scanner also measures the reflected laser intensity, which depends on the color, roughness, and reflectivity of the surface as well as on the laser incidence angle. If not otherwise noted, the measured laser intensity is used to colorize the point clouds presented throughout the paper.
The inertial navigation system of the iMAR contains a high-quality IMU with fiber optic gyroscopes and servo accelerometers for high-precision 3D angular velocity and acceleration measurements at a sample rate of 1000 Hz. Based on the IMU specifications in the user manual of the manufacturer, the IMU is classified into the high-end quality class defined in [28]. The multi-GNSS receiver NovAtel OEM4-G2L provides a position precision of a few centimeters when performing real-time kinematic (RTK) GNSS baseline estimation in post-processing. The Leica AS10 antenna observes signals from the GPS, GLONASS, BeiDou, and Galileo satellite systems and is mounted 60 cm above the IMU origin, see Figure 1a. The GNSS receiver provides absolute time stamps via a PPS (pulse per second) signal interface that we use to synchronize GNSS, IMU, and laser scanner data. The system is shown in Figure 1a. For field measurements, we mount the system on the roof of a robotic field platform, see Figure 1b. The robot platform can drive in field environments with a row width of 1.5 m. Although the platform can in principle be operated autonomously, it is remotely controlled in all experiments described in this paper.
For the generation of a consistent point cloud from multiple scan profiles, we follow the direct georeferencing equation of Figure 2, which transforms each laser scanner observation into the global coordinate system g. Figure 2 also shows this transformation in a schematic visualization. As the global coordinate system, we use a leveled coordinate system with a UTM (universal transverse Mercator) projection and ellipsoidal height.
The laser scanner measurements are given in the scanner’s sensor frame s and consist of an angle measurement ω and a distance measurement d, see Figure 2, right. We transform them into the vehicle-fixed body frame b using the rotation matrix R_s^b(α, β, γ) and the translation vector (Δx, Δy, Δz)^T of the system calibration, see Figure 2. The calibration parameters are the relative translation and rotation of the laser scanner frame with respect to the body frame b of the system, which coincides with the IMU origin. The system calibration parameters are estimated using a plane-based calibration field approach; detailed information about the plane setup and the system calibration technique is given in [29]. The resulting standard deviations are in the range of 1.5 mm for the translation and 0.005° (0.9 mm at 10 m) for the rotation. This accuracy is assumed to be sufficient for the task of crop point cloud generation. After the transformation from the sensor frame into the body frame using the system calibration parameters, the laser observations are represented in the body frame of the system.
For the georeferencing of laser points, the trajectory consisting of the position (t_x, t_y, t_z)^T and orientation (r, p, y)^T in the equation of Figure 2 must be known. The rotation angles roll r, pitch p, and yaw y describe the system’s orientation with respect to the local leveled coordinate frame. By fusing GNSS positions with the angular velocity and acceleration measurements of the IMU, the position and orientation of the system are estimated. First, the GNSS baselines are computed using master station data from a GNSS service provider and the receiver observations of our system. This results in a GNSS position rate of 0.5 Hz with a standard deviation of about 1–2 cm. Both the estimation of the GNSS baselines and of the trajectory states are performed using the commercial software Inertial Explorer by NovAtel, which uses a Kalman smoothing algorithm for trajectory estimation similar to [24]. Finally, the translation and orientation of the trajectory are interpolated at the laser observation timestamps, and the georeferencing equation of Figure 2 is used to create the point cloud. A schematic overview of the transformation is provided in the lower pipeline of Figure 2.
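The transformation chain just described can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the ZYX roll-pitch-yaw convention and the assumption that the scan profile lies in the scanner's y-z plane are choices made here for the example.

```python
import numpy as np

def rot_xyz(roll, pitch, yaw):
    """Rotation matrix from roll, pitch, yaw (radians), ZYX convention (an assumption)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def georeference_point(omega, d, calib_R, calib_t, traj_R, traj_t):
    """Transform one polar scanner observation (angle omega, range d)
    into the global frame via the calibration and the trajectory pose."""
    # profile observation in the scanner frame s (profile assumed in the y-z plane)
    x_s = np.array([0.0, d * np.cos(omega), d * np.sin(omega)])
    x_b = calib_R @ x_s + calib_t   # sensor frame s -> body frame b (system calibration)
    return traj_R @ x_b + traj_t    # body frame b -> global frame g (trajectory pose)
```

In the real pipeline, `traj_R` and `traj_t` would be interpolated from the estimated trajectory at each laser observation timestamp before this per-point transformation is applied.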

2.2. Field Experiments

The basis for the quality evaluation is a data set, which we generated with the kinematic laser scanning system on an experimental agricultural field with a size of 40 × 150 m near the city of Bonn, Germany, see Figure 3. The field contains several plots with different crops (sugar beet, maize, soybean, potato, and a mixture of wheat and beans, see Figure 3). The experiments are related to different variations in crop variety, seeding density, and herbicide application levels, as well as investigations of the advantages of mixed-cropping cultivation. The field is structured in plots with a size of 1.5 × 3.0 m, allowing all parts of the field to be reached by the robotic platform. The potato field on the left of Figure 3 is structured in rows and can be accessed by the robot as well. We navigate the system manually via remote control to reach the plots and rows of interest. Since the experimental field consists of repetitive rows and plots, we do not scan the entire field but select individual plots, which we observe multiple times within the vegetation period. The selected plots and rows within the field are highlighted in Figure 3 by the red boxes.
The experimental field also contains ground control targets at known positions. We use targets, highlighted in Figure 3, for the evaluation of the georeferencing accuracy and scan them multiple times during the season. Table 1 shows the days of measurements.
Some evaluation criteria that we want to analyze in our work, such as point precision and spatial resolution, depend on the scanning geometry, such as measurement distance, scanning angle to the surfaces, and reflection properties. On the other hand, criteria, such as the number of outliers and the point cloud completeness, depend on the plant organ size, vegetation density, and plant structure. All these characteristics vary for different crops and different growth stages. By using this extensive data set, we ensure a high variation in the context of plant size and height, plant organ size, and vegetation density.

2.3. Evaluation Criteria in Kinematic Crop Laser Scanning

There is a lack of studies in the research community related to the definition of quality criteria for crop point clouds. We, therefore, consider the related research field of engineering geodesy processes in which quality criteria, such as completeness, correctness, and accuracy, exist. A good overview of different quality measures is given in [30,31]. We adapt these ideas for the definition of our criteria related to the laser scanning system and the 3D crop point clouds in the context of plant and organ-level phenotyping.
We first define criteria and then explain the methodologies to evaluate them. These criteria are the georeferencing accuracy, the point cloud quality, the time consumption for scanning large fields, the potential for automation, and field capability. In this chapter, we also explain the methods of extracting phenotypic parameters based on the point clouds created in the field.

2.3.1. Georeferencing Accuracy

Due to the interest in single plants and their spatio–temporal monitoring in the field, it is important to find individual plants in the large data set. This means that the spatial coordinates assigned to the points in the cloud need a georeference that is accurate to within a few centimeters. The point clouds from the kinematic laser scanning system are directly georeferenced using the GNSS position information within the trajectory estimation of position and orientation. The accuracy of the georeferenced point cloud directly depends on the accuracy of the trajectory states, which is influenced by the GNSS conditions prevailing in the field and by the quality of the IMU observations.
The GNSS conditions depend on several factors. The number and geometry of incoming satellite signals are essential. Furthermore, multi-path signals caused by nearby objects can lead to biased positions. In the field, excellent GNSS conditions are present, as many GNSS signals are received from all directions and no buildings or other objects cause multi-path signals. Therefore, the GNSS positioning precision in the field can be predicted to be within a few centimeters. In addition, the quality of the IMU measurements also affects the trajectory state estimation and therefore the georeferencing precision. Due to the use of a high-end IMU and GNSS receiver in the system, the georeferencing precision can be expected a priori to be within a few centimeters.
In kinematic laser scanning, georeferencing accuracy evaluation is often performed through point cloud comparisons, by scanning static objects or points multiple times and estimating a standard deviation. Here, the accuracy is analyzed by comparing target center coordinates, similar to the evaluation methodology of [32]. A number of targets are deployed in the agricultural field and scanned several times during the vegetation period (Figure 3). The target scans are extracted manually from the whole point cloud using the open-source software CloudCompare, and the center points are identified and stored. By determining the target center coordinates multiple times within the growing season, multi-temporal mean target center coordinates are estimated. The residuals with respect to the mean centers and their standard deviations are computed to approximate the georeferencing accuracy. The results are shown in Section 3.1 of this work. Note that computing residuals with respect to the mean yields the georeferencing precision rather than the accuracy. Precision is the parameter of interest when a plant is visited multiple times, but since the term georeferencing precision is uncommon, we refrain from using it.
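The residual computation described above can be sketched as follows. This is a minimal NumPy version; the dictionary layout and the function name are illustrative, not part of the original pipeline.

```python
import numpy as np

def georeferencing_precision(target_centers):
    """target_centers: dict mapping target id -> (n_epochs, 3) array of
    center coordinates measured on different days.
    Returns the per-axis standard deviation pooled over all targets."""
    residuals = []
    for centers in target_centers.values():
        centers = np.asarray(centers, dtype=float)
        # residuals of each epoch's center to the multi-temporal mean center
        residuals.append(centers - centers.mean(axis=0))
    residuals = np.vstack(residuals)
    return residuals.std(axis=0, ddof=1)   # empirical std per coordinate axis
```

Pooling the residuals of all targets before computing the standard deviation assumes the georeferencing error behaves similarly everywhere in the field, which is plausible given the homogeneous GNSS conditions described above.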

2.3.2. Point Cloud Quality

Point Precision. A crucial criterion for the quality of point clouds is the point precision on the scanned plant surfaces. Sufficient point precision enables the surface reconstruction of plant organs, such as leaves and stems, which is necessary for the calculation of parameters such as LAI, LAD, and plant height. The term point precision describes how the distances of the measured points to the actual surface are distributed. This depends mainly on the laser scanner’s distance measurement noise, which is influenced by the reflection properties and the distance to the object as well as by the scan geometry defined by the laser beam’s angle to the surface.
For the point precision analysis on crop surfaces, the distances of points to a locally fitted plane are used. Single leaves are extracted manually from the point cloud and stored in a tree data structure. A core point on a flat area of the leaf is selected; in this area, a plane fit is assumed to be a sufficient description of the surface. All neighborhood points within a radius of 1 cm around this core point are determined using a tree range query. Afterward, a least-squares plane is fitted to these neighboring points, and the distances of the neighborhood points to the best-fit plane are computed. Finally, the standard deviation of these distances is estimated and used to approximate the point precision on leaf surfaces. To consider variations of the scanning geometry, the standard deviation is computed for different core points on the leaf. The pipeline is repeated for leaves of maize, soybean, potato, wheat, and sugar beet. The input of our self-implemented Python pipeline is a leaf point cloud that we extract manually from the whole point cloud. The results of the point precision analysis are presented and discussed in Section 3.2.1 of this paper.
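The plane-fit step of this pipeline can be sketched as follows. The SVD-based fit is one standard way to realize the least-squares plane fit described above; the function name, patch layout, and brute-force neighborhood query are illustrative only (the paper's pipeline uses a tree range query).

```python
import numpy as np

def point_precision(points, core_point, radius=0.01):
    """Standard deviation of point-to-plane distances around a core point.
    points: (N, 3) leaf point cloud in meters; radius: neighborhood size."""
    pts = np.asarray(points, dtype=float)
    # neighborhood of the core point (brute force; a k-d tree scales better)
    nbh = pts[np.linalg.norm(pts - core_point, axis=1) <= radius]
    centroid = nbh.mean(axis=0)
    # the right singular vector of the smallest singular value of the
    # centered patch is the least-squares plane normal
    _, _, vt = np.linalg.svd(nbh - centroid)
    normal = vt[-1]
    dists = (nbh - centroid) @ normal   # signed point-to-plane distances
    return dists.std(ddof=1)
```

For a perfectly planar patch the returned value is zero; for real leaf scans it approximates the scanner's range noise on that surface.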
Spatial Resolution. The extraction of phenotypic parameters at the plant and plant-organ level from point clouds requires these structures to be captured at sufficient quality. Plant organs, such as leaves and stems, are often very small, especially in the early stages of growth. A reliable spatial resolution of the point cloud is necessary to capture those plant structures. We define spatial resolution as the point-to-point distances within the point cloud.
In kinematic laser scanning, multiple influences affect the spatial resolution of point clouds. The resolution can be divided into components across and along the direction of travel. Across the direction of travel, the spatial resolution is influenced by the laser scanner’s angular resolution and the distance to the objects, as the point-to-point distances increase with larger measurement ranges. Along the direction of travel, the scanner’s profile rate and the velocity of the system influence the resolution. Independent of the direction of travel, the angle between the plant surfaces and the laser beam influences the spatial resolution as well; small incidence angles lead to a lower resolution than large ones.
By making assumptions about the scan geometry, the system’s velocity, and the properties of the profile scanner, we approximate the spatial resolution in advance as follows. Assuming a range of 3 m to the object and using the scanner’s angular resolution of 0.0088°, the point-to-point distances are about 0.5 mm along the scan profile. At a profile rate of 200 Hz and a system velocity of 10 cm/s, the resolution in the direction of travel is 0.5 mm as well. Of course, these values are only valid for idealized geometries and do not hold for scans of complex plant surfaces, since the spatial resolution is highly influenced by the orientation of the crop surfaces to the laser scanner.
To empirically determine the spatial resolution under field conditions, leaves of different crops are extracted manually from the point clouds. The point clouds are stored in a tree data structure, which is exploited to determine the nearest neighbor of each point using a minimal range query. The distance to the nearest neighbor is obtained for each point in the point cloud, and the mean, minimum, and maximum distances are calculated. The self-implemented Python pipeline is applied to different crops to achieve a variation in the size and shape of plant organs. Results related to the spatial resolution are presented in Section 3.2.2 of this paper.
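The nearest-neighbor analysis can be sketched as follows. This brute-force O(n²) version is for illustration only; the paper's pipeline uses a tree data structure, which scales much better for full leaf clouds.

```python
import numpy as np

def spatial_resolution(points):
    """Mean, minimum, and maximum nearest-neighbor distance of a point cloud,
    used here as an empirical measure of spatial resolution."""
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)   # exclude each point itself
    nn = dist.min(axis=1)            # distance to the nearest neighbor
    return nn.mean(), nn.min(), nn.max()

# A priori estimate from the scanner specifications quoted above:
# across track: 3 m * 0.0088 deg = 3 * 0.0088 * pi / 180 m ≈ 0.46 mm
# along track:  (0.10 m/s) / (200 profiles/s) = 0.5 mm
```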
Outliers. Another important quality criterion is the occurrence of outliers and point artifacts in the crop point clouds and the capability to remove them automatically using a filter algorithm, avoiding time-consuming manual removal. Since mixed pixels are the most dominant source of outliers in kinematic laser scanning, we focus on their occurrence in our work. Mixed pixels are caused by multiple reflections along the laser beam at the edges of objects. The reflection of the laser spot on a surface is used to compute the distance to the object, and the laser spot has a certain diameter when it hits the surface. At edges, the laser spot partly hits the edge itself but also objects behind or in front of it. This leads to reflections of the laser beam coming from several objects and thus to erroneous distance measurements. Since the laser spot size grows with increasing measurement distance and also depends on the angle of incidence, mixed pixel effects constitute a complex problem in laser scanning and are therefore a significant criterion for the quality of point clouds.
To remove mixed pixels and outliers from the point cloud, a filter algorithm is required that can remove them reliably and automatically. In choosing a suitable filter algorithm, we took the density of the mixed pixels into account. Several relevant works deal with mixed pixel removal in laser scanning point clouds, demonstrating that mixed pixels show significantly lower point densities compared to points belonging to object surfaces [33,34]. For that reason, we chose the statistical outlier removal (SOR) filter implemented in the open-source software CloudCompare. The filter computes the mean distance of each point to its k nearest neighbors and rejects points whose mean distance exceeds the average of these distances plus the estimated standard deviation multiplied by a threshold factor q.
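A minimal re-implementation of the SOR criterion just described might look as follows. This is a sketch in the spirit of CloudCompare's SOR tool, whose exact parameterization may differ in detail; the brute-force distance matrix is for illustration only.

```python
import numpy as np

def sor_filter(points, k=6, q=1.0):
    """Statistical outlier removal: reject points whose mean distance to
    their k nearest neighbors exceeds the global mean of these distances
    plus q standard deviations."""
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=2)
    np.fill_diagonal(dist, np.inf)                         # ignore self-distances
    knn_mean = np.sort(dist, axis=1)[:, :k].mean(axis=1)   # mean distance to k NN
    threshold = knn_mean.mean() + q * knn_mean.std(ddof=1)
    return pts[knn_mean <= threshold]
```

Because mixed pixels sit in sparse regions between surfaces, their mean neighbor distances are much larger than those of surface points, which is exactly the property this threshold exploits.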
To evaluate the capability of automated outlier removal under the measurement conditions at hand, example point clouds of plants and plant organs of maize, soybean, potato, wheat, and sugar beet were extracted manually to apply the SOR filter in several runs. In each run, the number of neighborhood points k and the multiplier factor q were varied. Point clouds before and after filtering were compared qualitatively to evaluate the success of the filter process. The number of detected mixed pixels for the different crops and their percentage with respect to the total number of points were compared and discussed. We expect a larger number of mixed pixels to be detected in point clouds of plants with many small leaves, since they have more edges compared to plants with large leaves. After the analysis of the SOR filter results, the investigations focused on the extent to which the removal process can be automated. The discussion of the mixed pixel filtering success and the potential for automation is given in Section 3.2.3 of this work.
Completeness. Plants are described by a complex geometry extending from the upper leaves to the soil. From the laser scanner’s point of view, important plant organs below the canopy might be occluded by upper plant structures and by neighboring plants. In these cases, the created crop point clouds are incomplete. For that reason, this paper considers completeness as a quality criterion of the crop point clouds. The laser scanning system is mounted on the roof of the robot, see Figure 1; therefore, mainly leaves of the upper canopy are observed. Due to the scanner tilt of about 30° and measurements across the direction of travel, scans are taken below the canopy as well. However, this scan geometry cannot yield complete point clouds, since occlusions prevent laser measurements of all crop parts. The occlusions are influenced by the density of the vegetation, which varies between crop types, as they differ in size and in the number of upper canopy leaves. In addition, the stage of growth and the seeding density influence the number of occlusions. We expect the completeness to decrease in the later stages of growth.
For the evaluation of completeness, we analyze multi-temporal point clouds of a single soybean plot scanned on five different days in the growing season, since the vegetation density and the expected occlusions increase in later stages of growth. The results are presented in Section 3.2.4 of this paper, which includes a qualitative discussion. We are aware of the fact that this procedure can only give qualitative insight into completeness. A rigorous analysis would, on the one hand, require a complete reference geometry of similar quality and would, on the other hand, depend so strongly on the actual reference object that it might be impossible to derive generalizable results for the complex objects we are looking at.

2.3.3. Time Consumption and Potential for Automation

Extracting plant-level phenotypic parameters from crop point clouds on a large scale is time-consuming. Influences on time consumption are the scan efficiency (area scanned by the system per hour), the size of the agricultural field of interest, and restrictions related to the spatial resolution of the output point clouds. Another important criterion is the potential for automated point cloud generation using autonomous driving techniques to avoid manual control of the system for the reduction of human time consumption. We discuss which essential components are required for automated driving in the field and the potential of our robotic field platform to realize autonomous driving. Both time consumption and potential for automation are analyzed and discussed in Section 3.3.

2.3.4. Field Capability

The last criterion that we investigate in this work is the set of conditions in the field that influence our sensors and the point clouds. Field conditions are composed of several factors that can change dynamically. We distinguish here between the influences on the laser scanner measurements and those on the kinematic scan process for point cloud creation. The atmospheric refraction affecting our laser scanner’s phase-based distance measurements is mainly influenced by changes in temperature, atmospheric pressure, and humidity. These influences have been analyzed and discussed for decades in research works such as [35,36]. Since their impact on short distance measurements of a few meters is negligible compared to the laser distance measurement noise, the influence on the range measurements can be ignored. A much more important influence is plant movement while scanning with our system. Wind-induced plant jitter during the scans may lead to significant artifacts, rendering the capture of plant organ structures incorrect. Movements of crops in the field depend on the wind force, its dynamic change, and the plant surface area exposed to the wind. Therefore, plant jitter should be smaller for crop types such as soybean, potato, and sugar beet compared to larger individuals such as maize, since their plant organs are comparatively small. Of course, the influence also depends on the stage of growth, since the plant organ sizes increase within the growing season.
For the evaluation of plant jitter influences, we extract leaves of soybean and maize from point clouds of two days with different wind conditions in the field. On the first day, only weak winds occurred during the measurements; on the second day, the wind was much stronger. We chose these two crop types because they differ significantly in height and organ size, which allows a meaningful comparison of the results. In that context, we expect fewer wind artifacts in the soybean point clouds than in the maize point clouds, since soybean organs are much smaller and offer the wind less surface to act on. In Section 3.4, the point clouds of the extracted crop leaves are visualized, followed by a qualitative discussion of the results.

2.4. Extraction of Phenotyping Parameters

To highlight the potential and quality of the kinematic laser scanning system and its generated crop point clouds, phenotypic parameters are extracted at the scale of plant individuals and even single crop leaves. This section explains the methodology for extracting leaf area, leaf inclination angles, and plant height from the high-quality crop point clouds. For the determination of leaf areas and angles, the leaves are segmented manually, using the software CloudCompare, from the whole output point cloud created by the georeferencing pipeline shown in Figure 2. Afterward, a ball-pivoting algorithm is used to generate a triangle mesh; detailed information about the algorithm is given in [37]. The sum of the areas of all mesh triangles gives the total leaf area. By calculating the normal vector of each triangle, the inclination angles and their distribution are deduced. The estimation pipeline is summarized in Figure 4 and implemented in Python using the Open3D package.
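The per-triangle computation behind this pipeline can be sketched in plain Python (the function name and interface are ours for illustration; the actual pipeline operates on Open3D meshes):

```python
import math

def triangle_area_and_inclination(p0, p1, p2):
    """Area and inclination angle (degrees) of one mesh triangle.

    The triangle normal is the cross product of two edge vectors; its
    magnitude equals twice the triangle area, and the inclination is the
    angle between the normal and the vertical (z) axis.
    """
    u = [p1[i] - p0[i] for i in range(3)]
    v = [p2[i] - p0[i] for i in range(3)]
    n = [u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0]]
    norm = math.sqrt(sum(c * c for c in n))
    area = 0.5 * norm
    inclination = math.degrees(math.acos(abs(n[2]) / norm))
    return area, inclination
```

Summing the areas over all triangles of a leaf mesh yields the total leaf area; collecting the per-triangle inclination angles yields their distribution.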
To evaluate the extraction of phenotypic features, the pipeline in Figure 4 is applied to the point clouds of different crops. We compare the quality of the triangle meshes of the different crops and discuss the results for leaf area and leaf angle distributions in Section 3.5.1. Additionally, height and leaf area are extracted from a multi-temporal point cloud of a single maize plant. The high-precision georeferencing allows a spatio–temporal assignment of single plants over time. In Section 3.5.2, the development of the leaf area and the plant height for the maize data set is presented and discussed.

3. Results and Discussions

In this section, we apply the criteria defined for the evaluation of kinematic laser scanning for crop point cloud generation. We present the results for each criterion, followed by a detailed discussion. Afterward, we show and discuss the results of the extraction of plant height, leaf area, and leaf angle distribution in Section 3.5.

3.1. Georeferencing Accuracy

Figure 5a shows the ground control targets used for the georeferencing accuracy analysis and an example trajectory of the kinematic laser scanning platform. In the starting sequence of each measurement drive, we rotate the system multiple times around the vertical axis to allow reliable estimation of the IMU bias parameters in the Kalman filter algorithm. For this reason, two circles are visible at the beginning of the trajectory in the southern part of Figure 5a. An example target is shown in the image of Figure 5b; Figure 5c shows a resulting scan.
The residuals with respect to the mean center coordinates and the approximated normal distributions for the east, north, and ellipsoidal height coordinates are visualized in Figure 6. The black lines mark the estimated standard deviations of the normal distributions. For east and north, the residuals lie within an interval of ±2 cm, with standard deviations of 0.5 cm and 0.6 cm, respectively. The residuals of the ellipsoidal height lie within an interval of ±2.3 cm, with a corresponding standard deviation of 1.1 cm. These larger values can be explained by the less accurate height positioning with GNSS, which is transferred to the point cloud by the direct georeferencing of the scan profiles.
The high georeferencing accuracy at the centimeter level enables single plants to be monitored over time. Figure 7 shows the point cloud of a single maize plant extracted on five different days within the 2021 growing season. On the right side of Figure 7, the five scans are visualized in one plot, colorized by measurement day. The highly accurate georeferencing enables the registration of point clouds of individual crops over time. Nevertheless, the individual plants must be extracted manually from the entire point cloud. This can be performed quickly in the early stages of development (16 June–4 July), since the vegetation density is low. At later stages of increased density, however, individual plants overlap, which requires more time for manual segmentation. Highly accurate georeferencing is thus the basic prerequisite for automated, spatio–temporal registration of crop point clouds at the individual plant or even plant-organ level in the field. Several works deal with this research question, such as [38,39]; automated point cloud segmentation, however, is not the focus of this paper.

3.2. Point Cloud Quality

In this section, we first show the evaluation results for point precision, spatial resolution, outliers, and point cloud completeness. Afterward, we analyze and discuss the results to investigate the potential and challenges of kinematic crop laser scanning in field situations.

3.2.1. Point Precision

For the point precision estimation, we follow the pipeline described in Section 2.3.2. We calculated the precision for leaves of different crops at five core points per leaf, located in different areas of each leaf. Figure 8 shows the leaf point clouds and the selected core points. The tables in Figure 8 summarize the standard deviations of the plane-to-point distances for the leaves of maize, soybean, potato, and wheat, computed in the local neighborhood of the five core points.
The standard deviations range on average from 0.80 mm to 1.02 mm and are thus in the same order of magnitude of about one millimeter (see the tables in Figure 8). The variation of the individual standard deviations σ within an interval from 0.71 mm to 1.10 mm can be explained by the different angles of incidence on the surface, since in laser scanning large incidence angles lead to larger range measurement noise. Based on these results, the point precision on the leaf surfaces can be stated as about one millimeter and does not depend on the crop type. This high point precision is suitable for capturing single plant organs such as leaves and highlights the potential for 3D crop phenotyping at plant and plant-organ scales.
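A precision measure of this kind can be sketched as follows, assuming the local plane through a core-point neighborhood is fitted via SVD and the precision is the standard deviation of the point-to-plane distances (function name ours for illustration):

```python
import numpy as np

def core_point_precision(neighborhood):
    """Standard deviation of plane-to-point distances in a local neighborhood.

    neighborhood: (N, 3) array of points around a core point.
    """
    pts = np.asarray(neighborhood, dtype=float)
    centered = pts - pts.mean(axis=0)
    # best-fit plane normal: right singular vector of the smallest singular value
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    distances = centered @ normal  # signed point-to-plane distances
    return distances.std(ddof=1)
```

For points lying exactly on a plane, the returned precision is zero; measurement noise on the leaf surface appears directly as a nonzero standard deviation.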

3.2.2. Spatial Resolution

The spatial resolution of the point clouds is determined by the point-to-point distances on the plant surfaces. We analyze the resolution using leaves extracted from point clouds of different crops, following the pipeline introduced in Section 2.3.2.
Table 2 shows the mean spatial resolutions for the maize, soybean, potato, wheat, and sugar beet leaves of Figure 8. The comparison between the different leaves shows a mean point distance of 0.99 mm for maize and 1.48 mm for wheat. The minimum (maximum) point distances vary between 0.32 mm and 1.0 mm (2.51 mm and 3.85 mm). The spatial resolution approximated from the sensor and system configuration in Section 2.3.2 resulted in point-to-point distances of 0.5 mm along the profile and in the direction of travel, at a distance to the plant of 3 m, a system velocity of 10 cm/s, and a scan profile rate of 200 Hz. That the resolutions measured here are lower than this estimate can be explained by the complex leaf geometries affecting the point-to-point distances; the spatial resolution of our point clouds is therefore inhomogeneous. Nevertheless, the point clouds in Figure 8 show a spatial resolution that is sufficient to offer high potential for plant and organ-level phenotyping.
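Point-to-point distance statistics of this kind can be computed for a small leaf patch with a brute-force nearest-neighbor search (a NumPy sketch; the function name is ours):

```python
import numpy as np

def spacing_statistics(points):
    """Mean, minimum, and maximum nearest-neighbor distance of a point cloud.

    Brute-force O(N^2) pairwise distances; adequate for single-leaf patches.
    """
    pts = np.asarray(points, dtype=float)
    diff = pts[:, None, :] - pts[None, :, :]
    dist = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(dist, np.inf)  # ignore self-distances
    nn = dist.min(axis=1)           # nearest-neighbor distance per point
    return nn.mean(), nn.min(), nn.max()
```

For larger point clouds, a spatial index (e.g., a k-d tree) would replace the quadratic pairwise computation.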

3.2.3. Outliers

In this section, we show exemplary point clouds of different crops and describe the occurrence of outliers due to mixed pixels. The point cloud results after SOR filtering are discussed, and the percentage of detected outliers and mixed pixels is determined for each crop point cloud. Since the analyzed crops differ significantly in plant and plant-organ size, the differences in the results are discussed afterward. The top of Figure 9 shows the point clouds of single maize, soybean, potato, wheat, and sugar beet plants. The point clouds contain a large number of outliers and mixed pixels close to the leaf edges.
For outlier removal, the SOR filter is applied as described in Section 2.3.2. The resulting point clouds are shown in the lower row of Figure 9. Applying the filter with k = 20 neighborhood points and a standard deviation multiplier of q = 0.1 showed reasonable removal performance for all crop types, as derived from several trials and qualitative analyses.
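A statistical outlier removal with these two parameters can be sketched as follows (a NumPy version assuming the common formulation that thresholds on the mean k-nearest-neighbor distance, as in standard point cloud libraries; not necessarily our exact implementation):

```python
import numpy as np

def sor_filter(points, k=20, q=0.1):
    """Statistical outlier removal: discard points whose mean distance to
    their k nearest neighbors exceeds mean + q * std over all points."""
    pts = np.asarray(points, dtype=float)
    dist = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(dist, np.inf)                  # ignore self-distances
    knn_mean = np.sort(dist, axis=1)[:, :k].mean(axis=1)
    threshold = knn_mean.mean() + q * knn_mean.std()
    keep = knn_mean <= threshold
    return pts[keep], keep
```

Mixed pixels near leaf edges lie far from the dense leaf surface, so their mean neighbor distance exceeds the threshold and they are removed.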
Table 3 lists the number of points before and after filtering as well as the number and percentage of removed points. The number of points differs between 168,575 (maize) and 7785 (potato), see Table 3. For the soybean point cloud, the smallest percentage of points (7.8%) was removed; the wheat point cloud has the highest share of removed points at 33.6%. These results can be explained by the different shapes, numbers, and sizes of the leaves: wheat plants have many small leaves, whereas soybean plants have few but large leaves. Due to these differing organ geometries, many more edges are captured by the laser scanner for wheat than for soybean, resulting in a larger number of mixed pixels.
The results of this section show that a reliable removal of outliers and mixed pixels is possible by applying the SOR filter with parameters that are independent of the crop type. This capability of automated outlier removal increases the system's potential for automating the extraction of phenotyping parameters.

3.2.4. Completeness

Figure 10 shows the point clouds of a single plot of soybean plants observed on five different days within the growing season from 16 June to 4 August. The upper part shows the top view of the plot; the bottom part shows the side view. In the following, we discuss the development of the soybean plants with respect to the completeness of the point clouds.
On 16 June and 23 June, the vegetation density within the plot is low. Therefore, individual soybean plants are visible in the point clouds without occlusions caused by leaves overlapping from the laser scanners' point of view. From 30 June to 4 August, however, there is a relevant increase in vegetation density and in the overlap of neighboring plants and leaves. For that reason, the scanner system mainly measures plant organs of the upper canopy, and a lack of scanned plant organs in the lower canopy is visible. This problem is particularly evident in the side view of the plot scanned on 4 August in Figure 10b. These results show a significant decrease in point cloud completeness, especially in later growth phases, caused by the increase in vegetation density.
Although this example is not a rigorously generalizable result, it shows one of the weaknesses of the presented methodology. Depending on the crop type and the growth stage, the lower areas of the canopy may be occluded by the upper part. Although this problem affects all non-destructive measurement systems, it might be mitigated by optimizing the scanner configuration with multiple scanners or by manipulating the canopy using robotic arms. Moreover, additional scanning from below using under-canopy robot systems might be an option to reach high completeness.

3.3. Time Consumption and Potential for Automation

Generating point clouds with kinematic laser scanning systems on a large scale is time-consuming. In this section, we discuss the time required for capturing point clouds and the potential of autonomous driving to reduce human time consumption. The time consumption for point cloud generation in the field is influenced by several factors: the spatial resolution required for plant-level phenotyping, which depends on the scan rate, the resolution along a scan profile, the distance to the measurement object, and the velocity of the system, as well as the size of the field. Taking these factors into account, we estimate the time consumption as follows.
Since high laser incidence angles on the crop surfaces lead to a large number of mixed pixels and a low spatial resolution, we do not use the full laser profiles for point cloud generation. We choose a field of view of about 66 degrees for the laser scanner, leading to a 5 m profile width at a measurement range of 3 m. At the scanner's maximum scan rate of 200 Hz and a speed of 0.1 m/s, we capture 0.5 m²/s of the field with a spatial resolution of 0.5 mm across and along the direction of travel. Under these assumptions, scanning a field of one hectare takes about 5 h 30 min. Compared to systems such as UAVs, this scan efficiency is not very high. However, the need for high-resolution and precise scans of crops depends on the application: in the context of crop breeding, it is often not necessary to scan whole fields, and single plants, rows, or plots of interest can be scanned instead.
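This estimate follows directly from the swath width and driving speed; a minimal sketch with the numbers from the text above (function name ours):

```python
def scan_time_hours(field_area_m2, swath_width_m, velocity_m_s):
    """Time to cover a field with a profile scanner of the given swath width."""
    coverage_m2_per_s = swath_width_m * velocity_m_s  # 5 m * 0.1 m/s = 0.5 m²/s
    return field_area_m2 / coverage_m2_per_s / 3600.0

# one hectare (10,000 m²) at a 5 m swath and 0.1 m/s
hours = scan_time_hours(10_000, 5.0, 0.1)  # ≈ 5.56 h, i.e., about 5 h 33 min
```

Doubling the profile rate would allow doubling the velocity at the same along-track resolution, halving the scan time.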
In that context, the time consumption can be considered acceptable for high-quality crop point cloud generation. Nevertheless, there is potential for improvement: by using scanners with higher profile rates or multiple scanners at the same time, the velocity of the field robot could be increased without reducing the spatial resolution, so that less time is needed for scanning fields or plots of interest. Even if the measurement takes some time, it is possible to automate the measurement process in a way that does not require a human operator or manual intervention. Although we used a remote control to steer the robot over the agricultural fields in our experiments, the robot is technically prepared to navigate autonomously. Such navigation requires tasks such as robot state estimation, global and local path planning, obstacle detection, and robot control. These problems have been solved in other robotic applications, and it can be assumed that the solutions can be applied here as well. Our system therefore offers much potential for autonomous driving to reduce human time consumption and enable automated point cloud creation.

3.4. Field Capability

For the qualitative analysis of wind influences on the crop point clouds, we extracted exemplary single leaves of maize and soybean, shown in Figure 11. Due to the dynamic wind conditions during the scan on 4 July, the maize leaf shows significant point cloud artifacts compared to the scan on 30 June, when there was less wind. In the scan of 4 July, the jittering effects increase in strength along the leaf with the distance to the stem center, which is located on the right side but not shown in the figure. The leaf jitters are expressed in different directions as well: closer to the stem, they are orthogonal to the leaf surface, while at the left end, additional lateral movements are recognizable. In contrast, the maize leaf scanned on 30 June shows no wind-induced jitter.
Both soybean leaves in Figure 11 show no wind-induced artifacts within the point clouds. Compared to the large maize leaves, both soybean leaves are much smaller, so no significant point cloud artifacts are visible in our soybean scans.
The results presented here show that wind-induced plant jitter can be a challenging problem in the context of kinematic crop laser scanning under field conditions. This is especially true if the plants and their organs are big and the wind forces are strong. There is therefore much potential for further research on modeling plant jitter to correct point artifacts in crop point clouds created with mobile laser scanning platforms. We discuss ideas related to this topic in the outlook of our work.

3.5. Phenotyping Parameter Extraction

In this section, we show and discuss the results of the phenotypic parameter extraction based on the point clouds of our system. Section 3.5.1 presents the leaf area and leaf inclination angle distributions for the maize, soybean, potato, wheat, and sugar beet leaves of Figure 8; we also discuss the quality of the meshing results in that context. In Section 3.5.2, the temporal evolution of leaf area and plant height for the multi-temporal maize data set shown in Figure 7 is analyzed.

3.5.1. Leaf Area and Leaf Inclination Angle Distribution

Figure 12 shows the meshing results for the point clouds from Figure 8 for the leaves of maize, soybean, potato, wheat, and sugar beet. The meshes are generated using the pipeline in Figure 4. Most parts of the meshes are reasonable, but problems are also visible in the structure: in some areas of the point clouds with low spatial resolution, mesh holes appear. In those areas, the ball-pivoting algorithm cannot compute a triangular structure, because the ball radius is derived from the average point-to-point distance within the point cloud, and parts of low resolution are rejected when the local distances exceed the ball radius. The holes visible in Figure 12 thus indicate potential for optimizing the ball radius to obtain better meshing results; this optimization, however, is not the focus of this work. Overall, these results show that the high point precision and spatial resolution of the point clouds provide a reliable meshing quality at the scale of individual crop leaves. Even for the small and slim leaves of wheat, a reasonable mesh is produced. We also calculated the area of each leaf, which, although no reference values are available, shows reasonable results.
Figure 13 shows the leaf inclination angle distributions for the meshed point clouds of Figure 12. For each triangle, we estimate the normal vector, compute the inclination angle, normalize by the triangle areas, and visualize the resulting distribution. The histograms in Figure 13 show significant differences in the angle distributions. The maize leaf has its histogram maximum at about 50 degrees; the histograms for soybean and sugar beet show maxima at about 30 degrees and 65 degrees, respectively. In contrast, the wheat leaf shows uniformly distributed inclination angles, which can be explained by the curved shape of the leaf shown in Figure 12. A dominant angle of the potato leaf can be detected at zero degrees.
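An area-weighted angle distribution of this kind can be sketched in plain Python (the bin width and function name are ours for illustration):

```python
import math

def inclination_distribution(triangles, bin_deg=10):
    """Area-weighted histogram of mesh-triangle inclination angles (0-90 deg).

    triangles: iterable of (p0, p1, p2) vertex triples.
    Returns bin weights normalized by the total mesh area.
    """
    bins = [0.0] * (90 // bin_deg)
    total = 0.0
    for p0, p1, p2 in triangles:
        u = [p1[i] - p0[i] for i in range(3)]
        v = [p2[i] - p0[i] for i in range(3)]
        n = [u[1] * v[2] - u[2] * v[1],
             u[2] * v[0] - u[0] * v[2],
             u[0] * v[1] - u[1] * v[0]]
        norm = math.sqrt(sum(c * c for c in n))
        if norm == 0.0:
            continue  # skip degenerate triangles
        area = 0.5 * norm
        angle = math.degrees(math.acos(min(1.0, abs(n[2]) / norm)))
        idx = min(int(angle // bin_deg), len(bins) - 1)
        bins[idx] += area
        total += area
    return [b / total for b in bins]
```

Weighting by triangle area rather than counting triangles prevents many small triangles in one region from dominating the distribution.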
Overall, the leaf angle distributions show reasonable results, as they fit the leaf shapes and orientations of the meshing results in Figure 12. The results show the large potential of our high-quality point clouds for plant-organ-based phenotyping.

3.5.2. Spatio–Temporal Height and Leaf Area Estimation

For the maize point clouds in Figure 7, we calculate the phenotyping parameters of plant height and leaf area. The results are shown in Figure 14, where the colorization of the points corresponds to the calculated height value. We also calculated the leaf area of the full plant. The small drop in leaf area from 1615 cm² on 30 June to 1420 cm² on 4 July can be explained by leaf occlusions during the scanning process, which affect the point cloud completeness.
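Plant height follows as the vertical extent of the segmented, georeferenced plant point cloud (a minimal sketch; in practice the ground level may be taken from a terrain model rather than the lowest plant point):

```python
def plant_height(points):
    """Plant height as the vertical extent of a segmented plant point cloud.

    points: iterable of (x, y, z) coordinates; z is the height component.
    """
    zs = [p[2] for p in points]
    return max(zs) - min(zs)
```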

4. Summary and Outlook

This work analyzes the quality of a high-precision kinematic laser scanning system and the generated high-resolution crop point clouds for organ-level phenotyping in the field using several quality criteria. The point-cloud-related criteria are the georeferencing accuracy and the point cloud quality, comprising point precision, spatial resolution, outliers, and completeness. Criteria of the system itself are the time consumption, the potential for automation, and the field capability. To highlight the system's potential for phenotyping, the structural phenotypic parameters plant height, leaf area, and leaf inclination angle are extracted for different crops and at different stages of growth.
The presented results show that highly accurate georeferencing enables the identification of single plants in the field throughout the full vegetation period. Therefore, individual plots, plants, and even plant organs can be assigned within the growing season, delivering great potential for crop monitoring. The high quality of the point clouds with respect to the noise on the plant surfaces, together with the high spatial resolution in the millimeter range, shows that the kinematic laser scanning system is suitable for organ-scale phenotyping. Outliers manifested as mixed-pixel artifacts in the point clouds are reduced by statistical outlier filtering, which increases the system's potential for automation. The extraction of phenotyping parameters at plant and organ level demonstrates the quality of the point clouds: leaf area, leaf angle distribution, and plant height estimated from the triangle meshes of the point clouds deliver reasonable results.
However, this work also illustrated key challenges in kinematic laser scanning of crops, such as the incompleteness of point clouds and wind-induced plant jitter leading to an incorrect capture of single plant organs. Completeness of crop point clouds is a crucial problem in late growth stages with high vegetation density, since the system can mainly scan upper canopy leaves. In addition, plant jitter becomes a significant problem in the late stages of growth and is especially relevant for crop types with large plant organs.
Equipping the system with several profile scanners observing the crops from different perspectives would increase the point cloud completeness but requires the problems of sensor data synchronization and system calibration to be solved. Jitter artifacts could be reduced by modeling the wind-induced plant movements or by including camera information in the system to track plant motions in images.
Overall, the results of this work show the remarkable quality of the high-precision kinematic laser scanning system for plant and organ-level 3D phenotyping under field conditions. Based on the high potential of the system highlighted in this paper, future research will focus on the development of a fully automated pipeline for crop organ-level morphological phenotyping using autonomously navigated robotic field platforms equipped with multiple laser scanners and cameras.

Author Contributions

Conceptualization, F.E. and L.K.; Methodology, F.E.; Validation, F.E. and L.Z.; Formal analysis, F.E.; Data curation, F.E.; Writing—original draft, F.E.; Writing—review and editing, L.K.; Visualization, F.E. and L.Z.; Supervision, L.K.; Project administration, L.K. and H.K.; Funding acquisition, H.K. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Deutsche Forschungsgemeinschaft (DFG, German Research Foundation) under Germany’s Excellence Strategy–EXC 2070-390732324.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Conflicts of Interest

The authors declare no conflict of interest.

Abbreviations

The following abbreviations are used in this manuscript:
3D    Three-Dimensional
TLS   Terrestrial Laser Scanning
GNSS  Global Navigation Satellite System
IMU   Inertial Measurement Unit
LIDAR Light Detection and Ranging
UTM   Universal Transverse Mercator
LAI   Leaf Area Index
LAD   Leaf Area Distribution
SOR   Statistical Outlier Filter

References

  1. Araus, J.L.; Cairns, J.E. Field high-throughput phenotyping: The new crop breeding frontier. Trends Plant Sci. 2014, 19, 52–61. [Google Scholar] [CrossRef] [PubMed]
  2. Van Eeuwijk, F.A.; Bustos-Korts, D.; Millet, E.J.; Boer, M.P.; Kruijer, W.; Thompson, A.; Malosetti, M.; Iwata, H.; Quiroz, R.; Kuppe, C.; et al. Modelling strategies for assessing and increasing the effectiveness of new phenotyping techniques in plant breeding. Plant Sci. 2019, 282, 23–39. [Google Scholar] [CrossRef] [PubMed]
  3. Gracia-Romero, A.; Vergara-Díaz, O.; Thierfelder, C.; Cairns, J.E.; Kefauver, S.C.; Araus, J.L. Phenotyping conservation agriculture management effects on ground and aerial remote sensing assessments of maize hybrids performance in Zimbabwe. Remote Sens. 2018, 10, 349. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  4. Chandra, A.L.; Desai, S.V.; Guo, W.; Balasubramanian, V.N. Computer vision with deep learning for plant phenotyping in agriculture: A survey. arXiv 2020, arXiv:2006.11391. [Google Scholar]
  5. Khan, Z.; Rahimi-Eichi, V.; Haefele, S.; Garnett, T.; Miklavcic, S.J. Estimation of vegetation indices for high-throughput phenotyping of wheat using aerial imaging. Plant Methods 2018, 14, 1–11. [Google Scholar] [CrossRef] [Green Version]
  6. Perich, G.; Hund, A.; Anderegg, J.; Roth, L.; Boer, M.P.; Walter, A.; Liebisch, F.; Aasen, H. Assessment of multi-image unmanned aerial vehicle based high-throughput field phenotyping of canopy temperature. Front. Plant Sci. 2020, 11, 150. [Google Scholar] [CrossRef]
  7. Ali, B.; Zhao, F.; Li, Z.; Zhao, Q.; Gong, J.; Wang, L.; Tong, P.; Jiang, Y.; Su, W.; Bao, Y.; et al. Sensitivity Analysis of Canopy Structural and Radiative Transfer Parameters to Reconstructed Maize Structures Based on Terrestrial LiDAR Data. Remote Sens. 2021, 13, 3751. [Google Scholar] [CrossRef]
  8. Wang, Y.; Wen, W.; Wu, S.; Wang, C.; Yu, Z.; Guo, X.; Zhao, C. Maize plant phenotyping: Comparing 3D laser scanning, multi-view stereo reconstruction, and 3D digitizing estimates. Remote Sens. 2019, 11, 63. [Google Scholar] [CrossRef] [Green Version]
  9. Jiang, Y.; Li, C.; Takeda, F.; Kramer, E.A.; Ashrafi, H.; Hunter, J. 3D point cloud data to quantitatively characterize size and shape of shrub crops. Hortic. Res. 2019, 6, 1–17. [Google Scholar] [CrossRef] [Green Version]
  10. Young, S.N.; Kayacan, E.; Peschel, J.M. Design and field evaluation of a ground robot for high-throughput phenotyping of energy sorghum. Precis. Agric. 2019, 20, 697–722. [Google Scholar] [CrossRef] [Green Version]
  11. Sun, S.; Li, C.; Paterson, A.H. In-field high-throughput phenotyping of cotton plant height using LiDAR. Remote Sens. 2017, 9, 377. [Google Scholar] [CrossRef] [Green Version]
  12. Atefi, A.; Ge, Y.; Pitla, S.; Schnable, J. Robotic Technologies for High-Throughput Plant Phenotyping: Contemporary Reviews and Future Perspectives. Front. Plant Sci. 2021, 12. [Google Scholar] [CrossRef]
  13. Magney, T.S.; Vierling, L.A.; Eitel, J.U.; Huggins, D.R.; Garrity, S.R. Response of high frequency Photochemical Reflectance Index (PRI) measurements to environmental conditions in wheat. Remote Sens. Environ. 2016, 173, 84–97. [Google Scholar] [CrossRef]
  14. Dente, L.; Satalino, G.; Mattia, F.; Rinaldi, M. Assimilation of leaf area index derived from ASAR and MERIS data into CERES-Wheat model to map wheat yield. Remote Sens. Environ. 2008, 112, 1395–1407. [Google Scholar] [CrossRef]
  15. Aboutalebi, M.; Torres-Rua, A.F.; McKee, M.; Kustas, W.P.; Nieto, H.; Alsina, M.M.; White, A.; Prueger, J.H.; McKee, L.; Alfieri, J.; et al. Incorporation of unmanned aerial vehicle (UAV) point cloud products into remote sensing evapotranspiration models. Remote Sens. 2019, 12, 50. [Google Scholar] [CrossRef] [Green Version]
  16. Simic Milas, A.; Romanko, M.; Reil, P.; Abeysinghe, T.; Marambe, A. The importance of leaf area index in mapping chlorophyll content of corn under different agricultural treatments using UAV images. Int. J. Remote Sens. 2018, 39, 5415–5431. [Google Scholar] [CrossRef]
  17. Dhondt, S.; Wuyts, N.; Inzé, D. Cell to whole-plant phenotyping: The best is yet to come. Trends Plant Sci. 2013, 18, 428–439. [Google Scholar] [CrossRef]
  18. Agegnehu, G.; Ghizaw, A.; Sinebo, W. Yield performance and land-use efficiency of barley and faba bean mixed cropping in Ethiopian highlands. Eur. J. Agron. 2006, 25, 202–207. [Google Scholar] [CrossRef]
  19. Friedli, M.; Kirchgessner, N.; Grieder, C.; Liebisch, F.; Mannale, M.; Walter, A. Terrestrial 3D laser scanning to track the increase in canopy height of both monocot and dicot crop species under field conditions. Plant Methods 2016, 12, 1–15. [Google Scholar] [CrossRef] [Green Version]
  20. Su, Y.; Wu, F.; Ao, Z.; Jin, S.; Qin, F.; Liu, B.; Pang, S.; Liu, L.; Guo, Q. Evaluating maize phenotype dynamics under drought stress using terrestrial lidar. Plant Methods 2019, 15, 1–16. [Google Scholar] [CrossRef] [Green Version]
  21. Jay, S.; Rabatel, G.; Hadoux, X.; Moura, D.; Gorretta, N. In-field crop row phenotyping from 3D modeling performed using Structure from Motion. Comput. Electron. Agric. 2015, 110, 70–77. [Google Scholar] [CrossRef] [Green Version]
  22. Wu, S.; Wen, W.; Wang, Y.; Fan, J.; Wang, C.; Gou, W.; Guo, X. MVS-Pheno: A portable and low-cost phenotyping platform for maize shoots using multiview stereo 3D reconstruction. Plant Phenomics 2020, 2020, 1848437. [Google Scholar] [CrossRef] [Green Version]
  23. Lou, L.; Liu, Y.; Sheng, M.; Han, J.; Doonan, J.H. A cost-effective automatic 3D reconstruction pipeline for plants using multi-view images. In Proceedings of the Conference Towards Autonomous Robotic Systems, Birmingham, UK, 1–3 September 2014; Springer: Berlin/Heidelberg, Germany, 2014; pp. 221–230. [Google Scholar]
Figure 1. (a) Mobile laser scanning system with the Leica AS10 GNSS antenna on top, the iMAR iNAV FJi-LSURV INS, and the Z+F Profiler 9012 A profile laser scanner; (b) unmanned ground robot used as the carrier platform in the field, with the mobile laser scanning system on the roof. An angular section of the scan profile is visualized in red.
Figure 2. Schematic visualization of the direct georeferencing equation for the point cloud creation. Each laser measurement in the sensor frame on the right side is transformed into the global coordinate system on the left side by using the system calibration parameters and the system’s trajectory states of position and orientation.
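The direct georeferencing transformation sketched in Figure 2 can be written in a few lines. The following is a minimal illustration, not the authors' software; all function and variable names are invented here, and the calibration is simplified to a single rotation and lever arm.

```python
import numpy as np

def georeference_point(p_s, R_calib, t_calib, R_nav, t_nav):
    """Transform one laser point from the scanner frame to the global frame."""
    # p_s: point in the scanner frame; R_calib/t_calib: boresight/lever-arm
    # calibration (scanner -> body); R_nav/t_nav: INS/GNSS trajectory state
    # (body -> global) at the measurement epoch.
    p_body = R_calib @ p_s + t_calib   # apply system calibration
    return R_nav @ p_body + t_nav      # apply trajectory state

# Toy example: identity calibration, platform rotated 90 degrees about the
# z axis and located at (10, 0, 0) in the global frame.
Rz90 = np.array([[0.0, -1.0, 0.0],
                 [1.0,  0.0, 0.0],
                 [0.0,  0.0, 1.0]])
p_global = georeference_point(np.array([1.0, 0.0, 0.0]),
                              np.eye(3), np.zeros(3),
                              Rz90, np.array([10.0, 0.0, 0.0]))
# p_global -> [10., 1., 0.]
```

In the real system, R_nav and t_nav are interpolated from the trajectory at each laser measurement's timestamp.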
Figure 3. Field experiment at Campus Klein-Altendorf near Bonn, Germany. The field contains plots of potato, sugar beet, maize, soybean, and a mixture of wheat and beans. The plots and rows observed by the system within the vegetation period of 2021 are highlighted by red boxes. Landmark targets used for the later georeferencing precision evaluation are drawn as yellow circles.
Figure 4. Extraction of leaf area and leaf inclination angle distribution by meshing our high-quality crop point clouds. We mesh the points by applying the ball pivoting algorithm and estimate the area and normal vector for each triangle generated. Finally, we sum up all triangle areas to compute the total leaf area and summarize the inclination angles in a histogram.
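The final steps of the pipeline in Figure 4 (per-triangle area and inclination from the mesh) can be sketched as follows. The ball pivoting step itself is omitted; the function below is an illustrative reimplementation of the geometric computation, not the authors' code.

```python
import numpy as np

def triangle_area_and_inclination(v0, v1, v2):
    """Area and inclination angle (degrees from horizontal) of one triangle."""
    n = np.cross(v1 - v0, v2 - v0)   # normal vector; its length equals 2*area
    norm = np.linalg.norm(n)
    area = 0.5 * norm
    # Inclination = angle between the leaf surface and the horizontal plane,
    # i.e. between the normal and the vertical axis, folded into [0, 90] deg.
    incl = np.degrees(np.arccos(np.clip(abs(n[2]) / norm, 0.0, 1.0)))
    return area, incl

# Horizontal unit right triangle: area 0.5, inclination 0 degrees
area, incl = triangle_area_and_inclination(np.array([0.0, 0.0, 0.0]),
                                           np.array([1.0, 0.0, 0.0]),
                                           np.array([0.0, 1.0, 0.0]))
# Total leaf area = sum of all triangle areas; the per-triangle inclination
# angles are collected in a histogram.
```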
Figure 5. (a) Targets and trajectory of the kinematic laser scanning system in the experimental field; (b) image of an example target; (c) target in point cloud colorized by the laser intensity.
Figure 6. Estimated georeferencing precision for east, north, and height. Histograms in green show the residuals to the mean target center coordinates of Figure 5a. In red: approximation of the normal distribution. Black lines correspond to the estimated standard deviations.
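The per-axis precision shown in Figure 6 amounts to a standard deviation of residuals to the mean target center. A minimal sketch, assuming the repeated target-center estimates are available as a NumPy array (the data below are synthetic):

```python
import numpy as np

def per_axis_precision(target_centers):
    """Standard deviation of residuals to the mean target center, per axis."""
    residuals = target_centers - target_centers.mean(axis=0)
    return residuals.std(axis=0, ddof=1)   # sample std for east, north, height

# Synthetic example: 200 noisy re-estimates of one target center (1 cm noise)
rng = np.random.default_rng(0)
centers = np.array([363000.0, 5613000.0, 155.0]) + rng.normal(0.0, 0.01, (200, 3))
sigma_enh = per_axis_precision(centers)    # roughly 0.01 m on each axis
```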
Figure 7. Single maize plant extracted from the point clouds on five days between 16 June and 4 August, enabled by highly accurate georeferencing and colored by laser intensity. The colorization on the right corresponds to the point cloud indices of the different days.
Figure 8. Leaf point clouds of maize, soybean, potato, wheat, and sugar beet, colorized by laser intensity, used for the point precision and spatial resolution analysis. The numbers correspond to the core points selected for the precision evaluation. The tables summarize the estimated standard deviations of the local point-to-plane distances derived at the core points.
Figure 9. Point clouds of maize, soybean, potato, wheat, and sugar beet before and after mixed-pixel removal: (a) crop point clouds affected by outliers and mixed pixels; (b) point clouds after application of the statistical outlier removal (SOR) filter. SOR parameters: neighborhood points k = 20; standard deviation multiplier q = 0.1.
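The SOR filter applied in Figure 9 can be sketched with NumPy and SciPy. This follows the common definition of statistical outlier removal (mean distance to the k nearest neighbors, thresholded at the global mean plus q standard deviations); the exact variant used in the paper's toolchain is an assumption here.

```python
import numpy as np
from scipy.spatial import cKDTree

def sor_filter(points, k=20, q=0.1):
    """Statistical outlier removal: keep points whose mean distance to their
    k nearest neighbors is below the global mean plus q standard deviations."""
    tree = cKDTree(points)
    dists, _ = tree.query(points, k=k + 1)   # k+1: first neighbor is the point itself
    mean_d = dists[:, 1:].mean(axis=1)       # mean distance to the k true neighbors
    threshold = mean_d.mean() + q * mean_d.std()
    return points[mean_d < threshold]

# Dense synthetic "leaf" cluster plus a few far-away, mixed-pixel-like outliers
rng = np.random.default_rng(1)
cloud = np.vstack([rng.normal(0.0, 0.01, (500, 3)),    # leaf surface points
                   rng.uniform(0.5, 1.0, (10, 3))])    # outliers
filtered = sor_filter(cloud, k=20, q=0.1)              # outliers are removed
```

A small q makes the threshold tight, which removes mixed pixels aggressively but may also discard valid edge points, consistent with the removal rates reported in Table 3.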
Figure 10. Plot of soybean plants on five different days within the growing season: (a) top view of the scanned plot; (b) side view of the plot.
Figure 11. Example maize and soybean leaf scanned on different days within the growing season. On 30 June, only minor wind effects were present in the field; on 7 July, dynamic wind conditions prevailed.
Figure 12. Estimated triangle meshes and computed leaf areas for the leaf point clouds in Figure 8.
Figure 13. Leaf inclination angle distribution computed for the leaf point clouds of maize, soybean, potato, wheat, and sugar beet of Figure 8. The inclination angles are computed by using the pipeline shown in Figure 4.
Figure 14. Estimated heights and leaf areas for the multi-temporal data set of a single maize plant on five different days in the growing season.
Table 1. Datasets captured at the field experiment shown in Figure 3 during the growing season of 2021. The markers "x" highlight the plots observed on the corresponding days.
| Month     | Day | Sugar Beet | Maize | Soybean | Potato | Wheat and Bean (Mixed) | Targets |
|-----------|-----|------------|-------|---------|--------|------------------------|---------|
| June      | 2   | x          | x     |         |        |                        |         |
| June      | 11  |            | x     |         |        |                        |         |
| June      | 16  | x          | x     | x       | x      |                        | x       |
| June      | 23  | x          | x     | x       | x      |                        |         |
| June      | 30  |            | x     | x       |        |                        |         |
| July      | 7   | x          | x     | x       | x      | x                      | x       |
| August    | 4   | x          | x     | x       |        |                        | x       |
| August    | 12  | x          | x     | x       |        |                        | x       |
| September | 3   |            |       | x       | x      |                        |         |
Table 2. Estimated spatial resolutions based on the leaf point clouds in Figure 8. The mean resolution is computed by averaging all nearest-neighbor distances; the minimum and maximum values are the smallest and largest distances calculated.
| Value                | Maize     | Soybean   | Potato  | Wheat     | Sugar Beet |
|----------------------|-----------|-----------|---------|-----------|------------|
| Mean Resolution (mm) | 0.99      | 1.02      | 1.44    | 1.48      | 1.54       |
| Min/Max (mm)         | 0.33/3.67 | 0.32/2.51 | 1.0/3.0 | 0.65/3.85 | 0.45/2.92  |
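The mean, minimum, and maximum resolutions in Table 2 are statistics over nearest-neighbor distances, which can be sketched as follows (an illustrative implementation, verified here on a synthetic regular grid):

```python
import numpy as np
from scipy.spatial import cKDTree

def spatial_resolution(points):
    """Mean, minimum, and maximum nearest-neighbor distance of a point cloud."""
    dists, _ = cKDTree(points).query(points, k=2)  # k=2: self + nearest neighbor
    nn = dists[:, 1]                               # drop the zero self-distance
    return nn.mean(), nn.min(), nn.max()

# Regular 1 mm grid: every nearest-neighbor distance is exactly 1 mm
xx, yy = np.meshgrid(np.arange(10), np.arange(10))
grid = 0.001 * np.column_stack([xx.ravel(), yy.ravel(), np.zeros(100)])  # meters
mean_r, min_r, max_r = spatial_resolution(grid)
# mean_r = min_r = max_r = 0.001 m (1 mm)
```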
Table 3. Number of points before and after application of the SOR filter for the crop point clouds in Figure 9, the number of removed points, and their percentage with respect to the unfiltered point clouds. Applied SOR parameters: neighborhood points k = 20; standard deviation multiplier q = 0.1.
|                                   | Maize   | Soybean | Potato | Wheat  | Sugar Beet |
|-----------------------------------|---------|---------|--------|--------|------------|
| Number of points before filtering | 168,575 | 38,402  | 7785   | 11,959 | 32,321     |
| Number of points after filtering  | 149,690 | 35,409  | 6735   | 7939   | 27,451     |
| Number of removed points          | 18,885  | 2993    | 1050   | 4019   | 4870       |
| Percentage of removed points (%)  | 11.2    | 7.8     | 13.5   | 33.61  | 15.1       |
Share and Cite

Esser, F.; Klingbeil, L.; Zabawa, L.; Kuhlmann, H. Quality Analysis of a High-Precision Kinematic Laser Scanning System for the Use of Spatio-Temporal Plant and Organ-Level Phenotyping in the Field. Remote Sens. 2023, 15, 1117. https://doi.org/10.3390/rs15041117
