Article

Growth Height Determination of Tree Walls for Precise Monitoring in Apple Fruit Production Using UAV Photogrammetry

Marius Hobart, Michael Pflanz, Cornelia Weltzien and Michael Schirrmann

1 Leibniz Institute for Agricultural Engineering and Bioeconomy, 14469 Potsdam, Germany
2 Technische Universität Berlin, Institut für Konstruktion, Mikro- und Medizintechnik, FG Agromechatronik, 10623 Berlin, Germany
* Author to whom correspondence should be addressed.
Remote Sens. 2020, 12(10), 1656; https://doi.org/10.3390/rs12101656
Submission received: 30 March 2020 / Revised: 14 May 2020 / Accepted: 19 May 2020 / Published: 21 May 2020
(This article belongs to the Special Issue 3D Point Clouds for Agriculture Applications)

Abstract

In apple cultivation, spatial information about phenotypic characteristics of tree walls would be beneficial for precise orchard management. Unmanned aerial vehicles (UAVs) can collect 3D structural information of ground surface objects at high resolution in a cost-effective and versatile way using photogrammetry. The aim of this study is to delineate tree wall height information in an apple orchard applying a low-altitude flight pattern specifically designed for UAVs. This flight pattern implies small distances between the camera sensor and the tree walls, with the camera in an oblique view toward the trees. In this way, the depicted tree crown wall area is largely covered and sampled at a finer resolution than from a nadir perspective, especially in the lower crown sections. Overlapping oblique-view images were used to estimate 3D point cloud models by applying structure-from-motion (SfM) methods, and tree wall heights were calculated from them. The resulting height models were compared with ground-based light detection and ranging (LiDAR) data as reference. The tree wall profiles from the UAV point clouds were strongly correlated with the LiDAR point clouds in both years (2018: R² = 0.83; 2019: R² = 0.88). However, tree wall heights were underestimated, with mean deviations of −0.11 m and −0.18 m for 2018 and 2019, respectively. This is attributed to the weakness of the UAV point clouds in resolving the very fine shoots of apple trees. The approach is therefore suitable for precise orchard management, but the underestimation of vertical tree wall expanse and the widening of tree gaps need to be accounted for.

1. Introduction

In commercial fruit orchards, heterogeneous environmental conditions should be considered in management decisions when aiming for an optimal and sustainable yield. Over the lifespan of a tree, its growth and fruit ripening are influenced by numerous long-term factors, such as soil conditions, topography, and microclimate, as well as by short-term influences, such as insect pests and orchard management measures [1,2,3,4]. Accordingly, eco-physiological and morphological traits differ from tree to tree, such as canopy structure [5], water status [6], and yield amount and quality [5,7,8,9].
Therefore, information about the spatio-temporal heterogeneity of the influencing factors in an orchard would be highly beneficial for sustainable orchard management, where site-specific approaches could achieve more precise plant protection, pruning, watering, or fertilization. While the benefits of site-specific strategies have been studied [10,11,12] and reviewed for decades in arable crop production [13,14,15], they are becoming increasingly important for precision fruticulture [16,17,18,19]. Compared to arable crops, fruit plantations show more complex three-dimensional structures and leaf densities, and the focus lies on larger individual plants, which determine productivity and resource effort. Therefore, site-specific management decisions in horticulture require spatio-temporal data with highly resolved, georeferenced, three-dimensional information. To meet these demands, different sensors have been researched for horticultural needs [20]. As manual data collection is too labor-intensive and expensive for sampling at reasonable scale, ground-based approaches have been used for monitoring tree growth in fruit orchards. For this purpose, imaging data and laser scanner data (light detection and ranging, LiDAR) have mainly been recorded and analyzed from ground vehicles [21].
Since unmanned aerial vehicles (UAVs) have been successfully introduced in horticulture, high-resolution sensor data from individual trees and fruit walls have been automatically recorded in a cost-effective and versatile way [22,23]. UAVs are sensor platforms that can be operated individually at low altitudes and bridge the gap between satellite and ground-based observation [24,25]. In contrast to satellite remote sensing, UAVs can collect data independently of cloud coverage and at user-defined points during the entire growth period of trees. Both UAV platforms and flight mission plans are highly customizable, allowing different sensor implementation, flight altitudes, and viewpoint angles, which can be changed ad hoc or systematically during the flight mission. UAV flight campaigns can be conducted with high flight frequency and narrow measurement grids.
Due to their flexibility, UAVs are suitable for many agronomic questions in precision farming and fruticulture. They have been used for individual tree detection in orchards [26,27] as well as tree detection and species recognition in forests [28]. Many studies have described the potential of delineating structural growth parameters of trees, such as height, canopy volume, and leaf area index (LAI), with UAVs. Multispectral [29,30,31,32,33], hyperspectral [34,35], and thermal infrared cameras [36,37,38,39], simple RGB cameras [40], and combinations of sensors [41] have been researched to delineate specific orchard information from UAV platforms, including the detection of diseases and stress, radiation interception, and energy fluxes. LiDAR sensors have also been attached to UAVs for the estimation of 3D structural information [21].
For low-cost surveys, UAVs can be equipped with consumer-grade cameras, turning the system into an image-based mapping system. Currently available photogrammetry software has been adapted to the needs of UAV imagery and can handle thousands of images for processing orthophotos and 3D surface models. With the structure-from-motion (SfM) approach, camera parameters and positions are approximated from overlapping images. To do so, key points are detected in each image, e.g., by applying the scale-invariant feature transform (SIFT), and matching points of corresponding images are then found within the overlapping space. Based on the matching points, the camera position and parameters for each image can be found with a bundle adjustment algorithm, and a sparse 3D cloud can be inferred. A dense 3D cloud can then be reconstructed with multi-view stereo algorithms [42]. SfM has been found to be a good low-cost alternative to LiDAR. In a study comparing both sensors on the same UAV platform, SfM represented tree heights in forest stands nearly as accurately, with only slightly higher error [43]. However, studies have shown that SfM does not penetrate gaps in tree canopies as well as LiDAR [43,44].
In studies delineating 3D structural tree information from UAV imagery in orchards, the SfM approach has been used most commonly. Many studies have focused on the spatially distributed information of growth parameters from olive trees, which form rather voluminous canopies in orchards [22,45,46]. Studies have shown that this is possible with high flight altitudes of the UAV in case of large, single standing trees [26,47,48]. Because of the large depicted ground area in each image, these studies benefit from high image overlap, but they suffer from low ground sampling distance (GSD), and thus fewer details, especially for the lower tree crowns.
In commercial apple orchards, trees are planted in rows with short planting distances and are regularly pruned, so that they constitute a wall-like structure. Furthermore, the individual plants are rather thin and small, so the formed tree walls are translucent. The orchard's ground surface therefore shows through the trees, while typically offering little contrast to the trees in images. Thus, small branches, such as the fine shoots in the upper part of apple tree walls, can be problematic for correct SfM reconstruction in UAV photogrammetry.
To address this issue, we suggest a new flight pattern for UAV photogrammetry to better delineate apple tree walls in 3D reconstructions. Imagery was collected from a UAV at a low flight altitude to enhance the GSD, the camera was adjusted to an oblique perspective toward the tree wall crown area for an enhanced tree profile capture, and a specially designed flight route along the tree rows was used. In this way, the depicted crown area along the tree walls was maximized. This highly detailed, oblique flight setting is a novelty in apple orchards. The idea was that structure delineation would become more accurate, because fine structures such as small branches can be better resolved. Thus, the overall objective of this study was to estimate 3D point clouds from UAV photogrammetry for two years in an apple orchard using the suggested flight pattern, to assess the accuracy of the derived tree wall profiles, and to show their potential for monitoring apple orchards.

2. Materials and Methods

2.1. Test Site

Measurement campaigns were conducted on 26 April 2018 and 24 April 2019 in Bavendorf at the Kompetenzzentrum Obstbau-Bodensee (KOB), Germany (47°46′9″N, 9°33′31″E). The test site was planted in 2011 with 6 rows of 56 trees each of the apple variety Kanzi. Three rows were chosen for the measurement campaign, spanning an area of about 550 m². Of the initial 168 trees in the test area, approximately half had to be removed because of the variety's vulnerability to fruit tree canker (Neonectria ditissima), and some trees were cut back to half their height. The orchard was chosen for its high heterogeneity of tree height; the gaps between the trees formed an environment in which site-specific management is reasonable. The trees were regularly pruned to slender spindles. The row distance was 3.2 m and the distance between trees was 1 m, constituting continuous tree walls.

2.2. UAV Measurements

The flight campaigns were conducted shortly after the full blossom of the trees. The flight missions were flown with an octocopter (CiS GmbH, Rostock, Germany) carrying a consumer-grade RGB camera (α-6000, Sony, Tokyo, Japan). The system had a takeoff weight of less than 2 kg and was capable of flight times of up to 30 min. To reduce motion blur and keep the camera angle fixed, a two-axis gimbal was used. The camera had a 24.7-megapixel APS-C chip with a sensor size of 23.6 mm × 15.8 mm and a resulting pixel pitch of 3.9 µm. The focal length used in all campaigns was 16 mm and the aperture was set to 5.6. The light sensitivity was set to ISO 800 and ISO 400 in 2018 and 2019, respectively. The shutter speed was set to adaptive due to changing light conditions.
Two flight patterns were used on each date, as depicted in Figure 1: an overview flight (Figure 1a) and a detailed flight (Figure 1b). During the overview flight, the orchard was captured from a wider perspective beyond its boundaries. The UAV flew along the contour of the orchard at an altitude of 20 m, and images were taken at an angle of 9° off nadir. The resulting sampling distance (SD) on the tree wall surface was about 4.63 mm for the first captured tree row. The SD is comparable to the ground sampling distance but calculated for the vertical tree wall surface at a tree height of 1.5 m above ground level. The detailed flight was conducted directly afterwards at a flight altitude of 10 m. The flight path followed the tree rows while the camera took oblique shots at an angle of 20° off nadir, providing a clear view of the neighboring tree row. The resulting SD on the depicted neighboring tree row was 2.25 mm. The flight speed was set to 1 m/s, which resulted in a forward image overlap of 92% and 85% on the first depicted tree wall for the contour and detailed flight, respectively.
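To illustrate how these sampling distances follow from the camera geometry, the R sketch below applies the ground-sampling-distance formula to the vertical tree wall surface. The camera constants (3.9 µm pixel pitch, 16 mm focal length) are taken from the text; the sensor-to-wall distances are assumptions back-calculated from the reported SD values, not figures stated in the paper.

```r
# Sampling distance (SD) on the tree wall:
# SD = pixel_pitch * sensor-to-wall distance / focal length.
sd_wall <- function(pixel_pitch_m, focal_m, distance_m) {
  pixel_pitch_m * distance_m / focal_m
}

# Assumed slant distances (~19 m for the overview, ~9.2 m for the detailed
# flight), back-calculated from the reported SD values.
sd_wall(3.9e-6, 16e-3, 19.0) * 1000  # ~4.63 mm on the wall (overview flight)
sd_wall(3.9e-6, 16e-3,  9.2) * 1000  # ~2.24 mm on the wall (detailed flight)
```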
For georeferencing and for combining several point clouds, marker plates were placed visibly within the orchard. The coordinates of each plate were recorded using a GNSS-RTK system (HiPer Pro, Topcon, Tokyo, Japan). The coordinates were stored in the WGS84 (EPSG:4326) coordinate system and later transformed to the ETRS89 / UTM zone 32N (EPSG:25832) coordinate system.
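This coordinate transformation can be reproduced with the sf package in R, as in the hedged sketch below; the coordinate shown is the orchard reference point from Section 2.1 used for illustration, not an actual marker plate position.

```r
library(sf)

# Illustrative point near the test site in WGS84 (EPSG:4326); the actual
# marker plate coordinates were recorded with the GNSS-RTK system.
marker <- st_sfc(st_point(c(9.5586, 47.7692)), crs = 4326)

# Transform to ETRS89 / UTM zone 32N (EPSG:25832), the system used for the
# georeferenced point clouds.
st_transform(marker, 25832)
```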

2.3. Ground-Based Reference Measurements

LiDAR reference measurements were carried out with an LD-MRS 400001 laser scanner (SICK AG, Waldkirch, Germany) on the same dates as the UAV flights. The monochromatic outdoor laser scanner measured 94 mm × 165 mm × 88 mm. It was set to an opening angle of 110° with an angular resolution of 0.25°, which allowed measurements from a minimum distance of 0.5 m. The LiDAR sensor was mounted vertically on a tractor 2 m above the ground with a preset angle of 0° and a measurement frequency of 25 Hz. The tree rows were scanned at a slow driving speed of about 0.4 m/s, resulting in a scan resolution of 16 mm along the tree row. Each tree row was scanned from both sides.

2.4. Data Analysis

The images of each flight campaign were photogrammetrically processed (Metashape, Agisoft LLC, St. Petersburg, Russia) to estimate the 3D point clouds. The software uses SfM and multi-view stereo reconstruction. The parameters for calculating image positions and orientations and for matching overlapping neighboring images were set to high quality. To improve the quality of the dense cloud reconstruction, the sparse cloud was manually filtered: outlier points were eliminated by setting thresholds for reprojection error, reconstruction uncertainty, and projection accuracy to maximum values of 0.1, 50, and 10, respectively. The calculated image positions, orientations, and matches were then updated based on the remaining tie points. The reconstruction of the dense 3D point cloud was set to "high" quality with "mild" depth filtering. Computing the point clouds from the 546 and 578 aligned photographs took about 83 and 200 min for 2018 and 2019, respectively. Finally, the resulting 3D point clouds were georeferenced in the ETRS89 / UTM zone 32N (EPSG:25832) coordinate system and the tree rows were cut out.
The LiDAR data were recorded by custom software that visualizes the current scanner data on the vehicle for testing purposes and simultaneously saves the data in a custom binary format (.sld) for further processing. The format consists of a 14-byte identification char array, followed by a binary header with information on the LiDAR scanner, the number of scan points per rotation, and the total number of scans. Following this header, the data of all scanned points are likewise stored in binary format.
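Since the .sld container is custom, its exact field layout is not public. The sketch below shows how such a file could be parsed in R with readBin; apart from the stated 14-byte identification array, all field sizes and types are assumptions.

```r
# Minimal sketch of parsing the custom .sld container. Only the 14-byte
# identification array is specified in the text; int32 header fields and
# float32 (distance, angle) point records are assumed placeholders.
read_sld <- function(path) {
  con <- file(path, "rb")
  on.exit(close(con))
  id          <- readChar(con, nchars = 14, useBytes = TRUE)  # identification char array
  pts_per_rot <- readBin(con, "integer", n = 1, size = 4)     # scan points per rotation (assumed int32)
  n_scans     <- readBin(con, "integer", n = 1, size = 4)     # total number of scans (assumed int32)
  # Point records: one (distance, angle) pair per echo (assumed float32).
  raw <- readBin(con, "double", n = 2 * pts_per_rot * n_scans, size = 4)
  pts <- matrix(raw, ncol = 2, byrow = TRUE,
                dimnames = list(NULL, c("distance", "angle")))
  list(id = id, pts_per_rot = pts_per_rot, n_scans = n_scans, points = pts)
}
```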
To obtain the point cloud reconstruction from the LiDAR data, a basic coordinate transformation for the given 2D LiDAR was performed by an evaluation script (.m) programmed in MATLAB (version 2018a or 2019a, depending on the year of measurement; MathWorks, Natick, MA, USA). The moving direction of the vehicle on which the LiDAR was mounted is represented on the x-axis. The y- and z-axes correspond to tree height and to the distance between sensor and tree row, respectively; the latter values are based on the time of flight of the laser pulse and the mirror angle. Since the LiDAR instrument has a fixed position on the vehicle, new x, y, z coordinates were calculated from each echo position. The exact procedure is described in more detail in Selbeck et al. [49]. The calculated LiDAR point cloud is oriented in a local but metric coordinate system.
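The original transformation is implemented in MATLAB [49]; the following R sketch reproduces its logic under assumed geometry (mirror angle 0° pointing horizontally at the tree row, sensor mounted 2 m above ground, constant driving speed along x). The function name and argument layout are illustrative only.

```r
# Hedged sketch of the 2D-LiDAR coordinate transformation described above.
lidar_to_xyz <- function(distance, angle_deg, scan_id,
                         speed = 0.4, scan_freq = 25, mount_height = 2.0) {
  a <- angle_deg * pi / 180
  data.frame(
    x = scan_id * speed / scan_freq,      # travel direction: 0.4 m/s / 25 Hz = 16 mm per scan
    y = mount_height + distance * sin(a), # tree height above ground
    z = distance * cos(a)                 # horizontal range toward the tree row
  )
}
```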
The two point clouds, one for each side of the tree row, were then merged into a single point cloud per row. The alignment was performed with the maximum cover ratio function in CloudCompare [50]. As a prerequisite, 10 matching points were manually selected in the two opposing point clouds. The automatic matching of the point clouds was completed with a root mean square difference of 0.00001 m to yield exactly matched clouds. Furthermore, a targeted overlap of 90% was set because the viewpoints of the LiDAR sensor differed, so the point clouds did not show entirely the same objects. To diminish the influence of driving speed on the LiDAR point clouds, subparts along the tree row were aligned separately. In addition, obvious noise points were manually removed from the reference data. The combined point cloud was then scaled in the x- and y-axes to the extents of the already georeferenced UAV point cloud; the height was not scaled. The LiDAR point cloud was then translated and rotated to align with the coordinate system and orientation of the UAV point clouds. After registration, points representing hail net poles or single plant growth rods were deleted in both the UAV and LiDAR point clouds, so that only the trees and the orchard ground remained.
In the next step, the distances between tree points and the immediate ground were calculated using an R script [51]. For this, the point clouds were first classified into tree and ground points using the cloth simulation filter according to Zhang et al. [52], implemented in the lidR package [53]. The rigidity and cloth resolution parameters were set to 0.5 and the slope parameter to 1. A constant threshold of 0.25 m distance to the simulated cloth was used to discriminate the point cloud into ground and non-ground classes. With these settings, adequate separation between tree and ground points was achieved, except for small tree stumps. To remove the effect of the tree stumps and yield an even reference surface, a plane was estimated within the ground point cloud using a local linear regression model with the loess function [51]. Only a small degree of smoothing was used by setting the span parameter to 0.3. The predicted grid surface had a resolution of 0.01 m and was used as the ground base surface. As a final step, the distances between the tree points and the nearest point on the ground base surface were calculated, constituting the height of each tree point above the surface. These distances were found with a nearest-neighbor approach based on a KD-tree search for 3D coordinates, implemented in the R package Rvcg (modified from Schlager [54]). Tree point distances were calculated for both the UAV and LiDAR point clouds.
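A condensed sketch of these three steps is given below, assuming a recent lidR version and an input data frame pc with x, y, z columns; the mapping of the paper's filter settings onto the csf() arguments is our assumption, not a published configuration.

```r
library(lidR)  # cloth simulation filter (CSF) ground classification
library(Rvcg)  # KD-tree nearest-neighbor search

# 1) Classify ground vs. tree points with the cloth simulation filter.
las <- LAS(data.frame(X = pc$x, Y = pc$y, Z = pc$z))     # pc: assumed point cloud data frame
las <- classify_ground(las, csf(class_threshold = 0.25,  # 0.25 m distance-to-cloth threshold
                                cloth_resolution = 0.5,
                                rigidness = 1))
ground <- las@data[las@data$Classification == 2, ]       # LAS class 2 = ground
trees  <- las@data[las@data$Classification != 2, ]

# 2) Fit a smooth ground surface by local linear regression (loess, span 0.3)
#    and predict it on a 0.01 m grid.
fit  <- loess(Z ~ X + Y, data = ground, span = 0.3, degree = 1)
grid <- expand.grid(X = seq(min(ground$X), max(ground$X), by = 0.01),
                    Y = seq(min(ground$Y), max(ground$Y), by = 0.01))
grid$Z <- predict(fit, newdata = grid)

# 3) Height of each tree point = distance to the nearest point on the
#    predicted ground surface, via a KD-tree search.
nn <- vcgKDtree(as.matrix(grid[, c("X", "Y", "Z")]),
                as.matrix(trees[, c("X", "Y", "Z")]), k = 1)
trees$height <- nn$distance[, 1]
```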
All tree point heights from the point clouds were then summarized along a georeferenced 0.25 m grid positioned along the tree rows by finding the maximum tree point distance in each grid cell using QGIS [55]. For this, each tree point height was perpendicularly projected into a grid cell with the merge attributes by position function, and all points of the same cell received an identifier (ID). To estimate the tree wall height per cell, the maximum z-coordinate value was retrieved and stored for each point cloud. In this way, the contour of the tree wall, as calculated from the UAV and LiDAR point clouds, could be retrieved at a spatial resolution of 0.25 m. An overview of the whole data processing is shown in Figure 2.
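The grid summarization was done in QGIS; an equivalent sketch in base R is shown below, assuming the tree points have already been projected to an along-row coordinate (the variable names along and height are illustrative).

```r
# Summarize tree point heights into 0.25 m cells along the row.
# 'along': assumed along-row coordinate of each tree point (m);
# 'height': distance of the point to the ground surface (m).
wall_contour <- function(along, height, cell = 0.25) {
  id  <- floor(along / cell)                    # grid-cell ID per point
  agg <- aggregate(height, by = list(id = id), FUN = max)
  data.frame(distance = (agg$id + 0.5) * cell,  # cell-centre position along the row
             height   = agg$x)                  # maximum tree point height per cell
}
```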
To assess UAV point cloud quality, different quality parameters were calculated for the separated tree points of each tree row using CloudCompare [50]. The point cloud density was determined volumetrically with a nearest-neighbor approach: the number of neighboring points within a sphere of 1 m³ (r = 0.620 m) was counted for each point in the cloud and averaged. To determine point cloud completeness, a two-dimensional grid with square cells of 0.1 m width was created in nadir perspective with the same software, individually for all rows. The result is given as the relative proportion of filled cells in the UAV point clouds compared to the filled cells of the LiDAR reference.
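The two quality measures were computed in CloudCompare; the R sketch below reproduces their logic using the dbscan package for the fixed-radius neighbor search. The completeness reading (filled UAV cells among filled LiDAR cells) is one plausible interpretation of the description, and CloudCompare's implementation may differ in detail.

```r
library(dbscan)

# Volumetric point density: mean number of neighbors within a sphere of
# 1 m^3 (r = 0.620 m), i.e. points per cubic metre.
point_density <- function(xyz, r = 0.620) {
  nn <- frNN(xyz, eps = r)   # fixed-radius nearest neighbors for every point
  mean(lengths(nn$id))       # average neighbor count per point
}

# Top-down completeness: share of occupied 0.1 m cells in the UAV cloud
# relative to the occupied cells of the LiDAR reference (nadir projection).
completeness <- function(uav_xy, lidar_xy, cell = 0.1) {
  key <- function(xy) unique(paste(floor(xy[, 1] / cell), floor(xy[, 2] / cell)))
  length(intersect(key(uav_xy), key(lidar_xy))) / length(key(lidar_xy))
}
```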
The tree wall contours of the UAV point clouds were compared with the contours of the LiDAR reference measurements. To compare the maximum height of each grid cell between the UAV and LiDAR data, the mean error (ME) for bias, the mean absolute error (MAE) for accuracy, and the coefficient of determination (R²) as a correlation-based performance measure were calculated as follows:

\[ \mathrm{ME} = \frac{1}{n}\sum_{i=1}^{n}\left(U_i - L_i\right) \]

\[ \mathrm{MAE} = \frac{1}{n}\sum_{i=1}^{n}\left|U_i - L_i\right| \]

\[ R^2 = \frac{\left(\frac{1}{n}\sum_{i=1}^{n}\left(U_i-\bar{U}\right)\left(L_i-\bar{L}\right)\right)^2}{\left(\frac{1}{n}\sum_{i=1}^{n}\left(U_i-\bar{U}\right)^2\right)\left(\frac{1}{n}\sum_{i=1}^{n}\left(L_i-\bar{L}\right)^2\right)} \]

where U_i and L_i denote the contour heights of the tree wall calculated from the UAV and LiDAR point clouds, and \bar{U} and \bar{L} their means. The comparison was carried out for each tree row separately and for all data pooled together.
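These error measures translate directly into a few lines of R; the example data below are synthetic and only mimic the 2018-like bias for illustration.

```r
# Agreement measures between UAV (u) and LiDAR (l) tree wall heights per cell.
me  <- function(u, l) mean(u - l)       # mean error (bias)
mae <- function(u, l) mean(abs(u - l))  # mean absolute error (accuracy)
r2  <- function(u, l) cor(u, l)^2       # squared Pearson correlation (R^2)

# Synthetic demonstration data (not study results).
set.seed(1)
l <- runif(100, 0, 3)                 # mock LiDAR heights (m)
u <- l - 0.11 + rnorm(100, 0, 0.15)   # mock UAV heights with a -0.11 m bias
c(ME = me(u, l), MAE = mae(u, l), R2 = r2(u, l))
```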
To analyze the UAV point clouds for missing branches, the cloud-to-cloud distance function in CloudCompare [50] was used, which calculates the distance from each point to the nearest point in a reference point cloud. For this analysis, the LiDAR point cloud was compared against the UAV cloud as reference. In this way, points in the LiDAR cloud representing branches that are missing in the UAV cloud lie farther from their nearest neighbor. The result is a point cloud in which each point's distance to the reference cloud is displayed on a color scale, highlighting the branches missing in the reference point cloud. This analysis was completed for 2018 and 2019 individually. To keep the results comparable, the octree level setting, i.e., the level of subdivision of cubic volumes into which the cloud is divided, was kept fixed at 10 for both calculations. To detect underestimated areas in the UAV point cloud, all points with a distance value of more than 0.2 m were extracted. The resulting sub point cloud was analyzed with the Label Connected Components method in CloudCompare [50]. A minimum cluster size of 30 points and a maximum distance of 48.6 mm between points were set to filter out single points. The resulting clusters were counted and evaluated for 2018 and 2019, respectively.
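An R analogue of this CloudCompare workflow is sketched below, reusing Rvcg's KD-tree for the cloud-to-cloud distance and substituting dbscan for the connected-components labelling; dbscan is a density-based approximation of that step, not the same algorithm. The matrices uav_xyz and lidar_xyz are assumed inputs.

```r
library(Rvcg)
library(dbscan)

# Cloud-to-cloud distance: for every LiDAR point, the distance to its nearest
# neighbor in the UAV cloud (the UAV cloud acts as reference here).
d <- vcgKDtree(uav_xyz, lidar_xyz, k = 1)$distance[, 1]

# Points farther than 0.2 m from the UAV cloud indicate structures (e.g.
# fine shoots) missing in the UAV reconstruction.
missing <- lidar_xyz[d > 0.2, ]

# Cluster the extracted points: an approximation of CloudCompare's Label
# Connected Components with 48.6 mm linking distance and 30-point minimum.
cl <- dbscan(missing, eps = 0.0486, minPts = 30)
length(setdiff(unique(cl$cluster), 0))  # number of clusters (0 = noise)
```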

3. Results

Figure 3 shows a comparison of UAV and LiDAR point clouds for the three apple tree rows A, B, and C for the years 2018 and 2019. The lines show the interpolated values of the maximum tree wall heights as found in each 0.25 m grid cell along the tree rows for both point clouds.
All contour curves for the tree rows showed high correlations in both tested years, as can also be seen from the scatter plots in Figure 4. The highest coefficients of determination between tree wall heights from the UAV and LiDAR point clouds were R² = 0.87 in 2018 and R² = 0.91 in 2019. However, we found two types of deviation between the point clouds. First, along the tree groups, a general weak underestimation by the UAV point clouds was observed, especially for row B in 2018, and a nearly constant underestimation for rows A and B in 2019. Second, there was a deviation in the transitions from tree groups to tree gaps, where the tree height curves show steep slopes. Even though tree gap detection worked well, the width of the tree gaps tended to be overestimated by the UAV point cloud model (e.g., row B in 2019 at a distance of 34–35 m or 42–45 m).
Table 1 lists the quality parameters of the UAV point clouds. The volumetric density of 3D points per row ranged between 4.5 × 10⁴ and 5.1 × 10⁴ points/m³ for 2018 and from 3.8 × 10⁴ to 4.4 × 10⁴ points/m³ for 2019. The top-down point cloud completeness ranged from 76.6% to 86.6% and from 73.1% to 86.2% compared to the LiDAR reference clouds for 2018 and 2019, respectively. For 2018, the fewest points per grid cell, around 16,000, were found in row B, whereas for 2019, row A had the fewest, with around 9000 points per grid cell. This is accompanied by stronger underestimation of tree wall heights compared to the reference. All UAV point clouds were significantly correlated with the LiDAR reference point clouds (Pearson, p < 0.0001). For 2018, R² values ranged between 0.81 and 0.87, with a mean absolute error (MAE) between 0.18 m and 0.23 m. For 2019, R² values ranged between 0.81 and 0.91, with an MAE between 0.21 m and 0.24 m. The relative MAE of the tree height estimations ranged from 12.5% to 15.3% and from 9.2% to 13.2% for 2018 and 2019, respectively. As seen in Figure 3, the UAV point clouds underestimated the LiDAR reference model to some extent, on average, across the three rows and both years.
In Figure 4, all tree height estimations are pooled in a scatter plot showing the UAV data against the LiDAR data. With R² values of 0.83 and 0.88 for 2018 and 2019, respectively, the tree heights estimated from the UAV model fit the reference model quite well. However, a general underestimation by the UAV point cloud leads to right-shifted points. This effect is intensified by zero values in the UAV point cloud (17 and 21 of 706 data points in 2018 and 2019, respectively), caused by widened tree gaps. Additionally, there were some zero values in the reference cloud (14 and 18 of 706 data points), where noisy points in the UAV model led to tree height estimations.
Figure 5 shows a more detailed comparison of the UAV and LiDAR point clouds for row C in 2018 (Figure 5c). As shown before, the general structures, e.g., tree profiles and gaps, are mapped accurately by the UAV point cloud. However, the LiDAR data resolve more details and characteristics of the apple trees, despite containing fewer points; the LiDAR point cloud even rendered the fine shoots of the trees. In the UAV point cloud, it is noticeable that the points are not as homogeneously distributed over the row as the reference points. In the example, the last tree group (distance > 50 m) shows a structural quality visually about the same as in the reference cloud, whereas the first tree group (distance < 10 m) shows a more porous profile in the UAV point cloud. Despite the heterogeneous point distribution in the UAV point cloud, however, the delineated tree heights were hardly affected (Figure 5a). The bar plot in Figure 5b shows the differences in maximum tree wall height estimations between both models. In general, the differences are rather small, with a tendency to underestimate the LiDAR tree wall height reference to some extent. Comparing the two point clouds shows that this stems from the SfM approach missing fine structures of the tree walls, such as small shoots in the upper part of the trees. Larger deviations occur mainly in the vicinity of tree gaps, in areas with steep slopes of the tree height curves. Here, widened gaps in the UAV point cloud cause strong differences in height estimation: fine lateral shoots are reconstructed shorter, so a gap is detected instead of the tree height value. This increases the underestimation of tree heights by the UAV point cloud.
To visualize this effect in more detail, Figure 6 shows RGB images taken from the UAV and the calculated UAV point cloud superimposed on the reference LiDAR point cloud for five example trees. The orange UAV point cloud outlines the tree crown profile well but shows fewer details in the structures at the crown edges compared to the reference LiDAR point cloud, depicted here with blue points. Due to the lack of fine shoots in the UAV point cloud, the lateral and vertical expanses of the apple tree canopies were attenuated. In the detailed view of the tree wall contour curves, this leads to an underestimation of all tree crown maxima, especially within the canopy gaps and at the beginning and end of tree groups.
To show the effect of underestimated areas, which often correspond to missing branches, over the whole orchard, Figure 7 displays an overview of the distance of each point to its nearest neighbor in the compared cloud. Here, the LiDAR point cloud shows all small shoots. Points representing branches that do not exist in the UAV point cloud are therefore farther from their nearest neighbor and are highlighted in red. The blue centers of the trees can be seen over the whole orchard in both years, so tree structures in general are well represented in the UAV point clouds. However, red clusters at the tops of the trees and in the transitions from gap to tree group indicate problems of the UAV point cloud in depicting these finer structures. This becomes particularly clear in the magnified areas of the tree rows, where missing point cloud clusters, framed with yellow boxes, are located around the trees. Here, as already described, missed fine shoots lead to an underestimation of tree height and to a widening of tree gaps. Over the orchard, 555 and 299 missing point cloud clusters were identified for 2018 and 2019, respectively. It is assumed that different recording conditions, such as different wind conditions, led to the variation in the quality of point cloud delineation for small branches.

4. Discussion

To meet the requirement for more detailed information in modern precision horticulture, UAV-based measurements deliver valuable new data. The experiments carried out within the present framework used low-altitude flights. A consumer-grade camera system together with adapted photogrammetry methods was used to determine the height of apple tree wall sections. The results are based on data from flight campaigns in two years, in which three apple tree rows, initially comprising 168 individuals, were mapped. The comparison of tree wall heights delineated from UAV and reference LiDAR point clouds shows that low-altitude UAV imagery is appropriate for systematically observing apple tree growth within the tree wall sections of orchards.
The correlations found in this study for tree height estimations from UAV point clouds and reference data were comparable with results obtained for olive tree plantations by Díaz-Varela et al. [47], where the root mean squared errors ranged between 6% and 20%. Slightly smaller errors, between 3.5% and 3.8%, corresponding to differences of 0.17 m and 0.18 m, were reported by Torres-Sánchez et al. [48] for large olive tree walls. In comparison, we found slightly higher errors, ranging from 0.18 m to 0.24 m, in our study.
The studies mentioned above used high flight altitudes for nadir UAV measurement campaigns to delineate structural tree parameters. This allows high image overlap, from which SfM photogrammetry benefits. Similarly, Dandois et al. [56] found that a high forward overlap of UAV imagery, from 60% to 96%, is crucial for minimizing the canopy height error of forest point clouds. Torres-Sánchez et al. [46] came to a similar conclusion: a forward overlap of 95% was appropriate for 3D surface modelling of orchard trees with regard to processing time and flight time. All of these studies used higher flight altitudes than ours and therefore benefited from the larger ground area covered by each image, so the trade-off between image overlap and flight time is not as critical as it is for low-altitude UAV imagery. In return, however, they lose detail due to the greater distance to the ground surface.
However, with further advancements in battery technology, flight time limitations will become less restrictive, so that a high percentage of image overlap can be combined with low-altitude flights. Seifert et al. [57] found that this would give the best reconstruction quality for forest trees. Taking processing time as a practical limiting factor, the authors suggested 95% as a reasonable forward overlap for low flight altitudes [57].
For oblique image flight campaigns in orchards, the target area from which image data should be acquired with high overlap is the tree wall area, in order to receive maximum information. In this study, the flight patterns were set such that the forward overlap along the tree wall was about 85%, owing to the low flight velocity of 1 m/s. For more stable point cloud calculation, the flight velocity could be decreased even further. In that way, areas of lower point density in the point clouds would be avoided, reducing the underestimation of crown height. This is plausible because regions of a point cloud with fewer points than their surroundings suffer from greater uncertainty in the 3D reconstruction, making it even less probable that finer structures are recovered; such areas are therefore strongly underestimated. Nevertheless, it was shown that even poorly resolved tree wall sections in a UAV point cloud can yield stable, if underestimated, height information.
A major challenge for UAV photogrammetry in our study was the reconstruction of fine shoots, which develop each year after the slender apple spindles of the tree walls are pruned. We found 555 and 299 missing point clusters, interpreted as missing shoots, over the orchard for 2018 and 2019, respectively. These small structures had very low color contrast against the background, a common problem in apple orchards [58], and spanned only a small area in the photographs, which makes the distinction between foreground and background challenging [59]. In addition, small branches can easily be moved by the wind by multiples of their own diameter. These problems make the SfM approach struggle to estimate accurate point locations for fine structures in dense clouds. This is consistent with other research findings: Fritz et al. [60] found that thinner branches are likely to remain undetected in SfM-based point clouds from non-nadir forest overflights and concluded that a systematic underestimation of tree heights can hardly be prevented. In our study, this underestimation occurred mainly in two ways. On the one hand, lower crown heights were estimated where thin branches in the upper part of the trees were missing; this effect was generally observed for most tree tops along the apple tree rows. On the other hand, small twigs hanging laterally into the tree wall gaps often could not be reconstructed; this effect occurs around tree gaps and widens them in the UAV point cloud. Therefore, underestimations of the tree wall heights occur with the SfM approach at the beginning and end of tree groups. To cope with this problem in crop protection, e.g., for application maps for spraying, a buffer could be used around the tree contour line. Based on the underestimation found in our study, we would suggest a buffer size of about 20 cm.
When calculating three-dimensional point clouds from images, tree parts for which the images provide insufficient two-dimensional data can be missed. Frey et al. [61], working with a UAV-based SfM approach in forests, compared the number of three-dimensional grid cells (voxels) containing points in relation to their most complete model, a measure they call digital surface model completeness. They found a positive correlation between GSD and the completeness of two-dimensional ground surface models, but this correlation disappeared in the three-dimensional voxel space. They concluded that a fine GSD and a high image overlap are both beneficial for sampling the lower tree canopy. In our study, we used a small surface sampling distance, which resulted in good model completeness values ranging from 73.1% to 86.6%. However, image overlap has a stronger effect on 3D model completeness, as Frey et al. [61] noted. Therefore, the problems in reconstructing small shoots in UAV point clouds should be addressed with slower flight speeds and consequently higher image overlap. In summary, there is a trade-off between the level of detail in the point clouds and the processing time. For coarse structure delineation, a high flight altitude and fast point cloud processing would be sufficient, while tasks such as site-specific plant protection and yield estimation benefit from a higher level of detail. For these applications, maximizing the crown wall area in photographs taken at a low flight altitude and an oblique view, combined with a high forward overlap of the images, is recommended.

5. Conclusions

The study findings show that the heights of slender apple tree walls can be delineated from SfM-estimated point clouds based on oblique, close-range UAV imagery, with R² values ranging from 0.81 to 0.91. Our suggested flight pattern is therefore a suitable tool for structural tree wall analysis, with the benefit of providing more detailed data for apple orchards, specifically for precise horticultural applications. Due to the low flight altitude, UAV photogrammetry had to be calculated with a slightly lower image overlap to maintain reasonable flight times. The accurate delineation of fine structures in the tree walls, however, was challenging for SfM, because small shoots showed little contrast to the background and could easily be moved by the wind. This led to wider gaps in the tree walls and to missing shoots in their upper parts in the point clouds. For these reasons, a slight but general underestimation was observed with UAV photogrammetry. To address this issue, we suggest the use of buffers for precise orchard applications or a higher forward overlap of the imagery; the latter, however, requires greater effort in data collection and processing time.

Author Contributions

Conceptualization, M.P. and M.S.; Data curation, M.H. and M.P.; Formal analysis, M.H. and M.S.; Funding acquisition, M.P. and M.S.; Investigation, M.H., M.P. and M.S.; Methodology, M.H., M.P. and M.S.; Project administration, M.P. and M.S.; Visualization, M.H. and M.P.; Writing—original draft, M.H., M.P. and M.S.; Writing—review and editing, C.W. All authors have read and agreed to the published version of the manuscript.

Funding

The research project on which this paper is based was funded by the German government, grant number 2814903915.

Acknowledgments

The authors thank Christian Scheer and Magdalena Proske for all the fruitful discussions and for providing and maintaining the experimental orchard. Further, we would like to thank Konni Biegert. For fundamental project support, we thank Ludwig Schrenk and his team from CiS GmbH Rostock, Mirco Bohte, and Sergej Krukouski, who helped us with their expertise in UAV flight planning and execution as well as point cloud generation. We would also like to thank Antje Giebel, who always came to our assistance when there were problems with the 3D point cloud calculation, and Michael Heisig, who helped with the LiDAR point cloud calculation. Further, we thank Klaus Grohmann for his patient graphic advice. We also thank Bommie Xiong, section managing editor, for handling the manuscript, and three anonymous reviewers for their constructive comments.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Neilsen, D.; Neilsen, G.; Guak, S.; Forge, T. Consequences of Deficit Irrigation and Crop Load Reduction on Plant Water Relations, Yield, and Quality of ‘Ambrosia’ Apple. HortScience 2016, 51, 98–106.
2. Yuri, J.; Talice, J.G.; Verdugo, J.; Del Pozo, A. Responses of fruit growth, quality, and productivity to crop load in apple cv. Ultra Red Gala/MM111. Sci. Hortic. 2011, 127, 305–312.
3. Leser, C.; Treutter, D. Effects of nitrogen supply on growth, contents of phenolic compounds and pathogen (scab) resistance of apple trees. Physiol. Plant. 2005, 123, 49–56.
4. Huguet, J.-G.; Li, S.; Lorendeau, J.-Y.; Pelloux, G. Specific micromorphometric reactions of fruit trees to water stress and irrigation scheduling automation. J. Hortic. Sci. 1992, 67, 631–640.
5. Schumann, A.; Miller, W.; Zaman, Q.; Hostler, K.; Buchanon, S.; Cugati, S. Variable rate granular fertilization of citrus groves: Spreader performance with single-tree prescription zones. Appl. Eng. Agric. 2006, 22, 19–24.
6. Käthner, J. Interaction of Spatial Variability Characterized by Soil Electrical Conductivity and Plant Water Status Related to Generative Growth of Fruit Trees. Ph.D. Thesis, University of Potsdam, Potsdam, Germany, 2016.
7. Aggelopoulou, K.D.; Wulfsohn, D.; Fountas, S.; Gemtos, T.A.; Nanos, G.D.; Blackmore, S. Spatial variation in yield and quality in a small apple orchard. Precis. Agric. 2009, 11, 538–556.
8. Perry, E.M.; Dezzani, R.J.; Seavert, C.F.; Pierce, F.J. Spatial variation in tree characteristics and yield in a pear orchard. Precis. Agric. 2010, 11, 42–60.
9. Liu, X.; Chen, S.W.; Aditya, S.; Sivakumar, N.; Dcunha, S.; Qu, C.; Taylor, C.J.; Das, J.; Kumar, V. Robust fruit counting: Combining Deep Learning, Tracking, and Structure from Motion. In Proceedings of the 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, Spain, 1–5 October 2018.
10. Pathak, H.S.; Brown, P.; Best, T. A systematic literature review of the factors affecting the precision agriculture adoption process. Precis. Agric. 2019, 20, 1292–1316.
11. Dammer, K.-H.; Ehlert, D. Variable-rate fungicide spraying in cereals using a plant cover sensor. Precis. Agric. 2006, 7, 137–148.
12. Pflanz, M.; Nordmeyer, H.; Schirrmann, M. Weed Mapping with UAS Imagery and a Bag of Visual Words Based Image Classifier. Remote Sens. 2018, 10, 1530.
13. Thompson, J.; Stafford, J.; Miller, P. Potential for automatic weed detection and selective herbicide application. Crop Prot. 1991, 10, 254–259.
14. Auernhammer, H. Precision farming—The environmental challenge. Comput. Electron. Agric. 2001, 30, 31–43.
15. Bhakta, I.; Phadikar, S.; Majumder, K. State-of-the-art technologies in precision agriculture: A systematic review. J. Sci. Food Agric. 2019, 99, 4878–4888.
16. Doruchowski, G.; Swiechowski, W.; Godyn, A.; Holownicki, R. Automatically controlled sprayer to implement spray drift reducing application strategies in orchards. J. Fruit Ornam. Plant Res. 2011, 19, 175–182.
17. Llorens, J.; Gil, E.; Llop, J.; Escolà, A. Ultrasonic and LIDAR Sensors for Electronic Canopy Characterization in Vineyards: Advances to Improve Pesticide Application Methods. Sensors 2011, 11, 2177–2194.
18. Hočevar, M.; Širok, B.; Jejčič, V.; Godeša, T.; Lešnika, M.; Stajnko, D. Design and testing of an automated system for targeted spraying in orchards. J. Plant Dis. Prot. 2010, 117, 71–79.
19. Pflanz, M.; Ralfs, J.; Pelzer, T. Site-Specific Plant Protection Using Precise Canopy Gap Detection. In Proceedings of the 12th Workshop on Spray Application Techniques in Fruit Growing, Valencia, Spain, 26–28 June 2013.
20. Zude-Sasse, M.; Fountas, S.; Gemtos, T.A.; Abu-Khalaf, N. Applications of precision agriculture in horticultural crops. Eur. J. Hortic. Sci. 2016, 81, 78–90.
21. Escolà, A.; Martínez-Casasnovas, J.A.; Rufat, J.; Arnó, J.; Arbonés, A.; Sebé, F.; Pascual, M.; Gregorio, E.; Rosell-Polo, J.R. Mobile terrestrial laser scanner applications in precision fruticulture/horticulture and tools to extract information from canopy point clouds. Precis. Agric. 2017, 18, 111–132.
22. Zarco-Tejada, P.J.; Diaz-Varela, R.; Angileri, V.; Loudjani, P. Tree height quantification using very high resolution imagery acquired from an unmanned aerial vehicle (UAV) and automatic 3D photo-reconstruction methods. Eur. J. Agron. 2014, 55, 89–99.
23. Mejia-Aguilar, A.; Tomelleri, E.; Vilardi, A.; Zebisch, M. UAV Based Tree Height Estimation in Apple Orchards: Potential of Multiple Approaches. In EGU General Assembly Conference Abstracts; EGU: Vienna, Austria, 2015.
24. Hunt, E.R.; Daughtry, C.S.T. What good are unmanned aircraft systems for agricultural remote sensing and precision agriculture? Int. J. Remote Sens. 2018, 39, 5345–5376.
25. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712.
26. Kattenborn, T.; Sperlich, M.; Bataua, K.; Koch, B. Automatic Single Tree Detection in Plantations using UAV-based Photogrammetric Point Clouds. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2014, XL-3, 139–144.
27. Ok, A.O.; Ozdarici-Ok, A. Detection of citrus trees from UAV DSMs. ISPRS Ann. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, IV-1/W1, 27–34.
28. Nevalainen, O.; Honkavaara, E.; Tuominen, S.; Viljanen, N.; Hakala, T.; Yu, X.; Hyyppä, J.; Saari, H.; Pölönen, I.; Imai, N.; et al. Individual Tree Detection and Classification with UAV-Based Photogrammetric Point Clouds and Hyperspectral Imaging. Remote Sens. 2017, 9, 185.
29. Garcia-Ruiz, F.; Sankaran, S.; Maja, J.M.; Lee, W.S.; Rasmussen, J.; Ehsani, R. Comparison of two aerial imaging platforms for identification of Huanglongbing-infected citrus trees. Comput. Electron. Agric. 2013, 91, 106–115.
30. Bulanon, D.; Lonai, J.; Skovgard, H.; Fallahi, E. Evaluation of Different Irrigation Methods for an Apple Orchard Using an Aerial Imaging System. ISPRS Int. J. Geo-Inf. 2016, 5, 79.
31. Zhao, T.; Chen, Y.; Ray, A.; Doll, D. Quantifying Almond Water Stress Using Unmanned Aerial Vehicles (UAVs): Correlation of Stem Water Potential and Higher Order Moments of Non-Normalized Canopy Distribution. In Proceedings of the ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Cleveland, OH, USA, 6–9 August 2017.
32. Guillen-Climent, M.L.; Zarco-Tejada, P.J.; Berni, J.A.J.; North, P.R.J.; Villalobos, F.J. Mapping radiation interception in row-structured orchards using 3D simulation and high-resolution airborne imagery acquired from a UAV. Precis. Agric. 2012, 13, 473–500.
33. Guillen-Climent, M.L.; Zarco-Tejada, P.J.; Villalobos, F.J. Estimating Radiation Interception in Heterogeneous Orchards Using High Spatial Resolution Airborne Imagery. IEEE Geosci. Remote Sens. Lett. 2013, 11, 579–583.
34. Calderón, R.; Navas-Cortés, J.A.; Lucena, C.; Zarco-Tejada, P.J. High-resolution airborne hyperspectral and thermal imagery for early detection of Verticillium wilt of olive using fluorescence, temperature and narrow-band spectral indices. Remote Sens. Environ. 2013, 139, 231–245.
35. Zarco-Tejada, P.J.; González-Dugo, V.; Berni, J.A.J. Fluorescence, temperature and narrow-band indices acquired from a UAV platform for water stress detection using a micro-hyperspectral imager and a thermal camera. Remote Sens. Environ. 2012, 117, 322–337.
36. Gómez-Candón, D.; Virlet, N.; Labbé, S.; Jolivot, A.; Regnard, J.-L. Field phenotyping of water stress at tree scale by UAV-sensed imagery: New insights for thermal acquisition and calibration. Precis. Agric. 2016, 17, 786–800.
37. Gonzalez-Dugo, V.; Zarco-Tejada, P.; Nicolás, E.; Nortes, P.A.; Alarcón, J.J.; Intrigliolo, D.S.; Fereres, E. Using high resolution UAV thermal imagery to assess the variability in the water status of five fruit tree species within a commercial orchard. Precis. Agric. 2013, 14, 660–678.
38. Gonzalez-Dugo, V.; Zarco-Tejada, P.J.; Fereres, E. Applicability and limitations of using the crop water stress index as an indicator of water deficits in citrus orchards. Agric. For. Meteorol. 2014, 198–199, 94–104.
39. Park, S.; Ryu, D.; Fuentes, S.; Chung, H.; Hernández-Montes, E.; O’Connell, M. Adaptive Estimation of Crop Water Stress in Nectarine and Peach Orchards Using High-Resolution Imagery from an Unmanned Aerial Vehicle (UAV). Remote Sens. 2017, 9, 828.
40. Psirofonia, P.; Samaritakis, V.; Eliopoulos, P.; Potamitis, I. Use of Unmanned Aerial Vehicles for Agricultural Applications with Emphasis on Crop Protection: Three Novel Case Studies. Int. J. Agric. Sci. Technol. 2017, 5, 30–39.
41. Ortega-Farías, S.; Ortega-Salazar, S.; Poblete, T.; Kilic, A.; Allen, R.; Poblete-Echeverría, C.; Ahumada-Orellana, L.; Zuñiga, M.; Sepúlveda, D. Estimation of Energy Balance Components over a Drip-Irrigated Olive Orchard Using Thermal and Multispectral Cameras Placed on a Helicopter-Based Unmanned Aerial Vehicle (UAV). Remote Sens. 2016, 8, 638.
42. Smith, M.W.; Carrivick, J.L.; Quincey, D.J. Structure from motion photogrammetry in physical geography. Prog. Phys. Geogr. Earth Environ. 2016, 40, 247–275.
43. Wallace, L.; Lucieer, A.; Malenovský, Z.; Turner, D.; Vopěnka, P. Assessment of Forest Structure Using Two UAV Techniques: A Comparison of Airborne Laser Scanning and Structure from Motion (SfM) Point Clouds. Forests 2016, 7, 62.
44. Guerra-Hernández, J.; Cosenza, D.N.; Rodriguez, L.C.E.; Silva, M.; Tomé, M.; Díaz-Varela, R.A.; González-Ferreiro, E. Comparison of ALS- and UAV(SfM)-derived high-density point clouds for individual tree detection in Eucalyptus plantations. Int. J. Remote Sens. 2018, 39, 5211–5235.
45. Jiménez-Brenes, F.M.; López-Granados, F.; de Castro, A.I.; Torres-Sánchez, J.; Serrano, N.; Peña, J.M. Quantifying pruning impacts on olive tree architecture and annual canopy growth by using UAV-based 3D modelling. Plant Methods 2017, 13, 55.
46. Torres-Sánchez, J.; López-Granados, F.; Borra-Serrano, I.; Peña, J.M. Assessing UAV-collected image overlap influence on computation time and digital surface model accuracy in olive orchards. Precis. Agric. 2018, 19, 115–133.
47. Díaz-Varela, R.; de la Rosa, R.; León, L.; Zarco-Tejada, P. High-Resolution Airborne UAV Imagery to Assess Olive Tree Crown Parameters Using 3D Photo Reconstruction: Application in Breeding Trials. Remote Sens. 2015, 7, 4213–4232.
48. Torres-Sánchez, J.; López-Granados, F.; Serrano, N.; Arquero, O.; Peña, J.M. High-Throughput 3-D Monitoring of Agricultural-Tree Plantations with Unmanned Aerial Vehicle (UAV) Technology. PLoS ONE 2015, 10, e0130479.
49. Selbeck, J.; Dworak, V.; Ehlert, D. Testing a vehicle-based scanning lidar sensor for crop detection. Can. J. Remote Sens. 2010, 36, 24–35.
50. CloudCompare (GPL Software). Available online: https://www.danielgm.net/cc/ (accessed on 20 May 2020).
51. R Core Team. R: A Language and Environment for Statistical Computing; R Foundation for Statistical Computing: Vienna, Austria, 2019.
52. Zhang, K.; Chen, S.-C.; Whitman, D.; Shyu, M.-L.; Yan, J.; Zhang, C. A progressive morphological filter for removing nonground measurements from airborne LIDAR data. IEEE Trans. Geosci. Remote Sens. 2003, 41, 872–882.
53. Roussel, J.-R.; Auty, D. lidR: Airborne LiDAR Data Manipulation and Visualization for Forestry Applications. Available online: https://rdrr.io/cran/lidR/ (accessed on 20 May 2020).
54. Schlager, S. Morpho and Rvcg—Shape Analysis in R: R-Packages for Geometric Morphometrics, Shape Analysis and Surface Manipulations. In Statistical Shape and Deformation Analysis; Elsevier: Amsterdam, The Netherlands, 2017; pp. 217–256.
55. QGIS Development Team. QGIS Geographic Information System; Open Source Geospatial Foundation Project. Available online: https://qgis.org/de/site/forusers/download.html (accessed on 20 May 2020).
56. Dandois, J.; Olano, M.; Ellis, E. Optimal Altitude, Overlap, and Weather Conditions for Computer Vision UAV Estimates of Forest Structure. Remote Sens. 2015, 7, 13895–13920.
57. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of Drone Altitude, Image Overlap, and Optical Sensor Resolution on Multi-View Reconstruction of Forest Images. Remote Sens. 2019, 11, 1252.
58. Dong, X.; Zhang, Z.; Yu, R.; Tian, Q.; Zhu, X. Extraction of Information about Individual Trees from High-Spatial-Resolution UAV-Acquired Images of an Orchard. Remote Sens. 2020, 12, 133.
59. Klodt, M.; Herzog, K.; Töpfer, R.; Cremers, D. Field phenotyping of grapevine growth using dense stereo reconstruction. BMC Bioinform. 2015, 16, 143.
60. Fritz, A.; Kattenborn, T.; Koch, B. UAV-based photogrammetric point clouds—Tree stem mapping in open stands in comparison to terrestrial laser scanner point clouds. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2013, XL-1/W2, 141–146.
61. Frey, J.; Kovach, K.; Stemmler, S.; Koch, B. UAV Photogrammetry of Forests as a Vulnerable Process: A Sensitivity Analysis for a Structure from Motion RGB-Image Pipeline. Remote Sens. 2018, 10, 912.
Figure 1. Unmanned aerial vehicle (UAV) flight plan for the test site: (a) contour flight and (b) detailed flight along the tree rows.
Figure 2. Overview of data processing steps and their main parameter settings.
Figure 3. Maximum tree wall height estimations from UAV (orange) and light detection and ranging (LiDAR) (blue) point clouds for each 0.25 m in apple tree rows A, B, and C for 2018 (top) and 2019 (bottom).
Figure 4. Scatter plots of estimated tree wall heights for grid cells for 2018 and 2019.
Figure 5. (a) Tree wall height estimations from UAV (orange) and LiDAR (blue) point clouds for each grid cell in apple tree row C in 2018; (b) differences in maximum tree wall height estimation between the UAV and reference point clouds; (c) the corresponding UAV and LiDAR point clouds.
Figure 6. Detailed view of row A (42–50 m) in 2019 as an RGB image (top) and point clouds from UAV (orange) and LiDAR (blue) data superimposed with tree wall height curves from the UAV (orange) and LiDAR (blue) models (bottom).
Figure 7. Detected areas of underestimation (yellow boxes) in the UAV point cloud compared to the LiDAR point cloud (color indicates distance to the nearest point in the reference point cloud; all distances are given in meters), depicted as an overview and magnified for a subsection.
Table 1. Quality parameters for UAV point clouds of tree rows A, B, and C in 2018/19.

Year | Row | Point Density (points/m³) | Median Points per 0.25 m Section | PC Completeness (%) | ME (m) | MAE (m) | R²
2018 | A | 50,746 | 22,202 | 84.1 | −0.09 | 0.20 | 0.83
2018 | B | 45,443 | 16,291 | 76.6 | −0.18 | 0.23 | 0.87
2018 | C | 50,067 | 24,065 | 86.6 | −0.05 | 0.18 | 0.81
2019 | A | 37,920 | 9038 | 73.1 | −0.22 | 0.23 | 0.90
2019 | B | 44,118 | 18,033 | 82.0 | −0.15 | 0.21 | 0.91
2019 | C | 40,045 | 17,641 | 86.2 | −0.17 | 0.24 | 0.81
