Article

Effects of Flight and Processing Parameters on UAS Image-Based Point Clouds for Plant Height Estimation

U.S. Department of Agriculture—Agricultural Research Service, Southern Plains Agricultural Research Center, College Station, TX 77845, USA
*
Author to whom correspondence should be addressed.
Remote Sens. 2026, 18(2), 360; https://doi.org/10.3390/rs18020360
Submission received: 17 December 2025 / Revised: 13 January 2026 / Accepted: 19 January 2026 / Published: 21 January 2026

Highlights

What are the main findings?
  • Plant height estimation accuracy varied across flight altitudes, image overlaps, and crop types, with point clouds consistently outperforming DSMs.
  • Best accuracy occurred at 60–90 m (1.0–1.5 cm GSD), and moderately reduced overlaps produced accuracy comparable to full overlaps; processing parameters strongly influenced point cloud density, processing time, and height estimation performance.
What are the implications of the main findings?
  • Reducing side overlap while maintaining high front overlap improves efficiency by lowering flight time and image count without sacrificing accuracy; point cloud-based estimation is especially beneficial for sparse or spiky canopy structures.
  • Flying at 60–90 m (1.0–1.5 cm GSD) with reduced side overlap and optimized processing settings provides a strong balance between accuracy and efficiency, enabling faster and more cost-effective phenotyping and precision-agriculture workflows.

Abstract

Point clouds and digital surface models (DSMs) derived from unmanned aircraft system (UAS) imagery are widely used for plant height estimation in plant phenotyping and precision agriculture. However, comprehensive evaluations across multiple crops, flight altitudes, and image overlaps are limited, restricting guidance for optimizing flight strategies. This study evaluated the effects of flight altitude, side and front overlap, and image processing parameters on point cloud generation and plant height estimation. UAS imagery was collected at four altitudes (30–120 m, corresponding to 0.5–2.0 cm ground sampling distance, GSD) with multiple side and front overlaps (67–94%) over a 2-ha field planted with corn, cotton, sorghum, and soybean on three dates across two growing seasons, producing 90 datasets. Orthomosaics, point clouds, and DSMs were generated using Pix4Dmapper, and plant height estimates were extracted from both DSMs and point clouds. Results showed that point clouds consistently outperformed DSMs across altitudes, overlaps, and crop types. Highest accuracy occurred at 60–90 m (1.0–1.5 cm GSD) with RMSE values of 0.06–0.10 m (R2 = 0.92–0.95) in 2019 and 0.07–0.08 m (R2 = 0.80–0.89) in 2022. Across multiple side and front overlap combinations at 60–120 m, reduced overlaps produced RMSE values comparable to full overlaps, indicating that optimized flight settings, particularly reduced side overlap with high front overlap, can shorten flight and processing time without compromising point cloud quality or height estimation accuracy. Pix4Dmapper processing parameters strongly affected 3D point cloud density (2–600 million points), processing time (1–16 h), and plant height accuracy (R2 = 0.67–0.95). These findings provide practical guidance for selecting UAS flight and processing parameters to achieve accurate, efficient 3D modeling and plant height estimation. By balancing flight altitude, image side and front overlap, and photogrammetric processing settings, users can improve operational efficiency while maintaining high-accuracy plant height measurements, supporting faster and more cost-effective phenotyping and precision agriculture applications.

1. Introduction

Unmanned aircraft systems (UAS) equipped with imaging technologies are essential tools for acquiring high-resolution data in precision agriculture. By capturing overlapping imagery, UAS can generate orthomosaics, 3D point clouds, and digital surface models (DSMs), providing detailed insights into crop structure and growth conditions [1,2,3]. These datasets enable accurate measurement of plant height, biomass, and yield potential, as well as detailed characterization of canopy structure, leaf geometry, and weed–crop differentiation [4,5,6,7]. Consequently, UAS-derived 3D reconstructions are increasingly applied in research and agricultural decision making [8]. Structure-from-motion (SfM), the photogrammetric method used to reconstruct 3D structure from overlapping images, is sensitive to camera specifications, flight parameters, and image processing settings [9,10,11,12,13].
Ground control points (GCPs) are essential in UAS–SfM workflows to ensure accurate scale, orientation, and georeferencing, particularly for drones without real-time kinematic (RTK) positioning. Non-RTK platforms typically deploy 4–30 well-distributed GCPs across small experimental fields to reduce positional errors and improve plant height estimation [14,15,16,17,18]. RTK-enabled drones provide accurate direct georeferencing and reduce reliance on dense GCP networks, although vertical accuracy can still vary with canopy structure and flight conditions [19,20,21]. Oblique or multi-axis imaging can help mitigate vertical biases [22,23], and even a few strategically placed GCPs (1–3) can further enhance 3D reconstruction quality in RTK workflows [24,25]. Given ongoing use of non-RTK platforms and variability in RTK performance, GCPs continue to play a key role in ensuring reliable positional accuracy and height-based crop metrics in plant phenotyping and precision agriculture.
Flight altitude, which is directly related to ground sampling distance (GSD) for a given camera, and image overlap strongly influence SfM-derived 3D reconstructions and plant height estimation. Lower altitudes (smaller GSDs) combined with higher side and front overlaps generally produce denser, more accurate point clouds, whereas higher altitudes reduce spatial resolution and vertical detail. Most research on these effects has focused on forestry and natural vegetation. For example, Seifert et al. [26] found that lower altitudes improved reconstruction detail and precision, with high image overlap most beneficial at low altitudes. Kameyama et al. [27] reported that lower altitudes produced more accurate tree height measurements, whereas higher side overlap only modestly improved accuracy. Grybas and Congalton [28] observed minimal altitude effects but noted improvements with higher forward overlap. Mao et al. [29] demonstrated that increasing altitude reduced the ability to capture vertical structure in desert shrubs, leading to underestimation of plant height.
Despite these findings in forestry and shrub systems, relatively few studies have investigated the effects of flight altitude and image overlap on agricultural crops. Adedeji et al. [30] measured cotton plant height at two altitudes and three camera angles with fixed side and front overlap, finding that lower altitudes and oblique viewing angles significantly improved UAS-derived height accuracy. Bazrafkan et al. [31] evaluated different combinations of flight altitude, speed, and image overlap over dry pea plots and reported consistent underestimation of crop height at higher altitudes. Overall, no consensus exists on standardized flight settings for crop phenotyping, and trade-offs between reconstruction accuracy and operational efficiency, including flight time, data volume, and processing requirements, are rarely quantified. Most existing studies focus on isolated parameters or single crop types, limiting the generalizability of their findings. There is a need for systematic, crop-focused empirical studies that evaluate multiple altitudes, overlaps, and crop species to identify robust and efficient UAS-SfM protocols for plant height estimation.
UAS-based plant height estimation has traditionally relied on DSMs derived from 3D point clouds because of their computational efficiency and compatibility with raster-based analyses. However, growing evidence indicates that DSM-based approaches may inadequately represent complex canopy structures, as they are inherently affected by interpolation and surface-smoothing effects. In contrast, raw point clouds preserve detailed vertical canopy information and have been directly used for plant height estimation in a few studies [32,33]. Comparative studies in grassland and row crops demonstrate that point cloud-based methods more effectively capture fine-scale height variability and canopy maxima, whereas DSM-based accuracy declines as raster resolution coarsens or canopy structure becomes discontinuous [34,35]. Despite these insights, the relative performance of point cloud- versus DSM-based approaches under varying flight configurations and across multiple crop species has not been systematically evaluated, leaving a knowledge gap for practical crop monitoring.
Another important yet understudied source of variability in UAS-based plant height estimation is the selection of SfM processing parameters. Both open-source and commercial photogrammetry platforms, such as Agisoft Metashape and Pix4Dmapper, follow similar workflows that include feature extraction, dense point cloud generation, and DSM or orthomosaic construction. However, their performance can vary considerably depending on crop architecture, canopy density, ground visibility, and user-defined settings. Previous studies have shown that adjustments to processing parameters can substantially affect point cloud density, DSM accuracy, and computational requirements, which for large UAS datasets may span several hours or even days [10,13,36]. Yet, the practical implications of these processing choices for crop height estimation accuracy and workflow efficiency remain insufficiently characterized in agricultural phenotyping contexts. As UAS data collection becomes more frequent and multi-temporal field campaigns generate increasingly large datasets, understanding how processing choices impact plant height estimation is critical for developing efficient, reproducible, and scalable workflows.
Given the gaps in understanding the effects of flight altitude, image overlap, and SfM processing parameters, particularly in multi-crop phenotyping contexts, this study evaluates their combined influence on UAS-based plant height estimation across multiple crop species and growing seasons. The work aims to establish empirical performance trends and operational trade-offs, offering insights to support optimized UAS-based crop monitoring. By analyzing point clouds and DSMs generated under diverse flight and processing conditions, the study provides practical guidance for optimizing flight planning and image processing to maintain point cloud quality and height estimation accuracy while minimizing both flight and computational time. The specific objectives of the study were to: (1) compare the performance of raw point clouds and interpolated DSMs for plant height estimation; (2) assess the effects of flight altitude and image overlap on plant height estimation accuracy and operational efficiency; and (3) evaluate how different Pix4Dmapper processing parameters influence processing time and the quality of 3D point clouds used for plant height estimation.

2. Materials and Methods

2.1. Layout of Experimental Plots

This study was conducted in a 2-ha experimental field (30°31′19.2″N, 96°24′0.7″W) at the Texas A&M University AgriLife Research Farm near College Station, Texas, in 2019 and 2022. In 2019, four crops (cotton, corn, grain sorghum, and soybean) were planted in the northwest portion of the field in 16 plots arranged in four replications (Figure 1a). Additionally, cotton with varying nitrogen rate treatments was planted in 24 plots in the southeast portion of the field. Each plot contained eight rows, 15 m in length, with a row spacing of 1.016 m. In 2022, the layout for the four-crop plots remained the same, but only 16 cotton plots with varying nitrogen rate treatments were planted in the southeast portion of the field (Figure 1b).
Previous UAS–SfM studies have used between 4 and 30 GCPs. To evaluate the effects of both GCP number and placement on positional accuracy, this study deployed 36 white square panels (0.305 m × 0.305 m) in 2019, including 12 at ground level, 12 at 0.75 m, and 12 at 1.5 m AGL (Figure 1a), and 25 panels in 2022, with 13 at ground level and 12 at 1.5 m AGL (Figure 1b). A total of 29 GCP configurations were evaluated in 2019 and 13 in 2022, including zero-GCP, single-level, and mixed-height arrangements. RMSE values in X, Y, Z, and total directions were calculated using independent checkpoints. Results showed that 4–5 ground-level GCPs were sufficient to stabilize RMSE at approximately 0.03 m, with similar trends for elevated panels. In practical applications, a few additional GCPs may be used for validation. When RTK-equipped UAS with centimeter-level positioning are used, the need for dense GCP networks is largely eliminated. However, a few GCPs can still be useful for independent accuracy assessment and for evaluating potential vertical biases under varying field and canopy conditions.

2.2. UAS Image Acquisition

A consumer-grade Nikon D7100 digital camera (6000 × 4000 pixels; Nikon Inc., Melville, NY, USA) was mounted on a rotary-wing AG-V6A hexacopter (Homeland Surveillance & Electronics, LLC, Casselberry, FL, USA) for image acquisition in both years. Images were collected along 13 parallel flight lines at four altitudes (30, 60, 90, and 120 m AGL) under calm, sunny conditions around solar noon on 8 August 2019, 23 June 2022, and 11 July 2022. Datasets from each UAS flight date were treated separately to account for date-specific differences in crop growth and environmental conditions. The D7100 camera has a sensor size of 23.5 mm × 15.6 mm and was equipped with a 24 mm focal-length lens. At 30 m AGL, each image covered a ground area of 29.4 m × 19.5 m with a GSD of 0.49 cm. The UAS followed the same 13 flight lines at a constant speed and captured one image per second at all altitudes. Flight line spacing (9.9 m in 2019 and 7.3 m in 2022) and airspeed (6.5 m s−1 in 2019 and 5.0 m s−1 in 2022) were selected to maintain comparable side and front image overlap at each altitude.
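As a consistency check (this relation is not stated explicitly in the article, but follows from standard photogrammetric geometry), the reported image footprint and GSD at 30 m can be reproduced from the sensor dimensions, focal length, and flight altitude:

W = \frac{s_w H}{f} = \frac{23.5\ \mathrm{mm} \times 30\ \mathrm{m}}{24\ \mathrm{mm}} \approx 29.4\ \mathrm{m}, \qquad L = \frac{s_h H}{f} = \frac{15.6\ \mathrm{mm} \times 30\ \mathrm{m}}{24\ \mathrm{mm}} \approx 19.5\ \mathrm{m}, \qquad \mathrm{GSD} = \frac{W}{N_w} = \frac{29.4\ \mathrm{m}}{6000\ \mathrm{pixels}} \approx 0.49\ \mathrm{cm},

where s_w and s_h are the sensor width and height, H is the altitude above ground, f is the focal length, and N_w is the image width in pixels. Doubling the altitude doubles the footprint and the GSD, which yields the 0.5–2.0 cm GSD range reported for the 30–120 m flights.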
In 2019, side and front overlaps were 67%, 83%, 89%, and 92% at 30, 60, 90, and 120 m, respectively. In 2022, the corresponding overlaps were 75%, 87%, 92%, and 94%. GSD increased from approximately 0.5 cm at 30 m to 2.0 cm at 120 m. Based on the predetermined flight altitudes and overlap targets, flight plans were generated using Mission Planner software (version 1.3.65) [37]. To ensure optimal image quality, camera settings were determined following the procedure described by Yang et al. [38]. ISO was set to 100 to minimize sensor noise. Exposure time was set to 1/2000 s (0.5 ms) in 2019 and 1/1000 s (1 ms) in 2022 to prevent motion blur. Aperture was set to f/5.6 in 2019 and f/6.3 in 2022, based on image histogram analysis to avoid under- or overexposure.
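The reported overlap percentages can likewise be illustrated with the standard relations (shown here only as a worked example, assuming the wide image dimension was oriented across track):

\mathrm{side\ overlap} = 1 - \frac{d}{W}, \qquad \mathrm{front\ overlap} = 1 - \frac{v\,\Delta t}{L},

where d is the flight line spacing, W and L are the across- and along-track footprints, v is the airspeed, and Δt = 1 s is the image interval. For the 2019 flights at 30 m (W ≈ 29.4 m, L ≈ 19.5 m, d = 9.9 m, v = 6.5 m s−1), both expressions give approximately 0.66–0.67, consistent with the reported 67%; at 60 m the footprints double, raising both overlaps to about 83%.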

2.3. Plant Height Measurement

Plant height was measured manually using a meter stick or tape measure at three selected plant canopies per plot within each of the 16 plots for the four crops on 8 August 2019. On 23 June and 11 July 2022, four canopy measurements per plot were collected. Measurements from corn plots in 2022 were excluded due to wild hog–induced damage that affected only these plots. The XYZ coordinates of each sampling location were recorded using a centimeter-grade Trimble R2 GNSS receiver with virtual reference station (VRS)-based RTK corrections (Trimble Inc., Westminster, CO, USA). GNSS data were transformed to the Universal Transverse Mercator (UTM), World Geodetic System 1984 (WGS 84), Zone 14 coordinate system, which was also used for all image processing and spatial analyses. The sample size per plot (3–4 measurements) was relatively small given the spatial variability of the plants, particularly in corn canopies. To mitigate this limitation, the plant height measured at each specific sampling location, rather than a plot-averaged value, was used, allowing a more accurate point-by-point comparison with the UAS-derived data.

2.4. Image Subsetting to Create Various Overlap Levels

The 30 m flight altitude was selected as the base altitude based on camera sensor size, lens focal length, achievable GSD, and regulatory limits on maximum flight height (120 m AGL). This altitude served as a reference for generating additional datasets at higher altitudes through systematic image thinning. Images collected at higher altitudes were thinned to produce overlap levels comparable to those at lower altitudes. For the 2019 flights, 60 m images with an original 83% overlap were thinned to match the 30 m overlap of 67%; 90 m images with 89% overlap were thinned to match the 30 m and 60 m overlaps of 67% and 83%; and 120 m images with 92% overlap were thinned to match the overlaps at 30 m, 60 m, and 90 m. The same strategy was applied to the 2022 datasets.
To generate datasets with reduced image overlap at higher altitudes, image subsets were created by systematically removing flight lines and individual images at uniform intervals. At 60 m, retaining every other flight line and every other image reduced the number of flight lines from 13 to 7 and the total number of images to approximately 25% of the original dataset, reducing overlap from 83% to 67% in 2019. This approach produced four overlap combinations with side/front overlaps of 67%/67%, 67%/83%, 83%/67%, and 83%/83%. At 90 m, selecting images from every second or third flight line and every second or third image reduced the dataset by factors of 1/4 or 1/9, decreasing overlap from 89% to 83% or 67%, and resulting in nine overlap combinations. Similarly, at 120 m, datasets were reduced by factors of 1/4, 1/9, or 1/16, lowering overlap from 92% to 89%, 83%, or 67%, yielding 16 overlap combinations. In 2022, the same subsetting strategy was applied, with the only difference being that the original full overlaps were 75%, 87%, 92%, and 94% at the four altitudes. In total, 30 datasets (1 + 4 + 9 + 16) were generated for each flight date, resulting in 90 datasets across the three flight dates.
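A minimal Python sketch of this subsetting logic is shown below. It is illustrative only: the grouping of images by flight line, the placeholder file names, and the image counts are assumptions based on the description above, not the actual scripts used in the study.

def thin_dataset(flight_lines, line_stride, image_stride):
    """Keep every line_stride-th flight line (reduces side overlap) and every
    image_stride-th image within each retained line (reduces front overlap)."""
    subset = []
    for line in flight_lines[::line_stride]:
        subset.extend(line[::image_stride])
    return subset

# Placeholder naming: 13 flight lines with 40 images each (counts illustrative only).
flight_lines = [[f"line{l:02d}_img{i:03d}.jpg" for i in range(40)] for l in range(13)]

# Example: at 60 m in 2019, a stride of 2 in both directions keeps 7 of the 13 lines
# and roughly 25% of the images, reducing side/front overlap from 83%/83% to 67%/67%.
subset_67_67 = thin_dataset(flight_lines, line_stride=2, image_stride=2)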
This altitude-dependent image thinning strategy enabled a systematic evaluation of the combined effects of altitude and image overlap while significantly reducing the number of required flights and overall field time. By shortening total flight duration, it also minimized the influence of temporal variations in weather and lighting conditions, such as cloud cover and sun angle, thereby improving data quality and mission efficiency.

2.5. Image Processing for Flight Height and Overlap Combinations

Pix4Dmapper (Pix4D S.A., Prilly, Switzerland) was used for image mosaicking and 3D reconstruction in this study. The workflow consisted of three main steps: (1) initial processing, (2) point cloud and mesh creation, and (3) DSM, orthomosaic, and index generation. Each project began with the creation of a structured project file that included the input images, processing settings, GCPs, and output directories. During initial processing, Pix4Dmapper generated a sparse point cloud along with a preliminary DSM and orthomosaic. At this stage, the reconstruction was internally consistent but not yet accurately aligned with real-world coordinates unless an RTK-equipped drone was used. GCPs were therefore incorporated after initial processing to refine absolute georeferencing. In this study, all available GCPs were used to minimize variability in georeferencing accuracy, even though the GCP configuration analysis described earlier indicated that 6–9 well-distributed GCPs would be sufficient.
For each of the 90 datasets representing all flight height and overlap combinations across the three flight dates in 2019 and 2022, default Pix4Dmapper settings were applied in Steps 1 and 2. However, in Step 3, DSMs were generated using both the default inverse distance weighting (IDW) method and the triangulation interpolation method to evaluate their relative performance for plant height estimation. The final outputs for each fully processed project included the dense 3D point cloud in LAS format and the orthomosaic, DSM, and digital terrain model (DTM) in TIFF format. All datasets were processed on a Dell Precision 7540 computer equipped with an Intel® Core™ i9-9980HK CPU (2.40 GHz, 8 cores, 16 logical processors) and 32 GB RAM (Dell Inc., Round Rock, TX, USA).

2.6. Creation of Point Clouds and DSMs Using Various Processing Parameters

For each of the three steps in Pix4Dmapper, various options are available to allow the user to define appropriate processing parameters. In Step 1, the extraction of keypoints from the images is essential. Keypoints are points or pixels with easily recognizable contrast and texture, and they can be extracted using one of five keypoint image scales: 1, 2, 1/2, 1/4, and 1/8. A scale of 1 represents the original image size (default), while a scale of 2 doubles the image size in both directions. The three smaller scales can be selected to speed up processing by using only portions of the full image size. A larger image scale typically extracts more keypoints but requires longer processing time.
Step 2 involves point cloud densification, and three parameters can be selected: image scale, point density, and the minimum number of matches. As in Step 1, the image scale defines the scale at which additional 3D points are computed, with four options available: 1, 1/2 (default), 1/4, and 1/8. Point density defines the density of the densified point cloud, with options of High (slow), Optimal (default), and Low (fast). The minimum number of matches represents the minimum number of valid reprojections of each 3D point onto the images. This value can range from 2 to 6, with 3 being the default. Increasing the number of matches reduces noise and improves the quality of the point cloud but may result in fewer 3D points in the final output.
Step 3 focuses on the generation of the DSM, DTM, orthomosaic, and spectral indices. Various parameters can be adjusted, including the spatial resolution for output products, DSM filters for smoothing point clouds, and interpolation methods for creating the DSM raster. For DSM filtering, three surface smoothing types are available: Sharp (default), which preserves the orientation of the surface and maintains sharp features; Smooth, which softens areas with sharp features into planar regions; and Medium, which provides an intermediate effect between the two other options. Of the two interpolation methods for DSM generation, although IDW is the default option, triangulation was selected for this analysis because preliminary results indicated that IDW tended to underestimate plant height.
With the numerous processing options available in Pix4Dmapper, hundreds of possible combinations exist. For Steps 1 and 2 alone, there are 300 possible combinations (5 × 4 × 3 × 5). To enable meaningful comparison among processing configurations while maintaining computational feasibility, only 20 representative parameter combinations were selected. This reduction was necessary because some combinations required several hours, up to a full day, to process even on relatively high-performance computers.
Parameter variations were applied in Steps 1 and 2, while all Step 3 settings were kept at their defaults except that triangulation was used as the interpolation method. Specifically, each selected dataset was first processed using all five keypoint image scales in Step 1 with the default settings in Step 2, resulting in five projects. Next, the dataset was processed by individually varying the following nine Step 2 parameter settings while keeping the default keypoint scale in Step 1: three densification image scales (1, 1/4, 1/8), two point density levels (low and high), and four minimum numbers of matches (2, 4, 5, 6). Additionally, six other parameter combinations were processed, involving simultaneous variations in keypoint image scale in Step 1 and densification image scale and point density in Step 2.
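The sketch below enumerates the first 14 of these configurations as (keypoint image scale, densification image scale, point density, minimum matches) tuples, assuming the Pix4Dmapper defaults of 1, 1/2, Optimal, and 3 described above; the six combined-variation configurations are not fully specified in the text and are therefore omitted from the sketch.

DEFAULT = ("1", "1/2", "Optimal", 3)  # Step 1 keypoint scale, Step 2 scale, point density, min matches

configs = []
# Five projects: all Step 1 keypoint image scales with default Step 2 settings
for kp_scale in ["2", "1", "1/2", "1/4", "1/8"]:
    configs.append((kp_scale,) + DEFAULT[1:])
# Nine projects: vary one Step 2 parameter at a time with the default keypoint scale
for dens_scale in ["1", "1/4", "1/8"]:
    configs.append((DEFAULT[0], dens_scale, DEFAULT[2], DEFAULT[3]))
for density in ["Low", "High"]:
    configs.append((DEFAULT[0], DEFAULT[1], density, DEFAULT[3]))
for matches in [2, 4, 5, 6]:
    configs.append((DEFAULT[0], DEFAULT[1], DEFAULT[2], matches))

assert len(configs) == 14  # plus the six combined-variation projects gives 20 per dataset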
To evaluate the effects of processing parameters on point cloud density, processing time, and plant height estimation, datasets captured at 60 m with full side and front overlaps on each flight date were processed. Additionally, the full-overlap datasets captured at 30 m and 90 m in July 2022 were processed using the first 10 parameter combinations to verify that trends observed at 60 m were representative. In total, 80 Pix4Dmapper projects were processed across the three dates. Not all datasets from the four flight altitudes and flight dates were analyzed due to time and computational resource constraints. For this analysis, a more powerful Dell Precision 7920 workstation (Dell Inc., Round Rock, TX, USA), equipped with an Intel® Xeon® Gold 6140 CPU (2.30 GHz, 18 cores, 36 threads) and 128 GB of RAM, was used.

2.7. Extraction of Plant Height from Point Clouds and DSMs

To estimate plant height, circles with a radius of 15 cm, centered at each of the 48 sampling points in 2019 and 2022, were used to extract all 3D points from each point cloud and all the pixels from each DSM that fell within the circles. The 15 cm radius was determined based on the approximate area over which height measurements were taken at each sampling point. Canopy height estimates were obtained by subtracting the ground elevation values, measured by GNSS at the sampling points, from the extracted elevation values in the point clouds and DSMs. The 95th through 100th percentiles of the estimated height values within each sampling circle were calculated, with the 100th percentile corresponding to the maximum canopy height within each circle.
To implement this workflow, the following step-by-step approach was applied using Python (version 3.13) with several libraries for handling geospatial vector and raster data (geopandas, rasterio, laspy, and shapely): (1) create a point GeoPackage (GPKG) from the XYZ coordinates, plant height, and other attributes associated with the 48 sampling sites; (2) convert the point GPKG to a polygon GPKG by creating a buffer around each sampling point with a radius of 15 cm; (3) extract the points within each circle from the point cloud, subtract the ground elevation for the sampling site, and compute the 95th–100th percentiles of the point cloud-based height values; and (4) extract the pixel values within each circle from the DSM image, subtract the ground elevation, and calculate the same six percentiles for the DSM-based height values.
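A condensed Python sketch of this four-step workflow is given below. It is illustrative only: the file names, column names, and the EPSG code assumed for UTM Zone 14N (WGS 84) are hypothetical, and the actual scripts used in the study may differ.

import numpy as np
import pandas as pd
import geopandas as gpd
import laspy
import rasterio
from rasterio.mask import mask as raster_mask
from shapely.geometry import Point

PERCENTILES = [95, 96, 97, 98, 99, 100]
RADIUS = 0.15  # m, matching the 15 cm sampling circles

# (1) Point GPKG from the GNSS coordinates and field measurements (file and columns assumed).
field = pd.read_csv("gnss_samples.csv")  # assumed columns: x, y, ground_z, measured_height
points = gpd.GeoDataFrame(field, geometry=[Point(xy) for xy in zip(field.x, field.y)], crs="EPSG:32614")
points.to_file("sampling_points.gpkg", driver="GPKG")

# (2) Polygon GPKG of 15 cm buffers around each sampling point.
circles = points.copy()
circles["geometry"] = points.geometry.buffer(RADIUS)
circles.to_file("sampling_circles.gpkg", driver="GPKG")

# (3) Point cloud heights: 3D points inside each circle minus the GNSS ground elevation.
las = laspy.read("dense_point_cloud.las")
x, y, z = np.asarray(las.x), np.asarray(las.y), np.asarray(las.z)
pc_percentiles = []
for _, row in circles.iterrows():
    inside = (x - row.x) ** 2 + (y - row.y) ** 2 <= RADIUS ** 2
    if inside.any():
        heights = z[inside] - row.ground_z
        pc_percentiles.append(np.percentile(heights, PERCENTILES))

# (4) DSM heights: pixels inside each circle minus the ground elevation, same percentiles.
dsm_percentiles = []
with rasterio.open("dsm.tif") as dsm:
    for _, row in circles.iterrows():
        clipped, _ = raster_mask(dsm, [row.geometry], crop=True, filled=False)
        heights = clipped.compressed() - row.ground_z
        dsm_percentiles.append(np.percentile(heights, PERCENTILES))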

2.8. Accuracy Assessment

Linear regression was performed to determine the regression equations and coefficients of determination (R2) between measured plant height and the percentiles derived from the point clouds and DSMs. The analysis was conducted for the 90 projects representing the flight height and overlap combinations across the three dates, and for the 60 projects representing the Pix4Dmapper parameter combinations. In addition, the RMSE was calculated using the following formula:
\mathrm{RMSE} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^{2}}
where y_i is the measured plant height for sample i, \hat{y}_i is the estimated plant height for sample i from the DSM or point cloud, and n is the number of samples. Python scripts were used to extract plant height estimates from the point clouds and DSMs, perform regression analyses, and calculate RMSE between measured and estimated plant height values. These results were used to examine the effects of flight and image processing parameters on point cloud density, flight time, and plant height estimation.
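The sketch below shows one way the RMSE and R2 reported in the Results could be computed; the height arrays are illustrative values only, not data from the study.

import numpy as np
from scipy import stats

def rmse(measured, estimated):
    """Root mean square error between measured and estimated plant heights (equation above)."""
    measured = np.asarray(measured, dtype=float)
    estimated = np.asarray(estimated, dtype=float)
    return float(np.sqrt(np.mean((measured - estimated) ** 2)))

# Illustrative values only: field-measured heights and 99th-percentile point cloud estimates (m).
measured = np.array([2.41, 1.02, 0.85, 0.67, 1.10, 2.05])
estimated = np.array([2.28, 0.98, 0.80, 0.66, 1.04, 1.93])

slope, intercept, r, p, stderr = stats.linregress(estimated, measured)
print(f"R2 = {r ** 2:.2f}, RMSE = {rmse(measured, estimated):.3f} m")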

3. Results

3.1. Percentile-Based Plant Height Estimation Using Three Methods

Table 1 summarizes the RMSE and R2 values between measured plant height and estimates derived from three methods (DSM-IDW, DSM-TRI, and point clouds) using the 98th, 99th, and 100th percentiles of plant height values. The four-crop datasets (corn, cotton, sorghum, soybean) and the three-crop datasets (excluding corn) were analyzed separately to assess the influence of canopy structure on performance. Percentiles below 98% were not shown in the table because they consistently produced lower accuracy compared with the three higher percentiles. Among the three percentiles, both the 99th and 100th performed well, but the 99th percentile was preferred because it closely matched actual plant height while being less sensitive to outliers, whereas the 100th percentile was more susceptible to extreme values. Across all four altitudes, point cloud–based estimates produced the lowest RMSE and highest R2. This was particularly evident in the four-crop datasets, where the tall, senescing corn with spiky tasseling and sparse, irregular structures introduced substantial variability in DSM surfaces. In contrast, the three shorter crops formed more uniform canopies, which explains the consistently better accuracy observed in the three-crop datasets. This structural difference is also illustrated in Figure 2, showing why DSMs tend to smooth tall, irregular corn features while point clouds preserve canopy peaks more effectively.
At the lowest altitude (30 m, 0.5 cm GSD), point cloud RMSE values for the four-crop dataset ranged from 0.302 to 0.329 m (R2 = 0.51–0.52), outperforming DSM-IDW (0.423–0.442 m) and DSM-TRI (0.360–0.426 m). When corn was removed, accuracy improved substantially, with point cloud RMSE values decreasing to 0.199–0.222 m. Nevertheless, accuracy at 30 m was relatively poor, largely due to the low image overlap of 67%. The best overall performance occurred at 60 m altitude (1.0 cm GSD) for both datasets: point cloud RMSE reached 0.090 m (R2 = 0.94) for the four-crop dataset and as low as 0.064 m (R2 = 0.95) for the three-crop dataset. DSM-TRI generally outperformed DSM-IDW, but both remained less accurate than point clouds. At 90 m (1.5 cm GSD), point cloud estimates remained highly accurate (RMSE = 0.091–0.116 m), while DSM-based results again degraded, especially for the four-crop dataset. At 120 m (2.0 cm GSD), accuracy declined slightly, but point clouds still achieved RMSE values around 0.140–0.175 m for the four crops and 0.085–0.089 m for the three crops, substantially better than DSMs at the same altitude. Overall, the results from Table 1 demonstrate three consistent patterns: (1) Point clouds outperform DSMs at all altitudes and percentiles, especially when tall corn is present; (2) The three-crop datasets show higher accuracy than the four-crop datasets due to more uniform canopy structure; and (3) The 99th percentile is the most reliable height metric, offering accuracy close to the maximum value while minimizing outlier effects.
Figure 2 shows the scatterplots and regression lines between ground-measured plant height and the corresponding estimates derived from the two DSM models and point clouds at 60 m (1.0 cm GSD) and 90 m (1.5 cm GSD) for the four crops in 2019. As expected from the low RMSE and high R2 values for the point cloud–based method, strong linear relationships were observed between estimated and measured plant heights for all crops at both altitudes. In contrast, both DSM-based methods substantially underestimated the height of many corn canopies but performed well for the other three crops. This underestimation is likely due to the sparse, spiky architecture of the corn canopy and the inherent smoothing applied during DSM generation, which tends to suppress canopy peaks and blend high tassel points with lower values in the gaps. In this study, wind conditions were generally calm and stable across all flights, although minor variations within each flight mission or between different flight dates may have occurred. Tassel movement under even light winds can introduce photogrammetric noise for sparse and spiky corn canopies.
DSM-IDW performed even worse than DSM-TRI at higher altitudes because distance-weighted averaging further attenuates localized canopy maxima as point density decreases. At lower altitudes, the smaller pixel size and denser point cloud allow canopy peaks to be more reliably captured, whereas at higher altitudes, these peak points are more frequently missed and DSM smoothing effects become more pronounced. In contrast, point clouds preserve the original 3D measurements without smoothing or interpolation, enabling them to retain fine-scale canopy structure and capture true local maxima even in sparse, heterogeneous corn canopies. Overall, these results indicate that DSM-based height estimation is less suitable for corn, particularly in later growth stages. For the three crops with more uniform canopies, all three methods produced similar results, as reflected in the RMSE and R2 values in Table 1. Nevertheless, point clouds consistently achieved the highest accuracy across all altitudes and crops.
Table 2 presents the RMSE and R2 values for plant height estimation from DSM-IDW, DSM-TRI, and point clouds at four altitudes for June and July 2022. Only the three crops (cotton, sorghum, and soybean) were analyzed, as corn was excluded due to hog damage. As in 2019, percentiles below 98% were not considered because they consistently produced lower accuracy; among the upper percentiles, both the 99th and 100th performed well, with the 99th percentile preferred because it is less sensitive to outliers. Across both months and all altitudes, point clouds consistently produced the lowest RMSE and highest R2, reflecting the overall pattern observed in 2019. However, the performance gap between DSMs and point clouds was noticeably smaller in 2022. This improvement is partly due to the absence of tall, structurally complex corn canopies, which previously caused smoothing-related underestimation in the DSMs. With only shorter and more uniform crops, DSM-TRI often performed competitively and, at some altitudes, approached point cloud accuracy.
Performance at 30 m (0.5 cm GSD) improved substantially compared to 2019. In 2019, accuracy at 30 m was limited by the low image overlap of 67%. In 2022, the side and front overlap at 30 m was increased to 75%, and the reduction in RMSE reflects this improvement. In June and July 2022, point clouds achieved RMSE values of 0.078–0.094 m (R2 = 0.74–0.85), outperforming both DSM-IDW and DSM-TRI in all cases, except at the 99th percentile, where DSM-TRI produced the same lowest RMSE value (0.078 m) as the point cloud method at the 98th percentile. At 60 m (1.0 cm GSD), all methods performed well, with point clouds providing the highest accuracy (RMSE = 0.074–0.085 m; R2 = 0.85–0.89) and DSM-TRI outperforming DSM-IDW. At 90 m (1.5 cm GSD), point clouds retained their advantage over the DSM-based methods, achieving RMSE values of 0.084–0.096 m, while DSM-TRI again outperformed DSM-IDW. At both 60 m and 90 m, point clouds exhibited slightly higher RMSE values at the 100th percentile in July, reflecting the greater susceptibility of this percentile to outliers. At 120 m (2.0 cm GSD), accuracy declined for all methods, but point clouds still achieved the lowest RMSE values (0.131–0.158 m), consistent with the trend observed in 2019. It should be noted that although R2 is commonly used to assess model performance, it did not always correspond with RMSE in this analysis. Several estimates exhibited relatively high R2 values despite having larger RMSE values, as reported in Table 1 and Table 2, indicating that R2 alone is not a reliable metric for evaluating plant height accuracy. In contrast, RMSE provides a more robust measure because it directly reflects the magnitude of deviations from actual plant heights.
Figure 3 shows the scatterplots and regression lines between ground-measured plant height and the corresponding estimates derived from the two DSM models and the point cloud at 60 m (1.0 cm GSD) for the three crops in June and July 2022. Since corn was not included in these datasets, the strong underestimation observed in Figure 2 for sparse, spiky corn canopies was not evident. At first glance, all three methods appeared to perform similarly; however, the point cloud–based method consistently produced the lowest RMSE values, followed by DSM-TRI, while DSM-IDW exhibited the largest errors. As seen in both Figure 2 and Figure 3, all methods tended to underestimate plant height, particularly for taller plants. This underestimation is primarily caused by occlusion of lower canopy layers, reduced visibility of true canopy peaks, and photogrammetric limitations that resulted in missed or smoothed high points during DSM generation. Consequently, even when regression fits indicate high R2 values, the estimated heights fall below the true values, leading to larger RMSE values.

3.2. Plant Height Estimation Under Different Flight Altitude and Overlap Combinations

Table 3 presents the point cloud density, processing time, and plant height estimation accuracy across all 30 combinations of side and front overlaps for four flight altitudes in August 2019. Overall, the results demonstrated that flight altitude and image overlap substantially influenced point cloud reconstruction density, processing time, and plant height estimation accuracy. Moreover, canopy structure also plays an important role, as seen in the differences between the four-crop and three-crop datasets. At 30 m (0.5 cm GSD), the low-overlap configuration (67%/67%) resulted in relatively high RMSE and low R2, as noted previously. This highlights that insufficient overlap at low altitude can compromise point cloud quality. Nevertheless, this configuration generated over 77.4 million points, the highest among all, because the images had the smallest pixel size, allowing more points to be created. Importantly, high point density does not necessarily correspond to high plant height estimation accuracy. The processing time for this configuration was 5 h and 27 min.
At 60 m (1.0 cm GSD), plant height estimation accuracy improved notably across all overlap combinations, except for the 67%/67% configuration, which performed substantially worse than the same configuration at 30 m. The 83%/83% configuration provided the best accuracy at this altitude, with RMSE of 0.089 m for four crops and 0.063 m for three crops, and R2 of 0.94 and 0.95, respectively. When side overlap was reduced from 83% to 67% while maintaining front overlap at 83%, the RMSE increased to 0.116 m for four crops and 0.069 m for three crops, while R2 decreased to 0.88 and 0.93, respectively. Correspondingly, point cloud density decreased from 55.8 million points to 36.3 million points, and processing time was reduced significantly from 6 h 2 min to 2 h 11 min. In contrast, the 83%/67% configuration showed slightly lower performance, with a similar reduction in point density and processing time, indicating that reducing front overlap has a somewhat larger impact on accuracy than reducing side overlap. These results suggest that, at this altitude, moderate reductions in side overlap do not substantially compromise accuracy, highlighting practical implications for UAS operations: flight time can be shortened without significant loss of plant height estimation accuracy. Reducing front overlap also decreases the number of images collected and lowers processing time, but the resulting reduction in estimation accuracy may outweigh the time savings.
At 90 m (1.5 cm GSD), plant height estimation accuracy varied across the nine side–front overlap combinations, with the lowest-overlap configuration (67%/67%) performing the worst. This setting produced the highest RMSE values (0.534 m for four crops; 0.392 m for three crops) and the lowest R2 values (0.22 and 0.48), confirming that low overlap at this altitude substantially degrades height estimation quality. Increasing front overlap while keeping side overlap relatively low yielded strong improvements. For example, the 67%/89% configuration reduced RMSE to 0.142 m (four crops) and 0.072 m (three crops), while R2 increased to 0.84 and 0.93. The 83%/89% configuration performed similarly well, achieving RMSE values of 0.101 m and 0.077 m with R2 = 0.92 for both datasets. Although the highest-overlap configuration (89%/89%) produced the best overall accuracy (0.103 m and 0.062 m RMSE; R2 = 0.92 and 0.95), the improvement over the 67%/89% and 83%/89% configurations was modest, particularly for the three-crop dataset, indicating that full high-overlap imaging is not strictly necessary for reliable height estimation at 90 m. Reducing front overlap also yielded competitive results. Configurations such as 89%/67% and 89%/83% produced RMSE values of 0.112–0.128 m for the four crops and 0.071–0.078 m for the three crops, with corresponding R2 values of 0.89–0.92 and 0.92–0.94, respectively, only slightly lower than those obtained from the highest-overlap setting.
Meanwhile, point cloud density increased consistently with higher overlaps, from 12.5 million points at 67%/67% to 43.9 million points at 89%/89%, accompanied by a corresponding increase in processing time (from 29 min to nearly 6 h). In contrast, reduced side overlap configurations, such as 67%/89% and 83%/89%, offered a practical balance: they delivered substantial accuracy improvements compared with low-overlap imaging while keeping point cloud sizes (22.6–26.8 million points) and processing times (1.5–2.0 h) far lower than those of the highest-overlap option. Overall, at 90 m, reducing side overlap while maintaining high front overlap provides the best balance of efficiency and accuracy, decreasing both flight duration and processing demands. Reducing front overlap can also be effective for lowering computational costs, though it does not reduce UAS flight time because the number of flight lines remains unchanged.
At 120 m (2.0 cm GSD), plant height estimation accuracy varied widely across the 16 side/front overlap combinations, with the poorest performance occurring in the 83%/67% configuration. This setting performed even worse than the lowest-overlap configuration (67%/67%) due to several extremely low height estimates, indicating that low front overlap is particularly detrimental to reconstruction quality at this altitude. Increasing front overlap from 67% to 92% while keeping side overlap at 67% substantially improved accuracy, reducing RMSE from 0.439 m to 0.141 m for the four crops and from 0.313 m to 0.091 m for the three crops; R2 values increased to 0.84 and 0.90, respectively. The highest-overlap configuration (92%/92%) yielded RMSE values of 0.157 m (four crops) and 0.085 m (three crops), slightly worse than the 67%/92% setting for the four crops and only marginally better for the three crops.
Other high front overlap combinations, such as 89%/92%, showed similarly strong performance (RMSE = 0.159 m and 0.093 m; R2 = 0.83 and 0.92). These results demonstrate that full high overlap in both directions is not necessary for reliable height estimation at 120 m. Configurations with reduced front overlap lowered the number of images and processing time, though without reducing UAS flight time. These combinations produced moderate accuracy that may still be acceptable in applications where computational efficiency is prioritized. Point cloud density increased with overlap, ranging from 7.8 million points (67%/67%) to 35.3 million points (92%/92%), with processing times increasing from 17 min to more than 7 h. The most efficient yet accurate strategy at 120 m is to maintain high front overlap while reducing side overlap (e.g., 89%/92%), providing near-maximal accuracy while substantially decreasing flight duration, point cloud size, and processing time.
Table 4 and Table 5 summarize point cloud density, processing time, and plant height estimation accuracy across all 30 combinations of side and front overlaps for four flight altitudes in June and July 2022. Overall, the results showed consistently moderate to high accuracy across both months. Most RMSE values were below 0.10 m, with the lowest being 0.066 m in June and 0.075 m in July, although slightly higher values were observed at 120 m in June. Similarly, most R2 values exceeded 0.8, with the highest reaching 0.89 in June and 0.87 in July, indicating strong agreement with ground measurements. Accuracy was generally high and consistent at 60–90 m (1.0–1.5 cm GSD), reflecting a favorable balance between flight altitude, image overlap, and reliable point cloud reconstruction, while also supporting operational efficiency.
Figure 4 illustrates scatterplots and regression lines comparing measured plant height with 99th-percentile estimates derived from point clouds under selected reduced side/front overlap configurations at 60 m, 90 m, and 120 m (1.0 cm, 1.5 cm, and 2.0 cm GSD) for the three crops across three dates (August 2019, June 2022, and July 2022). For 2019, the selected side/front overlap combinations were 67%/83% at 60 m, 67%/89% at 90 m, and 89%/92% at 120 m. For 2022, the chosen configurations were 75%/87% at 60 m, 75%/92% at 90 m, and 92%/94% at 120 m for both June and July. These overlap settings were identified from Table 3, Table 4 and Table 5, providing accuracy comparable to, or better than, full side and front overlap configurations at 60–90 m altitudes while substantially reducing the number of images collected. For example, at 60 m (1.0 cm GSD), using every other flight line approximately halved both the number of images and flight time, whereas at 90 m (1.5 cm GSD), using every third flight line reduced image acquisition and flight time by nearly two-thirds. Despite these reductions, the scatterplots and regression results show that plant height estimation accuracy remained high across all altitudes and dates, as reflected by consistently low RMSE values and high R2 values.
The selected overlap configurations offer clear operational advantages. Reducing side overlap is particularly beneficial because it directly shortens flight time, a critical consideration for field campaigns constrained by adverse weather conditions or limited daylight. Although reducing front overlap can also decrease the number of images collected, it has a smaller effect on operational efficiency and data quality, as images acquired along the flight direction provide flexibility for subset selection without compromising the ability to capture the canopy under optimal conditions. As demonstrated in Table 3, Table 4 and Table 5, moderate reductions in side overlap can be effectively offset by maintaining relatively high front overlap, thereby preserving sufficient image redundancy for accurate 3D reconstruction and reliable plant height estimation.
UAS-based plant height estimation can be made more efficient without sacrificing accuracy by strategically adjusting side and front overlaps. Across the nine subfigures, the point cloud–based method consistently exhibited strong linear relationships with field-measured heights, low RMSE values, and high R2 values, even under reduced side overlap conditions. These results confirm that maintaining high front overlap while reducing side overlap is an effective strategy for field-scale surveys, as it preserves sufficient 3D point density and canopy representation, while substantially reducing flight duration, the number of images collected, and the subsequent image processing time. This strategy significantly enhances operational efficiency, particularly for large fields or time-sensitive campaigns, without compromising plant height estimation accuracy.

3.3. Effects of Processing Parameters in Pix4Dmapper on Plant Height Estimation

Table 6 summarizes the number of 3D points generated, processing time, and RMSE and R2 values between measured plant height and the 99th-percentile estimates derived from the point clouds for 20 different combinations of processing parameters in Pix4Dmapper for the images captured at 60 m (1.0 cm GSD) in August 2019. When the keypoint image scale decreased from 2 to 1/8 (with 1 as the default), the total number of 3D points remained essentially constant at approximately 50 million (50 M). This stability occurred because a lower number of keypoints extracted in Step 1 is compensated for by densification in Step 2. However, processing time decreased substantially from 2 h 48 min at scale 2 to 1 h 4 min at scale 1/8. R2 values remained generally consistent across keypoint scales (0.91–0.95), with the corresponding RMSE decreasing from 0.109 m to 0.081 m; the smaller scales (1/4 and 1/8) achieved similar or slightly better accuracy than the larger scales while reducing processing time.
Reducing the densification image scale from 1 to 1/8, while keeping other parameters at default, drastically reduced the number of 3D points from 177.7 M to 2.9 M and shortened processing time from 5 h 14 min to 1 h 47 min. However, R2 dropped from 0.93 to 0.76 with RMSE increasing from 0.094 m to 0.345 m, indicating that extremely low densification scales can compromise accuracy. Adjusting point density from Low to High (with default Optimal) increased 3D points from 14.0 M to 175.7 M and processing time from 1 h 56 min to 4 h 13 min, improving R2 from 0.88 to 0.94 and reducing RMSE from 0.138 m to 0.093 m. On the other hand, increasing the minimum number of matches from 2 to 6 decreased the number of 3D points from 67.2 M to 30.9 M, with only a slight reduction in processing time. Compared to the default match value of 3, values of 2, 4, and 5 resulted in slightly lower R2 values (0.93–0.90) and higher RMSE (0.096–0.129 m), while a match value of 6 caused a substantial drop in R2 to 0.67 and an increase in RMSE to 0.230 m.
The remaining six parameter combinations with the default minimum match number of 3 produced between 13.9 M and 598.8 M 3D points, with processing times ranging from 2 h 5 min to 16 h 4 min, and R2 values from 0.82 to 0.95, with corresponding RMSE decreasing from 0.128 m to 0.081 m. The combination (2, 1, High, 3) generated the most points and required the longest processing time, yet achieved a relatively low R2 of 0.82 and a high RMSE of 0.128 m. In contrast, the combination (2, 1/2, High, 3) produced 175.4 M points in 4 h 3 min and reached a high R2 of 0.95 and a low RMSE of 0.081 m, matching the performance of (1/4, 1/2, Optimal, 3) and (1/8, 1/2, Optimal, 3). Several other combinations also achieved high R2 values with lower RMSE. These results demonstrate that processing parameters strongly affect both computational efficiency and estimation accuracy, and that appropriate parameter selection enables high-accuracy plant height estimation within a reasonable processing time.
Table 7 presents the results based on the point clouds for 20 different processing parameter combinations in Pix4Dmapper using the 60 m (1.0 cm GSD) datasets from June and July 2022. Overall trends in the number of 3D points generated and processing times for both dates were consistent with those observed in 2019. The R2 and RMSE values generally followed similar patterns to those in 2019, but some minor discrepancies were observed between June and July 2022 and between the two years due to seasonal and dataset-specific differences. Although the 1/8 scale can yield high R2 values with low RMSE, as observed in 2019, the coarse point cloud generated in Step 1 at this scale is not ideal for initial visual evaluation. Therefore, this scale should be used primarily when processing time is a limiting factor. In contrast, the 1/4 scale performed consistently across both years, and the parameter combination (1/4, 1/2, Optimal, 3) represents a robust choice. Additionally, using a densification scale of 1/2 with High point density and the default keypoint and match settings (1, 1/2, High, 3) produced high R2 values (0.86 in both June and July) and low RMSE values (0.081 m in June and 0.075 m in July), while generating very high-density point clouds. The remaining 10 parameter combinations evaluated in 2022 did not produce R2 or RMSE values superior to those of the best-performing combinations from the first 10.
Table 8 presents the results for the first 10 parameter combinations, evaluated using the datasets captured at 30 m (0.5 cm GSD) and 90 m (1.5 cm GSD) in July 2022 to assess the potential effect of flight altitude and the corresponding image GSD on point cloud generation and plant height accuracy. Overall trends in the number of 3D points generated, processing times, and accuracy values across both flight altitudes were consistent with those observed at 60 m (1.0 cm GSD). More points were generated at 30 m (0.5 cm GSD) than at the higher altitudes, with approximately 55 M at 30 m, 34 M at 60 m, and 29 M at 90 m. Although larger keypoint image scales usually increase processing time, for images captured at 90 m, scale 2 was slightly faster than scales 1 and 1/2, likely due to image content and internal resizing affecting the matching process. While smaller GSDs at lower altitudes allow more 3D points to be identified, the increased point density had minimal effect on plant height estimation accuracy. These results further confirm that the effect of processing parameters on point cloud quality is consistent, although minor dataset-specific differences exist.

4. Discussion

4.1. Performance Differences Between DSM- and Point Cloud-Based Plant Height Estimation

Results from three dates over two years (Table 1 and Table 2) demonstrate that point clouds consistently outperformed the two DSM-based methods for plant height estimation. In 2019, tall, tasseling corn canopies exhibited sparse, vertically complex, and spiky structures, which reduced DSM accuracy due to interpolation and surface-smoothing effects, an issue commonly reported for row crops with discontinuous upper canopies [4,17,33]. In contrast, point clouds preserved localized canopy peaks and fine-scale height variability, resulting in lower RMSE and higher R2 values. The three shorter crops displayed more uniform and continuous canopy surfaces, leading to improved DSM performance and overall higher estimation accuracy, consistent with findings that canopy homogeneity strongly influences DSM reliability [34,35].
Across all datasets, the 99th percentile consistently produced the most reliable plant height estimates by capturing true canopy maxima while minimizing the influence of noise and extreme outliers, a strategy also recommended in several crop height studies using percentile-based metrics [30,32,34]. In 2022, when corn was excluded, DSM-TRI performance improved markedly and, in some cases, approached that of point clouds, further underscoring the role of canopy uniformity in determining DSM accuracy. Collectively, these findings highlight that point clouds remain the preferred data product for structurally complex canopies, whereas DSM-TRI can serve as an efficient and sufficiently accurate alternative for crops with smoother and more homogeneous vegetation surfaces.

4.2. Estimation Accuracy and Efficiency as Affected by Flight Altitude and Image Overlap

Analysis of Table 3, Table 4 and Table 5 indicates that flight altitude, image overlap, and canopy structure all significantly influence plant height estimation accuracy and UAS survey efficiency, consistent with previous non-crop studies [11,29,39]. Increasing flight altitude while maintaining constant image overlap generally reduced plant height estimation accuracy, in agreement with findings from limited crop-based studies [30,31]. However, this study also showed that higher image overlap can mitigate this effect, enabling consistent height estimates across altitudes and highlighting the combined importance of flight altitude and overlap in UAS-based measurements.
Mid-range altitudes (60–90 m, 1.0–1.5 cm GSD) consistently provided the best balance between accuracy and operational efficiency, supporting prior conclusions that moderate flight heights preserve canopy detail while limiting image distortion and excessive data volumes. Although increasing side or front overlap generally improved estimation accuracy, moderate overlap reductions at altitudes of 60–120 m (1.0–2.0 cm GSD) had minimal impact, reflecting diminishing returns beyond standard photogrammetric thresholds [25,26]. Higher altitudes or larger GSDs were also more tolerant of overlap reductions, offering greater flexibility in flight planning while reducing both flight duration and computational workload.
Overlap configuration also provides clear operational advantages. Reducing side overlap directly shortens flight time, which is critical under adverse weather conditions or limited daylight. Moderate reductions in side overlap can be effectively offset by maintaining relatively high front overlap, thereby preserving sufficient image redundancy for accurate 3D reconstruction. As demonstrated in Table 3, Table 4 and Table 5, maintaining high front overlap while reducing side overlap kept plant height estimation accuracy high, with strong linear relationships with field-measured heights, low RMSE values, and high R2 values, while substantially improving operational efficiency by reducing both the number of images collected and the subsequent processing time.
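The efficiency argument can be made concrete with a simple flight-planning calculation. The sketch below is only a back-of-the-envelope illustration, not the mission-planning procedure used in this study; the sensor pixel counts, field size, and ground speed are assumed values chosen for illustration.

```python
def survey_plan(gsd_m, img_w_px, img_h_px, side_overlap, front_overlap,
                field_w_m, field_l_m, speed_mps):
    """Rough image count and flight time for a rectangular field survey."""
    # Ground footprint of one image at the given ground sampling distance (GSD).
    footprint_w = gsd_m * img_w_px            # across-track width (m)
    footprint_h = gsd_m * img_h_px            # along-track length (m)
    line_spacing = footprint_w * (1.0 - side_overlap)    # spacing between flight lines
    photo_spacing = footprint_h * (1.0 - front_overlap)  # spacing between exposures
    n_lines = int(field_w_m / line_spacing) + 1
    photos_per_line = int(field_l_m / photo_spacing) + 1
    flight_time_min = n_lines * field_l_m / speed_mps / 60.0  # ignores turn time
    return n_lines * photos_per_line, flight_time_min

# Hypothetical 2-ha field (100 m x 200 m), 1.0 cm GSD, 5472 x 3648 px images,
# 5 m/s ground speed; all of these values are assumptions for illustration only.
for side, front in [(0.83, 0.83), (0.75, 0.87), (0.67, 0.83)]:
    n_img, t_min = survey_plan(0.01, 5472, 3648, side, front, 100.0, 200.0, 5.0)
    print(f"side {side:.0%} / front {front:.0%}: ~{n_img} images, ~{t_min:.1f} min")
```

Under these assumed values, lowering side overlap reduces the number of flight lines, and therefore flight time, while a higher front overlap preserves along-track image redundancy at little extra cost.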

4.3. DTM-Based Ground Elevation Compared with GNSS Measurements

DTMs are widely used in UAS-based plant height estimation, where canopy height is derived by subtracting a DTM from a DSM or a point cloud-based canopy surface. Numerous studies across different crops have successfully employed this DSM–DTM approach for height estimation [5,18,40,41]. In this study, GNSS measurements collected at the sampling sites were used as the reference ground elevations for plant height determination. DTMs were also generated for each project during image processing in Pix4Dmapper.
Although this study did not use DTMs for plant height calculations, it is worthwhile to briefly examine potential errors in DTMs and assess how these errors could affect plant height estimation accuracy, thereby providing a more realistic assessment of operational uncertainty. To assess DTM accuracy, differences between the DTM-derived ground elevations and the GNSS-measured elevations were calculated at the 48 sampling points for flights conducted at the 60 m altitude (1.0 cm GSD) on the three dates. The mean elevation differences across the 48 samples were −0.071 m, −0.086 m, and −0.052 m for August 2019, June 2022, and July 2022, respectively, with corresponding standard deviations of 0.040 m, 0.032 m, and 0.022 m. The difference values were consistently negative, indicating that the DTM-derived ground elevations were lower than those measured by GNSS. This systematic negative bias can be attributed to the crop rows being planted on raised ridges, as the GNSS measurements were taken at ridge tops, which were slightly higher than the surrounding ground and the inter-plot alleys from which the DTM ground elevations were derived. Despite this consistent negative bias, the overall variation between DTM- and GNSS-derived ground elevations was relatively small, indicating that DTMs can provide a reliable approximation of ground elevation for agricultural fields when estimating plant height. To assess the effect of this systematic bias on plant height estimation, RMSE and R2 were calculated for the three representative datasets. The DTM-based RMSE values were 0.103 m, 0.140 m, and 0.108 m for August 2019, June 2022, and July 2022, respectively. These values were higher than the corresponding GNSS-based RMSE values (0.089 m, 0.075 m, and 0.075 m) reported in Table 3, Table 4 and Table 5, with the largest difference (0.065 m) observed for June 2022. The DTM-based R2 values for the three dates were 0.91, 0.86, and 0.82, which were slightly lower than the GNSS-based R2 values (0.94, 0.89, and 0.85).
Although these results were derived from only three of the 90 datasets collected or derived in this study, they provide a positive indication that DTM-derived ground elevation can be used for plant height estimation. Future research should further investigate how DTM-related errors influence plant height estimation across datasets generated under varying flight altitudes and image overlap configurations. Nevertheless, these results support the practical utility of UAS-derived DTMs for high-throughput plant phenotyping. Numerous studies have successfully relied on bare-soil DSMs acquired before or after planting [42,43], as well as within-season imagery containing both crop canopy and exposed soil, to derive DTMs for plant height estimation, as demonstrated in this study and others [30,40]. However, similar to DSMs, DTM accuracy can be influenced by flight parameters, crop growth stage, and crop structure, particularly in fields with non-uniform terrain or elevated crop rows. Further research is therefore needed to evaluate DTM performance and robustness under more complex crop arrangements and field conditions.
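As a companion to the comparison above, the following sketch shows how bias, standard deviation, RMSE, and R2 statistics of the kind reported here can be computed from paired elevation and height values. The arrays are placeholders standing in for the 48 sampling points of one flight date, and the R2 formulation (squared Pearson correlation) is one common convention that may differ slightly from the exact computation used in this study.

```python
import numpy as np

def elevation_bias(dtm_z, gnss_z):
    """Mean and standard deviation of DTM-minus-GNSS ground elevation differences."""
    diff = np.asarray(dtm_z, dtype=float) - np.asarray(gnss_z, dtype=float)
    return diff.mean(), diff.std(ddof=1)

def accuracy_metrics(estimated_height, measured_height):
    """RMSE and R2 (squared correlation) of estimated vs. field-measured height."""
    est = np.asarray(estimated_height, dtype=float)
    obs = np.asarray(measured_height, dtype=float)
    rmse = float(np.sqrt(np.mean((est - obs) ** 2)))
    r2 = float(np.corrcoef(est, obs)[0, 1] ** 2)
    return rmse, r2

# Placeholder arrays standing in for the 48 sampling points of one flight date.
rng = np.random.default_rng(7)
gnss_ground = 64.0 + rng.uniform(-0.2, 0.2, 48)
dtm_ground = gnss_ground - 0.07 + rng.normal(0.0, 0.03, 48)   # ridge-top bias
measured = rng.uniform(0.4, 1.6, 48)
estimated = measured + rng.normal(0.0, 0.07, 48)

bias, sd = elevation_bias(dtm_ground, gnss_ground)
rmse, r2 = accuracy_metrics(estimated, measured)
print(f"DTM-GNSS bias: {bias:.3f} m (SD {sd:.3f} m); height RMSE {rmse:.3f} m, R2 {r2:.2f}")
```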

4.4. Influence of Processing Parameters on Point Cloud and DSM Generation

The main processing parameters in Pix4Dmapper, including keypoint image scale, densification image scale, point density, and minimum number of matches, strongly influence the quality of generated 3D point clouds, DSMs, and DTMs, as well as overall computational efficiency. Results from this study demonstrate clear trade-offs among point density, processing time, and plant height estimation accuracy, consistent with previous UAS photogrammetry research [11,13,44]. Reducing the keypoint image scale from 2 to 1/8 decreased only the number of points extracted during initial processing but did not reduce the number of points in the final dense cloud, while substantially reducing processing time, particularly at the 1/4 and 1/8 scales. R2 and RMSE values across the five keypoint scales were generally comparable, with the default (1) and 1/4 scales showing the most consistent performance. In contrast, reducing the densification scale from the default (1/2) dramatically lowered point cloud density and processing time, but at the expense of plant height estimation accuracy. Increasing point density to High generated more points with similar R2 and RMSE values, though processing times increased substantially. Increasing the minimum number of matches from 2 to 6 reduced the number of points but had only a minor effect on processing time, with the default value of 3 yielding the highest R2. These findings indicate that selecting an optimal combination of processing parameters is crucial to balance accuracy and computational efficiency in UAS-based plant height estimation.
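One simple way to act on these trade-offs is sketched below, under the assumption that each processing run has been logged with its RMSE and wall-clock time (the few records shown are transcribed from Table 6): screen parameter combinations for those that are not dominated in both accuracy and processing time. The Pareto filter is a generic selection technique, not a procedure prescribed by this study.

```python
# A few Table 6 records (60 m, August 2019): keypoint/densification image scales,
# plant height RMSE (m), and processing time (minutes).
runs = [
    {"keypoint": "1",   "densification": "1/2", "rmse": 0.098, "minutes": 158},
    {"keypoint": "1/4", "densification": "1/2", "rmse": 0.083, "minutes": 83},
    {"keypoint": "1/8", "densification": "1/2", "rmse": 0.081, "minutes": 64},
    {"keypoint": "1",   "densification": "1/4", "rmse": 0.136, "minutes": 115},
    {"keypoint": "1",   "densification": "1",   "rmse": 0.094, "minutes": 314},
]

def pareto_front(records):
    """Keep settings for which no other run is as good or better on both criteria."""
    front = []
    for a in records:
        dominated = any(
            b is not a and b["rmse"] <= a["rmse"] and b["minutes"] <= a["minutes"]
            for b in records
        )
        if not dominated:
            front.append(a)
    return sorted(front, key=lambda r: r["rmse"])

for r in pareto_front(runs):
    print(r)
```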

4.5. Optimizing UAS-Based Plant Height Estimation in Precision Agriculture

UAS-derived point clouds and DSMs have become indispensable tools for plant height estimation in plant phenotyping and precision agriculture, providing a cost-effective and non-invasive alternative to traditional ground-based measurements. Numerous studies have demonstrated the accuracy of UAS-based canopy height modeling across a wide range of crops, including cotton, maize, wheat, rice, and soybean [2,6,18,43,45]; however, relatively few studies have explicitly examined how flight altitude and image overlap affect estimation accuracy and operational efficiency. The findings of this study provide comprehensive datasets and practical guidance for optimizing UAS data acquisition and photogrammetric processing for plant height estimation. Key considerations include determining optimal flight altitude and side/front overlap settings, selecting DSMs or point clouds for plant height extraction based on plant canopy structure, and refining processing parameters in relation to data volume and computing resources.
Careful selection of flight parameters, such as flight height and image overlap, together with appropriate data extraction methods and processing parameters, allows users to balance data quality and processing time, which can be further reduced with advanced computing technology. Furthermore, the small discrepancies observed between GNSS-measured and DTM-derived ground elevations indicate that UAS-derived DTMs can serve as reliable proxies for ground surfaces in many agricultural settings, thereby streamlining workflows for plant height derivation. Collectively, these insights can assist researchers and producers in selecting efficient, robust, and cost-effective survey configurations tailored to their operational needs, ranging from small experimental plots to large commercial fields.

4.6. Limitations and Future Work

This study was designed as a systematic and practical evaluation of UAS-based crop height estimation under varying flight and processing configurations across multiple crops and growing seasons. As such, several limitations should be acknowledged, along with opportunities for future research.
First, the study emphasizes systematic evaluation of commonly used flight parameters and SfM processing settings rather than the development of new photogrammetric methods. The focus was on quantifying their operational effects on point cloud quality, processing efficiency, and plant height estimation accuracy. While this approach provides extensive empirical evidence and practical guidance for UAS users, future studies could strengthen the theoretical foundation by explicitly linking observed performance trends to the geometric and radiometric principles underlying SfM-based 3D reconstruction.
Second, although both side and front image overlaps were systematically varied, their impacts were primarily assessed using experimental performance metrics such as RMSE, R2, processing time, and point cloud density. The distinct and potentially asymmetric roles of side and front overlap in tie-point generation, image geometry, and vertical accuracy were not explicitly analyzed from a photogrammetric perspective. Further research could explore these relationships more deeply, for example through controlled experiments or simulation-based analyses that isolate the influence of overlap directionality on reconstruction stability and error propagation.
Third, the evaluation of processing parameter effects was limited to Pix4Dmapper, which constrains the general applicability of the selected parameter combinations. While Pix4Dmapper is widely used in agricultural UAS applications and represents a typical SfM workflow, other photogrammetry software packages, such as Agisoft Metashape and OpenDroneMap, offer comparable functionality and may differ substantially in computational efficiency, parameter implementation, and reconstruction behavior. Consequently, optimal parameter combinations identified in Pix4Dmapper may not directly translate to other software packages. Future work should include cross-platform comparisons to evaluate the consistency of results across software environments and to assess the transferability of recommended processing strategies. In addition, due to time and computational resource constraints, detailed analyses of SfM processing parameters were conducted using a limited set of representative parameter combinations and primarily focused on datasets acquired at a 60 m flight altitude. Although similar trends were observed across multiple flight dates, processing parameter effects were not exhaustively evaluated across all flight altitudes or image GSDs. Future studies could extend this analysis to additional altitudes and spatial resolutions by leveraging high-performance or cloud-based computing resources to support more extensive and systematic evaluations of processing parameter combinations.
Finally, this study estimated plant height using percentile-based metrics derived from SfM-generated point clouds within localized sampling areas. While this approach is transparent, computationally efficient, and well suited for controlled field experiments, it does not explore recent advances in learning-based methods that operate directly on raw 3D point clouds. Future research could investigate the use of deep learning approaches for point cloud analysis to better capture complex canopy structures, improve robustness to noise and canopy motion, and reduce sensitivity to environmental variability. Comparative studies between conventional statistical methods and learning-based models would help clarify trade-offs among accuracy, computational demand, and interpretability.
Despite these limitations, this study provides a large, multi-year empirical dataset and a systematic evaluation of commonly used UAS flight and SfM processing parameters. The results offer practical guidance for balancing accuracy, efficiency, and operational constraints in UAS-based crop height estimation, while also identifying clear directions for future methodological and theoretical advances.

5. Conclusions

This study provides a comprehensive evaluation of UAS-derived plant height estimation based on 90 datasets collected across two growing seasons, spanning multiple flight altitudes, image overlap configurations, and crop types. By systematically assessing both data acquisition and photogrammetric processing parameters, this work offers practical insights into optimizing UAS workflows for plant height estimation. Mid-range flight altitudes (60–90 m, corresponding to 1.0–1.5 cm GSD) consistently provided the best balance between survey efficiency and estimation accuracy. Reducing side overlaps (75% at 60 m and 89% at 90 m) while maintaining high front overlaps (87% at 60 m and 92% at 90 m) enabled substantial reductions in flight time and processing effort with little or no loss in data quality or plant height estimation accuracy. Reconstruction performance was influenced by crop structure, with uniform canopies producing stable results and taller, structurally complex canopies benefiting from point cloud–based analyses to better capture canopy extremes.
Photogrammetric processing parameters in Pix4Dmapper strongly influenced point cloud density, computational time, and plant height estimation accuracy. Parameter configurations with smaller keypoint image scales (1/2 and 1/4) achieved high accuracy with reasonable processing times, whereas higher point density settings substantially increased point cloud density and computational demand with little improvement in estimation accuracy. These findings highlight the importance of aligning processing settings with specific application objectives, whether prioritizing plant height estimation accuracy, computational efficiency, or detailed canopy representation.
Point cloud reconstruction quality, plant height estimation accuracy, and processing efficiency can be affected by sensor and platform selection, flight altitude and overlap settings, spatial and temporal variability among crop types, weather or environmental conditions during data acquisition, and available computing and software resources. Future research should explore a broader range of sensors and platforms, expanded flight configurations, diverse crop growing conditions, and automated or cloud-based processing pipelines to enhance robustness, scalability, and operational efficiency. Overall, this study provides practical guidance for optimizing UAS survey design and photogrammetric workflows, enabling accurate and efficient plant height estimation to support precision agriculture and high-throughput phenotyping.

Author Contributions

Conceptualization, C.Y., C.P.-C.S. and B.K.F.; methodology, C.Y. and C.P.-C.S.; formal analysis, C.Y.; investigation, C.Y. and C.P.-C.S.; data curation, C.Y.; writing—original draft preparation, C.Y.; visualization, C.Y., C.P.-C.S. and B.K.F.; writing—review and editing, C.Y., C.P.-C.S. and B.K.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Data Availability Statement

Data will be made available upon reasonable request.

Acknowledgments

The authors thank Mike O’Neil for preparing and maintaining the test plots, and Fred Gomez for assisting in field data collection. Mention of trade names or commercial products in this article is solely for the purpose of providing specific information and does not imply recommendation or endorsement by the U.S. Department of Agriculture. The USDA is an equal opportunity provider and employer.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using Crop Surface Models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412.
2. Chen, R.; Chu, T.; Landivar, J.A.; Yang, C.; Maeda, M.M. Monitoring cotton (Gossypium hirsutum L.) germination using ultrahigh-resolution UAS images. Precis. Agric. 2018, 19, 161–177.
3. Corti, M.; Cavalli, D.; Cabassi, G.; Bechini, L.; Pricca, N.; Paolo, D.; Marinoni, L.; Vigoni, A.; Degano, L.; Gallina, P.M. Improved estimation of herbaceous crop aboveground biomass using UAV-derived crop height combined with vegetation indices. Precis. Agric. 2023, 24, 587–606.
4. Chang, A.; Jung, J.; Maeda, M.M.; Landivar, J. Crop height monitoring with digital imagery from Unmanned Aerial System (UAS). Comput. Electron. Agric. 2017, 141, 232–237.
5. De Souza, C.H.W.; Lamparelli, R.A.C.; Rocha, J.V.; Magalhães, P.S.G. Height estimation of sugarcane using an unmanned aerial system (UAS) based on structure from motion (SfM) point clouds. Int. J. Remote Sens. 2017, 38, 2218–2230.
6. Gilliot, J.M.; Michelin, J.; Hadjard, D.; Houotet, S. An accurate method for predicting spatial variability of maize yield from UAV-based plant height estimation: A tool for monitoring agronomic field experiments. Precis. Agric. 2021, 22, 897–921.
7. Harandi, N.; Vandenberghe, B.; Vankerschaver, J.; Depuydt, S.; Van Messem, A. How to make sense of 3D representations for plant phenotyping: A compendium of processing and analysis techniques. Plant Methods 2023, 19, 60.
8. Martinez-Guanter, J.; Ribeiro, Á.; Peteinatos, G.G.; Pérez-Ruiz, M.; Gerhards, R.; Bengochea-Guevara, J.M.; Machleb, J.; Andújar, D. Low-cost three-dimensional modeling of crop plants. Sensors 2019, 19, 2883.
9. Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J.M. “Structure-from-Motion” photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314.
10. James, M.R.; Robson, S.; d’Oleire-Oltmanns, S.; Niethammer, U. Optimising UAV topographic surveys processed with structure-from-motion: Ground control quality, quantity and bundle adjustment. Geomorphology 2017, 280, 51–66.
11. Liu, Y.; Han, K.; Rasdorf, W. Assessment and prediction of impact of flight configuration factors on UAS-based photogrammetric survey accuracy. Remote Sens. 2022, 14, 4119.
12. Du, X.; Zheng, L.; Zhu, J.; Cen, H.; He, Y. Evaluation of mosaic image quality and analysis of influencing factors based on UAVs. Drones 2024, 8, 143.
13. Jarahizadeh, S.; Salehi, B. A comparative analysis of UAV photogrammetric software performance for forest 3D modeling: A case study using Agisoft Photoscan, Pix4Dmapper, and DJI Terra. Sensors 2024, 24, 286.
14. Bendig, J.; Kang, Y.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
15. Holman, F.H.; Riche, A.B.; Michalski, A.; Castle, M.; Wooster, M.J.; Hawkesford, M.J. High throughput field phenotyping of wheat plant height and growth rate in field plot trials using UAV based remote sensing. Remote Sens. 2016, 8, 1031.
16. Madec, S.; Baret, F.; de Solan, B.; Thomas, S.; Dutartre, D.; Jezequel, S.; Hemmerlé, M.; Colombeau, G.; Comar, A. High-throughput phenotyping of plant height: Comparing unmanned aerial vehicles and ground LiDAR estimates. Front. Plant Sci. 2017, 8, 2002.
17. Xie, T.; Li, J.; Yang, C.; Jiang, Z.; Chen, Y.; Guo, L.; Zhang, J. Crop height estimation based on UAV images: Methods, errors, and strategies. Comput. Electron. Agric. 2021, 185, 106155.
18. Pun Magar, L.; Sandifer, J.; Khatri, D.; Poudel, S.; KC, S.; Gyawali, B.; Gebremedhin, M.; Chiluwal, A. Plant height measurement using UAV-based aerial RGB and LiDAR images in soybean. Front. Plant Sci. 2025, 16, 1488760.
19. Forlani, G.; Dall’Asta, E.; Diotri, F.; Cella, U.M.; Roncella, R.; Santise, M. Quality assessment of DSMs produced from UAV flights georeferenced with on board RTK positioning. Remote Sens. 2018, 10, 311.
20. Tomaštík, J.; Mokroš, M.; Surový, P.; Grznárová, A.; Merganič, J. UAV RTK/PPK method—An optimal solution for mapping inaccessible forested areas? Remote Sens. 2019, 11, 721.
21. Sefercik, U.G.; Kavzoğlu, T.; Çölkesen, İ.; Nazar, M.; Öztürk, M.Y.; Adalı, S.; Dinç, S. 3D positioning accuracy and land cover classification performance of multispectral RTK UAVs. Int. J. Eng. Geosci. 2023, 8, 119–128.
22. Taddia, Y.; Stecchi, F.; Pellegrinelli, A. Coastal mapping using DJI Phantom 4 RTK in post-processing kinematic mode. Drones 2020, 4, 9.
23. Štroner, M.; Urban, R.; Seidl, J.; Reindl, T.; Brouček, J. Photogrammetry using UAV-mounted GNSS RTK: Georeferencing strategies without GCPs. Remote Sens. 2021, 13, 1336.
24. Štroner, M.; Urban, R.; Reindl, T.; Seidl, J.; Brouček, J. Evaluation of the georeferencing accuracy of a photogrammetric model using a quadrocopter with onboard GNSS RTK. Sensors 2020, 20, 2318.
25. Maes, W.H. Practical guidelines for performing UAV mapping flights with snapshot sensors. Remote Sens. 2025, 17, 606.
26. Seifert, E.; Seifert, S.; Vogt, H.; Drew, D.; van Aardt, J.; Kunneke, A.; Seifert, T. Influence of drone altitude, image overlap, and optical sensor resolution on multi-view reconstruction of forest images. Remote Sens. 2019, 11, 1252.
27. Kameyama, S.; Sugiura, K. Estimating tree height and volume using unmanned aerial vehicle photography and SfM technology, with verification of result accuracy. Drones 2020, 4, 19.
28. Grybas, H.; Congalton, R.G. Evaluating the impacts of flying height and forward overlap on tree height estimates using unmanned aerial systems. Forests 2022, 13, 1462.
29. Mao, P.; Jiang, B.; Shi, Z.; He, Y.; Shen, T.; Qiu, G.Y. Effects of UAV flight height on biomass estimation of desert shrub communities. Ecol. Indic. 2023, 154, 110698.
30. Adedeji, O.; Abdalla, A.; Ghimire, B.; Ritchie, G.; Guo, W. Flight altitude and sensor angle affect unmanned aerial system cotton plant height assessments. Drones 2024, 8, 746.
31. Bazrafkan, A.; Worral, H.; Perdigon, C.; Oduor, P.G.; Bandillo, N.; Flores, P. Evaluating sensor fusion and flight parameters for enhanced plant height measurement in dry peas. Sensors 2025, 25, 2436.
32. Malambo, L.; Popescu, S.C.; Murray, S.C.; Putman, E.; Pugh, N.A.; Horne, D.W.; Richardson, G.; Sheridan, R.; Rooney, W.L.; Avant, R.; et al. Multitemporal field-based plant height estimation using 3D point clouds generated from small unmanned aerial systems high-resolution imagery. Int. J. Appl. Earth Obs. Geoinf. 2018, 64, 31–42.
33. Kümmerer, R.; Noack, P.O.; Bauer, B. Using high-resolution UAV imaging to measure canopy height of diverse cover crops and predict biomass. Remote Sens. 2023, 15, 1520.
34. Chu, T.; Starek, M.J.; Brewer, M.J.; Murray, S.C.; Pruter, L.S. Characterizing canopy height with UAS structure-from-motion photogrammetry—Results analysis of a maize field trial with respect to multiple factors. Remote Sens. Lett. 2018, 9, 753–762.
35. Zhang, K.; Sekiyama, A.; Okazawa, H.; Yamazaki, Y.; Hayashi, K.; Tsuji, O.; Akimoto, M. Comparison of crop surface models and 3D point clouds by UAV imagery on estimating plant height and biomass volume of pasture grass. Int. J. Environ. Rural Dev. 2022, 13, 137–143.
36. Niederheiser, R.; Mokroš, M.; Lange, J.; Petschko, H.; Prasicek, G.; Elberink, S.O. Deriving 3D point clouds from terrestrial photographs—Comparison of different sensors and software. In Proceedings of the 23rd Congress of the International Society of Photogrammetry and Remote Sensing: From Human History to the Future with Spatial Information, Prague, Czech Republic, 12–19 July 2016; Volume 41-B5, pp. 685–692.
37. Oborne, M. Mission Planner Software. Available online: http://www.ardupilot.org/planner/ (accessed on 15 December 2025).
38. Yang, C.; Fritz, B.K.; Suh, C.P.-C. Practical methods for aerial image acquisition and reflectance conversion using consumer-grade cameras on manned and unmanned aircraft. Precis. Agric. 2024, 25, 2831–2852.
39. Dhruva, A.; Hartley, R.J.L.; Redpath, T.A.N.; Estarija, H.J.C.; Cajes, D.; Massam, P.D. Effective UAV photogrammetry for forest management: New insights on side overlap and flight parameters. Forests 2024, 15, 2135.
40. Hassan, M.A.; Yang, M.; Fu, L.; Rasheed, A.; Zheng, B.; Xia, X.; Xiao, Y.; He, Z. Accuracy assessment of plant height using an unmanned aerial vehicle for quantitative genomic analysis in bread wheat. Plant Methods 2019, 15, 37.
41. Lu, W.; Okayama, T.; Komatsuzaki, M. Rice height monitoring between different estimation models using UAV photogrammetry and multispectral technology. Remote Sens. 2022, 14, 78.
42. Stanton, C.; Starek, M.J.; Elliott, N.; Brewer, M.; Maeda, M.M.; Chu, T. Unmanned aircraft system-derived crop height and normalized difference vegetation index metrics for sorghum yield and aphid stress assessment. J. Appl. Remote Sens. 2017, 11, 026035.
43. Kawamura, K.; Asai, H.; Yasuda, T.; Khanthavong, P.; Soisouvanh, P.; Phongchanmixay, S. Field phenotyping of plant height in an upland rice field in Laos using low-cost small unmanned aerial vehicles (UAVs). Plant Prod. Sci. 2020, 23, 452–465.
44. Harwin, S.; Lucieer, A.; Osborn, J. The impact of the calibration method on the accuracy of point clouds derived using unmanned aerial vehicle multi-view stereopsis. Remote Sens. 2015, 7, 11933–11953.
45. Wang, D.; Li, R.; Zhu, B.; Liu, T.; Sun, C.; Guo, W. Estimation of wheat plant height and biomass by combining UAV imagery and elevation data. Agriculture 2023, 13, 9.
Figure 1. Layouts of crop plots across a 2-ha experimental field near College Station, Texas, in (a) August 2019 and (b) June and July 2022.
Figure 2. Scatterplots and regression lines between measured plant height and 99th percentile estimates derived from two digital surface models (DSM), including inverse distance weighting (DSM-IDW) and triangulation (DSM-TRI), and from the point clouds at 60 m (1.0 cm GSD) and 90 m (1.5 cm GSD) for four crops in August 2019. The dotted line represents the 1:1 line.
Figure 3. Scatterplots and regression lines between measured plant height and 99th percentile estimates derived from two digital surface models (DSM), including inverse distance weighting (DSM-IDW) and triangulation (DSM-TRI), and from the point clouds at 60 m (1.0 cm GSD) for three crops in June and July 2022. The dotted line represents the 1:1 line.
Figure 4. Scatterplots and regression lines between measured plant height and 99th-percentile estimates derived from point clouds with reduced side overlaps at 60 m (1.0 cm GSD), 90 m (1.5 cm GSD), and 120 m (2.0 cm GSD) for three crops across three dates in 2019 and 2022. The dotted line represents the 1:1 line.
Table 1. Root mean square error and coefficient of determination, RMSE (R2), between measured plant height and estimates derived from three estimation methods and three percentiles for four crops and three crops in 2019.

Four crops: corn, cotton, sorghum, soybean. Three crops: cotton, sorghum, soybean.
| Percentile | Four Crops: DSM-IDW a | Four Crops: DSM-TRI a | Four Crops: Point Cloud | Three Crops: DSM-IDW | Three Crops: DSM-TRI | Three Crops: Point Cloud |
At 30 m altitude (0.5 cm ground sampling distance, GSD), 67%/67% side/front overlap, 416 images
| 98th  | 0.442 (0.32) | 0.426 (0.34) | 0.329 (0.51) | 0.288 (0.68) | 0.287 (0.64) | 0.222 (0.73) |
| 99th  | 0.426 (0.36) | 0.408 (0.35) | 0.321 (0.51) | 0.286 (0.68) | 0.276 (0.64) | 0.218 (0.72) |
| 100th | 0.423 (0.36) | 0.360 (0.35) | 0.302 (0.52) | 0.283 (0.68) | 0.235 (0.59) | 0.199 (0.74) |
At 60 m altitude (1.0 cm GSD), 83%/83% side/front overlap, 445 images
| 98th  | 0.260 (0.70) | 0.196 (0.79) | 0.097 (0.94) | 0.138 (0.94) | 0.089 (0.95) | 0.064 (0.95) |
| 99th  | 0.258 (0.71) | 0.166 (0.81) | 0.090 (0.94) | 0.137 (0.94) | 0.088 (0.92) | 0.063 (0.95) |
| 100th | 0.256 (0.71) | 0.215 (0.81) | 0.108 (0.88) | 0.135 (0.94) | 0.151 (0.79) | 0.105 (0.86) |
At 90 m altitude (1.5 cm GSD), 89%/89% side/front overlap, 418 images
| 98th  | 0.424 (0.22) | 0.241 (0.71) | 0.116 (0.92) | 0.162 (0.94) | 0.105 (0.95) | 0.063 (0.95) |
| 99th  | 0.422 (0.23) | 0.204 (0.81) | 0.103 (0.92) | 0.161 (0.94) | 0.099 (0.95) | 0.062 (0.95) |
| 100th | 0.411 (0.25) | 0.128 (0.90) | 0.091 (0.93) | 0.160 (0.94) | 0.090 (0.92) | 0.071 (0.94) |
At 120 m altitude (2.0 cm GSD), 92%/92% side/front overlap, 438 images
| 98th  | 0.477 (0.24) | 0.334 (0.48) | 0.175 (0.79) | 0.229 (0.94) | 0.152 (0.92) | 0.089 (0.91) |
| 99th  | 0.475 (0.24) | 0.318 (0.52) | 0.157 (0.81) | 0.227 (0.94) | 0.148 (0.92) | 0.085 (0.91) |
| 100th | 0.474 (0.24) | 0.296 (0.56) | 0.140 (0.82) | 0.226 (0.94) | 0.140 (0.92) | 0.088 (0.89) |
a Digital surface models (DSM) were created using inverse distance weighting (IDW) and triangulation (TRI). Bold values indicate best RMSE (R2) values among the three methods and three percentiles for each altitude.
Table 2. Root mean square error and coefficient of determination, RMSE (R2), between measured plant height and estimates based on three estimation methods and three percentiles for three crops flown in June and July 2022.

| Percentile | June 2022: DSM-IDW a | June 2022: DSM-TRI a | June 2022: Point Cloud | July 2022: DSM-IDW | July 2022: DSM-TRI | July 2022: Point Cloud |
At 30 m altitude, 75%/75% side/front overlap, 381 images (June) and 358 images (July)
| 98th  | 0.126 (0.63) | 0.107 (0.70) | 0.094 (0.74) | 0.084 (0.80) | 0.080 (0.80) | 0.078 (0.82) |
| 99th  | 0.123 (0.65) | 0.100 (0.72) | 0.082 (0.83) | 0.083 (0.80) | 0.078 (0.81) | 0.080 (0.82) |
| 100th | 0.109 (0.70) | 0.144 (0.64) | 0.090 (0.85) | 0.083 (0.80) | 0.143 (0.68) | 0.094 (0.83) |
At 60 m altitude, 87%/87% side/front overlap, 378 images (June) and 365 images (July)
| 98th  | 0.100 (0.86) | 0.082 (0.85) | 0.074 (0.89) | 0.089 (0.82) | 0.076 (0.84) | 0.074 (0.85) |
| 99th  | 0.100 (0.86) | 0.079 (0.85) | 0.075 (0.89) | 0.089 (0.82) | 0.075 (0.84) | 0.074 (0.85) |
| 100th | 0.099 (0.86) | 0.087 (0.80) | 0.085 (0.88) | 0.088 (0.82) | 0.075 (0.82) | 0.077 (0.86) |
At 90 m altitude, 92%/92% side/front overlap, 377 images (June) and 361 images (July)
| 98th  | 0.178 (0.75) | 0.108 (0.80) | 0.084 (0.80) | 0.124 (0.78) | 0.086 (0.80) | 0.081 (0.83) |
| 99th  | 0.177 (0.75) | 0.105 (0.80) | 0.084 (0.80) | 0.123 (0.78) | 0.085 (0.80) | 0.083 (0.83) |
| 100th | 0.176 (0.75) | 0.093 (0.81) | 0.090 (0.80) | 0.122 (0.78) | 0.084 (0.78) | 0.096 (0.82) |
At 120 m altitude, 94%/92% side/front overlap, 376 images (June) and 365 images (July)
| 98th  | 0.299 (0.79) | 0.221 (0.76) | 0.142 (0.70) | 0.322 (0.77) | 0.241 (0.72) | 0.158 (0.69) |
| 99th  | 0.298 (0.79) | 0.213 (0.77) | 0.137 (0.70) | 0.321 (0.77) | 0.233 (0.73) | 0.144 (0.70) |
| 100th | 0.297 (0.79) | 0.196 (0.77) | 0.133 (0.70) | 0.321 (0.77) | 0.220 (0.72) | 0.131 (0.71) |
a Digital surface models (DSM) were created using inverse distance weighting (IDW) and triangulation (TRI). Bold values indicate best RMSE (R2) values among the three methods and three percentiles for each altitude.
Table 3. Point cloud density and processing time as well as RMSE and R2 between measured plant height and estimates derived from point clouds for all combinations of side and front overlaps at four flight altitudes in August 2019.

| Side/Front Overlap | Number of Images | 3D Points, Total (Sample) | Process Time | Four Crops (48 Samples) RMSE (m) | Four Crops R2 | Three Crops (36 Samples) RMSE (m) | Three Crops R2 |
30 m (0.5 cm GSD)
| 67%/67% | 416 | 77.4 M (9845) | 5 h 27 min | 0.321 | 0.51 | 0.218 | 0.72 |
60 m (1.0 cm GSD)
| 67%/67% | 117 | 25.2 M (2689) | 1 h 06 min | 0.543 | 0.31 | 0.491 | 0.29 |
| 67%/83% | 234 | 36.3 M (4533) | 2 h 11 min | 0.116 | 0.88 | 0.069 | 0.93 |
| 83%/67% | 223 | 33.4 M (3847) | 2 h 10 min | 0.232 | 0.71 | 0.162 | 0.83 |
| 83%/83% | 445 | 55.8 M (7719) | 6 h 02 min | 0.089 | 0.94 | 0.063 | 0.95 |
90 m (1.5 cm GSD)
| 67%/67% | 54 | 12.5 M (1170) | 0 h 29 min | 0.534 | 0.22 | 0.392 | 0.48 |
| 67%/83% | 80 | 16.1 M (1603) | 0 h 40 min | 0.182 | 0.79 | 0.122 | 0.87 |
| 67%/89% | 160 | 22.6 M (2464) | 1 h 28 min | 0.142 | 0.84 | 0.072 | 0.93 |
| 83%/67% | 75 | 14.5 M (1447) | 0 h 39 min | 0.420 | 0.28 | 0.282 | 0.59 |
| 83%/83% | 112 | 18.5 M (1960) | 0 h 58 min | 0.181 | 0.76 | 0.109 | 0.86 |
| 83%/89% | 222 | 26.8 M (3080) | 2 h 00 min | 0.101 | 0.92 | 0.077 | 0.92 |
| 89%/67% | 140 | 20.3 M (2280) | 1 h 07 min | 0.112 | 0.91 | 0.078 | 0.92 |
| 89%/83% | 209 | 25.4 M (3032) | 1 h 47 min | 0.128 | 0.89 | 0.071 | 0.94 |
| 89%/89% | 418 | 43.9 M (5765) | 5 h 51 min | 0.103 | 0.92 | 0.062 | 0.95 |
120 m (2.0 cm GSD)
| 67%/67% | 33 | 7.8 M (710) | 0 h 17 min | 0.439 | 0.39 | 0.313 | 0.67 |
| 67%/83% | 44 | 9.7 M (810) | 0 h 23 min | 0.350 | 0.56 | 0.254 | 0.77 |
| 67%/89% | 65 | 11.9 M (1086) | 0 h 32 min | 0.202 | 0.78 | 0.148 | 0.83 |
| 67%/92% | 131 | 16.4 M (1506) | 1 h 10 min | 0.141 | 0.84 | 0.091 | 0.90 |
| 83%/67% | 43 | 8.9 M (745) | 0 h 22 min | 0.440 | 0.30 | 0.395 | 0.25 |
| 83%/83% | 55 | 10.5 M (886) | 0 h 25 min | 0.363 | 0.41 | 0.212 | 0.76 |
| 83%/89% | 83 | 13.1 M (1574) | 0 h 43 min | 0.175 | 0.79 | 0.118 | 0.88 |
| 83%/92% | 167 | 18.8 M (1783) | 1 h 36 min | 0.232 | 0.58 | 0.116 | 0.84 |
| 89%/67% | 58 | 10.5 M (910) | 1 h 28 min | 0.259 | 0.68 | 0.164 | 0.86 |
| 89%/83% | 77 | 12.4 M (1022) | 0 h 37 min | 0.263 | 0.62 | 0.163 | 0.83 |
| 89%/89% | 115 | 15.2 M (1419) | 1 h 00 min | 0.174 | 0.77 | 0.115 | 0.85 |
| 89%/92% | 231 | 22.7 M (2113) | 2 h 14 min | 0.159 | 0.83 | 0.093 | 0.92 |
| 92%/67% | 110 | 14.5 M (1367) | 1 h 00 min | 0.218 | 0.69 | 0.105 | 0.90 |
| 92%/83% | 146 | 17.1 M (1678) | 0 h 53 min | 0.204 | 0.68 | 0.100 | 0.90 |
| 92%/89% | 219 | 21.8 M (2134) | 2 h 28 min | 0.192 | 0.72 | 0.094 | 0.90 |
| 92%/92% | 438 | 35.3 M (3872) | 7 h 09 min | 0.157 | 0.81 | 0.085 | 0.91 |
Table 4. Point cloud density and processing time as well as RMSE and R2 between measured plant height and estimates derived from point clouds for all combinations of side and front overlaps at four flight altitudes in June 2022.

| Side/Front Overlap | Number of Images | 3D Points, Total (Sample) | Process Time | RMSE (m) | R2 |
30 m (0.5 cm GSD)
| 75%/75% | 381 | 60.3 M (16,810) | 4 h 16 min | 0.082 | 0.83 |
60 m (1.0 cm GSD)
| 75%/75% | 103 | 17.8 M (4379) | 0 h 47 min | 0.099 | 0.76 |
| 75%/87% | 204 | 26.1 M (6972) | 1 h 48 min | 0.075 | 0.89 |
| 87%/75% | 189 | 23.8 M (4614) | 1 h 45 min | 0.076 | 0.88 |
| 87%/87% | 378 | 37.1 M (8390) | 4 h 15 min | 0.075 | 0.89 |
90 m (1.5 cm GSD)
| 75%/75% | 48 | 9.1 M (1623) | 0 h 22 min | 0.144 | 0.59 |
| 75%/87% | 73 | 11.5 M (2099) | 0 h 33 min | 0.126 | 0.63 |
| 75%/92% | 145 | 16.8 M (3199) | 1 h 07 min | 0.084 | 0.82 |
| 87%/75% | 69 | 11.0 M (1889) | 0 h 29 min | 0.114 | 0.64 |
| 87%/87% | 105 | 13.9 M (2550) | 0 h 51 min | 0.096 | 0.77 |
| 87%/92% | 207 | 20.5 M (3963) | 1 h 54 min | 0.093 | 0.76 |
| 92%/75% | 126 | 15.2 M (2864) | 1 h 00 min | 0.100 | 0.71 |
| 92%/87% | 189 | 19.2 M (3770) | 1 h 47 min | 0.090 | 0.78 |
| 92%/92% | 377 | 28.9 M (5554) | 4 h 33 min | 0.084 | 0.80 |
120 m (2.0 cm GSD)
| 75%/75% | 32 | 6.5 M (831) | 0 h 13 min | 0.171 | 0.68 |
| 75%/87% | 41 | 7.4 M (965) | 0 h 18 min | 0.164 | 0.64 |
| 75%/92% | 61 | 9.0 M (1269) | 0 h 27 min | 0.163 | 0.69 |
| 75%/94% | 119 | 12.7 M (1784) | 1 h 00 min | 0.130 | 0.68 |
| 87%/75% | 37 | 6.8 M (898) | 0 h 17 min | 0.197 | 0.65 |
| 87%/87% | 49 | 8.0 M (1077) | 0 h 23 min | 0.158 | 0.68 |
| 87%/92% | 73 | 9.8 M (1305) | 0 h 32 min | 0.171 | 0.68 |
| 87%/94% | 144 | 14.3 M (1864) | 1 h 24 min | 0.144 | 0.68 |
| 92%/75% | 54 | 8.3 M (1107) | 0 h 23 min | 0.174 | 0.69 |
| 92%/87% | 71 | 9.7 M (1305) | 0 h 31 min | 0.140 | 0.73 |
| 92%/92% | 106 | 12.1 M (1542) | 0 h 51 min | 0.138 | 0.72 |
| 92%/94% | 209 | 17.8 M (2295) | 2 h 17 min | 0.129 | 0.72 |
| 94%/75% | 95 | 11.2 M (1596) | 0 h 47 min | 0.145 | 0.67 |
| 94%/87% | 126 | 13.2 M (1771) | 1 h 07 min | 0.127 | 0.71 |
| 94%/92% | 189 | 16.6 M (2140) | 1 h 54 min | 0.147 | 0.70 |
| 94%/94% | 376 | 27.2 M (3354) | 6 h 50 min | 0.137 | 0.70 |
Table 5. Point cloud density and processing time as well as RMSE and R2 between measured plant height and estimates derived from point clouds for all combinations of side and front overlaps at four flight altitudes in July 2022.

| Side/Front Overlap | Number of Images | 3D Points, Total (Sample) | Process Time | RMSE (m) | R2 |
30 m (0.5 cm GSD)
| 75%/75% | 358 | 55.1 M (15,163) | 3 h 22 min | 0.080 | 0.82 |
60 m (1.0 cm GSD)
| 75%/75% | 98 | 17.1 M (4344) | 0 h 48 min | 0.077 | 0.83 |
| 75%/87% | 195 | 24.1 M (7174) | 1 h 50 min | 0.071 | 0.87 |
| 87%/75% | 183 | 22.9 M (6550) | 1 h 39 min | 0.066 | 0.87 |
| 87%/87% | 365 | 32.5 M (6554) | 4 h 18 min | 0.075 | 0.85 |
90 m (1.5 cm GSD)
| 75%/75% | 48 | 9.0 M (1315) | 0 h 21 min | 0.165 | 0.53 |
| 75%/87% | 69 | 10.9 M (1910) | 0 h 31 min | 0.110 | 0.64 |
| 75%/92% | 138 | 16.6 M (2956) | 1 h 08 min | 0.083 | 0.82 |
| 87%/75% | 64 | 10.4 M (1608) | 0 h 27 min | 0.106 | 0.67 |
| 87%/87% | 96 | 13.1 M (2303) | 0 h 40 min | 0.085 | 0.78 |
| 87%/92% | 190 | 20.0 M (3754) | 1 h 41 min | 0.098 | 0.75 |
| 92%/75% | 121 | 14.8 M (2750) | 0 h 59 min | 0.084 | 0.80 |
| 92%/87% | 181 | 19.0 M (3654) | 1 h 37 min | 0.087 | 0.79 |
| 92%/92% | 361 | 29.3 M (5607) | 4 h 29 min | 0.083 | 0.83 |
120 m (2.0 cm GSD)
| 75%/75% | 27 | 5.3 M (795) | 0 h 11 min | 0.255 | 0.52 |
| 75%/87% | 37 | 6.6 M (903) | 0 h 16 min | 0.230 | 0.55 |
| 75%/92% | 55 | 8.2 M (1136) | 0 h 24 min | 0.177 | 0.65 |
| 75%/94% | 110 | 12.4 M (2302) | 0 h 54 min | 0.076 | 0.86 |
| 87%/75% | 36 | 6.4 M (1110) | 0 h 16 min | 0.086 | 0.78 |
| 87%/87% | 47 | 7.5 M (1066) | 0 h 21 min | 0.191 | 0.68 |
| 87%/92% | 71 | 9.6 M (1754) | 0 h 31 min | 0.070 | 0.86 |
| 87%/94% | 140 | 14.5 M (2715) | 1 h 14 min | 0.071 | 0.88 |
| 92%/75% | 48 | 7.5 M (1346) | 0 h 20 min | 0.070 | 0.86 |
| 92%/87% | 65 | 9.1 M (1650) | 0 h 28 min | 0.078 | 0.84 |
| 92%/92% | 97 | 11.4 M (1639) | 0 h 44 min | 0.148 | 0.69 |
| 92%/94% | 195 | 17.4 M (3367) | 1 h 59 min | 0.074 | 0.87 |
| 94%/75% | 92 | 11.0 M (2084) | 0 h 44 min | 0.072 | 0.86 |
| 94%/87% | 122 | 13.0 M (2505) | 1 h 05 min | 0.075 | 0.86 |
| 94%/92% | 183 | 16.5 M (3157) | 1 h 52 min | 0.073 | 0.86 |
| 94%/94% | 365 | 27.2 M (3670) | 6 h 46 min | 0.144 | 0.70 |
Table 6. Point cloud density, processing time, RMSE, and R2 between measured plant height and estimates derived from point clouds at the 99th percentile for 20 combinations of processing parameters in Pix4Dmapper using images captured at 60 m (1.0 cm GSD, 83%/83% side/front overlap, 445 images) in August 2019.

| Keypoint Image Scale | Densification Image Scale | Point Density | Minimum Number of Matches | 3D Points (Million) | Process Time | RMSE (m) | R2 |
| 1   | 1/2 | Optimal | 3 | 49.9  | 02 h 38 min | 0.098 | 0.93 |
| 2   | 1/2 | Optimal | 3 | 49.8  | 02 h 48 min | 0.096 | 0.94 |
| 1/2 | 1/2 | Optimal | 3 | 50.0  | 02 h 28 min | 0.109 | 0.91 |
| 1/4 | 1/2 | Optimal | 3 | 49.8  | 01 h 23 min | 0.083 | 0.95 |
| 1/8 | 1/2 | Optimal | 3 | 50.2  | 01 h 04 min | 0.081 | 0.95 |
| 1   | 1   | Optimal | 3 | 177.7 | 05 h 14 min | 0.094 | 0.93 |
| 1   | 1/4 | Optimal | 3 | 11.7  | 01 h 55 min | 0.136 | 0.90 |
| 1   | 1/8 | Optimal | 3 | 2.9   | 01 h 47 min | 0.345 | 0.76 |
| 1   | 1/2 | Low     | 3 | 14.0  | 01 h 56 min | 0.138 | 0.88 |
| 1   | 1/2 | High    | 3 | 175.7 | 04 h 13 min | 0.093 | 0.94 |
| 1   | 1/2 | Optimal | 2 | 67.2  | 02 h 40 min | 0.102 | 0.93 |
| 1   | 1/2 | Optimal | 4 | 41.0  | 02 h 31 min | 0.096 | 0.92 |
| 1   | 1/2 | Optimal | 5 | 35.1  | 02 h 25 min | 0.129 | 0.90 |
| 1   | 1/2 | Optimal | 6 | 30.9  | 02 h 23 min | 0.230 | 0.67 |
| 1/2 | 1   | Optimal | 3 | 178.6 | 05 h 21 min | 0.089 | 0.94 |
| 2   | 1   | Optimal | 3 | 180.9 | 05 h 18 min | 0.101 | 0.91 |
| 2   | 1   | Low     | 3 | 54.9  | 02 h 42 min | 0.092 | 0.94 |
| 2   | 1   | High    | 3 | 598.8 | 16 h 04 min | 0.128 | 0.82 |
| 2   | 1/2 | Low     | 3 | 13.9  | 02 h 05 min | 0.126 | 0.86 |
| 2   | 1/2 | High    | 3 | 175.4 | 04 h 23 min | 0.081 | 0.95 |
The first row displays the default processing parameters.
Table 7. Point cloud density, processing time, RMSE, and R2 between measured plant height and estimates derived from point clouds at the 99th percentile for 20 combinations of processing parameters in Pix4Dmapper using images captured at 60 m (1.0 cm GSD, 87%/87% side/front overlap) in June and July 2022.

June 2022: 378 images; July 2022: 365 images.
| Keypoint Image Scale | Densification Image Scale | Point Density | Min. Matches | June: 3D Points (Million) | June: Process Time | June: RMSE (m) | June: R2 | July: 3D Points (Million) | July: Process Time | July: RMSE (m) | July: R2 |
| 1   | 1/2 | Optimal | 3 | 33.7  | 02 h 36 min | 0.074 | 0.89 | 32.5  | 02 h 22 min | 0.077 | 0.85 |
| 2   | 1/2 | Optimal | 3 | 36.1  | 03 h 09 min | 0.080 | 0.87 | 32.4  | 02 h 28 min | 0.075 | 0.85 |
| 1/2 | 1/2 | Optimal | 3 | 33.4  | 02 h 21 min | 0.073 | 0.88 | 32.4  | 02 h 10 min | 0.075 | 0.85 |
| 1/4 | 1/2 | Optimal | 3 | 34.5  | 01 h 52 min | 0.077 | 0.87 | 32.2  | 01 h 48 min | 0.069 | 0.87 |
| 1/8 | 1/2 | Optimal | 3 | 34.6  | 01 h 44 min | 0.075 | 0.86 | 32.1  | 01 h 36 min | 0.072 | 0.83 |
| 1   | 1   | Optimal | 3 | 92.8  | 06 h 37 min | 0.079 | 0.82 | 104.9 | 07 h 08 min | 0.080 | 0.81 |
| 1   | 1/4 | Optimal | 3 | 8.6   | 01 h 48 min | 0.086 | 0.82 | 7.7   | 01 h 42 min | 0.084 | 0.78 |
| 1   | 1/8 | Optimal | 3 | 2.0   | 01 h 41 min | 0.190 | 0.74 | 1.8   | 01 h 35 min | 0.186 | 0.72 |
| 1   | 1/2 | Low     | 3 | 10.2  | 01 h 49 min | 0.084 | 0.83 | 9.7   | 01 h 47 min | 0.078 | 0.84 |
| 1   | 1/2 | High    | 3 | 118.4 | 05 h 10 min | 0.081 | 0.86 | 113.7 | 04 h 24 min | 0.075 | 0.86 |
| 1   | 1/2 | Optimal | 2 | 50.7  | 02 h 38 min | 0.077 | 0.88 | 47.1  | 02 h 18 min | 0.077 | 0.84 |
| 1   | 1/2 | Optimal | 4 | 26.6  | 02 h 39 min | 0.076 | 0.85 | 26.2  | 02 h 22 min | 0.074 | 0.85 |
| 1   | 1/2 | Optimal | 5 | 23.1  | 02 h 41 min | 0.081 | 0.83 | 24.3  | 02 h 15 min | 0.071 | 0.85 |
| 1   | 1/2 | Optimal | 6 | 23.3  | 01 h 58 min | 0.082 | 0.87 | 21.8  | 01 h 59 min | 0.076 | 0.82 |
| 1/2 | 1   | Optimal | 3 | 93.8  | 07 h 18 min | 0.091 | 0.78 | 100.2 | 07 h 24 min | 0.083 | 0.80 |
| 2   | 1   | Optimal | 3 | 93.8  | 08 h 19 min | 0.081 | 0.81 | 104.2 | 07 h 04 min | 0.089 | 0.77 |
| 2   | 1   | Low     | 3 | 30.5  | 05 h 40 min | 0.096 | 0.74 | 27.8  | 02 h 59 min | 0.079 | 0.81 |
| 2   | 1   | High    | 3 | 418.3 | 11 h 13 min | 0.104 | 0.84 | 419.8 | 08 h 56 min | 0.080 | 0.84 |
| 2   | 1/2 | Low     | 3 | 13.9  | 02 h 55 min | 0.080 | 0.85 | 9.7   | 01 h 55 min | 0.083 | 0.82 |
| 2   | 1/2 | High    | 3 | 114.4 | 05 h 24 min | 0.084 | 0.84 | 116.0 | 04 h 24 min | 0.079 | 0.84 |
The first row displays the default processing parameters.
Table 8. Point cloud density, processing time, RMSE, and R2 between measured plant height and estimates derived from point clouds at the 99th percentile for 10 combinations of processing parameters in Pix4Dmapper using images captured at 30 m and 90 m in July 2022.

30 m: 0.5 cm GSD, 75%/75% overlap, 358 images; 90 m: 1.5 cm GSD, 92%/92% overlap, 361 images.
| Keypoint Image Scale | Densification Image Scale | Point Density | Min. Matches | 30 m: 3D Points (Million) | 30 m: Process Time | 30 m: RMSE (m) | 30 m: R2 | 90 m: 3D Points (Million) | 90 m: Process Time | 90 m: RMSE (m) | 90 m: R2 |
| 1   | 1/2 | Optimal | 3 | 55.1  | 02 h 10 min | 0.080 | 0.84 | 28.8  | 02 h 34 min | 0.082 | 0.83 |
| 2   | 1/2 | Optimal | 3 | 55.1  | 02 h 30 min | 0.081 | 0.83 | 28.9  | 02 h 14 min | 0.087 | 0.81 |
| 1/2 | 1/2 | Optimal | 3 | 55.2  | 01 h 55 min | 0.077 | 0.84 | 29.5  | 02 h 28 min | 0.089 | 0.80 |
| 1/4 | 1/2 | Optimal | 3 | 55.1  | 01 h 35 min | 0.079 | 0.83 | 29.3  | 01 h 59 min | 0.083 | 0.82 |
| 1/8 | 1/2 | Optimal | 3 | 55.1  | 01 h 30 min | 0.072 | 0.84 | 28.6  | 01 h 42 min | 0.083 | 0.82 |
| 1   | 1   | Optimal | 3 | 223.7 | 03 h 51 min | 0.084 | 0.83 | 104.6 | 05 h 36 min | 0.101 | 0.83 |
| 1   | 1/4 | Optimal | 3 | 3.7   | 01 h 34 min | 0.086 | 0.76 | 6.1   | 01 h 59 min | 0.112 | 0.72 |
| 1   | 1/8 | Optimal | 3 | 3.4   | 01 h 33 min | 0.092 | 0.77 | 1.2   | 01 h 55 min | 0.366 | 0.71 |
| 1   | 1/2 | Low     | 3 | 14.9  | 01 h 41 min | 0.078 | 0.84 | 8.3   | 02 h 02 min | 0.086 | 0.81 |
| 1   | 1/2 | High    | 3 | 211.6 | 03 h 24 min | 0.075 | 0.85 | 113.7 | 04 h 56 min | 0.081 | 0.84 |
The first row displays the default processing parameters.