Article

Optimization of UAV-Based Imaging and Image Processing Orthomosaic and Point Cloud Approaches for Estimating Biomass in a Forage Crop

by Worasit Sangjan 1, Rebecca J. McGee 2 and Sindhuja Sankaran 1,*
1 Department of Biological Systems Engineering, Washington State University, Pullman, WA 99164, USA
2 United States Department of Agriculture-Agricultural Research Service, Grain Legume Genetics and Physiology Research Unit, Washington State University, Pullman, WA 99164, USA
* Author to whom correspondence should be addressed.
Remote Sens. 2022, 14(10), 2396; https://doi.org/10.3390/rs14102396
Submission received: 6 April 2022 / Revised: 28 April 2022 / Accepted: 12 May 2022 / Published: 17 May 2022

Abstract:
Forage and field peas provide essential nutrients for livestock diets, and high-quality field peas can influence livestock health and reduce greenhouse gas emissions. Above-ground biomass (AGBM) is one of the vital traits and the primary component of yield in forage pea breeding programs. However, the standard method of AGBM measurement is destructive and labor-intensive. This study utilized an unmanned aerial vehicle (UAV) equipped with a true-color RGB camera and a five-band multispectral camera to estimate the AGBM of winter pea in three breeding trials (two seed yield trials and one cover crop trial). Three processing techniques—vegetation index (VI), digital surface model (DSM), and 3D reconstruction model from point clouds—were used to extract the digital traits (height and volume) associated with AGBM. The digital traits were compared with the ground reference data (measured plant height and harvested AGBM). The results showed that the canopy volume estimated from the 3D model (alpha shape, α = 1.5) developed from the point clouds of UAV-based RGB imagery provided more consistent and higher correlations with fresh AGBM (r = 0.78–0.81, p < 0.001) and dry AGBM (r = 0.70–0.81, p < 0.001) than the other techniques across the three trials. The DSM-based approach (height at the 95th percentile) had a consistent and high correlation (r = 0.71–0.95, p < 0.001) with the measured canopy height. Using UAV imagery, the proposed approaches demonstrated the potential for estimating crop AGBM across winter pea breeding trials.

Graphical Abstract

1. Introduction

Field pea (Pisum sativum L.) is an annual crop, and when planted as a cover crop, it provides multiple benefits to production agriculture, such as weed suppression, erosion control, nitrogen fixation, and increased soil organic matter. Field pea also provides numerous benefits as food in the form of whole or split dry seed, as feed or fodder, and as a fractionated ingredient. As a forage, field pea is a short-duration crop with high yield and protein concentration [1,2,3,4]. Similarly, when harvested as a grain, it is highly palatable, nutrient-dense, and serves as a high-protein feed for all classes of livestock.
Animal feed represents the greatest proportion of variable costs in most livestock systems, and livestock farming practices contribute to environmental impacts. Ruminant grazing has a severe negative environmental effect, contributing up to 14.5% of all anthropogenic sources of atmospheric methane emissions worldwide [5,6]. Breeding programs aim to improve field pea cultivars that are well-adapted to regional environments and produce high quantity and quality biomass. Recent efforts have focused on improving the nutrient quality of forage converted into animal products (meat or milk) and decreasing the methane produced by cattle as feed is digested. This is essential to maximize growers’ and ranchers’ profitability and reduce greenhouse gas emissions [7,8,9].
In field pea breeding programs, the above-ground biomass (AGBM) is a performance trait that can be used to assess the nutritional status of the crop and is the primary yield component of a cover or forage crop [10]. Traditionally, the AGBM and other morphological traits such as plant height are measured manually for a large number of genotypes in breeding programs. This sampling is destructive, laborious, and prone to errors [11,12,13]. Currently, remote sensing technologies with high-throughput data acquisition capabilities have been adapted and utilized as tools to phenotype breeding lines in a non-destructive manner [14,15,16,17].
Unmanned aerial vehicles (UAVs) have increasingly been used for sensing to extract structural and quantitative traits of crops and trees [18,19,20,21]. UAV-mounted multispectral cameras have been used to calculate vegetation indices (VI) for crop AGBM evaluation in barley [22,23], dry bean [24], rice [25,26], and wheat [27,28]. Additionally, images obtained from UAV-mounted digital red–green–blue (RGB) cameras have been used to construct digital surface models (DSMs) using the structure from motion (SfM) algorithm [29] to estimate AGBM in cotton [30], maize [31,32], oat [33], rice [16,34], soybean [35,36], and wheat [37,38].
In general, VIs represent digital AGBM based on the spectral reflectance characteristics of the canopy or the crop’s top surface. The DSM can be used to extract digital traits of different dimensions that quantify crop morphology: one-dimensional (1D) traits such as crop height and width, and two-dimensional (2D) traits such as crop area. Both 1D and 2D data can be associated with canopy volume or AGBM [38,39]. The accuracy of biomass estimation, however, may be affected by saturation effects (sunlight and other environmental conditions) on the VI data [40], the limitations of the RGB camera, and the precision of the method used to create the digital terrain model (DTM). A DTM represents the morphology of the terrain where the crop is grown and is used to extract crop height from the DSM by removing the ground elevation [41,42]. In recent years, with advancements in the SfM algorithm and dense-image-matching techniques, three-dimensional (3D) point clouds can be generated from UAV-based RGB imagery. When estimating canopy volume in tree species, digital traits based on 3D reconstruction models (convex hull, concave hull, alpha shape, and voxel grid) built from RGB point clouds have been found to perform better than the DSM-based approach, serving as a simple alternative to expensive light detection and ranging (LiDAR) sensor systems [43,44,45]. However, the application of these techniques using point clouds generated from UAV-based RGB imagery has not been evaluated in field crops, especially within breeding programs.
In this study, three techniques were utilized to estimate the digital traits from UAV-based imagery associated with AGBM in winter field pea breeding trials. These techniques are vegetation indices, traits extracted from DSM, and traits extracted from the 3D model. The specific objectives were to (1) develop pipelines to segment vegetation point clouds from the soil surface and weeds, reconstruct point clouds of individual plots, and extract digital traits utilizing the point clouds constructed from UAV-based RGB imagery; and (2) explore the potential of utilizing the three different approaches for extracting digital traits to assess AGBM of individual winter field pea breeding plots.

2. Materials and Methods

2.1. Study Area

The winter field pea breeding trials were located in the Palouse region of northwestern USA. The trials were conducted in two growing seasons, 2018–2019 and 2019–2020. In the 2018–2019 season, the Austrian winter pea advanced yield trials (referred to as panel 1921) were planted in two locations—Genesee, ID, and Garfield, WA. In the 2019–2020 season, the forage and cover crop winter pea advanced yield trial (referred to as panel 2021cc) was planted in Pullman, WA. The Austrian winter pea (panel 1921) contained genotypes bred for seed yield (semi-leafless plants with small, pigmented seeds), while the cover crop peas (panel 2021cc) contained genotypes specifically bred to maximize biomass. These three trials were used in this study and are presented in Figure 1. The plots were arranged in a randomized complete block design with three replicates (10 entries for panel 1921, 9 entries for panel 2021cc) and relevant commercial check cultivars (total individual plots = 126). The individual plot size was approximately 1.5 × 5.0 m.

2.2. Data Acquisition

The multispectral images were acquired using a quadrotor UAV, the AgBot™ (ATI/Aerial Technology International, Oregon City, OR, USA), mounted with a 1.2-megapixel RedEdge camera (MicaSense Inc., Seattle, WA, USA). The UAV flight mission was a single grid with 80% front and 70% side image overlap, flown at a speed of 2 m/s and an altitude of 20 m. These parameters were set using Mission Planner software (http://ardupilot.org/planner; accessed on 5 January 2022).
A DJI Phantom 4 Pro, another quadrotor UAV with a 20-megapixel onboard RGB camera (DJI Inc., Los Angeles, CA, USA), was used with a flight pattern similar to that of the AgBot™ system to collect RGB images, except that the flight plan was programmed using Pix4Dcapture (Pix4D S.A., Lausanne, Switzerland) and data were acquired at two flight altitudes, 10 and 20 m. During each flight, a 0.3 × 0.3 m white reference panel with 99% reflectance across the visible to near-infrared (NIR) spectral range (Spectralon® Diffuse Reflectance Targets, SRS-99-120, Labsphere Inc., North Sutton, NH, USA) was placed in the field for radiometric correction. A summary of the flight details is provided in Table 1.
After UAV-based data collection, the average canopy height of three representative plants from each plot and the harvested biomass (entire plants cut at ground level and weighed, referred to as fresh AGBM) from a 1.5 × 1.5 m subplot were collected as ground reference data. These ground reference data were collected when 50% of the plants in the plots were flowering (F50). Forage crops are typically harvested at F50, as the plants have accumulated maximum biomass, which translates to maximum productivity. Ground reference data (dry AGBM) were also acquired at physiological maturity (PM), when the dry plants were harvested at ground level from the remainder of each plot (also after UAV-based data collection).

2.3. Image Processing

RGB and multispectral images were preprocessed using the Pix4Dmapper photogrammetry software (Pix4D S.A., Lausanne, Switzerland) to derive an orthomosaic image (*.tif file) from each dataset. The software automatically utilized the scene illumination, reference panel, and sensor specifications to improve the radiometric quality of the orthomosaic images. These processes are required to create the surface reflectance imagery (*.tif file) from the multispectral camera and to construct VI images with consistent light quality for better data comparisons. Pix4Dmapper was also used to generate the DSM (*.tif file) and point cloud (*.las file) data from the RGB images. These three raw data types (orthomosaic, DSM, and point cloud) were processed to extract the digital traits, as presented in Figure 2a.

2.3.1. Vegetation Indices

The multispectral surface reflectance images were used to generate VIs, as described in Figure 2b, using Python 3 and the Rasterio library (https://rasterio.readthedocs.io/en/latest/#; accessed on 5 January 2022). Twelve VIs (Table 2), commonly used in agriculture and with reported potential to estimate AGBM productivity, were calculated.
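To make this step concrete, the following minimal sketch computes two of the indices in Table 2 (NDVI and NDRE) from a five-band surface reflectance orthomosaic with Rasterio and NumPy. The file names and band order are illustrative assumptions, not the exact configuration used in this study; the remaining indices in Table 2 follow the same band-math pattern.

```python
import numpy as np
import rasterio

# Assumed band order of the five-band reflectance orthomosaic (illustrative):
# 1 = blue, 2 = green, 3 = red, 4 = red edge, 5 = NIR
with rasterio.open("reflectance_orthomosaic.tif") as src:
    red = src.read(3).astype("float32")
    red_edge = src.read(4).astype("float32")
    nir = src.read(5).astype("float32")
    profile = src.profile

np.seterr(divide="ignore", invalid="ignore")  # tolerate masked/zero pixels
ndvi = (nir - red) / (nir + red)
ndre = (nir - red_edge) / (nir + red_edge)

# Write one VI layer back to a GeoTIFF for the later zonal extraction
profile.update(count=1, dtype="float32")
with rasterio.open("ndvi.tif", "w", **profile) as dst:
    dst.write(ndvi, 1)
```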
The soil mask layer was generated using the soil-adjusted vegetation index [46] and was applied to eliminate the soil surface from each VI image. The polygons defining each plot consisted of six subplots (1.35 × 0.75 m per subplot) and were digitized in a shapefile (*.shp) format using the open-source software Quantum GIS (QGIS, version 3.20.3). Two and four subplots covered the areas harvested at F50 and PM, respectively. The subplot polygons were shaped such that the area from which the canopy volume (CV) was estimated corresponded to the area from which the ground reference data (harvested AGBM) were acquired. The plot segmentation shapefiles were imported into the developed algorithm. Then, image features of each subplot within each VI image were extracted with the Python libraries NumPy (https://numpy.org/; accessed on 5 January 2022) and Rasterstats (https://pythonhosted.org/rasterstats/#; accessed on 5 January 2022). These features included the maximum (Max), average (Mean), sum, and standard deviation; the 95th, 90th, and 85th percentiles (the highest data value remaining after the top 5%, 10%, and 15% of the data were removed following the soil subtraction process, referred to as 95P, 90P, and 85P); the total area (number of pixels in a region of interest); and the crop area (number of pixels covered by the crop within a region of interest). The feature data extracted from each subplot (labeled during the plot segmentation creation process) were exported as a comma-separated values (CSV) file.
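As an illustration of this feature extraction, the sketch below uses Rasterstats’ zonal_stats to pull the per-subplot statistics from a soil-masked VI layer. The file names, nodata value, and the idea of deriving the total area from an unmasked companion layer are assumptions for illustration, not the study’s exact script.

```python
import csv
from rasterstats import zonal_stats

stats = ["max", "mean", "sum", "std",
         "percentile_95", "percentile_90", "percentile_85", "count"]

# Soil pixels were set to nodata by the SAVI mask, so "count" on the masked
# VI raster approximates the crop area (pixels), while "count" on an
# unmasked raster gives the total subplot area (pixels).
crop = zonal_stats("subplots.shp", "ndvi_soil_masked.tif",
                   stats=stats, nodata=-9999)
total = zonal_stats("subplots.shp", "ndvi_unmasked.tif",
                    stats=["count"], nodata=-9999)

with open("vi_features.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["subplot"] + stats + ["total_area"])
    for i, (c, t) in enumerate(zip(crop, total)):
        writer.writerow([i] + [c[s] for s in stats] + [t["count"]])
```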
The image feature data (e.g., Max, Mean, standard deviation, 95P, 90P, and 85P) were further analyzed to compute VI-based CV data (refer to Figure 2a) by multiplying each image feature by the crop coverage ratio (the ratio between crop area and total area). Although the new feature represented an area-weighted index value rather than a true volume, it was termed VI-based CV data in this study and used to estimate fresh AGBM [47,48]. These parameters were computed for the subplot area (the two subplots harvested at F50) and the whole plot area (all six subplots).
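Continuing from the hypothetical CSV in the previous sketch, the VI-based CV features reduce to an element-wise product of each image statistic and the crop coverage ratio:

```python
import pandas as pd

df = pd.read_csv("vi_features.csv")
coverage = df["count"] / df["total_area"]   # crop coverage ratio per subplot

# VI-based "canopy volume" feature: image statistic x coverage ratio
for stat in ["max", "mean", "std",
             "percentile_95", "percentile_90", "percentile_85"]:
    df[f"cv_{stat}"] = df[stat] * coverage

df.to_csv("vi_based_cv.csv", index=False)
```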
Table 2. Summary of the vegetation indices extracted in the study.

| Vegetation Index | Formulation | Reference |
|---|---|---|
| CIgr: Chlorophyll Index Green | (NIR/Green) − 1 | [49] |
| CIre: Chlorophyll Index Red Edge | (NIR/RedEdge) − 1 | [49] |
| EVI2: Enhanced Vegetation Index 2 | 2.5 × (NIR − Red)/(1 + NIR + 2.4 × Red) | [50] |
| GNDVI: Green Normalized Difference Vegetation Index | (NIR − Green)/(NIR + Green) | [51] |
| MCARI2: Modified Chlorophyll Absorption Ratio Index 2 | 1.5 × [2.5 × (NIR − Red) − 1.3 × (NIR − Green)]/√[(2 × NIR + 1)² − (6 × NIR − 5 × √Red) − 0.5] | [52] |
| MTVI2: Modified Triangular Vegetation Index 2 | 1.5 × [1.2 × (NIR − Green) − 2.5 × (Red − Green)]/√[(2 × NIR + 1)² − (6 × NIR − 5 × √Red) − 0.5] | [52] |
| NDRE: Normalized Difference Red Edge | (NIR − RedEdge)/(NIR + RedEdge) | [53] |
| NDVI: Normalized Difference Vegetation Index | (NIR − Red)/(NIR + Red) | [54] |
| NDWI: Normalized Difference Water Index | (Green − NIR)/(Green + NIR) | [55] |
| OSAVI: Optimized Soil-Adjusted Vegetation Index | (NIR − Red)/(NIR + Red + 0.16) | [56] |
| RDVI: Renormalized Difference Vegetation Index | (NIR − Red)/√(NIR + Red) | [57] |
| RGBVI: Red–Green–Blue Vegetation Index | (Green² − Blue × Red)/(Green² + Blue × Red) | [58] |

2.3.2. Canopy Height Model from Digital Surface Model

The DSM (height above mean sea level, m) from RGB imagery was used to create a canopy height model (CHM) to measure the crop height in each subplot before the height data were used to calculate the canopy volume, as shown in Figure 2c. The process started with constructing the digital terrain model (DTM), a representation of the terrain surface topography, using QGIS software. The software extracts the elevation and coordinate information of targeted points on the soil surface of the DSM. This information serves as input to the triangulated irregular network (TIN) algorithm to generate a DTM (*.tif file) of the field [16,47]. Then, a pixel-wise subtraction of the DTM from the DSM was performed to construct the CHM using Python 3. In this process, a threshold (height above ground level) of 0.05 m was applied to remove pixels lower than the threshold, including weeds and other noise. Similar to the VI data extraction process, the plot segmentation shapefile was applied to the noise-filtered CHM layer to acquire canopy height (CH) statistics for each subplot. The crop coverage data were multiplied by the CH data to acquire CV data for each subplot, and the CV data were then calculated for the subplot (two subplots) and whole plot (six subplots) areas.
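A minimal sketch of the CHM construction follows, assuming the DSM and DTM rasters share the same grid (as Pix4Dmapper and QGIS outputs from the same project normally do); the file names are placeholders.

```python
import rasterio

with rasterio.open("dsm.tif") as dsm_src, rasterio.open("dtm.tif") as dtm_src:
    dsm = dsm_src.read(1).astype("float32")
    dtm = dtm_src.read(1).astype("float32")
    profile = dsm_src.profile

chm = dsm - dtm            # canopy height above ground level (m)
chm[chm < 0.05] = 0.0      # drop pixels below the 0.05 m threshold
                           # (soil, weeds, and other noise)

profile.update(count=1, dtype="float32")
with rasterio.open("chm.tif", "w", **profile) as dst:
    dst.write(chm, 1)
```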

2.3.3. Three-Dimensional Reconstruction Model of Point Clouds

The point cloud data generated from the RGB imagery included non-pea materials such as the soil surface and weeds. The first processing step was therefore to separate the winter field pea plots from these objects (Figure 3a) prior to reconstructing a 3D model to calculate CV (Figure 3b).
The object-based image analysis (OBIA) technique was used on the orthomosaic RGB image to classify the pea plants versus all other objects. The OBIA-based classification was performed in QGIS software with the Orfeo ToolBox plugin (https://www.orfeo-toolbox.org/; accessed on 5 January 2022). The watershed segmentation algorithm, combined with supervised classification using a support vector machine classifier, was applied to segment and classify the objects in the winter field pea image, as described in [59]. This technique was chosen because it is simple and the classification problem was uncomplicated. However, other recently developed OBIA approaches, such as those based on saliency points and deep learning models [60,61,62], can also be applied to more complex object classification problems. A minimal Python sketch of the watershed-plus-SVM idea is given below.
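The classification itself was performed interactively in QGIS with the Orfeo ToolBox plugin; purely as an illustration, the sketch below reimplements the same idea with scikit-image and scikit-learn. The file name, marker thresholds, and training labels are placeholders rather than the study’s parameters.

```python
import numpy as np
import rasterio
from scipy import ndimage as ndi
from skimage.filters import sobel
from skimage.segmentation import watershed
from sklearn.svm import SVC

with rasterio.open("rgb_orthomosaic.tif") as src:         # hypothetical file
    rgb = src.read([1, 2, 3]).astype("float32") / 255.0   # (3, rows, cols)

# 1. Watershed segmentation on the gradient of a grayscale image
gray = rgb.mean(axis=0)
gradient = sobel(gray)
seeds = np.zeros_like(gray, dtype=np.int32)
seeds[gray < 0.25] = 1                    # illustrative dark seeds (shadow/soil)
seeds[gray > 0.60] = 1                    # illustrative bright seeds
segments = watershed(gradient, markers=ndi.label(seeds)[0])

# 2. Mean color per segment as a simple object-level feature
n_seg = segments.max()
features = np.array([rgb[:, segments == s].mean(axis=1)
                     for s in range(1, n_seg + 1)])

# 3. SVM classification of segments; in practice, labels come from
#    hand-digitized training polygons (vegetation, soil, shadow, other)
train_idx = np.arange(20)                 # placeholder training segments
train_lbl = np.tile([0, 1], 10)           # placeholder labels (1 = pea)
clf = SVC(kernel="rbf").fit(features[train_idx], train_lbl)
pred = clf.predict(features)
vegetation_mask = np.isin(segments, np.where(pred == 1)[0] + 1)
```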
The output of the OBIA step was a shapefile of four classified polygon types (vegetation (winter field pea crop), soil surface, shadow, and other objects). The vegetation polygons, composed of pea plants, were selected and converted into a separate shapefile. The point cloud data were then clipped with the vegetation shapefile to obtain the point clouds of the vegetation (*.las file)—mainly the winter field pea plots (Figure 3a). Next, the plot segmentation shapefiles were applied to the vegetation point clouds to obtain a single plot of winter field pea, utilizing the WhiteboxTools library (https://www.whiteboxgeo.com/; accessed on 5 January 2022) in a Python 3 script, as sketched below. CloudCompare software version 2.11.3 Anoia (https://www.danielgm.net/cc/; accessed on 5 January 2022) was occasionally used to check and remove outlier (extreme) points above and around the plots in this process.
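For the per-plot clipping, a sketch using the WhiteboxTools Python frontend might look like the following; the working directory and file names are hypothetical, and in practice the call would be looped over all plot polygons.

```python
import whitebox

wbt = whitebox.WhiteboxTools()
wbt.set_working_dir("/data/pea_trial")   # hypothetical working directory

# Clip the vegetation-only point cloud to a single plot polygon,
# producing one .las file per breeding plot.
wbt.clip_lidar_to_polygon(
    i="vegetation.las",        # vegetation point cloud from the OBIA step
    polygons="plot_045.shp",   # hypothetical single-plot polygon
    output="plot_045.las",
)
```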
The digital traits were extracted from the point clouds of individual winter field pea plots (Figure 3b). These single plots (45, 45, and 36 plots at Genesee, Garfield, and Pullman, respectively, at one growth stage) were loaded into an algorithm in MATLAB R2021b (The MathWorks, Inc., Natick, MA, USA). All points were projected on the X–Y plane, and the maximum CH (Max) and the CH at the 95th percentile (95H) were extracted from the point cloud positions along the Z axis (Figure 3b). The differences between the point cloud Z positions at the maximum and the 5th percentile (Max-5H) and between the 95th and 5th percentiles (95H-5H) were also calculated, which minimizes the effect of the inclination of the terrain where the plots were located. Before reconstructing the 3D model, additional points were created on the plot surface by interpolating within each polygon facet of an alpha shape model; here, alpha (α), a parameter that tunes how tightly the shape wraps around the points, was set at 0.5. These added points helped connect the model’s surface (Figure 3b).
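The height traits defined above reduce to percentile arithmetic on the point cloud’s Z coordinates. A small sketch with laspy and NumPy follows (the study’s implementation was in MATLAB, and the file name here is hypothetical):

```python
import laspy
import numpy as np

las = laspy.read("plot_045.las")   # hypothetical per-plot point cloud
z = np.asarray(las.z)

p5, p95 = np.percentile(z, [5, 95])
traits = {
    "Max": z.max(),           # tallest point (maximum CH)
    "95H": p95,               # CH at the 95th percentile
    "Max-5H": z.max() - p5,   # ranges that reduce the effect of
    "95H-5H": p95 - p5,       #   terrain inclination under the plot
}
```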
Three types of 3D reconstruction models—convex hull, concave hull, and alpha shape—were then constructed to compute the CV of each plot. For the convex hull, the extreme extents of the point cloud were connected to reconstruct a convex shape [63,64]. The concave hull and alpha shape are generalized convex hull models that allow the recovery of non-convex and non-connected regions. The alpha shape model, in addition, can be tuned with a parameter (alpha, α) to fit the shape around the points when modeling a non-convex region [39,65].
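As an open-source analogue of the MATLAB volume computation, the sketch below derives a convex hull volume with SciPy and an alpha shape (α = 1.5) volume with Open3D, which the authors list among the available point cloud tools. It reuses the coordinates from the laspy sketch above and is not the study’s exact implementation.

```python
import numpy as np
import open3d as o3d
from scipy.spatial import ConvexHull

xyz = np.column_stack([las.x, las.y, las.z])   # from the laspy sketch above

# Convex hull: smallest convex closure around all points
convex_volume = ConvexHull(xyz).volume

# Alpha shape (alpha = 1.5): a generalized hull that follows
# non-convex canopy boundaries more tightly
pcd = o3d.geometry.PointCloud()
pcd.points = o3d.utility.Vector3dVector(xyz)
mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_alpha_shape(
    pcd, alpha=1.5)
# get_volume() requires a watertight mesh; real data may need hole
# filling or outlier removal first
alpha_volume = mesh.get_volume() if mesh.is_watertight() else None
```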
Of the three techniques used to estimate fresh AGBM in field breeding trials, the VI- and DSM-based methods were applied as standard approaches. The 3D reconstruction model from the UAV-RGB point cloud was evaluated as a new approach that allows rapid data acquisition and is more cost-effective and efficient than other sensing systems, including LiDAR sensor systems and ground vehicle platforms integrated with sensors [66,67]. Moreover, since the biomass-related traits can be extracted directly using image processing and analysis, sophisticated machine learning approaches integrating multiple features [10,37,47] may not be required.

2.4. Data Analysis

Correlation analysis was performed between the digital traits (CH and CV) extracted from UAV imagery using the three techniques (VI, DSM, and 3D model) and the ground reference data (measured canopy height and harvested biomass). The digital traits extracted from RGB imagery at the two flying altitudes (10 and 20 m) and from subplots and whole plots were included in the analysis. Simple linear regression analyses were conducted between the two data sources (digital traits and ground reference data), and the coefficient of determination (R²) and root-mean-square error (RMSE) were calculated.
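These statistics reduce to a Pearson correlation and a simple linear fit; a minimal sketch with SciPy follows (the merged table of digital traits and ground reference values is hypothetical):

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("traits_vs_ground_reference.csv")  # hypothetical merged table
x, y = df["alpha_shape_cv"], df["fresh_agbm"]

r, p = stats.pearsonr(x, y)        # correlation between trait and reference

res = stats.linregress(x, y)       # simple linear regression
pred = res.intercept + res.slope * x
r2 = res.rvalue ** 2
rmse = float(((y - pred) ** 2).mean() ** 0.5)
print(f"r = {r:.2f} (p = {p:.3g}), R2 = {r2:.2f}, RMSE = {rmse:.2f}")
```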

3. Results and Discussion

3.1. Ground Reference Data

Ground reference data in the form of canopy height and harvested AGBM were acquired at 50% flowering and at physiological maturity. As shown in Figure 4, the average canopy height at F50 was around 0.65 m for the Genesee and Garfield trials and 0.80 m for the Pullman trial. The average fresh AGBM in the Pullman trial (4.01 kg/m²) was higher than in the Genesee and Garfield trials (1.07–1.36 kg/m²). This difference can be attributed to the differences in genotypes between the Genesee and Garfield trials (panel 1921) and the Pullman trial (panel 2021cc): the Pullman trial contained genotypes specifically bred to maximize biomass, while the Genesee and Garfield trials contained genotypes bred for seed yield.

3.2. Canopy Height Estimation

3.2.1. DSM-Based Technique for Canopy Height Estimation

The canopy height data extracted at the two flying altitudes and using the two processing techniques (DSM and point clouds) were compared with the ground reference data, as shown in Figure 5a. The 95th-percentile CH from the DSM-based technique showed consistent and high correlations (r = 0.71–0.95, p < 0.001) with the ground reference data. The correlation coefficients for CH estimated at the two flight altitudes were very similar, with the CH estimates from the whole plots being slightly higher than those from the subplots. This is reasonable, as the ground reference CH was measured and averaged from plants throughout the entire plot.
The coefficient of determination for whole-plot CH (95th percentile) at 20 m was high for the Genesee (R² = 0.87) and Pullman (R² = 0.86) trials and moderate for the Garfield (R² = 0.60) trial (Figure 5b). The moderate estimate at the Garfield trial was due to a few outlier data points. However, the RMSE in CH estimation for the Garfield trial was 0.07 m, similar to that of the other two trials (0.05–0.06 m), suggesting high accuracy and repeatability.
The accuracy of DSM-based CH estimation relies on various factors, including the complexity of the visible surface, image resolution and radiometric depth, sun–object–sensor geometry, and sensor type [38,68]. The DTM layer also affects the accuracy of CH estimation. In Pix4Dmapper, a DTM is generated by classifying the point cloud; a reliable DTM layer therefore requires the sensor to capture the bare earth surface with good contrast between object heights and the ground [69,70]. The DTM in this study was instead constructed using the TIN algorithm, as the RGB camera could not penetrate the winter field pea canopy to capture enough bare-earth data for a reliable model. This approach relied on a clean and sufficiently wide inter-plot area (bare earth surface) to facilitate the extraction of elevation information for the TIN algorithm, which may have influenced the results.

3.2.2. Point-Cloud-Based Technique for Canopy Height Estimation

The CH extracted using the point-cloud-based technique (Figure 5a) had a consistent and high correlation (r = 0.79–0.92, p < 0.001) with the ground reference data for all three trials. Overall, the correlation coefficients at the two flying altitudes were comparable to those of the DSM-based CH estimates.
The extracted CH digital traits (e.g., 95H versus Max-5H) from the Genesee and Pullman point cloud datasets showed differences, which could be an effect of the plot segmentation process used to create a noise-free single winter field pea plot. This process affects the extraction of the highest and lowest point cloud positions (Figure 3a), thus impacting the digital traits. The vegetation point clouds in this study were generated from a very complex surface (winter field pea is an herbaceous legume with roundish hollow stems) with micro-relief height variation. Thus, off-position points (outside the defined planted plot area)—resulting from canopy vigor and shape as well as limitations of the sensor and the SfM algorithm—were removed to maintain the shape of the winter field pea plot during 3D model reconstruction (Figure 3a). Another potential reason for the difference in digital CH within the Genesee and Pullman trials could be noise from the soil surface, as the OBIA technique did not entirely remove soil surface points. The inclusion of some soil surface points during 3D model reconstruction was intentional, to create a layer under and around the plot boundary that serves as a base for capturing the 3D model structure. These factors might have affected the accuracy of the extracted CH data, since the CH data were calculated from the range of point cloud positions (Figure 3b). A similar issue was described in [67].

3.3. Fresh AGBM Estimation

3.3.1. Vegetation-Index-Based Technique

Canopy volume was estimated using the VI, DSM, and 3D model approaches (Figure 2a), and its relationship with the ground reference data at the F50 stage was examined. One of the major contributors to the VI data is the chlorophyll content of the crop leaves and stems, which can be an indicator of crop biomass and vigor. Thus, the VI-based CV data were compared with the ground reference data (fresh AGBM).
From Figure 6a, overall, the 90th percentile and mean of NDRE had high correlations with the fresh AGBM for all three trials (r = 0.55–0.86, p < 0.001). In the Genesee and Garfield trials, the 90th-percentile and mean CIre and NDRE demonstrated high correlations with fresh AGBM (r = 0.66–0.88, p < 0.001). Similarly, in the Pullman trial, both CIre and NDRE were correlated with fresh AGBM (r = 0.55–0.66, p < 0.001), although not as strongly as in the other two trials. The difference in crop type (panels 1921 and 2021cc) and the associated crop morphological characteristics (e.g., leaf morphology) could have contributed to these results. A similar effect of crop morphology on canopy reflectance, resulting in variable VI-based CV estimation accuracy, was also reported in [10].
The correlation coefficients between fresh AGBM and VI-based CV extracted from the subplots were generally higher than those from the whole plots in all trials. These results are expected because the digital traits extracted from the subplots relate directly to the harvested biomass area. The 90th-percentile and mean CIre and NDRE extracted from the subplots were highly correlated (r = 0.60–0.88, p < 0.001) with the ground reference data in the three trials. Other digital traits from the subplots—the 90th percentile and mean of EVI2, MCARI2, MTVI2, NDVI, and OSAVI—were also correlated with fresh AGBM (r = ~0.60–0.80, p < 0.001). CIre and NDRE are both red-edge-based vegetation indices. Other studies have also found that red-edge-based indices have significant relationships with plant biomass, as this spectral region is sensitive to chlorophyll absorption [38,47,48,49,71,72].

3.3.2. DSM-Based Technique

Canopy volume for each plot was computed by multiplying the CH and the canopy coverage area, both estimated from the CHM. The CVs were significantly correlated with fresh AGBM (r = 0.57–0.86, p < 0.001), as shown in Figure 6a. Between experiments, the correlations between CV and fresh AGBM were higher in Genesee and Garfield (r = 0.71–0.86, p < 0.001) than in Pullman (r = 0.57–0.64, p < 0.001), which could be due to canopy morphological differences.
The correlation coefficients in the subplots were slightly higher than those from the whole plots, similar to the VI-based CV approach. Between the two flying altitudes, the results were not significantly different, in contrast to the results reported in other studies [73,74]. This could be associated with the speed of the UAV during data acquisition: a low speed (2 m/s) translates to high-quality images with high overlap, providing accurate object positions. The UAV flight mission parameters, such as altitude, speed, and overlap, determine the quality of the data, and spatial resolution and spectral discrimination need to be balanced to achieve the desired data accuracy [75,76].

3.3.3. Three-Dimensional Model-Based Technique

RGB point clouds were utilized to reconstruct three types of 3D models to estimate fresh AGBM (Figure 3b) for comparison with the ground reference data. In general, the alpha-shape-based (α = 1.5) CV provided a more consistent and higher correlation (r = 0.78–0.81, p < 0.001) with the fresh AGBM than the other 3D models for the three trials at both flying altitudes (Figure 6a). Linear regression (Figure 6b) between the ground reference data and the alpha-shape-based (α = 1.5) CV data at 20 m flying altitude indicated an acceptable accuracy for fresh AGBM estimation.
For the 3D model approach, both the alpha shape (α = 1.5) and convex hull volumes demonstrated higher correlations (r = 0.79–0.81, p < 0.001) with the fresh AGBM in the Genesee and Garfield trials than in the Pullman trial (r = 0.66–0.78, p < 0.001). In general, the convex hull computes the space based on the smallest convex closure containing all the given points; the method therefore includes unoccupied space inside the closure in the volume estimate. In the Garfield and Genesee trials, where crops were shorter and denser, the convex hull formed a smooth layer around the crops (low variation in plant height within a plot), giving a good CV estimate that corresponds to AGBM. However, in the Pullman trial, which had taller plants and more variability within a plot, the convex hull included the unoccupied space [39,77] between the tall plants (overestimating the volume within the gaps), resulting in a lower correlation with the ground reference AGBM.
For this reason, the alpha-shape-based (α = 1.5) canopy volume may have a better relationship with the fresh AGBM in the Pullman trial, as the alpha shape is a generalized convex hull method that allows adjustment of the constant alpha (α) to optimize the tightness of the shape around the 3D point cloud. In this study, the RGB images used to construct point clouds were collected in a single grid with 80% front and 70% side overlap at a camera inclination of 90° (nadir). The number of points and the accuracy of their positions could be further improved by using a double grid pattern, different camera inclinations, and a higher percentage of image overlap [44,78,79].

3.4. Dry AGBM Estimation

UAV imagery at the PM stage was processed using two techniques—DSM and the 3D point cloud model—to extract digital CV traits and compare them with dry AGBM. Figure 7a shows that CV from the DSM-based technique had a moderate to high correlation (r = 0.44–0.85, p < 0.001) with the ground reference data for the Genesee and Garfield trials and a low correlation for the Pullman trial. Figure 8 illustrates a potential factor contributing to the low correlation in the Pullman trial: some genotypes lodged and extended beyond the plot boundary. Significant lodging was also observed in the Genesee trial.
The CV estimated from the 3D model-based approach with the alpha shape (α = 1.5) provided a more consistent and higher correlation (r = 0.70–0.81, p < 0.001) with dry AGBM than the other 3D models for all trials (Figure 7a). Regression between the alpha shape (α = 1.5) canopy volume estimated at 20 m flying altitude and the ground reference data resulted in moderate estimates (Figure 7b).

4. Summary and Conclusions

Both the DSM and point cloud data were successful in estimating canopy height at 20 m flight altitude, with the DSM technique providing higher and more stable correlations with the ground reference data. All three approaches (VI, DSM, and 3D model) demonstrated the feasibility of using UAV-mounted RGB or multispectral cameras to measure AGBM, especially fresh AGBM. The point cloud data generated from RGB images collected at 20 m flying altitude, combined with the 3D model technique (the alpha shape with α = 1.5), provided high and consistent correlations with both fresh and dry AGBM across trials, although the processing of these datasets was complex. The VI- and DSM-based approaches also provided high correlations with fresh AGBM, but their use should be matched to the crop type, size, and shape (e.g., in the selection of VIs). Both digital traits (CH and CV) had higher correlations with the ground reference data (measured CH and harvested AGBM) when they were extracted from the same areas (i.e., the subplots in this study).
In this study, the number and precision of points from UAV-based RGB imagery might be lower than those generated by a LiDAR system; however, UAV-based RGB imagery is inexpensive in comparison. The point cloud quality can also be improved by optimizing the UAV flight mission parameters, as described above. Various open-source software packages can be used to process point clouds from UAV-based RGB imagery. For example, OpenDroneMap can construct point clouds and DSMs from RGB imagery (https://www.opendronemap.org/; accessed on 5 January 2022). Other solutions for visualizing and manipulating point clouds include QGIS (from version 3.18) and CloudCompare, as well as the Python libraries WhiteboxTools and Open3D (http://www.open3d.org/; accessed on 5 January 2022). These resources may significantly improve the opportunities for the research community to analyze point clouds derived from UAV-based RGB imagery.
In summary, these results offer an alternative, resource-efficient, rapid, non-destructive approach for plant scientists and growers to estimate AGBM. Further research will focus on data analysis and incorporating multivariate traits, including factors to help explain the interactions between genotype, environment, and management (G × E × M), to increase prediction accuracy [22,38,47,48] and to provide informed decisions for breeding programs.

Author Contributions

Conceptualization, W.S. and S.S.; methodology, W.S. and S.S.; software, W.S. and S.S.; validation, W.S., S.S. and R.J.M.; formal analysis, W.S.; investigation, W.S., S.S. and R.J.M.; resources, S.S. and R.J.M.; data curation, W.S.; writing—original draft preparation, W.S.; writing—review and editing, S.S. and R.J.M.; visualization, W.S.; supervision, S.S. and R.J.M.; project administration, S.S. and R.J.M.; funding acquisition, S.S. and R.J.M. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by Washington State University’s Center for Sustaining Agriculture and Natural Resources BioAg Program (project ID 184), and the US Department of Agriculture’s National Institute of Food and Agriculture (USDA-NIFA) hatch project (accession number 1014919).

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors would like to thank Mary A. Lauver for the ground reference data. We would also like to thank Chongyuan Zhang for his support during the UAV data collection.

Conflicts of Interest

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

References

1. Fraser, M.D.; Fychan, R.; Jones, R. The effect of harvest date and inoculation on the yield, fermentation characteristics and feeding value of forage pea and field bean silages. Grass Forage Sci. 2001, 56, 218–230.
2. Chen, C.; Miller, P.; Muehlbauer, F.; Neill, K.; Wichman, D.; McPhee, K. Winter pea and lentil response to seeding date and micro- and macro-environments. Agron. J. 2006, 98, 1655–1663.
3. Clark, A. Managing Cover Crops Profitably, 3rd ed.; Sustainable Agriculture Network: Beltsville, MD, USA, 2008.
4. Tulbek, M.C.; Lam, R.S.H.; Wang, Y.C.; Asavajaru, P.; Lam, A. Pea: A Sustainable Vegetable Protein Crop. In Sustainable Protein Sources; Nadathur, S.R., Wanasundara, J.P.D., Scanlin, L., Eds.; Elsevier Inc.: Amsterdam, The Netherlands, 2017; pp. 145–164.
5. Steinfeld, H.; Gerber, P.; Wassenaar, T.D.; Castel, V.; Rosales, M.; de Haan, C. Livestock’s Long Shadow: Environmental Issues and Options; Food and Agriculture Organization: Rome, Italy, 2006.
6. Gerber, P.J.; Steinfeld, H.; Henderson, B.; Mottet, A.; Opio, C.; Dijkman, J.; Falcucci, A.; Tempio, G. Tackling Climate Change through Livestock: A Global Assessment of Emissions and Mitigation Opportunities; Food and Agriculture Organization: Rome, Italy, 2013.
7. Annicchiarico, P.; Russi, L.; Romani, M.; Pecetti, L.; Nazzicari, N. Farmer-participatory vs. conventional market-oriented breeding of inbred crops using phenotypic and genome-enabled approaches: A pea case study. Field Crops Res. 2019, 232, 30–39.
8. Insua, J.R.; Utsumi, S.A.; Basso, B. Estimation of spatial and temporal variability of pasture growth and digestibility in grazing rotations coupling unmanned aerial vehicle (UAV) with crop simulation models. PLoS ONE 2019, 14, e0212773.
9. Ligoski, B.; Gonçalves, L.F.; Claudio, F.L.; Alves, E.M.; Krüger, A.M.; Bizzuti, B.E.; Lima, P.D.M.T.; Abdalla, A.L.; Paim, T.D.P. Silage of intercropping corn, palisade grass, and pigeon pea increases protein content and reduces in vitro methane production. Agronomy 2020, 10, 1784.
10. Quirós Vargas, J.J.; Zhang, C.; Smitchger, J.A.; McGee, R.J.; Sankaran, S. Phenotyping of plant biomass and performance traits using remote sensing techniques in pea (Pisum sativum, L.). Sensors 2019, 19, 2031.
11. Furbank, R.T.; Tester, M. Phenomics–technologies to relieve the phenotyping bottleneck. Trends Plant Sci. 2011, 16, 635–644.
12. Cobb, J.N.; DeClerck, G.; Greenberg, A.; Clark, R.; McCouch, S. Next-generation phenotyping: Requirements and strategies for enhancing our understanding of genotype–phenotype relationships and its relevance to crop improvement. Theor. Appl. Genet. 2013, 126, 867–887.
13. Maesano, M.; Khoury, S.; Nakhle, F.; Firrincieli, A.; Gay, A.; Tauro, F.; Harfouche, A. UAV-based LiDAR for high-throughput determination of plant height and above-ground biomass of the bioenergy grass Arundo donax. Remote Sens. 2020, 12, 3464.
14. Jung, J.; Maeda, M.; Chang, A.; Bhandari, M.; Ashapure, A.; Landivar-Bowles, J. The potential of remote sensing and artificial intelligence as tools to improve the resilience of agriculture production systems. Curr. Opin. Biotechnol. 2021, 70, 15–22.
15. Li, D.; Quan, C.; Song, Z.; Li, X.; Yu, G.; Li, C.; Muhammad, A. High-throughput plant phenotyping platform (HT3P) as a novel tool for estimating agronomic traits from the lab to the field. Front. Bioeng. Biotechnol. 2021, 8, 1533.
16. Ortiz, M.V.; Sangjan, W.; Selvaraj, M.G.; McGee, R.J.; Sankaran, S. Effect of the solar zenith angles at different latitudes on estimated crop vegetation indices. Drones 2021, 5, 80.
17. Zhang, C.; Serra, S.; Quirós Vargas, J.; Sangjan, W.; Musacchi, S.; Sankaran, S. Non-invasive sensing techniques to phenotype multiple apple tree architectures. Inf. Process. Agric. 2021; in press.
18. De Jesus Colwell, F.; Souter, J.; Bryan, G.J.; Compton, L.J.; Boonham, N.; Prashar, A. Development and validation of methodology for estimating potato canopy structure for field crop phenotyping and improved breeding. Front. Plant Sci. 2021, 12, 139.
19. Guo, W.; Carroll, M.E.; Singh, A.; Swetnam, T.L.; Merchant, N.; Sarkar, S.; Singh, A.K.; Ganapathysubramanian, B. UAS-based plant phenotyping for research and breeding applications. Plant Phenom. 2021, 2021, 9840192.
20. Sangjan, W.; Sankaran, S. Phenotyping architecture traits of tree species using remote sensing techniques. Trans. ASABE 2021, 64, 1611–1624.
21. Zhang, H.; Wang, L.; Tian, T.; Yin, J. A review of unmanned aerial vehicle low-altitude remote sensing (UAV-LARS) use in agricultural monitoring in China. Remote Sens. 2021, 13, 1221.
22. Tilly, N.; Aasen, H.; Bareth, G. Fusion of plant height and vegetation indices for the estimation of barley biomass. Remote Sens. 2015, 7, 11449–11480.
23. Wengert, M.; Piepho, H.P.; Astor, T.; Graß, R.; Wijesingha, J.; Wachendorf, M. Assessing spatial variability of barley whole crop biomass yield and leaf area index in silvoarable agroforestry systems using UAV-borne remote sensing. Remote Sens. 2021, 13, 2751.
24. Sankaran, S.; Zhou, J.; Khot, L.R.; Trapp, J.J.; Mndolwa, E.; Miklas, P.N. High-throughput field phenotyping in dry bean using small unmanned aerial vehicle based multispectral imagery. Comput. Electron. Agric. 2018, 151, 84–92.
25. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629.
26. Wan, L.; Zhang, J.; Dong, X.; Du, X.; Zhu, J.; Sun, D.; Liu, Y.; He, Y.; Cen, H. Unmanned aerial vehicle-based field phenotyping of crop biomass using growth traits retrieved from PROSAIL model. Comput. Electron. Agric. 2021, 187, 106304.
27. Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244.
28. Roy Choudhury, M.; Das, S.; Christopher, J.; Apan, A.; Chapman, S.; Menzies, N.W.; Dang, Y.P. Improving biomass and grain yield prediction of wheat genotypes on sodic soil using integrated high-resolution multispectral, hyperspectral, 3D point cloud, and machine learning techniques. Remote Sens. 2021, 13, 3482.
29. Schönberger, J.L.; Frahm, J. Structure-from-Motion revisited. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 4104–4113.
30. Thompson, A.L.; Thorp, K.R.; Conley, M.M.; Elshikha, D.M.; French, A.N.; Andrade-Sanchez, P.; Pauli, D. Comparing nadir and multi-angle view sensor technologies for measuring in-field plant height of upland cotton. Remote Sens. 2019, 11, 700.
31. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating above-ground biomass of maize using features derived from UAV-based RGB imagery. Remote Sens. 2019, 11, 1261.
32. Gilliot, J.M.; Michelin, J.; Hadjard, D.; Houot, S. An accurate method for predicting spatial variability of maize yield from UAV-based plant height estimation: A tool for monitoring agronomic field experiments. Precis. Agric. 2021, 22, 897–921.
33. Acorsi, M.G.; das Dores Abati Miranda, F.; Martello, M.; Smaniotto, D.A.; Sartor, L.R. Estimating biomass of black oat using UAV-based RGB imaging. Agronomy 2019, 9, 344.
34. Peprah, C.O.; Yamashita, M.; Yamaguchi, T.; Sekino, R.; Takano, K.; Katsura, K. Spatio-temporal estimation of biomass growth in rice using canopy surface model from unmanned aerial vehicle images. Remote Sens. 2021, 13, 2388.
35. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.W.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation index weighted canopy volume model (CVMVI) for soybean biomass estimation from unmanned aerial system-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41.
36. Toda, Y.; Kaga, A.; Kajiya-Kanegae, H.; Hattori, T.; Yamaoka, S.; Okamoto, M.; Tsujimoto, H.; Iwata, H. Genomic prediction modeling of soybean biomass using UAV-based remote sensing and longitudinal model parameters. Plant Genome 2021, 14, e20157.
37. Yue, J.; Yang, G.; Li, C.; Li, Z.; Wang, Y.; Feng, H.; Xu, B. Estimation of winter wheat above-ground biomass using unmanned aerial vehicle-based snapshot hyperspectral sensor and crop height improved models. Remote Sens. 2017, 9, 708.
38. Banerjee, B.P.; Spangenberg, G.; Kant, S. Fusion of spectral and structural information from aerial images for improved biomass estimation. Remote Sens. 2020, 12, 3164.
39. Di Gennaro, S.F.; Matese, A. Evaluation of novel precision viticulture tool for canopy biomass estimation and missing plant detection based on 2.5D and 3D approaches using RGB images acquired by UAV platform. Plant Methods 2020, 16, 91.
40. Reddersen, B.; Fricke, T.; Wachendorf, M. A multi-sensor approach for predicting biomass of extensively managed grassland. Comput. Electron. Agric. 2014, 109, 247–260.
41. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating biomass of barley using crop surface models (CSMs) derived from UAV-based RGB imaging. Remote Sens. 2014, 6, 10395–10412.
42. Rogers, S.R.; Manning, I.; Livingstone, W. Comparing the spatial accuracy of digital surface models from four unoccupied aerial systems: Photogrammetry versus LiDAR. Remote Sens. 2020, 12, 2806.
43. Dong, X.; Kim, W.Y.; Lee, K.H. Drone-based three-dimensional photogrammetry and concave hull by slices algorithm for apple tree volume mapping. J. Biosyst. Eng. 2021, 46, 474–484.
44. Kothawade, G.S.; Chandel, A.K.; Schrader, M.J.; Rathnayake, A.P.; Khot, L.R. High throughput canopy characterization of a commercial apple orchard using aerial RGB imagery. In Proceedings of the 2021 IEEE International Workshop on Metrology for Agriculture and Forestry (MetroAgriFor), Trento-Bolzano, Italy, 3–5 November 2021; pp. 177–181.
45. Qi, Y.; Dong, X.; Chen, P.; Lee, K.H.; Lan, Y.; Lu, X.; Jia, R.; Deng, J.; Zhang, Y. Canopy volume extraction of Citrus reticulate Blanco cv. Shatangju trees using UAV image-based point cloud deep learning. Remote Sens. 2021, 13, 3437.
46. Huete, A.R. A soil-adjusted vegetation index (SAVI). Remote Sens. Environ. 1988, 25, 295–309.
47. Jiang, Q.; Fang, S.; Peng, Y.; Gong, Y.; Zhu, R.; Wu, X.; Ma, Y.; Duan, B.; Liu, J. UAV-based biomass estimation for rice-combining spectral, TIN-based structural and meteorological features. Remote Sens. 2019, 11, 890.
48. Tefera, A.T.; Banerjee, B.P.; Pandey, B.R.; James, L.; Puri, R.R.; Cooray, O.; Marsh, J.; Richards, M.; Kant, S.; Fitzgerald, G.J.; et al. Estimating early season growth and biomass of field pea for selection of divergent ideotypes using proximal sensing. Field Crops Res. 2022, 277, 108407.
49. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282.
50. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845.
51. Gitelson, A.A.; Merzlyak, M.N. Remote sensing of chlorophyll concentration in higher plant leaves. Adv. Space Res. 1998, 22, 689–692.
52. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352.
53. Gitelson, A.; Merzlyak, M.N. Quantitative estimation of chlorophyll-a using reflectance spectra: Experiments with autumn chestnut and maple leaves. J. Photochem. Photobiol. B Biol. 1994, 22, 247–252.
54. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the Great Plains with ERTS. NASA Spec. Publ. 1974, 351, 309.
55. McFeeters, S.K. The use of the normalized difference water index (NDWI) in the delineation of open water features. Int. J. Remote Sens. 1996, 17, 1425–1432.
56. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
57. Roujean, J.L.; Breon, F.M. Estimating PAR absorbed by vegetation from bidirectional reflectance measurements. Remote Sens. Environ. 1995, 51, 375–384.
58. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87.
59. Duarte, L.; Silva, P.; Teodoro, A.C. Development of a QGIS plugin to obtain parameters and elements of plantation trees and vineyards with aerial photographs. ISPRS Int. J. Geo-Inf. 2018, 7, 109.
60. Li, C.; Luo, B.; Hong, H.; Su, X.; Wang, Y.; Liu, J.; Wang, C.; Zhang, J.; Wei, L. Object detection based on global-local saliency constraint in aerial images. Remote Sens. 2020, 12, 1435.
61. Jian, M.; Wang, J.; Yu, H.; Wang, G.; Meng, X.; Yang, L.; Dong, J.; Yin, Y. Visual saliency detection by integrating spatial position prior of object with background cues. Expert Syst. Appl. 2021, 168, 114219.
62. Huyan, L.; Bai, Y.; Li, Y.; Jiang, D.; Zhang, Y.; Zhou, Q.; Wei, J.; Liu, J.; Zhang, Y.; Cui, T. A lightweight object detection framework for remote sensing images. Remote Sens. 2021, 13, 683.
63. Chakraborty, M.; Khot, L.R.; Sankaran, S.; Jacoby, P.W. Evaluation of mobile 3D light detection and ranging based canopy mapping system for tree fruit crops. Comput. Electron. Agric. 2019, 158, 284–293.
64. Saarinen, N.; Calders, K.; Kankare, V.; Yrttimaa, T.; Junttila, S.; Luoma, V.; Huuskonen, S.; Hynynen, J.; Verbeeck, H. Understanding 3D structural complexity of individual Scots pine trees with different management history. Ecol. Evol. 2021, 11, 2561–2572.
65. Yan, Z.; Liu, R.; Cheng, L.; Zhou, X.; Ruan, X.; Xiao, Y. A concave hull methodology for calculating the crown volume of individual trees based on vehicle-borne LiDAR data. Remote Sens. 2019, 11, 623.
66. Zhang, F.; Hassanzadeh, A.; Kikkert, J.; Pethybridge, S.J.; van Aardt, J. Comparison of UAS-based structure-from-motion and LiDAR for structural characterization of short broadacre crops. Remote Sens. 2021, 13, 3975.
67. Jiang, Y.; Li, C.; Paterson, A.H.; Sun, S.; Xu, R.; Robertson, J. Quantitative analysis of cotton canopy size in field conditions using a consumer-grade RGB-D camera. Front. Plant Sci. 2018, 8, 2233.
68. Fonstad, M.A.; Dietrich, J.T.; Courville, B.C.; Jensen, J.L.; Carbonneau, P.E. Topographic structure from motion: A new development in photogrammetric measurement. Earth Surf. Processes Landf. 2013, 38, 421–430.
69. Jensen, J.L.; Mathews, A.J. Assessment of image-based point cloud products to generate a bare earth surface and estimate canopy heights in a woodland ecosystem. Remote Sens. 2016, 8, 50.
70. Salach, A.; Bakuła, K.; Pilarska, M.; Ostrowski, W.; Górski, K.; Kurczyński, Z. Accuracy assessment of point clouds from LiDAR and dense image matching acquired using the UAV platform for DTM creation. ISPRS Int. J. Geo-Inf. 2018, 7, 342.
71. Kanke, Y.; Tubana, B.; Dalen, M.; Harrell, D. Evaluation of red and red-edge reflectance-based vegetation indices for rice biomass and grain yield prediction models in paddy fields. Precis. Agric. 2016, 17, 507–530.
72. Cheng, T.; Song, R.; Li, D.; Zhou, K.; Zheng, H.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Spectroscopic estimation of biomass in canopy components of paddy rice using dry matter and chlorophyll indices. Remote Sens. 2017, 9, 319.
73. Peña, J.M.; Torres-Sánchez, J.; Serrano-Pérez, A.; De Castro, A.I.; López-Granados, F. Quantifying efficacy and limits of unmanned aerial vehicle (UAV) technology for weed seedling detection as affected by sensor resolution. Sensors 2015, 15, 5609–5626.
74. Jin, X.; Liu, S.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of plant density of wheat crops at emergence from very low altitude UAV imagery. Remote Sens. Environ. 2017, 198, 105–114.
75. Gómez-Candón, D.; De Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56.
76. Mesas-Carrascosa, F.J.; Torres-Sánchez, J.; Clavero-Rumbao, I.; García-Ferrer, A.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. Assessing optimal flight parameters for generating accurate multispectral orthomosaicks by UAV to support site-specific crop management. Remote Sens. 2015, 7, 12793–12814.
77. Jiang, Y.; Li, C.; Takeda, F.; Kramer, E.A.; Ashrafi, H.; Hunter, J. 3D point cloud data to quantitatively characterize size and shape of shrub crops. Hortic. Res. 2019, 6, 43.
78. Kuželka, K.; Slavík, M.; Surový, P. Very high density point clouds from UAV laser scanning for automatic tree stem detection and direct diameter measurement. Remote Sens. 2020, 12, 1236.
79. Moreira, B.M.; Goyanes, G.; Pina, P.; Vassilev, O.; Heleno, S. Assessment of the influence of survey design and processing choices on the accuracy of tree diameter at breast height (DBH) measurements using UAV-based photogrammetry. Drones 2021, 5, 43.
Figure 1. Region of the study area and locations of the three breeding trials in the Palouse region: Genesee, ID (2018–2019), Garfield, WA (2018–2019), and Pullman, WA (2019–2020), USA. Data Source: US Census Bureau; Cartographer: Worasit Sangjan; Date: 10 January 2022.
Figure 2. Image processing pipelines: (a) flowchart of the overall pipelines used to extract digital traits in this study; (b) the VI data extraction pipeline; (c) the DSM-based canopy height extraction pipeline. SAVI: soil-adjusted vegetation index; OBIA: object-based image analysis; NIR: near-infrared; RE: red edge; CIre: chlorophyll index red edge.
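As a rough illustration of the VI pipeline in Figure 2b, the sketch below computes SAVI and CIre rasters from a five-band multispectral orthomosaic and reduces them to plot-level statistics. It is a minimal example, not the study's actual code: the file name, band order, soil-adjustment factor (L = 0.5), and vegetation threshold are all assumptions.

```python
# Minimal sketch: SAVI and CIre from a five-band orthomosaic (rasterio + numpy).
import numpy as np
import rasterio

with rasterio.open("multispectral_orthomosaic.tif") as src:  # hypothetical path
    red = src.read(3).astype("float64")        # assumed band order: B, G, R, RE, NIR
    red_edge = src.read(4).astype("float64")
    nir = src.read(5).astype("float64")

L = 0.5       # soil-adjustment factor often used for intermediate canopy cover
eps = 1e-10   # guard against division by zero on background pixels

# Soil-adjusted vegetation index: SAVI = (1 + L)(NIR - Red) / (NIR + Red + L)
savi = (1 + L) * (nir - red) / (nir + red + L + eps)

# Chlorophyll index red edge: CIre = NIR / RE - 1
cire = nir / (red_edge + eps) - 1.0

# Plot-level digital traits: mean and 90th percentile over vegetation pixels
veg = savi > 0.3  # assumed SAVI threshold separating canopy from soil
print("mean SAVI:", savi[veg].mean(), "90th pct CIre:", np.percentile(cire[veg], 90))
```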
Figure 3. Image processing pipeline for the 3D reconstruction model: (a) pipeline used to segment the point clouds of an individual winter field pea plot; (b) pipeline used to reconstruct 3D models to extract canopy volume.
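The canopy volume step in Figure 3b can be sketched as follows for a single, already-segmented plot point cloud. The convex hull volume comes from scipy and the alpha-shape mesh from Open3D; the input file, the near-ground cutoff, and the alpha value are illustrative assumptions, not the published pipeline.

```python
# Minimal sketch: plot canopy volume from a segmented point cloud via
# convex hull (scipy) and alpha-shape mesh (Open3D).
import numpy as np
import open3d as o3d
from scipy.spatial import ConvexHull

pcd = o3d.io.read_point_cloud("plot_points.ply")  # hypothetical segmented plot
pts = np.asarray(pcd.points)
canopy = pts[pts[:, 2] > 0.05]  # drop near-ground points (assumed 5 cm cutoff)

# Convex hull: simple, but tends to overestimate volume for irregular canopies
hull_volume = ConvexHull(canopy).volume

# Alpha shape: tighter fit to the canopy surface; smaller alpha hugs the points more
canopy_pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(canopy))
mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_alpha_shape(canopy_pcd, alpha=1.5)
if mesh.is_watertight():  # volume is only defined for a closed mesh
    alpha_volume = mesh.get_volume()
    print(f"convex hull: {hull_volume:.3f} m^3, alpha shape: {alpha_volume:.3f} m^3")
```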
Figure 4. Ground reference data from the three trials at Genesee, Garfield, and Pullman: (a) canopy height; (b) fresh above-ground biomass; (c) dry above-ground biomass. µ: mean; σ: standard deviation; AGBM: above-ground biomass.
Figure 5. Relationships between ground reference data and digital canopy height traits (at the F50 stage) extracted from UAV imagery from the three trials at Genesee, Garfield, and Pullman: (a) correlation coefficients based on digital traits extracted from DSM and point cloud data; (b) linear regression between DSM-based 95th-percentile CH and ground reference data. CH: canopy height; DSM technique—Max: maximum CH; 95P: 95th-percentile CH; 90P: 90th-percentile CH; 85P: 85th-percentile CH; Mean: average CH; point cloud technique—Max: CH at the maximum height; 95H: CH at the 95th-percentile height; Max-5H: CH difference between the maximum and the 5th-percentile height; 95H-5H: CH difference between the 95th- and 5th-percentile heights. Significance level: *** p < 0.001.
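A minimal sketch of the DSM-based trait extraction behind Figure 5, assuming co-registered DSM and DTM rasters and a known pixel window for one plot (the file names and window are placeholders):

```python
# Minimal sketch: canopy height model (CHM = DSM - DTM) and percentile traits.
import numpy as np
import rasterio

with rasterio.open("dsm.tif") as d, rasterio.open("dtm.tif") as t:  # hypothetical files
    chm = d.read(1).astype("float64") - t.read(1).astype("float64")

chm = np.clip(chm, 0, None)   # negative heights are reconstruction noise
plot = chm[120:180, 40:90]    # assumed pixel window for one plot polygon

traits = {
    "Max": plot.max(),
    "95P": np.percentile(plot, 95),  # the best-performing DSM trait in this study
    "90P": np.percentile(plot, 90),
    "85P": np.percentile(plot, 85),
    "Mean": plot.mean(),
}
print(traits)
```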
Figure 6. Relationships between ground reference data and digital canopy volume traits (at the F50 stage) extracted from UAV imagery from the three trials at Genesee, Garfield, and Pullman: (a) correlation coefficients based on digital traits extracted from VI imagery, DSM, and point cloud data; (b) linear regression between the 3D-model-based alpha-shape (α = 1.5) CV and ground reference data. CV: canopy volume; VI technique—90P: 90th percentile of VI; Mean: average VI; 3D model technique—Alpha1.5: alpha shape (α = 1.5); Alpha1.0: alpha shape (α = 1.0); Alpha0.5: alpha shape (α = 0.5); Concave: concave hull; Convex: convex hull. Significance levels: ** p < 0.01 and *** p < 0.001.
Figure 7. Relationships between ground reference data and digital canopy volume traits (at the PM stage) extracted from UAV imagery from the three trials at Genesee, Garfield, and Pullman: (a) correlation coefficients based on digital traits extracted from DSM and point cloud data; (b) linear regression between the 3D-model-based alpha-shape (α = 1.5) CV and ground reference data. CV: canopy volume; 3D model technique—Alpha1.5: alpha shape (α = 1.5); Alpha1.0: alpha shape (α = 1.0); Alpha0.5: alpha shape (α = 0.5); Concave: concave hull; Convex: convex hull. Significance levels: ** p < 0.01 and *** p < 0.001.
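The correlations reported in Figures 5–7 are Pearson's r between each digital trait and its ground reference. The toy example below shows the computation with hypothetical plot-level values; the real study uses all plots in each breeding trial.

```python
# Minimal sketch: Pearson correlation between a digital trait and harvested AGBM.
from scipy.stats import pearsonr

# Hypothetical plot-level values, for illustration only
canopy_volume = [0.82, 1.10, 0.95, 1.42, 0.77, 1.25]  # alpha-shape CV, m^3 per plot
fresh_agbm = [3.1, 4.0, 3.5, 5.2, 2.9, 4.6]           # kg per plot

r, p = pearsonr(canopy_volume, fresh_agbm)
print(f"r = {r:.2f}, p = {p:.4f}")
# For reference, the study reports r = 0.78-0.81 for alpha-shape CV vs. fresh AGBM.
```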
Figure 8. Part of the CHM at the PM stage, with the plot segmentation polygons (red) used for CH and crop-coverage area extraction.
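The polygon-based extraction in Figure 8 can be sketched by clipping the CHM with a plot footprint. The file path, polygon coordinates, and the 0.1 m cover threshold below are assumptions, not values from the study.

```python
# Minimal sketch: clip the CHM with a plot polygon and derive CH and coverage area.
import numpy as np
import rasterio
from rasterio.mask import mask
from shapely.geometry import box, mapping

plot_poly = box(500000.0, 5170000.0, 500001.5, 5170006.0)  # hypothetical plot footprint (UTM)

with rasterio.open("chm.tif") as src:  # hypothetical CHM raster
    clipped, _ = mask(src, [mapping(plot_poly)], crop=True, nodata=0)
    pixel_area = abs(src.transform.a * src.transform.e)  # m^2 per pixel

plot_chm = clipped[0]
ch_95p = np.percentile(plot_chm[plot_chm > 0], 95)
coverage = (plot_chm > 0.1).sum() * pixel_area  # area with canopy above 0.1 m (assumed)
print(f"95th-percentile CH: {ch_95p:.2f} m, crop-coverage area: {coverage:.2f} m^2")
```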
Table 1. Summary of the raw data acquired in this study.

| Season | Field Location | Growth Stage | Flight Date | Flight Altitude (m) | Camera | GSD 5 (cm/pixel) | Fresh AGBM 6 | Dry AGBM |
|--------|----------------|--------------|-------------|---------------------|--------|------------------|--------------|----------|
| 2019 | Genesee | F50 1 | 18 June | 10 | RGB 3 | 0.21 | 19 June | - |
| | | | | 20 | RGB | 0.50 | | |
| | | | | 20 | MS 4 | 1.34 | | |
| | | PM 2 | 29 July | 10 | RGB | 0.19 | - | 31 July |
| | | | | 20 | RGB | 0.52 | | |
| | Garfield | F50 | 24 June | 10 | RGB | 0.22 | 25 June | - |
| | | | | 20 | RGB | 0.51 | | |
| | | | | 20 | MS | 1.19 | | |
| | | PM | 29 July | 10 | RGB | 0.21 | - | 31 July |
| | | | | 20 | RGB | 0.52 | | |
| 2020 | Pullman | F50 | 10 June | 10 | RGB | 0.25 | 12 June | - |
| | | | | 20 | RGB | 0.51 | | |
| | | | | 20 | MS | 1.26 | | |
| | | PM | 27 July | 10 | RGB | 0.29 | - | 27 July |
| | | | | 20 | RGB | 0.55 | | |

Columns Flight Date through GSD describe the UAV flight—image acquisition; the AGBM columns give the ground reference sampling dates. 1 F50: 50% flowering; 2 PM: physiological maturity; 3 RGB: red–green–blue (spectral band range from 390 to 700 nm); 4 MS: multispectral camera (spectral bands were blue: 475 ± 10 nm, green: 560 ± 10 nm, red: 668 ± 5 nm, red edge: 717 ± 5 nm, and NIR: 840 ± 20 nm); 5 GSD: ground sampling distance; 6 AGBM: above-ground biomass.
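The GSD values in Table 1 scale approximately linearly with flight altitude (GSD = altitude × pixel pitch / focal length). A quick check with illustrative sensor parameters, not the specifications of the cameras used in the study:

```python
# Minimal sketch of the GSD-altitude relationship behind Table 1.
# Pixel pitch and focal length are illustrative assumptions.
def gsd_cm_per_px(altitude_m, pixel_pitch_um=2.4, focal_length_mm=8.8):
    """Ground sampling distance: GSD = altitude * pixel pitch / focal length."""
    return altitude_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3) * 100

for h in (10, 20):
    print(f"{h} m flight -> {gsd_cm_per_px(h):.2f} cm/pixel")
# Doubling altitude doubles GSD, matching the ~0.2 -> ~0.5 cm/pixel RGB pattern in Table 1.
```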