Estimating Canopy Density Parameters Time-Series for Winter Wheat Using UAS Mounted LiDAR

Abstract: Monitoring of canopy density with related metrics such as leaf area index (LAI) makes a significant contribution to understanding and predicting processes in the soil–plant–atmosphere system and to indicating crop health and potential yield for farm management. Remote sensing methods using optical sensors that rely on spectral reflectance to calculate LAI have become more mainstream due to easy entry and availability. Methods with vegetation indices (VI) based on multispectral reflectance data essentially measure the green area index (GAI), or the response to chlorophyll content of the canopy surface, and not the entire aboveground biomass that may be present in non-green elements that are key to fully assessing the carbon budget. Methods with light detection and ranging (LiDAR) have started to emerge using gap fraction (GF) to estimate the plant area index (PAI) based on canopy density. These LiDAR methods have the main advantage of being sensitive to both green and non-green plant elements. They have primarily been applied to forest cover with manned airborne LiDAR systems (ALS) and have yet to be used extensively with crops such as winter wheat using LiDAR on unmanned aircraft systems (UAS). This study contributes to a better understanding of the potential of LiDAR as a tool to estimate canopy structure in precision farming. The LiDAR method proved to have a high to moderate correlation in spatial variation with the multispectral method. The LiDAR-derived PAI values closely resemble the SunScan Ceptometer GAI ground measurements taken early in the growing season, before major stages of senescence. Later in the growing season, when the canopy density was at its highest, a possible overestimation may have occurred. This was most likely due to the chosen flight parameters not providing the best depiction of canopy density from the LiDAR's perspective, as the ground-based destructive measurements provided lower values of PAI.
Additionally, a distinction between total LiDAR-derived PAI, multispectral-derived GAI, and brown area index (BAI) is made to show how the active and passive optical sensor methods used in this study can complement each other throughout the growing season.


Introduction
Leaf area index (LAI) is a popular metric used to monitor crop biomass over the growing season. It is a key biophysical parameter for modeling mass and energy exchange between the biosphere and atmosphere [1]. LAI is intrinsically connected to biophysical processes such as photosynthesis, evaporation and transpiration, rainfall interception, and carbon flux [2]. LAI is widely used in crop simulation to assist in crop yield modeling and irrigation management [3]. Improving upon LAI measurement is important, especially for agricultural systems, as they are under strong pressure due to the changing climate.
LAI is the ratio of the total one-sided leaf area per unit of horizontal ground surface area (dimensionless). It can be distinguished into two subcategories, covering green (photosynthetically active) and brown (senescent) leaf material. LiDAR has been used to map underlying terrain, or even to measure tree trunk diameters [26][27][28]. It is also possible to relate the rate at which LiDAR signals pass through gaps in the canopy to vegetation density, essentially applying GF principles to LiDAR, which can in turn be related to LAI. Studies have found that LiDAR-derived LAI performs well in comparison with ground collection methods such as the LAI-2200 [3]. It is suggested to be a better indicator of three-dimensional plant structure, as opposed to passive optical methods that mainly obtain top-surface information with two-dimensional data [2]. LiDAR has the potential to estimate the complete aboveground biomass of the plant, regardless of its pigment or photosynthetic state.
Several studies have shown how manned airborne LiDAR scanning (ALS) can be used to estimate LAI in forestry [29][30][31][32][33][34]. Various attributes of LiDAR have been exploited to examine canopy structure and density. Distinctions between echoes, signal intensity, and ground points have all been used within methods to derive LAI. These methods have proven to provide suitably accurate LAI estimations. In one study [34], ALS data were used to estimate LAI and gap fraction for a forest and compared against estimations made from hyperspectral data, and an R² > 0.77 was found. In [32], a performance with an R² of 0.67 was exhibited using Beer-Lambert equations. However, LiDAR-derived LAI is subject to inaccuracies depending on the vegetation cover and the optimization of the surrounding methods. There is a lack of investigation into optimizing collection and processing parameters for LiDAR retrieval of LAI [32]. Overestimations are known to occur when the vegetation density is too high for the LiDAR signal to reach the ground. In [30], overestimations of 3.1-4.6% were experienced, which increased considerably at scan angles over 32 degrees. The higher the scan angle, the further the signal must travel through the canopy before reaching the ground. Flight parameterization is important in ensuring a complete picture of the canopy density, with the most inclusive perspectives that will result in the best ratio of gap probability to mitigate overestimation [32]. This relates to flight line directions and spacing, altitude, scan angles, and the particular LiDAR sensor used during data acquisition [32]. There is a lack of understanding and distinction regarding which parameters to use in collection and processing based on the sensor used and the vegetation being monitored [3,32].
There is a strong need for further research that includes more vegetation types and the investigation and improvement of various flight parameters and LiDAR features. Additionally, existing methods are mainly based on manned ALS, which can have large LiDAR signal footprints owing to the higher flight altitudes and the principles of beam divergence, and which can fail to capture the full gap-size distribution by not being able to take advantage of small gaps in the canopy [1]. Unmanned aircraft systems (UAS) present the possibility of flights with lower altitudes, lower costs, higher flexibility, and smaller footprint sizes.
In the past few years, LiDAR technology has seen major development in ultralight laser scanning systems, enabling them to be more commonly mounted on UAS platforms [10]. Predominantly, UAS are used in combination with passive optical sensors because this is more affordable, and with the increase in the quality and availability of structure from motion (SfM) software, these sensors are able to create point clouds similar to those produced with LiDAR [35]. However, unlike LiDAR, they are unable to sense shadowed areas or see through gaps, and they can even suffer from over-smoothing of the data [35][36][37]. LiDAR systems for UAS are becoming increasingly affordable, with new user-friendly software making them a more competitive solution [38]. Ultimately, UAS provide more flexibility, with compact sizes and accessibility that can provide increased temporal resolution. Being independent of cloud cover, UAS are sometimes the only means of obtaining the temporal resolution desired for a growing season [39]. Deery et al. (2021) [40] have shown the effectiveness of estimating crop metrics such as LAI over time using LiDAR on a ground buggy. Additionally, UAS-mounted LiDAR has been used to estimate crop density with a ratio of canopy signal returns to all signal returns to improve a biomass model [41]. To the best of our knowledge, there are no studies using a UAS-mounted LiDAR system to estimate crop PAI in relation to canopy density in a time series throughout the growing season. This study uses the proven method of GF for PAI, previously used with ALS in forestry [32], and tests its potential with UAS and winter wheat throughout the growing season.
Since the crop in this study is winter wheat, which is likely to reach senescence heterogeneously early in the growing season, multispectral data are used to calculate GAI to show the differences and similarities between the methods and to provide a new approach for distinguishing between PAI, GAI, and BAI. This shows that a combination of passive optical sensing and LiDAR can yield an improved characterization of canopy structure, as demonstrated in previous studies [42][43][44][45].

Study Area
The study was conducted at the Integrated Carbon Observation System (ICOS) Class 1 ecosystem site DE-RuS (https://www.icos-cp.eu) in Selhausen, Germany (50°51′56″N, 6°27′03″E) with a winter wheat (Triticum aestivum) field of approximately ten hectares. It is part of the Terrestrial Environmental Observatories (TERENO) network [46] and has been selected as a Supersite by the Committee on Earth Observation Satellites (CEOS) Land Product Validation subgroup. The growing period of the winter wheat, from emergence to harvest, lasted approximately from October to the end of July. The field ranges from 101 m above mean sea level (MSL) in the west to 103 m in the east, with an average annual precipitation of 715 mm and an average temperature of 10.2 °C. The soil is composed of Pleistocene loess and translocated loess from the Holocene, along with sands and gravels at the lower depths, which were discovered at this site through previous geophysical studies [47]. The west side of the field has increased sand and gravel content at shallower depths, which has been shown to cause heterogeneity among the crops during periods of water scarcity [47]. The same field was previously observed with a terrestrial laser scanner (TLS) in a study of crop height estimation [48]. The resulting crop height maps from that study show patterns that agree with the soil properties and can be used to form prior knowledge and assumptions about the patterns in crop structure in this particular field.

UAS and Sensors
Two sensors were used in this study, a YellowScan Surveyor LiDAR and a Micasense RedEdge-M multispectral sensor, each mounted on a DJI Matrice 600, as seen in Figure 1. The LiDAR unit is composed of a Velodyne LiDAR puck, an onboard computer, Inertial Measurement Units (IMU), and a Global Navigation Satellite System (GNSS) receiver. The GNSS-Inertial solution is from an Applanix APX-15. Ranging data were provided by the LiDAR puck/scanner. Variations in attitude were measured by the IMU, and positioning was provided by the GNSS, which also provided time synchronization for all the sensors. This multi-sensor system provides precision capable of direct georeferencing [38]. The point density of the scanned surface depends on the flight altitude, speed, and flight line spacing of the aircraft. The RedEdge-M is a multispectral camera with five narrow bands: blue (465-485 nm), green (550-570 nm), red (663-673 nm), red-edge (712-722 nm), and near infrared (NIR) (820-860 nm). Each band has a focal length of 5.4 mm, with a 46-degree field of view (FOV). Aside from the sensors, the setup consists of a GNSS and a Downwelling Light Sensor (DLS) for georeferencing and sun/cloud corrections. The DJI Matrice 600 is a hexacopter of approximately 167 × 152 × 73 cm in dimensions and a weight of 10 kilograms (kg). The payload capacity is 5.5-6 kg, making it possible to easily carry the 1.6-kg LiDAR system in addition to an RGB camera, which could be replaced with another sensor such as the 0.17-kg Micasense RedEdge-M in future use.

Data Acquisition and Initial Processing
Data acquisition took place from April 1 through July 21 of 2020 and consisted of eight flight campaigns, one every two weeks. All LiDAR flights were performed at 50 m height above ground level (AGL) and at a ground speed of 5 m/s, a balance between coverage, overlaps, and scan angles. The approximate point density with these parameters is 85 points/m², with an accuracy of 5 cm, as specified in the YellowScan user manual. The flight speed on the second campaign (14.04.2020) was 7 m/s, and this campaign had to be disregarded due to insufficient resulting point density. Flight parameters varied earlier in the growing season, as canopy density estimation was not yet the focus and the parameters depended on the project scope at the time. Scan angles of 35 degrees on both sides, for a FOV of 70 degrees, were used to be similar to the parameters used in a past ALS study for LAI [33] and proved to result in favorable values when compared to GAI earlier in the growing season, before senescence. This FOV gave an approximate overlap of 60%. The first flight campaigns (01.04.2020-12.05.2020) were performed at larger flight line distances, and a 60-degree scan angle had to be used to achieve similar overlaps and point density. There was a low number of signal returns from the vegetation when lower scan angles were used for these earlier campaigns.
The LiDAR data were processed using YellowScan's CloudStation software to process the point cloud data (LASer files) for each flight strip for alignment and georeferencing by applying corrections through GNSS offset (lever-arms), sensor angle (boresight), and GNSS post-processing with precise position techniques [10]. The post-processing solution for direct georeferencing was established using a smoothed best estimated trajectory (SBET) file from Applanix's PosPac software with the use of a Virtual Reference Station (VRS) triangulated from Trimble's network of reference stations.
The multispectral data were collected on the same day and directly after the LiDAR flights. However, these flights were conducted at an altitude of 100 m AGL and at a flight speed of 6 m/s. Parameters for an image forward and side overlap of approximately 90% were used. Before and after each flight, spectral calibration was performed with a reflectance panel. Ground control points were used for georeferencing. The Micasense multispectral imagery was stitched into orthomosaics for each band of the camera using photogrammetry and computer vision methods in Pix4D (Pix4D, Lausanne, Switzerland). The orthomosaics were further manually aligned and resampled to 16 cm in post-processing to be consistent with the lowest-resolution multispectral data as the first two flights in the growing season were flown at higher altitudes.

Ground Sampling
For comparison with the LiDAR-derived PAI, ground measurements from the ICOS site were used. All aboveground biomass measurements at the site were carried out in accordance with the ICOS protocol [49]. Measurements based on two different methods and several plot locations were performed. Repeated non-destructive GAI measurements were carried out with a linear ceptometer (SS1 SunScan Canopy Analysis System, Delta-T Devices Ltd, Cambridge, UK) throughout the growing season (08.04.2020, 06.05.2020, 28.05.2020, 22.06.2020, 02.07.2020) at four plot locations in the center of the field, deemed the Continuous Measurement Plots (CPs, see Figure 2). On each of the measurement days mentioned above, a total of 36 ceptometer measurements distributed over three sublocations were carried out for each plot and subsequently averaged to a single GAI value representative of the respective plot (CP 02-05). Destructive measurements were conducted on a single day near the end of the growing season (07.07.2020) at four locations in the eastern part of the field, deemed the Destructive Measurement Plots (DPs, see Figure 2). Here, all biomass in each of the four plots, each 5.4 m² in size, was harvested and analyzed in the lab. Metrics on green and non-green leaf, stem, and ear samples were derived using an area meter (LI-3100, LI-COR Biosciences, Lincoln, NE, USA) for subsamples of 10% of the fresh weight of the full sample from the respective plot. Flat surfaces, such as leaves, were measured directly by the LI-3100 area meter, while non-flat elements were determined by calculating the hemi-surface area from the basic dimensions of the structure to approximate its geometric volume. The combination of these measurements was used to represent PAI and GAI.
These ground measurements are used in comparison to the UAS data (Sections 3.1 and 3.2) for validation and in the discussion (Section 4.3) to inversely parameterize the LiDAR and multispectral methodologies to further understand the data's behavior.

LAI, GAI, BAI Estimation
Even though several studies assume that the LiDAR method provides an estimation of LAI [29][30][31][32][33], LiDAR cannot distinguish between leaves and other plant elements and therefore provides an estimation of PAI. One reason for assuming that the results estimate LAI is that the PAI values can be similar, depending on the period in the growing season, or if a certain vegetation is mainly composed of leaves. Additionally, clumping effects may lead to an underestimation of LiDAR retrievals of PAI, thus providing estimates closer to LAI [3]. Studies using the same LiDAR GF methodology as proposed here describe it as an estimation of effective LAI or effective PAI when considering non-randomly distributed leaves in clumped canopies such as forests or row crops [1,32]. If it is desired to calculate true LAI or PAI, corrections can be applied, as discussed in previous literature [11,32,50]. To maintain consistency between the measured quantities and the actual definitions of the different vegetation indices, we will use PAI LiDAR to describe the LiDAR estimations. The multispectral estimations will be represented as GAI multispectral, as they represent the response to plant chlorophyll pigments. It is assumed that PAI LiDAR represents the total aboveground biomass while GAI multispectral represents the green tissues. Therefore, as shown in Equation (1), the brown area index (BAI Hybrid) can be obtained from a hybrid comparison of the LiDAR and multispectral estimates:

BAI Hybrid = PAI LiDAR - GAI multispectral (1)
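As a minimal per-pixel sketch of this hybrid subtraction (function and variable names are illustrative, not from the paper; clamping negative differences to zero is an added assumption):

```python
def bai_hybrid(pai_lidar, gai_multispectral):
    """Brown area index for one pixel: total plant area (LiDAR)
    minus green area (multispectral)."""
    # Clamping noise-driven negative differences to zero is an
    # assumption made here, not a step prescribed by the paper.
    return max(pai_lidar - gai_multispectral, 0.0)
```

Applied over co-registered PAI and GAI rasters, this yields a BAI map highlighting senescent areas.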

Figure 2. Natural color composite of the field taken on 09.07.2020 with locations of the ground sampling. (A) shows the CP locations of the non-destructive measurements that were taken throughout the growing season with the ceptometer instrument and (B) shows the DP locations of the destructive measurements that were taken on 07.07.2020. Note that the imagery was from later in the growing season and the brown coloring indicates the areas experiencing senescence.

Deriving PAI LiDAR
To calculate GF from LiDAR, the points are separated into ground and non-ground points using the Cloth Simulation Filtering (CSF) algorithm within CloudCompare [51]. CSF avoids the complex computations seen in common slope-based, mathematical morphology, and surface-based models, enabling it to be applied to numerous types of landscapes with far fewer parameters [51]. The process inverts the point cloud and uses the interaction between cloth nodes and the corresponding LiDAR points to make the separation. On a benchmark dataset provided by the International Society for Photogrammetry and Remote Sensing (ISPRS), a total error of 4.52% was demonstrated, which is comparable to most methods [51]. The choice of the main parameter of this algorithm, at least in this crop study, appears to depend on the point density or average point spacing. As the crop's height increases, the points become less dense and are spread out across the vertical dimensions of the canopy. The grid resolution parameter (GR) represents the horizontal distance between two neighboring cloth particles and is used to determine the number of cloth particles created for a specific dataset [51]. With the canopy becoming denser and higher as the growing season progresses, fewer points reach the ground, and the spacing of those points that do reach the ground increases vertically and horizontally. As this spacing becomes larger, a larger GR seems to be required to increase the cloth rigidness in the y axis and accurately separate ground from non-ground points. An example of how well the LiDAR signal penetrates the canopy, and the associated distribution of points, can be seen in Figure 3. This cross-section of the point cloud provides a small 2-m example located in the center of the field. As can be seen, the point spacing may be smaller at the top than below, depending on the density of the canopy.
When a smaller GR is used at the densest periods of the growing season, the simulation cloth shapes to points above the ground, leading to large portions of crop points being classified as ground points, as can be seen in Figure A1 in Appendix A. As seen in Figure 3, GR values ranging from 0.1 to 6 m were used, with 0.1 m for the first collection in the growing season and 6 m for when the crop height and canopy density were at their highest. GR parameterization could change depending on flight line distance and scan angle (see Figure 4), which would in turn change the point density and the way in which the signal penetrates the canopy. For the distance to the simulated cloth, or threshold parameter, we selected 0.1 m for each date. All other parameters were left at their defaults. The accuracy of the segmentation was verified by visual inspection, ensuring that the classified ground points were at the same relative elevation as nearby tractor paths and that proper classification was made for obvious elements such as nearby roads. Additional checks were made by using a range of GR values for each date and testing which values resulted in field averages of PAI that resembled the ground measurements. To be consistent with the multispectral data, a grid of 16-cm cells was generated, and both the ground point population and the total point population within each cell were enumerated. The scan angle of the individual points was averaged into the same 16-cm cells. These data were then used to calculate the PAI with a modified Beer-Lambert light extinction model.
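The gridding and point-counting step can be sketched as follows (a pure-Python illustration with assumed names; the actual processing used GIS tooling on the classified point cloud):

```python
from collections import defaultdict

CELL_SIZE = 0.16  # 16-cm cells, matching the multispectral resolution

def grid_counts(points):
    """Count ground returns and total returns per grid cell.
    `points` is an iterable of (x, y, is_ground) tuples, with
    is_ground taken from the CSF classification."""
    ground = defaultdict(int)
    total = defaultdict(int)
    for x, y, is_ground in points:
        # Map the point's planar coordinates to a cell index.
        cell = (int(x // CELL_SIZE), int(y // CELL_SIZE))
        total[cell] += 1
        if is_ground:
            ground[cell] += 1
    return ground, total
```

The two per-cell counts are exactly the inputs needed for the gap-fraction calculation described below; scan angles could be averaged per cell in the same loop.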
A modified Beer-Lambert light extinction model is often used to calculate PAI or LAI by analyzing the light-intercepting effect of a canopy, relating to the amount and distribution of openings in the canopy, with the simplified assumption that all parts within the canopy are randomly distributed [2]. The Beer-Lambert equation is used to define the light extinction coefficient k as follows:

k = -ln(I/I0)/LAI (2)

where k is the extinction coefficient, I is the below-canopy light intensity, and I0 is the above-canopy light intensity. The amount of radiation that penetrates the canopy depends on the canopy structure and gaps, and hence on GF.

This method can be adapted, as proposed in the study by Richardson et al. (2009) [32], to consider LiDAR attributes. Instead of solar radiation with (I/I0), GF is calculated using the number of laser signal returns that penetrate through gaps in the canopy to the ground in relation to those returning from the top of and within the canopy:

GF = nground/n (3)

with nground being the number of points reaching the ground within a predefined area or grid and n being the total number of points above that same predefined area. GF is usually obtained using optical radiation measurement instruments such as hemispherical photographs or the LAI-2000 (LI-COR Biosciences, Lincoln, NE, USA; [2]). Hemispherical photography characterizes the canopy by using vertically oriented images taken by a camera with a fisheye lens and parameterizations that denote GF at the zenith angle θ of incoming direct sunlight [2,33]. Simply put, these methods are based on the transmission of solar radiation through the vegetation. It can be assumed that the same rules apply to LiDAR signals and their penetration through the vegetation canopy [33]. However, unlike the other methods, LiDAR is an active sensor, so the solar zenith angle is not needed; instead, the scan angle of the LiDAR is used.

The scan angle of the signals is meant to replicate the consideration of the solar zenith angle in passive optical methods:

θ = (1/n) Σ θi (4)

where θ represents the average scan angle from all points within the study area. These variables are then combined to imitate Equation (2) to calculate PAI:

PAI = -ln(GF)/k(θ) (5)

with ln(GF) being the natural logarithm of GF, imitating ln(I/I0), and k(θ) being the extinction coefficient, when Equation (2) is rearranged to solve for LAI or PAI. It should be noted that we treated k(θ) as a constant over space and time.
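A sketch of this per-cell gap-fraction inversion, assuming k(θ) is supplied as a constant as in this study (function names are illustrative):

```python
import math

def pai_from_counts(n_ground, n_total, k_theta):
    """PAI from LiDAR return counts: GF = n_ground / n_total,
    then PAI = -ln(GF) / k(theta)."""
    if n_total == 0:
        return None  # empty cell, no estimate possible
    gf = n_ground / n_total
    if gf == 0.0:
        # No returns reached the ground: GF -> 0, so PAI diverges.
        # This is the saturation case behind the late-season
        # overestimation discussed in the text.
        return math.inf
    return -math.log(gf) / k_theta
```

For example, a cell where 30 of 100 returns reach the ground, with k(θ) = 0.35, yields a PAI of roughly 3.4.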

Deriving GAI multispectral
The approach used in this study to calculate the GAI using passive spectral means is derived from the method developed by Ali et al. (2015) [52] based on the NDVI. A similar method is used by Thorp et al. (2010) [53], which refers to the result as green LAI (GLAI) [53]. These formulas were based on estimations from satellite remote sensing. The differences in spectral characteristics between satellites and UASs will often produce different scales in NDVI while maintaining a similar variation and distribution of the values [54,55]. The variability of the values is therefore still reliable for comparing and assessing LiDAR's ability to produce a representation of the canopy structure. The initial step of the method is the calculation of NDVI:

NDVI = (NIR - Red)/(NIR + Red) (6)

Next, the fractional vegetation cover (FVC) is calculated using the NDVI values. For each resulting NDVI map, for each date, histograms within QGIS were used to determine the bare soil reflectance (NDVIs) and the full vegetation reflectance (NDVIv). Bare soil was determined to be 0.22 and full vegetation 0.99 at the highest point in the growing season. These values were then used to determine the FVC:

FVC = (NDVI - NDVIs)/(NDVIv - NDVIs) (7)

The GAI was then computed from FVC according to:

GAI = -ln(1 - FVC)/k(θ) (8)

where k(θ) is the light extinction coefficient for a given solar zenith angle. The k(θ) is generally obtained from the literature, as ground observations are often not available and k(θ) is difficult to measure in the field [47]. Moreover, many studies use a constant k(θ) over the growing period to simplify calculations [56]. A study calculating LAI for winter wheat using RapidEye multispectral satellite imagery was performed at the same study area in Selhausen, Germany [47]. In that study, k(θ) was derived by inverse parameterization with comparison to ground measurements. It was found that the most accurate results were achieved when k(θ) = 0.60 until the end of April and k(θ) = 0.35 after April.
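The NDVI to FVC to GAI chain can be sketched per pixel as follows. The logarithmic FVC-to-GAI inversion GAI = -ln(1 - FVC)/k(θ) is the common form used in Ali et al.-style methods and is assumed here; the clamping of FVC is an added numerical guard, and the names are illustrative:

```python
import math

NDVI_SOIL = 0.22  # bare-soil NDVI from the image histograms
NDVI_VEG = 0.99   # full-vegetation NDVI at peak growing season

def gai_from_bands(nir, red, k_theta):
    """GAI for one pixel from NIR and red reflectance."""
    ndvi = (nir - red) / (nir + red)
    # Linear scaling of NDVI between bare soil and full vegetation.
    fvc = (ndvi - NDVI_SOIL) / (NDVI_VEG - NDVI_SOIL)
    # Clamp so the logarithm below stays finite (added guard).
    fvc = min(max(fvc, 0.0), 1.0 - 1e-9)
    return -math.log(1.0 - fvc) / k_theta
```

A pixel at the bare-soil NDVI threshold maps to FVC = 0 and hence GAI = 0, while pixels approaching the full-vegetation threshold saturate toward large GAI.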
The same k(θ) parameterization was used for both the LiDAR and multispectral calculations here. This provides an example of how well a k(θ) taken from the literature for a particular crop can be used with UAS LiDAR estimations. Additionally, using only two k(θ) values tests the LiDAR's stability in creating accurate estimations throughout the growing season. A k(θ) calibration for the UAS data is performed in the discussion (Section 4.3) to further examine the general characteristics of k(θ), to create a more accurate result, and to examine how it behaves temporally between the collection dates.

Comparison against Ground Measurements
The individual SunScan SS1 ceptometer ground GAI measurements for each of the designated plot areas (CPs, see Figure 2) were averaged and compared against the mean values of PAI LiDAR and GAI multispectral over the ground measurement plots for each date, comprising approximately 9600 pixels. The ceptometer collections took place on different days from the aerial acquisitions but were grouped based on temporal proximity. The results of this comparison are presented in Figure 5. All three methods show similar results for the first two dates. The GAI multispectral values continuously decrease after the 12th of May, while the PAI LiDAR estimates increase to reach a maximum on the 9th of July. The ceptometer and LiDAR results remain close from April to early June, with a difference of 3-18%, but the differences in the estimations grow to approximately 50-58% for each date after late June.

Figure 5. Comparison between the SunScan SS1 ceptometer, the PAI LiDAR, and the GAI multispectral estimations at each ground measurement plot location with a confidence interval of ~95%. The collection date is noted separately in the graph for the SS1 ceptometer (GAI cept) ground collections. Note that there is no ceptometer measurement for 21/7/2020, as 02/07/2020 was the last date on which a ceptometer measurement was taken in any of the CPs.

The destructive PAI measurement plot (DP, see Figure 2) mean values were compared to the mean PAI LiDAR values of adjacent plots of approximately 5.4 m² consisting of 200 pixels each. Values from the PAI LiDAR data were taken from an area adjacent to the DPs (see Figure 2) because the closest flight campaign was just after the DP collection; the next closest date was two weeks prior. The results (see Figure 6) show that the LiDAR-derived values are 28% larger at this period in the growing season.

The GAI ceptometer measurements were used to determine an R² = 0.45 and an RMSE = 2.05 for the GAI multispectral when using a k(θ) from the literature. For PAI LiDAR, the first three ceptometer measurements were used (08/04-28/05). This was the period in the growing season when there was little to no senescence, especially near the center of the field. It was assumed that PAI and GAI were the same during this period and at those CP locations. Then, the PAI calculated for each DP area from destructive measurements was used for comparison to the 09/07 data. An R² = 0.89 and an RMSE = 0.87 were determined for the PAI LiDAR results, also using a k(θ) from the literature.
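The goodness-of-fit statistics reported above can be reproduced with a small helper. Note that R² is computed here as the coefficient of determination (1 − SS_res/SS_tot), which is one common convention and may differ from a regression-based R² if that is what was used in the original analysis:

```python
import numpy as np

def r2_rmse(estimated, observed):
    """R^2 (coefficient of determination) and RMSE between sensor-derived
    estimates and ground-reference values."""
    est = np.asarray(estimated, dtype=float)
    obs = np.asarray(observed, dtype=float)
    resid = obs - est
    r2 = 1.0 - np.sum(resid**2) / np.sum((obs - obs.mean())**2)
    rmse = np.sqrt(np.mean(resid**2))
    return r2, rmse

# Hypothetical estimate/reference pairs
r2, rmse = r2_rmse([1.1, 2.1, 3.1], [1.0, 2.0, 3.0])
```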

Spatial Variation in Values
In order to evaluate the relationship between the LiDAR and multispectral methods, spatial sampling was performed for each individual date using 82 evenly distributed points, with the same point locations across all dates and methods (see Figure 7). As the passive optical methods using VIs derived from multispectral data are well established in past studies, they are used here to assess the LiDAR's ability to detect the spatial variation in the canopy structure. There is no significant correlation at the early and late stages of the growing season, which can be attributed to optical NDVI's dependence on plant pigment and the soil contribution to the signal, whereas LiDAR depends more on the entire canopy and its structure. The correlations are moderate to high (R = 0.39-0.66) in the mid-growing season. Although the magnitude of the absolute values differs, explained by the different plant elements sensed by the two methods, the relative spatial variation shows some consistency between the methods.
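The per-date correlation between the two maps can be computed as in this sketch; the sample values are synthetic and only illustrate the sampling-and-correlation step at the 82 shared point locations:

```python
import numpy as np

# Synthetic stand-ins for values sampled at the 82 shared point locations
rng = np.random.default_rng(0)
gai_samples = rng.uniform(1.0, 4.0, size=82)
pai_samples = 1.2 * gai_samples + rng.normal(0.0, 0.5, size=82)

# Pearson correlation coefficient (R) for one acquisition date
r = np.corrcoef(pai_samples, gai_samples)[0, 1]
```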


PAI, GAI, and BAI
The visualization of the results is presented in Figure 8, where the LiDAR and the multispectral results can be compared against the RGB imagery. Spatial heterogeneity among the crop is clearly visible in the RGB imagery, along with senescence starting on the western side of the field as early as late May. The LiDAR methodology used is able to capture these changes in crop structure within its PAI LiDAR estimations. The GAI multispectral calculated using passive optical means with NDVI exhibits similar patterns, but the reflectance from the crop dissipates as the crop approaches the full senescence stage, as confirmed in the RGB imagery.

Figure 7. Spatial and temporal correlations between the LiDAR-derived PAI and the multispectral-derived GAI. Pearson correlation coefficients (R) are depicted for each measurement campaign during the growing season. Note that 82 evenly distributed points across both methods were used for each date.

The temporal evolution of the field average of PAI LiDAR and GAI multispectral is depicted in Figure 9, where it can be seen how these two types of metrics and methods can complement one another when monitoring winter wheat. Both PAI and GAI are similar until 12 May 2020, before the appearance of senescence, which allows the LiDAR and multispectral methods to be compared during this period of the growing season. The separation in values coincides with a dry period, as can be seen from the weather data in Figure A2. GAI multispectral decreases after 12/5 and appears to level off towards the end of May with increased precipitation, while PAI LiDAR continues to increase. This also agrees with the results captured by the LiDAR crop height measurements seen in Figure 10. PAI LiDAR decreases at the end of June, but the GAI multispectral decrease is more drastic as the crop enters senescence stages across the entire field.

This highlights the potential to distinguish between the PAI, referring to the complete aboveground biomass, with LiDAR; the GAI, for plant pigment, with multispectral methods; and the non-green biomass with BAI as a byproduct of the combination of both methods.
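The text treats the non-green biomass (BAI) as a byproduct of combining both methods; a minimal per-pixel sketch of that differencing is shown below, where the subtraction with a zero floor is an assumed formulation, since the exact definition is not given here:

```python
import numpy as np

def brown_area_index(pai, gai):
    """Non-green (brown) area index as the LiDAR-derived PAI minus the
    multispectral GAI, floored at zero (assumed formulation)."""
    return np.maximum(np.asarray(pai) - np.asarray(gai), 0.0)

# Hypothetical per-pixel values around the onset of senescence
bai = brown_area_index([3.0, 2.5, 4.0], [2.8, 2.5, 1.0])
```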

Time-Series Trend
The results of this study show that estimating canopy structure with LiDAR for crops such as winter wheat is feasible and may indeed have potential benefits. The differences in the temporal evolutions of the LiDAR and multispectral data agree with expectations, as the two measure different parts of the vegetation. The PAI LiDAR and GAI multispectral are similar at the start of the growing season (1 April-12 May 2020), as most plant elements are green at this time. The field GAI multispectral decreases on 26 May 2020, which is due to an increase in plant stress and visible browning that begins to occur on the western part of the field. The precipitation and soil moisture data seen in Figure A2 show that precipitation had decreased in May, which possibly explains this sudden drop in GAI. The PAI increase also slows down in this temporal window. The structures appearing on the western side are consistent with those soil areas with increased sand and gravel fractions closer to the surface, which are known to be more affected by water scarcity [47]. These patterns are also similar to those observed in the previous TLS study of crop height at the same field during the 2008 and 2009 growing seasons [48]. Once substantial precipitation appeared at the end of May and the beginning of June, the GAI decline slowed and the PAI increase became steeper. The PAI LiDAR field averages increased along with the increase in crop height (see Figure 10) during the growing season, as expected [29]. The values of PAI and GAI continued to separate throughout the growing season as the winter wheat approached full senescence, with lower greenness in plant tissue and lower GAI values. However, PAI remained high, with the LiDAR still being able to sense the variations in canopy density.
The PAI LiDAR and the ground-ceptometer-derived GAI values are similar, as there is little difference between GAI and PAI when the plant structure is mostly green [3]. Differences begin to occur as the field approaches senescence. The differences in values may be further exaggerated because effects such as clumping can cause underestimations in ground measurements, while LiDAR-derived values are known to be overestimated when canopy density is at its highest. As seen in Figure 3, at the peak of crop growth, the point cloud is concentrated mostly within the top portion of the crop, with few signals passing through. This leads to a lack of differentiation throughout the crop and to less information about its structure. This low gap probability and loss of information could be linked to an overestimation of PAI LiDAR in parts of the field, as suggested by the comparison made with the destructive PAI measurements. Inadequate collection parameters (i.e., flight profile, overlap, scan angle) could be the cause, failing to fully exploit the canopy gaps and thereby preventing an appropriate characterization of the canopy density.

PAI LiDAR Spatial Variation
The spatial variation in PAI LiDAR correlates well with the GAI multispectral estimates obtained from an already established method with passive optical sensors. The lack of correlation at the beginning and end of the growing season is likely due to the fact that the two methods sense different plant parts. The multispectral method is based on differences in spectral reflectance and is therefore sensitive to plant pigments, while the LiDAR method is based on the sensor's laser penetration through the whole canopy and is thus affected by canopy density and gap probability. For example, when the crop reached complete senescence before harvest, the reflectance was essentially uniform at approximately the same values as bare soil, as all of the crop consisted of non-green senescent vegetation. However, unlike the multispectral approach, the LiDAR was able to detect the senescent vegetation. Although the spatial variation of the two methods did not correlate on the first two dates and the last date, they provided similar spatial patterns in the mid-growing season.

Impacts of the Extinction Coefficient k(θ)
PAI LiDAR benefited more than the GAI multispectral from the k(θ) value selection based on the literature and was able to deliver estimations with a respectable R 2 and RMSE. The k(θ) for LiDAR may not always agree well with those derived from other sensors. For example, in the study by Solberg et al. (2006) [57], an approximate k(θ) value of 0.7 was derived using ALS while the ground-based measurements were 0.51 and 0.44. The k(θ) could be a contributing factor to the PAI LiDAR results overestimating at the height of the growing season. LiDAR resembles beam radiation that is more concentrated while the k(θ) was derived from direct and diffuse radiation as used in passive optical methods [32]. This can be better understood and improved by further analysis with various LiDAR collection parameters in comparison with k(θ) values derived from other sensors and with different crops.
Instead of using literature values, we evaluated the potential of a k(θ) calibration to characterize its general nature during changing crop conditions. The PAI LiDAR and GAI multispectral k(θ) were inversely calibrated using the ground measurements. The resulting PAI LiDAR obtained an R² = 0.91 and an RMSE = 0.43, whereas the GAI multispectral obtained an R² = 0.96 and an RMSE = 0.31. The retrieved k(θ) values are listed in Table 1. The GAI multispectral k(θ) values changed with time and progressively declined throughout the growing season. The GAI multispectral method adapted for the UAS in this project may benefit from a linearity adjustment factor in the equations, for better results in studies without in situ measurements. The PAI LiDAR k(θ) appears to be more stable over time, showing promise that fixing this parameter would introduce only minor uncertainty, so that values from the literature could be used [47,56]. It is worth noting that none of the ground measurements were taken on the same day as the flight campaigns, but within a maximum time span of nine days. More specific reference measurements are needed for better interpretation.
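The inverse calibration of k(θ) described above might look like the following per-date sketch. The closed-form least-squares fit of −ln(GF) = k · PAI against plot references is an assumption, as the exact inversion procedure is not detailed here:

```python
import numpy as np

def calibrate_k(gap_fraction, reference_pai):
    """Least-squares k(theta) for one date from the model
    -ln(GF) = k * PAI_ref, i.e. k = sum(x*y) / sum(y*y)."""
    x = -np.log(np.asarray(gap_fraction))
    y = np.asarray(reference_pai, dtype=float)
    return np.sum(x * y) / np.sum(y * y)

# Hypothetical per-plot gap fractions and ground-reference PAI
k = calibrate_k([0.40, 0.25, 0.15], [1.6, 2.4, 3.3])
```

The retrieved k can then be reused in the forward PAI = −ln(GF)/k(θ) calculation for dates without ground reference data.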

Future Insights
There are several possible improvements that can be made to increase the accuracy and detail of results for future studies. Changes and balances between flight direction, scan angle, and overlap during data acquisition can be made to change the perspectives to create better information about the plant structure while increasing the chances of the LiDAR signal detecting gaps in the canopy [32]. The closer the flight lines are to one another, the higher the number of perspectives will be from the LiDAR with respect to the chosen scan angle. Implementing crosshatch flight paths could also increase accuracy by not only increasing overlap, but by increasing oblique perspectives from additional directions. These methods must be investigated and balanced with flight time and battery efficiency, for at a certain point, the improvements in accuracy may not be worth the prolonged acquisition duration. Off-nadir scan angles also help in finding gaps and understanding the canopy structure but have to be limited at an optimal point considering signal travel and strength. In the study by Morsdorf et al. (2006) [31], it is suggested that parameters such as laser wavelength and flying altitude affect LiDAR footprint size and point spacing, which influences accuracy and should be investigated further. Aside from flight parameters during the acquisition of the data, improvements can be made during the processing and calculation stages of the data. Studies have used other methods of analysis, including laser penetration index (LPI) and signal intensity, to characterize canopy structure [33]. Additionally, advancements in hardware provide further potential, such as with multifrequency and multi-channel LiDAR systems. In future studies, combining additional variables or considerations within the analysis and flight parameterization has promise for improved estimations.
Considering that LiDAR provides a better characterization of the crop canopy structure, combining the LiDAR height and PAI LiDAR values may give better estimates of biomass than previously seen with multispectral and photogrammetry methods [2]. The GAI of winter wheat could also be calculated using the intensity of the LiDAR returns, since the signal is emitted at a wavelength close to the NIR [6]. The more products this sensor can deliver standalone, without supporting data from other sensors, the more efficient it becomes. For further consideration of its future potential, LiDAR is less affected by its environment, such as solar zenith angles and cloudiness, compared to UASs with passive sensors. LiDAR systems have already made huge leaps in size reduction, and as market competition increases, pricing will decline and data processing will become more user-friendly, making these systems and the methods presented here more accessible for use in farm management [10].

Conclusions
A method to differentiate PAI, GAI, and BAI by UAS-mounted active (LiDAR) and passive (multispectral) sensors has been developed for a winter wheat case. The ability to differentiate between these products can improve the understanding of crop performance and the carbon budget, which can increase the precision of farming practices. Additionally, a proof of concept has been provided for calculating crop structure metrics, such as PAI, with UAS-mounted LiDAR. The approach shows promise, as it can replicate the structural patterns within the crop canopy consistently with multispectral methods. The PAI LiDAR and GAI multispectral estimations were comparable before the appearance of senescence, providing further evidence in support of the methods tested. The LiDAR method is in agreement with other accepted non-destructive and destructive methods. It has been shown that this method may also be able to enhance or capture what might be missed with GAI multispectral estimations, especially in the case of cereal crops. Without dependence on spectral reflectance and solar radiation, the LiDAR methods provide improved options for characterizing the full crop canopy structure.
When LiDAR is used with a UAS, it provides further versatility in potential flight collection parameters as traditional ALS methods may be more restricted when it comes to lower flight altitudes and flight paths. There are still additional factors concerning flight planning, processing, and variables within the analysis to explore in order to further understand and increase the accuracy and feasibility of the methods presented. The flight parameters chosen in this study may not have fully exploited the canopy gaps when the crop was at its highest density, causing potential overestimation in values late in the growing season. Although improvements are possible, this study succeeds in providing evidence that UAS-mounted active and passive sensors can indeed be used to characterize the full canopy of crops such as winter wheat and differentiate between PAI, GAI, and BAI.