Article

Estimation of Peanut Leaf Area Index from Unmanned Aerial Vehicle Multispectral Images

1 College of Engineering, South China Agricultural University, Guangzhou 510642, China
2 Guangdong Laboratory for Lingnan Modern Agriculture, Guangzhou 510642, China
3 College of Agriculture, South China Agricultural University, Guangzhou 510642, China
4 College of Electronics Engineering, South China Agricultural University, Guangzhou 510642, China
* Author to whom correspondence should be addressed.
Sensors 2020, 20(23), 6732; https://doi.org/10.3390/s20236732
Submission received: 11 September 2020 / Revised: 22 November 2020 / Accepted: 23 November 2020 / Published: 25 November 2020
(This article belongs to the Section Sensing and Imaging)

Abstract

Leaf area index (LAI) is used to predict crop yield, and unmanned aerial vehicles (UAVs) provide new ways to monitor LAI. In this study, we used a fixed-wing UAV with multispectral cameras for remote sensing monitoring. We conducted field experiments with two peanut varieties at different planting densities to estimate LAI from multispectral images and establish a high-precision LAI prediction model. We computed eight vegetation indices (VIs) and developed simple regression and backpropagation neural network (BPN) models relating the spectral VIs to LAI. The empirical models were calibrated to estimate peanut LAI, and the best model was selected based on the coefficient of determination and root mean square error. The red (660 nm) and near-infrared (790 nm) bands effectively predicted peanut LAI, and LAI increased with planting density. The predictive accuracy of the multiple regression model was higher than that of the single linear regression models, and the correlations of the Modified Red-Edge Simple Ratio Index (MSR), Ratio Vegetation Index (RVI), and Normalized Difference Vegetation Index (NDVI) with LAI were higher than those of the other indices. The BPN model with combined VIs was more accurate than the single-VI BPN models, and the BPN models were more accurate than the regression models overall. Planting density affects peanut LAI, and reflectance-based vegetation indices can help predict it.

1. Introduction

Leaf area index (LAI) is defined as half of the total green leaf area per unit of horizontal ground surface area of the vegetation canopy [1]. LAI is used to estimate leaf cover and to monitor and predict crop growth and yield [2,3], and it is a key parameter of photosynthesis, respiration, and transpiration in plants [4]. LAI is an important variable in many land surface models, and assimilating LAI derived from remote sensing data into crop models has improved estimates of biomass and yield [5]. The efficient and nondestructive monitoring of crop growth is essential for precise crop management and is key to modern precision agriculture [6]. Real-time LAI monitoring can provide information on crop health and nutrient status and assist fertilization and irrigation management. LAI has traditionally been measured by in situ (destructive or optical) methods, which require significant time and human resources; moreover, sample-based measurements are spatially discontinuous [7,8]. Remote sensing techniques are often associated with green-related biophysical parameters, such as crop chlorophyll content, vegetation biomass, or leaf area index, which directly or indirectly reflect crop vigor and photosynthetic capacity [9]. Since the development of remote sensing technology in the 1950s, satellites, manned aircraft, and ground-based spectral equipment have been used to monitor crop growth. However, these platforms have limitations. Satellite data are constrained by altitude and orbit and are affected by clouds and suspended particles, so they often do not provide the spatial, temporal, or spectral resolution required for growth monitoring [10], although recent technological improvements have made satellite data better able to meet monitoring needs. Manned aircraft are expensive to operate and require operators with relevant flight training. Ground-based spectral equipment is bulky and inefficient, and its use inevitably damages the crop canopy.
In contrast, unmanned aerial vehicles (UAVs) offer an economical and efficient way to address the growing requirements for spatial, temporal, and spectral resolution [11]. UAVs have been used to acquire high-precision crop data, and crop monitoring via UAVs is common, including crop plot detection [12], fruit detection [13], crop yield [14], crop variable measurement [15], crop terrain mapping [16], and crop physiological parameter estimation [17]. Low-altitude remote sensing technology provides high temporal and spatial resolution, enabling the nondestructive, accurate, and timely estimation of leaf area index, crop growth, plant biomass, final crop yield, and other biophysiological parameters [18].
There are two general methods for estimating LAI from remotely sensed data: a process-based method and an empirical method based on the vegetation index (VI), also known as the VI method [19]. The process-based approach estimates LAI from a radiative transfer model developed from remotely acquired canopy reflectance data [20]. Radiative transfer models simulate the (bidirectional) reflectivity of land surfaces through a series of physical or mathematical descriptions of the background (i.e., soil or snow surface), the object (i.e., the canopy or other surfaces), the physical and radiative properties of the atmosphere, and the geometry of the sun and sensors [21]. In contrast, the VI method establishes a statistical relationship between the remotely sensed VI and observed LAI values (hereafter referred to as the LAI-VI relationship) [22]. A VI is computed from the reflectance of two or more spectral bands and can be used to estimate the physiological and biochemical characteristics of vegetation, such as LAI, biomass, and canopy chlorophyll content [23]. Su et al. constructed the Ratio Vegetation Index (RVI), Normalized Difference Vegetation Index (NDVI), and Optimized Soil-Adjusted Vegetation Index (OSAVI) from UAV-based multispectral images to accurately monitor yellow rust in wheat [24]. Carlos et al. used UAVs for aerial crop monitoring, combining seven vegetation indices of rice growth in a multivariate regression model to estimate rice biomass [25]. Wang et al. developed a rice yield estimation model based on satellite images using field measurements and canopy reflectance band ratios (NIR/RED, NIR/GRN), and successfully predicted rice yield over a large area [26]. Zarco et al. combined the R515/R570 and Transformed Chlorophyll Absorption in Reflectance Index (TCARI)/OSAVI narrowband indices to estimate leaf carotenoids with a hyperspectral camera on a UAV [27].
Vegetation indices provide quick and easily obtainable information for a better understanding of the underlying mechanisms of crops, and many VIs are closely related to LAI. The most common VIs are the simple ratio vegetation index [28] and NDVI [29]. However, the relationship between NDVI and LAI is exponential, and NDVI saturates when the aboveground biomass is too high [30]. When LAI is between 2 and 6, the reflectance in the near-infrared band is significantly higher than in the red band, and once the near-infrared reflectance exceeds 40%, its additional contribution to NDVI is small [31]. Considerable efforts have been made to minimize the effects of the soil, including the Soil-Adjusted Vegetation Index (SAVI) [32], the optimized SAVI (OSAVI) [33], the improved SAVI [25], and the Atmospherically Resistant Vegetation Index (ARVI) [28]. Steps have also been taken to improve the sensitivity of vegetation indices at high LAI and to reduce atmospheric disturbance, such as the Enhanced Vegetation Index (EVI) [34,35], the Modified Triangular Vegetation Index (MTVI2) [25], and the Wide Dynamic Range Vegetation Index (WDRVI) [36]. In addition to different VIs, the LAI-VI relationships use several mathematical forms, including linear, exponential, logarithmic, and polynomial equations [37,38].
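As an illustration, several of the indices discussed above can be computed directly from band reflectances. The sketch below uses the standard published formulas for NDVI, RVI (simple ratio), SAVI, and MSR; the exact variants used in this study's Table 2 may differ slightly, and the input reflectance values are hypothetical.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index: (NIR - RED) / (NIR + RED)
    return (nir - red) / (nir + red)

def rvi(nir, red):
    # Ratio Vegetation Index (simple ratio): NIR / RED
    return nir / red

def savi(nir, red, L=0.5):
    # Soil-Adjusted Vegetation Index with soil-brightness factor L
    return (1 + L) * (nir - red) / (nir + red + L)

def msr(nir, red):
    # Modified Simple Ratio: (SR - 1) / (sqrt(SR) + 1)
    sr = nir / red
    return (sr - 1) / (np.sqrt(sr) + 1)

# Hypothetical per-pixel reflectances from the red (660 nm) and NIR (790 nm) bands
red = np.array([0.08, 0.10, 0.12])
nir = np.array([0.45, 0.50, 0.40])
print(ndvi(nir, red))
```

Because all four functions are simple arithmetic on NumPy arrays, they apply equally to single pixels and to whole orthomosaic bands.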
Peanuts are grown worldwide as oil and cash crops. Peanut production in China and the United States accounts for approximately 70% of total global production, and the global production and trade of peanuts is very important [39]. Planting density has a significant impact on crop growth status and is important for guiding high-yield cultivation. Currently, most research is devoted to using spectroscopic techniques to monitor crop physiological parameters and understand crop growth, but few studies have explored the relationship between spectra, growth parameters, and plant density. Prediction models generally include physical and statistical models [40], and the statistical models for LAI prediction include parametric and nonparametric regression models. Parametric models are simple, but their prediction accuracy is limited by the selected bands. Nonparametric regression methods make full use of the spectral information and have high accuracy and robustness. This study used two peanut varieties planted at eight different densities to assess peanut LAI from multispectral data collected during field experiments. The bands sensitive to green-plant LAI were selected to establish the vegetation indices, and simple regression (SR) and backpropagation (BP) neural network models were constructed to achieve fast, accurate, and nondestructive prediction of peanut LAI. A BP neural network (BPN) is a multilayer feed-forward network trained by error backpropagation, consisting of three layers: input, implicit (hidden), and output [41]. The BPN method uses sample training to establish a model between vegetation indices and LAI to estimate LAI. Fortin et al. used multispectral near-infrared and red bands as the inputs to a BPN model to achieve high-precision maize LAI predictions [42]. Peng et al. compared fuzzy logic and artificial neural networks as inverse models for predicting drought-tolerant soybean varieties and found that the artificial neural networks achieved 80% prediction accuracy [43]. The most important advantage of BPN over other nonlinear methods is that neural networks are global approximators with a high degree of accuracy. BPN modeling does not require prior assumptions, as it is largely determined by the characteristics of the data. BP neural networks are widely used in research and have important capabilities, such as nonlinear mapping, generalization, and fault tolerance. However, artificial neural networks can also produce unstable training results and are highly influenced by the training samples. The objectives of this study were (i) to explore the relationship between planting density and spectral reflectance, (ii) to identify the bands most sensitive to peanut density and LAI and construct vegetation indices, and (iii) to build different prediction models and evaluate their performance for predicting peanut LAI.

2. Material and Methods

2.1. Test Design

The experimental area is located at the teaching and research base of South China Agricultural University in Zengcheng, China (23°09′ N, 113°22′ E; altitude: 11 m). The area is characterized by a subtropical monsoon climate, with 1945 annual sunshine hours, an annual average temperature of 20–22 °C, and 1623.6–1899.8 mm of annual precipitation.
Peanut varieties Yueyou 45 and Yanghua No. 1 were selected as the test varieties and sown on 7 August 2019. The layout of the experimental plot is shown in Figure 1. The test field covered an area of 0.2 ha and received 746 kg of compound fertilizer and 896 kg of lime per ha. The bed width was 120 cm, the furrow width was 30 cm, each bed contained four rows of peanuts, and the rows were spaced 25 cm apart (10 cm spacing on the sides of the plot). There were 8 density treatments per peanut variety, for a total of 16 plots: a single seed every 8 cm within a row (S1), a single seed every 10 cm (S2), a single seed every 12 cm (S3), a single seed every 20 cm (S4), two seeds every 16 cm (D1), two seeds every 24 cm (D2), two seeds every 20 cm (D3), and three seeds every 20 cm (T). The plants were grown with the recommended fertilization and irrigation schemes.

2.2. Data Acquisition

2.2.1. Multispectral Data Acquisition and Processing

A Parrot Sequoia 4-channel multispectral camera was mounted on a Parrot Bluegrass UAV to collect multispectral peanut data at key fertility stages, i.e., the seedling stage (7 September 2019), flowering stage (27 September 2019), pod-filling stage (26 October 2019), and maturity stage (17 November 2019). The setup included a 16 MP rolling shutter RGB camera (resolution: 4608 × 3456 pixels) and four 1.5 MP global shutter single-band cameras (resolution: 1280 × 960 pixels) in the following spectral bands: green (center wavelength = 550 nm, bandwidth = 40 nm), red (center wavelength = 660 nm, bandwidth = 40 nm), red-edge (center wavelength = 735 nm, bandwidth = 10 nm), and near-infrared (NIR, center wavelength = 790 nm, bandwidth = 40 nm) (Table 1) [6].
Radiometrically calibrated images of a calibrated reflectance panel (MicaSense) were captured on the ground before each flight [43]. To obtain sufficient image resolution, the aircraft flew at 2.5 m/s at an altitude of 18 m above the ground, capturing images at 1.5 s intervals with 85% forward and lateral overlap. During the flight, the Parrot Sequoia camera used its built-in sunlight sensor and the calibrated reference panel images to calibrate and correct the reflectivity of the captured images and minimize error [6]. After the flight, radiation correction of the multispectral images was performed using the following equation:
R_Target = (DN_Target / DN_C) · R_C
where R_Target is the reflectivity of the target, DN_Target is the digital number (DN) value of the target, DN_C is the DN value of the correction panel, and R_C is the reflectivity of the correction panel. Radiation calibration of the UAV multispectral images requires extracting the correction-panel DN values of the green, red, red-edge, and near-infrared band images separately. The radiation correction is applied to each single-band image, and the corrected bands are then composited to obtain the multispectral image data.
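The correction above is a simple per-pixel ratio scaling; a minimal sketch for one band follows, where the DN image, panel DN, and panel reflectance values are all hypothetical (the actual panel reflectance depends on the MicaSense panel and the band):

```python
import numpy as np

def calibrate_band(dn_target, dn_panel, panel_reflectance):
    # Applies R_Target = (DN_Target / DN_C) * R_C per pixel for one band
    return dn_target / dn_panel * panel_reflectance

# Hypothetical raw DN image for one band and panel statistics
band_dn = np.array([[12000.0, 15000.0], [9000.0, 30000.0]])
panel_dn = 40000.0        # mean DN over the panel region (assumed)
panel_reflectance = 0.5   # known panel reflectance in this band (assumed)
reflectance = calibrate_band(band_dn, panel_dn, panel_reflectance)
```

Running this once per band (green, red, red-edge, NIR) with that band's panel values yields the four reflectance layers that are then composited into the multispectral image.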
Five image sets were generated after each UAV flight: green, red, red-edge, NIR, and Red, Green, and Blue (RGB). Only the first four multispectral image sets were processed, because this study required the four single-band reflectances to compute the VIs. Multispectral image stitching to form orthophoto images was performed in Pix4Dmapper 4.2 (Pix4D S.A., Lausanne, Switzerland) and included camera alignment, georeferencing, construction of dense point clouds, and orthogonal stitching [44].

2.2.2. Collection and Processing of the Leaf Area Index

The leaf area index (LAI) was measured with a LAI-2200C plant canopy analyzer (Li-Cor Biosciences, Lincoln, NE, USA). This instrument is commonly used for in-field LAI measurements and is equipped with a fisheye optical sensor to measure the radiation above and below the canopy. Peanut LAI was measured below the plant after spectral data acquisition was completed. Five sample points were measured per plot, for a total of 80 LAI measurements, including GPS coordinates for each sample point, and the peanut LAI values were processed according to the time of measurement to remove anomalous data. The data were initially processed using Microsoft Excel to calculate the mean value of each plot sample point (n = 5), which was then used as the mean LAI for the plot.

2.3. Selection of the Vegetation Index

A vegetation index constructed from the combination of the red and near-infrared bands can enhance the effective information of the vegetation leaf area index [30]. In this study, 12 vegetation indices related to leaf area were selected for analysis (Table 2).

2.4. Prediction Model Construction and Verification Accuracy

This study used VIs as independent variables (LAI as dependent variable) for further modeling and comparative analysis. Two different modeling methods, SR and BPN, were used to predict peanut LAI, and the model accuracy was evaluated by the coefficient of determination (R2) and root mean square error (RMSE). High R2 and low RMSE values were indicative of high model accuracy and were calculated as:
R^2 = 1 − [Σ_{j=1}^{M} (y_j − ŷ_j)^2] / [Σ_{j=1}^{M} (y_j − ȳ)^2]
RMSE = √[(1/M) Σ_{j=1}^{M} (y_j − ŷ_j)^2]
where y_j and ŷ_j are the measured and predicted values, respectively, ȳ is the average of the measured values, and M is the number of samples.
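Both accuracy metrics follow directly from the definitions above; a minimal sketch:

```python
import numpy as np

def r_squared(y, y_hat):
    # Coefficient of determination: 1 - SS_res / SS_tot
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    ss_res = np.sum((y - y_hat) ** 2)
    ss_tot = np.sum((y - y.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def rmse(y, y_hat):
    # Root mean square error over M samples
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sqrt(np.mean((y - y_hat) ** 2)))
```

Passing the measured and predicted LAI vectors of a test set to these two functions reproduces the model-selection criterion used throughout the paper (high R^2, low RMSE).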
The single and combined vegetation indices were used as input variables, and the measured LAI was used as the output variable, to construct a single vegetation index neural network model (S-BPN) and a combined vegetation index neural network model (C-BPN), respectively. Using the MATLAB neural network toolbox, the number of training iterations was set to 1000 and the learning rate to 0.01, and the network was trained iteratively over a reasonable range of implicit-layer node counts to identify the best training configuration. The number of implicit-layer neurons was determined using [45] as follows:
N_h = √[(m + 2)N] + 2√[N / (m + 2)]
where N_h is the number of implicit-layer neurons, m is the number of layers, and N is the number of input neurons.
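A minimal sketch of this sizing rule, rounding to the nearest whole neuron; note that the reading of Equation (4) as N_h = √((m+2)N) + 2√(N/(m+2)) and the rounding step are assumptions:

```python
import numpy as np

def hidden_neurons(m, N):
    # Equation (4): N_h = sqrt((m + 2) * N) + 2 * sqrt(N / (m + 2)),
    # rounded to the nearest integer (rounding assumed)
    return int(round(np.sqrt((m + 2) * N) + 2 * np.sqrt(N / (m + 2))))
```

For example, small values of m and N already yield implicit-layer sizes in the range of 8-10 neurons, consistent with the value of 10 adopted in Section 3.5.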

3. Results

3.1. Spectral Data Processing

The reflectance curves of peanut plants grown at different densities were obtained by using the region-of-interest marker tool in the ENVI software to select the different-density plots in the original multispectral images, with the single-seeded Yueyou 45 plants as an example (Figure 2). Plant density had a strong influence on reflectance in the red and near-infrared bands, and peanut LAI increased with increasing plant density. This indicates a relationship between LAI and the red and near-infrared bands; therefore, these bands can be used to construct vegetation indices, which is consistent with previous studies.

3.2. Correlation Analysis between the Vegetation Indices and Measured LAIs

The 12 vegetation indices were correlated with the measured peanut LAI values at the four growth stages (Table 3). The vegetation indices were more strongly correlated with peanut LAI throughout the reproductive period than during any single growth stage.
Based on the above correlation analysis, eight VIs were selected: DVI, GNDVI, MCARI, MSR, NDVI, OSAVI, RDVI, and RVI. A correlation heat map of the VIs and LAI over the whole growth period was produced (Figure 3); it shows that MSR, NDVI, and RVI had the strongest correlations with LAI throughout the reproductive period (0.796, 0.767, and 0.789, respectively).
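The correlations reported above (e.g., 0.796 for MSR) are Pearson coefficients between paired VI and LAI samples, which can be reproduced with a one-line computation; the sample values below are hypothetical:

```python
import numpy as np

def pearson_r(x, y):
    # Pearson correlation coefficient between a VI series and measured LAI
    return float(np.corrcoef(x, y)[0, 1])

# Hypothetical per-plot MSR values and measured LAI over the growth period
msr_values = [0.9, 1.4, 1.8, 2.2, 2.5]
lai_values = [0.7, 1.6, 2.0, 2.8, 3.1]
r = pearson_r(msr_values, lai_values)
```

Computing this coefficient for each candidate VI against the measured LAI, stage by stage and over the whole period, yields the entries of Table 3 and the heat map in Figure 3.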

3.3. Peanut Leaf Area Index Analysis

There were 320 measured LAI values from the two peanut varieties across the four growth stages. Five LAI measurements were collected from each plot, and their mean was used as the plot's mean LAI. The LAI of variety Yanghua 1 was higher than that of Yueyou 45 (Figure 4), and the LAI of both varieties peaked at the flowering stage. A comprehensive analysis of the different density plots for both varieties (Figure 5) showed that LAI was highest at the flowering stage, followed by the seedling stage, and was relatively low at the maturity stage because of yellowing and fading leaves. LAI was highest in the S1 and D1 treatments compared to the other densities. Figure 5A,B compares the single- and double-seeded plots, respectively; for both varieties, regardless of the seeding method, overall LAI decreased with decreasing density. At the early (seedling) stage, Yueyou 45 had a relatively low LAI in the S4 treatment because of poor growth.

3.4. Simple Regression Model

The SPSS software was used to randomly sample 64 peanut LAI measurements from the whole reproductive period; 50 were used as model samples and 14 as test samples. The eight vegetation indices were fitted with linear, logarithmic, power, and exponential models to select the most precise model for peanut LAI (Table 4).
Table 4 shows the predictive ability of the four regression models for peanut LAI; the first three models, each based on a single vegetation index, achieved similar accuracy, with R2 values of 0.773, 0.792, and 0.79, respectively. The fourth model, a multiple regression of the eight vegetation indices against measured LAI, had the highest predictive accuracy (R2 = 0.83).
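All four model forms can be fitted by least squares after a suitable variable transformation. The study used SPSS; the NumPy sketch below is an illustrative substitute with hypothetical VI-LAI pairs:

```python
import numpy as np

# Hypothetical paired samples of a vegetation index and measured LAI
vi = np.array([0.30, 0.45, 0.55, 0.65, 0.72, 0.80])
lai = np.array([0.8, 1.5, 2.1, 2.9, 3.4, 4.2])

# Linear: LAI = a*VI + b
a_lin, b_lin = np.polyfit(vi, lai, 1)

# Logarithmic: LAI = a*ln(VI) + b
a_log, b_log = np.polyfit(np.log(vi), lai, 1)

# Power: LAI = b*VI^a, fitted as ln(LAI) = a*ln(VI) + ln(b)
a_pow, ln_b = np.polyfit(np.log(vi), np.log(lai), 1)
b_pow = np.exp(ln_b)

# Exponential: LAI = b*exp(a*VI), fitted as ln(LAI) = a*VI + ln(b)
a_exp, ln_b2 = np.polyfit(vi, np.log(lai), 1)
b_exp = np.exp(ln_b2)
```

Each candidate form is then scored on the held-out test samples with R^2 and RMSE, and the best-scoring form is retained, which is the selection procedure behind Table 4.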

3.5. BP Neural Network Model

In this study, the number of implicit-layer neurons was set to 10, according to Equation (4). The training results are shown in Table 5; the accuracy of the S-BPN models was mostly below 0.9 (only the NDVI-BPN and RVI-BPN models had prediction accuracies above 0.9). The C-BPN model predicted LAI considerably better than the S-BPN models, reaching an accuracy of 0.968. Figure 6 compares the training results of the C-BPN model with the measured LAI values; the C-BPN predictions were very close to the observed values, suggesting good predictive ability.

3.6. Mapping the LAI Prediction

The MSR, NDVI, and RVI vegetation indices were strongly correlated with peanut LAI. According to the simple regression models NDVI-LAI and RVI-LAI (established in Section 3.4), the prediction accuracy for peanut LAI reached 0.773 and 0.79, respectively. The NDVI- and RVI-based prediction maps of peanut LAI at the flowering stage, produced with the ENVI 5.2 software, are shown in Figure 7.

4. Discussion

Improving peanut yield has long been a focus of research, and in the traditional cultivation model, the sowing density is too high for optimal peanut production and seedling yield [39]. Variety characteristics and planting densities are related to the physiological morphology of the peanut plant and affect crop yield. Thus, optimal planting densities can improve the group leaf area index, which can enhance peanut production per unit area. This study showed that peanut LAI tended to first increase and then decrease over the reproductive period, peaking at the flowering stage. Peanut LAI also tended to increase with density (for either single or double sowing) and was highest in the S1 and D1 treatments throughout the reproductive period. Appropriate sowing densities are conducive to the regulatory relationship between individual plants and the group, and an optimal group structure contributes to high crop yield [39].
The leaf area index is a key variable linking remote sensing observations to the quantification of agroecosystem processes [46]. In agroecosystem studies, LAI is commonly used to estimate photosynthesis, evapotranspiration, crop yield, and many other physiological processes [19]. Since LAI is functionally linked to canopy spectral reflectance, it is widely used in remote sensing prediction studies [30]. Most VIs used for LAI estimation combine reflectance in the visible (RGB) and near-infrared (NIR) bands [30]; the visible wavelengths help to control background soil disturbance effects, and the NIR wavelengths allow a large dynamic detection range [47]. In this study, we selected 12 different vegetation indices correlated with LAI and established simple regression (SR) and backpropagation neural network (BPN) models. The VIs had good predictive capacity for peanut LAI, among which MSR, RVI, and NDVI were the best. The multivariate regression model with eight VIs had a higher predictive accuracy for LAI (R2 = 0.83) than the regression models with single VIs. Furthermore, the combined vegetation index neural network model (C-BPN) had a higher prediction accuracy (R2 = 0.968) than the single vegetation index models (S-BPN). Overall, the BPN models were more accurate than the SR models. Additional high-precision prediction models remain to be explored. Moreover, to ensure that these spectral indices can serve as a general tool for predicting the leaf area index of green vegetation, further experimental validation is required for other plant species in different geographic and climatic regions. The methods presented here represent important advances in the nondestructive monitoring of peanut LAI that can aid accurate assessment of crop status during growth [46].

5. Conclusions

The results of this study show that plant density has a significant effect on peanut LAI. Generally, there was a positive correlation between planting density and LAI. Plant density also affected the red light (RED) and near-infrared (NIR) bands, suggesting that these are the sensitive bands of LAI. The simple regression (SR) prediction model and the BP neural network (BPN) prediction model of peanut LAI achieved good predictive results. The neural network model with the combined vegetation indices (C-BPN) had the best prediction effect and predicted peanut LAI with the highest accuracy. Thus, the C-BPN model can be used for real-time LAI monitoring during peanut production.

Author Contributions

Conceptualization, H.Q., Y.L. (Yubin Lan) and L.Z.; data curation, B.Z., Z.W., L.W. and J.L.; formal analysis, B.Z. and Y.L. (Yu Liang); funding acquisition, H.Q., Y.L. (Yubin Lan) and L.Z.; investigation, H.Q., B.Z., L.W., Z.W., J.L., Y.L. (Yu Liang), T.C. and L.Z.; project administration, Y.L. (Yubin Lan) and L.Z.; resources, Y.L. (Yubin Lan), L.Z.; writing—original draft, H.Q., Y.L. (Yubin Lan), and L.Z.; writing—review and editing, H.Q., Y.L. (Yubin Lan) and L.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This work was funded by the Key Science and Technology Planning Project of Guangdong Province (2019B020214003), Characteristic Innovation Projects of Guangdong Provincial Department of Education (2019KTSCX015); and Guangdong Technical System of Peanut and Soybean Industry (2019KJ136-05).

Acknowledgments

We are grateful to the personnel of Guangdong Academy of Agricultural Science for providing the seed of Yueyou 45. We are also grateful to the editor and the anonymous reviewers.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Chen, J.M.; Black, T.A. Defining leaf area index for non-flat leaves. Agric. For. Meteorol. 1992, 15, 421–429. [Google Scholar] [CrossRef]
  2. Yao, X.; Wang, N.; Liu, Y.; Cheng, T.; Tian, Y.; Chen, Q.; Zhu, Y. Estimation of Wheat LAI at Middle to High Levels Using Unmanned Aerial Vehicle Narrowband Multispectral Imagery. Remote Sens. 2017, 9, 1304. [Google Scholar] [CrossRef] [Green Version]
  3. Casa, R.; Varella, H.; Buis, S.; Guérif, M.; De Solan, B.; Baret, F. Forcing a wheat crop model with LAI data to access agronomic variables: Evaluation of the impact of model and LAI uncertainties and comparison with an empirical approach. Eur. J. Agron. 2012, 37, 1–10. [Google Scholar] [CrossRef]
  4. Laliberte, A.S.; Goforth, M.A.; Steele, C.M.; Rango, A. Multispectral Remote Sensing from Unmanned Aircraft: Image Processing Workflows and Applications for Rangeland Environments. Remote Sens. 2011, 3, 2529–2551. [Google Scholar] [CrossRef] [Green Version]
  5. Dente, L.; Satalino, G.; Mattia, F.; Rinaldi, M. Assimilation of leaf area index derived from ASAR and MERIS data into CERES-Wheat model to map wheat yield. Remote Sens. Environ. 2008, 112, 1395–1407. [Google Scholar] [CrossRef]
  6. Wang, Y.; Zhang, K.; Tang, C.; Cao, Q.; Tian, Y.; Zhu, Y.; Cao, W.; Liu, X. Estimation of Rice Growth Parameters Based on Linear Mixed-Effect Model Using Multispectral Images from Fixed-Wing Unmanned Aerial Vehicles. Remote Sens. 2019, 11, 1371. [Google Scholar] [CrossRef] [Green Version]
  7. Guindin-Garcia, N.; Gitelson, A.A.; Arkebauer, T.J.; Shanahan, J.; Weiss, A. An evaluation of MODIS 8- and 16-day composite products for monitoring maize green leaf area index. Agric. For. Meteorol. 2012, 161, 15–25. [Google Scholar] [CrossRef] [Green Version]
  8. Jonckheere, I.; Fleck, S.; Nackaerts, K.; Muys, B.; Coppin, P.; Weiss, M.; Baret, F. Review of methods for in situ leaf area index determination. Agric. For. Meteorol. 2004, 121, 19–35. [Google Scholar] [CrossRef]
  9. Huang, J.; Sedano, F.; Huang, Y.; Ma, H.; Li, X.; Liang, S.; Tian, L.; Zhang, X.; Fan, J.; Wu, W. Assimilating a synthetic Kalman filter leaf area index series into the WOFOST model to improve regional winter wheat yield estimation. Agric. For. Meteorol. 2016, 216, 188–202. [Google Scholar] [CrossRef]
  10. Lu, D.; Chen, Q.; Wang, G.; Moran, E.; Batistella, M.; Zhang, M.; Vaglio Laurin, G.; Saah, D. Aboveground Forest Biomass Estimation with Landsat and LiDAR Data and Uncertainty Analysis of the Estimates. Int. J. For. Res. 2012, 2012, 1–16. [Google Scholar] [CrossRef]
  11. Zhang, C.; Kovacs, J.M. The application of small unmanned aerial systems for precision agriculture: A review. Precis. Agric. 2012, 13, 693–712. [Google Scholar] [CrossRef]
  12. Nebiker, S.; Lack, N.; Abächerli, M.; Läderach, S. Light-weight multispectral UAV sensors and their capabilities for predicting grain yield and detecting plant diseases. ISPRS Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B1, 963–970. [Google Scholar] [CrossRef]
  13. Apolo-Apolo, O.E.; Martínez-Guanter, J.; Egea, G.; Raja, P.; Pérez-Ruiz, M. Deep learning techniques for estimation of the yield and size of citrus fruits using a UAV. Eur. J. Agron. 2020, 115, 126030. [Google Scholar] [CrossRef]
  14. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Hartling, S.; Esposito, F.; Fritschi, F.B. Soybean yield prediction from UAV using multimodal data fusion and deep learning. Remote Sens. Environ. 2020, 237, 111599. [Google Scholar] [CrossRef]
  15. Dalla Corte, A.P.; Rex, F.E.; Almeida, D.R.A.D.; Sanquetta, C.R.; Silva, C.A.; Moura, M.M.; Wilkinson, B.; Zambrano, A.M.A.; Cunha Neto, E.M.D.; Veras, H.F.P.; et al. Measuring Individual Tree Diameter and Height Using GatorEye High-Density UAV-Lidar in an Integrated Crop-Livestock-Forest System. Remote Sens. 2020, 12, 863.
  16. Guo, T.; Kujirai, T.; Watanabe, T. Mapping Crop Status from an Unmanned Aerial Vehicle for Precision Agriculture Applications. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2012, XXXIX-B1, 485–490.
  17. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412.
  18. Xie, Y.; Wang, P.; Sun, H.; Zhang, S.; Li, L. Assimilation of Leaf Area Index and Surface Soil Moisture With the CERES-Wheat Model for Winter Wheat Yield Estimation Using a Particle Filter Algorithm. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2017, 10, 1303–1316.
  19. Xiao, Z.; Liang, S.; Wang, J.; Jiang, B.; Li, X. Real-time retrieval of Leaf Area Index from MODIS time series data. Remote Sens. Environ. 2011, 115, 97–106.
  20. Atzberger, C.; Darvishzadeh, R.; Immitzer, M.; Schlerf, M.; Skidmore, A.; le Maire, G. Comparative analysis of different retrieval methods for mapping grassland leaf area index using airborne imaging spectroscopy. Int. J. Appl. Earth Obs. 2015, 43, 19–31.
  21. Ganguly, S.; Nemani, R.R.; Zhang, G.; Hashimoto, H.; Milesi, C.; Michaelis, A.; Wang, W.; Votava, P.; Samanta, A.; Melton, F.; et al. Generating global Leaf Area Index from Landsat: Algorithm formulation and demonstration. Remote Sens. Environ. 2012, 122, 185–202.
  22. Le Maire, G.; Marsden, C.; Nouvellon, Y.; Stape, J.; Ponzoni, F. Calibration of a Species-Specific Spectral Vegetation Index for Leaf Area Index (LAI) Monitoring: Example with MODIS Reflectance Time-Series on Eucalyptus Plantations. Remote Sens. 2012, 4, 3766–3780.
  23. Baghzouz, M.; Devitt, D.A.; Fenstermaker, L.F.; Young, M.H. Monitoring Vegetation Phenological Cycles in Two Different Semi-Arid Environmental Settings Using a Ground-Based NDVI System: A Potential Approach to Improve Satellite Data Interpretation. Remote Sens. 2010, 2, 990–1013.
  24. Su, J.; Liu, C.; Coombes, M.; Hu, X.; Wang, C.; Xu, X.; Li, Q.; Guo, L.; Chen, W. Wheat yellow rust monitoring by learning from multispectral UAV aerial imagery. Comput. Electron. Agric. 2018, 155, 157–166.
  25. Devia, C.A.; Rojas, J.P.; Petro, E.; Martinez, C.; Mondragon, I.F.; Patino, D.; Rebolledo, M.C.; Colorado, J. High-Throughput Biomass Estimation in Rice Crops Using UAV Multispectral Imagery. J. Intell. Robot. Syst. 2019, 96, 573–589.
  26. Wang, Y.; Chang, K.; Chen, R.; Lo, J.; Shen, Y. Large-area rice yield forecasting using satellite imageries. Int. J. Appl. Earth Obs. 2010, 12, 27–35.
  27. Zarco-Tejada, P.J.; Guillén-Climent, M.L.; Hernández-Clemente, R.; Catalina, A.; González, M.R.; Martín, P. Estimating leaf carotenoid content in vineyards using high resolution hyperspectral imagery acquired from an unmanned aerial vehicle (UAV). Agric. For. Meteorol. 2013, 171–172, 281–294.
  28. Muhammad, H.; Mengjiao, Y.; Awais, R.; Xiuliang, J.; Xianchun, X.; Yonggui, X. Time-series multispectral indices from unmanned aerial vehicle imagery reveal senescence rate in bread wheat. Remote Sens. 2018, 10, 809.
  29. Glenn, E.P.; Huete, A.R.; Nagler, P.L.; Nelson, S.G. Relationship Between Remotely-sensed Vegetation Indices, Canopy Attributes and Plant Physiological Processes: What Vegetation Indices Can and Cannot Tell Us About the Landscape. Sensors 2008, 8, 2136–2160.
  30. Haboudane, D. Hyperspectral vegetation indices and novel algorithms for predicting green LAI of crop canopies: Modeling and validation in the context of precision agriculture. Remote Sens. Environ. 2004, 90, 337–352.
  31. Cao, Y.; Li, G.L.; Luo, Y.K.; Pan, Q.; Zhang, S.Y. Monitoring of sugar beet growth indicators using wide-dynamic-range vegetation index (WDRVI) derived from UAV multispectral images. Comput. Electron. Agric. 2020, 171, 105331.
  32. Rondeaux, G.; Steven, M.; Baret, F. Optimization of soil-adjusted vegetation indices. Remote Sens. Environ. 1996, 55, 95–107.
  33. Kaufman, Y.J.; Tanre, D. Atmospherically resistant vegetation index (ARVI) for EOS-MODIS. IEEE Trans. Geosci. Remote 1992, 30, 261–270.
  34. Jiang, Z.; Huete, A.; Didan, K.; Miura, T. Development of a two-band enhanced vegetation index without a blue band. Remote Sens. Environ. 2008, 112, 3833–3845.
  35. Huete, A.; Didan, K.; Miura, T.; Rodriguez, E.P.; Gao, X.; Ferreira, L.G. Overview of the radiometric and biophysical performance of the MODIS vegetation indices. Remote Sens. Environ. 2002, 83, 195–213.
  36. Gitelson, A.A. Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical Characteristics of Vegetation. J. Plant Physiol. 2004, 161, 165–173.
  37. Viña, A.; Gitelson, A.A.; Nguy-Robertson, A.L.; Peng, Y. Comparison of different vegetation indices for the remote assessment of green leaf area index of crops. Remote Sens. Environ. 2011, 115, 3468–3478.
  38. Liu, J.; Pattey, E.; Jégo, G. Assessment of vegetation indices for regional crop green LAI estimation from Landsat images over multiple growing seasons. Remote Sens. Environ. 2012, 123, 347–358.
  39. Qi, H.; Zhu, B.; Kong, L.; Yang, W.; Zou, J.; Lan, Y.; Zhang, L. Hyperspectral Inversion Model of Chlorophyll Content in Peanut Leaves. Appl. Sci. 2020, 10, 2259.
  40. Delegido, J.; Verrelst, J.; Rivera, J.P.; Ruiz-Verdú, A.; Moreno, J. Brown and green LAI mapping through spectral indices. Int. J. Appl. Earth Obs. 2015, 35, 350–358.
  41. Chen, L.; Huang, J.F.; Wang, F.M.; Tang, Y.L. Comparison between back propagation neural network and regression models for the estimation of pigment content in rice leaves and panicles using hyperspectral data. Int. J. Remote Sens. 2007, 28, 3457–3478.
  42. Fortin, J.G.; Anctil, F.; Parent, L.E. Comparison of physically based and empirical models to estimate corn (Zea mays L.) LAI from multispectral data in eastern Canada. Can. J. Remote Sens. 2013, 39, 89–99.
  43. Peng, Y.; Lu, R. Modeling multispectral scattering profiles for prediction of apple fruit firmness. Trans. ASABE 2005, 48, 235–242.
  44. Zhu, Z.; Bi, J.; Pan, Y.; Ganguly, S.; Anav, A.; Xu, L.; Samanta, A.; Piao, S.; Nemani, R.; Myneni, R. Global Data Sets of Vegetation Leaf Area Index (LAI)3g and Fraction of Photosynthetically Active Radiation (FPAR)3g Derived from Global Inventory Modeling and Mapping Studies (GIMMS) Normalized Difference Vegetation Index (NDVI3g) for the Period 1981 to 2011. Remote Sens. 2013, 5, 927–948.
  45. Gómez-Candón, D.; De Castro, A.I.; López-Granados, F. Assessing the accuracy of mosaics from unmanned aerial vehicle (UAV) imagery for precision agriculture purposes in wheat. Precis. Agric. 2014, 15, 44–56.
  46. Kang, Y.; Özdoğan, M.; Zipper, S.; Román, M.; Walker, J.; Hong, S.; Marshall, M.; Magliulo, V.; Moreno, J.; Alonso, L.; et al. How Universal Is the Relationship between Remotely Sensed Vegetation Indices and Crop Leaf Area Index? A Global Assessment. Remote Sens. 2016, 8, 597.
  47. Pinty, B.; Lavergne, T.; Widlowski, J.L.; Gobron, N.; Verstraete, M.M. On the need to observe vegetation canopies in the near-infrared to estimate visible light absorption. Remote Sens. Environ. 2009, 113, 10–23.
Figure 1. Location of the test area.
Figure 2. Histogram of the reflectance at different wavelengths for the peanut plants grown at different planting densities (S1, S2, S3, S4). Correlations are significant at p = 0.01.
Figure 3. Correlations between the vegetation indices and peanut LAI over the whole growth period; correlations are significant at p = 0.01.
Figure 4. Changes in LAI throughout the growing period for the two peanut varieties; correlations are significant at p = 0.01.
Figure 5. LAI of the two peanut varieties under single sowing (A) and double sowing (B).
Figure 6. LAI prediction results of the combined vegetation index neural network (C-BPN) model.
Figure 7. Maps of LAI predicted from the RVI and NDVI vegetation indices.
Table 1. Specifications of the multispectral sensor used in this study.

| Band Number | Band Name | Center Wavelength (nm) | Bandwidth FWHM (nm) | Resolution |
|---|---|---|---|---|
| 1 | Green | 550 | 40 | 1.4 MP |
| 2 | Red | 660 | 40 | 1.4 MP |
| 3 | Red-Edge | 735 | 10 | 1.4 MP |
| 4 | Near-Infrared | 790 | 40 | 1.4 MP |
Table 2. Definition of the selected vegetation indices (VIs).

| Vegetation Index | Equation | Reference |
|---|---|---|
| Ratio Vegetation Index (RVI) | RVI = NIR/RED | Jordan et al., 1969 |
| Normalized Difference Vegetation Index (NDVI) | NDVI = (NIR − RED)/(NIR + RED) | Peñuelas et al., 1997 |
| Soil-Adjusted Vegetation Index (SAVI) | SAVI = 1.5(NIR − RED)/(NIR + RED + 0.5) | Haboudane et al., 2004 |
| Renormalized Difference Vegetation Index (RDVI) | RDVI = (NIR − RED)/√(NIR + RED) | Roujean et al., 1995 |
| Green Normalized Difference Vegetation Index (GNDVI) | GNDVI = (NIR − GREEN)/(NIR + GREEN) | Gitelson et al., 1996 |
| Modified Red-Edge Simple Ratio Index (MSR) | MSR = (NIR/RED − 1)/√(NIR/RED + 1) | Chen et al., 1996 |
| Difference Vegetation Index (DVI) | DVI = NIR − RED | Becker et al., 1988 |
| Normalized Difference Red-Edge Index (NDRE) | NDRE = (NIR − RedEdge)/(NIR + RedEdge) | Gitelson et al., 1994 |
| Red-Edge Chlorophyll Index (CI_RE) | CI_RE = NIR/RedEdge − 1 | Gitelson et al., 2005 |
| Optimized SAVI (OSAVI) | OSAVI = (1 + 0.16)(NIR − RED)/(NIR + RED + 0.16) | Rondeaux et al., 1996 |
| Modified Chlorophyll Absorption in Reflectance Index (MCARI) | MCARI = 1.2[2.5(NIR − RED) − 1.3(NIR − GREEN)] | Daughtry et al., 2000 |
| Modified Triangular Vegetation Index 1 (MTVI1) | MTVI1 = 1.2[1.2(NIR − GREEN) − 2.5(RED − GREEN)] | Haboudane et al., 2004 |
Table 3. Correlations between the vegetation indices and peanut leaf area index (LAI) at each growth stage.

| Growth Stage | CI_RE | DVI | GNDVI | MCARI | MSR | MTVI1 | NDRE | NDVI | OSAVI | RDVI | SAVI | RVI |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Seedling stage | 0.124 | 0.467 | 0.383 | 0.422 | 0.481 | 0.422 | 0.232 | 0.506 | 0.529 | 0.515 | 0.513 | 0.426 |
| Flowering stage | 0.280 | 0.643 | 0.545 | 0.627 | 0.554 | 0.350 | 0.326 | 0.525 | 0.631 | 0.625 | 0.610 | 0.582 |
| Pod-filling stage | 0.548 | 0.389 | 0.524 | 0.377 | 0.644 | 0.377 | 0.506 | 0.918 | 0.481 | 0.443 | 0.421 | 0.641 |
| Maturity stage | 0.062 | 0.178 | 0.231 | 0.182 | 0.319 | 0.156 | 0.064 | 0.281 | 0.200 | 0.203 | 0.188 | 0.341 |
| Whole growth period | 0.394 | 0.669 | 0.709 | 0.725 | 0.796 | 0.659 | 0.438 | 0.767 | 0.739 | 0.720 | 0.714 | 0.789 |
Note: CI_RE, Red-Edge Chlorophyll Index; DVI, Difference Vegetation Index; GNDVI, Green Normalized Difference Vegetation Index; MCARI, Modified Chlorophyll Absorption in Reflectance Index; MSR, Modified Red-Edge Simple Ratio Index; MTVI1, Modified Triangular Vegetation Index 1; NDRE, Normalized Difference Red-Edge Index; NDVI, Normalized Difference Vegetation Index; OSAVI, Optimized SAVI; RDVI, Renormalized Difference Vegetation Index; RVI, Ratio Vegetation Index; SAVI, Soil-Adjusted Vegetation Index. Correlations are significant at p = 0.01.
Table 4. The optimal inversion model and accuracy check of peanut LAI for each vegetation index.

| Model | Equation | R² | RMSE |
|---|---|---|---|
| NDVI-LAI | y = 5.919x − 1.872 | 0.773 | 0.407 |
| MSR-LAI | y = 0.9586x + 1.197 | 0.792 | 0.389 |
| RVI-LAI | y = 1.289x^0.4629 | 0.790 | 0.390 |
| 8 VIs | LAI = 52.95DVI − 2.12GNDVI + 0.3MCARI + 7.31MSR + 11.12NDVI − 66.48RDVI − 0.77RVI | 0.830 | 0.376 |
Note: R², determination coefficient; RMSE, root mean square error. Correlations are significant at p = 0.01.
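The fitted single-VI models in Table 4 can be applied as plain functions of the index value. The sketch below encodes the tabulated coefficients; note that the functional forms are reconstructed from the flattened table text (in particular, the RVI model is assumed here to be a power fit), so treat them as illustrative rather than as the authors' exact equations:

```python
# Single-VI regression models for peanut LAI, coefficients from Table 4.
# Functional forms reconstructed from the table; the RVI power-law form
# is an assumption.

def lai_from_ndvi(ndvi):
    """Linear NDVI-LAI model (R² = 0.773, RMSE = 0.407)."""
    return 5.919 * ndvi - 1.872

def lai_from_msr(msr):
    """Linear MSR-LAI model (R² = 0.792, RMSE = 0.389)."""
    return 0.9586 * msr + 1.197

def lai_from_rvi(rvi):
    """Assumed power-law RVI-LAI model (R² = 0.790, RMSE = 0.390)."""
    return 1.289 * rvi ** 0.4629
```

Applied pixel-wise to a VI map, these functions yield LAI prediction maps of the kind shown in Figure 7.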
Table 5. Comparative analyses of the accuracies of the backpropagation (BP) neural network models for different input parameters.

| Model | R² | RMSE |
|---|---|---|
| DVI-BPN | 0.857 | 0.127 |
| GNDVI-BPN | 0.875 | 0.200 |
| MCARI-BPN | 0.863 | 0.235 |
| MSR-BPN | 0.896 | 0.145 |
| NDVI-BPN | 0.923 | 0.124 |
| OSAVI-BPN | 0.873 | 0.165 |
| RDVI-BPN | 0.885 | 0.144 |
| RVI-BPN | 0.927 | 0.059 |
| All VIs-BPN | 0.968 | 0.165 |
Note: Each VI-BPN is an inversion model relating the given vegetation index to the measured LAI. R² indicates the prediction accuracy of the neural network model for peanut LAI, and RMSE is the root mean square error of the model.
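The BP neural networks compared in Table 5 map one or more vegetation indices to LAI through a hidden layer trained by error backpropagation. The following is a minimal self-contained sketch of that training scheme; the synthetic data, network size, and learning rate are illustrative choices, not the configuration used in the study:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative synthetic training data: 8 VI features -> an LAI-like target.
X = rng.uniform(0.0, 1.0, size=(200, 8))
w_true = rng.uniform(0.5, 1.5, size=8)
y = (X @ w_true / w_true.sum() * 5.0).reshape(-1, 1)  # target roughly in 0-5

# One hidden layer (sigmoid) and a linear output unit.
W1 = rng.normal(0.0, 0.5, size=(8, 10)); b1 = np.zeros(10)
W2 = rng.normal(0.0, 0.5, size=(10, 1)); b2 = np.zeros(1)
lr = 0.05

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    pred = h @ W2 + b2
    err = pred - y
    # Backward pass: gradients of the mean squared error.
    gW2 = h.T @ err / len(X); gb2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1.0 - h)
    gW1 = X.T @ dh / len(X); gb1 = dh.mean(axis=0)
    # Gradient-descent weight updates (the "backpropagation" step).
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Training-set RMSE of the fitted network.
rmse = float(np.sqrt(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2)))
```

In practice each single-VI model in Table 5 has one input feature and the "All VIs-BPN" model stacks the indices into the input vector; model selection then follows the R² and RMSE comparison shown in the table.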
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Citation: Qi, H.; Zhu, B.; Wu, Z.; Liang, Y.; Li, J.; Wang, L.; Chen, T.; Lan, Y.; Zhang, L. Estimation of Peanut Leaf Area Index from Unmanned Aerial Vehicle Multispectral Images. Sensors 2020, 20, 6732. https://doi.org/10.3390/s20236732