Proceeding Paper

Estimating Leaf Area Index of Wheat Using UAV-Hyperspectral Remote Sensing and Machine Learning †

1 Division of Agricultural Physics, Indian Council of Agricultural Research (ICAR)–Indian Agricultural Research Institute (IARI), New Delhi 110012, India
2 International Rice Research Institute, New Delhi 110012, India
* Author to whom correspondence should be addressed.
Presented at the 4th International Electronic Conference on Agronomy, 2–5 December 2024; Available online: https://sciforum.net/event/IECAG2024.
Biol. Life Sci. Forum 2025, 41(1), 11; https://doi.org/10.3390/blsf2025041011
Published: 18 June 2025
(This article belongs to the Proceedings of The 4th International Electronic Conference on Agronomy)

Abstract

Hyperspectral remote sensing using unmanned aerial vehicles (UAVs) provides accurate, near real-time, and large-scale spatial estimation of the leaf area index (LAI), a key crop variable for monitoring crop growth. In the present study, the LAI of wheat was estimated using high-resolution UAV-borne hyperspectral data. Partial Least Squares (PLS) regression combined with Variable Importance in Projection (VIP) was used to select the optimal indices as feature vectors for an Extreme Gradient Boosting (XGBoost) model predicting LAI. Twelve of twenty-seven vegetation indices were selected to develop the prediction model. On validation against the in situ measured LAI values, the prediction model showed good accuracy with an R2 of 0.71. The model was used to generate a spatial map showing the variability of the LAI. Accurate mapping of LAI from high-resolution hyperspectral UAV data using machine learning models facilitates near real-time monitoring of crop health.

1. Introduction

The leaf area index (LAI) is a dimensionless variable defined as the total one-sided foliar area per unit ground area [1]. Since LAI is closely related to many physical and biological processes of plant canopies, such as photosynthesis, evapotranspiration, carbon uptake, and energy balance, it is essential to monitor it in a timely, accurate, and continuous manner. However, the conventional approach to LAI estimation for larger fields is time-consuming, inaccurate, and labour-intensive [2]. Recent advances in remote sensing have enabled LAI to be estimated more accurately, rapidly, and reliably over larger areas. The capability of unmanned aerial vehicles (UAVs) to fly at low altitudes yields high-resolution images, which can be used to produce LAI maps that capture fine spatial variability. The potential of hyperspectral sensors mounted on UAVs has been explored successfully for estimating LAI using various physical, empirical, machine learning, or deep learning regression models [3,4]. Hybrid models that couple process-based radiative transfer (RT) models with machine learning retrieve accurate estimates of crop traits [5]. Both spectral data [6] and spectral indices [7] are strongly related to various crop traits. After suitable feature extraction or transformation, they serve as potential feature vectors for predicting these important crop variables [8,9]. Recently, frameworks combining preprocessing algorithms, feature selection methods, and machine learning/deep learning regression models have been shown to predict crop variables with better fit and accuracy [10]. When applied to UAV-acquired data, these optimized models enable operational mapping of crop traits for smart nutrient input applications.
The LAI of wheat crops at the research farm of ICAR–Indian Agricultural Research Institute (IARI), New Delhi, has previously been estimated successfully with different machine learning models using UAV image spectra and RT-simulated spectra as input feature vectors [11,12]. In the present study, we estimated LAI with the Extreme Gradient Boosting (XGBoost) regression model using vegetation indices calculated from UAV hyperspectral data in the spectral range of 400–1000 nm. A combined approach of PLSR with VIP scores was used for selecting the optimal indices.

2. Materials and Methods

The workflow for estimating LAI from UAV data is shown in Figure 1. The major steps are the following: (i) Field Experimental Design, Data Acquisition and Pre-Processing and (ii) Selection of Vegetation Indices and Model Development.

2.1. Field Experimental Design, Data Acquisition and Pre-Processing

The wheat experiments were conducted in 45 plots at the research farm of ICAR–Indian Agricultural Research Institute (IARI), New Delhi, during the rabi season (winter cropping season, October to April) of 2021–2022 (Figure 2a,b). The field was laid out as three replications of fifteen plots, combining five graded nitrogen levels (0, 50, 100, 150, and 200 kg N ha−1) with three irrigation treatments (soil moisture sensor-based, I1; crop water stress index (CWSI)-based, I2; and conventional, I3), as shown in Figure 2c.
The five graded nitrogen levels are denoted N0, N1, N2, N3, and N4. Hyperspectral data with 269 bands in the spectral range of 400–1000 nm were captured on 17 March 2022 using a Nano-Hyperspec sensor (Headwall Photonics Inc., Bolton, MA, USA) mounted with a Ronin gimbal on a DJI Matrice 600 Pro hexacopter platform (DJI Sky City, Shenzhen, China). Image acquisition was carried out between 11:00 AM and 12:00 PM on a bright sunny day. Mission planning was carried out in the UgCS mission planning software v3.4.609 (Universal Ground Control Software, SPH Engineering, Riga, Latvia). The drone was flown at a speed of 3 m/s and an altitude of 21 m to achieve a spatial resolution of 4 cm. The acquired data underwent a series of processing steps, including (i) radiometric calibration, (ii) orthorectification, (iii) mosaicking, (iv) spectral smoothing, and (v) crop area segmentation [6]. The LAI was also measured on the same day for each plot using an LAI-2000 plant canopy analyser (Li-Cor, Inc., Lincoln, NE, USA).
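The processing chain above follows [6]; the spectral smoothing step is not detailed here, so the following is a minimal sketch assuming a Savitzky-Golay filter applied along the spectral axis of the reflectance cube (the array name, window length, and polynomial order are illustrative assumptions, not the published settings).

```python
import numpy as np
from scipy.signal import savgol_filter

def smooth_spectra(reflectance_cube: np.ndarray,
                   window_length: int = 11,
                   polyorder: int = 2) -> np.ndarray:
    """Smooth each pixel spectrum of a (rows, cols, bands) reflectance cube.

    Savitzky-Golay filtering along the last (spectral) axis; the 269-band,
    400-1000 nm cube would be passed in after radiometric calibration,
    orthorectification, and mosaicking.
    """
    return savgol_filter(reflectance_cube,
                         window_length=window_length,
                         polyorder=polyorder,
                         axis=-1)
```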

2.2. Selection of Vegetation Indices and Model Development

After a detailed review of the literature, a total of 27 vegetation indices, listed in Table 1, were selected for LAI estimation. PLSR-VIP was used to rank these vegetation indices in order of their importance. With the measured LAI as the response variable, the Extreme Gradient Boosting (XGBoost) machine learning model was deployed for LAI prediction using the optimal vegetation indices as predictors. The optimal hyperparameters, such as n_estimators, learning_rate, and max_depth, were selected using the RandomizedSearchCV function in Python 3.11.3. A 70/30 train/test split was used for model training and validation. The optimized model was validated against the in situ LAI measurements using the root mean square error (RMSE), mean absolute error (MAE), and coefficient of determination (R2). Finally, the validated model was applied to the pre-processed imagery to generate a spatial map showing the variability of LAI values.
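As an illustration of this workflow, the sketch below derives VIP scores from a fitted scikit-learn PLSRegression, applies the VIP > 1 threshold to select indices, and tunes an XGBRegressor with RandomizedSearchCV before computing RMSE, MAE, and R2 on the 30% hold-out set. The number of PLS components, the search ranges, and all variable names are assumptions made for illustration; the study's actual code and settings are not published here.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import RandomizedSearchCV, train_test_split
from sklearn.metrics import mean_absolute_error, mean_squared_error, r2_score
from xgboost import XGBRegressor

def vip_scores(pls: PLSRegression) -> np.ndarray:
    """Variable Importance in Projection scores for a fitted PLS regression."""
    t, w, q = pls.x_scores_, pls.x_weights_, pls.y_loadings_
    p, _ = w.shape
    ss = np.diag(t.T @ t @ q.T @ q)               # y-variance explained per component
    w2 = (w / np.linalg.norm(w, axis=0)) ** 2     # squared, normalised weights
    return np.sqrt(p * (w2 @ ss) / ss.sum())

def fit_lai_model(X, y, index_names, n_components=2):
    """X: (n_plots, 27) vegetation-index matrix; y: measured LAI per plot."""
    pls = PLSRegression(n_components=n_components).fit(X, y)
    keep = vip_scores(pls) > 1.0                  # VIP threshold of 1, as in the paper
    selected = [name for name, k in zip(index_names, keep) if k]

    X_tr, X_te, y_tr, y_te = train_test_split(X[:, keep], y,
                                              test_size=0.3, random_state=0)
    search = RandomizedSearchCV(
        XGBRegressor(objective="reg:squarederror"),
        param_distributions={"n_estimators": list(range(50, 301)),
                             "learning_rate": np.linspace(0.01, 0.3, 30).tolist(),
                             "max_depth": list(range(1, 8))},
        n_iter=50, cv=5, scoring="r2", random_state=0)
    search.fit(X_tr, y_tr)

    pred = search.best_estimator_.predict(X_te)
    metrics = {"R2": r2_score(y_te, pred),
               "RMSE": float(np.sqrt(mean_squared_error(y_te, pred))),
               "MAE": mean_absolute_error(y_te, pred)}
    return search.best_estimator_, selected, metrics
```

The tuned values reported in Section 3.2 (learning_rate of 0.14, max_depth of 1, and n_estimators of 87) fall within these illustrative search ranges.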

3. Results

3.1. Optimizing Vegetation Indices Using PLSR-VIP

After pre-processing, the hyperspectral imagery was used to generate the spectral indices. The PLSR-VIP scores obtained for each spectral index are illustrated in the bar diagram in Figure 3.
On applying a threshold of 1, twelve indices were selected as feature vectors for the regression analysis. Of the 27 indices, only these 12 were strongly related to the measured LAI according to the PLSR-VIP scores: CIgreen, SIPI, AIVI, MRENDVI, TCARI/OSAVI, ONLI, RSI, MTCI2, VREI1, SR [800,680], NDVI [550,750], and MTVI2. The maps corresponding to these optimal indices are shown in Figure 4. All index values show a strong relationship with the N treatments.
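For reference, a minimal sketch of how a few of the selected indices could be computed per pixel from the pre-processed reflectance cube, using the formulations in Table 1 (the band-lookup helper and array names are illustrative assumptions):

```python
import numpy as np

def band(cube: np.ndarray, wavelengths: np.ndarray, wl_nm: float) -> np.ndarray:
    """Return the layer of a (rows, cols, bands) cube closest to wl_nm."""
    return cube[..., int(np.argmin(np.abs(wavelengths - wl_nm)))]

def ci_green(cube, wl):        # CIgreen = R780/R550 - 1 (Table 1)
    return band(cube, wl, 780) / band(cube, wl, 550) - 1

def ndvi_550_750(cube, wl):    # NDVI[550,750] = (R750 - R550)/(R750 + R550)
    r750, r550 = band(cube, wl, 750), band(cube, wl, 550)
    return (r750 - r550) / (r750 + r550)

def sr_800_680(cube, wl):      # SR[800,680] = R800/R680
    return band(cube, wl, 800) / band(cube, wl, 680)
```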

3.2. LAI Mapping and Validation

The 12 characteristic indices were used with the XGBoost model to develop an LAI prediction model. On validation against the in situ measurements, good prediction accuracy was obtained, with an R2 of 0.71, an RMSE of 0.52, and an MAE of 0.44. The optimized values of learning_rate, max_depth, and n_estimators were 0.14, 1, and 87, respectively. The scatter diagram with the validation results is shown in Figure 5a. The LAI map generated by applying the optimized model to the 12 characteristic indices of the UAV hyperspectral imagery is shown in Figure 5b. The LAI map captures the distinct variation in LAI with respect to the N treatments; most of the plots under the N0 treatment showed lower LAI values. Thus, the predicted map provides a clear visualization of the spatial variation in LAI.
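Producing the spatial map amounts to stacking the 12 selected index layers and applying the trained regressor pixel by pixel; a minimal sketch, assuming NumPy arrays and a boolean crop mask from the segmentation step (function and variable names are illustrative):

```python
import numpy as np

def predict_lai_map(model, index_stack: np.ndarray, crop_mask: np.ndarray) -> np.ndarray:
    """Apply a trained regressor to a (rows, cols, 12) stack of index layers."""
    rows, cols, n_indices = index_stack.shape
    lai = model.predict(index_stack.reshape(-1, n_indices)).astype(float)
    lai = lai.reshape(rows, cols)
    lai[~crop_mask] = np.nan        # keep only segmented crop pixels
    return lai
```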

4. Discussion

Many studies have reported characteristic bands for LAI estimation using multiple feature selection models, but these bands varied with the regression modelling and variable selection approaches [33]. Optimized spectral indices incorporating red edge bands have proved crucial for monitoring the LAI of wheat [34]. More accurate LAI estimates have been reported for vegetation indices that use near-infrared bands, and, compared to the red band, the blue and green bands contribute strongly to LAI estimation [2]. The indices optimized through PLSR-VIP in this study were likewise dominated by these LAI-sensitive bands and thereby supported accurate estimation of LAI. Moreover, SR [800,680], MTVI2, and CIgreen, which belong to the optimized set, have already proven their ability to estimate LAI accurately [1,2,33]. These results support the use of a VIP threshold of 1 for selecting optimal indices, as also reported earlier for VIP scores in LAI estimation [35]. The optimized indices and the generated LAI map show a similar trend in LAI distribution with respect to the ground truth values. The experimental plots corresponding to low-nitrogen treatments contain many red-coloured pixels, indicating lower LAI values, and thereby a strong variation in estimated LAI with respect to the N treatments was observed. Since LAI is a strong indicator of crop health [11], these maps can be utilized for fertilizer management.
In the present study, XGBoost delivered an accurate prediction of wheat LAI using the characteristic indices obtained from the PLSR-VIP selection approach. The model resulted in an R2 value of 0.71, which is comparable with previous studies on LAI estimation using multiple machine learning models [4,36,37]. A few of these studies reported that XGBoost outperformed other machine learning models [37,38,39]. In addition to XGBoost, other advanced regressors such as k-nearest neighbours, extra trees, random forest, and support vector machines have also achieved high R2 values [40]. However, a comparison of multiple machine learning models and feature selection techniques will be performed in the future to select the best combination for a more accurate prediction of LAI. Moreover, data points from multiple growth stages and cropping seasons will be included in future training of the ML models to improve prediction accuracy.

5. Conclusions

The selection of optimal vegetation indices through PLSR-VIP scoring was successfully applied to predicting the LAI of wheat crops. A total of 27 indices was reduced to 12 characteristic indices sensitive to the LAI variability of wheat fields. These indices were used as input feature vectors for the XGBoost regression model and achieved good prediction accuracy (R2 of 0.71, RMSE of 0.52, and MAE of 0.44). The high-resolution LAI map generated from the UAV hyperspectral data using this optimized model offers a non-destructive, rapid, and accurate alternative to conventional approaches.

Author Contributions

Conceptualization, R.G.R. and R.N.S.; methodology, R.G.R.; software, R.N.S.; validation, R.G.R. and R.R.; formal analysis, R.G.R. and R.R.; investigation, R.G.R.; resources, T.K., A.B. and S.G.; data curation, T.K., A.B. and S.G.; writing—original draft preparation, R.G.R.; writing—review and editing, R.G.R. and R.N.S.; visualization, R.G.R.; supervision, R.N.S.; project administration, R.N.S.; funding acquisition, R.N.S. All authors have read and agreed to the published version of the manuscript.

Funding

The results summarized in the manuscript were achieved as part of the research project “Network Program on Precision Agriculture (NePPA)”, which is funded by the Indian Council of Agricultural Research (ICAR), India.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Data can be made available upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

1. Gong, Y.; Yang, K.; Lin, Z.; Fang, S.; Wu, X.; Zhu, R.; Peng, Y. Remote Estimation of Leaf Area Index (LAI) with Unmanned Aerial Vehicle (UAV) Imaging for Different Rice Cultivars throughout the Entire Growing Season. Plant Methods 2021, 17, 88.
2. Din, M.; Zheng, W.; Rashid, M.; Wang, S.; Shi, Z. Evaluating Hyperspectral Vegetation Indices for Leaf Area Index Estimation of Oryza Sativa L. at Diverse Phenological Stages. Front. Plant Sci. 2017, 8, 820.
3. Rejith, R.G.; Sahoo, R.N.; Verrelst, J.; Ranjan, R.; Gakhar, S.; Anand, A.; Kondraju, T.; Kumar, S.; Kumar, M.; Dhandapani, R. UAV-Based Retrieval of Wheat Canopy Chlorophyll Content Using a Hybrid Machine Learning Approach. In Proceedings of the 2023 IEEE India Geoscience and Remote Sensing Symposium (InGARSS), Bangalore, India, 10–13 December 2023; pp. 1–4.
4. Shi, B.; Guo, L.; Yu, L. Accurate LAI Estimation of Soybean Plants in the Field Using Deep Learning and Clustering Algorithms. Front. Plant Sci. 2025, 15, 1501612.
5. Sahoo, R.N.; Kondraju, T.; Rejith, R.G.; Verrelst, J.; Ranjan, R.; Gakhar, S.; Bhandari, A.; Chinnusamy, V. Monitoring Cropland LAI Using Gaussian Process Regression and Sentinel-2 Surface Reflectance Data in Google Earth Engine. Int. J. Remote Sens. 2024, 45, 5008–5027.
6. Sahoo, R.N.; Gakhar, S.; Rejith, R.G.; Ranjan, R.; Meena, M.C.; Dey, A.; Mukherjee, J.; Dhakar, R.; Arya, S.; Daas, A.; et al. Unmanned Aerial Vehicle (UAV)-Based Imaging Spectroscopy for Predicting Wheat Leaf Nitrogen. Photogramm. Eng. Remote Sens. 2023, 89, 107–116.
7. Rejith, R.G.; Gakhar, S.; Sahoo, R.N.; Ranjan, R.; Meena, M.C.; Meena, A. UAV Hyperspectral Remote Sensing for Wheat Leaf Nitrogen Prediction. In Proceedings of the Applied Geoinformatics for Society and Environment (AGSE), Kerala, India, 2–4 November 2022; pp. 54–63.
8. Sahoo, R.N.; Rejith, R.G.; Gakhar, S.; Ranjan, R.; Meena, M.C.; Dey, A.; Mukherjee, J.; Dhakar, R.; Meena, A.; Daas, A.; et al. Drone Remote Sensing of Wheat N Using Hyperspectral Sensor and Machine Learning. Precis. Agric. 2023, 25, 704–728.
9. Hu, M.; Wang, J.; Yang, P.; Li, P.; He, P.; Bi, R. Estimation of Daylily Leaf Area Index by Synergy Multispectral and Radar Remote-Sensing Data Based on Machine-Learning Algorithm. Agronomy 2025, 15, 456.
10. Li, Y.; Xu, X.; Wu, W.; Zhu, Y.; Gao, L.; Jiang, X.; Meng, Y.; Yang, G.; Xue, H. Hyperspectral Estimation of Chlorophyll Content in Grapevine Based on Feature Selection and GA-BP. Sci. Rep. 2025, 15, 8029.
11. Sahoo, R.N.; Gakhar, S.; Rejith, R.G.; Verrelst, J.; Ranjan, R.; Kondraju, T.; Meena, M.C.; Mukherjee, J.; Daas, A.; Kumar, S.; et al. Optimizing the Retrieval of Wheat Crop Traits from UAV-Borne Hyperspectral Image with Radiative Transfer Modelling Using Gaussian Process Regression. Remote Sens. 2023, 15, 5496.
12. Sahoo, R.N.; Rejith, R.G.; Gakhar, S.; Verrelst, J.; Ranjan, R.; Kondraju, T.; Meena, M.C.; Mukherjee, J.; Dass, A.; Kumar, S.; et al. Estimation of Wheat Biophysical Variables through UAV Hyperspectral Remote Sensing Using Machine Learning and Radiative Transfer Models. Comput. Electron. Agric. 2024, 221, 108942.
13. He, L.; Song, X.; Feng, W.; Guo, B.B.; Zhang, Y.S.; Wang, Y.H.; Wang, C.Y.; Guo, T.C. Improved Remote Sensing of Leaf Nitrogen Concentration in Winter Wheat Using Multi-Angular Hyperspectral Data. Remote Sens. Environ. 2016, 174, 122–133.
14. Nie, C.; Shi, L.; Li, Z.; Xu, X.; Yin, D.; Li, S.; Jin, X. A Comparison of Methods to Estimate Leaf Area Index Using Either Crop-Specific or Generic Proximal Hyperspectral Datasets. Eur. J. Agron. 2023, 142, 126664.
15. Jiang, Z.; Huete, A.R.; Didan, K.; Miura, T. Development of a Two-Band Enhanced Vegetation Index without a Blue Band. Remote Sens. Environ. 2008, 112, 3833–3845.
16. Gitelson, A.A.; Viña, A.; Ciganda, V.; Rundquist, D.C.; Arkebauer, T.J. Remote Estimation of Canopy Chlorophyll Content in Crops. Geophys. Res. Lett. 2005, 32, L08403.
17. Sims, D.A.; Gamon, J.A. Relationships between Leaf Pigment Content and Spectral Reflectance across a Wide Range of Species, Leaf Structures and Developmental Stages. Remote Sens. Environ. 2002, 81, 337–354.
18. Dash, J.; Curran, P.J. The MERIS Terrestrial Chlorophyll Index. Int. J. Remote Sens. 2004, 25, 5403–5413.
19. Haboudane, D.; Miller, J.R.; Pattey, E.; Zarco-Tejada, P.J.; Strachan, I.B. Hyperspectral Vegetation Indices and Novel Algorithms for Predicting Green LAI of Crop Canopies: Modeling and Validation in the Context of Precision Agriculture. Remote Sens. Environ. 2004, 90, 337–352.
20. Wu, C.; Niu, Z.; Tang, Q.; Huang, W. Estimating Chlorophyll Content from Hyperspectral Vegetation Indices: Modeling and Validation. Agric. For. Meteorol. 2008, 148, 1230–1241.
21. Qi, J.; Chehbouni, A.; Huete, A.R.; Kerr, Y.H.; Sorooshian, S. A Modified Soil Adjusted Vegetation Index. Remote Sens. Environ. 1994, 48, 119–126.
22. Rondeaux, G.; Steven, M.; Baret, F. Optimization of Soil-Adjusted Vegetation Indices. Remote Sens. Environ. 1996, 55, 95–107.
23. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring Vegetation Systems in the Great Plains with ERTS. In Proceedings of the NASA SP-351, 3rd ERTS-1 Symposium, Washington, DC, USA, 10–14 December 1974.
24. Gitelson, A.; Merzlyak, M.N. Spectral Reflectance Changes Associated with Autumn Senescence of Aesculus Hippocastanum L. and Acer Platanoides L. Leaves. Spectral Features and Relation to Chlorophyll Estimation. J. Plant Physiol. 1994, 143, 286–292.
25. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a Green Channel in Remote Sensing of Global Vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298.
26. Feng, W.; Wu, Y.; He, L.; Ren, X.; Wang, Y.; Hou, G.; Wang, Y.; Liu, W.; Guo, T. An Optimized Non-Linear Vegetation Index for Estimating Leaf Area Index in Winter Wheat. Precis. Agric. 2019, 20, 1157–1176.
27. Tanaka, S.; Kawamura, K.; Maki, M.; Muramoto, Y.; Yoshida, K.; Akiyama, T. Spectral Index for Quantifying Leaf Area Index of Winter Wheat by Field Hyperspectral Measurements: A Case Study in Gifu Prefecture, Central Japan. Remote Sens. 2015, 7, 5329–5346.
28. Penuelas, J.; Baret, F.; Filella, I. Semi-Empirical Indices to Assess Carotenoids/Chlorophyll a Ratio from Leaf Spectral Reflectance. Photosynthetica 1995, 31, 221–230.
29. Broge, N.H.; Leblanc, E. Comparing Prediction Power and Stability of Broadband and Hyperspectral Vegetation Indices for Estimation of Green Leaf Area Index and Canopy Chlorophyll Density. Remote Sens. Environ. 2001, 76, 156–172.
30. Haboudane, D.; Miller, J.R.; Tremblay, N.; Zarco-Tejada, P.J.; Dextraze, L. Integrated Narrow-Band Vegetation Indices for Prediction of Crop Chlorophyll Content for Application to Precision Agriculture. Remote Sens. Environ. 2002, 81, 416–426.
31. Vogelmann, J.E.; Rock, B.N.; Moss, D.M. Red Edge Spectral Measurements from Sugar Maple Leaves. Int. J. Remote Sens. 1993, 14, 1563–1575.
32. Liang, L.; Di, L.; Zhang, L.; Deng, M.; Qin, Z.; Zhao, S.; Lin, H. Estimation of Crop LAI Using Hyperspectral Vegetation Indices and a Hybrid Inversion Method. Remote Sens. Environ. 2015, 165, 123–134.
33. He, L.; Ren, X.; Wang, Y.; Liu, B.; Zhang, H.; Liu, W.; Feng, W.; Guo, T. Comparing Methods for Estimating Leaf Area Index by Multi-Angular Remote Sensing in Winter Wheat. Sci. Rep. 2020, 10, 13943.
34. Kiala, Z.; Odindi, J.; Mutanga, O.; Peerbhay, K. Comparison of Partial Least Squares and Support Vector Regressions for Predicting Leaf Area Index on a Tropical Grassland Using Hyperspectral Data. J. Appl. Remote Sens. 2016, 10, 036015.
35. Ferraz, E.X.L.; Bezerra, A.C.; Lira, R.M.d.; Cruz Filho, E.M.d.; Santos, W.M.d.; Oliveira, H.F.E.d.; Silva, J.A.O.S.; Silva, M.V.d.; Silva, J.R.I.d.; Silva, J.L.B.d.; et al. What Is the Predictive Capacity of Sesamum Indicum L. Bioparameters Using Machine Learning with Red–Green–Blue (RGB) Images? AgriEngineering 2025, 7, 64.
36. Wang, Q.; Lu, X.; Zhang, H.; Yang, B.; Gong, R.; Zhang, J.; Jin, Z.; Xie, R.; Xia, J.; Zhao, J. Comparison of Machine Learning Methods for Estimating Leaf Area Index and Aboveground Biomass of Cinnamomum Camphora Based on UAV Multispectral Remote Sensing Data. Forests 2023, 14, 1688.
37. Zhang, J.; Cheng, T.; Guo, W.; Xu, X.; Qiao, H.; Xie, Y.; Ma, X. Leaf Area Index Estimation Model for UAV Image Hyperspectral Data Based on Wavelength Variable Selection and Machine Learning Methods. Plant Methods 2021, 17, 49.
38. Li, X.; Li, C.; Guo, F.; Meng, X.; Liu, Y.; Ren, F. Coefficient of Variation Method Combined with XGboost Ensemble Model for Wheat Growth Monitoring. Front. Plant Sci. 2023, 14, 1267108.
39. Chatterjee, S.; Baath, G.S.; Sapkota, B.R.; Flynn, K.C.; Smith, D.R. Enhancing LAI Estimation Using Multispectral Imagery and Machine Learning: A Comparison between Reflectance-Based and Vegetation Indices-Based Approaches. Comput. Electron. Agric. 2025, 230, 109790.
40. Du, L.; Yang, H.; Song, X.; Wei, N.; Yu, C.; Wang, W.; Zhao, Y. Estimating Leaf Area Index of Maize Using UAV-Based Digital Imagery and Machine Learning Methods. Sci. Rep. 2022, 12, 15937.
Figure 1. The workflow of estimating LAI using UAV hyperspectral data and machine learning.
Figure 2. Study area map showing the location of wheat fields (a,b) and experimental design (c).
Figure 3. PLSR-VIP ranking for vegetation indices. The red line denotes the threshold of 1.
Figure 4. Optimal vegetation indices selected through PLSR-VIP scoring.
Figure 5. (a) Scatter diagram showing validation results; (b) predicted LAI map.
Table 1. List of vegetation indices used in the study.
Sl No. | Vegetation Index | Formulation | Ref.
1 | Angular insensitivity vegetation index (AIVI) | [R445(R720 + R735) − R573(R720 − R735)]/[R720(R573 + R445)] | [13]
2 | Chlorophyll index (ClI) | R750/(R700 + R710) − 1 | [14]
3 | Two-band enhanced vegetation index (EVI2) | 2.5[(R800 − R660)/(1 + R800 + 2.4 R660)] | [15]
4 | Green chlorophyll index (CIgreen) | R780/R550 − 1 | [16]
5 | Modified simple ratio (mSR) | (R750 − R445)/(R705 − R445) | [17]
6 | MERIS terrestrial chlorophyll index (MTCI2) | (R754 − R709)/(R709 − R681) | [18]
7 | Modified triangular vegetation index (MTVI2) | 1.5[1.2(R800 − R550) − 2.5(R670 − R550)]/sqrt((2 R800 + 1)² − (6 R800 − 5 sqrt(R670)) − 0.5) | [19]
8 | Modified chlorophyll absorption ratio index (MCARI3) | [(R750 − R705) − 0.2(R750 − R550)] × (R750/R705) | [20]
9 | Modified chlorophyll absorption ratio index 1 (MCARI1) | 1.2[2.5(R800 − R670) − 1.3(R800 − R550)] | [19]
10 | Modified chlorophyll absorption ratio index 2 (MCARI2) | 1.5[2.5(R800 − R670) − 1.3(R800 − R550)]/sqrt((2 R800 + 1)² − (6 R800 − 5 sqrt(R670)) − 0.5) | [19]
11 | Modified red edge normalized difference vegetation index (MRENDVI) | (R750 − R705)/(R750 + R705 − 2 R445) | [17]
12 | Modified soil adjusted vegetation index (MSAVI2) | (2 R810 + 1 − sqrt((2 R810 + 1)² − 8(R810 − R660)))/2 | [21]
13 | MCARI3/OSAVI | {[(R750 − R705) − 0.2(R750 − R550)] × (R750/R705)}/[(1 + 0.16)(R800 − R670)/(R800 + R670 + 0.16)] | [20,22]
14 | Normalized difference vegetation index (NDVI [670,800]) | (R800 − R670)/(R800 + R670) | [23]
15 | Normalized difference ND [705,750] | (R750 − R705)/(R750 + R705) | [24]
16 | NDVI [550,750] | (R750 − R550)/(R750 + R550) | [25]
17 | Optimized soil adjusted vegetation index (OSAVI) | (1 + 0.16)(R800 − R670)/(R800 + R670 + 0.16) | [22]
18 | Optimized nonlinear vegetation index (ONLI) | 1.05(0.6 R798² − R728)/(0.6 R798² + R728 + 0.05) | [26]
19 | Photochemical reflectance index (PRI2) | (R531 − R570)/(R531 + R570) | [27]
20 | Ratio spectral index (RSI) | R760/R730 | [28]
21 | Red-edge chlorophyll index (CIre) | R780/R710 − 1 | [16]
22 | Structure insensitive pigment index (SIPI) | (R800 − R445)/(R800 − R680) | [29]
23 | Simple ratio index SR [800,680] | R800/R680 | [17]
24 | Triangular vegetation index (TVI) | 0.5[120(R750 − R550) − 200(R670 − R550)] | [30]
25 | Transformed chlorophyll absorption in reflectance index (TCARI) | 3[(R700 − R670) − 0.2(R700 − R550)(R700/R670)] | [31]
26 | TCARI/OSAVI | 3[(R700 − R670) − 0.2(R700 − R550)(R700/R670)]/[(1 + 0.16)(R800 − R670)/(R800 + R670 + 0.16)] | [22,31]
27 | Vogelmann red edge index 1 (VREI1) | R740/R720 | [32]
Rλ is the reflectance at wavelength λ (nm).
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
