Article

Characterizing Cotton Defoliation Progress via UAV-Based Multispectral-Derived Leaf Area Index and Analysis of Influencing Factors

1 Engineering Research Center of Plant Growth Regulator, Ministry of Education, College of Agronomy and Biotechnology, China Agricultural University, Beijing 100193, China
2 Key Laboratory of Tobacco Biology and Processing, Ministry of Agriculture and Rural Affairs, Tobacco Research Institute of Chinese Academy of Agricultural Sciences, Qingdao 266101, China
3 College of Agriculture, Tarim University, Alar 843300, China
* Author to whom correspondence should be addressed.
These authors have contributed equally to this work and share first authorship.
Remote Sens. 2026, 18(4), 609; https://doi.org/10.3390/rs18040609
Submission received: 30 December 2025 / Revised: 29 January 2026 / Accepted: 12 February 2026 / Published: 15 February 2026

Highlights

What are the main findings?
  • Optimal UAV height and index identified: EVI at 100 m flight altitude provided the most accurate LAI estimation (R2 = 0.921, RRMSE = 11.808%) for monitoring cotton defoliation.
  • Strong proxy for defoliation rate: The rate of LAI change is highly correlated with the manually measured defoliation rate (r = 0.83–0.88), enabling reliable operational monitoring.
  • Key interference sources quantified: Soil background and open cotton bolls were the primary factors reducing LAI estimation accuracy; removing them improved model performance significantly (e.g., R2 increase of 0.169 at 15 days after treatment).
  • Dynamic model selection needed: No single machine learning model performed best throughout the defoliation period, indicating the need for stage-specific or adaptive modeling strategies.
What are the implications of the main findings?
  • (Practical Application) The strong correlation between the LAI change rate and the defoliation rate (r = 0.83–0.88) provides farmers and agronomists with a reliable proxy metric. It enables rapid, UAV-based LAI monitoring to dynamically assess defoliation progress, thereby supporting precise harvest timing decisions.
  • (Methodological Improvement) The identification of soil and open cotton bolls as primary interference sources clearly indicates that background removal preprocessing (e.g., the SVM classification used in this study) must be integrated into UAV monitoring workflows during mid-to-late defoliation to significantly improve inversion accuracy.
  • (Methodological Insight) The variation in the optimal model with days after application reveals that a traditional “one-model-fits-all” approach is inadequate during periods of rapid canopy structural change. Future systems should employ adaptive or stage-specific intelligent modeling frameworks.
  • (Operational Guidance) The finding that the 100 m flight altitude yielded the best results offers direct guidance for UAV operational parameters. It shows that for defoliation monitoring, lower flight altitude (higher resolution) is not always better; moderate pixel mixing can help suppress canopy heterogeneity noise.

Abstract

Timely monitoring of cotton defoliation progress is crucial for optimizing the quality of mechanical harvesting. To accurately assess the defoliation status prior to mechanical picking, a field experiment was conducted in Hejian, Hebei Province, China, in 2022. Using a DJI P4M multispectral drone, canopy images of cotton were collected before and after defoliation at three flight altitudes: 25 m, 50 m, and 100 m. The study employed machine learning algorithms including linear regression, Support Vector Machine (SVM), Generalized Additive Model (GAM), and Random Forest (RF) to invert the Leaf Area Index (LAI). Additionally, SVM-based supervised classification was introduced to eliminate background interference from soil and open cotton bolls, while the XGBoost model and SHAP method were used to analyze the main factors influencing LAI inversion. Key findings include the following: The univariate linear relationship between EVI and LAI proved to be the most robust, with the model constructed from 100 m flight altitude data performing best (validation set: R2 = 0.921, RMSE = 0.284). The rate of LAI change showed a strong positive correlation with field-measured defoliation rate (r = 0.83–0.88), confirming its reliability as a proxy indicator for defoliation progress. Soil and open cotton bolls were identified as major negative factors affecting LAI inversion accuracy. The optimal machine learning prediction model varied with days after spraying, demonstrating significant temporal variability. This study demonstrates that high-throughput LAI inversion based on drone-derived multispectral EVI enables precise and dynamic monitoring of cotton defoliation. The approach provides farmers and field managers with an efficient, non-destructive monitoring tool. 
By delivering real-time insight into defoliation progress, it plays a pivotal role in enabling precision defoliation management, reducing excessive chemical use, optimizing the scheduling of mechanical operations, and ultimately enhancing both the sustainability and profitability of cotton production.

1. Introduction

Cotton is a strategic economic crop in China, playing a vital role in the national economy. The harvesting of cotton traditionally requires substantial manual labor, yet with the growing scarcity of agricultural labor, the shift toward mechanized harvesting has become an imperative. In practical agricultural production, defoliation and ripening agents are commonly applied to accelerate leaf abscission, thereby enhancing the efficiency of mechanical harvesting and reducing the impurity content in seed cotton. The progress of cotton defoliation is generally assessed by the defoliation rate, which serves as a key criterion for mechanical harvesting readiness.
Leaf Area Index (LAI), defined as the total one-sided area of green leaves per unit ground surface area, is a core parameter for characterizing vegetation structure and a key indicator for quantifying canopy leaf content in vegetation and ecosystem studies [1]. Beyond statistical correlations, LAI is mechanistically linked to critical physiological and ecological processes. It plays a significant role across multiple domains, including yield prediction [2], nitrogen nutrition diagnosis [3], water stress assessment [4], and early warning for pests and diseases [5]. Liao et al. found that LAI exhibited a negative exponential relationship with droplet deposition and a negative cubic relationship with cotton defoliation rate. Significant correlations were observed among LAI, droplet deposition, and defoliation rate, enabling prediction of cotton defoliation rate using an LAI-based multiple regression model [6]. In areas with low LAI, defoliant readily penetrates the upper canopy and reaches the lower canopy, resulting in a defoliation rate exceeding 80%. In contrast, in regions with relatively high LAI, UAV spraying demonstrates weaker penetration than ground-based spraying, leading to significantly lower defoliation rates and suboptimal efficacy. This indicates that LAI substantially influences post-application cotton defoliation rates [7].
LAI measurement methods can be broadly categorized into three types: direct destructive methods, indirect optical methods, and remote sensing inversion methods. Direct destructive methods are not only time-consuming and labor-intensive but also difficult to implement due to vegetation damage, non-repeatability, and variations across growth stages and leaf positions. Indirect optical methods, based on radiation transfer models or digital hemispherical photography principles, still require significant time for LAI determination. Therefore, the ability to rapidly and accurately determine LAI through high-throughput inversion methods is crucial. Compared to satellite and airborne platforms, UAVs offer distinct advantages: operational flexibility, low-altitude flight capability, and centimeter-level spatial resolution. Consequently, UAV-based remote sensing, particularly using multispectral cameras equipped with red-edge and near-infrared bands, is widely regarded as an ideal approach for retrieving crop physiological and biochemical parameters [8,9,10,11]. Studies at the cotton field scale have validated the predictive capability of UAV-based multispectral remote sensing for LAI, often employing vegetation indices and machine learning models such as Random Forest [12,13]. However, these traditional spectral-index-based approaches can face limitations under dynamic and heterogeneous canopy conditions, such as those encountered during defoliation. Recent advances in UAV-based deep learning models, which can automatically extract complex spatial-spectral features, have shown promising results in overcoming similar challenges in agricultural monitoring tasks, such as fine-grained crop detection and phenotyping [14]. These developments suggest their potential for improving LAI inversion robustness in complex scenarios.
Despite the immense potential of UAV remote sensing for monitoring crop LAI, existing studies have predominantly focused on the peak vegetative and reproductive growth stages, where closed canopies dominated by green leaves allow relatively straightforward model development [15,16,17]. However, during the period when open cotton bolls and defoliants are applied, cotton leaves undergo concurrent changes in color and water loss, wilting first and then drying before abscission [18]. Substantial leaf shedding exposes non-photosynthetic components such as soil and open cotton bolls, leading to a gradual increase in cotton spectral reflectance in the visible range and a decline in the near-infrared (NIR) and red-edge bands as leaves drop [19,20]. Consequently, the canopy spectral signal shifts from being “vegetation-dominated” to a complex mixture of vegetation, soil, and non-photosynthetic vegetation. Empirical models linking vegetation indices to LAI, originally established for early or full-bloom stages, often exhibit reduced accuracy during the defoliation period, likely due to insufficient consideration of background interference. Moreover, the influence of background factors on LAI retrieval during defoliation has not been systematically analyzed. The optimal observation scale remains unclear. While precision agriculture typically pursues higher spatial resolution to capture fine details, during defoliation a finer resolution (smaller ground sample distance, GSD) may amplify heterogeneity noise from soil and open cotton bolls, whereas moderate spectral mixing (larger GSD) could potentially improve inversion accuracy—an issue that has yet to be comparatively examined.
Accordingly, a field experiment on cotton defoliation and ripening was conducted in 2022 in Hejian, Hebei Province, within the Yellow River cotton-growing region. Multispectral imagery was acquired at three different flight altitudes (25 m, 50 m, and 100 m). We employed linear regression and machine learning modeling to achieve high-throughput inversion of cotton Leaf Area Index (LAI). The objectives were to identify the optimal flight altitude, vegetation index, and machine learning method for UAV-based LAI retrieval, and to investigate factors influencing inversion accuracy during the late growth stage of cotton. This work aims to provide methodological support for the real-time monitoring of cotton defoliation, thereby serving precision agriculture management. By supplying objective and continuous data on canopy status, it lays the groundwork for implementing intelligent agronomic practices such as defoliant variable application and precision harvesting. This approach contributes to enhancing operational efficiency while promoting resource conservation and sustainable agricultural development.

2. Materials and Methods

2.1. Experimental Design and Management

The experiment was conducted in 2022 in Hejian City, Hebei Province (38°23′N, 116°08′E). A split-plot design was adopted, with planting density as the main-plot factor and harvest aid concentration as the sub-plot factor. The cotton cultivar used was “Huazamian H318,” sown on 25 April. Two planting densities were set: 45,000 plants/ha and 90,000 plants/ha. The harvest aid concentration included five levels: CK (clear water control), T1 (750 mL/ha), T2 (1500 mL/ha), T3 (3000 mL/ha), and T4 (4500 mL/ha). Harvest aid treatment was applied on 23 September using a 50% (w/w) thidiazuron • ethephon suspension concentrate (T•E, from Hebei Guoxinnuonong Biotechnology Co., Ltd., Hejian City, China) (Figure 1a). Meteorological data recorded over the 21 days following the treatment showed an average temperature of 16.9 °C, with maximum and minimum daily temperatures of 25.7 °C and 9.3 °C, respectively, and a cumulative rainfall of 51.1 mm.
Each experimental plot covered 72 m2 (9 m long × 8 m wide). At a density of 45,000 plants ha−1, rows were spaced 100 cm apart with 22 cm between plants and at 90,000 plants ha−1, a “90 + 10 cm” wide-narrow row configuration was adopted, maintaining a 22 cm within-row spacing. Soil base fertility was measured prior to the commencement of the experiment: pH 7.8, total N 0.8 g kg−1, available P 46.0 mg kg−1, and available K 67.0 mg kg−1. Throughout the 2022 season, mepiquat chloride was applied at a cumulative rate of 487.5 g ha−1 for comprehensive growth regulation. On 22 July, chemical topping was performed by spraying 225 g ha−1 mepiquat chloride supplemented with 150 g ha−1 adjuvant. All other agronomic practices followed standard local recommendations.

2.2. Field Indicator Survey

2.2.1. Defoliation Rate

Six cotton plants with uniform growth, adjacent positioning, and representativeness were selected from each plot. The total leaf count per plant was recorded immediately before the application of the defoliant and at different days after treatment. The defoliation percentage (DP) was computed as (Equation (1)):
DP (%) = (N0 − Nn)/N0 × 100
where N0 denotes the leaf count recorded immediately before defoliant spraying, and Nn represents the corresponding count at 5, 10, 15, or 21 days thereafter.
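Equation (1) is straightforward to implement. The following sketch (in Python rather than the R used for the paper's analyses) computes the defoliation percentage from the two leaf counts:

```python
def defoliation_percentage(n0, nn):
    """Defoliation percentage (Equation (1)): share of leaves shed
    between the pre-spray count n0 and the later count nn."""
    if n0 <= 0:
        raise ValueError("pre-spray leaf count must be positive")
    return (n0 - nn) / n0 * 100.0
```

For example, a plant with 100 leaves before spraying and 20 leaves at 15 days after treatment has a defoliation percentage of 80%.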

2.2.2. Leaf Area Index (LAI)

On the day of defoliation rate evaluation, leaf area index (LAI) was measured in the three central rows of each plot using an LAI-2200 plant canopy analyzer (LI-COR, Lincoln, NE, USA) (Figure 1b).

2.3. Acquisition of UAV Imagery

The spatial extent of the experimental area was first delineated, after which flight routes were automatically generated using DJI GS Pro software (version 2.0.17). Multispectral imagery was acquired with a DJI P4M unmanned aerial vehicle (UAV) (manufactured by SZ DJI Technology Co., Ltd., Shenzhen, China) equipped with a multispectral camera. The camera consists of one color sensor for visible imaging and five monochrome sensors dedicated to multispectral imaging, each with an effective resolution of 2.08 megapixels (total pixels: 2.12 megapixels). The five multispectral bands, defined by their central wavelengths and full width at half maximum (FWHM), are as follows: blue (B, 450 ± 16 nm), green (G, 560 ± 16 nm), red (R, 650 ± 16 nm), red edge (RE, 730 ± 16 nm), and near-infrared (NIR, 840 ± 26 nm). The UAV flight speed was set to 5 m/s, with along- and across-track overlap rates of 80% and 70%, respectively. Image capture was performed in waypoint hover mode, while other flight parameters remained at default settings. Flights were conducted at three altitudes (25 m, 50 m, and 100 m); the camera has a field of view of 62.7°. All missions were carried out between 11:00 and 13:00 local time.
Initially, radiometric calibration was performed using a reflectance panel. Subsequently, the multispectral images were mosaicked using PIX4Dmapper software (version 4.5.6, Pix4D SA, Prilly, Switzerland). Following image mosaicking, normalization was applied to derive vegetation reflectance data. Spectral indices were then extracted based on the processed reflectance data. In QGIS 3.22 (QGIS Development Team), each plot was divided into two sub-areas based on ground observations to improve sample representativeness. Regions of interest (ROIs) were delineated within the central six rows of each sub-area. Finally, the corresponding spectral index values for each plot were calculated using standard vegetation index formulas (Figure 2).

2.4. Vegetation Index Formula

Vegetation indices associated with leaf area index dynamics and crop growth monitoring were selected from the literature for analysis, as listed in Table 1.
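Table 1 is not reproduced in this excerpt, so as an illustration the sketch below computes two of the indices discussed later (NDVI and EVI) from per-band reflectances, using the standard definitions; the exact formulas and coefficients the authors adopted are those listed in their Table 1 (Python is used here for illustration, whereas the paper's analyses were run in R):

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue):
    """Enhanced Vegetation Index with the standard MODIS-style
    coefficients (G = 2.5, C1 = 6, C2 = 7.5, L = 1); assumed here,
    since Table 1 gives the formulas actually used in the study."""
    return 2.5 * (nir - red) / (nir + 6.0 * red - 7.5 * blue + 1.0)
```

Both functions take band reflectances in the 0–1 range, as produced by the radiometric calibration step described in Section 2.3.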

2.5. Methods and Model Evaluation

All statistical analyses were performed using R 4.5.0 (R Foundation for Statistical Computing, Vienna, Austria). The dataset was partitioned into training and validation sets at an 8:2 ratio. To ensure the reproducibility of each training and validation run, a fixed random seed was applied. Prior to constructing multiple machine learning models for inversion, correlation analysis was first performed to identify the vegetation index exhibiting the strongest correlation. Subsequently, vegetation indices showing a correlation greater than 0.8 with this strongest index were removed to mitigate multicollinearity. Following this, Recursive Feature Elimination (RFE) was conducted for further feature selection. The overall flowchart of the research methodology is illustrated in the figure below (Figure 3).
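The multicollinearity screen described above can be sketched as follows. This is a minimal Python illustration (the paper used R), assuming `X` holds one column per vegetation index and `y` holds the measured LAI; the function name `filter_collinear` is hypothetical:

```python
import numpy as np

def filter_collinear(X, y, names, thresh=0.8):
    """Keep the vegetation index most correlated with LAI, then drop
    any other index whose correlation with it exceeds `thresh`
    (0.8 in the paper's pipeline, before the RFE step)."""
    r_y = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in range(X.shape[1])]
    best = int(np.argmax(r_y))  # index most correlated with LAI
    return [names[j] for j in range(X.shape[1])
            if j == best
            or abs(np.corrcoef(X[:, j], X[:, best])[0, 1]) <= thresh]
```

Indices surviving this screen are then passed to RFE (Section 2.5.6) for the final feature subset.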

2.5.1. Sample Feature Selection

A Support Vector Machine (SVM) was employed to perform background removal targeting soil and open cotton bolls in the cotton multispectral imagery. During the supervised classification process using SVM, four categories were defined: soil, open cotton bolls, canopy leaves, and canopy. For each category, 50 representative sample points were uniformly selected from the multispectral imagery to establish the training set. Subsequently, for validation, an additional 50 samples per category were chosen. The classification accuracy for each category was assessed using the confusion matrix tool within ENVI (version 5.6). Following this, ArcMap 10.8 was utilized to extract the respective areas of soil and open cotton bolls.

2.5.2. Linear Regression Model

Linear regression, a form of regression analysis, aims to establish a functional relationship between a dependent variable and one or more independent variables by estimating model parameters. The error term is typically minimized using the least squares method.
The model is expressed as:
y = β0 + β1x + E
where y represents the dependent variable, x denotes the independent variable, β1 is the regression coefficient, β0 is the intercept, and E stands for the error term.
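The least-squares estimates for the simple (one-predictor) case have a closed form, sketched below in Python for illustration (the paper's models were fitted in R):

```python
def fit_simple_ols(x, y):
    """Closed-form least-squares fit for y = b0 + b1*x + E:
    b1 = cov(x, y) / var(x), b0 = mean(y) - b1 * mean(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    b1 = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
          / sum((xi - mx) ** 2 for xi in x))
    b0 = my - b1 * mx
    return b0, b1
```

Applied to an EVI column and the corresponding measured LAI values, this yields equations of the form reported in Section 3.2 (e.g., y = 6.239x − 0.111 at 100 m).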

2.5.3. Support Vector Machine (SVM)

SVM is a robust machine learning algorithm widely employed in both classification and regression tasks. Its fundamental principle involves identifying an optimal hyperplane that maximizes the margin between different classes, thereby enhancing the model’s generalization capability. For regression tasks, SVM is extended to Support Vector Regression (SVR), which introduces an ε-insensitive loss function. This function allows a predefined tolerance margin between predicted and actual values, penalizing only errors that exceed this threshold, thus improving robustness against noise and outliers. SVM has been widely adopted in agricultural research, particularly for modeling tasks characterized by small sample sizes, high-dimensional features, and nonlinear relationships. Representative applications include crop yield prediction, monitoring of temperature and humidity variations, analysis of crop physiological and biochemical traits, as well as soil quality assessment and classification. The method demonstrates notable adaptability to nonlinear problems and performs robustly even with limited sample data.

2.5.4. Generalized Additive Model (GAM)

GAM represents a significant extension of the Generalized Linear Model (GLM), relaxing the strict linearity assumption of GLM by incorporating smooth functions. This enables GAM to flexibly capture nonlinear relationships inherent in the data. Key features of GAM include its ability to automatically model nonlinear associations between predictors and response variables, as well as its adaptability to various data types such as continuous, binary, and count responses.

2.5.5. Random Forest (RF)

RF operates by constructing multiple weak learners (decision trees) through bootstrap sampling and random feature selection, with final predictions derived from voting or averaging across all trees. This ensemble approach demonstrates robustness against overfitting, eliminates the need for feature scaling, and enables assessment of feature importance. Particularly suited for high-dimensional data, RF achieves superior generalization capability compared to single-model methods.

2.5.6. Recursive Feature Elimination (RFE)

RFE is a greedy “top-down” feature selection strategy. It starts by training a model on the full feature set, ranks features by their importance, and then eliminates the least important ones. This process is repeated iteratively—retraining the model and removing features—until a predefined number of features is reached. The final output is a subset of variables that contribute most significantly to prediction. The core principle of RFE is to use the model’s intrinsic “feature importance” as the elimination criterion, thereby avoiding the need for additional statistical tests.
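The loop described above can be written generically. In this Python sketch, `fit_and_rank` stands in for whatever model-specific importance measure is used (e.g., RF feature importance); both the function and its interface are assumptions for illustration:

```python
def rfe(features, fit_and_rank, n_keep):
    """Greedy recursive feature elimination: repeatedly fit, rank,
    and drop the single least important feature until n_keep remain.
    `fit_and_rank(feats)` must return importances aligned with feats."""
    feats = list(features)
    while len(feats) > n_keep:
        importances = fit_and_rank(feats)
        drop = min(range(len(feats)), key=lambda j: importances[j])
        feats.pop(drop)
    return feats
```

Because the model is refitted after every elimination, the ranking can change between iterations, which is what distinguishes RFE from a one-shot importance cut-off.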

2.5.7. SHAP

SHAP (SHapley Additive exPlanations) is a model interpretation method based on cooperative game theory. It assigns an importance value—known as the SHAP value—to each feature of the model to explain the contribution of individual features to the model output (i.e., the prediction result). These SHAP values reveal which features significantly influence the prediction and whether their effects are positive or negative. Combining XGBoost with SHAP analysis provides a powerful approach for interpreting predictions from complex models. Even for datasets with high dimensionality and intricate feature interactions, the XGBoost and SHAP framework delivers accurate and reliable explanations of model behavior.
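For intuition, SHAP values have an exact closed form in the special case of a linear model with independent features: each feature's contribution is its coefficient times its deviation from the feature mean. The toy sketch below (Python; not the XGBoost+SHAP pipeline actually used in the study) illustrates the additivity property that also holds for tree-based SHAP:

```python
import numpy as np

def linear_shap(X, beta):
    """Exact SHAP values for a linear model f(x) = b0 + x @ beta with
    independent features: phi[i, j] = beta[j] * (X[i, j] - mean_j).
    Per sample, the phi values sum to f(x_i) minus the mean prediction."""
    X = np.asarray(X, dtype=float)
    return np.asarray(beta, dtype=float) * (X - X.mean(axis=0))
```

The sign of each value shows whether the feature pushes the prediction above or below the average, mirroring how the SHAP plots in this study are read.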

2.5.8. Standardized Regression Coefficient (β Coefficient)

β coefficient is obtained by standardizing all variables in the regression model, thereby allowing variables measured in different units to be directly compared in terms of their relative importance. The standardization is performed using the following formula:
Z = (X − μ)/σ
where X is the original variable, μ is its mean, and σ is its standard deviation.

2.5.9. Evaluation of Model Accuracy

To evaluate the performance of the model, the coefficient of determination (R2), root mean square error (RMSE), and relative root mean square error (RRMSE) were used. The formulas for calculating R2, RMSE, and RRMSE are as follows:
R² = 1 − Σᵢ(yᵢ − ŷᵢ)² / Σᵢ(yᵢ − ȳ)²
RMSE = √[(1/n) Σᵢ(yᵢ − ŷᵢ)²]
RRMSE = RMSE / ȳ
where n is the sample size, ŷᵢ represents the predicted value, yᵢ is the actual value, and ȳ is the mean of the actual values.
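The three metrics can be implemented directly from their definitions, sketched here in Python (the study computed them in R); RRMSE is expressed as a percentage, matching how values such as 11.808% are reported:

```python
import math

def rmse(y, yhat):
    """Root mean square error."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y, yhat)) / len(y))

def r2(y, yhat):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    ybar = sum(y) / len(y)
    ss_res = sum((a - b) ** 2 for a, b in zip(y, yhat))
    ss_tot = sum((a - ybar) ** 2 for a in y)
    return 1.0 - ss_res / ss_tot

def rrmse(y, yhat):
    """RMSE relative to the mean of the observations, in percent."""
    return rmse(y, yhat) / (sum(y) / len(y)) * 100.0
```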

3. Results and Analysis

3.1. The Rate of Change in LAI Is Highly Correlated with the Defoliation Rate

Figure 4 shows scatter plots illustrating the correlations between leaf area index (LAI) and leaf number, as well as between the rate of LAI change and defoliation rate in cotton under different planting densities before and after the application of harvest aids. As shown in Figure 4a, under both 45,000 plants/ha and 90,000 plants/ha density conditions, LAI was highly positively correlated with leaf number, with correlation coefficients of 0.84 and 0.86, respectively, both highly significant. Furthermore, Figure 4b demonstrates that under the same two density conditions, the rate of LAI change was also highly positively correlated with the defoliation rate, with correlation coefficients of 0.83 (45,000 plants/ha) and 0.88 (90,000 plants/ha), respectively, both of which were statistically significant.

3.2. Univariate Linear Regression Results of Vegetation Indices and LAI Under Different Height Conditions

A simple linear regression analysis was conducted between LAI and the vegetation indices listed in Table 2. Among all vegetation indices, the Enhanced Vegetation Index (EVI) demonstrated the best predictive performance at 25 m altitude, with the regression equation y = 5.510x + 0.102. This yielded training set metrics of R2 = 0.881, RMSE = 0.360, and RRMSE = 14.706%, and validation set metrics of R2 = 0.882, RMSE = 0.347, and RRMSE = 14.418%. At 50 m altitude, EVI also exhibited optimal predictive accuracy, described by the equation y = 5.839x − 0.030. Training set results were R2 = 0.905, RMSE = 0.322, and RRMSE = 13.147%, and validation set results were R2 = 0.902, RMSE = 0.315, and RRMSE = 13.088%. Similarly, at 100 m altitude, EVI achieved the highest prediction accuracy, modeled as y = 6.239x − 0.111. Training set values were R2 = 0.916, RMSE = 0.303, and RRMSE = 12.381%, while validation set values were R2 = 0.921, RMSE = 0.284, and RRMSE = 11.808% (Table 2). Based on the EVI model from the 100 m altitude data, LAI inversion mapping was performed using ENVI 5.3 and ArcMap 10.8. The resulting spatial distribution map (Figure 5) visually represents the dynamic changes in LAI before and after chemical application at different time points.
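Applying the best-performing model above is a one-line inversion; the sketch below hardcodes the 100 m EVI coefficients reported in the text (y = 6.239x − 0.111), with the function name chosen here for illustration:

```python
def lai_from_evi_100m(evi):
    """LAI inversion using the reported 100 m EVI regression,
    y = 6.239x - 0.111 (coefficients from Table 2 of the study)."""
    return 6.239 * evi - 0.111
```

In practice this is evaluated per pixel on the EVI raster to produce the LAI distribution maps shown in Figure 5; the equation should only be trusted over the EVI range spanned by the calibration data.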

3.3. Leaf Area Index (LAI) Inversion Under Different Machine Learning Methods

All machine learning models demonstrated strong performance in predicting LAI. The RF model achieved an R2 of 0.991, RMSE of 0.101, and RRMSE of 4.176% on the training set, and an R2 of 0.958, RMSE of 0.208, and RRMSE of 8.405% on the validation set. The GAM yielded an R2 of 0.954, RMSE of 0.222, and RRMSE of 9.158% for the training set, and an R2 of 0.966, RMSE of 0.188, and RRMSE of 7.168% for the validation set. The SVM model produced an R2 of 0.947, RMSE of 0.238, and RRMSE of 9.802% on the training set, and an R2 of 0.952, RMSE of 0.223, and RRMSE of 9.031% on the validation set. Among the three models, the GAM exhibited comparatively robust and stable predictive performance (Figure 6).

3.4. Classification Results of Soil, Canopy, and Opened Cotton Bolls Using SVM-Based Supervised Classification

Based on the ENVI 5.3 platform, supervised classification using SVM was performed to distinguish soil, canopy, green leaves, and open cotton bolls. The images of cotton fields before and on different days after the application of the harvest aids, with soil removed, are shown in Figure 7, while the corresponding images with both soil and open cotton bolls removed are presented in Figure 8. Accuracy assessment results are summarized in Table 3. The classification outcomes from 4 October and 8 October consistently achieved overall accuracy values exceeding 99% for both soil vs. canopy and canopy leaves vs. open cotton bolls, with Kappa coefficients above 0.99. These results indicate that the SVM-based supervised classification approach delivers excellent performance in separating soil from canopy and canopy leaves from open cotton bolls.

3.5. Accuracy Analysis of LAI Retrieval Under Two Scenarios: Soil Removal and Combined Soil and Open Cotton Bolls Removal, Based on Multi-Day Data and Multiple Machine Learning Methods

The vegetation index features selected by different machine learning methods exhibited notable variations. As shown in Table 4, under the original image conditions, the relative root mean square error (RRMSE) of the validation set on 22 September (1 day before spraying) and 28 September (5 days after spraying) was below 10%, indicating high prediction accuracy during these two periods. In contrast, on 4 October (10 days after spraying), 8 October (15 days after spraying), and 13 October (20 days after spraying), the validation set RRMSE exceeded 10%, with significantly increased errors on 4 October and 8 October, suggesting a decline in model predictive performance during these later stages. In terms of the comparison among machine learning methods, the optimal model varied with the number of days after application: RF performed best on 22 September, GAM achieved the highest accuracy on 28 September, SVM was most effective on 4 October, RF again showed optimal performance on 8 October, and SVM yielded the best results on 13 October.
After removing the soil background (Table 4), the RRMSE of the validation set remained below 10% on 22 September and 28 September. For 4 October, 8 October, and 13 October, the RRMSE showed improvement compared to the original images. Specifically, the optimal model RRMSE for 4 October and 13 October decreased to below 10%, and the error for 8 October was also significantly reduced. Regarding the selection of the optimal model, RF remained the best on 22 September, SVM became optimal on 28 September, SVM continued to perform best on 4 October, GAM achieved the highest accuracy on 8 October, and GAM again was optimal on 13 October.
After removing both soil and open cotton bolls (Table 4), the RRMSE of the optimal models in the validation set was below 10% on 22 September, 28 September, 4 October, and 13 October, while the RRMSE on 8 October also showed further improvement. In terms of model performance, GAM performed best on 22 September, SVM on 28 September, GAM on 4 October, SVM on 8 October, and RF on 13 October.

3.6. Analysis of Factors Influencing the Accuracy of Leaf Area Index Retrieval

As shown in Figure 9a, the soil area proportion (Soil) and the open cotton bolls area proportion (Cotton) were identified as the main factors affecting the accuracy of LAI retrieval. Partial correlation analysis indicated that both the open cotton bolls area (Cotton) and the soil area proportion (Soil) exhibited an overall negative correlation with LAI (Figure 9b). Figure 9c illustrates the spectral reflectance of different surface features across various bands. Green leaves exhibited higher reflectance in the near-infrared (NIR) region. In contrast, dry leaves showed lower reflectance in the red band and decreased reflectance in the NIR region. Open cotton bolls demonstrated relatively high reflectance in the visible spectrum. Soil, however, displayed low reflectance across all bands. Figure 9d presents the extent of influence of soil fraction and open cotton bolls fraction on vegetation indices, measured using standardized coefficients. The results indicated that both soil fraction and open cotton bolls fraction exerted a significant negative influence on NDVI and DVI. In this study, EVI showed greater sensitivity to variations in soil fraction, while NDVI was comparatively less affected by open cotton bolls than EVI was. Furthermore, the interaction between soil fraction and open cotton bolls fraction had a statistically significant effect on NDVI, but no significant effect was observed on EVI.

4. Discussion

4.1. Optimal Vegetation Index Selection for Leaf Area Index Retrieval During Cotton Defoliation

This study systematically evaluated the applicability of multiple vegetation indices for retrieving LAI during cotton defoliation using UAV-based multispectral data acquired at different flight altitudes. Notably, EVI consistently demonstrated optimal and stable predictive performance for LAI inversion across all three flight altitudes (25 m, 50 m, and 100 m). Moreover, model accuracy showed a slight improvement with increasing altitude, as evidenced by the optimal performance at 100 m, where the validation set achieved an R2 of 0.921 and RRMSE as low as 11.808%. The ground sample distances (GSDs) at flight altitudes of 25 m, 50 m, and 100 m were 1.29 cm, 2.61 cm, and 5.12 cm, respectively. The reduction in resolution caused by higher flight altitudes increases the proportion of mixed pixels, making it more difficult to distinguish between different surface features [39]. In a study by Zhang et al. [40], predicting oat above-ground biomass using imagery acquired at different altitudes (25 m, 50 m, and 100 m), the highest accuracy was achieved at 25 m. Furthermore, spectral resolution has been shown to have a particularly pronounced effect on the inversion accuracy of plant height [41]. However, the findings of this study do not contradict the theory of mixed pixels. Instead, during the cotton defoliation period—characterized by leaf discoloration, wilting, abscission, soil exposure, and a mixture of green leaves on the ground and withered leaves remaining on the plants—canopy heterogeneity increases, amplifying background noise from shadows and soil. UAV-based inspection systems often face similar challenges with background noise and complex scenes, where advanced segmentation and classification methods become essential. In such scenarios, a lower flight altitude, implying “excessive clarity” in observation, may intensify the impact of such background interference on estimation accuracy.
Matching the spatial resolution precisely with the ground sampling requirements can potentially lead to higher precision [42]. For instance, Wang et al., using DJI Phantom 4 Multispectral imagery for SPAD inversion at different flight heights, found that the inversion accuracy at 120 m was slightly higher than that at 20 m [43]. Similarly, in Qu et al.’s study predicting blueberry yield using imagery from different altitudes, models performed poorly with lower altitude imagery (5–10 m) due to interference from tree crown shadows, whereas imagery from relatively higher altitudes yielded better model performance [44]. This is further supported by Awais et al., who found that a flight altitude of 60 m provided the most accurate canopy temperature estimation in their multi-altitude study (25 m, 40 m, and 60 m) [45]. Therefore, the optimal flight altitude identified in this study does not negate the mixed pixel theory. Rather, it indicates that for specific crops, models, and research objectives, there may exist an optimal scale that achieves the best balance between effectively characterizing target agronomic parameters and suppressing irrelevant noise.
This trend may be attributed to the larger spatial coverage and reduced spatial heterogeneity of images acquired at higher altitudes, which likely mitigated the influence of mixed pixels. As a key parameter for characterizing crop canopy structure and photosynthetic capacity, LAI is closely related to vegetation indices. Previous studies have indicated that vegetation indices such as EVI can serve as effective indicators for estimating biomass and monitoring canopy status across different growth stages [46,47]. Nevertheless, EVI is not the only effective parameter for LAI retrieval. For example, in irrigation and fertilization experiments conducted in Shihezi, Xinjiang, the ratio vegetation index (RVI = NIR/R) derived from UAV multispectral images acquired at the full bud, early flowering, and full flowering stages showed the best predictive performance [12]. However, after the application of defoliant and ripening agents during the late growth stage of cotton, the canopy undergoes a dynamic process characterized by a continuous decline in LAI from initially high levels and a progressive increase in the exposure of soil background and white open cotton bolls. Under conditions of high LAI, the potential saturation effect of vegetation indices must be considered. Taking the classic Normalized Difference Vegetation Index (NDVI) as an example, its sensitivity decreases markedly once canopy biomass or LAI is already high and continues to increase, and the index readily enters a saturated state. Studies based on radiative transfer models have suggested a general threshold range of LAI = 2–3, corresponding to the critical point of just-complete vegetation cover, as the interval where sensitivity is lost [48]. Mutanga et al., through a summary analysis, indicated that grasslands, shrubs, and forests may have different saturation points [49]. Cotton typically reaches its peak LAI around the flowering stage.
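The saturation behavior described above can be illustrated with a common semi-empirical, Beer-Lambert-style NDVI-LAI response curve; the parameter values below are illustrative, not fitted to our data:

```python
import numpy as np

def ndvi_of_lai(lai, ndvi_soil=0.15, ndvi_inf=0.90, k=0.75):
    """Semi-empirical Beer-Lambert-style NDVI-LAI response (illustrative values)."""
    return ndvi_inf + (ndvi_soil - ndvi_inf) * np.exp(-k * lai)

lai = np.linspace(0.0, 6.0, 601)                    # 0.01 LAI steps
sensitivity = np.gradient(ndvi_of_lai(lai), lai)    # dNDVI/dLAI

# Sensitivity collapses in dense canopies: by LAI = 4 the index responds
# at only a small fraction of its rate near LAI = 0.5.
print(round(float(sensitivity[50] / sensitivity[400]), 1))
```

With these parameters, the marginal NDVI response at LAI = 4 is under a tenth of that at LAI = 0.5, which is the "sluggish" regime discussed above.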
During the defoliation period, cotton has generally entered the boll-opening phase, and the canopy is progressing towards senescence. In the study by Xu et al. on LAI inversion using machine learning, retrieval results noticeably deteriorated when LAI exceeded 4 [50]. In our study, the mean LAI was 3.795 before defoliant application, 3.221 at 5 days after application (DAA), 2.119 at 10 DAA, 1.661 at 15 DAA, and 1.303 at 20 DAA. Therefore, vegetation index saturation is essentially not a concern within the scope of this research. In the formulation of the Enhanced Vegetation Index (EVI), a blue band is introduced along with aerosol resistance coefficients to partially correct for aerosol scattering effects in the red band, while a canopy background adjustment factor helps correct for soil background influences [51,52,53]. The formulation also accounts for the nonlinear relationship between reflectance and fractional vegetation cover. When vegetation LAI falls below 1.0, indices such as EVI demonstrate greater robustness against soil background interference [54]. EVI may also be more effective than other vegetation indices for detecting plant growth during mid-to-late growing seasons [55]. In the present study, EVI maintained a more stable linear relationship with LAI compared to other indices [56].
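For reference, the NDVI and EVI formulations discussed here can be computed per pixel from red, NIR, and blue reflectances; the coefficients below (G = 2.5, C1 = 6, C2 = 7.5, L = 1) are the widely used MODIS EVI constants, and the example reflectances are illustrative:

```python
def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red)

def evi(nir, red, blue, g=2.5, c1=6.0, c2=7.5, l=1.0):
    """Enhanced Vegetation Index with the widely used MODIS coefficients.

    c1 and c2 weight the red and blue bands for aerosol resistance;
    l is the canopy background (soil) adjustment factor.
    """
    return g * (nir - red) / (nir + c1 * red - c2 * blue + l)

# Example reflectances for a healthy green-canopy pixel (illustrative values)
nir, red, blue = 0.45, 0.08, 0.04
print(round(ndvi(nir, red), 3), round(evi(nir, red, blue), 3))
```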

4.2. Temporal Variability in the Performance of Machine Learning Models for Inversion

In the prediction of single-date LAI, none of the machine learning models evaluated (RF, SVM, or GAM) consistently achieved optimal inversion accuracy across all observation dates throughout the defoliation process, regardless of whether the analysis was based on original multispectral images or on data processed by removing soil alone or both soil and open cotton bolls. For example, RF performed best one day before chemical application, GAM yielded the most accurate predictions at five and ten days after application, and RF regained the best performance at 15 and 20 days after application. This temporal variation in model performance is unlikely to be due to random error; it may instead reflect the differential ability of the machine learning algorithms to adapt to changing data distribution characteristics. The defoliation process is essentially a rapid transition of the cotton canopy from a relatively homogeneous and continuous state to a fragmented and heterogeneous one. Prior to chemical application, the canopy had a higher LAI and relatively uniform spectral characteristics, so the relationship between vegetation indices and LAI may approximate a stable nonlinear mapping. As an ensemble learning algorithm, Random Forest excels at capturing complex feature interactions and nonlinear relationships by constructing a large number of decision trees. It generally performs well when the data distribution is relatively stable and the relationship between features and the target variable is complex, which may explain its higher inversion accuracy in the pre-application phase.
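As an illustration of this kind of single-date inversion, a random forest can be fitted to a vegetation-index feature against measured LAI. The sketch below uses synthetic data and scikit-learn, not the study's own dataset:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
evi_mean = rng.uniform(0.1, 0.7, 300)             # per-plot mean EVI (synthetic)
lai = 6.0 * evi_mean + rng.normal(0.0, 0.2, 300)  # noisy LAI target (synthetic)

X_train, X_test, y_train, y_test = train_test_split(
    evi_mean.reshape(-1, 1), lai, test_size=0.3, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0)
rf.fit(X_train, y_train)
r2 = r2_score(y_test, rf.predict(X_test))
print(f"validation R2 = {r2:.3f}")
```

In practice the feature matrix would hold several vegetation indices per plot, and accuracy would be reported per observation date, as in this study.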
After applying the harvest aids, the canopy structure undergoes pronounced changes: leaves gradually abscise, the proportion of green vegetation decreases, and large areas of soil background and open white cotton bolls become exposed. These changes cause the global relationship between vegetation indices and LAI to become more complex and less stable. Simultaneously, as defoliation progresses, the range of LAI values narrows (transitioning from higher to lower values), and the underlying data distribution may shift substantially. Under such conditions, the generalization ability of different models and their capacity to capture local patterns are challenged. The strong performance of SVM during the mid-to-late stages after chemical application deserves attention. The core principle of SVM lies in identifying an optimal hyperplane (or hypersurface in a high-dimensional feature space) and addressing nonlinearity through kernel functions. Its optimization objective focuses on maximizing the margin rather than simply minimizing the error across all samples, which confers a degree of robustness against outliers and local variations in data distribution. During the mid- and late-defoliation stages, samples in the feature space may become more dispersed or form complex local structures due to increased heterogeneity. By defining the model through a subset of support vectors, SVM may better capture the dominant inversion patterns than models that attempt a global fit. The superior performance of GAM during specific periods (e.g., 5 days after application) may suggest that the relationship between LAI and certain vegetation indices during these phases aligns well with the smooth additive form that GAM is particularly adept at modeling. This observation implies that expecting a single static inversion model to maintain optimal performance throughout the entire rapidly changing defoliation process may be unrealistic.
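The margin-based robustness described above corresponds, in regression form, to epsilon-insensitive support vector regression. A small sketch on synthetic data with a few injected outliers (scikit-learn; all values are illustrative):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(1)
x = rng.uniform(0.1, 0.7, 200)
y = 6.0 * x + rng.normal(0.0, 0.15, 200)
y[:5] += 3.0                     # inject a few gross outliers

# Points inside the epsilon-tube contribute no loss, and each violation is
# penalised only linearly, so a handful of outliers barely shifts the fit.
svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0, epsilon=0.1))
svr.fit(x.reshape(-1, 1), y)
pred = float(svr.predict([[0.4]])[0])
print(round(pred, 2))
```

The prediction at x = 0.4 stays close to the clean trend (about 6 * 0.4 = 2.4) rather than being dragged toward the outliers, which is the robustness property invoked above.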
Especially during phenological stages characterized by rapid physiological or structural changes—such as defoliation, senescence, or stress—the best-performing model is likely to vary dynamically with the evolving canopy state. Multi-temporal remote sensing has been effectively used for crop mapping in complex terrains, supporting the need for phased modeling approaches [57]. Therefore, we propose the implementation of a phased modeling strategy—specifically, selecting or retraining the predictive model based on the number of days after defoliant application. In addition, Vision Transformer (ViT) models have shown strong performance in structural monitoring tasks, suggesting potential for future LAI inversion [58]. Similarly, data assimilation and adaptive modeling have improved crop growth simulations [59], supporting our phased model selection strategy.
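The proposed phased strategy can be sketched as a simple lookup that returns an estimator family by days after application (DAA). The phase boundaries below mirror the pattern reported in this section and are illustrative rather than prescriptive; SVR stands in for GAM, which scikit-learn does not provide:

```python
from sklearn.ensemble import RandomForestRegressor
from sklearn.svm import SVR

def select_model(days_after_application: int):
    """Return an (untrained) estimator family for the current defoliation phase."""
    if days_after_application < 0:
        # Pre-application: stable canopy, complex VI-LAI interactions -> RF
        return RandomForestRegressor(n_estimators=200, random_state=0)
    if days_after_application <= 10:
        # Early phase (5-10 DAA): smooth additive relations; SVR stands in
        # here for GAM, which scikit-learn does not implement
        return SVR(kernel="rbf")
    # Mid-to-late phase (15-20 DAA): RF regained the best performance
    return RandomForestRegressor(n_estimators=200, random_state=0)

print(type(select_model(5)).__name__)
```

In an operational pipeline, each branch would return a model pre-trained on data from the corresponding phase rather than an untrained estimator.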

4.3. Factors Influencing Leaf Area Index Retrieval

SVM demonstrated excellent performance in classifying soil, canopy, and open cotton bolls, providing a solid basis for the subsequent removal of non-vegetation background interference and the improvement of LAI retrieval accuracy. This result aligns with previous studies indicating that SVM possesses strong discriminative capability for small-sample and complex-category problems in high-dimensional feature spaces [60]. Soil background influences the reflectance spectrum of the cotton canopy [61]. High-resolution UAV imagery can enhance prediction accuracy by allowing the soil background to be removed [62]. However, soil-background removal may also discard pixels that carry vegetation information, in which case it does not necessarily improve inversion accuracy [63]. This suggests that the approach should be tailored to the specific context. During the late growth stage of cotton, especially around the application of defoliants, LAI retrieval can be affected by both soil background and open cotton bolls. The improvement in accuracy observed on 8 October (15 days after application) was notably more pronounced on images with both soil and open cotton bolls removed than on the other dates, with an increase in R2 of 0.169 and a decrease in RRMSE of 4.415%. As cotton defoliation accelerates, the soil background and open cotton bolls become increasingly distinct. To examine the influence of soil coverage and boll opening percentage on vegetation indices, we conducted the following analysis. First, we processed the original multispectral imagery by sequentially removing soil pixels and then both soil and open cotton boll pixels. For each plot, we calculated the mean reflectance per spectral band across three image sets: the original, the soil-removed, and the soil-and-open-boll-removed imagery.
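The SVM-based masking step can be sketched as a per-pixel classifier trained on labeled band reflectances for the soil, canopy, and open-boll classes; the class centres below are synthetic illustrations, not the signatures measured in this study:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(2)

def sample(centre, n=150):
    """Draw n noisy (red, NIR) reflectance pairs around a class centre."""
    return rng.normal(centre, 0.02, size=(n, 2))

# Illustrative class centres: soil (moderate red and NIR), green canopy
# (low red, high NIR), open bolls (bright in both bands)
X = np.vstack([sample([0.20, 0.25]), sample([0.06, 0.45]), sample([0.55, 0.60])])
y = np.repeat([0, 1, 2], 150)        # 0 = soil, 1 = canopy, 2 = open boll

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Mask non-canopy pixels (labels 0 and 2) before computing vegetation indices
pixels = np.array([[0.05, 0.44], [0.21, 0.26], [0.56, 0.61]])
print(clf.predict(pixels))
```

Applied image-wide, the predicted labels provide the per-pixel mask used to produce the soil-removed and soil-and-boll-removed image sets.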
Regions of interest (ROIs) were delineated to characterize the spectral signatures of key surface features: open cotton bolls, shaded dark soil, “litter-covered soil” (soil with senesced leaves on the surface), green leaves, and withered leaves. The spectral analysis revealed that open cotton bolls exhibited high reflectance across all bands, particularly in the visible region. Green leaves displayed the typical spectral profile of healthy vegetation. Withered leaves, compared to green leaves, showed increased reflectance in the red band and decreased reflectance in the near-infrared (NIR) band, indicating cellular structure degradation while retaining some vegetative characteristics. Soil spectra were generally low and flat across bands, with litter-covered soil showing slightly higher reflectance than bare dark soil. As defoliation progresses, the increasing prominence of soil and open cotton bolls may influence vegetation indices due to their relatively high reflectance in both the red and NIR bands. Taking NDVI and EVI as examples, we quantified the impact of soil fraction (Soil%), boll opening fraction (Boll%), and their interaction on these indices using standardized regression coefficients derived from multiple linear regression models. The results indicated that EVI demonstrated greater sensitivity to soil fraction, while NDVI was more sensitive to boll fraction. A synergistic interaction between soil and boll fractions was statistically significant for NDVI but not for EVI. Contrary to the expectation that EVI should be less affected by soil background than NDVI, the standardized coefficients here suggested a stronger soil influence on EVI. This discrepancy may be attributed to our use of soil coverage percentage rather than soil brightness as the explanatory variable. SHAP analysis further supported these findings, confirming that both soil fraction and boll opening fraction exhibited negative correlations with LAI.
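The standardized-coefficient analysis can be reproduced by z-scoring the predictors (Soil%, Boll%, and their interaction) before fitting an ordinary least-squares model; the data below are synthetic, so the coefficient values do not correspond to the study's results:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
soil = rng.uniform(0.0, 60.0, 200)        # Soil% per plot (synthetic)
boll = rng.uniform(0.0, 30.0, 200)        # Boll% per plot (synthetic)
ndvi = (0.8 - 0.006 * soil - 0.004 * boll
        - 0.0001 * soil * boll + rng.normal(0.0, 0.01, 200))

# z-score the predictors (and response) so the fitted coefficients are
# standardized betas, comparable across predictors with different units
X = np.column_stack([soil, boll, soil * boll])
Xz = StandardScaler().fit_transform(X)
yz = (ndvi - ndvi.mean()) / ndvi.std()

beta = LinearRegression().fit(Xz, yz).coef_
print(np.round(beta, 2))                  # [soil, boll, interaction]
```

Because each standardized beta expresses the index change per standard deviation of its predictor, their magnitudes can be compared directly, which is how the relative sensitivities of NDVI and EVI to soil and boll fractions were judged.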

4.4. Limitations of This Study and Suggestions

In discussing the influence of soil fraction and boll opening area on vegetation indices, the analysis was limited to their effects on NDVI and EVI. Notably, the standardized coefficient for soil fraction was larger for EVI than for NDVI, confirming EVI's greater sensitivity to soil background in this context. While EVI is primarily designed to mitigate the influence of soil brightness, soil brightness itself was not explicitly considered in this study. This highlights the need for more systematic future research, which might explore the development of an adaptive, dynamic vegetation index. Regarding the saturation point of LAI, it may vary with planting density, among other factors. Our assumption that LAI saturation in cotton likely occurs above a value of 4 is based on inferences from published inversion data. In reality, this saturation point is likely influenced by multiple interacting factors, presenting an intriguing avenue for further investigation. Furthermore, this study is constrained by geographical and temporal limitations, having been conducted at a single location (Hejian, Hebei) during the 2022 growing season. Data-driven models often lack generalizability across different cultivars, management practices, and environments. Enhancing model robustness may require integration with defoliation physiology and crop growth models. The universality of the background removal method is also limited: the SVM-based approach for masking soil and open cotton bolls relies on specific image features, and its automated applicability may require optimization under conditions with significant variations in boll opening status, soil moisture, or surface cover. Finally, while models such as SVM, GAM, and RF were compared, the analysis of feature selection and model comparison could be further refined.
Issues such as multicollinearity among features and its impact on model stability were not fully explored, nor were attempts made at ensemble modeling or in-depth hyperparameter optimization. These limitations collectively point to clear directions for future work, including expanding the experimental scope, integrating multi-source data, developing adaptive background filtering methods, and strengthening the linkage between model mechanisms and agronomic management practices.

5. Conclusions

In summary, this study demonstrates the feasibility of monitoring cotton defoliation progress using UAV-based multispectral data through phased analysis and machine learning. Key findings confirm the Enhanced Vegetation Index (EVI) as the optimal vegetation index for LAI retrieval and identify a flight altitude of 100 m as providing the best balance, effectively mitigating background noise from heterogeneous soil and open cotton bolls. The retrieved LAI showed a strong positive correlation with the field-measured defoliation rate (r = 0.83–0.88), while soil and open boll fractions were the main negative factors affecting accuracy, though SVM-based pixel-wise removal notably improved model performance. The observed temporal variability in model performance further underscores the need for phased, adaptive modeling rather than reliance on a single static approach. The dynamic monitoring framework developed in this study can provide real-time, objective progress mapping for cotton defoliation management, supporting precision defoliation and scientific harvesting decisions. This technology offers actionable data support for the precision management of cotton production, contributing to the transition of agricultural practices from experience-based to data-driven decision-making.
However, several limitations point toward clear future research directions: the analysis of background influence was limited to NDVI and EVI without examining soil brightness, suggesting the potential of integrating UAV-based thermal and hyperspectral data to enhance robustness and physiological insight; the geographical and temporal scope was confined to a single location and season, calling for validation across different climatic regions and management practices to improve generalizability; and the dependence of the SVM background-removal method on specific image features, along with the need to explore feature collinearity and advanced modeling techniques, indicates that future work should also develop dynamic, time-series-aware frameworks—such as incorporating growing degree days or cumulative days after defoliation—to better capture the continuous physiological and structural changes during the entire defoliation process.

Author Contributions

Conceptualization and supervision, K.Y., M.D. and Z.L.; methodology, K.Y., M.D., F.L. and X.T.; validation, Y.W. and Z.Z.; formal analysis, Y.W., Z.Z. and C.X.; writing—original draft preparation and visualization, Y.W.; data curation, Y.W., Z.Z., C.X., T.Z., C.Z. and Q.L.; investigation, Z.Z., C.X., T.Z., C.Z. and Q.L.; writing—review and editing, F.L., G.C., S.W., X.T. and M.D.; resources, G.C. and S.W.; funding acquisition, M.D. All authors have read and agreed to the published version of the manuscript.

Funding

This work was jointly funded by the China Agriculture Research System (CARS–15–16) and the China Agricultural University–Tarim University Joint Research Fund (ZNLH202301).

Data Availability Statement

Data will be made available on request.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Li, Z.W.; Xin, X.P.; Tang, H.; Yang, F.; Chen, B.R.; Zhang, B.H. Estimating Grassland LAI Using the Random Forests Approach and Landsat Imagery in the Meadow Steppe of Hulunber, China. J. Integr. Agric. 2017, 16, 286–297. [Google Scholar] [CrossRef]
  2. Baez-Gonzalez, A.D.; Kiniry, J.R.; Tiscareno, M.; Jaime, M.C.; Jose, L.M.; Richardson, C.W.; Jaime, S.G.; Juan, R.M. Large-Area Maize Yield Forecasting Using Leaf Area Index Based Yield Model. Agron. J. 2005, 97, 418–425. [Google Scholar] [CrossRef]
  3. Liu, X.J.; Cao, Q.; Yuan, Z.F.; Liu, X.; Wang, X.L.; Tian, Y.C.; Cao, W.X.; Zhu, Y. Leaf Area Index Based Nitrogen Diagnosis in Irrigated Lowland Rice. J. Integr. Agric. 2018, 17, 111–121. [Google Scholar] [CrossRef]
  4. Shu, Z.Y.; Zhang, B.Q.; Yu, L.Y.; Zhao, X.N. Reconciling Plant Water Stress Response Using Vegetation and Soil Moisture Data Assimilation for Vegetation-Soil-Hydrology Interaction Estimation Over the Chinese Loess Plateau. Agric. For. Meteorol. 2025, 369, 110584. [Google Scholar] [CrossRef]
  5. Board, J.E.; Maka, V.; Price, R.; Knight, D.; Baur, M.E. Development of Vegetation Indices for Identifying Insect Infestations in Soybean. Agron. J. 2007, 99, 650–656. [Google Scholar] [CrossRef]
  6. Liao, J.; Zang, Y.; Luo, X.W.; Zhou, Z.Y.; Zang, Y.; Wang, P. The Relations of Leaf Area Index with the Spray Quality and Efficacy of Cotton Defoliant Spraying Using Unmanned Aerial Systems (UASs). Comput. Electron. Agric. 2020, 169, 105228. [Google Scholar] [CrossRef]
  7. Wang, G.B.; Wang, J.H.; Chen, P.C.; Han, X.Q.; Chen, S.D.; Lan, Y.B. Droplets Deposition and Harvest-aid Efficacy for UAV Application in arid Cotton Areas in Xinjiang, China. Int. J. Agric. Biol. Eng. 2022, 15, 9–18. [Google Scholar] [CrossRef]
  8. Jin, X.L.; Liu, S.Y.; Baret, F.; Hemerlé, M.; Comar, A. Estimates of Plant Density of Wheat Crops at Emergence From Very Low Altitude UAV Imagery. Remote Sens. Environ. 2017, 198, 105–114. [Google Scholar] [CrossRef]
  9. Xun, L.; Zhang, J.H.; Yao, F.M.; Cao, D. Improved Identification of Cotton Cultivated Areas by Applying Instance-Based Transfer Learning on the Time Series of MODIS NDVI. Catena 2022, 213, 106130. [Google Scholar] [CrossRef]
  10. Feng, A.J.; Zhou, J.F.; Vories, E.; Sudduth, K.A. Evaluation of Cotton Emergence Using UAV-based Imagery and Deep Learning. Comput. Electron. Agric. 2020, 177, 105711. [Google Scholar] [CrossRef]
  11. Feng, L.; Chen, S.S.; Zhang, C.; Zhang, Y.C.; He, Y. A comprehensive review on recent applications of unmanned aerial vehicle remote sensing with various sensors for high-throughput plant phenotyping. Comput. Electron. Agric. 2021, 182, 106033. [Google Scholar] [CrossRef]
  12. Chen, P.F. Cotton Leaf Area Index Estimation Using Unmanned Aerial Vehicle Multi-Spectral Images. In IGARSS 2019—2019 IEEE International Geoscience and Remote Sensing Symposium; IEEE: New York, NY, USA, 2019; pp. 6251–6254. [Google Scholar]
  13. Shi, H.L.; Cao, H.X.; Zhang, W.J.; Zhu, S.; He, Z.J.; Zhang, Z. Leaf Area Index Inversion of Cotton Based on Drone Multi-Spectral and Multiple Growth Stages. Sci. Agric. Sin. 2024, 57, 80–95. (In Chinese) [Google Scholar]
  14. Alshehri, M.; Zahoor, L.; Alqahtani, Y.; Alshahrani, A.; AlHammadi, D.A.; Jalal, A.; Liu, H. Unmanned Aerial Vehicle Based Multi-Person Detection via Deep Neural Network Models. Front. Neurorobotics 2025, 19, 1582995. [Google Scholar] [CrossRef] [PubMed]
  15. Tian, Y.J.; Jiang, Y.F.; Zeng, M.; Hui, J.J.; Jiang, Q.S. Inversion of Cotton Leaf Area Index Under Verticillium Wilt Stress from UAV Multispectral Images: Deep Learning-based vs. Classical-based Algorithms. Ind. Crop. Prod. 2025, 233, 121345. [Google Scholar] [CrossRef]
  16. Ma, Y.R.; Zhang, Q.; Yi, X.; Ma, L.L.; Zhang, L.F.; Huang, C.P.; Zhang, Z.; Lv, X. Estimation of Cotton Leaf Area Index (LAI) Based on Spectral Transformation and Vegetation Index. Remote Sens. 2022, 14, 136. [Google Scholar] [CrossRef]
  17. Ma, Y.R.; Lv, X.; Yi, Y.; Ma, L.L.; Qi, Y.Q.; Hou, D.Y.; Zhang, Z. Monitoring of Cotton Leaf Area Index Using Machine Learning. Trans. Chin. Soc. Agric. Eng. 2021, 37, 152–162. (In Chinese) [Google Scholar]
  18. Zhang, G.L.; Chen, B.; Liu, J.D.; Wang, J.; Yu, Y.; Hang, H.Y.; Wang, F.Y.; Li, T.T. Analysis of Visual Symptoms and Mechanism of Machine-Picked Cotton under Different Configuration Modes. Xinjiang Agric. Sci. 2019, 56, 1783–1793. (In Chinese) [Google Scholar]
  19. Guo, Y.T. Construction of Evaluation and Prediction Model for Cotton Defoliation Sensitivity Based on Multispectral; Xinjiang Agricultural University: Ürümqi, China, 2025. (In Chinese) [Google Scholar]
  20. Shi, G.W.; Du, X.; Du, M.W.; Li, Q.Z.; Tian, X.L.; Ren, Y.T.; Zhang, Y.; Wang, H.Y. Cotton Yield Estimation Using the Remotely Sensed Cotton Boll Index from UAV Images. Drones 2022, 6, 254. [Google Scholar] [CrossRef]
  21. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W.; Harlan, J.C. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation; NASA/GSFC: Greenbelt, MD, USA, 1974. [Google Scholar]
  22. Tucker, C.J. Red and Photographic Infrared Linear Combinations for Monitoring Vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  23. Wang, X.Q.; Wang, M.M.; Wang, S.Q.; Wu, Y.D. Extraction of Vegetation Information from Visible Unmanned Aerial Vehicle Images. Trans. Chin. Soc. Agric. Eng. 2015, 31, 152–159. (In Chinese) [Google Scholar]
  24. Lyon, J.G.; Yuan, D.; Lunetta, R.S.; Elvidge, C.D. A Change Detection Experiment Using Vegetation Indices. Photogramm. Eng. Remote Sens. 1998, 64, 143–150. [Google Scholar]
  25. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a Green Channel in Remote Sensing of Global Vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  26. Gao, Y.; Li, K.L.; Luo, Y.K.; Pan, Q.; Zhang, S.Y. Monitoring of Sugar Beet Growth Indicators Using Wide-Dynamic-Range Vegetation Index (WDRVI) Derived from UAV Multispectral Images. Comput. Electron. Agric. 2020, 171, 105331. [Google Scholar]
  27. Birth, G.S.; Mcvey, G.R. Measuring the Color of Growing Turf with a Reflectance Spectrophotometer. Agron. J. 1968, 60, 587–712. [Google Scholar] [CrossRef]
  28. Vincini, M.; Frazzi, E.; D’Alessio, P. A broad-band leaf chlorophyll vegetation index at the canopy scale. Precis. Agric. 2008, 9, 303–319. [Google Scholar] [CrossRef]
  29. Gamon, J.A.; Surfus, J.S. Assessing Leaf Pigment Content and Activity with A Reflectometer. New Phytol. 1999, 143, 105–117. [Google Scholar] [CrossRef]
  30. Jordan, C.F. Derivation of Leaf-Area Index from Quality of Light on the Forest Floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
  31. Meyer, G.E.; Mehta, T.; Kocher, M.F.; Mortensen, D.A.; Samal, M.A. Textural Imaging and Discriminant Analysis for Distinguishing Weeds for Spot Spraying. Trans. ASAE 1998, 41, 1189–1197. [Google Scholar] [CrossRef]
  32. Meyer, G.E.; Neto, J.C. Verification of Color Vegetation Indices for Automated Crop Imaging Applications. Comput. Electron. Agric. 2008, 63, 282–293. [Google Scholar] [CrossRef]
  33. Meyer, G.E.; Hindman, T.W.; Laksmi, K. Machine Vision Detection Parameters for Plant Species Identification. In Proceedings of Spie the International Society for Optical Engineering; Society of Photo Optical: Bellingham, WA, USA, 1999; Volume 3543, pp. 327–335. [Google Scholar]
  34. Novozhilov, G.N.; Dav′ydov, O.V.; Mazurov, K.V.; Dudochkin, N.A.; Mikhailov, N. The Vegetative Index of Kerdo as an Indication of Primary Adaptation to Hot Climate Conditions. Voen.-Meditsinskiizhurnal 1969, 8, 68–69. [Google Scholar]
  35. Gitelson, A.A.; Kaufman, Y.J.; Stark, R.; Rundquist, D. Novel Algorithms for Remote Estimation of Vegetation Fraction. Remote Sens. Environ. 2002, 80, 76–87. [Google Scholar] [CrossRef]
  36. Huete, A.R. A Soil-Adjusted Vegetation Index (SAVI). Remote Sens. Environ. 1988, 25, 295–309. [Google Scholar] [CrossRef]
  37. Bareth, G.; Bolten, A.; Gnyp, M.L.; Reusch, S.; Jasper, J. Comparison of Uncalibrated RGBVI with Spectrometer-Based NDVI Derived from UAV Sensing Systems on Field Scale. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2016, XLI-B8, 837–843. [Google Scholar] [CrossRef][Green Version]
  38. Miura, T.; Huete, A.R.; Yoshioka, H. Evaluation of Sensor Calibration Uncertainties on Vegetation Indices for MODIS. IEEE Trans. Geosci. Remote Sens. 2000, 38, 1399–1409. [Google Scholar] [CrossRef]
  39. Francisco-Javier, M.C.; Torres-Sánchez, J.; Inmaculada, C.R.; Clavero-Rumbao, J.; García-Ferrer, A.; Peña, J.M.; Borra-Serrano, I.; López-Granados, F. Assessing Optimal Flight Parameters for Generating Accurate Multispectral Orthomosaicks by UAV to Support Site-Specific Crop Management. Remote Sens. 2015, 7, 12793–12814. [Google Scholar]
  40. Zhang, P.P.; Lu, B.; Ge, J.Y.; Wang, X.Y.; Yang, Y.D.; Shang, J.L.; La, Z.; Zang, H.D.; Zeng, Z.H. Using UAV-based Multispectral and Rgb Imagery to Monitor Above-Ground Biomass of Oat-based Diversified Cropping. Eur. J. Agron. 2025, 162, 127422. [Google Scholar] [CrossRef]
  41. Adedeji, O.; Abdalla, A.; Ghimire, B.; Ritchie, G.; Guo, W.X. Flight Altitude and Sensor Angle Affect Unmanned Aerial System Cotton Plant Height Assessments. Drones 2024, 8, 746. [Google Scholar] [CrossRef]
  42. Guo, Y.H.; Yin, G.D.; Sun, H.Y.; Wang, H.X.; Chen, S.Z.; Senthilnath, J.; Wang, J.Z.; Fu, Y.S. Scaling Effects on Chlorophyll Content Estimations with RGB Camera Mounted on a UAV Platform Using Machine-Learning Methods. Sensors 2020, 20, 5130. [Google Scholar] [CrossRef]
  43. Wang, J.J.; Yin, Q.; Cao, L.; Zhang, Y.T.; Li, W.L.; Wang, W.L.; Zhou, G.S.; Hou, Z.Y. Enhancing Winter Wheat Soil-Plant Analysis Development Value Prediction through Evaluating Unmanned Aerial Vehicle Flight Altitudes, Predictor Variable Combinations, and Machine Learning Algorithms. Plants 2024, 13, 1926. [Google Scholar] [CrossRef] [PubMed]
  44. Qu, H.C.; Zheng, C.F.; Ji, H.; Barai, K.; Zhang, Y.J. A Fast and Efficient Approach to Estimate Wild Blueberry Yield Using Machine Learning with Drone Photography: Flight Altitude, Sampling Method and Model Effects. Comput. Electron. Agric. 2024, 216, 108543. [Google Scholar] [CrossRef]
  45. Awais, M.; Li, W.; Masud Cheema, M.J.; Hussain, S.; Shaheen, A.; Aslam, B.; Liu, C.; Ali, A. Assessment of Optimal Flying Height and Timing Using High-Resolution Unmanned Aerial Vehicle Images in Precision Agriculture. Int. J. Environ. Sci. Technol. 2021, 19, 2703–2720. [Google Scholar] [CrossRef]
  46. Li, H.J.; Liu, Z.Y.; Chen, Y.; Zhang, X.; Chen, D.H.; Chen, Y. A Positive Correlation Between Seed Cotton Yield and High-Efficiency Leaf Area Index in Directly Seeded Short-Season Cotton After Wheat. Field Crops Res. 2022, 285, 108594. [Google Scholar] [CrossRef]
  47. Potgieter, A.B.; George-Jaeggli, B.; Chapman, S.C.; Laws, K.; Cadavid, L.A.S.; Wixted, J.; Waston, J.; Eldridge, M.; Jordan, D.R.; Hammer, G.L. Multi-Spectral Imaging from an Unmanned Aerial Vehicle Enables the Assessment of Seasonal Leaf Area Dynamics of Sorghum Breeding Lines. Front. Plant Sci. 2017, 8, 1532. [Google Scholar] [CrossRef]
  48. Carlson, T.N.; Ripley, D.A. On the Relation Between NDVI, Fractional Vegetation Cover, and Leaf Area Index. Remote Sens. Environ. 1997, 62, 241–252. [Google Scholar] [CrossRef]
  49. Mutanga, O.; Masenyama, A.; Sibanda, M. Spectral Saturation in the Remote Sensing of High-Density Vegetation Traits: A Systematic Review of Progress, Challenges, and Prospects. ISPRS J. Photogramm. Remote Sens. 2023, 198, 297–309. [Google Scholar] [CrossRef]
Figure 1. Field application of harvest aid (a) and measurement of LAI (b).
Figure 2. Time-series RGB images of cotton fields before and after the application of harvest aid agents at different concentrations.
Figure 3. Overall workflow of the study.
Figure 4. Scatter plots illustrating the correlations among LAI, leaf numbers, rate of LAI change, and defoliation rate in cotton under different planting densities and defoliant application regimes. (a) Relationship between LAI and leaf number across different planting densities. (b) Relationship between the rate of LAI change and defoliation rate across different planting densities.
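The relationship in Figure 4b, between the rate of LAI change and the manually measured defoliation rate, can be sketched numerically. The snippet below is illustrative only: it assumes a common relative-decline definition of the change rate and uses hypothetical plot values, not the study's data.

```python
import numpy as np

def lai_change_rate(lai_pre, lai_post):
    """Relative LAI decline (%) from pre-treatment to a later date.

    Assumed definition: (LAI_pre - LAI_post) / LAI_pre * 100.
    """
    lai_pre = np.asarray(lai_pre, dtype=float)
    lai_post = np.asarray(lai_post, dtype=float)
    return (lai_pre - lai_post) / lai_pre * 100.0

def pearson_r(x, y):
    """Pearson correlation coefficient between two 1-D samples."""
    return np.corrcoef(np.asarray(x, float), np.asarray(y, float))[0, 1]

# Hypothetical plot-level values, for illustration only.
lai_pre = [3.2, 2.8, 3.5, 2.4]            # LAI before application
lai_15d = [1.1, 1.3, 0.9, 1.2]            # LAI 15 days after application
defoliation_rate = [66.0, 55.0, 75.0, 52.0]  # manually counted, %

rate = lai_change_rate(lai_pre, lai_15d)
print(np.round(rate, 1))
print(round(pearson_r(rate, defoliation_rate), 3))
```

With data of this shape, a strong positive correlation (comparable to the r = 0.83–0.88 reported in the study) indicates that the LAI-derived change rate can serve as a proxy for the field-measured defoliation rate.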
Figure 5. Mapping of leaf area index (LAI) based on the simple linear regression between EVI and LAI at 100 m flight altitude.
Figure 6. Prediction performance of different machine learning models on training and validation datasets.
Figure 7. Cotton field images before defoliant and ripening agent application and at different days after application, with soil removed.
Figure 8. Cotton field images before and at various days after defoliant and ripening-agent application, with both soil and open cotton bolls removed.
Figure 9. Key background factors affecting LAI retrieval, their spectral reflectance, and influences on vegetation indices. (a) SHAP analysis identifies main factors affecting LAI retrieval (Soil and Cotton). (b) Partial correlation of Soil and Cotton with LAI. (c) Spectral reflectance of different ground objects. (d) Standardized coefficients of Soil and Cotton effects on vegetation indices. Significance levels: *** p < 0.001. The spectral signatures of distinct surface targets are labeled as follows: “Green Leaf” for healthy foliage, “Dark Soil” for shaded black soil, “Litter Layer” for senesced leaves on the ground surface, “Withered Leaf” for desiccated leaves following defoliant application, and “Cotton” for open cotton bolls.
Table 1. Calculation Formulas for Spectral Indices.

| Vegetation Index | Formula | References |
|---|---|---|
| Normalized Difference Vegetation Index (NDVI) | (NIR − R)/(NIR + R) | [21] |
| Normalized Difference Red Edge (NDRE) | (NIR − RE)/(NIR + RE) | [22] |
| Visible-band Difference Vegetation Index (VDVI) | (2 * G − R − B)/(2 * G + R + B) | [23] |
| Normalized Green-red Difference Index (NGRDI) | (G − R)/(G + R) | [24] |
| Green Normalized Difference Vegetation Index (GNDVI) | (NIR − G)/(NIR + G) | [25] |
| Green Wide Dynamic Range Vegetation Index (GWDRVI) | (0.12 * NIR − G)/(0.12 * NIR + G) | [26] |
| Simple Ratio Index (SR) | NIR/R | [27] |
| Green Ratio Vegetation Index (GRVI) | NIR/G | [28] |
| Red Green Ratio Index (RGRI) | R/G | [29] |
| Difference Vegetation Index (DVI) | NIR − R | [30] |
| Excess Green Vegetation Index (EXG) | 2 * G − R − B | [31] |
| Excess Red Vegetation Index (EXR) | 1.4 * R − G | [32] |
| Excess Green minus Excess Red Vegetation Index (EXGR) | EXG − EXR | [33] |
| Vegetative Index (VEG) | G/(R^a * B^(1 − a)), a = 0.667 | [34] |
| Visible Atmospherically Resistant Index (VARI) | (G − R)/(G + R − B) | [35] |
| Red Edge Soil-Adjusted Vegetation Index (RESAVI) | 1.5 * (NIR − RE)/(NIR + RE + 0.5) | [36] |
| Red Green Blue Vegetation Index (RGBVI) | (G^2 − B * R)/(G^2 + B * R) | [37] |
| Enhanced Vegetation Index (EVI) | 2.5 * (NIR − R)/(NIR + 6 * R − 7.5 * B + 1) | [38] |

Note: B, G, R, RE, and NIR are the reflectance at the wavelengths of 450, 560, 650, 730, and 840 nm, respectively.
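The formulas in Table 1 map directly onto per-pixel band arithmetic. A minimal sketch for a few of the indices, using illustrative canopy reflectance values rather than values from the study's imagery:

```python
import numpy as np

def indices(B, G, R, RE, NIR):
    """Selected vegetation indices from Table 1 (band reflectances in 0-1)."""
    a = 0.667  # exponent in the VEG index
    return {
        "NDVI": (NIR - R) / (NIR + R),
        "NDRE": (NIR - RE) / (NIR + RE),
        "EVI": 2.5 * (NIR - R) / (NIR + 6 * R - 7.5 * B + 1),
        "VEG": G / (R ** a * B ** (1 - a)),
        "RGBVI": (G ** 2 - B * R) / (G ** 2 + B * R),
    }

# Illustrative reflectances for a healthy green canopy.
vals = indices(B=0.04, G=0.10, R=0.05, RE=0.30, NIR=0.45)
print({k: round(v, 3) for k, v in vals.items()})
```

The same expressions apply elementwise when B, G, R, RE, and NIR are NumPy arrays (entire band rasters), which is how per-pixel index maps are typically produced.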
Table 2. Results of Simple Linear Regression between Different Vegetation Indices and Leaf Area Index at 25 m, 50 m and 100 m Height.

| Height (m) | Vegetation Index | Model Formula | Training R² | Training RMSE (m²/m²) | Training RRMSE (%) | Validation R² | Validation RMSE (m²/m²) | Validation RRMSE (%) |
|---|---|---|---|---|---|---|---|---|
| 25 | NDVI | y = 6.044x − 1.898 | 0.711 | 0.561 | 22.912 | 0.722 | 0.525 | 21.817 |
| | NDRE | y = 38.559x − 6.346 | 0.447 | 0.776 | 31.697 | 0.446 | 0.738 | 30.699 |
| | GNDVI | y = 9.049x − 4.318 | 0.713 | 0.560 | 22.825 | 0.730 | 0.517 | 21.506 |
| | GWDRVI | y = 5.181x + 3.001 | 0.754 | 0.518 | 21.149 | 0.768 | 0.481 | 19.991 |
| | SR | y = 0.219x + 0.464 | 0.838 | 0.420 | 17.164 | 0.846 | 0.391 | 16.273 |
| | GRVI | y = 0.376x − 0.330 | 0.778 | 0.491 | 20.065 | 0.790 | 0.458 | 19.045 |
| | DVI | y = 8.965x + 0.361 | 0.871 | 0.375 | 15.325 | 0.869 | 0.365 | 15.183 |
| | RESAVI | y = 27.671x − 1.572 | 0.865 | 0.384 | 15.686 | 0.861 | 0.374 | 15.543 |
| | VDVI | y = 13.762x + 1.002 | 0.726 | 0.546 | 22.306 | 0.712 | 0.534 | 22.201 |
| | NGRDI | y = 8.739x + 2.066 | 0.804 | 0.463 | 18.889 | 0.805 | 0.440 | 18.290 |
| | RGRI | y = −4.352x + 6.608 | 0.773 | 0.498 | 20.322 | 0.777 | 0.470 | 19.551 |
| | EXG | y = 82.110x + 1.148 | 0.820 | 0.443 | 18.081 | 0.795 | 0.453 | 18.842 |
| | EXR | y = −55.026x + 3.242 | 0.608 | 0.654 | 26.694 | 0.602 | 0.627 | 26.089 |
| | EXGR | y = 35.035x + 2.399 | 0.737 | 0.535 | 21.863 | 0.723 | 0.525 | 21.840 |
| | VEG | y = 5.105x − 3.857 | 0.775 | 0.495 | 20.208 | 0.758 | 0.489 | 20.336 |
| | VARI | y = 5.469x + 1.980 | 0.833 | 0.426 | 17.410 | 0.840 | 0.398 | 16.545 |
| | RGBVI | y = 8.152x + 0.630 | 0.725 | 0.548 | 22.369 | 0.719 | 0.526 | 21.875 |
| | EVI | y = 5.510x + 0.102 | 0.881 | 0.360 | 14.706 | 0.882 | 0.347 | 14.418 |
| 50 | NDVI | y = 6.092x − 1.891 | 0.716 | 0.557 | 22.723 | 0.724 | 0.524 | 21.778 |
| | NDRE | y = 38.948x − 6.190 | 0.541 | 0.708 | 28.893 | 0.571 | 0.651 | 27.085 |
| | GNDVI | y = 9.512x − 4.330 | 0.720 | 0.553 | 22.572 | 0.738 | 0.510 | 21.208 |
| | GWDRVI | y = 5.348x + 3.128 | 0.760 | 0.512 | 20.897 | 0.781 | 0.467 | 19.438 |
| | SR | y = 0.247x + 0.408 | 0.839 | 0.419 | 17.093 | 0.850 | 0.387 | 16.092 |
| | GRVI | y = 0.412x − 0.411 | 0.778 | 0.492 | 20.010 | 0.804 | 0.443 | 18.414 |
| | DVI | y = 9.607x + 0.258 | 0.892 | 0.344 | 14.029 | 0.885 | 0.341 | 14.095 |
| | RESAVI | y = 28.247x − 1.656 | 0.887 | 0.352 | 14.359 | 0.888 | 0.335 | 13.911 |
| | VDVI | y = 14.252x + 1.033 | 0.767 | 0.504 | 20.594 | 0.733 | 0.514 | 21.393 |
| | NGRDI | y = 9.156x + 2.107 | 0.813 | 0.452 | 18.445 | 0.798 | 0.448 | 18.643 |
| | RGRI | y = −4.507x + 6.775 | 0.777 | 0.493 | 20.132 | 0.762 | 0.487 | 20.247 |
| | EXG | y = 85.820x + 1.184 | 0.832 | 0.427 | 17.452 | 0.800 | 0.446 | 18.537 |
| | EXR | y = −55.504x + 3.290 | 0.637 | 0.629 | 25.680 | 0.626 | 0.610 | 25.357 |
| | EXGR | y = 35.483x + 2.463 | 0.751 | 0.521 | 21.253 | 0.729 | 0.520 | 21.624 |
| | VEG | y = 5.442x − 4.139 | 0.811 | 0.453 | 18.508 | 0.782 | 0.464 | 19.309 |
| | VARI | y = 5.795x + 2.041 | 0.832 | 0.428 | 17.466 | 0.822 | 0.421 | 17.493 |
| | RGBVI | y = 8.311x + 0.703 | 0.737 | 0.507 | 20.700 | 0.737 | 0.510 | 21.201 |
| | EVI | y = 5.839x − 0.030 | 0.905 | 0.322 | 13.147 | 0.902 | 0.315 | 13.088 |
| 100 | NDVI | y = 5.997x − 1.730 | 0.725 | 0.547 | 22.346 | 0.735 | 0.513 | 21.349 |
| | NDRE | y = 36.492x − 5.479 | 0.554 | 0.697 | 28.469 | 0.641 | 0.596 | 24.770 |
| | GNDVI | y = 9.126x − 3.900 | 0.729 | 0.544 | 22.192 | 0.747 | 0.501 | 20.848 |
| | GWDRVI | y = 5.489x + 3.346 | 0.776 | 0.494 | 20.187 | 0.794 | 0.455 | 18.915 |
| | SR | y = 0.285x + 0.335 | 0.859 | 0.392 | 16.009 | 0.871 | 0.359 | 14.941 |
| | GRVI | y = 0.462x − 0.492 | 0.800 | 0.467 | 19.065 | 0.819 | 0.428 | 17.807 |
| | DVI | y = 10.650x + 0.158 | 0.901 | 0.328 | 13.398 | 0.906 | 0.309 | 12.860 |
| | RESAVI | y = 28.748x − 1.660 | 0.881 | 0.360 | 14.681 | 0.902 | 0.319 | 13.255 |
| | VDVI | y = 14.526x + 1.070 | 0.766 | 0.505 | 20.618 | 0.751 | 0.495 | 20.601 |
| | NGRDI | y = 9.778x + 2.110 | 0.820 | 0.443 | 18.086 | 0.819 | 0.423 | 17.606 |
| | RGRI | y = −4.817x + 7.062 | 0.782 | 0.487 | 19.885 | 0.782 | 0.466 | 19.366 |
| | EXG | y = 90.207x + 1.194 | 0.822 | 0.440 | 17.985 | 0.802 | 0.442 | 18.372 |
| | EXR | y = −53.381x + 3.285 | 0.623 | 0.641 | 26.169 | 0.618 | 0.616 | 25.614 |
| | EXGR | y = 35.287x + 2.511 | 0.733 | 0.539 | 22.008 | 0.717 | 0.531 | 22.095 |
| | VEG | y = 5.713x − 4.364 | 0.815 | 0.449 | 18.346 | 0.804 | 0.439 | 18.260 |
| | VARI | y = 6.193x + 2.062 | 0.836 | 0.423 | 17.251 | 0.840 | 0.398 | 16.571 |
| | RGBVI | y = 8.262x + 0.806 | 0.760 | 0.512 | 20.890 | 0.753 | 0.493 | 20.494 |
| | EVI | y = 6.239x − 0.111 | 0.916 | 0.303 | 12.381 | 0.921 | 0.284 | 11.808 |
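As a worked example of how the Table 2 models are applied and scored, the sketch below uses the best-performing fit (EVI at 100 m: LAI = 6.239 × EVI − 0.111) with hypothetical EVI/LAI pairs, not the study's samples. R², RMSE, and RRMSE follow their standard definitions, with RRMSE assumed to be RMSE divided by the observed mean.

```python
import numpy as np

def fit_metrics(y_true, y_pred):
    """R2, RMSE, and relative RMSE (% of the observed mean)."""
    y_true = np.asarray(y_true, float)
    y_pred = np.asarray(y_pred, float)
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    rrmse = rmse / y_true.mean() * 100.0
    return r2, rmse, rrmse

# Coefficients from Table 2 (EVI at 100 m); sample values are hypothetical.
evi = np.array([0.25, 0.35, 0.45, 0.55])       # EVI observations
lai_obs = np.array([1.50, 2.10, 2.65, 3.35])   # ground-measured LAI (m2/m2)
lai_pred = 6.239 * evi - 0.111                 # model-inverted LAI

r2, rmse, rrmse = fit_metrics(lai_obs, lai_pred)
print(round(r2, 3), round(rmse, 3), round(rrmse, 2))
```

Applying the same formula to an EVI raster (an array instead of a vector) yields a pixelwise LAI map, which is the basis of the Figure 5 mapping.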
Table 3. Accuracy assessment results for the extraction of soil and canopy, as well as canopy and open cotton bolls.

| Date | Category | Overall Accuracy | Kappa Coefficient |
|---|---|---|---|
| 9.22 | Soil and canopy | 99.70% | 0.973 |
| 9.22 | Canopy leaves and open cotton bolls | 97.23% | 0.931 |
| 9.28 | Soil and canopy | 99.28% | 0.985 |
| 9.28 | Canopy leaves and open cotton bolls | 99.06% | 0.979 |
| 10.4 | Soil and canopy | 99.94% | 0.999 |
| 10.4 | Canopy leaves and open cotton bolls | 99.73% | 0.992 |
| 10.8 | Soil and canopy | 99.83% | 0.996 |
| 10.8 | Canopy leaves and open cotton bolls | 99.85% | 0.996 |
| 10.13 | Soil and canopy | 98.39% | 0.963 |
| 10.13 | Canopy leaves and open cotton bolls | 99.89% | 0.998 |
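Both metrics in Table 3 derive from a classification confusion matrix: overall accuracy is the observed agreement, and the kappa coefficient corrects that agreement for chance. A short sketch with hypothetical two-class counts:

```python
import numpy as np

def accuracy_and_kappa(cm):
    """Overall accuracy and Cohen's kappa from a confusion matrix.

    cm[i, j] = number of reference-class-i pixels labeled as class j.
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    po = np.trace(cm) / n                                   # observed agreement (OA)
    pe = np.sum(cm.sum(axis=0) * cm.sum(axis=1)) / n ** 2   # chance agreement
    kappa = (po - pe) / (1.0 - pe)
    return po, kappa

# Hypothetical two-class (canopy vs. soil) validation counts.
cm = [[980, 20],
      [10, 990]]
oa, kappa = accuracy_and_kappa(cm)
print(f"OA = {oa:.2%}, kappa = {kappa:.3f}")
```

Kappa is the stricter of the two: with strongly imbalanced classes, a classifier can score a high OA by chance agreement alone, which is why Table 3 reports both.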
Table 4. LAI inversion models using machine learning with RFE feature selection: a comparison under original, soil-removed, and soil-and-open-cotton-bolls-removed conditions.

| Condition | Date | Method | Variables | Training R² | Training RMSE (m²/m²) | Training RRMSE (%) | Validation R² | Validation RMSE (m²/m²) | Validation RRMSE (%) |
|---|---|---|---|---|---|---|---|---|---|
| Original | 9.22 | RF | EVI | 0.928 | 0.106 | 2.776 | 0.723 | 0.213 | 5.746 |
| | | GAM | NDRE GRVI RESAVI EVI SR | 0.886 | 0.132 | 3.457 | 0.469 | 0.330 | 8.886 |
| | | SVM | EVI | 0.804 | 0.175 | 4.579 | 0.647 | 0.255 | 6.867 |
| | 9.29 | RF | VDVI DVI NDRE | 0.923 | 0.126 | 3.903 | 0.691 | 0.219 | 6.882 |
| | | GAM | VDVI NDRE | 0.680 | 0.234 | 7.235 | 0.673 | 0.198 | 6.222 |
| | | SVM | VDVI | 0.641 | 0.253 | 7.823 | 0.591 | 0.212 | 6.672 |
| | 10.04 | RF | NDRE VARI | 0.964 | 0.111 | 5.052 | 0.521 | 0.440 | 20.258 |
| | | GAM | NDRE | 0.824 | 0.237 | 10.742 | 0.742 | 0.256 | 11.793 |
| | | SVM | NDRE | 0.876 | 0.216 | 9.770 | 0.562 | 0.336 | 15.488 |
| | 10.08 | RF | NDRE EXG | 0.911 | 0.141 | 8.426 | 0.563 | 0.325 | 19.825 |
| | | GAM | EXG NDRE | 0.758 | 0.224 | 13.427 | 0.552 | 0.330 | 20.126 |
| | | SVM | EXG NDRE | 0.779 | 0.219 | 13.153 | 0.547 | 0.333 | 20.288 |
| | 10.13 | RF | RGRI NDRE | 0.937 | 0.065 | 4.975 | 0.695 | 0.140 | 10.822 |
| | | GAM | RGRI | 0.773 | 0.118 | 9.037 | 0.668 | 0.143 | 11.055 |
| | | SVM | RGRI NDRE | 0.859 | 0.096 | 9.574 | 0.543 | 0.171 | 13.238 |
| Soil-removed | 9.22 | RF | EXG NDRE | 0.947 | 0.100 | 2.626 | 0.463 | 0.274 | 7.377 |
| | | GAM | EXG | 0.754 | 0.194 | 5.080 | 0.693 | 0.212 | 5.708 |
| | | SVM | EXG | 0.792 | 0.182 | 4.776 | 0.503 | 0.253 | 6.804 |
| | 9.29 | RF | EXG VDVI | 0.910 | 0.117 | 3.643 | 0.673 | 0.232 | 7.223 |
| | | GAM | NDRE EXG | 0.651 | 0.220 | 6.802 | 0.613 | 0.253 | 7.951 |
| | | SVM | EXG VDVI DVI NDRE GRVI | 0.811 | 0.168 | 5.226 | 0.608 | 0.259 | 8.078 |
| | 10.04 | RF | GRVI | 0.952 | 0.125 | 5.644 | 0.801 | 0.234 | 10.756 |
| | | GAM | GRVI | 0.862 | 0.210 | 9.518 | 0.832 | 0.216 | 9.950 |
| | | SVM | GRVI | 0.876 | 0.201 | 9.104 | 0.847 | 0.202 | 9.323 |
| | 10.08 | RF | NDRE EXR | 0.909 | 0.146 | 8.769 | 0.642 | 0.294 | 17.900 |
| | | GAM | NDRE EXR | 0.717 | 0.243 | 14.565 | 0.654 | 0.287 | 17.504 |
| | | SVM | NDRE EXR | 0.777 | 0.218 | 13.093 | 0.603 | 0.310 | 18.901 |
| | 10.13 | RF | EVI NGRDI EXGR | 0.934 | 0.071 | 5.440 | 0.681 | 0.146 | 11.292 |
| | | GAM | EXGR EVI | 0.691 | 0.138 | 10.569 | 0.696 | 0.136 | 10.514 |
| | | SVM | EVI EXGR NGRDI | 0.804 | 0.110 | 8.461 | 0.488 | 0.177 | 13.699 |
| Soil and open cotton bolls-removed | 9.22 | RF | EVI RGBVI NDRE | 0.952 | 0.095 | 2.485 | 0.522 | 0.263 | 7.071 |
| | | GAM | EVI | 0.805 | 0.172 | 4.504 | 0.703 | 0.222 | 5.978 |
| | | SVM | EVI | 0.826 | 0.165 | 4.331 | 0.682 | 0.224 | 6.031 |
| | 9.29 | RF | VEG NDVI DVI GNDVI | 0.917 | 0.124 | 3.858 | 0.769 | 0.212 | 6.610 |
| | | GAM | NDRE DVI NDVI GNDVI | 0.781 | 0.175 | 5.411 | 0.478 | 0.396 | 12.445 |
| | | SVM | VEG NDVI DVI NDRE GNDVI | 0.803 | 0.170 | 5.311 | 0.605 | 0.256 | 7.987 |
| | 10.04 | RF | RESAVI | 0.914 | 0.167 | 7.578 | 0.652 | 0.306 | 14.112 |
| | | GAM | RESAVI | 0.771 | 0.270 | 12.238 | 0.754 | 0.256 | 11.793 |
| | | SVM | RESAVI EXR | 0.820 | 0.246 | 11.148 | 0.844 | 0.218 | 10.035 |
| | 10.08 | RF | RESAVI NDRE | 0.934 | 0.123 | 7.380 | 0.732 | 0.253 | 15.410 |
| | | GAM | RESAVI | 0.750 | 0.228 | 13.666 | 0.706 | 0.265 | 16.162 |
| | | SVM | RESAVI NDRE EXR | 0.847 | 0.188 | 11.246 | 0.624 | 0.303 | 18.505 |
| | 10.13 | RF | RGRI NDRE EXGR | 0.919 | 0.079 | 6.012 | 0.697 | 0.139 | 10.732 |
| | | GAM | RGRI EXGR | 0.730 | 0.129 | 9.880 | 0.658 | 0.145 | 11.210 |
| | | SVM | RGRI NDRE EXGR | 0.801 | 0.117 | 8.952 | 0.581 | 0.163 | 12.572 |
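The RFE step behind Table 4 can be illustrated with a toy recursion: repeatedly drop the weakest feature until the desired subset remains. To stay self-contained, this sketch ranks features by their absolute correlation with LAI as a stand-in for the random-forest importances the study's pipeline would use, and both the data and the feature relationships are synthetic.

```python
import numpy as np

def rfe_by_correlation(X, y, names, n_select):
    """Toy recursive feature elimination (RFE).

    Stand-in ranking: |Pearson correlation| with the target replaces the
    model-based importances (RF/GAM/SVM) used in the actual pipeline.
    """
    keep = list(range(X.shape[1]))
    while len(keep) > n_select:
        scores = [abs(np.corrcoef(X[:, j], y)[0, 1]) for j in keep]
        keep.pop(int(np.argmin(scores)))  # eliminate the weakest feature
    return [names[j] for j in keep]

# Synthetic data: LAI driven mainly by one index, weakly by another.
rng = np.random.default_rng(0)
n = 200
evi = rng.uniform(0.1, 0.6, n)
lai = 6.2 * evi - 0.1 + rng.normal(0, 0.05, n)   # strong EVI signal
ndre = 0.3 * lai + rng.normal(0, 0.5, n)          # weak, noisy relation
noise = rng.normal(size=n)                        # unrelated feature

X = np.column_stack([evi, ndre, noise])
print(rfe_by_correlation(X, lai, ["EVI", "NDRE", "noise"], n_select=1))
```

The elimination order (noise first, then NDRE) mirrors how RFE arrives at the compact variable subsets listed per date and method in Table 4.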
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.

Wang, Y.; Zhang, Z.; Xiao, C.; Zhang, T.; Yu, K.; Zhang, C.; Liao, Q.; Li, F.; Wan, S.; Chen, G.; et al. Characterizing Cotton Defoliation Progress via UAV-Based Multispectral-Derived Leaf Area Index and Analysis of Influencing Factors. Remote Sens. 2026, 18, 609. https://doi.org/10.3390/rs18040609
