Article

UAV as a Bridge: Mapping Key Rice Growth Stage with Sentinel-2 Imagery and Novel Vegetation Indices

Jianping Zhang, Rundong Zhang, Qi Meng, Yanying Chen, Jie Deng and Bingtai Chen

1 Chongqing Institute of Meteorological Sciences, Chongqing 401147, China
2 China Meteorological Administration Economic Transformation of Climate Resources Key Laboratory, Chongqing 401147, China
3 College of Land Science and Technology, China Agricultural University, Beijing 100193, China
4 School of Mechanical Engineering, Tianjin University of Commerce, Tianjin 300134, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Remote Sens. 2025, 17(13), 2180; https://doi.org/10.3390/rs17132180
Submission received: 27 April 2025 / Revised: 3 June 2025 / Accepted: 9 June 2025 / Published: 25 June 2025
(This article belongs to the Special Issue Recent Progress in UAV-AI Remote Sensing II)

Abstract

Rice is one of the three primary staple crops worldwide. The accurate monitoring of its key growth stages is crucial for agricultural management, disaster early warning, and ensuring food security. The effective collection of ground reference data is a critical step for monitoring rice growth stages using satellite imagery, traditionally achieved through labor-intensive field surveys. Here, we propose utilizing UAVs as an alternative means to collect spatially continuous ground reference data across larger areas, thereby enhancing the efficiency and scalability of training and validation processes for rice growth stage mapping products. The UAV data collection covered the Nanchuan, Yongchuan, Tongnan, and Kaizhou districts of Chongqing City, encompassing a total area of 377.5 hectares. After visual interpretation, centimeter-level high-resolution labels of the key rice growth stages were constructed. These labels were then mapped to Sentinel-2 imagery through spatiotemporal matching and scale conversion, resulting in a Sentinel-2 reference dataset covering growth stages such as jointing and heading. Furthermore, we employed 30 vegetation index calculation methods to explore 48,600 spectral band combinations derived from 10 Sentinel-2 spectral bands, thereby constructing a series of novel vegetation indices. Based on the maximum relevance minimum redundancy (mRMR) algorithm, we identified an optimal subset of features that were both highly correlated with rice growth stages and mutually complementary. The results demonstrate that multi-feature modeling significantly enhanced classification performance. The optimal model, incorporating 300 features, achieved an F1 score of 0.864, representing a 2.5% improvement over models based on original spectral bands and a 38.8% improvement over models using a single feature. Notably, a model utilizing only 12 features maintained a high classification accuracy (F1 = 0.855) while substantially reducing computational costs. Compared with existing methods, this study constructed a large-scale ground-truth reference dataset for satellite imagery based on UAV observations, providing an effective technical framework for the large-scale mapping of rice growth stages using satellite data.

1. Introduction

Rice is one of the three major staple crops globally, and identifying its key growth stages is crucial for agricultural production management [1]. The accurate monitoring of these stages provides essential data for rice growth assessment, optimizing fertilizer and water management, and enabling disaster early warning, thereby playing a pivotal role in ensuring agricultural productivity and food security [2]. For example, during the jointing stage, precise monitoring helps guide the appropriate application of nitrogen fertilizers [3]; during the heading stage, it supports heat stress warnings [4]; during both the heading and milky stages, it aids in controlling planthoppers [5]; and during the heading and maturity stages, it facilitates accurate yield predictions [6].
Rice growth stages have traditionally been monitored through manual field measurements, complemented more recently by high-throughput remote sensing technologies. Traditional monitoring methods depend heavily on manual field observations, which suffer from high costs, low efficiency, and limited spatiotemporal coverage, and are susceptible to biases introduced by observer subjectivity. Ground-based phenological records are also difficult to implement for high-frequency dynamic monitoring at the regional scale [7]. Moreover, the spatiotemporal heterogeneity of rice phenology, exacerbated by climate change, further reduces the effectiveness of traditional methods [8].
In recent years, remote sensing technology has emerged as a crucial tool for monitoring the various stages of crop growth. The spectral characteristics, canopy structure, and biomass accumulation of rice vary significantly throughout the different phases of growth. For example, during the jointing stage, the chlorophyll content in rice leaves increases rapidly, whereas at the heading stage, canopy coverage peaks. These physiological changes can be accurately detected using remote sensing spectral data [9].
Currently, growth stage identification primarily depends on image data acquired through two remote sensing platforms: unmanned aerial vehicles (UAVs) and satellites [6,10]. Satellite remote sensing, with its wide coverage, high revisit frequency, and cost-effectiveness, has been instrumental in overcoming the limitations of traditional monitoring, and researchers have widely explored its application to crop growth stage monitoring. For instance, Moeini Rad et al. [11] developed an automatic phenology-based rice detection algorithm using Sentinel-2 time-series data. By integrating near-infrared and harvest-period band reflectance features and applying NDVI time-series analyses, they achieved Kappa coefficients between 0.70 and 0.94 across three regions in Iran, demonstrating the method's adaptability for monitoring rice growth stages. Liao et al. [12] combined canopy structure models with growth curve fitting to monitor growth stages at the sub-field scale for winter wheat and maize using Sentinel-2 data. Their approach achieved a BBCH-scale prediction error of less than 3.7 days, addressing the mismatch in spatiotemporal resolution between satellite data and agronomic standards. These approaches demonstrate satellite data's potential for addressing large-scale monitoring needs.
In recent years, the rapid progress of UAVs, coupled with advancements in data processing software, has led to the widespread adoption of UAV-based remote sensing in agriculture, owing mainly to its cost-efficiency, operational flexibility, and superior spatial resolution [13]. UAVs offer centimeter-level resolution imagery, making them highly accurate for identifying fine-scale crop characteristics. Astridevi et al. [14] used ultra-high-resolution UAV imagery and the Mask R-CNN method to automatically detect rice growth stages, achieving an F1 score of 0.8 during the maturity stage and 0.87 during the vegetative growth stage. Qiu et al. [15] developed a growth stage prediction model for 327 rice varieties by combining UAV-acquired multispectral images and deep learning algorithms with the normalized difference vegetation index (NDVI) and cubic polynomial regression equations. The model achieved a prediction accuracy of 0.75–0.93 and a relative error of less than 14.66%, significantly enhancing the accuracy and generalization capability of rice growth stage prediction. Lu et al. [16], using UAV images and a deep learning model (GBiNet), constructed the PaddySeg dataset of 2600 images; by incorporating direct georeferencing (DGL) and incremental sparse sampling (ISS) technologies, they achieved automatic monitoring of rice growth stages with an average IoU of 91.50%. Zhu et al. [17] utilized UAV multispectral images and a random forest model to confirm the inversion capability of the red band for LAI and LCC at key growth stages in maize; the model remained stable under cross-site water and fertilizer transfer, highlighting the monitoring advantages of UAVs in heterogeneous farmland. Zhang et al. [18] captured data for five wheat growth stages using UAVs and selected NDRE/TVI as the optimal indicators; their random forest model achieved a plant height prediction error of less than 1.01 cm and simultaneously identified moisture stress zones, improving the full-growth-stage management scheme. However, the limited coverage of UAV imagery poses challenges for large-scale monitoring needs [19].
Satellite imagery provides global coverage; however, it faces the challenge of limited labeled data. Traditionally, reference data for remote sensing applications are collected on the ground and geolocated using GNSS devices [20]. This approach has several limitations. First, the quantity of ground reference data is limited because field surveys are time-consuming, labor-intensive, and costly; for example, in the studies conducted by Ma, Huang, and others, the number of reference data points did not exceed 400 [21,22]. Second, the quality and representativeness of ground observation data are often affected by GPS inaccuracies caused by geographic factors [23]. Moreover, field data are typically collected as discrete point observations or plot measurements, which do not align well with the continuous nature and spatial scale of remote sensing data [24]. In regions such as Chongqing, which is dominated by mountainous and hilly terrain, collecting ground reference data is especially challenging.
To overcome these limitations, this study proposes the use of UAVs as an alternative method for collecting reference data, thereby obtaining spatially continuous information on rice growth stages at exceptionally high spatial resolution. As UAV operation and data processing technologies continue to simplify, UAVs have become a cost-effective operational tool for small-scale spatial applications. This concept has been successfully demonstrated in other agricultural applications. For instance, Islam et al. [25] combined high-spatial-resolution training and validation data provided by UAVs (for generating accurate sample labels) with the rich multispectral observation capabilities of the WorldView-3 satellite, achieving high-precision wetland plant classification. Wu et al. [26] used UAV-derived pepper yield information as label data for Sentinel-2 imagery, enabling large-scale yield inversion. The integration of UAV and satellite imagery effectively addresses the challenge of scarce labeled data in satellite imagery, thus enabling high-quality modeling for broader applications. To our knowledge, a comparable approach for generating large-scale, high-precision ground-truth datasets for rice growth stage monitoring, in which UAV imagery is used to train satellite models, has not yet been widely applied or documented.
Once the challenge of scarce labeled data in satellite imagery is addressed, optimizing the use of multispectral satellite imagery for modeling becomes another key issue. Constructing vegetation indices (VIs) is an effective approach. Compared to modeling with raw spectral bands, VIs improve resilience to variations in lighting conditions and atmospheric disturbances, enhance physical interpretability and generalizability, reduce dimensionality and redundancy, and increase the expressiveness of the information. Currently, the primary methods for detecting rice growth stages in the remote sensing field are SAR backscatter inversion and optical vegetation index approaches. While SAR data (e.g., from Sentinel-1) have been explored for rice monitoring owing to their ability to penetrate clouds and their sensitivity to surface moisture [27,28], that same moisture sensitivity complicates inversion, and SAR imaging quality (spatial texture and structural clarity) is typically inferior to that of optical imagery, limiting its effectiveness in capturing fine-scale crop phenology dynamics. Conventional optical vegetation indices, including the Normalized Difference Vegetation Index (NDVI), the Enhanced Vegetation Index (EVI), and spectral derivative indices, have been extensively used to monitor rice growth stages [7,29]. However, existing studies still underutilize the available spectral information: traditional indices are often based on fixed band combinations (e.g., NDVI uses only the red and near-infrared bands), failing to fully exploit the red-edge and shortwave infrared bands of Sentinel-2 [9]. Building on these challenges and opportunities, this study aims to enhance the accuracy of rice growth stage detection with medium-resolution satellite imagery by generating high-precision labels from UAV imagery, developing data-driven feature selection methods, and integrating machine learning algorithms.
The primary objectives of this study are as follows: (1) High-precision growth stage labels obtained from high-resolution UAV imagery were mapped to Sentinel-2 imagery through spatiotemporal matching and scale conversion, thereby constructing a benchmark dataset covering 377.5 hectares of rice growth stages. (2) A total of 48,600 spectral band combinations were explored based on 30 vegetation index calculation methods to construct a series of novel vegetation indices. The maximum relevance minimum redundancy (mRMR) algorithm was used to select a feature subset that is both highly correlated with rice growth stages and mutually complementary, with a focus on vegetation indices closely related to key rice growth stages. (3) A comparative analysis was conducted, and high-precision machine learning models were established to achieve the accurate identification of key rice growth stages, providing an expandable technical framework for rice growth stage monitoring.

2. Materials and Methods

2.1. Description of Study Area

This study was conducted in the rice-growing regions of Chongqing City, located in the southwestern part of China. It covered areas such as the Nanchuan, Yongchuan, Tongnan, and Kaizhou districts (Figure 1), which encompass the main rice-producing areas of the city. Chongqing is situated in the upper reaches of the Yangtze River, with geographic coordinates ranging from 105°11′E to 110°11′E and from 28°10′N to 32°13′N. The area is located within the subtropical monsoon climate zone, characterized by an average yearly temperature between 16 °C and 18 °C, and annual precipitation varying from 1000 mm to 1400 mm. This warm and humid climate fosters optimal conditions for the cultivation of rice. The study area is characterized by complex terrain, mainly composed of hills and mountains. Rice cultivation is mainly concentrated in river valley plains and gentle slopes, with a dominant single-cropping rice system. In some areas, double-cropping rice or rice–oilseed rape rotation systems are practiced.
The Nanchuan, Yongchuan, Tongnan, and Kaizhou districts are the main rice-producing areas in Chongqing, with extensive planting areas and high yields, exhibiting typical characteristics of the southwestern rice-growing region. Moreover, the rice growth stages in these areas are significantly affected by climate and topography, displaying distinct spatiotemporal heterogeneity. This provides a rich data foundation and a representative research scenario for this study. The selection of these regions reflects the rice cultivation characteristics of the southwestern rice-growing area and offers a scientific foundation for accurately determining the critical stages of rice growth.

2.2. UAV Image Acquisition

This research utilized the M300 RTK UAV, produced by DJI Innovations (Shenzhen, China), and the DJI Zenmuse H20 camera to acquire UAV images. The UAV has a total weight of 6.3 kg and a wheelbase measuring 895 mm, and it is capable of reaching a maximum flight speed of 23 m/s. The camera is equipped with a 23× zoom RGB camera featuring a 20-megapixel 1/1.7 CMOS sensor, as well as a wide-angle RGB camera with a 12-megapixel 1/2.3 CMOS sensor, allowing for the concurrent capture of RGB imagery.
A total of 10 images were captured during data collection on 13–14 June, 18–20 July, and 12–14 August 2023. According to the Agricultural Meteorological Observation Standards, the rice imagery was systematically classified into four key growth stages: jointing, heading, milky, and maturity (Table 1).
Image stitching was carried out with the help of Agisoft Metashape Professional 1.6.4 software. The process encompassed several essential stages, such as loading the image folder, performing reflectance calibration based on a calibration board, aligning the photos, optimizing camera settings, creating dense point clouds, developing digital elevation models (DEMs), generating orthophotos, normalizing reflectance, and exporting the resulting images.
This research utilized supervised machine learning classification methods, with the data provided as masks to assist in classifying the regions of interest. Annotations were made using ArcGIS Pro 2.5.2, with manual interpretation conducted by experienced interpreters. The images were divided into five classes: ‘Other’, ‘Jointing’, ‘Heading’, ‘Milky’, and ‘Maturity’. Rice growth stage detection in this study was framed as a five-class supervised classification task. The visual interpretation of rice growth stages was cross-checked by two interpreters, and uncertain areas were excluded from the subsequent classification. The interpretation results were further verified through on-site inspections to ensure the accuracy of the visual interpretation.

2.3. Satellite Image Acquisition and Annotation

To construct a large-scale rice growth stage recognition model, this study selected Sentinel-2 satellite imagery data covering the research area. Sentinel-2 consists of 13 spectral bands, with bands 1, 9, and 10 having a resolution of 60 m, primarily used for atmospheric correction. Therefore, only the other 10 bands, with resolutions ranging from 10 to 20 m, were selected. With the use of the Super Resolution plugin developed by the European Space Agency (ESA) in SNAP software, all bands were resampled to a resolution of 10 m and exported as TIFF images.
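The resampling itself was carried out through the SNAP plugin's graphical workflow. As a rough programmatic stand-in, plain bilinear resampling with rasterio (not the learned super-resolution used by the ESA plugin) could be scripted as in the following sketch; the file names are placeholders:

```python
import rasterio
from rasterio.enums import Resampling

SCALE = 2  # upsample a 20 m band to the 10 m grid

with rasterio.open("S2_B05_20m.tif") as src:
    # Read the band directly at the enlarged shape with bilinear interpolation.
    data = src.read(
        out_shape=(src.count, src.height * SCALE, src.width * SCALE),
        resampling=Resampling.bilinear,
    )
    # Scale the affine transform so georeferencing matches the finer grid.
    transform = src.transform * src.transform.scale(
        src.width / data.shape[-1], src.height / data.shape[-2]
    )
    profile = src.profile
    profile.update(height=data.shape[-2], width=data.shape[-1],
                   transform=transform)

with rasterio.open("S2_B05_10m.tif", "w", **profile) as dst:
    dst.write(data)
```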
The UAV imagery for the study area and Google imagery were registered and geometrically corrected using ArcGIS 10.8. Since the spatial resolution of UAV imagery (typically at the centimeter level) is significantly higher than that of Sentinel-2 imagery (10 m), and the downloaded Sentinel-2 imagery slightly exceeds the coverage of the UAV images, Python 3.6 programming was used to extend the spatial range of the label image to fully match the Sentinel-2 image. A specific pixel-padding technique was applied to achieve this extension. Subsequently, the proportion of each rice growth stage class within the corresponding UAV image region for each Sentinel-2 pixel was calculated, and the growth stage with the highest proportion was assigned as the label for that Sentinel-2 pixel. The image and bridging steps are illustrated in Figure 2. This method effectively bridges high-resolution UAV imagery with medium-resolution satellite imagery, providing precise growth stage labeling data for Sentinel-2 imagery. All data are shown in Figure 3.
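The majority-vote label transfer can be summarized in a short sketch. This is an illustrative reconstruction, assuming the UAV class map is an integer raster already padded to the Sentinel-2 tile extent and that the resolution ratio along each axis is an integer (e.g., 10 m / 0.2 m = 50):

```python
import numpy as np

def bridge_labels(uav_labels: np.ndarray, ratio: int) -> np.ndarray:
    """Assign each Sentinel-2 pixel the majority class of the UAV pixels
    it covers. uav_labels holds integer class codes (0 = Other, 1-4 =
    growth stages) and its dimensions are exact multiples of ratio."""
    h, w = uav_labels.shape
    # Group UAV pixels into (ratio x ratio) blocks, one per Sentinel-2 pixel.
    blocks = (uav_labels
              .reshape(h // ratio, ratio, w // ratio, ratio)
              .transpose(0, 2, 1, 3)
              .reshape(h // ratio, w // ratio, -1))
    # Count class occurrences inside each block and take the majority.
    n_classes = int(uav_labels.max()) + 1
    counts = np.apply_along_axis(np.bincount, 2, blocks, minlength=n_classes)
    return counts.argmax(axis=2)

# e.g. sentinel_labels = bridge_labels(padded_uav_label_raster, ratio=50)
```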

2.4. Vegetation Index Construction and Optimization

To differentiate rice plants at various key growth stages from other regions in the Sentinel-2 imagery, this study constructed and selected vegetation indices. Using 30 commonly applied vegetation index calculation formulas (Table 2), a custom Python script iterated through various spectral band combinations. The classification performance of the vegetation indices derived from these combinations was evaluated by developing and assessing a Multilayer Perceptron (MLP) model, which ranked the spectral band combinations according to their performance, thereby identifying the optimal combinations. Compared to traditional linear correlation-based selection methods (e.g., Pearson’s correlation coefficient), this approach captures nonlinear features and complex interactions, facilitating the construction of more discriminative and generalizable vegetation indices.
Subsequently, the top ten band combinations with the highest classification accuracy for each vegetation index form were selected, and a feature dataset was constructed [30]. For instance, Table 2 lists 12 two-band index formulas. With 10 Sentinel-2 bands, each two-band index yielded 90 ordered band combinations; selecting the 10 best-performing combinations for each index produced 120 feature vegetation index band combinations. The same procedure was applied to the 10 three-band and 8 four-band index forms, yielding 100 and 80 combinations, respectively, for a total of 300 feature vegetation index band combinations.
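A condensed sketch of this search for the two-band case is given below; ndvi and ari stand in for the 30 formulas of Table 2, while the small epsilon guarding against division by zero and the macro-averaged F1 scoring are assumptions for illustration rather than details stated here:

```python
from itertools import permutations

from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Centre wavelengths (nm) of the 10 Sentinel-2 bands used in this study.
BANDS = [490, 560, 665, 705, 740, 783, 842, 865, 1610, 2190]

def ndvi(b1, b2):
    return (b1 - b2) / (b1 + b2 + 1e-9)

def ari(b1, b2):
    return (1.0 / (b1 + 1e-9)) - (1.0 / (b2 + 1e-9))

def rank_band_pairs(X, y, index_fn, top_k=10, cv=3):
    """Score every ordered two-band combination of one VI formula with an
    MLP and return the top_k combinations by mean cross-validated F1.
    X: (n_samples, 10) reflectance matrix with columns ordered as BANDS;
    y: (n_samples,) growth stage labels."""
    scores = []
    for i, j in permutations(range(len(BANDS)), 2):  # 90 ordered pairs
        feature = index_fn(X[:, i], X[:, j]).reshape(-1, 1)
        f1 = cross_val_score(
            MLPClassifier(max_iter=500, random_state=0),
            feature, y, cv=cv, scoring="f1_macro",
        ).mean()
        scores.append(((BANDS[i], BANDS[j]), f1))
    return sorted(scores, key=lambda s: s[1], reverse=True)[:top_k]

# e.g. best_ari_pairs = rank_band_pairs(X, y, ari)
```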

2.5. Multi-VI Feature Selection

In order to assess the effectiveness of multi-feature modeling, this research utilizes the feature selection approach based on the minimum redundancy maximum relevance (mRMR) criterion [31]. The mRMR algorithm concurrently optimizes two objective functions for feature selection: (1) maximizing the correlation between the features and the target variable, and (2) minimizing the redundancy among features, aiming to select a subset that is both highly predictive and non-redundant.
In this implementation, this study utilized the Pymrmr toolkit to apply the mRMR algorithm for secondary feature selection, based on the vegetation index band combinations selected in Section 2.4. By setting different thresholds for the number of features, data subsets containing 100, 24, 12, 6, and 3 features were constructed. The effect of feature quantity on model performance was systematically examined to determine the optimal feature dimensionality.
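A minimal sketch of this step with the Pymrmr toolkit is shown below; the equal-width discretization into 10 bins is an assumption made for illustration, since mRMR operates on discrete variables and the discretization scheme is not specified here:

```python
import pandas as pd
import pymrmr

def select_mrmr(features: pd.DataFrame, labels: pd.Series, n: int) -> list:
    """Return the names of n features chosen by mRMR."""
    # pymrmr expects the class label as the first DataFrame column.
    df = pd.concat([labels.rename("class"), features], axis=1)
    # Discretize each feature into 10 equal-width bins (assumed scheme).
    for col in features.columns:
        df[col] = pd.cut(df[col], bins=10, labels=False)
    return pymrmr.mRMR(df, "MIQ", n)

# Feature subsets of different sizes, as in this section:
# subsets = {k: select_mrmr(vi_table, stage_labels, k) for k in (3, 6, 12, 24, 100)}
```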

2.6. Model Construction and Validation

In this step, the feature dataset selected in Section 2.5 was used to construct machine learning classification models. A total of five classical classification models were developed: the K-Nearest Neighbor (KNN) model [32], Support Vector Machine (SVM) model [33], Multilayer Perceptron (MLP) model [34], Extreme Gradient Boosting (XGBoost) model [35], and Random Forest (RF) model [36]. All models were implemented using the Scikit-learn library in Python, and standardized data, processed with exponential transformations, were used as model inputs. Three-fold cross-validation was employed for model training. In evaluating the models for rice growth stage classification, this study employed several performance metrics, including recall, precision, and F1 score:
Recall = TP/(TP + FN)

Precision = TP/(TP + FP)

F1 = 2 × Precision × Recall/(Precision + Recall)
True positives (TP) refer to cases where both the actual and detected class were positive, indicating the correct identification of positive values. False positives (FP) represent instances where positive values were incorrectly detected, while false negatives (FN) denote situations where negative values were mistakenly identified as positive. These metrics were calculated from pixel-level detection results. Because this study emphasized the model's overall ability to recognize rice growth stages, the F1 score averaged over all categories [37] was selected as the primary evaluation metric.
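The evaluation protocol can be sketched with scikit-learn and XGBoost as follows; hyperparameters are library defaults rather than the tuned settings of this study, and labels are assumed to be integer-coded:

```python
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_validate
from sklearn.neighbors import KNeighborsClassifier
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from xgboost import XGBClassifier

MODELS = {
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(),
    "MLP": MLPClassifier(max_iter=500, random_state=0),
    "XGBoost": XGBClassifier(eval_metric="mlogloss"),
    "RF": RandomForestClassifier(random_state=0),
}

def evaluate(X, y):
    """Three-fold cross-validation reporting macro precision, recall, F1."""
    scoring = ("precision_macro", "recall_macro", "f1_macro")
    for name, clf in MODELS.items():
        pipe = make_pipeline(StandardScaler(), clf)  # standardized inputs
        cv = cross_validate(pipe, X, y, cv=3, scoring=scoring)
        print(f"{name}: F1 = {cv['test_f1_macro'].mean():.3f} "
              f"± {cv['test_f1_macro'].std():.3f}")
```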
To more clearly illustrate the technical approach of this study, a detailed workflow diagram was constructed (Figure 4). This diagram outlines all the steps from data collection to model construction, including UAV image acquisition, satellite image retrieval, UAV-satellite label bridging, vegetation index construction and selection, multi-feature selection, sensitive band analysis, and final machine learning modeling. Through this flowchart, readers can quickly understand the overall framework of this study and the logical relationships between each step.

3. Results

3.1. Results of Full-Spectrum Modeling

Spectral data were extracted from ten images at the pixel level, yielding a total of 27,906 spectral data points. Due to the uneven distribution of data across categories, a subset of data was randomly selected from each category to construct a benchmark dataset comprising 9220 spectral data points. This dataset contained 2962 data points from the ‘Other’ category, 1468 from the ‘Jointing’ stage, 1082 from the ‘Heading’ stage, 2601 from the ‘Milky’ stage, and 1107 from the ‘Maturity’ stage. A Python script was used to calculate the average spectral reflectance and standard deviation for each category. Additionally, Spearman’s correlation coefficients between the spectral data and the different stages of rice growth, as well as for the entire dataset, were computed. The results are presented in Figure 5.
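As a simple sketch of this correlation analysis, the per-band Spearman coefficients can be computed with SciPy, assuming the growth stages are ordinally coded (the exact coding is an assumption):

```python
import numpy as np
from scipy.stats import spearmanr

def band_stage_correlations(X: np.ndarray, stages: np.ndarray) -> np.ndarray:
    """Spearman correlation between each band's reflectance and the stage
    code (e.g., Jointing=1, Heading=2, Milky=3, Maturity=4)."""
    rhos = []
    for b in range(X.shape[1]):
        rho, _ = spearmanr(X[:, b], stages)
        rhos.append(rho)
    return np.array(rhos)
```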
Figure 5a illustrates the variation in spectral reflectance across four critical growth stages of rice (Jointing, Heading, Milky, and Maturity) and other categories within the 490–2190 nm spectral range. Overall, the spectral curves for different growth stages exhibit minimal variation in reflectance at the 490 nm, 740 nm, 1610 nm, and 2190 nm wavelengths, with Spearman correlation coefficients for growth stages consistently below 0.1, suggesting that these wavelengths contribute little to distinguishing between growth stages. Additionally, the standard deviation of spectral reflectance for the ‘Other’ category is relatively high, introducing some interference into the overall spectral characteristics.
At wavelengths of 560 nm, 665 nm, and 705 nm, the spectral reflectance gradually increases as the rice plant progresses through its growth stages, with the most significant increase observed at 705 nm, as shown in Figure 5c. This indicates that these wavelengths are more sensitive to the spectral response associated with rice growth stages. In contrast, at wavelengths of 783 nm, 842 nm, and 865 nm, the spectral reflectance shows a declining trend, with the decrease at 842 nm being particularly notable. As shown in Figure 5b, the largest difference in Spearman’s correlation coefficients between the full category and the growth stage experiment reached 0.448 (at 842 nm), with other wavelengths exhibiting varying degrees of change. This demonstrates that the presence of additional categories significantly weakened the response of growth stage-specific wavelengths.
Based on the aforementioned spectral characteristics, this study employed five machine learning algorithms, including Random Forest (RF) and Support Vector Machine (SVM), to model the complete spectrum, as presented in Table 3. The classification performance of different models across categories, as well as their overall performance, exhibited certain variations, offering valuable guidance for the further optimization of model structure and feature selection.

3.2. Results of Vegetation Index Selection

This study systematically evaluated the classification performance of multi-band combinations in vegetation indices (see Section 2.4). The optimal band combinations were identified based on the average F1 scores obtained from three-fold cross-validation. As shown in Figure 6, the best-performing model using the original single bands produces an F1 score of 0.280. In comparison, the optimal two-band vegetation index, ARI (1610, 665), achieves an F1 score of 0.418. The best-performing three-band index, EVI (865, 842, 2190), further improves performance with an F1 score of 0.471, while the top four-band index, GARI (842, 2190, 1610, 842), reaches the highest score of 0.476. These results indicate that band combinations derived from vegetation indices significantly enhance classification accuracy compared to models based on individual spectral bands. Additionally, the classification performance shows a consistent upward trend as the number of bands included in the indices increases.
Figure 7 illustrates the comparative F1-score performance of four two-band vegetation index combinations. The optimal F1 values are consistently observed at the spectral intersection of the short-wave infrared (SWIR) band (Band 1: 1610–2190 nm) and the red-edge band (Band 2: 665–705 nm), indicating a robust synergistic interaction. Notably, this spectral pairing demonstrates high generalizability across multiple vegetation indicators.

3.3. Results of Multi-Feature Index Selection

This study systematically assessed the potential of different vegetation index combinations to improve the classification accuracy of rice phenological stages. A total of 300 candidate vegetation indices derived from Sentinel-2 imagery were considered, covering the top ten band combinations for each index type. The mRMR algorithm was used to perform recursive feature selection, with an optimal feature subset iteratively identified through a forward search strategy.
The experimental results (Table 4, Figure 8) show a progressive improvement in the model’s F1 score as the number of selected features increases. When all 300 vegetation indices are employed, the model achieves an F1 score of 0.864, representing a 2.5% improvement over the best-performing model based on raw spectral bands and a 38.8% increase compared to the optimal single-feature model. These findings highlight the significant contribution of multi-index combinations in improving classification performance.
Although increasing the number of features can enhance model performance, it may also lead to higher computational costs and an increased risk of overfitting. The results further show that model performance tends to stabilize when the number of features reaches 12, with an F1 score of 0.855, while maintaining acceptable computational efficiency. Therefore, for practical applications, constructing models with a carefully selected subset of 12 optimized vegetation indices offers a balanced solution, achieving high classification accuracy with reduced computational burden.
As shown in Table 5, the F1 scores for all four rice growth stages exceed 0.85, with the heading stage achieving the highest score of 0.872. Figure 9 presents the confusion matrices for five classification models constructed using 12 selected features, while Figure 10 demonstrates the classification outcomes of the Random Forest model based on 12 vegetation indices applied to Sentinel-2 imagery. Although some misclassifications occur, the overall classification performance remains satisfactory. These results indicate that the model demonstrates strong discriminative ability across all phenological stages, exhibiting high robustness and reliability.

4. Discussion

4.1. Integration of UAV Imagery and Satellite Remote Sensing

Acquiring reliable labels for medium-resolution satellite imagery poses significant challenges. In contrast, unmanned aerial vehicles (UAVs) provide an efficient means of collecting large-scale, continuous ground reference data with minimal labor, offering a viable alternative to manual ground sampling. Since Sentinel-2 imagery is precisely aligned with Google Maps, this study addressed the challenge of multi-source remote sensing data co-registration by accurately co-registering UAV imagery with sub-meter-resolution Google imagery. To address pixel expansion during the download of Sentinel-2 imagery, a geospatial boundary expansion strategy was employed: a buffer of pixels was added around the UAV labels to ensure complete alignment with the satellite imagery and thus accurate label mapping. On this basis, this study constructed a dataset of over 9200 rice growth stage records for Sentinel-2 satellite imagery. This method provides a useful technical reference for other remote sensing applications, particularly the fusion of high-resolution and medium-resolution imagery, and shows potential for wider adoption.

4.2. Sensitive Bands for Rice Growth Period Classification and Their Physiological Significance

This study calculated the highest F1 mean values for all possible combinations of vegetation indices derived from two, three, and four bands, with the results presented in Table 6. It is noteworthy that the 705 nm and 2190 nm bands consistently appear in these combinations, indicating their significant role in rice growth period classification.
The 705 nm band lies within the red-edge region (680–750 nm) and is highly sensitive to changes in chlorophyll content. During the rice tillering to booting stage, the accumulation of chlorophyll caused a movement of the red-edge position toward longer wavelengths (red shift), leading to a significant increase in the reflectance [38]. The spectral curve in Figure 5a clearly reflects this trend, while the analysis in Figure 5c indicates that Spearman’s correlation coefficient for the 705 nm band and the rice growth period reached 0.33, further validating the characteristic increase in reflectance as the growth period progressed.
The 2190 nm band lies within the shortwave infrared (SWIR) region and is highly sensitive to water absorption [39]. Although Spearman’s correlation coefficient for this band with the rice growth period was relatively low, its absolute value reached 0.31 in the full-category Spearman correlation analysis, which was the highest among all bands. This indicates that the 2190 nm band had a significant effect in distinguishing rice from other categories.
Overall, the combination of the 705 nm and 2190 nm bands not only captured key spectral features of the rice growth period but also effectively distinguished rice from other land cover categories, providing an important spectral basis for the accurate identification of the rice growth period.

4.3. The Necessity of Multi-Feature Modeling in Complex Spectral Pixel Classification

The spatial resolution of Sentinel-2 imagery is 10 m, which can lead to the presence of multiple land cover types within a single pixel, thereby creating the issue of mixed pixels. This makes it challenging for classification models to accurately identify and differentiate between various land cover types, significantly affecting the accuracy of the classification results. The analysis of Figure 5 shows that the highest absolute Spearman correlation coefficient for a single band is only 0.37, while the difference in Spearman correlation coefficients between the full category and growth period experiments reaches up to 0.448 (842 nm).
This interference primarily stemmed from the overlap and confusion between the spectral features of other categories and those of the rice growth period. For instance, the spectral reflectance of certain non-rice categories may have resembled that of a specific rice growth period, leading to misclassification by the model. Therefore, relying solely on the spectral information of a single band made it difficult to effectively differentiate between the various rice growth stages, thereby impacting the accuracy of the classification.
In complex spectral pixel classification, the richness of spectral information is crucial. The vegetation index calculation framework, by integrating information from multiple bands, effectively enhances the classification capability of the model. This study found that as the number of bands increased, the F1 score of individual vegetation index band combinations gradually improved, indicating that additional bands significantly enrich the spectral information. Additionally, this study employed the mRMR algorithm for feature selection, and the results show that an excessive number of features had a limited effect on improving model performance. Specifically, the optimal model with 12 features achieved an F1 score of 0.855, a difference of less than 1% from the optimal model with 300 features (F1 = 0.864). Compared with the optimized single-feature model (F1 = 0.476), however, the model with 12 features showed a performance improvement of 37.9%. This indicates that the 12-feature model, while maintaining high classification accuracy, significantly reduces computational load, offering greater practicality and efficiency.
In the future, SAR satellite imagery, with its ability to penetrate cloud cover, could be incorporated into the analysis framework to capture changes in rice canopy structure. Integrating SAR data would substantially enlarge the available feature pool and could further improve classification performance.

5. Conclusions

This study leverages ultra-high-resolution imagery acquired by unmanned aerial vehicles (UAVs) to generate accurate, large-scale, and temporally continuous phenological labels for satellite imagery during key growth stages of rice. By integrating UAV data with Sentinel-2 satellite imagery, we systematically evaluated the performance of 30 vegetation indices (VIs) and their optimal band combinations for classifying rice growth stages. Among these, the most effective dual-band index was ARI (1610, 665), with an F1 score of 0.418; the optimal tri-band index was EVI (865, 842, 2190), with an F1 score of 0.471; and the best-performing four-band index was GARI (842, 2190, 1610, 842), which yielded an F1 score of 0.476. The results indicate that the classification performance of vegetation indices (VIs) shows a notable improvement as the number of spectral bands increases. Notably, the combination of the 705 nm and 2190 nm bands effectively captures critical spectral features associated with rice phenology and distinguishes rice from other land cover types, providing essential spectral evidence for accurate phenological identification.
In addition, the mRMR algorithm was used to select feature subsets for combined modeling, and the corresponding performance was comprehensively evaluated. Compared to single-feature models, multi-feature approaches exhibited a significant improvement in classification performance. For instance, a model incorporating 12 selected vegetation index features achieved an F1 score of 0.855, significantly surpassing models relying on individual features. The results indicate that using either six or twelve selected features yields strong and consistent classification performance.
Overall, this study underscores the feasibility of synergistically combining UAV and satellite imagery for the phenological monitoring of rice using machine learning and vegetation indices. The proposed approach offers a promising foundation for large-scale, efficient agricultural monitoring and provides critical technical support for future integrated UAV–satellite crop phenology observation frameworks.

Author Contributions

Conceptualization, J.Z. and R.Z.; methodology, J.Z. and R.Z.; software, R.Z. and Q.M.; validation, R.Z.; formal analysis, R.Z. and Q.M.; investigation, J.D.; resources, J.Z. and J.D.; data curation, Y.C. and J.Z.; writing—original draft preparation, J.Z. and R.Z.; writing—review and editing, J.D., Y.C. and B.C.; visualization, R.Z. and B.C.; supervision, J.D. and B.C.; project administration, J.Z.; funding acquisition, J.Z. and Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (grant number 42175193) and the Open Fund Project of the CMA Key Open Laboratory of Transforming Climate Resources to Economy (grant number 2023006). We are grateful for the financial support provided by the National Natural Science Foundation of China and the CMA Key Open Laboratory of Transforming Climate Resources to Economy.

Data Availability Statement

Data are available upon request to the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Arthi, B.; Maragatham, N. Effect of Elevated Temperature on Rice Phenology and Yield. Indian J. Sci. Technol. 2013, 6, 5095–5097. [Google Scholar] [CrossRef]
  2. Bouman, B.A.M.; Humphreys, E.; Tuong, T.P.; Barker, R. Rice and Water. In Advances in Agronomy; Sparks, D.L., Ed.; Academic Press: Cambridge, MA, USA, 2007; Volume 92, pp. 187–237. [Google Scholar]
  3. Hou, W.; Shen, J.; Xu, W.; Khan, M.R.; Wang, Y.; Zhou, X.; Gao, Q.; Murtaza, B.; Zhang, Z. Recommended Nitrogen Rates and the Verification of Effects Based on Leaf SPAD Readings of Rice. PeerJ 2021, 9, e12107. [Google Scholar] [CrossRef]
  4. Xu, M.; Xu, J.; Xu, M.; Xu, Y. Spatiotemporal variation characteristics and forecast model construction of high temperature heat damage intensity in rice. Trans. Chin. Soc. Agric. Eng. Trans. CSAE 2024, 40, 97–106. [Google Scholar] [CrossRef]
  5. Sharma, K.R.; Raju, S.V.S.; Singh, K.; Babu, S.R. Effect of Crop Growth Stages on the Field Population of Rice Hoppers. Indian J. Entomol. 2023, 85, 701–703. [Google Scholar] [CrossRef]
  6. Duan, B.; Fang, S.; Zhu, R.; Wu, X.; Wang, S.; Gong, Y.; Peng, Y. Remote Estimation of Rice Yield With Unmanned Aerial Vehicle (UAV) Data and Spectral Mixture Analysis. Front. Plant Sci. 2019, 10, 204. [Google Scholar] [CrossRef]
  7. Li, S.; Xiao, J.; Ni, P.; Zhang, J.; Wang, H.; Wang, J. Monitoring Paddy Rice Phenology Using Time Series MODIS Data over Jiangxi Province, China. Int. J. Agric. Biol. Eng. 2014, 7, 28–36. [Google Scholar] [CrossRef]
  8. Singh, R.P.; Oza, S.R.; Pandya, M.R. Observing Long-Term Changes in Rice Phenology Using NOAA–AVHRR and DMSP–SSM/I Satellite Sensor Measurements in Punjab, India. Curr. Sci. 2006, 91, 1217–1221. [Google Scholar]
  9. Ma, Y.; Jiang, Q.; Wu, X.; Zhu, R.; Gong, Y.; Peng, Y.; Duan, B.; Fang, S. Monitoring Hybrid Rice Phenology at Initial Heading Stage Based on Low-Altitude Remote Sensing Data. Remote Sens. 2021, 13, 86. [Google Scholar] [CrossRef]
  10. Xie, Z.; Zhang, C.; Feng, S.; Zhang, F.; Cai, H.; Tang, M.; Kong, J. Reviews of methods for vegetation phenology monitoring from remote sensing data. Remote Sens. Technol. Appl. 2023, 38, 1–14. [Google Scholar]
  11. Moeini Rad, A.; Ashourloo, D.; Salehi Shahrabi, H.; Nematollahi, H. Developing an Automatic Phenology-Based Algorithm for Rice Detection Using Sentinel-2 Time-Series Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2019, 12, 1471–1481. [Google Scholar] [CrossRef]
  12. Liao, C.; Wang, J.; Shan, B.; Shang, J.; Dong, T.; He, Y. Near Real-Time Detection and Forecasting of Within-Field Phenology of Winter Wheat and Corn Using Sentinel-2 Time-Series Data. ISPRS J. Photogramm. Remote Sens. 2023, 196, 105–119. [Google Scholar] [CrossRef]
  13. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-Based Plant Height from Crop Surface Models, Visible, and near Infrared Vegetation Indices for Biomass Monitoring in Barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  14. Astridevi; Zainuddin, Z.; Paundu, A.W. Rice Growth Phase Detection in Rice Field Plots Using Drone Imagery. In Proceedings of the 2024 19th International Joint Symposium on Artificial Intelligence and Natural Language Processing (iSAI-NLP), Chonburi, Thailand, 14 November 2024; pp. 1–6. [Google Scholar]
  15. Qiu, Z.; Liu, H.; Wang, L.; Shao, S.; Chen, C.; Liu, Z.; Liang, S.; Wang, C.; Cao, B. Accurate Prediction of 327 Rice Variety Growth Period Based on Unmanned Aerial Vehicle Multispectral Remote Sensing. Drones 2024, 8, 665. [Google Scholar] [CrossRef]
  16. Lu, X.; Zhou, J.; Yang, R.; Yan, Z.; Lin, Y.; Jiao, J.; Liu, F. Automated Rice Phenology Stage Mapping Using UAV Images and Deep Learning. Drones 2023, 7, 83. [Google Scholar] [CrossRef]
  17. Zhu, W.; Rezaei, E.E.; Nouri, H.; Sun, Z.; Li, J.; Yu, D.; Siebert, S. UAV-Based Indicators of Crop Growth Are Robust for Distinct Water and Nutrient Management but Vary between Crop Development Phases. Field Crops Res. 2022, 284, 108582. [Google Scholar] [CrossRef]
  18. Zhang, D.; Qi, H.; Guo, X.; Sun, H.; Min, J.; Li, S.; Hou, L.; Lv, L. Integration of UAV Multispectral Remote Sensing and Random Forest for Full-Growth Stage Monitoring of Wheat Dynamics. Agriculture 2025, 15, 353. [Google Scholar] [CrossRef]
  19. Du, M.; Noguchi, N. Multi-Temporal Monitoring of Wheat Growth through Correlation Analysis of Satellite Images, Unmanned Aerial Vehicle Images with Ground Variable. IFAC-PapersOnLine 2016, 49, 5–9. [Google Scholar] [CrossRef]
  20. Kiang, R.K. Textural-Contextual Labeling and Metadata Generation for Remote Sensing Applications. In Applications and Science of Computational Intelligence II; SPIE: Bellingham, WA, USA, 1999; Volume 3722, pp. 243–248. [Google Scholar]
  21. Ma, H.; Zhang, J.; Huang, W.; Ruan, C.; Chen, D.; Zhang, H.; Zhou, X.; Gui, Z. Monitoring Yellow Rust Progression during Spring Critical Wheat Growth Periods Using Multi-Temporal Sentinel-2 Imagery. Pest. Manag. Sci. 2024, 80, 6082–6095. [Google Scholar] [CrossRef]
  22. Huang, L.; Jiang, J.; Huang, W.; Ye, H.; Zhao, J.; Ma, H.; Ruan, C. Wheat yellow rust monitoring method based on Sentinel-2 image and BPNN model. Trans. Chin. Soc. Agric. Eng. 2019, 35, 178–185. [Google Scholar] [CrossRef]
  23. Nowakowski, M.; Dudek, E.; Rosiński, A. The Influence of Varying Atmospheric and Space Weather Conditions on the Accuracy of Position Determination. Sensors 2023, 23, 2814. [Google Scholar] [CrossRef]
  24. Xu, B.; Li, J.; Liu, Q.; Xin, X.; Zeng, Y.; Yin, G. Review of methods for evaluating representativeness of ground station observations. J. Remote Sens. 2015, 19, 703–718. [Google Scholar] [CrossRef]
  25. Islam, M.K.; Simic Milas, A.; Abeysinghe, T.; Tian, Q. Integrating UAV-Derived Information and WorldView-3 Imagery for Mapping Wetland Plants in the Old Woman Creek Estuary, USA. Remote Sens. 2023, 15, 1090. [Google Scholar] [CrossRef]
  26. Wu, Y.; Wang, Y.; Deng, J.; Li, Y.; Zhang, R. Bridging Field Investigation and Sentinel 2 Satellite Image with UAV Remote Sensing for Yield Inversion of Chinese Pepper. In Advances in Guidance, Navigation and Control; Yan, L., Duan, H., Deng, Y., Eds.; Springer Nature: Singapore, 2025; pp. 543–556. [Google Scholar]
  27. Wang, M.; Wang, J.; Chen, L.; Du, Z. Mapping Paddy Rice and Rice Phenology with Sentinel-1 SAR Time Series Using a Unified Dynamic Programming Framework. Open Geosci. 2022, 14, 414–428. [Google Scholar] [CrossRef]
  28. Supriatna; Rokhmatuloh; Wibowo, A.; Shidiq, I.P.A.; Pratama, G.P.; Gandharum, L. Spatio-temporal analysis of rice field phenology using Sentinel-1 image in Karawang Regency West Java, Indonesia. Int. J. GEOMATE 2019, 17, 101–106. [Google Scholar] [CrossRef]
  29. Badrul Hisham, N.H.; Hashim, N.; Saraf, N.M.; Talib, N. Monitoring of Rice Growth Phases Using Multi-Temporal Sentinel-2 Satellite Image. IOP Conf. Ser. Earth Environ. Sci. 2022, 1051, 012021. [Google Scholar] [CrossRef]
  30. Deng, J.; Wang, R.; Yang, L.; Lv, X.; Yang, Z.; Zhang, K.; Zhou, C.; Pengju, L.; Wang, Z.; Abdullah, A.; et al. Quantitative Estimation of Wheat Stripe Rust Disease Index Using Unmanned Aerial Vehicle Hyperspectral Imagery and Innovative Vegetation Indices. IEEE Trans. Geosci. Remote Sens. 2023, 61, 4406111. [Google Scholar] [CrossRef]
  31. Zhao, Z.; Anand, R.; Wang, M. Maximum Relevance and Minimum Redundancy Feature Selection Methods for a Marketing Machine Learning Platform. In Proceedings of the 2019 IEEE International Conference on Data Science and Advanced Analytics (DSAA), Washington, DC, USA, 5–8 October 2019; pp. 442–452. [Google Scholar]
  32. Cunningham, P.; Delany, S.J. K-Nearest Neighbour Classifiers: 2nd Edition (with Python Examples). ACM Comput. Surv. 2022, 54, 128. [Google Scholar] [CrossRef]
  33. Hearst, M.A.; Dumais, S.T.; Osuna, E.; Platt, J.; Scholkopf, B. Support Vector Machines. IEEE Intell. Syst. Their Appl. 1998, 13, 18–28. [Google Scholar] [CrossRef]
  34. Tolstikhin, I.; Houlsby, N.; Kolesnikov, A.; Beyer, L.; Zhai, X.; Unterthiner, T.; Yung, J.; Steiner, A.; Keysers, D.; Uszkoreit, J.; et al. MLP-Mixer: An All-MLP Architecture for Vision. arXiv 2021, arXiv:2105.01601. [Google Scholar]
  35. Chen, T.; Guestrin, C. XGBoost: A Scalable Tree Boosting System. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, San Francisco, CA, USA, 13 August 2016; pp. 785–794. [Google Scholar]
  36. Breiman, L. Random Forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  37. Powers, D.M.W. Evaluation: From Precision, Recall and F-Measure to ROC, Informedness, Markedness and Correlation. arXiv 2020, arXiv:2010.16061. [Google Scholar]
  38. Liu, W.; Xiang, Y.; Zheng, L.; Tong, Q.; Wu, C. Relationships between Rice LAI, CH.D and Hyperspectral Data. J. Remote Sens. 2000, 4, 279–283. [Google Scholar]
  39. Zhang, J.; Xu, Y.; Yao, F.; Wang, P.; Guo, W.; Li, L.; Yang, L. Advances in estimation methods of vegetation water content based on optical remote sensing techniques. Sci. China Technol. Sci. 2010, 53, 1159–1167. [Google Scholar] [CrossRef]
Figure 1. (a) Map of China (review number: GS(2023)2766); (b) map of Chongqing Municipality with the study area marked; (c) Fushou Research Area, Nanchuan District. Map projection: WGS84 UTM Zone 49N.
Figure 2. (a) Raw dataset; (b) UAV-to-Google georeferencing; (c) label-bridging workflow.
Figure 3. UAV imagery, UAV labels, Google imagery, Sentinel-2 imagery, and Sentinel-2 labels.
Figure 4. Technique flowchart.
Figure 5. (a) shows the reflectances of different categories within the 490–2190 nm spectral band. The solid lines represent the means, and the corresponding dashed-line ranges are the standard deviations. (b) illustrates Spearman’s correlation coefficients between the spectral reflectance at each wavelength and the full category. (c) shows Spearman’s correlation coefficients between the spectral reflectance at each wavelength and the rice growth stage categories.
Figure 6. Top ten band combinations for vegetation indices based on MLP modeling across different spectral regions, along with the F1 score of the model constructed using original spectral bands.
Figure 7. Results of triple cross-validation for MLP regression models of different band combinations in the dual-band index.
Figure 8. F1 score for multi-feature subset selection.
Figure 9. Confusion matrices of the five classification models built with the 12 selected features.
Figure 10. Example of Sentinel-2 imagery, annotated labels, and the prediction results of the Random Forest model based on twelve extracted features.
Table 1. Detailed description of UAV data.

| Imagery | Location | Growth Stage | Cell Size (cm) | Resolution (Pixel) |
|---|---|---|---|---|
| DJI_202307181803_002 | Yongchuan Laishu | Heading | (0.2, 0.2) | 25,733 × 20,569 |
| DJI_202307191526_001 | Kaizhou | Heading | (0.2, 0.2) | 19,576 × 24,503 |
| DJI_202307200949_002 | Kaizhou Dade | Jointing | (0.2, 0.2) | 22,160 × 22,869 |
| DJI_202307201727_007 | Nanchuan Nongji | Jointing | (0.2, 0.2) | 24,432 × 31,413 |
| DJI_202307201851_001 | Nanchuan Fushou | Jointing | (0.2, 0.2) | 26,666 × 20,154 |
| DJI_202308131333_013 | Nanchuan Fushou | Milky | (0.2, 0.2) | 26,955 × 19,198 |
| DJI_202308131629_014 | Nanchuan Nongji | Milky | (0.2, 0.2) | 24,769 × 28,047 |
| DJI_202308140904_015 | Yongchuan Laishu | Maturity | (0.2, 0.2) | 24,719 × 22,190 |
| DJI_202308141250_016 | Tongnan Chongcan | Milky | (0.2, 0.2) | 23,152 × 18,702 |
| DJI_202308141525_017 | Tongnan Zitan | Maturity | (0.2, 0.2) | 23,152 × 18,702 |
Table 2. The name and calculation method of vegetation indices.

| Group | Index Acronym | Complete Name | Calculation Method |
|---|---|---|---|
| 2-Band | NDVI | Normalized Difference Vegetation Index | (b1 − b2)/(b1 + b2) |
| | TDVI | Transformed Difference Vegetation Index | 1.5 × ((b1 − b2)/((b1² + b2² + 0.5)^0.5)) |
| | NIRv | Near-Infrared Reflectance of Vegetation | ((b1 − b2)/(b1 + b2)) × b1 |
| | MSI | Moisture Stress Index | b2/b1 |
| | MGRVI | Modified Green Red Vegetation Index | (b1² − b2²)/(b1² + b2²) |
| | IPVI | Infrared Percentage Vegetation Index | b2/(b2 + b1) |
| | EVI2 | Two-Band Enhanced Vegetation Index | 2.5 × (b2 − b1)/(b2 + 2.4 × b1 + 6) |
| | DVI | Difference Vegetation Index | b2 − b1 |
| | CIG | Chlorophyll Index Green | (b2/b1) − 1.0 |
| | CSI | Char Soil Index | b1/b2 |
| | BAI | Burned Area Index | 1.0/((0.1 − b1)² + (0.06 − b2)²) |
| | ARI | Anthocyanin Reflectance Index | (1/b1) − (1/b2) |
| 3-Band | ARI2 | Anthocyanin Reflectance Index 2 | b3 × ((1/b1) − (1/b2)) |
| | ARVI | Atmospherically Resistant Vegetation Index | (b3 − (b2 − 2.5 × (b2 − b1)))/(b3 + (b2 − 2.5 × (b2 − b1))) |
| | SWI | Snow Water Index | (b1 × (b2 − b3))/((b1 + b2) × (b2 + b3)) |
| | EBBI | Enhanced Built-Up and Bareness Index | (b2 − b1)/(10.0 × (b3 + b2)^0.5) |
| | EVI | Enhanced Vegetation Index | 2.5 × (b3 − b2)/(b3 + 6 × b2 − 7.5 × b1 + 1) |
| | GBNDVI | Green-Blue Normalized Difference Vegetation Index | (b3 − (b2 + b1))/(b3 + (b2 + b1)) |
| | GLI | Green Leaf Index | (2.0 × b3 − b2 − b1)/(2.0 × b3 + b2 + b1) |
| | MBI | Modified Bare Soil Index | ((b2 − b3 − b1)/(b2 + b3 + b1)) + 0.5 |
| | PSRI | Plant Senescing Reflectance Index | (b2 − b1)/b3 |
| | BaI | Bareness Index | b1 + b3 − b2 |
| 4-Band | BLFEI | Built-Up Land Features Extraction Index | (((b1 + b2 + b4)/3.0) − b3)/(((b1 + b2 + b4)/3.0) + b3) |
| | BI | Bare Soil Index | ((b4 + b2) − (b3 + b1))/((b4 + b2) + (b3 + b1)) |
| | DBI | Dry Built-Up Index | ((b1 − b4)/(b1 + b4)) − ((b3 − b2)/(b3 + b2)) |
| | DBSI | Dry Bareness Index | ((b4 − b1)/(b1 + b4)) − ((b3 − b2)/(b3 + b2)) |
| | EMBI | Enhanced Modified Bare Soil Index | ((((b3 − b4 − b2)/(b3 + b4 + b2)) + 0.5) − ((b1 − b3)/(b1 + b3)) − 0.5)/((((b3 − b4 − b2)/(b3 + b4 + b2)) + 0.5) + ((b1 − b3)/(b1 + b3)) + 1.5) |
| | FCVI | Fluorescence Correction Vegetation Index | b4 − ((b1 + b2 + b3)/3.0) |
| | GARI | Green Atmospherically Resistant Vegetation Index | (b4 − (b1 − (b2 − b3)))/(b4 − (b1 + (b2 − b3))) |
| | WRI | Water Ratio Index | (b1 + b2)/(b3 + b4) |
Table 3. Model performance using the full set of original spectral bands.

| Metric | Class | KNN | SVM | MLP | XGBoost | Random Forest |
|---|---|---|---|---|---|---|
| Precision | Other | 0.800 ± 0.00569 | 0.794 ± 0.01136 | 0.809 ± 0.01455 | 0.814 ± 0.01427 | 0.797 ± 0.01594 |
| | Heading | 0.755 ± 0.02470 | 0.805 ± 0.01836 | 0.805 ± 0.01147 | 0.839 ± 0.01227 | 0.810 ± 0.00585 |
| | Jointing | 0.849 ± 0.01835 | 0.862 ± 0.01298 | 0.838 ± 0.00387 | 0.881 ± 0.01347 | 0.904 ± 0.01692 |
| | Milky | 0.832 ± 0.00676 | 0.831 ± 0.00172 | 0.834 ± 0.01034 | 0.863 ± 0.00685 | 0.814 ± 0.00441 |
| | Maturity | 0.752 ± 0.01083 | 0.719 ± 0.00772 | 0.735 ± 0.02539 | 0.803 ± 0.01100 | 0.788 ± 0.01262 |
| | Mean | 0.798 ± 0.00758 | 0.802 ± 0.00773 | 0.804 ± 0.00666 | 0.840 ± 0.00498 | 0.823 ± 0.00526 |
| Recall | Other | 0.770 ± 0.00054 | 0.818 ± 0.00834 | 0.809 ± 0.01684 | 0.819 ± 0.00743 | 0.807 ± 0.00744 |
| | Heading | 0.823 ± 0.01268 | 0.734 ± 0.01803 | 0.759 ± 0.02439 | 0.839 ± 0.01398 | 0.800 ± 0.00228 |
| | Jointing | 0.790 ± 0.01648 | 0.819 ± 0.01067 | 0.828 ± 0.01602 | 0.834 ± 0.00964 | 0.787 ± 0.01502 |
| | Milky | 0.833 ± 0.01674 | 0.837 ± 0.01602 | 0.854 ± 0.02557 | 0.870 ± 0.01308 | 0.869 ± 0.01448 |
| | Maturity | 0.829 ± 0.00798 | 0.759 ± 0.01916 | 0.743 ± 0.01596 | 0.832 ± 0.00664 | 0.780 ± 0.02028 |
| | Mean | 0.809 ± 0.00659 | 0.793 ± 0.00757 | 0.798 ± 0.00605 | 0.839 ± 0.00361 | 0.809 ± 0.00461 |
| F1 score | Other | 0.785 ± 0.00300 | 0.806 ± 0.00863 | 0.809 ± 0.00819 | 0.816 ± 0.01076 | 0.802 ± 0.01154 |
| | Heading | 0.787 ± 0.01817 | 0.768 ± 0.01816 | 0.781 ± 0.01836 | 0.839 ± 0.01312 | 0.805 ± 0.00398 |
| | Jointing | 0.819 ± 0.01416 | 0.840 ± 0.00983 | 0.833 ± 0.00794 | 0.857 ± 0.00338 | 0.841 ± 0.00397 |
| | Milky | 0.832 ± 0.01175 | 0.834 ± 0.00852 | 0.843 ± 0.00794 | 0.866 ± 0.00534 | 0.841 ± 0.00450 |
| | Maturity | 0.789 ± 0.00946 | 0.738 ± 0.00518 | 0.738 ± 0.00556 | 0.817 ± 0.00641 | 0.784 ± 0.01237 |
| | Mean | 0.802 ± 0.00707 | 0.797 ± 0.00758 | 0.801 ± 0.00643 | 0.839 ± 0.00423 | 0.815 ± 0.00430 |
Table 4. Optimal vegetation indices selected by the mRMR algorithm (each row lists the indices added when the subset grows to the given size; larger subsets include all indices of the smaller ones).

| Subset Size | Vegetation Indices Selected by the mRMR Algorithm |
|---|---|
| 3 | DBI (783 nm, 2190 nm, 842 nm, 1610 nm); EVI (783 nm, 2190 nm, 842 nm); BAIe (1610 nm, 842 nm) |
| 6 | TDVI (842 nm, 1610 nm); EVI (1610 nm, 2190 nm, 842 nm); PSRI (1610 nm, 842 nm, 783 nm) |
| 12 | MSI (665 nm, 490 nm); ARI (783 nm, 1610 nm); GBNDVI (2190 nm, 842 nm, 842 nm); BaI (842 nm, 1610 nm, 842 nm); NIRv (783 nm, 2190 nm); BaI (842 nm, 783 nm, 1610 nm) |
| 24 | FCVI (842 nm, 1610 nm, 783 nm, 1610 nm); NIRv (842 nm, 1610 nm); SWI (842 nm, 865 nm, 2190 nm); EMBI (783 nm, 2190 nm, 842 nm, 490 nm); GBNDVI (2190 nm, 490 nm, 842 nm); MGRVI (1610 nm, 783 nm); DBI (1610 nm, 842 nm, 2190 nm, 842 nm); CIG (842 nm, 842 nm); BaI (842 nm, 2190 nm, 665 nm); PSRI (842 nm, 1610 nm, 865 nm); GBNDVI (490 nm, 1610 nm, 842 nm); NIRv (665 nm, 490 nm) |
Table 5. Evaluation indicators of each category of 12 features.

| Metric | Class | KNN | SVM | MLP | XGBoost | RandomForest |
|---|---|---|---|---|---|---|
| Precision | Other | 0.801 ± 0.006 | 0.814 ± 0.005 | 0.830 ± 0.011 | 0.831 ± 0.006 | 0.841 ± 0.003 |
| | Heading | 0.729 ± 0.022 | 0.762 ± 0.022 | 0.780 ± 0.005 | 0.870 ± 0.017 | 0.873 ± 0.025 |
| | Jointing | 0.837 ± 0.021 | 0.851 ± 0.029 | 0.824 ± 0.025 | 0.888 ± 0.010 | 0.891 ± 0.018 |
| | Milky | 0.794 ± 0.010 | 0.754 ± 0.006 | 0.792 ± 0.009 | 0.851 ± 0.014 | 0.843 ± 0.008 |
| | Maturity | 0.737 ± 0.023 | 0.693 ± 0.029 | 0.746 ± 0.029 | 0.828 ± 0.013 | 0.838 ± 0.006 |
| | All | 0.780 ± 0.045 | 0.775 ± 0.059 | 0.794 ± 0.036 | 0.854 ± 0.026 | 0.857 ± 0.025 |
| Recall | Other | 0.770 ± 0.006 | 0.779 ± 0.017 | 0.779 ± 0.022 | 0.815 ± 0.015 | 0.812 ± 0.014 |
| | Heading | 0.773 ± 0.033 | 0.704 ± 0.046 | 0.774 ± 0.030 | 0.827 ± 0.031 | 0.828 ± 0.029 |
| | Jointing | 0.784 ± 0.006 | 0.779 ± 0.015 | 0.811 ± 0.018 | 0.860 ± 0.015 | 0.853 ± 0.009 |
| | Milky | 0.815 ± 0.015 | 0.818 ± 0.018 | 0.838 ± 0.019 | 0.895 ± 0.011 | 0.899 ± 0.013 |
| | Maturity | 0.787 ± 0.022 | 0.764 ± 0.009 | 0.787 ± 0.016 | 0.850 ± 0.007 | 0.876 ± 0.011 |
| | All | 0.786 ± 0.023 | 0.769 ± 0.044 | 0.798 ± 0.030 | 0.849 ± 0.033 | 0.854 ± 0.036 |
| F1 score | Other | 0.785 ± 0.006 | 0.796 ± 0.011 | 0.804 ± 0.016 | 0.823 ± 0.011 | 0.827 ± 0.006 |
| | Heading | 0.750 ± 0.027 | 0.732 ± 0.035 | 0.777 ± 0.014 | 0.848 ± 0.022 | 0.850 ± 0.022 |
| | Jointing | 0.809 ± 0.007 | 0.813 ± 0.010 | 0.817 ± 0.006 | 0.874 ± 0.003 | 0.872 ± 0.006 |
| | Milky | 0.804 ± 0.006 | 0.785 ± 0.005 | 0.814 ± 0.004 | 0.873 ± 0.004 | 0.870 ± 0.003 |
| | Maturity | 0.761 ± 0.013 | 0.726 ± 0.018 | 0.765 ± 0.010 | 0.839 ± 0.009 | 0.857 ± 0.006 |
| | All | 0.782 ± 0.027 | 0.770 ± 0.040 | 0.795 ± 0.023 | 0.851 ± 0.023 | 0.855 ± 0.019 |
Table 6. Band combinations with the highest F1 mean values for each band.

| | Band1 | Band2 | Band3 | Band4 | F1_Average | F1_Max_Index |
|---|---|---|---|---|---|---|
| 2-band | 705 nm | 2190 nm | — | — | 0.387 | 0.412 |
| 3-band | 2190 nm | 705 nm | 1610 nm | — | 0.371 | 0.417 |
| 4-band | 2190 nm | 842 nm | 705 nm | 560 nm | 0.370 | 0.456 |