Article

Evaluating Multi-Temporal Sentinel-1 and Sentinel-2 Imagery for Crop Classification: A Case Study in a Paddy Rice Growing Region of China

1
School of Geography and Information Engineering, China University of Geosciences (Wuhan), Wuhan 430074, China
2
Technology Innovation Center for Ecological Conservation and Restoration in Dongting Lake Basin, Ministry of Natural Resources, Changsha 410083, China
3
School of Geosciences and Info-Physics, Central South University, Changsha 410083, China
*
Author to whom correspondence should be addressed.
Sensors 2026, 26(2), 586; https://doi.org/10.3390/s26020586
Submission received: 12 December 2025 / Revised: 9 January 2026 / Accepted: 12 January 2026 / Published: 15 January 2026
(This article belongs to the Special Issue Application of SAR and Remote Sensing Technology in Earth Observation)

Highlights

What are the main findings?
  • The decomposition parameter $m_v$ derived from the dual-polarization model-based decomposition can effectively discriminate between different crop types.
  • Multi-temporal optical data with low cloud cover can effectively support crop classification. Incorporating dual-polarimetric SAR data further enhances the classification accuracy, particularly for rice and corn.
What is the implication of the main finding?
  • A practical classification strategy was proposed for crop type identification.
  • A 10-m-resolution thematic map was produced to classify crops in the study area.

Abstract

Information on crop planting structure serves as a key reference for crop growth monitoring and agricultural structural adjustment, and mapping the spatial distribution of crops through feature-based classification is a fundamental component of sustainable agricultural development. However, current crop classification methods often face challenges such as the discontinuity of optical data due to cloud cover and the limited discriminative capability of traditional SAR backscatter intensity for spectrally similar crops. In this case study, we assess multi-temporal Sentinel-1 and Sentinel-2 satellite images for crop classification in a paddy rice growing region in Helonghu Town, located in the central part of Xiangyin County, Yueyang City, Hunan Province, China (28.5° N–29.0° N, 112.8° E–113.2° E). We systematically investigate three key aspects: (1) the classification performance of optical time-series Sentinel-2 imagery; (2) the time-series classification performance of polarimetric decomposition features from Sentinel-1 dual-polarimetric SAR images; and (3) the classification performance of a combination of Sentinel-1 and Sentinel-2 images. The optimal classification results, with the highest overall accuracy and Kappa coefficient, are achieved by combining Sentinel-1 (SAR) and Sentinel-2 (optical) data. On this basis, we determine the most suitable approach for crop classification in Helonghu Town.

1. Introduction

Accurate and timely information on crop distribution is essential for agricultural production management, yield estimation, and sustainable land use planning [1,2]. Crop classification is essential for agricultural monitoring, providing the foundation for tracking crop growth, optimizing planting structures, and ensuring food security, while also supporting agricultural research, resource management, breeding innovation, and sustainable development [3,4]. The development of advanced crop distribution mapping algorithms is critical for establishing a robust and generalizable model applicable to large-scale agricultural monitoring via remote sensing [5,6]. Given the increasing pressures of population growth, global environmental change, and limited arable land resources, obtaining reliable and efficient crop classification has emerged as an urgent applied research challenge [7,8]. Crop growth dynamics are governed by critical factors including soil conditions, irrigation systems, and climatic variables [9]. Precise crop classification—essential for growth optimization due to species-specific environmental adaptability—forms the foundation for effective yield estimation and prediction [10,11,12].
Traditional crop classification approaches often rely exclusively on intensive ground surveys to characterize farmland attributes, posing significant economic and temporal constraints. As a non-contact information acquisition tool, remote sensing is extensively applied in agriculture and land management [10]. Remote sensing data with spatiotemporal heterogeneity substantially enhance the classification accuracy of land cover types, thereby enabling more precise mapping and analysis of farmland distribution [13]. Compared to conventional methods, remote sensing offers distinct advantages: (1) simultaneous large-scale observation with enhanced analytical timeliness; (2) reduced susceptibility to weather and environmental interference, lowering operational costs while improving reliability; and (3) establishment of comprehensive farmland production databases.
Optical remote sensing for crop classification has matured over decades. For land cover and land use classification, multi-temporal high-resolution optical imagery has become a critical tool, a shift enabled by the expanding repository of remote sensing satellite data. The Normalized Difference Vegetation Index (NDVI) serves as a reliable means of extracting critical information on surface vegetation [14,15], and this index is widely used to predict key growth stages, such as the grain-filling and ripening stages of rice, providing robust technical support for the development of crop growth models. Rice spectral response varies across phenological stages. During early emergence and tillering, a sparse canopy and low biomass result in low red and NIR reflectance, yielding lower NDVI and EVI values. As the canopy thickens and chlorophyll increases during tillering and jointing, NIR reflectance and vegetation indices rise [16,17]. In the heading and flowering stages, with full canopy closure and maximum LAI, NIR reflectance and vegetation indices reach their highest values, strongly correlating with yield [18,19,20]. In the grain-filling stage, chlorophyll decreases and the canopy yellows, causing a drop in NIR and an increase in red reflectance, leading to a decline in NDVI and EVI. At maturity, as grains ripen and leaves yellow, red and SWIR reflectance increase while NIR decreases, with a distinct red-edge shift [21]. These spectral changes are critical for differentiating rice growth stages and are widely used in optical remote sensing phenology studies. Moreover, the integration of NDVI and the Normalized Difference Water Index (NDWI) has demonstrated promising results in crop recognition, particularly for rice [22]. While optical remote sensing exhibits strengths such as fine resolution and well-defined feature attributes, its effectiveness is often limited by weather conditions.
For instance, crops are predominantly cultivated in subtropical regions, where the planting season often overlaps with the rainy season, posing significant challenges for monitoring. The persistent influence of environmental factors may render the accumulation of temporal imagery insufficient to significantly enhance crop classification accuracy.
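For reference, the vegetation and water indices discussed above can be computed directly from Sentinel-2 surface reflectance bands. The sketch below is illustrative only: the reflectance values are synthetic placeholders, and NDWI is written in the common McFeeters green/NIR form, which is an assumption about the variant used.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir, red = np.asarray(nir, dtype=float), np.asarray(red, dtype=float)
    return (nir - red) / (nir + red + 1e-10)  # small epsilon avoids division by zero

def ndwi(green, nir):
    """NDWI (McFeeters form): (Green - NIR) / (Green + NIR); high over open water."""
    green, nir = np.asarray(green, dtype=float), np.asarray(nir, dtype=float)
    return (green - nir) / (green + nir + 1e-10)

# Toy 2x2 reflectance patches (Sentinel-2 B8 = NIR, B4 = red, B3 = green)
nir = np.array([[0.40, 0.45], [0.05, 0.42]])
red = np.array([[0.08, 0.07], [0.04, 0.09]])
green = np.array([[0.10, 0.09], [0.30, 0.11]])

v = ndvi(nir, red)    # dense-canopy pixels give high NDVI
w = ndwi(green, nir)  # the water-like pixel (low NIR) gives high NDWI
```

In a real workflow the inputs would be the resampled 10 m B3/B4/B8 reflectance rasters rather than these toy arrays.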
Synthetic Aperture Radar (SAR) penetrates clouds, fog, and snow, enabling direct observation of surface targets [23]. This all-weather capability drives its increasing adoption in crop classification studies [24,25,26,27]. Beyond these conventional applications, advanced polarimetric decomposition techniques have significantly enhanced SAR’s sensitivity to vegetation structure [28,29]. Researchers have found that polarimetric SAR features exhibit higher sensitivity to crop structure and biophysical characteristics than optical vegetation index features [19]. Fully polarimetric SAR signals show strong potential for effectively differentiating various crop types [30]. In agriculture, model-based polarimetric decomposition built on physical scattering models quantifies parameters such as volume scattering power and surface scattering power, providing a mechanistic basis for crop classification. Here, volume scattering, which correlates with crop biomass and height, characterizes multiple scattering within the canopy [31], while surface scattering reflects scattering at the soil interface [32]. Studies on scattering mechanisms based on the two-dimensional H-α eigenspace have made significant progress in differentiating target types using the eigenvalues of the coherence matrix and their associated eigenvectors [33,34,35]. The H/A/α decomposition parameters constitute a robust analytical framework for polarimetric SAR image interpretation and have been extensively applied across diverse areas, including crop classification and land use classification [34,36]. These parameters outperform traditional backscatter coefficients by isolating scattering mechanisms, thereby improving classification accuracy.
Current Sentinel-1-based crop classification methodologies predominantly rely on backscatter coefficients and H/A/α decomposition features [37,38,39,40], underutilizing the advanced polarimetric information available from model-based decomposition approaches.
The contribution of single SAR images to rice mapping needs to be improved [41], and future research needs to deepen the coupling between scattering mechanisms and agronomic parameters to promote a shift in crop classification from two-dimensional mapping toward a synergistic perception of structure and physiological state. Long-time-series SAR data can effectively monitor changes in scattering characteristics during crop growth [42,43], so it is advantageous to utilize long time series of images for crop classification. Therefore, fully exploiting multi-temporal polarimetric decomposition features can effectively characterize crop structural changes and further enhance classification performance [44,45].
The main objectives of this case study are summarized as follows: (1) To assess the accuracy, temporal regularity, and classification effectiveness of multi-temporal Sentinel-2 imagery for agricultural crop classification with different crops. (2) To evaluate the classification performance, temporal regularity, and crop-recognition capabilities of multi-temporal Sentinel-1 data, emphasizing the contribution of model-based polarimetric decomposition features. (3) From a practical perspective, to determine the most suitable crop classification strategy for the research region by comparatively analyzing the Sentinel-1 and Sentinel-2 images.

2. Materials and Methods

2.1. Study Area

The town of Helonghu is located in the center of Xiangyin County, Yueyang City, Hunan Province (28.5° N–29.0° N, 112.8° E–113.2° E), as shown in Figure 1, and belongs to a typical lake–wetland ecosystem. The town holds significant geographic and economic importance in the region due to its abundant water resources and unique ecological environment. These favorable ecological conditions and soil quality provide an optimal environment for the cultivation of various crops, including rice, soybeans, corn, and ramie. Rice follows a remarkably consistent cultivation timeline: sowing (early to mid-May), growth (mid-May to August), and harvesting (August to late September). Corn and soybeans have a shorter cycle, with sowing in early April and harvest by late August. Ramie (Boehmeria nivea), a perennial herbaceous plant highly valued for its high-quality fiber, is also an important economic crop in Helonghu Town. Its planting and harvesting seasons span from mid-April to early September.

2.2. Sentinel Data

For SAR data in this experiment, images acquired by Sentinel-1A (a satellite with a 12-day revisit cycle) were employed; comprehensive SAR satellite metadata are detailed in Table 1. As a member of the European Space Agency (ESA)’s Sentinel program, this satellite was designed to supply high-quality data for global environmental monitoring and is widely applied in natural disaster monitoring, resource management, and urban planning. The satellite is characterized by its extensive coverage and substantial data volume [46,47]. It transmits C-band signals toward the ground, enabling radar image acquisition for Earth observation under any illumination and weather conditions. The Sentinel-1 data format used in this experiment was Single-Look Complex (SLC). The collected imagery spanned approximately seven months (28 February through 19 September 2024), yielding 12 images for analysis. The Sentinel-1 data were acquired in the Interferometric Wide Swath (IW) mode, which is primarily employed for terrestrial observation and satisfies the accuracy requirements of this experiment. In single-look mode, the spatial resolution of the IW mode is 5 m × 20 m, and its swath width extends up to 250 km. Employing the Terrain Observation with Progressive Scans SAR (TOPSAR) technique, the IW mode acquires three sub-swaths covering the study region, which is unattainable with traditional scanning radar techniques. Moreover, improvements to the radar scanning mode and data processing algorithms within the TOPSAR technique enhance both the spatial resolution of SAR imaging and data processing efficiency.
The study incorporated multispectral imagery from the Sentinel-2A and Sentinel-2B missions. Operating in a coordinated orbital pattern, this satellite constellation facilitates comprehensive global monitoring through an optimized revisit cycle. The spatial resolution of the images varies across bands, with the multispectral bands offering a wealth of spectral information; notably, the B2, B3, B4, and B8 bands achieve a spatial resolution of 10 m. Each image used has less than 20% cloud cover. Detailed information on the Sentinel-2 satellite bands is presented in Table 2. A series of 8 qualified Sentinel-2 images formed the basis of the temporal analysis, spanning the period from 12 January to 18 September 2024.
All Sentinel-1 and Sentinel-2 satellite imagery used in this study was obtained from the official European Space Agency (ESA) Copernicus Open Access Hub. For Sentinel-2 data, only Level-2A products with less than 20% cloud cover were selected to ensure the quality of the optical images and minimize the influence of atmospheric conditions on classification accuracy.

2.3. Field Data Collection

The dataset encompassed crop types and locations, providing a substantial amount of ground truth for the subsequent crop classification. A total of 8 classes of ground data were collected, including 4 crop types: rice, corn, soybean, and ramie. Specifically, 1227 rice polygons, 19 corn polygons, 251 soybean polygons, and 67 ramie polygons were gathered. The distribution and details of the measured data are depicted in Figure 2 and Table 3, respectively.
In the study area, rice cultivation is divided into double-cropping rice and single cropping rice. Double-cropping rice is further subdivided into double-cropping early rice and double-cropping late rice. In general, the double-cropping early rice has a preparatory sowing period from 15 to 20 March, followed by sowing from 20 to 25 March. The seedling stage lasts for approximately 25–30 days. Transplanting takes place from 20 to 25 April, while the tillering stage occurs from 1 to 12 May. The jointing and heading stages typically occur from mid-May to mid-June, with the grain-filling period lasting about one month. Consequently, early double-cropping rice usually reaches maturity by mid-July.
The double-cropping late rice is sown and prepared for planting in mid-June, with transplanting carried out in mid-July. The tillering stage occurs from late July to early August, followed by the panicle initiation stage. Panicle elongation lasts for approximately one month, and the grain-filling period extends from late September to early October, lasting 40–45 days. Therefore, double-cropping late rice typically reaches maturity by around October 20. Single-cropping rice is prepared for plowing before 10 May, with sowing taking place in mid-to-late May. The seedling stage lasts for approximately 25–30 days. Transplanting is carried out in early to mid-June, followed by the tillering stage in mid-to-late June. The jointing and panicle elongation stage begins in mid-July, and the grain-filling stage lasts for about 40–45 days. The crop reaches maturity by the end of September.
Field surveys were conducted from 21 to 23 July 2024, corresponding to the mid-to-late growing season of the main crops in the study area. During this period, the double-cropping early rice had already been harvested. Due to variations in actual planting conditions, the double-cropped late rice remained in the vegetative growth stage, with the plants appearing relatively sparse, whereas the single-cropped rice had entered the jointing and panicle elongation stage, exhibiting dense growth. Soybeans and corn were in their reproductive growth stages, with soybeans approaching the pod filling stage and corn nearing tasseling and silking [48]. As a perennial fiber crop, ramie had already entered the late growth stage, just prior to harvest [49]. These phenological conditions formed distinct spectral and structural characteristics, which are crucial for the interpretation of remote sensing classification results. The growth stages corresponding to different crops are illustrated in Figure 3.

2.4. Random Forest Classification Algorithm

In this experiment, we employed the Random Forest algorithm, a prevalent ensemble learning technique commonly used for classification and regression tasks [50]. By developing an ensemble of decision trees and synthesizing their predictions, this methodology strengthens both the precision and reliability of the model, thereby reducing overfitting and improving generalization performance. Random Forest was selected as the classifier for all experiments to minimize algorithmic variance. This allows the study to focus specifically on evaluating the contribution of different feature sets, particularly the polarimetric decomposition parameters, to the classification accuracy. The Random Forest incorporated 150 decision trees. The reference dataset was divided into training and testing sets using a stratified random sampling strategy based on the total pixel count of each class, with a ratio of 7:3. This ensured that the sample distribution in both subsets was representative of the overall dataset variability.
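The classification setup described above (150 trees, stratified 7:3 split) can be sketched with scikit-learn. The synthetic feature vectors below stand in for real pixel samples and are assumptions for illustration only.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical pixel samples: rows = pixels, columns = stacked band values.
# Four well-separated synthetic classes stand in for rice/soybean/corn/ramie.
n_per_class = 300
X = np.vstack([rng.normal(loc=c, scale=0.5, size=(n_per_class, 10)) for c in range(4)])
y = np.repeat(np.arange(4), n_per_class)

# Stratified 7:3 split, as described in the text
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=42
)

clf = RandomForestClassifier(n_estimators=150, random_state=42)  # 150 trees
clf.fit(X_tr, y_tr)
oa = accuracy_score(y_te, clf.predict(X_te))
```

With real imagery, `X` would be the per-pixel stack of multi-temporal band and decomposition features, and `y` the class labels rasterized from the field polygons.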

2.5. Data Preprocessing

2.5.1. Data Processing Flow

We used the official ESA software SNAP 9.0.0 for the preprocessing of the Sentinel data [51]. For the Sentinel-1 data, preprocessing mainly involved image splitting, radiometric calibration, debursting, extraction of the $C_2$ matrix, multi-looking, speckle filtering (9 × 9 boxcar) [52], and finally geocoding with resampling to 10 m resolution. For the Sentinel-2 data, we selected Level-2A imagery, which had already undergone atmospheric correction, so it was only necessary to resample the 12 bands of the eight time-series images to a 10 m spatial resolution, consistent with Bands 2, 3, 4, and 8 [53]. This resampling ensured a consistent spatial resolution across all images, facilitating subsequent analysis.
Image fusion [13,24] is a powerful technique that combines data from different sources to enrich feature and spectral information. In this experiment, the optical dataset consisted of 96 bands after construction, while the SAR dataset comprised 84 bands. By adding the H/A/α and model-based dual-polarization decomposition features and merging the two datasets, we constructed a comprehensive feature dataset containing 180 bands. The rich spectral information and prominent feature characteristics of this dataset significantly improved the subsequent classification accuracy. The overall experimental workflow is illustrated in Figure 4.
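The band-stacking step can be illustrated as follows. The array shapes are assumptions consistent with the counts given in the text (8 Sentinel-2 dates × 12 bands = 96 layers; 12 Sentinel-1 dates × 7 features = 84 layers; 180 in total), and random values stand in for the real co-registered imagery.

```python
import numpy as np

H, W = 64, 64                        # toy scene size (real scenes are much larger)
optical = np.random.rand(96, H, W)   # 8 dates x 12 Sentinel-2 bands = 96 layers
sar     = np.random.rand(84, H, W)   # 12 dates x 7 SAR features = 84 layers

# Stack along the band axis to form the fused 180-band feature dataset
fused = np.concatenate([optical, sar], axis=0)

# Reshape to (n_pixels, n_features) for a per-pixel classifier such as Random Forest
samples = fused.reshape(fused.shape[0], -1).T
```

The key practical requirement is that all layers share the same 10 m grid, which is why the geocoding and resampling steps above precede fusion.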

2.5.2. Polarimetric Decomposition Features

This experiment derived five distinct polarimetric decomposition parameters. The $C_2$ matrix, which contains information about radar signal returns at different time points or under varying conditions, serves as a crucial basis for extracting surface features and analyzing changes [54,55]. It is calculated as follows (Equation (1)):
$$ C_2 = \begin{bmatrix} \left\langle \left| S_{VV} \right|^2 \right\rangle & \left\langle S_{VV} S_{VH}^{*} \right\rangle \\ \left\langle S_{VH} S_{VV}^{*} \right\rangle & \left\langle \left| S_{VH} \right|^2 \right\rangle \end{bmatrix} \tag{1} $$
The main diagonal elements ($C_{11}$ and $C_{22}$) of the covariance matrix $C_2$ represent the VV and VH polarimetric scattering intensities, respectively.
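As an illustration, the windowed averages in Equation (1) can be estimated from the two SLC channels with a simple boxcar filter. This is a minimal NumPy sketch rather than the SNAP implementation, and the synthetic channel values are placeholders.

```python
import numpy as np

def boxcar(a, win=9):
    """Spatial boxcar (moving-average) filter with edge padding."""
    k = win // 2
    pad = np.pad(a, k, mode="edge")
    out = np.zeros(a.shape, dtype=a.dtype)
    for i in range(win):
        for j in range(win):
            out += pad[i:i + a.shape[0], j:j + a.shape[1]]
    return out / win**2

def c2_elements(s_vv, s_vh, win=9):
    """Estimate the C2 elements of Equation (1): <|S_VV|^2>, <S_VV S_VH*>, <|S_VH|^2>."""
    c11 = boxcar(np.abs(s_vv) ** 2, win)
    c22 = boxcar(np.abs(s_vh) ** 2, win)
    c12 = boxcar(s_vv * np.conj(s_vh), win)
    return c11, c12, c22

# Synthetic SLC patches (speckle-like complex Gaussians; VH weaker than VV)
rng = np.random.default_rng(1)
shape = (32, 32)
s_vv = rng.normal(size=shape) + 1j * rng.normal(size=shape)
s_vh = 0.3 * (rng.normal(size=shape) + 1j * rng.normal(size=shape))
c11, c12, c22 = c2_elements(s_vv, s_vh)
```

In the actual chain, calibration and debursting precede this estimation, and the 9 × 9 boxcar corresponds to the speckle filter mentioned in Section 2.5.1.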
(1) Cloude–Pottier decomposition
The Cloude–Pottier decomposition (also known as the $H/A/\alpha$ decomposition) is an eigenvalue–eigenvector-based method [56] that extracts parameters from the polarimetric coherency matrix (Equation (2)) to quantify scattering randomness and the dominant scattering mechanism.
$$ T_2 = U_2 \Sigma U_2^{-1} = \sum_{i=1}^{2} \lambda_i T_{2i} = \sum_{i=1}^{2} \lambda_i \, u_i u_i^{*T} \tag{2} $$
This decomposition yields three parameters: the polarimetric entropy ($H$), the anisotropy ($A$), and the mean scattering angle ($\bar{\alpha}$), expressed as (Equations (3)–(5)):
$$ H = -\sum_{i=1}^{n} P_i \log_n P_i, \quad 0 \le H \le 1 \tag{3} $$
$$ A = \frac{\lambda_2 - \lambda_3}{\lambda_2 + \lambda_3} \tag{4} $$
$$ \bar{\alpha} = \sum_{i=1}^{n} P_i \alpha_i \tag{5} $$
(2) Dual-polarization model-based decomposition
Mascolo et al. [57] established a decomposition methodology specifically designed for dual-polarization SAR data, built on model-based scattering mechanisms. Existing polarimetric decomposition models generally combine a volume scattering model with a procedure for extracting one or more residual (polarized) terms; such a complete decomposition is not possible with the limited information contained in dual-polarization data. Therefore, Mascolo et al. modeled the measured Stokes vector as the sum of three components:
$$ \mathbf{s} = m_v \mathbf{s}_v + m_s \mathbf{s}_p + n \, \mathbf{s}_n \tag{6} $$
where $\mathbf{s}_n$ is a randomly polarized Stokes vector representing the noise term, $\mathbf{s}_v$ and $\mathbf{s}_p$ are the unpolarized and fully polarized Stokes vectors, respectively, and $m_v$, $m_s$, and $n$ are the corresponding total powers. It has been shown that the noise term can be effectively removed by filtering techniques. Using a random dipole cloud as the volume scattering model, the decomposition can be expressed as:
$$ \mathbf{s} = m_v \begin{bmatrix} 1 \\ \pm 0.5 \\ 0 \\ 0 \end{bmatrix} + m_s \begin{bmatrix} 1 \\ \cos 2\alpha \\ \sin 2\alpha \cos \delta \\ \sin 2\alpha \sin \delta \end{bmatrix} \tag{7} $$
The four unknown parameters of this model ($m_v$, $m_s$, $\alpha$, and $\delta$) correspond to the four observables of the Stokes vector. The key to solving the model lies in exploiting the fully polarized nature of the $\mathbf{s}_p$ term, which satisfies $\det(C_2) = 0$ and hence $\mathbf{s}_p^T G \, \mathbf{s}_p = 0$, where $G$ is the matrix defined in Equation (8):
$$ G = \begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & -1 & 0 & 0 \\ 0 & 0 & -1 & 0 \\ 0 & 0 & 0 & -1 \end{bmatrix}, \qquad \det(C_2) = 0 \;\Rightarrow\; \mathbf{s}_p^T G \, \mathbf{s}_p = 0 \tag{8} $$
The volume power follows from the quadratic form in Equation (9), whose coefficients are given by the analytical expressions in Equation (10):
$$ \left( \mathbf{s} - m_v \mathbf{s}_v \right)^T G \left( \mathbf{s} - m_v \mathbf{s}_v \right) = 0 \tag{9} $$
$$ a m_v^2 + b m_v + c = 0, \quad \begin{cases} a = \mathbf{s}_v^T G \, \mathbf{s}_v = 0.75 \\ b = -2 \, \mathbf{s}^T G \, \mathbf{s}_v = -2 \left( s_1 \mp 0.5 \, s_2 \right) \\ c = \mathbf{s}^T G \, \mathbf{s} = s_1^2 - s_2^2 - s_3^2 - s_4^2 \end{cases} \tag{10} $$
In solving the quadratic equation, only one root satisfies the law of conservation of energy ($m_v \le s_1$), so a unique $m_v$ can be found. In addition, this decomposition avoids the problem of solving for complex eigenvalues. The polarized component power $m_s$ can then be obtained from Equation (11).
$$ \mathbf{s} - m_v \mathbf{s}_v = m_s \mathbf{s}_p \tag{11} $$
In summary, based on the extracted $C_2$ matrix, we performed the eigenvalue-based $H/A/\alpha$ decomposition and the dual-polarization model-based decomposition for feature extraction. These decompositions provide richer image features that are beneficial for distinguishing different crops and can enhance classification accuracy.
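As a worked illustration of Equations (3) and (9)–(11), the sketch below computes the dual-polarization entropy from a 2×2 covariance matrix and recovers the volume power $m_v$ from a synthetic Stokes vector built with known powers. It is a simplified per-pixel prototype, not the authors' processing chain, and the sign choice in $\mathbf{s}_v$ is an assumption.

```python
import numpy as np

def solve_mv(s, sign=+1):
    """Solve a*mv^2 + b*mv + c = 0 (Equations (9)-(10)) for the volume power mv.
    s: Stokes vector [s1, s2, s3, s4]; sign selects the +- in s_v = [1, +-0.5, 0, 0]."""
    s = np.asarray(s, dtype=float)
    sv = np.array([1.0, sign * 0.5, 0.0, 0.0])
    G = np.diag([1.0, -1.0, -1.0, -1.0])
    a = sv @ G @ sv              # = 0.75
    b = -2.0 * (s @ G @ sv)
    c = s @ G @ s                # = s1^2 - s2^2 - s3^2 - s4^2
    roots = np.roots([a, b, c])
    # Energy conservation: keep the real root with 0 <= mv <= s1 (unique by Eq. (10))
    valid = [r.real for r in roots if abs(r.imag) < 1e-9 and 0.0 <= r.real <= s[0] + 1e-9]
    return min(valid)

def entropy_dualpol(c11, c12, c22):
    """Polarimetric entropy H (Equation (3), n = 2) from the 2x2 C2 eigenvalues."""
    lam = np.linalg.eigvalsh(np.array([[c11, c12], [np.conj(c12), c22]]))
    p = np.clip(lam.real / lam.real.sum(), 1e-12, 1.0)
    return float(-(p * np.log2(p)).sum())

# Synthetic check: build s = mv*s_v + ms*s_p with known powers (Equation (7))
alpha, delta = 0.5, 0.3
sp = np.array([1.0, np.cos(2 * alpha),
               np.sin(2 * alpha) * np.cos(delta),
               np.sin(2 * alpha) * np.sin(delta)])
sv = np.array([1.0, 0.5, 0.0, 0.0])
s = 0.6 * sv + 0.4 * sp

mv = solve_mv(s)     # recovers the volume power used to build s
ms = s[0] - mv       # polarized power from the first row of Equation (11)
```

Because $\mathbf{s}_p$ is fully polarized, the true $m_v$ is an exact root of the quadratic, and the second root violates $m_v \le s_1$, matching the uniqueness argument in the text.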

2.6. Accuracy Assessment

In this experiment, classification performance was statistically evaluated via confusion matrix analysis [58], which is a routinely used tool in remote sensing image classification evaluation. The confusion matrix provides four key parameters:
Producer’s Accuracy (PA) [59]: The metric evaluates category-specific detection reliability by calculating the percentage of correctly predicted positive cases out of all genuine instances of that class.
User’s Accuracy (UA) [59]: This evaluation criterion characterizes precision as the relationship between correctly classified positive samples and the sum of all samples designated as positive by the classifier. It reflects the reliability of the model’s classification for a specific class.
Overall Accuracy (OA): The metric assesses overall model performance by measuring the percentage of all accurately identified cases across all classes in relation to the total observations, thereby measuring the model’s global classification performance.
Kappa Coefficient [60]: This metric measures the agreement between the classification result and the reference data while correcting for the agreement expected by chance, providing a more robust summary of performance than overall accuracy alone.
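The four metrics can be computed from a confusion matrix as follows. The toy two-class matrix is illustrative, and rows are assumed to be reference classes with columns as predicted classes.

```python
import numpy as np

def classification_metrics(cm):
    """PA, UA, OA, and Kappa from a confusion matrix
    (rows = reference classes, columns = predicted classes)."""
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    diag = np.diag(cm)
    pa = diag / cm.sum(axis=1)   # producer's accuracy (recall per class)
    ua = diag / cm.sum(axis=0)   # user's accuracy (precision per class)
    oa = diag.sum() / total      # overall accuracy
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total**2  # chance agreement
    kappa = (oa - pe) / (1.0 - pe)
    return pa, ua, oa, kappa

# Toy 2-class confusion matrix
cm = [[90, 10],
      [20, 80]]
pa, ua, oa, kappa = classification_metrics(cm)
```

For this toy matrix, OA is 0.85 while Kappa is 0.70, illustrating how the chance-agreement correction makes Kappa stricter than OA.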

3. Results

3.1. Multi-Temporal Sentinel-2 Images Classification Results

To evaluate the temporal properties of Sentinel-2 optical imagery for crop classification within the study area, a series of experiments were designed and implemented. Using the multi-temporal Sentinel-2 dataset employed in this study—comprising 8 acquisitions from different phenological stages—we progressively combined images across time and quantitatively analyzed the resulting classification accuracy for each crop type. This accuracy was measured as a function of the number of images used, as illustrated in Figure 5. The PA for each crop is shown in Table 4. The results indicate that integrating four optical images markedly enhances the classification performance for all four crop types. Maximum accuracy is achieved with the inclusion of six images, beyond which further additions lead to a decline in performance.
To further investigate the contribution of early-season imagery (January and February), we conducted a comparative experiment by excluding these months from the input dataset. The results indicated a decrease in Overall Accuracy to 91.56% (Table 5). This finding confirms that despite the absence of standing crops, the spectral information from the fallow period—serving as a phenological baseline and aiding in the separation of cropland from evergreen vegetation—is essential for achieving optimal classification performance.

3.2. Multi-Temporal Sentinel-1 Intensity Images Classification Results

To evaluate the contribution of Sentinel-1 dual-polarization decomposition features to crop classification, a controlled experiment was designed utilizing only preprocessed Sentinel-1 backscatter intensity images. The relevant results are shown in Figure 6, with the accuracy for each crop listed in Table 6. The subsequent classification results demonstrated that ramie and soybeans achieved recognition accuracies of 92.18% and 89.03%, respectively, when using a time series of 12 Sentinel-1 intensity images. However, the accuracy of rice, the main crop in the planting area, was only 66.59%. Therefore, classification based solely on Sentinel-1 intensity images cannot meet application requirements. It is necessary to extract more features based on polarimetric SAR data to enhance the classification effect.
When only the 12 preprocessed intensity images from Sentinel-1 were used, the classification outcome proved inadequate: the OA of the classification results is 65.00%, and the Kappa coefficient is 0.52. The PA values of the four crops (rice, soybean, corn, and ramie) are 66.59%, 89.03%, 72.18%, and 92.18%, respectively. Utilizing only the backscattering products of Sentinel-1 therefore does not achieve good classification results. Consequently, in subsequent experiments, we leveraged dual-polarization decomposition to enrich the feature space of Sentinel-1 imagery and more comprehensively evaluated its efficacy in crop classification.

3.3. Multi-Temporal Sentinel-1 Features Images Classification Results

To improve the precision of crop type identification with Sentinel-1 imagery, we extracted polarimetric decomposition features from the $C_2$ matrix and obtained temporal classification features using all 12 Sentinel-1 images. The classification accuracy for each crop is shown in Table 7. We incrementally aggregated multi-temporal images and quantitatively assessed the per-crop classification accuracy as a function of the number of images, as shown in Figure 7. The variation in overall classification accuracy for multi-temporal optical and multi-temporal SAR data is shown in Figure 8.
The experimental results demonstrate that, after adding the polarimetric decomposition features, the classification accuracies of the four crops improved to different degrees compared with the results from Sentinel-1 intensity images alone. Specifically, the classification accuracy of rice improved by 5.50%, soybean by 2.60%, corn by 2.41%, and ramie by 1.35%. Notably, rice displayed the most pronounced enhancement, which is particularly valuable for operational monitoring since it is the most extensively cultivated crop in the region.

3.4. The Best Crop Classification Strategy

With the time-series classification curves from both satellite systems now established, we proceeded to conduct comparative experiments aimed at developing an optimal crop classification strategy for the region.
Section 3.1 and Section 3.3 have illustrated the OA of classifications derived from Sentinel-1 and Sentinel-2 imagery. The results demonstrate that the classification accuracy of optical images initially rises with the inclusion of more temporal data, peaks at an optimal number of images (6), and subsequently declines with further additions. In contrast, the overall accuracy of SAR data increases with the number of images and peaks at 12 images. Several studies have demonstrated that combining Sentinel-1/2 time series can significantly improve crop discrimination accuracy [61,62,63]. Therefore, all Sentinel-1 feature images combined with the Sentinel-2 images from 12 January 2024 to 24 August 2024 were selected as the most applicable classification scheme for the experimental area. A thematic map illustrating the output of the best classification strategy is provided in Figure 9, and the corresponding confusion matrix is shown in Table 8. The performance of each crop under the optimal classification strategy is shown in Figure 10.
A comparison of the accuracy matrices reveals that the overall classification accuracy with polarimetric decomposition features reaches 94.20%, representing only a 0.20% improvement over the best result achieved using optical data alone. Although this increase is modest, focusing on specific crop types reveals more significant improvements. Notably, the classification accuracy for rice improved by 2.92%, and for corn, it increased by 3.89%. This enhancement, while not substantial in terms of overall accuracy, is particularly significant in practical applications, especially under challenging conditions where optical imagery is either absent or of poor quality. The study area’s climatic conditions, characterized by persistent precipitation and prolonged cloud cover during the rice and corn growing seasons, as well as intercropping practices, make it difficult to achieve reliable classification when relying solely on optical imagery. The integration of SAR data, particularly the polarimetric decomposition features, effectively addresses these challenges and improves the classification accuracy of specific crops in the study area, demonstrating its crucial application value when optical data quality is compromised or missing.

4. Discussion

4.1. Main Findings and Explanations

The rice and corn growing season in the study area is affected by rainy weather, making it difficult to extract rice and corn growing areas using optical remote sensing data alone. This case study therefore systematically quantifies the efficacy of dual-polarization SAR, and of its decomposition features, for crop monitoring and planting-area extraction under cloud-constrained conditions, demonstrating that integrating multi-temporal Sentinel-2 optical data with Sentinel-1 polarimetric decomposition features substantially improves crop classification accuracy in agricultural landscapes. Three principal findings emerged:
Non-linear temporal effects: Classification accuracy showed significant improvement when using four multi-temporal images acquired between 12 January and 9 August 2024. This improvement likely stems from the close temporal alignment of the August 9 image with the field data collection period, ensuring high feature similarity between ground samples and satellite observations.
However, as additional images were incorporated, the overall accuracy plateaued and slightly decreased upon including the 18 September 2024 image. Analysis of this image (Figure 11) revealed cloud cover obstructing ground features in the study area. Thus, accuracy improvements depend more critically on image quality (e.g., radiometric consistency and noise levels) than merely on increasing the number of images.
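The multi-temporal stacking evaluated above can be sketched as follows: T co-registered feature images are flattened into one vector per pixel before classification. All sizes and values here are illustrative, not the study's actual dimensions.

```python
import numpy as np

rng = np.random.default_rng(0)
T, B, H, W = 6, 4, 50, 60              # acquisitions, features, rows, cols (illustrative)
stack = rng.random((T, B, H, W))       # co-registered time-feature cube

# Flatten to the (n_pixels, n_features) design matrix a random forest expects:
# pixel (row, col) gets the T*B values observed at that location over time.
X = stack.reshape(T * B, H * W).T
print(X.shape)                         # (3000, 24)
```

Adding an acquisition appends B columns per pixel; as discussed above, whether those columns help depends on the image's quality, not merely its presence.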
The spectral signatures of rice during the mid-growth stage are highly similar to those of dense grasses, as shown in Figure 12, leading to potential misclassification in optical imagery. This spectral ambiguity is further quantified through sample separability analysis. The Jeffries–Matusita (JM) distance [64,65], a standardized remote sensing metric for class distinguishability (Equation (12)), ranges from 0 (inseparable) to 2 (fully separable). According to Richards and Jia [65], a JM distance greater than 1.90 is typically required in remote sensing crop classification to indicate good separability. By this criterion, the obtained results in Table 9 indicate that the spectral separability between rice and water, as well as between rice and grass, is insufficient.
JM(p, q) = 2[1 − exp(−D_B(p, q))]   (12)
where p and q denote the two class probability distributions of a given feature, and D_B(p, q) is the Bhattacharyya distance between them.
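Assuming univariate Gaussian class distributions for a single feature (a common simplification; the means and variances below are invented and do not reproduce Table 9), the JM distance of Equation (12) can be sketched as:

```python
import numpy as np

def jm_distance(mu1, var1, mu2, var2):
    """JM distance for two 1-D Gaussian class distributions of one feature."""
    # Bhattacharyya distance D_B between the two Gaussians
    d_b = 0.25 * (mu1 - mu2) ** 2 / (var1 + var2) \
        + 0.5 * np.log((var1 + var2) / (2.0 * np.sqrt(var1 * var2)))
    return 2.0 * (1.0 - np.exp(-d_b))  # bounded in [0, 2]

print(jm_distance(0.30, 0.01, 0.30, 0.01))  # identical classes -> 0.0
print(jm_distance(0.10, 0.01, 0.90, 0.01))  # well separated -> close to 2 (> 1.9)
```

The exponential saturates the unbounded Bhattacharyya distance, which is what gives the JM metric its fixed [0, 2] range and makes the 1.9 threshold comparable across features.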
Due to persistent cloud and rainfall interference during the growing season in the study area (as depicted in Figure 11), merely increasing the number of stacked optical images does not significantly enhance crop classification accuracy. Furthermore, it remains challenging to acquire optical imagery that concurrently satisfies the requirements of both minimal cloud obstruction and temporal continuity. Considering the unique geographical location of the study area and its high susceptibility to frequent cloud cover and rainfall, this case study focuses on investigating the importance of polarimetric SAR features in supplementing optical data to optimize crop classification performance. Therefore, only the feature importance of SAR-derived parameters was analyzed, as shown in Figure 13. The polarimetric decomposition features selected in this study—entropy ( H ), anisotropy ( A ), and mean scattering angle ( α ) from the Cloude–Pottier decomposition, together with the model-based dual-polarization parameters of surface scattering power ( m s ) and volume scattering power ( m v )—have been demonstrated to provide significant contributions to crop classification. Specifically,   H , A , and α describe the randomness, relative importance of scattering mechanisms, and the dominant scattering type of targets, respectively. Previous studies have shown that these parameters are highly sensitive to vegetation structure and phenological stages [66,67], enabling the effective discrimination of different crop types beyond the capability of simple backscatter coefficients. On the other hand, the dual-polarization model-based decomposition parameters m s and m v provide physical insights into surface and volume scattering contributions [57,68]. m s is closely associated with soil–plant interface reflections, whereas m v reflects canopy volume scattering, which is strongly related to crop biomass and height. 
Recent studies have confirmed that these parameters are effective indicators of crop phenology and structural variability, thereby improving class separability when integrated into classification models.
Therefore, the combination of H , A , α , m s , and m v offers a more comprehensive description of crop scattering mechanisms, enabling higher classification accuracy compared to the use of intensity-only features.
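As an illustration of how the Cloude–Pottier parameters arise, the sketch below computes H, A, and the mean scattering angle from a single synthetic 2 × 2 dual-pol covariance matrix. The matrix values are invented, and a real workflow (e.g., in ESA SNAP [51]) would first multilook and speckle-filter the data.

```python
import numpy as np

def h_a_alpha(c2):
    """Entropy H, anisotropy A, and mean alpha (deg) from a 2x2 covariance matrix."""
    lam, vecs = np.linalg.eigh(c2)             # eigenvalues in ascending order
    lam, vecs = lam[::-1], vecs[:, ::-1]       # reorder to descending
    p = lam / lam.sum()                        # pseudo-probabilities
    h = -np.sum(p * np.log2(p))                # entropy in [0, 1] (base-2 log)
    a = (lam[0] - lam[1]) / (lam[0] + lam[1])  # anisotropy
    alphas = np.arccos(np.abs(vecs[0, :]))     # alpha of each scattering mechanism
    return h, a, np.degrees(np.sum(p * alphas))

# Invented Hermitian C2 = <k k^H>, k = [S_VV, S_VH]^T, for one pixel.
c2 = np.array([[1.0, 0.2 + 0.1j],
               [0.2 - 0.1j, 0.4]])
h, a, alpha_bar = h_a_alpha(c2)
print(f"H = {h:.3f}, A = {a:.3f}, mean alpha = {alpha_bar:.1f} deg")
```

Low H with small alpha indicates a single dominant surface-like mechanism, while H near 1 indicates the random volume scattering typical of a developed canopy, which is why these parameters track phenology.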

4.2. Comparison with Previous Work

Our classification strategy achieves good results (OA = 94.20%, Kappa = 0.91) that are comparable with previous studies. For instance, Veloso et al. [11] reported 89% OA for European crops using Sentinel-1/2 data but without incorporating polarimetric features. Zhao et al. [27] achieved 91% OA using Sentinel-1 coherence features, though their approach required 15 acquisitions. In contrast, our method attained higher accuracy with fewer images (12 SAR acquisitions) by strategically leveraging physical scattering mechanisms ( m s , m v ) and phenological trajectories. Bargiel [69] developed an innovative classification approach that integrates time-series radar data and accounts for climate-induced variations in agricultural areas; however, related studies have focused mainly on non-cereal crops [70,71]. Wang et al. [72] demonstrated the potential of GF-3 full-polarimetric data for dryland crop classification. Nevertheless, compared to full-polarimetric data, Sentinel data offer free accessibility and wider availability, making them more practical for large-scale applications. Furthermore, polarimetric decomposition features provide unique advantages in identifying the surface scattering mechanisms of crops, significantly enhancing the accuracy of rice area extraction in our study region.

4.3. Limitations and Future Work

Although the overall accuracy reached 94.20%, several limitations remain. First, regarding timeliness and sample abundance: the field data were not temporally characterized and were concentrated at the time of collection (July 2024). Future studies should collect time-matched ground samples across the growing season to better capture crop dynamics. Second, the limited quantity of training samples for certain crops, such as corn, poses a constraint. In machine learning, class imbalance is a common challenge, and typical solutions include data-level techniques such as over-sampling (e.g., the Synthetic Minority Over-sampling Technique (SMOTE)) and under-sampling, as well as algorithmic approaches like weighted loss functions (e.g., Focal Loss) and cost-sensitive learning [73,74,75]. However, the samples used in this study were collected directly in the field within the experimental area, ensuring the authenticity and real-world relevance of the dataset. To maintain the integrity of the experiment and reflect actual agricultural conditions, we used the original samples without augmentation for classification, prioritizing results grounded in real ground-truth data rather than synthetic or extrapolated samples. It is worth noting that the random split of pixels may introduce spatial autocorrelation effects compared to object-based independent validation, potentially resulting in slightly higher accuracy estimates. Third, the incorporation of deep learning approaches may further optimize the classification framework in future studies. Advanced architectures, such as LSTM, GRU, and semantic segmentation networks, have been successfully applied to time-series SAR and optical data for refined crop mapping [76,77,78].
Models such as Convolutional Neural Networks (CNNs), Long Short-Term Memory networks (LSTMs), and Transformer-based architectures have demonstrated strong capabilities in exploiting the spatial, temporal, and polarimetric characteristics of Sentinel data. For example, Liu et al. [67] applied a patch-based neural network using polarization decomposition features from Sentinel-1 time series and achieved significantly higher classification accuracy than intensity-based methods. Similarly, Qi et al. [79] reported that Transformer models outperformed CNNs and LSTMs when integrating Sentinel-1 and Sentinel-2 imagery for multi-crop mapping. More recently, Gallo et al. [80] demonstrated that Transformer architectures such as Swin UNETR improved temporal segmentation stability in satellite time-series classification, while Patnala et al. [81] proposed a bi-modal self-supervised framework to alleviate the dependence on large labeled datasets. By integrating the dual-polarization decomposition features analyzed in this study, deep learning methods could potentially capture more comprehensive hierarchical scattering representations, achieving enhanced crop discrimination and better generalization across regions. Last, although the inclusion of Sentinel-1 SAR data provides all-weather observation capabilities and compensates for data gaps, the rigorous cloud screening of Sentinel-2 imagery inevitably disrupted the continuity of the time series. In this study, the absence of high-quality optical images during key phenological stages (e.g., the specific heading or ripening stages of paddy rice) may have led to the loss of critical spectral signatures that are vital for distinguishing spectrally similar crops. While the polarimetric decomposition features of SAR data offer supplementary structural information, they cannot fully replace the rich spectral information provided by optical bands, especially for crops with similar geometric structures but distinct pigment contents.
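Among the imbalance remedies mentioned in the limitations above, the lightest-weight option is inverse-frequency class weighting (the idea behind scikit-learn's class_weight="balanced"), which up-weights scarce classes such as corn without synthesizing samples. A minimal sketch with invented class counts:

```python
import numpy as np

def balanced_weights(labels):
    """Inverse-frequency class weights: n_samples / (n_classes * count_c)."""
    classes, counts = np.unique(labels, return_counts=True)
    weights = labels.size / (classes.size * counts)
    return dict(zip(classes.tolist(), weights.tolist()))

# Illustrative (invented) sample counts mimicking a corn-poor training set.
y = np.array(["rice"] * 600 + ["soybean"] * 250 + ["ramie"] * 100 + ["corn"] * 50)
print(balanced_weights(y))   # corn receives the largest weight
```

Such weights can be passed to a classifier's loss so that misclassifying a rare class costs more, complementing rather than replacing the field-collected samples used here.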

5. Conclusions

In this study, the crop classification capabilities of multi-temporal Sentinel-1 and Sentinel-2 data were independently evaluated. Particular emphasis was placed on assessing the contribution of dual-polarization model-based decomposition features for crop classification. An integrated classification framework tailored to the study area was developed using Sentinel imagery. By analyzing the effect of multi-temporal data stacking on classification outcomes, the following conclusions were drawn:
(1)
Stacking multi-temporal SAR or optical images generally improves overall classification accuracy; however, the improvement is not strictly linear and may even decline after reaching saturation.
(2)
Optical imagery can adequately support crop classification when free from clouds or other masking effects. In cloudy conditions, classification accuracy declines, and incorporating SAR decomposition features can effectively enhance performance.
(3)
Parameters derived from dual-polarization model-based decomposition efficiently improved classification accuracy, particularly for identifying rice and corn crops.

Author Contributions

Conceptualization, R.W., L.X. and Q.X.; methodology, R.W., L.X. and T.J.; software, R.W., Q.Z. and T.J.; validation, R.W., Q.Z. and T.J.; formal analysis, R.W., L.X., T.J., Q.H., Q.X. and H.F.; investigation, R.W., L.X., Q.H., Q.X. and H.F.; resources, L.X., Q.H., Q.X. and H.F.; data curation, R.W., Q.Z. and T.J.; writing—original draft preparation, R.W.; writing—review and editing, L.X., T.J., Q.Z., Q.H., Q.X. and H.F.; visualization, R.W., Q.Z. and T.J.; supervision, L.X., Q.X. and H.F.; project administration, L.X., Q.X. and H.F.; funding acquisition, L.X., Q.X. and H.F. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded in part by the National Natural Science Foundation of China (Grant No. 42171387, 41820104005).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The original contributions presented in this study are included in the article. Further inquiries can be directed to the corresponding author.

Acknowledgments

The authors would like to thank the European Space Agency for providing the Sentinel-1 and Sentinel-2 data, and Jianghao Yu, Lei Chen and Jie Yang from the PSML lab, China University of Geosciences (Wuhan) for their help with field work.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Seelan, S.K.; Laguette, S.; Casady, G.M.; Seielstad, G.A. Remote sensing applications for precision agriculture: A learning community approach. Remote Sens. Environ. 2003, 88, 157–169. [Google Scholar] [CrossRef]
  2. Sishodia, R.P.; Ray, R.L.; Singh, S.K. Applications of Remote Sensing in Precision Agriculture: A Review. Remote Sens. 2020, 12, 3136. [Google Scholar] [CrossRef]
  3. Wu, B.; Zhang, M.; Zeng, H.; Tian, F.; Potgieter, A.B.; Qin, X.; Yan, N.; Chang, S.; Zhao, Y.; Dong, Q.; et al. Challenges and opportunities in remote sensing-based crop monitoring: A review. Natl. Sci. Rev. 2022, 10, nwac290. [Google Scholar] [CrossRef] [PubMed]
  4. Shi, S.; Ye, Y.; Xiao, R. Evaluation of Food Security Based on Remote Sensing Data—Taking Egypt as an Example. Remote Sens. 2022, 14, 2876. [Google Scholar] [CrossRef]
  5. Liu, Z.; Liu, J.; Su, Y.; Xiao, X.; Dong, J.; Liu, L. A General Model for Large-Scale Paddy Rice Mapping by Combining Biological Characteristics, Deep Learning, and multi-source Remote Sensing Data. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2025, 18, 14705–14717. [Google Scholar] [CrossRef]
  6. Fang, H.; Liang, S.; Chen, Y.; Ma, H.; Li, W.; He, T.; Tian, F.; Zhang, F. A comprehensive review of rice mapping from satellite data: Algorithms, product characteristics and consistency assessment. Sci. Remote Sens. 2024, 10, 100172. [Google Scholar] [CrossRef]
  7. Roy, V.; Chandanan, A.K.; Maheshwary, P.; Sarathe, V.K.; Shukla, P.K. Economic, social, and environmental challenges in Agri 4.0. In Agri 4.0 and the Future of Cyber-Physical Agricultural Systems; Academic Press: Cambridge, MA, USA, 2024; pp. 91–113. [Google Scholar]
  8. Weiss, M.; Jacob, F.; Duveiller, G. Remote sensing for agricultural applications: A meta-review. Remote Sens. Environ. 2020, 236, 111402. [Google Scholar] [CrossRef]
  9. Jayaraman, P.P.; Yavari, A.; Georgakopoulos, D.; Morshed, A.; Zaslavsky, A. Internet of things platform for smart farming: Experiences and lessons learnt. Sensors 2016, 16, 1884. [Google Scholar] [CrossRef]
  10. Segarra, J.; Buchaillot, M.L.; Araus, J.L.; Kefauver, S.C. Remote sensing for precision agriculture: Sentinel-2 improved features and applications. Agronomy 2020, 10, 641. [Google Scholar] [CrossRef]
  11. Veloso, A.; Mermoz, S.; Bouvet, A.; Le Toan, T.; Planells, M.; Dejoux, J.-F.; Ceschia, E. Understanding the temporal behavior of crops using Sentinel-1 and Sentinel-2-like data for agricultural applications. Remote Sens. Environ. 2017, 199, 415–426. [Google Scholar] [CrossRef]
  12. Vuolo, F.; Neuwirth, M.; Immitzer, M.; Atzberger, C.; Ng, W.-T. How much does multi-temporal Sentinel-2 data improve crop type classification? Int. J. Appl. Earth Obs. Geoinf. 2018, 72, 122–130. [Google Scholar] [CrossRef]
  13. Clerici, N.; Valbuena Calderón, C.A.; Posada, J.M. Fusion of Sentinel-1A and Sentinel-2A data for land cover mapping: A case study in the lower Magdalena region, Colombia. J. Maps. 2017, 13, 718–726. [Google Scholar] [CrossRef]
  14. Soriano-González, J.; Angelats, E.; Martínez-Eixarch, M.; Alcaraz, C. Monitoring rice crop and yield estimation with Sentinel-2 data. Field Crops Res. 2022, 281, 108507. [Google Scholar] [CrossRef]
  15. Boori, M.S.; Choudhary, K.; Paringer, R.; Sharma, A.K.; Kupriyanov, A.; Corgne, S. Monitoring crop phenology using NDVI time series from Sentinel 2 satellite data. In Proceedings of the 2019 5th International Conference on Frontiers of Signal Processing (ICFSP), Marseille, France, 18–20 September 2019; pp. 62–66. [Google Scholar]
  16. Nazir, A.; Ullah, S.; Saqib, Z.A.; Abbas, A.; Ali, A.; Iqbal, M.S.; Hussain, K.; Shakir, M.; Shah, M.; Butt, M.U. Estimation and Forecasting of Rice Yield Using Phenology-Based Algorithm and Linear Regression Model on Sentinel-II Satellite Data. Agriculture 2021, 11, 1026. [Google Scholar] [CrossRef]
  17. Chen, Y.; Hu, J.; Cai, Z.; Yang, J.; Zhou, W.; Hu, Q.; Wang, C.; You, L.; Xu, B. A phenology-based vegetation index for improving ratoon rice mapping using harmonized Landsat and Sentinel-2 data. J. Integr. Agric. 2024, 23, 1164–1178. [Google Scholar] [CrossRef]
  18. Wiseman, G.; McNairn, H.; Homayouni, S.; Shang, J. RADARSAT-2 polarimetric SAR response to crop biomass for agricultural production monitoring. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2014, 7, 4461–4471. [Google Scholar] [CrossRef]
  19. Mandal, D.; Kumar, V.; Ratha, D.; Lopez-Sanchez, J.M.; Bhattacharya, A.; McNairn, H.; Rao, Y.S.; Ramana, K. Assessment of rice growth conditions in a semi-arid region of India using the Generalized Radar Vegetation Index derived from RADARSAT-2 polarimetric SAR data. Remote Sens. Environ. 2020, 237, 111561. [Google Scholar] [CrossRef]
  20. Ma, Y.; Jiang, Q.; Wu, X.; Zhu, R.; Gong, Y.; Peng, Y.; Duan, B.; Fang, S. Monitoring Hybrid Rice Phenology at Initial Heading Stage Based on Low-Altitude Remote Sensing Data. Remote Sens. 2021, 13, 86. [Google Scholar] [CrossRef]
  21. Ding, M.; Guan, Q.; Li, L.; Zhang, H.; Liu, C.; Zhang, L. Phenology-Based Rice Paddy Mapping Using Multi-Source Satellite Imagery and a Fusion Algorithm Applied to the Poyang Lake Plain, Southern China. Remote Sens. 2020, 12, 1022. [Google Scholar] [CrossRef]
  22. Huang, C.; Zhang, C.; He, Y.; Liu, Q.; Li, H.; Su, F.; Liu, G.; Bridhikitti, A. Land cover mapping in cloud-prone tropical areas using Sentinel-2 data: Integrating spectral features with Ndvi temporal dynamics. Remote Sens. 2020, 12, 1163. [Google Scholar] [CrossRef]
  23. Amherdt, S.; Di Leo, N.C.; Balbarani, S.; Pereira, A.; Cornero, C.; Pacino, M.C. Exploiting Sentinel-1 data time-series for crop classification and harvest date detection. Int. J. Remote Sens. 2021, 42, 7313–7331. [Google Scholar] [CrossRef]
  24. Ienco, D.; Interdonato, R.; Gaetano, R.; Minh, D.H.T. Combining Sentinel-1 and Sentinel-2 Satellite Image Time Series for land cover mapping via a multi-source deep learning architecture. ISPRS J. Photogramm. Remote Sens. 2019, 158, 11–22. [Google Scholar] [CrossRef]
  25. Skriver, H. Crop classification by multitemporal C-and L-band single-and dual-polarization and fully polarimetric SAR. IEEE Trans. Geosci. Remote Sens. 2011, 50, 2138–2149. [Google Scholar] [CrossRef]
  26. Skriver, H.; Mattia, F.; Satalino, G.; Balenzano, A.; Pauwels, V.R.; Verhoest, N.E.; Davidson, M. Crop classification using short-revisit multitemporal SAR data. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2011, 4, 423–431. [Google Scholar] [CrossRef]
  27. Zhao, Q.; Xie, Q.; Peng, X.; Lai, K.; Wang, J.; Fu, H.; Zhu, J.; Song, Y. Understanding the temporal dynamics of coherence and backscattering using Sentinel-1 imagery for crop-type mapping. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2024, 17, 6875–6893. [Google Scholar] [CrossRef]
  28. Xie, Q.; Lai, K.; Wang, J.; Lopez-Sanchez, J.M.; Shang, J.; Liao, C.; Zhu, J.; Fu, H.; Peng, X. Crop monitoring and classification using polarimetric RADARSAT-2 time-series data across growing season: A case study in southwestern Ontario, Canada. Remote Sens. 2021, 13, 1394. [Google Scholar] [CrossRef]
  29. Xie, Q.; Wang, J.; Liao, C.; Shang, J.; Lopez-Sanchez, J.M.; Fu, H.; Liu, X. On the use of Neumann decomposition for crop classification using multi-temporal RADARSAT-2 polarimetric SAR data. Remote Sens. 2019, 11, 776. [Google Scholar] [CrossRef]
  30. Hong, S.-H.; Kim, H.-O.; Wdowinski, S.; Feliciano, E. Evaluation of polarimetric SAR decomposition for classifying wetland vegetation types. Remote Sens. 2015, 7, 8563–8585. [Google Scholar] [CrossRef]
  31. Jian, S. Applying the Decomposition Technique in Vegetated Surface to Estimate Soil Moisture by Multi-Temporal Measurements. Remote Sens. 2005, 4, 3–6. [Google Scholar]
  32. Liang, B.; Zhao, R.; Tan, J.; Xia, L.; Cao, H.; Wu, S.; Yang, P. The Application of Compact Polarization Decomposition in the Construction of a Dual-Polarization Radar Index and the Effect Evaluation of Rape Extraction. IEEE J. Sel. Top. Appl. Earth Obs. Remote. Sens. 2023, 16, 5315–5330. [Google Scholar] [CrossRef]
  33. Lee, J.-S.; Grunes, M.R.; Ainsworth, T.L.; Du, L.-J.; Schuler, D.L.; Cloude, S.R. Unsupervised classification using polarimetric decomposition and the complex Wishart classifier. IEEE Trans. Geosci. Remote Sens. 1999, 37, 2249–2258. [Google Scholar]
  34. Tan, C.-P.; Koay, J.-Y.; Lim, K.-S.; Ewe, H.-T.; Chuah, H.-T. Classification of multi-temporal SAR images for rice crops using combined entropy decomposition and support vector machine technique. Prog. Electromagn. Res. 2007, 71, 19–39. [Google Scholar] [CrossRef]
  35. Wang, M.; Wang, L.; Guo, Y.; Cui, Y.; Liu, J.; Chen, L.; Wang, T.; Li, H. A Comprehensive Evaluation of Dual-Polarimetric Sentinel-1 SAR Data for Monitoring Key Phenological Stages of Winter Wheat. Remote Sens. 2024, 16, 1659. [Google Scholar] [CrossRef]
  36. Tan, C.P.; Ewe, H.T.; Chuah, H.T. Agricultural crop-type classification of multi-polarization SAR images using a hybrid entropy decomposition and support vector machine technique. Int. J. Remote Sens. 2011, 32, 7057–7071. [Google Scholar] [CrossRef]
  37. Haldar, D.; Rana, P.; Hooda, R.S. Biophysical parameter assessment of winter crops using polarimetric variables—Entropy (H), anisotropy (A), and alpha (α). Ara. J. Geosci. 2019, 12, 375. [Google Scholar] [CrossRef]
  38. Larrañaga, A.; Álvarez-Mozos, J. On the added value of quad-pol data in a multi-temporal crop classification framework based on RADARSAT-2 imagery. Remote Sens. 2016, 8, 335. [Google Scholar] [CrossRef]
  39. Verma, A.; Kumar, A.; Lal, K. Kharif crop characterization using combination of SAR and MSI Optical Sentinel Satellite datasets. J. Earth Sys. Sci. 2019, 128, 230. [Google Scholar] [CrossRef]
  40. Zhang, X.; Zhang, P.; Shen, K.; Pei, Z. Rice identification at the early stage of the rice growth season with single fine quad Radarsat-2 data. In Proceedings of the Remote Sensing for Agriculture, Ecosystems, and Hydrology XVIII; SPIE Remote Sensing: Edinburgh, UK, 2016; pp. 494–502. [Google Scholar]
  41. Tan, B.; Li, Z.; Li, B.; Zhang, P. Rice field mapping and monitoring using singe-temporal and dual polarized ENVISAT ASAR data. Trans. Chin. Soc. Agric. Eng. 2006, 22, 121–127. [Google Scholar]
  42. Inoue, Y.; Kurosu, T.; Maeno, H.; Uratsuka, S.; Kozu, T.; Dabrowska-Zielinska, K.; Qi, J. Season-long daily measurements of multifrequency (Ka, Ku, X, C, and L) and full-polarization backscatter signatures over paddy rice field and their relationship with biological variables. Remote Sens. Environ. 2002, 81, 194–204. [Google Scholar] [CrossRef]
  43. Khabbazan, S.; Vermunt, P.; Steele-Dunne, S.; Ratering Arntz, L.; Marinetti, C.; van der Valk, D.; Iannini, L.; Molijn, R.; Westerdijk, K.; van der Sande, C. Crop monitoring using Sentinel-1 data: A case study from The Netherlands. Remote Sens. 2019, 11, 1887. [Google Scholar] [CrossRef]
  44. Yin, Q.; Du, Y.; Li, F.; Zhou, Y.; Zhang, F. Multi-Temporal Dual Polarimetric SAR Crop Classification Based on Spatial Information Comprehensive Utilization. Remote Sens. 2025, 17, 2304. [Google Scholar] [CrossRef]
  45. Tsuchiya, Y.; Sonobe, R. Crop Classification Using Time-Series Sentinel-1 SAR Data: A Comparison of LSTM, GRU, and TCN with Attention. Remote Sens. 2025, 17, 2095. [Google Scholar] [CrossRef]
  46. Schwerdt, M.; Schmidt, K.; Tous Ramon, N.; Klenk, P.; Yague-Martinez, N.; Prats-Iraola, P.; Zink, M.; Geudtner, D. Independent system calibration of Sentinel-1B. Remote Sens. 2017, 9, 511. [Google Scholar] [CrossRef]
  47. Steinhausen, M.J.; Wagner, P.D.; Narasimhan, B.; Waske, B. Combining Sentinel-1 and Sentinel-2 data for improved land use and land cover mapping of monsoon regions. Int. J. Appl. Earth Obs. Geoinf. 2018, 73, 595–604. [Google Scholar] [CrossRef]
  48. Jiao, X.; McNairn, H.; Shang, J.; Pattey, E.; Liu, J.; Champagne, C. The sensitivity of RADARSAT-2 polarimetric SAR data to corn and soybean leaf area index. Can. J. Remote Sens. 2011, 37, 69–81. [Google Scholar] [CrossRef]
  49. Fu, H.; Chen, J.; Lu, J.; Yue, Y.; Xu, M.; Jiao, X.; Cui, G.; She, W. A Comparison of Different Remote Sensors for Ramie Leaf Area Index Estimation. Agronomy 2023, 13, 899. [Google Scholar] [CrossRef]
  50. Breiman, L. Random forests. Mach. Learn. 2001, 45, 5–32. [Google Scholar] [CrossRef]
  51. Foumelis, M.; Blasco, J.M.D.; Desnos, Y.-L.; Engdahl, M.; Fernández, D.; Veci, L.; Lu, J.; Wong, C. ESA SNAP-StaMPS integrated processing for Sentinel-1 persistent scatterer interferometry. In Proceedings of the IGARSS 2018-2018 IEEE International Geoscience and Remote Sensing Symposium, Valencia, Spain, 22–27 July 2018; pp. 1364–1367. [Google Scholar]
  52. Yahia, M.; Ali, T.; Mortula, M.M.; Abdelfattah, R.; El Mahdy, S. Polarimetric SAR speckle reduction by hybrid iterative filtering. IEEE Access 2020, 8, 89603–89616. [Google Scholar] [CrossRef]
  53. Persson, M.; Lindberg, E.; Reese, H. Tree species classification with multi-temporal Sentinel-2 data. Remote Sens. 2018, 10, 1794. [Google Scholar] [CrossRef]
  54. Eltoft, T.; Doulgeris, A.P. Model-based polarimetric decomposition with higher order statistics. IEEE Geosci. Remote Sens. Lett. 2019, 16, 992–996. [Google Scholar] [CrossRef]
  55. Karachristos, K.; Koukiou, G.; Anastassopoulos, V. A review on PolSAR decompositions for feature extraction. J. Imag. 2024, 10, 75. [Google Scholar] [CrossRef]
  56. Cloude, S. The dual polarization entropy/alpha decomposition: A PALSAR case study. Sci. Appl. SAR Polarim. Polarim. Interferom. 2007, 644, 2. [Google Scholar]
  57. Mascolo, L.; Cloude, S.R.; Lopez-Sanchez, J.M. Model-based decomposition of dual-pol SAR data: Application to Sentinel-1. IEEE Trans. Geosci. Remote Sens. 2021, 60, 5220119. [Google Scholar] [CrossRef]
  58. Foody, G.M. Status of land cover classification accuracy assessment. Remote Sens. Environ. 2002, 80, 185–201. [Google Scholar] [CrossRef]
  59. Story, M.; Congalton, R.G. Accuracy assessment: A user’s perspective. Photogramm. Eng. Remote Sens. 1986, 52, 397–399. [Google Scholar]
  60. Cohen, J. A coefficient of agreement for nominal scales. Educ. Psycho. Meas. 1960, 20, 37–46. [Google Scholar] [CrossRef]
  61. Wang, Z.; Sun, X.; Liu, X.; Xu, F.; Huang, H.; Ti, R.; Yu, H.; Wang, Y.; Wei, Y. Improved Paddy Rice Classification Utilizing Sentinel-1/2 Imagery in Anhui China: Phenological Features, Algorithms, Validation and Analysis. Agriculture 2024, 14, 1282. [Google Scholar] [CrossRef]
  62. Sheng, L.; Lv, Y.; Ren, Z.; Zhou, H.; Deng, X. Detection of the Optimal Temporal Windows for Mapping Paddy Rice Under a Double-Cropping System Using Sentinel-2 Imagery. Remote Sens. 2025, 17, 57. [Google Scholar] [CrossRef]
  63. Eisfelder, C.; Boemke, B.; Gessner, U.; Sogno, P.; Alemu, G.; Hailu, R.; Mesmer, C.; Huth, J. Cropland and Crop Type Classification with Sentinel-1 and Sentinel-2 Time Series Using Google Earth Engine for Agricultural Monitoring in Ethiopia. Remote Sens. 2024, 16, 866. [Google Scholar] [CrossRef]
  64. Sen, R.; Goswami, S.; Chakraborty, B. Jeffries-Matusita distance as a tool for feature selection. In Proceedings of the 2019 International Conference on Data Science and Engineering (ICDSE), Patna, India, 26–28 September 2019; pp. 15–20. [Google Scholar]
  65. Richards, J.A.; Jia, X. Remote Sensing Digital Image Analysis: An Introduction; Springer: Berlin/Heidelberg, Germany, 2006. [Google Scholar]
  66. Rucha, B.D.; Koushik, S.; Amit, K.; Manisha, V.; Nidhin, P.; Abishek, M. Analysing the potential of polarimetric decomposition parameters of Sentinel–1 dual-pol SAR data for estimation of rice crop biophysical parameters. J. Agrometeorol. 2023, 25, 105–112. [Google Scholar] [CrossRef]
  67. Liu, Y.; Pu, X.; Shen, Z. Crop Type Mapping Based on Polarization Information of Time Series Sentinel-1 Images Using Patch-Based Neural Network. Remote Sens. 2023, 15, 3384. [Google Scholar] [CrossRef]
  68. Kumar, S.S.; Rajendra, P.; Vivek, T.; Srivastava, P.K. An improved volume power approach to estimate LAI from optimized dual-polarized SAR decomposition. Int. J. Remote Sens. 2023, 44, 5736–5754. [Google Scholar]
  69. Bargiel, D. A new method for crop classification combining time series of radar images and crop phenology information. Remote Sens. Environ. 2017, 198, 369–383. [Google Scholar] [CrossRef]
  70. Desai, G.; Gaikwad, A. Deep learning techniques for crop classification applied to SAR imagery: A survey. In Proceedings of the 2021 Asian Conference on Innovation in Technology (ASIANCON), Pune, India, 27–29 August 2021; pp. 1–6. [Google Scholar]
  71. McNairn, H.; Shang, J.; Champagne, C.; Jiao, X. TerraSAR-X and RADARSAT-2 for crop classification and acreage estimation. In Proceedings of the 2009 IEEE International Geoscience and Remote Sensing Symposium, Cape Town, South Africa, 12–17 July 2009; pp. II-898–II-901. [Google Scholar]
  72. Wang, M.; Liu, C.; Han, D.; Wang, F.; Hou, X.; Liang, S.; Sui, X. Assessment of GF3 Full-Polarimetric SAR Data for Dryland Crop Classification with Different Polarimetric Decomposition Methods. Sensors 2022, 22, 6087. [Google Scholar] [CrossRef]
  73. He, H.; Garcia, E.A. Learning from imbalanced data. IEEE Trans. Knowl. Data Eng. 2009, 21, 1263–1284. [Google Scholar] [CrossRef]
  74. Buda, M.; Maki, A.; Mazurowski, M.A. A systematic study of the class imbalance problem in convolutional neural networks. Neural Netw. 2018, 106, 249–259. [Google Scholar] [CrossRef]
  75. Abd Elrahman, S.M.; Abraham, A. A review of class imbalance problem. J. Netw. Innov. Comput. 2013, 1, 9. [Google Scholar]
  76. Xu, T.; Cai, P.; Wei, H.; He, H.; Wang, H. Integrating Phenological Features with Time Series Transformer for Accurate Rice Field Mapping in Fragmented and Cloud-Prone Areas. Sensors 2025, 25, 7488. [Google Scholar] [CrossRef]
  77. Shen, G.; Liao, J. Paddy Rice Mapping in Hainan Island Using Time-Series Sentinel-1 SAR Data and Deep Learning. Remote Sens. 2025, 17, 1033. [Google Scholar] [CrossRef]
  78. Li, G.; Han, W.; Dong, Y.; Zhai, X.; Huang, S.; Ma, W.; Cui, X.; Wang, Y. Multi-Year Crop Type Mapping Using Sentinel-2 Imagery and Deep Semantic Segmentation Algorithm in the Hetao Irrigation District in China. Remote Sens. 2023, 15, 875. [Google Scholar] [CrossRef]
  79. Qi, Y.; Bitelli, G.; Mandanici, E.; Trevisiol, F. Application of deep learning crop classification model based on multispectral and sar satellite imagery. Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. 2023, XLVIII-1/W2-2023, 1515–1521. [Google Scholar] [CrossRef]
  80. Gallo, I.; Gatti, M.; Landro, N.; Loschiavo, C.; Boschetti, M.; La Grassa, R.; Rehman, A.U. Enhancing crop segmentation in satellite image time-series with transformer networks. In Proceedings of the Sixteenth International Conference on Machine Vision (ICMV 2023), Yerevan, Armenia, 15–18 November 2024; pp. 62–69. [Google Scholar]
  81. Patnala, A.; Schultz, M.G.; Gall, J. BERT Bi-modal self-supervised learning for crop classification using Sentinel-2 and Planetscope. Front. Remote Sens. 2025, 6, 1555887. [Google Scholar] [CrossRef]
Figure 1. (a) Overview of the research area; (b) A Sentinel-2B true-color (RGB) composite image (acquisition date 4 August 2024). (Red: B4, Green: B3, Blue: B2).
Figure 2. Map of Field Collection Data in Helonghu Township.
Figure 3. Agricultural Calendar of Rice, Soybean, Corn, and Ramie.
Figure 4. General technical flowchart of this study.
Figure 5. PA for the four crops versus the quantity of multi-temporal Sentinel-2 images.
Figure 6. Variation in PA across four crop types with increasing number of Sentinel-1 intensity images.
Figure 7. Variation in PA across four crop types with increasing number of Sentinel-1 feature images.
Figure 8. Variation in OA between multi-temporal optical data and SAR data.
Figure 9. The result for the best classification combination.
Figure 10. Comparison of different satellite data. (a–d) show the classification accuracy of rice, soybean, corn, and ramie, respectively, under different datasets.
Figure 11. Cloud cover present in the image acquired on 18 September 2024.
Figure 12. Field photos in the study area. (a) Grass; (b) Rice.
Figure 13. Normalized feature importance ranking results based on seven SAR features.
Table 1. Satellite information for Sentinel-1.

Parameter | Information
Satellite name | Sentinel-1A
Incidence range | 30.3–42.8°
Azimuth pixel spacing | 13.98 m
Sensor type | Synthetic Aperture Radar (SAR)
Operating frequency | C-band (5.4 GHz)
Polarimetric mode | VV, VH
Imaging mode | Interferometric Wide Swath (IW)
Available data | 20240228, 20240311, 20240323, 20240404, 20240416, 20240428, 20240522, 20240615, 20240814, 20240826, 20240907, 20240919
Data format | Single Look Complex (SLC)
Table 2. Satellite information for Sentinel-2.

Band | Centre Wavelength (μm) | Resolution (m)
B1 | 0.44 | 60
B2 | 0.49 | 10
B3 | 0.56 | 10
B4 | 0.67 | 10
B5 | 0.71 | 20
B6 | 0.74 | 20
B7 | 0.78 | 20
B8 | 0.84 | 10
B8A | 0.87 | 20
B9 | 0.95 | 60
B10 | 1.38 | 60
B11 | 1.61 | 20
B12 | 2.19 | 20

Available data | 20240112, 20240211, 20240312, 20240804, 20240809, 20240824, 20240908, 20240918
Processing level | L2A
Table 3. Number of samples in each category of field collection data.

Category | Training Polygons | Training Pixels | Testing Polygons | Testing Pixels
forest | 40 | 5230 | 2 | 2292
water | 506 | 35,374 | 58 | 13,928
soybean | 197 | 3634 | 54 | 1416
ramie | 55 | 1917 | 12 | 793
rice | 1028 | 28,215 | 199 | 11,270
grassland | 99 | 1537 | 41 | 679
building | 88 | 942 | 74 | 582
corn | 14 | 403 | 5 | 181
Table 4. Variation in classification accuracy (PA, %) with the number of Sentinel-2 images for the four crops.

Combination | Number of Images | Rice-PA | Soybean-PA | Ramie-PA | Corn-PA
0112 | 1 | 86.8 | 79.17 | 80.13 | 75.14
0112-0211 | 2 | 87.58 | 87.15 | 83.35 | 76.01
0112-0312 | 3 | 88.98 | 89.62 | 97.1 | 77.89
0112-0804 | 4 | 91.6 | 95.06 | 100 | 80.03
0112-0809 | 5 | 92.09 | 95.06 | 100 | 80.66
0112-0824 | 6 | 92.16 | 95.55 | 100 | 81.22
0112-0908 | 7 | 90.95 | 95.76 | 100 | 80.56
0112-0918 | 8 | 90.37 | 95.69 | 100 | 80.32
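The plateau in Table 4 (PA stops improving, and even dips, after roughly four to six images) can be located programmatically. The sketch below uses the rice PA column from Table 4; the 0.5-point gain threshold is an illustrative assumption, not a value from the paper.

```python
# Saturation analysis of producer's accuracy (PA) as images are added.
# rice_pa holds the Rice-PA column of Table 4, in image-count order.
rice_pa = [86.8, 87.58, 88.98, 91.6, 92.09, 92.16, 90.95, 90.37]

def saturation_point(pa_series, min_gain=0.5):
    """Return the smallest image count n such that adding the
    (n+1)-th image improves PA by less than min_gain points."""
    for n in range(1, len(pa_series)):
        if pa_series[n] - pa_series[n - 1] < min_gain:
            return n
    return len(pa_series)

print(saturation_point(rice_pa))  # marginal gain drops below 0.5 after the 4th image
```

The same routine applied to the other three columns would show ramie saturating earliest (100% from four images onward), consistent with the table.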
Table 5. Classification accuracy for the four crops and overall accuracy after removing optical images from January to March.

Combination | Number of Images | Rice-PA | Soybean-PA | Ramie-PA | Corn-PA
0312-0918 | 6 | 87.52 | 96.75 | 99.75 | 80.11
OA (%) | 91.56
Kappa | 0.87
Table 6. Variation in PA (%) across four crop types with increasing number of Sentinel-1 intensity images.

Combination | Number of Images | Rice-PA | Soybean-PA | Ramie-PA | Corn-PA
0228 | 1 | 15.87 | 29.45 | 43.77 | 15.44
0228-0311 | 2 | 17.91 | 44.98 | 59.96 | 18.33
0228-0323 | 3 | 29.54 | 56.52 | 64.72 | 27.88
0228-0404 | 4 | 48.37 | 61.59 | 79.32 | 44.34
0228-0416 | 5 | 55.02 | 64.8 | 85.1 | 50.67
0228-0428 | 6 | 58.94 | 70.14 | 89.79 | 55.9
0228-0522 | 7 | 62.36 | 78.23 | 89.78 | 63.88
0228-0615 | 8 | 63.73 | 81.12 | 90.46 | 65.32
0228-0814 | 9 | 64.26 | 84.09 | 90.78 | 68.12
0228-0826 | 10 | 65.03 | 85.22 | 91.22 | 69.09
0228-0907 | 11 | 65.95 | 89.03 | 92.18 | 70.91
0228-0919 | 12 | 66.59 | 88.43 | 91.8 | 72.18
Table 7. Variation in classification accuracy (PA, %) with the number of Sentinel-1 feature images for the four crops.

Combination | Number of Images | Rice-PA | Soybean-PA | Ramie-PA | Corn-PA
0228 | 1 | 21.96 | 29.45 | 48.5 | 19.55
0228-0311 | 2 | 24.62 | 49.68 | 61.96 | 21.99
0228-0323 | 3 | 31.31 | 59.75 | 66.92 | 32.6
0228-0404 | 4 | 56.33 | 63.93 | 81.32 | 59.12
0228-0416 | 5 | 58.17 | 71.41 | 89.1 | 60.77
0228-0428 | 6 | 63.49 | 74.14 | 90.56 | 66.15
0228-0522 | 7 | 65.34 | 88.8 | 92 | 70.18
0228-0615 | 8 | 64.76 | 87.92 | 92.46 | 69.84
0228-0814 | 9 | 67.6 | 91.24 | 92.29 | 73.45
0228-0826 | 10 | 68.93 | 89.67 | 92.23 | 72.43
0228-0907 | 11 | 70.55 | 91.63 | 92.19 | 71.51
0228-0919 | 12 | 72.09 | 89.03 | 93.48 | 74.59
Table 8. Confusion matrix of the best classification result. (1, rice; 2, soybean; 3, ramie; 4, grassland; 5, water; 6, forest; 7, corn; 8, building). Columns give the ground-truth class; rows give the classified class.

Class | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | PA (%) | UA (%)
1 | 10,716 | 22 | 0 | 221 | 332 | 0 | 3 | 38 | 95.08 | 94.56
2 | 132 | 1357 | 0 | 62 | 0 | 0 | 33 | 0 | 95.85 | 85.67
3 | 0 | 0 | 793 | 3 | 23 | 0 | 0 | 0 | 100.00 | 96.83
4 | 89 | 13 | 0 | 283 | 6 | 0 | 0 | 1 | 41.68 | 72.19
5 | 251 | 0 | 0 | 33 | 13,566 | 351 | 0 | 13 | 97.40 | 95.44
6 | 3 | 2 | 0 | 7 | 1 | 1941 | 0 | 0 | 84.69 | 93.41
7 | 17 | 0 | 0 | 0 | 0 | 0 | 145 | 0 | 85.11 | 89.51
8 | 22 | 2 | 0 | 6 | 0 | 0 | 0 | 530 | 91.07 | 94.64
OA (%) | 94.20
Kappa | 0.91
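The per-class PA/UA, overall accuracy, and kappa reported with a confusion matrix such as Table 8 follow directly from the matrix itself. A minimal sketch, run here on a toy two-class matrix rather than the paper's data:

```python
def accuracy_metrics(cm):
    """Producer's/user's accuracy, overall accuracy and Cohen's kappa
    from a square confusion matrix, where cm[i][j] counts pixels of
    reference class j labelled as class i."""
    n = len(cm)
    total = sum(sum(row) for row in cm)
    row_tot = [sum(cm[i]) for i in range(n)]                        # labelled as class i
    col_tot = [sum(cm[i][j] for i in range(n)) for j in range(n)]   # reference class j
    pa = [cm[j][j] / col_tot[j] for j in range(n)]                  # producer's accuracy
    ua = [cm[i][i] / row_tot[i] for i in range(n)]                  # user's accuracy
    oa = sum(cm[i][i] for i in range(n)) / total                    # overall accuracy
    pe = sum(row_tot[k] * col_tot[k] for k in range(n)) / total ** 2
    kappa = (oa - pe) / (1 - pe)                                    # chance-corrected agreement
    return pa, ua, oa, kappa

# Toy two-class example (not the paper's data):
pa, ua, oa, kappa = accuracy_metrics([[50, 5], [10, 35]])
print(round(oa, 2), round(kappa, 2))  # 0.85 0.69
```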
Table 9. Comparison of separability between classes.

Class pair | Separability
rice–water | 1.31
rice–grassland | 0.97
water–grassland | 1.92
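Table 9 does not name its separability statistic. One common choice in remote sensing whose 0–2 range matches these values is the Jeffries–Matusita (JM) distance; the sketch below computes it for univariate Gaussian class models, which is an assumption for illustration only (the study's actual feature space is multivariate).

```python
import math

def jeffries_matusita(mu1, var1, mu2, var2):
    """JM distance between two univariate Gaussian class models:
    0 for identical classes, approaching 2 for full separability."""
    # Bhattacharyya distance for two Gaussians, then JM = 2(1 - e^{-B}).
    b = ((mu1 - mu2) ** 2) / (4 * (var1 + var2)) \
        + 0.5 * math.log((var1 + var2) / (2 * math.sqrt(var1 * var2)))
    return 2 * (1 - math.exp(-b))

# Hypothetical class statistics, for illustration only:
print(round(jeffries_matusita(0.0, 1.0, 3.0, 1.0), 2))  # 1.35
```

Values near 2 (such as the 1.92 for water–grassland) indicate near-complete separability, while values below 1 (rice–grassland, 0.97) indicate substantial overlap.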
Wang, R.; Xia, L.; Jia, T.; Zhao, Q.; He, Q.; Xie, Q.; Fu, H. Evaluating Multi-Temporal Sentinel-1 and Sentinel-2 Imagery for Crop Classification: A Case Study in a Paddy Rice Growing Region of China. Sensors 2026, 26, 586. https://doi.org/10.3390/s26020586
