Article

Development of an Optical–Radar Fusion Method for Riparian Vegetation Monitoring and Its Application to Representative Rivers in Japan

1 Department of Urban and Civil Engineering, Ibaraki University, 4-12-1 Nakanarusawa-cho, Hitachi 316-8511, Japan
2 Spatial Information Department, Technical Center, Tohoku Division, PASCO Corporation, Sendai 983-0864, Japan
* Author to whom correspondence should be addressed.
Remote Sens. 2025, 17(19), 3281; https://doi.org/10.3390/rs17193281
Submission received: 6 August 2025 / Revised: 18 September 2025 / Accepted: 21 September 2025 / Published: 24 September 2025

Highlights

What are the main findings?
  • A composite index integrating the normalized difference vegetation index and backscatter intensity was developed, enabling the joint characterization of horizontal vegetation activity and vertical structural density across riparian environments.
  • The proposed method achieves an average geometric correction error of less than three pixels, and the spatial distribution of the composite index closely aligns with the actual vegetation conditions.
What are the implications of the main findings?
  • The proposed riparian vegetation monitoring method provides a stable characterization of vegetation distribution and demonstrates strong temporal sensitivity.
  • The proposed method represents a viable tool for remote sensing-based water environment management and ecological assessment and lays the foundation for broader applications in riparian ecosystem monitoring.

Abstract

Riparian vegetation plays a critical role in maintaining ecosystem function, ensuring drainage capacity, and enhancing disaster prevention and mitigation. However, existing ground-based survey methods are limited in both spatial coverage and temporal resolution, which increases the difficulty of meeting the growing demand for rapid, dynamic, and fine-scale monitoring of riverine vegetation. To address this challenge, this study proposes a remote sensing approach that integrates Sentinel-1 synthetic aperture radar imagery with Sentinel-2 optical data. A composite vegetation index was developed by combining the normalized difference vegetation index and synthetic aperture radar backscatter coefficients, thereby enabling the joint characterization of horizontal and vertical vegetation activity. The method was first tested in the Kuji River Basin in Japan and subsequently validated across eight representative river systems nationwide using 16 sets of satellite images acquired between 2016 and 2023. The results demonstrate that the proposed method achieves an average geometric correction error of less than three pixels and yields a spatial distribution of the composite index that closely aligns with the actual vegetation conditions. Moreover, the difference rate between sparse and dense vegetation exceeded 90% across all rivers, indicating a strong discriminative capability and temporal sensitivity. Overall, this method is well-suited for the multiregional and multitemporal monitoring of riparian vegetation and offers a reliable quantitative tool for water environment management and ecological assessment.

1. Introduction

Riparian vegetation is a fundamental component of riverine ecosystems that provides essential habitats and ecological corridors for aquatic organisms, supports biodiversity and water quality, and directly influences hydraulic conditions and drainage capacity by altering channel roughness coefficients [1,2]. During flood events, excessive vegetation growth can obstruct the main flow path, increasing hydraulic resistance and consequently reducing flood conveyance efficiency while exacerbating the risk of inland inundation [3,4,5]. Therefore, the timely acquisition of accurate data regarding spatial distribution, vegetation type, and biomass variation is essential for effective flood prevention, disaster mitigation, and integrated river management [6,7,8,9].
The intensification and increased frequency of extreme precipitation events driven by global climate change [10,11,12] have increased the risk of flood disasters. Moreover, depopulation and industrial decline in the rural and mountainous regions of Japan have led to shortages of manpower and funding for routine river inspections and maintenance [13,14]. Although the National Survey on Rivers and Waterside Environments, initiated in 1990, has served as a national data source for river management, its five-year survey interval fails to capture the seasonal and interannual dynamics of vegetation [15]. In addition, because of the limited spatial coverage and low temporal resolution of ground-based survey methods, they fall short of meeting the governance demand for rapid, dynamic, and three-dimensional monitoring of riparian vegetation. Against this backdrop, remote sensing has emerged as a powerful alternative that offers broad spatial coverage, high revisit frequency, and timely data updates to support vegetation monitoring and river management applications [16,17,18,19,20].
Optical satellite imagery, which captures reflectance in the visible to near-infrared spectra, has been widely used to monitor riparian vegetation because it can be used to assess chlorophyll activity [21,22]. Among the optical indices, the normalized difference vegetation index (NDVI) is particularly effective for quantifying vegetation cover and tracking spatiotemporal dynamics. For example, Rivas-Fandiño et al. [23] constructed a riparian zone quality index using high-resolution WorldView-2 NDVI data and achieved a classification accuracy of 92%, thus demonstrating the effectiveness of fine-resolution imagery in assessing riparian ecological quality. Yousef et al. [24] analyzed NDVI time-series data from 1992 to 2011 and found no significant degradation in vegetation health during the spring across 17 agricultural riparian zones in the Midwest US, thereby confirming the utility of the NDVI for assessing vegetation resilience under agricultural pressure. Nagler et al. [25] linked Landsat-derived NDVI data with flood history to estimate the evapotranspiration demand and ecological water supply thresholds for red gum forests in Australia’s Murray–Darling Basin, thereby providing a quantitative foundation for environmental flow management. Fu and Burgher [26] used a 23-year NDVI dataset along with regression tree modeling to reveal the stratified response of riparian vegetation to temperature, precipitation, flooding, and groundwater depth in the semi-arid Namoi River Basin. These studies collectively affirmed the reliability and general applicability of the NDVI for characterizing vegetation cover, health trends, and coupled hydrological–anthropogenic influences in riparian zones. However, satellite optical imagery lacks vertical observational capacity and is limited in its ability to capture the density and height of the upper canopy vegetation [27,28].
In contrast, synthetic aperture radar (SAR) offers all-weather day-and-night imaging capabilities and provides stable backscatter signals, even under cloudy or rainy conditions. SAR is particularly sensitive to the vertical structure of vegetation [29,30]. Rüetschi et al. [31] used multitemporal Sentinel-1 C-band VV/VH polarization data to monitor phenological changes in temperate mixed forests in northern Switzerland. They identified distinct seasonal backscatter patterns between deciduous and coniferous forests, with a forest-type classification accuracy of 86%, thereby highlighting the sensitivity of C-band SAR to canopy structure and seasonal dynamics. Townsend [32] employed a thresholding method on a time series of 11 Radarsat-1 scenes to detect surface inundation beneath forest canopies along North Carolina’s Roanoke River and achieved an overall classification accuracy of 93.5%, thus demonstrating the ability of SAR to penetrate vegetation under both leafy and leaf-off conditions. Lang et al. [33] systematically evaluated the backscatter response of three forest types using C-HH data across 23.5–47° incidence angles. They identified interactions between the incidence angle and forest structure that influence vegetation–ground separability and offered a quantitative basis for parameter optimization. Pan et al. [34] incorporated Landsat NDVI data into a Sentinel-1A interferometric decorrelation model and established a bilinear relationship between the NDVI and VV/VH coherence, thereby enabling a priori estimates of vegetation-induced decorrelation. Nevertheless, SAR data continue to face limitations, including speckle noise, complexity of land cover interpretations, and reduced sensitivity to low-stature grasslands. These limitations hinder the ability of SAR to comprehensively characterize diverse vegetation types when used in isolation [35,36,37].
Given these limitations and complementary strengths, this study aimed to develop a remote sensing-based approach for monitoring riparian vegetation by integrating optical and SAR data. The method was validated using multiyear satellite observations across multiple river basins. First, high-accuracy geometric correction was performed using stable ground control points (e.g., bridges, ports, and road intersections) to ensure spatial alignment between the different sensor platforms. Second, satellite images and river environment maps for the Kuji River were used to analyze the distribution patterns and correlations of NDVI and SAR backscatter coefficients for different vegetation types. The results confirmed the complementary roles of the NDVI in capturing horizontal photosynthetic activity and SAR in detecting vertical vegetation structure. Building on these complementary characteristics, we propose a nonlinear multiplicative formulation-based composite index that combines horizontal and vertical vegetation information into a physically interpretable metric. Comparative analysis of roughness coefficients and statistical evaluations across different vegetation types revealed that the index effectively represents both structural and functional vegetation characteristics. Finally, the proposed method was applied to multiyear datasets from representative rivers and quantitatively assessed using a difference rate, and the results revealed that the index can distinguish vegetation types and densities, thus highlighting its robustness in capturing spatial and temporal trends across diverse riparian systems.

2. Methods

The riparian vegetation monitoring method proposed in this study consisted of two principal components: image preprocessing and index construction. The workflow is illustrated in Figure 1. Sentinel-1 SAR data and Sentinel-2A/2B optical images were used to obtain spatial information on the target riverine areas. High-resolution aerial imagery from Google Earth Pro was used as the geometric reference layer for the image alignment. Geometric correction was performed by selecting ground control points (GCPs) and applying a third-order polynomial coordinate transformation [38,39]. The root mean square error (RMSE) was used to evaluate the spatial alignment accuracy.
After completing the geometric registration, the NDVI was extracted from the optical imagery and fused with the backscatter coefficient derived from the SAR data to construct a composite index representing vegetation conditions within the riparian zones.

2.1. Data Sources

This study primarily focused on the Kuji River in Ibaraki Prefecture, Japan, and utilized remote sensing imagery from 2017 to 2021 for detailed analysis. The Kuji River originates on Mount Yamizo, located at the border of Fukushima, Tochigi, and Ibaraki prefectures. The main stream extends approximately 124 km, with a catchment area of approximately 1490 km². The upper reaches are characterized by mountainous gorges with narrow valley plains, where forests and bamboo groves dominate, interspersed with paddy fields and settlements. The middle reaches flow through a gorge between the Yamizo and Abukuma mountain ranges that features meandering channels, gravel beds, and alternating riffle–pool sequences, with grasslands, farmlands, and riparian woodlands distributed along the banks. The lower reaches enter an alluvial plain, where the river channel widens and forms sandbars. Reeds and bamboo groves are common on the floodplains, while the surrounding areas consist of a mixture of farmland and urban settlements. Overall, land use in the Kuji River Basin is dominated by forested areas (approximately 85%), followed by agricultural land (approximately 14%) and urban areas (less than 1%). This gradual transition from mountainous to plain topography, together with the diverse vegetation coverage, provides an ideal setting for examining the responses of the NDVI and SAR backscatter under different environmental conditions (Figure 2a).
To further evaluate the generalizability of the proposed method across different regions and periods, eight additional representative rivers across Japan were selected: Naka, Tone, Edo, Sagami, Yoshino, Shimanto, Chikugo, and Kuma Rivers. The basins of these rivers cover a wide range of environmental contexts, such as the plain-type, mountainous, and urbanized basins, thereby providing diverse conditions for validating the applicability of the proposed index (Figure 2b–d and Table 1). For each river, imagery from two distinct time points was used to establish a multiregional validation framework. The target areas for all rivers corresponded to segments covered by official river environment maps, which facilitated consistent classification and subsequent validation.
Satellite data were obtained from the Copernicus Program of the European Space Agency. Sentinel-1 SAR images with a spatial resolution of 5 m were used to extract backscatter coefficients under all-weather, day-and-night conditions. Optical imagery from Sentinel-2A and 2B, with a 10-m resolution and 13 spectral bands, was used to calculate the NDVI. High-resolution aerial imagery from Google Earth Pro was used as the geometric reference layer for image correction. Additionally, river environment maps provided by the Ministry of Land, Infrastructure, Transport and Tourism, which contain polygon-based statistical classifications of vegetation domains, were used as auxiliary reference data to distinguish forested areas from grasslands and to interpret sparse versus dense vegetation conditions, thereby supporting the validation of our method. These maps are polygon-based datasets that integrate vegetation distribution with river features, such as rapids, tributaries, and revetments. All preprocessing, geometric correction, and data integration were conducted using ENVI 5.6 and ArcGIS Pro 3.3 software.

2.2. Image Preprocessing

To ensure spatial consistency between the different types of remote sensing imagery, geometric correction was applied to both the Sentinel-1 and Sentinel-2 datasets prior to index calculation. The overall correction workflow is illustrated in Figure 1. Specifically, the Sentinel-1 imagery used in this study was acquired in Interferometric Wide Swath mode and provided as single-look complex data with VV polarization. The images were subjected to a series of preprocessing steps, including border subset extraction, multilooking, image co-registration, and speckle noise suppression. Following preprocessing, a backscatter intensity image was generated and subsequently subjected to geometric correction. For the Sentinel-2 data, Level-2A products with atmospheric correction were utilized. After unifying the coordinate reference systems, a geometric alignment was performed directly using multispectral imagery as the registration target.
During the geometric correction process, high-resolution aerial images provided by Google Earth Pro were used as the spatial reference layer. GCPs were manually identified for image registration. Following a comparative analysis of several coordinate transformation models, a third-order polynomial transformation was selected for the final alignment because of its superior performance in preserving the spatial accuracy [38,39]. The registration accuracy was evaluated using the RMSE, which was calculated as follows:
$$\mathrm{RMSE}_{x,y} = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left[(x_i - x_0)^2 + (y_i - y_0)^2\right]}$$
where $(x_i, y_i)$ are the measured coordinates of the $i$-th GCP, $(x_0, y_0)$ are the corresponding true (reference) coordinates, and $n$ is the total number of GCPs.
Although slight variations in the RMSE were observed among the different segments, the average registration error remained within three pixels across all areas. The corrected images were considered sufficiently accurate for subsequent index calculations and classification analyses.
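To make the correction and accuracy-assessment steps above concrete, the following is a minimal Python/NumPy sketch, not the ENVI/ArcGIS Pro workflow actually used in this study: it fits a third-order polynomial mapping from uncorrected image coordinates to reference coordinates using GCP pairs and reports the RMSE defined above. The GCP values and array names are hypothetical.

```python
import numpy as np

def poly3_design(x, y):
    """Design matrix of the 10 monomials x^i * y^j with i + j <= 3."""
    return np.column_stack([
        np.ones_like(x), x, y, x * y, x**2, y**2,
        x**2 * y, x * y**2, x**3, y**3,
    ])

def fit_poly3(src_xy, ref_xy):
    """Least-squares third-order polynomial warp from source to reference coordinates.

    At least 10 well-distributed GCPs are required; the study used more (Tables 2 and 3).
    """
    A = poly3_design(src_xy[:, 0], src_xy[:, 1])
    coef_x, *_ = np.linalg.lstsq(A, ref_xy[:, 0], rcond=None)
    coef_y, *_ = np.linalg.lstsq(A, ref_xy[:, 1], rcond=None)
    return coef_x, coef_y

def apply_poly3(src_xy, coef_x, coef_y):
    A = poly3_design(src_xy[:, 0], src_xy[:, 1])
    return np.column_stack([A @ coef_x, A @ coef_y])

def rmse(pred_xy, ref_xy):
    """RMSE over the GCPs, as in the equation above (units: pixels)."""
    return np.sqrt(np.mean(np.sum((pred_xy - ref_xy) ** 2, axis=1)))

# Hypothetical GCP pairs: (column, row) in the uncorrected image vs. the reference layer.
rng = np.random.default_rng(0)
src = rng.uniform(0, 500, size=(14, 2))
ref = src + rng.normal(0.0, 1.5, src.shape)   # stand-in "true" positions

cx, cy = fit_poly3(src, ref)
print("RMSE [pixels]:", rmse(apply_poly3(src, cx, cy), ref))
```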

2.3. Index Construction

Two types of satellite-derived features were used in this study to characterize riverine vegetation: backscatter coefficients extracted from SAR images, and the NDVI derived from multispectral optical imagery. As illustrated in Figure 1, these features were extracted from the preprocessed Sentinel-1 and Sentinel-2 images and subsequently integrated to construct a composite vegetation index.
The backscatter coefficient (σ0) derived from C-band Sentinel-1 SAR imagery is highly sensitive to vertical surface structures, particularly complex vegetation formations, such as forest canopies. After preprocessing, the surface backscatter coefficient was extracted using the SARscape module in ENVI and converted into decibel units to obtain a calibrated backscatter intensity image for subsequent index computation and analysis. Previous studies have demonstrated that C-band radar signals show significant correlations with vegetation parameters such as height, leaf area index, and branch–leaf density. When volume scattering within the canopy is enhanced, the backscatter intensity effectively reflects differences in vertical vegetation structure [40,41,42]. Considering the typical vegetation distribution in riparian environments—where tall and dense vegetation such as trees and bamboo stands contrast with low herbaceous cover—this study interprets SAR backscatter as a proxy for vertical vegetation density.
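As a point of reference for the decibel conversion mentioned above (the actual extraction was performed in the SARscape module of ENVI), a short NumPy sketch with hypothetical calibrated linear-power values:

```python
import numpy as np

# Hypothetical calibrated backscatter values in linear power units.
sigma0_linear = np.array([0.012, 0.05, 0.20])

# Standard power-to-decibel conversion; natural surfaces typically yield negative dB values.
sigma0_db = 10.0 * np.log10(sigma0_linear)
print(sigma0_db)  # roughly [-19.2, -13.0, -7.0] dB
```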
The NDVI was calculated from Sentinel-2 optical data using the red (Red) and near-infrared (NIR) spectral bands. This index represents photosynthetic activity and horizontal vegetation coverage [43]. The NDVI was computed for each pixel using the following standard formula:
$$NDVI = \frac{NIR - Red}{NIR + Red}$$
where $NIR$ and $Red$ are the reflectance values of the near-infrared and red bands, respectively. The NDVI ranges from −1 to 1, with higher positive values indicating more vigorous vegetation. In this study, the NDVI was used to delineate vegetated areas and evaluate spatial distribution patterns.
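The NDVI computation is straightforward; a minimal, hypothetical sketch is given below, assuming co-registered Sentinel-2 band 4 (red) and band 8 (NIR) surface-reflectance arrays of identical shape:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Per-pixel NDVI; eps guards against division by zero over water or shadow."""
    nir = nir.astype(np.float32)
    red = red.astype(np.float32)
    return (nir - red) / (nir + red + eps)

# Hypothetical surface-reflectance values (Sentinel-2 B8 = NIR, B4 = red).
nir = np.array([[0.45, 0.30], [0.10, 0.38]])
red = np.array([[0.08, 0.12], [0.09, 0.07]])
print(ndvi(nir, red))  # denser vegetation approaches 1; water and bare soil stay near or below 0
```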
Based on this framework, this study proposes a composite vegetation index that integrates NDVI and SAR backscatter data to more comprehensively represent the structural density of riparian vegetation. As illustrated in Figure 1, after geometric correction, the negative values of the backscatter coefficient (σ0) were multiplied by −1 and subsequently inverted to preserve their original variation trends while converting them into positive weights. The resulting values were then multiplied by the corresponding NDVI at each pixel location. To address the magnitude difference between the NDVI and transformed SAR backscatter values, the constant scaling factor a (set to 10 in this study) was further applied. This coefficient serves only as a linear amplification factor to improve readability for visualization and interpretation and does not alter the relative relationships among index values or affect the discrimination between vegetation types. This procedure produced a new image in which each pixel value represented the product of the NDVI and the transformed SAR backscatter at the same location. The composite index was defined as follows:
$$I_i = a \cdot NDVI_i \times \frac{1}{-\sigma_i}$$
where $I_i$ denotes the composite vegetation index value at pixel $i$, $NDVI_i$ is the normalized difference vegetation index at pixel $i$, $\sigma_i$ is the original (negative, decibel-scale) SAR backscatter coefficient at pixel $i$, and $a$ is a positive scaling constant introduced to enhance the contrast between regions. This composite index was used to represent the spatial distribution of vegetation density within the riverine areas and served as a foundational metric for subsequent analysis.
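The per-pixel composition can be summarized by the following minimal sketch, assuming co-registered NDVI and decibel-scale backscatter grids of the same shape and negative dB values typical of vegetated surfaces; the negation/inversion step and the scaling constant a = 10 follow the description above, and the sample values are hypothetical:

```python
import numpy as np

def composite_index(ndvi, sigma0_db, a=10.0):
    """I_i = a * NDVI_i * 1 / (-sigma0_i), with sigma0 given in (negative) decibels.

    Negating and then inverting the backscatter preserves its trend:
    stronger backscatter (e.g., -8 dB forest) yields a larger weight than
    weaker backscatter (e.g., -18 dB winter grassland).
    """
    weight = 1.0 / (-sigma0_db)
    return a * ndvi * weight

# Hypothetical co-registered pixels: [dense forest, winter grassland, bare gravel]
ndvi_px  = np.array([0.62, 0.35, 0.05])
sigma_px = np.array([-8.0, -16.0, -20.0])
print(composite_index(ndvi_px, sigma_px))  # forest separates clearly from the other classes
```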

3. Verification of Method Reliability

This section verifies the reliability of the proposed vegetation index based on three aspects: (1) assessment of geometric correction and registration accuracy during preprocessing, (2) analysis of the correlation between the base variables (NDVI and SAR backscatter) and vegetation distribution, and (3) evaluation of the effectiveness of the index in differentiating vegetation types. Verification was primarily conducted using data from the Kuji River, allowing for a detailed examination of the spatial consistency, variable behavior, and representational validity of the proposed index in riverine environments.

3.1. Geometric Correction and Registration Accuracy

To ensure spatial consistency between the Sentinel-1 SAR and Sentinel-2 optical images, geometric correction was applied to both datasets using high-resolution aerial imagery from Google Earth Pro as the base reference layer. GCPs were distributed evenly across the study area, and points presenting clear, temporally stable features, such as large buildings, bridges, and port terminals, were prioritized so that individual point errors could be controlled to within three pixels. The number of GCPs used for each image is listed in Table 2 and Table 3.
For the Sentinel-1 SAR images, the absence of color and the use of a single C-band channel made the identification of GCPs more challenging. However, as illustrated in Figure 3, features with strong backscatter, such as large man-made structures, were identifiable even in the radar imagery. As shown in Table 2, the RMSE across the multiyear images remained below 3.0 pixels, indicating that the overall correction accuracy was within acceptable bounds for the analysis of riverbank vegetation areas. Figure 4 shows the geometrically corrected SAR images for January 2017 and January 2021.
In contrast, Sentinel-2 optical images, which have a 10-m resolution and multiple spectral bands, offered a clearer distinction of ground objects. Even in scenes with partial cloud cover, precise GCPs could be selected based on spatially fixed features, thereby enabling reliable registration. The number of GCPs and the RMSE for these images are summarized in Table 3. Figure 5 shows the geometrically corrected optical images for the same dates.
All the corrections were performed using third-order polynomial transformations. The results confirmed that the registration accuracy for both SAR and optical data was sufficient for subsequent feature extraction and index computation.

3.2. Relationship Between NDVI/SAR and Vegetation

To verify the suitability of the NDVI and SAR backscatter coefficients as indicators of vegetation conditions in riverine environments, representative samples were extracted from the upper, middle, and lower reaches of the Kuji River. Vegetation classifications, namely grasslands and forests, were based on a 2017 River Environment Map derived from the National Survey on the Natural Environment of River Zones. Within each segment, sampling points were randomly selected from areas exhibiting either sparse or dense vegetation, as interpreted from optical satellite imagery (Figure 6, Figure 7 and Figure 8).
Backscatter coefficients were extracted from the Sentinel-1 images, whereas the NDVI was computed from the corresponding Sentinel-2 multispectral images for January 2017 and January 2021. These datasets enabled a temporal comparison of the radiometric responses of different vegetation types under varying density conditions.
As summarized in Figure 9, the average SAR backscatter coefficient decreased with decreasing vegetation density for both grassland and forest samples. This pattern was consistent across the years. Moreover, forests exhibited significantly higher backscatter values than grasslands, which was attributable to their greater vertical structural complexity and stronger radar wave interactions. In contrast, grasslands tended to show smaller variations in backscatter owing to their relatively flat and homogeneous surface properties. The consistency across years further highlights the stability of SAR data in capturing structural vegetation characteristics, regardless of seasonal effects.
Figure 10 shows the corresponding NDVI statistics. Forests demonstrated a clear distinction between dense and sparse vegetation zones, with higher NDVI values recorded in denser areas. However, this distinction was less pronounced in the grasslands. This may have been due to spatial heterogeneity in the selected sampling zones, particularly in the downstream region, where vegetation conditions are more uniform during the winter season. Additionally, the NDVI values for both vegetation types were higher in 2017 than in 2021, which was likely influenced by interannual variations in weather and vegetation growth conditions.
These results confirm that both the NDVI and SAR backscatter reflect relevant aspects of vegetation distribution in riverine areas. In particular, forests showed strong differentiation in both indices, supporting their use as baseline variables in the subsequent development of vegetation quantification indices.

3.3. Validation of the Proposed Method

To evaluate the applicability of the proposed riparian vegetation monitoring method, a comparative analysis was conducted using grassland and forest samples from the upper, middle, and lower reaches of the Kuji River. Table 4 summarizes the mean index values calculated for both vegetation types in 2017 and 2021.
In general, the index values for forests were consistently higher than those for grasslands across all regions, suggesting the basic capacity of the index to differentiate between vegetation types. In addition, the forest index values showed a decreasing trend from upstream to downstream. Figure 11, Figure 12 and Figure 13 illustrate the representative spatial distributions of the index for each river segment. High-value clusters were observed in the upstream region, corresponding to areas with dense forest cover. In contrast, the downstream areas displayed lower and more dispersed index values, a pattern that aligned well with the observations from optical imagery.
Grassland index values were notably lower than those of forests in all segments and exhibited minimal variation between regions. This outcome reflects the relatively uniform spatial characteristics of the grassland vegetation during the winter observation period. When comparing the two observation years, a slight decrease in index values was found in 2021 relative to 2017, which is consistent with the reductions in the vegetated areas observed in the satellite imagery.
In addition, we compared the average index values with the dimensionless roughness coefficients of three vegetation types (tall trees, medium-to-low shrubs, and turfgrass), as shown in Figure 14 and Figure 15. The dimensionless roughness coefficient follows the definition of the Kerby roughness coefficient [44], which is a dimensionless parameter representing the degree of resistance to water flow; denser vegetation cover produces greater resistance and hence higher coefficient values. In this study, the assignment of roughness coefficients was based on a combination of parameter settings from previous hydrodynamic and riparian vegetation resistance studies (e.g., tall trees were assigned the highest class, while turfgrass was assigned the lowest class) and the interpretation of vegetation height and density. The comparison was conducted by calculating the mean index value for each vegetation category and matching it with the corresponding roughness coefficient level.
It should be noted that this study only considered three vegetation types that could be clearly identified from satellite imagery and set the roughness coefficients using a preliminary classification scheme. Therefore, it was not possible to distinguish subtle differences among various vegetation species. Nevertheless, under these conditions, the mean index values of different vegetation types still exhibited an approximately linear distribution against the roughness coefficients, indicating a certain degree of correlation. This suggests that the proposed index can capture variations in vegetation density and structural complexity to some extent, with particularly strong performance in forested areas.
Because winter imagery was used in this study, turfgrass in some areas was difficult to distinguish from barren land, which reduced the clarity of the correlation. In addition, the river environment map did not provide a separate classification for medium-to-low shrubs; therefore, this vegetation type was extracted only through the interpretation of optical imagery, which introduced certain limitations. To enhance the reliability and applicability of the index, we plan to integrate more refined normalization methods, contemporaneous field vegetation surveys, and more detailed roughness coefficient datasets to conduct a more systematic analysis of the relationship between the index and roughness coefficients across different vegetation types.

4. Generalizability Assessment Across Multiple Rivers

This section aims to validate the applicability and discriminative capabilities of the proposed riparian vegetation monitoring method for various river conditions, vegetation types, and temporal backgrounds.

4.1. Distribution Characteristics of the Method in Representative Regions

To examine the applicability of the proposed riparian vegetation monitoring method across different watershed conditions, representative results from the upstream, midstream, and downstream areas of the Chikugo River are illustrated in Figure 16, Figure 17 and Figure 18.
As shown in Figure 16, the upstream area generally displayed high index values, with contiguous red and orange zones indicating dense vegetation coverage. These high-value areas corresponded well with the forest-dense zones in the optical images, suggesting a favorable response of the index to dense vegetation regions. In the midstream region (Figure 17), the index values exhibited more dispersed patterns with intermingled red, yellow, and green areas. This reflected a local mixture of grassland and forest cover, a pattern that was also confirmed by the heterogeneity observed in the comparison images. Figure 18 presents the downstream area, where the index values were generally lower and dominated by blue and green tones, indicating an overall sparse vegetation trend. In particular, the 2021 images showed further reductions in the index values in some areas, thus aligning with the predominantly grassy and sparsely forested landscapes observed in the optical images.
From a temporal perspective, a declining trend in the index values was observed in the mid- and downstream regions over successive years, which is consistent with the vegetation reduction visible in the optical imagery. This indicates that the method is capable of capturing interannual differences in vegetation status under the same seasonal conditions and thus exhibits a certain degree of temporal responsiveness. In this context, temporal responsiveness refers specifically to variations between different years within the same season and not differences among seasons. Given that riparian vegetation exhibits pronounced seasonal characteristics that influence both the NDVI and SAR signals, future research will incorporate multi-seasonal imagery to further examine the stability and applicability of the method under dynamic seasonal conditions.

4.2. Validation of the Method’s Discriminative Capability

To evaluate whether the proposed riparian vegetation monitoring method can effectively distinguish between regions of differing vegetation volume, a "difference rate" is introduced as a quantitative measure of the overlap between the index values of two contrasting regions. The difference rate is defined as follows:
$$\mathrm{Difference\ Rate} = \left(1 - \frac{D_0}{D_1 + D_2 - D_0}\right) \times 100\%$$
where $D_1$ denotes the number of pixels in Region 1, $D_2$ is the number of pixels in Region 2, and $D_0$ is the number of pixels whose index values overlap between the two regions. A larger difference rate indicates a greater ability to distinguish between the two regions.
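How the overlap $D_0$ is counted is not restated here in full; one plausible reading, sketched below under the assumption that $D_0$ is the per-bin minimum of the two regions' index-value histograms (with a hypothetical bin width), is the following:

```python
import numpy as np

def difference_rate(values1, values2, bin_width=0.05):
    """Difference rate (%) between the index-value distributions of two regions."""
    lo = min(values1.min(), values2.min())
    hi = max(values1.max(), values2.max())
    bins = np.arange(lo, hi + bin_width, bin_width)
    h1, _ = np.histogram(values1, bins=bins)
    h2, _ = np.histogram(values2, bins=bins)
    d0 = np.minimum(h1, h2).sum()          # pixels whose index values overlap
    d1, d2 = len(values1), len(values2)
    return (1.0 - d0 / (d1 + d2 - d0)) * 100.0

# Hypothetical index samples from a dense and a sparse vegetation zone.
rng = np.random.default_rng(1)
dense = rng.normal(0.7, 0.08, 2000)
sparse = rng.normal(0.2, 0.06, 2000)
print(f"difference rate: {difference_rate(dense, sparse):.1f}%")  # near 100% for well-separated zones
```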

4.2.1. Validation Based on Vegetation Domains in River Environment Maps

Adjacent grassland and forest areas identified in the river environment maps were selected; the index values were extracted separately from each vegetation domain, and their numerical distributions were analyzed. Figure 19 illustrates the distribution curves for two representative rivers (the Yoshino and Edo Rivers in 2016). Table 5 summarizes the difference rates for each case. Most rivers exhibited values ranging from 50% to 70%, with the highest being 94.9% for the Yoshino River and the lowest being 42.2% for the Edo River. Further verification using optical imagery (Figure 20) confirmed that regions with higher difference rates corresponded to substantial differences in vegetation volume.

4.2.2. Validation Based on Interpreted Optical Imagery

To further test the generalizability of this method, regions with clearly differing vegetation volumes, such as zones dominated by tall trees versus low grasses, were selected based on visual interpretation of optical satellite images. The index values were extracted from both the dense and sparse vegetation zones. The results demonstrate that, in all river and year combinations, the difference rate exceeded 90%, indicating that the index possesses a strong discriminative capability in scenarios where there are substantial differences in vegetation volume. Table 6 presents the difference rates of the index values in each zone, while Table 7 lists their mean and standard deviation. Figure 21 shows a representative example of the index value distributions. A distinct bimodal distribution was observed, with minimal overlap between the high and low vegetation volume zones.
In addition to the difference rate analysis, we further performed statistical significance testing to validate whether the observed differences between sparse and dense vegetation zones were consistent across rivers and years. As summarized in Table 7, two-sample t-tests confirmed that all differences were statistically significant (p < 0.01). These results indicate that in addition to the intuitive distributional separation captured by the difference rate, the index differences are also supported by conventional statistical testing. Taken together, the difference rate highlights the overall separation of distributions in a spatially explicit manner, while the t-test establishes the statistical reliability of these differences, thereby providing complementary evidence to strengthen the robustness of our conclusions.
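For reference, a minimal sketch of such a two-sample comparison with SciPy is shown below; the use of Welch's correction (equal_var=False) and the sample values are assumptions for illustration only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
dense_idx = rng.normal(0.7, 0.08, 500)    # hypothetical composite-index values, dense zone
sparse_idx = rng.normal(0.2, 0.06, 500)   # hypothetical composite-index values, sparse zone

# Two-sample t-test on the index values of the two zones.
t_stat, p_value = stats.ttest_ind(dense_idx, sparse_idx, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3g}")  # p < 0.01 indicates a statistically significant difference
```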

4.3. Limitations and Future Work

Although the proposed optical–radar composite index demonstrated good applicability and robustness in riparian vegetation monitoring, several limitations remain that require further improvement.
(1)
In terms of data, this study mainly relied on Sentinel-1 C-band VV polarization imagery and Sentinel-2 optical imagery, which offer advantages of temporal continuity and broad spatial coverage. However, other polarization modes (such as VH) and radar bands (such as L-band and X-band) were not explored despite providing complementary sensitivity to the vegetation structure and moisture content. Future work could systematically compare the performance of different polarizations and radar bands and incorporate higher-resolution optical satellite imagery as auxiliary validation data to enhance the applicability of the proposed method across different spatial scales and heterogeneous environments.
(2)
In terms of methodology, the composite index was constructed using a nonlinear multiplicative formulation that integrates NDVI and SAR backscatter data, thereby capturing both horizontal vegetation coverage and vertical structural characteristics. Nevertheless, cross-year or cross-region normalization was not applied, and differences in radiometric conditions and environmental backgrounds across years may affect the comparability of the results [45,46]. Future studies should explore normalization and standardization strategies to improve the robustness and transferability of the index across temporal and spatial domains.
(3)
In terms of validation, this study mainly relied on river environment maps and optical image interpretation to differentiate vegetation types and used the difference rate and statistical tests to evaluate the effectiveness. While this approach provides some indication of the method’s discriminative capacity, it still has limitations. First, validation depends on indirect interpretation of two-dimensional imagery, which introduces uncertainty in structurally complex or mixed vegetation areas. Second, direct comparisons with field survey data were not conducted, which increased the difficulty of establishing quantitative relationships between index values and actual vegetation parameters, such as biomass or canopy height. Third, the analysis was limited to winter imagery, meaning that the applicability was not tested under other seasonal conditions. Future research should combine field surveys or unmanned aerial vehicle-based data to improve accuracy and incorporate LiDAR or aerial photogrammetry as three-dimensional observations to directly link the index with vegetation structural parameters. In addition, high-resolution optical imagery may be used as auxiliary reference data and the composite index could be introduced as an input feature in machine learning models for comparison with other features or extended to larger-scale applications through population-based deep learning approaches.

5. Conclusions

This study proposes a riparian vegetation monitoring method that integrates optical and radar remote sensing data. By constructing a composite index based on the NDVI and backscatter intensity, this method enables a quantitative characterization of riparian vegetation across multiple rivers and time periods. The methodology comprises precise geometric correction of remote sensing imagery, design of a composite index computation workflow, and multilevel validation of the applicability and discriminative capacity using both river environment maps and optical imagery. A series of analyses conducted on eight representative rivers, including the Chikugo and Kuji Rivers, confirmed the robustness and practical utility of the proposed approach under complex topographical and vegetative conditions.
  • By identifying geodetically stable features such as bridges, ports, and road intersections as GCP candidates, reliable image registration was achieved, with the RMSE maintained within three pixels for both the SAR and optical imagery, thereby ensuring a sound foundation for large-scale regional analysis.
  • Using river environment maps and satellite data from the Kuji River and seven other major rivers, a quantitative investigation of the NDVI and backscatter characteristics of grassland and forested areas was conducted. The results confirmed the complementary relationship between the NDVI and backscatter intensity in capturing horizontal vegetation activity and vertical structural density.
  • The proposed method, constructed by integrating the NDVI and backscatter intensity, exhibited a strong spatial correspondence with the actual vegetation distribution. Its capacity to distinguish between vegetation types and densities was verified through preliminary comparisons with dimensionless roughness coefficients and quantitative evaluation using the difference rate.
  • A comparative analysis across different catchments and years demonstrates the generalizability of the method for reflecting temporal trends in vegetation quantity. However, to ensure consistent applicability across the temporal and spatial domains, normalization techniques must be incorporated to account for interannual variability and regional heterogeneity.
In summary, the proposed riparian vegetation monitoring method provides a stable and quantitative representation of riparian vegetation distribution and variability. This offers a viable tool for remote sensing-based water environment management and ecological assessment and lays a foundation for broader applications in riparian ecosystem monitoring.

Author Contributions

H.L., Conceptualization, Methodology, Validation, Writing—Original Draft; H.K., Methodology, Validation, Writing—Review and Editing; Y.S., Methodology, Validation; Y.K., Funding Acquisition, Supervision, Project Administration, Writing—Review and Editing. All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported by the Environmental Research and Technology Development Fund (S-18-3(3), JPMEERF20S11813) of the Environmental Restoration and Conservation Agency of Japan.

Data Availability Statement

Satellite data were downloaded from https://browser.dataspace.copernicus.eu/ (accessed on 18 September 2025), and the river environment maps were downloaded from https://www.nilim.go.jp/lab/fbg/ksnkankyo/ (accessed on 18 September 2025).

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Gurnell, A. Plants as river system engineers. Earth Surf. Process. Landf. 2014, 39, 4–25. [Google Scholar] [CrossRef]
  2. Gurnell, A.M.; Bertoldi, W. Extending the conceptual model of river island development to incorporate different tree species and environmental conditions. River Res. Appl. 2020, 36, 1730–1747. [Google Scholar] [CrossRef]
  3. Västilä, K.; Järvelä, J. Modeling the flow resistance of woody vegetation using physically based properties of the foliage and stem. Water Resour. Res. 2014, 50, 229–245. [Google Scholar] [CrossRef]
  4. Clark, S.D.A.; Cooper, J.R.; Rameshwaran, P.; Naden, P.; Li, M.; Hooke, J. Modelling river flow through in-stream natural vegetation for a gravel-bed river reach. In Recent Trends in Environmental Hydraulics: 38th International School of Hydraulics; Kalinowska, M.B., Mrokowska, M.M., Rowiński, P.M., Eds.; Springer International Publishing: Cham, Switzerland, 2020; pp. 33–41. [Google Scholar] [CrossRef]
  5. Aberle, J.; Järvelä, J. Flow resistance of emergent rigid and flexible floodplain vegetation. J. Hydraul. Res. 2013, 51, 33–45. [Google Scholar] [CrossRef]
  6. Vermuyten, E.; Meert, P.; Wolfs, V.; Willems, P. Impact of seasonal changes in vegetation on the river model prediction accuracy and real-time flood control performance. J. Flood Risk Manag. 2020, 13, e12651. [Google Scholar] [CrossRef]
  7. Roy, D.P.; Wulder, M.A.; Loveland, T.R.; Ce, W.; Allen, R.G.; Anderson, M.C.; Helder, D.; Irons, J.R.; Johnson, D.M.; Kennedy, R.; et al. Landsat-8: Science and product vision for terrestrial global change research. Remote Sens. Environ. 2014, 145, 154–172. [Google Scholar] [CrossRef]
  8. Densmore, R.V.; Karle, K.F. Flood effects on an Alaskan stream restoration project: The value of long-term monitoring 1. J. Am. Water Resour. Assoc. 2009, 45, 1424–1433. [Google Scholar] [CrossRef]
  9. Dauwalter, D.C.; Fesenmyer, K.A.; Miller, S.W.; Porter, T. Response of riparian vegetation, instream habitat, and aquatic biota to riparian grazing exclosures. N. Am. J. Fish. Manag. 2018, 38, 1187–1200. [Google Scholar] [CrossRef]
  10. Zhou, B.-T.; Qian, J. Changes of weather and climate extremes in the IPCC in AR6. Adv. Clim. Change Res. 2021, 17, 713. [Google Scholar]
  11. Myhre, G.; Alterskjær, K.; Stjern, C.W.; Hodnebrog, Ø.; Marelle, L.; Samset, B.H.; Sillmann, J.; Schaller, N.; Fischer, E.; Schulz, M.; et al. Frequency of extreme precipitation increases extensively with event rareness under global warming. Sci. Rep. 2019, 9, 16063. [Google Scholar] [CrossRef]
  12. Kawase, H.; Nosaka, M.; Watanabe, S.I.; Yamamoto, K.; Shimura, T.; Naka, Y.; Wu, Y.H.; Okachi, H.; Hoshino, T.; Ito, R.; et al. Identifying robust changes of extreme precipitation in Japan from large ensemble 5-km-Grid regional experiments for 4K warming scenario. JGR Atmos. 2023, 128, e2023JD038513. [Google Scholar] [CrossRef]
  13. Aurich, A. Japan’s Forgotten Countryside: Demographic Crisis and Revival Strategies; East-West Center: Honolulu, HI, USA, 2025. [Google Scholar]
  14. Japan International Cooperation Agency (JICA); Nippon Koei Co., Ltd. Japan’s Experience on Water Resources Management; Japan International Cooperation Agency (JICA): Tokyo, Japan, 2022; Available online: https://www.jica.go.jp (accessed on 18 September 2025).
  15. Ministry of Land, Infrastructure, Transport and Tourism; Kanto Regional Development Bureau. National Survey on Rivers and Waterside Environments. Available online: https://www.ktr.mlit.go.jp/river/shihon/river_shihon00000125.html (accessed on 18 September 2025).
  16. Dufour, S.; Rodríguez-González, P.M.; Laslier, M. Tracing the scientific trajectory of riparian vegetation studies: Main topics, approaches and needs in a globally changing world. Sci. Total Environ. 2019, 653, 1168–1185. [Google Scholar] [CrossRef]
  17. Chatrabhuj; Meshram, K.; Mishra, U.; Omar, P.J. Integration of remote sensing data and GIS technologies in river management system. Discov. Geosci. 2024, 2, 67. [Google Scholar] [CrossRef]
  18. Zhao, Q.; Pan, J.; Devlin, A.T.; Tang, M.; Yao, C.; Zamparelli, V.; Falabella, F.; Pepe, A.; Pepe, A. On the exploitation of remote sensing technologies for the monitoring of coastal and river delta regions. Remote Sens. 2022, 14, 2384. [Google Scholar] [CrossRef]
  19. Kerle, N.; van den Homberg, M.J.C. Remote sensing for disaster risk management: Advances and limitations. In Comprehensive Remote Sensing; Elsevier: Amsterdam, The Netherlands, 2024. [Google Scholar] [CrossRef]
  20. Ko, D.W.; Kim, D.; Narantsetseg, A.; Kang, S. Comparison of field- and satellite-based vegetation cover estimation methods. J. Ecol. Environ. 2017, 41, 5. [Google Scholar] [CrossRef]
  21. Redowan, M.; Kanan, A.H. Potentials and limitations of NDVI and other vegetation indices (VIs) for monitoring vegetation parameters from remotely sensed data. Bangladesh Res. Publ. J. 2012, 7, 291–299. [Google Scholar]
  22. Hislop, S.; Soto-Berelov, M.; Jellinek, S.; Chee, Y.E.; Jones, S. Monitoring riparian vegetation in urban areas with Sentinel-2 satellite imagery. Ecol. Manag. Restor. 2025, 26, e12624. [Google Scholar] [CrossRef]
  23. Rivas-Fandiño, P.; Acuña-Alonso, C.; Novo, A.; Pacheco, F.A.L.; Álvarez, X. Assessment of high spatial resolution satellite imagery for monitoring riparian vegetation: Riverine management in the smallholding. Environ. Monit. Assess. 2022, 195, 81. [Google Scholar] [CrossRef] [PubMed]
  24. Yousef, F.; Gebremichael, M.; Ghebremichael, L.; Perine, J. Remote-sensing based assessment of long-term riparian vegetation health in proximity to agricultural lands with herbicide use history. Integr. Environ. Assess. Manag. 2019, 15, 528–543. [Google Scholar] [CrossRef]
  25. Nagler, P.L.; Doody, T.M.; Glenn, E.P.; Jarchow, C.J.; Barreto-Muñoz, A.; Didan, K. Wide-area estimates of evapotranspiration by red gum (Eucalyptus camaldulensis) and associated vegetation in the Murray–Darling River Basin, Australia. Hydrol. Process. 2016, 30, 1376–1387. [Google Scholar] [CrossRef]
  26. Fu, B.; Burgher, I. Riparian vegetation NDVI dynamics and its relationship with climate, surface water and groundwater. J. Arid Environ. 2015, 113, 59–68. [Google Scholar] [CrossRef]
  27. Guo, Y.; Senthilnath, J.; Wu, W.; Zhang, X.; Zeng, Z.; Huang, H. Radiometric calibration for multispectral camera of different imaging conditions mounted on a UAV platform. Sustainability 2019, 11, 978. [Google Scholar] [CrossRef]
  28. Frantz, D. FORCE—Landsat+ Sentinel-2 analysis ready data and beyond. Remote Sens. 2019, 11, 1124. [Google Scholar] [CrossRef]
  29. Dostálová, A.; Milenkovic, M.; Hollaus, M.; Wagner, W. Influence of forest structure on the Sentinel-1 backscatter variation-analysis with full-waveform lidar data. In Proceedings of the Living Planet Symposium, Prague, Czech Republic, 9–13 May 2016; Volume 740. [Google Scholar]
  30. Vreugdenhil, M.; Wagner, W.; Bauer-Marschallinger, B.; Pfeil, I.; Teubner, I.; Rüdiger, C.; Strauss, P. Sensitivity of Sentinel-1 backscatter to vegetation dynamics: An Austrian case study. Remote Sens. 2018, 10, 1396. [Google Scholar] [CrossRef]
  31. Rüetschi, M.; Schaepman, M.E.; Small, D. Using multitemporal sentinel-1 c-band backscatter to monitor phenology and classify deciduous and coniferous forests in northern Switzerland. Remote Sens. 2018, 10, 55. [Google Scholar] [CrossRef]
  32. Townsend, P.A. Mapping seasonal flooding in forested wetlands using multi-temporal Radarsat SAR. Photogramm. Eng. Remote Sens. 2001, 67, 857–864. [Google Scholar]
  33. Lang, M.W.; Townsend, P.A.; Kasischke, E.S. Influence of incidence angle on detecting flooded forests using C-HH synthetic aperture radar data. Remote Sens. Environ. 2008, 112, 3898–3907. [Google Scholar] [CrossRef]
  34. Pan, J.; Zhao, R.; Xu, Z.; Cai, Z.; Yuan, Y. Quantitative estimation of sentinel-1A interferometric decorrelation using vegetation index. Front. Earth Sci. 2022, 10, 1016491. [Google Scholar] [CrossRef]
  35. Tsyganskaya, V.; Martinis, S.; Marzahn, P.; Ludwig, R. SAR-based detection of flooded vegetation–a review of characteristics and approaches. Int. J. Remote Sens. 2018, 39, 2255–2293. [Google Scholar] [CrossRef]
  36. Gašparović, M.; Dobrinić, D. Comparative assessment of machine learning methods for urban vegetation mapping using multitemporal sentinel-1 imagery. Remote Sens. 2020, 12, 1952. [Google Scholar] [CrossRef]
  37. Adeli, S.; Salehi, B.; Mahdianpari, M.; Quackenbush, L.J.; Brisco, B.; Tamiminia, H.; Shaw, S. Wetland monitoring using SAR data: A meta-analysis and comprehensive review. Remote Sens. 2020, 12, 2190. [Google Scholar] [CrossRef]
  38. Dai, X.; Khorram, S. The effects of image misregistration on the accuracy of remotely sensed change detection. IEEE Trans. Geosci. Remote Sens. 1998, 36, 1566–1577. [Google Scholar] [CrossRef]
  39. Zitová, B.; Flusser, J. Image registration methods: A survey. Image Vis. Comput. 2003, 21, 977–1000. [Google Scholar] [CrossRef]
  40. Duguay, Y.; Bernier, M.; Lévesque, E.; Tremblay, B. Potential of C and X Band SAR for Shrub Growth Monitoring in Sub-Arctic Environments. Remote Sens. 2015, 7, 9410–9430. [Google Scholar] [CrossRef]
  41. Bruggisser, M.; Dorigo, W.; Dostálová, A.; Hollaus, M.; Navacchi, C.; Schlaffer, S.; Pfeifer, N. Potential of Sentinel-1 C-Band Time Series to Derive Structural Parameters of Temperate Deciduous Forests. Remote Sens. 2021, 13, 798. [Google Scholar] [CrossRef]
  42. Rosenqvist, A.; Killough, B. A Layman’s Interpretation Guide to L-band and C-band Synthetic Aperture Radar Data, v2.0; Global Forest Observations Initiative: Rome, Italy, 2018. [Google Scholar]
  43. Rouse, J.W., Jr.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring the Vernal Advancement and Retrogradation (Green Wave Effect) of Natural Vegetation (No. NASA-CR-132982); NASA: Washington, DC, USA, 1973. [Google Scholar]
  44. Texas Department of Transportation (TxDOT). The Kerby Method—Time of Concentration. In TxDOT Design Manual: Hydrology; Texas Department of Transportation (TxDOT): Austin, TX, USA, 2023; Chapter 4, Section 11. Available online: https://www.txdot.gov/manuals/des/hyd/chapter-4--hydrology/section-11--time-of-concentration/the-kerby-method.html (accessed on 18 September 2025).
  45. Gastellu-Etchegorry, J.-P.; Wang, Y.; Regaieg, O.; Yin, T.; Malenovský, Z.; Zhen, Z.; Yang, X.; Tao, Z.; Landier, L.; Al Bitar, A.; et al. Why to model remote sensing measurements in 3D? Recent advances in DART: Atmosphere, topography, large landscape, chlorophyll fluorescence and satellite image inversion. In Proceedings of the 2020 5th International Conference on Advanced Technologies for Signal and Image Processing (ATSIP), Sousse, Tunisia, 2–5 September 2020; IEEE: New York, NY, USA, 2020; pp. 1–6. [Google Scholar] [CrossRef]
  46. Hu, Y.; Zhan, H.; He, Q.; Zhan, W. Assessment of Atmospheric Correction Algorithms for Landsat-8/9 Operational Land Imager over Inland and Coastal Waters. Remote Sens. 2025, 17, 3055. [Google Scholar] [CrossRef]
Figure 1. Workflow of the riparian vegetation monitoring method based on synthetic aperture radar (SAR)-normalized difference vegetation index (NDVI) data.
Figure 1. Workflow of the riparian vegetation monitoring method based on synthetic aperture radar (SAR)-normalized difference vegetation index (NDVI) data.
Remotesensing 17 03281 g001
Figure 2. Study area and validation river basins across Japan. (a) Kuji River: core study site, divided into upper, middle, and lower reaches. (bd) Eight additional rivers selected for multiregional validation.
Figure 2. Study area and validation river basins across Japan. (a) Kuji River: core study site, divided into upper, middle, and lower reaches. (bd) Eight additional rivers selected for multiregional validation.
Remotesensing 17 03281 g002
Figure 3. Representative features identified from Sentinel-1 imagery used as interpretation references. (a) Large-scale building structure; (b) bridge structure (example 1); (c) bridge structure (example 2); (d) port terminal area.
Figure 3. Representative features identified from Sentinel-1 imagery used as interpretation references. (a) Large-scale building structure; (b) bridge structure (example 1); (c) bridge structure (example 2); (d) port terminal area.
Remotesensing 17 03281 g003
Figure 4. Sentinel-1 backscatter coefficient images and corresponding ground control point (GCP) distributions for January 2017 (left) and January 2021 (right).
Figure 4. Sentinel-1 backscatter coefficient images and corresponding ground control point (GCP) distributions for January 2017 (left) and January 2021 (right).
Remotesensing 17 03281 g004
Figure 5. Sentinel-2 optical base images and corresponding GCP distributions for January 2017 (left) and January 2021 (right).
Figure 5. Sentinel-2 optical base images and corresponding GCP distributions for January 2017 (left) and January 2021 (right).
Remotesensing 17 03281 g005
Figure 6. Sampling locations of dense and sparse vegetation in forest and grassland areas in the upper reach of the Kuji River.
Figure 6. Sampling locations of dense and sparse vegetation in forest and grassland areas in the upper reach of the Kuji River.
Remotesensing 17 03281 g006
Figure 7. Sampling locations of dense and sparse vegetation in forest and grassland areas in the middle reach of the Kuji River.
Figure 7. Sampling locations of dense and sparse vegetation in forest and grassland areas in the middle reach of the Kuji River.
Remotesensing 17 03281 g007
Figure 8. Sampling locations of dense and sparse vegetation in forest and grassland areas in the lower reach of the Kuji River.
Figure 8. Sampling locations of dense and sparse vegetation in forest and grassland areas in the lower reach of the Kuji River.
Remotesensing 17 03281 g008
Figure 9. Comparison of SAR backscatter coefficients. (a) Average backscatter coefficients by vegetation type and density in 2017 and 2021; (b) distribution of backscatter coefficients for forests in 2017; (c) distribution of backscatter coefficients for forests in 2021; (d) distribution of backscatter coefficients for grassland in 2017; (e) distribution of backscatter coefficients for grassland in 2021.
Figure 9. Comparison of SAR backscatter coefficients. (a) Average backscatter coefficients by vegetation type and density in 2017 and 2021; (b) distribution of backscatter coefficients for forests in 2017; (c) distribution of backscatter coefficients for forests in 2021; (d) distribution of backscatter coefficients for grassland in 2017; (e) distribution of backscatter coefficients for grassland in 2021.
Remotesensing 17 03281 g009
Figure 10. Comparison of NDVI results. (a) Average NDVI by vegetation type and density in 2017 and 2021; (b) distribution of NDVI values for forests in 2017; (c) distribution of NDVI values for forests in 2021; (d) distribution of NDVI values for grassland in 2017; (e) distribution of NDVI values for grassland in 2021.
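The NDVI values compared in Figure 10 follow the standard definition based on near-infrared and red reflectance (Sentinel-2 bands 8 and 4). A minimal sketch with illustrative reflectance values:

```python
import numpy as np

def ndvi(nir, red, eps=1e-6):
    """Normalized difference vegetation index from NIR (Sentinel-2 B8) and red (B4) reflectance."""
    return (nir - red) / (nir + red + eps)

# Toy example: dense canopy vs. bare gravel bar (illustrative reflectance values)
print(ndvi(np.array([0.45, 0.25]), np.array([0.05, 0.20])))  # ~ [0.80, 0.11]
```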
Figure 11. Index distribution in the upper reach of the Kuji River. (a) Satellite image and environmental base map for 2017; (b) satellite image and environmental base map for 2021; (c) index distribution in 2017; (d) index distribution in 2021.
Figure 12. Index distribution in the middle reach of the Kuji River. (a) Satellite image and environmental base map for 2017; (b) satellite image and environmental base map for 2021; (c) index distribution in 2017; (d) index distribution in 2021.
Figure 13. Index distribution in the lower reach of the Kuji River. (a) Satellite image and environmental base map for 2017; (b) satellite image and environmental base map for 2021; (c) index distribution in 2017; (d) index distribution in 2021.
Figure 14. Index and roughness coefficient (2017).
Figure 15. Index and roughness coefficient (2021).
Figure 16. Index distribution in the upper reach of the Chikugo River. (a) Satellite image and environmental base map for 2017; (b) satellite image and environmental base map for 2021; (c) index distribution in 2017; (d) index distribution in 2021.
Figure 17. Index distribution in the middle reach of the Chikugo River. (a) Satellite image and environmental base map for 2017; (b) satellite image and environmental base map for 2021; (c) index distribution in 2017; (d) index distribution in 2021.
Figure 18. Index distribution in the lower reach of the Chikugo River. (a) Satellite image and environmental base map for 2017; (b) satellite image and environmental base map for 2021; (c) index distribution in 2017; (d) index distribution in 2021.
Figure 19. Distribution of index values for adjacent forest and grassland areas. (a) Yoshino River (2016); (b) Edo River (2016).
Figure 20. Optical imagery showing selected areas of forest and grassland. (a) Yoshino River (2016); (b) Edo River (2016).
Figure 21. Distribution of index values for dense and sparse vegetation zones of the Tone River (2021).
Table 1. Study rivers and observation years used in the analysis.

River            T1 (Year)   T2 (Year)
Kuji River       2017        2021
Naka River       2017        2022
Tone River       2016        2021
Edo River        2016        2021
Sagami River     2016        2021
Yoshino River    2016        2021
Shimanto River   2018        2023
Chikugo River    2017        2021
Kuma River       2018        2022
Table 2. Observation dates, GCP number, and root mean square error (RMSE) values for Sentinel-1 images in 2017 and 2021.

Year   Observation Date   Number of GCPs   RMSE (Pixel)   RMSE Average
2017   2 January          29               1.706          2.233
       7 February         24               2.332
       6 March            33               2.252
       8 April            25               2.051
       14 May             27               2.453
       7 June             22               2.202
       13 July            23               2.382
       9 August           24               2.671
       11 September       20               1.655
       5 October          27               2.161
       22 November        41               2.507
       16 December        33               2.428
2021   5 January          23               2.939          2.766
       10 February        20               2.967
       6 March            23               2.813
       11 April           22               2.791
       5 May              22               2.526
       10 June            26               2.461
       16 July            21               2.879
       9 August           22               2.774
       2 September        20               2.516
       8 October          22               3.512
       25 November        21               2.914
       17 December        17               2.105
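The RMSE values in Tables 2 and 3 summarize the residual offsets at the GCPs after geometric correction. A minimal sketch of the per-scene RMSE in pixel units, assuming the residuals are expressed as column/row offsets between corrected and reference GCP positions (the coordinates below are illustrative only):

```python
import numpy as np

def gcp_rmse(predicted_xy, reference_xy):
    """Root mean square error (in pixels) between corrected and reference GCP positions.

    predicted_xy, reference_xy : (N, 2) sequences of column/row coordinates.
    """
    residuals = np.asarray(predicted_xy, float) - np.asarray(reference_xy, float)
    return float(np.sqrt(np.mean(np.sum(residuals**2, axis=1))))

# Toy example with three GCPs
pred = [(100.8, 50.2), (201.5, 149.0), (302.0, 251.7)]
ref = [(100.0, 50.0), (200.0, 150.0), (300.0, 250.0)]
print(round(gcp_rmse(pred, ref), 3))
```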
Table 3. Observation dates, GCP number, and RMSE values for Sentinel-2 images in 2017 and 2021.

Year   Observation Date   Number of GCPs   RMSE (Pixel)   RMSE Average
2017   18 January         106              1.848          1.766
       17 February        100              1.66
       9 March            104              1.675
       8 May              122              1.77
       17 June            110              1.753
       30 October         116              1.831
       9 November         105              1.805
       9 December         113              1.787
2021   2 January          15               1.666          1.994
       6 February         21               2.097
       3 March            27               2.147
       22 April           49               2.068
       1 June             36               2.133
       21 July            47               1.982
       10 August          52               2.265
       19 September       43               2.228
       4 October          73               1.942
       18 November        79               1.707
       23 December        107              1.698
Table 4. Average index values in grassland and forest areas across river segments for 2017 and 2021.

River Segment   Grassland (2017)   Forest (2017)   Grassland (2021)   Forest (2021)
Upper Reach     0.250              0.616           0.231              0.504
Middle Reach    0.241              0.468           0.217              0.364
Lower Reach     0.241              0.440           0.200              0.355
Whole Area      0.244              0.508           0.216              0.408
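The segment-wise averages in Table 4 correspond to zonal statistics of the composite index over the grassland and forest areas of the river environment map. A minimal sketch, assuming the zone polygons have already been rasterized to an integer label grid aligned with the index raster (labels and values below are illustrative, not study data):

```python
import numpy as np

def zonal_means(index, zones, labels):
    """Mean composite-index value for each zone label (e.g., grassland vs. forest)."""
    return {name: float(np.nanmean(index[zones == code])) for code, name in labels.items()}

# Toy example: 1 = grassland, 2 = forest (zone codes are hypothetical)
index = np.array([[0.25, 0.24, 0.62], [0.22, 0.61, 0.50]])
zones = np.array([[1, 1, 2], [1, 2, 2]])
print(zonal_means(index, zones, {1: "grassland", 2: "forest"}))
```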
Table 5. Difference rate (%) between grassland and forest areas (D1: grassland zone, D2: forest zone) as defined by river environment maps for each river.

River            Year   D1 (Count)   D2 (Count)   D0 (Count)   Difference Rate (%)
Naka River       2017   878          871          106          93.5
                 2022   960          892          456          67.3
Tone River       2016   1170         942          428          74.6
                 2021   1258         1030         760          50.3
Edo River        2016   345          343          252          42.2
                 2021   373          283          227          47.1
Sagami River     2016   642          561          406          49.1
                 2021   540          497          333          52.7
Yoshino River    2016   252          243          24           94.9
                 2020   199          199          84           73.2
Shimanto River   2018   339          246          160          62.4
                 2023   339          252          175          57.9
Chikugo River    2017   277          351          190          56.6
                 2021   302          359          195          58.2
Kuma River       2017   564          489          156          82.6
                 2021   515          414          165          78.4
Table 6. Difference rate (%) between sparse and dense vegetation zones (D1: sparse vegetation, D2: dense vegetation) as detected by optical imagery for each river.

River            Year   D1 (Count)   D2 (Count)   D0 (Count)   Difference Rate (%)
Naka River       2017   519          591          5            99.5
                 2022   55           73           0            100
Tone River       2016   145          184          18           94.2
                 2021   402          446          12           98.6
Edo River        2016   65           63           0            100
                 2021   169          112          0            100
Sagami River     2016   334          315          34           94.5
                 2021   311          329          15           97.6
Yoshino River    2016   230          207          1            99.8
                 2020   371          409          34           95.4
Shimanto River   2018   109          121          0            100
                 2023   306          285          2            99.7
Chikugo River    2017   138          158          1            99.7
                 2021   154          175          10           96.9
Kuma River       2017   376          348          13           98.2
                 2021   166          195          2            99.4
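The precise definitions of D0 and the difference rate are given in the Methods and are not restated here. Purely as an illustration of how such a separability measure could be computed, the sketch below treats D0 as the number of samples whose index values fall inside the value range shared by the sparse and dense classes and reports the percentage of samples outside that overlap; this hypothetical definition is not necessarily the one used in the study, and all values are synthetic.

```python
import numpy as np

def difference_rate(sparse_vals, dense_vals):
    """Illustrative separability measure between two index samples (hypothetical definition).

    D0 counts samples falling inside the value range shared by both classes;
    the rate is the percentage of samples lying outside that overlap.
    """
    lo = max(sparse_vals.min(), dense_vals.min())   # lower bound of the shared value range
    hi = min(sparse_vals.max(), dense_vals.max())   # upper bound of the shared value range
    pooled = np.concatenate([sparse_vals, dense_vals])
    d0 = int(np.sum((pooled >= lo) & (pooled <= hi))) if hi >= lo else 0
    return 100.0 * (1.0 - d0 / pooled.size)

rng = np.random.default_rng(0)
sparse = rng.normal(0.14, 0.03, 300)   # synthetic sparse-vegetation index values
dense = rng.normal(0.45, 0.08, 300)    # synthetic dense-vegetation index values
print(round(difference_rate(sparse, dense), 1))
```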
Table 7. Mean values, standard deviations, and statistical significance (p-values) of the vegetation indices for sparse and dense vegetation zones by river and year.

River            Year   Sparse (Mean)   Dense (Mean)   Sparse (SD)   Dense (SD)   Significance
Naka River       2017   0.14            0.39           0.03          0.05         p < 0.001
                 2022   0.13            0.58           0.06          0.11         p < 0.001
Tone River       2016   0.28            0.47           0.07          0.42         p < 0.001
                 2021   0.14            0.37           0.03          0.07         p < 0.001
Edo River        2016   0.13            0.37           0.02          0.06         p < 0.001
                 2021   0.13            0.41           0.02          0.04         p < 0.001
Sagami River     2016   0.14            0.35           0.04          0.12         p < 0.001
                 2021   0.11            0.38           0.05          0.11         p < 0.001
Yoshino River    2016   0.14            0.60           0.05          0.07         p < 0.001
                 2020   0.17            0.41           0.04          0.08         p < 0.001
Shimanto River   2018   0.14            0.70           0.03          0.11         p < 0.001
                 2023   0.18            0.47           0.04          0.07         p < 0.001
Chikugo River    2017   0.19            0.66           0.17          0.10         p < 0.001
                 2021   0.19            0.54           0.05          0.22         p = 0.007
Kuma River       2017   0.20            0.53           0.03          0.13         p < 0.001
                 2021   0.20            0.53           0.04          0.11         p < 0.001
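Table 7 reports per-zone means, standard deviations, and p-values; the specific statistical test is described in the Methods. As one plausible choice, the sketch below applies a two-sample Welch's t-test (which does not assume equal variances between the two zones) to synthetic index samples, purely to show how such p-values can be obtained.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
sparse = rng.normal(0.14, 0.03, 200)   # synthetic index samples for the sparse zone
dense = rng.normal(0.39, 0.05, 200)    # synthetic index samples for the dense zone

# Welch's t-test: compares the two zone means without assuming equal variances
t_stat, p_value = stats.ttest_ind(dense, sparse, equal_var=False)
print(f"mean sparse = {sparse.mean():.2f}, mean dense = {dense.mean():.2f}, p = {p_value:.3g}")
```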