Article

Estimation of Aboveground Biomass of Chinese Milk Vetch Based on UAV Multi-Source Map Fusion

1 College of Public Administration, Huazhong Agricultural University, Wuhan 430070, China
2 College of Resources and Environment, Huazhong Agricultural University, Wuhan 430070, China
3 Artificial Intelligence Research Institute, Faculty of Computing, Harbin Institute of Technology, Harbin 150008, China
4 National Key Laboratory of Smart Farm Technologies and Systems, Harbin 150008, China
* Author to whom correspondence should be addressed.
These authors contributed equally to this work.
Remote Sens. 2025, 17(4), 699; https://doi.org/10.3390/rs17040699
Submission received: 31 December 2024 / Revised: 14 February 2025 / Accepted: 16 February 2025 / Published: 18 February 2025

Abstract

Chinese milk vetch (CMV), as a typical green manure in southern China, plays an important role in improving soil quality and partially substituting nitrogen chemical fertilizers for rice production. Accurately estimating the aboveground biomass (AGB) of CMV is crucial for quantifying the biological nitrogen fixation amount (BNFA) and assessing its viability as a nitrogen fertilizer alternative. However, traditional estimation methods have low efficiency in field-scale evaluations. Recently, unmanned aerial vehicle (UAV) remote sensing technology has been widely adopted for AGB estimation. This study utilized UAV-based multispectral and RGB imagery to extract spectral (Sp), textural (Tex), and structural (Str) features, comparing various feature combinations in AGB estimation for CMV. The results indicated that the fusion of spectral, textural, and structural features achieved the best estimation performance among all feature combinations, with R2 values of 0.89 and 0.83 for model cross-validation and spatial transferability validation, respectively. Adding textural features to spectral features notably improved AGB estimation, raising R2 by 0.15 and 0.14 for model cross-validation and spatial transferability validation, respectively, compared with relying on spectral features alone. Estimation based exclusively on structural features resulted in R2 values of 0.65 and 0.52 for model cross-validation and spatial transferability validation, respectively. The present study establishes a rapid and extensive approach to evaluating the BNFA of CMV at the full blooming stage using the optimal AGB estimation model, providing an effective calculation method for chemical fertilizer reduction.

1. Introduction

The yearly seasonal fallow area encompasses 24.9 million hectares, which constitutes 25% of arable land in China [1]. Planting green manure crops on fallow areas significantly contributes to expanding fertilizer sources and enhancing soil nitrogen and organic matter levels. Chinese milk vetch (CMV, Astragalus sinicus L.) exhibits robust winter growth and is rich in nitrogen, phosphorus, potassium, and trace elements. Prevalent in the Yangtze River basin, CMV is strategically cultivated in fallow fields after rice harvest. Subsequently, at the full blooming stage, CMV is incorporated into the soil to increase soil organic matter and thus substitute a portion of nitrogen chemical fertilizers. Previous research indicated that biological nitrogen fixation contributes approximately 78% of CMV's overall nitrogen accumulation, supplying 41–146 kg/ha of nitrogen for subsequent crops [2]. Consequently, CMV has the potential to substitute 20–40% of nitrogen fertilizers after incorporation [3]. The utilization of this natural nitrogen fertilizer substitute has triggered substantial studies on estimating the biological nitrogen fixation amount (BNFA) of CMV [4,5,6]. The nitrogen fixation capacity of CMV correlates with aboveground biomass (AGB), nitrogen content, and biological nitrogen fixation rate, with AGB emerging as the primary influencing factor. Therefore, accurately estimating the AGB of CMV is pivotal in evaluating its impact on soil fertility and optimizing nitrogen fertilizer application in rice fields.
Traditional approaches for estimating AGB and related physiological and biochemical parameters, such as field surveys and vegetation growth models, encounter notable limitations. Field surveys are time-consuming and susceptible to investigator bias. Additionally, the application of vegetation growth models requires many parameters and heavily depends on extensive field survey data for calibration. For example, the APSIM (Agricultural Production Systems Simulator) model requires detailed calibration of parameters related to photosynthetic and respiration rates, which are specific to plant species and environmental conditions [7,8]. These parameters need to be measured under controlled conditions, often requiring complex experimental setups [7,8,9,10]. Furthermore, there is currently no comprehensive explanation of the mechanisms by which extreme atmospheric conditions impact agricultural production, leading to lower accuracy in crop simulation under extreme agricultural climates [11,12]. Therefore, crop growth models need to be improved by incorporating appropriate empirical models tailored to different crops.
Fortunately, the recent emergence of unmanned aerial vehicle (UAV) remote sensing technology offers a promising alternative to traditional AGB estimation methods. UAV remote sensing has several advantages, including high-speed data acquisition, objectivity, quantifiability, and non-destructiveness [13,14,15]. This technological innovation has proven its efficiency and accuracy in estimating vegetation AGB and other physiological and biochemical indicators [14,16,17,18]. However, the application of UAVs also faces challenges, such as regulatory issues related to airspace restrictions and flight permissions, which can complicate deployment in certain regions. Additionally, battery capacity, though less of a concern, can limit flight duration and the area that can be covered in a single survey. Despite these challenges, solutions are emerging—regulatory frameworks are gradually evolving to accommodate UAV usage, and advancements in battery technology and energy-efficient systems are likely to extend flight times, making these limitations manageable rather than insurmountable [19,20].
A prevalent approach for estimating vegetation parameters through UAV remote sensing capitalizes on the interaction between vegetation and solar radiation. By establishing correlations between reflectance and physiological and biochemical indices, such as AGB, the estimation of these indices becomes attainable [10,13,15,17,21]. To improve accuracy and mitigate interference from non-vegetated features, researchers have introduced various vegetation indices, such as the normalized difference vegetation index (NDVI), normalized green–red difference index (NGRDI), modified chlorophyll absorption ratio index (MCARI), and simple ratio index (SR) [18,22,23,24,25]. This method, based on optical remote sensing, offers the advantages of requiring minimal parameters and featuring straightforward and accurate mathematical expressions [26,27]. However, in open field scenarios, optical remote sensing methods encounter challenges arising from various factors, including soil background, variations in vegetation structure during different growth periods, and sowing densities. These factors may result in saturation issues in optical measurements [28,29].
Being a creeping crop, the high-density sowing of CMV fosters intense competition among individual plants, resulting in alterations in plant morphology. These changes can influence canopy spectral information, internal plant structure, and the composition of AGB, leading to substantial variations. Given these considerations, it is necessary to develop a reliable remote sensing estimation method for the AGB of CMV, crucial for accurately capturing the spatial distribution and structural composition of CMV plants in open fields.
In recent years, the fusion of diverse modeling features has emerged as a pivotal strategy to improve AGB estimation. For example, textural features extracted from UAV imagery can autonomously capture spatial information in the images, regardless of color tones. Key texture features, such as variance, homogeneity, and contrast, are commonly used in this context. Variance reflects the image’s contrast level by measuring the intensity variation across neighboring pixels, homogeneity captures the uniformity of pixel intensities within the image, and contrast quantifies the difference in intensity between adjacent pixels. These texture features, when integrated into models, enhance the ability to discern vegetation characteristics and improve AGB estimation accuracy. This capability assists in discerning variations in the spatial distribution of vegetation, vegetation types, and densities [30,31], thereby improving the precision of AGB estimation [32,33,34]. Furthermore, UAVs equipped with light detection and ranging (LiDAR) technology can reconstruct 3D point cloud information of vegetation [14,35,36]. Structural features, such as vegetation plant height, canopy diameter, and canopy cover, derived from this 3D point cloud, facilitate efficient AGB estimation [37]. Despite its potential, the application of this method in precision agriculture management still encounters challenges related to technology costs. Nevertheless, advancements in the structure from motion (SFM) algorithm for motion recovery now enable the reconstruction of vegetation 3D point cloud information using low-cost, high-resolution RGB images [38]. This development allows for a cost-effective AGB estimation [39,40,41]. Moreover, the fusion of structural and spectral features has progressively proven to be an effective approach for enhancing the accuracy of vegetation AGB estimation [42,43]. 
Additionally, the comprehensive fusion of spectral, structural, and image textural features captures the spatial distribution and growth status of vegetation, contributing to further improvements in AGB estimation performance [16,44].
Previous studies underscore the effectiveness of UAV multi-source map fusion technology in improving AGB estimation through the resolution of optical measurement saturation challenges. However, no investigations have applied this technique to CMV. A comprehensive examination is imperative to assess the applicability and spatial transferability of this optimization methodology in the AGB estimation of CMV, given its prostrate growth and high-density sowing. Hence, this study aims to (1) quantitatively evaluate the impact of UAV-based spectral, textural, and structural features on the AGB of CMV modeling and (2) ascertain the efficacy of various feature combinations in mitigating optical remote sensing saturation during modeling. Through the construction of an optimized AGB estimation model, this study contributes to the large-scale rapid estimation of BNFA, thus optimizing the fertilizer application practice in the CMV-rice rotation system.

2. Materials and Methods

2.1. Study Area

The study area was in Taihu Farm, Jingzhou City, Hubei Province, China (30°21′N, 112°02′E), with an average annual temperature of 16.3 °C and an annual rainfall of 1200 mm (Figure 1). The study area implemented a CMV-rice rotation system, where rice was cultivated as a single-season crop and CMV was grown during the winter fallow period. The CMV variety was Yijiang (Wuhu Qing Yijiang Seed Industry Co., Ltd., Wuhu, China), and the seeds were sown uniformly at a rate of 30 kg/ha. No chemical fertilizers or irrigation were applied during the CMV growth period.
This study was conducted in two specified regions identified as Field East and Field West, both located at Taihu Farm. The fields share similar conditions in terms of sunlight, air temperature, irrigation, and soil characteristics. Each study field was subdivided into 96 plots, measuring 32 m2 (4 m × 8 m) for Field East and 20 m2 (4 m × 5 m) for Field West, respectively, with 30 cm wide ridges separating the plots. The field trial was set up in 2015 to evaluate the effect of CMV incorporation on soil fertility and rice growth. The primary objective of the current investigation was to develop a model for estimating the AGB of CMV using remote sensing technology. Consequently, data acquisition exclusively focused on plots with CMV cultivation. Specifically, 75 plots from Field East were selected for constructing the AGB estimation model of CMV and conducting cross-validation, while 48 plots from Field West were chosen for the spatial transferability validation.

2.2. Overall Workflow

The workflow comprises six key components: image preprocessing, feature extraction, feature selection, feature combination, model construction, and spatial transferability validation (Figure 2).

2.3. Ground Data Acquisition and Processing

(1)
Ground-based measurement of CMV plant height
Ground truth plant height data were measured on 13 April 2021, during the full blooming stage of CMV. Initially, a sample plot with dimensions of 30 cm × 30 cm was randomly chosen from each plot, and the precise count of CMV plants within it was documented. Following this, ten representative plants were selected from the plot, and the height of fully extended plants was gauged using a telescopic ruler. The average of these measurements was regarded as the plant height value (in centimeters) for the respective plot.
(2)
Ground-based measurement of the AGB of CMV
AGB measurement followed the plant height measurements on the same day. All CMV plants within each plot were harvested, and their fresh weight was recorded to determine the AGB of CMV in kg/m2 for the respective plot.
(3)
Ground-based measurement of CMV moisture content and nitrogen content
For each plot, approximately 500 g of harvested CMV samples were taken and subjected to fresh weight measurement. Subsequently, the samples were dried in an oven (initially at 105 °C for the first 30 min) until a constant weight was achieved at 70 °C. The measured dry weight facilitated the calculation of moisture content (%) for the CMV. Following this, the plant samples were ground and digested with H2SO4-H2O2. The nitrogen content (%) was then measured using the Kjeldahl method. The data on moisture and nitrogen contents were utilized for the estimation of BNFA (kg/ha).
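The moisture-content arithmetic above can be sketched as follows; the sample weights here are hypothetical, not measurements from the study:

```python
def moisture_content(fresh_g: float, dry_g: float) -> float:
    """Moisture content (%) from fresh and constant oven-dry sample weights."""
    return (fresh_g - dry_g) / fresh_g * 100.0

# e.g. a 500 g fresh sample drying to a constant 61.2 g
mc = moisture_content(500.0, 61.2)   # 87.76%
```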

2.4. UAV Image Data Acquisition and Processing

To ensure consistency between UAV remote sensing data and ground-based data, RGB and multispectral images were acquired using DJI Mavic 2 Pro and P4 multispectral quadcopter UAVs between 11:00 and 13:00 on the day of ground data acquisition.
The Mavic 2 Pro, designed for commercial applications, is a lightweight drone equipped with an advanced omnidirectional vision system, infrared sensors, and a high-precision anti-shake gimbal. It features a 20-megapixel CMOS sensor, a 24 mm focal length lens, and an 85° angle of view for high-resolution RGB image capture. Similarly, the P4 multispectral, tailored for multispectral imaging, shares key features with the Mavic 2 Pro. It incorporates a multi-directional vision system, an infrared sensor system, and a high-precision anti-shake gimbal with a built-in real-time kinematic (RTK) system. Equipped with six CMOS image sensors, the P4 multispectral captures images in five wavelength bands: blue (450 ± 16 nm), green (560 ± 16 nm), red (650 ± 16 nm), red edge (730 ± 16 nm), and near-infrared (840 ± 16 nm). Each monochrome sensor features 2.12 million pixels, a lens focal length of 40 mm, and a viewing angle of 62.7°.
All aerial missions occurred under optimal weather conditions, including clear skies, minimal wind, at an altitude of 20 m, and a speed of 2 m/s. Images were acquired at regular 2 s intervals. The flight grid was a single grid, and both the forward and side overlaps were set to 80%, ensuring adequate coverage between images and improving the accuracy of image stitching. Flight paths covered the entire study area, and radiometric calibration panels were strategically placed within the flight coverage. For radiometric calibration, the empirical linearization radiometric calibration (ELRC) method [45] (Farrand et al., 1994) was applied to ensure accuracy and consistency of radiometric values acquired by the multispectral sensors during flights. Additionally, ten ground control points (GCPs) were positioned within each study field for precise georeferencing. Horizontal measurements of the GCPs’ geographic coordinates were conducted using a GNSS device based on RTK technology, providing precise locations for geometric correction of images. Moreover, RGB images of the study area were acquired during the bare ground period between 11:00 and 13:00 on 18 September 2020. These images were utilized for generating a digital elevation model (DEM) representing the bare ground period after rice harvest and before CMV sowing.

2.4.1. Image Mosaic and Radiometric Calibration

In this study, the UAV multispectral and RGB images from the two study fields were mosaiced using Agisoft Metashape 1.8.0 software. Subsequently, radiometric calibration was performed on the multispectral images using the ELRC method.
$$R_i = DN_i \times Gain_i + Offset_i, \quad i = 1, 2, \ldots, 5 \tag{1}$$
$$\begin{bmatrix} 0.10 \\ 0.30 \\ 0.60 \end{bmatrix} = \begin{bmatrix} DN_{0.10} \\ DN_{0.30} \\ DN_{0.60} \end{bmatrix} \times Gain_i + Offset_i \tag{2}$$
Note: Ri and DNi represent the surface reflectance and gray value of a specific pixel in the i-th band. The gain and offset values for the i-th band are constants derived from experimental measurements, considering the reflectance characteristics of diverse radiometric calibration targets.
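The per-band linear fit of Equations (1) and (2) can be sketched as below; the panel reflectances (10%, 30%, 60%) follow Equation (2), while the mean panel DN values are hypothetical:

```python
import numpy as np

def elrc_fit(dn_panels, reflectance_panels):
    """Least-squares fit of reflectance = Gain * DN + Offset for one band,
    using the radiometric calibration panels placed in the scene."""
    gain, offset = np.polyfit(dn_panels, reflectance_panels, 1)
    return gain, offset

def dn_to_reflectance(dn, gain, offset):
    """Apply the per-band linear calibration to a DN image array."""
    return gain * np.asarray(dn, dtype=float) + offset

# hypothetical mean DNs over the 10%, 30%, and 60% reflectance panels
gain, offset = elrc_fit([800, 2400, 4800], [0.10, 0.30, 0.60])
```

With three panels the fit is overdetermined, so small sensor nonlinearities are averaged out rather than propagated.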

2.4.2. Removal of Soil Background

In the quantitative assessment of vegetation physiological and biochemical indices, non-vegetative elements, such as soil, may distort radiometric data of vegetation targets, reducing the accuracy of information extraction. To mitigate this, removing the soil background from remote sensing images is crucial before extracting vegetation-related data. ArcGIS 10.8 and ENVI 5.3 were used in this study for soil background removal from radiometrically calibrated multispectral and RGB images. Figure 3 shows multispectral images before and after soil removal (similarly for RGB images). A soil background mask was constructed through supervised classification using an SVM classifier, with visual interpretation markers established simultaneously. Ten thousand random sample points for soil and vegetation were selected to evaluate the SVM classifier's performance. Classification results indicated overall accuracies of 95.43% for soil and 93.36% for vegetation in both multispectral and RGB images (Figure 4). The validated soil background mask was then exported as a vector file, facilitating soil background removal from the images.
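The supervised classification step can be sketched with scikit-learn's SVC as a stand-in for the ENVI workflow; the five-band reflectance statistics below are synthetic, chosen only to separate the two classes:

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# synthetic training pixels: soil is bright across the visible bands,
# vegetation is dark in red and bright in the near-infrared
soil = rng.normal([0.30, 0.32, 0.35, 0.36, 0.38], 0.02, size=(200, 5))
veg = rng.normal([0.04, 0.08, 0.05, 0.30, 0.50], 0.02, size=(200, 5))
X = np.vstack([soil, veg])
y = np.r_[np.zeros(200), np.ones(200)]          # 0 = soil, 1 = vegetation

clf = SVC(kernel="rbf").fit(X, y)

# classify a (rows, cols, 5) image patch and build a boolean soil mask
image = rng.normal([0.30, 0.32, 0.35, 0.36, 0.38], 0.02, size=(10, 10, 5))
labels = clf.predict(image.reshape(-1, 5)).reshape(10, 10)
soil_mask = labels == 0
```

The resulting boolean mask plays the role of the exported vector mask in the text.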

2.4.3. Extraction of Spectral, Textural, and Structural Features

In the study area, a rectangular region of interest (ROI) was defined for multispectral and RGB images based on plot areas to extract spectral, structural, and textural features. To mitigate edge effects, ROI dimensions were set 0.2 m smaller than the original plots. Spectral, structural, and textural features were extracted using the “zonal statistics as table” function from the ArcPy library. Twelve vegetation indices, widely recognized for AGB estimation, were selected as spectral features. Corresponding spectral feature images were systematically computed using the geospatial data abstraction library (GDAL) in Python (version 3.10). The calculation formulas are presented in Table 1.
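Three of the indices named earlier can be sketched with their standard definitions (the full set of formulas actually used is given in Table 1); the reflectance values below are hypothetical:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    nir, red = np.asarray(nir, float), np.asarray(red, float)
    return (nir - red) / (nir + red)

def sr(nir, red):
    """Simple ratio index: NIR / R."""
    return np.asarray(nir, float) / np.asarray(red, float)

def ngrdi(green, red):
    """Normalized green-red difference index: (G - R) / (G + R)."""
    green, red = np.asarray(green, float), np.asarray(red, float)
    return (green - red) / (green + red)

# two hypothetical vegetation pixels
nir = np.array([0.45, 0.50])
red = np.array([0.05, 0.10])
v = ndvi(nir, red)   # [0.8, 0.667]
```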
Textural features based on the gray level co-occurrence matrix (GLCM) using ENVI 5.3 included eight metrics: mean (Mean), variance (Var), homogeneity (Hom), contrast (Con), dissimilarity (Dis), entropy (En), second moment (Sm), and correlation (Cor). For multispectral images, forty textural features were calculated across five bands. The second-order probabilistic statistical filter used default settings with a 3 × 3 window size and a diagonal direction of [1, 1]. Calculation formulas are presented in Table 2.
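A minimal pure-NumPy sketch of a single-offset GLCM over the [1, 1] diagonal, with a few of the Table 2 metrics, is shown below. Note that ENVI computes these per pixel in a moving 3 × 3 window; this sketch computes one patch-level GLCM and assumes the image is already quantized to integer gray levels:

```python
import numpy as np

def glcm_features(img, levels, offset=(1, 1)):
    """Single-offset gray-level co-occurrence matrix (here the [1, 1]
    diagonal) and a few derived texture metrics. `img` must hold integer
    gray levels in 0..levels-1."""
    img = np.asarray(img)
    dr, dc = offset
    a = img[:-dr, :-dc].ravel()            # reference pixels
    b = img[dr:, dc:].ravel()              # their diagonal neighbours
    glcm = np.zeros((levels, levels))
    np.add.at(glcm, (a, b), 1)             # accumulate co-occurrence counts
    glcm /= glcm.sum()                     # normalize counts to probabilities
    i, j = np.indices((levels, levels))
    p = glcm[glcm > 0]
    return {
        "mean": (i * glcm).sum(),
        "contrast": ((i - j) ** 2 * glcm).sum(),
        "homogeneity": (glcm / (1.0 + (i - j) ** 2)).sum(),
        "entropy": -(p * np.log(p)).sum(),
    }

quantized = np.array([[0, 0, 1, 1],
                      [0, 0, 1, 1],
                      [2, 2, 3, 3],
                      [2, 2, 3, 3]])
feats = glcm_features(quantized, levels=4)
```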
Utilizing the SFM algorithm to reconstruct 3D point cloud information from RGB images provides elevation details for the study area. However, its accuracy is limited, offering only vegetation canopy elevation information rather than precise plant height. To overcome this limitation, the study aims to determine the bare ground height of the study field, crucial for accurate plant height information. After the rice harvest on 18 September 2020, RGB images were acquired to generate the DEM of the study area (Figure 5b). Subsequently, the DSM (Figure 5a), derived from RGB images during the CMV full blooming stage, underwent raster subtraction with the DEM, resulting in the specific CMV canopy height model (CHM) (Figure 5c). Finally, the soil background mask from Section 2.4.2 was applied to eliminate soil pixels from the CHM, ensuring more accurate plant height information and laying the foundation for subsequent feature extraction.
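The raster subtraction and soil-masking steps above can be sketched as follows; the elevation values are hypothetical, and clipping negative height differences to zero is our assumption rather than a stated step:

```python
import numpy as np

def canopy_height_model(dsm, dem, soil_mask):
    """CHM = DSM (full blooming stage) - DEM (bare ground). Soil pixels
    are set to NaN so they drop out of later structural statistics."""
    chm = np.asarray(dsm, float) - np.asarray(dem, float)
    chm = np.where(soil_mask, np.nan, chm)
    return np.clip(chm, 0.0, None)         # assumption: negatives -> 0

dsm = np.array([[30.45, 30.20], [30.35, 30.10]])
dem = np.array([[30.10, 30.10], [30.10, 30.12]])
soil = np.array([[False, False], [False, True]])
chm = canopy_height_model(dsm, dem, soil)
```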
Validating the extracted plant height from the CHM is crucial for ensuring accuracy and reliability in practical applications. In this study, linear regression analysis was utilized to examine the correlation between the estimated plant height from CHM and the measured ground plant height. Figure 6 indicates a robust correlation (r = 0.78, p < 0.001), confirming the reliability of the CHM-extracted plant height for subsequent extraction of vegetation structure features.
Methods based on volumetric pixels (voxels) and cylindrical surface fitting model 3D vegetation canopy structures by fitting optimal cylindrical shapes to LiDAR or photogrammetric point clouds, extracting vegetation volume in diverse environments such as forests and shrubs [57,58]. However, the 3D point cloud generated from RGB images and SFM algorithms has limitations, capturing only outer canopy volume (CV) and lacking information about the internal structure of complex vegetation growth [59]. To overcome this, our study employed the surface difference method to calculate canopy volume. This method involves determining the volume between the highest and lowest points of the canopy based on the CHM. The CV for each plot was computed as the product of the extracted plant height and the number of vegetation pixels within the plot [60,61].
Ultimately, five structural features were extracted using the CHM: mean plant height (PHmean), standard deviation of plant height (PHstd), coefficient of variation of plant height (PHcv), CV, and plot-scale canopy cover (CC) derived through vegetation and soil classification masks [17]. PHstd and PHcv are employed to characterize the vertical structure complexity of vegetation [62]. These structural features offer a quantitative depiction of the vertical vegetation structure. The calculation formulas are presented in Table 3.
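The plot-scale structural features can be sketched from a soil-masked CHM as below; multiplying the text's pixel-count-based CV by a per-pixel ground area to obtain a physical volume is our assumption, and the CHM values are hypothetical:

```python
import numpy as np

def structural_features(chm, pixel_area_m2):
    """Plot-scale structural features from a CHM whose soil pixels are NaN."""
    h = chm[~np.isnan(chm)]                    # vegetation pixels only
    ph_mean = h.mean()
    ph_std = h.std()
    return {
        "PHmean": ph_mean,                     # mean plant height (m)
        "PHstd": ph_std,                       # vertical-structure variability
        "PHcv": ph_std / ph_mean,              # coefficient of variation
        # text: CV = plant height x vegetation pixel count; multiplying by
        # the per-pixel ground area (our assumption) yields m^3
        "CV": ph_mean * h.size * pixel_area_m2,
        "CC": h.size / chm.size,               # canopy cover fraction
    }

plot_chm = np.array([[0.3, 0.4],
                     [0.5, np.nan]])
feats = structural_features(plot_chm, pixel_area_m2=0.01)
```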

2.5. Selection and Fusion of Features

Pearson correlation analysis quantifies the linear correlation strength between two variables, commonly applied in exploratory data analysis. It calculates the correlation coefficient ("r") as the ratio of the covariance of the two variables to the product of their standard deviations. The coefficient ("r") ranges from −1 to 1, with a larger absolute value indicating a stronger correlation. Variable selection using random forests (VSURF) is a feature selection technique based on the random forests algorithm (RF) [63] (Genuer et al., 2010). It efficiently selects crucial features from a vast pool, addressing model uncertainty and multiple covariance issues [64,65]. In this study, RStudio 1.4 (R 4.2.0) and the R language library VSURF were employed for feature selection. The feature selection process occurred sequentially, applying Pearson correlation and then VSURF to all feature types. Seven feature combinations were established for the retained features: Sp, Tex, Str, Sp + Tex, Sp + Str, Tex + Str, and Sp + Tex + Str.
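The first-stage Pearson filter can be sketched as below (the second-stage VSURF step is an R package and is not reproduced here); feature names and values are hypothetical:

```python
import numpy as np

def pearson_filter(features, agb, threshold=0.6):
    """Keep features whose absolute Pearson r with AGB exceeds `threshold`;
    returns {feature name: r} for the retained features."""
    kept = {}
    for name, values in features.items():
        r = np.corrcoef(values, agb)[0, 1]
        if abs(r) > threshold:
            kept[name] = r
    return kept

agb = np.array([1.0, 1.5, 2.0, 2.5, 3.0, 3.5])
features = {
    "CIgreen": np.array([0.9, 1.4, 2.2, 2.4, 3.1, 3.4]),   # tracks AGB
    "noise":   np.array([2.0, 1.0, 3.0, 1.5, 2.5, 1.2]),   # unrelated
}
kept = pearson_filter(features, agb, threshold=0.6)
```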

2.6. Model Evaluation and AGB, BNFA Mapping

In this study, the Python Scikit-learn library was employed to construct an AGB estimation model for CMV using the RF algorithm. Grid search optimization focused on key hyperparameters, such as the number of decision trees (n_estimators), maximum tree depth (max_depth), maximum features at each split (max_features), minimum samples required for leaf nodes (min_samples_leaf), and minimum samples required to split an internal node (min_samples_split). The process involved defining a hyperparameter space, systematically adjusting parameters, and determining the set that maximized accuracy on the model validation set. Model performance was assessed through repeated k-fold cross-validation and spatial transferability validation, using evaluation metrics including the coefficient of determination (R2), root mean square error (RMSE), and relative root mean square error (rRMSE). A more robust model is indicated by a higher R2 and lower RMSE and rRMSE values (Equations (3)–(5)).
$$R^2 = 1 - \frac{\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2} \tag{3}$$
$$RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2} \tag{4}$$
$$rRMSE = \frac{\sqrt{\frac{1}{n}\sum_{i=1}^{n}\left(y_i - \hat{y}_i\right)^2}}{\frac{1}{n}\sum_{i=1}^{n} y_i} \times 100\% \tag{5}$$
Note: n represents the number of samples, yi represents the measured AGB value of the i-th sample, ŷi represents the predicted AGB value of the i-th sample, and ȳ represents the average measured AGB value.
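Equations (3)–(5) can be sketched directly in NumPy; the measured and predicted AGB values below are hypothetical:

```python
import numpy as np

def r2(y, y_hat):
    """Coefficient of determination (Equation (3))."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return 1.0 - ((y - y_hat) ** 2).sum() / ((y - y.mean()) ** 2).sum()

def rmse(y, y_hat):
    """Root mean square error (Equation (4))."""
    y, y_hat = np.asarray(y, float), np.asarray(y_hat, float)
    return float(np.sqrt(((y - y_hat) ** 2).mean()))

def rrmse(y, y_hat):
    """Relative RMSE (%): RMSE divided by the mean measured AGB (Equation (5))."""
    return rmse(y, y_hat) / float(np.mean(y)) * 100.0

y_true = [2.0, 2.5, 3.0, 3.5]       # measured AGB (kg/m^2), hypothetical
y_pred = [2.1, 2.4, 3.2, 3.4]       # model predictions, hypothetical
```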
In this study, the AGB estimation model was constructed under various feature combination settings and employed to generate AGB estimation maps for CMV. Subsequently, the BNFA of CMV was calculated with the biological nitrogen fixation rate of 66% [66] (Bolger et al., 1995). The formula is presented below:
$$BNFA = AGB \times \left(1 - MC\%\right) \times N\% \times BNF\% \times 10^4 \tag{6}$$
Note: BNFA represents the biological nitrogen fixation amount (kg/ha), AGB represents the aboveground fresh weight of CMV (kg/m2), MC% represents the moisture content (%), N% represents the nitrogen content (%), and BNF% represents the rate of biological nitrogen fixation (%). The average values for BNF%, MC%, and N% were measured from the field studies (i.e., 66%, 87.76%, and 2.03%), and 10^4 represents the unit conversion factor.
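The BNFA conversion can be sketched as a one-line function using the field-average rates from the note above:

```python
def bnfa_kg_per_ha(agb_kg_m2, mc=0.8776, n=0.0203, bnf=0.66):
    """BNFA (kg/ha) from fresh AGB (kg/m^2), using the field-average
    moisture content, nitrogen content, and nitrogen-fixation rate."""
    return agb_kg_m2 * (1.0 - mc) * n * bnf * 1e4

bnfa = bnfa_kg_per_ha(2.0)      # ~32.8 kg/ha for 2 kg/m^2 of fresh AGB
```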

3. Results

3.1. Selected Features for AGB Modeling

(1)
Spectral features
The Pearson coefficients, indicating the correlation between the AGB of CMV and the 12 spectral features derived from UAV multispectral images, are presented in Table 4. The results indicate that the majority of spectral features exhibit a strong correlation with AGB (|r| > 0.7, p < 0.01), except for NDRE, CIRE, NGRDI, NPCI, and LCI. In particular, CIgreen shows the highest correlation with AGB (r = 0.76, p < 0.05), while NGRDI exhibits the lowest correlation (|r| = 0.62, p < 0.01). Subsequently, seven spectral features showing high correlation with AGB (|r| > 0.7, p < 0.05) were selected for the second stage of feature selection. After the initial Pearson-based selection, these features underwent further refinement using VSURF, resulting in the final selected features: CIgreen, GNDVI, GRDVI, SR, and GLI.
(2)
Textural features
Table 5 indicates that, among all 40 textural features, five (R-En, R-Sm, RE-Con, RE-En, RE-Sm) exhibit no significant correlation with AGB, while the remaining 35 demonstrate a noteworthy correlation with AGB (p < 0.05). In this study, strong correlations with AGB (|r| > 0.6) were identified and retained from the aforementioned 35 features, resulting in a total of 11 textural features. Among these, NIR-Cor exhibits the highest correlation with AGB (r = −0.82, p < 0.05), and all eight textural features extracted based on the NIR band show strong correlations with AGB. Following the initial selection by Pearson, these 11 textural features underwent further refinement using VSURF, ultimately yielding the final modeling features: RE-Cor, NIR-Con, NIR-Sm, and NIR-En.
Table 6 indicates the results of the Pearson correlation analysis conducted on the five structural features extracted from RGB images and the AGB of CMV. Following the analysis, it was observed that the correlation between PHstd and AGB lacked significance, leading to its exclusion from the selected structural features. The remaining four structural features (PHmean, PHcv, CV, and CC) exhibited a correlation with AGB beyond the moderate level, prompting their retention as part of the Str feature combination. To maintain a balance among different feature types, the remaining four structural features did not undergo VSURF feature selection.
Table 7 presents the results of Pearson correlation analysis and VSURF selection, resulting in the selection of five spectral features and four textural features from the initial 12 spectral features and 40 textural features. These chosen features serve as inputs for training the RF model, along with the four structural features retained after the Pearson correlation analysis selection.

3.2. Evaluation of AGB Estimation Models Using Various Feature Combinations

Based on the results of feature selection in Section 3.1 and feature combination settings in Section 2.5, the AGB estimation model utilizing the RF algorithm is constructed. Figure 7 shows the cross-validation scatter plot, while Figure 8 displays the scatter plot for spatial transferability validation.
Figure 7 indicates the performance of various feature combinations in modeling the AGB of CMV. Among models constructed with a single type of features, the Tex-based model indicates the best performance (R2 = 0.75, RMSE = 0.31 kg/m2, rRMSE = 9.92%), followed by the Sp-based model (R2 = 0.69, RMSE = 0.34 kg/m2, rRMSE = 10.88%), while the Str-based model performs relatively poorly (R2 = 0.65, RMSE = 0.37 kg/m2, rRMSE = 11.78%). The Sp + Tex combination (Figure 7d) significantly improves model performance (R2 = 0.84, RMSE = 0.25 kg/m2, rRMSE = 8.11%). Additionally, the combination of three types of features (Sp + Tex + Str) results in a substantial improvement in model accuracy (R2 = 0.89, RMSE = 0.22 kg/m2, rRMSE = 6.89%).
In summary, a model constructed from a single feature type (Sp or Tex) can achieve high estimation accuracy, while the Str-based model is less accurate. Combining two feature types improves accuracy to some extent, while combining all three leads to a significant increase in model accuracy.
Figure 8 indicates the results of spatial transferability validation for the AGB estimation model with different feature combinations. Similarly, the model with a combination of three types of features (Sp + Tex + Str) performs the best (R2 = 0.83, RMSE = 0.23 kg/m2, rRMSE = 8.29%).

3.3. Mapping Results for AGB and BNFA

In this section, AGB maps for both Field East and Field West (Figure 9b–h and Figure 10b–h) were generated using AGB estimation models incorporating various feature combinations. Upon visual examination of the RGB orthophotographs (Figure 9a and Figure 10a) and the resulting AGB maps, a consistent spatial distribution pattern with CMV growth conditions and bare plot height was observed. Specifically, the AGB map generated from the optimal estimation model for Field East indicated a spatial trend with higher values in the south, lower in the north, higher in the west, and lower in the east. The AGB values ranged from 1.45 to 4.29 kg/m2. Similarly, the AGB map based on the optimal estimation model for Field West indicated a spatial trend with higher values in the south, lower in the north, higher in the east, and lower in the west, with AGB values ranging from 1.24 to 3.59 kg/m2.
Furthermore, combining the best estimation model with ground-based data (plot-specific moisture content, nitrogen content, and a 66% biological nitrogen fixation rate measured in previous studies on the same fields), BNFA maps for CMV were produced (Figure 9i and Figure 10i). The BNFA distributions for Field East and Field West follow the spatial patterns of the corresponding AGB maps, ranging from 20.95 to 71.88 kg/ha and from 17.86 to 51.70 kg/ha, respectively.
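The exact AGB-to-BNFA conversion is not reproduced in this section; the sketch below assumes a plausible form combining fresh AGB with the paper's MC%, N%, and BNF% quantities, which reproduces the reported order of magnitude (tens of kg N/ha). The function name and the numeric values are illustrative, not the study's measured data:

```python
import numpy as np

def bnfa_kg_per_ha(agb_fresh_kg_m2, mc, n_frac, bnf_rate=0.66):
    """Convert fresh AGB (kg/m2) to biologically fixed N (kg/ha).

    mc: moisture content fraction (MC%); n_frac: N content of dry
    matter (N%); bnf_rate: fraction of plant N from biological
    fixation (BNF%, 66% in this study). Assumed formula, for
    illustration only.
    """
    dry_matter = agb_fresh_kg_m2 * (1.0 - mc)   # kg dry matter per m2
    n_uptake = dry_matter * n_frac              # kg N per m2
    return n_uptake * bnf_rate * 1e4            # kg N per ha

# Tiny example "AGB map" spanning the Field East value range (kg/m2).
agb_map = np.array([[1.45, 2.80],
                    [3.50, 4.29]])
bnfa_map = bnfa_kg_per_ha(agb_map, mc=0.88, n_frac=0.03)
```

With a moisture content near 88% and ~3% N in dry matter (typical values assumed here), the output falls in the tens of kg N/ha, consistent with the mapped ranges above.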

4. Discussion

4.1. Fusion of Textural and Structural Features to Compensate for Spectral Saturation Effect

Spectral features play a crucial role in enhancing vegetation signals. However, their effectiveness can diminish as vegetation matures, owing to non-linear changes in the capacity of canopy chlorophyll to absorb and reflect specific wavelength bands. Previous studies have highlighted saturation in vegetation indices such as NDVI beyond certain AGB and leaf area index values, primarily due to the saturation of red-band reflectance [67,68]. Despite attempts to improve AGB estimation by integrating multiple vegetation indices, challenges persist, including soil background interference, variations in vegetation structure across growth stages, and sowing density in densely vegetated regions, all of which contribute to spectral saturation. In this study, the AGB estimation model built on a combination of vegetation-index spectral features performed suboptimally in both cross-validation (R2 = 0.69) and spatial transferability validation (R2 = 0.65). Figure 11 shows saturation points for key indices, such as NDVI at approximately 0.86, SR at around 14, and GRDVI at about 85. The plateauing of these spectral features beyond specific AGB thresholds aligns with findings from analogous studies on other vegetation types [69,70]. This phenomenon may be attributed to CMV's distinct morphological characteristics, including low plant height, dense ground cover, and high chlorophyll content, which increase susceptibility to leaf influence during optical sensor observations. Additionally, mutual shading among plants may attenuate electromagnetic waves passing through the canopy, hindering accurate acquisition of information on upright stems and surface stubble [71].
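The saturation behaviour can be illustrated numerically. Assuming a saturating exponential response of NDVI to AGB (an illustrative functional form with made-up parameters, not the paper's fitted curve), the index's sensitivity collapses at high biomass, so equal AGB increments become nearly indistinguishable in NDVI:

```python
import numpy as np

# Assumed saturating response: NDVI = a * (1 - exp(-b * AGB)).
a, b = 0.90, 1.2                        # illustrative parameters
agb = np.linspace(0.5, 4.5, 9)          # AGB range in kg/m2

ndvi = a * (1.0 - np.exp(-b * agb))
sensitivity = a * b * np.exp(-b * agb)  # analytical dNDVI/dAGB

# Sensitivity at 4.5 kg/m2 relative to 0.5 kg/m2: near zero, i.e.,
# the index has effectively stopped responding to added biomass.
ratio = sensitivity[-1] / sensitivity[0]
```

Under these assumed parameters the index plateaus just below 0.9, echoing the NDVI saturation point of about 0.86 observed in Figure 11.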
Based on the feature combinations outlined in Section 2.5, this study constructed and compared AGB estimation models for CMV to quantitatively evaluate the impact of these combinations on AGB modeling (Figure 7 and Figure 8). Cross-validation and spatial transferability validation results indicated that the modeling accuracy of spectral features (Sp) (R2 of 0.69 and 0.65, respectively) surpassed that of structural features (Str) (R2 of 0.65 and 0.52, respectively). This finding contrasts with a previous study on maize AGB estimation using SfM-derived structural features [72]. Nevertheless, other studies have consistently indicated the effectiveness of such structural features for estimating various vegetation parameters, including AGB [73,74], LAI [75], nitrogen, and chlorophyll [76]. Notably, textural features exhibited the best single-type modeling performance (R2 of 0.75 and 0.69, respectively).
The fusion of image textural features significantly improved AGB estimation accuracy for CMV, achieving R2 values of 0.84 and 0.79 in cross-validation and spatial transferability validation, respectively, in line with similar previous studies [77,78]. Additionally, structural features such as plant height, canopy volume, and canopy cover provided information complementary to spectral features, enhancing AGB estimation accuracy. Representing plant height and below-canopy information is particularly important in areas with high vegetation cover, where structural features have proved effective in mitigating underestimation under dense vegetation [79]. In this study, combining structural with spectral features likewise improved estimation accuracy (R2 of 0.74 and 0.71, respectively).
Within the context of this study, the fusion of spectral, textural, and structural features (Sp + Tex + Str) demonstrated the potential to effectively mitigate the saturation effect observed in optical remote sensing, yielding efficient estimation of the AGB of CMV at the full blooming stage with an R2 of 0.89 in cross-validation and 0.83 in spatial transferability validation. An outstanding advantage of this approach is its independence from additional expert knowledge and from the parameter complexity of geometrical-optical models [80,81] and radiative transfer models [82,83]. Moreover, the model enables rapid and highly accurate spatial mapping of the AGB of CMV during the full blooming stage at the field scale, providing insights into CMV growth status.

4.2. Large-Scale Rapid Estimation of the BNFA of CMV

In response to increasing environmental awareness, substituting part of the chemical fertilizer input with leguminous green manures has become a significant trend in agricultural practice. The inherent biological nitrogen fixation of legume green manures is acknowledged as a crucial pathway for on-farm nitrogen inputs, presenting considerable potential for replacing nitrogen fertilizers. Previous studies have indicated that up to 60% of the nitrogen uptake of subsequent crops in a legume green manure-based rotation system can be attributed to biological nitrogen fixation by the previously planted green manure [84]. CMV thus emerges as a natural alternative to nitrogen fertilizers, prompting significant research interest in effective BNFA estimation. Traditional investigations of biological nitrogen fixation in green manures rely predominantly on field surveys [85,86]; however, the need for field sampling makes them time-consuming and unsuitable for swift decision-making and for studies covering extensive geographical areas. Consequently, rapid estimation of the BNFA of CMV over large areas has become imperative.
In this study, UAV remote sensing was leveraged to estimate the AGB of CMV, enabling large-area BNFA estimation and spatial mapping via the conversion formula relating AGB to moisture content, nitrogen content, and the biological nitrogen fixation rate. This method enables swift assessment of the BNFA of CMV across diverse study fields, reducing the workload for field researchers, and supports the formulation of land management strategies, optimization of nitrogen fertilizer application, and improvement of agricultural production efficiency. However, certain limitations exist: the nitrogen and moisture contents used in the estimation did not fully capture individual growth differences, leading to overestimation of BNFA in some underdeveloped plots. Future studies will focus on rapidly and accurately estimating the moisture and nitrogen contents of CMV to improve BNFA estimation precision.

4.3. Future Studies

This study clearly demonstrates a spectral saturation effect in optical remote sensing for AGB estimation under high CMV coverage during the full blooming stage, a challenge effectively addressed through the UAV multi-source map fusion technique [15,17]. However, it is important to acknowledge that the data used in this study were confined to a single year, limiting comprehensive validation of the method's robustness across years. Subsequent studies should prioritize validation on multi-year datasets to ensure the method's reliability in long-term applications.
Furthermore, this study used the default small window size (3 × 3) of the ENVI 5.3 software for calculating image textural features. This choice runs counter to the perspective of Liu et al., who suggest using a larger window size during periods of vigorous vegetation growth for a more comprehensive characterization of plant structure [87]. Notably, the results of this study show that textural features outperformed spectral features alone even with the default window size, underscoring the potential of textural features for AGB estimation in areas with high CMV cover. This finding must nevertheless be treated with caution, as the default window size may not have fully exploited the advantages of textural features under high CMV cover; future studies should therefore systematically consider the effect of window size to maximize the modeling benefits of textural features. Regarding DEM data, a LiDAR sensor could provide more accurate information but was not available during the flight missions for this study, so DEM data were generated by the RGB-based SfM method widely used in similar previous studies [14,16,17]. The use of LiDAR sensors in future studies could further improve mapping accuracy.
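The windowed GLCM texture computation discussed above can be sketched in pure NumPy as a stand-in for ENVI: Haralick contrast in a 3 × 3 moving window (the ENVI default used in the study). The quantization level count and the horizontal-only neighbour offset are simplifications of the full GLCM configuration:

```python
import numpy as np

def window_glcm_contrast(band, win=3, levels=16):
    """Per-pixel GLCM contrast of a quantized single band.

    Simplified sketch: symmetric co-occurrence of horizontal
    neighbour pairs only; border pixels are left as NaN.
    """
    # Quantize the band into integer grey levels 0..levels-1.
    edges = np.linspace(band.min(), band.max(), levels + 1)[1:-1]
    q = np.digitize(band, edges)
    half = win // 2
    out = np.full(band.shape, np.nan)
    for i in range(half, band.shape[0] - half):
        for j in range(half, band.shape[1] - half):
            w = q[i - half:i + half + 1, j - half:j + half + 1]
            # Symmetric co-occurrence matrix of horizontal pairs.
            p = np.zeros((levels, levels))
            for a, b in zip(w[:, :-1].ravel(), w[:, 1:].ravel()):
                p[a, b] += 1
                p[b, a] += 1
            p /= p.sum()
            ii, jj = np.indices(p.shape)
            out[i, j] = (p * (ii - jj) ** 2).sum()  # Haralick contrast
    return out

rng = np.random.default_rng(1)
contrast_map = window_glcm_contrast(rng.random((10, 10)))
```

Enlarging `win` as Liu et al. suggest simply widens the slice around each pixel, trading fine spatial detail for a more stable texture estimate.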
Additionally, given the advantages of the multi-source map fusion technique, future studies could incorporate additional sensors (e.g., hyperspectral and thermal sensors) to extract further features from the imagery [15,88,89]. Such a multi-source feature fusion strategy is expected to let models reflect vegetation growth conditions more comprehensively and accurately, further improving estimation accuracy.

5. Conclusions

This study established an AGB estimation model for CMV at the full blooming stage by leveraging multi-source UAV imagery features encompassing spectral, textural, and structural information. The findings indicated that relying solely on spectral features led to poor estimation performance in regions with higher AGB, primarily due to canopy spectral saturation. In contrast, image textural features better captured variations in canopy density caused by AGB differences within small areas, and combining textural with spectral features notably alleviated canopy spectral saturation, substantially improving AGB estimation accuracy. Structural features alone, however, proved insufficient to capture the intricate growth conditions of CMV and yielded unsatisfactory estimation performance. Among all feature combinations, the fusion of spectral, textural, and structural features performed best, surpassing models built on single or dual feature combinations. In summary, this study establishes that fusing spectral, textural, and structural features from UAV imagery enables effective AGB estimation for CMV and supports an approximate estimation of the BNFA of CMV in the two study fields. These results provide valuable insights for formulating sound strategies to optimize fertilizer input for rice production in southern China.

Author Contributions

Conceptualization, C.Z., R.M. and Q.Z.; methodology, C.Z., Z.F. and R.M.; software, C.Z. and Z.F.; validation, C.Z., R.M. and Q.Z.; formal analysis, C.Z.; investigation, C.Z., Z.F., C.Y. and M.G.; resources, R.M., M.G. and Q.Z.; data curation, C.Z., Z.F. and C.Y.; writing—original draft preparation, C.Z.; writing—review and editing, C.Z., R.M. and Q.Z.; visualization, C.Z. and Z.F.; supervision, R.M., M.G. and Q.Z.; project administration, R.M., C.Y. and Q.Z.; funding acquisition, R.M., M.G. and Q.Z. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Key Research and Development Program of China (grant No. 2021YFD1700200), National Natural Science Foundation of China (grant No. 42471362), and Key Research and Development Program of Heilongjiang, China (grant No. 2022ZX01A25; JD2023GJ01).

Data Availability Statement

The data presented in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare no conflicts of interest.

Abbreviations

The following abbreviations are used in this manuscript:
CMV: Chinese milk vetch
AGB: aboveground biomass
BNFA: biological nitrogen fixation amount
UAV: unmanned aerial vehicle
Sp: spectral features
Tex: textural features
Str: structural features
NDVI: Normalized Difference Vegetation Index
NGRDI: Normalized Green–Red Difference Index
MCARI: Modified Chlorophyll Absorption Ratio Index
SR: Simple Ratio Index
LiDAR: light detection and ranging
GRDVI: Green Re-normalized Difference Vegetation Index
NDRE: Normalized Difference RedEdge Index
CIgreen: Green Chlorophyll Index
CIRE: Chlorophyll Index-RedEdge
GLI: Green Leaf Index
GNDVI: Green Normalized Difference Vegetation Index
NPCI: Normalized Pigment Chlorophyll Ratio Index
VARIRE: Visible Atmospherically Resistant Index-RedEdge
LCI: Leaf Chlorophyll Index
SfM: structure from motion
RTK: real-time kinematic
ELRC: empirical linearization radiometric calibration
GCPs: ground control points
DEM: digital elevation model
GDAL: Geospatial Data Abstraction Library
Var: variance
Hom: homogeneity
Con: contrast
Dis: dissimilarity
En: entropy
Sm: second moment
Cor: correlation
DSM: digital surface model
CHM: canopy height model
CC: canopy cover
CV: canopy volume
PHmean: mean plant height
PHstd: standard deviation of plant height
PHcv: coefficient of variation of plant height
VSURF: variable selection using random forests
RF: random forests
R2: coefficient of determination
RMSE: root mean square error
rRMSE: relative root mean square error
BNF%: rate of biological nitrogen fixation
MC%: moisture content
N%: nitrogen content

References

  1. Gao, S.; Zhou, G.; Chang, D.; Liang, H.; Nie, J.; Liao, Y.; Lu, Y.; Xu, C.; Liu, J.; Wu, J.; et al. Southern China can produce more high-quality rice with less N by green manuring. Resour. Conserv. Recycl. 2023, 196, 107025. [Google Scholar] [CrossRef]
  2. Cai, S.; Pittelkow, C.M.; Zhao, X.; Wang, S. Winter legume-rice rotations can reduce nitrogen pollution and carbon footprint while maintaining net ecosystem economic benefits. J. Clean. Prod. 2018, 195, 289–300. [Google Scholar] [CrossRef]
  3. Xie, Z.; Shah, F.; Tu, S.; Xu, C.; Cao, W. Chinese milk vetch as green manure mitigates nitrous oxide emission from monocropped rice system in South China. PLoS ONE 2016, 11, e0168134. [Google Scholar] [CrossRef] [PubMed]
  4. Issah, G.; Schoenau, J.J.; Lardner, H.A.; Knight, J.D. Nitrogen Fixation and Resource Partitioning in Alfalfa (Medicago sativa L.), Cicer Milkvetch (Astragalus cicer L.) and Sainfoin (Onobrychis viciifolia Scop.) Using 15N Enrichment under Controlled Environment Conditions. Agronomy 2020, 10, 1438. [Google Scholar] [CrossRef]
  5. Zhang, J.-q.; Dong, Y.-b.; Jiao, Y.; Wang, B.-x.; Wang, C.-y.; Song, M.-x.; Xiong, Z.-q. Nitrogen management regulates nitrogen fixation efficiency of milk vetch and rice productivity under milk vetch-rice rotation system. J. Plant Nutr. Fertil. 2024, 30, 1–11. [Google Scholar] [CrossRef]
  6. Zhang, J.-s.; Zhang, L.; Ding, L.; Liu, C.-z.; Lü, Y.-h.; Zheng, C.-f.; Zhang, C.-l.; Nie, L.-p.; Cao, W.-d.; Zhang, Y.-t. Effects of Chinese milk vetch incorporation and chemical fertilizer reduction on soil nitrogen supply and rice growth. J. Plant Nutr. Fertil. 2022, 28, 1793–1803. [Google Scholar] [CrossRef]
  7. Tan, Y.; Cheng, E.; Feng, X.; Zhao, B.; Chen, J.; Xie, Q.; Peng, H.; Li, C.; Lu, C.; Li, Y. Application of APSIM model in winter wheat growth monitoring. Front. Plant Sci. 2024, 15, 1500103. [Google Scholar] [CrossRef]
  8. McNeill, K.; Macdonald, K.; Singh, A.; Binns, A.D. Food and water security: Analysis of integrated modeling platforms. Agric. Water Manag. 2017, 194, 100–112. [Google Scholar] [CrossRef]
  9. He, C.J.; Cheng, S.Y.; Zheng, R.; Liu, J. Delay-and-Sum Beamforming-Based Spatial Mapping for Multisource Sound Localization. IEEE Internet Things 2024, 11, 16048–16060. [Google Scholar] [CrossRef]
  10. Lv, Z.; Meng, R.; Chen, G.; Zhao, F.; Xu, B.; Zhao, Y.; Huang, Z.; Zhou, L.; Zeng, L.; Yan, J. Combining multiple spectral enhancement features for improving spectroscopic asymptomatic detection and symptomatic severity classification of southern corn leaf blight. Precis. Agric. 2023, 24, 25. [Google Scholar] [CrossRef]
  11. Pagani, V.; Guarneri, T.; Fumagalli, D.; Movedi, E.; Testi, L.; Klein, T.; Calanca, P.; Villalobos, F.; Lopez-Bernal, A.; Niemeyer, S.; et al. Improving cereal yield forecasts in Europe—The impact of weather extremes. Eur. J. Agron. 2017, 89, 97–106. [Google Scholar] [CrossRef]
  12. Ma, Z.F.; Zhang, H.; Liu, J. DB-RNN: An RNN for Precipitation Nowcasting Deblurring. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 5026–5041. [Google Scholar] [CrossRef]
  13. Lv, Z.; Xu, B.; Zhong, L.; Chen, G.; Huang, Z.; Sun, R.; Huang, W.; Zhao, F.; Meng, R. Improved monitoring of southern corn rust using UAV-based multi-view imagery and an attention-based deep learning method. Comput. Electron. Agric. 2024, 224, 109232. [Google Scholar] [CrossRef]
  14. Zhou, L.; Meng, R.; Tan, Y.; Lv, Z.; Zhao, Y.; Xu, B.; Zhao, F. Comparison of UAV-based LiDAR and digital aerial photogrammetry for measuring crown-level canopy height in the urban environment. Urban For. Urban Green. 2022, 69, 127489. [Google Scholar] [CrossRef]
  15. Xu, B.; Meng, R.; Chen, G.; Liang, L.; Lv, Z.; Zhou, L.; Sun, R.; Zhao, F.; Yang, W. Improved weed mapping in corn fields by combining UAV-based spectral, textural, structural, and thermal measurements. Pest Manag. Sci. 2023, 79, 2591–2602. [Google Scholar] [CrossRef]
  16. Xu, L.; Zhou, L.; Meng, R.; Zhao, F.; Lv, Z.; Xu, B.; Zeng, L.; Yu, X.; Peng, S. An improved approach to estimate ratoon rice aboveground biomass by integrating UAV-based spectral, textural and structural features. Precis. Agric. 2022, 23, 1276–1301. [Google Scholar] [CrossRef]
  17. Lv, Z.; Meng, R.; Man, J.; Zeng, L.; Wang, M.; Xu, B.; Gao, R.; Sun, R.; Zhao, F. Modeling of winter wheat fAPAR by integrating Unmanned Aircraft Vehicle-based optical, structural and thermal measurement. Int. J. Appl. Earth Obs. Geoinf. 2021, 102, 102407. [Google Scholar] [CrossRef]
  18. Zhou, L.; Meng, R.; Yu, X.; Liao, Y.; Huang, Z.; Lü, Z.; Xu, B.; Yang, G.; Peng, S.; Xu, L. Improved Yield Prediction of Ratoon Rice Using Unmanned Aerial Vehicle-Based Multi-Temporal Feature Method. Rice Sci. 2023, 30, 247–256. [Google Scholar] [CrossRef]
  19. Yuan, J.H.; Zhang, Y.L.; Zheng, Z.J.; Yao, W.; Wang, W.S.; Guo, L.F. Grain Crop Yield Prediction Using Machine Learning Based on UAV Remote Sensing: A Systematic Literature Review. Drones 2024, 8, 559. [Google Scholar] [CrossRef]
  20. Gade, S.A.; Madolli, M.J.; García-Caparrós, P.; Ullah, H.; Cha-um, S.; Datta, A.; Himanshu, S.K. Advancements in UAV remote sensing for agricultural yield estimation: A systematic comprehensive review of platforms, sensors, and data analytics. Remote Sens. Appl. 2025, 37, 101418. [Google Scholar] [CrossRef]
  21. Huang, Z.; Zhong, L.; Zhao, F.; Wu, J.; Tang, H.; Lv, Z.; Xu, B.; Zhou, L.; Sun, R.; Meng, R. A spectral-temporal constrained deep learning method for tree species mapping of plantation forests using time series Sentinel-2 imagery. ISPRS J. Photogramm. Remote Sens. 2023, 204, 397–420. [Google Scholar] [CrossRef]
  22. Han, L.; Yang, G.; Dai, H.; Xu, B.; Yang, H.; Feng, H.; Li, Z.; Yang, X. Modeling maize above-ground biomass based on machine learning approaches using UAV remote-sensing data. Plant Methods 2019, 15, 10. [Google Scholar] [CrossRef] [PubMed]
  23. Yue, J.; Yang, G.; Tian, Q.; Feng, H.; Xu, K.; Zhou, C. Estimate of winter-wheat above-ground biomass based on UAV ultrahigh-ground-resolution image textures and vegetation indices. ISPRS J. Photogramm. Remote Sens. 2019, 150, 226–244. [Google Scholar] [CrossRef]
  24. Zheng, H.; Cheng, T.; Zhou, M.; Li, D.; Yao, X.; Tian, Y.; Cao, W.; Zhu, Y. Improved estimation of rice aboveground biomass combining textural and spectral analysis of UAV imagery. Precis. Agric. 2019, 20, 611–629. [Google Scholar] [CrossRef]
  25. Meng, R.; Lv, Z.G.; Yan, J.B.; Chen, G.S.; Zhao, F.; Zeng, L.L.; Xu, B.Y. Development of Spectral Disease Indices for Southern Corn Rust Detection and Severity Classification. Remote Sens. 2020, 12, 3233. [Google Scholar] [CrossRef]
  26. Bendig, J.; Bolten, A.; Bennertz, S.; Broscheit, J.; Eichfuss, S.; Bareth, G. Estimating Biomass of Barley Using Crop Surface Models (CSMs) Derived from UAV-Based RGB Imaging. Remote Sens. 2014, 6, 10395–10412. [Google Scholar] [CrossRef]
  27. Cen, H.; Wan, L.; Zhu, J.; Li, Y.; Li, X.; Zhu, Y.; Weng, H.; Wu, W.; Yin, W.; Xu, C.; et al. Dynamic monitoring of biomass of rice under different nitrogen treatments using a lightweight UAV with dual image-frame snapshot cameras. Plant Methods 2019, 15, 32. [Google Scholar] [CrossRef]
  28. Li, W.; Jiang, J.; Weiss, M.; Madec, S.; Tison, F.; Philippe, B.; Comar, A.; Baret, F. Impact of the reproductive organs on crop BRDF as observed from a UAV. Remote Sens. Environ. 2021, 259, 112433. [Google Scholar] [CrossRef]
  29. Nguy-Robertson, A.; Gitelson, A.; Peng, Y.; Viña, A.; Arkebauer, T.; Rundquist, D. Green Leaf Area Index Estimation in Maize and Soybean: Combining Vegetation Indices to Achieve Maximal Sensitivity. Agron. J. 2012, 104, 1336–1347. [Google Scholar] [CrossRef]
  30. Kuplich, T.M.; Curran, P.J.; Atkinson, P.M. Relating SAR image texture to the biomass of regenerating tropical forests. Int. J. Remote Sens. 2005, 26, 4829–4854. [Google Scholar] [CrossRef]
  31. Liao, Z.; He, B.; Quan, X. Potential of texture from SAR tomographic images for forest aboveground biomass estimation. Int. J. Appl. Earth Obs. Geoinf. 2020, 88, 102049. [Google Scholar] [CrossRef]
  32. Dai, M.; Yang, T.; Yao, Z.; Liu, T.; Sun, C. Wheat Biomass Estimation in Different Growth Stages Based on Color and Texture Features of UAV Images. Smart Agric. 2022, 4, 71–83. [Google Scholar] [CrossRef]
  33. Liu, Y.; Liu, S.; Li, J.; Guo, X.; Wang, S.; Lu, J. Estimating biomass of winter oilseed rape using vegetation indices and texture metrics derived from UAV multispectral images. Comput. Electron. Agric. 2019, 166, 105026. [Google Scholar] [CrossRef]
  34. Sarker, L.R.; Nichol, J.E. Improved forest biomass estimates using ALOS AVNIR-2 texture indices. Remote Sens. Environ. 2011, 115, 968–977. [Google Scholar] [CrossRef]
  35. da Costa, M.B.T.; Silva, C.A.; Broadbent, E.N.; Leite, R.V.; Mohan, M.; Liesenberg, V.; Stoddart, J.; do Amaral, C.H.; de Almeida, D.R.A.; da Silva, A.L.; et al. Beyond trees: Mapping total aboveground biomass density in the Brazilian savanna using high-density UAV-lidar data. For. Ecol. Manag. 2021, 491, 119155. [Google Scholar] [CrossRef]
  36. Lu, J.; Wang, H.; Qin, S.; Cao, L.; Pu, R.; Li, G.; Sun, J. Estimation of aboveground biomass of Robinia pseudoacacia forest in the Yellow River Delta based on UAV and Backpack LiDAR point clouds. Int. J. Appl. Earth Obs. Geoinf. 2020, 86, 102014. [Google Scholar] [CrossRef]
  37. Wang, Q.; Pang, Y.; Chen, D.; Liang, X.; Lu, J. Lidar biomass index: A novel solution for tree-level biomass estimation using 3D crown information. For. Ecol. Manag. 2021, 499, 119542. [Google Scholar] [CrossRef]
  38. Yang, D.; Meng, R.; Morrison, B.D.; McMahon, A.; Hantson, W.; Hayes, D.J.; Breen, A.L.; Salmon, V.G.; Serbin, S.P. A multi-sensor unoccupied aerial system improves characterization of vegetation composition and canopy properties in the Arctic tundra. Remote Sens. 2020, 12, 2638. [Google Scholar] [CrossRef]
  39. Cucchiaro, S.; Straffelini, E.; Chang, K.-J.; Tarolli, P. Mapping vegetation-induced obstruction in agricultural ditches: A low-cost and flexible approach by UAV-SfM. Agric. Water Manag. 2021, 256, 107083. [Google Scholar] [CrossRef]
  40. Holiaka, D.; Kato, H.; Yoschenko, V.; Onda, Y.; Igarashi, Y.; Nanba, K.; Diachuk, P.; Holiaka, M.; Zadorozhniuk, R.; Kashparov, V.; et al. Scots pine stands biomass assessment using 3D data from unmanned aerial vehicle imagery in the Chernobyl Exclusion Zone. J. Environ. Manag. 2021, 295, 113319. [Google Scholar] [CrossRef] [PubMed]
  41. Zhang, Y.; Onda, Y.; Kato, H.; Feng, B.; Gomi, T. Understory biomass measurement in a dense plantation forest based on drone-SfM data by a manual low-flying drone under the canopy. J. Environ. Manag. 2022, 312, 114862. [Google Scholar] [CrossRef] [PubMed]
  42. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  43. Tilly, N.; Aasen, H.; Bareth, G. Fusion of Plant Height and Vegetation Indices for the Estimation of Barley Biomass. Remote Sens. 2015, 7, 11449–11480. [Google Scholar] [CrossRef]
  44. Shu, M.; Shen, M.; Dong, Q.; Yang, X.; Li, B.; Ma, Y. Estimating the maize above-ground biomass by constructing the tridimensional concept model based on UAV-based digital and multi-spectral images. Field Crops Res. 2022, 282, 108491. [Google Scholar] [CrossRef]
  45. Farrand, W.H.; Singer, R.B.; Merényi, E. Retrieval of apparent surface reflectance from AVIRIS data: A comparison of empirical line, radiative transfer, and spectral mixture methods. Remote Sens. Environ. 1994, 47, 311–321. [Google Scholar] [CrossRef]
  46. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  47. Jordan, C.F. Derivation of Leaf-Area Index from Quality of Light on the Forest Floor. Ecology 1969, 50, 663–666. [Google Scholar] [CrossRef]
  48. Cao, Q.; Miao, Y.; Wang, H.; Huang, S.; Cheng, S.; Khosla, R.; Jiang, R. Non-destructive estimation of rice plant nitrogen status with Crop Circle multispectral active canopy sensor. Field Crops Res. 2013, 154, 133–144. [Google Scholar] [CrossRef]
  49. Barnes, E.; Clarke, T.; Richards, S.; Colaizzi, P.; Haberland, J.; Kostrzewski, M.; Waller, P.; Choi, C.; Riley, E.; Thompson, T. Coincident detection of crop water stress, nitrogen status and canopy density using ground based multispectral data. In Proceedings of the Fifth International Conference on Precision Agriculture, Bloomington, MN, USA, 16–19 July 2000; p. 6. [Google Scholar]
  50. Gitelson, A.A.; Gritz, Y.; Merzlyak, M.N. Relationships between leaf chlorophyll content and spectral reflectance and algorithms for non-destructive chlorophyll assessment in higher plant leaves. J. Plant Physiol. 2003, 160, 271–282. [Google Scholar] [CrossRef]
  51. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially Located Platform and Aerial Photography for Documentation of Grazing Impacts on Wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  52. Gitelson, A.A.; Kaufman, Y.J.; Merzlyak, M.N. Use of a green channel in remote sensing of global vegetation from EOS-MODIS. Remote Sens. Environ. 1996, 58, 289–298. [Google Scholar] [CrossRef]
  53. Jannoura, R.; Brinkmann, K.; Uteau, D.; Bruns, C.; Joergensen, R.G. Monitoring of crop biomass using true colour aerial photographs taken from a remote controlled hexacopter. Biosyst. Eng. 2015, 129, 341–351. [Google Scholar] [CrossRef]
  54. Peñuelas, J.; Gamon, J.A.; Fredeen, A.L.; Merino, J.; Field, C.B. Reflectance indices associated with physiological changes in nitrogen- and water-limited sunflower leaves. Remote Sens Environ. 1994, 48, 135–146. [Google Scholar] [CrossRef]
  55. Viña, A.; Gitelson, A.A.; Rundquist, D.C.; Keydan, G.; Leavitt, B.; Schepers, J. Monitoring Maize (Zea mays L.) Phenology with Remote Sensing. Agron. J. 2004, 96, 1139–1147. [Google Scholar] [CrossRef]
  56. Thenkabail, P.S.; Smith, R.B.; De Pauw, E. Hyperspectral Vegetation Indices and Their Relationships with Agricultural Crop Characteristics. Remote Sens. Environ. 2000, 71, 158–182. [Google Scholar] [CrossRef]
  57. Popescu, S.C.; Zhao, K. A voxel-based lidar method for estimating crown base height for deciduous and pine trees. Remote Sens. Environ. 2008, 112, 767–781. [Google Scholar] [CrossRef]
  58. Kim, E.; Lee, W.-K.; Yoon, M.; Lee, J.-Y.; Son, Y.; Abu Salim, K. Estimation of voxel-based above-ground biomass using airborne LiDAR data in an intact tropical rain forest, Brunei. Forests 2016, 7, 259. [Google Scholar] [CrossRef]
  59. Kachamba, D.J.; Ørka, H.O.; Næsset, E.; Eid, T.; Gobakken, T. Influence of plot size on efficiency of biomass estimates in inventories of dry tropical forests assisted by photogrammetric data from an unmanned aircraft system. Remote Sens. 2017, 9, 610. [Google Scholar] [CrossRef]
  60. Maimaitijiang, M.; Sagan, V.; Sidike, P.; Maimaitiyiming, M.; Hartling, S.; Peterson, K.T.; Maw, M.J.W.; Shakoor, N.; Mockler, T.; Fritschi, F.B. Vegetation Index Weighted Canopy Volume Model (CVMVI) for soybean biomass estimation from Unmanned Aerial System-based RGB imagery. ISPRS J. Photogramm. Remote Sens. 2019, 151, 27–41. [Google Scholar] [CrossRef]
  61. Zeng, L.; Peng, G.; Meng, R.; Man, J.; Li, W.; Xu, B.; Lv, Z.; Sun, R. Wheat yield prediction based on unmanned aerial vehicles-collected red–green–blue imagery. Remote Sens. 2021, 13, 2937. [Google Scholar] [CrossRef]
  62. Li, W.; Niu, Z.; Chen, H.; Li, D.; Wu, M.; Zhao, W. Remote estimation of canopy height and aboveground biomass of maize using high-resolution stereo images from a low-cost unmanned aerial vehicle system. Ecol. Indic. 2016, 67, 637–648. [Google Scholar] [CrossRef]
  63. Genuer, R.; Poggi, J.-M.; Tuleau-Malot, C. Variable selection using random forests. Pattern Recognit. Lett. 2010, 31, 2225–2236. [Google Scholar] [CrossRef]
  64. Stolbov, M.; Shchepeleva, M. Does one size fit all? Comparing the determinants of the FinTech market segments expansion. J. Financ. Data Sci. 2023, 9, 100095. [Google Scholar] [CrossRef]
  65. Liu, T.; Li, P.; Zhao, F.; Liu, J.; Meng, R. Early-Stage Mapping of Winter Canola by Combining Sentinel-1 and Sentinel-2 Data in Jianghan Plain China. Remote Sens. 2024, 16, 3197. [Google Scholar] [CrossRef]
  66. Bolger, T.P.; Pate, J.S.; Unkovich, M.J.; Turner, N.C. Estimates of seasonal nitrogen fixation of annual subterranean clover-based pastures using the15N natural abundance technique. Plant Soil 1995, 175, 57–66. [Google Scholar] [CrossRef]
  67. Gitelson, A.A. Wide Dynamic Range Vegetation Index for Remote Quantification of Biophysical Characteristics of Vegetation. J. Plant Physiol. 2004, 161, 165–173. [Google Scholar] [CrossRef]
  68. Wang, W.; Yao, X.; Yao, X.; Tian, Y.; Liu, X.; Ni, J.; Cao, W.; Zhu, Y. Estimating leaf nitrogen concentration with three-band vegetation indices in rice and wheat. Field Crops Res. 2012, 129, 90–98. [Google Scholar] [CrossRef]
  69. González-Jaramillo, V.; Fries, A.; Bendix, J. AGB Estimation in a Tropical Mountain Forest (TMF) by Means of RGB and Multispectral Images Using an Unmanned Aerial Vehicle (UAV). Remote Sens. 2019, 11, 1413. [Google Scholar] [CrossRef]
  70. Prabhakara, K.; Hively, W.D.; McCarty, G.W. Evaluating the relationship between biomass, percent groundcover and remote sensing indices across six winter cover crop fields in Maryland, United States. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 88–102. [Google Scholar] [CrossRef]
  71. Li, F.; Piasecki, C.; Millwood, R.J.; Wolfe, B.; Mazarei, M.; Stewart, C.N. High-Throughput Switchgrass Phenotyping and Biomass Modeling by UAV. Front. Plant Sci. 2020, 11, 574073. [Google Scholar] [CrossRef]
  72. Niu, Y.; Zhang, L.; Zhang, H.; Han, W.; Peng, X. Estimating Above-Ground Biomass of Maize Using Features Derived from UAV-Based RGB Imagery. Remote Sens. 2019, 11, 1261. [Google Scholar] [CrossRef]
  73. Gil-Docampo, M.d.l.L.; Arza-García, M.; Ortiz-Sanz, J.; Martínez-Rodriguez, S.; Marcos-Robles, J.L.; Sánchez-Sastre, L.F. Above-ground biomass estimation of arable crops using UAV-based SfM photogrammetry. Geocarto Int. 2020, 35, 687–699. [Google Scholar] [CrossRef]
  74. Zhang, H.; Tang, Z.; Wang, B.; Meng, B.; Qin, Y.; Sun, Y.; Lv, Y.; Zhang, J.; Yi, S. A non-destructive method for rapid acquisition of grassland aboveground biomass for satellite ground verification using UAV RGB images. Glob. Ecol. Conserv. 2022, 33, e01999. [Google Scholar] [CrossRef]
  75. Lin, L.; Yu, K.; Yao, X.; Deng, Y.; Hao, Z.; Chen, Y.; Wu, N.; Liu, J. UAV Based Estimation of Forest Leaf Area Index (LAI) through Oblique Photogrammetry. Remote Sens. 2021, 13, 803. [Google Scholar] [CrossRef]
  76. Kopačková-Strnadová, V.; Koucká, L.; Jelének, J.; Lhotáková, Z.; Oulehle, F. Canopy top, height and photosynthetic pigment estimation using Parrot Sequoia multispectral imagery and the Unmanned Aerial Vehicle (UAV). Remote Sens. 2021, 13, 705. [Google Scholar] [CrossRef]
  77. Adeluyi, O.; Harris, A.; Foster, T.; Clay, G.D. Exploiting centimetre resolution of drone-mounted sensors for estimating mid-late season above ground biomass in rice. Eur. J. Agron. 2022, 132, 126411. [Google Scholar] [CrossRef]
  78. Ren, H.; Zhao, Y.; Xiao, W.; Yang, X.; Ding, B.; Chen, C. Monitoring potential spontaneous combustion in a coal waste dump after reclamation through UAV RGB imagery-based on alfalfa aboveground biomass (AGB). Land Degrad. Dev. 2022, 33, 2728–2742. [Google Scholar] [CrossRef]
  79. ten Harkel, J.; Bartholomeus, H.; Kooistra, L. Biomass and Crop Height Estimation of Different Crops Using UAV-Based Lidar. Remote Sens. 2020, 12, 17. [Google Scholar] [CrossRef]
  80. Fengjuan, S.; Weimin, J.; Meihong, F.; Weiliang, F. Retrieval of Forest above Ground Biomass Using 4-Scale Geometrical Optical Model and Remote Sensing Data. Remote Sens. Technol. Appl. 2019, 33, 1046–1055. [Google Scholar]
  81. Chen, W.; Zhao, J.; Cao, C.; Tian, H. Shrub biomass estimation in semi-arid sandland ecosystem based on remote sensing technology. Glob. Ecol. Conserv. 2018, 16, e00479. [Google Scholar] [CrossRef]
  82. Zhang, L.; Gao, H.; Zhang, X. Combining Radiative Transfer Model and Regression Algorithms for Estimating Aboveground Biomass of Grassland in West Ujimqin, China. Remote Sens. 2023, 15, 2918. [Google Scholar] [CrossRef]
  83. Wan, L.; Zhang, J.; Dong, X.; Du, X.; Zhu, J.; Sun, D.; Liu, Y.; He, Y.; Cen, H. Unmanned aerial vehicle-based field phenotyping of crop biomass using growth traits retrieved from PROSAIL model. Comput. Electron. Agric. 2021, 187, 106304. [Google Scholar] [CrossRef]
  84. Li, X.; Petersen, S.O.; Sørensen, P.; Olesen, J.E. Effects of contrasting catch crops on nitrogen availability and nitrous oxide emissions in an organic cropping system. Agric. Ecosyst. Environ. 2015, 199, 382–393. [Google Scholar] [CrossRef]
85. Unkovich, M.; Baldock, J.; Peoples, M. Prospects and problems of simple linear models for estimating symbiotic N2 fixation by crop and pasture legumes. Plant Soil 2010, 329, 75–89. [Google Scholar] [CrossRef]
  86. Yang, L.; Nie, J.; Xu, C.; Cao, W. Biological nitrogen fixation of Chinese Milk Vetch (Astragalus sinicus L.) as affected by exogenous carbon and nitrogen input. Symbiosis 2021, 85, 69–77. [Google Scholar] [CrossRef]
  87. Liu, J.; Zhu, Y.; Song, L.; Su, X.; Li, J.; Zheng, J.; Zhu, X.; Ren, L.; Wang, W.; Li, X. Optimizing window size and directional parameters of GLCM texture features for estimating rice AGB based on UAVs multispectral imagery. Front. Plant Sci. 2023, 14, 1284235. [Google Scholar] [CrossRef]
  88. Sun, D.; Robbins, K.; Morales, N.; Shu, Q.; Cen, H. Advances in optical phenotyping of cereal crops. Trends Plant Sci. 2022, 27, 191–208. [Google Scholar] [CrossRef]
  89. Shen, R.; Li, M.; Zhao, C.; Wang, B.; Guan, Y.; Liu, J.; Jiang, J. Hierarchical Causal Discovery From Large-Scale Observed Variables. IEEE Trans. Knowl. Data Eng. 2025, 1–14. [Google Scholar] [CrossRef]
Figure 1. Study area. (a) Location of Hubei Province in China; (b) Location of Jingzhou in Hubei Province and the experimental field in Jingzhou; (c) UAV RGB image of the “Field East”; (d) UAV RGB image of the “Field West”.
Figure 2. The workflow diagram of the present study.
Figure 3. Multispectral image of Field East before (a) and after (b) soil background removal, where the green areas represent the CMV plants and the rest represents the soil.
Figure 4. Validation of CMV vegetation and soil background classification during the full blooming stage using UAV multispectral (left) and RGB images (right).
Figure 5. DSM during the full blooming stage (a), DEM during the bare ground period (b), CHM (c), and raster image generation with a ground mask (d) for canopy cover (CC) extraction.
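The CHM in Figure 5 is the per-pixel difference between the full-bloom DSM and the bare-ground DEM, with a ground mask then applied for canopy cover extraction. A minimal stdlib sketch of that raster arithmetic (elevations and the 5 cm mask threshold are illustrative, not from the paper):

```python
def canopy_height_model(dsm, dem):
    """CHM = DSM (full-bloom surface) - DEM (bare ground), per pixel."""
    return [[s - g for s, g in zip(srow, grow)]
            for srow, grow in zip(dsm, dem)]

# Illustrative 2x2 elevation rasters (m)
dsm = [[31.42, 31.55], [31.48, 31.60]]
dem = [[31.20, 31.22], [31.21, 31.23]]
chm = canopy_height_model(dsm, dem)

# Ground mask: treat a pixel as vegetation if CHM exceeds a small
# height threshold (5 cm here, purely for illustration)
mask = [[h > 0.05 for h in row] for row in chm]
print(chm, mask)
```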
Figure 6. Linear correlation analysis between CHM-estimated CMV plant height and measured plant height.
Figure 7. Scatterplot for cross-validation with various feature combinations. (a) “Sp”; (b) “Tex”; (c) “Str”; (d) “Sp + Tex”; (e) “Sp + Str”; (f) “Tex + Str”; (g) “Sp + Tex + Str”.
Figure 8. Scatterplot for spatial transferability validation with various feature combinations. (a) “Sp”; (b) “Tex”; (c) “Str”; (d) “Sp + Tex”; (e) “Sp + Str”; (f) “Tex + Str”; (g) “Sp + Tex + Str”.
Figure 9. (a) The RGB image, where the green areas represent the CMV plants and the rest represents the soil; (b–h) the AGB maps of Field East generated with various feature combinations; (i) the BNFA map.
Figure 10. (a) The RGB image, where the green areas represent the CMV plants and the rest represents the soil; (b–h) the AGB maps of Field West generated with various feature combinations; (i) the BNFA map.
Figure 11. Spectral features of the two fields plotted against the AGB of CMV.
Table 1. Spectral features of the present study.
Spectral | Formula | Reference
NDVI (Normalized Difference Vegetation Index) | (NIR − R)/(NIR + R) | [46]
SR (Simple Ratio Index) | NIR/R | [47]
GRDVI (Green Re-normalized Difference Vegetation Index) | (NIR − G)/√(NIR + G) | [48]
NDRE (Normalized Difference RedEdge Index) | (NIR − RE)/(NIR + RE) | [49]
CIgreen (Green Chlorophyll Index) | NIR/G − 1 | [50]
CIRE (Chlorophyll Index-RedEdge) | NIR/RE − 1 | [50]
GLI (Green Leaf Index) | (2G − R − B)/(2G + R + B) | [51]
GNDVI (Green Normalized Difference Vegetation Index) | (NIR − G)/(NIR + G) | [52]
NGRDI (Normalized Green–Red Difference Index) | (G − R)/(G + R) | [53]
NPCI (Normalized Pigment Chlorophyll Ratio Index) | (R − B)/(R + B) | [54]
VARIRE (Visible Atmospherically Resistant Index-RedEdge) | (RE − 1.7R + 0.7B)/(RE + 2.3R − 1.3B) | [55]
LCI (Leaf Chlorophyll Index) | (NIR − RE)/(NIR + RE) | [56]
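As an illustration of how such indices are computed per pixel, here is a minimal stdlib sketch (band values are assumed to be surface reflectances in [0, 1]; function names and the sample reflectances are ours, not from the paper):

```python
def ndvi(nir, r):
    """Normalized Difference Vegetation Index: (NIR - R) / (NIR + R)."""
    return (nir - r) / (nir + r)

def gndvi(nir, g):
    """Green NDVI: (NIR - G) / (NIR + G)."""
    return (nir - g) / (nir + g)

def ci_green(nir, g):
    """Green Chlorophyll Index: NIR / G - 1."""
    return nir / g - 1

def gli(g, r, b):
    """Green Leaf Index: (2G - R - B) / (2G + R + B)."""
    return (2 * g - r - b) / (2 * g + r + b)

# Illustrative reflectances for a dense-canopy pixel
nir, r, g, b = 0.55, 0.05, 0.10, 0.04
print(round(ndvi(nir, r), 3))  # -> 0.833, i.e. dense green vegetation
```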
Table 2. Textural features of the present study.
Textural | Formula | Description
Mean | Σ_i Σ_j x(i, j)·p(i, j) | Average of gray values
Variance | Σ_i Σ_j (i − u)²·p(i, j) | Degree of change of gray values
Homogeneity | Σ_i Σ_j p(i, j)/(1 + (i − j)²) | Local texture homogeneity
Contrast | Σ_{n=0}^{Ng−1} n² Σ_i Σ_j p(i, j), with |i − j| = n | Texture clarity
Dissimilarity | Σ_{n=1}^{Ng−1} n Σ_i Σ_j p(i, j), with |i − j| = n | Texture information similarity
Entropy | −Σ_i Σ_j p(i, j)·log(p(i, j)) | Texture complexity and non-uniformity
Second moment | Σ_i Σ_j p(i, j)² | Roughness of texture and uniformity of image gray distribution
Correlation | Σ_i Σ_j [(i·j)·p(i, j) − μx·μy]/(σx·σy) | Texture consistency
Note: In the equations, Ng represents the number of gray levels; i represents the grayscale value at point (x, y), and j represents the grayscale value at another point located at a distance d from (x, y). p(i, j) represents the frequency with which a pixel of grayscale value j occurs at distance d from a pixel of grayscale value i; u represents the average of all gray values in the image area. μx and μy represent the mean texture values in the x and y directions, respectively, while σx and σy represent the corresponding standard deviations.
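The GLCM statistics above can be reproduced on a toy image with a short stdlib-only sketch (the study used dedicated software with tuned window size and direction parameters; the offset, image, and function names here are illustrative):

```python
import math
from collections import Counter

def glcm(img, dx=1, dy=0):
    """Normalized symmetric gray-level co-occurrence matrix p(i, j)
    for pixel pairs at offset (dx, dy)."""
    rows, cols = len(img), len(img[0])
    counts = Counter()
    for y in range(rows):
        for x in range(cols):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < rows and 0 <= x2 < cols:
                counts[(img[y][x], img[y2][x2])] += 1
                counts[(img[y2][x2], img[y][x])] += 1  # symmetric pair
    total = sum(counts.values())
    return {ij: c / total for ij, c in counts.items()}

def texture_features(p):
    """GLCM statistics following the formulas in Table 2."""
    return {
        "contrast":      sum(q * (i - j) ** 2 for (i, j), q in p.items()),
        "dissimilarity": sum(q * abs(i - j) for (i, j), q in p.items()),
        "homogeneity":   sum(q / (1 + (i - j) ** 2) for (i, j), q in p.items()),
        "second_moment": sum(q ** 2 for q in p.values()),
        "entropy":       -sum(q * math.log(q) for q in p.values() if q > 0),
    }

# Toy 4-level grayscale image
img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [0, 2, 2, 2],
       [2, 2, 3, 3]]
feats = texture_features(glcm(img))
print({k: round(v, 3) for k, v in feats.items()})
```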
Table 3. Structural features of the present study.
Structural | Formula
Mean plant height (PHmean) (cm) | (1/n)·Σ_{i=1}^{n} Hi
Standard deviation of plant height (PHstd) (cm) | √[Σ_{i=1}^{n} (Hi − PHmean)²/(n − 1)]
Coefficient of variation of plant height (PHcv) (%) | PHstd/PHmean
Canopy volume (CV) (m³) | Σ_{i=1}^{n} Ai·Hi
Canopy cover (CC) (%) | Canopy_pixels/ROI_pixels
Note: i represents the i-th pixel in the CHM, n represents the number of vegetation pixels within the plot, Ai represents the area of the i-th pixel, Hi represents the height of the i-th pixel, and ROIpixels represents the total number of pixels within the ROI.
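A sketch of how the Table 3 statistics could be derived from the CHM vegetation pixels of one plot (stdlib only; the pixel heights, pixel area, and ROI size are illustrative):

```python
import math

def structural_features(heights, pixel_area, roi_pixels):
    """Plant-height and cover statistics per Table 3.

    heights: CHM values (m) of the vegetation pixels in the plot.
    pixel_area: ground area of one pixel (m^2).
    roi_pixels: total number of pixels in the ROI.
    """
    n = len(heights)
    ph_mean = sum(heights) / n
    ph_std = math.sqrt(sum((h - ph_mean) ** 2 for h in heights) / (n - 1))
    return {
        "PHmean": ph_mean,
        "PHstd": ph_std,
        "PHcv": 100 * ph_std / ph_mean,              # %
        "CV": sum(pixel_area * h for h in heights),  # canopy volume
        "CC": 100 * n / roi_pixels,                  # canopy cover, %
    }

# Toy plot: 3 vegetation pixels out of a 4-pixel ROI
f = structural_features([0.10, 0.12, 0.14], pixel_area=0.0004, roi_pixels=4)
print(round(f["PHmean"], 3), round(f["PHcv"], 1), f["CC"])
```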
Table 4. Pearson’s correlation coefficient (r) for multispectral image spectral features and the AGB of CMV in Field East (n = 75).
Spectral | r | Spectral | r
NDVI | 0.71 ** | GLI | 0.72 **
SR | 0.70 ** | GNDVI | 0.74 **
GRDVI | 0.75 ** | NGRDI | 0.62 **
NDRE | 0.65 ** | NPCI | −0.64 **
CIgreen | 0.76 ** | VARIRE | 0.71 **
CIRE | 0.63 ** | LCI | 0.67 **
Note: ** p < 0.01.
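The screening reported in Tables 4–6 rests on Pearson's r; a minimal stdlib implementation (significance testing omitted; the sample index and AGB values are illustrative):

```python
import math

def pearson_r(x, y):
    """Pearson's correlation coefficient between two samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# e.g. a vegetation index against measured AGB (illustrative values)
vi  = [0.41, 0.52, 0.58, 0.63, 0.70]
agb = [1.2, 1.9, 2.1, 2.6, 3.0]   # t/ha
print(round(pearson_r(vi, agb), 2))
```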
Table 5. Pearson’s correlation coefficient (r) for multispectral image textural features and the AGB of CMV in Field East (n = 75).
Textural | r | Textural | r
B-Mean | 0.39 ** | R-Dis | 0.34 **
B-Var | 0.34 ** | R-En | 0.21 NS
B-Hom | −0.33 ** | R-Sm | −0.20 NS
B-Con | 0.38 ** | R-Cor | −0.29 *
B-Dis | 0.36 ** | RE-Mean | 0.59 **
B-En | 0.30 ** | RE-Var | −0.27 *
B-Sm | −0.31 * | RE-Hom | 0.32 **
B-Cor | −0.52 ** | RE-Con | −0.21 NS
G-Mean | 0.54 ** | RE-Dis | −0.26 *
G-Var | 0.54 ** | RE-En | −0.18 NS
G-Hom | −0.59 ** | RE-Sm | 0.17 NS
G-Con | 0.56 ** | RE-Cor | −0.80 **
G-Dis | 0.57 ** | NIR-Mean | 0.61 **
G-En | 0.61 ** | NIR-Var | −0.64 **
G-Sm | −0.61 ** | NIR-Hom | 0.76 **
G-Cor | −0.41 ** | NIR-Con | −0.62 **
R-Mean | −0.44 ** | NIR-Dis | −0.68 **
R-Var | 0.37 ** | NIR-En | −0.77 **
R-Hom | −0.26 * | NIR-Sm | 0.77 **
R-Con | 0.38 ** | NIR-Cor | −0.82 **
Note: * p < 0.05, ** p < 0.01; NS: correlation not significant (p ≥ 0.05).
Table 6. Pearson’s correlation coefficient (r) for multispectral image structural features and the AGB of CMV in Field East (n = 75).
Structural | r
PHmean | 0.57 **
PHstd | −0.21 NS
PHcv | −0.60 **
CV | 0.54 **
CC | 0.66 **
Note: ** p < 0.01; NS: correlation not significant (p ≥ 0.05).
Table 7. Features selected through a two-step selection process.
Feature Type | Number of Input Features | Features Retained by Pearson Filter | Features Retained by VSURF Filter
Spectral | 12 | NDVI, SR, GRDVI, CIgreen, GLI, GNDVI, VARIRE | SR, GRDVI, CIgreen, GLI, GNDVI
Textural | 40 | G-En, G-Sm, RE-Cor, NIR-Mean, NIR-Var, NIR-Hom, NIR-Con, NIR-Dis, NIR-Cor, NIR-En, NIR-Sm | RE-Cor, NIR-Con, NIR-Sm, NIR-En
Structural | 5 | PHmean, PHcv, CV, CC |
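The first (Pearson) step of the two-step selection in Table 7 can be sketched as a simple magnitude filter; the second (VSURF) step is a random-forest-based R procedure and is not reproduced here. The threshold below is illustrative, not the paper's:

```python
def pearson_filter(correlations, r_threshold=0.6):
    """Keep features whose |r| with AGB meets the threshold,
    sorted by descending |r|.

    correlations: {feature_name: r}, e.g. from Table 6.
    """
    return sorted(
        (name for name, r in correlations.items() if abs(r) >= r_threshold),
        key=lambda name: -abs(correlations[name]),
    )

# Structural-feature correlations from Table 6
structural_r = {"PHmean": 0.57, "PHstd": -0.21, "PHcv": -0.60,
                "CV": 0.54, "CC": 0.66}
print(pearson_filter(structural_r, r_threshold=0.5))
```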
Zhang, C.; Zhu, Q.; Fu, Z.; Yuan, C.; Geng, M.; Meng, R. Estimation of Aboveground Biomass of Chinese Milk Vetch Based on UAV Multi-Source Map Fusion. Remote Sens. 2025, 17, 699. https://doi.org/10.3390/rs17040699