Article

Comparative Calculation of Spectral Indices for Post-Fire Changes Using UAV Visible/Thermal Infrared and JL1 Imagery in Jinyun Mountain, Chongqing, China

1 College of Forestry, Central South University of Forestry & Technology, Changsha 410004, China
2 Central South Academy of Inventory and Planning of NFGA, Changsha 410004, China
3 College of Resources and Environment, Southwest University, Chongqing 400715, China
4 Faculty of Life Science and Technology, Central South University of Forestry & Technology, Changsha 410004, China
5 National Engineering Laboratory for Applied Technology of Forestry & Ecology in South China, Changsha 410004, China
* Authors to whom correspondence should be addressed.
Forests 2025, 16(7), 1147; https://doi.org/10.3390/f16071147
Submission received: 8 May 2025 / Revised: 30 June 2025 / Accepted: 9 July 2025 / Published: 11 July 2025
(This article belongs to the Special Issue Wildfire Behavior and the Effects of Climate Change in Forests)

Abstract

This study used Jilin-1 satellite data and unmanned aerial vehicle (UAV)-collected visible-thermal infrared imagery to calculate twelve spectral indices and evaluate their effectiveness in distinguishing post-fire forest areas and identifying human-altered land-cover changes in Jinyun Mountain, Chongqing. The research goals included mapping wildfire impacts with M-statistic separability, measuring land-cover distinguishability through Jeffries–Matusita (JM) distance analysis, classifying land-cover types using the random forest (RF) algorithm, and verifying classification accuracy. Cumulative human disturbances, such as land clearing, replanting, and road construction, significantly impeded the natural recovery of burn scars. During long-term, human-assisted recovery periods exceeding one year, the Red Green Blue Index (RGBI), Green Leaf Index (GLI), and Excess Green Index (EXG) achieved high classification accuracy for six land-cover types: road, bare soil, deadwood, bamboo, broadleaf, and grass. Key accuracy measures showed producer accuracy (PA) > 0.8, user accuracy (UA) > 0.8, overall accuracy (OA) > 90%, and a kappa coefficient > 0.85. Validation confirmed that visible-spectrum indices effectively distinguish photosynthetic vegetation, thermal bands help identify artificial surfaces, and combined thermal-visible indices resolve spectral confusion in deadwood recognition. Spectral indices thus provide high-precision quantitative evidence for monitoring post-fire land-cover changes, especially under human intervention, offering important data support for temporal modeling of post-fire forest recovery and the improvement of ecological restoration plans.

1. Introduction

Wildfire disturbance poses significant challenges to forest management [1]. An accurate assessment of post-fire spectral index dynamics is critical for understanding ecosystem recovery mechanisms and guiding restoration practices [2]. Recent advancements in unmanned aerial vehicle (UAV) and satellite remote sensing technologies have enabled comprehensive characterization of forest fire scars [3], providing essential data to support post-fire reconstruction decisions and management strategy optimization [4,5].
Despite extensive validation of remote sensing spectral indices [6,7], effective tools for analyzing land-cover changes in fire scars—particularly anthropogenically disturbed burned areas—remain limited [8]. Representative indices include the following:
  • The Burned Area Index (BAI), integrating visible and near-infrared bands [9,10];
  • The Normalized Burn Ratio (NBR), incorporating short-wave infrared [11,12];
  • The thermal infrared-enhanced Normalized Burn Ratio (NBRT) [13].
Additionally, spectral indices developed for RGB imagery demonstrate robust capabilities in vegetation identification and land-cover change detection, including the following:
  • Simple band ratios (e.g., R/G, G/B) [14,15];
  • Complex transformations (e.g., the Excess Green Index [EXG], Green Leaf Index [GLI], Triangular Greenness Index [TGI]) [16].
These RGB-based indices leverage ubiquitous visible-light imagery from multi-source platforms (UAVs, cameras, smartphones) [16], offering distinct operational advantages [17,18].
Fire scar dynamics encompass both natural vegetation recovery [2,19] and anthropogenic activities [8], such as land clearing, road construction, and artificial reforestation. Evaluating these processes requires the integrated application of
  • Vegetation indices [19];
  • Fire scar indices [11];
  • Soil indices [20,21].
Post-fire surface residues progressively degrade due to environmental factors and human intervention [22,23], while concurrent vegetation regeneration drives scar evolution [5]. This regeneration correlates significantly with vegetation-specific spectral indices [24,25]. However, anthropogenic disturbances frequently induce abnormal fluctuations and uncertainties in these indices [26], necessitating optimization through feature transformation, zonal classification, or recalibration [27,28].
This study investigates anthropogenic restoration practices during the post-fire forest recovery process at Jinyun Mountain, Chongqing [19,29]. Using visible-thermal infrared imagery from a DJI Mavic 2 Enterprise Dual (M2D) UAV (DJI Innovations, Shenzhen, China) and contemporaneous Jilin-1 (JL1) satellite data [28,30], we calculated twelve spectral indices to
  • Assess accuracy in distinguishing burned areas [31,32];
  • Identify anthropogenically disturbed land cover [8,27].
The evaluation encompassed four objectives:
  • Delineating fire scars using M-statistic separability metrics [7];
  • Quantifying land-cover separability via Jeffries–Matusita (JM) distance analysis [31,33];
  • Classifying land cover with random forest (RF) [25,27];
  • Validating accuracy [24,31,33].
This study focuses specifically on identifying and analyzing anthropogenic traces left after the fire. It examines their spectral characteristics in RGB bands using UAV imagery and compares them with high-resolution satellite imagery captured during a similar period. The goal is to systematically evaluate how well different spectral indices can distinguish between land-cover types within these altered areas, thereby providing a foundation for more accurate post-fire trace extraction.

2. Materials and Methods

2.1. Overview of Study Area and Technical Workflow

2.1.1. Study Area

The study area comprises the initial ignition zone of a forest fire that occurred from 21 to 26 August 2022 in the Jinyun Mountain Nature Reserve (106°19′–106°23′ E, 29°45′–29°49′ N; Figure 1), Beibei District, Chongqing [19,29]. Characterized by mountainous terrain under a subtropical humid monsoon climate, the reserve exhibits a mean annual temperature of 13.6 °C and annual precipitation of 1100 mm [25,26]. Dominant vegetation includes plantations of Pinus massoniana Lamb. and Cunninghamia lanceolata (Lamb.) Hook., with secondary evergreen broad-leaved forests. The understory is dominated by Lauraceae/Fagaceae-associated bamboos and shrubs, and herbaceous layers consist primarily of Asteraceae, Poaceae, and pteridophytes.
Triggered by temperatures exceeding 40 °C on 21 August 2022, the fire was contained within five days through integrated suppression strategies such as manual firebreak reinforcement, aerial water bombing, and controlled backfiring. Post-fire restoration initiatives involved planting fire-resistant species and enhancing infrastructure resilience, while recovery was monitored via UAV-based vegetation indices [2,20]. During the 2023 restoration programs, over 5000 volunteers planted native saplings across fire-affected zones [25].

2.1.2. Technical Workflow

The framework (Figure 2) integrates five interconnected modules:
Data acquisition (Section 2.2.1): UAV dual-sensor (visible/thermal infrared) imagery (M2D; 28 December 2023) and JL1 satellite data (0.75 m resolution; 27 November 2023) were simultaneously acquired, supplemented by ground-truth sampling [4,5,20].
Data preprocessing (Section 2.2.2):
JL1 data: Radiometric calibration, atmospheric correction, co-registration, data fusion, resampling, and clipping were performed in ENVI v5.6.2 [6,7,20,34].
UAV data: Visible and thermal imagery were processed in Agisoft Metashape Pro v2.0.3 to generate orthomosaics [6,34], followed by co-registration, thermal-visible fusion, resampling, and clipping in ENVI [7,34].
Spectral index computation (Section 2.3): Spectral indices were calculated using the spectral index toolbox or raster calculator in ENVI [7,20,25,33], and then exported to ArcGIS v10.8.2 for analysis [11,23].
Statistical validation (Section 2.4): Statistical analyses encompassing correlation coefficients, discriminative metrics (M), and Jeffries–Matusita (JM) distance were computed in SPSS v30.0 [9,11,30].
Accuracy assessment (Section 2.5): Classification results from the random forest algorithm in ENVI were exported to ArcGIS for spatial refinement. Producer accuracy (PA), user accuracy (UA), overall accuracy (OA), and the kappa coefficient (kappa) were quantified in SPSS [2,11,23,35].

2.2. Image Data Acquisition and Preprocessing

2.2.1. Image Data Acquisition

At the burn scar site, UAV imagery was acquired on 28 December 2023 using an M2D equipped with visible-light and thermal infrared cameras (iron-red colormap) [14,36], concurrent with ground surveys [37]. A stratified random sampling design established 180 plots (3 m × 3 m) across six land-cover types: roads, bare soil, deadwood, bamboo, broadleaf, and grass (Figure 3d) [16,18,38].
Plot centroids were georeferenced using Trimble GeoXT receivers (<0.1 m horizontal accuracy) [37]. At 10–30 m above each plot centroid, the M2D acquired 6–10 nadir orthorectified images for validation [14,36].
JL1 images were obtained from the Jilin-1 official platform [20,39], with acquisition dates and specifications detailed in Table 1: 3 August and 6 September 2022 (JL1KF01C, 0.5 m), 27 November 2023 (JL1GF02B, 0.75 m), and 12 August 2024 (JL1KF01B, 0.5 m).

2.2.2. Image Preprocessing

UAV Image Processing: After orthomosaics were generated from quality-controlled UAV images in Agisoft Metashape v2.0.3, standardized ENVI v5.6.2 processing was conducted.
Geometric Registration: Thermal infrared imagery was registered to visible-light imagery as the spatial reference [15,35].
Thermal-Visible Fusion: Thermal infrared and visible red bands were fused using the Gram–Schmidt algorithm [34,40].
Resampling: Spatial resolution was standardized to 0.75 m via bilinear interpolation [23,39].
Data Normalization: The operation b1/255.0 was applied to convert DN values to the [0,1] reflectance range [37].
Cropping: Data were clipped to the study area ROI (Figure 1d) [16,34].
JL1 Satellite Image Processing: The standardized workflow comprised the following:
Radiometric Calibration: DN values were converted to reflectance using sensor-specific calibration coefficients [7,20].
Atmospheric Correction: Scattering/absorption effects were corrected with the FLAASH module (Mid-Latitude Summer model) [11,26].
Geometric Registration: Sub-pixel alignment (RMS < 0.5 pixels) was achieved relative to UAV orthomosaics [37,39].
Spatial Enhancement: Spatial resolution was enhanced through Gram–Schmidt pan-sharpening [26,40].
Resampling: Resolution was unified to 0.75 m via bilinear interpolation [23,39].
Data Normalization: The formula b1/10,000.0 was used to transform DN values into reflectance [20].
Cropping: ROI-based clipping was implemented (Figure 1d and Figure 3) [34].
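The two Data Normalization steps above reduce to a single scale division; a minimal NumPy sketch, assuming the ENVI Band Math expressions b1/255.0 and b1/10,000.0 map directly to array division (the function name and sample values are illustrative only):

```python
import numpy as np

def normalize_dn(band, scale):
    """Scale raw digital numbers (DN) into the [0, 1] range; the
    function name and sample values are illustrative only."""
    return np.asarray(band, dtype=float) / scale

uav_band = np.array([0, 128, 255])      # 8-bit UAV DN values -> b1/255.0
jl1_band = np.array([0, 5000, 10000])   # JL1 DN values       -> b1/10,000.0
print(normalize_dn(uav_band, 255.0))
print(normalize_dn(jl1_band, 10000.0))
```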

2.3. Spectral Index Calculation

Resampled imagery was processed in ENVI to calculate spectral indices. UAV imagery indices were computed using the Band Math toolbox in ENVI v5.6.2 [5,28]. For JL1 data, the Burned Area Index (BAI), Difference Vegetation Index (DVI), Global DVI (GDVI), Normalized Difference Vegetation Index (NDVI), and Triangular Greenness Index 2 (TGI2) were calculated via the Spectral Index Calculation toolbox [7,24,31,41,42,43]; the remaining indices were computed using the Band Math toolbox [28]. A total of 12 spectral indices were calculated (Table 2). Pixel-level extraction was performed using ArcGIS [18,29].
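Outside ENVI, such indices are simple per-pixel band arithmetic. A sketch using the standard published formulas for three RGB indices (the paper's exact formulations are given in its Table 2, so treat these definitions as assumptions):

```python
import numpy as np

def visible_indices(R, G, B):
    """Visible-band spectral indices from normalized RGB reflectance.

    Formulas follow the standard published definitions; the paper's
    exact set is listed in its Table 2, so treat these as assumptions:
      EXG = 2G - R - B                 (Excess Green Index)
      GLI = (2G - R - B)/(2G + R + B)  (Green Leaf Index)
      TGI = G - 0.39R - 0.61B          (Triangular Greenness Index)
    """
    eps = 1e-9  # guards the GLI denominator on dark pixels
    return {
        "EXG": 2 * G - R - B,
        "GLI": (2 * G - R - B) / (2 * G + R + B + eps),
        "TGI": G - 0.39 * R - 0.61 * B,
    }

# Two pixels: vegetated (green-dominant) vs. bare soil (red-dominant).
R = np.array([0.2, 0.4])
G = np.array([0.6, 0.35])
B = np.array([0.2, 0.3])
idx = visible_indices(R, G, B)
print(idx["EXG"])  # green vegetation scores markedly higher
```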

2.4. Data Processing and Analysis

2.4.1. Distinguishing Indices

In SPSS, the discrimination index (M) was computed to quantitatively assess the spectral separability for post-fire land-cover classification, defined as [31,33,38]:
M = \frac{\left| \mu - \mu_0 \right|}{\sigma + \sigma_0}
where μ and μ0 are the mean spectral index values of the land-cover type and the compared land cover, respectively, and σ and σ0 are the corresponding standard deviations. A larger M value indicates that the land-cover type is more easily distinguished from the contrasted land cover; M > 1 indicates good discrimination, and M < 1 indicates poor discrimination [9,11,23].
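The discrimination index above can be sketched directly from per-class index samples (the sample data here are hypothetical):

```python
import numpy as np

def m_statistic(index_a, index_b):
    """Discrimination index M = |mu - mu0| / (sigma + sigma0), where the
    inputs are per-pixel spectral index samples of the two classes being
    compared; M > 1 indicates good discrimination."""
    mu, mu0 = np.mean(index_a), np.mean(index_b)
    sigma, sigma0 = np.std(index_a), np.std(index_b)
    return abs(mu - mu0) / (sigma + sigma0)

rng = np.random.default_rng(0)
burned = rng.normal(0.1, 0.05, 500)  # hypothetical burn-scar NDVI sample
forest = rng.normal(0.7, 0.10, 500)  # hypothetical intact-forest NDVI sample
print(m_statistic(burned, forest))   # well above 1: good separability
```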

2.4.2. Jeffries–Matusita (JM) Distance

The Jeffries–Matusita (JM) distance is also commonly used to assess the separability of the two classes [11,23,40], which is calculated with Equation (2).
JM = 2\left(1 - e^{-\alpha}\right)
\alpha = \frac{1}{8}\left(\mu_i - \mu_j\right)^{T}\left(\frac{C_i + C_j}{2}\right)^{-1}\left(\mu_i - \mu_j\right) + \frac{1}{2}\ln\left(\frac{\left|\frac{C_i + C_j}{2}\right|}{\sqrt{\left|C_i\right| \times \left|C_j\right|}}\right)
where i and j are the two categories; Ci and Cj are the covariance matrices of the categorical eigenvalues; and μi and μj are the mean vectors of the characteristic values of each category. A higher JM value indicates that the two classes are more easily distinguished; JM > 1 indicates good separability, and JM < 1 indicates poor separability [11,17].
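Under the Gaussian assumption implied by the formula above, the JM distance can be sketched as follows (the two-index sample data are hypothetical):

```python
import numpy as np

def jm_distance(x_i, x_j):
    """JM distance between two classes from (n_samples, n_features)
    arrays of spectral-index values, assuming Gaussian distributions.
    JM saturates at 2; the text treats JM > 1 as good separability."""
    mu_i, mu_j = x_i.mean(axis=0), x_j.mean(axis=0)
    c_i = np.atleast_2d(np.cov(x_i, rowvar=False))
    c_j = np.atleast_2d(np.cov(x_j, rowvar=False))
    c_mean = (c_i + c_j) / 2
    d = mu_i - mu_j
    alpha = (d @ np.linalg.inv(c_mean) @ d) / 8 + 0.5 * np.log(
        np.linalg.det(c_mean) / np.sqrt(np.linalg.det(c_i) * np.linalg.det(c_j))
    )
    return 2 * (1 - np.exp(-alpha))

rng = np.random.default_rng(1)
grass = rng.normal([0.3, 0.2], 0.05, size=(200, 2))  # hypothetical two-index samples
road = rng.normal([0.7, 0.6], 0.05, size=(200, 2))
print(jm_distance(grass, road))  # near the saturation value of 2
```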

2.4.3. Relevance

In SPSS, the covariance matrix and correlation coefficient matrix are generated by computing statistical measures over all pixel values derived from the image. This enables the assessment of the linear correlation between variables using the Pearson correlation coefficient [10,11,37], r_{xy}, calculated as follows:
r_{xy} = \frac{\sum_{i=1}^{n}\left(x_i - \bar{x}\right)\left(y_i - \bar{y}\right)}{\sqrt{\sum_{i=1}^{n}\left(x_i - \bar{x}\right)^2} \cdot \sqrt{\sum_{i=1}^{n}\left(y_i - \bar{y}\right)^2}}
where x_i and y_i are the values of the two bands; \bar{x} and \bar{y} are the means of the two bands; and n is the total number of pixels in the image.
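A minimal sketch of the Pearson coefficient above, computed over flattened band values (sample bands are illustrative):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two bands, computed over
    their per-pixel values (arrays are flattened first)."""
    x = np.asarray(x, dtype=float).ravel()
    y = np.asarray(y, dtype=float).ravel()
    xc, yc = x - x.mean(), y - y.mean()
    return (xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc))

red = np.array([0.10, 0.20, 0.40, 0.50, 0.70])
green = 0.9 * red + 0.02  # near-duplicate band: r = +1
blue = 1.0 - red          # inverted response:   r = -1
print(pearson_r(red, green), pearson_r(red, blue))
```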

2.5. Accuracy Evaluation

The random forest classification was performed on spectral indices from 180 field plots (144 training/36 validation) in ENVI v5.6.2 (100 trees, default parameters), with the results extracted to sample points in ArcGIS. SPSS was used to generate a confusion matrix, from which the overall accuracy (OA), kappa coefficient (kappa), producer accuracy (PA; omission error control), and user accuracy (UA; commission error control) were calculated [45].
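The four accuracy measures can be sketched from a confusion matrix (the row/column orientation chosen here, rows = classified and columns = reference, is an assumption; SPSS and ENVI conventions may differ):

```python
import numpy as np

def accuracy_metrics(cm):
    """OA, kappa, PA, and UA from a confusion matrix. The orientation
    (rows = classified, columns = reference) is an assumption here."""
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                       # overall accuracy
    pa = np.diag(cm) / cm.sum(axis=0)           # producer accuracy (omission)
    ua = np.diag(cm) / cm.sum(axis=1)           # user accuracy (commission)
    pe = (cm.sum(axis=0) * cm.sum(axis=1)).sum() / n ** 2  # chance agreement
    kappa = (oa - pe) / (1 - pe)
    return {"OA": oa, "kappa": kappa, "PA": pa, "UA": ua}

# Hypothetical 2-class confusion matrix for 100 validation pixels.
m = accuracy_metrics([[40, 10], [5, 45]])
print(m["OA"], m["kappa"])  # 0.85 and 0.7
```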

3. Results

3.1. Spectral Band Comparison

Raw image bands in the study area underwent correlation analysis (Figure 4). For UAV visible bands, there were strong inter-band correlations (r > 0.9) among RGB channels; for UAV thermal band (8–14 μm) pseudocolor (iron-red palette) imagery, the blue channel exhibited negative correlations with the visible red (r = −0.87) and green bands (r = −0.62), where iron-red hues indicate elevated surface temperatures. There were high correlations (r > 0.95) for JL1 visible bands and low correlations (r < 0.6) for JL1 NIR vs. visible bands, enabling optimal band combinations for spectral enhancement.

3.2. Comparison of Distinguishing Indices

The spectral discrimination indices (M) were compared between UAV and JL1 imagery for pre-fire (3 August 2022) and post-fire (6 September 2022) baselines (Table 3), revealing three key findings:
  • Immediate post-fire (11 days): Post-fire M values (JL1-96, M83) exceeded 1 for all spectral indices, with the NDVI (M = 3.08) demonstrating the highest separability from pre-fire conditions and the BAI (M = 1.05) the lowest, indicating that the NDVI is the most sensitive to fire-induced vegetation changes.
  • Long-term recovery (>1 year): Human activities (road construction, logging) showed divergent separability against the pre- and post-fire baselines. For JL1-27, most indices had M < 1 (except the EXG, TGI1, and TGI2), an anomalous reversal from pre-fire M > 1 to post-fire M < 1.
  • Sensor-specific responses: Among UAV visible indices, the EXG showed reduced separability (M < 1), while the others inverted from M > 1 (pre-fire) to M < 1 (post-fire); among UAV thermal indices, only the RGBI maintained M > 1 post-fire (Table 3).

3.3. Comparison of Spectral Indices of Different Ground Objects

This study characterizes post-fire land-cover spectra by comparing average spectral index values (Table 4) from UAV and JL1-27 imagery.
Average EXG-V values for vegetation (bamboo = 0.164, broad-leaved forest = 0.109) significantly exceeded non-vegetation (road = 0.027, bare land = 0.008), confirming visible-light indices’ sensitivity to chlorophyll. EXG-V showed the highest discriminative power for live vegetation. The average RGBI-VT value for deadwood (0.539) was 107% higher than bare land (0.260) and 114% higher than roads (0.252), demonstrating that fused visible-thermal indices improve deadwood discrimination by integrating spectral and thermal features. Average NDVI values showed a strong positive correlation with vegetation cover (bamboo = 0.389, road = 0.101), aligning with ecological expectations. Conversely, indices like TGI2 exhibited universal negative average values (deadwood = −1.058), highlighting the need to optimize index formulas for sensor-specific spectral configurations.
To further explore the characteristics of spectral indices, violin plots (Figure 5) were used to visualize the distribution and central tendency of the data.
Analysis of the violin plots of indices across six types of land cover (road, bare land, deadwood, grass, bamboo, broadleaf) revealed the following key findings:
The RGBI-T of roads shows a right-skewed, compact distribution with a mean of 0.718, significantly higher than those of bamboo (0.497) and broadleaf (0.616), validating the impact of the high thermal emissivity of human-made materials on thermal indices.
Bamboo and broadleaf exhibit left-skewed BR-T distributions (mean values of 0.571 and 0.616, respectively), which are consistent with vegetation’s thermal radiation regulation through canopy transpiration.
For the RGBI-VT of deadwood, the mean (0.539) is 107% higher than that of bare land (0.260), suggesting that classification challenges posed by similar visible spectra can be resolved by integrating spectral and thermal features.
Grass shows a higher EXG-VT distribution dispersion (mean 0.289) than bamboo, reflecting high spectral variability in herbaceous vegetation.
Artificial surfaces (roads) exhibit right-skewed, peaked index distributions, while vegetation indices (e.g., BR-T) are left-skewed. Deadwood and bare land overlap in thermal indices but are separable via fused indices.
Significant inter-class mean differences (e.g., 44.5% between road RGBI-T and bamboo BR-T) provide visual evidence for fine-scale post-fire land-cover classification.
Violin plots effectively characterize the spectral index distributions across various types of land cover, confirming the superiority of thermal infrared indices in identifying artificial surfaces and the value of fused indices in deadwood mapping. This provides critical references for remote sensing-based fine-scale land-cover discrimination.

3.4. Correlation of UAVs with JL1-27 Spectral Index

To analyze the correlation between UAV and JL1-27 spectral indices, the correlation coefficients were calculated by statistically analyzing the index values of 180 sample plots (Figure 6).
This correlation matrix heatmap reveals index relationships, supporting the following core findings:
Single-band indices (visible/thermal infrared) show strong positive correlations (redundant functions). Cross-band indices have weak/negative correlations, enabling “spectral complementarity” (e.g., visible + thermal infrared) to enhance classification. This underpins the value of fused indices.
Fused indices (e.g., RGBI-VT) integrate dual-band information. They retain single-band sensitivity and resolve classification challenges for spectrally similar types of land cover (e.g., deadwood vs. bare land), validating their utility.
Special indices (e.g., TGI series) show weak negative correlations with others, adding classification details. However, indices such as TGI2 (with universal negative values) require formula optimization for sensor-specific spectra.
Using cross-band indices (weak correlation), simplifying redundant single-band indices, and leveraging special indices, scientific index combinations boost efficiency and accuracy in post-fire land-cover classification.

3.5. JM Distance and Spectral Indices

By integrating visible, thermal infrared, and fused spectral indices with the original imagery, JM distances were calculated from 180 sample plots to evaluate the classification potential for six land-cover classes: grass, deadwood, bare land, road, bamboo, and broadleaf. To further assess the degree of differentiation of each index for each category, the random forest model randomly split the 180 sample plots 80:20 into 144 training samples and 36 validation samples across the land-cover classes, with the validation samples used to calculate the JM distance.
The heatmap (Figure 7) illustrates how all spectral indices, derived from original imagery, and JM distances quantify the separability of land-cover pairs (such as grass/road and deadwood/bare land). The guide to spectral index performance for land-cover pairs is based on the JM distance (Figure 8).
Thermal and fused indices (e.g., RGB-T, RGB-VT) strongly separate artificial surfaces (roads) from natural covers (grass, broadleaf). They are, thus, effective for identifying human-made areas.
Visible light indices (e.g., RGB-V) are better for distinguishing between vegetation types (grass vs. broadleaf/bamboo). They are, thus, useful for plant classes.
Fused indices (RGB-VT) work for both artificial/natural and vegetation/vegetation pairs. They are helpful for land covers that look similar in single bands (e.g., deadwood vs. bare land).
The JM Distance Spectral Index Ground Object Recognition Evaluation Guide (Figure 8) indicates the following:
Most of the spectral indices of UAV visible light show a large degree of discrimination (JM > 1) in distinguishing between bamboo and deadwood; bamboo and bare soil; bamboo and road; deadwood and broadleaf; broadleaf and bare soil; and broadleaf and road. That is, there is a significant distinction between living vegetation (bamboo, broadleaf) and non-living vegetation (road, deadwood, bare soil), as well as between non-vegetation types.
Most of the thermal infrared spectral indices and the spectral indices obtained by fusing thermal infrared with visible light exhibit a large degree of discrimination (JM > 0.7) in distinguishing between bamboo and road; grass and broadleaf; deadwood and broadleaf; and broadleaf and road. In other words, there is a notable distinction between living vegetation (bamboo, grass, broadleaf) and non-living vegetation (road, deadwood).
Most of the JL1 spectral indices demonstrate a large degree of discrimination (JM > 0.7) in distinguishing between bamboo and deadwood; bamboo and road; and broadleaf and road. That is, there is a significant distinction between living vegetation (bamboo, broadleaf) and non-living vegetation (road, deadwood).

3.6. Accuracy Evaluation

To verify the accuracy of spectral indices, 180 field validation results were analyzed (Figure 9).
For UAV visible light spectral indices, the overall classification accuracy (OA) is close to or exceeds 80%, except for TGI1 (52%). Specifically, the RGBI and GLI exhibit accuracies exceeding 95%.
For UAV thermal infrared spectral indices, the overall classification accuracy is close to or exceeds 90%, except for TGI1 (52%). Notably, the RGBI (98%) and EXG (94%) demonstrate particularly high accuracies.
For JL1 spectral indices, the overall classification accuracy exceeds 90%, except for TGI2 (58%).
Other evaluation metrics (PA, UA, and kappa) show consistent trends with the overall classification accuracy (OA).

4. Discussion

4.1. Differences in the Extraction of Forest Fire Scars at the Pixel Scale Between UAV and JL1 Images

UAV RGB spectral indices demonstrate distinct advantages for vegetation identification, a finding consistent with multiple studies [16,37]. This research further validates their efficacy in distinguishing deadwood from bare ground. Integrating UAV thermal infrared imagery enhances feature differentiation through enriched spectral information, aligning with established thermal applications in forest fire scar analysis [14,36]. Although JL1 high-resolution satellite imagery is limited to four bands (R/G/B/NIR), it enables the extraction of >30 spectral indices from panchromatic (0.5 m/0.75 m) and multispectral (2 m/3 m) data [39]. Further investigation of these indices is warranted to assess their impact on forest fire scar dynamics, particularly deadwood distribution and early-stage natural vegetation recovery [2,14]. Such analysis is critical for optimizing UAV thermal/infrared and high-resolution satellite synergy in forest fire management [35].

4.2. Relationship Between Spectral Indices

UAV RGB bands exhibit autocorrelation, corroborating prior research [16]. Indices derived from these bands similarly display inter-correlation. Vegetation indices designed for plant identification also partially discriminate other land types, with RGB ratio indices demonstrating broad applicability. Thermal infrared pseudocolor gradation [46] significantly influences RGB spectral indices; the iron-red palette optimally represents forest fire scar dynamics, aligning with M2D thermal infrared methodologies [14,36]. In this study, thermal gradation was configured as iron-red, where elevated temperatures increase red values and decrease blue values, while lower temperatures augment blue values. Consequently, blue-band ratios exhibit negative correlations with temperature-sensitive indices.
Beyond pre-fire imagery [35], persistent fire remnants provide effective baselines for change assessment [45]. These exhibit opposing discriminative trends: M > 1 against the pre-fire baseline versus M < 1 against the post-fire baseline. JL1-812 data revealed pronounced differentiation 11 days post-fire (the RR excepted). UAV visible indices (excluding the EXG and TGI1) and thermal/fused RR indices registered M < 1, potentially attributable to resolution disparities: pre-fire (0.5 m), post-fire (0.75 m), UAV visible (0.07 m), and UAV thermal (0.36 m). Resampling to 0.75 m for spatial uniformity may obscure fine-scale changes [34].

4.3. Advantages and Disadvantages of Ground Surveys, Unmanned Aerial Vehicles, and Satellite Imagery

Ground surveys yield precise surface data but are labor-intensive, time-consuming, and costly [38]. UAVs offer operational flexibility yet suffer limited spatial coverage per acquisition [5]. JL1 satellites deliver wide-area high-resolution imagery but lack timely data acquisition [30]. Integrating high-resolution satellites, UAVs, and targeted ground surveys remains the optimal approach for comprehensive forest fire scar assessment [35].

5. Conclusions

In this study, the spectral indices of three living vegetation types (bamboo, broadleaf trees, grass), two non-vegetation classes (roads, bare soil), and deadwood in human-intervened post-fire environments were analyzed using ultra-high-resolution UAV visible/thermal infrared and Jilin-1 satellite data, employing the spectral discrimination index (M), JM distance, and random forest modeling [11,27,31]. The key findings demonstrate significant negative correlations (r > 0.7, p < 0.01) in UAV thermal pseudocolor imagery (iron-red gradation) among the low-temperature (blue band), high-temperature (red band), and mid-temperature (green band) zones, while JL1 near-infrared (NIR) exhibits a weak correlation with the RGB bands (r < 0.6), providing a foundation for novel spectral index construction [16,17,37]. In the early post-fire stage (6 September 2022), all 12 spectral indices outperformed the BAI in separability (M-index) from the pre-fire forest (3 August 2022), with the NDVI achieving peak discrimination (M = 3.08) versus the BAI minimum (M = 1.05) [12,31]. During human-managed recovery (>1 year, as of 27 November 2023), visible light indices achieved JM > 0.5 across six land-cover classes, with critical distinction (JM > 1) between living vegetation (bamboo/broadleaf) and non-vegetation (roads/deadwood/bare soil), while thermal/visible fusion enhanced deadwood recognition by 107%, with optimal combinations yielding an overall accuracy of >90% (kappa > 0.85) [14,25,36]. Field validation confirms that pre-/post-fire separability (M-index) benchmarks detect latent land-cover transitions: a low M relative to pre-fire indicates living vegetation recovery, whereas a low M relative to early post-fire suggests non-vegetation dominance, with UAV dual-camera (visible/thermal) imagery demonstrating superior transitional-state sensitivity [24,39].
Correlating UAV-satellite spectral indices is critical for quantifying anthropogenic impacts (road engineering, deadwood removal, replanting) on vegetation reassembly, forest thermal/hydrological rebalancing, and landscape resilience trajectories [19,29,32].

Author Contributions

Conceptualization, F.L.; Methodology, Y.L.; Software, Y.L.; Validation, X.L. and F.L.; Investigation, J.Z. and Y.L.; Resources, Y.L.; Data curation, Y.L.; Writing—original draft, J.Z.; Writing—review & editing, Y.L. and X.L.; Funding acquisition, Y.L. and F.L. All authors have read and agreed to the published version of the manuscript.

Funding

This research was supported by the Chongqing Water Conservancy Science and Technology Project (Grant No. CQSLK-2022021) and the Forestry Science and Technology Innovation Project of Hunan Province (Grant No. XLK202435). The APC was covered by the authors' personal funds.

Data Availability Statement

Data is contained within the article.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Han, Y.; Zheng, C.; Liu, X.; Tian, Y.; Dong, Z. Burned Area and Burn Severity Mapping with a Transformer-Based Change Detection Model. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2024, 17, 13866–13880. [Google Scholar] [CrossRef]
  2. Zhang, L.; Wang, M.; Fu, Y.; Ding, Y. A Forest Fire Recognition Method Using UAV Images Based on Transfer Learning. Forests 2022, 13, 975. [Google Scholar] [CrossRef]
  3. Xu, Y.; Li, J.; Zhang, F. A UAV-Based Forest Fire Patrol Path Planning Strategy. Forests 2022, 13, 1952. [Google Scholar] [CrossRef]
  4. Li, X.; Gao, H.; Zhang, M.; Zhang, S.; Gao, Z.; Liu, J.; Sun, S.; Hu, T.; Sun, L. Prediction of Forest Fire Spread Rate Using UAV Images and an LSTM Model Considering the Interaction between Fire and Wind. Remote Sens. 2021, 13, 4325. [Google Scholar] [CrossRef]
  5. Larrinaga, A.R.; Brotons, L. Greenness Indices from a Low-Cost UAV Imagery as Tools for Monitoring Post-Fire Forest Recovery. Drones 2019, 3, 6. [Google Scholar] [CrossRef]
  6. Guimaraes, N.; Padua, L.; Marques, P.; Silva, N.; Peres, E.; Sousa, J.J. Forestry Remote Sensing from Unmanned Aerial Vehicles: A Review Focusing on the Data, Processing and Potentialities. Remote Sens. 2020, 12, 1046. [Google Scholar] [CrossRef]
  7. Chuvieco, E.; Martín, M.P.; Palacios, A. Assessment of different spectral indices in the red-near-infrared spectral domain for burned land discrimination. Int. J. Remote Sens. 2002, 23, 5103–5110. [Google Scholar] [CrossRef]
  8. Fraser, R.H.; van der Sluijs, J.; Hall, R.J. Calibrating Satellite-Based Indices of Burn Severity from UAV-Derived Metrics of a Burned Boreal Forest in NWT, Canada. Remote Sens. 2017, 9, 279. [Google Scholar] [CrossRef]
  9. Quintano, C.; Fernandez-Manso, A.; Stein, A.; Bijker, W. Estimation of area burned by forest fires in Mediterranean countries: A remote sensing data mining perspective. For. Ecol. Manag. 2011, 262, 1597–1607. [Google Scholar] [CrossRef]
  10. Perez, C.C.; Olthoff, A.E.; Hernandez-Trejo, H.; Rullan-Silva, C.D. Evaluating the best spectral indices for burned areas in the tropical Pantanos de Centla Biosphere Reserve, Southeastern Mexico. Remote Sens. Appl. Soc. Environ. 2022, 25, 100664. [Google Scholar] [CrossRef]
  11. Pacheco, A.d.P.; da Silva, J.A., Jr.; Ruiz-Armenteros, A.M.; Henriques, R.F.F.; Santos, I.d.O. Analysis of Spectral Separability for Detecting Burned Areas Using Landsat-8 OLI/TIRS Images under Different Biomes in Brazil and Portugal. Forests 2023, 14, 663. [Google Scholar] [CrossRef]
  12. Ji, X.; Zhou, Z.; Gouda, M.; Zhang, W.; He, Y.; Ye, G.; Li, X. A novel labor-free method for isolating crop leaf pixels from RGB imagery: Generating labels via a topological strategy. Comput. Electron. Agric. 2024, 218, 108631. [Google Scholar] [CrossRef]
  13. Roy, D.R.; Boschetti, L.; Trigg, S.N. Remote sensing of fire severity: Assessing the performance of the normalized burn ratio. IEEE Geosci. Remote Sens. Lett. 2006, 3, 112–116. [Google Scholar] [CrossRef]
  14. Burnett, J.D.; Wing, M.G. A low-cost near-infrared digital camera for fire detection and monitoring. Int. J. Remote Sens. 2018, 39, 741–753. [Google Scholar] [CrossRef]
  15. Kedia, A.C.; Kapos, B.; Liao, S.; Draper, J.; Eddinger, J.; Updike, C.; Frazier, A.E. An Integrated Spectral-Structural Workflow for Invasive Vegetation Mapping in an Arid Region Using Drones. Drones 2021, 5, 19. [Google Scholar] [CrossRef]
  16. Trencanova, B.; Proenca, V.; Bernardino, A. Development of Semantic Maps of Vegetation Cover from UAV Images to Support Planning and Management in Fine-Grained Fire-Prone Landscapes. Remote Sens. 2022, 14, 1262. [Google Scholar] [CrossRef]
  17. Mpakairi, K.S.; Kadzunge, S.L.; Ndaimani, H. Testing the utility of the blue spectral region in burned area mapping: Insights from savanna wildfires. Remote Sens. Appl. Soc. Environ. 2020, 20, 100365. [Google Scholar] [CrossRef]
  18. Xiao, W.; Deng, X.; He, T.; Guo, J. Using POI and time series Landsat data to identify and rebuilt surface mining, vegetation disturbance and land reclamation process based on Google Earth Engine. J. Environ. Manag. 2023, 327, 116920. [Google Scholar] [CrossRef]
  19. Ji, S.; Wang, Y.; He, L.; Zhang, Z.; Meng, F.; Li, X.; Chen, Y.; Wang, D.; Gong, Z. Greenhouse gas emission in the whole process of forest fire including rescue: A case of forest fire in Beibei District of Chongqing. Environ. Sci. Pollut. Res. 2023, 30, 113105–113117. [Google Scholar] [CrossRef]
  20. Fu, B.; Zuo, P.; Liu, M.; Lan, G.; He, H.; Lao, Z.; Zhang, Y.; Fan, D.; Gao, E. Classifying vegetation communities karst wetland synergistic use of image fusion and object-based machine learning algorithm with Jilin-1 and UAV multispectral images. Ecol. Indic. 2022, 140, 108989. [Google Scholar] [CrossRef]
  21. Beltran-Marcos, D.; Suarez-Seoane, S.; Fernandez-Guisuraga, J.M.; Fernandez-Garcia, V.; Marcos, E.; Calvo, L. Relevance of UAV and sentinel-2 data fusion for estimating topsoil organic carbon after forest fire. Geoderma 2023, 430, 116290. [Google Scholar] [CrossRef]
  22. Udelhoven, T. TimeStats: A Software Tool for the Retrieval of Temporal Patterns from Global Satellite Archives. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2011, 4, 310–317. [Google Scholar] [CrossRef]
  23. Wu, B.; Zheng, H.; Xu, Z.; Wu, Z.; Zhao, Y. Forest Burned Area Detection Using a Novel Spectral Index Based on Multi-Objective Optimization. Forests 2022, 13, 1787. [Google Scholar] [CrossRef]
  24. Woebbecke, D.M.; Meyer, G.E.; Vonbargen, K.; Mortensen, D.A. Color indexes for weed identification under various soil, residue, and lighting conditions. Trans. ASAE 1995, 38, 259–269. [Google Scholar] [CrossRef]
  25. Bendig, J.; Yu, K.; Aasen, H.; Bolten, A.; Bennertz, S.; Broscheit, J.; Gnyp, M.L.; Bareth, G. Combining UAV-based plant height from crop surface models, visible, and near infrared vegetation indices for biomass monitoring in barley. Int. J. Appl. Earth Obs. Geoinf. 2015, 39, 79–87. [Google Scholar] [CrossRef]
  26. Ma, D.; Wang, Q.; Huang, Q.; Lin, Z.; Yan, Y. Spatio-Temporal Evolution of Vegetation Coverage and Eco-Environmental Quality and Their Coupling Relationship: A Case Study of Southwestern Shandong Province, China. Forests 2024, 15, 1200. [Google Scholar] [CrossRef]
  27. Kaufman, Y.J.; Remer, L.A. Detection of forests using mid-IR reflectance: An application for aerosol studies. IEEE Trans. Geosci. Remote Sens. 1994, 32, 672–683. [Google Scholar] [CrossRef]
  28. Hao, P.; Zhan, Y.; Wang, L.; Niu, Z.; Shakir, M. Feature Selection of Time Series MODIS Data for Early Crop Classification Using Random Forest: A Case Study in Kansas, USA. Remote Sens. 2015, 7, 5347–5369. [Google Scholar] [CrossRef]
  29. Zhao, Y.; Huang, Y.; Sun, X.; Dong, G.; Li, Y.; Ma, M. Forest Fire Mapping Using Multi-Source Remote Sensing Data: A Case Study in Chongqing. Remote Sens. 2023, 15, 2323. [Google Scholar] [CrossRef]
  30. Sripada, R.P. Determining In-Season Nitrogen Requirements for Corn Using Aerial Color-Infrared Photography. Ph.D. Thesis, North Carolina State University, Raleigh, NC, USA, 2005. [Google Scholar]
  31. Vargas-Sanabria, D.; Campos-Vargas, C. Multitemporal comparison of burned areas in a tropical dry forest using the burned area index (BAI). Rev. For. Mesoam. Kuru-Rfmk 2020, 17, 29–36. [Google Scholar] [CrossRef]
  32. Richardson, A.D.; Jenkins, J.P.; Braswell, B.H.; Hollinger, D.Y.; Ollinger, S.V.; Smith, M.-L. Use of digital webcam images to track spring green-up in a deciduous broadleaf forest. Oecologia 2007, 152, 323–334. [Google Scholar] [CrossRef] [PubMed]
  33. Pereira, J.M.C.; Mota, B.; Privette, J.L.; Caylor, K.K.; Silva, J.M.N.; Sá, A.C.L.; Ni-Meister, W. A simulation analysis of the detectability of understory burns in miombo woodlands. Remote Sens. Environ. 2004, 93, 296–310. [Google Scholar] [CrossRef]
  34. Carvajal-Ramirez, F.; Marques da Silva, J.R.; Aguera-Vega, F.; Martinez-Carricondo, P.; Serrano, J.; Jesus Moral, F. Evaluation of Fire Severity Indices Based on Pre- and Post-Fire Multispectral Imagery Sensed from UAV. Remote Sens. 2019, 11, 993. [Google Scholar] [CrossRef]
  35. Padua, L.; Guimaraes, N.; Adao, T.; Sousa, A.; Peres, E.; Sousa, J.J. Effectiveness of Sentinel-2 in Multi-Temporal Post-Fire Monitoring When Compared with UAV Imagery. ISPRS Int. J. Geo-Inf. 2020, 9, 225. [Google Scholar] [CrossRef]
  36. Hendel, I.-G.; Ross, G.M. Efficacy of Remote Sensing in Early Forest Fire Detection: A Thermal Sensor Comparison. Can. J. Remote Sens. 2020, 46, 414–428. [Google Scholar] [CrossRef]
  37. de Jesus Marcial-Pablo, M.; Gonzalez-Sanchez, A.; Ivan Jimenez-Jimenez, S.; Ernesto Ontiveros-Capurata, R.; Ojeda-Bustamante, W. Estimation of vegetation fraction using RGB and multispectral images from UAV. Int. J. Remote Sens. 2019, 40, 420–438. [Google Scholar] [CrossRef]
  38. Tucker, C.J. Red and photographic infrared linear combinations for monitoring vegetation. Remote Sens. Environ. 1979, 8, 127–150. [Google Scholar] [CrossRef]
  39. Sun, J.; Zhu, R.; Gong, J.; Qu, C.; Guo, F. Cross-Comparison Between Jilin-1GF03B and Sentinel-2 Multi-Spectral Measurements and Phenological Monitors. IEEE Access 2024, 12, 43540–43551. [Google Scholar] [CrossRef]
  40. Zhang, X.; Sun, Y.; Shang, K.; Zhang, L.; Wang, S. Crop Classification Based on Feature Band Set Construction and Object-Oriented Approach Using Hyperspectral Images. IEEE J. Sel. Top. Appl. Earth Obs. Remote Sens. 2016, 9, 4117–4128. [Google Scholar] [CrossRef]
  41. Louhaichi, M.; Borman, M.M.; Johnson, D.E. Spatially located platform and aerial photography for documentation of grazing impacts on wheat. Geocarto Int. 2001, 16, 65–70. [Google Scholar] [CrossRef]
  42. Hunt, E.R., Jr.; Daughtry, C.S.T.; Eitel, J.U.H.; Long, D.S. Remote sensing leaf chlorophyll content using a visible band index. Agron. J. 2011, 103, 1090–1099. [Google Scholar] [CrossRef]
  43. Rouse, J.W.; Haas, R.H.; Schell, J.A.; Deering, D.W. Monitoring vegetation systems in the great plains with ERTS. In Proceedings of the 3rd Earth Resources Technology Satellite Symposium, Washington, DC, USA, 10–14 December 1973. [Google Scholar]
  44. Hunt, E.R., Jr.; Doraiswamy, P.C.; McMurtrey, J.E.; Daughtry, C.S.T.; Perry, E.M.; Akhmedov, B. A visible band index for remote sensing leaf chlorophyll content at the canopy scale. Int. J. Appl. Earth Obs. Geoinf. 2013, 21, 103–112. [Google Scholar] [CrossRef]
  45. Bourgoin, C.; Betbeder, J.; Couteron, P.; Blanc, L.; Dessard, H.; Oszwald, J.; Le Roux, R.; Cornu, G.; Reymondin, L.; Mazzei, L.; et al. UAV-based canopy textures assess changes in forest structure from long-term degradation. Ecol. Indic. 2020, 115, 106386. [Google Scholar] [CrossRef]
  46. Alpar, O.; Krejcar, O. Quantization and Equalization of Pseudocolor Images in Hand Thermography. In Bioinformatics and Biomedical Engineering, Proceedings of the IWBBIO 2017, Granada, Spain, 26–28 April 2017, Part I; Rojas, I., Ortuno, F., Eds.; Lecture Notes in Bioinformatics; Springer: Cham, Switzerland, 2017; Volume 10208, pp. 397–407. [Google Scholar]
Figure 1. Location maps: (a) within China; (b) Chongqing; (c) Jinyun Mountain Nature Reserve; (d) the Region of Interest (ROI).
Figure 2. Spectral analysis workflow.
Figure 3. UAV and JL1 imagery. (a) JL1 (3 August 2022); (b) JL1 (6 September 2022); (c) (28 December 2023); (A) UAV visible; (B) UAV thermal; (C) UAV fusion; (D) JL1 (27 November 2023); (d) sample points.
Figure 4. Band correlations. R—red band; G—green band; B—blue band; NIR—near-infrared band; _V—visible light; _T—thermal infrared; _VT—fused visible/thermal infrared; bands without a suffix are from JL1 imagery.
Figure 5. Violin plots of the spectral indices. (a) Visible light (_V); (b) thermal infrared (_T); (c) fused visible/thermal infrared (_VT); (d) JL1 (indices without a suffix).
Figure 6. Correlations of the spectral indices. _V—visible light; _T—thermal infrared; _VT—fused visible/thermal infrared; indices without a suffix are from JL1 imagery.
Figure 7. JM distance of land-cover pairs for all spectral indices derived from original imagery. RGB-V: Visible Light Image; Index-V: Visible Light Spectral Index Set; RGB-T: Thermal Infrared Image; Index-T: Thermal Infrared Spectral Index Set; RGB-VT: Thermal Infrared Fusion Image; Index-VT: Thermal Infrared Fusion Spectral Index Set; JL1: JL1 Multispectral Image (27 November 2023); JL1Index: JL1 Spectral Index Set (27 November 2023).
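For reference, the Jeffries–Matusita distance used in Figure 7 is commonly computed from the Bhattacharyya distance between two class distributions; a minimal sketch for the univariate Gaussian case (the study's exact per-band implementation is not reproduced here, so treat this as illustrative):

```python
import math

def jm_distance(mu1: float, sd1: float, mu2: float, sd2: float) -> float:
    """JM distance between two univariate Gaussian class distributions.

    Ranges from 0 (identical, inseparable classes) to 2 (fully separable).
    """
    # Bhattacharyya distance for 1-D Gaussians
    b = (0.125 * (mu1 - mu2) ** 2 * 2.0 / (sd1 ** 2 + sd2 ** 2)
         + 0.5 * math.log((sd1 ** 2 + sd2 ** 2) / (2.0 * sd1 * sd2)))
    return 2.0 * (1.0 - math.exp(-b))
```

Identical distributions give 0, while classes with well-separated means relative to their spread approach the saturation value of 2, which is why JM thresholds near 1.8–1.9 are often read as "good separability".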
Figure 8. Spectral index performance guide for land-cover pairs based on JM distance. _V—visible light; _T—thermal infrared; _VT—fused visible/thermal infrared; indices without a suffix are from JL1 imagery.
Figure 9. Classification accuracy of the spectral indices. (a) Visible light; (b) thermal infrared; (c) visible/thermal fusion; (d) JL1. _V—visible light; _T—thermal infrared; _VT—visible/thermal fusion; indices without a suffix are from JL1 imagery.
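The accuracy measures reported for the RF classification (producer's accuracy, user's accuracy, overall accuracy, and the kappa coefficient) all derive from a single confusion matrix; a minimal sketch of the standard definitions:

```python
import numpy as np

def accuracy_metrics(cm):
    """Standard accuracy measures from a confusion matrix.

    cm[i, j] = number of reference samples of class i assigned to map class j.
    Returns (OA, PA per class, UA per class, kappa).
    """
    cm = np.asarray(cm, dtype=float)
    total = cm.sum()
    diag = np.diag(cm)
    oa = diag.sum() / total                 # overall accuracy
    pa = diag / cm.sum(axis=1)              # producer's accuracy (omission side)
    ua = diag / cm.sum(axis=0)              # user's accuracy (commission side)
    # chance agreement expected from the row/column marginals
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / total ** 2
    kappa = (oa - pe) / (1.0 - pe)
    return oa, pa, ua, kappa
```

For a two-class matrix [[45, 5], [10, 40]], this yields OA = 0.85, PA = (0.90, 0.80), and kappa = 0.70, illustrating how kappa discounts the agreement expected by chance.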
Table 1. UAV and JL1 band information.
| Bands | Channels | Wavelength/μm | Spatial Resolution/m | Acquisition Date | Satellite Sensor | Source |
|---|---|---|---|---|---|---|
| Pan | — | 0.45~0.80 | 0.5/0.75 | 27 November 2023 | JL1GF02B | https://www.jl1mall.com/ (available from 12 August 2024 to 15 October 2024) |
| Band1 | Blue | 0.45~0.51 | 2/3 | 12 August 2024 | JL1KF01B | |
| Band2 | Green | 0.51~0.58 | | 6 September 2022 | JL1KF01C | |
| Band3 | Red | 0.63~0.69 | | 3 August 2022 | | |
| Band4 | NIR | 0.77~0.895 | | | | |
| RGB-V | Red/Green/Blue | 0.40~0.70 | 0.07 | 28 December 2023 | M2D visible sensor | This study |
| RGB-T | Red/Green/Blue | 8~14 | 0.36 | 28 December 2023 | M2D thermal infrared sensor | This study |
-V represents visible light; -T represents thermal infrared. JL1GF02B: 0.75 m; JL1KF01B: 0.5 m; JL1KF01C: 0.5 m.
Table 2. Expressions for calculating spectral indices.
| Index | Abbreviation | Expression | Reference |
|---|---|---|---|
| Red Ratio | RR | R / (R + G + B) | [32] |
| Green Ratio | GR | G / (R + G + B) | [32] |
| Blue Ratio | BR | B / (R + G + B) | [32] |
| Red/Green/Blue Index | RGBI | (G² − R × B) / (G² + R × B) | [25] |
| Excess Green | EXG | 2G − R − B | [24] |
| Green Leaf Index | GLI | (2G − R − B) / (2G + R + B) | [41] |
| Triangular Greenness Index 1 | TGI1 | G − 0.39R − 0.61B | [44] |
| Triangular Greenness Index 2 | TGI2 | ((λ_RED − λ_BLUE) × (ρ_RED − ρ_GREEN) − (λ_RED − λ_GREEN) × (ρ_RED − ρ_BLUE)) / 2 | [42] |
| Difference Vegetation Index | DVI | ρ_NIR − ρ_RED | [38] |
| Green Difference Vegetation Index | GDVI | ρ_NIR − ρ_GREEN | [30] |
| Normalized Difference Vegetation Index | NDVI | (ρ_NIR − ρ_RED) / (ρ_NIR + ρ_RED) | [43] |
| Burned Area Index | BAI | 1 / ((0.1 − ρ_RED)² + (0.06 − ρ_NIR)²) | [7] |
R (RED), G (GREEN), B (BLUE), and NIR denote the reflectance or normalized value of the red, green, blue, and near-infrared bands; λ denotes wavelength; and ρ denotes reflectance.
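The visible-band indices in Table 2 can be computed directly from normalized RGB values; a minimal sketch (band normalization and any sensor-specific preprocessing used in the study are assumed, not shown):

```python
import numpy as np

def visible_indices(r, g, b):
    """Visible-band spectral indices from Table 2.

    r, g, b: reflectance or normalized band values (scalars or arrays in [0, 1]).
    """
    r, g, b = (np.asarray(x, dtype=float) for x in (r, g, b))
    s = r + g + b
    return {
        "RR":   r / s,                              # red ratio
        "GR":   g / s,                              # green ratio
        "BR":   b / s,                              # blue ratio
        "RGBI": (g**2 - r * b) / (g**2 + r * b),    # red/green/blue index
        "EXG":  2 * g - r - b,                      # excess green
        "GLI":  (2 * g - r - b) / (2 * g + r + b),  # green leaf index
        "TGI1": g - 0.39 * r - 0.61 * b,            # triangular greenness index 1
    }
```

As a sanity check, a neutral gray pixel (r = g = b) gives RR = GR = BR = 1/3 and zero for RGBI, EXG, GLI, and TGI1, which is why these greenness indices separate photosynthetic vegetation from spectrally flat surfaces such as roads and bare soil.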
Table 3. Discrimination (M) of different spectral indices.
| Index | JL1-83 Mean | JL1-83 STD | JL1-96 Mean | JL1-96 STD | M | JL1-27 M83 | JL1-27 M96 | JL1-812 M83 | JL1-812 M96 | UAV-V M83 | UAV-V M96 | UAV-T M83 | UAV-T M96 | UAV-VT M83 | UAV-VT M96 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| RR | 0.2354 | 0.0080 | 0.3394 | 0.0288 | 2.8252 | 2.0791 | 0.2985 | 2.4236 | 0.0723 | 5.0145 | 0.1690 | 0.3745 | 0.8441 | 0.1593 | 0.6919 |
| GR | 0.3797 | 0.0099 | 0.3342 | 0.0094 | 2.3585 | 1.2249 | 0.1724 | 0.8112 | 1.6797 | 1.0421 | 0.5855 | 0.0296 | 0.4474 | 0.1007 | 0.4061 |
| BR | 0.3849 | 0.0117 | 0.3264 | 0.0212 | 1.7803 | 1.1318 | 0.2598 | 6.4715 | 2.8322 | 2.4962 | 0.5538 | 0.3049 | 0.5318 | 0.1680 | 0.4228 |
| RGBI | 0.2280 | 0.0380 | 0.0065 | 0.0418 | 2.7736 | 1.4518 | 0.1729 | 0.7546 | 1.8912 | 1.2695 | 0.5819 | 1.1210 | 1.8623 | 0.4743 | 1.0486 |
| EXG | 0.0216 | 0.0057 | 0.0000 | 0.0059 | 1.8650 | 0.9334 | 0.1372 | 0.7498 | 1.5413 | 0.5847 | 0.8778 | 0.4910 | 0.5559 | 0.4220 | 0.4863 |
| GLI | 0.1007 | 0.0207 | 0.0018 | 0.0210 | 2.3672 | 1.2350 | 0.1652 | 0.7927 | 1.7044 | 1.0505 | 0.5875 | 0.0849 | 0.4079 | 0.1536 | 0.3729 |
| TGI1 | 0.0083 | 0.0028 | 0.0004 | 0.0021 | 1.6018 | 0.7240 | 0.0881 | 1.2039 | 1.9514 | 0.7877 | 1.0091 | 0.2650 | 0.3054 | 0.2308 | 0.2709 |
| BAI | 38.5388 | 30.1057 | 161.2208 | 86.5482 | 1.0517 | 0.0938 | 1.1439 | 0.5757 | 1.4923 | | | | | | |
| DVI | 0.2049 | 0.0681 | 0.0669 | 0.0230 | 1.5156 | 0.0503 | 1.1111 | 0.6005 | 2.0382 | | | | | | |
| GDVI | 0.1826 | 0.0654 | 0.0688 | 0.0253 | 1.2552 | 0.0791 | 1.1287 | 0.7489 | 2.1332 | | | | | | |
| NDVI | 0.7228 | 0.0663 | 0.3115 | 0.0671 | 3.0831 | 0.8030 | 1.0378 | 0.0810 | 2.0397 | | | | | | |
| TGI2 | 0.6851 | 0.2523 | 0.0503 | 0.1724 | 1.4947 | 0.6613 | 0.0720 | 1.3304 | 2.0627 | | | | | | |
JL1-83: JL1 image acquired on 3 August 2022; JL1-96: JL1 image acquired on 6 September 2022; JL1-27: JL1 image acquired on 27 November 2023; JL1-812: JL1 image acquired on 12 August 2024; UAV-V: UAV visible light imagery; UAV-T: UAV thermal infrared imagery; UAV-VT: fused UAV visible/thermal imagery; Mean: average value; STD: standard deviation; M: discrimination between the post-fire (JL1-96) and pre-fire (JL1-83) images; M83: discrimination index M using the pre-fire reference (3 August 2022); M96: discrimination index M using the post-fire reference (6 September 2022). The UAV columns are empty for NIR-based indices (BAI, DVI, GDVI, NDVI, TGI2), which cannot be computed from the visible/thermal UAV bands.
Table 4. Mean spectral index.
| Indices | Road | Bare Land | Deadwood | Grass | Bamboo | Broad-Leaved Forest |
|---|---|---|---|---|---|---|
| RR_V | 0.372 | 0.360 | 0.346 | 0.359 | 0.344 | 0.341 |
| GR_V | 0.338 | 0.335 | 0.334 | 0.349 | 0.378 | 0.363 |
| BR_V | 0.291 | 0.305 | 0.320 | 0.292 | 0.278 | 0.296 |
| RGBI_V | 0.027 | 0.012 | 0.005 | 0.077 | 0.197 | 0.132 |
| GLI_V | 0.009 | 0.004 | 0.002 | 0.035 | 0.096 | 0.065 |
| EXG_V | 0.027 | 0.008 | 0.003 | 0.064 | 0.164 | 0.109 |
| TGI1_V | 0.033 | 0.014 | 0.005 | 0.042 | 0.092 | 0.061 |
| RR_T | 0.230 | 0.311 | 0.313 | 0.314 | 0.079 | 0.098 |
| GR_T | 0.457 | 0.369 | 0.414 | 0.412 | 0.350 | 0.285 |
| BR_T | 0.313 | 0.320 | 0.273 | 0.274 | 0.571 | 0.616 |
| RGBI_T | 0.718 | 0.436 | 0.542 | 0.560 | 0.497 | 0.220 |
| GLI_T | 0.246 | 0.058 | 0.156 | 0.152 | 0.029 | −0.119 |
| EXG_T | 0.484 | 0.157 | 0.327 | 0.312 | 0.073 | −0.078 |
| TGI1_T | 0.235 | 0.086 | 0.178 | 0.168 | −0.022 | −0.083 |
| RR_VT | 0.335 | 0.348 | 0.326 | 0.329 | 0.124 | 0.209 |
| GR_VT | 0.382 | 0.360 | 0.416 | 0.399 | 0.347 | 0.301 |
| BR_VT | 0.283 | 0.292 | 0.258 | 0.273 | 0.529 | 0.490 |
| RGBI_VT | 0.252 | 0.260 | 0.539 | 0.482 | 0.370 | 0.016 |
| GLI_VT | 0.104 | 0.048 | 0.161 | 0.129 | 0.023 | −0.079 |
| EXG_VT | 0.356 | 0.140 | 0.321 | 0.292 | 0.055 | −0.114 |
| TGI1_VT | 0.192 | 0.085 | 0.176 | 0.161 | −0.026 | −0.092 |
| RR | 0.397 | 0.347 | 0.340 | 0.331 | 0.319 | 0.315 |
| GR | 0.318 | 0.318 | 0.316 | 0.319 | 0.325 | 0.321 |
| BR | 0.285 | 0.335 | 0.345 | 0.350 | 0.356 | 0.363 |
| RGBI | −0.053 | −0.069 | −0.079 | −0.064 | −0.034 | −0.052 |
| GLI | −0.035 | −0.036 | −0.040 | −0.033 | −0.018 | −0.028 |
| EXG | −0.028 | −0.021 | −0.023 | −0.018 | −0.010 | −0.014 |
| TGI1 | −0.006 | −0.010 | −0.012 | −0.010 | −0.006 | −0.009 |
| TGI2 | −0.301 | −0.873 | −1.058 | −0.903 | −0.620 | −0.858 |
| DVI | 0.056 | 0.077 | 0.073 | 0.094 | 0.171 | 0.153 |
| GDVI | 0.107 | 0.091 | 0.084 | 0.100 | 0.168 | 0.151 |
| NDVI | 0.101 | 0.185 | 0.189 | 0.238 | 0.389 | 0.372 |
_V—visible light; _T—thermal infrared; _VT—fused visible/thermal infrared; indices without a suffix are from JL1 imagery.
Zhu, J.; Liu, Y.; Liang, X.; Liu, F. Comparative Calculation of Spectral Indices for Post-Fire Changes Using UAV Visible/Thermal Infrared and JL1 Imagery in Jinyun Mountain, Chongqing, China. Forests 2025, 16, 1147. https://doi.org/10.3390/f16071147