Mapping Soil Burn Severity at Very High Spatial Resolution from Unmanned Aerial Vehicles

The evaluation of the effect of burn severity on forest soils is essential to determine the impact of wildfires on a range of key ecological processes, such as nutrient cycling and vegetation recovery. The main objective of this study was to assess the potential of different spectral products derived from RGB and multispectral imagery collected by unmanned aerial vehicles (UAVs) at very high spatial resolution for discriminating spatial variations in soil burn severity after a heterogeneous wildfire. As a case study, we chose a mixed-severity fire that occurred in the northwest (NW) of the Iberian Peninsula (Spain) in 2019 and affected 82.74 ha covered by three different types of forests, dominated by Pinus pinaster, Pinus sylvestris, and Quercus pyrenaica, respectively. We evaluated soil burn severity in the field 1 month after the fire using the Composite Burn Soil Index (CBSI), as well as a pool of five easily interpretable individual indicators (ash depth, ash cover, fine debris cover, coarse debris cover, and unstructured soil depth). Simultaneously, we operated an unmanned aerial vehicle to obtain RGB and multispectral postfire images, from which six spectral indices were derived. We then explored the relationship between the spectral indices and the field soil burn severity metrics by means of univariate proportional odds regression models. These models were used to predict CBSI categories, and the classifications were validated through confusion matrices. Results indicated that multispectral indices outperformed RGB indices when assessing soil burn severity and were more strongly related to the CBSI than to the individual indicators. The Normalized Difference Water Index (NDWI) was the best-performing spectral index for modelling the CBSI (R2cv = 0.69), showing the best ability to predict CBSI categories (overall accuracy = 0.83).
Among the individual indicators of soil burn severity, ash depth achieved the best results, specifically when modelled from the NDWI (R2cv = 0.53). This work provides a useful background for designing quick and accurate assessments of soil burn severity to be implemented immediately after fire, a key factor in identifying priority areas for emergency actions after forest fires.


Introduction
Wildfires are major drivers of forest functioning [1][2][3]. In the Mediterranean Basin, current shifts in fire regime parameters (e.g., frequency, intensity, and severity), associated with global change trends, might generate harsh ecological effects in forest ecosystems [4]. Particularly relevant are the ecological consequences of burn severity, defined as the magnitude of the environmental change caused by fire [5]. Burn severity patterns might vary at different scales across the landscape due to environmental heterogeneity associated with differences in topography, moisture, vegetation diversity, and flammability. Consequently, complex and heterogeneous land mosaics can emerge after fire occurrence [6,7]. The multiscale evaluation of such spatial variations of burn severity and their ecological implications on biotic and abiotic fluxes is, at present, a leading topic in the research field of fire ecology [8][9][10].
Soil burn severity is defined as the fire-induced changes in organic and mineral soil layers, including organic matter loss, char depth, altered color and structure, and reduced infiltration [11][12][13]. As a key compartment of ecosystems, soil supports a wide range of ecological processes, such as nutrient cycling, productivity, and vegetation dynamics [14,15]. Therefore, when forest landscapes are affected by mixed-severity fire events, a spatially explicit diagnosis of soil damage should be performed immediately after the disturbance in order to determine the subsequent ecological and biochemical impacts generated within and across ecosystems [16,17]. In this sense, accurate methodologies to identify soil burn severity need to be developed at a landscape scale to allow for recognizing priority areas where soil has been affected by high soil burn severity and, consequently, would need restoration actions to avoid further runoff and erosion [18][19][20]. Currently, the methodology for building a standard field quantitative index to determine soil burn severity is a hot topic in fire ecology research [21,22]. Land managers and soil scientists often apply methodological standards based on the alteration of four vertical vegetation strata and one soil stratum, such as the Composite Burn Index (CBI), developed for initial (immediately after fire) and extended (1 year after fire) assessments of burn severity [5], and one of its multiple adaptations [23][24][25]. Usually, the estimation of soil burn severity is based on semiquantitative visual indicators measurable in the field, such as ash characteristics, unburned branch thickness, degree of fuel consumption, and soil structural alterations [11,18,19,[26][27][28]. Of particular interest is the presence of char and ashes after fire [13,[29][30][31], as well as the depth of unstructured soil, due to their crucial role in postfire erosion processes [32].
The current challenge is to develop spatially explicit tools that allow for a proper and easy evaluation of the above-mentioned soil physicochemical parameters in the face of fine-scale spatial variation of soil burn severity across heterogeneous landscapes. In this sense, remote sensing techniques have been shown to be effective at discriminating spatial patterns in specific soil properties based on the environmental characteristics of each landscape [33,34]. Multispectral imagery collected by earth observation satellites (EOSs) is a powerful instrument that has been widely explored in many fire ecology applications [9,10,24,31,35], including burn severity. A multitude of remote sensing approaches have been developed to map burn severity on the basis of multispectral variations. For instance, the decrease in red and near-infrared (NIR) reflectance after fire is known to produce distinctive spectral signatures for the different surfaces affected by fire. Specifically, a large number of spectral indices, such as the Normalized Difference Vegetation Index (NDVI) [36], the Normalized Burn Ratio (NBR) [5], the Enhanced Vegetation Index (EVI) [37], and the Char Soil Index (CSI) [38], have been used to estimate total (i.e., vegetation and soil) burn severity [24]. Nevertheless, spectral indices computed from multispectral satellite imagery may reflect burn severity in vegetation better than in soil [24]. In addition, the fine spatial scale of soil burn severity patterns in landscapes with strong environmental heterogeneity requires remote sensing data at a higher spatial resolution than that commonly provided by current EOS missions [39,40]. The limited availability of cloud-free satellite imagery for the specific dates required by bitemporal burn severity assessments is another shortcoming of this type of imagery.
In this context, RGB and multispectral imagery collected by unmanned aerial vehicles (UAVs) is a promising tool for evaluating spatial variations in soil burn severity at very high spatial resolution, as it can be acquired on demand at a low economic cost [41,42]. UAVs are, in fact, a sound alternative to other remote sensing techniques for surveying relatively small areas (tens of hectares) at very high spatial resolution with great versatility [43,44]. UAV imagery provides more information in terms of spatial variability in heterogeneous burned areas than high-resolution satellite imagery such as WorldView-2 [42]. Despite this great potential, the development and implementation of operational and accurate tools to map soil burn severity from UAV imagery, in the short term and with reasonable effort, remains a challenge in current forest fire research.
The main objective of this study is to assess the ability of different spectral products derived from RGB and multispectral imagery collected at very high spatial resolution from a UAV platform to characterize spatial variations in soil burn severity in the short term after a mixed-severity wildfire. Specifically, we aim to analyze the capacity of RGB and multispectral indices to model composite indices of soil burn severity (Composite Burn Soil Index) and individual indicators of soil burn severity (ash depth, ash cover, fine debris cover, coarse debris cover, and unstructured soil depth). We expect that UAV multispectral indices will allow a better soil burn severity characterization than RGB indices, identifying fire-induced modifications in the forest soil compartment at a fine spatial scale.

Materials and Methods
The methodology followed in this study was structured in four steps (Figure 1): (i) selection of the study area; (ii) collection of field measurements of soil burn severity; (iii) UAV imagery acquisition, preprocessing, and spectral index calculation; and (iv) statistical analysis.

Study Area
The study was conducted in Villapadierna (León province, NW Spain; Figure 2), in an area affected in 2019 by a mixed-severity wildfire that left heterogeneous fire effects, including tree mortality and a residual presence of scorched canopies and living trees (Figure 2). The area suffered a similar fire in the summer of 2012 that burned 70 ha of pine forest. The study area has a smooth relief, with elevations ranging between 922 and 1027 m a.s.l. and slopes between 6% and 27%. The lithology is dominated by silts, sands, and clays, with conglomerate layers in the lowest areas [45]. Soils are classified as Dystric and Humic Cambisols [46]. The climate is Mediterranean, with 2-3 months of summer drought and a mean annual precipitation and temperature of 761 mm and 10.7 °C, respectively [47].

Field Measurements of Soil Burn Severity
To quantify soil burn severity in the field, a total of 80 plots of 50 cm × 50 cm were randomly established 1 month after the wildfire within a study framework of 16 ha. The plots were located on homogeneous burn patches with a minimum size of 2 × 2 m and without tree canopies, to avoid interference in the UAV imagery acquisition. Soil burn severity was evaluated in the field in two different ways: (i) using a Composite Burn Soil Index (CBSI) and (ii) using individual visual indicators different from those included in the CBSI. The CBSI is a modified version of the substrate stratum of the Composite Burn Index (CBI) [5] that includes modifications made by other authors, such as the inclusion of char depth [24] and the combination of medium and heavy fuel consumed into one rating factor [23]. CBSI semiquantitative estimations were made in the field considering four biophysical parameters (Table 1), and two trained observers independently classified each of the 50 × 50 cm field plots into a severity class: no effect (0), low severity (0.1-1.24), moderate severity (1.25-2.24), and high severity (2.25-3) (Figure 2 and Table 1). CBSI thresholds were based on those proposed by [48] for the CBI. Since the CBSI can be affected by some interpretation limitations (i.e., it is unclear whether all the parameters considered in the index reflect the same pattern of change), we additionally measured, on a continuous scale, a set of visual parameters different from and complementary to those integrated into the CBSI, which can individually help to explain soil loss in the short term after wildfires [18,19,26,49]: (1) ash depth (cm), (2) ash cover (%), (3) cover of fine debris (<2 cm; %), (4) cover of coarse debris (>2 cm; %), and (5) unstructured soil depth (cm). Ash depth was quantified as the mean value of five measurements collected with a digital caliper on a soil profile for each plot.
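The CBSI break points above can be encoded as a small helper function; this is an illustrative sketch (the function name is ours, the cut-points are those given in the text):

```python
def cbsi_category(score: float) -> str:
    """Map a CBSI score (0-3) to a soil burn severity class.

    Thresholds follow the text: no effect (0), low severity
    (0.1-1.24), moderate severity (1.25-2.24), high severity (2.25-3).
    """
    if not 0.0 <= score <= 3.0:
        raise ValueError("CBSI scores range from 0 to 3")
    if score < 0.1:
        return "no effect"
    if score < 1.25:
        return "low"
    if score < 2.25:
        return "moderate"
    return "high"
```

Because the observers rate the plot on a continuous 0-3 scale, such a mapping makes the categorization reproducible across observers.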
Cover variables (ash and debris) were visually estimated as percentage cover [50]. The unstructured soil depth indicates the depth to which the soil structure has been altered by the fire, allowing greater subsidence [32]. This parameter was measured from the soil surface (i.e., to the point where the soil shows slight resistance) with a digital caliper at five random points per plot and subsequently averaged on a plot basis. Neither significant precipitation episodes nor wind events were recorded between the fire and the end of the sampling period.

UAV Imagery Acquisition, Preprocessing, and Spectral Index Calculation
UAV imagery covered the 16 ha sampling framework that was established within the fire perimeter for field work (Figure 2). The aerial survey was conducted on 25 September 2019 using an FV8 octocopter developed by ATyges. A total of six flights were carried out between 11:00 and 13:00 UTC under optimal atmospheric conditions in terms of lighting and wind. Each flight had an effective duration of 5-6 min, excluding both takeoff and landing. The average flight speed was 5 m·s−1.
A Parrot SEQUOIA multispectral camera was installed underneath the UAV platform. This camera has four monochrome sensors with a resolution of 1.2 megapixels (1280 × 960), which allow for collecting global-shutter imagery along four discrete spectral bands [51]: green (center wavelength: 550 nm), red (660 nm), red edge (735 nm), and near-infrared (NIR; 790 nm). The horizontal (HFOV), vertical (VFOV), and diagonal (DFOV) fields of view of the multispectral camera were 70.6°, 52.6°, and 89.6°, respectively, with a focal length of 4 mm. The average ground sample distance (GSD) was 6.64 cm, corresponding to a flight height of 50 m above ground level. The camera trigger interval and the planned waypoint route allowed the collection of 5060 multispectral images with an 80% forward and side overlap. The exposure time was set to automatic. The camera included an irradiance sensor to record light conditions during the flights. The settings of every image were saved in a text metadata file along with the irradiance sensor data. Before each flight, an image of the SEQUOIA reflectance calibration panel was captured in order to radiometrically calibrate the multispectral orthomosaic and obtain absolute reflectance values. Additionally, a 16-megapixel (4608 × 3456) Sony a6000 mirrorless RGB camera with a 20 mm Sony Pancake lens was installed underneath the UAV platform. A total of 1265 RGB images were collected from a flying altitude of 50 m above ground level, resulting in a ground sample distance (GSD) of 1.70 cm.
The UAV photogrammetric processing was carried out using Pix4Dmapper v4.4.12. This software integrates computer vision techniques with photogrammetry algorithms to obtain high-accuracy aerial imagery products [52,53]. Pix4D uses structure-from-motion (SfM) algorithms in an automated workflow to align the raw imagery and create a densified 3D point cloud [54]. Once the densified 3D point cloud is generated, the software builds highly detailed digital surface models (DSMs), which were used to generate the reflectance and RGB orthomosaics. Multispectral and RGB imagery were processed using the "Ag Multispectral" and "Ag RGB" templates, respectively. To improve the spatial accuracy, a dataset consisting of 16 ground control points (GCPs) was added to the photogrammetric workflow. The final georeferencing of the multispectral orthomosaic achieved a root-mean-square error in X and Y (RMSEXY) < 26 cm. The RGB orthomosaic featured an RMSEXY < 48 cm.
From the reflectance orthomosaics, we calculated a set of spectral indices potentially useful for soil burn severity detection according to the literature (Table 2). Regarding RGB indices, the Excess Green Index (EGI) [55] and the Green Chromatic Coordinate (GCC) [56] have been widely applied in recent studies to discriminate between burn severity classes [57][58][59], as they emphasize the difference between the green reflectance peak and the reflectance of blue and red [59]. Both indices standardize differences in scene illumination conditions [60]. To detect charred organic material in the soil compartment, we used the Char Index (CI) [61]. This index is based on the low visible reflectance of burned surfaces (characterized using the Brightness Index (BI)) and on the flat spectrum responsible for their lack of color (quantified using a Maximum RGB Difference Index (MaxDiff)) [61]. The selected indices derived from multispectral imagery are expected to be sensitive to physical or biophysical parameters of the soil compartment after a fire. In particular, the Normalized Difference Vegetation Index (NDVI) [36] and the Normalized Difference Vegetation Red Edge Index (NDVIRE) [62] have been widely used for fire severity estimation [59,63]. On the other hand, the Normalized Difference Water Index (NDWI) [64] has been implemented to assess soil moisture in different applications, such as agriculture, hydrology, meteorology, and natural disaster management, including fire severity [65][66][67]. This index is highly related to soil properties, which control water availability and, therefore, the extent of soil impact by burn severity [68,69]. The values of the RGB and multispectral indices corresponding to each 50 × 50 cm CBSI plot were obtained by averaging all pixel values within each plot, following the procedure described in [70].
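Several of these indices can be sketched in a few lines of array arithmetic. The sketch below uses the standard normalized-difference formulations; note two assumptions on our part: the NDWI is written in its green/NIR form (the SEQUOIA carries no SWIR band), and the CI is omitted because its exact BI/MaxDiff formulation follows [61]. The `plot_mean` helper illustrates the per-plot pixel averaging described above.

```python
import numpy as np

def ndvi(nir, red):
    # Normalized Difference Vegetation Index [36]
    return (nir - red) / (nir + red)

def ndvi_re(nir, red_edge):
    # Red-edge variant, NDVIRE [62]
    return (nir - red_edge) / (nir + red_edge)

def ndwi(green, nir):
    # NDWI; green/NIR form assumed here, as the SEQUOIA has no SWIR band
    return (green - nir) / (green + nir)

def egi(red, green, blue):
    # Excess Green Index (RGB) [55]
    return 2 * green - red - blue

def gcc(red, green, blue):
    # Green Chromatic Coordinate (RGB) [56]
    return green / (red + green + blue)

def plot_mean(index, mask):
    # Average the index pixels falling inside one 50 x 50 cm plot footprint
    return float(index[mask].mean())
```

All functions accept reflectance arrays (any shape), so an index can be computed for a whole orthomosaic at once and then averaged per plot with a boolean footprint mask.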

Statistical Analysis
The CBSI was modeled through univariate proportional odds (PO) models [71,72], given the ordinal nature of the response variable. The RGB and multispectral indices computed from the UAV imagery were used as predictors, and the categorized CBSI was selected as the response variable. The significance of the predictors was assessed through the Wald χ2 statistic [73]. Internal model validation was conducted through 10-fold cross-validation with 10 repeats, providing the average R2cv and Somers' D as quality measures of model fit and ordinal predictions, respectively [74]. Additionally, the dataset was randomly partitioned into a training subset (2/3 of the data) and a validation subset (1/3 of the data) to generate CBSI category predictions based on the output likelihood for each observation. From the confusion matrix, we computed the user's and producer's accuracy, as well as the overall classification accuracy and the kappa index [75]. The relationship between each individual indicator of soil burn severity (i.e., ash depth, ash cover, cover of fine debris (<2 cm), cover of coarse debris (>2 cm), and unstructured soil depth) and the RGB and multispectral indices was analyzed using univariate linear models. The assumptions of homoscedasticity and normality of model residuals were checked graphically [23]. Model validation was conducted through 10-fold cross-validation with 10 repeats, providing the average R2cv to assess model performance. All statistical analyses were conducted in R [76] using the "MASS" [77], "rms" [78], "ordinal" [79], and "effects" [80,81] packages.
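In a univariate PO model with ordered categories low < moderate < high, the fitted model yields cumulative probabilities P(Y ≤ j | x) = logistic(θj − βx), and per-category probabilities follow as successive differences. A minimal pure-Python sketch of that prediction step (the cut-points and slope below are illustrative values, not the fitted ones; the authors fitted these models with R's "MASS"/"rms"/"ordinal" packages):

```python
import math

def po_category_probs(x, thetas, beta):
    """Category probabilities from a univariate proportional odds model.

    P(Y <= j | x) = logistic(theta_j - beta * x); per-category
    probabilities are successive differences of the cumulative ones.
    `thetas` must be increasing cut-points, one fewer than the number
    of ordered categories (here: low < moderate < high).
    """
    logistic = lambda z: 1.0 / (1.0 + math.exp(-z))
    cum = [logistic(t - beta * x) for t in thetas] + [1.0]
    return [cum[0]] + [cum[j] - cum[j - 1] for j in range(1, len(cum))]

# Illustrative (not fitted) parameters: two cut-points for three classes
probs = po_category_probs(x=0.2, thetas=[-1.0, 1.0], beta=4.0)
```

An observation is then assigned to the category with the highest predicted probability, which is how the CBSI category predictions entering the confusion matrix are generated.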

Results
Moderate and high soil burn severity categories defined by the CBSI in the field had a similar mean spectral signature across the four bands of the multispectral orthomosaic, as opposed to the spectral signature of the low soil burn severity category (Figure 3A). The NIR band featured the maximum separability between the low/moderate and high soil burn severity categories. In the RGB mean signatures, in contrast, the plots burned at moderate and high severity showed the maximum differences, whereas the plots burned at low severity showed a different trend (Figure 3B).

All multispectral and RGB indices were statistically significant (p Wald χ2 < 0.0001) in the univariate proportional odds (PO) models run for the CBSI, with the multispectral indices more strongly related to the CBSI than the RGB indices (Table 3 and Figure 4). Among the multispectral indices, the NDWI featured the best performance in univariate PO models, in both 10-fold cross-validation (R2cv = 0.6893 and DXY = 0.7724) and CBSI category predictions (overall accuracy (OA) = 0.8333 and kappa = 0.7419) (Figure 5). High NDWI values were significantly related to high probabilities of classification into the high CBSI severity category (Figure 4). Regarding RGB indices, the CI was the best predictor of burn severity (R2cv = 0.4413 and DXY = 0.4750; OA = 0.5833 and kappa = 0.3375), being inversely related to the probability of high soil burn severity occurrence (Figure 4). Moderate CBSI classification yielded worse accuracy than the low and high categories for every multispectral and RGB index (Table 4).

Table 3. Univariate proportional odds (PO) CBSI model performance measured by 10-fold cross-validation and CBSI category predictions.

Figure 4. Predicted probabilities of classification into low, moderate, and high burn severity CBSI categories for each spectral predictor in univariate proportional odds (PO) CBSI models.
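The validation metrics reported here (overall accuracy, kappa, user's and producer's accuracy) all derive from the confusion matrix in a standard way, sketched below; the matrix used in the test is illustrative, not the study's data.

```python
import numpy as np

def classification_metrics(cm):
    """Accuracy metrics from a confusion matrix.

    `cm` is a square array with reference (field CBSI) classes in rows
    and predicted classes in columns.
    """
    cm = np.asarray(cm, dtype=float)
    n = cm.sum()
    oa = np.trace(cm) / n                      # overall accuracy
    producers = np.diag(cm) / cm.sum(axis=1)   # per reference class
    users = np.diag(cm) / cm.sum(axis=0)       # per predicted class
    # Cohen's kappa: agreement beyond what chance would produce
    pe = (cm.sum(axis=1) * cm.sum(axis=0)).sum() / n**2
    kappa = (oa - pe) / (1 - pe)
    return oa, producers, users, kappa
```

Producer's accuracy drops when a reference class is often missed (omission), user's accuracy when a predicted class is often wrong (commission), which is how the weaker moderate-class results in Table 4 manifest.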

Individual indicators of soil burn severity were also modeled more accurately from multispectral than from RGB indices (Table 5). The ash depth and ash cover models achieved the best performance, specifically through the NDWI (R2cv = 0.5325 and R2cv = 0.4401, respectively). The correlations of each individual indicator with the UAV products were lower than those achieved when using the CBSI as the burn severity indicator.

Discussion
In this study, we evaluated for the first time the potential of very high-resolution multispectral and RGB products obtained by UAVs to characterize soil burn severity. The accuracies reached in our work when predicting soil burn severity with UAVs were similar to or higher than those found in other studies related to total burn severity [10,24,82,83], demonstrating the potential of high-resolution multispectral imagery for soil burn severity assessment.
Our results showed that multispectral indices (NDVI, NDVIRE, NDWI) outperformed RGB indices (EGI, GCC, CI) when assessing soil burn severity, measured either as a composite index or as individual indicators. The highest separability between low/moderate and high severity categories found in the NIR region and the good performance of the NDWI are two pieces of evidence that demonstrate the relevance of NIR (absent in RGB indices) for discriminating burn severity [5]. This finding has been widely reported using satellite imagery at lower spatial resolution, such as AVIRIS [66], Sentinel-2 [10], Landsat [5], and MODIS [84]. This could be explained by the NIR sensitivity to the changes caused by high temperatures on (i) mineral soil [85], (ii) ash color and quantity [86], and (iii) structure and density of leaves covering the soil surface [24,87]. The NDVIRE also showed great capacity to determine soil burn severity, which highlights the value of the red-edge band for burn severity assessments. In this sense, Ref. [63] found that the NDVIRE calculated with Sentinel-2 imagery outperformed the NDVI for assessing total burn severity (vegetation and soil burn severity). These authors attributed their results to the high sensitivity of red edge to photosynthetic pigments. Although, overall, the multispectral indices showed a high capacity to determine soil burn severity, a better performance could be expected by including a short-wave infrared (SWIR) band in the UAV multispectral camera [42]. It has been shown that SWIR bands in satellite optical sensors enhance sensitivity to changes in soil properties after fire, such as fluctuations in moisture and char content [10,88]. Spectral indices including NIR and SWIR bands would improve the discrimination of soil burn severity levels in heterogeneous landscapes, compared with indices based on NIR and red bands. 
Despite this, visible and NIR reflectance data at very high spatial resolution accurately detected changes on the postfire forest soil surface.
Moreover, we found that UAV-derived indices were better able to model the composite index of soil burn severity (CBSI) than the individual indicators of soil burn severity (ash depth, ash cover, fine debris cover, coarse debris cover, and unstructured soil depth). This result indicates that spectral indices better retrieve the overall change caused by fire on soil than the particular changes occurring in individual biophysical parameters. This has probably been assumed by many authors, as there is a general preference across the literature for using composite indices, such as the CBI [5], the GeoCBI [23], and their adaptations [24], rather than individual indicators, which could be used by themselves or combined a posteriori into a burn severity index similar to the CBSI. Other approaches for burn severity evaluation [25], based on fire effects on the soil compartment, have already shown the capacity of visual indicators and Composite Burn Index (CBI) values to reflect changes in soil biophysical properties. Indeed, several studies [89][90][91] have revealed that using individual visual indicators, such as soil organic depth, together with semiquantitative metrics, such as the CBI, is useful in the assessment of other soil characteristics, such as changes in carbon storage after combustion. Future research should focus on developing accurate remote sensing methodologies that allow a better understanding of the integrated environmental impacts of forest fires on soils.
Focusing on individual indicators, we found that ash depth and ash cover were the parameters best predicted from the NDWI. In accordance with [38], spectral indices based on NIR and SWIR bands have proven useful for identifying significant spectral variations in ash abundance. In this study, high spatial resolution sensors were effective in detecting postfire patches of ash due to the large small-scale variability observed on the soil surface. Moreover, imagery of high spectral resolution has also shown a high capacity to quantify severity indicators such as ash cover [82]. This suggests that new remote sensing methods for improving the detection of different individual indicators of soil burn severity could be developed by combining high spatial and spectral resolution imagery.
Although previous works have analyzed the potential of UAV-derived products in the fire ecology field [42,61], this study is pioneering in demonstrating the usefulness of multispectral sensors on board UAVs for accurately mapping soil burn severity at very high spatial resolution, which is particularly relevant in postfire landscapes with high fine-scale spatial heterogeneity [83,92,93]. However, future research should be conducted to validate the proposed methodology in other biomes, such as boreal forests, where fire disturbances can also generate substantial heterogeneity in soil burn severity at both plot and landscape scales [94].

Conclusions
This study constitutes a relevant contribution to fire ecology research and forest management, as it shows, for the first time, the ability of multispectral and RGB imagery collected with UAV technology to characterize soil burn severity in landscapes affected by mixed-severity wildfires. The proposed approach provides new insights that can be helpful in identifying critical areas for postfire emergency actions. The main achievements are that (i) multispectral indices, particularly the NDWI, perform better than RGB indices at determining soil burn severity at very high spatial resolution; (ii) field measurements of soil burn severity combined in a composite index (CBSI) are better predicted by spectral indices than individual indicators of soil burn severity; and (iii) ash depth and ash cover are the individual indicators of burn severity best predicted by spectral indices. Further research is needed to validate the proposed approach in other biomes under different fire regime scenarios and to assess the capability of multispectral cameras with SWIR bands or, even better, hyperspectral cameras on board UAVs to improve the mapping of individual soil indicators of burn severity. In addition, UAV remote sensing technology could benefit future research lines concerning carbon sequestration and loss, fire emissions, and other ecological impacts of fire.

Funding: This study was financially supported by the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund (ERDF) in the framework of the FIRESEVES project (AGL2017-86075-C2-1-R), and by the Regional Government of Castile and León in the framework of the WUIFIRECYL project (LE005P20). D.B.-M. was supported by a predoctoral contract from the Regional Government of Castile and León cofinanced by the European Social Fund (EDU/556/2019).

Conflicts of Interest:
The authors declare no conflict of interest.