Evaluation of Prescribed Fires from Unmanned Aerial Vehicles (UAVs) Imagery and Machine Learning Algorithms

Abstract: Prescribed fires have been applied in many countries as a useful management tool to prevent large forest fires. Knowledge on burn severity is of great interest for predicting post-fire evolution in such burned areas and, therefore, for evaluating the efficacy of this type of action. In this research work, the severity of two prescribed fires that occurred in "La Sierra de Uría" (Asturias, Spain) in October 2017 was evaluated. An Unmanned Aerial Vehicle (UAV) with a Parrot SEQUOIA multispectral camera on board was used to obtain post-fire surface reflectance images in the green (550 nm), red (660 nm), red-edge (735 nm), and near-infrared (790 nm) bands at high spatial resolution (GSD 20 cm). Additionally, 153 field plots were established to estimate soil and vegetation burn severity. Severity patterns were explored using Probabilistic Neural Network (PNN) algorithms based on field data and UAV image-derived products. The PNN correctly classified 84.3% of vegetation and 77.8% of soil burn severity levels (overall accuracy). Future research needs to be carried out to validate the efficacy of this type of action in other ecosystems under different climatic conditions and fire regimes.


Introduction
Wildfires are a natural phenomenon whose number, extent and severity have dramatically increased in the Mediterranean Basin in recent decades due to major territorial changes and global warming, amongst other causes [1,2]. Satisfactory management policies that stimulate vegetation regeneration after fire and impede soil losses can only be derived from accurate maps of the impact of fire on vegetation and soil [3]. Burn severity refers to the effects of a fire on the environment, typically focusing on the loss of vegetation both above and below ground, but also including soil impacts [4]. Vegetation burn severity refers to the effect on vegetation, including short- and long-term impacts [5], whereas soil burn severity mainly refers to the loss of organic matter in soil [6]. In this context, technical advances in both in situ evaluation of fire damage and post-fire regeneration monitoring must be a priority for management purposes in fire-prone areas [7]. In recent years, new insights into geo-information technology have provided a great opportunity for evaluating fire effects in natural ecosystems at different scales with low field effort [8]. Although satellite images have been widely applied in this field, they show certain weaknesses, such as low temporal resolution not controlled by the user, cloud cover, or multispectral spatial resolution coarser than 1 m. This can limit their use in post-fire monitoring studies requiring very high spatial resolution, such as those evaluating changes in soil organic carbon or soil structure [9,10]. Unmanned Aerial Vehicles (UAVs) may assist in these situations: their low speed and flight altitude enable very high spatial resolution images (less than 0.02 m) to be obtained [11]. Indeed, they are usually less costly than other techniques when used in small zones.
Other relevant advantages are the possibility of user-programmed flights for data collection in target areas and flexibility in the type of sensor installed on board (for example, Red-Green-Blue (RGB), multispectral or Laser Imaging Detection and Ranging (LiDAR)) [12].
Prescribed burnings are based on the intentional and controlled use of fire. Fire is introduced with an established duration, a determined intensity and a fixed rate of spread under specific environmental conditions [13], which usually differ from those encountered during the fire season. Consequently, prescribed burnings mainly impact surface fuels and understory vegetation, in contrast to wildfires [13]. In prescribed burnings, vegetation is managed by controlling fire intensity to remove different percentages of fuel according to specific objectives [14]. The real intensity of the fire and the amount of consumed fuel are the key parameters when analysing the efficacy of this action [15]. Both factors are directly linked to burn severity, and their assessment is essential to anticipate post-fire evolution. In general, the spatial patterns of burn severity after prescribed burning may be highly heterogeneous, depending on plant community composition prior to fire, fuel distribution and environmental characteristics during burning. In such cases, images with a spatial resolution of less than 1 m, like those collected by UAVs, can be used to measure the efficacy of both prescribed burnings and post-fire management actions [11]. Although few studies have been carried out to prove the usefulness of sensors on board UAVs for evaluating post-fire vegetation damage and recovery [16,17], we are convinced of their usefulness.
Regarding the methodology used to classify multispectral images and estimate burn severity, Artificial Neural Networks (ANNs) demonstrated their suitability for classifying objects in forest science [18][19][20]. ANNs are a type of artificial intelligence (AI) modeled on the biological neural networks of the human brain, which are able to model any linear or non-linear relationships between a set of input and output variables [21]. Thus, an ANN is based on highly interconnected processing elements which simulate the basic functions of human neurons [21].
In this context, our goal is to evaluate the viability of images obtained by a multispectral sensor on board a UAV to estimate vegetation and soil burn severity after prescribed burning using an ANN-based classifier.

Study Area
Our study area is located in Sierra de Uría (Asturias, Spain). The Principality of Asturias is a Spanish territory with a significant occurrence of forest fires: in the period 2015–2019, 6000 fires affecting a total of 65,000 ha occurred in the area. Two prescribed fires were conducted in adjacent 7 ha plots located at 43°6′17″ N, 6°50′52″ W (Figure 1). The first prescribed fire was performed on 8 October 2017 and the second a week later. The area lies at an average altitude of 1170 m above sea level, with a 10% slope, facing west. There are no important topographic variations in slope or aspect in either plot. Similarly, the study area is fairly homogeneous regarding vegetation type. It comprises quartzite and highly organic stony ground with Umbrisol soils. The current vegetation is heath-gorse belonging to the Pterosparto-Ericetum aragonensis subas. ulicetosum breoganii nova association. The flora composition in these shrublands consists of Spanish heather (Erica australis subsp. aragonensis), bell heather (Erica cinerea and Erica umbellata), common heather (Calluna vulgaris) and St Dabeoc's heather (Daboecia cantabrica), with western gorse (Ulex gallii subsp. breoganii), prickly broom and winged broom (Pterospartum tridentatum subsp. lasianthum). It corresponds to Rothermel's fuel model 6 (0.5–1.2 m) [22].

Materials
An FV8 octocopter (from ATyges), weighing 3.5 kg with a maximum payload mass of 1.5 kg, was used as the UAV. We collected post-fire images using a Parrot Sequoia multispectral camera with four 1.2-megapixel monochrome sensors. Each of the four sensors acquires data in a different wavelength range: green (530–570 nm), red (640–680 nm), red-edge (730–740 nm) and near-infrared (NIR, 770–810 nm). The horizontal field of view (HFOV) of the multispectral camera is 70.6°; the vertical field of view (VFOV), 52.6°; and the diagonal field of view (DFOV), 89.6°, the focal length being equal to 4 mm. At a mean flight altitude of 120 m, the ground sampling distance (GSD) was 14.4 cm. The multispectral data were georeferenced from the computed image positions based on an onboard Global Navigation Satellite System (GNSS). Mean geolocation accuracy (m) was x, 1.04; y, 1.04; and z, 1.27. In addition, an irradiance sensor recording the specific light conditions was installed on the UAV facing upwards. Each image capture adjustment is kept in a metadata text file together with the irradiance sensor data. During the pre-processing stage, absolute reflectance values were obtained using Pix4D software, yielding four surface reflectance images (GSD = 20 cm) as output. The UAV covered a flight area of 0.9151 km² (91.51 ha). The flight took place approximately one month after burning (7 November 2017).
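As a rough cross-check of the reported resolution, the across-track GSD can be approximated from the flight altitude, HFOV and sensor pixel count; the 1280-pixel sensor width assumed below is an illustrative figure for a 1.2-megapixel sensor, not a value stated in this paper:

```python
import math

def ground_sampling_distance(altitude_m, hfov_deg, px_across):
    """Approximate across-track GSD (m/px) for a frame camera:
    ground swath = 2 * h * tan(HFOV / 2), divided by the pixel count."""
    swath = 2.0 * altitude_m * math.tan(math.radians(hfov_deg) / 2.0)
    return swath / px_across

# Parrot Sequoia monochrome bands: HFOV 70.6 deg; 1280 px across is an
# assumed sensor width for the 1.2-megapixel sensors.
gsd = ground_sampling_distance(120.0, 70.6, 1280)
print(f"{gsd * 100:.1f} cm/px")  # roughly 13 cm, close to the 14.4 cm reported
```

Because the swath scales linearly with altitude, halving the flight height would roughly halve the GSD, which is the main lever for the very high resolutions mentioned in the Introduction.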

Methods
A set of 153 1 m × 1 m plots was established in the field to estimate soil and vegetation burn severity, and their positions were recorded by GPS (see Figure 1, lower). Field plots were systematically distributed following a square grid. We established 88 plots in high burn severity areas, 25 in moderate-low burn severity areas, and 40 in non-burned areas. We adapted the method proposed by Key and Benson [23] to quantify burn severity in each plot. For each stratum (substrate and vegetation) we rated different parameters from 0 (non-burned) to 3 (maximum burn severity) and averaged them to obtain a single value per stratum. The following variables were used to assess burn severity: light fuel consumed, char and color for the substrate; and foliage consumed and stem diameter for the vegetation stratum. Figure 2 shows photographs taken before and after the prescribed burns and examples of the different burn severity levels (high, moderate-low and non-burned).
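The plot-level scoring just described (0–3 ratings averaged per stratum) can be sketched as follows; the 0.5/1.5 class cut-offs and the example ratings are illustrative assumptions, not values taken from the field protocol:

```python
def stratum_severity(ratings):
    """Average of 0-3 ratings for one stratum (substrate or vegetation),
    following the adapted Key and Benson protocol described above."""
    if any(not 0 <= r <= 3 for r in ratings):
        raise ValueError("each rating must lie in [0, 3]")
    return sum(ratings) / len(ratings)

def severity_class(score):
    """Map an averaged score to the three classes used in this study.
    The 0.5 / 1.5 cut-offs are illustrative assumptions, not the paper's."""
    if score < 0.5:
        return "non-burned"
    if score < 1.5:
        return "moderate-low"
    return "high"

# Substrate ratings in one plot: light fuel consumed, char, color
# (hypothetical values)
substrate = stratum_severity([3, 2, 3])
print(severity_class(substrate))  # "high"
```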
As the time between the two prescribed burnings was very short, the datasets were analyzed jointly using a specific type of ANN: the Probabilistic Neural Network (PNN) algorithm. This is a non-parametric method for classifying observations, which has proved to be highly accurate in previous remote sensing applications [24][25][26]. PNN-based classifiers have shown higher accuracy than back-propagation neural networks (BPNNs), radial basis functions (RBFs) and multilayer perceptron neural networks (MLPs) [27,28]. PNN classifiers make no assumptions about the distribution of the variables, as they are non-parametric algorithms [29,30]. They estimate the density function of each class using a Parzen window, a kernel-based method that weighs the observations in each group in relation to their distance from the specified location [31]. Usually, the scale of the Parzen weight function can be optimized by jackknifing or defined by the user [29]. Figure 3 shows a basic scheme of the PNN structure developed in this study, consisting of (1) an input layer with four neurons (our four input variables: green, red, red-edge and NIR data), (2) a pattern layer with 153 neurons (all the samples were used to train the network, as we used jackknifing), (3) a summation layer with three neurons (our outputs: high burn severity, moderate-low burn severity and non-burned) and (4) an output layer with a binary neuron for each output [24].
Basically, the input layer feeds the neurons of the pattern layer, providing them with the information from the four spectral bands of the UAV. These values, denoted by $X_1$ through $X_4$, are first standardized by subtracting the sample mean of the 153 training cases and dividing by the sample standard deviation. From these standardized values, the pattern layer builds an activation function to estimate the probability density function for each group. In this network, the activation function quantifies the contribution of the $i$-th training case to the estimated density function for group $j$ and is given by

$$g_{ij}(X) = W\!\left(\frac{X - X_i}{\sigma}\right),$$

where $W$ is the Gaussian function, chosen because of its shape, and $\sigma$ is a scale parameter that defines how quickly the influence of a point decreases as a function of its distance from $X$.
Next, the estimates of the probability density function are transferred to the summation layer, which puts the information from the 153 training cases together with misclassification costs and prior probabilities, thus obtaining a score for each group. Letting $n_j$ represent the number of observations in the training set belonging to group $j$, the estimated density function for group $j$ at location $X$ is proportional to

$$f_j(X) \propto \frac{1}{n_j}\sum_{i \in \mathrm{group}\,j} W\!\left(\frac{X - X_i}{\sigma}\right).$$

Finally, these scores enable the binary neuron in the output layer corresponding to the group with the largest score to be turned on, while all other output neurons are turned off.
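Assuming a Gaussian Parzen kernel with a fixed, user-defined σ (the study instead tunes it by jackknifing), the three PNN stages described above can be sketched in a few lines of NumPy; the reflectance samples and class labels below are synthetic and purely illustrative:

```python
import numpy as np

def pnn_classify(X_train, y_train, x_new, sigma=0.5):
    """Minimal probabilistic neural network, mirroring the layers above:
    pattern layer   = one Gaussian kernel per training case,
    summation layer = mean kernel response per class (1/n_j * sum),
    output layer    = class with the largest score."""
    # Standardize with the training-sample mean and standard deviation
    mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
    Z, z = (X_train - mu) / sd, (x_new - mu) / sd
    scores = {}
    for cls in np.unique(y_train):
        D = Z[y_train == cls] - z                         # pattern layer
        k = np.exp(-np.sum(D**2, axis=1) / (2 * sigma**2))
        scores[cls] = k.mean()                            # summation layer
    return max(scores, key=scores.get)                    # output layer

# Hypothetical reflectance rows: [green, red, red-edge, NIR]
rng = np.random.default_rng(0)
burned = rng.normal([0.05, 0.06, 0.10, 0.15], 0.01, (20, 4))
unburned = rng.normal([0.08, 0.05, 0.30, 0.45], 0.02, (20, 4))
X = np.vstack([burned, unburned])
y = np.array(["high"] * 20 + ["non-burned"] * 20)
print(pnn_classify(X, y, np.array([0.05, 0.06, 0.11, 0.16])))  # "high"
```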
We trained the PNN using jackknifing, which removes one sample at a time from the training set (n = 153) and determines how often it is correctly classified when it is not used to estimate the group scores. We used jackknifing rather than separate training and validation sets because of the relatively low number of samples available (in particular, of moderate-low severity samples). All computations were made using Statgraphics Centurion software.
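The jackknife procedure amounts to a leave-one-out loop; the compact kernel score below restates the per-class PNN score, while the candidate σ grid and the synthetic, already-standardized data are illustrative assumptions:

```python
import numpy as np

def loo_accuracy(Z, y, sigma):
    """Jackknife: classify each standardized sample with itself held out,
    using per-class mean Gaussian-kernel scores, and count correct calls."""
    hits = 0
    for i in range(len(Z)):
        keep = np.arange(len(Z)) != i
        scores = {
            c: np.exp(-np.sum((Z[keep][y[keep] == c] - Z[i])**2, axis=1)
                      / (2 * sigma**2)).mean()
            for c in np.unique(y[keep])
        }
        hits += max(scores, key=scores.get) == y[i]
    return hits / len(Z)

# Synthetic standardized band values (hypothetical), two severity classes
rng = np.random.default_rng(1)
Z = np.vstack([rng.normal(-1, 0.5, (25, 4)), rng.normal(1, 0.5, (25, 4))])
y = np.array(["high"] * 25 + ["non-burned"] * 25)

# Pick the sigma whose held-out classification rate is highest
best = max([0.1, 0.3, 1.0, 3.0], key=lambda s: loo_accuracy(Z, y, s))
print(best, loo_accuracy(Z, y, best))
```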


Results
Based on the scatterplots in Figure 4 and the partial correlation coefficients in Table 1, a linear relationship and high correlation were found between the red-edge and NIR bands and between the green and red bands. However, low separability between non-burned and burned samples can be observed in the green versus red scatterplot, whereas it is higher in red-edge versus NIR. On the other hand, the scatterplots for green versus red-edge, green versus NIR, red versus red-edge, and red versus NIR showed little confusion between burn severity levels (Figure 4).
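Pairwise partial correlations of the kind reported in Table 1 can be obtained from the inverse of the covariance (precision) matrix, taking the sign-flipped, normalized off-diagonal entries; the band data below are synthetic stand-ins for the 153 plot spectra:

```python
import numpy as np

def partial_correlations(X):
    """Partial correlation between each pair of columns, controlling for
    the remaining columns, via the inverse covariance (precision) matrix."""
    theta = np.linalg.inv(np.cov(X, rowvar=False))
    d = np.sqrt(np.diag(theta))
    P = -theta / np.outer(d, d)
    np.fill_diagonal(P, 1.0)
    return P

# Hypothetical band values: columns = green, red, red-edge, NIR, built from
# a shared signal plus independent noise so the bands are correlated
rng = np.random.default_rng(2)
base = rng.normal(size=(153, 1))
bands = np.hstack([base + 0.3 * rng.normal(size=(153, 1)) for _ in range(4)])
P = partial_correlations(bands)
print(np.round(P, 2))
```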
Partial correlation coefficients between each pair of UAV spectral bands are shown in Table 1. To complement this information, Table 1 also displays the coefficients (intercept and slope) of the linear regression among the four UAV spectral bands for each burn class (high burn severity, moderate-low burn severity, and non-burned). Table 2 shows the percentage of vegetation and soil burn severity levels correctly classified (overall accuracy) by the PNN using the total data set. More accurate results were obtained for vegetation burn severity (84.31%) than for soil burn severity (77.78%). Accuracy percentages for the moderate-low severity level were lower than for the high severity level (except for producer accuracy in vegetation burn severity). The PNN discriminated categorically between non-burned and burned areas (the percentage of correct classification for the non-burned class was 100.0%).
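The accuracy figures in Table 2 follow the standard confusion-matrix definitions; the sketch below uses illustrative counts (column totals chosen to match the 88/25/40 field plots, but the cell values are not the paper's actual confusion matrix):

```python
import numpy as np

def overall_accuracy(cm):
    """Overall accuracy: correctly classified plots / total plots."""
    return np.trace(cm) / cm.sum()

def producer_accuracy(cm):
    """Producer's accuracy per class: diagonal / column (reference) totals,
    with rows = classified and columns = reference."""
    return np.diag(cm) / cm.sum(axis=0)

# Illustrative confusion matrix (not the paper's actual counts):
# rows/columns ordered high, moderate-low, non-burned
cm = np.array([[80,  6,  0],
               [ 8, 19,  0],
               [ 0,  0, 40]])
print(overall_accuracy(cm))   # 139 correct of 153 plots
print(producer_accuracy(cm))
```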

Discussion
This research demonstrated the usefulness of UAV multispectral data for distinguishing soil and vegetation burn severity levels shortly after prescribed burnings. In this sense, our results agree with those from previous research papers [11,16,17], proving the efficacy of UAV multispectral data for analyzing fire damage. For instance, different levels of burn severity have been successfully determined in boreal forests on the basis of UAV imagery [17]. We tested an initial assessment of burn severity (as flights were conducted approximately one month after burning), but future studies should evaluate the persistence of the effects of prescribed burning.
Red, red-edge and NIR bands were the most useful spectral wavelengths for discriminating burn severity levels. Many studies have already validated the suitability of the red and NIR bands for this purpose (e.g., [32]), whereas the red-edge wavelength is becoming increasingly useful as a key product in fire ecology applications [33,34]. In particular, spectral indices obtained from the Sentinel-2 MultiSpectral Instrument (MSI) have been successfully used for discriminating vegetation burn severity in Mediterranean ecosystems, the most suitable being those based on the red-edge band, which can measure variations in chlorophyll content, and on NIR, mainly related to variations in leaf structure [33]. In this sense, a promising index based on Sentinel-2 MSI red-edge bands has been proposed to estimate the area burned by forest fires [34]. Our results highlight that the red-edge band at high spatial resolution could be sensitive not only to vegetation burn severity, but also to soil burn severity. Moreover, it has already been demonstrated that satellite products, like those derived from Landsat 7 ETM+, allow changes in soil properties to be evaluated in areas affected by forest fires at the high severity level [6], which agrees with our results (see Table 2). We did not find other previous studies relating UAV multispectral data to soil burn severity. It is worth highlighting that this is a preliminary work based on the original spectral bands of UAV imagery. We recommend that future studies test the advantages of using spectral indices based on UAV multispectral imagery, or even texture metrics.
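As one example of the spectral indices suggested above, NDVI and the red-edge-based NDRE can be computed directly from the four Sequoia bands; the per-pixel reflectance values below are hypothetical:

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

def ndre(nir, red_edge):
    """Normalized Difference Red Edge index, sensitive to chlorophyll:
    (NIR - red-edge) / (NIR + red-edge)."""
    return (nir - red_edge) / (nir + red_edge)

# Hypothetical reflectances: a healthy-vegetation pixel vs a burned pixel
nir      = np.array([0.45, 0.15])
red      = np.array([0.05, 0.08])
red_edge = np.array([0.30, 0.12])
print(ndvi(nir, red))        # high for vegetation, lower for the burned pixel
print(ndre(nir, red_edge))
```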
We distinguished two burn severity levels, high and moderate-low, because the differences between low and moderate burn severity were very small in the field sampling, as has occurred in other similar studies [35][36][37][38]. Furthermore, from a management point of view, the priority areas in which to consider specific restoration activities are those affected by high burn severity [39]. Thus, accurate identification of these areas is key for post-fire management purposes [40].
Despite the promising results obtained in this study, the use of the red-edge bands of multispectral sensors on board UAVs to discriminate vegetation and soil burn severity should be further validated in other study areas with different vegetation types, climatic conditions and fire regimes. Similarly, the persistence in time of the scar in the UAV image should be tested to define the maximum time after prescribed burning at which to conduct the UAV flight. Although the PNN we used showed good performance, other machine learning algorithms such as Random Forest (RF) have also proved their validity when working with remotely sensed data [41][42][43][44]. Both machine learning methods (ANNs and RFs) tend to be more powerful than conventional classifiers, and both can be used as variable selection tools to identify informative variables based on the network's performance [45] or on variable importance scores [46,47]. Therefore, in future research, we recommend comparing the performance of both algorithms when estimating burn severity from UAV multispectral data. Finally, we should underline that our study area has very homogeneous characteristics; a detailed study of the applicability of the proposed method in more heterogeneous soils should be conducted in the future.
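A comparison of the kind recommended here could start from scikit-learn's RandomForestClassifier, evaluated with the same leave-one-out (jackknife) protocol used for the PNN; the 4-band samples below are synthetic stand-ins, not the study's data:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import LeaveOneOut, cross_val_score

# Synthetic stand-in for the 4-band UAV samples (hypothetical values)
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(-1, 0.6, (30, 4)), rng.normal(1, 0.6, (30, 4))])
y = np.array(["high"] * 30 + ["non-burned"] * 30)

# Jackknife-style evaluation matching the PNN protocol in this study
rf = RandomForestClassifier(n_estimators=200, random_state=0)
acc = cross_val_score(rf, X, y, cv=LeaveOneOut()).mean()
print(f"leave-one-out accuracy: {acc:.3f}")

# Variable importance scores, usable for band selection as suggested above
rf.fit(X, y)
print(dict(zip(["green", "red", "red-edge", "NIR"],
               rf.feature_importances_.round(2))))
```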

Conclusions
Multispectral images obtained with a Parrot SEQUOIA camera on board a UAV, in combination with artificial intelligence-based methods, allow the successful evaluation of soil and vegetation burn severity at very high spatial resolution after prescribed burnings. These results can contribute to the accurate evaluation of the usefulness of prescribed burning. Nevertheless, further research is required to extrapolate the conclusions from this initial study to other forest fire regimes and different ecosystems.

Funding: This work is part of the FIRESEVES (AGL2017-86075-C2-1-R) project, funded by the Spanish Ministry of Economy and Competitiveness and the European Regional Development Fund, and of SEFIRECYL (LE001P17), funded by the government of the Castile and León autonomous region.