Normalized Burn Ratio Plus (NBR+): A New Index for Sentinel-2 Imagery

The monitoring of burned areas can easily be performed using satellite multispectral images: several indices are available in the literature for highlighting the differences between healthy vegetation areas and burned areas, given their different spectral signatures. However, these indices may have limitations determined, for example, by the presence of clouds or water bodies that produce false alarms. To avoid these inaccuracies and optimize the results, this work proposes a new index for detecting burned areas, named Normalized Burn Ratio Plus (NBR+), based on the involvement of Sentinel-2 bands. The efficiency of this index is verified by comparing it with five other existing indices, all applied to an area with a surface of about 500 km² covering the north-eastern part of Sicily (Italy). To achieve this aim, both a uni-temporal approach (single-date image) and a bi-temporal approach (two-date images) are adopted. The maximum likelihood classifier (MLC) is applied to each resulting index map to define the threshold separating burned pixels from non-burned ones. To evaluate the efficiency of the indices, confusion matrices are constructed and compared with each other. The NBR+ shows excellent results, especially because it excludes a large part of the areas incorrectly classified as burned by the other indices despite actually being clouds or water bodies.


Introduction
An important research field when using satellite images is the monitoring of active fires [1], their impact on air quality [2], and other traces they leave on the environment [3,4]: burned areas (BA). In fact, accurate and rapid mapping of fire-damaged areas is necessary to support fire management, estimate environmental costs, define planning strategies, and monitor the restoration of vegetation [5]. The identification of BA through the use of remote sensing (RS) techniques, instruments, and methods represents a field of research in continuous development [6]. The ability to identify BA using Earth observation satellites with high geometric, temporal, and spectral resolution makes it possible to monitor and preserve the state of the sites; indeed, very often, fires involve areas of particular naturalistic value [7,8]. Therefore, in order to preserve the natural and man-made landscape, it is necessary to identify suitable methodologies for monitoring burned areas and, at the same time, produce severity estimation maps using satellite data [9].
In recent decades, several satellite-mounted sensors have been used for this activity, such as the Advanced Very High-Resolution Radiometer (AVHRR) [10] and the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) [11]. Another widely used sensor is the Moderate Resolution Imaging Spectroradiometer (MODIS), due to its high temporal resolution, which allows the rapid detection of active fires, the identification of burned areas, and forest fire risk assessment [12][13][14]. However, MODIS has a coarse spatial resolution, which makes detecting the spatial extent of smaller fires more difficult [15]. The launch of Landsat-8 OLI (Operational Land Imager), i.e., the eighth satellite of the Landsat program developed by a collaboration between NASA and the United States Geological Survey (USGS), and Sentinel-2 (an Earth observation mission from the Copernicus Program, formerly called Global Monitoring for Environment and Security, GMES) made it possible to obtain images with a better spatial resolution than previous satellite-based sensors. In particular, Sentinel-2 collects multispectral land surface imagery with two satellites with a revisit cycle of 5 days at 10 m, 20 m, and 60 m spatial resolutions. Their single instrument is the Multispectral Imager (MSI), which collects data in 13 spectral bands using a line-scanner technology with a wide field of view [16].
The processing of remotely sensed data is a prerequisite for generating robust spatial information of scientific quality that is appropriate for fire monitoring at different scales and over time [17]. Indeed, the development of increasingly performant RS indices for the detection and analysis of phenomena such as burnt areas is in continuous evolution. Therefore, the increasing spatial and spectral resolution of satellite sensors, combined with the development of indices in the RS field, means that even small BA can be detected with high accuracy. BA detection is strictly correlated with the monitoring of vegetation. For this purpose, several methodologies have been proposed based on the use of spectral indices, such as the Normalized Difference Vegetation Index (NDVI), which can be defined as the normalized ratio of the difference between the near-infrared and red bands [18]; this index allows us to investigate the relationship between the amount of vegetation consumed and fire severity [19,20]. In addition, specific indices have been developed that record the effects of fire with greater spectral contrast, such as the Normalized Burn Ratio (NBR), which is the normalized ratio of the difference between the near-infrared and shortwave-infrared bands [20,21]. The combination of these latter bands makes it possible to analyze the phenomenon in pre- and post-fire conditions. In the NIR wavelengths, the absorbance of vegetation is low, whereas reflectance and transmittance are high; in the SWIR wavelengths, the reflectance and transmittance of vegetation are low, and the absorbance is very high. In the post-fire zone, recently burned areas show relatively low reflectivity in the near-infrared band and high reflectance in the short-wave infrared band [22,23]. This index was applied with success to satellite data, such as Sentinel-2 [24] and Landsat-7 and Landsat-8 [25] images, in order to identify burned areas.
Recent advancements in remote sensing technology have facilitated new approaches to studying fire ecology, including different aspects related to fire risk mapping, fuel mapping, active fire detection, burned area estimates, burn severity assessment, and post-fire vegetation recovery monitoring [26]. Specific indices for BA detection, related to the availability of bands of specific sensors, have been developed. A large part of the current applications is based on a bi-temporal approach that permits us to identify burned areas as the result of changes due to the vegetation decrease between the pre-fire and post-fire images; however, other land-cover features might present similar responses in some specific bands, such as the NIR and SWIR bands, generating an increase in commission errors [27]. In addition, the peculiarities of the scene can influence the accuracy of BA delineation: for example, water bodies, clouds, and shadows might produce false alarms, so specific precautions and remedies are required [28]. Using both SWIRs (short and long SWIR), these errors are mitigated [29], since water and burned areas have distinctive spectral signatures beyond the NIR region: water tends to absorb longer wavelengths almost completely, whereas burned forest reflectance remains fairly constant or shows a slightly growing trend [30].
The aim of this paper is to improve on the performance obtained by the application of the existing indices for the identification of BA on Sentinel-2 images. Starting from the NBR index, this work proposes a new index, called Normalized Burn Ratio Plus (NBR+). The NBR+ is tested on an area located in the north-eastern part of Sicily (Italy) and affected by fires during the summer of 2019; its performance is evaluated by means of accuracy assessment of the classification results and compared with five other methods that already exist in the literature and are applicable to Sentinel-2 multispectral geo-data, pre- and post-fire event.

Study Area and Dataset
Sentinel-2 satellite images are provided free of charge by ESA through a constellation of two satellites, part of the Copernicus program, called Sentinel-2A and Sentinel-2B. For this work, images provided by the Sentinel-2A satellite are used. The features of the bands, with their central wavelengths and resolutions, are reported in Table 1 [31]. The Copernicus program, through the use of the Sentinel satellites, deals with the continuous monitoring of the Earth: in the framework of geographic data and their storage [32], Sentinel images are the main acquisition channel of the Copernicus geodatabase of the Risk and Recovery Mapping [33].
Two Sentinel-2A datasets are used for this study: the first dataset was acquired on 24 July 2019 and represents the pre-fire scenario; the second was acquired on 23 August 2019 and represents the post-fire scenario. The images cover the north-eastern part of Sicily (Italy), including the province of Messina, as shown in Figure 1.
As can be seen in Figure 2, the land is mostly surrounded by the sea, except in the southwestern part. If, on the one hand, the coastal areas are heavily inhabited and urbanized, the inland areas, on the other hand, are predominantly rural, characterized by the dense vegetation of the Peloritani mountains [34]: as reported in [35], the vegetation in this area is very diversified and particularly prone to bushfires.

Methods
This section presents the new index proposed to identify burnt areas using Sentinel-2 multispectral images and illustrates the methodological approach adopted to establish the level of performance of this index. In particular, some indices already present in the literature and usually applied for this purpose are first recalled, reporting the relative formulas and specifying the bands used, since we applied them in comparison with NBR+.
Then, the reasons that determined the formulation of the proposed index are explained. Subsequently, we recall the fundamentals of the supervised classification applied to the maps obtained by the selected indices. Finally, the procedure for verifying the accuracy of the results, based on the use of test areas and the canonical approach applied in remote sensing [36], is described, including the confusion matrix [37] and related indices [38].
The workflow of the whole methodological approach is shown in Figure 3.
All experiments are carried out using the GIS software QGIS, version 3.16: the Raster Calculator tool allows us to implement the index formulas by means of Map Algebra operators.

Spectral Indices Used for the Identification of Burnt Areas
The Near Infrared (NIR) and Shortwave Infrared (SWIR) spectral regions are relevant for detecting burned areas: NIR highlights changes in canopy cover and brightness of leaf burn [39], whereas SWIR detects changes in landscape dryness [40]. After a fire, with the destruction of vegetation, the NIR reflectance strongly decreases and, on the other hand, the SWIR reflectance increases due to the fire's removal of water-retaining vegetation [41]. Other important spectral regions for BA detection are the Red and Red-Edge, because they are linked to strong absorption by the chlorophyll content in plants [42,43].
For mapping burned areas, several algorithms use a combination of NIR and SWIR spectral regions, although some spectral indices combining only SWIR spectral regions or Red/Red-edge and NIR spectral regions have been developed as well.
Figure 4 compares the burned area signature with that of healthy vegetation, showing that areas devastated by fire have very high reflectance in the SWIR wavelengths and low reflectance in the NIR.

Among the wide range of existing spectral indices described in the literature, five different approaches are applied and compared in this study: Normalized Burn Ratio (NBR), Normalized Burn Ratio-SWIR (NBRSWIR), Normalized Difference Shortwave Infrared Index (NDSWIR), Mid-Infrared Bi-Spectral Index (MIRBI), and Burnt Area Index for Sentinel 2 (BAIS2). The main characteristics of these indices are summarized below, and the description of the new index proposed in this study is reported in the following subsection.

Normalized Burn Ratio (NBR)
A famous index widely used in the literature to highlight burned areas in large fire zones is the Normalized Burn Ratio (NBR) [21], which is considered a standard for fire severity assessments. The Normalized Burn Ratio formula combines the use of both the NIR (B8A) and SWIR (B12) wavelengths, where a high NBR value generally indicates healthy vegetation, and a low value indicates bare ground and recently burned areas.
The NBR formula is similar to that of NDVI. It is calculated as a normalized ratio between the NIR and SWIR values and can be defined as:

NBR = (B8A − B12)/(B8A + B12) (1)

Areas with values close to zero are considered non-burned areas.
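As an illustration, the NBR computation above can be sketched in a few lines of NumPy. This is a sketch outside the QGIS Raster Calculator workflow used in the paper, and the reflectance values are hypothetical:

```python
import numpy as np

def nbr(b8a: np.ndarray, b12: np.ndarray, eps: float = 1e-12) -> np.ndarray:
    """Normalized Burn Ratio: (NIR - SWIR2) / (NIR + SWIR2).

    High values indicate healthy vegetation; low or negative values
    indicate bare ground and recently burned areas.
    """
    return (b8a - b12) / (b8a + b12 + eps)  # eps guards against division by zero

# Hypothetical Sentinel-2 reflectances: a vegetated pixel and a burned pixel.
b8a = np.array([0.40, 0.15])  # NIR (B8A): high for vegetation, drops after fire
b12 = np.array([0.10, 0.35])  # SWIR2 (B12): low for vegetation, rises after fire
print(nbr(b8a, b12))          # vegetation -> ~0.6, burned -> ~-0.4
```

On full scenes, the same function applies unchanged to entire 2-D band arrays thanks to NumPy broadcasting.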

Normalized Burn Ratio-SWIR (NBRSWIR)
A more recent fire index, developed for BA identification with Landsat-8 OLI data, is the Normalized Burn Ratio-SWIR (NBRSWIR), designed by Liu et al. in 2020 [27]. The formula is similar to the NBR index, but the SWIR1 (B11) and SWIR2 (B12) bands are used. Moreover, two small constants are included in the equation: in the numerator, the constant value of 0.02 is subtracted to set the changes in water close to zero or even negative, and in the denominator, 0.1 is added to avoid the positive amplification of some abnormal water changes:

NBRSWIR = (B12 − B11 − 0.02)/(B12 + B11 + 0.1) (2)


Normalized Difference Shortwave Infrared Index (NDSWIR)
As the SWIR was known to be useful for the detection of older fire scars, Gerard et al. used the individual SWIR waveband and the NDSWIR to support forest fire scar detection in the Boreal Forest [44]. In particular, they demonstrated that it is possible to detect fire scars up to ten years old using SPOT-VEGETATION data from a single year and that the use of a vegetation index based on near- and shortwave-infrared reflectance, such as NDSWIR, is crucial to this success:

NDSWIR = (B8A − B11)/(B8A + B11) (3)

Mid-Infrared Bi-Spectral Index (MIRBI)
The Mid-Infrared Bi-Spectral Index (MIRBI), developed by Trigg and Flasse in 2001 [40], is designed in the space discriminating the "short wavelength Mid-Infrared" and "long wavelength Mid-Infrared" (in Sentinel-2, band 11 and band 12, respectively). This index is highly sensitive to spectral changes caused by burning and relatively insensitive to noise:

MIRBI = 10 × B12 − 9.8 × B11 + 2 (4)

Burnt Area Index for Sentinel 2 (BAIS2)
The Burnt Area Index for Sentinel 2 (BAIS2) [5] adapts the traditional BAI to the Sentinel-2 bands, taking advantage of a combination of bands that has been demonstrated to be suitable for post-fire burned area detection (Visible (B4), Red-Edge (B6 and B7), NIR (B8A), and SWIR (B12) bands). The range of values for the BAIS2 is −1 to 1 for burn scars and 1 to 6 for active fires. It can be obtained as shown in the following formula:

BAIS2 = (1 − √((B6 × B7 × B8A)/B4)) × ((B12 − B8A)/√(B12 + B8A) + 1) (5)
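To make the comparison indices concrete, the following sketch evaluates NBRSWIR, NDSWIR, MIRBI, and BAIS2 for a single pixel in plain Python, outside the QGIS workflow. The reflectance values are invented for illustration, the band assignments follow the text, and the NDSWIR line uses the NIR/SWIR1 (B8A/B11) normalized-difference form:

```python
import math

# Hypothetical Sentinel-2 surface reflectances for one vegetated pixel.
b4, b6, b7 = 0.05, 0.20, 0.25    # Red and the two Red-Edge bands
b8a = 0.30                       # NIR
b11, b12 = 0.20, 0.10            # SWIR1, SWIR2

nbrswir = (b12 - b11 - 0.02) / (b12 + b11 + 0.1)    # NBRSWIR, Liu et al. constants
ndswir  = (b8a - b11) / (b8a + b11)                 # NDSWIR (NIR/SWIR1 form)
mirbi   = 10 * b12 - 9.8 * b11 + 2                  # MIRBI, Trigg and Flasse
bais2   = (1 - math.sqrt((b6 * b7 * b8a) / b4)) * \
          ((b12 - b8a) / math.sqrt(b12 + b8a) + 1)  # BAIS2

print(nbrswir, ndswir, mirbi, bais2)
```

For a vegetated pixel like this one, NBRSWIR comes out negative and NDSWIR positive, as expected from the band behavior described above; after a fire, the SWIR2-heavy numerators would push NBRSWIR and MIRBI upward.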

Development of a New Spectral Index for the Identification of BA
In order to correctly use the classical indices for BA detection, it is usually necessary to apply a mask to remove water bodies and clouds [45]. In fact, as reported in numerous articles [46][47][48][49][50], water bodies can be mistaken for BA by indicators such as NBR, since the BA show reflection values similar to those of water surfaces [24]. Generally, a mask is created to remove water bodies by using the Normalized Difference Water Index (NDWI), an index that highlights water in relation to soil [51]. As is well known, water typically has strong reflectance in the blue-green wavelengths [52]; therefore, we propose an enhanced version of NBR that takes into account the reflectance of water, which we call NBR+:

NBR+ = (B12 − B8A − B3 − B2)/(B12 + B8A + B3 + B2) (6)

The involvement of the Green (B3) and Blue (B2) bands pushes the values of water pixels towards the negative extreme. On the other hand, this formula also provides negative values for the clouds, as their reflectance in the B12 band is significantly lower than the sum of the other three bands. Since the NBR+ values can vary in a range between −1 and 1, the pixels related to the clouds in the resulting NBR+ image certainly present low (negative) values and therefore cannot be confused with those related to the BA, which present high values and tend to white.
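A minimal NumPy sketch of the NBR+ computation, with invented reflectance values chosen to mimic a burned pixel, a water pixel, and a cloud pixel, illustrates why water and clouds fall towards the negative end of the range:

```python
import numpy as np

def nbr_plus(b12, b8a, b3, b2, eps=1e-12):
    """NBR+: (B12 - B8A - B3 - B2) / (B12 + B8A + B3 + B2)."""
    return (b12 - b8a - b3 - b2) / (b12 + b8a + b3 + b2 + eps)

# Hypothetical reflectances:    burned  water  cloud
b12 = np.array([0.35, 0.01, 0.30])  # SWIR2: high for burns, near zero for water
b8a = np.array([0.12, 0.02, 0.55])  # NIR
b3  = np.array([0.04, 0.06, 0.50])  # Green: relatively high for water and clouds
b2  = np.array([0.03, 0.08, 0.55])  # Blue
print(nbr_plus(b12, b8a, b3, b2))   # burned -> positive; water, cloud -> negative
```

Because B12 dominates only for burned surfaces, the numerator is positive there and negative wherever the NIR plus blue-green reflectance outweighs the SWIR2 signal.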

Maximum Likelihood Classification
For this work, we chose to use the Maximum Likelihood Classification (MLC) in order to obtain two distinct classes, i.e., burned areas and unburned areas. This classifier falls into the category of supervised classification methods, which involve the use of training sites (TS) to have a priori knowledge of the samples of the investigated image [53]. The image is classified by starting from the reflectance values of each pixel and assigning them to the class for which there is the greatest similarity [54]. It is of fundamental importance that the TS are selected as accurately as possible [55] and that they comply with certain guidelines [56], such as being representative of the classes investigated, including a significant number of pixels for each class, and containing sufficient information to describe the classes based on the behavior of their spectral signatures [57].
The MLC is based on Bayes' Theorem [58], and it is considered one of the most accurate methods in the literature [59]. Generally, the MLC is applied to multiple bands, but it can also be applied to a single band [60], for example, a synthetic band such as NBR. In particular, in this study, the TS relating to the two investigated classes are identified by means of visual analysis of RGB true color and RGB false color compositions of the Sentinel-2A dataset.
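The single-band case can be sketched as follows. This is an illustrative implementation, not the one in QGIS: each class is modeled as a normal distribution estimated from training-site pixels, equal prior probabilities are assumed, and the training values are hypothetical:

```python
import numpy as np

def mlc_single_band(pixels, training):
    """Maximum likelihood classification of a 1-D band.

    training: dict mapping class name -> array of training-site values.
    Each class is modeled as a Gaussian with mean/std estimated from its
    training sites; pixels go to the class with the highest log-likelihood
    (equal priors assumed).
    """
    stats = {c: (v.mean(), v.std(ddof=1)) for c, v in training.items()}
    names = list(stats)
    loglik = np.stack([
        -np.log(stats[c][1]) - (pixels - stats[c][0]) ** 2 / (2 * stats[c][1] ** 2)
        for c in names
    ])
    return np.array(names)[np.argmax(loglik, axis=0)]

# Hypothetical training-site values on a synthetic index band.
training = {
    "BA":     np.array([0.55, 0.60, 0.62, 0.58]),
    "Non-BA": np.array([-0.20, -0.10, -0.15, -0.05]),
}
labels = mlc_single_band(np.array([0.57, -0.12, 0.45]), training)
print(labels)  # each pixel is assigned to the closer class model
```

The decision boundary between the two Gaussians plays the role of the threshold separating burned from non-burned pixels mentioned in the abstract.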

Burned Area Mapping
Burned areas are identified by applying the MLC to a synthetic band. In particular, in this work, both a single-date and a bi-temporal approach are used for each index: a first classification is carried out on the post-burn images (single-date approach); the second classification is carried out by comparing pre- and post-burn images (bi-temporal approach), as proposed in several papers [61][62][63].
Since the single-date approach consists of classifying only the post-burn images, it is faster to apply and does not present the errors related to the bi-temporal approach, such as those caused by differences in phenology [64], misregistration of image pixels [65], differences in sensor calibration [66], sun-sensor geometry, and atmospheric effects [67]. Furthermore, it is not always easy to find images useful for BA detection at two different moments in time that are not partially covered by clouds [68]: clouds and their shadows can easily be mistaken for BA when the classification is carried out [69]. The single-date approach avoids all these problems, but the lack of a pre-burn reference image can generate difficulties in mapping areas whose spectral signature has remained constant, such as water and aging vegetation, which can be mistaken for burned areas [70].
The bi-temporal analysis is carried out by calculating the indices listed above on both the pre-burn and post-burn images and then computing the difference, generating the delta index:

∆Index = PostBurn Index − PreBurn Index (7)
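The differencing step of Equation (7) is a plain pixel-by-pixel subtraction of the two index maps; a small sketch with invented values:

```python
import numpy as np

# Bi-temporal differencing: delta index = post-burn index - pre-burn index.
# Hypothetical index maps for a 2x2 patch in which one pixel burned.
pre_index  = np.array([[0.55, 0.50],
                       [0.60, 0.58]])
post_index = np.array([[0.52, 0.48],
                       [0.57, -0.30]])  # strong drop where the fire occurred
delta = post_index - pre_index
print(delta)  # the burned pixel stands out as a large change
```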

Thematic Accuracy Assessment
In order to evaluate the thematic accuracy of the results obtained, test sites are used. These are different from the training sites, but like the latter, they are representative of the two classes (BA/Non-BA), provided that a sufficient and significant sample is taken [71]. We consider two groups of test sites: group A, including water bodies and clouds, and group B, excluding them. In this way, we test each method both for its ability not to attribute water and cloud pixels to the burnt-area class (commission errors) and for its ability to distinguish the two clusters (BA/Non-BA).
The test sites are obtained from the visual analysis of the multispectral images; they allow us to know how many pixels are classified correctly and how many are classified wrongly. In particular, we apply the widely used approach based on the confusion matrix, a simple and powerful tool that denotes the accuracy of the remotely sensed image classification. A confusion matrix is a table that shows the correspondence between the classification result and the ground truth data, as derived by visual inspection of the same remotely sensed images or other information layers (such as maps) or acquired in situ and recorded with a GNSS (Global Navigation Satellite System) receiver [72]. Instead of considering the whole images, the confusion matrices are usually constructed from the values taken from the test sites, which allow us to evaluate the thematic accuracy of the adopted classification method [73]. In particular, in this study, two confusion matrices are created for each method used: one representative of the post-fire image only and one representative of the "delta" image concerning the bi-temporal approach. The numerical analysis of the thematic accuracy of each classified image is summarized by the values of the indices named Producer Accuracy (PA), User Accuracy (UA), and Overall Accuracy (OA) [74].
PA indicates the probability that a certain land cover of an area on the ground is classified as such. It is computed by dividing the number of pixels classified accurately in each category by the total number of reference pixels for that category.
UA indicates the probability that an area classified into a given category actually represents that category on the ground. It is computed by dividing the number of correctly classified pixels in each category by the total number of pixels that were classified in that category.
OA represents the probability that all categories are correctly classified. It is computed by dividing the total number of correctly classified pixels by the total number of reference pixels.
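The three definitions above map directly onto row, column, and total sums of the confusion matrix. A sketch with invented counts (not taken from the paper's tables), where rows hold the classification result and columns the reference data:

```python
import numpy as np

# Invented 2x2 confusion matrix: rows = classified (BA, Non-BA),
# columns = reference (BA, Non-BA).
cm = np.array([[480, 20],    # 480 BA pixels correct, 20 commission errors
               [ 30, 470]])  # 30 BA pixels omitted, 470 Non-BA correct

producer_accuracy = np.diag(cm) / cm.sum(axis=0)  # correct / reference totals
user_accuracy     = np.diag(cm) / cm.sum(axis=1)  # correct / classified totals
overall_accuracy  = np.trace(cm) / cm.sum()       # correct / all test pixels

print(producer_accuracy, user_accuracy, overall_accuracy)
```

With these counts, PA and UA are reported per class, whereas OA is a single figure for the whole classification.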

Results and Discussion
A first assessment of each resulting BA map can be made by means of visual inspection: image comparison between NBR+ and NBR is shown in Figure 5 and ∆NBR+ and ∆NBR in Figure 6.
Comparing the images in Figure 5, it is evident that NBR presents noise around the coasts, or more generally, in the shallow waters, which NBR+ does not present at all; areas affected by this noise are incorrectly classified as BA, since they have very high radiance values.
A second disturbing element in this case is the clouds and the shadows they generate: these not only hide the burned areas from the sensor, but in the case of NBR, they can be mistaken for BA. The wrong classification of clouds and shallow waters has been resolved in NBR+; in fact, these areas have very low radiance values compared to the high BA radiance values.
Comparing the images in Figure 6, in the case of NBR, the noise relating to shallow waters is slightly reduced (although it continues to generate disturbance); however, the noise relating to clouds has increased. In fact, when the bi-temporal analysis is carried out, if the two images (pre- and post-fire) both contain clouds, these will be represented on the ∆ image in different ways: the clouds in the pre-fire image are represented with low radiance values, whereas the post-fire clouds are represented with high radiance values.
Therefore, even in this case, the clouds and shallow waters (to a lesser extent) can be mistakenly classified as BAs if NBR is applied.In the image related to NBR+, the BAs immediately catch the eye due to the high radiance compared to the rest of the image, even if there is a slight noise on the edges of the clouds, which, however, is rarely identified as BA by the MLC.
The different results supplied by each index in terms of BA/NON BA detections are compared and shown in Figure 7 for single-date analysis and Figure 8 for bi-temporal analysis.
Furthermore, Figure 9 shows details highlighting the impact of clouds on the classification, and Figure 10 shows details highlighting the impact of water bodies on the classification of the images.
The quantitative analysis of the performance of each approach is based on test sites and the confusion matrix approach. The results obtained from the confusion matrices are shown in Tables 2 and 3 for the post-fire analysis and in Tables 4 and 5 for the bi-temporal analysis. In particular, Tables 2 and 4 concern test sites including water bodies and clouds (Group A), and Tables 3 and 5 concern test sites excluding water bodies and clouds (Group B). The results obtained are very encouraging for the proposed index. In fact, the closer the accuracy values are to 1, the more satisfactory the results, as this means that the proposed method is effective and able to correctly classify the areas considered, distinguishing between burned and non-burned.
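The accuracy measures used in the tables follow the standard confusion-matrix definitions: producer's accuracy (PA) divides correct pixels by the reference totals, user's accuracy (UA) divides them by the classified totals, and overall accuracy (OA) is the fraction of all pixels classified correctly. A minimal sketch with a hypothetical 2×2 matrix (the counts are illustrative, not the paper's test-site results):

```python
import numpy as np

def accuracy_measures(cm):
    """cm[i, j] = number of pixels of reference class j labelled as class i,
    with class order (BA, NON BA). Returns per-class PA, UA and the OA."""
    cm = np.asarray(cm, dtype=float)
    pa = np.diag(cm) / cm.sum(axis=0)  # producer's accuracy: correct / reference total
    ua = np.diag(cm) / cm.sum(axis=1)  # user's accuracy: correct / classified total
    oa = np.trace(cm) / cm.sum()       # overall accuracy
    return pa, ua, oa

# Hypothetical pixel counts:                reference BA, reference NON BA
cm = [[940, 60],   # classified BA
      [80, 920]]   # classified NON BA
pa, ua, oa = accuracy_measures(cm)
print(f"PA-BA={pa[0]:.3f}  UA-BA={ua[0]:.3f}  OA={oa:.3f}")
```

With these toy counts, OA = 1860/2000 = 0.930; the same arithmetic underlies the values reported in Tables 2-5.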
NBR+ is always the best-performing index, reaching the highest OA value in every case. The results obtained for the indices already existing in the literature are in line with those presented by other authors [75][76][77].
The statistical values obtained on the post-fire images tend to be lower than those obtained in the bi-temporal analysis, confirming the higher efficacy of the latter method, as already stated in the literature [78].
Analyzing the values shown in Table 2, MIRBI is always the least-performing index; NBR, NBRSWIR, and NDSWIR present results that are very close to each other; BAIS2 is the only one among the "traditional" indices to present values exceeding 0.9 (PA-NON BA and UA-BA) and an OA value much higher than the others (0.798). NBR+ has very high values compared to the other indices: it is the only one to exceed 0.9 in terms of OA and the only one with all values higher than 0.8.
Table 3 confirms the high performance of the proposed index in the case of single-date thematic accuracy assessment on the test sites excluding water bodies and clouds. In fact, NBR+ supplies the highest OA value (0.925) and very good results in terms of PA and UA.
The values reported in Table 4 are generally high for all the indices; the OA is always higher than 0.85, confirming the validity of each method tested in this work. NDSWIR and MIRBI present OA values very close to each other (as well as being the lowest), with PA and UA values that are similar in magnitude but reversed. In fact, NDSWIR has very high values of PA-BA and UA-NON BA (0.968 and 0.960) and lower values of PA-NON BA and UA-BA (0.766 and 0.805); on the contrary, MIRBI has very high values of PA-NON BA and UA-BA (0.992 and 0.990, respectively) and lower values of PA-BA and UA-NON BA (0.749 and 0.798). BAIS2 has a slightly higher OA value (0.881) than NDSWIR and MIRBI. Very good results are obtained from the application of NBR and NBRSWIR, with OA values higher than 0.9 for both. The best-performing index is NBR+: it always presents excellent results, reaching an accuracy of 100% in PA-BA and UA-NON BA, and the highest OA value overall (0.974).
Table 5 confirms that NBR+ is also the best-performing index in the case of bi-temporal thematic accuracy assessment on the test sites excluding water bodies and clouds.

Conclusions
The results of this research confirm the capability of remote sensing science to map burned areas precisely: multispectral images provided by satellite sensors such as Sentinel-2 are very useful for this purpose but require an appropriate selection and combination of the bands. Several indices are available in the literature and achieve good results in many cases. However, the specificity of the scene can reduce the accuracy of the final map: the presence of water bodies or clouds induces commission errors for some indices and produces false alarms, with an unjustified increase in the number of pixels classified as burned. Even if the disturbance of water bodies and clouds can be removed from the observed scene through specific remedies, such as the use of masks, the ability of an index to exclude such commission errors by itself is certainly welcome.
The new index proposed in this work for Sentinel-2 images delineates fire scars as accurately as other indices but, unlike them, guarantees high performance even in the presence of water bodies and clouds. Compared with NBR, NBRSWIR, NDSWIR, MIRBI, and BAIS2, NBR+ is the best-performing index, reaching very high values of PA, UA, and OA. The excellent performance of NBR+ is evident both in the uni-temporal (post-fire) approach and in the bi-temporal (pre-fire and post-fire in comparison) approach. The other indices analyzed show more limited accuracy when applied to a single (post-fire) image; NBR+, on the contrary, provides excellent performance even in this situation.
The genesis of the index is simple and effective, aimed at enhancing the performance provided by NBR. Using only bands B12 and B8A, water bodies tend to show high pixel brightness values, so bands B2 and B3 were also introduced: by subtracting the amount of energy reflected in these bands, water pixels tend to go dark, so that only the burned areas remain light and are easily identifiable. In addition, NBR+ supplies negative values for clouds, which therefore cannot be confused with BA pixels.
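The band combination described above can be sketched in a few lines. This is a minimal sketch under stated assumptions: the normalized-difference form of the combination is inferred from the index name and from the behavior described in the text (water dark, clouds negative, burned areas bright), and the toy reflectance values are illustrative, not real Sentinel-2 data.

```python
import numpy as np

def nbr_plus(b02, b03, b8a, b12):
    """Sketch of the NBR+ band combination described in the text: the
    SWIR/NIR contrast of NBR (B12, B8A) with the Blue (B02) and Green
    (B03) reflectances subtracted, so that water pixels go dark and
    cloud pixels become negative. The normalized-difference form is an
    assumption inferred from the index name, not taken from the paper."""
    b02, b03, b8a, b12 = (np.asarray(b, dtype=float) for b in (b02, b03, b8a, b12))
    num = b12 - b8a - b03 - b02
    den = b12 + b8a + b03 + b02
    # Guard against division by zero on no-data pixels.
    return np.divide(num, den, out=np.zeros_like(num), where=den != 0)

# Toy reflectance values (b02, b03, b8a, b12):
burned = nbr_plus(0.03, 0.05, 0.12, 0.35)  # high SWIR, low NIR -> positive
water = nbr_plus(0.08, 0.10, 0.02, 0.01)   # bright Blue/Green -> negative
```

The subtraction of B2 and B3 in the numerator is what pushes water and cloud pixels toward low or negative values while leaving the SWIR-dominated burned pixels bright.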
Concerning the future developments of this work, further studies will focus on the possibility of applying NBR+ efficiently to other types of satellite images. Of course, bands meaningful for detecting burned areas must be present; in other words, images acquired in specific bands such as Blue, Green, NIR, and SWIR are needed.

Figure 3. The workflow of the whole methodological approach for testing the performance of the proposed index in comparison with five other indices.


Figure 4. Comparison of the spectral response of healthy vegetation and burned areas (Source: USDA Forest Service).

The following accuracy measures are considered: Producer Accuracy of the Burned Areas (PA-BA); Producer Accuracy of the Non-Burned Areas (PA-NON BA); User Accuracy of the Burned Areas (UA-BA); User Accuracy of the Non-Burned Areas (UA-NON BA); Overall Accuracy (OA).

Table 2. Results of single-date thematic accuracy assessment on test sites also including water bodies and clouds (Group A).

Table 3. Results of single-date thematic accuracy assessment on test sites excluding water bodies and clouds (Group B).

Table 4. Results of bi-temporal thematic accuracy assessment on test sites including water bodies and clouds (Group A).

Table 5. Results of bi-temporal thematic accuracy assessment on test sites excluding water bodies and clouds (Group B).