The Effects of Crown Scorch on Post-fire Delayed Mortality Are Modified by Drought Exposure in California (USA)

Abstract: Accurately predicting the mortality of trees that initially survive a fire event is important for management, such as planning post-fire salvage, planting, and prescribed fires. Although crown scorch has been successfully used to predict post-fire mortality, its reliability is less certain for trees subsequently exposed to drought. We found that delayed post-fire mortality was modified by drought exposure and that crown scorch is an effective post-fire mortality predictor up to 10 years post-fire.


Introduction
The 2012-2016 California drought and associated tree mortality [1] brought renewed urgency to concerns that post-fire tree mortality can be exacerbated by drought [2,3]. Forest managers need tools to account for drought effects on tree mortality in everything from planning fuel hazard reduction treatments to post-fire recovery [4]. Post-fire mortality models typically focus on first-order effects such as crown scorch, not second-order factors such as drought [5]. The commonly used Ryan-Amman model is designed to accurately predict immediate post-fire mortality using two first-order metrics: crown scorch and fire resistance (bark thickness). Though the Ryan-Amman model can predict mortality up to three years post-fire [6], the predictive ability of the model beyond the first year post-fire (so-called delayed mortality) across wildfires is unclear.
Accounting for drought is important when predicting delayed mortality because trees surviving a wildfire may be more sensitive to fire-caused injury following episodes of drought stress and associated hydraulic failure [3,7]. The combination of first-order damage and predisposing or subsequent stressors, such as drought, may result in substantial delayed mortality. Models that do not incorporate second-order effects such as drought or insect damage might underpredict mortality occurring within the first decade post-fire. Researchers have adjusted the Ryan-Amman model to account for major biotic second-order mortality drivers. For example, a predictive term was added for bark beetle attacks to increase prediction accuracy in areas of active attacks [8]. Some see a need to augment crown scorch with additional first-order factors, such as indicators of root damage, to account for mortality in trees with little to no crown injury [5,9,10].
From an operational point of view, predictors added to the Ryan-Amman model should be easy to measure and obtain [11]. Crown scorch, for example, is easy to visually assess and is likely the most commonly used post-fire mortality predictor for logistic regression models [12]. For drought, the Palmer Drought Severity Index (PDSI) is commonly used in studies examining the link between drought severity and wildfire events [13]. PDSI uses temperature and precipitation data to estimate drought severity relative to historically "normal" precipitation. Using PDSI, researchers have found that increasing drought severity is linked to elevated tree mortality relative to background mortality [14][15][16]. Topographical factors, such as terrain slope, have also been linked to drought-driven mortality [14].
To examine post-fire mortality, we took advantage of the USDA Forest Service (USFS) Forest Inventory and Analysis (FIA) program. The Fire Effects and Recovery Study (FERS) began as a research study following Oregon's 2002 Biscuit fire. It provides an opportunity to link observed fire effects to both pre-fire forest inventory measurements and data collected at future FIA remeasurement visits. FERS supplements FIA's robust standard forest inventory plot remeasurement protocols that, for example, account for tree status (live/dead/harvested/downed) and sample down wood, with assessments of readily observable fire impacts on trees and ground surface substrates. Attributes collected for each previously live tree include crown proportions burned, scorched and unburned, high and low bole char height and direction, and bole base char circumference. Cover by degree of char is estimated for each of ten ground surface substrates. FERS was implemented initially in California under sponsorship from USDA Forest Service Region 5, and later expanded to Oregon and Washington, on all FIA plots within perimeters of 184 fires larger than 400 ha, primarily in years with more robust fire activity and success in obtaining funding to sample off-panel. From 2015 to 2019, FERS was implemented only on plots sampled on-panel within fire perimeters < 2 years old.
In this study, we used FERS plots on which field crews have measured fire-impacted FIA plots for fire effects within two years (and typically within one year) post-fire [17]. These plots were subsequently measured again within 10 years post-fire, allowing for an assessment of delayed mortality. Our primary objective was to use individual trees from these remeasured plots to test whether adding PDSI and soil char, along with terrain slope and aspect, as explanatory variables could predict post-fire mortality more than one year post-fire better than a model based on only crown scorch and bark thickness. We hypothesized that drought exposure would be a significant driver of post-fire mortality of individual trees, in addition to first-order fire damage (crown scorch, soil char and stem char) and fire resistance (bark thickness). We expected that adding a second-order drought exposure predictor (PDSI) would have at least as much influence on prediction success as some first-order predictors, and that trees occurring on more drought-exposed sites (e.g., SW aspect) would experience greater delayed mortality.

Materials and Methods
To examine post-fire mortality, we selected trees from FIA plots [18] that were included in the FERS. We considered only trees occurring on plots that had been assessed three times: before (pre-fire), one year following (1st post-fire), and several years following (2nd post-fire) a fire large enough (over 404 hectares) to have been mapped by Monitoring Trends in Burn Severity (MTBS) [19]. The first post-fire visit provided data on several first-order fire effects, including crown scorch and soil char, along with standard FIA attributes [17]. These criteria were met by 97 FERS plots that had been affected by one of 27 fires in California occurring between 2002 and 2009. We built an analysis dataset consisting of 1569 trees that belonged to one of the six tree species with more than 100 trees across these 97 plots (Table 1). Post-fire data were collected within one year following fire on all of these plots, and a 2nd post-fire assessment was conducted 3 to 8 years after the 1st, with a mean interval of 5 years and a median of 4. We sought to predict mortality at the 2nd post-fire assessment for trees that remained alive at the time of the 1st post-fire assessment. We reference this as "delayed mortality" because it accounts only for trees that did not die within one year of fire.

Table 1. Species common and scientific names, numbers of trees (n), and species-specific bark coefficients (Vsp) from fire-affected FIA plots used to test the post-fire delayed logistic mortality model.

The first post-fire visit provided data for three first-order predictors: crown scorch, soil char and stem char. These metrics rely on visual inspection by field crews. For soil effects, we focused on the degree of char observed on exposed mineral soil because a high incidence of a high degree of char on exposed soil indicates potential for damage to roots resulting from high fire temperatures [20,21] or long-duration heating [22].
Percent cover of bare soil by char class was ocularly estimated on up to four microplots per plot, each with a 2.07 m horizontal-radius (1/750 hectare) sample area [18], without reference to pre-fire conditions, as described by the FERS protocol [23]. Crews assessed percent cover of four mineral soil char classes within the sample area: unburned, light (black), moderate (gray) and deep (orange). We summed the cover estimates for moderate and deep char mineral soil as our soil char indicator of potential damage to roots. Crown scorch was assessed on each sample tree larger than 2.54 cm dbh as a percent of pre-fire compacted crown length for each of three classes: unburned, scorched and consumed. In this study, we defined crown scorch as the sum of the estimated percentages assessed for scorched and consumed crown [10]. For stem char, field crews assessed the percentage of the stem base circumference exhibiting char on the bole at the point where the bole intersects the ground. Details on the assessment of these metrics can be found in the FERS manual Sections 1.2, 1.6.3 and 1.6.5 [23]. Bark thickness was calculated as the product of tree diameter observed at the pre-fire visit and a species-specific multiplier (Vsp) derived from FOFEM 6 [24] (Table 1). Aspect and slope were observed at the pre-fire visit; we transformed aspect into a categorical variable as one of four cardinal directions.
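The bark thickness calculation described above (pre-fire dbh times a species-specific coefficient) can be sketched in a few lines. This is an illustration in Python rather than the study's own R workflow, and the Vsp value below is an assumed placeholder, not a coefficient from Table 1:

```python
# Bark thickness as described in the text: pre-fire diameter at breast
# height (dbh, cm) multiplied by a species-specific coefficient (Vsp)
# derived from FOFEM 6. The Vsp used here is hypothetical.
def bark_thickness_cm(dbh_cm: float, vsp: float) -> float:
    """Estimate bark thickness (cm) from dbh and a species coefficient."""
    return dbh_cm * vsp

# Hypothetical example: a 40 cm dbh tree with an assumed Vsp of 0.063.
print(round(bark_thickness_cm(40.0, 0.063), 2))  # 2.52
```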
We represented drought exposure via the Palmer Drought Severity Index (PDSI), a widely used index that relies on temperature and precipitation data to estimate relative dryness on a standardized scale ranging from very dry (−10) to very wet (10) [25]. We downloaded monthly PDSI data from NOAA's ClimDiv website (doi:10.7289/V5M32STR). We assigned plots to a NOAA climate division and then, for each plot, identified the ten-year period ending in the year of the 2nd post-fire assessment. Minimum, mean, median and maximum monthly PDSI were calculated for each plot over these ten-year periods (120 observations per plot). We chose a ten-year period ending at the 2nd post-fire measurement because it could be expected to cover drought effects during the delayed mortality timeframe, and the decadal timeframe is commonly used in studies that examine PDSI trends, for example, [26]. We based our use of regional PDSI to represent drought for assessing tree mortality on [14]. We also checked for high (>0.7) Pearson's correlation among PDSI summary metrics. We removed mean PDSI given its high correlation with median PDSI (0.82). The highest correlation observed among the other PDSI metrics was 0.53, between minimum PDSI and median PDSI. None of the correlations between predictor variables in the model selection process exceeded 0.7.
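A minimal sketch of the per-plot PDSI summaries and the collinearity screen described above, written in Python for illustration (the study used R); the 12-value series is invented, whereas the real analysis used 120 monthly observations per plot from NOAA ClimDiv:

```python
import statistics

def pdsi_summaries(monthly_pdsi):
    """Minimum, median and maximum of a plot's monthly PDSI series."""
    return {
        "min": min(monthly_pdsi),
        "median": statistics.median(monthly_pdsi),
        "max": max(monthly_pdsi),
    }

def pearson_r(x, y):
    """Pearson correlation, used to screen collinear predictors (|r| > 0.7)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Invented 12-month PDSI series for illustration only.
series = [-3.8, -2.1, -1.0, -0.5, 0.2, 0.8, 1.5, 0.3, -0.7, -1.9, -2.6, -0.4]
print(pdsi_summaries(series))
```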
We evaluated 10 first- and second-order predictors of tree mortality with logistic regression (non-categorical variables summarized in Table 2). Four can be thought of as first-order predictors: crown scorch, soil char, stem char, and bark thickness; five as drought exposure predictors: slope and aspect plus minimum, median and maximum monthly PDSI over ten years. To account for differences among species, we added a term for tree species, with canyon live oak as the reference, as is required for categorical variables in logistic regression. Our final post-fire model equation for the logit of delayed mortality was (Equation (1)):

Mortality at 2nd post-fire assessment ~ crown scorch + bark thickness + stem char + soil char + median PDSI + minimum PDSI + maximum PDSI + slope + aspect + species (1)

We also tested whether two parsimonious model alternatives better predicted delayed mortality. First, we tested a model (Ryan-Amman) with only crown scorch and bark thickness as predictors, resulting in an Akaike information criterion (AIC) of 1135. Second, we tested a model with the additional first-order variables (soil char and stem char), resulting in an AIC of 1129. The full model (Equation (1)) had an AIC of 1036, so we selected this model for mortality prediction. We also explored whether interactions between crown scorch and the other variables might have affected the results, but including interactions produced a slightly worse AIC of 1070. There was a significant interaction between crown scorch and median PDSI (p = 0.0004), which we present to illustrate the relationship between drought and post-fire mortality. We assessed the influence of predictor variables based on a combination of z scores and p-values: high z scores and low p-values indicate high influence on model predictions. We included some predictors whose p-values indicate they are not significant because they are part of the best model as determined by AIC.
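The model comparison above rests on AIC = 2k − 2 ln L, where k is the number of estimated parameters and ln L the maximized log-likelihood; lower AIC indicates a better tradeoff between fit and complexity. A small sketch follows; the log-likelihood and parameter count are back-calculated for illustration and are not values reported by the study:

```python
def aic(log_likelihood: float, k: int) -> float:
    """Akaike information criterion: AIC = 2k - 2*ln(L); lower is better."""
    return 2 * k - 2 * log_likelihood

# Illustrative only: a 3-parameter model (intercept, crown scorch, bark
# thickness) with an assumed log-likelihood of -564.5 yields AIC = 1135,
# matching the reported Ryan-Amman value by construction.
print(aic(-564.5, 3))  # 1135.0
```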
We used the R package ROCit 2.1 [27] to calculate accuracy metrics. Logistic regression models estimate a probability of mortality that ranges from 0 to 1; the models return the logit, which can then be back-transformed into a probability. To classify trees as survivors or mortalities, we used decision thresholds of 0.5, 0.25 and 0.75. A threshold of 0.5 is typically used with the Ryan-Amman model, but higher thresholds, 0.75 for example, are used where managers want to minimize the number of trees incorrectly classified as mortality [28]. Table 3 describes the summary statistics used to assess model performance. For performance across all probability thresholds, we calculated the area under the curve (AUC) of the receiver operating characteristic (ROC) curve, which plots sensitivity vs. false positive rate (see Table 3). An AUC above 75% is a minimum for acceptable classification performance [29]. Further, a good mortality model will keep correct predictions (true positives) of tree mortality at a high level while also keeping trees misclassified as mortality (false positives) at a low level [29]. To evaluate the model beyond overall performance (AUC), we reported accuracy, sensitivity, specificity, and false positive rates at our three decision threshold points (see Table 3 for definitions). Model results were visualized using the following R packages: ggplot2 3.3 [30], effects 4.2 [31], and interactions 1.1 [32].

Table 3. Classification table and model performance statistics used to evaluate the delayed mortality model. Adapted from [33].
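The evaluation steps just described can be sketched in self-contained Python (the study itself used ROCit in R): back-transform the logit to a probability, then compute accuracy, sensitivity, specificity and false positive rate at a decision threshold. The probabilities and outcomes below are invented toy data:

```python
import math

def logit_to_prob(logit: float) -> float:
    """Back-transform a model logit into a mortality probability."""
    return 1.0 / (1.0 + math.exp(-logit))

def threshold_metrics(probs, labels, threshold):
    """Classification statistics at a decision threshold (label 1 = died)."""
    tp = sum(p >= threshold and y == 1 for p, y in zip(probs, labels))
    fp = sum(p >= threshold and y == 0 for p, y in zip(probs, labels))
    tn = sum(p < threshold and y == 0 for p, y in zip(probs, labels))
    fn = sum(p < threshold and y == 1 for p, y in zip(probs, labels))
    return {
        "accuracy": (tp + tn) / len(labels),
        "sensitivity": tp / (tp + fn) if tp + fn else float("nan"),
        "specificity": tn / (tn + fp) if tn + fp else float("nan"),
        "false_positive_rate": fp / (fp + tn) if fp + tn else float("nan"),
    }

# Invented example logits and observed outcomes (1 = mortality tree).
probs = [logit_to_prob(z) for z in (2.0, 0.4, -0.3, -1.5, -2.2)]
labels = [1, 1, 0, 0, 0]
print(threshold_metrics(probs, labels, 0.5))
```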

Results

Crown Scorch and Drought Exposure
Increasing first-order fire damage was associated with increased probability of post-fire mortality, with crown scorch the predictor with the most explanatory power (i.e., highest z score) (Table 4). Predicted mortality increased with increasing crown scorch, with the 0.5 mortality threshold occurring at 80% crown scorch (Figure 1). Crown scorch was generally low across all trees: mean crown scorch was 18% with a median of 5% (Table 2). For mortality trees, mean and median crown scorch were 43% and 40%, respectively (Figure 2); survivor trees had a lower mean crown scorch of 14% and a median of 2% (Figure 2).

Table 4. Model coefficients (standard error), z scores, and p-values for variables used in the delayed mortality logistic regression model: bark thickness, percent crown scorch, tree species (canyon live oak as reference), soil char, stem char, minimum/maximum/median monthly ten-year PDSI, slope, aspect (NE as reference), and interaction terms between crown scorch and median monthly ten-year PDSI and soil char. After the intercept, variables are ordered by p-value (significant predictors below a p-value of 0.05 in italics). NS = non-significant at a p-value of 0.05.

Ten-year median PDSI modified the effects of crown scorch, as evidenced by a significant interaction (p = 0.0014). Mortality risk decreased with increasing drought exposure (Table 4). Median PDSI values were all slightly below normal levels (< 0), while minimum PDSI indicated exposure to extreme drought (< −4.0) (Table 2). As median PDSI increased (i.e., less drought), the risk of mortality increased (Figure 3). Increasingly negative minimum PDSI, that is, greater drought exposure, was also associated with a significantly decreasing mortality probability (Table 4). Maximum PDSI was not a significant predictor (Table 4). Looking at the interaction between crown scorch and median monthly PDSI, at lower crown scorch levels the mortality risk trend matched the general trend noted above (Figure 4). However, at higher levels of crown scorch, mortality risk increased with increasing drought exposure beyond the mean of the median monthly PDSI (Figure 4).

Influence of Other First-Order, Site and Species Predictors

Soil char and stem char were significant predictors of tree mortality (Table 4). Mortality probability increased with increasing soil char (Table 4). Mean soil char was low and stem char was high for most trees (Table 2); both had similar levels between mortality and survivor trees (Figure 3). Mortality probability increased with bark thickness (BT), but this variable was not a significant predictor (Table 4). Mean bark thickness was the same in mortality trees and surviving trees (1.1 cm). Mean tree diameter, a variable that drives the calculation of bark thickness, was also similar in surviving and mortality trees, at 20 and 18 cm, respectively. Incense-cedar had the clearest separation between mortality and survivor trees (Figure A1).

Both site factors (slope and aspect) had a modestly significant effect on post-fire mortality, with variable importance similar to the PDSI variables (Table 4). Mortality probability increased with decreasing slope (Table 4). Relative to the NE aspect, NW and SE aspects had lower probability of mortality (Table 4). These predictors had relatively minor effects (Figure 3) compared to crown scorch (Figure 1).

Jeffrey pine, with the highest mortality, had significantly greater odds of mortality compared to canyon live oak (reference species) (Table 4). Mean post-fire mortality across all species was 17%, ranging from lows of 10% in Douglas-fir and 11% in canyon live oak to a high of 30% in Jeffrey pine. Closer to the overall mean mortality were white fir (19%), ponderosa pine (16%), and incense cedar (14%).

Figure caption (fragment) [32]: Median PDSI was −0.73 (0.2 sd) (see Table 2). Positive sd (+1 sd) indicates less drought exposure and negative sd (−1 sd) indicates increased drought exposure.

Logistic Model Performance
Estimated AUC for the post-fire mortality logistic model was 79% (95% CI: 77% to 83%). Overall accuracy was 87% at the 0.5 mortality threshold and 86% at the 0.75 threshold (Table 5). Reflecting the imbalanced dataset (far more survivor trees than mortality trees), the logistic model had better success in predicting surviving trees (high specificity) than mortality trees (low sensitivity), with sensitivity increasing as the decision threshold decreased (Table 5). At a false positive rate of 10%, the correct positive rate (sensitivity) was 0.52, which occurred at the 0.25 decision cut-off (Table 5). The model did not reach a sensitivity of 0.75 until the false positive rate reached 40%.

Table 5. Model accuracy, sensitivity, specificity, and false positive rate for predicting delayed mortality at three thresholds for the delayed mortality logistic regression by bark thickness (Vsp), % crown scorch, median ten-year PDSI, % moderate/severe ground char, and tree species (canyon live oak as reference). Bold indicates the best score for each column.
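AUC, used above to summarize performance across all thresholds, can be computed without plotting the ROC curve via its rank (Mann-Whitney) formulation: the probability that a randomly chosen mortality tree receives a higher predicted probability than a randomly chosen survivor, with ties counting one half. A Python sketch with invented scores:

```python
def auc_rank(probs, labels):
    """AUC as P(score of a random mortality tree > score of a random
    survivor); ties count one half. Equivalent to area under the ROC."""
    pos = [p for p, y in zip(probs, labels) if y == 1]
    neg = [p for p, y in zip(probs, labels) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Invented scores: one mortality tree (0.3) ranks below two survivors,
# so the AUC falls short of a perfect 1.0.
print(auc_rank([0.9, 0.3, 0.6, 0.4, 0.2], [1, 1, 0, 0, 0]))
```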

Discussion

Drought Exposure Modifies Crown Scorch Mortality Predictions
We confirmed that crown scorch increased the risk of delayed post-fire mortality, a finding consistent with what has been observed for first-year mortality [5]. Fire-driven mortality does not necessarily occur during or immediately after a wildfire; trees may not succumb to fire injuries until as long as three years following fire [5,6,34]. Research has suggested that trees with over 70% crown scorch are at the highest risk of dying within three to five years after a wildfire [8]. High levels of crown scorch diminish survival by reducing photosynthetic capacity and depleting stored nonstructural carbohydrates needed to regenerate foliage, and they are indicative of damage to critical plant tissues other than foliage [10,35]. We expected that drought exposure combined with crown scorch would lead to increasing mortality risk. We found that high median drought exposure was associated with elevated mortality risk in trees exhibiting high levels of crown scorch. The trees most at risk were those for which median drought exposure was near or beyond the threshold for a designation of "abnormally dry", i.e., PDSI < −1.0.
The relationship between mortality risk and drought exposure, however, was not as clear as with crown scorch. We based our initial predictions on studies indicating that increased drought exposure, as measured by PDSI, has been linked to drought-driven mortality [14,15]. We expected increased drought exposure to heighten post-fire mortality risk across the board [3]. Fire damage and drought can both drive mortality by impairing xylem functioning [36]. Drought can also cause depletion of non-structural carbohydrates contributing to mortality [37]. To the contrary, we found that trees with minor crown damage had an elevated risk of mortality at lower levels of drought exposure. Mortality risk was lower where drought exposure was extreme, as indicated by low values of minimum PDSI; however, unlike median PDSI, the interaction between minimum PDSI and crown scorch was not significant.
Changes brought by the fire itself may help explain our counterintuitive finding that increased drought exposure was associated with decreased post-fire tree mortality. Stand density can alter tree responses to drought conditions by altering water availability, for example through lower or higher evapotranspiration. Fire-induced reduction in stand density through tree mortality can reduce competition for resources among the surviving trees, possibly offsetting drought impacts in areas experiencing high competition [38,39]. We also note that expected responses of plants to disturbance are best characterized by context-specific predictors and not generic plant traits [40].
Our mortality model included topographic factors, because they have been reported to be a driver of variability in drought-induced mortality [41]; we found that slope and aspect did impact post-fire mortality. However, their effects did not align with expectations. Typically, more xeric topographical positions, such as SW aspects and ridgetops, can be more vulnerable to drought owing to increased evapotranspiration [14]. However, trees on the driest aspect (SW) in our dataset presented mortality risk similar to those on the wettest (NE), with both of these demonstrating mortality that exceeded other aspects (SE and NW). We also found that tree mortality declined as slope increased, contradicting our expectation that mortality would be greater on steeper, and potentially drier, slopes. Moreover, topographical factors can lead to complex fire effects, which can complicate generalizing their effects on tree mortality [42]. Interactions of slope and aspect with fire-driven changes in stand density, as noted above, might also help explain the mortality effects we observed.
The lack of consistent mortality response to drought is not unique to this study. Others have found the substantial variation in mortality response to drought to be a major challenge [43,44]. Mechanisms behind the interacting influence of drought stress and fire on tree survival are not yet well understood [45]. One potential explanation for the weak relationship found between drought and mortality may be that trees adapt by accessing water available at greater depth in the soil profile or via physiological adaptions [46]. Another possibility is that trees with greater drought exposure and/or growing on drier topographic sites might have better adapted to water stress than trees with less drought exposure [47].
PDSI coupled with our ten-year window may not have sufficiently captured the effects of drought exposure on tree mortality. Alternative drought metrics might perform better, and more directly account for effects driven by temperature or precipitation. For example, normal annual precipitation or standardized precipitation index (SPI) might prove to be more effective predictors of individual tree mortality [48]. Combining complementary drought indices (e.g., PDSI and a fire weather index) has been suggested as a potentially promising approach to capturing effects of drought on forests [49]. All drought metrics have substantial limitations when attempting to assess the full impact of droughts' impacts on evaporative demand and precipitation deficits in forest ecosystems [50].
Drought can induce second-order effects, triggering or amplifying other changes that contribute to post-fire mortality. Trees weakened by drought have increased vulnerability to mortality [51]. For example, drought-stressed trees can produce less resin, leading to greater mortality from bark beetles [52]. California's recent drought contributed to a major bark beetle attack from 2012 to 2017 [53]. However, effects varied among trees owing to a combination of lags in drought effects, site factors, and host preferences of the beetle species. For example, insect-induced mortality was greater on low-elevation sites and varied among tree species. In particular, larger-diameter ponderosa pines at lower elevation in the Sierra Nevada experienced very high mortality rates [53]. We did not observe this in our analysis: post-fire mortality in ponderosa pine was 16 percent, but was much greater in Jeffrey pine, at 30 percent. Given reports of large-scale insect outbreaks in California during some of the years covered by these data, we had planned to include bark beetle damage observed at the first post-fire visits as a predictor. However, no cases of obvious beetle attack were recorded for the trees in this study, perhaps because beetle populations driving mortality events do not necessarily become elevated in the first year following fire [54].

Limits of First-Order Factors in Predicting Post-Fire Mortality
Despite an acceptable overall model performance (79% AUC), our model had less success in predicting mortality trees (low sensitivity) than survivor trees when relying on exceedance of the 0.5 decision threshold to classify mortality trees. The challenges of predicting mortality outcomes are commonly discussed in post-fire mortality modelling research [55,56]. Ideally, an effective mortality predictor would clearly distinguish survivor trees from mortality trees. Crown scorch was the most effective at this among the predictors we tested. Survivor trees had low levels of crown scorch (median = 2%), but mortality trees did not cluster at the high end of crown scorch (median = 40%) (Figure A1), thereby producing false negatives that reduced model sensitivity. Longer post-fire surveys, such as FERS or [57], may further clarify long-term fire-caused mortality by revealing how separation in a predictor such as crown scorch might change over time.
We expected that the other first-order predictors (bark thickness, soil char and stem char) would be more influential in predicting post-fire mortality than our analysis demonstrated. One potential explanation is that there was much less separation in the means of these predictors between survivor and mortality trees than there was for crown scorch. The lack of separation suggests that these factors are not decisive predictors of delayed mortality. Because bark thickness was calculated from measurements of tree diameter collected up to 10 years pre-fire (mean = 4 years), it may slightly understate the capacity of the tree to survive fire. We did find that bark thickness was a significant predictor of first-order mortality (unpublished results). In a previous study of FERS stands, we found that smaller-diameter trees were more vulnerable to first-order mortality [17]. Others have found bark thickness to be less predictive of mortality than crown scorch [5,33]. It should be noted that difficulty in measuring cambium damage for modelling purposes does not rule out its role in contributing to delayed mortality in the field.
Our first-order metrics may have failed to fully capture mortality effects. There are several factors that can limit the impact of ground fires on root systems. The soil medium itself can limit fire impacts on roots by protecting roots deeper in the soil profile from heat damage [21,58]. For roots closer to the surface, rapid turnover of fine roots facilitates quick post-fire recovery [59]. Wind-driven wildfires often have short residence times and, therefore, cause relatively little damage to root systems compared to slower moving fires [59]. Unexpected mortality from root damage is a major concern in prescribed fires operations, often conducted when winds are light or absent, where longer residence times can increase damage to root systems [22,60].
The capacity of soils to buffer disturbances also has implications for predicting drought-driven mortality. Roots deeper in the soil profile are largely sheltered from direct fire effects and can buffer water stress in the upper soil profile. Subsurface water access can delay or prevent drought-driven mortality. Forests in southern California experienced greater mortality relative to central and northern California because of a combination of depleted subsurface water and increased evapotranspiration [61].

Moving Forward with Post-Fire Mortality Modelling
Tree mortality forecasters must balance correctly predicting mortality trees against incorrectly labeling survivor trees as dead (false positives). Our model did a better job of avoiding false positives than of correctly predicting mortality trees. One contributing factor was that our dataset was imbalanced in favor of survivor trees. Logistic regression models can have trouble classifying mortality when the sample has a low proportion of dead trees [62], as was the case here because we focused on delayed, not immediate, mortality. Such class imbalances also occur when conducting research on prescribed fires, leading to underprediction of mortality. One solution would be to lower the mortality decision threshold to increase sensitivity. However, in this case, that would require lowering the threshold below the 50/50 level and would lead to a large false positive rate (e.g., 40% for a 0.25 decision threshold). Alternatively, the random forest algorithm might be used to improve correct classification of dead trees in applications where the goal is retrospective analysis of mortality [62].
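The threshold tradeoff described above can be made concrete with a toy, imbalanced sample in Python (the probabilities below are invented; three mortality trees, seven survivors): lowering the decision threshold raises sensitivity, but past a point it also inflates the false positive rate.

```python
# Toy, imbalanced sample: 3 mortality trees (label 1) and 7 survivors (0).
probs = [0.9, 0.6, 0.4, 0.35, 0.3, 0.3, 0.2, 0.2, 0.1, 0.1]
labels = [1, 1, 1, 0, 0, 0, 0, 0, 0, 0]

def sens_fpr(threshold):
    """Sensitivity and false positive rate at a given decision threshold."""
    tp = sum(p >= threshold and y == 1 for p, y in zip(probs, labels))
    fn = sum(p < threshold and y == 1 for p, y in zip(probs, labels))
    fp = sum(p >= threshold and y == 0 for p, y in zip(probs, labels))
    tn = sum(p < threshold and y == 0 for p, y in zip(probs, labels))
    return tp / (tp + fn), fp / (fp + tn)

# Sensitivity rises as the threshold drops, but so does the FPR.
for t in (0.75, 0.5, 0.25):
    s, f = sens_fpr(t)
    print(f"threshold={t}: sensitivity={s:.2f}, FPR={f:.2f}")
```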
The tolerance for error in mortality models depends on the management objectives. The persistence of wildfire effects on forests is a key question facing managers in planning post-fire recovery, such as ensuring suitable seed sources for natural regeneration [63,64]. In the forest recovery context, our crown-scorch-dominated model would have reasonable success in correctly predicting living trees, and perhaps stand attributes such as stocking, in the first decade post-fire. Clearly, the opposite is not true for management needs centered on dead trees (e.g., predicting snags [65,66]). In that case, our model would likely underpredict the creation of new snags. Progress in improving model sensitivity might also come from improved measurement of first-order fire damage beyond the metrics included in the FERS protocol. For example, cambium damage can contribute to mortality but is often assessed indirectly, as was the case in this study. We relied on bark thickness as an indicator of potential cambium damage, but this reflects only a tree's potential to resist fire effects, not the actual damage sustained [5]. Measuring charcoal reflectance on tree boles can help predict mortality by allowing a more direct assessment of fire damage [67]. Where trees can be invasively sampled, direct assessment of cambium damage (cambium kill rating) can be used to improve predictive accuracy [68].
However, it is not clear that even perfectly captured first-order effects alone would be effective in predicting post-fire mortality. In the absence of major disturbances, such as wildfire, tree mortality is likely the result of multiple accumulating processes rather than a single driver [69]. The importance of first-order fire damage as a driver of mortality will likely diminish as time since fire increases. Nonetheless, fire-injured trees may be more vulnerable to mortality in the medium term (10-30 years post-fire) [57]. Continued monitoring and assessment of fire-affected forest inventory plots may provide vital data in this regard [70].
The feasibility of implementing modelling alternatives to the logistic regression model is an active area of research [5,62]. One approach treats tree mortality as a non-linear dose-response to heat exposure from wildfire [45,71]. Low to moderate levels of stress on trees, including from fire damage, have been shown to benefit tree survival and growth [35]. If trees can benefit from moderate levels of first-order damage, this would help explain some of the limitations of relying on damage metrics to predict post-fire mortality.
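One way to picture such a non-linear dose-response is a logistic model with a quadratic scorch term, which allows moderate damage to carry a lower predicted mortality than either no damage or severe damage. The functional form and coefficients below are a hypothetical illustration, not fitted values from this study or from the cited work.

```python
import math

def mortality_prob(scorch_frac, b0=-2.0, b1=-4.0, b2=8.0):
    """Logistic model with linear + quadratic crown scorch terms.

    scorch_frac -- fraction of crown scorched (0..1); coefficients are
    illustrative, chosen so predicted mortality dips at moderate scorch.
    """
    logit = b0 + b1 * scorch_frac + b2 * scorch_frac ** 2
    return 1.0 / (1.0 + math.exp(-logit))

low  = mortality_prob(0.0)   # no scorch
mid  = mortality_prob(0.25)  # moderate scorch: minimum of the dose-response
high = mortality_prob(1.0)   # full crown scorch
```

Under these illustrative coefficients, predicted mortality is lowest at moderate scorch (mid < low < high), mimicking the pattern in which moderate stress is associated with better survival than either extreme.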
Our attempt to incorporate PDSI highlights the challenges of incorporating second-order factors into the Ryan-Amman mortality model. The lack of consistent patterns in tree responses to drought likely precludes simply adding a drought term to the RA model, even within ecoregions. Further, our results suggest that a range of drought outcomes needs to be addressed for effective mortality prediction. Currently, the commonly used FVS model incorporates drought by assuming that fire mortality, as predicted by the RA model, increases background mortality beyond user-adjustable mortality multipliers [72]. Drought-driven mortality has been shown to exhibit threshold responses, where mortality increases rapidly once drought severity reaches a critical level [73]. Regional drought thresholds could be established and used to create scenarios of potential drought impacts. A range of scenarios is needed, however, as these results demonstrate that drought does not automatically increase mortality risk.
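A threshold-style drought scenario of the kind suggested above could be sketched as a simple multiplier that activates once PDSI crosses a regional threshold. All parameter values below (threshold, multiplier, baseline probability) are hypothetical placeholders, not calibrated regional values.

```python
def drought_adjusted_mortality(base_prob, pdsi, threshold=-3.0, multiplier=2.0):
    """Scale a baseline annual mortality probability when drought severity
    (PDSI; more negative = more severe) crosses a regional threshold.

    Returns the adjusted probability, capped at 1.0. Threshold and
    multiplier values are illustrative scenario inputs.
    """
    if pdsi <= threshold:
        return min(1.0, base_prob * multiplier)
    return base_prob

# Scenario comparison for a tree with a 5% baseline mortality probability:
normal = drought_adjusted_mortality(0.05, pdsi=-1.0)  # threshold not reached
severe = drought_adjusted_mortality(0.05, pdsi=-4.0)  # threshold exceeded
```

Running a set of such scenarios across a range of thresholds and multipliers would express the uncertainty the results above demand, since drought did not uniformly increase mortality risk.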
Given the long-term dynamics of drought, fully understanding the interaction between wildfire and drought requires long-term monitoring. A major challenge in post-fire vegetation monitoring is obtaining relatively balanced samples across forest types and initial conditions. Many of the studies used for post-fire mortality model development and validation are one-off efforts, with notable exceptions such as the National Park Service's long-term post-fire monitoring studies [12]. The paucity of long-term monitoring hinders managers needing to plan and execute post-fire management. Managers have used standard FIA data to manage post-fire forest recovery [74]. One advantage of using FERS data for post-fire monitoring is that it enables long-term monitoring across ownership types and different initial pre-fire conditions. Continued post-fire monitoring will provide long records of tree outcomes as the plots undergo remeasurement at regular intervals (currently ten years, typically, in the western U.S.). FERS also offers a basis for bridging the gap between short- and long-term assessment of post-fire vegetative dynamics [75].

Conclusions
We confirmed that crown scorch is the most effective predictor of post-fire delayed mortality. Including additional first-order metrics, such as soil char, led to only modest improvements in model performance compared to crown scorch alone. Mortality risk rose with increasing median drought exposure; however, high drought exposure did not correspond to elevated mortality in all cases, and the opposite trend was found when considering maximum values of PDSI. We also found that slope and aspect influenced delayed mortality, but to a lesser degree than crown scorch. Species was largely not a significant factor, except for incense-cedar. Given the potential non-linear effects of drought on mortality, PDSI might serve as an early warning signal of future drought-driven mortality. Regardless, predicting delayed post-fire mortality will likely be more challenging than predicting first-order mortality, as successful second-order predictors will likely not be as universally effective as crown scorch. A combination of improved first-order metrics added to the Ryan-Amman equation, coupled with improved post-fire monitoring, could help improve model performance.