Impacts of Forest Fires and Climate Variability on the Hydrology of an Alpine Medium-Sized Catchment in the Canadian Rocky Mountains

This study investigates the hydrology of the Castle River in the southern Canadian Rocky Mountains. Temperature and precipitation data between 1960 and 2010 are analyzed for climate trends, and a general warming is identified. Observed streamflow has been declining in response to a decreasing snow cover and increasing evapotranspiration. To simulate the hydrological processes in the watershed, the physically based hydrological model WaSiM (Water Balance Simulation Model) is applied. Calibration and validation provide very accurate results, and the observed declining runoff trend can be reproduced, albeit with a slightly different slope. Besides climate-change-induced runoff variations, the impact of a vast wildfire in 2003 is analyzed. To determine burned areas, a remote sensing method based on differenced burn ratios is applied using Landsat data. The results show good agreement with observed fire perimeter areas. The impacts of the wildfire are evident in observed runoff data. They also result in a distinct decrease in model efficiency if not considered via an adapted model parameterization that takes into account the modified land cover characteristics of the burned area. The results of this study reveal (i) the necessity to establish specific land cover classes for burned areas; (ii) the relevance of climate and land cover change for the hydrological response of the Castle River watershed; and (iii) the ability of the hydrological model to accurately simulate the hydrological behavior under varying boundary conditions. By these means, the presented methodological approach is considered robust enough to implement a scenario simulation framework for projecting the impacts of future climate and land cover change in the vulnerable region of Alberta's Rocky Mountains.


Introduction
The southern Rocky Mountains in Alberta, Canada, are an important region for the generation of freshwater runoff especially due to their contribution of melt water in spring months [1,2]. Climate change affects the watersheds in the southern Rocky Mountains with an increase in air temperatures, especially during winter months, and thus a decrease in snow cover. As a consequence, spring melt runoff has been declining [3]. Precipitation changes are more ambiguous in pattern and amplitude, but also contribute to changes in runoff variability. The impact of climate change on southern Albertan watersheds has been investigated by several authors [1][2][3][4][5]. The trend in runoff due to climate variability in the Castle River watershed was analyzed by Rood [5] and Byrne [1].
Land cover changes can also impact the watershed's runoff regime and discharge amounts. Wildfires are an important natural occurrence on the eastern slopes of Alberta [6] and have always played an integral role in the functioning of ecosystems in Canadian forests [7]. However, over the past three decades there has been a distinct increasing trend in the area burned by wildfire [8]. Flannigan et al. [9] examined the effect of climate change on wildfires and emphasize that fire projection models combined with climate change predictions indicate an increase not only in the area burned but also in fire season length, fire intensity and thus burn severity [7]. Pierson et al. [10] investigated the impact of wildfires on watershed hydrology and detected reduced infiltration and increased erosion at a steep rangeland study site. According to Silins [11], after the 2003 Lost Creek fire in southern Alberta, important changes in the hydrology of the affected rivers could be measured; mean annual flows increased due to reduced actual evapotranspiration, as did peak discharges and sediment concentrations.
To analyze changes in discharge patterns due to environmental change, hydrological modeling is broadly applied. Currently, special focus in environmental modeling is set on the investigation of the impacts of climate variability, land cover changes or human induced land degradation [12]. To provide input data for hydrological modeling or to detect changes in land cover, satellite remote sensing is a recommended and widely used method. Landsat data are available at high spatial and temporal resolution and deliver information in a broad range of wavelengths especially suited to distinguish different land cover [13]. Thus, different types of patterns in land cover can be detected as well as variations when applying a multi-temporal change detection method. Particularly changes of vegetation characteristics can be distinguished by using infrared bands [13].
In this study, the physically based Water Balance Simulation Model (WaSiM) is applied to simulate the hydrological processes in the Castle watershed. The effect of the fire on the entire Castle River watershed has neither been investigated nor modeled before. Thus, this study analyzes whether the Castle River shows a reaction in streamflow after the fire and whether the model is able to simulate runoff behavior with a changed land cover after wildfires.
The study intends to answer the following questions: How has the hydrology of the Castle River been changing during the last decades and what are the reasons for these changes? Is the hydrological model WaSiM able to represent the hydrological conditions? How severe was the wildfire in 2003, and can its effects on hydrology be observed and simulated with an adjusted parameterization?
The objectives of this study are to (i) analyze characteristics and trends of measured climate and streamflow data; (ii) calibrate and validate the hydrological model WaSiM for the watershed; (iii) map burn severity from satellite imagery for a severe wildfire in 2003; and (iv) simulate the impact of climate variability and forest fires on runoff behavior in the Castle River watershed.

Study Area
The Castle River watershed is situated at the eastern slopes of the Rocky Mountains in southwestern Alberta ( Figure 1). It is located between the Waterton Lakes National Park in the south, the Crowsnest River watershed in the north and the border to British Columbia in the west. In the east, the Castle River drains into the plains and provides water for a wide variety of purposes, in particular for irrigated agriculture [14].
The gauge is situated near the town of Beaver Mines (at 49°29ʹ19ʺ North and 114°8ʹ39ʺ West) and defines the watershed for this study with a size of 820.70 km² [15]. The watershed has an elevation difference of more than 1600 m, as the gauge is at 1187 m and Castle Mountain has an altitude of 2766 m [16]. The mean annual streamflow of the Castle River at the Beaver Mines gauge is 15.57 m³·s⁻¹ (runoff data available from 1960 to present), with a distinct maximum in May and June and minimum runoff in the winter months. The peak of the hydrograph in early summer reflects the convergence of the period of peak melt and the rainiest period of the year in June [11].
The Castle basin has frequently been affected by wildfires [11]. The Lost Creek wildfire in summer 2003 was one of the most severe in the upper eastern slopes in many decades and burned 21,000 ha in both the headwater regions of Crowsnest and northern Castle River [17], where it spread in the catchment areas of Lynx Creek and Lost Creek. The 2003 fire perimeter is indicated in Figure 1. The fire burned in all structural forest strata as a so-called crown fire and expanded quite quickly so that the landscape was changed significantly. As the Lost Creek fire took place along two of the tributaries of the Oldman River, i.e., the Castle and the Crowsnest Rivers, local and regional water-based resources were impacted through these forest disturbances [11].

Climate and Streamflow Observations and Trends
Historically, meteorological monitoring in the eastern slopes of the Rocky Mountains has been extremely sparse [4]. The only available data in the area are interpolated daily minimum and maximum temperatures (in °C) as well as daily precipitation sums (in mm) from AAFC [18] from 1950 to 2010. These data are organized in a regular station grid with a spacing of 10 km, which contains interpolated point estimates. They are derived from an ANUSPLIN interpolation of Environment Canada climate stations south of 60°N, in which a thin-plate smoothing spline surface fitting method was implemented using longitude, latitude and elevation [19]. The locations of the station grid points are shown in Figure 1. Hence, the meteorological data used in this study may deviate from actual measured values, but they capture general patterns and trends. Climate features are analyzed over a period of 61 years to investigate the data for a climate change trend. To analyze mean temperature characteristics in the region of the Castle River watershed, a mean of all 26 station values is calculated for every day. The same procedure is applied to the precipitation data. Additionally, to analyze spatial variability in the watershed, spatially distributed analyses are carried out in which the individual station values are considered.
Stream discharge data from Castle River are available at the gauge near the town of Beaver Mines, operated by Environment Canada [15]. The location of the gauge is indicated in Figure 1. Daily runoff values from 1960 to 2010 are analyzed regarding runoff regime, quantitative or temporal changes and trends.

Hydrological Modeling with WaSiM ETH
The hydrological modeling in this study is executed with the mainly physically based, distributed and deterministic hydrological model WaSiM (Water Balance Simulation Model) [20]. It runs in continuous time steps (one day in this study), and the spatial resolution is a raster of constant grid cell size, 100 m in this study. WaSiM allows short-term flood event runoff simulations as well as long-term water balance applications [21]. According to Beckers et al. [22], it delivers very high model functionality and complexity for forest management and climate change applications and is, therefore, applied in this study. A detailed description of all modules, algorithms and the functioning of the model is given in the WaSiM Model Description by Schulla [20]. In this application, model version 9.1.0 is applied, which integrates the Richards equation.

WaSiM Input Data
WaSiM requires meteorological and spatially distributed input data. The main meteorological input data for an optimal configuration are spatially explicit temperature (°C), precipitation (mm per time step), wind speed (m·s⁻¹), relative humidity (1/1) or vapor pressure (mbar) and either sunshine duration (1/1) or global radiation (Wh·m⁻²) for each time step. The data are acquired from several meteorological station points distributed in and around the considered watershed so that an interpolation in WaSiM delivers comprehensive meteorological information.
As AAFC [18] provides minimum and maximum temperature and precipitation data in daily resolution [19] but WaSiM requires the daily mean temperature, the mean values are calculated as (Tmax + Tmin)/2, according to Klein Tank et al. [23].
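This conversion can be sketched as follows; the example temperatures are illustrative, not values from the AAFC grid.

```python
def daily_mean_temperature(t_max, t_min):
    """Approximate the daily mean from the daily extremes (deg C),
    following the (Tmax + Tmin) / 2 convention used in the text."""
    return (t_max + t_min) / 2.0

# e.g., a winter day with Tmax = -2 deg C and Tmin = -14 deg C gives -8 deg C
```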
For the study area, wind speed, global radiation and relative humidity are only available as monthly means from a spline interpolation of climate normals from the National Climate Data Information Archive [24]. For wind speed, the available monthly mean values are used in this study instead of daily ones. For relative humidity and global radiation, daily data are estimated from air temperature and precipitation using the equations of Thornton et al. [25] and Bristow and Campbell [26]. The meteorological data are thus supplemented by information about the geographical position and altitude of each station.
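The radiation estimate can be sketched under the commonly cited Bristow–Campbell form, in which atmospheric transmittance is a function of the diurnal temperature range, tau = A·(1 − exp(−B·dT^C)). The coefficients A, B, C and the extraterrestrial radiation value below are illustrative assumptions, not the calibrated values used in the study.

```python
import math

def bristow_campbell_radiation(t_max, t_min, r_extraterrestrial,
                               a=0.75, b=0.005, c=2.4):
    """Hedged sketch: estimate daily global radiation from the diurnal
    temperature range via tau = a * (1 - exp(-b * dT**c)).
    Returns radiation in the same unit as r_extraterrestrial."""
    d_t = max(t_max - t_min, 0.0)              # diurnal range (deg C)
    tau = a * (1.0 - math.exp(-b * d_t ** c))  # atmospheric transmittance
    return tau * r_extraterrestrial
```

A large diurnal range (clear sky) yields transmittance close to A; a range near zero (overcast, rainy) yields almost no transmitted radiation, which is the physical rationale for coupling the estimate to precipitation data.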
Temporally constant geographical data in grid format are required: a digital elevation model (DEM), land use and soil types. The DEM [27] contributes information about the elevation, slope, aspect and curvature of the surface. Additionally, hydrologic information is derived from the DEM using the Topographic Analysis (TANALYS) program by Schulla [20]. Land cover information at 100 m spatial resolution for the Castle watershed is provided by the GeoBase Land Cover Product [28].
However, an official soil map with soil texture information for the Castle River area is not available. WaSiM requires information about the soils in order to calculate infiltration and evaporation processes, which depend essentially on the soil texture [20]. Thus, a soil texture map is generated, mainly based on the land cover information and the predominant soil types in differently vegetated areas, according to the Canadian Society of Soil Science [29]. In the application of the model, the different soil texture classes receive specific attributes in several soil layers, such as the hydraulic conductivity, the saturated and residual water content and other parameters that depend on the grain-size composition of the soil [20].

Calibration and Validation
Calibration of WaSiM is executed stepwise. Sensitive model parameters to be calibrated in the Richards-equation version of WaSiM are the soil model parameters of the unsaturated zone, especially the scaling parameter for interflow and the recession parameter describing the decrease of saturated hydraulic conductivity with increasing soil depth, which controls peak runoff and base flow [12].
The calibration period is defined from 1 November 1960 until 31 October 1970, i.e., 10 hydrological years. A comparison of the modeled and observed data sets is conducted and the hydrological goodness-of-fit criteria NSE and R² are calculated in daily and monthly resolution [30][31][32]. Furthermore, the mean annual runoff (MQ), the monthly mean runoff (MMQ), as well as mean high (MHQ) and mean low flow (MLQ) values are compared. Calibration steps are iteratively conducted until a maximum in model efficiency is reached.
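The two goodness-of-fit criteria can be sketched as follows; the discharge series in the test are illustrative, not Castle River data.

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of residual variance
    to the variance of the observations (1.0 = perfect fit)."""
    mean_obs = sum(observed) / len(observed)
    residual = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    variance = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - residual / variance

def r_squared(observed, simulated):
    """Coefficient of determination as the squared Pearson correlation."""
    n = len(observed)
    mo = sum(observed) / n
    ms = sum(simulated) / n
    cov = sum((o - mo) * (s - ms) for o, s in zip(observed, simulated))
    var_o = sum((o - mo) ** 2 for o in observed)
    var_s = sum((s - ms) ** 2 for s in simulated)
    return cov ** 2 / (var_o * var_s)
```

Note the difference that motivates reporting both: R² rewards correlated shape regardless of bias, while NSE also penalizes systematic over- or underestimation of discharge.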
An elevation-dependent correction for precipitation data is applied separately for rain and snow to derive altitudinal gradients. A threshold temperature between rain and snow has to be chosen, for which the value 1 °C showed the best model performance; this is also the threshold temperature for rain in the snow model. The interpolation method used is a combination of inverse distance weighting and an elevation-dependent regression with internal processing, making use of all available meteorological observations. The stomatal resistance parameter was calibrated and needed to be increased to obtain realistic values for actual evapotranspiration. During this calibration process the model efficiency is increased until a stable parameter set is obtained.
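A threshold-based rain–snow partition of this kind can be sketched as below. The 1 °C threshold follows the calibration described in the text, while the ±1 °C linear transition range is an illustrative assumption (the actual transition width in WaSiM is a separate model parameter).

```python
def snow_fraction(t_air, t_threshold=1.0, transition=1.0):
    """Fraction of precipitation falling as snow at air temperature t_air.
    All snow below (threshold - transition), all rain above
    (threshold + transition), linear mix in between."""
    lower = t_threshold - transition
    upper = t_threshold + transition
    if t_air <= lower:
        return 1.0
    if t_air >= upper:
        return 0.0
    return (upper - t_air) / (upper - lower)
```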
In order to validate the established model the calibrated parameter set is applied for the following 30 years. Validation runs are executed for each of the decades from 1970 to 1980, from 1980 to 1990, from 1990 to 1999 and from 2000 to 2010.

Remote Sensing of Forest Fires
Spatial wildfire data from ESRD Alberta [33] provide perimeters of major historical fire events in Alberta. The shapefile polygons are derived from aerial photos and show, among other fire events, the extent of the 2003 wildfire; however, they lack information on burn severity, which is critical for assessing the impact of fires on hydrological variables.

Data
To determine the degree of damage to the burned land cover, infrared Landsat satellite data are applied for vegetation change detection, as these wavelengths react sensitively to water content and chlorophyll [34]. Landsat Thematic Mapper 5 (TM5) and Enhanced Thematic Mapper Plus (ETM+) data, provided in GeoTIFF format, are used at a spatial resolution of 30 m. Pre- and post-fire images from the 2003 wildfire are chosen according to smallest cloud cover and seasonal similarity to ensure analogous phenology and minimized sun angle effects. Pre-fire imagery is taken from 2002; post-fire imagery stems from shortly after the wildfire was extinguished.

Normalized Burn Ratio and Change Detection Algorithm
The Normalized Burn Ratio (NBR) has been developed for the detection of fire scars [35]; it is explained in detail by Miller and Thode [13] and Miller et al. [36]. The NBR is computed from Landsat bands 4 and 7, the near- and shortwave-infrared bands [36]. Comparing pixels from pre- and post-fire images, both bands demonstrate considerable changes, especially in forested regions. The reflectance in band 4 decreases because of a loss of photosynthetic activity, while the reflectance in band 7 increases due to less water absorption.
Miller and Thode [13] emphasize that vegetated pre-fire areas have different NBR values, as the value depends on the density of the vegetation cover with its chlorophyll and water content. Hence, to generate comparable fire severity values, a change detection method (differenced Normalized Burn Ratio, dNBR) is applied in which each post-fire NBR is subtracted from the pre-fire NBR [36]. Low dNBR values indicate unchanged landscape or low fire severity, while high dNBR values represent severe damage to vegetation through the fire [13].
The dNBR is meant to normalize the results to pre-fire vegetation cover. However, in several studies the relationship between measured severity values and calculated dNBR has been tenuous [37,38]. Miller and Thode [13] therefore present a relative version of the dNBR (RdNBR) which includes the vegetation density before the fire and minimizes the error caused by pre-fire vegetation. The RdNBR is defined as the ratio between the vegetation killed by the fire and the amount of pre-fire vegetation. Thus, positive RdNBR values represent a decrease in vegetation cover and consequently a high burn severity; negative values indicate an increase in vegetation and thus characterize unburned areas [36]. According to thresholds derived from field data measured after fire events by Miller and Thode [13], four burn severity classes are defined from the RdNBR calculation.
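The chain of indices described above can be sketched as follows. The relative form is commonly written as RdNBR = dNBR / sqrt(|NBR_pre / 1000|) for NBR values scaled by 1000, following Miller and Thode; the reflectance and index values below are illustrative, not pixels from the study scenes.

```python
import math

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR (Landsat TM band 4) and
    SWIR (band 7) reflectance."""
    return (nir - swir) / (nir + swir)

def dnbr(nbr_pre, nbr_post):
    """Differenced NBR: positive values indicate vegetation loss."""
    return nbr_pre - nbr_post

def rdnbr(nbr_pre, nbr_post):
    """Relative dNBR; inputs assumed scaled by 1000 as in Landsat
    index products, per the Miller and Thode formulation."""
    return dnbr(nbr_pre, nbr_post) / math.sqrt(abs(nbr_pre / 1000.0))
```

A healthy conifer pixel (high NIR, low SWIR) has a strongly positive NBR; after a severe burn the relation inverts, so dNBR and RdNBR become large and positive, which is what the severity thresholds classify.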

Background
Fire severity is the greatest determinant of the impact of a fire on streamflow generation. Changes in annual water budgets can last for decades in forested regions [39]. Severe fire events can reduce infiltration rates as a consequence of a hydrophobic soil layer generated by high temperatures, or of soil sealing forming crusts after a fire [10,39]. The decreased infiltration has an immense effect on post-fire runoff, especially for extreme precipitation events during convective storms, when overland flow rises. The soils generally recover within a few years [39]. The long-term effects of forest fires on hydrology are caused by a reduction in evapotranspiration due to the loss of biomass, similar to the effects occurring after harvesting. Interception of rain and snow is also reduced. During winter, snow melt rates are higher due to reduced shading by the canopy [39].
The Southern Rockies Watershed Project [11] analyzed the impact of disturbance by the 2003 Lost Creek wildfire on hydrology, water quality and ecology in several reaches affected by the fire. The results show an increase in snowpack (54% higher in burned forest stands compared to unburned) and snow density, higher runoff values, especially in low flow periods during summer due to less evaporation and interception (by 14% compared to mean values from the 1990s), and peak runoff produced more rapidly in burned watersheds. Furthermore, an earlier spring streamflow peak was identified, as well as a second peak during the highest precipitation in June. To analyze whether wildfire impacts at smaller reaches propagate into the runoff behavior of the whole Castle River watershed, further investigation is carried out.

Data and Methods
To analyze whether the established WaSiM model can reproduce the observed runoff under changed land cover conditions after the wildfire, further analyses are conducted. The pixels identified as burned using the RdNBR method are reclassified to depict the land cover conditions after a wildfire. Previously, land cover in the burned area was mainly coniferous forest, which contributes immensely to evapotranspiration. Due to the lack of on-site information about the conditions after the fire, four replacement options are tested for the burned area: (1) no change; (2) barren soil; (3) grass; and (4) shrubs, for which individual land use parameter settings are provided in the hydrological model. Specific values for albedo, stomatal resistance, leaf area index and root depth must be provided to calculate evapotranspiration [20]. The parameter values for the four classes are presented in Table 1. According to Schulla [20], these are the most important parameters, and the values vary between the four classes. It is notable that barren land has mostly constant values, as vegetation is missing.
First, a model run from August 1993 to July 2003 is carried out with the original land cover to set the initial state of the hydrological conditions at the time the fire began. With these initial conditions, model runs for the first year after the fire, from August 2003 to October 2004, are executed with the four land cover properties applied to the burned area. Subsequently, the land cover classes applied in the burned areas are assessed regarding the model efficiency of the produced runoff.

Observed Climate and Streamflow Trend Results

An indication of a warming climate is shown in Figure 2 by a linear increasing trend in yearly temperature of ca. 0.03 °C per year, i.e., 1 °C in 30 years. A general temperature increase is evident throughout the whole watershed. Temperatures in the lower section of the Castle River watershed rose more considerably (2 °C) than in the mountainous parts (1 °C). Minimum and maximum temperatures are rising at a greater rate than the annual means, which may indicate a shift towards fewer cold extremes and more hot day events.

Considering the monthly mean temperatures during the last 60 years, the winter temperatures, especially in January, have increased by about 4 °C in the last three decades. Monthly mean temperature data are shown in Figure 3. The spring temperatures also rose by ca. 2 °C in the 1990s and 2000s compared to the middle of the 20th century. In July, the monthly mean temperature in the 2000s was about 2 °C warmer than in the other decades, and the August temperatures have also risen during these last decades. There is no distinct temperature trend visible in the fall.
The illustrated temporal and spatial trends in the temperature data from AAFC [18] generally agree with the conclusions of the IPCC [40]. In the southern Rockies, especially temperatures between December and April have increased, by up to 2.5 °C during the last six decades. This trend is also reflected by the data from the Castle area in Figure 2.

Mean annual precipitation in the Castle River watershed is estimated at 686 mm·a⁻¹ [18], with high variability within the watershed, ranging from ca. 600 mm·a⁻¹ in the lower valleys to over 1000 mm·a⁻¹ in the mountains [6]. The highest precipitation is observed in June (average of 86.8 mm), with minima in July (46.4 mm) or during winter [18].
Analyzing precipitation data from 1950 to 2010 reveals the following trends: The annual precipitation sum is generally slightly increasing with a high interannual variability. The seasonal distribution of precipitation is changing, even though the patterns are more ambiguous than the temperature data.
Winter precipitation has been decreasing in the last six decades, while summer precipitation is rising with increasing temperatures. The 1980s showed significantly lower precipitation compared to other decades (annual mean: 655 mm·a⁻¹ in the 1980s vs. 692 mm·a⁻¹). In the 1980s, winters, springs and summers were drier and warmer. The reasons for this relative drought were probably weaker west winds and a series of stable high pressure fields which blocked out storms [41].
Spatially distributed precipitation is high in the mountainous areas of the watershed in the south and west and is declining towards the northeast. The trend from 1950 to 2010 amplifies this pattern, as the mountains became more humid with increases of the annual precipitation sum by about 50 mm and the lower lying areas became drier.
The derived precipitation trends are similar to those presented by the IPCC [40], which show an annual precipitation increase of 12% on average, although in the southern Rockies this trend is weaker [42]. Significant changes can be observed in the temporal distribution of precipitation, which conform to those of the Castle watershed. The frequency of heavy precipitation has also increased in the last 60 years [42,43].
Considering gauge data at Beaver Mines from Environment Canada [15], the mean runoff has not been constant throughout the last 45 years. In the 1960s the mean annual flow was 15.1 m³·s⁻¹ (580 mm·a⁻¹); it then declined until, in the 1980s, it was 13.2 m³·s⁻¹ (507 mm·a⁻¹). In the 1990s it increased to 14.8 m³·s⁻¹ (569 mm·a⁻¹) and then decreased again to 13.3 m³·s⁻¹ (511 mm·a⁻¹) between 2000 and 2010. The monthly low and high flows show a corresponding behavior through the decades.
Overall, when applying a linear regression, the data show a trend of decreasing annual mean runoff over the considered decades. The observed annual mean runoff and the derived trend as a 5-year moving average for the whole period are shown in Figure 4. From 1962 to 1999, runoff decreased by about 20%, which corresponds to an annual decrease of ca. 0.53%. This reduction of annual river flow conforms to the investigations of Byrne [1], who analyzed streamflow of the Castle and Oldman Rivers and derived a general decline in mean streamflow of the Oldman River watershed of about 26% since 1949. The reason for this trend is attributed to the reduction of snowpack in the watershed due to the identified rising temperatures. Although precipitation in winter is increasing, this cannot compensate for the warming, and thus more precipitation falls as rain than as snow. Due to less melting, spring runoff is reduced [1], even at a stronger rate than in other comparable watersheds in the North American Rocky Mountain region.
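The two simple trend tools used here can be sketched as below: the 5-year moving average underlying Figure 4, and the linear annual rate implied by a 20% total decrease spread over 1962 to 1999. The series in the test is illustrative, not Castle River observations.

```python
def moving_average(series, window=5):
    """Moving mean over a list of annual values, as used to smooth
    the runoff series for the trend plot."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

# Linear (not compound) annual rate implied by the stated total decrease:
# ca. 0.005 per year, consistent with the ca. 0.53% quoted in the text.
annual_rate = 0.20 / (1999 - 1962)
```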
Considering the runoff trend for the decade 2000 to 2010, a slight increase is detected. This can mainly be explained by extraordinarily high precipitation of 800 mm·a⁻¹ in 2002 and over 1050 mm·a⁻¹ in 2005 [18]. Figure 5 shows the monthly mean runoff values per decade. The declining runoff values in May and June are an indicator of reduced snowpack, as less snow supplies less melt water for runoff generation. It is noticeable that the 1980s show a much lower runoff rate than the other decades, especially between May and August, due to the relative dryness of this decade. Apart from the 1980s, the peak streamflow in June has constantly declined. In the other seasons, no distinct trend is visible.

Calibration and Validation Results of WaSiM
First, time series of the simulated runoff in the calibration and validation periods are discussed. Second, an analysis of the spatial output grids for water balance calculation and runoff components is executed. Calibration and validation results are listed in Table 2. The result of the best calibration run from 1960 to 1970 is shown in Figure 6, in which daily and monthly mean observed and simulated data are represented, as well as the precipitation. Model efficiency is assessed according to Moriasi [30], who suggested a performance rating in four NSE classes: (i) very good (0.75 < NSE ≤ 1.00); (ii) good (0.65 < NSE ≤ 0.75); (iii) satisfactory (0.50 < NSE ≤ 0.65); and (iv) unsatisfactory (NSE ≤ 0.50). Generally, the efficiency of the established WaSiM model is very good in daily (NSE: 0.81, R²: 0.82) and in monthly mean temporal resolution (NSE: 0.88, R²: 0.89), as listed in Table 2. The set-up WaSiM model is able to reproduce runoff time series of the Castle River watershed and provides consistent simulation quality over a ca. 40-year period with very high accuracy.
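The four-class performance rating can be expressed as a small classifier; the class bounds follow the Moriasi scheme quoted above.

```python
def moriasi_rating(nse_value):
    """Classify a Nash-Sutcliffe efficiency value into the four
    Moriasi performance classes cited in the text."""
    if nse_value > 0.75:
        return "very good"
    if nse_value > 0.65:
        return "good"
    if nse_value > 0.50:
        return "satisfactory"
    return "unsatisfactory"

# the reported daily (0.81) and monthly (0.88) NSE values both rate "very good"
```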
In addition to the temporal and quantitative accuracy of the model, the spatial output data of the three runoff components are analyzed. The direct flow in particular, but also the interflow, is mainly generated in the mountainous parts, where precipitation is higher. Direct flow is very high in non-vegetated rocky summit regions. Base flow is mainly generated in regions where the soil is more developed and is especially visible in the valleys. These spatial patterns seem to reflect real conditions very well. The interflow contributes the largest amount of water to the streamflow with over 440 mm·a⁻¹, whereas the direct flow and the base flow together only account for around 150 mm·a⁻¹. Base flow contributes the highest amount of runoff in the low flow periods. Interflow is especially high during snow melt from spring to early summer, whereas the direct flow reacts to high precipitation events.
A water balance [44] is calculated for the calibration period from 1960 to 1970 and presented in Figure 7. The mountainous precipitation is increased by the precipitation correction method in the model. According to Silins [11], who conducted a few meteorological measurements in the northern reaches of the Castle watershed and detected annual precipitation values of over 1200 mm·a⁻¹, the derived precipitation in the mountains can be considered appropriate. However, the precipitation in the lower regions of the watershed is thereby overestimated. The mean annual precipitation at the meteorological station Beaver Mines, close to the gauge, amounts to around 650 mm·a⁻¹ [24], whereas the model output yields at least 910 mm·a⁻¹. Nevertheless, the precipitation correction is applied, as without it the model efficiency is significantly lower (NSE: 0.40, R²: 0.68) and the mean flow of the Castle River is significantly underestimated (5.5 m³·s⁻¹ compared to the observed 15.5 m³·s⁻¹). Most of the runoff is produced in the mountainous areas of the watershed, where the precipitation correction delivers more appropriate values. Furthermore, most of the snow accumulates in the mountains, and evapotranspiration has its maximum in the lower areas. Thus, the error introduced by an overestimation of precipitation downstream can be neglected, as the contribution of these areas to streamflow generation is low.
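The annual water balance underlying this comparison reduces to P = ET + Q + dS, which can be checked with a one-line residual; the example magnitudes are illustrative (in mm per year, loosely oriented at the values reported in the text), not actual model output.

```python
def storage_change(precipitation, evapotranspiration, runoff):
    """Residual storage change dS = P - ET - Q, all terms in mm/a.
    A large residual flags an inconsistency in the simulated balance."""
    return precipitation - evapotranspiration - runoff
```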
The evapotranspiration varies between ca. 500 mm·a⁻¹ and more than 750 mm·a⁻¹. Due to low transpiration, the calculated evapotranspiration is lower in non-vegetated areas such as the higher altitudes. In the forested valleys and in the agriculturally used areas in the northeast, very high evapotranspiration values are detected. The mean runoff for the period 1960 to 1970 is 596 mm·a⁻¹. The highest runoff values (up to over 850 mm·a⁻¹) are generated at the mountain tops, as precipitation and thus snow cover are higher there. The lowest contribution to the runoff is generated in the agricultural areas because of lower precipitation and higher evaporation. The results show that the established WaSiM model is able to reproduce runoff with high accuracy over a period of 40 years. This also implies that the partially calculated or generated input data are relatively accurate and that the calibration in this study has been successful.

Comparison of Observed and Simulated Declining Runoff Trend
The observed and simulated runoff in annual means is shown in Figure 4. Analyzing the trends in total runoff between 1960 and 1999, the observed data show a linear decrease of ca. 1 m³·s⁻¹ (38 mm) in 12 years. Therefore, over the 40 years considered, the observed streamflow declined by about 3.1 m³·s⁻¹, which equals a decrease of 119 mm·a⁻¹, although the trend has not been continuous over the four decades. Regarding monthly mean values, the simulated runoff trend has a very similar slope compared to the observed one. A decrease of 1 m³·s⁻¹ in 9 years and a total decline of 4.1 m³·s⁻¹ (157 mm·a⁻¹) between 1960 and 1999 can be derived from the simulation. Low and high flow trends are also decreasing. The decline in snow cover since 1960 is captured by the model. WaSiM generates a total snow storage output which sums the snow in mm per day. The snow cover varies, but the linear trend shows a distinct decline of 53 mm·a⁻¹ in the annual mean.
Considering the observed and simulated runoff between 2000 and 2010, a different behavior of the model is evident. Observed runoff is slightly increasing, while WaSiM produces a decrease of 9.1 m³·s⁻¹ over the decade when regarding monthly mean values. Figure 4 shows that between 2002 and 2007 runoff is overestimated considerably. The assumption is that the changed conditions in the basin after the 2003 wildfire are the reason for the decline in model efficiency (compare Table 2). Thus, this decade is investigated further in Section 4.5.
The water balance for the period 1990 to 1999 is calculated and compared to the water balance from 1960 to 1970. The change maps for simulated precipitation, evapotranspiration and runoff are shown in Figure 8. Considering precipitation, a spatially differentiated distribution and an increased amplitude are evident: the decrease in the lower parts is up to 30 mm·a⁻¹, while the increase, especially in the northern and western mountains, is up to 99 mm·a⁻¹. This observation conforms with the projections from Field [42], which show an increasing variability in precipitation. Considering evapotranspiration between the 1960s and 1990s, an overall increase can be seen due to increased temperatures. Nevertheless, in the higher mountainous parts a decreasing evapotranspiration was simulated. A possible explanation for this development is the decline in snow cover, which is represented by the model. MacDonald et al. [45] emphasized the importance of sublimation of the snow cover for the mass balance in a watershed. The reduction in evapotranspiration could therefore be explained by the loss of sublimation, which contributes to evapotranspiration. As a consequence of these developments in the distribution of precipitation and evapotranspiration, runoff is on average declining between the 1960s and 1990s. In the western, high-altitude regions, more precipitation produces more runoff, especially direct streamflow and interflow. In the valleys, increasing evapotranspiration reduces runoff generation and causes its declining trend. WaSiM is thus able to reproduce runoff values under changing climate conditions quite well. The decline in snow cover and, consequently, in total runoff is reproduced in the model results, as are changes in precipitation distribution and quantity, an increase in evapotranspiration and trends in runoff generation, in which mainly the interflow is declining.
The intensity of the decline is slightly overestimated. Further investigation with measured input data or a tested and approved precipitation correction factor would be necessary to describe the changes in the watershed more accurately.
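The decadal change maps of Figure 8 amount to differencing gridded decadal means and closing the long-term water balance (R ≈ P − ET, with storage change assumed negligible over a decadal mean). A minimal sketch with small hypothetical grids standing in for the model raster:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical decadal-mean grids in mm/a (4x4 cells stand in for the raster)
p_1960s = rng.uniform(700.0, 1400.0, (4, 4))
p_1990s = p_1960s + rng.uniform(-30.0, 99.0, (4, 4))   # change range cited in the text
et_1960s = rng.uniform(500.0, 750.0, (4, 4))
et_1990s = et_1960s + rng.uniform(0.0, 40.0, (4, 4))   # overall ET increase

# Cell-wise change maps, as in Figure 8
dp = p_1990s - p_1960s
det = et_1990s - et_1960s

# Runoff change follows from the water balance, neglecting storage change
dr = dp - det
print(f"mean runoff change: {dr.mean():.1f} mm/a")
```

This makes explicit why cells where the ET increase outweighs the precipitation increase (the valleys) show declining runoff, while high-altitude cells with large precipitation gains show increasing runoff.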

Results of Burn Ratio and Derived Burned Areas
The result of the Relative differenced NBR (RdNBR) is presented in Figure 9. High RdNBR values signify a decreased vegetation cover and thus a high burn severity. Low or negative RdNBR values indicate an increase in vegetation cover and mark unchanged areas [13]. The derived image shows a clear zoning of the Lost Creek fire, mainly in the high severity class.

Figure 9. Relative differenced Normalized Burn Ratio (RdNBR) of the Lost Creek fire; thresholds according to [13].
Field measurements and derived burn severity values for the wildfire in the Castle River watershed are not available. Hence, the dNBR and RdNBR thresholds of Miller and Thode [13], who analyzed 14 fires in the Sierra Nevada mountain range in California, USA, are applied. They compared measured CBI (Composite Burn Index) values with dNBR and RdNBR from pre- and post-fire Landsat data and defined thresholds derived from a regression model.
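The burn-ratio chain can be summarized per pixel: NBR is computed from the near-infrared and shortwave-infrared bands of each scene, dNBR is the pre-/post-fire difference (conventionally scaled by 1000), and RdNBR normalizes dNBR by the square root of the absolute pre-fire NBR, following Miller and Thode [13]. A minimal sketch with toy reflectance values (not actual Landsat scene data); pixels with near-zero pre-fire NBR would need masking in a real application:

```python
import numpy as np

def nbr(nir, swir):
    """Normalized Burn Ratio from NIR and SWIR reflectance."""
    return (nir - swir) / (nir + swir)

def rdnbr(nbr_pre, nbr_post):
    """Relative differenced NBR; dNBR scaled by 1000 as in Miller & Thode."""
    dnbr = (nbr_pre - nbr_post) * 1000.0
    return dnbr / np.sqrt(np.abs(nbr_pre))

# Two toy pixels: one severely burned, one nearly unchanged
nbr_pre = nbr(np.array([0.45, 0.40]), np.array([0.15, 0.20]))
nbr_post = nbr(np.array([0.20, 0.38]), np.array([0.35, 0.22]))
print(rdnbr(nbr_pre, nbr_post))
```

Severity classes are then assigned by comparing the resulting RdNBR values against the regression-derived thresholds of [13].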
Applying the RdNBR shows that the derivation of burned areas from Landsat satellite imagery is generally feasible in the Castle River watershed. The area derived by this method agrees quite well with the size collected by ESRD Alberta [33], with a discrepancy of 14%. There are two possible reasons for this difference. Either the burned area was collected only coarsely and the RdNBR calculation gives a more precise result for the burned areas, or there are errors in the calculation or in the application of the thresholds from [13]. Therefore, the delineation of burned areas into severity categories can only be an approximation. For a correct delineation of burn severity categories in the Castle River watershed, location-dependent CBI values would have been necessary. Nevertheless, the presented method has several advantages. The Landsat data are freely available, and long time series exist, so that change detection and post-fire succession analyses are possible. Compared to the simple burned area map from ESRD Alberta [33], Landsat data deliver valuable information about the plant characteristics and the changes of vital vegetation occurring with a fire event [46]. Most importantly, the RdNBR method delivers information about different grades of burn severity which are, according to Luce [39], the main determinants of the magnitude of a wildfire's impact on the hydrology of a watershed.
For further analysis of the impacts of the extensive forest fire on the hydrology of Castle River watershed, moderate and high severity RdNBR values are included in the land cover map and defined in a "burned" land cover class. These newly derived land cover maps are later applied as input data sets for the hydrological model WaSiM.

Results of Simulated Runoff Reaction to Fires
The model run from August 1993 to July 2003, intended to produce an initial state for the onset of the fire, does not have a considerably lower efficiency than the model runs of the preceding decades (NSE: 0.76, R²: 0.82). Simulating runoff after the fire, from August 2003 to November 2004, does not lead to accurate results when applying the original land cover input for the burned area (NSE: 0.51, R²: 0.86). The mean annual flow is overestimated considerably, by 18%. Thus, it can be assumed that the model allows an accurate runoff simulation only until the fire changed the land cover conditions.
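The two efficiency measures quoted throughout (NSE and R²) can be sketched directly from their definitions: the Nash-Sutcliffe efficiency compares the squared simulation error against the variance of the observations, and R² is the squared Pearson correlation. The discharge series below is hypothetical, for illustration only:

```python
import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 minus error sum of squares
    over the sum of squared deviations of the observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def r_squared(obs, sim):
    """Coefficient of determination as squared Pearson correlation."""
    return np.corrcoef(obs, sim)[0, 1] ** 2

# Hypothetical monthly mean discharge (m^3/s), observed vs. simulated
obs = np.array([2.0, 5.0, 9.0, 14.0, 8.0, 4.0])
sim = np.array([2.5, 4.5, 10.0, 13.0, 9.0, 3.5])
print(f"NSE: {nse(obs, sim):.2f}, R2: {r_squared(obs, sim):.2f}")
```

Note that a systematic volume bias, such as the 18% overestimation of mean annual flow after the fire, lowers NSE while R² can remain high, which is exactly the pattern seen in the post-fire run (NSE: 0.51, R²: 0.86).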
Applying the three other land cover characteristics in the burned area, as explained in Section 3.4.2, yields NSE values ranging between 0.44 for shrub and 0.56 for barren soil characteristics in the burned area. The relative error of the mean annual runoff is around 20%. Barren soil characteristics in the burned area deliver the best results for this time period (NSE: 0.56, R²: 0.90, relative error of mean flow: +17%). The simulation results compared with the observed data are shown in Figure 10. Although the mean annual runoff is overestimated under this scenario, satisfactory results can still be achieved according to the categorization by Moriasi [30]. Consequently, barren soil characteristics seem to represent burned vegetation in the first year after the fire more accurately than the other characteristics applied. Although dead tree trunks and burned organic litter are present in burned areas, all vegetation-based hydrological processes are evidently interrupted. Low flow in particular is represented quite well. Melt runoff peaks are also characterized quite well, but runoff in July, when the highest precipitation values occur, is overestimated by WaSiM. Regarding the simulated total runoff, it is evident that runoff generation in the burned region is slightly increased compared to the unburned area. This is caused by an increase of interflow by ca. 8% in the burned area. Due to the absence of transpiration without vegetation cover, more water can infiltrate and contribute to the total runoff. This is mainly caused by the parameterization of a low LAI, low evaporation (rs_interception and rs_evaporation) and high rsc values, which are shown in Table 1.
Despite the modifications, WaSiM still produces less accurate results for the period after the forest fire than for the four decades before. A reason for this can be traced to the change in land cover after the fire. In addition, the fact that the model results responded to changes in the burned-area land cover parameterization indicates that this approach is sound. Hence, a new land cover class for burned areas needs to be established.
Another reason why a high model accuracy cannot be achieved for the burned simulation scenario is climate variability. As shown before, WaSiM reproduces the declining runoff trend between 1960 and 2010 due to changes in climate patterns, but the runoff and the decreasing trend are also overestimated in this period. In the 2000s, precipitation was lower in the winter months but higher in the summer months than in other decades. The applied precipitation correction likely resulted in an overestimation of summer precipitation in the watershed, which, in turn, alters the annual streamflow and unrealistically increases the peak flows in June and July.
Generally, therefore, a superposition of effects occurs in the Castle River watershed after the 2003 fire. On the one hand, a distinct climate trend reduces the freshets that mainly contribute to runoff generation in southern Albertan alpine catchments. On the other hand, the general behavior after wildfires is an increase in runoff, resulting in an earlier spring melt flow as well as higher peak flows. Both characteristics can be detected in the observed and modeled runoff values: a higher low flow between August and March and an earlier onset of melt, possibly due to the fire effects, as well as a significantly decreased peak flow as a consequence of reduced snow packs due to climate variability.

Conclusions and Outlook
This study investigated the hydrology of the Castle River watershed in the southern Albertan Rocky Mountains, with particular focus on climate change induced alterations and the impacts of forest fires on streamflow. The objectives raised initially were largely achieved. The declining runoff trend, predominantly a consequence of higher evapotranspiration and decreasing snow cover due to rising temperatures, is evident in the Castle River. This trend could also be reproduced by the applied hydrological model WaSiM, which achieved high accuracy in runoff simulations of the watershed when compared to recorded streamflow.
The fire severity of the Lost Creek fire in 2003 was detected with pre- and post-fire Landsat satellite images using a differenced burn ratio method (RdNBR). Thus, the area burned by the fire as well as severity classes could be delineated with good precision. The impact on the discharge in the watershed is evident in the observed runoff. Nevertheless, the decrease in spring peak flow due to climate-induced decreasing snow cover was dominant over the increased peak flow as a consequence of the fire.
Applying the model after the wildfire in 2003 led to a significantly lower model efficiency, which indicates that the WaSiM output is sensitive to land cover changes caused by forest fires. An attempt to imitate burned land cover characteristics resulted in increased accuracy of the runoff simulations. It is anticipated that a new land cover class containing parameter values more representative of a burned forest would further improve the simulation of post-fire runoff with WaSiM. Depending on burn severity, a land cover class similar to barren soil would be appropriate, but one including features such as a burned organic layer, changed soil characteristics and dead tree trunks, which can still intercept some water. For subsequent years, a dynamic forest succession in the burned areas would have to be implemented.
This study also shows that satisfactory hydrological modeling is possible even if principal input data are missing. The meteorological inputs relative humidity and global radiation were estimated based on air temperature and precipitation data, soil information in the watershed was derived from land use, and the temperature and precipitation data were not measured locally but interpolated at a broad scale. This confirms the quality of the model WaSiM, which yields plausible results with a coherent parameterization even with poor input data availability.
Further investigation of the outlined research questions is recommended. Future trends point towards lower discharge due to decreasing snow cover as a result of climate change [1], as the trend from 1960 to 2010 already indicates. Furthermore, an increase in forest fires is expected due to rising temperatures and an extended fire season length [47]. Additionally, changes in land cover, either by forest fires or by other, partly anthropogenic interventions, will impact the hydrology of the Castle area in the future. The establishment of gauges in sub-basins and of meteorological stations at different altitudes in the watershed would provide significant hydro-meteorological data that are currently estimated [18]. A site-specific, elevation-dependent regression for correcting measured precipitation could then be derived and applied in the hydrological model to obtain spatially distributed meteorological information. A precise soil texture mapping would further improve the conditions for working with the model and produce more accurate results, as infiltration processes could be reproduced more precisely.
However, the presented methodological approach is considered robust enough to implement a scenario simulation framework for projecting the impacts of future climate and land cover change in the vulnerable region of Alberta's Rocky Mountains and comparable mountainous watersheds.