Runoff Response to Climate Warming and Forest Disturbance in a Mid-mountain Basin

A headwater basin in the Sumava Mountains (Czech Republic), the upper Vydra basin, has undergone forest disturbance as a result of repeated windstorms, a bark beetle outbreak, and forest management. This study analyzed long-term (1961–2010) hydro-climatic changes using a combination of statistical analyses, including Mann-Kendall tests, CUSUM analysis, Buishand's and Pettitt's homogeneity tests, and Kriging. Although the runoff balance over the study period showed no apparent change despite climate warming and forest disturbance, significant changes were detected in the shares of direct runoff and baseflow, the intra-annual variability of the runoff regime, seasonal runoff patterns, and the distribution of peak and low flow events. Seasonal runoff shifted substantially from summer (decreasing from 40% to 28%) to spring (increasing by 10%). The occurrence of peak flow events has doubled since the 1980s, with a seasonal shift from late spring towards early spring, while the occurrence of low-flow days decreased by two-thirds. By 1990, these changes were followed by a seasonal shift of low flows from autumn to midwinter. The changes in the hydrological regime of this mid-mountain basin indicate the sensitivity of its hydrological system and the complexity of its feedbacks with the changing environment.


Introduction
In montane regions, both long-term and abrupt changes of climate and/or land cover may result in significant shifts in the hydrological regime, especially in relation to extreme hydrological events such as floods and droughts.
In past decades, climate change, including changes in precipitation, temperature, vapor pressure, and wind speed, has directly or indirectly altered hydrological regimes [1][2][3][4], and studies that identify the linkage between warmer air temperatures and the occurrence of extreme hydrological events or basins' water yield have drawn decision-makers' attention to natural resource management [5][6][7][8].
Studies have consistently stated that changes in land cover/land use and climatic changes significantly govern hydrological regimes (i.e., pattern, magnitude, frequency, timing, duration, and rate of change) [9][10][11][12][13][14]. Forest disturbance, as one of the causes driving severe land cover change, has major impacts on interception, evapotranspiration, surface soil hydraulic conductivity, and soil storage, which may lead to changes in the water yield [15,16], the runoff formation process [17,18], snow hydrology [19,20], floods [9,21], and the low-flow regime [22,23]. The effects of different forest disturbances caused or triggered by wildfire, insect infestation, windstorm, logging, pollution, urbanization, agricultural activities, and management interventions on stream flow have been widely studied at multiple temporal and spatial scales [24][25][26][27][28]. Appropriate environmental policy regarding basin management requires an integrated understanding of the hydrological responses to both climatic change and changes in forest land cover, especially in montane areas, which are highly vulnerable to these changes [25,29].
The headwater of the Sumava Mountains, located in Central Europe at the border between the Czech Republic and Germany, has undergone a significant forest disturbance in the past two decades as a result of repeated windstorms and bark beetle infestation. In the core zone of the disturbance, the upper Vydra basin, the extent of forest decay reached almost 60% of the basin [28]. Simultaneously, this area is experiencing significant increases in observed air temperatures. As a national park with restricted management, the area serves as a natural laboratory, enabling the study of the effects of these environmental changes on hydrological processes in the mid-mountain environment.
This paper aims to assess the hydro-climatic changes in the upper Vydra basin in the period 1961-2010, covering the observed changes in air temperature as well as the forest disturbance. The objectives of the study were the following: (i) to analyze the effect of rising air temperatures and extensive forest disturbance on the hydrological response of the mid-mountain basin, and (ii) to assess the hydro-climatic indicators that are suitable for detection of such changes and their extent and timing.
The study applies a set of methods to analyze various aspects of hydro-climatic variability, including baseflow separation by a recursive digital filter, CUSUM analysis, analysis of changes in discharge variability and seasonality with a Mann-Kendall test, analysis of the changing frequency and seasonality of peak flows and low flows, Buishand's and Pettitt's homogeneity tests to detect past regime change, and Kriging.

Study Area
In this study, the upper Vydra, which covers 90.1 km² (Figure 1), is defined as the upper part of the Vydra basin, ending at the Modrava gauging station (49°1′30.0216′′ N, 13°29′47.1624′′ E). The studied basin is a headwater located at the top of the Sumava Mountain range, with an average altitude of 1112 m (Table 1), in the southwestern part of the Czech Republic, along its border with Germany in Central Europe. The bedrock consists mainly of gneisses with locally permeating granitic rocks [3]. The climate of the study area has typical montane features, with moderately warm, distinct summer seasons and relatively high precipitation (1378 mm) compared to the rest of the Czech Republic (700 mm) [30,31]. Approximately 40% of the precipitation falls as snow, and the snow cover lasts an average of 143 days per year [32]. The basin is dominated by small streams with a fast hydrological response to precipitation events, and the basin's annual mean discharge is 3.34 m³·s⁻¹ (i.e., 1175 mm). The land cover in the Sumava Mountains was originally dominated by virgin forest, which was replaced in the 18th century by a Norway spruce (Picea abies [L.] Karst.) monoculture for the wood industry [33]. Before the 1980s, 86% of the land cover in the upper Vydra basin consisted of coniferous forest and was stable. Bark beetle outbreaks in the upper Vydra basin started after windstorms in the Bavarian Forest in 1984, reached their peak in the mid-1990s [26], and accelerated again after the windstorms Kyrill and Emma in 2007 and 2008 [28,30]. Between 1984 and 2010, the healthy forested area decreased from 48% of the basin area to 13%. Substantial change is apparent in the heavy forms of forest damage: the categories of damaged and decayed forest rose from 1% in 1985 to 33.1% in 2010, with a clear acceleration after the last windstorms in 2007 and 2008. A total of 40.9% of the catchment area is classified as unforested, including a marginal and stable share of deciduous forest cover (Figure 2). The affected areas were left as non-intervention zones as part of the Sumava National Park management strategy, and the bottom layer of vegetation there is undergoing a quick recovery.

Data Sources
Daily and monthly hydro-climatic long-term observations of precipitation, air temperature, and discharge were obtained from various Czech Hydrometeorological Institute (CHMI) stations [34], and the period of 1961-2010 was used for the analysis. The air temperature for the upper Vydra basin was estimated according to the elevation gradient (a temperature lapse rate of 0.6 °C per 100 m of elevation) using the observations from three stations: Filipova Hut, Churáňov, and Kasperske Hory.
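The elevation-based temperature estimate described above can be sketched as follows; the function and the example values are illustrative, not the authors' actual processing code, and the station elevation is taken from the Churáňov value given later in the text:

```python
def basin_temperature(station_temp_c, station_elev_m, basin_mean_elev_m,
                      lapse_rate_c_per_100m=0.6):
    """Extrapolate a station air temperature to the basin mean elevation
    using a constant lapse rate of 0.6 degC per 100 m, as in the study."""
    elev_diff_m = basin_mean_elev_m - station_elev_m
    return station_temp_c - lapse_rate_c_per_100m * elev_diff_m / 100.0

# Illustrative reading: 5.0 degC at 1122 m (the Churanov elevation),
# extrapolated to the basin mean elevation of 1112 m.
t_basin = basin_temperature(5.0, 1122, 1112)
```

In practice the basin estimate would combine all three stations, for instance by averaging the individually extrapolated values.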
Precipitation for the upper Vydra basin was estimated using orographic regression based on the six neighboring CHMI precipitation stations at Filipova Hut, Kvilda, Horská Kvilda, Churáňov, Borová Lada, and Srní. These stations were complemented by six monthly totalizer rain gauges operated by the National Park Bayerischer Wald, situated on the Czech-German border to capture the uppermost precipitation regime. The resulting mean precipitation lapse rate was 9.5 mm per 100 m of elevation. Monthly data from the Filipova Hut station located within the Vydra catchment were orographically corrected using weighted elevation categories representing 100 m increments in elevation, determined from a digital elevation model (DEM). The changes in snow cover were assessed using monthly data on mean snow cover depth and days with snow cover from the Churáňov station in the period of 1961-2010 [35].
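The weighted elevation-band correction can be sketched as below; the band areas and the station total are hypothetical, while the mean lapse rate of 9.5 mm per 100 m is taken from the text:

```python
def orographic_precip(station_precip_mm, station_elev_m, area_by_band,
                      lapse_mm_per_100m=9.5):
    """Correct a station precipitation total to the basin mean using
    DEM-derived 100 m elevation bands weighted by their areas.
    area_by_band maps band mid-elevation (m) to band area (any unit)."""
    total_area = sum(area_by_band.values())
    weighted = sum(
        area * (station_precip_mm
                + lapse_mm_per_100m * (elev - station_elev_m) / 100.0)
        for elev, area in area_by_band.items())
    return weighted / total_area

# Hypothetical basin with two equal-area bands around a station at 1100 m.
p_basin = orographic_precip(100.0, 1100, {1000: 1.0, 1200: 1.0})
```

With equal areas above and below the station, the corrections cancel and the basin mean equals the station total; unequal band areas shift the estimate up or down the lapse line.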
The daily discharges were measured at the outlet of the upper Vydra basin, at the Modrava station (Figure 1). The topographical information, a 5 m × 5 m DEM, was acquired from the State Administration of Land Surveying and Cadastre [36,37]. The topographic layers from the digital water management database DIBAVOD, including basin delineation [38] and stream typology [39], were used in this study. Data on forest cover changes were derived from the map layers of defoliation and mortality of forests [40].

Applied Methods
A combined approach of statistical and analytical methods was applied, including double-mass curve analysis, a recursive digital filter, the Mann-Kendall test, CUSUM analysis, Buishand's and Pettitt's homogeneity tests, and ordinary Kriging.
The double-mass curve was used to illustrate the precipitation-discharge relationship in the upper Vydra basin [3]. A non-parametric Mann-Kendall test was applied to detect trends in the precipitation and discharge data at different temporal scales: monthly, seasonal, and yearly [41,42]. The test was performed by either accepting or rejecting the null hypothesis that no trend existed at the assessed stations, and two levels of significance of the alternative hypothesis (p ≤ 0.01 and p ≤ 0.05) were considered to distinguish the strength of the trend.
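As a minimal, pure-Python sketch of the trend test described above (normal approximation with continuity correction; a production analysis would also correct the variance for ties):

```python
import math

def mann_kendall(x):
    """Mann-Kendall trend test. Returns (S statistic, Z score,
    two-sided p-value). Tie correction of the variance is omitted."""
    n = len(x)
    # S: number of increasing pairs minus number of decreasing pairs
    s = sum((x[j] > x[i]) - (x[j] < x[i])
            for i in range(n - 1) for j in range(i + 1, n))
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided normal tail
    return s, z, p
```

A trend is then flagged when p falls below the chosen significance level, e.g. `p <= 0.05` for the weaker and `p <= 0.01` for the stronger of the two levels used in the study.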
A two-parameter recursive digital filter [43] was applied to separate the direct flow and baseflow signals present in the continuous daily discharge data. The applied algorithm partitions the streamflow hydrograph into two components: a "high frequency" component corresponding to direct runoff and a "low frequency" component corresponding to baseflow [44]. The baseflow index (BFI) was calculated as the ratio of baseflow to total streamflow [44]. The baseflow separation is calculated according to the relationship proposed by Eckhardt [43] (Equation 1):

Bk+1 = ((1 − BFImax) · α · Bk + (1 − α) · BFImax · Qk+1) / (1 − α · BFImax)    (1)

where Bk is baseflow at time step k, Qk+1 is streamflow at time step k + 1, α is the baseflow filter parameter, and BFImax is the maximum value of the baseflow index attainable by the filter. For this study, the baseflow filter parameter α was set to 0.98 and BFImax to 0.80 in the web-based WHAT model [45], corresponding to perennial streams with porous aquifers [43]. The analyses of yearly changes of the direct flow and baseflow components were based on median yearly values of the indices calculated from daily values.
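Equation 1 can be implemented in a few lines. The sketch below initialises the filter with BFImax times the first discharge value and constrains baseflow not to exceed total streamflow; both are common conventions rather than details stated in the text:

```python
def eckhardt_baseflow(q, alpha=0.98, bfi_max=0.80):
    """Two-parameter recursive digital filter (Eckhardt) with the
    parameter values used in the study. q is a daily streamflow series;
    returns the baseflow series of equal length."""
    b = [bfi_max * q[0]]  # initialisation choice (assumption)
    for k in range(len(q) - 1):
        b_next = ((1.0 - bfi_max) * alpha * b[k]
                  + (1.0 - alpha) * bfi_max * q[k + 1]) / (1.0 - alpha * bfi_max)
        b.append(min(b_next, q[k + 1]))  # baseflow cannot exceed streamflow
    return b

def baseflow_index(q, b):
    """BFI: ratio of total baseflow to total streamflow."""
    return sum(b) / sum(q)
```

For a steady flow the filter converges to exactly BFImax times the discharge, so the computed BFI approaches the 0.80 ceiling; real hydrographs with flood peaks yield lower values.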
CUSUM analysis (i.e., cumulative sum [46,47]) was employed to assess the course of cumulative discharge differences relative to the mean discharge of the whole assessed period. The cumulative sums are calculated as the accumulated differences from a constant target value, in this case the mean value of the applied time series (Equation 2):

Si = Σ (xj − c), summed over j = 1, …, i    (2)

where Si is the cumulative sum at time i, xj is the value at time j, and c is the constant target value (here the long-term mean).
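Equation 2 with the target value set to the series mean can be sketched as:

```python
def cusum(x):
    """Cumulative sums of deviations from the series mean (Equation 2
    with the target value c set to the long-term mean)."""
    c = sum(x) / len(x)
    sums, running = [], 0.0
    for xj in x:
        running += xj - c
        sums.append(running)
    return sums
```

By construction the final cumulative sum returns to zero when c is the series mean; sustained negative or positive runs in between mark dry and wet phases relative to the long-term average.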
The daily discharge (Qd) was used to define the low flows (Q330d), the high flows (Q30d), and the standard deviation, which were applied as measures of intra-annual runoff variability. The peak flow over threshold (POT) value was set as the minimum value of the yearly maximum discharge in the reference period 1961-2010 (a POT threshold of 12.2 m³·s⁻¹), which secured the selection of at least one event per year in the time series. The low-flow events under a given threshold (LOF) were identified according to the long-term Q330d value (an LOF threshold of 1.02 m³·s⁻¹). The analysis of frequency, duration, and magnitude of annual POT events was performed on the resulting dataset.
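Selecting threshold days and merging consecutive days into independent events, as needed for the frequency and duration statistics, can be sketched as follows (the thresholds are the study's values; the grouping rule, treating any gap as a new event, is an assumption):

```python
POT_THRESHOLD = 12.2  # m3/s, minimum yearly maximum discharge, 1961-2010
LOF_THRESHOLD = 1.02  # m3/s, long-term Q330d

def threshold_days(q_daily, pot=POT_THRESHOLD, lof=LOF_THRESHOLD):
    """Return indices of days above the POT threshold and below the
    LOF threshold."""
    pot_days = [i for i, q in enumerate(q_daily) if q > pot]
    lof_days = [i for i, q in enumerate(q_daily) if q < lof]
    return pot_days, lof_days

def group_events(day_indices):
    """Merge consecutive day indices into independent events, so that
    per-event frequency and duration statistics can be computed."""
    events = []
    for i in day_indices:
        if events and i == events[-1][-1] + 1:
            events[-1].append(i)  # extend the current run
        else:
            events.append([i])    # start a new event
    return events
```

Event counts per year and mean event duration then follow directly from the lengths of the grouped runs.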
The analysis of changes in frequency and seasonality of POT and LOF events is based on gridding the frequency of event occurrence as a function of time, expressed by year, and seasonality, expressed by day of year (DoY). For gridding, the data were aggregated into periods of ten years along the time axis and into periods of 90 days along the seasonal axis. Ordinary Kriging was used as the interpolation method to derive the patterns of frequency distribution across time and seasonality, as well as for the interpolation of snow cover properties.
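The aggregation step preceding the Kriging interpolation can be sketched as a decade-by-season frequency grid; the bin widths follow the text (ten years, 90 days), while the event list in the example is hypothetical:

```python
from collections import Counter

def bin_occurrences(events, year_bin=10, doy_bin=90):
    """Aggregate (year, day-of-year) event occurrences into a
    decade x season frequency grid, as done prior to interpolation.
    Returns a Counter keyed by (decade start year, season index)."""
    grid = Counter()
    for year, doy in events:
        grid[(year - year % year_bin, (doy - 1) // doy_bin)] += 1
    return grid

# Hypothetical events: two in the 1960s (seasons 0 and 1), one in the 1970s.
grid = bin_occurrences([(1961, 45), (1965, 100), (1972, 45)])
```

The resulting grid node values are what an interpolator such as ordinary Kriging would then smooth into a continuous time-seasonality surface.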
The homogeneity of the time series based on mean yearly values was tested by Buishand's and Pettitt's tests [48,49] to detect potential points of change in the assessed time series. The non-parametric Pettitt's test is a modified Mann-Whitney test that, similar to Buishand's range test, allows for identification of the time at which the shift occurs. Both tests are suitable for any type of distribution with an unknown position of the point of change [50]. The selection of these tests was based on indications of their good performance on hydrological time series [49] and their results in locating points of change despite different distributions and origins of the time series [49,50]. The applied tests tend to be more sensitive to breaks in the middle of a time series [51]. In both tests, the null hypothesis (Ho: the data are homogeneous) and an alternative hypothesis (Ha: there is a date at which there is a change in the data) were tested. The p-value was computed using 10,000 Monte Carlo simulations, and the significance level was set to 0.05 (α = 0.05, i.e., a 5% risk of rejecting the null hypothesis when it is true). Despite their different natures, their applications to various types of hydro-climatic time series have reported comparable results [52]. In this study, both tests were used to keep potential variability in the assessment.
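A minimal version of Pettitt's test, using the common closed-form p-value approximation rather than the Monte Carlo estimate used in the study, can be sketched as:

```python
import math

def pettitt(x):
    """Pettitt change-point test. Returns (index at which the second
    segment begins, K statistic, approximate two-sided p-value).
    Brute-force evaluation, adequate for yearly series."""
    n = len(x)
    sgn = lambda a: (a > 0) - (a < 0)
    # U_t compares the ranks of the first t+1 values with the rest
    u = [sum(sgn(x[i] - x[j])
             for i in range(t + 1) for j in range(t + 1, n))
         for t in range(n - 1)]
    t_max = max(range(n - 1), key=lambda t: abs(u[t]))
    k = abs(u[t_max])
    p = 2.0 * math.exp(-6.0 * k * k / (n ** 3 + n ** 2))
    return t_max + 1, k, min(p, 1.0)
```

For a series with an abrupt shift in the mean, the maximum of |U_t| falls at the break, and the approximate p-value drops well below 0.05.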
The detected change points were then applied as split points for the calculation of the trend lines appearing in the graphs. In cases where different years of change were detected, the year resulting from the test with the lower p-value was selected as the split point.
The statistical software packages Knime 2.11 (KNIME.com, Zurich, Switzerland), XLStat 2014 (Addinsoft, Paris, France), and Surfer 10 (Golden Software, LLC, Golden, CO, USA) were used for statistical calculations. Grapher 10 (Golden Software, LLC, Golden, CO, USA) was used for calculations of trend lines and running averages and for plotting, and ArcGIS 10 (Environmental Systems Research Institute, Inc., Redlands, CA, USA) was used for GIS analysis and spatial data integration.

Variability of Climatic Conditions
Notable changes were identified in the climatic conditions considered to be drivers of runoff. First, an increase of the annual air temperature by 1.5 °C has been detected since the 1980s. A continuous increase in observed air temperatures is apparent over the whole assessed period, with notable acceleration since the late 1980s (Figure 3).
The observed annual total precipitation fluctuated slightly between 1100 and 1300 mm during 1961-2010, with a slight increase since the mid-1990s. However, since the 1980s, there has been an apparent change in the structure of precipitation. In particular, snow precipitation has diminished. During the 1980s, there was a significant drop in the average snow depth in the area and in the total number of days with snow cover. After those break points, both indicators have been slightly increasing again (Figure 3). The changes in snow cover depth and duration are gentle but apparent. The snow season is becoming shorter at both ends: the snowpack accumulation starts later and the snowmelt occurs earlier (Figure 4).

Homogeneity and Change Points in Time Series
Homogeneity analysis was applied to the time series of observed and calculated data to identify the significant turning points in the trends of the assessed indicators, which could suggest potential relations to the drivers of change.The results are very similar in terms of the timing of the identified turning points, with only minor differences in levels of significance.
Only for a subset of the tested indicators was a point of change identified with statistical significance, i.e., with calculated p-values smaller than 0.05 (Table 2). The identified change points were used as breaks for the calculation of linear trends in the line graphs. As in some cases the tests identified different years of change, the result of the test with the lower p-value was selected as the breakpoint for the calculation of the trend lines. Among the indicators for which the inhomogeneity is statistically significant, we can distinguish several time periods between which points of change were detected.
There are two indicators for which inhomogeneity was found in the mid-1960s (decreasing number of LOF days since 1964) and mid-1970s (rising share of the spring season in runoff balance since 1974).Changes in the trends of these indicators thus cannot be related either to the effect of the rising air temperatures since the 1980s or extensive forest disturbance in the 1990s and are more likely to be attributed to the large fluctuations in total precipitation and runoff occurring in the 1960s (Figure 3).
Most of the indicators have a change point located in the 1980s, when the increase of air temperatures became apparent in the observed data (Figure 3). These changes comprise the decreasing share of the fall season in runoff (1981), the increase in the standard deviation of daily discharge values (1985), the decrease in the mean snow depth (1988), the decrease in direct runoff values (1988), and the increase in the baseflow index since 1990.
Only two indicators have change points detected after the period of the extensive forest disturbance, which peaked in the early 1990s. These are the increasing number of POT events since 1994 and the positive values of the cumulative difference of the daily discharge values compared to the long-term average (CUSUM) since 2001.
Besides the above listed indicators, the other tested time series were classified as homogeneous, without a significant break in the 1961-2010 period.

Changes in Runoff Balance and Variability
Changes in trends are apparent for the indices of intra-annual variability of discharge, namely the high flows (Q30d), the low flows (Q330d), and the annual standard deviation of daily discharge (Figure 6). Specifically, the yearly Q330d values expressing the low flows show a decrease since the 1980s, after an increase in the preceding decades. The trend of yearly Q30d values, representing the high flows, runs in the opposite direction. A slight decline before the 1980s is followed by a significant increase since the break point in the 1980s, with large variations ranging from approximately 5 m³·s⁻¹ to 12 m³·s⁻¹. The rising standard deviation of daily discharge values demonstrates the rising intra-annual volatility of the runoff process since the 1980s.
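The M-day discharge indices used above can be computed from a year of daily flows by ranking; a minimal sketch with synthetic data:

```python
def m_day_discharge(q_daily_year, n_days):
    """Discharge equalled or exceeded on n_days of the year:
    n_days=30 gives the high-flow index Q30d, n_days=330 the
    low-flow index Q330d."""
    ranked = sorted(q_daily_year, reverse=True)
    return ranked[n_days - 1]
```

Applied year by year, these indices produce the annual Q30d and Q330d series whose diverging trends are described in the text.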
The cumulative analysis of differences in daily discharge from its long-term average (CUSUM, Figure 7) indicates repeated, large fluctuations between the 1960s and the 1970s. The accumulated differences continuously turned from negative towards positive values in the 1980s, with a significant reduction in variability and a steep increase in positive values in the late 1990s (Figure 7).

Changes of Seasonal Runoff Distribution
Despite the stable rainfall-runoff balance, indicated by the double mass curve of cumulated precipitation and runoff values (Figure 5), a significant change was detected in the seasonal distribution of these parameters. The Mann-Kendall test was based on monthly values of precipitation at the neighboring precipitation stations Kvilda (1059 m a.s.l.), Churáňov (1122 m a.s.l.), and Filipova Hut (1102 m a.s.l.) and on discharge values at the basin outlet at Modrava (Table 3). The division into seasons reflects the hydrological year, which starts in this region in November.
Precipitation shows a significant rising trend (p ≤ 0.05) in winter (months XI-I), while large declines in precipitation (p ≤ 0.05) were found at two stations in early spring (II-IV) and fall (VIII-X), significant in April only. The Mann-Kendall test detected a rising trend in the basin's discharge in March (p ≤ 0.05) and April (p ≤ 0.01), followed by insignificant decreases in the following months. Nevertheless, the increment is caused not by precipitation but by earlier snowmelt as a consequence of higher temperatures in April. Substantial shifts in the seasonal distribution of runoff are apparent, as shown by the results of the Mann-Kendall test (Figure 8). First, there is a continuous decline in the share of the fall period prior to snowpack accumulation (VIII-X, Figure 8a). The homogeneity tests (Table 2) indicated a turning point in 1981, corresponding to the period of rising air temperatures. The second significant change is the rising share of the hydrological spring season (II-IV, Figure 8a), with the key point of change in 1974 (Table 2) but a slight decline since the 1990s. The share of hydrological winter (XI-I) has been in apparent decline since the late 1990s, which coincides with the period of apparent diminishing of snow cover in terms of days with snow and average snow depth (Figure 4). In contrast, the late spring season (V-VII) has apparently been on the rise since the 1980s (Figure 8b), but this result is statistically insignificant (Table 2).

Variability and Seasonality of Peak Flows and Low Flows
The peak flow (POT) and low flow (LOF) events were evaluated within the assessed period in order to understand the frequency of extreme flow events and the seasonality of their occurrences.
Concerning the regime of the POT events, the average number of days with POT was approximately 6 per year (Figure 9a). Before the 1980s, the number of days with POT was in sharp decline; afterwards, it followed a continuously rising trend. The rate of POT events increased from approximately 4 events per year (1961-1980) to 5 events per year (1981-2010), and after 2000 the frequency almost doubled to 7 events compared to the average rate during 1961-1980. Furthermore, the average duration of POT events slightly decreased over time (from 2.5 days in 1961-1980 to 2.2 days in 1981-2010; supplementary material, Figure S1), although the average frequency of POT increased. Such a change in the peak flow regime implies that the increasing number of days with POT in the last 30 years was generated by a rising number of short peak flow events.
The changes in peak flow frequency have a counterpart in shifts in the seasonality of their occurrence. The pattern of peak flows is indicated in Figure 9b. The majority of POT events were related to late winter (around Day of Year (DoY) 90, in March) and the spring snow-melting season (around DoY 120 in April and DoY 150 in May). However, there has been a clear shift in seasonality from mid-spring (around DoY 150) towards early spring (around DoY 90-120) since the 1980s (Figure 9b). Specifically, in 1961-1970, there were 98 days with POT recorded in May, accounting for 64% of the total of 136 POT days. Since the 1980s, the shift in seasonality has been distinctly apparent: during 2001-2010, the POT days in May accounted for only 20% of a total of 161 POT days. The POT events thus tend to occur earlier in the spring season, which corresponds with findings on changes in the seasonal distribution of air temperature.
Substantial changes in the low-flow regime have also been found (Figure 9a). In the 1960s, there were 42 LOF events with a total duration of 527 days. During the 1970s, there were only 11 LOF events with a total duration of 138 days, and in the 1980s only 11 LOF events with a total duration of 57 days (supplementary material, Figure S2). However, since the end of the 1990s, there has been an apparent increase in the frequency of low-flow events: in 2001-2010, 27 LOF events were detected, lasting a total of 250 days.
Two trends have been observed in the duration of LOF events (supplementary material, Figure S2).First, there has been a decrease in the LOF event duration from an average of 14 days in the 1960s to 6 days in the 1980s; secondly, there has been an apparent growth after 1990 in both the frequency and the average duration of LOF periods.Since 2000, the average duration of LOF events rose to approximately 12 days (Figure 9a), and this decade includes the third-longest continuous LOF period of 77 days, recorded from January to March 2006.
The LOF occurrence has undergone a significant seasonal shift from autumn to mid-winter during 1960-2010 (Figure 9c).The three periods of LOF in the 1960s were typically distributed in early (DoY 330) and late (DoY 30) winters, and late summers (around DoY 270); this LOF pattern has disappeared since the 1980s.After the 1990s, a new seasonal distribution pattern of low flows was established, concentrated in a single period during early winter (DoY 30), and the frequency of LOF was lower than in the 1960s.

Discussion
The observed trends and variations of runoff response in the assessed basin can be discussed in view of the key driving forces altering the hydrological processes in the area, which are climate change and forest disturbance, as well as the effect of the spatial scale of observation, which is vital for understanding and interpreting the observed changes.
Both of the driving processes (the rising air temperatures and forest disturbance) act in the same area and in overlapping time periods but are of distinctly different scopes. Although climate change is a rapid process when regarded from a long-term perspective, the rising air temperatures at the local level are detected as gradual changes with multiple fluctuations. The effect on the indices affected by the transient climate is thus also progressive, with a spatially extensive impact, long-term follow-up, and potential shifts in timing. Compared to this, forest disturbance, initiated by the windstorms and consequent bark beetle infestation, was a rapid process that abruptly changed the conditions affecting runoff processes in large segments of the basin. The effect of such change is swift, but its extent is local, because of rapid vegetation regeneration and the limited time. These differences should be taken into account when interpreting and relating the changes in hydro-climatic indices to the potential factors of change.
The analysis of the observed long-term climate parameters showed a trend of increasing air temperature in the study region since the 1980s (Figure 3). The warming climate has apparently not altered the upper Vydra basin's annual water balance, based on the double mass curve (Figure 5) and the annual mean discharge (Table 2); however, the changes detected in runoff seasonality and variability are significant.
The seasonal and monthly precipitation and runoff trends were detected by the Mann-Kendall test (Table 3), which found that precipitation increased in winter (months XI-I) and decreased in spring, especially in April. However, runoff in April experienced an increasing trend because the seasonal runoff share substantially shifted from summer to spring (Figure 8). Such seasonal runoff shifts can be attributed to the increase in air temperature, especially for April, as Bässler [53] reported, although the detected change in the spring runoff share is located in 1974, corresponding to the period before the warming became apparent (Table 2). Warmer air in late winter triggers the snow-melting process earlier, and the earlier melt could also explain the inconsistency in the precipitation-runoff relation in April and reconfirm the impact of the increasing temperature on the early spring hydrological regime. The decrease in direct runoff (Figure 6) begins in 1988 (Table 2), at the same time as the observed increase in air temperature.
Forest disturbance changes the functionality of the vegetation as well as the land cover, which can alter the runoff generation process and the runoff regime. The study area is covered by a coniferous forest, whose canopy can efficiently intercept precipitation, especially in the form of snow. The intercepted rainwater or snow directly controls water losses through sublimation, governed by radiation fluxes [54,55]. Therefore, during a snow-melting period, forest decline or forest disturbance can lead to an increase in the frequency and magnitude of high flows [18,55]. In the studied basin, a similar winter runoff regime was detected after the forest disturbance, resulting in an increase in POT days and shifts in the seasonal water share in the same periods (Figure 9a and Table 2).
A disturbed forest can decrease water losses from vegetation evapotranspiration and increase groundwater storage [56]. However, these two terms are the most uncertain components of the water balance because they are difficult to measure and quantify in space and time. The impacts of forest canopy loss after insect-induced disturbance, namely a significant decrease in evaporation and increases in soil moisture, groundwater storage, and groundwater recharge, have been identified by studies at the stand level as well as at the watershed level [57,58]. Bearup et al. [59] used a three-component isotope hydrograph separation technique and identified a significant increase in the runoff contribution from groundwater storage in the growing season; however, they concluded that the mechanism of the basin-scale effect of bark beetle infestation on the hydrological cycle still remains relatively unknown.
The rising trend of the baseflow index obtained from the two-parameter recursive digital filter hydrograph separation (Figure 6) began in 1990 (Table 2), which coincides with the period when the forest disturbance occurred in the basin and also overlaps with the period of the temperature increase (Table 2). As the forest was disturbed in this period, the transpiration loss is expected to decrease. However, the damaged canopy results in larger open areas with direct radiation on the soil surface, which increases evaporation from the soil. The higher soil evaporation decreases the water storage capacity of the surface soil. Therefore, evapotranspiration, as one of the water balance components, is influenced by two counteracting effects. The baseflow is governed not by the groundwater intake but by the top soil moisture. The increase in the BFI may be attributed to the loss of transpiration from forest cover resulting from the extensive forest disturbance since the mid-1990s.
Differing changes in high flows and low flows in partially or completely deforested basins have been found in many studies. It has been claimed that high flows may be enhanced by deforestation or forest cover decline [14,16,21,60], while other studies suggest that high flows may be reduced by reforestation [56,61]. In the studied basin, along with the severe decline of the forest cover induced by bark beetle infestation from 1980 to 2010, apparent changes in high flows were found in many aspects: more days and events with POT, a higher frequency of short high-flow episodes, and larger variability of POT event duration (Figure 9a and supplementary material, Figure S1).
The homogeneity analysis of the assessed indicators of runoff response determined that the change points in most of the time series fall in the 1980s, excluding peak and low-flow occurrence and CUSUM values (Table 2). This coincidence of break points may suggest a decisive role of increasing air temperature in the detected changes in seasonality and intra-annual variability of runoff response, compared to the effect of the forest disturbance. However, the impact of forest cover changes on runoff response is quite complex and should not be underestimated. The spatial scale of the assessment is one of the key factors affecting interpretation and understanding of the underlying processes. A number of studies have analyzed the impact of forest harvesting and clear-cutting on annual runoff, and most agree that the effect on runoff diminishes with increasing basin size [9,25]. At different scales of observation, individual driving processes and factors may be more or less important. In small, homogeneous catchments, the effect of forces acting at larger scales can be overridden by local factors. For instance, a study of runoff changes at three experimental micro-catchments in the Bohemian Massif [62] attributed the diverging trends in runoff response in the period 1990-2007 to highly divergent physiographic conditions rather than to forest disturbance. Focusing on such fine spatial detail may result in missing the "big picture." By contrast, in complex and heterogeneous basins with many drivers, effects often counteract each other and may hinder the identification of the leading drivers of change.

Conclusions
This study analyzed hydro-climatic changes in a montane mid-latitude environment, using the example of the mid-montane upper Vydra basin in the Sumava Mountains, Central Europe. The study area has undergone a significant increase in observed air temperatures since the 1980s and extensive forest disturbance and decay since the 1990s.
The observed changes in hydrological response are complex and are apparent in different aspects of the runoff regime. The runoff balance experienced no apparent change over the assessed period; however, diverging trends in the baseflow and direct-runoff shares of the total flow were detected. The increase in the baseflow index since the 1980s, and its further acceleration in the 2000s, has a counterpart in the increase of intra-annual runoff variability on the same time scale. Furthermore, substantial seasonal shifts in the share of total runoff from summer (decreasing from nearly 40% to 28%) to spring (with a 10% increase) were detected, as well as changes in the distribution and frequency of peak and low flows. The occurrence of peak flow events doubled after the 1980s, with a shift in frequency from late spring towards early spring/late winter. The occurrence of low-flow days shifted from autumn to mid-winter and decreased (from 60 days per year down to 20), followed by an increase in both the number of days and their duration since 1990.
The extent of the changes and the correspondence of their timing with the observed air temperature increase and forest disturbance imply a relationship between the identified changes and the changing environment. Despite the unchanged annual runoff balance, the substantial shifts in runoff variability and seasonal distribution, as well as the changes in the frequency and seasonal patterns of peak and low flows, are notable, indicating substantial changes in the hydrological behavior of the basin. The environment of the mid-latitude montane basin proved to be very sensitive to climate change as well as to changes in land cover. The extent of the changes, apparent at the basin scale, is also vital information for efficient water management and conservation of montane catchments.

Figure 1. Study area: location of the upper Vydra basin with the river network, gauging stations, and meteorological stations.

Figure 2. Key stages of the forest disturbance progression in the upper Vydra basin and the change in forest status between 1984 and 2011.

Figure 4. Interpolated values of the number of days with snow cover and the average snow depth, by year and month, at the Churáňov monitoring station during 1961-2010.
Homogeneity testing separated the indicators into two distinct groups. For the indicators where a change point was detected, the calculated p-values lie orders of magnitude below the given significance threshold. The indicators of runoff variability based on daily discharge values are the only group whose calculated p-values approach the threshold level from both sides.
Double-mass curves of accumulated monthly values of discharge (Q) and precipitation (P) during 1961-2010 (Figure 5) show no apparent change in the rainfall-runoff relationship. The analysis of cumulative values of these two key hydro-climatic variables indicated no change either in the 1980s, when the air temperature rose significantly, or in the 1990s, when intensive forest disturbance occurred in the basin.
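The double-mass approach can be sketched in a few lines. The monthly P and Q series below are synthetic, and the 0.7 runoff ratio is an arbitrary assumption chosen only to illustrate how a stable rainfall-runoff relationship yields a straight double-mass curve with matching slopes over sub-periods.

```python
import numpy as np

# Hypothetical monthly precipitation P and discharge Q (mm), 50 years
rng = np.random.default_rng(1)
p_mm = rng.uniform(40.0, 120.0, size=600)
q_mm = 0.7 * p_mm + rng.normal(0.0, 5.0, size=600)  # stable runoff ratio

# Double-mass curve: cumulative Q plotted against cumulative P.
# A stable rainfall-runoff relationship plots as a straight line;
# a break in slope marks a change in that relationship.
cum_p = np.cumsum(p_mm)
cum_q = np.cumsum(q_mm)

# Compare the slope over the first and second halves of the record
half = len(p_mm) // 2
slope_1 = cum_q[half] / cum_p[half]
slope_2 = (cum_q[-1] - cum_q[half]) / (cum_p[-1] - cum_p[half])
```

With an unchanged rainfall-runoff relationship, the two slopes remain close to the underlying runoff ratio, mirroring the absence of a break in Figure 5.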

Figure 5. Double-mass curve of accumulated monthly values of discharge (Q) at the Vydra-Modrava station and precipitation (P) at the Filipova Hut station during 1961-2010, with two magnified periods, the 1980s and the 1990s.

Figure 6. Annual mean values of the direct flow and baseflow index (BFI), high flow (Q30d) and low flow (Q330d), mean annual discharge (Qa), and its standard deviation in the period 1961-2010. The trend lines are based on results from Table 2.

Figure 7. CUSUM analysis of daily discharge (Qd) during 1961-2010. The trend lines are based on results from Table 2. Legend: blue, increasing trend; red, decreasing trend; a darker color indicates a more significant trend.
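The CUSUM curve shown in Figure 7 is, in essence, the cumulative sum of departures of daily discharge from its long-term mean, in which a shift in the mean appears as a turning point. A minimal sketch with a synthetic series (the shift timing and magnitude are illustrative assumptions):

```python
import numpy as np

def cusum(x):
    """Cumulative sums of departures from the series mean.

    A shift in the mean of the series shows up as a turning point in
    the curve; the curve returns to zero at the end by construction.
    """
    x = np.asarray(x, dtype=float)
    return np.cumsum(x - x.mean())

# Synthetic daily discharge with a mean shift halfway through
rng = np.random.default_rng(2)
q_daily = np.concatenate([rng.normal(5.0, 1.0, 500),
                          rng.normal(6.0, 1.0, 500)])
curve = cusum(q_daily)
turning_point = int(np.abs(curve).argmax())
```

The extremum of the curve falls near the shift in the mean, which is how a break point is read off a CUSUM plot.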

Figure 8. Changes in the seasonal distribution of runoff in the period 1961-2010, expressed as shares of the total yearly balance: (a) shares of winter (XI-I) and spring (II-V); and (b) shares of summer (V-VII) and fall (VIII-X). The trend lines are based on results from Table 2.

Figure 9. Patterns of frequency and seasonality of peak and low flows during 1961-2010: (a) occurrence of peak flow events; (b) changes in frequency and seasonality of days with POT; and (c) changes in frequency and seasonality of days with LOF. The trend lines are based on results from Table 2. Details of the durations and onsets of POT and LOF events are given in the Supplementary Materials, Figures S1 and S2.

Table 1. Physiographic characteristics of the upper Vydra basin.

Table 2. Homogeneity tests and detection of change points in the assessed hydro-climatic indicators using Buishand's and Pettitt's tests. Indicators with a p-value lower than 0.05 are marked in bold; mean1 and mean2 are the mean values of the indicator in the periods before and after the change point. The years of change with the lower p-values resulting from both tests, applied as breakpoints for the calculation of trend lines, are underlined.

Table 3. Mann-Kendall test of seasonal and monthly precipitation (P) and discharge (Q) trends during 1961-2010.
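The Mann-Kendall test reported in Table 3 can be sketched as follows. This is a simplified illustration without the tie correction, applied to a synthetic trended series; the trend slope and noise level are arbitrary assumptions.

```python
import math
import numpy as np

def mann_kendall(x):
    """Mann-Kendall trend test (no tie correction).

    Returns the S statistic, the normal-approximation Z score, and
    a two-sided p-value.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs (i < j)
    s = 0.0
    for i in range(n - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        z = (s - 1.0) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1.0) / math.sqrt(var_s)
    else:
        z = 0.0
    p = math.erfc(abs(z) / math.sqrt(2.0))  # two-sided p-value
    return s, z, p

# Synthetic series with a clear upward trend plus noise
rng = np.random.default_rng(3)
trended = np.arange(50) * 0.1 + rng.normal(0.0, 1.0, 50)
s_stat, z_score, p_value = mann_kendall(trended)
```

A positive S with a p-value below 0.05 indicates a significant increasing trend, the kind of result flagged in bold in the seasonal and monthly P and Q series of Table 3.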