3.1. Influence of Mesoscale Soil and Land Use Parameterization on the Simulation of the Headwater Catchment
Figure 2 A shows measured and simulated discharge rates for the two simulations Wbach and WbachEsoilConi for the years 2010 and 2011. Observed discharge was characterized by strong seasonality, with a pronounced low-flow period during the summer and high variability during snow-dominated periods in the winter. Both simulation scenarios generally reproduced the discharge dynamics well but overestimated peaks during the winter (due to an overestimation of snow melt by the snow model) and missed some peaks during the summer. The use of mesoscale soil data from the Erkensruhr (model scenario WbachEsoilConi) intensified the tendency to overestimate peak discharge rates.
Differences in discharge between the reference simulation Wbach and the simulations WbachDeci and WbachGrass (Figure 2 B) were smaller than ±0.5 mm for more than 90% of the simulation period, indicating a weak sensitivity of discharge to changes in land use parameterization. Higher discharge rates of WbachDeci and WbachGrass in late summer 2010 resulted from differences in LAI development and corresponding changes in interception. At the end of 2011, differences in discharge resulted from differences in soil moisture. The WbachDeci simulation had lower soil moisture at all depths than the Wbach simulation, and rainfall therefore primarily replenished its water storage. The WbachGrass simulation had the highest soil moisture at the same time and, accordingly, the highest discharge rates. In Figure 2 C, differences between Wbach and the WbachEsoilDeci and WbachEsoilGrass simulations are shown. Both simulations produced higher discharge rates during both years, with an extreme overestimation by the WbachEsoilDeci model scenario during 2010.
Figure 3 summarizes statistical measures of model performance for the hydrological winter 2010/2011 and, as a mean value, for the hydrological summer periods in 2010 and 2011.
All statistical measures varied more strongly between simulations during summer than during winter because (1) differences in simulated evapotranspiration only became apparent during summer and (2) small changes in discharge amount and timing had a high impact on the statistical measures during the low-flow period.
During winter, all model scenarios produced unsatisfactory R2 values (0.61–0.68) but very good bias values (0.94–0.98); the coefficient of variation was lower than unity for all simulations due to the underestimation of discharge variability during winter.
Changing land use primarily affected the coefficient of variation during the hydrological summer with increases for grassland and decreases for deciduous forest. In contrast, a change in soil data heavily influenced the bias and the R2. The unique behavior of the simulation WbachEsoilDeci in terms of very high increases in bias and R2 compared to WbachDeci has already been mentioned in the previous paragraph. The reason for this increase in bias will be further analyzed in this chapter and in the discussion section.
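The statistical measures discussed above can be reproduced from paired observed and simulated discharge series. The following is a minimal sketch in which bias is taken as the ratio of the simulated to the observed mean and variability is compared as the ratio of the coefficients of variation; these definitions are assumptions for illustration and may differ from the study's exact formulations:

```python
import numpy as np

def performance_measures(obs, sim):
    """Return R2, bias, and ratio of coefficients of variation for
    paired discharge series (hypothetical, simplified definitions)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    r = np.corrcoef(obs, sim)[0, 1]          # Pearson correlation
    r2 = r ** 2                              # coefficient of determination
    bias = sim.mean() / obs.mean()           # ratio of means (1 = unbiased)
    cv_ratio = (sim.std() / sim.mean()) / (obs.std() / obs.mean())
    return r2, bias, cv_ratio

# Toy example: the simulated series underestimates observed variability,
# so the coefficient-of-variation ratio falls below unity.
obs = np.array([1.0, 3.0, 2.0, 5.0, 1.5, 4.0])
sim = np.array([1.5, 2.8, 2.2, 4.2, 1.8, 3.6])
r2, bias, cv_ratio = performance_measures(obs, sim)
```

A coefficient-of-variation ratio below unity in such a computation corresponds to the underestimated discharge variability reported for the winter period.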
The water balance of the Wüstebach simulations (Table 4) showed some interesting features concerning evapotranspiration components and infiltration sums. The total amount of actual evapotranspiration changed significantly between land uses, with the highest values for WbachGrass due to the changes in transpiration parameters. In 2010, the amount of actual evapotranspiration for the WbachDeci simulation equaled that of Wbach, but in 2011 its evapotranspiration was higher by 50 mm. Infiltration sums and fractions of subsurface flow varied between the years but not between simulation variants using the same soil data.
Comparing simulations with high-resolution soil data of the Wüstebach to those with mesoscale Erkensruhr soil data, significant differences in the water balance components and in the fractions of subsurface flow became apparent. For both forested land uses, actual evapotranspiration decreased, by 37 mm (2010) and 25 mm (2011) for the coniferous and by 126 mm (2010) and 56 mm (2011) for the deciduous forest. The decrease in evapotranspiration resulted from a decrease in infiltration sums by 77 mm (2010) and 62 mm (2011) for the coniferous and by 113 mm (2010) and 89 mm (2011) for the deciduous forest. Despite the decrease in infiltration sums, discharge sums were much higher, and as a result the fraction of subsurface flow decreased by 12%–14% in 2010 and 6%–7% in 2011. This effect was stronger in 2010 than in 2011: in April and May 2010, precipitation rates were larger than PET rates, whereas in April and May 2011 they were lower. The PET surplus significantly reduced soil moisture in 2011, thus dampening the effect of the mesoscale soil data on runoff generation processes.
In contrast to the forest land uses, the WbachEsoilGrass scenario showed small changes in total evapotranspiration (≤27 mm) and, correspondingly, the smallest variations in infiltration sums. The deviations in the water balance during 2010 and 2011 arose from intense rainfall during December of both years.
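The comparisons above reduce to differencing annual water-balance components between model runs. A minimal illustration with invented numbers follows; the component values and the simplified subsurface-flow definition are assumptions for demonstration only, not the paper's data or the model's internal accounting:

```python
# Illustrative annual water-balance components in mm (values are invented).
reference = {"aet": 450.0, "infiltration": 900.0, "discharge": 750.0}
mesoscale = {"aet": 413.0, "infiltration": 823.0, "discharge": 790.0}

# Component-wise change caused by the coarser soil parameterization
delta = {k: mesoscale[k] - reference[k] for k in reference}

# Subsurface-flow fraction: share of discharge that passed through the soil,
# approximated here as (infiltration - actual ET) / discharge (a simplified
# definition assumed for illustration).
frac_ref = (reference["infiltration"] - reference["aet"]) / reference["discharge"]
frac_meso = (mesoscale["infiltration"] - mesoscale["aet"]) / mesoscale["discharge"]
```

With lower infiltration but higher discharge, the subsurface fraction necessarily drops, mirroring the 12%–14% decrease reported above.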
Comparing the water balance results of the setups Wbach and WbachEsoilConi to the measured water balance components reported in Cornelissen et al., the following observations can be made: (1) simulated discharge amounts match measured discharge well for the Wbach scenario in both years, while WbachEsoilConi overestimated discharge by 20 mm (2010) and 40 mm (2011); (2) the estimated fraction of evaporation (20% of precipitation) matches the simulated fractions very well; (3) the amounts of actual evapotranspiration are largely underestimated in 2011, by 50 mm (Wbach) and 80 mm (WbachEsoilConi), due to low transpiration rates.
In the context of this paper, soil moisture simulation results are compared between simulations but not to measurements. For a detailed comparison between simulated and measured soil moisture of the Wüstebach catchment, the reader is referred to Cornelissen et al.
We noted pronounced differences in simulated soil moisture dynamics between land use types at all depths. At 5 cm depth, differences were most pronounced during July and August 2010, when the WbachDeci simulation maintained soil moisture values above 0.5 while soil moisture for both the Wbach and the WbachGrass simulations dropped below 0.3. In July and August 2011, the WbachDeci simulation was again the wettest, but differences from Wbach and WbachGrass were smaller. The Wbach and WbachGrass scenarios showed small differences at 5 cm depth because their root depths (refer to Table 2) were comparably shallow, at 0.5 m and 0.35 m, respectively. At 20 cm depth (Figure 4), the WbachDeci scenario produced the lowest soil moisture in both years: during July and August, WbachGrass and Wbach maintained soil moisture values of about 0.6, while WbachDeci dropped below 0.4 in 2011. In both years, the WbachGrass scenario produced the highest soil moisture. At 50 cm depth, a clear hierarchy following root depths was found in both years, with the highest moisture for WbachGrass (featuring the shallowest roots) and the lowest values for WbachDeci (featuring the deepest roots).
The use of mesoscale soil data generally increased soil wetness and intensified short-term soil moisture dynamics down to 50 cm depth. Differences were again most pronounced for the simulation with deciduous land use. The intensified soil moisture dynamics with the coarser soil data led to a decrease in infiltration and transpiration, with a corresponding increase in discharge.
The relationship between mean soil moisture and its standard deviation (σθ(⟨θ⟩)) showed little variation between land use types at 5 cm depth (Figure 5). Simulations with Erkensruhr soil data produced a steeper slope, with higher standard deviations at the same mean moisture. This is attributed to the fact that the VGM parameters of the model setups used in this study were aggregated from a model resolution of 25 m ([15]; also refer to Chapter 2.5). As demonstrated recently by Qu et al., the shape of σθ(⟨θ⟩) can be explained to a large extent by the spatial variance of soil hydraulic properties.
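The σθ(⟨θ⟩) relation is obtained by pairing, for each time step, the spatial mean of soil moisture with its spatial standard deviation. The sketch below illustrates this with a synthetic soil-moisture field; the field shape, the cell-specific scaling used to mimic heterogeneous hydraulic properties, and all values are assumptions for demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical soil-moisture fields: rows = time steps, columns = grid cells.
# Spatially variable soil hydraulic properties are mimicked by a fixed,
# cell-specific scaling factor applied to a common temporal trajectory.
n_t, n_cells = 200, 500
mean_series = np.linspace(0.2, 0.6, n_t)        # dry-to-wet trajectory
cell_scale = rng.normal(1.0, 0.15, n_cells)     # spatial variability of properties
theta = np.clip(mean_series[:, None] * cell_scale[None, :], 0.05, 0.65)

spatial_mean = theta.mean(axis=1)   # <theta> for each time step
spatial_std = theta.std(axis=1)     # sigma_theta for each time step

# Slope of sigma_theta(<theta>): a larger spread in cell_scale (i.e., greater
# spatial variance of soil hydraulic properties) yields a steeper relation,
# consistent with the behavior described in the text.
slope = np.polyfit(spatial_mean, spatial_std, 1)[0]
```

Increasing the standard deviation of `cell_scale` steepens the fitted slope, which is the mechanism invoked above to explain the steeper σθ(⟨θ⟩) relation of the mesoscale soil parameterization.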
3.2. Influence of Parameter Regionalization and Spatially Distributed Input Data on the Simulation of the Mesoscale Catchment
In the following, the results of the four Erkensruhr simulations are analyzed separately for the whole Erkensruhr catchment and for the Wüstebach sub-catchment. Water balance results were only available for the whole Erkensruhr, as HGS does not enable the export of water balance results for sub-catchments.
The three Erkensruhr simulations with homogeneous rainfall (Figure 6) heavily overestimated discharge amounts, especially during autumn, because the applied rainfall originated from a climate station located in the southwestern, and thus wettest, part of the catchment. The use of distributed precipitation substantially improved the discharge simulation of the Erkensruhr in terms of total sum, rising and falling limbs, and low flows (Figure 7). However, the discharge peaks were underestimated, possibly because the same interception and transpiration parameterization was used for a different precipitation input data set. We further found that using spatially aggregated instead of distributed radar precipitation data produced equivalent simulation results in terms of water balance and discharge (not shown).
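The effect of forcing the whole catchment with a single wet station versus the areal mean of a precipitation grid can be illustrated with a toy example; the grid, the west-east gradient, and all values below are invented for demonstration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Invented radar precipitation grid: rows = days, columns = cells along a
# gradient that is wetter in the south-west (as in the Erkensruhr).
n_days, n_cells = 365, 50
gradient = np.linspace(1.4, 0.8, n_cells)          # SW (wet) to NE (dry)
base = rng.gamma(shape=0.8, scale=4.0, size=(n_days, 1))
radar = base * gradient[None, :]

station_sw = radar[:, 0]          # homogeneous forcing from the wettest corner
areal_mean = radar.mean(axis=1)   # spatially aggregated (areal mean) forcing

# Annual sums: the single SW station systematically overestimates
# catchment-average rainfall relative to the areal mean.
sum_station = station_sw.sum()
sum_areal = areal_mean.sum()
```

Because aggregation preserves the areal mean, feeding the model the spatially averaged series instead of the full grid leaves catchment-scale water-balance sums essentially unchanged, consistent with the equivalent results noted above.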
The overestimation of simulated discharge amounts at the Erkensruhr outlets caused bias values around 1.6 for the Erk model scenario during summer (Figure 8). R2 values of the Erkensruhr simulations were considerably higher during winter (0.86) than during summer (0.22). As the R2 during winter was higher for the Erkensruhr simulations than for the independent Wüstebach simulations (refer to Figure 3), we assumed that the snow model used in both simulations performed better for the smoother discharge curve of the mesoscale catchment. A plausible reason for this is a smoothing effect on winter discharges due to the larger catchment size.
The use of distributed precipitation data mainly improved the bias during winter. During summer, distributed precipitation rates caused the bias to change from a 50% overestimation (Erk_LN_PET) to a 50% underestimation.
Evaporation amounts of the Erk simulation, which assumed spatially homogeneous coniferous land use throughout the catchment, were slightly lower (by 15 mm) than those of Wbach and WbachEsoilConi with the same land use type (refer to Table 4 and Table 5). The consideration of heterogeneous land use in the Erk_LN scenario slightly increased evaporation, by 13 mm in 2010 and 17 mm in 2011. As already mentioned, the mesoscale soil data decreased simulated transpiration and infiltration amounts in the Wüstebach simulations independent of land use type. However, transpiration of the Erk scenario was equal to that of the Wbach scenario, and infiltration slightly increased.
Compared to the Erk setup, the total evapotranspiration amount increased when heterogeneous land use information was used. The consideration of distributed PET increased total evapotranspiration in 2011 by 25 mm, with increases in both evapotranspiration components.
The reasons for the deviations in the water balances of the simulations Erk, Erk_LN, and Erk_LN_PET are comparable to those of the Wüstebach simulations, which have already been explained in Chapter 4.1.
Figure 9 shows measured and simulated monthly deviations from mean annual evapotranspiration rates for coniferous (Figure 9 A), grassland (Figure 9 B), and deciduous (Figure 9 C) vegetation. The simulated values were compared with measured eddy-covariance data in the case of coniferous and grassland vegetation, and to literature values from Mendel [45] in the case of deciduous vegetation.
For coniferous and grassland vegetation, the trend in mean monthly evapotranspiration was well simulated, with R2 values larger than 0.9. In the case of coniferous vegetation, monthly evapotranspiration was overestimated between April and July and in December, while it was underestimated in August, September, and February. Distributed precipitation rates improved the simulation in July, whereas Erk_LN_PET was the most unfavorable simulation, meaning that PET was slightly underestimated for coniferous land use when the distribution method was used. Simulation of the evapotranspiration of the grassland vegetation was best during the winter and worst during March and April. For deciduous vegetation, Figure 9 C reveals the largest deviations between simulated data and measured values taken from the literature [45].
Figure 10 shows the pattern of simulated mean actual evapotranspiration given as a relative value of the sum of evapotranspiration for 2010 and 2011. The pattern of the Erk scenario (Figure 10 A) shows a clearly defined riparian and stream area with very high relative evapotranspiration values close to unity. The driest conditions were found at the ridges of the hills at the eastern, western, and southern borders of the catchment. The pattern shown in Figure 10 B for the Erk_LN scenario illustrates that the incorporation of heterogeneous land use enhanced evapotranspiration in the central part of the catchment, which is covered with grassland. Distributed PET decreased actual evapotranspiration in higher parts of the catchment (e.g., the south-western border). The incorporation of distributed precipitation generally decreased the contribution of grassland areas.
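The monthly deviations plotted in Figure 9 can be derived from a monthly evapotranspiration series by subtracting the mean monthly rate; a minimal sketch with invented numbers:

```python
import numpy as np

# Invented monthly actual evapotranspiration (mm), January to December
et_monthly = np.array([10, 14, 25, 45, 70, 85, 90, 75, 50, 30, 15, 11], dtype=float)

# Deviation of each month from the mean monthly evapotranspiration rate
mean_monthly = et_monthly.mean()
deviation = et_monthly - mean_monthly

# By construction, the deviations sum to zero: positive summer deviations
# balance negative winter deviations.
```

Comparing such deviation series between a simulation and eddy-covariance measurements (e.g., via R2) is one way to quantify how well the seasonal cycle is reproduced independently of the annual total.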