Torrents are defined as steep waterways in mountainous environments [1
]. Hazardous torrent processes are characterized by the rapid propagation of large quantities of available sediment, debris, and water from an upslope source, via a transit zone, to a downslope depositional area where human settlements may be established. In theory, torrent processes can be further differentiated as debris flows, hyperconcentrated flows, or fluvial sediment transport, based on the respective characteristics and dominant processes of each event [2].
While conventional approaches have based classification on flow behavior alone [4
], peak discharge has since been recommended as a complementary criterion [3
]. Observable flow characteristics reflect variable concentrations of water and sediment, which provide insight into the internal physics of the flow. For instance, debris flows typically produce thicker, more hummocky and lobate deposits, and are characterized as very rapid to extremely rapid flows of saturated, non-plastic material along steep channels, headed by a coarse surge front [3
]. While debris floods are associated with the transport of considerable quantities of coarse sediment, the flows are generally characterized as thin, wide sheets of material [3
]. Relative to debris floods, hyperconcentrated or sediment-laden flows transport smaller, albeit still notable, quantities of fine sediment in suspension [6
]. In particular, mudflows have been defined as very rapid to extremely rapid flows of saturated plastic debris in a channel, and are characterized by a significantly larger water content relative to the amount of solid source material; the plasticity index is greater than 5% [8
]. Following the definition by Bradley and McCutcheon [9
], mudflows also have sufficient viscosity to transport sizable boulders, as well as natural and anthropogenic debris, within a matrix of smaller particles. The flow behavior of a single torrent event can evolve as it occurs, depending on the types, velocities, and quantities of materials propagated downslope. In this respect, events may often be more realistically described as a combination or evolution of the aforementioned torrent process types.
The distribution of these materials downslope from the active zone is further influenced by underlying site characteristics (e.g., topography, presence of confined or unconfined preferential pathways). Due to the combined effects of composition, flow behavior, and site-specific characteristics, torrent processes are associated with variable peak discharges, sediment transport capacities, and momenta, and consequently with differing potentials to cause damage to elements at risk upon impact.
Assessing the physical vulnerability of elements at risk (e.g., affected buildings) due to these processes is a part of consequence analysis within risk assessment, where the intensity of a given event is related to the damages sustained [10
]. The design and implementation of effective risk mitigation strategies depends on the results of such analyses. The estimation of expected direct losses, as a result of the hazard process patterns with respect to the properties of exposed elements, is possible with the derivation of representative physical vulnerability functions [12].
However, a persistent challenge in vulnerability studies on torrent processes is the high uncertainty and scarcity of direct, field-based observation data, since collecting such data during these types of events is difficult or impossible [13
]. Detailed analyses have been conducted for events that result in high losses in both Austria and Switzerland. These analyses generally report on the triggering and boundary conditions of the particular torrent process, the process evolution, the extent of runout zones, and estimates of eroded materials deposited downslope (e.g., [14
]). However, comprehensive information on processes, damage patterns, and their interactions with structural building properties has not been adequately documented to date.
To address this challenge, proxies have been adopted to characterize event intensities, including, but not limited to, sediment deposition heights, velocities, and impact pressures [16
]. It is of interest to replicate past events with process models to determine if simulated intensity proxy data can be considered in further consequence analysis. Following Mazzorana et al. [12
], process modeling comprises the first three of five steps toward accurately assessing the physical vulnerability of the built environment. A range of recognized methods has been applied in different process models, including empirical [7
], empirical-statistical combined with simple flow equations [24
], topographic gradient-based [25
], numerical-based with the integration of shallow water equations [26
], and smoothed particle hydrodynamics (SPH) or Lagrangian [42
] (see References [47
] for a review). Of the numerical models, one-dimensional (e.g., DAN-W [28]; DFEM-1D [49]) or two-dimensional (e.g., FLO-2D [27]; RAMMS-DF [35]; TopRunDF [7]; MassMov2D [34]) runout modeling approaches can be adopted. Rickenmann [50
] provides a comprehensive overview of the advantages and limitations of each type of modeling approach, as a combination of how flows are propagated and distributed on the alluvial fan. Furthermore, flows are represented either as single-phase, homogeneous fluids with constant rheological properties, or as dual-phase, heterogeneous matrices. Each approach has specific data requirements, which may limit its application to data-scarce case studies. The different model approaches also highlight tradeoffs between minimizing computational time and various degrees of accuracy in the simulated results. Consequently, the choice of a suitable model is determined by data availability and by how well a given approach represents a complex, heterogeneous 3D problem while minimizing the computation time required to return a solution.
Furthermore, known sources of uncertainty can have adverse implications for model results [51
]. Sources associated with debris flow models include, but are not limited to, the quality of model input and calibration data, initial and boundary conditions, how accurately the model structure represents events of interest, the sensitivity of defined parameters, and the calibration method applied. Accurate topographic representation is imperative to model both the propagation of debris flows within the torrent channel and the lateral distribution of materials that exceed bankfull conditions onto the alluvial fan [52
]. For example, Rickenmann et al. [32
] demonstrated that debris flow model results are sensitive to the presence of local topographic features, which can divert flows and consequently determine where material is deposited on alluvial fans. Moreover, because topographic inputs for debris flow models are generated infrequently, they may not reflect the conditions at the time of the event. This temporal mismatch means that the model topography does not accurately represent initial conditions, and can include differences in channel slope steepness and channel width, as well as inaccurate representation of mitigation structures along the channel [53
]. Additional uncertainties in debris flow modelling stem from the lack of basic data required to reconstruct the characteristics of debris flow events. These may include the point where the debris flow event was initiated, the hydrograph peak and duration [16
], and the volume of material at the start of the event and further entrained during its course [54
]. In lieu of actual data, working assumptions are made to reconstruct ranges of plausible data values, which introduces inherent uncertainties into the simulated results. For example, Rickenmann et al. [32
] attributed sources of model errors to the lack of detailed data needed to describe a debris flow consisting of multiple surges, which was described with the use of a simpler, single surge hydrograph instead. The degree of model complexity can also be a source of uncertainty. For instance, sediment entrainment during a debris flow changes the volume of materials and flow behavior [55
]. Simpler debris flow models do not replicate entrainment, but partly account for this process by including additional quantities of material at the debris initiation point, while more complex models explicitly replicate the spatially and temporally distributed erosion of channel materials as the flow propagates [57
]. However, while a more complex model that includes entrainment can better replicate debris flow heights near the point of initiation [57
], this additional process introduces another source of uncertainty into the model through additional parameters (e.g., erosion rate), whose values may not be known. Event- and site-specific parameters for debris flow models are estimated through calibration, and efforts usually focus on flow resistance [48
] and rheological parameters [58
]. These parameters are associated with wide ranges of plausible values in the literature, and it may be difficult to determine which values are most representative of a given event. The majority of debris flow model research uses a trial-and-error approach to calibration [58], and may not arrive at optimal parameter sets because calibration is time-consuming and the process may be ended prematurely [52]. Methods exist to derive parameter sets efficiently (e.g., genetic algorithms), but they are rarely, if ever, employed in debris flow modelling.
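A structured alternative to trial-and-error calibration is to sample the parameter space systematically before running the model ensemble. The sketch below uses a simple Latin hypercube design; the parameter names and ranges (yield stress tau_y, viscosity eta, flow-resistance coefficient K) are illustrative assumptions, not calibrated values from any particular study:

```python
import random

def latin_hypercube(bounds, n_samples, seed=0):
    """Draw a Latin hypercube sample over the given parameter bounds.

    bounds: dict mapping parameter name -> (low, high).
    Each parameter's range is split into n_samples equal strata; one
    value is drawn per stratum, and the strata are shuffled
    independently per parameter so each sample combines different
    strata across parameters.
    """
    rng = random.Random(seed)
    columns = {}
    for name, (lo, hi) in bounds.items():
        width = (hi - lo) / n_samples
        # one uniform draw inside each stratum
        col = [lo + (i + rng.random()) * width for i in range(n_samples)]
        rng.shuffle(col)
        columns[name] = col
    return [{name: columns[name][i] for name in bounds} for i in range(n_samples)]

# Illustrative (not calibrated) rheological parameter ranges for a
# single-phase debris flow model.
bounds = {"tau_y": (100.0, 2000.0), "eta": (10.0, 500.0), "K": (24.0, 50000.0)}
samples = latin_hypercube(bounds, n_samples=8)
```

Each dictionary in `samples` would parameterize one simulation in the ensemble, guaranteeing even coverage of every parameter's range with far fewer runs than a full grid.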
Given the inherent complexities that characterize natural hazard processes and the contributing sources of uncertainties, modeling past torrent events requires the exploration of relatively large parameter spaces, even with model simplifications. Sensitivity analysis (SA) describes how varying inputs in a numerical model subsequently varies its outputs [59
]. The inclusion of SA for model calibration supports a better understanding of model behavior, its parameterization, and its associated uncertainties [60
]. In this respect, the inclusion of SA offers several advantages as part of a sound model calibration and evaluation framework. Firstly, it is instrumental in reducing the number of parameters that require calibration. Calibration is an example of an inverse problem, where the best agreement between simulated and observed reference data is obtained with the parameter combinations and values that yield the highest model performance. Secondly, SA supports the identification of the degree of influence that input factors have on model simulation results. Thirdly, SA can highlight limitations of model calibration due to residual sources of uncertainty once parameter uncertainties are accounted for [59
]. Consequently, the inclusion of SA into the modelling process identifies optimal results while highlighting application-specific limitations with greater efficiency. Furthermore, epistemic uncertainty about natural hazard phenomena that occur in complex systems remains prevalent [61
]. In particular, the combination of model and input data limitations compromises the ability to learn about the most influential parameter(s) within the systems of interest to a sufficiently high degree of accuracy. Uncertainties due to imperfect knowledge about initial conditions and simplified representations of model inputs are generally addressed with ensembles of model predictions, where each simulation represents a different choice of parameter combinations and values. A structured statistical approach to assess parametric uncertainty and model performance is preferable, where decisions about parameters can be made in a transparent and explicit way, using methods that can be easily understood [62
]. This study addresses the two aforementioned challenges, firstly, relating to the large parameter spaces needed to be explored to fully capture the complexities of hazardous torrent events, and secondly, evaluating model performance to gain a better understanding of epistemic uncertainties.
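As a minimal illustration of the SA idea, a local, one-at-a-time screening varies each input around a base point and records the relative response of the output. The toy model and parameter names below are purely illustrative stand-ins, not the model or parameters used in this study:

```python
def runout_model(params):
    """Toy stand-in for a debris flow model: returns a scalar 'runout
    length' from three parameters. Illustrative only, not physical."""
    tau_y, eta, K = params["tau_y"], params["eta"], params["K"]
    return 1000.0 / (1.0 + tau_y / 500.0) + 200.0 / (1.0 + eta / 100.0) + 0.001 * K

def one_at_a_time(model, base, deltas):
    """Local, one-at-a-time sensitivity: relative change in output per
    relative change in each input, holding the others at base values."""
    y0 = model(base)
    effects = {}
    for name, d in deltas.items():
        perturbed = dict(base)
        perturbed[name] = base[name] * (1.0 + d)
        effects[name] = ((model(perturbed) - y0) / y0) / d
    return effects

base = {"tau_y": 800.0, "eta": 100.0, "K": 2000.0}
# perturb each parameter by +10% and compare the normalized effects
effects = one_at_a_time(runout_model, base, {k: 0.1 for k in base})
```

Ranking the absolute values in `effects` identifies which parameters most influence the output and can therefore be prioritized for calibration; variance-based global methods (e.g., Sobol indices) extend the same idea across the whole parameter space.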
In response to the first challenge, the utility of SA for model calibration is assessed through the back-calculation of the 2005 debris flow event that occurred in Brienz, Switzerland. Taking all of these requirements into consideration, FLO-2D [27
], a simplified, physically based, two-dimensional numerical model, was used to model the event. Single-phase models like FLO-2D are commonly used to simulate debris flows by researchers and practitioners working in the risk community (Table 1
) as a computationally efficient first step to gain insight into these complex processes [32
]. In the study conducted by De Blasio et al. [38
], it was suggested that, in principle, larger blocks interspersed within a mud matrix may behave comparably to a pure Bingham fluid for certain types of flows. This working assumption was also adopted here to minimize computation time and thereby support the evaluation of a wider model parameter space. Furthermore, the application of more complex models often requires input data and initial conditions that exceed what is available [64
]. With respect to specific model data requirements, FLO-2D results have been reported to be strongly influenced by topography [58
]; the availability of the high-resolution SwissALTI3D digital elevation model (DEM) [66
] satisfied this requirement. Previous studies (e.g., [58
]) have also reported that FLO-2D is capable of generating accurate runout distances and of capturing the distribution of materials across the fan through back-calculation. Finally, to support subsequent investigations of building vulnerability with simulated hazard intensities, FLO-2D can output sediment deposition (flow) heights, in addition to flow velocities and impact pressures.
Model performance is assessed against post-event observations of sediment deposition extent, sediment deposition heights, and a single point estimate of flow velocity for close to 4000 completed simulations. Focusing only on studies conducted with the commonly applied FLO-2D model, Table 1
presents an overview of published literature on the assessment of simulated results specific to the back-calculation of torrent events. To date, the performance of simulated torrent events has generally been assessed by visual comparison, with limited instances where quantitative or hybrid approaches are applied to consider validation data in three dimensions. In particular, while comparisons of observed sediment deposition extents are more common, the additional inclusion of deposition heights is limited. For instance, only three known studies with FLO-2D assessed results against both post-event observations of sediment deposition extent and deposition heights. In these cases, the sample sizes of observed points were limited (n < 20), and no further statistical analyses had been published at the time this manuscript was prepared. Furthermore, other studies that quantified model performance presented the percentage of over- or under-prediction of sediment deposition extent, or only displayed the simulated results visually, without quantification.
In light of these findings, statistically based performance metrics are applied in this study to address the second challenge. This supports the quantitative formulation of aggregated uncertainties related to data quality, parameterization, and model suitability; it is a step towards effectively identifying priorities for data collection and future modeling efforts based on quantitative methods. In effect, the methods proposed in this study combine SA for calibration with the evaluation of model performance and behavior using a set of statistically based metrics, supporting the process in a systematic and efficient way.
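As a sketch of the kind of statistically based metrics meant here, agreement with post-event observations can be quantified, for example, with a root-mean-square error on deposition heights and an areal overlap index on deposition extents; the specific metrics and data used in the study may differ:

```python
import math

def rmse(observed, simulated):
    """Root-mean-square error between paired observed and simulated
    sediment deposition heights at surveyed points."""
    n = len(observed)
    return math.sqrt(sum((o - s) ** 2 for o, s in zip(observed, simulated)) / n)

def extent_agreement(observed_cells, simulated_cells):
    """Areal agreement of deposition extents as a Jaccard index over
    the sets of inundated grid cells (1.0 = perfect overlap)."""
    obs, sim = set(observed_cells), set(simulated_cells)
    return len(obs & sim) / len(obs | sim)

# toy example: observed vs simulated deposition heights [m] and extents
heights_err = rmse([0.5, 1.2, 0.8], [0.6, 1.0, 0.9])
overlap = extent_agreement({(0, 0), (0, 1), (1, 1)}, {(0, 1), (1, 1), (1, 2)})
```

Computed across all ensemble members, such scores allow parameter sets to be ranked objectively and the residual spread to be reported as a quantitative measure of aggregated uncertainty.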