The classical approach to flood frequency analysis (FFA) may produce significant jumps in the estimates of upper quantiles as the measurement series lengthens. Our proposal is a multi-model approach, also called the aggregation technique, which has proved to be an effective method for modeling maximum flows, largely eliminating the disadvantages of traditional methods. In this article, we present a probability mixture model, which relies on aggregating the probabilities of non-exceedance of a constant flow value from the candidate distributions, and we compare it with the previously presented quantile mixture model, which consists of aggregating quantiles of the same order from the individual models. We define an asymptotic standard error of the design quantiles for both statistical models in two versions: one that ignores the bias of the candidate-distribution quantiles with respect to the aggregated quantiles, and one that takes this bias into account. A simulation experiment indicates that the latter version is more accurate and reduces the quantile bias with respect to the unknown population quantile. In the case study, the 0.99 quantiles are determined for both aggregation variants, along with an assessment of their accuracy. The differences between the two proposed aggregation methods are discussed.
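The distinction between the two aggregation variants can be sketched numerically. In the following minimal example, the candidate distributions, their parameters, and the aggregation weights are illustrative assumptions, not values from the article; it only shows that mixing quantiles of the same order generally gives a different design quantile than inverting a mixture of non-exceedance probabilities.

```python
# Illustrative sketch of quantile mixture vs. probability mixture aggregation.
# The candidate distributions, parameters, and weights below are assumptions
# chosen for demonstration, not values from the article.
import numpy as np
from scipy.stats import gumbel_r, lognorm
from scipy.optimize import brentq

# Two hypothetical candidate models for annual maximum flow (m^3/s)
candidates = [gumbel_r(loc=100.0, scale=30.0), lognorm(s=0.4, scale=110.0)]
weights = np.array([0.6, 0.4])  # assumed aggregation weights, summing to 1

p = 0.99  # non-exceedance probability of the design quantile

# Quantile mixture: aggregate the quantiles of the same order p
# taken from each candidate distribution.
q_quantile_mix = float(np.dot(weights, [d.ppf(p) for d in candidates]))

# Probability mixture: aggregate the non-exceedance probabilities of a
# constant flow value x, then invert the mixture CDF numerically.
def mixture_cdf(x):
    return float(np.dot(weights, [d.cdf(x) for d in candidates]))

q_prob_mix = brentq(lambda x: mixture_cdf(x) - p, 1.0, 1000.0)

print(q_quantile_mix, q_prob_mix)  # the two variants generally differ
```

Both variants reduce to the same value only in degenerate cases (e.g., identical candidates); in general the probability mixture quantile is obtained by root-finding on the aggregated CDF rather than by a weighted sum of individual quantiles.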
This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.