A Transient Stochastic Rainfall Generator for Climate Changes Analysis at Hydrological Scales in Central Italy

Abstract: In this work, a comprehensive methodology for trend investigation in rainfall time series, in a climate-change context, is proposed. The crucial role played by a Stochastic Rainfall Generator (SRG) is highlighted. Indeed, SRG application is particularly suitable for obtaining rainfall series that are representative of future rainfall at hydrological scales. Moreover, the methodology investigates the climate change effects on several timescales, considering the well-known Mann–Kendall test and analyzing the variation of probability distributions of extremes and hazard. The hypothesis is that the effects of climate changes could be more evident only for specific time resolutions, and only for some considered aspects. Applications regarded the rainfall time series of the Viterbo rain gauge in Central Italy.


Introduction
Climate changes and their impacts on agriculture, industry, economy, human health and ecosystems constitute an increasingly important topic for the scientific community. Consequently, Climate Change Adaptation (CCA) and Disaster Risk Reduction (DRR) are nowadays a worldwide priority, in order to build resilient societies. Concerning hydraulic and geological risks, heavy rainfall events characterized by high intensity and short duration, and the consequent floods and landslides, have increased in many parts of the world, and they are expected to increase further in the future.
In general, the investigation of climate change effects on rainfall series at hydrological scale can be carried out as follows [1]. For past conditions, it is possible to analyze possible trends only for long-term observed time series. For future conditions, the evaluation of changes in rainfall statistics is based on extrapolation from historical records, or on future scenarios simulated by using climatic models and hypothesizing different greenhouse gas emissions.
Concerning observed time series and regarding trend detection in extreme rainfall, many studies were carried out on data recorded at a daily timescale [2]. Such data are not particularly suitable for hydrological contexts, because sub-daily (and in particular sub-hourly) scales are of main interest for the analysis of flash floods and shallow landslides, especially for small catchments that present short response times of runoff to rainfall. Only a few studies have focused on the detection of trends in sub-daily data series for large areas [3]. Overall, trends in historical series of meteorological data were usually investigated with parametric and non-parametric approaches [4]. The non-parametric Mann-Kendall (MK) statistical test [5,6] was frequently used to quantify the significance of trends in precipitation time series [7][8][9], and it was widely recommended by the World Meteorological Organization (WMO) [10]. Moreover, even when the sample size of a high-resolution series is long enough, the trends obtained from observed data may not be suitable for extrapolation into the future, for two main reasons [11]. First, they could be related to climate variability and not to persistent changes in time. Second, an investigated trend depends on the observation period, so it could be different if the observation period is extended.
To overcome such problems, predictions of future rainfall changes can be carried out by using Regional Climate Models (RCM), but their outputs are mainly available at daily scales, and averaged over large spatial resolutions, so they require statistical downscaling or bias correction methods [1,12] for hydrological analyses. Only very recent RCM applications regarded high resolutions (hourly) and small spatial scales (e.g., [13,14]).
Consequently, in this context of uncertainty, mainly due to the scarcity of high-resolution observed rainfall data with an adequate sample size, the use of Stochastic Rainfall Generators (SRG) appears helpful for a more in-depth analysis of rainfall processes. For instance, they could be used for obtaining perturbed time series that are representative of future rainfall at hydrological scales, which are finer than RCM ones. It is noteworthy that SRG generally present a simple formulation and low computational costs, thus allowing large ensembles of long rainfall time series to be generated quickly [15,16].
SRG are usually calibrated by using observed rain gauge data. Then their parameters are perturbed by considering the changes predicted by RCM, under the hypothesis that the statistics of areally averaged rainfall simulated by climate models reflect changes at the point rain gauge scale (e.g., [17,18]). In detail, these changes (also named Change Factors, CF) [19,20] are usually computed as differences (or ratios) between the statistics of the rainfall series simulated (by RCM) for two future horizons (i.e., near future, 2035-2064, and far future, 2070-2099) and those for the control period representing the current climate. In doing so, they are applied in an additive (or multiplicative) way to the statistics of the adopted SRG.
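The CF computation and application described above can be sketched as follows; the function names and numerical values are illustrative assumptions for this sketch, not quantities from the paper:

```python
# Sketch of the Change Factor (CF) approach: CFs are computed from
# RCM-simulated statistics for a future horizon and a control period,
# then applied to the statistic of the calibrated SRG.

def change_factor(stat_future, stat_control, mode="additive"):
    """CF between RCM future and control statistics (difference or ratio)."""
    if mode == "additive":
        return stat_future - stat_control
    return stat_future / stat_control  # multiplicative

def perturb(stat_observed, cf, mode="additive"):
    """Apply the CF to the statistic of the calibrated SRG."""
    if mode == "additive":
        return stat_observed + cf
    return stat_observed * cf

# Example with illustrative values: mean daily rainfall (mm) in an RCM cell
cf_add = change_factor(stat_future=6.3, stat_control=5.8)   # additive CF
perturbed = perturb(5.1, cf_add)                            # perturbed SRG statistic
```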
Using an SRG for climate change analyses has been successfully tested for many climate conditions [19,21–23]. For instance, Maraun et al. [24] recommended the use of SRG as an alternative to bias correction methods on RCM [12], due to their ability to simulate local variability and climate extremes.
In this work, a modified version of the Neyman-Scott Rectangular Pulses model (NSRP, [24,25]) was introduced and used as SRG. The model was calibrated by using Annual Maximum Rainfall (AMR) data, which are usually longer than continuous observed series and allow for a better reconstruction of extreme events [26]. Moreover, the seasonality was modeled in a parsimonious way [27]. The selected case study regarded the rainfall time series of the Viterbo rain gauge in Central Italy.
Specifically, the transient SRG was implemented in an equivalent way with respect to CF methodology. We hypothesized mathematical functions representing possible trends for NSRP parameters (which are strongly related to finest scales) on the whole considered future horizon (i.e., 100 years), and then we focused on the best equations that provided comparable results with RCM predictions (indeed associated to coarser resolutions), in terms of variation of prefixed statistics in the spatial cells of the selected case study [28]. Obviously, this constitutes a further and not negligible source of uncertainty [29]. In fact, all the successive evaluations depend on the adopted mathematical expression for parameters' trends and on the assumption that it remains valid for the entire future design period.
From the obtained perturbed synthetic rainfall series, the changes of extreme values distributions at several timescales were investigated, together with the hazard values on prefixed time horizons.
Overall, the aim of this work is to highlight the need of considering a comprehensive framework for trend investigation, in which many timescales are investigated and several methods are used. In fact, the effects of climate changes could be more evident only for specific time resolutions. Moreover, as previously mentioned, the analysis of only the observed time series could be insufficient, so that the use of perturbed synthetic data should be preferred, together with the adoption and comparison of several approaches for checking stationary or non-stationary behaviors.
In detail, the main goals of this study are to investigate whether the following assumptions are true: • The effects of climate changes could be more evident only on specific data time resolutions; • The analysis of only the observed time series could be insufficient, so that the use of perturbed synthetic data could be preferred, together with the adoption and comparison of several approaches for checking stationary or non-stationary behaviors.

Case Study
The selected case study regards the rainfall time series of the Viterbo rain gauge, located in Lazio region, Central Italy ( Figure 1). The rainfall data (provided by 'Agenzia Regionale di Protezione Civile-Centro Funzionale Regionale' of Lazio region) consist of Annual Maximum Rainfall (AMR) series of different sub-daily durations (1, 3, 6, 12, and 24 h) with a sample size N = 71 years. Moreover, a continuous high-resolution (5 min) rainfall series from 1994 to 2015 is available, from which the AMR series related to 5, 15 and 30 min were obtained, with N = 22 years.
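The extraction of sub-hourly AMR values from a continuous 5 min record, as done here for the 5, 15 and 30 min durations, can be sketched as follows; the data and function names are illustrative assumptions (durations must be multiples of 5 min in this sketch):

```python
# For each duration, the annual maximum rainfall (AMR) is the maximum of the
# moving-window sums of the 5 min series over windows of that duration.

def annual_max(series_5min, duration_min):
    """Annual maximum of rolling sums over windows of `duration_min` minutes."""
    w = duration_min // 5                      # window length in 5 min steps
    if w > len(series_5min):
        raise ValueError("duration longer than record")
    window_sum = sum(series_5min[:w])
    best = window_sum
    for k in range(w, len(series_5min)):       # slide the window by one step
        window_sum += series_5min[k] - series_5min[k - w]
        best = max(best, window_sum)
    return best

# Toy one-year record: a single 15 min burst of 2 mm per 5 min step
year = [0.0] * (288 * 365)                     # 288 five-minute steps per day
year[1000:1003] = [2.0, 2.0, 2.0]
amr = {d: annual_max(year, d) for d in (5, 15, 30)}
# amr[5] = 2.0, amr[15] = 6.0, amr[30] = 6.0
```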

Overview of the Adopted SRG Model
Many works [25,30–35] remarked the ability of a specific class of SRG, the Poisson cluster models, to satisfactorily reproduce the observed summary statistics (such as mean, variance, skewness, proportion of dry/wet periods, and k-lag autocorrelation values) of the continuous rainfall process at fine timescales. This class includes the well-known Neyman-Scott and Bartlett-Lewis families, which perform equally well [36] and were applied in many countries (see [27] for a list of references).
In this work, the single-site Neyman-Scott Rectangular Pulses (NSRP [25,26]) model was considered, which is based on the following assumptions related to a clustering approach (see also Figure 2):

1. Five quantities play a crucial role in an NSRP model: (i) the inter-arrival time (T_s) between the origins of two consecutive storms; (ii) the number, M, of rain cells (also indicated as bursts or pulses) inside a specific storm; (iii) the waiting time, W, between a specific burst origin and the origin of the associated storm; (iv) the intensity, I; and (v) the duration, D, of a specific burst, having a rectangular shape, belonging to a storm.

2. T_s, M, W, I and D are assumed as random variables, following assigned probability distributions (described below), so that synthetic rainfall series can be generated in a stochastic way as reported in the following:

(a) The inter-arrival time (T_s) between the origins of two consecutive storms is assumed as an independent and identically distributed random variable, following an exponential distribution:

P[T_s ≤ t_s] = 1 − exp(−λ t_s), (1)

where P[T_s ≤ t_s] is the probability of having a new storm origin after a waiting time T_s ≤ t_s from the previous one, and 1/λ represents the mean value of the inter-arrival times, i.e., E[T_s] = 1/λ. According to the usual notation in statistics, capital letters refer to random variables, while lowercase letters represent particular values assumed by the random variables; consequently, t_s is a specific value that can be assumed by T_s. Moreover, the symbol E[·] refers to the expected value, i.e., the mean value of a specific variable.

(b) The number, M, of rain bursts is usually set as geometric or Poisson distributed. In this work, a geometric distribution is considered with a mean value E[M] = θ and, with the goal of having at least one burst for each storm, the random variable C = M − 1 is adopted, with a mean value E[C] = θ − 1:

P[C = c] = (1/θ)(1 − 1/θ)^c, c = 0, 1, 2, . . . (2)

(c) The waiting time, W, is assumed as an exponentially distributed variable with parameter β_W and mean value E[W] = 1/β_W:

P[W ≤ w] = 1 − exp(−β_W w). (3)

(d) The intensity, I, and the duration, D, of each rectangular burst are both considered as exponentially distributed, with parameters β_I and β_D, and mean values E[I] = 1/β_I and E[D] = 1/β_D, respectively:

P[I ≤ i] = 1 − exp(−β_I i), (4)
P[D ≤ d] = 1 − exp(−β_D d). (5)

(e) The total precipitation intensity, Y(t), at time t is then calculated as the sum of the intensities related to the bursts active at time t (see also Figure 2d and [27]).

(f) Then, the aggregated process, i.e., the rainfall height R_j^(τ) cumulated at the temporal resolution τ and related to the time interval j with extremes (j − 1)τ and jτ, is as follows:

R_j^(τ) = ∫_{(j−1)τ}^{jτ} Y(t) dt. (6)

This basic version of an NSRP model has a five-parameter set Θ = (Θ_1, Θ_2, Θ_3, Θ_4, Θ_5) = (λ, θ, β_W, β_D, β_I) that is usually calibrated by minimizing an objective function, defined as the sum of the residuals (normalized or not) between the considered (by the user) statistical properties of the observed data at chosen time resolutions and their theoretical expressions. The statistical properties typically refer to a high-resolution continuous time series (e.g., a 1 or 5 min rainfall time series): mean, variance, and k-lag autocorrelation for R_j^(τ) [27].
Figure 2. Representation of the Neyman-Scott Rectangular Pulses (NSRP) stochastic process for at-site rainfall modeling: (a) occurrences of storms (Equation (1)); (b) for each storm, determination of the number of pulses (Equation (2)) and their temporal occurrences (Equation (3)); (c) estimation of intensity and duration for all the pulses (Equations (4) and (5)); (d) calculation of the total intensity at time t, as the sum of the intensities related to the bursts active at time t.
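The generation steps (a)-(d) above can be sketched as follows, under illustrative assumptions: the parameter values, time units (hours) and function names are not from the paper, and the geometric number of bursts C = M − 1 is sampled by the standard inverse-transform formula:

```python
import math
import random

def generate_storms(lam, theta, beta_w, beta_d, beta_i, horizon, rng):
    """Generate NSRP storms with rectangular bursts inside [0, horizon] hours."""
    storms, t = [], 0.0
    p = 1.0 / theta                       # success probability so E[C] = theta - 1
    while True:
        t += rng.expovariate(lam)         # storm origin: exponential inter-arrival
        if t > horizon:
            break
        u = 1.0 - rng.random()            # u in (0, 1], avoids log(0)
        c = int(math.log(u) / math.log(1.0 - p))   # geometric C = M - 1
        bursts = []
        for _ in range(c + 1):            # at least one burst per storm
            w = rng.expovariate(beta_w)   # waiting time after the storm origin
            d = rng.expovariate(beta_d)   # rectangular burst duration
            i = rng.expovariate(beta_i)   # rectangular burst intensity
            bursts.append((t + w, d, i))
        storms.append(bursts)
    return storms

def total_intensity(storms, t):
    """Y(t): sum of the intensities of the bursts active at time t."""
    return sum(i for bursts in storms for (start, d, i) in bursts
               if start <= t < start + d)

rng = random.Random(42)
storms = generate_storms(lam=1 / 72, theta=4.0, beta_w=1 / 6,
                         beta_d=1 / 2, beta_i=1 / 3, horizon=24 * 365, rng=rng)
```

Aggregating Y(t) over intervals of width τ then yields the process R_j^(τ) from which AMR series are extracted.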
However, continuous high resolution datasets are typically very short (in general no more than 15-20 years, and hence not very suitable for obtaining robust estimations) or absent in many locations, where only AMR data are usually available with an adequate sample size.
Moreover, many studies remarked the inability of SRG like the NSRP model to reproduce extreme events, because they usually underestimate the rainfall heights at hourly and sub-hourly scales (e.g., [38]). For this reason, many variants of the model structure were proposed in the literature [35,39–44], but in any case the calibration was carried out by using continuous high-resolution datasets, and the improvement in the reconstruction of extreme events was not significant.
To overcome these critical aspects, the possibility to calibrate the SRG by using only AMR data was investigated in [27,28]; this choice clearly allowed for a better reconstruction of extreme events, which are usually of main interest in many hydrological studies.
Moreover, De Luca et al. [28] proposed a parsimonious modeling of seasonality, adopted in this work, for a basic version of an NSRP model. The goal is to avoid the over-parameterization that could occur using monthly or seasonal parameter sets.
In detail, since the information about the seasonality of the rainfall process during the year is lost when AMR series are used, a trigonometric series can be defined for any summary statistic of interest for NSRP (i.e., 1/λ, θ, 1/β_W, 1/β_D, 1/β_I):

p(t) = p_0 + Σ_{n=1}^{K} A_n cos(2π n t / T_y + ϕ_n), (7)

where p(t) is the summary statistic along the time t (expressed in min); p_0 is the mean value of p(t) over the whole year; K is the maximum number of trigonometric functions (also named harmonics) to be considered; n is the n-th harmonic function; T_y is the total number of minutes in the whole year (considered with 365 days); A_n corresponds to the amplitude for the n-th harmonic function; and ϕ_n corresponds to the phase shift for the n-th harmonic function. The proposed approach is very flexible, because the user can choose to use Equation (7) only for some summary statistics (the remaining ones being assumed constant in time) or for all of them. Adoption of Equation (7) implies the estimation of 1 + 2K parameters for each summary statistic, i.e., the annual mean value and the K couples of amplitude and phase shift.
Under the assumption that the seasonal variation regards all five summary statistics, the proposed SRG is characterized by 15 parameters if K = 1 for all of them, 25 parameters if K = 2, 35 parameters if K = 3, and so on. Obviously, K can differ from one summary statistic to another. Moreover, it can be noted that the number of requested parameters makes this SRG parsimonious with respect to a version in which 12 monthly sets should be estimated (i.e., 60 parameters).
In this work, as also specified in Section 3, for the selected case study it was sufficient to adopt for 1/λ a seasonal variation with only one harmonic, i.e., K = 1 in Equation (7), while the other summary statistics were considered as constant along the year, i.e., K = 0 in Equation (7).
Thus, Equation (7) was applied as follows for 1/λ:

1/λ(t) = 1/λ_0 + (1/λ_0 − 1/λ_min) cos(2π t / T_y + ϕ_1,λ), (8)

where 1/λ_0 is the annual mean value of the summary statistic (i.e., p_0 in Equation (7)); 1/λ_min is the minimum value of the summary statistic along the year; and ϕ_1,λ is the phase shift for the harmonic function.
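The single-harmonic modulation of 1/λ can be sketched as follows. The cosine form with amplitude 1/λ_0 − 1/λ_min is an assumption of this sketch, chosen because, for ϕ_1,λ = π, it places the minimum near t = 0 and t = T_y (winter) and the maximum at mid-year, consistently with the calibration results reported in Section 3:

```python
import math

T_Y = 365 * 24 * 60  # total number of minutes in a 365-day year

def inv_lambda(t_min, inv_l0, inv_lmin, phi):
    """Seasonal mean storm inter-arrival time 1/lambda(t), single harmonic (K = 1).

    Assumed form: inv_l0 + (inv_l0 - inv_lmin) * cos(2*pi*t/T_y + phi).
    Units of inv_l0 and inv_lmin are arbitrary (days here) for this sketch.
    """
    return inv_l0 + (inv_l0 - inv_lmin) * math.cos(2 * math.pi * t_min / T_Y + phi)

# With phi = pi, inv_l0 = 3.5 days and inv_lmin = 3 days (inside the
# [2 days; 5 days] calibration range): minimum in winter, maximum at mid-year.
winter = inv_lambda(0, 3.5, 3.0, math.pi)        # → 3.0 days (minimum)
summer = inv_lambda(T_Y / 2, 3.5, 3.0, math.pi)  # → 4.0 days (maximum)
```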
In the next subsections, the methodologies about calibration of the stationary model and the subsequent trend modeling are discussed.

Calibration Methodology Without Parameter Trend
De Luca and Galasso [27] proposed a calibration methodology for a stationary SRG when the sample size of AMR series is sufficiently long for sub-hourly resolutions. In their work, the Authors did not model the seasonality. Although such a hypothesis could seem unrealistic, it is coherent with extreme value theory [45], in which the assumed probability distributions are derived by considering, inside a fixed temporal block (usually a year), the sequence of values (from which the maximum is extracted) as independent and identically distributed random variables, i.e., without any seasonality feature.
An improvement of this calibration methodology, adopted in the present study, was used in De Luca et al. [28]. Starting from the assigned ranges of variation for each considered parameter, a pure random generation of 10,000 parameter sets for the NSRP model was carried out. Then, De Luca et al. [28] selected (for the successive elaborations) the set that optimizes an objective function, i.e., that minimizes the differences between the observed and modeled statistics of the investigated variables. As is well known in the literature [46], the iterations of a pure random search are completely independent and never use previous information to affect the search strategy. Although this methodology can be time-consuming, it clearly allows for investigating the whole hypercube of the parameter space, and the performance related to many local minima, with a very simple algorithm.
In particular, for this work, the ranges of variation for the NSRP parameters were fixed following [37,47]; moreover, the range [2 days; 5 days] was adopted for 1/λ_min and [0, 2π] for ϕ_1,λ (see Equation (8)). Then, ten thousand 7-parameter sets can be randomly generated and, for each hypothesized parameter set, a single 500-year realization of continuous 1 min rainfall heights can be reproduced with the usual Monte Carlo techniques [48]. The 500-year rainfall time series is adopted according to the ergodicity property [49,50] of a stationary process, for which the statistical properties of a single very long stationary series are equal to those calculated, for each time interval, from multiple realizations; thus, for a stationary process, generating a single very long time series provides the same information as the analysis of multiple shorter realizations. Consequently, AMR and annual synthetic series can be extracted from a single long 1 min series and compared with the sample ones. The Authors chose here to calibrate the stationary model by considering the set that optimizes the reproduction of the average values of hourly and multi-hourly AMR, plus the Mean Annual Precipitation (MAP). Then, the validation is carried out by comparing the following: (i) sample and theoretical Amount-Duration-Frequency (ADF) curves for hourly and multi-hourly rainfall heights; (ii) sample and theoretical average values of the sub-hourly AMR series.
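The pure random search can be sketched as follows. The parameter ranges and the `simulated_stats` stand-in are illustrative assumptions of this sketch; in the actual methodology, evaluating each candidate set requires generating a 500-year 1 min series and extracting its AMR statistics:

```python
import random

RANGES = {                               # illustrative ranges, not the paper's
    "inv_lambda_0":   (2.0, 5.0),        # days
    "inv_lambda_min": (2.0, 5.0),        # days (see Equation (8))
    "phi":            (0.0, 6.2832),     # phase shift in [0, 2*pi]
    "theta":          (2.0, 10.0),
    "beta_w":         (0.1, 2.0),
    "beta_d":         (0.5, 4.0),
    "beta_i":         (0.2, 3.0),
}

def simulated_stats(params):
    """Stand-in for the statistics of a long synthetic series (illustrative)."""
    return {"mean_amr_1h": 10.0 * params["beta_i"],
            "map": 200.0 * params["theta"]}

def objective(params, targets):
    """Sum of squared normalized residuals between modeled and target statistics."""
    sim = simulated_stats(params)
    return sum(((sim[k] - v) / v) ** 2 for k, v in targets.items())

def random_search(targets, n_iter=10_000, seed=1):
    rng = random.Random(seed)
    best, best_score = None, float("inf")
    for _ in range(n_iter):              # fully independent draws
        cand = {k: rng.uniform(lo, hi) for k, (lo, hi) in RANGES.items()}
        score = objective(cand, targets)
        if score < best_score:
            best, best_score = cand, score
    return best, best_score

best, score = random_search({"mean_amr_1h": 25.0, "map": 900.0})
```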

Methodology for Parameter Trend
After the model calibration, some parameter trends, i.e., assumed functions Θ(t), can be hypothesized. Clearly, a further source of uncertainty is introduced, related to the assumed function linking the parameters and the covariate t [29]. In fact, all the successive evaluations depend on the specific mathematical expression for Θ(t) and on the assumption that it remains valid for the entire future investigated period.
In this context, several expressions could be assumed [50]: linear, quadratic, regime shifts, sequences of different forms, etc. In this work, we assumed the following linear expression:

p(m) = p(0) + α_p · m, (9)

where p(0) is the value of the summary statistic at the beginning of the trend (i.e., it corresponds to the value of the stationary process), p(m) is its value after m years, and α_p is the rate of increase/decrease in one year.
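Assuming a linear form p(m) = p(0) + α_p·m, scenario rates expressed as relative changes (e.g., a 50% increase in 100 years, as adopted later for some statistics) translate into α_p as sketched below; names and values are illustrative:

```python
def trended_stat(p0, alpha_p, m):
    """Linear trend: value of the summary statistic after m years."""
    return p0 + alpha_p * m

def rate_for_relative_change(p0, total_change, horizon=100):
    """alpha_p producing, e.g., a +50% change (total_change = 0.5) in `horizon` years."""
    return total_change * p0 / horizon

p0 = 3.0                                      # illustrative stationary 1/beta_I value
alpha = rate_for_relative_change(p0, +0.50)   # +50% in 100 years
after_100 = trended_stat(p0, alpha, 100)      # → 4.5
```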

Scenarios from RCMs
From projections of RCM for Europe, higher rainfall intensities and longer dry periods are generally expected in the future [2,51,52]. In particular, for each adopted RCM, four scenarios of Representative Concentration Pathways (RCP) of total radiative forcing (i.e., the cumulative measure of human emissions of greenhouse gases from all sources, expressed in W/m²) are commonly used as input: RCP 2.6, RCP 4.5, RCP 6 and RCP 8.5. Moreover, an ensemble average projection from all the RCMs is also derived for each RCP. The obtained outputs show an increase of daily precipitation in many areas in the winter season, up to 35% during the 21st century, while a decrease is expected in the summer season, particularly in southern and southwestern areas [53].
Considering Italy, and based on the publication of the Italian Institute for Environmental Protection and Research (ISPRA) [54], which focused on RCP 4.5 (intermediate emissions) and RCP 8.5 (high emissions) with respect to the reference period 1971-2000, the results of the simulations related to a future period until 2090 are as follows: • A decrease of the annual cumulative precipitation value; in detail, the ensemble mean reduction is 13 mm for RCP 4.5 and 71 mm for RCP 8.5; • A modest increase of the annual maximum daily rainfall; the ensemble mean is, for both RCP 4.5 and RCP 8.5, an increase of 5-7 mm; • A significant increase of the waiting time between two consecutive rainfall events; the ensemble mean is 8 days for RCP 4.5, and 16 days for RCP 8.5.
All of these ensemble results, averaged over the whole Italian territory, were also referred to the spatial cell comprising the Viterbo rain gauge in Central Italy.

Scenarios Analysis from the Adopted Transient SRG
As already mentioned in the Introduction, an in-depth analysis of perturbed synthetic data, obtained from the application of the proposed variant of the NSRP model, was carried out.
In detail, 500 realizations, each one spanning 101 years of continuous 1 min rainfall heights, were generated with the usual Monte Carlo techniques [48]. The first generated year, denoted as "0", has the features of the calibrated stationary process, while the successive years are characterized by the assumed parameter trends. In this case, the analysis of multiple realizations is necessary because the process is not assumed to be stationary in this step; hence, the investigation of a single long time series is not possible, as the ergodicity property no longer holds.
Then, the associated sub-hourly and hourly AMR series were calculated for each realization, and a comprehensive trend detection was structured, by using the Mann-Kendall (MK) test and the evaluation of Hazard variation, which are briefly described in the following subsections.

The Mann-Kendall Test
The Mann-Kendall (MK) test is a non-parametric test for identifying trends in time-series data and, as previously mentioned, has been recommended widely by the World Meteorological Organization for public application [10]. The test considers the relative magnitudes of sample data rather than the data values themselves, and it does not require the data to follow any particular probability distribution.
In detail, the MK statistic, S, is defined as follows:

S = Σ_{i=1}^{n−1} Σ_{j=i+1}^{n} sign(x_j − x_i), (10)

where n corresponds to the sample size, x_i and x_j are two generic sequential data values (with i = 1, 2, . . . , n − 1, and j = i + 1, i + 2, . . . , n), and the function sign(x_j − x_i) assumes the following values:

sign(x_j − x_i) = +1 if x_j − x_i > 0; 0 if x_j − x_i = 0; −1 if x_j − x_i < 0. (11)

S is approximately normally distributed with mean E[S] equal to 0 and variance V(S) computed as follows:

V(S) = [n(n − 1)(2n + 5) − Σ_{k=1}^{m} t_k(t_k − 1)(2t_k + 5)] / 18. (12)

In Equation (12), n is the number of data, m is the number of tied groups and t_k is the number of ties of extent k; a tied group is a set of sample data having the same value. The standard normal test statistic Z_S is then considered, which is computed for n > 10 as follows:

Z_S = (S − 1)/√V(S) if S > 0; 0 if S = 0; (S + 1)/√V(S) if S < 0. (13)

Z_S > 0 can indicate increasing trends, while Z_S < 0 can indicate decreasing trends. Moreover, once a specific significance level α is chosen and the theoretical quantile Z_{1−α/2} is calculated, the null hypothesis of no trend is rejected when |Z_S| > Z_{1−α/2}. For the frequently adopted value α = 0.05, the null hypothesis of no trend is rejected when |Z_S| > 1.96.
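The test can be implemented directly from these formulas; a minimal sketch, with the tie correction and the continuity-corrected Z_S:

```python
import math
from collections import Counter

def mann_kendall(x):
    """Mann-Kendall trend test: returns (S, Z_S) for a series x (n > 10)."""
    n = len(x)
    # S statistic: sum of signs over all ordered pairs (Equation-style (10)-(11))
    s = 0
    for i in range(n - 1):
        for j in range(i + 1, n):
            diff = x[j] - x[i]
            s += (diff > 0) - (diff < 0)
    # Variance with the tie correction
    ties = Counter(x).values()
    var_s = (n * (n - 1) * (2 * n + 5)
             - sum(t * (t - 1) * (2 * t + 5) for t in ties)) / 18.0
    # Continuity-corrected standard normal statistic
    if s > 0:
        z = (s - 1) / math.sqrt(var_s)
    elif s < 0:
        z = (s + 1) / math.sqrt(var_s)
    else:
        z = 0.0
    return s, z

# A strictly increasing 20-value series: S = 190, Z_S ≈ 6.13 > 1.96, so the
# null hypothesis of no trend is rejected at the 0.05 significance level.
s, z = mann_kendall(list(range(20)))
```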

Hazard Variation
As is well-known in the risk-analysis literature [55,56], Hazard, H, is associated to a prefixed value R* of a variable R (hydrological or not), and it corresponds to the probability that the event R > R* occurs at least one time in N years (where N is, for example, the design life of a hydraulic structure). In mathematical terms, Hazard is expressed as follows:

H = 1 − Π_{i=1}^{N} P_i[R ≤ R*], (14)

where P_i[R ≤ R*] is the probability associated to no exceedance along the i-th year (i = 1, . . . , N).
Obviously, we can come to the following conclusions:
• For stationary processes, P_i[R ≤ R*] assumes a constant value P[R ≤ R*], and then Equation (14) can be rewritten as follows:

H = 1 − (P[R ≤ R*])^N; (15)

• For nonstationary processes, P_i[R ≤ R*] depends on the time variation of the NSRP parameter set Θ.
Although analytical expressions of the probability distributions of AMR series from the NSRP model do not exist, they can be approximated with an EV1 distribution [57], which also provides a good fit for the AMR sample data in this study (Section 3):

P_i[R ≤ R*] = exp{−exp[−(R* − ε_i)/α_i]}, (16)

where α_i > 0, ε_i > 0 for the i-th year, and R* is the reference value for AMR at a specific time resolution. Consequently, Equation (14) can be rewritten as follows:

H = 1 − Π_{i=1}^{N} exp{−exp[−(R* − ε_i)/α_i]}. (17)

In this context, along the 100-year period of NSRP synthetic data generation (excluding the 0-th year), N is assumed as a moving time window (with obviously N < 100 years), and then the hazard evolution in time, considering different "starts" s, can be calculated as follows:

H(s) = 1 − Π_{i=s}^{s+N−1} exp{−exp[−(R* − ε_i)/α_i]}. (18)

Starting from the NSRP data series, the adopted procedure for hazard analysis can be schematized in the following steps for each timescale:
1. The EV1 parameters α_i and ε_i are estimated for each year i of the simulated period;
2. The possibility of using regression formulas for both α_i and ε_i along the 100-year period is investigated, in order to use these formulas in Equation (18);
3. R* is assumed to be a quantile related to a specific return period, T (e.g., T = 100, 200 years), of the stationary process.
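The moving-window hazard computation can be sketched as follows; the EV1 (Gumbel) parameterization with location ε_i and scale α_i, and all numerical values, are assumptions of this sketch:

```python
import math

def ev1_cdf(r, eps, alpha):
    """EV1 (Gumbel) non-exceedance probability with location eps and scale alpha.
    This parameterization is an assumption of the sketch."""
    return math.exp(-math.exp(-(r - eps) / alpha))

def hazard(r_star, eps_by_year, alpha_by_year, start, n_years):
    """H over the window [start, start + n_years): probability that R > r_star
    occurs at least once, i.e., 1 minus the product of yearly non-exceedances."""
    prod = 1.0
    for i in range(start, start + n_years):
        prod *= ev1_cdf(r_star, eps_by_year[i], alpha_by_year[i])
    return 1.0 - prod

# Stationary check: identical parameters every year reduce H to 1 - F**N
eps = [40.0] * 100        # illustrative EV1 location (mm), constant in time
alp = [8.0] * 100         # illustrative EV1 scale (mm), constant in time
h = hazard(60.0, eps, alp, start=0, n_years=50)
f = ev1_cdf(60.0, 40.0, 8.0)
# for the stationary case, h equals 1 - f**50 (Equation-style (15))
```

With yearly EV1 parameters that drift upward, the same function yields an increasing hazard as the window slides forward.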

NSRP Calibration and Validation of the Stationary Version
First of all, the available datasets for the Viterbo rain gauge were analyzed in terms of the goodness-of-fit of EV1 distributions for the AMR series; EV1 distributions constitute a good approximation for AMR from NSRP synthetic data [27], as also remarked in Section 2.4.2. Consequently, a satisfactory EV1 fitting can clearly support the adoption of an NSRP model for the selected case study.
At this step of the analysis, sub-hourly AMR series were not investigated, since the sample covers a limited time span (the data have been observed only since 1994), and the trend evaluation would not have been reliable.
Chronological histograms for the investigated time series are reported in Figure 3. Results from the MK test are schematized in Table 1; the null hypothesis of no trend until 2015 cannot be rejected at the 0.05 significance level for any of the series, as Z_S is less than 1.96 for all the samples, so the whole dataset can be used for the calibration of the stationary model.
The EV1 probabilistic plots in Figure 4 show the EV1 goodness-of-fit for the hourly AMR sample series, hence confirming the possibility to use the NSRP model.
As a second step, the calibration of the stationary model was carried out as specified in Section 2.2.1, and the results are shown in Table 2. It can be noted that ϕ_1,λ assumes a value equal to π, which means that the mean waiting time 1/λ(t) assumes its minimum value (1/λ_min) when t is very close to 0 and T_y (i.e., in the winter period), while its maximum value occurs during the summer season. This result is coherent with the climatology of Central and Southern Italy [58].
Table 2. Calibration results of the stationary version of the NSRP model for the Viterbo rain gauge.
The comparisons among the average values for sample and NSRP AMR series (also including the sub-hourly series, used for validation) and the Annual Precipitation are reported in Table 3.
Moreover, for validation, the comparisons between the Amount-Duration-Frequency (ADF) curves obtained from the sample AMR series and those derived from the simulated continuous NSRP process were also carried out. Both sample and NSRP curves were calculated with the common two-parameter ADF formula:

AMR T (d) = a T · d^(n T) (19)

where AMR T (d) is the annual maximum cumulative rainfall height (mm), d is the rainfall duration (h), and a T (mm/h) and n T (-) are the ADF parameters related to the return period T, estimated from rain gauge observations. Table 4 reports the estimates of a T and n T at prefixed values of the return period T for both cases. From Figure 5, by applying Equation (19) with the parameters in Table 4, we can observe a difference of about 2 mm for T = 20 years, about 0.9 mm for T = 50 years, about 0.1 mm for T = 100 years and about 0.6 mm for T = 200 years.

Table 4. Viterbo rain gauge: sample and NSRP estimates of the Amount-Duration-Frequency (ADF) curve parameters a T (mm/h) and n T (-). T is the return period (y).
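As an illustration of Equation (19), the following sketch compares a sample-based and an NSRP-based ADF curve over several durations. The parameter values are hypothetical placeholders, not the calibrated estimates of Table 4:

```python
def adf_amount(d_hours, a_T, n_T):
    """Two-parameter ADF curve of Equation (19): AMR_T(d) = a_T * d**n_T.

    d_hours : rainfall duration (h)
    a_T     : scale parameter (mm/h) for return period T
    n_T     : dimensionless exponent for return period T
    Returns the annual maximum cumulative rainfall height (mm).
    """
    return a_T * d_hours ** n_T

# Hypothetical parameter pairs (NOT the values of Table 4), e.g. for T = 100 y.
sample_params = {"a_T": 35.0, "n_T": 0.30}
nsrp_params   = {"a_T": 34.5, "n_T": 0.31}

for d in (1, 3, 6, 12, 24):
    h_sample = adf_amount(d, **sample_params)
    h_nsrp   = adf_amount(d, **nsrp_params)
    print(f"d = {d:2d} h  sample = {h_sample:6.1f} mm  "
          f"NSRP = {h_nsrp:6.1f} mm  diff = {h_sample - h_nsrp:+.1f} mm")
```

The small differences between the two curves mimic the few-millimeter discrepancies reported in the text for Figure 5.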

Results of Transient Version for the Proposed NSRP Model
As already mentioned in the Introduction, the NSRP parameters are strongly related to the finest scales, while RCM predictions are only available at coarser resolutions. Thus, several trends in the NSRP parameters were hypothesized, and the Authors retained those that provided results compatible with the RCM scenarios, in terms of variation of maximum daily rainfall and annual precipitation. In detail, the adopted scenario combines the following hypotheses:
• A linear increasing trend of 50% in 100 years for the mean value of the Burst Intensity (i.e., 1/β I);
• A linear decreasing trend of 25% in 100 years for the mean value of the Burst Duration (i.e., 1/β D);
• A linear increasing trend of 50% in 100 years for the mean waiting time between two consecutive storms (i.e., 1/λ 0).
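The three hypothesized linear trends can be sketched as follows; the initial mean values are hypothetical placeholders, not the calibrated NSRP parameters of Table 2:

```python
def linear_trend(value_0, frac_change_100y, t_years):
    """Linearly vary a parameter mean from its initial value, reaching
    value_0 * (1 + frac_change_100y) after 100 years."""
    return value_0 * (1.0 + frac_change_100y * t_years / 100.0)

# Hypothetical initial mean values (illustrative only):
mean_intensity_0 = 2.0    # 1/beta_I, mean burst intensity (mm/h)
mean_duration_0  = 0.5    # 1/beta_D, mean burst duration (h)
mean_waiting_0   = 120.0  # 1/lambda_0, mean waiting time between storms (h)

for t in (0, 25, 50, 100):
    i_t = linear_trend(mean_intensity_0, +0.50, t)  # +50% in 100 years
    d_t = linear_trend(mean_duration_0,  -0.25, t)  # -25% in 100 years
    w_t = linear_trend(mean_waiting_0,   +0.50, t)  # +50% in 100 years
    print(f"t = {t:3d} y  1/beta_I = {i_t:.2f} mm/h  "
          f"1/beta_D = {d_t:.3f} h  1/lambda = {w_t:.1f} h")
```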
These trends produced a marked variation, in terms of frequency distribution, for the AMR series at the finer resolutions (5 and 15 min) and for the Annual Precipitation, while the AMR series at coarser scales appear less influenced by the imposed trends. Some comparisons are shown in Figure 6, in which the frequency distributions for the initial year (t0) and for the 25th (t25), 50th (t50) and 100th (t100) years are represented.
Focusing on the Annual Precipitation (AP) and the 24-h AMR series, the temporal evolution of their average values (calculated, for each year, from the corresponding 500 realizations), compatible with the RCM projections, was characterized by the following (Table 5):

1. A mean reduction of 82.5 mm in 100 years obtained for AP (well-matched with the 71 mm in 90 years from RCP 8.5);
2. A slight increase for the 24-h AMR, of about 4 mm in 100 years (the ensemble mean is, for both RCP 4.5 and RCP 8.5, up to 5-7 mm in 90 years for the daily duration).
Table 5. Evolution of average values for Annual Maximum Rainfall (AMR) and of Mean Annual Precipitation (MAP): from now (t0) to 25 (t25), 50 (t50), 75 (t75) and 100 (t100) years.

Moreover, from Table 5, it can be noted that there is an increase in the mean values of about 37% (5-min AMR), 28% (15-min AMR) and 17% (30-min AMR) in 100 years. These differences are lower and less evident for hourly timescales; indeed, the increase in the average values is about 5-7% in 100 years.

The results can be justified by the assumed trend scenario:
• An increase in burst intensity induces a clear effect (i.e., an expected increase in rainfall height) mainly at the finer time resolutions (5-30 min), which are not much influenced by the contemporary reduction of burst duration.
• On the contrary, at coarser resolutions (from 1 h), the simultaneous increase in intensity and reduction in burst duration produce a sort of balance for the rainfall heights, and it is therefore not possible to highlight a significant trend for the AMR series.
• The increase of the average waiting time between two consecutive storms mainly drives the reduction of annual precipitation, as expected from the RCM projections.
This "scale effect", i.e., the existence of particular time resolutions at which the assumed temporal changes in the parameters can be "hidden" when AMR series are studied, is also confirmed by analyzing the NSRP realizations with the MK test.
The histograms in Figure 7 indicate, for each investigated time resolution, the percentage of realizations with Z S > 1.96. The percentages decrease when passing from the 5-min to the 1-h time resolution, and are always less than 10% for the hourly and multi-hourly synthetic AMR data. Concerning AP, the percentage with Z S > 1.96 is about 30%.
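A minimal implementation of the MK statistic used in these counts (without the correction for tied values) can be sketched as follows:

```python
import math

def mann_kendall_z(x):
    """Mann-Kendall Z_S statistic for a series x (tie correction omitted)."""
    n = len(x)
    # S statistic: sum of signs over all ordered pairs (i < j)
    s = sum(
        (x[j] > x[i]) - (x[j] < x[i])
        for i in range(n - 1)
        for j in range(i + 1, n)
    )
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / math.sqrt(var_s)
    if s < 0:
        return (s + 1) / math.sqrt(var_s)
    return 0.0

# A strongly increasing synthetic series is flagged as trended
# (|Z_S| > 1.96 at the 0.05 significance level).
increasing = [float(i) for i in range(30)]
print(mann_kendall_z(increasing))  # well above 1.96
```

Counting, over the 500 realizations at each resolution, how many series yield Z S > 1.96 reproduces the percentages shown in Figure 7.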
The results from the MK test highlight the importance of analyses with a SRG, and in particular with a large number of simulations. In fact, if a user considered only one realization for each of the investigated scales, he could obtain misleading assessments concerning the presence or absence of a trend at specific time resolutions. For example, if the considered 3-h AMR series belongs to the subset with Z S > 1.96, then the null hypothesis of no trend is rejected; on the contrary, using a 15-min AMR series with Z S < 1.96 would imply that the null hypothesis is not rejected. Similar considerations apply if the MK test (or a similar test) is carried out only on the observed series of a selected case study. An overall analysis based on a large number of simulations therefore allows for a much more in-depth investigation.
Concerning the variation of the T-year percentiles and of the Hazard (Section 2.4.2), the possibility of applying regression formulas to the estimated EV1 parameters along the 100-year period was investigated. The coefficients of the linear regressions are shown in Table 6, together with the R 2 values, which are greater than 0.5, so that the obtained regressions can be considered acceptable [59]. As examples, the plots for the 5- and 15-min EV1 parameters are illustrated in Figure 8.

Table 6. Estimation of the linear regression coefficients (m is the angular coefficient, q is the intercept and R 2 is the coefficient of determination) for the time evolution of the EV1 parameters.

Then, the regression laws were used for quantifying the temporal variation of the following:
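The combination of the linear regression laws with the EV1 quantile function can be sketched as follows; the regression coefficients and parameter values are hypothetical, not those of Table 6:

```python
import math

def ev1_quantile(T, location, scale):
    """EV1 (Gumbel) T-year quantile, from F(x) = exp(-exp(-(x - u)/alpha))."""
    y_T = -math.log(-math.log(1.0 - 1.0 / T))  # reduced Gumbel variate
    return location + scale * y_T

def linear_law(t, m, q):
    """Linear regression law p(t) = m*t + q for an EV1 parameter."""
    return m * t + q

# Hypothetical regression coefficients (NOT those of Table 6), used only to
# illustrate how a T-year quantile evolves along the 100-year horizon.
m_u, q_u = 0.05, 15.0   # location u(t), mm
m_a, q_a = 0.02, 5.0    # scale alpha(t), mm

for t in (0, 50, 100):
    u_t = linear_law(t, m_u, q_u)
    a_t = linear_law(t, m_a, q_a)
    print(f"t = {t:3d} y  u = {u_t:.2f} mm  alpha = {a_t:.2f} mm  "
          f"x(T=100) = {ev1_quantile(100, u_t, a_t):.1f} mm")
```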
1. The 100-year and 200-year AMR quantiles, with respect to the corresponding values of the stationary process (Figure 10);
2. Hazard values, associated with a 20-year moving window and related to the 100-year and 200-year quantiles of the stationary case.
The increases in the quantiles are significant for the finer resolutions, while they are relatively more limited, but still not negligible, for the coarser scales. In detail, the increases of the 24-h quantiles are 11-12% with respect to the stationary case, while they exceed 40% and 35% for the 5- and 15-min scales, respectively.
Thus, although the variations of the investigated 24-h percentiles are relatively modest, the associated hazards at least double when compared with the values assumed at the beginning of the time horizon.
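The hazard computation over a moving window, with EV1 parameters evolving according to linear laws, can be sketched as follows; all numerical values are hypothetical placeholders chosen only to illustrate how the hazard of a fixed stationary quantile grows along the horizon:

```python
import math

def ev1_cdf(x, location, scale):
    """EV1 (Gumbel) CDF."""
    return math.exp(-math.exp(-(x - location) / scale))

def hazard(x_star, years, location_fn, scale_fn):
    """Probability that the fixed threshold x_star is exceeded at least once
    over a moving window of given years, with time-varying EV1 parameters."""
    prob_no_exceed = 1.0
    for t in years:
        p_t = 1.0 - ev1_cdf(x_star, location_fn(t), scale_fn(t))
        prob_no_exceed *= 1.0 - p_t
    return 1.0 - prob_no_exceed

# Hypothetical linear laws for the EV1 parameters (illustrative values only).
def u(t):
    return 15.0 + 0.05 * t   # location (mm)

def a(t):
    return 5.0 + 0.02 * t    # scale (mm)

# x* = stationary 100-year quantile at t = 0
x_star = 15.0 - 5.0 * math.log(-math.log(1.0 - 1.0 / 100))

h_start = hazard(x_star, range(0, 20), u, a)     # first 20-year window
h_end   = hazard(x_star, range(80, 100), u, a)   # last 20-year window
print(h_start, h_end)  # the hazard grows along the horizon
```

With constant parameters the function reduces to the classical stationary expression H = 1 - (1 - 1/T)^n, which provides a simple sanity check.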
These results highlight the importance of investigating several aspects before choosing between a stationary and a non-stationary model.
However, in a climate change context, the difference between the terms "stationarity" and "change" should be remarked, as they are not mutually exclusive [49]. Some very popular examples can be mentioned [49]. For instance, in the absence of an external force, the position of a body in motion changes in time, but its velocity is unchanged (Newton's first law). If a constant force is present, then the velocity changes, but the acceleration is constant (Newton's second law). If the force changes, e.g., the gravitational force with changing distance in planetary motion, the acceleration is no longer constant, but other invariant properties emerge, e.g., the angular momentum (Newton's law of gravitation).
Non-stationarity is usually considered as synonymous with change; however, change is a general notion applicable everywhere, including the real (material) world, while stationarity and non-stationarity apply only to models, and not to the real world. Thus, environmental changes can also be modeled with stationary approaches [60], and only in justified cases with non-stationary approaches.
In this specific case study, related to the Viterbo rain gauge in Central Italy, the trends imposed on β I, β D and λ imply the need for non-stationary approaches at the finer time resolutions, which are comparable with the burst duration, and at the annual aggregation (i.e., for MAP evaluation). This is because a change in the inter-arrival time between two consecutive storms firstly influences the monthly and yearly scales (Figure 6).
On the contrary, a stationary modeling could be used for the other investigated resolutions, as the temporal variation of their probability distributions does not appear significant. However, a complete analysis, including the investigation of hazard and high percentiles, makes the use of a non-stationary approach more plausible also at the coarser time resolutions.

Conclusions
The results of this work shed new light on the analysis of climate change effects on rainfall series. The main goal of the applied methodology is to understand whether significant trends depend on the investigated timescales. The crucial role in the proposed methodology is played by the Stochastic Rainfall Generator (SRG), which allows users to carry out a very in-depth investigation.
Different sources of uncertainty can arise from several factors. First, the intrinsic model uncertainty due to the structure of a SRG should be carefully investigated. Second, a further uncertainty lies in the estimation of the SRG parameters. Third, the reliability of the RCM projections, and the assumption of their validity at point scales, are questionable. Fourth, there is the uncertainty due to the adopted mathematical expression for the parameters' trends, and to the assumption that such an expression remains valid for the entire investigated future period. Nevertheless, the obtained results highlighted that the imposed parameter trends could imply an "apparent stationarity" of the process at prefixed time resolutions and, therefore, that stationary models could be employed if a user focuses only on some aspects. In fact, for the selected case study, if the AMR probability distributions derived from the simulations with a transient SRG are analyzed, the associated variations at the coarser resolutions are not so significant. This would suggest that the use of a stationary model is suitable. Conversely, a more detailed study of high percentiles and hazard leads to reconsidering a non-stationary approach for all the timescales.