Open Access Article
Stochastic Period and Cohort Effect State-Space Mortality Models Incorporating Demographic Factors via Probabilistic Robust Principal Components
Risks 2017, 5(3), 42; doi:10.3390/risks5030042
Abstract
In this study we develop a multi-factor extension of the family of Lee-Carter stochastic mortality models. We build upon the time, period and cohort stochastic model structure to extend it to include exogenous observable demographic features that can be used as additional factors to improve model fit and forecasting accuracy. We develop a dimension reduction feature extraction framework which (a) employs projection-based techniques of dimensionality reduction; in doing this we also develop (b) a robust feature extraction framework that is amenable to different structures of demographic data; (c) we analyse demographic data sets for patterns of missingness and the impact of such missingness on the feature extraction; (d) we introduce a class of multi-factor stochastic mortality models incorporating time, period, cohort and demographic features, which we develop within a Bayesian state-space estimation framework; and finally (e) we develop an efficient combined Markov chain and filtering framework for sampling the posterior and forecasting. We undertake a detailed case study on Human Mortality Database demographic data from European countries, and we use the extracted features to better explain the term structure of mortality in the UK over time for male and female populations when compared to a pure Lee-Carter stochastic mortality model. This demonstrates that our feature extraction framework and the consequent multi-factor mortality model improve both in-sample fit and, importantly, out-of-sample mortality forecasts by a non-trivial gain in performance.
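For context, the Lee-Carter structure that this family of models extends, together with its standard cohort-augmented (age-period-cohort) variant, can be written schematically as (the paper's additional demographic factors are not shown):

$$\ln m_{x,t} = a_x + b_x\,\kappa_t + \varepsilon_{x,t}, \qquad \ln m_{x,t} = a_x + b_x\,\kappa_t + \beta_x\,\gamma_{t-x} + \varepsilon_{x,t},$$

where $m_{x,t}$ is the central death rate at age $x$ in year $t$, $\kappa_t$ is the period index (typically forecast as a random walk with drift) and $\gamma_{t-x}$ is the cohort effect.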
Open Access Article
Robust Estimation of Value-at-Risk through Distribution-Free and Parametric Approaches Using the Joint Severity and Frequency Model: Applications in Financial, Actuarial, and Natural Calamities Domains
Risks 2017, 5(3), 41; doi:10.3390/risks5030041
Abstract
Value-at-Risk (VaR) is a well-accepted risk metric in modern quantitative risk management (QRM). The classical Monte Carlo simulation (MCS) approach, denoted henceforth as the classical approach, assumes the independence of loss severity and loss frequency. In practice, this assumption does not always hold true. Through mathematical analyses, we show that the classical approach is prone to significant biases when the independence assumption is violated. This is also corroborated by studying both simulated and real-world datasets. To overcome the limitations and to more accurately estimate VaR, we develop and implement the following two approaches for VaR estimation: the data-driven partitioning of frequency and severity (DPFS) using clustering analysis, and copula-based parametric modeling of frequency and severity (CPFS). These two approaches are verified using simulation experiments on synthetic data and validated on five publicly available datasets from diverse domains; namely, the financial indices data of the Standard & Poor's 500 and the Dow Jones Industrial Average, chemical loss spills as tracked by the US Coast Guard, Australian automobile accidents, and US hurricane losses. The classical approach estimates VaR inaccurately for 80% of the simulated datasets and for 60% of the real-world datasets studied in this work. Both the DPFS and the CPFS methodologies attain VaR estimates within 99% bootstrap confidence interval bounds for both simulated and real-world data. We provide a process flowchart for risk practitioners describing the steps for choosing between the DPFS and the CPFS methodology for VaR estimation in real-world loss datasets.
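As a concrete illustration of the classical approach the paper critiques, here is a minimal Monte Carlo sketch in which frequency and severity are sampled independently; the Poisson/lognormal choices and all parameter values are our own illustrative assumptions, not the paper's calibration:

```python
import numpy as np

def classical_var(lam, mu, sigma, alpha=0.99, n_sims=50_000, seed=0):
    """Classical MCS VaR: frequency and severity drawn independently.

    Poisson frequency and lognormal severity are assumed here purely
    for illustration of the independence assumption under critique.
    """
    rng = np.random.default_rng(seed)
    counts = rng.poisson(lam, size=n_sims)           # loss counts per period
    losses = np.array([
        rng.lognormal(mu, sigma, size=n).sum() if n > 0 else 0.0
        for n in counts
    ])                                                # aggregate loss per period
    return np.quantile(losses, alpha)                 # VaR at level alpha

print(classical_var(lam=5, mu=0.0, sigma=1.0))
```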
Open Access Feature Paper Article
The Class of (p,q)-spherical Distributions with an Extension of the Sector and Circle Number Functions
Risks 2017, 5(3), 40; doi:10.3390/risks5030040
Abstract
For evaluating the probabilities of arbitrary random events with respect to a given multivariate probability distribution, specific techniques are of great interest. An important two-dimensional high-risk limit law is the Gauss-exponential distribution, whose probabilities can be dealt with based on the Gauss–Laplace law. The latter is considered here as an element of the newly-introduced family of (p,q)-spherical distributions. Based on a suitably-defined non-Euclidean arc-length measure on (p,q)-circles, we prove geometric and stochastic representations of these distributions and of correspondingly distributed random vectors, respectively. These representations allow the new probability measures to be dealt with in much the same way as elliptically-contoured distributions and more general homogeneous star-shaped ones. This is demonstrated by a generalization of the Box–Muller simulation method. In passing, we prove an extension of the sector and circle number functions.
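For reference, the classical Box–Muller method whose generalization is discussed: from independent uniforms $U_1, U_2$ on $(0,1)$,

$$Z_1 = \sqrt{-2\ln U_1}\,\cos(2\pi U_2), \qquad Z_2 = \sqrt{-2\ln U_1}\,\sin(2\pi U_2)$$

are independent standard Gaussian variables; the paper's extension replaces the Euclidean circle underlying this construction with (p,q)-circles.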
Open Access Feature Paper Article
Stress Testing German Industry Sectors: Results from a Vine Copula Based Quantile Regression
Risks 2017, 5(3), 38; doi:10.3390/risks5030038
Abstract
Measuring interdependence between probabilities of default (PDs) in different industry sectors of an economy plays a crucial role in financial stress testing. To this end, regression approaches may be employed to model the impact of stressed industry sectors as covariates on other response sectors. We identify vine copula based quantile regression as an eligible tool for conducting such stress tests, as this method has good robustness properties, takes into account potential nonlinearities of conditional quantile functions and ensures that no quantile crossing effects occur. We illustrate its performance on a data set of sector-specific PDs for the German economy. Empirical results are provided for both a rough and a fine-grained industry sector classification scheme. Among other findings, we confirm that a stressed automobile industry has a severe impact on the German economy as a whole at different quantile levels, whereas, e.g., for a stressed financial sector the impact is rather moderate. Moreover, the vine copula based quantile regression approach is benchmarked against both classical linear quantile regression and expectile regression in order to illustrate its methodological effectiveness in the scenarios evaluated.
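For readers wanting a starting point, a sketch of the classical linear quantile regression benchmark mentioned above, using statsmodels on synthetic stand-in data (the sector PD series and any structure in them are hypothetical, not the paper's German data set):

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

# Hypothetical sector PD series standing in for the paper's data.
rng = np.random.default_rng(1)
pd_auto = rng.beta(2, 50, size=200)                  # stressed covariate sector
pd_response = 0.5 * pd_auto + rng.beta(2, 80, 200)   # response sector

X = sm.add_constant(pd_auto)
for q in (0.5, 0.9, 0.95):                           # quantile levels of interest
    fit = QuantReg(pd_response, X).fit(q=q)
    print(q, fit.params)                              # intercept and slope at level q
```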
Open Access Feature Paper Article
Valuation of Non-Life Liabilities from Claims Triangles
Risks 2017, 5(3), 39; doi:10.3390/risks5030039
Abstract
This paper provides a complete program for the valuation of aggregate non-life insurance liability cash flows based on claims triangle data. The valuation is fully consistent with the principle of valuation by considering the costs associated with a transfer of the liability to a so-called reference undertaking subject to capital requirements throughout the runoff of the liability cash flow. The valuation program includes complete details on parameter estimation, bias correction and conservative estimation of the value of the liability under partial information. The latter is based on a new approach to the estimation of mean squared error of claims reserve prediction.
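For orientation, claims-triangle predictors of the kind underlying such valuations are often built from chain-ladder development factors on cumulative claims $C_{i,j}$ (a standard textbook form, not the paper's full estimator with bias correction):

$$\hat f_j = \frac{\sum_i C_{i,j+1}}{\sum_i C_{i,j}}, \qquad \hat C_{i,J} = C_{i,J-i} \prod_{j=J-i}^{J-1} \hat f_j.$$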
Open Access Article
Bubbles, Blind-Spots and Brexit
Risks 2017, 5(3), 37; doi:10.3390/risks5030037
Abstract
In this paper we develop a well-established financial model to investigate whether bubbles were present in opinion polls and betting markets prior to the UK's vote on EU membership on 23 June 2016. The importance of our contribution is threefold. Firstly, our continuous-time model allows for irregularly spaced time series, a common feature of polling data. Secondly, we build on qualitative comparisons that are often made between market cycles and voting patterns. Thirdly, our approach is theoretically elegant. Where bubbles are found, we suggest a suitable adjustment. We find evidence of bubbles in polling data, suggesting that the polls systematically over-estimated the proportion voting to remain. In contrast, bookmakers' odds show none of this bubble-like over-confidence. However, implied probabilities from bookmakers' odds appear remarkably unresponsive to polling data that nonetheless indicated a close-fought vote.
Open Access Article
A Robust Approach to Hedging and Pricing in Imperfect Markets
Risks 2017, 5(3), 36; doi:10.3390/risks5030036
Abstract
This paper proposes a model-free approach to hedging and pricing in the presence of market imperfections such as market incompleteness and frictions. The generality of this framework allows us to conduct an in-depth theoretical analysis of hedging strategies with a wide family of risk measures and pricing rules, and study the conditions under which the hedging problem admits a solution and pricing is possible. The practical implications of our proposed theoretical approach are illustrated with an application on hedging economic risk.
Open Access Article
Implied Distributions from GBPUSD Risk-Reversals and Implication for Brexit Scenarios
Risks 2017, 5(3), 35; doi:10.3390/risks5030035
Abstract
Much of the debate around a potential British exit (Brexit) from the European Union has centred on the potential macroeconomic impact. In this paper, we instead focus on understanding market expectations for price action around the Brexit referendum date. Extracting implied distributions from the GBPUSD option volatility surface, we originally estimated, based on visual observation of the implied probability densities available up to 13 June 2016, that the market expected a vote to leave to move the GBPUSD exchange rate from 1.4390 (spot reference on 10 June 2016) down to a range of 1.10 to 1.30, i.e., a 10–25% decline, very probably with highly volatile price action. To quantify this more objectively, we construct a mixture model corresponding to two scenarios for the GBPUSD exchange rate after the referendum vote, one scenario for "remain" and one for "leave". Calibrating this model to four months of market data, from 24 February to 22 June 2016, we find that a "leave" vote was associated with a predicted devaluation of the British pound to approximately 1.37 USD per GBP, a 4.5% devaluation, quite consistent with the observed post-referendum move down from 1.4877 to 1.3622. We contrast the behaviour of the GBPUSD option market in the run-up to the Brexit vote with that during the 2014 Scottish Independence referendum, finding the potential impact of Brexit to be considerably higher.
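Schematically, a two-scenario mixture of the kind described takes the form (our notation, not the paper's exact parameterization):

$$f(x) = w\,f_{\text{leave}}(x) + (1-w)\,f_{\text{remain}}(x),$$

where $w$ is the market-implied probability of a leave vote and each component is a conditional post-referendum exchange-rate density.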
Open Access Article
Backtesting the Lee–Carter and the Cairns–Blake–Dowd Stochastic Mortality Models on Italian Death Rates
Risks 2017, 5(3), 34; doi:10.3390/risks5030034
Abstract
This work proposes a backtesting analysis comparing the Lee–Carter and the Cairns–Blake–Dowd mortality models on Italian data. The mortality data come from the Italian National Statistics Institute (ISTAT) database and span the period 1975–2014, over which we computed back-projections evaluating the performance of the models against real data. We propose three different backtest approaches, evaluating the goodness of short-run forecasts versus medium-length ones. We find that neither model was able to capture the mortality improvement shock observed for the male population over the analysed period. Moreover, the results suggest that CBD forecasts are reliable predominantly for ages above 75, and that LC forecasts are generally more accurate for these data.
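In their standard forms, the two models under comparison are

$$\text{LC:}\quad \ln m_{x,t} = a_x + b_x\,\kappa_t, \qquad \text{CBD:}\quad \operatorname{logit} q_{x,t} = \kappa_t^{(1)} + \kappa_t^{(2)}\,(x - \bar{x}),$$

where $m_{x,t}$ is the central death rate, $q_{x,t}$ the one-year death probability and $\bar{x}$ the average age in the fitting range.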
Open Access Article
Analyzing the Gaver–Lewis Pareto Process under an Extremal Perspective
Risks 2017, 5(3), 33; doi:10.3390/risks5030033
Abstract
Pareto processes are suitable for modelling stationary heavy-tailed data. Here, we consider the auto-regressive Gaver–Lewis Pareto process and present a study of its tail behavior. We characterize its local and long-range dependence. We show that consecutive observations are asymptotically tail independent, a feature that is often misevaluated by the most common extremal models and that is strongly relevant to tail inference; it also reveals clustering at "penultimate" levels. Since linear correlation may not exist in a heavy-tailed context, an alternative diagnostic tool is presented. The derived properties relate to the auto-regressive parameter of the process and provide estimators for it. A comparison of the proposals is conducted through simulation, and an application to a real dataset illustrates the procedure.
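Asymptotic tail independence of consecutive observations means that the upper tail dependence coefficient vanishes:

$$\lambda = \lim_{u \to 1^-} P\big(F(X_{n+1}) > u \,\big|\, F(X_n) > u\big) = 0,$$

where $F$ is the marginal distribution function; extremal models that presuppose $\lambda > 0$ will overstate joint extremes for such a process.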
Open Access Feature Paper Article
Effects of Gainsharing Provisions on the Selection of a Discount Rate for a Defined Benefit Pension Plan
Risks 2017, 5(2), 32; doi:10.3390/risks5020032
Abstract
This paper examines the effect of gainsharing provisions on the selection of a discount rate for a defined benefit pension plan. The paper uses a traditional actuarial approach of discounting liabilities using the expected return of the associated pension fund. A stochastic Excel model was developed to simulate the effect of varying investment returns on a pension fund with four asset classes. Lognormal distributions were fitted to the historical returns of two of the asset classes: large company stocks and long-term government bonds. A third lognormal distribution was designed to represent the investment returns of alternative investments, such as real estate and private equity. The fourth asset class represented short-term cash investments, and its return was held constant. The following variables were analyzed to determine the relative impact of gainsharing on the selection of a discount rate: hurdle rate, percentage of gainsharing, actuarial asset method smoothing period, and variations in asset allocation. A 50% gainsharing feature can reduce the discount rate for a defined benefit pension plan by 0.5% to more than 2.5%, depending on the gainsharing design and asset allocation.
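A minimal Python analogue of the simulation just described (all lognormal parameters, the asset allocation, the hurdle rate and the gainsharing rule below are placeholders, not the paper's fitted values or exact design):

```python
import numpy as np

rng = np.random.default_rng(42)
n_years, n_paths = 30, 10_000

# Net annual returns per asset class; parameters are illustrative only.
stocks = rng.lognormal(mean=0.10, sigma=0.20, size=(n_paths, n_years)) - 1
bonds  = rng.lognormal(mean=0.05, sigma=0.09, size=(n_paths, n_years)) - 1
alts   = rng.lognormal(mean=0.08, sigma=0.15, size=(n_paths, n_years)) - 1
cash   = np.full((n_paths, n_years), 0.03)           # held constant, as in the paper

weights = np.array([0.5, 0.3, 0.1, 0.1])             # example asset allocation
fund = weights[0]*stocks + weights[1]*bonds + weights[2]*alts + weights[3]*cash

hurdle, share = 0.07, 0.5                            # hurdle rate, 50% gainsharing
gain = share * np.maximum(fund - hurdle, 0.0)        # schematic: excess return shared out
print("mean net return retained:", (fund - gain).mean())
```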
Open Access Feature Paper Article
Actuarial Geometry
Risks 2017, 5(2), 31; doi:10.3390/risks5020031
Abstract
The literature on capital allocation is biased towards an asset modeling framework rather than an actuarial framework. The asset modeling framework leads to the proliferation of inappropriate assumptions about the effect of insurance line of business growth on aggregate loss distributions. This paper explains why an actuarial analog of the asset volume/return model should be based on a Lévy process. It discusses the impact of different loss models on marginal capital allocations. It shows that Lévy process-based models provide a better fit to the US statutory accounting data, and identifies how parameter risk scales with volume and increases with time. Finally, it shows the data suggest a surprising result regarding the form of insurance parameter risk.
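The volume-scaling argument can be anchored with a compound Poisson aggregate loss $A(x)$ at volume $x$, with expected claim count $\lambda x$ and severity moments $m_1, m_2$:

$$\mathbb{E}[A(x)] = \lambda x\,m_1, \quad \operatorname{Var}[A(x)] = \lambda x\,m_2, \quad \operatorname{CV}[A(x)] = \frac{\sqrt{\lambda x\,m_2}}{\lambda x\,m_1} \propto x^{-1/2},$$

so for such a pure Lévy model the coefficient of variation diversifies away with volume; an empirical CV that does not vanish is a signature of the parameter (mixing) risk the paper identifies.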
Open Access Article
State Space Models and the Kalman-Filter in Stochastic Claims Reserving: Forecasting, Filtering and Smoothing
Risks 2017, 5(2), 30; doi:10.3390/risks5020030
Abstract
This paper gives a detailed overview of the current state of research on the use of state space models and the Kalman filter in the field of stochastic claims reserving. Most of these state space representations are matrix-based, which complicates their application. Therefore, to facilitate the implementation of state space models in practice, we present a scalar state space model for cumulative payments, which is an extension of the well-known chain ladder (CL) method. The presented model is distribution-free, forms a basis for determining the entire unobservable lower and upper run-off triangles and can easily be applied in practice using the Kalman filter for prediction, filtering and smoothing of cumulative payments. In addition, the model provides an easy way to find outliers in the data and to determine outlier effects. Finally, an empirical comparison of the scalar state space model, promising prior state space models and some popular stochastic claims reserving methods is performed.
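To fix ideas, a generic scalar Kalman filter of the kind such a model can be run through (a textbook sketch with illustrative dynamics, not the paper's exact CL-based specification):

```python
import numpy as np

def kalman_filter(y, a, q, r, m0, p0):
    """One-dimensional Kalman filter for the scalar state space model
    x_t = a*x_{t-1} + w_t,  y_t = x_t + v_t,  Var(w)=q, Var(v)=r."""
    m, p = m0, p0
    means, variances = [], []
    for obs in y:
        m, p = a * m, a * a * p + q              # predict step
        k = p / (p + r)                          # Kalman gain
        m, p = m + k * (obs - m), (1 - k) * p    # update step
        means.append(m)
        variances.append(p)
    return np.array(means), np.array(variances)

# Toy cumulative payments along one diagonal, for demonstration only.
y = np.cumsum(np.random.default_rng(0).normal(1.0, 0.3, size=10))
print(kalman_filter(y, a=1.05, q=0.1, r=0.2, m0=y[0], p0=1.0)[0])
```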
Open Access Article
Maximum Market Price of Longevity Risk under Solvency Regimes: The Case of Solvency II
Risks 2017, 5(2), 29; doi:10.3390/risks5020029
Abstract
Longevity risk constitutes an important risk factor for life insurance companies, and it can be managed through longevity-linked securities. The market for longevity-linked securities is at present far from complete and does not allow a unique pricing measure to be identified. We propose a method to estimate the maximum market price of longevity risk depending on the risk margin implicit within the calculation of the technical provisions as defined by Solvency II. The maximum price of longevity risk is determined for a survivor forward (S-forward), an agreement between two counterparties to exchange at maturity a fixed survival-dependent payment for a payment depending on the realized survival of a given cohort of individuals. The maximum prices determined for the S-forwards can be used to price other longevity-linked securities, such as q-forwards. The Cairns–Blake–Dowd model is used to represent the evolution of mortality over time; combined with the information on the risk margin, this enables us to calculate upper limits for the risk-adjusted survival probabilities, the market price of longevity risk and the S-forward prices. The numerical results can be extended to the pricing of other longevity-linked securities.
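Schematically, with discount factor $\nu(0,T)$ and a risk-adjusted measure $Q$, the time-0 value of an S-forward exchanging a fixed rate $K$ for the realized $T$-year survival rate of the cohort aged $x$ is

$$V_0 = \nu(0,T)\left(\mathbb{E}^{Q}[\,{}_T p_x\,] - K\right),$$

so the contract prices at par when $K$ equals the risk-adjusted survival probability; the maximum market price of longevity risk bounds how far $\mathbb{E}^{Q}[{}_T p_x]$ may be shifted above its best estimate.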
Open Access Feature Paper Article
Risk Management under Omega Measure
Risks 2017, 5(2), 27; doi:10.3390/risks5020027
Abstract
We prove that the Omega measure, which considers all moments when assessing portfolio performance, is equivalent to the widely used Sharpe ratio under jointly elliptic distributions of returns. Portfolio optimization of the Sharpe ratio is then explored, with an active-set algorithm presented for markets prohibiting short sales. When asymmetric returns are considered, we show that the Omega measure and Sharpe ratio lead to different optimal portfolios.
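For reference, the Omega measure at threshold $\theta$ for a return $R$ with distribution function $F$ is

$$\Omega(\theta) = \frac{\int_{\theta}^{\infty} \left(1 - F(x)\right) dx}{\int_{-\infty}^{\theta} F(x)\,dx} = \frac{\mathbb{E}\left[(R-\theta)^{+}\right]}{\mathbb{E}\left[(\theta-R)^{+}\right]},$$

the ratio of expected gains above the threshold to expected losses below it.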
Open Access Feature Paper Article
Asymptotic Estimates for the One-Year Ruin Probability under Risky Investments
Risks 2017, 5(2), 28; doi:10.3390/risks5020028
Abstract
Motivated by the EU Solvency II Directive, we study the one-year ruin probability of an insurer who makes investments and hence faces both insurance and financial risks. Over a time horizon of one year, the insurance risk is quantified as a nonnegative random variable X equal to the aggregate amount of claims, and the financial risk as a d-dimensional random vector Y consisting of stochastic discount factors of the d financial assets invested. To capture both heavy tails and asymptotic dependence of Y in an integrated manner, we assume that Y follows a standard multivariate regular variation (MRV) structure. As main results, we derive exact asymptotic estimates for the one-year ruin probability for the following cases: (i) X and Y are independent with X of Fréchet type; (ii) X and Y are independent with X of Gumbel type; (iii) X and Y jointly possess a standard MRV structure; (iv) X and Y jointly possess a nonstandard MRV structure.
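Standard multivariate regular variation of $Y$ with index $\alpha > 0$ can be stated as the existence of a non-degenerate limit measure $\mu$ with

$$\frac{P(Y \in tA)}{P(|Y| > t)} \to \mu(A) \quad (t \to \infty), \qquad \mu(sA) = s^{-\alpha}\mu(A), \quad s > 0,$$

for suitable sets $A$ bounded away from the origin, which captures heavy tails and asymptotic dependence simultaneously.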
Open Access Feature Paper Article
Bond and CDS Pricing via the Stochastic Recovery Black-Cox Model
Risks 2017, 5(2), 26; doi:10.3390/risks5020026
Abstract
Building on recent work incorporating recovery risk into structural models by Cohen & Costanzino (2015), we consider the Black-Cox model with an added recovery risk driver. The recovery risk driver arises naturally in the context of the imperfect information implicit in the structural framework. This leads to a two-factor structural model we call the Stochastic Recovery Black-Cox model, whereby the asset risk driver A_t defines the default trigger and the recovery risk driver R_t defines the amount recovered in the event of default. We then price zero-coupon bonds and credit default swaps under the Stochastic Recovery Black-Cox model. Finally, we compare our results with the classic Black-Cox model, give explicit expressions for the recovery risk premium in the Stochastic Recovery Black-Cox model, and detail how the introduction of separate but correlated risk drivers leads to a decoupling of the default and recovery risk premiums in the credit spread. We conclude this work by computing the effect of adding coupons that are paid continuously until default, and by pricing perpetual bonds (consols) in our two-factor firm value model, extending calculations in the seminal paper by Leland (1994).
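For orientation, in the classic Black-Cox setting being extended, default is the first passage of the asset value to an exponential barrier,

$$\tau = \inf\{\,t \ge 0 : A_t \le K e^{-\gamma (T-t)}\,\},$$

and the stochastic-recovery variant lets a second, correlated driver $R_\tau$ determine the amount recovered at that time.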
Open Access Article
Enhancing Singapore's Pension Scheme: A Blueprint for Further Flexibility
Risks 2017, 5(2), 25; doi:10.3390/risks5020025
Abstract
Building a social security system to ensure that Singapore residents have peace of mind in funding for retirement has been at the top of the Singapore government's policy agenda over the last decade. Implementation of the Lifelong Income For the Elderly (LIFE) scheme in 2009 clearly shows that the government spares no effort in improving its pension scheme to boost its residents' income after retirement. Despite the recent modifications to the LIFE scheme, Singapore residents must still choose between two plans: the Standard and Basic plans. To enhance the flexibility of the LIFE scheme while further streamlining its fund management, we propose plan modifications such that scheme members no longer face a dichotomy of plan choices. Instead, they select two age parameters: the Payout Age and the Life-annuity Age. This paper discusses the actuarial analysis for determining members' payouts and bequests based on the proposed age parameters. We analyze the net cash receipts and Internal Rate of Return (IRR) for various plan-parameter configurations. This information helps members make their plan choices. To address cost-of-living increases, we propose to extend the plan to accommodate an annual step-up of monthly payouts. By deferring the Payout Age from 65 to 68, members can enjoy an annual increase of about 2% in the payouts for the same first-year monthly benefits.
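As a small illustration of the IRR computation underlying such comparisons, a root-finding sketch on hypothetical member cash flows (the premium and payout figures are invented for illustration, not the scheme's actual amounts):

```python
from scipy.optimize import brentq

def irr(cashflows, lo=-0.5, hi=1.0):
    """Internal rate of return: the rate at which the NPV of the cash flows is zero.
    cashflows[0] is the time-0 flow (negative for contributions)."""
    npv = lambda r: sum(cf / (1 + r) ** t for t, cf in enumerate(cashflows))
    return brentq(npv, lo, hi)

# Hypothetical profile: a lump-sum premium at the Payout Age followed by
# level monthly payouts for 20 years (240 months).
premium = -200_000
flows = [premium] + [1_200] * 240          # monthly cash flows
monthly = irr(flows)
print("annualized IRR:", (1 + monthly) ** 12 - 1)
```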
Open Access Article
Applying Spectral Biclustering to Mortality Data
Risks 2017, 5(2), 24; doi:10.3390/risks5020024
Abstract
We apply spectral biclustering to mortality datasets in order to capture three relevant aspects: the period, age and cohort effects, as knowledge of these effects is a key factor in understanding the actuarial liabilities of private life insurance companies, pension funds and national pension systems. While standard techniques generally fail to capture the cohort effect, biclustering methods seem particularly suitable for this aim. We run an exploratory analysis on the mortality data of Italy, with ages playing the role of genes and years that of conditions; comparing conventional hierarchical clustering with spectral biclustering, we observe that the latter offers more meaningful results.
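A minimal sketch of the technique using scikit-learn's SpectralBiclustering on a synthetic age-by-year mortality surface (the data below are simulated placeholders, not the Italian rates used in the paper):

```python
import numpy as np
from sklearn.cluster import SpectralBiclustering

# Placeholder mortality surface: rows = ages ("genes"), columns = years
# ("conditions"); a real analysis would load e.g. ISTAT rates instead.
rng = np.random.default_rng(0)
ages, years = 90, 40
log_mx = (np.linspace(-8, -1, ages)[:, None]      # age profile
          - 0.02 * np.arange(years)[None, :]       # period improvement trend
          + rng.normal(0, 0.05, (ages, years)))    # noise
rates = np.exp(log_mx)                             # positive death rates

model = SpectralBiclustering(n_clusters=3, method="log", random_state=0)
model.fit(rates)
print(model.row_labels_[:10])                      # age-cluster assignments
print(model.column_labels_[:10])                   # year-cluster assignments
```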
Open Access Feature Paper Article
Actuarial Applications and Estimation of Extended CreditRisk+
Risks 2017, 5(2), 23; doi:10.3390/risks5020023
Abstract
We introduce an additive stochastic mortality model which allows joint modelling and forecasting of underlying death causes. Parameter families for mortality trends can be chosen freely. As model settings become high-dimensional, Markov chain Monte Carlo (MCMC) is used for parameter estimation. We then link our proposed model to an extended version of the credit risk model CreditRisk+. This allows exact risk aggregation via an efficient, numerically stable Panjer recursion algorithm and provides numerous applications in credit, life insurance and annuity portfolios for deriving P&L distributions. Furthermore, the model allows exact (i.e., free of Monte Carlo simulation error) calculation of risk measures, such as value-at-risk and expected shortfall, and of their sensitivities with respect to model parameters for P&L distributions. Numerous examples, including an application to partial internal models under Solvency II, using Austrian and Australian data, are shown.
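To illustrate the aggregation step, a textbook Panjer recursion for a compound Poisson loss on a discretized severity distribution (extended CreditRisk+ layers risk factors on top of this; the parameters here are illustrative):

```python
import numpy as np

def panjer_poisson(lam, severity_pmf, n_max):
    """Panjer recursion for the total-loss pmf of a compound Poisson sum
    with discretized severity pmf f[0..m]. A generic textbook sketch."""
    f = np.asarray(severity_pmf, dtype=float)
    g = np.zeros(n_max + 1)
    g[0] = np.exp(-lam * (1.0 - f[0]))             # P(total loss = 0)
    for n in range(1, n_max + 1):
        j = np.arange(1, min(n, len(f) - 1) + 1)   # feasible severity sizes
        g[n] = (lam / n) * np.sum(j * f[j] * g[n - j])
    return g

g = panjer_poisson(lam=3.0, severity_pmf=[0.0, 0.5, 0.3, 0.2], n_max=50)
print("mean:", (np.arange(51) * g).sum())          # approx. lam * E[severity] = 5.1
print("VaR99:", np.searchsorted(g.cumsum(), 0.99)) # smallest n with CDF >= 0.99
```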