Open Access Article
Deflation Risk and Implications for Life Insurers
Risks 2016, 4(4), 46; doi:10.3390/risks4040046
Abstract
Life insurers are exposed to deflation risk: falling prices could lead to insufficient investment returns, and inflation-indexed protections could make insurers vulnerable to deflation. In this spirit, this paper proposes a market-based methodology for measuring deflation risk based on a discrete framework: the latter accounts for the real interest rate, the inflation index level, its conditional variance, and the expected inflation rate. US inflation data are then used to estimate the model and show the importance of deflation risk. Specifically, the distribution of a fictitious life insurer’s future payments is investigated. We find that the proposed inflation model yields higher risk measures than the ones obtained using competing models, stressing the need for dynamic and market-consistent inflation modelling in the life insurance industry.
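As a rough illustration of the discrete framework described above (an inflation index whose log-increments have a drift given by the expected inflation rate and a conditional variance), the following Python sketch simulates index paths and estimates a deflation probability over a horizon. The GARCH(1,1) variance specification and all parameter values are assumptions for illustration, not the paper's US-data estimates.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumed, not the paper's estimates):
mu = 0.01                             # expected annual inflation rate
omega, a1, b1 = 4.5e-5, 0.10, 0.85    # GARCH(1,1) coefficients for the variance
q0 = 100.0                            # initial inflation index level
T, n_paths = 10, 50_000

log_q = np.full(n_paths, np.log(q0))
h = np.full(n_paths, omega / (1 - a1 - b1))  # start at the stationary variance
eps_prev = np.zeros(n_paths)

for _ in range(T):
    h = omega + a1 * eps_prev**2 + b1 * h               # conditional variance update
    eps_prev = np.sqrt(h) * rng.standard_normal(n_paths)
    log_q += mu + eps_prev                              # log-index with drift mu

q_T = np.exp(log_q)
print(f"P(index below today's level after {T} years) = {np.mean(q_T < q0):.3%}")
```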
Open Access Article
Predicting Human Mortality: Quantitative Evaluation of Four Stochastic Models
Risks 2016, 4(4), 45; doi:10.3390/risks4040045
Abstract
In this paper, we quantitatively compare the forecasts from four different mortality models. We consider one discrete-time model proposed by Lee and Carter (1992) and three continuous-time models: the Wills and Sherris (2011) model, the Feller process and the Ornstein-Uhlenbeck (OU) process. The first two models estimate the whole surface of mortality simultaneously, while in the latter two, each generation is modelled and calibrated separately. We calibrate the models to UK and Australian population data. We find that all the models show relatively similar absolute total error for a given dataset, except the Lee-Carter model, whose performance differs significantly. To evaluate the forecasting performance, we therefore look at two alternative measures: the relative error between the forecasted and the actual mortality rates and the percentage of actual mortality rates which fall within a prediction interval. In terms of the prediction intervals, the results are more divergent, since each model implies a different structure for the variance of mortality rates. According to our experiments, the Wills and Sherris model produces superior results in terms of the prediction intervals. However, in terms of the mean absolute error, the OU and the Feller processes perform better. The forecasting performance of the Lee-Carter model is mostly dependent on the choice of the dataset.
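The flavor of the generation-by-generation calibration can be conveyed with a small sketch: fit an OU-type model (an AR(1) in discrete time) to a log-mortality series, then compute the two evaluation measures mentioned above, mean absolute forecast error and prediction-interval coverage. The data and all parameters below are made up for illustration; this is not the paper's UK/Australian calibration.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "observed" log-mortality for one generation (assumed data).
years, phi_true, noise = 40, 0.8, 0.05
x = np.empty(years)
x[0] = np.log(0.01)
for t in range(1, years):
    x[t] = (1 - phi_true) * np.log(0.01) + phi_true * x[t-1] \
           + noise * rng.standard_normal()

train, test = x[:30], x[30:]

# AR(1) calibration by least squares: x_{t+1} = c + phi * x_t + sigma * eps
X, y = train[:-1], train[1:]
phi, c = np.polyfit(X, y, 1)
sigma = np.std(y - (c + phi * X), ddof=2)

# k-step-ahead forecast mean and variance from the last training point
k = np.arange(1, len(test) + 1)
mean = c * (1 - phi**k) / (1 - phi) + phi**k * train[-1]
var = sigma**2 * (1 - phi**(2 * k)) / (1 - phi**2)

lo, hi = mean - 1.96 * np.sqrt(var), mean + 1.96 * np.sqrt(var)
print("mean absolute error:", np.mean(np.abs(test - mean)).round(4))
print("95% interval coverage:", np.mean((test >= lo) & (test <= hi)).round(2))
```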
Open Access Article
Estimation of Star-Shaped Distributions
Risks 2016, 4(4), 44; doi:10.3390/risks4040044
Abstract
Scatter plots of multivariate data sets motivate modeling of star-shaped distributions beyond elliptically contoured ones. We study properties of estimators for the density generator function, the star-generalized radius distribution and the density in a star-shaped distribution model. For the generator function and the star-generalized radius density, we consider a non-parametric kernel-type estimator. This estimator is combined with a parametric estimator for the contours which are assumed to follow a parametric model. Therefore, the semiparametric procedure features the flexibility of nonparametric estimators and the simple estimation and interpretation of parametric estimators. Alternatively, we consider pure parametric estimators for the density. For the semiparametric density estimator, we prove rates of uniform, almost sure convergence which coincide with the corresponding rates of one-dimensional kernel density estimators when excluding the center of the distribution. We show that the standardized density estimator is asymptotically normally distributed. Moreover, the almost sure convergence rate of the estimated distribution function of the star-generalized radius is derived. A particular new two-dimensional distribution class is adapted here to agricultural and financial data sets.
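A minimal sketch of the semiparametric idea, under the simplifying assumption of elliptical contours (a special case of star-shaped models): estimate the contour parametrically from the scatter matrix, then estimate the star-generalized radius density with a kernel estimator. Data and parameters are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Assume elliptical contours as the parametric part; the radius density is
# then estimated nonparametrically.
Sigma = np.array([[1.0, 0.6], [0.6, 2.0]])
data = rng.multivariate_normal([0, 0], Sigma, size=5_000)

# Parametric step: estimate the contour (here: the scatter matrix).
S_hat = np.cov(data, rowvar=False)

# Star-generalized radius under the estimated elliptical contour.
L_inv = np.linalg.inv(np.linalg.cholesky(S_hat))
r = np.linalg.norm(data @ L_inv.T, axis=1)

# Nonparametric step: kernel density estimate of the radius.
kde = stats.gaussian_kde(r)
grid = np.linspace(0.05, 4.0, 5)
print("KDE   :", kde(grid).round(3))
print("chi(2):", stats.chi.pdf(grid, df=2).round(3))  # true radius density here
```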
Open Access Article
Parameter Estimation in Stable Law
Risks 2016, 4(4), 43; doi:10.3390/risks4040043
Abstract
For general stable distributions, cumulant-function-based parameter estimators are proposed. Extensive simulation experiments are carried out to validate the effectiveness of the estimates over the entire parameter space. An application to a non-life insurance loss distribution is made.
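In the spirit of cumulant-function-based estimation (though not the paper's exact estimators, which cover the full four-parameter family), the sketch below recovers the stability index α and scale γ of a symmetric stable law from two points of the empirical cumulant function log φ̂(t).

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)

# Simulate symmetric alpha-stable losses (illustrative parameters).
alpha_true, gamma_true = 1.7, 2.0
x = stats.levy_stable.rvs(alpha_true, beta=0.0, scale=gamma_true,
                          size=50_000, random_state=rng)

def ecf_abs(t, sample):
    """|empirical characteristic function| at t."""
    return np.abs(np.mean(np.exp(1j * t * sample)))

# For symmetric stable: -log|phi(t)| = (gamma*|t|)**alpha, so two points of
# the empirical cumulant function identify alpha and gamma.
t1, t2 = 0.2, 0.8
y1, y2 = -np.log(ecf_abs(t1, x)), -np.log(ecf_abs(t2, x))
alpha_hat = np.log(y2 / y1) / np.log(t2 / t1)
gamma_hat = y1 ** (1 / alpha_hat) / t1

print(f"alpha: true {alpha_true}, est {alpha_hat:.3f}")
print(f"gamma: true {gamma_true}, est {gamma_hat:.3f}")
```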
Open Access Feature Paper Article
Optimal Premium as a Function of the Deductible: Customer Analysis and Portfolio Characteristics
Risks 2016, 4(4), 42; doi:10.3390/risks4040042
Abstract
An insurance company offers an insurance contract (p, K), consisting of a premium p and a deductible K. In this paper, we consider the problem of choosing the premium optimally as a function of the deductible. The insurance company is facing a market of N customers, each characterized by their personal claim frequency, α, and risk aversion, β. When a customer is offered an insurance contract, she/he will, based on these characteristics, choose whether or not to insure. The decision process of the customer is analyzed in detail. Since the customer characteristics are unknown to the company, it models them as i.i.d. random variables: A_1, …, A_N for the claim frequencies and B_1, …, B_N for the risk aversions. Depending on the distributions of A_i and B_i, expressions for the portfolio size n(p; K) ∈ [0, N] and average claim frequency α(p; K) in the portfolio are obtained. Knowing these, the company can choose the premium optimally, mainly by minimizing the ruin probability.
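A simulation sketch of the portfolio quantities n(p; K) and α(p; K), under an assumed stylized acceptance rule rather than the paper's exact utility-based decision analysis; the distributions of A_i and B_i are also assumptions. Note how the average claim frequency in the portfolio rises with the premium, reflecting adverse selection.

```python
import numpy as np

rng = np.random.default_rng(11)

N = 100_000                                      # market size
a = rng.gamma(shape=2.0, scale=0.05, size=N)     # claim frequencies A_i (assumed)
b = rng.lognormal(mean=-1.0, sigma=0.5, size=N)  # risk aversions B_i (assumed)

# Exponential claim severities with mean 10; the expected payout per claim
# above a deductible K is then 10 * exp(-K / 10).
mean_claim = 10.0
def expected_excess(K):
    return mean_claim * np.exp(-K / mean_claim)

def portfolio(p, K):
    """Stylized acceptance rule (an assumption, not the paper's analysis):
    customer i insures iff the premium does not exceed the expected covered
    loss inflated by a risk-aversion loading."""
    buys = p <= a * expected_excess(K) * (1.0 + b)
    n = buys.sum()
    avg_freq = a[buys].mean() if n else np.nan
    return n, avg_freq

for p in (0.3, 0.5, 0.8):
    n, avg = portfolio(p, K=5.0)
    print(f"p={p:.2f}: n(p;K)={n:6d}, average claim frequency={avg:.4f}")
```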
Open Access Feature Paper Article
A Note on Upper Tail Behavior of Liouville Copulas
Risks 2016, 4(4), 40; doi:10.3390/risks4040040
Abstract
The family of Liouville copulas is defined as the survival copulas of multivariate Liouville distributions, and it covers the Archimedean copulas constructed by Williamson’s d-transform. Liouville copulas provide a very wide range of dependence, ranging from positive to negative dependence in the upper tails, and they can be useful in modeling tail risks. In this article, we study the upper tail behavior of Liouville copulas through their upper tail orders. Tail orders of a more general scale mixture model that covers Liouville distributions are first derived, and then tail order functions and tail order density functions of Liouville copulas are derived. Concrete examples are given after the main results.
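For orientation, the upper tail order κ_U referred to above is commonly defined (in the sense of Hua and Joe) through the diagonal of the survival copula; a sketch of the usual definition for a d-dimensional copula C with survival copula Ĉ:

```latex
\widehat{C}(u,\dots,u) \;\sim\; u^{\kappa_U}\,\ell(u), \qquad u \downarrow 0,
```

where ℓ is slowly varying; κ_U = 1 corresponds to usual upper tail dependence, while κ_U = d matches the tail behavior of the independence copula, with larger values indicating negative tail dependence.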
Open Access Feature Paper Article
Incorporation of Stochastic Policyholder Behavior in Analytical Pricing of GMABs and GMDBs
Risks 2016, 4(4), 41; doi:10.3390/risks4040041
Abstract
Variable annuities represent certain unit-linked life insurance products offering different types of protection commonly referred to as guaranteed minimum benefits (GMXBs). They are designed to meet customers’ increasing demand for private pension provision. In this paper, we analytically price variable annuities with guaranteed minimum repayments at maturity and in case of the insured’s death. If the contract is surrendered prematurely, the policyholder is entitled to the current value of the fund account reduced by the prevailing surrender fee. The financial market and the mortality model are affine-linear. For the surrender model, a Cox process is deployed whose intensity is given by a deterministic function (s-curve) with stochastic inputs from the financial market, so the policyholders’ surrender behavior depends on the performance of the financial market and is stochastic. The presented pricing scheme incorporates the stochastic surrender behavior of the policyholders and relies only on suitable closed-form approximations.
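To make the s-curve concrete, here is a minimal sketch of a logistic surrender-intensity function driven by a market input, in the spirit of the Cox-process specification above. The logistic form, the moneyness input, and all parameter values are assumptions for illustration, not the paper's specification.

```python
import numpy as np

def surrender_intensity(moneyness, lam_min=0.02, lam_max=0.25,
                        slope=8.0, center=1.1):
    """Deterministic s-curve mapping a (stochastic) market input to a
    surrender intensity. `moneyness` could be, e.g., the fund value divided
    by the guarantee value; all parameters are illustrative assumptions."""
    s = 1.0 / (1.0 + np.exp(-slope * (moneyness - center)))
    return lam_min + (lam_max - lam_min) * s

for m in (0.8, 1.0, 1.2, 1.5):
    print(f"moneyness {m:.1f} -> surrender intensity {surrender_intensity(m):.3f}")
```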
Open Access Feature Paper Article
Frailty and Risk Classification for Life Annuity Portfolios
Risks 2016, 4(4), 39; doi:10.3390/risks4040039
Abstract
Life annuities are attractive mainly for healthy people. In order to expand their business, in recent years, some insurers have started offering higher annuity rates to those whose health conditions are critical. Life annuity portfolios are then supposed to become larger and more heterogeneous. With respect to the insurer’s risk profile, there is a trade-off between portfolio size and heterogeneity that we intend to investigate. In doing so, we address a second and possibly more important issue. In actuarial practice, the different mortality levels of the several risk classes are obtained by applying adjustment coefficients to population mortality rates. Such a choice is not supported by a rigorous model. On the other hand, the heterogeneity of a population with respect to mortality can formally be described with a frailty model. We suggest adopting a frailty model for risk classification. We identify risk groups (or classes) within the population by assigning specific ranges of frailty values to each group. The different levels of mortality of the various groups are based on the conditional probability distributions of the frailty. Annuity rates for each class can then be easily justified, and a comprehensive investigation of the insurer’s liabilities can be performed.
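A small sketch of the frailty-based classification idea: with a gamma frailty Z of mean one, risk classes are defined by frailty ranges, and each class's mortality multiplier is the conditional mean of Z on its range. The gamma choice, the class boundaries, and the baseline rate are assumptions for illustration, not the paper's calibration.

```python
import numpy as np
from scipy import integrate, stats

# Gamma frailty with mean 1 (a standard choice; variance is assumed).
var_z = 0.25
Z = stats.gamma(a=1 / var_z, scale=var_z)     # mean 1, variance var_z

# Three risk classes defined by frailty ranges via quantiles of Z:
# "preferred" (low frailty), "standard", "impaired" (high frailty).
edges = [0.0, Z.ppf(0.60), Z.ppf(0.90), np.inf]

q_pop = 0.02                                  # baseline population mortality rate
for name, (lo, hi) in zip(["preferred", "standard", "impaired"],
                          zip(edges[:-1], edges[1:])):
    p = Z.cdf(hi) - Z.cdf(lo)
    # Conditional mean frailty on the class, E[Z | lo < Z <= hi].
    mean_z = integrate.quad(lambda z: z * Z.pdf(z), lo, hi)[0] / p
    print(f"{name:9s}: share {p:.2f}, mortality multiplier {mean_z:.2f}, "
          f"class rate {q_pop * mean_z:.4f}")
```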
Open Access Feature Paper Article
A Note on Health Insurance under Ex Post Moral Hazard
Risks 2016, 4(4), 38; doi:10.3390/risks4040038
Abstract
In the linear coinsurance problem, examined first by Mossin (1968), a higher absolute risk aversion with respect to wealth in the sense of Arrow–Pratt implies a higher optimal coinsurance rate. We show that this property does not hold for health insurance under ex post moral hazard; i.e., when illness severity cannot be observed by insurers, and policyholders decide on their health expenditures. The optimal coinsurance rate trades off a risk-sharing effect and an incentive effect, both related to risk aversion.
Open Access Feature Paper Article
A Note on Realistic Dividends in Actuarial Surplus Models
Risks 2016, 4(4), 37; doi:10.3390/risks4040037
Abstract
Because of the profitable nature of risk businesses in the long term, de Finetti suggested that surplus models should allow for cash leakages, as otherwise the surplus would unrealistically grow (on average) to infinity. These leakages were interpreted as ‘dividends’. Subsequent literature on actuarial surplus models with dividend distribution has mainly focussed on dividend strategies that either maximise the expected present value of dividends until ruin or lead to a probability of ruin that is less than one (see Albrecher and Thonhauser, and Avanzi, for reviews). An increasing number of papers are directly interested in modelling dividend policies that are consistent with actual practice in financial markets. In this short note, we review the corporate finance literature with the specific aim of fleshing out properties that dividend strategies should ideally satisfy, if one wants to model behaviour that is consistent with practice.
Open Access Article
Nested MC-Based Risk Measurement of Complex Portfolios: Acceleration and Energy Efficiency
Risks 2016, 4(4), 36; doi:10.3390/risks4040036
Abstract
Risk analysis and management currently have a strong presence in financial institutions, where high performance and energy efficiency are key requirements for acceleration systems, especially when it comes to intraday analysis. In this regard, we approach the estimation of the widely-employed portfolio risk metrics value-at-risk (VaR) and conditional value-at-risk (cVaR) by means of nested Monte Carlo (MC) simulations. We do so by combining theory and software/hardware implementation. This allows us for the first time to investigate their performance on heterogeneous compute systems and across different compute platforms, namely central processing unit (CPU), the many integrated core (MIC) Xeon Phi architecture, graphics processing unit (GPU), and field-programmable gate array (FPGA). To this end, the OpenCL framework is employed to generate portable code, and the size of the simulations is scaled in order to evaluate variations in performance. Furthermore, we assess different parallelization schemes, and the targeted platforms are evaluated and compared in terms of runtime and energy efficiency. Our implementation also allowed us to derive a new algorithmic optimization regarding the generation of the required random number sequences. Moreover, we provide specific guidelines on how to properly handle these sequences in portable code, and on how to efficiently implement nested MC-based VaR and cVaR simulations on heterogeneous compute systems.
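A minimal, hardware-agnostic sketch of nested MC estimation of VaR and cVaR (the paper's CPU/MIC/GPU/FPGA and OpenCL aspects are out of scope here): outer real-world scenarios at the risk horizon, an inner risk-neutral revaluation per scenario, then empirical tail statistics of the loss. A single European call under GBM stands in for the complex portfolio; all parameters are assumed.

```python
import numpy as np

rng = np.random.default_rng(42)

S0, K, r, sigma = 100.0, 105.0, 0.01, 0.25
tau, T = 1.0, 0.1               # option maturity and risk horizon (years)
n_outer, n_inner = 2_000, 2_000

def mc_call_price(S, ttm, n):
    """Plain inner Monte Carlo price of a European call under GBM."""
    z = rng.standard_normal(n)
    ST = S * np.exp((r - 0.5 * sigma**2) * ttm + sigma * np.sqrt(ttm) * z)
    return np.exp(-r * ttm) * np.maximum(ST - K, 0.0).mean()

# Outer loop: scenarios of the underlying at the risk horizon.
z = rng.standard_normal(n_outer)
S_T = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)

V0 = mc_call_price(S0, tau, 200_000)
V_T = np.array([mc_call_price(s, tau - T, n_inner) for s in S_T])
loss = V0 - np.exp(-r * T) * V_T        # portfolio loss over the horizon

alpha = 0.99
var = np.quantile(loss, alpha)
cvar = loss[loss >= var].mean()
print(f"VaR_{alpha:.0%} = {var:.3f}, cVaR_{alpha:.0%} = {cvar:.3f}")
```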
Open Access Article
Sharp Convex Bounds on the Aggregate Sums–An Alternative Proof
Risks 2016, 4(4), 34; doi:10.3390/risks4040034
Abstract
It is well known that a random vector with given marginals is comonotonic if and only if it has the largest convex sum, and that a random vector with given marginals (under an additional condition) is mutually exclusive if and only if it has the minimal convex sum. This paper provides an alternative proof of these two results using the theories of distortion risk measure and expected utility.
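The convex-order statement can be checked numerically through stop-loss premiums E[(S - d)_+], which characterize convex order for sums with fixed marginals: the comonotonic coupling should dominate any other coupling of the same marginals at every retention d. A quick sketch with assumed marginals:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 200_000

# Two fixed marginals: exponential and lognormal (choices are illustrative).
F1, F2 = stats.expon(scale=1.0), stats.lognorm(s=0.5)

U = rng.uniform(size=n)
S_com = F1.ppf(U) + F2.ppf(U)                    # comonotonic coupling
S_ind = F1.ppf(U) + F2.ppf(rng.uniform(size=n))  # independent coupling

# Convex order shows up in stop-loss premiums E[(S - d)_+]: the comonotonic
# sum dominates for every retention d (both sums have the same mean).
for d in (1.0, 2.0, 4.0, 6.0):
    sl_com = np.maximum(S_com - d, 0).mean()
    sl_ind = np.maximum(S_ind - d, 0).mean()
    print(f"d={d}: comonotonic {sl_com:.4f} vs independent {sl_ind:.4f}")
```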
Open Access Article
A Note on the Impact of Parameter Uncertainty on Barrier Derivatives
Risks 2016, 4(4), 35; doi:10.3390/risks4040035
Abstract
This paper presents a comprehensive extension of pricing two-dimensional derivatives depending on two barrier constraints. We assume randomness on the covariance matrix as a way of generalizing. We analyse common barrier derivatives, enabling us to study parameter uncertainty and the risk related to the estimation procedure (estimation risk). In particular, we use the distribution of empirical parameters from IBM and the EURO STOXX 50. The evidence suggests that estimation risk should not be neglected in the context of multidimensional barrier derivatives, as it could cause price differences of up to 70%.
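A stylized sketch of the estimation-risk experiment: price a two-asset barrier contract by Monte Carlo, then resample the covariance matrix from a Wishart distribution centred at the point estimate and observe the price dispersion. The contract, the Wishart choice, and all numbers are assumptions, not the paper's IBM/EURO STOXX 50 setup.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)

r, T, n_steps, n_paths = 0.01, 1.0, 252, 10_000
S0 = np.array([100.0, 100.0])
B = np.array([80.0, 80.0])                 # lower barriers on both assets
Sigma_hat = np.array([[0.04, 0.018],       # "estimated" covariance (assumed)
                      [0.018, 0.09]])

def barrier_price(Sigma):
    """Down-and-out call on asset 1, knocked out if either asset hits its
    barrier; plain GBM Monte Carlo on a daily grid."""
    L = np.linalg.cholesky(Sigma)
    dt = T / n_steps
    logS = np.tile(np.log(S0), (n_paths, 1))
    alive = np.ones(n_paths, dtype=bool)
    drift = (r - 0.5 * np.diag(Sigma)) * dt
    for _ in range(n_steps):
        z = rng.standard_normal((n_paths, 2))
        logS += drift + np.sqrt(dt) * z @ L.T
        alive &= (np.exp(logS) > B).all(axis=1)
    payoff = alive * np.maximum(np.exp(logS[:, 0]) - 100.0, 0.0)
    return np.exp(-r * T) * payoff.mean()

# Parameter uncertainty: resample the covariance from a Wishart distribution
# centred at the point estimate (nu acts as an effective sample size).
nu = 60
prices = [barrier_price(stats.wishart.rvs(df=nu, scale=Sigma_hat / nu,
                                          random_state=rng))
          for _ in range(10)]
print(f"point estimate: {barrier_price(Sigma_hat):.3f}")
print(f"under uncertainty: min {min(prices):.3f}, max {max(prices):.3f}")
```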
Open Access Article
Multivariate TVaR-Based Risk Decomposition for Vector-Valued Portfolios
Risks 2016, 4(4), 33; doi:10.3390/risks4040033
Abstract
In order to protect stakeholders of insurance companies and financial institutions against adverse outcomes of risky businesses, regulators and senior management use capital allocation techniques. For enterprise-wide risk management, it has become important to calculate the contribution of each risk within a portfolio. For that purpose, bivariate lower and upper orthant tail value-at-risk can be used for capital allocation. In this paper, we present multivariate value-at-risk and tail-value-at-risk for d ≥ 2, and we focus on three different methods to calculate optimal values for the contribution of each risk within the sums of random vectors to the overall portfolio, which could particularly apply to insurance and financial portfolios.
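As a univariate baseline for the allocation problem (deliberately simpler than the paper's bivariate lower/upper orthant TVaR), the following sketch computes TVaR-based contributions E[X_i | S > VaR_α(S)], which add up to the TVaR of the aggregate. The Gaussian-copula setup is assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 500_000

# Two dependent risks via a Gaussian copula with lognormal marginals
# (an illustrative setup, not the paper's multivariate orthant framework).
rho = 0.5
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
X = np.exp(0.5 * z)
S = X.sum(axis=1)

alpha = 0.99
var_S = np.quantile(S, alpha)
tail = S > var_S
tvar_S = S[tail].mean()

# TVaR-based allocation: K_i = E[X_i | S > VaR_alpha(S)]; the contributions
# sum to the aggregate TVaR by construction.
K = X[tail].mean(axis=0)
print(f"TVaR_{alpha:.0%}(S) = {tvar_S:.3f}")
print(f"contributions: {K.round(3)}, sum = {K.sum():.3f}")
```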
Open Access Feature Paper Article
The Wasserstein Metric and Robustness in Risk Management
Risks 2016, 4(3), 32; doi:10.3390/risks4030032
Abstract
In the aftermath of the financial crisis, it was realized that the mathematical models used for the valuation of financial instruments and the quantification of risk inherent in portfolios consisting of these financial instruments exhibit a substantial model risk. Consequently, regulators and other stakeholders have started to require that the internal models used by financial institutions are robust. We present an approach to consistently incorporate the robustness requirements into the quantitative risk management process of a financial institution, with a special focus on insurance. We advocate the Wasserstein metric as the canonical metric for approximations in robust risk management and present supporting arguments. Representing risk measures as statistical functionals, we relate risk measures with the concept of robustness and hence continuity with respect to the Wasserstein metric. This allows us to use results from robust statistics concerning continuity and differentiability of functionals. Finally, we illustrate our approach via practical applications.
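A small numerical illustration of why the Wasserstein metric is convenient here: expectations (and other 1-Lipschitz functionals) move by at most the Wasserstein-1 distance between two loss distributions, while quantile-based functionals such as VaR may move by more. The samples below are synthetic and purely illustrative.

```python
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(2)

# Baseline loss sample vs. a slightly contaminated alternative, mimicking
# model misspecification.
losses = rng.lognormal(mean=0.0, sigma=0.5, size=100_000)
contaminated = np.where(rng.uniform(size=100_000) < 0.02,
                        rng.lognormal(mean=1.0, sigma=0.8, size=100_000),
                        losses)

w1 = wasserstein_distance(losses, contaminated)
print(f"Wasserstein-1 distance: {w1:.4f}")

# The mean shifts by at most w1; the tail quantile can shift by more.
for name, f in [("mean", np.mean),
                ("VaR 99%", lambda x: np.quantile(x, 0.99))]:
    print(f"{name}: {f(losses):.4f} -> {f(contaminated):.4f}")
```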
Open Access Feature Paper Article
Choosing Markovian Credit Migration Matrices by Nonlinear Optimization
Risks 2016, 4(3), 31; doi:10.3390/risks4030031
Abstract
Transition matrices, containing credit risk information in the form of ratings based on discrete observations, are published annually by rating agencies. A substantial issue arises, as for higher rating classes practically no defaults are observed, yielding default probabilities of zero. This does not always reflect reality. To circumvent this shortcoming, estimation techniques in continuous time can be applied. However, raw default data may not be available at all or not in the desired granularity, leaving the practitioner to rely on given one-year transition matrices. Then, it becomes necessary to transform the one-year transition matrix to a generator matrix. This is known as the embedding problem and can be formulated as a nonlinear optimization problem, minimizing the distance between the exponential of a potential generator matrix and the annual transition matrix. So far, in credit risk-related literature, solving this problem directly has been avoided, but approximations have been preferred instead. In this paper, we show that this problem can be solved numerically with sufficient accuracy, thus rendering approximations unnecessary. Our direct approach via nonlinear optimization allows one to consider further credit risk-relevant constraints. We demonstrate that it is thus possible to choose a proper generator matrix with additional structural properties.
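A minimal sketch of the direct nonlinear-optimization approach: parametrize a valid generator by its nonnegative off-diagonal entries (the diagonals follow from zero row sums) and minimize the Frobenius distance between its matrix exponential and the annual matrix. The transition matrix below is made up, and the paper's additional credit-risk constraints are omitted.

```python
import numpy as np
from scipy.linalg import expm
from scipy.optimize import minimize

# A small annual transition matrix (illustrative numbers; the last state is
# default and treated as absorbing).
P = np.array([[0.90, 0.08, 0.02, 0.00],
              [0.05, 0.85, 0.08, 0.02],
              [0.01, 0.10, 0.80, 0.09],
              [0.00, 0.00, 0.00, 1.00]])
d = P.shape[0]
off = ~np.eye(d, dtype=bool)

def to_generator(theta):
    """Build a valid generator: nonnegative off-diagonals, rows summing to 0."""
    G = np.zeros((d, d))
    G[off] = theta
    np.fill_diagonal(G, -G.sum(axis=1))
    return G

def objective(theta):
    return np.linalg.norm(expm(to_generator(theta)) - P, "fro")

theta0 = np.maximum(P[off], 1e-4)        # crude warm start from P itself
res = minimize(objective, theta0, method="L-BFGS-B",
               bounds=[(0.0, None)] * theta0.size)
G = to_generator(res.x)
print("distance ||exp(G) - P||_F =", round(res.fun, 6))
print(np.round(G, 4))
```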
Open Access Article
On the Capital Allocation Problem for a New Coherent Risk Measure in Collective Risk Theory
Risks 2016, 4(3), 30; doi:10.3390/risks4030030
Abstract
In this paper, we introduce a new coherent cumulative risk measure on a subclass of the space of càdlàg processes. This new coherent risk measure turns out to be tractable enough within a class of models where the aggregate claims process is driven by a spectrally positive Lévy process. We focus our motivation and discussion on the problem of capital allocation. Indeed, this risk measure is well-suited to address the problem of capital allocation in an insurance context. We show that the capital allocation problem for this risk measure has a unique solution determined by the Euler allocation method. Some examples and connections with existing results, as well as practical implications, are also discussed.
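For reference, the Euler allocation method mentioned above assigns to each risk the directional derivative of the risk measure at the aggregate position; for a positively homogeneous (and suitably differentiable) risk measure ρ, Euler's homogeneous function theorem guarantees that the contributions add up exactly:

```latex
K_i \;=\; \left.\frac{\mathrm{d}}{\mathrm{d}h}\,\rho\bigl(S + h\,X_i\bigr)\right|_{h=0},
\qquad
\sum_{i=1}^{n} K_i \;=\; \rho(S), \qquad S = \sum_{i=1}^{n} X_i.
```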
Open Access Article
Optimal Insurance with Heterogeneous Beliefs and Disagreement about Zero-Probability Events
Risks 2016, 4(3), 29; doi:10.3390/risks4030029
Abstract
In problems of optimal insurance design, Arrow’s classical result on the optimality of the deductible indemnity schedule holds in a situation where the insurer is a risk-neutral Expected-Utility (EU) maximizer, the insured is a risk-averse EU-maximizer, and the two parties share the same probabilistic beliefs about the realizations of the underlying insurable loss. Recently, Ghossoub re-examined Arrow’s problem in a setting where the two parties have different subjective beliefs about the realizations of the insurable random loss, and he showed that if these beliefs satisfy a certain compatibility condition that is weaker than the Monotone Likelihood Ratio (MLR) condition, then optimal indemnity schedules exist and are nondecreasing in the loss. However, Ghossoub only gave a characterization of these optimal indemnity schedules in the special case of an MLR. In this paper, we consider the general case, allowing for disagreement about zero-probability events. We fully characterize the class of all optimal indemnity schedules that are nondecreasing in the loss, in terms of their distribution under the insured’s probability measure, and we obtain Arrow’s classical result, as well as one of the results of Ghossoub as corollaries. Finally, we formalize Marshall’s argument that, in a setting of belief heterogeneity, an optimal indemnity schedule may take “any” shape.
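For context, Arrow's classical result referred to above states that, with homogeneous beliefs, a risk-neutral insurer and a risk-averse insured, the optimal indemnity schedule subject to the usual feasibility constraint is a deductible contract at some level d:

```latex
I^{*}(x) \;=\; (x - d)_{+} \;=\; \max(x - d,\,0),
\qquad \text{subject to } 0 \le I(x) \le x.
```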
Open Access Feature Paper Article
Using Climate and Weather Data to Support Regional Vulnerability Screening Assessments of Transportation Infrastructure
Risks 2016, 4(3), 28; doi:10.3390/risks4030028
Abstract
Extreme weather and climate change can have a significant impact on all types of infrastructure and assets, regardless of location, with the potential for human casualties, physical damage to assets, disruption of operations, economic and community distress, and environmental degradation. This paper describes a methodology for using extreme weather and climate data to identify climate-related risks and to quantify the potential impact of extreme weather events on certain types of transportation infrastructure as part of a vulnerability screening assessment. This screening assessment can be especially useful when a large number of assets or large geographical areas are being studied, with the results enabling planners and asset managers to undertake a more detailed assessment of vulnerability on a more targeted number of assets or locations. The methodology combines climate, weather, and impact data to identify vulnerabilities to a range of weather and climate related risks over a multi-decadal planning period. The paper applies the methodology to perform an extreme weather and climate change vulnerability screening assessment on transportation infrastructure assets for the State of Tennessee. This paper represents the results of one of the first efforts at spatial vulnerability assessments of transportation infrastructure and provides important insights for any organization considering the impact of climate and weather events on transportation or other critical infrastructure systems.
Open Access Feature Paper Article
Understanding Reporting Delay in General Insurance
Risks 2016, 4(3), 25; doi:10.3390/risks4030025
Abstract
The aim of this paper is to understand and to model claims arrival and reporting delay in general insurance. We calibrate two real individual claims data sets to the statistical model of Jewell and Norberg. One data set considers property insurance and the other casualty insurance. For our analysis, we slightly relax the model assumptions of Jewell, allowing for non-stationarity so that the model can cope with trends and with seasonal patterns. The performance of our individual claims data prediction is compared to the prediction based on aggregate data using the Poisson chain-ladder method.
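For comparison, the aggregate benchmark mentioned above is the chain-ladder method; below is a deterministic sketch on a made-up cumulative run-off triangle of reported claim counts (the paper uses the Poisson variant, which yields the same reserve point estimates in the classical setting).

```python
import numpy as np

# A small cumulative run-off triangle of reported claim counts (rows:
# accident years, columns: development years). Numbers are made up.
tri = np.array([[120, 180, 205, 210],
                [130, 195, 225, np.nan],
                [115, 170, np.nan, np.nan],
                [140, np.nan, np.nan, np.nan]])
n = tri.shape[0]

# Chain-ladder development factors: ratios of column sums over the rows
# where both development years are observed.
f = []
for j in range(n - 1):
    rows = ~np.isnan(tri[:, j + 1])
    f.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

# Fill the lower triangle by projecting with the factors.
proj = tri.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(proj[i, j + 1]):
            proj[i, j + 1] = proj[i, j] * f[j]

print("development factors:", np.round(f, 4))
print("projected ultimates:", np.round(proj[:, -1], 1))
```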