A Note on Realistic Dividends in Actuarial Surplus Models
*Risks* **2016**, *4*(4), 37; doi:10.3390/risks4040037 - 20 October 2016
**Abstract**

Because of the profitable nature of risk businesses in the long term, de Finetti suggested that surplus models should allow for cash leakages, as otherwise the surplus would unrealistically grow (on average) to infinity. These leakages were interpreted as ‘dividends’. Subsequent literature on actuarial surplus models with dividend distribution has mainly focussed on dividend strategies that either maximise the expected present value of dividends until ruin or lead to a probability of ruin that is less than one (see Albrecher and Thonhauser, and Avanzi, for reviews). An increasing number of papers are directly interested in modelling dividend policies that are consistent with actual practice in financial markets. In this short note, we review the corporate finance literature with the specific aim of fleshing out properties that dividend strategies should ideally satisfy, if one wants to model behaviour that is consistent with practice.

Nested MC-Based Risk Measurement of Complex Portfolios: Acceleration and Energy Efficiency
*Risks* **2016**, *4*(4), 36; doi:10.3390/risks4040036 - 18 October 2016
**Abstract**

Risk analysis and management currently have a strong presence in financial institutions, where high performance and energy efficiency are key requirements for acceleration systems, especially when it comes to intraday analysis. In this regard, we approach the estimation of the widely-employed portfolio risk metrics value-at-risk (VaR) and conditional value-at-risk (cVaR) by means of nested Monte Carlo (MC) simulations. We do so by combining theory and software/hardware implementation. This allows us for the first time to investigate their performance on heterogeneous compute systems and across different compute platforms, namely the central processing unit (CPU), the Xeon Phi many-integrated-core (MIC) architecture, the graphics processing unit (GPU), and the field-programmable gate array (FPGA). To this end, the OpenCL framework is employed to generate portable code, and the size of the simulations is scaled in order to evaluate variations in performance. Furthermore, we assess different parallelization schemes, and the targeted platforms are evaluated and compared in terms of runtime and energy efficiency. Our implementation also allowed us to derive a new algorithmic optimization regarding the generation of the required random number sequences. Moreover, we provide specific guidelines on how to properly handle these sequences in portable code, and on how to efficiently implement nested MC-based VaR and cVaR simulations on heterogeneous compute systems.
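The nested simulation scheme described above can be sketched in a few lines. The following is a toy illustration, not the paper's implementation: the "portfolio" is a single call option under a driftless lognormal model with made-up parameters; the outer loop draws risk-factor scenarios at the horizon, and the inner loop revalues the position in each scenario.

```python
import numpy as np

def nested_mc_var_cvar(n_outer=2000, n_inner=500, alpha=0.99, seed=0):
    """Toy nested MC for VaR/cVaR of a one-option 'portfolio'.

    All model parameters are illustrative, not taken from the paper.
    """
    rng = np.random.default_rng(seed)
    S0, K, sigma = 100.0, 100.0, 0.2
    horizon, maturity = 10 / 250, 1.0

    # Value today (plain MC with many paths, r = 0 for simplicity).
    z = rng.standard_normal(200_000)
    ST = S0 * np.exp(-0.5 * sigma**2 * maturity + sigma * np.sqrt(maturity) * z)
    v0 = np.maximum(ST - K, 0.0).mean()

    # Outer loop: risk-factor scenarios at the horizon.
    z_out = rng.standard_normal(n_outer)
    S_h = S0 * np.exp(-0.5 * sigma**2 * horizon + sigma * np.sqrt(horizon) * z_out)

    # Inner loop: revalue the option in each scenario.
    tau = maturity - horizon
    z_in = rng.standard_normal((n_outer, n_inner))
    ST = S_h[:, None] * np.exp(-0.5 * sigma**2 * tau + sigma * np.sqrt(tau) * z_in)
    v_h = np.maximum(ST - K, 0.0).mean(axis=1)

    loss = v0 - v_h                    # positive values are losses
    var = np.quantile(loss, alpha)     # value-at-risk at level alpha
    cvar = loss[loss >= var].mean()    # conditional value-at-risk
    return var, cvar

var, cvar = nested_mc_var_cvar()
```

The inner loop is the expensive part, which is what makes the parallelization and random-number handling discussed in the paper relevant.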

Sharp Convex Bounds on the Aggregate Sums–An Alternative Proof
*Risks* **2016**, *4*(4), 34; doi:10.3390/risks4040034 - 29 September 2016
**Abstract**

It is well known that a random vector with given marginals is comonotonic if and only if it has the largest convex sum, and that a random vector with given marginals (under an additional condition) is mutually exclusive if and only if it has the minimal convex sum. This paper provides an alternative proof of these two results using the theories of distortion risk measures and expected utility.

A Note on the Impact of Parameter Uncertainty on Barrier Derivatives
*Risks* **2016**, *4*(4), 35; doi:10.3390/risks4040035 - 29 September 2016
**Abstract**

This paper presents a comprehensive extension of the pricing of two-dimensional derivatives that depend on two barrier constraints. As a generalization, we assume that the covariance matrix is random. We analyse common barrier derivatives, enabling us to study parameter uncertainty and the risk related to the estimation procedure (estimation risk). In particular, we use the distribution of empirical parameters from IBM and the EURO STOXX 50. The evidence suggests that estimation risk should not be neglected in the context of multidimensional barrier derivatives, as it could cause price differences of up to 70%.

Multivariate TVaR-Based Risk Decomposition for Vector-Valued Portfolios
*Risks* **2016**, *4*(4), 33; doi:10.3390/risks4040033 - 23 September 2016
**Abstract**

In order to protect stakeholders of insurance companies and financial institutions against adverse outcomes of risky businesses, regulators and senior management use capital allocation techniques. For enterprise-wide risk management, it has become important to calculate the contribution of each risk within a portfolio. For that purpose, bivariate lower and upper orthant tail value-at-risk can be used for capital allocation. In this paper, we present multivariate value-at-risk and tail value-at-risk for $d \ge 2$, and we focus on three different methods to calculate optimal values for the contribution of each risk within the sums of random vectors to the overall portfolio, which could apply in particular to insurance and financial portfolios.
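The Euler-style allocation idea behind such TVaR-based decompositions can be illustrated with a small Monte Carlo sketch (a simple bivariate lognormal model with made-up parameters, not the paper's setting): each risk's contribution is its expected value on the tail event of the aggregate, and the contributions add up to the aggregate TVaR.

```python
import numpy as np

# Monte Carlo sketch of Euler-style TVaR capital allocation for a sum of
# correlated risks. Parameters are purely illustrative.
rng = np.random.default_rng(1)
n, alpha = 200_000, 0.95

# Two correlated lognormal risks (columns of X).
cov = np.array([[1.0, 0.5], [0.5, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=n)
X = np.exp(z)
S = X.sum(axis=1)                     # aggregate risk

var_S = np.quantile(S, alpha)         # VaR of the aggregate
tail = S > var_S
tvar_S = S[tail].mean()               # TVaR of the aggregate
contrib = X[tail].mean(axis=0)        # Euler contributions E[X_i | S > VaR]
```

By construction the contributions are a full allocation: `contrib.sum()` equals `tvar_S` up to floating-point error.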

The Wasserstein Metric and Robustness in Risk Management
*Risks* **2016**, *4*(3), 32; doi:10.3390/risks4030032 - 31 August 2016
**Abstract**

In the aftermath of the financial crisis, it was realized that the mathematical models used for the valuation of financial instruments and the quantification of risk inherent in portfolios consisting of these financial instruments exhibit a substantial model risk. Consequently, regulators and other stakeholders have started to require that the internal models used by financial institutions are robust. We present an approach to consistently incorporate the robustness requirements into the quantitative risk management process of a financial institution, with a special focus on insurance. We advocate the Wasserstein metric as the canonical metric for approximations in robust risk management and present supporting arguments. Representing risk measures as statistical functionals, we relate risk measures with the concept of robustness and hence continuity with respect to the Wasserstein metric. This allows us to use results from robust statistics concerning continuity and differentiability of functionals. Finally, we illustrate our approach via practical applications.
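For two empirical distributions on the real line with the same number of observations, the 1-Wasserstein distance reduces to the mean absolute difference of the sorted samples. A minimal sketch with made-up normal samples:

```python
import numpy as np

def wasserstein_1d(x, y):
    """Empirical 1-Wasserstein distance between two equal-size samples:
    the mean absolute difference of the sorted observations."""
    x, y = np.sort(x), np.sort(y)
    return np.mean(np.abs(x - y))

rng = np.random.default_rng(0)
a = rng.normal(0.0, 1.0, 100_000)
b = rng.normal(0.5, 1.0, 100_000)   # same shape, shifted by 0.5

d = wasserstein_1d(a, b)            # close to the location shift of 0.5
```

Unlike metrics based on densities, this distance stays well-behaved when one sample is merely a shifted or coarsened version of the other, which is part of the robustness argument in the abstract.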

Choosing Markovian Credit Migration Matrices by Nonlinear Optimization
*Risks* **2016**, *4*(3), 31; doi:10.3390/risks4030031 - 30 August 2016
**Abstract**

Transition matrices, containing credit risk information in the form of ratings based on discrete observations, are published annually by rating agencies. A substantial issue arises, as for higher rating classes practically no defaults are observed yielding default probabilities of zero. This does not always reflect reality. To circumvent this shortcoming, estimation techniques in continuous-time can be applied. However, raw default data may not be available at all or not in the desired granularity, leaving the practitioner to rely on given one-year transition matrices. Then, it becomes necessary to transform the one-year transition matrix to a generator matrix. This is known as the embedding problem and can be formulated as a nonlinear optimization problem, minimizing the distance between the exponential of a potential generator matrix and the annual transition matrix. So far, in credit risk-related literature, solving this problem directly has been avoided, but approximations have been preferred instead. In this paper, we show that this problem can be solved numerically with sufficient accuracy, thus rendering approximations unnecessary. Our direct approach via nonlinear optimization allows one to consider further credit risk-relevant constraints. We demonstrate that it is thus possible to choose a proper generator matrix with additional structural properties.
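The embedding problem can be illustrated numerically. The sketch below uses an invented 3x3 one-year matrix with an absorbing default state, takes a matrix logarithm as a candidate, and projects it onto the set of valid generators (non-negative off-diagonals, zero row sums). This is only the naive starting point that the paper's direct nonlinear optimization would refine.

```python
import numpy as np

def logm(P):
    """Matrix logarithm via eigendecomposition (valid here because the
    example matrix is diagonalizable with positive real eigenvalues)."""
    w, V = np.linalg.eig(P)
    return (V @ np.diag(np.log(w)) @ np.linalg.inv(V)).real

def expm(G, terms=40):
    """Matrix exponential by truncated power series (fine for small G)."""
    out, term = np.eye(len(G)), np.eye(len(G))
    for k in range(1, terms):
        term = term @ G / k
        out = out + term
    return out

# Illustrative one-year transition matrix, last state = absorbing default.
P = np.array([[0.90, 0.08, 0.02],
              [0.05, 0.90, 0.05],
              [0.00, 0.00, 1.00]])

# Naive candidate generator: matrix log, then project onto valid generators
# (clip negative off-diagonals, reset the diagonal so rows sum to zero).
G = logm(P)
off = G - np.diag(np.diag(G))
off[off < 0] = 0.0
G = off - np.diag(off.sum(axis=1))

err = np.abs(expm(G) - P).max()   # the distance the optimization minimizes
```

For this particular matrix the log is already (numerically) a valid generator, so the projection is essentially a no-op and `expm(G)` recovers `P` almost exactly; in general the projection degrades the fit, which is where the paper's optimization approach comes in.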

On the Capital Allocation Problem for a New Coherent Risk Measure in Collective Risk Theory
*Risks* **2016**, *4*(3), 30; doi:10.3390/risks4030030 - 16 August 2016
**Abstract**

In this paper we introduce a new coherent cumulative risk measure on a subclass in the space of càdlàg processes. This new coherent risk measure turns out to be tractable enough within a class of models where the aggregate claims process is driven by a spectrally positive Lévy process. We focus our motivation and discussion on the problem of capital allocation. Indeed, this risk measure is well-suited to address the problem of capital allocation in an insurance context. We show that the capital allocation problem for this risk measure has a unique solution determined by the Euler allocation method. Some examples and connections with existing results as well as practical implications are also discussed.

Optimal Insurance with Heterogeneous Beliefs and Disagreement about Zero-Probability Events
*Risks* **2016**, *4*(3), 29; doi:10.3390/risks4030029 - 5 August 2016
**Abstract**

In problems of optimal insurance design, Arrow’s classical result on the optimality of the deductible indemnity schedule holds in a situation where the insurer is a risk-neutral Expected-Utility (EU) maximizer, the insured is a risk-averse EU-maximizer, and the two parties share the same probabilistic beliefs about the realizations of the underlying insurable loss. Recently, Ghossoub re-examined Arrow’s problem in a setting where the two parties have different subjective beliefs about the realizations of the insurable random loss, and he showed that if these beliefs satisfy a certain compatibility condition that is weaker than the Monotone Likelihood Ratio (MLR) condition, then optimal indemnity schedules exist and are nondecreasing in the loss. However, Ghossoub only gave a characterization of these optimal indemnity schedules in the special case of an MLR. In this paper, we consider the general case, allowing for disagreement about zero-probability events. We fully characterize the class of all optimal indemnity schedules that are nondecreasing in the loss, in terms of their distribution under the insured’s probability measure, and we obtain Arrow’s classical result, as well as one of the results of Ghossoub as corollaries. Finally, we formalize Marshall’s argument that, in a setting of belief heterogeneity, an optimal indemnity schedule may take “any” shape.

Using Climate and Weather Data to Support Regional Vulnerability Screening Assessments of Transportation Infrastructure
*Risks* **2016**, *4*(3), 28; doi:10.3390/risks4030028 - 3 August 2016
**Abstract**

Extreme weather and climate change can have a significant impact on all types of infrastructure and assets, regardless of location, with the potential for human casualties, physical damage to assets, disruption of operations, economic and community distress, and environmental degradation. This paper describes a methodology for using extreme weather and climate data to identify climate-related risks and to quantify the potential impact of extreme weather events on certain types of transportation infrastructure as part of a vulnerability screening assessment. This screening assessment can be especially useful when a large number of assets or large geographical areas are being studied, with the results enabling planners and asset managers to undertake a more detailed assessment of vulnerability on a more targeted number of assets or locations. The methodology combines climate, weather, and impact data to identify vulnerabilities to a range of weather and climate related risks over a multi-decadal planning period. The paper applies the methodology to perform an extreme weather and climate change vulnerability screening assessment on transportation infrastructure assets for the State of Tennessee. This paper represents the results of one of the first efforts at spatial vulnerability assessments of transportation infrastructure and provides important insights for any organization considering the impact of climate and weather events on transportation or other critical infrastructure systems.

Understanding Reporting Delay in General Insurance
*Risks* **2016**, *4*(3), 25; doi:10.3390/risks4030025 - 8 July 2016
**Abstract**

The aim of this paper is to understand and to model claims arrival and reporting delay in general insurance. We calibrate two real individual claims data sets to the statistical model of Jewell and Norberg. One data set considers property insurance and the other casualty insurance. For our analysis, we slightly relax the model assumptions of Jewell, allowing for non-stationarity so that the model can cope with trends and seasonal patterns. The performance of our individual claims data prediction is compared to the prediction based on aggregate data using the Poisson chain-ladder method.
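The aggregate benchmark, the chain-ladder method, can be sketched on a small run-off triangle of cumulative reported claim counts (the figures below are invented for illustration; `np.nan` marks the unobserved future):

```python
import numpy as np

# Rows: accident periods; columns: development periods (cumulative counts).
tri = np.array([
    [110.0, 168.0, 185.0, 190.0],
    [125.0, 192.0, 210.0, np.nan],
    [130.0, 201.0, np.nan, np.nan],
    [118.0, np.nan, np.nan, np.nan],
])
n = tri.shape[0]

# Development factors: ratio of column sums over rows observed in both columns.
f = []
for j in range(n - 1):
    rows = ~np.isnan(tri[:, j + 1])
    f.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

# Fill the lower triangle by successive multiplication with the factors.
full = tri.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(full[i, j + 1]):
            full[i, j + 1] = full[i, j] * f[j]

ultimates = full[:, -1]   # predicted ultimate claim counts per accident period
```

The paper's individual-claims prediction is then compared against exactly this kind of aggregate projection.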

Risk Minimization for Insurance Products via F-Doubly Stochastic Markov Chains
*Risks* **2016**, *4*(3), 23; doi:10.3390/risks4030023 - 7 July 2016
**Abstract**

We study risk-minimization for a large class of insurance contracts. Given that the individual progress in time of visiting an insurance policy’s states follows an $\mathbb{F}$-doubly stochastic Markov chain, we describe different state-dependent types of insurance benefits. These cover single payments at maturity, annuity-type payments and payments at the time of a transition. Based on the intensity of the $\mathbb{F}$-doubly stochastic Markov chain, we provide the Galtchouk-Kunita-Watanabe decomposition for a general insurance contract and specify risk-minimizing strategies in a Brownian financial market setting. The results are further illustrated explicitly within an affine structure for the intensity.

Optimal Reinsurance with Heterogeneous Reference Probabilities
*Risks* **2016**, *4*(3), 26; doi:10.3390/risks4030026 - 7 July 2016
**Abstract**

This paper studies the problem of optimal reinsurance contract design. The insurer uses dual utility, and the premium is an extended Wang premium principle. The novel contribution is that we allow for heterogeneity in the beliefs regarding the underlying probability distribution. We characterize layer reinsurance as an optimal contract, both in the baseline setting and when the insurer faces costs of holding regulatory capital. We illustrate this in cases where both firms use Value-at-Risk or conditional Value-at-Risk.

Lead–Lag Relationship Using a Stop-and-Reverse-MinMax Process
*Risks* **2016**, *4*(3), 27; doi:10.3390/risks4030027 - 7 July 2016
**Abstract**

Intermarket analysis, in particular the lead–lag relationship, plays an important role in financial markets. This paper therefore develops a mathematical approach for finding interrelations between the price developments of two different financial instruments. Computing the differences of the relative positions of relevant local extrema of two charts, i.e., the local phase shifts of these price developments, gives us an empirical distribution on the unit circle. With the aid of directional statistics, such angular distributions are studied for many pairs of markets. It is shown that there are several very strongly correlated financial instruments in the fields of foreign exchange, commodities and indexes. In some cases, one of the two markets is significantly ahead with respect to the relevant local extrema, i.e., there is a phase shift different from zero between them.
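The directional-statistics step can be sketched as follows: given the empirical phase shifts on the unit circle, the mean direction and the mean resultant length R summarize where the shifts cluster and how tightly. The shifts below are simulated for illustration, not market data.

```python
import numpy as np

def circular_mean_and_concentration(angles):
    """Directional statistics on the unit circle: mean direction and mean
    resultant length R (R near 1 means tightly concentrated angles)."""
    c, s = np.cos(angles).mean(), np.sin(angles).mean()
    return np.arctan2(s, c), np.hypot(c, s)

# Illustrative phase shifts (radians) between extrema of two price series,
# clustered a little above zero, i.e. one market slightly leads the other.
rng = np.random.default_rng(2)
shifts = rng.normal(loc=0.3, scale=0.2, size=5000) % (2 * np.pi)

mean_dir, R = circular_mean_and_concentration(shifts)
```

A mean direction significantly different from zero with large R is the kind of evidence the abstract describes for one market leading the other.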

A Unified Pricing of Variable Annuity Guarantees under the Optimal Stochastic Control Framework
*Risks* **2016**, *4*(3), 22; doi:10.3390/risks4030022 - 5 July 2016
**Abstract**

In this paper, we review pricing of the variable annuity living and death guarantees offered to retail investors in many countries. Investors purchase these products to take advantage of market growth and protect savings. We present pricing of these products via an optimal stochastic control framework and review the existing numerical methods. We also discuss pricing under the complete/incomplete financial market models, stochastic mortality and optimal/sub-optimal policyholder behavior, and in the presence of taxes. For numerical valuation of these contracts in the case of a simple risky asset process, we develop a direct integration method based on the Gauss-Hermite quadratures with a one-dimensional cubic spline for calculation of the expected contract value, and a bi-cubic spline interpolation for applying the jump conditions across the contract cashflow event times. This method is easier to implement and faster when compared to the partial differential equation methods if the transition density (or its moments) of the risky asset underlying the contract is known in closed form between the event times. We present accurate numerical results for pricing of a Guaranteed Minimum Accumulation Benefit (GMAB) guarantee available on the market that can serve as a numerical benchmark for practitioners and researchers developing pricing of variable annuity guarantees to assess the accuracy of their numerical implementation.
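The quadrature building block can be sketched as follows. NumPy's `hermgauss` targets the weight exp(-x^2), so the substitution z = sqrt(2) x turns it into an expectation under the standard normal. The option parameters below are illustrative only, not the GMAB contract of the paper.

```python
import numpy as np

def expectation_gauss_hermite(g, n=64):
    """E[g(Z)] for standard normal Z via Gauss-Hermite quadrature.
    hermgauss nodes/weights target weight exp(-x^2); substitute z = sqrt(2) x."""
    x, w = np.polynomial.hermite.hermgauss(n)
    return (w * g(np.sqrt(2.0) * x)).sum() / np.sqrt(np.pi)

# Expected payoff of a call on a lognormal asset at maturity, the kind of
# one-step expectation used between contract event times (r = 0, made-up
# parameters).
S0, K, sigma, T = 100.0, 100.0, 0.2, 1.0

def payoff(z):
    ST = S0 * np.exp(-0.5 * sigma**2 * T + sigma * np.sqrt(T) * z)
    return np.maximum(ST - K, 0.0)

price = expectation_gauss_hermite(payoff)
```

With these inputs the quadrature lands close to the known lognormal call value of about 7.97; in the paper's method such expectations are evaluated on a spline-interpolated contract value rather than a closed-form payoff.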

Superforecasting: The Art and Science of Prediction. By Philip Tetlock and Dan Gardner
*Risks* **2016**, *4*(3), 24; doi:10.3390/risks4030024 - 5 July 2016
**Abstract**
Let me say from the outset that this is an excellent book to read. It is not only informative, as it should be for a book on forecasting, but it is highly entertaining.[...]

Survey on Log-Normally Distributed Market-Technical Trend Data
*Risks* **2016**, *4*(3), 20; doi:10.3390/risks4030020 - 4 July 2016
**Abstract**

In this survey, a short introduction to the recent discovery of log-normally-distributed market-technical trend data will be given. The results of the statistical evaluation of typical market-technical trend variables will be presented. It will be shown that the log-normal assumption fits better to empirical trend data than to daily returns of stock prices. This enables one to mathematically evaluate trading systems depending on such variables. In this manner, a basic approach to an anti-cyclic trading system will be given as an example.

The Myth of Methuselah and the Uncertainty of Death: The Mortality Fan Charts
*Risks* **2016**, *4*(3), 21; doi:10.3390/risks4030021 - 4 July 2016
**Abstract**

This paper uses mortality fan charts to illustrate prospective future male mortality. These fan charts show both the most likely path of male mortality and the bands of uncertainty surrounding that path. The fan charts are based on a model of male mortality that is known to provide a good fit to UK mortality data. The fan charts suggest that there are clear limits to longevity—that future mortality rates are very uncertain and tend to become more uncertain the further ahead the forecast horizon—and that forecasts of future mortality uncertainty must also take account of uncertainty in the parameters of the underlying mortality model.

An Optimal Turkish Private Pension Plan with a Guarantee Feature
*Risks* **2016**, *4*(3), 19; doi:10.3390/risks4030019 - 27 June 2016
**Abstract**

The Turkish Private Pension System is an investment system which aims to generate income for future consumption. It is a voluntary system, and the contributions are held in individual portfolios. Therefore, management of the funds is an important issue for both the participants and the insurance company. In this study, we propose an optimal private pension plan with a guarantee feature based on Constant Proportion Portfolio Insurance (CPPI). We derive a closed-form formula for the optimal strategy with the help of dynamic programming. Moreover, we evaluate the model with numerical examples and assess its performance via a sensitivity analysis.
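A minimal CPPI sketch, under invented parameters and a bounded return model (not the paper's calibration): at each step the strategy invests a multiple m of the cushion, the excess of wealth over the guaranteed floor, in the risky fund and the remainder in the safe asset.

```python
import numpy as np

# Illustrative CPPI parameters: multiplier, wealth, floor, safe rate, steps.
rng = np.random.default_rng(3)
m, V, floor, r_safe, steps = 3.0, 100.0, 80.0, 0.0005, 250

path = [V]
for _ in range(steps):
    cushion = max(V - floor, 0.0)
    risky = min(m * cushion, V)          # exposure capped at total wealth
    safe = V - risky
    # Bounded risky return: per-step loss cannot exceed 20% < 1/m,
    # so the floor is preserved in this discrete-time sketch.
    r = float(np.clip(rng.normal(0.0004, 0.01), -0.2, 0.2))
    V = risky * (1.0 + r) + safe * (1.0 + r_safe)
    floor *= 1.0 + r_safe                # floor accrues at the safe rate
    path.append(V)
```

The bound on per-step losses matters: with unbounded returns (or large gaps between rebalancing dates), discrete-time CPPI can breach the floor, which is one reason guarantee features need careful modelling.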

Consistent Re-Calibration of the Discrete-Time Multifactor Vasiček Model
*Risks* **2016**, *4*(3), 18; doi:10.3390/risks4030018 - 23 June 2016
**Abstract**

The discrete-time multifactor Vasiček model is a tractable Gaussian spot rate model. Typically, two- or three-factor versions allow one to capture the dependence structure between yields with different times to maturity in an appropriate way. In practice, re-calibration of the model to the prevailing market conditions leads to model parameters that change over time. Therefore, the model parameters should be understood as being time-dependent or even stochastic. Following the consistent re-calibration (CRC) approach, we construct models as concatenations of yield curve increments of Hull–White extended multifactor Vasiček models with different parameters. The CRC approach provides attractive tractable models that preserve the no-arbitrage premise. As a numerical example, we fit Swiss interest rates using CRC multifactor Vasiček models.
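A one-factor, discrete-time Vasiček recursion can be sketched as follows (illustrative parameters; the multifactor model sums several such factors): r_{t+1} = r_t + kappa (theta - r_t) + sigma eps_{t+1}, i.e. a Gaussian AR(1) process mean-reverting to theta.

```python
import numpy as np

# Illustrative discrete-time Vasiček parameters (not calibrated to Swiss data).
rng = np.random.default_rng(4)
kappa, theta, sigma, r0, n = 0.1, 0.02, 0.002, 0.01, 10_000

r = np.empty(n)
r[0] = r0
for t in range(1, n):
    r[t] = r[t - 1] + kappa * (theta - r[t - 1]) + sigma * rng.standard_normal()

# Long-run behaviour of the AR(1): mean reverts to theta, and the
# stationary standard deviation is sigma / sqrt(kappa * (2 - kappa)).
long_run_mean = r[n // 2:].mean()
stationary_std = sigma / np.sqrt(kappa * (2.0 - kappa))
```

In the CRC construction of the paper, the parameters (kappa, theta, sigma) are themselves re-calibrated over time, and yield-curve increments of Hull–White extended versions of this dynamic are concatenated so that no-arbitrage is preserved.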