Open AccessArticle
Risk Aversion, Loss Aversion, and the Demand for Insurance
Risks 2018, 6(2), 60; https://doi.org/10.3390/risks6020060 -
Abstract
In this paper we analyze insurance demand when the utility function depends both upon final wealth and the level of losses or gains relative to a reference point. Besides some comparative statics results, we discuss the links with first-order risk aversion, with the Omega measure, and with a tendency to over-insure modest risks that has been extensively documented in real insurance markets. Full article
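The over-insurance mechanism described in this abstract can be sketched numerically. The specification below (log utility plus a small-weight gain-loss term relative to a reference point, loss probability, premium loading, and loss-aversion coefficient) is purely an illustrative assumption, not the authors' model:

```python
import math

def expected_utility(alpha, w=100.0, loss=20.0, p=0.1, load=0.2, lam=2.25):
    """Expected utility of coinsurance level alpha in [0, 1]: log utility
    over final wealth plus a gain-loss term relative to the reference
    point 'wealth after paying the premium', with losses weighted by the
    loss-aversion coefficient lam (all parameter values illustrative)."""
    premium = (1.0 + load) * p * loss * alpha   # loaded premium
    ref = w - premium                           # reference wealth
    eu = 0.0
    for prob, x in ((p, loss), (1.0 - p, 0.0)):
        final = w - premium - (1.0 - alpha) * x
        gain = final - ref                      # gain/loss vs. reference
        gl = gain if gain >= 0.0 else lam * gain
        eu += prob * (math.log(final) + 0.01 * gl)
    return eu

# grid search for the optimal coverage level
alpha_star = max((i / 100.0 for i in range(101)), key=expected_utility)
```

With loss aversion pushing weight onto the gain-loss term, the optimal coverage level tends toward full insurance even for a modest risk with a loaded premium.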
Open AccessArticle
On the Moments and the Distribution of Aggregate Discounted Claims in a Markovian Environment
Risks 2018, 6(2), 59; https://doi.org/10.3390/risks6020059 -
Abstract
This paper studies the moments and the distribution of the aggregate discounted claims (ADCs) in a Markovian environment, where the claim arrivals, claim amounts, and forces of interest (for discounting) are influenced by an underlying Markov process. Specifically, we assume that claims occur according to a Markovian arrival process (MAP). The paper shows that the vector of joint Laplace transforms of the ADC occurring in each state of the environment process by any specific time satisfies a matrix-form first-order partial differential equation, through which a recursive formula is derived for the moments of the ADC occurring in certain states (a subset). We also study two types of covariances of the ADC occurring in any two subsets of the state space and with two different time lengths. The distribution of the ADC occurring in certain states by any specific time is also investigated. Numerical results are also presented for a two-state Markov-modulated model case. Full article
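As a hedged illustration of the setting (not the paper's analytic recursions), the first moment of the aggregate discounted claims can be estimated by Monte Carlo in the simplest special case of a MAP, a two-state Markov-modulated Poisson model; all intensities, claim-size means, forces of interest, and switching rates below are assumed values:

```python
import math
import random

def adc_path(T=10.0, rng=random):
    """One path of the aggregate discounted claims by time T in a
    two-state Markov-modulated Poisson model: claim intensity, mean
    claim size, force of interest and switching rates all depend on
    the environment state (illustrative values)."""
    lam = (1.0, 4.0)        # claim arrival intensity per state
    mean_claim = (1.0, 2.0) # exponential mean claim size per state
    delta = (0.03, 0.06)    # force of interest per state
    q = (0.5, 0.5)          # rate of leaving each state
    t, state, integral, adc = 0.0, 0, 0.0, 0.0
    while True:
        dt = rng.expovariate(lam[state] + q[state])
        if t + dt > T:
            return adc
        t += dt
        integral += delta[state] * dt    # accumulated discounting
        if rng.random() < lam[state] / (lam[state] + q[state]):
            adc += math.exp(-integral) * rng.expovariate(1.0 / mean_claim[state])
        else:
            state = 1 - state

rng = random.Random(42)
first_moment = sum(adc_path(rng=rng) for _ in range(2000)) / 2000.0
```

Such a simulation is a useful cross-check for the moment recursions the paper derives from the matrix-form partial differential equation.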
Open AccessArticle
A Credit-Risk Valuation under the Variance-Gamma Asset Return
Risks 2018, 6(2), 58; https://doi.org/10.3390/risks6020058 -
Abstract
This paper considers risks of an investment portfolio that consists of distributed mortgages and sold European call options. It is assumed that the stream of the credit payments could fall by a jump. The time of the jump is modeled by the exponential distribution. We suggest that the returns on stock are variance-gamma distributed. The value at risk, the expected shortfall and the entropic risk measure for this portfolio are calculated in closed form. The obtained formulas exploit the values of generalized hypergeometric functions. Full article
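For intuition about the risk measures involved, empirical value at risk and expected shortfall can be computed on simulated variance-gamma returns, using the representation of a variance-gamma variable as Brownian motion evaluated at a gamma time. This is a rough numerical counterpart to the paper's closed-form results; the parameter values are assumptions:

```python
import math
import random
import statistics

def vg_return(rng, theta=-0.1, sigma=0.2, nu=0.5):
    """One variance-gamma draw: Brownian motion with drift theta and
    volatility sigma evaluated at a gamma time with unit mean and
    variance nu (parameter values illustrative)."""
    g = rng.gammavariate(1.0 / nu, nu)
    return theta * g + sigma * math.sqrt(g) * rng.gauss(0.0, 1.0)

rng = random.Random(7)
returns = sorted(vg_return(rng) for _ in range(10000))
k = int(0.01 * len(returns))             # 1% left tail
var_99 = -returns[k]                     # empirical 99% value at risk
es_99 = -statistics.mean(returns[:k])    # empirical expected shortfall
```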
Open AccessArticle
On Two Mixture-Based Clustering Approaches Used in Modeling an Insurance Portfolio
Risks 2018, 6(2), 57; https://doi.org/10.3390/risks6020057 -
Abstract
We review two complementary mixture-based clustering approaches for modeling unobserved heterogeneity in an insurance portfolio: the generalized linear mixed cluster-weighted model (CWM) and mixture-based clustering for an ordered stereotype model (OSM). The latter is for modeling ordinal variables, and the former is for modeling losses as a function of mixed-type covariates. The article extends the idea of mixture modeling to a multivariate classification for the purpose of testing unobserved heterogeneity in an insurance portfolio. The application of both methods is illustrated on a well-known French automobile portfolio, in which the model fitting is performed using the expectation-maximization (EM) algorithm. Our findings show that these mixture-based clustering methods can be used to further test unobserved heterogeneity in an insurance portfolio and as such may be considered in insurance pricing, underwriting, and risk management. Full article
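The EM machinery behind both approaches can be illustrated on the simplest mixture that exhibits unobserved heterogeneity in claim counts, a two-component Poisson mixture. This is a minimal sketch of the estimation idea, not the CWM or OSM models themselves, and the data below are synthetic:

```python
import math
import random

def sample_poisson(rng, lam):
    """Poisson sampling via Knuth's multiplication method."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= rng.random()
        if p < limit:
            return k
        k += 1

def em_two_poisson(counts, iters=100):
    """EM for a two-component Poisson mixture on claim counts."""
    pi, lam = 0.5, [0.5, 3.0]                 # starting values
    for _ in range(iters):
        # E-step: responsibility of component 0 for each count
        r = []
        for kk in counts:
            p0 = pi * math.exp(-lam[0]) * lam[0] ** kk
            p1 = (1.0 - pi) * math.exp(-lam[1]) * lam[1] ** kk
            r.append(p0 / (p0 + p1))
        # M-step: re-estimate mixing weight and component means
        pi = sum(r) / len(r)
        lam[0] = sum(ri * kk for ri, kk in zip(r, counts)) / sum(r)
        lam[1] = (sum((1 - ri) * kk for ri, kk in zip(r, counts))
                  / sum(1 - ri for ri in r))
    return pi, lam

rng = random.Random(1)
counts = ([sample_poisson(rng, 0.3) for _ in range(300)]
          + [sample_poisson(rng, 4.0) for _ in range(300)])
pi_hat, lam_hat = em_two_poisson(counts)
```

The fitted component means separate the low-frequency and high-frequency policyholders that a single Poisson model would average away.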
Open AccessArticle
Stochastic Modeling of Wind Derivatives in Energy Markets
Risks 2018, 6(2), 56; https://doi.org/10.3390/risks6020056 -
Abstract
We model the logarithm of the spot price of electricity with a normal inverse Gaussian (NIG) process and the wind speed and wind power production with two Ornstein–Uhlenbeck processes. In order to reproduce the correlation between the spot price and the wind power production, namely between a pure jump process and a continuous path process, respectively, we replace the small jumps of the NIG process by a Brownian term. We then apply our models to two different problems: first, to study from the stochastic point of view the income from a wind power plant, as the expected value of the product between the electricity spot price and the amount of energy produced; then, to construct and price a European put-type quanto option in the wind energy markets that allows the buyer to hedge against low prices and low wind power production in the plant. Calibration of the proposed models and related price formulas is also provided, according to specific datasets. Full article
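The Ornstein-Uhlenbeck building block used here for wind speed can be simulated exactly; the mean-reversion speed, long-run level, and volatility below are illustrative, not the calibrated values from the paper's datasets:

```python
import math
import random

def ou_path(n=1000, dt=1.0 / 365.0, kappa=5.0, theta=8.0, sigma=2.0, seed=3):
    """Exact-discretization simulation of an Ornstein-Uhlenbeck process:
    x_{t+dt} = theta + e^(-kappa dt) (x_t - theta) + Gaussian noise with
    the exact transition variance (illustrative parameters)."""
    rng = random.Random(seed)
    a = math.exp(-kappa * dt)
    sd = sigma * math.sqrt((1.0 - a * a) / (2.0 * kappa))
    x = theta                      # start at the long-run level
    path = [x]
    for _ in range(n):
        x = theta + a * (x - theta) + sd * rng.gauss(0.0, 1.0)
        path.append(x)
    return path

path = ou_path()
sample_mean = sum(path) / len(path)
```

The path mean-reverts around theta, which is the qualitative behaviour wanted for wind speed before coupling it to the NIG spot-price model.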
Open AccessArticle
Using Cutting-Edge Tree-Based Stochastic Models to Predict Credit Risk
Risks 2018, 6(2), 55; https://doi.org/10.3390/risks6020055 -
Abstract
Credit risk is a critical issue that affects banks and companies on a global scale. Possessing the ability to accurately predict the level of credit risk has the potential to help the lender and borrower. This is achieved by reducing the number of loans provided to borrowers with poor financial health, thereby reducing the number of failed businesses, and, in effect, preventing economies from collapsing. This paper uses state-of-the-art stochastic models, namely decision trees, random forests, and stochastic gradient boosting, to add to the current literature on credit-risk modelling. The Australian mining industry has been selected to test our methodology. Mining in Australia generates around $138 billion annually, making up more than half of the total goods and services. This paper uses publicly-available financial data from 750 risky and non-risky Australian mining companies as variables in our models. Our results indicate that stochastic gradient boosting was the superior model at correctly classifying the good and bad credit-rated companies within the mining sector. Our model showed that ‘Property, Plant, & Equipment (PPE) turnover’, ‘Invested Capital Turnover’, and ‘Price over Earnings Ratio (PER)’ were the variables with the best explanatory power pertaining to predicting credit risk in the Australian mining sector. Full article
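The weak learner inside both random forests and stochastic gradient boosting is a split on a single financial ratio. A minimal sketch of such a one-split decision stump is below; the 'invested capital turnover' values and risk labels are invented toy data, not the paper's dataset:

```python
def best_stump(xs, ys):
    """Exhaustively fit a one-split decision stump on a single feature:
    predict 'risky' (1) when the ratio falls below a threshold, and
    pick the threshold with the best training accuracy."""
    best_t, best_acc = None, 0.0
    for t in sorted(set(xs)):
        acc = sum((x < t) == y for x, y in zip(xs, ys)) / len(ys)
        if acc > best_acc:
            best_t, best_acc = t, acc
    return best_t, best_acc

# toy 'invested capital turnover' values: low turnover labelled risky (1)
xs = [0.2, 0.3, 0.4, 0.5, 1.1, 1.3, 1.6, 2.0]
ys = [1, 1, 1, 1, 0, 0, 0, 0]
threshold, accuracy = best_stump(xs, ys)
```

Tree ensembles combine many such splits, which is what lets them rank variables such as PPE turnover by explanatory power.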
Open AccessArticle
Diversification and Systemic Risk: A Financial Network Perspective
Risks 2018, 6(2), 54; https://doi.org/10.3390/risks6020054 -
Abstract
In this paper, we study the implications of diversification in the asset portfolios of banks for financial stability and systemic risk. Adding to the existing literature, we analyse this issue in a network model of the interbank market. We carry out a simulation study that determines the probability of a systemic crisis in the banking network as a function of both the level of diversification and the connectivity and structure of the financial network. In contrast to earlier studies, we find that diversification at the level of individual banks may be beneficial for financial stability even if it does lead to a higher asset return correlation across banks. Full article
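The core of such a simulation study is a contagion cascade on a random interbank network. The sketch below uses an assumed threshold rule (a bank fails once two or more counterparties fail) and illustrative network parameters, not the paper's model:

```python
import random

def cascade(n=50, p=0.1, shock=3, seed=2):
    """Threshold contagion on a random interbank network: seed a few
    initial failures, then propagate failure to any bank with at least
    two failed counterparties; returns the final number of failures."""
    rng = random.Random(seed)
    links = {i: {j for j in range(n) if j != i and rng.random() < p}
             for i in range(n)}
    failed = set(rng.sample(range(n), shock))   # initial exogenous failures
    while True:
        nxt = {i for i in range(n) if i not in failed
               and len(links[i] & failed) >= 2}
        if not nxt:
            return len(failed)
        failed |= nxt

n_failed = cascade()
```

Repeating this over many networks and shock draws yields the systemic-crisis probability as a function of connectivity, which is the quantity the paper studies.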
Open AccessArticle
A General Framework for Portfolio Theory—Part I: Theory and Various Models
Risks 2018, 6(2), 53; https://doi.org/10.3390/risks6020053 -
Abstract
Utility and risk are two often competing measures of investment success. We show that efficient trade-off between these two measures for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz (1959) and the capital market pricing model of Sharpe (1964) are special cases of our general framework when the risk measure is taken to be the standard deviation and the utility function is the identity mapping. Using our general framework, we also recover and extend the results in Rockafellar et al. (2006), which were already an extension of the capital market pricing model to allow for the use of more general deviation measures. This generalized capital asset pricing model also applies, e.g., when an approximation of the maximum drawdown is considered as a risk measure. Furthermore, the consideration of a general utility function allows for going beyond the “additive” performance measure to a “multiplicative” one of cumulative returns by using the log utility. As a result, the growth optimal portfolio theory Lintner (1965) and the leverage space portfolio theory Vince (2009) can also be understood and enhanced under our general framework. Thus, this general framework allows a unification of several important existing portfolio theories and goes far beyond them. For simplicity of presentation, we phrase everything in terms of a finite underlying probability space and a one-period market model, but generalizations to more complex structures are straightforward. Full article
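In the Markowitz special case of this framework (risk = standard deviation, utility = identity), the efficient trade-off curve can be traced directly. The two-asset example below uses assumed returns, volatilities, and correlation:

```python
import math

# two risky assets: expected returns, volatilities, correlation (assumed)
mu = (0.05, 0.10)
sig = (0.10, 0.25)
rho = 0.2

def frontier_point(w):
    """Risk (standard deviation) and expected return of the portfolio
    with weight w in asset 1 and 1 - w in asset 2."""
    m = w * mu[0] + (1.0 - w) * mu[1]
    v = ((w * sig[0]) ** 2 + ((1.0 - w) * sig[1]) ** 2
         + 2.0 * w * (1.0 - w) * rho * sig[0] * sig[1])
    return math.sqrt(v), m

points = [frontier_point(w / 100.0) for w in range(101)]
min_risk = min(r for r, _ in points)
```

The minimum-risk portfolio beats holding the less risky asset alone, the diversification effect that sits at one end of the convex utility-risk trade-off curve.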
Open AccessArticle
Modelling and Forecasting Stock Price Movements with Serially Dependent Determinants
Risks 2018, 6(2), 52; https://doi.org/10.3390/risks6020052 -
Abstract
The direction of price movements is analysed under an ordered probit framework, recognising the importance of accounting for discreteness in price changes. By extending the work of Hausman et al. (1992) and Yang and Parwada (2012), this paper focuses on improving the forecast performance of the model while infusing a more practical perspective by enhancing flexibility. This is achieved by extending the existing framework to generate short-term multi-period-ahead forecasts for better decision making, whilst considering the serial dependence structure. This approach enhances the flexibility and adaptability of the model to future price changes, particularly targeting risk minimisation. Empirical evidence is provided, based on seven stocks listed on the Australian Securities Exchange (ASX). The prediction success varies between 78 and 91 per cent for in-sample and out-of-sample forecasts for both the short term and long term. Full article
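In an ordered probit, the discreteness of price changes enters through cutpoints on a latent Gaussian index. A minimal sketch of the direction probabilities, with illustrative cutpoints, is:

```python
import math

def phi(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(xb, cuts=(-0.5, 0.5)):
    """Probabilities of {down, no change, up} given the latent index xb
    (a linear combination of the serially dependent determinants) and
    cutpoints (illustrative values)."""
    p_down = phi(cuts[0] - xb)
    p_flat = phi(cuts[1] - xb) - p_down
    p_up = 1.0 - phi(cuts[1] - xb)
    return p_down, p_flat, p_up

probs = ordered_probit_probs(0.3)
```

A positive latent index shifts probability mass from "down" to "up", and multi-period-ahead forecasts chain such predictions while feeding the serial dependence back into the index.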
Open AccessArticle
Real-Option Valuation in a Finite-Time, Incomplete Market with Jump Diffusion and Investor-Utility Inflation
Risks 2018, 6(2), 51; https://doi.org/10.3390/risks6020051 -
Abstract
We extend an existing numerical model (Grasselli (2011)) for valuing a real option to invest in a capital project in an incomplete market with a finite time horizon. In doing so, we include two separate effects: the possibility that the project value is partly describable according to a jump-diffusion process, and incorporation of a time-dependent investor utility function, taking into account the effect of inflation. We adopt a discrete approximation to the jump process, whose parameters are restricted in order to preserve the drift and the volatility of the project-value process that it modifies. By controlling for these low-order effects, the higher-order effects may be considered in isolation. Our simulated results demonstrate that the inclusion of the jump process tends to decrease the value of the option and expand the circumstances under which it should be exercised. Our results also demonstrate that an appropriate selection of the time-dependent investor utility function yields more reasonable investor-behaviour predictions regarding the decision to exercise the option than would occur otherwise. Full article
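The restriction that the jump process preserve the drift of the modified process can be illustrated with a compensated jump-diffusion: subtracting lam*jump from the drift offsets the expected effect of the jumps. All parameters below are assumed, and this is a generic sketch rather than the paper's discrete jump approximation:

```python
import math
import random

def jd_terminal(s0=1.0, mu=0.05, sigma=0.2, lam=0.5, jump=-0.1,
                T=1.0, steps=252, rng=random):
    """Terminal value of a jump-diffusion path with a fixed relative
    jump size; the drift is compensated by lam*jump so the expected
    terminal value matches the no-jump process (parameters assumed)."""
    dt = T / steps
    s = s0
    for _ in range(steps):
        s *= math.exp((mu - lam * jump - 0.5 * sigma ** 2) * dt
                      + sigma * math.sqrt(dt) * rng.gauss(0.0, 1.0))
        if rng.random() < lam * dt:   # a jump arrives in this step
            s *= 1.0 + jump
    return s

rng = random.Random(11)
mc_mean = sum(jd_terminal(rng=rng) for _ in range(4000)) / 4000.0
```

The Monte Carlo mean stays close to exp(mu * T) of the no-jump process, so any change in the option value comes from the higher moments the jumps introduce.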
Open AccessFeature PaperArticle
The Effect of Non-Proportional Reinsurance: A Revision of Solvency II Standard Formula
Risks 2018, 6(2), 50; https://doi.org/10.3390/risks6020050 -
Abstract
The Solvency II Standard Formula provides a methodology to recognise the risk-mitigating impact of excess of loss reinsurance treaties in premium risk modelling. We analyse the proposals of both Quantitative Impact Study 5 and the Commission Delegated Regulation, highlighting some inconsistencies. This paper tries to bridge the main pitfalls of both versions. To this aim, we propose a revision of the non-proportional adjustment factor in order to measure the effect of excess of loss treaties on premium risk volatility. In this way, the capital requirement can be easily assessed. As numerical results show, this proposal appears to be a feasible and much more consistent approach to describe the effect of non-proportional reinsurance on premium risk. Full article
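The volatility-dampening effect that a non-proportional adjustment factor is meant to capture can be shown empirically: capping each claim at the retention of an excess of loss treaty reduces the coefficient of variation of severity. This is an illustrative analogue, not the regulatory formula, and the exponential severity and retention level are assumptions:

```python
import random

def coeff_var(xs):
    """Coefficient of variation (standard deviation over mean)."""
    m = sum(xs) / len(xs)
    var = sum((x - m) ** 2 for x in xs) / (len(xs) - 1)
    return var ** 0.5 / m

rng = random.Random(5)
gross = [rng.expovariate(1.0 / 10.0) for _ in range(20000)]  # gross claims
net = [min(x, 25.0) for x in gross]   # claims retained under an XL treaty

np_factor = coeff_var(net) / coeff_var(gross)  # empirical volatility reduction
```

A ratio below one is exactly the kind of reduction the adjustment factor applies to the premium risk volatility when assessing the capital requirement.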
Open AccessArticle
Properties of Stochastic Arrangement Increasing and Their Applications in Allocation Problems
Risks 2018, 6(2), 49; https://doi.org/10.3390/risks6020049 -
Abstract
There are extensive studies on the allocation problems in the field of insurance and finance. We observe that these studies, although involving different methodologies, share some inherent commonalities. In this paper, we develop a new framework for these studies with the tool of arrangement increasing functions. This framework unifies many existing studies and provides shortcuts to developing new results. Full article
Open AccessArticle
The Italian Pension Gap: A Stochastic Optimal Control Approach
Risks 2018, 6(2), 48; https://doi.org/10.3390/risks6020048 -
Abstract
We study the gap between the state pension provided by the Italian pension system pre-Dini reform and post-Dini reform. The goal is to fill the gap between the old and the new pension by joining a defined contribution pension scheme and adopting an optimal investment strategy that is target-based. We find that it is possible to cover, at least partially, this gap with the additional income of the pension scheme, especially in the presence of late retirement and in the presence of stagnant careers. Workers with dynamic careers and workers who retire early are those who are most penalised by the reform. Results are intuitive and in line with previous studies on the subject. Full article
Open AccessArticle
The Cascade Bayesian Approach: Prior Transformation for a Controlled Integration of Internal Data, External Data and Scenarios
Risks 2018, 6(2), 47; https://doi.org/10.3390/risks6020047 -
Abstract
According to the latest proposals of the Basel Committee on Banking Supervision, banks or insurance companies under the advanced measurement approach (AMA) must use four different sources of information to assess their operational risk capital requirement. The fourth includes ‘business environment and internal control factors’, i.e., qualitative criteria, whereas the three main quantitative sources available to banks for building the loss distribution are internal loss data, external loss data and scenario analysis. This paper proposes an innovative methodology to bring together these three different sources in the loss distribution approach (LDA) framework through a Bayesian strategy. The integration of the different elements is performed in two different steps to ensure an internal data-driven model is obtained. In the first step, scenarios are used to inform the prior distributions and external data inform the likelihood component of the posterior function. In the second step, the initial posterior function is used as the prior distribution and the internal loss data inform the likelihood component of the second posterior function. This latter posterior function enables the estimation of the parameters of the severity distribution that are selected to represent the operational risk event types. Full article
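The two-step cascade can be sketched with the simplest conjugate pair, a normal-normal update on a location parameter: the scenario-based prior is first updated with external data, and that posterior becomes the prior for the internal data. The numbers below are invented, and the real method works on full severity distributions rather than a single mean:

```python
def normal_update(prior_mean, prior_var, data, data_var):
    """Conjugate normal-normal posterior for a mean parameter with
    known observation variance."""
    n = len(data)
    xbar = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mean = post_var * (prior_mean / prior_var + n * xbar / data_var)
    return post_mean, post_var

# step 1: scenario-based prior updated with external loss data (assumed numbers)
m1, v1 = normal_update(10.0, 4.0, [12.0, 9.0, 14.0, 11.0], 9.0)
# step 2: the step-1 posterior becomes the prior for the internal loss data
m2, v2 = normal_update(m1, v1, [8.0, 9.5, 8.5, 9.0, 10.0], 9.0)
```

Because the internal data enter last, they dominate the final posterior, which is how the cascade keeps the model internal data-driven.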
Open AccessArticle
Volatility Is Log-Normal—But Not for the Reason You Think
Risks 2018, 6(2), 46; https://doi.org/10.3390/risks6020046 -
Abstract
It is impossible to discriminate between the commonly used stochastic volatility models of Heston, log-normal, and 3-over-2 on the basis of exponentially weighted averages of daily returns—even though it appears so at first sight. However, with a 5-min sampling frequency, the models can be differentiated and empirical evidence overwhelmingly favours a fast mean-reverting log-normal model. Full article
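The daily volatility proxy in question is an exponentially weighted average of squared returns. A minimal RiskMetrics-style version, with an assumed smoothing parameter and toy returns, is:

```python
def ewma_vol(returns, lam=0.94):
    """Exponentially weighted moving variance of returns, returned as a
    volatility; lam is the smoothing parameter (0.94 is the common
    RiskMetrics daily choice, assumed here)."""
    var = returns[0] ** 2
    for r in returns[1:]:
        var = lam * var + (1.0 - lam) * r ** 2
    return var ** 0.5

vol = ewma_vol([0.01, -0.02, 0.015, -0.005, 0.02])
```

Because this estimator smooths heavily across days, it blurs precisely the fast mean-reversion that 5-min sampling reveals, which is the paper's point.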
Open AccessArticle
Estimating and Forecasting Conditional Risk Measures with Extreme Value Theory: A Review
Risks 2018, 6(2), 45; https://doi.org/10.3390/risks6020045 -
Abstract
One of the key components of financial risk management is risk measurement. This typically requires modeling, estimating and forecasting tail-related quantities of the asset returns’ conditional distribution. Recent advances in the financial econometrics literature have developed several models based on Extreme Value Theory (EVT) to carry out these tasks. The purpose of this paper is to review these methods. Full article
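One of the standard EVT ingredients reviewed in this literature is the Hill estimator of the tail index, computed from the largest order statistics of the losses. The sketch below checks it on synthetic Pareto data with a known tail index; the choice k = 100 is an assumption:

```python
import math
import random

def hill_estimate(losses, k=100):
    """Hill estimator of the Pareto tail index alpha from the k largest
    observations: k divided by the sum of log-spacings above the
    (k+1)-th order statistic."""
    x = sorted(losses, reverse=True)
    return k / sum(math.log(x[i] / x[k]) for i in range(k))

rng = random.Random(9)
# heavy-tailed toy losses: Pareto with true tail index alpha = 3
losses = [(1.0 - rng.random()) ** (-1.0 / 3.0) for _ in range(5000)]
alpha_hat = hill_estimate(losses)
```

In the conditional (filtered) EVT approaches the same estimator is applied to standardized residuals rather than raw returns.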
Open AccessArticle
An Empirical Study on Stochastic Mortality Modelling under the Age-Period-Cohort Framework: The Case of Greece with Applications to Insurance Pricing
Risks 2018, 6(2), 44; https://doi.org/10.3390/risks6020044 -
Abstract
During the last decades, life expectancy has risen significantly in the most developed countries all over the world. Greece is a case in point; consequently, the government faces higher financial responsibilities, and population ageing raises serious concerns. To address this issue, an efficient forecasting method is required. Therefore, the most important stochastic models were comparatively applied to Greek data for the first time. An analysis of their fitting behaviour by gender was conducted and the corresponding forecasting results were evaluated. In particular, we incorporated the Greek population data into seven stochastic mortality models under a common age-period-cohort framework. The fitting performance of each model was thoroughly evaluated based on information criteria values as well as the likelihood ratio test, and their robustness to period changes was investigated. In addition, parameter risk in forecasts was assessed by employing bootstrapping techniques. For completeness, projection results for both genders were also illustrated in pricing insurance-related products. Full article
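The simplest member of the age-period-cohort family decomposes log death rates as log m(x,t) ≈ a_x + b_x k_t (the Lee-Carter structure). Below is a first-step approximation to that fit on synthetic numbers, not the Greek data or the SVD/likelihood-based estimation used in the paper:

```python
# toy log death rates: 3 age groups (rows) by 4 calendar years (columns),
# synthetic numbers chosen to show mortality improvement over time
log_m = [[-6.0, -6.1, -6.25, -6.4],
         [-5.0, -5.1, -5.2, -5.35],
         [-3.5, -3.55, -3.7, -3.8]]

ages, years = len(log_m), len(log_m[0])
a = [sum(row) / years for row in log_m]              # age pattern a_x
# centered matrix; k_t as column sums (first-step approximation)
c = [[log_m[i][j] - a[i] for j in range(years)] for i in range(ages)]
k = [sum(c[i][j] for i in range(ages)) for j in range(years)]
# b_x by least squares of each centered age row on k
b = [sum(c[i][j] * k[j] for j in range(years)) /
     sum(kj * kj for kj in k) for i in range(ages)]
```

The declining period index k_t is what a random walk with drift would then project forward for pricing insurance-related products.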
Open AccessArticle
Life Insurance and Annuity Demand under Hyperbolic Discounting
Risks 2018, 6(2), 43; https://doi.org/10.3390/risks6020043 -
Abstract
In this paper, we analyse and construct a lifetime utility maximisation model with hyperbolic discounting. Within the model, a number of assumptions are made: complete markets, actuarially fair life insurance/annuity is available, and investors have time-dependent preferences. Time-dependent preferences are in contrast to the usual case of constant preferences (exponential discounting). We find: (1) investors (realistically) demand more life insurance after retirement (in contrast to the standard model, which showed strong demand for life annuities), and annuities are rarely purchased; (2) optimal consumption paths exhibit a humped shape (which is usually only found in incomplete markets under the assumptions of the standard model). Full article
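The difference between the two discounting assumptions is easy to make concrete: relative to exponential discounting, a hyperbolic discounter places comparatively more weight on distant dates. The generalised hyperbolic form and parameter values below are illustrative assumptions:

```python
import math

def exponential_factor(t, delta=0.05):
    """Standard exponential (constant-rate) discount factor."""
    return math.exp(-delta * t)

def hyperbolic_factor(t, alpha=1.0, beta=0.05):
    """Generalised hyperbolic discount factor (1 + alpha t)^(-beta/alpha)
    (illustrative parameters)."""
    return (1.0 + alpha * t) ** (-beta / alpha)

ratio_1y = hyperbolic_factor(1.0) / exponential_factor(1.0)
ratio_40y = hyperbolic_factor(40.0) / exponential_factor(40.0)
```

The growing ratio at long horizons is the mechanism behind the model's altered late-life demand for insurance products.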
Open AccessReview
Credit Risk Meets Random Matrices: Coping with Non-Stationary Asset Correlations
Risks 2018, 6(2), 42; https://doi.org/10.3390/risks6020042 -
Abstract
We review recent progress in modeling credit risk for correlated assets. We employ a new interpretation of the Wishart model for random correlation matrices to model non-stationary effects. We then use the Merton model in which default events and losses are derived from the asset values at maturity. To estimate the time development of the asset values, the stock prices are used, the correlations of which have a strong impact on the loss distribution, particularly on its tails. These correlations are non-stationary, which also influences the tails. We account for the asset fluctuations by averaging over an ensemble of random matrices that models the truly existing set of measured correlation matrices. As a most welcome side effect, this approach drastically reduces the parameter dependence of the loss distribution, allowing us to obtain very explicit results, which show quantitatively that the heavy tails prevail over diversification benefits even for small correlations. We calibrate our random matrix model with market data and show how it is capable of grasping different market situations. Furthermore, we present numerical simulations for concurrent portfolio risks, i.e., for the joint probability densities of losses for two portfolios. For the convenience of the reader, we give an introduction to the Wishart random matrix model. Full article
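The basic ensemble ingredient can be sketched in a few lines: draw a Gaussian data matrix, form its Wishart-type sample covariance, and normalise to a correlation matrix. Averaging loss distributions over many such draws is the idea behind modeling non-stationary correlations; the dimensions below are illustrative:

```python
import random

def random_correlation(n=4, t=20, rng=random):
    """One draw from a Wishart-type ensemble: sample an n x t Gaussian
    data matrix, form the sample covariance and normalise it to a
    correlation matrix (illustrative sketch of the ensemble idea)."""
    data = [[rng.gauss(0.0, 1.0) for _ in range(t)] for _ in range(n)]
    cov = [[sum(data[i][k] * data[j][k] for k in range(t)) / t
            for j in range(n)] for i in range(n)]
    return [[cov[i][j] / (cov[i][i] * cov[j][j]) ** 0.5
             for j in range(n)] for i in range(n)]

rng = random.Random(4)
c = random_correlation(rng=rng)
```

Short time series (small t relative to n) produce strongly fluctuating correlation matrices, mimicking the measured non-stationarity that drives the heavy tails of the loss distribution.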
Open AccessArticle
Active Management of Operational Risk in the Regimes of the “Unknown”: What Can Machine Learning or Heuristics Deliver?
Risks 2018, 6(2), 41; https://doi.org/10.3390/risks6020041 -
Abstract
Advanced machine learning has achieved extraordinary success in recent years. In “active” operational risk management, which goes beyond ex post analysis of measured data, machine learning could provide help beyond the regime of traditional statistical analysis when it comes to the “known unknown” or even the “unknown unknown.” While machine learning has been tested successfully in the regime of the “known,” heuristics typically provide better results for active operational risk management (in the sense of forecasting). However, precursors in existing data can open a chance for machine learning to provide early warnings even for the regime of the “unknown unknown.” Full article