
Table of Contents

Risks, Volume 7, Issue 2 (June 2019)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click the "PDF Full-text" link and open it with the free Adobe Reader.
Cover Story: The cover figure shows the treatment-level healthcare data modeling approach. In the paper, the [...] Read more.
Displaying articles 1-36
Open Access Article
Predicting Motor Insurance Claims Using Telematics Data—XGBoost versus Logistic Regression
Received: 9 May 2019 / Revised: 7 June 2019 / Accepted: 12 June 2019 / Published: 20 June 2019
Viewed by 281 | PDF Full-text (2386 KB) | HTML Full-text | XML Full-text
Abstract
XGBoost is recognized as an algorithm with exceptional predictive capacity. Models for a binary response indicating the existence of accident claims versus no claims can be used to identify the determinants of traffic accidents. This study compared the relative performances of logistic regression and XGBoost approaches for predicting the existence of accident claims using telematics data. The dataset contained information from an insurance company about the individuals’ driving patterns—including total annual distance driven and percentage of total distance driven in urban areas. Our findings showed that logistic regression is a suitable model given its interpretability and good predictive capacity. XGBoost requires numerous model-tuning procedures to match the predictive performance of the logistic regression model, and demands greater effort in interpretation. Full article
(This article belongs to the Special Issue Machine Learning in Insurance)
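To illustrate the kind of binary-response claim model compared in this study, the sketch below fits a logistic regression by plain gradient descent on synthetic telematics-style data. The feature construction, coefficients, and sample size are invented for illustration; this is not the authors' dataset or code.

```python
import math, random

random.seed(42)

# Synthetic telematics-style data: standardised annual distance and % urban
# driving, with an invented true logit driving claim occurrence.
def simulate(n=2000):
    rows = []
    for _ in range(n):
        km = random.gauss(0.0, 1.0)       # standardised annual distance driven
        urban = random.gauss(0.0, 1.0)    # standardised % of urban driving
        logit = -1.0 + 0.8 * km + 0.5 * urban
        y = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
        rows.append((km, urban, y))
    return rows

def fit_logistic(rows, lr=0.1, epochs=300):
    """Full-batch gradient descent on the logistic log-likelihood."""
    b0 = b1 = b2 = 0.0
    n = len(rows)
    for _ in range(epochs):
        g0 = g1 = g2 = 0.0
        for km, urban, y in rows:
            p = 1 / (1 + math.exp(-(b0 + b1 * km + b2 * urban)))
            e = p - y
            g0 += e; g1 += e * km; g2 += e * urban
        b0 -= lr * g0 / n; b1 -= lr * g1 / n; b2 -= lr * g2 / n
    return b0, b1, b2

coefs = fit_logistic(simulate())
print(coefs)  # slope estimates should recover the positive signs
```

The interpretability advantage noted in the abstract is visible here: each fitted coefficient is directly a log-odds effect of one driving-pattern feature.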
Open Access Article
Smallholder Farmers’ Willingness to Pay for Agricultural Production Cost Insurance in Rural West Java, Indonesia: A Contingent Valuation Method (CVM) Approach
Received: 22 May 2019 / Revised: 11 June 2019 / Accepted: 18 June 2019 / Published: 20 June 2019
Viewed by 255 | PDF Full-text (2008 KB) | HTML Full-text | XML Full-text
Abstract
To reduce the negative impacts of risks in farming due to climate change, the government implemented agricultural production cost insurance in 2015. Although a large subsidy has been allocated by the government (80 percent of the premium), the farmers’ participation rate is still low (23 percent of the target in 2016). To address this issue, it is essential to identify farmers’ willingness to pay (WTP) for, and the determinants of their participation in, agricultural production cost insurance. Based on a field survey of 240 smallholder farmers in the Garut District, West Java Province in August–October 2017 and February 2018, the contingent valuation method (CVM) estimated farmers’ mean WTP at Rp 30,358/ha/cropping season ($2.25/ha/cropping season), which was 16 percent lower than the current premium (Rp 36,000/ha/cropping season = $2.67/ha/cropping season). Farmers who participated in agricultural production cost insurance shared some characteristics: operating larger farmland, having more contact with agricultural extension services, expecting lower production for the next cropping season, and being located in a downstream area. Full article
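As a minimal illustration of dichotomous-choice CVM estimation, the sketch below computes a Turnbull lower-bound mean WTP from bid acceptance rates. The bid design and acceptance shares are invented; the paper's survey data and estimates are not reproduced.

```python
# Hypothetical bid levels (Rp/ha/cropping season) and "yes" shares.
bids   = [10_000, 20_000, 30_000, 40_000, 50_000]
accept = [0.90,   0.70,   0.50,   0.30,   0.10]

def turnbull_mean_wtp(bids, accept):
    """Lower-bound (Turnbull) mean WTP from single-bounded dichotomous choice."""
    # Survivor function S(bid) = P(WTP >= bid); prepend bid 0 with S = 1.
    pts = [(0.0, 1.0)] + list(zip(bids, accept))
    mean = 0.0
    for (b_lo, s_lo), (b_hi, s_hi) in zip(pts, pts[1:]):
        mean += b_lo * (s_lo - s_hi)    # mass in [b_lo, b_hi) valued at b_lo
    mean += pts[-1][0] * pts[-1][1]     # mass above the top bid, valued there
    return mean

print(turnbull_mean_wtp(bids, accept))
```

Being a lower bound, this estimator is deliberately conservative about the probability mass between bid points.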
Open Access Article
Ruin Probability Functions and Severity of Ruin as a Statistical Decision Problem
Received: 24 May 2019 / Revised: 6 June 2019 / Accepted: 11 June 2019 / Published: 17 June 2019
Viewed by 228 | PDF Full-text (407 KB) | HTML Full-text | XML Full-text
Abstract
It is known that the classical ruin function under an exponential claim-size distribution depends on two parameters, referred to as the mean claim size and the relative security loading. These parameters are assumed to be unknown and random; thus, we can consider a loss function that measures the loss sustained by a decision-maker who accepts an incorrect ruin function as valid. Using a squared-error loss function and appropriate distribution functions for these parameters, the problem of estimating the ruin function reduces to a mixture procedure. First, a bivariate distribution for mixing the two parameters jointly is considered; second, different univariate distributions for mixing each parameter separately are examined. Consequently, a catalogue of ruin probability functions and severity-of-ruin measures, more flexible than the original ones, is obtained. The methodology is also extended to the Pareto claim-size distribution. Several numerical examples illustrate the performance of these functions. Full article
(This article belongs to the Special Issue Loss Models: From Theory to Applications)
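The mixture idea can be sketched with the classical exponential-claims ruin function, ψ(u) = (1/(1+θ)) exp(−θu/((1+θ)μ)), averaged over a discretised prior on the loading θ. The prior weights below are invented for illustration, not the distributions studied in the paper.

```python
import math

def ruin_exponential(u, mu, theta):
    """Classical ruin probability: exponential claims, mean mu, loading theta."""
    return (1 / (1 + theta)) * math.exp(-theta * u / ((1 + theta) * mu))

def mixed_ruin(u, mu, thetas, weights):
    """Ruin function mixed over an uncertain loading (discretised prior)."""
    return sum(w * ruin_exponential(u, mu, t) for t, w in zip(thetas, weights))

thetas  = [0.1, 0.2, 0.3]          # hypothetical support of the loading prior
weights = [0.3, 0.4, 0.3]          # hypothetical prior probabilities
for u in (0.0, 5.0, 10.0):
    print(u, mixed_ruin(u, 1.0, thetas, weights))
```

The mixture remains a decreasing function of the initial surplus u, but its tail is heavier than any single component's, which is the extra flexibility the abstract refers to.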
Open Access Article
Credit Risk Assessment Model for Small and Micro-Enterprises: The Case of Lithuania
Received: 7 April 2019 / Revised: 4 June 2019 / Accepted: 5 June 2019 / Published: 13 June 2019
Viewed by 227 | PDF Full-text (621 KB) | HTML Full-text | XML Full-text
Abstract
In this research, trade credit is analysed from a seller’s (supplier’s) perspective. Trade credit allows the supplier to increase sales and profits, but creates the risk that the customer will not pay, and at the same time increases the risk of the supplier’s insolvency. If the supplier is a small or micro-enterprise (SMiE), limited human and technical resources are usually an issue. Therefore, the supplier needs a highly accurate yet simple and interpretable trade credit risk assessment model that allows for assessing the insolvency risk of buyers (who are usually SMiE). The aim of the research is to create a statistical enterprise trade credit risk assessment (ETCRA) model for Lithuanian small and micro-enterprises (SMiE). In the empirical analysis, the financial and non-financial data of 734 small and micro-sized enterprises over the period 2010–2012 were chosen as the sample. Based on logistic regression, the ETCRA model was developed using financial and non-financial variables. In the ETCRA model, the enterprise’s financial performance is assessed from different perspectives: profitability, liquidity, solvency, and activity. Different model variants were created using (i) only financial ratios and (ii) financial ratios and non-financial variables. The inclusion of non-financial variables does not substantially improve the characteristics of the model, which means that models using only financial ratios can be used in practice, although models that include non-financial variables are also valid. The designed models can be used by suppliers when deciding whether to grant trade credit to small or micro-enterprises. Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
Open Access Article
Risk Factor Evolution for Counterparty Credit Risk under a Hidden Markov Model
Received: 31 March 2019 / Revised: 3 June 2019 / Accepted: 5 June 2019 / Published: 12 June 2019
Viewed by 440 | PDF Full-text (824 KB) | HTML Full-text | XML Full-text
Abstract
One of the key components of counterparty credit risk (CCR) measurement is generating scenarios for the evolution of the underlying risk factors, such as interest and exchange rates, equity and commodity prices, and credit spreads. Geometric Brownian Motion (GBM) is a widely used method for modeling the evolution of exchange rates. An important limitation of GBM is that, due to the assumption of constant drift and volatility, stylized facts of financial time-series, such as volatility clustering and heavy-tailedness in the returns distribution, cannot be captured. We propose a model where volatility and drift are able to switch between regimes; more specifically, they are governed by an unobservable Markov chain. Hence, we model exchange rates with a hidden Markov model (HMM) and generate scenarios for counterparty exposure using this approach. A numerical study is carried out and backtesting results for a number of exchange rates are presented. The impact of using a regime-switching model on counterparty exposure is found to be profound for derivatives with non-linear payoffs. Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
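A minimal regime-switching sketch: an exchange rate follows GBM whose drift and volatility are governed by a two-state Markov chain (calm vs. turbulent). The regime parameters and transition probabilities below are illustrative placeholders, not values calibrated in the paper, and the chain is simulated rather than hidden and filtered.

```python
import math, random

random.seed(1)

MU    = {0: 0.00, 1: 0.00}          # per-regime drift (illustrative)
SIGMA = {0: 0.08, 1: 0.30}          # per-regime annualised volatility
P     = {0: 0.98, 1: 0.90}          # probability of staying in the regime per step

def simulate_path(s0=1.0, steps=252, dt=1 / 252):
    """One exchange-rate path under Markov-switching GBM."""
    s, regime, path = s0, 0, [s0]
    for _ in range(steps):
        if random.random() > P[regime]:
            regime = 1 - regime      # switch between the two regimes
        z = random.gauss(0.0, 1.0)
        s *= math.exp((MU[regime] - 0.5 * SIGMA[regime] ** 2) * dt
                      + SIGMA[regime] * math.sqrt(dt) * z)
        path.append(s)
    return path

path = simulate_path()
print(len(path), path[-1])
```

Paths generated this way exhibit the volatility clustering that constant-parameter GBM cannot produce, which is what drives the exposure differences for non-linear payoffs.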
Open Access Article
Generalized Multiplicative Risk Apportionment
Received: 12 April 2019 / Revised: 30 May 2019 / Accepted: 4 June 2019 / Published: 12 June 2019
Viewed by 214 | PDF Full-text (319 KB) | HTML Full-text | XML Full-text
Abstract
This work examines the apportionment of multiplicative risks by considering three dominance orderings: first-degree stochastic dominance, Rothschild and Stiglitz’s increase in risk, and downside risk increase. We use the relative nth-degree risk aversion measure and decreasing relative nth-degree risk aversion to provide conditions guaranteeing a preference for “harm disaggregation” of multiplicative risks. Further, we relate our conclusions to preferences over bivariate lotteries, which can be interpreted in terms of correlation aversion, cross-prudence and cross-temperance. Full article
(This article belongs to the Special Issue Model Risk and Risk Measures)
Open Access Article
Default Ambiguity
Received: 29 March 2019 / Revised: 17 May 2019 / Accepted: 17 May 2019 / Published: 10 June 2019
Viewed by 250 | PDF Full-text (414 KB) | HTML Full-text | XML Full-text
Abstract
This paper discusses ambiguity in the context of single-name credit risk. We focus on uncertainty in the default intensity, but also discuss uncertainty in the recovery rate in a fractional-recovery-of-market-value setting. This approach is a first step towards integrating uncertainty in credit-risky term structure models and benefits from its simplicity. We derive drift conditions in a Heath–Jarrow–Morton forward rate setting in the case of an ambiguous default intensity combined with zero recovery, and in the case of ambiguous fractional recovery of the market value. Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
Open Access Article
A Renewal Shot Noise Process with Subexponential Shot Marks
Received: 28 April 2019 / Revised: 1 June 2019 / Accepted: 3 June 2019 / Published: 5 June 2019
Viewed by 234 | PDF Full-text (334 KB) | HTML Full-text | XML Full-text
Abstract
We investigate a shot noise process with subexponential shot marks occurring at renewal epochs. Our main result is a precise asymptotic formula for its tail probability. In doing so, some recent results regarding sums of randomly weighted subexponential random variables play a crucial role. Full article
(This article belongs to the Special Issue Heavy-Tail Phenomena in Insurance, Finance, and Other Related Fields)
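A shot noise process of the type studied here, S(t) = Σ_{T_i ≤ t} X_i h(t − T_i), can be simulated directly; the sketch below uses exponential inter-arrival times (the simplest renewal process), an exponential decay kernel, and Pareto shot marks as a standard subexponential example. All parameter values are illustrative, and the empirical tail frequency is only a Monte Carlo estimate, not the paper's asymptotic formula.

```python
import math, random

random.seed(7)

def shot_noise(t, rate=1.0, delta=0.5, alpha=1.5):
    """One sample of S(t) = sum X_i * exp(-delta*(t - T_i)) over renewal epochs
    with exponential inter-arrivals and Pareto(alpha) (subexponential) marks."""
    s, clock = 0.0, 0.0
    while True:
        clock += random.expovariate(rate)
        if clock > t:
            return s
        x = random.paretovariate(alpha)          # heavy-tailed shot mark
        s += x * math.exp(-delta * (t - clock))

samples = [shot_noise(10.0) for _ in range(5000)]
threshold = 20.0
tail = sum(v > threshold for v in samples) / len(samples)
print(tail)   # empirical P(S(10) > 20)
```

With heavy-tailed marks, large values of S(t) are typically driven by a single extreme shot, which is the intuition behind the single-big-jump asymptotics.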
Open Access Feature Paper Article
Analysis of Stochastic Reserving Models By Means of NAIC Claims Data
Received: 26 February 2019 / Revised: 26 May 2019 / Accepted: 28 May 2019 / Published: 4 June 2019
Viewed by 348 | PDF Full-text (558 KB) | HTML Full-text | XML Full-text | Supplementary Files
Abstract
In the past two decades, increasing computational power has resulted in the development of more advanced claims reserving techniques, allowing stochastic approaches to supersede deterministic methods and yielding forecasts of enhanced quality. Hence, not only point estimates but entire predictive distributions can be generated in order to forecast future claim amounts. The significant expansion in the variety of models requires the validation of these methods and the creation of supporting techniques for appropriate decision making. The present article compares and validates several existing and self-developed stochastic methods on actual data, applying comparison measures in an algorithmic manner. Full article
Open Access Article
The Investigation of a Forward-Rate Mortality Framework
Received: 13 February 2019 / Revised: 10 April 2019 / Accepted: 15 May 2019 / Published: 1 June 2019
Viewed by 295 | PDF Full-text (1751 KB) | HTML Full-text | XML Full-text
Abstract
Stochastic mortality models have been developed for a range of applications, from demographic projections to financial management. Financial risk-based models build on methods used for interest rates and apply them to mortality rates. They have the advantage of being applicable to financial pricing and the management of longevity risk. Olivier and Jeffery (2004) and Smith (2005) proposed a model based on a forward-rate mortality framework with stochastic factors driven by univariate gamma random variables irrespective of age or duration. We assess and further develop this model. We generalize the random shocks from a univariate gamma to a univariate Tweedie distribution and allow the distributions to vary by age. Furthermore, since dependence between ages is an observed characteristic of mortality rate improvements, we formulate a multivariate framework using copulas. We find that dependence increases with age and introduce a suitable covariance structure, one that is related to the notion of a minimum. The resulting model provides a more realistic basis for capturing the risk of mortality improvements and serves to enhance longevity risk management for pension and insurance funds. Full article
Open Access Article
A General Framework for Portfolio Theory. Part III: Multi-Period Markets and Modular Approach
Received: 6 May 2019 / Revised: 25 May 2019 / Accepted: 27 May 2019 / Published: 1 June 2019
Viewed by 303 | PDF Full-text (607 KB) | HTML Full-text | XML Full-text
Abstract
This is Part III of a series of papers which focus on a general framework for portfolio theory. Here, we extend the general framework for portfolio theory in a one-period financial market, as introduced in Part I [Maier-Paape and Zhu, Risks 2018, 6(2), 53], to multi-period markets. This extension is reasonable for applications. More importantly, we take a new approach, the “modular portfolio theory”, which is built from the interaction among four related modules: (a) multi-period market model; (b) trading strategies; (c) risk and utility functions (performance criteria); and (d) the optimization problem (efficient frontier and efficient portfolio). An important concept that allows dealing with the more general framework discussed here is a trading strategy generating function. This concept limits the discussion to a special class of manageable trading strategies, which is still wide enough to cover many frequently used trading strategies, for instance “constant weight” (fixed fraction). As an application, we discuss the utility function of compounded return and the risk measure of relative log drawdowns. Full article
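A relative log drawdown of the kind used as a risk measure here can be computed for a discrete equity curve as the largest log-distance from the running peak. The equity path below is invented for illustration; the paper's formal definition over trading strategies is not reproduced.

```python
import math

# Hypothetical equity curve of a strategy (illustrative values only).
equity = [1.00, 1.05, 1.12, 0.98, 1.03, 1.20, 1.10, 1.25]

peak, max_dd = equity[0], 0.0
for v in equity:
    peak = max(peak, v)                         # running maximum so far
    max_dd = max(max_dd, math.log(peak / v))    # drawdown measured in log terms
print(max_dd)
```

Measuring the drawdown in log terms makes it consistent with the compounded-return utility mentioned in the abstract, since both live on the same logarithmic scale.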
Open Access Article
American Options on High Dividend Securities: A Numerical Investigation
Received: 28 February 2019 / Revised: 7 May 2019 / Accepted: 15 May 2019 / Published: 21 May 2019
Viewed by 286 | PDF Full-text (871 KB) | HTML Full-text | XML Full-text
Abstract
I document a sizeable bias that might arise when valuing out of the money American options via the Least Square Method proposed by Longstaff and Schwartz (2001). The key point of this algorithm is the regression-based estimate of the continuation value of an American option. If this regression is ill-posed, the procedure might deliver biased results. The price of the American option might even fall below the price of its European counterpart. For call options, this is likely to occur when the dividend yield of the underlying is high. This distortion is documented within the standard Black–Scholes–Merton model as well as within its most common extensions (the jump-diffusion, the stochastic volatility and the stochastic interest rates models). Finally, I propose two easy and effective workarounds that fix this distortion. Full article
(This article belongs to the Special Issue Applications of Stochastic Optimal Control to Economics and Finance)
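The regression step at the heart of the least squares method can be sketched as follows for an American call under Black–Scholes dynamics with a high dividend yield q. The quadratic polynomial basis and all parameter values are illustrative choices, not the paper's exact setup, and no claim is made that this configuration reproduces the documented bias.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative parameters: high q makes early exercise of the call relevant.
S0, K, r, q, sigma, T, steps, n = 100.0, 110.0, 0.03, 0.08, 0.25, 1.0, 50, 50_000
dt = T / steps
disc = np.exp(-r * dt)

z = rng.standard_normal((n, steps))
S = S0 * np.exp(np.cumsum((r - q - 0.5 * sigma**2) * dt
                          + sigma * np.sqrt(dt) * z, axis=1))

payoff = np.maximum(S[:, -1] - K, 0.0)
cash = payoff.copy()
for t in range(steps - 2, -1, -1):
    cash *= disc                              # discount continuation one step back
    itm = S[:, t] > K                         # regress only on in-the-money paths
    if itm.sum() > 10:
        x = S[itm, t]
        A = np.vander(x, 3)                   # quadratic basis [x^2, x, 1]
        beta, *_ = np.linalg.lstsq(A, cash[itm], rcond=None)
        cont = A @ beta                       # fitted continuation value
        ex = np.maximum(x - K, 0.0)
        cash[itm] = np.where(ex > cont, ex, cash[itm])

american = disc * cash.mean()
european = np.exp(-r * T) * payoff.mean()
print(american, european)
```

When few paths are in the money, the regression design matrix is nearly degenerate and the fitted continuation value becomes unreliable, which is the ill-posedness the abstract refers to.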
Open Access Feature Paper Article
Revisiting Calibration of the Solvency II Standard Formula for Mortality Risk: Does the Standard Stress Scenario Provide an Adequate Approximation of Value-at-Risk?
Received: 18 April 2019 / Revised: 10 May 2019 / Accepted: 13 May 2019 / Published: 19 May 2019
Viewed by 417 | PDF Full-text (1151 KB) | HTML Full-text | XML Full-text
Abstract
The primary objective of this work is to analyze model-based Value-at-Risk associated with mortality risk arising from issued term life assurance contracts, and to compare the results with the capital requirements for mortality risk as determined using the Solvency II Standard Formula. In particular, two approaches to calculating Value-at-Risk are analyzed: one-year VaR and run-off VaR. The calculations of Value-at-Risk are performed using stochastic mortality rates which are calibrated using the Lee-Carter model fitted to mortality data of selected European countries. Results indicate that, depending on the approach taken to calculate Value-at-Risk, the key factors driving its relative size are: sensitivity of technical provisions to the latest mortality experience, volatility of mortality rates in a country, policy term, and benefit formula. Overall, we found that the Solvency II Standard Formula on average delivers an adequate capital requirement; however, we also highlight particular situations where it could understate or overstate portfolio-specific model-based Value-at-Risk for mortality risk. Full article
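The Lee-Carter calibration step mentioned above can be sketched with the usual SVD estimation of log m(x,t) = a_x + b_x k_t. The grid size, parameter shapes, and noise level below are synthetic stand-ins, not the European mortality data used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic log central death rates on a small age x year grid (illustrative).
ages, years = 10, 30
a = np.linspace(-6.0, -2.0, ages)            # log level rising with age
b = np.full(ages, 1.0 / ages)                # age sensitivities, summing to 1
k = np.linspace(5.0, -5.0, years)            # declining period index
log_m = a[:, None] + b[:, None] * k[None, :] + rng.normal(0, 0.02, (ages, years))

# Lee-Carter estimation: a_x = row mean, then rank-1 SVD of the residual.
a_hat = log_m.mean(axis=1)
resid = log_m - a_hat[:, None]
U, s, Vt = np.linalg.svd(resid, full_matrices=False)
sign = np.sign(U[:, 0].sum())                # fix the SVD sign ambiguity
b_hat = U[:, 0] * sign
k_hat = s[0] * Vt[0] * sign
b_scale = b_hat.sum()
b_hat, k_hat = b_hat / b_scale, k_hat * b_scale   # normalise sum(b_x) = 1
print(b_hat.sum(), np.corrcoef(k, k_hat)[0, 1])
```

Projecting k_t forward with a time-series model and re-simulating then yields the stochastic mortality scenarios from which the one-year or run-off VaR is read off.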
Open Access Article
The Determinants of Market-Implied Recovery Rates
Received: 1 April 2019 / Revised: 6 May 2019 / Accepted: 10 May 2019 / Published: 18 May 2019
Viewed by 299 | PDF Full-text (987 KB) | HTML Full-text | XML Full-text
Abstract
In the presence of recovery risk, the recovery rate is a random variable whose risk-neutral expectation can be inferred from the prices of defaultable instruments. I extract market-implied recovery rates from the term structures of credit default swap spreads for a sample of 497 United States (U.S.) corporate issuers over the 2005–2014 period. I analyze the explanatory factors of market-implied recovery rates within a linear regression framework and also within a Tobit model, and I compare them with the determinants of historical recovery rates that were previously identified in the literature. In contrast to their historical counterparts, market-implied recovery rates are mostly driven by macroeconomic factors and long-term, issuer-specific variables. Short-term financial variables and industry conditions significantly impact the slope of market-implied recovery rates. These results indicate that the design of a recovery risk model should be based on specific market factors, not on the statistical evidence that is provided by historical recovery rates. Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
Open Access Feature Paper Article
Statistical Inference for the Beta Coefficient
Received: 26 March 2019 / Revised: 24 April 2019 / Accepted: 8 May 2019 / Published: 15 May 2019
Viewed by 288 | PDF Full-text (6427 KB) | HTML Full-text | XML Full-text
Abstract
The beta coefficient plays a crucial role in finance as a risk measure of a portfolio in comparison to the benchmark portfolio. In the paper, we investigate statistical properties of the sample estimator for the beta coefficient. Assuming that both the holding portfolio and the benchmark portfolio consist of the same assets whose returns are multivariate normally distributed, we provide the finite sample and the asymptotic distributions of the sample estimator for the beta coefficient. These findings are used to derive a statistical test for the beta coefficient and to construct a confidence interval for the beta coefficient. Moreover, we show that the sample estimator is an unbiased estimator for the beta coefficient. The theoretical results are implemented in an empirical study. Full article
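The sample estimator studied here is the familiar ratio of the sample covariance with the benchmark to the benchmark's sample variance. A quick check on simulated returns (the true beta, return moments, and noise level are invented; the paper's multivariate normal construction from shared assets is not reproduced):

```python
import random

random.seed(0)

# Simulate benchmark returns and a portfolio with true beta = 1.3 plus noise.
true_beta, n = 1.3, 5000
rb = [random.gauss(0.005, 0.04) for _ in range(n)]
rp = [true_beta * r + random.gauss(0.0, 0.01) for r in rb]

mb = sum(rb) / n
mp = sum(rp) / n
cov = sum((x - mb) * (y - mp) for x, y in zip(rb, rp)) / (n - 1)
var = sum((x - mb) ** 2 for x in rb) / (n - 1)
beta_hat = cov / var                # sample beta estimator
print(beta_hat)
```

The paper's contribution is the exact finite-sample distribution of this ratio, from which tests and confidence intervals for beta follow.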
Open Access Article
Model Efficiency and Uncertainty in Quantile Estimation of Loss Severity Distributions
Received: 29 January 2019 / Revised: 23 April 2019 / Accepted: 26 April 2019 / Published: 15 May 2019
Viewed by 334 | PDF Full-text (557 KB) | HTML Full-text | XML Full-text
Abstract
Quantiles of probability distributions play a central role in the definition of risk measures (e.g., value-at-risk, conditional tail expectation) which in turn are used to capture the riskiness of the distribution tail. Estimates of risk measures are needed in many practical situations such as in pricing of extreme events, developing reserve estimates, designing risk transfer strategies, and allocating capital. In this paper, we present the empirical nonparametric and two types of parametric estimators of quantiles at various levels. For parametric estimation, we employ the maximum likelihood and percentile-matching approaches. Asymptotic distributions of all the estimators under consideration are derived when data are left-truncated and right-censored, which is a typical loss variable modification in insurance. Then, we construct relative efficiency curves (REC) for all the parametric estimators. Specific examples of such curves are provided for exponential and single-parameter Pareto distributions for a few data truncation and censoring cases. Additionally, using simulated data we examine how wrong quantile estimates can be when one makes incorrect modeling assumptions. The numerical analysis is also supplemented with standard model diagnostics and validation (e.g., quantile-quantile plots, goodness-of-fit tests, information criteria) and presents an example of when those methods can mislead the decision maker. These findings pave the way for further work on RECs with potential for them being developed into an effective diagnostic tool in this context. Full article
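The contrast between nonparametric and parametric quantile estimation can be sketched for an exponential severity, where the MLE-based quantile is Q(p) = −θ̂ ln(1−p) with θ̂ the sample mean. Parameters and sample size are illustrative, and no truncation or censoring is applied here, unlike in the paper.

```python
import math, random, statistics

random.seed(5)

theta = 10.0                                     # true exponential mean (illustrative)
data = [random.expovariate(1 / theta) for _ in range(2000)]
p = 0.95

# Empirical (nonparametric) quantile estimate.
emp_q = statistics.quantiles(data, n=100)[94]    # ~95th percentile

# Parametric MLE quantile: theta_hat = sample mean.
theta_hat = sum(data) / len(data)
mle_q = -theta_hat * math.log(1 - p)

true_q = -theta * math.log(1 - p)
print(emp_q, mle_q, true_q)
```

The parametric estimate is typically more efficient when the model is right, and badly biased when it is wrong; relative efficiency curves quantify exactly this trade-off.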
Open Access Article
Direct and Hierarchical Models for Aggregating Spatially Dependent Catastrophe Risks
Received: 31 March 2019 / Revised: 28 April 2019 / Accepted: 5 May 2019 / Published: 8 May 2019
Viewed by 470 | PDF Full-text (1179 KB) | HTML Full-text | XML Full-text
Abstract
We present several fast algorithms for computing the distribution of a sum of spatially dependent, discrete random variables to aggregate catastrophe risk. The algorithms are based on direct and hierarchical copula trees. Computing speed comes from the fact that loss aggregation at branching nodes is based on a combination of a fast approximation to brute-force convolution, arithmetization (regridding), and the linear complexity of the method for computing the distribution of a comonotonic sum of risks. We discuss the impact of tree topology on the second-order moments and tail statistics of the resulting distribution of the total risk. We test the performance of the presented models by accumulating ground-up losses for 29,000 risks affected by hurricane peril. Full article
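The baseline operation the paper's fast approximations improve on, aggregation at a tree node by brute-force convolution of independent discrete losses, looks like this. The per-site loss distributions are invented, and the dependence modelling via copulas is deliberately omitted.

```python
def convolve(p, q):
    """Distribution of X + Y for independent discrete losses on 0, 1, 2, ... units."""
    out = [0.0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

# Hypothetical per-site loss distributions (probability of losing 0, 1, 2 units).
sites = [[0.7, 0.2, 0.1], [0.6, 0.3, 0.1], [0.8, 0.1, 0.1]]

total = [1.0]
for dist in sites:
    total = convolve(total, dist)        # brute-force aggregation at a node
print(total, sum(total))
```

Brute-force convolution is quadratic in support size, which is why regridding and fast approximations matter once tens of thousands of risks are aggregated.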
Open Access Article
The Population Accuracy Index: A New Measure of Population Stability for Model Monitoring
Received: 21 March 2019 / Revised: 15 April 2019 / Accepted: 23 April 2019 / Published: 6 May 2019
Viewed by 603 | PDF Full-text (1383 KB) | HTML Full-text | XML Full-text
Abstract
Risk models developed on one dataset are often applied to new data and, in such cases, it is prudent to check that the model is suitable for the new data. An important application is in the banking industry, where statistical models are applied to loans to determine provisions and capital requirements. These models are developed on historical data, and regulations require their monitoring to ensure they remain valid on current portfolios—often years since the models were developed. The Population Stability Index (PSI) is an industry standard to measure whether the distribution of the current data has shifted significantly from the distribution of data used to develop the model. This paper explores several disadvantages of the PSI and proposes the Prediction Accuracy Index (PAI) as an alternative. The superior properties and interpretation of the PAI are discussed and it is concluded that the PAI can more accurately summarise the level of population stability, helping risk analysts and managers determine whether the model remains fit-for-purpose. Full article
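The industry-standard PSI that the paper critiques is computed over matched bins as PSI = Σ (a_i − e_i) ln(a_i / e_i). The bin proportions below are invented; the paper's proposed PAI alternative is not reproduced here.

```python
import math

def psi(expected, actual):
    """Population Stability Index over matched, non-empty bins."""
    return sum((a - e) * math.log(a / e) for e, a in zip(expected, actual))

dev = [0.25, 0.25, 0.25, 0.25]     # development-sample bin proportions
cur = [0.30, 0.27, 0.23, 0.20]     # current-portfolio bin proportions
print(psi(dev, cur))               # rule of thumb: > 0.25 signals a marked shift
```

One weakness visible already in the formula: the PSI depends only on marginal bin proportions, not on how the shift affects the model's predictions, which motivates prediction-based alternatives.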
Open Access Article
Spatial Risk Measures and Rate of Spatial Diversification
Received: 21 December 2018 / Revised: 25 March 2019 / Accepted: 4 April 2019 / Published: 2 May 2019
Viewed by 382 | PDF Full-text (512 KB) | HTML Full-text | XML Full-text
Abstract
An accurate assessment of the risk of extreme environmental events is of great importance for populations, authorities and the banking/insurance/reinsurance industry. Koch (2017) introduced a notion of spatial risk measure and a corresponding set of axioms which are well suited to analyze the risk due to events having a spatial extent, such as environmental phenomena. The axiom of asymptotic spatial homogeneity is of particular interest since it allows one to quantify the rate of spatial diversification when the region under consideration becomes large. In this paper, we first investigate the general concepts of spatial risk measures and corresponding axioms further and thoroughly explain the usefulness of this theory for both actuarial science and practice. Second, in the case of a general cost field, we give sufficient conditions such that spatial risk measures associated with expectation, variance, value-at-risk as well as expected shortfall and induced by this cost field satisfy the axioms of asymptotic spatial homogeneity of order 0, −2, −1 and −1, respectively. Last but not least, in the case where the cost field is a function of a max-stable random field, we provide conditions on both the function and the max-stable field ensuring the latter properties. Max-stable random fields are relevant when assessing the risk of extreme events since they appear as a natural extension of multivariate extreme-value theory to the level of random fields. Overall, this paper improves our understanding of spatial risk measures as well as of their properties with respect to the space variable, and generalizes many results obtained in Koch (2017). Full article
(This article belongs to the Special Issue Risk, Ruin and Survival: Decision Making in Insurance and Finance)
Open AccessFeature PaperArticle
The Optimum Leverage Level of the Banking Sector
Received: 28 March 2019 / Revised: 26 April 2019 / Accepted: 27 April 2019 / Published: 1 May 2019
Viewed by 390 | PDF Full-text (3450 KB) | HTML Full-text | XML Full-text
Abstract
Banks make profits from the difference between short-term and long-term loan interest rates. To issue loans, banks raise funds from capital markets. Since the long-term loan rate is relatively stable but short-term interest rates are usually variable, there is an interest rate risk. Therefore, banks need information about optimal leverage strategies based on the current economic situation. Recent studies of the economic crisis by many economists showed that the crisis was due to too much leveraging by “big banks”, and that this leveraging turns out to be close to Kelly’s optimal point. It is known that Kelly’s strategy does not address risk adequately. We used the return–drawdown ratio and the inflection point of Kelly’s cumulative return curve over a finite investment horizon to derive more conservative leverage levels. Moreover, we carried out a sensitivity analysis to determine strategies during a period of rising interest rates, which is the most important and riskiest period for leverage. Thus, we brought theoretical results closer to practical applications. Furthermore, using the sensitivity analysis method, banks can change allocation sizes to loans with different maturities to mediate the risks in different monetary policy environments. This provides bank managers with flexible tools for mitigating risk. Full article
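The leverage logic in this abstract can be illustrated with a small numerical sketch. Assuming (for illustration only; these are not the paper's figures) a geometric-Brownian return spread with drift `mu`, funding rate `r` and volatility `sigma`, the Kelly-optimal leverage is `(mu - r) / sigma**2`, and scaling it back ("fractional Kelly") is one common way to obtain the more conservative levels the abstract refers to:

```python
# Kelly vs. fractional-Kelly leverage sketch. The geometric-Brownian setting
# and all parameter values are illustrative assumptions, not the paper's data.
def kelly_leverage(mu, r, sigma):
    """Kelly-optimal leverage for drift mu, funding rate r, volatility sigma."""
    return (mu - r) / sigma ** 2

def fractional_kelly(mu, r, sigma, fraction=0.5):
    """Scale back from the Kelly point to trade growth for smaller drawdowns."""
    return fraction * kelly_leverage(mu, r, sigma)

mu, r, sigma = 0.06, 0.02, 0.10  # hypothetical spread drift, funding rate, volatility
print(round(kelly_leverage(mu, r, sigma), 4))    # 4.0 -> 4x leverage at the Kelly point
print(round(fractional_kelly(mu, r, sigma), 4))  # 2.0 -> 2x at half-Kelly
```

Half-Kelly is only one conservative rule; the paper derives its levels from the return–drawdown ratio and the inflection point of the cumulative return curve instead.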
Open AccessArticle
Practice Oriented and Monte Carlo Based Estimation of the Value-at-Risk for Operational Risk Measurement
Received: 22 March 2019 / Revised: 15 April 2019 / Accepted: 25 April 2019 / Published: 1 May 2019
Viewed by 408 | PDF Full-text (585 KB) | HTML Full-text | XML Full-text
Abstract
We explore the Monte Carlo steps required to reduce the sampling error of the estimated 99.9% quantile within an acceptable threshold. Our research is of primary interest to practitioners working in the area of operational risk measurement, where the annual loss distribution cannot be analytically determined in advance. Usually, the frequency and the severity distributions should be adequately combined and elaborated with Monte Carlo methods, in order to estimate the loss distributions and risk measures. Naturally, financial analysts and regulators are interested in mitigating sampling errors, as prescribed in EU Regulation 2018/959. In particular, the sampling error of the 99.9% quantile is of paramount importance, along the lines of EU Regulation 575/2013. The Monte Carlo error for the operational risk measure is here assessed on the basis of the binomial distribution. Our approach is then applied to realistic simulated data, yielding a comparable precision of the estimate with a much lower computational effort, when compared to bootstrap, Monte Carlo repetition, and two other methods based on numerical optimization. Full article
(This article belongs to the Special Issue Risk, Ruin and Survival: Decision Making in Insurance and Finance)
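The binomial-based assessment of the 99.9% quantile's sampling error can be sketched as follows: the count of draws falling below the true quantile is Binomial(n, 0.999), so order statistics bracketing that count give a confidence band for the estimated VaR. The loss model below (Poisson-like frequency, lognormal severity) is purely illustrative, not the paper's data:

```python
import math
import random

def quantile_ci_indices(n, q=0.999, conf_z=1.959963984540054):
    """Order-statistic indices bracketing the q-quantile with ~95% coverage,
    via the normal approximation to the Binomial(n, q) count of draws
    falling below the true quantile."""
    mean = n * q
    sd = math.sqrt(n * q * (1.0 - q))
    lo = max(0, int(math.floor(mean - conf_z * sd)) - 1)
    hi = min(n - 1, int(math.ceil(mean + conf_z * sd)))
    return lo, hi

# Illustrative annual-loss sample: random frequency x lognormal severity.
random.seed(1)
losses = sorted(
    sum(random.lognormvariate(8.0, 2.0) for _ in range(random.randint(0, 20)))
    for _ in range(50_000)
)
lo, hi = quantile_ci_indices(len(losses))
var_999 = losses[int(0.999 * len(losses))]
print(f"VaR 99.9% ~ {var_999:.0f}, 95% band [{losses[lo]:.0f}, {losses[hi]:.0f}]")
```

Widening `n` until the band `[losses[lo], losses[hi]]` is acceptably narrow is exactly the "how many Monte Carlo steps" question the paper addresses more efficiently.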
Open AccessArticle
Stackelberg Equilibrium Premium Strategies for Push-Pull Competition in a Non-Life Insurance Market with Product Differentiation
Received: 16 January 2019 / Revised: 12 April 2019 / Accepted: 14 April 2019 / Published: 1 May 2019
Viewed by 312 | PDF Full-text (907 KB) | HTML Full-text | XML Full-text
Abstract
Two insurance companies I1, I2 with reserves R1(t), R2(t) compete for customers, such that in a suitable differential game the smaller company I2, with R2(0) < R1(0), aims at minimizing R1(t) − R2(t) by using its premium p2 as control, while the larger I1 aims at maximizing it by using p1. Deductibles K1, K2 are fixed but may be different. If K1 > K2 and I2 is the leader choosing its premium first, conditions for a Stackelberg equilibrium are established. For gamma-distributed rates of claim arrivals, explicit equilibrium premiums are obtained and shown to depend on the running reserve difference. The analysis is based on the diffusion approximation to a standard Cramér-Lundberg risk process extended to allow investment in a risk-free asset. Full article
(This article belongs to the Special Issue Recent Development in Actuarial Science and Related Fields)
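A toy, discretized version of leader-follower premium competition (not the paper's diffusion/reserve model) shows the Stackelberg mechanics: the leader commits to its premium anticipating the follower's best response, computed here by grid search over a hypothetical linear demand-shift rule:

```python
# Toy Stackelberg premium game on a grid -- a sketch only, not the paper's
# model. Demand shifts linearly toward the cheaper insurer; every parameter
# below is a hypothetical, illustrative choice.
def profit(p_own, p_other, base_demand=100.0, sensitivity=40.0, cost=1.0):
    demand = max(base_demand - sensitivity * (p_own - p_other), 0.0)
    return (p_own - cost) * demand

grid = [1.0 + 0.05 * i for i in range(101)]  # candidate premiums 1.00 .. 6.00

def best_response(p_other):
    """Follower's premium maximizing its own profit, given the other's price."""
    return max(grid, key=lambda p: profit(p, p_other))

# Leader commits first, anticipating the follower's reaction (backward induction).
p_leader = max(grid, key=lambda p: profit(p, best_response(p)))
p_follower = best_response(p_leader)
print(round(p_leader, 2), round(p_follower, 2))
```

In this toy setup the leader exploits its first-mover commitment by pricing above the follower's response; the paper instead characterizes the equilibrium premiums as functions of the running reserve difference.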
Open AccessFeature PaperArticle
Optimal Excess-of-Loss Reinsurance for Stochastic Factor Risk Models
Received: 31 January 2019 / Revised: 18 April 2019 / Accepted: 26 April 2019 / Published: 1 May 2019
Viewed by 337 | PDF Full-text (1315 KB) | HTML Full-text | XML Full-text
Abstract
We study the optimal excess-of-loss reinsurance problem when both the intensity of the claims arrival process and the claim size distribution are influenced by an exogenous stochastic factor. We assume that the insurer’s surplus is governed by a marked point process with dual-predictable projection affected by an environmental factor and that the insurance company can borrow and invest money at a constant real-valued risk-free interest rate r. Our model allows for stochastic risk premia, which take into account risk fluctuations. Using stochastic control theory based on the Hamilton-Jacobi-Bellman equation, we analyze the optimal reinsurance strategy under the criterion of maximizing the expected exponential utility of the terminal wealth. A verification theorem for the value function in terms of classical solutions of a backward partial differential equation is provided. Finally, some numerical results are discussed. Full article
(This article belongs to the Special Issue Applications of Stochastic Optimal Control to Economics and Finance)
Open AccessFeature PaperArticle
Contingent Convertible Debt: The Impact on Equity Holders
Received: 30 March 2019 / Revised: 16 April 2019 / Accepted: 16 April 2019 / Published: 29 April 2019
Viewed by 374 | PDF Full-text (1739 KB) | HTML Full-text | XML Full-text
Abstract
Contingent Convertible (CoCo) is a hybrid debt issued by banks with a specific feature forcing its conversion to equity in the event of the bank’s financial distress. CoCo carries two major risks: the risk of default, which threatens any type of debt instrument, and the risk of mandatory conversion, which is exclusive to CoCos. In this paper, we propose a model to value CoCo debt instruments as a function of the debt ratio. Although the CoCo is a more expensive instrument than traditional debt, its presence in the capital structure lowers the cost of ordinary debt and reduces the total cost of debt. For initial equity holders, the presence of CoCo in the bank’s capital structure increases the shareholders’ aggregate value. Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
Open AccessArticle
Measuring and Allocating Systemic Risk
Received: 26 November 2018 / Revised: 13 April 2019 / Accepted: 16 April 2019 / Published: 26 April 2019
Cited by 3 | Viewed by 565 | PDF Full-text (389 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we develop a framework for measuring, allocating and managing systemic risk. SystRisk, our measure of total systemic risk, captures the a priori cost to society for providing tail-risk insurance to the financial system. Our allocation principle distributes the total systemic risk among individual institutions according to their size-shifted marginal contributions. To describe economic shocks and systemic feedback effects, we propose a reduced form stochastic model that can be calibrated to historical data. We also discuss systemic risk limits, systemic risk charges and a cap and trade system for systemic risk. Full article
Open AccessArticle
Sound Deposit Insurance Pricing Using a Machine Learning Approach
Received: 30 December 2018 / Revised: 7 April 2019 / Accepted: 13 April 2019 / Published: 19 April 2019
Viewed by 461 | PDF Full-text (418 KB) | HTML Full-text | XML Full-text
Abstract
While the main conceptual issue related to deposit insurance is moral hazard risk, the main technical issue is inaccurate calibration of the implied volatility, which can raise the risk of generating an arbitrage. In this paper, we first argue that, under the no-moral-hazard condition, removing arbitrage is equivalent to removing static arbitrage. Then, we propose a simple quadratic model to parameterize implied volatility and remove the static arbitrage. The process is as follows: using a machine learning approach with a regularized cost function, we update the parameters so that butterfly arbitrage is ruled out; and, implementing a calibration method, we impose conditions on the parameters of each time slice to rule out calendar spread arbitrage. Eliminating both butterfly and calendar spread arbitrage makes the implied volatility surface free of static arbitrage. Full article
(This article belongs to the Special Issue Machine Learning in Insurance)
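Ruling out butterfly arbitrage can be checked numerically: a smile is free of static butterfly arbitrage exactly when the call prices it implies are convex in strike. The sketch below applies this check to a hypothetical quadratic smile in log-moneyness; the parameterization and all numbers are illustrative assumptions, not the paper's calibrated model:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(S, K, T, r, sigma):
    """Black-Scholes call price."""
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    return S * norm_cdf(d1) - K * math.exp(-r * T) * norm_cdf(d2)

def quad_vol(k, a, b, c):
    """Hypothetical quadratic smile in log-moneyness k = ln(K/S)."""
    return a + b * k + c * k * k

def butterfly_free(S, T, r, a, b, c, strikes):
    """No static butterfly arbitrage <=> call prices convex in strike,
    checked here via non-negative second differences on a strike grid."""
    prices = [bs_call(S, K, T, r, quad_vol(math.log(K / S), a, b, c))
              for K in strikes]
    return all(prices[i - 1] - 2.0 * prices[i] + prices[i + 1] >= -1e-10
               for i in range(1, len(prices) - 1))

strikes = [80.0 + i for i in range(41)]  # strike grid 80 .. 120
print(butterfly_free(100.0, 1.0, 0.01, 0.20, 0.0, 0.0, strikes))  # flat smile -> True
print(butterfly_free(100.0, 1.0, 0.01, 0.20, 0.0, 0.5, strikes))  # curved smile: inspect
```

The paper's contribution is to enforce such conditions during calibration (with a regularized cost function) rather than verifying them after the fact.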
Open AccessArticle
Bank Competition in India: Some New Evidence Using Risk-Adjusted Lerner Index Approach
Received: 17 February 2019 / Revised: 9 April 2019 / Accepted: 10 April 2019 / Published: 18 April 2019
Viewed by 394 | PDF Full-text (1046 KB) | HTML Full-text | XML Full-text
Abstract
Banks in India have gone through structural changes in the last three decades. The prices that banks charge depend on the competitive level in the banking sector and the risk that the assets and liabilities carry on banks’ balance sheets. The traditional Lerner index indicates competitive levels. However, this measure does not account for risk, so this study introduces a risk-adjusted Lerner index for evaluating competition in Indian banking over the period 1996 to 2016. The market power estimated through the adjusted Lerner index has been declining since 1996, which indicates an improvement in competitive conditions over the period. Further, as indicated by the risk-adjusted Lerner index, the Indian banking system exerts much less market power, and hence is more competitive, than the traditional Lerner index suggests. Full article
(This article belongs to the Special Issue Financial Risks and Regulation)
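The Lerner index itself is simple to compute: L = (P − MC)/P. One way to see the paper's point is that deducting a per-unit risk cost from the margin shrinks measured market power; the adjustment form and the figures below are illustrative assumptions, not the paper's estimator:

```python
# Sketch: traditional vs. risk-adjusted Lerner index. Treating a per-unit
# risk cost as extra marginal cost is an assumed, illustrative adjustment.
def lerner(price, marginal_cost):
    """Traditional Lerner index: markup as a share of price."""
    return (price - marginal_cost) / price

def risk_adjusted_lerner(price, marginal_cost, risk_cost):
    """Deduct an expected-loss-style risk cost before measuring market power."""
    return (price - marginal_cost - risk_cost) / price

p, mc, rc = 0.10, 0.06, 0.02  # hypothetical per-unit loan price, cost, risk cost
print(round(lerner(p, mc), 2))                    # 0.4
print(round(risk_adjusted_lerner(p, mc, rc), 2))  # 0.2 -> much less market power
```

A value of 0 indicates perfect competition and values near 1 indicate strong market power, which is why the risk adjustment reverses the traditional index's verdict in the paper.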
Open AccessArticle
Treatment Level and Store Level Analyses of Healthcare Data
Received: 23 February 2019 / Revised: 3 April 2019 / Accepted: 13 April 2019 / Published: 17 April 2019
Viewed by 531 | PDF Full-text (834 KB) | HTML Full-text | XML Full-text
Abstract
The presented research discusses general approaches to analyze and model healthcare data at the treatment level and at the store level. The paper consists of two parts: (1) a general analysis method for store-level product sales of an organization and (2) a treatment-level analysis method of healthcare expenditures. In the first part, our goal is to develop a modeling framework to help understand the factors influencing the sales volume of stores maintained by a healthcare organization. In the second part of the paper, we demonstrate a treatment-level approach to modeling healthcare expenditures. In this part, we aim to improve the operational-level management of a healthcare provider by predicting the total cost of medical services. From this perspective, treatment-level analyses of medical expenditures may help provide a micro-level approach to predicting the total amount of expenditures for a healthcare provider. We present a model for analyzing a specific type of medical data, which may arise commonly in a healthcare provider’s standardized database. We do this by using an extension of the frequency-severity approach to modeling insurance expenditures from the actuarial science literature. Full article
(This article belongs to the Special Issue Young Researchers in Insurance and Risk Management)
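The frequency-severity decomposition mentioned above models total cost as a random number of treatments (frequency) times a random cost per treatment (severity), so E[total] = E[N]·E[X]. A stdlib-only simulation sketch with illustrative Poisson frequency and gamma severity (not the paper's fitted model):

```python
import random

# Frequency-severity sketch: annual cost = sum of a Poisson(lam) number of
# gamma-distributed treatment costs. All parameters are illustrative.
def poisson(lam, rng):
    """Poisson draw: count unit-rate exponential arrivals before time lam."""
    n, t = 0, rng.expovariate(1.0)
    while t < lam:
        n += 1
        t += rng.expovariate(1.0)
    return n

def mean_annual_cost(freq_mean, sev_shape, sev_scale, n_sims=20_000, seed=0):
    """Monte Carlo estimate of the expected total cost per member-year."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_sims):
        total += sum(rng.gammavariate(sev_shape, sev_scale)
                     for _ in range(poisson(freq_mean, rng)))
    return total / n_sims

# Theory: E[total] = E[N] * E[X] = 2.0 * (2.0 * 150.0) = 600 per member-year.
print(round(mean_annual_cost(2.0, 2.0, 150.0)))  # close to 600
```

Modeling frequency and severity separately, then recombining them, is what lets a treatment-level analysis roll up to a provider-level expenditure prediction.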
Open AccessArticle
Defining Geographical Rating Territories in Auto Insurance Regulation by Spatially Constrained Clustering
Received: 11 March 2019 / Accepted: 13 April 2019 / Published: 17 April 2019
Viewed by 337 | PDF Full-text (1985 KB) | HTML Full-text | XML Full-text
Abstract
Territory design and analysis using geographical loss cost is a key aspect of auto insurance rate regulation. The major objective of this work is to study the design of geographical rating territories by maximizing within-group homogeneity and among-group heterogeneity from a statistical perspective, while maximizing the actuarial equity of the pure premium, as required by insurance regulation. To achieve this goal, spatially constrained clustering of industry-level loss cost was investigated. To meet contiguity, a legal requirement on the design of geographical rating territories, a clustering approach based on Delaunay triangulation is proposed. Furthermore, an entropy-based approach was introduced to quantify the homogeneity of clusters, while both the elbow method and the gap statistic are used to determine the initial number of clusters. This study illustrates the usefulness of the spatially constrained clustering approach in defining geographical rating territories for insurance rate regulation. The significance of this work is to provide a new solution for better designing geographical rating territories. The proposed method can also be useful for other demographic data analyses with similar spatial constraints. Full article
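The entropy-based homogeneity idea can be sketched directly: discretize loss cost into bands, compute the Shannon entropy of band labels within each territory, and average; a better territory design yields lower within-territory entropy. The bands and candidate designs below are illustrative, not the paper's data:

```python
import math
from collections import Counter

def entropy(labels):
    """Shannon entropy (nats) of loss-cost band labels within one territory."""
    n = len(labels)
    return -sum((c / n) * math.log(c / n) for c in Counter(labels).values())

def mean_within_entropy(territories):
    """Size-weighted average within-territory entropy; lower = more homogeneous."""
    total = sum(len(t) for t in territories)
    return sum(len(t) / total * entropy(t) for t in territories)

# Hypothetical loss-cost bands (L/M/H) for two candidate territory designs.
design_a = [['L', 'L', 'L', 'L'], ['H', 'H', 'H', 'M']]  # mostly pure territories
design_b = [['L', 'H', 'L', 'M'], ['H', 'L', 'M', 'M']]  # mixed territories
print(mean_within_entropy(design_a) < mean_within_entropy(design_b))  # True
```

In the paper this score is minimized subject to the contiguity constraint, with neighbor relations supplied by the Delaunay triangulation of territory locations.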
Open AccessArticle
Pricing of Longevity Derivatives and Cost of Capital
Received: 13 March 2019 / Revised: 6 April 2019 / Accepted: 8 April 2019 / Published: 15 April 2019
Viewed by 502 | PDF Full-text (5938 KB) | HTML Full-text | XML Full-text
Abstract
Annuity providers are becoming increasingly exposed to longevity risk due to the increase in life expectancy. To hedge this risk, new longevity derivatives have been proposed (longevity bonds, q-forwards, S-swaps…). Although academic researchers, policy makers and practitioners have discussed them for years, longevity-linked securities are not widely traded in financial markets, due in particular to the difficulty of pricing them. In this paper, we compare different existing pricing methods and propose a Cost of Capital approach. Our method is designed to be more consistent with Solvency II requirements (longevity risk assessment based on a one-year time horizon). The price of longevity risk is determined for an S-forward and an S-swap but can be used to price other longevity-linked securities. We also compare this Cost of Capital method with some classical pricing approaches. The Hull-White and extended CIR models are used to represent the evolution of mortality over time. We use data for the Belgian population to derive prices for the proposed longevity-linked securities under the different methods. Full article
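The Cost of Capital idea can be illustrated with the standard risk-margin recipe: the loading equals a cost-of-capital rate times the sum of discounted projected capital requirements. The 6% rate echoes Solvency II, but the SCR run-off and discount rate below are hypothetical, not the paper's Belgian-data calibration:

```python
# Cost-of-Capital risk margin sketch: margin = CoC rate x sum of discounted
# projected solvency capital requirements (SCRs). All figures illustrative.
def risk_margin(scrs, coc_rate=0.06, discount=0.02):
    """Present value of the cost of holding each year's SCR, at the CoC rate."""
    return coc_rate * sum(scr / (1.0 + discount) ** (t + 1)
                          for t, scr in enumerate(scrs))

projected_scrs = [100.0, 80.0, 60.0, 40.0, 20.0]  # SCR at the end of years 1..5
print(round(risk_margin(projected_scrs), 2))  # 17.19
```

Adding such a margin to the best-estimate value of an S-forward or S-swap is the mechanism by which the approach builds a market-consistent price of longevity risk.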
Risks EISSN 2227-9091, published by MDPI AG, Basel, Switzerland