Risks doi: 10.3390/risks6040144

Authors: Yikai Gong, Zhuangdi Li, Maria Milazzo, Kristen Moore, Matthew Provencher

Credibility theory is used widely in group health and casualty insurance. However, it is generally not used in individual life and annuity business. With the introduction of principle-based reserving (PBR), which relies more heavily on company-specific experience, credibility theory is becoming increasingly important for life actuaries. In this paper, we review the two most commonly used credibility methods: limited fluctuation and greatest accuracy (Bühlmann) credibility. We apply the limited fluctuation method to M Financial Group’s experience data and describe some general qualitative observations. In addition, we use simulation to generate a universe of data and compute limited fluctuation and greatest accuracy credibility factors for actual-to-expected (A/E) mortality ratios. We also compare the two credibility factors to an intuitive benchmark credibility measure. We see that for our simulated data set, the limited fluctuation factors are significantly lower than the greatest accuracy factors, particularly for low numbers of claims. Thus, the limited fluctuation method may understate the credibility for companies with favorable mortality experience. The greatest accuracy method has a stronger mathematical foundation, but it generally cannot be applied in practice because of data constraints. The National Association of Insurance Commissioners (NAIC) recognizes and is addressing the need for life insurance experience data in support of PBR—this is an area of current work.
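The two credibility factors reviewed above can be sketched in a few lines. This is an illustrative sketch, not code from the paper: the full-credibility standard of 1082 claims is the classical value for 90% confidence within ±5%, and the Bühlmann ratio k is assumed to have been estimated from the wider data universe.

```python
import math

def limited_fluctuation_Z(n_claims, full_credibility_standard=1082):
    # Square-root rule: partial credibility grows with sqrt(n)
    # until the full-credibility claim count is reached.
    return min(1.0, math.sqrt(n_claims / full_credibility_standard))

def buhlmann_Z(n, k):
    # Greatest accuracy (Buhlmann) factor: k is the ratio of the
    # expected process variance to the variance of hypothetical means.
    return n / (n + k)
```

For example, a block with about 271 expected claims earns a limited fluctuation factor of roughly 0.5, while its Bühlmann factor depends entirely on the estimated k.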

Risks doi: 10.3390/risks6040143

Authors: Ranjan Das Gupta, Rajesh Pathak

A risk-return association under normal market conditions can be conventionally positive (risk-averse) or “paradoxically” negative (risk-seeking). This study investigates whether such an association is stable across market trends (i.e., bull and bear) and for the overall, industry-classified and partitioned sub-samples, after controlling for a firm’s age, size, leverage and liquidity using operating-performance risk-return measures. In total, this study analyses 2666 firms (1199 firms from 15 developed countries and 1467 firms from 12 emerging countries) for the period 1999–2015. Results show that in the overall and bull sub-periods, firms across countries exhibit a conventional positive association (superior firms) or a “paradoxical” negative one (poor firms) in most cases. However, in the bear sub-periods, all firms from emerging countries are risk-seeking in order to maintain their position in the pecking order.

Risks doi: 10.3390/risks6040142

Authors: Matthias Fischer, Thorsten Moser, Marius Pfeuffer

In both financial theory and practice, value-at-risk (VaR) has become the predominant risk measure over the last two decades. Nevertheless, there is a lively and controversial ongoing discussion about possible alternatives. Against this background, our first objective is to provide a current overview of related competitors, with a focus on credit risk management, including definitions, references, striking properties and a classification. The second part is dedicated to the measurement of risk concentrations in credit portfolios. Typically, credit portfolio models are used to calculate the overall risk (measure) of a portfolio. Subsequently, Euler’s allocation scheme is applied to break the portfolio risk down to single counterparties (or different subportfolios) in order to identify risk concentrations. We first collect the Euler formulae for the risk measures under consideration. In two cases (median shortfall and range-VaR), explicit formulae are presented for the first time. Afterwards, we present a comprehensive study of a benchmark portfolio according to Duellmann and Masschelein (2007) and nine different risk measures in conjunction with the Euler allocation. It is empirically shown that, in principle, all risk measures are capable of identifying both sectoral and single-name concentration. However, both the complexity of the IT implementation and the sensitivity of the risk figures w.r.t. changes in portfolio quality vary across the specific risk measures.
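For expected shortfall, the Euler allocation has a particularly transparent sample version: a counterparty's contribution is its average loss on the scenarios that make up the portfolio tail, and the contributions sum exactly to the portfolio risk (the full-allocation property). The sketch below uses simulated Gaussian losses for illustration, not the Duellmann and Masschelein benchmark portfolio.

```python
import random

def euler_es_contributions(loss_scenarios, alpha=0.95):
    """loss_scenarios: one list of per-counterparty losses per scenario.
    Returns the sample expected shortfall and the Euler contributions."""
    portfolio = [sum(s) for s in loss_scenarios]
    order = sorted(range(len(portfolio)), key=lambda i: portfolio[i])
    k = max(1, int(round((1 - alpha) * len(portfolio))))
    tail = order[-k:]                     # worst (1 - alpha) share of scenarios
    es = sum(portfolio[i] for i in tail) / k
    m = len(loss_scenarios[0])
    # Counterparty j's Euler contribution: its mean loss on the tail scenarios
    contribs = [sum(loss_scenarios[i][j] for i in tail) / k for j in range(m)]
    return es, contribs

random.seed(1)
scenarios = [[random.gauss(0, 1), random.gauss(0, 2)] for _ in range(10000)]
es, contribs = euler_es_contributions(scenarios)
```

By construction, `sum(contribs)` equals `es` exactly, and the riskier counterparty receives the larger share.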

Risks doi: 10.3390/risks6040141

Authors: Christina Erlwein-Sayer

We enhance the modelling and risk assessment of sovereign bond spreads by taking into account quantitative information gained from macro-economic news sentiment. We investigate sovereign bonds spreads of five European countries and improve the prediction of spread changes by incorporating news sentiment from relevant entities and macro-economic topics. In particular, we create daily news sentiment series from sentiment scores as well as positive and negative news volume and investigate their effects on yield spreads and spread volatility. We conduct a correlation and rolling correlation analysis between sovereign bond spreads and accumulated sentiment series and analyse changing correlation patterns over time. Market regimes are detected through correlation series and the impact of news sentiment on sovereign bonds in different market circumstances is investigated. We find best-suited external variables for forecasts in an ARIMAX model set-up. Error measures for forecasts of spread changes and volatility proxies are improved when sentiment is considered. These findings are then utilised to monitor sovereign bonds from European countries and detect changing risks through time.

Risks doi: 10.3390/risks6040140

Authors: Tihana Škrinjarić

Effects of seasonal affective disorder (SAD) are explored on several selected Central and South East European markets in this study for the period 2010–2018. Both return and risk sensitivities to the SAD effect are examined for 11 markets in total (Bosnia and Herzegovina, Bulgaria, Croatia, the Czech Republic, Hungary, Poland, Serbia, Slovakia, Slovenia, Romania and Ukraine). SAD effects are grounded in psychiatric and behavioural theories, and are rarely examined for stock markets today. Thus, this research provides an empirical evaluation of these effects for some of the markets for the first time in the literature. The results indicate that 6 out of 11 markets exhibit SAD effects to some extent, meaning that investors’ risk aversion does change over the year, depending on the season. Such results have consequences for theoretical modelling in finance as well as practical use in investment strategies on stock markets.

Risks doi: 10.3390/risks6040139

Authors: Stefano Cavastracci Strascia, Agostino Tripodi

The aim of this paper is to derive a closed-form tool to estimate the one-year volatility of the claims reserve calculated through generalized linear models (GLM), notably the overdispersed Poisson model. Up to now, this one-year volatility has been estimated through the well-known bootstrap methodology, which demands the use of the Monte Carlo method with a re-reserving technique. Nonetheless, this method is time-consuming from a computational point of view; therefore, approximation techniques are often used in practice, such as an emergence pattern based on the link between the one-year volatility (resulting from the Merz–Wüthrich method) and the ultimate volatility (resulting from the Mack method).
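For context, the overdispersed Poisson GLM reproduces the classical chain-ladder reserve as its point estimate; the one-year and ultimate volatilities discussed above are uncertainty measures around that figure. The sketch below computes only the chain-ladder point estimate on an invented toy triangle; the Merz–Wüthrich and Mack volatility machinery is far more involved and is not reproduced here.

```python
def chain_ladder_reserves(triangle):
    """triangle: rows of cumulative paid claims by accident year;
    later accident years have fewer observed development periods."""
    n = len(triangle)
    # Volume-weighted development factors f_j = sum C_{i,j+1} / sum C_{i,j}
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    reserves = []
    for row in triangle:
        ultimate = row[-1]
        for j in range(len(row) - 1, n - 1):
            ultimate *= factors[j]      # roll the diagonal up to ultimate
        reserves.append(ultimate - row[-1])
    return reserves

# Toy cumulative triangle (invented figures)
tri = [[100, 150, 165],
       [110, 168],
       [120]]
res = chain_ladder_reserves(tri)
```

As expected, the most recent accident year carries almost the entire reserve.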

Risks doi: 10.3390/risks6040138

Authors: Abel Cadenillas, Ricardo Huamán-Aguilar

We develop a government debt management model to study the optimal debt ceiling when the ability of the government to generate primary surpluses to reduce the debt ratio is limited. We succeed in finding a solution for the optimal debt ceiling. We study the conditions under which a country is not able to reduce its debt ratio to reach its optimal debt ceiling, even in the long run. In addition, this model with bounded intervention is consistent with the fact that, in reality, countries that succeed in reducing their debt ratio do not do so immediately, but over some period of time. To the best of our knowledge, this is the first theoretical model on the debt ceiling that accounts for bounded interventions.

Risks doi: 10.3390/risks6040137

Authors: Aida Barkauskaite, Ausrine Lakstutiene, Justyna Witkowska

Scientific discussions have emphasized that the main problem with the current deposit insurance system is that, in many countries of the European Union (E.U.), it does not evaluate the risks that banks assume when calculating deposit insurance premiums. Thus, the prevailing system does not safeguard a sufficient level of stability in the banking system. Scientific studies show that a deposit insurance system should consider not only the risk indicators of individual banks, but also the systemic risk of banks that affects the stability of the banking system. Hence, the question arises as to whether measurements of systemic risk in a common E.U. risk-based deposit insurance system are a formal necessity or a value-adding process. Expanding this discussion, this article analyzes how contributions to the insurance fund would change for banks operating in Lithuania following the introduction of the E.U.’s overall risk-based deposit insurance system and after taking the additional systemic risk into consideration. The results provide evidence that the introduction of a risk-based deposit insurance system would redistribute payments to the deposit insurance fund among banks operating in Lithuania and would thereby reduce the negative effects of the deposit insurance system and improve the stability of the financial system.

Risks doi: 10.3390/risks6040136

Authors: Haipeng Xing, Yang Yu

The financial crises which occurred in the last several decades have demonstrated the significant impact of market structural breaks on firms’ credit behavior. To incorporate the impact of market structural breaks into the analysis of firms’ credit rating transitions and firms’ asset structure, we develop a continuous-time modulated Markov model for firms’ credit rating transitions with unobserved market structural breaks. The model takes a semi-parametric multiplicative regression form, in which the effects of firms’ observable covariates and macroeconomic variables are represented parametrically and nonparametrically, respectively, and the frailty effects of unobserved firm-specific and market-wide variables are incorporated via the integration form of the model assumption. We further develop a mixtured-estimating-equation approach to make inference on the effect of market variations, baseline intensities of all firms’ credit rating transitions, and rating transition intensities for each individual firm. We then use the developed model and inference procedure to analyze the monthly credit ratings of U.S. firms from January 1986 to December 2012, and study the effect of market structural breaks on firms’ credit rating transitions.

Risks doi: 10.3390/risks6040135

Authors: Hongmin Xiao, Lin Xie

In this paper, a risk model with constant interest force based on an entrance process is investigated. Under the assumptions that the entrance process is a renewal process and that the claim sizes satisfy a certain dependence structure and belong to different heavy-tailed distribution classes, a finite-time asymptotic estimate for the bidimensional risk model with constant interest force is obtained. In particular, these formulas still hold when the inter-arrival times also satisfy a certain dependence structure.

Risks doi: 10.3390/risks6040134

Authors: Tomer Shushi

In risk theory, risks are often modeled by risk measures, which allow quantifying the risks and estimating their possible outcomes. Risk measures rely on measure theory, where the risks are assumed to be random variables with some distribution function. In this work, we derive a novel topology-based representation of risks. Using this representation, we show the differences between diversifiable and non-diversifiable risks. We show that topological risks should be modeled using two quantities: the risk measure, which quantifies the predicted amount of risk, and a distance metric, which quantifies the uncertainty of the risk.

Risks doi: 10.3390/risks6040133

Authors: Jung-Bin Su, Jui-Cheng Hung

This study utilizes seven bivariate generalized autoregressive conditional heteroskedasticity (GARCH) models to forecast the out-of-sample value-at-risk (VaR) of 21 stock portfolios and seven currency-stock portfolios with three weight combinations, and then employs three accuracy tests and one efficiency test to evaluate the VaR forecast performance of these models. The seven models are constructed from four types of bivariate variance-covariance specification and two approaches to parameter estimation. The four variance-covariance specifications are the constant conditional correlation (CCC), the asymmetric and symmetric dynamic conditional correlation (ADCC and DCC), and the BEKK, whereas the two approaches are the standard and non-standard approaches. Empirical results show that, regarding the accuracy tests, the VaR forecast performance of the stock portfolios varies with the variance-covariance specification and the approach to parameter estimation, whereas it does not vary with the weight combination of the portfolios. Conversely, the VaR forecast performance of the currency-stock portfolios is almost the same across all models and likewise does not vary with the weight combination of the portfolios. Regarding the efficiency test via market risk capital, the NS-BEKK model is the most suitable model for bank risk managers to use for both stock and currency-stock portfolios, irrespective of the weight combination.

Risks doi: 10.3390/risks6040132

Authors: Abdul-Aziz Ibn Musah, Jianguo Du, Hira Salah ud din Khan, Alhassan Alolo Abdul-Rasheed Akeji

In recent times, investing in a volatile security increases the risk of losses and reduces gains. Many traders exposed to these risks rely on multiple volatility procedures to inform their trading strategies. We explore two models to measure tail behaviour and the periods in which the stock will gain or fall within a five-month trading window. We obtained data from the Ghana Stock Exchange and applied the generalized extreme value distribution, validated by backtesting, together with an artificial neural network for forecasting. The trained network achieves more than 90% accuracy for both gains and falls on the given input-output pairs. On this basis, the extreme value distribution estimates prove sound. There is a significant improvement in market prediction when assessing actual against forecast performance. The study reveals that once every five months, at a 5% confidence level, the market is expected to gain and fall by 2.12% and 2.23%, respectively. The Ghana Stock Exchange showed a maximum monthly stock gain above or below 2.12% in the fourth and fifth months, while the maximum monthly stock fall was above or below 2.23% in the third and fourth months. The study reveals that once every five-month trading period, the stock market will gain and fall by almost equal percentages, with a significant increase in value-at-risk and expected shortfall in the left tail, relative to the right tail, as the quantiles increase.
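Statements like "the market is expected to gain 2.12% once every five months" come from the quantile (return-level) function of a fitted generalized extreme value (GEV) distribution: a once-in-five-periods level corresponds to p = 0.8. The sketch below implements only that formula; the parameter values used in testing are invented, not the paper's estimates.

```python
import math

def gev_quantile(p, mu, sigma, xi):
    """Return level x_p with F(x_p) = p under the generalized extreme
    value distribution with location mu, scale sigma, shape xi."""
    if sigma <= 0:
        raise ValueError("scale must be positive")
    if abs(xi) < 1e-12:                      # Gumbel limit as xi -> 0
        return mu - sigma * math.log(-math.log(p))
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)
```

Fitting mu, sigma and xi to block maxima of monthly gains (or falls) and evaluating `gev_quantile(0.8, ...)` then yields the five-month return level.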

Risks doi: 10.3390/risks6040131

Authors: Aditya Maheshwari, Andrey Sarantsev

In our model, private actors with interbank cash flows, similar to but more general than those of Carmona et al. (2013), borrow from the non-banking financial sector at a certain interest rate, controlled by the central bank, and invest in risky assets. Each private actor aims to maximize its expected terminal logarithmic wealth. The central bank, in turn, aims to control the overall economy by means of an exponential utility function. We solve all stochastic optimal control problems explicitly. We are able to recreate phenomena such as the liquidity trap. We also study the distribution of the number of defaults (the net worth of a private actor falling below a certain threshold).

Risks doi: 10.3390/risks6040130

Authors: Huong Dieu Dang

The informal constraints that arise from the national culture in which a firm resides have a pervasive impact on managerial decision making and corporate credit risk, which in turn impacts on corporate ratings and rating changes. In some cultures, firms are naturally predisposed to rating changes in a particular direction (downgrade or upgrade) while, in other cultures, firms are more likely to migrate from the current rating in either direction. This study employs a survival analysis framework to examine the effect of national culture on the probability of rating transitions of 5360 firms across 50 countries over the period 1985–2010. Firms located in long-term oriented cultures are less likely to be downgraded and, in some cases, more likely to be upgraded. Downgrades occur more often in strong uncertainty-avoiding countries and less often in large power distance (hierarchy) and embeddedness countries. There is some evidence that masculinity predisposes firms to more rating transitions. Studying culture helps enrich our understanding of corporate rating migrations, and helps develop predictive models of corporate rating changes across countries.

Risks doi: 10.3390/risks6040129

Authors: Xinyuan Wei, Jun-ya Gotoh, Stan Uryasev

This paper studies the peer-to-peer lending and loan application processing of LendingClub. We tried to reproduce the existing loan application processing algorithm and to find the features used in this process. Loan application processing is considered a binary classification problem. We used the area under the ROC curve (AUC) for the evaluation of algorithms. Features were transformed with splines to improve the performance of the algorithms. We considered three classification algorithms: logistic regression, buffered AUC (bAUC) maximization, and AUC maximization. With only three features (Debt-to-Income Ratio, Employment Length, and Risk Score), we obtained an AUC close to 1. We have performed both in-sample and out-of-sample evaluations. The code for cross-validation and for solving the problems in Portfolio Safeguard (PSG) format is in the Appendix. The calculation results, with the data and code, are posted on the website and are available for download.
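The AUC used for evaluation above has a simple nonparametric reading: it is the probability that a randomly chosen positive case receives a higher score than a randomly chosen negative case, with ties counting one half. A minimal plain-Python sketch (not the authors' PSG code):

```python
def auc(labels, scores):
    """Area under the ROC curve computed as the Mann-Whitney statistic:
    probability that a random positive outscores a random negative
    (ties count one half)."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

A perfect ranking such as `auc([1, 1, 0, 0], [0.9, 0.8, 0.3, 0.1])` gives 1.0, while 0.5 corresponds to uninformative scoring.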

Risks doi: 10.3390/risks6040128

Authors: Minh Ngo, Marc Rieger, Shuonan Yuan

Stocks are riskier than bonds. This causes a risk premium for stocks. That the size of this premium, however, seems to be larger than risk aversion alone can explain is the so-called “equity premium puzzle”. One possible explanation is the inclusion of a degree of ambiguity in stock returns to account for an additional ambiguity premium, whose size depends on the degree of ambiguity aversion among investors. It is, however, difficult to test this empirically. In this paper, we compute the first firm-level estimation of the equity premium based on the internal rate of return (IRR) approach for a total of N = 28,256 companies in 54 countries worldwide. Using a survey of international data on ambiguity aversion, we find a strong and robust relation between equity premia and ambiguity aversion.

Risks doi: 10.3390/risks6040127

Authors: Pavel V. Gapeev, Hessah Al Motairi

We present closed-form solutions to the perpetual American dividend-paying put and call option pricing problems in two extensions of the Black–Merton–Scholes model with random dividends under full and partial information. We assume that the dividend rate of the underlying asset price changes its value at a certain random time which has an exponential distribution and is independent of the standard Brownian motion driving the price of the underlying risky asset. In the full information version of the model, it is assumed that this time is observable to the option holder, while in the partial information version of the model, it is assumed that this time is unobservable to the option holder. The optimal exercise times are shown to be the first times at which the underlying risky asset price process hits certain constant levels. The proof is based on the solutions of the associated free-boundary problems and the applications of the change-of-variable formula.

Risks doi: 10.3390/risks6040126

Authors: Fabian Capitanio, Antonio De Pin

Risk management policy in agriculture has become particularly prominent nowadays, considering the evolution of the Common Agricultural Policy (CAP) and climate change. Moreover, the World Trade Organization places constraints on it. In this context, (1) the aim is to analyze the causes of the loss of effectiveness of the Italian insurance system, which is unable to meet the specific coverage demand from agriculture. (2) The analysis is carried out through an economic evaluation of the convenience of adhering to the instruments offered by the insurance market to winegrowers in the Controlled and Guaranteed Denomination of Origin (DOCG) area of Conegliano-Valdobbiadene. (3) The study highlights that subsidized coverage alone is not the most adequate measure of agricultural policy. Adhering to preferential programs implies drafting a supplementary insurance policy to minimize the loss function. (4) The current impasse of the insurance system demonstrates that producers are reluctant to accept policies that do not translate into an immediate income benefit. The European risk management regulation confirms its limits in terms of the usefulness and efficiency of agrarian policy. (5) The predicted probabilistic increase in severe-weather patterns makes the search for innovative risk assessment models more urgent, namely models which can combine the different needs of stakeholders: farmers, insurance companies, and society.

Risks doi: 10.3390/risks6040125

Authors: Marco Neffelli

Portfolio weights based solely on risk avoid estimation errors from the sample mean, but they are still affected by misspecification in the sample covariance matrix. To address this problem, we shrink the covariance matrix towards the Identity, the Variance Identity, the Single-Index model, the Common Covariance, the Constant Correlation, and the Exponentially Weighted Moving Average target matrices. Using an extensive Monte Carlo simulation, we offer a comparative study of these target estimators, testing their ability to reproduce the true portfolio weights. We control for the dataset dimensionality and the shrinkage intensity in the Minimum Variance (MV), Inverse Volatility (IV), Equal-Risk-Contribution (ERC), and Maximum Diversification (MD) portfolios. We find that the Identity and Variance Identity targets have very good statistical properties, also being well conditioned in high-dimensional datasets. In addition, these two models are the best targets towards which to shrink: they minimise the misspecification in risk-based portfolio weights, generating estimates very close to the population values. Overall, shrinking the sample covariance matrix helps to reduce weight misspecification, especially in the Minimum Variance and Maximum Diversification portfolios. The Inverse Volatility and Equal-Risk-Contribution portfolios are less sensitive to covariance misspecification and so benefit less from shrinkage.
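The simplest of the targets above, the Variance Identity, shrinks the sample covariance matrix linearly toward a scaled identity. A minimal sketch of that single estimator follows; the shrinkage intensity delta is taken as given here, whereas in practice it is chosen by a Ledoit-Wolf-type rule or, as in the study, varied experimentally.

```python
def shrink_to_identity(S, delta):
    """Linear shrinkage of a sample covariance matrix S (list of lists)
    toward the variance-scaled identity target mu*I, mu = trace(S)/p."""
    p = len(S)
    mu = sum(S[i][i] for i in range(p)) / p
    return [[(1 - delta) * S[i][j] + (delta * mu if i == j else 0.0)
             for j in range(p)] for i in range(p)]
```

At delta = 1 the estimator collapses to mu*I, which is always well conditioned; intermediate delta trades bias against estimation variance.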

Risks doi: 10.3390/risks6040124

Authors: Chengbo Fu

The variance of stock returns is decomposed based on a conditional Fama–French three-factor model instead of its unconditional counterpart. Using time-varying alpha and betas in this model, it is evident that four additional risk terms must be considered. They include the variance of alpha, the variance of the interaction between the time-varying component of beta and factors, and two covariance terms. These additional risk terms are components that are included in the idiosyncratic risk estimate using an unconditional model. By investigating the relation between the risk terms and stock returns, we find that only the variance of the time-varying alpha is negatively associated with stock returns. Further tests show that stock returns are not affected by the variance of time-varying beta. These results are consistent with the findings in the literature identifying return predictability from time-varying alpha rather than betas.

Risks doi: 10.3390/risks6040123

Authors: Marie Angèle Cathleen Alijean, Jason Narsoo

Mortality forecasting has always been a target of study by academics and practitioners. Since the introduction and rising significance of securitization of risk in mortality and longevity, more in-depth studies regarding mortality have been carried out to enable the fair pricing of such derivatives. In this article, a comparative analysis is performed on the mortality forecasting accuracy of four mortality models. The methodology employs the Age-Period-Cohort model, the Cairns-Blake-Dowd model, the classical Lee-Carter model and the Kou-modified Lee-Carter model. The Kou-modified Lee-Carter model combines the classical Lee-Carter with the Double Exponential Jump Diffusion model. This paper is the first study to employ the Kou model to forecast French mortality data. The dataset comprises death data of French males from age 0 to age 90, available for the years 1900–2015. The paper differentiates between two periods: the 1900–1960 period where extreme mortality events occurred for French males and the 1961–2015 period where no significant jump is observed. The Kou-modified Lee-Carter model turns out to give the best mortality forecasts based on the RMSE, MAE, MPE and MAPE metrics for the period 1900–1960 during which the two World Wars occurred. This confirms that the consideration of jumps and leptokurtic features conveys important information for mortality forecasting.
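The classical Lee-Carter model underlying the comparison decomposes log death rates as log m(x,t) = a_x + b_x k_t. The paper fits this via the usual singular value decomposition; the sketch below uses the simpler least-squares approximation instead (centred log rates summed over ages give k_t, then b_x by regression), which recovers the decomposition exactly on noiseless input.

```python
def lee_carter_fit(log_rates):
    """Least-squares approximation to the Lee-Carter decomposition
    log m(x,t) = a_x + b_x * k_t; log_rates is a list of age rows
    over the same calendar years. (The full fit uses an SVD.)"""
    ages, years = len(log_rates), len(log_rates[0])
    a = [sum(row) / years for row in log_rates]          # average age profile
    centered = [[log_rates[x][t] - a[x] for t in range(years)] for x in range(ages)]
    # Period index: column sums of the centered log rates (so sum(k) = 0)
    k = [sum(centered[x][t] for x in range(ages)) for t in range(years)]
    ssk = sum(kt * kt for kt in k)
    # Age sensitivities by regressing each centered row on k (so sum(b) = 1)
    b = [sum(centered[x][t] * k[t] for t in range(years)) / ssk for x in range(ages)]
    return a, b, k
```

Forecasting then reduces to extrapolating the single time series k_t, typically as a random walk with drift, or with jump components as in the Kou modification.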

Risks doi: 10.3390/risks6040122

Authors: Dietmar Pfeifer, Olena Ragulina

We propose a Monte Carlo simulation method to generate stress tests by VaR scenarios under Solvency II for dependent risks on the basis of observed data. This is of particular interest for the construction of Internal Models. The approach is based on former work on partition-of-unity copulas, however with a direct scenario estimation of the joint density by product beta distributions after a suitable transformation of the original data.

Risks doi: 10.3390/risks6040121

Authors: Robert J. Powell, Duc H. Vo, Thach N. Pham

There has been much discussion in the literature about how central measures of equity risk such as standard deviation fail to account for extreme tail risk of equities. Similarly, parametric measures of value at risk (VaR) may also fail to account for extreme risk as they assume a normal distribution, which is often not the case in practice. Nonparametric measures of extreme risk such as nonparametric VaR and conditional value at risk (CVaR) have often been found to overcome this problem by measuring actual tail risk without applying any predetermined assumptions. However, this article argues that it is not just the actual risk of equities that is important to investor choices, but also the relative (ordinal) risk of equities compared to each other. Using an applied setting of industry portfolios in a variety of Asian countries (benchmarked to the United States), over crisis and non-crisis periods, this article finds that nonparametric measures of VaR and CVaR may provide only limited new information to investors about relative risk in the portfolios examined: there is a high degree of similarity in relative industry risk between nonparametric metrics and central or parametric measures such as standard deviation and parametric VaR.
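The nonparametric VaR and CVaR contrasted above are just order statistics of the empirical loss distribution; no distributional assumption enters. A minimal sketch under one common discrete convention (texts differ slightly on the quantile index):

```python
def historical_var_cvar(returns, alpha=0.95):
    """Nonparametric (historical) VaR and CVaR at level alpha, reported
    as positive loss numbers; no distribution is assumed."""
    losses = sorted(-r for r in returns)          # losses, ascending
    k = int(round(alpha * len(losses)))           # index of the alpha-quantile
    var = losses[k - 1]
    tail = losses[k - 1:]                         # losses at or beyond VaR
    cvar = sum(tail) / len(tail)
    return var, cvar
```

By construction CVaR is at least VaR, since it averages the losses at and beyond the VaR cut-off; this is what makes it sensitive to extreme tail observations that standard deviation and parametric VaR smooth over.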

Risks doi: 10.3390/risks6040120

Authors: Fengming Qin, Junru Zhang, Zhaoyong Zhang

This study empirically examines the volatility spillover effects between the RMB foreign exchange markets and the stock markets by employing daily returns of the Chinese RMB exchange rates and the stock markets in China and Japan during the period 1998–2018. We find evidence that co-volatility effects exist among the financial markets in China and Japan, and that the volatility of RMB exchange rates contributes to the co-volatility spillovers across the financial markets. Conversely, return shocks from the stock markets can also generate co-volatility spillovers to the foreign exchange markets. This bidirectional relationship reveals that both the fundamental hypothesis and the investor-induced hypothesis are valid. Our estimates also show that the spillover effects led by the stock market in Japan are stronger than those from the foreign exchange markets and the Chinese stock markets, implying that a market with higher accessibility has greater spillover effects onto other markets. We also find that the average co-volatility spillover effects among the RMB exchange markets and the stock markets in Japan and China are generally negative. These findings have important policy implications for risk management and hedging strategies.

Risks doi: 10.3390/risks6040119

Authors: Chengping Gong, Chengxiu Ling

Based on suitably left-truncated or censored data, two flexible classes of M-estimators of the Weibull tail coefficient are proposed, with two additional parameters bounding the impact of extreme contamination. Asymptotic normality with a √n rate of convergence is obtained. Robustness is discussed via the asymptotic relative efficiency and the influence function. The approach is further demonstrated by a small-scale simulation study and an empirical study on CRIX.

Risks doi: 10.3390/risks6040118

Authors: Elisson Andrade, Fabio Mattos, Roberto Arruda de Souza Lima

The objective of this research is to evaluate the influence on hedging decisions of a realistic set of transaction costs, which are largely stochastic. The stochastic nature of some transaction costs (such as margin calls) means that their exact value is unknown when the hedge is placed, since they depend on the trajectory of futures prices during the hedge. Results are consistent with previous studies in that the introduction of transaction costs tends to affect hedge ratios. However, in contrast to the traditional literature, the introduction of stochastic costs into futures hedging can either decrease or increase hedge ratios, depending on how these costs are determined.

Risks doi: 10.3390/risks6040117

Authors: Jarmila Horváthová, Martina Mokrišová

In this paper, the following research problem was addressed: is the DEA (Data Envelopment Analysis) method a suitable alternative to the Altman model for predicting the risk of bankruptcy? Based on this research problem, we formulated the aim of the paper: to apply the DEA method to predicting the risk of bankruptcy and to compare its results with those of the Altman model. The research problem and the aim of the paper follow the authors’ research on methods appropriate for measuring business financial health, performance and competitiveness, as well as for predicting the risk of bankruptcy. To address the problem, the following methods were applied: financial ratios, the Altman model for private non-manufacturing firms and the DEA method. When applying the DEA method, we formulated an input-oriented DEA CCR model. We found that the DEA method is an appropriate alternative to the Altman model for predicting the risk of possible business bankruptcy. An important conclusion is that DEA allows us to use not only outputs but also inputs. Since prediction models do not include these indicators, the DEA method appears to be the right choice. We recommend, especially for Slovak companies, applying the cost ratio when calculating the risk of bankruptcy.

Risks doi: 10.3390/risks6040116

Authors: Vincenzo Candila, Salvatore Farace

Addressing the volatility spillovers of agricultural commodities is important for at least two reasons. First, for the last several years, the volatility of agricultural commodity prices seems to have increased. Second, according to the Food and Agriculture Organization, there is a strong need for understanding the potential (negative) impacts on food security caused by food commodity volatilities. This paper aims at investigating the presence, the size, and the persistence of volatility spillovers among five agricultural commodities (corn, sugar, wheat, soybean, and bioethanol) and five Latin American (Argentina, Brazil, Chile, Colombia, Peru) stock market indexes. Overall, when a negative shock hits the commodity market, Latin American stock market volatility tends to increase. This happens, for instance, for the relationships from corn to Chile and Colombia and from wheat to Peru and Chile.

]]>Risks doi: 10.3390/risks6040115

Authors: Xin Liu Jiang Wu Chen Yang Wenjun Jiang

In this paper, we propose a clustering procedure of financial time series according to the coefficient of weak lower-tail maximal dependence (WLTMD). Due to the potential asymmetry of the matrix of WLTMD coefficients, the clustering procedure is based on a generalized weighted cuts method instead of the dissimilarity-based methods. The performance of the new clustering procedure is evaluated by simulation studies. Finally, we illustrate that the optimal mean-variance portfolio constructed based on the resulting clusters manages to reduce the risk of simultaneous large losses effectively.

]]>Risks doi: 10.3390/risks6040114

Authors: Chen Li Xiaohu Li

This paper studies a Pareto-optimal reinsurance contract in the presence of negative statistical dependence between the insurance claim and the random recovery rate. In the context of both the symmetric information model and the asymmetric information model, we investigate properties of the Pareto-optimal indemnity schedules. For a risk-neutral reinsurer with proportional cost and associated expense, we also present possible forms of the Pareto-optimal indemnity schedule.

]]>Risks doi: 10.3390/risks6040113

Authors: Arvind Shrivastava Kuldeep Kumar Nitin Kumar

The objective of the study is to perform corporate distress prediction for an emerging economy, such as India, where bankruptcy details of firms are not available. An exhaustive panel dataset extracted from Capital IQ has been employed for this purpose. Foremost, the study contributes by devising a novel framework to capture incipient signs of distress for Indian firms by employing a combination of firm-specific parameters. The strategy not only enlarges the sample of distressed firms but also yields robust results. The analysis applies both standard logistic and Bayesian modeling to predict distressed firms in the Indian corporate sector, and a comparison of the predictive ability of the two approaches is carried out. Both in-sample and out-of-sample evaluations reveal consistently better predictive capability for the Bayesian methodology. The study provides a useful structure for indicating early signals of failure in the Indian corporate sector, which is otherwise limited in the literature.
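The logistic half of such a distress model can be sketched in a few lines; the two ratios, their coefficients, and the synthetic distress flag below are invented for illustration and are not the paper's actual variables or estimates.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 500
# Hypothetical firm-specific ratios: leverage and interest coverage.
leverage = rng.uniform(0, 1, n)
coverage = rng.normal(2, 1, n)
# Synthetic distress flag: high leverage and low coverage raise the odds.
logit = 3 * leverage - 1.5 * coverage
p = 1 / (1 + np.exp(-logit))
distress = rng.uniform(size=n) < p

features = np.c_[leverage, coverage]
model = LogisticRegression().fit(features, distress)
in_sample_acc = model.score(features, distress)
```

The fitted coefficients recover the assumed signs (leverage raises, coverage lowers, the distress probability); a Bayesian counterpart would put priors on the same coefficients and sample their posterior instead of maximizing the likelihood.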

]]>Risks doi: 10.3390/risks6040112

Authors: Florent Gallien Serge Kassibrakis Semyon Malamud

We solve the problem of optimal risk management for an investor holding an illiquid, alpha-generating fund and hedging his/her position with a liquid futures contract. When the investor is subject to a lower bound on net return, he/she is forced to reduce the total risk of his/her portfolio after a loss. In this case, he/she faces a tradeoff of either paying the transaction costs and deleveraging or keeping his/her current position in the illiquid instrument and hedging away some of the risk while keeping the residual, unhedgeable risk on his/her balance sheet. We explicitly characterize this tradeoff and study its dependence on asset characteristics. In particular, we show that higher alpha and lower beta typically widen the no-trading zone, while the impact of volatility is ambiguous.

]]>Risks doi: 10.3390/risks6040111

Authors: Angelo Corelli

The paper analyzes the relationship between the most popular cryptocurrencies and a range of selected fiat currencies, in order to identify any pattern and/or causality between the series. Cryptocurrencies are a hot topic in finance due to their strict relationship with the Blockchain system they originate from, and are therefore normally considered part of the ongoing, world-wide financial revolution. This innovative study investigates this relationship for the first time by thoroughly examining the data, their features, and the way they are interconnected. The results are very interesting in terms of how concentrated the causality effect is on some specific cryptocurrencies and fiat currencies. The outcome is a clear and possibly explainable relationship between cryptocurrencies and Asian markets, suggesting some kind of Asian effect.

]]>Risks doi: 10.3390/risks6040110

Authors: Sooie-Hoe Loke Enrique Thomann

In this paper, a dual risk model under constant force of interest is considered. The ruin probability in this model is shown to satisfy an integro-differential equation, which can then be written as an integral equation. Using the collocation method, the ruin probability can be well approximated for any gain distribution. Examples involving exponential, uniform, Pareto and discrete gains are considered. Finally, the same numerical method is applied to the Laplace transform of the time of ruin.

]]>Risks doi: 10.3390/risks6040109

Authors: Marcos González-Fernández Carmen González-Velasco

The aim of this paper is to analyze the relation between maturity structure, sovereign bond yields and sovereign risk in the Economic and Monetary Union for the period of 1990&ndash;2013. The results confirm the existence of an inverse relationship between sovereign bond yields, sovereign risk and the maturity structure of sovereign debt, regardless of the proxy that is used to measure sovereign risk and the time variance of the variables employed. The results indicate that risk shortens the maturity structure of sovereign debt because it reduces the stock of long-term debt. The relationship between maturity structure and sovereign bond yields differs depending on the risk of the countries analyzed (non-monotonic relationship) and the differences between peripheral and core countries are greater for higher levels of the yields. If we control for the indebtedness level of these countries, the results show that the relationship between the sovereign bond yields and maturity strengthens as the debt level increases.

]]>Risks doi: 10.3390/risks6040108

Authors: Kris Peremans Stefan Van Aelst Tim Verdonck

The chain ladder method is a popular technique to estimate the future reserves needed to handle claims that are not fully settled. Since the predictions for the aggregate portfolio (consisting of different subportfolios) need not equal the sum of the predictions for the subportfolios, a general multivariate chain ladder (GMCL) method has already been proposed. However, the GMCL method is based on the seemingly unrelated regression (SUR) technique, which makes it very sensitive to outliers. To address this issue, we propose a robust alternative that estimates the SUR parameters in a more outlier-resistant way. With the robust methodology, it is possible to automatically flag the claims with a significantly large influence on the reserve estimates. We introduce a simulation design to generate artificial multivariate run-off triangles based on the GMCL model and illustrate the importance of taking into account contemporaneous correlations and structural connections between the run-off triangles. By adding contamination to these artificial datasets, the sensitivity of the traditional GMCL method and the good performance of the robust GMCL method are shown. The analysis of a portfolio from practice makes clear that the robust GMCL method can provide better insight into the structure of the data.
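For readers unfamiliar with the baseline method, a minimal univariate chain ladder on a hypothetical cumulative run-off triangle looks as follows; the GMCL model and its robust variant generalize this same development-factor logic across several triangles at once.

```python
import numpy as np

# Hypothetical cumulative run-off triangle (rows = accident years,
# columns = development years; np.nan marks the unobserved future).
tri = np.array([
    [100., 150., 165., 170.],
    [110., 168., 185., np.nan],
    [120., 175., np.nan, np.nan],
    [130., np.nan, np.nan, np.nan],
])
n = tri.shape[0]

# Volume-weighted development factors f_j = sum C_{i,j+1} / sum C_{i,j}.
f = []
for j in range(n - 1):
    rows = ~np.isnan(tri[:, j + 1])
    f.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

# Fill the lower-right part of the triangle by rolling the factors forward.
full = tri.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(full[i, j + 1]):
            full[i, j + 1] = full[i, j] * f[j]

# Reserve = projected ultimates minus the latest observed diagonal.
latest = np.array([tri[i, n - 1 - i] for i in range(n)])
reserve = full[:, -1].sum() - latest.sum()
```

Outliers in a single cell distort the factors `f` directly, which is the sensitivity the robust SUR estimation is designed to dampen.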

]]>Risks doi: 10.3390/risks6040107

Authors: Phong Luu Jingzhi Tie Qing Zhang

A mean-reverting model is often used to capture asset price movements that fluctuate around an equilibrium. A common strategy for trading such a mean-reverting asset is to buy low and sell high. However, determining these key levels in practice is extremely challenging. In this paper, we study the optimal trading of such a mean-reverting asset with a fixed transaction (commission and slippage) cost. In particular, we focus on a threshold-type policy and develop a method that is easy to implement in practice. We formulate the optimal trading problem in terms of a sequence of optimal stopping times. We follow a dynamic programming approach and obtain the value functions by solving the associated HJB equations. The optimal threshold levels can be found by solving a set of quasi-algebraic equations. In addition, a verification theorem is provided together with sufficient conditions. Finally, a numerical example is given to illustrate our results. We note that a complete treatment of this problem was done recently by Leung and associates. Nevertheless, our work was done independently and focuses more on developing necessary optimality conditions.
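A short simulation conveys the flavor of such a threshold policy; the Ornstein-Uhlenbeck parameters, threshold levels, and fixed cost below are arbitrary illustrative choices, not the optimal levels derived in the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
# Simulate an Ornstein-Uhlenbeck price path (hypothetical parameters).
mu, kappa, sigma, dt, n = 10.0, 5.0, 1.0, 1 / 252, 50_000
x = np.empty(n)
x[0] = mu
for t in range(1, n):
    x[t] = x[t-1] + kappa * (mu - x[t-1]) * dt + sigma * np.sqrt(dt) * rng.standard_normal()

# Threshold policy: buy when the price drops to buy_level, sell at sell_level,
# paying a fixed cost per completed round trip.
buy_level, sell_level, cost = mu - 0.5, mu + 0.5, 0.05
position, entry, trades, pnl = 0, 0.0, 0, 0.0
for price in x:
    if position == 0 and price <= buy_level:
        position, entry = 1, price
    elif position == 1 and price >= sell_level:
        pnl += price - entry - cost
        trades += 1
        position = 0
```

Since every buy executes at or below `buy_level` and every sell at or above `sell_level`, each round trip earns at least the spread minus the fixed cost; the paper's contribution is characterizing the levels that make this trade-off optimal.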

]]>Risks doi: 10.3390/risks6040106

Authors: Ghislain Léveillé Ilie-Radu Mitric Victor Côté

In this document, we examine the effects of the age process on aggregate discounted claims by studying the conditional raw and joint moments, the moment generating function and the distribution function of the increments of compound renewal sums with discounted claims, taking into account the past experience of an insurance portfolio.

]]>Risks doi: 10.3390/risks6040105

Authors: Chia-Lin Chang Jukka Ilomäki Hannu Laurila Michael McAleer

This paper examines how the size of the rolling window, and the frequency used in moving average (MA) trading strategies, affect financial performance when risk is measured. We use the MA rule for market timing, that is, for deciding when to buy stocks and when to shift to the risk-free rate. The important issue of the predictability of returns is assessed. It is found that performance improves, on average, when the rolling window is expanded and the data frequency is low. However, when the size of the rolling window reaches three years, the frequency loses its significance and all frequencies considered produce similar financial performance. Therefore, the results support stock return predictability in the long run. The procedure takes account of the issue of variable persistence, as we use only returns in the analysis. We therefore use the performance of MA rules as an instrument for testing return predictability in financial stock markets.
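A bare-bones version of such an MA timing rule can be sketched on simulated returns; the window length, return parameters, flat risk-free rate, and annualized Sharpe-ratio convention below are illustrative assumptions, not the paper's settings.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical daily stock returns and a flat daily risk-free rate.
ret = rng.normal(0.0004, 0.01, 2500)
price = np.cumprod(1 + ret)
rf = 0.0001
window = 200  # rolling window length of the MA rule

# Rolling mean of the price; ma[k] averages price[k : k + window].
ma = np.convolve(price, np.ones(window) / window, mode="valid")
# Signal at day t uses information up to t: hold the stock on day t+1
# if the price is above its moving average, otherwise earn the risk-free rate.
signal = price[window - 1:-1] > ma[:-1]
strat = np.where(signal, ret[window:], rf)

def sharpe(r):
    """Annualized Sharpe ratio over the risk-free rate (252 trading days)."""
    return (r.mean() - rf) / r.std() * np.sqrt(252)

sr_strategy, sr_buyhold = sharpe(strat), sharpe(ret[window:])
```

Re-running the sketch over a grid of `window` values and data frequencies mimics the paper's experiment of asking when the timing rule stops adding information.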

]]>Risks doi: 10.3390/risks6040104

Authors: Naji Massad Jørgen Vitting Andersen

We introduce tools to capture the dynamics of three different pathways, in which the synchronization of human decision-making could lead to turbulent periods and contagion phenomena in financial markets. The first pathway is caused when stock market indices, seen as a set of coupled integrate-and-fire oscillators, synchronize in frequency. The integrate-and-fire dynamics happens due to &ldquo;change blindness&rdquo;, a trait in human decision-making where people have the tendency to ignore small changes, but take action when a large change happens. The second pathway happens due to feedback mechanisms between market performance and the use of certain (decoupled) trading strategies. The third pathway occurs through the effects of communication and its impact on human decision-making. A model is introduced in which financial market performance has an impact on decision-making through communication between people. Conversely, the sentiment created via communication has an impact on financial market performance. The methodologies used are: agent based modeling, models of integrate-and-fire oscillators, and communication models of human decision-making.

]]>Risks doi: 10.3390/risks6030103

Authors: Jin Sun Pavel V. Shevchenko Man Chung Fung

Variable annuities, as a class of retirement income products, allow equity market exposure for a policyholder&rsquo;s retirement fund, with optional guarantees to limit the downside risk of the market. Management fees and guarantee insurance fees are charged respectively for the market exposure and for the protection from downside risk. We investigate the pricing of variable annuity guarantees under optimal withdrawal strategies when management fees are present. We consider optimal withdrawal strategies from both the policyholder&rsquo;s and the insurer&rsquo;s perspectives and calculate the respective fair insurance fees. We reveal a discrepancy whereby the fees from the insurer&rsquo;s perspective can be significantly higher, due to the management fees serving as a form of market friction. Our results provide a possible explanation for the lower guarantee insurance fees observed in the market than those predicted from the insurer&rsquo;s perspective. Numerical experiments are conducted to illustrate the results.

]]>Risks doi: 10.3390/risks6030102

Authors: Matija Vidmar

A fluctuation theory and, in particular, a theory of scale functions is developed for upwards skip-free L&eacute;vy chains, i.e., for right-continuous random walks embedded into continuous time as compound Poisson processes. This is done by analogy to the spectrally negative class of L&eacute;vy processes&mdash;several results, however, can be made more explicit/exhaustive in the compound Poisson setting. Importantly, the scale functions admit a linear recursion, of constant order when the support of the jump measure is bounded, by means of which they can be calculated&mdash;some examples are presented. An application to the modeling of an insurance company&rsquo;s aggregate capital process is briefly considered.

]]>Risks doi: 10.3390/risks6030101

Authors: Yaseen Ghulam Kamini Dhruva Sana Naseem Sophie Hill

We utilize data from a very large UK automobile loan firm to study how the characteristics of borrowers and loans interact in predicting subsequent loan performance. Our broader findings confirm earlier research on subprime auto loans. More importantly, unmarried borrowers in furnished tenancies who have relatively new jobs have a probability of defaulting of more than 60%, compared to an average 7% default rate among subprime borrowers overall in the dataset. Also in this category are those who live in a less prosperous part of the UK such as the north-west, are full-time self-employed, have other large loan arrears, fall into the bottom 25th percentile of monthly income, secure loans with a high loan-to-total-value (LTV) ratio, purchase expensive automobiles with shorter loan duration payment plans, and have a high dependency on government support. The same holds for those who go into arrears, except that the highest probability in this context is around 40%, compared to 6% for the overall sample. These findings should help in understanding subprime auto loan performance in relation to borrower and loan features, as well as helping auto finance firms improve predictive models and decision-making.

]]>Risks doi: 10.3390/risks6030100

Authors: Nadezhda Gribkova Ričardas Zitikis

Background, or systematic, risks are integral parts of many systems and models in insurance and finance. These risks can, for example, be economic in nature, or they can carry more technical connotations, such as errors or intrusions, which could be intentional or unintentional. A most natural question arises from the practical point of view: is the given system really affected by these risks? In this paper we offer an algorithm for answering this question, given input-output data and appropriately constructed statistics, which rely on the order statistics of inputs and the concomitants of outputs. Even though the idea is rooted in complex statistical and probabilistic considerations, the algorithm is easy to implement and use in practice, as illustrated using simulated data.

]]>Risks doi: 10.3390/risks6030099

Authors: Claude Lefèvre Stéphane Loisel Muhsin Tamturk Sergey Utev

A quantum mechanics approach is proposed to model non-life insurance risks and to compute future reserve amounts and ruin probabilities. The claim data, historical or simulated, are treated as coming from quantum observables and analyzed with traditional machine learning tools. They can then be used to forecast the evolution of the reserves of an insurance company. The methodology relies on the Dirac matrix formalism and the Feynman path-integral method.

]]>Risks doi: 10.3390/risks6030098

Authors: Chuan-Chuan Ko Tyrone T. Lin Fu-Min Zeng Chien-Yu Liu

The study considers the product life cycle in the stages of technological innovation and focuses on how to evaluate the optimal investment strategy and the project value. It applies factors from three product stages (production innovation, manufacture innovation, and business innovation) to different risks to build a technology innovation strategy model. This study of option premiums aims to find the best strategy timing for each innovation stage. It shows that variation in the business cycle affects purchasing power under the uncertainty of Gross Domestic Product (GDP). In application, the compound binomial options for manufacture innovation are only considered after the execution of production innovation, whereas business innovation is only considered after the execution of manufacture innovation. Thus, this paper constructs a dynamic sequential investment decision model, assesses the feasibility of an investment strategy, and determines the appropriate project value and option premiums for each stage under possible changes in GDP. Numerically, the result shows that the equity value of the investment is greater than 0; therefore, this paper recommends that the case firm invest in its innovation project, known as one-time passwords. Sensitivity analysis shows that when the risk-adjusted discount rate r increases, the risk of the investment market increases accordingly, so the equity value must also be higher in order to attract the case firm&rsquo;s investment interest. The average GDP growth rate u exhibits a different phenomenon: the equity value gradually decreases as the growth rate rises, but once the rate rises beyond a certain extent, the equity value gradually grows.
The study investigates product life cycle innovation investment using the compound binomial options method and therefore provides a more flexible strategy decision than other trend forecast criteria.

]]>Risks doi: 10.3390/risks6030097

Authors: Gareth W. Peters

The paper addresses three objectives. The first is a presentation and overview of some important developments in quantile time series approaches relevant to demographic applications. The second is the development of a general framework to represent quantile regression models in a unifying manner, which can further enhance practical extensions and assist in forming connections between existing models for practitioners; in this regard, the core theme of the paper is to give a general audience a perspective on the core components that go into the construction of a quantile time series model. The third objective is to compare and discuss the application of the different quantile time series models on several interesting demographic and mortality-related time series data sets. This has relevance to life insurance analysis, and the resulting exploration includes applications to mortality, fertility, births and morbidity data for several countries, with a more detailed analysis of regional data in England, Wales and Scotland.

]]>Risks doi: 10.3390/risks6030096

Authors: Eric Beutner Henryk Zähle

Almost sure bootstrap consistency of the blockwise bootstrap for the Average Value at Risk of single risks is established for strictly stationary &beta; -mixing observations. Moreover, almost sure bootstrap consistency of a multiplier bootstrap for the Average Value at Risk of collective risks is established for independent observations. The main results rely on a new functional delta-method for the almost sure bootstrap of uniformly quasi-Hadamard differentiable statistical functionals, to be presented here. The latter seems to be interesting in its own right.

]]>Risks doi: 10.3390/risks6030095

Authors: Paolo Giudici Laura Parisi

We propose a novel credit risk measurement model for credit default swap (CDS) spreads that combines vector autoregressive regression with correlation networks. We focus on the sovereign CDS spreads of a collection of countries, which can be regarded as idiosyncratic measures of credit risk. We model CDS spreads by means of a structural vector autoregressive model, composed of a time-dependent country-specific component and a contemporaneous component that describes contagion effects among countries. To disentangle the two components, we employ correlation networks derived from the correlation matrix of the reduced-form residuals. The proposed model is applied to ten countries that are representative of the recent financial crisis: top borrowing/lending countries and peripheral European countries. The empirical findings show that the contagion variable derived in this study can be considered a network centrality measure. From an applied viewpoint, the results indicate that, in the last 10 years, contagion has induced a &ldquo;clustering effect&rdquo; between core and peripheral countries, with the two groups further diverging through, and because of, contagion propagation, thus creating a sort of diabolic loop that is extremely difficult to reverse. Finally, the outcomes of the analysis confirm that core countries are importers of risk, as contagion increases their CDS spreads, whereas peripheral countries are exporters of risk. Greece is an unfortunate exception, as its spreads seem to increase due to both idiosyncratic factors and contagion effects.

]]>Risks doi: 10.3390/risks6030094

Authors: Adnen Ben Nasr Juncal Cunado Rıza Demirer Rangan Gupta

This study examines the linkages between Brazil, Russia, India, China, and South Africa (BRICS) stock market returns, country risk ratings, and international factors via Non-linear Auto Regressive Distributed Lags (NARDL) models that allow for testing the asymmetric effects of changes in country risk ratings on stock market returns. We show that BRICS countries exhibit quite a degree of heterogeneity in the interaction of their stock market returns with country-specific political, financial, and economic risk ratings. Positive and negative rating changes in some BRICS countries are found to have significant implications both for local stock market returns and for commodity price dynamics. While the commodity market acts as a catalyst for these emerging stock markets in the long run, we also observe that negative changes in country risk ratings generally command a higher impact on stock returns, implying a greater impact of bad news on market dynamics. Our findings suggest that not all BRICS nations are the same in terms of how they react to rating changes and how they interact with global market variables.

]]>Risks doi: 10.3390/risks6030093

Authors: Guus Balkema Paul Embrechts

There exist several estimators of the regression line in simple linear regression: Least Squares, Least Absolute Deviation, Right Median, Theil&ndash;Sen, Weighted Balance, and Least Trimmed Squares. Their performance for heavy tails is compared below on the basis of a quadratic loss function. The case where the explanatory variable is the inverse of a standard uniform variable and where the error has a Cauchy distribution plays a central role, but heavier and lighter tails are also considered. Tables list the empirical sd and bias for ten batches of one hundred thousand simulations when the explanatory variable has a Pareto distribution and the error has a symmetric Student distribution or a one-sided Pareto distribution for various tail indices. The results in the tables may be used as benchmarks. The sample size is n = 100, but results for n = &infin; are also presented. The error in the estimate of the slope need not be asymptotically normal. For symmetric errors, the symmetric generalized beta prime densities often give a good fit.
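The contrast between one robust estimator from this list and least squares is easy to reproduce; the inverse-uniform regressor, Cauchy errors, and true slope of 0.5 below are illustrative choices loosely echoing the central case described above, not the paper's simulation design.

```python
import numpy as np
from scipy.stats import theilslopes

rng = np.random.default_rng(7)
n = 100
x = 1 / rng.uniform(size=n)                   # heavy-tailed explanatory variable
y = 2.0 + 0.5 * x + rng.standard_cauchy(n)    # Cauchy errors, true slope 0.5

slope_ts = theilslopes(y, x)[0]     # Theil-Sen: median of all pairwise slopes
slope_ols = np.polyfit(x, y, 1)[0]  # least squares, for comparison
```

Because the Cauchy errors have no finite variance, the least-squares slope can be wild in any single sample, while the median-of-pairwise-slopes construction stays near the true value.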

]]>Risks doi: 10.3390/risks6030092

Authors: Janine Balter Alexander J. McNeil

A justification of the Basel liquidity formula for risk capital in the trading book is given under the assumption that market risk-factor changes form a Gaussian white noise process over 10-day time steps and changes to P&amp;L (profit-and-loss) are linear in the risk-factor changes. A generalization of the formula is derived under the more general assumption that risk-factor changes are multivariate elliptical. It is shown that the Basel formula tends to be conservative when the elliptical distributions are from the heavier-tailed generalized hyperbolic family. As a by-product of the analysis, a Fourier approach to calculating expected shortfall for general symmetric loss distributions is developed.

]]>Risks doi: 10.3390/risks6030091

Authors: Riccardo Gatto

In this article we introduce the stability analysis of a compound sum: it consists of computing the standardized variation of the survival function of the sum resulting from an infinitesimal perturbation of the common distribution of the summands. Stability analysis is complementary to the classical sensitivity analysis, which consists of computing the derivative of an important indicator of the model, with respect to a model parameter. We obtain a computational formula for this stability from the saddlepoint approximation. We apply the formula to the compound Poisson insurer loss with gamma individual claim amounts and to the compound geometric loss with Weibull individual claim amounts.

]]>Risks doi: 10.3390/risks6030090

Authors: Jean-Pierre Fouque Zhaoyu Zhang

We study a toy model of linear-quadratic mean field game with delay. We &ldquo;lift&rdquo; the delayed dynamic into an infinite dimensional space, and recast the mean field game system which is made of a forward Kolmogorov equation and a backward Hamilton-Jacobi-Bellman equation. We identify the corresponding master equation. A solution to this master equation is computed, and we show that it provides an approximation to a Nash equilibrium of the finite player game.

]]>Risks doi: 10.3390/risks6030089

Authors: Jatin Malhotra Angelo Corelli

The paper analyzes the relationship between credit default swap (CDS) spreads for 5-year CDS in Europe and the US and fundamental macroeconomic variables such as regional stock indices, oil prices, gold prices, and interest rates. The dataset covers multiple industry sectors in both economies and is split into two periods, before and after the global financial crisis. The analysis is carried out using multivariate regression of each index against the macroeconomic variables, and a Granger causality test. Both approaches are performed on the changes in the values of the variables involved. Results show that equity markets lead in price discovery, with bidirectional causality between interest rates and CDS spreads for most sectors involved. There is also bidirectional causality between stock and oil returns and CDS spreads.
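The idea behind a Granger test on changes can be sketched without any specialist library: regress the target on its own lag, add the lagged candidate cause, and compare residual sums of squares with an F statistic. The data-generating process below is synthetic, with `x` genuinely driving `y`; it is a one-lag illustration, not the paper's specification.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 500
x = rng.standard_normal(n)
y = np.zeros(n)
for t in range(1, n):
    y[t] = 0.2 * y[t-1] + 0.6 * x[t-1] + rng.standard_normal()

# Restricted model: y on its own lag. Full model: add the lagged cause x.
Y = y[1:]
restricted = np.c_[np.ones(n - 1), y[:-1]]
full = np.c_[restricted, x[:-1]]

def rss(X):
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return ((Y - X @ beta) ** 2).sum()

rss_r, rss_f = rss(restricted), rss(full)
# F statistic with one restriction; large values reject "x does not Granger-cause y".
F = (rss_r - rss_f) / (rss_f / (len(Y) - full.shape[1]))
```

Running the same comparison in both directions (swapping `x` and `y`) is what distinguishes unidirectional from bidirectional causality in the results above.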

]]>Risks doi: 10.3390/risks6030088

Authors: Rui Fang Xiaohu Li

Co-risk measures and risk contribution measures have been introduced to evaluate the degree of interaction between paired risks in actuarial risk management. This paper studies the ordering behavior of such measures of interaction between paired risks. For various co-risk measures and risk contribution measures, we investigate how the marginal distributions and the dependence structure impact the level of interaction between paired risks. Several numerical examples based on Monte Carlo simulation are also presented to illustrate the main findings.

]]>Risks doi: 10.3390/risks6030087

Authors: Raluca Vernic

With the purpose of introducing dependence between different types of claims, multivariate collective models have recently gained a lot of attention. However, when it comes to evaluating the corresponding compound distribution, the difficulties increase with the dimensionality of the model. In this paper, we consider a multivariate collective model that generalizes a model whose distribution has already been studied from the point of view of recursive and FFT evaluation, and we extend the same study to the general model. To see which method works better for this general model, we compare the recursive method with the FFT technique, emphasizing the advantages and drawbacks of each, based on numerical examples.
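For the univariate compound Poisson case, the two techniques being compared are easy to exhibit side by side; the Poisson rate, three-point severity, and grid length below are arbitrary illustrative choices, not the paper's multivariate model.

```python
import numpy as np

# Compound Poisson: N ~ Poisson(lam), i.i.d. claim sizes on {1, 2, 3}.
lam = 3.0
m = 64  # grid length, chosen so the aliased tail mass beyond it is negligible
severity = np.zeros(m)
severity[1:4] = [0.5, 0.3, 0.2]  # P(X=1), P(X=2), P(X=3)

# FFT route: the transform of the compound distribution is exp(lam*(phi_X - 1)).
phi = np.fft.fft(severity)
agg_fft = np.real(np.fft.ifft(np.exp(lam * (phi - 1))))

# Panjer recursion for the Poisson case, as a cross-check:
# g(s) = (lam / s) * sum_k k * f(k) * g(s - k).
g = np.zeros(m)
g[0] = np.exp(-lam)
for s in range(1, m):
    g[s] = sum(lam * k / s * severity[k] * g[s - k]
               for k in range(1, min(s, 3) + 1))
```

The recursion is exact but its cost grows with dimension, whereas the FFT trades a controllable aliasing error for speed, which is the trade-off the paper weighs in the general model.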

]]>Risks doi: 10.3390/risks6030086

Authors: Fouad Marri Franck Adékambi Khouzeima Moutanabbir

In this paper, we study the discounted renewal aggregate claims with a full dependence structure. Based on a mixing exponential model, the dependence among the inter-claim times, the claim sizes, as well as the dependence between the inter-claim times and the claim sizes are included. The main contribution of this paper is the derivation of the closed-form expressions for the higher moments of the discounted aggregate renewal claims. Then, explicit expressions of these moments are provided for specific copulas families and some numerical illustrations are given to analyze the impact of dependency on the moments of the discounted aggregate amount of claims.

]]>Risks doi: 10.3390/risks6030085

Authors: Mohamed Amine Lkabous Jean-François Renaud

In this short paper, we study a VaR-type risk measure introduced by Gu&eacute;rin and Renaud and which is based on cumulative Parisian ruin. We derive some properties of this risk measure and we compare it to the risk measures of Trufin et al. and Loisel and Trufin.

]]>Risks doi: 10.3390/risks6030084

Authors: Holger Fink Andreas Fuest Henry Port

A functional ARMA-GARCH model for predicting the value-at-risk of the EURUSD exchange rate is introduced. The model uses the yield curve differentials between the euro area and the US as exogenous factors. Functional principal component analysis allows us to use the information of essentially the whole yield curve, in a parsimonious way, for exchange rate risk prediction. The data analyzed in our empirical study consist of the EURUSD exchange rate and the EUR and US yield curves from 15 August 2005 to 30 September 2016. As benchmarks, we take an ARMA-GARCH and an ARMAX-GARCHX with the 2y-yield difference as the exogenous variable, and compare the forecasting performance via likelihood ratio tests. While our model performs better in one situation, it does not seem to improve performance in other setups compared to its competitors.

]]>Risks doi: 10.3390/risks6030083

Authors: Michelle Xia

In this paper, we study the problem of misrepresentation under heavy-tailed regression models with the presence of both misrepresented and correctly-measured risk factors. Misrepresentation is a type of fraud when a policy applicant gives a false statement on a risk factor that determines the insurance premium. Under the regression context, we introduce heavy-tailed misrepresentation models based on the lognormal, Weibull and Pareto distributions. The proposed models allow insurance modelers to identify risk characteristics associated with the misrepresentation risk, by imposing a latent logit model on the prevalence of misrepresentation. We prove the theoretical identifiability and implement the models using Bayesian Markov chain Monte Carlo techniques. The model performance is evaluated through both simulated data and real data from the Medical Panel Expenditure Survey. The simulation study confirms the consistency of the Bayesian estimators in large samples, whereas the case study demonstrates the necessity of the proposed models for real applications when the losses exhibit heavy-tailed features.

]]>Risks doi: 10.3390/risks6030082

Authors: Giuseppe Montesi Giovanni Papiro

We present a stochastic simulation forecasting model for stress testing that is aimed at assessing banks&rsquo; capital adequacy, financial fragility, and probability of default. The paper provides a theoretical presentation of the methodology and the essential features of the forecasting model on which it is based. Also, for illustrative purposes and to show in practical terms how to apply the methodology and the types of outcomes and analysis that can be obtained, we report the results of an empirical application of the methodology proposed to the Global Systemically Important Banks (G-SIB) banks. The results of the stress test exercise are compared with the results of the supervisory stress tests performed in 2014 by the Federal Reserve and EBA/ECB.

]]>Risks doi: 10.3390/risks6030081

Authors: Marjolein van Rooijen Chaw-Yin Myint Milena Pavlova Wim Groot

(1) Background: Health insurance and social protection in Myanmar are negligible, which leaves many citizens at risk of financial hardship in case of a serious illness. The aim of this study is to explore the views of healthcare consumers and compare them to the views of key informants on the design and implementation of a nationwide health insurance system in Myanmar. (2) Method: Data were collected through nine focus group discussions with healthcare consumers and six semi-structured interviews with key health system informants. (3) Results: The consumers supported a mandatory basic health insurance and voluntary supplementary health insurance. Tax-based funding was suggested as an option that can help to enhance healthcare utilization among the poor and vulnerable groups. However, fully tax-based funding was perceived to have limited chances of success given the low level of government resources available. Community-based insurance, where community members pool money in a healthcare fund, was seen as more appropriate for the rural areas. (4) Conclusion: This study suggests a healthcare financing mechanism based on a mixed insurance model for the creation of nationwide health insurance. Further inquiry into the feasibility of the vital aspects of the nationwide health insurance is needed.

]]>Risks doi: 10.3390/risks6030080

Authors: Martin Ewen

This paper examines the impact of volatility-based fund classification on portfolio performance. Using historical data on equity indices, we find that a strategy based on long-term portfolio volatility, as is imposed by the Synthetic Risk Reward Indicator (SRRI), yields better Sharpe Ratios (SR) and Buy and Hold Returns (BHR) than passive investments. However, accounting for the Fama&ndash;French factors in the historical data reveals no significant alphas for the vast majority of the strategies. Further analyses, conducted by running a simulation study based on a GJR(1,1) model, show no significant difference in mean returns, but significantly lower SRs for the volatility-based strategies. This evidence suggests that neither the higher leverage induced by the SRRI nor the potential protection in downside markets pays off on a risk-adjusted basis.
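The GJR(1,1) recursion underlying the simulation study is simple to reproduce; below is a minimal sketch in Python with illustrative (not calibrated) parameters of our own choosing, not the paper's:

```python
import numpy as np

def simulate_gjr(n, omega=1e-6, alpha=0.05, gamma=0.10, beta=0.85, seed=0):
    """Simulate returns from a GJR(1,1) model:
    sigma2_t = omega + (alpha + gamma * 1[r_{t-1} < 0]) * r_{t-1}^2 + beta * sigma2_{t-1}."""
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    # start from the unconditional variance omega / (1 - alpha - gamma/2 - beta)
    sigma2 = omega / (1.0 - alpha - gamma / 2.0 - beta)
    for t in range(n):
        r[t] = np.sqrt(sigma2) * rng.standard_normal()
        sigma2 = omega + (alpha + gamma * (r[t] < 0.0)) * r[t] ** 2 + beta * sigma2
    return r

returns = simulate_gjr(20_000)
```

The asymmetry term `gamma * 1[r < 0]` is what raises volatility after negative returns, the leverage effect the SRRI-based strategies react to.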

]]>Risks doi: 10.3390/risks6030079

Authors: Vadim Semenikhine Edward Furman Jianxi Su

One way to formulate a multivariate probability distribution with dependent univariate margins distributed gamma is by using the closure under convolutions property. This direction yields an additive background risk model, and it has been very well-studied. An alternative way to accomplish the same task is via an application of the Bernstein&ndash;Widder theorem with respect to a shifted inverse Beta probability density function. This way, which leads to an arguably equally popular multiplicative background risk model (MBRM), has been far less investigated. In this paper, we reintroduce the multiplicative multivariate gamma (MMG) distribution in its most general form, and we explore its various properties thoroughly. Specifically, we study the links to the MBRM, employ the machinery of divided differences to derive the distribution of the aggregate risk random variable explicitly, look into the corresponding copula function and the measures of nonlinear correlation associated with it, and, last but not least, determine the measures of maximal tail dependence. Our main message is that the MMG distribution is (1) very intuitive and easy to communicate, (2) remarkably tractable, and (3) possesses rich dependence and tail dependence characteristics. Hence, the MMG distribution should be given serious consideration when modelling dependent risks.

]]>Risks doi: 10.3390/risks6030078

Authors: Denis-Alexandre Trottier Frédéric Godin Emmanuel Hamel

Insurers issuing segregated fund policies apply dynamic hedging to mitigate risks related to guarantees embedded in such policies. A typical industry practice consists of using fund mapping regressions to represent basis risk stemming from the imperfect correlation between the underlying fund and its corresponding hedging instruments. The current work discusses the implications of using fund mapping regressions when the joint dynamics of the underlying and hedging assets is a regime-switching process. The potential underestimation of capital requirements stemming from the use of a fund mapping regression under such dynamics is discussed. The magnitude of the latter phenomenon is quantified through simulations calibrated on market data.

]]>Risks doi: 10.3390/risks6030077

Authors: Donatien Hainaut

Most of the models leading to an analytical expression for option prices are based on the assumption that underlying asset returns evolve according to a Brownian motion with drift. For some asset classes, like commodities, a Brownian model does not fit empirical covariance and autocorrelation structures. This failure to replicate the covariance introduces a bias in the valuation of calendar spread exchange options. As the payoff of these options depends on two asset values at different times, particular care must be taken in the modeling of covariance and autocorrelation. This article proposes a simple alternative model for asset prices with sub-exponential, exponential and hyper-exponential autocovariance structures. In the proposed approach, price processes are seen as conditional Gaussian fields indexed by time. In general, this process is not a semi-martingale, and therefore, we cannot rely on stochastic differential calculus to evaluate options. However, option prices are still calculable by the change-of-numeraire technique. A numerical illustration confirms the important influence of the covariance structure in the valuation of calendar spread exchange options for Brent against WTI crude oil and for gold against silver.

]]>Risks doi: 10.3390/risks6030076

Authors: Stanislaus Maier-Paape Qiji Jim Zhu

The aim of this paper is to provide several examples of convex risk measures necessary for the application of the general framework for portfolio theory of Maier-Paape and Zhu (2018), presented in Part I of this series. As an alternative to classical portfolio risk measures such as the standard deviation, we, in particular, construct risk measures related to the &ldquo;current&rdquo; drawdown of the portfolio equity. In contrast to references Chekhlov, Uryasev, and Zabarankin (2003, 2005), Goldberg and Mahmoud (2017), and Zabarankin, Pavlikov, and Uryasev (2014), who used the absolute drawdown, our risk measure is based on the relative drawdown process. Combined with the results of Part I, Maier-Paape and Zhu (2018), this allows us to calculate efficient portfolios based on a drawdown risk measure constraint.

]]>Risks doi: 10.3390/risks6030075

Authors: Michel Dacorogna Alessandro Ferriero David Krief

We study the dynamics of the one-year change in P&amp;C insurance reserve estimates by analyzing the process that leads to the ultimate risk in the case of &ldquo;fixed-sum&rdquo; insurance contracts. The ultimate loss random variable is assumed to follow a binomial distribution. We compute explicitly various quantities of interest, in particular the Solvency Capital Requirement for the one-year change and the Risk Margin, using the characteristics of the underlying model. We then compare them with the same figures calculated with existing risk estimation methods. In particular, our study shows that standard methods (Merz&ndash;W&uuml;thrich) can lead to materially incorrect results if their assumptions are not fulfilled. This is due to a multiplicative error assumption behind the standard methods, whereas our example has additive error propagation, as often happens in practice.

]]>Risks doi: 10.3390/risks6030074

Authors: Fabiana Gómez Jorge Ponce

This paper provides a rationale for the macro-prudential regulation of insurance companies, where capital requirements increase in their contribution to systemic risk. In the absence of systemic risk, the formal model in this paper predicts that optimal regulation may be implemented by capital regulation (similar to that observed in practice, e.g., Solvency II) and by actuarially fair technical reserve. However, these instruments are not sufficient when insurance companies are exposed to systemic risk: prudential regulation should also add a systemic component to capital requirements that is non-decreasing in the firm&rsquo;s exposure to systemic risk. Implementing the optimal policy implies separating insurance firms into two categories according to their exposure to systemic risk: those with relatively low exposure should be eligible for bailouts, while those with high exposure should not benefit from public support if a systemic event occurs.

]]>Risks doi: 10.3390/risks6030073

Authors: Christian Hipp

In this note, we study the problem of company values with a ruin constraint in classical continuous-time Lundberg models. For this, we adapt the methods and results for discrete de Finetti models to time- and state-continuous Lundberg models. The policy improvement method also works in continuous models, but it is slow and needs discretization. Better results can be obtained faster using the barrier method for discrete models, which can be adjusted for Lundberg models. In this method, dividend strategies are considered which are based on barrier sequences. In our continuous state model, optimal barriers can be computed with the Lagrange method leading to a backward recursion scheme. The resulting dividend strategies will not always be optimal: in the case without ruin constraint, there are examples in which band strategies are superior. We also develop equations for optimal control of dynamic reinsurance to maximize the company value under a ruin constraint. These identify the optimal reinsurance strategy in no-action regions and allow for an interactive computation of the value function. We apply the methods in a numerical example with exponential claims.

]]>Risks doi: 10.3390/risks6030072

Authors: Andreas Mühlbacher Thomas Guhr

The stability of the financial system is associated with systemic risk factors such as the concurrent default of numerous small obligors. Hence, it is of utmost importance to study the mutual dependence of losses for different creditors in the case of large, overlapping credit portfolios. We analytically calculate the multivariate joint loss distribution of several credit portfolios on a non-stationary market. To take fluctuating asset correlations into account, we use a random matrix approach which preserves, as a much appreciated side effect, analytical tractability and drastically reduces the number of parameters. We show that, for two disjoint credit portfolios, diversification does not work in a correlated market. Additionally, we find large concurrent portfolio losses to be rather likely. We show that significant correlations of the losses emerge not only for large portfolios with thousands of credit contracts, but also for small portfolios consisting of a few credit contracts only. Furthermore, we include subordination levels, which were established in collateralized debt obligations to protect the more senior tranches from high losses. We analytically corroborate the observation that an extreme loss of the subordinated creditor is likely to also yield a large loss of the senior creditor.

]]>Risks doi: 10.3390/risks6030071

Authors: Guojun Gan

A variable annuity is a popular life insurance product that comes with financial guarantees. Using Monte Carlo simulation to value a large variable annuity portfolio is extremely time-consuming. Metamodeling approaches have been proposed in the literature to speed up the valuation process. In metamodeling, a metamodel is first fitted to a small number of variable annuity contracts and then used to predict the values of all other contracts. However, metamodels that have been investigated in the literature are sophisticated predictive models. In this paper, we investigate the use of linear regression models with interaction effects for the valuation of large variable annuity portfolios. Our numerical results show that linear regression models with interactions are able to produce accurate predictions and can be useful additions to the toolbox of metamodels that insurance companies can use to speed up the valuation of large VA portfolios.
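As a toy illustration of a metamodel of this kind, one can fit an ordinary least-squares model whose design matrix includes all pairwise interaction terms. The data and feature interpretation below are synthetic stand-ins of our own, not the paper's variable annuity portfolio:

```python
import numpy as np
from itertools import combinations

# synthetic "contract features" and "values" with a genuine interaction effect
rng = np.random.default_rng(1)
n = 500
X = rng.uniform(size=(n, 3))
y = 1.0 + 2.0 * X[:, 0] - X[:, 1] + 0.5 * X[:, 0] * X[:, 2] + 0.01 * rng.standard_normal(n)

def design_with_interactions(X):
    """Design matrix with intercept, main effects, and all pairwise interactions."""
    cols = [np.ones(len(X))] + [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

D = design_with_interactions(X)
beta, *_ = np.linalg.lstsq(D, y, rcond=None)
rmse = np.sqrt(np.mean((D @ beta - y) ** 2))
```

Fitting on a small subset and predicting the rest with `design_with_interactions(new_X) @ beta` is the metamodeling shortcut the abstract describes: one cheap linear predict replaces a full Monte Carlo valuation per contract.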

]]>Risks doi: 10.3390/risks6030070

Authors: Jonas Harnau

Although both over-dispersed Poisson and log-normal chain-ladder models are popular in claim reserving, it is not obvious when to choose which model. Yet, the two models are obviously different. While the over-dispersed Poisson model requires the variance-to-mean ratio to be common across the array, the log-normal model assumes the same for the standard-deviation-to-mean ratio. Leveraging this insight, we propose a test that has the power to distinguish between the two models. The theory is asymptotic, but it does not build on a large size of the array and, instead, makes use of information accumulating within the cells. The test has a non-standard asymptotic distribution; however, saddle point approximations are available. We show in a simulation study that these approximations are accurate and that the test performs well in finite samples and has high power.

]]>Risks doi: 10.3390/risks6030069

Authors: Himchan Jeong Guojun Gan Emiliano A. Valdez

For automobile insurance, it has long been implied that when a policyholder makes at least one claim in the prior year, the subsequent premium is likely to increase. When this happens, the policyholder may seek to switch to another insurance company to possibly avoid paying a higher premium. In such situations, insurers may be faced with the challenge of retaining policyholders by keeping premiums low in the face of competition. In this paper, we seek to find empirical evidence of a possible association between policyholder switching after a claim and the associated change in premium. In accomplishing this goal, we employ the method of association rule learning, a data mining technique that has its origins in marketing for analyzing and understanding consumer purchase behavior. We apply this technique in two stages. In the first stage, we identify policyholder and vehicle characteristics that affect the size of the claim and the resulting change in premium regardless of policy switch. In the second stage, together with policyholder and vehicle characteristics, we identify the association among the size of the claim, the level of premium increase and policy switch. This empirical process is often challenging to insurers because they are unable to observe the new premium for those policyholders who switched. However, we used nine-year claims data for the entire Singapore automobile insurance market that allowed us to track information before and after the switch. Our results provide evidence of a strong association among the size of the claim, the level of premium increase and policy switch. We attribute this to the possible inefficiency of the insurance market because of the lack of sharing and exchange of claims history among the companies.
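At its core, association rule learning reduces to computing supports and confidences over itemsets. A self-contained toy version, with hypothetical policy attributes of our own invention rather than the Singapore data, looks like this:

```python
# toy transactions: attribute sets observed per policy (hypothetical labels)
transactions = [
    {"large_claim", "premium_up", "switch"},
    {"large_claim", "premium_up", "switch"},
    {"small_claim", "premium_flat"},
    {"large_claim", "premium_up"},
    {"small_claim", "premium_flat", "switch"},
]

def support(itemset):
    """Fraction of transactions containing every item in itemset."""
    return sum(itemset <= t for t in transactions) / len(transactions)

def confidence(antecedent, consequent):
    """Conditional frequency of the consequent given the antecedent."""
    return support(antecedent | consequent) / support(antecedent)

rule_conf = confidence({"large_claim", "premium_up"}, {"switch"})  # 2/3 here
```

A rule such as {large_claim, premium_up} &rarr; {switch} with high support and confidence is exactly the kind of association the paper's two-stage analysis mines for.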

]]>Risks doi: 10.3390/risks6030068

Authors: Kristian Buchardt Thomas Møller

In investment and insurance contracts, certain stipulated payments may depend on the hedging strategy. We study the problem of calculation, hedging and valuation of such cash flows, by considering a payment process in a setup with taxes and investment costs that are functions of the investment returns or the current value of the hedging strategy. We determine the market value of the combined liability and decompose the value into the tax part, the investment cost part and the benefit part, and we determine the associated hedging strategies. Moreover, we identify the expected future tax payments and investment cost cash flows. Our results show that the current Danish insurance accounting practice for taxes is in general conservative, when considered in an idealized setting with symmetric and continuously-paid taxes. Finally, we consider the special case of affine interest rates, where explicit results can be obtained, and study some numerical results.

]]>Risks doi: 10.3390/risks6030067

Authors: Edouard Debonneuil Anne Eyraud-Loisel Frédéric Planchet

Pension funds, which manage the financing of a large share of global retirement schemes, need to invest their assets in a diversified manner and over long durations while managing interest rate and longevity risks. In recent years, a new type of investment has emerged, that we call a longevity megafund, which invests in clinical trials for solutions against lifespan-limiting diseases and provides returns positively correlated with longevity. After describing ongoing biomedical developments against ageing-related diseases, we model the needed capital for pension funds to face longevity risk and find that it is far above current practices. After investigating the financial returns of pharmaceutical developments, we estimate the returns of a longevity megafund. Combined, our models indicate that investing in a longevity megafund is an appropriate method to significantly reduce longevity risk and the associated economic capital need.

]]>Risks doi: 10.3390/risks6030066

Authors: Tobias Burkhart

Participating life insurance contracts entitle the policyholder to participate in the company&rsquo;s annual surplus. Typically, they are also equipped with a surrender option that allows the policyholder to terminate the contract prior to maturity, receiving a predetermined surrender value. The option interacts with (often cliquet-style) interest guarantees that are a key feature of traditional participating contracts. Surrender options can considerably affect an insurer&rsquo;s liabilities and bear material risks. This paper addresses the recognition of those risks in the quantitative assessment of a heterogeneous insurance portfolio under Solvency II, taking into account the complex interrelation between minimum interest guarantees, reserving requirements, and profit sharing. The lapse risk module of the Solvency II standard formula requires the identification of portfolio segments that are exposed to a specific change of surrender rates (long-term increase/decrease, one-off increase). We provide a heuristic that identifies homogeneous risk groups in the sense that the respective stress would increase the insurer&rsquo;s liabilities. Our approach can be used to derive an appropriate segmentation in practical applications. We further analyze implications of the segmentation on the Risk Margin (as part of the Technical Provisions under Solvency II) and discuss consequences of policyholder options on the calculation of Going Concern Reserve and Surplus Funds. To illustrate our findings, we set up a stochastic balance sheet and cash flow projection model for a stylized life insurance company. We conclude that current methods used for practical applications underestimate surrender risk under Solvency II and that the proposed modeling refinements may improve the appropriateness of solvency ratios for participating business.

]]>Risks doi: 10.3390/risks6030065

Authors: Youngna Choi

We investigate masked financial instability caused by wealth inequality. When an economic sector is decomposed into two subsectors that possess a severe wealth inequality, the sector in its entirety can look financially stable while the two subsectors possess extreme financial instabilities of opposite nature, one from excessive equity, the other from lack thereof. The unstable subsector can result in further financial distress and even trigger a financial crisis. The market instability indicator, an early warning system derived from dynamical systems applied to agent-based models, is used to analyze the subsectoral financial instabilities. Detailed mathematical analysis is provided to explain what financial instabilities can arise amid a seemingly stable economy and positive market data. The theoretical conjecture is verified by historical macroeconomic time series of the United States households among whom a substantial wealth inequality has been officially confirmed.

]]>Risks doi: 10.3390/risks6030064

Authors: Frédéric Vrins

Statistical modeling techniques&mdash;and factor models in particular&mdash;are extensively used in practice, especially in the insurance and finance industry, where many risks have to be accounted for. In risk management applications, it might be important to analyze the situation when fixing the value of a weighted sum of factors, for example to a given quantile. In this work, we derive the (n&minus;1)-dimensional distribution corresponding to an n-dimensional i.i.d. standard Normal vector Z=(Z1,Z2,&hellip;,Zn)&prime; subject to the weighted sum constraint w&prime;Z=c, where w=(w1,w2,&hellip;,wn)&prime; and wi&ne;0. This law is proven to be a Normal distribution, whose mean vector &mu; and covariance matrix &Sigma; are explicitly derived as a function of (w,c). The derivation of the density relies on the analytical inversion of a very specific positive definite matrix. We show that it does not correspond to naive sampling techniques one could think of. This result is then used to design algorithms for sampling Z under the constraint that w&prime;Z=c or w&prime;Z&le;c and is illustrated in two applications dealing with Value-at-Risk and Expected Shortfall.
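By the standard Gaussian conditioning formula, the law of Z given w&prime;Z=c is Normal with mean cw/(w&prime;w) and covariance I &minus; ww&prime;/(w&prime;w), and one exact way to draw from it is to project unconditional draws onto the constraint hyperplane. The sketch below is our own illustration of that textbook identity, not the paper's algorithm:

```python
import numpy as np

def sample_given_sum(w, c, size, seed=None):
    """Draw from the law of Z ~ N(0, I_n) conditioned on w'Z = c.
    Projecting unconditional draws onto {z : w'z = c} yields exactly
    N(c*w/(w'w), I - w w'/(w'w)), the conditional distribution."""
    rng = np.random.default_rng(seed)
    w = np.asarray(w, dtype=float)
    X = rng.standard_normal((size, len(w)))
    return X + ((c - X @ w) / (w @ w))[:, None] * w[None, :]

w = np.array([1.0, 2.0, -0.5])
Z = sample_given_sum(w, c=1.5, size=20_000, seed=0)
```

Every row of `Z` satisfies the constraint exactly, and the sample mean matches cw/(w&prime;w).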

]]>Risks doi: 10.3390/risks6030063

Authors: Jiwook Jang Siti Norafidah Mohd Ramli

We explored the effect of the jump-diffusion process on a social benefit scheme consisting of life insurance, unemployment/disability benefits, and retirement benefits. To do so, we used a four-state Markov chain with multiple decrements. Assuming independent state-wise intensities taking the form of a jump-diffusion process and deterministic interest rates, we evaluated the prospective reserves for this scheme in which the individual is employed at inception. We then numerically demonstrated the state of the reserves for the scheme under jump-diffusion and non-jump-diffusion settings. By decomposing the reserve equation into five components, our numerical illustration indicated that an extension of the retirement age has a spillover effect that would increase government expenses for other social insurance programs. We also conducted sensitivity analyses and examined the total-reserves components by changing the relevant parameters of the transition intensities, which are the average jump-size parameter, average jump frequency, and diffusion parameters of the chosen states, with figures provided. Our computation revealed that the total reserve is most sensitive to changes in average jump frequency.

]]>Risks doi: 10.3390/risks6020062

Authors: Anne-Sophie Krah Zoran Nikolić Ralf Korn

The Solvency II directive asks insurance companies to derive their solvency capital requirement from the full loss distribution over the coming year. While this is in general computationally infeasible in the life insurance business, an application of the Least-Squares Monte Carlo (LSMC) method offers a possibility to overcome this computational challenge. We outline in detail the challenges a life insurer faces, the theoretical basis of the LSMC method and the necessary steps on the way to reliable proxy modeling in the life insurance business. Further, we illustrate the advantages of the LSMC approach by presenting (slightly disguised) real-world applications.

]]>Risks doi: 10.3390/risks6020061

Authors: James Ming Chen

This article reviews two leading measures of financial risk and an emerging alternative. Embraced by the Basel accords, value-at-risk and expected shortfall are the leading measures of financial risk. Expectiles offset the weaknesses of value-at-risk (VaR) and expected shortfall. Indeed, expectiles are the only elicitable law-invariant coherent risk measures. After reviewing practical concerns involving backtesting and robustness, this article more closely examines regulatory applications of expectiles. Expectiles are most readily evaluated as a special class of quantiles. For ease of regulatory implementation, expectiles can be defined exclusively in terms of VaR, expected shortfall, and the thresholds at which those competing risk measures are enforced. Moreover, expectiles are in harmony with gain/loss ratios in financial risk management. Expectiles may address some of the flaws in VaR and expected shortfall&mdash;subject to the reservation that no risk measure can achieve exactitude in regulation.
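For concreteness, the &tau;-expectile of a sample can be computed by bisection on its first-order condition from asymmetric least squares. This is our own numerical sketch for intuition, not the article's regulatory proposal:

```python
import numpy as np

def expectile(x, tau, tol=1e-10):
    """Empirical tau-expectile: the unique root e of
    g(e) = tau * E[(X - e)_+] - (1 - tau) * E[(e - X)_+],
    found by bisection (g is strictly decreasing in e)."""
    x = np.asarray(x, dtype=float)
    lo, hi = x.min(), x.max()
    while hi - lo > tol:
        e = 0.5 * (lo + hi)
        g = tau * np.mean(np.clip(x - e, 0.0, None)) \
            - (1 - tau) * np.mean(np.clip(e - x, 0.0, None))
        lo, hi = (e, hi) if g > 0 else (lo, e)
    return 0.5 * (lo + hi)

sample = np.array([1.0, 2.0, 3.0, 4.0])
```

At &tau; = 0.5 the expectile is just the mean; as &tau; grows it moves into the right tail, which is why expectiles can serve as tail-risk measures alongside VaR and expected shortfall.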

]]>Risks doi: 10.3390/risks6020060

Authors: Louis Eeckhoudt Anna Maria Fiori Emanuela Rosazza Gianin

In this paper we analyze insurance demand when the utility function depends both upon final wealth and the level of losses or gains relative to a reference point. Besides some comparative statics results, we discuss the links with first-order risk aversion, with the Omega measure, and with a tendency to over-insure modest risks that has been extensively documented in real insurance markets.

]]>Risks doi: 10.3390/risks6020059

Authors: Shuanming Li Yi Lu

This paper studies the moments and the distribution of the aggregate discounted claims (ADCs) in a Markovian environment, where the claim arrivals, claim amounts, and forces of interest (for discounting) are influenced by an underlying Markov process. Specifically, we assume that claims occur according to a Markovian arrival process (MAP). The paper shows that the vector of joint Laplace transforms of the ADC occurring in each state of the environment process by any specific time satisfies a matrix-form first-order partial differential equation, through which a recursive formula is derived for the moments of the ADC occurring in certain states (a subset). We also study two types of covariances of the ADC occurring in any two subsets of the state space and with two different time lengths. The distribution of the ADC occurring in certain states by any specific time is also investigated. Numerical results are also presented for a two-state Markov-modulated model case.

]]>Risks doi: 10.3390/risks6020058

Authors: Roman V. Ivanov

This paper considers the risks of an investment portfolio consisting of distributed mortgages and sold European call options. It is assumed that the stream of credit payments could fall by a jump. The time of the jump is modeled by the exponential distribution. We suggest that the returns on stock are variance-gamma distributed. The value at risk, the expected shortfall and the entropic risk measure for this portfolio are calculated in closed form. The obtained formulas exploit the values of generalized hypergeometric functions.
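The variance-gamma distribution can be represented as Brownian motion with drift run on a gamma clock, which makes a Monte Carlo cross-check of VaR and expected shortfall straightforward. The parameters below are illustrative choices of our own; the paper itself works with closed-form expressions via generalized hypergeometric functions:

```python
import numpy as np

def vg_sample(n, theta=-0.1, sigma=0.2, nu=0.5, t=1.0, seed=None):
    """Variance-gamma increments via a gamma time change:
    X = theta * G + sigma * sqrt(G) * N, with G ~ Gamma(t/nu, scale=nu)."""
    rng = np.random.default_rng(seed)
    g = rng.gamma(shape=t / nu, scale=nu, size=n)
    return theta * g + sigma * np.sqrt(g) * rng.standard_normal(n)

def var_es(losses, alpha=0.99):
    """Empirical value at risk (quantile) and expected shortfall (tail mean)."""
    q = np.quantile(losses, alpha)
    return q, losses[losses >= q].mean()

losses = -vg_sample(100_000, seed=42)  # losses are negated returns
v, es = var_es(losses, 0.99)
```

By construction the expected shortfall, being a tail average beyond the quantile, is never below the VaR at the same level.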

]]>Risks doi: 10.3390/risks6020057

Authors: Tatjana Miljkovic Daniel Fernández

We review two complementary mixture-based clustering approaches for modeling unobserved heterogeneity in an insurance portfolio: the generalized linear mixed cluster-weighted model (CWM) and mixture-based clustering for an ordered stereotype model (OSM). The latter is for the modeling of ordinal variables, and the former is for modeling losses as a function of mixed-type covariates. The article extends the idea of mixture modeling to a multivariate classification for the purpose of testing unobserved heterogeneity in an insurance portfolio. The application of both methods is illustrated on a well-known French automobile portfolio, in which the model fitting is performed using the expectation-maximization (EM) algorithm. Our findings show that these mixture-based clustering methods can be used to further test unobserved heterogeneity in an insurance portfolio and as such may be considered in insurance pricing, underwriting, and risk management.
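As a minimal illustration of EM-based mixture clustering, the snippet below fits a plain Gaussian mixture to synthetic two-group log-severities. This is a deliberately simplified stand-in, not the CWM or OSM models of the paper, and the data are invented:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# synthetic log-severities from two latent risk groups (illustrative only)
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(6.0, 0.5, 300),
                    rng.normal(8.5, 0.7, 200)]).reshape(-1, 1)

gm = GaussianMixture(n_components=2, random_state=0).fit(x)  # fitted via EM
means = np.sort(gm.means_.ravel())  # recovered component means
labels = gm.predict(x)              # hard cluster assignments
```

Recovering the two latent component means from the pooled sample is the same "testing unobserved heterogeneity" idea the abstract describes, in its simplest possible form.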

]]>Risks doi: 10.3390/risks6020056

Authors: Fred Espen Benth Luca Di Persio Silvia Lavagnini

We model the logarithm of the spot price of electricity with a normal inverse Gaussian (NIG) process and the wind speed and wind power production with two Ornstein&ndash;Uhlenbeck processes. In order to reproduce the correlation between the spot price and the wind power production, namely between a pure jump process and a continuous path process, respectively, we replace the small jumps of the NIG process by a Brownian term. We then apply our models to two different problems: first, to study from the stochastic point of view the income from a wind power plant, as the expected value of the product between the electricity spot price and the amount of energy produced; then, to construct and price a European put-type quanto option in the wind energy markets that allows the buyer to hedge against low prices and low wind power production in the plant. Calibration of the proposed models and related price formulas is also provided, according to specific datasets.

]]>Risks doi: 10.3390/risks6020055

Authors: Khaled Halteh Kuldeep Kumar Adrian Gepp

Credit risk is a critical issue that affects banks and companies on a global scale. Possessing the ability to accurately predict the level of credit risk has the potential to help the lender and borrower. This is achieved by limiting the number of loans provided to borrowers with poor financial health, thereby reducing the number of failed businesses, and, in effect, preventing economies from collapsing. This paper uses state-of-the-art stochastic models, namely decision trees, random forests, and stochastic gradient boosting, to add to the current literature on credit-risk modelling. The Australian mining industry has been selected to test our methodology. Mining in Australia generates around $138 billion annually, making up more than half of the total goods and services. This paper uses publicly-available financial data from 750 risky and non-risky Australian mining companies as variables in our models. Our results indicate that stochastic gradient boosting was the superior model at correctly classifying the good and bad credit-rated companies within the mining sector. Our model showed that &lsquo;Property, Plant, &amp; Equipment (PPE) turnover&rsquo;, &lsquo;Invested Capital Turnover&rsquo;, and &lsquo;Price over Earnings Ratio (PER)&rsquo; were the variables with the best explanatory power pertaining to predicting credit risk in the Australian mining sector.
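A minimal version of such a classification pipeline is shown below, with synthetic features standing in for the financial ratios used in the paper (the real data are company accounts, not `make_classification` output):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

# synthetic stand-in for 750 companies with 10 financial-ratio features
X, y = make_classification(n_samples=750, n_features=10,
                           n_informative=5, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# stochastic gradient boosting: trees fitted sequentially to residuals
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)
importances = clf.feature_importances_  # analogue of the paper's variable ranking
```

The `feature_importances_` vector plays the role of the explanatory-power ranking that singled out PPE turnover, invested capital turnover, and PER in the paper.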

]]>Risks doi: 10.3390/risks6020054

Authors: Rüdiger Frey Juraj Hledik

In this paper, we study the implications of diversification in the asset portfolios of banks for financial stability and systemic risk. Adding to the existing literature, we analyse this issue in a network model of the interbank market. We carry out a simulation study that determines the probability of a systemic crisis in the banking network as a function of both the level of diversification, and the connectivity and structure of the financial network. In contrast to earlier studies we find that diversification at the level of individual banks may be beneficial for financial stability even if it does lead to a higher asset return correlation across banks.

]]>Risks doi: 10.3390/risks6020053

Authors: Stanislaus Maier-Paape Qiji Jim Zhu

Utility and risk are two often competing measures of investment success. We show that the efficient trade-off between these two measures for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz (1959) and the capital market pricing model of Sharpe (1964) are special cases of our general framework, obtained when the risk measure is taken to be the standard deviation and the utility function is the identity mapping. Using our general framework, we also recover and extend the results of Rockafellar et al. (2006), which were already an extension of the capital market pricing model allowing the use of more general deviation measures. This generalized capital asset pricing model also applies when, e.g., an approximation of the maximum drawdown is considered as a risk measure. Furthermore, the consideration of a general utility function allows one to go beyond the &ldquo;additive&rdquo; performance measure to a &ldquo;multiplicative&rdquo; one of cumulative returns by using the log utility. As a result, the growth optimal portfolio theory of Lintner (1965) and the leverage space portfolio theory of Vince (2009) can also be understood and enhanced under our general framework. Thus, this general framework allows a unification of several important existing portfolio theories and goes far beyond them. For simplicity of presentation, we phrase everything for a finite underlying probability space and a one-period market model, but generalizations to more complex structures are straightforward.

Risks doi: 10.3390/risks6020052

Authors: Rasika Yatigammana Shelton Peiris Richard Gerlach David Edmund Allen

The direction of price movements is analysed under an ordered probit framework, recognising the importance of accounting for discreteness in price changes. Extending the work of Hausman et al. (1992) and Yang and Parwada (2012), this paper focuses on improving the forecast performance of the model while adding a more practical perspective through enhanced flexibility. This is achieved by extending the existing framework to generate short-term, multi-period-ahead forecasts for better decision making, whilst accounting for the serial dependence structure. This approach enhances the flexibility and adaptability of the model to future price changes, particularly targeting risk minimisation. Empirical evidence is provided for seven stocks listed on the Australian Securities Exchange (ASX). The prediction success varies between 78 and 91 per cent for in-sample and out-of-sample forecasts over both the short and the long term.
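
The core of an ordered probit is mapping a latent continuous variable into discrete price-move categories through estimated thresholds. A self-contained sketch of the category probabilities (the coefficient and cut-points below are invented for illustration, not estimates from the paper):

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def ordered_probit_probs(x, beta, cuts):
    """Category probabilities for an ordered probit: the latent variable is
    y* = beta * x + eps with eps ~ N(0, 1); category k is observed when y*
    falls between consecutive thresholds in `cuts` (e.g. price down /
    unchanged / up for two cut-points)."""
    z = beta * x
    cdfs = [phi(c - z) for c in cuts]
    probs = [cdfs[0]]
    probs += [cdfs[k] - cdfs[k - 1] for k in range(1, len(cdfs))]
    probs.append(1.0 - cdfs[-1])
    return probs

# probability of a down / unchanged / up move given a predictor value
p_down, p_flat, p_up = ordered_probit_probs(x=0.3, beta=1.2, cuts=(-0.5, 0.5))
```

Multi-period-ahead forecasts can then be built by feeding each step's predicted category (or its probability distribution) back into the predictors for the next step.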

Risks doi: 10.3390/risks6020051

Authors: Timothy Hillman Nan Zhang Zhuo Jin

We extend an existing numerical model (Grasselli (2011)) for valuing a real option to invest in a capital project in an incomplete market with a finite time horizon. In doing so, we include two separate effects: the possibility that the project value is partly describable by a jump-diffusion process, and the incorporation of a time-dependent investor utility function that takes the effect of inflation into account. We adopt a discrete approximation to the jump process whose parameters are restricted so as to preserve the drift and the volatility of the project-value process that it modifies. Controlling for these low-order effects allows the higher-order effects to be considered in isolation. Our simulated results demonstrate that the inclusion of the jump process tends to decrease the value of the option and to expand the circumstances under which it should be exercised. They also demonstrate that an appropriate choice of the time-dependent investor utility function yields more reasonable predictions of investor behaviour regarding the decision to exercise the option than would occur otherwise.
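
The moment-preserving restriction described here can be sketched as follows: the diffusion drift and variance are reduced to compensate for the jump component, so that the total drift and volatility per unit time match the original process. All parameter values are illustrative assumptions:

```python
import math
import random

mu, sigma = 0.08, 0.20                      # target drift and volatility (assumed)
lam, jump_mean, jump_sd = 0.5, -0.05, 0.10  # jump intensity and jump-size moments (assumed)

# compensate the diffusion part so total first and second moments are preserved
mu_d = mu - lam * jump_mean
var_d = sigma ** 2 - lam * (jump_mean ** 2 + jump_sd ** 2)
sigma_d = math.sqrt(var_d)                  # requires var_d > 0

def log_increment(dt, rng):
    """One log-increment of the compensated jump-diffusion; the jump is a
    Bernoulli(lam * dt) event, a first-order approximation to a Poisson
    arrival over a small step dt."""
    dy = mu_d * dt + sigma_d * math.sqrt(dt) * rng.gauss(0.0, 1.0)
    if rng.random() < lam * dt:
        dy += rng.gauss(jump_mean, jump_sd)
    return dy

# per-unit-time moment identities that the restriction enforces (to first order in dt)
drift_total = mu_d + lam * jump_mean
var_total = sigma_d ** 2 + lam * (jump_mean ** 2 + jump_sd ** 2)
```

The point of the restriction is that `drift_total` and `var_total` equal the target `mu` and `sigma ** 2`, so any change in option value comes from the higher-order (jump) features rather than a shifted mean or variance.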

Risks doi: 10.3390/risks6020050

Authors: Gian Paolo Clemente

The Solvency II Standard Formula provides a methodology to recognise the risk-mitigating impact of excess-of-loss reinsurance treaties in premium risk modelling. We analyse the proposals of both Quantitative Impact Study 5 and the Commission Delegated Regulation, highlighting some inconsistencies, and try to bridge the main pitfalls of both versions. To this aim, we propose a revision of the non-proportional adjustment factor in order to measure the effect of excess-of-loss treaties on premium risk volatility. In this way, the capital requirement can be easily assessed. As numerical results show, this proposal appears to be a feasible and much more consistent approach to describing the effect of non-proportional reinsurance on premium risk.

Risks doi: 10.3390/risks6020049

Authors: Wei Wei

There are extensive studies of allocation problems in the fields of insurance and finance. We observe that these studies, although involving different methodologies, share some inherent commonalities. In this paper, we develop a new framework for these studies with the tool of arrangement increasing functions. This framework unifies many existing studies and provides shortcuts to developing new results.

Risks doi: 10.3390/risks6020048

Authors: Alessandro Milazzo Elena Vigna

We study the gap between the state pension provided by the Italian pension system pre-Dini reform and post-Dini reform. The goal is to fill the gap between the old and the new pension by joining a defined contribution pension scheme and adopting an optimal investment strategy that is target-based. We find that it is possible to cover, at least partially, this gap with the additional income of the pension scheme, especially in the presence of late retirement and in the presence of stagnant careers. Workers with dynamic careers and workers who retire early are those who are most penalised by the reform. Results are intuitive and in line with previous studies on the subject.

Risks doi: 10.3390/risks6020047

Authors: Bertrand K. Hassani Alexis Renaudin

According to the latest proposals of the Basel Committee on Banking Supervision, banks or insurance companies under the advanced measurement approach (AMA) must use four different sources of information to assess their operational risk capital requirement. The fourth comprises 'business environment and internal control factors', i.e., qualitative criteria, whereas the three main quantitative sources available to banks for building the loss distribution are internal loss data, external loss data and scenario analysis. This paper proposes an innovative methodology to bring together these three quantitative sources in the loss distribution approach (LDA) framework through a Bayesian strategy. The integration of the different elements is performed in two steps to ensure that an internal data-driven model is obtained. In the first step, scenarios inform the prior distributions and external data inform the likelihood component of the posterior function. In the second step, the initial posterior function is used as the prior distribution and the internal loss data inform the likelihood component of the second posterior function. This latter posterior function enables the estimation of the parameters of the severity distribution selected to represent the operational risk event types.
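
The two-step structure can be illustrated with a deliberately simplified conjugate example (a Gamma prior on a Poisson loss frequency) rather than the full severity-distribution machinery of the paper. The counts and prior parameters below are invented for illustration:

```python
def gamma_poisson_update(a, b, counts):
    """Conjugate update: Gamma(a, b) prior on a Poisson rate, with observed
    yearly event counts as the likelihood (b is the rate parameter)."""
    return a + sum(counts), b + len(counts)

# Step 0: scenario analysis informs the prior on the annual loss frequency
a0, b0 = 4.0, 2.0                                    # prior mean 2 events/year (assumed)

# Step 1: external loss data form the likelihood of the first posterior
a1, b1 = gamma_poisson_update(a0, b0, [3, 1, 4, 2])  # external counts (assumed)

# Step 2: the first posterior becomes the prior; internal loss data form
# the likelihood, so the final estimate is internal-data-driven
a2, b2 = gamma_poisson_update(a1, b1, [1, 2, 1])     # internal counts (assumed)

posterior_mean = a2 / b2                             # posterior frequency estimate
```

The same chaining logic applies with the paper's severity parameters in place of the Poisson rate; the conjugate pair is only used here to keep the two-step updating explicit.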

Risks doi: 10.3390/risks6020046

Authors: Martin Tegnér Rolf Poulsen

It is impossible to discriminate between the commonly used stochastic volatility models of Heston, log-normal, and 3-over-2 type on the basis of exponentially weighted averages of daily returns, even though it appears possible at first sight. However, with a 5-min sampling frequency, the models can be differentiated, and the empirical evidence overwhelmingly favours a fast mean-reverting log-normal model.

Risks doi: 10.3390/risks6020045

Authors: Marco Bee Luca Trapin

One of the key components of financial risk management is risk measurement. This typically requires modeling, estimating and forecasting tail-related quantities of the conditional distribution of asset returns. Recent advances in the financial econometrics literature have developed several models based on Extreme Value Theory (EVT) to carry out these tasks. The purpose of this paper is to review these methods.
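
As a flavour of these EVT methods, the sketch below applies the unconditional peaks-over-threshold approach: fit a Generalised Pareto distribution to the excesses over a high threshold and invert the tail formula to obtain a quantile (VaR). It uses a crude method-of-moments fit on toy Pareto data; the reviewed models combine this idea with conditional volatility dynamics and likelihood-based estimation:

```python
import random
import statistics

def gpd_pot_var(losses, u, p):
    """Peaks-over-threshold VaR estimate: fit a Generalised Pareto
    distribution GPD(xi, sigma) to the excesses over threshold u by the
    method of moments, then invert the tail estimator
    P(X > x) ~ (n_u / n) * (1 + xi * (x - u) / sigma) ** (-1 / xi)."""
    excesses = [x - u for x in losses if x > u]
    m = statistics.mean(excesses)
    v = statistics.variance(excesses)
    xi = 0.5 * (1.0 - m * m / v)          # moment-based shape estimate
    sigma = m * (1.0 - xi)                # moment-based scale estimate
    tail_frac = len(excesses) / len(losses)
    return u + (sigma / xi) * (((1.0 - p) / tail_frac) ** (-xi) - 1.0)

random.seed(42)
losses = [random.paretovariate(3.0) for _ in range(5000)]  # heavy-tailed toy losses
var_99 = gpd_pot_var(losses, u=2.0, p=0.99)  # true 99% quantile is 100 ** (1 / 3)
```

For conditional (time-varying) risk measures, the same tail fit is typically applied to the standardised residuals of a GARCH-type filter, in the spirit of McNeil and Frey (2000).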
