Risks doi: 10.3390/risks7010034

Authors: Zbigniew Palmowski Łukasz Stettner Anna Sulima

We study a portfolio selection problem in a continuous-time Itô–Markov additive market with prices of financial assets described by Markov additive processes that combine Lévy processes and regime switching models. Thus, the model takes into account two sources of risk: the jump diffusion risk and the regime switching risk. For this reason, the market is incomplete. We complete the market by enlarging it with a set of Markovian jump securities, Markovian power-jump securities and impulse regime switching securities. Moreover, we give conditions under which the market is asymptotic-arbitrage-free. We solve the portfolio selection problem in the Itô–Markov additive market for the power utility and the logarithmic utility.

Risks doi: 10.3390/risks7010033

Authors: Andrea Nigri Susanna Levantesi Mario Marino Salvatore Scognamiglio Francesca Perla

In the field of mortality, the Lee–Carter based approach can be considered the milestone for forecasting mortality rates among stochastic models. We could define a "Lee–Carter model family" that embraces all developments of this model, including its first formulation (1992), which remains the benchmark for comparing the performance of future models. In the Lee–Carter model, the κ_t parameter, describing the mortality trend over time, plays an important role in determining future mortality behavior. The traditional ARIMA process usually used to model κ_t shows evident limitations in describing the future mortality shape. In the forecasting phase, a more plausible approach is needed to capture a nonlinear shape of the projected mortality rates. Therefore, we propose an alternative to the ARIMA processes based on a deep learning technique. More precisely, in order to capture the pattern of the κ_t series over time more accurately, we apply a Recurrent Neural Network with a Long Short-Term Memory architecture and integrate it into the Lee–Carter model to improve its predictive capacity. The proposed approach provides significant performance in terms of predictive accuracy and also allows one to avoid the a priori selection of time chunks. Indeed, it is a common practice among academics to delete the periods in which the noise is overflowing or the data quality is insufficient. The strength of the Long Short-Term Memory network lies in its ability to treat this noise and adequately reproduce it in the forecasted trend, thanks to an architecture able to take significant long-term patterns into account.
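The ARIMA baseline criticized in this abstract is, in its most common Lee–Carter form, a random walk with drift for κ_t, whose forecasts are linear by construction. A minimal sketch of that baseline (the κ_t series below is hypothetical; the LSTM alternative is not reproduced):

```python
import numpy as np

def rwd_forecast(kappa, horizon):
    """Random-walk-with-drift forecast of the Lee-Carter kappa_t index.

    This ARIMA(0,1,0)-with-drift projection is the linear baseline the
    deep learning approach is meant to improve upon."""
    kappa = np.asarray(kappa, dtype=float)
    drift = np.mean(np.diff(kappa))               # average yearly change
    return kappa[-1] + drift * np.arange(1, horizon + 1)

# Hypothetical, roughly linearly declining mortality index with noise
rng = np.random.default_rng(0)
kappa = -0.5 * np.arange(50) + rng.normal(0.0, 0.2, 50)
fc = rwd_forecast(kappa, 10)
```

Because the projection is an affine function of the horizon, it cannot bend to a nonlinear mortality trend, which is exactly the limitation the LSTM architecture is meant to address.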

Risks doi: 10.3390/risks7010032

Authors: Zhuo Jin Zhixin Yang Quan Yuan

This paper studies the optimal investment and consumption strategies in a two-asset model. A dynamic Value-at-Risk constraint is imposed to manage the wealth process. By using Value at Risk as the risk measure during the investment horizon, the decision maker can dynamically monitor the exposed risk and quantify the maximum expected loss over a finite horizon at a given confidence level. In addition, the decision maker has to filter the key economic factors to make decisions. Considering the cost of filtering the factors, the decision maker aims to maximize the utility of consumption over a finite horizon. By using the Kalman filter, a partially observed system is converted to a completely observed one. However, due to the cost of information processing, the decision maker fails to process the information in an arbitrarily rational manner and can only make decisions on the basis of the limited observed signals. A genetic algorithm was developed to find the optimal investment and consumption strategies and the observation strength. Numerical simulation results are provided to illustrate the performance of the algorithm.
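The conversion of the partially observed system into a completely observed one relies on the Kalman filter. A minimal scalar sketch under hypothetical AR(1) factor dynamics and noise levels (not the paper's actual model):

```python
import numpy as np

def kalman_1d(y, a=0.95, q=0.01, r=0.25, m0=0.0, p0=1.0):
    """Scalar Kalman filter for the state-space model
    x_t = a*x_{t-1} + w_t,  y_t = x_t + v_t,  w ~ N(0,q), v ~ N(0,r).
    Returns the filtered means E[x_t | y_1..y_t]."""
    m, p = m0, p0
    means = []
    for obs in y:
        m, p = a * m, a * a * p + q                # predict
        k = p / (p + r)                            # Kalman gain
        m, p = m + k * (obs - m), (1 - k) * p      # update with observation
        means.append(m)
    return np.array(means)

# Hypothetical latent economic factor observed in noise
rng = np.random.default_rng(1)
x, xs, ys = 0.0, [], []
for _ in range(200):
    x = 0.95 * x + rng.normal(0.0, 0.1)
    xs.append(x)
    ys.append(x + rng.normal(0.0, 0.5))
est = kalman_1d(ys)
```

The filtered means replace the unobservable factor in the completely observed problem; in this sketch they track the latent state with far lower error than the raw observations.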

Risks doi: 10.3390/risks7010031

Authors: Wanbing Zhang Sisi Zhang Peibiao Zhao

Value at Risk (VaR) is used to illustrate the maximum potential loss under a given confidence level, but it is a single risk indicator that ignores any information about income. The present paper generalizes one-dimensional VaR to two-dimensional VaR with income-risk double indicators. We first construct a double-VaR with (μ, σ²) (or (μ, VaR²)) indicators, and deduce the joint confidence region of (μ, σ²) (or (μ, VaR²)) by virtue of the two-dimensional likelihood ratio method. Finally, an example presenting an empirical analysis of the two double-VaR models is given.

Risks doi: 10.3390/risks7010030

Authors: Fabien Le Floc’h Cornelis W. Oosterlee

This paper explores the stochastic collocation technique, applied on a monotonic spline, as an arbitrage-free and model-free interpolation of implied volatilities. We explore various spline formulations, including B-spline representations. We explain how to calibrate the different representations against market option prices, detail how to smooth out the market quotes, and choose a proper initial guess. The technique is then applied to concrete market options and the stability of the different approaches is analyzed. Finally, we consider a challenging example where convex spline interpolations lead to oscillations in the implied volatility, and compare the spline collocation results with those obtained through the arbitrage-free interpolation technique of Andreasen and Huge.

Risks doi: 10.3390/risks7010029

Authors: Martin Leo Suneel Sharma K. Maddulety

There is an increasing influence of machine learning in business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained more prominence, and there has been a constant focus on how risks are detected, measured, reported and managed. Considerable research in academia and industry has focused on the developments in banking and risk management and the current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review has shown that the application of machine learning in the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored; however, the level of research does not appear commensurate with the current industry focus on both risk management and machine learning. A large number of areas remain in bank risk management that could significantly benefit from the study of how machine learning can be applied to address specific problems.

Risks doi: 10.3390/risks7010028

Authors: Shi Chen Jyh-Horng Lin Wenyu Yao Fu-Wei Huang

In this paper, we develop a contingent claim model to evaluate the equity, default risk, and efficiency gain/loss from managerial overconfidence of a shadow-banking life insurer under the purchase of distressed assets by the government. Our paper focuses on managerial overconfidence where the chief executive officer (CEO) overestimates the returns on investment. The investment market faced by the life insurer is imperfectly competitive, and investment is core to the provision of profit-sharing life insurance policies. We show that CEO overconfidence raises the default risk in the life insurer's equity returns, thereby adversely affecting financial stability. Either shadow-banking involvement or a government bailout attenuates this unfavorable effect. There is an efficiency gain from CEO overconfidence to investment. A government bailout helps to reduce the life insurer's default risk, but simultaneously reduces the efficiency gain from CEO overconfidence. Our results contribute to the managerial overconfidence literature by linking insurer shadow-banking involvement and government bailouts, in particular during a financial crisis.

Risks doi: 10.3390/risks7010027

Authors: Apostolos Bozikas Georgios Pitselis

In this paper, we propose a credible regression approach with random coefficients to model and forecast the mortality dynamics of a given population with limited data. Age-specific mortality rates are modelled and extrapolation methods are utilized to estimate future mortality rates. The results on Greek mortality data indicate that credibility regression contributed to more accurate forecasts than those produced from the Lee–Carter and Cairns–Blake–Dowd models. An application on pricing insurance-related products is also provided.

Risks doi: 10.3390/risks7010026

Authors: Susanna Levantesi Virginia Pizzorusso

Estimation of future mortality rates still plays a central role among life insurers in pricing their products and managing longevity risk. In the literature on mortality modeling, a large number of stochastic models have been proposed, most of them forecasting future mortality rates by extrapolating one or more latent factors. The abundance of proposed models shows that forecasting future mortality from historical trends is non-trivial. Following the idea proposed in Deprez et al. (2017), we use machine learning algorithms, able to catch patterns that are not commonly identifiable, to calibrate a parameter (the machine learning estimator), improving the goodness of fit of standard stochastic mortality models. The machine learning estimator is then forecasted according to the Lee–Carter framework, allowing one to obtain a higher forecasting quality from the standard stochastic models. Out-of-sample forecasts are provided to verify the model accuracy.

Risks doi: 10.3390/risks7010025

Authors: Phillip J. Monin

We introduce a financial stress index that was developed by the Office of Financial Research (OFR FSI) and detail its purpose, construction, interpretation, and use in financial market monitoring. The index employs a novel and flexible methodology using daily data from global financial markets. Analysis for the 2000–2018 time period is presented. Using a logistic regression framework and dates of government intervention in the financial system as a proxy for stress events, we find that the OFR FSI performs well in identifying systemic financial stress. In addition, we find that the OFR FSI leads the Chicago Fed National Activity Index in a Granger causality analysis, suggesting that increases in financial stress help predict decreases in economic activity.

Risks doi: 10.3390/risks7010023

Authors: Sel Ly Kim-Hung Pho Sal Ly Wing-Keung Wong

Determining the distributions of functions of random variables is one of the most important problems in statistics and applied mathematics because distributions of functions have a wide range of applications in numerous areas in economics, finance, risk management, science, and others. However, most studies only focus on the distribution of independent variables or on some common distributions, such as multivariate normal joint distributions, for functions of dependent random variables. To bridge this gap in the literature, in this paper we first derive general formulas to determine both the density and the distribution of the product of two or more random variables via copulas, so as to capture the dependence structures among the variables. We then propose an approach combining a Monte Carlo algorithm, a graphical approach, and numerical analysis to efficiently estimate both the density and the distribution. We illustrate our approach by examining the shapes and behaviors of both the density and the distribution of the product of two log-normal random variables under several different copulas, including the Gaussian, Student-t, Clayton, Gumbel, Frank, and Joe copulas, and estimate some common measures, including Kendall's coefficient, mean, median, standard deviation, skewness, and kurtosis, for the distributions. We find that different types of copulas affect the behavior of the distributions differently. In addition, we discuss the behaviors under all of the copulas above with the same Kendall's coefficient. Our results are the foundation of any further study that relies on the density and cumulative probability functions of the product of two or more random variables. Thus, the theory developed in this paper is useful for academics, practitioners, and policy makers.
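A Monte Carlo sketch of the copula-based estimation for the product of two log-normal variables, here under a Gaussian copula with a hypothetical parameter ρ = 0.5 and standard log-normal marginals (the paper's other copulas and measures are not reproduced):

```python
import numpy as np
from scipy import stats

def product_sample_gaussian_copula(n, rho, mu=(0.0, 0.0), sigma=(1.0, 1.0), seed=0):
    """Sample the product X*Y of two log-normal variables whose dependence
    structure is a Gaussian copula with parameter rho."""
    rng = np.random.default_rng(seed)
    cov = [[1.0, rho], [rho, 1.0]]
    z = rng.multivariate_normal([0.0, 0.0], cov, size=n)
    u = stats.norm.cdf(z)                              # copula sample on [0,1]^2
    x = stats.lognorm.ppf(u[:, 0], s=sigma[0], scale=np.exp(mu[0]))
    y = stats.lognorm.ppf(u[:, 1], s=sigma[1], scale=np.exp(mu[1]))
    return x * y

s = product_sample_gaussian_copula(100_000, rho=0.5)
# For the Gaussian copula, Kendall's tau = (2/pi) * arcsin(rho)
tau = (2.0 / np.pi) * np.arcsin(0.5)
```

With standard log-normal marginals the Gaussian copula case reduces to a bivariate log-normal, so the product's median is exp(μ₁ + μ₂) = 1, which gives a convenient sanity check on the simulation.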

Risks doi: 10.3390/risks7010024

Authors: Ahsan Nawaz Ahsan Waqar Syyed Adnan Raheel Shah Muhammad Sajid Muhammad Irslan Khalid

Risk management is a comparatively new field and there is no core system of risk management in the construction industries of developing countries. In Pakistan, construction is an extremely risk-prone industry that lacks a good reputation for handling risk. However, the industry is gradually giving risk management more importance as a result of increased competition and construction activity. For this purpose, a survey-based study has been conducted which aims to investigate the risk management practices used in construction projects in Pakistan. To achieve this objective, data were collected from 22 contractor firms working on 100 diverse projects. The analysis indicates that risk management has been implemented at a low level in the local environment. The results also disclose that there is a high degree of correlation between effective risk management and project success. The findings reveal the importance of risk management techniques, their usage and implications, and the effect of these techniques on the success of construction projects from the contractor's perspective, thus convincing the key participants of projects of the value of risk management.

Risks doi: 10.3390/risks7010022

Authors: Han Li Colin O’Hare

Extrapolative methods are among the most commonly adopted forecasting approaches in the literature on projecting future mortality rates. It can be argued that there are two types of mortality models using this approach. The first extracts patterns in the age, time and cohort dimensions, either in a deterministic or a stochastic fashion. The second uses non-parametric smoothing techniques to model mortality and thus places no explicit constraints on the model. We argue that, from a forecasting point of view, the main difference between the two types of models is whether they treat recent and historical information equally in the projection process. In this paper, we compare the forecasting performance of the two types of models using Great Britain male mortality data from 1950 to 2016. We also conduct a robustness test to see how sensitive the forecasts are to changes in the length of the historical data used to calibrate the models. The main conclusion of the study is that more recent information should be given more weight in the forecasting process, as it has greater predictive power than historical information.

Risks doi: 10.3390/risks7010021

Authors: Mariarosaria Coppola Maria Russolillo Rosaria Simone

The management of national social security systems is being challenged more and more by the rapid ageing of the population, especially in industrialized countries. In order to pursue pension system sustainability, several countries in Europe are setting up pension reforms linking the retirement age and/or benefits to life expectancy. In this context, the accurate modelling and projection of mortality rates and life expectancy play a central role and represent issues of great interest in the recent literature. Our study refers to the Italian mortality experience and considers an indexing mechanism based on expected residual life to adjust the retirement age and keep costs at an expected budgeted level, in the spirit of sharing the longevity risk between social security systems and retirees. In order to combine the fitting and projection performance of selected stochastic mortality models, a model-assembling technique is applied to face uncertainty in model selection, while accounting for uncertainty of estimation as well. The resulting proposal is an averaged model that is suitable for discussing the gender gap in longevity risk and its alleged narrowing over time.

Risks doi: 10.3390/risks7010020

Authors: Diby François Kassi Dilesha Nawadali Rathnayake Pierre Axel Louembe Ning Ding

This study examines the effect of market risk on the financial performance of 31 non-financial companies listed on the Casablanca Stock Exchange (CSE) over the period 2000–2016. We utilized three alternative variables to assess financial performance, namely, the return on assets, the return on equity and the profit margin. We used the degree of financial leverage, the book-to-market ratio, and the gearing ratio as the indicators of market risk. Then, we employed the pooled OLS model, the fixed effects model, the random effects model, and the difference-GMM and system-GMM models. The results show that the different measures of market risk have significant negative influences on the companies' financial performance. The elasticities are greater for the degree of financial leverage than for the book-to-market ratio and the gearing ratio. In most cases, the firm's age, the cash holdings ratio, the firm's size, the debt-to-assets ratio, and the tangibility ratio have positive effects on financial performance, whereas the debt-to-income ratio and the stock turnover hurt the performance of these non-financial companies. Therefore, decision-makers and managers should mitigate market risk through appropriate risk management strategies, such as derivatives and insurance techniques.

Risks doi: 10.3390/risks7010019

Authors: Hui Ye Anthony Bellotti

Based on a rich dataset of recoveries donated by a debt collection business, recovery rates for non-performing loans taken from a single European country are modelled using linear regression, linear regression with Lasso, beta regression and inflated beta regression. We also propose a two-stage model: a beta mixture model combined with a logistic regression model. The proposed model allowed us to model the multimodal distribution we found for these recovery rates. All models were built using loan characteristics, default data and collections data prior to purchase by the debt collection business. The intended use of the models was to estimate future recovery rates for improved risk assessment, capital requirement calculations and bad debt management. The models were compared using a range of quantitative performance measures under K-fold cross-validation. Among all the models, we found that the proposed two-stage beta mixture model performs best.

Risks doi: 10.3390/risks7010018

Authors: Florin Avram Danijel Grahovac Ceren Vardar-Acar

As is well known, the benefit of restricting to Lévy processes without positive jumps is the "W, Z scale functions paradigm", by which knowledge of the scale functions W, Z extends immediately to other risk control problems. The same is largely true for strong Markov processes X_t, with the notable distinctions that (a) it is more convenient to use as a "basis" the differential exit functions ν, δ, and that (b) it is not yet known how to compute ν, δ or W, Z beyond the Lévy, diffusion, and a few other cases. The unifying framework outlined in this paper suggests, however, via an example, that the spectrally negative Markov and Lévy cases are very similar (except for the level of work involved in computing the basic functions ν, δ). We illustrate the potential of the unified framework by introducing a new objective (33) for the optimization of dividends, inspired by the de Finetti problem of maximizing expected discounted cumulative dividends until ruin, where we replace ruin with an optimally chosen Azéma–Yor/generalized draw-down/regret/trailing stopping time. This is defined as a hitting time of the "draw-down" process Y_t = sup_{0≤s≤t} X_s − X_t obtained by reflecting X_t at its maximum. This new variational problem has been solved in a parallel paper.

Risks doi: 10.3390/risks7010017

Authors: Søren Asmussen Patrick J. Laub Hailiang Yang

Phase-type (PH) distributions are defined as the distributions of lifetimes of finite continuous-time Markov processes. Their traditional applications are in queueing, insurance risk, and reliability, but more recently also in finance and, though to a lesser extent, in life and health insurance. The advantage is that PH distributions form a dense class and that problems having explicit solutions for exponential distributions typically become computationally tractable under PH assumptions. In the first part of this paper, the fitting of PH distributions to human lifetimes is considered. The class of generalized Coxian distributions is given special attention. As part of this, some new software is developed. In the second part, the pricing of life insurance products such as the guaranteed minimum death benefit and the high-water benefit is treated for the case where the lifetime distribution is approximated by a PH distribution and the underlying asset price process is described by a jump diffusion with PH jumps. The expressions are typically explicit in terms of matrix-exponentials involving two matrices closely related to the Wiener–Hopf factorization, for which a Lévy process version has recently been developed for a PH horizon. The computational power of the approach is illustrated via a number of numerical examples.

Risks doi: 10.3390/risks7010016

Authors: Shuaiqiang Liu Cornelis W. Oosterlee Sander M. Bohte

This paper proposes a data-driven approach, by means of an Artificial Neural Network (ANN), to value financial options and to calculate implied volatilities with the aim of accelerating the corresponding numerical methods. With ANNs being universal function approximators, this method trains an optimized ANN on a data set generated by a sophisticated financial model, and runs the trained ANN as an agent of the original solver in a fast and efficient way. We test this approach on three different types of solvers, including the analytic solution for the Black-Scholes equation, the COS method for the Heston stochastic volatility model and Brent's iterative root-finding method for the calculation of implied volatilities. The numerical results show that the ANN solver can reduce the computing time significantly.
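The first of the three solvers, the Black-Scholes analytic formula, also supplies the labelled training data for the ANN. A sketch of that data-generation step (the parameter ranges are illustrative, not the paper's; the network itself is not reproduced):

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes European call price: the analytic solver the ANN learns."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Training set: random parameter draws mapped to prices (features -> labels)
rng = np.random.default_rng(42)
n = 10_000
S = rng.uniform(50.0, 150.0, n)
sigma = rng.uniform(0.05, 0.6, n)
T = rng.uniform(0.1, 2.0, n)
X = np.column_stack([S, sigma, T])                   # network inputs
y = bs_call(S, K=100.0, T=T, r=0.02, sigma=sigma)    # network targets
```

A network trained on (X, y) then replaces the solver at inference time; the same recipe applies to the COS and Brent solvers by swapping the labelling function.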

Risks doi: 10.3390/risks7010015

Authors: Mansi Jain Gagan Deep Sharma Mrinalini Srivastava

'Sustainable investment' includes a variety of asset classes selected while caring for environmental, social, and governance (ESG) causes. It is an investment strategy that seeks to combine social and/or environmental benefits with financial returns, thus linking the investor's social, ethical, ecological and economic concerns. Under certain conditions, these indices also help to attract foreign capital, seeking international participation in the local capital markets. The purpose of this paper is to study whether the sustainable investment alternatives offer better financial returns than the conventional indices from both developed and emerging markets. With an intent to maintain consistency, this paper comparatively analyzes the financial returns of the Thomson Reuters/S-Network global indices, namely the developed markets (excluding US) ESG index (TRESGDX), the emerging markets ESG index (TRESGEX), the US large-cap ESG index (TRESGUS) and the Europe ESG index (TRESGEU), and those of the usual markets, namely the MSCI World index (MSCI W), MSCI All Country World Equity index (MSCI ACWI), MSCI USA index (MSCI USA), MSCI Europe Australasia Far East index (MSCI EAFE), MSCI Emerging Markets index (MSCI EM) and MSCI Europe index (MSCI EU). The study also focuses on the inter-linkages between these indices. Daily closing prices of all the benchmark indices are taken for the five-year period January 2013–December 2017. Line charts and unit-root tests are applied to check the stationarity of the series; Granger causality tests and auto-regressive conditional heteroskedasticity (ARCH)/GARCH-type modelling are performed to find the linkages between the markets under study, followed by Johansen's cointegration test and a Vector Error Correction Model to test the volatility spillover between the sustainable indices and the conventional indices.
The study finds that the sustainable indices and the conventional indices are integrated and that there is a flow of information between the two investment avenues. The results indicate that there is no significant difference in performance between the sustainable indices and the traditional conventional indices, making the former a good substitute for the latter. Hence, financial/investment managers can obtain more insights regarding investment decisions, and the study further suggests that their portfolios should include both types of indices, with a view to diversifying risk and hedging, and reap the benefits of doing so. Additionally, corporate executives can use the indices to benchmark their own performance against peers and to track news.

Risks doi: 10.3390/risks7010014

Authors: Rui Zhou Guangyu Xing Min Ji

Standardized longevity risk transfers often involve modeling the mortality rates of multiple populations. Some researchers have found that the mortality indexes of selected countries are cointegrated, meaning that a linear relationship exists between the indexes. A vector error correction model (VECM) has been used to incorporate this relation, thereby forcing the mortality rates of multiple populations to revert to a long-run equilibrium. However, the long-run equilibrium may change over time. It is crucial to incorporate these changes so that mortality dependence is adequately modeled. In this paper, we develop a framework to examine the presence of equilibrium changes and to incorporate these changes into the mortality model. In particular, we focus on equilibrium changes caused by a threshold effect, the phenomenon that mortality indexes alternate between different VECMs depending on the value of a threshold variable. Our framework comprises two steps. In the first step, a statistical test is performed to examine the presence of a threshold effect in the VECM for multiple mortality indexes. In the second step, a threshold vector error correction model (TVECM) is fitted to the mortality indexes and model adequacy is evaluated. We illustrate this framework with the mortality data of the England and Wales (EW) and Canadian populations. We further apply the TVECM to forecast future mortalities and price an illustrative longevity bond with the multivariate Wang transform. Our numerical results show that the TVECM predicts much faster mortality improvement for EW and Canada than the single-regime VECM, and thus the incorporation of the threshold effect significantly increases the longevity bond price.

Risks doi: 10.3390/risks7010013

Authors: Mauricio Junca Harold A. Moreno-Franco José Luis Pérez

We consider the optimal bail-out dividend problem with fixed transaction cost for a Lévy risk model with a constraint on the expected present value of injected capital. To solve this problem, we first consider the optimal bail-out dividend problem with transaction cost and capital injection and show the optimality of reflected (c_1, c_2)-policies. We then find the optimal Lagrange multiplier, by showing that in the dual Lagrangian problem the complementary slackness conditions are met. Finally, we present some numerical examples to support our results.

Risks doi: 10.3390/risks7010012

Authors: Ranjit Singh Jayashree Bhattacharjee

Risk perception is an idiosyncratic process of interpretation. It is a highly personal process of making a decision based on an individual's frame of reference that has evolved over time. The purpose of this paper is to find out the risk perception level of equity investors and to identify the factors influencing their risk perception. The study was conducted using a stratified random sampling design of 358 investors. It was found that the overall risk perception level of equity investors is moderate and that the main factors affecting their risk perception are information screening, investment education, fear psychosis, fundamental expertise, technical expertise, familiarity bias, information asymmetry, understanding of the market, etc. Considering the above findings, efforts should be made to bring people with a high risk perception to the low risk perception category by providing them with training to handle or manage high-risk scenarios which will help in promoting an equity-investment culture.

Risks doi: 10.3390/risks7010011

Authors: Jackie Li Atsuyuki Kogure Jia Liu

In this paper, we suggest a Bayesian multivariate approach for pricing a reverse mortgage, allowing for house price risk, interest rate risk and longevity risk. We adopt the principle of maximum entropy in risk-neutralisation of these three risk components simultaneously. Our numerical results based on Australian data suggest that a reverse mortgage would be financially sustainable under the current financial environment and the model settings and assumptions.

Risks doi: 10.3390/risks7010010

Authors: Ravi Summinga-Sonagadu Jason Narsoo

In this paper, we employ 99% intraday value-at-risk (VaR) and intraday expected shortfall (ES) as risk metrics to assess the competency of the Multiplicative Component Generalised Autoregressive Heteroskedasticity (MC-GARCH) models based on the 1-min EUR/USD exchange rate returns. Five distributional assumptions for the innovation process are used to analyse their effects on the modelling and forecasting performance. The high-frequency volatility models were validated in terms of in-sample fit based on various statistical and graphical tests. A more rigorous validation procedure involves testing the predictive power of the models. Therefore, three backtesting procedures were used for the VaR, namely, the Kupiec test, a duration-based backtest, and an asymmetric VaR loss function. Similarly, three backtests were employed for the ES: a regression-based backtesting procedure, the Exceedance Residual backtest and the V-Tests. The validation results show that non-normal distributions are best suited for both model fitting and forecasting. The MC-GARCH(1,1) model under the Generalised Error Distribution (GED) innovation assumption gave the best fit to the intraday data and gave the best results for the ES forecasts. However, the asymmetric Skewed Student's-t distribution for the innovation process provided the best results for the VaR forecasts. This paper presents the results of the first empirical study (to the best of the authors' knowledge) in: (1) forecasting the intraday Expected Shortfall (ES) under different distributional assumptions for the MC-GARCH model; (2) assessing the MC-GARCH model under the Generalised Error Distribution (GED) innovation; (3) evaluating and ranking the VaR predictability of the MC-GARCH models using an asymmetric loss function.
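The first of the three VaR backtests, the Kupiec unconditional coverage (proportion-of-failures) test, can be sketched as follows (the observation and exception counts are illustrative, not the paper's):

```python
import math

def kupiec_pof(n_obs, n_exceptions, p):
    """Kupiec proportion-of-failures likelihood-ratio statistic for a VaR
    with target exceedance probability p (p = 0.01 for a 99% VaR).
    Asymptotically chi-square with 1 degree of freedom under the null."""
    x, T = n_exceptions, n_obs
    if x == 0:
        return -2.0 * T * math.log(1.0 - p)
    pi = x / T                                   # observed exception rate
    log_h0 = (T - x) * math.log(1.0 - p) + x * math.log(p)
    log_h1 = (T - x) * math.log(1.0 - pi) + x * math.log(pi)
    return -2.0 * (log_h0 - log_h1)

# Illustrative: 20 exceptions in 2000 one-minute returns at p = 0.01
lr = kupiec_pof(2000, 20, 0.01)
```

Under the null of correct coverage, values above 3.84 reject at the 5% level; in the illustrative call the observed exception rate equals p exactly, so the statistic is zero.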

Risks doi: 10.3390/risks7010009

Authors: Fangyuan Dong Nick Halen Kristen Moore Qinglai Zeng

Life Insurance Retirement Plans (LIRPs) offer tax-deferred cash value accumulation, tax-free withdrawals (if properly structured), and a tax-free death benefit to beneficiaries. Thus, LIRPs share many of the tax advantages of other retirement savings vehicles but with less restrictive limitations on income and contributions. Opinions are mixed about the effectiveness of LIRPs; some financial advisers recommend them enthusiastically, while others are more skeptical. In this paper, we examine the potential of LIRPs to meet both income and bequest needs in retirement. We contrast retirement portfolios that include a LIRP with those that include only investment products with no life insurance. We consider different issue ages, face amounts, and withdrawal patterns. We simulate market scenarios and we demonstrate that portfolios that include a LIRP yield higher legacy potential and smaller income risk than those that exclude it. Thus, we conclude that the inclusion of a LIRP can improve financial outcomes in retirement.

]]>Risks doi: 10.3390/risks7010008

Authors: Maria Elena De Giuli Alessandro Greppi Marina Resta

We use Object Oriented Bayesian Networks (OOBNs) to analyze complex ties in the equity market and to detect drivers for the Standard &amp; Poor&rsquo;s 500 (S&amp;P 500) index. To this aim, we consider a vast number of indicators drawn from various investment areas (Value, Growth, Sentiment, Momentum, and Technical Analysis), and, with the aid of OOBNs, we study the role they have played over time in influencing the dynamics of the S&amp;P 500. Our results highlight that the centrality of the indicators varies over time, and they offer a starting point for further inquiries devoted to combining OOBNs with trading platforms.

]]>Risks doi: 10.3390/risks7010007

Authors: Delia Coculescu Freddy Delbaen

We use the theory of coherent risk measures to look at the problem of surplus sharing in an insurance business. The surplus share of an insured is determined by the surplus premium in the contract. The theory of coherent risk measures and the resulting capital allocation give a way to divide the surplus between the insured and the capital providers, i.e., the shareholders.

]]>Risks doi: 10.3390/risks7010006

Authors: Guangyuan Gao Mario V. Wüthrich

The aim of this project is to analyze high-frequency GPS location data (second-by-second) of individual car drivers (and trips). We extract feature information about speeds, acceleration, deceleration, and changes of direction from these high-frequency GPS location data. Time series of this feature information allow us to appropriately allocate individual car driving trips to selected drivers using convolutional neural networks.

]]>Risks doi: 10.3390/risks7010005

Authors: Carmine De Franco Johann Nicolle Huyên Pham

One of the main challenges investors have to face is model uncertainty. Typically, the dynamics of the assets are modeled using two parameters: the drift vector and the covariance matrix, both of which are uncertain. Since the variance/covariance parameter is assumed to be estimated with a certain level of confidence, we focus on drift uncertainty in this paper. Building on filtering techniques and learning methods, we use a Bayesian learning approach to solve the Markowitz problem and provide a simple and practical procedure to implement the optimal strategy. To illustrate the value added by using the optimal Bayesian learning strategy, we compare it with an optimal nonlearning strategy that keeps the drift constant at all times. To emphasize the advantage of the Bayesian learning strategy over the nonlearning one in different situations, we experiment with three different investment universes: indices of various asset classes, currencies and smart beta strategies.
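The Bayesian learning idea can be illustrated in a toy discrete-time version: with a Gaussian prior on an unknown drift and a known return variance, the posterior updates in closed form. This conjugate-normal sketch is only a stand-in for the paper's continuous-time filtering set-up.

```python
def posterior_drift(prior_mean, prior_var, returns, noise_var):
    """Conjugate normal update of an unknown drift from observed returns.

    Each observed return is modelled as drift + Gaussian noise with known
    variance noise_var. Returns the posterior mean and variance of the drift;
    a nonlearning strategy corresponds to never performing this update.
    """
    n = len(returns)
    precision = 1.0 / prior_var + n / noise_var
    mean = (prior_mean / prior_var + sum(returns) / noise_var) / precision
    return mean, 1.0 / precision

# Diffuse-ish prior, four observations: the posterior mean moves toward the
# sample average, and the posterior variance shrinks as data accumulate.
mean, var = posterior_drift(0.0, 1.0, [0.1, 0.1, 0.1, 0.1], 1.0)
```

As more returns arrive, the posterior mean converges to the sample mean, which is the sense in which the learning strategy dominates the constant-drift one in the long run.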

]]>Risks doi: 10.3390/risks7010004

Authors: Risks Editorial Office

Rigorous peer review is the cornerstone of high-quality academic publishing. [...]

]]>Risks doi: 10.3390/risks7010003

Authors: Paolo Giudici Laura Parisi

We propose a statistical measure, based on correlation networks, to evaluate the systemic risk that could arise from the resolution of a failing or likely-to-fail financial institution, under three alternative scenarios: liquidation, private recapitalization, or bail-in. The measure enhances the observed CDS spreads with a risk premium that derives from contagion effects across financial institutions. The empirical findings reveal that the recapitalization of a distressed bank performed by the other banks in the system and the bail-in resolution minimize the potential losses for the banking sector with respect to the liquidation scenario, thus posing limited systemic risks. A closer comparison between the private intervention recapitalization and the bail-in tool shows that the latter slightly reduces contagion effects with respect to the private intervention scenario.

]]>Risks doi: 10.3390/risks7010002

Authors: Man Chung Fung Katja Ignatieva Michael Sherris

This paper assesses the hedge effectiveness of an index-based longevity swap and a longevity cap for a life annuity portfolio. Although longevity swaps are a natural instrument for hedging longevity risk, derivatives with non-linear pay-offs, such as longevity caps, provide more effective downside protection. A tractable stochastic mortality model with age dependent drift and volatility is developed and analytical formulae for prices of longevity derivatives are derived. The model is calibrated using Australian mortality data. The hedging of the life annuity portfolio is comprehensively assessed for a range of assumptions for the longevity risk premium, the term to maturity of the hedging instruments, as well as the size of the underlying annuity portfolio. The results compare the risk management benefits and costs of longevity derivatives with linear and nonlinear payoff structures.

]]>Risks doi: 10.3390/risks7010001

Authors: Daniel Doyle Chris Groendyke

This paper explores the use of neural networks to reduce the computational cost of pricing and hedging variable annuity guarantees. Pricing these guarantees can take a considerable amount of time because of the large number of Monte Carlo simulations that are required for the fair value of these liabilities to converge. This computational requirement worsens when Greeks must be calculated to hedge the liabilities of these guarantees. A feedforward neural network is a universal function approximator that is proposed as a useful machine learning technique to interpolate between previously calculated values and avoid running a full simulation to obtain a value for the liabilities. We propose methodologies utilizing neural networks for both the pricing and the hedging of four different varieties of variable annuity guarantees. We demonstrate a significant efficiency gain from using neural networks in this manner. We also experiment with different error functions in the training of the neural networks and examine the resulting changes in network performance.

]]>Risks doi: 10.3390/risks6040145

Authors: Ashu Tiwari Archana Patro

Policymakers in developing and emerging countries face higher natural-disaster-related risk than those in developed countries because of a persistent supply-side bottleneck in disaster insurance. Additionally, lower insurance consumption, higher disaster risk, and the high income elasticity of insurance demand have worsened the loss consequences of natural disasters in these markets. In this context, the current study argues, for the first time, that the supply-side bottleneck problem has its origin in a peculiar pattern of disaster insurance consumption owing to memory cues. The study finds that a relatively higher frequency of natural disasters acts as a negative memory cue and positively impacts insurance consumption. On the other hand, a relatively lower frequency of natural disasters adversely impacts insurance consumption against the background of variation in risk-aversion behavior. For this purpose, the current study builds on Mullainathan (2002), which develops its argument around memory cues.

]]>Risks doi: 10.3390/risks6040144

Authors: Yikai (Maxwell) Gong Zhuangdi Li Maria Milazzo Kristen Moore Matthew Provencher

Credibility theory is used widely in group health and casualty insurance. However, it is generally not used in individual life and annuity business. With the introduction of principle-based reserving (PBR), which relies more heavily on company-specific experience, credibility theory is becoming increasingly important for life actuaries. In this paper, we review the two most commonly used credibility methods: limited fluctuation and greatest accuracy (B&uuml;hlmann) credibility. We apply the limited fluctuation method to M Financial Group&rsquo;s experience data and describe some general qualitative observations. In addition, we use simulation to generate a universe of data and compute limited fluctuation and greatest accuracy credibility factors for actual-to-expected (A/E) mortality ratios. We also compare the two credibility factors to an intuitive benchmark credibility measure. We see that for our simulated data set, the limited fluctuation factors are significantly lower than the greatest accuracy factors, particularly for low numbers of claims. Thus, the limited fluctuation method may understate the credibility for companies with favorable mortality experience. The greatest accuracy method has a stronger mathematical foundation, but it generally cannot be applied in practice because of data constraints. The National Association of Insurance Commissioners (NAIC) recognizes and is addressing the need for life insurance experience data in support of PBR&mdash;this is an area of current work.
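For readers unfamiliar with the limited fluctuation method, here is a generic sketch of the square-root rule applied to A/E mortality ratios, using the classical 90%-within-5% full-credibility standard (about 1082 claims). The paper's own standards and the M Financial data are not reproduced here.

```python
import math
from statistics import NormalDist

def limited_fluctuation_Z(n_claims, prob=0.90, k=0.05):
    """Square-root rule: partial credibility factor for an observed claim count.

    Full credibility requires n_full = (z / k)^2 claims, where z is the
    (1 + prob)/2 quantile of the standard normal -- roughly 1082 claims at
    the classical 90%-within-5% standard.
    """
    z = NormalDist().inv_cdf((1 + prob) / 2)
    n_full = (z / k) ** 2
    return min(1.0, math.sqrt(n_claims / n_full))

def blended_AE(company_AE, industry_AE, Z):
    """Credibility-weight a company's actual-to-expected mortality ratio
    against the industry benchmark."""
    return Z * company_AE + (1 - Z) * industry_AE

# Around 270 claims give Z near 0.5; 1082 or more give full credibility.
Z = limited_fluctuation_Z(270)
```

The low factors the paper reports for small claim counts are visible here directly: Z grows only with the square root of the claim count.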

]]>Risks doi: 10.3390/risks6040143

Authors: Ranjan Das Gupta Rajesh Pathak

A risk-return association under normal market conditions can be the conventional positive one (risk-averse) or the &ldquo;paradoxical&rdquo; negative one (risk seeking). This study investigates whether such an association is stable across market trends (i.e., bull and bear) and for overall, industry-classified and partitioned sub-samples after controlling for a firm&rsquo;s age, size, leverage and liquidity using operating performance risk-return measures. In total, this study analyses 2666 firms (1199 firms from 15 developed countries and 1467 firms from 12 emerging countries) for the period of 1999&ndash;2015. Results show that, in the overall and bull sub-periods, firms across countries exhibit the conventional positive association (superior firms) or the &ldquo;paradoxical&rdquo; negative association (poor firms) in most cases. However, in the bear sub-periods, all firms from emerging countries are risk seeking in order to maintain their position in the pecking order.

]]>Risks doi: 10.3390/risks6040142

Authors: Matthias Fischer Thorsten Moser Marius Pfeuffer

In both financial theory and practice, Value-at-Risk (VaR) has become the predominant risk measure in the last two decades. Nevertheless, there is a lively and controversial ongoing discussion about possible alternatives. Against this background, our first objective is to provide a current overview of related competitors, with a focus on credit risk management, including definitions, references, striking properties and a classification. The second part is dedicated to the measurement of risk concentrations in credit portfolios. Typically, credit portfolio models are used to calculate the overall risk (measure) of a portfolio. Subsequently, Euler&rsquo;s allocation scheme is applied to break the portfolio risk down to single counterparties (or different subportfolios) in order to identify risk concentrations. We first gather the Euler formulae for the risk measures under consideration. In two cases (Median Shortfall and Range-VaR), explicit formulae are presented for the first time. Afterwards, we present a comprehensive study for a benchmark portfolio according to Duellmann and Masschelein (2007) and nine different risk measures in conjunction with the Euler allocation. It is empirically shown that&mdash;in principle&mdash;all risk measures are capable of identifying both sectoral and single-name concentration. However, both the complexity of IT implementation and the sensitivity of the risk figures w.r.t. changes in portfolio quality vary across the specific risk measures.

]]>Risks doi: 10.3390/risks6040141

Authors: Christina Erlwein-Sayer

We enhance the modelling and risk assessment of sovereign bond spreads by taking into account quantitative information gained from macro-economic news sentiment. We investigate sovereign bonds spreads of five European countries and improve the prediction of spread changes by incorporating news sentiment from relevant entities and macro-economic topics. In particular, we create daily news sentiment series from sentiment scores as well as positive and negative news volume and investigate their effects on yield spreads and spread volatility. We conduct a correlation and rolling correlation analysis between sovereign bond spreads and accumulated sentiment series and analyse changing correlation patterns over time. Market regimes are detected through correlation series and the impact of news sentiment on sovereign bonds in different market circumstances is investigated. We find best-suited external variables for forecasts in an ARIMAX model set-up. Error measures for forecasts of spread changes and volatility proxies are improved when sentiment is considered. These findings are then utilised to monitor sovereign bonds from European countries and detect changing risks through time.

]]>Risks doi: 10.3390/risks6040140

Authors: Tihana Škrinjarić

Effects of seasonal affective disorder (SAD) are explored on several selected Central and South East European markets in this study for the period 2010&ndash;2018. Both return and risk sensitivities to the SAD effect are examined for 11 markets in total (Bosnia and Herzegovina, Bulgaria, Croatia, Czech Republic, Hungary, Poland, Serbia, Slovakia, Slovenia, Romania and Ukraine). SAD effects are based upon psychiatric and behavioural theories, and are rarely examined on stock markets today. Thus, this research provides an empirical evaluation of the mentioned effects for some of the markets for the first time in the literature. The results indicate that 6 out of 11 markets exhibit SAD effects to some extent, meaning that investors&rsquo; risk aversion changes over the year, depending upon the season. Such results have consequences for finance theory modelling, as well as practical use in investment strategies on stock markets.

]]>Risks doi: 10.3390/risks6040139

Authors: Stefano Cavastracci Strascia Agostino Tripodi

The aim of this paper is to develop a closed-form tool for estimating the one-year volatility of the claims reserve, calculated through generalized linear models (GLM), notably the overdispersed Poisson model. Up to now, this one-year volatility has been estimated through the well-known bootstrap methodology, which demands the use of the Monte Carlo method with a re-reserving technique. Nonetheless, this method is computationally time-consuming; therefore, approximation techniques are often used in practice, such as an emergence pattern based on the link between the one-year volatility&mdash;resulting from the Merz&ndash;W&uuml;thrich method&mdash;and the ultimate volatility&mdash;resulting from the Mack method.

]]>Risks doi: 10.3390/risks6040138

Authors: Abel Cadenillas Ricardo Huamán-Aguilar

We develop a government debt management model to study the optimal debt ceiling when the ability of the government to generate primary surpluses to reduce the debt ratio is limited. We succeed in finding a solution for the optimal debt ceiling. We study the conditions under which a country is not able to reduce its debt ratio to reach its optimal debt ceiling, even in the long run. In addition, this model with bounded intervention is consistent with the fact that, in reality, countries that succeed in reducing their debt ratio do not do so immediately, but over some period of time. To the best of our knowledge, this is the first theoretical model on the debt ceiling that accounts for bounded interventions.

]]>Risks doi: 10.3390/risks6040137

Authors: Aida Barkauskaite Ausrine Lakstutiene Justyna Witkowska

Scientific discussions have emphasized that the main problem with the current deposit insurance system is that the current system does not evaluate the risks that banks assume to calculate the deposit insurance premiums in many countries of the European Union (E.U.). Thus, the prevailing system does not safeguard a sufficient level of stability in the banking system. Scientific studies show that the deposit insurance system should consider not only the risk indicators for individual banks, but it must also consider the systemic risk of banks that affects the stability of the banking system. Hence, the question arises as to whether measurements of systemic risk in a common E.U. risk-based deposit insurance system are a formal necessity or if they are a value-adding process. Expanding the discussion of scientists, this article analyzes how contributions to insurance funds would change the banks of Lithuania following the introduction of the E.U.&rsquo;s overall risk-based deposit insurance system and after taking into consideration the additional systemic risk. The research results that were obtained provide evidence that the introduction of a risk-based deposit insurance system would redistribute payments to the deposit insurance fund between banks operating in Lithuania, and, thereby, would contribute to a reduction in the negative effects of the deposit insurance system and would improve the stability in the financial system.

]]>Risks doi: 10.3390/risks6040136

Authors: Haipeng Xing Yang Yu

The financial crises which occurred in the last several decades have demonstrated the significant impact of market structural breaks on firms&rsquo; credit behavior. To incorporate the impact of market structural break into the analysis of firms&rsquo; credit rating transitions and firms&rsquo; asset structure, we develop a continuous-time modulated Markov model for firms&rsquo; credit rating transitions with unobserved market structural breaks. The model takes a semi-parametric multiplicative regression form, in which the effects of firms&rsquo; observable covariates and macroeconomic variables are represented parametrically and nonparametrically, respectively, and the frailty effects of unobserved firm-specific and market-wide variables are incorporated via the integration form of the model assumption. We further develop a mixtured-estimating-equation approach to make inference on the effect of market variations, baseline intensities of all firms&rsquo; credit rating transitions, and rating transition intensities for each individual firm. We then use the developed model and inference procedure to analyze the monthly credit rating of U.S. firms from January 1986 to December 2012, and study the effect of market structural breaks on firms&rsquo; credit rating transitions.

]]>Risks doi: 10.3390/risks6040135

Authors: Hongmin Xiao Lin Xie

In this paper, a risk model with constant interest force based on an entrance process is investigated. Under the assumptions that the entrance process is a renewal process and that the claim sizes satisfy a certain dependence structure and belong to different heavy-tailed distribution classes, a finite-time asymptotic estimate for the bidimensional risk model with constant interest force is obtained. In particular, when the inter-arrival times also satisfy a certain dependence structure, these formulas still hold.

]]>Risks doi: 10.3390/risks6040134

Authors: Tomer Shushi

In risk theory, risks are often modeled by risk measures, which allow quantifying the risks and estimating their possible outcomes. Risk measures rely on measure theory, where the risks are assumed to be random variables with some distribution function. In this work, we derive a novel topological representation of risks. Using this representation, we show the differences between diversifiable and non-diversifiable risks. We show that topological risks should be modeled using two quantities: a risk measure that quantifies the predicted amount of risk, and a distance metric that quantifies the uncertainty of the risk.

]]>Risks doi: 10.3390/risks6040133

Authors: Jung-Bin Su Jui-Cheng Hung

This study utilizes seven bivariate generalized autoregressive conditional heteroskedasticity (GARCH) models to forecast the out-of-sample value-at-risk (VaR) of 21 stock portfolios and seven currency-stock portfolios with three weight combinations, and then employs three accuracy tests and one efficiency test to evaluate the VaR forecast performance of the above models. The seven models are constructed from four types of bivariate variance-covariance specifications and two approaches to parameter estimation. The four types of bivariate variance-covariance specifications are the constant conditional correlation (CCC), asymmetric and symmetric dynamic conditional correlation (ADCC and DCC), and the BEKK, whereas the two estimation approaches are the standard and non-standard approaches. Empirical results show that, regarding the accuracy tests, the VaR forecast performance of stock portfolios varies with the variance-covariance specification and the parameter estimation approach, whereas it does not vary with the weight combinations of the portfolios. Conversely, the VaR forecast performance of currency-stock portfolios is almost the same for all models and still does not vary with the weight combinations of the portfolios. Regarding the efficiency test via market risk capital, the NS-BEKK model is the most suitable model for use in the stock and currency-stock portfolios for bank risk managers, irrespective of the weight combination of the portfolios.

]]>Risks doi: 10.3390/risks6040132

Authors: Abdul-Aziz Ibn Musah Jianguo Du Hira Salah Ud din Khan Alhassan Alolo Abdul-Rasheed Akeji

In recent times, investing in volatile securities increases the risk of losses and reduces gains. Many traders exposed to these risks rely on multiple volatility procedures to inform their trading strategies. We explore two models to measure tail behaviour and the periods in which the stock will gain or fall within a five-month trading period. We obtained data from the Ghana stock exchange and applied the generalized extreme value distribution, validated by backtesting, and an artificial neural network for forecasting. The network training achieves more than 90% accuracy for both gains and falls on the given input-output pairs. On this basis, the estimates of the extreme value distribution prove to be sound. There is a significant development in market prediction in assessing the results of actual and forecast performance. The study reveals that, once every five months, at a 5% confidence level, the market is expected to gain and fall by 2.12% and 2.23%, respectively. The Ghana stock exchange showed a maximum monthly stock gain above or below 2.12% in the fourth and fifth months, while the maximum monthly stock fall was above or below 2.23% in the third and fourth months. The study reveals that, once every five-month trading period, the stock market will gain and fall by almost equal percentages, with a significant increase in value-at-risk and expected shortfall at the left tail as the quantiles increase, compared to the right tail.

]]>Risks doi: 10.3390/risks6040131

Authors: Aditya Maheshwari Andrey Sarantsev

In our model, private actors with interbank cash flows similar to, but more general than, those of Carmona et al. (2013) borrow from the non-banking financial sector at a certain interest rate, controlled by the central bank, and invest in risky assets. Each private actor aims to maximize its expected terminal logarithmic wealth. The central bank, in turn, aims to control the overall economy by means of an exponential utility function. We solve all stochastic optimal control problems explicitly. We are able to recreate phenomena such as a liquidity trap. We study the distribution of the number of defaults (the net worth of a private actor going below a certain threshold).

]]>Risks doi: 10.3390/risks6040130

Authors: Huong Dieu Dang

The informal constraints that arise from the national culture in which a firm resides have a pervasive impact on managerial decision making and corporate credit risk, which in turn impacts on corporate ratings and rating changes. In some cultures, firms are naturally predisposed to rating changes in a particular direction (downgrade or upgrade) while, in other cultures, firms are more likely to migrate from the current rating in either direction. This study employs a survival analysis framework to examine the effect of national culture on the probability of rating transitions of 5360 firms across 50 countries over the period 1985&ndash;2010. Firms located in long-term oriented cultures are less likely to be downgraded and, in some cases, more likely to be upgraded. Downgrades occur more often in strong uncertainty-avoiding countries and less often in large power distance (hierarchy) and embeddedness countries. There is some evidence that masculinity predisposes firms to more rating transitions. Studying culture helps enrich our understanding of corporate rating migrations, and helps develop predictive models of corporate rating changes across countries.

]]>Risks doi: 10.3390/risks6040129

Authors: Xinyuan Wei Jun-ya Gotoh Stan Uryasev

This paper studies the peer-to-peer lending and loan application processing of LendingClub. We try to reproduce the existing loan application processing algorithm and identify the features used in this process. Loan application processing is considered a binary classification problem. We used the area under the ROC curve (AUC) for the evaluation of algorithms. Features were transformed with splines to improve the performance of the algorithms. We considered three classification algorithms: logistic regression, buffered AUC (bAUC) maximization, and AUC maximization. With only three features (Debt-to-Income Ratio, Employment Length, and Risk Score), we obtained an AUC close to 1. We performed both in-sample and out-of-sample evaluations. The codes for cross-validation and for solving the problems in a Portfolio Safeguard (PSG) format are in the Appendix. The calculation results, with the data and codes, are posted on the website and are available for downloading.
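The AUC criterion used above has a simple rank interpretation: the probability that a randomly chosen accepted application is scored higher than a randomly chosen rejected one. A minimal, library-free sketch with made-up scores for illustration (not the paper's evaluation code):

```python
def auc(pos_scores, neg_scores):
    """AUC as the normalized Mann-Whitney statistic: the fraction of
    (positive, negative) pairs where the positive score is higher,
    ties counting one half."""
    wins = 0.0
    for p in pos_scores:
        for n in neg_scores:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos_scores) * len(neg_scores))

# Hypothetical classifier scores for accepted (positive) and rejected
# (negative) applications: perfectly separated apart from one tie.
value = auc([0.9, 0.8, 0.7], [0.1, 0.2, 0.7])
```

This pairwise definition is O(n*m) and fine for illustration; production implementations sort once and use ranks instead.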

]]>Risks doi: 10.3390/risks6040128

Authors: Minh Ngo Marc Rieger Shuonan Yuan

Stocks are riskier than bonds. This causes a risk premium for stocks. However, the size of this premium seems to be larger than risk aversion alone can explain; this is the so-called “equity premium puzzle”. One possible explanation is the inclusion of a degree of ambiguity in stock returns to account for an additional ambiguity premium, whose size depends on the degree of ambiguity aversion among investors. This is, however, difficult to test empirically. In this paper, we compute the first firm-level estimation of the equity premium based on the internal rate of return (IRR) approach for a total of N = 28,256 companies in 54 countries worldwide. Using a survey of international data on ambiguity aversion, we find a strong and robust relation between equity premia and ambiguity aversion.

]]>Risks doi: 10.3390/risks6040127

Authors: Pavel V. Gapeev Hessah Al Motairi

We present closed-form solutions to the perpetual American dividend-paying put and call option pricing problems in two extensions of the Black&ndash;Merton&ndash;Scholes model with random dividends under full and partial information. We assume that the dividend rate of the underlying asset price changes its value at a certain random time which has an exponential distribution and is independent of the standard Brownian motion driving the price of the underlying risky asset. In the full information version of the model, it is assumed that this time is observable to the option holder, while in the partial information version of the model, it is assumed that this time is unobservable to the option holder. The optimal exercise times are shown to be the first times at which the underlying risky asset price process hits certain constant levels. The proof is based on the solutions of the associated free-boundary problems and the applications of the change-of-variable formula.

]]>Risks doi: 10.3390/risks6040126

Authors: Fabian Capitanio Antonio De Pin

Risk management policy in agriculture has become particularly prominent nowadays, considering the evolution of the Common Agricultural Policy (CAP) and climate change. Moreover, the World Trade Organization places constraints on it. In this context, (1) the aim is to analyze the causes of the loss of effectiveness of the Italian insurance system, which is unable to meet the specific coverage demand from agriculture. (2) The analysis is carried out through the economic evaluation of the convenience of adhering to the instruments offered by the insurance market to winegrowers in the Controlled and Guaranteed Denomination of Origin (DOCG) area of Conegliano-Valdobbiadene. (3) The study highlights that subsidized coverage alone is not the most adequate measure of agricultural policy. Adhering to preferential programs implies the drafting of a supplementary insurance policy to minimize the loss function. (4) The current insurance system impasse demonstrates that producers are reluctant to accept policies that do not translate into an immediate income benefit. The European risk management regulation confirms its limits in terms of the usefulness and efficiency of agrarian policy. (5) The predicted probabilistic increase of severe-weather patterns makes the search for innovative risk assessment models more urgent: models that can combine the different needs of stakeholders, namely farmers, insurance companies, and society.

]]>Risks doi: 10.3390/risks6040125

Authors: Marco Neffelli

Portfolio weights based solely on risk avoid estimation errors from the sample mean, but they are still affected by misspecification in the sample covariance matrix. To address this problem, we shrink the covariance matrix towards the Identity, the Variance Identity, the Single-index model, the Common Covariance, the Constant Correlation, and the Exponential Weighted Moving Average target matrices. Using an extensive Monte Carlo simulation, we offer a comparative study of these target estimators, testing their ability to reproduce the true portfolio weights. We control for the dataset dimensionality and the shrinkage intensity in the Minimum Variance (MV), Inverse Volatility (IV), Equal-Risk-Contribution (ERC), and Maximum Diversification (MD) portfolios. We find that the Identity and Variance Identity targets have very good statistical properties, being well conditioned even in high-dimensional datasets. In addition, these two models are the best targets towards which to shrink: they minimise the misspecification in risk-based portfolio weights, generating estimates very close to the population values. Overall, shrinking the sample covariance matrix helps to reduce weight misspecification, especially in the Minimum Variance and Maximum Diversification portfolios. The Inverse Volatility and Equal-Risk-Contribution portfolios are less sensitive to covariance misspecification and so benefit less from shrinkage.
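As a minimal illustration of linear shrinkage toward the simplest of the targets above (the variance-scaled identity) and its effect on minimum-variance weights — a generic two-asset sketch, not the authors' Monte Carlo set-up:

```python
import numpy as np

def shrink_to_identity(S, delta):
    """Linear shrinkage of a sample covariance S toward the identity target
    scaled by the average variance: delta * (tr(S)/p) * I + (1 - delta) * S."""
    p = S.shape[0]
    target = (np.trace(S) / p) * np.eye(p)
    return delta * target + (1 - delta) * S

def min_variance_weights(S):
    """Unconstrained minimum-variance weights, proportional to S^{-1} 1."""
    ones = np.ones(S.shape[0])
    w = np.linalg.solve(S, ones)
    return w / w.sum()

# Illustrative 2x2 covariance matrix (annualised variances 4% and 9%)
S = np.array([[0.04, 0.01],
              [0.01, 0.09]])
w_raw = min_variance_weights(S)
w_shrunk = min_variance_weights(shrink_to_identity(S, 0.5))
```

Full shrinkage (delta = 1) makes the covariance a scaled identity, so the minimum-variance weights collapse to equal weighting; intermediate delta pulls the weights part-way there, which is the mechanism behind the reduced weight misspecification reported above.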

]]>Risks doi: 10.3390/risks6040124

Authors: Chengbo Fu

The variance of stock returns is decomposed based on a conditional Fama&ndash;French three-factor model instead of its unconditional counterpart. Using time-varying alpha and betas in this model, it is evident that four additional risk terms must be considered. They include the variance of alpha, the variance of the interaction between the time-varying component of beta and factors, and two covariance terms. These additional risk terms are components that are included in the idiosyncratic risk estimate using an unconditional model. By investigating the relation between the risk terms and stock returns, we find that only the variance of the time-varying alpha is negatively associated with stock returns. Further tests show that stock returns are not affected by the variance of time-varying beta. These results are consistent with the findings in the literature identifying return predictability from time-varying alpha rather than betas.

]]>Risks doi: 10.3390/risks6040123

Authors: Marie Angèle Cathleen Alijean Jason Narsoo

Mortality forecasting has always been a target of study by academics and practitioners. Since the introduction and rising significance of securitization of risk in mortality and longevity, more in-depth studies regarding mortality have been carried out to enable the fair pricing of such derivatives. In this article, a comparative analysis is performed on the mortality forecasting accuracy of four mortality models. The methodology employs the Age-Period-Cohort model, the Cairns-Blake-Dowd model, the classical Lee-Carter model and the Kou-Modified Lee-Carter model. The Kou-Modified Lee-Carter model combines the classical Lee-Carter with the Double Exponential Jump Diffusion model. This paper is the first study to employ the Kou model to forecast French mortality data. The dataset comprises death data of French males from age 0 to age 90, available for the years 1900&ndash;2015. The paper differentiates between two periods: the 1900&ndash;1960 period where extreme mortality events occurred for French males and the 1961&ndash;2015 period where no significant jump is observed. The Kou-modified Lee-Carter model turns out to give the best mortality forecasts based on the RMSE, MAE, MPE and MAPE metrics for the period 1900&ndash;1960 during which the two World Wars occurred. This confirms that the consideration of jumps and leptokurtic features conveys important information for mortality forecasting.
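The classical Lee-Carter fit that underlies three of the four models above can be sketched via SVD, together with the random-walk-with-drift projection of the period index that the Kou-modified variant augments with jumps. This is a generic sketch on synthetic data, not the French mortality fit itself.

```python
import numpy as np

def fit_lee_carter(log_m):
    """Classical Lee-Carter fit of log m(x,t) = a_x + b_x * k_t via SVD.

    a_x is the time-average of the log rates at each age; b_x and k_t come
    from the leading singular pair of the centred matrix, normalised so
    that sum(b_x) = 1.
    """
    a = log_m.mean(axis=1)
    U, s, Vt = np.linalg.svd(log_m - a[:, None], full_matrices=False)
    b, k = U[:, 0], s[0] * Vt[0]
    scale = b.sum()
    return a, b / scale, k * scale

def rw_drift_forecast(k, horizon):
    """Random-walk-with-drift (ARIMA(0,1,0)) projection of k_t -- the
    diffusion baseline that the double exponential jump component extends."""
    drift = (k[-1] - k[0]) / (len(k) - 1)
    return k[-1] + drift * np.arange(1, horizon + 1)
```

On a noise-free synthetic surface the fit recovers the rank-one structure exactly; on real data the leading singular pair only approximates it, and k_t is usually re-estimated to match observed death counts before forecasting.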

Risks doi: 10.3390/risks6040122

Authors: Dietmar Pfeifer Olena Ragulina

We propose a Monte Carlo simulation method to generate stress tests by VaR scenarios under Solvency II for dependent risks on the basis of observed data. This is of particular interest for the construction of Internal Models. The approach builds on earlier work on partition-of-unity copulas, but with a direct scenario estimation of the joint density by product beta distributions after a suitable transformation of the original data.

Risks doi: 10.3390/risks6040121

Authors: Robert J. Powell Duc H. Vo Thach N. Pham

There has been much discussion in the literature about how central measures of equity risk such as standard deviation fail to account for extreme tail risk of equities. Similarly, parametric measures of value at risk (VaR) may also fail to account for extreme risk, as they assume a normal distribution, which is often not the case in practice. Nonparametric measures of extreme risk such as nonparametric VaR and conditional value at risk (CVaR) have often been found to overcome this problem by measuring actual tail risk without applying any predetermined assumptions. However, this article argues that it is not just the actual risk of equities that matters to investor choices, but also their relative (ordinal) risk compared to each other. Using an applied setting of industry portfolios in a variety of Asian countries (benchmarked to the United States), over crisis and non-crisis periods, this article finds that nonparametric measures of VaR and CVaR may provide only limited new information to investors about relative risk in the portfolios examined, as there is a high degree of similarity in relative industry risk when using nonparametric metrics as compared to central or parametric measures such as standard deviation and parametric VaR.
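As an illustration of the nonparametric measures discussed here, historical VaR and CVaR can be read directly off the empirical loss distribution; the sketch below uses hypothetical returns and one simple quantile convention (conventions differ across texts):

```python
def historical_var_cvar(returns, alpha=0.95):
    """Nonparametric (historical) VaR and CVaR at confidence level alpha.
    Losses are reported as positive numbers."""
    losses = sorted(-r for r in returns)       # losses in ascending order
    k = int(round(alpha * len(losses)))        # index of the alpha-quantile
    var = losses[k - 1]                        # empirical alpha-quantile of loss
    tail = losses[k - 1:]                      # losses at or beyond VaR
    cvar = sum(tail) / len(tail)               # average loss in the tail
    return var, cvar

# Hypothetical daily returns
returns = [0.01, -0.02, 0.005, -0.05, 0.015, -0.01, 0.02, -0.03, 0.0, 0.01]
var90, cvar90 = historical_var_cvar(returns, alpha=0.9)
```

Because CVaR averages the whole tail beyond VaR, it is always at least as large as VaR, which is why it is preferred as a coherent measure of extreme risk.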

Risks doi: 10.3390/risks6040120

Authors: Fengming Qin Junru Zhang Zhaoyong Zhang

This study examines empirically the volatility spillover effects between the RMB foreign exchange markets and the stock markets by employing daily returns of the Chinese RMB exchange rates and the stock markets in China and Japan during the period 1998–2018. We find evidence that co-volatility effects exist among the financial markets in China and Japan, and that the volatility of RMB exchange rates contributes to the co-volatility spillovers across the financial markets. Conversely, return shocks from the stock markets can also generate co-volatility spillovers to the foreign exchange markets. This bidirectional relationship reveals that both the fundamental hypothesis and the investor-induced hypothesis are valid. Our estimates also show that the spillover effects led by the stock market in Japan are stronger than those from the foreign exchange markets and the Chinese stock markets, implying that a market with higher accessibility has greater spillover effects onto other markets. We also find that the average co-volatility spillover effects among the RMB exchange markets and the stock markets in Japan and China are generally negative. These findings have important policy implications for risk management and hedging strategies.

Risks doi: 10.3390/risks6040119

Authors: Chengping Gong Chengxiu Ling

Based on suitably left-truncated or censored data, two flexible classes of M-estimators of the Weibull tail coefficient are proposed, with two additional parameters bounding the impact of extreme contamination. Asymptotic normality with a √n rate of convergence is obtained. Robustness is discussed via the asymptotic relative efficiency and the influence function, and is further demonstrated by a small-scale simulation study and an empirical study of CRIX.

Risks doi: 10.3390/risks6040118

Authors: Elisson Andrade Fabio Mattos Roberto Arruda de Souza Lima

The objective of this research is to evaluate the influence on hedging decisions of a realistic set of transaction costs, which are largely stochastic. The stochastic nature of some transaction costs (such as margin calls) means that their exact value is unknown when the hedge is placed, since they depend on the trajectory of futures prices during the hedge. Results are consistent with previous studies in that the introduction of transaction costs tends to affect hedge ratios. However, in contrast to the traditional literature, the introduction of stochastic costs in futures hedging can either decrease or increase hedge ratios, depending on how these costs are determined.

Risks doi: 10.3390/risks6040117

Authors: Jarmila Horváthová Martina Mokrišová

In this paper, the following research problem was addressed: Is the DEA (Data Envelopment Analysis) method a suitable alternative to the Altman model in predicting the risk of bankruptcy? Based on this research problem, we formulated the aim of the paper: to apply the DEA method for predicting the risk of bankruptcy and to compare its results with those of the Altman model. The research problem and the aim of the paper follow the authors' research on methods appropriate for measuring business financial health, performance and competitiveness, as well as for predicting the risk of bankruptcy. To address the problem, the following methods were applied: financial ratios, the Altman model for private non-manufacturing firms and the DEA method. When applying the DEA method, we formulated an input-oriented DEA CCR model. We found that the DEA method is an appropriate alternative to the Altman model in predicting the risk of possible business bankruptcy. An important conclusion is that DEA allows us to apply not only outputs but also inputs. Since prediction models do not include these indicators, the DEA method appears to be the right choice. We recommend, especially for Slovak companies, applying the cost ratio when calculating the risk of bankruptcy.

Risks doi: 10.3390/risks6040116

Authors: Vincenzo Candila Salvatore Farace

Addressing the volatility spillovers of agricultural commodities is important for at least two reasons. First, for the last several years, the volatility of agricultural commodity prices seems to have increased. Second, according to the Food and Agriculture Organization, there is a strong need for understanding the potential (negative) impacts on food security caused by food commodity volatilities. This paper aims at investigating the presence, the size, and the persistence of volatility spillovers among five agricultural commodities (corn, sugar, wheat, soybean, and bioethanol) and five Latin American (Argentina, Brazil, Chile, Colombia, Peru) stock market indexes. Overall, when a negative shock hits the commodity market, Latin American stock market volatility tends to increase. This happens, for instance, for the relationships from corn to Chile and Colombia and from wheat to Peru and Chile.

Risks doi: 10.3390/risks6040115

Authors: Xin Liu Jiang Wu Chen Yang Wenjun Jiang

In this paper, we propose a clustering procedure of financial time series according to the coefficient of weak lower-tail maximal dependence (WLTMD). Due to the potential asymmetry of the matrix of WLTMD coefficients, the clustering procedure is based on a generalized weighted cuts method instead of the dissimilarity-based methods. The performance of the new clustering procedure is evaluated by simulation studies. Finally, we illustrate that the optimal mean-variance portfolio constructed based on the resulting clusters manages to reduce the risk of simultaneous large losses effectively.
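The WLTMD coefficient itself is specific to this paper, but the general idea of measuring lower-tail co-movement from ranks can be sketched with the classical empirical tail-dependence estimator, used here as an illustrative stand-in, not the authors' measure:

```python
def ranks(values):
    """1-based ranks of a sample (assumes no ties; illustrative only)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def lower_tail_dependence(x, y, q=0.25):
    """Share of observations whose ranks both fall in the lower q-tail,
    normalized so that comonotone data give 1 and independent data give ~q."""
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    joint = sum(1 for i in range(n) if rx[i] <= q * n and ry[i] <= q * n)
    return joint / (q * n)

# Two hypothetical return series that crash together
x = [1.2, -0.5, 0.3, -2.0, 0.8, -1.1, 0.1, -0.7]
y = [0.9, -0.4, 0.2, -1.8, 1.1, -0.9, 0.0, -0.6]
ltd = lower_tail_dependence(x, y, q=0.25)
```

Pairwise coefficients of this kind fill the (possibly asymmetric) dependence matrix on which a clustering procedure such as the one proposed here can then operate.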

Risks doi: 10.3390/risks6040114

Authors: Chen Li Xiaohu Li

This paper studies a Pareto-optimal reinsurance contract in the presence of negative statistical dependence between the insurance claim and the random recovery rate. In the context of both the symmetric information model and the asymmetric information model, we investigate properties of the Pareto-optimal indemnity schedules. For a risk-neutral reinsurer with proportional cost and associated expense, we also present possible forms of the Pareto-optimal indemnity schedule.

Risks doi: 10.3390/risks6040113

Authors: Arvind Shrivastava Kuldeep Kumar Nitin Kumar

The objective of the study is to perform corporate distress prediction for an emerging economy, such as India, where bankruptcy details of firms are not available. An exhaustive panel dataset extracted from Capital IQ has been employed for the purpose. Foremost, the study contributes by devising a novel framework to capture incipient signs of distress for Indian firms by employing a combination of firm-specific parameters. The strategy not only enlarges the sample of distressed firms but also yields robust results. The analysis applies both standard Logistic and Bayesian modeling to predict distressed firms in the Indian corporate sector, and a comparison of the predictive ability of the two approaches has been carried out. Both in-sample and out-of-sample evaluations reveal a consistently better predictive capability of the Bayesian methodology. The study provides a useful structure for indicating early signals of failure in the Indian corporate sector, a topic otherwise limited in the literature.

Risks doi: 10.3390/risks6040112

Authors: Florent Gallien Serge Kassibrakis Semyon Malamud

We solve the problem of optimal risk management for an investor holding an illiquid, alpha-generating fund and hedging his/her position with a liquid futures contract. When the investor is subject to a lower bound on net return, he/she is forced to reduce the total risk of his/her portfolio after a loss. In this case, he/she faces a tradeoff of either paying the transaction costs and deleveraging or keeping his/her current position in the illiquid instrument and hedging away some of the risk while keeping the residual, unhedgeable risk on his/her balance sheet. We explicitly characterize this tradeoff and study its dependence on asset characteristics. In particular, we show that higher alpha and lower beta typically widen the no-trading zone, while the impact of volatility is ambiguous.

Risks doi: 10.3390/risks6040111

Authors: Angelo Corelli

The paper analyzes the relationship between the most popular cryptocurrencies and a range of selected fiat currencies, in order to identify any pattern and/or causality between the series. Cryptocurrencies are a hot topic in finance due to their strict relationship with the Blockchain system from which they originate, and they are therefore normally considered part of the ongoing, world-wide financial revolution. This innovative study investigates this relationship for the first time by thoroughly examining the data, their features, and the way they are interconnected. The results reveal how concentrated the causality effect is on some specific cryptocurrencies and fiat currencies. The outcome is a clear and possibly explainable relationship between cryptocurrencies and Asian markets, suggesting some kind of Asian effect.

Risks doi: 10.3390/risks6040110

Authors: Sooie-Hoe Loke Enrique Thomann

In this paper, a dual risk model under a constant force of interest is considered. The ruin probability in this model is shown to satisfy an integro-differential equation, which can then be written as an integral equation. Using the collocation method, the ruin probability can be well approximated for any gain distribution. Examples involving exponential, uniform, Pareto and discrete gains are considered. Finally, the same numerical method is applied to the Laplace transform of the time of ruin.

Risks doi: 10.3390/risks6040109

Authors: Marcos González-Fernández Carmen González-Velasco

The aim of this paper is to analyze the relation between maturity structure, sovereign bond yields and sovereign risk in the Economic and Monetary Union for the period 1990–2013. The results confirm the existence of an inverse relationship between sovereign bond yields, sovereign risk and the maturity structure of sovereign debt, regardless of the proxy that is used to measure sovereign risk and the time variance of the variables employed. The results indicate that risk shortens the maturity structure of sovereign debt because it reduces the stock of long-term debt. The relationship between maturity structure and sovereign bond yields differs depending on the risk of the countries analyzed (non-monotonic relationship) and the differences between peripheral and core countries are greater for higher levels of the yields. If we control for the indebtedness level of these countries, the results show that the relationship between the sovereign bond yields and maturity strengthens as the debt level increases.

Risks doi: 10.3390/risks6040108

Authors: Kris Peremans Stefan Van Aelst Tim Verdonck

The chain ladder method is a popular technique to estimate the future reserves needed to handle claims that are not fully settled. Since the predictions for the aggregate portfolio (consisting of different subportfolios) need not equal the sum of the predictions for the subportfolios, a general multivariate chain ladder (GMCL) method has already been proposed. However, the GMCL method is based on the seemingly unrelated regression (SUR) technique, which makes it very sensitive to outliers. To address this issue, we propose a robust alternative that estimates the SUR parameters in a more outlier-resistant way. With the robust methodology it is possible to automatically flag the claims with a significantly large influence on the reserve estimates. We introduce a simulation design to generate artificial multivariate run-off triangles based on the GMCL model and illustrate the importance of taking into account contemporaneous correlations and structural connections between the run-off triangles. By adding contamination to these artificial datasets, the sensitivity of the traditional GMCL method and the good performance of the robust GMCL method are shown. The analysis of a portfolio from practice makes clear that the robust GMCL method can provide better insight into the structure of the data.

Risks doi: 10.3390/risks6040107

Authors: Phong Luu Jingzhi Tie Qing Zhang

A mean-reverting model is often used to capture asset price movements that fluctuate around an equilibrium. A common strategy for trading such a mean-reverting asset is to buy low and sell high. However, determining these key levels in practice is extremely challenging. In this paper, we study the optimal trading of such a mean-reverting asset with a fixed transaction cost (commission and slippage). In particular, we focus on a threshold-type policy and develop a method that is easy to implement in practice. We formulate the optimal trading problem in terms of a sequence of optimal stopping times. We follow a dynamic programming approach and obtain the value functions by solving the associated HJB equations. The optimal threshold levels can be found by solving a set of quasi-algebraic equations. In addition, a verification theorem is provided together with sufficient conditions. Finally, a numerical example is given to illustrate our results. We note that a complete treatment of this problem was recently given by Leung and associates. Nevertheless, our work was done independently and focuses more on developing necessary optimality conditions.
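A minimal simulation of the buy-low/sell-high threshold rule on a discretized Ornstein–Uhlenbeck price (all parameter values hypothetical; the paper derives the optimal thresholds rather than fixing them) might look like:

```python
import random

def simulate_threshold_trading(buy_level, sell_level, cost,
                               mu=0.0, kappa=5.0, sigma=0.5,
                               dt=1 / 252, n_steps=50_000, seed=42):
    """Buy when an Ornstein-Uhlenbeck price drops to buy_level, sell when it
    rises to sell_level, paying a fixed cost per round trip."""
    rng = random.Random(seed)
    x, holding, entry, pnl, trades = mu, False, 0.0, 0.0, 0
    for _ in range(n_steps):
        # Euler step of dX = kappa*(mu - X) dt + sigma dW
        x += kappa * (mu - x) * dt + sigma * (dt ** 0.5) * rng.gauss(0, 1)
        if not holding and x <= buy_level:
            holding, entry = True, x
        elif holding and x >= sell_level:
            pnl += (x - entry) - cost      # each round trip nets at least
            holding, trades = False, trades + 1  # (sell - buy - cost)
    return pnl, trades

pnl, trades = simulate_threshold_trading(buy_level=-0.1, sell_level=0.1, cost=0.02)
```

By construction every completed round trip earns at least the threshold gap minus the fixed cost, which is the tradeoff the paper optimizes when choosing the thresholds.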

Risks doi: 10.3390/risks6040106

Authors: Ghislain Léveillé Ilie-Radu Mitric Victor Côté

In this document, we examine the effects of the age process on aggregate discounted claims by studying the conditional raw and joint moments, the moment generating function and the distribution function of the increments of compound renewal sums with discounted claims, taking into account the past experience of an insurance portfolio.

Risks doi: 10.3390/risks6040105

Authors: Chia-Lin Chang Jukka Ilomäki Hannu Laurila Michael McAleer

This paper examines how the size of the rolling window, and the frequency used in moving average (MA) trading strategies, affects financial performance when risk is measured. We use the MA rule for market timing, that is, for when to buy stocks and when to shift to the risk-free rate. The important issue regarding the predictability of returns is assessed. It is found that performance improves, on average, when the rolling window is expanded and the data frequency is low. However, when the size of the rolling window reaches three years, the frequency loses its significance and all frequencies considered produce similar financial performance. Therefore, the results support stock returns predictability in the long run. The procedure takes account of the issues of variable persistence as we use only returns in the analysis. Therefore, we use the performance of MA rules as an instrument for testing returns predictability in financial stock markets.
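The market-timing rule described here, holding the stock when its price is above the rolling moving average and otherwise earning the risk-free rate, can be sketched as follows (hypothetical data; the paper's actual windows and frequencies vary):

```python
def ma_timing_equity(returns, window, rf=0.0):
    """Grow one unit of wealth with an MA timing rule: hold the stock when
    its price is above the rolling moving average, else earn rate rf.
    The signal at time t uses prices up to t only (no look-ahead)."""
    prices = [1.0]
    for r in returns:
        prices.append(prices[-1] * (1 + r))
    equity = 1.0
    for t in range(window, len(returns)):
        ma = sum(prices[t - window + 1: t + 1]) / window
        equity *= (1 + returns[t]) if prices[t] > ma else (1 + rf)
    return equity

# Hypothetical monthly returns, 3-period window
total = ma_timing_equity([0.02, -0.01, 0.03, 0.01, -0.02, 0.02], window=3)
```

In a steady uptrend the rule stays invested, while in a steady downtrend it sits in the risk-free asset, which is exactly the timing behavior whose risk-adjusted performance the paper measures across window sizes and frequencies.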

Risks doi: 10.3390/risks6040104

Authors: Naji Massad Jørgen Vitting Andersen

We introduce tools to capture the dynamics of three different pathways in which the synchronization of human decision-making could lead to turbulent periods and contagion phenomena in financial markets. The first pathway arises when stock market indices, seen as a set of coupled integrate-and-fire oscillators, synchronize in frequency. The integrate-and-fire dynamics happen due to "change blindness", a trait in human decision-making whereby people tend to ignore small changes but take action when a large change happens. The second pathway arises from feedback mechanisms between market performance and the use of certain (decoupled) trading strategies. The third pathway occurs through the effects of communication and its impact on human decision-making. A model is introduced in which financial market performance affects decision-making through communication between people. Conversely, the sentiment created via communication has an impact on financial market performance. The methodologies used are agent-based modeling, models of integrate-and-fire oscillators, and communication models of human decision-making.

Risks doi: 10.3390/risks6030103

Authors: Jin Sun Pavel V. Shevchenko Man Chung Fung

Variable annuities, as a class of retirement income products, allow equity market exposure for a policyholder's retirement fund, with optional guarantees to limit the downside risk of the market. Management fees and guarantee insurance fees are charged respectively for the market exposure and for the protection from the downside risk. We investigate the pricing of variable annuity guarantees under optimal withdrawal strategies when management fees are present. We consider optimal withdrawal strategies from both the policyholder's and the insurer's perspectives and calculate the respective fair insurance fees. We reveal a discrepancy where the fees from the insurer's perspective can be significantly higher due to the management fees serving as a form of market friction. Our results provide a possible explanation of the lower guarantee insurance fees observed in the market than those predicted from the insurer's perspective. Numerical experiments are conducted to illustrate the results.

Risks doi: 10.3390/risks6030102

Authors: Matija Vidmar

A fluctuation theory and, in particular, a theory of scale functions is developed for upwards skip-free Lévy chains, i.e., for right-continuous random walks embedded into continuous time as compound Poisson processes. This is done by analogy to the spectrally negative class of Lévy processes; several results, however, can be made more explicit or exhaustive in the compound Poisson setting. Importantly, the scale functions admit a linear recursion, of constant order when the support of the jump measure is bounded, by means of which they can be calculated; some examples are presented. An application to the modeling of an insurance company's aggregate capital process is briefly considered.

Risks doi: 10.3390/risks6030101

Authors: Yaseen Ghulam Kamini Dhruva Sana Naseem Sophie Hill

We utilize the data of a very large UK automobile loan firm to study the interaction of the characteristics of borrowers and loans in predicting subsequent loan performance. Our broader findings confirm earlier research on subprime auto loans. More importantly, unmarried borrowers living under furnished tenancy agreements who have relatively new jobs have a probability of defaulting of more than 60%, compared to an average 7% default rate among the subprime borrowers in the dataset. Also in the above category are those who live in a less prosperous part of the UK such as the north-west, are full-time self-employed, have other large loan arrears, fall into the bottom 25th percentile of monthly income, secure loans with a high loan-to-total-value (LTV) ratio, purchase expensive automobiles with shorter loan duration payment plans, and have a high dependency on government support. The same is true of those who go into arrears, except that the highest probability in this context is around 40%, compared to 6% for the overall sample. These findings should help in understanding subprime auto loan performance in relation to borrower and loan features, as well as helping auto finance firms improve predictive models and decision-making.

Risks doi: 10.3390/risks6030100

Authors: Nadezhda Gribkova Ričardas Zitikis

Background, or systematic, risks are integral parts of many systems and models in insurance and finance. These risks can, for example, be economic in nature, or they can carry more technical connotations, such as errors or intrusions, which could be intentional or unintentional. A most natural question arises from the practical point of view: is the given system really affected by these risks? In this paper we offer an algorithm for answering this question, given input-output data and appropriately constructed statistics, which rely on the order statistics of inputs and the concomitants of outputs. Even though the idea is rooted in complex statistical and probabilistic considerations, the algorithm is easy to implement and use in practice, as illustrated using simulated data.

Risks doi: 10.3390/risks6030099

Authors: Claude Lefèvre Stéphane Loisel Muhsin Tamturk Sergey Utev

A quantum mechanics approach is proposed to model non-life insurance risks and to compute the future reserve amounts and the ruin probabilities. The claim data, historical or simulated, are treated as coming from quantum observables and analyzed with traditional machine learning tools. They can then be used to forecast the evolution of the reserves of an insurance company. The methodology relies on the Dirac matrix formalism and the Feynman path-integral method.

Risks doi: 10.3390/risks6030098

Authors: Chuan-Chuan Ko Tyrone T. Lin Fu-Min Zeng Chien-Yu Liu

The study considers the product life cycle in the stages of technological innovation and focuses on how to evaluate the optimal investment strategy and the project value. It applies factors for the three product stages (production innovation, manufacture innovation, and business innovation) to different risks to build a technology innovation strategy model. This study of option premiums aims for the best strategy timing for each innovation stage. It shows that variation in the business cycle affects purchasing power under uncertainty about Gross Domestic Product (GDP). In application, the compound binomial option for manufacture innovation is only considered after the execution of production innovation, and business innovation is only considered after the execution of manufacture innovation. Thus, this paper constructs a dynamic sequential investment decision model, assesses the feasibility of an investment strategy, and determines the appropriate project value and option premiums for each stage under possible changes in GDP. Numerically, the result shows that the equity value of the investment is greater than 0. Therefore, this paper recommends that the case firm invest in its innovation project, known as one-time passwords. Sensitivity analysis shows that when the risk-adjusted discount rate r increases, the risk of the investment market increases accordingly, hence the equity value must also be higher in order to attract the case firm's investment interest. The sensitivity analysis of the average GDP growth rate u reveals a different phenomenon: the equity value gradually decreases as u rises, but once u rises beyond a certain point, the equity value grows again.
The study investigates the product life cycle innovation investment topic by using the compound binomial options method and therefore provides a more flexible strategy decision compared with other trend forecast criteria.
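The paper's compound binomial model for staged innovation is more elaborate, but its building block is the standard binomial option pricer; a sketch of the plain Cox–Ross–Rubinstein version (all parameter values hypothetical) is:

```python
import math

def crr_call(S, K, r, sigma, T, n):
    """Cox-Ross-Rubinstein binomial price of a European call option."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))        # up factor
    d = 1 / u                                  # down factor
    disc = math.exp(-r * dt)                   # one-step discount
    p = (math.exp(r * dt) - d) / (u - d)       # risk-neutral up probability
    # payoffs at maturity for j up-moves
    values = [max(S * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # backward induction through the lattice
    for _ in range(n):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(len(values) - 1)]
    return values[0]

price = crr_call(S=100, K=100, r=0.05, sigma=0.2, T=1.0, n=500)
```

A compound option, such as the staged innovation options in the paper, replaces the terminal payoff with the value of another option, so the same backward-induction machinery applies stage by stage.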

Risks doi: 10.3390/risks6030097

Authors: Gareth W. Peters

The paper addresses three objectives. The first is a presentation and overview of some important developments in quantile time series approaches relevant to demographic applications. The second is the development of a general framework to represent quantile regression models in a unifying manner, which can further enhance practical extensions and assist in forming connections between existing models for practitioners. In this regard, the core theme of the paper is to give a general audience perspectives on the core components that go into the construction of a quantile time series model. The third objective is to compare and discuss the application of the different quantile time series models on several sets of interesting demographic and mortality-related time series data sets. This has relevance to life insurance analysis, and the resulting exploration includes applications to mortality, fertility, births and morbidity data for several countries, with a more detailed analysis of regional data in England, Wales and Scotland.

Risks doi: 10.3390/risks6030096

Authors: Eric Beutner Henryk Zähle

Almost sure bootstrap consistency of the blockwise bootstrap for the Average Value at Risk of single risks is established for strictly stationary β-mixing observations. Moreover, almost sure bootstrap consistency of a multiplier bootstrap for the Average Value at Risk of collective risks is established for independent observations. The main results rely on a new functional delta-method for the almost sure bootstrap of uniformly quasi-Hadamard differentiable statistical functionals, to be presented here. The latter seems to be interesting in its own right.

Risks doi: 10.3390/risks6030095

Authors: Paolo Giudici Laura Parisi

We propose a novel credit risk measurement model for Credit Default Swap (CDS) spreads that combines vector autoregressive regression with correlation networks. We focus on the sovereign CDS spreads of a collection of countries, which can be regarded as idiosyncratic measures of credit risk. We model CDS spreads by means of a structural vector autoregressive model composed of a time-dependent, country-specific component and a contemporaneous component that describes contagion effects among countries. To disentangle the two components, we employ correlation networks derived from the correlation matrix between the reduced-form residuals. The proposed model is applied to ten countries that are representative of the recent financial crisis: top borrowing/lending countries and peripheral European countries. The empirical findings show that the contagion variable derived in this study can be considered a network centrality measure. From an applied viewpoint, the results indicate that, in the last 10 years, contagion has induced a "clustering effect" between core and peripheral countries, with the two groups further diverging through, and because of, contagion propagation, thus creating a sort of diabolic loop that is extremely difficult to reverse. Finally, the outcomes of the analysis confirm that core countries are importers of risk, as contagion increases their CDS spread, whereas peripheral countries are exporters of risk. Greece is an unfortunate exception, as its spreads seem to increase for both idiosyncratic factors and contagion effects.

Risks doi: 10.3390/risks6030094

Authors: Adnen Ben Nasr Juncal Cunado Rıza Demirer Rangan Gupta

This study examines the linkages between Brazil, Russia, India, and China (BRICS) stock market returns, country risk ratings, and international factors via Non-linear Auto Regressive Distributed Lags models (NARDL) that allow for testing the asymmetric effects of changes in country risk ratings on stock market returns. We show that BRICS countries exhibit quite a degree of heterogeneity in the interaction of their stock market returns with country-specific political, financial, and economic risk ratings. Positive and negative rating changes in some BRICS countries are found to have significant implications for both local stock market returns, as well as commodity price dynamics. While the commodity market acts as a catalyst for these emerging stock markets in the long-run, we also observe that negative changes in the country risk ratings generally command a higher impact on stock returns, implying the greater impact of bad news on market dynamics. Our findings suggest that not all BRICS nations are the same in terms of how they react to ratings changes and how they interact with global market variables.

Risks doi: 10.3390/risks6030093

Authors: Guus Balkema Paul Embrechts

There exist several estimators of the regression line in simple linear regression: Least Squares, Least Absolute Deviation, Right Median, Theil–Sen, Weighted Balance, and Least Trimmed Squares. Their performance for heavy tails is compared below on the basis of a quadratic loss function. The case where the explanatory variable is the inverse of a standard uniform variable and where the error has a Cauchy distribution plays a central role, but heavier and lighter tails are also considered. Tables list the empirical sd and bias for ten batches of one hundred thousand simulations when the explanatory variable has a Pareto distribution and the error has a symmetric Student distribution or a one-sided Pareto distribution for various tail indices. The results in the tables may be used as benchmarks. The sample size is n = 100, but results for n = ∞ are also presented. The error in the estimate of the slope need not be asymptotically normal. For symmetric errors, the symmetric generalized beta prime densities often give a good fit.
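Of the estimators compared above, Theil–Sen is the simplest to sketch: it takes the median of all pairwise slopes, which is what gives it resistance to heavy-tailed errors (illustrative data below, not the paper's simulation design):

```python
import statistics

def theil_sen_slope(x, y):
    """Theil-Sen estimator: the median of all pairwise slopes."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x))
              for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return statistics.median(slopes)

# Data close to y = 2x, except one wildly contaminated observation
x = [1, 2, 3, 4, 5, 6, 7]
y = [2.1, 4.0, 6.2, 7.9, 10.1, 12.0, 100.0]   # last point is an outlier
slope = theil_sen_slope(x, y)
```

Least Squares on the same data would be dragged far above 2 by the single outlier, while the median of pairwise slopes barely moves.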

Risks doi: 10.3390/risks6030092

Authors: Janine Balter Alexander J. McNeil

A justification of the Basel liquidity formula for risk capital in the trading book is given under the assumption that market risk-factor changes form a Gaussian white noise process over 10-day time steps and changes to P&L (profit-and-loss) are linear in the risk-factor changes. A generalization of the formula is derived under the more general assumption that risk-factor changes are multivariate elliptical. It is shown that the Basel formula tends to be conservative when the elliptical distributions are from the heavier-tailed generalized hyperbolic family. As a by-product of the analysis, a Fourier approach to calculating expected shortfall for general symmetric loss distributions is developed.
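The Basel square-root-of-time aggregation discussed here scales per-bucket expected shortfalls by the incremental liquidity horizon and combines them in quadrature; the sketch below follows the FRTB-style formula as commonly stated (bucket ES values hypothetical):

```python
import math

# FRTB liquidity horizons in days; T is the base 10-day horizon
LH = [10, 20, 40, 60, 120]
T = 10

def liquidity_adjusted_es(es_by_bucket):
    """Aggregate per-bucket 10-day expected shortfalls: scale bucket j >= 2
    by sqrt((LH_j - LH_{j-1}) / T) and sum the squares (square-root-of-time
    rule, as in the Basel liquidity formula discussed in the paper)."""
    terms = [es_by_bucket[0] ** 2]
    for j in range(1, len(LH)):
        scale = math.sqrt((LH[j] - LH[j - 1]) / T)
        terms.append((es_by_bucket[j] * scale) ** 2)
    return math.sqrt(sum(terms))

# Hypothetical 10-day ES per liquidity-horizon bucket
es = liquidity_adjusted_es([5.0, 3.0, 2.0, 1.0, 0.5])
```

The quadrature aggregation implicitly assumes independent Gaussian increments across horizons, which is exactly the assumption the paper examines and relaxes to the elliptical case.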

Risks doi: 10.3390/risks6030091

Authors: Riccardo Gatto

In this article we introduce the stability analysis of a compound sum: it consists of computing the standardized variation of the survival function of the sum resulting from an infinitesimal perturbation of the common distribution of the summands. Stability analysis is complementary to the classical sensitivity analysis, which consists of computing the derivative of an important indicator of the model, with respect to a model parameter. We obtain a computational formula for this stability from the saddlepoint approximation. We apply the formula to the compound Poisson insurer loss with gamma individual claim amounts and to the compound geometric loss with Weibull individual claim amounts.

Risks doi: 10.3390/risks6030090

Authors: Jean-Pierre Fouque Zhaoyu Zhang

We study a toy model of a linear-quadratic mean field game with delay. We "lift" the delayed dynamics into an infinite-dimensional space and recast the mean field game system, which consists of a forward Kolmogorov equation and a backward Hamilton–Jacobi–Bellman equation. We identify the corresponding master equation. A solution to this master equation is computed, and we show that it provides an approximation to a Nash equilibrium of the finite-player game.

Risks doi: 10.3390/risks6030089

Authors: Jatin Malhotra Angelo Corelli

The paper analyzes the relationship between credit default swap (CDS) spreads for 5-year CDS in Europe and the US, and fundamental macroeconomic variables such as regional stock indices, oil prices, gold prices, and interest rates. The dataset covers multiple industry sectors in both economies and is split into two sections, before and after the global financial crisis. The analysis is carried out using a multivariate regression of each index on the macroeconomic variables, and a Granger causality test. Both approaches are performed on the changes in value of the variables involved. Results show that equity markets lead in price discovery and that there is bidirectional causality between interest rates and CDS spreads for most sectors involved. There is also bidirectional causality between stock and oil returns and CDS spreads.

Risks doi: 10.3390/risks6030088

Authors: Rui Fang Xiaohu Li

Co-risk measures and risk contribution measures have been introduced to evaluate the degree of interaction between paired risks in actuarial risk management. This paper studies the ordering behavior of such measures of interaction between paired risks. For various co-risk measures and risk contribution measures, we investigate how the marginal distributions and the dependence structure impact the level of interaction between paired risks. Several numerical examples based on Monte Carlo simulation are presented to illustrate the main findings.

Risks doi: 10.3390/risks6030087

Authors: Raluca Vernic

With the purpose of introducing dependence between different types of claims, multivariate collective models have recently gained a lot of attention. However, when it comes to evaluating the corresponding compound distribution, the difficulties increase with the dimensionality of the model. In this paper, we consider a multivariate collective model that generalizes a model whose distribution has already been studied from the point of view of recursive and FFT evaluation, and we extend the same study to the general model. To determine which method works better for this general model, we compare the recursive method with the FFT technique and emphasize the advantages and drawbacks of each, based on numerical examples.
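As a point of reference for the FFT side of the comparison, here is a minimal univariate sketch; the paper's models are multivariate, so this shows only the mechanics, and the numerical inputs are our assumptions. On an integer grid, the compound Poisson pmf is recovered by exponentiating the discrete Fourier transform of the severity pmf.

```python
# Compound Poisson distribution via FFT on a truncated integer grid:
#   g = IFFT( exp( lam * (FFT(f) - 1) ) ),
# where f is the claim-size pmf padded to grid length m. Truncation
# introduces aliasing, negligible when P(S >= m) is tiny.
import numpy as np

def compound_poisson_fft(lam, severity_pmf, m):
    f = np.zeros(m)
    f[:len(severity_pmf)] = severity_pmf
    fhat = np.fft.fft(f)
    ghat = np.exp(lam * (fhat - 1.0))
    return np.real(np.fft.ifft(ghat))

# Illustrative inputs: lam = 2, P(X=1) = 0.6, P(X=2) = 0.4, grid m = 64.
g = compound_poisson_fft(2.0, [0.0, 0.6, 0.4], 64)
print(g[:4])
```

The recursive (Panjer-type) alternative computes the same pmf in O(m^2) severity convolutions, which is exactly the trade-off against the O(m log m) FFT that the abstract's comparison weighs in higher dimensions.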

Risks doi: 10.3390/risks6030086

Authors: Fouad Marri Franck Adékambi Khouzeima Moutanabbir

In this paper, we study the discounted renewal aggregate claims with a full dependence structure. Based on a mixing exponential model, dependence among the inter-claim times, among the claim sizes, and between the inter-claim times and the claim sizes is included. The main contribution of this paper is the derivation of closed-form expressions for the higher moments of the discounted aggregate renewal claims. Explicit expressions of these moments are then provided for specific copula families, and some numerical illustrations are given to analyze the impact of dependency on the moments of the discounted aggregate amount of claims.

Risks doi: 10.3390/risks6030085

Authors: Mohamed Amine Lkabous Jean-François Renaud

In this short paper, we study a VaR-type risk measure introduced by Guérin and Renaud, which is based on cumulative Parisian ruin. We derive some properties of this risk measure and compare it to the risk measures of Trufin et al. and of Loisel and Trufin.

Risks doi: 10.3390/risks6030084

Authors: Holger Fink Andreas Fuest Henry Port

A functional ARMA-GARCH model for predicting the value-at-risk of the EURUSD exchange rate is introduced. The model uses the yield curve differentials between EUR and the US as exogenous factors. Functional principal component analysis allows us to use the information of essentially the whole yield curve in a parsimonious way for exchange rate risk prediction. The data analyzed in our empirical study consist of the EURUSD exchange rate and the EUR- and US-yield curves from 15 August 2005 to 30 September 2016. As benchmarks, we take an ARMA-GARCH model and an ARMAX-GARCHX model with the 2y-yield difference as the exogenous variable, and compare the forecasting performance via likelihood ratio tests. While our model performs better in one setting, it does not seem to improve on its competitors in the other setups.

Risks doi: 10.3390/risks6030083

Authors: Michelle Xia

In this paper, we study the problem of misrepresentation under heavy-tailed regression models with the presence of both misrepresented and correctly-measured risk factors. Misrepresentation is a type of fraud in which a policy applicant gives a false statement on a risk factor that determines the insurance premium. Under the regression context, we introduce heavy-tailed misrepresentation models based on the lognormal, Weibull and Pareto distributions. The proposed models allow insurance modelers to identify risk characteristics associated with the misrepresentation risk, by imposing a latent logit model on the prevalence of misrepresentation. We prove the theoretical identifiability and implement the models using Bayesian Markov chain Monte Carlo techniques. The model performance is evaluated through both simulated data and real data from the Medical Expenditure Panel Survey. The simulation study confirms the consistency of the Bayesian estimators in large samples, whereas the case study demonstrates the necessity of the proposed models for real applications when the losses exhibit heavy-tailed features.

Risks doi: 10.3390/risks6030082

Authors: Giuseppe Montesi Giovanni Papiro

We present a stochastic simulation forecasting model for stress testing that is aimed at assessing banks' capital adequacy, financial fragility, and probability of default. The paper provides a theoretical presentation of the methodology and the essential features of the forecasting model on which it is based. For illustrative purposes, and to show in practical terms how to apply the methodology and the types of outcomes and analysis that can be obtained, we also report the results of an empirical application of the proposed methodology to the Global Systemically Important Banks (G-SIBs). The results of the stress test exercise are compared with the results of the supervisory stress tests performed in 2014 by the Federal Reserve and the EBA/ECB.

Risks doi: 10.3390/risks6030081

Authors: Marjolein Van Rooijen Chaw-Yin Myint Milena Pavlova Wim Groot

(1) Background: Health insurance and social protection in Myanmar are negligible, which leaves many citizens at risk of financial hardship in case of a serious illness. The aim of this study is to explore the views of healthcare consumers and compare them to the views of key informants on the design and implementation of a nationwide health insurance system in Myanmar. (2) Method: Data were collected through nine focus group discussions with healthcare consumers and six semi-structured interviews with key health system informants. (3) Results: The consumers supported a mandatory basic health insurance and voluntary supplementary health insurance. Tax-based funding was suggested as an option that can help to enhance healthcare utilization among the poor and vulnerable groups. However, a fully tax-based funding was perceived to have limited chances of success given the low level of government resources available. Community-based insurance, where community members pool money in a healthcare fund, was seen as more appropriate for the rural areas. (4) Conclusion: This study suggests a healthcare financing mechanism based on a mixed insurance model for the creation of nationwide health insurance. Further inquiry into the feasibility of the vital aspects of the nationwide health insurance is needed.

Risks doi: 10.3390/risks6030080

Authors: Martin Ewen

This paper examines the impact of volatility-based fund classification on portfolio performance. Using historical data on equity indices, we find that a strategy based on long-term portfolio volatility, as is imposed by the Synthetic Risk Reward Indicator (SRRI), yields better Sharpe Ratios (SR) and Buy and Hold Returns (BHR) than passive investments. However, accounting for the Fama–French factors in the historical data reveals no significant alphas for the vast majority of the strategies. Further analyses, conducted by running a simulation study based on a GJR(1,1) model, show no significant difference in mean returns but significantly lower SRs for the volatility-based strategies. This evidence suggests that neither the higher leverage induced by the SRRI nor the potential protection in downside markets pays off on a risk-adjusted basis.
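To make the simulation design concrete, here is a hedged sketch of a GJR(1,1) return generator; the parameter values and seed are our assumptions, not the paper's calibration. The GJR variance recursion adds an extra ARCH loading γ when the previous return is negative, capturing the leverage effect.

```python
# GJR(1,1) simulation:
#   sigma2_t = omega + (alpha + gamma * 1{r_{t-1} < 0}) * r_{t-1}^2 + beta * sigma2_{t-1}
# Parameters are illustrative; persistence alpha + gamma/2 + beta < 1
# gives a finite unconditional variance used to initialize the recursion.
import numpy as np

def simulate_gjr(n, omega=1e-5, alpha=0.05, gamma=0.08, beta=0.90, seed=1):
    rng = np.random.default_rng(seed)
    r = np.empty(n)
    sigma2 = omega / (1.0 - alpha - 0.5 * gamma - beta)  # unconditional variance
    for t in range(n):
        r[t] = np.sqrt(sigma2) * rng.standard_normal()
        leverage = gamma if r[t] < 0 else 0.0
        sigma2 = omega + (alpha + leverage) * r[t] ** 2 + beta * sigma2
    return r

returns = simulate_gjr(2500)
print(returns.std())
```

Feeding such simulated paths to the SRRI-style volatility classification and to a passive benchmark is one way to reproduce the kind of mean-return and Sharpe-ratio comparison the abstract reports.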
