
Risks, Volume 8, Issue 2 (June 2020) – 35 articles

Cover Story: The insureds’ decisions to opt for a specific insurance plan and level of deductible depend on observed and unobserved characteristics. The aim of the research by Kalouguina and Wagner is to understand the correlation between insurance plan choices and lifestyle through health and medical care consumption in the setting of Swiss mandatory health insurance.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view papers in PDF format, click on the "PDF Full-text" link and use the free Adobe Reader to open them.
Open AccessArticle
No-Arbitrage Principle in Conic Finance
Risks 2020, 8(2), 66; https://doi.org/10.3390/risks8020066 - 19 Jun 2020
Viewed by 828
Abstract
In a one price economy, the Fundamental Theorem of Asset Pricing (FTAP) establishes that no-arbitrage is equivalent to the existence of an equivalent martingale measure. Such an equivalent measure can be derived as the normal unit vector of the hyperplane that separates the attainable gain subspace and the convex cone representing arbitrage opportunities. However, in two-price financial models (where there is a bid–ask price spread), the set of attainable gains is no longer a subspace. We use convex optimization and the conic property of this region to characterize the no-arbitrage principle in financial models with a bid–ask price spread. This characterization leads us to the generation of a set of price factor random variables. Under such a set, we can find the lower and upper bounds (super-hedging and sub-hedging bounds) for the price of any future cash flow. We show that for any given cash flow whose price lies outside this range, we can build a trading strategy that provides an arbitrage opportunity. We generalize this structure to any two-price finite-period financial model.
(This article belongs to the Special Issue Portfolio Optimization, Risk and Factor Analysis)
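The price-bound idea in the abstract can be sketched numerically: in a two-price model, the sub- and super-hedging bounds arise as the extremes of expectations over a set of admissible pricing measures. A minimal discrete sketch follows; the payoff and the two measures are hypothetical illustrative numbers, not from the paper.

```python
# One-period cash flow over three states (hypothetical numbers).
payoff = [0.0, 10.0, 20.0]
# Two admissible "price factor" measures (hypothetical); in the paper such
# a set is generated from the conic no-arbitrage characterization.
measures = [[0.5, 0.3, 0.2], [0.2, 0.5, 0.3]]
prices = [sum(p * q for p, q in zip(payoff, m)) for m in measures]
lower, upper = min(prices), max(prices)  # sub- and super-hedging bounds
```

Any quoted price outside `[lower, upper]` would admit an arbitrage strategy, which is the abstract's main claim.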
Open AccessArticle
A New Approach to Risk Attribution and Its Application in Credit Risk Analysis
Risks 2020, 8(2), 65; https://doi.org/10.3390/risks8020065 - 16 Jun 2020
Cited by 2 | Viewed by 1017
Abstract
How can the risk of a company be allocated to its divisions and attributed to risk factors? The Euler principle allows for an economically justified allocation of risk to different divisions. We introduce a method that generalizes the Euler principle to attribute risk to its driving factors when these factors affect losses in a nonlinear way. The method splits loss contributions over time and is straightforward to implement. We show in an example how this risk decomposition can be applied in the context of credit risk.
(This article belongs to the Special Issue Model Risk and Risk Measures)
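For context, the classical Euler principle that the paper generalizes can be stated in two lines for a standard-deviation risk measure: each division's contribution is its covariance with the aggregate loss divided by the aggregate's standard deviation, and the contributions sum exactly to the total risk. The covariance matrix below is hypothetical.

```python
import numpy as np

# Hypothetical covariance matrix of three divisions' losses.
cov = np.array([[4.0, 1.0, 0.5],
                [1.0, 9.0, 2.0],
                [0.5, 2.0, 1.0]])
sd_total = np.sqrt(cov.sum())        # risk (sd) of the aggregate loss
euler = cov.sum(axis=1) / sd_total   # Euler contribution of each division
assert np.isclose(euler.sum(), sd_total)  # full allocation property
```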
Open AccessArticle
A Multi-State Approach to Modelling Intermediate Events and Multiple Mortgage Loan Outcomes
Risks 2020, 8(2), 64; https://doi.org/10.3390/risks8020064 - 10 Jun 2020
Cited by 3 | Viewed by 1163
Abstract
This paper proposes a novel system-wide multi-state framework to model state occupations and the transitions among current, delinquency, default, prepayment, repurchase, short sale and foreclosure on mortgage loans. The approach allows for the modelling of the progression of borrowers from one state to another to fully understand the risks of a cohort of borrowers over time. We use a multi-state Markov model to model the transitions to and from various states. The key factors affecting the transition into various loan outcomes are the ability to pay as measured by debt-to-income ratio, equity as marked by loan-to-value ratio, interest rates and the property type. Our findings have broader policy implications for better decision-making on granting loans and the design of debt relief and mortgage modification policies.
(This article belongs to the Special Issue Credit Risk Modeling and Management in Banking Business)
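The mechanics of a multi-state Markov loan model can be illustrated in a few lines: given a one-period transition matrix, state occupation probabilities at any horizon follow from matrix powers. The state space is simplified and the transition probabilities are hypothetical, not the paper's estimates.

```python
import numpy as np

# Hypothetical monthly transition matrix over a simplified state space
# [current, delinquent, default, prepaid]; default and prepaid absorbing.
P = np.array([[0.95, 0.04, 0.00, 0.01],
              [0.30, 0.55, 0.15, 0.00],
              [0.00, 0.00, 1.00, 0.00],
              [0.00, 0.00, 0.00, 1.00]])
start = np.array([1.0, 0.0, 0.0, 0.0])        # cohort starts current
occ = start @ np.linalg.matrix_power(P, 24)   # occupations after 24 months
```

Covariates such as debt-to-income or loan-to-value would, in the paper's setting, make the transition intensities borrower-specific rather than constant.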
Open AccessArticle
A Raroc Valuation Scheme for Loans and Its Application in Loan Origination
Risks 2020, 8(2), 63; https://doi.org/10.3390/risks8020063 - 10 Jun 2020
Cited by 1 | Viewed by 996
Abstract
In this article, a risk-adjusted return on capital (RAROC) valuation scheme for loans is derived. The critical assumption throughout the article is that no market information on a borrower’s credit quality, such as bond or CDS (Credit Default Swap) spreads, is available. Therefore, market-based approaches are not applicable, and an alternative combining market and statistical information is needed. The valuation scheme aims to derive the individual cost components of a loan, which facilitates their allocation to a bank’s operational units. After its introduction, a theoretical analysis of the scheme linking the level of interest rates and borrower default probabilities shows that a bank should only originate a loan when the interest rate a borrower is willing to accept is inside the profitability range for this client. This range depends on a bank’s internal profitability target; it is always a finite interval and may even be empty if a borrower’s credit quality is too low. Aside from analyzing the theoretical properties of the scheme, we show how it can be directly applied in the daily loan origination process of a bank.
(This article belongs to the Special Issue Quantitative Methods in Economics and Finance)
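The origination rule described above can be sketched with a textbook RAROC calculation: net income per unit notional over economic capital, compared against the bank's hurdle rate. All cost components and the hurdle below are hypothetical illustrative numbers, not the paper's scheme.

```python
# Hypothetical per-unit-notional cost components (illustrative only).
rate          = 0.045          # interest rate the borrower accepts
funding_cost  = 0.020
op_cost       = 0.003
prob_default, lgd = 0.01, 0.45
expected_loss = prob_default * lgd
capital_ratio = 0.08           # economic capital per unit notional
hurdle        = 0.10           # bank's internal profitability target

raroc = (rate - funding_cost - op_cost - expected_loss) / capital_ratio
originate = raroc >= hurdle    # accept only inside the profitability range
```

Solving `raroc == hurdle` for `rate` gives the lower end of the profitability range; a maximum rate the borrower can bear closes the interval, matching the "finite or empty interval" statement.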
Open AccessArticle
Hedging with Liquidity Risk under CEV Diffusion
Risks 2020, 8(2), 62; https://doi.org/10.3390/risks8020062 - 05 Jun 2020
Viewed by 821
Abstract
We study a discrete time hedging and pricing problem in a market with liquidity risk. We consider a discrete version of the constant elasticity of variance (CEV) model by applying Leland’s discrete time replication scheme. The pricing equation becomes a nonlinear partial differential equation, which we solve by a multi-scale perturbation method. A numerical example is provided.
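For readers unfamiliar with the CEV model, its defining feature is that the diffusion coefficient scales as a power of the asset price, dS = μS dt + σ S^β dW. A simple Euler–Maruyama simulation of one path is sketched below; the parameters are hypothetical, and this is the plain continuous-time model, not the paper's discrete Leland replication scheme.

```python
import math
import random

# Euler-Maruyama simulation of CEV dynamics dS = mu*S dt + sigma*S**beta dW
# with hypothetical parameters (beta < 1 gives the leverage effect).
random.seed(1)
mu, sigma, beta = 0.05, 0.2, 0.7
S, dt = 100.0, 1.0 / 252.0
path = [S]
for _ in range(252):                       # one year of daily steps
    dW = math.sqrt(dt) * random.gauss(0.0, 1.0)
    S = max(S + mu * S * dt + sigma * (S ** beta) * dW, 0.0)
    path.append(S)
```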
Open AccessArticle
A Multivariate Model to Quantify and Mitigate Cybersecurity Risk
Risks 2020, 8(2), 61; https://doi.org/10.3390/risks8020061 - 04 Jun 2020
Cited by 2 | Viewed by 1046
Abstract
The cost of cybersecurity incidents is large and growing. However, conventional methods for measuring loss and choosing mitigation strategies use simplifying assumptions and are often not supported by cyber attack data. In this paper, we present a multivariate model for different, dependent types of attack and the effect of mitigation strategies on those attacks. Utilising collected cyber attack data and assumptions on mitigation approaches, we look at an example of using the model to optimise the choice of mitigations. We find that the optimal choice of mitigations will depend on the goal—to prevent extreme damages or damage on average. Numerical experiments suggest the dependence aspect is important and can alter final risk estimates by as much as 30%. The methodology can be used to quantify the cost of cyber attacks and support decision making on the choice of optimal mitigation strategies.
(This article belongs to the Special Issue Cyber Risk and Security)
Open AccessFeature PaperArticle
A Bank Salvage Model by Impulse Stochastic Controls
Risks 2020, 8(2), 60; https://doi.org/10.3390/risks8020060 - 04 Jun 2020
Viewed by 799
Abstract
The present paper is devoted to the study of a bank salvage model with a finite time horizon that is subjected to stochastic impulse controls. In our model, the bank’s default time is a completely inaccessible random quantity generating its own filtration, thus reflecting the unpredictability of the event itself. In this framework, the main goal is to minimize the total cost of the central controller, which can inject capital to save the bank from default. We address the latter task, showing that the corresponding quasi-variational inequality (QVI) admits a unique viscosity solution, Lipschitz continuous in space and Hölder continuous in time. Furthermore, under mild assumptions on the dynamics, the smooth-fit W^{(1,2),p}_{loc} property is achieved for any 1 < p < +∞.
Open AccessFeature PaperArticle
How Does the Volatility of Volatility Depend on Volatility?
Risks 2020, 8(2), 59; https://doi.org/10.3390/risks8020059 - 03 Jun 2020
Cited by 1 | Viewed by 893
Abstract
We investigate the state dependence of the variance of the instantaneous variance of the S&P 500 index empirically. Time-series analysis of realized variance over a 20-year period shows strong evidence of an elasticity of variance of the variance parameter close to that of a log-normal model, albeit with an empirical autocorrelation function that one-factor diffusion models fail to capture at horizons above a few weeks. When studying option market behavior (in-sample pricing as well as out-of-sample pricing and hedging over the period 2004–2019), messages are mixed, but systematic, model-wise. The log-normal but drift-free SABR (stochastic-alpha-beta-rho) model performs best for short-term options (times-to-expiry of three months and below), the Heston model—in which variance is stationary but not log-normal—is superior for long-term options, and a mixture of the two models does not lead to improvements.
Open AccessFeature PaperArticle
Risk and Policy Uncertainty on Stock–Bond Return Correlations: Evidence from the US Markets
Risks 2020, 8(2), 58; https://doi.org/10.3390/risks8020058 - 01 Jun 2020
Cited by 2 | Viewed by 828
Abstract
This paper investigates dynamic correlations of stock–bond returns for different stock indices and bond maturities. Evidence in the US shows that stock–bond relations are time-varying and display a negative trend. The stock–bond correlations are negatively correlated with implied volatilities in stock and bond markets. Tests show that stock–bond relations are positively correlated with economic policy uncertainty, but negatively correlated with monetary policy and fiscal policy uncertainties. Correlation coefficients between stock and bond returns are positively related to total policy uncertainty for returns of the Dow-Jones Industrial Average (DJIA) and the S&P 500 Value stock index (VALUE), but negatively correlated with returns of the S&P 500 (total market), the NASDAQ Composite Index (NASDAQ), and the RUSSELL 2000 (RUSSELL).
(This article belongs to the Special Issue Risks: Feature Papers 2020)
Open AccessFeature PaperArticle
Heads and Tails of Earnings Management: Quantitative Analysis in Emerging Countries
Risks 2020, 8(2), 57; https://doi.org/10.3390/risks8020057 - 01 Jun 2020
Cited by 11 | Viewed by 1023
Abstract
Earnings management is a tool used globally by long-term profitable enterprises and as an apparatus for reducing bankruptcy risk in developed countries. This phenomenon is an integral and fundamental part of their business finance. However, it has received less attention in emerging countries. Models for detecting the existence of earnings management are based on discretionary accruals. The goal of this article is to detect the existence of earnings management in emerging countries by time series analysis. This econometric investigation uses observations of earnings before interest and taxes of 1089 Slovak enterprises and 1421 Bulgarian enterprises in financial modelling. Our findings confirm the significant existence of earnings management in both analyzed countries, based on a quantitative analysis of unit root and stationarity. The managerial activities are purposeful, which is proven by the non-stationarity of the time series and a clear occurrence of the unit root. In addition, the results highlight the year 2014 as a significant milestone of change in the development of earnings management in both countries, based on homogeneity analyses. These facts identify significant parallels between Slovak and Bulgarian economics and business finance.
(This article belongs to the Special Issue Quantitative Methods in Economics and Finance)
Open AccessArticle
Copula Model Selection for Vehicle Component Failures Based on Warranty Claims
Risks 2020, 8(2), 56; https://doi.org/10.3390/risks8020056 - 01 Jun 2020
Cited by 2 | Viewed by 612
Abstract
In the automotive industry, it is important to know whether the failure of some car parts may be related to the failure of others. This project studies warranty claims for five engine components obtained from a major car manufacturer with the purpose of modeling the joint distributions of the failure of two parts. The one-dimensional distributions of components are combined to construct a bivariate copula model for the joint distribution that makes it possible to estimate the probabilities of two components failing before a given time. Ultimately, the influence of the failure of one part on the operation of another related part can be described, predicted, and addressed. The performance of several families of one-parameter Archimedean copula models (Clayton, Gumbel–Hougaard, survival copulas) is analyzed, and Bayesian model selection is performed. Both right censoring and conditional approaches are considered with the emphasis on conditioning to the warranty period.
(This article belongs to the Special Issue Young Researchers in Insurance and Risk Management)
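To make the copula construction concrete: once marginal failure probabilities within the warranty period are known, a one-parameter Archimedean copula such as the Clayton family gives the joint probability that both components fail. The marginal probabilities and theta below are hypothetical, not estimates from the warranty data.

```python
# Clayton copula C(u, v) = (u**-theta + v**-theta - 1)**(-1/theta);
# its lower tail dependence makes joint early failures more likely
# than under independence.
def clayton_cdf(u, v, theta):
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

# Hypothetical marginal failure probabilities within the warranty period.
p1, p2 = 0.05, 0.08
joint = clayton_cdf(p1, p2, theta=2.0)   # P(both parts fail)
assert joint > p1 * p2                   # dependence beats independence
```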
Open AccessArticle
Copula-Based Assessment of Co-Movement and Tail Dependence Structure Among Major Trading Foreign Currencies in Ghana
Risks 2020, 8(2), 55; https://doi.org/10.3390/risks8020055 - 01 Jun 2020
Viewed by 839
Abstract
This paper examines the joint movement and tail dependence structure between pairs of foreign exchange rates (EUR, USD and GBP) against the GHS, using daily exchange rate data expressed in GHS per unit of foreign currency over the period 24 February 2009 to 19 December 2019. We use different sets of both static (time-invariant) and time-varying copulas with different levels of dependence and tail dependence measures. The results reveal positive dependence between all exchange rate pairs, though the dependencies for the EUR-USD and GBP-USD pairs are not as strong as for the EUR-GBP pair. The findings also reveal symmetric tail dependence, and dependence evolves over time. Notwithstanding this, the asymmetric tail dependence copulas provide evidence of upper tail dependence. We compare the copula results to those of a DCC(1,1)-GARCH(1,1) model and find the copulas to be more sensitive to extreme co-movement between the currency pairs. These findings, therefore, offer forex market players the opportunity to relax their hoarding of a particular foreign currency in anticipation of domestic currency depreciation.
(This article belongs to the Special Issue Credit Risk Modeling and Management in Banking Business)
Open AccessEditorial
Special Issue “Machine Learning in Insurance”
Risks 2020, 8(2), 54; https://doi.org/10.3390/risks8020054 - 25 May 2020
Cited by 1 | Viewed by 796
Abstract
It is our pleasure to prologue the special issue on “Machine Learning in Insurance”, which represents a compilation of ten high-quality articles discussing avant-garde developments or introducing new theoretical or practical advances in this field [...]
(This article belongs to the Special Issue Machine Learning in Insurance)
Open AccessArticle
Ruin Probability for Stochastic Flows of Financial Contract under Phase-Type Distribution
Risks 2020, 8(2), 53; https://doi.org/10.3390/risks8020053 - 22 May 2020
Viewed by 792
Abstract
This paper examines the impact of the parameters of the distribution of the time at which a bank’s client defaults on their obligated payments on the Lundberg adjustment coefficient and the upper and lower bounds of the ruin probability. We study the corresponding ruin probability on the assumption of (i) a phase-type distribution for the time at which default occurs and (ii) an embedding of the stochastic cash flow or the reserves of the bank into the Sparre Andersen model. The exact analytical expression for the ruin probability is not tractable under these assumptions, so Cramér–Lundberg-type bounds are obtained for the ruin probabilities, with concomitant explicit equations for the calculation of the adjustment coefficient. Finally, we provide some numerical illustrations.
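As background on the adjustment coefficient the abstract refers to: in the classical Cramér–Lundberg setting it is the positive root R of λ(M_X(r) − 1) = cr, and it yields the Lundberg bound ψ(u) ≤ e^(−Ru) on the ruin probability. The sketch below solves this equation by bisection for exponential claim sizes (a simpler case than the paper's phase-type setting); all parameters are illustrative.

```python
# Bisection for the Lundberg adjustment coefficient R solving
# lam * (M_X(r) - 1) = c * r, with Exp(beta) claim sizes so that
# M_X(r) = beta / (beta - r) for r < beta. Parameters are hypothetical.
def adjustment_coefficient(lam, c, beta):
    f = lambda r: lam * (beta / (beta - r) - 1.0) - c * r
    lo, hi = 1e-9, beta - 1e-9     # f < 0 near 0, f > 0 near beta
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) > 0.0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

R = adjustment_coefficient(lam=1.0, c=1.0, beta=2.0)
# Closed form for exponential claims: R = beta - lam / c = 1.0 here,
# so the ruin probability satisfies psi(u) <= exp(-u).
```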
Open AccessArticle
Bankruptcy Prediction and Stress Quantification Using Support Vector Machine: Evidence from Indian Banks
Risks 2020, 8(2), 52; https://doi.org/10.3390/risks8020052 - 22 May 2020
Cited by 2 | Viewed by 925
Abstract
Banks play a vital role in strengthening the financial system of a country; hence, their survival is decisive for the stability of national economies. Therefore, analyzing the survival probability of banks is an essential and continuing research activity. However, the current literature indicates that research on banks’ stress quantification is limited in countries like India, where there have been fewer failed banks. The literature also indicates a lack of scientific and quantitative approaches that can be used to predict bank survival and failure probabilities. Against this backdrop, the present study attempts to establish a bankruptcy prediction model using a machine learning approach and to compute and compare the financial stress that banks face. The study uses data on failed and surviving private and public sector banks in India for the period January 2000 through December 2017. The explanatory features of bank failure are chosen using a two-step feature selection technique. First, a Relief algorithm is used for primary screening of useful features; in the second step, important features are fed into a support vector machine to create the forecasting model. The threshold values of the features for the decision boundary that separates failed banks from surviving banks are calculated using the decision boundary of a support vector machine with a linear kernel. The results reveal, inter alia, that the support vector machine with a linear kernel shows 92.86% forecasting accuracy, while a support vector machine with a radial basis function kernel shows 71.43% accuracy. The study helps to carry out comparative analyses of the financial stress of banks and has significant implications for the decisions of various stakeholders such as shareholders, management of the banks, analysts, and policymakers.
(This article belongs to the Special Issue Credit Risk Modeling and Management in Banking Business)
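The "threshold values from the decision boundary" step can be illustrated directly: for a fitted linear SVM with decision function f(x) = w·x + b, the threshold of one feature (holding the others at their means) is the value that puts x exactly on the boundary f(x) = 0. The weights, intercept and feature means below are hypothetical, not the paper's fitted model.

```python
import numpy as np

# Hypothetical fitted linear SVM: f(x) = w @ x + b separates failed
# banks (f > 0, say) from surviving banks (f < 0).
w = np.array([1.5, -2.0, 0.5])
b = -0.3
means = np.array([0.2, 0.1, 0.4])   # hypothetical feature means

i = 0                               # feature whose threshold we want
threshold = -(b + w @ means - w[i] * means[i]) / w[i]
x = means.copy()
x[i] = threshold
assert abs(w @ x + b) < 1e-9        # x sits exactly on the boundary
```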
Open AccessArticle
Diversification and Desynchronicity: An Organizational Portfolio Perspective on Corporate Risk Reduction
Risks 2020, 8(2), 51; https://doi.org/10.3390/risks8020051 - 22 May 2020
Cited by 5 | Viewed by 1126
Abstract
A longstanding objective of managers is to reduce risk to their businesses. The conventional strategy for risk reduction is diversification; however, evidence for the effectiveness of diversification remains inconclusive. According to Organizational Portfolio Analysis, firms are viewed as portfolios of business units, and the key to risk reduction is both diversification and synchronization compensation. This study introduces “desynchronicity”, a process that operationalizes synchronization compensation by assessing the degree of correlation between income streams of business units. Two samples of 737 and 332 firms (from COMPUSTAT) were used to empirically test the relationships between diversification and risk, and desynchronicity and risk. The results show that diversification alone will not always lead to a lower corporate risk. To reduce risk, firms also need to consider the desynchronicity of their business portfolios. Other practical implications include improved decisions on portfolio composition.
Open AccessFeature PaperArticle
Machine Learning for Multiple Yield Curve Markets: Fast Calibration in the Gaussian Affine Framework
Risks 2020, 8(2), 50; https://doi.org/10.3390/risks8020050 - 21 May 2020
Viewed by 912
Abstract
Calibration is a highly challenging task, in particular in multiple yield curve markets. This paper is a first attempt to study the chances and challenges of applying machine learning techniques to it. We employ Gaussian process regression, a machine learning methodology with many similarities to extended Kalman filtering, which has been applied many times to interest rate markets and term structure models. We find very good results for single-curve markets and many challenges for multi-curve markets in a Vasiček framework. The Gaussian process regression is implemented with the Adam optimizer and the non-linear conjugate gradient method, of which the latter performs best. We also point towards future research.
(This article belongs to the Special Issue Machine Learning in Finance, Insurance and Risk Management)
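For readers new to Gaussian process regression, the core computation is small: the posterior mean at test points is a kernel-weighted combination of the training targets. A generic one-dimensional sketch with a squared-exponential kernel follows; this illustrates the method itself, not the paper's yield-curve calibration setup or its Adam/conjugate-gradient hyperparameter optimization.

```python
import numpy as np

# Minimal GP regression posterior mean with a squared-exponential kernel.
def gp_mean(X, y, Xstar, length=1.0, sig=1.0, noise=1e-8):
    def k(A, B):
        d = A[:, None] - B[None, :]
        return sig ** 2 * np.exp(-0.5 * (d / length) ** 2)
    K = k(X, X) + noise * np.eye(len(X))   # jitter for stability
    return k(Xstar, X) @ np.linalg.solve(K, y)

X = np.array([0.0, 1.0, 2.0])              # toy training inputs
y = np.sin(X)
m = gp_mean(X, y, np.array([1.0]))
# With near-zero noise, the GP nearly interpolates the data point at x=1.
```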
Open AccessArticle
Proactive Management of Regulatory Policy Ripple Effects via a Computational Hierarchical Change Management Structure
Risks 2020, 8(2), 49; https://doi.org/10.3390/risks8020049 - 21 May 2020
Cited by 1 | Viewed by 746
Abstract
The paper proposes a novel computational impact analysis framework to proactively manage dynamic constraints and optimally promote the inception of central banks’ regulatory policies. Currently, central banks encounter contradictory challenges in developing and implementing regulatory policy. These constraints mainly comprise incomplete or anomalous information (information asymmetry) and very tight temporal and resource limitations (bounded rationality) when the efficiency of a policy is determined at a system level. The complex relationships of the policy attributes and their interactions generate very dynamic emergent behaviours due to the complex causal relationships. This paper adopted and tailored the hierarchical change management structure framework to design a first-step framework called ‘computational regulatory policy change governance’. The methodology uses interviews, a focus-group workshop and the application of empirical data. The results of the evaluation and case study validate its applicability in computing policy parameters and the impacts of their interactions. The evaluation of the framework achieved a remarkable score, averaging a 130 per cent improvement compared to existing methods. However, the research used a single case study, and its outcomes require further evaluation and testing. Accordingly, we invite regulators, banks, scholars and practitioners to explore the uniqueness and features of the proposed framework.
Open AccessArticle
Testing the Least-Squares Monte Carlo Method for the Evaluation of Capital Requirements in Life Insurance
Risks 2020, 8(2), 48; https://doi.org/10.3390/risks8020048 - 18 May 2020
Viewed by 822
Abstract
In this paper, we test the efficiency of the least-squares Monte Carlo method for estimating capital requirements in life insurance. We choose a simplified Gaussian evaluation framework where closed-form formulas are available and allow us to obtain solid benchmarks. Extensive numerical experiments were conducted by considering different combinations of simulation runs and basis functions, and the corresponding results are illustrated.
(This article belongs to the Special Issue Model Risk and Risk Measures)
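The least-squares Monte Carlo idea can be sketched in a few lines: instead of nested simulation, regress noisy one-sample inner valuations on the outer risk factors, then take a tail quantile of the fitted conditional loss as a capital proxy. The setup below (quadratic true loss, Gaussian risk factor, polynomial basis) is a hypothetical toy, not the paper's Gaussian framework.

```python
import numpy as np

# LSMC sketch: approximate E[loss | x] by least-squares regression on a
# polynomial basis, then read off a high quantile as a capital proxy.
rng = np.random.default_rng(0)
x = rng.normal(size=5000)                        # outer scenarios at t = 1
y = x ** 2 + rng.normal(scale=0.5, size=x.size)  # one noisy inner sample each
coef = np.polynomial.polynomial.polyfit(x, y, deg=2)   # basis: 1, x, x^2
fitted = np.polynomial.polynomial.polyval(x, coef)     # E[loss | x] estimate
capital = np.quantile(fitted, 0.99)              # 99% quantile of fitted loss
```

The regression should recover the quadratic coefficient close to 1, which is the kind of benchmark comparison (against closed forms) the paper performs.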
Open AccessFeature PaperArticle
Implementing the Rearrangement Algorithm: An Example from Computational Risk Management
Risks 2020, 8(2), 47; https://doi.org/10.3390/risks8020047 - 14 May 2020
Viewed by 866
Abstract
After a brief overview of aspects of computational risk management, the implementation of the rearrangement algorithm in R is considered as an example from computational risk management practice. This algorithm is used to compute the largest quantile (worst value-at-risk) of the sum of the components of a random vector with specified marginal distributions. It is demonstrated how a basic implementation of the rearrangement algorithm can gradually be improved to provide a fast and reliable computational solution to the problem of computing worst value-at-risk. Besides a running example, an example based on real-life data is considered. Bootstrap confidence intervals for the worst value-at-risk as well as a basic worst value-at-risk allocation principle are introduced. The paper concludes with selected lessons learned from this experience.
(This article belongs to the Special Issue Computational Risk Management)
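A basic version of the rearrangement algorithm fits in a dozen lines: discretize the upper tail of each marginal into a column, then repeatedly oppositely order each column with respect to the sum of the others so that the minimal row sum (the worst-VaR estimate) increases. The paper's implementation is in R with further refinements; the Python sketch below uses hypothetical Pareto(2) marginals.

```python
import numpy as np

# Basic rearrangement algorithm: oppositely order each column w.r.t. the
# sum of the other columns until the minimal row sum stops improving.
def rearrange(X, max_iter=50):
    X = X.copy()
    best = X.sum(axis=1).min()
    for _ in range(max_iter):
        for j in range(X.shape[1]):
            rest = X.sum(axis=1) - X[:, j]
            # sort column j in the opposite order of the other columns' sum
            X[np.argsort(rest), j] = np.sort(X[:, j])[::-1]
        new_best = X.sum(axis=1).min()
        if new_best <= best:
            break
        best = new_best
    return best     # lower estimate of worst VaR at the chosen level

# Discretized upper 5% tail of three identical Pareto(2) marginals.
N, alpha = 64, 0.95
p = alpha + (np.arange(N) + 0.5) / N * (1 - alpha)
col = (1 - p) ** -0.5 - 1.0            # Pareto(2) quantile function
X = np.column_stack([col, col, col])
worst_var = rearrange(X)               # estimate of worst VaR_0.95 of the sum
```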
Open AccessArticle
Die Hard: Probability of Default and Soft Information
Risks 2020, 8(2), 46; https://doi.org/10.3390/risks8020046 - 13 May 2020
Cited by 3 | Viewed by 806
Abstract
The research aims to verify whether the credit risk of small and medium-sized enterprises can be estimated more accurately using qualitative variables together with financial information from reports. In our paper, we select qualitative variables within the conceptual framework of the balanced scorecard to assess the credit quality of Italian companies of various sizes, from micro to medium. Data were collected to estimate the company’s resilience following the shock of the financial crisis of 2007–2008. The analysis based on customer size, processes, knowledge, and corporate finance, synthesized with balanced scorecard methodology, allows us to estimate the resilience of companies in a period of crisis. The research highlights the important contribution of qualitative variables for the estimation of credit risk. The implications concern both financial intermediaries and their supervisory functions, and regulators for rating models based on soft forward and countercyclical variables.
(This article belongs to the Special Issue Credit Risk Modeling and Management in Banking Business)
Open Access Article
Towards an Economic Cyber Loss Index for Parametric Cover Based on IT Security Indicator: A Preliminary Analysis
Risks 2020, 8(2), 45; https://doi.org/10.3390/risks8020045 - 08 May 2020
Cited by 1 | Viewed by 900
Abstract
As cyber events have virtually no geographical limitations and can result in economic losses on a global scale, the assessment of return periods for such economic losses is currently debated among experts. The potential accumulation of consequential insurance losses due to intrusions or viruses is one of the major reasons why the (re-)insurance industry has limited risk appetite for cyber related risks. In order to increase the risk appetite for cyber risk, and based on a first batch of data provided by Symantec, the goal of this article is twofold: first, to check whether IT activity, i.e., the number of viruses or intrusions blocked by Norton on end-user computers, could be used as an index for parametric covers that reinsurance companies could propose to their cedants; and second, to look into the correlations of this IT activity across different regions, thereby confirming the absence of geographical limitations for cyber risk, and hence its systemic nature. This first study on the Symantec dataset shows that a cyber index based on IT activity could be a useful tool for designing parametric reinsurance products. Full article
(This article belongs to the Special Issue Cyber Risk and Security)
Open Access Article
Technical Analysis on the Bitcoin Market: Trading Opportunities or Investors’ Pitfall?
Risks 2020, 8(2), 44; https://doi.org/10.3390/risks8020044 - 06 May 2020
Cited by 4 | Viewed by 1260
Abstract
In this paper, we examine the profitability of technical trading rules in the Bitcoin market using trend-following and mean-reverting strategies. We apply our strategies to the Bitcoin price series sampled both at 5-min intervals and on a daily basis, over the period from 1 January 2012 to 20 August 2019. Our findings suggest that, overall, trading on daily data is more profitable than going intraday. Furthermore, we conclude that the Buy and Hold strategy outperforms the examined alternatives on an intraday basis, while Simple Moving Averages yield the best performance when dealing with daily data. Full article
(This article belongs to the Special Issue Financial Networks in Fintech Risk Management)
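A Simple Moving Average rule of the kind the abstract above finds best on daily data can be sketched as follows; the price path, window lengths, and function names here are hypothetical illustrations, not the authors' setup:

```python
import numpy as np

def sma(x, n):
    """Simple moving average; entries before index n-1 are NaN (warm-up)."""
    out = np.full(len(x), np.nan)
    c = np.cumsum(np.insert(x, 0, 0.0))
    out[n - 1:] = (c[n:] - c[:-n]) / n
    return out

def sma_crossover(prices, fast=20, slow=50):
    """Signal 1 (long) when the fast SMA is above the slow SMA, else 0 (flat);
    each period's return is earned on the previous period's signal, so the
    rule uses no look-ahead information."""
    f, s = sma(prices, fast), sma(prices, slow)
    sig = np.where(f > s, 1.0, 0.0)      # NaN comparisons are False -> flat
    log_rets = np.diff(np.log(prices))
    strat_rets = sig[:-1] * log_rets     # position held from t to t+1
    return sig, strat_rets

# Toy upward-trending price path (hypothetical data, not the BTC series)
prices = 100.0 * np.exp(0.002 * np.arange(300))
sig, strat = sma_crossover(prices)
```

On a steadily rising path the rule goes long once the slow window fills and stays long, so its cumulative log return matches Buy and Hold after the warm-up period.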
Open Access Feature Paper Article
Multivariate Collective Risk Model: Dependent Claim Numbers and Panjer’s Recursion
Risks 2020, 8(2), 43; https://doi.org/10.3390/risks8020043 - 02 May 2020
Viewed by 1085
Abstract
In this paper, we discuss a generalization of the collective risk model and of Panjer’s recursion. The model we consider consists of several business lines with dependent claim numbers. The distributions of the claim numbers are assumed to be Poisson mixture distributions. We let the claim causes have certain dependence structures and prove that Panjer’s recursion is also applicable by finding an appropriate equivalent representation of the claim numbers. These dependence structures are of a stochastic non-negative linear nature and may also produce negative correlations between the claim causes. The consideration of risk groups also includes dependence between claim sizes. Compounding the claim causes by common distributions also keeps Panjer’s recursion applicable. Full article
(This article belongs to the Special Issue Interplay between Financial and Actuarial Mathematics)
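For readers unfamiliar with Panjer's recursion, the classical single-line compound Poisson case that the paper generalizes can be sketched as follows (a minimal illustration with assumed inputs, not the authors' multivariate extension):

```python
import math

def panjer_poisson(lam, f, smax):
    """Compound Poisson pmf g[0..smax] via Panjer's recursion, for a
    Poisson(lam) claim count and a claim-size pmf f on {0, ..., len(f)-1}:
    g_0 = exp(lam * (f_0 - 1)),  g_s = (lam / s) * sum_j j * f_j * g_{s-j}."""
    g = [math.exp(lam * (f[0] - 1.0))]
    for s in range(1, smax + 1):
        m = min(s, len(f) - 1)
        g.append(lam / s * sum(j * f[j] * g[s - j] for j in range(1, m + 1)))
    return g

# Sanity case: unit claim sizes collapse the compound sum to Poisson(lam)
g = panjer_poisson(2.0, [0.0, 1.0], 10)
```

With degenerate unit claims the recursion reduces to g_s = (lam / s) g_{s-1}, i.e., the Poisson(lam) probabilities themselves, which makes the sketch easy to verify by hand.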
Open Access Erratum
Erratum: Hui Ye, Anthony Bellotti. Modelling Recovery Rates for Non-Performing Loans. Risks 7 (2019): 19
Risks 2020, 8(2), 42; https://doi.org/10.3390/risks8020042 - 29 Apr 2020
Viewed by 727
Abstract
The authors wish to make the following corrections to this paper (Ye and Bellotti 2019): [...] Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
Open Access Article
How Do Health, Care Services Consumption and Lifestyle Factors Affect the Choice of Health Insurance Plans in Switzerland?
Risks 2020, 8(2), 41; https://doi.org/10.3390/risks8020041 - 27 Apr 2020
Viewed by 1268
Abstract
In compulsory health insurance in Switzerland, policyholders can choose two main features: the level of deductible and the type of plan. Deductibles can be chosen among six levels, ranging from CHF 300 to 2500. While the coverage and benefits are identical, insurers offer several plans in which policyholders must first call a medical hotline, consult their family doctor, or visit a doctor from a defined network. The main benefit of higher deductibles and insurance plans with limitations is lower premiums. The insureds’ decisions to opt for a specific cover depend on observed and unobserved characteristics. The aim of this research is to understand the correlation between insurance plan choices and lifestyle through the state of health and medical care consumption in the setting of Swiss mandatory health insurance. To do so, we account for individual health and medical care consumption as unobserved variables by employing structural equation modeling. Our empirical analysis is based on data from the Swiss Health Survey, in which lifestyle factors such as body mass index, diet, physical activity, and commuting mode are available. From the 9301 recorded observations, we find a positive relationship between having a “healthy” lifestyle, a low consumption of doctors’ services, and choosing a high deductible as well as an insurance plan with restrictions. Conversely, higher usage of health care services triggers the choice of lower deductibles and standard insurance plans. Full article
(This article belongs to the Special Issue Risks: Feature Papers 2020)
Open Access Article
Deep Arbitrage-Free Learning in a Generalized HJM Framework via Arbitrage-Regularization
Risks 2020, 8(2), 40; https://doi.org/10.3390/risks8020040 - 23 Apr 2020
Cited by 1 | Viewed by 1285
Abstract
A regularization approach to model selection, within a generalized HJM framework, is introduced, which learns the closest arbitrage-free model to a prespecified factor model. This optimization problem is represented as the limit of a one-parameter family of computationally tractable penalized model selection tasks. General theoretical results are derived and then specialized to affine term-structure models where new types of arbitrage-free machine learning models for the forward-rate curve are estimated numerically and compared to classical short-rate and the dynamic Nelson-Siegel factor models. Full article
(This article belongs to the Special Issue Machine Learning in Finance, Insurance and Risk Management)
Open Access Article
A Tail Dependence-Based MST and Their Topological Indicators in Modeling Systemic Risk in the European Insurance Sector
Risks 2020, 8(2), 39; https://doi.org/10.3390/risks8020039 - 22 Apr 2020
Cited by 2 | Viewed by 991
Abstract
In the present work, we analyze the dynamics of indirect connections between insurance companies that result from market price channels. In our analysis, we assume that the stock quotations of insurance companies reflect market sentiment, which constitutes a very important systemic risk factor. Interlinkages between insurers and their dynamics have a direct impact on systemic risk contagion in the insurance sector. Herein, we propose a new hybrid approach to the analysis of interlinkage dynamics based on combining the copula-DCC-GARCH model and minimum spanning trees (MST). Using the copula-DCC-GARCH model, we determine the tail dependence coefficients. Then, for each analyzed period, we construct an MST based on these coefficients. The dynamics are analyzed by means of the time series of selected topological indicators of the MSTs over the years 2005–2019. The contribution of each institution to systemic risk is determined by analyzing the deltaCoVaR time series using the copula-DCC-GARCH model. Our empirical results show the usefulness of the proposed approach to the analysis of systemic risk (SR) in the insurance sector. The time series obtained from the proposed hybrid approach reflect the phenomena occurring in the market. We also check whether the analyzed MST topological indicators can be considered systemic risk predictors. Full article
(This article belongs to the Special Issue Systemic Risk and Reinsurance)
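The MST construction step described above can be sketched as follows; the toy tail dependence matrix, the distance transform d = sqrt(2(1 − λ)), and the function name are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def mst_edges(lam):
    """Prim's algorithm on distances d_ij = sqrt(2 * (1 - lam_ij)), so pairs
    with strong tail dependence end up adjacent in the spanning tree."""
    d = np.sqrt(2.0 * (1.0 - np.asarray(lam, dtype=float)))
    n = d.shape[0]
    in_tree, edges = {0}, []
    while len(in_tree) < n:
        # cheapest edge crossing from the current tree to a new node
        i, j = min(((i, j) for i in in_tree for j in range(n) if j not in in_tree),
                   key=lambda e: d[e])
        edges.append((i, j))
        in_tree.add(j)
    return edges

# Four hypothetical insurers; lam[i][j] is an estimated tail dependence
lam = [[1.0, 0.8, 0.1, 0.1],
       [0.8, 1.0, 0.7, 0.1],
       [0.1, 0.7, 1.0, 0.6],
       [0.1, 0.1, 0.6, 1.0]]
edges = mst_edges(lam)
```

Repeating this for each estimation window gives one tree per period, whose topological indicators (e.g., average path length or node degree) can then be tracked as time series.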
Open Access Article
Information Sharing, Bank Penetration and Tax Evasion in Emerging Markets
Risks 2020, 8(2), 38; https://doi.org/10.3390/risks8020038 - 20 Apr 2020
Cited by 1 | Viewed by 967
Abstract
Tax evasion, which is typically considered an illegal activity, is a critical problem and a barrier to economic growth. A review of the literature shows that tax and social security contributions, regulations, public sector services, the quality of institutions, and tax compliance play important roles in determining the degree to which firms attempt to evade taxes. Measuring tax evasion is problematic due to data requirements and inadequacies. A few tax evasion indices have been estimated, but they do not appear suitable for international comparisons across countries. This important issue has largely been ignored in the literature, in particular for emerging markets. Consequently, this paper develops a new tax evasion index (TEI) using the most substantial and recent data from the standardized World Bank Enterprise Surveys 2006–2017. In addition, using the newly developed TEI, the paper examines the importance and contribution of information sharing and bank penetration to the degree of tax evasion in emerging markets. The paper uses a sample of 112 emerging markets over 2006–2017 and the Tobit model in estimation. The empirical findings indicate that the average TEI over the 2006–2017 period for emerging markets is 0.62, with a range of (0.25, 0.75). In addition, we find that information sharing and bank penetration negatively affect the degree of tax evasion, as proxied by the TEI, in emerging markets. The empirical results also confirm the view that large firms tend to adopt good tax compliance practices, while firms located in remote areas are more likely to evade taxes. Policy implications emerge from these empirical findings. Full article
(This article belongs to the Special Issue Measuring and Modelling Financial Risk and Derivatives)
Open Access Article
Impact of Credit Risk on Momentum and Contrarian Strategies: Evidence from South Asian Markets
Risks 2020, 8(2), 37; https://doi.org/10.3390/risks8020037 - 14 Apr 2020
Cited by 1 | Viewed by 1293
Abstract
We examine the profitability of momentum and contrarian strategies in three South Asian markets, i.e., Bangladesh, India, and Pakistan. We also analyze whether credit risk influences momentum and contrarian returns in these markets from 2008 to 2014. We use default risk, which relates to the non-payment of debts by firms, as a measure of credit risk. For that purpose, we use the distance to default (DD) from the Kealhofer, McQuown, and Vasicek (KMV) model as a proxy for credit risk. We calculate the credit risk of the firms and form momentum and contrarian strategies based on high, medium, and low risk. We find that in all three markets, momentum and contrarian returns are significant for medium and high credit risk portfolios, while there are no momentum or contrarian returns for low credit risk portfolios. Full article
(This article belongs to the Special Issue Credit Risk Modeling and Management in Banking Business)
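The distance-to-default measure used in the paper above can, in its basic Merton form, be sketched as follows. This assumes asset value and volatility have already been estimated (a step the full KMV approach solves for from equity data), and all numerical inputs are hypothetical:

```python
import math

def distance_to_default(V, D, mu, sigma, T=1.0):
    """Merton-style DD: how many asset-volatility standard deviations the
    expected log asset value lies above the default point D at horizon T."""
    return (math.log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))

def default_probability(dd):
    """Probability of default under normally distributed asset returns:
    PD = Phi(-DD), with Phi computed via the complementary error function."""
    return 0.5 * math.erfc(dd / math.sqrt(2.0))

# Hypothetical firm: assets 120, debt due 100, 8% drift, 25% asset volatility
dd = distance_to_default(120.0, 100.0, 0.08, 0.25)
pd = default_probability(dd)
```

Sorting firms by DD (or the implied PD) is what underlies the high, medium, and low credit risk portfolio formation described in the abstract.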