Risks doi: 10.3390/risks7030098

Authors: Patrick Beissner

This paper considers fundamental questions of arbitrage pricing that arise when the uncertainty model incorporates ambiguity about risk. This additional ambiguity motivates a new principle of risk- and ambiguity-neutral valuation as an extension of Ross (1976) (Ross, Stephen A. 1976. The arbitrage theory of capital asset pricing. Journal of Economic Theory 13: 341–60). In the spirit of Harrison and Kreps (1979) (Harrison, J. Michael, and David M. Kreps. 1979. Martingales and arbitrage in multiperiod securities markets. Journal of Economic Theory 20: 381–408), the paper establishes a micro-economic foundation of viability in which ambiguity-neutrality imposes a fair-pricing principle via symmetric multiple prior martingales. The resulting equivalent symmetric martingale measure set exists if the uncertain volatility in asset prices is driven by an ambiguous Brownian motion.

Risks doi: 10.3390/risks7030097

Authors: Kevin Kuo

We propose a novel approach for loss reserving based on deep neural networks. The approach allows for joint modeling of paid losses and claims outstanding, and incorporation of heterogeneous inputs. We validate the models on loss reserving data across lines of business, and show that they improve on the predictive accuracy of existing stochastic methods. The models require minimal feature engineering and expert input, and can be automated to produce forecasts more frequently than manual workflows.

Risks doi: 10.3390/risks7030096

Authors: Jiandong Ren Kristina Sendova Ričardas Zitikis

It has been six years since Editor-in-Chief Steffensen (2013) wrote in his editorial that “to Risks inclusiveness, inter-disciplinarity, and open-mindedness is the very starting point [...]

Risks doi: 10.3390/risks7030095

Authors: Jacky H. L. Poon

In actuarial modelling of risk pricing and loss reserving in general insurance, also known as P&C or non-life insurance, machine learning offers business value through predictive power and automation. However, interpretability can be critical, especially when explaining results to key stakeholders and regulators. We present a granular machine learning model framework to jointly predict loss development and segment risk pricing. Generalising the Payments per Claim Incurred (PPCI) loss reserving method with risk variables and residual neural networks, the framework combines interpretable linear components with sophisticated neural network components, so that the ‘unexplainable’ component can be identified and regularised with a separate penalty. The model is tested on a real-life insurance dataset and generally outperforms PPCI in predicting ultimate loss when the sample size is sufficient.

Risks doi: 10.3390/risks7030094

Authors: Jason S. Anquandah Leonid V. Bogachev

Managing unemployment is one of the key issues in social policy. Unemployment insurance schemes are designed to cushion the financial and morale blow of losing a job, but also to encourage the unemployed to seek new jobs more proactively through the continuous reduction of benefit payments. In the present paper, a simple model of unemployment insurance is proposed with a focus on the optimality of the individual’s entry to the scheme. The corresponding optimal stopping problem is solved, and its similarities to and differences from the perpetual American call option are discussed. Beyond a purely financial point of view, we argue that in the actuarial context the optimal decisions should take into account other possible preferences through a suitable utility function. Some examples in this direction are worked out.

Risks doi: 10.3390/risks7030093

Authors: Alex Garivaltis

I derive practical formulas for optimal arrangements between sophisticated stock market investors (continuous-time Kelly gamblers or, more generally, CRRA investors) and the brokers who lend them cash for leveraged bets on a high Sharpe asset (i.e., the market portfolio). Rather than, say, the broker posting a monopoly price for margin loans, the gambler agrees to use a greater quantity of margin debt than he otherwise would in exchange for an interest rate that is lower than the broker would otherwise post. The gambler thereby attains a higher asymptotic capital growth rate and the broker enjoys a greater rate of intermediation profit than would be obtained under non-cooperation. If the threat point represents a complete breakdown of negotiations (resulting in zero margin loans), then we get an elegant rule of thumb: r_L* = (3/4)r + (1/4)ν − σ²/2, where r is the broker’s cost of funds, ν is the compound-annual growth rate of the market index, and σ is the annual volatility. We show that, regardless of the particular threat point, the gambler will negotiate to size his bets as if he himself could borrow at the broker’s call rate.
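The rule of thumb above is simple to evaluate. A minimal sketch; the parameter values below are illustrative assumptions, not figures from the paper:

```python
# Rule-of-thumb margin loan rate from the abstract: r_L* = (3/4)r + (1/4)nu - sigma^2/2
def rule_of_thumb_margin_rate(r, nu, sigma):
    """r: broker's cost of funds; nu: compound-annual growth rate of the
    market index; sigma: annual volatility (all assumed, illustrative)."""
    return 0.75 * r + 0.25 * nu - sigma ** 2 / 2

# Illustrative inputs: 3% cost of funds, 9% index growth, 15% volatility.
rate = rule_of_thumb_margin_rate(r=0.03, nu=0.09, sigma=0.15)
print(round(rate, 5))  # 0.03375
```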

Risks doi: 10.3390/risks7030092

Authors: Iegor Rudnytskyi Joël Wagner

Long-term care (LTC) encompasses a set of services provided to impaired and dependent elderly people. To assess the level of dependence, several scales are used, including activities of daily living (ADL), instrumental ADL (IADL) and functional limitations. Once an elderly person fails to perform these activities independently, he or she requires special assistance. Help can be provided as informal care by relatives and as formal care by professionals. The aim of this research is to study individual characteristics related to the demand for LTC and to analyze the relation between formal and informal care. We base our study on data from the Swiss Health Survey, focusing on respondents aged over 65 years. Using the structural equation modeling technique, we develop a statistical model that considers the dependence concept as a latent variable. This hidden dependence variable combines three indices linked to limitations in ADL, limitations in IADL and functional limitations. Accounting for causality links between covariates enables us to include the indirect effect of pathologies on the receipt of LTC mediated via dependence. In our model, we do not assume a causal relationship between formal and informal care. From our results, we observe a significant impact of pathologies as well as of socio-demographic factors on the demand for LTC. The relationship between formal and informal care is found to be of both a complementary and a substitutional nature.

Risks doi: 10.3390/risks7030091

Authors: Hans Rau-Bredow

This paper provides a critical analysis of the subadditivity axiom, which is the key condition for coherent risk measures. Contrary to the subadditivity assumption, bank mergers can create extra risk. We begin with an analysis of how a merger affects depositors, junior or senior bank creditors, and bank owners. Next, it is shown that bank mergers can result in higher payouts having to be made by the deposit insurance scheme. Finally, we demonstrate that if banks are interconnected via interbank loans, a bank merger could lead to additional contagion risks. We conclude that the subadditivity assumption should be rejected, since a subadditive risk measure, by definition, cannot account for such increased risks.

Risks doi: 10.3390/risks7030090

Authors: Xin Huang

Credit default swap (CDS) spreads measure the default risk of the reference entity and have been frequently used in recent empirical papers. To provide a rigorous econometrics foundation for empirical CDS analysis, this paper applies the augmented Dickey–Fuller, Phillips–Perron, Kwiatkowski–Phillips–Schmidt–Shin, and Ng–Perron tests to study the unit root property of CDS spreads, and it uses the Phillips–Ouliaris–Hansen tests to determine whether they are cointegrated. The empirical sample consists of daily CDS spreads of the six large U.S. banks from 2001 to 2018. The main findings are that it is log, not raw, CDS spreads that are unit root processes, and that log CDS spreads are cointegrated. These findings imply that, even though the risks of individual banks may deviate from each other in the short run, there is a long-run relation that ties them together. As these CDS spreads are an important input for financial systemic risk, there are at least two policy implications. First, in monitoring systemic risk, policymakers should focus on long-run trends rather than short-run fluctuations of CDS spreads. Second, in controlling systemic risk, policy measures that reduce the long-run risks of individual banks, such as stress testing and capital buffers, are helpful in mitigating overall systemic risk.
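The unit-root idea behind these tests can be illustrated on synthetic data. The sketch below runs a naive lag-1 Dickey–Fuller-style regression, Δy_t = α + ρ·y_{t−1} + ε_t, on a simulated random walk; for a unit-root process the estimated ρ should be close to zero. This toy omits the critical values and lag selection that the paper's tests rely on:

```python
import math
import random

def df_slope(y):
    """OLS slope rho in: Delta y_t = alpha + rho * y_{t-1} + eps_t."""
    x = y[:-1]                                         # lagged level
    d = [y[t + 1] - y[t] for t in range(len(y) - 1)]   # first difference
    n = len(x)
    mx, md = sum(x) / n, sum(d) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxd = sum((xi - mx) * (di - md) for xi, di in zip(x, d))
    return sxd / sxx

random.seed(7)
walk = [0.0]
for _ in range(2000):          # a random walk is the canonical unit-root process
    walk.append(walk[-1] + random.gauss(0.0, 1.0))

rho = df_slope(walk)
print(round(rho, 4))           # close to zero: no mean reversion detected
```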

Risks doi: 10.3390/risks7030089

Authors: Simona Galletta Sebastiano Mazzù

This paper examines bank liquidity risk using a maturity mismatch indicator of loans and deposits (LTDm) during a specific period. Core banking activities that are based on the process of maturity transformation are the most exposed to liquidity risk. The financial crisis of 2007–2009 highlighted the importance of liquidity to the functioning of both the financial markets and the banking sector. We investigate how characteristics of a bank, such as size, capital, and business model, are related to liquidity risk, using a sample of European banks in the period after the financial crisis, from 2011 to 2017. Employing a generalized method of moments two-step estimator, we find that bank size increases liquidity risk, whereas capital is not an effective deterrent. Moreover, our findings reveal that, for savings banks, income diversification raises liquidity risk, while investment banks reliant on non-deposit funding reduce their exposure to liquidity risk.

Risks doi: 10.3390/risks7030088

Authors: Marc Pierre Henrard

With the expected discontinuation of the LIBOR publication, a robust fallback for related financial instruments is paramount. In recent months, several consultations have taken place on the subject. The results of the first ISDA consultation were published in November 2018, and a new one had just finished at the time of writing. This note describes issues associated with the proposed approaches, and potential alternative approaches, in the framework and context of quantitative finance. It points to a clear lack of detail and measurability in the proposed approaches, which would make them unachievable in practice. It also describes the potential for asymmetric information between market participants arising from the adjustment spread computation. In the opinion of this author, a fundamental revision of the fallback’s foundations is required.

Risks doi: 10.3390/risks7030087

Authors: Pavel V. Gapeev Neofytos Rodosthenous V. L. Raju Chinthalapati

We obtain closed-form expressions for the value of the joint Laplace transform of the running maximum and minimum of a diffusion-type process stopped at the first time at which the associated drawdown or drawup process hits a constant level before an independent exponential random time. It is assumed that the coefficients of the diffusion-type process are regular functions of the current values of its running maximum and minimum. The proof is based on the solution to the equivalent inhomogeneous ordinary differential boundary-value problem and the application of the normal-reflection conditions for the value function at the edges of the state space of the resulting three-dimensional Markov process. The result is related to the computation of probability characteristics of the take-profit and stop-loss values of a market trader during a given time period.

Risks doi: 10.3390/risks7030086

Authors: Marcos López de Prado Ralph Vince Qiji Jim Zhu

The Growth-Optimal Portfolio (GOP) theory determines the path of bet sizes that maximize long-term wealth. This multi-horizon goal makes it more appealing to practitioners than myopic approaches like Markowitz’s mean-variance or risk parity. The GOP literature typically considers risk-neutral investors with an infinite investment horizon. In this paper, we compute the optimal bet sizes in the more realistic setting of risk-averse investors with finite investment horizons. We find that, under this more realistic setting, the optimal bet sizes are considerably smaller than previously suggested by the GOP literature. We also develop quantitative methods for determining the risk-adjusted growth allocations (or risk budgeting) for a given finite investment horizon.
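The gap between growth-optimal and risk-averse bet sizes can be illustrated with the classical continuous-time result: for a GBM asset with drift μ, volatility σ and risk-free rate r, the log-utility (Kelly) fraction is (μ − r)/σ², and a CRRA investor with relative risk aversion γ scales it down to (μ − r)/(γσ²). This is a textbook sketch with assumed numbers, not the paper's finite-horizon computation:

```python
def merton_fraction(mu, r, sigma, gamma=1.0):
    """Optimal constant fraction of wealth in the risky asset for a CRRA
    investor; gamma = 1 recovers the Kelly / growth-optimal bet size."""
    return (mu - r) / (gamma * sigma ** 2)

# Assumed, illustrative market parameters: 8% drift, 2% rate, 20% volatility.
kelly = merton_fraction(mu=0.08, r=0.02, sigma=0.20)            # 1.5 (leveraged)
averse = merton_fraction(mu=0.08, r=0.02, sigma=0.20, gamma=3)  # 0.5
print(kelly, averse)
```

Risk aversion (γ > 1) shrinks the bet, in the same direction as the finite-horizon effect the paper quantifies.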

Risks doi: 10.3390/risks7030085

Authors: Wenyuan Wang Xiaowen Zhou

This paper revisits the spectrally negative Lévy risk process embedded with the general tax structure introduced in Kyprianou and Zhou (2009). A joint Laplace transform is found concerning the first down-crossing time below level 0. The potential density is also obtained for the taxed Lévy risk process killed upon leaving [0, b]. The results are expressed using scale functions.

Risks doi: 10.3390/risks7030084

Authors: Beata Ślusarczyk Katarzyna Grondys

The SME sector is a major force in national economic and social development. Financial risk is one of the key threats to the activity of small and medium enterprises. The most common manifestations of the financial risk of SMEs are difficulty in financing the business and a lack of funds for development. Banks are unwilling to grant loans to such companies. Moreover, rising operating costs cause shrinking profits, which may result in corporate debt, difficulty in debt repayment and, consequently, high financial risk for these entities. Numerous differences between the activities of small and large enterprises intensify this risk and mean that the model of credit financing for companies is not adjusted to the capabilities and operating principles of small enterprises. Therefore, risk management is one of the most important internal processes in small and medium enterprises, and the identification of factors that affect the level of financial risk in these entities is crucial. The main objective of this research was to analyze the impact of selected parametric characteristics of the SME sector on the intensity of the financial risk they take. This objective was accomplished on the basis of a survey of Polish SMEs. In order to test the adopted research assumptions, a linear regression model was used with four continuous variables for each type of identified financial risk. Based on the final research results, a logit model was obtained for the risk of insufficient profits. It was indicated that the internationalization of the company and the ability to manage risk are the only factors that affect a high level of risk of low income. The article ends with a discussion and a comparison with previous research in this area.

Risks doi: 10.3390/risks7030083

Authors: Krzysztof Dȩbicki Lanpeng Ji Tomasz Rolski

We consider a two-dimensional ruin problem where the surplus process of business lines is modelled by a two-dimensional correlated Brownian motion with drift. We study the ruin function P(u) for the component-wise ruin (that is, both business lines are ruined in an infinite-time horizon), where u is the same initial capital for each line. We measure the goodness of the business by analysing the adjustment coefficient, that is, the limit of −ln P(u)/u as u tends to infinity, which depends essentially on the correlation ρ of the two surplus processes. In order to work out the adjustment coefficient we solve a two-layer optimization problem.
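In one dimension the adjustment coefficient has a closed form that is easy to check by simulation: for a surplus u + ct + σW_t, the infinite-horizon ruin probability is exp(−2cu/σ²), so −ln P(u)/u equals 2c/σ². The sketch below, a crude Euler Monte Carlo with assumed parameters, illustrates this one-dimensional special case, not the paper's two-dimensional component-wise problem:

```python
import math
import random

def ruin_prob_mc(u, c, sigma, T=30.0, dt=0.05, n_paths=2000, seed=1):
    """Crude Monte Carlo estimate of P(min_t (u + c*t + sigma*W_t) < 0).
    Finite horizon and time discretisation both bias the estimate slightly low."""
    random.seed(seed)
    steps = int(T / dt)
    ruined = 0
    for _ in range(n_paths):
        x = u
        for _ in range(steps):
            x += c * dt + sigma * math.sqrt(dt) * random.gauss(0.0, 1.0)
            if x < 0:
                ruined += 1
                break
    return ruined / n_paths

u, c, sigma = 1.0, 0.5, 1.0
exact = math.exp(-2 * c * u / sigma ** 2)   # closed form: exp(-2cu/sigma^2)
est = ruin_prob_mc(u, c, sigma)
print(round(exact, 4), round(est, 4))
print(-math.log(exact) / u)                 # adjustment coefficient 2c/sigma^2
```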

Risks doi: 10.3390/risks7030082

Authors: Greg Taylor

The purpose of this paper is to survey recent developments in granular models and machine learning models for loss reserving, and to compare the two families with a view to assessing their potential for future development. This is best understood in the context of the evolution of these models from their predecessors, and the early sections recount relevant archaeological vignettes from the history of loss reserving. However, the larger part of the paper is concerned with the granular models and machine learning models. Their relative merits are discussed, as are the factors governing the choice between them and the older, more primitive models. Concluding sections briefly consider the possible further development of these models in the future.

Risks doi: 10.3390/risks7030081

Authors: Andrew Leung

It is intuitive that proximity to hospitals can only improve the chances of survival from a range of medical conditions. This study examines the empirical evidence for this assertion, based on Australian data. While hospital proximity might serve as a proxy for other factors, such as indigeneity, income, wealth or geography, the evidence suggests that proximity provides the most direct link to these factors and, as it turns out, a very statistically significant one that transcends economies.

Risks doi: 10.3390/risks7030080

Authors: Ana M. Pérez-Marín Montserrat Guillen Manuela Alcañiz Lluís Bermúdez

We analyzed real telematics information for a sample of drivers with usage-based insurance policies. We examined the statistical distribution of distance driven above the posted speed limit, which presents a strong positive asymmetry, using quantile regression models. We found that, at different percentile levels, the distance driven at speeds above the posted limit depends on total distance driven and, more generally, on factors such as the percentage of urban and nighttime driving and the driver’s gender. However, the impact of these covariates differs according to the percentile level. We stress the importance of understanding telematics information, which should not be limited to simply characterizing average drivers, but can be useful for signaling dangerous driving by predicting quantiles associated with specific driver characteristics. We conclude that the risk of driving for long distances above the speed limit is heterogeneous and, moreover, we show that prevention campaigns should target primarily male non-urban drivers, especially if they present a high percentage of nighttime driving.
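Quantile-level modelling rests on the pinball (check) loss, which the τ-quantile minimises; in a strongly right-skewed sample the τ = 0.5 minimiser (the median) therefore sits well below the mean. A minimal illustration with made-up speeding distances, not the paper's data:

```python
def pinball_loss(candidate, ys, tau):
    """Check-function loss: tau*(y - q) when y >= q, else (1 - tau)*(q - y)."""
    return sum(tau * (y - candidate) if y >= candidate
               else (1 - tau) * (candidate - y) for y in ys)

# Made-up, right-skewed "km driven above the limit" sample:
ys = [1, 1, 2, 2, 3, 10]
tau = 0.5
best = min(set(ys), key=lambda q: pinball_loss(q, ys, tau))
print(best)   # 2: the median minimises the tau = 0.5 pinball loss
print(pinball_loss(best, ys, tau) < pinball_loss(sum(ys) / len(ys), ys, tau))
```

Quantile regression generalises this by letting the minimising quantile depend on covariates such as total distance or nighttime driving.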

Risks doi: 10.3390/risks7030079

Authors: Francis Duval Mathieu Pigeon

In this paper, we propose models for non-life loss reserving that combine traditional approaches, such as Mack’s method or generalized linear models, with a gradient boosting algorithm in an individual framework. These claim-level models use information about each of the payments made for each of the claims in the portfolio, as well as characteristics of the insured. We provide an example based on a detailed dataset from a property and casualty insurance company. We contrast some traditional aggregate techniques, at the portfolio level, with our individual-level approach, and we discuss some points related to practical applications.

Risks doi: 10.3390/risks7030078

Authors: Nader Trabelsi Aviral Kumar Tiwari

In this paper, the generalized Pareto distribution (GPD) copula approach is utilized to solve the conditional value-at-risk (CVaR) portfolio problem. In particular, this approach uses (i) copulas to model the complete linear and non-linear correlation dependence structure, (ii) Pareto tails to capture the estimates of the parametric Pareto lower tail, the non-parametric kernel-smoothed interior and the parametric Pareto upper tail, and (iii) value-at-risk (VaR) to quantify the risk measure. The simulated sample covers the G7, BRICS (the association of Brazil, Russia, India, China and South Africa) and 14 popular emerging stock-market returns for the period between 1997 and 2018. Our results suggest that the efficient frontier combining the minimized CVaR measure with simulated copula returns outperforms the risk/return of domestic portfolios, such as the US stock market. This result improves international diversification at the global level. We also show that the Gaussian and t-copula simulated returns give very similar, but not identical, results. Furthermore, the copula simulation provides more accurate market-risk estimates than historical simulation. Finally, the results support the notion that G7 countries can provide an important opportunity for diversification. These results are important to investors and policymakers.
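The risk measures involved are straightforward to compute on a historical sample. The sketch below uses made-up returns and none of the copula/GPD machinery; it shows empirical VaR and CVaR at the 90% level, where CVaR is the average loss beyond VaR:

```python
def var_cvar(returns, alpha=0.90):
    """Empirical value-at-risk and conditional value-at-risk (as positive losses)."""
    losses = sorted((-r for r in returns), reverse=True)   # largest losses first
    k = max(1, int(round(len(losses) * (1 - alpha))))      # tail size
    tail = losses[:k]
    return tail[-1], sum(tail) / k                         # VaR, CVaR (CVaR >= VaR)

# Made-up daily returns:
returns = [0.01, -0.02, 0.015, -0.10, 0.03, -0.01, 0.005, -0.08,
           0.02, 0.0, 0.012, -0.015, 0.007, -0.005, 0.025, 0.01,
           -0.03, 0.018, 0.004, -0.006]
var, cvar = var_cvar(returns, alpha=0.90)
print(round(var, 4), round(cvar, 4))   # 0.08 0.09
```

Minimising CVaR rather than VaR is what makes the portfolio problem in the paper convex and tail-aware.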

Risks doi: 10.3390/risks7030077

Authors: Oliver Lukason María-del-Mar Camacho-Miñano

The aim of this study was to investigate whether firms’ reporting delays are interconnected with bankruptcy risk and its financial determinants. This study was based on 698,189 firm-year observations from Estonia. Annual report submission delay, in either a binary or an ordinal form, was used as the dependent variable, while bankruptcy risk based on an international model, or the financial ratios determining it, were the independent variables. The findings indicated that firms with lower values of liquidity and annual and accumulated profitability were more likely to delay the submission of an annual report beyond the legal deadline. In turn, firm leverage was not interconnected with reporting delays. In addition, firms with a higher risk of bankruptcy were more likely to delay the submission of their annual reports. Firms of different ages, sizes and industries varied with respect to the obtained results. Different stakeholders should be aware that reporting delays can be conditioned by higher bankruptcy risk or poor performance, and thus, for instance, crediting such firms should be treated with caution. State institutions controlling timely submission should take strict(er) measures in cases of firms delaying for a lengthy period.

Risks doi: 10.3390/risks7030076

Authors: Dan Cheng Pasquale Cirillo

We propose an alternative approach to modeling the positive dependence between the probability of default and the loss given default in a portfolio of exposures, using a bivariate urn process. The model combines the power of Bayesian nonparametrics and statistical learning, allowing for the elicitation and exploitation of experts’ judgements, and for the constant updating of this information over time, every time new data become available. A real-world application to mortgages is described using the Single Family Loan-Level Dataset by Freddie Mac.

Risks doi: 10.3390/risks7030075

Authors: Matteo Foglia Eliana Angelini

In this paper, we measure systemic risk with a novel methodology based on a “spatial-temporal” approach. We propose a new bank systemic risk measure that considers the two components of systemic risk: the cross-sectional and the time dimension. The aim is to highlight the “time-space dynamics” of contagion, i.e., whether the CDS spread of bank i depends on the CDS spreads of other banks. To do this, we use an advanced spatial econometrics design with a time-varying spatial dependence that can be interpreted as an index of the degree of cross-sectional spillovers. The findings highlight that Eurozone banks show strong spatial dependence in the evolution of CDS spreads; namely, the contagion effect is present and persistent. Moreover, we analyse the role of the European Central Bank in managing contagion risk. We find that monetary policy has been effective in reducing systemic risk. However, the results show that systemic risk does not imply a policy intervention, highlighting that financial stability policy is not yet an objective.

Risks doi: 10.3390/risks7030074

Authors: Prayut Jain Shashi Jain

The Hierarchical Risk Parity (HRP) approach to portfolio allocation, introduced by Lopez de Prado (2016), applies graph theory and machine learning to build a diversified portfolio. Like the traditional risk-based allocation methods, HRP is also a function of the estimate of the covariance matrix; however, it does not require its invertibility. In this paper, we first study the impact of covariance misspecification on the performance of the different allocation methods. Next, we study, under an appropriate covariance forecast model, whether the machine learning-based HRP outperforms the traditional risk-based portfolios. For our analysis, we use the test for superior predictive ability on out-of-sample portfolio performance to determine whether the observed excess performance is significant or occurred by chance. We find that when the covariance estimates are crude, inverse volatility weighted portfolios are more robust, followed by the machine learning-based portfolios. Minimum variance and maximum diversification are most sensitive to covariance misspecification. HRP follows the middle ground; it is less sensitive to covariance misspecification than the minimum variance or maximum diversification portfolios, while not as robust as the inverse volatility weighted portfolio. We also study the impact of different rebalancing horizons and how the portfolios compare against a market-capitalization weighted portfolio.
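Among the allocation rules compared, inverse-volatility weighting is the simplest: each asset's weight is proportional to 1/σ_i, so no covariance matrix (and hence no inversion) is needed, which helps explain its robustness to covariance misspecification. A minimal sketch with assumed volatilities:

```python
def inverse_vol_weights(vols):
    """Portfolio weights proportional to 1/sigma_i, normalised to sum to one."""
    inv = [1.0 / v for v in vols]
    total = sum(inv)
    return [i / total for i in inv]

# Assumed annualised volatilities for three assets:
w = inverse_vol_weights([0.10, 0.20, 0.40])
print([round(x, 4) for x in w])   # [0.5714, 0.2857, 0.1429]
```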

Risks doi: 10.3390/risks7030073

Authors: Jean-François Renaud

We consider de Finetti’s stochastic control problem when the (controlled) process is allowed to spend time under the critical level. More precisely, we consider a generalized version of this control problem in a spectrally negative Lévy model with exponential Parisian ruin. We show that, under mild assumptions on the Lévy measure, an optimal strategy is formed by a barrier strategy and that this optimal barrier level is always less than the optimal barrier level when classical ruin is implemented. In addition, we give necessary and sufficient conditions for the barrier strategy at level zero to be optimal.

Risks doi: 10.3390/risks7030072

Authors: Antonio Pallaria Nino Savelli

Solvency II requirements introduced new issues for actuarial risk management in non-life insurance, challenging the market to be conscious of its own risk profile and to investigate the sensitivity of the solvency ratio to insurance risks and technical results from both a short-term and a medium-term perspective. To this end, in the present paper, a partial internal model for premium risk is developed for three multi-line non-life insurers, and the impact of several different business mixes is analyzed. Furthermore, the risk-mitigation and profitability impact of reinsurance in the premium risk model is introduced, and a global framework for a feasible application of this model consistent with a medium-term analysis is provided. Numerical results are also presented, with evidence of various effects for several portfolios and reinsurance arrangements, pointing out the main reasons for these differences.

Risks doi: 10.3390/risks7030071

Authors: Marjan Qazvini

In this study, we consider the problem of zero claims in a liability insurance portfolio and compare the predictive performance of three models. We use French motor third party liability (MTPL) insurance data, which has been used for a pricing game, and show how the type of coverage and policyholders’ willingness to subscribe to insurance pricing based on telematics data affect their driving behaviour and hence their claims. Using our validation set, we then predict the number of zero claims. Our results show that although a zero-inflated Poisson (ZIP) model performs better than a Poisson regression, it can even be outperformed by logistic regression.
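The ZIP model's advantage at zero is mechanical: it mixes a point mass at zero with a Poisson, so P(N = 0) = π + (1 − π)e^(−λ), which always exceeds the zero probability of a plain Poisson with the same mean (1 − π)λ. A quick check with assumed parameters, unrelated to the paper's fitted values:

```python
import math

def zip_zero_prob(pi, lam):
    """P(N = 0) under a zero-inflated Poisson: inflation pi, Poisson rate lam."""
    return pi + (1 - pi) * math.exp(-lam)

def poisson_zero_prob(mean):
    """P(N = 0) under a plain Poisson with the given mean."""
    return math.exp(-mean)

pi, lam = 0.3, 2.0
same_mean = (1 - pi) * lam                      # mean of the ZIP: 1.4
print(round(zip_zero_prob(pi, lam), 4))         # 0.3947
print(round(poisson_zero_prob(same_mean), 4))   # 0.2466: fewer zeros predicted
```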

Risks doi: 10.3390/risks7020070

Authors: Jessica Pesantez-Narvaez Montserrat Guillen Manuela Alcañiz

XGBoost is recognized as an algorithm with exceptional predictive capacity. Models for a binary response indicating the existence of accident claims versus no claims can be used to identify the determinants of traffic accidents. This study compared the relative performance of logistic regression and XGBoost approaches for predicting the existence of accident claims using telematics data. The dataset contained information from an insurance company about individuals’ driving patterns, including total annual distance driven and the percentage of total distance driven in urban areas. Our findings showed that logistic regression is a suitable model given its interpretability and good predictive capacity. XGBoost requires numerous model-tuning procedures to match the predictive performance of the logistic regression model, and greater effort with regard to interpretation.
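Logistic regression's interpretability comes from its linear log-odds: each coefficient is a direct effect on the log-odds of a claim. A self-contained sketch fitting a one-feature model by gradient descent on made-up data; nothing here comes from the paper's dataset:

```python
import math

def fit_logistic(xs, ys, lr=0.1, epochs=5000):
    """Fit p(y=1|x) = sigmoid(w*x + b) by gradient descent on the log-loss."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(epochs):
        grad_w = grad_b = 0.0
        for x, y in zip(xs, ys):
            p = 1.0 / (1.0 + math.exp(-(w * x + b)))
            grad_w += (p - y) * x / n
            grad_b += (p - y) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Made-up data: annual distance driven (10,000 km units) vs. claim occurrence.
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [0,   0,   0,   1,   1,   1]
w, b = fit_logistic(xs, ys)
p_low = 1.0 / (1.0 + math.exp(-(w * 0.5 + b)))
p_high = 1.0 / (1.0 + math.exp(-(w * 3.0 + b)))
print(p_low < 0.5 < p_high)   # True: the fitted model separates low/high mileage
```

The sign and size of w are directly readable as "more distance driven raises claim odds", which is the interpretability argument the abstract makes.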

Risks doi: 10.3390/risks7020069

Authors: Dadang Jainal Mutaqin Koichi Usami

To reduce the negative impact of risks in farming due to climate change, the government implemented agricultural production cost insurance in 2015. Although a large subsidy has been allocated by the government (80 percent of the premium), farmers’ participation rate is still low (23 percent of the target in 2016). To address this issue, it is indispensable to identify farmers’ willingness to pay (WTP) for, and the determinants of their participation in, agricultural production cost insurance. Based on a field survey of 240 smallholder farmers in the Garut District, West Java Province, in August–October 2017 and February 2018, the contingent valuation method (CVM) estimated farmers’ mean WTP at Rp 30,358/ha/cropping season ($2.25/ha/cropping season), which was 16 percent lower than the current premium (Rp 36,000/ha/cropping season = $2.67/ha/cropping season). Farmers who participated in agricultural production cost insurance shared some characteristics: operating larger farmland, more contact with agricultural extension services, lower expected production for the next cropping season, and a downstream area location.

Risks doi: 10.3390/risks7020068

Authors: Emilio Gómez-Déniz José María Sarabia Enrique Calderín-Ojeda

It is known that the classical ruin function under an exponential claim-size distribution depends on two parameters, referred to as the mean claim size and the relative security loading. These parameters are assumed to be unknown and random; thus, a loss function that measures the loss sustained by a decision-maker who takes as valid a ruin function that is not correct can be considered. By using a squared-error loss function and appropriate distribution functions for these parameters, the problem of estimating the ruin function leads to a mixture procedure. First, a bivariate distribution for mixing the two parameters jointly is considered, and second, different univariate distributions for mixing each parameter separately are examined. As a result, a catalogue of ruin probability functions and severities of ruin that are more flexible than the original ones is obtained. The methodology is also extended to the Pareto claim-size distribution. Several numerical examples illustrate the performance of these functions.
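For reference, the classical ruin function in question: under exponential claims with mean μ and relative security loading θ, ψ(u) = (1/(1+θ)) exp(−θu/(μ(1+θ))). The mixture models in the paper randomise μ and θ; the sketch below only evaluates the classical formula at assumed parameter values:

```python
import math

def classical_ruin_prob(u, mu, theta):
    """Cramer-Lundberg ruin probability for exponential claim sizes:
    psi(u) = exp(-theta * u / (mu * (1 + theta))) / (1 + theta)."""
    return math.exp(-theta * u / (mu * (1.0 + theta))) / (1.0 + theta)

# Assumed values: mean claim size 1, 20% relative security loading.
for u in (0.0, 1.0, 5.0):
    print(u, round(classical_ruin_prob(u, mu=1.0, theta=0.2), 4))
```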

Risks doi: 10.3390/risks7020067

Authors: Rasa Kanapickiene Renatas Spicas

In this research, trade credit is analysed from a seller’s (supplier’s) perspective. Trade credit allows the supplier to increase sales and profits, but creates the risk that the customer will not pay, and at the same time increases the risk of the supplier’s insolvency. If the supplier is a small or micro-enterprise (SMiE), this is usually an issue of human and technical resources. Therefore, when dealing with these issues, the supplier needs a highly accurate but simple and highly interpretable trade credit risk assessment model that allows for assessing the insolvency risk of buyers (who are usually also SMiE). The aim of the research is to create a statistical enterprise trade credit risk assessment (ETCRA) model for Lithuanian small and micro-enterprises (SMiE). In the empirical analysis, the financial and non-financial data of 734 small and micro-sized enterprises in the period 2010–2012 were chosen as the sample. Based on logistic regression, the ETCRA model was developed using financial and non-financial variables. In the ETCRA model, the enterprise’s financial performance is assessed from different perspectives: profitability, liquidity, solvency, and activity. Different model variants have been created using (i) only financial ratios and (ii) financial ratios and non-financial variables. The inclusion of non-financial variables does not substantially improve the characteristics of the model, which means that models using only financial ratios can be used in practice, as can models that also include non-financial variables. The designed models can be used by suppliers when making decisions about granting trade credit to small or micro-enterprises.

]]>Risks doi: 10.3390/risks7020066

Authors: Ioannis Anagnostou Drona Kandhai

One of the key components of counterparty credit risk (CCR) measurement is generating scenarios for the evolution of the underlying risk factors, such as interest and exchange rates, equity and commodity prices, and credit spreads. Geometric Brownian Motion (GBM) is a widely used method for modeling the evolution of exchange rates. An important limitation of GBM is that, due to the assumption of constant drift and volatility, stylized facts of financial time-series, such as volatility clustering and heavy-tailedness in the returns distribution, cannot be captured. We propose a model where volatility and drift are able to switch between regimes; more specifically, they are governed by an unobservable Markov chain. Hence, we model exchange rates with a hidden Markov model (HMM) and generate scenarios for counterparty exposure using this approach. A numerical study is carried out and backtesting results for a number of exchange rates are presented. The impact of using a regime-switching model on counterparty exposure is found to be profound for derivatives with non-linear payoffs.
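The regime-switching idea above is easy to illustrate: let an unobservable two-state Markov chain pick the drift and volatility at each step of a GBM path. All parameter values below (regimes, switching probabilities, initial rate) are illustrative, not the paper's calibration:

```python
import math
import random

def simulate_regime_switching_gbm(s0, params, stay_prob, n_steps, dt, seed=0):
    """Simulate a GBM whose drift and volatility are governed by an
    unobservable two-state Markov chain.

    params: {state: (mu, sigma)}; stay_prob: {state: P(no switch)}.
    """
    rng = random.Random(seed)
    state, s, path = 0, s0, [s0]
    for _ in range(n_steps):
        if rng.random() > stay_prob[state]:
            state = 1 - state                 # regime switch
        mu, sigma = params[state]
        z = rng.gauss(0.0, 1.0)
        s *= math.exp((mu - 0.5 * sigma ** 2) * dt + sigma * math.sqrt(dt) * z)
        path.append(s)
    return path

# calm regime 0 vs. turbulent regime 1 (parameter values are illustrative)
path = simulate_regime_switching_gbm(
    1.10, params={0: (0.0, 0.05), 1: (0.0, 0.25)},
    stay_prob={0: 0.98, 1: 0.90}, n_steps=250, dt=1 / 250)
```

Persistent regimes like these produce the volatility clustering that a constant-parameter GBM cannot capture; fitting the hidden chain to data (the HMM step) is the part this sketch omits.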

]]>Risks doi: 10.3390/risks7020065

Authors: Hongxia Wang

This work examines the apportionment of multiplicative risks by considering three dominance orderings: first-degree stochastic dominance, Rothschild and Stiglitz&rsquo;s increase in risk, and downside risk increase. We use the relative nth-degree risk aversion measure and decreasing relative nth-degree risk aversion to provide conditions guaranteeing the preference for &ldquo;harm disaggregation&rdquo; of multiplicative risks. Further, we relate our conclusions to preferences over bivariate lotteries, which can be interpreted in terms of correlation aversion, cross-prudence and cross-temperance.

]]>Risks doi: 10.3390/risks7020064

Authors: Tolulope Fadina Thorsten Schmidt

This paper discusses ambiguity in the context of single-name credit risk. We focus on uncertainty in the default intensity, but also discuss uncertainty in the recovery in a fractional-recovery-of-market-value setting. This approach is a first step towards integrating uncertainty into credit-risky term structure models and benefits from its simplicity. We derive drift conditions in a Heath&ndash;Jarrow&ndash;Morton forward rate setting in the case of an ambiguous default intensity combined with zero recovery, and in the case of an ambiguous fractional recovery of market value.

]]>Risks doi: 10.3390/risks7020063

Authors: Yiqing Chen

We investigate a shot noise process with subexponential shot marks occurring at renewal epochs. Our main result is a precise asymptotic formula for its tail probability. In doing so, some recent results regarding sums of randomly weighted subexponential random variables play a crucial role.

]]>Risks doi: 10.3390/risks7020062

Authors: László Martinek

In the past two decades, increasing computational power has resulted in the development of more advanced claims reserving techniques, allowing stochastic methods to overtake deterministic ones and deliver forecasts of enhanced quality. Hence, not only point estimates but predictive distributions can be generated in order to forecast future claim amounts. The significant expansion in the variety of models requires the validation of these methods and the creation of supporting techniques for appropriate decision making. The present article compares and validates several existing and newly developed stochastic methods on actual data, applying comparison measures in an algorithmic manner.

]]>Risks doi: 10.3390/risks7020061

Authors: Daniel H. Alai Katja Ignatieva Michael Sherris

Stochastic mortality models have been developed for a range of applications, from demographic projections to financial management. Financial risk-based models build on methods used for interest rates and apply them to mortality rates. They have the advantage of being applicable to financial pricing and the management of longevity risk. Olivier and Jeffery (2004) and Smith (2005) proposed a model based on a forward-rate mortality framework with stochastic factors driven by univariate gamma random variables irrespective of age or duration. We assess and further develop this model. We generalize the random shocks from a univariate gamma to a univariate Tweedie distribution and allow the distributions to vary by age. Furthermore, since dependence between ages is an observed characteristic of mortality rate improvements, we formulate a multivariate framework using copulas. We find that dependence increases with age and introduce a suitable covariance structure, one that is related to the notion of a minimum. The resulting model provides a more realistic basis for capturing the risk of mortality improvements and serves to enhance longevity risk management for pension and insurance funds.

]]>Risks doi: 10.3390/risks7020060

Authors: Stanislaus Maier-Paape Andreas Platen Qiji Jim Zhu

This is Part III of a series of papers which focus on a general framework for portfolio theory. Here, we extend the general framework for portfolio theory in a one-period financial market, as introduced in Part I [Maier-Paape and Zhu, Risks 2018, 6(2), 53], to multi-period markets. This extension is reasonable for applications. More importantly, we take a new approach, the &ldquo;modular portfolio theory&rdquo;, which is built from the interaction among four related modules: (a) multi-period market model; (b) trading strategies; (c) risk and utility functions (performance criteria); and (d) the optimization problem (efficient frontier and efficient portfolio). An important concept that allows dealing with the more general framework discussed here is that of a trading strategy generating function. This concept limits the discussion to a special class of manageable trading strategies, which is still wide enough to cover many frequently used trading strategies, for instance &ldquo;constant weight&rdquo; (fixed fraction). As an application, we discuss the utility function of compounded return and the risk measure of relative log drawdowns.

]]>Risks doi: 10.3390/risks7020059

Authors: Francesco Rotondi

I document a sizeable bias that might arise when valuing out-of-the-money American options via the Least Squares Method proposed by Longstaff and Schwartz (2001). The key point of this algorithm is the regression-based estimate of the continuation value of an American option. If this regression is ill-posed, the procedure might deliver biased results. The price of the American option might even fall below the price of its European counterpart. For call options, this is likely to occur when the dividend yield of the underlying is high. This distortion is documented within the standard Black&ndash;Scholes&ndash;Merton model as well as within its most common extensions (the jump-diffusion, stochastic volatility and stochastic interest rate models). Finally, I propose two easy and effective workarounds that fix this distortion.

]]>Risks doi: 10.3390/risks7020058

Authors: Rokas Gylys Jonas Šiaulys

The primary objective of this work is to analyze model-based Value-at-Risk associated with mortality risk arising from issued term life assurance contracts and to compare the results with the capital requirements for mortality risk determined using the Solvency II Standard Formula. In particular, two approaches to calculating Value-at-Risk are analyzed: one-year VaR and run-off VaR. The calculations of Value-at-Risk are performed using stochastic mortality rates calibrated with the Lee-Carter model fitted on mortality data of selected European countries. Results indicate that, depending on the approach taken to calculate Value-at-Risk, the key factors driving its relative size are: sensitivity of technical provisions to the latest mortality experience, volatility of mortality rates in a country, policy term and benefit formula. Overall, we found that the Solvency II Standard Formula on average delivers an adequate capital requirement; however, we also highlight particular situations where it could understate or overstate the portfolio-specific model-based Value-at-Risk for mortality risk.

]]>Risks doi: 10.3390/risks7020057

Authors: Pascal François

In the presence of recovery risk, the recovery rate is a random variable whose risk-neutral expectation can be inferred from the prices of defaultable instruments. I extract market-implied recovery rates from the term structures of credit default swap spreads for a sample of 497 United States (U.S.) corporate issuers over the 2005&ndash;2014 period. I analyze the explanatory factors of market-implied recovery rates within a linear regression framework and also within a Tobit model, and I compare them with the determinants of historical recovery rates that were previously identified in the literature. In contrast to their historical counterparts, market-implied recovery rates are mostly driven by macroeconomic factors and long-term, issuer-specific variables. Short-term financial variables and industry conditions significantly impact the slope of market-implied recovery rates. These results indicate that the design of a recovery risk model should be based on specific market factors, not on the statistical evidence that is provided by historical recovery rates.

]]>Risks doi: 10.3390/risks7020056

Authors: Taras Bodnar Arjun K. Gupta Valdemar Vitlinskyi Taras Zabolotskyy

The beta coefficient plays a crucial role in finance as a risk measure of a portfolio in comparison to the benchmark portfolio. In the paper, we investigate statistical properties of the sample estimator for the beta coefficient. Assuming that both the holding portfolio and the benchmark portfolio consist of the same assets whose returns are multivariate normally distributed, we provide the finite sample and the asymptotic distributions of the sample estimator for the beta coefficient. These findings are used to derive a statistical test for the beta coefficient and to construct a confidence interval for the beta coefficient. Moreover, we show that the sample estimator is an unbiased estimator for the beta coefficient. The theoretical results are implemented in an empirical study.

]]>Risks doi: 10.3390/risks7020055

Authors: Vytaras Brazauskas Sahadeb Upretee

Quantiles of probability distributions play a central role in the definition of risk measures (e.g., value-at-risk, conditional tail expectation) which in turn are used to capture the riskiness of the distribution tail. Estimates of risk measures are needed in many practical situations such as in pricing of extreme events, developing reserve estimates, designing risk transfer strategies, and allocating capital. In this paper, we present the empirical nonparametric and two types of parametric estimators of quantiles at various levels. For parametric estimation, we employ the maximum likelihood and percentile-matching approaches. Asymptotic distributions of all the estimators under consideration are derived when data are left-truncated and right-censored, which is a typical loss variable modification in insurance. Then, we construct relative efficiency curves (REC) for all the parametric estimators. Specific examples of such curves are provided for exponential and single-parameter Pareto distributions for a few data truncation and censoring cases. Additionally, using simulated data we examine how wrong quantile estimates can be when one makes incorrect modeling assumptions. The numerical analysis is also supplemented with standard model diagnostics and validation (e.g., quantile-quantile plots, goodness-of-fit tests, information criteria) and presents an example of when those methods can mislead the decision maker. These findings pave the way for further work on RECs with potential for them being developed into an effective diagnostic tool in this context.
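The contrast between the nonparametric and parametric estimators above is simple to state in code. This sketch covers the complete-data case only (the paper's truncated and censored likelihoods are beyond a few lines), with an exponential model as the parametric example:

```python
import math

def empirical_quantile(sample, p):
    """Nonparametric estimate: the order statistic at level p
    (a simple version of the empirical quantile)."""
    xs = sorted(sample)
    k = max(0, min(len(xs) - 1, math.ceil(p * len(xs)) - 1))
    return xs[k]

def exponential_mle_quantile(sample, p):
    """Parametric estimate under an exponential model: the MLE of the
    mean is the sample mean, and F^{-1}(p) = -mean * ln(1 - p)."""
    mean = sum(sample) / len(sample)
    return -mean * math.log(1.0 - p)
```

When the exponential assumption is right, the parametric estimator is far more efficient in the tail; when it is wrong, it can be badly biased, which is the trade-off the relative efficiency curves quantify.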

]]>Risks doi: 10.3390/risks7020054

Authors: Rafał Wójcik Charlie Wusuo Liu Jayanta Guin

We present several fast algorithms for computing the distribution of a sum of spatially dependent, discrete random variables to aggregate catastrophe risk. The algorithms are based on direct and hierarchical copula trees. Computing speed comes from the fact that loss aggregation at branching nodes is based on a combination of a fast approximation to brute-force convolution, arithmetization (regridding) and the linear complexity of the method for computing the distribution of a comonotonic sum of risks. We discuss the impact of tree topology on the second-order moments and tail statistics of the resulting distribution of the total risk. We test the performance of the presented models by accumulating ground-up losses for 29,000 risks affected by hurricane peril.

]]>Risks doi: 10.3390/risks7020053

Authors: Ross Taplin Clive Hunt

Risk models developed on one dataset are often applied to new data and, in such cases, it is prudent to check that the model is suitable for the new data. An important application is in the banking industry, where statistical models are applied to loans to determine provisions and capital requirements. These models are developed on historical data, and regulations require their monitoring to ensure they remain valid on current portfolios&mdash;often years since the models were developed. The Population Stability Index (PSI) is an industry standard to measure whether the distribution of the current data has shifted significantly from the distribution of data used to develop the model. This paper explores several disadvantages of the PSI and proposes the Prediction Accuracy Index (PAI) as an alternative. The superior properties and interpretation of the PAI are discussed and it is concluded that the PAI can more accurately summarise the level of population stability, helping risk analysts and managers determine whether the model remains fit-for-purpose.
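The PSI discussed above has a standard closed form over binned proportions, which a short sketch makes concrete (the bin proportions and rule-of-thumb cutoffs below are illustrative):

```python
import math

def psi(expected_props, actual_props, eps=1e-6):
    """Population Stability Index over binned proportions.

    PSI = sum over bins of (a - e) * ln(a / e), where e and a are the
    development-sample and current-sample proportions per bin.
    """
    total = 0.0
    for e, a in zip(expected_props, actual_props):
        e, a = max(e, eps), max(a, eps)   # guard against empty bins
        total += (a - e) * math.log(a / e)
    return total

# identical distributions give PSI = 0; common rules of thumb flag
# PSI > 0.1 as a moderate shift and > 0.25 as a major shift
shift = psi([0.25, 0.25, 0.25, 0.25], [0.30, 0.25, 0.25, 0.20])
```

The paper's criticism is that this statistic looks only at the input distribution, not at how the shift affects the model's predictions, which is the gap the Prediction Accuracy Index is designed to close.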

]]>Risks doi: 10.3390/risks7020052

Authors: Erwan Koch

An accurate assessment of the risk of extreme environmental events is of great importance for populations, authorities and the banking/insurance/reinsurance industry. Koch (2017) introduced a notion of spatial risk measure and a corresponding set of axioms which are well suited to analyze the risk due to events having a spatial extent, precisely such as environmental phenomena. The axiom of asymptotic spatial homogeneity is of particular interest since it allows one to quantify the rate of spatial diversification when the region under consideration becomes large. In this paper, we first investigate the general concepts of spatial risk measures and corresponding axioms further and thoroughly explain the usefulness of this theory for both actuarial science and practice. Second, in the case of a general cost field, we give sufficient conditions such that spatial risk measures associated with expectation, variance, value-at-risk as well as expected shortfall and induced by this cost field satisfy the axioms of asymptotic spatial homogeneity of order 0, &minus;2, &minus;1 and &minus;1, respectively. Last but not least, in the case where the cost field is a function of a max-stable random field, we provide conditions on both the function and the max-stable field ensuring the latter properties. Max-stable random fields are relevant when assessing the risk of extreme events since they appear as a natural extension of multivariate extreme-value theory to the level of random fields. Overall, this paper improves our understanding of spatial risk measures as well as of their properties with respect to the space variable and generalizes many results obtained in Koch (2017).

]]>Risks doi: 10.3390/risks7020051

Authors: Sagara Dewasurendra Pedro Judice Qiji Zhu

Banks make profits from the difference between short-term and long-term loan interest rates. To issue loans, banks raise funds from capital markets. Since the long-term loan rate is relatively stable but short-term interest rates are usually variable, there is an interest rate risk. Therefore, banks need information about optimal leverage strategies based on the current economic situation. Recent studies of the economic crisis by many economists showed that the crisis was due to excessive leveraging by &ldquo;big banks&rdquo;, and that this leveraging turns out to be close to Kelly&rsquo;s optimal point. It is known that Kelly&rsquo;s strategy does not address risk adequately. We used the return&ndash;drawdown ratio and the inflection point of Kelly&rsquo;s cumulative return curve in a finite investment horizon to derive more conservative leverage levels. Moreover, we carried out a sensitivity analysis to determine strategies during a period of rising interest rates, which is the most important and risky period in which to leverage. Thus, we brought theoretical results closer to practical applications. Furthermore, by using the sensitivity analysis method, banks can change the allocation sizes of loans with different maturities to mediate the risks corresponding to different monetary policy environments. This provides bank managers with flexible tools for mitigating risk.

]]>Risks doi: 10.3390/risks7020050

Authors: Francesca Greselin Fabio Piacenza Ričardas Zitikis

We explore the Monte Carlo steps required to reduce the sampling error of the estimated 99.9% quantile within an acceptable threshold. Our research is of primary interest to practitioners working in the area of operational risk measurement, where the annual loss distribution cannot be analytically determined in advance. Usually, the frequency and the severity distributions should be adequately combined and elaborated with Monte Carlo methods, in order to estimate the loss distributions and risk measures. Naturally, financial analysts and regulators are interested in mitigating sampling errors, as prescribed in EU Regulation 2018/959. In particular, the sampling error of the 99.9% quantile is of paramount importance, along the lines of EU Regulation 575/2013. The Monte Carlo error for the operational risk measure is here assessed on the basis of the binomial distribution. Our approach is then applied to realistic simulated data, yielding a comparable precision of the estimate with a much lower computational effort, when compared to bootstrap, Monte Carlo repetition, and two other methods based on numerical optimization.
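The binomial reasoning behind the approach above can be sketched directly: the number of simulated losses falling below the true 99.9% quantile is Binomial(n, 0.999), so order statistics bracketing that count give a confidence band for the quantile. For simplicity this sketch uses the normal approximation to the binomial rather than exact inversion; the target coverage of 95% is illustrative:

```python
import math

def quantile_ci_ranks(n, q=0.999):
    """Ranks (1-based) of the order statistics bracketing the q-quantile
    of n simulated losses with roughly 95% coverage.

    The count of samples below the true quantile is Binomial(n, q);
    the normal approximation stands in for exact binomial inversion.
    """
    z = 1.96                        # ~97.5% standard normal quantile
    mean, sd = n * q, math.sqrt(n * q * (1.0 - q))
    lo = max(1, int(math.floor(mean - z * sd)))
    hi = min(n, int(math.ceil(mean + z * sd)) + 1)
    return lo, hi

lo, hi = quantile_ci_ranks(1_000_000)
# one would then monitor losses_sorted[hi - 1] - losses_sorted[lo - 1]
# and grow the Monte Carlo sample until that width meets the threshold
```

The appeal of the method is visible here: the band width is read off a single sorted sample, with no bootstrap resampling or repeated Monte Carlo runs.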

]]>Risks doi: 10.3390/risks7020049

Authors: Søren Asmussen Bent Jesper Christensen Julie Thøgersen

Two insurance companies I1 and I2 with reserves R1(t) and R2(t) compete for customers, such that in a suitable differential game the smaller company I2, with R2(0) &lt; R1(0), aims at minimizing R1(t) &minus; R2(t) by using the premium p2 as control, while the larger I1 aims at maximizing it by using p1. Deductibles K1 and K2 are fixed but may be different. If K1 &gt; K2 and I2 is the leader choosing its premium first, conditions for a Stackelberg equilibrium are established. For gamma-distributed rates of claim arrivals, explicit equilibrium premiums are obtained and shown to depend on the running reserve difference. The analysis is based on the diffusion approximation to a standard Cram&eacute;r-Lundberg risk process extended to allow investment in a risk-free asset.

]]>Risks doi: 10.3390/risks7020048

Authors: Matteo Brachetta Claudia Ceci

We study the optimal excess-of-loss reinsurance problem when both the intensity of the claims arrival process and the claim size distribution are influenced by an exogenous stochastic factor. We assume that the insurer&rsquo;s surplus is governed by a marked point process with dual-predictable projection affected by an environmental factor and that the insurance company can borrow and invest money at a constant real-valued risk-free interest rate r. Our model allows for stochastic risk premia, which take into account risk fluctuations. Using stochastic control theory based on the Hamilton-Jacobi-Bellman equation, we analyze the optimal reinsurance strategy under the criterion of maximizing the expected exponential utility of the terminal wealth. A verification theorem for the value function in terms of classical solutions of a backward partial differential equation is provided. Finally, some numerical results are discussed.

]]>Risks doi: 10.3390/risks7020047

Authors: Delphine Boursicot Geneviève Gauthier Farhad Pourkalbassi

Contingent Convertible (CoCo) debt is a hybrid instrument issued by banks with a specific feature forcing its conversion to equity in the event of the bank&rsquo;s financial distress. CoCo debt carries two major risks: the risk of default, which threatens any type of debt instrument, plus the exclusive risk of mandatory conversion. In this paper, we propose a model to value CoCo debt instruments as a function of the debt ratio. Although CoCo debt is a more expensive instrument than traditional debt, its presence in the capital structure lowers the cost of ordinary debt and reduces the total cost of debt. For initial equity holders, the presence of CoCo debt in the bank&rsquo;s capital structure increases the shareholders&rsquo; aggregate value.

]]>Risks doi: 10.3390/risks7020046

Authors: Markus K. Brunnermeier Patrick Cheridito

In this paper, we develop a framework for measuring, allocating and managing systemic risk. SystRisk, our measure of total systemic risk, captures the a priori cost to society for providing tail-risk insurance to the financial system. Our allocation principle distributes the total systemic risk among individual institutions according to their size-shifted marginal contributions. To describe economic shocks and systemic feedback effects, we propose a reduced form stochastic model that can be calibrated to historical data. We also discuss systemic risk limits, systemic risk charges and a cap and trade system for systemic risk.

]]>Risks doi: 10.3390/risks7020045

Authors: Hirbod Assa Mostafa Pouralizadeh Abdolrahim Badamchizadeh

While the main conceptual issue related to deposit insurance is moral hazard risk, the main technical issue is inaccurate calibration of the implied volatility, which can raise the risk of generating an arbitrage. In this paper, we first argue that, under the no-moral-hazard condition, the removal of arbitrage is equivalent to removing static arbitrage. Then, we propose a simple quadratic model to parameterize the implied volatility and remove static arbitrage. The process of removing the static risk is as follows: using a machine learning approach with a regularized cost function, we update the parameters so that butterfly arbitrage is ruled out; in addition, implementing a calibration method, we impose conditions on the parameters of each time slice to rule out calendar-spread arbitrage. Eliminating both butterfly and calendar-spread arbitrage makes the implied volatility surface free of static arbitrage.

]]>Risks doi: 10.3390/risks7020044

Authors: Rakesh Arrawatia Arun Misra Varun Dawar Debasish Maitra

Banks in India have gone through structural changes in the last three decades. The prices that banks charge depend on the level of competition in the banking sector and on the risk that the assets and liabilities carry on banks&rsquo; balance sheets. The traditional Lerner Index indicates competitive levels but does not account for risk, so this study introduces a risk-adjusted Lerner Index for evaluating competition in Indian banking over the period 1996 to 2016. The market power estimated through the adjusted Lerner Index has been declining since 1996, which indicates an improvement in competitive conditions over the period. Further, as indicated by the risk-adjusted Lerner Index, the Indian banking system exerts much less market power, and hence is more competitive, than the traditional Lerner Index suggests.

]]>Risks doi: 10.3390/risks7020043

Authors: Kaiwen Wang Jiehui Ding Kristen R. Lidwell Scott Manski Gee Y. Lee Emilio Xavier Esposito

This research discusses general approaches to analyzing and modeling healthcare data at the treatment level and at the store level. The paper consists of two parts: (1) a general analysis method for store-level product sales of an organization and (2) a treatment-level analysis method for healthcare expenditures. In the first part, our goal is to develop a modeling framework to help understand the factors influencing the sales volume of stores maintained by a healthcare organization. In the second part, we demonstrate a treatment-level approach to modeling healthcare expenditures, aiming to improve the operational-level management of a healthcare provider by predicting the total cost of medical services. From this perspective, treatment-level analyses of medical expenditures may help provide a micro-level approach to predicting the total amount of expenditures for a healthcare provider. We present a model for analyzing a specific type of medical data that may arise commonly in a healthcare provider&rsquo;s standardized database, using an extension of the frequency-severity approach to modeling insurance expenditures from the actuarial science literature.
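The frequency-severity idea underlying the second part can be sketched by simulation: a count distribution for how many treatments occur, and a severity distribution for the cost of each. The Poisson/gamma choices and all parameter values below are illustrative stand-ins, not the paper's fitted model:

```python
import math
import random

def simulate_total_cost(lam, sev_shape, sev_scale, n_sims=5000, seed=0):
    """Frequency-severity sketch: Poisson(lam) treatment counts with
    gamma-distributed cost per treatment; returns the simulated mean
    total cost. Under independence of counts and severities,
    E[total] = lam * sev_shape * sev_scale, which the simulation
    should approximate.
    """
    rng = random.Random(seed)

    def poisson(l):
        # Knuth's multiplication method (fine for moderate l)
        L, k, p = math.exp(-l), 0, 1.0
        while True:
            p *= rng.random()
            if p <= L:
                return k
            k += 1

    total = 0.0
    for _ in range(n_sims):
        n = poisson(lam)
        total += sum(rng.gammavariate(sev_shape, sev_scale) for _ in range(n))
    return total / n_sims

mean_cost = simulate_total_cost(lam=2.0, sev_shape=2.0, sev_scale=50.0)
```

The paper's extension adds covariates (patient and treatment features) to both the frequency and the severity components; this sketch only shows the decomposition itself.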

]]>Risks doi: 10.3390/risks7020042

Authors: Shengkun Xie

Territory design and analysis using geographical loss cost is a key aspect of auto insurance rate regulation. The major objective of this work is to study the design of geographical rating territories by maximizing within-group homogeneity and among-group heterogeneity from a statistical perspective, while maximizing the actuarial equity of pure premium, as required by insurance regulation. To achieve this goal, spatially-constrained clustering of industry-level loss cost was investigated. In order to meet contiguity, a legal requirement on the design of geographical rating territories, a clustering approach based on Delaunay triangulation is proposed. Furthermore, an entropy-based approach is introduced to quantify the homogeneity of clusters, while both the elbow method and the gap statistic are used to determine the initial number of clusters. This study illustrates the usefulness of the spatially-constrained clustering approach in defining geographical rating territories for insurance rate regulation purposes. The significance of this work is to provide a new solution for better designing geographical rating territories. The proposed method can also be useful for other demographic data analyses with similar spatial constraints.

]]>Risks doi: 10.3390/risks7020041

Authors: Fadoua Zeddouk Pierre Devolder

Annuity providers are becoming more and more exposed to longevity risk due to the increase in life expectancy. To hedge this risk, new longevity derivatives have been proposed (longevity bonds, q-forwards, S-swaps&hellip;). Although academic researchers, policy makers and practitioners have discussed them for years, longevity-linked securities are not widely traded in financial markets, due in particular to the difficulty of pricing them. In this paper, we compare different existing pricing methods and propose a Cost of Capital approach. Our method is designed to be more consistent with the Solvency II requirement that longevity risk assessment be based on a one-year time horizon. The price of longevity risk is determined for an S-forward and an S-swap, but the approach can be used to price other longevity-linked securities. We also compare this Cost of Capital method with some classical pricing approaches. The Hull-White and extended CIR models are used to represent the evolution of mortality over time. We use data for the Belgian population to derive prices for the proposed longevity-linked securities under the different methods.
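The cost-of-capital pricing logic above follows the familiar Solvency II risk-margin shape: the price adds to the best estimate a margin equal to the cost-of-capital rate times the discounted projected capital requirements. This is a generic sketch; the SCR projections, discount rate and 6% cost-of-capital rate are illustrative placeholders, not the paper's calibration:

```python
def cost_of_capital_margin(scrs, r=0.02, coc=0.06):
    """Risk margin in the cost-of-capital style: coc times the sum of
    projected capital requirements discounted to today.

    scrs: projected capital requirements for years 1, 2, ...;
    r: flat discount rate; coc: cost-of-capital rate.
    """
    return coc * sum(scr / (1.0 + r) ** (t + 1) for t, scr in enumerate(scrs))

# price of the longevity-linked cash flows = best estimate + risk margin
best_estimate = 100.0
margin = cost_of_capital_margin([10.0, 8.0, 6.0, 4.0, 2.0])
price = best_estimate + margin
```

In the paper, the SCR projections themselves come from one-year shocks to the stochastic mortality models, which is where the Hull-White and extended CIR dynamics enter.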

]]>Risks doi: 10.3390/risks7020040

Authors: Logan Ewanchuk Christoph Frei

A recently introduced accounting standard, namely the International Financial Reporting Standard 9, requires banks to build provisions based on forward-looking expected loss models. When there is a significant increase in credit risk of a loan, additional provisions must be charged to the income statement. Banks need to set, for each loan, a threshold defining what constitutes such a significant increase in credit risk. A low threshold allows banks to recognize credit risk early, but leads to income volatility. We introduce a statistical framework to model this trade-off between early recognition of credit risk and avoidance of excessive income volatility. We analyze the resulting optimization problem for different models, relate it to the banking stress test of the European Union, and illustrate it using default data from Standard and Poor&rsquo;s.

]]>Risks doi: 10.3390/risks7020039

Authors: John Moriarty Jan Palczewski

As decarbonisation progresses and conventional thermal generation gradually gives way to other technologies including intermittent renewables, there is an increasing requirement for system balancing from new and also fast-acting sources such as battery storage. In the deregulated context, this raises questions of market design and operational optimisation. In this paper, we assess the real option value of an arrangement under which an autonomous energy-limited storage unit sells incremental balancing reserve. The arrangement is akin to a perpetual American swing put option with random refraction times, where a single incremental balancing reserve action is sold at each exercise. The power used is bought in an energy imbalance market (EIM), whose price we take as a general regular one-dimensional diffusion. The storage operator&rsquo;s strategy and its real option value are derived in this framework by solving the twin timing problems of when to buy power and when to sell reserve. Our results are illustrated with an operational and economic analysis using data from the German Amprion EIM.

]]>Risks doi: 10.3390/risks7020038

Authors: Farid Flici Frédéric Planchet

The aim of this paper is to construct prospective life tables adapted to the experience of Algerian retirees. Mortality data for the retired population are only available for ages from 50 to 95 and older and for the period from 2004 to 2013. Conventional prospective mortality models cannot be expected to provide robust forecasts given such data limitations, in terms of either exposure to death risk or data length. To improve forecasting robustness, we use the mortality of the global population as an external reference. Adjusting the experience mortality to the reference allows projecting the age-specific death rates calculated from the experience of the retired population. We propose a generalized version of the Brass-type relational model, incorporating a quadratic effect, to perform the adjustment. Results show no significant difference for men, whether retired or not, but reveal a gap of over three years in the remaining life expectancy at age 50 in favor of retired women compared with women in the global population.
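A Brass-type relational model with a quadratic term, in the spirit of the adjustment above, maps a reference population's death probabilities to the experience population on the logit scale. This is a minimal sketch; the coefficients would be fitted on the overlapping ages, and the values used here are illustrative:

```python
import math

def logit(q):
    """Logit transform of a death probability 0 < q < 1."""
    return math.log(q / (1.0 - q))

def inv_logit(y):
    """Inverse logit, mapping back to a probability."""
    return 1.0 / (1.0 + math.exp(-y))

def brass_quadratic(q_ref, a, b, c):
    """Quadratic Brass-type relational model:
    logit(q_exp) = a + b * logit(q_ref) + c * logit(q_ref)^2.
    """
    y = logit(q_ref)
    return inv_logit(a + b * y + c * y * y)

# with a = 0, b = 1, c = 0 the model reproduces the reference mortality;
# nonzero a, b, c tilt the experience table relative to the reference
q_exp = brass_quadratic(0.01, a=-0.1, b=1.05, c=0.01)
```

Projecting the reference population forward and pushing it through the fitted relation then yields projected death rates for the small experience population.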

]]>Risks doi: 10.3390/risks7020037

Authors: Yasutaka Shimizu Zhimin Zhang

A statistical inference for the ruin probability from a certain discrete sample of the surplus is discussed under a spectrally negative L&eacute;vy insurance risk. We consider the Laguerre series expansion of the ruin probability and provide an estimator for any of its partial sums by computing the coefficients of the expansion. We show that the proposed estimator is asymptotically normal and consistent, with the optimal rate of convergence and an estimable asymptotic variance. This estimator enables not only point estimation of the ruin probability but also approximate interval estimation and hypothesis testing.

]]>Risks doi: 10.3390/risks7020036

Authors: Jean-Philippe Aguilar Jan Korbel

We provide ready-to-use formulas for European options prices, risk sensitivities, and P&amp;L calculations under L&eacute;vy-stable models with maximal negative asymmetry. Particular cases, efficiency testing, and some qualitative features of the model are also discussed.

Risks doi: 10.3390/risks7020035

Authors: Massimiliano Menzietti Maria Francesca Morabito Manuela Stranges

In small populations, mortality rates are characterized by great volatility, and datasets are often available only for a few years and suffer from missing data. Standard mortality models may therefore produce highly uncertain and biologically improbable projections. In this paper, we deal with mortality projections for the Maltese population, a small country with fewer than 500,000 inhabitants, whose data on exposures and observed deaths suffer from all the typical problems of small populations. We concentrate our analysis on older adult mortality. Starting from some recent suggestions in the literature, we assume that the mortality of a small population can be modeled starting from the mortality of a bigger one (the reference population) by adding a spread. The first part of the paper is dedicated to the choice of the reference population; we then test alternative mortality models. Finally, we verify the capacity of the proposed approach to reduce the volatility of the mortality projections. The results show that the model is able to significantly reduce the uncertainty of projected mortality rates and to ensure their coherent and biologically reasonable evolution.

Risks doi: 10.3390/risks7010034

Authors: Zbigniew Palmowski Łukasz Stettner Anna Sulima

We study a portfolio selection problem in a continuous-time Itô–Markov additive market with prices of financial assets described by Markov additive processes that combine Lévy processes and regime switching models. Thus, the model takes into account two sources of risk: the jump diffusion risk and the regime switching risk. For this reason, the market is incomplete. We complete the market by enlarging it with a set of Markovian jump securities, Markovian power-jump securities and impulse regime switching securities. Moreover, we give conditions under which the market is asymptotic-arbitrage-free. We solve the portfolio selection problem in the Itô–Markov additive market for the power utility and the logarithmic utility.

Risks doi: 10.3390/risks7010033

Authors: Andrea Nigri Susanna Levantesi Mario Marino Salvatore Scognamiglio Francesca Perla

In the field of mortality forecasting, the Lee–Carter approach can be considered the milestone among stochastic models. One could define a “Lee–Carter model family” that embraces all developments of this model, including its first formulation (1992), which remains the benchmark for comparing the performance of newer models. In the Lee–Carter model, the κ_t parameter, describing the mortality trend over time, plays an important role in determining future mortality behavior. The ARIMA process traditionally used to model κ_t shows evident limitations in describing the future mortality shape; in the forecasting phase, a more plausible approach should allow for a nonlinear shape of the projected mortality rates. We therefore propose an alternative to ARIMA processes based on a deep learning technique. More precisely, to capture the pattern of the κ_t series over time more accurately, we apply a Recurrent Neural Network with a Long Short-Term Memory (LSTM) architecture and integrate it into the Lee–Carter model to improve its predictive capacity. The proposed approach yields significant gains in predictive accuracy and also avoids the a priori selection of time chunks; it is common practice among academics to delete periods in which the noise is overwhelming or the data quality is insufficient. The strength of the LSTM network lies in its ability to handle this noise and reproduce it adequately in the forecasted trend, since its architecture is able to account for significant long-term patterns.
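
For reference, the structure the κ_t series enters is the classical Lee–Carter specification, whose trend is traditionally driven by a random walk with drift (the ARIMA process that the LSTM replaces):

```latex
% Lee--Carter: log central death rate at age x in year t
\ln m_{x,t} = a_x + b_x\,\kappa_t + \varepsilon_{x,t},
\qquad \sum_x b_x = 1, \quad \sum_t \kappa_t = 0 .
% Classical trend dynamics (random walk with drift), here replaced by an LSTM:
\kappa_t = \kappa_{t-1} + d + e_t .
```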

Risks doi: 10.3390/risks7010032

Authors: Zhuo Jin Zhixin Yang Quan Yuan

This paper studies optimal investment and consumption strategies in a two-asset model. A dynamic Value-at-Risk constraint is imposed to manage the wealth process. By using Value at Risk as the risk measure over the investment horizon, the decision maker can dynamically monitor the exposed risk and quantify the maximum expected loss over a finite horizon at a given confidence level. In addition, the decision maker has to filter the key economic factors to make decisions. Considering the cost of filtering the factors, the decision maker aims to maximize the utility of consumption over a finite horizon. By using the Kalman filter, a partially observed system is converted to a completely observed one. However, due to the cost of information processing, the decision maker fails to process the information in an arbitrarily rational manner and can only make decisions on the basis of the limited observed signals. A genetic algorithm is developed to find the optimal investment and consumption strategies and the observation strength. Numerical simulation results are provided to illustrate the performance of the algorithm.

Risks doi: 10.3390/risks7010031

Authors: Wanbing Zhang Sisi Zhang Peibiao Zhao

Value at Risk (VaR) describes the maximum potential loss at a given confidence level, but it is a single risk indicator that ignores any information about income. The present paper generalizes one-dimensional VaR to a two-dimensional VaR with income-risk double indicators. We first construct a double-VaR with (μ, σ²) (or (μ, VaR²)) indicators, and deduce the joint confidence region of (μ, σ²) (or (μ, VaR²)) by means of the two-dimensional likelihood ratio method. Finally, an empirical analysis of the two double-VaR models is presented.

Risks doi: 10.3390/risks7010030

Authors: Fabien Le Floc’h Cornelis W. Oosterlee

This paper explores the stochastic collocation technique, applied to a monotonic spline, as an arbitrage-free and model-free interpolation of implied volatilities. We explore various spline formulations, including B-spline representations. We explain how to calibrate the different representations against market option prices, how to smooth out the market quotes, and how to choose a proper initial guess. The technique is then applied to concrete market options, and the stability of the different approaches is analyzed. Finally, we consider a challenging example where convex spline interpolations lead to oscillations in the implied volatility, and compare the spline collocation results with those obtained through the arbitrage-free interpolation technique of Andreasen and Huge.

Risks doi: 10.3390/risks7010029

Authors: Martin Leo Suneel Sharma K. Maddulety

There is an increasing influence of machine learning in business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained more prominence, and there has been a constant focus on how risks are detected, measured, reported and managed. Considerable research in academia and industry has focused on developments in banking and risk management and the current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review shows that the application of machine learning to the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored; however, the attention does not appear commensurate with the current industry level of focus on both risk management and machine learning. A large number of areas remain in bank risk management that could significantly benefit from the study of how machine learning can be applied to address specific problems.

Risks doi: 10.3390/risks7010028

Authors: Shi Chen Jyh-Horng Lin Wenyu Yao Fu-Wei Huang

In this paper, we develop a contingent claim model to evaluate the equity, default risk, and efficiency gain/loss from managerial overconfidence of a shadow-banking life insurer under purchases of distressed assets by the government. Our paper focuses on managerial overconfidence in which the chief executive officer (CEO) overestimates the returns on investment. The investment market faced by the life insurer is imperfectly competitive, and investment is core to the provision of profit-sharing life insurance policies. We show that CEO overconfidence raises the default risk in the life insurer's equity returns, thereby adversely affecting financial stability. Either shadow-banking involvement or a government bailout attenuates this unfavorable effect. There is an efficiency gain from CEO overconfidence to investment. A government bailout helps to reduce the life insurer's default risk, but simultaneously reduces the efficiency gain from CEO overconfidence. Our results contribute to the managerial overconfidence literature by linking insurer shadow-banking involvement and government bailout, in particular during a financial crisis.

Risks doi: 10.3390/risks7010027

Authors: Apostolos Bozikas Georgios Pitselis

In this paper, we propose a credible regression approach with random coefficients to model and forecast the mortality dynamics of a given population with limited data. Age-specific mortality rates are modelled and extrapolation methods are utilized to estimate future mortality rates. The results on Greek mortality data indicate that credibility regression contributed to more accurate forecasts than those produced from the Lee–Carter and Cairns–Blake–Dowd models. An application on pricing insurance-related products is also provided.

Risks doi: 10.3390/risks7010026

Authors: Susanna Levantesi Virginia Pizzorusso

Estimation of future mortality rates still plays a central role among life insurers in pricing their products and managing longevity risk. In the literature on mortality modeling, a wide range of stochastic models have been proposed, most of them forecasting future mortality rates by extrapolating one or more latent factors. The abundance of proposed models shows that forecasting future mortality from historical trends is non-trivial. Following the idea proposed in Deprez et al. (2017), we use machine learning algorithms, able to catch patterns that are not commonly identifiable, to calibrate a parameter (the machine learning estimator), improving the goodness of fit of standard stochastic mortality models. The machine learning estimator is then forecasted according to the Lee–Carter framework, allowing one to obtain a higher forecasting quality than the standard stochastic models. Out-of-sample forecasts are provided to verify the model accuracy.

Risks doi: 10.3390/risks7010025

Authors: Phillip J. Monin

We introduce a financial stress index that was developed by the Office of Financial Research (OFR FSI) and detail its purpose, construction, interpretation, and use in financial market monitoring. The index employs a novel and flexible methodology using daily data from global financial markets. Analysis for the 2000–2018 period is presented. Using a logistic regression framework and dates of government intervention in the financial system as a proxy for stress events, we find that the OFR FSI performs well in identifying systemic financial stress. In addition, we find that the OFR FSI leads the Chicago Fed National Activity Index in a Granger causality analysis, suggesting that increases in financial stress help predict decreases in economic activity.

Risks doi: 10.3390/risks7010023

Authors: Sel Ly Kim-Hung Pho Sal Ly Wing-Keung Wong

Determining the distributions of functions of random variables is one of the most important problems in statistics and applied mathematics, because such distributions have a wide range of applications in economics, finance, risk management, science, and other areas. However, most studies focus only on independent variables, or on some common distributions such as multivariate normal joint distributions, for functions of dependent random variables. To bridge this gap in the literature, in this paper we first derive general formulas for both the density and the distribution of the product of two or more random variables via copulas, to capture the dependence structures among the variables. We then propose an approach combining a Monte Carlo algorithm, a graphical approach, and numerical analysis to efficiently estimate both density and distribution. We illustrate our approach by examining the shapes and behaviors of both the density and the distribution of the product of two log-normal random variables under several different copulas, including the Gaussian, Student-t, Clayton, Gumbel, Frank, and Joe copulas, and estimate some common measures, including Kendall's coefficient, mean, median, standard deviation, skewness, and kurtosis, for the distributions. We find that different types of copulas affect the behavior of the distributions differently. In addition, we discuss the behaviors under all the copulas above with the same Kendall's coefficient. Our results are the foundation of any further study that relies on the density and cumulative probability functions of the product of two or more random variables. Thus, the theory developed in this paper is useful for academics, practitioners, and policy makers.
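
The Monte Carlo side of such a study is straightforward to sketch for the Gaussian copula case. Below is a minimal pure-Python illustration (the function name and default parameters are ours, purely for the example): correlated standard normals are built by a 2×2 Cholesky step and mapped to log-normal marginals, so the sampled products can be compared against the closed-form bivariate log-normal benchmark.

```python
import math
import random

def sample_product_gaussian_copula(n, rho, mu=(0.0, 0.0), sigma=(0.5, 0.5), seed=0):
    """Monte Carlo sample of the product X1*X2 of two log-normal variables
    whose dependence is a Gaussian copula with correlation rho.

    With log-normal marginals, the Gaussian copula makes the pair jointly
    bivariate log-normal, so the product has a known closed form:
    E[X1*X2] = exp(mu1 + mu2 + (s1^2 + s2^2 + 2*rho*s1*s2)/2)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1 = rng.gauss(0, 1)
        z2 = rho * z1 + math.sqrt(1 - rho * rho) * rng.gauss(0, 1)  # Cholesky step
        x1 = math.exp(mu[0] + sigma[0] * z1)
        x2 = math.exp(mu[1] + sigma[1] * z2)
        out.append(x1 * x2)
    return out
```

Swapping the Gaussian step for a Clayton, Gumbel or other copula sampler changes the tail dependence of the product, which is exactly the behavior the paper examines.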

Risks doi: 10.3390/risks7010024

Authors: Ahsan Nawaz Ahsan Waqar Syyed Adnan Raheel Shah Muhammad Sajid Muhammad Irslan Khalid

Risk management is a comparatively new field, and there is no core system of risk management in the construction industries of developing countries. In Pakistan, construction is an extremely risk-prone industry that lacks a good reputation for handling risk. However, the industry is gradually giving risk management more importance as a result of increased competition and construction activity. For this purpose, a survey-based study has been conducted which aims to investigate the risk management practices used in construction projects in Pakistan. To achieve this objective, data were collected from 22 contractor firms working on 100 diverse projects. The analysis indicates that risk management has been implemented at a low level in the local environment. The results also disclose that there is a high degree of correlation between effective risk management and project success. The findings reveal the importance of risk management techniques, their usage and implications, and the effect of these techniques on the success of construction projects from the contractor's perspective, thus convincing the key participants of projects about the use of risk management.

Risks doi: 10.3390/risks7010022

Authors: Han Li Colin O’Hare

Extrapolative methods are among the most commonly adopted forecasting approaches in the literature on projecting future mortality rates. It can be argued that there are two types of mortality models using this approach. The first extracts patterns in the age, time and cohort dimensions, either in a deterministic or a stochastic fashion. The second uses non-parametric smoothing techniques to model mortality and thus places no explicit constraints on the model. We argue that, from a forecasting point of view, the main difference between the two types of models is whether they treat recent and historical information equally in the projection process. In this paper, we compare the forecasting performance of the two types of models using Great Britain male mortality data from 1950–2016. We also conduct a robustness test to see how sensitive the forecasts are to changes in the length of the historical data used to calibrate the models. The main conclusion of the study is that more recent information should be given more weight in the forecasting process, as it has greater predictive power than historical information.

Risks doi: 10.3390/risks7010021

Authors: Mariarosaria Coppola Maria Russolillo Rosaria Simone

The management of national social security systems is being challenged more and more by the rapid ageing of the population, especially in industrialized countries. In pursuit of pension system sustainability, several countries in Europe are setting up pension reforms linking the retirement age and/or benefits to life expectancy. In this context, the accurate modelling and projection of mortality rates and life expectancy play a central role and represent issues of great interest in the recent literature. Our study refers to the Italian mortality experience and considers an indexing mechanism based on the expected residual life to adjust the retirement age and keep costs at an expected budgeted level, in the spirit of sharing the longevity risk between social security systems and retirees. In order to combine the fitting and projection performance of selected stochastic mortality models, a model assembling technique is applied to address uncertainty in model selection, while accounting for uncertainty of estimation as well. The resulting proposal is an averaged model that is suitable for discussing the gender gap in longevity risk and its alleged narrowing over time.

Risks doi: 10.3390/risks7010020

Authors: Diby François Kassi Dilesha Nawadali Rathnayake Pierre Axel Louembe Ning Ding

This study examines the effect of market risk on the financial performance of 31 non-financial companies listed on the Casablanca Stock Exchange (CSE) over the period 2000–2016. We use three alternative variables to assess financial performance, namely, the return on assets, the return on equity and the profit margin. We use the degree of financial leverage, the book-to-market ratio, and the gearing ratio as indicators of market risk. We then employ the pooled OLS model, the fixed effects model, the random effects model, and the difference-GMM and system-GMM models. The results show that the different measures of market risk have significant negative influences on the companies' financial performance. The elasticities are greater for the degree of financial leverage than for the book-to-market ratio and the gearing ratio. In most cases, the firm's age, the cash holdings ratio, the firm's size, the debt-to-assets ratio, and the tangibility ratio have positive effects on financial performance, whereas the debt-to-income ratio and the stock turnover hurt the performance of these non-financial companies. Therefore, decision-makers and managers should mitigate market risk through appropriate risk management strategies, such as derivatives and insurance techniques.

Risks doi: 10.3390/risks7010019

Authors: Hui Ye Anthony Bellotti

Based on a rich dataset of recoveries donated by a debt collection business, recovery rates for non-performing loans taken from a single European country are modelled using linear regression, linear regression with Lasso, beta regression and inflated beta regression. We also propose a two-stage model: a beta mixture model combined with a logistic regression model. The proposed model allows us to capture the multimodal distribution we found for these recovery rates. All models were built using loan characteristics, default data and collections data prior to purchase by the debt collection business. The intended use of the models is to estimate future recovery rates for improved risk assessment, capital requirement calculations and bad debt management. The models were compared using a range of quantitative performance measures under K-fold cross-validation. Among all the models, we found that the proposed two-stage beta mixture model performs best.

Risks doi: 10.3390/risks7010018

Authors: Florin Avram Danijel Grahovac Ceren Vardar-Acar

As is well known, the benefit of restricting attention to Lévy processes without positive jumps is the “W, Z scale functions paradigm”, by which knowledge of the scale functions W, Z extends immediately to other risk control problems. The same is largely true for strong Markov processes X_t, with the notable distinctions that (a) it is more convenient to use as “basis” the differential exit functions ν, δ, and (b) it is not yet known how to compute ν, δ or W, Z beyond the Lévy, diffusion, and a few other cases. The unifying framework outlined in this paper suggests, however, via an example, that the spectrally negative Markov and Lévy cases are very similar (except for the level of work involved in computing the basic functions ν, δ). We illustrate the potential of the unified framework by introducing a new objective (33) for the optimization of dividends, inspired by the de Finetti problem of maximizing expected discounted cumulative dividends until ruin, where we replace ruin with an optimally chosen Azéma–Yor/generalized draw-down/regret/trailing stopping time. This is defined as a hitting time of the “draw-down” process Y_t = sup_{0≤s≤t} X_s − X_t obtained by reflecting X_t at its maximum. This new variational problem has been solved in a parallel paper.
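
On a discretely sampled path the draw-down process and the associated stopping time are elementary to compute; a minimal sketch (the helper names are ours, not the paper's):

```python
def drawdown_path(x):
    """Reflect a path at its running maximum: Y_t = max_{s<=t} X_s - X_t."""
    y, running_max = [], float("-inf")
    for v in x:
        running_max = max(running_max, v)
        y.append(running_max - v)
    return y

def drawdown_stopping_time(x, a):
    """First index t with Y_t >= a (a generalized draw-down/regret time),
    or None if the path never draws down by a."""
    for t, y in enumerate(drawdown_path(x)):
        if y >= a:
            return t
    return None
```

In the paper the draw-down level is not a fixed constant a but is optimally chosen within the dividend problem.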

Risks doi: 10.3390/risks7010017

Authors: Søren Asmussen Patrick J. Laub Hailiang Yang

Phase-type (PH) distributions are defined as distributions of lifetimes of finite continuous-time Markov processes. Their traditional applications are in queueing, insurance risk, and reliability, but more recently also in finance and, though to a lesser extent, in life and health insurance. The advantage is that PH distributions form a dense class and that problems having explicit solutions for exponential distributions typically become computationally tractable under PH assumptions. In the first part of this paper, the fitting of PH distributions to human lifetimes is considered, with special attention given to the class of generalized Coxian distributions; some new software is developed along the way. In the second part, the pricing of life insurance products such as the guaranteed minimum death benefit and the high-water benefit is treated for the case where the lifetime distribution is approximated by a PH distribution and the underlying asset price process is described by a jump diffusion with PH jumps. The expressions are typically explicit in terms of matrix-exponentials involving two matrices closely related to the Wiener–Hopf factorization, for which a Lévy process version has recently been developed for a PH horizon. The computational power of the approach is illustrated via a number of numerical examples.

Risks doi: 10.3390/risks7010016

Authors: Shuaiqiang Liu Cornelis W. Oosterlee Sander M. Bohte

This paper proposes a data-driven approach, by means of an Artificial Neural Network (ANN), to value financial options and to calculate implied volatilities, with the aim of accelerating the corresponding numerical methods. Since ANNs are universal function approximators, this method trains an optimized ANN on a data set generated by a sophisticated financial model, and runs the trained ANN as a surrogate for the original solver in a fast and efficient way. We test this approach on three different types of solvers: the analytic solution of the Black-Scholes equation, the COS method for the Heston stochastic volatility model, and Brent's iterative root-finding method for the calculation of implied volatilities. The numerical results show that the ANN solver can reduce the computing time significantly.
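
Two of the three reference solvers are compact enough to sketch in pure Python (with bisection standing in as a simplification of Brent's method); the trained ANN learns input-output maps like these:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes price of a European call (the analytic solver
    the ANN is trained to reproduce)."""
    d1 = (math.log(s / k) + (r + 0.5 * sigma ** 2) * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    return s * norm_cdf(d1) - k * math.exp(-r * t) * norm_cdf(d2)

def implied_vol(price, s, k, t, r, lo=1e-4, hi=4.0, tol=1e-10):
    """Implied volatility by bisection, a simplified stand-in for the
    Brent root-finder mentioned in the paper (valid because the call
    price is strictly increasing in sigma)."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if bs_call(s, k, t, r, mid) < price:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)
```

In the paper's setup the ANN is trained on many (inputs, price) or (price, volatility) pairs generated by solvers of this kind and then replaces the solver at inference time.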

Risks doi: 10.3390/risks7010015

Authors: Mansi Jain Gagan Deep Sharma Mrinalini Srivastava

‘Sustainable investment’ includes a variety of asset classes selected while caring for environmental, social, and governance (ESG) causes. It is an investment strategy that seeks to combine social and/or environmental benefits with financial returns, thus linking investors' social, ethical, ecological and economic concerns. Under certain conditions, sustainability indices also help to attract foreign capital seeking international participation in local capital markets. The purpose of this paper is to study whether sustainable investment alternatives offer better financial returns than conventional indices from both developed and emerging markets. To maintain consistency, this paper comparatively analyzes the financial returns of the Thomson Reuters/S-Network global indices, namely the developed markets (excluding US) ESG index (TRESGDX), the emerging markets ESG index (TRESGEX), the US large-cap ESG index (TRESGUS) and the Europe ESG index (TRESGEU), and those of the conventional markets, namely the MSCI World index (MSCI W), the MSCI All Country World Equity index (MSCI ACWI), the MSCI USA index (MSCI USA), the MSCI Europe Australasia Far East index (MSCI EAFE), the MSCI Emerging Markets index (MSCI EM) and the MSCI Europe index (MSCI EU). The study also focuses on the inter-linkages between these indices. Daily closing prices of all the benchmark indices are taken for the five-year period January 2013–December 2017. Line charts and unit-root tests are applied to check the stationarity of the series; Granger causality analysis and auto-regressive conditional heteroskedasticity (ARCH)-GARCH type modelling are performed to find the linkages between the markets under study, followed by Johansen's cointegration test and a vector error correction model to test the volatility spillover between the sustainable and conventional indices. The study finds that the sustainable indices and the conventional indices are integrated and that there is a flow of information between the two investment avenues. The results indicate no significant difference in performance between sustainable indices and traditional conventional indices, making the former a good substitute for the latter. Hence, financial/investment managers can obtain more insights regarding investment decisions, and the study further suggests that their portfolios should consider both types of indices, with a view to diversifying risk and hedging, and reap the benefits of doing so. Additionally, corporate executives may use the indices to benchmark their own performance against peers and to track news.

Risks doi: 10.3390/risks7010014

Authors: Rui Zhou Guangyu Xing Min Ji

Standardized longevity risk transfers often involve modeling the mortality rates of multiple populations. Some researchers have found that the mortality indexes of selected countries are cointegrated, meaning that a linear relationship exists between the indexes. A vector error correction model (VECM) has been used to incorporate this relation, thereby forcing the mortality rates of multiple populations to revert to a long-run equilibrium. However, the long-run equilibrium may change over time. It is crucial to incorporate these changes so that mortality dependence is adequately modeled. In this paper, we develop a framework to examine the presence of equilibrium changes and to incorporate these changes into the mortality model. In particular, we focus on equilibrium changes caused by a threshold effect, the phenomenon that mortality indexes alternate between different VECMs depending on the value of a threshold variable. Our framework comprises two steps. In the first step, a statistical test is performed to examine the presence of a threshold effect in the VECM for multiple mortality indexes. In the second step, a threshold vector error correction model (TVECM) is fitted to the mortality indexes and model adequacy is evaluated. We illustrate this framework with the mortality data of the England and Wales (EW) and Canadian populations. We further apply the TVECM to forecast future mortality and to price an illustrative longevity bond with the multivariate Wang transform. Our numerical results show that the TVECM predicts much faster mortality improvement for EW and Canada than a single-regime VECM, and thus the incorporation of the threshold effect significantly increases the longevity bond price.
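
The regime-switching mechanism of a TVECM can be illustrated with a toy two-population simulation (all parameter values here are ours, purely for illustration): each mortality index drifts downward, and the spread between them is pulled back towards equilibrium at a speed that depends on which side of the threshold the lagged spread lies.

```python
import random

def simulate_tvecm(n, beta=1.0, threshold=0.0, alphas=(-0.5, -0.1),
                   drift=(-1.0, -1.0), seed=0):
    """Toy two-regime threshold VECM for two mortality indexes k1, k2.

    Both indexes drift downward (mortality improvement); the spread
    k1 - beta*k2 is corrected with speed alphas[0] when it is at or below
    the threshold and alphas[1] when above it."""
    rng = random.Random(seed)
    k1, k2 = 0.0, 0.0
    path = [(k1, k2)]
    for _ in range(n):
        spread = k1 - beta * k2
        alpha = alphas[0] if spread <= threshold else alphas[1]  # regime choice
        k1 += drift[0] + alpha * spread + rng.gauss(0, 0.5)
        k2 += drift[1] + rng.gauss(0, 0.5)
        path.append((k1, k2))
    return path
```

In the paper the regimes and the threshold are estimated from the fitted mortality indexes rather than fixed a priori.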

Risks doi: 10.3390/risks7010013

Authors: Mauricio Junca Harold A. Moreno-Franco José Luis Pérez

We consider the optimal bail-out dividend problem with fixed transaction cost for a Lévy risk model with a constraint on the expected present value of injected capital. To solve this problem, we first consider the optimal bail-out dividend problem with transaction cost and capital injection and show the optimality of reflected (c1, c2)-policies. We then find the optimal Lagrange multiplier by showing that, in the dual Lagrangian problem, the complementary slackness conditions are met. Finally, we present some numerical examples to support our results.

Risks doi: 10.3390/risks7010012

Authors: Ranjit Singh Jayashree Bhattacharjee

Risk perception is an idiosyncratic process of interpretation. It is a highly personal process of making a decision based on an individual's frame of reference, which has evolved over time. The purpose of this paper is to find out the risk perception level of equity investors and to identify the factors influencing it. The study was conducted using a stratified random sample of 358 investors. It was found that the overall risk perception level of equity investors is moderate and that the main factors affecting it are information screening, investment education, fear psychosis, fundamental expertise, technical expertise, familiarity bias, information asymmetry, understanding of the market, etc. Considering these findings, efforts should be made to move people with a high risk perception to the low risk perception category by training them to handle or manage high-risk scenarios, which will help in promoting an equity-investment culture.

Risks doi: 10.3390/risks7010011

Authors: Jackie Li Atsuyuki Kogure Jia Liu

In this paper, we suggest a Bayesian multivariate approach for pricing a reverse mortgage, allowing for house price risk, interest rate risk and longevity risk. We adopt the principle of maximum entropy in risk-neutralisation of these three risk components simultaneously. Our numerical results based on Australian data suggest that a reverse mortgage would be financially sustainable under the current financial environment and the model settings and assumptions.

Risks doi: 10.3390/risks7010010

Authors: Ravi Summinga-Sonagadu Jason Narsoo

In this paper, we employ the 99% intraday value-at-risk (VaR) and intraday expected shortfall (ES) as risk metrics to assess the competency of Multiplicative Component Generalised Autoregressive Heteroskedasticity (MC-GARCH) models based on 1-min EUR/USD exchange rate returns. Five distributional assumptions for the innovation process are used to analyse their effects on modelling and forecasting performance. The high-frequency volatility models were validated in terms of in-sample fit based on various statistical and graphical tests. A more rigorous validation procedure involves testing the predictive power of the models. Therefore, three backtesting procedures were used for the VaR, namely, Kupiec's test, a duration-based backtest, and an asymmetric VaR loss function. Similarly, three backtests were employed for the ES: a regression-based backtesting procedure, the exceedance residual backtest and the V-tests. The validation results show that non-normal distributions are best suited for both model fitting and forecasting. The MC-GARCH(1,1) model under the Generalised Error Distribution (GED) innovation assumption gave the best fit to the intraday data and the best ES forecasts, whereas the skewed Student's-t distribution for the innovation process provided the best VaR forecasts. This paper presents the results of the first empirical study (to the best of the authors' knowledge) in: (1) forecasting the intraday expected shortfall under different distributional assumptions for the MC-GARCH model; (2) assessing the MC-GARCH model under the Generalised Error Distribution (GED) innovation; (3) evaluating and ranking the VaR predictability of the MC-GARCH models using an asymmetric loss function.
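
Kupiec's test, the first of the three VaR backtests, is a one-line likelihood ratio; a minimal pure-Python sketch (the standard proportion-of-failures formula, with the 5% critical value hard-coded):

```python
import math

CHI2_1DF_95 = 3.841  # 95% quantile of chi-square with 1 degree of freedom

def kupiec_pof(n_obs, n_exceptions, p=0.01):
    """Kupiec's proportion-of-failures likelihood-ratio statistic for
    backtesting a VaR with coverage level p (0.01 for a 99% VaR).
    Returns (LR statistic, reject flag at the 5% level)."""
    x, n = n_exceptions, n_obs
    pi = x / n                        # observed exception rate

    def loglik(q):
        # Bernoulli log-likelihood of x exceptions in n observations
        return (n - x) * math.log(1 - q) + (x * math.log(q) if x > 0 else 0.0)

    lr = -2.0 * (loglik(p) - loglik(pi))
    return lr, lr > CHI2_1DF_95
```

For a correctly specified 99% VaR over 1000 returns one expects about 10 exceptions; materially more (or fewer) drives the statistic above the chi-square critical value.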

]]>Risks doi: 10.3390/risks7010009

Authors: Fangyuan Dong Nick Halen Kristen Moore Qinglai Zeng

Life Insurance Retirement Plans (LIRPs) offer tax-deferred cash value accumulation, tax-free withdrawals (if properly structured), and a tax-free death benefit to beneficiaries. Thus, LIRPs share many of the tax advantages of other retirement savings vehicles but with less restrictive limitations on income and contributions. Opinions are mixed about the effectiveness of LIRPs; some financial advisers recommend them enthusiastically, while others are more skeptical. In this paper, we examine the potential of LIRPs to meet both income and bequest needs in retirement. We contrast retirement portfolios that include a LIRP with those that include only investment products with no life insurance. We consider different issue ages, face amounts, and withdrawal patterns. We simulate market scenarios and demonstrate that portfolios that include a LIRP yield higher legacy potential and lower income risk than those that exclude it. Thus, we conclude that the inclusion of a LIRP can improve financial outcomes in retirement.

]]>Risks doi: 10.3390/risks7010008

Authors: Maria Elena De Giuli Alessandro Greppi Marina Resta

We use Object Oriented Bayesian Networks (OOBNs) to analyze complex ties in the equity market and to detect drivers for the Standard &amp; Poor's 500 (S&amp;P 500) index. To this aim, we consider a vast number of indicators drawn from various investment areas (Value, Growth, Sentiment, Momentum, and Technical Analysis), and, with the aid of OOBNs, we study the role they played over time in influencing the dynamics of the S&amp;P 500. Our results highlight that the centrality of the indicators varies over time, and offer a starting point for further inquiries devoted to combining OOBNs with trading platforms.

]]>Risks doi: 10.3390/risks7010007

Authors: Delia Coculescu Freddy Delbaen

We use the theory of coherent risk measures to look at the problem of surplus sharing in an insurance business. An insured's surplus share is determined by the surplus premium in the contract. The theory of coherent risk measures and the resulting capital allocation give a way to divide the surplus between the insured and the capital providers, i.e., the shareholders.
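One standard capital allocation that arises from a coherent risk measure is the Euler allocation under expected shortfall: each line's contribution is its average loss in the scenarios where the total loss exceeds its quantile, and the contributions sum to the portfolio's expected shortfall. A sketch with two hypothetical exponential lines of business (the paper's setting and measure may differ):

```python
import random

def es_euler_allocation(losses_by_line, alpha=0.95):
    """Euler capital allocation under expected shortfall: each line's
    contribution is its mean loss in the scenarios where the total
    portfolio loss exceeds its alpha-quantile; the contributions sum
    to the portfolio ES (the full-allocation property)."""
    totals = [sum(scenario) for scenario in zip(*losses_by_line)]
    var = sorted(totals)[int(alpha * len(totals))]
    tail = [j for j, t in enumerate(totals) if t > var]
    return [sum(line[j] for j in tail) / len(tail) for line in losses_by_line]

random.seed(0)
n = 10_000
# two hypothetical lines of business with exponential losses
line_a = [random.expovariate(1.0) for _ in range(n)]
line_b = [random.expovariate(0.5) for _ in range(n)]
alloc = es_euler_allocation([line_a, line_b], alpha=0.95)
```

The heavier-tailed line receives the larger capital charge, which is the property that makes such allocations usable for dividing surplus between policyholders and shareholders.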

]]>Risks doi: 10.3390/risks7010006

Authors: Guangyuan Gao Mario V. Wüthrich

The aim of this project is to analyze high-frequency GPS location data (one observation per second) of individual car drivers and trips. We extract feature information about speeds, acceleration, deceleration, and changes of direction from this high-frequency GPS location data. Time series of this feature information allow us to appropriately allocate individual car driving trips to selected drivers using convolutional neural networks.
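The feature extraction step described above amounts to differencing the per-second positions. A minimal sketch (planar coordinates in metres are an assumption; GPS latitude/longitude would first need projecting):

```python
import math

def trip_features(xs, ys):
    """From per-second planar positions (metres), derive the feature
    series a trip is summarised by: speed, acceleration, and change
    of heading between consecutive seconds."""
    vx = [b - a for a, b in zip(xs, xs[1:])]
    vy = [b - a for a, b in zip(ys, ys[1:])]
    speed = [math.hypot(dx, dy) for dx, dy in zip(vx, vy)]  # m/s
    accel = [b - a for a, b in zip(speed, speed[1:])]       # m/s^2
    heading = [math.atan2(dy, dx) for dx, dy in zip(vx, vy)]
    turn = [(b - a + math.pi) % (2 * math.pi) - math.pi     # wrap to (-pi, pi]
            for a, b in zip(heading, heading[1:])]
    return speed, accel, turn

# hypothetical 5-second straight-line trip at a constant 10 m/s
xs = [0.0, 10.0, 20.0, 30.0, 40.0]
ys = [0.0, 0.0, 0.0, 0.0, 0.0]
speed, accel, turn = trip_features(xs, ys)
```

Stacked as channels, such series form the image-like input on which a convolutional network can learn driver-specific signatures.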

]]>Risks doi: 10.3390/risks7010005

Authors: Carmine De Franco Johann Nicolle Huyên Pham

One of the main challenges investors have to face is model uncertainty. Typically, the dynamics of the assets are modeled using two parameters: the drift vector and the covariance matrix, which are both uncertain. Since the variance/covariance parameter is assumed to be estimated with a certain level of confidence, we focus on drift uncertainty in this paper. Building on filtering techniques and learning methods, we use a Bayesian learning approach to solve the Markowitz problem and provide a simple and practical procedure to implement the optimal strategy. To illustrate the value added of using the optimal Bayesian learning strategy, we compare it with an optimal nonlearning strategy that keeps the drift constant at all times. In order to emphasize the advantage of the Bayesian learning strategy over the nonlearning one in different situations, we experiment with three different investment universes: indices of various asset classes, currencies, and smart beta strategies.
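The Bayesian learning of an uncertain drift can be illustrated in its simplest conjugate form. This is a sketch assuming a single asset, a known return variance, and a Gaussian prior; the paper's multi-asset filtering setup is richer:

```python
def update_drift(prior_mean, prior_var, returns, obs_var):
    """Conjugate Gaussian learning of an uncertain drift: with the
    return variance obs_var treated as known, each observed return
    tightens the posterior over the constant drift."""
    mean, var = prior_mean, prior_var
    for r in returns:
        precision = 1.0 / var + 1.0 / obs_var
        mean = (mean / var + r / obs_var) / precision
        var = 1.0 / precision
    return mean, var

# diffuse prior on the drift, then 100 hypothetical observed returns of 0.1
post_mean, post_var = update_drift(0.0, 1.0, [0.1] * 100, 0.04)
```

Feeding the current posterior mean into the Markowitz weights at each date is what distinguishes the learning strategy from the nonlearning one, which keeps the initial drift estimate fixed.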

]]>Risks doi: 10.3390/risks7010004

Authors: Risks Editorial Office

Rigorous peer review is the cornerstone of high-quality academic publishing. [...]

]]>Risks doi: 10.3390/risks7010003

Authors: Paolo Giudici Laura Parisi

We propose a statistical measure, based on correlation networks, to evaluate the systemic risk that could arise from the resolution of a failing or likely-to-fail financial institution under three alternative scenarios: liquidation, private recapitalization, or bail-in. The measure enhances the observed CDS spreads with a risk premium that derives from contagion effects across financial institutions. The empirical findings reveal that the recapitalization of a distressed bank performed by the other banks in the system and the bail-in resolution minimize the potential losses for the banking sector relative to the liquidation scenario, thus posing limited systemic risks. A closer comparison between the private recapitalization and the bail-in tool shows that the latter slightly reduces contagion effects with respect to the private intervention scenario.

]]>Risks doi: 10.3390/risks7010002

Authors: Man Chung Fung Katja Ignatieva Michael Sherris

This paper assesses the hedge effectiveness of an index-based longevity swap and a longevity cap for a life annuity portfolio. Although longevity swaps are a natural instrument for hedging longevity risk, derivatives with nonlinear payoffs, such as longevity caps, provide more effective downside protection. A tractable stochastic mortality model with age-dependent drift and volatility is developed, and analytical formulae for prices of longevity derivatives are derived. The model is calibrated using Australian mortality data. The hedging of the life annuity portfolio is comprehensively assessed for a range of assumptions for the longevity risk premium, the term to maturity of the hedging instruments, as well as the size of the underlying annuity portfolio. The results compare the risk management benefits and costs of longevity derivatives with linear and nonlinear payoff structures.
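The linear versus nonlinear distinction the abstract draws can be made concrete at the payoff level. A sketch with a hypothetical survival index and strike (the paper's contracts reference model-based survival rates and carry a risk premium):

```python
def swap_payoff(survival, strike):
    """Index-based longevity swap leg: a linear exchange of the
    realised survival rate against a pre-agreed fixed rate."""
    return survival - strike

def cap_payoff(survival, strike):
    """Longevity cap(let): pays only when realised survival exceeds
    the strike, giving pure downside (longevity) protection."""
    return max(survival - strike, 0.0)

# hypothetical realised survival rates around a strike of 0.90
scenarios = [0.85, 0.90, 0.95]
swap = [swap_payoff(s, 0.90) for s in scenarios]
cap = [cap_payoff(s, 0.90) for s in scenarios]
```

The swap hedger pays out when survival comes in below the strike, whereas the cap holder's loss is limited to the premium, which is why the cap is the more effective instrument against adverse longevity outcomes.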

]]>Risks doi: 10.3390/risks7010001

Authors: Daniel Doyle Chris Groendyke

This paper explores the use of neural networks to reduce the computational cost of pricing and hedging variable annuity guarantees. Pricing these guarantees can take a considerable amount of time because of the large number of Monte Carlo simulations that are required for the fair value of these liabilities to converge. This computational requirement worsens when Greeks must be calculated to hedge the liabilities of these guarantees. A feedforward neural network is a universal function approximator that is proposed as a useful machine learning technique to interpolate between previously calculated values and avoid running a full simulation to obtain a value for the liabilities. We propose methodologies utilizing neural networks for both the tasks of pricing and hedging four different varieties of variable annuity guarantees. We demonstrate a significant efficiency gain using neural networks in this manner. We also experiment with different error functions in the training of the neural networks and examine the resulting changes in network performance.
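The interpolation idea can be sketched with a one-hidden-layer network trained on a handful of precomputed values and then queried cheaply in between. The network size, learning rate, and the quadratic "price curve" below are illustrative assumptions, not the paper's architecture or guarantees:

```python
import math, random

def train_mlp(data, hidden=8, lr=0.05, epochs=4000, seed=0):
    """Fit a one-hidden-layer tanh network to (input, price) pairs by
    per-sample gradient descent, standing in for interpolation between
    precomputed Monte Carlo valuations."""
    rng = random.Random(seed)
    w1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b1 = [rng.uniform(-1, 1) for _ in range(hidden)]
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0

    def forward(x):
        h = [math.tanh(w1[j] * x + b1[j]) for j in range(hidden)]
        return h, b2 + sum(w2[j] * h[j] for j in range(hidden))

    for _ in range(epochs):
        for x, y in data:
            h, out = forward(x)
            err = out - y  # gradient of squared error w.r.t. the output
            b2 -= lr * err
            for j in range(hidden):
                grad_pre = err * w2[j] * (1.0 - h[j] ** 2)  # backprop through tanh
                w2[j] -= lr * err * h[j]
                w1[j] -= lr * grad_pre * x
                b1[j] -= lr * grad_pre

    return lambda x: forward(x)[1]

# hypothetical "price curve": values precomputed at five grid points
data = [(x, x * x) for x in (0.0, 0.25, 0.5, 0.75, 1.0)]
price = train_mlp(data)
```

Once trained, evaluating `price` at a new point costs a few arithmetic operations instead of a full Monte Carlo run, which is the source of the efficiency gain the paper reports.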

]]>Risks doi: 10.3390/risks6040145

Authors: Ashu Tiwari Archana Patro

Policymakers in developing and emerging countries face higher natural-disaster risk than those in developed countries because of a persistent supply-side bottleneck in disaster insurance. Additionally, lower insurance consumption, higher disaster risk, and high income elasticity of insurance demand have worsened the loss consequences of natural disasters in these markets. In this context, the current study argues, for the first time, that the supply-side bottleneck has its origin in a peculiar pattern of disaster insurance consumption owing to memory cues. The study finds that a relatively higher frequency of natural disasters acts as a negative memory cue and positively impacts insurance consumption. On the other hand, a relatively lower frequency of natural disasters adversely impacts insurance consumption against the background of variation in risk-aversion behavior. For this purpose, the study builds on Mullainathan (2002), which constructs its argument around memory cues.

]]>Risks doi: 10.3390/risks6040144

Authors: Yikai (Maxwell) Gong Zhuangdi Li Maria Milazzo Kristen Moore Matthew Provencher

Credibility theory is used widely in group health and casualty insurance. However, it is generally not used in individual life and annuity business. With the introduction of principle-based reserving (PBR), which relies more heavily on company-specific experience, credibility theory is becoming increasingly important for life actuaries. In this paper, we review the two most commonly used credibility methods: limited fluctuation and greatest accuracy (Bühlmann) credibility. We apply the limited fluctuation method to M Financial Group's experience data and describe some general qualitative observations. In addition, we use simulation to generate a universe of data and compute limited fluctuation and greatest accuracy credibility factors for actual-to-expected (A/E) mortality ratios. We also compare the two credibility factors to an intuitive benchmark credibility measure. We see that for our simulated data set, the limited fluctuation factors are significantly lower than the greatest accuracy factors, particularly for low numbers of claims. Thus, the limited fluctuation method may understate the credibility for companies with favorable mortality experience. The greatest accuracy method has a stronger mathematical foundation, but it generally cannot be applied in practice because of data constraints. The National Association of Insurance Commissioners (NAIC) recognizes and is addressing the need for life insurance experience data in support of PBR; this is an area of current work.
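The limited fluctuation method compared above follows the classical square-root rule: full credibility is assigned once the claim count reaches a threshold set by a probability and tolerance pair, and partial credibility grows with the square root of the count. A sketch with the common 90%/5% standard and illustrative A/E ratios (not the M Financial figures):

```python
import math

def limited_fluctuation_z(claims, prob_z=1.645, k=0.05):
    """Limited fluctuation (classical) credibility factor for a claim
    count under the square-root rule: full credibility is reached at
    (z/k)^2 claims, about 1,082 for 90% probability within 5%."""
    full_credibility = (prob_z / k) ** 2
    return min(1.0, math.sqrt(claims / full_credibility))

# credibility-weighted actual-to-expected mortality ratio:
# Z * company A/E + (1 - Z) * industry A/E
z = limited_fluctuation_z(300)
blended = z * 0.85 + (1 - z) * 1.00
```

With only 300 claims the company's favorable 0.85 A/E ratio receives roughly half weight, which is the mechanism behind the paper's observation that limited fluctuation can understate credibility relative to the greatest accuracy factors.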

]]>