
Table of Contents

Risks, Volume 7, Issue 1 (March 2019)

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the table of contents of newly released issues.
  • PDF is the official format for papers, which are published in both HTML and PDF form. To view a paper in PDF format, click the "PDF Full-text" link and open the file with the free Adobe Reader.
Cover Story: A genetic algorithm (GA) simulates the evolution process. Starting with the initial population, it [...]
Displaying articles 1-34
Open Access Article: Optimal Portfolio Selection in an Itô–Markov Additive Market
Received: 8 December 2018 / Revised: 12 March 2019 / Accepted: 18 March 2019 / Published: 25 March 2019
Viewed by 212 | PDF Full-text (469 KB) | HTML Full-text | XML Full-text
Abstract
We study a portfolio selection problem in a continuous-time Itô–Markov additive market with prices of financial assets described by Markov additive processes that combine Lévy processes and regime switching models. Thus, the model takes into account two sources of risk: the jump diffusion risk and the regime switching risk. For this reason, the market is incomplete. We complete the market by enlarging it with the use of a set of Markovian jump securities, Markovian power-jump securities and impulse regime switching securities. Moreover, we give conditions under which the market is asymptotic-arbitrage-free. We solve the portfolio selection problem in the Itô–Markov additive market for the power utility and the logarithmic utility. Full article
(This article belongs to the Special Issue Applications of Stochastic Optimal Control to Economics and Finance)
Open Access Article: A Deep Learning Integrated Lee–Carter Model
Received: 31 December 2018 / Revised: 2 March 2019 / Accepted: 13 March 2019 / Published: 16 March 2019
Viewed by 654 | PDF Full-text (5864 KB) | HTML Full-text | XML Full-text
Abstract
In the field of mortality, the Lee–Carter based approach can be considered the milestone for forecasting mortality rates among stochastic models. We could define a “Lee–Carter model family” that embraces all developments of this model, including its first formulation (1992), which remains the benchmark for comparing the performance of newer models. In the Lee–Carter model, the κ_t parameter, describing the mortality trend over time, plays an important role in determining future mortality behavior. The ARIMA process traditionally used to model κ_t shows evident limitations in describing the future shape of mortality. In the forecasting phase, a more plausible approach is needed to capture the nonlinear shape of projected mortality rates. Therefore, we propose an alternative to ARIMA processes based on a deep learning technique. More precisely, in order to capture the pattern of the κ_t series over time more accurately, we apply a Recurrent Neural Network with a Long Short-Term Memory architecture and integrate it with the Lee–Carter model to improve its predictive capacity. The proposed approach provides significant performance in terms of predictive accuracy and also avoids the a priori selection of time chunks. Indeed, it is common practice among academics to delete the periods in which the noise is overwhelming or the data quality is insufficient. The strength of the Long Short-Term Memory network lies in its ability to treat this noise and adequately reproduce it in the forecasted trend, thanks to an architecture that takes significant long-term patterns into account. Full article
(This article belongs to the Special Issue New Perspectives in Actuarial Risk Management)
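The paper's network is, of course, a trained multi-unit LSTM; as a pure-Python illustration (with arbitrary, untrained weights, not the authors' model), a single scalar LSTM cell shows the gating mechanism that lets the architecture retain long-term patterns in a κ_t-like series:

```python
import math

def lstm_step(x, h, c, W):
    # One LSTM cell step: gates decide what to forget, store, and emit.
    # W maps each gate name to (input weight, recurrent weight, bias).
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))
    f = sig(W["f"][0] * x + W["f"][1] * h + W["f"][2])        # forget gate
    i = sig(W["i"][0] * x + W["i"][1] * h + W["i"][2])        # input gate
    o = sig(W["o"][0] * x + W["o"][1] * h + W["o"][2])        # output gate
    g = math.tanh(W["g"][0] * x + W["g"][1] * h + W["g"][2])  # candidate
    c = f * c + i * g      # cell state carries long-term memory
    h = o * math.tanh(c)   # hidden state is the step's output
    return h, c

# Toy weights; a real model learns these from the fitted kappa_t series.
W = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "o", "g")}
kappa = [-1.0, -1.2, -1.5, -1.9]   # illustrative mortality-trend values
h, c = 0.0, 0.0
for x in kappa:
    h, c = lstm_step(x, h, c, W)
print(h)  # one-step-ahead signal distilled from the whole history
```

The forget gate f is what distinguishes this from a plain recurrent unit: it lets the cell state persist across many steps, which is the mechanism the abstract credits for handling noisy stretches of the κ_t series.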
Open Access Feature Paper Article: A Genetic Algorithm for Investment–Consumption Optimization with Value-at-Risk Constraint and Information-Processing Cost
Received: 15 February 2019 / Revised: 8 March 2019 / Accepted: 9 March 2019 / Published: 11 March 2019
Viewed by 330 | PDF Full-text (474 KB) | HTML Full-text | XML Full-text
Abstract
This paper studies the optimal investment and consumption strategies in a two-asset model. A dynamic Value-at-Risk constraint is imposed to manage the wealth process. By using Value at Risk as the risk measure over the investment horizon, the decision maker can dynamically monitor the exposed risk and quantify the maximum expected loss over a finite horizon at a given confidence level. In addition, the decision maker has to filter the key economic factors to make decisions. Considering the cost of filtering the factors, the decision maker aims to maximize the utility of consumption over a finite horizon. By using the Kalman filter, the partially observed system is converted to a completely observed one. However, due to the cost of information processing, the decision maker cannot process the information in a fully rational manner and can only make decisions on the basis of the limited observed signals. A genetic algorithm is developed to find the optimal investment and consumption strategies and the observation strength. Numerical simulation results are provided to illustrate the performance of the algorithm. Full article
(This article belongs to the Special Issue Loss Models: From Theory to Applications)
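The paper's GA searches over investment, consumption, and observation strength; the following deliberately simplified single-variable sketch, with made-up return and volatility numbers and a crude penalty standing in for the dynamic VaR constraint, illustrates the selection/crossover/mutation loop itself:

```python
import random

random.seed(0)

def fitness(w):
    # Toy objective: mean-variance utility of the risky share w in [0, 1],
    # penalized when a Gaussian 95% VaR proxy exceeds a cap (all numbers invented).
    mu, sigma = 0.02 + 0.06 * w, 0.20 * w
    var_cap = 0.10
    penalty = max(0.0, 1.645 * sigma - mu - var_cap) * 10.0
    return mu - 0.5 * sigma ** 2 - penalty

pop = [random.random() for _ in range(30)]          # initial population
for _ in range(60):                                 # evolve
    pop.sort(key=fitness, reverse=True)
    parents = pop[:10]                              # elitist selection
    children = []
    while len(children) < 20:
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b) + random.gauss(0, 0.05)   # crossover + mutation
        children.append(min(1.0, max(0.0, child)))
    pop = parents + children
best = max(pop, key=fitness)
print(best)
```

Because the top candidates are carried over unchanged each generation, the best fitness is non-decreasing, and the population concentrates where the VaR penalty just stops binding.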
Open Access Article: On Double Value at Risk
Received: 10 February 2019 / Revised: 2 March 2019 / Accepted: 5 March 2019 / Published: 8 March 2019
Viewed by 276 | PDF Full-text (1213 KB) | HTML Full-text | XML Full-text
Abstract
Value at Risk (VaR) is used to quantify the maximum potential loss at a given confidence level, but it is a single risk indicator that ignores any information about income. The present paper generalizes one-dimensional VaR to two-dimensional VaR with paired income–risk indicators. We first construct a double VaR with (μ, σ²) (or (μ, VaR²)) indicators, and deduce the joint confidence region of (μ, σ²) (or (μ, VaR²)) via the two-dimensional likelihood ratio method. Finally, an example covering the empirical analysis of the two double-VaR models is presented. Full article
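Under a Gaussian model the one-dimensional VaR above is already a function of both indicators, VaR_α = z_α σ − μ, which illustrates why collapsing (μ, σ²) into a single number discards income information. A minimal sketch (the portfolio figures are invented):

```python
from statistics import NormalDist

def gaussian_var(mu, sigma, alpha=0.95):
    # VaR at level alpha for Gaussian P&L: the loss threshold exceeded
    # with probability 1 - alpha (losses reported as positive numbers).
    z = NormalDist().inv_cdf(alpha)
    return z * sigma - mu

# Two hypothetical portfolios with identical risk but different income:
print(gaussian_var(0.00, 0.10))  # no expected return
print(gaussian_var(0.05, 0.10))  # same sigma, higher mean -> lower VaR
```

Both portfolios have the same σ, yet their VaRs differ only through μ; reporting the pair (μ, σ²) keeps the two effects separate, which is the point of the double indicator.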
Open Access Article: Model-Free Stochastic Collocation for an Arbitrage-Free Implied Volatility, Part II
Received: 22 January 2019 / Revised: 18 February 2019 / Accepted: 20 February 2019 / Published: 6 March 2019
Viewed by 221 | PDF Full-text (480 KB) | HTML Full-text | XML Full-text
Abstract
This paper explores the stochastic collocation technique, applied to a monotonic spline, as an arbitrage-free and model-free interpolation of implied volatilities. We explore various spline formulations, including B-spline representations. We explain how to calibrate the different representations against market option prices, how to smooth out the market quotes, and how to choose a proper initial guess. The technique is then applied to concrete market options, and the stability of the different approaches is analyzed. Finally, we consider a challenging example where convex spline interpolations lead to oscillations in the implied volatility, and compare the spline collocation results with those obtained through the arbitrage-free interpolation technique of Andreasen and Huge. Full article
Open Access Article: Machine Learning in Banking Risk Management: A Literature Review
Received: 25 January 2019 / Revised: 23 February 2019 / Accepted: 27 February 2019 / Published: 5 March 2019
Viewed by 566 | PDF Full-text (860 KB) | HTML Full-text | XML Full-text
Abstract
There is an increasing influence of machine learning in business applications, with many solutions already implemented and many more being explored. Since the global financial crisis, risk management in banks has gained more prominence, and there has been a constant focus on how risks are detected, measured, reported and managed. Considerable research in academia and industry has focused on developments in banking and risk management and on current and emerging challenges. This paper, through a review of the available literature, seeks to analyse and evaluate machine-learning techniques that have been researched in the context of banking risk management, and to identify areas or problems in risk management that have been inadequately explored and are potential areas for further research. The review shows that the application of machine learning to the management of banking risks such as credit risk, market risk, operational risk and liquidity risk has been explored; however, this work does not appear commensurate with the current industry focus on both risk management and machine learning. Many areas of bank risk management remain that could significantly benefit from studying how machine learning can be applied to specific problems. Full article
Open Access Article: CEO Overconfidence and Shadow-Banking Life Insurer Performance Under Government Purchases of Distressed Assets
Received: 30 January 2019 / Revised: 27 February 2019 / Accepted: 27 February 2019 / Published: 5 March 2019
Viewed by 222 | PDF Full-text (1802 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we develop a contingent claim model to evaluate the equity, default risk, and efficiency gain/loss from managerial overconfidence of a shadow-banking life insurer under government purchases of distressed assets. Our paper focuses on managerial overconfidence where the chief executive officer (CEO) overestimates the returns on investment. The investment market faced by the life insurer is imperfectly competitive, and investment is core to the provision of profit-sharing life insurance policies. We show that CEO overconfidence raises the default risk in the life insurer's equity returns, thereby adversely affecting financial stability. Either shadow-banking involvement or a government bailout attenuates this unfavorable effect. There is an efficiency gain from CEO overconfidence in investment. A government bailout helps to reduce the life insurer's default risk, but simultaneously reduces the efficiency gain from CEO overconfidence. Our results contribute to the managerial overconfidence literature by linking insurer shadow-banking involvement and government bailouts, in particular during a financial crisis. Full article
(This article belongs to the Special Issue Financial Risks and Regulation)
Open Access Article: Credible Regression Approaches to Forecast Mortality for Populations with Limited Data
Received: 3 December 2018 / Revised: 14 February 2019 / Accepted: 21 February 2019 / Published: 26 February 2019
Viewed by 263 | PDF Full-text (1350 KB) | HTML Full-text | XML Full-text
Abstract
In this paper, we propose a credible regression approach with random coefficients to model and forecast the mortality dynamics of a given population with limited data. Age-specific mortality rates are modelled and extrapolation methods are utilized to estimate future mortality rates. The results on Greek mortality data indicate that credibility regression contributed to more accurate forecasts than those produced from the Lee–Carter and Cairns–Blake–Dowd models. An application on pricing insurance-related products is also provided. Full article
(This article belongs to the Special Issue Recent Development in Actuarial Science and Related Fields)
Open Access Article: Application of Machine Learning to Mortality Modeling and Forecasting
Received: 30 November 2018 / Revised: 1 February 2019 / Accepted: 21 February 2019 / Published: 26 February 2019
Cited by 1 | Viewed by 539 | PDF Full-text (2219 KB) | HTML Full-text | XML Full-text
Abstract
Estimation of future mortality rates still plays a central role for life insurers in pricing their products and managing longevity risk. In the literature on mortality modeling, a large number of stochastic models have been proposed, most of them forecasting future mortality rates by extrapolating one or more latent factors. The abundance of proposed models shows that forecasting future mortality from historical trends is non-trivial. Following the idea proposed in Deprez et al. (2017), we use machine learning algorithms, able to catch patterns that are not commonly identifiable, to calibrate a parameter (the machine learning estimator), improving the goodness of fit of standard stochastic mortality models. The machine learning estimator is then forecasted within the Lee–Carter framework, allowing one to obtain a higher forecasting quality than the standard stochastic models. Out-of-sample forecasts are provided to verify the model's accuracy. Full article
(This article belongs to the Special Issue New Perspectives in Actuarial Risk Management)
Open Access Feature Paper Article: The OFR Financial Stress Index
Received: 28 January 2019 / Revised: 21 February 2019 / Accepted: 21 February 2019 / Published: 26 February 2019
Viewed by 219 | PDF Full-text (1689 KB) | HTML Full-text | XML Full-text
Abstract
We introduce a financial stress index developed by the Office of Financial Research (the OFR FSI) and detail its purpose, construction, interpretation, and use in financial market monitoring. The index employs a novel and flexible methodology using daily data from global financial markets. Analysis for the 2000–2018 time period is presented. Using a logistic regression framework, with dates of government intervention in the financial system as a proxy for stress events, we find that the OFR FSI performs well in identifying systemic financial stress. In addition, we find that the OFR FSI leads the Chicago Fed National Activity Index in a Granger causality analysis, suggesting that increases in financial stress help predict decreases in economic activity. Full article
(This article belongs to the Special Issue Financial Risks and Regulation)
Open Access Article: An Innovative Framework for Risk Management in Construction Projects in Developing Countries: Evidence from Pakistan
Received: 28 November 2018 / Revised: 13 February 2019 / Accepted: 19 February 2019 / Published: 25 February 2019
Viewed by 268 | PDF Full-text (760 KB) | HTML Full-text | XML Full-text
Abstract
Risk management is a comparatively new field, and there is no core system of risk management in the construction industries of developing countries. In Pakistan, construction is an extremely risk-prone industry that lacks a good reputation for handling risk. However, the industry is gradually giving risk management more importance as a result of increased competition and construction activity. For this purpose, a survey-based study was conducted to investigate the risk management practices used in construction projects in Pakistan. To achieve this objective, data were collected from 22 contractor firms working on 100 diverse projects. The analysis indicates that risk management is implemented at a low level in the local environment. The results also disclose a high degree of correlation between effective risk management and project success. The findings reveal the importance of risk management techniques, their usage and implications, and the effect of these techniques on the success of construction projects from the contractor's perspective, thus convincing key project participants of the value of risk management. Full article
(This article belongs to the Special Issue Young Researchers in Insurance and Risk Management)
Open Access Article: Determining Distribution for the Product of Random Variables by Using Copulas
Received: 14 January 2019 / Revised: 18 February 2019 / Accepted: 19 February 2019 / Published: 25 February 2019
Cited by 1 | Viewed by 291 | PDF Full-text (616 KB) | HTML Full-text | XML Full-text
Abstract
Determining the distributions of functions of random variables is one of the most important problems in statistics and applied mathematics, because distributions of functions have a wide range of applications in numerous areas in economics, finance, risk management, science, and others. However, most studies focus only on the distribution of independent variables, or on some common distributions such as multivariate normal joint distributions, for functions of dependent random variables. To bridge this gap in the literature, in this paper we first derive general formulas to determine both the density and the distribution of the product of two or more random variables via copulas, in order to capture the dependence structures among the variables. We then propose an approach combining a Monte Carlo algorithm, a graphical approach, and numerical analysis to efficiently estimate both the density and the distribution. We illustrate our approach by examining the shapes and behaviors of both the density and the distribution of the product of two log-normal random variables under several different copulas, including the Gaussian, Student-t, Clayton, Gumbel, Frank, and Joe copulas, and estimate some common measures, including Kendall's coefficient, mean, median, standard deviation, skewness, and kurtosis, for the distributions. We find that different types of copulas affect the behavior of the distributions differently. In addition, we also discuss the behaviors under all of the copulas above with the same Kendall's coefficient. Our results are the foundation of any further study that relies on the density and cumulative probability functions of the product of two or more random variables. Thus, the theory developed in this paper is useful for academics, practitioners, and policy makers. Full article
(This article belongs to the Special Issue Measuring and Modelling Financial Risk and Derivatives)
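For the simplest of the copulas studied, the Gaussian copula with log-normal marginals reduces to a bivariate log-normal pair, so the Monte Carlo estimate of the product's mean has a closed-form check. A stdlib-only sketch (the correlation value is illustrative, not from the paper):

```python
import math, random

random.seed(1)
rho = 0.5          # Gaussian-copula correlation parameter (illustrative)
n = 100_000
products = []
for _ in range(n):
    # Draw correlated standard normals (2x2 Cholesky done by hand)...
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho * rho) * random.gauss(0, 1)
    # ...and map them through log-normal marginals: X = exp(Z).
    x, y = math.exp(z1), math.exp(z2)
    products.append(x * y)
mean = sum(products) / n
# For jointly log-normal X, Y with standard normal logs:
# E[XY] = exp((2 + 2*rho) / 2) = exp(1 + rho), a closed-form check.
print(mean, math.exp(1 + rho))
```

For the Archimedean copulas in the paper (Clayton, Gumbel, Frank, Joe) no such closed form exists, which is exactly why the authors pair the Monte Carlo algorithm with numerical analysis.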
Open Access Article: Mortality Forecasting: How Far Back Should We Look in Time?
Received: 26 November 2018 / Revised: 2 February 2019 / Accepted: 19 February 2019 / Published: 22 February 2019
Viewed by 226 | PDF Full-text (1074 KB) | HTML Full-text | XML Full-text
Abstract
Extrapolative methods are one of the most commonly-adopted forecasting approaches in the literature on projecting future mortality rates. It can be argued that there are two types of mortality models using this approach. The first extracts patterns in age, time and cohort dimensions either in a deterministic fashion or a stochastic fashion. The second uses non-parametric smoothing techniques to model mortality and thus has no explicit constraints placed on the model. We argue that from a forecasting point of view, the main difference between the two types of models is whether they treat recent and historical information equally in the projection process. In this paper, we compare the forecasting performance of the two types of models using Great Britain male mortality data from 1950–2016. We also conduct a robustness test to see how sensitive the forecasts are to the changes in the length of historical data used to calibrate the models. The main conclusion from the study is that more recent information should be given more weight in the forecasting process as it has greater predictive power over historical information. Full article
Open Access Article: An Indexation Mechanism for Retirement Age: Analysis of the Gender Gap
Received: 29 November 2018 / Revised: 8 February 2019 / Accepted: 8 February 2019 / Published: 22 February 2019
Viewed by 264 | PDF Full-text (3615 KB) | HTML Full-text | XML Full-text
Abstract
The management of national social security systems is being challenged more and more by the rapid ageing of the population, especially in industrialized countries. In order to pursue pension system sustainability, several countries in Europe are setting up pension reforms linking the retirement age and/or benefits to life expectancy. In this context, the accurate modelling and projection of mortality rates and life expectancy play a central role and represent issues of great interest in the recent literature. Our study refers to the Italian mortality experience and considers an indexing mechanism based on expected residual life to adjust the retirement age and keep costs at an expected budgeted level, in the spirit of sharing longevity risk between social security systems and retirees. In order to combine the fitting and projection performance of the selected stochastic mortality models, a model-assembling technique is applied to face uncertainty in model selection, while accounting for estimation uncertainty as well. The resulting proposal is an averaged model that is suitable for discussing the gender gap in longevity risk and its alleged narrowing over time. Full article
(This article belongs to the Special Issue New Perspectives in Actuarial Risk Management)
Open Access Article: Market Risk and Financial Performance of Non-Financial Companies Listed on the Moroccan Stock Exchange
Received: 17 January 2019 / Revised: 17 February 2019 / Accepted: 18 February 2019 / Published: 21 February 2019
Viewed by 396 | PDF Full-text (534 KB) | HTML Full-text | XML Full-text
Abstract
This study examines the effect of market risk on the financial performance of 31 non-financial companies listed on the Casablanca Stock Exchange (CSE) over the period 2000–2016. We utilized three alternative variables to assess financial performance, namely the return on assets, the return on equity and the profit margin. We used the degree of financial leverage, the book-to-market ratio, and the gearing ratio as indicators of market risk. We then employed the pooled OLS model, the fixed effects model, the random effects model, and the difference-GMM and system-GMM models. The results show that the different measures of market risk have significant negative influences on the companies' financial performance. The elasticities are greater for the degree of financial leverage than for the book-to-market ratio and the gearing ratio. In most cases, the firm's age, the cash holdings ratio, the firm's size, the debt-to-assets ratio, and the tangibility ratio have positive effects on financial performance, whereas the debt-to-income ratio and the stock turnover hurt the performance of these non-financial companies. Therefore, decision-makers and managers should mitigate market risk through appropriate risk management strategies, such as derivatives and insurance techniques. Full article
Open Access Article: Modelling Recovery Rates for Non-Performing Loans
Received: 12 February 2019 / Accepted: 15 February 2019 / Published: 20 February 2019
Viewed by 261 | PDF Full-text (440 KB) | HTML Full-text | XML Full-text
Abstract
Based on a rich dataset of recoveries donated by a debt collection business, recovery rates for non-performing loans taken from a single European country are modelled using linear regression, linear regression with Lasso, beta regression, and inflated beta regression. We also propose a two-stage model: a beta mixture model combined with a logistic regression model. The proposed model allows us to capture the multimodal distribution we found for these recovery rates. All models were built using loan characteristics, default data and collections data prior to purchase by the debt collection business. The intended use of the models was to estimate future recovery rates for improved risk assessment, capital requirement calculations and bad debt management. They were compared using a range of quantitative performance measures under K-fold cross-validation. Among all the models, we found that the proposed two-stage beta mixture model performs best. Full article
(This article belongs to the Special Issue Advances in Credit Risk Modeling and Management)
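The mixture stage of the proposed model is built from beta components. As a toy illustration (with invented data, not the paper's dataset), a method-of-moments fit of a single beta distribution shows how recovery rates clustered near 0 and 1 yield a U-shaped beta (both shape parameters below 1), the kind of multimodality the mixture is designed to capture:

```python
# Method-of-moments fit of a beta distribution to recovery rates in (0, 1):
# alpha = m * c, beta = (1 - m) * c, where c = m(1 - m)/v - 1.
rates = [0.05, 0.10, 0.12, 0.70, 0.80, 0.85, 0.90]   # illustrative sample
n = len(rates)
m = sum(rates) / n                                   # sample mean
v = sum((r - m) ** 2 for r in rates) / n             # sample variance
common = m * (1 - m) / v - 1
alpha, beta = m * common, (1 - m) * common
print(alpha, beta)  # both < 1 here: a U-shaped (bimodal-looking) density
```

A single beta can only bend so far; a mixture of betas, gated by a logistic model as in the paper's two-stage proposal, lets each component own one mode.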
Open Access Article: The W,Z/ν,δ Paradigm for the First Passage of Strong Markov Processes without Positive Jumps
Received: 21 November 2018 / Revised: 31 January 2019 / Accepted: 13 February 2019 / Published: 19 February 2019
Viewed by 239 | PDF Full-text (493 KB) | HTML Full-text | XML Full-text
Abstract
As is well known, the benefit of restricting to Lévy processes without positive jumps is the "W, Z scale functions paradigm", by which knowledge of the scale functions W, Z extends immediately to other risk control problems. The same is largely true for strong Markov processes X_t, with the notable distinctions that (a) it is more convenient to use the differential exit functions ν, δ as a "basis", and that (b) it is not yet known how to compute ν, δ or W, Z beyond the Lévy, diffusion, and a few other cases. The unifying framework outlined in this paper suggests, however, via an example, that the spectrally negative Markov and Lévy cases are very similar (except for the level of work involved in computing the basic functions ν, δ). We illustrate the potential of the unified framework by introducing a new objective (33) for the optimization of dividends, inspired by the de Finetti problem of maximizing expected discounted cumulative dividends until ruin, where we replace ruin with an optimally chosen Azema-Yor/generalized draw-down/regret/trailing stopping time. This is defined as a hitting time of the "draw-down" process Y_t = sup_{0 ≤ s ≤ t} X_s − X_t obtained by reflecting X_t at its maximum. This new variational problem has been solved in a parallel paper. Full article
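The draw-down process in the abstract can be computed pathwise as the running maximum minus the current value; a minimal sketch on an illustrative discrete sample path (the numbers are invented):

```python
# Draw-down process Y_t = sup_{0 <= s <= t} X_s - X_t: reflect X at its maximum.
path = [0.0, 1.0, 0.5, 2.0, 1.25, 1.75]   # illustrative sample path of X_t
run_max = float("-inf")
drawdown = []
for x in path:
    run_max = max(run_max, x)             # running supremum sup_{s<=t} X_s
    drawdown.append(run_max - x)          # current draw-down Y_t >= 0
print(drawdown)  # prints [0.0, 0.0, 0.5, 0.0, 0.75, 0.25]
```

The Azema-Yor stopping time mentioned in the abstract is then just the first time this nonnegative process Y_t crosses a (possibly level-dependent) threshold.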
Open Access Article: Phase-Type Models in Life Insurance: Fitting and Valuation of Equity-Linked Benefits
Received: 11 December 2018 / Revised: 31 January 2019 / Accepted: 4 February 2019 / Published: 11 February 2019
Viewed by 327 | PDF Full-text (2051 KB) | HTML Full-text | XML Full-text
Abstract
Phase-type (PH) distributions are defined as the distributions of lifetimes of finite continuous-time Markov processes. Their traditional applications are in queueing, insurance risk, and reliability, but more recently also in finance and, to a lesser extent, in life and health insurance. The advantage is that PH distributions form a dense class and that problems having explicit solutions for exponential distributions typically become computationally tractable under PH assumptions. In the first part of this paper, the fitting of PH distributions to human lifetimes is considered. The class of generalized Coxian distributions is given special attention. As part of this, some new software is developed. In the second part, the pricing of life insurance products such as the guaranteed minimum death benefit and the high-water benefit is treated for the case where the lifetime distribution is approximated by a PH distribution and the underlying asset price process is described by a jump diffusion with PH jumps. The expressions are typically explicit in terms of matrix-exponentials involving two matrices closely related to the Wiener-Hopf factorization, for which a Lévy process version has recently been developed for a PH horizon. The computational power of the approach is illustrated via a number of numerical examples. Full article
Open AccessArticle Pricing Options and Computing Implied Volatilities using Neural Networks
Received: 8 January 2019 / Revised: 3 February 2019 / Accepted: 6 February 2019 / Published: 9 February 2019
Abstract
This paper proposes a data-driven approach, by means of an Artificial Neural Network (ANN), to value financial options and to calculate implied volatilities with the aim of accelerating the corresponding numerical methods. With ANNs being universal function approximators, this method trains an optimized ANN on a data set generated by a sophisticated financial model, and runs the trained ANN as an agent of the original solver in a fast and efficient way. We test this approach on three different types of solvers, including the analytic solution for the Black-Scholes equation, the COS method for the Heston stochastic volatility model and Brent’s iterative root-finding method for the calculation of implied volatilities. The numerical results show that the ANN solver can reduce the computing time significantly. Full article
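For the simplest of the three solvers, the data-generation step the abstract describes can be sketched as follows; the Black-Scholes formula itself is standard, but the sampling ranges are our own assumptions, not those used in the paper:

```python
from math import log, sqrt, exp
from statistics import NormalDist
import random

Phi = NormalDist().cdf  # standard normal CDF

def bs_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option."""
    d1 = (log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * Phi(d1) - K * exp(-r * T) * Phi(d2)

# Build a (parameters -> price) training set for an ANN surrogate.
# Sampling ranges below are illustrative, not the paper's.
random.seed(0)
data = []
for _ in range(1000):
    S = random.uniform(0.5, 1.5)      # moneyness S/K with K = 1
    T = random.uniform(0.1, 2.0)
    r = random.uniform(0.0, 0.05)
    sigma = random.uniform(0.1, 0.5)
    data.append(((S, T, r, sigma), bs_call(S, 1.0, T, r, sigma)))
```

Once trained on such pairs, the network replaces the (here analytic, in general expensive) solver with a single fast forward pass.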
Open AccessReview Can Sustainable Investment Yield Better Financial Returns: A Comparative Study of ESG Indices and MSCI Indices
Received: 24 December 2018 / Revised: 22 January 2019 / Accepted: 29 January 2019 / Published: 2 February 2019
Abstract
‘Sustainable investment’ includes a variety of asset classes selected while caring for environmental, social, and governance (ESG) causes. It is an investment strategy that seeks to combine social and/or environmental benefits with financial returns, thus linking investors’ social, ethical, ecological and economic concerns. Under certain conditions, these indices also help to attract foreign capital, seeking international participation in the local capital markets. The purpose of this paper is to study whether sustainable investment alternatives offer better financial returns than conventional indices from both developed and emerging markets. To maintain consistency, this paper comparatively analyzes the financial returns of the Thomson Reuters/S-Network global indices, namely the developed markets (excluding US) ESG index (TRESGDX), the emerging markets ESG index (TRESGEX), the US large-cap ESG index (TRESGUS) and the Europe ESG index (TRESGEU), and those of the conventional markets, namely the MSCI World index (MSCI W), MSCI All Country World Equity index (MSCI ACWI), MSCI USA index (MSCI USA), MSCI Europe Australasia Far East index (MSCI EAFE), MSCI Emerging Markets index (MSCI EM) and MSCI Europe index (MSCI EU). The study also focuses on the inter-linkages between these indices. Daily closing prices of all the benchmark indices are taken for the five-year period of January 2013–December 2017. Line charts and unit-root tests are applied to check the stationary nature of the series; Granger’s causality model and auto-regressive conditional heteroskedasticity (ARCH)-GARCH type modelling are performed to find the linkages between the markets under study, followed by Johansen’s cointegration test and the vector error correction model to test the volatility spillover between the sustainable indices and the conventional indices.
The study finds that the sustainable indices and the conventional indices are integrated and that there is a flow of information between the two investment avenues. The results indicate no significant difference in performance between the sustainable indices and the conventional indices, making the former a good substitute for the latter. Hence, financial/investment managers can obtain more insights for investment decisions, and the study further suggests that their portfolios should include both types of indices to diversify risk and hedge, and reap the benefits of both. Additionally, corporate executives can use these indices to benchmark their own performance against peers and to track news. Full article
Open AccessArticle Changes of Relation in Multi-Population Mortality Dependence: An Application of Threshold VECM
Received: 7 January 2019 / Revised: 26 January 2019 / Accepted: 28 January 2019 / Published: 1 February 2019
Abstract
Standardized longevity risk transfers often involve modeling mortality rates of multiple populations. Some researchers have found that mortality indexes of selected countries are cointegrated, meaning that a linear relationship exists between the indexes. The vector error correction model (VECM) was used to incorporate this relation, thereby forcing the mortality rates of multiple populations to revert to a long-run equilibrium. However, the long-run equilibrium may change over time. It is crucial to incorporate these changes so that mortality dependence is adequately modeled. In this paper, we develop a framework to examine the presence of equilibrium changes and to incorporate these changes into the mortality model. In particular, we focus on equilibrium changes caused by the threshold effect, the phenomenon that mortality indexes alternate between different VECMs depending on the value of a threshold variable. Our framework comprises two steps. In the first step, a statistical test is performed to examine the presence of a threshold effect in the VECM for multiple mortality indexes. In the second step, a threshold vector error correction model (TVECM) is fitted to the mortality indexes and model adequacy is evaluated. We illustrate this framework with the mortality data of the England and Wales (EW) and Canadian populations. We further apply the TVECM to forecast future mortalities and price an illustrative longevity bond with the multivariate Wang transform. Our numerical results show that the TVECM predicts much faster mortality improvement for EW and Canada than the single-regime VECM, and thus the incorporation of the threshold effect significantly increases the longevity bond price. Full article
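The (univariate) Wang transform underlying the pricing step distorts a probability u into g(u) = Φ(Φ⁻¹(u) + λ). A toy sketch of how it feeds into a longevity bond price (the flat discount rate and the sign convention are our simplifications, not the paper's multivariate setup):

```python
from statistics import NormalDist

N = NormalDist()

def wang_transform(u, lam):
    """Distorted probability g(u) = Phi(Phi^{-1}(u) + lambda).
    With lam > 0 this inflates survival probabilities, producing a
    longevity risk loading; the sign convention varies by author."""
    return N.cdf(N.inv_cdf(u) + lam)

def longevity_bond_price(survival_probs, lam, r):
    """Toy bond paying, at each year t+1, a coupon proportional to the
    distorted survival probability, discounted at a flat rate r."""
    return sum(wang_transform(p, lam) / (1 + r) ** (t + 1)
               for t, p in enumerate(survival_probs))

# Illustrative cohort survival probabilities, not data from the paper
print(longevity_bond_price([0.99, 0.97, 0.94], lam=0.25, r=0.03))
```

In the paper the survival probabilities themselves come from the TVECM forecasts, and the transform is applied in its multivariate form.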
Open AccessArticle Optimal Bail-Out Dividend Problem with Transaction Cost and Capital Injection Constraint
Received: 18 December 2018 / Revised: 28 January 2019 / Accepted: 29 January 2019 / Published: 31 January 2019
Abstract
We consider the optimal bail-out dividend problem with fixed transaction cost for a Lévy risk model with a constraint on the expected present value of injected capital. To solve this problem, we first consider the optimal bail-out dividend problem with transaction cost and capital injection and show the optimality of reflected (c1, c2)-policies. We then find the optimal Lagrange multiplier by showing that the complementary slackness conditions of the dual Lagrangian problem are met. Finally, we present some numerical examples to support our results. Full article
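A reflected (c1, c2)-policy pays a lump-sum dividend bringing the surplus from c2 down to c1 (incurring the fixed transaction cost) and injects capital to keep the surplus non-negative. A discretised toy simulation, with our own parameter names and an arbitrary increment path rather than a Lévy model:

```python
def simulate_c1_c2(increments, c1, c2, fixed_cost):
    """Run a reflected (c1, c2) bail-out dividend policy on a discretised
    surplus path: whenever the surplus reaches c2, pay it down to c1 as a
    lump dividend net of the fixed transaction cost; whenever it would go
    negative, inject capital to hold it at 0. Illustrative only."""
    x, dividends, injections = 0.0, 0.0, 0.0
    for dz in increments:
        x += dz
        if x < 0:                          # bail-out: inject to 0
            injections += -x
            x = 0.0
        elif x >= c2:                      # dividend: reflect down to c1
            dividends += (x - c1) - fixed_cost
            x = c1
    return dividends, injections

print(simulate_c1_c2([3.0, 3.0, -5.0, -2.0], c1=1.0, c2=4.0, fixed_cost=0.5))
```

The Lagrangian formulation in the paper then trades off the discounted dividends against the discounted injections accumulated this way.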
Open AccessArticle Measuring Equity Share Related Risk Perception of Investors in Economically Backward Regions
Received: 7 December 2018 / Revised: 14 January 2019 / Accepted: 14 January 2019 / Published: 30 January 2019
Abstract
Risk perception is an idiosyncratic process of interpretation. It is a highly personal process of making a decision based on an individual’s frame of reference that has evolved over time. The purpose of this paper is to find out the risk perception level of equity investors and to identify the factors influencing their risk perception. The study was conducted using a stratified random sampling design of 358 investors. It was found that the overall risk perception level of equity investors is moderate and that the main factors affecting their risk perception are information screening, investment education, fear psychosis, fundamental expertise, technical expertise, familiarity bias, information asymmetry, understanding of the market, etc. Considering the above findings, efforts should be made to bring people with a high risk perception to the low risk perception category by providing them with training to handle or manage high-risk scenarios which will help in promoting an equity-investment culture. Full article
Open AccessArticle Multivariate Risk-Neutral Pricing of Reverse Mortgages under the Bayesian Framework
Received: 16 December 2018 / Revised: 20 January 2019 / Accepted: 21 January 2019 / Published: 24 January 2019
Abstract
In this paper, we suggest a Bayesian multivariate approach for pricing a reverse mortgage, allowing for house price risk, interest rate risk and longevity risk. We adopt the principle of maximum entropy in risk-neutralisation of these three risk components simultaneously. Our numerical results based on Australian data suggest that a reverse mortgage would be financially sustainable under the current financial environment and the model settings and assumptions. Full article
Open AccessArticle Risk Model Validation: An Intraday VaR and ES Approach Using the Multiplicative Component GARCH
Received: 12 December 2018 / Revised: 18 January 2019 / Accepted: 19 January 2019 / Published: 23 January 2019
Abstract
In this paper, we employ 99% intraday value-at-risk (VaR) and intraday expected shortfall (ES) as risk metrics to assess the competency of the Multiplicative Component Generalised Autoregressive Heteroskedasticity (MC-GARCH) models based on the 1-min EUR/USD exchange rate returns. Five distributional assumptions for the innovation process are used to analyse their effects on the modelling and forecasting performance. The high-frequency volatility models were validated in terms of in-sample fit based on various statistical and graphical tests. A more rigorous validation procedure involves testing the predictive power of the models. Therefore, three backtesting procedures were used for the VaR, namely Kupiec's test, a duration-based backtest, and an asymmetric VaR loss function. Similarly, three backtests were employed for the ES: a regression-based backtesting procedure, the Exceedance Residual backtest and the V-Tests. The validation results show that non-normal distributions are best suited for both model fitting and forecasting. The MC-GARCH(1,1) model under the Generalised Error Distribution (GED) innovation assumption gave the best fit to the intraday data and the best results for the ES forecasts. However, the asymmetric skewed Student's t distribution for the innovation process provided the best results for the VaR forecasts. This paper presents the results of the first empirical study (to the best of the authors' knowledge) in: (1) forecasting the intraday Expected Shortfall (ES) under different distributional assumptions for the MC-GARCH model; (2) assessing the MC-GARCH model under the Generalised Error Distribution (GED) innovation; (3) evaluating and ranking the VaR predictability of the MC-GARCH models using an asymmetric loss function. Full article
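Kupiec's test mentioned above compares the observed exception frequency x/T with the nominal coverage p via the likelihood ratio LR = −2[ln((1−p)^{T−x} p^x) − ln((1−x/T)^{T−x} (x/T)^x)], asymptotically χ² with one degree of freedom. A minimal sketch:

```python
from math import log

def kupiec_pof(num_obs, num_exceptions, p):
    """Kupiec proportion-of-failures LR statistic for VaR coverage p
    (e.g. p = 0.01 for a 99% VaR). Asymptotically chi-square, 1 df."""
    T, x = num_obs, num_exceptions
    if x == 0:
        return -2 * T * log(1 - p)   # limiting case as x/T -> 0
    pi_hat = x / T
    log_l0 = (T - x) * log(1 - p) + x * log(p)             # null coverage
    log_l1 = (T - x) * log(1 - pi_hat) + x * log(pi_hat)   # observed coverage
    return -2 * (log_l0 - log_l1)

# 99% VaR over 1000 observations with 25 exceptions (expected: about 10).
stat = kupiec_pof(1000, 25, 0.01)
# Reject correct coverage at the 5% level if stat > 3.841
# (chi-square critical value with 1 degree of freedom).
print(stat)
```

The duration-based backtest and the asymmetric loss function in the paper go beyond this by also penalising clustering of exceptions and the magnitude of VaR violations, respectively.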
Open AccessArticle Efficient Retirement Portfolios: Using Life Insurance to Meet Income and Bequest Goals in Retirement
Received: 26 October 2018 / Revised: 21 December 2018 / Accepted: 4 January 2019 / Published: 18 January 2019
Abstract
Life Insurance Retirement Plans (LIRPs) offer tax-deferred cash value accumulation, tax-free withdrawals (if properly structured), and a tax-free death benefit to beneficiaries. Thus, LIRPs share many of the tax advantages of other retirement savings vehicles but with less restrictive limitations on income and contributions. Opinions are mixed about the effectiveness of LIRPs; some financial advisers recommend them enthusiastically, while others are more skeptical. In this paper, we examine the potential of LIRPs to meet both income and bequest needs in retirement. We contrast retirement portfolios that include a LIRP with those that include only investment products with no life insurance. We consider different issue ages, face amounts, and withdrawal patterns. We simulate market scenarios and we demonstrate that portfolios that include LIRPs yield higher legacy potential and smaller income risk than those that exclude it. Thus, we conclude that the inclusion of a LIRP can improve financial outcomes in retirement. Full article
(This article belongs to the Special Issue Young Researchers in Insurance and Risk Management)
Open AccessArticle An Object-Oriented Bayesian Framework for the Detection of Market Drivers
Received: 23 November 2018 / Revised: 3 January 2019 / Accepted: 4 January 2019 / Published: 14 January 2019
Abstract
We use Object Oriented Bayesian Networks (OOBNs) to analyze complex ties in the equity market and to detect drivers for the Standard & Poor’s 500 (S&P 500) index. To this aim, we consider a vast number of indicators drawn from various investment areas (Value, Growth, Sentiment, Momentum, and Technical Analysis) and, with the aid of OOBNs, we study the role they have played over time in influencing the dynamics of the S&P 500. Our results highlight that the centrality of the indicators varies over time, and they offer a starting point for further inquiries devoted to combining OOBNs with trading platforms. Full article
Open AccessArticle Surplus Sharing with Coherent Utility Functions
Received: 2 November 2018 / Revised: 20 December 2018 / Accepted: 27 December 2018 / Published: 10 January 2019
Abstract
We use the theory of coherent risk measures to look at the problem of surplus sharing in an insurance business. The surplus share of an insured is calculated from the surplus premium in the contract. The theory of coherent risk measures and the resulting capital allocation gives a way to divide the surplus between the insured and the capital providers, i.e., the shareholders. Full article
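The capital-allocation step behind such surplus sharing can be illustrated with the Euler allocation under Expected Shortfall on a finite scenario table: each line of business is charged its average loss over the scenarios in which the aggregate loss falls in the tail. A sketch under these (our own) simplifying assumptions, not the paper's utility-based construction:

```python
def es_allocation(scenarios, alpha=0.95):
    """Euler capital allocation under Expected Shortfall.
    scenarios: list of tuples, one per scenario, giving the loss of each
    line of business. Returns per-line capital; the allocations sum to
    the ES of the aggregate loss (full-allocation property)."""
    n = len(scenarios)
    k = max(1, int(n * (1 - alpha)))              # number of tail scenarios
    tail = sorted(scenarios, key=sum, reverse=True)[:k]  # worst by aggregate
    num_lines = len(tail[0])
    return [sum(s[i] for s in tail) / k for i in range(num_lines)]

# Four equally likely scenarios, two lines of business (made-up numbers)
print(es_allocation([(1, 1), (2, 0), (0, 5), (10, 10)], alpha=0.75))
```

The full-allocation property (allocations summing to the aggregate capital) is what lets a surplus, in turn, be divided consistently between policyholders and shareholders.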
Open AccessArticle Convolutional Neural Network Classification of Telematics Car Driving Data
Received: 29 October 2018 / Revised: 21 December 2018 / Accepted: 9 January 2019 / Published: 10 January 2019
Abstract
The aim of this project is to analyze high-frequency GPS location data (one observation per second) of individual car drivers and trips. We extract feature information about speeds, acceleration, deceleration, and changes of direction from this high-frequency GPS location data. Time series of this feature information allow us to appropriately allocate individual car driving trips to selected drivers using convolutional neural networks. Full article
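The feature-extraction step described above reduces, for planar coordinates, to finite differences of position and heading. A sketch with our own function names (real GPS data would first be projected to a planar frame, and the paper's exact feature set may differ):

```python
from math import hypot, atan2

def trip_features(xs, ys, dt=1.0):
    """Speed, acceleration and change of direction from positions sampled
    every dt seconds in planar coordinates."""
    speeds, headings = [], []
    for i in range(1, len(xs)):
        dx, dy = xs[i] - xs[i - 1], ys[i] - ys[i - 1]
        speeds.append(hypot(dx, dy) / dt)      # metres per second
        headings.append(atan2(dy, dx))         # direction of travel (rad)
    accels = [(speeds[i] - speeds[i - 1]) / dt for i in range(1, len(speeds))]
    turns = [headings[i] - headings[i - 1] for i in range(1, len(headings))]
    return speeds, accels, turns

# Straight-line trip, speeding up by 1 m/s each second
s, a, t = trip_features([0.0, 1.0, 3.0, 6.0], [0.0, 0.0, 0.0, 0.0])
print(s, a, t)
```

Time series of such per-second features are what the convolutional network then classifies by driver.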
(This article belongs to the Special Issue Insurance: Spatial and Network Data)
Open AccessArticle Dealing with Drift Uncertainty: A Bayesian Learning Approach
Received: 20 November 2018 / Revised: 25 December 2018 / Accepted: 4 January 2019 / Published: 9 January 2019
Abstract
One of the main challenges investors have to face is model uncertainty. Typically, the dynamics of the assets are modeled using two parameters: the drift vector and the covariance matrix, both of which are uncertain. Since the variance/covariance parameter is assumed to be estimated with a certain level of confidence, we focus on drift uncertainty in this paper. Building on filtering techniques and learning methods, we use a Bayesian learning approach to solve the Markowitz problem and provide a simple and practical procedure to implement the optimal strategy. To illustrate the value added of using the optimal Bayesian learning strategy, we compare it with an optimal nonlearning strategy that keeps the drift constant at all times. To emphasize the advantage of the Bayesian learning strategy over the nonlearning one in different situations, we experiment with three different investment universes: indices of various asset classes, currencies, and smart beta strategies. Full article
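In the simplest conjugate Gaussian setting, the Bayesian drift update has a closed form: the posterior precision is the sum of the prior and data precisions. A sketch of this simplified (i.i.d. returns, known variance) version, which is an illustration of the idea rather than the paper's full filtering setup:

```python
def update_drift(prior_mean, prior_var, returns, obs_var):
    """Conjugate Gaussian update of an uncertain drift mu given return
    observations r_i ~ N(mu, obs_var) with known obs_var. Parameter
    names and the i.i.d. assumption are ours."""
    n = len(returns)
    sample_mean = sum(returns) / n
    post_var = 1.0 / (1.0 / prior_var + n / obs_var)          # precisions add
    post_mean = post_var * (prior_mean / prior_var
                            + n * sample_mean / obs_var)      # precision-weighted
    return post_mean, post_var

# Diffuse-ish prior, 100 daily returns: the posterior mean shrinks the
# sample mean toward the prior, and the posterior variance collapses.
m, v = update_drift(0.0, 1.0, [0.1] * 100, 1.0)
print(m, v)
```

As more returns arrive, the posterior concentrates and the learning strategy's portfolio weights converge toward those of a full-information investor, which is the mechanism the paper exploits.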
(This article belongs to the Special Issue Applications of Stochastic Optimal Control to Economics and Finance)
Risks EISSN 2227-9091 Published by MDPI AG, Basel, Switzerland