Risks doi: 10.3390/risks6020059

Authors: Shuanming Li and Yi Lu

This paper studies the moments and the distribution of the aggregate discounted claims (ADCs) in a Markovian environment, where the claim arrivals, claim amounts, and forces of interest (for discounting) are influenced by an underlying Markov process. Specifically, we assume that claims occur according to a Markovian arrival process (MAP). The paper shows that the vector of joint Laplace transforms of the ADC occurring in each state of the environment process by any specific time satisfies a matrix-form first-order partial differential equation, through which a recursive formula is derived for the moments of the ADC occurring in certain states (a subset). We also study two types of covariances of the ADC occurring in any two subsets of the state space and with two different time lengths. The distribution of the ADC occurring in certain states by any specific time is investigated as well. Finally, numerical results are presented for a two-state Markov-modulated model.

Risks doi: 10.3390/risks6020058

Authors: Roman V. Ivanov

This paper considers the risks of an investment portfolio consisting of distributed mortgages and sold European call options. It is assumed that the stream of credit payments could fall by a jump, with the time of the jump modeled by an exponential distribution. The returns on the stock are assumed to be variance-gamma distributed. The value at risk, the expected shortfall and the entropic risk measure for this portfolio are calculated in closed form. The obtained formulas exploit the values of generalized hypergeometric functions.

Risks doi: 10.3390/risks6020057

Authors: Tatjana Miljkovic and Daniel Fernández

We review two complementary mixture-based clustering approaches for modeling unobserved heterogeneity in an insurance portfolio: the generalized linear mixed cluster-weighted model (CWM) and mixture-based clustering for an ordered stereotype model (OSM). The latter models ordinal variables, while the former models losses as a function of mixed-type covariates. The article extends the idea of mixture modeling to multivariate classification for the purpose of testing unobserved heterogeneity in an insurance portfolio. The application of both methods is illustrated on a well-known French automobile portfolio, in which the model fitting is performed using the expectation-maximization (EM) algorithm. Our findings show that these mixture-based clustering methods can be used to further test unobserved heterogeneity in an insurance portfolio and as such may be considered in insurance pricing, underwriting, and risk management.

Risks doi: 10.3390/risks6020056

Authors: Fred Espen Benth, Luca Di Persio and Silvia Lavagnini

We model the logarithm of the spot price of electricity with a normal inverse Gaussian (NIG) process and the wind speed and wind power production with two Ornstein–Uhlenbeck processes. In order to reproduce the correlation between the spot price and the wind power production, namely between a pure jump process and a continuous path process, respectively, we replace the small jumps of the NIG process by a Brownian term. We then apply our models to two different problems: first, to study from the stochastic point of view the income from a wind power plant, as the expected value of the product between the electricity spot price and the amount of energy produced; then, to construct and price a European put-type quanto option in the wind energy markets that allows the buyer to hedge against low prices and low wind power production in the plant. Calibration of the proposed models and related price formulas is also provided, according to specific datasets.

Risks doi: 10.3390/risks6020055

Authors: Khaled Halteh, Kuldeep Kumar and Adrian Gepp

Credit risk is a critical issue that affects banks and companies on a global scale. Possessing the ability to accurately predict the level of credit risk has the potential to help the lender and borrower. This is achieved by limiting the number of loans provided to borrowers with poor financial health, thereby reducing the number of failed businesses and, in effect, preventing economies from collapsing. This paper uses state-of-the-art stochastic models, namely decision trees, random forests and stochastic gradient boosting, to add to the current literature on credit-risk modelling. The Australian mining industry has been selected to test our methodology. Mining in Australia generates around $138 billion annually, making up more than half of the total goods and services. This paper uses publicly-available financial data from 750 risky and non-risky Australian mining companies as variables in our models. Our results indicate that stochastic gradient boosting was the superior model at correctly classifying the good and bad credit-rated companies within the mining sector. Our model showed that ‘Property, Plant, & Equipment (PPE) turnover’, ‘Invested Capital Turnover’ and ‘Price over Earnings Ratio (PER)’ were the variables with the best explanatory power for predicting credit risk in the Australian mining sector.
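The classification setup described in this abstract can be sketched with scikit-learn. The data below are simulated stand-ins (the study's 750-company dataset is not reproduced here), and the three feature names merely echo the ratios the abstract highlights; this is an illustrative sketch of the technique, not the authors' pipeline.

```python
# Sketch of tree-ensemble credit classification on simulated data.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 750  # same portfolio size as the study; the data itself is simulated
X = rng.normal(size=(n, 3))  # columns: PPE turnover, IC turnover, PER (illustrative)
# Toy rule: low turnovers and a high PER raise the default probability.
p = 1 / (1 + np.exp(X[:, 0] + X[:, 1] - X[:, 2]))
y = rng.binomial(1, p)  # 1 = bad credit rating

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = GradientBoostingClassifier(random_state=0).fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
print("feature importances:", clf.feature_importances_)
```

The `feature_importances_` vector is how such studies rank explanatory power across the candidate ratios.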

Risks doi: 10.3390/risks6020054

Authors: Rüdiger Frey and Juraj Hledik

In this paper, we study the implications of diversification in the asset portfolios of banks for financial stability and systemic risk. Adding to the existing literature, we analyse this issue in a network model of the interbank market. We carry out a simulation study that determines the probability of a systemic crisis in the banking network as a function of both the level of diversification and the connectivity and structure of the financial network. In contrast to earlier studies, we find that diversification at the level of individual banks may be beneficial for financial stability even if it does lead to a higher asset return correlation across banks.

Risks doi: 10.3390/risks6020053

Authors: Stanislaus Maier-Paape and Qiji Jim Zhu

Utility and risk are two often competing measures of investment success. We show that the efficient trade-off between these two measures for investment portfolios happens, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz (1959) and the capital asset pricing model of Sharpe (1964) are special cases of our general framework when the risk measure is taken to be the standard deviation and the utility function is the identity mapping. Using our general framework, we also recover and extend the results in Rockafellar et al. (2006), which were already an extension of the capital asset pricing model allowing for more general deviation measures. This generalized capital asset pricing model also applies, for example, when an approximation of the maximum drawdown is considered as a risk measure. Furthermore, the consideration of a general utility function allows going beyond the “additive” performance measure to a “multiplicative” one of cumulative returns by using the log utility. As a result, the growth optimal portfolio theory of Lintner (1965) and the leverage space portfolio theory of Vince (2009) can also be understood and enhanced under our general framework. Thus, this general framework unifies several important existing portfolio theories and goes far beyond them. For simplicity of presentation, we phrase everything for a finite underlying probability space and a one-period market model, but generalizations to more complex structures are straightforward.

Risks doi: 10.3390/risks6020052

Authors: Rasika Yatigammana, Shelton Peiris, Richard Gerlach and David Edmund Allen

The direction of price movements is analysed under an ordered probit framework, recognising the importance of accounting for discreteness in price changes. Extending the work of Hausman et al. (1972) and Yang and Parwada (2012), this paper focuses on improving the forecast performance of the model while infusing a more practical perspective by enhancing flexibility. This is achieved by extending the existing framework to generate short-term multi-period-ahead forecasts for better decision making, whilst considering the serial dependence structure. This approach enhances the flexibility and adaptability of the model to future price changes, particularly targeting risk minimisation. Empirical evidence is provided, based on seven stocks listed on the Australian Securities Exchange (ASX). The prediction success varies between 78 and 91 per cent for in-sample and out-of-sample forecasts, in both the short term and the long term.

Risks doi: 10.3390/risks6020051

Authors: Timothy Hillman, Nan Zhang and Zhuo Jin

We extend an existing numerical model (Grasselli (2011)) for valuing a real option to invest in a capital project in an incomplete market with a finite time horizon. In doing so, we include two separate effects: the possibility that the project value is partly describable according to a jump-diffusion process, and incorporation of a time-dependent investor utility function, taking into account the effect of inflation. We adopt a discrete approximation to the jump process, whose parameters are restricted in order to preserve the drift and the volatility of the project-value process that it modifies. By controlling for these low-order effects, the higher-order effects may be considered in isolation. Our simulated results demonstrate that the inclusion of the jump process tends to decrease the value of the option, and expand the circumstances under which it should be exercised. Our results also demonstrate that an appropriate selection of the time-dependent investor utility function yields more reasonable investor-behaviour predictions regarding the decision to exercise the option, than would occur otherwise.

Risks doi: 10.3390/risks6020050

Authors: Gian Paolo Clemente

The Solvency II Standard Formula provides a methodology to recognise the risk-mitigating impact of excess-of-loss reinsurance treaties in premium risk modelling. We analyse the proposals of both Quantitative Impact Study 5 and the Commission Delegated Regulation, highlighting some inconsistencies. This paper tries to bridge the main pitfalls of both versions. To this aim, we propose a revision of the non-proportional adjustment factor in order to measure the effect of excess-of-loss treaties on premium risk volatility. In this way, the capital requirement can be easily assessed. As numerical results show, this proposal appears to be a feasible and much more consistent approach to describing the effect of non-proportional reinsurance on premium risk.

Risks doi: 10.3390/risks6020049

Authors: Wei Wei

There are extensive studies on the allocation problems in the field of insurance and finance. We observe that these studies, although involving different methodologies, share some inherent commonalities. In this paper, we develop a new framework for these studies with the tool of arrangement increasing functions. This framework unifies many existing studies and provides shortcuts to developing new results.

Risks doi: 10.3390/risks6020048

Authors: Alessandro Milazzo and Elena Vigna

We study the gap between the state pension provided by the Italian pension system pre-Dini reform and post-Dini reform. The goal is to fill the gap between the old and the new pension by joining a defined contribution pension scheme and adopting an optimal investment strategy that is target-based. We find that it is possible to cover, at least partially, this gap with the additional income of the pension scheme, especially in the presence of late retirement and in the presence of stagnant careers. Workers with dynamic careers and workers who retire early are those who are most penalised by the reform. Results are intuitive and in line with previous studies on the subject.

Risks doi: 10.3390/risks6020047

Authors: Bertrand K. Hassani and Alexis Renaudin

According to the last proposals of the Basel Committee on Banking Supervision, banks or insurance companies under the advanced measurement approach (AMA) must use four different sources of information to assess their operational risk capital requirement. The fourth includes ‘business environment and internal control factors’, i.e., qualitative criteria, whereas the three main quantitative sources available to banks for building the loss distribution are internal loss data, external loss data and scenario analysis. This paper proposes an innovative methodology to bring together these three different sources in the loss distribution approach (LDA) framework through a Bayesian strategy. The integration of the different elements is performed in two different steps to ensure an internal data-driven model is obtained. In the first step, scenarios are used to inform the prior distributions and external data inform the likelihood component of the posterior function. In the second step, the initial posterior function is used as the prior distribution and the internal loss data inform the likelihood component of the second posterior function. This latter posterior function enables the estimation of the parameters of the severity distribution that are selected to represent the operational risk event types.
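The two-step Bayesian scheme described above can be illustrated with the simplest conjugate case: a normal prior on the mean of log-severities with known variance. The normal-normal model and all numbers below are illustrative assumptions, not the paper's severity distributions or any bank's data; the point is only the chaining of the two posterior updates.

```python
# Two-step conjugate Bayesian updating: scenarios -> prior; external data ->
# first posterior; first posterior -> prior for the internal-data update.
import numpy as np

def posterior(mu0, tau0_sq, data, sigma_sq):
    """Conjugate update: N(mu0, tau0_sq) prior, normal likelihood, known sigma_sq."""
    prec = 1.0 / tau0_sq + len(data) / sigma_sq
    mu = (mu0 / tau0_sq + np.sum(data) / sigma_sq) / prec
    return mu, 1.0 / prec

sigma_sq = 1.0                        # assumed known log-severity variance
mu_scen, tau_scen = 10.0, 4.0         # prior mean/variance elicited from scenarios
external = np.array([11.2, 9.8, 10.5, 12.1])    # external loss data (log scale)
internal = np.array([9.1, 9.4, 8.8, 9.9, 9.2])  # internal loss data (log scale)

# Step 1: scenario-based prior updated with external data.
mu1, tau1 = posterior(mu_scen, tau_scen, external, sigma_sq)
# Step 2: first posterior becomes the prior; internal data drive the likelihood.
mu2, tau2 = posterior(mu1, tau1, internal, sigma_sq)
print(mu1, mu2)  # the final estimate is pulled toward the internal loss experience
```

Each update shrinks the posterior variance, so the final parameters are internal-data-driven while still anchored by scenarios and external data, which is the stated goal of the methodology.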

Risks doi: 10.3390/risks6020046

Authors: Martin Tegnér and Rolf Poulsen

It is impossible to discriminate between the commonly used stochastic volatility models of Heston, log-normal, and 3-over-2 on the basis of exponentially weighted averages of daily returns, even though it appears so at first sight. However, with a 5-min sampling frequency, the models can be differentiated and empirical evidence overwhelmingly favours a fast mean-reverting log-normal model.

Risks doi: 10.3390/risks6020045

Authors: Marco Bee and Luca Trapin

One of the key components of financial risk management is risk measurement. This typically requires modeling, estimating and forecasting tail-related quantities of the asset returns’ conditional distribution. Recent advances in the financial econometrics literature have developed several models based on Extreme Value Theory (EVT) to carry out these tasks. The purpose of this paper is to review these methods.

Risks doi: 10.3390/risks6020044

Authors: Apostolos Bozikas and Georgios Pitselis

During the last decades, life expectancy has risen significantly in the most developed countries all over the world. Greece is a case in point; consequently, governmental financial responsibilities grow and serious concerns are raised owing to population ageing. To address this issue, an efficient forecasting method is required. Therefore, the most important stochastic mortality models were comparatively applied to Greek data for the first time. An analysis of their fitting behaviour by gender was conducted and the corresponding forecasting results were evaluated. In particular, we incorporated the Greek population data into seven stochastic mortality models under a common age-period-cohort framework. The fitting performance of each model was thoroughly evaluated based on information criteria values as well as the likelihood ratio test, and their robustness to period changes was investigated. In addition, parameter risk in forecasts was assessed by employing bootstrapping techniques. For completeness, projection results for both genders were also illustrated by pricing insurance-related products.
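The simplest member of the age-period-cohort family referred to above is the Lee-Carter model, log m(x,t) = a(x) + b(x)k(t), classically fitted by SVD. The sketch below uses a simulated mortality surface (not the Greek data) and is only meant to show the mechanics of that fit under the usual identification constraint that the b(x) sum to one.

```python
# Lee-Carter fit by SVD on a simulated log-mortality surface.
import numpy as np

rng = np.random.default_rng(2)
ages, years = 50, 30
a = np.linspace(-8, -1, ages)          # log-mortality level by age
b = np.full(ages, 1 / ages)            # age sensitivities, summing to 1
k = -0.5 * np.arange(years)            # declining period (mortality-trend) index
logm = a[:, None] + np.outer(b, k) + 0.01 * rng.normal(size=(ages, years))

a_hat = logm.mean(axis=1)              # a(x): average log-rate per age
U, s, Vt = np.linalg.svd(logm - a_hat[:, None], full_matrices=False)
b_hat, k_hat = U[:, 0], s[0] * Vt[0]   # leading singular pair
# Identification: rescale so that the estimated b(x) sum to one.
b_hat, k_hat = b_hat / b_hat.sum(), k_hat * b_hat.sum()
```

Forecasting then reduces to modelling the one-dimensional series k(t), e.g. as a random walk with drift; bootstrapping this fit is one way to quantify the parameter risk the abstract mentions.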

Risks doi: 10.3390/risks6020043

Authors: Siqi Tang, Sachi Purcal and Jinhui Zhang

In this paper, we construct and analyse a lifetime utility maximisation model with hyperbolic discounting. Within the model, a number of assumptions are made: markets are complete, actuarially fair life insurance/annuities are available, and investors have time-dependent preferences. Time-dependent preferences are in contrast to the usual case of constant preferences (exponential discounting). We find that: (1) investors (realistically) demand more life insurance after retirement (in contrast to the standard model, which shows strong demand for life annuities), and annuities are rarely purchased; (2) optimal consumption paths exhibit a humped shape (which is usually only found in incomplete markets under the assumptions of the standard model).
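The contrast between exponential and hyperbolic discounting can be illustrated numerically. The discount-factor forms and the parameter values below are standard textbook choices assumed for illustration, not the paper's calibration.

```python
# Exponential vs. generalized hyperbolic discount factors.
import numpy as np

t = np.array([0.0, 5.0, 10.0, 20.0, 40.0])  # horizons in years
rho = 0.04                                   # exponential discount rate (assumed)
alpha, beta = 1.0, 0.25                      # hyperbolic parameters (assumed)

exponential = np.exp(-rho * t)               # constant-rate discounting
hyperbolic = (1.0 + alpha * t) ** (-beta / alpha)

# Hyperbolic discounting is steeper at short horizons and flatter at long
# horizons, the pattern behind time-inconsistent preferences.
for h, e, x in zip(t, exponential, hyperbolic):
    print(f"t={h:>4}: exp={e:.3f}  hyp={x:.3f}")
```

The crossing of the two curves is what makes hyperbolic preferences time-dependent: the implied one-period discount rate falls as the horizon grows.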

Risks doi: 10.3390/risks6020042

Authors: Andreas Mühlbacher and Thomas Guhr

We review recent progress in modeling credit risk for correlated assets. We employ a new interpretation of the Wishart model for random correlation matrices to model non-stationary effects. We then use the Merton model in which default events and losses are derived from the asset values at maturity. To estimate the time development of the asset values, the stock prices are used, the correlations of which have a strong impact on the loss distribution, particularly on its tails. These correlations are non-stationary, which also influences the tails. We account for the asset fluctuations by averaging over an ensemble of random matrices that models the truly existing set of measured correlation matrices. As a most welcome side effect, this approach drastically reduces the parameter dependence of the loss distribution, allowing us to obtain very explicit results, which show quantitatively that the heavy tails prevail over diversification benefits even for small correlations. We calibrate our random matrix model with market data and show how it is capable of grasping different market situations. Furthermore, we present numerical simulations for concurrent portfolio risks, i.e., for the joint probability densities of losses for two portfolios. For the convenience of the reader, we give an introduction to the Wishart random matrix model.

Risks doi: 10.3390/risks6020041

Authors: Udo Milkau and Jürgen Bott

Advanced machine learning has achieved extraordinary success in recent years. For “active” operational risk management beyond the ex post analysis of measured data, machine learning could provide help beyond the regime of traditional statistical analysis when it comes to the “known unknown” or even the “unknown unknown.” While machine learning has been tested successfully in the regime of the “known,” heuristics typically provide better results for active operational risk management (in the sense of forecasting). However, precursors in existing data can open a chance for machine learning to provide early warnings even for the regime of the “unknown unknown.”

Risks doi: 10.3390/risks6020040

Authors: Gabriel Frahm

An intersection–union test for supporting the hypothesis that a given investment strategy is optimal among a set of alternatives is presented. It compares the Sharpe ratio of the benchmark with that of each other strategy. The intersection–union test takes serial dependence into account and does not presume that asset returns are multivariate normally distributed. An empirical study based on the G-7 countries demonstrates that it is hard to find significant results due to the lack of data, which confirms a general observation in empirical finance.

Risks doi: 10.3390/risks6020039

Authors: Jyh-Jiuan Lin, Chuen-Ping Chang and Shi Chen

The topic of bank default risk in connection with government bailouts has recently attracted a great deal of attention. In this paper, the question of how a bank’s default risk is affected by a distress acquisition is investigated. Specifically, the government provides a bailout program of distressed loan purchases for a strong bank to acquire a bank in distress. The acquirer bank may likely refuse the acquisition with a bailout when the amount of distressed loan purchases is large or the knock-out value of the acquired bank is high. When the acquirer bank realizes acquisition gains, the default risk in the consolidated bank’s equity return is negatively related to loan purchases, but positively to the knock-out value of the acquired bank. The government bailout, as such, in large part contributes to banking stability.

Risks doi: 10.3390/risks6020038

Authors: Peter Martey Addo, Dominique Guegan and Bertrand Hassani

Due to advances in technology associated with Big Data, data availability and computing power, most banks and lending institutions are renewing their business models. Credit risk predictions, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine- and deep-learning models, trained on real data, to predict loan default probability. The top 10 most important features from these models are selected and then used in the modeling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This raises several questions about the intensive use of deep-learning systems in enterprises.
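The top-10 feature-selection step described above can be sketched with a tree ensemble: fit on all candidate features, rank by importance, keep the ten best and refit on the reduced set. The data are simulated (the paper uses real loan-level data), and the informative-feature structure is an assumption made so the sketch has a known answer.

```python
# Feature selection by tree-ensemble importance on simulated loan data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(2000, 30))                 # 30 candidate features
logit = X[:, 0] - 2 * X[:, 1] + 0.5 * X[:, 2]   # only 3 are truly informative
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # 1 = default

full = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
top10 = np.argsort(full.feature_importances_)[::-1][:10]  # best 10 by importance
reduced = RandomForestClassifier(n_estimators=200, random_state=0).fit(X[:, top10], y)
print("top-10 features:", sorted(top10.tolist()))
```

Repeating the refit on separate data splits, as the paper does, is how the stability of the classifier under the reduced feature set is assessed.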

Risks doi: 10.3390/risks6020037

Authors: Georgia Warren-Myers, Gideon Aschwanden, Franz Fuerst and Andy Krause

The estimation of future sea level rise (SLR) is a major concern for cities near coastlines and river systems. Despite this, current modelling underestimates the future risks of SLR to property. Direct risks posed to property include inundation, loss of physical property and associated economic and social costs. It is also crucial to consider the risks that emerge from scenarios after SLR. These may produce one-off or periodic events that will inflict physical, economic and social implications, and direct, indirect and consequential losses. Using a case study approach, this paper combines various forms of data to examine the implications of future SLR to further understand the potential risks. The research indicates that the financial implications for local government will be loss of rates associated with total property loss and declines in value. The challenges identified are not specific to this research. Other municipalities worldwide experience similar barriers (i.e., financial implications, coastal planning predicaments, data paucity, knowledge and capacity, and legal and political challenges). This research highlights the need for private and public stakeholders to co-develop and implement strategies to mitigate and adapt property to withstand the future challenges of climate change and SLR.

Risks doi: 10.3390/risks6020036

Authors: Xavier Milhaud, Victorien Poncelet and Clement Saillard

This work addresses crucial questions about the robustness of the PSDization process for applications in insurance. PSDization refers to the process that forces a matrix to become positive semidefinite. For companies using copulas to aggregate risks in their internal model, PSDization occurs when working with correlation matrices to compute the Solvency Capital Requirement (SCR). We examine how classical operational choices concerning the modelling of risk dependence impact the SCR during PSDization. These operations refer to the permutations of risks (or business lines) in the correlation matrix, the addition of a new risk, and the introduction of confidence weights given to the correlation coefficients. The use of genetic algorithms shows that theoretically neutral transformations of the correlation matrix can surprisingly lead to significant sensitivities of the SCR (up to 6%). This highlights the need for a very strong internal control around the PSDization step.
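A common concrete PSDization recipe, eigenvalue clipping followed by rescaling to a unit diagonal, can be sketched in a few lines. This is one standard choice assumed here for illustration; the internal-model variants studied in the paper may differ in the details.

```python
# PSDization by eigenvalue clipping: make an "almost" correlation matrix PSD.
import numpy as np

def psdize(C, eps=0.0):
    """Clip negative eigenvalues to eps, then rescale to restore unit diagonal."""
    vals, vecs = np.linalg.eigh((C + C.T) / 2)  # symmetrize, then decompose
    vals = np.clip(vals, eps, None)             # force eigenvalues >= eps
    P = vecs @ np.diag(vals) @ vecs.T
    d = np.sqrt(np.diag(P))
    return P / np.outer(d, d)                   # keep it a correlation matrix

# A symmetric matrix with unit diagonal that is NOT positive semidefinite.
C = np.array([[1.0, 0.9, 0.2],
              [0.9, 1.0, 0.9],
              [0.2, 0.9, 1.0]])
print(np.linalg.eigvalsh(C).min())  # negative: not a valid correlation matrix
P = psdize(C)
print(np.linalg.eigvalsh(P).min())  # now non-negative
```

Note that `psdize` moves the off-diagonal entries, which is exactly why the paper's sensitivity question matters: nominally neutral operations on `C` change where those entries land after clipping.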

Risks doi: 10.3390/risks6020035

Authors: Florin Avram and Sooie-Hoe Loke

Modeling the interactions between a reinsurer and several insurers, between a central management branch (CB) and several subsidiary business branches, or between a coalition and its members is a fascinating problem that suggests many interesting questions. Beyond two dimensions, one cannot expect exact answers. Occasionally, reductions to one dimension or heuristic simplifications yield explicit approximations, which may be useful for getting qualitative insights. In this paper, we study two such problems: the ruin problem for a two-dimensional CB network under a new mathematical model, and the problem of valuation of two-dimensional CB networks by optimal dividends. A common thread between these two problems is that the one-dimensional reduction exploits the concept of invariant cones. Perhaps the most important contribution of the paper is the questions it raises; for that reason, we have found it useful to complement the particular examples solved by providing one possible formalization of the concept of a multi-dimensional risk network, which seems to us an appropriate umbrella for the kind of questions raised here.

Risks doi: 10.3390/risks6020034

Authors: Emilio Gómez-Déniz and Enrique Calderín-Ojeda

In the classical bonus-malus system, the premium assigned to each policyholder is based only on the number of claims made, without taking into account the claim sizes. Thus, a policyholder who has declared a claim that results in a relatively small loss is penalised to the same extent as one who has declared a more expensive claim. Of course, this is seen as unfair by many policyholders. In this paper, we study the factors that affect the number of claims in car insurance by using a trivariate discrete distribution. This approach allows us to distinguish between three types of claims, depending on whether the claims are above, between or below certain thresholds. Therefore, this model incorporates the two fundamental random variables in this scenario: the number of claims as well as the amounts associated with them. In addition, we introduce a trivariate prior distribution conjugate with this discrete distribution that produces credibility bonus-malus premiums satisfying appropriate traditional transition rules. A practical example based on real data is shown to examine the differences with respect to the premiums obtained under the traditional tarification system.

Risks doi: 10.3390/risks6020033

Authors: José-Luis Pérez and Kazutoshi Yamazaki

Given a spectrally-negative Lévy process and independent Poisson observation times, we consider a periodic barrier strategy that pushes the process down to a certain level whenever the observed value is above it. We also consider the versions with additional classical reflection above and/or below. Using scale functions and excursion theory, various fluctuation identities are computed in terms of the scale functions. Applications in de Finetti’s dividend problems are also discussed.

Risks doi: 10.3390/risks6020032

Authors: Lukas Michel, Johanna Anzengruber, Marco Wölfle and Nick Hixson

Despite real changes in the workplace and the negative consequences of prevailing hierarchical structures with rigid management systems, little attention has yet been paid to shifting management modes to accommodate the dynamics of the external environment, particularly when a firm’s operating environment demands a high degree of flexibility. Building on the resource-based view as a basis for competitive advantage, we posit that differences in the stability of an organization’s environment and the degree of managerial control explain variations in the management mode used in firms. Unlike other studies, which mainly focus on either the dynamics of the external environment or management control, we have developed a theoretical model combining both streams of research in a context frame to describe under what conditions firms engage in rules-based, change-based, engagement-based and capability-based management modes. To test our theoretical framework, we conducted a survey of 54 firms in various industries and nations on how their organizations cope with a dynamic environment and what management style they used in response. Our study reveals that the appropriate mode can be determined by analyzing purpose, motivation, knowledge and information, as well as the degree of complexity, volatility and uncertainty the firm is exposed to. With our framework, we attempt to advance the understanding of when organizations should adapt their management style to the changing business environment.

Risks doi: 10.3390/risks6020031

Authors: Mohamed Badaoui, Begoña Fernández and Anatoliy Swishchuk

In this paper, we consider the problem of an insurance company where the wealth of the insurer is described by a Cramér-Lundberg process. The insurer is allowed to invest in a risky asset with stochastic volatility subject to the influence of an economic factor, with the remaining surplus held in a bank account. The price of the risky asset and the economic factor are modeled by a system of correlated stochastic differential equations. In a finite-horizon framework, and assuming that the market is incomplete, we study the problem of maximizing the expected utility of terminal wealth. When the insurer’s preferences are exponential, an existence and uniqueness theorem is proven for the non-linear Hamilton-Jacobi-Bellman (HJB) equation. The optimal strategy and the value function are produced in closed form. In addition, in order to show the connection between the insurer’s decision and the correlation coefficient, we present two numerical approaches: a Monte Carlo method based on the stochastic representation of the solution of the insurer’s problem via the Feynman-Kac formula, and a mixed finite-difference/Monte Carlo one. Finally, the results are presented for the case of the Scott model.

Risks doi: 10.3390/risks6020030

Authors: Jacques Drèze

Under state-dependent preferences, probabilities and units of scale of state-dependent utilities are not separately identified. In standard models, only their products matter to decisions. Separate identification has been studied under implicit actions by Drèze or under explicit actions and observations by Karni. This paper complements both approaches and relates them when conditional preferences for final outcomes are independent of actions and observations. That special case permits drastic technical simplification while remaining open to some natural extensions.

Risks doi: 10.3390/risks6020029

Authors: Andrea Gabrielli and Mario V. Wüthrich

The aim of this project is to develop a stochastic simulation machine that generates individual claims histories of non-life insurance claims. This simulation machine is based on neural networks to incorporate individual claims feature information. We provide a fully calibrated stochastic scenario generator that is based on real non-life insurance data. This stochastic simulation machine allows everyone to simulate their own synthetic insurance portfolio of individual claims histories and back-test their preferred claims reserving method.

Risks doi: 10.3390/risks6020028

Authors: Andrea Barletta and Paolo Santucci de Magistris

This paper introduces a new computational tool for the analysis of the risks embedded in a set of prices of European-style options. The software enables the estimation of the risk-neutral density (RND) from the observed option prices by means of orthogonal polynomial expansions. Orthogonal polynomials offer a viable alternative to more standard techniques based on interpolation and estimation of the second-order derivatives of option prices. The app rndfittool is available on GitHub and its usage is illustrated with examples based on real data.

Risks doi: 10.3390/risks6020027

Authors: Dimitrios Konstantinides

We study the precise large-deviation asymptotics for sums of independent, identically distributed random variables when the distribution of the summand belongs to the class S∗ of heavy-tailed distributions. Under mild conditions, we extend the previous results of Denisov et al. (2010) to asymptotics that hold uniformly over some time interval. Finally, we apply the main result to the multi-risk model introduced by Wang and Wang (2007).

Risks doi: 10.3390/risks6020026

Authors: Yanlin Shi Yang Yang

In this paper, we propose an Adaptive Hyperbolic EGARCH (A-HYEGARCH) model to estimate the long memory of high frequency time series with potential structural breaks. Based on the original HYGARCH model, we use the logarithm transformation to ensure the positivity of conditional variance. The structural change is further allowed via a flexible time-dependent intercept in the conditional variance equation. To demonstrate its effectiveness, we perform a range of Monte Carlo studies considering various data generating processes with and without structural changes. Empirical testing of the A-HYEGARCH model is also conducted using high frequency returns of S&amp;P 500, FTSE 100, ASX 200 and Nikkei 225. Our simulation and empirical evidence demonstrate that the proposed A-HYEGARCH model outperforms various competing specifications and can effectively control for structural breaks. Therefore, our model may provide more reliable estimates of long memory and could be a widely useful tool for modelling financial volatility in other contexts.

Risks doi: 10.3390/risks6020025

Authors: Jonas Harnau

Despite the widespread use of chain-ladder models, no theory has so far been available to test for model specification. The popular over-dispersed Poisson model assumes that the over-dispersion is common across the data. A further assumption is that accident year effects do not vary across development years and vice versa. The log-normal chain-ladder model makes similar assumptions. We show that these assumptions can easily be tested and that similar tests can be used in both models. The tests can be implemented in a spreadsheet. We illustrate the implementation in several empirical applications. While the results for the log-normal model are valid in finite samples, those for the over-dispersed Poisson model are derived for large cell mean asymptotics which hold the number of cells fixed. We show in a simulation study that the finite sample performance is close to the asymptotic performance.

Risks doi: 10.3390/risks6020024

Authors: Xiaoyi Zhang Junyi Guo

This paper investigates the optimal investment strategy for a risk-averse member of a defined contribution (DC) pension plan during the decumulation phase who pays close attention to inflation risk. The plan aims to maximize the expected constant relative risk aversion (CRRA) utility of the terminal real wealth by investing the fund in a financial market consisting of an inflation-indexed bond, an ordinary zero-coupon bond and a risk-free asset. We derive the optimal investment strategy in closed form using the dynamic programming approach, by solving the related Hamilton-Jacobi-Bellman (HJB) equation. The results reveal that, for any parameter values, an inflation-indexed bond has a significant advantage in hedging inflation risk.

Risks doi: 10.3390/risks6010023

Authors: José Garrido Ramin Okhrati

An arbitrage portfolio provides a cash flow that can never be negative at zero cost. We define the weaker concept of a “desirable portfolio” delivering cash flows with negative risk at zero cost. Although these are not completely risk-free investments, and depend on the risk measure used, they can provide attractive investment opportunities for investors. We investigate in detail the theoretical aspects of this portfolio selection procedure and the existence of such opportunities in fixed income markets. Then, we present two applications of the theory: one in analyzing the market integration problem and the other in gauging the credit quality of defaultable bonds in a portfolio. We also discuss the model calibration and provide some numerical illustrations.

Risks doi: 10.3390/risks6010022

Authors: Thierry Moudiki Frédéric Planchet Areski Cousin

We are interested in obtaining forecasts for multiple time series by taking into account the potential nonlinear relationships between their observations. For this purpose, we use a specific type of regression model on an augmented dataset of lagged time series. Our model is inspired by dynamic regression models (Pankratz 2012), with the response variable’s lags included as predictors, and is known as a Random Vector Functional Link (RVFL) neural network. RVFL neural networks have been successfully applied in the past to solving regression and classification problems. The novelty of our approach is to apply an RVFL model to multivariate time series, under two separate regularization constraints on the regression parameters.

Risks doi: 10.3390/risks6010021

Authors: Robert Aykroyd Víctor Leiva Carolina Marchant

Since its origins and numerous applications in material science, the Birnbaum–Saunders family of distributions has now found widespread uses in some areas of the applied sciences such as agriculture, environment and medicine, as well as in quality control, among others. It is able to model varied data behaviour and hence provides a flexible alternative to the most usual distributions. The family includes Birnbaum–Saunders and log-Birnbaum–Saunders distributions in univariate and multivariate versions. There are now well-developed methods for estimation and diagnostics that allow in-depth analyses. This paper gives a detailed review of existing methods and of relevant literature, introducing properties and theoretical results in a systematic way. To emphasise the range of suitable applications, full analyses are included of examples based on regression and diagnostics in material science, spatial data modelling in agricultural engineering and control charts for environmental monitoring. However, potential future uses in new areas such as business, economics, finance and insurance are also discussed. This work is presented to provide a full tool-kit of novel statistical models and methods to encourage other researchers to implement them in these new areas. It is expected that the methods will have the same positive impact in the new areas as they have had elsewhere.

Risks doi: 10.3390/risks6010020

Authors: Edita Kizinevič Jonas Šiaulys

In this work, a non-homogeneous risk model is considered. In such a model, claims and inter-arrival times are independent but possibly non-identically distributed. Easily verifiable conditions are found under which the ultimate ruin probability of the model satisfies the exponential estimate exp{−ϱu} for all values of the initial surplus u ⩾ 0. Algorithms to estimate the positive constant ϱ are also presented; in fact, these algorithms are the main contribution of this work. The sharpness of the derived inequalities is illustrated by several numerical examples.
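
The exponential bound above parallels the classical Lundberg inequality. As a rough illustration only (for the homogeneous special case with i.i.d. claims, not the paper's non-homogeneous algorithms), the constant can be found numerically as the positive root of the Lundberg equation M_X(r) = 1 + (1 + θ)μr. All function names and parameter values below are illustrative assumptions, not taken from the paper:

```python
from typing import Callable

def lundberg_exponent(mgf: Callable[[float], float], mu: float, theta: float,
                      r_hi: float, tol: float = 1e-10) -> float:
    """Solve the Lundberg equation M_X(r) = 1 + (1 + theta) * mu * r
    for its positive root r (the adjustment coefficient) by bisection.
    mgf: moment generating function of the claim size X
    mu:  mean claim size; theta: premium loading
    r_hi: a point below the MGF's singularity where the equation's LHS exceeds its RHS."""
    f = lambda r: mgf(r) - 1.0 - (1.0 + theta) * mu * r
    lo, hi = tol, r_hi  # f(lo) < 0 near zero, f(hi) > 0 by choice of r_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Exponential(beta) claims: MGF = beta / (beta - r), mean mu = 1 / beta.
beta, theta = 2.0, 0.25
r = lundberg_exponent(lambda s: beta / (beta - s), 1.0 / beta, theta,
                      r_hi=beta * 0.999)
# Closed form for exponential claims: r = theta * beta / (1 + theta) = 0.4
print(r)
```

For exponential claims the numerical root can be checked against the closed form, which is what makes this toy case a convenient sanity check before moving to distributions without one.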

Risks doi: 10.3390/risks6010019

Authors: Zinoviy Landsman Udi Makov Tomer Shushi

In this paper, we offer a novel class of utility functions applied to optimal portfolio selection. This class incorporates as special cases important measures such as the mean-variance, Sharpe ratio and mean-standard deviation criteria, among others. We provide an explicit solution to the problem of optimal portfolio selection based on this class. Furthermore, we show that each measure in this class generally reduces to an efficient frontier that coincides with, or belongs to, the classical mean-variance efficient frontier. In addition, a condition is provided for the existence of a one-to-one correspondence between the parameter of this class of utility functions and the trade-off parameter λ in the mean-variance utility function. This correspondence essentially provides insight into the choice of this parameter. We illustrate our results with a portfolio of stocks from the National Association of Securities Dealers Automated Quotation (NASDAQ).

Risks doi: 10.3390/risks6010018

Authors: Andrea Macrina Obeid Mahomed

The general problem of asset pricing when the discount rate differs from the rate at which an asset’s cash flows accrue is considered. A pricing kernel framework is used to model an economy that is segmented into distinct markets, each identified by a yield curve having its own market, credit and liquidity risk characteristics. The proposed framework precludes arbitrage within each market, while the definition of a curve-conversion factor process links all markets in a consistent arbitrage-free manner. A pricing formula is then derived, referred to as the across-curve pricing formula, which enables consistent valuation and hedging of financial instruments across curves (and markets). As a natural application, a consistent multi-curve framework is formulated for emerging and developed inter-bank swap markets, which highlights an important dual feature of the curve-conversion factor process. Given this multi-curve framework, existing multi-curve approaches based on HJM and rational pricing kernel models are recovered, reviewed and generalised and single-curve models extended. In another application, inflation-linked, currency-based and fixed-income hybrid securities are shown to be consistently valued using the across-curve valuation method.

Risks doi: 10.3390/risks6010017

Authors: Asmerilda Hitaj Cesario Mateus Ilaria Peri

This paper presents the first methodological proposal for estimation of the ΛVaR. Our approach is dynamic and calibrated to market extreme scenarios, addressing the need of regulators and financial institutions for more sensitive risk measures. We also propose a simple backtesting methodology obtained by extending the VaR hypothesis-testing framework. Hence, we test our ΛVaR proposals under extreme downward scenarios of the financial crisis and different assumptions on the profit and loss distribution. The findings show that our ΛVaR estimations are able to capture the tail risk and react to market fluctuations significantly faster than the VaR and expected shortfall. The backtesting exercise displays a higher level of accuracy for our ΛVaR estimations.

Risks doi: 10.3390/risks6010016

Authors: Pavel Shevchenko

n/a

Risks doi: 10.3390/risks6010015

Authors: Reda Aboutajdine Pierre Picard

Audit mechanisms frequently take place in the context of repeated relationships between auditor and auditee. This paper focuses attention on the insurance fraud problem in a setting where insurers repeatedly verify claims satisfied by service providers (e.g., affiliated car repairers or members of managed care networks). We highlight a learning bias that leads insurers to over-audit service providers at the beginning of their relationship. The paper builds a bridge between the literature on optimal audit in insurance and the exploitation/exploration trade-off in multi-armed bandit problems.

Risks doi: 10.3390/risks6010014

Authors: Yang Shen Tak Siu

This paper presents a novel risk-based approach to an optimal asset allocation problem with default risk, where a money market account, an ordinary share and a defaultable security are the investment opportunities in a general non-Markovian economy incorporating random market parameters. The objective of an investor is to select an optimal mix of these securities such that a risk metric of the investment portfolio is minimized. By adopting a sub-additive convex risk measure, which takes into account interest rate risk, as the measure of risk, the investment problem is discussed mathematically in the form of a two-player, zero-sum, stochastic differential game between the investor and the market. A backward stochastic differential equation approach is used to provide a flexible and theoretically sound way to solve the game problem. Closed-form expressions for the optimal strategies of the investor and the market are obtained when the penalty function is a quadratic function and when the risk measure is a sub-additive coherent risk measure. An important case of the general non-Markovian model, namely the self-exciting threshold diffusion model with time delay, is considered. Numerical examples based on simulations for the self-exciting threshold diffusion model with and without time delay are provided to illustrate how the proposed model can be applied in this important case. The proposed model can be implemented using Excel spreadsheets.

Risks doi: 10.3390/risks6010013

Authors: Valeria D’Amato Emilia Di Lorenzo Marilena Sibillo

The relevance of critical illness coverage and life insurance under cause-specific mortality conditions is increasing in many industrialized countries. Policies with specific conditions on the illness and on the death event, providing cheaper premiums for the insureds and lower obligations for the insurers, are attractive in an insurance market looking to offer appealing products. On the other hand, the systematic improvement in longevity gives rise to a market with agents getting increasingly older, and the insurer pays attention to this trend. There are financial contracts combined with insurance coverage, as happens in particular in the case of the so-called insured loan. Insured loans are financial contracts often proposed together with a term life insurance in order to cover the lender and the heirs against the borrower’s death within the loan duration. This paper explores new insurance products that, linked to an insured loan, are founded on specific illness hypotheses and/or cause-specific mortality. The aim is to assess by how much the insurance cost is reduced with respect to the traditional term insurance. The authors project cause-specific mortality rates and specific diagnosis rates, in the latter case overcoming discontinuities in the data. The new contractual schemes are priced. Numerical applications illustrate the rate projection procedure with several graphs, and tables report the premiums under the newly proposed contractual forms. A complete amortization schedule closes the work.

Risks doi: 10.3390/risks6010012

Authors: David Babbel Miguel Herce

Little in the scholarly economics literature is directed specifically to the performance of stable value funds, although they occupy a leading place among retirement investment vehicles. They are currently offered in more than one-third of all defined contribution plans in the USA, with more than $800 billion of assets under management. This paper rigorously examines their performance throughout the entire period since their inception in 1973. We produce a composite index of stable value returns. We next conduct mean-variance analysis, Sharpe and Sortino ratio analysis, stochastic dominance analysis, and optimal multi-period portfolio composition analysis. Our evidence suggests that stable value funds dominate (on average) two major asset classes based on a historical analysis, and that they often occupy a significant position in optimized portfolios across a broad range of risk aversion levels. We discuss factors that contributed to stable value funds’ past performance and whether they can continue to perform well into the future. We also discuss considerations regarding whether or not to include stable value as an element in target date funds within defined contribution pension plans.

Risks doi: 10.3390/risks6010011

Authors: Enrique Calderín-Ojeda

Composite models have received much attention in the recent actuarial literature to describe heavy-tailed insurance loss data. One of the models that presents a good performance to describe this kind of data is the composite Weibull–Pareto (CWL) distribution. On this note, this distribution is revisited to carry out estimation of parameters via mle and mle2 optimization functions in R. The results are compared with those obtained in a previous paper by using the nlm function, in terms of analytical and graphical methods of model selection. In addition, the consistency of the parameter estimation is examined via a simulation study.

Risks doi: 10.3390/risks6010010

Authors: Yang Chang Michael Sherris

The design and development of post-retirement income products require the assessment of longevity risk, as well as a basis for hedging these risks. Most indices for longevity risk are age-period based. We develop and assess a cohort-based value index for life insurers and pension funds to manage longevity risk. There are two innovations in the development of this index. Firstly, the underlying variables of most existing longevity indices are based on mortality experience only. The value index is based on the present value of future cash flow obligations, capturing all the risks in retirement income products. We use the index to manage both longevity risk and interest rate risk. Secondly, we capture historical dependencies between ages and cohorts with a cohort-based stochastic mortality model. We achieve this by introducing age-dependent model parameters. With our mortality model, we obtain realistic cohort correlation structures and improve the fitting performance, particularly for very old ages.

Risks doi: 10.3390/risks6010009

Authors: Catalina Bolancé Montserrat Guillen Jens Perch Nielsen Fredrik Thuring

Prospective customers of financial and insurance products can be targeted based on the profit the provider expects to earn from them. We present a model for individual expected profit and two alternatives for calculating optimal personalized prices that maximize the expected profit. For one of these alternatives, we obtain a closed-form expression for the price offered to each prospective customer; for the other, we need to use a numerical approximation. In both approaches, the profits generated by prospective customers are not immediately observed, given that the products sold by these companies have a risk component. We assume that willingness to pay is heterogeneous and apply our methodology using real data from a European insurance company. Our study indicates that a substantial boost in profits can be expected when applying the simplest optimal pricing method proposed.

Risks doi: 10.3390/risks6010008

Authors: Georges Dionne Denise Desjardins Martin Lebeau Stéphane Messier André Dascal

The ability and willingness of health care workers to report for work during a pandemic are essential to pandemic response. The main contribution of this article is to examine the relationship between risk perception of personal and work activities and willingness to report for work during an influenza pandemic. Data were collected through a quantitative Web-based survey sent to health care workers on the island of Montreal. Respondents were asked about their perception of various risks to obtain index measures of risk perception. A multinomial logit model was applied for the probability estimations, and a factor analysis was conducted to compute risk perception indexes (scores). Risk perception associated with personal and work activities is a significant predictor of intended presence at work during an influenza pandemic. This means that correcting perceptual biases should be a public policy concern. These results have not been previously reported in the literature. Many organizational variables are also significant.

Risks doi: 10.3390/risks6010007

Authors: Noemi Nava Tiziana Di Matteo Tomaso Aste

We introduce a multistep-ahead forecasting methodology that combines empirical mode decomposition (EMD) and support vector regression (SVR). This methodology is based on the idea that the forecasting task is simplified by using as input for SVR the time series decomposed with EMD. The outcomes of this methodology are compared with benchmark models commonly used in the literature. The results demonstrate that the combination of EMD and SVR can outperform benchmark models significantly, predicting the Standard &amp; Poor’s 500 Index from 30 s to 25 min ahead. The high-frequency components better forecast short-term horizons, whereas the low-frequency components better forecast long-term horizons.

Risks doi: 10.3390/risks6010006

Authors: Kam Wat Kam Yuen Wai Li Xueyuan Wu

This paper extends the work of Yuen et al. (2013), who obtained explicit results for the discount-free Gerber–Shiu function for a compound binomial risk model in the presence of delayed claims and a randomized dividend strategy with a zero threshold level. Specifically, we establish a recursion method for computing the Gerber–Shiu expected discounted penalty function, which entails a number of important quantities in ruin theory, within the framework of the compound binomial aggregate claims with delayed by-claims and randomized dividends payable at a non-negative threshold level.

Risks doi: 10.3390/risks6010005

Authors: Jerome Detemple Yerkin Kitapbayev

This paper studies the valuation of real options when the cost of investment jumps at a random time. Three valuation formulas are derived. The first expresses the value of the project in terms of a collection of knockout barrier claims. The second identifies the premium relative to a project with delayed investment right and prices its components. The last one identifies the premium/discount relative to a project with constant cost equal to the post-jump cost and prices its components. All formulas are in closed form. The behavior of optimal investment boundaries and valuation components are examined.

Risks doi: 10.3390/risks6010004

Authors: Albert Cohen

In the nearly thirty years since Hans Buhlmann (Buhlmann (1987)) set out the notion of the Actuary of the Third Kind, the connection between Actuarial Science (AS) and Mathematical Finance (MF) has been continually reinforced. As siblings in the family of Risk Management techniques, practitioners in both fields have learned a great deal from each other. The collection of articles in this volume is contributed by scholars who are not only experts in areas of AS and MF, but who also present diverse perspectives from both industry and academia. Topics from multiple areas, such as Stochastic Modeling, Credit Risk, Monte Carlo Simulation, and Pension Valuation, among others, that were perhaps thought to be the domain of one type of risk manager are shown time and again to have deep value to other areas of risk management as well. The articles in this collection, in my opinion, contribute techniques, ideas, and overviews of tools that specialists in both AS and MF will find useful and interesting to implement in their work. It is also my hope that this collection will inspire future collaboration between those who seek an interdisciplinary approach to risk management.

Risks doi: 10.3390/risks6010003

Authors: Risks Editorial Office

n/a

Risks doi: 10.3390/risks6010002

Authors: Nick Costanzino Michael Curran

We propose a Traffic Light approach to backtesting Expected Shortfall which is completely consistent with, and analogous to, the Traffic Light approach to backtesting VaR (Value at Risk) initially proposed by the Basel Committee on Banking Supervision in their 1996 consultative document Basle Committee on Banking Supervision (1996). The approach relies on the generalized coverage test for Expected Shortfall developed in Costanzino and Curran (2015).

Risks doi: 10.3390/risks6010001

Authors: Christian Hipp

Optimal dividend payment under a ruin constraint is a two-objective control problem which—in simple models—can be solved numerically by three essentially different methods. The first is based on a modified Bellman equation and the policy improvement method (see Hipp (2003)). In this paper we use explicit formulas for running allowed ruin probabilities which avoid a complete search and speed up and simplify the computation. The second is also a policy improvement method, but without the use of a dynamic equation (see Hipp (2016)). It is based on closed formulas for first entry probabilities and discount factors for the time until first entry. The third is a new, faster and more intuitive method which uses appropriately chosen barrier levels and a closed formula for the corresponding dividend value. Using the running allowed ruin probabilities, a simple test for admissibility—concerning the ruin constraint—is given. All these methods work for the discrete De Finetti model and are applied in a numerical example. The non-stationary Lagrange multiplier method suggested in Hipp (2016), Section 2.2.2, also yields optimal dividend strategies which differ from those of all the other methods, and Lagrange gaps are present here.

Risks doi: 10.3390/risks5040065

Authors: Albert Cohen Nick Costanzino

In this work, we introduce a general framework for incorporating stochastic recovery into structural models. The framework extends the approach to recovery modeling developed in Cohen and Costanzino (2015, 2017) and provides a systematic way to include different recovery processes in a structural credit model. The key observation is that the partial information gap between the firm manager and the market is captured via a distortion of the probability of default. This distortion is computed by what is essentially a Girsanov transformation and reflects the untangling of the recovery process from the default probability. Our framework can be thought of as an extension of Ishizaka and Takaoka (2003) and, in the same spirit as their work, we provide several examples of the framework, including bounded recovery and a jump-to-zero model. One of the nice features of our framework is that, given prices from any one-factor structural model, we provide a systematic way to compute corresponding prices with stochastic recovery. The framework also provides a way to analyze the correlation between Probability of Default (PD) and Loss Given Default (LGD), and the term structure of recovery rates.

Risks doi: 10.3390/risks5040064

Authors: Krzysztof Burnecki Mario Giuricich

We consider the subject of approximating tail probabilities in the general compound renewal process framework, where severity data are assumed to follow a heavy-tailed law (in that only the first moment is assumed to exist). By using the weak convergence of compound renewal processes to α-stable Lévy motion, we derive such weak approximations. Their applicability is then highlighted in the context of an existing, classical, index-linked catastrophe bond pricing model, and in doing so, we specialize these approximations to the case of a compound time-inhomogeneous Poisson process. We emphasize a unique feature of our approximation, in that it only demands finiteness of the first moment of the aggregate loss processes. Finally, a numerical illustration is presented. The behavior of our approximations is compared to both Monte Carlo simulations and first-order single risk loss process approximations, and compares favorably.

Risks doi: 10.3390/risks5040063

Authors: Luca Regis

The aim of the Special Issue is to address some of the main challenges individuals and companies face in managing financial and actuarial risks, when dealing with their investment/retirement or business-related decisions [...]

Risks doi: 10.3390/risks5040062

Authors: Nguyet Nguyen

Future stock prices depend on many internal and external factors that are not easy to evaluate. In this paper, we use the hidden Markov model (HMM) to predict the daily stock prices of three actively traded stocks: Apple, Google, and Facebook, based on their historical data. We first use the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) to choose the number of states for the HMM. We then use the models to predict the close prices of these three stocks using both single-observation and multiple-observation data. Finally, we use the predictions as signals for trading these stocks. The criterion tests showed that an HMM with two states worked best among two, three and four states for the three stocks. Our results also demonstrate that the HMM outperformed the naïve method in forecasting stock prices, and that active traders using the HMM obtained a higher return than those using the naïve forecast for Facebook and Google stocks. The stock price prediction method has a significant impact on stock trading and derivative hedging.
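
The AIC/BIC state-selection step described above can be sketched directly once each candidate HMM has been fitted: the criteria penalise the maximised log-likelihood by the number of free parameters. The log-likelihood values below are hypothetical stand-ins (not the paper's estimates), and the parameter count assumes a one-dimensional Gaussian HMM:

```python
import math

def gaussian_hmm_n_params(m: int) -> int:
    """Free parameters of a 1-D Gaussian HMM with m states:
    (m - 1) initial probabilities + m * (m - 1) transition
    probabilities + 2 * m emission means/variances."""
    return (m - 1) + m * (m - 1) + 2 * m

def select_states(logliks: dict, n_obs: int) -> dict:
    """Pick the state count minimising AIC and BIC, given the
    maximised log-likelihood for each candidate m."""
    aic = {m: 2 * gaussian_hmm_n_params(m) - 2 * ll for m, ll in logliks.items()}
    bic = {m: gaussian_hmm_n_params(m) * math.log(n_obs) - 2 * ll
           for m, ll in logliks.items()}
    return {"AIC": min(aic, key=aic.get), "BIC": min(bic, key=bic.get)}

# Hypothetical maximised log-likelihoods from fitting HMMs with 2-4 states
lls = {2: -1510.0, 3: -1504.0, 4: -1503.0}
print(select_states(lls, n_obs=1000))  # both criteria pick 2 states here
```

BIC's log(n) penalty grows faster than AIC's constant factor of 2, so on long return series BIC tends to favour the smaller state count, consistent with the two-state finding reported above.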

Risks doi: 10.3390/risks5040061

Authors: Peter Carr

Diffusions are widely used in finance due to their tractability. Driftless diffusions are needed to describe ratios of asset prices under a martingale measure. We provide a simple example of a tractable driftless diffusion which also has a bounded state space.

Risks doi: 10.3390/risks5040058

Authors: Arthur Charpentier Arthur David Romuald Elie

In this paper, we investigate the impact of the accident reporting strategy of drivers within a Bonus-Malus system. We exhibit the induced modification of the corresponding class-level transition matrix and derive the optimal reporting strategy for rational drivers. The hunger for bonuses induces optimal thresholds under which drivers do not claim their losses. Mathematical properties of the induced class-level process are studied. A convergent numerical algorithm is provided for computing such thresholds, and realistic numerical applications are discussed.

Risks doi: 10.3390/risks5040060

Authors: Enrique Calderín-Ojeda Kevin Fergusson Xueyuan Wu

Generalized linear models might not be appropriate when the probability of extreme events is higher than that implied by the normal distribution. Extending the method for estimating the parameters of a double Pareto lognormal distribution (DPLN) in Reed and Jorgensen (2004), we develop an EM algorithm for the heavy-tailed double Pareto lognormal generalized linear model. The DPLN distribution is obtained as a mixture of a lognormal distribution with a double Pareto distribution. In this paper, the associated generalized linear model has its location parameter equal to a linear predictor, which is used to model insurance claim amounts for various data sets. The performance is compared with those of the generalized beta (of the second kind) and lognormal distributions.

Risks doi: 10.3390/risks5040059

Authors: Sebastian Fuchs Ruben Schlotter Klaus Schmidt

In the present paper, we study quantile risk measures and their domain. Our starting point is that, for a probability measure Q on the open unit interval and a wide class L_Q of random variables, we define the quantile risk measure ϱ_Q as the map that integrates the quantile function of a random variable in L_Q with respect to Q. The definition of L_Q ensures that ϱ_Q cannot attain the value +∞ and cannot be extended beyond L_Q without losing this property. The notion of a quantile risk measure is a natural generalization of that of a spectral risk measure and provides another view of the distortion risk measures generated by a distribution function on the unit interval. In this general setting, we prove several results on quantile or spectral risk measures and their domain, with special consideration of the expected shortfall. We also present a particularly short proof of the subadditivity of expected shortfall.
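
The quantile-integral view can be made concrete in the best-known special case: taking Q uniform on (α, 1) recovers expected shortfall, i.e., the average of the quantile function over the upper tail of levels. The sketch below (an illustration under the assumption of an empirical loss sample, not the paper's general measure-theoretic construction) approximates that integral on a midpoint grid of levels:

```python
import math

def empirical_quantile(sample, u):
    """Left-continuous empirical quantile (generalized inverse CDF) at level u in (0, 1)."""
    xs = sorted(sample)
    k = min(len(xs) - 1, max(0, math.ceil(u * len(xs)) - 1))
    return xs[k]

def expected_shortfall(sample, alpha, grid=10_000):
    """ES_alpha viewed as the quantile risk measure with Q uniform on (alpha, 1):
    average the quantile function over levels u in (alpha, 1) on a midpoint grid."""
    us = (alpha + (1.0 - alpha) * (j + 0.5) / grid for j in range(grid))
    return sum(empirical_quantile(sample, u) for u in us) / grid

# Losses 1..100: ES at level 0.9 is the mean of the ten largest losses, 95.5
print(expected_shortfall(range(1, 101), 0.9))
```

Replacing the uniform weight on (α, 1) by another probability measure Q on the unit interval turns the same loop into a general quantile (spectral/distortion-type) risk measure, which is the unifying point of the abstract above.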

Risks doi: 10.3390/risks5040057

Authors: Gaurav Khemka Adam Butt

This paper considers an alternative way of structuring stochastic variables in a dynamic programming framework where the model structure dictates that numerical methods of solution are necessary. Rather than estimating integrals within a Bellman equation using quadrature nodes, we use nodes directly from the underlying data. An example of the application of this approach is presented using individual lifetime financial modelling. The results show that data-driven methods lead to the least losses in result accuracy compared to quadrature and Quasi-Monte Carlo approaches, using historical data as a base. These results hold for both a single stochastic variable and multiple stochastic variables. The results are significant for improving the computational accuracy of lifetime financial models and other models that employ stochastic dynamic programming.

Risks doi: 10.3390/risks5040056

Authors: Mohamed Abdelghani Alexander Melnikov

The paper deals with defaultable markets, one of the main research areas of mathematical finance. It proposes a new approach to the theory of such markets, using techniques from the calculus of optional stochastic processes on unusual probability spaces, which has not been presented before. The paper is a foundational paper and contains a number of fundamental results on the modeling of defaultable markets, the pricing and hedging of defaultable claims, and results on the probability of default under such conditions. Moreover, several important examples are presented: a new pricing formula for a defaultable bond and a new pricing formula for a credit default swap. Furthermore, some results on the absence of arbitrage for markets on unusual probability spaces and markets with default are also provided.

]]>Risks doi: 10.3390/risks5040055

Authors: Georges Dionne Sara Malekan

We address the moral hazard problem of securitization using a principal-agent model where the investor is the principal and the lender is the agent. Our model considers structured asset-backed securitization with a credit enhancement (tranching) procedure. We assume that the originator can affect the default probability and the conditional loss distribution. We show that the optimal form of retention must be proportional to the pool default loss even in the absence of systemic risk when the originator can affect the conditional loss given default rate, yet the current regulations propose a constant retention rate.

]]>Risks doi: 10.3390/risks5040054

Authors: Jean-Philippe Boucher Steven Côté Montserrat Guillen

In Pay-As-You-Drive (PAYD) automobile insurance, the premium is fixed based on the distance traveled, while in usage-based insurance (UBI) the driving patterns of the policyholder are also considered. In those schemes, drivers who drive more pay a higher premium compared to those with the same characteristics who drive only occasionally, because the former are more exposed to the risk of accident. In this paper, we analyze the simultaneous effect of the distance traveled and exposure time on the risk of accident by using Generalized Additive Models (GAM). We carry out an empirical application and show that the expected number of claims (1) stabilizes once a certain accumulated distance is reached and (2) is not proportional to the duration of the contract, which contradicts insurance practice. Finally, we propose a rating system that simultaneously takes into account exposure time and distance traveled in the premium calculation. We think this is the trend the automobile insurance market is going to follow with the emergence of telematics data.

]]>Risks doi: 10.3390/risks5040053

Authors: Gareth Peters Rodrigo Targino Mario Wüthrich

The main objective of this work is to develop a detailed step-by-step guide to the development and application of a new class of efficient Monte Carlo methods to solve practically important problems faced by insurers under the new solvency regulations. In particular, a novel Monte Carlo method to calculate capital allocations for a general insurance company is developed, with a focus on coherent capital allocation that is compliant with the Swiss Solvency Test. The data used is based on the balance sheet of a representative stylized company. For each line of business in that company, allocations are calculated for the one-year risk with dependencies based on correlations given by the Swiss Solvency Test. Two different approaches for dealing with parameter uncertainty are discussed and simulation algorithms based on (pseudo-marginal) Sequential Monte Carlo algorithms are described and their efficiency is analysed.

]]>Risks doi: 10.3390/risks5040052

Authors: A. Seetharaman Vikas Kumar Sahu A. Saravanan John Rudolph Raj Indu Niranjan

An empirical study was conducted to determine the impact of different types of risk on the performance management of credit rating agencies (CRAs). The different types of risks were classified as operational, market, business, financial, and credit. All these five variables were analysed to ascertain their impact on the performance of CRAs. In addition, apart from identifying the significant variables, the study focused on setting out a structured framework for future research. The five independent variables were tested statistically using structural equation modelling (SEM). The results indicated that market risk, financial risk, and credit risk have a significant impact on the performance of CRAs, whereas operational risk and business risk, though important, do not have a significant influence. This finding has a significant implication for the examination and inter-firm evaluation of CRAs.

]]>Risks doi: 10.3390/risks5030051

Authors: Carolyn W. Chang Jack S. K. Chang

We propose an integrated approach straddling the actuarial science and the mathematical finance approaches to pricing a default-risky catastrophe reinsurance contract. We first apply an incomplete-market version of the no-arbitrage martingale pricing paradigm to price the reinsurance contract as a martingale by a measure change; then we apply risk loading to price in—as in traditional actuarial practice—market imperfections, the underwriting cycle, and other idiosyncratic factors identified in the practitioner and empirical literatures. This integrated approach is theoretically appealing for its merit of factoring risk premiums into the probability measure, and yet practical for being applicable to price a contract not traded on financial markets. We numerically study the catastrophe pricing effects and find that the reinsurance contract is more valuable when the catastrophe is more severe and the reinsurer’s default risk is lower because of a stronger balance sheet. We also find that the price is more sensitive to the severity of catastrophes than to their arrival frequency, implying that (re)insurers should focus more on hedging severity than arrival frequency in their risk management programs.

]]>Risks doi: 10.3390/risks5030050

Authors: Silvia Romagnoli Simona Santoro

After the financial crisis, the role of uncertainty in decision-making processes has largely been recognized as a new variable that contributes to shaping interest rates and bond prices. Our aim is to discuss the impact of ambiguity on bond interest rates (yields). Starting from the realistic assumption that investors ask for an ambiguity premium depending on the efficacy of government interventions (if any), we arrive at an exponential multi-factor affine model which includes ambiguity, as well as an ambiguous version of the Heath-Jarrow-Morton (HJM) model. As an example, we adopt the realistic economic framework given by Ulrich (2008, 2011) and recover the corresponding ambiguous HJM framework, thus offering a large set of interest rate models enriched with ambiguity. We also give a concrete view of how different simulated scenarios of ambiguity can influence the economic cycle (through rates and bond prices).

]]>Risks doi: 10.3390/risks5030049

Authors: Daoping Yu Vytaras Brazauskas

Over the last decade, researchers, practitioners, and regulators have had intense debates about how to treat the data collection threshold in operational risk modeling. Several approaches have been employed to fit the loss severity distribution: the empirical approach, the “naive” approach, the shifted approach, and the truncated approach. Since each approach is based on a different set of assumptions, different probability models emerge. Thus, model uncertainty arises. The main objective of this paper is to understand the impact of model uncertainty on the value-at-risk (VaR) estimators. To accomplish that, we take the bank’s perspective and study a single risk. Under this simplified scenario, we can solve the problem analytically (when the underlying distribution is exponential) and show that it uncovers similar patterns among VaR estimates to those based on the simulation approach (when data follow a Lomax distribution). We demonstrate that for a fixed probability distribution, the choice of the truncated approach yields the lowest VaR estimates, which may be viewed as beneficial to the bank, whilst the “naive” and shifted approaches lead to higher estimates of VaR. The advantages and disadvantages of each approach and the probability distributions under study are further investigated using a real data set for legal losses in a business unit (Cruz 2002).
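The exponential case described above can be sketched as follows; the threshold, the true mean, and the fitting conventions for the "naive", shifted, and truncated approaches are illustrative assumptions rather than the paper's exact setup.

```python
import math
import random

random.seed(0)
true_mean = 5000.0          # hypothetical ground-up mean loss
t = 2000.0                  # hypothetical data collection threshold
losses = [random.expovariate(1.0 / true_mean) for _ in range(100000)]
observed = [x for x in losses if x > t]   # only losses above t are recorded
n = len(observed)

# "Naive" approach: ignore the threshold and fit to observed data as-is
mean_naive = sum(observed) / n
# "Shifted" approach: subtract the threshold and fit to the excesses
mean_shift = sum(x - t for x in observed) / n
# "Truncated" approach: for the exponential, memorylessness makes the
# truncated MLE of the ground-up mean coincide with the mean excess
mean_trunc = mean_shift

p = 0.99
q = -math.log(1.0 - p)          # exponential quantile factor ln(100)
var_naive = mean_naive * q      # ground-up quantile from the naive fit
var_shift = t + mean_shift * q  # threshold added back onto the excess model
var_trunc = mean_trunc * q      # ground-up quantile from the truncated fit
```

Consistent with the abstract, the truncated approach produces the lowest VaR here, the naive approach the highest.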

]]>Risks doi: 10.3390/risks5030048

Authors: Daniel Leonhardt Antony Ware Rudi Zagst

Energy commodities and their futures naturally show cointegrated price movements. However, there is empirical evidence that the prices of futures with different maturities might have, e.g., different jump behaviours in different market situations. Observing commodity futures over time, there is also evidence for different states of the underlying volatility of the futures. In this paper, we therefore allow for cointegration of the term structure within a multi-factor model, which includes seasonality, as well as joint and individual jumps in the price processes of futures with different maturities. The seasonality in this model is realized via a deterministic function, and the jumps are represented with thinned-out compound Poisson processes. The model also includes a regime-switching approach that is modelled through a Markov chain and extends the class of geometric models. We show how the model can be calibrated to empirical data and give some practical applications.

]]>Risks doi: 10.3390/risks5030047

Authors: Johan Andréasson Pavel Shevchenko

Means-tested pension policies are typical for many countries, and the assessment of policy changes is critical for policy makers. In this paper, we consider the Australian means-tested Age Pension. In 2015, two important changes were made to the popular Allocated Pension accounts: the income means-test is now based on deemed income rather than account withdrawals, and the income-test deduction no longer applies. We examine the implications of the new changes in regard to optimal decisions for consumption, investment and housing. We account for regulatory minimum withdrawal rules that are imposed by regulations on Allocated Pension accounts, as well as the 2017 asset-test rebalancing. The policy changes are considered under a utility-maximising life cycle model solved as an optimal stochastic control problem. We find that the new rules decrease the advantages of planning the consumption in relation to the means-test, while risky asset allocation becomes more sensitive to the asset-test. The difference in optimal drawdown between the old and new policy is only noticeable early in retirement until regulatory minimum withdrawal rates are enforced. However, the amount of extra Age Pension received by many households is now significantly higher due to the new deeming income rules, which benefit wealthier households who previously would not have received Age Pension due to the income-test and minimum withdrawals.

]]>Risks doi: 10.3390/risks5030046

Authors: Knut Aase

We reconsider costs in insurance, and suggest a new type of cost function, which we argue is a natural choice when there are relatively small, but frequent, claims. If a fixed cost is incurred each time a claim is made, we obtain a Pareto optimal deductible even if the cost function does not vary with the indemnity. The classical result says that deductibles appear if and only if costs are variable. This implies that when the claims are relatively small, it is not optimal for the insured to be compensated, since the costs outweigh the benefits and a deductible will naturally occur. When we constrain the contract to contain a cap, a non-trivial deductible is Pareto optimal regardless of the assumptions about the cost structure, which is what is known as an XL-contract.

]]>Risks doi: 10.3390/risks5030044

Authors: Andreas Hermes Stanislaus Maier-Paape

In this paper, the multivariate fractional trading ansatz of money management from Vince (Vince 1990) is discussed. In particular, we prove existence and uniqueness of an “optimal f” of the respective optimization problem under reasonable assumptions on the trade return matrix. This result generalizes a similar result for the univariate fractional trading ansatz. Furthermore, our result guarantees that the multivariate optimal f solutions can always be found numerically by steepest ascent methods.
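A univariate analogue of the steepest-ascent idea can be sketched as follows; the two-outcome trade and the step size are hypothetical, and the multivariate problem treated in the paper is more involved.

```python
import math

# Hypothetical trade with two equally likely outcomes: +80% or -50%
returns = [0.8, -0.5]

def growth(f):
    # Expected log growth of wealth when a fraction f is risked per trade
    return sum(math.log(1.0 + f * r) for r in returns) / len(returns)

def gradient(f):
    # Derivative of the expected log growth with respect to f
    return sum(r / (1.0 + f * r) for r in returns) / len(returns)

# Steepest ascent with a fixed step size, starting from a small fraction
f = 0.1
for _ in range(2000):
    f += 0.1 * gradient(f)
```

For this two-outcome trade the optimum can be checked by hand: setting the gradient to zero gives f = 0.375, and the ascent converges there.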

]]>Risks doi: 10.3390/risks5030045

Authors: George-Jason Siouris Alex Karagrigoriou

In this work, we focus on volatility estimation, which plays a crucial role in risk analysis and management. In order to improve value at risk (VaR) forecasts, we discuss the concept of the low price effect and introduce the low price correction, which requires no additional parameters and takes into account the prices of the asset rather than its returns alone. Judgement on the forecasting quality of the proposed methodology is based on both the relative number of violations and VaR volatility. For illustrative purposes, a real example from the Athens Stock Exchange is fully explored.

]]>Risks doi: 10.3390/risks5030043

Authors: Dimitrina Dimitrova Zvetan Ignatov Vladimir Kaishev

We derive a closed form expression for the probability that a non-decreasing, pure jump stochastic risk process with the order statistics (OS) property will not exit the strip between two non-decreasing, possibly discontinuous, time-dependent boundaries, within a finite time interval. The result yields new expressions for the ruin probability in the insurance and the dual risk models with dependence between the claim severities or capital gains respectively.

]]>Risks doi: 10.3390/risks5030042

Authors: Dorota Toczydlowska Gareth Peters Man Fung Pavel Shevchenko

In this study we develop a multi-factor extension of the family of Lee-Carter stochastic mortality models. We build upon the time, period and cohort stochastic model structure to extend it to include exogenous observable demographic features that can be used as additional factors to improve model fit and forecasting accuracy. We develop a dimension reduction feature extraction framework which (a) employs projection-based techniques of dimensionality reduction; in doing this we also develop (b) a robust feature extraction framework that is amenable to different structures of demographic data; (c) we analyse demographic data sets for patterns of missingness and the impact of such missingness on the feature extraction; (d) we introduce a class of multi-factor stochastic mortality models incorporating time, period, cohort and demographic features, which we develop within a Bayesian state-space estimation framework; and finally (e) we develop an efficient combined Markov chain and filtering framework for sampling the posterior and forecasting. We undertake a detailed case study on the Human Mortality Database demographic data from European countries, using the extracted features to better explain the term structure of mortality in the UK over time for male and female populations when compared to a pure Lee-Carter stochastic mortality model. This demonstrates that our feature extraction framework, and the resulting multi-factor mortality model, improve both in-sample fit and, importantly, out-of-sample mortality forecasts by a non-trivial margin.

]]>Risks doi: 10.3390/risks5030041

Authors: Sabyasachi Guharay KC Chang Jie Xu

Value-at-Risk (VaR) is a well-accepted risk metric in modern quantitative risk management (QRM). The classical Monte Carlo simulation (MCS) approach, denoted henceforth as the classical approach, assumes the independence of loss severity and loss frequency. In practice, this assumption does not always hold true. Through mathematical analyses, we show that the classical approach is prone to significant biases when the independence assumption is violated. This is also corroborated by studying both simulated and real-world datasets. To overcome the limitations and to more accurately estimate VaR, we develop and implement the following two approaches for VaR estimation: the data-driven partitioning of frequency and severity (DPFS) using clustering analysis, and copula-based parametric modeling of frequency and severity (CPFS). These two approaches are verified using simulation experiments on synthetic data and validated on five publicly available datasets from diverse domains; namely, the financial indices data of Standard &amp; Poor’s 500 and the Dow Jones industrial average, chemical loss spills as tracked by the US Coast Guard, Australian automobile accidents, and US hurricane losses. The classical approach estimates VaR inaccurately for 80% of the simulated data sets and for 60% of the real-world data sets studied in this work. Both the DPFS and the CPFS methodologies attain VaR estimates within 99% bootstrap confidence interval bounds for both simulated and real-world data. We provide a process flowchart for risk practitioners describing the steps for using the DPFS versus the CPFS methodology for VaR estimation in real-world loss datasets.
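The classical approach described above, with frequency and severity drawn independently, can be sketched as a plain Monte Carlo simulation; the Poisson and lognormal parameters are hypothetical.

```python
import math
import random

random.seed(7)

def poisson(lam):
    # Knuth's multiplication method for a Poisson draw
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def annual_loss(lam=5.0, mu=8.0, sigma=0.8):
    # Frequency ~ Poisson, severity ~ lognormal (hypothetical parameters),
    # drawn independently -- the "classical" assumption the paper critiques
    n = poisson(lam)
    return sum(random.lognormvariate(mu, sigma) for _ in range(n))

losses = sorted(annual_loss() for _ in range(20000))
var_95 = losses[int(0.95 * len(losses))]   # empirical 95% VaR
var_99 = losses[int(0.99 * len(losses))]   # empirical 99% VaR
```

The DPFS and CPFS approaches in the paper replace the independent draws above with cluster-specific or copula-linked frequency and severity models.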

]]>Risks doi: 10.3390/risks5030040

Authors: Wolf-Dieter Richter

For evaluating the probabilities of arbitrary random events with respect to a given multivariate probability distribution, specific techniques are of great interest. An important two-dimensional high risk limit law is the Gauss-exponential distribution, whose probabilities can be dealt with based on the Gauss–Laplace law. The latter will be considered here as an element of the newly-introduced family of ( p , q ) -spherical distributions. Based on a suitably-defined non-Euclidean arc-length measure on ( p , q ) -circles, we prove geometric and stochastic representations of these distributions and correspondingly distributed random vectors, respectively. These representations allow dealing with the new probability measures similarly to elliptically contoured distributions and more general homogeneous star-shaped ones. This is demonstrated by a generalization of the Box–Muller simulation method. In passing, we prove an extension of the sector and circle number functions.
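For reference, the classical Box–Muller method that the paper generalizes can be sketched as follows.

```python
import math
import random

random.seed(3)

def box_muller(u1, u2):
    # Map two independent U(0,1) draws to two independent N(0,1) draws
    r = math.sqrt(-2.0 * math.log(u1))
    return r * math.cos(2.0 * math.pi * u2), r * math.sin(2.0 * math.pi * u2)

samples = []
for _ in range(50000):
    # 1.0 - random.random() lies in (0, 1], avoiding log(0)
    z1, z2 = box_muller(1.0 - random.random(), random.random())
    samples.extend([z1, z2])

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)
```

The generalization in the paper replaces the Euclidean radius and angle with their (p, q)-spherical counterparts.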

]]>Risks doi: 10.3390/risks5030039

Authors: Mathias Lindholm Filip Lindskog Felix Wahl

This paper provides a complete program for the valuation of aggregate non-life insurance liability cash flows based on claims triangle data. The valuation is fully consistent with the principle of valuation by considering the costs associated with a transfer of the liability to a so-called reference undertaking subject to capital requirements throughout the runoff of the liability cash flow. The valuation program includes complete details on parameter estimation, bias correction and conservative estimation of the value of the liability under partial information. The latter is based on a new approach to the estimation of mean squared error of claims reserve prediction.

]]>Risks doi: 10.3390/risks5030038

Authors: Matthias Fischer Daniel Kraus Marius Pfeuffer Claudia Czado

Measuring interdependence between probabilities of default (PDs) in different industry sectors of an economy plays a crucial role in financial stress testing. Thereby, regression approaches may be employed to model the impact of stressed industry sectors as covariates on other response sectors. We identify vine copula based quantile regression as an eligible tool for conducting such stress tests as this method has good robustness properties, takes into account potential nonlinearities of conditional quantile functions and ensures that no quantile crossing effects occur. We illustrate its performance by a data set of sector specific PDs for the German economy. Empirical results are provided for a rough and a fine-grained industry sector classification scheme. Amongst others, we confirm that a stressed automobile industry has a severe impact on the German economy as a whole at different quantile levels whereas, e.g., for a stressed financial sector the impact is rather moderate. Moreover, the vine copula based quantile regression approach is benchmarked against both classical linear quantile regression and expectile regression in order to illustrate its methodological effectiveness in the scenarios evaluated.

]]>Risks doi: 10.3390/risks5030036

Authors: Hirbod Assa Nikolay Gospodinov

This paper proposes a model-free approach to hedging and pricing in the presence of market imperfections such as market incompleteness and frictions. The generality of this framework allows us to conduct an in-depth theoretical analysis of hedging strategies with a wide family of risk measures and pricing rules, and study the conditions under which the hedging problem admits a solution and pricing is possible. The practical implications of our proposed theoretical approach are illustrated with an application on hedging economic risk.

]]>Risks doi: 10.3390/risks5030037

Authors: John Fry Andrew Brint

In this paper we develop a well-established financial model to investigate whether bubbles were present in opinion polls and betting markets prior to the UK’s vote on EU membership on 23 June 2016. The importance of our contribution is threefold. Firstly, our continuous-time model allows for irregularly spaced time series—a common feature of polling data. Secondly, we build on qualitative comparisons that are often made between market cycles and voting patterns. Thirdly, our approach is theoretically elegant. Thus, where bubbles are found we suggest a suitable adjustment. We find evidence of bubbles in polling data. This suggests they systematically over-estimate the proportion voting for remain. In contrast, bookmakers’ odds appear to show none of this bubble-like over-confidence. However, implied probabilities from bookmakers’ odds appear remarkably unresponsive to polling data that nonetheless indicates a close-fought vote.

]]>Risks doi: 10.3390/risks5030034

Authors: Carlo Maccheroni Samuel Nocito

This work proposes a backtesting analysis that compares the Lee–Carter and the Cairns–Blake–Dowd mortality models, employing Italian data. The mortality data come from the Italian National Statistics Institute (ISTAT) database and span the period 1975–2014, over which we computed back-projections evaluating the performances of the models against real data. We propose three different backtest approaches, evaluating the quality of short-run forecasts versus medium-length ones. We find that neither model was able to capture the improving shock on mortality observed for the male population over the analysed period. Moreover, the results suggest that CBD forecasts are reliable mainly for ages above 75, and that LC forecasts are generally more accurate for these data.

]]>Risks doi: 10.3390/risks5030035

Authors: Iain Clark Saeed Amen

Much of the debate around a potential British exit (Brexit) from the European Union has centred on the potential macroeconomic impact. In this paper, we instead focus on understanding market expectations for price action around the Brexit referendum date. Extracting implied distributions from the GBPUSD option volatility surface, we originally estimated, based on our visual observation of implied probability densities available up to 13 June 2016, that the market expected that a vote to leave could result in a move in the GBPUSD exchange rate from 1.4390 (spot reference on 10 June 2016) down to a range of 1.10 to 1.30, i.e., a 10–25% decline—very probably with highly volatile price action. To quantify this more objectively, we construct a mixture model corresponding to two scenarios for the GBPUSD exchange rate after the referendum vote, one scenario for “remain” and one for “leave”. Calibrating this model to four months of market data, from 24 February to 22 June 2016, we find that a “leave” vote was associated with a predicted devaluation of the British pound to approximately 1.37 USD per GBP, a 4.5% devaluation, and quite consistent with the observed post-referendum exchange rate move down from 1.4877 to 1.3622. We contrast the behaviour of the GBPUSD option market in the run-up to the Brexit vote with that during the 2014 Scottish Independence referendum, finding the potential impact of Brexit to be considerably higher.

]]>Risks doi: 10.3390/risks5030033

Authors: Marta Ferreira Helena Ferreira

Pareto processes are suitable to model stationary heavy-tailed data. Here, we consider the auto-regressive Gaver–Lewis Pareto Process and address a study of the tail behavior. We characterize its local and long-range dependence. We will see that consecutive observations are asymptotically tail independent, a feature that is often misevaluated by the most common extremal models and with strong relevance to the tail inference. This also reveals clustering at “penultimate” levels. Linear correlation may not exist in a heavy-tailed context and an alternative diagnostic tool will be presented. The derived properties relate to the auto-regressive parameter of the process and will provide estimators. A comparison of the proposals is conducted through simulation and an application to a real dataset illustrates the procedure.

]]>Risks doi: 10.3390/risks5020032

Authors: Robert Rietz Evan Cronick Shelby Mathers Matt Pollie

This paper examines the effect of gainsharing provisions on the selection of a discount rate for a defined benefit pension plan. The paper uses a traditional actuarial approach of discounting liabilities using the expected return of the associated pension fund. A stochastic Excel model was developed to simulate the effect of varying investment returns on a pension fund with four asset classes. Lognormal distributions were fitted to historical returns of two of the asset classes: large company stocks and long-term government bonds. A third lognormal distribution was designed to represent the investment returns of alternative investments, such as real estate and private equity. The fourth asset class represented short-term cash investments, and that return was held constant. The following variables were analyzed to determine the relative impact of gainsharing on the selection of a discount rate: hurdle rate, percentage of gainsharing, actuarial asset method smoothing period, and variations in asset allocation. A 50% gainsharing feature can reduce the discount rate for a defined benefit pension plan by 0.5% to more than 2.5%, depending on the gainsharing design and asset allocation.

]]>Risks doi: 10.3390/risks5020031

Authors: Stephen Mildenhall

The literature on capital allocation is biased towards an asset modeling framework rather than an actuarial framework. The asset modeling framework leads to the proliferation of inappropriate assumptions about the effect of insurance line of business growth on aggregate loss distributions. This paper explains why an actuarial analog of the asset volume/return model should be based on a Lévy process. It discusses the impact of different loss models on marginal capital allocations. It shows that Lévy process-based models provide a better fit to the US statutory accounting data, and identifies how parameter risk scales with volume and increases with time. Finally, it shows the data suggest a surprising result regarding the form of insurance parameter risk.

]]>Risks doi: 10.3390/risks5020030

Authors: Nataliya Chukhrova Arne Johannssen

This paper gives a detailed overview of the current state of research in relation to the use of state space models and the Kalman filter in the field of stochastic claims reserving. Most of these state space representations are matrix-based, which complicates their applications. Therefore, to facilitate the implementation of state space models in practice, we present a scalar state space model for cumulative payments, which is an extension of the well-known chain ladder (CL) method. The presented model is distribution-free, forms a basis for determining the entire unobservable lower and upper run-off triangles and can easily be applied in practice using the Kalman filter for prediction, filtering and smoothing of cumulative payments. In addition, the model provides an easy way to find outliers in the data and to determine outlier effects. Finally, an empirical comparison of the scalar state space model, promising prior state space models and some popular stochastic claims reserving methods is performed.
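A generic scalar Kalman filter recursion of the kind such a model relies on can be sketched as follows; the state space parameters and the payment figures are hypothetical, not the paper's calibrated model.

```python
def kalman_step(m, p, y, a=1.0, q=1.0, h=1.0, r=1.0):
    # State equation:  x_t = a * x_{t-1} + noise with variance q
    # Observation:     y_t = h * x_t     + noise with variance r
    m_pred = a * m                       # predicted state mean
    p_pred = a * a * p + q               # predicted state variance
    k = p_pred * h / (h * h * p_pred + r)  # Kalman gain
    m_new = m_pred + k * (y - h * m_pred)  # update with the innovation
    p_new = (1.0 - k * h) * p_pred         # posterior variance
    return m_new, p_new

# Filter a short series of cumulative payments (hypothetical numbers),
# starting from a diffuse prior
ys = [100.0, 210.0, 305.0, 402.0]
m, p = 0.0, 1000.0
for y in ys:
    m, p = kalman_step(m, p, y)
```

Each step needs only scalar arithmetic, which is the practical advantage of the scalar formulation over matrix-based state space representations.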

]]>Risks doi: 10.3390/risks5020029

Authors: Susanna Levantesi Massimiliano Menzietti

Longevity risk constitutes an important risk factor for life insurance companies, and it can be managed through longevity-linked securities. The market of longevity-linked securities is at present far from being complete and does not allow finding a unique pricing measure. We propose a method to estimate the maximum market price of longevity risk depending on the risk margin implicit within the calculation of the technical provisions as defined by Solvency II. The maximum price of longevity risk is determined for a survivor forward (S-forward), an agreement between two counterparties to exchange at maturity a fixed survival-dependent payment for a payment depending on the realized survival of a given cohort of individuals. The maximum prices determined for the S-forwards can be used to price other longevity-linked securities, such as q-forwards. The Cairns–Blake–Dowd model is used to represent the evolution of mortality over time; combined with the information on the risk margin, it enables us to calculate upper limits for the risk-adjusted survival probabilities, the market price of longevity risk and the S-forward prices. Numerical results can be extended to the pricing of other longevity-linked securities.

]]>Risks doi: 10.3390/risks5020028

Authors: Jing Liu Huan Zhang

Motivated by the EU Solvency II Directive, we study the one-year ruin probability of an insurer who makes investments and hence faces both insurance and financial risks. Over a time horizon of one year, the insurance risk is quantified as a nonnegative random variable X equal to the aggregate amount of claims, and the financial risk as a d-dimensional random vector Y consisting of stochastic discount factors of the d financial assets invested. To capture both heavy tails and asymptotic dependence of Y in an integrated manner, we assume that Y follows a standard multivariate regular variation (MRV) structure. As main results, we derive exact asymptotic estimates for the one-year ruin probability for the following cases: (i) X and Y are independent with X of Fréchet type; (ii) X and Y are independent with X of Gumbel type; (iii) X and Y jointly possess a standard MRV structure; (iv) X and Y jointly possess a nonstandard MRV structure.

]]>Risks doi: 10.3390/risks5020027

Authors: Michael Metel Traian A. Pirvu Julian Wong

We prove that the Omega measure, which considers all moments when assessing portfolio performance, is equivalent to the widely used Sharpe ratio under jointly elliptic distributions of returns. Portfolio optimization of the Sharpe ratio is then explored, with an active-set algorithm presented for markets prohibiting short sales. When asymmetric returns are considered, we show that the Omega measure and Sharpe ratio lead to different optimal portfolios.
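The two performance measures compared in the paper can be computed from a return sample as follows; the sample returns are hypothetical. Note that the Omega measure evaluated at the sample mean always equals one, since gains and losses relative to the mean balance exactly.

```python
import statistics

# Hypothetical monthly portfolio returns
returns = [0.05, -0.02, 0.03, 0.07, -0.04, 0.01]

def sharpe(rs, rf=0.0):
    # Sharpe ratio: excess mean return over return volatility
    return (statistics.mean(rs) - rf) / statistics.pstdev(rs)

def omega(rs, theta):
    # Omega measure at threshold theta: expected gains over expected losses
    gains = sum(max(r - theta, 0.0) for r in rs)
    losses = sum(max(theta - r, 0.0) for r in rs)
    return gains / losses

m = statistics.mean(returns)
```

Unlike the Sharpe ratio, which uses only the first two moments, the Omega measure depends on the whole return distribution through the threshold theta, which is why the two can disagree for asymmetric returns.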

]]>Risks doi: 10.3390/risks5020026

Authors: Albert Cohen Nick Costanzino

Building on recent work incorporating recovery risk into structural models by Cohen &amp; Costanzino (2015), we consider the Black-Cox model with an added recovery risk driver. The recovery risk driver arises naturally in the context of imperfect information implicit in the structural framework. This leads to a two-factor structural model we call the Stochastic Recovery Black-Cox model, whereby the asset risk driver At defines the default trigger and the recovery risk driver Rt defines the amount recovered in the event of default. We then price zero-coupon bonds and credit default swaps under the Stochastic Recovery Black-Cox model. Finally, we compare our results with the classic Black-Cox model, give explicit expressions for the recovery risk premium in the Stochastic Recovery Black-Cox model, and detail how the introduction of separate but correlated risk drivers leads to a decoupling of the default and recovery risk premiums in the credit spread. We conclude this work by computing the effect of adding coupons that are paid continuously until default, and by pricing perpetual bonds (consols) in our two-factor firm value model, extending calculations in the seminal paper by Leland (1994).

]]>Risks doi: 10.3390/risks5020025

Authors: Koon-Shing Kwong Yiu-Kuen Tse Wai-Sum Chan

Building a social security system to ensure Singapore residents have peace of mind in funding for retirement has been at the top of the Singapore government’s policy agenda over the last decade. Implementation of the Lifelong Income For the Elderly (LIFE) scheme in 2009 clearly shows that the government spares no effort in improving its pension scheme to boost its residents’ income after retirement. Despite the recent modifications to the LIFE scheme, Singapore residents must still choose between two plans: the Standard and Basic plans. To enhance the flexibility of the LIFE scheme with further streamlining of its fund management, we propose some plan modifications such that scheme members do not face a dichotomy of plan choices. Instead, they select two age parameters: the Payout Age and the Life-annuity Age. This paper discusses the actuarial analysis for determining members’ payouts and bequests based on the proposed age parameters. We analyze the net cash receipts and Internal Rate of Return (IRR) for various plan-parameter configurations. This information helps members make their plan choices. To address cost-of-living increases, we propose to extend the plan to accommodate an annual step-up of monthly payouts. By deferring the Payout Age from 65 to 68, members can enjoy an annual increase of about 2% of the payouts for the same first-year monthly benefits.
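The Internal Rate of Return analysis mentioned above can be sketched with a simple bisection on the net present value; the cash flows are hypothetical, not the LIFE scheme's actual payout profile.

```python
def npv(rate, cashflows):
    # Net present value of cash flows indexed by period t = 0, 1, 2, ...
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0):
    # Bisection on NPV; assumes exactly one sign change on [lo, hi]
    for _ in range(200):
        mid = (lo + hi) / 2.0
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2.0

# Hypothetical profile: pay 100 of premium now, receive 60 in each of
# the next two periods
flows = [-100.0, 60.0, 60.0]
rate = irr(flows)
```

Solving the quadratic in 1/(1+r) by hand gives an IRR of about 13.1% for this profile, which the bisection reproduces.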

]]>