Risks, Volume 9, Issue 1 (January 2021) – 27 articles

Cover Story: The cover figure shows the logarithm of crude death rates for France from 1970 to 2017, together with the logarithm of death rates extrapolated from 2018 to 2050 by elastic-net regularization and cross-validation. Regularizing the mortality surface yields a parsimonious mortality model that is robust to mortality perturbations. In the presence of a COVID-type effect, this approach outperforms the P-spline model in terms of prediction and stability.
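
As a rough, self-contained illustration of the workflow described above (not the authors' implementation), the sketch below fits an elastic-net regression with cross-validated penalties to a synthetic log-mortality surface expressed on a polynomial basis in age and calendar year, then extrapolates the fitted surface to 2050; the data, basis degree and penalty grid are all placeholder assumptions.

```python
# Minimal sketch: elastic-net regularization with cross-validated penalties applied
# to a log-mortality surface expressed on a polynomial basis, then extrapolated to
# future calendar years. Synthetic data and basis degree are illustrative assumptions.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(0)
ages, years = np.arange(60, 100), np.arange(1970, 2018)
A, Y = np.meshgrid(ages, years, indexing="ij")
# Toy Gompertz-like log death rates with a linear improvement trend plus noise.
log_m = -9.5 + 0.085 * A - 0.015 * (Y - 1970) + rng.normal(0, 0.03, A.shape)

def design(age, year):                       # rescaled age/period polynomial basis
    a, t = (age - 60) / 40.0, (year - 1970) / 50.0
    return PolynomialFeatures(degree=4, include_bias=False).fit_transform(
        np.column_stack([a.ravel(), t.ravel()]))

model = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.9], cv=5, max_iter=50_000)
model.fit(design(A, Y), log_m.ravel())       # penalties chosen by cross-validation

A_f, Y_f = np.meshgrid(ages, np.arange(2018, 2051), indexing="ij")
log_m_future = model.predict(design(A_f, Y_f)).reshape(A_f.shape)
print("extrapolated log death rate, age 60 in 2050:", round(log_m_future[0, -1], 3))
```
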
15 pages, 510 KiB  
Article
Determinants of Demand for Private Long-Term Care Insurance (Empirical Evidence from Poland)
by Łukasz Jurek and Wioletta Wolańska
Risks 2021, 9(1), 27; https://doi.org/10.3390/risks9010027 - 18 Jan 2021
Cited by 1 | Viewed by 3050
Abstract
The main aim of the article is to evaluate determinants of demand for private long-term care insurance in Poland. Since this type of insurance is not (yet) offered on the market, the demand was examined through a survey in which respondents declared their willingness to purchase it. From the obtained results, it can be concluded that Poles declare a high propensity for private protection in the event of dependence. The vast majority (almost two-thirds) of the respondents were interested in purchasing long-term care insurance, while only one in sixteen respondents showed no such interest. Factors that predominantly determine the willingness to buy are as follows: individual foresight, knowledge about the costs of long-term care, preferences regarding methods of financing care, having children, and the level of education. Full article

23 pages, 459 KiB  
Article
Discrete-Time Risk Models with Claim Correlated Premiums in a Markovian Environment
by Dhiti Osatakul and Xueyuan Wu
Risks 2021, 9(1), 26; https://doi.org/10.3390/risks9010026 - 14 Jan 2021
Cited by 2 | Viewed by 2846
Abstract
In this paper, we consider a discrete-time risk model that allows the premium to be adjusted according to claims experience. This model is inspired by the well-known bonus-malus system in the non-life insurance industry. Two strategies for adjusting periodic premiums are considered: adjustment by aggregate claims or by claim frequency. Recursive formulae are derived to compute the finite-time ruin probabilities, and Lundberg-type upper bounds are also derived to evaluate the ultimate-time ruin probabilities. In addition, we extend the risk model by considering an external Markovian environment in which the claims distributions are governed by an external Markov process, so that the periodic premium adjustments vary when the external environment state changes. We then study the joint distribution of the premium level and the environment state at ruin, given that ruin occurs. Two numerical examples are provided at the end of this paper to illustrate the impact of the initial external environment state, the initial premium level and the initial surplus on the ruin probability. Full article
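
The paper's recursive formulae and Lundberg-type bounds are not reproduced here; the sketch below merely illustrates the type of model studied, estimating a finite-time ruin probability by simulation for a surplus process whose periodic premium moves between a few levels according to the previous period's claim count (a bonus-malus-style rule). All numerical values are assumptions.

```python
# Monte Carlo sketch of a discrete-time risk model in which the periodic premium
# switches between levels according to last period's claim count (bonus-malus style).
# Premium levels, claim distribution and horizon are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
premium_levels = [0.8, 1.0, 1.3]          # discount / base / loaded premium
u0, horizon, n_paths = 5.0, 50, 20_000

ruined = 0
for _ in range(n_paths):
    u, level = u0, 1                      # start at the base premium level
    for _ in range(horizon):
        n_claims = rng.poisson(0.7)
        u += premium_levels[level] - rng.exponential(1.0, n_claims).sum()
        if u < 0:
            ruined += 1
            break
        # move down a level after a claim-free period, up after any claim
        level = max(level - 1, 0) if n_claims == 0 else min(level + 1, 2)

print("estimated finite-time ruin probability:", ruined / n_paths)
```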

17 pages, 444 KiB  
Article
The Assignment Problem in Human Resource Project Management under Uncertainty
by Helena Gaspars-Wieloch
Risks 2021, 9(1), 25; https://doi.org/10.3390/risks9010025 - 12 Jan 2021
Cited by 9 | Viewed by 5485
Abstract
The assignment problem (AP) is a discrete and combinatorial problem in which agents are assigned to tasks in order to maximize efficiency or minimize cost (time). The AP is part of human resource project management (HRPM). The AP optimization model, with deterministic parameters describing agent-task performance, can be solved easily, but it is characteristic of standard, well-known projects carried out in a stable environment. When considering new (innovative) projects, or projects performed in very turbulent times, parameter estimation becomes more complex (in extreme cases, even the use of probability calculus is not recommended). Therefore, we suggest an algorithm combining binary programming with scenario planning and an optimism coefficient that describes the manager's attitude towards risk. The procedure is designed for one-shot decisions (i.e., for situations where the selected alternative is performed only once) and pure strategies (the execution of a weighted combination of several decision variants is not possible). Full article
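
A simplified reading of the suggested procedure (the paper's exact algorithm may differ): scenario costs for each agent-task pair are aggregated with a Hurwicz-style optimism coefficient, and the resulting deterministic assignment problem is then solved; the scenario costs below are invented.

```python
# Sketch: assignment under uncertainty via scenario planning and an optimism
# coefficient (Hurwicz-style aggregation), followed by a deterministic assignment
# solve. A simplified reading of the procedure; scenario costs are made up.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(2)
n_agents, n_tasks, n_scenarios = 4, 4, 3
# cost[s, i, j]: time of agent i on task j under scenario s
cost = rng.uniform(2, 10, size=(n_scenarios, n_agents, n_tasks))

alpha = 0.7                                # optimism coefficient of the decision maker
# For cost minimization, optimism weights the best (lowest) scenario outcome.
hurwicz = alpha * cost.min(axis=0) + (1 - alpha) * cost.max(axis=0)

rows, cols = linear_sum_assignment(hurwicz)      # equivalent to the binary program
for i, j in zip(rows, cols):
    print(f"agent {i} -> task {j} (aggregated cost {hurwicz[i, j]:.2f})")
print("total aggregated cost:", hurwicz[rows, cols].sum().round(2))
```
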
28 pages, 577 KiB  
Article
Optimal Investment in Cyber-Security under Cyber Insurance for a Multi-Branch Firm
by Alessandro Mazzoccoli and Maurizio Naldi
Risks 2021, 9(1), 24; https://doi.org/10.3390/risks9010024 - 12 Jan 2021
Cited by 11 | Viewed by 3676
Abstract
Investments in security and cyber-insurance are two cyber-risk management strategies that can be employed together to optimize the overall security expense. In this paper, we provide a closed form for the optimal investment under a full set of insurance liability scenarios (full liability, limited liability, and limited liability with deductibles) when we consider a multi-branch firm with correlated vulnerability. The insurance component turns out to be the major expense, and it becomes the only recommended approach (i.e., with zero investment in security) when the intrinsic vulnerability is either very low or very high. We also study the robustness of the investment choices when our knowledge of vulnerability and correlation is uncertain, concluding that the uncertainty induced on the investment by either uncertain correlation or uncertain vulnerability is not significant. Full article
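
The paper's closed-form solutions under the different liability scenarios are not reproduced here. As a simplified point of reference, the sketch below numerically minimizes investment plus expected breach loss for a single firm without insurance, using a Gordon-Loeb-type breach probability function; the functional form and all parameter values are assumptions.

```python
# Toy sketch: numerically minimize investment + expected breach loss for one firm,
# using a Gordon-Loeb-type breach probability function S(z, v) = v**(a*z + 1).
# The paper's setting (insurance premiums, liability scenarios, multiple correlated
# branches) is richer; parameters below are illustrative assumptions.
import numpy as np
from scipy.optimize import minimize_scalar

v, L, a = 0.6, 100.0, 0.1   # intrinsic vulnerability, potential loss, investment productivity

def expected_cost(z):
    breach_prob = v ** (a * z + 1.0)
    return z + breach_prob * L

res = minimize_scalar(expected_cost, bounds=(0.0, L), method="bounded")
print(f"optimal investment ~ {res.x:.2f}, expected cost ~ {res.fun:.2f}")
# For very low or very high v the optimum collapses to z = 0, mirroring the paper's
# observation that insurance alone can be preferable in those regimes.
```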

38 pages, 1629 KiB  
Article
The Weak Convergence Rate of Two Semi-Exact Discretization Schemes for the Heston Model
by Annalena Mickel and Andreas Neuenkirch
Risks 2021, 9(1), 23; https://doi.org/10.3390/risks9010023 - 12 Jan 2021
Cited by 5 | Viewed by 3059
Abstract
Inspired by the article Weak Convergence Rate of a Time-Discrete Scheme for the Heston Stochastic Volatility Model, Chao Zheng, SIAM Journal on Numerical Analysis 2017, 55:3, 1243–1263, we studied the weak error of discretization schemes for the Heston model that are based on exact simulation of the underlying volatility process. For both an Euler-type and a trapezoidal-type scheme for the log-asset price, we established weak order one for smooth payoffs without any assumptions on the Feller index of the volatility process. In our analysis, we also observed the usual trade-off between the smoothness assumption on the payoff and the restriction on the Feller index. Moreover, we provided error expansions, which could be used to construct second-order schemes via extrapolation. In this paper, we illustrate our theoretical findings by several numerical examples. Full article
(This article belongs to the Special Issue Computational Finance and Risk Analysis in Insurance)
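
A minimal sketch of the kind of semi-exact scheme analysed: the CIR variance is drawn from its exact noncentral chi-square transition while the log-asset price is advanced with an Euler step. For brevity, zero correlation between the price and variance drivers is assumed here, and the parameters are illustrative rather than the paper's test cases.

```python
# Sketch of a semi-exact Heston discretization: the CIR variance is sampled from its
# exact noncentral chi-square transition, while the log-price uses an Euler step.
# Zero price/variance correlation is assumed for simplicity; parameters are illustrative.
import numpy as np

rng = np.random.default_rng(3)
kappa, theta, sigma, r = 2.0, 0.04, 0.3, 0.01
T, n_steps, n_paths = 1.0, 100, 50_000
dt = T / n_steps

d = 4 * kappa * theta / sigma**2                         # degrees of freedom
c = sigma**2 * (1 - np.exp(-kappa * dt)) / (4 * kappa)   # scale of the CIR transition

V = np.full(n_paths, 0.04)
X = np.zeros(n_paths)                                    # log-price (S0 = 1)
for _ in range(n_steps):
    Z = rng.standard_normal(n_paths)
    X += (r - 0.5 * V) * dt + np.sqrt(V * dt) * Z        # Euler step for the log-price
    lam = V * np.exp(-kappa * dt) / c                    # noncentrality parameter
    V = c * rng.noncentral_chisquare(d, lam)             # exact CIR transition

strike = 1.0
price = np.exp(-r * T) * np.maximum(np.exp(X) - strike, 0).mean()
print("Monte Carlo call price:", round(price, 4))
```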

9 pages, 380 KiB  
Review
Are Sports Bettors Biased toward Longshots, Favorites, or Both? A Literature Review
by Philip W. S. Newall and Dominic Cortis
Risks 2021, 9(1), 22; https://doi.org/10.3390/risks9010022 - 12 Jan 2021
Cited by 9 | Viewed by 4773
Abstract
A large body of literature on the favorite–longshot bias finds that sports bettors in a variety of markets appear to have irrational biases toward either longshots (which offer a small chance of winning a large amount of money) or favorites (which offer a high chance of winning a small amount of money). While early studies in horse racing led to an impression that longshot bias is dominant, favorite bias has also now been found in a variety of sports betting markets. This review proposes that the evidence is consistent with both biases being present in the average sports bettor. Sports betting markets with only two potential outcomes, where the favorite therefore has a probability >0.5 of happening, often produce favorite bias. Sports betting markets with multiple outcomes, where the favorite’s probability is usually <0.5, appear more consistent with longshot bias. The presence of restricted odds ranges within any given betting market provides an explanation for why single studies support, at most, one bias. This literature review highlights how individual sports bettors might possess biases toward both highly likely, and highly unlikely, events, a contradictory view that has not been summarized in detail before. Full article
(This article belongs to the Special Issue Risks in Gambling)
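
As background for how such biases are quantified (a generic illustration, not taken from the review), the snippet below converts decimal odds into implied probabilities and strips out the bookmaker's overround; a longshot bias corresponds to the longshot's implied probability exceeding its true winning frequency.

```python
# Toy example: decimal odds -> implied probabilities, normalized to remove the
# bookmaker margin (overround). Figures are invented for illustration.
decimal_odds = {"favorite": 1.30, "longshot": 4.50}

implied = {k: 1 / o for k, o in decimal_odds.items()}
overround = sum(implied.values())                       # > 1 because of the margin
fair = {k: p / overround for k, p in implied.items()}   # margin-free probabilities

print("overround:", round(overround, 3))
print("margin-free implied probabilities:", {k: round(p, 3) for k, p in fair.items()})
# A longshot bias would show up as the longshot's implied probability exceeding its
# true frequency of winning; a favorite bias is the analogous gap for the favorite.
```
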
22 pages, 1844 KiB  
Article
The Interaction between Banking Sector and Financial Technology Companies: Qualitative Assessment—A Case of Lithuania
by Ruihui Pu, Deimante Teresiene, Ina Pieczulis, Jie Kong and Xiao-Guang Yue
Risks 2021, 9(1), 21; https://doi.org/10.3390/risks9010021 - 11 Jan 2021
Cited by 18 | Viewed by 11842
Abstract
The role of financial technology companies grows every day. On the one hand, this process creates more possibilities for consumers; on the other hand, it is associated with new risks arising in the banking sector. At the beginning of the FinTech era, many analysts discussed the disruptive potential of FinTech in financial services; later, however, the discussion shifted towards cooperation between FinTech companies and banks. Another important point is financial inclusion. The purpose of this study is to analyze the interaction between the banking sector and FinTech companies. We use a case study of Lithuania, where the FinTech sector is growing very intensively. We first review the scientific literature on the main aspects of the FinTech sector. The second part of the article describes the progress of the FinTech sector and presents the main points of the methodology. The research on the FinTech sector in Lithuania focused on strengths, weaknesses, opportunities, and threats (SWOT) analysis, political, economic, social, technological, environmental, and legal (PESTEL) analysis, and the main statistical parameters. We also used correlation and regression analysis together with qualitative assessments. Our results show that, in order to assess the interaction between banking and financial technology, it is better to focus on qualitative assessment, because statistical analysis alone can give misleading results. We find that the two sectors interact with each other and that there is no disruptive effect of FinTech in Lithuania. Full article

18 pages, 1322 KiB  
Article
On the Market-Consistent Valuation of Participating Life Insurance Heterogeneous Contracts under Longevity Risk
by Anna Rita Bacinello, An Chen, Thorsten Sehner and Pietro Millossovich
Risks 2021, 9(1), 20; https://doi.org/10.3390/risks9010020 - 11 Jan 2021
Cited by 5 | Viewed by 3123
Abstract
The purpose of this paper is to conduct a market-consistent valuation of life insurance participating liabilities sold to a population of partially heterogeneous customers under the joint impact of biometric and financial risk. In particular, the heterogeneity between groups of policyholders stems from their offered minimum interest rate guarantees and contract maturities. We analyse the effects of these features on the company’s insolvency while embracing the insurer’s goal to achieve the same expected return for different cohorts of policyholders. Within our extensive numerical analyses, we determine the fair participation rates and other key figures, and discuss the implications for the stakeholders, taking account of various degrees of conservativeness of the insurer when pricing the contracts. Full article

17 pages, 3439 KiB  
Article
An Expectation-Maximization Algorithm for the Exponential-Generalized Inverse Gaussian Regression Model with Varying Dispersion and Shape for Modelling the Aggregate Claim Amount
by George Tzougas and Himchan Jeong
Risks 2021, 9(1), 19; https://doi.org/10.3390/risks9010019 - 8 Jan 2021
Cited by 8 | Viewed by 2301
Abstract
This article presents the Exponential–Generalized Inverse Gaussian regression model with varying dispersion and shape. The EGIG is a general distribution family which, under the adopted modelling framework, can provide the appropriate level of flexibility to fit moderate costs with high frequencies and heavy-tailed claim sizes, as they both represent significant proportions of the total loss in non-life insurance. The model’s implementation is illustrated by a real data application which involves fitting claim size data from a European motor insurer. The maximum likelihood estimation of the model parameters is achieved through a novel Expectation Maximization (EM)-type algorithm that is computationally tractable and is demonstrated to perform satisfactorily. Full article

22 pages, 7752 KiB  
Article
A Bayesian Approach to Measurement of Backtest Overfitting
by Jiří Witzany
Risks 2021, 9(1), 18; https://doi.org/10.3390/risks9010018 - 8 Jan 2021
Cited by 1 | Viewed by 2628
Abstract
Quantitative investment strategies are often selected from a broad class of candidate models estimated and tested on historical data. Standard statistical techniques to prevent model overfitting, such as out-of-sample backtesting, turn out to be unreliable in situations where the selection is based on the results of too many models tested on the holdout sample. There is an ongoing discussion of how to estimate the probability of backtest overfitting and adjust expected performance indicators, such as the Sharpe ratio, in order to properly reflect the effect of multiple testing. We propose a consistent Bayesian approach that yields the desired robust estimates on the basis of a Markov chain Monte Carlo (MCMC) simulation. The approach is tested on a class of technical trading strategies where a naïve approach would select a seemingly profitable strategy. Full article
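
The paper's Bayesian MCMC estimator is not reproduced here; the toy simulation below only illustrates the multiple-testing problem it addresses: selecting the best of many skill-less strategies on an in-sample window produces an inflated Sharpe ratio that largely disappears on the holdout sample.

```python
# Illustration of backtest overfitting via multiple testing: pick the best of many
# skill-less strategies in-sample and observe its Sharpe ratio out-of-sample.
# Purely synthetic data; this is not the paper's Bayesian/MCMC estimator.
import numpy as np

rng = np.random.default_rng(4)
n_strategies, n_days = 2_000, 504
returns = rng.normal(0.0, 0.01, size=(n_strategies, n_days))   # zero true skill

in_sample, holdout = returns[:, :252], returns[:, 252:]
sharpe = lambda r: r.mean(axis=-1) / r.std(axis=-1) * np.sqrt(252)

best = np.argmax(sharpe(in_sample))
print("best in-sample Sharpe :", round(sharpe(in_sample)[best], 2))
print("same strategy, holdout:", round(sharpe(holdout)[best], 2))   # typically near zero
```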

18 pages, 908 KiB  
Article
Minimal Expected Time in Drawdown through Investment for an Insurance Diffusion Model
by Leonie Violetta Brinker
Risks 2021, 9(1), 17; https://doi.org/10.3390/risks9010017 - 6 Jan 2021
Cited by 3 | Viewed by 2630
Abstract
Consider an insurance company whose surplus is modelled by an arithmetic Brownian motion of not necessarily positive drift. Additionally, the insurer has the possibility to invest in a stock modelled by a geometric Brownian motion independent of the surplus. Our key variable is the (absolute) drawdown Δ of the surplus X, defined as the distance to its running maximum X̄. Large, long-lasting drawdowns are unfavourable for the insurance company. We consider the stochastic optimisation problem of minimising the expected time that the drawdown is larger than a positive critical value (weighted by a discounting factor) under investment. A fixed-point argument is used to show that the value function is the unique solution to the Hamilton–Jacobi–Bellman equation related to the problem. It turns out that the optimal investment strategy is given by a piecewise monotone and continuously differentiable function of the current drawdown. Several numerical examples illustrate our findings. Full article
(This article belongs to the Special Issue Interplay between Financial and Actuarial Mathematics)
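
To give a concrete feel for the quantity being minimised (not for the optimal strategy, which the paper derives via the HJB equation), the sketch below estimates by Monte Carlo the expected discounted time the drawdown spends above a critical level when a fixed, constant amount is invested in the independent stock; all parameters and the constant-strategy choice are illustrative assumptions.

```python
# Monte Carlo sketch: expected discounted time in drawdown (> d) for an arithmetic
# Brownian surplus with a fixed constant investment in an independent stock.
# The paper derives the *optimal* (drawdown-dependent) strategy; here the strategy
# is simply held constant, and all parameters are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(5)
mu, sigma = 0.5, 1.0            # surplus drift and volatility
mu_s, sigma_s = 0.07, 0.2       # stock drift and volatility
a, d, delta = 1.0, 1.5, 0.05    # constant investment, critical drawdown, discount rate
T, n_steps, n_paths = 20.0, 2_000, 10_000
dt = T / n_steps

X = np.zeros(n_paths)
running_max = np.zeros(n_paths)
value = np.zeros(n_paths)
t = 0.0
for _ in range(n_steps):
    dW1 = rng.normal(0, np.sqrt(dt), n_paths)
    dW2 = rng.normal(0, np.sqrt(dt), n_paths)
    X += (mu + a * mu_s) * dt + sigma * dW1 + a * sigma_s * dW2
    running_max = np.maximum(running_max, X)
    drawdown = running_max - X
    value += np.exp(-delta * t) * (drawdown > d) * dt
    t += dt

print("estimated expected discounted time in critical drawdown:", round(value.mean(), 3))
```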

16 pages, 537 KiB  
Review
Supply Chain Risk Management: Literature Review
by Amulya Gurtu and Jestin Johny
Risks 2021, 9(1), 16; https://doi.org/10.3390/risks9010016 - 6 Jan 2021
Cited by 96 | Viewed by 66127
Abstract
The risks associated with global supply chain management have created a discourse among practitioners and academics. This is evident from the growing business uncertainties in supply chain management, which pose threats to the entire network flow and the economy. This paper aims to review the existing literature on risk factors in supply chain management in an uncertain and competitive business environment. Papers that contained the word "risk" in their titles, keywords, or abstracts were selected for the theoretical analyses. Supply chain risk management is an integral function of the supply network. It faces unpredictable challenges due to nations' economic policies and globalization, which have raised uncertainty and challenges for supply chain organizations. These significantly affect the financial performance of organizations and the economy of a nation. Debate on supply chain risk management may promote competitiveness in business. Risk mitigation strategies will reduce the impact caused by natural and human-made disasters. Full article

28 pages, 658 KiB  
Article
Retrospective Reserves and Bonus with Policyholder Behavior
by Debbie Kusch Falden and Anna Kamille Nyegaard
Risks 2021, 9(1), 15; https://doi.org/10.3390/risks9010015 - 5 Jan 2021
Cited by 8 | Viewed by 2932
Abstract
Legislation requires insurance companies to project their assets and liabilities in various financial scenarios. Within the setup of with-profit life insurance, we consider retrospective reserves and bonus, and we study the projection of balances with and without policyholder behavior. The projection resides in a system of differential equations for the savings account and the surplus, and the policyholder behavior options of surrender and conversion to free-policy are included. The inclusion of these options results in a structure where the system of differential equations for the savings account and the surplus is non-trivial. We consider a case where we are able to find accurate differential equations, and we suggest an approximation method to project the savings account and the surplus, including policyholder behavior, in general. To highlight the practical applications of the results in this paper, we study a numerical example. Full article
(This article belongs to the Special Issue Interplay between Financial and Actuarial Mathematics)

26 pages, 951 KiB  
Article
Modelling Volatile Time Series with V-Transforms and Copulas
by Alexander J. McNeil
Risks 2021, 9(1), 14; https://doi.org/10.3390/risks9010014 - 5 Jan 2021
Cited by 5 | Viewed by 3556
Abstract
An approach to the modelling of volatile time series using a class of uniformity-preserving transforms for uniform random variables is proposed. V-transforms describe the relationship between quantiles of the stationary distribution of the time series and quantiles of the distribution of a predictable volatility proxy variable. They can be represented as copulas and permit the formulation and estimation of models that combine arbitrary marginal distributions with copula processes for the dynamics of the volatility proxy. The idea is illustrated using a Gaussian ARMA copula process and the resulting model is shown to replicate many of the stylized facts of financial return series and to facilitate the calculation of marginal and conditional characteristics of the model including quantile measures of risk. Estimation is carried out by adapting the exact maximum likelihood approach to the estimation of ARMA processes, and the model is shown to be competitive with standard GARCH in an empirical application to Bitcoin return data. Full article
(This article belongs to the Special Issue Risks: Feature Papers 2020)
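
As a minimal illustration of the uniformity-preserving property (assuming the simplest, symmetric case, which is only a special case of the v-transforms developed in the paper), the sketch below applies V = |2U - 1| to the probability integral transform of heavy-tailed returns and checks that the result is again uniform.

```python
# The simplest v-transform (symmetric case): if U is the PIT of a return whose
# distribution is symmetric about its median, then V = |2U - 1| is the PIT of the
# absolute-return volatility proxy, and V is again uniform on (0, 1).
# Synthetic data only; the paper's general, asymmetric v-transforms are not shown.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
x = stats.t.rvs(df=4, size=100_000, random_state=rng)   # heavy-tailed "returns"
u = stats.t.cdf(x, df=4)                                 # probability integral transform
v = np.abs(2 * u - 1)                                    # v-transform of U

# V should be (approximately) uniform: a Kolmogorov-Smirnov check.
print(stats.kstest(v, "uniform"))
```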

20 pages, 615 KiB  
Article
Quantifying the Model Risk Inherent in the Calibration and Recalibration of Option Pricing Models
by Yu Feng, Ralph Rudd, Christopher Baker, Qaphela Mashalaba, Melusi Mavuso and Erik Schlögl
Risks 2021, 9(1), 13; https://doi.org/10.3390/risks9010013 - 4 Jan 2021
Cited by 2 | Viewed by 2468
Abstract
We focus on two particular aspects of model risk: the inability of a chosen model to fit observed market prices at a given point in time (calibration error) and the model risk due to the recalibration of model parameters (in contradiction to the model assumptions). In this context, we use relative entropy as a pre-metric in order to quantify these two sources of model risk in a common framework, and consider the trade-offs between them when choosing a model and the frequency with which to recalibrate to the market. We illustrate this approach by applying it to the seminal Black/Scholes model and its extension to stochastic volatility, while using option data for Apple (AAPL) and Google (GOOG). We find that recalibrating a model more frequently simply shifts model risk from one type to another, without any substantial reduction of aggregate model risk. Furthermore, moving to a more complicated stochastic model is seen to be counterproductive if one requires a high degree of robustness, for example, as quantified by a 99% quantile of aggregate model risk. Full article
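
For readers unfamiliar with the pre-metric used, the relative entropy (Kullback-Leibler divergence) between discrete distributions p and q is sum_i p_i * log(p_i / q_i); the snippet below computes it for two invented distributions, unrelated to the paper's calibrated option-pricing models.

```python
# Relative entropy (KL divergence) between two toy discrete distributions, the
# pre-metric the paper uses to quantify calibration and recalibration model risk.
# The probabilities below are invented for illustration.
import numpy as np
from scipy.stats import entropy

p = np.array([0.10, 0.25, 0.30, 0.25, 0.10])   # e.g., model-implied distribution
q = np.array([0.05, 0.20, 0.35, 0.28, 0.12])   # e.g., recalibrated distribution

print("D(p || q) =", entropy(p, q))            # sum_i p_i * log(p_i / q_i)
```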

23 pages, 654 KiB  
Article
Bayesian Predictive Analysis of Natural Disaster Losses
by Min Deng, Mostafa Aminzadeh and Min Ji
Risks 2021, 9(1), 12; https://doi.org/10.3390/risks9010012 - 2 Jan 2021
Cited by 4 | Viewed by 2694
Abstract
Different types of natural events hit the United States every year. Data on natural hazards in the US from 1900 to 2016 show an increasing trend in annual natural disaster losses after 1980. Climate change is recognized as one of the factors causing this trend, and predictive analysis of natural losses becomes important for loss prediction and risk prevention as this trend continues. In this paper, we convert natural disaster losses to 2016 dollars using the yearly average Consumer Price Index (CPI), and conduct several tests to verify that the CPI-adjusted amounts of loss from individual natural disasters are independent and identically distributed. Based on these test results, we use various model selection quantities to find the best model for the natural loss severity among three composite distributions, namely Exponential-Pareto, Inverse Gamma-Pareto, and Lognormal-Pareto. These composite distributions model piecewise small losses with high frequency and large losses with low frequency. Remarkably, we make the first attempt to derive an analytical Bayesian estimate of the Lognormal-Pareto distribution based on the selected priors, and show that the Lognormal-Pareto distribution outperforms the other two composite distributions in modeling natural disaster losses. Important risk measures for natural disasters are thereafter derived and discussed. Full article

18 pages, 2680 KiB  
Article
A Study on Link Functions for Modelling and Forecasting Old-Age Survival Probabilities of Australia and New Zealand
by Jacie Jia Liu
Risks 2021, 9(1), 11; https://doi.org/10.3390/risks9010011 - 2 Jan 2021
Cited by 2 | Viewed by 2425
Abstract
Forecasting survival probabilities and life expectancies is an important exercise for actuaries, demographers, and social planners. In this paper, we examine extensively a number of link functions on survival probabilities and model the evolution of period survival curves of lives aged 60 over time for the elderly populations in Australasia. The link functions under examination include the newly proposed gevit and gevmin, which are compared against the traditional ones like probit, complementary log-log, and logit. We project the model parameters and so the survival probabilities into the future, from which life expectancies at old ages can be forecasted. We find that some of these models on survival probabilities, particularly those based on the new links, can provide superior fitting results and forecasting performances when compared to the more conventional approach of modelling mortality rates. Furthermore, we demonstrate how these survival probability models can be extended to incorporate extra explanatory variables such as macroeconomic factors in order to further improve the forecasting performance. Full article
(This article belongs to the Special Issue Mortality Forecasting and Applications)
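
For reference, the traditional links examined in the paper map a survival probability onto an unbounded scale before its evolution is modelled; a minimal sketch of logit, probit and complementary log-log is shown below (the newly proposed gevit and gevmin links are the paper's contribution and are not reproduced).

```python
# The traditional link functions examined in the paper, applied to a survival
# probability p in (0, 1). The new gevit/gevmin links proposed in the paper are
# not reproduced here.
import numpy as np
from scipy.stats import norm

def logit(p):    return np.log(p / (1 - p))
def probit(p):   return norm.ppf(p)
def cloglog(p):  return np.log(-np.log(1 - p))   # complementary log-log

p = np.array([0.50, 0.80, 0.95, 0.99])
for name, g in [("logit", logit), ("probit", probit), ("cloglog", cloglog)]:
    print(name, np.round(g(p), 3))
```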

10 pages, 1218 KiB  
Article
A Method for Assessing Threats to the Economic Security of a Region: A Case Study of Public Procurement in Russia
by Valentina Kravchenko, Tatiana Kudryavtseva and Yuriy Kuporov
Risks 2021, 9(1), 10; https://doi.org/10.3390/risks9010010 - 1 Jan 2021
Cited by 9 | Viewed by 3257
Abstract
The issue of economic security is becoming increasingly urgent. The purpose of this article is to develop a method for assessing threats to the economic security of a Russian region. The method proceeds step by step: first, an element of the region's economic security system is chosen and its descriptive indicators are collected; the indicators are then grouped into admittance-process-result categories and hypotheses about their influence are formulated; the hypotheses are tested with a statistical package and the most significant connections, which can pose a threat to the economic security of the region, are selected; finally, regions are ranked by the level of threats and recommendations are developed. The value of this method is that, by grouping regions (territories of a country) on its basis, individual economic security monitoring tools can be developed, which can improve the efficiency of each region. In this work, the proposed method was tested on public procurement in Russia. A total of 14 indicators of procurement activity were collected for each region of the Russian Federation for the period from 2014 to 2018. Regression models were built on the basis of the grouped indicators, using Ordinary Least Squares (OLS) estimation. The analysis of the pairwise regression models identified four significant relationships between public procurement indicators: positive connections between contracts that require collateral and the percentage of tolerances, between the number of bidders and the number of regular suppliers, between the number of bidders and the average price drop, and between the number of purchases made from a single supplier and the number of contracts concluded without reduction. The greatest risks for the system were associated with the connection between competition and budget savings. It was proposed to rank the analyzed regions into four groups: ineffective government procurement, effective government procurement, and two groups in which procurement threatens the region's economic security, namely high competition with low savings and low competition with high savings. Based on these groups, individual economic security monitoring tools can be developed for each region. Full article

18 pages, 2523 KiB  
Article
Global Stock Selection with Hidden Markov Model
by Nguyet Nguyen and Dung Nguyen
Risks 2021, 9(1), 9; https://doi.org/10.3390/risks9010009 - 31 Dec 2020
Cited by 7 | Viewed by 4620
Abstract
The hidden Markov model (HMM) is a powerful machine-learning method for regime detection, especially in time series data. In this paper, we establish a multi-step procedure for using an HMM to select stocks from the global stock market. First, five important factors of a stock are identified and scored based on its historical performance. Second, the HMM is used to predict the regimes of six global economic indicators and to find past time periods during which these indicators had a combination of regimes similar to the one predicted. Then, we analyze the five stock factors of the All Country World Index (ACWI) in the identified time periods to assign a weighted score to each stock factor and to calculate the composite score of the five factors. Finally, we make a monthly selection of the 10% of global stocks with the highest composite scores. This strategy is shown to outperform strategies relying on the ACWI, any single stock factor, or the simple average of the five stock factors. Full article
(This article belongs to the Special Issue Portfolio Optimization, Risk and Factor Analysis)
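
A minimal sketch of the regime-detection step using a Gaussian hidden Markov model; the hmmlearn package, the synthetic indicator series and the two-regime assumption are illustrative choices, not the authors' data or full multi-step procedure.

```python
# Minimal regime-detection sketch with a Gaussian HMM (hmmlearn), the kind of step
# described for the six global economic indicators. Synthetic indicator data and a
# two-regime assumption are used for illustration only.
import numpy as np
from hmmlearn.hmm import GaussianHMM

rng = np.random.default_rng(7)
# Synthetic monthly indicator: a calm regime followed by a stressed regime.
series = np.concatenate([rng.normal(0.3, 0.5, 120), rng.normal(-0.8, 1.5, 60)])
X = series.reshape(-1, 1)

model = GaussianHMM(n_components=2, covariance_type="full", n_iter=200, random_state=0)
model.fit(X)
regimes = model.predict(X)                 # most likely regime per month
print("regime means:", model.means_.ravel().round(2))
print("last 12 months' regimes:", regimes[-12:])
```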

17 pages, 1614 KiB  
Article
Enhancing Pension Adequacy While Reducing the Fiscal Budget and Creating Essential Capital for Domestic Investments and Growth: Analysing the Risks and Outcomes in the Case of Greece
by Georgios Symeonidis, Platon Tinios and Panos Xenos
Risks 2021, 9(1), 8; https://doi.org/10.3390/risks9010008 - 29 Dec 2020
Cited by 4 | Viewed by 3337
Abstract
Many countries around the world are resorting to mandatory funded components in their multi-pillar pension systems with the purpose of catering for the financial pressure from ageing. This paper aims at analysing the possible replacement rates for such a scheme, by choosing different assumptions and setting the best combined area for the expected result. Then, an approach for analysing the potential for the implementation of such a scheme in Greece is presented along with the actuarially projected expected benefit expenditure and respective accrued capital. A result of the introduction of such a component is expected to be the elevated replacement rate at retirement with a concurrent alleviation of the fiscal burden for the state. The projected scale of savings will also provide domestic financing for investments generating growth. Full article
(This article belongs to the Special Issue Pension Design, Modelling and Risk Management)

14 pages, 953 KiB  
Article
Mining Actuarial Risk Predictors in Accident Descriptions Using Recurrent Neural Networks
by Jean-Thomas Baillargeon, Luc Lamontagne and Etienne Marceau
Risks 2021, 9(1), 7; https://doi.org/10.3390/risks9010007 - 29 Dec 2020
Cited by 7 | Viewed by 2813
Abstract
One crucial task of actuaries is to structure data so that observed events are explained by their inherent risk factors. They are proficient at generalizing important elements to obtain useful forecasts. Although this expertise is beneficial when paired with conventional statistical models, it becomes limited when faced with massive unstructured datasets. Moreover, it does not take advantage of the representation capabilities of recent machine learning algorithms. In this paper, we present an approach to automatically extract textual features from a large corpus that departs from the traditional actuarial approach. We design a neural architecture that can be trained to predict a phenomenon using words represented as dense embeddings. We then extract the features identified as important by the model to assess the relationship between the words and the phenomenon. The technique is illustrated through a case study that estimates the number of cars involved in an accident using the accident's description as input to a Poisson regression model. We show that our technique yields models that perform better and are more interpretable than a usual actuarial data mining baseline. Full article
(This article belongs to the Special Issue Data Mining in Actuarial Science: Theory and Applications)
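
A minimal sketch of the kind of architecture described, assuming a Keras/TensorFlow implementation: word embeddings feed a recurrent layer whose positive output acts as the rate of a Poisson regression on the count of interest. The toy tokenised data and all layer sizes are assumptions, not the authors' trained model.

```python
# Sketch of a recurrent architecture for count prediction from accident descriptions:
# embeddings -> LSTM -> exponential output trained with a Poisson loss.
# Toy tokenized data and all layer sizes are assumptions for illustration.
import numpy as np
import tensorflow as tf

vocab_size, seq_len = 5_000, 60
model = tf.keras.Sequential([
    tf.keras.layers.Embedding(vocab_size, 64),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(1, activation="exponential"),   # positive Poisson rate
])
model.compile(optimizer="adam", loss="poisson")

# Toy data: random token ids and random small counts standing in for the number
# of vehicles mentioned in each accident description.
rng = np.random.default_rng(8)
X = rng.integers(1, vocab_size, size=(256, seq_len))
y = rng.poisson(2.0, size=256)
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:3], verbose=0).ravel())
```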

19 pages, 1089 KiB  
Article
Pension Fund Management, Investment Performance, and Herding in the Context of Regulatory Changes: New Evidence from the Polish Pension System
by Łukasz Dopierała and Magdalena Mosionek-Schweda
Risks 2021, 9(1), 6; https://doi.org/10.3390/risks9010006 - 29 Dec 2020
Cited by 6 | Viewed by 4176
Abstract
The aim of this paper is to assess the impact of reforms introduced in the operation of Polish open pension funds on management style, risk exposure and related investment performance. The article analyzes the impact of the reformed regulations on the herd behavior of fund managers. In particular, we examined whether the elimination of the internal benchmark for fund evaluation impacts the elimination or reduction of herd behavior. We proposed a multi-factor market model to evaluate the performance of funds investing in various types of instruments. Moreover, we used panel estimation to directly take into account the impact of the internal benchmark on herd behavior. Our results indicate that highly regulated funds may slightly outperform passive benchmarks and their unregulated competitors. In the case of Polish open pension funds, limiting investments in Treasury debt instruments clearly resulted in increased risk and volatility of returns. However, it also raised competition between funds and decreased the herd behavior. Additionally, the withdrawal of the mechanism evaluating funds based on the internal benchmark was also important in reducing herd behavior. Full article

18 pages, 1513 KiB  
Article
Parsimonious Predictive Mortality Modeling by Regularization and Cross-Validation with and without Covid-Type Effect
by Karim Barigou, Stéphane Loisel and Yahia Salhi
Risks 2021, 9(1), 5; https://doi.org/10.3390/risks9010005 - 24 Dec 2020
Cited by 4 | Viewed by 3106
Abstract
Predicting the evolution of mortality rates plays a central role for life insurance and pension funds. Standard single-population models typically suffer from two major drawbacks: on the one hand, they use a large number of parameters compared to the sample size and, on the other hand, model choice is still often based on in-sample criteria, such as the Bayesian information criterion (BIC), and therefore not on the ability to predict. In this paper, we develop a model based on a decomposition of the mortality surface into a polynomial basis. Then, we show how regularization techniques and cross-validation can be used to obtain a parsimonious and coherent predictive model for mortality forecasting. We analyze how COVID-19-type effects can affect predictions in our approach and in the classical one. In particular, death rate forecasts tend to be more robust compared to models with a cohort effect, and the regularized model outperforms the so-called P-spline model in terms of prediction and stability. Full article
(This article belongs to the Special Issue Mortality Forecasting and Applications)

26 pages, 657 KiB  
Review
Machine Learning in P&C Insurance: A Review for Pricing and Reserving
by Christopher Blier-Wong, Hélène Cossette, Luc Lamontagne and Etienne Marceau
Risks 2021, 9(1), 4; https://doi.org/10.3390/risks9010004 - 23 Dec 2020
Cited by 32 | Viewed by 14707
Abstract
In the past 25 years, computer scientists and statisticians developed machine learning algorithms capable of modeling highly nonlinear transformations and interactions of input features. While actuaries use GLMs frequently in practice, only in the past few years have they begun studying these newer algorithms to tackle insurance-related tasks. In this work, we aim to review the applications of machine learning to the actuarial science field and present the current state of the art in ratemaking and reserving. We first give an overview of neural networks, then briefly outline applications of machine learning algorithms in actuarial science tasks. Finally, we summarize the future trends of machine learning for the insurance industry. Full article
(This article belongs to the Special Issue Data Mining in Actuarial Science: Theory and Applications)

28 pages, 2016 KiB  
Article
An Actuarial Approach for Modeling Pandemic Risk
by Donatien Hainaut
Risks 2021, 9(1), 3; https://doi.org/10.3390/risks9010003 - 23 Dec 2020
Cited by 7 | Viewed by 4087
Abstract
In this article, a model for pandemic risk and two stochastic extensions are proposed. It is designed for the actuarial valuation of insurance plans providing healthcare and death benefits. The core of our approach relies on a deterministic model that is an efficient alternative to the susceptible-infected-recovered (SIR) method. This model explains the evolution of the first waves of COVID-19 in Belgium, Germany, Italy and Spain. Furthermore, it is analytically tractable for fair pure premium calculation. In a first extension, we replace the time by a gamma stochastic clock. This approach randomizes the timing of the epidemic peak. A second extension consists of adding a Brownian noise and a jump process to explain the erratic evolution of the population of confirmed cases. The jump component allows for local resurgences of the epidemic. Full article
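
For context, the snippet below integrates the classical SIR system that the paper's deterministic model is positioned as an alternative to; the parameters are illustrative, and the paper's own model and its gamma-clock and jump extensions are not reproduced.

```python
# Classical SIR baseline (the benchmark the paper's deterministic model is an
# alternative to). Parameters are illustrative; the paper's model and its stochastic
# extensions are not reproduced here.
import numpy as np
from scipy.integrate import solve_ivp

beta, gamma, N = 0.30, 0.10, 11_000_000   # contact rate, recovery rate, population

def sir(t, y):
    S, I, R = y
    return [-beta * S * I / N, beta * S * I / N - gamma * I, gamma * I]

sol = solve_ivp(sir, (0, 200), [N - 100, 100, 0], t_eval=np.linspace(0, 200, 201))
peak_day = sol.t[np.argmax(sol.y[1])]
print(f"epidemic peak around day {peak_day:.0f} with {sol.y[1].max():,.0f} infected")
```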

15 pages, 1003 KiB  
Article
Are Investors’ Attention and Uncertainty Aversion the Risk Factors for Stock Markets? International Evidence from the COVID-19 Crisis
by Falik Shear, Badar Nadeem Ashraf and Mohsin Sadaqat
Risks 2021, 9(1), 2; https://doi.org/10.3390/risks9010002 - 22 Dec 2020
Cited by 20 | Viewed by 6428
Abstract
In this paper, we examine the impact of investors’ attention to COVID-19 on stock market returns and the moderating effect of national culture on this relationship. Using daily data from 34 countries over the period 23 January to 12 June 2020, and measuring investors’ attention with the Google search volume (GSV) of the word “coronavirus” for each country, we find that investors’ enhanced attention to the COVID-19 pandemic results in negative stock market returns. Further, measuring the national culture with the uncertainty avoidance index (the aspect of national culture which measures the cross-country differences in decision-making under stress and ambiguity), we find that the negative impact of investors’ attention on stock market returns is stronger in countries where investors possess higher uncertainty avoidance cultural values. Our findings imply that uncertainty avoidance cultural values of investors promote financial market instability amid the crisis. Full article
(This article belongs to the Special Issue Financial Stability and Systemic Risk in Times of Pandemic)

21 pages, 2152 KiB  
Article
Use of Neural Networks to Accommodate Seasonal Fluctuations When Equalizing Time Series for the CZK/RMB Exchange Rate
by Zuzana Rowland, George Lazaroiu and Ivana Podhorská
Risks 2021, 9(1), 1; https://doi.org/10.3390/risks9010001 - 22 Dec 2020
Cited by 31 | Viewed by 3873
Abstract
The global nature of the Czech economy means that quantitative knowledge of the influence of the exchange rate provides useful information for all participants in the international economy. Systematic and academic research shows that the issue of estimating the Czech crown/Chinese yuan exchange rate, with consideration for seasonal fluctuations, has yet to be dealt with in detail. The aim of this contribution is to present a methodology based on neural networks that takes seasonal fluctuations into consideration when equalizing time series, using the Czech crown and Chinese yuan as examples. The analysis was conducted using daily information on the Czech crown/Chinese yuan exchange rate over a period of more than nine years, equivalent to 3303 data inputs. Statistica software, version 12 by Dell Inc., was used to process the input data and, subsequently, to generate multi-layer perceptron networks and radial basis function neural networks. Two versions of the neural structures were produced for regression purposes, the second of which used seasonal fluctuations as categorical variables (year, month, day of the month, and day of the week) at the time the value was measured. All the generated and retained networks were able to equalize the analyzed time series, although the second variant demonstrated higher efficiency. The results indicate that the additional variables help the equalized time series to retain order and precision. A further finding of interest is that multi-layer perceptron networks are more efficient than radial basis function neural networks. Full article
(This article belongs to the Special Issue Quantitative Methods in Economics and Finance)
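
A rough scikit-learn analogue of the second variant described (the original study used Statistica's MLP and RBF networks): a multilayer perceptron regressing a daily exchange-rate series on a time index plus one-hot calendar variables (year, month, day of the month, day of the week). The synthetic series and network size are assumptions.

```python
# Rough analogue of the second variant: an MLP that equalizes a daily exchange-rate
# series using a time index plus categorical calendar features (year, month, day of
# month, day of week). Synthetic data; the original study used Statistica's MLP/RBF.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

rng = np.random.default_rng(9)
dates = pd.bdate_range("2010-01-01", periods=2400)
rate = (3.2 + 0.0002 * np.arange(len(dates)) + 0.05 * np.sin(np.arange(len(dates)) / 60)
        + rng.normal(0, 0.01, len(dates)))               # toy CZK/RMB-like series

X = pd.DataFrame({
    "t": np.arange(len(dates)),
    "year": dates.year, "month": dates.month,
    "day": dates.day, "weekday": dates.dayofweek,
})
pre = ColumnTransformer([
    ("num", StandardScaler(), ["t"]),
    ("cal", OneHotEncoder(handle_unknown="ignore"), ["year", "month", "day", "weekday"]),
])
model = make_pipeline(pre, MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
model.fit(X, rate)
print("in-sample R^2:", round(model.score(X, rate), 3))
```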
