Computational Methods for Risk Management in Economics and Finance

A special issue of Risks (ISSN 2227-9091).

Deadline for manuscript submissions: closed (31 October 2019) | Viewed by 85326

Printed Edition Available!
A printed edition of this Special Issue is available here.

Special Issue Editor

Dr. Marina Resta
Department of Economics and Business Studies, University of Genova, 16126 Genova, GE, Italy
Interests: machine learning; portfolio optimization; Bayesian networks; network analysis

Special Issue Information

Dear Colleagues,

Nowadays, computational methods have gained considerable attention in economics and finance as an alternative to conventional analytical and numerical paradigms. This Special Issue is devoted to bringing together contributions from both the theoretical and the application side, with a focus on the use of computational intelligence in finance and economics. We therefore welcome and encourage the submission of high-quality papers related (but not limited) to:

  • Asset Pricing

  • Business Analytics

  • Big Data Analytics

  • Financial Data Mining

  • Economic and Financial Decision Making under Uncertainty

  • Portfolio Management and Optimization

  • Risk Management

  • Credit Risk Modelling

  • Commodity Markets

  • Term Structure Models

  • Trading Systems

  • Hedging Strategies

  • Risk Arbitrage

  • Exotic Options

  • Deep Learning and Artificial Neural Networks

  • Fuzzy Sets, Rough Sets, & Granular Computing

  • Hybrid Systems

  • Support Vector Machines

  • Non-linear Dynamics

Dr. Marina Resta
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to the website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the Special Issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Risks is an international peer-reviewed open access monthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1800 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Computational methods

  • Financial engineering

  • Data analytics

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (9 papers)


Research

Jump to: Review

33 pages, 2751 KiB  
Article
Markov Chain Monte Carlo Methods for Estimating Systemic Risk Allocations
by Takaaki Koike and Marius Hofert
Risks 2020, 8(1), 6; https://doi.org/10.3390/risks8010006 - 15 Jan 2020
Cited by 10 | Viewed by 6253
Abstract
In this paper, we propose a novel framework for estimating systemic risk measures and risk allocations based on Markov Chain Monte Carlo (MCMC) methods. We consider a class of allocations whose jth component can be written as some risk measure of the jth conditional marginal loss distribution given the so-called crisis event. By considering a crisis event as an intersection of linear constraints, this class of allocations covers, for example, conditional Value-at-Risk (CoVaR), conditional expected shortfall (CoES), VaR contributions, and range VaR (RVaR) contributions as special cases. For this class of allocations, analytical calculations are rarely available, and numerical computations based on Monte Carlo (MC) methods often provide inefficient estimates due to the rare-event character of the crisis events. We propose an MCMC estimator constructed from a sample path of a Markov chain whose stationary distribution is the conditional distribution given the crisis event. Efficient constructions of Markov chains, such as the Hamiltonian Monte Carlo and Gibbs sampler, are suggested and studied depending on the crisis event and the underlying loss distribution. The efficiency of the MCMC estimators is demonstrated in a series of numerical experiments.
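The rare-event conditioning at the heart of this approach can be illustrated with a toy Gibbs sampler (a minimal sketch, not the authors' implementation: the bivariate standard normal losses, correlation, threshold, and chain length are all assumptions made here for illustration):

```python
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(0)
rho, c, n = 0.5, 4.0, 20_000  # correlation, crisis threshold, chain length

def sample_trunc(lower, mu, sigma):
    # One draw from N(mu, sigma^2) truncated to [lower, inf)
    a = (lower - mu) / sigma
    return float(truncnorm.rvs(a, np.inf, loc=mu, scale=sigma, random_state=rng))

# Gibbs sampler whose stationary law is (L1, L2) ~ standard bivariate normal
# with correlation rho, conditioned on the crisis event {L1 + L2 >= c}.
sigma = np.sqrt(1 - rho**2)
x1 = x2 = c / 2  # start inside the crisis event
chain = np.empty((n, 2))
for t in range(n):
    x1 = sample_trunc(c - x2, rho * x2, sigma)  # L1 | L2, crisis
    x2 = sample_trunc(c - x1, rho * x1, sigma)  # L2 | L1, crisis
    chain[t] = x1, x2

alloc = chain[:, 0].mean()  # estimate of the allocation E[L1 | L1 + L2 >= c]
```

A crude Monte Carlo estimate would keep only the rare simulations satisfying L1 + L2 ≥ c (roughly one in a hundred under these assumptions), whereas every Gibbs draw lies in the crisis event by construction.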
(This article belongs to the Special Issue Computational Methods for Risk Management in Economics and Finance)

16 pages, 530 KiB  
Article
Developing an Impairment Loss Given Default Model Using Weighted Logistic Regression Illustrated on a Secured Retail Bank Portfolio
by Douw Gerbrand Breed, Tanja Verster, Willem D. Schutte and Naeem Siddiqi
Risks 2019, 7(4), 123; https://doi.org/10.3390/risks7040123 - 13 Dec 2019
Cited by 5 | Viewed by 9408
Abstract
This paper proposes a new method to model loss given default (LGD) for IFRS 9 purposes. We develop two models for the purposes of this paper—LGD1 and LGD2. The LGD1 model is applied to the non-default (performing) accounts, and its value is based on the empirical LGD over a specified reference period, obtained using a lookup table. We also segment this across the most important variables to obtain a more granular estimate. The LGD2 model is applied to defaulted accounts, and we estimate the model by means of an exposure-weighted logistic regression. This newly developed LGD model is tested on a secured retail portfolio from a bank. We compare this weighted logistic regression (WLR) (under the assumption of independence) with generalised estimating equations (GEEs) to test the effect of disregarding the dependence among the repeated observations per account. When this dependence is disregarded in the application of the WLR, the standard errors of the parameter estimates are underestimated. However, the practical effect of this on model accuracy is found to be negligible. The main advantage of the newly developed methodology is the simplicity of this well-known approach, namely logistic regression of binned variables, resulting in a scorecard format.
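The exposure-weighted logistic regression at the core of LGD2 can be sketched with scikit-learn, which accepts observation weights directly (synthetic data; the single loan-to-value driver and all parameters are invented for illustration):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 5_000
ltv = rng.uniform(0.2, 1.2, n)              # risk driver, e.g. loan-to-value
exposure = rng.gamma(2.0, 50_000.0, n)      # exposure at default, used as weight
p_loss = 1.0 / (1.0 + np.exp(-(3.0 * ltv - 2.0)))  # loss probability rises with LTV
loss_event = (rng.random(n) < p_loss).astype(int)  # 1 = account produced a loss

# Exposure-weighted logistic regression: large accounts carry more weight,
# reflecting a monetary rather than a count-based view of loss.
weights = exposure / exposure.mean()        # normalise weights to mean 1
wlr = LogisticRegression()
wlr.fit(ltv.reshape(-1, 1), loss_event, sample_weight=weights)
slope = wlr.coef_[0, 0]                     # should recover a positive LTV effect
```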

15 pages, 1042 KiB  
Article
On Identifying the Systemically Important Tunisian Banks: An Empirical Approach Based on the ΔCoVaR Measures
by Wided Khiari and Salim Ben Sassi
Risks 2019, 7(4), 122; https://doi.org/10.3390/risks7040122 - 12 Dec 2019
Cited by 8 | Viewed by 4051
Abstract
The aim of this work is to assess the systemic risk of Tunisian listed banks. The goal is to identify the institutions that contribute the most to systemic risk and those that are most exposed to it. We use CoVaR, which defines systemic risk as the value at risk (VaR) of a financial institution conditional on the VaR of another institution. Thus, if the CoVaR increases with respect to the VaR, the spillover risk among the institutions also increases. The difference between these measurements is termed ΔCoVaR, and it allows for estimating the exposure and contribution of each bank to systemic risk. The results allow us to classify Tunisian banks in terms of systemic risk involvement. They show that public banks occupy the top places, followed by the two largest private banks in Tunisia. These five banks are the main systemic players in the Tunisian banking sector. It seems that they are the least sensitive to the financial difficulties of existing banks and the most important contributors to the distress of the other banks. This work adds a broader perspective to the microprudential application of regulation, including contagion, proposing a macroprudential vision and a strengthening of regulatory policy. Supervisors could impose close supervision on institutions considered potentially systemic banks. Furthermore, regulations should consider each bank's systemic contribution when defining risk requirements, to minimize the consequences of possible herd behavior.
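A back-of-the-envelope ΔCoVaR on simulated returns (the factor loadings and the crude "conditioning by windowing" are assumptions for illustration; the paper estimates CoVaR from market data):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
bank = rng.standard_normal(n)                        # bank return (standardised)
system = 0.6 * bank + 0.8 * rng.standard_normal(n)   # system return, loaded on the bank

q = 0.05
var_bank = np.quantile(bank, q)  # the bank's own VaR, as a return quantile

def near(x, target, tol=0.05):
    return np.abs(x - target) < tol

# CoVaR: the system's VaR conditional on the bank sitting (a) at its own VaR
# (distress state) versus (b) at its median state.
covar_distress = np.quantile(system[near(bank, var_bank)], q)
covar_median = np.quantile(system[near(bank, np.median(bank))], q)

delta_covar = covar_distress - covar_median  # the bank's systemic contribution
```

A more negative `delta_covar` indicates a larger contribution of the bank's distress to system-wide tail risk.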

20 pages, 2048 KiB  
Article
Target Matrix Estimators in Risk-Based Portfolios
by Marco Neffelli
Risks 2018, 6(4), 125; https://doi.org/10.3390/risks6040125 - 5 Nov 2018
Cited by 4 | Viewed by 3867
Abstract
Portfolio weights solely based on risk avoid estimation errors from the sample mean, but they are still affected by misspecification in the sample covariance matrix. To solve this problem, we shrink the covariance matrix towards the Identity, the Variance Identity, the Single-index model, the Common Covariance, the Constant Correlation, and the Exponential Weighted Moving Average target matrices. Using an extensive Monte Carlo simulation, we offer a comparative study of these target estimators, testing their ability to reproduce the true portfolio weights. We control for the dataset dimensionality and the shrinkage intensity in the Minimum Variance (MV), Inverse Volatility (IV), Equal-Risk-Contribution (ERC), and Maximum Diversification (MD) portfolios. We find that the Identity and Variance Identity targets have very good statistical properties, also being well conditioned in high-dimensional datasets. In addition, these two models are the best targets towards which to shrink: they minimise the misspecification in risk-based portfolio weights, generating estimates very close to the population values. Overall, shrinking the sample covariance matrix helps to reduce weight misspecification, especially in the Minimum Variance and Maximum Diversification portfolios. The Inverse Volatility and Equal-Risk-Contribution portfolios are less sensitive to covariance misspecification and so benefit less from shrinkage.
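Shrinkage towards a scaled-identity target (in the spirit of the Variance Identity) can be sketched as follows (i.i.d. simulated returns, a fixed shrinkage intensity of 0.5, and a minimum-variance portfolio; all choices are illustrative, not the paper's design):

```python
import numpy as np

rng = np.random.default_rng(3)
T, N = 60, 20                                 # short sample, many assets: noisy covariance
returns = 0.02 * rng.standard_normal((T, N))  # i.i.d. assets, so true MV weights are 1/N

S = np.cov(returns, rowvar=False)             # sample covariance matrix
target = np.trace(S) / N * np.eye(N)          # scaled-identity target matrix
delta = 0.5                                   # shrinkage intensity (fixed here)
S_shrunk = (1 - delta) * S + delta * target

def min_var_weights(cov):
    # Closed-form minimum-variance weights: w proportional to inv(Sigma) @ 1
    ones = np.ones(cov.shape[0])
    w = np.linalg.solve(cov, ones)
    return w / w.sum()

w_raw = min_var_weights(S)
w_shrunk = min_var_weights(S_shrunk)
# Shrinkage pulls the MV weights towards the true equal weights 1/N.
```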

31 pages, 3269 KiB  
Article
A General Framework for Portfolio Theory. Part II: Drawdown Risk Measures
by Stanislaus Maier-Paape and Qiji Jim Zhu
Risks 2018, 6(3), 76; https://doi.org/10.3390/risks6030076 - 7 Aug 2018
Cited by 8 | Viewed by 4658
Abstract
The aim of this paper is to provide several examples of convex risk measures necessary for the application of the general framework for portfolio theory of Maier-Paape and Zhu (2018), presented in Part I of this series. As an alternative to classical portfolio risk measures such as the standard deviation, we, in particular, construct risk measures related to the “current” drawdown of the portfolio equity. In contrast to references Chekhlov, Uryasev, and Zabarankin (2003, 2005), Goldberg and Mahmoud (2017), and Zabarankin, Pavlikov, and Uryasev (2014), who used the absolute drawdown, our risk measure is based on the relative drawdown process. Combined with the results of Part I, Maier-Paape and Zhu (2018), this allows us to calculate efficient portfolios based on a drawdown risk measure constraint.
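The relative drawdown process underlying such risk measures is straightforward to compute (a minimal sketch on an invented equity curve, not the authors' construction):

```python
import numpy as np

equity = np.array([100.0, 110.0, 105.0, 120.0, 90.0, 100.0, 115.0])

running_max = np.maximum.accumulate(equity)   # high-water mark of the equity curve
rel_drawdown = 1.0 - equity / running_max     # relative drawdown process, in [0, 1)

current_dd = rel_drawdown[-1]  # the "current" drawdown at the last date
max_dd = rel_drawdown.max()    # maximum relative drawdown over the path
```

On this path the maximum relative drawdown is the 120 → 90 decline, i.e. 25%, while the current drawdown measures the distance from the last high-water mark.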

35 pages, 500 KiB  
Article
A General Framework for Portfolio Theory—Part I: Theory and Various Models
by Stanislaus Maier-Paape and Qiji Jim Zhu
Risks 2018, 6(2), 53; https://doi.org/10.3390/risks6020053 - 8 May 2018
Cited by 13 | Viewed by 5833
Abstract
Utility and risk are two often competing measurements of investment success. We show that the efficient trade-off between these two measurements for investment portfolios occurs, in general, on a convex curve in the two-dimensional space of utility and risk. This is a rather general pattern. The modern portfolio theory of Markowitz (1959) and the capital market pricing model of Sharpe (1964) are special cases of our general framework in which the risk measure is taken to be the standard deviation and the utility function is the identity mapping. Using our general framework, we also recover and extend the results of Rockafellar et al. (2006), which were already an extension of the capital market pricing model allowing for the use of more general deviation measures. This generalized capital asset pricing model also applies, e.g., when an approximation of the maximum drawdown is considered as a risk measure. Furthermore, the consideration of a general utility function allows us to go beyond the “additive” performance measure to a “multiplicative” one of cumulative returns by using the log utility. As a result, the growth-optimal portfolio theory of Lintner (1965) and the leverage space portfolio theory of Vince (2009) can also be understood and enhanced under our general framework. Thus, this general framework unifies several important existing portfolio theories and goes far beyond them. For simplicity of presentation, we phrase everything for a finite underlying probability space and a one-period market model, but generalizations to more complex structures are straightforward.
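The Markowitz special case (risk as standard deviation, utility as the identity on expected return) can be traced numerically to see the convex risk–utility trade-off curve (a toy two-asset, long-only example; the expected returns and covariance matrix are invented for illustration):

```python
import numpy as np

# Two risky assets; risk = portfolio standard deviation, utility = expected return.
mu = np.array([0.05, 0.10])
cov = np.array([[0.04, 0.01],
                [0.01, 0.09]])

ws = np.linspace(0.0, 1.0, 101)  # weight of asset 2, long-only for simplicity
frontier = np.array([
    (np.sqrt(np.array([1 - w, w]) @ cov @ np.array([1 - w, w])),  # risk
     np.array([1 - w, w]) @ mu)                                   # expected return
    for w in ws
])  # the efficient part of this (risk, return) set lies on a convex curve
```

Diversification pushes the attainable risk below that of either asset held alone, which is the mean–variance instance of the general trade-off described above.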

22 pages, 587 KiB  
Article
Modelling and Forecasting Stock Price Movements with Serially Dependent Determinants
by Rasika Yatigammana, Shelton Peiris, Richard Gerlach and David Edmund Allen
Risks 2018, 6(2), 52; https://doi.org/10.3390/risks6020052 - 7 May 2018
Cited by 2 | Viewed by 5848
Abstract
The direction of price movements is analysed under an ordered probit framework, recognising the importance of accounting for discreteness in price changes. Extending the work of Hausman et al. (1972) and Yang and Parwada (2012), this paper focuses on improving the forecast performance of the model while adding a more practical perspective by enhancing flexibility. This is achieved by extending the existing framework to generate short-term, multi-period-ahead forecasts for better decision making, whilst accounting for the serial dependence structure. This approach enhances the flexibility and adaptability of the model to future price changes, particularly targeting risk minimisation. Empirical evidence is provided based on seven stocks listed on the Australian Securities Exchange (ASX). The prediction success varies between 78 and 91 per cent for in-sample and out-of-sample forecasts over both the short term and the long term.
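An ordered probit for discrete price-move categories can be fitted by maximum likelihood in a few lines (a self-contained toy with one simulated determinant and invented thresholds; the paper's model, with serially dependent determinants, is considerably richer):

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(4)
n = 3_000
x = rng.standard_normal(n)                  # one (hypothetical) determinant
latent = 1.0 * x + rng.standard_normal(n)   # latent price pressure
y = np.digitize(latent, (-0.5, 0.5))        # 0 = down, 1 = flat, 2 = up

def nll(params):
    # Negative log-likelihood of the three-category ordered probit
    beta, c1, gap = params
    c2 = c1 + np.exp(gap)                   # reparameterise to enforce c1 < c2
    z = beta * x
    p = np.column_stack([norm.cdf(c1 - z),
                         norm.cdf(c2 - z) - norm.cdf(c1 - z),
                         1.0 - norm.cdf(c2 - z)])
    return -np.log(p[np.arange(n), y].clip(1e-12)).sum()

fit = minimize(nll, x0=np.zeros(3), method="Nelder-Mead",
               options={"maxiter": 2000})
beta_hat = fit.x[0]  # should be near the true slope of 1.0
```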

20 pages, 620 KiB  
Article
Credit Risk Analysis Using Machine and Deep Learning Models
by Peter Martey Addo, Dominique Guegan and Bertrand Hassani
Risks 2018, 6(2), 38; https://doi.org/10.3390/risks6020038 - 16 Apr 2018
Cited by 161 | Viewed by 38875
Abstract
Due to the advanced technology associated with Big Data, data availability and computing power, most banks and lending institutions are renewing their business models. Credit risk prediction, monitoring, model reliability and effective loan processing are key to decision-making and transparency. In this work, we build binary classifiers based on machine and deep learning models on real data to predict loan default probability. The top 10 most important features from these models are selected and then used in the modeling process to test the stability of the binary classifiers by comparing their performance on separate data. We observe that the tree-based models are more stable than the models based on multilayer artificial neural networks. This opens several questions relative to the intensive use of deep learning systems in enterprises.
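The stability check described, selecting the top 10 features and refitting on them, can be sketched as follows (synthetic data and a random forest stand in for the paper's real data and model zoo; all parameters are illustrative):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic imbalanced "loan default" data: 20 features, 5 of them informative.
X, y = make_classification(n_samples=4_000, n_features=20, n_informative=5,
                           weights=[0.9], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc_full = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])

# Keep the 10 features with the highest impurity importance and refit,
# checking that performance stays stable on the reduced feature set.
top10 = np.argsort(rf.feature_importances_)[-10:]
rf10 = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr[:, top10], y_tr)
auc_top10 = roc_auc_score(y_te, rf10.predict_proba(X_te[:, top10])[:, 1])
```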

Review

Jump to: Research

25 pages, 2834 KiB  
Review
Credit Risk Meets Random Matrices: Coping with Non-Stationary Asset Correlations
by Andreas Mühlbacher and Thomas Guhr
Risks 2018, 6(2), 42; https://doi.org/10.3390/risks6020042 - 23 Apr 2018
Cited by 2 | Viewed by 4987
Abstract
We review recent progress in modeling credit risk for correlated assets. We employ a new interpretation of the Wishart model for random correlation matrices to model non-stationary effects. We then use the Merton model in which default events and losses are derived from the asset values at maturity. To estimate the time development of the asset values, the stock prices are used, the correlations of which have a strong impact on the loss distribution, particularly on its tails. These correlations are non-stationary, which also influences the tails. We account for the asset fluctuations by averaging over an ensemble of random matrices that models the truly existing set of measured correlation matrices. As a most welcome side effect, this approach drastically reduces the parameter dependence of the loss distribution, allowing us to obtain very explicit results, which show quantitatively that the heavy tails prevail over diversification benefits even for small correlations. We calibrate our random matrix model with market data and show how it is capable of grasping different market situations. Furthermore, we present numerical simulations for concurrent portfolio risks, i.e., for the joint probability densities of losses for two portfolios. For the convenience of the reader, we give an introduction to the Wishart random matrix model.
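The ensemble-averaging idea can be illustrated by drawing Wishart-distributed random correlation matrices that fluctuate around a fixed average correlation (a toy sketch; the dimension, degrees of freedom, and average correlation of 0.3 are assumptions, not the paper's calibration):

```python
import numpy as np
from scipy.stats import wishart

rng = np.random.default_rng(5)
N, K = 5, 8  # N assets; K degrees of freedom control the correlation fluctuations
C0 = np.full((N, N), 0.3) + 0.7 * np.eye(N)  # average correlation matrix

def random_corr():
    # W ~ Wishart(K, C0 / K) has mean C0; renormalise to unit diagonal
    W = wishart.rvs(df=K, scale=C0 / K, random_state=rng)
    d = np.sqrt(np.diag(W))
    return W / np.outer(d, d)

ensemble = np.array([random_corr() for _ in range(500)])
mean_corr = ensemble.mean(axis=0)      # close to C0 on average
spread = ensemble[:, 0, 1].std()       # sizeable fluctuation around 0.3
```

Averaging losses over such an ensemble, rather than using a single fixed correlation matrix, is what produces the heavy tails discussed in the abstract.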
