1. Introduction
The development of the theory and practice of risk management is closely related to the emergence of different risks. Some risks have existed since very ancient times. This refers in particular to natural catastrophe risk resulting from, for example, floods, earthquakes or hurricanes. Eventually, risks related to business activities started to play a significant role in risk management.
The evolution of risk management in economics, finance and business can be presented with regard to several driving forces:
- The development of theoretical tools of risk analysis;
- The development of the instruments of risk management;
- The development of regulations in the area of financial risk.
Theoretical tools of financial risk analysis have been in use since the beginning of the twentieth century; they include the following (two of them are recalled in formula form after the list):
- Macaulay duration, used to measure interest rate risk;
- Markowitz portfolio theory and its extension proposed by James Tobin;
- The Capital Asset Pricing Model;
- The option-pricing model developed by Fischer Black, Myron Scholes and Robert Merton;
- The advanced model to estimate volatility developed by Robert Engle.
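As a reminder of what two of these tools look like, the standard textbook forms of Macaulay duration and the Black–Scholes call price are:

```latex
% Macaulay duration of a bond with cash flows CF_t and yield y:
D = \frac{\sum_{t=1}^{T} t \, \frac{CF_t}{(1+y)^t}}{\sum_{t=1}^{T} \frac{CF_t}{(1+y)^t}}

% Black--Scholes price of a European call with spot S, strike K,
% risk-free rate r, volatility \sigma and maturity T:
C = S\,\Phi(d_1) - K e^{-rT}\,\Phi(d_2), \qquad
d_1 = \frac{\ln(S/K) + (r + \sigma^2/2)\,T}{\sigma\sqrt{T}}, \qquad
d_2 = d_1 - \sigma\sqrt{T}
```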
The instruments of financial risk management are mainly derivatives. Although they have existed since at least the nineteenth century, the dynamic development of these instruments has taken place primarily in the last 50 years on exchanges such as the Chicago Board Options Exchange or the Chicago Mercantile Exchange; they have also developed significantly in the OTC markets. This refers in particular to equity derivatives, currency derivatives, interest rate derivatives and commodity derivatives in the form of options, futures, forwards and swaps.
Finally, in the last decade of the twentieth century, other classes of derivatives were introduced: credit derivatives, catastrophe derivatives, weather derivatives and property derivatives. Another direction in the development of derivatives has been the introduction of exotic derivatives, characterized by the enhanced flexibility of their parameters, such as the payoff profile, the number of underlying instruments, the contingency of execution, etc.
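To make the notion of a flexible payoff profile concrete, the following minimal Python sketch contrasts a vanilla payoff with a generic exotic (barrier) one; the contract parameters are purely illustrative and are not drawn from any instrument mentioned above:

```python
def vanilla_call_payoff(s_t: float, strike: float) -> float:
    """Plain European call: pays max(S_T - K, 0) at expiry."""
    return max(s_t - strike, 0.0)

def up_and_out_call_payoff(path: list[float], strike: float, barrier: float) -> float:
    """Exotic (up-and-out barrier) call: worthless if the price path ever touches the barrier."""
    if max(path) >= barrier:
        return 0.0
    return max(path[-1] - strike, 0.0)

print(vanilla_call_payoff(110.0, 100.0))                      # 10.0
print(up_and_out_call_payoff([100, 105, 110], 100.0, 120.0))  # 10.0 (barrier never touched)
print(up_and_out_call_payoff([100, 125, 110], 100.0, 120.0))  # 0.0  (knocked out at 125)
```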
Regulations in the area of financial risk have been introduced mainly for financial institutions, particularly banks. They have been developed by the Basel Committee on Banking Supervision under the well-known Basel Capital Accord, the New Capital Accord and follow-up documents. These documents can be considered good practices, but in many countries, financial supervisory authorities regard them as “soft” regulations. The rules proposed by the Basel Committee initially covered credit risk, but then grew to include market risk, operational risk and liquidity risk. These extensions were introduced after risky events in the financial markets, e.g., the collapse of Barings Bank, the financial crisis of 2007–2008, etc. Very similar regulations, under the name of solvency, were introduced for insurance companies.
In the twenty-first century, much attention has been paid to macroprudential regulations being a response to the threat of systemic risk, as the collapse of large banks (or other financial institutions) could lead to a domino effect in the whole financial sector with an impact on the real economy.
Sometimes, regulations have emerged because they followed the good practices of financial risk management established in both financial institutions and non-financial corporations. These good-practice standards were also introduced in the public sector. Finally, efforts have been made to increase households’ awareness of the need for financial risk management.
The risk management process contains several stages: identification, assessment, steering, monitoring and control. Successful risk steering (mitigation) depends on the assessment (measurement), which depends on the application of data analysis methods. The development of computer technology has enabled the effective use of advanced data analysis methods. This has been possible because of several driving forces:
- The increased speed of computers;
- The growth of the available data;
- The increased speed of data transfers across the world;
- The growth of social network connections.
The data analysis methods used today are often classified as so-called artificial intelligence methods, particularly machine learning methods. Most of these methods were developed many years ago (e.g., in the 1960s) in the area of statistical data analysis. Their effective use has become observable in recent years thanks to the increase in computer speed and the growth of available data.
This has provided great opportunities to analyze large sets of data (Big Data) but has also created some challenges, namely:
- The aggregation of classical data (numerical data and text data) with alternative data such as video files, audio files, images, sensor data, etc.;
- The detection of fake data, mostly from social media;
- The suitable methodology for extreme risk, defined as the risk resulting from an event that has a very low probability of occurrence and leads to very large losses (see the sketch after this list);
- The transparency of machine learning algorithms, which should be understood by the end user;
- The customizability of machine learning algorithms, which should be fitted to the needs of the end user; in risk management, “one size fits all” does not apply;
- The clear assessment of model risk, i.e., the risk resulting from an erroneous model being used in the real world.
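As a minimal illustration of measuring tail risk, the Python sketch below computes historical Value-at-Risk and Expected Shortfall on a heavy-tailed toy sample; methodology for genuinely extreme events typically goes further, e.g., towards extreme value theory:

```python
import numpy as np

def var_es(losses: np.ndarray, alpha: float = 0.99) -> tuple[float, float]:
    """Historical Value-at-Risk and Expected Shortfall at confidence level alpha."""
    var = float(np.quantile(losses, alpha))   # loss exceeded with probability 1 - alpha
    es = float(losses[losses >= var].mean())  # average loss beyond the VaR threshold
    return var, es

rng = np.random.default_rng(1)
losses = rng.standard_t(df=3, size=10_000)    # heavy-tailed toy loss distribution
print(var_es(losses))
```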
All of the mentioned challenges are related to data analysis in economics, finance and business, and are the main topic of the present Special Issue.
2. A Short Review of the Contributions in This Issue
The present Issue contains 11 papers written by authors affiliated with universities in several countries. All papers contain valuable contributions in the area of applications of data analysis methods for risk management in finance, business and economics.
These papers can be classified into several areas, namely:
- Credit risk;
- Systemic risk;
- Risk analysis in financial investments;
- Risk management at the macro level;
- Risk management at the micro level.
Credit risk is a basic type of risk with a long history of analysis by scientific methods. This volume contains four papers discussing different problems of credit risk analysis.
Two of these papers present methods of modelling credit risk.
In their paper, Berent and Rejman (2021) apply a doubly stochastic process (based on the Duffie–Duan proposal) to determine the probability of default. This model can be classified as a so-called intensity model, an alternative to scoring models in predicting the probability of default. The authors use data available for more than 15,000 non-financial companies from emerging markets for the period 2007–2017, obtaining high out-of-sample accuracy ratios.
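For readers unfamiliar with intensity models, the defining relation of the doubly stochastic framework (standard in this literature; the authors' specific model and calibration are more involved) links the default time τ to a stochastic intensity λ_t:

```latex
% Default is the first jump of a Cox process with intensity \lambda_t,
% so the probability of default by horizon t is
P(\tau \le t) = 1 - \mathbb{E}\left[\exp\left(-\int_0^t \lambda_s \,\mathrm{d}s\right)\right]
```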
The paper by Ptak-Chmielewska and Kopciuszewski (2022) also contains insights related to the estimation of the probability of default. The authors apply a Bayesian approach to the recalibration of the probability of default through the Long-Term Average, considering the number and timing of defaults using simulated and empirical data.
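A toy Python sketch of the general idea behind Bayesian PD recalibration, using a conjugate Beta-Binomial model with hypothetical numbers; the paper's actual approach, based on the Long-Term Average and the timing of defaults, is considerably richer:

```python
from scipy.stats import beta

# Prior centred on a hypothetical long-term average PD of 2%.
a0, b0 = 2.0, 98.0
defaults, n_obligors = 30, 1000               # hypothetical observation window

# Conjugate update: posterior is Beta(a0 + defaults, b0 + non-defaults).
posterior = beta(a0 + defaults, b0 + n_obligors - defaults)
pd_point = posterior.mean()                   # recalibrated PD estimate
pd_low, pd_high = posterior.interval(0.9)     # 90% credible interval
print(pd_point, (pd_low, pd_high))
```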
The other two papers in this group concern implementation issues.
The paper by Kochanski (2022) reviews the ROC curve models (binormal, bigamma, bibeta, bilogistic, power and bifractal curves) proposed in the literature, and fits them to actual credit-scoring ROC data (based on publicly available data) in order to determine which models can be used in credit risk management practice.
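As an example of one family from this list, the binormal model posits ROC(t) = Φ(a + b·Φ⁻¹(t)), where t is the false-positive rate; a short Python sketch with illustrative (not fitted) parameters:

```python
import numpy as np
from scipy.stats import norm

def binormal_roc(t: np.ndarray, a: float, b: float) -> np.ndarray:
    """True-positive rate as a function of false-positive rate t."""
    return norm.cdf(a + b * norm.ppf(t))

fpr = np.linspace(1e-6, 1 - 1e-6, 200)
tpr = binormal_roc(fpr, a=1.2, b=0.9)
auc = norm.cdf(1.2 / np.sqrt(1 + 0.9**2))  # closed-form AUC of the binormal model
```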
In his paper, Szepannek (2022) gives a thorough overview of a number of open-source software packages designed for credit scorecard modelling, which can be considered as a link between academic research and practitioner implementation.
Systemic risk has attracted increasing attention in academic research since the financial crisis at the end of the first decade of the twenty-first century. This volume contains two papers discussing the issues of systemic risk.
The paper by Dziwok and Karaś (2021) presents an alternative approach to measuring systemic illiquidity, applicable to countries with frontier and emerging financial markets where other existing methods are not applicable. The proposed measure, called Systemic Illiquidity Noise, uses the Nelson–Siegel–Svensson methodology. The measure is then applied to a set of 10 Central and Eastern European countries in the period 2006–2020. The results show three periods of increased risk in the sample period: the global financial crisis, the European public debt crisis and the COVID-19 pandemic, as well as three sets of countries with different systemic liquidity risk characteristics.
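The Nelson–Siegel–Svensson curve underlying the measure has the standard four-factor form, with level, slope and two curvature terms governed by decay parameters λ₁ and λ₂:

```latex
y(\tau) = \beta_0
        + \beta_1 \frac{1 - e^{-\tau/\lambda_1}}{\tau/\lambda_1}
        + \beta_2 \left( \frac{1 - e^{-\tau/\lambda_1}}{\tau/\lambda_1} - e^{-\tau/\lambda_1} \right)
        + \beta_3 \left( \frac{1 - e^{-\tau/\lambda_2}}{\tau/\lambda_2} - e^{-\tau/\lambda_2} \right)
```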
Rivera-Escobar et al. (2022), in their paper, use three well-known systemic risk measures to analyze systemic risk in the banking sector in Colombia during 2008–2017. The main finding is that the Colombian banking sector does not show significant systemic risk.
Risk in financial investments has been studied for a very long time, and thus many methods to analyze this risk have been developed. This volume contains two papers in this area.
The paper by Górka and Kuziak (2022) concerns ESG (Environmental, Social, Governance) investments. This type of investment has gained a lot of attention in research over the past several years. The paper compares the volatility of the rates of return of selected ESG indices and classical indices, as well as the dependence between them. Conditional volatility models from the GARCH family and tail-dependence coefficients from a copula-based approach are applied. The analysis covers the period from 2007 to 2019. The results confirm stronger dependence between extreme values in the crisis period.
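A minimal sketch of fitting a GARCH(1,1) model of the kind applied in the paper, using the open-source Python `arch` package on placeholder data (the authors' data, model variants and software may differ):

```python
import numpy as np
from arch import arch_model

rng = np.random.default_rng(0)
returns = rng.standard_normal(1000)  # placeholder daily returns (%), not real index data

model = arch_model(returns, vol="GARCH", p=1, q=1, dist="t")  # Student-t errors for fat tails
result = model.fit(disp="off")
cond_vol = result.conditional_volatility  # estimated sigma_t series
print(result.summary())
```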
Arabyat et al. (2022), in their paper, consider the standard problem of stock price forecasting. They put forward a proposal based on an artificial neural network, which transforms the high-dimensional input into a low-dimensional space that is then used for stock price forecasting.
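One common way to realize such a dimension-reducing network is an autoencoder; the PyTorch sketch below is a generic illustration under that assumption and does not reproduce the authors' architecture:

```python
import torch
import torch.nn as nn

class AutoEncoder(nn.Module):
    """Compresses n_features inputs to n_latent dimensions and reconstructs them."""
    def __init__(self, n_features: int, n_latent: int = 4):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 16), nn.ReLU(), nn.Linear(16, n_latent))
        self.decoder = nn.Sequential(nn.Linear(n_latent, 16), nn.ReLU(), nn.Linear(16, n_features))
    def forward(self, x):
        z = self.encoder(x)
        return self.decoder(z), z

x = torch.randn(64, 20)                    # toy batch: 64 samples of 20 price features
model = AutoEncoder(n_features=20)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(100):                       # train on reconstruction error
    opt.zero_grad()
    x_hat, _ = model(x)
    nn.functional.mse_loss(x_hat, x).backward()
    opt.step()
_, latent = model(x)                       # low-dimensional input for a forecaster
```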
The paper by Shevchuk and Kopych (2021) considers risk management at the macro level. The authors estimate exchange rate volatility using the EGARCH(1,1) model and study its impact on business cycle fluctuations in four Central and Eastern European countries. The main findings concern the impact of the components of the Index of Economic Freedom, inflation and crises on exchange rate volatility.
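The EGARCH(1,1) specification named by the authors has the standard Nelson (1991) form, with z_{t-1} denoting the standardized residual:

```latex
\ln \sigma_t^2 = \omega + \beta \ln \sigma_{t-1}^2
               + \alpha \left( |z_{t-1}| - \mathbb{E}|z_{t-1}| \right)
               + \gamma z_{t-1}
```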
This volume also contains two papers in which risk management is studied at the micro level.
The paper by Frączkiewicz-Wronka et al. (2021) considers the relationship between risk management and the financial stability of public hospitals in Poland. The relationship was studied empirically using data from more than 100 hospitals. The results show that risk management practices are positively related to financial stability and that hospitals with well-developed risk management practices are better prepared to keep their financial stability at the required level.
The other paper in this group, by Śliwiński et al. (2022), aims to identify the risk factors affecting bancassurance development in Poland. The set of risk factors comprises factors directly related to the insurance product and factors resulting from the specificity of the bancassurance channel. The study was conducted on the basis of data on gross premiums written in Poland in the years 2004–2019.