Recovering Historical Inflation Data from Postage Stamps Prices*J. Risk Financial Manag.* **2017**, *10*(4), 21; doi:10.3390/jrfm10040021 - 14 November 2017**Abstract **

For many developing countries, historical inflation figures are rarely available. We propose a simple method that aims to recover such figures using the prices of postage stamps issued in earlier years. We illustrate our method for Suriname, where annual inflation rates are available from 1961 until 2015, and where fluctuations in inflation rates are prominent. We estimate the inflation rates for the sample period 1873 to 1960. Our main finding is that high-inflation periods usually last no longer than two or three years. An Exponential Generalized Autoregressive Conditional Heteroscedasticity (EGARCH) model for the recent sample and for the full sample with the recovered inflation rates shows the relevance of adding the recovered data.
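As a rough illustration of the variance dynamics such a model captures, the sketch below simulates an EGARCH(1,1) process in Python; the parameter values are purely illustrative and are not the paper's estimates.

```python
import numpy as np

def simulate_egarch(n, omega=-0.1, alpha=0.2, gamma=-0.05, beta=0.95, seed=0):
    """ln(sigma_t^2) = omega + alpha*(|z_{t-1}| - E|z|) + gamma*z_{t-1} + beta*ln(sigma_{t-1}^2)."""
    rng = np.random.default_rng(seed)
    e_abs_z = np.sqrt(2.0 / np.pi)       # E|z| for a standard normal shock z
    log_var = omega / (1.0 - beta)       # start at the unconditional log-variance
    x = np.empty(n)
    for t in range(n):
        sigma = np.exp(0.5 * log_var)
        z = rng.standard_normal()
        x[t] = sigma * z
        log_var = omega + alpha * (abs(z) - e_abs_z) + gamma * z + beta * log_var
    return x

series = simulate_egarch(1000)
print(series.std())
```

Because the recursion is in log-variance, the asymmetry term `gamma` can capture different responses to positive and negative shocks without positivity constraints on the coefficients.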

A Risk Management Approach for a Sustainable Cloud Migration*J. Risk Financial Manag.* **2017**, *10*(4), 20; doi:10.3390/jrfm10040020 - 9 November 2017**Abstract **

Cloud computing is not just about resource sharing, cost savings and optimisation of business performance; it also raises fundamental concerns about how businesses need to respond to the risks and challenges of migration. Managing risks is critical for sustainable cloud adoption, and it spans several dimensions such as cost, practising green IT, data quality, continuity of services to users and clients, and the guarantee of tangible benefits. This paper presents a risk management approach for a sustainable cloud migration. We consider four dimensions of sustainability, i.e., economic, environmental, social and technological, to determine the viability of the cloud for the business context. Risks are systematically identified and analysed based on the existing in-house controls and the cloud service provider's offerings. We use Dempster–Shafer (D-S) theory to measure the adequacy of controls and apply a semi-quantitative approach to perform risk analysis based on the theory of belief. The risk exposure for each sustainability dimension allows us to determine the viability of cloud migration. A practical migration use case is considered to demonstrate the applicability of our work. The results identify the risk exposure and the recommended controls for risk mitigation. We conclude that risks depend on the specific migration case and that both the Cloud Service Provider (CSP) and users are responsible for risk mitigation. Inherent risks can evolve as a result of the cloud migration.
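The core operation of D-S evidence theory is Dempster's rule of combination, which fuses two independent bodies of evidence over the same frame of discernment. The sketch below is a minimal, self-contained illustration; the focal elements and mass values for the control-adequacy example are hypothetical, not taken from the paper.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule: m1, m2 map frozensets (focal elements) to mass values."""
    combined, conflict = {}, 0.0
    for (b, mb), (c, mc) in product(m1.items(), m2.items()):
        inter = b & c
        if inter:
            combined[inter] = combined.get(inter, 0.0) + mb * mc
        else:
            conflict += mb * mc                  # mass falling on the empty set
    # renormalise by 1 - K, where K is the total conflicting mass
    return {a: v / (1.0 - conflict) for a, v in combined.items()}

A, B = frozenset({'adequate'}), frozenset({'inadequate'})
theta = A | B                                    # the full frame of discernment
m_provider = {A: 0.6, theta: 0.4}                # evidence from CSP offerings (hypothetical)
m_inhouse = {A: 0.5, B: 0.2, theta: 0.3}         # evidence from in-house controls (hypothetical)
m = combine(m_provider, m_inhouse)
print(m)
```

Combining the two bodies of evidence sharpens the belief that controls are adequate while retaining some mass on the full frame, which is how the theory expresses residual ignorance.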

Bivariate Kumaraswamy Models via Modified FGM Copulas: Properties and Applications*J. Risk Financial Manag.* **2017**, *10*(4), 19; doi:10.3390/jrfm10040019 - 1 November 2017**Abstract **

A copula is a useful tool for constructing bivariate and/or multivariate distributions. In this article, we consider a new modified class of FGM (Farlie–Gumbel–Morgenstern) bivariate copula for constructing several different bivariate Kumaraswamy-type copulas and discuss their structural properties, including dependence structures. It is established that construction of bivariate distributions by this method allows for greater flexibility in the values of Spearman's correlation coefficient $\rho$ and Kendall's $\tau$.
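For context, the classical (unmodified) FGM copula is $C_\theta(u,v) = uv[1 + \theta(1-u)(1-v)]$ with $\theta \in [-1,1]$, which restricts Kendall's $\tau$ to $2\theta/9$, i.e., to $[-2/9, 2/9]$; this narrow dependence range is the limitation that modified classes aim to relax. A small sketch samples the classical FGM copula by the conditional-distribution method and checks $\tau$ empirically:

```python
import numpy as np
from scipy.stats import kendalltau

def sample_fgm(n, theta, seed=0):
    """Draw n pairs from the FGM copula C(u,v) = uv[1 + theta(1-u)(1-v)].

    v is obtained by inverting the conditional CDF
    C(v|u) = v[1 + theta(1-2u)(1-v)], a quadratic in v.
    """
    rng = np.random.default_rng(seed)
    u, w = rng.random(n), rng.random(n)
    a = theta * (1.0 - 2.0 * u)
    disc = np.sqrt((1.0 + a) ** 2 - 4.0 * a * w)
    small = np.abs(a) < 1e-12                    # near-independent case: v = w
    denom = np.where(small, 1.0, 2.0 * a)
    v = np.where(small, w, (1.0 + a - disc) / denom)
    return u, v

u, v = sample_fgm(20000, theta=1.0)
tau, _ = kendalltau(u, v)
print(tau)   # theory for the classical FGM: 2*theta/9, about 0.222 at theta = 1
```

Even at the maximal $\theta = 1$, the empirical $\tau$ stays near $2/9$, which illustrates why a modified FGM class with a wider attainable range is of interest.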

Financial Market Integration: Evidence from Cross-Listed French Firms*J. Risk Financial Manag.* **2017**, *10*(4), 18; doi:10.3390/jrfm10040018 - 12 October 2017**Abstract **

Using high-frequency data, we investigate the intraday volatility and volume behavior of eight cross-listed French firms. There is a two-hour “overlap” period during which French firms are traded in Paris and their related American Depositary Receipts (ADRs) are traded in New York. Using concurrent 15-min returns, this article examines the extent of market integration—defined as prices in both markets reflecting the same fundamental information—involving these firms. Our results suggest that these markets are not perfectly integrated. A significant rise in volatility and volume is observed during the two-hour “overlap” period, which suggests the existence of informed trading. An error correction model (ECM) is then used to examine changes in the prices of French firms in Paris and New York. These temporary changes appear to converge over time.

GARCH Modelling of Cryptocurrencies*J. Risk Financial Manag.* **2017**, *10*(4), 17; doi:10.3390/jrfm10040017 - 1 October 2017**Abstract **

With the exception of Bitcoin, there appears to be little or no literature on GARCH modelling of cryptocurrencies. This paper provides the first GARCH modelling of the seven most popular cryptocurrencies. Twelve GARCH models are fitted to each cryptocurrency, and their fits are assessed in terms of five criteria. Conclusions are drawn on the best fitting models, forecasts and acceptability of value at risk estimates.
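A minimal sketch of what fitting one such model involves: maximum-likelihood estimation of a Gaussian GARCH(1,1) on simulated returns, with AIC as one possible fit criterion. In practice a dedicated package (e.g., R's rugarch or Python's arch) would be used; this hand-rolled version only illustrates the mechanics on placeholder data.

```python
import numpy as np
from scipy.optimize import minimize

def garch11_nll(params, r):
    """Gaussian negative log-likelihood of GARCH(1,1): s2_t = w + a*r_{t-1}^2 + b*s2_{t-1}."""
    w, a, b = params
    s2 = np.empty_like(r)
    s2[0] = r.var()                              # initialise at the sample variance
    for t in range(1, len(r)):
        s2[t] = w + a * r[t - 1] ** 2 + b * s2[t - 1]
    return 0.5 * np.sum(np.log(2 * np.pi * s2) + r ** 2 / s2)

rng = np.random.default_rng(1)
r = rng.standard_normal(500) * 0.02              # placeholder daily returns
res = minimize(garch11_nll, x0=[1e-5, 0.05, 0.9], args=(r,),
               bounds=[(1e-8, None), (0.0, 1.0), (0.0, 1.0)])
aic = 2 * len(res.x) + 2 * res.fun               # one of several model-selection criteria
print(res.x, aic)
```

Fitting several variance specifications to the same series and comparing criteria such as AIC is the basic model-selection loop the abstract describes.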

Global Hedging through Post-Decision State Variables*J. Risk Financial Manag.* **2017**, *10*(3), 16; doi:10.3390/jrfm10030016 - 9 August 2017**Abstract **

Unlike delta-hedging or similar methods based on Greeks, global hedging is an approach that optimizes some terminal criterion that depends on the difference between the value of a derivative security and that of its hedging portfolio at maturity or exercise. Global hedging methods in discrete time can be implemented using dynamic programming. They provide optimal strategies at all rebalancing dates for all possible states of the world, and can easily accommodate transaction fees and other frictions. However, considering transaction fees in the dynamic programming model requires the inclusion of an additional state variable, which translates into a significant increase of the computational burden. In this short note, we show how a decomposition technique based on the concept of post-decision state variables can be used to reduce the complexity of the computations to the level of a problem without transaction fees. The latter complexity reduction allows for substantial gains in terms of computing time and should therefore contribute to increasing the applicability of global hedging schemes in practice where the timely execution of portfolio rebalancing trades is crucial.

Trade Openness and Bank Risk-Taking Behavior: Evidence from Emerging Economies*J. Risk Financial Manag.* **2017**, *10*(3), 15; doi:10.3390/jrfm10030015 - 29 July 2017**Abstract **

In this paper, we examine the impact of trade openness on bank risk-taking behavior. Using a panel dataset of 291 banks from 37 emerging countries over the period from 1998 to 2012, we find that higher trade openness decreases bank risk-taking. The results are robust when we use alternative bank risk-taking proxies and alternative estimation methods. We argue that trade openness provides diversification opportunities to banks in lending activities, which decrease overall bank risk. Further to this end, we observe that higher trade openness helps domestic banks to smooth out income volatility and decreases the impact of a financial crisis on banks.

Safety Evaluation of Evacuation Routes in Central Tokyo Assuming a Large-Scale Evacuation in Case of Earthquake Disasters*J. Risk Financial Manag.* **2017**, *10*(3), 14; doi:10.3390/jrfm10030014 - 27 June 2017**Abstract **

The present study conducts a quantitative evaluation of evacuation route safety using the Ant Colony Optimization (ACO) algorithm for risk management in central Tokyo. Firstly, we focused on the similarity in safety while taking road blockage probability into consideration. Then, after classifying roads by means of hierarchical cluster analysis, we estimated the congestion rates of evacuation routes using ACO simulations. Based on these results, the multiple evacuation routes extracted were visualized on digital maps by means of Geographic Information Systems (GIS), and their safety was evaluated. This makes it possible to select safe evacuation routes between evacuation sites in cases where a large-scale evacuation after an earthquake disaster is likely. As the evaluation method is based on public information, it can be applied in other areas once the same kind of geographic information is obtained, regardless of whether the information is from the past or the future. Therefore, in addition to spatial reproducibility, the evaluation method also has high temporal reproducibility. Because the safety evaluations of evacuation routes are based on quantified data, the highly safe routes selected have been quantitatively evaluated and thus serve as an effective indicator when selecting evacuation routes.

OTC Derivatives and Global Economic Activity: An Empirical Analysis*J. Risk Financial Manag.* **2017**, *10*(2), 13; doi:10.3390/jrfm10020013 - 14 June 2017**Abstract **

That the global market for derivatives has expanded beyond recognition is well known. What is not known is how this market interacts with economic activity. We provide the first empirical characterization of interdependencies between OECD economic activity and the global OTC derivatives market. To this end, we apply a vector-error correction model to OTC derivatives disaggregated across instruments and counterparties. The results indicate that with one exception, the heterogeneity of OTC contracts is too pronounced to be reliably summarized by our measures of economic activity. The one exception is interest-rate derivatives held by Other Financial Institutions.

A Statistical Analysis of Cryptocurrencies*J. Risk Financial Manag.* **2017**, *10*(2), 12; doi:10.3390/jrfm10020012 - 31 May 2017**Abstract **

We analyze statistical properties of the largest cryptocurrencies (determined by market capitalization), of which Bitcoin is the most prominent example. We characterize their exchange rates versus the U.S. Dollar by fitting parametric distributions to them. It is shown that returns are clearly non-normal; however, no single distribution fits all of the cryptocurrencies analysed well. We find that for the most popular currencies, such as Bitcoin and Litecoin, the generalized hyperbolic distribution gives the best fit, while for the smaller cryptocurrencies the normal inverse Gaussian distribution, generalized t distribution, and Laplace distribution give good fits. The results are important for investment and risk management purposes.
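A hedged sketch of the fitting-and-comparison exercise using SciPy on simulated heavy-tailed returns; the candidate set is trimmed for brevity (the generalized hyperbolic family is omitted) and the data are illustrative, not actual cryptocurrency returns.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Placeholder returns with fat tails, standing in for daily crypto returns
returns = stats.t.rvs(df=3, scale=0.02, size=2000, random_state=rng)

candidates = {'normal': stats.norm, 'student_t': stats.t, 'laplace': stats.laplace}
aic = {}
for name, dist in candidates.items():
    params = dist.fit(returns)                   # maximum-likelihood fit
    ll = dist.logpdf(returns, *params).sum()
    aic[name] = 2 * len(params) - 2 * ll
print(min(aic, key=aic.get))                     # heavy-tailed fits should beat the normal
```

Ranking candidate distributions by a likelihood-based criterion such as AIC is the standard way to formalise "best fit" in this kind of study.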

The Solvency II Standard Formula, Linear Geometry, and Diversification*J. Risk Financial Manag.* **2017**, *10*(2), 11; doi:10.3390/jrfm10020011 - 18 May 2017**Abstract **

The core of risk aggregation in the Solvency II Standard Formula is the so-called square root formula. We argue that it should be seen as a means for the aggregation of different risks to an overall risk rather than being associated with variance-covariance based risk analysis. Considering the Solvency II Standard Formula from the viewpoint of linear geometry, we immediately find that it defines a norm and therefore provides a homogeneous and sub-additive tool for risk aggregation. Hence, Euler’s Principle for the reallocation of risk capital applies and yields explicit formulas for capital allocation in the framework given by the Solvency II Standard Formula. This gives rise to the definition of *diversification functions*, which we define as monotone, subadditive, and homogeneous functions on a convex cone. Diversification functions constitute a class of models for the study of the aggregation of risk and diversification. The aggregation of risk measures using a diversification function preserves the respective properties of these risk measures. Examples of diversification functions are given by seminorms, which are monotone on the convex cone of non-negative vectors. Each ${L}^{p}$ norm has this property, and any scalar product given by a non-negative positive semidefinite matrix does as well. In particular, the Standard Formula is a diversification function and hence a risk measure that preserves homogeneity, subadditivity and convexity.
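A minimal numeric sketch of the square root formula viewed as a norm, together with the Euler capital allocation it induces; the risk modules and correlation matrix below are hypothetical and are not the regulatory calibration.

```python
import numpy as np

# Solvency II-style aggregation: SCR = sqrt(x' R x) for stand-alone
# capital requirements x and a correlation matrix R (illustrative values)
x = np.array([100.0, 80.0, 50.0])                 # e.g., market, default, life modules
R = np.array([[1.00, 0.25, 0.25],
              [0.25, 1.00, 0.25],
              [0.25, 0.25, 1.00]])

scr = np.sqrt(x @ R @ x)
euler = x * (R @ x) / scr        # Euler allocation: x_i * d(SCR)/d(x_i)
print(scr, euler, euler.sum())   # by homogeneity, allocations add up to the SCR
```

Because the formula is homogeneous of degree one, Euler's theorem guarantees the per-module contributions sum exactly to the aggregate requirement, and sub-additivity shows up as `scr` being below the sum of the stand-alone requirements.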

A Risk Management Framework for Cloud Migration Decision Support*J. Risk Financial Manag.* **2017**, *10*(2), 10; doi:10.3390/jrfm10020010 - 22 April 2017**Keywords:** risk management framework; risk assessment; cloud migration; security; analytic hierarchy process (AHP); business value

Capital Regulation, the Cost of Financial Intermediation and Bank Profitability: Evidence from Bangladesh*J. Risk Financial Manag.* **2017**, *10*(2), 9; doi:10.3390/jrfm10020009 - 17 April 2017**Abstract **

In response to the recent global financial crisis, the regulatory authorities in many countries have imposed stringent capital requirements in the form of the BASEL III Accord to ensure financial stability. On the other hand, bankers have criticized the new regulation on the grounds that it would raise the cost of funds for bank borrowers and erode bank profitability. In this study, we examine the impact of capital requirements on the cost of financial intermediation and bank profitability using a panel dataset of 32 Bangladeshi banks over the period from 2000 to 2015. By employing a dynamic panel generalized method of moments (GMM) estimator, we find robust evidence that higher bank regulatory capital ratios reduce the cost of financial intermediation and increase bank profitability. The results hold when we use the equity to total assets ratio as an alternative measure of bank capital. We also observe that switching from BASEL I to BASEL II has no measurable impact on the cost of financial intermediation and bank profitability in Bangladesh. In the empirical analysis, we further observe that higher bank management and cost efficiencies are associated with a lower cost of financial intermediation and higher bank profitability. These results have important implications for bank regulators, academicians, and bankers.

An Empirical Study on the Impact of Basel III Standards on Banks’ Default Risk: The Case of Luxembourg*J. Risk Financial Manag.* **2017**, *10*(2), 8; doi:10.3390/jrfm10020008 - 12 April 2017**Abstract **

We study how the Basel III regulations, namely the Capital-to-Assets Ratio (CAR), the Net Stable Funding Ratio (NSFR) and the Liquidity Coverage Ratio (LCR), are likely to impact banks’ profitability (i.e., ROA), capital levels and default. We estimate historical series of the new Basel III regulations for a panel of Luxembourgish banks for a period covering 2003q2–2011q3. We econometrically investigate whether historical LCR and NSFR components, as well as CAR positions are able to explain the variation in a measure of a bank’s default risk (approximated by Z-score) and how these effects make their way through banks’ ROA and CAR. We find that the liquidity regulation focusing on maturity mismatches (i.e., NSFR) induces a decrease in average probabilities of default. Conversely, the impact on banks’ profitability is less clear-cut; what seems to matter is banks’ funding structure rather than the characteristics of the portfolio of assets. Additionally, we use a model of bank behavior to simulate the banks’ optimal adjustments of their balance sheets as if they had to adhere to the regulations starting in 2003q2. Then, we predict, using our preferred econometric model and based on the simulated data, the banks’ Z-score and ROA. The simulation exercise suggests that basically all banks would have seen a decrease in their default risk during a crisis episode if they had previously adhered to Basel III.
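For reference, the Z-score used to approximate default risk is commonly computed as mean ROA plus the capital-to-assets ratio, divided by the standard deviation of ROA. A minimal sketch with hypothetical numbers:

```python
import numpy as np

def z_score(roa, car):
    """Bank Z-score: (mean ROA + capital-to-assets ratio) / std of ROA.

    Interpreted as the number of ROA standard deviations that would
    exhaust the bank's capital buffer; higher means safer.
    """
    roa = np.asarray(roa, dtype=float)
    return (roa.mean() + car) / roa.std(ddof=1)

roa_history = [0.012, 0.015, 0.008, 0.010, 0.014]   # hypothetical annual ROA
z = z_score(roa_history, car=0.09)
print(z)
```

Because the Z-score rises with capital and falls with earnings volatility, it is a natural left-hand-side variable for studying how liquidity and capital regulations affect default risk.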

On the Power and Size Properties of Cointegration Tests in the Light of High-Frequency Stylized Facts*J. Risk Financial Manag.* **2017**, *10*(1), 7; doi:10.3390/jrfm10010007 - 7 February 2017**Abstract **

This paper establishes a selection of stylized facts for high-frequency cointegrated processes, based on one-minute-binned transaction data. A methodology is introduced to simulate cointegrated stock pairs, following none, some or all of these stylized facts. AR(1)-GARCH(1,1) and MR(3)-STAR(1)-GARCH(1,1) processes contaminated with reversible and non-reversible jumps are used to model the cointegration relationship. In a Monte Carlo simulation, the power and size properties of ten cointegration tests are assessed. We find that in high-frequency settings typical for stock price data, power is still acceptable, with the exception of strong or very frequent non-reversible jumps. Phillips–Perron and PGFF tests perform best.

Modeling NYSE Composite US 100 Index with a Hybrid SOM and MLP-BP Neural Model*J. Risk Financial Manag.* **2017**, *10*(1), 6; doi:10.3390/jrfm10010006 - 5 February 2017**Abstract **

Neural networks are well suited to predicting future results of time series for various data types. This paper proposes a hybrid neural network model to describe the results of the database of the New York Stock Exchange (NYSE). This hybrid model brings together a self-organizing map (SOM) and a multilayer perceptron with the backpropagation algorithm (MLP-BP). The SOM segments the database into different clusters, where the differences between them are highlighted. The MLP-BP is used to construct a descriptive mathematical model that describes the relationship between the indicators and the closing value of each cluster. The model was developed from a database consisting of the NYSE Composite US 100 Index over the period from 2 April 2004 to 31 December 2015. Ten technical financial indicators were used as input variables for the neural networks. The model results were fairly accurate, with a mean absolute percentage error varying between 0.16% and 0.38%.
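The segment-then-model pipeline can be sketched as follows, with KMeans standing in for the SOM and linear regression for the MLP-BP; the ten indicators and the closing values are synthetic, so only the structure of the pipeline matches the paper.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(5)
X = rng.random((300, 10))                                      # ten technical indicators (synthetic)
y = 5.0 + X @ rng.random(10) + 0.1 * rng.standard_normal(300)  # closing-value proxy (synthetic)

# Stage 1: segment the observations into clusters (KMeans as a SOM stand-in)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

# Stage 2: fit one per-cluster model (linear regression as an MLP-BP stand-in)
models = {k: LinearRegression().fit(X[labels == k], y[labels == k])
          for k in range(3)}

pred = np.concatenate([models[k].predict(X[labels == k]) for k in range(3)])
actual = np.concatenate([y[labels == k] for k in range(3)])
mape = np.mean(np.abs((actual - pred) / actual)) * 100         # the paper's accuracy metric
print(f"MAPE: {mape:.2f}%")
```

The design choice is the same in both cases: segmenting first lets each predictive model specialise on a more homogeneous regime of the data.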

Accurate Evaluation of Expected Shortfall for Linear Portfolios with Elliptically Distributed Risk Factors*J. Risk Financial Manag.* **2017**, *10*(1), 5; doi:10.3390/jrfm10010005 - 2 February 2017**Abstract **

We provide an accurate closed-form expression for the expected shortfall of linear portfolios with elliptically distributed risk factors. Our results aim to correct inaccuracies that originate in Kamdem (2005) and are present also in at least thirty other papers referencing it, including the recent survey by Nadarajah et al. (2014) on estimation methods for expected shortfall. In particular, we show that the correction we provide in the popular multivariate Student t setting eliminates understatement of expected shortfall by a factor varying from at least four to more than 100 across different tail quantiles and degrees of freedom. As such, the resulting economic impact in financial risk management applications could be significant. We further correct such errors encountered also in closely related results in Kamdem (2007 and 2009) for mixtures of elliptical distributions. More generally, our findings point to the extra scrutiny required when deploying new methods for expected shortfall estimation in practice.
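For the univariate standard Student t, the textbook lower-tail expected shortfall is $ES_\alpha = f_\nu(q_\alpha)\,(\nu + q_\alpha^2)/((\nu-1)\,\alpha)$, where $q_\alpha$ is the $\alpha$-quantile and $f_\nu$ the density. The sketch below (stated as the standard result, not in the paper's multivariate notation) shows how such a closed form can be sanity-checked by Monte Carlo, the kind of verification that exposes errors like those the paper corrects.

```python
import numpy as np
from scipy import stats

def es_student_t(alpha, nu):
    """Closed-form lower-tail expected shortfall of a standard Student t (nu > 1)."""
    q = stats.t.ppf(alpha, nu)
    return stats.t.pdf(q, nu) * (nu + q ** 2) / ((nu - 1) * alpha)

alpha, nu = 0.01, 5
es = es_student_t(alpha, nu)

# Monte Carlo check: average loss magnitude beyond the alpha-quantile
x = stats.t.rvs(nu, size=2_000_000, random_state=0)
q = stats.t.ppf(alpha, nu)
mc = -x[x <= q].mean()
print(es, mc)   # the two estimates should agree closely
```

When a published closed form understates the simulated tail mean by a large factor, as the abstract reports for the multivariate case, this cheap check is usually enough to flag it.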

Determination of the Optimal Retention Level Based on Different Measures*J. Risk Financial Manag.* **2017**, *10*(1), 4; doi:10.3390/jrfm10010004 - 25 January 2017**Abstract **

This paper deals with the optimal retention level under four competitive criteria: survival probability, expected profit, variance and expected shortfall of the insurer’s risk. The aggregate claim amounts are assumed to be distributed as compound Poisson, and the individual claim amounts are distributed exponentially. We present an approach to determine the optimal retention level that maximizes the expected profit and the survival probability, while minimizing the variance and the expected shortfall of the insurer’s risk. In the decision making process, we concentrate on multi-attribute decision making methods: the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS) and the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) methods with their extended versions. We also provide a comprehensive analysis for the determination of the optimal retention level under both the expected value and standard deviation premium principles.
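TOPSIS, one of the two multi-attribute methods used, ranks alternatives by their relative closeness to an ideal solution. A compact sketch with hypothetical retention-level scores on the paper's four criteria (the weights and score matrix are invented for illustration):

```python
import numpy as np

def topsis(matrix, weights, benefit):
    """TOPSIS: rank alternatives (rows) on criteria (columns).

    benefit[j] is True when criterion j is to be maximized, False when minimized.
    Returns the relative closeness to the ideal solution (higher is better).
    """
    m = matrix / np.linalg.norm(matrix, axis=0)      # vector-normalize each column
    v = m * weights                                  # apply criterion weights
    ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
    anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
    d_best = np.linalg.norm(v - ideal, axis=1)
    d_worst = np.linalg.norm(v - anti, axis=1)
    return d_worst / (d_best + d_worst)

# Rows: candidate retention levels. Columns: survival probability (max),
# expected profit (max), variance (min), expected shortfall (min).
scores = np.array([[0.95, 120.0, 40.0, 15.0],
                   [0.92, 150.0, 55.0, 22.0],
                   [0.97, 100.0, 30.0, 11.0]])
closeness = topsis(scores, weights=np.array([0.3, 0.3, 0.2, 0.2]),
                   benefit=np.array([True, True, False, False]))
print(closeness.argmax())                            # index of the preferred retention level
```

Mixing maximization and minimization criteria in one ranking is exactly what makes TOPSIS (and VIKOR) suited to the four competing objectives above.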

Capital Structure Arbitrage under a Risk-Neutral Calibration*J. Risk Financial Manag.* **2017**, *10*(1), 3; doi:10.3390/jrfm10010003 - 19 January 2017**Abstract **

By reinterpreting the calibration of structural models, a reassessment of the importance of the input variables is undertaken. The analysis shows that volatility is the key parameter to any calibration exercise, by several orders of magnitude. To maximize the sensitivity to volatility, a simple formulation of Merton’s model is proposed that employs deep out-of-the-money option implied volatilities. The methodology also eliminates the use of historic data to specify the default barrier, thereby leading to a full risk-neutral calibration. Subsequently, a new technique for identifying and hedging capital structure arbitrage opportunities is illustrated. The approach seeks to hedge the volatility risk, or vega, as opposed to the exposure from the underlying equity itself, or delta. The results question the efficacy of the common arbitrage strategy of only executing the delta hedge.
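A minimal sketch of the Merton building block underlying such calibrations: equity valued as a call option on firm assets, with crude finite-difference sensitivities to assets (delta-like) and to asset volatility (vega-like). All inputs are hypothetical, and the formulation with deep out-of-the-money implied volatilities proposed in the paper is not reproduced here.

```python
import numpy as np
from scipy.stats import norm

def merton_equity(V, D, sigma, r, T):
    """Merton (1974): equity as a European call on firm assets V, strike = face debt D."""
    d1 = (np.log(V / D) + (r + 0.5 * sigma ** 2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return V * norm.cdf(d1) - D * np.exp(-r * T) * norm.cdf(d2)

V, D, sigma, r, T = 100.0, 80.0, 0.25, 0.02, 1.0     # hypothetical firm inputs
base = merton_equity(V, D, sigma, r, T)

# Finite-difference sensitivities (illustrative, 1% bumps)
dV = (merton_equity(V * 1.01, D, sigma, r, T) - base) / (0.01 * V)           # ~delta
dSigma = (merton_equity(V, D, sigma * 1.01, r, T) - base) / (0.01 * sigma)   # ~vega
print(base, dV, dSigma)
```

Comparing such sensitivities is one simple way to see the abstract's point that the calibration's output responds far more strongly to the volatility input than to the other parameters.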

Acknowledgement to Reviewers of the *Journal of Risk and Financial Management* in 2016*J. Risk Financial Manag.* **2017**, *10*(1), 2; doi:10.3390/jrfm10010002 - 10 January 2017**Abstract **

The editors of the *Journal of Risk and Financial Management* would like to express their sincere gratitude to the following reviewers for assessing manuscripts in 2016. [...]