Special Issue "Computational Complexity in Bayesian Econometric Analysis"

A special issue of Econometrics (ISSN 2225-1146).

Deadline for manuscript submissions: closed (15 September 2015)

Special Issue Editors

Guest Editor
Prof. Herman K. van Dijk

Econometric Institute, Erasmus University Rotterdam and Econometrics Department, VU University Amsterdam, The Netherlands
Interests: Simulation based Bayesian inference and decision analysis in econometrics; econometrics of growth and cycles; time varying volatilities and risk; neural networks and mixture processes; income distributions
Guest Editor
Dr. Nalan Basturk

Department of Quantitative Economics, School of Business and Economics, Maastricht University, The Netherlands
Interests: Time series models; Markov switching; mixture models; cluster analysis; Bayesian analysis of instrumental variables; model averaging; simulation methods
Guest Editor
Prof. Roberto Casarin

Department of Economics, University Ca' Foscari, Venice, San Giobbe 873/b, 30121 Venezia, Italy
Interests: Bayesian econometrics: graphical models, Bayesian nonparametrics, Dirichlet processes, Markov chain Monte Carlo, sequential Monte Carlo; time series analysis: VAR, stochastic volatility, stochastic correlation, Markov-switching; forecasting: combination, calibration
Guest Editor
Prof. Dr. Francesco Ravazzolo

Faculty of Economics and Management, Free University of Bozen/Bolzano, Italy
Interests: Bayesian econometrics; financial econometrics and macroeconometrics

Special Issue Information

Dear Colleagues,

The computational revolution in simulation techniques has been a key ingredient in Bayesian econometrics and has opened up new possibilities for studying complex economic and financial phenomena. Applications include risk measurement, forecasting, and the assessment of policy effectiveness in macro and monetary economics. Papers containing original research on this theme are actively solicited.

Prof. Herman K. van Dijk
Dr. Nalan Basturk
Prof. Roberto Casarin
Prof. Dr. Francesco Ravazzolo
Guest Editors

Please follow the timeline:
Submission deadline: 15 September 2015
Decision on acceptance/revise and resubmit: 15 October 2015
Revision due: 30 November 2015
Publication deadline: 31 January 2016

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Econometrics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charges (APCs) of 350 CHF (Swiss Francs) per published paper are partially funded by institutions through Knowledge Unlatched for a limited number of papers per year. Please contact the editorial office before submission to check whether KU waivers or discounts are still available. Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Bayesian Inference
  • Econometrics
  • Parallel Computing
  • GPU
  • Prediction
  • Risk Measurement
  • Economic Uncertainty
  • Bayesian Policy Analysis

Published Papers (9 papers)


Editorial

Jump to: Research

Open Access Editorial Computational Complexity and Parallelization in Bayesian Econometric Analysis
Received: 27 January 2016 / Revised: 28 January 2016 / Accepted: 3 February 2016 / Published: 22 February 2016
Abstract
Challenging statements have appeared in recent years in the literature on advances in computational procedures.[...] Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Research

Jump to: Editorial

Open Access Article Bayesian Calibration of Generalized Pools of Predictive Distributions
Econometrics 2016, 4(1), 17; https://doi.org/10.3390/econometrics4010017
Received: 15 September 2015 / Revised: 15 January 2016 / Accepted: 3 February 2016 / Published: 16 March 2016
Cited by 1
Abstract
Decision-makers often consult different experts to build reliable forecasts on variables of interest. Combining multiple opinions and calibrating them to maximize forecast accuracy is consequently a crucial issue in many economic problems. This paper applies a Bayesian beta mixture model to derive a combined and calibrated density function using random calibration functionals and random combination weights. In particular, it compares the application of linear, harmonic and logarithmic pooling in the Bayesian combination approach. The three combination schemes, i.e., linear, harmonic and logarithmic, are studied in simulation examples with multimodal densities and in an empirical application with a large database of stock data. All of the experiments show that, in a beta mixture calibration framework, the three combination schemes are substantially equivalent, all achieving calibration, with no clear preference for one of them. The financial application shows that linear pooling together with beta mixture calibration achieves the best results in terms of calibrated forecasts. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
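The three pooling schemes compared in this paper can be sketched numerically. The following is a minimal illustration on two fixed expert densities over a grid; the paper's random calibration functionals and random combination weights are omitted, and all inputs are illustrative assumptions:

```python
import numpy as np

# Two expert forecast densities evaluated on a grid: unit-variance normals
# centered at different points (purely illustrative inputs).
x = np.linspace(-8.0, 8.0, 2001)
dx = x[1] - x[0]
norm_pdf = lambda m: np.exp(-0.5 * (x - m) ** 2) / np.sqrt(2 * np.pi)
p1, p2 = norm_pdf(-1.0), norm_pdf(2.0)
w = 0.6  # combination weight on the first expert

# Linear pool: a mixture density; it integrates to one by construction.
linear = w * p1 + (1 - w) * p2

# Harmonic pool: weighted harmonic mean, renormalized on the grid.
harmonic = 1.0 / (w / p1 + (1 - w) / p2)
harmonic /= harmonic.sum() * dx

# Logarithmic pool: weighted geometric mean, renormalized on the grid.
logarithmic = p1 ** w * p2 ** (1 - w)
logarithmic /= logarithmic.sum() * dx
```

With well-separated experts the linear pool is bimodal while the logarithmic pool concentrates between the modes, which is why the paper's finding of near-equivalence after beta mixture calibration is noteworthy.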

Open Access Article The Evolving Transmission of Uncertainty Shocks in the United Kingdom
Econometrics 2016, 4(1), 16; https://doi.org/10.3390/econometrics4010016
Received: 4 September 2015 / Revised: 30 November 2015 / Accepted: 28 January 2016 / Published: 14 March 2016
Cited by 1
Abstract
This paper investigates whether the impact of uncertainty shocks on the U.K. economy has changed over time. To this end, we propose an extended time-varying VAR model that simultaneously allows the estimation of a measure of uncertainty and of its time-varying impact on key macroeconomic and financial variables. We find that the impact of uncertainty shocks on these variables has declined over time. The timing of the change coincides with the introduction of inflation targeting in the U.K. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Open Access Article Timing Foreign Exchange Markets
Econometrics 2016, 4(1), 15; https://doi.org/10.3390/econometrics4010015
Received: 2 July 2015 / Revised: 14 December 2015 / Accepted: 28 January 2016 / Published: 11 March 2016
Cited by 2
Abstract
To improve short-horizon exchange rate forecasts, we employ foreign exchange market risk factors as fundamentals, and Bayesian treed Gaussian process (BTGP) models to handle non-linear, time-varying relationships between these fundamentals and exchange rates. Forecasts from the BTGP model conditional on the carry and dollar factors dominate random walk forecasts on accuracy and economic criteria in the Meese-Rogoff setting. Superior market timing ability for large moves, more than directional accuracy, drives the BTGP’s success. We explain how, through a model averaging Monte Carlo scheme, the BTGP is able to simultaneously exploit smoothness and rough breaks in between-variable dynamics. Either feature in isolation is unable to consistently outperform benchmarks throughout the full span of time in our forecasting exercises. Trading strategies based on ex ante BTGP forecasts deliver the highest out-of-sample risk-adjusted returns for the median currency, as well as for both predictable, traded risk factors. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
Open Access Article Return and Risk of Pairs Trading Using a Simulation-Based Bayesian Procedure for Predicting Stable Ratios of Stock Prices
Econometrics 2016, 4(1), 14; https://doi.org/10.3390/econometrics4010014
Received: 3 September 2015 / Revised: 26 January 2016 / Accepted: 28 January 2016 / Published: 10 March 2016
Cited by 1
Abstract
We investigate the direct connection between the uncertainty related to estimated stable ratios of stock prices and the risk and return of two pairs trading strategies: a conditional statistical arbitrage method and an implicit arbitrage one. A simulation-based Bayesian procedure is introduced for predicting stable stock price ratios, defined in a cointegration model. Using this class of models and the proposed inferential technique, we are able to connect estimation and model uncertainty with the risk and return of stock trading. In terms of methodology, we show the effect that an encompassing prior, which is shown to be equivalent to a Jeffreys' prior, has under an orthogonal normalization on the selection of pairs of cointegrated stock prices and, further, on the estimation and prediction of the spread between cointegrated stock prices. We distinguish between models with a normal and a Student t distribution, since the latter typically provides a better description of daily changes of prices on financial markets. As an empirical application, we use stocks that are constituents of the Dow Jones Composite Average index. The results show that the normalization has little effect on the selection of pairs of cointegrated stocks on the basis of Bayes factors. However, the results stress the importance of the orthogonal normalization for the estimation and prediction of the spread—the deviation from the equilibrium relationship—which leads to better results in terms of profit per capital engagement and risk than a standard linear normalization. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
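The spread construction underlying such pairs trading strategies can be sketched as follows. This is a rough stand-in on simulated data: the paper's simulation-based Bayesian procedure and orthogonal normalization are replaced by a simple least-squares point estimate, and all parameter values are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)
T = 500

# Simulated pair of cointegrated log prices sharing one stochastic trend.
trend = np.cumsum(rng.normal(size=T))
logp1 = trend + rng.normal(scale=0.3, size=T)
logp2 = 0.8 * trend + rng.normal(scale=0.3, size=T)

# Point estimate of the cointegrating ratio; the paper instead simulates it
# from a posterior, carrying estimation uncertainty into risk and return.
beta = np.polyfit(logp2, logp1, 1)[0]

# Spread: the deviation from the equilibrium relationship.
spread = logp1 - beta * logp2

# A simple statistical-arbitrage signal: go short (long) the spread when it
# sits k standard deviations above (below) its mean.
k = 2.0
upper = spread.mean() + k * spread.std()
lower = spread.mean() - k * spread.std()
signal = np.where(spread > upper, -1, np.where(spread < lower, 1, 0))
```

Replacing the point estimate `beta` with posterior draws would yield a distribution of spreads, which is how estimation uncertainty feeds into the strategies' risk and return.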
Open Access Article Bayesian Nonparametric Measurement of Factor Betas and Clustering with Application to Hedge Fund Returns
Econometrics 2016, 4(1), 13; https://doi.org/10.3390/econometrics4010013
Received: 8 June 2015 / Revised: 4 December 2015 / Accepted: 28 January 2016 / Published: 8 March 2016
Abstract
We define a dynamic and self-adjusting mixture of Gaussian Graphical Models to cluster financial returns, and provide a new method for extraction of nonparametric estimates of dynamic alphas (excess return) and betas (to a choice set of explanatory factors) in a multivariate setting. This approach, as well as the outputs, has a dynamic, nonstationary and nonparametric form, which circumvents the problem of model risk and parametric assumptions that the Kalman filter and other widely used approaches rely on. The clusters obtained as a by-product, used for shrinkage and information borrowing, can help determine relationships around specific events. This approach exhibits a smaller Root Mean Squared Error than traditionally used benchmarks in financial settings, which we illustrate through simulation. As an illustration, we use hedge fund index data, and find that our estimated alphas are, on average, 0.13% per month higher (1.6% per year) than alphas estimated through Ordinary Least Squares. The approach exhibits fast adaptation to abrupt changes in the parameters, as seen in our estimated alphas and betas, which exhibit high volatility, especially in periods which can be identified as times of stressful market events, a reflection of the dynamic positioning of hedge fund portfolio managers. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Open Access Article Evolutionary Sequential Monte Carlo Samplers for Change-Point Models
Econometrics 2016, 4(1), 12; https://doi.org/10.3390/econometrics4010012
Received: 24 August 2015 / Revised: 27 December 2015 / Accepted: 28 January 2016 / Published: 8 March 2016
Abstract
Sequential Monte Carlo (SMC) methods are widely used for non-linear filtering purposes. However, the scope of SMC encompasses much wider applications, such as estimating static model parameters, so much so that it is becoming a serious alternative to Markov chain Monte Carlo (MCMC) methods. Not only do SMC algorithms draw from posterior distributions of static or dynamic parameters, but they additionally provide an estimate of the marginal likelihood. The tempered and time (TNT) algorithm, developed in this paper, combines (off-line) tempered SMC inference with on-line SMC inference for drawing realizations from many sequential posterior distributions without experiencing a particle degeneracy problem. Furthermore, it introduces a new MCMC rejuvenation step that is generic, automated and well-suited for multi-modal distributions. As this update relies on the wide heuristic optimization literature, numerous extensions are readily available. The algorithm is notably appropriate for estimating change-point models. As an example, we compare several change-point GARCH models through their marginal log-likelihoods over time. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
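A stripped-down tempered SMC pass illustrates the reweight-resample-rejuvenate cycle that the TNT algorithm builds on. It runs on a toy conjugate target rather than a change-point GARCH model, resamples at every step rather than adaptively, and uses a plain random-walk rejuvenation step instead of the paper's evolutionary update; all numbers are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy target: prior N(0, 3^2) combined with likelihood kernel N(3, 1),
# reached by tempering the likelihood with exponents phi = 0, ..., 1.
n = 5000
particles = rng.normal(0.0, 3.0, size=n)  # draws from the prior
logw = np.zeros(n)

def loglik(th):
    return -0.5 * (th - 3.0) ** 2  # log-likelihood kernel

phis = np.linspace(0.0, 1.0, 11)
for phi_prev, phi in zip(phis[:-1], phis[1:]):
    # Reweight: incremental importance weight for the tempering step.
    logw += (phi - phi_prev) * loglik(particles)
    w = np.exp(logw - logw.max())
    w /= w.sum()
    # Resample (multinomial, every step for simplicity) and reset weights.
    idx = rng.choice(n, size=n, p=w)
    particles, logw = particles[idx], np.zeros(n)
    # Rejuvenate: one random-walk Metropolis step targeting the tempered posterior.
    logpost = lambda th: -0.5 * th ** 2 / 9.0 + phi * loglik(th)
    prop = particles + rng.normal(scale=0.5, size=n)
    accept = np.log(rng.uniform(size=n)) < logpost(prop) - logpost(particles)
    particles = np.where(accept, prop, particles)
```

For this conjugate setup the exact posterior is N(2.7, 0.9), so the particle mean and variance after the final step should land close to those values.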

Open Access Article Parallelization Experience with Four Canonical Econometric Models Using ParMitISEM
Econometrics 2016, 4(1), 11; https://doi.org/10.3390/econometrics4010011
Received: 15 September 2015 / Revised: 7 January 2016 / Accepted: 28 January 2016 / Published: 7 March 2016
Cited by 1
Abstract
This paper presents the parallel computing implementation of the MitISEM algorithm, labeled Parallel MitISEM. The basic MitISEM algorithm provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate density in Importance Sampling or Metropolis-Hastings methods for Bayesian inference on model parameters and probabilities. We present and discuss four canonical econometric models using a Graphics Processing Unit and a multi-core Central Processing Unit version of the MitISEM algorithm. The results show that the parallelization of the MitISEM algorithm on Graphics Processing Units and multi-core Central Processing Units is straightforward and fast to program using MATLAB. Moreover, the Graphics Processing Unit version is substantially faster than the Central Processing Unit version. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
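The role such a candidate density plays can be sketched with plain importance sampling. Here a hand-built two-component Student-t mixture stands in for MitISEM's adaptively fitted mixture (the paper's EM-based fitting step is omitted), targeting a bimodal kernel; all component locations, weights and degrees of freedom are illustrative assumptions:

```python
import math
import numpy as np

rng = np.random.default_rng(1)

# Unnormalized bimodal target kernel: equal-weight normals at -2 and +2.
def kernel(th):
    return np.exp(-0.5 * (th + 2.0) ** 2) + np.exp(-0.5 * (th - 2.0) ** 2)

# Candidate: a two-component mixture of Student-t densities with the
# components placed by hand near the target's modes.
df = 5.0
locs = np.array([-2.0, 2.0])
wts = np.array([0.5, 0.5])

def t_pdf(z, df):
    # Standard Student-t density with df degrees of freedom.
    c = math.gamma((df + 1) / 2) / (math.gamma(df / 2) * math.sqrt(df * math.pi))
    return c * (1.0 + z ** 2 / df) ** (-(df + 1) / 2)

def cand_pdf(th):
    return sum(w * t_pdf(th - m, df) for w, m in zip(wts, locs))

# Draw from the mixture, then form importance weights kernel / candidate.
n = 20000
comp = rng.choice(2, size=n, p=wts)
draws = locs[comp] + rng.standard_t(df, size=n)
iw = kernel(draws) / cand_pdf(draws)
post_mean = np.sum(iw * draws) / np.sum(iw)  # self-normalized IS estimate
```

Because the Student-t components have heavier tails than the normal target, the importance weights stay bounded; this is the property that makes such mixtures attractive candidate densities.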

Open Access Article Sequentially Adaptive Bayesian Learning for a Nonlinear Model of the Secular and Cyclical Behavior of US Real GDP
Econometrics 2016, 4(1), 10; https://doi.org/10.3390/econometrics4010010
Received: 1 October 2015 / Revised: 4 December 2015 / Accepted: 2 February 2016 / Published: 2 March 2016
Cited by 2
Abstract
There is a one-to-one mapping between the conventional time series parameters of a third-order autoregression and the more interpretable parameters of secular half-life, cyclical half-life and cycle period. The latter parameterization is better suited to the interpretation of results using both Bayesian and maximum likelihood methods and to the expression of a substantive prior distribution using Bayesian methods. The paper demonstrates how to approach both problems using the sequentially adaptive Bayesian learning (SABL) algorithm and software, which eliminates virtually all of the substantial technical overhead required in conventional approaches and produces results quickly and reliably. The work utilizes methodological innovations in SABL, including the optimization of irregular and multimodal functions and the production of the conventional maximum likelihood asymptotic variance matrix as a by-product. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Econometrics EISSN 2225-1146, published by MDPI AG, Basel, Switzerland