Computational Complexity in Bayesian Econometric Analysis

A special issue of Econometrics (ISSN 2225-1146).

Deadline for manuscript submissions: closed (15 September 2015) | Viewed by 61351

Special Issue Editors


Prof. Herman K. van Dijk
Guest Editor
Econometric Institute, Erasmus University Rotterdam and Econometrics Department, VU University Amsterdam, The Netherlands
Interests: Simulation-based Bayesian inference and decision analysis in econometrics; econometrics of growth and cycles; time-varying volatilities and risk; neural networks and mixture processes; income distributions

Dr. Nalan Baştürk
Guest Editor
Department of Quantitative Economics, School of Business and Economics, Maastricht University, The Netherlands
Interests: Time series models; Markov switching; mixture models; cluster analysis; Bayesian analysis of instrumental variables; model averaging; simulation methods

Prof. Roberto Casarin
Guest Editor
Department of Economics, University Ca' Foscari, Venice, San Giobbe 873/b, 30121 Venezia, Italy
Interests: Bayesian econometrics: graphical models, Bayesian nonparametrics, Dirichlet processes, Markov chain Monte Carlo, sequential Monte Carlo; time series analysis: VAR, stochastic volatility, stochastic correlation, Markov switching; forecasting: combination, calibration

Prof. Francesco Ravazzolo
Guest Editor
Faculty of Economics and Management, Free University of Bozen-Bolzano, 39100 Bolzano BZ, Italy
Interests: Bayesian econometrics; financial econometrics and macroeconometrics

Special Issue Information

Dear Colleagues,

The computational revolution in simulation techniques has become a key ingredient in Bayesian econometrics and has opened up new possibilities for studying complex economic and financial phenomena. Applications include risk measurement, forecasting, and the assessment of policy effectiveness in macro and monetary economics. Papers containing original research on this theme are actively solicited.

Prof. Herman K. van Dijk
Dr. Nalan Baştürk
Prof. Roberto Casarin
Prof. Francesco Ravazzolo
Guest Editors

Please follow the timeline:
Submission deadline: 15 September 2015
Decision on acceptance/revise and resubmit: 15 October 2015
Revision due: 30 November 2015
Publication deadline: 31 January 2016

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, click here to go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Econometrics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1400 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • Bayesian Inference
  • Econometrics
  • Parallel Computing
  • GPU
  • Prediction
  • Risk Measurement
  • Economic Uncertainty
  • Bayesian Policy Analysis

Benefits of Publishing in a Special Issue

  • Ease of navigation: Grouping papers by topic helps scholars navigate broad scope journals more efficiently.
  • Greater discoverability: Special Issues support the reach and impact of scientific research. Articles in Special Issues are more discoverable and cited more frequently.
  • Expansion of research network: Special Issues facilitate connections among authors, fostering scientific collaborations.
  • External promotion: Articles in Special Issues are often promoted through the journal's social media, increasing their visibility.
  • e-Book format: Special Issues with more than 10 articles can be published as dedicated e-books, ensuring wide and rapid dissemination.

Further information on MDPI's Special Issue policies can be found here.

Published Papers (9 papers)


Editorial

Jump to: Research

Editorial
Computational Complexity and Parallelization in Bayesian Econometric Analysis
by Nalan Baştürk, Roberto Casarin, Francesco Ravazzolo and Herman K. Van Dijk
Econometrics 2016, 4(1), 9; https://doi.org/10.3390/econometrics4010009 - 22 Feb 2016
Viewed by 5503
Abstract
Challenging statements have appeared in recent years in the literature on advances in computational procedures. [...] Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Research

Jump to: Editorial

Article
Bayesian Calibration of Generalized Pools of Predictive Distributions
by Roberto Casarin, Giulia Mantoan and Francesco Ravazzolo
Econometrics 2016, 4(1), 17; https://doi.org/10.3390/econometrics4010017 - 16 Mar 2016
Cited by 11 | Viewed by 7633
Abstract
Decision-makers often consult different experts to build reliable forecasts on variables of interest. Combining multiple opinions and calibrating them to maximize forecast accuracy is consequently a crucial issue in several economic problems. This paper applies a Bayesian beta mixture model to derive a combined and calibrated density function using random calibration functionals and random combination weights. In particular, it compares the application of linear, harmonic and logarithmic pooling in the Bayesian combination approach. The three combination schemes, i.e., linear, harmonic and logarithmic, are studied in simulation examples with multimodal densities and in an empirical application with a large database of stock data. All of the experiments show that, in a beta mixture calibration framework, the three combination schemes are substantially equivalent, achieving calibration, and no clear preference for one of them appears. The financial application shows that linear pooling together with beta mixture calibration achieves the best results in terms of calibrated forecasts. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
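For readers comparing the three pooling operators, a minimal sketch follows (Python with NumPy and SciPy; the function name, grid, and two-expert example are illustrative, and the paper's beta mixture calibration and random weights are not reproduced):

```python
import numpy as np
from scipy.stats import norm

def pool_densities(grid, pdfs, weights, scheme="linear"):
    """Combine K expert predictive densities evaluated on a common grid."""
    pdfs = np.asarray(pdfs, float)       # shape (K, n): one row per expert
    w = np.asarray(weights, float)       # simplex weights, length K
    if scheme == "linear":               # sum_k w_k f_k(y)
        pool = w @ pdfs
    elif scheme == "harmonic":           # (sum_k w_k / f_k(y))^(-1)
        pool = 1.0 / (w @ (1.0 / pdfs))
    elif scheme == "logarithmic":        # prod_k f_k(y)^(w_k)
        pool = np.exp(w @ np.log(pdfs))
    else:
        raise ValueError(f"unknown scheme: {scheme}")
    return pool / np.trapz(pool, grid)   # harmonic/log pools need renormalizing

# Two normal "experts": the linear pool is bimodal, the log pool a compromise
grid = np.linspace(-8.0, 8.0, 2001)
experts = np.vstack([norm.pdf(grid, -1.0, 1.0), norm.pdf(grid, 2.0, 1.5)])
for s in ("linear", "harmonic", "logarithmic"):
    p = pool_densities(grid, experts, [0.5, 0.5], scheme=s)
    print(s, round(float(np.trapz(p, grid)), 6))  # each integrates to 1
```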

Article
The Evolving Transmission of Uncertainty Shocks in the United Kingdom
by Haroon Mumtaz
Econometrics 2016, 4(1), 16; https://doi.org/10.3390/econometrics4010016 - 14 Mar 2016
Cited by 10 | Viewed by 6245
Abstract
This paper investigates whether the impact of uncertainty shocks on the U.K. economy has changed over time. To this end, we propose an extended time-varying VAR model that simultaneously allows the estimation of a measure of uncertainty and its time-varying impact on key macroeconomic and financial variables. We find that the impact of uncertainty shocks on these variables has declined over time. The timing of the change coincides with the introduction of inflation targeting in the U.K. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Article
Timing Foreign Exchange Markets
by Samuel W. Malone, Robert B. Gramacy and Enrique Ter Horst
Econometrics 2016, 4(1), 15; https://doi.org/10.3390/econometrics4010015 - 11 Mar 2016
Cited by 3 | Viewed by 7836
Abstract
To improve short-horizon exchange rate forecasts, we employ foreign exchange market risk factors as fundamentals, and Bayesian treed Gaussian process (BTGP) models to handle non-linear, time-varying relationships between these fundamentals and exchange rates. Forecasts from the BTGP model conditional on the carry and dollar factors dominate random walk forecasts on accuracy and economic criteria in the Meese-Rogoff setting. Superior market timing ability for large moves, more than directional accuracy, drives the BTGP’s success. We explain how, through a model averaging Monte Carlo scheme, the BTGP is able to simultaneously exploit smoothness and rough breaks in between-variable dynamics. Either feature in isolation is unable to consistently outperform benchmarks throughout the full span of time in our forecasting exercises. Trading strategies based on ex ante BTGP forecasts deliver the highest out-of-sample risk-adjusted returns for the median currency, as well as for both predictable, traded risk factors. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
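A rough sketch of the forecasting setup is given below, on synthetic data and with a single stationary Gaussian process (via scikit-learn) standing in for the paper's Bayesian treed GP; it shows only the rolling one-step-ahead comparison against the Meese-Rogoff random walk benchmark, not the tree partitioning or the model averaging Monte Carlo scheme:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(0)

# Synthetic stand-ins for the carry and dollar risk factors (T x 2) and
# one-step-ahead exchange rate returns; purely illustrative data.
T = 400
X = rng.normal(size=(T, 2))
y = 0.3 * np.tanh(X[:, 0]) - 0.2 * X[:, 1] + 0.1 * rng.normal(size=T)

# Rolling one-step-ahead forecasts from a single stationary GP, whereas
# the paper's BTGP partitions the factor space and fits a separate GP in
# each leaf to capture rough breaks alongside smooth dynamics.
window, preds = 200, []
gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)
for t in range(window, T):
    gp.fit(X[t - window:t], y[t - window:t])
    preds.append(gp.predict(X[t:t + 1])[0])

preds = np.array(preds)
rmse_gp = np.sqrt(np.mean((y[window:] - preds) ** 2))
rmse_rw = np.sqrt(np.mean(y[window:] ** 2))  # random walk: zero predicted change
print(f"GP RMSE {rmse_gp:.4f} vs random-walk RMSE {rmse_rw:.4f}")
```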
Article
Return and Risk of Pairs Trading Using a Simulation-Based Bayesian Procedure for Predicting Stable Ratios of Stock Prices
by David Ardia, Lukasz T. Gatarek, Lennart Hoogerheide and Herman K. Van Dijk
Econometrics 2016, 4(1), 14; https://doi.org/10.3390/econometrics4010014 - 10 Mar 2016
Cited by 3 | Viewed by 7021 | Correction
Abstract
We investigate the direct connection between the uncertainty related to estimated stable ratios of stock prices and the risk and return of two pairs trading strategies: a conditional statistical arbitrage method and an implicit arbitrage one. A simulation-based Bayesian procedure is introduced for predicting stable stock price ratios, defined in a cointegration model. Using this class of models and the proposed inferential technique, we are able to connect estimation and model uncertainty with the risk and return of stock trading. In terms of methodology, we show the effect that using an encompassing prior, which is shown to be equivalent to a Jeffreys' prior, has under an orthogonal normalization on the selection of pairs of cointegrated stock prices and, further, on the estimation and prediction of the spread between cointegrated stock prices. We distinguish between models with a normal and a Student t distribution, since the latter typically provides a better description of daily changes of prices on financial markets. As an empirical application, we use stocks that are constituents of the Dow Jones Composite Average index. The results show that normalization has little effect on the selection of pairs of cointegrated stocks on the basis of Bayes factors. However, the results stress the importance of the orthogonal normalization for the estimation and prediction of the spread (the deviation from the equilibrium relationship), which leads to better results in terms of profit per capital engagement and risk than using a standard linear normalization. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
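To make the trading logic concrete, here is a sketch on simulated cointegrated prices, using plain OLS under the standard linear normalization; the paper's simulation-based Bayesian procedure, orthogonal normalization and encompassing prior are not reproduced, and all numbers are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

# Simulated pair of cointegrated log prices: p2 tracks beta * p1 plus a
# mean-reverting deviation (a stand-in for two co-moving stocks).
T, beta = 1000, 0.8
p1 = np.cumsum(0.01 * rng.normal(size=T))
dev = np.zeros(T)
for t in range(1, T):
    dev[t] = 0.9 * dev[t - 1] + 0.02 * rng.normal()
p2 = beta * p1 + dev

# Linear normalization: estimate the stable ratio by OLS on a training
# window, then trade standardized deviations from equilibrium. The paper
# instead samples the cointegration space under an orthogonal
# normalization with an encompassing (Jeffreys-equivalent) prior.
b_hat = np.polyfit(p1[:500], p2[:500], 1)[0]
ins = p2[:500] - b_hat * p1[:500]                 # in-sample spread stats
spread = p2[500:] - b_hat * p1[500:]
z = (spread - ins.mean()) / ins.std()

position = -np.sign(z) * (np.abs(z) > 1.0)        # short the spread when stretched
pnl = position[:-1] * np.diff(spread)             # gains as the spread reverts
print(f"estimated ratio {b_hat:.3f}, cumulative P&L {pnl.sum():.3f}")
```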
Article
Bayesian Nonparametric Measurement of Factor Betas and Clustering with Application to Hedge Fund Returns
by Urbi Garay, Enrique Ter Horst, German Molina and Abel Rodriguez
Econometrics 2016, 4(1), 13; https://doi.org/10.3390/econometrics4010013 - 8 Mar 2016
Cited by 1 | Viewed by 7384
Abstract
We define a dynamic and self-adjusting mixture of Gaussian Graphical Models to cluster financial returns, and provide a new method for extracting nonparametric estimates of dynamic alphas (excess return) and betas (to a choice set of explanatory factors) in a multivariate setting. This approach, as well as its outputs, has a dynamic, nonstationary and nonparametric form, which circumvents the problem of model risk and the parametric assumptions that the Kalman filter and other widely used approaches rely on. The clusters produced as a by-product, used for shrinkage and information borrowing, can be of use in determining relationships around specific events. This approach exhibits a smaller root mean squared error than traditionally used benchmarks in financial settings, which we illustrate through simulation. As an illustration, we use hedge fund index data, and find that our estimated alphas are, on average, 0.13% per month higher (1.6% per year) than alphas estimated through ordinary least squares. The approach adapts quickly to abrupt changes in the parameters, as seen in our estimated alphas and betas, which exhibit high volatility, especially in periods that can be identified as times of stressful market events, a reflection of the dynamic positioning of hedge fund portfolio managers. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)

Article
Evolutionary Sequential Monte Carlo Samplers for Change-Point Models
by Arnaud Dufays
Econometrics 2016, 4(1), 12; https://doi.org/10.3390/econometrics4010012 - 8 Mar 2016
Cited by 8 | Viewed by 6620
Abstract
Sequential Monte Carlo (SMC) methods are widely used for non-linear filtering purposes. However, the scope of SMC encompasses wider applications, such as estimating static model parameters, so much so that it is becoming a serious alternative to Markov chain Monte Carlo (MCMC) methods. Not only do SMC algorithms draw from posterior distributions of static or dynamic parameters, but they additionally provide an estimate of the marginal likelihood. The tempered and time (TNT) algorithm, developed in this paper, combines (off-line) tempered SMC inference with on-line SMC inference for drawing realizations from many sequential posterior distributions without experiencing a particle degeneracy problem. Furthermore, it introduces a new MCMC rejuvenation step that is generic, automated and well suited for multi-modal distributions. As this update relies on the wide heuristic optimization literature, numerous extensions are readily available. The algorithm is notably appropriate for estimating change-point models. As an example, we compare several change-point GARCH models through their marginal log-likelihoods over time. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
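For orientation, a minimal tempered SMC sampler for a toy scalar posterior is sketched below (all tuning values are illustrative); the TNT algorithm layers an on-line phase and the paper's heuristic-optimization-based rejuvenation step on top of this basic reweight/resample/rejuvenate cycle:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy setting: normal likelihood with unknown mean, N(0, 3^2) prior.
y = rng.normal(1.5, 1.0, size=50)                 # observed data
N = 2000                                          # number of particles
theta = rng.normal(0.0, 3.0, size=N)              # draws from the prior
logw = np.zeros(N)

def loglik(th):                                   # summed normal log-likelihood
    return -0.5 * ((y[None, :] - th[:, None]) ** 2).sum(axis=1)

lams = np.linspace(0.0, 1.0, 21)                  # tempering schedule
for lam0, lam1 in zip(lams[:-1], lams[1:]):
    logw += (lam1 - lam0) * loglik(theta)         # incremental reweighting
    w = np.exp(logw - logw.max()); w /= w.sum()
    idx = rng.choice(N, size=N, p=w)              # multinomial resampling
    theta, logw = theta[idx], np.zeros(N)
    # Rejuvenation: one random-walk Metropolis step per particle,
    # targeting the current tempered posterior (prior sd = 3, so -th^2/18)
    prop = theta + 0.3 * rng.normal(size=N)
    logpost = lambda th: lam1 * loglik(th) - th ** 2 / 18.0
    accept = np.log(rng.uniform(size=N)) < logpost(prop) - logpost(theta)
    theta = np.where(accept, prop, theta)

print(f"posterior mean ~ {theta.mean():.3f} (data mean {y.mean():.3f})")
```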

Article
Parallelization Experience with Four Canonical Econometric Models Using ParMitISEM
by Nalan Baştürk, Stefano Grassi, Lennart Hoogerheide and Herman K. Van Dijk
Econometrics 2016, 4(1), 11; https://doi.org/10.3390/econometrics4010011 - 7 Mar 2016
Cited by 3 | Viewed by 6563
Abstract
This paper presents the parallel computing implementation of the MitISEM algorithm, labeled Parallel MitISEM. The basic MitISEM algorithm provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate density in Importance Sampling or Metropolis-Hastings methods for Bayesian inference on model parameters and probabilities. We present and discuss four canonical econometric models using a Graphics Processing Unit and a multi-core Central Processing Unit version of the MitISEM algorithm. The results show that the parallelization of the MitISEM algorithm on Graphics Processing Units and multi-core Central Processing Units is straightforward and fast to program using MATLAB. Moreover, the Graphics Processing Unit version is considerably faster than the Central Processing Unit one. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
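The sketch below shows the building block that MitISEM refines: importance sampling with a Student-t candidate for a posterior kernel. The toy target and all parameter values are assumptions; MitISEM's adaptive mixture fitting and the paper's GPU/multi-core parallelization are not reproduced:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Toy target: a skewed kernel known only up to a normalizing constant.
def log_kernel(x):
    return stats.skewnorm.logpdf(x, a=4.0)

# Fixed Student-t candidate; MitISEM would instead fit an adaptive
# mixture of t densities to the kernel via IS-weighted EM steps.
df, loc, scale = 5.0, 0.5, 1.2
draws = stats.t.rvs(df, loc, scale, size=50_000, random_state=rng)
logw = log_kernel(draws) - stats.t.logpdf(draws, df, loc, scale)
w = np.exp(logw - logw.max())                      # stabilized IS weights

post_mean = np.sum(w * draws) / np.sum(w)          # self-normalized estimate
ess = np.sum(w) ** 2 / np.sum(w ** 2)              # effective sample size
print(f"IS posterior mean {post_mean:.3f}, ESS {ess:.0f} of {draws.size}")
```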

Article
Sequentially Adaptive Bayesian Learning for a Nonlinear Model of the Secular and Cyclical Behavior of US Real GDP
by John Geweke
Econometrics 2016, 4(1), 10; https://doi.org/10.3390/econometrics4010010 - 2 Mar 2016
Cited by 3 | Viewed by 5457
Abstract
There is a one-to-one mapping between the conventional time series parameters of a third-order autoregression and the more interpretable parameters of secular half-life, cyclical half-life and cycle period. The latter parameterization is better suited to the interpretation of results using both Bayesian and maximum likelihood methods, and to the expression of a substantive prior distribution using Bayesian methods. The paper demonstrates how to approach both problems using the sequentially adaptive Bayesian learning (SABL) algorithm and software, which eliminates virtually all of the substantial technical overhead required in conventional approaches and produces results quickly and reliably. The work utilizes methodological innovations in SABL, including the optimization of irregular and multimodal functions and the production of the conventional maximum likelihood asymptotic variance matrix as a by-product. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
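The mapping itself is straightforward to compute: take the inverse roots of the AR(3) lag polynomial, read the secular half-life off the real root, and the cyclical half-life and cycle period off the complex pair. A sketch with hypothetical coefficients (quarterly data assumed) follows:

```python
import numpy as np

# AR(3): y_t = phi1*y_{t-1} + phi2*y_{t-2} + phi3*y_{t-3} + e_t.
# Hypothetical coefficients, chosen so the inverse roots split into one
# persistent real root and a complex conjugate pair.
phi = np.array([2.301, -1.781, 0.475])
lam = np.roots([1.0, -phi[0], -phi[1], -phi[2]])   # inverse roots, |lam| < 1

real = lam[np.abs(lam.imag) < 1e-8].real[0]        # persistent real root
cplx = lam[lam.imag > 1e-8][0]                     # one root of the pair

secular_hl = np.log(2) / -np.log(abs(real))        # half-life of the trend part
cyclical_hl = np.log(2) / -np.log(abs(cplx))       # damping of the cycle
period = 2 * np.pi / np.angle(cplx)                # cycle length in quarters

print(f"secular half-life {secular_hl:.1f} q, "
      f"cyclical half-life {cyclical_hl:.1f} q, period {period:.1f} q")
```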
