Econometrics, Volume 4, Issue 1 (March 2016) – 18 articles

  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • PDF is the official format for papers published in both HTML and PDF forms. To view a paper in PDF format, click on the "PDF Full-text" link and open it with the free Adobe Reader.
Article
Bayesian Calibration of Generalized Pools of Predictive Distributions
by Roberto Casarin, Giulia Mantoan and Francesco Ravazzolo
Econometrics 2016, 4(1), 17; https://doi.org/10.3390/econometrics4010017 - 16 Mar 2016
Cited by 11 | Viewed by 7204
Abstract
Decision-makers often consult different experts to build reliable forecasts on variables of interest. Combining multiple opinions and calibrating them to maximize forecast accuracy is consequently a crucial issue in several economic problems. This paper applies a Bayesian beta mixture model to derive a combined and calibrated density function using random calibration functionals and random combination weights. In particular, it compares the application of linear, harmonic and logarithmic pooling in the Bayesian combination approach. The three combination schemes, i.e., linear, harmonic and logarithmic, are studied in simulation examples with multimodal densities and in an empirical application with a large database of stock data. All of the experiments show that, in a beta mixture calibration framework, the three combination schemes are substantially equivalent, all achieving calibration, and no clear preference for one of them emerges. The financial application shows that linear pooling together with beta mixture calibration achieves the best results in terms of calibrated forecasts. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
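For readers who want to see the three pooling schemes side by side, here is a minimal numerical sketch (a fixed equal weight and two hypothetical Gaussian experts, purely for illustration; the paper treats both the calibration functionals and the combination weights as random within a Bayesian beta mixture):

```python
import numpy as np
from scipy import stats

# Two hypothetical expert predictive densities on a common grid.
x = np.linspace(-8.0, 8.0, 1601)
dx = x[1] - x[0]
p1 = stats.norm.pdf(x, loc=-1.0, scale=1.0)
p2 = stats.norm.pdf(x, loc=2.0, scale=1.5)
w = 0.5  # fixed illustrative weight; the paper treats weights as random

def normalize(f):
    """Rescale a nonnegative function on the grid so it integrates to one."""
    return f / (f.sum() * dx)

linear = w * p1 + (1.0 - w) * p2                       # weighted average
harmonic = normalize(1.0 / (w / p1 + (1.0 - w) / p2))  # weighted harmonic mean
logarithmic = normalize(p1**w * p2**(1.0 - w))         # weighted geometric mean
```

The linear pool is a weighted average of densities, the harmonic pool a weighted harmonic mean, and the logarithmic pool a weighted geometric mean; the latter two need renormalization to integrate to one.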
Article
The Evolving Transmission of Uncertainty Shocks in the United Kingdom
by Haroon Mumtaz
Econometrics 2016, 4(1), 16; https://doi.org/10.3390/econometrics4010016 - 14 Mar 2016
Cited by 10 | Viewed by 5889
Abstract
This paper investigates if the impact of uncertainty shocks on the U.K. economy has changed over time. To this end, we propose an extended time-varying VAR model that simultaneously allows the estimation of a measure of uncertainty and its time-varying impact on key macroeconomic and financial variables. We find that the impact of uncertainty shocks on these variables has declined over time. The timing of the change coincides with the introduction of inflation targeting in the U.K. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
Article
Timing Foreign Exchange Markets
by Samuel W. Malone, Robert B. Gramacy and Enrique Ter Horst
Econometrics 2016, 4(1), 15; https://doi.org/10.3390/econometrics4010015 - 11 Mar 2016
Cited by 3 | Viewed by 7376
Abstract
To improve short-horizon exchange rate forecasts, we employ foreign exchange market risk factors as fundamentals, and Bayesian treed Gaussian process (BTGP) models to handle non-linear, time-varying relationships between these fundamentals and exchange rates. Forecasts from the BTGP model conditional on the carry and dollar factors dominate random walk forecasts on accuracy and economic criteria in the Meese-Rogoff setting. Superior market timing ability for large moves, more than directional accuracy, drives the BTGP’s success. We explain how, through a model averaging Monte Carlo scheme, the BTGP is able to simultaneously exploit smoothness and rough breaks in between-variable dynamics. Either feature in isolation is unable to consistently outperform benchmarks throughout the full span of time in our forecasting exercises. Trading strategies based on ex ante BTGP forecasts deliver the highest out-of-sample risk-adjusted returns for the median currency, as well as for both predictable, traded risk factors. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
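The Meese-Rogoff benchmark referenced above is the driftless random walk, whose forecast of next period's exchange rate is simply today's rate. A small sketch of the accuracy comparison, assuming model forecasts aligned with the targets they predict (function and variable names are illustrative, not the authors' code):

```python
import numpy as np

def meese_rogoff_ratio(y, yhat_model):
    """RMSE of a model's one-step forecasts relative to the random walk.

    y: series of (log) exchange rates; yhat_model: forecasts of y[1:]
    made one period earlier. A ratio below one means the model beats
    the no-change random walk benchmark.
    """
    e_model = y[1:] - yhat_model
    e_rw = y[1:] - y[:-1]  # random walk error: the realized change
    rmse = lambda e: np.sqrt(np.mean(e**2))
    return rmse(e_model) / rmse(e_rw)
```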
Article
Return and Risk of Pairs Trading Using a Simulation-Based Bayesian Procedure for Predicting Stable Ratios of Stock Prices
by David Ardia, Lukasz T. Gatarek, Lennart Hoogerheide and Herman K. Van Dijk
Econometrics 2016, 4(1), 14; https://doi.org/10.3390/econometrics4010014 - 10 Mar 2016
Cited by 3 | Viewed by 6579 | Correction
Abstract
We investigate the direct connection between the uncertainty related to estimated stable ratios of stock prices and the risk and return of two pairs trading strategies: a conditional statistical arbitrage method and an implicit arbitrage one. A simulation-based Bayesian procedure is introduced for predicting stable stock price ratios, defined in a cointegration model. Using this class of models and the proposed inferential technique, we are able to connect estimation and model uncertainty with the risk and return of stock trading. In terms of methodology, we show the effect that an encompassing prior, which is shown to be equivalent to a Jeffreys’ prior, has under an orthogonal normalization on the selection of pairs of cointegrated stock prices and, further, on the estimation and prediction of the spread between cointegrated stock prices. We distinguish between models with a normal and a Student t distribution, since the latter typically provides a better description of daily changes of prices on financial markets. As an empirical application, we use stocks that are constituents of the Dow Jones Composite Average index. The results show that normalization has little effect on the selection of pairs of cointegrated stocks on the basis of Bayes factors. However, the results stress the importance of the orthogonal normalization for the estimation and prediction of the spread—the deviation from the equilibrium relationship—which leads to better results in terms of profit per capital engagement and risk than a standard linear normalization. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
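To fix ideas about the spread and its role in pairs trading, here is a deliberately simple sketch using a plain OLS hedge ratio (a linear normalization); the paper instead infers the cointegrating relationship with a simulation-based Bayesian procedure and argues for an orthogonal normalization, so this is intuition only:

```python
import numpy as np

def pairs_signal(p1, p2, entry=2.0, exit_=0.5):
    """Toy mean-reversion signal from a cointegrating regression of two prices.

    p1, p2: aligned price series (numpy arrays). The hedge ratio comes
    from a plain OLS fit, purely for intuition.
    """
    beta = np.polyfit(p2, p1, 1)[0]                  # OLS hedge ratio
    spread = p1 - beta * p2                          # deviation from equilibrium
    z = (spread - spread.mean()) / spread.std()      # standardized spread
    position, pos = np.zeros_like(z), 0.0
    for t, zt in enumerate(z):
        if pos == 0.0 and abs(zt) > entry:
            pos = -np.sign(zt)                       # bet on reversion
        elif pos != 0.0 and abs(zt) < exit_:
            pos = 0.0                                # close near equilibrium
        position[t] = pos
    return spread, position
```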
Article
Bayesian Nonparametric Measurement of Factor Betas and Clustering with Application to Hedge Fund Returns
by Urbi Garay, Enrique Ter Horst, German Molina and Abel Rodriguez
Econometrics 2016, 4(1), 13; https://doi.org/10.3390/econometrics4010013 - 08 Mar 2016
Cited by 1 | Viewed by 7004
Abstract
We define a dynamic and self-adjusting mixture of Gaussian Graphical Models to cluster financial returns, and provide a new method for extraction of nonparametric estimates of dynamic alphas (excess return) and betas (to a choice set of explanatory factors) in a multivariate setting. This approach, as well as the outputs, has a dynamic, nonstationary and nonparametric form, which circumvents the problem of model risk and parametric assumptions that the Kalman filter and other widely used approaches rely on. The by-product of clusters, used for shrinkage and information borrowing, can be of use to determine relationships around specific events. This approach exhibits a smaller Root Mean Squared Error than traditionally used benchmarks in financial settings, which we illustrate through simulation. As an illustration, we use hedge fund index data, and find that our estimated alphas are, on average, 0.13% per month higher (1.6% per year) than alphas estimated through Ordinary Least Squares. The approach exhibits fast adaptation to abrupt changes in the parameters, as seen in our estimated alphas and betas, which exhibit high volatility, especially in periods which can be identified as times of stressful market events, a reflection of the dynamic positioning of hedge fund portfolio managers. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
Article
Evolutionary Sequential Monte Carlo Samplers for Change-Point Models
by Arnaud Dufays
Econometrics 2016, 4(1), 12; https://doi.org/10.3390/econometrics4010012 - 08 Mar 2016
Cited by 6 | Viewed by 6122
Abstract
Sequential Monte Carlo (SMC) methods are widely used for non-linear filtering purposes. However, the scope of SMC encompasses wider applications, such as estimating static model parameters, so much so that it is becoming a serious alternative to Markov chain Monte Carlo (MCMC) methods. Not only do SMC algorithms draw from posterior distributions of static or dynamic parameters, but they additionally provide an estimate of the marginal likelihood. The tempered and time (TNT) algorithm, developed in this paper, combines (off-line) tempered SMC inference with on-line SMC inference for drawing realizations from many sequential posterior distributions without experiencing a particle degeneracy problem. Furthermore, it introduces a new MCMC rejuvenation step that is generic, automated and well-suited for multi-modal distributions. As this update relies on the wide heuristic optimization literature, numerous extensions are readily available. The algorithm is notably appropriate for estimating change-point models. As an example, we compare several change-point GARCH models through their marginal log-likelihoods over time. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
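A generic tempered SMC sampler (not the TNT algorithm itself, whose rejuvenation step draws on heuristic optimization) can be sketched in a few lines; this toy version moves particles from a wide Gaussian starting distribution to a bimodal target through a fixed tempering ladder, resampling and applying a random-walk Metropolis move at each step:

```python
import numpy as np

rng = np.random.default_rng(0)
N, sigma0 = 2000, 10.0

def log_prior(v):
    """Wide Gaussian starting distribution N(0, sigma0^2), up to a constant."""
    return -0.5 * (v / sigma0) ** 2

def log_target(v):
    """Bimodal toy posterior kernel: mixture of N(-3, 1) and N(3, 1)."""
    return np.logaddexp(-0.5 * (v - 3.0) ** 2, -0.5 * (v + 3.0) ** 2)

x = rng.normal(0.0, sigma0, N)
logw = np.zeros(N)
phis = np.linspace(0.0, 1.0, 21)
for phi_prev, phi in zip(phis[:-1], phis[1:]):
    # Reweight: tempered density is prior^(1-phi) * target^phi (unnormalized).
    logw += (phi - phi_prev) * (log_target(x) - log_prior(x))
    w = np.exp(logw - logw.max())
    w /= w.sum()
    x = x[rng.choice(N, size=N, p=w)]  # multinomial resampling
    logw[:] = 0.0
    # Rejuvenate with one random-walk Metropolis step at temperature phi.
    prop = x + rng.normal(0.0, 1.0, N)
    log_alpha = ((1 - phi) * (log_prior(prop) - log_prior(x))
                 + phi * (log_target(prop) - log_target(x)))
    accept = np.log(rng.uniform(size=N)) < log_alpha
    x = np.where(accept, prop, x)
# x now approximates draws from the bimodal target, covering both modes.
```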
Editorial
Spatial Econometrics: A Rapidly Evolving Discipline
by Giuseppe Arbia
Econometrics 2016, 4(1), 18; https://doi.org/10.3390/econometrics4010018 - 07 Mar 2016
Cited by 7 | Viewed by 6602
Abstract
Spatial econometrics has a relatively short history in the landscape of scientific thought. Indeed, the term “spatial econometrics” was introduced only forty years ago, during the general address delivered by Jean Paelinck to the annual meeting of the Dutch Statistical Association in May 1974 (see [1]). [...] Full article
(This article belongs to the Special Issue Spatial Econometrics)
Article
Parallelization Experience with Four Canonical Econometric Models Using ParMitISEM
by Nalan Baştürk, Stefano Grassi, Lennart Hoogerheide and Herman K. Van Dijk
Econometrics 2016, 4(1), 11; https://doi.org/10.3390/econometrics4010011 - 07 Mar 2016
Cited by 3 | Viewed by 6225
Abstract
This paper presents the parallel computing implementation of the MitISEM algorithm, labeled Parallel MitISEM. The basic MitISEM algorithm provides an automatic and flexible method to approximate a non-elliptical target density using adaptive mixtures of Student-t densities, where only a kernel of the target density is required. The approximation can be used as a candidate density in Importance Sampling or Metropolis-Hastings methods for Bayesian inference on model parameters and probabilities. We present and discuss four canonical econometric models using a Graphics Processing Unit and a multi-core Central Processing Unit version of the MitISEM algorithm. The results show that the parallelization of the MitISEM algorithm on Graphics Processing Units and multi-core Central Processing Units is straightforward and fast to program using MATLAB. Moreover, the speed of the Graphics Processing Unit version is much higher than that of the Central Processing Unit version. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
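The building block that MitISEM generalizes is importance sampling with a Student-t candidate (the mixture being fitted adaptively by EM); a single-component sketch of that building block, where the toy target kernel and all tuning values are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def log_kernel(v):
    """Unnormalized, mildly skewed toy posterior kernel."""
    return -0.5 * v**2 + np.logaddexp(0.0, 2.0 * v) - v

# Draw from a Student-t candidate and compute importance weights.
nu, mu, scale, N = 5.0, 0.0, 2.0, 100_000
draws = mu + scale * rng.standard_t(nu, N)
logw = log_kernel(draws) - stats.t.logpdf(draws, nu, loc=mu, scale=scale)
w = np.exp(logw - logw.max())
w /= w.sum()

posterior_mean = np.sum(w * draws)   # self-normalized IS estimate
ess = 1.0 / np.sum(w**2)             # effective sample size diagnostic
```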
Article
Sequentially Adaptive Bayesian Learning for a Nonlinear Model of the Secular and Cyclical Behavior of US Real GDP
by John Geweke
Econometrics 2016, 4(1), 10; https://doi.org/10.3390/econometrics4010010 - 02 Mar 2016
Cited by 3 | Viewed by 5143
Abstract
There is a one-to-one mapping between the conventional time series parameters of a third-order autoregression and the more interpretable parameters of secular half-life, cyclical half-life and cycle period. The latter parameterization is better suited to the interpretation of results using both Bayesian and maximum likelihood methods and to the expression of a substantive prior distribution using Bayesian methods. The paper demonstrates how to approach both problems using the sequentially adaptive Bayesian learning (SABL) algorithm and software, which eliminates virtually all of the substantial technical overhead required in conventional approaches and produces results quickly and reliably. The work utilizes methodological innovations in SABL, including the optimization of irregular and multimodal functions and the production of the conventional maximum likelihood asymptotic variance matrix as a by-product. Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
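The one-to-one mapping mentioned in the abstract can be made concrete by factoring the AR(3) lag polynomial into one real root and a complex-conjugate pair: the real root's modulus gives the secular half-life, while the complex pair's modulus and argument give the cyclical half-life and cycle period. A sketch of that mapping (the example coefficients are hypothetical, and the root-based mapping is the standard interpretation rather than a quotation of the paper's formulas):

```python
import numpy as np

def ar3_interpretation(phi1, phi2, phi3):
    """Map AR(3) coefficients to (secular half-life, cyclical half-life, period).

    Assumes the autoregressive polynomial z^3 - phi1 z^2 - phi2 z - phi3
    has one real root and one complex-conjugate pair.
    """
    roots = np.roots([1.0, -phi1, -phi2, -phi3])
    is_real = np.abs(roots.imag) < 1e-8
    real, cplx = roots[is_real].real, roots[~is_real]
    if real.size != 1 or cplx.size != 2:
        raise ValueError("expected one real root and a complex pair")
    half_life = lambda m: np.log(0.5) / np.log(m)  # periods to halve amplitude
    secular = half_life(abs(real[0]))
    cyclical = half_life(abs(cplx[0]))
    period = 2.0 * np.pi / abs(np.angle(cplx[0]))  # cycle length in periods
    return secular, cyclical, period

# Hypothetical coefficients: a persistent secular root plus a damped cycle.
print(ar3_interpretation(1.4, -0.7, 0.2))
```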
Article
Volatility Forecasting: Downside Risk, Jumps and Leverage Effect
by Francesco Audrino and Yujia Hu
Econometrics 2016, 4(1), 8; https://doi.org/10.3390/econometrics4010008 - 23 Feb 2016
Cited by 40 | Viewed by 8527
Abstract
We provide empirical evidence of volatility forecasting in relation to asymmetries present in the dynamics of both return and volatility processes. Using recently-developed methodologies to detect jumps from high-frequency price data, we estimate the size of positive and negative jumps and propose a methodology to estimate the size of jumps in the quadratic variation. The leverage effect is separated into continuous and discontinuous effects, and past volatility is separated into “good” and “bad”, as well as into continuous and discontinuous risks. Using a long history of the S&P 500 price index, we find that the continuous leverage effect lasts about one week, while the discontinuous leverage effect disappears after one day. “Good” and “bad” continuous risks both characterize the volatility persistence, while “bad” jump risk is much more informative than “good” jump risk in forecasting future volatility. The volatility forecasting model proposed is able to capture many empirical stylized facts while still remaining parsimonious in terms of the number of parameters to be estimated. Full article
(This article belongs to the Special Issue Financial High-Frequency Data)
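The “good”/“bad” and continuous/discontinuous decompositions rest on standard realized-measure building blocks: signed semivariances split realized variance by return sign, and bipower variation is robust to jumps, so the gap between the two estimates the jump component. A minimal sketch from one day of intraday returns (the paper's actual jump-detection methodology is more refined):

```python
import numpy as np

def realized_measures(r):
    """Daily realized measures from a vector of intraday returns r.

    Returns realized variance, the "good" and "bad" semivariances, and a
    crude jump estimate from the gap between RV and bipower variation.
    """
    rv = np.sum(r**2)                      # realized variance
    rs_good = np.sum(r[r > 0] ** 2)        # upside ("good") semivariance
    rs_bad = np.sum(r[r < 0] ** 2)         # downside ("bad") semivariance
    bpv = (np.pi / 2.0) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))  # jump-robust
    jump = max(rv - bpv, 0.0)              # continuous/discontinuous split
    return rv, rs_good, rs_bad, jump
```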
Editorial
Computational Complexity and Parallelization in Bayesian Econometric Analysis
by Nalan Baştürk, Roberto Casarin, Francesco Ravazzolo and Herman K. Van Dijk
Econometrics 2016, 4(1), 9; https://doi.org/10.3390/econometrics4010009 - 22 Feb 2016
Viewed by 5196
Abstract
Challenging statements have appeared in recent years in the literature on advances in computational procedures. [...] Full article
(This article belongs to the Special Issue Computational Complexity in Bayesian Econometric Analysis)
Article
Multiple Discrete Endogenous Variables in Weakly-Separable Triangular Models
by Sung Jae Jun, Joris Pinkse, Haiqing Xu and Neşe Yıldız
Econometrics 2016, 4(1), 7; https://doi.org/10.3390/econometrics4010007 - 04 Feb 2016
Cited by 7 | Viewed by 5871
Abstract
We consider a model in which an outcome depends on two discrete treatment variables, where one treatment is given before the other. We formulate a three-equation triangular system with weak separability conditions. Without assuming assignment is random, we establish the identification of an average structural function using two-step matching. We also consider decomposing the effect of the first treatment into direct and indirect effects, which are shown to be identified by the proposed methodology. We allow for both of the treatment variables to be non-binary and do not appeal to an identification-at-infinity argument. Full article
(This article belongs to the Special Issue Discrete Choice Modeling)
Article
Functional-Coefficient Spatial Durbin Models with Nonparametric Spatial Weights: An Application to Economic Growth
by Mustafa Koroglu and Yiguo Sun
Econometrics 2016, 4(1), 6; https://doi.org/10.3390/econometrics4010006 - 03 Feb 2016
Cited by 7 | Viewed by 6314
Abstract
This paper considers a functional-coefficient spatial Durbin model with nonparametric spatial weights. Applying the series approximation method, we estimate the unknown functional coefficients and spatial weighting functions via a nonparametric two-stage least squares (or 2SLS) estimation method. To further improve estimation accuracy, we also construct a second-step estimator of the unknown functional coefficients by a local linear regression approach. Some Monte Carlo simulation results are reported to assess the finite sample performance of our proposed estimators. We then apply the proposed model to re-examine national economic growth by augmenting the conventional Solow economic growth convergence model with unknown spatial interactive structures of the national economy, as well as country-specific Solow parameters, where the spatial weighting functions and Solow parameters are allowed to be a function of geographical distance and the countries’ openness to trade, respectively. Full article
(This article belongs to the Special Issue Nonparametric Methods in Econometrics)
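For reference, the fixed-coefficient spatial Durbin structure that the paper generalizes (by letting the coefficients and the spatial weights be unknown functions) is y = ρWy + Xβ + WXθ + ε. A small simulation sketch of that constant-parameter skeleton (names are illustrative; nothing here reproduces the paper's estimators):

```python
import numpy as np

def simulate_sdm(W, X, rho, beta, theta, sigma=1.0, rng=None):
    """Simulate y = rho*W*y + X*beta + W*X*theta + eps in reduced form.

    W: n-by-n spatial weight matrix (row-standardized, |rho| < 1 assumed
    so that I - rho*W is invertible); X: n-by-k regressor matrix.
    """
    rng = rng or np.random.default_rng()
    n = W.shape[0]
    eps = rng.normal(0.0, sigma, n)
    return np.linalg.solve(np.eye(n) - rho * W, X @ beta + W @ X @ theta + eps)
```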
Editorial
Acknowledgement to Reviewers of Econometrics in 2015
by Econometrics Editorial Office
Econometrics 2016, 4(1), 5; https://doi.org/10.3390/econometrics4010005 - 25 Jan 2016
Cited by 2 | Viewed by 3773
Abstract
The editors of Econometrics would like to express their sincere gratitude to the following reviewers for assessing manuscripts in 2015. [...] Full article
Article
A Conditional Approach to Panel Data Models with Common Shocks
by Giovanni Forchini and Bin Peng
Econometrics 2016, 4(1), 4; https://doi.org/10.3390/econometrics4010004 - 12 Jan 2016
Cited by 4 | Viewed by 4841
Abstract
This paper studies the effects of common shocks on the OLS estimators of the slope parameters in linear panel data models. The shocks are assumed to affect both the errors and some of the explanatory variables. In contrast to existing approaches, which rely on results for martingale difference sequences, our method relies on conditional strong laws of large numbers and conditional central limit theorems for conditionally-heterogeneous random variables. Full article
Article
Forecasting Value-at-Risk under Different Distributional Assumptions
by Manuela Braione and Nicolas K. Scholtes
Econometrics 2016, 4(1), 3; https://doi.org/10.3390/econometrics4010003 - 11 Jan 2016
Cited by 37 | Viewed by 12209
Abstract
Financial asset returns are known to be conditionally heteroskedastic and generally non-normally distributed, fat-tailed and often skewed. These features must be taken into account to produce accurate forecasts of Value-at-Risk (VaR). We provide a comprehensive look at the problem by considering the impact that different distributional assumptions have on the accuracy of both univariate and multivariate GARCH models in out-of-sample VaR prediction. The set of analyzed distributions comprises the normal, the Student and the Multivariate Exponential Power, together with their corresponding skewed counterparts. The accuracy of the VaR forecasts is assessed by implementing standard statistical backtesting procedures used to rank the different specifications. The results show the importance of allowing for heavy tails and skewness in the distributional assumption, with the skew-Student outperforming the others across all tests and confidence levels. Full article
(This article belongs to the Special Issue Recent Developments of Financial Econometrics)
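As a concrete instance of the exercise described above, a one-step-ahead VaR forecast from a univariate GARCH(1,1) under normal versus standardized Student-t innovations can be sketched as follows (parameters are taken as given rather than estimated, and names are illustrative; the paper also covers multivariate and skewed specifications):

```python
import numpy as np
from scipy import stats

def garch_var(returns, omega, alpha, beta, level=0.01, nu=None):
    """One-step-ahead VaR from a GARCH(1,1) with given parameters.

    nu=None assumes standard normal innovations; otherwise a standardized
    Student-t with nu > 2 degrees of freedom (unit variance enforced).
    """
    h = np.var(returns)                          # initialize the recursion
    for r in returns:
        h = omega + alpha * r**2 + beta * h      # conditional variance update
    if nu is None:
        q = stats.norm.ppf(level)
    else:
        q = stats.t.ppf(level, nu) * np.sqrt((nu - 2.0) / nu)
    return q * np.sqrt(h)                        # return-quantile VaR
```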
Article
How Credible Are Shrinking Wage Elasticities of Married Women Labour Supply?
by Duo Qin, Sophie Van Huellen and Qing-Chao Wang
Econometrics 2016, 4(1), 1; https://doi.org/10.3390/econometrics4010001 - 25 Dec 2015
Cited by 5 | Viewed by 6460
Abstract
This paper delves into the well-known phenomenon of shrinking wage elasticities for married women in the US over recent decades. The results of a novel model experimental approach via sample data ordering unveil considerable heterogeneity across different wage groups. Yet, surprisingly constant wage elasticity estimates are maintained within certain wage groups over time. In addition to those constant wage elasticity estimates, we find that the composition of working women into different wage groups has changed considerably, resulting in shrinking wage elasticity estimates at the aggregate level. These findings would be impossible to obtain had we not dismantled and discarded the instrumental variable estimation route. Full article
Article
Interpretation and Semiparametric Efficiency in Quantile Regression under Misspecification
by Ying-Ying Lee
Econometrics 2016, 4(1), 2; https://doi.org/10.3390/econometrics4010002 - 24 Dec 2015
Cited by 6 | Viewed by 6328
Abstract
Allowing for misspecification in the linear conditional quantile function, this paper provides a new interpretation and the semiparametric efficiency bound for the quantile regression parameter β(τ) in Koenker and Bassett (1978). The first result on interpretation shows that under a mean-squared loss function, the probability limit of the Koenker–Bassett estimator minimizes a weighted distribution approximation error, defined as \(F_{Y}(X'\beta(\tau)|X) - \tau\), i.e., the deviation of the conditional distribution function, evaluated at the linear quantile approximation, from the quantile level. The second result implies that the Koenker–Bassett estimator semiparametrically efficiently estimates the quantile regression parameter that produces parsimonious descriptive statistics for the conditional distribution. Therefore, quantile regression shares the attractive features of ordinary least squares: interpretability and semiparametric efficiency under misspecification. Full article
(This article belongs to the Special Issue Quantile Methods)
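The Koenker–Bassett estimator referred to above minimizes the check (pinball) loss. A compact sketch for intuition, where a derivative-free optimizer stands in for the linear-programming solvers used in practice (e.g., statsmodels' QuantReg):

```python
import numpy as np
from scipy.optimize import minimize

def check_loss(u, tau):
    """Koenker-Bassett check function: rho_tau(u) = u * (tau - 1{u < 0})."""
    return u * (tau - (u < 0))

def quantile_regression(y, X, tau):
    """Linear quantile regression by direct check-loss minimization.

    OLS coefficients seed a Nelder-Mead search; real implementations
    solve the equivalent linear program instead.
    """
    beta0 = np.linalg.lstsq(X, y, rcond=None)[0]
    objective = lambda b: np.sum(check_loss(y - X @ b, tau))
    return minimize(objective, beta0, method="Nelder-Mead").x
```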