Special Issue "Celebrated Econometricians: Peter Phillips"

A special issue of Econometrics (ISSN 2225-1146).

Deadline for manuscript submissions: closed (31 October 2018).

Special Issue Editors

Federico Bandi
Guest Editor
Johns Hopkins Carey Business School, USA
Interests: finance and econometrics
Alex Maynard
Guest Editor
Department of Economics and Finance, University of Guelph, Canada
Interests: applied econometrics; econometrics
Hyungsik Roger Moon
Guest Editor
Department of Economics, University of Southern California, USA
Interests: econometric theory; applied econometrics
Benoit Perron
Guest Editor
Department of Economics, Université de Montréal, Canada
Interests: econometrics; macroeconomics; finance

Special Issue Information

Dear Colleagues,

Contributions for the Special Issue in honour of Peter Phillips should relate to an area of research in which Peter has made important contributions. These include, but are certainly not limited to: finite sample distribution theory; theory and practice of unit roots and co-integration; limit theory for near-nonstationary and near-explosive processes; financial econometrics, including detecting and date stamping speculative financial bubbles; model selection methods; continuous time econometric modelling; panel data econometrics; fractional integration and long memory; non-parametric estimation; instrumental variable methods; identification issues.

Informal enquiries as to the scope and suitability of a potential submission should first be made to: [email protected], [email protected], and [email protected].

Prof. Federico Bandi
Prof. Alex Maynard
Prof. Hyungsik Roger Moon
Prof. Benoit Perron
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once registered, you can proceed to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Econometrics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Published Papers (9 papers)


Research

Open Access Article
Panel Data Estimation for Correlated Random Coefficients Models
Econometrics 2019, 7(1), 7; https://doi.org/10.3390/econometrics7010007 - 01 Feb 2019
Cited by 2
Abstract
This paper considers methods of estimating a static correlated random coefficient model with panel data. We mainly focus on comparing two approaches to estimating the unconditional mean of the coefficients in correlated random coefficients models: the group mean estimator and the generalized least squares estimator. For the group mean estimator, we show that it asymptotically achieves the Chamberlain (1992) semi-parametric efficiency bound. For the generalized least squares estimator, we show that when T is large, a generalized least squares estimator that ignores the correlation between the individual coefficients and regressors is asymptotically equivalent to the group mean estimator. In addition, we give conditions under which the standard within estimator of the mean of the coefficients is consistent. Moreover, with additional assumptions on the known correlation pattern, we derive the asymptotic properties of panel least squares estimators. Simulations are used to examine the finite-sample performance of the different estimators.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
Open Access Article
Information Flow in Times of Crisis: The Case of the European Banking and Sovereign Sectors
Econometrics 2019, 7(1), 5; https://doi.org/10.3390/econometrics7010005 - 17 Jan 2019
Abstract
Crises in the banking and sovereign debt sectors give rise to heightened financial fragility. Of particular concern is the development of self-fulfilling feedback loops where crisis conditions in one sector are transmitted to the other sector and back again. We use time-varying tests of Granger causality to demonstrate how empirical evidence of connectivity between the banking and sovereign sectors can be detected, and provide an application to the Greek, Irish, Italian, Portuguese and Spanish (GIIPS) countries and Germany over the period 2007 to 2016. While the results provide evidence of domestic feedback loops, the most important finding is that financial fragility is an international problem and cannot be dealt with purely on a country-by-country basis.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
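The building block of tests like those above, an F-test of whether lagged values of one series improve the prediction of another, can be sketched over rolling sub-samples. This is a simplified illustration of the general idea, not the paper's procedure (which uses recursive, time-varying variants); all function names here are hypothetical.

```python
import numpy as np

def granger_f(y, x, lags=1):
    """F statistic for the null that lags of x do not help predict y,
    given lags of y: compare restricted (lags of y only) and
    unrestricted (lags of y and x) OLS regressions."""
    T = len(y)
    Y = y[lags:]
    # Restricted regressors: constant plus lags of y
    Z_r = np.column_stack([np.ones(T - lags)] +
                          [y[lags - j - 1:T - j - 1] for j in range(lags)])
    # Unrestricted: add lags of x
    Z_u = np.column_stack([Z_r] +
                          [x[lags - j - 1:T - j - 1] for j in range(lags)])
    def ssr(Z):
        beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        e = Y - Z @ beta
        return e @ e
    ssr_r, ssr_u = ssr(Z_r), ssr(Z_u)
    q = lags                        # number of restrictions
    df = len(Y) - Z_u.shape[1]      # residual degrees of freedom
    return (ssr_r - ssr_u) / q / (ssr_u / df)

def rolling_granger(y, x, window=100, lags=1):
    """F statistics over rolling sub-samples of fixed length."""
    return np.array([granger_f(y[s:s + window], x[s:s + window], lags)
                     for s in range(len(y) - window + 1)])
```

Plotting the rolling statistics against the relevant critical value gives a crude picture of when predictive connectivity between two series switches on and off.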
Open Access Article
On the Stock–Yogo Tables
Econometrics 2018, 6(4), 44; https://doi.org/10.3390/econometrics6040044 - 13 Nov 2018
Cited by 1
Abstract
A standard test for weak instruments compares the first-stage F-statistic to a table of critical values obtained by Stock and Yogo (2005) using simulations. We derive a closed-form solution for the expectation from which these critical values are derived, as well as present some second-order asymptotic approximations that may be of value in the presence of multiple endogenous regressors. Inspection of this new result provides insights not available from simulation, and will allow software implementations to be generalised and improved. Finally, we explore the calculation of p-values for the first-stage F-statistic weak instruments test.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
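The first-stage F-statistic that is compared against the Stock-Yogo tables is straightforward to compute: regress the endogenous regressor on the instruments (plus any included controls) and test the instruments' joint significance. A minimal sketch under a single endogenous regressor, with hypothetical function names:

```python
import numpy as np

def first_stage_f(x_endog, Z, W=None):
    """First-stage F statistic for the joint significance of instruments
    Z in the regression of the endogenous regressor on Z and controls W
    (W defaults to a constant only). A small value signals weak
    instruments; Stock and Yogo tabulate the relevant critical values."""
    n = len(x_endog)
    if W is None:
        W = np.ones((n, 1))
    X_u = np.hstack([W, Z])           # unrestricted: controls + instruments
    def ssr(X):
        b, *_ = np.linalg.lstsq(X, x_endog, rcond=None)
        e = x_endog - X @ b
        return e @ e
    ssr_r, ssr_u = ssr(W), ssr(X_u)
    q = Z.shape[1]                    # number of excluded instruments
    df = n - X_u.shape[1]             # residual degrees of freedom
    return (ssr_r - ssr_u) / q / (ssr_u / df)
```

In practice this statistic would be compared against the Stock-Yogo critical value for the chosen bias or size tolerance, not against the usual F distribution quantiles.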
Open Access Article
Estimation of Treatment Effects in Repeated Public Goods Experiments
Econometrics 2018, 6(4), 43; https://doi.org/10.3390/econometrics6040043 - 29 Oct 2018
Abstract
This paper provides a new statistical model for repeated voluntary contribution mechanism games. In a repeated public goods experiment, contributions in the first round are cross-sectionally independent simply because subjects are randomly selected. Meanwhile, contributions to a public account over rounds are serially and cross-sectionally correlated. Furthermore, the cross-sectional average of the contributions across subjects usually decreases over rounds. By considering this non-stationary initial condition—the initial contribution has a different distribution from the rest of the contributions—we model statistically the time-varying patterns of the average contribution in repeated public goods experiments and then propose a simple but efficient method to test for treatment effects. The suggested method has good finite sample performance and works well in practice.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
Open Access Article
Econometric Fine Art Valuation by Combining Hedonic and Repeat-Sales Information
Econometrics 2018, 6(3), 32; https://doi.org/10.3390/econometrics6030032 - 24 Jun 2018
Abstract
Statistical methods are widely used for valuation (prediction of the value at sale or auction) of a unique object such as a work of art. The usual approach is estimation of a hedonic model for objects of a given class, such as paintings from a particular school or period, or in the context of real estate, houses in a neighborhood. Where the object itself has previously sold, an alternative is to base an estimate on the previous sale price. The combination of these approaches has been employed in real estate price index construction (e.g., Jiang et al. 2015); in the present context, we treat the use of these different sources of information as a forecast combination problem. We first optimize the hedonic model, considering the level of aggregation that is appropriate for pooling observations into a sample, and applying model-averaging methods to estimate predictive models at the individual-artist level. Next, we consider an additional stage in which we incorporate repeat-sale information, in a subset of cases for which this information is available. The methods are applied to a data set of auction prices for Canadian paintings. We compare the out-of-sample predictive accuracy of different methods and find that those that allow us to use single-artist samples produce superior results, that data-driven averaging across predictive models tends to produce clear gains, and that, where available, repeat-sale information appears to yield further improvements in predictive accuracy.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
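The forecast combination idea, weighting a hedonic prediction against a repeat-sale prediction, can be illustrated with a simple Bates-Granger style weight chosen from past forecast errors. This is a generic sketch of combining any two competing forecasts, not the paper's averaging scheme; the function name is hypothetical.

```python
import numpy as np

def combination_weight(errors_a, errors_b):
    """Weight w in [0, 1] on forecast A when combining two forecasts as
    w*A + (1-w)*B, chosen to minimize the mean squared error of the
    combination on historical forecast errors (Bates-Granger style)."""
    d = errors_a - errors_b
    denom = d @ d
    if denom == 0.0:
        return 0.5  # forecasts identical; any weight is equivalent
    # Combined error is errors_b + w*d; minimize its squared norm over w
    w = -(errors_b @ d) / denom
    return float(np.clip(w, 0.0, 1.0))
```

By construction, the in-sample MSE of the combined forecast is never worse than that of either component, since w = 1 and w = 0 recover the components themselves.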
Open Access Article
Jackknife Bias Reduction in the Presence of a Near-Unit Root
Econometrics 2018, 6(1), 11; https://doi.org/10.3390/econometrics6010011 - 05 Mar 2018
Cited by 2
Abstract
This paper considers the specification and performance of jackknife estimators of the autoregressive coefficient in a model with a near-unit root. The limit distributions of the sub-sample estimators that are used in the construction of the jackknife estimator are derived, and the joint moment generating function (MGF) of two components of these distributions is obtained and its properties explored. The MGF can be used to derive the weights for an optimal jackknife estimator that fully removes the first-order finite sample bias from the estimator. The resulting jackknife estimator is shown to perform well in finite samples and, with a suitable choice of the number of sub-samples, is shown to reduce the overall finite sample root mean squared error as well as the bias. However, the optimal jackknife weights rely on knowledge of the near-unit root parameter and of a quantity related to the long-run variance of the disturbance process, both of which are typically unknown in practice; this dependence is therefore characterised fully, with a discussion of the issues that arise in the most general settings.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
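For intuition, the standard equal-weight jackknife that such estimators build on can be sketched for an AR(1): combine the full-sample OLS estimate with estimates from non-overlapping sub-samples so that the first-order bias terms cancel. The optimal weights derived in the paper for the near-unit-root case differ from these; the names below are illustrative.

```python
import numpy as np

def ar1_ols(y):
    """OLS estimate of rho in y_t = rho * y_{t-1} + e_t (no intercept)."""
    return (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])

def jackknife_ar1(y, m=2):
    """Equal-weight jackknife: since the first-order bias of the OLS
    estimator is O(1/T), combining the full-sample estimate with the
    average of m non-overlapping sub-sample estimates as
    (m/(m-1))*full - mean(subs)/(m-1) cancels that leading bias term."""
    T = len(y)
    full = ar1_ols(y)
    ell = T // m                       # sub-sample length
    subs = [ar1_ols(y[i * ell:(i + 1) * ell]) for i in range(m)]
    return (m / (m - 1)) * full - np.mean(subs) / (m - 1)
```

The bias reduction comes at the cost of some extra variance, which is why the choice of m matters for root mean squared error.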
Open Access Article
Bayesian Analysis of Bubbles in Asset Prices
Econometrics 2017, 5(4), 47; https://doi.org/10.3390/econometrics5040047 - 23 Oct 2017
Cited by 2
Abstract
We develop a new model where the dynamic structure of the asset price, after the fundamental value is removed, is subject to two different regimes. One regime reflects the normal period, where the asset price divided by the dividend is assumed to follow a mean-reverting process around a stochastic long run mean. The second regime reflects the bubble period, with explosive behavior. Stochastic switches between the two regimes and non-constant probabilities of exit from the bubble regime are both allowed. A Bayesian learning approach is employed to jointly estimate the latent states and the model parameters in real time. An important feature of our Bayesian method is that we are able to deal with parameter uncertainty and, at the same time, to learn about the states and the parameters sequentially, allowing for real time model analysis. This feature is particularly useful for market surveillance. Analysis using simulated data reveals that our method has good power properties for detecting bubbles. Empirical analysis using price-dividend ratios of the S&P 500 highlights the advantages of our method.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
Open Access Article
Autoregressive Lag-Order Selection Using Conditional Saddlepoint Approximations
Econometrics 2017, 5(3), 43; https://doi.org/10.3390/econometrics5030043 - 19 Sep 2017
Abstract
A new method for determining the lag order of the autoregressive polynomial in regression models with autocorrelated normal disturbances is proposed. It is based on a sequential testing procedure using conditional saddlepoint approximations and permits the desire for parsimony to be explicitly incorporated, unlike penalty-based model selection methods. Extensive simulation results indicate that the new method is usually competitive with, and often better than, common model selection methods.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)
Open Access Article
Unit Roots in Economic and Financial Time Series: A Re-Evaluation at the Decision-Based Significance Levels
Econometrics 2017, 5(3), 41; https://doi.org/10.3390/econometrics5030041 - 08 Sep 2017
Cited by 10
Abstract
This paper re-evaluates key past results of unit root tests, emphasizing that the use of a conventional level of significance is not in general optimal due to the test having low power. The decision-based significance levels for popular unit root tests, chosen using the line of enlightened judgement under a symmetric loss function, are found to be much higher than conventional ones. We also propose simple calibration rules for the decision-based significance levels for a range of unit root tests. At the decision-based significance levels, many time series in Nelson and Plosser’s (1982) (extended) data set are judged to be trend-stationary, including real income variables, employment variables and money stock. We also find that nearly all real exchange rates covered in Elliott and Pesavento’s (2006) study are stationary; and that most of the real interest rates covered in Rapach and Weber’s (2004) study are stationary. In addition, using a specific loss function, the U.S. nominal interest rate is found to be stationary under economically sensible values of relative loss and prior belief for the null hypothesis.
(This article belongs to the Special Issue Celebrated Econometricians: Peter Phillips)