Special Issue "Financial High-Frequency Data"

A special issue of Econometrics (ISSN 2225-1146).

Deadline for manuscript submissions: closed (29 February 2016).

Special Issue Editor

Nikolaus Hautsch
Guest Editor
Department of Statistics and Operations Research, University of Vienna, Austria
Interests: financial econometrics; econometric modelling of financial high-frequency data; time series econometrics; volatility estimation; empirical market microstructure; estimation of systemic risk

Special Issue Information

Dear Colleagues,

Technological progress and the advance of fully electronic trading systems currently provide researchers with access to detailed information on financial market activity at a high-frequency level. The analysis and use of such data have triggered a new research area that is currently among the most active fields in econometrics and statistics. A major strand of research focuses on the construction of asset return volatility estimators that efficiently exploit high-frequency information while accounting for jumps and for peculiarities of the data induced by market microstructure effects. Utilizing high-frequency-based estimates in prediction models yields daily or weekly volatility forecasts that are superior to low-frequency-based predictions and are beneficial in asset pricing and risk management. High-frequency data have moreover been shown to be valuable for the estimation of high-dimensional asset return covariances. Recent research has made significant progress in constructing consistent and positive semi-definite covariance estimates in high dimensions while optimally handling the non-synchronicity and irregularity of noisy tick-by-tick data.
A further major research area focuses on the development of econometric methods to model high-frequency trading dynamics, order book processes and liquidity risks. Recent developments in financial markets, such as high-frequency trading, hidden liquidity, the increasing competition between trading platforms and the proliferation of new forms of trading, are of core interest to regulators and market supervisors and challenge theoretical and empirical research. High-frequency-based market microstructure analysis has thus emerged as a fast-developing field that is more relevant today than ever before.
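The bias that microstructure noise induces in naive high-frequency volatility estimates, and the standard remedy of sparser sampling, can be illustrated with a few lines of simulated data (a sketch only; all parameter values are hypothetical, and the estimators in the literature are considerably more refined):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 23400                      # one trading day of 1-second "ticks"
sigma = 0.2 / np.sqrt(252)     # daily volatility of the efficient price
true_iv = sigma ** 2           # integrated variance over the day

# Efficient log-price: Brownian motion; the observed price adds i.i.d. noise.
eff = np.cumsum(rng.normal(0.0, sigma / np.sqrt(n), n))
obs = eff + rng.normal(0.0, 1e-4, n)

def realized_variance(p, step):
    """Sum of squared log-returns sampled every `step` observations."""
    r = np.diff(p[::step])
    return float(np.sum(r ** 2))

rv_1s = realized_variance(obs, 1)      # noise-dominated: biased upward
rv_5min = realized_variance(obs, 300)  # sparse sampling reduces the bias

# The noise adds roughly 2 * n * Var(noise) to the 1-second estimate.
print(rv_1s, rv_5min, true_iv)
```

Sampling every tick makes the estimate pick up roughly twice the cumulative noise variance, while five-minute sampling trades a little efficiency for a much smaller bias — the tension that the noise-robust estimators discussed above are designed to resolve.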

Prof. Dr. Nikolaus Hautsch
Guest Editor

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All papers will be peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Econometrics is an international peer-reviewed open access quarterly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 1000 CHF (Swiss francs). Submitted papers should be well formatted and written in good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • high-frequency data
  • volatility
  • covariance
  • estimation
  • jumps
  • market microstructure
  • liquidity
  • high-frequency trading

Published Papers (6 papers)


Research

Open Access Article
Copula–Based vMEM Specifications versus Alternatives: The Case of Trading Activity
Econometrics 2017, 5(2), 16; https://doi.org/10.3390/econometrics5020016 - 12 Apr 2017
Cited by 8
Abstract
We discuss several multivariate extensions of the Multiplicative Error Model to take into account dynamic interdependence and contemporaneously correlated innovations (vector MEM or vMEM). We suggest copula functions to link Gamma marginals of the innovations, in a specification where past values and conditional expectations of the variables can be simultaneously estimated. Results with realized volatility, volumes and number of trades of the JNJ stock show that significantly superior realized volatility forecasts are delivered with a fully interdependent vMEM relative to a single equation. Alternatives involving log-Normal or semiparametric formulations produce substantially equivalent results.
(This article belongs to the Special Issue Financial High-Frequency Data)
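The multiplicative-error structure underlying the vMEM can be illustrated in the univariate case (a simulation sketch with illustrative parameter values, not the paper's copula-based vector specification):

```python
import numpy as np

rng = np.random.default_rng(1)
T = 5000
omega, alpha, beta = 0.1, 0.2, 0.7   # persistence: alpha + beta < 1
shape = 4.0                           # Gamma(shape, scale=1/shape) has mean 1

# MEM: x_t = mu_t * eps_t with a GARCH-like recursion for mu_t and a
# unit-mean Gamma innovation, so E[x_t | past] = mu_t.
x = np.empty(T)
mu = np.empty(T)
mu[0] = omega / (1 - alpha - beta)    # unconditional mean
x[0] = mu[0]
for t in range(1, T):
    mu[t] = omega + alpha * x[t - 1] + beta * mu[t - 1]
    x[t] = mu[t] * rng.gamma(shape, 1.0 / shape)

# The sample mean should be close to omega / (1 - alpha - beta) = 1.
print(x.mean())
```

The vMEM stacks several such equations (e.g., realized volatility, volume, number of trades) and, in the paper's specification, ties the Gamma innovations together through a copula instead of assuming independence.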

Open Access Article
Jump Variation Estimation with Noisy High Frequency Financial Data via Wavelets
Econometrics 2016, 4(3), 34; https://doi.org/10.3390/econometrics4030034 - 16 Aug 2016
Cited by 5
Abstract
This paper develops a method to improve the estimation of jump variation using high-frequency data in the presence of market microstructure noise. Accurate estimation of jump variation is in high demand, as it is an important component of volatility in finance for portfolio allocation, derivative pricing and risk management. The method is a two-step procedure of detection and estimation. In Step 1, we detect the jump locations by performing a wavelet transformation on the observed noisy price processes. Since wavelet coefficients are significantly larger at the jump locations than elsewhere, we calibrate the wavelet coefficients through a threshold and declare jump points where the absolute wavelet coefficients exceed the threshold. In Step 2, we estimate the jump variation by averaging the noisy price processes on each side of a declared jump point and taking the difference between the two averages. Specifically, for each jump location detected in Step 1, we compute two averages from the observed noisy price processes, one before the detected jump location and one after it, and take their difference to estimate the jump variation. Theoretically, we show that the two-step procedure based on average realized volatility processes can achieve a convergence rate close to O_P(n^{-4/9}), which is better than the rate O_P(n^{-1/4}) for the procedure based on the original noisy process, where n is the sample size. Numerically, the method based on average realized volatility processes indeed performs better than that based on the price processes. Empirically, we study the distribution of jump variation using Dow Jones Industrial Average stocks and compare the results using the original price processes and the average realized volatility processes.
(This article belongs to the Special Issue Financial High-Frequency Data)
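The two-step detect-then-average procedure can be sketched with a Haar-type statistic on simulated data (window size, threshold rule and all parameters are illustrative assumptions, not the authors' implementation):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 2000
sigma, noise_sd = 0.01, 0.002
jump_loc, jump_size = 1000, 0.05

# Noisy log-price with a single jump of known size.
price = np.cumsum(rng.normal(0.0, sigma / np.sqrt(n), n))
price[jump_loc:] += jump_size
obs = price + rng.normal(0.0, noise_sd, n)

k = 50  # local window; averaging suppresses the microstructure noise

# Step 1: Haar-type statistic = difference of adjacent k-point averages,
# thresholded with a robust (median-based) estimate of its noise scale.
stat = np.array([obs[i:i + k].mean() - obs[i - k:i].mean()
                 for i in range(k, n - k)])
threshold = 4.0 * np.median(np.abs(stat)) / 0.6745
cand = np.flatnonzero(np.abs(stat) > threshold) + k

# Keep one location (the peak) per cluster of flagged indices.
jumps = [int(g[np.argmax(np.abs(stat[g - k]))])
         for g in np.split(cand, np.flatnonzero(np.diff(cand) > k) + 1)
         if g.size]

# Step 2: jump size = average after minus average before the detected point.
est = [obs[j:j + k].mean() - obs[j - k:j].mean() for j in jumps]
print(jumps, est)
```

Because the averages on each side of the declared jump smooth out the noise, the estimated jump size recovers the true 0.05 far more accurately than a single noisy price increment at the jump would.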

Open Access Article
Market Microstructure Effects on Firm Default Risk Evaluation
Econometrics 2016, 4(3), 31; https://doi.org/10.3390/econometrics4030031 - 08 Jul 2016
Abstract
Default probability is a fundamental variable determining the creditworthiness of a firm, and equity volatility estimation plays a key role in its evaluation. Assuming a structural credit risk modeling approach, we study the impact of choosing different non-parametric equity volatility estimators on default probability evaluation when market microstructure noise is considered. A general stochastic volatility framework with jumps for the underlying asset dynamics is defined inside a Merton-like structural model. To estimate the volatility risk component of a firm we use high-frequency equity data: market microstructure noise is introduced as a direct effect of observing noisy high-frequency equity prices. A Monte Carlo simulation analysis is conducted to (i) test the performance of alternative non-parametric equity volatility estimators in their capability of filtering out the microstructure noise and backing out the true unobservable asset volatility; and (ii) study the effects of different non-parametric estimation techniques on default probability evaluation. The impact of the non-parametric volatility estimators on risk evaluation is not negligible: a sensitivity analysis defined for alternative values of the leverage parameter and average jump size reveals that the characteristics of the dataset are crucial in determining the proper estimator to consider from a credit risk perspective.
(This article belongs to the Special Issue Financial High-Frequency Data)
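The Merton-type link between the estimated volatility and the resulting default probability, which drives the sensitivity documented in the paper, can be written in a few lines (illustrative inputs; the paper's framework additionally includes stochastic volatility and jumps):

```python
from math import log, sqrt
from statistics import NormalDist

def merton_default_probability(V, D, mu, sigma, T=1.0):
    """P(V_T < D) in a Merton model: lognormal asset value V, face value
    of debt D, asset drift mu and asset volatility sigma over horizon T."""
    d2 = (log(V / D) + (mu - 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
    return NormalDist().cdf(-d2)

# A noisier volatility estimate (e.g., unfiltered microstructure noise
# inflating sigma) translates directly into a higher default probability.
pd_clean = merton_default_probability(V=100.0, D=70.0, mu=0.05, sigma=0.25)
pd_noisy = merton_default_probability(V=100.0, D=70.0, mu=0.05, sigma=0.40)
print(pd_clean, pd_noisy)
```

With these illustrative inputs, inflating the volatility estimate from 0.25 to 0.40 roughly triples the one-year default probability, which is why the choice of noise-robust volatility estimator matters for credit risk evaluation.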

Open Access Article
Continuous and Jump Betas: Implications for Portfolio Diversification
Econometrics 2016, 4(2), 27; https://doi.org/10.3390/econometrics4020027 - 01 Jun 2016
Cited by 1
Abstract
Using high-frequency data, we decompose the time-varying beta for stocks into a beta for continuous systematic risk and a beta for discontinuous systematic risk. Estimated discontinuous betas for S&P 500 constituents between 2003 and 2011 generally exceed the corresponding continuous betas. We demonstrate how continuous and discontinuous betas decrease with portfolio diversification. Using an equiweighted broad market index, we assess the speed of convergence of continuous and discontinuous betas in portfolios of stocks as the number of holdings increases. We show that discontinuous risk dissipates faster with fewer stocks in a portfolio compared to its continuous counterpart.
(This article belongs to the Special Issue Financial High-Frequency Data)
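The decomposition into continuous and jump betas can be mimicked on simulated data with a simple threshold rule (a sketch with hypothetical parameters, not the estimator used in the paper):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000
sigma_m = 0.001

# Market returns: diffusive moves plus occasional large jumps.
cont = rng.normal(0.0, sigma_m, n)
is_jump = rng.random(n) < 0.002
jumps = is_jump * rng.normal(0.0, 10 * sigma_m, n)
market = cont + jumps

# Stock loads differently on the continuous and the jump component.
beta_c_true, beta_j_true = 0.8, 1.4
stock = beta_c_true * cont + beta_j_true * jumps + rng.normal(0.0, sigma_m, n)

# Threshold rule: market moves beyond a robust multiple of the local
# scale are classified as jump observations.
thr = 4.0 * np.median(np.abs(market)) / 0.6745
jump_mask = np.abs(market) > thr

def beta(y, x, mask):
    """Realized beta = sum(x*y) / sum(x^2) over the selected observations."""
    return float(np.sum(x[mask] * y[mask]) / np.sum(x[mask] ** 2))

beta_c = beta(stock, market, ~jump_mask)
beta_j = beta(stock, market, jump_mask)
print(beta_c, beta_j)
```

Consistent with the paper's empirical finding, the beta computed on the jump observations exceeds the beta computed on the diffusive observations whenever the stock reacts more strongly to market jumps than to ordinary moves.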

Open Access Article
Volatility Forecasting: Downside Risk, Jumps and Leverage Effect
Econometrics 2016, 4(1), 8; https://doi.org/10.3390/econometrics4010008 - 23 Feb 2016
Cited by 16
Abstract
We provide empirical evidence of volatility forecasting in relation to asymmetries present in the dynamics of both return and volatility processes. Using recently developed methodologies to detect jumps from high-frequency price data, we estimate the size of positive and negative jumps and propose a methodology to estimate the size of jumps in the quadratic variation. The leverage effect is separated into continuous and discontinuous effects, and past volatility is separated into “good” and “bad”, as well as into continuous and discontinuous risks. Using a long history of the S&P 500 price index, we find that the continuous leverage effect lasts about one week, while the discontinuous leverage effect disappears after one day. “Good” and “bad” continuous risks both characterize the volatility persistence, while “bad” jump risk is much more informative than “good” jump risk in forecasting future volatility. The volatility forecasting model proposed is able to capture many empirical stylized facts while remaining parsimonious in the number of parameters to be estimated.
(This article belongs to the Special Issue Financial High-Frequency Data)
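A HAR-type forecasting regression augmented with a jump component, in the spirit of the model described above, can be sketched as follows (simulated data and illustrative coefficients; the paper's specification with signed jumps and leverage terms is richer):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 3000

# Simulate daily realized variance with HAR-type persistence plus an
# occasional "bad" jump component that raises future volatility.
rv = np.empty(T)
bad_jump = (rng.random(T) < 0.05) * rng.gamma(2.0, 0.5, T)
rv[0] = 1.0
for t in range(1, T):
    rv[t] = max(0.05 + 0.4 * rv[t - 1] + 0.3 * np.mean(rv[max(0, t - 5):t])
                + 0.5 * bad_jump[t - 1] + rng.normal(0.0, 0.1), 0.01)

# HAR regressors: daily, weekly (5-day) and monthly (22-day) averages,
# plus today's bad-jump variation as an extra predictor.
rows = range(22, T - 1)
X = np.column_stack([
    np.ones(len(rows)),
    [rv[t] for t in rows],                      # daily RV
    [rv[t - 4:t + 1].mean() for t in rows],     # weekly RV
    [rv[t - 21:t + 1].mean() for t in rows],    # monthly RV
    [bad_jump[t] for t in rows],                # jump component
])
y = np.array([rv[t + 1] for t in rows])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)
```

In this sketch the coefficient on the jump regressor comes out clearly positive, mirroring the paper's finding that “bad” jump risk carries genuine forecasting content beyond the usual HAR volatility cascade.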

Open Access Article
Non-Parametric Estimation of Intraday Spot Volatility: Disentangling Instantaneous Trend and Seasonality
Econometrics 2015, 3(4), 864-887; https://doi.org/10.3390/econometrics3040864 - 18 Dec 2015
Cited by 3
Abstract
We provide a new framework for modeling trends and periodic patterns in high-frequency financial data. Seeking adaptivity to ever-changing market conditions, we enlarge the Fourier flexible form into a richer functional class: both our smooth trend and the seasonality are non-parametrically time-varying and evolve in real time. We provide the associated estimators and use simulations to show that they behave adequately in the presence of jumps and heteroskedastic and heavy-tailed noise. A study of exchange rate returns sampled from 2010 to 2013 suggests that failing to factor in the seasonality’s dynamic properties may lead to misestimation of the intraday spot volatility.
(This article belongs to the Special Issue Financial High-Frequency Data)
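A naive non-parametric estimate of the intraday seasonal pattern, the object this paper models in a time-varying way, is the cross-day average of absolute returns per intraday bin (a static sketch on simulated data; the paper's Fourier-based estimator is adaptive and evolves in real time):

```python
import numpy as np

rng = np.random.default_rng(5)
days, bins = 250, 78                 # 78 five-minute bins per trading day

# U-shaped intraday volatility pattern: high at the open and the close.
u = np.linspace(0.0, 1.0, bins)
season = 1.0 + 1.5 * (u - 0.5) ** 2 * 4.0

# Returns: seasonal scale times i.i.d. shocks, one row per day.
ret = season * rng.normal(0.0, 1.0, (days, bins)) * 1e-3

# Cross-day mean absolute return per bin, normalized to average one.
est = np.abs(ret).mean(axis=0)
est /= est.mean()
true_norm = season / season.mean()
print(np.abs(est - true_norm).max())
```

Even this static estimator recovers the U-shape well when the pattern is stable; the paper's point is precisely that when the seasonality itself drifts over time, a fixed cross-day average misestimates the spot volatility.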
