Editor’s Choice Articles

Editor’s Choice articles are based on recommendations by the scientific editors of MDPI journals from around the world. Editors select a small number of articles recently published in the journal that they believe will be particularly interesting to readers or important in the respective research area. The aim is to provide a snapshot of some of the most exciting work published in the various research areas of the journal.

30 pages, 6285 KB  
Article
Integration and Risk Transmission Dynamics Between Bitcoin, Currency Pairs, and Traditional Financial Assets in South Africa
by Benjamin Mudiangombe Mudiangombe and John Weirstrass Muteba Mwamba
Econometrics 2025, 13(3), 36; https://doi.org/10.3390/econometrics13030036 - 19 Sep 2025
Cited by 1 | Viewed by 3614
Abstract
This study explores new insights into the integration and dynamic asymmetric volatility risk spillovers between Bitcoin, currency pairs (USD/ZAR, GBP/ZAR, and EUR/ZAR), and traditional financial assets (ALSI, Bond, and Gold) in South Africa, using daily data spanning the period from 2010 to 2024 and employing Time-Varying Parameter Vector Autoregression (TVP-VAR) and wavelet coherence. The findings reveal strengthened integration between traditional financial assets and currency pairs, as well as weak integration with BTC/ZAR. Furthermore, BTC/ZAR and traditional financial assets were receivers of shocks, while the currency pairs were transmitters of spillovers. Gold emerged as an attractive investment during periods of inflation or currency devaluation. The assets have a total connectedness index of 28.37%, indicating limited systemic risk. Distinct patterns were observed across short, medium, and long horizons in both the time and frequency domains. Gold’s negative influence on BTC/ZAR offers diversification benefits and potential hedging strategies. Bitcoin’s high volatility and lack of regulatory oversight continue to deter institutional investors. This study lays a solid foundation for understanding financial dynamics in South Africa, offering valuable insights for investors and policymakers interested in the intricate linkages between BTC/ZAR, currency pairs, and traditional financial assets, and allowing for more targeted policy measures.
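For readers unfamiliar with the connectedness measure cited above: in the Diebold–Yilmaz framework that TVP-VAR connectedness studies typically build on, the total connectedness index (TCI) is computed from the generalized forecast error variance decomposition. A standard (not article-specific) formulation is

$$ \mathrm{TCI} = \frac{100}{N}\sum_{\substack{i,j=1 \\ i \neq j}}^{N} \tilde{\theta}_{ij}(H), $$

where $\tilde{\theta}_{ij}(H)$ is the row-normalized share of the $H$-step forecast error variance of variable $i$ attributable to shocks in variable $j$, so 28.37% means that under a third of forecast error variance travels across assets.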
33 pages, 415 KB  
Article
A Statistical Characterization of Median-Based Inequality Measures
by Charles M. Beach and Russell Davidson
Econometrics 2025, 13(3), 31; https://doi.org/10.3390/econometrics13030031 - 9 Aug 2025
Viewed by 738
Abstract
For income distributions divided into middle, lower, and higher regions based on scalar median cut-offs, this paper establishes the asymptotic distribution properties—including explicit, empirically applicable variance formulas and hence standard errors—of sample estimates of the proportion of the population within each group, each group’s share of total income, and the groups’ mean incomes. It then applies these results to relative mean income ratios, various polarization measures, and decile-mean income ratios. Since the derived formulas are not distribution-free, the study advises using a density estimation technique proposed by Comte and Genon-Catalot. A shrinking middle-income group with declining relative incomes and marked upper-tail polarization among men’s incomes are all found to be highly statistically significant.
11 pages, 346 KB  
Article
Daily Emissions of CO2 in the World: A Fractional Integration Approach
by Luis Alberiko Gil-Alana and Carlos Poza
Econometrics 2025, 13(3), 26; https://doi.org/10.3390/econometrics13030026 - 17 Jul 2025
Viewed by 1156
Abstract
In this article, daily CO2 emissions for the years 2019–2022 are examined using fractional integration for Brazil, China, the EU-27 (and the UK), India, and the USA. According to the findings, all series exhibit long-memory, mean-reverting tendencies, with orders of integration ranging between 0.22 in the case of India (with white noise errors) and 0.70 for Brazil (under autocorrelated disturbances). Nevertheless, the differencing parameter estimates are all considerably below 1, which supports the theory of mean reversion and transient shocks. These results suggest the need for a greater intensification of green policies complemented with economic structural reforms to achieve the zero-emissions target by 2050.
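As background (standard notation, not specific to this article), a fractionally integrated, or I(d), series satisfies

$$ (1-L)^d x_t = u_t, \qquad (1-L)^d = \sum_{k=0}^{\infty} \binom{d}{k}(-L)^k, $$

where $L$ is the lag operator and $u_t$ is I(0). Estimates with $0 < d < 1$, as reported above, imply long memory with mean reversion, whereas $d = 1$ would imply permanent shocks.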
31 pages, 2058 KB  
Article
The Long-Run Impact of Changes in Prescription Drug Sales on Mortality and Hospital Utilization in Belgium, 1998–2019
by Frank R. Lichtenberg
Econometrics 2025, 13(3), 25; https://doi.org/10.3390/econometrics13030025 - 23 Jun 2025
Viewed by 1174
Abstract
Objectives: We investigate the long-run impact of changes in prescription drug sales on mortality and hospital utilization in Belgium during the first two decades of the 21st century. Methods: We analyze the correlation across diseases between changes in the drugs used to treat a disease and changes in mortality or hospital utilization from that disease. The measure of the change in prescription drug sales we use is the long-run (1998–2018 or 2000–2019) change in the fraction of post-1999 drugs sold. A post-1999 drug is a drug that was not sold during 1989–1999. Results: The 1998–2018 increase in the fraction of post-1999 drugs sold is estimated to have reduced the number of years of life lost before ages 85, 75, and 65 in 2018 by about 438 thousand (31%), 225 thousand (31%), and 114 thousand (32%), respectively. The 1995–2014 increase in the fraction of post-1999 drugs sold is estimated to have reduced the number of hospital days in 2019 by 2.66 million (20%). Conclusions: Even if we ignore the reduction in hospital utilization attributable to changes in pharmaceutical consumption, a conservative estimate of the 2018 cost per life-year before age 85 gained is EUR 6824. We estimate that previous changes in pharmaceutical consumption reduced 2019 expenditure on inpatient curative and rehabilitative care by EUR 3.55 billion, which is higher than the 2018 expenditure on drugs that were authorized during the period 1998–2018: EUR 2.99 billion.
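A back-of-the-envelope check of the headline cost-effectiveness figure (our own arithmetic, not taken from the article): dividing the 2018 drug expenditure by the life-years gained before age 85 gives

$$ \frac{\text{EUR } 2.99 \text{ billion}}{438{,}000 \text{ life-years}} \approx \text{EUR } 6{,}826 \text{ per life-year}, $$

in line with the reported EUR 6824 (the small gap reflects rounding of the inputs).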
31 pages, 1988 KB  
Article
The Effect of Macroeconomic Announcements on U.S. Treasury Markets: An Autometric General-to-Specific Analysis of the Greenspan Era
by James J. Forest
Econometrics 2025, 13(3), 24; https://doi.org/10.3390/econometrics13030024 - 21 Jun 2025
Viewed by 4129
Abstract
This research studies the impact of macroeconomic announcement surprises on daily U.S. Treasury excess returns during the heart of Alan Greenspan’s tenure as Federal Reserve Chair, addressing the possible limitations of standard static regression (SSR) models, which may suffer from omitted variable bias, parameter instability, and poor mis-specification diagnostics. To complement the SSR framework, an automated general-to-specific (Gets) modeling approach, enhanced with modern indicator saturation methods for robustness, is applied to improve empirical model discovery and mitigate potential biases. By progressively reducing an initially broad set of candidate variables, the Gets methodology steers the model toward congruence, dispenses with unstable parameters, and limits information loss while pursuing precision. The findings suggest that U.S. Treasury market responses to macroeconomic news shocks were stable for a core set of announcements that reliably influenced excess returns. In contrast to computationally costless standard static models, the automated Gets-based approach enhances parameter precision and provides a more adaptive structure for identifying relevant predictors. These results demonstrate the potential value of incorporating interpretable automated model selection techniques alongside traditional SSR and Markov switching approaches to improve empirical insights into macroeconomic announcement effects on financial markets.
(This article belongs to the Special Issue Advancements in Macroeconometric Modeling and Time Series Analysis)
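The SSR baseline that the Gets procedure starts from typically takes a form like the following (a generic specification from the announcement-effects literature, not necessarily the article’s exact equation):

$$ r_t = \alpha + \sum_{k=1}^{K} \beta_k S_{k,t} + \varepsilon_t, \qquad S_{k,t} = \frac{A_{k,t} - E_{t^-}[A_{k,t}]}{\hat{\sigma}_k}, $$

where $r_t$ is the daily Treasury excess return and $S_{k,t}$ is announcement $k$’s surprise—the released value minus its consensus forecast, standardized by the dispersion of past forecast errors. Gets then prunes this broad candidate set toward a congruent reduced model.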
14 pages, 388 KB  
Article
Generalized Recentered Influence Function Regressions
by Javier Alejo, Antonio Galvao, Julián Martínez-Iriarte and Gabriel Montes-Rojas
Econometrics 2025, 13(2), 19; https://doi.org/10.3390/econometrics13020019 - 18 Apr 2025
Viewed by 2715
Abstract
This paper suggests a generalization of covariate shifts to study distributional impacts on inequality and distributional measures. It builds on the recentered influence function (RIF) regression method, originally designed for location shifts in covariates, and extends it to general policy interventions, such as location–scale or asymmetric interventions. Numerical simulations for the Gini, Theil, and Atkinson indexes demonstrate strong performance across a wide range of cases and distributional measures. An empirical application examining changes in Mincerian equations illustrates the method.
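As a reminder of the building block involved (standard definitions, not the paper’s generalization itself), the recentered influence function of a distributional statistic $\nu(F)$ is

$$ \mathrm{RIF}(y;\nu,F) = \nu(F) + \mathrm{IF}(y;\nu,F), \qquad E_F\left[\mathrm{RIF}(y;\nu,F)\right] = \nu(F), $$

and a RIF regression models $E[\mathrm{RIF}(y;\nu,F) \mid X = x]$, so that covariate effects on $\nu$ can be read off a regression. The paper extends the class of covariate interventions under which this device is valid beyond simple location shifts.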
29 pages, 1630 KB  
Article
A Meta-Analysis of Determinants of Success and Failure of Economic Sanctions
by Binyam Afewerk Demena and Peter A. G. van Bergeijk
Econometrics 2025, 13(2), 16; https://doi.org/10.3390/econometrics13020016 - 9 Apr 2025
Cited by 1 | Viewed by 4731
Abstract
Political scientists and economists often assert that they understand how economic sanctions function as a foreign policy tool and claim to have backed their theories with compelling statistical evidence. The research puzzle that this article addresses is the observation that despite almost four decades of empirical research on economic sanctions, there is still no consensus on the direction and magnitude of the key variables that theoretically determine the success of economic sanctions. To address part of this research puzzle, we conducted a meta-analysis of 37 studies published between 1985 and 2018, focusing on three key determinants of sanction success: trade linkage, prior relations, and duration. Our analysis examines the factors contributing to the variation in findings reported by these primary studies. By constructing up to 27 moderator variables that capture the contexts in which researchers derive their estimates, we found that the differences across studies are primarily influenced by the data used, the variables controlled for in estimation methods, publication quality, and author characteristics. Our results reveal highly significant effects, indicating that sanctions are more likely to succeed when there is strong pre-sanction trade, when sanctions are implemented swiftly, and when they involve countries with better pre-sanction relationships. In our robustness checks, we consistently confirmed these core findings across different estimation techniques.
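The moderator analysis described above is conventionally implemented as a meta-regression of the following generic form (our illustration, not the authors’ exact specification):

$$ \hat{e}_s = \delta_0 + \sum_{m=1}^{M} \delta_m Z_{m,s} + u_s, $$

where $\hat{e}_s$ is the effect reported in primary study $s$ and the $Z_{m,s}$ are moderator variables coding data, estimation methods, publication quality, and author characteristics.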
19 pages, 3796 KB  
Article
Modeling and Forecasting Time-Series Data with Multiple Seasonal Periods Using Periodograms
by Solomon Buke Chudo and Gyorgy Terdik
Econometrics 2025, 13(2), 14; https://doi.org/10.3390/econometrics13020014 - 28 Mar 2025
Cited by 4 | Viewed by 5966
Abstract
Applications of high-frequency data, including energy management, economics, and finance, frequently require time-series forecasting characterized by complex seasonality. Recognizing the prevailing seasonal patterns remains difficult, given that most solutions depend on basic decomposition techniques. This study introduces a new approach employing periodograms from spectral density analysis to identify predominant seasonal periods. Analyzing hourly electricity consumption data from Brazil, we identified three significant seasonal patterns: sub-daily (6 h), half-daily (12 h), and daily (24 h). We assessed the predictive efficacy of the BATS, TBATS, and STL + ETS models using these seasonal periods. We performed data analysis and model fitting in R 4.4.1 and compared the models using accuracy metrics such as MAE and MAPE. The STL + ETS model exhibited enhanced performance, surpassing both BATS and TBATS in energy forecasting. These findings improve our understanding of multiple seasonal patterns, assist in selecting dominant periods, provide new practical forecasting approaches for time-series analysis, and inform professionals seeking superior forecasting solutions in various fields.
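As an illustration of the periodogram step described above, here is a minimal base-R sketch on simulated data (the toy series and variable names are ours, not the authors’): the three injected cycles of 6, 12, and 24 h reappear as the largest periodogram ordinates.

```r
# Sketch: detect dominant seasonal periods in an hourly series via the raw
# periodogram (stats::spec.pgram). Simulated data stand in for the Brazilian
# electricity-consumption series used in the paper.
set.seed(1)
hours <- 1:(24 * 90)                          # 90 days of hourly observations
x <- 10 +
  2.0 * sin(2 * pi * hours / 24) +            # daily (24 h) cycle
  1.0 * sin(2 * pi * hours / 12) +            # half-daily (12 h) cycle
  0.5 * sin(2 * pi * hours / 6)  +            # sub-daily (6 h) cycle
  rnorm(length(hours))                        # noise

sp  <- spec.pgram(x, taper = 0, plot = FALSE) # raw periodogram
top <- order(sp$spec, decreasing = TRUE)[1:3] # three largest ordinates
round(1 / sp$freq[top])                       # implied periods: 24, 12, 6
```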
25 pages, 513 KB  
Article
Explosive Episodes and Time-Varying Volatility: A New MARMA–GARCH Model Applied to Cryptocurrencies
by Alain Hecq and Daniel Velasquez-Gaviria
Econometrics 2025, 13(2), 13; https://doi.org/10.3390/econometrics13020013 - 24 Mar 2025
Cited by 2 | Viewed by 2049
Abstract
Financial assets often exhibit explosive price surges followed by abrupt collapses, alongside persistent volatility clustering. Motivated by these features, we introduce a mixed causal–noncausal invertible–noninvertible autoregressive moving average generalized autoregressive conditional heteroskedasticity (MARMA–GARCH) model. Unlike standard ARMA processes, our model admits roots inside the unit disk, capturing bubble-like episodes and speculative feedback, while the GARCH component captures time-varying volatility. We propose two estimation approaches: (i) Whittle-based frequency-domain methods, which are asymptotically equivalent to Gaussian likelihood under stationarity and finite variance, and (ii) time-domain maximum likelihood, which proves more robust to the heavy tails and skewness common in financial returns. To distinguish causal from noncausal structures, we develop a higher-order diagnostics procedure using spectral densities and residual-based tests. Simulation results reveal that overlooking noncausality biases the GARCH parameters, downplaying short-run volatility reactions to news (α) while overstating volatility persistence (β). Our empirical application to Bitcoin and Ethereum reinforces these insights: we find significant noncausal dynamics in the mean, paired with pronounced GARCH effects in the variance. Imposing a purely causal ARMA specification leads to systematically misspecified volatility estimates, potentially underestimating market risks. Our results emphasize the importance of relaxing the usual causality and invertibility assumptions for assets prone to extreme price movements, ultimately improving risk metrics and expanding our understanding of financial market dynamics.
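For context, the conditional mean of a mixed causal–noncausal (MAR) specification is commonly written as (standard notation from this literature)

$$ \phi(L)\,\psi(L^{-1})\,y_t = \varepsilon_t, $$

where $\phi(L)$ is a causal polynomial in the lag operator $L$ with roots outside the unit circle, and $\psi(L^{-1})$ is a noncausal polynomial in the lead operator; the MARMA–GARCH model pairs a mean equation of this kind with GARCH errors.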
18 pages, 4617 KB  
Article
Real Option Valuation of an Emerging Renewable Technology Design in Wave Energy Conversion
by James A. DiLellio, John C. Butler, Igor Rizaev, Wanan Sheng and George Aggidis
Econometrics 2025, 13(1), 11; https://doi.org/10.3390/econometrics13010011 - 4 Mar 2025
Cited by 2 | Viewed by 4246
Abstract
The untapped potential of wave energy offers another route to diversifying renewable energy sources and addressing climate change by reducing CO2 emissions. However, the development costs of maturing the technology remain a significant hurdle to adoption at scale, and the technology often must compete against other marine renewables such as offshore wind. Here, we conduct a real option valuation that incorporates the uncertain market price of wholesale electricity and the managerial flexibility expressed in determining future optimal decisions. We demonstrate how the project’s embedded compound real option value can turn a wave energy project with a negative net present value into one with a positive expected value. The analysis uses decision tree analysis, in which real options are modeled as decision nodes and uncertainty is modeled as a risk-neutral stochastic process using chance nodes. We also show how our results are analogous to a financial out-of-the-money call option. Our results highlight the distribution of outcomes and the benefit of a staged long-term investment in wave energy systems to better understand and manage project risk, recognizing that these probabilistic results are subject to the ongoing evolution of wholesale electricity prices and the stochastic process models used here to capture their future dynamics. Lastly, we show that the near-term optimal decision is to continue funding the development of a reference architecture to a higher technology readiness level, maintaining the long-term option to deploy such a renewable energy system through private investment or private–public partnerships.
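Risk-neutral valuation of the chance nodes described above is typically built on a stochastic process for the electricity price, for example a geometric Brownian motion (a generic illustration; the article’s actual process specification may differ):

$$ dP_t = r P_t\, dt + \sigma P_t\, dW_t, $$

where $P_t$ is the wholesale price, $r$ the risk-free rate, and $\sigma$ the price volatility, so that discounting expected payoffs at $r$ is internally consistent.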
17 pages, 1129 KB  
Article
A Study of Economic and Social Preferences in Energy-Saving Behavior Using a Structural Equation Modeling Approach: The Case of Romania
by Cristian Busu, Mihail Busu, Stelian Grasu, Ilona Skačkauskienė and Luis Miguel Fonseca
Econometrics 2025, 13(1), 10; https://doi.org/10.3390/econometrics13010010 - 24 Feb 2025
Cited by 1 | Viewed by 2479
Abstract
Examining the energy consumer behavioral model is critical for national governments and academia. This endeavor seeks to uncover effective solutions amid the energy crisis and climate change challenges. This article delves into legislative developments within the energy sector, European Commission recommendations for reducing energy consumption, and existing constraints impacting individual consumers. By scrutinizing the relevant literature, we aimed to identify and analyze factors that can enhance the individual benefits derived from energy savings. A comprehensive set of variables was then formulated to model final consumers’ behavior. Data collection involved administering questionnaires to individual consumers, consumer associations, and energy micro-enterprises in Romania. The gathered data were analyzed using the SmartPLS 4 statistical software. Building upon insights from the specialized literature, this paper pinpoints the behavioral determinants influencing the reduction in energy consumption. These determinants serve as independent variables shaping the voluntary adoption of lifestyle and behavioral measures among various types of energy users. This study’s findings validate the assumptions presented in this article, highlighting that a reduction in energy consumption is a direct outcome of cumulatively addressing several factors: investments in the energy sector, budget allocation for energy consumption expenditure, adherence to social behavior norms, access to pertinent information about the consequences of the energy crisis, and individual responsibility. Notably, the perception of energy-saving opportunities emerges as a significant mediator between the independent variables and energy savings. This aspect, developed for the first time in this article, draws inspiration from the prospect theory introduced by Kahneman and Tversky.
17 pages, 1028 KB  
Article
Data-Based Parametrization for Affine GARCH Models Across Multiple Time Scales—Roughness Implications
by Marcos Escobar-Anel, Sebastian Ferrando, Fuyu Li and Ke Xu
Econometrics 2025, 13(1), 6; https://doi.org/10.3390/econometrics13010006 - 12 Feb 2025
Cited by 2 | Viewed by 1517
Abstract
This paper revisits the topic of time-scale parameterizations of the Heston–Nandi GARCH(1,1) model to create a new, theoretically valid setting compatible with real financial data. We first estimate parameters using three US market indices and six frequencies to let the data reveal the correct, data-implied time-scale parameterization. We then compare the data-implied parametrization to two popular candidates in the literature, finding structurally different continuous-time limits: the data favor a fractional Brownian motion (fBM) limit rather than the standard Brownian motion (BM)-based parametrization. We then propose a theoretically flexible time-scale parameterization compatible with this fBM behavior. In this context, a fractional derivative analysis of our empirically based parametrization is performed, confirming an anomalous diffusion in the continuous-time limit. Such a finding is yet another endorsement of the recent and popular stylized fact known as rough volatility.
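For reference, the Heston–Nandi GARCH(1,1) model being reparameterized takes the standard form

$$ R_t = r + \lambda h_t + \sqrt{h_t}\, z_t, \qquad h_t = \omega + \beta h_{t-1} + \alpha\left(z_{t-1} - \gamma\sqrt{h_{t-1}}\right)^2, $$

with $z_t$ i.i.d. standard normal; the time-scale parameterization concerns how $(\omega, \alpha, \beta, \gamma, \lambda)$ must behave as the observation frequency changes for a well-defined continuous-time limit to exist.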
14 pages, 1100 KB  
Article
Dynamic Factor Models and Fractional Integration—With an Application to US Real Economic Activity
by Guglielmo Maria Caporale, Luis Alberiko Gil-Alana and Pedro Jose Piqueras Martinez
Econometrics 2024, 12(4), 39; https://doi.org/10.3390/econometrics12040039 - 19 Dec 2024
Cited by 1 | Viewed by 2345
Abstract
This paper makes a twofold contribution. First, it develops the dynamic factor model by allowing for fractional integration instead of imposing the classical dichotomy between I(0) stationary and I(1) non-stationary series. This more general setup provides valuable information on the degree of persistence and the mean-reverting properties of the series. Second, the proposed framework is used to analyse five annual US Real Economic Activity series (Employees, Energy, Industrial Production, Manufacturing, Personal Income) over the period from 1967 to 2019 in order to shed light on their degree of persistence and cyclical behaviour. The results indicate that economic activity in the US is highly persistent and is also characterised by cycles with a periodicity of 6 years and 8 months.
11 pages, 227 KB  
Article
Likert Scale Variables in Personal Finance Research: The Neutral Category Problem
by Blain Pearson, Donald Lacombe and Nasima Khatun
Econometrics 2024, 12(4), 33; https://doi.org/10.3390/econometrics12040033 - 6 Nov 2024
Cited by 3 | Viewed by 4603
Abstract
Personal finance research often utilizes Likert-type items and Likert scales as dependent variables, frequently employing standard probit and ordered probit models. If inappropriately modeled, the “neutral” category of discrete dependent variables can bias estimates of the remaining categories. Through the utilization of hierarchical models, this paper demonstrates a methodology that accounts for the econometric issues of the neutral category. We then analyze the technique through an empirical exercise relevant to personal finance research using data from the National Financial Capability Study. We demonstrate that ignoring the “neutral” category bias can lead to incorrect inferences, hindering the progression of personal finance research. Our findings underscore the importance of refining statistical modeling techniques when dealing with Likert-type data. By accounting for the neutral category, we can enhance the reliability of personal finance research outcomes, fostering improved decision-relevant insights.
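For context, the ordered probit model underlying this discussion assigns category probabilities via cut-points (standard formulation):

$$ P(y_i = j \mid x_i) = \Phi(\kappa_j - x_i'\beta) - \Phi(\kappa_{j-1} - x_i'\beta), \qquad \kappa_0 = -\infty,\; \kappa_J = \infty, $$

so a “neutral” middle category occupies the interval between two adjacent cut-points; mismodeling that interval distorts the estimated probabilities of all remaining categories.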
20 pages, 478 KB  
Article
Long-Term Care in Germany in the Context of the Demographic Transition—An Outlook for the Expenses of Long-Term Care Insurance through 2050
by Patrizio Vanella, Christina Benita Wilke and Moritz Heß
Econometrics 2024, 12(4), 28; https://doi.org/10.3390/econometrics12040028 - 9 Oct 2024
Cited by 1 | Viewed by 9079
Abstract
Demographic aging results in a growing number of older people in need of care in many regions all over the world. Germany has witnessed steady population aging for decades, prompting policymakers and other stakeholders to discuss how to fulfill the rapidly growing demand for care workers and finance the rising costs of long-term care. Informed decisions on this matter to ensure the sustainability of the statutory long-term care insurance system require reliable knowledge of the associated future costs. These need to be simulated based on well-designed forecast models that holistically include the complexity of the forecast problem, namely the demographic transition, epidemiological trends, concrete demand for and supply of specific care services, and the respective costs. Care risks heavily depend on demographics, both in absolute terms and according to severity. The number of persons in need of care, disaggregated by severity of disability, in turn, is the main driver of the remuneration that is paid by long-term care insurance. Therefore, detailed forecasts of the population and care rates are important ingredients for forecasts of long-term care insurance expenditures. We present a novel approach based on a stochastic demographic cohort-component approach that includes trends in age- and sex-specific care rates and the demand for specific care services, given changing preferences over the life course. The model is executed for Germany until the year 2050 as a case study.
(This article belongs to the Special Issue Advancements in Macroeconometric Modeling and Time Series Analysis)
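The deterministic core of a cohort-component projection is the standard demographic accounting identity (illustrative; the authors layer stochastic trends on top of this skeleton):

$$ P_{a+1,\,t+1} = P_{a,\,t}\, s_{a,\,t} + M_{a,\,t}, $$

where $P_{a,t}$ is the population at age $a$ in year $t$, $s_{a,t}$ the survival probability, and $M_{a,t}$ net migration; applying age- and sex-specific care rates to the projected population then yields the expenditure forecast.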
11 pages, 246 KB  
Article
Estimating Treatment Effects Using Observational Data and Experimental Data with Non-Overlapping Support
by Kevin Han, Han Wu, Linjia Wu, Yu Shi and Canyao Liu
Econometrics 2024, 12(3), 26; https://doi.org/10.3390/econometrics12030026 - 20 Sep 2024
Cited by 1 | Viewed by 2627
Abstract
When estimating treatment effects, the gold standard is to conduct a randomized experiment and then contrast outcomes associated with the treatment group and the control group. However, in many cases, randomized experiments are either conducted at a much smaller scale than the size of the target population or raise ethical issues that make them hard to implement. Therefore, researchers usually rely on observational data to study causal connections. The downside is that the unconfoundedness assumption, which is the key to validating the use of observational data, is untestable and almost always violated. Hence, any conclusion drawn from observational data should be analyzed with great care. Given the richness of observational data and the usefulness of experimental data, researchers hope to develop credible methods that combine the strengths of the two. In this paper, we consider a setting where the observational data contain the outcome of interest as well as a surrogate outcome, while the experimental data contain only the surrogate outcome. We propose an easy-to-implement estimator of the average treatment effect of interest that uses both the observational data and the experimental data.
18 pages, 1198 KB  
Article
Transient and Persistent Technical Efficiencies in Rice Farming: A Generalized True Random-Effects Model Approach
by Phuc Trong Ho, Michael Burton, Atakelty Hailu and Chunbo Ma
Econometrics 2024, 12(3), 23; https://doi.org/10.3390/econometrics12030023 - 12 Aug 2024
Viewed by 2694
Abstract
This study estimates transient and persistent technical efficiencies (TEs) using a generalized true random-effects (GTRE) model. We estimate the GTRE model using maximum likelihood and Bayesian estimation methods, then compare it to three simpler models nested within it to evaluate the robustness of our estimates. We use a panel data set of 945 observations collected from 344 rice farming households in Vietnam’s Mekong River Delta. The results indicate that the GTRE model is more appropriate than the restricted models for understanding heterogeneity and inefficiency in rice production. The mean estimate of overall technical efficiency is 0.71, with transient rather than persistent inefficiency being the dominant component. This suggests that rice farmers could increase output substantially and would benefit from policies that pay more attention to addressing short-term inefficiency issues.
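The GTRE production-frontier specification referenced above is usually written with four error components (standard form from this literature):

$$ y_{it} = x_{it}'\beta + \mu_i - \eta_i + v_{it} - u_{it}, $$

where $\mu_i$ captures random firm heterogeneity, $\eta_i \ge 0$ persistent inefficiency, $v_{it}$ noise, and $u_{it} \ge 0$ transient inefficiency; the three simpler nested models arise by dropping one or more of these components.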
14 pages, 2200 KB  
Article
Exponential Time Trends in a Fractional Integration Model
by Guglielmo Maria Caporale and Luis Alberiko Gil-Alana
Econometrics 2024, 12(2), 15; https://doi.org/10.3390/econometrics12020015 - 31 May 2024
Viewed by 1828
Abstract
This paper introduces a new modelling approach that incorporates nonlinear, exponential deterministic terms into a fractional integration framework. The proposed model is based on a specific test of fractional integration that is more general than the standard methods, which allow for only linear trends. Its limiting distribution is standard normal, and Monte Carlo simulations show that it performs well in finite samples. Three empirical examples confirm that the suggested specification captures the properties of the data adequately.
15 pages, 312 KB  
Article
A Pretest Estimator for the Two-Way Error Component Model
by Badi H. Baltagi, Georges Bresson and Jean-Michel Etienne
Econometrics 2024, 12(2), 9; https://doi.org/10.3390/econometrics12020009 - 16 Apr 2024
Cited by 2 | Viewed by 3278
Abstract
For a panel data linear regression model with both individual and time effects, empirical studies select the two-way random-effects (TWRE) estimator if the Hausman test based on the contrast between the two-way fixed-effects (TWFE) estimator and the TWRE estimator is not rejected. Alternatively, they select the TWFE estimator in cases where this Hausman test rejects the null hypothesis. Not all the regressors may be correlated with these individual and time effects. The one-way Hausman–Taylor model has been generalized to the two-way error component model, allowing some but not all regressors to be correlated with these individual and time effects. This paper proposes a pretest estimator for this two-way error component panel data regression model based on two Hausman tests. The first Hausman test is based upon the contrast between the TWFE and the TWRE estimators. The second Hausman test is based on the contrast between the two-way Hausman and Taylor (TWHT) estimator and the TWFE estimator. The Monte Carlo results show that this pretest estimator is always second best in MSE performance compared to the efficient estimator, whether the model is random-effects, fixed-effects, or Hausman and Taylor. This paper generalizes the one-way pretest estimator to the two-way error component model.
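Both pretest stages rely on the familiar Hausman contrast (standard formula):

$$ H = (\hat{\beta}_{FE} - \hat{\beta}_{RE})' \left[\widehat{\mathrm{Var}}(\hat{\beta}_{FE}) - \widehat{\mathrm{Var}}(\hat{\beta}_{RE})\right]^{-1} (\hat{\beta}_{FE} - \hat{\beta}_{RE}) \;\xrightarrow{d}\; \chi^2_k, $$

under the null that both estimators are consistent, where $k$ is the dimension of the contrasted coefficient vector; in the second stage, the TWHT estimator takes the place of one of the contrasted estimators.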
15 pages, 2945 KB  
Article
Biases in the Maximum Simulated Likelihood Estimation of the Mixed Logit Model
by Maksat Jumamyradov, Murat Munkin, William H. Greene and Benjamin M. Craig
Econometrics 2024, 12(2), 8; https://doi.org/10.3390/econometrics12020008 - 27 Mar 2024
Viewed by 3028
Abstract
In a recent study, it was demonstrated that the maximum simulated likelihood (MSL) estimator produces significant biases when applied to the bivariate normal and bivariate Poisson-lognormal models. The study’s conclusion suggests that similar biases could be present in other models generated by correlated bivariate normal structures, which include several commonly used specifications of the mixed logit (MIXL) models. This paper conducts a simulation study analyzing the MSL estimation of the error components (EC) MIXL. We find that the MSL estimator produces significant biases in the estimated parameters. The problem becomes worse when the true value of the variance parameter is small and the correlation parameter is large in magnitude. In some cases, the biases in the estimated marginal effects are as large as 12% of the true values. These biases are largely invariant to increases in the number of Halton draws.
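The MSL estimator in question replaces the intractable mixed logit choice probability with a draw-based average (standard formulation; $R$ denotes the number of Halton draws):

$$ \hat{P}_i = \frac{1}{R} \sum_{r=1}^{R} L_i(\beta_r), \qquad \beta_r \sim f(\beta \mid \theta), $$

where $L_i(\cdot)$ is the logit probability of individual $i$’s observed choices. Because the log-likelihood is nonlinear in $\hat{P}_i$, simulation noise translates into bias, and, as the abstract notes, this bias need not vanish as $R$ grows over practical ranges.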
30 pages, 583 KB  
Article
When It Counts—Econometric Identification of the Basic Factor Model Based on GLT Structures
by Sylvia Frühwirth-Schnatter, Darjus Hosszejni and Hedibert Freitas Lopes
Econometrics 2023, 11(4), 26; https://doi.org/10.3390/econometrics11040026 - 20 Nov 2023
Cited by 7 | Viewed by 4239
Abstract
Despite the popularity of factor models with simple loading matrices, little attention has been given to formally address the identifiability of these models beyond standard rotation-based identification such as the positive lower triangular (PLT) constraint. To fill this gap, we review the advantages of variance identification in simple factor analysis and introduce the generalized lower triangular (GLT) structures. We show that the GLT assumption is an improvement over PLT without compromise: GLT is also unique but, unlike PLT, a non-restrictive assumption. Furthermore, we provide a simple counting rule for variance identification under GLT structures, and we demonstrate that within this model class, the unknown number of common factors can be recovered in an exploratory factor analysis. Our methodology is illustrated for simulated data in the context of post-processing posterior draws in sparse Bayesian factor analysis.
(This article belongs to the Special Issue High-Dimensional Time Series in Macroeconomics and Finance)
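The basic factor model whose identification is at stake is (standard notation)

$$ y_t = \Lambda f_t + \varepsilon_t, \qquad \mathrm{Cov}(y_t) = \Lambda \Lambda' + \Sigma_\varepsilon, $$

where $\Lambda$ is the loading matrix. Since $\Lambda\Lambda'$ is invariant to orthogonal rotations of $\Lambda$, structural constraints such as PLT or GLT must be imposed on $\Lambda$ to pin down a unique representation, which is exactly what the counting rule above addresses.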
28 pages, 580 KB  
Article
On the Proper Computation of the Hausman Test Statistic in Standard Linear Panel Data Models: Some Clarifications and New Results
by Julie Le Gallo and Marc-Alexandre Sénégas
Econometrics 2023, 11(4), 25; https://doi.org/10.3390/econometrics11040025 - 8 Nov 2023
Cited by 4 | Viewed by 8380
Abstract
We provide new analytical results for the implementation of the Hausman specification test statistic in a standard panel data model, comparing the version based on the estimators computed from the untransformed random effects model specification under Feasible Generalized Least Squares and the one computed from the quasi-demeaned model estimated by Ordinary Least Squares. We show that the quasi-demeaned model cannot provide a reliable magnitude when implementing the Hausman test in a finite sample setting, although it is the most common approach used to produce the test statistic in econometric software. The difference between the Hausman statistics computed under the two methods can be substantial and even lead to opposite conclusions for the test of orthogonality between the regressors and the individual-specific effects. Furthermore, this difference remains important even with large cross-sectional dimensions as it mainly depends on the within-between structure of the regressors and on the presence of a significant correlation between the individual effects and the covariates in the data. We propose to supplement the test outcomes that are provided in the main econometric software packages with some metrics to address the issue at hand.
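To see where the two versions can diverge, recall that the quasi-demeaned model applies the transformation (textbook one-way formula; feasible versions replace the variances with estimates):

$$ y_{it} - \hat{\theta}\,\bar{y}_i, \qquad \hat{\theta} = 1 - \sqrt{\frac{\hat{\sigma}_v^2}{\hat{\sigma}_v^2 + T \hat{\sigma}_\alpha^2}}, $$

so OLS on the transformed data reproduces FGLS on the untransformed specification only through the estimated $\hat{\theta}$, which is a plausible entry point for the finite-sample discrepancies documented in the paper.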
32 pages, 10108 KB  
Article
Dirichlet Process Log Skew-Normal Mixture with a Missing-at-Random-Covariate in Insurance Claim Analysis
by Minkun Kim, David Lindberg, Martin Crane and Marija Bezbradica
Econometrics 2023, 11(4), 24; https://doi.org/10.3390/econometrics11040024 - 12 Oct 2023
Viewed by 2988
Abstract
In actuarial practice, the modeling of total losses tied to a certain policy is a nontrivial task due to complex distributional features. In the recent literature, the application of the Dirichlet process mixture to insurance losses has been proposed to eliminate the risk of model misspecification biases. However, the effect of covariates, as well as of missing covariates, in this modeling framework is rarely studied. In this article, we propose novel connections among a covariate-dependent Dirichlet process mixture, log-normal convolution, and missing covariate imputation. As a generative approach, our framework models the joint distribution of the outcome and covariates, which allows us to impute missing covariates under the assumption of missingness at random. Performance is assessed by applying our model to several insurance datasets of varying size and data missingness from the literature, and the empirical results demonstrate the benefit of our model compared with existing actuarial models, such as the Tweedie-based generalized linear model, the generalized additive model, and the multivariate adaptive regression spline.
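The nonparametric prior at the core of this approach is the Dirichlet process mixture, in its standard hierarchical form:

$$ y_i \mid \theta_i \sim F(\theta_i), \qquad \theta_i \mid G \sim G, \qquad G \sim \mathrm{DP}(\alpha, G_0), $$

which mixes over component parameters $\theta_i$ drawn from a random discrete mixing distribution $G$; the paper makes the mixture covariate-dependent with log skew-normal kernels, which is what permits joint modeling of the outcome and possibly missing covariates.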
27 pages, 1943 KB  
Article
Local Gaussian Cross-Spectrum Analysis
by Lars Arne Jordanger and Dag Tjøstheim
Econometrics 2023, 11(2), 12; https://doi.org/10.3390/econometrics11020012 - 21 Apr 2023
Cited by 3 | Viewed by 3256
Abstract
The ordinary spectrum is restricted in its applications, since it is based on the second-order moments (auto- and cross-covariances). Alternative approaches to spectrum analysis have been investigated based on other measures of dependence. One such approach was developed for univariate time series by the authors of this paper using the local Gaussian auto-spectrum based on the local Gaussian auto-correlations. This makes it possible to detect local structures in univariate time series that look similar to white noise when investigated by the ordinary auto-spectrum. In this paper, the local Gaussian approach is extended to a local Gaussian cross-spectrum for multivariate time series. The local Gaussian cross-spectrum has the desirable property that it coincides with the ordinary cross-spectrum for Gaussian time series, which implies that it can be used to detect non-Gaussian traits in the time series under investigation. In particular, if the ordinary spectrum is flat, then peaks and troughs of the local Gaussian spectrum can indicate nonlinear traits, which potentially might reveal local periodic phenomena that are undetected in an ordinary spectral analysis.
16 pages, 353 KB  
Article
Detecting Common Bubbles in Multivariate Mixed Causal–Noncausal Models
by Gianluca Cubadda, Alain Hecq and Elisa Voisin
Econometrics 2023, 11(1), 9; https://doi.org/10.3390/econometrics11010009 - 9 Mar 2023
Cited by 6 | Viewed by 3203
Abstract
This paper proposes concepts and methods to investigate whether the bubble patterns observed in individual time series are common among them. Having established the conditions under which common bubbles are present within the class of mixed causal–noncausal vector autoregressive models, we suggest statistical tools to detect the common locally explosive dynamics in a Student t-distribution maximum likelihood framework. The performances of both likelihood ratio tests and information criteria were investigated in a Monte Carlo study. Finally, we evaluated the practical value of our approach via an empirical application to three commodity prices.
33 pages, 992 KB  
Article
Semi-Metric Portfolio Optimization: A New Algorithm Reducing Simultaneous Asset Shocks
by Nick James, Max Menzies and Jennifer Chan
Econometrics 2023, 11(1), 8; https://doi.org/10.3390/econometrics11010008 - 7 Mar 2023
Cited by 12 | Viewed by 5849
Abstract
This paper proposes a new method for financial portfolio optimization based on reducing simultaneous asset shocks across a collection of assets. This may be understood as an alternative approach to risk reduction in a portfolio based on a new mathematical quantity. First, we apply recently introduced semi-metrics between finite sets to determine the distance between time series’ structural breaks. Then, we build on the classical portfolio optimization theory of Markowitz and use this distance between asset structural breaks for our penalty function, rather than portfolio variance. Our experiments are promising: on synthetic data, we show that our proposed method does indeed diversify among time series with highly similar structural breaks and enjoys advantages over existing metrics between sets. On real data, experiments illustrate that our proposed optimization method performs well relative to nine other commonly used options, producing the second-highest returns, the lowest volatility, and second-lowest drawdown. The main implication for this method in portfolio management is reducing simultaneous asset shocks and potentially sharp associated drawdowns during periods of highly similar structural breaks, such as a market crisis. Our method adds to a considerable literature of portfolio optimization techniques in econometrics and could complement these via portfolio averaging.
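Schematically, the starting point is the classical Markowitz program (standard form; the exact way the break-distance penalty enters is the paper’s contribution and is not reproduced here):

$$ \min_{w} \; w' \Sigma w \quad \text{s.t.} \quad w'\mathbf{1} = 1,\;\; \mu'w \ge \mu_0, $$

with the covariance penalty $w'\Sigma w$ replaced by a penalty constructed from the matrix of pairwise semi-metric distances between the assets’ structural-break sets, so that weight is steered away from assets whose breaks cluster together.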
37 pages, 1354 KB  
Article
Building Multivariate Time-Varying Smooth Transition Correlation GARCH Models, with an Application to the Four Largest Australian Banks
by Anthony D. Hall, Annastiina Silvennoinen and Timo Teräsvirta
Econometrics 2023, 11(1), 5; https://doi.org/10.3390/econometrics11010005 - 6 Feb 2023
Cited by 3 | Viewed by 4077
Abstract
This paper proposes a methodology for building multivariate time-varying STCC–GARCH models. The novel contributions in this area are the specification tests related to the correlation component, the extension of the general model to allow for additional correlation regimes, and a detailed exposition of the systematic, improved modelling cycle required for such nonlinear models. An R package implementing the steps of the modelling cycle is available. Simulations demonstrate the robustness of the recommended model-building approach. The modelling cycle is illustrated using daily return series for Australia’s four largest banks.
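The time-varying correlation mechanism in STCC–GARCH models is driven by a logistic transition function (standard form in this literature):

$$ R_t = \left(1 - G(s_t)\right) R_{(1)} + G(s_t)\, R_{(2)}, \qquad G(s_t) = \left(1 + e^{-\gamma (s_t - c)}\right)^{-1}, $$

where $R_{(1)}$ and $R_{(2)}$ are extreme-state correlation matrices, $s_t$ is the transition variable (rescaled calendar time in the time-varying variant), and $\gamma, c$ govern the speed and location of the transition; the paper’s extension allows additional correlation regimes beyond two.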
13 pages, 428 KB  
Article
Comparing the Conditional Logit Estimates and True Parameters under Preference Heterogeneity: A Simulated Discrete Choice Experiment
by Maksat Jumamyradov, Benjamin M. Craig, Murat Munkin and William Greene
Econometrics 2023, 11(1), 4; https://doi.org/10.3390/econometrics11010004 - 25 Jan 2023
Cited by 8 | Viewed by 4977
Abstract
Health preference research (HPR) is the subfield of health economics dedicated to understanding the value of health and health-related objects using observational or experimental methods. In a discrete choice experiment (DCE), the utility of objects in a choice set (e.g., brand-name medication, generic medication, no medication) may differ systematically between persons due to interpersonal heterogeneity. To allow for interpersonal heterogeneity, choice probabilities may be described using logit functions with fixed individual-specific parameters. However, in practice, a study team may ignore heterogeneity in health preferences and estimate a conditional logit (CL) model. In this simulation study, we examine the effects of omitted variance and correlations (i.e., omitted heterogeneity) in logit parameters on the estimation of coefficients, willingness to pay (WTP), and choice predictions. The simulated DCE results show that CL estimates may be biased, depending on the heterogeneity structure used in the data-generating process. We also found that these biases in the coefficients led to substantial differences between the true and estimated WTP (i.e., up to 20%). We further found that the CL and true choice probabilities were similar to each other (i.e., the difference was less than 0.08) regardless of the underlying structure. The results imply that, under preference heterogeneity, CL estimates may differ from their true means, and these differences can have substantive effects on the WTP estimates. More specifically, CL WTP estimates may be underestimated due to interpersonal heterogeneity, and a failure to recognize this bias in HPR indirectly underestimates the value of treatment, substantially reducing quality of care. These findings have important implications in health economics because CL remains widely used in practice.
(This article belongs to the Special Issue Health Econometrics)
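The WTP estimates discussed above are the usual ratio of choice-model coefficients (standard definition):

$$ \mathrm{WTP}_k = -\,\frac{\beta_k}{\beta_{\text{cost}}}, $$

so biases in either the attribute coefficient $\beta_k$ or the cost coefficient induced by omitted heterogeneity propagate directly into the WTP estimate.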
18 pages, 569 KB  
Article
Is Climate Change Time-Reversible?
by Francesco Giancaterini, Alain Hecq and Claudio Morana
Econometrics 2022, 10(4), 36; https://doi.org/10.3390/econometrics10040036 - 7 Dec 2022
Cited by 4 | Viewed by 5716
Abstract
This paper proposes strategies to detect time reversibility in stationary stochastic processes by using the properties of mixed causal and noncausal models. It shows that these strategies can also be used for non-stationary processes when the trend component is computed with the Hodrick–Prescott filter, which yields a time-reversible closed-form solution. This paper also links the concept of an environmental tipping point to the statistical property of time irreversibility and assesses fourteen climate indicators. We find evidence of time irreversibility in greenhouse gas emissions, global temperature, global sea levels, sea ice area, and some natural oscillation indices. While not conclusive, our findings urge the implementation of correction policies to avoid the worst consequences of climate change and not miss the opportunity window, which might still be available, despite closing quickly.
(This article belongs to the Collection Econometric Analysis of Climate Change)
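For reference, a strictly stationary process $\{y_t\}$ is time-reversible if (standard definition)

$$ (y_{t_1}, \ldots, y_{t_k}) \overset{d}{=} (y_{-t_1}, \ldots, y_{-t_k}) \quad \text{for all } k \text{ and } t_1, \ldots, t_k, $$

i.e., the series behaves the same probabilistically whether run forward or backward in time; departures from this symmetry are precisely what the mixed causal–noncausal machinery detects.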
15 pages, 500 KB  
Article
A Theory-Consistent CVAR Scenario for a Monetary Model with Forward-Looking Expectations
by Katarina Juselius
Econometrics 2022, 10(2), 16; https://doi.org/10.3390/econometrics10020016 - 6 Apr 2022
Cited by 4 | Viewed by 3558
Abstract
A theory-consistent CVAR scenario describes a set of testable regularities capturing basic assumptions of the theoretical model. Using this concept, the paper considers a standard model for exchange rate determination with forward-looking expectations and shows that all assumptions about the model’s shock structure and steady-state behavior can be formulated as testable hypotheses on common stochastic trends and cointegration. The basic stationarity assumptions of the monetary model failed to obtain empirical support. They were too restrictive to explain the observed long persistent swings in the real exchange rate, the real interest rates, and the inflation and interest rate differentials.
(This article belongs to the Special Issue Celebrated Econometricians: David Hendry)
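The CVAR in question is the cointegrated vector autoregression in its error-correction form (standard notation):

$$ \Delta x_t = \alpha \beta' x_{t-1} + \sum_{i=1}^{k-1} \Gamma_i\, \Delta x_{t-i} + \mu + \varepsilon_t, $$

where $\beta$ spans the cointegration relations and $\alpha$ the adjustment coefficients; a theory-consistent scenario translates the monetary model’s assumptions into testable restrictions on $\alpha$, $\beta$, and the implied common stochastic trends.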
31 pages, 780 KB  
Article
Green Bonds for the Transition to a Low-Carbon Economy
by Andreas Lichtenberger, Joao Paulo Braga and Willi Semmler
Econometrics 2022, 10(1), 11; https://doi.org/10.3390/econometrics10010011 - 2 Mar 2022
Cited by 36 | Viewed by 11817
Abstract
The green bond market is emerging as an impactful financing mechanism in climate change mitigation efforts. The effectiveness of the financial market for this transition to a low-carbon economy depends on attracting investors and removing financial market roadblocks. This paper investigates the differential performance of green versus non-green bonds with (1) a dynamic portfolio model that integrates negative as well as positive externality effects and (2) econometric analyses of aggregate green bond and corporate energy time-series indices, as well as a cross-sectional set of individual bonds issued between 1 January 2017 and 1 October 2020. The asset pricing model demonstrates that, in the long run, the positive externalities of green bonds benefit the economy through positive social returns. We use deterministic and stochastic versions of the dynamic portfolio approach to obtain model-driven results and evaluate them against our empirical evidence using harmonic estimations. The econometric analysis focuses on the volatility and risk–return performance (Sharpe ratio) of green and non-green bonds, extending recent econometric studies that focused on yield differentials between green and non-green bonds. A modified Sharpe ratio analysis, cross-sectional methods, harmonic estimations, bond pairing estimations, and regression tree methodology indicate that green bonds tend to show lower volatility and deliver superior Sharpe ratios (while the evidence for green premia is mixed). As a result, green bond investment can protect investors and portfolios from oil price and business cycle fluctuations, and stabilize portfolio returns and volatility. Policymakers are encouraged to make use of the financial benefits of green instruments and to increase financial flows towards sustainable economic activities to accelerate the low-carbon transition.
(This article belongs to the Collection Econometric Analysis of Climate Change)
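The risk–return comparison at the heart of the econometric analysis can be illustrated in a few lines of Python. The sketch below computes annualised volatility and Sharpe ratios for two simulated daily return series standing in for a green and a non-green bond index; the numbers and the volatility gap are placeholders, not the paper's estimates.

```python
# Annualised volatility and Sharpe ratios for two simulated daily return
# series standing in for a green and a non-green bond index. Placeholder
# numbers, not the paper's data or estimates.
import numpy as np

def sharpe(returns, rf=0.0, periods=252):
    """Annualised Sharpe ratio from per-period simple returns."""
    excess = returns - rf / periods
    return np.sqrt(periods) * excess.mean() / excess.std(ddof=1)

rng = np.random.default_rng(1)
green = rng.normal(0.0002, 0.002, 1000)      # assumed lower volatility
non_green = rng.normal(0.0002, 0.004, 1000)  # assumed higher volatility

for name, r in [("green", green), ("non-green", non_green)]:
    vol = np.sqrt(252) * r.std(ddof=1)
    print(f"{name:9s} ann. vol = {vol:.3f}  Sharpe = {sharpe(r):.2f}")
```
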
7 pages, 241 KB  
Article
A New Estimator for Standard Errors with Few Unbalanced Clusters
by Gianmaria Niccodemi and Tom Wansbeek
Econometrics 2022, 10(1), 6; https://doi.org/10.3390/econometrics10010006 - 21 Jan 2022
Cited by 3 | Viewed by 4524
Abstract
In linear regression analysis, the estimator of the variance of the estimated regression coefficients should take into account the clustered nature of the data, if present, since using the standard textbook formula will in that case lead to severely downward-biased standard errors. The cluster-robust variance estimator (CRVE) generalizes the classical heteroskedasticity-robust estimator to clusters, and its justification is asymptotic in the number of clusters. Although an improvement, a considerable bias can remain when the number of clusters is low, all the more so when regressors are correlated within clusters. To address these issues, two improved methods were proposed: one, which we call CR2VE, is based on bias-reduced linearization, while the other, CR3VE, can be seen as a jackknife estimator. The latter is unbiased under very strict conditions, in particular equal cluster sizes. To relax this condition, we introduce in this paper CR3VE-λ, a generalization of CR3VE that allows cluster sizes to vary freely across clusters. We illustrate the performance of CR3VE-λ through simulations and show that, especially when cluster sizes vary widely, it can outperform the other commonly used estimators. Full article
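For orientation, the Python sketch below implements the plain CRVE sandwich formula on simulated clustered data. It shows the baseline that CR2VE, CR3VE, and the proposed CR3VE-λ refine; the authors' corrections themselves are not implemented here.

```python
# The plain cluster-robust variance estimator (CRVE), the baseline that
# CR2VE/CR3VE/CR3VE-lambda refine:
#   V = (X'X)^{-1} [ sum_g X_g' u_g u_g' X_g ] (X'X)^{-1}
import numpy as np

def crve(X, resid, clusters):
    XtX_inv = np.linalg.inv(X.T @ X)
    k = X.shape[1]
    meat = np.zeros((k, k))
    for g in np.unique(clusters):
        score_g = X[clusters == g].T @ resid[clusters == g]
        meat += np.outer(score_g, score_g)
    return XtX_inv @ meat @ XtX_inv

rng = np.random.default_rng(2)
G, n_g = 10, 20                                  # few clusters, equal sizes
clusters = np.repeat(np.arange(G), n_g)
X = np.column_stack([np.ones(G * n_g), rng.standard_normal(G * n_g)])
u = rng.standard_normal(G)[clusters] + rng.standard_normal(G * n_g)
y = X @ np.array([1.0, 0.5]) + u                 # cluster effect in u
beta = np.linalg.lstsq(X, y, rcond=None)[0]
se = np.sqrt(np.diag(crve(X, y - X @ beta, clusters)))
print("cluster-robust SEs:", se.round(3))
```
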
16 pages, 1805 KB  
Article
Forecasting Real GDP Growth for Africa
by Philip Hans Franses and Max Welz
Econometrics 2022, 10(1), 3; https://doi.org/10.3390/econometrics10010003 - 5 Jan 2022
Cited by 3 | Viewed by 6055
Abstract
We propose a simple and reproducible methodology to create a single equation forecasting model (SEFM) for low-frequency macroeconomic variables. Our methodology is illustrated by forecasting annual real GDP growth rates for 52 African countries, where the data are obtained from the World Bank and start in 1960. The models include lagged growth rates of other countries, as well as a cointegration relationship to capture potential common stochastic trends. With a few selection steps, our methodology quickly arrives at a reasonably small forecasting model per country. Compared with benchmark models, the single equation forecasting models seem to perform quite well. Full article
(This article belongs to the Special Issue Special Issue on Economic Forecasting)
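A stripped-down Python sketch of a single equation forecasting model is given below: growth is regressed on its own lag and another country's lagged growth, and a one-step forecast is formed. The cointegration term and the paper's selection steps are omitted, and all data are simulated.

```python
# A stripped-down SEFM: annual growth regressed on its own lag and one
# other country's lagged growth. Simulated data; the cointegration term
# and the paper's model-selection steps are omitted for brevity.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
T = 60
g_other = rng.normal(0.03, 0.02, T)               # other country's growth
g = np.zeros(T)
for t in range(1, T):
    g[t] = 0.01 + 0.3 * g[t - 1] + 0.4 * g_other[t - 1] + rng.normal(0, 0.01)

X = sm.add_constant(np.column_stack([g[:-1], g_other[:-1]]))
fit = sm.OLS(g[1:], X).fit()
print(fit.params.round(3))                        # const, own lag, other lag
x_new = np.array([[1.0, g[-1], g_other[-1]]])
print(f"one-step-ahead forecast: {fit.predict(x_new)[0]:.4f}")
```
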
17 pages, 600 KB  
Article
Second-Order Least Squares Estimation in Nonlinear Time Series Models with ARCH Errors
by Mustafa Salamh and Liqun Wang
Econometrics 2021, 9(4), 41; https://doi.org/10.3390/econometrics9040041 - 27 Nov 2021
Cited by 6 | Viewed by 4131
Abstract
Many financial and economic time series exhibit nonlinear patterns or relationships. However, most statistical methods for time series analysis are developed for mean-stationary processes, which require transformations of the data such as differencing. In this paper, we study a dynamic regression model with a nonlinear, time-varying mean function and autoregressive conditionally heteroscedastic (ARCH) errors. We propose an estimation approach based on the first two conditional moments of the response variable, which does not require specification of the error distribution. Strong consistency and asymptotic normality of the proposed estimator are established under a strong-mixing condition, so the results apply to both stationary and mean-nonstationary processes. Moreover, the proposed approach is shown to be superior to the commonly used quasi-likelihood approach, and the efficiency gain is significant when the (conditional) error distribution is asymmetric. We demonstrate through a real data example that the proposed method can identify a more accurate model than the quasi-likelihood method. Full article
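The idea of estimating from the first two conditional moments can be made concrete in a few lines. The Python sketch below applies second-order least squares with an identity weighting matrix to a toy nonlinear regression with homoscedastic errors; the paper's dynamic ARCH setting and optimal weighting are more involved.

```python
# Second-order least squares on a toy nonlinear regression
# y = exp(theta * x) + e, using E[y|x] = m and E[y^2|x] = m^2 + sigma^2
# with an identity weighting matrix. The paper's dynamic ARCH setting and
# optimal weighting are more general than this sketch.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
x = rng.uniform(0, 1, 400)
theta0, sigma0 = 0.8, 0.5
y = np.exp(theta0 * x) + rng.normal(0, sigma0, 400)

def sls_objective(par):
    theta, sig2 = par
    m = np.exp(theta * x)
    r1 = y - m                       # first-moment residual
    r2 = y**2 - (m**2 + sig2)        # second-moment residual
    return np.sum(r1**2 + r2**2)

res = minimize(sls_objective, x0=[0.5, 1.0], method="Nelder-Mead")
print("theta_hat, sigma2_hat:", np.round(res.x, 3))   # near (0.8, 0.25)
```
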
27 pages, 469 KB  
Article
Cointegration, Root Functions and Minimal Bases
by Massimo Franchi and Paolo Paruolo
Econometrics 2021, 9(3), 31; https://doi.org/10.3390/econometrics9030031 - 17 Aug 2021
Cited by 3 | Viewed by 3769
Abstract
This paper discusses the notion of cointegrating space for linear processes integrated of any order. It first shows that the notions of (polynomial) cointegrating vectors and of root functions coincide. Second, it discusses how the cointegrating space can be defined (i) as a vector space of polynomial vectors over complex scalars, (ii) as a free module of polynomial vectors over scalar polynomials, or finally (iii) as a vector space of rational vectors over rational scalars. Third, it shows that a canonical set of root functions can be used as a basis of the various notions of cointegrating space. Fourth, it reviews results on how to reduce polynomial bases to minimal order—i.e., minimal bases. The application of these results to Vector AutoRegressive processes integrated of order 2 is found to imply the separation of polynomial cointegrating vectors from non-polynomial ones. Full article
(This article belongs to the Special Issue Celebrated Econometricians: Katarina Juselius and Søren Johansen)
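A concrete instance of a polynomial cointegrating vector may help fix ideas. In the simulated Python example below, with a common I(2) trend s_t, y1 = s_t + noise and y2 = Δs_t + noise, the polynomial vector β(z) = (1 − z, −1)' renders β(L)'y_t stationary; this is an illustrative special case, not the paper's general constructions.

```python
# A polynomial cointegrating vector in action: with a common I(2) trend
# s_t, let y1 = s_t + noise (I(2)) and y2 = (s_t - s_{t-1}) + noise (I(1)).
# Then beta(L)'y_t = (1 - L) y1_t - y2_t is stationary.
import numpy as np
from statsmodels.tsa.stattools import adfuller

rng = np.random.default_rng(5)
T = 1000
s = np.cumsum(np.cumsum(rng.standard_normal(T)))   # I(2) common trend
y1 = s + rng.standard_normal(T)
y2 = np.diff(s, prepend=0.0) + rng.standard_normal(T)

z = np.diff(y1) - y2[1:]                           # (1 - L) y1_t - y2_t
print("ADF p-value of beta(L)'y_t:", round(adfuller(z)[1], 4))  # stationary
```
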
20 pages, 589 KB  
Article
Semiparametric Estimation of a Corporate Bond Rating Model
by Yixiao Jiang
Econometrics 2021, 9(2), 23; https://doi.org/10.3390/econometrics9020023 - 28 May 2021
Cited by 3 | Viewed by 4943
Abstract
This paper investigates the incentive of credit rating agencies (CRAs) to bias ratings using a semiparametric, ordered-response model. The proposed model explicitly takes conflicts of interest into account and allows the ratings to depend flexibly on risk attributes through a semiparametric index structure. Asymptotic normality of the estimator is derived after applying several bias correction techniques. Using Moody’s rating data from 2001 to 2016, I found that firms related to Moody’s shareholders were more likely to receive better ratings. Such favorable treatment was more pronounced in investment-grade bonds than in high-yield bonds, with the 2007–2009 financial crisis being an exception. Parametric models, such as the ordered probit, failed to identify this heterogeneity of the rating bias across bond categories. Full article
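The parametric benchmark the paper contrasts with is an ordered-response model of ratings on risk attributes and a conflict-of-interest indicator. The Python sketch below fits such an ordered probit on simulated data with hypothetical variable names; the paper's semiparametric index estimator is not implemented here.

```python
# The parametric ordered-probit benchmark: ratings on a risk index and a
# shareholder-affiliation dummy. Simulated data, hypothetical names; the
# paper's semiparametric index estimator is not implemented here.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(6)
n = 2000
risk = rng.standard_normal(n)
affiliated = rng.integers(0, 2, n)            # related to CRA shareholders?
latent = -1.0 * risk + 0.3 * affiliated + rng.standard_normal(n)
rating = pd.Series(pd.cut(latent, [-np.inf, -1, 0, 1, np.inf],
                          labels=["B", "BB", "A", "AA"]))  # ordered grades

X = pd.DataFrame({"risk": risk, "affiliated": affiliated})
fit = OrderedModel(rating, X, distr="probit").fit(method="bfgs", disp=False)
print(fit.params[["risk", "affiliated"]].round(3))
```
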
21 pages, 731 KB  
Article
Asymptotic and Finite Sample Properties for Multivariate Rotated GARCH Models
by Manabu Asai, Chia-Lin Chang, Michael McAleer and Laurent Pauwels
Econometrics 2021, 9(2), 21; https://doi.org/10.3390/econometrics9020021 - 4 May 2021
Cited by 2 | Viewed by 3934
Abstract
This paper derives the statistical properties of a two-step approach to estimating multivariate rotated GARCH-BEKK (RBEKK) models. From the definition of RBEKK, the unconditional covariance matrix is estimated in the first step and used to rotate the observed variables so that their sample covariance matrix is the identity matrix. In the second step, the remaining parameters are estimated by maximizing the quasi-log-likelihood function. For this two-step quasi-maximum likelihood (2sQML) estimator, this paper shows consistency and asymptotic normality under weak conditions. While second-order moments are needed for the consistency of the estimated unconditional covariance matrix, the existence of finite sixth-order moments is required for the convergence of the second-order derivatives of the quasi-log-likelihood function. This paper also shows the relationship between the asymptotic distributions of the 2sQML estimator for the RBEKK model and the variance-targeting quasi-maximum likelihood estimator for the VT-BEKK model. Monte Carlo experiments show that the bias of the 2sQML estimator is negligible and that the appropriateness of the diagonal specification depends on the closeness to either the diagonal BEKK or the diagonal RBEKK model. An empirical analysis of the returns of stocks listed on the Dow Jones Industrial Average indicates that the choice between the diagonal BEKK and diagonal RBEKK models changes over time, but most of the differences between the two forecasts are negligible. Full article
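The first step of the procedure is easy to reproduce. The Python sketch below rotates simulated returns by the inverse symmetric square root of their sample covariance matrix, after which the rotated series has an identity sample covariance; the second-step QML estimation of the BEKK recursion is not shown.

```python
# Step one of the two-step RBEKK procedure: rotate returns by the inverse
# symmetric square root of their sample covariance matrix. The rotated
# series has an identity sample covariance; step two (QML estimation of
# the BEKK recursion on the rotated data) is not shown.
import numpy as np

rng = np.random.default_rng(7)
returns = rng.multivariate_normal([0, 0], [[2.0, 0.8], [0.8, 1.0]], 1000)

H = np.cov(returns, rowvar=False)              # unconditional covariance
vals, vecs = np.linalg.eigh(H)
H_inv_sqrt = vecs @ np.diag(vals ** -0.5) @ vecs.T
rotated = returns @ H_inv_sqrt

print(np.cov(rotated, rowvar=False).round(3))  # identity matrix
```
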
35 pages, 3691 KB  
Article
Quantile Regression with Generated Regressors
by Liqiong Chen, Antonio F. Galvao and Suyong Song
Econometrics 2021, 9(2), 16; https://doi.org/10.3390/econometrics9020016 - 12 Apr 2021
Cited by 8 | Viewed by 5236
Abstract
This paper studies estimation and inference for linear quantile regression models with generated regressors. We suggest a practical two-step estimation procedure, in which the generated regressors are computed in the first step. The asymptotic properties of the two-step estimator, namely consistency and asymptotic normality, are established. We show that the asymptotic variance-covariance matrix needs to be adjusted to account for the first-step estimation error. We propose a general estimator of the asymptotic variance-covariance matrix, establish its consistency, and develop testing procedures for linear hypotheses in these models. Monte Carlo simulations evaluating the finite-sample performance of the estimation and inference procedures are provided. Finally, we apply the proposed methods to study Engel curves for various commodities using data from the UK Family Expenditure Survey. We document strong heterogeneity in the estimated Engel curves along the conditional distribution of the budget share of each commodity. The empirical application also emphasizes that using the proposed estimator to construct correct confidence intervals for the estimated Engel curves is important for inference. Full article
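The two-step procedure can be sketched as follows: a first-stage regression supplies the generated regressor, whose fitted values then enter a quantile regression. In the Python toy example below (simulated data, hypothetical first stage), the second step's reported standard errors are the naive ones; the paper's contribution is precisely the adjustment those errors require.

```python
# Two-step estimation with a generated regressor: a first-stage projection
# of the regressor on z supplies fitted values for a second-step median
# regression. Simulated data; the reported second-step standard errors are
# the naive ones that the paper shows must be adjusted.
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(8)
n = 1000
z = rng.standard_normal(n)
x = 1.0 + 0.8 * z + rng.standard_normal(n)       # first-stage relationship
y = 0.5 + 1.5 * x + rng.standard_normal(n)

x_hat = sm.OLS(x, sm.add_constant(z)).fit().fittedvalues   # step 1
fit = QuantReg(y, sm.add_constant(x_hat)).fit(q=0.5)       # step 2
print(fit.params.round(3))        # naive SEs understate uncertainty
```
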
24 pages, 2606 KB  
Article
Forecast Accuracy Matters for Hurricane Damage
by Andrew B. Martinez
Econometrics 2020, 8(2), 18; https://doi.org/10.3390/econometrics8020018 - 14 May 2020
Cited by 26 | Viewed by 9191
Abstract
I analyze damage from hurricane strikes on the United States since 1955. Using machine learning methods to select the most important drivers of damage, I show that large errors in a hurricane’s predicted landfall location result in higher damage. This relationship holds across a wide range of model specifications and when controlling for ex-ante uncertainty and potential endogeneity. Using a counterfactual exercise, I find that the cumulative reduction in damage from forecast improvements since 1970 is about $82 billion, which exceeds the U.S. government’s spending on the forecasts and private willingness to pay for them. Full article
(This article belongs to the Collection Econometric Analysis of Climate Change)
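The driver-selection step can be illustrated with an off-the-shelf penalized regression. The Python sketch below runs a cross-validated Lasso of simulated log damage on standardized candidate predictors and reports which survive; variable names and data are placeholders, and the paper's actual machine learning methodology may differ.

```python
# Machine-learning driver selection in the spirit of the paper: a
# cross-validated Lasso on standardised candidate predictors keeps the
# variables with nonzero coefficients. Simulated placeholders throughout.
import numpy as np
from sklearn.linear_model import LassoCV
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(9)
n = 200
names = ["wind", "pressure", "track_error", "rain"]   # hypothetical drivers
X = rng.standard_normal((n, 4))
log_damage = 1.0 + 0.8 * X[:, 0] + 0.5 * X[:, 2] + rng.standard_normal(n)

lasso = LassoCV(cv=5).fit(StandardScaler().fit_transform(X), log_damage)
selected = [nm for nm, b in zip(names, lasso.coef_) if abs(b) > 1e-6]
print("selected drivers:", selected)   # typically wind and track_error
```
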
23 pages, 361 KB  
Article
Cointegration and Error Correction Mechanisms for Singular Stochastic Vectors
by Matteo Barigozzi, Marco Lippi and Matteo Luciani
Econometrics 2020, 8(1), 3; https://doi.org/10.3390/econometrics8010003 - 4 Feb 2020
Cited by 18 | Viewed by 7092
Abstract
Large-dimensional dynamic factor models and dynamic stochastic general equilibrium models, both widely used in empirical macroeconomics, deal with singular stochastic vectors, i.e., vectors of dimension r that are driven by a q-dimensional white noise, with q < r. The present paper studies cointegration and error correction representations for an I(1) singular stochastic vector y_t. It is easily seen that y_t is necessarily cointegrated with cointegrating rank c ≥ r − q. Our contributions are: (i) we generalize Johansen’s proof of the Granger representation theorem to I(1) singular vectors under the assumption that y_t has a rational spectral density; (ii) using recent results on singular vectors by Anderson and Deistler, we prove that, for generic values of the parameters, the autoregressive representation of y_t has a finite-degree polynomial. The relationship between the cointegration of the factors and the cointegration of the observable variables in a large-dimensional factor model is also discussed. Full article
(This article belongs to the Special Issue Celebrated Econometricians: Katarina Juselius and Søren Johansen)
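The bound c ≥ r − q is easy to visualize numerically. In the Python sketch below, three series (r = 3) load on a single random-walk trend (q = 1), so any vector orthogonal to the loading column is cointegrating and the cointegrating rank is 2; this is an illustration of the bound, not of the paper's representation theory.

```python
# Illustrating c >= r - q: three series (r = 3) driven by one random-walk
# trend (q = 1). Any vector orthogonal to the loading column annihilates
# the trend, so the cointegrating rank is r - q = 2.
import numpy as np

rng = np.random.default_rng(10)
T = 1000
trend = np.cumsum(rng.standard_normal(T))       # single common I(1) trend
loadings = np.array([1.0, 0.5, -2.0])           # r x q loading "matrix"
y = np.outer(trend, loadings) + 0.1 * rng.standard_normal((T, 3))

b1 = np.array([0.5, -1.0, 0.0])                 # b1 @ loadings == 0
b2 = np.array([2.0, 0.0, 1.0])                  # b2 @ loadings == 0
for b in (b1, b2):
    print("variance of b'y_t:", (y @ b).var().round(4))  # small: stationary
```
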
14 pages, 304 KB  
Article
A Frequentist Alternative to Significance Testing, p-Values, and Confidence Intervals
by David Trafimow
Econometrics 2019, 7(2), 26; https://doi.org/10.3390/econometrics7020026 - 4 Jun 2019
Cited by 42 | Viewed by 11594
Abstract
There has been much debate about null hypothesis significance testing, p-values without null hypothesis significance testing, and confidence intervals. The first major section of the present article addresses some of the main reasons these procedures are problematic and concludes that none of them is satisfactory. However, there is a new procedure, termed the a priori procedure (APP), that validly aids researchers in obtaining sample statistics that have acceptable probabilities of being close to their corresponding population parameters. The second major section provides a description and review of APP advances. Not only does the APP avoid the problems that plague other inferential statistical procedures, but it is also easy to perform. Although the APP can be performed in conjunction with other procedures, the present recommendation is that it be used alone. Full article
(This article belongs to the Special Issue Towards a New Paradigm for Statistical Evidence)
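In its simplest case, for the mean of a normal population, the APP picks the sample size n so that the sample mean falls within f standard deviations of the population mean with probability c, giving n = (z/f)² with z the (1 + c)/2 normal quantile. The Python sketch below implements this textbook case; the APP literature covers many further settings.

```python
# The APP in its simplest case: sample size for the mean of a normal
# population so that the sample mean is within f sigma of the population
# mean with probability c, i.e. n = (z / f)^2 with z the (1 + c)/2 quantile.
import math
from scipy.stats import norm

def app_sample_size(f, c):
    """Required n for precision f (in sigma units) and confidence c."""
    z = norm.ppf((1 + c) / 2)
    return math.ceil((z / f) ** 2)

# To be 95% confident the sample mean lies within 0.1 sigma of the mean:
print(app_sample_size(f=0.1, c=0.95))   # 385
```
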
11 pages, 3486 KB  
Article
Pitfalls of Two-Step Testing for Changes in the Error Variance and Coefficients of a Linear Regression Model
by Pierre Perron and Yohei Yamamoto
Econometrics 2019, 7(2), 22; https://doi.org/10.3390/econometrics7020022 - 21 May 2019
Cited by 9 | Viewed by 7289
Abstract
In empirical applications based on linear regression models, structural changes often occur in both the error variance and regression coefficients, possibly at different dates. A commonly applied method is to first test for changes in the coefficients (or in the error variance) and, conditional on the break dates found, test for changes in the variance (or in the coefficients). In this note, we provide evidence that such procedures have poor finite sample properties when the changes in the first step are not correctly accounted for. In doing so, we show that testing for changes in the coefficients (or in the variance) ignoring changes in the variance (or in the coefficients) induces size distortions and loss of power. Our results illustrate a need for a joint approach to test for structural changes in both the coefficients and the variance of the errors. We provide some evidence that the procedures suggested by Perron et al. (2019) provide tests with good size and power. Full article
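The size distortions at issue can be reproduced in a toy Monte Carlo. In the Python sketch below, the mean never changes, but the error variance breaks; a Chow-type test for a mean break at the variance-break date then rejects far more often than its nominal 5% level. This known-break, mean-only design is only a stylized illustration of the distortions the note documents.

```python
# Toy Monte Carlo: the mean never changes, but the error variance does.
# A Chow-type F test for a mean break at the (known) variance-break date
# rejects far more often than its nominal 5% level.
import numpy as np
from scipy.stats import f as f_dist

rng = np.random.default_rng(11)
T, T1, reps = 200, 50, 2000
crit = f_dist.ppf(0.95, 1, T - 2)
rejections = 0

for _ in range(reps):
    y = np.concatenate([3 * rng.standard_normal(T1),      # sd = 3 regime
                        rng.standard_normal(T - T1)])      # sd = 1 regime
    rss0 = np.sum((y - y.mean()) ** 2)                     # no-break fit
    rss1 = (np.sum((y[:T1] - y[:T1].mean()) ** 2)
            + np.sum((y[T1:] - y[T1:].mean()) ** 2))       # break-in-mean fit
    F = (rss0 - rss1) / (rss1 / (T - 2))
    rejections += F > crit

print("empirical size at nominal 5%:", rejections / reps)  # well above 0.05
```
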
24 pages, 365 KB  
Article
Covariance Prediction in Large Portfolio Allocation
by Carlos Trucíos, Mauricio Zevallos, Luiz K. Hotta and André A. P. Santos
Econometrics 2019, 7(2), 19; https://doi.org/10.3390/econometrics7020019 - 9 May 2019
Cited by 10 | Viewed by 8973
Abstract
Many financial decisions, such as portfolio allocation, risk management, option pricing and hedging strategies, are based on forecasts of the conditional variances, covariances, and correlations of financial returns. This paper presents an empirical comparison of several methods for predicting one-step-ahead conditional covariance matrices. These matrices are used as inputs to obtain out-of-sample minimum-variance portfolios based on stocks belonging to the S&P 500 index from 2000 to 2017, as well as sub-periods. The analysis is conducted using several metrics, including standard deviation, turnover, net average return, information ratio and Sortino’s ratio. We find that no method is best in all scenarios and that performance depends on the criterion, the period of analysis, and the rebalancing strategy. Full article
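One of the simpler forecasting pipelines of this kind can be written in a few lines. The Python sketch below forms a one-step-ahead EWMA (RiskMetrics-type, λ = 0.94) covariance forecast from simulated returns and plugs it into the closed-form minimum-variance weights w = S⁻¹1 / (1'S⁻¹1); it is a generic illustration, not one of the paper's specific methods.

```python
# EWMA (RiskMetrics-type, lambda = 0.94) one-step-ahead covariance
# forecast feeding closed-form minimum-variance portfolio weights
# w = S^{-1} 1 / (1' S^{-1} 1). Simulated returns; a generic illustration,
# not one of the paper's specific methods.
import numpy as np

rng = np.random.default_rng(12)
T, N, lam = 500, 5, 0.94
returns = rng.multivariate_normal(np.zeros(N), 0.0001 * np.eye(N), T)

S = np.cov(returns[:50], rowvar=False)         # initialise on a burn-in
for r in returns[50:]:
    S = lam * S + (1 - lam) * np.outer(r, r)   # EWMA update

ones = np.ones(N)
w = np.linalg.solve(S, ones)
w /= ones @ w                                  # minimum-variance weights
print("weights:", w.round(3), "| sum:", round(float(w.sum()), 3))
```
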