Econometrics, Volume 14, Issue 1 (March 2026) – 16 articles

Cover Story: In this paper, we examine the market repercussions of the Binance USD (BUSD) delisting within the broader stablecoin ecosystem. Following regulatory pressures on Paxos, the phase-out of BUSD created a significant vacuum in the digital asset space. Utilizing a Local Projections (LP) approach, we analyze how this shock propagated across major stablecoins such as USDT, USDC, and DAI, in addition to core cryptocurrencies. Unlike traditional VAR models, our methodology captures the dynamic impulse responses of market liquidity and volume with greater flexibility. The findings highlight the interconnectedness of stablecoin markets and provide critical insights for investors and regulators regarding the systemic risks and stability of fiat-pegged digital assets during major delisting events.
  • Issues are regarded as officially published after their release is announced to the table of contents alert mailing list.
  • You may sign up for e-mail alerts to receive the tables of contents of newly released issues.
  • Papers are published in both HTML and PDF forms, with PDF as the official format. To view a paper in PDF form, click on the "PDF Full-text" link, and use the free Adobe Reader to open it.
16 pages, 1800 KB  
Article
Navigating Extreme Market Fluctuations: Asset Allocation Strategies in Developed vs. Emerging Economies
by Lumengo Bonga-Bonga
Econometrics 2026, 14(1), 16; https://doi.org/10.3390/econometrics14010016 - 17 Mar 2026
Viewed by 604
Abstract
This paper examines how assets from emerging and developed stock markets can be efficiently allocated during periods of financial crisis by integrating traditional portfolio theory with Extreme Value Theory (EVT), using the Generalized Pareto Distribution (GPD) and Generalized Extreme Value (GEV) approaches to model tail risks. This study evaluates mean-variance portfolios constructed under each EVT framework and finds that portfolios based on GPD estimates consistently favour emerging market assets, which outperform both developed market and internationally diversified portfolios during extreme market conditions. In contrast, GEV-based portfolios indicate superior performance for developed market assets, highlighting the distinct behaviour of returns in the upper and lower tails of the distribution. These contrasting results reveal the unique nature of safe-haven characteristics associated with developed economies, the assets of which demonstrate greater stability and resilience during episodes of financial stress. By showing how tail-risk modelling alters optimal portfolio weights across market types, this paper contributes new evidence to the literature on crisis-informed asset allocation and offers practical insights for investors seeking robust diversification strategies under extreme market fluctuations. Full article
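The Generalized Pareto Distribution (GPD) tail modelling mentioned in this abstract can be sketched in a few lines. The snippet below is a purely illustrative peaks-over-threshold fit using method-of-moments estimates on simulated losses; it is not the authors' estimation procedure, and all numbers are made up.

```python
import numpy as np

def gpd_moment_fit(exceedances):
    """Method-of-moments estimates (shape xi, scale sigma) for a GPD
    fitted to threshold exceedances; valid for xi < 1/2 (finite variance)."""
    m, v = exceedances.mean(), exceedances.var()
    xi = 0.5 * (1.0 - m * m / v)
    sigma = 0.5 * m * (m * m / v + 1.0)
    return xi, sigma

rng = np.random.default_rng(0)
u = rng.uniform(size=100_000)
xi_true, sigma_true = 0.2, 1.0
# Inverse-CDF sampling of GPD exceedances beyond a threshold
x = sigma_true / xi_true * ((1.0 - u) ** (-xi_true) - 1.0)

xi_hat, sigma_hat = gpd_moment_fit(x)
# Tail quantile (99th percentile of exceedances) implied by the fit
q99 = sigma_hat / xi_hat * (0.01 ** (-xi_hat) - 1.0)
```

In practice the shape parameter drives how heavily a portfolio rule penalizes extreme losses, which is why the GPD- and GEV-based portfolios in the paper can disagree.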

28 pages, 1479 KB  
Article
Double-Edged Sword of Diversification: Commodities and African Equity Indices in Robust vs. Optimal Portfolio Strategies
by Anaclet K. Kitenge, John W. M. Mwamba and Jules C. Mba
Econometrics 2026, 14(1), 15; https://doi.org/10.3390/econometrics14010015 - 16 Mar 2026
Viewed by 555
Abstract
This study empirically investigates a central tension in quantitative finance: the divergence between theoretically optimal and robust portfolio construction under real-world estimation uncertainty. Using a dynamic, time-varying optimization framework, we compare the performance of three distinct strategies: the Maximum Sharpe ratio (P1), Minimum Variance (P2), and Maximum Entropy (P3) portfolios, with and without commodity proxy inclusion (gold and oil) in a multi-asset universe featuring prominent African equity indices. Our key finding challenges classical theory: the robust Maximum Entropy portfolio (P3) achieved superior realized risk-adjusted returns (Sharpe ratio: 1.164) compared to the theoretically optimal Maximum Sharpe portfolio (P1, Sharpe: 0.788). This result validates the “estimation-error maximization” critique, as P1’s performance was undermined by its sensitivity to noisy inputs. Conversely, the Minimum Variance portfolio (P2) successfully fulfilled its objective, achieving the lowest volatility (~5%) at the cost of modest returns (3.01–3.64%), illustrating the classic risk–return trade-off. Euler decomposition revealed that even this low-volatility portfolio exhibited significant concentration risk, with over 40% of its risk attributable to just three assets. The role of commodities proves to be strategy-contingent. They significantly enhanced returns and the Sharpe ratio for the aggressive P1 but were marginally detrimental to the robust P3. African market indices played specialized roles: Egypt and Nigeria acted as return drivers in P1, Morocco became a major risk contributor within the concentrated P2 strategy, and South Africa provided key diversification in the well-balanced P3. Ultimately, the study demonstrates that portfolio risk is determined more by asset concentration and diversification quality than by geographic labels, and that robust diversification methodologies outperform fragile theoretical optima in practice. We conclude that portfolio construction must prioritize robustness to estimation error and explicit risk-balancing to ensure stable, real-world performance. Full article

20 pages, 1386 KB  
Article
A New Functional Setting for Term Structure Modeling Using the Heath–Jarrow–Morton Framework
by Michael Pokojovy, Ebenezer Nkum and Thomas M. Fullerton, Jr.
Econometrics 2026, 14(1), 14; https://doi.org/10.3390/econometrics14010014 - 11 Mar 2026
Viewed by 453
Abstract
The well-known Heath–Jarrow–Morton (HJM) framework provides a universal and efficacious instrument for modeling the stochastic evolution of an entire yield curve by explaining the interest rate dynamics in continuous time under no-arbitrage conditions. Existing implementations involve exponentially weighted function spaces as theoretical settings for this stochastic evolution. While the choice of weight can have a drastic effect on model calibration and subsequent forecasting, it cannot be estimated from market data and does not allow for any objective interpretation. The proposed approach does not have this shortcoming as it adopts a suitably designed unweighted function space. The HJM equation is discretized using a finite difference approach. The resulting semiparametric model is then calibrated on real-world yield data with a new type of functional principal component analysis (PCA)-based approach. Backtesting and benchmarking are conducted against the one-factor Vasicek model using historical data to illustrate the framework's simulation capabilities for prediction and uncertainty quantification. Additionally, in contrast to widely studied US treasuries, negative interest rates are observed for AAA Euro Bonds during the sample period employed for this study. Accordingly, the framework allows for the possibility of negative yields. Full article

12 pages, 289 KB  
Article
Analysis of School Absenteeism for Single- vs. Two-Parent Families: A Finite Mixture Roy Approach
by Murat K. Munkin and David Zimmer
Econometrics 2026, 14(1), 13; https://doi.org/10.3390/econometrics14010013 - 9 Mar 2026
Viewed by 428
Abstract
This paper analyzes factors affecting school absenteeism due to an injury or illness among the US school student population between 6 and 15 years of age. The number of missed school days displays overdispersion and is modeled using the Finite Mixture Roy (FMR) model for count variables. The married/single parent family status (treatment) is potentially endogenous to the dependent variable (missed days). The Roy structure controls observed heterogeneity due to the mother’s marital status. Finite mixtures are intended to control unobserved heterogeneity due to healthy and unhealthy children in the sample. This approach facilitates identification of latent subpopulations in which treatment and marginal effects are relatively homogeneous. The model also incorporates two application-driven extensions. First, probabilities of the latent components are modeled as functions of regressors. Second, the mother’s income affects treatment nonparametrically. The FMR model is estimated with two latent components in each state, corresponding to healthy and unhealthy students. The results indicate that maternal marital status decreases annual missed school days by approximately 13 percent for a randomly drawn child; however, this increases absenteeism by about 14 percent among families that self-select into two-parent households, which is evidence of adverse selection. Full article

15 pages, 593 KB  
Article
Using Subspace Algorithms for the Estimation of Linear State Space Models for Over-Differenced Processes
by Dietmar Bauer
Econometrics 2026, 14(1), 12; https://doi.org/10.3390/econometrics14010012 - 28 Feb 2026
Viewed by 429
Abstract
Subspace algorithms like canonical variate analysis (CVA) are regression-based methods for the estimation of linear dynamic state space models. They have been shown to deliver accurate (consistent and asymptotically equivalent to quasi-maximum likelihood estimation using the Gaussian likelihood) estimators for stably invertible stationary autoregressive moving average (ARMA) processes. These results use the assumption that there are no zeros of the spectral density on the unit circle corresponding to the state space system. In this technical study, we consider vector processes made stationary by applying differencing to all variables, ignoring potential co-integrating relations. This leads to spectral zeros violating the above mentioned assumptions. We show consistency for the CVA estimators, closing a gap in the literature. However, a simulation exercise shows that over-differencing (while leading to consistent estimation of the transfer function) also complicates inference for CVA estimators, not just maximum likelihood-based estimators. This is also demonstrated in a real-world data example. The result also applies to seasonal differencing. The present paper hence suggests working with original data, not working in differences. Full article

23 pages, 1745 KB  
Article
Graph Attention Networks in Exchange Rate Forecasting
by Joanna Landmesser-Rusek and Arkadiusz Orłowski
Econometrics 2026, 14(1), 11; https://doi.org/10.3390/econometrics14010011 - 25 Feb 2026
Viewed by 1169
Abstract
Exchange rate forecasting is an important issue in financial market analysis. Currency rates form a dynamic network of connections that can be efficiently modeled using graph neural networks (GNNs). The key mechanism of GNNs is the message passing between nodes, allowing for better modeling of currency interactions. Each node updates its representation by aggregating features from its neighbors and combining them with its own. In graph convolutional networks (GCNs), all neighboring nodes are treated equally, but in reality, some may have a greater influence than others. To account for this changing importance of neighbors, graph attention networks (GAT) have been introduced. The aim of the study was to evaluate the effectiveness of GAT in forecasting exchange rates. The analysis covered time series of major world currencies from 2020 to 2024. The forecasting results obtained using GAT were compared with those obtained from benchmark models such as ARIMA, GARCH, MLP, GCN, and LSTM-GCN. The study showed that GAT networks outperform numerous methods. The results may have practical applications, supporting investors and analysts in decision-making. Full article
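The attention-weighted message passing that distinguishes a GAT layer from a GCN layer can be illustrated with a toy single-head layer in NumPy. The three-node graph, random features, and random weights below are invented for demonstration only and have nothing to do with the paper's currency data.

```python
import numpy as np

def gat_layer(H, A, W, a, alpha=0.2):
    """Single-head graph attention aggregation.
    H: (n, f) node features, A: (n, n) adjacency with self-loops,
    W: (f, f') projection matrix, a: (2*f',) attention vector."""
    Z = H @ W                                    # project node features
    n = Z.shape[0]
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # attention logit e_ij = LeakyReLU(a^T [z_i || z_j])
            s = np.concatenate([Z[i], Z[j]]) @ a
            e[i, j] = s if s > 0 else alpha * s
    e = np.where(A > 0, e, -np.inf)              # mask non-neighbors
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)        # row-wise softmax
    return att, np.tanh(att @ Z)                 # weights, updated features

rng = np.random.default_rng(1)
A = np.array([[1, 1, 1],
              [1, 1, 0],
              [1, 0, 1]], dtype=float)           # toy 3-node currency graph
H = rng.normal(size=(3, 4))
att, H_new = gat_layer(H, A, rng.normal(size=(4, 2)), rng.normal(size=(4,)))
```

Unlike a GCN, where each neighbor receives the same normalized weight, the learned softmax weights `att` differ per neighbor, which is exactly the "changing importance" the abstract refers to.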

19 pages, 323 KB  
Article
Application of Resolution Regression and Resolution Graphs in Evaluating Probability Forecasts Generated Using Binary Choice Models
by Senarath Dharmasena, David A. Bessler and Oral Capps, Jr.
Econometrics 2026, 14(1), 10; https://doi.org/10.3390/econometrics14010010 - 24 Feb 2026
Viewed by 647
Abstract
Binary choice models are widely used in econometric modeling when the dependent variable corresponds to discrete outcomes. With appropriate decision rules, these models provide predictions of binary choices generated from predicted probabilities. The accuracy of these predictions in terms of classifying probabilities to events that occurred versus those that did not is a key issue. The use of expectation-prediction success at present is the standard method used to assess the accuracy of these predictions. However, this method is limited in its ability to correctly classify probabilities in the absence of appropriate predetermined cut-off levels. We propose alternative methods to classify probabilities generated through binary choice models, namely resolution graphs and resolution regressions that measure the ability to sort predicted probabilities against observed outcomes. Using probabilities generated from the use of logit models applied to purchasing decisions of various non-alcoholic beverages made by U.S. households, we compare probability sorting power using expectation-prediction success as well as resolution graphs and resolution regressions. Based on expectation-prediction success, the logit models performed better at classifying outcomes related to purchasing isotonic drinks, regular soft drinks, diet drinks, bottled water, and tea. Based on resolution regressions, the null hypothesis of perfect sorting of probabilities was rejected for all non-alcoholic beverages. Although the logit models generated upward-sloping resolution graphs as expected, they were relatively flat compared to the 45-degree perfect sorting line. Going forward, we recommend using resolution regression and resolution graphs to capture sorting of probabilities in addition to the conventional metrics used in ascertaining the ability of binary choice models to predict out-of-sample behavior. Full article
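The resolution-regression idea, regressing observed outcomes on predicted probabilities and comparing the fit to the 45-degree perfect-sorting line, can be sketched as follows. The forecasts here are simulated, not drawn from the paper's beverage-purchase data, and the simple OLS below is only a stand-in for the authors' procedure.

```python
import numpy as np

def resolution_regression(p, y):
    """OLS of observed binary outcomes y on predicted probabilities p.
    Perfect sorting corresponds to intercept 0 and slope 1."""
    X = np.column_stack([np.ones_like(p), p])
    beta, *_ = np.linalg.lstsq(X, y.astype(float), rcond=None)
    return beta  # [intercept, slope]

rng = np.random.default_rng(2)
p = rng.uniform(size=5_000)
# Well-calibrated forecasts: outcomes actually occur with probability p
y_good = (rng.uniform(size=p.size) < p).astype(int)
# Weakly sorting forecasts: outcomes respond only faintly to p
y_flat = (rng.uniform(size=p.size) < 0.5 + 0.2 * (p - 0.5)).astype(int)

b_good = resolution_regression(p, y_good)   # slope near 1
b_flat = resolution_regression(p, y_flat)   # noticeably flatter slope
```

A flat slope mirrors the paper's finding of resolution graphs that rise but stay well below the 45-degree line.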

44 pages, 3374 KB  
Article
Econometric Analysis and Forecasts on Exports of Emerging Economies from Central and Eastern Europe
by Liviu Popescu, Mirela Găman, Laurențiu Stelian Mihai, Cristian Ovidiu Drăgan, Daniel Militaru and Ion Buligiu
Econometrics 2026, 14(1), 9; https://doi.org/10.3390/econometrics14010009 - 14 Feb 2026
Viewed by 941
Abstract
This study examines the evolution, heterogeneity, and short-term prospects of export performance in seven Central and Eastern European (CEE) economies—Croatia, Czech Republic, Hungary, Poland, Romania, Bulgaria, and Slovakia—over the period 1995–2024. Using annual World Bank data, exports are modeled as a share of GDP to ensure cross-country comparability and to capture differences in trade dependence. The analysis combines descriptive and inferential statistics with Augmented Dickey–Fuller tests, non-parametric comparisons, Granger causality analysis, and country-specific ARIMA models to investigate export dynamics, the role of foreign direct investment (FDI), and future export trajectories. The results reveal a common long-term upward trend in export intensity across all countries, driven by European integration and structural transformation, but with pronounced cross-country differences in export dependence and volatility. Highly open economies such as Slovakia, Hungary, and the Czech Republic exhibit strong export performance alongside greater exposure to external shocks, while larger domestic markets such as Poland and Romania display lower export intensity and greater stabilization. Granger causality tests indicate that FDI contributes to export growth in several economies, often with multi-year lags, highlighting the importance of absorptive capacity and institutional quality in translating investment inflows into export competitiveness. ARIMA-based forecasts for 2025–2027 suggest continued export expansion and relative stabilization despite recent global disruptions. This study’s primary contribution lies in integrating comparative export analysis, causality testing, and short-term forecasting within a unified econometric framework, offering policy-relevant insights into export-led growth and economic convergence in post-transition European economies. Full article
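As a rough illustration of the country-specific forecasting step, the sketch below fits a bare-bones AR(1) by OLS and iterates it forward three periods. The series is synthetic and the model is far simpler than the ARIMA specifications the paper actually estimates.

```python
import numpy as np

def ar1_fit_forecast(x, steps):
    """OLS fit of x_t = c + phi * x_{t-1} + e_t, then iterate forward.
    A minimal stand-in for a country-specific ARIMA model."""
    X = np.column_stack([np.ones(len(x) - 1), x[:-1]])
    (c, phi), *_ = np.linalg.lstsq(X, x[1:], rcond=None)
    out, last = [], x[-1]
    for _ in range(steps):
        last = c + phi * last
        out.append(last)
    return np.array(out)

# Hypothetical exports-to-GDP shares (%) with an upward trend, 30 years
rng = np.random.default_rng(4)
x = 40 + np.cumsum(0.8 + rng.normal(scale=1.5, size=30))

forecast = ar1_fit_forecast(x, steps=3)   # three periods ahead
```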

15 pages, 990 KB  
Article
Posterior Probabilities of Dominance for Wealth Distributions
by William Griffiths and Duangkamon Chotikapanich
Econometrics 2026, 14(1), 8; https://doi.org/10.3390/econometrics14010008 - 12 Feb 2026
Viewed by 549
Abstract
Probability distributions, which are typically used to describe income distributions, are not suitable to describe a population’s distribution of wealth because of the existence of negative observations and a large concentration of values close to zero. To overcome these problems, we describe how the asymmetric Laplace distribution can be used for modelling wealth distributions and illustrate how it can be used to compute the posterior probabilities of first- and second-order stochastic dominance. Stochastic dominance concepts are useful for comparing wealth distributions and assessing whether changes in wealth have increased or decreased welfare in society. We use three distributions to make two such comparisons. In one comparison, one distribution clearly dominates the other. There is more uncertainty about dominance in the other comparison, with no dominance being the most likely outcome. Full article
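A minimal version of the dominance check described here: an asymmetric Laplace CDF with separate left and right tail scales (so negative wealth receives positive probability mass), and a grid test for first-order stochastic dominance. The parameterization and both parameter sets are hypothetical, and the check is a deterministic sketch rather than the paper's posterior-probability computation.

```python
import numpy as np

def ald_cdf(x, theta, s_left, s_right):
    """CDF of an asymmetric Laplace distribution with mode theta and
    separate left/right exponential tail scales."""
    x = np.asarray(x, dtype=float)
    w = s_left / (s_left + s_right)        # mass below the mode
    left = w * np.exp((x - theta) / s_left)
    right = 1.0 - (1.0 - w) * np.exp(-(x - theta) / s_right)
    return np.where(x < theta, left, right)

def first_order_dominates(params_a, params_b, grid):
    """A first-order dominates B iff F_A(x) <= F_B(x) everywhere."""
    return bool(np.all(ald_cdf(grid, *params_a)
                       <= ald_cdf(grid, *params_b) + 1e-12))

grid = np.linspace(-50, 200, 2001)
year1 = (5.0, 4.0, 30.0)    # hypothetical wealth distribution, period 1
year2 = (10.0, 4.0, 35.0)   # shifted right, fatter upper tail

dominates = first_order_dominates(year2, year1, grid)
```

In the Bayesian setting of the paper, this check would be repeated over posterior draws of the parameters, and the fraction of draws where dominance holds gives its posterior probability.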

25 pages, 336 KB  
Article
Social Security Transfers and Fiscal Sustainability in Turkey: Evidence from 1984–2024
by Huriye Gonca Diler, Nurgül E. Barın, Ercan Özen and Simon Grima
Econometrics 2026, 14(1), 7; https://doi.org/10.3390/econometrics14010007 - 31 Jan 2026
Viewed by 1011
Abstract
Social security systems constitute a structurally significant component of public finance in developing economies and often generate persistent fiscal pressures through budgetary transfers. Demographic transformation, widespread informality in labor markets, and weaknesses in contribution-based financing increase the dependence of social security systems on public resources. The objective of this study is to examine whether budget transfers to the social security system affect fiscal sustainability in Turkey by analyzing their relationship with the budget deficit and the public sector borrowing requirement. The analysis employs annual data for Turkey covering the period of 1984–2024. A comprehensive time-series econometric framework is adopted, incorporating conventional and structural-break unit root tests, the ARDL bounds testing approach with error correction modeling, and the Toda–Yamamoto causality method. The empirical findings provide evidence of a stable long-run relationship among the variables. The results indicate that social security budget transfers exert a statistically significant and persistent effect on the public sector borrowing requirement, while no direct long-run effect on the headline budget deficit is detected. Causality results further confirm that fiscal pressures associated with social security financing materialize primarily through borrowing dynamics rather than short-term budgetary imbalances. By explicitly modelling social security budget transfers as an independent fiscal channel over a long historical horizon, this study contributes to the literature by offering new empirical insights into the fiscal sustainability implications of social security financing in Turkey. The findings also provide policy-relevant evidence for developing economies facing similar institutional, demographic, and fiscal challenges. Full article
41 pages, 2234 KB  
Article
Binance USD Delisting and Stablecoins Repercussions: A Local Projections Approach
by Papa Ousseynou Diop and Julien Chevallier
Econometrics 2026, 14(1), 6; https://doi.org/10.3390/econometrics14010006 - 16 Jan 2026
Viewed by 1745
Abstract
The delisting of Binance USD (BUSD) constitutes a major regulatory intervention in the stablecoin market and provides a unique opportunity to examine how targeted regulation affects liquidity allocation, market concentration, and short-run systemic risk in crypto-asset markets. Using daily data for 2023 and a linear and nonlinear Local Projections event-study framework, this paper analyzes the dynamic market responses to the BUSD delisting across major stablecoins and cryptocurrencies. The results show that liquidity displaced from BUSD is reallocated primarily toward USDT and USDC, leading to a measurable increase in stablecoin market concentration, while decentralized and algorithmic stablecoins absorb only a limited share of the shock. At the same time, Bitcoin and Ethereum experience temporary liquidity contractions followed by a relatively rapid recovery, suggesting conditional resilience of core crypto-assets. Overall, the findings document how a regulatory-induced exit of a major stablecoin reshapes short-run market dynamics and concentration patterns, highlighting potential trade-offs between regulatory enforcement and market structure. The paper contributes to the literature by providing the first empirical analysis of the BUSD delisting and by illustrating the usefulness of Local Projections for studying regulatory shocks in cryptocurrency markets. Full article
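The Local Projections idea, one regression of the outcome at t+h on the shock at t for each horizon h, can be sketched as follows. The AR(1) data-generating process, the sample size, and the absence of controls are illustrative assumptions, not the paper's specification.

```python
import numpy as np

def local_projection_irf(y, shock, horizons):
    """Jorda-style local projections: for each horizon h, OLS of y_{t+h}
    on the shock at t (plus a constant). Returns the impulse response."""
    irf = []
    for h in horizons:
        yy = y[h:]
        X = np.column_stack([np.ones(len(yy)), shock[:len(shock) - h]])
        beta, *_ = np.linalg.lstsq(X, yy, rcond=None)
        irf.append(beta[1])
    return np.array(irf)

rng = np.random.default_rng(3)
T = 2_000
shock = rng.normal(size=T)
# Simulated liquidity series: the shock decays geometrically (true IRF 0.5**h)
y = np.zeros(T)
for t in range(1, T):
    y[t] = 0.5 * y[t - 1] + shock[t] + 0.1 * rng.normal()

irf = local_projection_irf(y, shock, horizons=range(5))
```

Because each horizon gets its own regression, the response shape is not constrained by a VAR's recursive structure, which is the flexibility the cover story emphasizes.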

25 pages, 1443 KB  
Article
Shock Next Door: Geographic Spillovers in FinTech Lending After Natural Disasters
by David Kuo Chuen Lee, Weibiao Xu, Jianzheng Shi, Yue Wang and Ding Ding
Econometrics 2026, 14(1), 5; https://doi.org/10.3390/econometrics14010005 - 15 Jan 2026
Viewed by 1099
Abstract
We examine geographic spillovers in digital credit markets by studying how natural disasters affect borrowing behavior in adjacent, physically undamaged regions. Using granular loan-level data from Indonesia’s largest FinTech lender (2021–2023) and leveraging quasi-random variation in disaster timing and location, we estimate fixed-effects specifications that incorporate spatially lagged disaster exposure (an SLX-type spatial approach) to quantify spillovers. Disasters generate economically significant spillovers in neighboring provinces: a 1% increase in disaster frequency raises local borrowing by 0.036%, approximately 20% of the direct effect. Spillovers vary sharply with geographic connectivity—land-connected provinces experience effects about 6.6 times larger than sea-connected provinces. These results highlight that digital lending platforms can transmit geographically proximate risks beyond directly affected areas through channels that differ from traditional banking networks. The systematic nature of these spillovers suggests that disaster-response strategies may be more effective when they consider adjacent regions, and that platform risk management can be strengthened by integrating spatial disaster exposure and connectivity into credit monitoring and decision rules. Full article

12 pages, 290 KB  
Article
A Theory-Based Formal-Econometric Interpretation of an Econometric Model
by Bernt Petter Stigum
Econometrics 2026, 14(1), 4; https://doi.org/10.3390/econometrics14010004 - 6 Jan 2026
Viewed by 511
Abstract
The references of most of the observations that econometricians have are ill defined. To use such data in an empirical analysis, the econometrician in charge must find a way to give them economic meaning. In this paper, I have data and an econometric model, and I set out to show how economic theory can be used to interpret the variables and parameters of my econometric model. According to Ragnar Frisch, that is a difficult task. Economic theories reside in a Model World and the econometrician’s data reside in the Real World; the rational laws in the model world are fundamentally different from the empirical laws in the real world; and between the two worlds there is a gap that can never be bridged. To accomplish my task, I build a bridge between Frisch’s two worlds with applied formal-econometric arguments, invent a pertinent model-world economic theory, walk the bridge with the invented theory, and use it to give economic meaning to the variables and parameters of my econometric model. At the end I demonstrate that the invented theory and the bridge I use in my analysis are empirically relevant in the empirical context of my econometric model. Full article
17 pages, 1374 KB  
Article
Bayesian Panel Variable Selection Under Model Uncertainty for High-Dimensional Data
by Pathairat Pastpipatkul and Htwe Ko
Econometrics 2026, 14(1), 3; https://doi.org/10.3390/econometrics14010003 - 4 Jan 2026
Viewed by 971
Abstract
Selecting the relevant covariates in high-dimensional panel data remains a central challenge in applied econometrics. Conventional fixed effects and random effects models are not designed for systematic variable selection under model uncertainty. In addition, many existing models such as LASSO in machine learning or Bayesian approaches like model averaging, Bayesian Additive Regression Trees, and Bayesian Variable Selection with Shrinking and Diffusing Priors have been primarily developed for time series analysis. This paper develops and applies Bayesian Panel Variable Selection (BPVS) models to simulation and empirical applications. These models are designed to assist researchers in identifying which input covariates matter most, while also determining whether their effects should be treated as fixed or random through Bayesian hierarchical modeling and posterior inference, which jointly accounts for variable importance ranking. Both the simulation studies and the empirical application to socioeconomic determinants of subjective well-being show that Bayesian panel models outperform classical models, especially in terms of convergence stability, predictive accuracy, and reliable variable selection. Classical panel models, in contrast, remain attractive for their computational efficiency and simplicity. The Hausman test is used as a robustness check. The study adds an econometric approach for dealing with model uncertainty in high-dimensional panel analysis and offers open-source R 4.5.1 code to support future applications. Full article

15 pages, 615 KB  
Article
I(2) Cointegration in Macroeconometric Modelling: Tourism Price and Inflation Dynamics
by Sergej Gričar, Štefan Bojnec and Bjørnar Karlsen Kivedal
Econometrics 2026, 14(1), 2; https://doi.org/10.3390/econometrics14010002 - 4 Jan 2026
Cited by 1 | Viewed by 874
Abstract
This study enhances macroeconometric modelling by utilising an I(2) cointegration framework to analyse the dynamic link between tourism prices and inflation in Slovenia and the Eurozone. Using monthly data from 2000 to 2017, we estimate cointegrated VAR models that capture long-run equilibria, short-run adjustments, and persistent deviations inherent in I(2) processes. The results reveal strong spillover effects from Slovenian tourism and input prices to Eurozone inflation and hospitality prices in the short run, while Eurozone-wide shocks dominate the long-run dynamics. By explicitly accounting for nonstationarity, structural breaks, and seasonal patterns, the I(2) model provides a more reliable framework than traditional I(1)-based approaches, which are often prone to misspecification when higher-order integration and persistent deviations are ignored. The findings contribute to macroeconometric theory by demonstrating the value of I(2) cointegration in modelling complex price systems and offer policy insights into inflation management and competitiveness in tourism-dependent economies. Full article
(This article belongs to the Special Issue Advancements in Macroeconometric Modeling and Time Series Analysis)

28 pages, 2308 KB  
Article
Complexity-Aware Vector-Valued Machine Learning of State-Level Bond Returns: Evidence on South African Trade Spillovers Under SALT and OBBBA
by Gordon Dash, Nina Kajiji, Domenic Vonella and Helper Zhou
Econometrics 2026, 14(1), 1; https://doi.org/10.3390/econometrics14010001 - 23 Dec 2025
Viewed by 1196
Abstract
This study examines the impact of international trade shocks from South Africa and recent U.S. federal tax reforms on state-level municipal bond returns within the United States. Employing a unique transaction-level dataset comprising more than 50 million municipal bond trades from 2020 to 2024, the empirical approach integrates machine learning estimators with econometric volatility models to examine daily nonlinear spillovers and structural complexity across twenty U.S. states. The study introduces and extends the application of a vector radial basis function neural network framework, leveraging its universal approximation capacity to jointly model multiple state-level outcomes and uncover complex response patterns. The empirical results reveal substantial cross-state heterogeneity in bond-return resilience, influenced by variation in state tax regimes, economic complexity, and differential exposure to external financial forces. States exhibiting higher economic adaptability demonstrate faster recovery and weaker shock amplification, whereas structurally rigid states experience persistent volatility and slower mean reversion. These findings demonstrate that complexity-aware predictive modeling, when combined with granular fiscal and trade-linkage data, provides valuable insight into the pathways through which global and domestic shocks propagate into U.S. municipal bond markets and shape subnational financial stability. Full article
