Search Results (145)

Search Parameters:
Journal = Forecasting
Section = General

48 pages, 5217 KB  
Article
AutoML-Based Prediction of Unconfined Compressive Strength of Stabilized Soils: A Multi-Dataset Evaluation on Worldwide Experimental Data
by Romulo Murucci Oliveira, Deivid Campos, Katia Vanessa Bicalho, Bruno da S. Macêdo, Matteo Bodini, Camila Martins Saporetti and Leonardo Goliatt
Forecasting 2025, 7(4), 80; https://doi.org/10.3390/forecast7040080 - 18 Dec 2025
Abstract
Unconfined Compressive Strength (UCS) of stabilized soils is commonly used for evaluating the effectiveness of soil improvement techniques. Achieving target UCS values through conventional trial-and-error approaches requires extensive laboratory experiments, which are time-consuming and resource-intensive. Automated Machine Learning (AutoML) frameworks offer a promising alternative by enabling automated, reproducible, and accessible predictive modeling of UCS values from more readily obtainable index and physical soil and stabilizer properties, reducing the reliance on experimental testing and empirical relationships, and allowing systematic exploration of multiple models and configurations. This study evaluates the predictive performance of five state-of-the-art AutoML frameworks (AutoGluon, AutoKeras, FLAML, H2O, and TPOT) on 10 experimental datasets comprising 2083 laboratory samples spanning diverse soil types, stabilizers, and experimental conditions worldwide. Comparative analyses revealed that TPOT achieved the highest overall performance (R2 values up to 0.982, an RMSE of 12.87 kPa, and a MAPE of 3.813%), whereas AutoKeras exhibited lower accuracy on complex datasets; FLAML and H2O also demonstrated strong predictive capabilities, with performance varying with dataset characteristics. Despite the promising potential of AutoML, prior research has shown that fully automated frameworks have seen limited application to UCS prediction, highlighting a gap in end-to-end pipeline automation. The findings provide practical guidance for selecting AutoML tools based on dataset characteristics and research objectives, and suggest avenues for future studies, including expanding the range of AutoML frameworks and integrating interpretability techniques, such as feature importance analysis, to deepen understanding of soil–stabilizer interactions. Overall, the results indicate that AutoML frameworks can effectively accelerate UCS prediction, reduce laboratory workload, and support data-driven decision-making in geotechnical engineering. Full article
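
As a hedged illustration of the evaluation protocol described above, the sketch below fits one of the named frameworks (FLAML) to a UCS regression problem and reports R2, RMSE, and MAPE on a hold-out split. The file name, column names, and time budget are assumptions, not details from the study.

```python
import numpy as np
import pandas as pd
from flaml import AutoML
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score, mean_squared_error, mean_absolute_percentage_error

# Hypothetical dataset: index/physical soil and stabilizer properties plus measured UCS.
df = pd.read_csv("ucs_stabilized_soils.csv")          # placeholder file name
X, y = df.drop(columns=["ucs_kpa"]), df["ucs_kpa"]    # hypothetical target column
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

automl = AutoML()
automl.fit(X_train, y_train, task="regression", time_budget=120)  # seconds; illustrative budget

pred = automl.predict(X_test)
print("R2  :", r2_score(y_test, pred))
print("RMSE:", np.sqrt(mean_squared_error(y_test, pred)))
print("MAPE:", mean_absolute_percentage_error(y_test, pred))
```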

33 pages, 4760 KB  
Article
A Bayesian Markov Switching Autoregressive Model with Time-Varying Parameters for Dynamic Economic Forecasting
by Syarifah Inayati, Nur Iriawan, Irhamah and Uha Isnaini
Forecasting 2025, 7(4), 79; https://doi.org/10.3390/forecast7040079 - 17 Dec 2025
Viewed by 61
Abstract
This research tackles the challenge of forecasting nonlinear time series data with stochastic structural variations by proposing the Markov switching autoregressive model with time-varying parameters (MSAR-TVP). Although effective in modeling dynamic regime transitions, the Classical MSAR-TVP faces challenges with complex datasets. To address these issues, a Bayesian MSAR-TVP framework was developed, incorporating flexible parameters that adapt dynamically across regimes. The model was tested on two periods of U.S. real GNP data: a historically stable segment (1952–1986) and a more complex, modern segment that includes more economic volatility (1947–2024). The Bayesian MSAR-TVP demonstrated superior performance in handling complex datasets, particularly in out-of-sample forecasting, outperforming the Bayesian AR-TVP, Classical MSAR-TVP, and Classical MSAR models, as evaluated by mean absolute percentage error (MAPE) and mean absolute error (MAE). For in-sample data, the Classical MSAR-TVP retained its stability advantage. These findings highlight the Bayesian MSAR-TVP’s ability to address parameter uncertainty and adapt to data fluctuations, making it highly effective for forecasting dynamic economic cycles. Additionally, the two-year forecast underscores its practical utility in predicting economic cycles, suggesting continued expansion. This reinforces the model’s significance for economic forecasting and strategic policy formulation. Full article
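
For readers who want a concrete starting point, the sketch below fits a classical two-regime Markov switching autoregression to GNP growth with statsmodels. It is a fixed-parameter frequentist baseline, not the authors' Bayesian MSAR-TVP; the data source and lag order are assumptions.

```python
import pandas as pd
import statsmodels.api as sm

# Hypothetical input: a quarterly real GNP series indexed by date.
gnp = pd.read_csv("us_real_gnp.csv", index_col="date", parse_dates=True)["gnp"]
growth = 100 * gnp.pct_change().dropna()

# Two-regime Markov switching AR(4), the classic Hamilton-style specification.
mod = sm.tsa.MarkovAutoregression(growth, k_regimes=2, order=4, switching_ar=False)
res = mod.fit()

print(res.summary())
# Smoothed probability of being in regime 0 (e.g., contraction) at each date.
print(res.smoothed_marginal_probabilities[0].tail())
```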

21 pages, 7395 KB  
Article
A New Loss Function for Enhancing Peak Prediction in Time Series Data with High Variability
by Mahan Hajiabbasi Somehsaraie, Soheyla Tofighi, Zhaoan Wang, Jun Wang and Shaoping Xiao
Forecasting 2025, 7(4), 75; https://doi.org/10.3390/forecast7040075 - 3 Dec 2025
Viewed by 645
Abstract
Time series models are considered among the most intricate models in machine learning. Due to sharp temporal variations, time series models often fail to predict peaks or local minima accurately. To overcome this challenge, we propose a novel custom loss function, Enhanced Peak (EP) loss, specifically designed to emphasize peaks and troughs and thereby address underestimation and overestimation in the forecasting process. EP loss applies an adaptive penalty when prediction errors exceed a specified threshold, encouraging the model to focus more effectively on these regions. To evaluate the effectiveness and versatility of EP loss, the loss function was tested on three highly variable datasets: NOx emissions, streamflow measurements, and gold prices, using Gated Recurrent Unit and Transformer-based models. The results consistently demonstrated that EP loss significantly mitigates peak prediction errors compared with conventional loss functions, highlighting its potential for highly variable time series applications. Full article
(This article belongs to the Special Issue Feature Papers of Forecasting 2025)
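
The abstract does not give the exact functional form of EP loss, so the sketch below is only a minimal illustration of the stated idea: a mean-squared error whose terms receive an extra multiplicative penalty when the error exceeds a threshold. The threshold and penalty factor are assumed names and values.

```python
import torch

def enhanced_peak_style_loss(y_pred: torch.Tensor,
                             y_true: torch.Tensor,
                             threshold: float = 0.1,
                             penalty: float = 3.0) -> torch.Tensor:
    """Squared error with an extra penalty wherever |error| > threshold,
    pushing the model to fit peaks and troughs more closely."""
    err = y_true - y_pred
    sq = err ** 2
    weighted = torch.where(err.abs() > threshold, penalty * sq, sq)
    return weighted.mean()

# Usage inside a training step, e.g. with a GRU forecaster:
# loss = enhanced_peak_style_loss(model(x_batch), y_batch)
# loss.backward()
```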

38 pages, 1359 KB  
Article
A System Dynamics Framework for Market Share Forecasting in the Telecommunications Market
by Nikolaos Kanellos, Dimitrios Katsianis and Dimitris Varoutas
Forecasting 2025, 7(4), 74; https://doi.org/10.3390/forecast7040074 - 30 Nov 2025
Viewed by 400
Abstract
This paper presents a novel system dynamics-based framework for forecasting market share evolution in the telecommunications sector. The framework conceptualizes market share as flows of subscribers—driven by churn, attraction, and market growth—between interconnected compartments representing providers. It is designed to operate with limited available market data and incorporates stochastic processes to capture market uncertainty, enabling risk-informed forecasts. The framework is applied to the Greek mobile telecommunications market using historical data (2006–2022), with a 5-year hold-back period for validation. Results highlight the dominant role of churn management in market share variability, particularly for the incumbent provider Cosmote, while subscriber attraction parameters show moderate influence for alternative providers Vodafone and Wind Hellas. Sensitivity analysis confirms the model’s robustness and identifies key drivers of forecast variability. The proposed framework provides actionable insights for strategic decision-making, making it a valuable tool for providers and policymakers to address churn, optimize attraction strategies, and ensure long-term competitiveness in dynamic markets. Full article
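
To make the compartment idea tangible, here is a deliberately simplified, deterministic sketch of subscriber flows between providers driven by churn, attraction, and market growth. It omits the stochastic processes and calibration of the paper; all parameter values are hypothetical.

```python
import numpy as np

def simulate_shares(shares0, churn, attraction, growth, years, total0=1_000_000):
    """Each year: a churn fraction leaves every provider; churned plus new
    subscribers are reallocated in proportion to each provider's attraction
    weight; market shares are recomputed from the updated stocks."""
    shares, total = np.asarray(shares0, float), float(total0)
    attraction = np.asarray(attraction, float)
    history = [shares.copy()]
    for _ in range(years):
        subs = shares * total
        leaving = subs * np.asarray(churn, float)
        pool = leaving.sum() + total * growth
        subs = subs - leaving + pool * attraction / attraction.sum()
        total = subs.sum()
        shares = subs / total
        history.append(shares.copy())
    return np.array(history)

# Three providers (e.g. incumbent plus two alternatives), illustrative parameters.
print(simulate_shares(shares0=[0.45, 0.30, 0.25],
                      churn=[0.10, 0.15, 0.18],
                      attraction=[0.40, 0.35, 0.25],
                      growth=0.02, years=5))
```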

35 pages, 1417 KB  
Systematic Review
Demand Forecasting in the Automotive Industry: A Systematic Literature Review
by Nehalben Ranabhatt, Sérgio Barreto, Marco Pimpão and Pedro Prates
Forecasting 2025, 7(4), 73; https://doi.org/10.3390/forecast7040073 - 28 Nov 2025
Viewed by 964
Abstract
The automobile industry is one of the world’s largest manufacturing sectors and a key contributor to economic growth. Demand forecasting plays a critical role in supply chain management within the automotive sector. Reliable forecasts are essential for production planning, inventory control, and meeting market demands efficiently. However, accurately predicting demand remains a challenge due to the influence of external factors such as socioeconomic trends and weather conditions. This study presents a systematic literature review of the forecasting methods employed within the automotive industry, encompassing both vehicle and spare parts demand. Following PRISMA guidelines, 63 publications were identified and analyzed, covering traditional statistical models such as ARIMA and SARIMA, as well as state-of-the-art artificial intelligence approaches, including artificial neural networks. The review finds that classical statistical models remain prevalent for vehicle demand forecasting, Croston’s method dominates spare parts forecasting, and AI-based techniques increasingly outperform conventional models in recent studies. Furthermore, the review compiles a broad set of external variables influencing demand and highlights the common challenges associated with demand forecasting. It concludes by outlining potential directions for future research. Full article
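
Since the review singles out Croston's method as the dominant approach for spare parts demand, a minimal implementation may help orient readers; this is the textbook formulation with a single smoothing constant, not any specific variant discussed in the reviewed papers.

```python
import numpy as np

def croston_forecast(demand, alpha=0.1):
    """Croston's method for intermittent demand: smooth non-zero demand
    sizes (z) and inter-demand intervals (p) separately; the flat
    per-period forecast is z / p."""
    z = p = None
    q = 1  # periods since the last non-zero demand
    for d in demand:
        if d > 0:
            if z is None:                # initialise on the first demand
                z, p = float(d), float(q)
            else:
                z += alpha * (d - z)
                p += alpha * (q - p)
            q = 1
        else:
            q += 1
    return np.nan if z is None else z / p

print(croston_forecast([0, 0, 3, 0, 0, 0, 2, 0, 4, 0, 0, 1]))
```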

36 pages, 1860 KB  
Article
Carbon Trading Price Forecasting Based on Multidimensional News Text and Decomposition–Ensemble Model: The Case Study of China’s Pilot Regions
by Xu Wang, Yingjie Liu, Zhenao Guo, Tengfei Yang, Xu Gong and Zhichong Lyu
Forecasting 2025, 7(4), 72; https://doi.org/10.3390/forecast7040072 - 28 Nov 2025
Viewed by 376
Abstract
Accurately predicting carbon trading prices is challenging due to pronounced nonlinearity, non-stationarity, and sensitivity to diverse factors, including macroeconomic conditions, market sentiment, and climate policy. This study proposes a novel hybrid forecasting framework that integrates multidimensional news text analysis, ICEEMDAN (Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise) decomposition, and machine learning to predict carbon prices in China's pilot trading regions. We first extract a market sentiment index from news texts in the WiseSearch News Database using a customized Chinese carbon-market dictionary. In addition, a price trend index and topic intensity index are derived using Latent Dirichlet Allocation (LDA) and Convolutional Neural Networks (CNN), respectively. All feature sequences are subsequently decomposed and reconstructed using a sample-entropy-based ICEEMDAN approach. The resulting multi-frequency components are then used as inputs for a range of machine-learning models to evaluate predictive performance. The empirical results demonstrate that the incorporation of multidimensional text information on China's carbon market, combined with financial features, yields a substantial gain in prediction accuracy. Our integrated decomposition–ensemble framework achieves optimal performance by employing dedicated models: BiGRU, XGBoost, and BiLSTM for the high-frequency, low-frequency, and trend components, respectively. This approach provides policymakers, regulators, and investors with a more reliable tool for forecasting carbon prices and supports more informed decision-making, offering a promising pathway for effective carbon-price prediction. Full article
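
As a hedged pointer for the decomposition step, the sketch below computes a plain sample entropy score, the kind of complexity measure that can be used to group ICEEMDAN components into high-frequency, low-frequency, and trend parts; the parameters are illustrative, and ICEEMDAN itself is not reimplemented here.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Simplified sample entropy: -ln(A/B), where B counts template matches
    of length m and A counts matches of length m + 1, with tolerance
    r = r_factor * std(x). Higher values indicate more irregular components."""
    x = np.asarray(x, float)
    r = r_factor * x.std()
    n = len(x)

    def matches(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        total = 0
        for i in range(len(templates)):
            dist = np.max(np.abs(templates - templates[i]), axis=1)
            total += np.sum(dist <= r) - 1   # exclude the self-match
        return total

    a, b = matches(m + 1), matches(m)
    return np.inf if a == 0 or b == 0 else -np.log(a / b)

# Components with similar sample entropy can be summed before fitting the
# dedicated BiGRU / XGBoost / BiLSTM models mentioned in the abstract.
```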

27 pages, 585 KB  
Article
Bayesian LASSO with Categorical Predictors: Coding Strategies, Uncertainty Quantification, and Healthcare Applications
by Xi Lu, Jieni Li, Rajender R. Aparasu, Nebil Yusuf and Cen Wu
Forecasting 2025, 7(4), 69; https://doi.org/10.3390/forecast7040069 - 21 Nov 2025
Viewed by 397
Abstract
There is growing interest in applying statistical machine learning methods, such as LASSO regression and its extensions, to analyze healthcare datasets. Existing studies have examined LASSO and group LASSO regression with categorical predictors, which are widely used in healthcare studies to represent variables with nominal or ordinal categories. Despite the success of these studies, statistical inference procedures and uncertainty quantification for regression with categorical predictors have largely been overlooked, partly due to the theoretical challenges practitioners face when applying these methods in behavioral research. In this article, we aim to fill this gap by investigating these questions from a Bayesian perspective. Specifically, we conduct Bayesian LASSO analysis with categorical predictors and thoroughly investigate the impact of four representative coding strategies on variable selection and prediction. In particular, we perform uncertainty quantification in terms of marginal Bayesian credible intervals, leveraging the fact that fully Bayesian analysis enables exact statistical inference even on finite samples. Using real-world Medical Expenditure Panel Survey (MEPS) data, we demonstrate that the variable selection, estimation, and prediction of Bayesian LASSO are influenced by the coding strategy. The performance of Bayesian LASSO is also compared with LASSO and linear regression. Full article
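
To show what "coding strategy" means concretely, the sketch below builds two common encodings for a nominal predictor (treatment/dummy coding and sum-to-zero deviation coding) and fits a frequentist LASSO to each. It uses synthetic data and scikit-learn's LassoCV as a stand-in; the paper's analysis is Bayesian and uses the MEPS data.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "insurance": rng.choice(["private", "public", "none"], size=200),  # hypothetical factor
    "age": rng.integers(20, 80, size=200),
})
y = rng.normal(size=len(df))  # synthetic response for illustration only

# Treatment (dummy) coding: drop one reference level per factor.
X_dummy = pd.get_dummies(df, columns=["insurance"], drop_first=True).astype(float)

# Sum-to-zero (deviation) coding: the dropped level is coded -1 on all contrasts.
levels = sorted(df["insurance"].unique())
dev = pd.get_dummies(df["insurance"]).astype(float)[levels[:-1]]
dev.loc[df["insurance"] == levels[-1], :] = -1.0
X_dev = pd.concat([dev.add_prefix("dev_"), df[["age"]].astype(float)], axis=1)

for name, X in [("dummy", X_dummy), ("deviation", X_dev)]:
    coefs = LassoCV(cv=5, random_state=0).fit(X, y).coef_
    print(name, dict(zip(X.columns, coefs.round(3))))
```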

47 pages, 4175 KB  
Article
Detecting Stablecoin Failure with Simple Thresholds and Panel Binary Models: The Pivotal Role of Lagged Market Capitalization and Volatility
by Dean Fantazzini
Forecasting 2025, 7(4), 68; https://doi.org/10.3390/forecast7040068 - 19 Nov 2025
Viewed by 2890
Abstract
In this study, we extend research on stablecoin credit risk by introducing a novel rule-of-thumb approach to determine whether a stablecoin is “dead” or “alive” based on a simple price threshold. Using a comprehensive dataset of 98 stablecoins, we classify a coin as failed if its price falls below a predefined threshold (e.g., $0.80), validated through sensitivity analysis against established benchmarks such as CoinMarketCap delistings and Feder et al. (2018) methodology. We employ a wide range of panel binary models to forecast stablecoins’ probabilities of default (PDs), incorporating stablecoin-specific regressors. Our findings indicate that panel Cauchit models with fixed effects outperform other models across different definitions of stablecoin failure, while lagged average monthly market capitalization and lagged stablecoin volatility emerge as the most significant predictors—outweighing macroeconomic and policy-related variables. Random forest models complement our analysis, confirming the robustness of these key drivers. This approach not only enhances the predictive accuracy of stablecoin PDs but also provides a practical, interpretable framework for regulators and investors to assess stablecoin stability based on credit risk dynamics. Full article
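
A minimal version of the price-threshold rule-of-thumb might look like the following pandas sketch, which flags a stablecoin as failed once its price has ever dropped below a chosen cutoff such as $0.80. The study also validates the threshold via sensitivity analysis and external benchmarks, which this sketch does not attempt.

```python
import pandas as pd

def flag_dead_coins(prices: pd.DataFrame, threshold: float = 0.80) -> pd.Series:
    """`prices` is a date-indexed frame with one column per stablecoin;
    returns True for coins whose price ever fell below `threshold`."""
    return (prices < threshold).any(axis=0)

# Illustrative usage with made-up data:
prices = pd.DataFrame({"coin_a": [1.00, 0.99, 1.01],
                       "coin_b": [1.00, 0.72, 0.40]})
print(flag_dead_coins(prices))   # coin_a False, coin_b True
```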

23 pages, 507 KB  
Article
Rice Yield Forecasting in Northeast China with a Dual-Factor ARIMA Model Incorporating SPEI1-Sep. and Sown Area
by Song Nie and Zhi-Qiang Jiang
Forecasting 2025, 7(4), 67; https://doi.org/10.3390/forecast7040067 - 16 Nov 2025
Viewed by 471
Abstract
Amid escalating global climate change and geopolitical tensions threatening food supply chains, the three provinces of Northeast China, which serve as a major grain production base, play a crucial role in ensuring national food security. However, the region is experiencing more frequent extreme climatic events and increasing limitations on arable land. This necessitates an evaluation of the combined effects of climate conditions and sown area on rice (Oryza sativa L.) yields. Utilizing provincial panel data from 1990 to 2022, this study conducts baseline panel regression analyses at both the national and Northeast China levels. The results consistently identify the value of the standardized precipitation evapotranspiration index (SPEI) in September as a key climatic factor exerting a significant negative effect on total rice yield, whereas the rice sown area is a robust positive determinant. Based on these findings, we develop a dual-factor analytical framework that incorporates both climatic conditions and rice sown area, utilizing SPEI1-Sep. to identify critical growth stages of rice, with the aim of providing a more comprehensive understanding of their combined effects on yield. To assess predictive accuracy, comparative performance evaluations of the Extreme Gradient Boosting (XGBoost), random forest (RF), and Autoregressive Integrated Moving Average (ARIMA) models are conducted. The results show that the ARIMA model outperforms the others in forecasting. Forecasts for 2023–2027 indicate slow yield growth in Jilin Province, with a 1.5% annual increase. Heilongjiang shows minor fluctuations, stabilizing between 24.97 and 25.56 million tons. Liaoning's yield remains stable, projected between 5.13 and 5.20 million tons. These trends suggest limited overall yield expansion, highlighting the need for region-specific policies and resource management to ensure China's grain security. This study clarifies the interplay between climate and sown area, demonstrates the relative forecasting advantage of ARIMA in this setting, and provides evidence to support managing yield variability and optimizing agricultural policy in Northeast China, with implications for long-term national food security. Full article
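
One way to read the "dual-factor ARIMA" is as an ARIMA model with September SPEI and sown area as exogenous regressors. The sketch below fits such an ARIMAX with statsmodels; the file name, column names, ARIMA order, and the future regressor values are all assumptions rather than the paper's specification.

```python
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

# Hypothetical yearly data: total rice yield plus the two exogenous factors.
df = pd.read_csv("rice_northeast.csv", index_col="year")   # placeholder file
exog = df[["spei1_sep", "sown_area"]]

model = SARIMAX(df["yield"], exog=exog, order=(1, 1, 1))
res = model.fit(disp=False)

# Forecasting requires assumed future regressor values; here the last
# observed values are simply repeated for a 5-year horizon.
future_exog = pd.concat([exog.iloc[[-1]]] * 5, ignore_index=True)
print(res.get_forecast(steps=5, exog=future_exog).predicted_mean)
```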

29 pages, 835 KB  
Article
Non-Negative Forecast Reconciliation: Optimal Methods and Operational Solutions
by Daniele Girolimetto
Forecasting 2025, 7(4), 64; https://doi.org/10.3390/forecast7040064 - 26 Oct 2025
Cited by 1 | Viewed by 825
Abstract
In many applications, such as retail, energy, and tourism, forecasts for a set of related time series must satisfy both linear and non-negativity constraints, as negative values are meaningless in practice. Standard regression-based reconciliation approaches achieve coherence with linear constraints, but may generate negative forecasts, reducing interpretability and usability. This paper develops and evaluates three alternative strategies for non-negative forecast reconciliation. First, reconciliation is formulated as a non-negative least squares problem and solved with the operator splitting quadratic program, allowing flexible inclusion of additional constraints. Second, we propose an iterative non-negative reconciliation procedure with immutable forecasts, offering a practical optimization-based alternative. Third, we investigate a family of set-negative-to-zero heuristics that achieve efficiency and interpretability at minimal computational cost. Using the Australian Tourism Demand dataset, we compare these approaches in terms of forecast accuracy and computation time. The results show that non-negativity constraints consistently improve accuracy compared to base forecasts. Overall, the set-negative-to-zero heuristics achieve near-optimal performance with negligible computation time, the block principal pivoting algorithm provides a good accuracy–efficiency compromise, and the operator splitting quadratic program offers flexibility for incorporating additional constraints in large-scale applications. Full article
(This article belongs to the Special Issue Feature Papers of Forecasting 2025)
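
To make the two ends of this spectrum concrete, here is a hedged, toy-sized comparison of the set-negative-to-zero heuristic against a non-negative least squares reconciliation on a two-level hierarchy, using identity weights and scipy's nnls as a stand-in for the solvers discussed in the paper.

```python
import numpy as np
from scipy.optimize import nnls

# Two-level hierarchy: total = A + B. S maps bottom-level series to all series.
S = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
base = np.array([100.0, 104.0, -3.0])       # incoherent base forecasts, one negative

# Unconstrained least-squares (OLS) reconciliation, then set negatives to zero.
b_ols, *_ = np.linalg.lstsq(S, base, rcond=None)
b_sntz = np.maximum(b_ols, 0.0)

# Non-negative least squares yields the optimal non-negative bottom-level values.
b_nnls, _ = nnls(S, base)

print("SNTZ reconciled:", S @ b_sntz)
print("NNLS reconciled:", S @ b_nnls)
```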

32 pages, 3406 KB  
Article
Enhancing Policy Insights: Machine Learning-Based Forecasting of Euro Area Inflation HICP and Subcomponents
by László Vancsura, Tibor Tatay and Tibor Bareith
Forecasting 2025, 7(4), 63; https://doi.org/10.3390/forecast7040063 - 26 Oct 2025
Viewed by 1171
Abstract
Accurate inflation forecasting is of central importance for monetary authorities, governments, and businesses, as it shapes economic decisions and policy responses. While most studies focus on headline inflation, this paper analyses the Harmonised Index of Consumer Prices (HICP) and its 12 subcomponents in the euro area over the period 2000–2023, covering episodes of financial crisis, economic stability, and recent inflationary shocks. We apply a broad set of machine learning and deep learning models, systematically optimized through grid search, and evaluate their performance using the Normalized Mean Absolute Error (NMAE). To complement traditional accuracy measures, we introduce the Forecastability Index (FI) and the Interquartile Range (IQR), which jointly capture both the difficulty and robustness of forecasts. Our results show that RNN and LSTM architectures consistently outperform traditional approaches such as SVR and RFR, particularly in volatile environments. Subcomponents such as Health and Education proved easier to forecast, while Recreation and culture and Restaurants and hotels were among the most challenging. The findings demonstrate that macroeconomic stability enhances forecasting accuracy, whereas crises amplify errors and inter-model dispersion. By highlighting the heterogeneous predictability of inflation subcomponents, this study provides novel insights with strong policy relevance, showing which categories can be forecast with greater confidence and where uncertainty requires more cautious intervention. Full article
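
For orientation, one plausible form of the Normalized Mean Absolute Error used to compare subcomponents is sketched below; the exact normalization (range, mean, or standard deviation of the actuals) is not spelled out in the abstract, so this is only one reasonable choice.

```python
import numpy as np

def nmae(y_true, y_pred):
    """MAE scaled by the range of the actuals so that errors on HICP
    subcomponents with different volatility are comparable (one plausible
    normalization; the paper's definition may differ)."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return np.abs(y_true - y_pred).mean() / (y_true.max() - y_true.min())

print(nmae([2.1, 2.4, 3.0, 2.8], [2.0, 2.5, 2.7, 2.9]))
```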

19 pages, 769 KB  
Article
Can Simple Balancing Algorithms Improve School Dropout Forecasting? The Case of the State Education Network of Espírito Santo, Brazil
by Guilherme Armando de Almeida Pereira and Kiara de Deus Demura
Forecasting 2025, 7(4), 59; https://doi.org/10.3390/forecast7040059 - 18 Oct 2025
Viewed by 603
Abstract
This study evaluates the effect of simple data-level balancing techniques on predicting school dropout across all state public high schools in Espírito Santo, Brazil. We trained Logistic Regression with LASSO (LR), Random Forest (RF), and Naive Bayes (NB) models on first-quarter data from 2018–2019 and forecasted dropouts for 2020, with additional validation in 2022. Facing strong class imbalance, we compared three balancing methods—RUS, SMOTE, and ROSE—against models trained on the original data. Performance was assessed using accuracy, sensitivity, specificity, precision, F1, AUC, and G-mean. Results show that the imbalance severely harmed RF and NB trained without balancing, while Logistic Regression remained more stable. Overall, balancing techniques improved most metrics: RUS and ROSE were often superior, while SMOTE produced mixed results. Optimal configurations varied by year and metric, and RUS and ROSE made up most of the best combinations. Although most configurations benefited from balancing, some decreased performance; therefore, we recommend systematic testing of multiple balancing strategies and further research into SMOTE variants and algorithm-level approaches. Full article
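
For readers wanting to reproduce the comparison pattern, the sketch below wraps one data-level balancing step around a logistic regression and scores it on the hold-out year. It uses plain LogisticRegression and imbalanced-learn's RUS/SMOTE; the paper's LASSO-penalized logistic regression, the ROSE method (an R implementation), and the exact feature set are not reproduced here.

```python
from imblearn.under_sampling import RandomUnderSampler
from imblearn.over_sampling import SMOTE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import balanced_accuracy_score

def dropout_score(X_train, y_train, X_test, y_test, sampler=None):
    """Optionally rebalance the training data, fit a logistic regression,
    and score it on the later cohort (e.g. train on 2018-2019, test on 2020)."""
    if sampler is not None:
        X_train, y_train = sampler.fit_resample(X_train, y_train)
    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    return balanced_accuracy_score(y_test, clf.predict(X_test))

# Hypothetical usage with pre-built feature matrices:
# print(dropout_score(X_1819, y_1819, X_2020, y_2020))                                       # no balancing
# print(dropout_score(X_1819, y_1819, X_2020, y_2020, RandomUnderSampler(random_state=0)))   # RUS
# print(dropout_score(X_1819, y_1819, X_2020, y_2020, SMOTE(random_state=0)))                # SMOTE
```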

25 pages, 6191 KB  
Article
Machine Learning Forecasting of Direct Solar Radiation: A Multi-Model Evaluation with Trigonometric Cyclical Encoding
by Latif Bukari Rashid, Shahzada Zaman Shuja and Shafiqur Rehman
Forecasting 2025, 7(4), 58; https://doi.org/10.3390/forecast7040058 - 17 Oct 2025
Viewed by 1232
Abstract
As the world shifts toward cleaner energy sources, accurate forecasting of solar radiation is critical for optimizing the performance and integration of solar energy systems. In this study, we explore eight machine learning models (Random Forest Regressor, Linear Regression, Artificial Neural Network, k-Nearest Neighbors, Support Vector Regression, Gradient Boosting Regressor, Gaussian Process Regression, and Deep Learning) for forecasting direct solar radiation across six climatically diverse regions in the Kingdom of Saudi Arabia. The models were evaluated using eight statistical metrics along with time-series and absolute error analyses. A key contribution of this work is the introduction of Trigonometric Cyclical Encoding, which significantly improved temporal representation learning. Comparative SHAP-based feature-importance analysis revealed that Trigonometric Cyclical Encoding enhanced the explanatory power of temporal features by 49.26% for monthly cycles and 53.30% for daily cycles. The findings show that Deep Learning achieved the lowest root mean square error and the highest coefficient of determination, while Artificial Neural Network demonstrated consistently high accuracy across the sites. Support Vector Regression performed competitively but was less reliable in some regions. Error and time-series analyses reveal that Artificial Neural Network and Deep Learning maintained stable prediction accuracy throughout high solar radiation seasons, whereas Linear Regression, Random Forest Regressor, and k-Nearest Neighbors showed greater fluctuations. The proposed Trigonometric Cyclical Encoding technique further enhanced model performance by maintaining the overall fitness of the models, which ranged between 81.79% and 94.36% in all scenarios. This paper supports effective solar energy planning and integration in challenging climatic conditions. Full article
(This article belongs to the Topic Solar and Wind Power and Energy Forecasting, 2nd Edition)
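
A common way to implement trigonometric cyclical encoding is to project each cyclic time feature onto the unit circle, as sketched below; the specific features (month and hour) and periods are assumptions, since the abstract does not list the exact temporal inputs.

```python
import numpy as np
import pandas as pd

def add_cyclical_encoding(df: pd.DataFrame) -> pd.DataFrame:
    """Add sine/cosine pairs for cyclic time features so that, e.g.,
    December sits next to January and hour 23 next to hour 0."""
    out = df.copy()
    for col, period in [("month", 12), ("hour", 24)]:
        angle = 2 * np.pi * out[col] / period
        out[f"{col}_sin"] = np.sin(angle)
        out[f"{col}_cos"] = np.cos(angle)
    return out

idx = pd.date_range("2024-01-01", periods=48, freq="h")
features = pd.DataFrame({"month": idx.month, "hour": idx.hour}, index=idx)
print(add_cyclical_encoding(features).head())
```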

17 pages, 887 KB  
Article
Comparison of Linear and Beta Autoregressive Models in Forecasting Nonstationary Percentage Time Series
by Carlo Grillenzoni
Forecasting 2025, 7(4), 57; https://doi.org/10.3390/forecast7040057 - 13 Oct 2025
Viewed by 559
Abstract
Positive percentage time series are present in many empirical applications; they take values in the continuous interval (0,1) and are often modeled with linear dynamic models. Risks of biased predictions (outside the admissible range) and problems of heteroskedasticity in the presence of asymmetric distributions are often ignored by practitioners. Alternative models are proposed in the statistical literature; the most suitable is the dynamic beta regression, which belongs to generalized linear models (GLM) and uses the logit transformation as a link function. However, owing to Jensen's inequality, this approach may also not be optimal in prediction; thus, the aim of the present paper is the in-depth forecasting comparison of linear and beta autoregressions. Simulation experiments and applications to nonstationary time series (the US unemployment rate and BR hydroelectric energy) are carried out. Rolling regression for time-varying parameters is applied to both linear and beta models, and a prediction criterion for the joint selection of model order and sample size is defined. Full article
(This article belongs to the Special Issue Feature Papers of Forecasting 2025)
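
To illustrate the underlying tension, the sketch below contrasts a linear AR fitted to a synthetic rate in (0,1) with an AR fitted on the logit scale and back-transformed, which keeps forecasts inside the admissible range but introduces the Jensen-inequality bias the paper discusses. This is a simple transformation-based stand-in, not the beta autoregression itself.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg

rng = np.random.default_rng(0)
# Synthetic persistent rate in (0, 1), loosely unemployment-like.
y = np.clip(0.06 + 0.02 * np.cumsum(rng.normal(size=300)) / 30, 0.01, 0.99)

# Linear AR on the raw rate: nothing prevents forecasts from leaving (0, 1).
f_linear = AutoReg(y, lags=2).fit().forecast(12)

# AR on the logit scale, back-transformed: forecasts stay in (0, 1),
# but the naive inverse transform is biased for the conditional mean.
z = np.log(y / (1 - y))
f_logit = 1 / (1 + np.exp(-AutoReg(z, lags=2).fit().forecast(12)))

print(np.round(f_linear, 4))
print(np.round(f_logit, 4))
```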

39 pages, 5604 KB  
Article
Prediction of 3D Airspace Occupancy Using Machine Learning
by Cristian Lozano Tafur, Jaime Orduy Rodríguez, Pedro Melo Daza, Iván Rodríguez Barón, Danny Stevens Traslaviña and Juan Andrés Bermúdez
Forecasting 2025, 7(4), 56; https://doi.org/10.3390/forecast7040056 - 8 Oct 2025
Viewed by 1120
Abstract
This research introduces a system designed to predict three-dimensional airspace occupancy over Colombia using historical Automatic Dependent Surveillance-Broadcast (ADS-B) data and machine learning techniques. The goal is to support proactive air traffic management by estimating future aircraft positions, specifically their latitude, longitude, and flight level. To achieve this, four predictive models were developed and tested: K-Nearest Neighbors (KNN), Random Forest, Extreme Gradient Boosting (XGBoost), and Long Short-Term Memory (LSTM). Among them, the LSTM model delivered the most accurate results, with a Mean Absolute Error (MAE) of 312.59, a Root Mean Squared Error (RMSE) of 1187.43, and a coefficient of determination (R2) of 0.7523. Compared to the baseline models (KNN, Random Forest, XGBoost), these values represent an improvement of approximately 91% in MAE, 83% in RMSE, and an eighteen-fold increase in R2, demonstrating the substantial advantage of the LSTM approach. These gains are most evident in capturing temporal patterns and adjusting to evolving traffic conditions. The strength of the LSTM approach lies in its ability to model sequential data and adapt to dynamic environments, making it especially suitable for supporting future Trajectory-Based Operations (TBO). The results confirm that predicting airspace occupancy in three dimensions using historical data is not only possible but can yield reliable and actionable insights. Looking ahead, the integration of hybrid neural network architectures and their deployment in real-time systems offer promising directions to enhance both accuracy and operational value. Full article
(This article belongs to the Topic Short-Term Load Forecasting—2nd Edition)
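
As a hedged orientation only, a minimal Keras LSTM with the input/output structure implied by the abstract (windows of past ADS-B points in, a next latitude/longitude/flight-level triple out) could look like the sketch below; the window length, layer sizes, loss, and training settings are all assumptions, not the authors' configuration.

```python
import numpy as np
import tensorflow as tf

# Toy data: 1000 windows of the last 20 ADS-B points, each point being
# (latitude, longitude, flight level); the target is the next position.
X = np.random.rand(1000, 20, 3).astype("float32")
y = np.random.rand(1000, 3).astype("float32")

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(20, 3)),
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(3),        # predicted (lat, lon, flight level)
])
model.compile(optimizer="adam", loss="mae")
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print(model.predict(X[:1], verbose=0))
```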
