Search Results (16)

Search Parameters:
Keywords = forecast reconciliation

38 pages, 5287 KiB  
Article
Comparative Analysis of Throughput Prediction Models in SAG Mill Circuits: A Geometallurgical Approach
by Madeleine Guillen, Guillermo Iriarte, Hector Montes, Gerardo San Martín and Nicole Fantini
Mining 2025, 5(3), 37; https://doi.org/10.3390/mining5030037 - 20 Jun 2025
Viewed by 464
Abstract
This study was conducted on a copper porphyry deposit located in Espinar, Cusco (Peru), with the objective of developing and comparing predictive models for processing capacity in SAG grinding circuits. A total of 174 samples were used for the JK Drop Weight Test (JKDWT) and 1172 for the Bond Work Index (BWi), along with 36 months of operational plant data. Three modeling methodologies were evaluated: DWi-BWi, SGI-BWi, and SMC-BWi (Mia, Mib), all integrated into a geometallurgical block model. Validation was performed through reconciliation with actual plant data, considering operational constraints such as transfer size (T80) and maximum throughput (TPH). The model based on SMC parameters and BWi showed the best predictive performance, with a root mean square error (RMSE) of 143 t/h and a mean relative deviation of 1.5%. This approach enables more accurate throughput forecasting, improving mine planning and operational efficiency. The results highlight the importance of integrating geometallurgical and operational data to build robust models that are adaptable to ore variability and applicable to both short- and long-term planning scenarios.

30 pages, 2741 KiB  
Article
Long-Term Multi-Resolution Probabilistic Load Forecasting Using Temporal Hierarchies
by Shafie Bahman and Hamidreza Zareipour
Energies 2025, 18(11), 2908; https://doi.org/10.3390/en18112908 - 1 Jun 2025
Viewed by 563
Abstract
Accurate long-term electricity load forecasting is critical for energy planning, infrastructure development, and risk management, especially under increasing uncertainty from climate and economic shifts. This study proposes a multi-resolution probabilistic load forecasting framework that leverages temporal hierarchies to generate coherent forecasts at hourly, daily, monthly, and yearly levels. The model integrates climate and economic indicators and employs tailored forecasting techniques at each resolution, including XGBoost and ARIMAX. Initially incoherent forecasts across time scales are reconciled using advanced methods such as Ordinary Least Squares (OLS), Weighted Least Squares with Series Variance Scaling (WLS_V), and Structural Scaling (WLS_S) to ensure consistency. Using historical data from Alberta, Canada, the proposed approach improves the accuracy of deterministic forecasts and enhances the reliability of probabilistic forecasts, particularly when using the OLS reconciliation method. These results highlight the value of temporal hierarchy structures in producing high-resolution long-horizon load forecasts, providing actionable insights for utilities and policymakers involved in long-term energy planning and system optimization.
(This article belongs to the Special Issue Forecasting and Risk Management Techniques for Electricity Markets II)
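The reconciliation step described in the abstract above can be illustrated with a toy temporal hierarchy. The sketch below is a minimal, hypothetical example of OLS reconciliation (one of the methods the paper compares), assuming a yearly series that aggregates four quarterly series; the numbers and hierarchy are invented for illustration and are not from the paper:

```python
import numpy as np

# Summing matrix S for a two-level temporal hierarchy:
# one yearly total stacked on top of four quarterly series.
S = np.vstack([np.ones((1, 4)), np.eye(4)])

# Incoherent base forecasts: [year, q1, q2, q3, q4].
# The yearly forecast (410) does not equal the quarterly sum (400).
y_hat = np.array([410.0, 90.0, 100.0, 105.0, 105.0])

# OLS reconciliation: project the base forecasts onto the coherent
# subspace, y_tilde = S (S'S)^{-1} S' y_hat.
G = np.linalg.solve(S.T @ S, S.T)
y_tilde = S @ (G @ y_hat)

# After reconciliation the top level equals the sum of the bottom level.
print(np.isclose(y_tilde[0], y_tilde[1:].sum()))  # True
```

The WLS variants mentioned in the abstract follow the same projection with a non-identity weight matrix in place of `S.T @ S`.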

48 pages, 1127 KiB  
Review
Artificial Intelligence vs. Efficient Markets: A Critical Reassessment of Predictive Models in the Big Data Era
by Antonio Pagliaro
Electronics 2025, 14(9), 1721; https://doi.org/10.3390/electronics14091721 - 23 Apr 2025
Cited by 2 | Viewed by 3741
Abstract
This paper critically examines artificial intelligence applications in stock market forecasting, addressing significant gaps in the existing literature that often overlook the tension between theoretical market efficiency and empirical predictability. While numerous reviews catalog methodologies, they frequently fail to rigorously evaluate model performance across different market regimes or reconcile statistical significance with economic relevance. We analyze techniques ranging from traditional statistical models to advanced deep learning architectures, finding that ensemble methods like Extra Trees, Random Forest, and XGBoost consistently outperform single classifiers, achieving directional accuracy of up to 86% in specific market conditions. Our analysis reveals that hybrid approaches integrating multiple data sources demonstrate superior performance by capturing complementary market signals, yet many models showing statistical significance fail to generate economic value after accounting for transaction costs and market impact. By addressing methodological challenges including backtest overfitting, regime changes, and implementation constraints, we provide a novel comprehensive framework for rigorous model assessment that bridges the divide between academic research and practical implementation. This review makes three key contributions: (1) a reconciliation of the Efficient Market Hypothesis with AI-driven predictability through an adaptive market framework, (2) a multi-dimensional evaluation methodology that extends beyond classification accuracy to financial performance, and (3) an identification of promising research directions in explainable AI, transfer learning, causal modeling, and privacy-preserving techniques that address current limitations.
(This article belongs to the Special Issue Artificial Intelligence-Driven Emerging Applications)

29 pages, 2409 KiB  
Article
Enhancing Hierarchical Sales Forecasting with Promotional Data: A Comparative Study Using ARIMA and Deep Neural Networks
by Mariana Teixeira, José Manuel Oliveira and Patrícia Ramos
Mach. Learn. Knowl. Extr. 2024, 6(4), 2659-2687; https://doi.org/10.3390/make6040128 - 19 Nov 2024
Cited by 2 | Viewed by 3477
Abstract
Retailers depend on accurate sales forecasts to effectively plan operations and manage supply chains. These forecasts are needed across various levels of aggregation, making hierarchical forecasting methods essential for the retail industry. As competition intensifies, the use of promotions has become a widespread strategy, significantly impacting consumer purchasing behavior. This study seeks to improve forecast accuracy by incorporating promotional data into hierarchical forecasting models. Using a sales dataset from a major Portuguese retailer, base forecasts are generated for different hierarchical levels using ARIMA models and Multi-Layer Perceptron (MLP) neural networks. Reconciliation methods, including bottom-up, top-down, and optimal reconciliation with OLS and WLS (struct) estimators, are employed. The results show that MLPs outperform ARIMA models for forecast horizons longer than one day. While the addition of regressors enhances ARIMA’s accuracy, it does not yield similar improvements for MLP. MLPs present a compelling balance of simplicity and efficiency, outperforming ARIMA in flexibility while offering faster training times and lower computational demands compared to more complex deep learning models, making them highly suitable for practical retail forecasting applications.
(This article belongs to the Section Data)
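The bottom-up and top-down reconciliation strategies named in the abstract above are the two simplest ways to make a hierarchy coherent. A minimal sketch with hypothetical store-level numbers (the data, stores, and proportions are invented for illustration):

```python
import numpy as np

# Toy two-level sales hierarchy: total = store A + store B.
# Base forecasts at each level (hypothetical numbers).
total_hat = 1000.0
bottom_hat = np.array([580.0, 390.0])  # stores A and B

# Bottom-up: discard the total forecast and sum the bottom forecasts.
bu_total = bottom_hat.sum()            # coherent by construction

# Top-down: keep the total forecast and split it using historical
# proportions (assumed here to be 60% / 40%).
proportions = np.array([0.6, 0.4])
td_bottom = total_hat * proportions

print(bu_total, td_bottom)
```

Optimal reconciliation (the OLS/WLS approaches also tested in the paper) instead adjusts forecasts at every level simultaneously rather than trusting one level entirely.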

17 pages, 765 KiB  
Article
Data Reconciliation-Based Hierarchical Fusion of Machine Learning Models
by Pál Péter Hanzelik, Alex Kummer and János Abonyi
Mach. Learn. Knowl. Extr. 2024, 6(4), 2601-2617; https://doi.org/10.3390/make6040125 - 11 Nov 2024
Viewed by 1596
Abstract
In hierarchical system modeling, it is essential that constraints between the different hierarchy levels are met, for instance that aggregation constraints are satisfied. However, modelling and forecasting each element of the hierarchy independently introduces errors. To mitigate this balance error, it is recommended to employ an optimal data reconciliation technique with an emphasis on measurement and modeling errors. In this study, three different approaches to developing machine learning models were investigated. The first involves no data reconciliation, relying solely on machine learning models built independently at each hierarchical level. The second incorporates measurement errors by adjusting the measured data to satisfy each constraint, and the machine learning model is developed on this reconciled dataset. The third directly fine-tunes the machine learning predictions based on the prediction errors of each model. The three methods were compared using three case studies of different complexities, namely mineral composition estimation with 9 elements, retail sales forecasting with 14 elements, and waste deposition forecasting with more than 3000 elements. The results show that the third method performs best and that reliable machine learning models can be developed.
(This article belongs to the Section Data)
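The second method described in the abstract above, adjusting measurements so that they satisfy the aggregation constraints, is classical data reconciliation. A minimal sketch under an assumed linear constraint with invented values, not the paper's implementation:

```python
import numpy as np

# Linear aggregation constraint A x = b: three component
# measurements must sum to the measured total (hypothetical values).
A = np.array([[1.0, 1.0, 1.0]])
b = np.array([100.0])

# Raw measurements violating the constraint (they sum to 103).
x = np.array([40.0, 33.0, 30.0])

# Least-squares adjustment: project onto the constraint set,
# x_adj = x - A' (A A')^{-1} (A x - b).
residual = A @ x - b
x_adj = x - A.T @ np.linalg.solve(A @ A.T, residual)

# The adjusted measurements satisfy the constraint exactly;
# here each measurement is shifted by -1.0.
print(x_adj, x_adj.sum())
```

With unequal measurement variances, the projection would be weighted so that noisier measurements absorb more of the adjustment.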

28 pages, 3882 KiB  
Article
Short-Term Wind Speed Prediction via Sample Entropy: A Hybridisation Approach against Gradient Disappearance and Explosion
by Khathutshelo Steven Sivhugwana and Edmore Ranganai
Computation 2024, 12(8), 163; https://doi.org/10.3390/computation12080163 - 12 Aug 2024
Cited by 2 | Viewed by 1476
Abstract
High-variant wind speeds cause aberrations in wind power systems and compromise the effective operation of wind farms. A single model cannot capture the inherent wind speed randomness and complexity. In the proposed hybrid strategy, wavelet transform (WT) is used for data decomposition, sample entropy (SampEn) for subseries complexity evaluation, neural network autoregression (NNAR) for deterministic subseries prediction, long short-term memory network (LSTM) for complex subseries prediction, and gradient boosting machine (GBM) for prediction reconciliation. The proposed WT-NNAR-LSTM-GBM approach predicts minutely averaged wind speed data collected at Southern African Universities Radiometric Network (SAURAN) stations: Council for Scientific and Industrial Research (CSIR), Richtersveld (RVD), Venda, and the Namibian University of Science and Technology (NUST). For comparison purposes, in WT-NNAR-LSTM-GBM, LSTM and NNAR are respectively replaced with a k-nearest neighbour (KNN) to form the corresponding hybrids: WT-NNAR-KNN-GBM and WT-KNN-LSTM-GBM. We assessed WT-NNAR-LSTM-GBM’s efficacy against NNAR, LSTM, WT-NNAR-KNN-GBM, and WT-KNN-LSTM-GBM as well as the naïve model. The comparative study found that the WT-NNAR-LSTM-GBM model was the most accurate, sharpest, and robust based on mean absolute error, median absolute deviation, and residual analysis. The study results suggest using short-term forecasts to optimise wind power production, enhance grid operations in real-time, and open the door to further algorithmic enhancements.
(This article belongs to the Special Issue Signal Processing and Machine Learning in Data Science)

21 pages, 27474 KiB  
Article
Hybrid Twins Modeling of a High-Level Radioactive Waste Cell Demonstrator for Long-Term Temperature Monitoring and Forecasting
by David Muñoz, Anoop Ebey Thomas, Julien Cotton, Johan Bertrand and Francisco Chinesta
Sensors 2024, 24(15), 4931; https://doi.org/10.3390/s24154931 - 30 Jul 2024
Viewed by 1223
Abstract
Monitoring a deep geological repository for radioactive waste during the operational phases relies on a combination of fit-for-purpose numerical simulations and online sensor measurements, both producing complementary massive data. These data streams can then be compared to produce reliable, integrated information (e.g., in a digital twin) reflecting the actual physical evolution of the installation over the long term (i.e., a century), the ultimate objective being to assess that the repository components and processes are effectively following the expected trajectory towards the closure phase. Data prediction involves using historical data and statistical methods to forecast future outcomes, but it faces challenges such as data quality issues, the complexity of real-world data, and the difficulty of balancing model complexity. Feature selection, overfitting, and the interpretability of complex models add further complexity. Data reconciliation involves aligning the model with in situ data, but a major challenge is to create models that capture all the complexity of the real world, encompassing dynamic variables as well as residual and complex near-field effects on measurements (e.g., sensor coupling). This difficulty can result in residual discrepancies between simulated and real data, highlighting the challenge of accurately estimating real-world intricacies within predictive models during the reconciliation process. The paper delves into these challenges for complex and instrumented systems (multi-scale, multi-physics, and multi-media), discussing practical applications of machine and deep learning methods in a case study of thermal loading monitoring of a high-level waste (HLW) cell demonstrator (called ALC1605) implemented at Andra’s underground research laboratory.
(This article belongs to the Section Electronic Sensors)

24 pages, 713 KiB  
Article
Hierarchical Time Series Forecasting of Fire Spots in Brazil: A Comprehensive Approach
by Ana Caroline Pinheiro and Paulo Canas Rodrigues
Stats 2024, 7(3), 647-670; https://doi.org/10.3390/stats7030039 - 27 Jun 2024
Cited by 2 | Viewed by 1421
Abstract
This study compares reconciliation techniques and base forecast methods to forecast a hierarchical time series of the number of fire spots in Brazil between 2011 and 2022. A three-level hierarchical time series was considered, comprising fire spots in Brazil, disaggregated by biome, and further disaggregated by municipality. The autoregressive integrated moving average (ARIMA), exponential smoothing (ETS), and Prophet models were tested for the base forecasts, and nine reconciliation approaches, including top-down, bottom-up, middle-out, and optimal combination methods, were considered to ensure coherence in the forecasts. Because a transformation was needed to ensure positive forecasts, two data transformations were considered: the logarithm of the number of fire spots plus one and the square root of the number of fire spots plus 0.5. To assess forecast accuracy, the data were split into a training set for estimating model parameters and a test set for evaluating forecast accuracy. The results show that the ARIMA model with the logarithmic transformation provides overall better forecast accuracy. Among the reconciliation techniques, BU, MinT(s), and WLS(v) yielded the best results.
(This article belongs to the Special Issue Modern Time Series Analysis II)

18 pages, 6385 KiB  
Article
Cross-Temporal Hierarchical Forecast Reconciliation of Natural Gas Demand
by Colin O. Quinn, George F. Corliss and Richard J. Povinelli
Energies 2024, 17(13), 3077; https://doi.org/10.3390/en17133077 - 21 Jun 2024
Cited by 1 | Viewed by 1285
Abstract
Local natural gas distribution companies (LDCs) require accurate demand forecasts across various time periods, geographic regions, and customer class hierarchies. Achieving coherent forecasts across these hierarchies is challenging but crucial for optimal decision making, resource allocation, and operational efficiency. This work introduces a method that structures the gas distribution system into cross-temporal hierarchies to produce accurate and coherent forecasts. We apply our method to a case study involving three operational regions, forecasting at different geographical levels and analyzing both hourly and daily frequencies. Trained on five years of data and tested on one year, our model achieves a 10% reduction in hourly mean absolute scaled error and a 3% reduction in daily mean absolute scaled error.
(This article belongs to the Section C: Energy Economics and Policy)
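The error metric reported in the abstract above, mean absolute scaled error (MASE), scales the forecast MAE by the in-sample MAE of the (seasonal) naïve forecast. A minimal sketch with made-up numbers, not data from the paper:

```python
import numpy as np

def mase(y_true, y_pred, y_train, m=1):
    """Mean absolute scaled error: forecast MAE divided by the
    in-sample MAE of the seasonal naive forecast at lag m."""
    naive_mae = np.mean(np.abs(y_train[m:] - y_train[:-m]))
    return np.mean(np.abs(y_true - y_pred)) / naive_mae

# Hypothetical daily demand history and a two-step-ahead forecast.
y_train = np.array([10.0, 12.0, 11.0, 13.0, 12.0])
y_true = np.array([14.0, 13.0])
y_pred = np.array([13.0, 13.5])

print(mase(y_true, y_pred, y_train))  # 0.5: half the naive error
```

Values below 1 mean the forecast beats the in-sample naïve benchmark, which is what makes MASE convenient for comparing hourly and daily hierarchies on a common scale.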

6 pages, 405 KiB  
Article
An Alternative Proof of Minimum Trace Reconciliation
by Sakai Ando and Futoshi Narita
Forecasting 2024, 6(2), 456-461; https://doi.org/10.3390/forecast6020025 - 18 Jun 2024
Viewed by 1510
Abstract
Minimum trace reconciliation, developed by Wickramasuriya et al. (2019), is an innovation in the literature on forecast reconciliation. The proof, however, has a gap, and the idea is not easy to extend to more general situations. This paper fills the gap by providing an alternative proof based on the first-order condition in the space of non-square matrices, and argues that it is not only simpler but can also be extended to incorporate the more general results on minimum weighted trace reconciliation of Panagiotelis et al. (2021). Thus, our alternative proof not only has pedagogical value but also connects the results in the literature from a unified perspective.
(This article belongs to the Special Issue Feature Papers of Forecasting 2024)
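The minimum trace reconciliation discussed above maps a vector of base forecasts onto the coherent subspace spanned by the summing matrix. The display below restates the standard result of Wickramasuriya et al. (2019), which the paper re-proves; it is included for reference and is not drawn from the paper's text:

```latex
\tilde{y} = S\,(S^{\top} W^{-1} S)^{-1} S^{\top} W^{-1}\,\hat{y}
```

Here $S$ is the summing matrix, $\hat{y}$ the base forecasts, and $W$ the covariance matrix of the base forecast errors; this choice of projection minimizes the trace of the reconciled forecast error covariance among all unbiased linear reconciliations, and setting $W = I$ recovers OLS reconciliation.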
10 pages, 366 KiB  
Proceeding Paper
Automatic Hierarchical Time-Series Forecasting Using Gaussian Processes
by Luis Roque, Luis Torgo and Carlos Soares
Eng. Proc. 2021, 5(1), 49; https://doi.org/10.3390/engproc2021005049 - 9 Jul 2021
Cited by 1 | Viewed by 3561
Abstract
Forecasting often involves multiple time series that are hierarchically organized (e.g., sales by geography). In that case, there is a constraint that the bottom-level forecasts add up to the aggregated ones. Common approaches use traditional forecasting methods to predict all levels in the hierarchy and then reconcile the forecasts to satisfy that constraint. We propose a new algorithm that automatically forecasts multiple hierarchically organized time series. We introduce a combination of additive Gaussian processes (GPs) with a hierarchical piece-wise linear function to estimate, respectively, the stationary and non-stationary components of the time series. We define a flexible structure of additive GPs generated by each aggregated group in the hierarchy of the data. This formulation aims to capture the nested information in the hierarchy while avoiding overfitting. We extend the piece-wise linear function to be hierarchical by defining hyperparameters shared across related time series. Our experiments show that the algorithm can estimate hundreds of time series at once; to work at this scale, the posterior distributions of the parameters are estimated using mean-field approximation. We validate the proposed method on two different real-world datasets, showing its competitiveness when compared to state-of-the-art approaches. In summary, our method simplifies the process of hierarchical forecasting, as no reconciliation is required. It is easily adapted to non-Gaussian likelihoods and multiple or non-integer seasonalities, and the fact that it is a Bayesian approach makes modeling the uncertainty of the forecasts trivial.
(This article belongs to the Proceedings of The 7th International Conference on Time Series and Forecasting)

13 pages, 1010 KiB  
Review
Frameworks on Patterns of Grasslands’ Sensitivity to Forecast Extreme Drought
by Taofeek O. Muraina
Sustainability 2020, 12(19), 7837; https://doi.org/10.3390/su12197837 - 23 Sep 2020
Cited by 5 | Viewed by 3104
Abstract
Climate models have predicted the future occurrence of extreme drought (ED). The management, conservation, or restoration of grasslands following ED requires robust prior knowledge of the patterns and mechanisms of sensitivity, that is, the rate at which ecosystem functions decline due to ED. Yet, the global-scale pattern of grasslands’ sensitivity to any ED event remains unresolved. Here, frameworks were built to predict the sensitivity patterns of above-ground net primary productivity (ANPP) spanning the global precipitation gradient under ED. The frameworks present three sensitivity patterns that could manipulate (weaken, strengthen, or erode) the orthodox positive precipitation–productivity relationship that exists under non-drought (ambient) conditions. First, the slope of the relationship could become steeper via higher sensitivity at xeric sites than at mesic and hydric ones. Second, if sensitivity emerges highest in hydric, followed by mesic, then xeric sites, a weakened slope, flat line, or negative slope would result. Lastly, if sensitivity emerges unexpectedly similar across the precipitation gradient, the slope of the relationship would remain similar to that of the ambient condition. Overall, the frameworks provide background knowledge on possible differences or similarities in the responses of grasslands to forecast ED and could stimulate further experiments to unravel the impacts of ED on grasslands. More importantly, the frameworks indicate the need to reconcile conflicting hypotheses of grasslands’ sensitivity to ED through global-scale experiments.
(This article belongs to the Section Air, Climate Change and Sustainability)

17 pages, 1960 KiB  
Article
Forecasting Hierarchical Time Series in Power Generation
by Tiago Silveira Gontijo and Marcelo Azevedo Costa
Energies 2020, 13(14), 3722; https://doi.org/10.3390/en13143722 - 20 Jul 2020
Cited by 12 | Viewed by 7677
Abstract
Academic attention is increasingly being paid to the study of hierarchical time series. In the electrical sector especially, there are several applications in which information can be organized into a hierarchical structure. The present study analyzed hourly power generation in Brazil (2018–2020), grouped according to each of the electrical subsystems and their respective sources of generating energy. The objective was to assess the accuracy of the main approaches to aggregating and disaggregating the forecasts of the Autoregressive Integrated Moving Average (ARIMA) and Error, Trend, Seasonal (ETS) models. Specifically, the following hierarchical approaches were analyzed: (i) bottom-up (BU), (ii) top-down (TD), and (iii) optimal reconciliation. The optimal reconciliation models showed the best mean performance, considering the primary predictive windows. It was also found that energy forecasts for the South subsystem were less accurate than those for the other subsystems, which signals the need for individualized models for this subsystem.
(This article belongs to the Collection Energy Economics and Policy in Developed Countries)

16 pages, 388 KiB  
Article
Impact of Information Sharing and Forecast Combination on Fast-Moving-Consumer-Goods Demand Forecast Accuracy
by Dazhi Yang and Allan N. Zhang
Information 2019, 10(8), 260; https://doi.org/10.3390/info10080260 - 16 Aug 2019
Cited by 12 | Viewed by 6085
Abstract
This article empirically demonstrates the impacts of truthfully sharing forecast information and using forecast combinations in a fast-moving-consumer-goods (FMCG) supply chain. Although it is known a priori that sharing information improves the overall efficiency of a supply chain, information such as pricing or promotional strategy is often kept proprietary for competitive reasons. In this regard, it is herein shown that simply sharing the retail-level forecasts—this does not reveal the exact business strategy, due to the effect of omni-channel sales—yields nearly all the benefits of sharing all pertinent information that influences FMCG demand. In addition, various forecast combination methods are used to further stabilize the forecasts, in situations where multiple forecasting models are used during operation. In other words, it is shown that combining forecasts is less risky than “betting” on any component model.
(This article belongs to the Special Issue Big Data Research, Development, and Applications––Big Data 2018)

22 pages, 982 KiB  
Article
Assessing the Performance of Hierarchical Forecasting Methods on the Retail Sector
by José Manuel Oliveira and Patrícia Ramos
Entropy 2019, 21(4), 436; https://doi.org/10.3390/e21040436 - 24 Apr 2019
Cited by 21 | Viewed by 6107
Abstract
Retailers need demand forecasts at different levels of aggregation in order to support a variety of decisions along the supply chain. To ensure aligned decision-making across the hierarchy, it is essential that forecasts at the most disaggregated level add up to forecasts at the aggregate levels above. It is not clear whether these aggregate forecasts should be generated independently or by using a hierarchical forecasting method that ensures coherent decision-making at the different levels but does not guarantee the same accuracy. To give guidelines on this issue, our empirical study investigates the relative performance of independent and reconciled forecasting approaches, using real data from a Portuguese retailer. We consider two alternative forecasting model families for generating the base forecasts, namely state space models and ARIMA. Appropriate models from both families are chosen for each time series by minimising the bias-corrected Akaike information criterion. The results show significant improvements in forecast accuracy, providing valuable information to support management decisions. It is clear that reconciled forecasts using the Minimum Trace Shrinkage estimator (MinT-Shrink) generally improve on the accuracy of the ARIMA base forecasts for all levels and for the complete hierarchy, across all forecast horizons. The accuracy gains generally increase with the horizon, varying between 1.7% and 3.7% for the complete hierarchy. It is also evident that the gains in forecast accuracy are more substantial at the higher levels of aggregation, which means that the information about the individual dynamics of the series, lost through aggregation, is brought back from the lower levels to the higher levels by the reconciliation process, substantially improving forecast accuracy over the base forecasts.
(This article belongs to the Special Issue Entropy Application for Forecasting)
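The MinT-Shrink estimator named in the abstract above shrinks the sample covariance of the base forecast errors toward its diagonal before the minimum trace projection is applied. A minimal sketch of the shrinkage step with invented error data and an assumed fixed shrinkage intensity (in practice the intensity is estimated from the data):

```python
import numpy as np

# In-sample one-step base forecast errors for a 3-series hierarchy
# (hypothetical numbers), one row per time period.
errors = np.array([[ 1.0,  0.5,  0.5],
                   [-0.8, -0.3, -0.5],
                   [ 0.4,  0.1,  0.3]])

# Sample covariance of the errors (columns are series).
W_sample = np.cov(errors, rowvar=False)

# Shrink toward the diagonal target:
# W = lam * diag(W_sample) + (1 - lam) * W_sample.
lam = 0.5  # assumed intensity; normally data-driven
W_shrink = lam * np.diag(np.diag(W_sample)) + (1 - lam) * W_sample

# Diagonal entries are preserved; off-diagonal entries are damped,
# which stabilizes the estimate when the hierarchy is large.
print(np.allclose(np.diag(W_shrink), np.diag(W_sample)))  # True
```

The shrunk matrix then plays the role of the error covariance in the minimum trace reconciliation projection.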
