# The Wisdom of the Data: Getting the Most Out of Univariate Time Series Forecasting


## Abstract


## 1. Introduction

Examples include commercial solutions such as SAS Forecasting Server® and the forecast package for R statistical software. Such software and packages offer tools for batch and automatic forecasting with minimal to zero manual input. They integrate families of models, such as exponential smoothing [2] and autoregressive integrated moving average (ARIMA) [3], that can capture a wide range of data patterns and produce extrapolations with ease. However, these families of models rely on assumptions that are rarely met in practice, and they struggle to select the most appropriate model for a given time series due to the uncertainties involved: identifying the optimal model form, estimating the optimal set of parameters, and dealing with the inherent uncertainty in the data [4].

- Forecasting with sub-seasonal series (FOSS), in which a seasonal time series is split into multiple new series where only particular seasons (or sets of seasons) are observed and modelled [17];

## 2. Theta Method

Implementations of the theta method for R statistical software are available through the functions `thetaf()` and `theta()` of the packages forecast and tsutils, respectively. Finally, Petropoulos and Nikolopoulos [36] offer a step-by-step tutorial of the standard theta method coupled with an implementation in just 10 lines of R code.
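The 10-line R implementation referenced above can be mirrored in a short Python sketch of the standard theta method. This is an illustrative reimplementation, not the authors' code: the SES smoothing parameter `alpha` is fixed here rather than optimised, and the input is assumed to be already deseasonalised.

```python
def theta_forecast(y, h, alpha=0.5):
    """Standard theta method (sketch): combine the linear-trend line
    (theta = 0) with SES applied to the theta = 2 line, equal weights.
    Assumes deseasonalised data; alpha is fixed, not optimised."""
    n = len(y)
    # Ordinary least squares fit of a linear trend on t = 0..n-1
    t_mean = (n - 1) / 2.0
    y_mean = sum(y) / n
    sxx = sum((t - t_mean) ** 2 for t in range(n))
    sxy = sum((t - t_mean) * (y[t] - y_mean) for t in range(n))
    b = sxy / sxx
    a = y_mean - b * t_mean
    # Theta line with theta = 2 doubles the local curvatures of the data
    theta2 = [2 * y[t] - (a + b * t) for t in range(n)]
    # Simple exponential smoothing of the theta = 2 line (flat forecast)
    level = theta2[0]
    for v in theta2[1:]:
        level = alpha * v + (1 - alpha) * level
    # Equal-weight combination of the two extrapolated theta lines
    return [0.5 * (a + b * (n + i)) + 0.5 * level for i in range(h)]
```

For a constant series the trend slope is zero and the SES level equals that constant, so the forecast reproduces the constant, which is a quick sanity check on the combination.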

## 3. Multiple Temporal Aggregation

Implementations for R statistical software are available through the `mapa()` function of the MAPA package (MAPA and MAPAx), the `imapa()` function of the tsintermittent package (MAPA for intermittent demand data), and the `thief()` function of the thief package (temporal hierarchies).
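The building block of MTA is non-overlapping temporal aggregation, as illustrated in Figure 2. A minimal Python sketch follows (the function name is illustrative; whether blocks are summed, as for flow data, or averaged, as for stock data, depends on the series):

```python
def temporally_aggregate(y, k):
    """Non-overlapping temporal aggregation: sum consecutive blocks of k
    observations (e.g., k = 3 turns monthly data into quarterly totals).
    Leftover observations are trimmed from the START of the series so
    that the most recent data are always retained."""
    trimmed = y[len(y) % k:]
    return [sum(trimmed[i:i + k]) for i in range(0, len(trimmed), k)]
```

Applying this with k = 3, 6, and 12 to a monthly series yields the quarterly, semi-annual, and yearly series of Figure 2, each of which can then be extrapolated separately.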

## 4. Bagging

- A Box-Cox transformation is applied on the original series. The $\lambda $ parameter for the Box-Cox transformation is automatically selected based on Guerrero's method [58], but other methods, such as the maximisation of the profile log likelihood of a linear model fitted to the original data, could be used. The purpose of this step is twofold. First, the variance of the series is stabilised. Second, multiplicative patterns are converted into additive ones;
- The Box-Cox transformed series is decomposed into its components. If the series has periodicity greater than unity (e.g., quarterly or monthly data), then seasonal and trend decomposition using Loess (STL) [59] is applied to separate the transformed series into the trend, seasonal, and remainder components. If the series has no periodicity (e.g., yearly data), then a Loess decomposition is applied to separate the series into two components: trend and remainder;
- The remainder component of the above decomposition is bootstrapped to create new vectors of remainders that follow the empirical distribution of the original remainder vector;
- The bootstrapped remainder vectors are added to the other components extracted from the decomposition of step 2 (trend and, where applicable, seasonality) to form new bootstrapped series. These series have the same underlying structural patterns as the original series;
- An inverse Box-Cox transformation is applied on each of the bootstrapped series, using the same $\lambda $ parameter as in step 1. This transformation brings the bootstrapped series back to the same scale as the original data. Figure 3 shows the estimated bootstrapped series based on the original (seasonal) data;
- The original and the bootstrapped series are extrapolated using an automatic forecasting process, which may result in the use of the same or different model forms and parameters. In any case, many sets of forecasts are produced at the end of this step;
- The forecasts from the original and bootstrapped series of the previous step are aggregated (averaged).
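Steps 3 and 4 above can be sketched in Python using the moving block bootstrap (MBB). The Box-Cox and STL steps are omitted and assumed to be done upstream, and all names are illustrative rather than taken from any package:

```python
import random

def moving_block_bootstrap(remainder, block_size, seed=0):
    """Step 3 (sketch): sample overlapping blocks of the remainder with
    replacement and concatenate them, preserving the short-range
    autocorrelation within each block."""
    rng = random.Random(seed)
    n = len(remainder)
    out = []
    while len(out) < n:
        start = rng.randrange(0, n - block_size + 1)
        out.extend(remainder[start:start + block_size])
    return out[:n]

def bootstrap_series(trend, seasonal, remainder, block_size, seed=0):
    """Step 4 (sketch): recombine the trend and seasonal components with
    a resampled remainder to form one new bootstrapped series."""
    boot = moving_block_bootstrap(remainder, block_size, seed)
    return [t + s + r for t, s, r in zip(trend, seasonal, boot)]
```

Calling `bootstrap_series` repeatedly with different seeds yields the collection of bootstrapped series that are then extrapolated and averaged in steps 6 and 7.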

- After producing the bootstrapped series the usual way, the authors identified the optimal models on these bootstrapped series. Instead of using the forecasts from these models directly, their model forms were applied back to the original data, for which a different optimal model form may have been identified. Effectively, the bootstrapped series provided the frequencies with which each model form was identified as ‘optimal’, and these frequencies were then translated into combination weights for averaging the point forecasts of the different model forms when applied on the original data. All model parameters and forecasts were estimated using the original data only. This variation of bagging is known as “bootstrap model combination” (BMC) and handles only the uncertainty in the model form;
- The optimal model form was identified using the original data only. Subsequently, this optimal model form was applied on the original data and the bootstraps to obtain multiple independent estimates of the model parameters. The combination of each set of model parameters with the unique optimal model form was then applied again on the original data to produce multiple sets of point forecasts. As with bootstrap model combination, the bootstrapped series were not used to produce forecasts directly. This variation solely handles the uncertainty in estimating the model parameters;
- The optimal model form and set of parameters were estimated using the original series only. Subsequently, this unique model form and set of parameters were applied on all bootstrapped series to produce multiple sets of point forecasts. This variation solely tackles the uncertainty associated with the data, as the bootstrapped series are not used for selecting between models or estimating their parameters.
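The bootstrap model combination (BMC) weighting described in the first variation can be sketched as follows. The function names and model-form labels are illustrative; the point forecasts being combined are all produced on the original data, as the variation requires:

```python
from collections import Counter

def bmc_weights(selected_forms):
    """BMC (sketch): the frequency with which each model form is
    identified as 'optimal' across the bootstrapped series becomes its
    combination weight."""
    counts = Counter(selected_forms)
    total = len(selected_forms)
    return {form: count / total for form, count in counts.items()}

def combine_forecasts(forecasts_by_form, weights):
    """Weighted average of the point forecasts that each model form
    produced on the ORIGINAL data."""
    h = len(next(iter(forecasts_by_form.values())))
    return [sum(weights[f] * forecasts_by_form[f][i] for f in weights)
            for i in range(h)]
```

For instance, if one form is selected on three of four bootstraps, it receives weight 0.75 and the other form weight 0.25 in the final combination.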

Implementations are available through the functions `baggedETS()` and `baggedClusterETS()` of the R packages forecast and tshacks, respectively.

## 5. Sub-Seasonal Series
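FOSS splits a seasonal series into sub-series in which only particular seasons (or sets of seasons) are observed and modelled [17], as illustrated in Figure 4. A minimal Python sketch follows; the grouping scheme and names are illustrative, not the authors' implementation, and the series is assumed to start at the first season of the cycle:

```python
def sub_seasonal_series(y, m, group_size=1):
    """Split a seasonal series of period m (e.g., m = 12 for monthly
    data) into sub-series that retain only particular seasons.
    group_size = 1 yields m non-seasonal sub-series (one per season);
    larger groups yield sub-series with a lower periodicity."""
    series = []
    for first in range(0, m, group_size):
        seasons = range(first, min(first + group_size, m))
        series.append([y[i] for i in range(len(y)) if i % m in seasons])
    return series
```

Each sub-series can then be extrapolated separately, and the sub-seasonal forecasts reassembled into a forecast for the full seasonal cycle.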

## 6. Multiple Starting Points
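The multiple starting points idea (cf. Figure 5) amounts to generating trimmed copies of the series, each beginning at a later observation, forecasting each copy separately, and combining the forecasts [18]. A minimal sketch, with illustrative names:

```python
def multiple_starting_points(y, min_length, step=1):
    """Trim the series at successively later starting points. Each
    trimmed copy keeps the most recent observations, so the copies
    differ only in how much history they retain."""
    return [y[start:] for start in range(0, len(y) - min_length + 1, step)]
```

The shortest window acts as a floor on the estimation sample; forecasts from the resulting windows can then be averaged, mirroring how the other "wisdom of the data" approaches pool forecasts from manipulated versions of the same series.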

## 7. Cross-Comparison

The `ets()` function of the forecast package for R statistical software fits 19 models (8 for non-seasonal data) before a final model is selected and its forecasts are produced. The theta method is arguably one of the least expensive robust time series forecasting methods. In its standard implementation, it requires fitting just two models, one for each theta line (a simple linear regression model and SES). Even for theta variations that consider more than two theta lines, the number of models required is small. The robust implementation by Legaki and Koutsouri [28], which uses a Box-Cox transformation, arguably offered one of the best trade-offs between performance and cost in the M4 competition [30].

## 8. Conclusions and a Look to the Future

`thief()` function (which offers theta as one of the methods to produce base forecasts), it would be even more interesting to see future studies that focus on the integration of the approaches described here. The only exception that we are aware of is the study by Wang et al. [77], which attempts to structurally integrate the concepts surrounding the theta method (and the manipulation of the local curvatures) with aspects of non-overlapping temporal aggregation. We believe that there is much scope for further research in integrating "wisdom of the data" approaches. For instance, one could define a temporal hierarchy approach in which the base forecasts for the nodes of a certain aggregation level are not produced by considering the entire series at that aggregation level; instead, each node is extrapolated separately using sub-seasonal series (FOSS). Another example would be the integration of the bagging and multiple starting points approaches, since each of them focuses on a different way of extracting information from the data.

## Author Contributions

## Funding

## Institutional Review Board Statement

## Informed Consent Statement

## Data Availability Statement

## Acknowledgments

## Conflicts of Interest

## Abbreviations

| Abbreviation | Definition |
|---|---|
| ARIMA | Auto-regressive Integrated Moving Average |
| ARIMAx | Auto-regressive Integrated Moving Average with Exogenous Variables |
| BMC | Bootstrap Model Combination |
| CBB | Circular Block Bootstrap |
| ETS | Exponential Smoothing |
| ETSx | Exponential Smoothing with Exogenous Variables |
| FOSS | Forecasting with Sub-seasonal Series |
| LPB | Linear Process Bootstrap |
| MAPA | Multiple Temporal Aggregation Algorithm |
| MAPAx | Multiple Temporal Aggregation Algorithm with Exogenous Variables |
| MASE | Mean Absolute Scaled Error |
| MBB | Moving Block Bootstrap |
| MSP | Multiple Starting Points |
| MTA | Multiple Temporal Aggregation |
| SBA | Syntetos-Boylan Approximation |
| SES | Simple Exponential Smoothing |
| sMAPE | Symmetric Mean Absolute Percentage Error |
| STL | Seasonal and Trend decomposition using Loess |
| THIEF | Temporal Hierarchical Forecasting |
| TSB | Teunter-Syntetos-Babai (method) |

## References

1. Fildes, R.; Ma, S.; Kolassa, S. Retail forecasting: Research and practice. Int. J. Forecast. **2019**.
2. Hyndman, R.J.; Koehler, A.B.; Snyder, R.D.; Grose, S. A state space framework for automatic forecasting using exponential smoothing methods. Int. J. Forecast. **2002**, 18, 439–454.
3. Hyndman, R.; Khandakar, Y. Automatic time series forecasting: The forecast package for R. J. Stat. Softw. **2008**, 26, 1–22.
4. Petropoulos, F.; Hyndman, R.J.; Bergmeir, C. Exploring the sources of uncertainty: Why does bagging for time series forecasting work? Eur. J. Oper. Res. **2018**, 268, 545–554.
5. Assimakopoulos, V.; Nikolopoulos, K. The Theta model: A decomposition approach to forecasting. Int. J. Forecast. **2000**, 16, 521–530.
6. Hyndman, R.J.; Billah, B. Unmasking the Theta method. Int. J. Forecast. **2003**, 19, 287–290.
7. Thomakos, D.; Nikolopoulos, K. Fathoming the theta method for a unit root process. IMA J. Manag. Math. **2012**, 25, 105–124.
8. Fiorucci, J.A.; Pellegrini, T.R.; Louzada, F.; Petropoulos, F.; Koehler, A.B. Models for optimising the theta method and their relationship to state space models. Int. J. Forecast. **2016**, 32, 1151–1161.
9. Spiliotis, E.; Assimakopoulos, V.; Nikolopoulos, K. Forecasting with a hybrid method utilizing data smoothing, a variation of the Theta method and shrinkage of seasonal factors. Int. J. Prod. Econ. **2019**, 209, 92–102.
10. Spiliotis, E.; Assimakopoulos, V.; Makridakis, S. Generalizing the Theta method for automatic forecasting. Eur. J. Oper. Res. **2020**, 284, 550–558.
11. Kourentzes, N.; Petropoulos, F.; Trapero, J.R. Improving forecasting by estimating time series structural components across multiple frequencies. Int. J. Forecast. **2014**, 30, 291–302.
12. Petropoulos, F.; Kourentzes, N. Improving forecasting via multiple temporal aggregation. Foresight Int. J. Appl. Forecast. **2014**, 34, 12–17.
13. Petropoulos, F.; Kourentzes, N. Forecast combinations for intermittent demand. J. Oper. Res. Soc. **2015**, 66, 914–924.
14. Athanasopoulos, G.; Hyndman, R.J.; Kourentzes, N.; Petropoulos, F. Forecasting with temporal hierarchies. Eur. J. Oper. Res. **2017**, 262, 60–74.
15. Bergmeir, C.; Hyndman, R.J.; Benítez, J.M. Bagging exponential smoothing methods using STL decomposition and Box–Cox transformation. Int. J. Forecast. **2016**, 32, 303–312.
16. Dantas, T.M.; Cyrino Oliveira, F.L. Improving time series forecasting: An approach combining bootstrap aggregation, clusters and exponential smoothing. Int. J. Forecast. **2018**, 34, 748–761.
17. Li, X.; Petropoulos, F.; Kang, Y. Improving forecasting with sub-seasonal time series patterns. arXiv **2021**, arXiv:2101.00827.
18. Disney, S.M.; Petropoulos, F. Forecast combinations using multiple starting points. In Proceedings of the Logistics & Operations Management Section Annual Conference (LOMSAC 2015), Glasgow, UK, 12–15 July 2015.
19. Bai, Y.; Li, X.; Kang, Y. Improving forecasting with multiple starting points. 2021; Unpublished work.
20. Livera, A.M.D.; Hyndman, R.J.; Snyder, R.D. Forecasting Time Series With Complex Seasonal Patterns Using Exponential Smoothing. J. Am. Stat. Assoc. **2011**, 106, 1513–1527.
21. Pedregal, D.J.; Trapero, J.R.; Villegas, M.A.; Madrigal, J.J. Submission 260 to the M4 competition. In Github; Universidad de Castilla: Ciudad Real, Spain, 2018.
22. Dokumentov, A.; Hyndman, R.J. STR: A Seasonal-Trend Decomposition Procedure Based on Regression. arXiv **2020**, arXiv:2009.05894.
23. Makridakis, S.; Hibon, M. The M3-Competition: Results, conclusions and implications. Int. J. Forecast. **2000**, 16, 451–476.
24. Petropoulos, F.; Nikolopoulos, K. Optimizing Theta Model for Monthly Data. 2013. Available online: https://www.scitepress.org/PublicationsDetail.aspx?ID=PYc+bgnxmJE=&t=1 (accessed on 1 June 2021).
25. Petropoulos, F. θ-reflections from the next generation of forecasters. In Forecasting with the Theta Method; Nikolopoulos, K., Thomakos, D.D., Eds.; John Wiley & Sons, Ltd.: Chichester, UK, 2019; pp. 161–175.
26. Fioruci, J.A.; Pellegrini, T.R.; Louzada, F.; Petropoulos, F. The Optimised Theta Method. arXiv **2015**, arXiv:1503.03529.
27. Thomakos, D.D.; Nikolopoulos, K. Forecasting Multivariate Time Series with the Theta Method: Multivariate Theta Method. J. Forecast. **2015**, 34, 220–229.
28. Legaki, N.Z.; Koutsouri, K. Submission 260 to the M4 competition. In Github; National Technical University of Athens: Athens, Greece, 2018.
29. Makridakis, S.; Spiliotis, E.; Assimakopoulos, V. The M4 Competition: 100,000 time series and 61 forecasting methods. Int. J. Forecast. **2020**, 36, 54–74.
30. Gilliland, M. The value added by machine learning approaches in forecasting. Int. J. Forecast. **2020**, 36, 161–166.
31. Nikolopoulos, K.; Thomakos, D.; Petropoulos, F.; Assimakopoulos, V. Theta Model Forecasts for Financial Time Series: A Case Study in the S&P500. Technical Report 0033. 2009. Available online: https://ideas.repec.org/p/uop/wpaper/0033.html (accessed on 1 June 2021).
32. Athanasopoulos, G.; Hyndman, R.J.; Song, H.; Wu, D.C. The tourism forecasting competition. Int. J. Forecast. **2011**, 27, 822–844.
33. Petropoulos, F.; Wang, X.; Disney, S.M. The inventory performance of forecasting methods: Evidence from the M3 competition data. Int. J. Forecast. **2019**, 35, 251–265.
34. Nikolopoulos, K.; Thomakos, D.D.; Katsagounos, I.; Alghassab, W. On the M4.0 forecasting competition: Can you tell a 4.0 earthquake from a 3.0? Int. J. Forecast. **2020**, 36, 203–205.
35. Nikolopoulos, K.I.; Thomakos, D.D. Forecasting with The Theta Method: Theory and Applications; John Wiley & Sons: Hoboken, NJ, USA, 2019.
36. Petropoulos, F.; Nikolopoulos, K. The Theta Method. Foresight Int. J. Appl. Forecast. **2017**, 46, 11–17.
37. Petropoulos, F.; Apiletti, D.; Assimakopoulos, V.; Babai, M.Z.; Barrow, D.K.; Ben Taieb, S.; Bergmeir, C.; Bessa, R.J.; Bijak, J.; Boylan, J.E.; et al. Forecasting: Theory and Practice. arXiv **2021**, arXiv:2012.03854.
38. Nikolopoulos, K.; Syntetos, A.A.; Boylan, J.E.; Petropoulos, F.; Assimakopoulos, V. An aggregate-disaggregate intermittent demand approach (ADIDA) to forecasting: An empirical proposition and analysis. J. Oper. Res. Soc. **2011**, 62, 544–554.
39. Babai, M.Z.; Ali, M.M.; Nikolopoulos, K. Impact of temporal aggregation on stock control performance of intermittent demand estimators: Empirical analysis. Omega **2012**, 40, 713–721.
40. Spithourakis, G.; Petropoulos, F.; Nikolopoulos, K.; Assimakopoulos, V. A Systemic View of ADIDA framework. IMA J. Manag. Math. **2014**, 25, 125–137.
41. Spiliotis, E.; Petropoulos, F.; Assimakopoulos, V. Improving the forecasting performance of temporal hierarchies. PLoS ONE **2019**, 14, e0223422.
42. Spiliotis, E.; Petropoulos, F.; Kourentzes, N.; Assimakopoulos, V. Cross-temporal aggregation: Improving the forecast accuracy of hierarchical electricity consumption. Appl. Energy **2020**, 261, 114339.
43. Croston, J.D. Forecasting and Stock Control for Intermittent Demands. Oper. Res. Q. **1972**, 23, 289–303.
44. Syntetos, A.A.; Boylan, J.E. The accuracy of intermittent demand estimates. Int. J. Forecast. **2005**, 21, 303–314.
45. Kostenko, A.V.; Hyndman, R.J. A note on the categorization of demand patterns. J. Oper. Res. Soc. **2006**, 57, 1256–1257.
46. Kourentzes, N.; Petropoulos, F. Forecasting with multivariate temporal aggregation: The case of promotional modelling. Int. J. Prod. Econ. **2016**, 181, 145–153.
47. Athanasopoulos, G.; Ahmed, R.A.; Hyndman, R.J. Hierarchical forecasts for Australian domestic tourism. Int. J. Forecast. **2009**, 25, 146–166.
48. Hyndman, R.J.; Ahmed, R.A.; Athanasopoulos, G.; Shang, H.L. Optimal combination forecasts for hierarchical time series. Comput. Stat. Data Anal. **2011**, 55, 2579–2589.
49. Athanasopoulos, G.; Gamakumara, P.; Panagiotelis, A.; Hyndman, R.J.; Affan, M. Hierarchical Forecasting. In Macroeconomic Forecasting in the Era of Big Data: Theory and Practice; Fuleky, P., Ed.; Springer International Publishing: Cham, Switzerland, 2020; pp. 689–719.
50. Hollyman, R.; Petropoulos, F.; Tipping, M.E. Understanding forecast reconciliation. Eur. J. Oper. Res. **2021**, 294, 149–160.
51. Kourentzes, N.; Rostami-Tabar, B.; Barrow, D.K. Demand forecasting by temporal aggregation: Using optimal or multiple aggregation levels? J. Bus. Res. **2017**, 78, 1–9.
52. Jeon, J.; Panagiotelis, A.; Petropoulos, F. Probabilistic forecast reconciliation with applications to wind power and electric load. Eur. J. Oper. Res. **2019**, 279, 364–379.
53. Nystrup, P.; Lindström, E.; Pinson, P.; Madsen, H. Temporal hierarchies with autocorrelation for load forecasting. Eur. J. Oper. Res. **2020**, 280, 876–888.
54. Kourentzes, N.; Athanasopoulos, G. Elucidate structure in intermittent demand series. Eur. J. Oper. Res. **2021**, 288, 141–152.
55. Teunter, R.H.; Syntetos, A.A.; Zied Babai, M. Intermittent demand: Linking forecasting to inventory obsolescence. Eur. J. Oper. Res. **2011**, 214, 606–615.
56. Kourentzes, N.; Athanasopoulos, G. Cross-temporal coherent forecasts for Australian tourism. Ann. Tour. Res. **2019**, 75, 393–409.
57. Yagli, G.M.; Yang, D.; Srinivasan, D. Reconciling solar forecasts: Sequential reconciliation. Sol. Energy **2019**, 179, 391–397.
58. Guerrero, V.M. Time-series analysis supported by power transformations. J. Forecast. **1993**, 12, 37–48.
59. Cleveland, R.B.; Cleveland, W.S.; McRae, J.E.; Terpenning, I. STL: A seasonal-trend decomposition procedure based on loess. J. Off. Stat. **1990**, 6, 3–73.
60. Bandara, K.; Hewamalage, H.; Liu, Y.H.; Kang, Y.; Bergmeir, C. Improving the Accuracy of Global Forecasting Models using Time Series Data Augmentation. arXiv **2020**, arXiv:2008.02663.
61. Kunsch, H.R. The Jackknife and the Bootstrap for General Stationary Observations. Ann. Stat. **1989**, 17, 1217–1241.
62. Bühlmann, P. Sieve Bootstrap for Time Series. Bernoulli **1997**, 3, 123–148.
63. Politis, D.N.; Romano, J.P. A Circular Block-Resampling Procedure for Stationary Data; Tech. Rep. No. 370; Department of Statistics, Stanford University: Stanford, CA, USA, 1991.
64. McMurry, T.; Politis, D.N. Banded and tapered estimates of autocovariance matrices and the linear process bootstrap. J. Time Ser. Anal. **2010**, 31, 471–482.
65. Dantas, T.M.; Cyrino Oliveira, F.L.; Varela Repolho, H.M. Air transportation demand forecast through Bagging Holt Winters methods. J. Air Transp. Manag. **2017**, 59, 116–123.
66. Meira, E.; Cyrino Oliveira, F.L.; Jeon, J. Treating and Pruning: New approaches to forecasting model selection and combination using prediction intervals. Int. J. Forecast. **2021**, 37, 547–568.
67. Pesaran, M.H.; Timmermann, A. Selection of estimation window in the presence of breaks. J. Econom. **2007**, 137, 134–161.
68. Spiliotis, E.; Kouloumos, A.; Assimakopoulos, V.; Makridakis, S. Are forecasting competitions data representative of the reality? Int. J. Forecast. **2020**, 36, 37–53.
69. de Oliveira, E.M.; Cyrino Oliveira, F.L. Forecasting mid-long term electric energy consumption through bagging ARIMA and exponential smoothing methods. Energy **2018**, 144, 776–788.
70. Syntetos, A.A.; Nikolopoulos, K.; Boylan, J.E. Judging the judges through accuracy-implication metrics: The case of inventory forecasting. Int. J. Forecast. **2010**, 26, 134–143.
71. Bates, J.M.; Granger, C.W.J. The Combination of Forecasts. Oper. Res. Q. **1969**, 20, 451–468.
72. Timmermann, A. Forecast Combinations. Handb. Econ. Forecast. **2006**, 1, 135–196.
73. Makridakis, S.; Spiliotis, E.; Assimakopoulos, V. The M5 Accuracy Competition: Results, Findings and Conclusions. 2020. Available online: https://drive.google.com/drive/u/1/folders/1S6IaHDohF4qalWsx9AABsIG1filB5V0q (accessed on 1 June 2021).
74. Montero-Manso, P.; Athanasopoulos, G.; Hyndman, R.J.; Talagala, T.S. FFORMA: Feature-based forecast model averaging. Int. J. Forecast. **2020**, 36, 86–92.
75. Kang, Y.; Spiliotis, E.; Petropoulos, F.; Athiniotis, N.; Li, F.; Assimakopoulos, V. Déjà vu: A data-centric forecasting approach through time series cross-similarity. J. Bus. Res. **2020**.
76. Seaman, B. Considerations of a retail forecasting practitioner. Int. J. Forecast. **2018**, 34, 822–829.
77. Wang, B.; Petropoulos, F.; Jooyoung, J.; Erdogan, G. Integrating theta method and multiple temporal aggregation: Optimising aggregation levels. In Proceedings of the 39th International Symposium on Forecasting ISF 2019, Thessaloniki, Greece, 16–19 June 2019.

**Figure 1.** An illustrative example of producing theta lines for the theta method. The original data (black line) are de-seasonalised (red line). Then, a linear regression on trend produces the theta line with $\theta =0$ (blue line). The theta line with $\theta =2$ (green line) has double the curvatures of the seasonally adjusted data.

**Figure 2.** A visual example where multiple new temporally aggregated time series are created based on the original data. The monthly data (black line) are temporally aggregated to quarterly (red line), semi-annual (blue line), and yearly (green line) data.

**Figure 4.** An illustrative example of producing sub-seasonal series by sub-sampling the original monthly data (first panel). In the second panel, we have produced a non-seasonal series that consists only of the periods in July of each year. The third and fourth panels show two more sub-sampled series with periodicity 2 and 3, respectively. Note that by considering particular subsamples, the level as well as other patterns change significantly.

**Figure 5.** An illustrative example of producing series from multiple starting points. The original data (first panel) are trimmed so that only the periods from the last two years (second panel), the last three years (third panel), or the last four years (fourth panel) are considered.

| Approach | Random Component | Frequency | Length |
|---|---|---|---|
| Theta | ✓ | | |
| MTA | ✓ | ✓ | ✓ |
| Bagging | ✓ | | |
| FOSS | | ✓ | ✓ |
| MSP | | | ✓ |

| Approach | Uncertainty: Data | Uncertainty: Model Form | Uncertainty: Model Parameters |
|---|---|---|---|
| Theta | ✓ | | |
| MTA | ✓ | ✓ | ✓ |
| Bagging | ✓ | ✓ | ✓ |
| FOSS | ✓ | ✓ | |
| MSP | ✓ | ✓ | ✓ |

**Table 3.** The published average performance of the five approaches on the yearly, quarterly, and monthly data from the M3 and M4 competitions.

| Approach | Variation | M3 Yearly | M3 Quarterly | M3 Monthly | M4 Yearly | M4 Quarterly | M4 Monthly |
|---|---|---|---|---|---|---|---|
| Theta | Standard [5] | 16.90 [23] | 8.96 [23] | 13.85 [23] | 14.59 [29] | 10.31 [29] | 13.00 [29] |
| | Optimised $\theta $ [8] | 15.94 [8] | 9.28 [8] | 13.74 [8] | 13.68 | 10.09 | 13.32 |
| | Box-Cox [28] | 16.20 | 9.13 | 13.55 | 13.37 [29] | 10.15 [29] | 13.00 [29] |
| | AutoTheta [42] | 16.02 | 9.18 | 13.86 | 13.80 | 10.13 | 13.13 |
| | Theta-EXP [41] | 16.48 | 8.99 | 13.44 | 14.11 | 10.37 | 13.12 |
| MTA | MAPA-ETS [11] | 18.37 [11] | 9.63 [11] | 13.69 [11] | 14.88 | 10.27 | 12.97 |
| | THIEF-ETS [14] | 17.00 | 9.38 | 13.58 | 15.36 | 10.40 | 12.89 |
| | THIEF-ARIMA [14] | 17.10 | 9.79 | 14.49 | 15.15 | 10.61 | 13.39 |
| Bagging | MBB-ETS [15] | 17.89 [15] | 10.13 [15] | 13.64 [15] | 14.47 [66] | 10.23 [66] | 13.30 [66] |
| | BMC-ETS [4] | 17.15 [4] | 9.56 [4] | 13.79 [4] | 14.94 [66] | 10.08 [66] | 13.07 [66] |
| | Pruned and Treated [66] | 17.36 [66] | 9.74 [66] | 13.61 [66] | 14.49 [66] | 10.22 [66] | 13.27 [66] |
| FOSS | FOSS-ETS [17] | | 9.24 [17] | 13.56 [17] | | 10.15 [17] | 12.84 [17] |
| | FOSS-ARIMA [17] | | 9.68 [17] | 14.01 [17] | | 10.41 [17] | 12.87 [17] |
| MSP | MSP-ETS [18] | 16.90 [18] | 9.79 [18] | 14.00 [18] | | | |

Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).

## Share and Cite

**MDPI and ACS Style**

Petropoulos, F.; Spiliotis, E. The Wisdom of the Data: Getting the Most Out of Univariate Time Series Forecasting. *Forecasting* **2021**, *3*, 478-497.
https://doi.org/10.3390/forecast3030029
